Institution: Slovak University of Technology
Technologies used: C#.NET
Inputs: user's activity in the IDE, user's image
Outputs: source code fragments annotated with information tags
Addressed problem
When programmers work with software code, their activity consists of more than writing new code and committing it. Information from source code repositories, project management software, etc. is not enough to know how a programmer evaluates the source code. Tracking the mouse and keyboard can broaden what we know about these activities; however, a programmer can study (read) the code without providing any keyboard or mouse input. If we infer the user's progress through source code (which code fragments were edited or read), properties of source code fragments (e.g., that many programmers had trouble understanding a fragment), or other knowledge using only basic input devices, we either disregard such passive reading or assume that the programmer is giving full attention to all currently displayed code fragments, while in reality the user may be away from the computer or reading a single fragment over and over.
Description
We propose a method for tracking a user's attention to individual fragments of a document, such as source code methods or web page paragraphs. We divided traditional implicit interest indicators, such as mouse movement, scrolling, read wear, text selection, or visits, into four categories according to their relation to fragments: untargeted, passively targeted, actively targeted, and document-level indicators. The interest indicators are collected during the user's activity, and the user's attention to document fragments is evaluated by assigning weights to the indicator categories and weighting the indicators within each category. Source code fragments are annotated with information tags representing the user's attention to them; these tags can then be used by other methods, e.g., for recommendation.
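To make the weighting scheme concrete, a minimal C# sketch follows. The four indicator categories come from our method; the specific weight values, type names, and the linear aggregation are illustrative assumptions, not the method's actual parameters:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// The four categories from the method; weight values below are assumed.
enum IndicatorCategory { Untargeted, PassivelyTargeted, ActivelyTargeted, DocumentLevel }

// One collected interest indicator targeting a fragment.
record Indicator(string FragmentId, IndicatorCategory Category, double Weight);

static class AttentionEvaluator
{
    // Hypothetical per-category weights; the real values were inferred experimentally.
    static readonly Dictionary<IndicatorCategory, double> CategoryWeights = new()
    {
        [IndicatorCategory.Untargeted] = 0.1,
        [IndicatorCategory.PassivelyTargeted] = 0.3,
        [IndicatorCategory.ActivelyTargeted] = 0.5,
        [IndicatorCategory.DocumentLevel] = 0.1,
    };

    // Attention per fragment: each indicator's weight scaled by the
    // weight of the category it belongs to, summed per fragment.
    public static Dictionary<string, double> Evaluate(IEnumerable<Indicator> indicators) =>
        indicators
            .GroupBy(i => i.FragmentId)
            .ToDictionary(
                g => g.Key,
                g => g.Sum(i => CategoryWeights[i.Category] * i.Weight));
}
```

In this sketch, a fragment's attention score is simply the sum of (category weight × indicator weight) over its indicators; the actual method may normalize or combine the indicators differently.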
We enhance indicator collection by tracking the user's presence using a webcam, and we even include an estimate of the user's gaze (also obtained with a commodity webcam) among the passively targeted indicators. User presence can be evaluated simply by detecting a face in the camera's image feed. Tracking the user's gaze, however, is more of a challenge. Precise gaze tracking requires expensive professional equipment or a commodity webcam with a modified infrared filter and infrared lighting; neither is available or comfortable for programmers working in a software development company. The image from a simple webcam, with the user's eyes illuminated only by natural or artificial visible light, carries less information than the aforementioned approaches. In our method, we account for this imprecision by distributing the gaze around the estimated point. In the method's realization, we provide users with a convenient way to calibrate and recalibrate the gaze tracking.
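Distributing the gaze around the estimated point could be realized, for example, with a Gaussian kernel over the positions of the displayed fragments. The sketch below is an assumption in this spirit; the Sigma value and the one-dimensional (vertical-only) simplification are not taken from the method itself:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// A displayed fragment and the vertical center of its on-screen area, in pixels.
record Fragment(string Id, double CenterY);

static class GazeDistributor
{
    // Assumed spread (pixels) reflecting the imprecision of webcam gaze estimation.
    const double Sigma = 80.0;

    // Spread one gaze sample across fragments with a Gaussian kernel
    // centered on the estimated gaze position, normalized to sum to 1.
    public static Dictionary<string, double> Distribute(double gazeY, IReadOnlyList<Fragment> fragments)
    {
        var raw = fragments.ToDictionary(
            f => f.Id,
            f => Math.Exp(-Math.Pow(f.CenterY - gazeY, 2) / (2 * Sigma * Sigma)));
        double total = raw.Values.Sum();
        return raw.ToDictionary(kv => kv.Key, kv => kv.Value / total);
    }
}
```

A fragment directly under the estimated gaze point thus receives the largest share of the attention, while neighboring fragments still receive some, reflecting the estimate's uncertainty.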
We performed an experiment with users of an adaptive learning system who had their gaze tracked in their usual environments, e.g., their homes. In this experiment, we found commodity gaze tracking feasible for such use, and we also inferred weights for the interest indicators.
By tracking the user with a webcam, we can also estimate their emotional state by detecting facial points and applying machine learning to faces described by these points. We performed experiments with emotional state detection models learned from sample datasets and augmented with user input on whether the state detected from the camera corresponds to how the user actually feels at the moment. The user's emotional state is then used as one of the inputs for recommending actions that improve worker morale (e.g., recommending a break when productivity drops), or, in combination with gaze tracking, for describing source code fragments (e.g., multiple users were frustrated while working with a given method or class).
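As a hedged illustration of how the detected emotional state could be combined with gaze-based attention to describe fragments, consider the following sketch; the Emotion labels and the accumulation scheme are assumptions for illustration only:

```csharp
using System;
using System.Collections.Generic;

// Illustrative emotion labels; the actual models may use a different label set.
enum Emotion { Neutral, Happy, Frustrated }

static class FragmentEmotionTagger
{
    // Accumulated frustration evidence per fragment, across users and sessions.
    static readonly Dictionary<string, double> FrustrationScore = new();

    // Combine the detected emotion with the gaze-based attention distribution:
    // fragments the user was looking at while frustrated accumulate evidence,
    // which can later back an information tag such as "hard to understand".
    public static void Record(Emotion detected, Dictionary<string, double> attention)
    {
        if (detected != Emotion.Frustrated) return;
        foreach (var (fragmentId, weight) in attention)
        {
            FrustrationScore.TryGetValue(fragmentId, out double score);
            FrustrationScore[fragmentId] = score + weight;
        }
    }
}
```

User feedback on whether the detected state matches the actual one could, in the same spirit, be stored as additional labeled samples to adapt the detection model per user.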
References
Biroš, Michal – Caban, Tomáš – Kunka, Tomáš – Staňo, Filip – Lekeň, Tomáš – Martinkovič, Milan – Szilva, Bálint: Emotional State Recognition. In: Student Research Conference 2013, Vol. 2: Proceedings of the 9th Student Research Conference in Informatics and Information Technologies, Bratislava, April 23, 2013. Nakladateľstvo STU, Bratislava, 2013, pp. 453–454.