Eye Tracking in User Interfaces

Abstract: This master's thesis deals with the use of gaze-tracking technology, also known as eye-movement tracking (eye tracking), for human–computer interaction (HCI). The designed and implemented system maps the position of the point of gaze, which corresponds to coordinates in the scene camera's coordinate system, into the coordinate system of the display. At the same time, the system compensates for user movements, thereby removing one of the main problems with using gaze tracking in HCI. This is achieved by establishing a transformation between the projective space of the scene and the projective space of the display: interest points are detected and described using the SURF method, corresponding points are found and matched, and a homography is computed. The system was tested using test points distributed across the entire display area.
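The abstract outlines a standard feature-matching pipeline. Below is a minimal sketch of that pipeline in Python with OpenCV; the function name and parameter values are illustrative rather than taken from the thesis, and SURF requires the opencv-contrib build (another detector such as ORB could be swapped in otherwise).

```python
# Sketch of the described pipeline: detect/describe SURF interest points in
# the scene-camera image and a reference image of the display, match them,
# estimate a homography, and map the point of gaze into display coordinates.
import cv2
import numpy as np

def map_gaze_to_display(scene_img, display_img, gaze_xy):
    """Map a gaze point (x, y) in scene-camera coordinates to the display."""
    surf = cv2.xfeatures2d.SURF_create(hessianThreshold=400)
    kp_s, des_s = surf.detectAndCompute(scene_img, None)
    kp_d, des_d = surf.detectAndCompute(display_img, None)

    # Match descriptors and keep good correspondences (Lowe's ratio test).
    matches = cv2.BFMatcher(cv2.NORM_L2).knnMatch(des_s, des_d, k=2)
    good = [m for m, n in matches if m.distance < 0.7 * n.distance]

    src = np.float32([kp_s[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
    dst = np.float32([kp_d[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)

    # Homography between the two projective spaces; RANSAC rejects outliers.
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)

    # Transform the gaze point with the estimated homography.
    return cv2.perspectiveTransform(np.float32([[gaze_xy]]), H)[0, 0]
```

Because the homography can be re-estimated per frame, the mapping stays valid as the user's head moves, which is the movement compensation the abstract describes.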

Perceptual User Interfaces using Vision-based Eye Tracking

We plan on adding more training data to our recognition model to make it a true ‘black box’ to be used with different experimental applications in our laboratory by other researchers. We are also working with HCI and psychology researchers to design user studies to evaluate complete systems using the tracker. The feedback from these user studies could be used to modify the granularity of head pose data provided by the tracking system. We also plan to investigate how effective the gaze data has been in facilitating family communications and what new social implications arise from these kinds of perceptual systems. We would also like to conduct more experiments with several other application prototypes in our laboratory to explore new avenues for using perceptual interfaces based on vision-based eye tracking.

Binocular eye-tracking for the control of a 3D immersive multimedia user interface.

… VR to a commodity available for everyday use. However, Virtual Environments require new paradigms of User Interfaces, since standard 2D interfaces are designed to be viewed from a static vantage point only, e.g. the computer screen. Additionally, traditional input methods such as the keyboard and mouse are hard to manipulate when the user wears a Head Mounted Display. We present a 3D Multimedia User Interface based on eye-tracking and develop six applications which cover commonly performed actions of everyday computing such as mail composing and multimedia viewing. We perform a user study to evaluate our system by acquiring both quantitative and qualitative data. The study indicated that users make fewer typing errors while operating the eye-controlled interface compared to using the standard keyboard during immersive viewing. Subjects stated that they enjoyed the eye-tracking 3D interface more than the keyboard/mouse combination.

Perceptions of Interfaces for Eye Movement Biometrics

When prompted to explain their rankings for Ease of Use, many participants said that they ranked the PIN design highly because it was familiar and fast. One participant said “anyone can do it” and that it did not require much focus. This is a desirable characteristic for authentication interfaces where distractions and other tasks typically demand the user’s attention. The Reading variant was described as slow and difficult; some participants complained about the difficulty of the reading selection, saying that they became “detached” and were “just looking at the words” after a while. For several participants who were not native English speakers, the reading was particularly arduous. Clearly the choice of which passage to read has a significant impact on ease of use and universality of the interface.

Eye-tracking en masse: Group user studies, lab infrastructure, and practices

For the data analysis part (Figure 6), we map gaze fixations into positions in the source code documents, while considering where and how each source code fragment was displayed. This mapping, however, is done outside of our system. What remains inside our infrastructure is the fixation filter we used: our implementation of the I-VT filter based on the Tobii whitepaper (Olsen, 2012). From the recorded interactions with the source code editor, we reconstruct its visual state for each point in time during the recording, then recalculate fixations into positions relative to the source code document. Since the source code elements form an AOI hierarchy, this mapping allows us to automatically analyze eye movement data together with AOIs in the source code.
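The I-VT (velocity-threshold identification) step is simple enough to sketch. Below is a minimal Python version in the spirit of the Tobii whitepaper (Olsen, 2012); the threshold, minimum duration, and plain point-to-point velocity estimate are illustrative simplifications, not the authors' exact implementation.

```python
# Minimal I-VT fixation filter: samples whose angular velocity falls below a
# threshold are grouped into fixations; faster samples are treated as saccades.
import math
from dataclasses import dataclass

@dataclass
class Sample:
    t: float  # timestamp, seconds
    x: float  # gaze position, degrees of visual angle
    y: float

def ivt_fixations(samples, velocity_threshold=30.0, min_duration=0.060):
    """Return (start, end, x, y) fixations; velocity_threshold in deg/s."""
    fixations, current = [], []

    def flush():
        # Keep the group only if it lasted long enough; report its centroid.
        if current and current[-1].t - current[0].t >= min_duration:
            n = len(current)
            fixations.append((current[0].t, current[-1].t,
                              sum(s.x for s in current) / n,
                              sum(s.y for s in current) / n))
        current.clear()

    for prev, cur in zip(samples, samples[1:]):
        velocity = math.hypot(cur.x - prev.x, cur.y - prev.y) / (cur.t - prev.t)
        if velocity < velocity_threshold:
            if not current:
                current.append(prev)
            current.append(cur)
        else:
            flush()
    flush()
    return fixations
```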

A User Study of Web Search Session Behaviour using Eye Tracking data

First we synthesized the raw historical data log generated by the eye tracker in order to obtain a summarized log, an excerpt of which can be seen in Table 1. In Table 1, the first column holds the task id, the second column the user id, and the third column whether the query session was successful. Columns 4, 5, and 6 identify the user sequence: sequence, query, and visit. With this we can define the pattern (column 8) and hence the pattern code (A to E). For example, the first two rows of data represent a query session for task T1 and user P01, that is, a task/user tuple. In the first row the corresponding query and visit numbers are {1,1} and in the second row {2,1}. Hence, as there are no more lines for this task/user tuple, the resulting pattern will be 11-21, which we designated as pattern C. All the other pattern assignments to query sessions are formulated similarly. Finally, columns 9 to 14 contain web log and eye tracker (ocular) variables. By processing the historical log generated by the eye tracker as just explained, we were able to identify five principal patterns of user behaviour in query sessions. Figure 1 shows a graphic representation of the patterns, designated A, B, C, D and E. In the following we briefly describe the user behaviour corresponding to each pattern.
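A minimal sketch of how such a pattern string could be derived from a session's ordered (query, visit) pairs; only the "11-21" → C example is taken from the text, and the rest of the code mapping would need to be filled in from the paper.

```python
# Build the pattern string for one task/user query session from its ordered
# (query_number, visit_number) pairs, then look up its pattern code.
PATTERN_CODES = {"11-21": "C"}  # hypothetical except for the paper's example

def session_pattern(rows):
    """rows: ordered (query_number, visit_number) pairs for one task/user."""
    pattern = "-".join(f"{q}{v}" for q, v in rows)
    return pattern, PATTERN_CODES.get(pattern, "?")

# Example from the text: task T1 / user P01 has rows {1,1} and {2,1}.
print(session_pattern([(1, 1), (2, 1)]))  # -> ('11-21', 'C')
```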

Using eye-tracking to understand user behavior in deception detection system interaction

… assist individuals in concealing stress, deceiving the system, and lying successfully. Hence, analysis of countermeasures should be carried out in future studies. Another limitation of this study is that the accuracy of the device is not perfect: there is a possibility that fixations lie slightly beside the point captured by the eye tracking device.


Intuitive visualization technique to support eye tracking data analysis: A user-study

The "holy grail" of the eye tracking data visualization is a method that would intuitively convey the directional information in a simi- lar way the heat map conveys the density [r]


Eye-Tracking to Model and Adapt to User Meta-cognition in Intelligent Learning Environments

To collect empirical data on the mapping between actual student self-explanations, time, and attention patterns, we ran a user study [5], briefly summarized here because it lays the groundwork for the new model and evaluation methodology described in later sections. In this study, we collected data from 18 university students using ACE while their gaze was tracked by an Eyelink I eye tracker, developed by SR Research Ltd., Canada. Each participant received instructions to try to verbalize all of his or her thought processes while using the system. They then used the system for as much time as needed to go through all the units. All the student exploration cases were logged and synchronized with the output of software we developed for the real-time detection of gaze shifts, analogous to the one described earlier. Complete video and audio data of the interaction was also collected; however, the analysis described here focuses on the plot unit only. Using the audio and video data, two experts independently analyzed each participant's exploratory actions for signs of the presence or absence of self-explanation. Only exploratory actions on which the coders fully agreed were used in the rest of the analysis, generating 149 data points.

Understanding User Interaction in a Video Game by using Eye Tracking and Facial Expressions Analysis

- Overcoming challenges in a game.
- Diverse group of gamers.

THQ performed the usability study using the think-aloud protocol and observations. However, they only found that the players were facing challenges; they did not find insights into why players were facing those challenges. So, Key Lime Interactive reran the usability studies with their own eye tracking technology to collect eye tracking data and gain insight into the players' awareness of in-game cues and objects: for instance, what grasped the players' attention in the 3D world, when objects were noticed, and which objects were ignored.

Mobile Multimodal User Interfaces.

… platform users can always carry with them. That means a familiar basic set of interaction possibilities is always available to users. Secondly, they provide an excellent means for short-range discovery of surrounding devices, for example via Bluetooth or, in future applications, RFID. Mobility detection, an essential property for this work, can be facilitated through this. Complicated user tracking and application mobility mechanisms are avoided, focusing this work purely on the adaptation mechanisms for mobile multimodal user interfaces. Furthermore, features such as identity management are mostly available already through the registration of the mobile terminal with a service provider's network, and can be leveraged automatically. For practical reasons, mobile application platforms provide a well-advanced development environment in which to achieve a proof-of-concept implementation of mobile multimodal user interfaces within this work.

On the other hand, it is easy to see that a portal device provides a single point of failure as one of the disadvantages. If the device becomes faulty, or the user has accidentally left the phone somewhere else so that it is not accessible, functionalities can no longer be used. One can distinguish between a terminal-centric approach to mobile multimodal user interfaces, mostly motivated by terminal manufacturers, and a service-platform-centric approach motivated by network operators. In the second case, application logic exists partly in the network; therefore, if a device fails, it can be substituted by a spare device by logging in with the same user credentials, and subscribed applications will instantly be available again.

Eye Tracking and Web Experience

Eye tracking can provide unique insight into visual search tasks. Researchers often ask participants to think out loud and describe what they’re doing, but there are two potential downsides to this approach: (1) the participant can become distracted from the task, and (2) the participant may not report everything that he or she sees, either as an omission or because the information is not observed at a conscious level. Eye tracking provides an objective measurement of participants’ visual patterns that allows us to determine what aspects of our designs draw attention first and most. For websites like ours, this is particularly valuable when educating a user on a complex topic, making it clear what the next step in a process is, or driving users towards a call to action.

Eye Tracking to Support eLearning

This leads to the next point: how would one deal with the situation where participants already know the topic area sufficiently well that they can answer the questions without reading the learning materials? Indeed, this is a limitation of the current studies, given that prior knowledge was not accurately assessed. It would be advantageous to detect this situation, and doing so would make an interesting future user study. A potential solution is to perform a pre-assessment similar to format D from Chapter 3, where participants were shown the questions before being given the reading materials. Participants could be asked to complete the pre-assessment to the best of their ability and rate their knowledge of the subject matter and their confidence in answering the questions. They would then be given the reading materials, and observation of their eye gaze could take place. Such a scenario would set up testing for prior knowledge, and therefore detection of the eye gaze patterns of those who have (differing degrees of) prior knowledge. This would allow for much more accurate personalisation of adaptive content. For example, detecting that a student has significant prior knowledge of a subject allows the adaptive system to completely bypass the subject for that student. Moreover, if a student is detected to have partial prior knowledge, then that student could be provided only with the materials that cover their knowledge gap. This also highlights the potential benefits of combining pre-assessment with the use of eye tracking as complementary drivers of adaptive and personalised eLearning.
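As a rough illustration of the routing logic described above (the thresholds, names, and per-section scoring are hypothetical, not from the thesis):

```python
# Hypothetical sketch: choose which sections of a unit to present based on a
# pre-assessment. Full prior knowledge bypasses the unit; partial prior
# knowledge yields only the sections covering the student's knowledge gap.
def select_materials(section_scores, mastery=0.9, section_pass=0.6):
    """section_scores: dict mapping section name -> pre-assessment score in [0, 1]."""
    overall = sum(section_scores.values()) / len(section_scores)
    if overall >= mastery:
        return []  # significant prior knowledge: skip the unit entirely
    return [s for s, score in section_scores.items() if score < section_pass]

# Example: a student who aced "loops" but not "recursion" sees only the latter.
print(select_materials({"loops": 0.95, "recursion": 0.3}))  # -> ['recursion']
```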

Eye-tracking Film Music

In the post-task questionnaires, participants were asked to select from a list of adjectives the ones that best described their experience; they could also provide their own words. As the number of responses was low and unequally distributed, with some words provided only once, statistical analysis was inappropriate. However, these results still provide a qualitative understanding of user experience. Figures 7, 8 and 9 show word clouds of the terms that participants used to describe the experience of watching the moving images. In these word clouds, the size of a word indicates its relative frequency of use by participants for that version. As can be seen, the most frequent terms used to describe the silent version were “understandable” and “clear”, which are fairly neutral. In contrast, in both music versions the terms “entertaining” and “interesting” featured most, which suggests an increased level of immersion and flow when moving images were accompanied by music.

Interactive natural user interfaces

4.2.3 Working with the Wiimote Tracking Library
As mentioned above, Holovee uses a Wiimote game controller paired with infrared LEDs to create a hand-gesture-based user interface. We use a C#-based Wiimote library developed by Brian Peek in [41] to interact with the Wiimote game controller. Peek's Wiimote library allows application developers to easily connect to a Wiimote game controller via a Bluetooth interface. After a successful connection is established, applications can receive event notifications upon infrared light discovery. Once tracking, the Wiimote library gives developers access to the infrared light's (x, y) coordinates. These infrared light coordinates can then be normalized and mapped to our Holovee interface, which draws basic screen cursors. From our testing, the Wiimote game controller can track up to four points simultaneously at a range of about 20-30 feet [26]. During our testing, we found that clustering multiple infrared LEDs together tracked much better than single LEDs. Even though our application does not make use of them, the Wiimote library gives easy access to the Wiimote's accelerometer and rumble controls. More information about our infrared LED component will be discussed in the upcoming Input Devices chapter.
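The normalize-and-map step might look like the following sketch (in Python rather than the C# library; the Wiimote's IR camera reports positions in a 1024×768 coordinate space, while the screen size and the horizontal mirroring are assumptions that depend on the physical setup):

```python
# Normalize a Wiimote IR point to screen coordinates for a basic cursor.
IR_WIDTH, IR_HEIGHT = 1024, 768  # resolution of the Wiimote IR camera

def ir_to_screen(ir_x, ir_y, screen_w=1920, screen_h=1080):
    # Mirror x because the camera faces the user in this assumed setup.
    nx = 1.0 - ir_x / IR_WIDTH
    ny = ir_y / IR_HEIGHT
    return int(nx * screen_w), int(ny * screen_h)

# An IR point near the camera's center maps near the screen's center.
print(ir_to_screen(512, 384))  # -> (960, 540)
```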

Interactive natural user interfaces

A missing feature of our infrared tracking gloves is the ability to click or pinch. Surely, our speech recognition component is a sufficient substitute for opening photo albums or issuing commands. However, a much more intuitive mechanism would be to mount a simple push button onto the user's tracking gloves. In our search for a simple push button, we researched a company called Phidgets [42]. Phidgets manufactures a plethora of plug-and-play USB components which have stable high-level programming interfaces for easy manipulation. The only downside to Phidgets is that they are only offered as wired USB interfaces. Our infrared gloves are meant to be both comfortable and able to move freely. By adding a wired component to our device, we would limit the user to being within wired distance of a computer to use a simple push button mechanism. A more viable solution would be to find a wireless simple push button that could mount to our user's gloves. Moreover, another solution may be to mount additional infrared LEDs on the user's glove thumbs. Since our Wiimote can simultaneously track up to four points, we could develop an algorithm which detects when a user's index finger and thumb perform a pinching motion.
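A sketch of that pinch-detection idea (the threshold is illustrative, and a real implementation would also need to associate tracked points with fingers across frames):

```python
# With one IR LED on the index finger and one on the thumb, signal a pinch
# when the two tracked points converge below a distance threshold.
import math

def is_pinching(index_pt, thumb_pt, threshold=40.0):
    """Points are (x, y) in the Wiimote's 1024x768 IR coordinate space."""
    dx = index_pt[0] - thumb_pt[0]
    dy = index_pt[1] - thumb_pt[1]
    return math.hypot(dx, dy) < threshold

print(is_pinching((500, 400), (520, 410)))  # -> True (points ~22 px apart)
```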

User Interfaces for Cooperation

2.3 3D-Pointing Benchmarks
3D pointing can be considered a small extension of its 1D and 2D counterparts. It seems to involve the same cognitive processes and very similar motor actions, but with one additional degree of freedom in the target location. Under real-world conditions, this does not seem to make much difference. The motion trajectories between tapping actions involve all three dimensions anyway, and if the targets are within reach, the number of involved limbs also remains the same. A physical target object generally provides support that effectively prevents overshooting in one direction, independent of whether the targets are arranged in one, two, or three dimensions. Our motor skills, however, seem to vary with the direction of movement. Apparently, target locations right in front of us can be reached faster than those that involve movement to the side or upwards. Empirical studies of these effects in real-world 3D pointing experiments have revealed differences in the range of 100 ms for pointing tasks with IDs between 2.5 and 6 [57, 241]. Murata and Iwase reported an average throughput of about 4.7 bits/s [241]. Perhaps better performance would have been possible with a less encumbering measurement apparatus than the wired 3D tracking device they attached to the participants' fingertips. From the data provided by Cha and Myung on a similar 3D target acquisition task we can derive an average throughput of about 6 bits/s, which seems to better reflect our 3D pointing capabilities in the real world [57].
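For reference, the quoted IDs and throughputs relate through the standard Fitts'-law quantities; the example below uses the common Shannon formulation with an illustrative movement time, not data from the cited studies:

```latex
% Shannon form of the index of difficulty and the resulting throughput.
% D = movement distance, W = target width, MT = movement time.
ID = \log_2\!\left(\frac{D}{W} + 1\right)\ \text{bits},
\qquad
TP = \frac{ID}{MT}\ \text{bits/s}
% Example: a task with ID = 4 bits completed in MT = 0.85 s yields
% TP = 4 / 0.85 \approx 4.7 bits/s, the average Murata and Iwase report.
```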

A Novel User Experience Study of Parallax Scrolling using Eye Tracking and User Experience Questionnaire

In 2017, Wang and Sundar [6] conducted a more recent user experience study of parallax scrolling. They investigated the effect of dimension (with vs. without presence) and interaction technique (clicking vs. scrolling) on several dependent variables: perceived vividness, perceived coolness, natural mapping, perceived ease of use, user engagement, attitude toward the website, behavioral intention toward the website, attitude toward the product, and behavioral intention toward the product. They presented four websites for a Samsung Gear product, each of which was manipulated based on the studied independent variables. Wang and Sundar argued that parallax scrolling improves both the perception of coolness and vividness, and is positively associated with reported perceived ease of use and user engagement. However, they also found that when users said parallax scrolling did not improve perceived vividness, coolness, and ease of use, they were less likely to be engaged with the content of the website.

Adaptive User Interface of Product Recommendation Based on Eye-tracking

IGA is similar to GA, but it can ‘interact’ with users and perceive the user's response, emotion, or preference as the fitness value for optimization problems where the fitness function (which assesses the performance of an individual) cannot be explicitly defined or effectively formalized. Hence, IGA can be used to solve problems that cannot easily be solved by GA, such as aesthetic design and art evolutionary systems [18] [6]. In applications, fitness is valued based on explicit human evaluation [15], e.g., rating. To reduce these explicit actions, implicit evaluation methods are used to compute the fitness function, such as an eye-tracking-based evolutionary algorithm for optimizing the OneMax (or BitCounting) problem [11] [2]. That research used three important eye-movement parameters: the time the user focuses on the screen area of interest, the number of transitions towards the area, and the average pupil diameter; its basic hypothesis was that the more an individual was visually examined, the higher its fitness value would be. Nevertheless, the related research, such as [11], only coded one feature, color, into the bit string; but for …
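A rough sketch of such an implicit, gaze-based fitness (the linear combination and its weights are hypothetical, not taken from the cited work):

```python
# Combine the three eye-movement parameters named above into one fitness
# value: dwell time on the individual's screen area, transitions toward it,
# and mean pupil diameter. Weights are illustrative.
def gaze_fitness(dwell_time_s, transitions, mean_pupil_mm,
                 w_dwell=1.0, w_trans=0.5, w_pupil=0.2):
    return (w_dwell * dwell_time_s
            + w_trans * transitions
            + w_pupil * mean_pupil_mm)

# In an IGA loop, this score would stand in for an explicit user rating when
# selecting parents for the next generation.
print(gaze_fitness(dwell_time_s=2.4, transitions=3, mean_pupil_mm=4.1))
```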

How To Use Eye Tracking With A Dual Eye Tracking System In Collaborative Eye Tracking (Duet)

Abstract Dual user eye tracking offers insight into the collaborative behaviours of participants, as well as the potential for improving communication effectiveness when direct eye contact is unavailable. In this paper we outline a system framework for performing colocated synchronous dual eye tracking. Issues around system setup and eye tracker calibration are discussed. An evaluation of accuracy found that lower accuracy was achieved for the viewer who was off axis to the primary display. The impact of lower accuracy depends on the intended application. For the test application presented here in which on-screen gaze markers indicate the general area of the viewer’s interest, the accuracy achieved proved sufficient.
