A first and already quite concrete use case we came up with tackles the problem of quality assurance for mobile websites. Mobile apps are capable of displaying arbitrary websites in a containing element, a webView. Eye tracking, combined with the input modality of speech, could then enable the user to easily annotate certain UI elements simply by looking at them while speaking out what is on their mind. The beauty of this approach lies in the fact that it is now possible with just one device. Before ARKit2 and similar frameworks introduced this functionality, such an implementation would have required additional eye tracking devices, either head-mounted or external. A very common use case of such additional eye tracking devices was usability testing. Instead of running dedicated studies, gaze data can now be obtained during the day-to-day usage of the app itself; every user has the required camera roughly pointed at their eyes most of the time anyway. Of course, this assumes that the user accepts being tracked in order to help improve the app's experience.
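To make the idea more concrete, here is a minimal sketch of how gaze data could be obtained with ARKit 2 on a TrueDepth-equipped device. The `GazeTracker` class name and the bare `print` handling are illustrative assumptions; the `ARFaceAnchor.lookAtPoint` property and `ARFaceTrackingConfiguration` are part of ARKit's public API.

```swift
import ARKit

// Minimal sketch, assuming iOS 12+ and a device with a TrueDepth camera.
// ARKit 2 delivers per-frame gaze estimates via ARFaceAnchor.
final class GazeTracker: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Face tracking is only available on TrueDepth-capable devices.
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        guard let face = anchors.compactMap({ $0 as? ARFaceAnchor }).first else { return }
        // lookAtPoint is the estimated gaze target in face-anchor coordinates.
        // Projecting it into screen space -- e.g. to hit-test an element inside
        // a webView -- could use ARCamera.projectPoint(_:orientation:viewportSize:).
        let gaze = face.lookAtPoint
        print("gaze (face space): \(gaze)")
    }
}
```

For the annotation scenario described above, such gaze samples would be paired with a speech transcript (for instance from `SFSpeechRecognizer`) so that a spoken remark can be attached to whichever element the user was looking at.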