I would be really, really interested in how GazeHawk is doing webcam eye tracking. Last year I decided to pivot my usabilitytest.com business to do exactly what they are doing now. I had been testing, researching, and combining different eye-tracking and head-tracking algorithms, APIs, etc. I must say that for an engineer it was the ultimate challenge. However, after 3 months of working on it I realized it was too much for me and called it quits. I did manage to get eye tracking working using 2 webcams (one in front to monitor head movement and one small one near the eye to monitor eye movement). Unfortunately, because of the nature of eye reflection I had to work in complete darkness and use IR light to illuminate the eye - you can't call that a non-intrusive method.
So I am very happy that the guys at GazeHawk cracked this and launched the service I really wanted to build, offering people a low-cost eye-tracking method.
You won't find any argument from me; eye-tracking is extremely hard stuff to do correctly and reliably. There are many things that I still want to do to improve our implementation, but can't find the time to write.
(Insert a shameless request for interested readers to send me their resumes here!)
"There are many things that I still want to do to improve our implementation, but can't find the time to write."
- so you did the main part of the coding on GazeHawk?
What I did was quite simple - I combined the open-source GazeTracker (http://www.gazegroup.org/downloads/23-gazetracker - it uses OpenCV) with FaceAPI from SeeingMachines. I also experimented with an accelerometer module instead of FaceAPI and the second webcam, but that was a dead end :) In the end, after calibration, I managed to get about 1-2 cm accuracy on 2 test subjects. Then my dad came along with his smaller eyes, the accuracy dropped dramatically, and I gave up :)
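For the curious, the pupil-finding step at the heart of this is conceptually simple. The sketch below is my own illustration, not GazeTracker's or GazeHawk's actual code: it assumes you already have a rough grayscale crop of the eye from some face/eye detector, treats the darkest pixels as pupil, and takes their centroid (the threshold and minimum pixel count are made-up numbers).

    // Rough sketch, not GazeTracker's actual code: estimate the pupil centre
    // inside an eye crop by treating the darkest pixels as pupil and taking
    // their centroid.
    interface Point { x: number; y: number; }

    // data: grayscale values (0-255), row-major, width * height long,
    // covering just the eye region produced by some face/eye detector.
    function estimatePupilCentre(
      data: Uint8Array,
      width: number,
      height: number,
      darkThreshold = 40 // assumption: pupil pixels are the darkest in the crop
    ): Point | null {
      let sumX = 0, sumY = 0, count = 0;
      for (let y = 0; y < height; y++) {
        for (let x = 0; x < width; x++) {
          if (data[y * width + x] < darkThreshold) {
            sumX += x;
            sumY += y;
            count++;
          }
        }
      }
      // Too few dark pixels usually means a blink or a bad crop.
      if (count < 10) return null;
      return { x: sumX / count, y: sumY / count };
    }

Calibration then has to map that centroid, together with the head pose from FaceAPI, onto screen coordinates, and that mapping is where most of the hard work and the accuracy problems show up.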
Which face detection module do you use? Do you need to pinpoint the pupil with a detection tool before the later processing, or do you estimate the pupil position from the rough detected eye region?
Has anyone tried blurring out content far from the mouse pointer -- perhaps with semi-transparent images or newer CSS/canvas capabilities -- so that users have to move the mouse near whatever they're reading, making mouse position (for testing purposes) a better proxy for gaze?
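To make the idea concrete, here is a rough, untested browser-side sketch (the selectors, blur radius, and distance are all arbitrary choices of mine): blur every text block by default and sharpen only the blocks near the pointer, so reading forces the mouse to follow the eyes.

    // Rough sketch only: blur text blocks that are far from the mouse pointer
    // and sharpen the ones near it, so the pointer has to track reading.
    const SHARP_DISTANCE = 80; // px; arbitrary "readable" radius around the cursor

    const blocks = Array.from(
      document.querySelectorAll<HTMLElement>('p, li, h1, h2, h3')
    );

    // Start with everything blurred.
    for (const el of blocks) {
      el.style.transition = 'filter 0.15s';
      el.style.filter = 'blur(4px)';
    }

    document.addEventListener('mousemove', (e: MouseEvent) => {
      for (const el of blocks) {
        const r = el.getBoundingClientRect();
        // Distance from the pointer to the nearest edge of the block.
        const dx = Math.max(r.left - e.clientX, 0, e.clientX - r.right);
        const dy = Math.max(r.top - e.clientY, 0, e.clientY - r.bottom);
        const near = Math.hypot(dx, dy) < SHARP_DISTANCE;
        el.style.filter = near ? 'none' : 'blur(4px)';
      }
    });

Whether the blur itself distorts behavior is a separate question, of course.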
You'll get bad data. If you put a piece of content to the side in a blurred-out box for testing, people may very well go investigate it -- unblur it and they ignore it entirely.
That's actually used for security applications, as it prevents shoulder-surfing pretty readily. I don't think the average site would want to compromise the user experience that heavily though.
If I don't have to, why should I choose only one or the other? Ideally, I'd love to have access to both sets of studies for my websites.
I would assume that eye tracking is primarily for looking at content and secondarily for looking for next actions, while mouse movement would be primarily for locating next actions and secondarily for interacting with content.
I can personally attest that my mouse and my eyes are only very loosely correlated, maybe 50% at best.
(Much of the time I am using hotkeys, and when I am actually using the mouse it is often flitting around in my peripheral vision where I can see it but don't focus on it)
When I'm reading Hacker News my eyes are on the comments, but the mouse pointer is off to the left or right side so it isn't causing any distraction. I do it all the time. Sometimes I also highlight what I'm reading, mainly if I'm multitasking and it's a long text; I want to know where I was the last time I lost focus on the text.
Re: highlighting, I do the exact same thing. It has always made me wonder whether an add-on or something could handle that in a way that has advantages over the highlighting method.
Then again, highlighting is low-tech, easy enough and works fine. Maybe I'm looking for an idea that isn't there.
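Thinking out loud, the simplest version might just be a little userscript/add-on that remembers the scroll position per page and restores it on the next visit. A rough, untested sketch (the storage key naming is arbitrary):

    // Rough sketch: remember the reading position per URL and restore it later.
    const KEY = 'reading-pos:' + location.href;

    // Jump back to where the reader left off, if we have a saved position.
    const saved = localStorage.getItem(KEY);
    if (saved !== null) {
      window.scrollTo(0, Number(saved));
    }

    // Save the current position as the reader scrolls (lightly throttled).
    let pending = false;
    window.addEventListener('scroll', () => {
      if (pending) return;
      pending = true;
      setTimeout(() => {
        localStorage.setItem(KEY, String(window.scrollY));
        pending = false;
      }, 500);
    });

It loses the "which sentence exactly" precision of a manual highlight, though, so maybe highlighting really is the right low-tech answer.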
A good point. Interestingly enough, around a quarter of our testers make > $90k/yr. We don't break it down more than that (and I don't know how active they are in actually testing), but I suspect it was because of interest in the new technology rather than actual desire to make money (even if we do pay well).