Hacker News
Eye Tracking vs. Mouse Tracking (gazehawk.com)
67 points by bkrausz on Oct 18, 2010 | 19 comments



I would be really interested in how GazeHawk is doing webcam eye tracking. Last year I decided to pivot my usabilitytest.com business to do exactly what they are doing now. I had been testing, researching, and combining different eye-tracking and head-tracking algorithms and APIs, and I must say for an engineer it was an ultimate challenge. However, after 3 months of working on it I realized it was too much for me and I called it quits. I managed to get eye tracking working using 2 webcams (one in front to monitor head movement and one small one near the eye to monitor eye movement). Unfortunately, because of the nature of eye reflection I had to work in complete darkness and use IR light to illuminate the eye - you can't call that a non-intrusive method.
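In case it helps anyone attempting the same thing: the eye-camera half of my setup boiled down to the classic dark-pupil idea - under IR illumination the pupil is the darkest blob in the frame, so you threshold it and take the centroid of the largest contour. A stripped-down sketch of just that step (OpenCV 4 in Python; the threshold and smoothing values are illustrative, not my exact pipeline):

    # Minimal dark-pupil sketch: find the pupil center in an IR-illuminated
    # close-up of one eye. Illustrative only - not a complete tracker.
    import cv2

    def find_pupil_center(eye_bgr, thresh=40):
        """Return (x, y) of the pupil center, or None if nothing is found."""
        gray = cv2.cvtColor(eye_bgr, cv2.COLOR_BGR2GRAY)
        gray = cv2.GaussianBlur(gray, (7, 7), 0)
        # Under IR light the pupil is the darkest region; invert-threshold it.
        _, mask = cv2.threshold(gray, thresh, 255, cv2.THRESH_BINARY_INV)
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)  # OpenCV 4 API
        if not contours:
            return None
        pupil = max(contours, key=cv2.contourArea)   # largest dark blob
        m = cv2.moments(pupil)
        if m["m00"] == 0:
            return None
        return (m["m10"] / m["m00"], m["m01"] / m["m00"])

    cap = cv2.VideoCapture(0)          # the eye-facing webcam
    ok, frame = cap.read()
    if ok:
        print(find_pupil_center(frame))
    cap.release()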

So I am very happy that the guys at GazeHawk cracked this and launched the service I really wanted to build, offering people a low-cost eye-tracking method.


I pointed to some of the general techniques we use on Quora: http://www.quora.com/GazeHawk/What-broad-computer-vision-tec...

> for an engineer it was an ultimate challenge

You won't find any argument from me; eye tracking is extremely hard to do correctly and reliably. There are many things that I still want to do to improve our implementation, but I can't find the time to write them.

(Insert a shameless request for interested readers to send me their resumes here!)


Thanks for the link - interesting.

"There are many things that I still want to do to improve our implementation, but can't find the time to write." - so you did the main part of coding on Gazehawk?

What I did was quite simple - I combined the open-source GazeTracker (http://www.gazegroup.org/downloads/23-gazetracker - it uses OpenCV) with FaceAPI from SeeingMachines. I also experimented with an accelerometer module and a second webcam instead of FaceAPI, but that was a dead end :) In the end, after calibration, I managed to get about 1-2cm accuracy on 2 test subjects. Then my dad came along with his smaller eyes, accuracy went dramatically down, and I gave up :)
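The calibration itself was nothing exotic: show the subject a handful of known points on screen, record the pupil position reported for each, and fit a mapping from pupil coordinates to screen coordinates. Roughly this kind of thing (a least-squares polynomial fit as an illustration - not GazeTracker's actual code, and the numbers are made up):

    # Rough sketch of gaze calibration: fit a 2nd-order polynomial mapping
    # from pupil coordinates to screen coordinates by least squares.
    import numpy as np

    def poly_features(px, py):
        # One row per sample: [1, x, y, xy, x^2, y^2]
        return np.column_stack([np.ones_like(px), px, py, px * py, px**2, py**2])

    def fit_calibration(pupil_xy, screen_xy):
        """pupil_xy, screen_xy: (N, 2) arrays from N calibration targets."""
        A = poly_features(pupil_xy[:, 0], pupil_xy[:, 1])
        coeffs, *_ = np.linalg.lstsq(A, screen_xy, rcond=None)   # shape (6, 2)
        return coeffs

    def pupil_to_screen(coeffs, pupil_xy):
        return poly_features(pupil_xy[:, 0], pupil_xy[:, 1]) @ coeffs

    # Example: 9-point calibration grid (made-up pupil/screen coordinates).
    pupil = np.array([[12, 8], [20, 8], [28, 8],
                      [12, 15], [20, 15], [28, 15],
                      [12, 22], [20, 22], [28, 22]], dtype=float)
    screen = np.array([[100, 100], [960, 100], [1820, 100],
                       [100, 540], [960, 540], [1820, 540],
                       [100, 980], [960, 980], [1820, 980]], dtype=float)
    coeffs = fit_calibration(pupil, screen)
    print(pupil_to_screen(coeffs, pupil[:1]))   # close to [[100, 100]]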


Which face detection module do you use? Do you need to pinpoint the pupil with a detection tool before further processing, or do you estimate the pupil position from the rough detected eye region?


Just emailed you.


You emailed me? I didn't receive anything. I'm not sure where you got my email, but here it is: jezowicz at gmail


Not you - jgershen. But I can email you as well if you like. I've been working on some eyetracking stuff and find it fun and interesting.


Has anyone tried fuzzing out content further from the mouse pointer -- perhaps with semi-transparent images or new CSS/canvas capabilities -- so that users have to move the mouse near what they're reading, thus making mouse position (for testing purposes) better model gaze?


You'll get bad data. If you put a piece of content off to the side in a blurred-out box for testing, people may very well go investigate it -- whereas unblurred they'd ignore it entirely.


You can probably account for that by timing how long your mouse was near that area, no?

You'll have to re-blur the areas when you leave them, of course, so that the reader goes back to the other content.
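The measurement side of that is simple enough: from timestamped mouse samples you just add up how long the pointer sat inside each blurred region. Something like this (the log format and regions are made up for illustration):

    # Dwell time of the mouse pointer per region, from a log of
    # (timestamp_seconds, x, y) samples. Format and regions are invented.
    def dwell_times(samples, regions):
        """samples: list of (t, x, y); regions: {name: (x0, y0, x1, y1)}."""
        totals = {name: 0.0 for name in regions}
        for (t0, x, y), (t1, _, _) in zip(samples, samples[1:]):
            for name, (x0, y0, x1, y1) in regions.items():
                if x0 <= x <= x1 and y0 <= y <= y1:
                    totals[name] += t1 - t0
        return totals

    regions = {"sidebar": (0, 0, 300, 900), "article": (320, 0, 1200, 900)}
    samples = [(0.0, 50, 100), (0.5, 60, 110), (1.0, 500, 300), (2.0, 510, 310)]
    print(dwell_times(samples, regions))   # {'sidebar': 1.0, 'article': 1.0}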


Maybe - the only way to confirm that the timing is an effective proxy is to use eye tracking :)


That's actually used for security applications, as it prevents shoulder-surfing pretty readily. I don't think the average site would want to compromise the user experience that heavily though.


If I don't have to, why should I choose only one or the other? Ideally, I'd love to have access to both sets of studies for my websites.

I would assume that eye tracking is primarily for looking at content and secondarily for looking for next actions, while mouse movement would be primarily for locating next actions and secondarily for interacting with content.


I can personally attest that my mouse and my eyes are only very loosely correlated, maybe 50% at best.

(Much of the time I am using hotkeys, and when I am actually using the mouse it is often flitting around in my peripheral vision where I can see it but don't focus on it)
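If you ever wanted to put a number on that, you'd resample the gaze trace and the mouse trace onto a shared timeline and correlate the coordinates. A toy sketch with synthetic data (nothing from a real study):

    # How loosely are gaze and mouse correlated? Assumes two (N, 3) arrays of
    # (t, x, y) samples already resampled onto the same timestamps.
    import numpy as np

    def gaze_mouse_correlation(gaze, mouse):
        rx = np.corrcoef(gaze[:, 1], mouse[:, 1])[0, 1]   # horizontal
        ry = np.corrcoef(gaze[:, 2], mouse[:, 2])[0, 1]   # vertical
        return rx, ry

    # Synthetic example: the mouse loosely follows the gaze with heavy noise.
    rng = np.random.default_rng(0)
    t = np.arange(0, 10, 0.1)
    gaze = np.column_stack([t, 500 + 300 * np.sin(t), 400 + 200 * np.cos(t)])
    mouse = gaze + np.column_stack([np.zeros_like(t),
                                    rng.normal(0, 250, t.size),
                                    rng.normal(0, 250, t.size)])
    print(gaze_mouse_correlation(gaze, mouse))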


I often move my mouse pointer out of the way so it's not distracting me. Surely that wouldn't be too uncommon?


When I'm reading Hacker News, my eyes are on the comments and the mouse pointer is off to the left or right side so it isn't causing any distraction. I do it all the time. Sometimes I also highlight what I'm reading, mainly if I'm multitasking and it's a long text; that way I know where I was the last time I lost focus on the text.


Re: highlighting, I do the exact same thing. It has always made me wonder whether an add-on or something could handle that in a way that gives it advantages over plain highlighting.

Then again, highlighting is low-tech, easy enough and works fine. Maybe I'm looking for an idea that isn't there.


The biggest flaw of eye tracking like this is that it's hard to get people in the $200k/yr bracket to participate.

With mouse tracking, I just need to bring them to my website.

Of course, I understand how much better eye tracking is. But that's not what I'm talking about here.


A good point. Interestingly enough, around a quarter of our testers make > $90k/yr. We don't break it down more than that (and I don't know how active they are in actually testing), but I suspect they signed up out of interest in the new technology rather than an actual desire to make money (even though we do pay well).



