jducoeur ([personal profile] jducoeur) wrote 2006-12-18 01:17 pm

So when do we start doing eyeball tracking?

And I don't mean "how many people have read this page?" -- I'm musing about, literally, when we're going to have our computers start really paying attention to the user's eyes.

I was thinking about that as I was proofing my last entry. When I wanted to change a word, I exited proofing mode, used the mouse to go to the right position, backspaced over the word, and used the keyboard to re-enter it. But what I really wanted was to simply say, "strike 'fill' and replace it with 'feel'", and have the computer know which word I was talking about.

I'm sure it's *possible* -- it just requires combining good speech-sensitive editing capabilities with a camera sensitive enough to track the user's eye movements, and well-enough aligned with the screen to serve as a pointer. (I envision an alignment UI sort of like the one you get when you boot a Palm PDA -- "Look at this dot. Now look at this dot.") Not easy, but I suspect it's feasible with current high-end technology, and ought to become more straightforward as camera tech improves.
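To make the alignment step a bit more concrete, here is a minimal sketch (Python with numpy -- my choice, nothing the post commits to) of how a few "look at this dot" calibration points could be turned into a least-squares affine map from raw camera gaze estimates to screen pixels. The sample numbers and the 0-to-1 raw-gaze format are invented for illustration; a real tracker would also have to cope with head movement and per-eye differences.

```python
# Hypothetical calibration sketch: fit an affine map from raw gaze readings
# to screen coordinates using a few "look at this dot" points.
import numpy as np

# Made-up calibration data: raw gaze estimates from the camera, and the
# known screen positions of the dots the user was asked to look at.
raw_gaze = np.array([[0.21, 0.18], [0.79, 0.17], [0.22, 0.81], [0.80, 0.83]])
screen_px = np.array([[100, 100], [1180, 100], [100, 700], [1180, 700]])

# Solve screen = [raw, 1] @ A in the least-squares sense (affine transform).
design = np.hstack([raw_gaze, np.ones((len(raw_gaze), 1))])
A, *_ = np.linalg.lstsq(design, screen_px, rcond=None)

def gaze_to_screen(x, y):
    """Map a raw gaze estimate to an approximate on-screen point."""
    return np.array([x, y, 1.0]) @ A

print(gaze_to_screen(0.5, 0.5))  # lands roughly mid-screen
```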

The mouse is a nice tool, and good for precision work, but a hassle to use most of the time -- you have to stop whatever you're doing with your hands to use it. Geeks use keyboard shortcuts instead, but I suspect that we're going to see the emergence of alternatives better suited to mass use. Minority Report-style virtual displays are one approach (especially if you think the keyboard itself will go away), but eye tracking seems to have a lot of potential for very intuitively doing what you want...

[identity profile] ian-gunn.livejournal.com 2006-12-18 06:48 pm (UTC)
It's not so much that you are distracted and looking about but that the eye/brain has a natural scanning mechanism built into the vision process: saccadic eye motion (http://en.wikipedia.org/wiki/Saccade). I don't think that is trainable away per se, but you could certainly train a person to focus enough on something for a machine algorithm to pick up on it.

[identity profile] metahacker.livejournal.com 2006-12-18 07:17 pm (UTC)
It's not so much saccades that are the problem -- I was given a quick demo of center-weighting algorithms to figure out a person's average gaze location that evidently work "okay" -- but that you really do fixate only for a short time compared to, say, the process of moving a mouse and clicking on a target. I suppose this means we should think of eye fixations as akin to key presses rather than mouse move-and-clicks...
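If fixations really are going to play the role of key presses, the obvious first sketch is dwell detection: report a "press" when the gaze stays inside a small radius for long enough. The thresholds and the (t, x, y) sample format below are my assumptions, not anything from the demo mentioned above.

```python
# Dwell-based "fixation as key press" sketch. Thresholds and the
# (t, x, y) sample format are illustrative guesses.

DWELL_RADIUS_PX = 40   # how tightly the gaze must cluster
DWELL_TIME_S = 0.3     # how long it must stay there

def detect_dwell_presses(samples):
    """samples: iterable of (t, x, y) gaze points; yields (t, x, y) presses."""
    window = []
    for t, x, y in samples:
        window.append((t, x, y))
        # Keep only samples still within the radius of the newest point.
        window = [s for s in window
                  if (s[1] - x) ** 2 + (s[2] - y) ** 2 <= DWELL_RADIUS_PX ** 2]
        if window and t - window[0][0] >= DWELL_TIME_S:
            cx = sum(s[1] for s in window) / len(window)
            cy = sum(s[2] for s in window) / len(window)
            yield (t, cx, cy)
            window = []   # one dwell produces one press
```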

[identity profile] bkdelong.livejournal.com 2006-12-18 07:28 pm (UTC)
Or program the computer to recognize the saccadic motion and "filter" it.
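A minimal version of that filter is a velocity threshold: consecutive samples moving faster than some cutoff are treated as saccade motion and dropped, leaving the fixation samples behind. The cutoff and the sample format here are guesses for the sake of a sketch.

```python
# Velocity-threshold saccade filter sketch: keep only slow-moving samples,
# which are the likely fixations. The cutoff value is an illustrative guess.
import math

SACCADE_SPEED_PX_PER_S = 800  # above this, treat the movement as a saccade

def filter_saccades(samples):
    """samples: list of (t, x, y) gaze points; returns likely-fixation samples."""
    kept = []
    for prev, cur in zip(samples, samples[1:]):
        dt = cur[0] - prev[0]
        if dt <= 0:
            continue
        speed = math.hypot(cur[1] - prev[1], cur[2] - prev[2]) / dt
        if speed < SACCADE_SPEED_PX_PER_S:
            kept.append(cur)
    return kept
```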