And I don't mean "how many people have read this page?" -- I'm musing about, literally, when we're going to have our computers start really paying attention to the user's eyes.
I was thinking about that as I was proofing my last entry. When I wanted to change a word, I exited proofing mode, used the mouse to go to the right position, backspaced over the word, and used the keyboard to re-enter it. But what I really wanted was to simply say, "strike 'fill' and replace it with 'feel'", and have the computer know which word I was talking about.
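That gaze-plus-speech edit could, in principle, reduce to "find the occurrence of the spoken word nearest the gaze point." A toy sketch of that idea, using an invented word-layout format (a real editor would need actual glyph geometry, and the function name is mine, not any existing API):

```python
def replace_word_at_gaze(words, gaze_xy, old, new):
    """Replace the occurrence of `old` nearest the gaze point with `new`.

    `words` is a list of (text, (x, y)) pairs -- each word with the
    center of its on-screen bounding box. Returns a new word list.
    """
    candidates = [i for i, (w, _) in enumerate(words) if w == old]
    if not candidates:
        return words
    gx, gy = gaze_xy
    # Squared distance is enough for picking the nearest occurrence.
    nearest = min(candidates,
                  key=lambda i: (words[i][1][0] - gx) ** 2 +
                                (words[i][1][1] - gy) ** 2)
    return [(new, pos) if i == nearest else (w, pos)
            for i, (w, pos) in enumerate(words)]
```

With two occurrences of "fill" on screen, the one nearest the last gaze fixation gets swapped for "feel" and the other is left alone.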
I'm sure it's *possible* -- it just requires combining good speech-sensitive editing capabilities with a camera sensitive enough to track the user's eye movements, and well-enough aligned with the screen to serve as a pointer. (I envision an alignment UI sort of like the one you get when you boot a Palm PDA -- "Look at this dot. Now look at this dot.") Not easy, but I suspect it's feasible with current high-end technology, and ought to become more straightforward as camera tech improves.
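The "look at this dot" alignment step amounts to a fitting problem: record tracker readings while the user fixates a few known screen dots, then fit a mapping between the two coordinate systems. A minimal sketch, assuming a plain affine relationship (real trackers have to cope with head movement and lens distortion, which this ignores):

```python
import numpy as np

def fit_gaze_mapping(tracker_pts, screen_pts):
    """Fit an affine map (tracker coords -> screen coords) by least squares.

    tracker_pts, screen_pts: (N, 2) arrays of corresponding points,
    collected while the user looks at each known calibration dot.
    """
    tracker_pts = np.asarray(tracker_pts, dtype=float)
    screen_pts = np.asarray(screen_pts, dtype=float)
    # Append a column of ones so the fit includes a translation term.
    aug = np.hstack([tracker_pts, np.ones((len(tracker_pts), 1))])
    coeffs, *_ = np.linalg.lstsq(aug, screen_pts, rcond=None)
    return coeffs  # shape (3, 2): linear part plus offset

def gaze_to_screen(coeffs, tracker_xy):
    """Map one raw tracker reading to a screen position."""
    x, y = tracker_xy
    return np.array([x, y, 1.0]) @ coeffs
```

Five dots (four corners and the center, much like the Palm digitizer routine) are already enough to overdetermine the six affine parameters and average out a bit of noise.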
The mouse is a nice tool, and good for precision work, but a hassle to use most of the time -- you have to stop whatever you're doing with your hands to use it. Geeks use keyboard shortcuts instead, but I suspect that we're going to see the emergence of alternatives better suited to mass use. Minority Report-style virtual displays are one approach (especially if you think the keyboard itself will go away), but eye tracking seems to have a lot of potential for very intuitively doing what you want...
(no subject)
Date: 2006-12-18 07:20 pm (UTC)
I'm familiar with the issue you've described -- a good UI for a mouse is not necessarily a good UI for an eye-tracker. However, the amount of time the eye lingers on something is fairly representative of how interested you are in it, and it's possible to build some UIs (caveat: never done it, just read the papers) that are eerily good -- users reported it was like the computer was reading their mind. This may not be feasible in all domains or for all types of programs; the instances I read about were apps where focus on an object pretty clearly meant "I want to learn more about this thing." I'd imagine that when focus on an object is ambiguous in intent, what to do about that attention becomes less clear.
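The dwell-time idea described here can be sketched as a simple accumulator: trigger a selection once the gaze has stayed inside an object's region for some threshold. A toy illustration (the thresholds and region format are invented for the example, not taken from the papers mentioned):

```python
def dwell_select(gaze_samples, regions, dwell_ms=400, sample_ms=20):
    """Return the first region name the gaze dwells in for `dwell_ms`.

    gaze_samples: sequence of (x, y) points, one every `sample_ms`.
    regions: dict mapping name -> (x0, y0, x1, y1) bounding box.
    Returns None if no region accumulates enough consecutive samples.
    """
    needed = dwell_ms // sample_ms
    current, count = None, 0
    for x, y in gaze_samples:
        hit = next((name for name, (x0, y0, x1, y1) in regions.items()
                    if x0 <= x <= x1 and y0 <= y <= y1), None)
        if hit is not None and hit == current:
            count += 1
            if count >= needed:
                return hit
        else:
            # Gaze moved to a new region (or off all of them): reset.
            current, count = hit, (1 if hit is not None else 0)
    return None
```

The hard part the comment alludes to isn't the accumulator -- it's deciding what the triggered action should be when "lingering" doesn't unambiguously mean "I want this."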
And on an only-vaguely related note, there do exist head-mounted mice - stick a little reflective dot on your forehead, do a quick calibration, and boom - the cursor goes where you turn your head. Clicking can either be done with foot pedals or a mouse-in-the-lap, though I'd imagine that voice commands could also be used via application of appropriate tech.
(no subject)
Date: 2006-12-18 07:30 pm (UTC)
The demo of head-tracking I got a few months ago at MIT indicated that the little dot really isn't necessary any more; they were tracking eye locations, eye shape, eyebrows, and mouth shape, all quite accurately and at 20+ fps with a simple camera pointed at your face. Make a quick 3D model of your head during the calibration process from five snapshots using other existing tech, and the software will have a very good indication of where your head is turned...
http://citeseer.comp.nus.edu.sg/kapoor02realtime.html
http://vismod.media.mit.edu/vismod/demos/faceview/head-track/head-track.html
etc.
(no subject)
Date: 2006-12-18 07:49 pm (UTC)
(I'll pass the word along to my friend who uses the dot-mouse due to wrist issues.)