And I don't mean "how many people have read this page?" -- I'm musing about, literally, when we're going to have our computers start really paying attention to the user's eyes.
I was thinking about that as I was proofing my last entry. When I wanted to change a word, I exited proofing mode, used the mouse to go to the right position, backspaced over the word, and used the keyboard to re-enter it. But what I really wanted was to simply say, "strike 'fill' and replace it with 'feel'", and have the computer know which word I was talking about.
I'm sure it's *possible* -- it just requires combining good speech-sensitive editing capabilities with a camera sensitive enough to track the user's eye movements, and well-enough aligned with the screen to serve as a pointer. (I envision an alignment UI sort of like the one you get when you boot a Palm PDA -- "Look at this dot. Now look at this dot.") Not easy, but I suspect it's feasible with current high-end technology, and ought to become more straightforward as camera tech improves.
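That "look at this dot" alignment step boils down to fitting a map from the camera's raw gaze estimate to actual screen coordinates. Here's a minimal sketch of how the math could work (purely illustrative — it assumes the tracker already produces raw 2-D gaze estimates, and that a simple affine fit is good enough):

```python
import numpy as np

def fit_calibration(raw_points, screen_points):
    """Fit an affine map (2x2 matrix A and offset b) so that
    screen ~= A @ raw + b, via least squares over the calibration dots.

    raw_points:    raw gaze estimates recorded while the user looked
                   at each calibration dot, shape (n, 2)
    screen_points: the known screen positions of those dots, shape (n, 2)
    """
    raw = np.asarray(raw_points, dtype=float)
    screen = np.asarray(screen_points, dtype=float)
    # Augment raw coords with a constant 1 column to absorb the offset b.
    X = np.hstack([raw, np.ones((len(raw), 1))])   # shape (n, 3)
    # Solve X @ M ~= screen in the least-squares sense; M stacks A's
    # transpose on top of the offset row b.
    M, *_ = np.linalg.lstsq(X, screen, rcond=None)
    A, b = M[:2].T, M[2]
    return A, b

def gaze_to_screen(A, b, raw_xy):
    """Map a single raw gaze estimate to a screen coordinate."""
    return A @ np.asarray(raw_xy, dtype=float) + b
```

With three non-collinear dots the fit is exact for a purely affine distortion; showing more dots (as the Palm stylus calibration does) lets the least-squares fit average out measurement jitter.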
The mouse is a nice tool, and good for precision work, but a hassle to use most of the time -- you have to stop whatever you're doing with your hands to use it. Geeks use keyboard shortcuts instead, but I suspect we're going to see the emergence of alternatives better suited to mass use. *Minority Report*-style virtual displays are one approach (especially if you think the keyboard itself will go away), but eye tracking seems to have a lot of potential for very intuitively doing what you want...
(no subject)
Date: 2006-12-18 07:30 pm (UTC)

The demo of head-tracking I got a few months ago at MIT indicated that the little dot really isn't necessary any more; they were tracking both eye locations, eye shape, eyebrows, and mouth shape, all quite accurately and at 20+fps with a simple camera pointed at your face. Make a quick 3d model of your head during the calibration process from five snapshots using other existing tech, and the software will have a very good indication of where your head is turned...
http://citeseer.comp.nus.edu.sg/kapoor02realtime.html
http://vismod.media.mit.edu/vismod/demos/faceview/head-track/head-track.html
etc.
(no subject)
Date: 2006-12-18 07:49 pm (UTC)

(I'll pass the word along to my friend who uses the dot-mouse due to wrist issues.)