And I don't mean "how many people have read this page?" -- I'm musing about, literally, when we're going to have our computers start really paying attention to the user's eyes.
I was thinking about that as I was proofing my last entry. When I wanted to change a word, I exited proofing mode, used the mouse to go to the right position, backspaced over the word, and used the keyboard to re-enter it. But what I really wanted was to simply say, "strike 'fill' and replace it with 'feel'", and have the computer know which word I was talking about.
I'm sure it's *possible* -- it just requires combining good speech-sensitive editing capabilities with a camera sensitive enough to track the user's eye movements, and well-enough aligned with the screen to serve as a pointer. (I envision an alignment UI sort of like the one you get when you boot a Palm PDA -- "Look at this dot. Now look at this dot.") Not easy, but I suspect it's feasible with current high-end technology, and ought to become more straightforward as camera tech improves.
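A minimal sketch of what that "look at this dot" alignment step could compute, assuming the tracker reports raw 2D coordinates for the eye: have the user fixate a few dots at known screen positions, then fit an affine map from tracker space to screen space by least squares. All names and numbers here are illustrative, not any real tracker's API:

```python
import numpy as np

def fit_calibration(raw_pts, screen_pts):
    """Fit an affine map (2x2 matrix plus translation) from raw eye-tracker
    coordinates to screen coordinates, by linear least squares."""
    raw = np.asarray(raw_pts, dtype=float)
    scr = np.asarray(screen_pts, dtype=float)
    # Augment with a constant column so the fit includes a translation term.
    X = np.hstack([raw, np.ones((len(raw), 1))])
    coef, *_ = np.linalg.lstsq(X, scr, rcond=None)  # shape (3, 2)
    return coef

def to_screen(coef, raw_pt):
    """Map one raw tracker reading to screen coordinates."""
    x, y = raw_pt
    return np.array([x, y, 1.0]) @ coef

# "Look at this dot. Now look at this dot." -- four dots at known positions:
screen_dots = [(0, 0), (1024, 0), (0, 768), (1024, 768)]
# Pretend the tracker reported these raw values while the user fixated each dot:
raw_readings = [(10, 5), (210, 8), (12, 155), (212, 158)]
coef = fit_calibration(raw_readings, screen_dots)
```

More dots give a better fit (and let you use a higher-order model to soak up lens distortion), which is presumably why real calibration routines use more than four.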
The mouse is a nice tool, and good for precision work, but a hassle to use most of the time -- you have to stop whatever you're doing with your hands to use it. Geeks use keyboard shortcuts instead, but I suspect that we're going to see the emergence of alternatives better-suited to mass use. Minority Report style virtual displays are one approach (especially if you think the keyboard itself will go away), but eye tracking seems to have a lot of potential for very intuitively doing what you want...
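The "strike 'fill'" scenario mostly comes down to disambiguation: if the editor knows each word's bounding box and has a calibrated gaze point, picking the occurrence of the spoken word nearest the gaze is straightforward. A toy sketch, with a made-up word layout:

```python
import math

def resolve_word(words, target, gaze_xy):
    """Among occurrences of `target`, pick the one whose on-screen box
    center is nearest the user's gaze point.
    `words` is a list of (text, (x, y, w, h)) entries."""
    candidates = [(text, box) for text, box in words if text == target]
    if not candidates:
        return None
    def dist(box):
        x, y, w, h = box
        cx, cy = x + w / 2, y + h / 2
        return math.hypot(cx - gaze_xy[0], cy - gaze_xy[1])
    return min(candidates, key=lambda wb: dist(wb[1]))

# Two occurrences of "fill"; the user is looking near the second one.
layout = [
    ("fill", (100, 40, 30, 12)),
    ("the",  (135, 40, 24, 12)),
    ("fill", (100, 200, 30, 12)),
]
word = resolve_word(layout, "fill", gaze_xy=(110, 198))
# `word[1]` is the bounding box of the occurrence to strike and replace.
```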
(no subject)
Date: 2006-12-18 06:33 pm (UTC)

One of the real issues, however, is that "pointing" with your eyes turns out to be very non-intuitive and not at all how the eyes work. You don't continue to look at something after you've mentally marked it; your eyes flicker across the page, distracted by a hundred things a second. It may be possible to fix this with training, but past studies (can't find links right now) haven't been too favorable. So basically it's an area waiting for a good design solution: how to figure out what the user is *thinking* of, based on where their eyes went...
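For what it's worth, the usual first step toward that design solution in the eye-tracking literature is to filter the raw flicker into fixations, e.g. dispersion-threshold identification ("I-DT"): only windows where the gaze stays put long enough count as "the user looked there". A rough, untuned sketch:

```python
def fixations(samples, max_dispersion=20.0, min_samples=6):
    """Dispersion-threshold fixation detection (I-DT style sketch).
    `samples` is a list of (x, y) gaze points at a fixed sample rate.
    Returns centroids of windows where the gaze stayed within
    `max_dispersion` pixels for at least `min_samples` samples."""
    out, i, n = [], 0, len(samples)
    while i < n:
        j = i + 1
        # Grow the window while dispersion (x-range + y-range) stays small.
        while j <= n:
            xs = [p[0] for p in samples[i:j]]
            ys = [p[1] for p in samples[i:j]]
            if (max(xs) - min(xs)) + (max(ys) - min(ys)) > max_dispersion:
                break
            j += 1
        j -= 1  # back off to the largest window within the threshold
        if j - i >= min_samples:
            xs = [p[0] for p in samples[i:j]]
            ys = [p[1] for p in samples[i:j]]
            out.append((sum(xs) / len(xs), sum(ys) / len(ys)))
            i = j
        else:
            i += 1
    return out

# A steady look, a quick saccade across the screen, another steady look:
fix_a = [(100, 100), (102, 101), (99, 100), (101, 99),
         (100, 102), (98, 100), (101, 101), (100, 100)]
sweep = [(150, 120), (200, 150), (250, 180)]
fix_b = [(300, 200)] * 8
found = fixations(fix_a + sweep + fix_b)
```

The saccade samples never satisfy the dispersion threshold, so only the two steady looks come out as fixations.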
(no subject)
Date: 2006-12-18 07:20 pm (UTC)

I'm familiar with the issue you've described - a good UI for a mouse is not necessarily a good UI for an eye-tracker. However, the amount of time the eye lingers on something is fairly representative of how interested you are in it, and it's possible to build UIs (caveat: never done it, just read the papers) that are eerily good - users reported it was like the computer was reading their mind. This may not be feasible in all domains or for all types of programs; the instances I read about were apps where focus on an object pretty clearly meant "I want to learn more about this thing". When the intent behind that focus is ambiguous, what to do with the attention becomes less clear.
And on an only-vaguely related note, there do exist head-mounted mice - stick a little reflective dot on your forehead, do a quick calibration, and boom - the cursor goes where you turn your head. Clicking can either be done with foot pedals or a mouse-in-the-lap, though I'd imagine that voice commands could also be used via application of appropriate tech.
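The lingering-eye signal described above is easy to prototype: accumulate dwell time per on-screen region and fire when some region crosses a threshold. A toy sketch - region names, sample rate, and threshold are all invented for illustration:

```python
def dwell_interest(gaze_samples, regions, dt=1 / 30, threshold=0.5):
    """Accumulate per-region dwell time from gaze samples taken `dt`
    seconds apart, and report regions whose dwell exceeds `threshold`
    seconds -- the 'this object interests me' signal.
    `regions` maps a name to an (x, y, w, h) box."""
    dwell = {name: 0.0 for name in regions}
    for gx, gy in gaze_samples:
        for name, (x, y, w, h) in regions.items():
            if x <= gx < x + w and y <= gy < y + h:
                dwell[name] += dt
    return {name for name, t in dwell.items() if t >= threshold}

regions = {"photo": (0, 0, 200, 200), "caption": (0, 200, 200, 40)}
# ~0.67s looking at the photo, ~0.33s glancing at the caption:
gaze = [(50, 50)] * 20 + [(100, 210)] * 10
interested = dwell_interest(gaze, regions)
```

Only the photo crosses the half-second threshold, so only it counts as "interesting" - which is exactly the kind of tuning question (how long is a meaningful linger?) the papers wrestle with.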
(no subject)
Date: 2006-12-18 07:30 pm (UTC)The demo of head-tracking I got a few months ago at MIT indicated that the little dot really isn't necessary any more; they were tracking both eye locations, eye shape, eyebrows, and mouth shape, all quite accurately and at 20+fps with a simple camera pointed at your face. Make a quick 3d model of your head during the calibration process from five snapshots using other existing tech, and the software will have a very good indication of where your head is turned...
http://citeseer.comp.nus.edu.sg/kapoor02realtime.html
http://vismod.media.mit.edu/vismod/demos/faceview/head-track/head-track.html
etc.
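As a crude illustration of one small piece of that (a toy model, not the MIT system): once both eyes are located in the frame, head roll falls out of the tilt of the line between them, and comparing the apparent eye spacing to a calibrated frontal spacing gives a rough, sign-ambiguous yaw. This ignores pitch, distance changes, and perspective entirely:

```python
import math

def head_pose_from_eyes(left_eye, right_eye, calibrated_spacing):
    """Rough head roll and yaw (in degrees) from the two eye positions
    in an image. Roll = tilt of the interocular line; yaw is approximated
    from foreshortening: apparent_spacing = calibrated_spacing * cos(yaw).
    The cosine can't tell left from right, so yaw comes back unsigned."""
    dx = right_eye[0] - left_eye[0]
    dy = right_eye[1] - left_eye[1]
    roll = math.degrees(math.atan2(dy, dx))
    apparent = math.hypot(dx, dy)
    ratio = min(1.0, apparent / calibrated_spacing)  # clamp against jitter
    yaw = math.degrees(math.acos(ratio))
    return roll, yaw
```

A full-face tracker with a 3D head model resolves the ambiguities this toy version punts on, which is why the systems in the links above track many more features than two points.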
(no subject)
Date: 2006-12-18 07:49 pm (UTC)

(I'll pass the word along to my friend who uses the dot-mouse due to wrist issues.)
(no subject)
Date: 2006-12-18 06:43 pm (UTC)

One should be able to use any PDA/phone camera that records video - that is, during the preview mode you'd see it attempt to zero in on your retina, and hopefully do a bit of recognition as well.
I honestly wish there were an open-source voice-recognition platform where I could continuously hone the voiceprint and take it around or import it into other applications. IMO it's still not really there yet, and I want us to hurry up and get to subvocal recognition - though some people see subvocalization as a problem rather than an enhancement.
(no subject)
Date: 2006-12-18 06:44 pm (UTC)

Next, they will be ransacking your computer for *what your eyes lingered on most*, and using it against you.
I'm gonna head down to Chinatown and buy a used abacus.
No, new. The last person might have had something on his fingers when he used it, and they'll try to pin it on me!
--Talvin