[personal profile] jducoeur
And I don't mean "how many people have read this page?" -- I'm musing about, literally, when we're going to have our computers start really paying attention to the user's eyes.

I was thinking about that as I was proofing my last entry. When I wanted to change a word, I exited proofing mode, used the mouse to go to the right position, backspaced over the word, and used the keyboard to re-enter it. But what I really wanted was to simply say, "strike 'fill' and replace it with 'feel'", and have the computer know which word I was talking about.

I'm sure it's *possible* -- it just requires combining good speech-sensitive editing capabilities with a camera sensitive enough to track the user's eye movements, and well-enough aligned with the screen to serve as a pointer. (I envision an alignment UI sort of like the one you get when you boot a Palm PDA -- "Look at this dot. Now look at this dot.") Not easy, but I suspect it's feasible with current high-end technology, and ought to become more straightforward as camera tech improves.
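(To make the calibration idea concrete: the "look at this dot" step amounts to collecting pairs of raw gaze readings and known screen positions, then fitting a least-squares map between them. A toy sketch in Python -- all coordinates and names are made up, not any real tracker's API:)

    # Toy sketch: fit an affine map from raw (camera-space) gaze readings
    # to screen coordinates, using "look at this dot" calibration samples.
    import numpy as np

    def fit_affine(gaze_pts, screen_pts):
        """Least-squares affine map: screen ~= [x, y, 1] @ coeffs."""
        G = np.hstack([np.asarray(gaze_pts), np.ones((len(gaze_pts), 1))])
        S = np.asarray(screen_pts)
        coeffs, *_ = np.linalg.lstsq(G, S, rcond=None)  # 3x2 matrix
        return coeffs

    def gaze_to_screen(coeffs, gaze_xy):
        x, y = gaze_xy
        return np.array([x, y, 1.0]) @ coeffs

    # Calibration: the user looks at each known dot while we record the
    # tracker's raw gaze estimate (invented numbers below).
    dots = [(100, 100), (1180, 100), (100, 620), (1180, 620), (640, 360)]
    raw  = [(0.12, 0.09), (0.91, 0.10), (0.11, 0.88), (0.93, 0.90), (0.52, 0.49)]
    A = fit_affine(raw, dots)
    print(gaze_to_screen(A, (0.5, 0.5)))  # lands near the screen center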

The mouse is a nice tool, and good for precision work, but a hassle to use most of the time -- you have to stop whatever you're doing with your hands to use it. Geeks use keyboard shortcuts instead, but I suspect that we're going to see the emergence of alternatives better-suited to mass use. Minority Report style virtual displays are one approach (especially if you think the keyboard itself will go away), but eye tracking seems to have a lot of potential for very intuitively doing what you want...

(no subject)

Date: 2006-12-18 06:33 pm (UTC)
From: [identity profile] metahacker.livejournal.com
The tech for eye tracking is currently good enough to do this -- but it's experimental tech, not consumer tech. Word-level precision is probably doable, assuming you don't use an exceptionally small font.

One of the real issues, however, is that "pointing" with your eyes turns out to be very non-intuitive and not at all how the eyes work. You don't keep looking at something after you've mentally marked it; your eyes flicker across the page, distracted by a hundred things a second. It may be possible to fix this with training, but past studies (can't find links right now) haven't been too favorable. So basically it's an area waiting for a good design solution: how to figure out what the user is *thinking* of, based on where their eyes went...

(no subject)

Date: 2006-12-18 06:45 pm (UTC)
From: [identity profile] bkdelong.livejournal.com
Agreed - that's probably where gesture recognition, rather than eye tracking, may prove more valuable. The problem you run into there is being too fat-fingered.

(no subject)

Date: 2006-12-18 06:48 pm (UTC)
From: [identity profile] ian-gunn.livejournal.com
It's not so much that you are distracted and looking about, but that the eye/brain has a natural scanning mechanism built into the vision process: saccadic eye motion (http://en.wikipedia.org/wiki/Saccade). I don't think that is trainable away per se, but you could certainly train a person to focus on something long enough for a machine algorithm to pick up on it.

(no subject)

Date: 2006-12-18 07:17 pm (UTC)
From: [identity profile] metahacker.livejournal.com
It's not so much saccades that are the problem -- I was given a quick demo of center-weighting algorithms that figure out a person's average gaze location, and they evidently work "okay" -- but that you really do fixate only for a short time compared to, say, the process of moving a mouse and clicking on a target. I suppose this means we should think of eye fixations as akin to key presses rather than mouse move-and-clicks...
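(My guess at the flavor of that center-weighting -- not what the demo actually ran: something as simple as an exponentially weighted moving average over the raw samples, so a single flicker can't yank the estimated gaze point around.)

    # Guessed sketch of "center-weighting": smooth noisy gaze samples with
    # an exponentially weighted moving average. Small alpha = heavy
    # smoothing; alpha near 1 = trust each raw sample.
    def smooth_gaze(samples, alpha=0.2):
        est = None
        for x, y in samples:
            if est is None:
                est = (x, y)
            else:
                est = (alpha * x + (1 - alpha) * est[0],
                       alpha * y + (1 - alpha) * est[1])
            yield est

    raw = [(100, 100), (104, 98), (300, 90), (101, 103), (99, 101)]
    print(list(smooth_gaze(raw))[-1])  # the one-sample flicker to (300, 90)
                                       # is damped, not followed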

(no subject)

Date: 2006-12-18 07:28 pm (UTC)
From: [identity profile] bkdelong.livejournal.com
Or program the computer to recognize the saccadic motion and "filter" it.
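(The standard trick for exactly that is dispersion-threshold fixation detection: any run of samples that stays inside a small box for long enough counts as a fixation, and everything in between is discarded as saccade. A rough sketch in Python, not any particular product's algorithm:)

    # Rough sketch of dispersion-threshold fixation detection (I-DT):
    # a run of gaze samples that stays inside a small box for at least
    # min_samples frames is a fixation; the rest is saccade noise.
    def centroid(pts):
        n = len(pts)
        return (sum(p[0] for p in pts) / n, sum(p[1] for p in pts) / n)

    def fixations(samples, max_dispersion=25, min_samples=6):
        window = []
        for pt in samples:
            window.append(pt)
            xs = [p[0] for p in window]
            ys = [p[1] for p in window]
            if (max(xs) - min(xs)) + (max(ys) - min(ys)) > max_dispersion:
                if len(window) - 1 >= min_samples:   # window minus the breaker
                    yield centroid(window[:-1])
                window = [pt]                        # restart at the new point
        if len(window) >= min_samples:               # trailing fixation
            yield centroid(window)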

(no subject)

Date: 2006-12-18 07:20 pm (UTC)
From: [personal profile] mindways
Back when I was in grad school ('95-'97), our department had a head-mounted eye tracker. It was very neat. :)

I'm familiar with the issue you've described - a good UI for a mouse is not necessarily a good UI for an eye tracker. However, the amount of time the eye lingers on something is fairly representative of how interested you are in it, and it's possible to build UIs around that (caveat: never done it, just read the papers) that are eerily good -- users reported it was like the computer was reading their mind. This may not be feasible in all domains or for all types of programs; the instances I read about were apps where focus on an object pretty clearly meant "I want to learn more about this thing". I'd imagine that when focus on an object is ambiguous in intent, what to do about that attention becomes less clear.
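(The dwell-time idea is easy to caricature in code: accumulate how long the gaze sits on each on-screen object and fire once it crosses a threshold. A toy sketch loosely inspired by those papers, not taken from them -- real systems would also decay the accumulator, handle re-entry, and so on:)

    # Toy dwell-time selector: accumulate gaze time per object and fire a
    # callback once interest crosses a threshold. Hypothetical API.
    class DwellSelector:
        def __init__(self, threshold_s=0.5, on_select=print):
            self.threshold_s = threshold_s
            self.on_select = on_select
            self.dwell = {}      # object id -> accumulated seconds
            self.fired = set()

        def update(self, obj_id, dt):
            """Call each frame with the object under the gaze point."""
            if obj_id is None:
                return
            self.dwell[obj_id] = self.dwell.get(obj_id, 0.0) + dt
            if self.dwell[obj_id] >= self.threshold_s and obj_id not in self.fired:
                self.fired.add(obj_id)
                self.on_select(obj_id)

    sel = DwellSelector(threshold_s=0.5)
    for _ in range(20):                   # ~20 frames at 30 fps on one object
        sel.update("ship-thumbnail", 1 / 30)
    # prints "ship-thumbnail" once dwell passes half a second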

And on an only-vaguely-related note, there do exist head-mounted mice - stick a little reflective dot on your forehead, do a quick calibration, and boom - the cursor goes where you turn your head. Clicking can be done with foot pedals or a mouse in the lap, though I'd imagine voice commands could also work with the appropriate tech.
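(The mapping for that kind of head mouse is simple in principle -- roughly: calibration records the dot's resting position in the camera image, and a gain scales dot displacement into cursor displacement. A sketch with made-up numbers:)

    # Sketch of the dot-on-forehead mouse: calibration records where the
    # reflective dot sits at rest; afterwards, displacement in the camera
    # image is scaled onto the screen. All numbers hypothetical.
    SCREEN_W, SCREEN_H = 1280, 720

    def make_head_mouse(rest_x, rest_y, gain=8.0):
        def to_cursor(dot_x, dot_y):
            cx = SCREEN_W / 2 + gain * (dot_x - rest_x)
            cy = SCREEN_H / 2 + gain * (dot_y - rest_y)
            # clamp to the screen edges
            return (min(max(cx, 0), SCREEN_W - 1), min(max(cy, 0), SCREEN_H - 1))
        return to_cursor

    cursor = make_head_mouse(rest_x=320, rest_y=240)  # from a quick calibration
    print(cursor(330, 240))  # small head turn right -> cursor right of center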

(no subject)

Date: 2006-12-18 07:30 pm (UTC)
From: [identity profile] metahacker.livejournal.com
stick a little dot on your forehead

The demo of head-tracking I got a few months ago at MIT indicated that the little dot really isn't necessary any more; they were tracking both eyes' locations, eye shape, eyebrows, and mouth shape, all quite accurately and at 20+ fps, with a simple camera pointed at your face. Make a quick 3D model of your head during the calibration process from five snapshots using other existing tech, and the software will have a very good indication of where your head is turned... (A rough sketch of the pose math follows the links below.)

http://citeseer.comp.nus.edu.sg/kapoor02realtime.html
http://vismod.media.mit.edu/vismod/demos/faceview/head-track/head-track.html
etc.
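(For flavor: once a tracker gives you 2D facial landmarks and a rough 3D head model, the pose estimate is a standard computer-vision computation -- e.g. OpenCV's solvePnP recovers the head's rotation directly. A sketch with made-up landmark coordinates; the 3D points are generic face proportions, not taken from the linked papers:)

    # Sketch: head-pose estimation from tracked facial landmarks with
    # OpenCV's solvePnP. Landmark pixel positions are invented stand-ins
    # for whatever the face tracker reports.
    import cv2
    import numpy as np

    model_3d = np.array([        # rough 3D face model (mm), nose tip at origin
        (0.0,    0.0,    0.0),   # nose tip
        (0.0,  -63.6,  -12.5),   # chin
        (-43.3,  32.7, -26.0),   # left eye outer corner
        (43.3,   32.7, -26.0),   # right eye outer corner
        (-28.9, -28.9, -24.1),   # left mouth corner
        (28.9,  -28.9, -24.1),   # right mouth corner
    ], dtype=np.float64)

    image_2d = np.array([        # hypothetical tracker output, pixels
        (359, 391), (399, 561), (337, 297),
        (513, 301), (345, 465), (453, 469),
    ], dtype=np.float64)

    w, h = 640, 480              # approximate the focal length by image width
    camera = np.array([[w, 0, w / 2], [0, w, h / 2], [0, 0, 1]], dtype=np.float64)

    ok, rvec, tvec = cv2.solvePnP(model_3d, image_2d, camera, None)
    rot, _ = cv2.Rodrigues(rvec)  # 3x3 rotation: which way the head points
    print(rot)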

(no subject)

Date: 2006-12-18 07:49 pm (UTC)
From: [personal profile] mindways
Very neat!

(I'll pass the word along to my friend who uses the dot-mouse due to wrist issues.)

(no subject)

Date: 2006-12-18 07:52 pm (UTC)
From: [identity profile] metahacker.livejournal.com
Note that I didn't get a demo of cursor-movement-through-head-tracking, just the sense that it ought to be possible. It'll probably require some new PhD at MIT to spin off yet another company to produce a commercial version... ;)

(no subject)

Date: 2006-12-18 06:43 pm (UTC)
From: [identity profile] bkdelong.livejournal.com
There's been a lot of research in eye tracking - I remember seeing an excellent poster presentation at WWW6 back in 1997. Gesture recognition a la Minority Report has come a long way, though I'm not sure we'll have the augmented-reality displays anytime soon.

One should be able to use any PDA/phone camera that records video -- i.e., during that preview mode you'd see it attempt to zero in on your retina and, hopefully, do a bit of recognition as well.

I honestly wish there were an open-source voice-recognition platform where I could continuously hone the voiceprint and take it around, or import it into other applications. IMO it's still not really there yet, and I want us to hurry up and get to subvocal recognition -- though some people see subvocalization as a problem rather than an enhancement.

(no subject)

Date: 2006-12-18 06:44 pm (UTC)
From: [identity profile] talvinamarich.livejournal.com
Already, if you wind up in court on any of a hundred or more offenses, what you Googled for is going to come out. "He was researching how to make a bomb." "He scanned Google Images for blondes. The victim was an attractive blonde."

Next, they will be ransacking your computer for *what your eyes lingered on most*, and using it against you.

I'm gonna head down to Chinatown and buy a used abacus.

No, new. The last person might have had something on his fingers when he used it, and they'll try to pin it on me!

--Talvin
