27 August 2010

More Tech from Suki's World coming true

Mapping Emotions to Video 
EmoRate is an application that demonstrates the power of Affective Computing, a technology that lets computers detect and react to human emotions and that will drastically change the way we interact with them in our daily lives. The last barrier holding Affective Computing back, an affordable consumer-grade EEG headset, has been broken with the introduction of the 14-electrode EPOC headset from Emotiv Systems, Inc. Watch the video below to see a demonstration of the EmoRate emotion-recording system, and to get a sense of what this brave new field of computing may look like in the near future.
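To make the idea concrete, here is a minimal sketch in Python of what emotion-to-video mapping involves. The headset API is purely hypothetical (a stand-in read_emotion() stub rather than Emotiv's actual SDK); the sketch only shows the basic loop of sampling emotion while a video plays and finding the moment of strongest response.

# Minimal sketch of emotion-to-video mapping, in the spirit of EmoRate.
# The headset interface is hypothetical; a real EPOC integration would
# use Emotiv's SDK instead of the stubbed read_emotion() below.
import random
import time

def read_emotion():
    """Stand-in for a headset read: returns an engagement score in [-1, 1]."""
    return random.uniform(-1.0, 1.0)

def record_session(duration_s=10, sample_hz=4):
    """Sample emotion while a video plays, tagging each reading with its timestamp."""
    samples = []
    start = time.time()
    while time.time() - start < duration_s:
        samples.append((time.time() - start, read_emotion()))
        time.sleep(1.0 / sample_hz)
    return samples

def strongest_moment(samples):
    """Return the video timestamp where the emotional response peaked."""
    return max(samples, key=lambda s: abs(s[1]))

if __name__ == "__main__":
    session = record_session(duration_s=3)
    t, score = strongest_moment(session)
    print(f"Peak response {score:+.2f} at {t:.1f}s into the video")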
In the environment for the Suki Series, I wanted the interface between people and their devices to have an intuitive look and feel. Since these are novels, not technical manuals, I left out much of the gritty detail of how everything would work.

Once they cook, morph, and develop for about 20 years, interfaces like the one above, which map a user's emotions to video, could add functionality to the face-recognition systems in the voyeur cams that saturate the industrialized world of the 2030s.

Other devices, especially household ones, are voice controlled. In homes, blinds, lights, and audio systems are adjusted through voice commands. Some items, especially lighting, come on to an initial setting as people enter a room (and can be assumed to go to another setting when they leave). This technology is available today, just not as well developed or in as wide use as it is in the novels.
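For illustration, here is a rough Python sketch of that room-preset behavior. The device names and preset values are invented; it assumes some presence sensor delivers "enter" and "leave" events, with voice commands overriding individual settings.

# A minimal sketch of presence-triggered room presets, as described above.
# Device names and preset values are invented for illustration only.
ROOM_PRESETS = {
    "enter": {"lights": 0.7, "blinds": "open", "audio": "ambient"},
    "leave": {"lights": 0.0, "blinds": "closed", "audio": "off"},
}

class Room:
    def __init__(self, name):
        self.name = name
        self.state = dict(ROOM_PRESETS["leave"])  # assume unoccupied at start

    def on_presence(self, event):
        """Apply the preset for an 'enter' or 'leave' presence event."""
        self.state.update(ROOM_PRESETS[event])
        print(f"{self.name}: {self.state}")

    def on_voice(self, setting, value):
        """Override a single setting by voice, e.g. dim 'lights' to 0.3."""
        self.state[setting] = value
        print(f"{self.name}: {setting} -> {value}")

den = Room("den")
den.on_presence("enter")      # lights to 70%, blinds open, ambient audio
den.on_voice("lights", 0.3)   # "dim the lights"
den.on_presence("leave")      # everything back off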

The technology below is key to the gog interface: typing in air on virtual keyboards displayed to the user through their contacts or glasses. What I had in mind for the future is a device that interprets hand gestures accurately with little or no user programming or mapping.
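One plausible piece of such an interface, sketched below in Python with an invented keyboard layout, is resolving a tracked fingertip position to the nearest key on the virtual keyboard. A real gog would need full hand tracking and much more, but the final mapping step could be this simple.

# Sketch: resolving an in-air fingertip position to a key on a virtual
# keyboard, as a gog might. Layout and coordinates are invented.
import math

# Virtual key centers in normalized view space (x, y).
KEY_CENTERS = {
    "Q": (0.05, 0.2), "W": (0.15, 0.2), "E": (0.25, 0.2),
    "A": (0.10, 0.5), "S": (0.20, 0.5), "D": (0.30, 0.5),
}

def key_for_fingertip(x, y):
    """Return the virtual key nearest the tracked fingertip position."""
    return min(KEY_CENTERS, key=lambda k: math.dist(KEY_CENTERS[k], (x, y)))

print(key_for_fingertip(0.17, 0.45))  # -> "S"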
Tired of marking up that touchscreen display with fingerprints? That may soon come to an end if device manufacturers build a new Touchless Gesture User Interface technology developed by Elliptic Labs into their products.

Elliptic plans to showcase its “Mimesign” technology at IFA in Berlin from the 3rd to the 8th of September 2010. Mimesign promises intuitive ways for people to interact with devices, from tablets to remote controls to in-car media controls. The interface is based on ultrasound technology and never requires the user to touch the device. Meaning, you can pimp slap your way through a photo album.
It’s really as simple as it sounds. A set of gestures is programmed into the interface, allowing the user to navigate with free, unrestricted hand movements. It’s a great invention with many practical uses; for instance, waving your hand to change radio stations in the car is much safer than leaning over to find the button. But waving your hand across the air to change photos on a media player is a bit of extra work for me. I would rather tap the edge of the screen to flip pictures than wave my hand across two feet of air. Perhaps the movements in the video are more dramatic than they need to be?
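Under the hood, an interface like this plausibly amounts to a gesture-to-action dispatch table. The Python sketch below is my own illustration, with invented gesture names and actions, not Elliptic's actual design.

# Sketch: a gesture-to-action dispatch table of the kind a Mimesign-style
# interface implies. Gesture names and actions are invented for illustration.
def next_photo():    print("next photo")
def prev_photo():    print("previous photo")
def next_station():  print("next radio station")

GESTURE_ACTIONS = {
    "swipe_right": next_photo,
    "swipe_left":  prev_photo,
    "wave":        next_station,
}

def on_gesture(name):
    """Dispatch a recognized gesture; ignore anything unprogrammed."""
    action = GESTURE_ACTIONS.get(name)
    if action:
        action()

on_gesture("swipe_right")  # next photo
on_gesture("wave")         # next radio station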

 
