S4’s Eye-Scroll feature ‘sees’ whether you’re looking at the screen, and where
What if, while playing a video on your phone, you turned away and found that playback had automatically paused, then resumed when you looked back at the screen? It's as if the phone is watching you, literally.
This isn’t fiction reserved for the future, but one of the enhancements in the Galaxy S4, the latest offering from smartphone maker Samsung. While the improvements over the S3 aren’t many, the one major addition opens up a new domain of human-machine interaction in smartphones: after speech, touch and gesture, the phone can now ‘see’.
Eye-Scroll is a feature with which the S4 detects the position of the user’s eyes. If it finds the eyes looking at the bottom of the screen, it understands the need to scroll to the next page. The action is not entirely automated yet: with a slight tilt of the wrist, the phone scrolls to the next page, without the user having to swipe the screen.
The most stable of the eye-motion-detection features is the automatic pause-play one. The S4 has a 2-megapixel front camera in addition to the 13-megapixel rear one. This front camera is the eye of the S4 staring at you.
In digital cameras
In digital cameras, feature-detection options such as face and smile recognition are well established. Smiles, for instance, are identified using image-processing algorithms that correlate image pixels and look for regions where the transition from skin to mouth occurs.
Because any digital image is a two-dimensional matrix of intensity values, intelligent programs can scan these pixels and define a condition to spot pink lips framing white teeth, as the pixel intensities of these regions differ sharply. When run fast enough on consecutive frames, this appears to be live detection.
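The idea can be sketched in a few lines. This is an illustrative toy, not Samsung's actual algorithm: a grayscale image row is just a list of 0–255 intensity values, and a sudden jump between neighbouring pixels marks the kind of skin-to-teeth transition described above. The function name and threshold are assumptions made for the example.

```python
def find_transitions(row, threshold=60):
    """Return indices where intensity jumps by more than `threshold`
    between adjacent pixels -- a crude edge/transition detector."""
    return [i for i in range(1, len(row))
            if abs(row[i] - row[i - 1]) > threshold]

# A toy scan line: darker skin tones, then bright teeth, then skin again.
row = [90, 95, 92, 230, 235, 228, 94, 91]
print(find_transitions(row))  # -> [3, 6], the two skin/teeth boundaries
```

A real detector works on the full 2-D matrix and combines many such cues, but the principle is the same: look for characteristic intensity patterns.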
Along the same lines, the S4’s front camera watches the user’s eyes. Eyeballs are comparatively easy to track, since the dark pupil and iris sit against the white of the eye, set in a well of skin. As long as the camera detects them, the video player continues playback.
When the user turns his or her head and the camera finds itself staring at an ear or temple instead, a decision is made to pause the playback.
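The pause-play decision itself can be pictured as a small state machine. This is a hypothetical sketch, not Samsung's implementation: `eyes_visible` stands in for a per-frame eye-detector result, and requiring several consecutive frames of agreement (debouncing) keeps a single missed detection from flickering the video between pause and play.

```python
class SmartPause:
    """Toy pause/play controller driven by per-frame eye detection."""

    def __init__(self, frames_required=5):
        self.frames_required = frames_required  # frames of agreement needed
        self.count = 0                          # consecutive disagreeing frames
        self.playing = True

    def update(self, eyes_visible):
        """Feed one frame's detection result; return the play state."""
        if eyes_visible == self.playing:
            self.count = 0                      # state and detection agree
        else:
            self.count += 1
            if self.count >= self.frames_required:
                self.playing = eyes_visible     # flip only after a steady run
                self.count = 0
        return self.playing
```

With `frames_required=5` at 30 frames per second, the user must look away for roughly a sixth of a second before the video pauses, which is why a quick glance aside need not interrupt playback.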
Although this feature is new to the smartphone segment, the technology has been widely used in desktop applications. Facebook, for that matter, uses face detection to place the ‘tag’ box on the faces of people in uploaded photos.
Computer-vision libraries such as OpenCV are widely used to perform object recognition and motion capture using digital image sensors.
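One of the simplest techniques behind motion capture is frame differencing: compare consecutive frames pixel by pixel and declare motion when enough pixels change. The dependency-free sketch below illustrates the idea on tiny 2-D lists of grayscale intensities; a real system would apply the same logic, via a library such as OpenCV, to full camera frames. The function name and thresholds are assumptions for the example.

```python
def motion_detected(prev_frame, curr_frame, pixel_threshold=30, min_changed=2):
    """Report motion when enough pixels change between consecutive frames."""
    changed = sum(
        1
        for prev_row, curr_row in zip(prev_frame, curr_frame)
        for p, c in zip(prev_row, curr_row)
        if abs(p - c) > pixel_threshold
    )
    return changed >= min_changed

frame_a = [[10, 10, 10],
           [10, 10, 10]]
frame_b = [[10, 200, 10],
           [10, 200, 10]]   # a bright object has entered the scene

print(motion_detected(frame_a, frame_b))  # -> True
```

The per-pixel threshold filters out sensor noise, while the count threshold ignores changes too small to be a moving object.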
As this is a new feature, it should grow more accurate with subsequent releases and software patches; currently, not every turn of the head is detected.
It will be interesting to see if, someday, a long wide-eyed stare will signal ‘power off’ to the device.