Now, a smartphone app that tags photos automatically

June 30, 2011 07:21 pm | Updated 07:21 pm IST - Washington

The phone’s built-in accelerometer can tell if a person is standing still for a posed photograph, bowling or even dancing. File photo

A team from Duke University and the University of South Carolina that includes one Indian-origin student has unveiled a new smartphone app that can automatically tag photos.

The system works by taking advantage of the multiple sensors on a mobile phone, as well as those of other mobile phones in the vicinity.

Dubbed TagSense, the new app offers “greater sophistication” in tagging photos than Apple’s iPhoto and Google’s Picasa, its developers say. “In our system, when you take a picture with a phone, at the same time it senses the people and the context by gathering information from all the other phones in the area,” said Xuan Bao, a Ph.D. student in computer science at Duke who earned his master’s degree in electrical and computer engineering there.

Bao and Chuan Qin, a visiting graduate student from USC, developed the app working with Romit Roy Choudhury, assistant professor of electrical and computer engineering at Duke’s Pratt School of Engineering. “Phones have many different kinds of sensors that you can take advantage of,” Qin said. “They collect diverse information like sound, movement, location and light. By putting all that information together, you can sense the setting of a photograph and describe its attributes.”

By using information about the environment of a photograph, the students believe they can achieve a more accurate tagging of a particular photograph than could be achieved by facial recognition alone. For example, the phone’s built-in accelerometer can tell if a person is standing still for a posed photograph, bowling or even dancing.
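The activity inference described above can be sketched in a few lines. This is a hypothetical illustration only: the feature (spread of acceleration magnitudes over a short window) and the thresholds are assumptions for clarity, not TagSense's actual classifier.

```python
# Hypothetical sketch: guess a subject's activity from a window of
# accelerometer magnitude samples (m/s^2). Thresholds are illustrative
# assumptions, not values from the TagSense system.
import statistics

def classify_activity(magnitudes):
    """Map the variability of acceleration readings to an activity tag."""
    spread = statistics.pstdev(magnitudes)
    if spread < 0.5:
        return "posing"    # nearly constant reading: standing still
    elif spread < 3.0:
        return "dancing"   # moderate, rhythmic movement
    else:
        return "bowling"   # large swings, e.g. an arm swing and release

print(classify_activity([9.80, 9.81, 9.79, 9.80]))  # → posing
```

A real system would use richer features (frequency content, per-axis signals) rather than a single spread statistic, but the idea is the same: movement statistics separate a posed subject from one in motion.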

Light sensors in the phone’s camera can tell whether the shot is being taken indoors or outdoors, and whether the day is sunny or cloudy. The app can also approximate environmental conditions, such as snow or rain, by looking up the weather at the photo’s time and location. The microphone can detect whether a person in the photograph is laughing or quiet. All of these attributes are then assigned to each photograph, the students said.

The students envision that TagSense would most likely be adopted by groups of people, such as friends, who would “opt in,” allowing their mobile phone capabilities to be harnessed when members of the group were together. Importantly, Roy Choudhury added, TagSense would not request sensed data from nearby phones that do not belong to this group, thereby protecting users’ privacy.

The experiments were conducted using eight Google Nexus One mobile phones on more than 200 photos taken at various locations across the Duke campus.
