A team from Duke University and the University of South Carolina that includes one Indian-origin student has unveiled a new smartphone app that can automatically tag photos.

The system works by taking advantage of the multiple sensors on a mobile phone, as well as those of other mobile phones in the vicinity.

Dubbed TagSense, the new app offers “greater sophistication” in tagging photos than Apple’s iPhoto and Google’s Picasa, its developers say. “In our system, when you take a picture with a phone, at the same time it senses the people and the context by gathering information from all the other phones in the area,” said Xuan Bao, a Ph.D. student in computer science at Duke who received his master’s degree at Duke in electrical and computer engineering.

Bao and Chuan Qin, a visiting graduate student from USC, developed the app working with Romit Roy Choudhury, assistant professor of electrical and computer engineering at Duke’s Pratt School of Engineering. “Phones have many different kinds of sensors that you can take advantage of,” Qin said. “They collect diverse information like sound, movement, location and light. By putting all that information together, you can sense the setting of a photograph and describe its attributes.”

By using information about the environment of a photograph, the students believe they can tag it more accurately than facial recognition alone. For example, the phone’s built-in accelerometer can tell whether a person is standing still for a posed photograph, bowling or even dancing.
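The accelerometer-based distinction the students describe can be illustrated with a simple motion classifier. This is only a sketch of the general technique, not TagSense’s actual algorithm; the threshold values and labels are assumptions chosen for illustration.

```python
import math

def classify_motion(samples, still_thresh=0.5, dance_thresh=3.0):
    """Guess a subject's activity from accelerometer readings.

    `samples` is a list of (x, y, z) accelerations captured around the
    moment the photo is taken. A nearly constant magnitude suggests a
    posed shot; high variance suggests vigorous movement. Thresholds
    are illustrative, not values from the TagSense system.
    """
    mags = [math.sqrt(x * x + y * y + z * z) for x, y, z in samples]
    mean = sum(mags) / len(mags)
    var = sum((m - mean) ** 2 for m in mags) / len(mags)
    if var < still_thresh:
        return "posing"    # standing still for the camera
    elif var < dance_thresh:
        return "moving"    # moderate motion, e.g. walking
    else:
        return "dancing"   # large, rapid swings in acceleration
```

A phone held by a posing subject would report readings close to gravity alone, e.g. `classify_motion([(0, 0, 9.8)] * 10)` returns `"posing"`.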

Light sensors in the phone’s camera can tell whether the shot is being taken indoors or outdoors, and whether the day is sunny or cloudy. The system can also approximate environmental conditions, such as snow or rain, by looking up the weather for that time and location. The microphone can detect whether a person in the photograph is laughing or quiet. All of these attributes are then assigned to each photograph, the students said.
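The assembly step the students describe, where per-sensor inferences are merged into one set of tags, can be sketched as follows. The attribute vocabulary and sensor names here are assumptions for illustration; the source does not specify TagSense’s internal representation.

```python
def build_tags(sensor_readings):
    """Merge per-sensor attributes into a tag list for one photo.

    `sensor_readings` maps a sensor or lookup source to the attribute
    it inferred (light level, weather service, microphone, motion).
    Sources that produced no inference are simply skipped.
    """
    tags = []
    for source in ("light", "weather", "sound", "motion"):
        value = sensor_readings.get(source)
        if value:
            tags.append(value)
    return tags
```

For example, a photo taken outdoors of a laughing subject might yield `build_tags({"light": "outdoor", "sound": "laughing"})`, i.e. the tags `["outdoor", "laughing"]`.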

The students envision that TagSense would most likely be adopted by groups of people, such as friends, who would “opt in,” allowing their mobile phone capabilities to be harnessed when members of the group were together. Importantly, Roy Choudhury added, TagSense would not request sensed data from nearby phones that do not belong to this group, thereby protecting users’ privacy.
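The opt-in rule Roy Choudhury describes amounts to filtering sensing requests by group membership before any data is exchanged. A minimal sketch, assuming phones are identified by opaque IDs (the identifiers and data shapes here are hypothetical):

```python
def request_sensor_data(nearby_phones, opted_in):
    """Collect sensor snapshots only from phones in the opt-in group.

    `nearby_phones` maps a phone ID to its latest sensor snapshot;
    `opted_in` is the set of IDs whose owners joined the group.
    Phones outside the group are never queried, so their sensed data
    never leaves the device.
    """
    return {pid: data for pid, data in nearby_phones.items()
            if pid in opted_in}
```

With this filter, a bystander’s phone contributes nothing even if it is physically in range, which is the privacy property the students emphasize.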

The experiments were conducted using eight Google Nexus One mobile phones on more than 200 photos taken at various locations across the Duke campus.