New Algorithm Searches Your Social Networks To Find You In Untagged Photos

Friday, December 6, 2013


A new algorithm that tags photos based on the relationships people in the images already have with each other has been developed at the University of Toronto. The algorithm uses the names and locations of existing photo tags to build a "relationship graph," in which the strength of personal connections is estimated.

A new algorithm designed at the University of Toronto has the power to profoundly change the way we find photos among the billions on social media sites such as Facebook and Flickr. This month, the United States Patent and Trademark Office issued patent #8,611,673 on this technology.

Developed by Parham Aarabi, a professor in The Edward S. Rogers Sr. Department of Electrical & Computer Engineering, and his former Master’s student Ron Appel, the search tool uses tag locations to quantify relationships between individuals, even those not tagged in any given photo.

"Essentially, we found that if people are standing close together or are tagged close together inside images – in one image it doesn't tell you a lot of information. But across hundreds of images that someone has on Facebook, it's a very good indicator of how close they are in real life, in a social sense," Aarabi told CTVNews.ca in a phone interview.

Imagine you and your mother are pictured together, building a sandcastle at the beach. You’re both tagged in the photo quite close together. In the next photo, you and your father are eating watermelon. You’re both tagged. Because of your close ‘tagging’ relationship with both your mother in the first picture and your father in the second, the algorithm can determine that a relationship exists between those two and quantify how strong it may be.
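The idea described above can be sketched in code. The following is a minimal illustration, not the patented method itself: the input format (person names mapped to tag coordinates) and the inverse-distance weighting are assumptions, since the article does not disclose the exact scoring.

```python
from collections import defaultdict
from itertools import combinations
from math import hypot

def build_relationship_graph(photos):
    """Accumulate pairwise closeness scores from tag proximity.

    `photos` is a list of dicts mapping person -> (x, y) tag position,
    a hypothetical input format. Inverse distance is used here as a
    stand-in for whatever weighting the actual algorithm applies.
    """
    graph = defaultdict(float)
    for tags in photos:
        for (a, pa), (b, pb) in combinations(sorted(tags.items()), 2):
            dist = hypot(pa[0] - pb[0], pa[1] - pb[1])
            graph[(a, b)] += 1.0 / (1.0 + dist)  # closer tags => stronger tie
    return dict(graph)

photos = [
    {"you": (0.5, 0.5), "mother": (0.55, 0.5)},  # sandcastle photo
    {"you": (0.4, 0.6), "father": (0.45, 0.6)},  # watermelon photo
]
graph = build_relationship_graph(photos)
```

Across hundreds of photos, these accumulated scores would converge on a picture of who is close to whom, as Aarabi describes.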

In a third photo, you fly a kite with both parents, but only your mother is tagged. Given the strength of your ‘tagging’ relationship with your parents, when you search for photos of your father the algorithm can return the untagged photo because of the very high likelihood he’s pictured.
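The search step can then rank partially tagged photos by how strongly the queried person is connected to whoever *is* tagged. This is a simplified sketch under the same assumptions as above; the scoring function and example edge weights are illustrative, not taken from the patent.

```python
def score_photo_for(person, tagged_people, graph):
    """Score how likely `person` appears in a photo given who is tagged.

    Sums the person's relationship strength to each tagged individual;
    a higher score means the untagged photo is more likely a match.
    """
    return sum(graph.get(tuple(sorted((person, other))), 0.0)
               for other in tagged_people if other != person)

# Hypothetical graph learned from many photos; the father-mother edge
# was inferred through their shared ties to "you".
graph = {("father", "you"): 0.9, ("mother", "you"): 0.9, ("father", "mother"): 0.8}

# Third photo: kite-flying, only "mother" tagged.
print(score_photo_for("father", {"mother"}, graph))  # 0.8
```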

“Two things are happening: we understand relationships, and we can search images better,” says Professor Aarabi.

The nimble algorithm, called relational social image search, achieves high reliability without using computationally intensive object- or facial-recognition software.

“If you want to search a trillion photos, normally that takes at least a trillion operations. It’s based on the number of photos you have,” says Aarabi. “Facebook has almost half a trillion photos, but a billion users—that’s almost a 500-fold difference. Our algorithm is based simply on the number of tags, not on the number of photos, which makes it more efficient to search than standard approaches.”
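The efficiency Aarabi describes comes from indexing by tags rather than scanning every photo. A minimal sketch of that idea, using a standard inverted index over hypothetical data:

```python
from collections import defaultdict

def build_tag_index(photo_tags):
    """Map each person to the set of photo ids they are tagged in.

    Build and lookup cost scale with the total number of tags,
    not with the total number of photos in the collection.
    """
    index = defaultdict(set)
    for photo_id, people in photo_tags.items():
        for person in people:
            index[person].add(photo_id)
    return index

photo_tags = {1: {"you", "mother"}, 2: {"you", "father"}, 3: {"mother"}}
index = build_tag_index(photo_tags)
print(sorted(index["mother"]))  # [1, 3]
```

A search for a person then starts from their tag entries (and their neighbors in the relationship graph) instead of iterating over the whole photo store.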

Work on this project began in 2005 in Professor Aarabi’s Mobile Applications Lab, Canada’s first lab space for mobile application development.

Currently the algorithm’s interface is primarily for research, but Aarabi aims to see it incorporated on the back-end of large image databases or social networks. “I envision the interface would be exactly like you use Facebook search—for users, nothing would change. They would just get better results,” says Aarabi.

While testing the algorithm, Aarabi and Appel discovered an unforeseen application: a new way to generate maps. They tagged a few photographs of buildings around the University of Toronto and ran them through the system with a bunch of untagged campus photos. “The result we got was of almost a pseudo-map of the campus from all these photos we had taken, which was very interesting,” says Aarabi.


SOURCE  University of Toronto

By 33rd Square
