Photoshopping Big Data

A large proportion of big data is visual. It has been estimated that since photography's invention in the early 1800s, around 3.5 trillion photographs have been taken, of which 10% were shot in the last year alone. Facebook reports 6 billion photo uploads per month on its site, and YouTube receives 72 hours of video uploads every minute.

These figures are quoted by Alexei Efros, associate professor of electrical engineering and computer sciences at UC Berkeley, the lead scientist of a group that has produced software which, in our words, cleverly averages a huge number of images and also incorporates editing facilities.
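The Berkeley system goes well beyond this (it aligns, weights and lets users edit the image collection interactively), but the core notion of an "average image" can be sketched as a simple pixel-wise mean over a stack of same-sized photos. This is a minimal illustration of the idea, not the group's actual code:

```python
import numpy as np

def average_images(images):
    """Pixel-wise mean of a stack of same-sized images.

    images: iterable of HxWx3 uint8 arrays (e.g. loaded photos).
    Returns an HxWx3 uint8 "average" image.
    """
    # Promote to float so the sum doesn't overflow uint8, then average.
    stack = np.stack([np.asarray(img, dtype=np.float64) for img in images])
    return np.clip(stack.mean(axis=0), 0, 255).astype(np.uint8)

# Demo with two synthetic 2x2 "images": one all-black, one all-white.
black = np.zeros((2, 2, 3), dtype=np.uint8)
white = np.full((2, 2, 3), 255, dtype=np.uint8)
avg = average_images([black, white])
print(avg[0, 0])  # mid-grey: [127 127 127]
```

A naive mean like this produces the blurry composite familiar from "average face" images; the group's contribution is making that average sharp and steerable at the scale of millions of photos.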

The research was presented, and their software (the code for which they say will be available soon) demonstrated, last week at the International Conference and Exhibition on Computer Graphics and Interactive Techniques, SIGGRAPH, in Vancouver, Canada.

Our header image is taken from the zillions of images returned by a Google search for cats (our web search came up with 112 million), and on the left is the resultant average moggie!

Among the commercial applications they have identified so far is online shopping, where a consumer may want to quickly home in on two-inch wedge heels in the perfect shade of red.

There is much more in their fascinating video:


Funding from Google, Adobe and the Office of Naval Research has helped support their work.
