
Online photos: Are they the new digital fingerprint?

Mark Milian
Photographer David Bailey tests a Nokia smartphone, which captures more detailed data than most regular cameras.
STORY HIGHLIGHTS
  • New photo-sharing networks can extract a great deal of information from photos
  • Many digital images contain location data and other signals that can easily be extracted
  • Computers are learning to interpret objects, text and faces within photos

(CNN) -- For Mike Smith, Facebook is a fort for communicating freely with friends online.

Within the confines of that giant yet access-restricted network, the music-software engineer from San Francisco believes he can control what's posted about him through the simple courtesy of asking friends to remove unflattering photos.

But on the wide-open Web exists a harsher environment.

Images that make their way outside the walls of Facebook or similarly closed networks can get indexed by search engines and become almost impossible to scrub.

"I don't want to advertise my life," Smith said. "But my last name is Smith, so there's built-in anonymity. No one can find me."

For those less fortunate, a rogue picture can become an unwanted tattoo. As software matures, more data can be extracted from those images with ease.

A digital photograph is like an onion: advances in machine reading and image-scanning software can peel back its layers to extract more and more information.

Each layer of a digital picture often contains data about where and when a shot was taken. Rapidly maturing computer algorithms can interpret what or who is in the frame.
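To make that concrete, here is a minimal Python sketch, using the open-source Pillow library, of reading the "when" and "where" baked into a typical JPEG's EXIF metadata. The file name is hypothetical, and not every photo carries these fields:

    # Minimal sketch: reading the "when" and "where" layers of a JPEG's
    # EXIF metadata with Pillow. "photo.jpg" is a hypothetical local file.
    from PIL import Image
    from PIL.ExifTags import TAGS, GPSTAGS

    img = Image.open("photo.jpg")
    raw = img._getexif() or {}                    # EXIF tags keyed by numeric ID
    exif = {TAGS.get(tag, tag): value for tag, value in raw.items()}

    print("Taken:", exif.get("DateTimeOriginal"))  # when the shot was taken

    gps_raw = exif.get("GPSInfo", {})              # where it was taken, if present
    gps = {GPSTAGS.get(tag, tag): value for tag, value in gps_raw.items()}
    print("Latitude:", gps.get("GPSLatitude"), gps.get("GPSLatitudeRef"))
    print("Longitude:", gps.get("GPSLongitude"), gps.get("GPSLongitudeRef"))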

More than half of people online have uploaded photos to share with others, according to Pew Research Center data gathered for a report that hasn't yet been published. The figure was 55% in November, up from 46% in July 2008, Pew's studies found.

Previously, the Pew Internet Project hadn't studied photo sharing as closely as status updates and blogs, said Lee Rainie, the project's director.

"The photo piece of this is now rising in importance and volume, we think, so we're going to pay more attention to this in the future," Rainie said. "It's become such a central feature for social networking."

Pew is also considering the privacy implications. "As location awareness now comes in your pocket with that smartphone, it's very likely that there's more of that (GPS data) inadvertently passed along," Rainie said.

Coye Cheshire, a University of California, Berkeley professor who studies social interaction online, is also planning to research this subject more deeply. He's working on a study about people's perceptions of the pictures they post to Facebook and Twitter.

So far in his research, Cheshire has observed that people tend to perceive a loss in their ability to control and contain info about themselves after something bad happens with it.

"What we don't see, however, is any increase in their online discretionary behaviors," he said.

Several factors could account for this phenomenon, which seems to run counter to the experiments where an animal learns to avoid electrodes after getting zapped a few times. "Thankfully, we don't have any data showing people aren't able to learn," Cheshire said with a chuckle.

But perhaps new technologies, with their increasingly slick and simplified interfaces, are outpacing humans' ability to adjust.

How long did it take us to determine the manners and appropriate response times associated with e-mail and text messages? Have we even figured them out yet?

"People are kind of slow, actually, to evolve to large-scale normative shifts," Cheshire said. "It takes a very long time for that to happen."

While we're trying to figure out whether it's appropriate to tag a tipsy friend in a Facebook photo, software engineers are barreling ahead.

Google has already deployed apps capable of identifying objects, goods, text, artwork and buildings by taking a picture from a phone and running some algorithms over it.

That architecture is also used for privacy-related endeavors, such as the blurring of faces and license plates captured by Google's Street View vehicles.

The search giant is also tuning the ability to identify the faces of people who agree to be included in its database, a director for the project said in an interview last week.
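The underlying techniques are not exclusive to Google. As a rough sketch of the same detect-and-blur idea, the open-source OpenCV library (not Google's own system) can find faces with a stock detector and blur each one; the file names here are hypothetical:

    # Rough sketch of detect-and-blur, in the spirit of Street View's face
    # blurring, using OpenCV's bundled Haar cascade. File names are hypothetical.
    import cv2

    image = cv2.imread("street.jpg")
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)

    detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

    for (x, y, w, h) in faces:
        region = image[y:y + h, x:x + w]
        image[y:y + h, x:x + w] = cv2.GaussianBlur(region, (51, 51), 0)

    cv2.imwrite("street_blurred.jpg", image)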

Face.com released an app called Photo Finder, which looks for familiar faces in images on Facebook in an attempt to find a person's photos that haven't already been tagged manually. The company's computers have scanned 23 billion photos from people who have installed the app and authorized it to look at their pictures and those of their friends.

"When it comes to normal people's photos, the truth is that most of the photos are within the closed doors of a social network," said Face.com CEO Gil Hirsch. "Not that many people have a lot of photos of themselves out there on the open Web."

Let's say you take a picture at your office that has a business card or envelope with your home address or some kind of sensitive information visible in the background.

Evernote, ZoomReader and many other companies have proprietary image-processing capabilities that can recognize words in images and then make that text searchable. About one-fifth of all notes stored in Evernote's database contain images, Evernote CEO Phil Libin said in a recent interview.

Generally, text transcribed by image services, such as Evernote's, isn't offered up to public search engines such as Google. However, "today, every image that Google touches is analyzed by one or several of our algorithms," said Hartmut Neven, Google's engineering director for image-recognition development.
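The transcription step itself is widely available outside those companies. A minimal sketch using the open-source Tesseract engine, via the pytesseract wrapper rather than any proprietary pipeline, shows how text in a snapshot becomes searchable; the file name and the string being searched for are hypothetical:

    # Minimal OCR sketch with the open-source Tesseract engine (pytesseract),
    # not any company's proprietary pipeline. "desk_photo.jpg" is hypothetical.
    from PIL import Image
    import pytesseract

    text = pytesseract.image_to_string(Image.open("desk_photo.jpg"))
    print(text)

    # Once transcribed, the text is trivially searchable:
    if "123 Main St" in text:          # hypothetical sensitive string
        print("This photo appears to contain a street address.")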

Flickr, a Yahoo property that's among the largest photo-sharing sites, declined to comment on development plans, but a spokeswoman said, "No idea is out of the question."

Beyond the stacks of info contained within standard picture files, a new breed of applications can pile on even more detailed signals about where a photo was taken.

For example, a new photo-sharing app called Color leverages a smartphone's various sensors to determine more accurately the setting where a picture is taken.

In addition to the phone's GPS location, Color can record gyroscope and compass orientation, as well as ambient sound from the microphone and lighting from the phone's proximity sensor -- tracking 20 to 40 data points in all, Color Labs CEO Bill Nguyen has said.

Some of that info is sent over the internet to Color's servers moments after the app is opened, not just when pictures are taken. Using those signals, the app figures out who is nearby and then displays their photos. On the iPhone, users must tap a button to grant Color permission to access the device's GPS after the app is first loaded, and it won't work at all without that.
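Based purely on that description, the bundle an app like this might assemble could look something like the sketch below; the field names and values are illustrative guesses, not Color's actual format:

    # Purely illustrative: the kind of signal bundle the article describes
    # Color collecting. Field names and values are hypothetical, not the
    # app's real wire format.
    photo_context = {
        "gps": {"lat": 37.7793, "lon": -122.4193, "accuracy_m": 10},
        "compass_heading_deg": 214.0,
        "gyroscope": {"pitch": 0.02, "roll": -0.11, "yaw": 1.57},
        "ambient_sound_level_db": 62.5,   # sampled from the microphone
        "ambient_light": "indoor_dim",    # inferred from the proximity sensor
        "captured_at": "2011-03-28T19:42:10Z",
    }
    # Signals like these, uploaded when the app opens, let the service infer
    # which users are in the same place and surface their photos to each other.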

Though Color collects all of this info, someone's exact location isn't shown publicly, and the goal isn't to sell any of this data to other companies, Nguyen said. The actual business model involves partnering with restaurateurs and store owners to provide services that make environments more hospitable, he said.

"I think the problem that happens to me a lot online is I never remember: Is this public or private?" Nguyen said of competing social-networking services. "One of the great things about Color is we're telling you, 'Hey, it's public; it's public.' "

However, some people have complained that Color has not been totally upfront about the extent of data that's collected, some of which is instantly made available to nearby strangers. Nguyen acknowledges these concerns and said an upcoming version could make the terms "more clear."

"We think there are, without a doubt, moments where you share things privately and where you share things publicly," Nguyen said. "This is a way that you share openly."

Even popular smartphone systems, such as the iPhone and Android, aren't always explicit about the info they store in photos. Evidence of that can be found in the stream of pictures shared online by people who unwittingly publish data that can pinpoint their whereabouts.

Photos shared through e-mail or via Flickr, Photobucket and other services can include precise location info that is easily surfaced by free software, according to a CNN report in October. Facebook, the most popular photo-sharing site, wipes that info from each uploaded image for security reasons, a spokeswoman said then.

A computer program, aptly named Creepy, demonstrates how easily the location data in photos can be surfaced and plotted on maps. Various apps have popped up that let users selectively strike sensitive data from pictures. Alternatively, smartphone owners can disable location tagging in their phone's settings panel.
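Stripping that layer out is straightforward in principle. A minimal Python sketch with the Pillow library re-saves only the pixels, leaving the EXIF metadata, GPS coordinates included, behind; the file names are hypothetical:

    # Minimal sketch: strip a JPEG's EXIF data (including GPS) by re-saving
    # only the pixel data. File names are hypothetical.
    from PIL import Image

    original = Image.open("tagged.jpg")
    clean = Image.new(original.mode, original.size)
    clean.putdata(list(original.getdata()))   # copy pixels only, no metadata
    clean.save("untagged.jpg")                # written without the EXIF layer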

AnchorFree, a security-software firm, is planning to offer a feature in the next six months that can automatically remove GPS data from photos before they're sent over the Web.

"IPhone doesn't protect itself," said Eugene Lapidous, AnchorFree's chief architect. "So we have to provide some intermediary service in the cloud."

Whatever privacy concerns we have now may be perpetually aggravated by the constant strides being made in technical laboratories.

"As we think of new ways to use the content, there's no way to go back," Berkeley's Cheshire said. "It's an added problem to think about how this could be indexed, searchable on a completely separate system that hasn't been invented yet."
