Lately, there's been a lot of buzz in the media about Facebook's facial-recognition technology--admittedly, in part thanks to us. But now we've decided to look at the technology itself: Is it as dangerous as we think it is? In the end, Facebook's new face-recognizing feature doesn't yet work well enough to pose a significant threat to your privacy.

How Well Does It Work?

Facebook says its facial-recognition technology is convenient because it groups together multiple images of the same person; as a result, you have to type a friend’s name only once, and the tag will apply to all photos of that person. If your friend has been previously tagged in enough photos, Facebook will suggest his or her name so you don’t have to do anything. And yes, we think that would be more convenient--if it worked.

We did a few experiments to see how well Facebook's facial-recognition technology, in its current state, actually recognizes faces. We uploaded pictures of people who had no tags on Facebook, as well as photos of people who had many tags on Facebook; we even reuploaded photos that already existed on Facebook with tags, just to see what Facebook could detect and what it couldn’t. Since Facebook allows you to disable the facial-recognition feature in your privacy settings, we made sure to upload photos of people who left the facial-recognition feature on.

As it turns out, the tool isn't advanced enough to be worrying--or anywhere close.

At the moment, Facebook's facial-recognition technology seems to be good at one thing, and one thing only: recognizing that a face is a face. Considering that most point-and-shoot cameras have such technology, this isn't exactly mind-blowing.

As you can see below, Facebook's facial-recognition technology has no trouble recognizing face shapes in these photos. Unfortunately, it can't tell that they're all photos of the same person.

Below, Facebook is able to group photos together and determine that the top three are of one person, and the bottom two are of another person. Pretty good, right? Except for the fact that, again, these are all photos of the same person.

Facebook's facial-recognition technology appears to be able to recognize people's faces across multiple photos if the lighting is similar, the coloring is similar, the angle is similar, and the person's expression is similar. For the photos below, Facebook recognizes the first two as being of the same person--but that isn't so impressive, considering that the lighting, coloring, angle, and expression are all basically the same in both photos. But Facebook can't tell that the other two photos are of the same person, probably owing to the drastic changes in lighting and angle.

Apparently Facebook can still recognize faces if only one of the variables changes. In the picture below, Facebook correctly identifies the photos as being of the same person, despite the expression change. However, the other variables (lighting, color, angle) are all the same.

We heard a rumor that Facebook's facial-recognition system wasn't too great at recognizing Asian faces. As far as we can tell, that isn't true--the photo set below was one of the few sets that Facebook grouped and identified correctly.

Sometimes Facebook is just weird. Even though Megan has over 600 tagged photos of herself on Facebook--a database large enough to give any face-recognizing technology a pretty good idea of what she looks like--we reuploaded four previously uploaded photos of her, and Facebook was able to correctly identify only the one photo in which she's wearing a fake moustache.

We can draw two conclusions here: First, Facebook's facial-recognition system is pretty bad. Second, fake moustaches are bad disguises.

How Facebook Recognizes Faces

Believe it or not, facial-recognition technology has improved rapidly in the past few years, with Facebook now at the crest of the trend. Initially, point-and-shoot cameras paved the way, using basic face-shape recognition, without assigning identity, to make autofocus more accurate.

The next step was making the recognition more detailed, and assigning identity to faces. Computer makers began building face-tracking and face-identifying software into webcams, and now even game consoles are adopting the technology to let users interact with games using biometric data from their faces (first with Microsoft's Kinect for the Xbox 360 and possibly with Nintendo's upcoming Wii U).

The most exciting--and scary--part of Facebook's facial-recognition technology, however, is that it's supposed to learn what your face looks like as you add and tag more photos of yourself, even if it doesn't work well now.

"Where we see a face, a computer only sees numbers," notes Andrew Ng of Stanford University, who has worked on artificial intelligence and machine-learning problems throughout his career. "It assigns a value to every pixel, and it's the computer's task to decide that the values add up to 'This is my good friend Joe.'" There's a disparity between the process that's instant and natural for our eyes and brain, and the one that takes a computer hundreds of thousands of calculations. Facebook, however omnipresent it may seem, isn't omniscient--yet.
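Ng's point can be illustrated with a toy sketch--this is not Facebook's actual code, just a picture of what "a computer only sees numbers" means. To a program, a grayscale photo is a grid of brightness values from 0 (black) to 255 (white), and the crudest way to compare two images is to add up how much those numbers differ:

```python
# A toy illustration (not Facebook's algorithm): two tiny 3x3 grayscale
# "photos," each pixel a brightness value from 0 (black) to 255 (white).
face_a = [
    [ 34,  40,  38],
    [120, 200, 118],
    [ 60,  62,  61],
]
face_b = [
    [ 30,  45,  36],
    [118, 205, 121],
    [ 58,  66,  59],
]

def pixel_distance(img1, img2):
    """Sum of absolute differences between corresponding pixels.
    A small total means the two grids are numerically similar."""
    return sum(
        abs(p1 - p2)
        for row1, row2 in zip(img1, img2)
        for p1, p2 in zip(row1, row2)
    )

print(pixel_distance(face_a, face_b))  # prints 29 -- numerically close
```

Raw pixel comparison like this is exactly why lighting and angle changes foil simple systems: shift the light and every number in the grid changes, even though the face hasn't.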

To get a better grasp of how Facebook recognizes your face, it’s worth looking at companies like Viewdle, which developed a mobile facial-recognition app that can use Facebook’s databases to identify the names of your friends as you snap a picture of them. Marisol MacGregor, marketing director for Viewdle, says that machine learning is the key to making facial recognition more accurate. "The next time the phone is held up to you, we're using all your database photos [to identify your face],” MacGregor explains. This means that the more photos you have from different angles and in different lighting, the more capable an algorithm becomes at identifying who you are.
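MacGregor's point--that more tagged photos, from more angles and lighting conditions, make identification more capable--can be sketched as a nearest-neighbor lookup. The names and "feature vectors" below are invented for illustration; real systems distill each face into hundreds of measurements, but the logic is the same:

```python
import math

# Hypothetical face "feature vectors" (stand-ins for the measurements a
# real system extracts). Joe has three tagged photos in varied
# conditions; Megan has two.
tagged_database = {
    "Joe":   [[0.9, 0.1], [0.8, 0.3], [0.7, 0.2]],
    "Megan": [[0.1, 0.9], [0.2, 0.8]],
}

def identify(new_face):
    """Return the name whose stored photos contain the closest match.
    The more photos stored per person, the more chances one of them
    resembles the new shot's lighting and angle."""
    best_name, best_dist = None, float("inf")
    for name, photos in tagged_database.items():
        for photo in photos:
            dist = math.dist(new_face, photo)
            if dist < best_dist:
                best_name, best_dist = name, dist
    return best_name

print(identify([0.75, 0.25]))  # prints "Joe" -- nearest stored photo is his
```

A person with one tagged photo is matched against a single data point; a person with hundreds is matched against a cloud of them, which is why heavy taggers are easier to recognize.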

However, despite the promises of facial-recognition technology, Facebook and others undoubtedly have a long way to go to create accurate and workable programs.

A big hurdle for biometric tech is that much of it has so far failed to detect the full range of facial variation. Companies that adopted facial-recognition technology in the early days learned the hard way that capturing and reading faces means a computer has to digitally quantify something as socially sensitive and complex as the shape and color of a person's face. The Nikon Coolpix S630 asked Asian users whether someone had blinked after a photo was taken, and HP's MediaSmart computer and Microsoft's Kinect both failed to detect people with dark skin in low lighting.

Although Facebook doesn’t yet deliver when it comes to consistently suggesting the names of people in uploaded photos, that might not be because the feature doesn’t work--instead, the feature may be erring on the side of caution. Users have hundreds (sometimes thousands) of accurately tagged photos that Facebook can access, but the success of the feature depends on Facebook's refraining from giving embarrassing or insensitive results.

Should You Worry?

Facebook's facial-recognition technology, as it currently exists, is pretty harmless. After all, not only is Facebook using it for benign purposes--tagging photos--but it's just not very accurate, as we’ve shown.

But that doesn't mean you shouldn't be worried about the future implications of such technology. Although facial-recognition technology on its own is harmless, you should realize what, exactly, Facebook is doing: It's building a huge database of people's faces.

Many people may say, "So what? If you don't want Facebook to know what your face looks like, don't use Facebook."

Well, it isn't as simple as that. Facebook can use its ever-growing database as a means of improving its facial-recognition technology, and whether you are on Facebook or not, you will be affected in the future by the site's advances in this technology, as long as you (however accidentally) show up in photos that other people have snapped.

Facebook could eventually sell the improved technology to a search-engine company, or even create its own search engine for people's faces. Considering how often the site changes its privacy policy without users' knowledge, it isn't necessarily paranoid to imagine such a situation. Assuming that the technology advances enough, it will no longer matter if you're "on Facebook" or not--if just one photo of you is out there on the Internet, you could be affected.