Facebook: Send Us Photos of You in the Buff

by John Lister

Facebook has come up with a controversial way to help people whose compromising pictures are shared online without their permission. The practice is dubbed "revenge porn" - and to solve the problem, Facebook wants users to upload naked pictures of themselves to Facebook servers.

Yes, you read that correctly.

The test program is known as the "Non-Consensual Intimate Image Pilot." It's designed to counter cases where somebody - often a former partner - tries to humiliate the victim by sharing images of them in revealing situations on the victim's Facebook wall.

Such behavior can be particularly damaging when the pictures are shared on Facebook, as this makes it considerably more likely they'll be seen by the victim's friends, family and colleagues than if the pictures were simply posted on an adult website.

Program Would Preemptively Block Pics

Facebook already has a system by which somebody can report a photo of themselves that has been shared on Facebook without their consent. Not only is the picture removed, but it's added to a blacklist so that further attempts to upload and share copies of it are blocked.

The test program goes a step further by trying to prevent images being shared in the first place. Users who either believe there's a risk an image will be added to Facebook, or have already seen it shared on other sites, can file a report with a government agency. In the test program, Australia's eSafety Commissioner is the agency that notifies Facebook; it does not have access to the picture(s).

The user then sends themselves a copy of the image via the Facebook Messenger tool, rather than uploading it to any page. A Facebook employee reviews the image to check that it violates site rules and then creates a "hash" of the image. This is effectively a digital fingerprint: a string of characters derived from the photograph that computers can use to recognize copies of it, without the fingerprint itself revealing the picture to human eyes. The user is then informed and can permanently delete the image from Messenger. (Source: fb.com)
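Facebook hasn't published the exact hashing method it uses, and real photo-matching systems (such as Microsoft's PhotoDNA) are far more sophisticated. As a rough illustration of the idea, here is a minimal "average hash" sketch in Python: it reduces a grayscale thumbnail to a string of bits marking which pixels are brighter than average, producing a compact fingerprint that says nothing human-readable about the picture itself.

```python
def average_hash(pixels):
    """Compute a simple 'average hash' of a grayscale image.

    pixels: 2D list of brightness values (0-255), e.g. an 8x8 thumbnail.
    Returns an integer whose bits mark pixels brighter than the mean.
    """
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        # Shift in a 1 for bright pixels, a 0 for dark ones.
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

# Toy 2x2 "image": two bright pixels, two dark ones.
h = average_hash([[200, 10], [220, 30]])
print(format(h, "04b"))  # -> 1010
```

Only this fingerprint would need to be stored on the blacklist; the photo itself could then be deleted.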

Human Review Raises Concerns

If anyone then tries to upload the image, it will be checked against the database of hashes (as happens with all uploads) and blocked if it matches.

While the plan makes sense in technical terms, it requires tremendous trust on the part of the user. They'll have to trust that Facebook Messenger isn't hacked while their image is being reviewed. They'll also have to trust the Facebook employee who carries out the review. There's also the problem that the "solution" involves letting somebody else (the employee) see the image, which is exactly what the user is trying to prevent. (Source: theguardian.com)

What's Your Opinion?

Is this a useful solution? Is it worth Facebook trying to block uploads even though it can't wipe out the problem completely? Could the process be completely automated or would this be open to abuse?



Navy vet:

This sounds like a complicated solution looking for a problem.

Dennis Faas:

I did a double take when I first heard about this story. Facebook is a gold mine for cyber thieves - just imagine what would happen if nude photos of Facebook users accidentally leaked due to a "flaw" in the system (or a hack)!

Sparkydog:

I don't think there are penis or vagina recognition applications, but there ARE facial recognition applications, and that should be all that is needed.
Someone is creating a large repository of nude photos.

matt_2058:

Why not make it simpler and skip the employee review of the image? Make it so ANY image a user loads gets hashed and banned? This would allow anyone to keep an image from being floated around.

If they made that a feature, I'd consider an account just to keep some family photos from being posted.

LouisianaJoe:

In most cases you are not going to possess the image that someone else might post.

DancerNYC:

If the sender's image is rejected via its hash, a slight crop will change the hash.