Google Bans User Over Image Stored on Phone

By John Lister

Google has closed the account of a user who sent a photograph of his son's inflamed groin to a doctor. It's sticking by the closure despite an obvious case of its automated systems mistakenly flagging the image as abusive material.

The account closure covers all Google services, including the man's Gmail and his phone service, which was provided by Google Fi. This was a disastrous combination: losing access to both his email and SMS messages to his phone number meant he couldn't receive the security codes needed to prove ownership of many other online accounts.

The man, identified by the New York Times only as Mark, had taken the photo for a doctor to use in a remote consultation, which led to a prescription for antibiotics. Google did not flag the image in the context of the email, but rather because the photo was automatically backed up to his Google account's cloud storage, a default setting on many Android phones.

Police Get Involved

A few days later Google disabled his account, saying the image constituted "a severe violation of the company's policies and might be illegal." It later flagged a video on Mark's phone. This was soon followed by a San Francisco police investigation that concluded he had done nothing illegal. (Source: nytimes.com)

The original flagging was done by an automated system that compares images with known abusive material. Amazingly, Google stuck by the decision even after carrying out a human review of the original automated verdict.

Context Is Everything

The company told The Guardian that "We follow US law in defining what constitutes CSAM and use a combination of hash matching technology and artificial intelligence to identify it and remove it from our platforms." (Source: theguardian.com)
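The "hash matching technology" Google refers to can be illustrated, in greatly simplified form, as checking an image against a database of fingerprints of known-bad files. The sketch below uses a plain cryptographic hash for clarity; real systems of this kind rely on perceptual hashes that survive resizing and re-encoding, and the function names and sample data here are purely hypothetical:

```python
import hashlib

# Illustrative sketch only. A cryptographic hash like SHA-256 matches only
# byte-identical files; production CSAM-detection systems use perceptual
# hashing (robust to cropping, scaling, re-compression) plus, per Google's
# statement, artificial intelligence. None of this reflects Google's
# actual implementation.

def file_hash(data: bytes) -> str:
    """Return the SHA-256 hex digest of raw image bytes."""
    return hashlib.sha256(data).hexdigest()

def is_known_match(data: bytes, known_hashes: set) -> bool:
    """Check an image's hash against a denylist of known-bad hashes."""
    return file_hash(data) in known_hashes

# Hypothetical denylist built from one "known" file
known = {file_hash(b"bad-image-bytes")}

print(is_known_match(b"bad-image-bytes", known))   # True: exact byte match
print(is_known_match(b"harmless-photo", known))    # False: not in denylist
```

Note that a pure hash lookup carries no notion of context, which is exactly the limitation the next paragraph describes: it can say an image resembles flagged material, but not why the image exists.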

According to the writer of the New York Times article, who has seen the image in question, the problem is that without context, it does appear explicit. It may be impossible for either a computer or a person to make a judgment about the image without knowing the context and the motivations of the person who created and shared it.

What's Your Opinion?

Do you have any sympathy with Google in this case? Did you know it analyzed photos in Google Cloud for potential abuse? Should it have asked for context before refusing to reinstate the account?


Comments

davolente_10330

The phrase: "What could possibly go wrong?", comes to mind. I dare say there are other examples of a perfectly legitimate illustration of something (in the right context) that may have been flagged up as "unsuitable", but for Google to stand their ground, apparently having had everything explained in simple terms, sounds pretty despicable. On the face of it, this is a classic case of an automated system that doesn't know why the picture was used and I'm certainly uneasy about stuff stored or backed-up remotely being automatically examined by a mindless robot. I'm no lawyer but, from the facts presented here, I would say that the account needs to be reinstated. Incidentally, I have no sympathy for Google, seeing as you asked! They are too pervasive for their own good.

c_hirst_2382

See https://vdocuments.mx/computers-dont-argue.html

doulosg

... What good are Google's automated systems doing with regard to actual abusers? One would think that if they were at all effective, there would be stories of those abusers being hauled into court and tried for their crimes. Instead, innocent people (presumably) are being caught in the snares that criminals are (possibly) escaping.

Wing and a Chair

PHI (protected health information) should never be sent over the open Internet in unsecured email. If the doctor received the email from the father unsolicited, there wasn't much he could do about it. But if the father contacted the doctor first and said he was emailing the picture, the doctor should have told him not to, then replied using secure, HIPAA-compliant email and had the father attach the picture to that reply.

DaLincerGuy

On my phone, as well as on the Google phone in the original post, pictures are AUTOMATICALLY sent to cloud storage once taken. Not upon request -- just done.

No way to prevent this.

The only way around this is to put the phone in airplane mode, take the picture, transfer it to local storage -- somehow without letting it reach the cloud -- then attach it somewhere, all without ever missing a beat along the way.

OUCH!

Google should restore the account.
How can we / Google prevent this overreach again? Hard.

David

Chuckster

Let's talk about monopoly control here.

First issue: what happened to personal privacy? What the hell is Google doing scanning the backup files a user stores on its cloud, a default setting most users have no clue how to change? These files would normally sit in the private confines of your phone, and they should remain private from oversight by Google or anyone else when stored in the cloud. They are NOT in the public domain the way social media posts are.

I have never trusted Google, or any cloud storage. This is more evidence of why I don't trust them or allow my backups to upload to the cloud. I get pissed when my contacts keep getting sucked up by Google. The three biggest cloud players, Google, Amazon, and Microsoft, control a major share of the entire cloud storage market. Do you really know who is looking at your "supposedly secure" data?

Did you ever wonder why Microsoft, Google, and Facebook have never been put in check by the FTC or SEC, which allowed the monopolistic purchases of tech companies and startups that let them amass such power over the decades? Google had Maps, but bought Waze, a much better map app, to augment its own Maps program. GPS tracking data is collected by so many apps now that it's ludicrous. I've always felt, and still believe, that the gov't has its hand in the back pockets of these companies. Why not? It's the perfect backdoor for the NSA and others to spy on citizens, and not just in the States.