YouTube Algorithm Sparks Supreme Court Case
The Supreme Court will rule on a key challenge to the extent to which tech companies can be held responsible for user content. The outcome could affect the long-running "publisher vs platform" debate.
The case centers on Section 230 of the Communications Decency Act, which broadly says Internet companies aren't legally responsible for content their users post, including in cases of defamation.
The validity and interpretation of that rule have been challenged many times since it was created in 1996, partly because technology has evolved. Critics of the rule say it was written when the main issue was whether web hosting companies were responsible for material on their customers' websites.
The big question now is how it applies to websites that host user-generated content such as social media and video sites. The companies that run such sites argue they are merely platforms and don't exercise editorial control in the same way as publishers.
Moderation May Remove Protection
Many legal challenges have argued that because such sites carry out content moderation, for example, blocking some posts for violating terms of use, they do in fact act as editors and bear responsibility for the material that is published. Courts have struggled to define a clear line where sites exercise enough control to become "publishers".
The latest case comes at the topic from a slightly different angle: rather than concentrating on content moderation, it deals with the way tech companies decide which content to show or promote to users. This often involves algorithms that try to figure out what content is most likely to appeal to a user's interests, or to spark an emotional response that makes them more likely to engage with and share the material.
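To make the idea concrete, here is a deliberately simplified sketch of engagement-based ranking. This is an illustration only, not a description of YouTube's actual system; the data fields (`topics`, `reaction_rate`) are invented for the example:

```python
# Toy illustration of engagement-based recommendation ranking.
# Each candidate video gets a score combining (a) how well its topics
# match the user's interests and (b) how strongly it tends to provoke
# reactions -- so provocative content can outrank a better topic match.

def recommend(videos, user_interests, top_n=3):
    """Rank candidate videos by a naive engagement score."""
    def score(video):
        # Interest match: fraction of the video's topics the user follows.
        topics = video["topics"]
        overlap = len(topics & user_interests) / len(topics)
        # Engagement signal: reaction-driving videos get a boost,
        # regardless of what the content actually says.
        return overlap + video["reaction_rate"]
    return sorted(videos, key=score, reverse=True)[:top_n]

videos = [
    {"id": "calm-howto", "topics": {"diy", "tools"}, "reaction_rate": 0.1},
    {"id": "hot-take",   "topics": {"politics"},     "reaction_rate": 0.9},
    {"id": "cat-video",  "topics": {"pets"},         "reaction_rate": 0.4},
]

# "hot-take" wins despite "cat-video" matching the user's interests
# equally well, because its reaction rate boosts its score.
print([v["id"] for v in recommend(videos, {"politics", "pets"}, top_n=2)])
```

The legal question in the case turns on exactly this kind of choice: the scoring function is the site's own decision about what to promote, not something a user posted.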
Terror Attacks Follow Videos
This case is brought by the family of a woman killed in a terrorist attack. They say YouTube bears some responsibility because it recommended videos that encouraged extreme views and were posted by known members of terror groups. How much of a role the videos played in encouraging the attack is a question specific to this case, and is not the wider legal point the Supreme Court will address. (Source: nbcnews.com)
YouTube argued that the case should be thrown out because Section 230 means it automatically bears no responsibility. The family argues that because YouTube decided (through its algorithm) to promote the videos so they'd be more likely to be seen by particular users, Section 230 doesn't apply. (Source: scotusblog.com)
What's Your Opinion?
Is Section 230 still relevant today? How much responsibility should tech companies bear for user content on their site? What's the dividing line between publisher and platform?
My name is Dennis Faas and I am a senior systems administrator and IT technical analyst specializing in cyber crimes (sextortion / blackmail / tech support scams) with over 30 years of experience; I also run this website! If you need technical assistance, I can help.
Comments
IMO, Section 230 is still relevant as summarized in this piece. At the same time, I do not think it should apply when an entity curates information in such a way that it targets a user.
It seems like the rule was written for internet providers and site hosting companies. As I read it, it does not alleviate responsibility for DIRECT and pointed involvement in a questionable activity. When reading "Internet companies" in the first quote, I believe it refers to ISPs, not "internet company" as the term is used today, which covers any business with an internet presence.
"Section 230 of the Communications Decency Act, which broadly says Internet companies aren't legally responsible for content they post, including cases of defamation."
&
"The validity and interpretation of that rule has been challenged many times since it was created in 1996, partly because technology has evolved. Critics of the rule say it was written when the main issue was whether web hosting companies were responsible for material on their customer's websites."
Rules
Section 230 does not allow a platform to be sued for content.
Let the platforms know they will lose their protection the second they create "fact checkers".
If someone is being "provocative" are they following the platform rules?
What's the point of having an internet if Big Brother controls it?
People need to re-learn the concept of "sticks and stones" and get away from this whole "emotional wellbeing" folderol.
If it's libel, deal with it in court.
If it's emotional, see a shrink.