Can Big Tech be blamed?

Are Google, Facebook, Twitter and other tech giants responsible for what their users see on social media? The question, argued before the Supreme Court Tuesday, could change the Internet radically — perhaps for the better.

Victims of the Paris, Istanbul and San Bernardino terrorist attacks have sued YouTube under the name of American victim Nohemi Gonzalez, saying its targeted recommendations aided and abetted the terrorists.

Google, YouTube’s parent company, argues that the Communications Decency Act, specifically Section 230, shields Big Tech companies from liability for what users write or post.

The Gonzalez underlying-liability theory is strained, asserting terrorists would have been peaceful individuals had not YouTube’s recommendations exposed them to radical thought. But bad facts make bad law — and, because Google has relied on the Section 230 defense, the court may take the opportunity to correct an erroneous interpretation of the law.

Section 230’s text, as Justice Ketanji Brown Jackson recognized in argument, gives YouTube limited protections like those that telephone companies enjoy. Verizon has no legal liability if you defame your boss to your family over the phone. Similarly, YouTube is not liable if you post a video doing the same thing.

Nohemi Gonzalez died in the shootings and suicide bombings that killed 130 people in the French capital. (AP/Chris Carlson)

But: YouTube is liable under Section 230 for its own speech — say, posting a statement from its management team stating defamatory things about your boss.

Narrower question

Gonzalez vs. Google presents a narrower question: whether YouTube’s targeted recommendations are the platform’s own speech.

YouTube argues they are simply the reorganized speech of its users — and thus come under Section 230.

This narrow question is a tough call that perplexed the justices. They seemed unsatisfied with both parties’ answers.

Personally, I think that if YouTube knows you’re a radical and ISIS videos appear at the top of your newsfeed every day, YouTube is trying to communicate a particular message. Section 230 should offer no protection.

Nohemi Gonzalez was a 23-year-old American college student when she died. (Reuters)

The question, though, is whether the court has enough information to decide this. We must know more about how these algorithms work — information that’s not part of the record. Further, as Justice Elena Kagan observed, she and her colleagues are not Internet experts. The court could remand the case for discovery. Or, as Justice Amy Coney Barrett suggested, it could decide the case on the aiding-terrorism claim without reaching Section 230. Given the incomplete record, that may be the best result.

One argument that was raised disheartened me, however. Some justices feared that paring back the extravagant Section 230 protection that some lower courts have given the platforms would result in endless lawsuits. The genie is out of the bottle already, the thinking goes — moderating billions of posts is impossible, so we should give the platforms a pass, lest lawsuits swamp the federal judiciary.

Poor argument

That’s a rotten argument — even allowing for the Chicken Little claims that all industries make when defending their legal protections. Congress explicitly intended a different part of Section 230 to give the platforms an incentive to take down porn and other content bad for kids. The legislative history is clear. After all, the statute is called the “Communications Decency Act.”

But the platforms didn’t do that. Instead, they spent 25 years, relying on the best lawyers in the country, pursuing a legal strategy that immunizes every editorial decision they make.


Under that Section 230 liability umbrella, the platforms have hurt children. And the harm is not only that porn is now ubiquitous. Depression, mental illness and loneliness among adolescent girls have reached epidemic rates, as a CDC report released this month documents. Leading psychologists, such as Jean Twenge and Jonathan Haidt, identify social media as the cause.

Some justices suggested that this is Congress’s mess. It’s not. The court closed its eyes for 20 years to lower courts’ decisions that ignored Section 230’s text and intent. It cannot now, like Pontius Pilate, wash its hands. If compelled to reach the matter, the court must read Section 230 as written and affirm its limited protection.

Adam Candeub is a law professor at Michigan State University, where he directs the IP, Information and Communications Law Program.

Source link: https://nypost.com/2023/02/21/supreme-test-in-terror-death-posting-case-can-big-tech-be-blamed/
