The technology policy law that so many now love to hate got another rewrite proposal on Thursday, when four leaders of the House Energy & Commerce Committee announced a bill to hold social platforms accountable for personalized algorithmic recommendations of content that inflict “serious physical or emotional harm.”
The Justice Against Malicious Algorithms Act, introduced by Representatives Frank Pallone, Jr. (D-N.J.), Mike Doyle (D-Pa.), Jan Schakowsky (D-Ill.), and Anna Eshoo (D-Calif.), arrives a week after Facebook whistleblower Frances Haugen testified before the Senate Committee on Commerce, Science, and Transportation that social network algorithms promote angry content to keep people engaged on the platform.
A five-page draft indicates that the bill would amend Section 230 of the Communications Decency Act, the provision of the 1996 law that generally says online platforms are not liable for what their users post; their users are. It would lift that civil immunity for platforms that make “a personalized recommendation” of content that “has subsequently materially contributed to serious physical or emotional injury to any person.”
The bill would not cover content displayed in response to users' searches, and it would exempt online services with fewer than 5 million unique monthly users, as well as underlying infrastructure providers such as web hosts and domain registrars.
The supporting statements the committee published alongside the bill text include quotes like this one from Joan Donovan, research director of the Shorenstein Center on Media, Politics and Public Policy: “Disinformation, hate speech and incitement content are reaching massive scale, as recommendation algorithms propel this content farther and faster the more people interact with it.”
But like many other proposals to revise “CDA 230,” this bill's attempt to ax abusive algorithms may fail First Amendment review. In an Oct. 9 Washington Post op-ed, Jeff Kosseff, associate professor of cybersecurity law at the US Naval Academy, and Daphne Keller, director of the Program on Platform Regulation at Stanford University's Cyber Policy Center, wrote that courts tend to frown on attempts to cut corners on publishers' constitutional rights to choose what they wish to publish.
“Although the First Amendment allows liability for certain lies, such as libel, fraud and deceit, lawmakers cannot simply prohibit all misleading speech,” they wrote. “The Supreme Court has made clear that laws restricting the distribution of unpopular speech raise the same First Amendment issues as laws that ban such speech outright.”
In a statement emailed to reporters on Thursday, Fight for the Future, a tech policy group not known for its love of giant tech companies, warned of the bill's potential consequences.
"In real life, this bill would function more as a repeal 230 than a reform, as it opens the door to frivolous lawsuits afconfirming that algorithmically amplified user content has caused damage, ”wrote director Evan Greer. She added: "Facebook itself has requested 230 changes because they know they will be in a good position to overcome them (they can afford a lot of lawyers) while the smaller platforms will not." . " What's new now to get our best stories delivered to your inbox every morning. ", "First_published_at ": "2021-09-30T21: 30: 40.000000Z " , "published_at ": "2021-09-30T21: 30: 40.000000Z ", "last_published_at ": "2021-09-30T21: 30: 34 .000000Z ", "created_at ": null, "updated_at ": "2021-09-30T21: 30: 40.000000Z "}) "x-show = " showEmailSignUp () "class = " rounded bg-gray- the lightest text - md: px-32 md: py-8 p-4 font-brand mt-8 container-xs ">