Under proposed legislation in California, companies would be held legally responsible for using algorithms and design features that addict young people.
The bill would prohibit social media companies from using a design, algorithm, feature or practice that the company knows will cause young people to develop an eating disorder, harm themselves or others, or become addicted to the platform, or that exposes them to content facilitating the purchase of controlled substances such as opioids, providing information on how to die by suicide, or enabling the illegal sale of guns. Companies found guilty of violating the Act could be sued by public prosecutors and fined as much as $250,000 for each offence.
In the view of tech lobbyists, digital rights advocates and others, the proposal would violate federal law and the US Constitution. Sophia Cope, senior staff attorney at the nonprofit Electronic Frontier Foundation, said the Supreme Court has generally interpreted the First Amendment to protect publishers' editorial decisions. Eric Goldman, a professor at Santa Clara University School of Law, agreed that the First Amendment limits lawmakers' ability to control the flow of information to children. In addition, Section 230 of the federal Communications Decency Act generally shields tech companies from legal liability for the content their users post on their platforms.
Last week, the bill was sent to the Senate Committee on Appropriations and placed in a special pile of bills, known as the 'suspense file', that is officially reserved for analysis of their budget impact. Unofficially, it is a politically expedient place for lawmakers to get rid of bills without getting blamed. A similar bill introduced last year ended up in the suspense file and died there.