Analyzing KOSA's Constitutional Problems In Depth

EFF does not, however, think KOSA is the right approach to protecting children online.
As we've said before, we think that in practice, KOSA is likely to exacerbate the risks of children being harmed online because it will place barriers on their ability to access lawful speech about addiction, eating disorders, bullying, and other important topics.
We do not think that language added to KOSA to address that censorship concern solves the problem.
Based on our understanding of the First Amendment and how all online platforms and services regulated by KOSA will navigate their legal risk, we believe that KOSA will lead to broad online censorship of lawful speech, including content designed to help children navigate and overcome the very same harms KOSA identifies.
KOSA makes online services that host our digital speech liable should they fail to exercise reasonable care in removing or restricting minors' access to lawful content on the topics KOSA identifies.
KOSA is worse than the ordinance in Smith because the First Amendment generally protects speech about addiction, suicide, eating disorders, and the other topics KOSA singles out.
To comply, we expect that platforms will rely on blunt tools: either gating off entire portions of their sites to prevent minors from accessing them, or deploying automated filters that will over-censor speech, including speech that may be beneficial to minors seeking help with addiction or the other problems KOSA identifies.
We understand that some interpret this language as a safeguard for online services, limiting their liability if a minor happens across information on topics that KOSA identifies, so that platforms hosting content aimed at mitigating addiction, bullying, or other identified harms can take comfort that they will not be sued under KOSA. But EFF does not believe the rule of construction will limit KOSA's censorship, in either a practical or constitutional sense.
In short, KOSA does not clarify that a minor's initial search for a forum precludes any liability should the minor later interact with that forum and experience harm.
KOSA supporters argue that because the duty of care and other provisions concern an online service's or platform's design features, the bill raises no First Amendment issues. But the bill does not regulate design in isolation: the provision specifically links those design features to minors' access to the content that KOSA deems harmful.
KOSA creates liability for any regulated platform or service that presents certain content to minors that the bill deems harmful to them.
Some people have concerns that KOSA will result in minors not being able to use social media at all.
We cannot know how every platform will react if KOSA is enacted, but smaller platforms that do not already run complex automated content moderation will likely find it financially burdensome to implement both age verification and content moderation tools.
One recurring fear that critics of KOSA have shared is that they will no longer be able to use platforms anonymously.
As we've said, KOSA doesn't technically require age verification, but we think it is the most likely outcome.
A provision of KOSA does require the National Academy of Sciences to research these issues and issue reports to the public.
As we have said repeatedly, we do not think KOSA will address harms to young people online.
Even if your stance on KOSA is different from ours, we hope we are all working toward the same goal: an internet that supports freedom, justice, and innovation for all people of the world.
We don't believe KOSA will get us there, but neither will ad hominem attacks.



