
[ PRIVACY Forum ] Script of my national radio report last Monday on

daemon@ATHENA.MIT.EDU (Lauren Weinstein)
Wed Apr 1 13:56:07 2026

Date: Wed, 1 Apr 2026 10:46:08 -0700
From: Lauren Weinstein <lauren@vortex.com>
To: privacy-dist@vortex.com
Message-ID: <20260401174608.GA29316@vortex.com>
Content-Disposition: inline
MIME-Version: 1.0
Content-Transfer-Encoding: 7bit
Content-Type: text/plain; charset="us-ascii"; Format="flowed"
Errors-To: privacy-bounces+privacy-forum=mit.edu@vortex.com


This is the script from my national network radio report last Monday
on recent court decisions holding Meta and Google's YouTube liable
in cases involving their impact on children. As always there may have
been minor wording variations from this script as I presented this
report live on air.

 - - - 

Yeah, so this is a rather complicated situation, and frankly I don't
think most of the media is doing a really good job explaining it. I
don't feel they're doing this on purpose; there are just a lot of
layers to understand.

So there have been new court rulings that found Meta and Google's
YouTube liable for issues related to how their social media platforms
interact with children. YouTube, I should note, insists that it's a
video streaming platform, not a social media platform per se, and
there is some merit in that argument, though as we'll see this is
something likely to be determined down the line.

The overarching principle here is that some observers view these kinds
of outcomes as being Big Tech's "tobacco" moment, drawing parallels
with the historical court cases related to smoking and cancer. But
many legal experts who've looked at this in depth are less certain
where this will all go. There are of course differences between
someone physically addicted to cigarettes and the assertion that
someone's mental state has been negatively affected -- by some
percentage -- by, say, social media, along with all the other factors
that also affect mental state.

A fundamental complication is that while you can measure the amount
of nicotine in the blood and determine the degree of addiction and
direct effects like lung cancer, trying to make similar determinations
regarding complex factors such as one's mental state seems like a
different kind of problem, even if you could somehow separate out the
various influences in a rigorous manner. So putting this all together,
it appears to many in the legal community that as these cases move up
the court system -- and I believe the companies have already pledged
to appeal -- appellate courts may take a different view than the lower
courts when they consider these same issues and associated matters
such as the First Amendment, privacy, and other concerns surrounding
attempts at age verification, and so on -- issues we've talked about
here in the past.

So in a very real way it's widely felt that there is a long road ahead
likely all the way to the Supreme Court, which has already expressed
some skepticism about broadly blaming Big Tech social media for these
kinds of issues. This is one situation where it's not at all clear how
the current Supreme Court might actually rule, especially because
these issues are controversial among both political parties.

However, I do want to draw a sharp line where it often seems that very
different issues are being inappropriately blurred together. And
that's the difference between social media and large language model
AI -- that is, AI chatbots -- which seems like a very different
category than social media. Social media traditionally has mostly
dealt with third-party content and been protected by what's called
Section 230, and the controversies regarding that protection are
deeply enmeshed within this entire complex of issues.

But at least in my view, AI chatbots -- which you'll recall have been
implicated in suicides and a murder -- represent FIRST-party content
on the part of their Big Tech firms, and without any doubt these firms
should be held fully responsible for all effects that their chatbots
have on individuals.

And this difference between third-party content and first-party
content IS something that is all too often muddled in discussions
about Big Tech, and I don't think that kind of confusion helps solve
these problems. These issues will be getting more complex. For
example, Google is now facing a new class action related to their AI
Search and the Epstein victims. We're NOWHERE near the end of these
controversies. We've really only just begun.

 - - -
--Lauren--
Lauren Weinstein 
lauren@vortex.com (https://www.vortex.com/lauren)
Lauren's Blog: https://lauren.vortex.com
Mastodon: https://mastodon.laurenweinstein.org/@lauren
Signal: By request on need to know basis
Founder: Network Neutrality Squad: https://www.nnsquad.org
         PRIVACY Forum: https://www.vortex.com/privacy-info
Co-Founder: People For Internet Responsibility
_______________________________________________
privacy mailing list
https://lists.vortex.com/mailman/listinfo/privacy
