Julian Sanchez offers up some sage advice:
There's more to it than this, though. It's certainly true that any government effort to police Facebook's content is almost certain to be overturned by the Supreme Court and is therefore pointless. But it's also true that we shouldn't want the government to police Facebook.
My standard for this is simple: Whatever it is you want regulated on Facebook, would you also want the same thing regulated on CNN or the Washington Post? Why not? Speech is speech, after all, regardless of whether it's in print, pixels, or modulated carrier waves on a cable system.
(Private action, of course, is entirely different. We should all feel free to campaign against Facebook in any way we please. Public pressure is a great way to push media outlets to change the way they operate.)
I'm annoyed that I continually find myself defending Facebook these days. I'm hardly a big fan, but my objections—which now seem practically stodgy—have always revolved around its persistent disregard for personal privacy. More recently, though, the criticism of Facebook (and other social media platforms) has revolved around content, and I'm a lot less comfortable with that.
Partly this is because I remain an old-school liberal who believes in free speech. There are limits, as with everything, but those limits should be pretty loose.
But it's also partly because I think we've all gone slightly bonkers over our view of Facebook's power. I've spent a fair amount of time researching this, and it turns out there's very little evidence that Facebook actually influences public opinion all that much. There are several reasons for this:
- Surveys show that lots of people get (some of) their news from Facebook, but only a small fraction of that is political news. The vast majority of it is sports or gossip or cute animals.
- As we all know, Facebook users are very siloed. Is there a lot of conservative misinformation on Facebook? Sure. But it mostly gets read by folks who are already true believers.
- There's also positive news on Facebook. In 2020, for example, Facebook was instrumental in getting people out to vote. If you weigh this against the misinformation, it comes out close to even.
There's a ton of research suggesting that social media usage is correlated with depression or loneliness or whatnot. But there's precious little to show that it increases belief in conservative misinformation. It might! But so far there's just not much hard evidence to back this up.
So sure, keep pressuring Facebook to do the right thing. Pressure the Biden administration to crack down on Facebook mergers. (It's big enough to deserve a very hard look if it tries to merge or buy a related company.) But until there's better evidence, ease up on the "criticism" that Facebook makes it easier for people to meet in groups or pass along gossip. There have always been good groups and bad groups, just as there's always been good gossip and bad gossip. Facebook really hasn't changed that very much.