
Here’s a weekend brain dump about Facebook

Hum de hum. Facebook. I'm truly on the fence about Facebook and disinformation, and I feel like I need to write down some things just to clear my head a little bit. If you feel like following along, take all of this as a bit of thinking out loud. I have very few firm conclusions to draw, but lots of questions swirling around that have influenced how I think about it.


First off, my one firm belief: Facebook is a private corporation and has the same First Amendment rights as any newspaper or TV station—or any other corporation, for that matter. This means that I oppose any content-based government regulation of Facebook, just as I oppose content-based regulation of Fox News or Mark Levin, no matter how hideous they are.


I remain vaguely appalled at the lack of serious research into the impact of Facebook and other social media platforms on politics and disinformation. There's just very little out there, and it surprises me. I know it's hard to do, but social media has been a big deal for nearly a decade now, and I would have expected a better corpus of rigorous research after all this time.


It turns out, however, that there's one thing we do know: social media has a clear positive impact on getting out the vote. Whatever damage it does needs to be weighed against this.


The Facebook empire is truly gigantic. For ordinary regulatory purposes, it should be treated as a monopoly.


I've been uneasy from the start with the Frances Haugen show. Partly this is because it's been rolled out with the military precision of a Prussian offensive, and that immediately made me skeptical of what was behind it. This might be unfair, but it's something that's been rolling around in my mind.

I've also been unhappy with the horrifically bad media reporting of Haugen's leaked documents. There's been almost a synchronized deluge of stories about Instagram being bad for teenage girls, a conclusion that's so wrong it's hard to know where it came from. The truth is simple: On one metric out of twelve (body image), Instagram had a net negative effect on teen girls. On the other eleven metrics it was positive. And it was positive on all twelve metrics for teen boys.

It's hard to draw any conclusion other than the obvious one: Instagram is a huge net positive for teens but has one or two problem areas that Facebook needs to address. That's it. But this is very definitely not what the media collectively reported.


I've read several studies suggesting that Facebook spreads more disinformation than other platforms. However, it's not clear if this has anything to do with Facebook's policies vs. its sheer size. One study, for example, concluded that Facebook generated about six times more disinformation than Twitter, but that's hardly a surprise since Facebook has about six times the number of users.

And YouTube is largely excluded from these studies even though it's (a) nearly as big as Facebook, and (b) a well-known cesspool. I assume this isn't due to YouTube being uninteresting. Rather, it's due to the fact that it's video and therefore can't be studied by simply turning loose some kind of text-based app that analyzes written speech.


There are lots of social media platforms, of course, but disinformation is always going to be most prevalent on those that appeal to adults (aka "voters"). Just by its nature, Facebook will attract more disinformation than TikTok or Snapchat.


All that said, there's no question that (a) Facebook and its subsidiaries account for a gigantic share of the social media market, and (b) they have a legitimate vested interest in keeping users engaged, just like every other social media platform. So where should their algorithm draw the line? What's the "just right" point at which they're presenting users with stuff they're interested in but not pulling them into a rabbit hole of right-wing conspiracy theories?

This is a genuinely difficult question. I can be easily persuaded that Facebook has routinely pushed too hard on profits at the expense of good citizenship, since I figure that nearly every company does the same. But I don't know this for sure.


Nonetheless, Mark Zuckerberg scares me. The guy's vision of the world is very definitely not mine, and he seems to be a true believer in it. This is the worst combination imaginable: a CEO whose obsessive beliefs line up perfectly with the maximally profitable corporate vision. There's just a ton of potential danger there.


Zuckerberg is also way too vulnerable to criticism from right-wingers who have obviously self-interested motives. I suppose he recognizes that they're as ruthless as he is, and is trying to avoid a destructive war with the MAGA crowd. He just wants to be left alone to change the world and make lots of money.


Facebook is a big source of misinformation and conspiracy theories. That's hardly questionable. But conspiracy theories have been around forever and were able to spread long before social media was around. I wrote about this at some length here, and it's something I wish more people would internalize. In the past conspiracy theories were spread via radio; then newsletters; then radio again; then email; and eventually via social media. And those conspiracy theories were no less virulent than those of today. (Just ask Bill Clinton.) The only real difference is that Facebook allows conspiracy theories to spread faster than in the past. But then again, everything is faster now, which means the pushback to conspiracy theories is also faster.


There's another difference: Facebook makes all this stuff public. In the past, most people didn't really know what kinds of conspiracy theories were being peddled until they finally got big enough to appear on the cover of Time. That's because they were spread via newsletters that were mailed only to fellow true believers. Ditto for email. Ditto for radio if you didn't listen to the late-night shows where this stuff thrived. So we all lived in happy ignorance. That's gone now, because Facebook posts and comments are all instantly available for everyone to see. That makes it scarier, but not really all that different from the past.


As always, I continue to think that the evidence points to Fox News and other right-wing outlets as the true source of disinformation. Facebook disinformation mostly seems to stay trapped in little bubbles of nutcases unless Fox News gets hold of it. That's when it explodes. Facebook may be the conduit, but most often it's not the true source of this stuff. Traditional media is.

Also Donald Trump.


I mean, just look at polls showing that two-thirds of Republicans think the 2020 election was stolen. Two thirds! There's no way this is primarily the fault of social media. It's Fox News and talk radio and Donald Trump and the Republican Party. Facebook played at most a small supporting role.


Remember: this is just me doing a brain dump. I'm not pretending everything here is God's own truth. It's just the stuff that I've seen and read, which influences the way I've been thinking about Facebook and social media more generally. Caveat emptor.

60 thoughts on “Here’s a weekend brain dump about Facebook”

  1. sonofthereturnofaptidude

    The US has had a massive political movement shot through with conspiracy theories before: the Populist movement of the late 19th century. Interestingly, while the movement's conspiracy theories were all over the place, the general thrust of the Populist Party's arguments was against monopoly, and they had their intended effects, albeit after the movement had lost its force.

    The Populist Party's main media outlets were a large, mainly uncoordinated bunch of independent newspapers, also out to make a profit. Fox News and FB are huge monopolistic entities with a hold on public opinion that's never been seen before, so the conspiracy theories of our political times exist in a very different habitat than ever before.

  2. kaleberg

    There are two problems with Facebook.

    It is a "hot" medium, to use McLuhan's term, so it rewards engagement just as radio did in the 1930s and unlike television in the 1960s. That makes it more likely to be used for evil purposes. Facebook has the ability to censor posts, but it has to moderate an awful lot of them. Even in China, which employs an army of censors and can apply the force of law, dissenting opinions often find a channel before being swatted down. Facebook is always going to be something of a garbage pit, and the Facebook censorship battle is going to wind up in the wrong forum: the courts and legislatures. I don't feel sorry for Facebook, but I'm not sure I like where this is going. People are watching the hearings in DC while the action seems to be at the state level.

    The other problem is that Facebook and Google control almost the entire online advertising market because they bought up all of the competitors and provide a corporate friendly advertising interface. If you want to advertise at scale, they are the only avenues, and we are not likely to see any alternatives for some time.

  3. OverclockedApe

    "The only real difference is that Facebook allows conspiracy theories to spread faster than in the past."

    As you pointed out above this, FB's own studies showed how much more effective it is at GOTV. Its other ways of spreading information are also more effective, and that includes mis- and disinformation. The other thing is that there are a good number of studies, but they're internal to these companies for the most part, and FB has been known to block researchers' access in order to stop external studies.

    As I linked in the other thread this weekend, for whatever reason conservative links get promoted more than liberal ones over at Twitter as well:

    https://blog.twitter.com/en_us/topics/company/2021/rml-politicalcontent

    My personal theory is that the added angle of conflict elevates the algorithm's "engagement" numbers, but we really need to see all their internal data to get a better and more specific idea.

  4. Steve_OH

    From the NY Times article quoted in your previous post:

    Within days, Facebook had recommended pages and groups related to QAnon...

    How does it make sense for Facebook to ever recommend anything related to QAnon?

      1. MontyTheClipArtMongoose

        I wonder if Mark Zuckerberg is maintaining a residence in Virginia so he can vote this year for Glenn Yungkins, then next year in California for "Hello" Larry Elder's next attempt at an electoral coup.

        1. OverclockedApe

          Tbh with his Augustus fetish, and relationship with Peter Thiel etc, I think he goes broader than a single vote on things he wants. That whole "Move fast and break things" writ large.

          I'm not so sure that he's sold on conservatism as much as he is in having as much control as he can possibly attain.

        2. Austin

          When you register to vote in Virginia, they do ask if you’re registered to vote someplace else and (in theory - I don’t know if they actually follow through) they imply that they will notify the other jurisdiction that you live and vote in Virginia now. That should (again in theory) prevent voting in two places in the same election cycle.

          It works for drivers licenses - the VA DMV definitely had my previous out of state license cancelled when I got my new VA one.

    1. Spadesofgrey

      Because QAnon is Republican. If Democrats had outright called this out in 2020, it would have been seen as the elitist con that it now is in 2021, losing members faster than Obama after his blunders.

  5. TheMelancholyDonkey

    The conversation about Facebook suffers from a narrow focus on its effects in the United States. I largely agree with Kevin about these. But I also think that Facebook is much more dangerous in less developed countries than it is here.

    In these countries, there is even less oversight of Facebook. I'd bet that it is the source of much more damaging misinformation than it is here. It is a much more used tool for organizing violent movements.

    But, that's all hunches. What Kevin is most correct about is the lack of good research about any of this.

  6. Citizen99

    Couple things here.
    I'm tired of hearing the term "conspiracy theory" applied to every variety of bullshit. Please let's just use that only for theories about conspiracies that have no credible basis. There are all kinds of untruths that have nothing to do with conspiracies. And, of course, there are conspiracies (and theories about them) that are absolutely real.
    Second thing. It's quite obvious that journalists are singularly sensitive about the First Amendment and any restrictions thereon. But I think that devotion has been weaponized to the point where it's absurd. Follow this reasoning: we know that lying in a court of law, or in any proceeding where you have been subject to the archaic ritual of holding up your hand and saying "So help me, God," is illegal. Lying to a law enforcement official, regardless of rituals, is also illegal. And lying about a matter that could threaten a business's profitability will likely subject you to civil penalties for libel or slander.
    So why is it perfectly fine for a politician to lie to the voters?
    The degree of civic damage that can be (and has been) done by lying to voters is vastly broader and deeper than most of the lies that have ever been told under penalty of perjury, slander, or libel.
    So why is that considered sacred, other than the fact that everyone says it is?
    I can anticipate what some of the answers will be . . . "Who gets to decide what's true and what isn't?" . . ."Where do you draw the line between opinions and facts?" . . . and so on.
    But those just don't wash for me. I can't imagine that the Framers intended the First Amendment to protect the kind of insanity we are living through now.
    Discuss.

      1. TheMelancholyDonkey

        Politicians lie because they are rewarded for it. Americans (and pretty much everyone else) have made it clear that they will never elect someone who consistently tells the truth. Like most human beings, politicians respond to the rewards they are given. So, they lie.

    1. illilillili

      Totally agree. Except for the part about the Framers. The Framers specifically intended that context is important. It doesn't matter what the Framers intended; we are now the Framers for our time. We do want to understand the problems they faced and the solutions they created, but we don't want to limit ourselves.

  7. Scurra

    For me, the biggest red flag - and it's true about Murdoch's set-up as well - is that Zuckerberg controls the company and there's nothing that shareholders can do about it. Even if someone else owned all the shares, Zuckerberg still has the magic one that effectively gives him all the power.

    I would probably be a lot less concerned about either FB or Fox if this problem was addressed. Because the "free" market* has repeatedly shown itself vulnerable to the opinions of customers when profit is on the line and thus shareholders are going to have influence; there is actual accountability even if it is at one or two removes from the main decision making board. That doesn't seem to be true here - Zuckerberg is really not accountable to anyone. (I'm not actually sure if it is true about Alphabet? It may also be.)

    1. illilillili

      Alphabet is at least a duopoly. Page and Brin are at least two people. And Brin is a Russian immigrant, so he has some interesting background that makes his worldview a lot different from Zuckerberg's. And they are relatively removed from the day-to-day management of Google.

  8. MontyTheClipArtMongoose

    Mark Zuckerberg isn't extremely vulnerable to rightwing criticism. He is rightwing criticism, fullstop.

    People think I am playing when I say Zucks (& Snowden!) have drawn more blood than fellow Virginian Willenial Seung-Hui Cho, but I am deadly serious. Zucks is a reactionary bitchboi who found it difficult getting laid, & rather than evolve, he retrenched. & in the process, he collaborated in Myanmar genocide & Russian attacks on democracy in the UK & US (successful) & Catalonia & France (unsuccessful, so far) & Belarus (mixed results), et al.

    The only thing that's surprising, given Zucks & Poots being het up for national autonomy, is that they didn't try to game the Scottish independence plebiscite, as they did with Brexit & Barcelonexit. (Gee, I wonder why.)

    Fuck him all the way back to Usenet.

  9. sfbay1949

    "There's been almost a synchronized deluge of stories about Instagram being bad for teenage girls, a conclusion that's so wrong it's hard to know where it came from"

    How can you know this, Kevin? Sounds like an opinion to me. That's the problem: mixing opinion with some facts leads to conclusions that are not in the least reliable, quantifiable, or useful.

  10. Dana Decker

    I don't know where this fits in, but since 2000 the major television networks have switched from scripted comedies and dramas to low-cost reality shows that emphasize conflict: fêting low character, ruthless climbing of the greasy pole, cheating, ugly put-downs, and a pot about to boil over with anger.

    Trump was a perfect match for that when he was the star of The Apprentice.

    Trash-television king Mark Burnett created and produced The Apprentice and Survivor - both the human equivalent of cockfighting.

    For entertainment at medieval fairs, a cat would be nailed to a post and men would slam into it with their foreheads. We're not there (yet) but celebrating personal destruction is definitely part of today's popular culture.

          1. Austin

            Ugh thanks for introducing this into my mental image gallery. I could’ve gone my entire life without knowing this tidbit of cruelty from history.

  11. cld

    from Mad Magazine, 1968,

    https://pbs.twimg.com/media/Dqwt2VCWoAILAj0.jpg

    Fox is only exploiting something that's always been there, and conspiracy theories are only exploiting something that's always been there.

    When you had to go to the library to look anything up, this didn't stop conspiracy theories, but it limited them because people knew they had to actually do something to find out anything, and most people would never go near a library unless they had to, which limited the bulk of general information to people who would actually do that.

    Now all the information they can handle is dumped on them and most people have no natural aptitude for assimilating it, so whatever appeals most to the imagination piles up until it's a great tower of fancied interest, an illusion of being informed, or hipness as schizophrenia, where areas of misapprehension spill over into one another and Q becomes their organizing aesthetic.

    What if there were no algorithm?

    Isn't the algorithm's collective effect the same thing as the 'grooming' of a child molester, or the systematic misrepresentations of con men? Its action is inherently deceptive in a complex way that simple advertising is not, because it voids context entirely, giving you just that hit of heroin or that one more shot of whisky without doing anything to limit or ameliorate the effect or help you keep your job.

    With information and learning context is everything and the algorithm throws that away.

    1. cld

      from,

      https://www.technologyreview.com/2021/09/16/1035851/facebook-troll-farms-report-us-2020-election/

      In the run-up to the 2020 election, the most highly contested in US history, Facebook’s most popular pages for Christian and Black American content were being run by Eastern European troll farms. These pages were part of a larger network that collectively reached nearly half of all Americans, according to an internal company report, and achieved that reach not through user choice but primarily as a result of Facebook’s own platform design and engagement-hungry algorithm.
      . . . .

      It's like Facebook was made to work this way.

      You might ask, well what about the fairness doctrine for algorithms?

      It will never work, because the people in charge of implementing it would always be carefully emplaced opponents of it, since it would disincentivize the reader.

      1. iamr4man

        The fact that a lot of the conspiracy rabbit hole crap on Facebook is put there by Russians to sow the seeds of discord is not mentioned by Kevin. When he talks about conspiracies of the past, those were spread on the radio and by word of mouth, sure. But imagine the scandal in the past if it had been shown that Rush Limbaugh was a foreign agent. The fact that conspiracies are being invented and spread by Russia gets little attention, and the Right embraces it because it helps them.

  12. cld

    Too many people think dystopias reflect their ideal selves, beset by conflict, and are unwilling to do anything about it.

    Take the drama down a notch and what do they have?

  13. golack

    You're right more independent research needs to be done.

    The damage to teenage girls is probably being overplayed. But you cannot just lump all the headings together. Bad body image is a red flag. I can't tell if it's normal teen angst or something much, much more serious. And doing better in some other category may not make up for the damage.

  14. Solar

    "There's been almost a synchronized deluge of stories about Instagram being bad for teenage girls, a conclusion that's so wrong it's hard to know where it came from"

    What? Come on Kevin, even for your continually declining standards this is just a bunch of BS and fake outrage (a specialty of yours lately).

    The WSJ first made public the report highlighting the negative impact on teens in mid-September, and then other media followed with their own takes during the following week or so, but that's about it. There has not been a continuous, never-ending daily deluge of stories like, say, what we have seen with the Gabby Petito story, which has been going non-stop for close to two months now. As far as I can tell, a month after the Instagram report came out, you are the only one who still brings it up seemingly every other day.

    As for where it came from, it came from FB itself. When their own conclusion included the phrase "1 in 3 teen girls blames Instagram for making their body issues and problematic social media use worse," and their own ranking of the issues puts social comparison and body issues as the top two concerns for their teen users, don't be surprised if that is what gets the most attention.

    Anything, whether a drug, a food, a training program, or whatever other type of product, that says 1 in 3 people will suffer negative effects from using it will get badly criticized and be either removed from the market entirely or severely regulated. More so if the main target audience is largely composed of minors.

  15. skeptonomist

    Kevin again draws the wrong conclusion that Instagram is a "huge net positive for teens". All we know from the survey is that the teens themselves say it is positive. Of course they would say that to justify their use, whether it is true or not. In fact, it is very unlikely that the teens could have detected an actual positive influence in all the various categories listed in the survey. To tell whether the teens are being affected one way or another overall, they would have to be examined by experts. But the response in the one category that was negative, female body image, is probably significant.

    And he again seems not to realize that these platforms are not regulated like CBS or Fox News, in that they are exempt from libel suits because of Section 230. Changing that law would presumably force the platforms to be more careful with some content, but it would not prevent them from linking to Fox News or other right-wing media, which are themselves subject to libel suits. So presumably what would be left would look a lot like Fox News, not QAnon.

  16. Solar

    "The only real difference is that Facebook allows conspiracy theories to spread faster than in the past."

    It is indeed faster in the sense that in the past going down a rabbit hole required some effort, with people actually having to dedicate some time seeking the misinformation they were looking for, while now it's all there in their hand at the touch of a button.

    It is also worse because it is not only faster but far more wide-reaching. In the past it was fairly easy to stay isolated from the nonsense if you were not actively seeking it. Not so with FB. Now all it takes is one of your close contacts posting idiocy and you'll start getting exposed to it too, making it that much likelier that someone falls prey to it and becomes hooked.

    The final piece that makes FB so bad is that in the past, those who felt some inclination toward conspiracy theories or other types of bogus information were self-restrained. The fear of being the only loon with such ideas kept people away from the precipice of madness, keeping them attached to reality to some extent. With FB that self-control is gone, because no matter how crazy their ideas are, it is extremely easy to find people who think the same, which immediately jettisons that self-restraint and makes people embrace the craziness without looking back.

  17. illilillili

    There are so many problems with this post.

    * You actually do support content-based censorship. Not all speech is free speech. The big question these days is whether clear and obvious lies should be considered free speech. Do you advocate allowing people to broadcast lies? Where is your boundary line between broadcasting lies and tricking naive people into giving you their money?

    * Facebook has a completely different publishing model and technology from newspapers. Why should laws created to regulate 18th century publishing models apply to Facebook?

    * It's difficult to believe, or see why, Facebook should have more disinformation (per user) than TikTok.

    * You have contradictory arguments. It can't both be that Facebook is basically the same as newsletters and that Facebook spreads the misinformation much more quickly to a broader audience that wouldn't have noticed the misinformation before.

    Before, everything was human curated. A human sees a collection of content and knows a collection of humans and selects content to expose to the collection of humans. And the human curators are coarsely aggregating humans into the collections.

    Now we have automated curation. An automata sees all the content and sees all the humans and finely considers each individual piece of content and each individual human to decide which content should be shared with each individual.

    "Here's someone who likes misogyny. Let's see if they also like homophobia. Oh, they do. I wonder if they like white supremacy?"
    "Oh, here's a piece of content that I shared with a few white supremacists who seem to like it. And I have a list of 30 million white supremacists; let me send the content to them."

    1. SamChevre

      The big question these days is whether clear and obvious lies should be considered free speech.

      That was fairly definitively settled as "yes" by NYT vs Sullivan, which held that even factual "news" that was both false and defamatory was perfectly fine so long as it couldn't be proven that the publisher knew it was false. ("Didn't bother to check obvious records" is fine.)

      1. gbyshenk

        Note that a 'lie' is an intentional falsehood, not merely a falsehood.

        In my view, what NYT v Sullivan established is that a 'lie' (intentionally falsifying information to damage a public official) may be actionable, but merely being mistaken is not.

  18. NealB

    Media is not holy writ from some god, whatever anyone thinks that is. In a democracy it needs to be regulated, and I'd think you just basically start with a legal limit on the bandwidth permitted to any single entity, whether Fox News or Facebook, or jabberwacking.com. Do all the information production you want, but up to a limit. I suppose you'd set it like they do health insurance, with inscrutable but ultimately absolute limits that will make you feel screwed over when you hit them, but tough shit. You get a trillion megabits of data a year and that's it. After that you're out. Make it really big so there is the possibility of big players, but prevent monopolies like Facebook and, yes, Fox News for the idiots that don't know how to do a computer. Media should be regulated by the government. This isn't the early 19th century, when it was still hard to disseminate information. The opposite is true now, and getting information out is so easy that the only problem some of them have is scaling it up to the point of absurdity. Limits are a necessary component in a democracy where you want to encourage quality over quantity. The only ones going to complain about that are Facebook and Fox News.

    1. NealB

      Oh, and also, individuals that care may actually disconnect from Facebook, just like most of us, I suppose, choose not to watch the daily news on TV anymore (Fox or otherwise). Just stop going there. Or, if you want to be proactive, end your account with them. One of the worst things about Facebook is that you can't, at any point, just choose to delete your account and have everything you ever posted there immediately deleted and gone forever. Wayback machines will have a record of it, but the provider (Facebook, e.g.) should be required by law, when you choose to delete your account, to delete every last bit of anything you ever had to do with it. I have never been clear on why Facebook got its claws into our data to the extent that we lost all personal control over its existence. Just for starters, there should be a law that prevents providers like Facebook from holding on to information we provided to them if we later decide to discontinue the subscription. Still, it's possible to quit Facebook. I did it about five or six years ago and haven't missed a single bit of it since. We'd all do well to do the same asap. Not as a protest, just as a step in the direction of sanity.

      1. Justin

        "We'd all do well to do the same asap. Not as a protest, just as a step in the direction of sanity."

        This is exactly right, but Mr. Drum won't even bring himself to write that. Why won't Drum quit FB? Because he makes money from it? There is literally nothing FB could do to convince him. They could allow child porn and he wouldn't care. Oh wait... they do that already. And he doesn't care. None of the FB defenders are willing to admit that they enable all this nonsense.

        1. Austin

          I don’t know what FB thinks you’re into or who you’re friends with, but FB algorithms have never served up porn of any kind - legal or illegal - to me. Probably because they do make an effort to get rid of it as soon as it appears… and even if their censors are understaffed, they seem to succeed far more often than fail on that front.

  19. WryCooder

    "First off, my one firm belief: Facebook is a private corporation and has the same First Amendment rights as any newspaper or TV station"

    It seems that it matters that Facebook is viewed as a utility, not a content provider, and is therefore shielded from the libel lawsuits that a newspaper or TV station might face. By granting 1A and DMCA (? I think) protections, Facebook is operating in a zone of non-accountability that traditional 1A publishers have not previously been afforded.

    As long as you're noodling....

  20. bokun59elboku

    The problem is not FB but people. Face it: the vast majority of people have zero rational analytical skills. It really is that simple. And Trump, FOX, the republican party, and FB all take advantage of that fact.

  21. Matt Ball

    I haven't done a deep dive, but it seems that the levels of anxiety, self-harm, and suicide among teens and tweens are clearly higher than ever before. This seems ... concerning. What is different about this generation (which continues to do better on other measures)? Social media seems a fair candidate.

  22. Atticus

    I sometimes feel like there are two different Facebooks. I'd say 80% of what I see is pictures/posts about my friends' and family's kids, vacations, and other fun events they want to share. They sometimes share articles and op-ed pieces, and these sometimes lean more progressive or conservative. But I never see anything that you'd consider radical or extremist on either side. As far as FB advertisements go, I honestly hardly notice them, and most of them are for products and have nothing to do with politics.

    Reading these articles it seems like FB is a cesspool of ideological and political extremism. I'm sure in some corners of FB land it is but for the vast majority of people I think it's still just a place to keep up with family and friends.

    1. iamr4man

      My wife uses Facebook as you do. But we have a family member who is a member of a right wing Christian church whose pastor is into conspiracies and they share lots of looney stuff. I think most families have someone like that in them. Most of my wife’s family are moderate liberals and they just roll their eyes but I could see a person who is a moderate conservative being drawn down the rabbit hole. The fact that this stuff is being invented and amplified by Russia and that the web sites they are being drawn to are being run by foreign troll farms would have been a concern in the past.

      1. Spadesofgrey

        Then again, moderate liberals have their own conspiracy theories about the so-called fake Christians. This brings me back to Bruno Bauer and the Christ Myth, one of the greatest "conspiracy theories" of the modern world.

    2. Justin

      That’s right. There is the FB you use, and then there is FB the terrorist training camp. What would you do if you found out your favorite restaurant was actually a front for drug dealers or gangs? Would you say, “Well, the food is good, so I don’t mind if they sell drugs or murder people”?

  23. royko

    I don't know exactly what should be done about Facebook. But I believe in general that companies should be held responsible for the results of any algorithms they create. If the algorithm is making the "editorial" choice to promote material based (directly or indirectly) on its content, the company could be liable for those choices. It's wrong to say "Well, it's an algorithm, it just does what it does." Algorithms are written in a specific way to achieve specific things for their companies, so they should be considered conscious choices. If an editor who chose to publish the material would face liability, a company whose algorithm promoted the material should face the same liability.

  24. Special Newb

    "This means that I oppose any content-based government regulation of Facebook, just as I oppose content-based regulation of Fox News or Mark Levin, no matter how hideous they are."

    Thankfully the generations after you are smarter than this.

    1. Austin

      I’ll believe cancelling is a threat once I see people whose comments are full of racism, sexism or sheer stupidity are being deleted from sites like these. Until then, it appears that cancel culture isn’t all encompassing and the “victims” of it can find lots of other places to spew their BS.

      1. Special Newb

        Punishing the guy trying to give the climate talk was the first real cancel I've seen in a long while and even then all that got cancelled was 1 lecture.

  25. johngreenberg

    "Facebook is a private corporation and has the same First Amendment rights as any newspaper or TV station—or any other corporation, for that matter. This means that I oppose any content-based government regulation of Facebook, just as I oppose content-based regulation of Fox News or Mark Levin, no matter how hideous they are."

    Unlike any other corporation or any newspaper or TV station, Facebook can't be sued for libel or any other violation related to content but not protected by the 1st Amendment. That's a HUGE difference that you're glossing over.

  26. pjcamp1905

    "I remain vaguely appalled at the lack of serious research into the impact of Facebook and other social media platforms on politics and disinformation. There's just very little out there, and it surprises me."

    Because Facebook won't allow access to the data.

    https://www.engadget.com/facebook-data-transparency-researchers-170021370.html

    https://www.theverge.com/2021/8/4/22609020/facebook-bans-academic-researchers-ad-transparency-misinformation-nyu-ad-observatory-plug-in

    https://digiday.com/marketing/facebook-is-not-a-researchers-friendly-space-say-academics-encountering-roadblocks-to-analyzing-its-2020-election-ad-data/

    https://www.inputmag.com/culture/facebooks-restrictive-research-rules-sent-a-princeton-study-packing

Comments are closed.