A random thought about Facebook

I have to leave for lunch, but here's something to discuss while I'm gone.

I think everyone agrees that Facebook can accelerate the spread of disinformation even if it's not the original source. But there's a flip side to this, and it has two elements. First, by making crazy disinformation public very quickly, it allows the pushback to start more quickly. Second, it can help accelerate the pushback, just as it accelerates the initial disinformation.

Thoughts?

55 thoughts on “A random thought about Facebook”

  1. Vog46

    Harumph indeed.

    Lunch? Did he jump in his non-turbocharged Porsche and spew gas vapors into the air?

    No Hilbert and Hopper pics?
    My my. We may not hear from Kevin until the scratches heal up now............

  2. pflash

    All true. And at the end of the day, the argument was louder, more vicious, more ubiquitous, and it pulled in more uninformed participants than it would have, absent FB.

  3. Steve_OH

    Nope. Information is like order, disinformation is like disorder. Pushback against disinformation is like trying to piece something back together after it's been blown up. It's always much, much harder to go in that direction, so the acceleration is always going to be a net loss for order.

    1. delveg

      I clicked into comments to say this. I think Kevin's even published research noting that rebuttals and debunkings can wind up reinforcing the original message -- the brain treats it as "it came up again, so it must be important" rather than destroying the incorrect belief. (It's often called the backfire effect.)

      1. delveg

        Though evidently the effect isn't as bad as I remembered -- or there's been new research stating that the backfire effect isn't that bad, unless it becomes a component of identity.
        "To the contrary, an emerging research consensus finds that corrective information is typically at least somewhat effective at increasing belief accuracy when received by respondents. However, the research that I review suggests that the accuracy-increasing effects of corrective information like fact checks often do not last or accumulate; instead, they frequently seem to decay or be overwhelmed by cues from elites and the media promoting more congenial but less accurate claims."
        From https://www.pnas.org/content/118/15/e1912440117

    2. peterlorre

      There's a higher-order effect too, beyond determining whether, in the limit, Facebook users eventually come to understand true facts. A big part of the disinformation strategy is about dominating the discourse to make sure everyone is gabbing back and forth about ridiculous crap and not actually scrutinizing anything (e.g., "flood the zone with shit," as Bannon says).

      Even if you propose a model where disinformation and pushback both happen more efficiently, you still have a "media" ecosystem that is fundamentally uninformative.

    3. Doctor Jay

      I really, really like this. The aim of Information Warfare is not to promote a falsehood, it's to erase any consensus of what is true, which puts people into a "trust your feelings" state. This in turn creates fragmentation in the social structure. It's inherently destructive, not constructive.

      Construction of a shared consensus is slow and difficult, even though it is quite valuable. Destruction of the shared consensus is much easier.

    4. HokieAnnie

      You are always so right, Steve. It's the old saying that a lie gets around the world while the truth is still putting on its shoes.

  4. Yikes

    I never quite see the first principle addressed, other than indirectly.

    FB is part of an entire system where information is now available instantly, for free, when such information used to be very hard to get, almost impossible.

    For example, my wife misplaced an Apple earbud this morning, and I was able to Google "find lost earbud" on my phone; a three-minute video by some dude popped up with instructions on how to find it. Total research time: 15 seconds.

    Prior to the internet, and in this case YouTube, who knows how long the search could have gone on.

    So this is the thing: the amount of free, quality information has never, in human history, been more accessible.

    How is this paid for? Via ads attached to the free pages or videos.

    The content is provided for free by all of us. Most of us get no compensation at all, though some youtubers or instagrammers do. And finally, with regard to FB, although the service is free, they monetize it via ads like anything else.

    The fundamental problem is none of this magic is set up to take any sort of editorial look at what anyone posts. FB is not the NY Times or NBC or even Fox.

    If FB, or fill in the blank of any social media platform, had to screen stuff editorially there is no way it could exist for free, and its entire existence is based on it being free.

    These Congressional hearings proceed as if FB were the NY Times, and it's not.

    Maybe we just all need to start paying for stuff so those basic fees can be used to pay for editorial screening.

    Because make no mistake, we are asking FB to be more of an editor.

    1. shadow

      Facebook uses algorithms to decide what to show you. That is being an editor! Just because a human doesn't perform the activity manually doesn't mean a process set up by a human is not filtering what is presented to people. That's why Facebook gets so much more attention than AWS or Cloudflare.

      We have rules for if you're a communication platform and rules for if you're an editorial source of news. We shouldn't allow Facebook to be both and have neither rules apply.

      1. Yikes

        I agree and get that.

        My point was that every time I see this, it's not really a question of the algorithm; it's a question of the content.

        Basically, the distinction between giving a bunch of cowboys from Texas endless ads for personalized boots or chaps or something vs. giving them endless posts from Tucker Carlson, which they love, and endless posts from some anti-vax scammer.

        My understanding is the question is not the algorithm; it's essentially rooting out and banning the anti-vax fake stuff from the above list.

        1. shadow

          I'm not sure how you can have that view of the issue. The algorithm is the second point of any serious article criticizing Facebook, Twitter, or YouTube. The article starts with "why are X people seeing so much fascist/racist/radical content on Y," and the answer is inevitably that the algorithm sent the user down a hole.

          Do people complain about the content? Yes, of course. But that's because these platforms in particular aim to cater to everyone while also applying some level of editorial control. There will always be reason for someone from the Left or the Right to complain about a company catering to the whims of both. That is a self-inflicted problem for these companies, though, and not what makes them bad for society.

          1. Yikes

            But the answer then is easy. No algorithms.

            You get cat videos from your friends, or an entity you add to your list. Done.

            FB will fight that to the death, though, because what the algorithms are designed to do is make you spend more time on FB by giving you more content you like.

            It just seems totally naive to me that you actually need a congressional hearing to figure out whether FB, or any social media platform, runs algorithms to give users more of what they already like.

            There are two solutions, pick one:

            No f-ing algorithms.

            or,

            Figure out a way to edit content. Presumably there is already some porn-screening feature on FB.

            I just find this odd kabuki theater of "how did X get on your platform, sir?" when X is created by someone else, rather naive.

            I mean, a "whistleblower?" Who with a brain in their head doesn't know that this is how social media platforms operate?

            It's like a "whistleblower" announcing that the US armed forces are trained to kill people.

            If FB picks the adjust-the-algorithm option, my algorithm would be this: (a) figure out if the content is in any way advocating anything, and (b) if it is, you have to pay an advertising fee to run it. Which then puts the onus on the social media site to look at it. (Rough sketch after the examples below.)

            Sierra Club, ad. You should work on climate change, ad. We should keep those Confederate statues, ad. Vote for Homer Simpson, ad. Mask mandates are bogus, ad.

            My solution would make FB fairly boring, but I barely use it to watch even the cat videos.
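
            To make that concrete, here's the rule as a toy Python sketch. The is_advocacy() classifier is entirely hypothetical -- deciding whether content "advocates anything" is of course the hard part:

            ```python
            # Toy sketch of the proposed pay-to-advocate rule, not anything
            # FB actually runs. is_advocacy() is a hypothetical classifier --
            # deciding whether content "advocates anything" is the hard part.

            def is_advocacy(post_text: str) -> bool:
                """Hypothetical: does this content advocate a position?"""
                raise NotImplementedError("the genuinely hard part")

            def handle_post(post_text: str, ad_fee_paid: bool) -> str:
                if not is_advocacy(post_text):
                    return "publish"        # cat videos flow through freely
                if ad_fee_paid:
                    return "publish as ad"  # advocacy runs, but as paid advertising
                return "hold for review"    # onus is on the platform to look at it
            ```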

            1. jeff-fisher

              Well, there has to be some algorithm.

              But it is possible that a solution would be to regulate the algorithm and make it very simple, which I expect is your meaning.

              Facebook guy and so forth will spend billions to stop that, though, and the Republican Party will not let it happen either.

              1. Dee Znutz

                There does not have to be an algorithm.

                Twitter has two options: “top tweets,” aka the algorithm, and “show tweets as they appear,” aka you follow someone and their tweet shows up in your feed at the time it is made.

                That second one has zero algorithm. You agree to follow someone, and then you see the posts of everyone you follow in the order they make them.
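
                In code, the difference between the two modes is small but decisive. A minimal sketch (Python, with made-up field names, nothing like Twitter's real internals):

                ```python
                from datetime import datetime
                from typing import NamedTuple

                class Post(NamedTuple):
                    author: str
                    created_at: datetime
                    predicted_engagement: float  # what the platform guesses you'll engage with

                def chronological_feed(posts, following):
                    """'Show tweets as they appear': no ranking model at all --
                    just the people you chose to follow, newest first."""
                    return sorted(
                        (p for p in posts if p.author in following),
                        key=lambda p: p.created_at,
                        reverse=True,
                    )

                def top_tweets_feed(posts):
                    """'Top tweets': the platform reorders everything by predicted
                    engagement -- this reordering is where the editing happens."""
                    return sorted(posts, key=lambda p: p.predicted_engagement, reverse=True)
                ```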

      2. rick_jones

        Facebook uses algorithms to decide what to show you. That is being an editor!

        I would argue that is being a clippings service. It is deciding what among that which has already been published by someone (else) to show you, not deciding what to publish in the first place.

    2. rick_jones

      The fundamental problem is none of this magic is set up to take any sort of editorial look at what anyone posts. FB is not the NY Times or NBC or even Fox.

      The democratization of mass communication...

    3. dio

      I think that the mistake here is to view Facebook "content" as speech. Fundamentally, w.r.t. Facebook the company, it is not really speech - it is the *product* that allows them to make money, through the magic of "engagement" which is their KPI for their ability to sell ads. FB has found a way to deliberately and aggressively push content that maximizes engagement to a certain extremely profitable segment - the fact that this content is unambiguously harmful to society is regrettable to them, probably, but they will fight to the death to push it anyway because to do otherwise would decrease their revenues.

      We wouldn't accept that it's OK for a paper company to contaminate the groundwater with their manufacturing byproducts, just because to do so would cut into their profits, right? Do we really have to accept basically the same behavior with FB, just because their byproducts are toxic speech, and not toxic chemicals?

      1. MontyTheClipArtMongoose

        It's not regrettable to them at all.

        Zuckerberg is a high functioning sociopath, what Patrick Bateman would have been in the real world.

    4. MontyTheClipArtMongoose

      Shawn Fanning & the launch of Napster led everyone to believe that nothing had a price.

      Twenty years on, but nineteen years & 364 days too late, we are realizing Lars Ulrich was right, & the anarcho-libertarians of Silicon Valley were very wrong.

    5. Jasper_in_Boston

      The fundamental problem is none of this magic is set up to take any sort of editorial look at what anyone posts. FB is not the NY Times or NBC or even Fox.

      Well, that "fundamental problem" is enabled by US law—ie section 230—which removes any incentive for social media platforms to vet user-generated content for harmfulness.

      I've been wary of calls to jettison Section 230 altogether. On balance I still think doing so would cause a lot of problems. But I'm increasingly thinking the obvious compromise is to remove this liability shield for user-generated content that is being algorithmically promoted. In other words, platforms would still enjoy a liability shield for user-generated content until, that is, they use algorithms to recommend it (because the algorithms turbo-charge virality).

      There are a lot of people in this world who inevitably are going to post harmful content online. That can never be stopped. But if we could radically reduce the frequency with which this stuff goes viral, I think we'd be a lot better off as a society.
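
      As a crude toy formalization of that compromise (my own sketch, not anything in the actual statute):

      ```python
      # Toy formalization of the proposed Section 230 compromise, purely
      # illustrative: the liability shield holds for user-generated content
      # a platform merely hosts, but drops once the platform's own
      # recommendation algorithms promote that content.

      def platform_is_liable(user_generated: bool, algorithmically_promoted: bool) -> bool:
          if not user_generated:
              return True  # the platform's own content: ordinary publisher liability
          return algorithmically_promoted  # shield covers only un-promoted hosting
      ```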

  5. clawback

    Kinda like the theory that Trump being elected was good because it would speed up the dictatorship of the proletariat or some such. Didn't expect to find literal accelerationist theory on this blog, but here we are.

    But to answer your question directly: no, rapid dissemination of fascist propaganda is bad.

    1. HokieAnnie

      Agree 100 percent, clawback. It's the firehose of disinfo that is bad; there is no both-sides pushback to counteract the troll farms cranking out the crap.

  6. cld

    It's hard not to think the pushback itself encourages acceptance of the error among low-information people.

    Because they vaguely heard something about it, they think there's something in there; and these personalities will almost always think wit is the reverse of what people think it is, so swallowing insane horseshit makes them feel like they're clever.

  7. shadow

    Pushback on large, open platforms does not work. People do not accept criticism from mobs or random strangers; they dig in their heels. There has been loads of research on how trying to fact-check misinformation often hurts (i.e., makes people aware of/believe the lies more).

    In the same way a broadcast company has some legal and moral liability for the content it hosts, tech companies should be held liable for the recommendations of their algorithms. Right now they get to enjoy the benefits of being both the hosting medium and the editor, while claiming to have the responsibilities of neither.

    Any technology platform should be held to increasingly tough inspection and liability the larger/faster information can spread on it. A Discord server with 1K people is a lot less dangerous than a server with 100K, and 1M, and so forth. We need to adjust for how humans function socially at these scales. Facebook is essentially a single server with billions of users and should be considered capable of extreme damage to any society it touches.

    Humans are not designed for social interaction in big groups, and big groups are highly susceptible to manipulation or accidental mob behavior. If we recognize that huge groups cause greater problems because misinformation or mob behavior emerge quicker and with greater intensity, the solution is to apply greater oversight the larger a group becomes. We want to discourage, but not ban, huge online groups from forming.
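
    To sketch what "oversight scales with reach" might look like in practice (the tiers and thresholds here are invented purely for illustration):

    ```python
    # Invented tiers and thresholds, purely illustrative: the burden of
    # inspection grows with how far and fast content can spread.
    OVERSIGHT_TIERS = [
        (1_000,        "minimal: ordinary moderation tools"),
        (100_000,      "basic: transparency reports, complaint handling"),
        (1_000_000,    "strict: external audits of recommendation systems"),
        (float("inf"), "maximal: liability for algorithmically amplified content"),
    ]

    def required_oversight(audience_size: int) -> str:
        """Map a group's reach to a (hypothetical) oversight regime."""
        for threshold, regime in OVERSIGHT_TIERS:
            if audience_size <= threshold:
                return regime
    ```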

    1. jeff-fisher

      Even more, the blocking and ghosting tools Facebook has implemented are just the thing to protect a bubble and make people fear being cast out if they push back.

  8. arghasnarg

    This ignores that, at some point, acceleration of a given process creates a difference in kind.

    Look at how the gradual speed increase in market clearing eventually created an entire industry of front-running robots in the stock market, leading non-robots to change behavior.

    I suspect it also is different mentally, for consumers of conspiracies. Getting a small drip from your crazy uncle or over beers on Friday is very different than mainlining it on your phone all day.

  9. Doctor Jay

    The biggest problem with pushback at disinformation on Facebook is finding out it even exists.

    Facebook is the ideal platform for a "whispering campaign." Remember the story about how W beat John McCain in the SC primary? The story described a whispering campaign, conducted via email, spreading disinformation about McCain's adopted daughter.

    Facebook puts this sort of thing on steroids. Because if you aren't the sort of person that is susceptible to the lie, you're likely never to see it, and not know it exists.

    1. MontyTheClipArtMongoose

      Just imagine if the "Bridget McCain is John McCain's mulatto lovechild with a Low Country prostitute" story had been able to be amplified as much as Pizzagate. We might have had a presidential campaign trail assassination, John McCain serving as Robert Kennedy to some screwball's Sirhan Sirhan.

  10. Justin

    Pushback or simply crazy people arguing with other crazy people. Decent people don't use facebook. Or comment on blogs!

  11. KinersKorner

    My random thought is our side's pushback sucks. Someone leaks the alleged settlement details of ACLU vs. US and it's suddenly "Biden is insane, give illegals a million dollars." Not one ounce of pushback as loony Senator after Senator bitches on Faux. This goes on and on. Lousy pushback.

  12. Dee Znutz

    This is the GOP’s tactic for dealing with Covid applied to disinformation.

    I don’t believe any further explanation for why this is a bad way to look at it is required.

  13. Vog46

    *************OFF TOPIC************COVID***********

    Since MOST people don't go past the front page I'll drop this here
    https://www.cnn.com/2021/10/29/health/covid-vaccine-protects-better-previous-infection/index.html

    {snip}
    Vaccination protects people against coronavirus infection much better than previous infection does, a team of researchers led by the US Centers for Disease Control and Prevention reported Friday.

    They said their findings should help settle debates over whether people who have been infected should bother getting vaccinated. They should, the researchers said.
    People who had not been vaccinated and who ended up in the hospital were five times more likely to have Covid-19 than people who had been vaccinated within the past three to six months, they found.
    "All eligible persons should be vaccinated against COVID-19 as soon as possible, including unvaccinated persons previously infected with SARS-CoV-2," the researchers wrote in the CDC's weekly report, the MMWR.
    {snip}

    Interesting..........but in a slam against previous administrations, which may not have tested enough, they did add a caveat:

    {snip}
    They also note that while the study was designed to compare two groups with two different types of immunity-- immunity from natural infection versus immunity from vaccination -- it's possible there were some mixups. Plus, they only included hospitalized patients in the study, so the findings may not apply to everyone.
    {snip}

    Now do I trust our CDC over the UK and Israeli scientists? Nope, but this study was done in hospitals and presented fairly current information - tests were run from January to September, and it compared only two of the vaccines, Moderna and Pfizer, against natural immunity.

    I hope the natural immunity folks come around, and get vaccinated.

    Oh and I know the world has ended today because I read that one of the Kardashians and her kid both tested positive as breakthrough cases.
    Don't know which one (and really don't care), but saints preserve us - this should make huge headlines with the QAnon crowd.

      1. Vog46

        Cld-
        Isn't that something?
        And the island nation of Tonga had their FIRST case of Covid-19 - a guy from New Zealand. He was vaccinated and tested NEGATIVE when he departed NZ. Tested positive upon arrival.

          1. Vog46

            Regarding the report from the CDC:
            "The federal government on Friday quietly released two reports that together destroy many right wing conspiracy theories and talking points on the coronavirus and the COVID-19 vaccine.
            Contrary to the false claims from right wing extremists that "natural immunity" is more powerful and "better" than the coronavirus vaccines, the CDC released a report finding those who are unvaccinated and contracted COVID-19 are five times more likely to be re-infected than those who are fully vaccinated.

            That report, CBS News adds, shows that "vaccine-induced immunity was more protective than infection-induced immunity."

            "We now have additional evidence that reaffirms the importance of COVID-19 vaccines, even if you have had prior infection. This study adds more to the body of knowledge demonstrating the protection of vaccines against severe disease from COVID-19," said CDC Director Dr. Rochelle Walensky.
            Epidemiologist Dr. Eric Feigl-Ding weighs in with results from the study showing for the elderly "natural immunity" is even worse than for younger patients.
            Among elderly, natural immunity is almost 20x weaker against reinfection than vaccines!! But even among 18-64, natural immunity is still 2.57x weaker protection than vaccines!!
            Meanwhile, the Director of National Intelligence also on Friday released a report that finds the novel coronavirus that causes COVID-19 "was not developed as a biological weapon."

            Further destroying right wing claims, the DNI's report adds that the coronavirus, known as SARS-CoV-2, "probably was not genetically engineered," and says that "two agencies believe there was not sufficient evidence to make an assessment either way."
            *****************************************************
            This seems pretty convincing
            Anyone who still believes that natural immunity is better than vaccines needs to follow the science..........
            What startled me was the effectiveness in the older population. The Israeli study was conducted using healthcare workers who were NOT immuno-compromised, whereas the CDC used people who were "normal".

  14. thefxc1616

    BECAUSE FACEBOOK DOES NOT ACCELERATE THE PUSHBACK. It's designed to sustain engagement, and disinfo is demonstrably more engaging than a correction. And in any case once the disinfo is out there it's harder to walk back.

    Kevin, if you took a Media Studies 101 course you'd understand why Facebook is a unique problem. Your defences of it are increasingly befuddling.

  15. cephalopod

    But does the pushback on the misinformation actually reach the same people who initially received the misinformation?

    Right now it seems like the pattern is this:
    1. Bonkers post makes its way around the Facebook feeds of right-wing consumers of other bonkers misinformation.
    2. Some intrepid fact-checking lurker in the misinformation sphere sees the new misinformation and investigates it.
    3. Rebuttal of misinformation is posted to a news source that is viewed almost exclusively by liberals.
    4. Liberals learn of misinformation simultaneously with the rebuttal. Liberals laugh or scowl, depending on the severity of the stupidity that is outed.
    5. Conservatives continue to live in bubble of misinformation.

    I may be consuming lots of fact-checking articles that directly refute Trumpy and QAnon nonsense, but I can guarantee you that my right-wing father is seeing almost none of that. His social media world is nothing but the right-wing nonsense, with almost no refutation ever making it into his feed.

  16. quakerinabasement

    If the algorithms send the misinformation and the correction to different audiences, the people who need to see the correction never do.

  17. D_Ohrk_E1

    Why did FB rename itself Meta? Because the fresh shit that's served up and amplified on its networks is how meta works. Old shit, by definition, isn't meta. But critically, the fresh shit is amplified so loudly that the push back is slow and ineffective.

    Once again, your de facto defense of FB shall be engraved onto stainless steel plates, framed in Rococo wood replete with Elephant Ivory, robustly covered in gold leaf.

Comments are closed.