• brucethemoose@lemmy.world · 2 days ago

    I think the flip side of this is that Facebook, or wherever the link was pushed to your in-laws (which is what I’d guess happened), feels… empowering. Those apps are literally optimized, with billions of dollars (and extensive science, especially psychology), to validate folks’ views in the pursuit of keeping them clicking. Their world’s telling them they’re right; of course your retort will feel offensive and wrong.

    They’re in a trap.

    And I still see a lot of scientists asking ‘why is this happening?’ unironically on Twitter or something, which really frustrates me.

    • psud@aussie.zone · 2 days ago

      Every Facebook profile posted to /r/HermanCainAwards (the subreddit for mocking deluded people who died of COVID while spreading misinformation) was the same. Whatever the formula was, it worked great at sucking in a specific sort of person.

      • brucethemoose@lemmy.world · 1 day ago

        What’s incredible is that Facebook is not liable for that at all.

        What if… I dunno, a giant school peddled that same info? Or some religious figure got a ton of people killed? There’s really not a good metaphor for Facebook, which is why folks don’t really grasp the sheer influence it commands, yet it’s still treated like a garage startup operating a fair forum that needs legal protection.

    • shawn1122@sh.itjust.works · 2 days ago

      Absolutely agree. The “internet” was not a harmful-worldview-reinforcing machine back when we were told not to cite GeoCities in our book reports.

      Asking people to betray their dopamine is a monumental task. It’s like challenging any other addiction.

    • Taalnazi@lemmy.world · 2 days ago

      The best way to change that is to shut down the algorithms that have that bias and mandate media literacy.

      • brucethemoose@lemmy.world · 2 days ago

        That doesn’t work because people like the algorithms, unfortunately. They win the attention war, and Trump is perfectly emblematic of this.

        It’s also not even about ‘political bias’. Toxicity is the natural end state.