• TinyTimmyTokyo@awful.systems · 1 year ago

    Random blue check spouts disinformation about “seed oils” on the internet. Same random blue check runs a company selling “safe” alternatives to seed oils. Yud spreads this huckster’s disinformation further. In the process he reveals his autodidactically obtained expertise in biology:

    Are you eating animals, especially non-cows? Pigs and chickens inherit linoleic acid from their feed. (Cows reprocess it more.)

    Yes, Yud, because that’s how it works. People directly “inherit” organic molecules totally unmetabolized from the animals they eat.

    I don’t know why Yud is fat, but armchair sciencing probably isn’t going to fix it.

    • saucerwizard@awful.systemsOP · 1 year ago

      The whole ‘trad clean’ market on twitter is wild. You’d be amazed at the markups on, like, tortilla chips.

      • TinyTimmyTokyo@awful.systems · 1 year ago

        That reminds me. If the world is about to FOOM into a kill-all-humans doomscape, why is he wasting time worrying about seed oils?

        • Sailor Sega Saturn@awful.systems · 1 year ago

          A lot of rationalism is just an intense fear of death. Simulation hypothesis? Means that maybe you can live forever if you’re lucky. Superintelligence? Means that your robot god might grant you immortality someday. Cryogenics? Means that there’s some microscopic chance that even if you pass away you could be revived in the future at some point. Longtermism? Nothing besides maybe someday possibly making me immortal could possibly matter.

          I mean don’t get me wrong I’d give a lot for immortality, but I try to uhh… stay grounded in reality.

            • MaxTheFox@spacey.space · 1 year ago (edited)

              @sailor_sega_saturn @TinyTimmyTokyo @nyrath For someone claiming to be rational (meaning putting reality above superstition), he [Yudkowsky] really did create what is essentially a proto-religion. Hence, Scientology.

              I am literally a devout (if reformist) Christian and I’m less superstitious than that clown shoe and his zombies.

          • Bob Thomson@mastodon.social · 1 year ago

            @sailor_sega_saturn @TinyTimmyTokyo Been thinking and saying this for a while. These powerful billionaire types are terrified of death because it’s so egalitarian: nobody escapes it. No matter how much money and power they accumulate, they can’t get control over this one thing, and it drives them up the wall.

          • LisPi@mastodon.top · 1 year ago

            @sailor_sega_saturn @TinyTimmyTokyo Eh, there’s no guarantee (or any real reason to believe) a simulation would be focused in any way on humanity (no anthropocentrism needed).

            Similarly for superintelligence, few reasons for it to care.

            Cryogenics is a better bet and as you say it’s quite unlikely unfortunately.

              • self@awful.systemsM · 1 year ago

                The anti-TESCREAL conspiracy argues that even relatively cautious people like Bostrom talking about the risks of superintelligence is reactionary since they distract us from algorithmic bias and the electricity use of server farms. While we agree that techno-libertarians tend to be more interested in millennialist and apocalyptic predictions than responding to the problems being created by artificial intelligence today, we also believe that it is legitimate and important to discuss potential catastrophic risks and their mitigation. The anti-TESCREALists dismiss all discussion of AGI, ranging from “believers” to “doomers.”

                none of this looks worthwhile to me

                • Michael Honey@assemblag.es · 1 year ago

                  @self the tl;dr is that lumping everything TESCREAL together as “assholes are into this, therefore it is bad” means that a lot of worthwhile and important ideas, many of which were developed by left thinkers, get lost.

                  (that said, “anti-TESCREAL conspiracy” is I think itself an unfortunate compression)

              • froztbyte@awful.systems · 1 year ago (edited)

                I tried to read this over breakfast, which consisted of a very mellow bowl of jungle oats (no extra flavour) and some semi-terrible filter coffee. and I gotta tell ya, both of those fairly mellow things were better than the entire first quarter of this post

                the author seems to be trying to whiteknight some general idea that maybe some progress isn’t bad and “well obviously there will be some bad associations too”, while willfully excluding the direct and overt bad actions of those associated bad actors?

                admittedly I only got a quarter of the post in (since my oats ran out - scandalous), but up until that point I hadn’t really found anything worthwhile beyond the squirrelly abdication bullshit

                • Michael Honey@assemblag.es · 1 year ago

                  @froztbyte maybe my breakfast (untoasted muesli, coconut yoghurt) started me in a different frame of mind. I read it as showing that a lot of these ideas, which, yes, some jerks (but also plenty of non-jerks) are into, have deeper left histories, and deserve serious consideration.

              • David Gerard@awful.systemsM · 1 year ago

                so the wild bit here is that Hughes previously ranted over transhumanism’s hard-right turn as something Thiel personally did in the late 2000s as he tried to buy his way onto the IEET board

                dude, TESCREAL is talking about precisely those guys

  • carlitoscohones@awful.systems · 1 year ago

    “carving reality at the joints” - does this mean anything?

    Only finer-grained concepts like “linoleic acid” are useful for carving reality at the joints.

  • outer_spec@lemmy.blahaj.zone · 10 months ago

    This systematic review and meta-analysis doesn’t seem to indicate that linoleic acid is unusually bad for all-cause mortality or cardiovascular disease events. And is there another meta-analysis showing the opposite? I kinda just don’t trust those anymore, unless somebody I trust vouches for the meta-analysis.

    “I only trust meta-analyses if the results agree with me”

    • zogwarg@awful.systems · 1 year ago

      Yud himself has been on this particular train for years. In one of his dath ilan fanfic blog-stains, he bemoaned the evils of our society for killing babies with omega-6 nutrition bags.