• Diplomjodler@lemmy.world · 7 days ago

    They’re all aware that the chance of success is minimal. However, the one that actually succeeds will achieve an absolutely dominant position both economically and politically. So they’re basically all gambling their companies on being the one that achieves AGI first while being well aware that failure is the most likely outcome. And at this point they can’t pull out because the consequences would be devastating.

    • Ech@lemmy.ca · 8 days ago

      Going all in on a tech fundamentally incapable of achieving AGI is a bit dumb, to put it mildly.

      • Diplomjodler@lemmy.world · 7 days ago

        We don’t know that yet. While the technology in its current state won’t cut it, there could always be some new breakthrough that moves the field forward.

        • Don Piano@feddit.org · 5 days ago

          We don’t know it yet in the same way that we don’t know yet whether pressing the right order of buttons on a soda vending machine will make it sentient.

        • Ech@lemmy.ca · 7 days ago

          There is no breakthrough that will make an LLM sentient, and dumping the world’s supply of RAM into it 1) doesn’t amount to a “breakthrough”, and 2) will never turn LLMs into something they’re not.

    • very_well_lost@lemmy.world · 7 days ago

      What you’re saying makes sense when we’re talking about Google or OpenAI or something, but this was a Japanese font company. I really doubt they were chasing AGI — more likely they were using generative AI to make slop fonts and failing miserably.