• Damarus@feddit.org · 72 points · 12 hours ago

    It never fails to amaze me how disconnected C-level people are from reality.

    • verdi@feddit.org · 13 points · 5 hours ago

      Abstraction layers. They are so detached from everyone else through abstraction layers that we’re nothing more than D2 NPC character sheets to them. That’s why, when a Luigi allegedly breaks through all of those abstraction layers and delivers a leaded reality check to these fucking parasites, they double down on Palantir-like projects to keep themselves safe while making the state even more oppressive and invasive of everyone’s privacy.

    • hcbxzz@lemmy.world · 4 points · edited · 5 hours ago

      Managers love these AI tools because that’s what they’re already doing and are familiar with; the way you talk an AI into doing something for you is not very different from the experience of instructing a mediocre worker.

    • frunch@lemmy.world · 10 points · 10 hours ago

      Right?! At the end of the day, they’re still just people. Gotta eat, gotta sleep, gotta shit. I will never understand how any individual gets so much money/power/attention, because they’re all just goddamn people, and in the event of a catastrophe I imagine they would be about as helpful as any other random human. They aren’t gods, and they certainly don’t deserve the stratification. It’s not like they’re enlightened or something; most of the time they’re just sociopaths who are rich, clever, and/or connected. When you get a glimpse under the hood at moments like this, it really is kinda jarring. It helps to dispel those silly presumptions about them, at least.

    • Onno (VK6FLAB)@lemmy.radio · 11 up, 5 down · 10 hours ago

      What you’re describing is a general experience with LLMs, not something limited to the C-level.

      If an LLM spouts rubbish, you detect it because you have external knowledge; in other words, you’re the subject matter expert.

      What makes you think that those same errors are not happening at the same rate outside your direct personal sphere of knowledge?

      Now consider what this means for the people around you, including the C-level.

      Repeat after me: AI is Assumed Intelligence and should not be considered anything more than autocorrect on steroids.