• Lojcs@piefed.social
    1 day ago

    This is a stupid take on so many levels.

    1. There isn’t a single PC use case LLMs can replace.
    2. ‘Computation’ done by an LLM is in no way comparable to a computer.
    3. Existing computers don’t become incapacitated because prices rose.
    4. Price increases aren’t permanent.
    5. Cloud computing isn’t new.
    6. Computers don’t need oodles of RAM to be capable.

    Getting tired of this trend of people cobbling together hot takes from other people’s opinions.

    • Grimy@lemmy.world
      1 day ago
      1. OpenAI alone has 125 million daily users. Clearly, there is a use case.
      2. Well it uses the same hardware, so I guess it is.
      3. If you don’t already have a strong computer, the price rising does incapacitate you. I guess you can also say house prices rising doesn’t incapacitate homeowners.
      4. Prices don’t usually go down as fast as they go up; not sure what the point is in any case.
      5. Cloud computing needing GPUs or a lot of RAM is.
      6. They do to run LLMs, or they need a strong GPU, which is what this is about.
      • Lojcs@piefed.social
        1 day ago
        1. People using it as a replacement for another person, a search engine or for generating media. These aren’t things you used to use a PC for.
        2. It’s factually not.
        3. Obviously. I’m not saying price increases are a good thing. My point is that higher prices won’t lead to people flocking to LLMs, they’ll keep using what they have even if it’s slow.
        4. Yet for OOP’s theory to be viable, prices need to stay up at least until current computers stop being viable.
        5. I don’t question that the AI boom is the cause of price increases. The problem is that even if you alter the theory to be about cloud computing in general, the fact that we still have PCs disproves it.
        6. I didn’t see anything about this being about local LLMs. I’m sorry for the people whose primary PC use case was self-hosted LLMs but didn’t have the memory to run them…

        Maybe this’ll slow down the adoption of self hosted LLMs, but most people either need a computer for something they can’t use an LLM for or they already use an online LLM for it.

        • Grimy@lemmy.world
          1 day ago

          Maybe this’ll slow down the adoption of self hosted LLMs

          That is what the tweet is about…?

          The whole point is that LLMs need resources that the current average computer doesn’t have, and they are locking away the higher-end computers behind price hikes.

          They are making it so online llms are the only option.

          I didn’t see anything about this being about local LLMs

          What? He talks about selling computation as a service and specifically mentions ChatGPT. It’s not about Stadia.

          • Postimo@lemmy.zip
            24 hours ago

            They mention “all of your computation as a service” and “ChatGPT to do everything for you”. It seems several other comments in the thread, the person you’re replying to, and I read this as a comment about pushing people towards cloud computing. I don’t think that’s an unreasonable read, especially considering the major hardware spike is in RAM, not VRAM or graphics cards, which would be more a comment on self-hosted LLMs.

            Further, local hosting of LLMs is already pretty far outside the mindset of any regular user, and will likely never be comparable to cloud LLMs. The idea that they are intentionally driving up RAM prices to hundreds of dollars as a direct method of boxing out the self-hosted-LLM Linux nerds who want DRAM-bound models is possibly more absurd.
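
            For context on why RAM (rather than VRAM) matters for DRAM-bound models, here’s a back-of-envelope sketch. It counts model weights only — KV cache and runtime overhead add more on top — and the per-model figures are illustrative assumptions, not measurements of any specific model:

            ```python
            # Rough memory needed just to hold an LLM's weights in RAM.
            # weights ≈ parameter count × bytes per parameter
            # (KV cache and runtime overhead are ignored here.)

            def weight_memory_gb(params_billions: float, bytes_per_param: float) -> float:
                """Return approximate weight memory in GB."""
                return params_billions * 1e9 * bytes_per_param / 1e9

            for name, params in [("7B", 7), ("13B", 13), ("70B", 70)]:
                fp16 = weight_memory_gb(params, 2.0)   # 16-bit weights
                q4 = weight_memory_gb(params, 0.5)     # 4-bit quantized
                print(f"{name}: ~{fp16:.0f} GB at fp16, ~{q4:.1f} GB at 4-bit")
            ```

            Even a quantized mid-size model wants more free memory than many stock machines ship with, which is why the RAM-price angle gets read as a local-LLM story at all.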

      • NecroticEuphoria@lemmy.ml
        24 hours ago

        On the first point I can say that there are manufactured use cases, and they are based on societal failings in other areas. Companies are just capitalizing on them and offering only a perversion of what we had before.

    • just2look@lemmy.zip
      1 day ago

      Yeah, this seems like a pretty weird take. It doesn’t take massive compute power to run a web browser and use a search engine.

      Nothing that I built my computer for is related to what an LLM would do for me.

    • Railcar8095@lemmy.world
      1 day ago

      The industry is going to move towards stream-only games very soon. Stadia failed because it was a small player that tried too early.

      That’s one of the slippery slopes of always-online single-player. “If you need to be always online anyway, why not just stream the game?”

      • Lojcs@piefed.social
        1 day ago

        Yeah, if only cloud gaming was attempted by a big player like Microsoft, Sony, Google or Microsoft…

        • Railcar8095@lemmy.world
          9 hours ago

          It is, and it doesn’t have the backlash Stadia had. Google is massive, but Stadia was a relatively small product that tried to start already enshittified.

          Amazon is creeping in slowly with Luna, and Sony has many games locked behind streaming only (PS Now). Do you think they want to be the ones paying for the compute out of the goodness of their hearts?

          • Lojcs@piefed.social
            7 hours ago

            I didn’t know PS had streaming-only games. Looking it up, it seems to be specifically PS3 titles, which makes sense to me considering that the PS3’s Cell processor is notoriously hard to emulate.

            Online-only single-player games are able to exist because most people have some sort of connection all the time, and they either don’t know or don’t care that the game is connecting to the internet when they play.

            Cloud gaming is a noticeably worse experience and it has much stricter requirements. I think you underestimate the backlash there would be if the next CoD just downloaded a streaming client.

            I am concerned that a generation of gamers will sink thousands of dollars into streaming services, end up owning nothing and have a worse experience in the process. I don’t know how to prevent that other than giving people financial stability to make smart decisions.

      • bcovertigo@lemmy.world
        1 day ago

        This is absolutely the sentiment of executives, which is why it’s so hilarious to see the ‘friendslop’ genre becoming so popular with $20 games like Peak snatching their profits.

        In my opinion the best possible version of our immediate future is going to look more like this. Execs fire their talent, and the talent memes them to death.

      • Brosplosion@lemmy.zip
        22 hours ago

        Stadia failed because it’s a flawed concept. Latency is a real thing, and running local will always be a better experience.
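
        A rough input-to-photon latency sketch shows why. All stage times below are hypothetical round numbers (real figures vary by network, codec, and display), but the structure of the problem holds: streaming pays every cost local play pays, plus encode, transit, and decode on top:

        ```python
        # Illustrative input-to-photon latency budget, local vs streamed.
        # Stage times are assumed round numbers, not measurements.

        def total_latency_ms(stages: dict) -> float:
            """Sum per-stage latencies into one end-to-end figure (ms)."""
            return sum(stages.values())

        local = {"input": 2, "render": 16.7, "display": 8}          # ~60 fps frame time
        streamed = {**local, "encode": 5, "network_rtt": 30, "decode": 5}

        print(f"local:    ~{total_latency_ms(local):.0f} ms")
        print(f"streamed: ~{total_latency_ms(streamed):.0f} ms")
        ```

        The added stages can roughly double the felt delay, and unlike rendering they can’t be bought away with better client hardware.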

  • ClockworkOtter@lemmy.world
    1 day ago

    What services can LLM-AI providers offer that would otherwise require high RAM usage at home? I feel like people who do home video editing for example aren’t going to be asking ChatGPT to be splicing their footage.

  • CompactFlax@discuss.tchncs.de
    1 day ago

    We will always need some level of edge compute.

    But I was there for the heyday of cloud. The need for edge compute was ignored or dismissed.

    Ironically, the edge compute we ended up with after everything moved to cloud is very memory-intensive. Hmmm.