People are using it as a replacement for another person, as a search engine, or for generating media. These aren’t things you used to use a PC for.
It’s factually not.
Obviously. I’m not saying price increases are a good thing. My point is that higher prices won’t lead to people flocking to LLMs; they’ll keep using what they have even if it’s slow.
Yet for OOP’s theory to be viable, prices need to stay high for longer than it takes current computers to stop being viable.
I don’t question that the AI boom is the cause of price increases. The problem is that even if you alter the theory to be about cloud computing in general, the fact that we still have PCs disproves it.
I didn’t see anything about this being about local LLMs. I’m sorry for the people whose primary PC use case was self-hosted LLMs but who didn’t have the memory to run them…
Maybe this’ll slow down the adoption of self-hosted LLMs, but most people either need a computer for something an LLM can’t do or already use an online LLM for it.
Maybe this’ll slow down the adoption of self-hosted LLMs
That is what the tweet is about…?
The whole point is that LLMs need resources the current average computer doesn’t have, and they are locking the higher-end computers away behind price hikes.
They are making it so online LLMs are the only option.
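For a sense of scale, here’s a rough back-of-the-envelope sketch of the RAM it takes just to hold a local model’s weights. The model size, quantization level, and overhead factor are illustrative assumptions on my part, not figures from the tweet:

```python
# Rough back-of-the-envelope: RAM needed just to hold an LLM's weights.
# All numbers below are illustrative assumptions, not from the thread.

def weights_ram_gb(params_billion: float, bytes_per_param: float,
                   overhead: float = 1.2) -> float:
    """Approximate RAM in GB: parameter count times bytes per parameter,
    plus ~20% overhead for the KV cache, activations, and runtime."""
    return params_billion * bytes_per_param * overhead

# A 70B-parameter model, 4-bit quantized (~0.5 bytes per parameter):
print(f"{weights_ram_gb(70, 0.5):.0f} GB")  # ~42 GB, well above a typical 16 GB PC
# The same model at fp16 (2 bytes per parameter):
print(f"{weights_ram_gb(70, 2.0):.0f} GB")  # ~168 GB
```

Even under generous quantization assumptions, the numbers land far above what an average consumer machine ships with, which is the point being argued here.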
I didn’t see anything about this being about local LLMs
What? He talks about selling computation as a service and specifically mentions ChatGPT. It’s not about Stadia.
They mention “all of your computation as a service” and “ChatGPT to do everything for you”. It seems several other comments in the thread, the person you’re replying to, and I read this as a comment about pushing people towards cloud computing. I don’t think that’s an unreasonable read, especially considering the major hardware price spike is in RAM, not VRAM or graphics cards, which would be more relevant to self-hosted LLMs.
Further, local hosting of LLMs is already well outside the mindset of any regular user, and will likely never be comparable to cloud LLMs. The idea that they are intentionally driving up RAM prices to hundreds of dollars as a direct method of boxing out the self-hosted-LLM Linux nerds who want DRAM-bound models is possibly even more absurd.