That’s got to be it. Cloud compute is expensive when you’re not being funded in Azure credits. Once the dust settles from the AI bubble bursting, most of the AI we’ll see will probably be specialized agents running small models locally.
I’m still running Qwen32b-coder on a Mac mini. Works great, a little slow, but fine.
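If you want roughly the same setup without a GUI app, here's a minimal sketch using the llama-cpp-python bindings. The GGUF filename is a placeholder, not a real path; use whatever Qwen coder quantization actually fits your RAM, and `n_gpu_layers=-1` offloads everything to Metal on Apple Silicon:

```python
# Minimal sketch: run a local Qwen coder GGUF via llama-cpp-python.
# The model filename below is a placeholder -- substitute whatever
# quantization you downloaded (e.g. from Hugging Face).
from llama_cpp import Llama

llm = Llama(
    model_path="qwen2.5-coder-32b-instruct-q4_k_m.gguf",  # placeholder path
    n_ctx=4096,        # context window size
    n_gpu_layers=-1,   # offload all layers to the GPU (Metal on Apple Silicon)
)

out = llm(
    "Write a Python function that reverses a string.",
    max_tokens=256,
)
print(out["choices"][0]["text"])
```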
I’m somewhat tech savvy. How do I run an LLM locally? Any suggestions? And how do I know my local data is safe?
I have been using a program called GPT4ALL and you can download many models and run them locally. They give you a prompt at boot if you want to share data or not. I select no and use it offline anyway.
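If you'd rather script it than click through the GUI, GPT4All also ships Python bindings. A minimal sketch, assuming you've already downloaded a model through the app (the filename here is just an example); `allow_download=False` makes it error out rather than fetch anything over the network, which is one way to convince yourself it's staying local:

```python
# Minimal sketch using the gpt4all Python bindings.
# Assumes a GGUF model was already downloaded via the GPT4All app;
# the filename below is an example, not a requirement.
from gpt4all import GPT4All

model = GPT4All(
    "Meta-Llama-3-8B-Instruct.Q4_0.gguf",  # example model file
    allow_download=False,  # fail instead of fetching files over the network
)

with model.chat_session():
    reply = model.generate("Explain what a GGUF file is.", max_tokens=200)
    print(reply)
```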
Check out LM Studio https://lmstudio.ai/ and you can pair it with the Continue extension for VS Code https://docs.continue.dev/getting-started/overview.
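For anyone wondering how the pairing works: LM Studio can expose an OpenAI-compatible server on localhost (port 1234 by default), and tools like Continue just point at that endpoint. A quick sketch with the openai Python client, assuming the server is running with a model loaded (the API key can be any non-empty string; the local server ignores it):

```python
# Minimal sketch: talk to LM Studio's local OpenAI-compatible server.
# Assumes the server is running on its default port with a model loaded.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:1234/v1",
    api_key="lm-studio",  # placeholder; the local server doesn't check it
)

resp = client.chat.completions.create(
    model="local-model",  # LM Studio serves whichever model is loaded
    messages=[{"role": "user", "content": "Say hello from my own machine."}],
)
print(resp.choices[0].message.content)
```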