Pro@programming.dev to Technology@lemmy.world · English · 2 days ago
CursorAI "unlimited" plan rug pull: Cursor AI silently changed their "unlimited" Pro plan to severely rate-limited without notice, locking users out after 3-7 requests
consumerrights.wiki
fmstrat@lemmy.nowsci.com · 2 days ago
I'm still running Qwen32b-coder on a Mac mini. Works great, a little slow, but fine.
And009@lemmynsfw.com · 1 day ago
I'm somewhat tech savvy. How do I run an LLM locally? Any suggestions? And how do I know my local data is safe?
Retro_unlimited@lemmy.world · 8 hours ago
I have been using a program called GPT4All; you can download many models and run them locally. It prompts you at startup to choose whether to share data. I select no and use it offline anyway.
Llak@lemmy.world · 1 day ago
Check out LM Studio (https://lmstudio.ai/); you can pair it with the Continue extension for VS Code (https://docs.continue.dev/getting-started/overview).