• WorldsDumbestMan@lemmy.today
      · 4 days ago

      I run it offline on hardware I already have for other purposes, so the data stays with me. I even have a portable solar panel (though I usually just use the wall socket).

      • WamGams@lemmy.ca
        · 4 days ago

        Doesn’t AI need like 96 GB of RAM to be comparable in quality (or lack thereof, depending on how you view it) to the commercial options?

        • Xylight@feddit.online
          · 4 days ago

          Qwen3 30B A3B, for example, is brilliant for its size, and I can run it on my 8 GB VRAM + 32 GB RAM system at around 20 tokens per second. For lower-powered systems, Qwen3 4B plus a search tool is also insanely good for its size, and it fits in less than 3 GB of RAM or VRAM at Q5 quantization.
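A rough back-of-the-envelope check on why those sizes work out (a sketch, assuming Q5-style quantization averages roughly 5.5 bits per weight, which is an approximation and ignores context/KV-cache overhead):

```python
def quant_size_gb(params_billions: float, bits_per_weight: float) -> float:
    """Approximate quantized model size in GB: parameters x bits per weight / 8."""
    return params_billions * bits_per_weight / 8

# A 4B-parameter model at ~5.5 bits/weight comes out to about 2.75 GB,
# consistent with "fits in less than 3 GB" above.
print(quant_size_gb(4, 5.5))

# A 30B-parameter model at the same quantization is roughly 20-21 GB,
# hence splitting it across 8 GB of VRAM plus system RAM.
print(quant_size_gb(30, 5.5))
```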

        • WorldsDumbestMan@lemmy.today
          · 3 days ago (edited)

          Qwen 8B, for example. It’s limited and can barely “reason” at all, but it does give well-structured answers. You can extract references from it, up to a point.