• Fedegenerate@lemmynsfw.com
    12 hours ago

    Tonight, I installed Open WebUI to see what sort of performance I could get out of it.

    My entire homelab is a single N100 mini, so it was a bit of a squeeze to add even Gemma3n:e2b onto it.

    It did something. Free ChatGPT gives better performance, as long as I remember to use placeholder variables. At least for my use case: vibe coding compose.yamls and acting as a rubber duck/level 0 tech support for troubleshooting. But it did something. I'll probably re-test when I upgrade to 32 GB of RAM, then nuke the LXC and wait till I have a beefier host.
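    For anyone curious about the setup, a minimal sketch of the kind of compose.yaml involved, assuming the standard Ollama + Open WebUI container images with their default ports (image tags, port mapping, and volume names here are assumptions, not the commenter's actual config; check each project's docs):

    ```yaml
    # Hedged sketch: Ollama serving models, Open WebUI as the front end.
    services:
      ollama:
        image: ollama/ollama:latest              # assumed official image
        volumes:
          - ollama:/root/.ollama                 # persist pulled models
      open-webui:
        image: ghcr.io/open-webui/open-webui:main  # assumed official image
        ports:
          - "3000:8080"                          # WebUI on host port 3000
        environment:
          - OLLAMA_BASE_URL=http://ollama:11434  # Ollama's default port
        depends_on:
          - ollama

    volumes:
      ollama:
    ```

    After `docker compose up -d`, a small model can then be pulled inside the Ollama container, e.g. `docker compose exec ollama ollama pull gemma3n:e2b`.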

    • self@awful.systems
      9 hours ago

      case in point: you jacked off all night over your local model and still got a disappointing result