• brucethemoose@lemmy.world
    11 hours ago

    It’s possible a member of Blackburn’s staff or a supporter went looking for a libelous hallucination in Google’s models.

    Good to see Ars with some common sense here.

    FYI, Gemma 3 is Google’s open-weights release, meant for local running and finetuning. It’s pretty neat (especially the QAT version), but it’s also old and small; there’s no reason anyone would pick it over Gemini 2.5 in Google’s dev web app, except for esoteric dev testing. It’s not fast, it doesn’t know much, and it’s not great with tooling (like web referencing); its literal purpose is squeezing onto desktop PCs or cheap GPUs.
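    Tangent on the QAT mention: quantization-aware training fake-quantizes weights during training so the model still works when stored at ~4 bits, which is what lets a release like this squeeze onto cheap hardware. A toy NumPy sketch of the fake-quant step (illustrative only, not Google’s actual implementation; the 4-bit symmetric scheme here is an assumption):

```python
import numpy as np

def fake_quantize(w, num_bits=4):
    # Simulate low-precision storage: snap weights to 2**num_bits levels,
    # then map back to float. QAT inserts this step into the forward pass
    # during training so the model learns weights that survive quantization.
    qmax = 2 ** (num_bits - 1) - 1            # e.g. 7 for signed 4-bit
    scale = float(np.abs(w).max()) / qmax      # one scale per tensor (toy choice)
    q = np.clip(np.round(w / scale), -qmax - 1, qmax)
    return q * scale

rng = np.random.default_rng(0)
w = rng.normal(size=1000).astype(np.float32)   # stand-in for a weight tensor
w_q = fake_quantize(w, num_bits=4)

# w_q takes at most 16 distinct values, and the per-weight error is
# bounded by half a quantization step.
```

    At inference time only the integer codes and the scale need to be stored, which is roughly a 4x–8x memory saving over fp16/fp32 weights.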

    …Hence this affects basically no one.

    The worst risk is that Google may flinch and neuter future Gemma/Gemini releases over this, lest some other MAGA scream bloody murder over nothing.

    • filister@lemmy.world
      6 hours ago

      The future is very small models trained for a specific domain and able to run on-device.

      Huge foundation models are nice and everything, but they are simply too heavy and expensive to run.