• artyom@piefed.social
    17 hours ago

    maybe you should define exactly why you think this?

It’s very simple: copyright. You’re benefitting from someone else’s work without providing them any compensation for said work. That doesn’t suddenly change because the compute happens on your personal computer.

    Today I wanted to know what the tyre pressures should be for my 2002 Corolla and AI gave me the answer

If you had actually looked it up, you would have gotten the correct answer, and learned that it’s printed on the driver’s door jamb of every car.

    my tiny LLM query is going to use far less power locally than a web based search

    Why would you think your local LLM would be any more efficient than a web-based one?

    • manualoverride@lemmy.world
      10 hours ago

This was exactly my point: when it’s for home use, the chance of my depriving anyone of revenue is negligible.

If I’m running a home assistant anyway, not having that assistant constantly connected to the web, relaying my audio for remote processing and waiting for the response, will use less power.

Finally, thanks to the solar panels on my roof, I can guarantee my searches are powered by 100% sunshine.