• artyom@piefed.social · 9 hours ago

    > Using a standalone LLM for personal use doesn’t seem like an ethical dilemma to me

    What is the ethical dilemma, exactly, and why/how is this different?

    > Getting small amounts of medium-trust information on a subject is a good way to get someone interested enough to read a book, watch a YouTube video, or find a website for more information and validate the AI response.

    Again, how is this different? At least the web-based ones actually link to where the info came from…

    • manualoverride@lemmy.world · 9 hours ago

      We’re talking about home-use AI searches… you said it was unethical, so maybe you should define exactly why you think this?

      Today I wanted to know what the tyre pressures should be for my 2002 Corolla, and AI gave me the answer. I would not have bought a book or gone past the first page of Google for that information.

      The possible ethical dilemma is that I used someone’s research without compensating them, depriving them of potential revenue. In reality, I would never have bought a book on tyre pressures or car maintenance, and it’s unlikely I would ever have visited a site where adverts would have paid the contributors.

      Another dilemma is power consumption, but the model is already made, so that power is already spent, and my tiny LLM query is going to use far less power locally than a web-based search.
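      A rough back-of-envelope sketch of that comparison (every figure below is my own assumption for illustration, not a measurement):

      ```python
      # Back-of-envelope: one local LLM query vs one web-based query.
      # All figures are assumptions for illustration, not measurements.

      LOCAL_POWER_W = 50    # assumed draw of a small home machine under load
      LOCAL_SECONDS = 5     # assumed time for a tiny local model to answer

      CLOUD_QUERY_WH = 0.3  # a commonly cited ballpark for one hosted query
      OVERHEAD_WH = 0.1     # assumed network and server-side overhead

      local_wh = LOCAL_POWER_W * LOCAL_SECONDS / 3600
      cloud_wh = CLOUD_QUERY_WH + OVERHEAD_WH

      print(f"local query: {local_wh:.3f} Wh")  # ~0.069 Wh
      print(f"cloud query: {cloud_wh:.3f} Wh")  # 0.400 Wh
      ```

      Under those assumptions the local query comes out several times cheaper, but the conclusion flips easily if the local machine is a power-hungry GPU box or the model takes minutes to answer.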

      For a company that makes money, or achieves cost savings, from using AI trained on data that some people intended only for human use, I can see how this is not always ethical.

      • artyom@piefed.social · 8 hours ago

        > maybe you should define exactly why you think this?

        It’s very simple: copyright. You’re benefitting from someone else’s work without providing them with any compensation for said work. That doesn’t suddenly change because the compute happens on your personal computer.

        > Today I wanted to know what the tyre pressures should be for my 2002 Corolla, and AI gave me the answer

        If you had actually looked it up, you might have gotten the correct answer, as well as learned that it’s printed on the driver’s door jamb of every car.

        > my tiny LLM query is going to use far less power locally than a web-based search

        Why would you think your local LLM would be any more efficient than a web-based one?

        • manualoverride@lemmy.world · 1 hour ago

          This was exactly my point: when it’s for home use, the chance of my depriving anyone of revenue is negligible.

          If I’m running a home assistant anyway, not having it constantly connected to the web, relaying my audio out for processing and getting the response sent back, will use less power.

          Finally, thanks to the solar panels on my roof, I can guarantee my searches are powered by 100% sunshine.