Pressing the Copilot button to instantly bring up a text box where you can interact with an LLM is amazing UI/UX for productivity. LLMs are by far the best way to retrieve information (that doesn’t need to be correct).

If this had been released with agentic features that let it search the web, run tool scripts (fetching the time/date and other info from the OS), use Recall, and properly integrate with the Microsoft app suite, it would be game-changing.
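
Something like this is already easy to prototype with function calling. Below is a minimal sketch, assuming an OpenAI-style tools API; the model name and wiring are illustrative, not how Copilot actually works:

```python
from datetime import datetime

from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def get_current_time() -> str:
    """Local 'tool' the model can ask us to run: read the OS clock."""
    return datetime.now().isoformat()

# Advertise the tool to the model in the standard function-calling schema.
tools = [{
    "type": "function",
    "function": {
        "name": "get_current_time",
        "description": "Return the current local date and time from the OS.",
        "parameters": {"type": "object", "properties": {}},
    },
}]

messages = [{"role": "user", "content": "What time is it right now?"}]
resp = client.chat.completions.create(
    model="gpt-4o-mini", messages=messages, tools=tools
)

msg = resp.choices[0].message
if msg.tool_calls:
    # The model requested our local tool; run it and send the result back.
    call = msg.tool_calls[0]
    messages.append(msg)
    messages.append({
        "role": "tool",
        "tool_call_id": call.id,
        "content": get_current_time(),
    })
    final = client.chat.completions.create(model="gpt-4o-mini", messages=messages)
    print(final.choices[0].message.content)
```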

We already have proof that this is a popular feature for users, since it’s been integrated into every mobile phone for the past 10 years.

  • Auth@lemmy.worldOP · 10 days ago

    Unless you think people always come away from Google with the right answer, I don’t see the 1:1.

    If you NEED the right answer, you should go to a trusted source, same as if you’re using Google. If you’re just looking for an answer, then blogspam articles, Reddit, or AI will usually all be good enough to return something satisfying. AI is just a faster way of searching a question on Google and clicking the top result.

    • CarrotsHaveEars@lemmy.ml · 8 days ago

      a faster way of searching a question on Google and clicking the top result.

      No, it isn’t. The “I’m Feeling Lucky” button is.

      • Auth@lemmy.worldOP · 8 days ago

        No, it’s not. First, 99% of people have no idea what that button is.

        Second, opening a web browser, going to Google, typing in your question, pressing ‘I’m Feeling Lucky’, and then scanning through the webpage is way slower than hitting the Copilot button, typing your question, and getting a quick, direct answer.

        • CarrotsHaveEars@lemmy.ml · 8 days ago

          Then write yourself a desktop plugin, an icon, an input box, anything, to take you to the first Google search result. What the fuck does this have to do with an LLM? How does that justify using gallons of water, gigawatts of electricity, and petabytes of stolen training data?
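
          For instance, a rough sketch in Python (the btnI parameter has historically triggered Google’s “I’m Feeling Lucky” redirect, though Google may now interpose a redirect-notice page first):

          ```python
          # Lucky launcher: jump straight to the first Google result.
          # btnI is the historical "I'm Feeling Lucky" parameter; treat
          # this as illustrative, not a stable API.
          import sys
          import urllib.parse
          import webbrowser

          query = " ".join(sys.argv[1:]) or "example question"
          url = "https://www.google.com/search?" + urllib.parse.urlencode(
              {"q": query, "btnI": "1"}
          )
          webbrowser.open(url)  # opens in the default browser
          ```

          Bind that to a hotkey and you have the instant answer box with no model in the loop.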

          • Auth@lemmy.worldOP · 8 days ago

            OK, so your main complaint is that it’s too energy-intensive? Would you concede that an OS assistant is a good feature if the per-query computation cost were lower? Because I’d argue it already is, and the cost of an LLM query isn’t unreasonable. The large power costs come from model training; the per-query cost is negligible.
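
            Rough numbers to make that concrete; all three figures are ballpark assumptions (commonly cited estimates, not measurements), just to show the shape of the argument:

            ```python
            # Back-of-envelope: per-query inference energy vs. one-off
            # training energy amortized over a year of queries. All three
            # inputs are assumed ballpark values, not measurements.
            WH_PER_QUERY = 0.3       # assumed inference energy per query (Wh)
            TRAINING_WH = 50e9       # assumed one-off training energy: 50 GWh
            QUERIES_PER_DAY = 1e9    # assumed global query volume

            amortized = TRAINING_WH / (QUERIES_PER_DAY * 365)
            print(f"inference per query : {WH_PER_QUERY:.2f} Wh")
            print(f"training, amortized : {amortized:.2f} Wh/query after a year")
            # With these assumptions, training amortizes to ~0.14 Wh per
            # query within a year, the same order as the inference cost.
            ```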

            Also, I won’t make an argument about copyright on training data, because I don’t respect copyright.