Pressing the Copilot button to instantly bring up a text box where you can interact with an LLM is amazing UI/UX for productivity. LLMs are by far the best way to retrieve information (that doesn't need to be correct).

If this had been released with agentic features that let it search the web, use tool scripts (like fetching the time/date from the OS), use Recall, and properly integrate with the Microsoft app suite, it would be game-changing.
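The "tool scripts" idea is roughly what function-calling setups do today: the model emits a tool call by name, and a local host runs the script and hands the result back. A minimal sketch of that loop, assuming a hypothetical host-side dispatcher (none of these names are a real Copilot API):

```python
# Hypothetical sketch of a host-side tool dispatcher for an LLM assistant.
# The model would emit a JSON tool call; the host runs it locally and
# returns the result into the conversation.
import json
from datetime import datetime, timezone

def get_current_datetime() -> str:
    """Return the OS clock time as an ISO-8601 UTC string."""
    return datetime.now(timezone.utc).isoformat()

# Registry mapping tool names (as the model would emit them) to functions.
TOOLS = {"get_current_datetime": get_current_datetime}

def dispatch(tool_call_json: str) -> str:
    """Run a tool call of the form {"name": "...", "arguments": {...}}."""
    call = json.loads(tool_call_json)
    fn = TOOLS[call["name"]]
    return fn(**call.get("arguments", {}))

# Example: the model asks for the date, the host executes the tool.
result = dispatch('{"name": "get_current_datetime", "arguments": {}}')
```

The point of the registry is that the model never touches the OS directly; it can only name tools the host has chosen to expose.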

We already have proof that this is a popular feature with users, since it's been integrated into every mobile phone for the past 10 years.

  • FinjaminPoach@lemmy.world · 9 days ago

    or finding sources for info that would take a LOT longer otherwise.

    Maybe. It adds to the list of sources you have to check from, but I've found I still have to manually check whether it's actually on topic rather than only tangentially related to what I'm writing about. But that's fair enough, because otherwise it'd be like cheating, having whole essays written for you.

    It's great for getting detailed references on code.

    I know it's perhaps unreasonable to ask, but if you can share examples/anecdotes of this I'd like to see them, to better understand how people are utilising LLMs.