Pressing the Copilot button to instantly bring up a text box where you can interact with an LLM is amazing UI/UX for productivity. LLMs are by far the best way to retrieve information (that doesn't need to be correct).

If this had been released with agentic features that let it search the web, run tool scripts (fetching the time/date and other things from the OS), use Recall, and properly integrate with the Microsoft app suite, it would be game-changing.

We already have proof that this is a popular feature for users, since it's been built into every mobile phone for the past 10 years.

  • Auth@lemmy.world (OP) · 9 days ago

    So you agree that pressing a button to bring up a box you can query with natural language is a good feature; you just think the LLM part is slow and computationally inefficient? I could agree with that if something better were proposed. I just see an LLM as a good fit for this because of how dynamic it is, and with the addition of tools that handle specific tasks in a deterministic fashion it becomes a powerful tool for users.
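
    To picture the "deterministic tools" part: the model doesn't compute the date itself, it just asks the OS. Here's a minimal sketch of the idea; the tool names and dispatch format are hypothetical, not any vendor's actual API:

    ```python
    # Sketch: an assistant exposing a deterministic OS-level tool (current date/time)
    # alongside a free-form LLM. Tool names and call format are made up for illustration.
    from datetime import datetime, timezone
    import json

    def get_current_datetime() -> str:
        """Deterministic tool: read the clock from the OS, no model guessing."""
        return datetime.now(timezone.utc).isoformat()

    # Registry of tools the model is allowed to call by name.
    TOOLS = {
        "get_current_datetime": get_current_datetime,
    }

    def dispatch(tool_call_json: str) -> str:
        """Route a model-emitted tool call (JSON with 'name' and 'args') to real code."""
        call = json.loads(tool_call_json)
        fn = TOOLS[call["name"]]
        return fn(**call.get("args", {}))

    if __name__ == "__main__":
        # Pretend the LLM decided it needs the current time and emitted this call.
        print(dispatch('{"name": "get_current_datetime", "args": {}}'))
    ```

    The LLM handles the fuzzy natural-language part, but anything that has a single correct answer gets routed to real code like this instead of being hallucinated.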