Pressing the copilot button to instantly bring up a text box where you can interact with an LLM is amazing UI/UX for productivity. LLMs are by far the best way to retrieve information (that doesn't need to be correct).

If this had been released with agentic features that let it search the web, run tool scripts (fetching the time/date and other data from the OS), use Recall, and properly integrate with the Microsoft app suite, it would have been game changing.

We already have proof that this is a popular feature for users, since it's been integrated into every mobile phone for the past 10 years.

  • Jerkface (any/all)@lemmy.ca
    10 days ago

I’ve spent many, many hours working with LLMs to produce code. Actually, it’s an addictive loop, like pulling a slot machine: you forget what you’re actually trying to accomplish, you just need the code to work. It’s kinda scary. But the deeper you get, the worse the code gets. And eventually you realize the LLM doesn’t know what it’s talking about. Not sometimes, ever.

    • nandeEbisu@lemmy.world
      10 days ago

It has been useful for me with poorly documented libraries, though for nothing more than code snippets or maybe small utilities.

It’s more of an API search engine to me. I find it’s about 80% correct, but it’s easier to search for a specific method and make sure it does what you expect than to scroll through pages of generated class documentation, half of which looks like internal implementation details I won’t need to care about unless I’m really digging in as a power user.

Also, even if the suggested method isn’t correct, or is more convoluted to use than a more direct one, it’s usually in the same module as the correct one.