Pressing the Copilot button to instantly bring up a text box where you can interact with an LLM is amazing UI/UX for productivity. LLMs are by far the best way to retrieve information (that doesn't need to be correct).

If this had been released with agentic features that let it search the web, use tool scripts (like fetching the time/date and other info from the OS), use Recall, and properly integrate with the Microsoft app suite, it would be game changing.
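Purely as an illustration of what "tool scripts" could mean here (a made-up sketch, not anything Copilot or Microsoft actually expose): an agentic assistant would answer things like "what time is it?" by dispatching to small OS-backed functions instead of guessing. The dispatch table and function names below are hypothetical.

```python
from datetime import datetime
import platform

# Hypothetical "tool scripts" the assistant could call instead of guessing.
def get_current_datetime() -> str:
    """Return the local date and time from the OS clock."""
    return datetime.now().isoformat(timespec="seconds")

def get_os_info() -> str:
    """Return the basic OS name and version."""
    return f"{platform.system()} {platform.release()}"

# A toy dispatch table standing in for a real tool-calling protocol.
TOOLS = {
    "get_current_datetime": get_current_datetime,
    "get_os_info": get_os_info,
}

def handle_tool_call(name: str) -> str:
    """Run the tool the model asked for, or report that it doesn't exist."""
    tool = TOOLS.get(name)
    return tool() if tool else f"unknown tool: {name}"

if __name__ == "__main__":
    # e.g. the model requests "get_current_datetime" and gets real data back
    print(handle_tool_call("get_current_datetime"))
    print(handle_tool_call("get_os_info"))
```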

We already have proof that this is a popular feature for users, since it's been integrated into every mobile phone for the past 10 years.

  • wizardbeard@lemmy.dbzer0.com · 10 days ago

    Not who you’re responding to, but I used one extensively in a recent work project. It was a matter of necessity, as I didn’t know how to word my question in the technical terms specific to the product, and it was something that was just perfect for search engines to go “I think you actually mean this completely different thing”. There was also a looming deadline.

    Being able to search using natural language, especially when you know conceptually what you're looking for but not the product- or system-specific technical term, is useful.

    Being able to get disparate information that is related to your issue but spread across multiple pages of documentation in one spot is good too.

    But detailed references on code? Reliable sources?

    I have an extensive technical background. I had a middling amount of background in the systems of this project, but no experience with the specific aspects this project touched. I had to double-check every answer it gave me because of how critical the work was.

    Every single response I got had a significant error, oversight, or massive concealed footgun. Some were resolved by further prompting. Most were resolved by me using my own knowledge to work backward from what it gave me to things I could search on my own, and then finding ways to non-destructively confirm the information or poke around in it myself.

    Maybe I didn’t prompt it right. Maybe the LLM I used wasn’t the best choice for my needs.

    But I find the attitude of singing praises without massive fucking warnings and caveats to be highly dangerous.

    • CompactFlax@discuss.tchncs.de · 10 days ago

      Great response.

      It’s great until you realize it’s led you down the garden path and the stuff it’s telling you about doesn’t exist.

      It’s horrendously untrustworthy.