Pressing the Copilot button to instantly bring up a text box where you can interact with an LLM is amazing UI/UX for productivity. LLMs are by far the best way to retrieve information (that doesn't need to be correct).
If this had been released with agentic features that let it search the web, call tools for things like fetching the time/date and other state from the OS (see the sketch below), use Recall, and integrate properly with the Microsoft app suite, it would have been game-changing.
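To make "call tools" concrete, here's a minimal Python sketch of the kind of dispatch I mean. The tool name, JSON shape, and registry are all made up for illustration; real assistant runtimes (OpenAI-style function calling, etc.) have their own schemas.

```python
import datetime
import json

# Hypothetical tool registry: names and shapes here are illustrative,
# not any real assistant API.
TOOLS = {
    "get_current_time": lambda: datetime.datetime.now().isoformat(),
}

def handle_tool_call(call_json: str) -> str:
    """Dispatch a tool call the model requested, e.g. '{"tool": "get_current_time"}'."""
    call = json.loads(call_json)
    tool = TOOLS.get(call["tool"])
    if tool is None:
        return json.dumps({"error": f"unknown tool {call['tool']!r}"})
    return json.dumps({"result": tool()})

# The model emits a structured request instead of guessing the answer;
# the OS-side runtime executes it and feeds the result back into context.
print(handle_tool_call('{"tool": "get_current_time"}'))
```

The point is that the model never has to "know" the time; it just has to know a tool exists that does.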
We already have evidence that this is a popular feature: assistants like this have shipped on every mobile phone for the past ten years.


OK, so your main complaint is that it's too energy-intensive? Would you concede that an OS assistant is a good feature if the per-query computation cost were lower? Because I'd argue it already is low, and the energy cost of an LLM query isn't unreasonable: the large power costs come from model training, and the per-query cost is negligible by comparison (rough numbers below).
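As a back-of-envelope check, here are the commonly cited ballpark figures; both numbers are rough, contested estimates I'm plugging in for illustration, not measurements:

```python
# Rough, contested ballpark figures (assumptions for illustration only):
TRAINING_ENERGY_MWH = 1_300  # ~GPT-3-scale training run (Patterson et al. estimate)
PER_QUERY_WH = 3             # high-end per-query estimate for a chat request

# How many queries before cumulative inference energy equals one training run?
queries = TRAINING_ENERGY_MWH * 1_000_000 / PER_QUERY_WH
print(f"~{queries:,.0f} queries to match one training run")  # ~433,333,333
```

Even if those estimates are off by an order of magnitude, the shape of the argument holds: training dominates, and an individual query is cheap.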
Also, I won't make an argument about copyright on the training data, because I don't respect copyright.