Just heard about this today: a co-worker used an LLM to find him some shoes that fit. Basically he prompted it to find specific shoes for wider/narrower feet, and it just scraped reviews and told him what to buy. Apparently it worked perfectly for him.
I hate this sort of thing, but this is why normies love LLMs. It's going to become the new way every single person uses the internet. Hell, on a fresh Win 11 install, the first thing that pops up is Copilot saying "hey, use me, I'm better than Google!!"
Frustrating.


I think the other half of this is the confidence with which it's programmed to give the answer. You don't have to sift through a couple of individual answers from different sites and make a decision. It just tells you what to do in a coherent way, eliminating any autonomy you have in the decision process, and any critical thinking, of course. "Fix my sauce by adding lemon? Ok! Add a bay leaf and kill yourself? Can do!"