Just heard this today - a coworker used an LLM to find shoes that fit him. Basically he prompted it to find specific shoes that fit wider/narrower feet, and it just scraped reviews and told him what to get. I guess it worked perfectly for him.

I hate this sort of thing - but this is why normies love LLMs. It’s going to be the new way every single person uses the internet. Hell, on a fresh Win 11 install the first thing that pops up is Copilot saying “hey use me, I’m better than Google!!”

Frustrating.

  • TootSweet@lemmy.world · 1 day ago

    My mother is constantly googling things and reading me the AI overview. And I know LLMs make shit up all the time, and I don’t want AI hallucinations to infect my brain and slowly fuck up my worldview. So I always have to drop everything and go confirm the claims from the AI overview. And I’ve caught plenty of inaccuracies and hallucinations. (One I remember: she googled when the East Wing of the White House was originally built, and the AI overview gave her the year of a major renovation, claiming that was the year it was built, when it had actually been built much earlier.)