A few colleagues and I were sat at our desks the other day, and one of them asked the group, “if you were an animal, what animal would you be?”
I answered with my favourite animal, and we had a little discussion about it. My other colleague answered with two animals, and we tossed those answers back and forth, discussing them and making jokes. We asked the colleague who had asked the question what they thought they’d be, and we discussed their answer.
Regular, normal, light-hearted (time wasting lol) small talk at work between friendly coworkers.
We asked the fourth coworker. He said he’d ask ChatGPT.
It was a really weird moment. We all just kind of sat there. He said the animal it came back with, and that was that. Any further discussion was just “yeah that’s what it said” and we all just sort of went back to our work.
That was weird, right? Using ChatGPT for what is clearly just a little bit of friendly small talk? There’s no bad blood between any of us, we hang out a lot, but it just struck me as really weird and a little bit sad.
Try this
“which plant has the most anuses”
AI Overview
The plant with the most “anuses” (or rather, the most posterior ends with a functional digestive system) is the marine worm Ramisyllis multicaudata. This worm, found in sponges off the coast of Australia, has a single head but can have hundreds of branching bodies, each ending in a separate posterior end with a functional anus.
While plants don’t have anuses in the traditional sense, R. multicaudata is notable for its multiple, branching posterior ends, each with its own anus. This is highly unusual for an animal, as most animals have a single posterior end. The worm’s body branches repeatedly, and with each branch, the digestive system, along with other organs, is duplicated, resulting in multiple posterior ends.
A worm isn’t a plant, though. At least, not unless biology has changed considerably since I was last in school.
I know, it just shows that AI patches words together according to some kind of probability based on the entirety of human writing. So if you ask something off-kilter, you get off-kilter responses. AI doesn’t “understand”.
What do you mean? It has branches, so it is obviously a plant.
/s