AI researchers at Andon Labs embedded various LLMs in a vacuum robot to test how ready they were to be embodied. And hilarity ensued.
Sounds human
Sounding human doesn’t make something human. It sounds human until it doesn’t, because it doesn’t really think. That gap is why we developed the Turing test in the first place.
This is my reaction to so many low-effort posts by people who are presumably not practitioners, designers, or researchers in AI:
“THEY HALLUCINATE!” So do you. Around 90% of what you see right now is your visual cortex filling in the gaps between the tiny points your fovea can actually resolve at any moment.
“IT’S JUST STATISTICS” Correct. Now tell me how neuronal activity differs from this. I’ll wait.
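For what it's worth, the "statistics" being dismissed here is just sampling the next token from a learned probability distribution. A toy sketch (hypothetical counts, nothing like a real LLM's learned weights) of that idea:

```python
import random

# Toy bigram model: hypothetical counts standing in for the
# statistics a real model learns from training data.
bigram_counts = {
    "the": {"cat": 3, "dog": 2, "robot": 5},
    "robot": {"vacuums": 4, "thinks": 1},
}

def next_token(context: str) -> str:
    """Sample the next token in proportion to observed counts."""
    counts = bigram_counts[context]
    tokens, weights = zip(*counts.items())
    return random.choices(tokens, weights=weights, k=1)[0]

print(next_token("the"))  # one of: cat, dog, robot
```

Whether you call that "just statistics" or "prediction" is mostly a framing choice, which is the commenter's point.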
Humans have moral context, societal context, the ability to verify something as factual in the real world, and more. An LLM imagines everything; it has no way to discern fact from fiction.