Why don’t they “create thoughts”? I mentioned this in another comment, but most discussions around AI are people talking past each other because they use the same words to mean different things.
It might seem absurd, but it’s a lot harder to define words like “thought” than you’d think, because often the definition just leads to more questions. Wikipedia, for example, says that thoughts “in their most common sense ... are understood as conscious processes that can happen independently of sensory stimulation,” but then what does “conscious” mean? Until we have a rigorous definition for words like that all the way down to first principles, I wouldn’t agree with definitive statements.
ELIZA is fundamentally different from an LLM, though; it’s much closer to a rule-based expert system than to a learned statistical model.
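For context, here’s a minimal sketch of what ELIZA-style processing looks like: hand-written pattern/response rules with no learned parameters (the specific rules below are made up for illustration, not taken from the original program):

```python
import re
import random

# Illustrative ELIZA-style rules: hand-written regex patterns mapped to
# canned response templates. Nothing is learned; the program just
# reflects fragments of the user's input back via substitution.
RULES = [
    (re.compile(r"\bI need (.+)", re.I),
     ["Why do you need {0}?", "Would it really help you to get {0}?"]),
    (re.compile(r"\bI am (.+)", re.I),
     ["How long have you been {0}?", "Why do you think you are {0}?"]),
    (re.compile(r"\bmy (\w+)", re.I),
     ["Tell me more about your {0}."]),
]

DEFAULT = ["Please go on.", "I see.", "Can you elaborate on that?"]


def respond(text: str) -> str:
    """Return a response from the first rule whose pattern matches."""
    for pattern, templates in RULES:
        match = pattern.search(text)
        if match:
            return random.choice(templates).format(*match.groups())
    return random.choice(DEFAULT)


if __name__ == "__main__":
    # e.g. "How long have you been worried about my job?"
    print(respond("I am worried about my job"))
```

An LLM, by contrast, produces output from billions of learned parameters rather than a fixed table of rules like this, which is the sense in which the two are fundamentally different.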
I see what you’re doing, but you’re asking for too much formalism in a casual context. Defining the entire vocabulary from first principles would be a non-trivial task; it’s so daunting I don’t even want to attempt it here.