I tried putting together a research plan using an LLM. Nothing crazy, I just wanted it to help me structure my thoughts and write LaTeX for me. Horrible experience.
I gave it a reference paper, said "copy that methodology exactly," and then spelled out exactly which steps I wanted included.
It kept making bold claims, suggesting irrelevant methods, and proposing just plain wrong approaches. If I had no idea about the topic I might have believed it, because the thing is so confident. But if you actually know what you're doing, you can see these are bullshit machines.
It only seems confident if you treat it like a person. If you realize it's a flawed machine, the language it uses shouldn't matter. The problem is that people do treat it like a person, i.e., they assume its confident-sounding responses mean something.
I like how confident it is. Now imagine that this is a topic you know nothing about and are relying on it to get information.
I really wish people understood how it works, so that they wouldn’t rely on it for literally anything.