I know this is unpopular as hell, but I believe that LLMs have the potential to do more good than bad for learning, as long as you don’t use them for critical things. So no health-related questions, or questions where getting it wrong is totally unacceptable.

The ability to learn about most subjects in a really short time from a “private tutor” makes it an effective, but flawed, tool.

Let’s say that it gets historical facts wrong 10% of the time. Is the world better off if people learn a lot more, even with some errors here and there? Most people seem to know almost no history at all.

Currently people know very little about critical topics that are important to a society. This ignorance is politically and societally very damaging, maybe a lot more damaging than the source being 10% wrong. If you ask it about social issues, you get more empathetic answers and views than in the mainstream political discourse. “Criminals are criminals for societal reasons”, “Human rights are important”, etc.

Yes, I know the truth can be manipulated, so it has to stay neutral, which some LLMs probably aren’t and never will be.

Am I totally crazy for thinking this?

  • CoyoteFacts@piefed.ca · 3 days ago

    It’s not only that 10% is wrong, it’s knowing which 10% is wrong, which is more important than it seems at first glance. I feel strongly that AI is contributing to people’s inability to really perceive reality. If you direct all your learning through a machine that lies 10% of the time, soon enough your entire world-view will be on shaky ground. How are you going to have a debate with someone when you don’t know which parts of your knowledge are true? Will you automatically concede your knowledge to others, who may be more convincing and less careful about repeating what they’ve learned through AI?

    I think all that AI really needs to do is translate natural-language requests (“What factors led to WW2?”) into normal destinations for further learning. Letting AI try to summarize those destinations seems like a bad idea (at least with where the technology is right now).