Freedom is the right to tell people what they do not want to hear.

  • George Orwell
  • 4 Posts
  • 112 Comments
Joined 16 days ago
Cake day: July 17th, 2025



  • The level of consciousness in something like a brain parasite or a slug is probably so dim that it barely feels like anything to be one. So even if you were reincarnated as one, you likely wouldn’t have much of a subjective experience of it. The only way to really experience a new life after reincarnation would be to come back as something with a complex enough mind to actually have a vivid sense of existence. Not that it matters much - it’s not like you’d remember any of your past lives anyway.

    If reincarnation were real and I had to bet money on how it works, I’d put it down to something like the many‑worlds interpretation of quantum mechanics - where being “reborn as yourself” just means living out one of your alternate timelines in a parallel universe.

  • Looking back, I realize I was pretty immature at 22. It didn’t feel that way at the time, but it sure does now. These days, 18‑year‑olds look like kids to me.

    I didn’t want kids back then, and I still don’t - but my perspective has shifted a little. When I see parents now, there’s a slight melancholic feeling that comes with knowing that’s something I’ll probably never experience.

    So yeah, if you’re 30 and don’t want kids, that’s probably not going to change. Before that, though, there’s always a chance.

  • I personally think the whole concept of AGI is a mirage. In reality, a truly generally intelligent system would almost immediately be superhuman in its capabilities. Even if it were no “smarter” than a human, it could still process information at a vastly higher speed and solve in minutes what would take a team of scientists years or decades.

    And the moment it hits “human level” in coding ability, it starts improving itself - building a slightly better version, which builds an even better version, and so on. I just don’t see any plausible scenario where we create an AI that stays at human-level intelligence. It either stalls far short of that, or it blows right past it.