• 4 Posts
  • 1.47K Comments
Joined 2 years ago
Cake day: September 7th, 2023





  • The tweet before that:

    Let me tell you something about Akash. During a project at Berkeley, I accidentally deleted our entire codebase 2 days before the deadline. I panicked. Akash just stared at the screen, shrugged, and rewrote everything from scratch in one night—better than before.

    This says more about you, the scale of the project, your group's poor organisation, the lack of challenge in Berkeley group projects (nice namedrop though), a failure to understand the exercise (the goal is to learn how to work as a group and notice the coordination problems), and the goals of being at a university (networking, partying and learning) than anything else.

    Hell, I know of a project where this also happened, and they didn't manage to rewrite it, as it actually took a lot of time.



  • E: sorry, slight spoilers for old and popular science fiction series. (You'd think they would have picked some more positive but less well-known transhumanist science fiction books with nice societies, or picked the specifics they liked, but nope.)

    The Foundation is a horrible place to live: it is a frontier city created under false pretenses, where you live under the constant threat of a crisis, manipulated from afar by the Second Foundation, while 'Rome' falls apart around you. Billions die.

    Hyperion is a horrible place to live: the main hegemonic force is a hypercolonialist empire that destroys all variety, kills the dolphins, is secretly run by AIs who abuse humanity, powers everything by destroying the energy that gives us love, and morphs into an authoritarian theocracy secretly run by AIs who are now at war with the beings in the love dimension. Also, billions die when the first empire falls.

    Ringworld is … not a book series I remember much of; I read it when I was young. Might be OK, might not be.

    Also, all these worlds have a big magical element: Dune with all the spice stuff, Hyperion with the love dimension, Foundation with the psi powers and magical prediction powers. And those elements are the setups for the stories' conflicts (and, well, the stories).

    It is all a bit like saying you want Singularity Sky's Festival to arrive without reading up on what happens afterwards. It helps if you actually read books instead of skimming through them in two hours.

    E: I need to make a confession: this guy changed my mind on agentic LLMs. They should use them; it would improve their reading comprehension. (They should also add pronouns if they don't want to be called 'they'.)







  • Nostalgia has a lowkey reactionary element (see also why those right-wing reactionary gamer streamers, who do ten-hour reaction streams criticizing a movie, have their backgrounds filled with consumer nerd-media toys and almost never books), and fear of change is also a part of conservatism. 'Engineering minds' who think they can solve things, and who think a bit more rigidly, also tend to be attracted to more extremist ideologies (which usually have more rigid rules and fewer exceptions). That also leads back to the problem that people like this fail to realize their minds are not typical (I can easily use a console, so everyone else can and should). So it makes sense to me. Not sure if the UI thing is elitism or just a strong desire to create and patrol the borders of an ingroup. (But isn't that just what elitism is?)






  • But the Ratspace doesn't just expect them to actually do things; it also expects them to self-improve. That is another step above merely human-level intelligence: it also requires that self-improvement is possible (and, at the highest level of nuttiness, unbounded), something we have not even seen demonstrated. And it certainly doesn't seem to be, as the gaps between newer, better versions of ChatGPT seem to be increasing (an interface around it doesn't count). So imho, given ChatGPT/LLMs and the lack of fast improvements we have seen recently (some even say performance has decreased, so we are not even getting incremental innovations), the 'could lead to AGI-foom' possibility space has actually shrunk, as LLMs will not take us there. And everything including the kitchen sink has been thrown at the idea. To use some AI-weirdo lingo: with the decels not in play(*), why are the accels not delivering?

    *: And let's face it, on the fronts that matter, we have lost the battle so far.

    E: full disclosure, I have not read Zitron's article; they are a bit long at times. Look at it this way: you could read a quarter of an SSC article in the same time.