Windows 11 often requires new hardware. But for a while, that new hardware is going to be either extremely pricey or shipped with very little RAM.
I don't believe a single competent person works at Micro$oft anymore, but maybe, just maybe, this could push them to make a less shitty OS?
And garbage software like Adobe Creative Cloud too?
They obviously don't care about users, but the pain could become too big to ignore.


Not likely. I expect the AI bubble will burst before those software optimization gears even start to turn.
Even returning to JVM languages would be a huge improvement over the current JS-based Electron slop. Things are so bad that “optimized software” doesn’t need to mean C++ or Rust.
With Rust getting popular, the architecture is there to make huge savings without having to be a rocket scientist.
And the rocket scientists are also getting involved, regularly outperforming even optimised C code.
Not just that: all of their AI slop code will be even less optimized.
Yeah, the systems in place right now took 40 years to build
Yes, but with AI, you can build it in 4 hours, and with all those extra RAMs, it could drop to 2
Big AI is a bubble but AI in general is not.
If anything, the DRAM shortages will apply pressure on researchers to come up with more efficient AI models rather than more efficient (normal) software overall.
I suspect that as more software gets AI-assisted development we’ll actually see less efficient software at first, but eventually more efficient software as adoption of AI coding assistance matures (and probably becomes more formalized/automated).
I say this from experience: if you ask an LLM to write something for you, it often does a terrible job with efficiency. However, if you ask it to analyze an existing code base and make it more efficient, it often does a great job. The dichotomy comes from the nature of AI prompting: it works best when you give it only one thing to do at a time.
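A minimal sketch of the pattern I mean (Python, with made-up function names and data, not taken from any real session): the first version is the kind of quadratic first draft an LLM often produces, the second is what a focused “make this more efficient” pass typically turns it into.

```python
def find_duplicates_naive(items):
    # O(n^2): nested membership scans over a list --
    # the sort of thing a first-draft generated answer often contains.
    duplicates = []
    for i, item in enumerate(items):
        if item in items[i + 1:] and item not in duplicates:
            duplicates.append(item)
    return duplicates


def find_duplicates_optimized(items):
    # O(n): the usual suggestion when the only task is "optimize this" --
    # track what we've seen in sets instead of rescanning the list.
    seen, duplicates = set(), set()
    for item in items:
        if item in seen:
            duplicates.add(item)
        else:
            seen.add(item)
    return list(duplicates)


if __name__ == "__main__":
    data = [3, 1, 4, 1, 5, 9, 2, 6, 5, 3, 5]
    # Both versions agree on the result; only the cost differs.
    assert sorted(find_duplicates_naive(data)) == sorted(find_duplicates_optimized(data))
    print(sorted(find_duplicates_optimized(data)))
```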
In theory, if AI coding assistance becomes more mature and formalized, the “optimize this” step will likely be built in rather than something the developer has to ask for after the fact.