A.I. and big tech do not want you to have computing power that could challenge their digital hegemony.
They will start pushing dumber and dumber devices and making development boxes so out of reach that only the mega-wealthy can afford to buy them.
Dumb devices will not be able to run shitty vibe-coded OSes and apps. Your modern Android phone has orders of magnitude more computing power than a 20-year-old PDA despite having the same (or even less) functionality. Or even compared to a 10-year-old Android device. Software has been becoming slower and more bloated for decades, and it’s only going to accelerate with “ai”.
There will be more software restrictions and locked-down “ecosystems”, but I don’t see the hardware becoming weaker. There is no going back.
I uninstalled Google services and shit from a 60€ Android phone and boom! Now standby battery life is 7 days, where before it was ~2 days.
Well yeah, if it’s not doing anything the battery will last longer, yup.
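For anyone wondering how to reproduce that: on most Android phones you can remove packages for the current user over adb, no root needed. Here’s a minimal Python sketch, assuming adb is on your PATH and USB debugging is enabled; the package names are just common examples, check what’s actually installed with `adb shell pm list packages` first.

```python
# debloat.py - remove selected packages for user 0 via adb (no root).
# "pm uninstall --user 0" only removes the app for the current user;
# a factory reset brings everything back.
import subprocess

# Example package names, not a recommendation - list your own first.
BLOAT = [
    "com.google.android.gms",                   # Play Services (breaks push notifications!)
    "com.google.android.googlequicksearchbox",  # the Google app
    "com.google.android.youtube",
]

def adb(*args: str) -> str:
    """Run an adb command and return its stdout, raising on failure."""
    result = subprocess.run(
        ["adb", *args], capture_output=True, text=True, check=True
    )
    return result.stdout.strip()

if __name__ == "__main__":
    for pkg in BLOAT:
        try:
            print(pkg, "->", adb("shell", "pm", "uninstall", "--user", "0", pkg))
        except subprocess.CalledProcessError as e:
            print(pkg, "-> failed:", e.stderr.strip())
```

Play Services in particular is what keeps waking the device for push notifications and sync, which is why removing it has such an outsized effect on standby drain.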
Microsoft and Nvidia have been trying for years to offload computing power to their own systems, while your computer becomes little more than a remote-access terminal into that power, usable only when these companies allow you access to it.
See: GeForce Now, Xbox Cloud Gaming, and pretty much every popular LLM (there are self-hosted options, but that’s not the major market right now, or the direction it’s headed).
There are, of course, struggles there that they have had a hard time overcoming. Particularly with something like gaming, you need a low-latency, high-speed internet connection; but that’s not necessary for all applications, and it has been improving (slowly).
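To put rough numbers on why gaming is the hard case: the whole pipeline has to fit inside what a player perceives as responsive, and the network is the one stage the provider can’t engineer away. A back-of-envelope sketch in Python (the stage latencies are illustrative guesses, not measurements):

```python
# Back-of-envelope latency budget for cloud gaming (illustrative numbers).
# Local play only pays input + render + display; streaming adds capture,
# encode, network transit, and decode on top.
stages_ms = {
    "input sampling":      4,   # controller/USB polling
    "server render":       8,   # one frame at ~120 fps server-side
    "capture + encode":    8,   # hardware video encoder
    "network round trip": 30,   # good home connection to a nearby region
    "decode + display":   10,   # client decode plus one display refresh
}

total = sum(stages_ms.values())
print(f"end-to-end: {total} ms")  # ~60 ms in this sketch
print(f"network share: {stages_ms['network round trip'] / total:.0%}")

# On a rural or congested link the round trip alone can double:
stages_ms["network round trip"] = 60
print(f"with 60 ms RTT: {sum(stages_ms.values())} ms")
```

The render and display costs exist locally too; capture, encode, decode, and the round trip are pure streaming overhead, which is why a streamed game always feels at least a round trip worse than the same game running on your own box.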
Actually, open-weights models have gotten better and better, to the point that they can compete meaningfully with ChatGPT and Claude Sonnet. Nvidia is one of the companies spearheading this with Nemotron. The issue is more that most of the really competent models need lots of VRAM to run; small models lag quite far behind, although with Nemotron Nano they are getting better.
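The VRAM point is easy to see with napkin math: the weights alone take roughly parameter count × bytes per parameter, before you even add the KV cache and runtime overhead. A quick sketch (the model sizes are generic examples, not official Nemotron specs):

```python
# Rough VRAM needed just to hold model weights, ignoring KV cache and
# runtime overhead (budget another ~10-30% on top in practice).
GIB = 1024**3

def weights_gib(params_billions: float, bytes_per_param: float) -> float:
    """GiB of memory for the weights at a given precision."""
    return params_billions * 1e9 * bytes_per_param / GIB

# Generic example sizes: a small model vs. a large one.
for name, params in [("small (8B)", 8), ("large (70B)", 70)]:
    for precision, nbytes in [("fp16", 2), ("int8", 1), ("int4", 0.5)]:
        print(f"{name:11} {precision}: {weights_gib(params, nbytes):6.1f} GiB")
```

That’s the whole squeeze: a 70B model at fp16 wants ~130 GiB, far beyond any consumer card, while an 8B model quantized to 4 bits fits in under 5 GiB but gives up a lot of capability.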
“Software has been becoming slower and more bloated for decades, and it’s only going to accelerate with ‘ai’.”
This is mostly true, but a little misleading (although the AI part is absolutely correct).
This is mostly a result of having more powerful hardware. When you’re working with very limited hardware, you have to be clever about the code you write. You’re incentivized to find trade-offs and workarounds to get past physical limitations. Computer history is filled with stuff like this.
Starting around the mid-90s, computer hardware was advancing at such a rapid pace that the goalposts shifted. Developers had fewer limitations, software got more ambitious, and teams got larger. This required a methodology change: code suddenly needed to be easier to understand and modify by devs who might not have a full understanding of the entire codebase.
This also benefited the execs: entirely unoptimized, or sometimes even unfinished, code could be brought to market, which meant a faster return on investment.
Today we are seeing the results of that shift. Massive amounts of RAM and powerful CPUs are commonplace in every modern device, and code is inefficient but takes up basically the same percentage of available resources that it always has.
This change to AI coding is unavoidable because the industry has decided it wants development to be fast and cheap at the cost of quality.
The goal here isn’t to have personal devices run the shitty vibe-coded apps; it’s to lease time in datacenters that run them for you and stream them to your device for a monthly fee.
Sure, but there are deep-seated problems with this: (1) the shitty vibe-coded apps are so bloated that they can’t run their client-side code without thick clients, (2) optimizing code is something nobody wants – or in many cases knows how – to do, and (3) Internet access is still spotty in many parts of the US and will likely stay that way due to other digital landlords seeking rent for fallow fields.
It could when you’re literally just running a basic OS and everything else is in “the cloud”. Like that Windows 365 Link box Microsoft released recently that doesn’t actually run Windows itself.
And who is going to create this perfect and resource-efficient OS? Literally all tech corporations are headed in the opposite direction. All proprietary consumer OSes are getting more bloated by the hour, and their developers are being replaced with incompetent vibe coders.
Incompetent? I’ll have you know that I’m a prompt engineer. 😏
I’m no amateur. I end every query with “and no bugs, please.”
Does it at least apologize to you when it adds bugs anyway?
Imagine if people who knew how to use a search engine properly called themselves “search engine prompt engineers”. Maybe people who are good at communicating should start calling themselves “human interface prompt engineers”.
Social engineer
It already exists: it’s just the shitty, cut-down operating systems used by existing thin clients.
Yeah and the fucking FTC isn’t going to do anything about it.
We need Lina Khan back.
What was that book again?
I don’t know, but I do know that the reason SPARC boxes and Solaris/SunOS are known only by people who worked in business or academia is that there were Intel PCs that let affordable computing reach the masses even while crystal-tower computing existed.
Now it seems that affordable PCs are not what the mega-wealthy want, so they will make every computing device capable of mounting a challenge to A.I. as expensive as possible, just like Sun did with their hardware.
They can do this because the market can’t respond with more competition. And tariffs make that worse.