• TheReturnOfPEB@reddthat.com · 4 days ago

    A.I. and big tech do not want you to have the computing power to challenge their digital hegemony.

    They will start pushing dumber and dumber devices and making development boxes so far out of reach that only the mega-wealthy can afford them.

    • deadcream@sopuli.xyz · 4 days ago

      Dumb devices will not be able to run shitty vibe-coded OSes and apps. A modern Android phone has orders of magnitude more computing power than a 20-year-old PDA despite having the same (or even less) functionality; the same goes when you compare it to a 10-year-old Android device. Software has been becoming slower and more bloated for decades, and it’s only going to accelerate with “ai”.

      There will be more software restrictions and locked-down “ecosystems”, but I don’t see the hardware becoming weaker. There is no going back.

      • Axolotl@feddit.it · 4 days ago

        I uninstalled Google services and shit from a 60€ Android phone and boom! Standby battery life is now 7 days; before it was ~2 days.
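
        For anyone wanting to try the same: this kind of debloating is usually done over ADB with USB debugging enabled. A minimal sketch in Python, assuming `adb` is on your PATH; the package names are illustrative examples only, since what’s safe to remove varies by device:

        ```python
        # Minimal debloat sketch: remove packages for user 0 over ADB.
        # -k keeps app data; anything removed can be restored later with
        #   adb shell cmd package install-existing <package>
        import subprocess

        BLOAT = [  # illustrative examples only; safe choices vary by device
            "com.google.android.gms",                   # Google Play Services
            "com.google.android.googlequicksearchbox",  # Google app
        ]

        for pkg in BLOAT:
            result = subprocess.run(
                ["adb", "shell", "pm", "uninstall", "-k", "--user", "0", pkg],
                capture_output=True, text=True,
            )
            print(pkg, (result.stdout or result.stderr).strip())
        ```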

      • Darkassassin07@lemmy.ca · 4 days ago

        Microsoft and Nvidia have been trying for years to offload computing to their own systems, with your computer becoming little more than a remote-access terminal into that power, usable only when these companies allow you access to it.

        See: Nvidia’s GeForce Now, Xbox Cloud Gaming, and pretty much every popular LLM (there are self-hosted options, but that’s not the major market right now, nor the direction things are headed).

        There are, of course, struggles there that they have had a hard time overcoming. With something like gaming in particular, you need a low-latency, high-speed internet connection; but that’s not necessary for all applications, and connectivity has been improving (slowly).
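
        To put rough numbers on the latency point: at 60 fps a frame lasts about 16.7 ms, and every network and codec step adds directly to input-to-photon lag. A back-of-the-envelope sketch, where every component figure is an illustrative assumption rather than a measurement:

        ```python
        # Back-of-the-envelope input-to-photon latency for cloud gaming.
        # Every figure here is an illustrative assumption, not a measurement.
        FRAME_MS = 1000 / 60  # ~16.7 ms per frame at 60 fps

        budget_ms = {
            "input capture + uplink": 2,
            "network round trip": 30,      # varies wildly: fiber vs cable vs LTE
            "server render + encode": 10,
            "client decode + display": 8,
        }

        total = sum(budget_ms.values())
        print(f"total: {total} ms = ~{total / FRAME_MS:.1f} frames of lag at 60 fps")
        # Local play skips the network and encode/decode steps entirely,
        # which is why latency remains cloud gaming's hardest problem.
        ```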

        • NotANumber@lemmy.dbzer0.com · 2 days ago

          Actually, open-weights models have gotten better and better, to the point that they can compete meaningfully with ChatGPT and Claude Sonnet. Nvidia is actually one of the companies spearheading this with Nemotron. The issue is more that most of the really competent models need lots of VRAM to run; small models lag quite far behind, although with Nemotron Nano they are getting better.
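
          The VRAM wall follows almost directly from parameter count times bytes per weight, plus headroom for the KV cache and activations. A rough sketch, where the 20% overhead factor and the model sizes are loose assumptions for illustration:

          ```python
          # Rough VRAM needed to run an LLM: weights = params * bits/8,
          # plus headroom for KV cache/activations (the 1.2x is a loose guess).
          def vram_gb(params_billion: float, bits: int, overhead: float = 1.2) -> float:
              weights_gb = params_billion * bits / 8  # billions of params -> GB
              return weights_gb * overhead

          for params, bits in [(70, 16), (70, 4), (9, 16), (9, 4)]:
              print(f"{params}B @ {bits}-bit: ~{vram_gb(params, bits):.0f} GB")
          # A ~70B model at 16-bit needs multiple server GPUs; 4-bit quantization
          # squeezes it toward a single 48 GB card, while a ~9B model fits
          # consumer GPUs, which is why small-model progress matters so much.
          ```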

      • Godort@lemmy.ca · 4 days ago

        Software has been becoming slower and more bloated for decades, and it’s only going to accelerate with “ai”.

        This is mostly true, but a little misleading (although the AI part is absolutely correct).

        This is mostly a result of having more powerful hardware. When you’re working with very limited hardware, you have to be clever about the code you write: you’re incentivized to find trade-offs and workarounds to get past physical limitations. Computer history is filled with tricks like this.
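
        A classic example of that cleverness is trading a little memory for a lot of speed with a precomputed lookup table, the kind of trick early game and demo programmers leaned on when trig functions were expensive. A toy sketch:

        ```python
        # Classic limited-hardware trick: precompute sine once, then replace
        # costly math.sin() calls with a cheap table lookup, trading a bit
        # of memory and precision for speed (toy illustration).
        import math

        TABLE_SIZE = 256
        SIN_TABLE = [math.sin(2 * math.pi * i / TABLE_SIZE) for i in range(TABLE_SIZE)]

        def fast_sin(angle: float) -> float:
            """Approximate sin(angle) using the nearest table entry."""
            index = round(angle / (2 * math.pi) * TABLE_SIZE) % TABLE_SIZE
            return SIN_TABLE[index]

        print(fast_sin(math.pi / 2), math.sin(math.pi / 2))  # both ~1.0
        ```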

        Starting around the mid-90s, computer hardware was advancing at such a rapid pace that the goalposts shifted. Developers had fewer limitations, software got more ambitious, and teams got larger. This required a methodology change: code suddenly needed to be easy to understand and modify by devs who might not have a full understanding of the entire codebase.

        This also benefited the execs: entirely unoptimized, or sometimes even unfinished, code could be brought to market, which meant a faster return on investment.

        Today we are seeing the results of that shift. Massive amounts of RAM and powerful CPUs are commonplace in every modern device, and code is inefficient, but takes up basically the same percentage of resources that it always has.

        This shift to AI coding is unavoidable because the industry has decided it wants development to be fast and cheap at the cost of quality.

        The goal here isn’t to have personal devices run the shitty vibe-coded apps; it’s to lease time in datacenters that run them for you and stream the results to your device for a monthly fee.

        • aesthelete@lemmy.world · 2 days ago

          The goal here isn’t to have personal devices run the shitty vibe-coded apps; it’s to lease time in datacenters that run them for you and stream the results to your device for a monthly fee.

          Sure, but there are deep-seated problems with this: (1) the shitty vibe-coded apps are so bloated that they can’t run their client-side code without thick clients, (2) optimizing code is something nobody wants to do, or in many cases knows how to do, and (3) internet access is still spotty in many parts of the US and will likely stay that way thanks to other digital landlords seeking rent for fallow fields.

      • Infernal_pizza@lemmy.dbzer0.com · 4 days ago

        It could, when you’re literally just running a basic OS and everything else is in “the cloud”. Like that Windows 365 box Microsoft released recently (the Windows 365 Link) that doesn’t actually run Windows itself.

        • deadcream@sopuli.xyz · 4 days ago

          And who is going to create this perfect, resource-efficient OS? Literally every tech corporation is headed in the opposite direction: all proprietary consumer OSes are getting more bloated by the hour, and their developers are being replaced with incompetent vibe coders.

      • TheReturnOfPEB@reddthat.com · 4 days ago

        I don’t know, but I do know that the reason SPARC boxes and Solaris/SunOS are known only to people who worked in business or academia is that affordable Intel PCs let computing reach the masses, even while that crystal-tower computing existed.

        Now it seems affordable PCs are not what the mega-wealthy want, so they will make every computing device capable of mounting a challenge to A.I. as expensive as possible, just like Sun did with its hardware.

        They can do this because the market can’t respond with more competition, and tariffs make that worse.