• dustyData@lemmy.world · 2 days ago

    That’s why they’re making it expendable. Those chips are designed to survive no more than 5 years of service in data centers, an unprecedentedly low level of durability provisioning. They are intentionally making e-waste.

    • NotANumber@lemmy.dbzer0.com · 1 day ago

      Well yes, but it’s not like this is the first time very expensive hardware has been entirely expendable, or even the first time it makes sense for it to be. Look at the Saturn V: it cost something like a billion dollars per rocket in today’s money, and each one could only be used once. You had to build a whole new rocket every time you wanted to go to the Moon. That’s just how things were with the technology available at the time. The funny thing is that it was actually cheaper per launch than the Space Shuttle in the end, despite the Space Shuttle being mostly reused/refurbished between launches.
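
      For what it’s worth, the per-launch comparison is simple arithmetic. A back-of-the-envelope sketch, where both figures are rough, inflation-adjusted public estimates rather than exact data:

      ```python
      # Rough cost-per-launch comparison; both constants are approximate,
      # inflation-adjusted public estimates, not authoritative figures.
      SATURN_V_PER_LAUNCH = 1.2e9    # ~$1.2B: a brand-new rocket every flight
      SHUTTLE_PROGRAM_COST = 2.1e11  # ~$210B total program cost
      SHUTTLE_FLIGHTS = 135          # flights over the whole program

      # Amortize the entire Shuttle program over its flights: reuse and
      # refurbishment did not make each individual launch cheaper.
      shuttle_per_launch = SHUTTLE_PROGRAM_COST / SHUTTLE_FLIGHTS
      print(f"Saturn V: ${SATURN_V_PER_LAUNCH / 1e9:.2f}B per launch")
      print(f"Shuttle:  ${shuttle_per_launch / 1e9:.2f}B per launch")
      ```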

      Data center hardware has always had a limited lifespan due to new technology making it obsolete. Improvements in efficiency and performance make it cheaper to buy new servers than to keep running old ones. I am pretty sure 5 or 6 years was already roughly the lifespan of these things to begin with. AI hasn’t really changed that, only the scale has changed.
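
      The replace-vs-keep decision is usually just a payback calculation. A minimal sketch, where every number is an illustrative assumption rather than real pricing:

      ```python
      # Toy consolidation math: one new server replaces several old ones.
      # All numbers below are made-up illustrative assumptions.
      OLD_SERVERS = 4           # old boxes one new server can replace
      OLD_POWER_KW = 0.8        # draw per old server under load
      NEW_POWER_KW = 1.0        # draw of the single replacement server
      NEW_SERVER_COST = 15_000  # capex for the replacement
      SUPPORT_PER_OLD = 500     # yearly out-of-warranty support per old box
      KWH_PRICE = 0.15          # electricity, $/kWh
      HOURS = 24 * 365

      old_opex = OLD_SERVERS * (OLD_POWER_KW * HOURS * KWH_PRICE + SUPPORT_PER_OLD)
      new_opex = NEW_POWER_KW * HOURS * KWH_PRICE
      payback_years = NEW_SERVER_COST / (old_opex - new_opex)
      print(f"yearly opex savings: ${old_opex - new_opex:,.0f}")
      print(f"replacement pays for itself in ~{payback_years:.1f} years")
      ```

      With assumptions in that ballpark the payback lands around 3 years, which is roughly why the 3-6 year refresh cycle keeps showing up.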

      • dustyData@lemmy.world · edited · 13 hours ago

        Comparing AI to the Saturn V is morally insulting. Sure, servers do have a lifespan. There’s a science to the upgrade rate, and it was probably 5 years… back in the 90s, when tech was new, evolving fast, and e-waste wasn’t even a concept. Today, durability is measured in decades, and hardware is typically provisioned to last that long.

        There are many servers with chips from the last 20 years that could be spun up today and would still work perfectly fine. They were fitted with proper cooling and engineered to last. In 2020 I worked in a data center that still had a 1999 mainframe in production, purring away like a kitten and not a single bit less performant. It just received more network storage and new RAM from time to time. This is not what is happening with AI chips. They are being planned to burn out and become useless from heat degradation.
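
        To put numbers on the heat-degradation point: silicon wear-out accelerates with junction temperature, conventionally modelled with an Arrhenius factor. A minimal sketch; the activation energy and temperatures are illustrative assumptions, not anyone’s published specs:

        ```python
        import math

        BOLTZMANN_EV = 8.617e-5  # Boltzmann constant, eV/K
        ACTIVATION_ENERGY = 0.7  # eV; a common rule-of-thumb value for silicon

        def acceleration_factor(t_ref_c: float, t_hot_c: float) -> float:
            """Arrhenius acceleration: how much faster wear-out mechanisms
            proceed at t_hot_c than at the reference temperature t_ref_c."""
            t_ref = t_ref_c + 273.15  # convert Celsius to Kelvin
            t_hot = t_hot_c + 273.15
            return math.exp((ACTIVATION_ENERGY / BOLTZMANN_EV) * (1 / t_ref - 1 / t_hot))

        # Illustrative: a chip engineered for ~20 years at a 60 °C junction
        # temperature, but run flat out at 90 °C around the clock instead.
        af = acceleration_factor(60, 90)
        print(f"wear-out acceleration: {af:.1f}x")
        print(f"20-year design life shrinks to ~{20 / af:.1f} years")
        ```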

        All of this is based on NVIDIA’s promise of chip design breakthroughs that still don’t exist, for new generations of LLMs that don’t exist either. The problem is that, essentially, LLM tech has hit a performance plateau. More hardware, more data, and more tokens are not solving the problems that AI companies thought they would, and development has hit a dead end where very few easy or simple improvements are left to make.
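
        The diminishing returns show up even in the scaling laws the labs themselves published. A minimal sketch using the Chinchilla fit (Hoffmann et al., 2022); the constants are the published values, while the scaled-up model sizes are arbitrary illustrative points:

        ```python
        # Chinchilla-style scaling law: loss = E + A / N**alpha + B / D**beta
        # N = parameter count, D = training tokens. Constants are the
        # published Chinchilla fit (Hoffmann et al., 2022).
        E, A, B, ALPHA, BETA = 1.69, 406.4, 410.7, 0.34, 0.28

        def predicted_loss(n_params: float, n_tokens: float) -> float:
            return E + A / n_params**ALPHA + B / n_tokens**BETA

        # Scale a Chinchilla-sized model (70B params, 1.4T tokens) up 10x
        # and 100x: each factor of 10 buys less and less loss reduction,
        # because the law asymptotes to the irreducible floor E = 1.69.
        for scale in (1, 10, 100):
            n, d = 70e9 * scale, 1.4e12 * scale
            print(f"{scale:>4}x compute -> predicted loss {predicted_loss(n, d):.3f}")
        ```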

        • NotANumber@lemmy.dbzer0.com · 12 hours ago

          Talking about a single mainframe lasting 20+ years is disingenuous, given that mainframes are not normal servers and inherently have a longer lifespan. Even then, 20+ years is an exceptionally long run for one of these, not because of hardware limitations, but because it normally does not make financial sense. Mainframes typically run legacy systems, so they are the one place these kinds of financial rules don’t apply.

          The average operational lifespan of a server is still 5-6 years, as it has always been; some businesses replace theirs every 3 years. A quick Google search would tell you that. That’s not a limit inherent to the hardware, but simply how long they are warrantied and deployed for in most instances. As I explained before, it doesn’t make sense to keep servers around longer when more modern options are available. Saying they “would still work perfectly fine” doesn’t really mean anything outside of the hobbyist and used-server markets.

          LLMs haven’t reached a pause in performance, and they are only one category of AI model. If you actually kept track of advancements instead of sitting here whining, you would have seen that far more has been happening, even in just the last year, than just throwing data and compute at the problem. I find your intentional ignorance to be morally insulting.

          • dustyData@lemmy.world · edited · 10 hours ago

            Lol, tell me you’ve never stepped inside a data center in your life without telling me.

            Just because the US-dominated market is wasteful and destructive doesn’t mean it is like that everywhere. Buy a server today and the offerings will be the same CPUs that were available five years ago. Servers are mean, powerful beasts, and upgrades have been slow and incremental at best for almost two decades. While manufacturer warranties might last 7 to 10 years, operators offer refurbishment and refresh services with extended guarantees. A decade-old server is not a rare sight in a data center; hell, we even kept some old Windows servers from the XP era around.

            Also, mainframes are most definitely not legacy machinery. Modern, new mainframes are deployed even today. The mainframe is a particular mode and architecture quirk, but it is just another server at the end of the day. In fact, the z17 is an AI-specialized mainframe that was released just this year as a full-stack, ready-made AI solution.

            A business that replaces its servers every 3 years is burning money, and any sane CFO would kick the CTO in the nuts for making such a stupid decision without a very strong reason. Though C-suites are not known for being sane, it is mostly in the US that this kind of wastefulness is found. All of this is from experience on the corporate IT side, not at all the hobbyist or second-hand market.

            • NotANumber@lemmy.dbzer0.com · 9 hours ago

              Yes, I am well aware that modern mainframes exist; I am actually planning to get certified on them at some point. They are, however, a very niche solution, which you clearly should know, and are often tasked with running software made decades ago. A mainframe from 1999 is not exactly modern.

              If you are legitimately running Windows XP or Server 2003, then you are way outside government regulations and compliance, and your whole team should be sacked immediately, including you. Don’t come on a public forum and brag about the incompetence of your whole organisation, for fuck’s sake. You just painted a target on your back. You clearly have no understanding of either cyber security or operational security, doing things like that.

              • dustyData@lemmy.world · 6 hours ago

                Which country’s regulations?

                It is awfully privileged and insulting to imply such horrible things and to wish harm on others out of xenophobia and limited experience with diverse contexts.

                • NotANumber@lemmy.dbzer0.com · 6 hours ago

                  You are really reaching here. I live in the UK, but I am pretty sure the EU has regulations just as harsh, and certainly harsher than the USA’s. Where I am in academia, we have machines that are 6-8 years old that we are only allowed to use in development and testing environments. They can’t be used in production because our IT team won’t allow it.

                  There is no good reason to put Windows XP on an internet-connected network in a production environment. It’s acceptable only on air-gapped machines. If you have servers that are too old for modern Windows, then use Linux. Failing that, buy a new machine.