
British Columbia proposed legislation to limit how much electricity will be available to artificial intelligence data centers, and moved to permanently ban new cryptocurrency mining projects.

The government of Canada’s third-most populous province will prioritize connections to its power grid for other purposes like mines and natural gas facilities because they provide more jobs and revenue for people in BC, the energy ministry said Monday.

“Other jurisdictions have been challenged to address electricity demands from emerging sectors and, in many cases, have placed significant rate increases on the backs of ratepayers,” the department said Monday.

That’s a reference to US states like Virginia and Maryland, where a proliferation of the power-hungry data centers needed for AI appears to be pushing up citizens’ power bills, according to a Bloomberg analysis. BC “is receiving significant requests for power” from these industries, Energy Minister Adrian Dix said at a press conference.

  • patatas@sh.itjust.works · 2 days ago

    I gotta dispute the idea that we need AI data centers at all, let alone “sovereign” ones. What social purpose do they serve?

    • tarsn@lemmy.ca · edited 2 days ago

      I can give you one use case that has a public benefit. My brother works in research informatics at a children’s hospital. They use AI to identify children with rare diseases. My understanding is that it tracks patterns of appointments and symptoms and matches the patients with specialists. Typically these patients wouldn’t be identified for years, because doctors look for common ailments before considering anything exotic.

      There are lots of uses in urban planning related to population growth and census statistics as well.

      • patatas@sh.itjust.works · 2 days ago

        I’d be curious to see data on the benefits, but assuming what you say is true: this example in medicine sounds like a pretty basic kind of machine learning and not something that requires massive energy-hungry data centers.

        Same with the urban planning example. These are not the applications that require “sovereign AI compute” at scale. Those would be the generative AI applications like chatbots and image/video generators, as far as I understand these things.
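
        To give a rough sense of scale (everything in the sketch below is invented for illustration; none of it is from the hospital project mentioned above), a pattern-matching model of that kind can be put together with off-the-shelf tooling and trained in seconds on a laptop:

        ```python
        # Toy sketch only: synthetic data, invented feature names, assumed sizes.
        # The point is the compute footprint, not the medicine.
        import numpy as np
        from sklearn.ensemble import GradientBoostingClassifier
        from sklearn.metrics import roc_auc_score
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(0)
        n_patients = 5_000

        # Hypothetical per-patient features: specialist visits, distinct symptom
        # codes, ER visits, months since first unexplained symptom.
        X = np.column_stack([
            rng.poisson(3, n_patients),
            rng.poisson(5, n_patients),
            rng.poisson(1, n_patients),
            rng.uniform(0, 36, n_patients),
        ])
        # Toy label: long, symptom-heavy histories stand in for "eventually
        # diagnosed with a rare disease" (purely illustrative).
        score = 0.08 * X[:, 1] + 0.03 * X[:, 3] + rng.normal(0, 0.5, n_patients)
        y = (score > np.quantile(score, 0.97)).astype(int)

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2,
                                                  stratify=y, random_state=0)
        model = GradientBoostingClassifier(random_state=0).fit(X_tr, y_tr)
        print("AUC:", roc_auc_score(y_te, model.predict_proba(X_te)[:, 1]))
        ```

        A model like this fits comfortably on ordinary hardware; nothing about it calls for a gigawatt-scale facility.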

      • Em Adespoton@lemmy.ca · 2 days ago

        AI data centres are usually about giant LLMs and agentic bots. “AI” as in machine learning doesn’t need giant data centres and has been progressing quite well without them.

        The term “AI” tends to get thrown around to claim all the benefits of the entire field to excuse the excesses of a very narrow slice.

      • Victor Villas@lemmy.ca · 2 days ago

        These applications are great, but they’re not what these compute centers are for. For those applications, a regular supercomputer will do. Those gigantic and power hungry data centers are used for LLM training, which is a VC-funded arms race that we don’t actually need to partake in.

    • AGM@lemmy.ca · 2 days ago

      There will be huge demand for inference compute in Canada, both in the public and the private sector. It will be needed for Canadian companies to be competitive and for Canadian education and public services to keep up. If we don’t have data centers providing that inference, then we will depend upon it being provided by others, and we will just be creating deepened foreign dependencies across our public and private sectors. What we should have is Canadian resources feeding Canadian energy production, feeding Canadian data centers, feeding inference to Canadian companies and the public sector, supporting Canadians and Canadian companies to be competitive.

      • patatas@sh.itjust.works · 2 days ago

        This is a circular argument. “We need it because it is useful”. Useful for what? What, specifically, are the supposed social or productivity benefits from these data centers?

        • AGM@lemmy.ca · 2 days ago

          Inference is what’s primarily driving demand. Training uses massive energy, but it’s a one-time cost per model (for now). Inference is ongoing and scales with demand and model complexity. As demand has kept climbing, and model complexity has too, inference energy demands are far more than training over time. That’s true even with big efficiency gains in models.
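
          To make the “over time” comparison concrete, here’s a back-of-the-envelope sketch; every number in it is an assumption picked for illustration, not a measurement from the article or this thread:

          ```python
          # Illustrative arithmetic only; all figures below are assumptions.
          TRAINING_ENERGY_GWH = 10.0       # assumed one-time training run for a large model
          ENERGY_PER_QUERY_WH = 0.3        # assumed average energy per inference request
          QUERIES_PER_DAY = 200_000_000    # assumed sustained demand on that model

          daily_inference_gwh = QUERIES_PER_DAY * ENERGY_PER_QUERY_WH / 1e9  # Wh -> GWh
          days_to_match = TRAINING_ENERGY_GWH / daily_inference_gwh

          print(f"Inference energy per day: {daily_inference_gwh:.2f} GWh")
          print(f"Days until cumulative inference exceeds the training run: {days_to_match:.0f}")
          # With these assumed numbers, cumulative inference passes the one-time
          # training cost in roughly half a year and keeps growing with demand.
          ```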

          • Victor Villas@lemmy.ca · edited 2 days ago

            I don’t disagree, but your statement that there will be huge demand for inference compute doesn’t necessarily imply that we need to worry about compute center buildout for that: inference consumes far fewer resources than training, and most of the compute center buildout we’re seeing out there is for training, not inference.

            “inference energy demands are far more than training over time”

            In aggregate? Sure. But unlike training compute, it doesn’t need to be centralized/colocated, and it’s way more energy efficient. If you were just making a case that we need more compute overall, I’d agree; I’d even say it’s near consensus. But that’s not what this legislation discussion is about. The subject here is power-hungry training infrastructure.

            • AGM@lemmy.ca · 1 day ago

              That was true a couple of years ago, but inference is the primary driver of data center build out now, and it’s expected to only increase over the coming years. It’s true that inference is cheap per token, and a lot of inference will move to the edge, but there will be even more demand for centralized compute from more complex and demanding models that can’t run on edge devices.
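
              A rough way to see why the largest models stay centralized (parameter counts and precisions below are assumed round numbers, not any specific product): weight memory alone is roughly parameters × bytes per parameter, before counting the KV cache and activations.

              ```python
              # Back-of-the-envelope: weight memory only, all model sizes assumed.
              def weight_gb(params_billion: float, bytes_per_param: float) -> float:
                  # 1e9 params * bytes-per-param, expressed in GB (1e9 bytes)
                  return params_billion * bytes_per_param

              for params_b, precision, bytes_pp in [(8, "int4", 0.5), (8, "fp16", 2.0),
                                                    (70, "fp16", 2.0), (400, "fp8", 1.0)]:
                  print(f"{params_b}B weights @ {precision}: ~{weight_gb(params_b, bytes_pp):.0f} GB")
              # ~4 GB fits on a phone or laptop; ~140-400 GB needs a multi-GPU server,
              # i.e. the kind of centralized inference capacity being discussed here.
              ```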

              • Victor Villas@lemmy.ca · 19 hours ago

                “inference is the primary driver of data center build out now”

                Hmm, maybe I’m not up to speed with the latest developments then, but that sounds plausible.

                • AGM@lemmy.ca · 19 hours ago

                  It’s what makes it such a critical need. Canadian companies, individuals, public services, etc. will all have a growing demand for inference, and it will still largely be coming from centralized servers. If we are not serving that demand domestically and under Canadian regulations, we will be creating huge new vulnerabilities via foreign dependencies. Even though we are not training top models domestically, a lot of demand could be served through domestic use of open models in data centers under Canadian control, including running domestically tuned models and agent swarms that demand tonnes of inference. It’s a serious strategic need that requires national strategic planning, talent development, regulation, and funding.