Archived version

British Columbia proposed legislation to limit how much electricity will be available to artificial intelligence data centers, and moved to permanently ban new cryptocurrency mining projects.

The government of Canada’s third-most populous province will prioritize connections to its power grid for other purposes like mines and natural gas facilities because they provide more jobs and revenue for people in BC, the energy ministry said Monday.

“Other jurisdictions have been challenged to address electricity demands from emerging sectors and, in many cases, have placed significant rate increases on the backs of ratepayers,” the department said Monday.

That’s a reference to US states like Virginia and Maryland, where a proliferation of the power-hungry data centers needed for AI appears to be pushing up citizens’ power bills, according to a Bloomberg analysis. BC “is receiving significant requests for power” from these industries, Energy Minister Adrian Dix said at a press conference.

  • AGM@lemmy.ca · 2 days ago

    There will be huge demand for inference compute in Canada, in both the public and the private sector. Canadian companies will need it to stay competitive, and Canadian education and public services will need it to keep up. If we don’t have data centers providing that inference ourselves, we will depend on others to provide it, deepening foreign dependencies across our public and private sectors. What we should have is Canadian resources feeding Canadian energy production, feeding Canadian data centers, feeding inference to Canadian companies and the public sector, keeping Canadians and Canadian companies competitive.

    • patatas@sh.itjust.works · 2 days ago

      This is a circular argument. “We need it because it is useful”. Useful for what? What, specifically, are the supposed social or productivity benefits from these data centers?

      • AGM@lemmy.ca · 2 days ago

        Inference is what’s primarily driving demand. Training uses massive energy, but it’s a one-time cost per model (for now). Inference is ongoing and scales with both demand and model complexity. With demand climbing and models getting more complex, inference energy demands are far more than training over time. That’s true even with big efficiency gains in models.
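
        To make that concrete, here’s a rough back-of-envelope sketch. Every number below is a made-up placeholder, chosen only to show why an ongoing inference load can overtake a one-time training cost:

        ```python
        # Illustrative only: all figures are hypothetical placeholders, not real measurements.
        train_energy_mwh = 1_000          # assumed one-time training cost for a model, in MWh
        inference_wh_per_query = 3        # assumed energy per served query, in Wh
        queries_per_day = 50_000_000      # assumed sustained demand, in queries per day

        # Daily inference energy in MWh (Wh -> MWh conversion).
        daily_inference_mwh = inference_wh_per_query * queries_per_day / 1_000_000

        # Days until cumulative inference energy matches the one-time training cost.
        days_to_match_training = train_energy_mwh / daily_inference_mwh

        print(f"Inference: ~{daily_inference_mwh:.0f} MWh/day")
        print(f"Cumulative inference passes the training cost after ~{days_to_match_training:.1f} days")
        ```

        Swap in whatever numbers you prefer; the shape of the argument is that the training term is fixed per model while the inference term keeps growing with usage.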

        • Victor Villas@lemmy.ca · 2 days ago (edited)

          I don’t disagree, but huge demand for inference compute doesn’t necessarily mean we need to worry about data center buildout for it: inference consumes far fewer resources than training, and most of the data center buildout we’re seeing out there is for training, not inference.

          “inference energy demands are far more than training over time”

          In aggregate? Sure. But unlike training compute, inference doesn’t need to be centralized or colocated, and it’s far more energy efficient. If you were just making the case that we need more compute overall, I’d agree; I’d even say that’s near consensus. But that’s not what this legislation discussion is about. The subject here is power-hungry training infrastructure.

          • AGM@lemmy.ca · 1 day ago

            That was true a couple of years ago, but inference is the primary driver of data center buildout now, and it’s expected to only grow over the coming years. Inference is cheap per token, and a lot of it will move to the edge, but there will be even more demand for centralized compute to serve the more complex and demanding models that can’t run on edge devices.

            • Victor Villas@lemmy.ca · 19 hours ago

              “inference is the primary driver of data center buildout now”

              Hmm, maybe I’m not up to speed with the latest developments then, but that sounds plausible.

              • AGM@lemmy.ca · 19 hours ago

                That’s what makes it such a critical need. Canadian companies, individuals, public services, etc. will all have a growing demand for inference, and it will still largely be served from centralized servers. If we are not serving that demand domestically and under Canadian regulations, we will be creating huge new vulnerabilities through foreign dependencies. Even if we aren’t training top models domestically, a lot of that demand could be served by running open models in data centers under Canadian control, including domestically tuned models and agent swarms that demand tonnes of inference. It’s a serious strategic need that requires national strategic planning, talent development, regulation, and funding.