brucethemoose@lemmy.world · 4 days ago

Total nonsense. ESRGAN was trained on potato-tier hardware, and tons of research models still are. I fine-tune models on my desktop for nickels' worth of electricity; it never touches a cloud datacenter. Rough math below.
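A back-of-envelope sketch of that claim. The numbers are my own assumptions, not from the comment: a ~350 W consumer GPU, a 3-hour LoRA-style fine-tuning run, and roughly $0.15/kWh residential power.

```python
# Back-of-envelope: electricity cost of a desktop fine-tuning run.
# All inputs are assumptions: ~350 W GPU draw, a 3-hour run,
# and a ~$0.15/kWh residential electricity rate.
gpu_watts = 350
hours = 3
usd_per_kwh = 0.15

kwh = gpu_watts * hours / 1000          # energy used, in kilowatt-hours
cost = kwh * usd_per_kwh
print(f"{kwh:.2f} kWh -> ${cost:.2f}")  # ~1.05 kWh -> ~$0.16
```

Even if you double or triple those assumptions, the run still costs well under a dollar.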

At the high end, if you look past bullshitters like Altman, models are dirt cheap to run and getting cheaper. If BitNet takes off (a 2B BitNet model was released just days ago), inference energy consumption will be basically free and on-device, the way video encoding/decoding is now.
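The on-device part is mostly about weight size: BitNet-style ternary weights carry roughly 1.58 bits per parameter (log2 of the three values {-1, 0, +1}), about a 10x shrink versus fp16, which is what lets a model fit and run cheaply on consumer hardware. A rough sketch, using a hypothetical 2B-parameter model as in the release mentioned above:

```python
# Rough memory math for BitNet-style ternary weights vs fp16,
# for a hypothetical 2B-parameter model. Numbers are illustrative.
params = 2e9
bits_ternary = 1.58   # ~log2(3): information content of {-1, 0, +1}
bits_fp16 = 16

def weight_gb(bits_per_param: float) -> float:
    """Weight storage in gigabytes for the given bits per parameter."""
    return params * bits_per_param / 8 / 1e9

print(f"ternary: {weight_gb(bits_ternary):.2f} GB")  # ~0.40 GB
print(f"fp16:    {weight_gb(bits_fp16):.2f} GB")     # ~4.00 GB
```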

Again, I emphasize: it's corporate bullshit giving everything a bad name.