• MadMadBunny@lemmy.ca · 2 days ago

    Calling it now: AI use is gonna become quite expensive all of a sudden, after that bubble bursts…

    • Truscape@lemmy.blahaj.zone · 2 days ago

      Not my thing, but I’d imagine liquidated hardware for self-hosting would be quite cheap for those who care about it that much.

      • very_well_lost@lemmy.world · 2 days ago (edited)

        Those models are still outrageously expensive to train. They may be free for you to use, but they aren’t free to make, and once the existing hype train crashes, no one with the resources to create this stuff is gonna wanna spend money training the next Mistral or Llama or Deepseek.

        The existing models will continue to exist, sure, but they’ll be frozen in their current state, and updated versions with newer training data won’t be released. The ecosystem will go stale — at least until someone figures out how to make training cheap, which probably isn’t even technically possible.

        • Truscape@lemmy.blahaj.zone · 2 days ago

          When community models were first getting formed (going back to the Pygmalion days), there were distributed systems that allowed you to “contribute” GPU power from your personal computer to help update the model or provide compute for others. I would imagine we would return to something similar to those roots (a rough sketch of what that could look like is below).

          This would not be Deepseek; this would be niche models made by hobbyists, for hobbyists.
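
          For anyone wondering what “contributing GPU power” looks like in practice, here is a rough sketch of a volunteer worker loop in that horde style. The coordinator URL, the /pop and /submit endpoints, and the local generate() stub are all hypothetical placeholders meant to show the shape of the idea, not any real project’s API.

          ```python
          # Hypothetical sketch of a volunteer worker that donates local GPU time.
          # The coordinator, its endpoints, and generate() are made-up placeholders.
          import time

          import requests

          COORDINATOR = "https://example-horde.invalid/api"  # placeholder coordination server
          WORKER_NAME = "hobbyist-gpu-01"                     # placeholder worker id


          def generate(prompt: str) -> str:
              """Stand-in for running a local model on the contributor's own GPU."""
              return f"(generated text for: {prompt!r})"


          def run_worker() -> None:
              while True:
                  # Ask the coordinator whether any job is waiting for a worker.
                  resp = requests.post(f"{COORDINATOR}/pop", json={"worker": WORKER_NAME})
                  job = resp.json()

                  if not job.get("id"):
                      time.sleep(5)  # nothing queued right now, back off briefly
                      continue

                  # Run the job locally, then hand the result back to the coordinator.
                  text = generate(job["prompt"])
                  requests.post(
                      f"{COORDINATOR}/submit",
                      json={"id": job["id"], "worker": WORKER_NAME, "text": text},
                  )


          if __name__ == "__main__":
              run_worker()
          ```

          A real system would also need authentication, matching jobs to whatever model a worker can actually run, and some defense against bad results from untrusted workers, but the basic poll-work-submit loop is the core of it.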