• FE80@lemmy.world · 6 hours ago

    What kind of shit-for-brains asshole is still defending Windows in 2025?

    • krooklochurm@lemmy.ca · 3 hours ago

      And what kind of slavering, mouth-breathing troglodyte doesn't understand that Hannah Montana Linux negates all of these issues, will suck your dick without hesitation, and lets you read news from four days in the future?

  • SpaceCowboy@lemmy.ca · 5 hours ago

    The real reason is that it's a pain in the ass to deploy software on Windows. It's not like you can easily set up a server, put some packages on it, and have clients just apt update against it. Sure, there are some "Enterprise" servers you could set up (and pay license fees for) that might work somewhat like that, but it's easier to just make it a web app and deploy it to an internet-facing webserver.
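
    For example, with an internal apt repo (just a sketch; the URL and keyring path below are made up), pushing an update to every machine is basically just publishing a new package version:

        # /etc/apt/sources.list.d/internal.list  (apt.example.internal is hypothetical)
        deb [signed-by=/usr/share/keyrings/internal.gpg] https://apt.example.internal/debian stable main

        # clients then pick up new builds with the usual:
        #   sudo apt update && sudo apt upgrade -y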

    For product distribution, you need someone to download an .exe, hope a virus scanner doesn't block it, maybe pay Microsoft to sign it or whatever, hope the user has a compatible version of Windows, and maybe then they end up with working software. Then you have to build some mechanism to handle updates and hope that doesn't get blocked by some security software either. So it's easier to make your software a web application.

    Also, putting out Windows-native applications means you might not be able to enshittify them later, since people could just keep using the old version forever. It's weird to assume enshittification happens accidentally; it's actually what some companies want to do to their software, because $$$. They want applications they can enshittify later. They don't make applications that might even keep working on Linux and then, whoopsie, just somehow got enshittified by accident… somehow.

    But a lot of the time it's just the best solution. If an application doesn't need access to anything on my system, I'd rather it be a web app. The app does the thing I need, and when I'm done, I close the tab and we're done. Why install more software on my system if I don't need to?

  • BarqsHasBite@lemmy.world · edited · 11 hours ago

    I welcome others' input, but I thought this was a pretty clear-cut case of the Mac becoming popular. Why write a program for both Windows and Mac when you can just make a website? Then Chromebooks in education sealed the deal.

    Linux is only starting to see mainstream use now because of Europe's push for digital sovereignty and Windows 10 reaching end of life.

  • jollyrogue@lemmy.ml · edited · 13 hours ago

    1. The user land API/ABI is stable to a fault in Linux. The kernel API/ABI is unstable.

    2. Companies are cheap. They hired web devs and then tasked them with building a desktop application, rather than hiring people to write native apps. They had a hammer and used it to fix every problem they had.

    3. macOS is just as affected by Electron apps as Linux is.

    4. Electron is horrible, but it does bring apps to many an OS once Chromium is ported.

    5. Open protocols or open APIs from the company would fix the non-native app problem.

    • ammonium@lemmy.world · 12 hours ago

      The user land API/ABI is stable to a fault in Linux. The kernel API/ABI is unstable

      It's the other way around. The kernel API is stable to a fault; the kernel ABI isn't. If your application relies only on the kernel API, you won't have many compatibility issues. If you rely on userland stuff such as the C++ stdlib, GTK, Qt, Python, … good luck.

        • jollyrogue@lemmy.ml · 4 hours ago

        I wasn't clear, and that seems to have caused some confusion. I was talking about the Linux kernel itself, and only the Linux kernel.

        There are two sides to the Linux kernel: the internal interfaces exposed to drivers and such, and the external syscalls exposed to the public. That's what I was talking about.

        All bets are off with 3rd party software. That’s just a general problem in software development. It’s not specific to Linux, and it’s why vendoring libraries is recommended.

        This is why all the 3rd party software is frozen at a point-in-time with fixes backported in distros like Debian or RHEL. It fixes the problems of devs being mercurial. The distro is the SDK. It creates a stable base, and it works rather well.

        Unfortunately, most software relies on libc and a compiler, both of which can be problems, and both of which are external to the Linux kernel. There's not much out there that relies only on kernel syscalls.
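
        To illustrate, this is roughly what "relies only on kernel syscalls" looks like. Just a sketch for x86-64 (syscall numbers are per-architecture), built with something like gcc -nostdlib -static -o hello hello.c:

          /* hello.c: no libc, only the stable Linux syscall ABI (x86-64) */

          static long sys_write(long fd, const void *buf, long len) {
              long ret;
              __asm__ volatile ("syscall"
                                : "=a"(ret)
                                : "a"(1), "D"(fd), "S"(buf), "d"(len)  /* 1 = write */
                                : "rcx", "r11", "memory");
              return ret;
          }

          static void sys_exit(long code) {
              __asm__ volatile ("syscall" : : "a"(60), "D"(code));     /* 60 = exit */
              __builtin_unreachable();
          }

          void _start(void) {                      /* entry point when there is no libc */
              sys_write(1, "hello\n", 6);
              sys_exit(0);
          }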

    • JackbyDev@programming.dev · 11 hours ago

      Further, if you get code into the kernel, anyone who breaks it needs to fix it. So it seems to me it’s only a problem if you’re trying to do something like maintain a proprietary driver without putting it into the kernel? Or something to that effect?

      • jollyrogue@lemmy.ml · 5 hours ago

        Basically. Out-of-tree drivers are annoying without an LTS kernel.

        There are also out-of-tree drivers which don’t get mainlined for one reason or another even though they are FOSS. OpenZFS has this problem, and now so does bcachefs.

      • Railcar8095@lemmy.world · edited · 11 hours ago

        Well, RAM is dirt cheap anyways. /S

        Edit: I bought this one for 180 on another site just a few months ago. It now costs as much as what I paid for the RAM, CPU, and mobo combined. Dodged a bullet by not waiting for the Black Friday "deals".