MARK SURMAN, PRESIDENT, MOZILLA: Keeping the internet, and the content that makes it a vital and vibrant part of our global society, free and accessible has …

  • nxn@biglemmowski.win · 3 months ago

    > To keep a modicum of privacy and openness, the web is de facto dependent on Firefox continuing to exist in the medium term. And it has to be paid for somehow.

    The web today has no privacy or openness. It has Gmail accounts, Russian propaganda bots, and AI-generated SEO article spam. Does it matter which rose-tinted browser you view and interact with this shit through? I’m approaching 40, and the web has been such a fundamental part of my life that I’m sometimes bewildered about what I’d do without it. It’s a sinking ship, though, and at this point I’m much more interested in seeing alternatives to HTTP than in trying to save the mess we built on top of it.

    • JubilantJaguar@lemmy.world · 3 months ago

      This analysis strikes me as a nice mix of cynicism and revolutionary thinking. In my own reading of history, cynicism has never achieved anything except worsening what it claims to hate. As for revolutions, most never even happen, and those that do achieve nothing but heartache and backlash. The only way forward that actually works is the slow one: one step at a time, building on what you have.

      • nxn@biglemmowski.win · 3 months ago

        OK, let’s try to narrow this down so our exchanges aren’t vague. To me, going from propellers to jet engines would have been “revolutionary”, but to you it may have just been an incremental extension of the concept of a wing keeping aircraft aloft.

        So, for clarity: I’m not suggesting a complete replacement for HTTP. I don’t envision a world where the web as we know it gets fully “replaced”. But I do think it has outlived its purpose, and ultimately we should be seeking a better protocol for information exchange. In other words, I don’t think a solution that can provide privacy, integrity, and so on should be restricted to being built on HTTP just because HTTP is what we essentially consider the web to be today.

        • JubilantJaguar@lemmy.world · 3 months ago

          Fair points. Talking of revolution was indeed a bit vague.

          Perhaps I am just more conservative in temperament. I focus on the value of keeping things and improving them. Software lends itself to iterative development whose result can still end up being revolutionary. So my intuition is that if there’s a problem with HTTP, let’s solve that problem rather than throwing the whole thing out and losing all its accrued value: in this case, three decades of web archives and the accumulated skills of all the people who make it work.

          Sure, HTTP is suboptimal, and as a sometime web developer I can see that HTML is verbose and ugly, a legacy of the SGML fashion of its day. Even the domain name system suffers from original sin: the TLDs should come first, not last!

          Human culture is messy. Throwing things out is risky and even reckless given that the alternative is all but certain not to work out as imagined. Much safer to build upon and improve things than to destroy them.

          • nxn@biglemmowski.win · 2 months ago

            It’s one month later and I am back to reply:

            I don’t want to replace HTTP, or the web. But I also absolutely don’t want to build anything more complex than what we have today. In other words: keep the web for what it’s doing now, but an isolated, app/container-based platform served efficiently through a browser might just be a good thing for everyone?

            Five years ago I was writing Rust code compiled to WebAssembly and then struggling to get it to run in a browser. I did that because I couldn’t write an efficient enough version of the algorithm I was implementing in JavaScript, probably on account of most things in JS being heap-allocated objects. I eventually got it running with decent performance, but gluing all that mess together wasn’t fun. I think if there were a better delivery platform for WASM built into browsers, and maybe eventually mobile platforms, it would beat today’s approach of serving cross-platform apps over HTTP.
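
            Something like this, roughly, on the browser side (the module and function names are invented for illustration, not my actual code):

            ```typescript
            // Sketch of loading a Rust-compiled WASM module in the browser.
            // The Rust side would export `fib` via wasm-bindgen or a plain
            // extern "C" function built for the wasm32 target.
            async function runWasm(): Promise<number> {
              // instantiateStreaming compiles the module while it downloads;
              // the server must send Content-Type: application/wasm.
              const { instance } = await WebAssembly.instantiateStreaming(
                fetch("/algorithm.wasm"),
                {} // import object; empty for a self-contained numeric module
              );
              // Exports are plain functions over numbers and linear memory,
              // so a hot loop here skips JS’s everything-is-an-object cost.
              const fib = instance.exports.fib as (n: number) => number;
              return fib(40);
            }

            runWasm().then((n) => console.log("fib(40) =", n));
            ```

            That glue is exactly the part that wasn’t fun: marshalling anything richer than numbers across that boundary is where the mess lives.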

            • JubilantJaguar@lemmy.world · 2 months ago

              This seems to be the argument that the web was designed for documents and that we should stop trying to shoehorn apps into documents. Hard to disagree at this point, especially when the app in question is, say, a graphics tool or a game. I still think that, for more document-adjacent applications, a website implemented with best-practice progressive enhancement is about as elegant a solution as is imaginable. Basically: an app which can gracefully degrade to a stateless document, and metamorphose back into an app, depending on system resources and connectivity, all of it open source, open standards, and accessible. That was IMO the promise of the web fulfilled: the separation of content from presentation, and of presentation from functionality. Unfortunately only a tiny minority of websites ever achieved this. Hardly any web developers had the deep skill set needed to pull it off.
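
              As a sketch of the degrade-and-enhance pattern (the data-enhance hook and the #result region here are invented):

              ```typescript
              // Progressive enhancement: the form works as a plain document
              // round-trip with zero script; if script runs, it upgrades the
              // same markup to an in-page update.
              document
                .querySelectorAll<HTMLFormElement>("form[data-enhance]")
                .forEach((form) => {
                  form.addEventListener("submit", async (event) => {
                    event.preventDefault(); // reached only when JS actually loaded
                    const response = await fetch(form.action, {
                      method: "POST",
                      body: new FormData(form),
                    });
                    // Swap in just the result fragment; without JS the server
                    // renders the same content as a full page, so nothing is lost.
                    const target = document.querySelector("#result");
                    if (target) target.innerHTML = await response.text();
                  });
                });
              ```

              The content and its meaning live in the markup; the script is an optional layer on top.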

              I was once skeptical about WASM on the grounds that it’s effectively closed-source software, tantamount to DRM. But people reply that functionally there’s not much difference between WASM and a blob of minified JS, and that the WASM sandbox can be locked down. So I guess I accept that WASM is now the best the web can hope for.

              • nxn@biglemmowski.win · 2 months ago

                > Hardly any web developers had the deep skill set needed to pull it off.

                I’m personally of the opinion that it wasn’t so much a lack of talent that kept graceful fallback from being adopted as simply the amount of extra effort needed to implement it properly.

                To do it properly, in my opinion, you can’t make any assumptions about the browser your app is running on; you should never base anything on the reported user-agent string. Instead, you need to test for each individual JavaScript, HTML, or sometimes even CSS feature, and design the experience around a fallback for when that one piece of functionality isn’t present. Otherwise you create a brand-new problem where, for example, a forked Firefox with a custom user-agent string doesn’t get recognized despite having the feature set for the full experience, and that user gets screwed over.
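
                Roughly this kind of thing, feature by feature (lazy-loading images is just an illustrative example):

                ```typescript
                // Capability testing, not user-agent sniffing: probe for the
                // actual API, so a forked Firefox with a custom UA string
                // passes like any other browser that has the feature.
                function lazyLoad(images: NodeListOf<HTMLImageElement>): void {
                  if ("IntersectionObserver" in window) {
                    const observer = new IntersectionObserver((entries) => {
                      for (const entry of entries) {
                        if (!entry.isIntersecting) continue;
                        const img = entry.target as HTMLImageElement;
                        img.src = img.dataset.src ?? img.src;
                        observer.unobserve(img);
                      }
                    });
                    images.forEach((img) => observer.observe(img));
                  } else {
                    // Fallback for this one missing feature: load eagerly.
                    // The page still works; it’s just less efficient.
                    images.forEach((img) => { img.src = img.dataset.src ?? img.src; });
                  }
                }

                // CSS gets the same per-feature treatment, e.g. checking
                // CSS.supports("display", "grid") before relying on grid layout.
                ```

                Multiply that by every feature you rely on and you can see where the time goes.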

                But yeah, that approach is incredibly cumbersome and time-consuming to code and test. Even with libraries that help detect a browser’s capabilities, you still need to implement granular fallbacks that work for your particular application, and that’s a lot of extra work.

                Add to that the fact that devs in this field are already burdened with supporting layouts that must scale responsively to everything from a phone screen to a 100-inch TV, and it quickly becomes nearly impossible to finish any project on a realistic timeline. Doing it that way is a monumental undertaking, and realistically it mainly benefits people who use NoScript or similar, so not a lot of people.

                • JubilantJaguar@lemmy.world · 2 months ago

                  Actually, it doesn’t just benefit people who use NoScript. The original audience for accessibility was disabled users, which is why some of the best websites ever made are for government agencies. But sure, those users don’t count for much when there’s a deadline to keep. I know what you’re talking about; I know that progressive enhancement, respecting WCAG, and the rest are time-consuming, and time is money. I was in the meetings.

                  But it’s also just hard, for the reasons you describe, and few developers have ever been able to do it. Maybe precisely because the skill set straddles domains: not just programming but also UX, graphic design, and information architecture. The first web developers were tinkerers, and many came from the world of print. Now they’re all just IT guys who see everything as an app, even when it’s in essence a document.