Some article websites (I’m looking at msn.com right now, as an example) show the first page or so of article content and then a “Continue Reading” button, which you must click to see the rest of the article. This seems ridiculous from a UX perspective: I know how to scroll down to continue reading, so why hide the text, make me click a button, and then have me scroll anyway? Why has this become such a common practice?

  • anothermember@lemmy.zip · 11 months ago

    Page load: the biggest, and I mean biggest, reason someone leaves a page is load speed. If you’re deep in researching something and the page takes more than about 3 seconds to load, you will leave the site, regardless of how fast your connection is or whether the fault is on your side. Loading only the first quarter of the page helps with this, along with other tricks like caching at the CDN and lazy loading.
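
    Roughly what that pattern looks like, as a minimal sketch (the endpoint, IDs, and markup here are made up for illustration):

    ```html
    <article>
      <div id="article-intro">
        <p>The first few paragraphs ship with the initial HTML…</p>
        <img src="hero.jpg" alt="" loading="lazy"> <!-- native lazy loading -->
      </div>
      <button id="continue-reading">Continue Reading</button>
      <div id="article-rest" hidden></div>

      <script>
        const btn = document.getElementById('continue-reading');
        btn.addEventListener('click', async () => {
          // The browser only pays for the rest of the article (and whatever
          // ad/analytics markup rides along with it) after the click.
          const res = await fetch('/article/123/rest'); // hypothetical endpoint
          const rest = document.getElementById('article-rest');
          rest.innerHTML = await res.text();
          rest.hidden = false;
          btn.remove();
        });
      </script>
    </article>
    ```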

    The thing that always bothers me about this is that I’ve been using the internet since the days of 90s dial-up, and even 90s dial-up never had a “page load speed” problem with text-based articles. Saying modern broadband is 1000x faster is an extremely conservative estimate (a 56k modem versus a 100 Mbit/s connection is closer to 1,800x), so “page load speed” is entirely about the design of the website, and mostly the excuse seems to be “we want to spy on people”. Am I wrong? Otherwise, why not write an HTML page that would be just as at home on Geocities as it is now?
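
    A text article genuinely needs almost nothing; a sketch like this would render fine in a 90s browser and a modern one:

    ```html
    <!DOCTYPE html>
    <html>
      <head>
        <meta charset="utf-8">
        <title>Article Title</title>
      </head>
      <body>
        <h1>Article Title</h1>
        <p>Paragraph one…</p>
        <p>Paragraph two…</p>
      </body>
    </html>
    ```

    A few kilobytes of that transfers in under a second even on dial-up; nothing about the text itself requires megabytes of scripts to display.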

    • jas0n@lemmy.world · 11 months ago

      You can still write plain HTML websites, and they would be super fast! But that’s not how we do things, dammit! I need to implement feature x. Do I spend all day rolling my own lean version? Fuck no. I download a 5-ton JavaScript library that already has that feature, and I fuck off the rest of the day.
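
      For instance, the hand-rolled version of a typical “feature x”, say debouncing an input handler, is a few lines of vanilla JS (searchInput and handleSearch are hypothetical names):

      ```js
      // Call fn only after events stop arriving for delayMs milliseconds.
      function debounce(fn, delayMs) {
        let timer;
        return function (...args) {
          clearTimeout(timer);
          timer = setTimeout(() => fn.apply(this, args), delayMs);
        };
      }

      // Usage: run the search only after the user pauses typing for 300 ms.
      searchInput.addEventListener('input', debounce(handleSearch, 300));
      ```

      The utility library that ships the same thing ships hundreds of other functions along with it.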

      You are right about one thing: the math does not add up at all.

      The root cause is the current meta of software development: bloat. Software is so ungodly bloated today because we’ve been taught, for as long as I can remember, that hardware is so fast we don’t need to care about performance. Because of that mindset, many of the best practices we were taught work directly against performance (OOP was a mistake. Fight me).

      There might be overhead from the ad-tracking bullshit… sure. But if developers cared about performance, that ad tracking could be fast, too ;]

      How long should it really take to render a webpage? It should be near instant. If modern games can render a full 3D landscape over 100 times a second, on a budget of under 10 ms per frame, surely a wall of text and some images can be done in well under 1 second, right?

      This is a problem in all software. For a simple example, I remember Microsoft Word from 20 years ago being quite snappy on the desktops of the time, and by comparison we are running supercomputers today: a cheap Android phone would blow that desktop out of the water. Yet, somehow, Word is a dog now…