This is the direction we should be moving in.

Bravo.

  • pory@lemmy.world · 6 hours ago

    I’d love to see an extension that, instead of removing the client ID tracking information, randomizes it - and does so when you paste a link in, too. Removing tracking parameters only helps if it happens 100% of the time: a single link clicked while signed into Google or whatever on another browser can be enough to establish a connection between you and the friend who sent the link. If I show up as clicking one link from Bob and 9 links from null, I’m still connected to Bob. But if my 10 links appear to come from Bob, Jane, Alice, Fheism, Bggur, Daxi8, Michelle, Sssssssssss, Mgke7d, and BRomgi, good luck targeting any ads with all that noise. Especially if the systematically replaced client IDs are recycled within the addon’s database and end up creating ghost profiles on the advertisers’ end.
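
    Something like this minimal Python sketch of the idea (the parameter names, the pool size, and the dict standing in for the add-on's database are all assumptions for illustration; a real add-on would do this in WebExtension JavaScript):

        # Hypothetical sketch: swap known tracking IDs for recycled fake ones
        # instead of stripping them. Parameter names and pool size are guesses.
        import secrets
        from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

        TRACKING_PARAMS = {"gclid", "fbclid", "msclkid", "mc_eid"}
        _id_pool: dict[str, list[str]] = {}  # stands in for the add-on's database

        def _fake_id(param: str) -> str:
            pool = _id_pool.setdefault(param, [])
            if len(pool) < 8:                    # grow a small pool of fake IDs,
                pool.append(secrets.token_urlsafe(12))
            return secrets.choice(pool)          # then recycle entries from it,
                                                 # so the same ghost IDs recur

        def randomize_tracking(url: str) -> str:
            parts = urlsplit(url)
            query = [(k, _fake_id(k.lower()) if k.lower() in TRACKING_PARAMS else v)
                     for k, v in parse_qsl(parts.query, keep_blank_values=True)]
            return urlunsplit(parts._replace(query=urlencode(query)))

        print(randomize_tracking("https://example.com/item?id=42&gclid=abc123"))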

  • Matriks404@lemmy.world · 7 hours ago

    The only thing I’d like to find in this menu is a “Find” feature, where the Find function opens with the selected text as its input. You can’t imagine how many times I’ve expected that function to be there and been disappointed.

  • chromodynamic@piefed.social · 10 hours ago

    Browsers should be designed from the start for the benefit of the users. There are too many “features” that only benefit the server owners. It’s been this way for a long time. Like the “Referer” header. Old as dirt, but how do I benefit from telling a server what page I was visiting beforehand?

    • Buddahriffic@lemmy.world · 7 hours ago

      Yeah, this has been my thought ever since the first time I saw javascript that disabled right click. There were workarounds, but I thought the browser shouldn’t even try to cooperate with something like that.

      These days, it’s sites that disable scrolling. Sure, I can fuck with the css to enable it again, but it should be a menu option or just not something that can be disabled in css or via js.

      Same for all of the other metrics that browsers send in the headers. Even though it’s nice to see stats on OS use and such, why does a server need to know which one I’m using? Even for screen resolution, while you could say that it affects how the page should be rendered, I’d prefer a standardized bug reporting mechanism for letting webmasters know when their page is broken for a certain resolution and otherwise letting the browser handle the rendering. And if it means the death of sites that cover their page in images to make a fancy layout, tbh good riddance.

  • thingsiplay@beehaw.org · 21 hours ago

    I wish it was a bit more aggressive. For that I have an additional plugin, ClearURLs (as linked by someone else in the replies, BTW), but I don’t know if it’s actually better or not.

  • solrize@lemmy.ml · 20 hours ago

    It is pretty lame. I have a far more aggressive one written in Python that I use in a shell window. One of these days I want to rewrite it in JS and turn it into a Firefox extension.

    • Ephera@lemmy.ml · 18 hours ago

      It’s the bane of being built-in. You don’t have an extension page where you can explain to people that the link might not work anymore, and you certainly can’t assume your users know that’s a possibility, because any user can click this.

      I guess there could be a workflow where it opens the URL in a new tab and asks you whether it still works, but that’s also a good way to ensure your less techy users never press that button again…

      • solrize@lemmy.ml · 15 hours ago

        Nah, the Firefox thing just seems to have one or two rules, like removing utm_whatever=something parameters. If you expand that to 10 or 20 rules, some of them site-specific (cleaning ALL the parameters from eBay links and doing similar rewriting on Amazon links, removing gclid and fbclid from everything, retrieving the targets of a few of the more common link shorteners and cleaning -that- up, etc.), you can get a much less trashy experience with maybe 1 page of code. Adblock already does some of that with its site filtering. You don’t get everything, but a little bit goes a long way.
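
        Roughly the kind of rule table that amounts to, sketched in Python (the domains and parameter lists below are illustrative assumptions, not the actual script):

            # Illustrative rule table; domains and parameter names are guesses.
            from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

            GLOBAL_JUNK = {"gclid", "fbclid", "msclkid"}  # click IDs to drop everywhere

            def clean(url: str) -> str:
                parts = urlsplit(url)
                host = parts.netloc.lower()

                # Site-specific rule: eBay item links work with every parameter removed.
                if host.endswith("ebay.com"):
                    return urlunsplit(parts._replace(query=""))

                # Site-specific rule: Amazon links only need the /dp/ASIN path segment.
                if host.endswith("amazon.com") and "/dp/" in parts.path:
                    asin = parts.path.split("/dp/")[1].split("/")[0]
                    return urlunsplit(parts._replace(path=f"/dp/{asin}", query=""))

                # Generic rule: drop utm_* and known click identifiers from everything.
                # (Handling link shorteners would additionally need an HTTP request
                # to follow the redirect before cleaning the target URL.)
                kept = [(k, v) for k, v in parse_qsl(parts.query, keep_blank_values=True)
                        if k.lower() not in GLOBAL_JUNK
                        and not k.lower().startswith("utm_")]
                return urlunsplit(parts._replace(query=urlencode(kept)))

            print(clean("https://www.amazon.com/Thing/dp/B01ABCDE12/ref=sr_1_3?tag=foo"))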