This is the direction we should be moving in.
Bravo.
I’d love to see an extension that, instead of removing the client ID tracking information, randomizes it, and does so when you enter a link, too. Removing tracking parameters only works if it happens 100% of the time: a single link clicked while signed into Google or whatever on another browser can be enough to establish a connection between you and the friend who sent the link. If I show up as clicking one link from Bob and 9 links from null, I’m still connected to Bob. But if my 10 links are from Bob, Jane, Alice, Fheism, Bggur, Daxi8, Michelle, Sssssssssss, Mgke7d, and BRomgi, good luck targeting any ads with all that noise. Especially if the systematically replaced client IDs are recycled within the addon’s database and end up creating ghost profiles on the advertisers’ end.
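Something like that seems pretty buildable, too. Here’s a minimal sketch of the randomizing idea in Python (the parameter names and the ID alphabet are just my assumptions for illustration, not what any real extension uses):

```python
import random
import string
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Hypothetical list of per-click ID parameters to randomize instead of
# strip; a real extension would maintain a much larger, curated list.
ID_PARAMS = {"fbclid", "gclid", "igshid", "mc_eid"}

def randomize_tracking_ids(url: str) -> str:
    """Replace known click-ID values with random noise of the same
    length, so the link still resolves but feeds the tracker garbage."""
    parts = urlsplit(url)
    query = []
    for key, value in parse_qsl(parts.query, keep_blank_values=True):
        if key.lower() in ID_PARAMS:
            alphabet = string.ascii_letters + string.digits + "-_"
            value = "".join(random.choices(alphabet, k=len(value)))
        query.append((key, value))
    return urlunsplit(parts._replace(query=urlencode(query)))

print(randomize_tracking_ids(
    "https://example.com/article?id=42&fbclid=IwAR2xYz123"))
```

Recycling IDs from a shared pool, like you describe, would just mean drawing the replacement value from the addon’s database instead of fresh randomness.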
The only thing I’d still like to find in this menu is a “Find” feature, where the Find function opens with the selected text as input. You can’t imagine how many times I’ve expected that function to be there and was disappointed.
Ctrl+C, Ctrl+F, Ctrl+V?
Ctrl+F with text selected also works.
Browsers should be designed from the start for the benefit of the users. There are too many “features” that only benefit the server owners. It’s been this way for a long time. Like the “Referer” header. Old as dirt, but how do I benefit from telling a server what page I was visiting beforehand?
Yeah, this has been my thought ever since the first time I saw javascript that disabled right click. There were workarounds, but I thought the browser shouldn’t even try to cooperate with something like that.
These days, it’s sites that disable scrolling. Sure, I can fuck with the CSS to enable it again, but it should be a menu option, or just not something that can be disabled in CSS or via JS.
Same goes for all the other metrics browsers send in the headers. Even though it’s nice to see stats on OS use and such, why does a server need to know which one I’m using? Even screen resolution: you could argue it affects how the page should be rendered, but I’d prefer a standardized bug-reporting mechanism for letting webmasters know when their page is broken at a certain resolution, and otherwise let the browser handle the rendering. And if that means the death of sites that cover their page in images to make a fancy layout, tbh good riddance.
It’s great to see, but ClearURLs works better for me so I use that more.
It would be even better if it would do that automatically when hitting Ctrl+C in the address bar.
Hey, did you know you can do that in a clipboard manager?
https://ditto-cp.sourceforge.io/
You click the clip with the URL, then Special Paste -> Remove URL trackers.
Demonstration
I’m using Linux
I’ve been told Linux has equivalent software for Windows software, so you should be good to go.
That may be, but judging by your screenshot, this software doesn’t really make this feature any easier to use.
Actually, I didn’t know this until just now, but there is a way to assign keyboard shortcuts to Special Paste scripts.
For example: https://bruh-clips.com/clips/a95746d5
I just press the Ditto key (CTRL+`), then press h.
As someone who cares about privacy but has the intelligence of dirt, what does this do?
URLs can have tracking/surveillance/data-harvesting codes appended to them; this option removes those. For example, https://example.com/article?fbclid=IwAR123 becomes just https://example.com/article.
Does this work?
I wish it were a bit more aggressive. For that I have the additional ClearURLs plugin (as linked by someone else in the replies, BTW), but I don’t know if it’s actually better or not.
Never cleans it the way I want
I assume this strips the utm parameters?
Yeah, and a few other such parameters where it’s pretty clear that they’re only used for tracking.
I don’t think it strips UTM parameters specifically - I think it’s limited to parameters that track you individually?
Hmm, apparently this is the list of parameters it strips: https://searchfox.org/mozilla-release/source/toolkit/components/antitracking/StripOnShareLists/LGPL/StripOnShareLGPL.json
So, yeah, it’s only some of the UTM parameters (+ some non-UTM parameters).
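For anyone wondering what the mechanics look like: the stripping itself is just a query-string filter over that list. A rough Python sketch (the blocklist here is a hand-picked handful of click-ID parameters for illustration; the authoritative list is the JSON file linked above):

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Illustrative subset of individually-identifying parameters; the
# real list is the StripOnShareLGPL.json file linked above.
STRIP_PARAMS = {"gclid", "fbclid", "mc_eid"}

def strip_on_share(url: str) -> str:
    parts = urlsplit(url)
    kept = [(k, v)
            for k, v in parse_qsl(parts.query, keep_blank_values=True)
            if k.lower() not in STRIP_PARAMS]
    return urlunsplit(parts._replace(query=urlencode(kept)))

print(strip_on_share("https://example.com/?p=1&gclid=abc&utm_content=top_banner"))
# -> https://example.com/?p=1&utm_content=top_banner
# The generic utm_content survives; only the identifying IDs go.
```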
And you brought the receipts too :) Indeed, looks like it’s blocking the ones that might potentially identify you, not the more generic ones (like utm_content=top_banner or whatever).
It is pretty lame. I have a far more aggressive one written in Python that I use in a shell window. One of these days I want to rewrite it in JS and turn it into a Firefox extension.
It’s the bane of being built in: you don’t get an extension page where you can explain that the stripped link might not work anymore. You also can’t assume your users know of such a possibility, because any user can click this.
I guess there could be a workflow where it opens the URL in a new tab and asks you if it still works, but that’s also a good way to ensure your less techy users never press that button again…
Nah, the Firefox thing just seems to have one or two rules, like removing utm_whatever=something parameters. If you expand that to 10 or 20 rules, some of them site-specific, like cleaning ALL the parameters from eBay links and doing similar rewriting on Amazon links, removing gclid and fbclid from everything, retrieving the content of a few of the more common link shorteners and cleaning -that- up, etc., you can get a much less trashy experience with maybe one page of code. Adblock already does some of that with its site filtering. You don’t get everything, but a little bit goes a long way.
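To give a flavor of it, here’s a rough sketch of that kind of rule table (everything in it is illustrative: the domains, the ASIN pattern, and the shortener list are assumptions, not what my actual script or any extension ships):

```python
import re
import urllib.request
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

GLOBAL_STRIP = re.compile(r"^(utm_\w+|gclid|fbclid)$", re.I)
SHORTENERS = {"bit.ly", "t.co", "goo.gl"}  # illustrative subset

def clean(url: str) -> str:
    parts = urlsplit(url)
    host = parts.netloc.lower()

    # Resolve common shorteners first, then clean the destination.
    if host in SHORTENERS:
        with urllib.request.urlopen(url) as resp:  # follows redirects
            return clean(resp.geturl())

    # Site-specific rules: drop ALL parameters from eBay links, and
    # reduce Amazon product links to the bare /dp/<ASIN> path.
    if "ebay." in host:
        return urlunsplit(parts._replace(query=""))
    if "amazon." in host:
        m = re.search(r"/dp/(\w{10})", parts.path)
        if m:
            return urlunsplit(
                parts._replace(path=f"/dp/{m.group(1)}", query=""))

    # Global rule: strip utm_*, gclid and fbclid from everything else.
    kept = [(k, v)
            for k, v in parse_qsl(parts.query, keep_blank_values=True)
            if not GLOBAL_STRIP.match(k)]
    return urlunsplit(parts._replace(query=urlencode(kept)))
```

A handful of site rules plus one global blocklist really is about a page of code, and it covers most of the trash.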
Well, apparently this is the list of parameters it strips: https://searchfox.org/mozilla-release/source/toolkit/components/antitracking/StripOnShareLists/LGPL/StripOnShareLGPL.json
Thanks, that’s helpful. But there are a bunch of other important rewrites needed, like bypassing redirects.
That would be rad if you did.
That’s great! I have been using ClearURLs with varied success. Nice to see it built in.