I highly recommend disabling JavaScript by default in your browser and then whitelisting the websites you use frequently that need JavaScript to function.
The privacy benefit is that when you read articles online or visit new websites, most of the time the page will not need JavaScript to function, which stops a lot of ads and tracking scripts from loading.
The security benefit is massive. First, if you visit a bad link that contains malware dependent on JavaScript, it will not work. Second, if you visit a link for a service you use and JavaScript does not work there, you can tell in real time that it is a fake page and not the real website you intended to visit.
Bonus tip: try to replace the websites you need that can’t work without JavaScript with JavaScript-free websites or open source apps.
Disclaimer: Stay cautious. This recommendation will improve your privacy and security, but it does not protect you from everything.
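If you use uBlock Origin (it comes up further down this thread), one way to implement this is with its no-scripting switch in the Dashboard → “My rules” pane. A minimal sketch, assuming the first rule turns JavaScript off everywhere and the other two re-enable it only on trusted sites (example.com and mybank.example are placeholders for whatever you actually whitelist, not real recommendations):

```
no-scripting: * true
no-scripting: example.com false
no-scripting: mybank.example false
```

In practice you add another “false” line each time a site you care about breaks, which is exactly the whitelist-growing chore discussed below.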
You’re suggesting a whitelisting approach, which I used for a long time. But in the end I got so tired of having to enable JavaScript site by site, because most websites would otherwise be broken, and I was really only interested in blocking it on specific webpages. So I ended up with a blacklisting approach, which I recommend to keep some sanity, but that’s my opinion :)
15-20 years ago, I’d have agreed with you. But apart from a select few news sites and exceedingly rare static sites, what percentage of websites most users use day to day actually function even minimally without JavaScript?
I’m convinced that in practice, most users would be conditioned to whitelist pretty much every site they visit due to all the breakage. Still a privacy and security improvement, but a massive one? I’m not sure.
Very happy to be convinced otherwise.
Yep, software dev here for a static marketing site for a product. We are in a constant battle with PMs and SEO who want Google tracking, Facebook, TikTok, A/B testing, cursor tracking, etc. We’re trying to keep page-speeds fast while loading megabytes of external JS code…
Luckily all that can be blocked by the end user without affecting the site though, all you’d lose is some carousels and accordions etc that are done with a few lines of code.
It’s incredibly annoying, but it gets easier over time as you fill out your whitelist.
One of the big advantages to something like NoScript is that it lets you enable scripts only from certain domains. So you can enable the functionally-required scripts while still blocking other scripts.
But yes, it’s a giant pain in the ass. It’s absurd that the web has devolved into such a state.
Tried it and can confirm: almost every webpage, even static ones that could be as simple as a rock, needs a truckload of bloated JS code loaded from external servers.
I agree that most websites don’t load without JavaScript, but you don’t need seven or more different domains with JavaScript allowed for the main site to work. Most sites load their own domain, plus six Google domains including Tag Manager, Facebook, etc. I whitelist the website itself and leave the analytics and tracking domains off.
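In uBlock Origin this per-domain split can be written as dynamic filtering rules in the same “My rules” pane; a rough sketch, where news.example and cdn.news.example are hypothetical stand-ins for a real site and its own asset domain:

```
* * 3p-script block
* * 3p-frame block
news.example cdn.news.example * noop
* googletagmanager.com * block
* facebook.net * block
```

The first two lines block third-party scripts and frames everywhere, the noop line lets the site’s own CDN through only on that site, and the last two keep the tracking domains blocked no matter where they appear.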
You’d end up whitelisting so many sites that it makes this approach worthless in my opinion.
Instead I’ve settled on blocking scripts by default and whitelisting subdomains until the site works. It does require more time and effort, but it’s probably the only way to meaningfully block at least some of a site’s JavaScript, apart from just not using that website.
Depending on how exactly you do this, you’ll end up with a huge filter list. Mine in uBlock Origin is 245 kB when exported.
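If you go the static-filter route rather than dynamic rules, the same default-deny-then-repair idea can be expressed as uBlock Origin static filters; a minimal sketch with hypothetical domains, not taken from any real list:

```
! Block third-party scripts everywhere by default
*$script,3p
! Per-site exceptions added as things break (hypothetical domains)
@@||cdn.shop-assets.example^$script,domain=shop.example
@@||static.news.example^$script,domain=news.example
```

Every site you repair adds another @@ exception line, which is presumably how a list like this grows to hundreds of kilobytes over time.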