Polyfill.js is a popular open-source library for supporting older browsers; 100K+ sites embed it via the cdn.polyfill.io domain. Notable users include JSTOR, Intuit, and the World Economic Forum. In February this year, however, a Chinese company bought the domain and the GitHub account. Since then, the domain has been caught injecting malware onto mobile devices via any site that embeds cdn.polyfill.io. Complaints were quickly removed (archive here) from the GitHub repository.
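For context, the typical integration was a single hotlinked script tag, something like this (an illustrative sketch; the exact features query string varied per site):

```html
<!-- Illustrative embed of the CDN-hosted polyfill service.
     Whatever this URL serves runs with full access to the page. -->
<script src="https://cdn.polyfill.io/v3/polyfill.min.js?features=default"></script>
```

Whoever controls that domain controls code running on every embedding page, which is what made the sale of the domain such a problem.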
NoScript would fix this issue… Deny most of that shit and the internet still works… Mostly
Not a solution. Much of the modern web is reliant on JavaScript to function.
NoScript made sense when the web was pages with superfluous scripts that enhanced what was already there.
Much of the modern web is web apps that fundamentally break without JS. And picking and choosing unfortunately won’t generally protect you from this, because it’s common practice to use a bundler such as webpack to keep page weight down. A library like this will have been pulled in as a dependency in many projects, and the site either works or doesn’t based on the presence of the whole bundle.
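To make the bundling point concrete, here’s a hypothetical webpack-style entry point (core-js stands in for any polyfill dependency; `./app` and `renderApp` are made up for illustration):

```js
// Hypothetical entry point compiled by a bundler such as webpack.
// The polyfill is inlined into the same output bundle as the app
// code, so there is no separate script a user could selectively block.
import 'core-js/stable';            // polyfill dependency, baked in at build time
import { renderApp } from './app';  // hypothetical application module

renderApp(document.getElementById('root'));
```

Block the resulting bundle and the polyfills and the app disappear together.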
Not saying this is a great situation or anything, but suggesting NoScript as a solution is increasingly anachronistic.
This wasn’t bundled. People inserted a script tag pointing to a third-party CDN onto their sites. The output changes depending on the browser (it only loads the polyfills needed for the current browser) so you can’t even use a subresource integrity hash.
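For anyone unfamiliar, subresource integrity pins a script to a hash of its exact bytes, roughly like this (the hash value below is a placeholder):

```html
<!-- The browser refuses to execute the script if its bytes don't
     hash to the declared value. Because polyfill.io served different
     output per browser, no single hash could ever match. -->
<script src="https://cdn.polyfill.io/v3/polyfill.min.js?features=default"
        integrity="sha384-PLACEHOLDERPLACEHOLDERPLACEHOLDERPLACEHOLDER"
        crossorigin="anonymous"></script>
```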
“function” is doing a lot of lifting there. Trackers, ads, and assorted other bullshit is not the kind of functioning anyone needs.
It’s true the average user gets flummoxed quickly when the scripts are blocked, but they can either sink (eat ads and trackers) or swim (learn what scripts to allow). (Spoiler: they almost always sink)
And much of it works better and faster without JavaScript. Some sites don’t work with NoScript, but most sites run faster and work well enough.
I only allow JS on a whitelist.
A whitelist wouldn’t mitigate this issue entirely, due to bundling.
In this case the script wasn’t bundled at all: it was hotlinked from a third-party CDN. Adding malicious code instantly affects every site that loads it.
The output differs depending on browser (it only loads the polyfills your browser needs) so it’s incompatible with subresource integrity.
Imo, computing, like all other things, requires a little trust and risk. The problem is most people are wayyy too trusting in general.
deleted by creator
Flash was orders of magnitude worse than the risk of JS today; it’s not even close.
Accessibility is orthogonal to JavaScript if the site is being built to modern standards.
Unfortunately, preference is not reality: the modern web uses JavaScript, and NoScript is not an effective enough solution.
deleted by creator
Flash ran as a browser plugin (as in, not an extension, but a native binary installed into the OS that runs beside the browser; we basically don’t do this for anything now).
Flash was on security bulletins pretty much weekly in its final years; arbitrary code execution and privilege escalation exploits were common. That’s why Adobe killed it.
Flash was never safe and comparing JavaScript to it as a greater risk shows you’ve not fully understood the threat model of at least one of the two.
deleted by creator
That’s literally the one main somewhat valid use case for plugins, and it’s basically because of DRM. A plugin that allows arbitrary code to run is a security nightmare, that’s why we don’t do it anymore.
A lot of the security features you describe were added by browser vendors late in the game because of how much of a security nightmare Flash was. I was building web software back when this was all happening, so I know first hand. People actually got pissy when browsers blocked the ability for Flash to run without consent and to access things like the clipboard. I even seem to remember a hacky way of getting at the filesystem in Flash via the file upload mechanism, but I can’t remember the specifics, as this was obviously getting close to two decades ago now.
Your legitimate concerns about JavaScript are blockable by the browser.
Flash was a big component of something called the evercookie, one of the things that led to stuff like GDPR because of how permanently trackable it made people. Modern JavaScript tracking is (quite rightfully) incredibly limited compared to what was possible with Flash around. You could track users across browsers, FFS.
You’re starting to look like you don’t know what you’re talking about here.
deleted by creator
deleted by creator
Well, by that measure, you don’t need JavaScript to make inaccessible sites; there are plenty of sites out there that ruin accessibility with just HTML and CSS alone.
It’s always up to the developer to make sure the site is accessible. At least now it seems to be something that increasingly matters to search result rankings.
deleted by creator
100% agree. A super-fast, text-only internet layer is approved.
That load-bearing “mostly” is doing a lot of work here.
I invite everybody to find out how everything “mostly” works if you disable “most of” JavaScript – also have fun deciding which parts to enable because you think they’re trustworthy.
I actively do this with uMatrix - granted, I only block non-first-party JavaScript. Most sites I visit only require a few domains to be enabled to function. The ones that don’t are mostly ad-riddled news sites.
There are a few exceptions to this (AWS and Atlassian come to mind), but the majority of what I see on the internet does actually work more or less fine when you block non-first-party JavaScript, and some of it even works with JavaScript blocked entirely. uMatrix also has handy built-in bundles for certain things, like sites that embed YouTube, which make this much easier.
Blocking non-first-party scripts like I do does actually solve this issue for the most part, since, according to the article, only scripts loaded from the cdn.polyfill.io domain itself were the problem.
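For reference, the uMatrix rules for that policy are tiny; from memory, something like this (treat the exact syntax as a sketch):

```
* * script block
* 1st-party script allow
```

That blocks script requests everywhere by default and then re-allows them only when the script’s domain matches the site you’re on.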
You’re still trusting that the first-party JavaScript won’t be vulnerable to supply chain attacks, though.
In my experience, first-party JavaScript tends to be updated so rarely that stale bugs and exploits are a bigger risk than supply chain attacks. If I heard about NPM getting attacked as often as I hear about CDNs getting attacked, I’d be more concerned.
Funny that they want you to allow all JavaScript but then criticise first-party scripts for being unsafe.
I bet [insert random autocrat here] would approve of that message.
deleted by creator
Yeah, it took me about that long to get my regular websites working right too. And then I had to reinstall for unrelated reasons and all that customisation was gone.
You can back it up, though; and at least once you’ve suffered the loss a few times, you can get it 90% of the way back on the first re-visit after a reinstall.
deleted by creator
Having done this for many, many years, I can tell you: if you allow the site’s own scripts (which is an acknowledgement of JS, at least) plus a few “big” ones like ajax.googleapis.com, jquery.com, ytimg.com, etc., you’re then left with a smaller subset of annoying-but-necessary-for-individual-websites scripts that you can enable as needed, or just add as trusted if you’re into that kind of thing.
After that you have the utter garbage sites with 30 scripts of tracking, data-sucking bullshit (CNN, looking at you), and to those sites I have said “Thou shalt bite my shiny metal ass” and I just don’t go there.
It’s a concession to JS, yes, but it’s also not free rein to trample all over the surfing experience. Totally worth the time to work out.