The three pillars of JavaScript bloat

(43081j.com)

182 points | by onlyspaceghost 5 hours ago

27 comments

  • procaryote 24 minutes ago
    The most frustrating thing about the "Atomic architecture" pillar of tiny packages is how obviously stupid it is. Any borderline sane person should look at isOdd/isEven and see that it's an awful idea

    Instead they've elevated it to a cultural pillar and think they've come up with a great innovation. It's like talking to antivaxers

  • andai 3 hours ago
    Great article, but I think these are all marginal.

    The main cause of bloat is not polyfills or atomic packages. The cause of bloat is bloat!

    I love this quote by Antoine de Saint-Exupéry (author of the Little Prince):

    "Perfection is achieved, not when there is nothing left to add, but nothing to take away."

    Most software is not written like that. It's not asking "how can we make this more elegant?" It's asking "what's the easiest way to add more stuff?"

    The answer is `npm i more-stuff`.

    • sheept 42 minutes ago
      All software has bloat, but npm packages and web apps are notorious for it. Do you think it could be inherent to the language?

      JavaScript seems to be unique in that you want your code to work in browsers of the past and future—so a lot of bloat could come from compatibility, as mentioned in the article—and it's a language for UIs, so a lot of bloat in apps and frameworks could come from support for accessibility, internationalization, mobile, etc.

      • guax 13 minutes ago
        The problem JS development is facing is the same one most languages go through: the "magic" that solves all problems, frameworks and solutions that fix small issues at a great cost.

        Lots of developers don't even say they are JS devs but React devs or something. This is normal given that the bandwidth and power of target devices are so large nowadays. Software is like a gas: it will fill all the space you give it, since there is no reason to optimize anything if it runs OK.

        I've spent countless hours optimising javascript and css to work across devices that were slow and outdated but still relevant (IE7, 8 and 9 were rough years). Cleverness breeds in restrictive environments where you want to get the most out of them. Modern computers are so capable that it's hard to hit the walls when doing normal work.

      • lukan 34 minutes ago
        "Do you think it could be inherent to the language?"

        Not to the language but to its users. Not to bash them, but most of them did not study IT at a university, did not learn about the KISS principle, etc.

        They just followed some tutorials to hack together stuff, now automated via LLMs.

        So in a way the cause is the language as it is so easy to use. And the ecosystem grew organically from users like this - and yes, the ecosystem is full of bloat.

        (I think Claude nowadays is a bit smarter, but when building standalone HTML files without agents, I remember having to always tell ChatGPT explicitly NOT to pull in yet another library, but to use plain vanilla JS for a standard task, which usually works better and cleaner with the same number of lines, or maybe 2 or 3 more, in most cases. The default was to use libraries for every new bit of functionality.)

      • socalgal2 15 minutes ago
        Every cargo install (Rust) I've done downloads 300 to 700 packages.

        Every C++ app I install in linux requires 250 packages

        Every python app I install and then pip install requirements uses 150 packages.

      • bigstrat2003 34 minutes ago
        > All software has bloat, but npm packages and web apps are notorious for it. Do you think it could be inherent to the language?

        It sure seems like it is because JS devs, by and large, suck at programming. C has a pretty sparse standard library, but you don't see C programmers creating shared libraries to determine if a number is odd, or to add whitespace to a string.
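        For what it's worth, the packages in question really are one-liners. A sketch of what they boil down to (function names match the npm packages; the implementations here are my own, and the real packages add some input validation):

```javascript
// Roughly what is-odd, is-even, and left-pad amount to: small enough
// to write inline instead of adding three packages to the tree.
const isOdd = (n) => Math.abs(n) % 2 === 1;
const isEven = (n) => !isOdd(n);

// Padding a string has been a built-in since ES2017:
const padded = "42".padStart(5, "0"); // "00042"
```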

    • cwnyth 3 hours ago
      Cf. Vonnegut's rule #4 of good writing:

      > Every sentence must do one of two things—reveal character or advance the action.

      Or Quintilian's praise of Demosthenes and Cicero: "To Demosthenes nothing can be added, but from Cicero nothing can be taken away."

      • cobbzilla 2 hours ago
        Is there no room for describing the setting? Must every utterance that sets the atmosphere also advance the plot or reveal character? Is there no room for mood?
        • hombre_fatal 2 hours ago
          > Is there no room for describing the setting? Is there no room for mood?

          You mean the character of a place?

          • cobbzilla 1 hour ago
            sure, setting and character are the same thing
            • bryanrasmussen 24 minutes ago
              the implication is that if mood is the character of the place then those sentences that set mood are advancing character.
  • auxiliarymoose 3 hours ago
    I really think writing dependency-free JavaScript is the way to go nowadays. The standard library in JS/CSS is great. So are static analysis (TypeScript can check JSDoc), imports (ES modules), UI (web components), etc.

    People keep telling me the approach I am taking won't scale or will be hard to maintain, yet my experience has been that things stay simple and easy to change in a way I haven't experienced in dependency-heavy projects.

    • Maxion 1 hour ago
      Did this for a project in 2022. Haven't had any drama related to CVEs, haven't had any issues related to migrating from some version of something to another.

      The client has not had to pay a cent for any sort of migration work.

    • CoderLuii 2 hours ago
      been doing something similar. the projects ive been building recently use as few dependencies as possible and honestly the maintenance burden dropped significantly. when something breaks you actually know where to look instead of digging through 15 layers of node_modules. people said the same thing to me about it not scaling but the opposite turned out to be true.
    • anematode 2 hours ago
      This is absolutely the way to go
    • leptons 1 hour ago
      If I need a library for nodejs, the first thing I do is search for the dependency-free option. If I can make that work, great.
  • zdc1 4 hours ago
    A lot of this basically reads to me like hidden tech debt: people aren't updating their compilation targets to ESx, people aren't updating their packages, package authors aren't updating their implementations, etc.

    Ancient browser support is a thing, but ES5 has been supported everywhere for like 13 years now (as per https://caniuse.com/es5).

    • hrmtst93837 47 minutes ago
      The root issue is that the web rewards shipping now and fixing later, so old deps and conservative targets linger until stuff breaks.
    • anematode 4 hours ago
      The desire to keep things compatible with even ES6, let alone ES5 and before, is utterly bizarre to me. Then you see folks who unironically want to maintain compatibility with node 0.4, in 2025, and realize it could be way worse....

      Ironically, what often happens is that developers configure Babel to transpile their code to some ancient version, the output is bloated (and slower to execute, since passes like regenerator have a lot of overhead), and then the website doesn't even work on the putatively supported ancient browsers because of the use of recent CSS properties or JS features that can't be polyfilled.

      I've even had a case at work where a polyfill caused the program to break. iirc it was a shitty polyfill of the exponentiation operator ** that didn't handle BigInt inputs.
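      To illustrate the failure mode (a reconstruction of the kind of bug described, not the actual polyfill from that codebase): transpilers historically lowered ** to Math.pow, which coerces its arguments to Number and therefore throws on BigInt.

```javascript
// Native ** handles both Number and BigInt operands.
// A Math.pow-based lowering of ** silently loses the BigInt case.
function pow(base, exp) {
  return Math.pow(base, exp); // ToNumber coercion: throws TypeError on BigInt
}

console.log(2 ** 3);   // 8
console.log(2n ** 3n); // 8n
try {
  pow(2n, 3n);
} catch (e) {
  console.log(e instanceof TypeError); // true: "Cannot convert a BigInt..."
}
```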

      • Slothrop99 7 minutes ago
        Maybe I didn't look hard enough, but there's no obvious switch to "just turn off all the legacy stuff, thnx".

        Also, there has been a huge amount of churn on the tooling side, and if you have a legacy app, you probably don't wanna touch whatever build program was cool that year. I've got a react app which is almost 10 years old, there has to be tons of stuff which is even older.

      • Pxtl 2 hours ago
        It's just an excuse to not change things.
      • fragmede 4 hours ago
        Just how old an Android device in the developing world do you not want to support? Life's great at the forefront of technology, but there's a balancing act to be able to support older technology vs the bleeding edge.
        • anematode 4 hours ago
          I like the sentiment, but building a website that can actually function in that setting isn't a matter of mere polyfills. You need to cut out the insane bloat like React, Lottie, etc., and just write a simple website, at which point you don't really need polyfills anyway.

          In other words, if you're pulling in e.g. regenerator-runtime, you're already cutting out a substantial part of the users you're describing.

        • Dylan16807 3 hours ago
          A quick search tells me that firefox 143 from 6 months ago supported android 5 (Lollipop).

          So that's my cutoff.

        • dfabulich 3 hours ago
          Android phones update to the latest version of Chrome for 7 years. As long as you're using browser features that are Baseline: Widely Available, you'll be using features that were working on the latest browsers in 2023; those features will work on Android 7.0 Nougat phones, released in 2016.

          Android Studio has a nifty little tool that tells you what percentage of users are on what versions of Android. 99.2% of users are on Android 7 or later. I predict that next year, a similar percentage of users will be on Android 8 or later.

          • kennywinker 2 hours ago
            3.9 billion Android users means that 0.8% is 31 million people - and for a very small number of developers, most of their users will be from that slice. For most of them… yeah, go ahead and assume your audience is running a reasonably up-to-date OS
            • oflebbe 17 minutes ago
              Websites built with tons of polyfills are likely not run on these devices anyway: they will run out of RAM, or only load after some minutes because of CPU limitations, on top of often not loading at all because the x509 certs are outdated, and the bandwidth these devices support is not suited to multi-MB pages
      • hsbauauvhabzb 4 hours ago
        I’ve been very lost trying to understand the ecosystem between es versions , typescript and everything else. It ends up being a weird battle between seemingly unrelated things like require() vs import vs async when all I want to do is compile. All while I’m utterly confused by all the build tools, npm vs whatever other ones are out there, vite vs whatever other ones are out there, ‘oh babel? I’ve heard the name but no idea what it does’ ends up being my position on like 10 build packages.

        This isn’t the desire of people to build legacy support, it’s a broken, confusing and haphazard build system built on the corpses of other broken, confusing and haphazard build systems.

        • anematode 2 hours ago
          Honestly, Vite is all you need. :) It's super flexible compared to the status quo of require vs. import etc. For example, I recently wanted to ship a WASM binary along with the JS rather than making it a separate download (to avoid having to deal with the failure case of the JS code loading and the WASM not fetching). All I had to do was import `a.wasm?url` and it did the base64 embedding and loading automatically.
          • Maxion 1 hour ago
            This sentiment is all well and good, but when you end up in a new-to-you JS codebase with a list of deps longer than a Costco receipt, using some ancient Webpack with its config split into 5 or so files, then no-one is letting you upgrade to Vite unless the site is completely down.
        • CoderLuii 2 hours ago
          this is exactly where i landed too. i build docker images that bundle node tooling and every time i think i understand the build system something changes. require vs import, cjs vs esm, babel vs swc vs esbuild, then half your dependencies use one format and half use the other. the worst part is when you containerize it because now you need it all to work in a clean linux environment with no cached state and suddenly half the assumptions break.
        • conartist6 3 hours ago
          Yes, yes to all of that, but there is still hope.
          • hsbauauvhabzb 2 hours ago
            This fancy new build tool with emojis will fix it!
            • kennywinker 2 hours ago
              This fancy new vibe coded build tool with emojis
    • userbinator 4 hours ago
      The newer version is often even more bloated. This whole article just reinforces my opinion of "WTF is wrong with JS developers" in general: a lot of mostly mindless trendchasing and reinventing wheels by making them square. Meanwhile, I look back at what was possible 2 decades ago with very little JS and see just how far things have degraded.
      • michaelchisari 4 hours ago
        A standard library can help, but JS culture is not built in a way that lends itself to one the way a language like Go is.

        It would take a well-respected org pushing a standard library that has clear benefits over "package shopping."

      • halapro 1 hour ago
        > WTF is wrong with JS developers

        Don't confuse "one idiot who wants to support Node 0.4 in 2026" with "JS developers". Everybody hates this guy and he puts his hands into the most popular packages, introducing his junk dependencies everywhere.

        • saghm 44 minutes ago
          If everyone hates him and thinks his dependencies are junk, why would anyone let him introduce them to popular packages? Clearly there are at least some people who are indifferent enough if the dependencies are getting added elsewhere
        • userbinator 1 hour ago
          Then I wish there were more of these "idiots who want to support Node 0.4 in 2026". Maybe they're the ones with the common sense to value stability and backwards compatibility over constantly trendchasing the new and shiny and wanting to break what was previously working in the misguided name of "progress".
          • Griffinsauce 51 minutes ago
            You wouldn't if you looked more deeply at this. He doesn't push for simplicity but for horrible complexity, with an enormous stack of polyfills, ignoring language features that would greatly reduce all that bloat.
        • Maxion 1 hour ago
          The other problem is that this is a bit of a circular path: with deps being so crap and so numerous, upgrading existing old projects becomes a pain. There are A LOT of old projects out there that haven't been updated simply because the burden of doing so is so high.
    • hrmtst93837 1 hour ago
      [dead]
  • rtpg 3 hours ago
    I think on the first point, we have to start calling out authors of packages which (IMO) have built out these deptrees to their own subpackages basically entirely for the purpose of getting high download counts on their github account

    Like seriously... at 50 million downloads maybe you should vendor some shit in.

    Packages like this which have _7 lines of code_ should not exist! The metadata of the lockfile is bigger than the minified version of this code!

    At one point in the past like 5% of create-react-app's dep list was all from one author who had built out their own little depgraph in a library they controlled. That person also included download counts on their Github page. They have since "fixed" the main entrypoint to the rats nest though, thankfully.

    https://www.npmjs.com/package/has-symbols

    https://www.npmjs.com/package/is-string

    https://github.com/ljharb
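    For scale, is-string amounts to roughly the following (a sketch, not the package's exact source; the real package also defends against Symbol.toStringTag spoofing):

```javascript
// A cross-realm-aware string check, roughly what is-string provides.
// typeof catches primitives; the toString tag catches String objects
// created in another realm (iframe, vm context), where instanceof fails.
function isString(value) {
  if (typeof value === "string") return true;
  return Object.prototype.toString.call(value) === "[object String]";
}
```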

    • matheusmoreira 3 hours ago
      > entirely for the purpose of getting high download counts on their github account

      Is this an ego thing or are people actually reaping benefits from this?

      Anthropic recently offered free Claude to open source maintainers of repositories with over X stars or over Y downloads on npm. I suppose it is entirely possible that these download statistics translate into financial gain...

      • martijnvds 2 hours ago
        I've seen people brag about it in their resumes, so I assume it helps them find (better paying?) work.
      • stephenr 1 hour ago
        I'm completely apathetic about spicy autocomplete for coding tasks and even I wonder which terrible code would be worse.

        The guy who wrote is-even/is-odd was for ages using a needlessly obscure method that made it slower than % 2 === 0, because JS engines were optimising that but not his arcane bullshit.

    • technion 1 hour ago
      As usual, there's a cultural issue here. I know it's entirely possible to paste those seven lines of code into your app. And in many development cultures this will be considered a good thing.

      If you're working with Javascript people, this is referred to as "reinventing the wheel" or "rolling your own", or any variation of "this is against best practice".

      • saghm 59 minutes ago
        The point isn't that everyone necessarily needs to write the same code manually. It's that an author could easily combine the entire tree of seven-line packages into the one package that create-react-app uses directly. There's no reason to have a dozen or so package downloads, each with seven lines of code, instead of one that's still under a hundred lines; that's still a pretty small network request, and it's not like dead-code analysis to prune unused functions isn't a thing. If you somehow find yourself in a scenario where you would be happy to download seven lines of code but downloading a few dozen more would be an issue, that's when you might want to consider pasting the seven lines manually, but I honestly can't imagine when that would be.
      • rtpg 1 hour ago
        I think the fact that everyone cites the same is-number package when saying this is indicative of something though.

        Like I legit think that we are all imagining this cultural problem that's widespread. My claim (and I tried to do some graph theory stuff on this in the past and gave up) is that in fact we are seeing something downstream of a few "bad actors" who are going way too deep on this.

        I also dislike things like webpack making every plugin an external dep but at least I vaguely understand that.

        • serial_dev 1 hour ago
          Have you heard of the left pad incident?

          The problem is not imagined.

      • stephenr 1 hour ago
        The problem I think is that the js community somehow thinks that being on npm is some bastion of good quality.

        Just as the cloud is simply someone else's computer, a package is just someone else's reinvented wheel.

        The problem is half the wheels on npm are fucking square and apparently no one in the cult of JavaScript realises it.

    • CoderLuii 2 hours ago
      from a security perspective this is even worse than it looks. every one of those micro packages is an attack surface. we just saw the trivy supply chain get compromised today and thats a security tool. now imagine how easy it is to slip something into a 7 line package that nobody audits because "its just a utility." the download count incentive makes it actively dangerous because it encourages more packages not fewer.
    • h4ch1 3 hours ago
      I remember seeing this one guy who infiltrated some gh org, and then started adding his own packages to their dependencies or something to pad up his resume/star count.

      Really escapes me who it was.

    • hinkley 3 hours ago
      Hat tip to Sindre who has fifty bagillion packages but few of them depend on more than one of his other packages.
    • stephenr 2 hours ago
      As usual, he's copying someone else who's been doing this for years:

      https://www.npmjs.com/package/is-number - and then look and see shit like is odd, is even (yes two separate packages because who can possibly remember how to get/compare the negated value of a boolean??)

      Honestly, for how much attention JavaScript has gotten in the last 15 years, it's ridiculous how shit its type system really is.

      The only type related "improvement" was adding the class keyword because apparently the same people who don't understand "% 2" also don't understand prototypal inheritance.

      • zahlman 2 hours ago
        To be fair, prototypal inheritance is relatively uncommon language design. I'd rank it as considerably harder to understand than the % operator.
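        For the curious, here is the thing being called hard to understand: a minimal sketch of what the class keyword roughly desugars to.

```javascript
// What class roughly desugars to: constructor functions plus a
// prototype chain that property lookup walks at runtime.
function Animal(name) {
  this.name = name;
}
Animal.prototype.speak = function () {
  return this.name + " makes a sound";
};

function Dog(name) {
  Animal.call(this, name); // run the "superclass" constructor
}
// Dog instances delegate missing properties to Animal.prototype
Dog.prototype = Object.create(Animal.prototype);
Dog.prototype.constructor = Dog;

const rex = new Dog("Rex");
rex.speak(); // found via the prototype chain: "Rex makes a sound"
```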
        • stephenr 1 hour ago
          That's a good point, it's only been around for 30 years, and used on 95% of websites. It's not really popular enough for a developer to take an hour or two to read how it works.
          • saghm 52 minutes ago
            The word "used" is doing some heavy lifting there. Not all usage is equal, and the fact that it's involved under the hood isn't enough to imply anything significant. Subatomic physics is used by 100% of websites and has been around for billions of years, but that's not a reason to expect every web developer to have a working knowledge of electron fields.
  • burntoutgray 5 hours ago
    I have a single pillar, admittedly for in-house PWAs: Upgrade to the current version of Chrome then if your problem persists, we'll look into it.
    • GianFabien 1 hour ago
      Keeping it simple usually saves the day.
  • lerp-io 5 minutes ago
    just make React Native target the browser, and everything else that's a one-off can be AI generated
  • steveharing1 3 minutes ago
    So the guy who called JS a weird language was not wrong, huh?
  • SachitRafa 3 hours ago
    The cross-realm argument for packages like is-string is the one I find hardest to dismiss, but even there the math doesn't add up. The number of projects actually passing values across realms is tiny, and those projects should be the ones pulling in cross-realm-safe utilities, not every downstream consumer of every package that ever considered it.

    The deeper problem with Pillar 2 is that atomic packages made sense as a philosophical argument but broke down the moment npm made it trivially easy to publish. The incentive was "publish everything, let consumers pick what they need", but the reality is consumers never audit their trees; they just install and forget. So the cost that was supposed to be opt-in became opt-out by default.

    The ponyfill problem feels most tractable to me. A simple automated check ("does every LTS version of Node support this natively?") could catch most of these. The e18e CLI is a good start, but it still requires someone to run it intentionally. I wonder if something like a Renovate-style bot that opens PRs to remove outdated ponyfills would move the needle faster than waiting for maintainers to notice.
  • IAmLiterallyAB 2 hours ago
    For the old version support. Why not do some compile time #ifdef SUPPORT_ES3? That way library writers can support it and if the user doesn't need it they can disable it at compile time and all the legacy code will be removed
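    JS has no preprocessor, but bundlers approximate #ifdef with build-time constants plus dead-code elimination (e.g. esbuild's --define flag). A sketch of the idea; the __SUPPORT_ES3__ name is made up, and the constant is declared inline here so the snippet runs standalone:

```javascript
// In a real build, a bundler would substitute this constant
// (e.g. esbuild --define:__SUPPORT_ES3__=false) and then strip
// the unreachable legacy branch from the output entirely.
const __SUPPORT_ES3__ = false;

function assignDefaults(target, source) {
  if (__SUPPORT_ES3__) {
    // Legacy path for engines without Object.assign.
    for (var key in source) {
      if (Object.prototype.hasOwnProperty.call(source, key)) {
        target[key] = source[key];
      }
    }
    return target;
  }
  // Modern path: the only branch left after dead-code elimination.
  return Object.assign(target, source);
}
```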
    • Griffinsauce 36 minutes ago
      Two problems: - people would need to know how to effectively include dependencies in a way that allows them to be tree shaken, that's a fragile setup - polyfills often have quirks and extra behaviours (eg. the extra functions on early promise libraries come to mind ) that they start relying on, making the switch to build-in not so easy

      Also, how is this going to look over time with multiple ES versions?

    • ascorbic 29 minutes ago
      It'll still install the dependencies, which is what this is about
  • il-b 3 hours ago
    The elephants in the room are react and webpack.
  • grishka 1 hour ago
    Yes, of course the tiny packages cause some of the bloat. As mainly a Java developer being pretty paranoid about my dependency tree (I'm responsible for every byte of code I ship to my users, whether I wrote it or not), I'm always blown away by JS dependency trees. Why would you reach for a library for this three-line function? Just write it yourself, ffs.

    But the real cause of JS bloat is the so-called "front-end frameworks". Especially React.

    First of all, why would you want to abstract away the only platform your app runs on? What for? That just changes the shape of your code but it ends up doing the same thing as if you were calling browser APIs directly, just less efficiently.

    Second of all, what's this deal with mutating some model object, discarding the exact change that was made, and then making the "framework" diff the old object with the new one, call your code to render the "virtual DOM", then diff that, and only then update the real DOM tree? This is such an utterly bonkers idea to me. Like, you could just modify your real DOM straight from your networking code, you know?

    Seriously, I don't understand modern web development. Neither does this guy who spent an hour and some to try to figure out React from the first principles using much the same approach I myself apply to new technologies: https://www.youtube.com/watch?v=XAGCULPO_DE

    • ascorbic 15 minutes ago
      That's like asking "why would you use Swing when you can use Graphics2D". Sometimes you want something higher level. The DOM is great and very powerful, but when you're building a highly interactive web app you don't want to be manually mutating the DOM every time state changes.

      I am a core maintainer of Astro, which is largely based around the idea that you don't need to always reach for something like React and can mostly use the web platform. However even I will use something like React (or Solid or Svelte or Vue etc) if I need interactivity that goes beyond attaching some event listeners. I don't agree with all of its design decisions, but I can still see its value.

    • panstromek 30 minutes ago
      Yea, honestly you probably just don't understand. FE frameworks solve a specific problem and they don't make sense unless you understand that problem. That TSoding video is a prime example of that - it chooses a trivial instance of that problem and then acts like the whole problem space is trivial.

      To be fair, React is an especially wasteful way to solve that problem. If you want to look at the state of the art, something like Solid makes a lot more sense.

      It's much easier to appreciate that problem if you actually try to build complex interactive UI with vanilla JS (or something like jQuery). Once you have complex state dependency graph and DOM state to preserve between rerenders, it becomes pretty clear.

  • est 3 hours ago
    More like Node.js bloat rather than JS bloat.

    For personal projects I always prompt the AI to write JS directly, and never introduce the Node.js stack unless I absolutely have to.

    Turns out you don't always need Node.js/React to make a functional SPA.

    • kennywinker 2 hours ago
      You’ve traded supply chain vulnerability for slop vulnerability.
  • sheept 5 hours ago
    I wonder if this means there could be a faster npm install tool that pulls from a registry of small utility packages that can be replaced with modern JS features, and skips installing them.
    • seniorsassycat 4 hours ago
      Not sure about faster, but you could do something with overrides, especially pnpm overrides since they can be configured with plugins. Build a list of packages that can be replaced with modern stubs.

      It couldn't inline them, but it could replace ponyfills with wrappers around native impls and drop the fallback. It could provide simple modern implementations of is-string, and dedupe multiple major versions, tho that raises the question of what breaking change led to a new major version, and why.
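      The override idea in concrete form: a sketch of a consumer's package.json using pnpm's overrides field to swap micro-packages for modern stubs tree-wide (the stub package names here are hypothetical, and the npm: specifier is the standard way to alias one package to another):

```json
{
  "pnpm": {
    "overrides": {
      "is-string": "npm:@example/is-string-modern@^1.0.0",
      "object.assign": "npm:@example/object-assign-native@^1.0.0"
    }
  }
}
```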

  • turtleyacht 5 hours ago
    It would be interesting to extend this project where opt-in folks submit a "telemetry of diffs," to track how certain dependencies needed to be extended, adapted, or patched; those special cases would be incorporated as future features and new regression tests.

    Someday, packages may just be "utility-shaped holes" that are filled in and published on the fly. Package adoption could come from 80/20 agents [1] exploring these edges (security notwithstanding).

    However, as long as new packages inherit dependencies according to a human author's whims, that "voting" cycle has not yet been replaced.

    [1] https://news.ycombinator.com/item?id=47472694

  • skrtskrt 3 hours ago
    the fact that you can just redefine Map in a script is mind boggling
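    It's true, and trivially demonstrable: the built-ins are just writable properties of the global object, which is what makes both polyfills and runtime supply-chain tampering possible. A quick sketch:

```javascript
// Map is an ordinary writable global, so any script that runs first
// can clobber it for everyone loaded after it.
const RealMap = Map;

globalThis.Map = class {
  set() { return "not what callers expected"; }
};

const m = new Map();
const broken = typeof m.get !== "function"; // true: the built-in API is gone

globalThis.Map = RealMap; // restore the original
```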
  • casey2 1 hour ago
    There is a clear and widespread cultural problem with javascript. Sites should think seriously hard about server side rendering, both for user privacy (can't port the site to i2p if you drop 5MB every time they load a page) and freedom. Even this antibloat site smacks you with ~100KB and links to one that smacks you with ~200KB. At this rate if you follow 20 links you'll hit a site with 104 GB of JS.
  • skydhash 5 hours ago
    Fantastic write up!

    And we're seeing rust happily going down the same path, especially with the micro packages.

    • CoderLuii 2 hours ago
      the docker side of this is painful too. every extra dependency in any language means a bigger image, more layers to cache, more things that can break during a multi-arch build. ive been building images that are 4GB because of all the node and python tooling bundled in. micro packages make it worse because each one adds metadata overhead on top of the actual code.
    • cute_boi 3 hours ago
      Rust is different as there is no runtime.
      • b00ty4breakfast 1 hour ago
        I'm not very familiar with rust but I'm pretty sure it has a runtime. Even C has a runtime.

        Unless you're talking about an "environment" eg Node or the like

      • onlyspaceghost 3 hours ago
        but it still increases compile time, attack surface area, bandwidth use, etc.
  • stephenr 3 hours ago
    The primary cause of JS bloat is assuming you need JS or that customers want whatever you're using it to provide.

    For $client we've taken a very minimal approach to JavaScript, particularly on customer facing pages. An upcoming feature finally replaces the last jquery (+ plugin) dependent component on the sales page, with a custom implementation.

    That change shaved off ~100K (jquery plus a plugin removed) and for most projects now that probably seems like nothing.

    The sales page after the change is now just 160K of JS.

    The combination of not relying on JS for everything and preferring use-case-specific implementations where we do, means we aren't loading 5 libraries and using 1% of each.

    I'm aware that telling most js community "developers" to "write your own code" is tantamount to telling fish to "just breathe air".

    • CoderLuii 2 hours ago
      160K total is impressive. most landing pages i see are shipping 2-3MB of js before the first paint. the "write your own code" approach gets laughed at but when you actually do it the result is faster, easier to debug, and you dont wake up one morning to find out one of your 200 dependencies got compromised.
  • sipsi 3 hours ago
    i suggest jpeg.news dot com
  • hahaddmmm12x 9 minutes ago
    [dead]
  • leontloveless 4 hours ago
    [dead]
  • krmbzds 2 hours ago
    JavaScript bloat is downstream of low FED interest rates.
  • pjmlp 1 hour ago
    What about only writing JavaScript when it is actually required, instead of SPAs for any kind of content?

    There will be almost no bloat to worry about.