License to npm install
Why do we burden our road builders when the drivers are drunk at the wheel?
Mike Lieberman May 05, 2025 Updated: May 05, 2025 #Security #Open Source #Software Supply Chain
Alright, let's talk about the digital world we've built. It runs on open source software (OSS). Your phone, your cat's smart litter box, the thing that tells you pizza is on the way – all powered in large part by code generously shared for free by legions of developers worldwide. Many of them are volunteers. It's a beautiful, collaborative, sprawling interstate highway system.
But lately, it feels like we're spending an awful lot of time interrogating the volunteer road builders designing and creating the roads, while letting any random person get on the onramp and drive, all while blindfolded.
You see, there's this intense focus on making sure every single person who contributes a line of code to an open source project is trustworthy. Are they who they say they are? Have they been vetted? Do they have the right vibes? We're talking project reviews stricter than an OSHA inspection, contribution guidelines longer than treaties, and now, whispers of "trust scorecards". Soon, some multinational corporation’s lawyers will be asking for a background check, three letters of recommendation, and a perfect PGP-signed haiku just to trust your typo fix in the documentation of a project you own. The ghost of "Jia Tan" and the xz backdoor scare has everyone ready to demand retinal scans from anyone submitting a pull request on code they don’t own.
Meanwhile, on the other side of the median... chaos reigns.
Giant corporations, nimble startups, your cousin Alice learning Python – they're all consuming this open source code with the digital equivalent of a toddler's abandon. We fret about the trustworthiness of the code authors, but seem remarkably chill about how the code is actually inspected and used.
Think about it! In the real world we require people to pass tests and get a license to operate a two-ton metal box on wheels, because incompetent driving has consequences. Yet we let organizations responsible for safeguarding millions of people's data, running critical infrastructure, or just keeping the memes flowing essentially grab the keys to complex software systems with little more than a "YOLO!" attitude. Sure, there's a risk the roads aren't maintained, and maybe a bad actor has thrown something dangerous onto the asphalt. But we also have people operating their cars with no regard for safe driving, never looking out for hazards, including the other people on the road.
The curl | sh School of Optimism
Nowhere is this clearer than in the enduring, baffling popularity of the curl | sh command. For the uninitiated, this is like finding some software on the internet that claims to make your car run faster, then downloading it directly into your car and executing whatever instructions it contains, sight unseen. What could possibly go wrong?
Well, lots. Someone could intercept the install mid-transit and swap "MAKE CAR FASTER" for "BLOW UP THE CAR" (an adversary-in-the-middle attack). Even sneakier, the website could show you the code for the right software in your browser, but send the "BLOW UP THE CAR" version specifically to curl when it asks. Yet the practice persists, fueled by the siren song of convenience.
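If you absolutely must run an installer off the internet, a less trusting pattern is: download to disk, verify against a checksum published out of band, read the script, and only then execute it. A minimal sketch in POSIX shell; the "installer" here is a local stand-in so the pattern is self-contained (in real life the first step would be something like `curl -fsSLo install.sh https://example.com/install.sh`, and the expected hash would come from the project's release page, not be computed locally):

```shell
set -eu

# Stand-in for `curl -fsSLo install.sh <URL>`; a local file lets the
# rest of the pattern be shown without a network.
printf 'echo "installing..."\n' > install.sh

# The project publishes this hash out of band (release notes, docs).
# Computed here only so this sketch runs on its own.
EXPECTED_SHA256=$(sha256sum install.sh | cut -d' ' -f1)

# Refuse to execute bytes that don't match the published hash.
echo "$EXPECTED_SHA256  install.sh" | sha256sum -c - || {
  echo "hash mismatch, refusing to run" >&2
  exit 1
}

# Read the script yourself, then (and only then) run it.
sh install.sh
```

The difference from `curl | sh` is that every byte lands on disk and gets checked before anything executes, which defeats both the mid-transit swap and the serve-different-bytes-to-curl trick: the attacker would also have to forge the separately published hash.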
"I'll Patch It Later": Famous Last Words
Then there's the issue of using auto parts that are known to be past their replacement date or that have recall notices. In software, we call this using "vulnerable and outdated components". Remember Equifax? That massive data breach affecting nearly 150 million people happened because they failed to update a known-vulnerable piece of open source software (Apache Struts) for months after a fix was available. They knew the car was dangerous with that part in it, but left it installed anyway. Or Log4Shell? A vulnerability in a ubiquitous logging library so widespread and easy to exploit that it sent the entire internet into a panic spiral. Years later, vulnerable versions are still being downloaded and exploited, like people willingly installing defective airbags (remember Takata airbags?).

Why? Because tracking all your parts (dependencies), especially the smaller parts that make up the bigger parts (transitive dependencies), is hard. It requires effort, tools like Software Composition Analysis (SCA), and maybe even scanning the bill of materials, often provided as an aptly named Software Bill of Materials (SBOM). But who has time for verifying paperwork when you're innovating at scale?!

And let's not forget the digital equivalents of identity theft targeting the supply chain itself: "dependency confusion", where you ask for gasoline but accidentally get diesel from the pump because it had a higher version number, and "typosquatting", where you mistype "unleaded" and get "inleaded", which turns out to be unleaded gas… just filled with metal shavings intended to destroy your engine.

We have well-resourced global organizations demanding something be done to help us better verify open source contributors, but will they actually check anyone's paperwork anyway? A quick glance at a library like easyjson would show that it is in fact maintained by Russian nationals, but it only became an issue after someone reported on it.
Where was all the due diligence from the organizations consuming the library? Maybe they're too busy laying off their OSPOs (Open Source Program Offices).
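Checking the paperwork doesn't have to be exotic. Even before reaching for a full SCA product, you can diff the components in an SBOM against a list of known-bad versions. A toy sketch in shell, with a hand-written CycloneDX-style fragment and a tiny stand-in for a real vulnerability feed such as OSV (the file names and component list are invented for illustration):

```shell
set -eu

# Hand-written CycloneDX-style SBOM fragment (illustrative, not a real scan).
cat > sbom.json <<'EOF'
{"components":[
  {"name":"org.apache.struts:struts2-core","version":"2.3.31"},
  {"name":"org.apache.logging.log4j:log4j-core","version":"2.14.1"}
]}
EOF

# Tiny stand-in for a vulnerability feed: component/version pairs with
# known CVEs (Struts 2.3.31 is in the Equifax CVE-2017-5638 range;
# log4j-core 2.14.1 is in the Log4Shell CVE-2021-44228 range).
cat > known-bad.txt <<'EOF'
org.apache.struts:struts2-core 2.3.31
org.apache.logging.log4j:log4j-core 2.14.1
EOF

# Flag every component/version pair that appears in the feed.
while read -r name version; do
  if grep -q "\"name\":\"$name\",\"version\":\"$version\"" sbom.json; then
    echo "VULNERABLE: $name@$version"
  fi
done < known-bad.txt > findings.txt

cat findings.txt
```

A real pipeline would generate the SBOM with a tool and match it against a live feed, but the principle stands: the bill of materials only helps if someone actually reads it.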
Why No License to Consume?
Okay, fine, a literal "License to Consume OSS" is absurd. Who would issue it? The Open Source DMV? Would you need to parallel park a Docker container? The beauty of OSS is its openness, its "permissionless innovation". A licensing regime would kill the very ecosystem it aimed to protect.

But the analogy highlights a glaring truth: competence matters. Operating complex software systems carries risks. While we're building elaborate trust systems for contributors, consumer competence relies on... well, voluntary adherence to best practices. It's like driver's ed being optional.

This leads to the awkward question of liability. Most OSS comes "as is", with licenses that basically say, "If this blows up your server, don't blame us". It's a legal forcefield that works great until a massive breach happens, and then everyone starts pointing fingers. Is the contributor who unknowingly introduced a flaw responsible? Is the malicious actor who intentionally added a backdoor liable? Or is it the consumer who ignored flashing warning signs and failed to patch a critical vulnerability? The concept of "shared responsibility" sounds nice, but often feels like a game of hot potato with a live grenade.
Time for Some Digital Defensive Driving?
Admittedly, the analogy breaks down in the real world. Our roads are designed by certified civil engineers. The laborers paving them go through training. But those people are also compensated for their labor! There are clear incentives on both sides. Organizations building the roads, public or private, want to ensure the roads are safe, and they are compensated through tax dollars or private money. The workers, in turn, are paid wages to perform the work and to go through a level of certification and scrutiny. That exchange might exist for open source developers who are hired by a company to contribute to or maintain a project that is in the company's interest, but it doesn't exist for all the folks who are purely volunteers.
So, what's the answer? We can't license consumers. But maybe, just maybe, organizations consuming mountains of open source code could invest a bit more in digital defensive driving. More importantly, don't spend all this time and these resources investigating the road builders; spend them on inspecting the roads! Use the tools available (SCA, SBOMs, vulnerability scanners, attestation verification). Vet your sources. Patch your systems. Don't just blindly curl | sh your way through life. And maybe these same organizations, especially the well-resourced ones talking about how AI is going to transform the world, could invest a little more in making our roads safer and in highlighting the dangerous ones.
We need to shift some of the intense scrutiny from the thankless volunteers designing and paving the roads to the companies making billions transporting cargo on those very same roads. Because at the end of the day, a secure open source ecosystem requires not just trustworthy contributors, but also competent, responsible consumers who know how to handle their automobiles safely and are doing their part to inspect the roads. Otherwise, we're all just driving drunk on dependencies, hoping we don't crash the internet.