Exploiting Developer Infrastructure Is Ridiculously Easy
In late October, an issue was opened on an extremely popular node.js tool, nodemon, describing a deprecation warning that was being logged to the console.
Warnings like these aren’t uncommon, and this one seemed harmless. It wasn’t even related to the nodemon project itself, but rather to one of its dependencies. It could easily have been ignored entirely, because warnings like these often resolve themselves.
Several months earlier, control of event-stream had changed hands, legitimately, to a relatively unknown user who had asked for publishing rights over email. This user then updated event-stream to include the malicious flatmap-stream dependency in a patch release, and then bumped the major version of event-stream without the dependency to limit the visibility of the change. New users, who are presumably a little more inclined to scrutinize their dependencies, would get the latest version (4.x as of this writing), while users depending on the previous major version would automatically update to the infected patch release whenever npm install ran again (as it does under many common configurations).
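To see why existing users were swept up automatically, it helps to recall how npm resolves version ranges: a dependency declared as `^3.3.4` accepts any newer release within major version 3, so a freshly published patch is pulled in without anyone opting in. The sketch below is a simplified model of that caret-range rule (the real logic lives in npm’s semver package and also handles prereleases, wildcards, and 0.x special cases); the version numbers are illustrative.

```javascript
// Simplified model of npm's caret-range matching: ^X.Y.Z accepts any
// version with the same major number that is >= X.Y.Z. This sketch
// ignores prereleases, wildcards, and the 0.x special cases that npm's
// real semver package handles.
function parse(v) {
  return v.split('.').map(Number); // "3.3.4" -> [3, 3, 4]
}

function satisfiesCaret(version, range) {
  const [maj, min, pat] = parse(version);
  const [rMaj, rMin, rPat] = parse(range.replace(/^\^/, ''));
  if (maj !== rMaj) return false;      // major version must match
  if (min !== rMin) return min > rMin; // a newer minor is accepted
  return pat >= rPat;                  // a newer patch is accepted
}

// A project depending on "^3.3.4" silently accepts an injected 3.3.6:
console.log(satisfiesCaret('3.3.6', '^3.3.4')); // true
// ...but not the clean 4.0.0 published to hide the change:
console.log(satisfiesCaret('4.0.0', '^3.3.4')); // false
```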
Details of the Exploit
The payload on flatmap-stream was set up to ingest a data file that contained, among some trivially obfuscated strings, two encrypted payloads that could be decrypted only with the right password.
This payload looked for the password in an environment variable named npm_package_description, set by npm, node’s package manager. npm sets this variable to the root package’s description, which allowed the payload to scope its effects to one particular target package. Clever! In this case, the target was the client application for the bitcoin wallet Copay, and the password that decrypts the payload was the phrase “A Secure Bitcoin Wallet” (found via brute force by GitHub user maths22).
After payload A successfully decodes the first entry in the data file, it executes the second stage, payload B.
Payload B continues executing only if the script was run with a command-line argument matching the pattern “build:*-release”, like npm run build:ios-release. This narrows execution down to just three build scripts in the Copay build pipeline: the scripts in charge of building the hybrid iOS, Android, and desktop applications.
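The gate itself reduces to a pattern match on how the process was invoked. A hedged reconstruction (the real code was obfuscated; the exact regex and the decision to check both argv and npm’s npm_lifecycle_event variable are assumptions for illustration):

```javascript
// Only proceed when the process was launched for one of the targeted
// release builds, e.g. `npm run build:ios-release`. Checking both the
// raw argv and npm's lifecycle variable covers common invocations.
// Illustrative reconstruction, not the attacker's actual code.
function isTargetedBuild(argv, env) {
  const pattern = /build:.*-release/;
  return argv.some((arg) => pattern.test(arg)) ||
         pattern.test(env.npm_lifecycle_event || '');
}

console.log(isTargetedBuild(['node', 'npm', 'build:ios-release'], {})); // true
console.log(isTargetedBuild(['node', 'npm', 'test'], {}));              // false
```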
The script then searches for the internals of another dependency of the application, ReedSolomonDecoder.js from the package @zxing/library. Payload B doesn’t execute this file; it simply injects the final stage, payload C, so that it executes inside the mobile application itself when ReedSolomonDecoder loads.
The amount of effort this took was not trivial. The exploit required significant research and planning, and it likely had backup routes in case event-stream couldn’t be hijacked. Given the way the attack played out, it seems plausible that the actor targeted Copay specifically rather than grabbing a valuable library and planning an attack from there. The popularity of event-stream gave the attacker an easy route into privileged computers at hundreds of companies across the globe. Thankfully, the damage was limited and the exploit was caught quickly, considering how long it could have gone unnoticed, but thinking about what could have happened leads us to an obvious conclusion:
Open Source Is Incredibly Broken
Let’s count all the things that went wrong.
- An application (Copay) was built by consuming dependencies over the network without the entire tree’s dependencies locked.
- Worse, those dependencies aren’t cached locally; they’re pulled over the network on every build.
- Thousands of other projects are dependent on event-stream with the same or similar configurations.
- The maintainer stopped caring about a library that thousands of projects depended on.
- Thousands of projects consume this library for free and expect it to be maintained without any compensation.
- The maintainer gave full control to an unknown entity just because they asked for it.
- There was no notification that control had changed; thousands of projects were just expected to consume the package with no warning.
- There’s really no end—this list of things that went wrong could go on and on…
The damage this could have caused is incredible to think about, and the projects that depend on this aren’t trivial either. Microsoft’s original Azure CLI depends on event-stream. Think of the systems that either develop or run that tool. Each one of those potentially had this malicious code installed.
The problem is that so much software is built on the backs of people who are expected to work for free. They deliver useful software once but are expected to maintain it until the end of time. If they can’t, either they go dormant and ignore requests or security vulnerabilities (guilty!) or they pass the baton to someone else hoping they can get away without getting tagged ever again. Sometimes it works. Sometimes it doesn’t. But no outcome can excuse the security vulnerabilities this exposes in the software supply chain. Even the discovery of, research into, and subsequent damage control for this exploit was done largely by unpaid volunteers of the open-source ecosystem.
The fault is so widely distributed there’s no use in placing blame. Open source, as it has grown, is broken. The larger it grows, the more likely it is that catastrophic events will occur. Given the potential for damage with this exploit, the fact that it was so limited is a blessing. It’s also not limited to node.js or npm; there is just as much misplaced trust in sister ecosystems like Python’s PyPI and Ruby’s RubyGems, and in GitHub as a service itself. Anyone can publish to these registries, and control can change hands without any notice. Even without a change of control, there’s so much code that thoroughly vetting it all in the first place would grind any team to a halt. To meet timelines, developers install what they need to install, and security teams and automated tools just aren’t able to keep pace with ever-changing software.