Node.js vs Deno vs Bun: Who fetches the fastest? (File upload case)
--
Usually located at the front, runtimes like Node.js, Deno, and Bun need to rely on backend or other services to get the job done. This is especially applicable to microservice architectures. Microservices (also known as the microservice architecture) is an architectural style that structures an application as a collection of services that are independently deployable and loosely coupled. The popular ways to reach the backend services are:
- HTTP: Using an HTTP client, make an HTTP request to the backend service
- Message queue: Using an MQ client (e.g., for RabbitMQ), post a message to a queue and listen for the response on a different queue (a sketch follows this list).
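The rest of this article deals with the first option. For context, here’s a minimal sketch of the second, using the amqplib package for RabbitMQ; the queue names and payload are illustrative assumptions, not something the article prescribes:
//mq.js: hypothetical request/response over RabbitMQ (illustrative only)
import amqp from "amqplib";

const conn = await amqp.connect("amqp://localhost");
const ch = await conn.createChannel();
await ch.assertQueue("requests");
await ch.assertQueue("responses");

// Post a message to one queue...
ch.sendToQueue("requests", Buffer.from(JSON.stringify({ task: "upload", id: 42 })));

// ...and listen for the reply on a different queue
ch.consume("responses", (msg) => {
  console.log("reply:", msg.content.toString());
  ch.ack(msg);
});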
This article is a comparison of the native HTTP clients (the fetch API) provided by Node.js, Deno, and Bun.
Before moving ahead, if you want to read about who fetches the fastest for the hello world and JSON processing use cases, that article is here.
Fetch API
As is the trend these days, server-side runtimes keep offering web-compatible APIs for a variety of tasks. The trend was started by Deno, embraced by Bun from its inception, and is now being followed by Node.js too. One of the most popular web APIs is fetch. This well-known API is used to send an HTTP request and receive a response back.
All three runtimes now support the web-standard fetch API. Code like this works in all three places:
await fetch("http://google.com");
The following log shows a sample run in the Node.js and Deno REPLs. Bun doesn’t have a REPL, but the code works there too.
$ node
Welcome to Node.js v19.6.1.
Type ".help" for more information.
> await fetch("http://google.com");
Response {
  [Symbol(realm)]: null,
  [Symbol(state)]: Proxy [
    {
      aborted: false,
      rangeRequested: false,
      timingAllowPassed: true,
      requestIncludesCredentials: false,
      type: 'default',
      status: 200,
      ..... <<suppressed>>
$ deno
Deno 1.30.3
exit using ctrl+d, ctrl+c, or close()
REPL is running with all permissions allowed.
To specify permissions, run `deno repl` with allow flags.
> await fetch("http://google.com");
Response {
  body: ReadableStream { locked: false },
  bodyUsed: false,
  ok: true,
  redirected: true,
  status: 200,
  .... <<suppressed>>
The following code works in Bun:
//app.js
const res = await fetch("http://google.com");
console.log(res);
//output
$ bun run app.js
Response (16.52 KB) {
  ok: true,
  url: "http://www.google.com/",
  statusText: "OK",
  redirected: true,
  bodyUsed: false,
  status: 200,
  Blob (16.52 KB)
}
Test setup
The tests are executed on a MacBook Pro M1 with 16 GB RAM. The mock backend service is a single high-performance HTTP server written in Go using the fasthttp framework. The mock server simply returns 204 No Content for every file upload.
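The actual backend is the Go/fasthttp server described above. For readers who’d like to reproduce the setup without installing Go, a rough Node.js stand-in (not the real benchmark backend, so expect different absolute numbers) could look like this:
//mockServer.js: stand-in for the Go/fasthttp mock (illustrative only)
import { createServer } from "node:http";

createServer((req, res) => {
  req.resume(); // drain the uploaded body without storing it
  req.on("end", () => {
    res.writeHead(204); // 204 No Content, like the Go mock
    res.end();
  });
}).listen(3000);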
The runtime versions are:
- Node.js v19.6.1
- Deno v1.30.3
- Bun v0.5.6
The test is executed for a minute. The number of completed fetch requests (file uploads) is divided by 60 to get the final upload ops/second.
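The article doesn’t include the driver code; a minimal sketch of what such a measurement loop could look like (not the author’s actual harness; the body placeholder is filled per runtime as shown in the Code section below):
//bench.js: hypothetical measurement loop (illustrative only)
const DURATION_MS = 60_000;
let ops = 0;
const start = Date.now();
while (Date.now() - start < DURATION_MS) {
  await fetch("http://localhost:3000", { method: "POST", body: "..." }); // body: per-runtime file source
  ops++;
}
console.log(`${(ops / 60).toFixed(1)} uploads/second`);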
The ops/second will depend on the size of the file being uploaded, so I’m going to run this test for file sizes from 1K to 100M. To keep the reading time under 5 minutes, I’m only going to include file uploads. A follow-up article will deal with file download, URL-encoded data, and multipart/form-data.
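The article doesn’t show how the fixture files were created; one plausible way to generate them (an assumption, not the author’s method) is:
//makeFixtures.js: hypothetical helper to create test files of given sizes
import { mkdirSync, writeFileSync } from "node:fs";

mkdirSync("./data", { recursive: true });
for (const [name, bytes] of [["1K.txt", 1024], ["500K.txt", 500 * 1024], ["2M.txt", 2 * 1024 * 1024]]) {
  writeFileSync(`./data/${name}`, Buffer.alloc(bytes, "x")); // fill with a repeating byte
}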
Code
Even though all three runtimes support the fetch web API (that’s what we’re benchmarking here), the way a file is provided for upload differs in each of them. Let’s look at the code first.
Node.js
import { createReadStream } from 'node:fs';

await fetch('http://127.0.0.1:3000', {
  method: 'POST',
  body: createReadStream('./data/2M.txt'), // stream the file instead of buffering it
  headers: { 'content-type': 'text/plain' },
  duplex: 'half', // Node.js requires this option when the body is a stream
});
Deno
await fetch('http://localhost:3000', {
  method: 'POST',
  body: (await Deno.open('./data/2M.txt')).readable, // a ReadableStream over the file
  headers: { 'content-type': 'text/plain' },
});
Bun
Unfortunately, Bun doesn’t support streaming uploads through fetch. The only option is to provide the file contents as a buffer.
await fetch('http://localhost:3000', {
  method: 'POST',
  // no streaming support: the entire file is read into memory first
  body: await (new Response(Bun.file('./data/2M.txt'))).arrayBuffer(),
  headers: { 'content-type': 'text/plain' },
});
Whenever Bun gains support for streaming uploads, I’ll write a follow-up article. For now, this is how it is.
File upload performance
Now that we’ve gone through the background, let’s look at the results. As mentioned earlier, the tests range from a 1K file to a 100M file.
1K file
-rw-r--r--@ 1 mayankc staff 1095 Mar 19 2022 1K.txt
In the 60-second test with a small file (1K), Bun leads by a big margin. However, because it buffers the file in memory, Bun uses far more memory (peak usage about 400M), while Deno (~30M) and Node.js (~70M) use way less.
500K file
-rw-r--r--@ 1 mayankc staff 511274 Mar 19 2022 500K.txt
In the 60-second test with a 500K file, Bun still leads the pack by a big margin.
2M file
-rw-r--r--@ 1 mayankc staff 1922394 Feb 12 2022 2M.txt
With a 2M file over the 60-second test, Bun still leads, but the margin has narrowed.
6M file
-rw-r--r-- 1 mayankc staff 5994496 Apr 3 2021 sample.mpeg
At 6M, the difference over the 60-second test is negligible. Still, judging purely by the numbers, Deno leads the pack.
50M file
-rw-r--r-- 1 mayankc staff 51924024 Feb 12 2022 textFile50M.txt
With a 50M file, Bun becomes the slowest of the pack over the 60-second test, while Deno and Node.js are almost the same.
100M file
-rw-r--r--@ 1 mayankc staff 100450390 Aug 23 2021 longVideo.mp4
The same pattern continues for the largest file: Deno leads the pack, Bun is the slowest, and Node.js sits in between.
Winner
The winner is: It depends.
While "it depends" is an unsatisfying answer, unfortunately it’s the right one. There is no single winner this time. For smaller files (up to 2M), Bun leads by a good margin. For larger files (>5M), Bun becomes the slowest. Deno leads the pack for larger files, but its winning margin isn’t huge.
Again, if you want to read about who fetches the fastest for the hello world and JSON processing use cases, that article is here.