Uploading an Image to IPFS

IPFS (InterPlanetary File System) promises a better and more efficient way of sharing files. For example, instead of having everyone in a classroom download a file from Dropbox, why not download it from someone else in the room? This removes a lot of network overhead. But what about getting files onto the IPFS network? I will go over an example that uploads a file from the browser directly to IPFS.


This is mostly for developers who want to integrate IPFS into their stack. Some rudimentary HTML and JavaScript knowledge is needed.

IPFS Primer:

Normal Uploading:

To create an uploading feature, a developer needs to receive data from the browser and then store it somewhere. That could be a service like Amazon Web Services S3 or another file-hosting service. They could also build one themselves if so desired, but that may not be the best use of time for a development team on a budget.

There is nothing wrong with this structure. It allows a developer to write server code to modify the image, for example, and there could be multiple storage solutions. But all of these solutions increase the amount of bandwidth used by the application. A 1 MB upload becomes 2 MB, because the server has to upload the file to the storage solution in turn. Bandwidth is cheap, but it can be cheaper!

Browser to IPFS:

We could write server code to accept data and push it onto IPFS. But what if the browser could upload directly to IPFS? Then the previous 2 MB upload becomes 1 MB again!

This is nice because we save on network costs. This is also doable with current storage platforms; they may require a few authentication requests first, but it's still doable. You will see how easy it is to upload directly to IPFS.

Project Setup:

To run the example, you need an IPFS node running on your local computer. To make the tutorial easier, you will also configure CORS on your local node. Follow the IPFS install guide for your operating system and then do the following:

  1. ipfs init
  2. ipfs daemon

That will start your IPFS server locally. Lastly, we need to configure IPFS to allow CORS. We will have to stop the daemon and modify a few things:

  1. Stop IPFS with ctrl-c
  2. ipfs config --json API.HTTPHeaders.Access-Control-Allow-Methods '["PUT", "GET", "POST", "OPTIONS"]'
  3. ipfs config --json API.HTTPHeaders.Access-Control-Allow-Origin '["*"]'
  4. ipfs daemon

We are configuring IPFS to return the necessary headers for CORS to work. The last command just restarts the IPFS daemon locally.

Hosting a Website:

There are several ways to host a website on your laptop. You can use a server like Node's http-server, Python's SimpleHTTPServer, or just drag and drop index.html into the browser. The gist is here. Save the file in a directory of your choice.

Using Node to host a website:

If you want to use node, run the following in the terminal:

  1. npm install http-server -g
  2. http-server -p 1337

Using Python to host a website:

If you are on OSX, you most likely have Python installed by default. This is nice because there is no need to install anything. In the terminal, run the following:

  1. python -m SimpleHTTPServer 1337

The above starts an HTTP server on port 1337 in the local directory.

Using the file system to host a website:

The easiest way to host your website is to drag and drop index.html into any browser. No need to install or run any commands.

However you host the index.html file, you should see the following:


The example is a browser-to-IPFS image uploader. By having the browser upload the image directly to IPFS, developers save bandwidth. Normally, to upload an image, a client uploads it to a server, which in turn saves it somewhere else. Why have client → server → storage when you can have client → storage?

  1. Create an HTML input field with id="photo"
  2. Create an HTML button with onclick="upload()"
  3. upload() creates an instance: const reader = new FileReader()
  4. Give reader.readAsArrayBuffer the file inside the id="photo" element
  5. Bind a handler to the reader's onloadend event
  6. Initialize an ipfs object bound to the local IPFS node on port 5001
  7. Create a buffer of the image that was read, from reader.result
  8. Call ipfs.files.add with the buffer and a callback function
  9. Read the result variable
  10. Create the URL string
  11. Modify the DOM directly with the new URL

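The steps above can be sketched as follows. This is a minimal sketch, not the article's exact gist: it assumes the page has `<input type="file" id="photo">`, a button wired to `upload()`, an element with `id="url"` for the result, and that js-ipfs-api (as the global `IpfsApi`) and feross's buffer module (as the global `buffer`) were loaded via script tags.

```javascript
// Build a public gateway URL for a returned IPFS hash.
function ipfsUrl(hash) {
  return 'https://ipfs.io/ipfs/' + hash;
}

function upload() {
  // Step 1: grab the selected file from the input field.
  const file = document.getElementById('photo').files[0];

  // Step 3: create a FileReader instance.
  const reader = new FileReader();

  // Step 5: bind a handler that fires once reading finishes.
  reader.onloadend = function () {
    // Step 6: connect to the local IPFS node started earlier.
    const ipfs = window.IpfsApi('localhost', '5001');

    // Step 7: wrap the ArrayBuffer in a browser-compatible Buffer.
    const buf = buffer.Buffer.from(reader.result);

    // Step 8: push the bytes onto IPFS.
    ipfs.files.add(buf, function (err, result) {
      if (err) { console.error(err); return; }
      // Steps 9-11: read the result, build the URL, update the DOM.
      const url = ipfsUrl(result[0].hash);
      document.getElementById('url').innerHTML = url;
    });
  };

  // Step 4: kick off the asynchronous read.
  reader.readAsArrayBuffer(file);
}
```

Everything interesting happens inside the `onloadend` handler because `readAsArrayBuffer` is asynchronous; `reader.result` is only populated once the read completes.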
API Details:

It’s easy to create a service that receives data from the browser and then stores it on a storage platform. As a back-end developer, you can write code to do whatever you want. Conversely, a front-end developer has more restrictions because their code runs in browsers. This sometimes adds complexity.

We will use FileReader to access file data, create a browser supported buffer array, and then finally upload the image to IPFS using js-ipfs-api.


The browser does not have direct access to the file system. As a matter of fact, FileReader will give you a fake path to the selected file (C:/fakedir/some/path.png). This is a big difference from server code, where reading a file is easy and direct. But that’s OK, because FileReader allows you to read files in different formats: readAsArrayBuffer, readAsBinaryString, readAsDataURL, readAsText. We use readAsArrayBuffer in the example.
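As a small illustration, here is a helper that reads a selected File as an ArrayBuffer (the helper name is mine, not from the article's gist):

```javascript
// Read a File object (e.g. from an <input type="file">) as an ArrayBuffer
// and hand the result to a callback. readAsArrayBuffer is asynchronous, so
// the bytes are only available inside the onloadend handler.
function readFileAsArrayBuffer(file, callback) {
  const reader = new FileReader();
  reader.onloadend = function () {
    callback(reader.result);
  };
  reader.readAsArrayBuffer(file);
}
```

The same shape works for the other read methods; only the type of `reader.result` changes (string for readAsBinaryString and readAsText, data URL string for readAsDataURL).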


In Node.js, you have access to the native require('buffer').Buffer module. In the browser, we need to find a compatible Buffer implementation. Luckily others have done the work! I found a browser-compatible buffer module by feross. In the example code, I added an async HTML script tag to include the buffer JavaScript, which creates a global buffer object. If I were using Webpack, I could have included it as require('buffer/').Buffer. Either way, as long as the Buffer code is accessible, this will work.
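Wrapping the ArrayBuffer from FileReader looks the same with either Buffer source. A sketch (in the browser you would write `buffer.Buffer.from(...)` against feross's global; in Node, `Buffer` is already global):

```javascript
// Stand-in for reader.result: an 8-byte ArrayBuffer with known contents.
const arrayBuffer = new ArrayBuffer(8);
const view = new Uint8Array(arrayBuffer);
view.set([1, 2, 3, 4, 5, 6, 7, 8]);

// Buffer.from(arrayBuffer) creates a Buffer view over the same memory,
// which is what ipfs.files.add expects to receive.
const buf = Buffer.from(arrayBuffer);
```

Note that `Buffer.from(arrayBuffer)` shares memory with the ArrayBuffer rather than copying it, which is fine here since the bytes are uploaded immediately.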


IPFS has two different JavaScript libraries. There is js-ipfs, which implements the IPFS protocol in JavaScript, and there is js-ipfs-api, which makes calls to an existing IPFS node. In our example above, we use js-ipfs-api to upload data to IPFS.
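The add call itself is small. Here is a hedged sketch of a wrapper around it; the client is injected as a parameter so the same function works with `IpfsApi('localhost', '5001')` in the browser (wrapper name is mine):

```javascript
// Add a Buffer to IPFS via an injected js-ipfs-api client and return the
// resulting hash through a callback. ipfs.files.add yields an array of
// { path, hash, size } entries, one per file added.
function addBufferToIpfs(ipfs, buf, callback) {
  ipfs.files.add(buf, function (err, result) {
    if (err) return callback(err);
    callback(null, result[0].hash);
  });
}
```

Injecting the client also makes the function easy to exercise with a fake node during development.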


Having users upload directly to IPFS is nice because it removes the need for some server code. One drawback is that it requires a user to have IPFS running locally, and most users will not know why or how to install it. There is a Chrome plugin, but it’s still a hoop to jump through. Another drawback is that files are not automatically replicated between nodes; a file exists on a node only once it has been accessed there. This means that if server A receives the image, only server A knows about it. We need to copy the file to multiple servers to increase its availability.

One way to solve the issues above is to change the localhost and 5001 settings in the example and host your own IPFS node. This means you have to manage a few servers, but at least you don’t have a server processing images in the upload path. If your server shuts down, though, the image becomes unavailable again. To survive restarts, you can use ipfs pin to tell IPFS to keep the image permanently instead of letting it be garbage-collected. This works for a single server, but not for all the other nodes you are hosting.
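From the CLI, pinning looks like this (QmYourHash is a placeholder for the hash returned by the upload):

```shell
# Pin the uploaded file so this node keeps it across garbage collection.
ipfs pin add QmYourHash

# List recursive pins to confirm it stuck.
ipfs pin ls --type=recursive
```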

When managing servers, you can think of them as pets or cattle. A pet server is not easily replaced while a cattle server is. Using a tool like Ansible reduces the workload needed to manage servers.

Let’s get back to the topic of keeping images available in IPFS after servers are destroyed. The goal is to have the uploaded image on as many servers as possible so it becomes more accessible. After an image is uploaded to one IPFS node, that node could upload it again to other servers, so the same image lives on many IPFS nodes. Another way is to also save it on a cloud platform like AWS or GCE; then, when accessing images, if IPFS is slow or does not respond, use the AWS or GCE image URL instead.
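One way to sketch that fallback in the browser is to race the IPFS request against a timeout that resolves to the cloud copy (the function and parameter names here are mine, not from the article):

```javascript
// Return whichever resolves first: the in-flight IPFS request, or -- after
// timeoutMs -- the fallback cloud URL (e.g. an S3 or GCE copy of the image).
function urlWithFallback(ipfsRequest, fallbackUrl, timeoutMs) {
  const fallback = new Promise(function (resolve) {
    setTimeout(function () { resolve(fallbackUrl); }, timeoutMs);
  });
  return Promise.race([ipfsRequest, fallback]);
}
```

In a real page, `ipfsRequest` might be a `fetch` of the gateway URL that resolves to the URL once the response headers arrive.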

This issue of file availability is an open one at the moment. IPFS in and of itself does not do that work, but availability layers can be built on top of IPFS. The IPFS team is working on Filecoin to incentivize people to store data: uploaders win by having people store their data, and hosts win by receiving Filecoin for their work. There are also other protocols like Siacoin, Storj, and MaidSafe.


You should now be able to see the overall structure needed to upload an image from the browser to IPFS. There are a few drawbacks like availability, speed, and ease of use. It’s still early, and hopefully more work can be done to push the technology forward!




Co-Founder of HelloSugoi. Hacking away on Ethereum (blockchain) DApps. Follow me on https://twitter.com/angellopozo
