Building a Polyglot Scraping API with MetaCall

3 min read · Feb 20, 2020


MetaCall is a tool that allows mixing languages, letting you build Polyglot Applications. In this article we will walk step by step through an example polyglot application that combines BeautifulSoup (Python) with an Express server (NodeJS) to build a Polyglot Scraping API.

Let’s go through the setup process together to see how easy it is to work with. First clone the repository:

git clone

Then we need to install MetaCall. MetaCall is distributed in several ways, including Docker, Guix, or an installer script. It is highly portable, so it should work on any Linux. Let’s install it:

curl -sL | sh

Now we have access to the metacall command, a complete Command Line Interface (CLI) for developing Polyglot Applications.

After that, we should install the dependencies. On the Python side, we need the beautifulsoup4 dependency for scraping, and the certifi dependency in order to query HTTPS websites.

metacall pip3 install beautifulsoup4 certifi

On the NodeJS side, we need express for our server and the metacall package, which allows calls into Python.

metacall npm install metacall express

MetaCall uses its own versions of npm and pip, in order to avoid polluting the system and to maintain portability. That’s why we prefix all package manager commands with metacall, although the syntax is exactly the same.

Great. We are done with the tedious and boring part. It is time to investigate what the application does, and how it is possible to magically mix the programming languages.

On the NodeJS side, we can see the index.js script. It implements the Express server, but it also uses the metacall package to load the Python script.

The first two lines (lines 2 and 5) load the Python script. At this point you can easily call Python functions with the metacall function. We do exactly that in the handler of our REST API in the Express server (line 14): we use metacall('links') to call the links function implemented in the Python script.
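Putting the pieces together, the wiring in index.js looks roughly like this. This is a sketch rather than the repo’s exact code: the route path and the name of the query parameter (url) are assumptions taken from the curl example later in the article.

```javascript
// index.js -- sketch of the Express + MetaCall wiring described above
const { metacall, metacall_load_from_file } = require('metacall');
const express = require('express');

// Load script.py so its functions become callable from NodeJS
metacall_load_from_file('py', ['script.py']);

const app = express();

// The REST handler forwards the ?url= query parameter
// to the Python links function through metacall
app.get('/', (req, res) => {
  res.json(metacall('links', req.query.url));
});

app.listen(3000);
```

Note that after loading, the Python function is invoked by name as metacall('links', ...), with no language-specific glue in the handler itself.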

The links function accepts a URL, which is opened and queried in order to obtain the HTML (lines 8 to 17). It then initializes BeautifulSoup to parse the HTML (line 20) and collects all the links the page contains (lines 23 to 26).
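Based on that description, the links function in script.py could look roughly like this. It is a sketch, not the repo’s exact code: the extract_links helper is ours, split out so the parsing step can be tried without a network connection.

```python
import ssl
import certifi
from urllib.request import urlopen
from bs4 import BeautifulSoup


def extract_links(html):
    # Parse the HTML and collect the href of every <a> tag
    soup = BeautifulSoup(html, 'html.parser')
    return [a.get('href') for a in soup.find_all('a', href=True)]


def links(url):
    # Fetch the page over HTTPS, using certifi's CA bundle for verification
    context = ssl.create_default_context(cafile=certifi.where())
    with urlopen(url, context=context) as response:
        return extract_links(response.read())
```

The certifi bundle is what lets the script query HTTPS webs reliably, since urlopen needs a set of trusted root certificates to verify the server.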


Now it is time to test it and see the magic happen. In order to run our application, we simply do:

metacall index.js

MetaCall will launch index.js and run the Express server. Logs will be streamed to $HOME/metacall.log and the server will listen on port 3000, so we can query it with curl. Let’s scrape all the links from the npm website:

curl localhost:3000/?url=

If everything went well, we should see in the response something similar to this:


For the DevOps gang, and in favor of containerization, we also provide a Dockerized version of the application, which includes automated testing. It can be built and run with the following commands:

docker build -t metacall/beautifulsoup-express-example .
docker run --rm -p 3000:3000 -it metacall/beautifulsoup-express-example

Finally, we added a metacall.json file, the configuration used for deploying to the MetaCall FaaS. This version uses only the Python script, because the MetaCall FaaS is able to export the functions and generate an API Gateway automatically, so the Express server is not needed. We also obtain higher performance, because our FaaS server is implemented with performance and concurrency in mind.

This way we can deploy the project as a standalone application on any platform, or through a highly scalable FaaS. If you want to try the MetaCall FaaS platform, it is under Beta Testing and available by early access only (you can request access at

For more information about MetaCall, or to become part of our community, you can join our Telegram group.

The source code is licensed under Apache 2.0 License and can be found at:

See you, meta-programmer!

Vicente Eduardo Ferrer García.
Founder & CTO of MetaCall.



