The other day I had to do some quick data wrangling with a client’s data, so I decided to just dump it into a local PostgreSQL database.
In this tutorial, we will create a PostgreSQL database and access it from PgAdmin, using docker-compose. In my opinion, docker-compose is the easiest way to get your Postgres container and PgAdmin onto the same network.
Contents of docker-compose.yml
```
pgadmin4:
  user: root
  environment:
    PGADMIN_DEFAULT_EMAIL: firstname.lastname@example.org
…
```
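To show how the pieces fit together, here is a sketch of what a complete docker-compose.yml along these lines could look like. The image tags, ports, passwords, and volume paths are illustrative assumptions, not the exact file from this setup:

```yaml
version: "3"
services:
  postgres:
    image: postgres:13          # assumed version tag
    environment:
      POSTGRES_USER: postgres
      POSTGRES_PASSWORD: changeme
      POSTGRES_DB: clientdata
    ports:
      - "5432:5432"
    volumes:
      - ./pgdata:/var/lib/postgresql/data   # persist data across restarts

  pgadmin4:
    image: dpage/pgadmin4
    user: root
    environment:
      PGADMIN_DEFAULT_EMAIL: firstname.lastname@example.org
      PGADMIN_DEFAULT_PASSWORD: changeme
    ports:
      - "5050:80"               # PgAdmin UI at http://localhost:5050
    depends_on:
      - postgres
```

After `docker-compose up -d`, you can log in to PgAdmin and register the server using the hostname `postgres`: because both services are defined in the same compose file, they share a network and can reach each other by service name.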
You'll learn how to set up a MySQL container for your local development using docker-compose. You can easily start it, shut it down when you don't need it, and your data won't get lost. This is very useful when you just want to spin up a MySQL database quickly for local development.
Our MySQL data will live inside the ./db directory.
You can put any .sql file into the ./db/sql directory, and it will run automatically the first time the container initializes. (This is useful when you want to restore data from a dump file.)
./docker-compose.yml is our docker compose configuration file.
image: mysql:5.7 …
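Putting the pieces above together, the compose file could look roughly like this. The service name, password, port mapping, and exact mount points are illustrative assumptions; only the image tag and the ./db layout come from the text:

```yaml
version: "3"
services:
  db:
    image: mysql:5.7
    environment:
      MYSQL_ROOT_PASSWORD: changeme   # assumed credential
      MYSQL_DATABASE: dev             # assumed database name
    ports:
      - "3306:3306"
    volumes:
      - ./db/data:/var/lib/mysql               # MySQL data lives under ./db
      - ./db/sql:/docker-entrypoint-initdb.d   # .sql files here run on first init
```

The official mysql image executes any .sql files mounted into /docker-entrypoint-initdb.d when the data directory is empty, which is what makes the automatic restore-from-dump behavior work.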
TL;DR: after receiving the API request, create a new thread and run your function there. You can pass the data it needs via kwargs.
We’ve all been there: you want to use Python for some batch calculation, but you don’t want your API endpoint to block for ten minutes waiting for the result. Here’s what you can do.
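A minimal sketch of the idea, using only the standard library. The function and variable names here (`handle_request`, `long_running_job`, `results`) are illustrative stand-ins for your API view and batch calculation:

```python
import threading

# Shared store for finished results; in a real service this would be
# a database, cache, or task queue rather than an in-memory dict.
results = {}

def long_running_job(job_id, data):
    """Placeholder for a slow batch calculation."""
    results[job_id] = sum(data)

def handle_request(job_id, data):
    """Simulates the API handler: start the work and return immediately."""
    worker = threading.Thread(
        target=long_running_job,
        kwargs={"job_id": job_id, "data": data},  # pass your data in kwargs
    )
    worker.start()
    # Here you would return e.g. a 202 Accepted response with the job_id.
    return worker

worker = handle_request("job-1", [1, 2, 3])
worker.join()  # only for this demo; the real handler would not wait
print(results["job-1"])  # 6
```

The caller gets a response right away, and the client can poll a second endpoint (or receive a callback) with the `job_id` to fetch the result once the thread has finished.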
The rocky road from TF-IDF to BERT
By writing on Medium, I’ll try my best to give some insight into the research we do at the AI Strategy Department of Inglewood Inc. (Tokyo, Japan).
We are a team of 5 (growing fast!) data geeks with a passion for machine learning, trying to find and solve interesting problems in the world.
I hope you’ll learn something new about different machine learning approaches and about what we do.
In this particular series of articles, I will walk you through our attempt to predict annual salaries from job postings we collected from the Internet. …