Kotlin Spring Boot Tutorial Part 4: Creating REST endpoints for a task app
Here is a quick summary of Part 3:
A connection to a Postgres database hosted on https://supabase.com/ was established, and the additional application.yml settings and environment variables were configured. Here is a link to Part 3.
Why set up a local database for development?
While developing your application, you will try things out and end up inserting dummy data, and dummy data should never end up in your production database. At the same time, you still need to manually test things and see if everything works as expected. This is why we need a local development database while programming our backend.
Set up a local database for development
Okay, now we have connected our API to a real database from Supabase. We should also add a local development database. Navigate to main -> resources and create the file application-dev.yml inside it. Take this configuration:
spring:
  datasource:
    driver-class-name: org.postgresql.Driver
    url: jdbc:postgresql://localhost:5432/habibicoding
    username: postgres
    password: postgres
  jpa:
    hibernate:
      ddl-auto: update
    show-sql: true
    properties:
      hibernate:
        dialect: org.hibernate.dialect.PostgreSQLDialect
        format_sql: true

server:
  error:
    include-message: always
    include-binding-errors: always
    include-stacktrace: on_param
Don’t worry for now about the url, username, or password; we will configure them in the next steps. Just keep them exactly as provided here.
Set up a local network for the Docker Postgres container
We will also save the data in a volume on our local machine, so that when the container is turned off, the data is still there once the container is restarted.
Run:
docker network create postgres_db
Inside the network, we will create the Postgres Docker container. In my home folder, I will create a new folder named db-data for the volume data.
Open a terminal and navigate into the folder `db-data` with:
cd db-data
Then type the command below to start a container named postgres_db. It runs internally and externally on port 5432, joins the network postgres_db, and mounts the volume “$PWD:/var/lib/postgresql/data” so that all data is stored in the current folder. The environment variable POSTGRES_PASSWORD sets the database password, -d runs the container in the background, and postgres:alpine tells Docker to use the Postgres image with the alpine tag.
Run the command:
docker run --name postgres_db -p 5432:5432 --network=postgres_db -v "$PWD:/var/lib/postgresql/data" -e POSTGRES_PASSWORD=postgres -d postgres:alpine
Run the command: docker ps
to check if your container is up and running.
After that, check your folder db-data; it should now contain a bunch of folders and files.
Connect to your database via PSQL
Run the command:
docker run -it --rm --network=postgres_db postgres:alpine psql -h postgres_db -U postgres
After entering the command, you should be prompted for the password, which we defined before as “postgres”.
Type the command \l
to list all databases.
Now type CREATE DATABASE habibicoding;
and after that type \l again
to see if it is really there.
Open your IntelliJ IDE — Yalla | يلا
Go to Edit Configurations…
Enter the word dev inside “Active profiles:”
The IDE will now pick up the application-dev.yml file. If you want to use the production database again, just remove the word dev from “Active profiles:”.
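If you want to verify which profile the application actually started with, a small sketch like the following logs the active profiles on startup. This bean is purely illustrative and not part of the project; the class name is my own:

```kotlin
import org.springframework.boot.ApplicationRunner
import org.springframework.context.annotation.Bean
import org.springframework.context.annotation.Configuration
import org.springframework.core.env.Environment

@Configuration
class ProfileLogger {

    // Prints the active Spring profiles once the application has started,
    // so you can confirm that "dev" (and therefore application-dev.yml) is picked up.
    @Bean
    fun logActiveProfiles(environment: Environment): ApplicationRunner =
        ApplicationRunner { println("Active profiles: ${environment.activeProfiles.joinToString()}") }
}
```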
Run your application
Call the endpoint: `http://localhost:9091/api/all-tasks`
For now, you should see an empty response when calling the endpoint /all-tasks.
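As a reminder of what is answering that call, a GET mapping of this shape returns the list of tasks, and with a fresh, empty dev database that list is simply empty. This is only a rough sketch with made-up names, not the exact controller from the earlier parts (you can find the real one in the repository linked at the end):

```kotlin
import org.springframework.http.ResponseEntity
import org.springframework.web.bind.annotation.GetMapping
import org.springframework.web.bind.annotation.RequestMapping
import org.springframework.web.bind.annotation.RestController

@RestController
@RequestMapping("api")
class TaskQuerySketchController {

    // Sketch only: a real implementation would delegate to a service backed by the database.
    // With no rows in the tasks table, the response body is an empty JSON array.
    @GetMapping("all-tasks")
    fun getAllTasks(): ResponseEntity<List<Any>> = ResponseEntity.ok(emptyList())
}
```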
Let us change that by opening Postman again and calling the POST endpoint with the following request body:
{
  "description": "Cook Maqluba / طبخ مقلوبة",
  "isReminderSet": false,
  "isTaskOpen": true,
  "createdOn": "2022-12-14T01:06:32.510",
  "priority": "MEDIUM"
}
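On the Kotlin side, the fields of this body correspond to a request model along these lines. This is just a sketch with illustrative names; the actual request and task classes from the earlier parts are in the repository linked at the end:

```kotlin
import java.time.LocalDateTime

// Illustrative enum and request model matching the JSON fields above;
// the real classes in the project may be named differently.
enum class Priority { LOW, MEDIUM, HIGH }

data class TaskCreateRequest(
    val description: String,
    val isReminderSet: Boolean,
    val isTaskOpen: Boolean,
    val createdOn: LocalDateTime,
    val priority: Priority
)
```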
Call the endpoint again: `http://localhost:9091/api/all-tasks`
Everything should work now, and we can use this local development database just like our real production database. Feel free to test it yourself. When you turn off the Docker container that runs the Postgres image, the data will luckily not be lost, because our volume stores it in the folder db-data.
Okay, that’s it for this part. If you enjoyed this article, give it a clap. Here is Part 5:
Here is the completed project; check out the branch part_four.
By the way here is the link for the article as YouTube series: https://www.youtube.com/watch?v=ZKMGMZqnmOk&list=PLjuEK3Ez60n2dTFL7-KETl1yl04kOo-rM