Efficient Backend Development with golang-migrate and sqlc

Tuhin Banerjee
9 min read · May 7, 2023


In this blog post, we will explore the process of setting up the backend for a whiteboard application using the Go programming language.

We will also introduce some useful tools and libraries that can simplify our development workflow and make our lives easier.

As developers, we often find that the most effective way to address a problem is by writing code, so without further ado, let’s dive straight into the details and get started with our backend setup.

DB schema

In this schema, we have four main tables: authors, whiteboard, stroke, and point.

  • The authors table stores information about users of the whiteboard app, such as their name, email, and password.
  • The whiteboard table represents individual whiteboards and includes the whiteboard's name, the user who created it (created_by), and timestamps for creation and updates.
  • The stroke table stores information about each stroke made on a whiteboard, including its color, line width, the user who created it (created_by), and timestamps.
  • The point table represents the individual points that make up a stroke. Each point is associated with a specific stroke (stroke_id) and includes its x and y coordinates.

Database migrations in Golang

Database migrations are crucial for managing and evolving the database schema of Golang applications.

Among the various migration libraries available, golang-migrate has gained popularity for its simplicity and flexibility.

Configure golang-migrate

  1. Install Golang: Make sure you have Go installed on your macOS system. If not, you can download and install it from the official Go website (https://golang.org/dl/).
  2. Install golang-migrate ( you can choose between the two ways ):
go install -tags 'postgres' github.com/golang-migrate/migrate/v4/cmd/migrate@latest
brew install golang-migrate

3. Verify the installation:

migrate -version

You should see the version number of golang-migrate printed in the Terminal if the installation was successful.

4. Create a folder db/migration

5. Run this command to generate the migration files

migrate create -ext sql -dir db/migration -seq init_whiteboard_schema

This will create a pair of migration files: an "up" script and a matching "down" script.
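With the -seq flag, migrate numbers the files sequentially, so you should end up with a pair like this (assuming nothing else is in the directory yet):

db/migration/000001_init_whiteboard_schema.up.sql
db/migration/000001_init_whiteboard_schema.down.sql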

Let's recap the steps and understand how golang-migrate works

  1. Create a Migrations Directory:
  • Create a directory in your project specifically for storing database migration files.
  • This directory will contain SQL scripts that define the changes to be made to the database schema.

2. Write Migration Files:

  • Each migration file should have a unique name that reflects its purpose and order.
  • Use a naming convention that includes a timestamp or sequential numbering to ensure the correct execution order.
  • In each file, write SQL statements to define the changes to the database schema, such as creating tables, modifying columns, or adding indexes.

3. Initialize the Migration Library:

  • In your Go code, initialize the migration library by providing the necessary configuration, such as the database connection details.
  • This step may involve creating a migration instance or importing the library package (see the Go sketch after this list).

4. Define Up and Down Functions:

  • For each migration file, define an “up” function that applies the changes described in the SQL script.
  • Optionally, define a “down” function that undoes the changes made by the corresponding “up” function.
  • The “down” function is used for rollback or reverting migrations.
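Here is a minimal sketch of steps 3 and 4 in Go, assuming the migration files live in db/migration and the database URL matches the one used later in this post:

package main

import (
	"log"

	"github.com/golang-migrate/migrate/v4"
	_ "github.com/golang-migrate/migrate/v4/database/postgres" // registers the postgres driver
	_ "github.com/golang-migrate/migrate/v4/source/file"       // registers the file:// source
)

func main() {
	// Point the library at the migration files and the target database.
	m, err := migrate.New(
		"file://db/migration",
		"postgres://postgres:postgres@localhost:5432/tdb?sslmode=disable",
	)
	if err != nil {
		log.Fatal(err)
	}

	// Apply all pending "up" migrations; ErrNoChange just means we are already up to date.
	if err := m.Up(); err != nil && err != migrate.ErrNoChange {
		log.Fatal(err)
	}

	// To roll back, you would call m.Down() instead.
}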

Postgres will serve as our chosen database for this project. Now we can put the schema SQL statements into our up and down scripts.

-- 000001_init_whiteboard_schema.up.sql

-- authors table
-- SERIAL primary keys let later INSERTs omit the id column.
CREATE TABLE authors (
  id SERIAL PRIMARY KEY,
  name VARCHAR(255),
  email VARCHAR(255),
  password VARCHAR(255),
  created_at timestamp NOT NULL DEFAULT (now()),
  updated_at timestamp NOT NULL DEFAULT (now())
);

-- whiteboard table
CREATE TABLE whiteboard (
  id SERIAL PRIMARY KEY,
  name VARCHAR(255),
  created_by INT,
  created_at timestamp NOT NULL DEFAULT (now()),
  updated_at timestamp NOT NULL DEFAULT (now()),
  FOREIGN KEY (created_by) REFERENCES authors(id)
);

-- stroke table
CREATE TABLE stroke (
  id SERIAL PRIMARY KEY,
  whiteboard_id INT,
  color VARCHAR(255),
  line_width INT,
  created_by INT,
  created_at timestamp NOT NULL DEFAULT (now()),
  updated_at timestamp NOT NULL DEFAULT (now()),
  FOREIGN KEY (whiteboard_id) REFERENCES whiteboard(id),
  FOREIGN KEY (created_by) REFERENCES authors(id)
);

-- point table
CREATE TABLE point (
  id SERIAL PRIMARY KEY,
  stroke_id INT,
  x_coordinate FLOAT,
  y_coordinate FLOAT,
  created_at timestamp NOT NULL DEFAULT (now()),
  FOREIGN KEY (stroke_id) REFERENCES stroke(id)
);
-- 000001_init_whiteboard_schema.down.sql

-- Drop point table
DROP TABLE IF EXISTS point;

-- Drop stroke table
DROP TABLE IF EXISTS stroke;

-- Drop whiteboard table
DROP TABLE IF EXISTS whiteboard;

-- Drop authors table
DROP TABLE IF EXISTS authors;

Run the migration

  1. Configure postgres and pgAdmin using docker-compose.
version: '3.3'
services:
  db:
    container_name: postgres_container
    build: ./postgres
    # command: -c ssl=on -c ssl_cert_file=/var/lib/postgresql/server.crt -c ssl_key_file=/var/lib/postgresql/server.key
    volumes:
      - app-db-data:/var/lib/postgresql/data/pgdata
    environment:
      - PGDATA=/var/lib/postgresql/data/pgdata
      - POSTGRES_HOST_AUTH_METHOD=trust
    networks:
      - default
    ports:
      - '5432:5432'

  pgadmin:
    container_name: pgadmin_container
    image: dpage/pgadmin4
    environment:
      PGADMIN_DEFAULT_EMAIL: ${PGADMIN_DEFAULT_EMAIL:-pgadmin4@pgadmin.org}
      PGADMIN_DEFAULT_PASSWORD: ${PGADMIN_DEFAULT_PASSWORD:-UbnGQkh66V3twL3c}
    volumes:
      - pgadmin:/root/.pgadmin
    ports:
      - '${PGADMIN_PORT:-5050}:80'

volumes:
  app-db-data: null
  pgadmin: null
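Assuming the file above is saved as docker-compose.yaml (and the ./postgres build context exists), start both containers with:

docker-compose up -d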

2. Run the migrate up command

migrate -path db/migration/ -database "postgres://postgres:postgres@localhost:5432/tdb?sslmode=disable" -verbose up

3. Running the down command reverts the changes made to the database schema. From the command output, you can confirm that the tables have been successfully dropped.

migrate -path db/migration/ -database "postgres://postgres:postgres@localhost:5432/tdb?sslmode=disable" -verbose down

Generate CRUD Golang Code from SQL: A Comparison of database/sql, gorm, sqlx & sqlc

In the world of Golang, developing efficient CRUD (Create, Read, Update, Delete) operations is a common requirement for interacting with databases. To streamline this process, several libraries and tools have emerged, each with its own strengths and features.

After some research, I found sqlc to be the best fit.

Still, by understanding the differences and benefits of each, you can make an informed decision about which approach best suits your specific needs.

1. database/sql: The database/sql package is part of the standard Go library and provides a basic set of database operations.
Pros:
Lightweight and minimalistic, suitable for simple applications.
Familiar interface for developers already well-versed in Golang.
Cons:
Requires manual mapping of query results to Go structs (see the side-by-side sketch after this list).
Lacks advanced features and functionalities found in other libraries.

2. gorm: Gorm is an Object-Relational Mapping (ORM) library for Golang, offering a high-level abstraction for database interactions.
Pros:
Automatic mapping between Go structs and database tables.
Supports a wide range of databases.
Rich set of features, such as eager loading, associations, and transactions.
Cons:
Overhead from the additional abstraction layer.
Less control over SQL queries, which may impact performance in complex scenarios.

3. sqlx: sqlx is a lightweight extension to the standard database/sql package, providing additional functionality and convenience methods.
Pros:
Enhanced support for working with SQL databases, including result set mapping and named parameter queries.
Retains the simplicity and performance of the database/sql package.
Cons:
Less feature-rich compared to gorm.
Still requires manual handling of some aspects, such as query preparation.

4. sqlc: sqlc is a code generator that creates type-safe Go code from SQL queries, building on the strengths of the database/sql package.
Pros:
Generates efficient and type-safe code based on SQL queries.
Eliminates the need for manual mapping, reducing errors and improving development speed.
Supports custom queries and integrates well with existing Go codebases.
Cons:
Limited to SQL databases.
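To see the trade-off in practice, here is a rough side-by-side sketch. The query matches the GetWhiteboard query defined in the next section; the struct, function, and queries names here are illustrative, not taken from any generated output:

package db

import (
	"context"
	"database/sql"
)

// Whiteboard mirrors the whiteboard table (columns assumed non-NULL here for brevity).
type Whiteboard struct {
	ID        int32
	Name      string
	CreatedBy int32
}

// getWhiteboardManually shows the database/sql approach: the SQL lives in a Go
// string and every column must be scanned into the struct by hand.
func getWhiteboardManually(ctx context.Context, conn *sql.DB, id int32) (Whiteboard, error) {
	var wb Whiteboard
	row := conn.QueryRowContext(ctx,
		"SELECT id, name, created_by FROM whiteboard WHERE id = $1", id)
	err := row.Scan(&wb.ID, &wb.Name, &wb.CreatedBy)
	return wb, err
}

// With sqlc, the equivalent call is a single generated, type-safe method:
//
//	wb, err := queries.GetWhiteboard(ctx, id)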

Configuring sqlc

query/whiteboard.sql

-- name: GetWhiteboard :one
SELECT * FROM whiteboard
WHERE id = $1 LIMIT 1;

-- name: ListWhiteboard :many
SELECT * FROM whiteboard
ORDER BY name;

-- name: CreateWhiteboard :one
INSERT INTO whiteboard (
name, created_by, created_at, updated_at
) VALUES (
$1, $2, $3, $4
)
RETURNING *;

-- name: DeleteWhiteboard :exec
DELETE FROM whiteboard
WHERE id = $1;
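The comment above each statement is a sqlc annotation: the name becomes the generated Go method, and the suffix tells sqlc what the query returns (:one a single row, :many a slice of rows, :exec no rows at all).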
  1. Install sqlc: Start by installing sqlc on your development machine. sqlc has its own command-line interface (CLI) that you can install using package managers like Homebrew (on macOS/Linux) or Chocolatey (on Windows), or by downloading the binary directly from the sqlc GitHub repository: brew install sqlc
  2. Create a query folder: this is where you write your SQL queries for the CRUD operations. You can include multiple SQL statements in a single file (query/whiteboard.sql above).
  3. Configure sqlc: Create a sqlc.yaml file in your project directory to configure sqlc, or run sqlc init to generate a starter file. This file specifies information such as the database engine, the output directory for generated code, and any custom settings.
  4. Define database connection: In the sqlc.yaml file, specify the necessary details for connecting to your database, including the database type (e.g., PostgreSQL, MySQL) and the connection string or credentials.
  5. Configure schema: If you are using SQLc for schema code generation, define the file path to your SQL schema file in the sqlc.yaml file. You can also specify the package name and output directory for the generated Go code.
version: "1"
cloud:
  organization: ""
  project: ""
  hostname: ""
project:
  id: "tdb"
packages:
  - name: "tdb"
    path: "./db/sqlc"
    queries: "./db/query"
    schema: "./db/migration"
    engine: "postgresql"
    emit_json_tags: true
    emit_prepared_queries: true
    emit_interface: false
    emit_exact_table_names: true

6. Generate code: Run the sqlc CLI command (sqlc generate) in your project directory. sqlc will read the sqlc.yaml file and generate Go code based on the SQL statements and schema. The generated code will include functions for executing queries, mapping results to Go structs, and handling errors.
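For the GetWhiteboard query above, the generated code has roughly the following shape (a simplified sketch, not verbatim sqlc output; exact field types depend on the schema's nullability and your emit settings):

package tdb

import (
	"context"
	"database/sql"
	"time"
)

const getWhiteboard = `-- name: GetWhiteboard :one
SELECT id, name, created_by, created_at, updated_at FROM whiteboard
WHERE id = $1 LIMIT 1
`

// Queries wraps the database handle (sqlc actually generates a DBTX interface).
type Queries struct {
	db *sql.DB
}

// Whiteboard is generated from the schema; the json tags come from emit_json_tags.
type Whiteboard struct {
	ID        int32          `json:"id"`
	Name      sql.NullString `json:"name"`
	CreatedBy sql.NullInt32  `json:"created_by"`
	CreatedAt time.Time      `json:"created_at"`
	UpdatedAt time.Time      `json:"updated_at"`
}

// GetWhiteboard corresponds to the -- name: GetWhiteboard :one annotation.
func (q *Queries) GetWhiteboard(ctx context.Context, id int32) (Whiteboard, error) {
	row := q.db.QueryRowContext(ctx, getWhiteboard, id)
	var i Whiteboard
	err := row.Scan(&i.ID, &i.Name, &i.CreatedBy, &i.CreatedAt, &i.UpdatedAt)
	return i, err
}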

Within just 10 minutes, sqlc helped us effortlessly generate comprehensive CRUD (Create, Read, Update, Delete) code in Go. This eliminates the need to repeatedly write common boilerplate code while ensuring both high quality and type safety. Additionally, sqlc compiles the SQL statements as part of code generation, so errors in a query surface immediately instead of at runtime. As a result, developers can significantly speed up their workflow while staying confident that the SQL is correct.

Here are more reasons why you should use sqlc.

  1. Type safety: SQLc ensures type safety by generating Go code that corresponds to the database schema. It analyzes the SQL statements and creates strongly-typed functions and structs, enabling developers to write safer and more reliable code. This eliminates the need for manually writing SQL queries as strings in Go code and reduces the chances of runtime errors due to mismatched types.
  2. Code generation: SQLc automatically generates boilerplate code based on the database schema. It creates Go functions for executing SQL queries, mapping query results to appropriate data structures, and handling errors. This saves developers significant time and effort that would otherwise be spent on writing repetitive and error-prone code.
  3. Compile-time validation: SQLc performs compile-time validation on SQL queries against the database schema. It checks for syntax errors, missing or mismatched column names, and other potential issues. This allows developers to catch errors early in the development process, reducing the likelihood of runtime errors or failed deployments.
  4. Improved maintainability: By using SQLc, developers can keep their SQL queries alongside the corresponding Go code, making it easier to understand and maintain the application’s database interactions. SQLc also supports incremental code generation, which means that only the modified queries or schema changes are regenerated, minimizing the impact on existing code.

Now that we have our db functions ready, let's generate the RPC stubs from a proto file. This project will expose both gRPC and a REST API in Go.

syntax = "proto3";

package whiteboard;

import "google/protobuf/empty.proto";

option go_package = "github.com/geektuhin123/whiteboard";

service Whiteboard {
  rpc AddDrawing(Drawing) returns (google.protobuf.Empty);
  rpc GetDrawings(google.protobuf.Empty) returns (stream Drawing);
  rpc ConnectUser(User) returns (google.protobuf.Empty);
  rpc DisconnectUser(User) returns (google.protobuf.Empty);
  rpc Undo(google.protobuf.Empty) returns (google.protobuf.Empty);
  rpc Redo(google.protobuf.Empty) returns (google.protobuf.Empty);
  rpc Draw(DrawRequest) returns (DrawResponse);
  rpc GetWhiteboard(google.protobuf.Empty) returns (GetWhiteboardResponse);
}

message Drawing {
  string id = 1;
  string user_id = 2;
  repeated Point points = 3;
}

message Point {
  float x = 1;
  float y = 2;
}

message User {
  string id = 1;
}

message GetWhiteboardResponse {
  repeated Stroke strokes = 1;
}

message Stroke {
  int32 id = 1;
  string color = 2;
  float line_width = 3;
  repeated Point points = 4;
}

message DrawRequest {
  string color = 1;
  float line_width = 2;
  repeated Point points = 3;
}

message DrawResponse {
  int32 id = 1;
}
Generate the Go and gRPC stubs with protoc:

protoc --go_out=whiteboard --go_opt=paths=source_relative \
  --go-grpc_out=whiteboard --go-grpc_opt=paths=source_relative \
  whiteboard.proto
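To preview where this is heading, here is a minimal sketch of what a server.go wiring the generated stubs into a gRPC server might look like (the import path follows the go_package option above; the method body is a stub):

package main

import (
	"context"
	"log"
	"net"

	pb "github.com/geektuhin123/whiteboard"
	"google.golang.org/grpc"
	"google.golang.org/protobuf/types/known/emptypb"
)

// server implements the Whiteboard service defined in whiteboard.proto.
type server struct {
	pb.UnimplementedWhiteboardServer
}

func (s *server) AddDrawing(ctx context.Context, d *pb.Drawing) (*emptypb.Empty, error) {
	// Persist the drawing using the sqlc-generated queries here.
	return &emptypb.Empty{}, nil
}

func main() {
	lis, err := net.Listen("tcp", ":50051")
	if err != nil {
		log.Fatal(err)
	}
	s := grpc.NewServer()
	pb.RegisterWhiteboardServer(s, &server{})
	log.Println("gRPC server listening on :50051")
	if err := s.Serve(lis); err != nil {
		log.Fatal(err)
	}
}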

Check out the server.go implementation in the repository below, and follow me for more updates on this project.

https://github.com/geektuhin123/tdb
