Database Documentation via DBDocs
Manual Installation
#!/bin/bash
# Check for root privileges
if [[ $EUID -ne 0 ]]; then
    echo "This script must be run as root."
    exit 1
fi

# Define the desired Node.js version
NODE_VERSION="14" # Change to your desired Node.js version (e.g., "14")

# Verify if Node.js is already installed
if ! command -v node &>/dev/null; then
    echo "Node.js is not installed. Installing Node.js version $NODE_VERSION..."
    # Add the NodeSource repository for the specified Node.js version
    # (sudo is unnecessary here because the script already runs as root)
    curl -fsSL https://deb.nodesource.com/gpgkey/nodesource.gpg.key | gpg --dearmor -o /usr/share/keyrings/nodesource-archive-keyring.gpg
    echo "deb [signed-by=/usr/share/keyrings/nodesource-archive-keyring.gpg] https://deb.nodesource.com/node_$NODE_VERSION.x $(lsb_release -cs) main" | tee /etc/apt/sources.list.d/nodesource.list
    # Update the package list
    apt update
    # Install Node.js (npm is bundled with the NodeSource package)
    apt install -y nodejs
else
    echo "Node.js is already installed."
fi

# Define the desired dbdocs version
DBDOCS_VERSION="0.8.1" # Change to your desired dbdocs version

# Verify if dbdocs is already installed
if ! npm list -g dbdocs@$DBDOCS_VERSION --depth=0 &>/dev/null; then
    echo "dbdocs version $DBDOCS_VERSION is not installed. Installing..."
    # Install dbdocs globally
    npm install -g dbdocs@$DBDOCS_VERSION
else
    echo "dbdocs version $DBDOCS_VERSION is already installed."
fi

# Verify installations
NODEJS_VERSION=$(node -v)
NPM_VERSION=$(npm -v)
DBDOCS_INSTALLED_VERSION=$(dbdocs --version)
echo "Node.js version: $NODEJS_VERSION"
echo "npm version: $NPM_VERSION"
echo "dbdocs version: $DBDOCS_INSTALLED_VERSION"
echo "Node.js, npm, and dbdocs have been installed successfully."
- Create a new file named install_dbdocs.sh and paste the script into it.
- Make the script executable: chmod +x install_dbdocs.sh
- Run the script with root privileges: sudo ./install_dbdocs.sh
To verify that dbdocs installed successfully, check its version: dbdocs -v
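Once the CLI is available, publishing your first documentation follows a simple flow. As a minimal sketch (assuming you already have a sample.dbml file and a dbdocs.io account; the file name is a placeholder):
# Log in to dbdocs (prompts for authentication with your dbdocs.io account)
dbdocs login
# Build and publish the documentation from a DBML file
dbdocs build sample.dbml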
DBML to SQL or SQL to DBML
To make the conversion straightforward, use the standard CLI tool: npm install -g @dbml/cli.
To begin, take any sample.sql file and convert it into .dbml. For complete instructions, see https://dbml.dbdiagram.io/cli/#convert-a-sql-file-to-dbml
$ sql2dbml --mysql dump.sql -o mydatabase.dbml
✔ Generated DBML file from SQL file (MySQL): mydatabase.dbml
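The reverse direction works the same way. As a sketch (assuming a schema file named mydatabase.dbml and a PostgreSQL target; use --mysql for MySQL), dbml2sql from the same @dbml/cli package generates a SQL dump from DBML:
$ dbml2sql --postgres mydatabase.dbml -o mydatabase.sql
The resulting file contains the CREATE TABLE statements corresponding to the DBML definitions.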
Installation via Docker
To create a Docker Compose setup that runs the dbdocs tool along with Node.js, npm, and a database schema, while configuring dbdocs through environment variables (port, username, password, etc.), you can use the following example as a starting point. This example assumes you are using PostgreSQL as the database and want to customize various settings:
Dockerfile
# Use an official Node.js image as the base image
FROM node:14
# Set the working directory in the container
WORKDIR /app
# Install dbdocs globally using npm
RUN npm install -g dbdocs
# Expose the port used by dbdocs (default is 8080)
EXPOSE 8080
# Command to run dbdocs (customize as needed)
CMD ["dbdocs", "build", "/app/sample.dbml"]
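The docker-compose.yml below references a custom image hosted in Amazon ECR. As a rough sketch of how that image would be produced from this Dockerfile (the registry URL MYAWS.dkr.ecr.us-west-2.amazonaws.com and the my-dbdocs name are placeholders taken from the compose file, and the AWS CLI v2 is assumed to be configured):
# Build the image from the Dockerfile in the current directory
docker build -t my-dbdocs:latest .
# Tag it for your ECR registry (placeholder registry URL)
docker tag my-dbdocs:latest MYAWS.dkr.ecr.us-west-2.amazonaws.com/my-dbdocs:latest
# Authenticate Docker against ECR and push the image
aws ecr get-login-password --region us-west-2 | docker login --username AWS --password-stdin MYAWS.dkr.ecr.us-west-2.amazonaws.com
docker push MYAWS.dkr.ecr.us-west-2.amazonaws.com/my-dbdocs:latest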
docker-compose.yml
version: '3'
services:
  # dbdocs container
  dbdocs:
    container_name: my-dbdocs
    image: MYAWS.dkr.ecr.us-west-2.amazonaws.com/my-dbdocs:latest
    volumes:
      - ./sample.dbml:/app/sample.dbml
    ports:
      - "8080:8080" # Map the container port to a host port (customize as needed)
    environment:
      - DBDOCS_DB_CONNECTION_STRING=postgres://your_db_user:your_db_password@db:5432/your_db_name
      - DBDOCS_DB_TYPE=postgres
      - DBDOCS_API_PATH=/dbdocs
      - DBDOCS_PORT=8080 # Customize the port for dbdocs
      - DBDOCS_USERNAME=your_dbdocs_username
      - DBDOCS_PASSWORD=your_dbdocs_password
      - DBDOCS_TOKEN=dbdocs_token
    depends_on:
      - db

  # Database container (if you're using PostgreSQL)
  db:
    image: postgres:13 # Use the appropriate PostgreSQL version
    environment:
      POSTGRES_USER: your_db_user
      POSTGRES_PASSWORD: your_db_password
      POSTGRES_DB: your_db_name
    ports:
      - "5432:5432" # Adjust the port as needed
Build and run the containers:
docker-compose build
docker-compose up -d
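To confirm that both containers came up as expected, a quick check (service and container names as defined in the compose file above) might look like:
# List the services and their status
docker-compose ps
# Follow the dbdocs container logs to confirm the build step ran
docker-compose logs -f dbdocs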
The dbdocs container will use this custom image, and you can access dbdocs on port 8080 of your host machine. Adjust the environment variables in the docker-compose.yml file to match your specific database and dbdocs configuration. Remember to replace placeholders like your_db_user, your_db_password, your_db_name, your_dbdocs_username, and your_dbdocs_password with your actual values.
Local Development
If you want to develop your project locally and then push your project files to the running dbdocs container on a remote server (e.g., via Jenkins), you'll need to set up a mechanism for file synchronization or copying. Here's a general approach to achieve this:
Shared Volume (Local to Remote):
Create a shared volume that can be accessed by both your local development environment and the running dbdocs container on the remote server. This shared volume will allow you to copy your project files from your local machine to the remote server.
Modify your docker-compose.yml file to define a volume for the dbdocs service:
version: '3'
services:
  # ...
  # dbdocs container
  dbdocs:
    volumes:
      - ./sample.dbml:/app/sample.dbml # Mount the local sample.dbml file to /app/sample.dbml in the container
    # ... (other configuration)
In this example, the ./sample.dbml file on your local machine is mounted to /app/sample.dbml inside the dbdocs container.
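To check that the mount is visible from inside the container, something like the following should work (my-dbdocs is the container_name from the compose file, and the build step assumes the container is authenticated, e.g. via the DBDOCS_TOKEN environment variable):
# Confirm the mounted file exists inside the container
docker exec my-dbdocs ls -l /app/sample.dbml
# Optionally rebuild the documentation from the mounted file
docker exec my-dbdocs dbdocs build /app/sample.dbml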
Deployment Workflow:
1. Develop your project locally in the ./project-files directory.
2. When you're ready to push your project files to the running dbdocs container on the remote server, use a tool like rsync or scp to copy the files. You can also integrate this step into your Jenkins pipeline.
3. For example, using rsync in a Jenkins pipeline:
stage('Deploy to Remote') {
    steps {
        script {
            // Copy project files to the remote server (replace placeholders)
            sh "rsync -avz ./project-files/ user@remote-server:/path/to/dbdocs-container/app/project-files/"
        }
    }
}
Ensure that the running dbdocs container on the remote server is configured to watch for changes in the /app/project-files directory and automatically update the documentation when changes occur. One of the easiest ways to do this is to set up a repository for project-files and trigger the Jenkins pipeline as soon as changes are pushed to the repository.
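As a minimal sketch of that trigger-on-push setup (the repository URL, branch, and remote paths are placeholders), a declarative Jenkinsfile could poll the project-files repository and reuse the rsync step from above; a webhook trigger can replace polling if your Git host supports it:
pipeline {
    agent any
    // Poll the repository every few minutes for new commits
    triggers {
        pollSCM('H/5 * * * *')
    }
    stages {
        stage('Checkout') {
            steps {
                // Placeholder repository URL for the project-files repo
                git url: 'https://github.com/your-org/project-files.git', branch: 'main'
            }
        }
        stage('Deploy to Remote') {
            steps {
                // Copy project files to the remote server (replace placeholders)
                sh "rsync -avz ./project-files/ user@remote-server:/path/to/dbdocs-container/app/project-files/"
            }
        }
    }
}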
Use this GitHub repo to download the scripts. I hope you find it useful; feel free to write to ankurlnmiit@gmail.com or leave a comment.