How Docker made my database a briefcase

Santamaria Paul
3 min readMay 9, 2021

Sick of installing MySQL or PostgreSQL every time you change servers?

In this article, I would like to show you how we can Dockerize a structured database, put it in a bag, and send it anywhere to get it running in seconds.

First, I want to share how Docker solved my issue in a client project, and then I will explain why and how I came to use Docker.

My project was to redesign a website to make it more user-friendly and to maximize SEO. But the main reason was to level up my skills and improve my ability to adapt to unknown technologies. The initial website was built in Python with the Django framework, with no way to change data from an administration panel, which was a pain point for updating some text.

But the main pain point of the website was its hosting. My client had no budget for hosting at the time the website was created, so he decided to take Heroku’s free plan. The issue with this plan is that Heroku puts your website “asleep”, and whenever the address is called, the “wake up” takes about 10 seconds. This is a problem for both SEO and the user experience of the website.

So in order to fix these pain points I needed to find low-cost hosting, less than €30 a year, with the capacity to manage a Postgres database and a large amount of storage for images…

After a long search I found mvps.net, which offers a VPS for €3 a month. In order to optimize the VPS storage, I decided to use Docker to deploy my project easily and conveniently. This lets me have the exact same database locally and in production, and also avoids installing heavy dependencies such as Postgres and pip packages directly on the server.

At this point I had never used Docker, and from what I had read, using docker-compose is much easier and more understandable than writing a Dockerfile for each of my services.

What you need to install on your local machine:

  • Docker and docker-compose (volumes are built into Docker, so nothing extra is needed for them)
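On a Debian/Ubuntu machine, a minimal install might look like the following sketch (package names and the compose command vary by distribution and Docker version):

```shell
# Install Docker Engine and the classic docker-compose binary
sudo apt-get update
sudo apt-get install -y docker.io docker-compose

# Let your user run docker without sudo (log out and back in afterwards)
sudo usermod -aG docker "$USER"

# Sanity check
docker --version
docker-compose --version
```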

Initialize Postgres container with a docker-compose.yml

version: '3'
services:
  db:
    image: postgres
    restart: always
    environment:
      - POSTGRES_DB=db
      - POSTGRES_USER=user
      - POSTGRES_PASSWORD=password
    volumes:
      - ./data:/var/lib/postgresql/data
    ports:
      - 5000:5432

After running “docker-compose up -d”, a folder named “./data” is generated and the Postgres database is available on localhost on port 5000.
The ./data folder contains Postgres’s structured data and will contain the future data of our Django models.
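To check that the container is up and reachable, you can connect with psql from the host. This is a sketch; it assumes the psql client is installed locally and uses the user/password/db values from the compose file above:

```shell
# List running compose services; "db" should show as Up
docker-compose ps

# Connect to the containerized Postgres from the host
# (port 5000 on localhost is mapped to 5432 inside the container)
psql -h localhost -p 5000 -U user -d db

# …or run psql inside the container, no local client needed:
docker-compose exec db psql -U user -d db
```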

Populate the database

After building your app using this database, you can compress the data folder with tar:

tar -czvf data.tar.gz data/

Then send your archive to your remote server with rsync (note that ssh -i expects your private key, not the .pub file):

rsync -av -e "ssh -i /home/YOUR_LOCAL_USER/.ssh/id_rsa" data.tar.gz YOUR_REMOTE_USER@IP:/www/

Log in to your remote server and extract the archive:

cd /www && tar xzvf data.tar.gz

Then, once the /data folder is on your remote server, you need to:

  • Install Docker and docker-compose
  • Create a /www/docker-compose.yml with the same content as the local docker-compose.yml you developed your app with, then run:

docker-compose up -d

That’s it, you have your exact local database on your remote server!
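To confirm the restored data actually made it, you can query the remote container directly. A sketch; the table names will match whatever your Django app created locally:

```shell
# From /www on the remote server: list the tables in the restored database
cd /www
docker-compose exec db psql -U user -d db -c '\dt'
```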


I'm Paul Santamaria, a web developer for digital projects, based in Paris and hoping to move to Berlin.