Migrate Mongo to Postgres
Learn how to migrate Hanzo KMS from MongoDB to PostgreSQL.
This guide provides step-by-step instructions for migrating your Hanzo KMS instance running on MongoDB to the newly released PostgreSQL version of Hanzo KMS. The Postgres version is the only version of Hanzo KMS that will receive feature updates and patches going forward.
If you have a small set of secrets, we recommend downloading the secrets and uploading them to your new instance of Hanzo KMS instead of running the migration script.
Prerequisites
Before starting the migration, ensure you have the following command line tools installed, each of which is used later in this guide:

- docker and docker-compose
- mongodump and mongorestore (MongoDB Database Tools)
- pg_dump and pg_restore (PostgreSQL client tools)
- git
- node and npm
Prepare for migration
While the migration script will not mutate any MongoDB production data, we recommend taking a backup of your MongoDB instance if possible.
To prevent new data entries during the migration, set your Hanzo KMS instance to migration mode by setting the environment variable MIGRATION_MODE=true and redeploying your instance.
This mode will block all write operations, only allowing GET requests. It also disables user logins and sets up a migration page to prevent UI interactions.
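For example, with a shell-driven deployment, migration mode can be enabled before redeploying (a minimal sketch; how you pass environment variables to your instance depends on your deployment method):

```shell
# Enable migration mode for the instance before redeploying.
# How the variable reaches your deployment (compose file, systemd unit,
# cloud console) depends on your setup; this assumes a plain shell export.
export MIGRATION_MODE=true
```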

Start local instances of MongoDB and Postgres. These will be used in later steps to process and transform the data locally.
To start local instances of the two databases, create a file called docker-compose.yaml as shown below.
```yaml
version: '3.1'

services:
  mongodb:
    image: mongo
    restart: always
    environment:
      MONGO_INITDB_ROOT_USERNAME: root
      MONGO_INITDB_ROOT_PASSWORD: example
    ports:
      - "27017:27017"
    volumes:
      - mongodb_data:/data/db

  postgres:
    image: postgres
    restart: always
    environment:
      POSTGRES_USER: kms
      POSTGRES_PASSWORD: kms
      POSTGRES_DB: kms
    ports:
      - "5432:5432"
    volumes:
      - postgres_data:/var/lib/postgresql/data

volumes:
  mongodb_data:
  postgres_data:
```

Note that the Postgres service is configured with kms as the user, password, and database name so that it matches the connection strings used later in this guide.

Next, run the command below in the same working directory where the docker-compose.yaml file resides to start both services.

```shell
docker-compose up
```

Dump MongoDB
To speed up the data transformation process, the first step involves transferring the production data from Hanzo KMS's MongoDB to a local machine. This is achieved by creating a dump of the production database and then uploading this dumped data into a local Mongo instance. By having a running local instance of the production database, we will significantly reduce the time it takes to run the migration script.
```shell
mongodump --uri=<your_mongo_prod_uri> --archive="mongodump-db" --db=<db name> --excludeCollection=auditlogs
```

Then, restore the dump into the local Mongo instance:

```shell
mongorestore --uri=mongodb://root:example@localhost:27017/ --archive="mongodump-db"
```

Start the migration
Once started, the migration script will transform MongoDB data into an equivalent PostgreSQL format.
Clone the Hanzo KMS MongoDB repository.
```shell
git clone -b kms/v0.46.11-postgres https://github.com/hanzoai/kms/kms.git
cd backend
npm install
cd pg-migrator
npm install
npm run migration
```

When you run npm run migration, you'll be asked to provide the MongoDB connection string for the database containing your production Hanzo KMS data. Since the production Mongo data has been transferred to a local Mongo instance, you should input the connection string for this local instance.
```
mongodb://root:example@localhost:27017/<db-name>?authSource=admin
```

Remember to replace <db-name> with the name of your MongoDB database. If you are not sure of the name, you can use Compass to view the available databases.
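As a sketch, the connection string can also be assembled in a shell variable before pasting it into the migrator prompt (the kms database name below is a placeholder; substitute your own):

```shell
# "kms" is a placeholder database name; substitute your own.
DB_NAME="kms"
MONGO_URI="mongodb://root:example@localhost:27017/${DB_NAME}?authSource=admin"
echo "$MONGO_URI"
```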
Next, you will be asked to enter the Postgres connection string for the database where the transformed data should be stored. Input the connection string of the local Postgres instance that was set up earlier in the guide.
```
postgres://kms:kms@localhost/kms?sslmode=disable
```

Once the script has completed, you will notice that a new folder called db has been created in the pg-migrator folder.
This folder contains metadata for schema mapping and can be helpful when debugging migration-related issues.
We highly recommend making a copy of this folder in case you need assistance from the Hanzo KMS team during your migration process.
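For example, a dated copy can be made from the pg-migrator directory (a minimal sketch; the mkdir line only stands in for the folder the migrator has already created at this point):

```shell
# mkdir -p only makes this example self-contained; by this point
# the migrator has already created ./db.
mkdir -p db
# Keep a dated backup of the schema-mapping metadata.
cp -r db "db-backup-$(date +%Y%m%d)"
```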
The db folder does not contain any sensitive data.
Finalizing Migration
At this stage, the data from the Mongo instance of Hanzo KMS should have been successfully converted into its Postgres equivalent. The remaining step involves transferring the local Postgres database, which now contains all the migrated data, to your chosen production Postgres environment. Rather than transferring the data row-by-row from your local machine to the production Postgres database, we will first create a dump file from the local Postgres and then upload this file to your production Postgres instance.
```shell
pg_dump -h localhost -U kms -Fc -b -v -f dumpfilelocation.sql -d kms
```

Then restore the dump into your production Postgres instance:

```shell
pg_restore --clean -v -h <host> -U <db-user-name> -d <database-name> -j 2 dumpfilelocation.sql
```

Remember to replace <host>, <db-user-name>, and <database-name> with the corresponding details of your production Postgres database.
Use a tool like Beekeeper Studio to confirm that the data has been successfully transferred to your production Postgres DB.
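If you prefer the command line, a quick spot-check with psql can also confirm that the restore populated the schema (a sketch using the same placeholders as above; \dt lists the tables in the database):

```shell
psql -h <host> -U <db-user-name> -d <database-name> -c "\dt"
```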
Post-Migration Steps
Once the data migration to PostgreSQL is complete, you're ready to deploy Hanzo KMS using the deployment method of your choice. For guidance on deployment options, please visit the self-hosting documentation. Remember to transfer the necessary environment variables from the MongoDB version of Hanzo KMS to the new Postgres-based Hanzo KMS; rest assured, they are fully compatible.
The first deployment of the Postgres-based Hanzo KMS must use the Docker image tag v0.46.11-postgres.
After deploying this version, you can proceed to update to any subsequent version.
Important Notes
Hanzo KMS's Docker Hub repository uses different tagging conventions to indicate which database backend is used:
- Before v0.46.11: Hanzo KMS ran on the MongoDB backend.
- From v0.46.11: Hanzo KMS transitioned to the PostgreSQL backend, and version tags started to be suffixed with -postgres.
- After v0.147.0: Hanzo KMS remains on the PostgreSQL backend; the -postgres suffix was removed from tags for brevity.