- Choose a Postgres client; pgAdmin is the most full-featured, and IntelliJ also has a built-in plugin.
- Run the docker compose file `./docker-compose.postgres.yml` in the root of the project (see the sketch after this list).
- Run the `generate_env.sh` script: `./generate_env.sh`
- Run the `./gradlew clean app:run` command in your terminal.
- Configure your preferred db client to connect to the database with the connection information. NOTE: The database runs on the non-default port of `5433` to avoid collision with the ReportStream Postgres database.
- Grab the create table command from line 77 of this project file, from after the `-c` to the first semicolon (the `ALTER TABLE` statements are not needed), and run it in your local Postgres instance to create the table.
- Upon creating the table for the first time, you may have to instruct your client to refresh (in pgAdmin, right-click on the left-hand menu and select Refresh).
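If it helps, here is a minimal sketch of bringing up the container and checking the connection from a terminal. It assumes Docker Compose v2 (`docker compose`) and the same local credentials used by the Liquibase command later in this document; substitute your own values if your environment differs.

```shell
# Start the local Postgres container defined in the compose file at the repo root
docker compose -f docker-compose.postgres.yml up -d

# Verify the connection on the non-default port 5433; the database, user, and
# password ("changeIT!") mirror the Liquibase command shown later in this document.
psql -h localhost -p 5433 -U intermediary -d intermediary -c 'SELECT version();'
```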
- For a CDC domain-hosted environment, connect to the VPN. See the VPN documentation.
- Install the Azure CLI.
- Open the Azure environment you are trying to access from within your browser and navigate to the database.
- Inside of the Azure database page, select the Networking option from the left-hand nav.
- For a Flexion domain-hosted environment, click the link that says `Add current client IP address` and then save the page. NOTE: You should only add your local IP address on a temporary basis; remove it after the verification is complete.
- On the left-hand navigation, select Authentication and select the `Add Microsoft Entra Admins` link to add your user to the list. Select Okay and then save the underlying page. NOTE: This permission should only be added temporarily and removed after you are finished with verification.
- Enter the new connection settings from Azure into your db client of choice.
  - The password will come from the `az account get-access-token` step below and can be left blank for now.
- Run `az login` inside of your local terminal.
- Run `az account get-access-token --resource https://ossrdbms-aad.database.windows.net` to get a temporary password (see the sketch after this list).
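Putting those last two steps together, a minimal sketch of fetching the token and connecting with `psql` could look like the following. The host, user, and database values are placeholders, not real project values, so use the connection settings shown on the Azure database page instead.

```shell
# Log in to Azure (opens a browser for authentication)
az login

# Fetch a temporary access token and use it as the database password;
# --query/--output just extract the raw token string from the JSON response.
export PGPASSWORD="$(az account get-access-token \
  --resource https://ossrdbms-aad.database.windows.net \
  --query accessToken --output tsv)"

# Placeholder connection values: replace with those from the Azure database page.
psql "host=<your-azure-server>.postgres.database.azure.com port=5432 user=<your-entra-user> dbname=<your-database> sslmode=require"
```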
To modify the schema, there are a few locations in the code we need to update.

- Add the update inside the databaseMigrations folder in the repo. See the section on migrations for details.
- `PostgresDao.java` contains our queries for the database. Remember to update both the save and the fetch methods.
- `DatabasePartnerMetadataStorage` is what calls our DAO to perform the db operation; you will need to update the inputs here.
We use Liquibase to handle the migrations. Our migration files are located in `/etor/databaseMigrations`. Documentation on Liquibase can be found here.

Our change log files are organized by business domain. For example, `metadata.yml` includes migrations about our metadata. You may need to create a new file if the migrations are for a business domain not previously covered. Add the new file to `/etor/databaseMigrations` with the name of the business domain. For example, `dogcow.yml` for migrations associated with DogCows. Liquibase supports multiple change log file types, but we prefer YAML. YAML supports specialized change types but also supports arbitrary SQL if the flexibility is needed.
A couple of concepts we adhere to...

- For each new file, start the `id` at 1 and increment from there.
- Use your GitHub username as the `author`.
- Provide a `comment` that describes what you are trying to accomplish or why.
- If a migration only applies to Azure, include a `label` of `azure`.
- Include only one change per `changeSet`. (A sketch of a new change log file that follows these conventions appears below.)
Reference this new file in `root.yml` by adding it at the bottom of all the other includes, except for any includes that are required to be last (e.g. the `etor/databaseMigrations/azure.yml` include).
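As an illustration of the conventions above, here is a minimal, hypothetical sketch of creating a change log for the DogCow example; the table and column are made up, and the actual change types will depend on what your migration needs.

```shell
# Hypothetical example only: a new change log for the "dogcow" business domain.
# The id starts at 1, the author is your GitHub username, and there is one change
# per changeSet. Add "labels: azure" to a changeSet if it only applies to Azure.
cat > ./etor/databaseMigrations/dogcow.yml <<'EOF'
databaseChangeLog:
  - changeSet:
      id: 1
      author: your-github-username
      comment: Create the dogcow table to hold DogCow records
      changes:
        - createTable:
            tableName: dogcow
            columns:
              - column:
                  name: id
                  type: uuid
                  constraints:
                    primaryKey: true
EOF

# Then reference the new file from root.yml, above any includes that must remain
# last (such as azure.yml), with an entry like:
#   - include:
#       file: ./etor/databaseMigrations/dogcow.yml
```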
Our deployed databases have migrations run automatically by our CD processes.

To run migrations when running our database locally, run the following...

```shell
liquibase update --changelog-file ./etor/databaseMigrations/root.yml --url jdbc:postgresql://localhost:5433/intermediary --username intermediary --password 'changeIT!' --label-filter '!azure'
```

Notice the `--label-filter '!azure'`. This will prevent the Azure-specific migrations from running and failing in the local environment.
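To confirm what has or has not been applied locally, the stock Liquibase `status` command can help; this is not a project-specific script, just standard Liquibase CLI usage reusing the same connection settings as the update above.

```shell
# List any change sets in the changelog that have not yet been applied locally
liquibase status --verbose --changelog-file ./etor/databaseMigrations/root.yml --url jdbc:postgresql://localhost:5433/intermediary --username intermediary --password 'changeIT!' --label-filter '!azure'
```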
To revert changes in the Azure database, we use the GitHub Actions runner.

- Find the runner named Rollback DB in the repository under the Actions tab.
- Select Run workflow.
- Run the workflow (example inputs below).
  - Use workflow from: Always Main.
  - Number of migrations to rollback: an integer that is specific to the file input (see the Liquibase documentation).
  - File with migrations you want to roll back: this should be the file containing the specific sub-file in the `databaseMigrations` directory you want to roll back from. Generally you want to pick the root migration because it takes all the migrations into account, so the rollback count is applied across all of them. EX: `./etor/databaseMigrations/root.yml`
  - Azure environment to rollback on: select the environment you want to see the changes reverted from.
- Once you have the inputs entered, select Run Workflow on the tab and the workflow should start up.
To roll back changes locally, use the `rollback-count` command and pass in an argument for the number of rollbacks using `--count=?`. See the sketch below.
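A minimal sketch against the local database, mirroring the connection settings from the `liquibase update` command above; `--count=1` rolls back only the most recent change set, so adjust it to the number you need.

```shell
# Roll back the single most recent change set applied to the local database
liquibase rollback-count --count=1 --changelog-file ./etor/databaseMigrations/root.yml --url jdbc:postgresql://localhost:5433/intermediary --username intermediary --password 'changeIT!' --label-filter '!azure'
```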