Commit

Merge branch 'main' into shanice/transition_ecs_infra
shanice-skylight authored Jan 7, 2025
2 parents c13f0b1 + 370365d commit afbce17
Showing 114 changed files with 89,347 additions and 121,355 deletions.
19 changes: 18 additions & 1 deletion .github/workflows/ci.yaml
@@ -44,7 +44,24 @@ jobs:
run: npm install
- name: Run tests
working-directory: ./query-connector
run: npm test
run: npm run test:unit

integration-tests:
runs-on: ubuntu-latest
steps:
- name: Checkout code
uses: actions/checkout@v4
- name: Set up Node.js
uses: actions/setup-node@v4
with:
node-version: ${{env.NODE_VERSION}}
- name: Install dependencies
working-directory: ./query-connector
run: npm install

- name: Run tests
working-directory: ./query-connector
run: npm run test:integration

end-to-end-tests:
timeout-minutes: 15
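The split above assumes that `query-connector/package.json` defines matching `test:unit` and `test:integration` scripts. A minimal sketch of that wiring, inferred from this workflow and from `integration.sh` later in this diff; the exact flags and script bodies are assumptions, not the repository's actual configuration:

```json
{
  "scripts": {
    "test:unit": "jest --testPathIgnorePatterns=tests/integration",
    "test:integration": "bash integration.sh"
  }
}
```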
2 changes: 1 addition & 1 deletion .pre-commit-config.yaml
@@ -6,7 +6,7 @@ repos:
name: Pretty Format JSON
args: [--autofix, --no-sort-keys]
- repo: https://github.com/pre-commit/mirrors-eslint
rev: "v9.16.0"
rev: "v9.17.0"
hooks:
- id: eslint
name: ESLint
104 changes: 54 additions & 50 deletions query-connector/README.md
@@ -4,84 +4,53 @@

The DIBBs Query Connector app offers a REST API for searching for a patient and viewing information tied to your case investigation.

### Running Query Connector
### Running Query Connector (Prerequisites)

The Query Connector app can be run using Docker (or any other OCI container runtime e.g., Podman), or directly from the Node.js source code.

#### Obtaining an eRSD API Key

Before running the Query Connector locally, you will need to obtain an API key for the electronic Reporting and Surveillance Distribution (eRSD). With the API key, you have access to 200+ pre-built queries for reportable conditions, e.g., chlamydia, influenza, hepatitis A, etc. These queries can be used and modified in the Query Connector app.

To obtain a free API key, please visit [https://ersd.aimsplatform.org/#/api-keys](https://ersd.aimsplatform.org/#/api-keys) and follow the sign up instructions.

Next, set up your `.env` file with the following command: `cp .env.sample .env`

Adjust your `DATABASE_URL` as needed.

Add your API keys as an environment variables called `ERSD_API_KEY` and `UMLS_API_KEY` in the `.env` file so that they can be accessed when running the Query Connector app.

#### Running Keycloak for Authentication

```
docker compose -f docker-compose-dev.yaml up keycloak
```

To login via Keycloak, make sure your `.env` is updated using `cp` command above and use the following credentials to login at `localhost:8080` after spinning up the container:

```
Username: qc-admin
Password: QcDev2024!
```

Next, run the app with `npm run dev` or `npm run dev-win`. You should see a sign in button at [http://localhost:3000](http://localhost:3000). Click it and login with the above credentials, and it should redirect back to [http://localhost:3000/query](http://localhost:3000/query)!

#### Running with Docker (Recommended)

To run the Query Connector app with Docker, follow these steps.

1. Confirm that you have Docker installed by running `docker -v`. If you do not see a response similar to what is shown below, follow [these instructions](https://docs.docker.com/get-docker/) to install Docker.
The Query Connector app can be run using Docker (or any other OCI container runtime, e.g., Podman), or directly from the Node.js source code. Confirm that you have Docker installed by running `docker -v`. If you do not see a response similar to what is shown below, follow [these instructions](https://docs.docker.com/get-docker/) to install Docker.

```
❯ docker -v
Docker version 20.10.21, build baeda1f
```

2. Download a copy of the Docker image from the PHDI repository by running `docker pull ghcr.io/cdcgov/phdi/query-connector:latest`.
1. If you're using an M1 Mac, you'll need to tell Docker to pull the non-Apple Silicon image using `docker pull --platform linux/amd64 ghcr.io/cdcgov/phdi/query-connector:latest` since we don't have a image for Apple Silicon. If you're using this setup, there might be some issues with architecture incompatability that the team hasn't run into, so please flag if you run into something!
3. Run the service with `docker run -p 3000:3000 query-connector:latest`. If you're on a Windows machine, you may need to run `docker run -p 3000:3000 ghcr.io/cdcgov/phdi/query-connector:latest` instead.
#### Running with Docker (Recommended)

1. Download a copy of the Docker image from the Query Connector repository by running `docker pull ghcr.io/cdcgov/dibbs-query-connector/query-connector:latest`.
   1. If you're using an M1 Mac, you'll need to tell Docker to pull the non-Apple Silicon image using `docker pull --platform linux/amd64 ghcr.io/cdcgov/dibbs-query-connector/query-connector:latest`, since we don't publish an Apple Silicon image. With this setup there may be architecture-incompatibility issues the team hasn't run into yet, so please flag anything you encounter!
2. Run the service with `docker run -p 3000:3000 query-connector:latest`. If you're on a Windows machine, you may need to run `docker run -p 3000:3000 ghcr.io/cdcgov/dibbs-query-connector/query-connector:latest` instead.

Congratulations, the Query Connector app should now be running on `localhost:3000/query-connector`!
The containers may take a few minutes to spin up, but if all goes well, the Query Connector app should now be running on `localhost:3000/query-connector`. Congratulations!

#### Running from Node.js Source Code
#### Running in Dev Mode from the Node.js Source Code

We recommend running the Query Connector app from a container, but if that is not feasible for a given use-case, it may also be run directly from Node using the steps below.

1. Ensure that both Git and Node 18.x or higher are installed.
2. Clone the PHDI repository with `git clone https://github.com/CDCgov/phdi`.
3. Navigate to `/phdi/containers/query-connector/`.
1. Ensure that both [Git](https://git-scm.com/book/en/v2/Getting-Started-Installing-Git) and [Node 18.x or higher](https://nodejs.org/en/download/package-manager/current) are installed.
2. Clone the Query Connector repository with `git clone https://github.com/CDCgov/dibbs-query-connector`.
3. Navigate to the app folder with `cd dibbs-query-connector/query-connector`.
4. Install all of the Node dependencies for the Query Connector app with `npm install`.
5. Run the Query Connector app on `localhost:3000` with `npm run dev`. If you are on a Windows Machine, you may need to run `npm run dev-win` instead.

Startup may take a few minutes, but if all goes well, the Query Connector app should now be running on `localhost:3000`. Congratulations!

### Building the Docker Image

To build the Docker image for the Query Connector app from source, instead of pulling it from the container registry, follow these steps.

1. Ensure that both [Git](https://git-scm.com/book/en/v2/Getting-Started-Installing-Git) and [Docker](https://docs.docker.com/get-docker/) are installed.
2. Clone the PHDI repository with `git clone https://github.com/CDCgov/phdi`.
3. Navigate to `/phdi/containers/query-connector/`.
2. Clone the Query Connector repository with `git clone https://github.com/CDCgov/dibbs-query-connector.git`.
3. Navigate to `dibbs-query-connector/query-connector`.
4. Run `docker build -t query-connector .`.

### Running via docker-compose (WIP)

The Query Connector will eventually require inputs from other DIBBs services. For now, this is a simplified docker compose file that starts the Node service. This can be run with `docker compose up --build`. See the [Docker Compose documentation](https://docs.docker.com/engine/reference/commandline/compose_up/) for additional information.

### Developer Documentation

A Postman collection demonstrating use of the API can be found [here](https://github.com/CDCgov/dibbs-query-connector/blob/main/query-connector/src/app/assets/DIBBs_Query_Connector_API.postman_collection.json).

### Query Connector Data for Query Building

When initializing the backend database for the first time, the Query Connector makes the value sets associated with 200+ reportable conditions available to users tasked with building queries for their jurisdiction. To group value sets by condition and to group the conditions by type, the Query Connector obtains and organizes data from the eRSD and the VSAC in the following way:
When initializing the backend database for the first time, the Query Connector makes the value sets associated with 200+ reportable conditions available to users tasked with building queries for their jurisdiction. To run this seeding script, you'll need to obtain the UMLS and eRSD API keys using the instructions below.

To group value sets by condition and to group the conditions by type, the Query Connector obtains and organizes data from the eRSD and the VSAC in the following way:

1. The Query Connector retrieves the 200+ reportable conditions from the eRSD as well as the value sets' associated IDs.
2. Using the value set IDs from the eRSD, the Query Connector retrieves the value set's comprehensive information from the VSAC, i.e., the LOINC, SNOMED, etc. codes associated with each value set ID.
@@ -91,6 +60,41 @@ When initializing the backend database for the first time, the Query Connector m

In order to make the dev process as low-lift as possible, we want to avoid executing the `db-creation` scripts when booting up the application in dev mode via `npm run dev` or `npm run dev-win`. To that end, we've created a `pg_dump` file containing all the value sets, concepts, and foreign key mappings that would be extracted from a fresh pull of the eRSD and processed through our creation functions. This file, `vs_dump.sql`, is mounted into the Docker volume of our Postgres DB as an entrypoint script when running in dev mode, which means it is executed automatically whenever the DB is freshly spun up. You shouldn't need to do anything to facilitate this mounting or to run the file yourself.
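For reference, the mount uses the standard Postgres entrypoint-script pattern; the `docker-compose-integration.yaml` added later in this commit wires it up the same way:

```yaml
# Illustrative excerpt: mount the dump as an entrypoint script so Postgres runs it on first startup
volumes:
  - ./vs_dump.sql:/docker-entrypoint-initdb.d/vs_dump.sql
```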

#### Obtaining an eRSD API Key

Before running the Query Connector locally, you will need to obtain an API key for the electronic Reporting and Surveillance Distribution (eRSD). With the API key, you have access to 200+ pre-built queries for reportable conditions, e.g., chlamydia, influenza, hepatitis A, etc. These queries can be used and modified in the Query Connector app.

To obtain the free API keys, please visit the following URLs and follow the sign up instructions.

- [https://ersd.aimsplatform.org/#/api-keys](https://ersd.aimsplatform.org/#/api-keys)
- [https://uts.nlm.nih.gov/uts/login](https://uts.nlm.nih.gov/uts/login)

Next, set up your `.env` file with the following command: `cp .env.sample .env`

Adjust your `DATABASE_URL` as needed.

Add your API keys as environment variables called `ERSD_API_KEY` and `UMLS_API_KEY` in the `.env` file so that they can be accessed when running the Query Connector app.
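For example, a filled-in `.env` might look like the following; the variable names come from this README, the connection string mirrors the local Postgres setup used by the integration compose file, and the key values are placeholders:

```
DATABASE_URL=postgresql://postgres:pw@localhost:5432/tefca_db
ERSD_API_KEY=<your eRSD API key>
UMLS_API_KEY=<your UMLS API key>
```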

#### Running Keycloak for Authentication

```
docker compose -f docker-compose-dev.yaml up keycloak
```

To log in via Keycloak, make sure your `.env` is updated using the `cp` command above, then use the following credentials to log in at `localhost:8080` after spinning up the container:

```
Username: qc-admin
Password: QcDev2024!
```

Next, run the app with `npm run dev` or `npm run dev-win`. You should see a sign-in button at [http://localhost:3000](http://localhost:3000). Click it, log in with the credentials above, and you should be redirected to [http://localhost:3000/query](http://localhost:3000/query)!

### Developer Documentation

A Postman collection demonstrating use of the API can be found [here](https://github.com/CDCgov/dibbs-query-connector/blob/main/query-connector/src/app/assets/DIBBs_Query_Connector_API.postman_collection.json).

#### Updating the pg_dump

If the DB extract file ever needs to be updated, you can use the following simple process:
30 changes: 30 additions & 0 deletions query-connector/docker-compose-integration.yaml
@@ -0,0 +1,30 @@
services:
# PostgreSQL DB for custom query and value set storage
db:
image: "postgres:alpine"
ports:
- "5432:5432"
environment:
- POSTGRES_USER=postgres
- POSTGRES_PASSWORD=pw
- POSTGRES_DB=tefca_db

volumes:
- ./vs_dump.sql:/docker-entrypoint-initdb.d/vs_dump.sql
healthcheck:
test: ["CMD-SHELL", "pg_isready -U postgres"]
interval: 2s
timeout: 5s
retries: 20

# Flyway migrations and DB version control
flyway:
image: flyway/flyway:10.16-alpine
command: -configFiles=/flyway/conf/flyway.conf -schemas=public -connectRetries=60 migrate
platform: linux/amd64
volumes:
- ./flyway/sql:/flyway/sql
- ./flyway/conf/flyway.conf:/flyway/conf/flyway.conf
depends_on:
db:
condition: service_started
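This compose file is what `integration.sh` (added below) brings up before running the integration suite; it can also be started and torn down by hand when debugging locally:

```
docker compose -f docker-compose-integration.yaml up -d
docker compose -f docker-compose-integration.yaml down
```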
5 changes: 2 additions & 3 deletions query-connector/e2e/alternate_queries.spec.ts
@@ -30,11 +30,11 @@ test.describe("alternate queries with the Query Connector", () => {
await page.getByRole("button", { name: "Search for patient" }).click();
await expect(page.getByText("Loading")).toHaveCount(0, { timeout: 10000 });
await expect(
page.getByRole("heading", { name: PAGE_TITLES["patient-results"] }),
page.getByRole("heading", { name: PAGE_TITLES["patient-results"].title }),
).toBeVisible();
await page.getByRole("link", { name: "Select patient" }).click();
await expect(
page.getByRole("heading", { name: PAGE_TITLES["select-query"] }),
page.getByRole("heading", { name: PAGE_TITLES["select-query"].title }),
).toBeVisible();
await page.getByTestId("Select").selectOption("chlamydia");
await page.getByRole("button", { name: "Submit" }).click();
@@ -65,7 +65,6 @@ test.describe("alternate queries with the Query Connector", () => {
await expect(
page.getByRole("heading", { name: "Select a query" }),
).toBeVisible();
// await page.getByTestId("Select").selectOption("social-determinants");
await page.getByTestId("Select").selectOption("cancer");
await page.getByRole("button", { name: "Submit" }).click();
await expect(page.getByText("Loading")).toHaveCount(0, { timeout: 10000 });
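The spec changes here (and in the other e2e files below) reflect `PAGE_TITLES` entries becoming objects with a `title` field rather than plain strings. A sketch of the new shape, inferred only from how the tests read it; the actual constant, its keys, and its title strings live in the app source, not in this diff:

```typescript
// Inferred shape only; the real PAGE_TITLES is defined in the app source.
type PageTitle = { title: string };

const PAGE_TITLES: Record<string, PageTitle> = {
  search: { title: "placeholder search page title" },
  "patient-results": { title: "placeholder patient results title" },
  "select-query": { title: "placeholder query selection title" },
};
```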
11 changes: 7 additions & 4 deletions query-connector/e2e/customize_query.spec.ts
@@ -19,7 +19,10 @@ test.describe("querying with the Query Connector", () => {
"This site is for demo purposes only. Please do not enter PII on this website.",
);
await expect(
page.getByRole("heading", { name: PAGE_TITLES["search"], exact: true }),
page.getByRole("heading", {
name: PAGE_TITLES["search"].title,
exact: true,
}),
).toBeVisible();

await page.getByRole("button", { name: "Fill fields" }).click();
@@ -51,7 +54,7 @@
"250 labs found, 4 medications found, 104 conditions found.",
),
).toBeVisible();
await page.getByRole("link", { name: "Medications" }).click();
await page.getByRole("button", { name: "Medications" }).click();
await page.getByRole("button", { name: "Chlamydia Medication" }).click();
await page
.getByRole("row", { name: "azithromycin 1000 MG" })
@@ -111,7 +114,7 @@
page,
}) => {
test.slow();
await page.getByRole("link", { name: "Labs" }).click();
await page.getByRole("button", { name: "Labs", exact: true }).click();
await page.getByRole("button", { name: "Deselect all labs" }).click();

// Spot check a couple valuesets for deselection
@@ -120,7 +123,7 @@
await expect(page.getByText("0 of 33 selected")).toBeVisible();

// Now de-select all the medications via the group check marks
await page.getByRole("link", { name: "Medications" }).click();
await page.getByRole("button", { name: "Medications" }).click();
await page
.getByRole("button", { name: "Chlamydia Medication" })
.locator("label")
10 changes: 8 additions & 2 deletions query-connector/e2e/query_workflow.spec.ts
@@ -50,7 +50,10 @@ test.describe("querying with the Query Connector", () => {
"This site is for demo purposes only. Please do not enter PII on this website.",
);
await expect(
page.getByRole("heading", { name: PAGE_TITLES["search"], exact: true }),
page.getByRole("heading", {
name: PAGE_TITLES["search"].title,
exact: true,
}),
).toBeVisible();

await page.getByRole("button", { name: "Fill fields" }).click();
@@ -156,7 +159,10 @@
// Now let's use the return to search to go back to a blank form
await page.getByRole("button", { name: "New patient search" }).click();
await expect(
page.getByRole("heading", { name: PAGE_TITLES["search"], exact: true }),
page.getByRole("heading", {
name: PAGE_TITLES["search"].title,
exact: true,
}),
).toBeVisible();
});
});
41 changes: 4 additions & 37 deletions query-connector/flyway/sql/V01_01__tcr_custom_query_schema.sql
@@ -51,41 +51,18 @@ CREATE TABLE IF NOT EXISTS valueset_to_concept (
FOREIGN KEY (concept_id) REFERENCES concepts(id)
);

CREATE TABLE IF NOT EXISTS icd_crosswalk (
id TEXT PRIMARY KEY,
icd10_code TEXT,
icd9_code TEXT,
match_flags TEXT);


CREATE TABLE IF NOT EXISTS query (
id UUID DEFAULT uuid_generate_v4 (),
query_name VARCHAR(255),
query_name VARCHAR(255) UNIQUE,
query_data JSON,
conditions_list TEXT[],
author VARCHAR(255),
date_created TIMESTAMP,
date_last_modified TIMESTAMP,
time_window_number INT,
time_window_unit VARCHAR(80),
PRIMARY KEY (id));

CREATE TABLE IF NOT EXISTS query_to_valueset (
id TEXT PRIMARY KEY,
query_id UUID,
valueset_id TEXT,
valueset_oid TEXT,
FOREIGN KEY (query_id) REFERENCES query(id),
FOREIGN KEY (valueset_id) REFERENCES valuesets(id)
);

CREATE TABLE IF NOT EXISTS query_included_concepts (
id TEXT PRIMARY KEY,
query_by_valueset_id TEXT,
concept_id TEXT,
include BOOLEAN,
FOREIGN KEY (query_by_valueset_id) REFERENCES query_to_valueset(id),
FOREIGN KEY (concept_id) REFERENCES concepts(id)
);

-- Create indexes for all primary and foreign keys

CREATE INDEX IF NOT EXISTS conditions_id_index ON conditions (id);
@@ -103,15 +80,5 @@ CREATE INDEX IF NOT EXISTS valueset_to_concept_id_index ON valueset_to_concept (
CREATE INDEX IF NOT EXISTS valueset_to_concept_valueset_id_index ON valueset_to_concept (valueset_id);
CREATE INDEX IF NOT EXISTS valueset_to_concept_concept_id_index ON valueset_to_concept (concept_id);

CREATE INDEX IF NOT EXISTS icd_crosswalk_id_index ON icd_crosswalk (id);

CREATE INDEX IF NOT EXISTS query_id_index ON query (id);
CREATE INDEX IF NOT EXISTS query_name_index ON query (query_name);

CREATE INDEX IF NOT EXISTS query_to_valueset_id_index ON query_to_valueset (id);
CREATE INDEX IF NOT EXISTS query_to_valueset_query_id_index ON query_to_valueset (query_id);
CREATE INDEX IF NOT EXISTS query_to_valueset_valueset_id_index ON query_to_valueset (valueset_id);

CREATE INDEX IF NOT EXISTS query_included_concepts_id_index ON query_included_concepts (id);
CREATE INDEX IF NOT EXISTS query_included_concepts_query_by_valueset_id_index ON query_included_concepts (query_by_valueset_id);
CREATE INDEX IF NOT EXISTS query_included_concepts_concept_id_index ON query_included_concepts (concept_id);
CREATE INDEX IF NOT EXISTS query_name_index ON query (query_name);
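Since `query_name` is now declared `UNIQUE`, saved queries can be addressed by name. An illustrative upsert against the revised `query` table, with placeholder values only; this is not a statement taken from the application code:

```sql
-- Illustrative only: relies on the UNIQUE constraint on query_name defined above.
INSERT INTO query (query_name, query_data, conditions_list, author, date_created, date_last_modified)
VALUES ('example-query', '{"valuesets": []}', ARRAY['example-condition-id'], 'example-author', NOW(), NOW())
ON CONFLICT (query_name) DO UPDATE
  SET query_data = EXCLUDED.query_data,
      date_last_modified = NOW();
```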
18 changes: 18 additions & 0 deletions query-connector/integration.sh
@@ -0,0 +1,18 @@
#!/bin/bash

set -e # Exit immediately if a command exits with a non-zero status

docker compose -f docker-compose-integration.yaml up -d

# wait for flyway to finish running before...
docker compose -f docker-compose-integration.yaml logs -f flyway | grep -q "Successfully applied"

# running our integration tests
# (temporarily disable -e so a failing test run still reaches the teardown below)
set +e
DATABASE_URL=postgresql://postgres:pw@localhost:5432/tefca_db TEST_TYPE=integration jest --testPathPattern=tests/integration
JEST_EXIT_CODE=$?
set -e

# Teardown containers
docker compose -f docker-compose-integration.yaml down

# Exit with the Jest exit code
exit $JEST_EXIT_CODE
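Assuming Docker is running, the same flow can be exercised locally from the repository root; the CI job's `npm run test:integration` presumably wraps an equivalent invocation:

```
cd query-connector
bash integration.sh
```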
13 changes: 13 additions & 0 deletions query-connector/jest.setup.ts
@@ -1,6 +1,19 @@
import { getDbClient } from "@/app/backend/dbClient";
import "@testing-library/jest-dom";
import { toHaveNoViolations } from "jest-axe";
import * as matchers from "jest-extended";
import { Pool } from "pg";

expect.extend(toHaveNoViolations);
expect.extend(matchers);

if (process.env.TEST_TYPE === "integration") {
let dbClient: Pool | null = null;
beforeAll(() => {
dbClient = getDbClient();
});

afterAll(async () => {
await dbClient?.end();
});
}
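The setup above assumes `@/app/backend/dbClient` exports a shared `pg` pool; that module is not part of this diff. A minimal sketch of what such a helper could look like, where the lazy-singleton approach and the `DATABASE_URL`-based configuration are assumptions rather than the repository's actual implementation:

```typescript
// Hypothetical sketch of @/app/backend/dbClient; not the repository's actual module.
import { Pool } from "pg";

let pool: Pool | null = null;

/** Lazily creates and returns a shared connection pool configured from DATABASE_URL. */
export function getDbClient(): Pool {
  if (!pool) {
    pool = new Pool({ connectionString: process.env.DATABASE_URL });
  }
  return pool;
}
```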