Bump org.apache.hive:hive-jdbc from 3.1.3 to 4.0.0 #1008

Open — wants to merge 1 commit into base: master
Bump org.apache.hive:hive-jdbc from 3.1.3 to 4.0.0

0c7bac5
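The dependency change this PR makes can be sketched as the following pom.xml edit; which module's pom.xml holds the declaration is not shown in the log, so treat the placement as an assumption — only the version element changes:

```xml
<!-- Before: -->
<dependency>
  <groupId>org.apache.hive</groupId>
  <artifactId>hive-jdbc</artifactId>
  <version>3.1.3</version>
</dependency>

<!-- After (this PR): -->
<dependency>
  <groupId>org.apache.hive</groupId>
  <artifactId>hive-jdbc</artifactId>
  <version>4.0.0</version>
</dependency>
```

Note that 3.1.3 to 4.0.0 is a major-version jump, which is consistent with the behavioral breakage seen in the Thriftserver E2E step below.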
Google Cloud Build / fhir-data-pipes-pr (cloud-build-fhir) failed Jul 1, 2024 in 54m 27s

Summary

Build Information

Trigger fhir-data-pipes-pr
Build 098b371a-ce9a-4440-8b7b-cce426af8e58
Start 2024-07-01T06:57:12-07:00
Duration 53m42.358s
Status FAILURE

Steps

Step Status Duration
Launch HAPI Source and Sink Servers SUCCESS 42.023s
Compile Bunsen and Pipeline SUCCESS 6m29.073s
Build Uploader Image SUCCESS 9.794s
Run Uploader Unit Tests SUCCESS 876ms
Build E2E Image SUCCESS 1m58.096s
Upload to HAPI SUCCESS 2m45.799s
Build Pipeline Images SUCCESS 20.579s
Run Batch Pipeline in FHIR-search mode with HAPI source SUCCESS 2m47.17s
Run E2E Test for FHIR-search mode with HAPI source SUCCESS 4.588s
Run Batch Pipeline for JDBC mode with HAPI source SUCCESS 34.372s
Run E2E Test for JDBC mode with HAPI source SUCCESS 4.399s
Create views database SUCCESS 650ms
Turn down FHIR Sink Server SUCCESS 4.15s
Launch HAPI FHIR Sink Server SUCCESS 3.824s
Bring up controller and Spark containers SUCCESS 11m24.838s
Run E2E Test for Dockerized Controller and Spark Thriftserver FAILURE 26m38.506s
Bring down controller and Spark containers QUEUED 0s
Turn down HAPI Source and Sink Servers QUEUED 0s
Launch OpenMRS Server and HAPI FHIR Sink Server QUEUED 0s
Wait for Servers Start QUEUED 0s
Launch Streaming Pipeline QUEUED 0s
Run E2E Test for STREAMING, using OpenMRS Source QUEUED 0s
Upload to OpenMRS QUEUED 0s
Run Batch Pipeline FHIR-search mode with OpenMRS source QUEUED 0s
Run E2E Test for FHIR-search mode with OpenMRS source QUEUED 0s
Run Batch Pipeline for JDBC mode with OpenMRS source QUEUED 0s
Run E2E Test for JDBC mode with OpenMRS source QUEUED 0s
Test Indicators QUEUED 0s
Turn down Webserver and HAPI Server QUEUED 0s

Details

starting build "098b371a-ce9a-4440-8b7b-cce426af8e58"

FETCHSOURCE
hint: Using 'master' as the name for the initial branch. This default branch name
hint: is subject to change. To configure the initial branch name to use in all
hint: of your new repositories, which will suppress this warning, call:
hint:
hint: 	git config --global init.defaultBranch <name>
hint:
hint: Names commonly chosen instead of 'master' are 'main', 'trunk' and
hint: 'development'. The just-created branch can be renamed via this command:
hint:
hint: 	git branch -m <name>
Initialized empty Git repository in /workspace/.git/
From https://github.com/google/fhir-data-pipes
 * branch            0c7bac583cd7983fca850aa4530f4a18515b91dd -> FETCH_HEAD
Updating files: 100% (964/964), done.
HEAD is now at 0c7bac5 Bump org.apache.hive:hive-jdbc from 3.1.3 to 4.0.0
BUILD
Starting Step #0 - "Launch HAPI Source and Sink Servers"
Starting Step #1 - "Compile Bunsen and Pipeline"
Step #0 - "Launch HAPI Source and Sink Servers": Pulling image: docker/compose
Step #1 - "Compile Bunsen and Pipeline": Pulling image: maven:3.8.5-openjdk-17
Step #0 - "Launch HAPI Source and Sink Servers": Using default tag: latest
Step #1 - "Compile Bunsen and Pipeline": 3.8.5-openjdk-17: Pulling from library/maven
Step #0 - "Launch HAPI Source and Sink Servers": latest: Pulling from docker/compose
Step #0 - "Launch HAPI Source and Sink Servers": aad63a933944: Pulling fs layer
Step #0 - "Launch HAPI Source and Sink Servers": b396cd7cbac4: Pulling fs layer
Step #0 - "Launch HAPI Source and Sink Servers": 0426ec0ed60a: Pulling fs layer
Step #0 - "Launch HAPI Source and Sink Servers": 9ac2a98ece5b: Pulling fs layer
Step #0 - "Launch HAPI Source and Sink Servers": 9ac2a98ece5b: Waiting
Step #0 - "Launch HAPI Source and Sink Servers": b396cd7cbac4: Download complete
Step #1 - "Compile Bunsen and Pipeline": 38a980f2cc8a: Pulling fs layer
Step #1 - "Compile Bunsen and Pipeline": de849f1cfbe6: Pulling fs layer
Step #1 - "Compile Bunsen and Pipeline": a7203ca35e75: Pulling fs layer
Step #1 - "Compile Bunsen and Pipeline": 3337662e6dc9: Pulling fs layer
Step #1 - "Compile Bunsen and Pipeline": 81485058ab89: Pulling fs layer
Step #1 - "Compile Bunsen and Pipeline": b548970362bb: Pulling fs layer
Step #1 - "Compile Bunsen and Pipeline": 38a980f2cc8a: Waiting
Step #1 - "Compile Bunsen and Pipeline": dbd02ad382f5: Pulling fs layer
Step #1 - "Compile Bunsen and Pipeline": de849f1cfbe6: Waiting
Step #1 - "Compile Bunsen and Pipeline": b548970362bb: Waiting
Step #1 - "Compile Bunsen and Pipeline": 81485058ab89: Waiting
Step #1 - "Compile Bunsen and Pipeline": dbd02ad382f5: Waiting
Step #1 - "Compile Bunsen and Pipeline": a7203ca35e75: Waiting
Step #1 - "Compile Bunsen and Pipeline": 3337662e6dc9: Waiting
Step #0 - "Launch HAPI Source and Sink Servers": aad63a933944: Download complete
Step #0 - "Launch HAPI Source and Sink Servers": aad63a933944: Pull complete
Step #0 - "Launch HAPI Source and Sink Servers": b396cd7cbac4: Pull complete
Step #0 - "Launch HAPI Source and Sink Servers": 0426ec0ed60a: Verifying Checksum
Step #0 - "Launch HAPI Source and Sink Servers": 0426ec0ed60a: Download complete
Step #1 - "Compile Bunsen and Pipeline": 38a980f2cc8a: Verifying Checksum
Step #1 - "Compile Bunsen and Pipeline": 38a980f2cc8a: Download complete
Step #0 - "Launch HAPI Source and Sink Servers": 9ac2a98ece5b: Verifying Checksum
Step #0 - "Launch HAPI Source and Sink Servers": 9ac2a98ece5b: Download complete
Step #1 - "Compile Bunsen and Pipeline": de849f1cfbe6: Verifying Checksum
Step #1 - "Compile Bunsen and Pipeline": de849f1cfbe6: Download complete
Step #0 - "Launch HAPI Source and Sink Servers": 0426ec0ed60a: Pull complete
Step #0 - "Launch HAPI Source and Sink Servers": 9ac2a98ece5b: Pull complete
Step #0 - "Launch HAPI Source and Sink Servers": Digest: sha256:b60a020c0f68047b353a4a747f27f5e5ddb17116b7b018762edfb6f7a6439a82
Step #0 - "Launch HAPI Source and Sink Servers": Status: Downloaded newer image for docker/compose:latest
Step #0 - "Launch HAPI Source and Sink Servers": docker.io/docker/compose:latest
Step #1 - "Compile Bunsen and Pipeline": 81485058ab89: Verifying Checksum
Step #1 - "Compile Bunsen and Pipeline": 81485058ab89: Download complete
Step #1 - "Compile Bunsen and Pipeline": 38a980f2cc8a: Pull complete
Step #1 - "Compile Bunsen and Pipeline": b548970362bb: Verifying Checksum
Step #1 - "Compile Bunsen and Pipeline": b548970362bb: Download complete
Step #1 - "Compile Bunsen and Pipeline": dbd02ad382f5: Verifying Checksum
Step #1 - "Compile Bunsen and Pipeline": dbd02ad382f5: Download complete
Step #1 - "Compile Bunsen and Pipeline": de849f1cfbe6: Pull complete
Step #1 - "Compile Bunsen and Pipeline": a7203ca35e75: Verifying Checksum
Step #1 - "Compile Bunsen and Pipeline": a7203ca35e75: Download complete
Step #1 - "Compile Bunsen and Pipeline": 3337662e6dc9: Verifying Checksum
Step #1 - "Compile Bunsen and Pipeline": 3337662e6dc9: Download complete
Step #1 - "Compile Bunsen and Pipeline": a7203ca35e75: Pull complete
Step #0 - "Launch HAPI Source and Sink Servers": Creating volume "docker_hapi-fhir-db" with default driver
Step #0 - "Launch HAPI Source and Sink Servers": Creating volume "docker_hapi-server" with default driver
Step #0 - "Launch HAPI Source and Sink Servers": Creating volume "docker_hapi-data" with default driver
Step #0 - "Launch HAPI Source and Sink Servers": Pulling db (postgres:)...
Step #1 - "Compile Bunsen and Pipeline": 3337662e6dc9: Pull complete
Step #1 - "Compile Bunsen and Pipeline": 81485058ab89: Pull complete
Step #1 - "Compile Bunsen and Pipeline": b548970362bb: Pull complete
Step #1 - "Compile Bunsen and Pipeline": dbd02ad382f5: Pull complete
Step #1 - "Compile Bunsen and Pipeline": Digest: sha256:3a9c30b3af6278a8ae0007d3a3bf00fff80ec3ed7ae4eb9bfa1772853101549b
Step #1 - "Compile Bunsen and Pipeline": Status: Downloaded newer image for maven:3.8.5-openjdk-17
Step #1 - "Compile Bunsen and Pipeline": docker.io/library/maven:3.8.5-openjdk-17
Step #0 - "Launch HAPI Source and Sink Servers": latest: Pulling from library/postgres
Step #1 - "Compile Bunsen and Pipeline": [INFO] Error stacktraces are turned on.
Step #1 - "Compile Bunsen and Pipeline": [INFO] Scanning for projects...
Step #1 - "Compile Bunsen and Pipeline": [INFO] ------------------------------------------------------------------------
Step #1 - "Compile Bunsen and Pipeline": [INFO] Reactor Build Order:
Step #1 - "Compile Bunsen and Pipeline": [INFO] 
Step #1 - "Compile Bunsen and Pipeline": [INFO] root                                                               [pom]
Step #1 - "Compile Bunsen and Pipeline": [INFO] Bunsen Parent                                                      [pom]
Step #1 - "Compile Bunsen and Pipeline": [INFO] Extension Structure Definitions                                    [jar]
Step #1 - "Compile Bunsen and Pipeline": [INFO] Bunsen Core                                                        [jar]
Step #1 - "Compile Bunsen and Pipeline": [INFO] Bunsen Core R4                                                     [jar]
Step #1 - "Compile Bunsen and Pipeline": [INFO] Bunsen Core Stu3                                                   [jar]
Step #1 - "Compile Bunsen and Pipeline": [INFO] Bunsen Avro                                                        [jar]
Step #1 - "Compile Bunsen and Pipeline": [INFO] FHIR Analytics                                                     [pom]
Step #1 - "Compile Bunsen and Pipeline": [INFO] common                                                             [jar]
Step #1 - "Compile Bunsen and Pipeline": [INFO] batch                                                              [jar]
Step #1 - "Compile Bunsen and Pipeline": [INFO] streaming                                                          [jar]
Step #1 - "Compile Bunsen and Pipeline": [INFO] controller                                                         [jar]
Step #1 - "Compile Bunsen and Pipeline": [INFO] coverage                                                           [pom]
Step #1 - "Compile Bunsen and Pipeline": [INFO] 
Step #1 - "Compile Bunsen and Pipeline": [INFO] -------------------< com.google.fhir.analytics:root >-------------------
Step #1 - "Compile Bunsen and Pipeline": [INFO] Building root 0.2.7-SNAPSHOT                                      [1/13]
Step #1 - "Compile Bunsen and Pipeline": [INFO] --------------------------------[ pom ]---------------------------------
Step #1 - "Compile Bunsen and Pipeline": [INFO] 
Step #1 - "Compile Bunsen and Pipeline": [INFO] --- jacoco-maven-plugin:0.8.12:prepare-agent (default) @ root ---
Step #1 - "Compile Bunsen and Pipeline": [INFO] argLine set to -javaagent:/root/.m2/repository/org/jacoco/org.jacoco.agent/0.8.12/org.jacoco.agent-0.8.12-runtime.jar=destfile=/workspace/target/jacoco.exec
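The prepare-agent execution seen above injects the JaCoCo runtime agent into test JVMs via the argLine property. A minimal plugin declaration that produces this behavior looks roughly like the following sketch (the version is taken from the log; all other settings are assumed defaults):

```xml
<plugin>
  <groupId>org.jacoco</groupId>
  <artifactId>jacoco-maven-plugin</artifactId>
  <version>0.8.12</version>
  <executions>
    <execution>
      <!-- prepare-agent sets argLine so surefire-forked JVMs write jacoco.exec -->
      <goals>
        <goal>prepare-agent</goal>
      </goals>
    </execution>
  </executions>
</plugin>
```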
Step #1 - "Compile Bunsen and Pipeline": [INFO] 
Step #1 - "Compile Bunsen and Pipeline": [INFO] --- maven-install-plugin:2.4:install (default-install) @ root ---
Step #0 - "Launch HAPI Source and Sink Servers": Digest: sha256:46aa2ee5d664b275f05d1a963b30fff60fb422b4b594d509765c42db46d48881
Step #0 - "Launch HAPI Source and Sink Servers": Status: Downloaded newer image for postgres:latest
Step #0 - "Launch HAPI Source and Sink Servers": Pulling hapi-server (hapiproject/hapi:latest)...
Step #1 - "Compile Bunsen and Pipeline": [INFO] Installing /workspace/pom.xml to /root/.m2/repository/com/google/fhir/analytics/root/0.2.7-SNAPSHOT/root-0.2.7-SNAPSHOT.pom
Step #1 - "Compile Bunsen and Pipeline": [INFO] 
Step #1 - "Compile Bunsen and Pipeline": [INFO] ------------------< com.cerner.bunsen:bunsen-parent >-------------------
Step #1 - "Compile Bunsen and Pipeline": [INFO] Building Bunsen Parent 0.5.14-SNAPSHOT                            [2/13]
Step #1 - "Compile Bunsen and Pipeline": [INFO] --------------------------------[ pom ]---------------------------------
Step #0 - "Launch HAPI Source and Sink Servers": latest: Pulling from hapiproject/hapi
Step #1 - "Compile Bunsen and Pipeline": [INFO] 
Step #1 - "Compile Bunsen and Pipeline": [INFO] --- jacoco-maven-plugin:0.8.12:prepare-agent (default) @ bunsen-parent ---
Step #1 - "Compile Bunsen and Pipeline": [INFO] argLine set to -javaagent:/root/.m2/repository/org/jacoco/org.jacoco.agent/0.8.12/org.jacoco.agent-0.8.12-runtime.jar=destfile=/workspace/bunsen/target/jacoco.exec
Step #1 - "Compile Bunsen and Pipeline": [INFO] 
Step #1 - "Compile Bunsen and Pipeline": [INFO] --- maven-resources-plugin:3.3.1:resources (generate-resources) @ bunsen-parent ---
Step #1 - "Compile Bunsen and Pipeline": [INFO] skip non existing resourceDirectory /workspace/bunsen/src/main/resources
Step #1 - "Compile Bunsen and Pipeline": [INFO] 
Step #1 - "Compile Bunsen and Pipeline": [INFO] --- spotless-maven-plugin:2.43.0:apply (default) @ bunsen-parent ---
Step #1 - "Compile Bunsen and Pipeline": [INFO] Spotless apply skipped
Step #1 - "Compile Bunsen and Pipeline": [INFO] 
Step #1 - "Compile Bunsen and Pipeline": [INFO] >>> maven-source-plugin:3.3.1:jar (default) > generate-sources @ bunsen-parent >>>
Step #1 - "Compile Bunsen and Pipeline": [INFO] 
Step #1 - "Compile Bunsen and Pipeline": [INFO] --- jacoco-maven-plugin:0.8.12:prepare-agent (default) @ bunsen-parent ---
Step #1 - "Compile Bunsen and Pipeline": [INFO] argLine set to -javaagent:/root/.m2/repository/org/jacoco/org.jacoco.agent/0.8.12/org.jacoco.agent-0.8.12-runtime.jar=destfile=/workspace/bunsen/target/jacoco.exec
Step #1 - "Compile Bunsen and Pipeline": [INFO] 
Step #1 - "Compile Bunsen and Pipeline": [INFO] <<< maven-source-plugin:3.3.1:jar (default) < generate-sources @ bunsen-parent <<<
Step #1 - "Compile Bunsen and Pipeline": [INFO] 
Step #1 - "Compile Bunsen and Pipeline": [INFO] 
Step #1 - "Compile Bunsen and Pipeline": [INFO] --- maven-source-plugin:3.3.1:jar (default) @ bunsen-parent ---
Step #1 - "Compile Bunsen and Pipeline": [INFO] 
Step #1 - "Compile Bunsen and Pipeline": [INFO] --- maven-javadoc-plugin:3.7.0:jar (default) @ bunsen-parent ---
Step #1 - "Compile Bunsen and Pipeline": [INFO] Not executing Javadoc as the project is not a Java classpath-capable package
Step #1 - "Compile Bunsen and Pipeline": [INFO] 
Step #1 - "Compile Bunsen and Pipeline": [INFO] --- maven-install-plugin:2.4:install (default-install) @ bunsen-parent ---
Step #1 - "Compile Bunsen and Pipeline": [INFO] Installing /workspace/bunsen/pom.xml to /root/.m2/repository/com/cerner/bunsen/bunsen-parent/0.5.14-SNAPSHOT/bunsen-parent-0.5.14-SNAPSHOT.pom
Step #1 - "Compile Bunsen and Pipeline": [INFO] 
Step #1 - "Compile Bunsen and Pipeline": [INFO] ---------< com.cerner.bunsen:extension-structure-definitions >----------
Step #1 - "Compile Bunsen and Pipeline": [INFO] Building Extension Structure Definitions 0.5.14-SNAPSHOT          [3/13]
Step #1 - "Compile Bunsen and Pipeline": [INFO] --------------------------------[ jar ]---------------------------------
Step #1 - "Compile Bunsen and Pipeline": [INFO] 
Step #1 - "Compile Bunsen and Pipeline": [INFO] --- jacoco-maven-plugin:0.8.12:prepare-agent (default) @ extension-structure-definitions ---
Step #1 - "Compile Bunsen and Pipeline": [INFO] argLine set to -javaagent:/root/.m2/repository/org/jacoco/org.jacoco.agent/0.8.12/org.jacoco.agent-0.8.12-runtime.jar=destfile=/workspace/bunsen/extension-structure-definitions/target/jacoco.exec
Step #1 - "Compile Bunsen and Pipeline": [INFO] 
Step #1 - "Compile Bunsen and Pipeline": [INFO] --- maven-resources-plugin:3.3.1:resources (generate-resources) @ extension-structure-definitions ---
Step #1 - "Compile Bunsen and Pipeline": [INFO] Copying 112 resources from src/main/resources to target/classes
Step #1 - "Compile Bunsen and Pipeline": [INFO] 
Step #1 - "Compile Bunsen and Pipeline": [INFO] --- maven-resources-plugin:3.3.1:resources (default-resources) @ extension-structure-definitions ---
Step #1 - "Compile Bunsen and Pipeline": [INFO] Copying 112 resources from src/main/resources to target/classes
Step #1 - "Compile Bunsen and Pipeline": [INFO] 
Step #1 - "Compile Bunsen and Pipeline": [INFO] --- maven-compiler-plugin:3.13.0:compile (default-compile) @ extension-structure-definitions ---
Step #1 - "Compile Bunsen and Pipeline": [INFO] Recompiling the module because of changed dependency.
Step #1 - "Compile Bunsen and Pipeline": [INFO] Compiling 2 source files with javac [debug deprecation release 11] to target/classes
Step #1 - "Compile Bunsen and Pipeline": [INFO] 
Step #1 - "Compile Bunsen and Pipeline": [INFO] --- spotless-maven-plugin:2.43.0:apply (default) @ extension-structure-definitions ---
Step #1 - "Compile Bunsen and Pipeline": [INFO] Spotless apply skipped
Step #1 - "Compile Bunsen and Pipeline": [INFO] 
Step #1 - "Compile Bunsen and Pipeline": [INFO] --- maven-resources-plugin:3.3.1:testResources (default-testResources) @ extension-structure-definitions ---
Step #1 - "Compile Bunsen and Pipeline": [INFO] skip non existing resourceDirectory /workspace/bunsen/extension-structure-definitions/src/test/resources
Step #1 - "Compile Bunsen and Pipeline": [INFO] 
Step #1 - "Compile Bunsen and Pipeline": [INFO] --- maven-compiler-plugin:3.13.0:testCompile (default-testCompile) @ extension-structure-definitions ---
Step #1 - "Compile Bunsen and Pipeline": [INFO] No sources to compile
Step #1 - "Compile Bunsen and Pipeline": [INFO] 
Step #1 - "Compile Bunsen and Pipeline": [INFO] --- maven-surefire-plugin:3.3.0:test (default-test) @ extension-structure-definitions ---
Step #1 - "Compile Bunsen and Pipeline": [INFO] No tests to run.
Step #1 - "Compile Bunsen and Pipeline": [INFO] 
Step #1 - "Compile Bunsen and Pipeline": [INFO] --- maven-jar-plugin:3.4.2:jar (default-jar) @ extension-structure-definitions ---
Step #1 - "Compile Bunsen and Pipeline": [INFO] Building jar: /workspace/bunsen/extension-structure-definitions/target/extension-structure-definitions-0.5.14-SNAPSHOT.jar
Step #1 - "Compile Bunsen and Pipeline": [INFO] 
Step #1 - "Compile Bunsen and Pipeline": [INFO] >>> maven-source-plugin:3.3.1:jar (default) > generate-sources @ extension-structure-definitions >>>
Step #1 - "Compile Bunsen and Pipeline": [INFO] 
Step #1 - "Compile Bunsen and Pipeline": [INFO] --- jacoco-maven-plugin:0.8.12:prepare-agent (default) @ extension-structure-definitions ---
Step #1 - "Compile Bunsen and Pipeline": [INFO] argLine set to -javaagent:/root/.m2/repository/org/jacoco/org.jacoco.agent/0.8.12/org.jacoco.agent-0.8.12-runtime.jar=destfile=/workspace/bunsen/extension-structure-definitions/target/jacoco.exec
Step #1 - "Compile Bunsen and Pipeline": [INFO] 
Step #1 - "Compile Bunsen and Pipeline": [INFO] <<< maven-source-plugin:3.3.1:jar (default) < generate-sources @ extension-structure-definitions <<<
Step #1 - "Compile Bunsen and Pipeline": [INFO] 
Step #1 - "Compile Bunsen and Pipeline": [INFO] 
Step #1 - "Compile Bunsen and Pipeline": [INFO] --- maven-source-plugin:3.3.1:jar (default) @ extension-structure-definitions ---
Step #1 - "Compile Bunsen and Pipeline": [INFO] Building jar: /workspace/bunsen/extension-structure-definitions/target/extension-structure-definitions-0.5.14-SNAPSHOT-sources.jar
Step #1 - "Compile Bunsen and Pipeline": [INFO] 
Step #1 - "Compile Bunsen and Pipeline": [INFO] --- maven-javadoc-plugin:3.7.0:jar (default) @ extension-structure-definitions ---
Step #1 - "Compile Bunsen and Pipeline": [INFO] No previous run data found, generating javadoc.
Step #1 - "Compile Bunsen and Pipeline": [WARNING] Javadoc Warnings
Step #1 - "Compile Bunsen and Pipeline": [WARNING] /workspace/bunsen/extension-structure-definitions/src/main/java/com/cerner/bunsen/common/R4UsCoreProfileData.java:6: warning: no comment
Step #1 - "Compile Bunsen and Pipeline": [WARNING] public class R4UsCoreProfileData {
Step #1 - "Compile Bunsen and Pipeline": [WARNING] ^
Step #1 - "Compile Bunsen and Pipeline": [WARNING] /workspace/bunsen/extension-structure-definitions/src/main/java/com/cerner/bunsen/common/R4UsCoreProfileData.java:38: warning: no comment
Step #1 - "Compile Bunsen and Pipeline": [WARNING] public static final List<String> US_CORE_CONDITION_PROFILES =
Step #1 - "Compile Bunsen and Pipeline": [WARNING] ^
Step #1 - "Compile Bunsen and Pipeline": [WARNING] /workspace/bunsen/extension-structure-definitions/src/main/java/com/cerner/bunsen/common/R4UsCoreProfileData.java:44: warning: no comment
Step #1 - "Compile Bunsen and Pipeline": [WARNING] public static final List<String> US_CORE_ENCOUNTER_PROFILES =
Step #1 - "Compile Bunsen and Pipeline": [WARNING] ^
Step #1 - "Compile Bunsen and Pipeline": [WARNING] /workspace/bunsen/extension-structure-definitions/src/main/java/com/cerner/bunsen/common/R4UsCoreProfileData.java:49: warning: no comment
Step #1 - "Compile Bunsen and Pipeline": [WARNING] public static final List<String> US_CORE_MEDICATION_PROFILES =
Step #1 - "Compile Bunsen and Pipeline": [WARNING] ^
Step #1 - "Compile Bunsen and Pipeline": [WARNING] /workspace/bunsen/extension-structure-definitions/src/main/java/com/cerner/bunsen/common/R4UsCoreProfileData.java:13: warning: no comment
Step #1 - "Compile Bunsen and Pipeline": [WARNING] public static final List<String> US_CORE_OBSERVATION_PROFILES =
Step #1 - "Compile Bunsen and Pipeline": [WARNING] ^
Step #1 - "Compile Bunsen and Pipeline": [WARNING] /workspace/bunsen/extension-structure-definitions/src/main/java/com/cerner/bunsen/common/R4UsCoreProfileData.java:8: warning: no comment
Step #1 - "Compile Bunsen and Pipeline": [WARNING] public static final List<String> US_CORE_PATIENT_PROFILES =
Step #1 - "Compile Bunsen and Pipeline": [WARNING] ^
Step #1 - "Compile Bunsen and Pipeline": [WARNING] /workspace/bunsen/extension-structure-definitions/src/main/java/com/cerner/bunsen/common/R4UsCoreProfileData.java:54: warning: no comment
Step #1 - "Compile Bunsen and Pipeline": [WARNING] public static final List<String> US_CORE_QUESTIONNAIRE_RESPONSE_PROFILES =
Step #1 - "Compile Bunsen and Pipeline": [WARNING] ^
Step #1 - "Compile Bunsen and Pipeline": [WARNING] /workspace/bunsen/extension-structure-definitions/src/main/java/com/cerner/bunsen/common/Stu3UsCoreProfileData.java:6: warning: no comment
Step #1 - "Compile Bunsen and Pipeline": [WARNING] public class Stu3UsCoreProfileData {
Step #1 - "Compile Bunsen and Pipeline": [WARNING] ^
Step #1 - "Compile Bunsen and Pipeline": [WARNING] /workspace/bunsen/extension-structure-definitions/src/main/java/com/cerner/bunsen/common/Stu3UsCoreProfileData.java:19: warning: no comment
Step #1 - "Compile Bunsen and Pipeline": [WARNING] public static final List<String> US_CORE_CONDITION_PROFILES =
Step #1 - "Compile Bunsen and Pipeline": [WARNING] ^
Step #1 - "Compile Bunsen and Pipeline": [WARNING] /workspace/bunsen/extension-structure-definitions/src/main/java/com/cerner/bunsen/common/Stu3UsCoreProfileData.java:24: warning: no comment
Step #1 - "Compile Bunsen and Pipeline": [WARNING] public static final List<String> US_CORE_MEDICATION_PROFILES =
Step #1 - "Compile Bunsen and Pipeline": [WARNING] ^
Step #1 - "Compile Bunsen and Pipeline": [WARNING] /workspace/bunsen/extension-structure-definitions/src/main/java/com/cerner/bunsen/common/Stu3UsCoreProfileData.java:13: warning: no comment
Step #1 - "Compile Bunsen and Pipeline": [WARNING] public static final List<String> US_CORE_OBSERVATION_PROFILES =
Step #1 - "Compile Bunsen and Pipeline": [WARNING] ^
Step #1 - "Compile Bunsen and Pipeline": [WARNING] /workspace/bunsen/extension-structure-definitions/src/main/java/com/cerner/bunsen/common/Stu3UsCoreProfileData.java:8: warning: no comment
Step #1 - "Compile Bunsen and Pipeline": [WARNING] public static final List<String> US_CORE_PATIENT_PROFILES =
Step #1 - "Compile Bunsen and Pipeline": [WARNING] ^
Step #1 - "Compile Bunsen and Pipeline": [WARNING] 12 warnings
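The "no comment" warnings above are the Javadoc tool flagging public members that lack doc comments. A minimal sketch of the fix is shown below; the class and constant names mirror the warnings, but the exact contents of the real constants are not in the log, so the profile URL here (the standard US Core Patient profile) is illustrative:

```java
import java.util.List;

/** Holds US Core profile URLs used when loading extension structure definitions (sketch). */
public class R4UsCoreProfileDataSketch {

  /** Profile URLs accepted for US Core Patient resources. */
  public static final List<String> US_CORE_PATIENT_PROFILES =
      List.of("http://hl7.org/fhir/us/core/StructureDefinition/us-core-patient");
}
```

Adding a `/** ... */` comment to each flagged class and field silences all twelve warnings without changing behavior.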
Step #1 - "Compile Bunsen and Pipeline": [INFO] Building jar: /workspace/bunsen/extension-structure-definitions/target/extension-structure-definitions-0.5.14-SNAPSHOT-javadoc.jar
Step #1 - "Compile Bunsen and Pipeline": [INFO] 
Step #1 - "Compile Bunsen and Pipeline": [INFO] --- maven-install-plugin:2.4:install (default-install) @ extension-structure-definitions ---
Step #1 - "Compile Bunsen and Pipeline": [INFO] Installing /workspace/bunsen/extension-structure-definitions/target/extension-structure-definitions-0.5.14-SNAPSHOT.jar to /root/.m2/repository/com/cerner/bunsen/extension-structure-definitions/0.5.14-SNAPSHOT/extension-structure-definitions-0.5.14-SNAPSHOT.jar
Step #1 - "Compile Bunsen and Pipeline": [INFO] Installing /workspace/bunsen/extension-structure-definitions/pom.xml to /root/.m2/repository/com/cerner/bunsen/extension-structure-definitions/0.5.14-SNAPSHOT/extension-structure-definitions-0.5.14-SNAPSHOT.pom
Step #1 - "Compile Bunsen and Pipeline": [INFO] Installing /workspace/bunsen/extension-structure-definitions/target/extension-structure-definitions-0.5.14-SNAPSHOT-sources.jar to /root/.m2/repository/com/cerner/bunsen/extension-structure-definitions/0.5.14-SNAPSHOT/extension-structure-definitions-0.5.14-SNAPSHOT-sources.jar
Step #1 - "Compile Bunsen and Pipeline": [INFO] Installing /workspace/bunsen/extension-structure-definitions/target/extension-structure-definitions-0.5.14-SNAPSHOT-javadoc.jar to /root/.m2/repository/com/cerner/bunsen/extension-structure-definitions/0.5.14-SNAPSHOT/extension-structure-definitions-0.5.14-SNAPSHOT-javadoc.jar
Step #1 - "Compile Bunsen and Pipeline": [INFO] 
Step #1 - "Compile Bunsen and Pipeline": [INFO] -------------------< com.cerner.bunsen:bunsen-core >--------------------
Step #1 - "Compile Bunsen and Pipeline": [INFO] Building Bunsen Core 0.5.14-SNAPSHOT                              [4/13]
Step #1 - "Compile Bunsen and Pipeline": [INFO] --------------------------------[ jar ]---------------------------------
Step #0 - "Launch HAPI Source and Sink Servers": Digest: sha256:9bcafa8342b572eee248cb7c48c496863d352bbd0347e1d98ea238d09620e89b
Step #0 - "Launch HAPI Source and Sink Servers": Status: Downloaded newer image for hapiproject/hapi:latest
Step #0 - "Launch HAPI Source and Sink Servers": Creating hapi-fhir-db ... 
Step #0 - "Launch HAPI Source and Sink Servers": Creating sink-server  ... 
Step #0 - "Launch HAPI Source and Sink Servers": Creating hapi-server  ... 
Finished Step #0 - "Launch HAPI Source and Sink Servers"
Step #1 - "Compile Bunsen and Pipeline": [INFO] 
Step #1 - "Compile Bunsen and Pipeline": [INFO] --- jacoco-maven-plugin:0.8.12:prepare-agent (default) @ bunsen-core ---
Step #1 - "Compile Bunsen and Pipeline": [INFO] argLine set to -javaagent:/root/.m2/repository/org/jacoco/org.jacoco.agent/0.8.12/org.jacoco.agent-0.8.12-runtime.jar=destfile=/workspace/bunsen/bunsen-core/target/jacoco.exec
Step #1 - "Compile Bunsen and Pipeline": [INFO] 
Step #1 - "Compile Bunsen and Pipeline": [INFO] --- maven-resources-plugin:3.3.1:resources (generate-resources) @ bunsen-core ---
Step #1 - "Compile Bunsen and Pipeline": [INFO] skip non existing resourceDirectory /workspace/bunsen/bunsen-core/src/main/resources
Step #1 - "Compile Bunsen and Pipeline": [INFO] 
Step #1 - "Compile Bunsen and Pipeline": [INFO] --- maven-resources-plugin:3.3.1:resources (default-resources) @ bunsen-core ---
Step #1 - "Compile Bunsen and Pipeline": [INFO] skip non existing resourceDirectory /workspace/bunsen/bunsen-core/src/main/resources
Step #1 - "Compile Bunsen and Pipeline": [INFO] 
Step #1 - "Compile Bunsen and Pipeline": [INFO] --- maven-compiler-plugin:3.13.0:compile (default-compile) @ bunsen-core ---
Step #1 - "Compile Bunsen and Pipeline": [INFO] Recompiling the module because of changed dependency.
Step #1 - "Compile Bunsen and Pipeline": [INFO] Compiling 24 source files with javac [debug deprecation release 11] to target/classes
Step #1 - "Compile Bunsen and Pipeline": [WARNING] /workspace/bunsen/bunsen-core/src/main/java/com/cerner/bunsen/definitions/FhirConversionSupport.java:[72,54] newInstance() in java.lang.Class has been deprecated
Step #1 - "Compile Bunsen and Pipeline": [INFO] /workspace/bunsen/bunsen-core/src/main/java/com/cerner/bunsen/definitions/PrimitiveConverter.java: Some input files use unchecked or unsafe operations.
Step #1 - "Compile Bunsen and Pipeline": [INFO] /workspace/bunsen/bunsen-core/src/main/java/com/cerner/bunsen/definitions/PrimitiveConverter.java: Recompile with -Xlint:unchecked for details.
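The deprecation warning above refers to `Class.newInstance()`, deprecated since Java 9. A sketch of the standard replacement follows; the helper name is illustrative and not taken from `FhirConversionSupport`:

```java
// Illustrative replacement for the deprecated Class.newInstance() pattern.
public class NewInstanceSketch {

  // getDeclaredConstructor().newInstance() is the replacement suggested by the
  // deprecation note: constructor exceptions are wrapped in
  // InvocationTargetException instead of being rethrown unchecked.
  static <T> T create(Class<T> clazz) throws ReflectiveOperationException {
    return clazz.getDeclaredConstructor().newInstance();
  }

  public static void main(String[] args) throws ReflectiveOperationException {
    StringBuilder sb = create(StringBuilder.class);
    System.out.println(sb.length()); // a freshly constructed StringBuilder is empty
  }
}
```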
Step #1 - "Compile Bunsen and Pipeline": [INFO] 
Step #1 - "Compile Bunsen and Pipeline": [INFO] --- spotless-maven-plugin:2.43.0:apply (default) @ bunsen-core ---
Step #1 - "Compile Bunsen and Pipeline": [INFO] Spotless apply skipped
Step #1 - "Compile Bunsen and Pipeline": [INFO] 
Step #1 - "Compile Bunsen and Pipeline": [INFO] --- maven-resources-plugin:3.3.1:testResources (default-testResources) @ bunsen-core ---
Step #1 - "Compile Bunsen and Pipeline": [INFO] skip non existing resourceDirectory /workspace/bunsen/bunsen-core/src/test/resources
Step #1 - "Compile Bunsen and Pipeline": [INFO] 
Step #1 - "Compile Bunsen and Pipeline": [INFO] --- maven-compiler-plugin:3.13.0:testCompile (default-testCompile) @ bunsen-core ---
Step #1 - "Compile Bunsen and Pipeline": [INFO] Recompiling the module because of changed dependency.
Step #1 - "Compile Bunsen and Pipeline": [INFO] Compiling 2 source files with javac [debug deprecation release 11] to target/test-classes
Step #1 - "Compile Bunsen and Pipeline": [INFO] 
Step #1 - "Compile Bunsen and Pipeline": [INFO] --- maven-surefire-plugin:3.3.0:test (default-test) @ bunsen-core ---
Step #1 - "Compile Bunsen and Pipeline": [INFO] Surefire report directory: /workspace/bunsen/bunsen-core/target/surefire-reports
Step #1 - "Compile Bunsen and Pipeline": [INFO] Using auto detected provider org.apache.maven.surefire.junit4.JUnit4Provider
Step #1 - "Compile Bunsen and Pipeline": [INFO] 
Step #1 - "Compile Bunsen and Pipeline": [INFO] -------------------------------------------------------
Step #1 - "Compile Bunsen and Pipeline": [INFO]  T E S T S
Step #1 - "Compile Bunsen and Pipeline": [INFO] -------------------------------------------------------
Step #1 - "Compile Bunsen and Pipeline": [INFO] Running com.cerner.bunsen.ProfileMapperProviderTest
Step #1 - "Compile Bunsen and Pipeline": 13:58:16.017 [main] INFO ca.uhn.fhir.util.VersionUtil -- HAPI FHIR version 7.2.1 - Rev 547c9320f1
Step #1 - "Compile Bunsen and Pipeline": 13:58:16.054 [main] INFO ca.uhn.fhir.context.FhirContext -- Creating new FHIR context for FHIR version [R4]
Step #1 - "Compile Bunsen and Pipeline": 13:58:16.353 [main] INFO ca.uhn.fhir.context.support.DefaultProfileValidationSupport -- Loading structure definitions from classpath: /org/hl7/fhir/r4/model/profile/profiles-resources.xml
Step #1 - "Compile Bunsen and Pipeline": 13:58:16.426 [main] INFO ca.uhn.fhir.util.XmlUtil -- Unable to determine StAX implementation: java.xml/META-INF/MANIFEST.MF not found
Step #1 - "Compile Bunsen and Pipeline": 13:58:16.426 [main] DEBUG ca.uhn.fhir.util.XmlUtil -- WstxOutputFactory (Woodstox) not found on classpath
Step #1 - "Compile Bunsen and Pipeline": 13:58:17.998 [main] DEBUG ca.uhn.fhir.context.ModelScanner -- Scanning datatype class: org.hl7.fhir.r4.model.DataRequirement
Step #1 - "Compile Bunsen and Pipeline": 13:58:18.022 [main] DEBUG ca.uhn.fhir.context.ModelScanner -- Scanning datatype class: org.hl7.fhir.r4.model.Expression
Step #1 - "Compile Bunsen and Pip
...
[Logs truncated due to log size limitations. For full logs, see https://storage.cloud.google.com/cloud-build-gh-logs/log-098b371a-ce9a-4440-8b7b-cce426af8e58.txt.]
...
ler and Spark Thriftserver":     "valueCode": "M"
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":   }, {
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":     "url": "http://hl7.org/fhir/StructureDefinition/patient-birthPlace",
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":     "valueAddress": {
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":       "city": "Clinton",
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":       "state": "Massachusetts",
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":       "country": "US"
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":     }
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":   }, {
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":     "url": "http://synthetichealth.github.io/synthea/disability-adjusted-life-years",
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":     "valueDecimal": 0.0
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":   }, {
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":     "url": "http://synthetichealth.github.io/synthea/quality-adjusted-life-years",
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":     "valueDecimal": 25.0
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":   } ],
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":   "identifier": [ {
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":     "system": "https://github.com/synthetichealth/synthea",
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":     "value": "9785afad-19cc-02f6-de65-b2de2618c9ac"
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":   }, {
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":     "type": {
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":       "coding": [ {
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":         "system": "http://terminology.hl7.org/CodeSystem/v2-0203",
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":         "code": "MR",
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":         "display": "Medical Record Number"
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":       } ],
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":       "text": "Medical Record Number"
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":     },
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":     "system": "http://hospital.smarthealthit.org",
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":     "value": "9785afad-19cc-02f6-de65-b2de2618c9ac"
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":   }, {
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":     "type": {
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":       "coding": [ {
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":         "system": "http://terminology.hl7.org/CodeSystem/v2-0203",
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":         "code": "SS",
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":         "display": "Social Security Number"
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":       } ],
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":       "text": "Social Security Number"
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":     },
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":     "system": "http://hl7.org/fhir/sid/us-ssn",
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":     "value": "999-66-5218"
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":   }, {
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":     "type": {
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":       "coding": [ {
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":         "system": "http://terminology.hl7.org/CodeSystem/v2-0203",
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":         "code": "DL",
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":         "display": "Driver's License"
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":       } ],
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":       "text": "Driver's License"
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":     },
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":     "system": "urn:oid:2.16.840.1.113883.4.3.25",
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":     "value": "S99939769"
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":   }, {
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":     "type": {
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":       "coding": [ {
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":         "system": "http://terminology.hl7.org/CodeSystem/v2-0203",
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":         "code": "PPN",
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":         "display": "Passport Number"
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":       } ],
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":       "text": "Passport Number"
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":     },
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":     "system": "http://standardhealthrecord.org/fhir/StructureDefinition/passportNumber",
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":     "value": "X43420680X"
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":   } ],
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":   "name": [ {
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":     "use": "official",
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":     "family": "Anderson",
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":     "given": [ "Jeremiah751" ],
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":     "prefix": [ "Mr." ]
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":   } ],
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":   "telecom": [ {
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":     "system": "phone",
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":     "value": "555-241-4212",
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":     "use": "home"
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":   } ],
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":   "gender": "male",
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":   "birthDate": "1995-04-26",
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":   "address": [ {
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":     "extension": [ {
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":       "url": "http://hl7.org/fhir/StructureDefinition/geolocation",
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":       "extension": [ {
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":         "url": "latitude",
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":         "valueDecimal": 42.72180337634059
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":       }, {
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":         "url": "longitude",
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":         "valueDecimal": -71.18537438961067
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":       } ]
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":     } ],
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":     "line": [ "788 Mann Mall" ],
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":     "city": "Lawrence",
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":     "state": "MA",
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":     "postalCode": "01843",
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":     "country": "US"
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":   } ],
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":   "maritalStatus": {
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":     "coding": [ {
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": 
  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
100  1044    0   976  100    68   3398    236 --:--:-- --:--:-- --:--:--  3625
100  4537    0  4469  100    68  15473    235 --:--:-- --:--:-- --:--:-- 15698
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":       "system": "http://terminology.hl7.org/CodeSystem/v3-MaritalStatus",
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":       "code": "S",
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":       "display": "Never Married"
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":     } ],
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":     "text": "Never Married"
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":   },
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":   "multipleBirthBoolean": false,
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":   "communication": [ {
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":     "language": {
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":       "coding": [ {
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":         "system": "urn:ietf:bcp:47",
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":         "code": "en-US",
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":         "display": "English"
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":       } ],
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":       "text": "English"
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":     }
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":   } ]
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": }
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": E2E TEST FOR CONTROLLER SPARK DEPLOYMENT: Patient 4765 updated successfully.
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":   % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":                                  Dload  Upload   Total   Spent    Left  Speed
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": 
  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0*   Trying 192.168.10.8:8080...
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": * Connected to pipeline-controller (192.168.10.8) port 8080 (#0)
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": > POST /run?runMode=INCREMENTAL HTTP/1.1
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": > Host: pipeline-controller:8080
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": > User-Agent: curl/7.88.1
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": > Content-Type: application/json
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": > Accept: */*
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": > 
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": < HTTP/1.1 500 
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": < Content-Type: text/plain;charset=UTF-8
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": < Content-Length: 28
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": < Date: Mon, 01 Jul 2024 14:35:30 GMT
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": < Connection: close
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": < 
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": { [28 bytes data]
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": 
100    28  100    28    0     0   5290      0 --:--:-- --:--:-- --:--:--  5600
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": * Closing connection 0
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": Another pipeline is running.
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": E2E TEST FOR CONTROLLER SPARK DEPLOYMENT: Total patients: 79
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": E2E TEST FOR CONTROLLER SPARK DEPLOYMENT: Total encounters: 4006
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": E2E TEST FOR CONTROLLER SPARK DEPLOYMENT: Total observations: 17279
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": [The three "Total patients: 79 / Total encounters: 4006 / Total observations: 17279" lines above repeat verbatim on each subsequent poll until the step times out.]
Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": E2E TEST FOR CONTROLLER SPARK DEPLOYMENT: Could not validate parquet files.
Finished Step #15 - "Run E2E Test for Dockerized Controller and Spark Thriftserver"
ERROR
ERROR: build step 15 "us-docker.pkg.dev/cloud-build-fhir/fhir-analytics/e2e-tests/controller-spark:0c7bac5" failed: step exited with non-zero status: 2
Step #0 - "Launch HAPI Source and Sink Servers":
Creating hapi-server  ... done
Creating sink-server  ... done
Creating hapi-fhir-db ... done
Step #13 - "Launch HAPI FHIR Sink Server":
Creating sink-server ... done
Step #14 - "Bring up controller and Spark containers":
Creating pipeline-controller ... done

Build Log: https://storage.cloud.google.com/cloud-build-gh-logs/log-098b371a-ce9a-4440-8b7b-cce426af8e58.txt