This is just a "has anyone else had this problem?" query.
I'm working on generating FHIR resources and writing them into a Google Healthcare API FHIR store using the fhir.resources.R4B library. I'm doing this in an Apache Beam job, which you build and test locally and then, once you're happy, run as a Dataflow job in GCP (Google Cloud Platform).
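For context, here's a minimal sketch of the kind of resource-building I mean (illustrative values only, assuming fhir.resources 7.x, where the R4B models live under fhir.resources.R4B; on 6.x the serialisation call would be .json() instead):

```python
# Minimal sketch, not my actual pipeline code. Assumes fhir.resources 7.x,
# where the R4B models live under the fhir.resources.R4B package.
from fhir.resources.R4B.patient import Patient

# Build a trivial Patient resource (illustrative values).
patient = Patient(id="example", birthDate="1970-01-01")

# Serialise to JSON ready to POST to the Healthcare API FHIR store.
# (On fhir.resources 6.x / pydantic v1 this would be patient.json().)
payload = patient.model_dump_json()
print(payload)
```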
It worked 100% running locally: I could build FHIR resources and post them to the FHIR store just fine. I ran my Apache Beam job with the DirectRunner, which lets you run the job on your local PC.
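For anyone unfamiliar, the runner is just a pipeline option, so local vs. cloud is a one-flag switch; a minimal sketch with illustrative option values (not my real project names):

```python
# Minimal sketch of how I switch between local and cloud execution
# (illustrative values, not my real project/bucket names).
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

# Locally: --runner=DirectRunner runs the pipeline on your own PC.
# On GCP:  --runner=DataflowRunner submits it as a Dataflow job, e.g.
#   --project=my-project --region=europe-west2 \
#   --temp_location=gs://my-bucket/tmp --requirements_file=requirements.txt
options = PipelineOptions(runner="DirectRunner")

with beam.Pipeline(options=options) as p:
    (p
     | beam.Create(["hello"])
     | beam.Map(print))
```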
When I run it in GCP using the requirements.txt below, Python installs the library fine (Google's logging indicates the install worked and the lib was retrieved from PyPI OK) and the Dataflow job starts fine, but it just doesn't pick up any data. We're reading from a Pub/Sub subscription. Remove the fhir.resources.R4B library and the job runs and picks up data!
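Here's a stripped-down version of the Pub/Sub read side of the job (hypothetical project and subscription names). One thing worth noting is that Pub/Sub is an unbounded source, so the pipeline has to run in streaming mode:

```python
# Stripped-down sketch of the Pub/Sub read (hypothetical subscription name).
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions, StandardOptions

options = PipelineOptions()
# Pub/Sub is an unbounded source, so the job must run in streaming mode.
options.view_as(StandardOptions).streaming = True

with beam.Pipeline(options=options) as p:
    messages = (p
        | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
            subscription="projects/my-project/subscriptions/my-sub")
        | "Decode" >> beam.Map(lambda m: m.decode("utf-8")))
```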
Requirements.txt:
Just wondered if:

- Has anyone used this library successfully with Dataflow in Google Cloud?
- Can anyone suggest libraries I could add to my installs that might help? (Is this a dependency issue? See the version-check sketch below.)
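In case it helps anyone reproduce this, one way to see what the workers actually have installed is to log package versions from inside a DoFn, to rule out a local/worker dependency mismatch; a rough sketch (stdlib importlib.metadata only, hypothetical package list):

```python
# Rough sketch: log the versions the Dataflow workers actually see.
import logging
from importlib.metadata import version, PackageNotFoundError

import apache_beam as beam

class LogVersions(beam.DoFn):
    def process(self, element):
        # Hypothetical list of the packages I'd want to compare
        # against my local virtual environment.
        for pkg in ("fhir.resources", "pydantic", "apache-beam"):
            try:
                logging.info("%s==%s", pkg, version(pkg))
            except PackageNotFoundError:
                logging.warning("%s not installed on this worker", pkg)
        yield element
```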
Lastly, I did a pip freeze on my local machine, which gives you ALL the libraries in your virtual environment, and installed all of them in GCP Dataflow in the hope that it was a missing lib I wasn't being told about, but no joy. The pip freeze library list was long.
Pip freeze output is a bit long, but here it is:
Any comments or suggestions appreciated.

Thanks.