Replies: 2 comments 1 reply
-
Yes, it's possible with some bash scripts and a bit of Docker trickery, but personally I would use Docker's export/import feature: https://computingforgeeks.com/how-to-export-and-import-docker-images-containers/ That way you can convert a Docker image to a tar archive, transfer it to another computer, and import it there.
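A rough sketch of that workflow (image and file names are placeholders; note that `docker save`/`docker load` operate on images, while `docker export`/`docker import` operate on containers and flatten the layer history):

```shell
# On the source machine: write the image to a tar archive
docker save -o nominatim.tar mediagis/nominatim

# Transfer nominatim.tar (USB drive, scp, etc.), then on the target machine:
docker load -i nominatim.tar
```

Keep in mind that named volumes (such as the PostgreSQL data volume) are not included in the image archive and have to be moved separately.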
-
You could also use a bind mount on your personal host, move the folder with the PostgreSQL contents to the new machine, and re-use that folder as a bind mount again.
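For example, something along these lines (host path, container path, and image tag are placeholders; check the image's documentation for the actual PostgreSQL data directory it uses):

```shell
# Run with the database directory bind-mounted from the host
docker run -d --name nominatim \
  -v /srv/nominatim-data:/var/lib/postgresql/14/main \
  mediagis/nominatim:4.2

# Later: stop the container, copy /srv/nominatim-data to the new host
# (e.g. with rsync), and start a container there with the same -v flag.
```

This avoids re-importing the OSM data entirely, since the database files themselves travel with the folder.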
-
Is your feature request related to a problem? Please describe.
I need to run a mediagis-nominatim container at work with OSM data for France, but the computers there are really slow, so I created the container and database at home and exported a dump of the data using pg_dump (db.sql ≈ 60 GB).
How could I start a new instance at work with this dump, and would it be faster?
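One possible manual workaround in the meantime (a sketch, not a built-in feature of the image; the container name `nominatim` and database name are placeholders, and the dump must match the PostgreSQL version inside the container):

```shell
# Copy the plain-SQL dump into the running container
docker cp db.sql nominatim:/tmp/db.sql

# Restore it into the container's PostgreSQL as the postgres user
docker exec -it nominatim sudo -u postgres psql -d nominatim -f /tmp/db.sql
```

Restoring a dump skips the expensive osm2pgsql import step, which is usually the slow part, so it should be considerably faster than a fresh import on slow hardware.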
Describe the solution you'd like
Instead of PBF_DATA="path.osm.pbf", the image could accept something like SQL_DATA="path.sql" and restore the dump instead of importing the PBF file.
Describe alternatives you've considered
None
Additional context
None