-
Hello all. Also, how can I use the external Postgres server that we have in production? That may sound confusing, so let's break it down:
1 - We need to export/migrate the data that we already have in the DD PROD environment (tests, findings and so on) to an external Postgres DB.
2 - We need to install DD using the external Postgres DB. I have already found some information about this, but it's a bit confusing where everything should be configured.
3 - Can we streamline the docker-compose file so we only use the configuration that we need (Postgres and so on), instead of all the confusion of the profiles?
Thanks in advance for your support. Jorge Gomes
-
@quirinziessler, maybe you can help here.
-
Hi @jasgggit, I actually had nearly the same situation as you a few months ago, so I will try to answer your questions.
Regarding 1:
Regarding 2:
Regarding 3:
Hope this helps you a bit.
-
Hi @quirinziessler, I did; the '-v 3' sets verbosity level 3. This is what it outputs:
/opt/containers/django-DefectDojo # docker exec -it django-defectdojo-uwsgi-1 python manage.py loaddata -v 3 dumpfile.json
I can't figure out what the issue is...
-
About this:
An example of that is available in the Community Contribs repo. If you navigate into the dojo directory in that part of Community Contribs, you can see a 'simplified' version of the compose file at:
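To make the external-Postgres part more concrete, here is a minimal sketch (not the Community Contribs file itself) of pointing the standard compose stack at an external Postgres server through a docker-compose.override.yml. The service names (uwsgi, celeryworker, celerybeat, initializer) and the DD_DATABASE_URL variable follow the stock DefectDojo compose setup, but verify both against your version; host, user, password and database name below are placeholders.

```bash
# Sketch: generate a docker-compose.override.yml that points DefectDojo at an
# external Postgres server instead of the bundled database container.
# All connection details are placeholders.
DB_URL="postgresql://defectdojo:changeme@pg.example.internal:5432/defectdojo"

cat > docker-compose.override.yml <<EOF
services:
  uwsgi:
    environment:
      DD_DATABASE_URL: "${DB_URL}"
  celeryworker:
    environment:
      DD_DATABASE_URL: "${DB_URL}"
  celerybeat:
    environment:
      DD_DATABASE_URL: "${DB_URL}"
  initializer:
    environment:
      DD_DATABASE_URL: "${DB_URL}"
EOF

# Start only the services you still need locally; the bundled postgres
# service can simply be left out (adjust the list to your compose file).
docker compose up -d nginx uwsgi celeryworker celerybeat redis initializer
```

The override-file approach keeps the upstream docker-compose.yml untouched, which makes later upgrades easier.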
-
Hi @jasgggit, I just tried to reproduce your case, and dumping and loading the file works fine for me. Could you please check whether the dumpfile is actually available inside the container? Just a side note: the command I sent you saves the file in the current directory on the host (not inside the Docker container/volume). To make it available inside the container, move it into one of the DefectDojo volumes, e.g. media. Another option is to log in to the uwsgi container and drop it directly into media. After configuring DefectDojo with Postgres, log in again and load it from media.
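For anyone hitting the same problem, a minimal sketch of the copy-then-load approach described above, assuming the container name used earlier in this thread and that the media volume is mounted at /app/media inside the uwsgi container (check the exact path for your version):

```bash
# Copy the dump produced on the host into the running uwsgi container;
# the media volume is just a convenient writable location.
docker cp dumpfile.json django-defectdojo-uwsgi-1:/app/media/dumpfile.json

# Load it from inside the container, with verbosity turned up so errors are visible.
docker exec -it django-defectdojo-uwsgi-1 \
  python manage.py loaddata -v 3 /app/media/dumpfile.json
```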
-
Hello all, the problem has been solved by one of my colleagues, who knows more about Postgres than I do. He found that the issue with importing the dump JSON file was related to the 'Sequences', specifically 'auditlog_logentry_id_seq': it needs to be changed from the default '1' to a value greater than the highest one in the JSON file. After that, we can flush the database first and then run the commands to load the JSON. This way everything is imported. Thanks for the support. Regards, JG
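For future readers, a hedged sketch of how such a sequence can be bumped. setval and sqlsequencereset are standard Postgres/Django tooling, but check which sequences are actually out of sync in your own dump before running anything, and note that manage.py dbshell needs the psql client available inside the container.

```bash
# Option A: bump the single offending sequence past the highest id in the table.
docker exec -i django-defectdojo-uwsgi-1 python manage.py dbshell <<'SQL'
SELECT setval('auditlog_logentry_id_seq',
              (SELECT COALESCE(MAX(id), 1) FROM auditlog_logentry));
SQL

# Option B: let Django regenerate the reset statements for the whole app
# and pipe them straight back into the database.
docker exec -i django-defectdojo-uwsgi-1 sh -c \
  'python manage.py sqlsequencereset auditlog | python manage.py dbshell'
```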
-
For future readers of this discussion, there's some good info in OWASP's Slack instance in the #defectdojo channel. I'm summarizing greatly here, but one suggestion I found interesting was to use mysqldump to get the schema/data out of MySQL and then use a tool like pgLoader to convert the result to Postgres. Anyway, I hope that helps those that want or need to migrate.
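For reference, a rough sketch of what that route can look like. In practice pgLoader is usually pointed at the live MySQL server rather than at a mysqldump file, so treat this as one possible shape of the commands; hosts, users and passwords are placeholders.

```bash
# Keep a plain schema+data dump from the old MySQL-backed install as a backup.
mysqldump --single-transaction --routines \
  -h mysql.example.internal -u defectdojo -p defectdojo > defectdojo-mysql.sql

# Convert and copy the data into Postgres; pgloader reads straight from MySQL
# and writes to Postgres in one pass, translating types along the way.
pgloader \
  mysql://defectdojo:changeme@mysql.example.internal/defectdojo \
  postgresql://defectdojo:changeme@pg.example.internal/defectdojo
```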
-
Well, it just does not work for me either:
My DB dump from MySQL is 21 GB, and I am trying to load it into Postgres.
-
THE FOLLOWING PROCEDURE HAS BEEN REVISED BASED ON FEEDBACK RECEIVED ON MY ORIGINAL ANSWER AND ISSUES ENCOUNTERED BY OTHER USERS WITHIN THIS DISCUSSION
I can confirm that the tool ... For a day and a half, I attempted to migrate using Django's ... So in summary, this is the successful approach I used for migrating DefectDojo with Docker:
You're all set! Explore DefectDojo to confirm everything works and that no 500 errors appear. I hope this helps others! Please share any feedback on how these steps could be refined, especially if there's a way to avoid using the ...
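Since the enumerated steps above were lost in formatting, here is a hedged outline of the general dumpdata/flush/loaddata flow that several people in this thread used. It is not the exact procedure from the original answer: the container names follow the defaults seen earlier, and the exclude list is a commonly used one rather than the author's exact set.

```bash
# 1. Export everything from the old (MySQL-backed) deployment.
docker exec -it django-defectdojo-uwsgi-1 python manage.py dumpdata \
  --natural-foreign --natural-primary \
  --exclude contenttypes --exclude auth.permission \
  --indent 2 --output /app/media/dumpfile.json

# 2. Move the dump from the old stack to the new Postgres-backed one.
docker cp django-defectdojo-uwsgi-1:/app/media/dumpfile.json .
docker cp dumpfile.json <new-uwsgi-container>:/app/media/dumpfile.json

# 3. On the new stack, wipe the data the initializer seeded, then load the dump.
#    If loaddata later fails on duplicate keys, see the sequence-reset note above.
docker exec -it <new-uwsgi-container> python manage.py flush --no-input
docker exec -it <new-uwsgi-container> python manage.py loaddata /app/media/dumpfile.json
```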
-
I had the same issue. I loaded the data with this command and it works:
-
Hi, I am updating my DefectDojo from v2.30.1 to the latest release. My current setup uses MySQL as the database. As I want to test the upgrade in my local environment first, I exported the .sql dump from the prod environment, set up my local environment in Docker with the latest DD and Postgres, and migrated using pgloader. Everything went smoothly, except that whenever I open a finding, I get a 500 Internal Server Error.
-
There is just one issue that I noticed after migrating to Postgres. I have a script that runs in Jenkins and imports scan results into DD every night. With MySQL it worked as intended, but after switching to Postgres I get 10k+ new findings every time the import runs. I have turned on the options to deduplicate the findings, but no luck. Has anyone had a similar problem or an insight?
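Not a confirmed fix for the behaviour above, but one thing worth checking is whether the nightly job calls the import endpoint (which creates a new test on every run) or the reimport endpoint (which updates an existing test and can close findings that are no longer reported). A hedged sketch of a reimport call against the v2 API; host, token, test id and scan type are placeholders:

```bash
# Re-run a scan against an existing test instead of creating a brand-new one each night.
curl -s -X POST "https://defectdojo.example.internal/api/v2/reimport-scan/" \
  -H "Authorization: Token <api-token>" \
  -F "test=123" \
  -F "scan_type=Trivy Scan" \
  -F "file=@nightly-report.json" \
  -F "active=true" \
  -F "verified=false" \
  -F "close_old_findings=true"
```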