Script stops when trying to parse big XML files #67
Comments
Interesting issue. Failing without an error message or even a stack trace is not expected.
One thing that sometimes confused me is that the default db_engine is sqlite: when I forgot to define the db_engine, I would end up looking for the missing results in PostgreSQL while they had only been written to an SQLite database file. Not knowing what you have defined in your db_config.json, it could be that db_engine is not defined and the tool defaults to sqlite. This is unlikely, though, because providing host or user options should fail with sqlite, and you most likely have those defined. It is also possible that parsing of the db_config.json somehow goes wrong and the tool falls back to the defaults. An explicit config (see the sketch below) would rule this out.
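For reference, a db_config.json that pins the engine explicitly would look roughly like this. The field names follow the TestArchiver documentation; the host, database name, and credentials here are placeholders, not values from this thread:

{
    "db_engine": "postgresql",
    "database": "test_archive",
    "host": "db.example.com",
    "port": 5432,
    "user": "archiver",
    "password": "change_me"
}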
Thanks for your reply. Here are the versions:
And here is the content of the db_config.json file: Note: it works fine if I add the options --no-keywords --no-keyword-stats --ignore-logs (combined command below).
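For reference, combining those workaround flags with the command from the original report (quoted further down) would give something like:

testarchiver --config db_config.json --format robotframework --team QA --series "Overnight#$CI_JOB_ID" --metadata Application:perso --no-keywords --no-keyword-stats --ignore-logs output_206431_valid.xml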
I think the problem comes from network congestion when the bandwidth decreases, but I am not sure because I cannot investigate that part. But yes, what confuses me is that no error is displayed when the script ends.
Hi,
when trying to parse big files (around 15 MB), the script stops without any error message.
Example command:
testarchiver --config db_config.json --format robotframework --team QA --series "Overnight#$CI_JOB_ID" --metadata Application:perso output_206431_valid.xml
The database system is PostgreSQL.
Any idea how to analyse the problem?
Thanks in advance
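A silent stop with no traceback can also happen when the OS kills the process, for example an out-of-memory kill, which on Linux terminates the process without a Python traceback. As a first hedged step, not from this thread, one could confirm that the output file parses at all with Robot Framework's own API, to separate an XML-parsing problem from a database-writing problem. This sketch assumes the robot package is installed and the file name from the report above:

# Minimal sketch: verify the XML parses independently of testarchiver.
# Assumes the 'robot' package (Robot Framework) is installed.
from robot.api import ExecutionResult

result = ExecutionResult('output_206431_valid.xml')
print('Parsed OK, root suite:', result.suite.name)

If this also stops silently, the limiting factor is likely parsing the large file itself (memory) rather than the PostgreSQL connection.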