Script stops when trying to parse big XML files #67

Open
Myrwan opened this issue Feb 13, 2023 · 4 comments

Comments

Myrwan commented Feb 13, 2023

Hi,
when trying to parse big files (around 15 MB) the script stops without any error message.

Example command:
testarchiver --config db_config.json --format robotframework --team QA --series "Overnight#$CI_JOB_ID" --metadata Application:perso output_206431_valid.xml

The database system is PostgreSQL.

Any idea how to analyse the problem?

Thanks in advance

Muusssi (Member) commented Feb 13, 2023

Interesting issue. Failing without an error message or even a stack trace is not expected.

  • For Robot Framework output, 15 MB is not even that big, so size alone should not be the issue. I have been routinely parsing 1-2 GB output.xml files without problems.
  • What versions of TestArchiver and Robot Framework are you using? (See the commands below for one way to check.)
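
For reference, here is a minimal way to check both versions, assuming a pip-based installation:

pip show testarchiver   # prints the installed TestArchiver version
robot --version         # prints the Robot Framework version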

Muusssi (Member) commented Feb 13, 2023

One thing that has sometimes confused me is that the default db_engine is sqlite: when I forgot to define the db_engine, I would end up looking for the missing results in PostgreSQL while they had only been written to an SQLite database file.

Not knowing what you have defined in your db_config.json, it could be that the db_engine is not defined and therefore defaults to sqlite. This is unlikely, as providing the host or user options should fail with sqlite and you most likely have those defined, but it is also possible that parsing of db_config.json goes wrong somehow and the tool falls back to the defaults.
Please check that the results have not been written to an SQLite database file called test_archive.
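
One quick way to check, using the standard sqlite3 command-line tool in the directory where testarchiver was invoked; if the file exists and lists tables, the results went into SQLite:

ls -l test_archive               # does the file exist at all?
sqlite3 test_archive ".tables"   # list the tables it contains, if any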

Myrwan (Author) commented Feb 15, 2023

Thanks for your reply.

Here are the versions:

  • testarchiver 2.6.1
  • psql (PostgreSQL) 15.1 (Debian 15.1-1.pgdg110+1)

And here is the content of the db_config.json file:
{
"db_engine": "postgresql",
"database": "robot_results_3",
"host": "192.168.4.125",
"port": "5432",
"user": "postgres",
"password": "postgres"
}
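
As a sanity check against that exact configuration, the standard psql client can confirm the connection and show which tables the tool created (a quick sketch, assuming psql is available on the machine running testarchiver):

psql -h 192.168.4.125 -p 5432 -U postgres -d robot_results_3 -c '\dt'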

The DB connection is opened correctly (though apparently never closed?) and transactions to the DB are OK, but the script ends prematurely and the output.xml file is therefore not fully parsed.

Note: it works fine if I add the options --no-keywords --no-keyword-stats --ignore-logs
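
Since those flags drastically reduce the amount of data parsed and buffered, one low-effort way to narrow this down is to check how the process actually exits (a sketch; any non-zero code means it did not finish cleanly, and 137 means SIGKILL, which on Linux is often the kernel's out-of-memory killer):

testarchiver --config db_config.json --format robotframework output_206431_valid.xml
echo "exit code: $?"    # 0 = clean exit; 137 = killed by SIGKILL
dmesg | grep -i -E 'killed process|out of memory'   # any recent OOM kills?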

Myrwan (Author) commented Feb 15, 2023

I think the problem comes from network congestion when the bandwidth decreases, but I am not sure because I cannot investigate that part.

But yes, what confuses me is that there is no error displayed when the script ends.
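
If the process is dying on a fatal signal rather than a normal Python exception, CPython's built-in faulthandler can be enabled from the outside without modifying the tool (a sketch using the standard PYTHONFAULTHANDLER environment variable, which dumps a traceback on crashes such as SIGSEGV or SIGABRT, though not on SIGKILL):

PYTHONFAULTHANDLER=1 testarchiver --config db_config.json --format robotframework output_206431_valid.xml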
