datafile.py killed on huge snmpwalk files #143

Open
landy2005 opened this issue Jun 9, 2020 · 3 comments

Comments

@landy2005
Contributor

landy2005 commented Jun 9, 2020

I have a big snmpwalk dump:

$ ls -lh myagent.snmpwalk
-rw-rw-r-- 1 mstupalov mstupalov 221M Jun  9 12:41 myagent.snmpwalk

When I try to convert it to snmprec, the datafile script gets killed for an unknown reason (probably a memory issue):

$ datafile.py --ignore-broken-records --escaped-strings --source-record-type=snmpwalk --input-file=myagent.snmpwalk --output-file=myagent.snmprec
# Input file #0, processing records from the beginning till the end
Killed
@gainskills

I searched the repository and found no code related to 'Killed'. I would suggest you check your OS.

@landy2005
Contributor Author

Funny..
Of course "Killed" is an OS message, printed when memory runs out.

I was able to convert the file by breaking it into 4 parts (a sketch of that workaround is below), but I don't think this is the correct way; the script's memory handling and cleanup at runtime need to be checked.

Just to be sure, I also tried running without the additional parameters (--ignore-broken-records --escaped-strings), with the same result.
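
For reference, a minimal sketch of the split-and-convert workaround, assuming GNU split is available; -C keeps whole lines together, so at worst a value that spans multiple lines is cut at a chunk boundary, which --ignore-broken-records then tolerates:

$ # split the walk into ~60 MB chunks without breaking lines
$ split -C 60m -d myagent.snmpwalk part_
$ for f in part_*; do
>   datafile.py --ignore-broken-records --escaped-strings \
>     --source-record-type=snmpwalk \
>     --input-file="$f" --output-file="$f.snmprec"
> done
$ # the walk is already in OID order, so the parts concatenate cleanly
$ cat part_*.snmprec > myagent.snmprec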

@gainskills

@landy2005 could you share a sample file for the issue?
