
Question about OOM in GrammarMutator #33

Open
Kiprey opened this issue Mar 28, 2022 · 4 comments

@Kiprey
Contributor

Kiprey commented Mar 28, 2022

Hello! I would like to ask whether OOM (out-of-memory) is considered in GrammarMutator.
There seems to be no limit on the size of interesting_trees and the various *_candidates containers in GrammarMutator.
This can lead to OOM during long fuzzing runs.

If this is a bug, could you please fix it? Or am I just missing something?

Thanks.

@ifratric
Collaborator

ifratric commented Mar 29, 2022

Hi! It's certainly possible to hit OOM if the samples are large and the corpus keeps growing. However, I'm curious whether you can share more about your setup (either here or privately), because I have used the grammar engine to fuzz reasonably complex targets (such as the v8 JavaScript engine) and didn't encounter such errors in practice (IIRC the fuzzing workers had 8 or 16 GB of RAM each, not sure anymore).

Jackalope has some mechanisms to decrease RAM usage, such as -keep_samples_in_memory=0; however, this might not help much in grammar mode. The interesting_trees container you mentioned keeps all samples in the corpus in tree form, so that these trees can be combined with the currently fuzzed sample to generate new interesting samples. In the interest of saving memory, you could simply comment out this line:

interesting_trees.push_back(context->tree);

However, those types of mutations would then no longer work.

However, there are several other, easier things you can do to reduce memory usage in grammar fuzzing mode:

  • Reduce the value of MAX_DEPTH:
    #define MAX_DEPTH 100
    This declares the maximum depth of the tree corresponding to the grammar samples.
  • Reduce the value of MINIMIZATION_LIMIT:
    #define MINIMIZATION_LIMIT 1500
    This describes when the minimizer stops; e.g. a value of 1500 means the minimizer will stop when there are fewer than 1500 nodes in the tree, even if the sample could be minimized further. If significantly reducing this value does not make samples in the corpus smaller, it could mean the minimizer is not working properly.
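To make the MINIMIZATION_LIMIT behavior concrete, here is a minimal C++ sketch of a node-count-based stopping check. The Node type and helper names below are illustrative assumptions, not Jackalope's actual code:

```cpp
#include <cstddef>
#include <vector>

// Toy tree node standing in for a grammar sample tree (illustrative only).
struct Node {
  std::vector<Node*> children;
};

constexpr std::size_t MINIMIZATION_LIMIT = 1500;

// Count all nodes in the tree rooted at n.
std::size_t CountNodes(const Node* n) {
  std::size_t total = 1;
  for (const Node* c : n->children) total += CountNodes(c);
  return total;
}

// The minimizer keeps shrinking the tree only while it still has at least
// MINIMIZATION_LIMIT nodes, even if the sample could be minimized further.
bool ShouldKeepMinimizing(const Node* root) {
  return CountNodes(root) >= MINIMIZATION_LIMIT;
}
```

Lowering the limit makes the stopping condition hold for smaller trees, so minimization runs longer and the stored corpus samples end up smaller.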

Having said all of the above, if there is a memory leak somewhere, of course this would be considered a bug.

@Kiprey
Contributor Author

Kiprey commented Mar 30, 2022

Thank you for your reply! I combined the Jackalope grammar mutation strategy with AFL (aliased afl-jackalope) to better fuzz binaries built with source code. So I'm sorry I cannot provide the OOM setup, as I'm not using Jackalope directly.

For Jackalope itself, simply removing interesting_trees.push_back(context->tree) may not reduce memory consumption, because context is still held by other data structures, such as sample_queue.

For afl-jackalope, once interesting_trees reaches a certain size, I simply have it delete half of its nodes to prevent OOM; this now works quite well.
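That eviction workaround could be sketched as follows in C++. The names kMaxInterestingTrees, TreeNode, and PushInterestingTree are hypothetical, not afl-jackalope's actual code:

```cpp
#include <cstddef>
#include <vector>

// Stand-in for the grammar tree type (illustrative only).
struct TreeNode {};

// Hypothetical cap on the number of stored trees; tune to your RAM budget.
constexpr std::size_t kMaxInterestingTrees = 10000;

// Once the container reaches the cap, drop the older half before pushing,
// bounding memory at the cost of forgetting old corpus trees.
void PushInterestingTree(std::vector<TreeNode*>& interesting_trees,
                         TreeNode* tree) {
  if (interesting_trees.size() >= kMaxInterestingTrees) {
    // In real code, free the evicted trees first if this container owns them.
    interesting_trees.erase(
        interesting_trees.begin(),
        interesting_trees.begin() + interesting_trees.size() / 2);
  }
  interesting_trees.push_back(tree);
}
```

Evicting the front half keeps the most recently found trees, which are often the most relevant for splicing-style mutations.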

I read the source code of Jackalope carefully. Although there are quite a few structures that never release memory, their life cycles are correct, so there is no hint of any memory issue inside.

I have solved the OOM problem. However, I still have some questions about the corpus.

  • Can jackalope run without inputs?
  • Is there any way to convert from common JS to serialized tree (jackalope input format)?
  • Does the input of Jackalope have an impact on its mutation process? If so, what is the approximate impact?

Please excuse my rough English. Best wishes to you! :)

@ifratric
Collaborator

Thanks for reviewing the code :)

"Can jackalope run without inputs?"
Not sure what you mean; did you mean with inputs? In the grammar fuzzing mode, it's actually expected that Jackalope will start without inputs; the initial inputs will be generated using the provided grammar. See more info in this thread: #26

"Is there any way to convert from common JS to serialized tree (jackalope input format)?"
As noted in the other thread, there is no guaranteed unique way to parse a sample generated by a context-free grammar back into its grammar representation. We could maybe do a "best effort", but this is not implemented in Jackalope currently.

"Would the input of jackalope have an impact on its mutation process? And if so, what is the approximate impact?"
Sorry, not sure what you mean here - can you give me an example?

@Kiprey
Contributor Author

Kiprey commented Mar 31, 2022

Thanks for your kind reply. My last question was about whether the corpus matters for the fuzzing process or the coverage results. For example, in an AST fuzzer, good corpus inputs can significantly improve the fuzzer's performance, while junk corpus inputs contribute little to the overall fuzzing process.

Now I can probably deduce the answer to the last question from your kind answer and the great code implementation.
Thanks for your work and patient replies. I have no other questions.

Best wishes.
