Increased Volume Found after --continue #2257
Comments
Thanks @aberenguel for reporting. It shouldn't happen; possibly it is related to #telegram.
A few weeks ago I noticed some items were receiving a different
Are you able to share a small image and provide the aborting point to resume from, so we can reproduce the issue?
Do you have Telegram databases in this case? @hauck-jvsh, does the Telegram parser always extract subitems in the same order? This is needed for --continue to work properly, since the subitem number/position is used in the trackID computation. If a HashSet or HashMap, for example, is used somewhere in the parser to store subitems to be extracted, that may be the cause; just a blind guess...
Or maybe other parsers are affected by the hypothesis above...
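The ordering hypothesis above can be illustrated with a small, self-contained sketch. The class and item names below are hypothetical, not IPED code: a `HashSet` iterates in hash-bucket order rather than insertion order, so any ID derived from an item's position in the iteration is fragile, while a `LinkedHashSet` keeps the extraction order stable.

```java
import java.util.ArrayList;
import java.util.HashSet;
import java.util.LinkedHashSet;
import java.util.List;

// Hypothetical sketch: shows why a position-based trackID can break if a
// parser stores subitems in a hash-ordered collection. Not actual IPED code.
public class SubitemOrderDemo {

    public static void main(String[] args) {
        // Extraction order as the parser discovered the subitems
        List<String> extractionOrder = List.of("chat_42", "photo_7", "video_3", "doc_19");

        // HashSet iterates in hash-bucket order, which generally differs from
        // insertion order (and can change across JVM versions or element types).
        List<String> hashOrder = new ArrayList<>(new HashSet<>(extractionOrder));

        // LinkedHashSet preserves insertion order, so position-derived IDs stay stable.
        List<String> linkedOrder = new ArrayList<>(new LinkedHashSet<>(extractionOrder));

        System.out.println("extraction : " + extractionOrder);
        System.out.println("HashSet    : " + hashOrder);   // order not guaranteed
        System.out.println("LinkedHash : " + linkedOrder); // same as extraction order

        // If trackID = f(parentId, position), only the LinkedHashSet traversal
        // guarantees every subitem the same position on every run.
    }
}
```

If a parser does rely on a hash-ordered collection, switching to `LinkedHashSet`/`LinkedHashMap` (or sorting by a stable key before numbering) would make the position, and hence the trackID, reproducible across --continue runs.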
I never thought about this, and I never ran any test to check whether the items are being extracted in the same order.
Don't worry, I never gave this recommendation to contributors, and I only realized today that it might be the issue.
The triggering image is 1 TB. I'll try to reproduce with another image.
I was able to reproduce it by killing the Java processes with
Are you able to identify which files/artifacts were duplicated in the case? Look at the trackId property in the Metadata filter panel; each value should occur for just one file. If no trackId is duplicated, looking for files with the same hashes and the same paths may help to find the duplicated artifacts. Actually, there is a small chance that nothing was duplicated and this is just a minor bug in the Total Volume counter.
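The duplicate check suggested above could be scripted along these lines. `Item` and its fields are hypothetical stand-ins for the case item model, just to show the grouping: collect items by trackId and report any value that occurs more than once.

```java
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

// Hypothetical sketch of the duplicate check described above: group case
// items by trackId and report values occurring more than once. Not IPED code.
public class DuplicateTrackIds {

    // Minimal stand-in for a processed item; IPED's real item class differs.
    record Item(String trackId, String hash, String path) {}

    static List<String> duplicatedTrackIds(List<Item> items) {
        // Count how many files carry each trackId value
        Map<String, Long> counts = items.stream()
                .collect(Collectors.groupingBy(Item::trackId, Collectors.counting()));
        return counts.entrySet().stream()
                .filter(e -> e.getValue() > 1)   // trackId seen on more than one file
                .map(Map.Entry::getKey)
                .sorted()
                .toList();
    }

    public static void main(String[] args) {
        List<Item> items = List.of(
                new Item("t-001", "aabb", "/evidence/a.jpg"),
                new Item("t-002", "ccdd", "/evidence/b.jpg"),
                new Item("t-001", "aabb", "/evidence/a.jpg")); // processed twice after resume
        System.out.println(duplicatedTrackIds(items)); // prints [t-001]
    }
}
```

The same hash + same path fallback mentioned above would just change the grouping key from `Item::trackId` to a composite of hash and path.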
I took a look at the --continue related code, and it seems it takes into account the size of items that are containers AND of subitems/carved items in the total volume to process, while standard processing doesn't. So maybe this is just a minor issue with the volume count...
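The counting difference described above can be sketched as follows (the item model is hypothetical): if the resume path sums containers plus their subitems/carved children while the standard path counts only the original items, the Volume Found after --continue is inflated even though no file was actually processed twice.

```java
import java.util.List;

// Hypothetical sketch of the volume-count discrepancy described above.
public class VolumeCountSketch {

    // Minimal stand-in for a case item; IPED's real model differs.
    record Item(long sizeBytes, boolean isSubitemOrCarved) {}

    // Standard processing: subitems/carved files are only discovered while
    // processing runs, so the initial total counts just the original items.
    static long standardVolume(List<Item> items) {
        return items.stream()
                .filter(i -> !i.isSubitemOrCarved())
                .mapToLong(Item::sizeBytes)
                .sum();
    }

    // Resume (--continue): items reloaded from the case already include the
    // subitems and carved files, so summing everything double-counts bytes
    // that also live inside their containers.
    static long resumeVolume(List<Item> items) {
        return items.stream().mapToLong(Item::sizeBytes).sum();
    }

    public static void main(String[] args) {
        List<Item> items = List.of(
                new Item(100, false), // container from the image
                new Item(40, true),   // subitem extracted from the container
                new Item(10, true));  // carved file
        System.out.println(standardVolume(items)); // 100
        System.out.println(resumeVolume(items));   // 150
    }
}
```

Under this assumption the fix would be to apply the same filter on both paths, so both runs report the same total.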
I'm processing a dd file with the master branch (pedo profile, with enableOCR and enableFaceRecognition).
During processing, the process aborted, so I re-executed it with --continue.
Then I noticed that the Volume Found (Volume Descoberto) value increased.