
Request: add the ability to quarantine images, instead of deleting them #5

Open
csolisr opened this issue Sep 8, 2023 · 2 comments


csolisr commented Sep 8, 2023

Some jurisdictions may require storing CSAM, instead of deleting it, for law enforcement purposes. In this case the material must be securely stored and its access limited. See https://github.com/iftas-org/resources/tree/main/CSAM-CSE#reporting for details.

This tool currently deletes the detected potential CSAM, which, as described above, may constitute a violation of the law in certain jurisdictions. In order to comply with such regulations, this tool should have the ability to:

  • move the potentially offending files out of storage and into a (preferably encrypted) quarantine folder
  • store relevant metadata for enforcement purposes, such as the posting user, community it was posted to, and date of posting
  • allow the files to be reviewed before quarantining via a dry run, as the tool already supports for deletion
poVoq commented Sep 8, 2023

IANAL, but this only applies if you are aware of the specific CSAM on your server. So if this system automatically scans and deletes it before you are aware of it, there is no requirement to quarantine it as evidence. In the end it is no different from bulk deleting any other file.

Which is kinda the point, as interacting with law enforcement on this can get you in hot water, even if you approach it with the best intentions. In Germany, for example, doing what you describe can land you in prison even if you only stored the material as evidence. There are efforts to revise that law, since it extends to totally innocent people (some details in German here), but this hasn't happened yet.

I think in many jurisdictions the safest approach is to delete before you are even aware (which this tool can do): it is the quickest option, and what you don't have can't be used as evidence against you.

db0 (Owner) commented Sep 8, 2023

I can add this functionality as an option, but keep in mind that because this tool will catch mostly false positives, you would end up with thousands of images to filter through manually.
