In 2023, 457,895 complaints of violence against women were recorded, the majority of them (339,782 cases) gender-based. Online Gender-Based Violence (GBV) is a growing part of this: LBH APIK Jakarta alone received 440 reports in the previous year, 49 of which involved non-consensual intimate imagery (NCII) created with deepfake AI technology, fake nude content bearing the victim's face, used for revenge porn, defamation, and sexual violence.

WooDefender is an app-based campaign to raise women's preparedness and awareness against deepfake revenge porn in the coming years. It lets users digitally watermark their photos and detect images that have been edited or generated by AI. A built-in community encourages users to report accounts that create or spread fake content. WooDefender also provides legal education and guidance so victims know what legal steps to take, and victims can access a step-by-step rehabilitation program within the app.
How To Install
Go to the release page, download the v1.0.0 APK, and install it on your Android device. If you want to build the APK yourself, you must first add the Firebase configuration file.
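For a manual build, a typical flow looks like the following. This is a sketch assuming a standard single-module Firebase Android setup; the module name (`app/`) and build variant may differ in this repository.

```shell
# Place the Firebase config in the app module before building.
# google-services.json is downloaded from the Firebase console for your project.
cp ~/Downloads/google-services.json app/

# Build a debug APK with the Gradle wrapper, then install it on a connected device.
./gradlew assembleDebug
adb install app/build/outputs/apk/debug/app-debug.apk
```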
Team Members:
Hustler: Fajar Ramadhan
Hipster: Muhammad Rezka Al Maghribi
Hipster: Huda Rasyad Wicaksono
Hacker: Dhi'fan Razaqa
App Screens:
Onboarding, Login, and Register
Community (Report, post, detail post, and add post)
Main Screen, Classification
Embed Watermark
Extract Watermark
History
Profile
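The Embed Watermark and Extract Watermark screens suggest an invisible-watermark pipeline: write a hidden tag into a photo, then read it back later to verify the image. As a rough illustration only (WooDefender's actual algorithm is not documented here; `embed_watermark`, `extract_watermark`, and the `"woo:v1"` tag are invented for this sketch), least-significant-bit embedding can be written as:

```python
# Minimal sketch of invisible watermarking via least-significant-bit (LSB)
# embedding. This is NOT WooDefender's documented scheme, only an
# illustration of the embed/extract idea on a numpy image array.
import numpy as np

def embed_watermark(pixels: np.ndarray, message: str) -> np.ndarray:
    """Hide `message` in the LSBs of a uint8 image array; returns a copy."""
    bits = np.unpackbits(np.frombuffer(message.encode(), dtype=np.uint8))
    flat = pixels.flatten()  # flatten() copies, so the original is untouched
    if bits.size > flat.size:
        raise ValueError("image too small for message")
    flat[: bits.size] = (flat[: bits.size] & 0xFE) | bits  # overwrite LSBs
    return flat.reshape(pixels.shape)

def extract_watermark(pixels: np.ndarray, length: int) -> str:
    """Read back `length` bytes hidden by embed_watermark."""
    bits = pixels.flatten()[: length * 8] & 1
    return np.packbits(bits).tobytes().decode()

# Usage: embed a short tag into a random "photo", then recover it.
img = np.random.randint(0, 256, size=(64, 64, 3), dtype=np.uint8)
marked = embed_watermark(img, "woo:v1")
assert extract_watermark(marked, len("woo:v1")) == "woo:v1"
```

Because only the least-significant bit of each channel changes, the watermarked image is visually identical to the original; a production scheme would additionally need robustness to re-compression and cropping.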