[Usage feedback] The model's recognition accuracy is disappointing #15
Comments
You can set a breakpoint or add a print here to inspect it:
The nsfw-model's decision rule is: take the class with the highest probability among the five classes; if it is one of ["hentai", "porn", "sexy"], the image is flagged as positive. You can add your own conditions on top of that, but it is harder to get right. Also, quoting only counterexamples doesn't mean much; you need to measure the overall error rate, and I would consider 3-5% to be good. As for the safety-checker, it is not easy to adjust on that side.
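For reference, a minimal sketch of that rule in Python, assuming the usual five-class output (drawings / hentai / neutral / porn / sexy); the `probs` dict and the margin-based variant are illustrative additions, not this repository's actual API:

```python
# Sketch of the decision rule described above (assumed class names).
NSFW_LABELS = {"hentai", "porn", "sexy"}

def is_nsfw(probs: dict[str, float]) -> bool:
    """Positive if the most probable of the five classes is an NSFW label."""
    top_label = max(probs, key=probs.get)
    return top_label in NSFW_LABELS

def is_nsfw_with_margin(probs: dict[str, float], margin: float = 0.2) -> bool:
    """Hypothetical stricter variant: also require the combined NSFW mass to
    beat the safe classes by a margin, trading recall for fewer false positives."""
    nsfw_score = sum(v for k, v in probs.items() if k in NSFW_LABELS)
    safe_score = sum(v for k, v in probs.items() if k not in NSFW_LABELS)
    return is_nsfw(probs) and (nsfw_score - safe_score) >= margin

# Usage with a made-up probability vector:
probs = {"drawings": 0.05, "hentai": 0.10, "neutral": 0.55, "porn": 0.20, "sexy": 0.10}
print(is_nsfw(probs))              # False: the top class is "neutral"
print(is_nsfw_with_margin(probs))  # False
```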
Version 0.13 has been released; it adds the nsfw_image_detection model.
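If that refers to the Falconsai/nsfw_image_detection checkpoint on Hugging Face (an assumption; the release note does not say), it can be tried standalone with the transformers image-classification pipeline:

```python
# Standalone trial of an nsfw_image_detection-style classifier.
# The model id and the sample image path are assumptions, not taken from this repo.
from PIL import Image
from transformers import pipeline

classifier = pipeline("image-classification", model="Falconsai/nsfw_image_detection")

image = Image.open("sample.jpg")  # hypothetical test image
for result in classifier(image):
    # This checkpoint typically returns two labels, "normal" and "nsfw", each with a score.
    print(result["label"], round(result["score"], 4))
```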
This is the default model.