Add support for CyberTipLine/PhotoDNA (Image fingerprinting for CSAM detection and reporting) #3416
Comments
This comment was originally posted by @paigeadelethompson at matrix-org/dendrite#3416 (comment). Another such service is run by Google: https://protectingchildren.google/tools-for-partners/
This comment was originally posted by @S7evinK at matrix-org/dendrite#3416 (comment). Is there some API a hash can be uploaded to? I couldn't find anything there.
This comment was originally posted by @paigeadelethompson at matrix-org/dendrite#3416 (comment). I found out a little more about these services, and they're more restrictive than I had imagined. When something is detected, the operator is expected to follow up with the organization that provides the API or through proper law enforcement channels, so using the API at all involves a formal arrangement, though access is becoming more available. Matches are taken very seriously: there's a window of roughly 24 hours, and if you don't respond within it, they contact law enforcement on your behalf. The whole thing is a responsibility that most ordinary operators won't want anything to do with. Still, there's a serious problem with Matrix right now: the moderators of major Matrix Foundation channels can't do anything except delete abhorrent material when it's posted and ban users from the channel. I'll do some more research on the topic, but so far none of the programs I've reached out to have responded with any information about SDKs. These also simply aren't services that most people want the responsibility for or need (some do, though). Whether an operator could provide much information to help law enforcement is another problem, because the protocol is federated: the most you could say is "it came from that homeserver" and presumably block that homeserver from federating with any of your channels, and it's not clear to me whether that's even something you can do at the moment.
This comment was originally posted by @paigeadelethompson at matrix-org/dendrite#3416 (comment).
I have a friend who was able to provide that context. He is also going to check with his contacts to see who should be reached out to, but said that if nothing else we could send an enquiry to [email protected]
This issue was originally created by @paigeadelethompson at matrix-org/dendrite#3416.
Support for this should really be added to homeservers. Cloudflare already offers something similar (its CSAM scanning tool), although Cloudflare alone is unlikely to fix the problem for Matrix. There needs to be something for homeservers to prevent their users from uploading abhorrent material, even if it's only best effort; a minimal sketch of what such a check could look like follows the links below.
https://www.microsoft.com/en-us/photodna
https://blog.cloudflare.com/the-csam-scanning-tool/
https://report.cybertip.org/ws-hashsharing/v2/documentation/#overview
https://report.cybertip.org/ispws/documentation/#curl-example
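To make the "best effort" idea concrete, here is a minimal sketch of a hash check that could run in a homeserver's media upload path. Everything in it is an assumption for illustration: the hash list (`KNOWN_BAD_MD5`), how it is populated, and the use of plain MD5 are placeholders. Real deployments would obtain hash lists through one of the programs linked above, and perceptual matching with PhotoDNA or CSAI Match requires the respective vendor's licensed SDK or API rather than a cryptographic hash.

```python
import hashlib
from typing import Set

# Placeholder hash list. In practice this would be populated from a
# hash-sharing feed the operator has been granted access to (e.g. the
# NCMEC hash-sharing service linked above). Plain MD5/SHA-1 lists only
# catch exact byte-for-byte duplicates; PhotoDNA hashes come from
# Microsoft's licensed SDK and are not cryptographic hashes.
KNOWN_BAD_MD5: Set[str] = set()


def md5_hex(data: bytes) -> str:
    """Return the lowercase hex MD5 digest of an uploaded media blob."""
    return hashlib.md5(data).hexdigest()


def check_upload(data: bytes) -> bool:
    """Best-effort check for a homeserver's media upload path.

    Returns True if the upload matches a known hash, in which case the
    media should be quarantined and the operator's reporting obligations
    (e.g. a CyberTipline report) kick in; returns False otherwise.
    """
    return md5_hex(data) in KNOWN_BAD_MD5


if __name__ == "__main__":
    sample = b"example upload bytes"
    if check_upload(sample):
        print("match: quarantine the media and follow the reporting procedure")
    else:
        print("no match against the local hash list")
```

An exact-hash check like this is deliberately the weakest useful version: it needs no vendor SDK and no outbound API calls, which keeps the legal and operational commitments described in the comments above out of the code path until an operator has actually signed up with one of the programs.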