• Just guessing what the links may have been…

    Possibly my post on lemmy.world, removed for breaking rule 2, “Only tech related news or articles”.

    I’ll copy-paste my comment from there:

    In the reply to Patreon, they mentioned having some automated and manual ways of removing CSAM, plus “closely working with NCMEC”, but I have no idea what that means.
    And these statistics on resolved reports: https://www.missingkids.org/content/dam/missingkids/pdfs/cybertiplinedata2024/2024-notifications-by-ncmec-resulting-content-removal.pdf

    A total of 128 reports, resolved in 1.91 days on average. That’s less than half the time taken by Amazon, Google, and Microsoft (for Bing).

    The other link might have been to this comment:

    • db0@lemmy.dbzer0.com · 3 days ago

      “Having manual ways to remove CSAM” means almost nothing. All of Lemmy has a “manual way to remove CSAM”. “Closely working with NCMEC” can mean they just use the Cloudflare mechanism, which is just hash matching (see the sketch after this thread). The point is, it’s very easy for a malicious actor to upload CSAM and then report the instance to Patreon for it, without ever reporting it to the admins.

      • Redjard@lemmy.dbzer0.com · 3 days ago

        Can’t you always just retry uploads until one bypasses an arbitrary filter, and then report-snipe based on that?
        How would a content-based filter prevent this, when the malicious actor simply needs to upload correspondingly more images?

        I think the sad reality is that the only escape here is scale. Once you have been hit by this attack and been cleared by the third parties involved, you’d have precedent for when it happens again and should hopefully be placed in a special bin for better treatment.
        Scale means you will be fire-tested, and are more likely to receive sane treatment instead of the AI-support special.
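
    For what it’s worth: neither Patreon’s pipeline nor the Cloudflare tool’s internals are public, so here’s a minimal sketch of the weakest thing “hash matching” could mean, an exact-hash lookup against a blocklist. Everything in it (`BLOCKLIST`, `is_blocked`) is made up for illustration; real scanners (PhotoDNA, PDQ) use perceptual hashes that tolerate small edits, but the same cat-and-mouse applies.

    ```python
    import hashlib

    # Hypothetical blocklist of SHA-256 digests of known-bad files.
    # Real scanners use perceptual hashes, which tolerate small edits;
    # an exact hash changes completely if a single byte changes.
    BLOCKLIST = {
        hashlib.sha256(b"known-bad-file-bytes").hexdigest(),
    }

    def is_blocked(upload: bytes) -> bool:
        """Reject an upload iff its exact hash is on the blocklist."""
        return hashlib.sha256(upload).hexdigest() in BLOCKLIST

    print(is_blocked(b"known-bad-file-bytes"))  # True:  exact copy is caught
    print(is_blocked(b"known-bad-file-byteZ"))  # False: one changed byte slips through
    ```

    That last line is the whole attack Redjard describes: whatever the matcher, an attacker just keeps perturbing the file until a variant passes, then report-snipes on it.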