Today, Thorn, a prominent child safety organization, in partnership with Hive, a leading cloud-based AI solutions provider, announced the release of an AI model designed to flag unknown CSAM at upload. It is the first AI technology aimed at detecting unreported CSAM at scale.

  • Railcar8095@lemm.ee · 4 days ago

    Applying a GAN won’t work. If used for filtering, it would skew the results toward younger-looking subjects, but it won’t show the body of a 9-year-old unless the model could already do that to begin with (see the toy sketch below).

    If used to “tune” the original model, it will result in massive hallucinations and aberrations that can produce false positives.

    In both cases, decent results will be rare and time-consuming to obtain. Anybody with the dedication to attempt this already has pictures and can build their own model.
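
    A toy sketch of that first point (everything here is illustrative: the distribution and threshold are made-up assumptions, not from any real system). Rejection-filtering a generator’s output shifts the mean of what survives, but it cannot extend the range of what the generator can produce:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def generator(n):
        # Toy stand-in for a trained generative model: it only
        # produces values near 30 (its learned distribution).
        return rng.normal(loc=30.0, scale=3.0, size=n)

    def passes_filter(x):
        # Toy stand-in for a classifier used as a rejection
        # filter: keeps only the smaller values.
        return x < 27.0

    samples = generator(100_000)
    filtered = samples[passes_filter(samples)]

    print(f"unfiltered: mean={samples.mean():.1f}  min={samples.min():.1f}")
    print(f"filtered:   mean={filtered.mean():.1f}  min={filtered.min():.1f}")
    # The filtered mean drops (the skew described above), but the
    # minimum is unchanged: rejection filtering only reweights what
    # the generator already produces; it cannot create samples
    # outside the generator's support.
    ```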

    Source: I’m a data scientist