A trial program conducted by Pornhub in collaboration with UK-based child protection organizations aimed to deter users from searching for child sexual abuse material (CSAM) on its website. Whenever CSAM-related terms were searched, a warning message and a chatbot appeared, directing users to support services. The trial reported a significant reduction in CSAM searches and an increase in users seeking help. Despite some limitations in the data and the complexity of measuring deterrence, the chatbot showed promise in discouraging illegal behavior online. While the trial has ended, the chatbot and warnings remain active on Pornhub’s UK site, with hopes that similar measures across other platforms will create a safer internet environment.

  • @Wirlocke
    44 months ago

    I think one of the main issues is the matter-of-fact usage of the term Minor Attracted Person. It’s a controversial term that frames pedophilia as an identity, like saying Person of Color.

    I understand wanting a less judgmental term for those who did no wrong and are seeking help. But it should be phrased like anything else of that nature: as a disorder.

    If I were coining a term that fit that description, I’d probably say Minor Attraction Disorder, heavily implying that the person is not okay as-is and needs professional help.

    In a more general sense, it resembles the apologetic arguments that the darker corners of Reddit would make. And that’s probably because Google is officially using Reddit as training data.