I’m finding it harder and harder to tell whether an image has been generated or not (the main giveaways are disappearing). This is probably going to become a big problem within half a year or so. Does anyone know of any proof-of-legitimacy projects that are gaining traction? I can imagine news orgs being the first to be hit by this problem. Are they working on anything?

  • knightly the Sneptaur@pawb.social · 5 days ago

    The problem is Goodhart’s Law: “Every measure which becomes a target becomes a bad measure.” Implementing a verification system that depends on video evidence creates both an incentive to forge such videos and a corpus of labeled training data that grows the more the system is used. A generative adversarial network is literally designed to evolve better forgeries in response to a similarly evolving detector, so there’s no purely computational way around keeping people involved to make sure things are what they’re claimed to be.
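    The arms-race dynamic above can be sketched with a toy model (not a real GAN, just the incentive structure): a detector thresholds on some artifact score, and the forger uses the detector’s feedback to pull its fakes toward the real distribution each round. The function name, means, and learning rate are all illustrative assumptions.

    ```python
    def coevolve(rounds=6, real_mean=0.2, fake_mean=0.8, learn_rate=0.5):
        """Toy detector-vs-forger loop (hypothetical parameters).

        Each round the detector places its threshold midway between
        the real and fake score means; the forger then uses that
        feedback to move its mean toward the real one. The separating
        margin shrinks geometrically, so detection gets ever harder.
        """
        history = []
        for _ in range(rounds):
            threshold = (real_mean + fake_mean) / 2  # detector re-tunes
            margin = fake_mean - real_mean           # how separable fakes still are
            history.append((round(threshold, 4), round(margin, 4)))
            # Forger learns from the detector's labels: close part of the gap.
            fake_mean -= learn_rate * (fake_mean - real_mean)
        return history

    # The margin halves every round: 0.6, 0.3, 0.15, ...
    print(coevolve())
    ```

    The point of the sketch is that every detector release hands the forger a better training signal, which is exactly the Goodhart dynamic: the measure degrades as soon as it becomes the target.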

    Universal Basic Income would be a good start, but the fundamental problem is money as the primary organizing force of society.