For facial recognition experts and privacy advocates, the East Bay detective’s request, while dystopian, was also entirely predictable. It underscores how, without oversight, law enforcement can mix and match technologies in unintended ways, using untested algorithms to single out suspects based on unknowable criteria.

  • fidodo@lemmy.world · 3 points · 10 months ago

    I think this is a bad idea, especially the way it’s being developed, but let me play devil’s advocate for a second. What if it were only used to narrow a search radius, the same way cell pings are? Cell pings are already used to direct resources. Being near a crime obviously doesn’t mean you committed it, but it does narrow down where to look, and once you start looking you can find real evidence more efficiently. You could pair this with other techniques to narrow the search further, then find hard corroborating evidence. Also, since they need DNA in the first place, they’d need a DNA match from the suspect, which prevents random people from being charged.

    Now to stop playing devil’s advocate: there are just so many ways this can be abused, and the police are the worst organization to lead it. They are not technology experts, they’re not even legal experts, and they’ve been shown over and over again to be easily biased. So even if they need corroborating evidence, that doesn’t mean they won’t be biased by the face match and then “find” evidence, or even plant it. Plus, even just being accused can be hugely disruptive, and traumatizing when they target a non-match. Imagine you’re innocently going about your day and you suddenly get snatched up for questioning and forced to give a DNA sample.

    If anything like this were to be used in any way, you would need so many safeguards, and it’s obvious the police don’t care about setting any of those up. You’d need a double-blind approach to evidence gathering, extreme oversight, a very restrictive legal framework, and of course close guarding and anonymization of any personal data, and probably more things I’m not thinking about. It’s irresponsible of the police to treat this like a routine tool rather than something incredibly sensitive and invasive, something that needs tons of safeguards to keep it from becoming a massive violation of privacy and a dangerous biasing vector that could easily catch up innocent people.