For facial recognition experts and privacy advocates, the East Bay detective’s request, while dystopian, was also entirely predictable. It emphasizes the ways that, without oversight, law enforcement is able to mix and match technologies in unintended ways, using untested algorithms to single out suspects based on unknowable criteria.
I can see the abuse, but what if this actually worked in a best case scenario? Say DNA is found from a rape, that DNA is used to create an image of the person, and then they find that person and do DNA tests to confirm the match. The image isn’t used as evidence, just to find the person. Honestly it seems like a good use, if it’s limited to that.
I don’t know a lot about DNA, but I know about facial recognition.
Facial recognition is highly inaccurate. It would be easy for two people from the same country to “match” under facial recognition despite being totally unrelated.
If “face generation from DNA” is only roughly accurate (e.g., nose size or skin tone), then anybody of the same ethnic origin could be a match. Basically, the more you look like the “average person,” the more likely you are to fit the generated face.
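To put rough numbers on that, here’s a back-of-the-envelope sketch. Both figures are made-up assumptions purely for illustration (a city-sized comparison pool and a 1% chance that an unrelated person looks “close enough” to a coarse generated face):

```python
# Hypothetical illustration of the base-rate problem with a coarse "DNA face".
# The population size and false-positive rate below are assumptions, not data.
population = 1_000_000
false_positive_rate = 0.01  # assume 1% of unrelated people resemble the generated face

false_matches = population * false_positive_rate
print(f"Innocent people who 'match': {false_matches:,.0f}")  # -> 10,000

# Even if the real perpetrator is somewhere in the pool, the chance that any
# one flagged person is actually them is tiny:
print(f"Chance a given match is the perpetrator: {1 / (false_matches + 1):.4%}")  # ~0.01%
```

The coarser the predicted features (which is all DNA could plausibly give you), the higher that false-positive rate climbs, and the worse those odds get.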
Doesn’t it sound a lot like technology-enabled profiling?
I think this is a bad idea, especially the way it’s being developed, but let me play devil’s advocate for a second. What if it were only used to narrow a search radius, the same way cell pings are? Cell pings are already used to direct resources. Being near a crime obviously doesn’t mean you committed the crime, but it does narrow down where to look, and once you start looking you can find real evidence more efficiently. You could pair this with other techniques to narrow down the search, and then find real, hard, corroborating evidence. Also, since they need DNA in the first place, they’d need a DNA match from the suspect, preventing random people from getting charged.
Now, to stop playing devil’s advocate: there are just so many ways this can be abused, and the police are the worst organization to lead it. They are not technology experts, they’re not even legal experts, and they’ve been shown over and over again to be easily biased. So even if they need corroborating evidence, that doesn’t mean they won’t be biased by the face match and then “find” evidence, or even plant it. Plus, even just being accused can be hugely disruptive, and traumatizing when they target a non-match. Imagine you’re innocently going about your day and you suddenly get snatched up for questioning and forced to give a DNA sample.
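To make that last point concrete, here’s a small sketch of the “narrow with a face match, then confirm with DNA” funnel from the devil’s-advocate scenario above, with an arbitrary made-up candidate count. The DNA gate may prevent wrongful charges, but it only clears innocent people after they’ve already been pulled in and swabbed:

```python
# Hypothetical funnel: the face match flags a list of candidates, and DNA
# confirmation then clears everyone except (at best) the real perpetrator.
# The candidate count is an assumption chosen only for illustration.
candidates_flagged = 200
perpetrator_in_list = True  # the best case the devil's advocate assumes

innocents_cleared = candidates_flagged - (1 if perpetrator_in_list else 0)
print(f"Innocent people questioned and swabbed before being cleared: {innocents_cleared}")
# -> 199
```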
If anything like this were to be used in any way, you would need so many safeguards, and it’s obvious the police don’t care about setting any of those up. You’d need a double-blind approach to evidence gathering, extreme oversight, a very restrictive legal framework, and of course close guarding and anonymization of any personal data, and probably more things I’m not thinking of. The police are irresponsible to treat this like a whatever thing, rather than something incredibly sensitive and invasive that needs tons of safeguards to avoid being a massive violation of privacy and a dangerous biasing vector that could easily catch up innocent people.
Dude, facial recognition catches the wrong people all the time. It is not as infallible as they make it out to be, and this is just adding an entire extra level of mistakes they can make.
Facial recognition tech is bogus and, because of its technical limitations, unintentionally(?) racist (i.e., the cameras aren’t designed to capture good photo/video of dark skin, leading to high false-positive rates for dark-skinned people). Edit: even further, the cameras often have too low a resolution for quality matching.
Further, facial reconstruction based on DNA isn’t exactly super accurate on its own.
Please don’t fall for this bullshit.
I can’t follow your reasoning. What if they picked a person at random and it was actually the perpetrator, so it actually worked in a best case scenario?
The idea that DNA is extremely predictive of phenotype is already kind of… ehh.
There may be some broad features you can mostly predict, but something as specific as recognizing a person? No way in hell. Far too many environmental factors shape appearance.
Don’t assume it wouldn’t be abused, since the police have a shit track record. If anything like this were to be used, then strict laws restricting how it can be used need to come first, since the police are dumb and can’t be trusted to invent new applications of technology. They’re the last group that should be leading this.
No. It just does not work that way. The article specifically mentions that there’s no proof whatsoever that the company can actually generate a face from DNA. It’s like looking at a textbook on automotive design and predicting exactly what a specific car built 20 years from now will look like. General features? Sure - four wheels, a windshield, etc. Anything more specific? Nope, not at all. And this is before we get into environmental factors - think of scratches or aftermarket spoilers on a car. Humans are similarly influenced by their environment, even down to the level of what we eat or the pollutants in the air we breathe.
What the cops did is as close to bullshit fantasy as makes no difference. Asking a fortune teller to draw you a picture would be only slightly less accurate. This is so insanely problematic those cops ought to be up on charges.
You’re assuming it is useful enough to find the person, and not just A person.
Fruit of the poisonous tree.