- cross-posted to:
- hackernews@derp.foo
I’d still buy chicken there.
This is the only suitable comment.
Lol
In case you’re like me and read the summary but didn’t click the article because you had no idea what KYC was: it means “know your customer”. Used by banks and such for verification purposes.
It’s really pretty stupid to use acronyms people aren’t familiar with in your title. I usually just downvote and move on if the title isn’t coherent enough.
I imagine that means they’ll be forced to keep more branches open if eyeballing you in person is the only option left.
National ID might help, but ultimately KYC is intrusive coming from MOST vendors; very few of them should even need it.
Good.
KYC is an invasion of privacy.
I mean, that’s the entire point, yes. Some financial transactions, at some level of scale, should not be private.
For instance, if you abolish KYC, you’ve just fully legalized all insider trading. Perhaps you can see that there are some conflicts of interest there. On the crypto side, KYC allows the IRS to go after traders for capital gains tax. Without it, crypto would be an easy way for the ultra-wealthy to just completely bypass taxes, since you couldn’t prove that it belonged to them.
It also prevents money laundering by drug cartels, terrorist cells, and rogue states/agencies. Obviously those organizations still find a way to move money around, but it isn’t cheap or easy for them.
This is the best summary I could come up with:
Viral posts on X (formerly Twitter) and Reddit show how, leveraging open source and off-the-shelf software, an attacker could download a selfie of a person, edit it with generative AI tools and use the manipulated ID image to pass a KYC test.
Tutorials online show how Stable Diffusion, a free, open source image generator, can be used to create synthetic renderings of a person against any desired backdrop (e.g., a living room).
Now, yielding the best results with Stable Diffusion requires installing additional tools and extensions and procuring around a dozen images of the target.
A Reddit user going by the username harsh, who’s published a workflow for creating deepfake ID selfies, told TechCrunch that it takes around one to two days to make a convincing image.
Typically, these liveness checks involve having a user take a short video of themselves turning their head, blinking their eyes or demonstrating in some other way that they’re indeed a real person (a rough sketch of what such a check can look like follows this summary).
Early last year, Jimmy Su, the chief security officer for cryptocurrency exchange Binance, told Cointelegraph that deepfake tools today are sufficient to pass liveness checks, even those that require users to perform actions like head turns in real time.
The original article contains 589 words, the summary contains 196 words. Saved 67%. I’m a bot and I’m open source!
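For anyone curious what the “liveness” step in the summary actually looks like in code, here is a minimal, illustrative sketch of a blink-counting check. It is not any vendor’s real system: the library choice (OpenCV plus MediaPipe FaceMesh), the landmark indices, the eye-aspect-ratio threshold, and the `selfie_check.mp4` filename are all assumptions made for the example.

```python
# Minimal sketch (not any vendor's actual system) of the blink-style liveness
# check described in the summary: count eye blinks in a short selfie video.
import cv2
import mediapipe as mp

# Rough left-eye landmark indices in MediaPipe's 468-point face mesh.
EYE_TOP, EYE_BOTTOM, EYE_OUTER, EYE_INNER = 159, 145, 33, 133
EAR_THRESHOLD = 0.20  # "eye looks closed" cutoff; would need tuning on real footage


def eye_aspect_ratio(landmarks) -> float:
    """Vertical eye opening divided by horizontal eye width (normalized coords)."""
    vertical = abs(landmarks[EYE_TOP].y - landmarks[EYE_BOTTOM].y)
    horizontal = abs(landmarks[EYE_OUTER].x - landmarks[EYE_INNER].x)
    return vertical / horizontal if horizontal else 0.0


def count_blinks(video_path: str) -> int:
    """Count full close-then-open blink cycles in a recorded selfie clip."""
    face_mesh = mp.solutions.face_mesh.FaceMesh(refine_landmarks=True)
    cap = cv2.VideoCapture(video_path)
    blinks, eye_closed = 0, False
    while True:
        ok, frame = cap.read()
        if not ok:  # end of clip
            break
        result = face_mesh.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if not result.multi_face_landmarks:
            continue  # no face detected in this frame
        ear = eye_aspect_ratio(result.multi_face_landmarks[0].landmark)
        if ear < EAR_THRESHOLD:
            eye_closed = True
        elif eye_closed:  # eye re-opened after being closed: one blink
            eye_closed = False
            blinks += 1
    cap.release()
    face_mesh.close()
    return blinks


if __name__ == "__main__":
    n = count_blinks("selfie_check.mp4")  # hypothetical clip recorded by the user
    print(f"blinks: {n} -> {'pass' if n >= 2 else 'fail'} (naive liveness heuristic)")
```

The article’s point is precisely that a convincing deepfake video played into the camera can satisfy this kind of naive heuristic, which is why even real-time head-turn challenges are reportedly no longer a reliable signal.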
If this becomes a high risk, then this form of verification will be dropped. If it’s assessed as relatively low risk, then online/webcam verification will just have its risk-scoring penalty adjusted. KYC will still exist. Thank you for coming to my TED talk.
Misleading clickbait. The article says that even video is not enough for an online identity check, because advances in CGI make it ever easier to produce realistic fakes. Many countries have issued ID cards with NFC chips for exactly this purpose for years. It’s a non-problem.
KYC is not about online identity checks.