Relevant bit for those that don’t click through:
Daniel Bernstein at the University of Illinois Chicago says that the US National Institute of Standards and Technology (NIST) is deliberately obscuring the level of involvement the US National Security Agency (NSA) has in developing new encryption standards for “post-quantum cryptography” (PQC). He also believes that NIST has made errors – either accidental or deliberate – in calculations describing the security of the new standards. NIST denies the claims.
“NIST isn’t following procedures designed to stop NSA from weakening PQC,” says Bernstein. “People choosing cryptographic standards should be transparently and verifiably following clear public rules so that we don’t need to worry about their motivations. NIST promised transparency and then claimed it had shown all its work, but that claim simply isn’t true.”
Also, is this the same Daniel Bernstein from the ’95 ruling?
The export of cryptography from the United States was controlled as a munition starting from the Cold War until recategorization in 1996, with further relaxation in the late 1990s.[6] In 1995, Bernstein brought the court case Bernstein v. United States. The ruling in the case declared that software was protected speech under the First Amendment, which contributed to regulatory changes reducing controls on encryption.[7] Bernstein was originally represented by the Electronic Frontier Foundation.[8] He later represented himself.[9]
So highly reputable source with skin in the game thanks for the explanation.
WHAT THE FUCK? This guy’s a stone cold fuckin’ gangster!
At 24 he took the largest surveillance apparatus in history to court… and won! He even raw dogged it — representing himself for a portion of the trial.
He’s my hero!
It is indeed one and the same. This is the post that triggered this article (warning: it’s long and not well organized): https://blog.cr.yp.to/20231003-countcorrectly.html
Credit where credit is due, DJB is usually correct even if he could communicate it better.
Honestly, I think his communication here is fine. He’s probably going to offend some people at NIST, but it seems like he’s already tried the cooperative route and is now willing to burn some bridges to bring things to light.
It reads like he’s playing mathematics and not politics, which is exactly what you want from a cryptography researcher.
Sadly not new. The USA considers encryption to be a weapon of war (thanks Germany), so they do whatever they can to interfere with it. If you are making a new encryption scheme it will be illegal if the government doesn’t have an easy way to break it.
Edit: the guy that made PGP (Phil Zimmermann) got in a stink with the government; if memory serves they tried to bop him with something to do with ITAR.
Removed by mod
I, too, just finished watching Rabbithole.
Removed by mod
it will be illegal if the government doesn’t have an easy way to break it
Aren’t there a lot of existing standards already that can’t be broken easily (by anyone)? That’s why we have all these recent attempts to force backdoors into encrypted apps.
Or is it just extra scrutiny if you’re trying to make a new one?
I’m going to break things down a few levels. Disclaimer: I’m a nerd not a mathematician, so if anyone else can fix my errors that would be great.
Cryptography is a cat and mouse game. Outside of impractical schemes like the one-time pad, there is currently no “perfect solution” where A and B can communicate and C has no way of cracking the communication at some point.
Cryptography is very complex for obvious reasons, but a lot of modern algorithms (RSA, most famously) hinge on how hard it is to factor a huge number back into the prime numbers it was built from. Traditional PCs take an incredibly long time to factor numbers of that size.
Quantum PCs don’t. The way they operate (running Shor’s algorithm) makes them incredibly good at factoring, and that’s why a lot of cryptographic algorithms will be in jeopardy once the technology is more widely implemented.
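To make the factoring point concrete, here’s a toy sketch with absurdly tiny primes (real keys use primes hundreds of digits long, and this is an illustration only, never a real implementation):

```python
# Toy RSA: the public modulus n is the product of two secret primes.
# Whoever can factor n can rebuild the private key.
p, q = 61, 53            # the two secret primes
n = p * q                # public modulus (3233)
phi = (p - 1) * (q - 1)  # needs p and q to compute
e = 17                   # public exponent, coprime with phi
d = pow(e, -1, phi)      # private exponent (modular inverse, Python 3.8+)

msg = 42
cipher = pow(msg, e, n)          # encrypt with the public key
assert pow(cipher, d, n) == msg  # decrypt with the private key

# An attacker who factors n recovers everything. Trial division is
# instant here, but infeasible at real key sizes on classical hardware:
def factor(n):
    f = 2
    while n % f:
        f += 1
    return f, n // f

p2, q2 = factor(n)
d2 = pow(e, -1, (p2 - 1) * (q2 - 1))
assert pow(cipher, d2, n) == msg  # private key rebuilt from the factors
```

Shor’s algorithm on a large enough quantum computer would do the `factor` step efficiently even at real key sizes, which is the whole threat.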
But back to your question. There are already rumors that the NSA uses enormous amounts of traditional computing power to do precomputation against commonly used parameters (the Logjam research showed this is feasible against widely shared Diffie-Hellman primes), making it cheaper for them to crack traditional encryption.
The only thing I can think of is that either the NSA isn’t moving quickly enough to catch up, or they suspect any future quantum-resistant encryption will thwart whatever attempts they’ve made.
This would be in tandem with moves by the UK parliament to get a law going that implements backdoors in devices or apps (I assume that must be pushed by GCHQ?).
Personal opinion: encryption with a backdoor is ridiculous. The government likes to pretend they’re the only ones who can access it, but it only takes one savant 10-year-old interested in penetration testing, or one rogue government employee, for that backdoor to be used for malicious purposes. And it’s not like those people don’t already exist.
So there was an extremely interesting CVE recently about TLS trust issues on Qualcomm modem firmware.
Astute observers have been asking why modem firmware is implementing TLS exchanges in the first place, leading many to speculate that the NSA was using TLS to authenticate their backdoor, and the keys got leaked.
They seem to have calmed that down in recent years, and rely on the dumb public to store all their secrets on readily accessible corporate servers.
The maths war is hard to win (for symmetric ciphers, bigger keys handle most of that), and I honestly doubt most current encryption can be beaten reliably even with quantum computing.
It’s because they don’t care about encryption when they can just side channel the endpoints. You can infer device state from observing EM emissions, and in theory observe keys being loaded into the registers under the right circumstances. This has been demonstrated conceptually many times over the past decade, using a wide variety of devices and methods.
I’ve never understood how the same crowd that spouts “not your keys, not your crypto” would ever trust any password manager they haven’t personally read the source code for, compiled, and self-hosted.
Not your server, not your safe/secure password.
Because the pop-security YouTube crowd goes to great lengths to avoid these conversations, which would reveal the limits of their own knowledge and abilities. A YouTube channel which just says “you are vulnerable to state actors and should focus on protecting yourself from more benign threats” doesn’t generate as much traffic as shilling VPNs.
Didn’t the same thing happen with TrueCrypt?
Cool guy
Bernstein’s website http://safecurves.cr.yp.to/index.html
Interesting article and discussion.
The way Signal is addressing post-quantum encryption is by layering CRYSTALS-Kyber over their current encryption. I initially thought it was overkill, but it’s a great decision.
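The core idea of that layering is that the session key depends on both the classical (elliptic curve) shared secret and the post-quantum (Kyber) shared secret, so an attacker has to break both. A minimal sketch of that combining step (function names and the HKDF simplification are mine, not Signal’s actual protocol):

```python
# Hybrid key derivation sketch: mix two independent shared secrets into
# one session key. Breaking only one layer tells you nothing useful.
import hashlib
import hmac

def hkdf_sha256(secret: bytes, info: bytes, length: int = 32) -> bytes:
    # Simplified single-block HKDF (RFC 5869) with a zero salt.
    prk = hmac.new(b"\x00" * 32, secret, hashlib.sha256).digest()
    return hmac.new(prk, info + b"\x01", hashlib.sha256).digest()[:length]

def hybrid_session_key(ecdh_secret: bytes, kem_secret: bytes) -> bytes:
    # Concatenate the classical and post-quantum secrets before deriving.
    return hkdf_sha256(ecdh_secret + kem_secret, b"hybrid-demo")

# Stand-ins for a real X25519 shared secret and a Kyber KEM shared secret:
key = hybrid_session_key(b"\x11" * 32, b"\x22" * 32)
assert len(key) == 32
# Changing either input changes the output key completely:
assert key != hybrid_session_key(b"\x33" * 32, b"\x22" * 32)
```

The nice property is that the hybrid construction can’t be weaker than the classical layer alone: even if Kyber turned out to be flawed, you still have everything X25519 gives you today.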
My phone has a Kyber crystal?! Awesome!
There is no such thing as overkill while some governments are actively funding quantum computing projects for the sole purpose of code cracking.
Without paywall
How do you remove the paywall from the article? You just copy the URL of the article, give it to archive.today, and that website bypasses the paywall? How do they manage to do that? O.o
They need the content to be available for Google indexing reasons; it can only really be blocked client-side.
A smart enough backend system can access/crawl/index it, just like Google can. And then make it available to the public without the front end annoyance.
I assume the archive doesn’t run the JavaScript portion of the site. You can often bypass paywalls with plug-ins that disable JS as well.
It also runs with different IP addresses, and uses burner accounts for websites that require you to be registered (LinkedIn, for example).
I don’t think anyone will come share this knowledge with us, since it could be used by newspaper websites to block the archiving.
Elliptic curve encryption was hailed as the new gold standard, only too bad there was a serious weakness (in the NIST-standardized Dual_EC_DRBG random number generator) where if you know the seed you can crack the code. And guess who has the seed? Starts with N and ends with SA.
Goddamn NASA and their meddling!
And here I thought it was the National Emergency Services Academy.
Curve25519 should be fine.
Yeah, you can observe this with Let’s Encrypt failing to generate a certificate if you change the elliptic curve from an NSA-generated curve to a generic/known-safe one. Changing between different NSA curves is functionally fine. This forces all signed certificates to use curves that are known to have issues, deliberate or otherwise - i.e. backdoored.
Can you elaborate on this? Which curves does it happen with? Is there some source that you’ve seen?
You can’t use arbitrary curves with certificates, only those which are standardized because the CA will not implement anything which isn’t unambiguously defined in a standard with support by clients.
My point is that there is a documented list of supported curves for ECDSA, but attempting to use any safe curve results in a failure. I am not trying to use some arbitrary curve.
If your point is that no safe curve is permitted because the powers that be don’t permit it, TLS is doomed.
https://eff-certbot.readthedocs.io/en/latest/using.html#using-ecdsa-keys
The default is a curve widely believed to be unsafe, p256, with no functioning safe alternative.
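For anyone wanting to reproduce this, certbot does expose the curve choice via flags; a sketch (domain hypothetical, assuming a recent certbot), which will only succeed for curves the CA actually issues for, P-256 and P-384 in practice:

```shell
# Request an ECDSA certificate and pick the curve explicitly.
# Anything outside the CA/Baseline Requirements list will be rejected.
certbot certonly \
  --key-type ecdsa \
  --elliptic-curve secp384r1 \
  -d example.com   # hypothetical domain
```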
That’s Bernstein’s website if anyone was wondering, showing p256 is unsafe.
I run a cryptography forum, I know this stuff, and the problem isn’t algorithmic weakness but complexity of implementation.
All major browsers and similar networking libraries now have safe implementations after experts have taken great care to handle the edge cases.
It’s not a fault with Let’s Encrypt. If they allowed nonstandard curves then almost nothing would be compatible with them, even the libraries which technically have the code for it, because anything not in the TLS spec is disabled.
https://cabforum.org/baseline-requirements-certificate-contents/
CAB is the consortium of Certificate Authorities (TLS x509 certificate issuers)
With that said, Curve25519 is on its way into the standards.
TL;DR would be that there are no safe ECC curves in TLS yet?
P256 isn’t known to be insecure if implemented right; it’s just harder to implement right.
The WRC deals with unsafe curves all the time. I think picking a couple of spots on some of their curves at high speed would be interesting. Samir has been known to break some of these…
That’s worrying if true. However, I couldn’t find a source. Even if true, Let’s Encrypt is probably the most secure option.
Thanks, I am extremely skeptical and I might just reach out to Let’s Encrypt for clarification.
I know someone in this field and sent him this article. He said the “NIST isn’t being transparent” claim isn’t true.
https://tsapps.nist.gov/publication/get_pdf.cfm?pub_id=927303 https://nvlpubs.nist.gov/nistpubs/ir/2020/NIST.IR.8309.pdf https://tsapps.nist.gov/publication/get_pdf.cfm?pub_id=934458
He also responded with “of course the NSA would try and mess with it, but if it’s peer reviewed properly I don’t see how they would be successful”
We know for a fact that they have done it in the past and managed to hide it until it was too late, what makes you think they can’t do it again?
peer reviewed properly
Is the important bit here. The timeline from that Wikipedia article shows it was published in 2005, and work disproving its claims came around in 2006.
If a scientist’s work is retracted, it really kills any further funding they receive. They use examples like the Dual_EC_DRBG one as what not to be.
Looking at the history of any of the clandestine US orgs should probably remind us that these people will do literally anything they can, like giving people LSD in an attempt to control their minds, or putting microphones in cats to spy on the Russians.
but if it’s peer reviewed properly
Is it?
Did you send him Bernstein’s original blog post?
https://blog.cr.yp.to/20231003-countcorrectly.html
Unless he’s just making all of this up, it does seem pretty damning. I would love to see an in-depth rebuttal.