If we consider information to be safe if we encrypt it (e.g., text in a file, encrypted with modern strong encryption), would it be safer (as in harder to crack) if we then encrypted the encrypted file, and encrypted the encrypted^2 file, etc.? Is this what strong encryption already does behind the scenes?

  • slazer2au@lemmy.world · 12 points · 3 months ago

    I would say: what is the point? If you encrypt something with AES-256 it already takes longer than the lifetime of the universe to brute force, but if a flaw in the algorithm is discovered or computing power exceeds current projections (say, with quantum computing), double or triple encryption won’t help.
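
    A rough back-of-the-envelope sketch of that brute-force claim (the trial rate is a made-up, very generous assumption):

    ```python
    # How long would it take to exhaust AES-256's key space, assuming a
    # hypothetical, absurdly generous attacker testing 10**18 keys per second?
    SECONDS_PER_YEAR = 60 * 60 * 24 * 365
    trials_per_second = 10**18        # assumption for illustration only
    keyspace = 2**256                 # number of possible AES-256 keys

    years = keyspace / trials_per_second / SECONDS_PER_YEAR
    print(f"{years:.2e} years")       # ~3.7e+51 years; the universe is ~1.4e10 years old
    ```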

    We tried this in the ’90s with VPNs, using a variant of DES called 3DES, and we have since created better algorithms.

    • unexposedhazard@discuss.tchncs.de · 9 points · 3 months ago (edited)

      I only partially agree. There are currently many algorithms, with some expected to be more quantum-resistant than others. So if you aren’t sure which one is actually the best, you could use all the good candidates on top of each other and increase your chances of having used at least one that is actually post-quantum safe.

      But yes, with current-year tech there is no point really. In the end you will just fall victim to the wrench>kneecap attack anyway if your secrets are big enough.
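
      A minimal sketch of that cascade idea, assuming the third-party `cryptography` package: two independent keys, two different AEAD ciphers, so both would have to fail for the plaintext to leak.

      ```python
      import os
      from cryptography.hazmat.primitives.ciphers.aead import AESGCM, ChaCha20Poly1305

      def cascade_encrypt(plaintext: bytes):
          # Layer 1: AES-256-GCM with its own random key and nonce
          k1, n1 = AESGCM.generate_key(bit_length=256), os.urandom(12)
          inner = AESGCM(k1).encrypt(n1, plaintext, None)
          # Layer 2: ChaCha20-Poly1305 over the AES ciphertext, independent key/nonce
          k2, n2 = ChaCha20Poly1305.generate_key(), os.urandom(12)
          outer = ChaCha20Poly1305(k2).encrypt(n2, inner, None)
          return outer, (k1, n1), (k2, n2)

      def cascade_decrypt(outer: bytes, aes_keypair, chacha_keypair) -> bytes:
          k1, n1 = aes_keypair
          k2, n2 = chacha_keypair
          inner = ChaCha20Poly1305(k2).decrypt(n2, outer, None)
          return AESGCM(k1).decrypt(n1, inner, None)

      ct, aes_kn, chacha_kn = cascade_encrypt(b"layered secret")
      assert cascade_decrypt(ct, aes_kn, chacha_kn) == b"layered secret"
      ```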

  • solrize@lemmy.world · 12 points · 3 months ago

    As people have said, the keys have to be completely independent of each other or else the layering can make the encryption weaker. And, if you’re worried about one of your layers being weak, you shouldn’t be using that layer in the first place.

    I think SSL/TLS actually gained something from this, though. The initial key agreement phase generated (from my foggy memory) a “premaster secret”, then hashed it with both SHA-1 and MD5 and combined the two hashes in some way. Those were the two hash algorithms popular in that era. Later on, weaknesses (practical collision attacks) were found in MD5 and, even later, in SHA-1. By combining both algorithms, SSL avoided any hint of compromise from those particular hash problems. SSL’s designer Paul Kocher later said he was very glad he had specified using both.
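
    Not the exact TLS construction, but a rough sketch of that belt-and-suspenders idea: run both digests over the same data and combine them, so an attacker would need to defeat both at once.

    ```python
    import hashlib

    def combined_digest(data: bytes) -> bytes:
        # Concatenate MD5 and SHA-1 digests of the same input; the combination
        # only falls apart if both hash functions fail in a compatible way.
        return hashlib.md5(data).digest() + hashlib.sha1(data).digest()

    print(combined_digest(b"premaster secret").hex())  # 16 + 20 = 36 bytes
    ```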

    I would say, though, that secure hashing (with a public algorithm and no secrets) has generally been considered a more difficult problem than secret-key encryption or authentication. And SHA-1 and MD5 both used design approaches now considered dubious.

  • otp@sh.itjust.works · 4 points · 3 months ago

    Mathematically yes, but in the real world, it’s probably not much safer.

    Imagine that you’ve got an unpickable lock on the front door to your house. You’re asking about putting another door with another unpickable lock in front of it.

    Great, sure, but we’re talking about a world where most people get in not through picking the lock, but by someone leaving the backdoor open, or hiding keys under the doormat.

    Disclaimer: I’m not making any claims about household break-and-enters. It’s just an analogy.

  • asmoranomar@lemmy.world · 3 points · 3 months ago

    In some instances of private/public key systems, this is done. It’s mainly for the purpose of ensuring the recipient knows who the sender was and also ensuring the sender knows who the recipient is.

    Quick primer: If you encrypt with your private key, everyone knows it was sent by you. If someone encrypts with your public key, they know you will receive it. Use your private key and someone’s public key together and you know only that person got it.

    In practice, these days another step is added to negotiate a third, temporary session key. This ensures keys aren’t used forever, and if one is compromised a new one can be generated. This is more secure than simply encrypting twice, because an attacker who cracks one session key only gets that session and has to start from scratch for the rest.
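
    A condensed sketch of that session-key step, assuming the `cryptography` package (X25519 + HKDF standing in for whatever exchange a real protocol uses):

    ```python
    import os
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
    from cryptography.hazmat.primitives.kdf.hkdf import HKDF
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    # Each side generates an ephemeral key pair, so every conversation gets fresh keys.
    alice = X25519PrivateKey.generate()
    bob = X25519PrivateKey.generate()

    # Own private key + other party's public key -> the same shared secret on both sides.
    shared = alice.exchange(bob.public_key())
    assert shared == bob.exchange(alice.public_key())

    # Derive the temporary session key from the shared secret.
    session_key = HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                       info=b"demo session").derive(shared)

    nonce = os.urandom(12)
    ciphertext = AESGCM(session_key).encrypt(nonce, b"hello", None)
    ```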

  • marcos@lemmy.world · 2 points · 3 months ago

    If you use different keys, yes. That’s, for example, how post-quantum algorithms are recommended to be used: you encrypt with one of them and with a classical algorithm.

    If you reuse the same key, the analysis gets very complicated and it’s probably not safe at all.
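
    A sketch of how such hybrid schemes typically combine the two results (the secrets here are placeholders; in practice one would come from, say, X25519 and the other from a post-quantum KEM such as ML-KEM):

    ```python
    import os
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.kdf.hkdf import HKDF

    classical_secret = os.urandom(32)   # placeholder for the classical shared secret
    pq_secret = os.urandom(32)          # placeholder for the post-quantum shared secret

    # Feed both secrets through one KDF; recovering the final key requires
    # breaking BOTH exchanges, not just either one.
    hybrid_key = HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                      info=b"hybrid demo").derive(classical_secret + pq_secret)
    ```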

  • Fonzie!@ttrpg.network · 2 points · 3 months ago

    Encryption? No, others have good explanations on this.

    Password hashing? Interestingly, yes! Passwords hashed with bcrypt are iterated many times over: its cost factor of 10, 12 or more means 2^10 or 2^12 rounds. With Argon2 the time-cost parameter likewise repeats the work, usually a few passes or more!
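
    A small sketch with the `bcrypt` package, where the cost parameter is an exponent rather than a plain repeat count:

    ```python
    import bcrypt  # pip install bcrypt

    password = b"correct horse battery staple"

    # rounds=12 means 2**12 = 4096 iterations of bcrypt's expensive key setup;
    # bumping it by one doubles the work for defender and attacker alike.
    hashed = bcrypt.hashpw(password, bcrypt.gensalt(rounds=12))
    assert bcrypt.checkpw(password, hashed)
    ```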

  • Kidplayer_666@lemm.ee · 1 point · 3 months ago

    This is sort of what Tor does, if I’m not mistaken, to keep you from being tracked even if a node is compromised: each relay only peels off one layer of encryption.
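
    A toy sketch of that onion layering with the `cryptography` package’s Fernet (real Tor negotiates a separate key with each relay per circuit; this only shows the wrap-and-peel idea):

    ```python
    from cryptography.fernet import Fernet

    # One key per hypothetical relay in the circuit.
    relay_keys = [Fernet.generate_key() for _ in range(3)]

    # The client wraps the message once per relay, starting with the last hop's
    # key so that it ends up as the innermost layer.
    onion = b"hello hidden service"
    for key in reversed(relay_keys):
        onion = Fernet(key).encrypt(onion)

    # Each relay peels exactly one layer; only the last hop sees the plaintext.
    for key in relay_keys:
        onion = Fernet(key).decrypt(onion)
    assert onion == b"hello hidden service"
    ```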

  • xi00@lemmy.world · 1 point · 3 months ago

    This scheme would only really improve security if you used multiple different keys, and ideally two different algorithms. Repeating the same operation only adds effort linearly for the defender, while encryption relies on the attacker needing exponentially (not in the strict mathematical sense) more effort to crack it than the encrypting party put in. So if the attacker can crack one layer, he can crack an identical layer again with no further effort.

    Furthermore, most of the time the problem with encryption is not the actual cipher, but rather key storage and distribution. KeePass, for example, uses only a single encryption layer (AES or ChaCha20) for the database, and instead offers a very robust portfolio for key derivation (basically making a big key from a small password, or translating entropy into something usable for the cipher, while keeping it deterministic).
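
    A standard-library sketch of that key-derivation idea (KeePass itself uses AES-KDF or Argon2, but PBKDF2 shows the same principle of deterministically stretching a small password into a cipher-sized key):

    ```python
    import hashlib
    import os

    password = b"hunter2"          # small, low-entropy secret
    salt = os.urandom(16)          # stored alongside the database

    # Many iterations make each guess expensive; the result is a deterministic
    # 32-byte key suitable for AES-256, given the same password and salt.
    key = hashlib.pbkdf2_hmac("sha256", password, salt, 600_000)
    assert len(key) == 32
    ```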

    But that is essentially what two-factor authentication does. And you can also use this with fully symmetric encryption to an extent (look up how OTP works with KeePass for an example).
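
    For the curious, a minimal HOTP (RFC 4226) sketch, the counter-based building block such OTP schemes are built on:

    ```python
    import hashlib
    import hmac
    import struct

    def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
        # HMAC-SHA1 over the big-endian counter, then RFC 4226 dynamic truncation.
        mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
        offset = mac[-1] & 0x0F
        code = int.from_bytes(mac[offset:offset + 4], "big") & 0x7FFFFFFF
        return str(code % 10**digits).zfill(digits)

    print(hotp(b"12345678901234567890", 0))  # RFC 4226 test vector: 755224
    ```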

    There is also some pretty good literature from the early days of US military OPSEC, where they lay out very well the incremental steps toward doing this better and better.

    Hope that helps, but I am not qualified to be cited on this information :)