DEF CON 33 - Post Quantum Panic: When Will the Cracking Begin, & Can We Detect it? - K Karagiannis

Due to recently published algorithmic improvements (1,399 qubits at a 2048-bit key length for Shor’s algorithm) and leaps being made in quantum computing hardware (IBM Starling at 200 logical qubits in 2029, and IBM Blue Jay at 2,000 logical qubits from 2033 onward), encryption is in danger from state-sponsored and high-end criminal attacks as soon as 2030. Particularly susceptible are cryptocurrencies like Bitcoin, which rely on the Elliptic Curve Discrete Logarithm Problem (ECDLP) and are attackable by the discrete-logarithm variant of Shor’s algorithm on a predictably feasible quantum computer.

    • turdas@suppo.fi · 10 hours ago

      There was a paper recently about a stable 6,100-qubit system, so the trajectory is plausible. If 1,399 qubits are needed for 2048-bit Shor’s, this would already exceed that by a wide margin, though obviously this is a research system that AFAIK cannot do actual computations.

      https://www.livescience.com/technology/computing/quantum-record-smashed-as-scientists-build-mammoth-6-000-qubit-system-and-it-works-at-room-temperature

      • unexposedhazard@discuss.tchncs.de · 6 hours ago

        As you said, this research isn’t functional. The TL;DR of all of the following is that, to my understanding, the record-holding quantum computer currently has 4 (four) logical qubits.

        ##################################################

        From the original source (Caltech):

        Looking ahead, the researchers plan to link the qubits in their array together in a state of entanglement, where particles become correlated and behave as one. Entanglement is a necessary step for quantum computers to move beyond simply storing information in superposition; entanglement will allow them to begin carrying out full quantum computations.

        So yeah, they aren’t at the step where they can actually do anything with the qubits they created. 6,100 physical qubits also don’t equal 6,100 logical qubits. https://en.wikipedia.org/wiki/Physical_and_logical_qubits

        Since the development of the first quantum computer in 1998, most technologies used to implement qubits face issues of stability, decoherence,[6][7] fault tolerance[8][9] and scalability.[6][9][10] Because of this, many physical qubits are needed for the purposes of error-correction to produce an entity which behaves logically as a single qubit would in a quantum circuit or algorithm; this is the subject of quantum error correction.

        I’m a total non-expert on quantum things, but from the looks of it, the most efficient systems (at Microsoft) still need many physical qubits to create a single logical qubit.

        The team used quantum error correction techniques developed by Microsoft and Quantinuum’s trapped ion hardware to use 30 physical qubits to form four logical qubits.

        It’s also impossible to read up on this stuff, because lots of research on “quantum computers” actually just simulates the logical qubits algorithmically on standard, non-quantum hardware. So if you search just for “largest logical qubit system” you get lots of garbage, and searching for physical qubits gives you research like this 6,100 number, which can’t be converted into a realistic number of logical qubits because the overhead needed for error correction varies drastically between techniques.

        What you really wanna know is the largest set of functional logical qubits that actually relies on physical qubits, and the answer to that seems to be 4. What’s needed to break RSA-2048 is probably multiple thousands of those stable, error-free logical qubits (rough arithmetic sketched below).

        https://en.wikipedia.org/wiki/Quantinuum#H-Series

        The company also holds the record for two-qubit gate fidelity, becoming the first to reach 99.9%. Microsoft and Quantinuum created four logical qubits on the H2 quantum computer, running 14,000 experiments without a single error.
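
        A rough back-of-the-envelope on those numbers (my own sketch, not from the talk): it assumes the abstract’s 1,399 figure means logical qubits and that the 30-to-4 overhead from the quoted Quantinuum/Microsoft result would hold at scale, which is a big if.

            # Back-of-the-envelope only: scale the quoted 30-physical-to-4-logical
            # overhead up to the qubit count cited in the talk abstract.
            physical_per_logical = 30 / 4     # ~7.5, from the H2 experiment quoted above
            logical_for_shor_2048 = 1399      # figure cited in the talk abstract (assumed to be logical qubits)
            print(physical_per_logical * logical_for_shor_2048)  # 10492.5 -> roughly 10,500 physical qubits

        Even under that optimistic overhead, the 6,100 physical qubits above would not be enough on their own.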

        • turdas@suppo.fi · 5 hours ago

          Even if it’s 8 physical qubits to 1 logical qubit, 6,100 qubits would get you 762 logical qubits.

          All I’m saying is that the technology seems to be on a trajectory where the number of qubits improves by an order of magnitude every few years, so it’s plausible that in another 5-10 years it could have the necessary thousands of logical qubits to start doing useful computations (spelled out in the sketch below). A mere 5 years ago, the number of physical qubits in a quantum computer was still measured in the tens rather than the hundreds, and 10 years ago I’m pretty sure they hadn’t even broken ten.
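
          Spelling that trajectory out (purely this comment’s assumptions: an 8:1 physical-to-logical overhead and an order-of-magnitude jump roughly every four years; neither is an established projection):

              # Hypothetical extrapolation of the argument above; both the 8:1
              # overhead and the growth rate are assumptions, not measured trends.
              physical = 6100        # the Caltech array discussed above (not yet computing)
              overhead = 8           # assumed physical qubits per logical qubit
              for years_from_now in (0, 4, 8):
                  print(f"+{years_from_now} yrs: ~{physical:,} physical -> ~{physical // overhead:,} logical")
                  physical *= 10     # assumed order-of-magnitude improvement every ~4 years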

          • unexposedhazard@discuss.tchncs.de · 4 hours ago

            It’s really not on that trajectory though. Huge inflated numbers of nonfunctional physical qubits are just a way to get funding. It’s like AI bros boasting about how much data their LLM model sucked in. The number of usable qubits hasn’t changed at all, basically. They are still in the stage of figuring out how it even works. Compared to traditional computers, they are at the stage of trying to invent the transistor. Yes, in 20-30 years it will maybe be useful, but only if they don’t hit physical limitations that prevent scaling. And then the question is FOR WHAT? Dead people can’t make use of quantum computers, and dead people is what we will be if we don’t figure out solutions to some much more imminent, catastrophic problems in the next 10 years.

            • turdas@suppo.fi · 4 hours ago

              I mean, the number of logical qubits has gone from basically zero not too long ago to what it is now. The whole error correction thing has really only taken off in the past ~5 years. That Microsoft computer you mentioned, which got 4 logical qubits out of 30 physical qubits (7.5 physical per logical), represents a roughly 3-fold efficiency increase over the apparently previous best of 12 logical qubits from 288 physical ones (24 per logical, published earlier the same year), which undoubtedly was itself a big improvement over whatever they had before.

              And then the question is FOR WHAT? Dead people can’t make use of quantum computers, and dead people is what we will be if we don’t figure out solutions to some much more imminent, catastrophic problems in the next 10 years.

              Strange thing to say. There’s enough people on the planet to work on more than one problem at a time. Useful quantum computing will probably help solve many problems in the future too.

  • nicolauz@feddit.org · 13 hours ago

    The fact that he doesn’t talk about the current state and the real-world process of applying and trying out the algorithms and their improvements over the last 10 years makes me strongly believe this is more propaganda than reality.

    It’s all projection, and projections of projections.

    I’m not going to argue against the accelerated introduction of post-quantum algorithms… but this talk smells.

    • eleijeep@piefed.social · 4 hours ago

      This is a brilliant presentation. I heard about his paper that demonstrated integer factorisation with an abacus, a VIC-20 and a dog, but I hadn’t seen this before.

      Aside: The Quantum Supremacy Drinking Game – Open a new bottle of wine every time quantum supremacy is announced – Requires a well-stocked wine cellar

      🤣

  • vacuumflower@lemmy.sdf.org · 9 hours ago

    … Which is why we should all immediately switch to post-quantum encryption that is possibly much weaker against conventional cryptanalysis. Thank you, NIST, NSA and other such respectable official bodies. Of course I believe you.

    In general, the whole “everyone should use standard state-of-the-art cryptography” idea turned out to be a con. And somehow, the more “standard state-of-the-art” things were broken, the greater the confidence became that they are what should be used. In the 90s, “standard state-of-the-art” things were being broken casually, and non-standardized ciphers were made and used far more often than now, and somehow that was fine.

    I dunno, we’re all using AES, even with hardware implementations of it, potentially backdoored, and with approved, recommended S-boxes, without an explanation of how these were chosen (“by the criteria of peace on earth and goodwill toward men” is not an explanation; a mathematical paper consisting of steps you can repeat and unambiguously get the same set would be).

    I think if you are afraid of your cryptography rotting, embracing some pluralism outside of cryptography is what you should do. Like maybe partitioning the compressed data by bits (not splitting it into meaningful portions, God forbid) and encrypting the partitions with different algorithms (one AES, one Kuznyechik, one something elliptic, one something Chinese).
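
    As a concrete illustration of the bit-partitioning idea (my own sketch, not something from the talk): interleave the compressed bytes into two partitions and encrypt each with an unrelated cipher. Kuznyechik isn’t readily available in common Python libraries, so ChaCha20-Poly1305 stands in for “a second, independent algorithm” here, and the cryptography package is assumed.

      # Sketch only: interleave bytes into two partitions and encrypt each with a
      # different AEAD, so neither ciphertext alone covers contiguous plaintext.
      import os
      from cryptography.hazmat.primitives.ciphers.aead import AESGCM, ChaCha20Poly1305

      def split_interleaved(data: bytes, parts: int) -> list[bytes]:
          # Deal bytes out round-robin so no partition is a meaningful contiguous chunk.
          return [data[i::parts] for i in range(parts)]

      def encrypt_pluralist(compressed: bytes) -> dict:
          k_aes = AESGCM.generate_key(bit_length=256)
          k_cha = ChaCha20Poly1305.generate_key()
          n_aes, n_cha = os.urandom(12), os.urandom(12)
          half_a, half_b = split_interleaved(compressed, 2)
          return {
              "aes":    (n_aes, AESGCM(k_aes).encrypt(n_aes, half_a, None)),
              "chacha": (n_cha, ChaCha20Poly1305(k_cha).encrypt(n_cha, half_b, None)),
              # the two keys would be stored or wrapped separately
          }

    Decryption just reverses the two AEAD calls and re-interleaves the halves.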

    • Seefra 1@lemmy.zip · 4 hours ago

      … Which is why we should all immediately switch to post-quantum encryption that is possibly much weaker against conventional cryptanalysis.

      There’s no need to switch; you can just layer it, and that should be done ASAP.
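
      A minimal sketch of what “layering” usually means here (hybrid key establishment, my own example): derive the session key from both a classical exchange and a post-quantum KEM secret, so an attacker has to break both layers. X25519 and HKDF come from the cryptography package; the ML-KEM step is only a placeholder because post-quantum bindings vary by library and version.

        # Hybrid ("layered") key derivation sketch; the PQ secret is a placeholder.
        from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
        from cryptography.hazmat.primitives.kdf.hkdf import HKDF
        from cryptography.hazmat.primitives import hashes

        # Classical ECDH -- the part a future quantum computer could break.
        alice, bob = X25519PrivateKey.generate(), X25519PrivateKey.generate()
        classical_secret = alice.exchange(bob.public_key())

        # Placeholder for a post-quantum KEM shared secret (e.g. from an ML-KEM
        # encapsulation); hard-coded only to keep the sketch self-contained.
        pq_secret = b"\x00" * 32

        # The session key depends on BOTH secrets: breaking only the classical
        # layer (or only the PQ layer) reveals nothing useful.
        session_key = HKDF(
            algorithm=hashes.SHA256(), length=32, salt=None, info=b"hybrid-demo",
        ).derive(classical_secret + pq_secret)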

  • just_another_person@lemmy.world · edited · 15 hours ago

    This is a great talk, but it’s ignoring the real issue, in that it would need to be “in-line”, which is nowhere near possible. They sort of address that, but they’re mostly talking about the ciphers themselves.

    I think we’ve reached the cusp where we can exchange new derivative keys on the fly per request without making too much of a dent in speed (sketched at the end of this comment), but that comes with all kinds of tradeoffs on session length and convenience, I suppose.

    Edit: I guess there is another eventuality where governments just go and farm public keys and use them against targeted traffic. Not a good way to beat that right now.
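
    For the “new derivative keys on the fly per request” part, the usual shape is an HKDF ratchet: each request advances a chain key and yields a fresh per-request key, which is cheap compared to a full handshake. This is an illustrative sketch only (names and the 32-byte sizes are my assumptions), and it assumes the cryptography package.

      # Illustrative ratchet: derive a fresh per-request key and advance the chain.
      import os
      from cryptography.hazmat.primitives.kdf.hkdf import HKDF
      from cryptography.hazmat.primitives import hashes

      def next_keys(chain_key: bytes) -> tuple[bytes, bytes]:
          # One HKDF call yields 64 bytes: the first half becomes the new chain key,
          # the second half is the key actually used for this request.
          okm = HKDF(algorithm=hashes.SHA256(), length=64, salt=None,
                     info=b"per-request ratchet").derive(chain_key)
          return okm[:32], okm[32:]

      chain = os.urandom(32)           # would come from the initial key exchange
      for request in range(3):         # three requests, three independent keys
          chain, request_key = next_keys(chain)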

    • JPAKx4@piefed.blahaj.zone · 6 hours ago

      There are now quantum-resistant algorithms, with the hope being that even advanced quantum computers wouldn’t be able to crack them in a time that’s much different from what regular computers would need. I think I read that it’s already part of a WireGuard release?

  • turdas@suppo.fi · 10 hours ago

    We can only hope that Bitcoin gets pwned by quantum computers. It would be absolutely glorious.

    • Mubelotix@jlai.lu · 10 hours ago

      Bitcoin is safer than your bank. There are easy ways to protect your Bitcoin wallet from quantum threats, and they have been considered good practice since before 2015.