
Quantum BS!

  • Writer: RG
  • Aug 20

Updated: Aug 27

Recently-calibrated reference dog
“Consciousness consists of morphogenetic fields of quantum energy. ‘Quantum’ means a redefining of the infinite. Inspiration requires exploration.”

It’s hard to provide a better illustration of “quantum bullshit” than the quote above, from the wonderful “New Age Bullshit Generator”. It uses the word “quantum” in a way that is intended to seem profound, while not actually saying anything at all.


I mentioned this site previously, but New Age rubbish is only one of the ways in which “quantum” can be confusing and misleading. In particular, I recently commented on the impact that quantum computing is expected to have on current encryption schemes. To summarize: once quantum computers are large enough and stable enough for practical use at a sufficient scale, they will be able to use an algorithm known as “Shor’s Algorithm” to break current public-key cryptography (which relies on the difficulty of factorizing large integers) in time that grows roughly polynomially, rather than exponentially, with the size of the integer keys. This is the scenario also known as the “quantum apocalypse”.
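
To make the factoring connection a little more concrete, here is a minimal Python sketch (mine, not taken from any of the papers involved) of the classical half of Shor’s Algorithm: once you know the period r of a^x mod N, the factors of N fall out of a couple of gcd computations. The find_period function below simply brute-forces the period, which is precisely the step a quantum computer is supposed to do efficiently, so this only works for toy numbers.

    from math import gcd
    from random import randrange

    def find_period(a, n):
        # Brute-force period finding: the step Shor's Algorithm hands to the
        # quantum computer. Returns the smallest r > 0 with a^r = 1 (mod n).
        r, x = 1, a % n
        while x != 1:
            x = (x * a) % n
            r += 1
        return r

    def factor_with_period(n):
        # Classical post-processing: turn a period into a non-trivial factor.
        while True:
            a = randrange(2, n)
            d = gcd(a, n)
            if d > 1:
                return d                 # lucky guess: a already shares a factor
            r = find_period(a, n)
            if r % 2 == 1:
                continue                 # need an even period
            y = pow(a, r // 2, n)
            if y == n - 1:
                continue                 # a^(r/2) == -1 (mod n) is a dead end
            return gcd(y - 1, n)

    n = 3233                             # 61 * 53, a toy RSA-style modulus
    p = factor_with_period(n)
    print(f"{n} = {p} * {n // p}")

All of the difficulty, of course, lives in find_period: that brute-force loop is where the quantum speed-up is supposed to come from, and it is exactly the part that has to be demonstrated at a meaningful scale.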


On Episode #1034 of Security Now, Steve Gibson describes a paper co-authored by Peter Gutmann, of the Department of Computer Science at the University of Auckland, New Zealand. The paper is called “Replication of Quantum Factorisation Records with an 8-bit Home Computer, an Abacus, and a Dog”, and is essentially an attack on a number of recent papers describing “breakthroughs” in the use of quantum computers for factorization of integers.


So far as I understand it, the paper (also mentioned in an article by The Register) is not discussing the validity of Shor’s Algorithm, or the viability of a sufficiently large quantum computer using it to factorize large integers. Instead, Gutmann and his co-author (Dr. Stephan Neuhaus, of the Zurich University of Applied Sciences) focus on claims about quantum computers being used to factorize integers, and describe the “quantum computers” as simply “physics experiments” – thus the comparison with a VIC-20, an abacus, and a dog.


Their contention is that the numbers selected are “sleight-of-hand” numbers which are trivially easy to factorize. As a way to correct this, they propose an initial set of criteria for selection of numbers for future factorization attempts. To summarize:

  1. Factors are of non-trivial size (64 or 128 bits)

  2. The prime factors differ significantly in value, and each contains a roughly 50:50 mix of 0 and 1 bits, randomly distributed

  3. No preprocessing of the value to be factorized

  4. Factors are unknown to the experimenters

  5. Factorization is performed on ten different values


It should be noted that these criteria are intended to reduce the possibility of selecting numbers which can be trivially factorized, and to provide a baseline for demonstrating “true” factorization, rather than to represent an actual “threat” to current cryptographic models. The authors also acknowledge that these criteria may evolve over time, as more research in this area is done.
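
As a rough illustration of what criteria 1 and 2 might look like in practice, here is a short Python sketch that builds a test modulus from two random, “balanced” 64-bit primes. To be clear, the 45-55% bit-balance window and the separation threshold below are placeholder values of my own, not numbers taken from the paper.

    import random

    def is_probable_prime(n, rounds=40):
        # Standard Miller-Rabin probabilistic primality test.
        if n < 2:
            return False
        for p in (2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37):
            if n % p == 0:
                return n == p
        d, s = n - 1, 0
        while d % 2 == 0:
            d, s = d // 2, s + 1
        for _ in range(rounds):
            a = random.randrange(2, n - 1)
            x = pow(a, d, n)
            if x in (1, n - 1):
                continue
            for _ in range(s - 1):
                x = pow(x, 2, n)
                if x == n - 1:
                    break
            else:
                return False
        return True

    def balanced_prime(bits=64):
        # Criterion 2: a random prime of full size whose bits are a roughly
        # 50:50 mix of zeros and ones (the 45-55% window is an assumption).
        while True:
            p = random.getrandbits(bits) | (1 << (bits - 1)) | 1
            if 0.45 * bits <= bin(p).count("1") <= 0.55 * bits and is_probable_prime(p):
                return p

    def make_test_modulus(bits=64):
        # Criteria 1 and 2: two non-trivially sized primes, far apart in value.
        # The separation threshold below is a placeholder, not from the paper.
        while True:
            p, q = balanced_prime(bits), balanced_prime(bits)
            if p != q and abs(p - q) > (1 << (bits - 2)):
                return p * q, min(p, q), max(p, q)

    n, p, q = make_test_modulus(64)
    print(f"n = {n}\np = {p}\nq = {q}")

A real test procedure would also need to enforce criteria 3 to 5, which are about process rather than number construction: no preprocessing of the value, factors kept secret from the experimenters, and ten different values to be factorized.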


But where does that leave us?


Well, it appears that the theory is sound, but the practical application is not yet there.


The practical challenges around quantum computing include the engineering of qubits, the “noise” caused by insufficient isolation of those qubits, and the overhead of error correction. I would add cost to that list, along with the impact of hype on the level of investment.
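
As a very loose classical analogy for why error correction is such a burden: the simplest way to protect a single bit against noise is to store several copies and take a majority vote, which multiplies the hardware cost. Quantum error-correcting codes are far more subtle than this (qubits cannot simply be copied), but the little Python sketch below, with an arbitrary 10% noise rate and five-fold redundancy, gives a feel for the overhead involved.

    import random

    def encode(bit, copies=5):
        # Repetition code: protect one logical bit with several physical bits.
        return [bit] * copies

    def apply_noise(bits, flip_prob=0.10):
        # Flip each physical bit independently with probability flip_prob.
        return [b ^ (random.random() < flip_prob) for b in bits]

    def decode(bits):
        # Majority vote recovers the logical bit if fewer than half flipped.
        return int(sum(bits) > len(bits) / 2)

    trials = 100_000
    errors = sum(decode(apply_noise(encode(1))) != 1 for _ in range(trials))
    print(f"logical error rate: {errors / trials:.3%} (vs. 10% per physical bit)")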


Currently, many organizations and governments are pouring money into quantum computing research, but progress seems to be relatively slow, which may be why there are so many announcements of new “breakthroughs” that turn out not to have been breakthroughs at all.


There is a risk that, if there are too many “breakthroughs” which don’t actually lead to practical application, the hype may die down – along with the investment, which would obviously have a significant impact on the rate of future development.


This raises the question, though.


Are people going to start thinking about quantum computing the way they do about cold fusion?


While research into fusion power has been ongoing since the 1940s, so far no one has developed a device which produces more power than it requires to operate. To summarize, fusion is the process by which two or more atomic nuclei combine to form larger nuclei, accompanied by the release or absorption of energy. It is the process that powers active stars, and in practice it requires a plasma confined at sufficient temperature and pressure for a sufficient length of time.


We know it can be done, and we know how to do it. The challenging part is doing it in such a way that we can efficiently produce usable energy, as opposed to a bomb. So far, however, generating the required temperature and pressure consumes more energy than the reaction produces. If we could create a stable fusion reaction that generates more energy than it costs, we could use it for power generation.


The “holy grail” of fusion power was long seen as a hypothesized “cold fusion”, which would occur at or near room temperature, as opposed to the immensely high temperatures and/or pressures currently required.


And then, in 1989, Stanley Pons and Martin Fleischmann claimed to have succeeded. On 23-Mar-1989, they held a news conference to announce their work, which generated an enormous amount of excitement and hype.


But then other scientists tried to replicate their work, and it did not go well. Multiple scientists, at a variety of institutions, either failed to reproduce the results at all, or produced results which did not hold up under scrutiny. While Pons and Fleischmann continued their work, most of their peers lost confidence in them, declared cold fusion “dead”, and moved on. Notably, it has now been over thirty years, and there has still been no credible progress in the development of cold fusion (though there has been plenty of pseudo-science and more than a few fraudulent claims...)


Practical fusion power generation has been “five years out” for the past fifty or so years. Will it happen? Probably. Will it replace all other forms of power generation? Probably not – it’s “too late” and we’ve moved on, though it may be valuable in “niche” applications, such as space travel.


Which brings us back to “quantum computing”. Are we close to a breakthrough which will give us viable quantum computers next year? Or will there be steady progress over the next few decades? Or will quantum computing never really become practical?


Ultimately, only time will tell, but until then, we have that VIC-20, the abacus, and the dog.


His name is Scribble.


Cheers!

