$100,000 Prize: Prove Quantum Computers Impossible 324
mikejuk writes "Quantum computing is currently a major area of research — but is this all a waste of effort? Now Scott Aaronson, a well-known MIT computer scientist, has offered a prize of $100,000 for any proof that quantum computers are impossible: 'I'm now offering a US$100,000 award for a demonstration, convincing to me, that scalable quantum computing is impossible in the physical world.' Notice the two important conditions — 'physical world' and 'scalable.' The proof doesn't have to rule out tiny 'toy' quantum computers, only those that could do any useful work."
D-Wave sold a commercial Quantum computer in 2010 (Score:5, Interesting)
Err, uh,
Didn't D-Wave sell a commercial Quantum computer to Lockheed Martin [hpcwire.com] in 2010? Almost a year to the day?
Someone explain to me the difference between this quantum computer and the one they're trying to prove doesn't exist, please.
The ultimate Schroedinger's Cat problem! (Score:3, Interesting)
Now there's a challenge!
Prove that something which already exists CAN'T exist!
Methinks their money might be safe on this one... :P :P :P
Re:Proving something negative is impossible (Score:4, Interesting)
Also, it is definitely possible to prove a negative. I can prove that there are no lions in my refrigerator, no elephants hiding behind my couch, and no dead zombie typing this comment, to most people's satisfaction, for starters.
The lions in your refrigerator are microscopic. The elephants hiding behind your couch are invisible, and you actually are a dead zombie. You just don't realize it, because of a psychological hallucination that you are not actually dead.
Re:D-Wave sold a commercial Quantum computer in 20 (Score:4, Interesting)
The physics of oscillating crystals, such as those used in microphones and phonograph needles as well as radio transmitters, indicates that quantum computing could never not exist. Matched oscillating crystals have been in use for thousands of years, and the mathematical model is proven by hundreds of different laboratory and home appliances, e.g. an infrared spectrophotometric detector. The emission and absorption frequencies are predicted by the mathematical model of the particle [wikipedia.org] in a box, which is the basis for calculating electron dispersion around the nucleus and the fundamental starting point for subatomic calculations.
The particle-in-a-box model translates into an equation known as the Hamiltonian and, in combination with eigenvalues calculated from the variables used in particle-in-a-box modeling, generates the Schroedinger equation. Quantum computing could never be nonexistent, because the mathematics of matched oscillating subatomic particles has already been proven millions of times over.
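For the record, the standard particle-in-a-box result the parent is gesturing at (textbook quantum mechanics, not anything specific to this thread): the Hamiltonian's eigenvalue problem is the time-independent Schroedinger equation, and for an infinite square well the eigenvalues come out discrete.

```latex
% Time-independent Schroedinger equation: the Hamiltonian's eigenvalue problem
\hat{H}\,\psi_n(x) = E_n\,\psi_n(x), \qquad
\hat{H} = -\frac{\hbar^2}{2m}\,\frac{d^2}{dx^2}

% Infinite square well of width L: discrete energies and eigenfunctions
E_n = \frac{n^2 \pi^2 \hbar^2}{2 m L^2}, \qquad
\psi_n(x) = \sqrt{\tfrac{2}{L}}\,\sin\!\left(\frac{n\pi x}{L}\right), \qquad n = 1, 2, 3, \ldots
```

Those discrete levels are what fix the emission and absorption frequencies mentioned above, via E_n - E_m = hbar*omega.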
The marathon runner was not reporting a successful war campaign. The marathon runner was part of a system proving that those crystals do indeed oscillate, matched, from across the universe (at least 26.2 miles), in real time. Begin counting and begin running; when you arrive, repeat what they said back to them and report your current number. They will determine if your number matches theirs and if you repeat the exact words they said.
One aspect of the inside joke is that, when the marathon runner arrived and made his report, the response from the priests was, "That's _NOT_ what we said!" and they promptly hit him over the head with a baseball bat in frustration over the not completely failed experiment. "Don't tell anyone that he made it."
Re:D-Wave sold a commercial Quantum computer in 20 (Score:4, Interesting)
People were already working on solid-state transistors in 1946. The main difficulty was growing pure enough crystals.
Even without solid-state transistors, computers would have continued to get more powerful and require less maintenance per tube as vacuum tubes improved (nothing like what was possible with solid-state transistors, of course). Remember, vacuum tubes themselves were only about 35 years old at that time--lots of improvement in size, power and reliability was possible, but work on them stopped when it became clear that transistors were so much better.
In the case of quantum computers, there are lots of ideas floating around, but no one actually has any clear idea of what will be needed to maintain quantum coherence across a large number of bits. In fact, it is not yet clear that it is possible.
The D-Wave computer uses quantum annealing which does not require coherence across a large number of bits, but which is also a LOT less useful than one that does.
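To make the annealing idea concrete, here is a minimal *classical* simulated-annealing sketch of the kind of Ising-energy minimization a D-Wave-style annealer targets (quantum annealing is the quantum analogue of this; the chain size, step count, and cooling schedule below are made-up illustrative values, not anything D-Wave actually uses):

```python
import math
import random

def ising_energy(spins, J=1.0):
    # Ferromagnetic 1-D Ising chain: E = -J * sum(s_i * s_{i+1})
    return -J * sum(s * t for s, t in zip(spins, spins[1:]))

def simulated_anneal(n=16, steps=2000, t0=2.0, seed=0):
    """Return (initial energy, best energy found) for a random spin chain."""
    rng = random.Random(seed)
    spins = [rng.choice([-1, 1]) for _ in range(n)]
    e_init = ising_energy(spins)
    best = list(spins)
    for k in range(steps):
        temp = t0 * (1 - k / steps) + 1e-9  # linear cooling schedule
        i = rng.randrange(n)
        e_old = ising_energy(spins)
        spins[i] = -spins[i]                # propose a single spin flip
        e_new = ising_energy(spins)
        # Metropolis rule: always keep downhill moves, sometimes uphill ones
        if e_new > e_old and rng.random() >= math.exp((e_old - e_new) / temp):
            spins[i] = -spins[i]            # reject: flip back
        if ising_energy(spins) < ising_energy(best):
            best = list(spins)
    return e_init, ising_energy(best)

e_init, e_best = simulated_anneal()
print(e_init, e_best)  # best can never be worse than the starting energy
```

The ground state of this chain is all spins aligned, with energy -(n-1); annealing of either flavor is a heuristic search toward that minimum, which is exactly the kind of optimization problem quantum annealers are pitched at.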
Re:Sorry, what? (Score:4, Interesting)
I'm not sure what you mean by this. Quantum behavior disappears at macroscopic sizes simply because all lengths involved are microscopic. Take a hallmark of quantum mechanics as a simple example: the Heisenberg uncertainty principle. It has been shown that the standard deviation of the position times that of the momentum MUST equal or exceed Planck's reduced constant divided by two. Considering the latter is on the order of 10^(-34), it's no surprise that macroscopic measurements are not affected by this limit at all, but nanoscopic ones most definitely are. In the same way, quantum tunneling is also an effect which could theoretically happen at macroscopic sizes, but with a probability so low it's effectively impossible. There's no hard limit, it's just a spectrum which rapidly becomes negligible as size increases.
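Plugging in numbers makes that scale separation concrete (a quick back-of-the-envelope check using the CODATA value of hbar; the macroscopic and atomic figures are round illustrative numbers, not measurements):

```python
HBAR = 1.054_571_817e-34  # reduced Planck constant, J*s (CODATA)

bound = HBAR / 2  # Heisenberg: sigma_x * sigma_p >= hbar / 2

# A 1 kg ball located to 1 micrometre, with velocity known to 1 um/s:
macro_product = 1e-6 * (1.0 * 1e-6)        # sigma_x * (m * sigma_v)
# An electron (~9.1e-31 kg) confined to 0.1 nm with velocity spread 1e6 m/s:
micro_product = 1e-10 * (9.1e-31 * 1e6)

print(macro_product / bound)  # ~1e22: the bound is utterly irrelevant
print(micro_product / bound)  # ~1.7: right at the quantum limit
```

Twenty-two orders of magnitude of headroom for the ball, versus a factor of order one for the electron: same inequality, wildly different relevance.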
As I said, the biggest problem is an engineering one: how do you scale up the number of qubits to an appreciable amount while keeping errors below an acceptable threshold? How do you operate on said qubits without measuring them so as to preserve the wavefunction? Some cases have answers, but this is still overall an open question, unlike classical computing, where the first question was answered by transistors and the second doesn't arise, since reading a classical bit doesn't disturb it.