These are the predictions from 2002-2003:
QUANTUM COMPUTATION ROADMAP 2007 AND 2012 HIGH-LEVEL GOALS
Although QC is a basic-science endeavor today, it is realistic to predict that within a decade fault-tolerant QC could be achieved on a small scale. The overall objective of the roadmap can be accomplished by facilitating the development of QC to reach a point from which scalability
into the fault-tolerant regime can be reliably inferred. It is essential to appreciate that “scalability” has two aspects: the ability to create registers of sufficiently many physical qubits to support logical encoding
and the ability to perform qubit operations within the fault-tolerant precision thresholds. The desired 2007 and 2012 high-level goals of the roadmap for QC are therefore: by the year 2007, to encode a single qubit into the state of a logical qubit formed from several physical qubits, perform repetitive error correction of the logical qubit, and transfer the state of the logical qubit into the state of another set of physical qubits with high fidelity; and by the year 2012, to implement a concatenated quantum error-correcting code.
Meeting these goals will require both experimental and theoretical advances. While remaining within the basic-science regime, the 2007 high-level goal requires the achievement of four ingredients that are necessary for fault-tolerant scalability:
- creating deterministic, on-demand quantum entanglement;
- encoding quantum information into a logical qubit;
- extending the lifetime of quantum information;
- communicating quantum information coherently from one part of a quantum computer to another.
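The "encoding" and "error correction" ingredients above have a simple classical analogue: the 3-bit repetition code. The sketch below (an illustration only; the quantum bit-flip code adds the crucial complication that the data must not be measured directly) shows the redundancy principle the roadmap's logical qubit relies on.

```python
# Classical analogue of a logical qubit: encode one logical bit into
# three physical bits, then correct a single bit-flip by majority vote.

def encode(logical_bit):
    """Encode one logical bit into three redundant physical bits."""
    return [logical_bit] * 3

def correct(physical_bits):
    """Recover the logical bit by majority vote, fixing one flip."""
    return 1 if sum(physical_bits) >= 2 else 0

codeword = encode(1)      # [1, 1, 1]
codeword[0] ^= 1          # a single bit-flip error: [0, 1, 1]
print(correct(codeword))  # majority vote recovers 1
```

Concatenating such codes (encoding each physical bit of one code block with another code block) is what the 2012 goal asks for in the quantum setting.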
This is a challenging 2007 goal, requiring on the order of ten physical qubits and multiple logic operations between them, yet it is within reach of some present-day QC approaches and of new approaches that may emerge from synergistic interactions between present ones.
The 2012 high-level goal, which requires on the order of 50 physical qubits, exercises multiple logical qubits through the full range of operations required for fault-tolerant QC in order to perform a simple instance of a relevant quantum algorithm, and approaches a natural experimental QC benchmark: the limits of full-scale simulation of a quantum computer by a conventional computer.
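The ~50-qubit benchmark is just arithmetic: a full state vector for n qubits holds 2^n complex amplitudes, so memory doubles with every added qubit. Assuming 16 bytes per amplitude (two 64-bit floats), the numbers run away quickly:

```python
# Memory needed for brute-force state-vector simulation of n qubits,
# assuming 16 bytes per complex amplitude (two 64-bit floats).

def state_vector_bytes(n_qubits, bytes_per_amplitude=16):
    return (2 ** n_qubits) * bytes_per_amplitude

print(state_vector_bytes(30) // 2**30, "GiB")  # 30 qubits: 16 GiB
print(state_vector_bytes(40) // 2**40, "TiB")  # 40 qubits: 16 TiB
print(state_vector_bytes(50) // 2**50, "PiB")  # 50 qubits: 16 PiB
```

At 50 qubits the state vector alone needs about 16 pebibytes, which is why that scale marks the edge of full classical simulation.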
The 2012 goal would be within reach of approaches that attain the 2007 goal. It would extend QC into the quantum computer test-bed regime, in which architectural and algorithmic issues could be explored experimentally.
Here is the reality:
Google engineers had suggested that performance would improve as D-Wave continued to double the number of qubits on its processors. A qubit, or quantum bit, is the basic unit of information for quantum computing. Unlike a regular binary bit, a qubit can hold two states at once, an effect called superposition that could be a key element of powerful quantum computers.
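Numerically, "two states at once" just means a single qubit is described by a pair of complex amplitudes (a, b) with |a|² + |b|² = 1, and measurement yields 0 with probability |a|² and 1 with probability |b|². A minimal sketch of the equal superposition:

```python
# A single qubit in the equal superposition (|0> + |1>) / sqrt(2):
# measuring it gives 0 or 1 with probability 0.5 each.
import math

a, b = 1 / math.sqrt(2), 1 / math.sqrt(2)  # amplitudes of |0> and |1>
p0, p1 = abs(a) ** 2, abs(b) ** 2          # Born-rule probabilities
print(round(p0, 3), round(p1, 3))          # 0.5 0.5
```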
Currently Google uses the 512-qubit D-Wave processor, but D-Wave plans to introduce a 1,024-qubit model in 2015.
Graph the reality of quantum computing progress versus the predictions made for it, and you’ll see a striking depiction of the Tech Singularity itself.