Quantum vs Classical Computing
Track: Foundations · Difficulty: Beginner · Est: 13 min
Overview
This page compares quantum and classical computing along three practical dimensions: how information is represented, how computation is modeled, and how outputs are produced. The goal is not to treat one model as “better,” but to understand what changes when the underlying representation changes.
Intuition
Classical computing is built around stable, copyable information: bits can be read without disturbance, duplicated freely, and processed deterministically.
Quantum computing uses a representation that is more delicate. A quantum state can carry phase information and interference patterns, but extracting that information requires measurement, which produces randomness and typically disturbs the state. This trade-off—richer internal structure versus restricted readout—drives much of quantum computing.
Formal Description
Representation. A classical $n$-bit register stores one of $2^n$ bitstrings. A quantum $n$-qubit register stores a normalized vector

$$|\psi\rangle = \sum_{x \in \{0,1\}^n} \alpha_x \, |x\rangle,$$

where the $\alpha_x$ are complex amplitudes and $\sum_x |\alpha_x|^2 = 1$.
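The contrast between the two representations can be sketched in a few lines of NumPy. The particular amplitudes below are illustrative, not taken from the text; the only requirement is the normalization condition above.

```python
import numpy as np

n = 2  # number of qubits (illustrative choice)

# Classical register: exactly one bitstring out of 2**n possibilities.
classical = "01"

# Quantum register: a complex vector of 2**n amplitudes (here, an
# arbitrary example state), which must be normalized.
state = np.array([0.5, 0.5j, -0.5, 0.5], dtype=complex)

# Check the normalization condition: sum of |alpha_x|^2 equals 1.
norm = np.sum(np.abs(state) ** 2)
print(norm)  # 1.0
```

Note that the classical register holds one concrete value, while the quantum register is described by $2^n$ amplitudes at once, even for small $n$.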
Computation. Classical computation is typically described as applying Boolean operations (or arithmetic) to bits, producing another bitstring.
Quantum computation (before measurement) is described as applying a sequence of reversible linear transformations (unitaries):

$$|\psi_{\text{out}}\rangle = U_k \cdots U_2 \, U_1 \, |\psi_{\text{in}}\rangle, \qquad U_j^\dagger U_j = I.$$
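As a concrete instance, here is a minimal sketch using the standard single-qubit Hadamard gate: it is unitary (so the evolution is reversible), and applying it to $|0\rangle$ produces a balanced superposition.

```python
import numpy as np

# Hadamard gate: a standard single-qubit unitary.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

# Unitarity check: H† H = I, so this evolution is reversible.
print(np.allclose(H.conj().T @ H, np.eye(2)))  # True

# Apply H to |0> = (1, 0): both amplitudes become 1/sqrt(2).
psi = H @ np.array([1.0, 0.0])
print(psi)  # both entries ≈ 0.7071
```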
Output. Classical output is deterministic given the input and program (ignoring hardware randomness). Quantum output is obtained by measurement. The program can increase the probability of certain outcomes, but it generally cannot guarantee a single outcome for nontrivial states.
Worked Example
The differences are easier to see in a compact comparison. The table below summarizes the three dimensions.
| Aspect | Classical computing | Quantum computing |
|---|---|---|
| Representation | One bitstring at a time (e.g. 0101) | State vector with amplitudes over many basis states |
| Evolution | Boolean / arithmetic operations | Unitary (reversible) transformations before measurement |
| Readout | Read bits without disturbing them | Measurement yields random outcome; typically disturbs state |
A small thought experiment helps. Suppose you have a single qubit in the balanced superposition

$$|+\rangle = \frac{|0\rangle + |1\rangle}{\sqrt{2}}.$$
A classical “read” of a bit would reveal either 0 or 1 and leave the bit unchanged. A quantum measurement reveals 0 or 1 with equal probability, and after measuring, the state is no longer the balanced superposition.
Turtle Tip
When comparing models, keep two separate questions in mind: “What does the computer store internally?” and “What can I reliably read out at the end?” Quantum computing changes the first dramatically, but the second is constrained by measurement.
Common Pitfalls
A common mistake is to treat the sum $\sum_x \alpha_x |x\rangle$ as if it were a classical list of $2^n$ values you can inspect.
You cannot directly “print all amplitudes.” The only direct output is a measurement result, and learning detailed information about the state generally requires repeating the computation many times.
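A short simulation makes the point. The state vector below is a made-up example; from measurement outcomes alone you can only estimate the probabilities $|\alpha_x|^2$ by repeated runs, and even then the amplitudes' phases remain invisible.

```python
import numpy as np

rng = np.random.default_rng(1)

# A hypothetical 2-qubit state with unequal outcome probabilities.
psi = np.array([np.sqrt(0.5), np.sqrt(0.3), np.sqrt(0.2), 0.0])
true_probs = np.abs(psi) ** 2

# You cannot "print all amplitudes" on real hardware; you can only
# sample measurement outcomes, one per run of the computation.
shots = 20_000
samples = rng.choice(4, size=shots, p=true_probs)

# Estimated probabilities from repeated runs (phases are never seen).
est = np.bincount(samples, minlength=4) / shots
print(est)  # roughly [0.5, 0.3, 0.2, 0.0]
```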
Quick Check
- In one sentence, what makes quantum readout different from classical readout?
- Why does a reversible (unitary) evolution still lead to randomness in the output?
What’s Next
Now that we have a clear comparison, we can discuss what quantum computing is actually used for in practice. Next we’ll survey applications carefully, separating near-term capabilities from longer-term goals.
