What is Quantum Computing?
Track: Foundations · Difficulty: Beginner · Est: 12 min
Overview
Quantum computing is a model of computation in which information is stored in quantum states and evolves according to the rules of quantum physics. It matters because the same physical world that limits classical devices also provides a different kind of information processing—one that uses superposition and interference rather than only deterministic bit flips.
Intuition
A classical computer represents information with bits that are either 0 or 1 at any moment. A quantum computer represents information with states that behave more like arrows (vectors) than like switches. When you combine and transform these arrows, they can reinforce or cancel each other. That reinforcement and cancellation—interference—is the central intuition.
It is helpful to think of a quantum program as a carefully controlled sequence of transformations. Instead of directly “trying many answers at once,” you prepare a state, apply transformations that reshape it, and then extract a classical answer by measuring.
Formal Description
The smallest unit of quantum information is a qubit. Formally, a qubit is described by a normalized vector in a two-dimensional complex vector space. Using the computational basis $\{|0\rangle, |1\rangle\}$, any pure qubit state can be written as
$$|\psi\rangle = \alpha\,|0\rangle + \beta\,|1\rangle,$$
where $\alpha$ and $\beta$ are complex numbers called amplitudes. The connection to probabilities comes from normalization:
$$|\alpha|^2 + |\beta|^2 = 1.$$
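To make the normalization condition concrete, here is a minimal sketch that represents a qubit as a length-2 complex vector and checks that the squared magnitudes of the amplitudes sum to 1. The particular values of $\alpha$ and $\beta$ are an illustrative choice, not taken from the text.

```python
import numpy as np

# A qubit as a length-2 complex vector of amplitudes (alpha, beta).
# These particular values are only an example of a valid normalized state.
alpha, beta = 0.6, 0.8j
state = np.array([alpha, beta], dtype=complex)

# Normalization: |alpha|^2 + |beta|^2 must equal 1.
total = abs(alpha)**2 + abs(beta)**2
print(total)  # approximately 1.0
```

Any pair of complex numbers satisfying this condition is a valid qubit state; the constraint is what lets the squared magnitudes later serve as measurement probabilities.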
Quantum computation (before measurement) is modeled as a sequence of reversible linear transformations. A common formal description is
$$|\psi'\rangle = U\,|\psi\rangle,$$
where $U$ is a unitary operator (a matrix satisfying $U^\dagger U = I$) that preserves normalization.
Finally, measurement converts a quantum state into a classical outcome. If you measure in the computational basis, the outcomes are random with probabilities determined by the squared magnitudes of the amplitudes.
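The whole pipeline described above can be sketched in a few lines of linear algebra: a state vector, a unitary matrix acting on it, and outcome probabilities read off as squared magnitudes. The rotation angle below is an arbitrary illustrative choice; any $2\times 2$ unitary would do.

```python
import numpy as np

# Start in |0>, represented as the column vector (1, 0).
state = np.array([1, 0], dtype=complex)

# Any 2x2 unitary works; here, a real rotation by an arbitrary angle.
theta = 0.7
U = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# Evolution before measurement is linear and reversible.
state = U @ state

# Measuring in the computational basis: outcome probabilities are the
# squared magnitudes of the amplitudes.
probs = np.abs(state)**2
print(probs.sum())  # unitarity guarantees the probabilities sum to 1
```

Because $U$ is unitary, the output probabilities always sum to 1, no matter which angle is chosen; that is exactly what "preserves normalization" means in practice.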
Worked Example
Consider a single qubit initialized to $|0\rangle$. Apply a transformation that maps it to a balanced superposition:
$$|0\rangle \mapsto \frac{1}{\sqrt{2}}\bigl(|0\rangle + |1\rangle\bigr).$$
Now measure in the computational basis. The probability of seeing 0 is $\tfrac{1}{2}$ and the probability of seeing 1 is $\tfrac{1}{2}$, because the two amplitudes are equal in magnitude ($\tfrac{1}{\sqrt{2}}$ each).
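The balanced-superposition transformation in this example is the Hadamard gate $H$, and the worked example can be reproduced directly with its standard matrix form:

```python
import numpy as np

# The Hadamard gate: maps |0> to (|0> + |1>)/sqrt(2).
H = np.array([[1,  1],
              [1, -1]]) / np.sqrt(2)

# Prepare |0>, apply H.
state = H @ np.array([1, 0], dtype=complex)

# Measurement probabilities: squared magnitudes of the amplitudes.
probs = np.abs(state)**2
print(probs)  # approximately [0.5, 0.5]
```

Both amplitudes come out as $\tfrac{1}{\sqrt{2}}$, so squaring them gives an even 50/50 split between the outcomes 0 and 1, matching the worked example.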
This example is intentionally simple: it demonstrates the basic pipeline of quantum computing—state preparation → transformation → measurement—without claiming any computational advantage yet.
Turtle Tip
When you read a quantum statement, separate two ideas: (1) how the state changes before measurement, and (2) how measurement turns that state into classical probabilities. Confusing these two is the fastest way to feel lost.
Common Pitfalls
Two common confusions are worth fixing early.
First, superposition is not a list of classical possibilities. The numbers $\alpha$ and $\beta$ are amplitudes; they can interfere. Probabilities do not interfere.
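The difference between amplitudes and probabilities can be shown numerically. In this sketch, two hypothetical paths reach the same outcome with opposite-sign amplitudes: adding the amplitudes first (the quantum rule) gives cancellation, while adding probabilities (the classical rule) cannot.

```python
import numpy as np

# Two paths to the same outcome, with opposite-sign amplitudes.
# (An illustrative setup, not tied to a specific circuit.)
a1, a2 = 1/np.sqrt(2), -1/np.sqrt(2)

# Quantum rule: amplitudes add first, then are squared.
prob_quantum = abs(a1 + a2)**2        # destructive interference: 0

# Classical rule: probabilities add directly and can only grow.
prob_classical = abs(a1)**2 + abs(a2)**2   # 0.5 + 0.5 = 1

print(prob_quantum, prob_classical)
```

The quantum result is 0 even though each path individually would contribute probability $\tfrac{1}{2}$; this cancellation is precisely what "probabilities do not interfere" rules out in the classical picture.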
Second, a quantum computer is not automatically faster than a classical one. Quantum mechanics changes the representation and evolution of information, but speedups—when they exist—depend on very specific structure.
Quick Check
- In $|\psi\rangle = \alpha\,|0\rangle + \beta\,|1\rangle$, what does the condition $|\alpha|^2 + |\beta|^2 = 1$ mean in words?
- Why does measurement force a quantum process to end with a classical output?
What’s Next
Now that you know what quantum computing is at a high level, the next question is: why bother? The next page explains what quantum computing is plausibly useful for, what it is not useful for, and why expectations need to be realistic.
