There's still a lot of work to be done on the theoretical front: there's a certain class of equations that can't yet be solved, and these govern the error rates arising from system decoherence. The topological model is a completely new approach to this problem, first suggested only about 10 years ago, and it has been shown to be both inherently more stable and much simpler at error control.
The research being done now is very different from that in 2000; for one thing, the models being used are inherently fault-tolerant (which was not the case with previous models).
To expect some 'magic box' to suddenly appear and everyone to go "we've discovered a quantum computer! Isn't it cool!" is horrendously naïve (though it's certainly what the general population expects of physics), and I'd expect a fellow physics student to understand the absolute necessity of theoretical frameworks. The phenomenon of superconductivity was discovered experimentally in 1911 by Heike Kamerlingh Onnes, but it wasn't until 1957, when Bardeen, Cooper, and Schrieffer developed a fully microscopic theory of superconductivity, that it actually became possible to exploit it for anything useful (or, more technically, until Ginzburg, Landau, and Abrikosov developed a slightly more macroscopic theory of type II superconductors).
Bardeen, Cooper, Schrieffer, Ginzburg, and Abrikosov all won Nobel Prizes for their work on superconductivity, but Kamerlingh Onnes didn't (don't worry, he got a Nobel Prize for something else; Landau would've done too, but he was a bit too dead at the time).
For an apt summary of the accelerated progress in quantum computing, see:
http://en.wikipedia.org/wiki/Timeline_of_quantum_computing
and notice that there have been more papers published on quantum computing in the last two years than in the previous twenty.
They could, but there is a physical limit to the processing power of a conventional computer, a limit we're fast (well, maybe not fast, but inevitably) approaching. A quantum computer has a limit too, but one of an entirely different nature and with different dependencies. A quantum computer simply becomes necessary when we reach the point at which semiconductor components are so small that quantum phenomena take over. Of course there's plenty we can do with clever software to extend the lifetime of conventional computing, but eventually we're going to have to manipulate the quantum world (and all that clever software can be translated to qubits, so everyone wins!).