There are currently a few prototypes using up to 16 qubits. I wouldn't call them ready for the market, but that's not a problem for now. The big impact is in cryptography. You have, for instance, public key cryptography built on the RSA integer factorization problem: it relies on the computational difficulty of recovering two large primes p and q when you are only given their product pq. Since the smallest factor of such a number is itself enormous, and the time it takes to factor something like
74037563479561712828046796097429573142593188889231289084936232638972765034028266276891996419625117843995894330502127585370118968098286733173273108930900552505116877063299072396380786710086096962537934650563796359
is huge (take for instance the RSA integer factorization challenge), the problem is effectively out of reach for current computers: the best algorithm so far, the general number field sieve, runs in O(e^((1.9229 + o(1)) * (ln n)^(1/3) * (ln ln n)^(2/3))). That's sub-exponential but still not polynomial time, which means it scales horribly with the size of the input.
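To make the scaling concrete, here's a tiny Python sketch (my own illustration, not part of the RSA challenge; the primes below are made up for the demo) of the most naive attack, trial division. It cracks a ~40-bit semiprime almost instantly, but the work grows with the square root of n, so for a 212-digit challenge number the same loop would need on the order of 10^106 divisions.

```python
def trial_division(n):
    """Return the smallest prime factor of n by testing every odd candidate up to sqrt(n)."""
    if n % 2 == 0:
        return 2
    f = 3
    while f * f <= n:
        if n % f == 0:
            return f
        f += 2
    return n  # n itself is prime

# Toy semiprime: two made-up primes around 10^6, so sqrt(n) ~ 10^6 and the loop
# finishes in about half a million iterations.
p, q = 1_000_003, 1_000_033
n = p * q
print(trial_division(n), n // trial_division(n))  # -> 1000003 1000033
```

GNFS is vastly smarter than this, but as the complexity above shows, it's still super-polynomial.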
Now with QM, one can use Shor's quantum algorithm to factor these bad boys in only O((log N)^3) time. Shor's algorithm reduces factoring to finding the period of a^x mod N; the quantum part evaluates that function over a superposition of inputs, and the real magic is using quantum interference (the respectable name for the voodoo) to make the wrong answers cancel each other out while the right one gets reinforced. This was already demonstrated in 2001, factoring 15 into 3 and 5. With the advent of new resources for quantum computing (NIST and Yale have both built quantum buses for data transfer across different parts of the system), we should see a huge increase in applications.
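To see what's classical and what's quantum in Shor's recipe, here's a sketch of my own (just an illustration, not any library's API) that runs the whole reduction classically for N = 15; the brute-force loop in find_order is the piece a quantum computer replaces with QFT-based period finding.

```python
from math import gcd
from random import randrange

def find_order(a, n):
    """Smallest r > 0 with a^r = 1 (mod n). Brute force here; this is the step
    Shor's algorithm speeds up with the quantum Fourier transform."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_classical(n):
    """Classical emulation of Shor's reduction: factoring via order finding."""
    while True:
        a = randrange(2, n)
        d = gcd(a, n)
        if d > 1:
            return d, n // d        # lucky guess: a already shares a factor with n
        r = find_order(a, n)
        if r % 2:                    # need an even order
            continue
        y = pow(a, r // 2, n)
        if y == n - 1:               # a^(r/2) = -1 (mod n) tells us nothing
            continue
        p = gcd(y - 1, n)
        if 1 < p < n:
            return p, n // p

print(shor_classical(15))            # -> (3, 5) or (5, 3)
```

It's precisely that order-finding step that drops from exponential to polynomial on quantum hardware; everything else (the gcd bookkeeping) is cheap classical arithmetic.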
One can also use things like Grover's algorithm to search an unstructured keyspace of n keys in O(n^(1/2)) queries, instead of classical exhaustive search, which costs O(n) in the worst case and (n+1)/2 guesses on average. Where Shor attacks the integer factorization problem behind public-key crypto, Grover's can be turned on symmetric ciphers: AES with 256-bit keys, the encryption approved for SECRET and TOP SECRET data in the US gov't, has a keyspace of 2^256 possibilities, and Grover would cut the search to roughly 2^128 operations. It is a very exciting time indeed, and it calls for a much-needed paradigm change in cryptography (elliptic curve crypto might work for a while, since it's based on the discrete logarithm problem rather than integer factorisation and there's no decently fast classical algorithm to crack it, though a variant of Shor's algorithm handles discrete logs too once the hardware scales up).
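The back-of-the-envelope numbers are easy to check; here's a small Python sketch (mine, purely for illustration) comparing the average classical effort with Grover's query count for a few common key sizes.

```python
from math import isqrt, log2

def classical_avg(keyspace):
    # Exhaustive search finds the key after (n + 1) / 2 trials on average.
    return (keyspace + 1) // 2

def grover_queries(keyspace):
    # Grover's algorithm needs on the order of sqrt(n) oracle queries.
    return isqrt(keyspace)

for bits in (56, 128, 256):  # DES-sized, AES-128 and AES-256 keyspaces
    n = 2 ** bits
    print(f"{bits:3d}-bit key: classical ~2^{log2(classical_avg(n)):.0f} trials, "
          f"Grover ~2^{log2(grover_queries(n)):.0f} queries")
```

Even with Grover, AES-256 still leaves about 2^128 work on the table, which is why the usual advice for symmetric crypto is to double key lengths rather than throw the ciphers away.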
As you say, there's little development in the way of quantum computing theory, but I'd say that's more down to (scientists: laugh now) lack of funding than anything else. The equipment needed for the experiments is nowhere near as ubiquitous as silicon was in the days of Apple's garage-borne beginnings, or IBM's. The "if I touch it I break it" nature of QM is one of the issues (any stray interaction with the environment destroys the quantum state), as is finding ways to read the data out. And all of this is still lab-only; a working commercial product may well be decades away. The system needs to be completely isolated from any outside interference. Compare this with the One Laptop Per Child initiative, where the classical system is built to withstand pretty rough conditions.
All in all, I'd guess that with more funding, more research and above all more tangible results, we'll see a spike in CS interest in developing, for QM-based systems, an equivalent of Turing's work on classical ones. As for a realistic timeframe, I'd wager 10 years before enough interest gathers momentum to produce such a theory and working examples of it.