Race Not Over Between Classical and Quantum Computers

It's not bad, but it sounds like it only applies to a specific type of problem, and it didn't really mention Shor's algorithm or Grover's algorithm. There's a very limited set of problems where quantum computing beats classical computing, and many problems simply don't lend themselves to quantum speedup.
 
Ah, the Quantum; extreme multitasking.
The ultimate programmer question, "Which code is better?" Just on a bigger scale!
Fast? Reliable? Easy to modify? Multi-purpose? Reusable? Fewer lines of code? Spawn a thread?

In the meantime I wrote a little SQL to update some fiscal calendar tables to 2050 for a company in the North of Germany.
This application has been running since the mid-1990s and drives their business.
 
Exactly. Even if quantum computers ever become practical, meaning they can perform some specific task faster than a classical computer, they will be faster only at that specific task. For all the myriad other things we use computers for, classical computers will be faster, cheaper, and more efficient.

For example, one area where quantum computers might become faster is factoring large integers (Shor's algorithm). But they are not yet faster even there, and they have a lot of technical challenges to overcome before that day, if it ever arrives at all. If they do get there, though, we'll need different methods for security: much of today's security is built on asymmetric encryption such as RSA, which relies on the fact that factoring large integers is computationally infeasible on classical computers.
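The factoring connection can be shown with a toy example in Python. The primes here are tiny and purely illustrative (real RSA uses primes hundreds of digits long); the point is that the private key falls out immediately once the public modulus is factored.

```python
# Toy RSA-style illustration (NOT real cryptography): the keys are built
# from two secret primes, and anyone who factors the public modulus n
# can rebuild the private key.
from math import gcd

p, q = 61, 53          # secret primes (tiny for demonstration)
n = p * q              # public modulus
phi = (p - 1) * (q - 1)
e = 17                 # public exponent, must be coprime to phi
assert gcd(e, phi) == 1
d = pow(e, -1, phi)    # private exponent: requires knowing phi, i.e. p and q

msg = 42
cipher = pow(msg, e, n)          # encrypt with the public key
assert pow(cipher, d, n) == msg  # decrypt with the private key

# An attacker who factors n recovers the private key:
f = next(i for i in range(2, n) if n % i == 0)   # brute-force factoring
d_attacker = pow(e, -1, (f - 1) * (n // f - 1))
assert pow(cipher, d_attacker, n) == msg         # same plaintext recovered
```

With 2048-bit moduli that brute-force factoring step is the part no classical computer can do, which is exactly what Shor's algorithm would change.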

It's anyone's guess whether quantum computers will ever become feasible, but if they do, it ain't going to be any time soon.
 
They are making progress on the number of qubits. The trouble is that they need many more, and decoherence is still a problem.
 
That article has a misleading headline. Sure, the race isn't over, but saying so obscures the fact that it has barely even started. Where are we in the evolution of classical computers? Churchill might say it's not the end, nor even the beginning of the end. More like the end of the beginning.
 
Do you have that backwards? Quantum computing is in its infancy and may never evolve much if decoherence and error correction can't be figured out. Classical computing has a limit because you can only go so small before quantum tunneling becomes an issue. In a sense it has already hit a limit: clock speeds can't really increase much more, and speed gains have come mostly from adding more cores.
 
Nope
The phrase "end of the beginning" suggests we're past the rapid advancement of the first several decades. Classical computing is hitting several limits: speed-of-light propagation, heat dissipation, minimum discernible voltages, etc. Progress continues, but at a slower pace; we've solved the easy problems and are now up against harder ones. This is the "mature" phase of the technology, somewhere between "the end of the beginning" and "the beginning of the end".

For quantum computing I'd say it's the beginning of the beginning. Not even feasible yet, nobody is using them to solve real-world problems. They're still a research/science project.
 
That's why I was saying classical computing is closer to the end than to the end of the beginning. It's no longer about how fast a processor is; it's about how many cores you can cram onto a chip.
 
True, classical computers haven't really gotten faster over the past 10-15 years. They've only gotten more parallel. The fast advances of the beginning are over. So the beginning has ended. Yet they're still improving, just more slowly. I don't see them anywhere near the end of their lifecycle. More like in the middle. In my view, their end hasn't yet begun.

One clue that their end has begun is when we can point to whatever it is that will replace them. Even if quantum computers become practical (a big IF), they are unlikely to replace classical computers. More likely to overlap and take over a few narrow applications.
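The "more parallel, not faster" point can be sketched in Python. This is a minimal illustration, not a benchmark: a CPU-bound job is split into chunks handled by worker processes, the answer is identical to a serial run, and any speedup comes purely from spreading the chunks across cores.

```python
# Count primes below a limit by trial division (deliberately CPU-bound),
# splitting the range across worker processes. Same result as a serial
# count; the only thing parallelism buys is wall-clock time on a
# multi-core machine.
from concurrent.futures import ProcessPoolExecutor

def count_primes(bounds):
    """Count primes in [lo, hi) by trial division."""
    lo, hi = bounds
    total = 0
    for n in range(max(lo, 2), hi):
        if all(n % d for d in range(2, int(n ** 0.5) + 1)):
            total += 1
    return total

def parallel_count(limit, workers=4):
    """Split [0, limit) into chunks and sum the per-chunk counts."""
    step = limit // workers
    chunks = [(i * step, (i + 1) * step if i < workers - 1 else limit)
              for i in range(workers)]
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(count_primes, chunks))

if __name__ == "__main__":
    print(parallel_count(10_000))  # → 1229, same as count_primes((0, 10_000))
```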
 
They serve different purposes: you wouldn't want a quantum computer as an alarm clock, and you don't want to brute-force encryption with a classical computer.
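The brute-force half of that is easy to quantify. Grover's algorithm (mentioned earlier in the thread) gives only a quadratic speedup on unstructured key search, which is why symmetric ciphers survive it by doubling the key length. A quick back-of-the-envelope in Python:

```python
# Expected cost of unstructured key search: a classical machine tries
# about half the keyspace on average; Grover's algorithm needs on the
# order of sqrt(N) quantum oracle queries. Quadratic, not exponential.
from math import isqrt

def classical_guesses(bits):
    return 2 ** bits // 2      # average guesses over a 2**bits keyspace

def grover_queries(bits):
    return isqrt(2 ** bits)    # ~sqrt(N) oracle calls

# A 128-bit key: ~2**127 classical guesses vs ~2**64 Grover queries,
# i.e. Grover effectively halves the security level measured in bits.
assert classical_guesses(128) == 2 ** 127
assert grover_queries(128) == 2 ** 64
```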
 
I wouldn't say that. We solve problems very differently now: we don't have to work through them linearly when we can start in parallel and converge on a single solution (search). The end result is what we want, and how we get there isn't the goal. So instead of finding the one fastest route on one machine, we can start many and abort all but the one that reaches the end, with no need to backtrack like a traveling salesman.

A lot of the needs are bursty, so why build one machine for the burst when you can rent machines from the cloud as needed?
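The "start many, abort all but the winner" idea from the post above can be sketched with a thread pool: workers each scan a slice of the space, and once one finds the target the remaining futures are cancelled rather than run to completion. The names (`search_chunk`, `TARGET`) and numbers are invented for illustration.

```python
# Race several workers over disjoint slices of a search space; keep the
# first success and cancel the losers.
from concurrent.futures import ThreadPoolExecutor, FIRST_COMPLETED, wait

TARGET = 987_654

def search_chunk(lo, hi):
    """Scan [lo, hi); stand-in for an expensive per-candidate test."""
    for x in range(lo, hi):
        if x == TARGET:
            return x
    return None

def racing_search(limit, workers=8):
    step = limit // workers
    with ThreadPoolExecutor(max_workers=workers) as pool:
        futures = {pool.submit(search_chunk, i * step, (i + 1) * step)
                   for i in range(workers)}
        while futures:
            done, futures = wait(futures, return_when=FIRST_COMPLETED)
            for f in done:
                if f.result() is not None:
                    for g in futures:
                        g.cancel()      # abort the losing searches
                    return f.result()
    return None

print(racing_search(1_000_000))  # → 987654
```

The same shape maps onto the cloud point: spin up N cheap workers for the burst, take the first answer, and tear the rest down.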
 