
Quantum computing: Rose’s Law is Moore’s Law on steroids



Rose’s Law for Quantum Computing highlights the new platform’s sheer power to solve humanity’s and society’s most complex problems, both on and off Earth.


When Steve Jurvetson, Managing Director of the investment firm Draper Fisher Jurvetson (DFJ), first met Geordie Rose, now CTO and former CEO of D-Wave, back in 2002, he was struck by Rose’s ability to explain complex quantum physics and the “spooky” underpinnings of a new class of computing platform – quantum computing.




DFJ first invested in D-Wave in 2003, when Rose predicted that he would be able to demonstrate a two-bit quantum computer within six months – years, if not decades, ahead of the competition – and there was a certain precision to his predictions. With one qubit under his belt and a second coming, Rose went on to suggest that the number of qubits in a scalable quantum computing architecture should double every year. Sound familiar?

It sounded a lot like Gordon Moore’s prediction back in 1965, when Moore used just five data points on a log scale to devise Moore’s Law – the law which, until recently, the world’s largest semiconductor companies, such as GlobalFoundries, Intel and Samsung, used as the foundation for the International Technology Roadmap for Semiconductors (ITRS), the roadmap responsible for pushing today’s silicon-based 10nm and 7nm CPU architectures, such as Knights Landing, Knights Hill and Knights Mill, to the limit.



Plotting Rose’s Law


As you can see from the figure above, like Moore’s Law a straight line plotted against a logarithmic scale describes an exponential. But unlike Moore’s Law, the computational power of a quantum computer should also grow exponentially with the number of entangled qubits – it’s Moore’s Law compounded. And, as the figure shows, albeit missing the dates, D-Wave have managed to double their qubit count every year since 2002; by the end of 2013, when they brought out their 512-qubit Vesuvius processor, Rose’s Law and Moore’s Law looked spookily similar.
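To make the compounding concrete, here is a minimal sketch of the two exponentials stacked on top of each other: the qubit count doubling every year per Rose’s Law, and the 2^n basis states that an n-qubit register spans. The one-qubit-in-2002 baseline and the exact per-year counts are idealised assumptions for illustration, not D-Wave’s actual product history.

```python
# Rose's Law sketch: the qubit count doubles every year.
# The 2002 start year and one-qubit baseline are idealised
# assumptions, not D-Wave's actual product timeline.
def qubits_by_year(start_year=2002, start_qubits=1, years=10):
    return [(start_year + i, start_qubits * 2 ** i) for i in range(years)]

for year, n in qubits_by_year():
    # An n-qubit register spans 2**n basis states, so the
    # machine's capacity compounds twice over.
    print(f"{year}: {n:>4} qubits -> 2**{n} basis states")
```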

Bringing the chart up to date, D-Wave’s latest processor, the D-Wave 2X, released in June 2015, broke the 1,000 qubit barrier – so while it’s slightly behind the curve, it’s not far off. The $15 million system was subsequently snapped up by organisations as wide-ranging as Google, NASA, USC and Los Alamos, who went on to demonstrate that the new processor was 100 million times faster than classical computers at solving what we term “optimization” problems – which, on the surface at least, appear to be quantum computing’s application sweet spot.




Optimization problems are just what they sound like – imagine you are building a house and have a list of things you want in it, but you can’t afford everything on the list because you are constrained by a budget. What you really want to work out is the combination of items that gives you the best value for your money. That is an optimization problem, and they come in all shapes and sizes, spanning thousands of domains including systems design, mission planning, airline scheduling, financial analysis, web search, cancer radiotherapy and many more.
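The house-building scenario is the classic 0/1 knapsack problem, and a brute-force classical solver makes its structure clear. The item names, costs and values below are made up purely for illustration:

```python
from itertools import combinations

def best_combination(items, budget):
    """Brute-force 0/1 knapsack: check every subset of items and keep
    the highest-value one whose total cost fits within the budget."""
    best_value, best_subset = 0, ()
    for r in range(len(items) + 1):
        for subset in combinations(items, r):
            cost = sum(c for _, c, _ in subset)
            value = sum(v for _, _, v in subset)
            if cost <= budget and value > best_value:
                best_value, best_subset = value, subset
    return best_subset, best_value

# Hypothetical wish list: (feature, cost, value to you)
wish_list = [("solar panels", 8, 10), ("home office", 5, 7),
             ("garden", 3, 4), ("garage", 6, 5)]
chosen, total = best_combination(wish_list, budget=14)
print([name for name, _, _ in chosen], total)
```

Brute force enumerates all 2^n subsets, which is fine for four items but explodes combinatorially as the list grows – exactly the kind of scaling wall where quantum annealers like D-Wave’s aim to help.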

From helping you plan the most efficient route for your next car journey to helping researchers optimize the combination of tens of millions of unique compounds to create the next breakthrough drug or treatment, the world is full of optimization problems. And while quantum computers can help solve some of the smaller ones, their power and speed best position them to help solve some of the world’s most complex problems, with potentially enormous benefits to businesses, people and science – not in millennia or eons, but in seconds and minutes. Once you get beyond 270 on/off switches there are more possible combinations than atoms in the universe, and that’s far too much for today’s classical computers to crunch.
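The “270 switches” claim is easy to sanity-check with Python’s arbitrary-precision integers, against the commonly cited estimate of roughly 10^80 atoms in the observable universe:

```python
# 270 on/off switches give 2**270 possible combinations.
switch_combinations = 2 ** 270
atoms_in_universe = 10 ** 80  # commonly cited rough estimate

print(len(str(switch_combinations)))            # an 82-digit number
print(switch_combinations > atoms_in_universe)  # True
```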




Going back to Rose’s Law, though: as Jurvetson points out, the potential is mind-bending. If we use D-Wave’s early data on processing-power scaling, then the very near future should be the watershed moment when quantum computers surpass conventional computers and never look back – and Moore’s Law will never be able to catch up.

Soon, new processors will outperform all the computers on Earth combined; double the qubits again the following year and they outperform the universe. That’s an interesting concept in itself – after all, how do you “outperform” the universe? By this, Deutsch, Jurvetson and Rose mean that the system could solve certain problems that couldn’t be solved by any non-quantum computer, even if the entire mass and energy of the universe were at its disposal and moulded into the best possible computer.

Quantum computing is a completely different way to compute – harnessing, as David Deutsch posits, the refractive quantum echoes of many trillions of parallel universes to perform a computation.




Jurvetson offers a reality check, however. First, the caveat: D-Wave’s new processors are not general-purpose systems; they are application-specific, meaning they excel at solving specific, discrete optimization problems. And while this happens to map perfectly onto many of today’s real-world applications, from finance to molecular modelling, artificial intelligence and machine learning, it’s not going to change our current personal computing tasks.

Second, the assumptions. There is a lot of room for surprises in the next few years. D-Wave could hit a scaling wall or discover a heretofore unknown fracturing of the physics – perhaps local entanglement, noise, or some other technical hitch that doesn’t loom large at small scales but grows exponentially with problem size, just as the theoretical performance grows exponentially with scale. Jurvetson, for one, thinks the risk lies less in the steady qubit march, which has held true since 2002, than in the relationship between qubit count and performance.




There is also the question of the programming model. Programming traditional quantum computers is more difficult than machine-coding an Intel processor, but even that is changing with breakthroughs from the University of Maryland, who have, for the first time, managed to create a small re-programmable quantum computer. Without this breakthrough, programmers would have had to worry about everything from analog gate voltages to transforming programming logic into something native only to quantum computing – and that would have limited both the applications and the adoption of these nascent platforms.

According to Jurvetson, “The possibility of a curve like this begs many philosophical and cosmological questions about our compounding capacity to compute… the beginning of infinity if you will.”

It will be fascinating to see whether the next few years play out like Rose’s predictions. Thirteen years on, Rose’s Law still looks to be holding true; how long it will last no one knows, but if it holds for just another five years, we will truly have created a computing revolution.
