NonStop Trends and Wins

An important question, one that has been discussed for years, is whether there is a fundamental limit to computing. Most of us have been living in the golden age of what I’ll call the Moore’s Law stage. If you don’t know what Moore’s Law is, shame on you; it is worth a Google and some reading time. It is the stage of computing we’ve been in for most of the time computers have existed: the observation that processing power doubles in a predictable amount of time. In other words, it increases exponentially, and it has for most of my career. Initially this came from the ability to shrink transistors. Moore’s Law states that the number of transistors on a microchip doubles about every two years, while the cost of computing is halved. Unfortunately, Moore’s Law is starting to fail: transistors have become so small (Intel is currently working on readying its 10nm architecture, a scale approaching atomic dimensions) that simple physics has begun to block the process. We have been offering multicore architectures to prop up the law, but many say Moore’s Law has run its course, which leads us back to the initial question. Is there a limit, and have we reached it?
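To put numbers on that doubling, here is a quick back-of-the-envelope sketch in Python. It is my own illustration, not anything from Intel or HPE, starting from the roughly 2,300 transistors of the 1971 Intel 4004 and assuming an idealized, clean doubling every two years:

    # Toy illustration of Moore's Law: transistor count doubling every two
    # years, starting from the ~2,300 transistors of the 1971 Intel 4004.
    def transistors(year, base_year=1971, base_count=2_300, doubling_years=2):
        """Project transistor count under an idealized Moore's Law."""
        doublings = (year - base_year) / doubling_years
        return base_count * 2 ** doublings

    for year in (1971, 1991, 2011, 2021):
        print(f"{year}: ~{transistors(year):,.0f} transistors")

Run it and you go from a few thousand transistors to tens of billions in fifty years, which is roughly what the industry actually delivered, and exactly the kind of curve that cannot bend upward forever.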

For some, the hope is quantum computing, which takes us from the binary bit to the qubit. Quantum computers leverage quantum mechanical phenomena to manipulate information, relying on quantum bits, or qubits. The technology is aimed at problems such as generating probability distributions, mapping data, testing samples, and iterating. Quantum computing promises exponential power for mathematically challenging problems, improving accuracy, shortening computation runtimes, and tackling previously impenetrable calculations. This is very promising and may just be the next exponential leap in computing, but first a joke. Heisenberg is speeding around, trying to find a hall where he’s supposed to give a lecture. Just as he thinks he recognizes the street he’s been looking for, a cop pulls him over and tells him, “You were going 93.287 miles per hour!” Heisenberg exclaims, “Great! Now I’m lost!” The joke depends on Werner Heisenberg’s uncertainty principle, which states that for quantum particles, the more precisely you know a particle’s speed, the less precisely you can know its position. Systems with quantum behavior don’t follow the rules we are used to: they exist in several different states at the same time, and they even change depending on whether they are observed or not. So a qubit can be a 1, a 0, both, or neither. If it sounds complicated, it is.
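If you want to see the “both at once, until measured” idea without any quantum hardware, a few lines of ordinary Python can fake it. This is purely a simulation I sketched for illustration: the qubit is just a pair of amplitudes, and “measuring” it collapses it to a single classical bit with the corresponding probabilities.

    # Minimal classical simulation (illustration only, not HPE code) of one
    # qubit in an equal superposition. The state is a pair of amplitudes;
    # measuring collapses it to 0 or 1 with probability |amplitude|**2.
    import random

    def measure(state):
        """Collapse a one-qubit state [amp0, amp1] to a classical 0 or 1."""
        p0 = abs(state[0]) ** 2
        return 0 if random.random() < p0 else 1

    # |0> pushed through a Hadamard gate: equal amplitudes for 0 and 1.
    superposed = [2 ** -0.5, 2 ** -0.5]

    counts = {0: 0, 1: 0}
    for _ in range(1000):
        counts[measure(superposed)] += 1
    print(counts)   # roughly {0: 500, 1: 500} -- each shot yields only one bit

The state “is” both values right up until you look, and every look hands you exactly one classical bit.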

I have been asked what NonStop is doing about quantum computing, since there are many articles citing finance and banking as one of the first verticals to take advantage of this computing paradigm. I have poked around with Labs to see what HPE is doing, and as you might imagine, HPE is definitely researching and testing. From what I have uncovered so far, and from listening to some of our internal talks on quantum: it’s real. It does have enormous promise. It is not happening (commercially) any time soon.

If we consider classical computing, each bit is (always) 1 or 0. We can watch it work without affecting the result. It is very reliable and, importantly, repeatable. We can copy results. There is well-understood error correction and debugging. Finally, binary operations are exact. Now let’s consider quantum computing. Each qubit can be 1 and 0 at the same time, and each qubit can hold more than one bit of information, but it collapses to 1 or 0 when you measure it. You need to run something many times to recover the exact value. You can’t watch a quantum computer at work, so there is no debugging. You cannot copy quantum state (the no-cloning theorem), so there is no Checkpoint/Restart and no saving to disk. A qubit is analog and noisy; you never quite have a perfect 0 or 1. And so far, quantum computing is extremely expensive. It might solve some very esoteric compute problems, say the traveling salesman problem, but in the near term (5-10 years) it is highly unlikely that very stable, repeatable, critical systems such as NonStop will be replaced, if ever. The problems NonStop handles, OLTP and relational queries, are not things quantum computing is very good at, or perhaps ever will be.
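That “run it many times” point is worth seeing concretely. The toy sketch below (again, just an illustration with made-up numbers) contrasts a classical bit, which reads back exactly and repeatably, with a simulated quantum circuit whose underlying answer you can only estimate statistically, one shot and one bit at a time.

    # Illustration only: a classical bit reads back exactly; a "quantum"
    # result has to be estimated by repeating the circuit many times.
    import random

    TRUE_P1 = 0.3   # made-up probability that the simulated qubit collapses to 1

    def read_classical_bit(bit):
        return bit   # exact, repeatable, copyable

    def run_quantum_circuit():
        return 1 if random.random() < TRUE_P1 else 0   # one noisy shot, one bit out

    print("classical bit:", read_classical_bit(1), "-- same answer every time")
    for shots in (10, 100, 10_000):
        estimate = sum(run_quantum_circuit() for _ in range(shots)) / shots
        print(f"{shots:>6} shots -> estimated P(1) = {estimate:.3f} (true value {TRUE_P1})")

The more shots you pay for, the tighter the estimate gets, which is the opposite of the deterministic, auditable behavior that OLTP workloads depend on.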

It is fascinating technology and you should learn about it. Just don’t worry about it.

Author

  • Justin Simonds

    Justin Simonds is a Master Technologist for the Americas Enterprise Solutions and Architecture group (ESA) under the mission-critical division of Hewlett Packard Enterprise. His focus is on emerging technologies, business intelligence for major accounts and strategic business development. He has worked on Internet of Things (IoT) initiatives and integration architectures for improving the reliability of IoT offerings. He has been involved in the AI/ML HPE initiatives around financial services and fraud analysis and was an early member of the Blockchain/MC-DLT strategy. He has written articles and whitepapers for internal publication on TCO/ROI, availability, business intelligence, Internet of Things, Blockchain and Converged Infrastructure. He has been published in Connect/Converge and Connection magazine. He is a featured speaker at HPE’s Technology Forum and at HPE’s Aspire and Bootcamp conferences and at industry conferences such as the XLDB Conference at Stanford, IIBA, ISACA and the Metropolitan Solutions Conference.
