Classical computing is an evolving thing of beauty. At least it is in the eyes of scientists who seek to conquer the universe or a disease, or to power a self-driving car and other modern miracles birthed by modern science. The admiration is warranted considering the U.S. put humans on the moon with less computing power than is found in a typical smartphone today. But that is not to say that the four computers NASA used in the Apollo space program were dumb. Instead, they were marvels of how well scientists could leverage the computing muscle and memory they did have. In short, those computer scientists were masters at making every byte count.
If you’re interested in the details of how NASA scientists pulled that off, check out the video below (don’t worry, it is understandable to the average mortal human).
One would think that, given the immense amount of computing power available today, and the fact that mankind conquered space on much less, nothing would be beyond the reach of inquiring minds now. But one would most definitely be wrong about that. Classical computing has well-defined limits.
Despite computers getting more powerful, though generally smaller and cheaper, there remain many questions far too big for them to compute. Even with the help of artificial intelligence (AI) elements like machine learning (ML) and deep learning (DL), there simply isn't enough classical computing muscle to lift work projects whose data sets reach beyond the already huge petabyte scale. Even high-performance computing (HPC), on premises or in the cloud, and designed to "function above a teraflop or 10^12 floating-point operations per second," can't handle such workloads. Neither can supercomputers, which are not the same as HPC systems: supercomputers operate at the highest computational rates physically possible today.
So, what kind of questions can none of these modern-day computers with their AI software assistants and sophisticated modeling answer?
Oh, just the ones mankind is most curious or concerned about. Questions whose answers are locked away in molecular data, such as how hormones and drug chemistries might be exploited to create new vaccines and cures that are genetically customized for each individual patient. But there are many, many more super-complex questions and conceivable applications requiring substantially more computing muscle to explore.
IBM and Microsoft both point to chemistry as a first application for quantum computing – a completely new form of computing currently under development. Certainly, chemistry has many bioinformatics application possibilities. Quantum chemistry also has applications in agriculture and epigenetics and many other disciplines that Genome Alberta works so diligently to support. To get an idea of the immensity of this subject and its impact on our world and the human experience, check out IBM’s video overview below.
In a nutshell, quantum computing enables scientists to "solve problems that currently take longer than the lifetime of the universe in seconds, hours or days," Dr. Krysta Svore, a computer scientist at Microsoft, told me at a two-day Microsoft AI and data workshop I attended in Seattle, Washington, USA last spring. She says the same thing in the Microsoft video below that explains the software giant's work in quantum computing.
Svore told the 30 or so workshop attendees that Microsoft envisions the first quantum computer being in the form of a sidekick to a classical computer – a hybrid computing model, in other words.
The race is on among the world's leading computing vendors to deliver a fully functional quantum computer. In the meantime, you can experiment with quantum programming to prepare your skill level for this bold new computing world and its many bioinformatics applications. Check out the Microsoft Quantum Development Kit or the IBM Q Experience (in beta now) for starters.
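If you want a feel for the core idea before downloading either vendor kit, here is a minimal sketch, written in plain Python rather than Q# or any vendor library, of the two concepts every quantum "hello world" exercises: putting a qubit into superposition with a Hadamard gate, and measuring it. The function names and the toy state representation are my own illustrative choices, not part of either toolkit.

```python
import random

# A qubit's state is a pair of complex amplitudes (alpha, beta) for |0> and |1>.
# Measurement yields 0 with probability |alpha|^2 and 1 with probability |beta|^2.

SQRT_HALF = 2 ** -0.5  # 1/sqrt(2), the Hadamard gate's scaling factor

def hadamard(state):
    """Apply the Hadamard gate; it sends |0> to an equal superposition of |0> and |1>."""
    alpha, beta = state
    return (SQRT_HALF * (alpha + beta), SQRT_HALF * (alpha - beta))

def measure(state, rng=random.random):
    """Collapse the state to a classical bit, sampled from the amplitudes."""
    alpha, _ = state
    return 0 if rng() < abs(alpha) ** 2 else 1

# Prepare |0>, apply Hadamard, then sample many measurements.
qubit = hadamard((1 + 0j, 0 + 0j))
counts = {0: 0, 1: 0}
for _ in range(10_000):
    counts[measure(qubit)] += 1

print(counts)  # roughly an even split between 0 and 1
```

In Q# or on the IBM Q Experience the same experiment is a one-qubit circuit (H gate, then measure), but the statistics you get back look just like the counts above.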
Good luck on your career journey and may the Q# programming language be forever in your favor!