Glorious! Mankind's progress, that is. We continually strive for greatness in the most courageous ways possible. We've come so far, and we're not relenting in any way. Technology sits at the frontier of mankind's development, opening new doors to every other possibility.
As we know, we entered what was called "The Space Age" with the launch of the first satellite into orbit. Since then, we've gone to the moon and back and are preparing to go to Mars in the coming years. How has mankind managed to push so far in so little time?
Take computers, for example. The first mechanical computer was designed by Charles Babbage, who is often called the father of the computer. Though it was a milestone for mankind, the machinery he designed was extremely bulky and slow.
We then pushed even further: from computers built on vacuum tubes, to transistors, all the way to integrated circuits and microprocessors, where a single fingernail-sized chip can now house billions of transistors. An amazing feat indeed, but the pace at which computers improve (especially those based on semiconductors, handling operations in 0s and 1s) is slowing, and Moore's law is starting to break down.
According to Wikipedia, Moore's law, named after Intel co-founder Gordon Moore, is the observation that the number of transistors in a dense integrated circuit doubles about every two years. Put another way, computing power doubles approximately every two years. Sounds plausible? It held up well for decades, from around 1975 until about 2010. Manufacturers are trying hard to uphold the law, but it is a daunting task, and it seems all but certain that the law will reach its saturation point within the next decade.
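To get a feel for what that doubling means, here is a quick back-of-the-envelope sketch in Python. The starting point is the Intel 4004 from 1971 with roughly 2,300 transistors; the two-year doubling period is the textbook statement of the law:

```python
# Rough projection of Moore's law: transistor counts doubling every
# two years, starting from the Intel 4004 (1971, ~2,300 transistors).
START_YEAR = 1971
START_TRANSISTORS = 2_300
DOUBLING_PERIOD_YEARS = 2

def projected_transistors(year: int) -> int:
    """Transistor count predicted by Moore's law for a given year."""
    doublings = (year - START_YEAR) / DOUBLING_PERIOD_YEARS
    return int(START_TRANSISTORS * 2 ** doublings)

for year in (1971, 1981, 1991, 2001, 2011, 2021):
    print(f"{year}: ~{projected_transistors(year):,} transistors")
```

Run it and the 2021 projection lands in the tens of billions, which is roughly where today's largest chips actually sit; the law really did hold for a remarkably long time.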
Now, if that happens, how do we preserve the trend of technological advancement we've put so much work into? Well, engineers around the globe are pouring their hearts into another approach to computing, one that essentially sidesteps Moore's law altogether: quantum computing.
As Albert Einstein once said, "God does not play dice!" Or does he? Quantum computing is based on the principles of quantum mechanics, which takes what we understand about physics and turns it on its head. It is literally a game of probability: a very advanced branch of physics that deals with the very small, as opposed to general relativity (which deals with the physics of the very large).
Take a crash course in quantum mechanics, from the Standard Model and wave-particle duality all the way to Schrödinger's cat, quantum entanglement, and quantum chromodynamics, and its raw power becomes evident; and not even all of these ideas are needed for quantum computing.
Quantum computing is excitingly different from modern-day computing and an amazing phenomenon to behold. Unlike today's computers, which rely on transistors to encode data in only two possible states (bits: 0s and 1s), quantum computers use quantum bits, or qubits, which can exist in a superposition of those binary states. Because a qubit can be in multiple states at once (both 0 and 1 at the same time, or anywhere in between), quantum computers have the potential to become immensely more powerful than even the fastest supercomputers of today.

As promising as this technology is, it does have a few problems. Remember Schrödinger's cat, which is both alive and dead in the box until you look at it, at which point it becomes either alive or dead but not both? The same applies to quantum computing. Since qubits are built from subatomic particles governed by quantum mechanics, measuring one forces it into a definite state of 0 or 1 rather than a superposition. That makes it behave exactly like a mundane digital bit, which turns out to be a big problem. A workaround is available in the form of quantum entanglement: two subatomic particles can become linked, or "entangled", so that when two electrons, for instance, are entangled, one assumes a particular spin and the other assumes the opposite spin relative to it. This correlation makes it possible for scientists to infer the value of a qubit without looking at it directly.
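You don't need real quantum hardware to get a feel for this: a qubit's state can be simulated classically as a vector of two complex amplitudes. The NumPy sketch below is a toy simulator written purely for illustration (not any particular library's API). It puts one qubit into an equal superposition with a Hadamard gate, shows that measurement collapses it to a random 0 or 1, and then builds a two-qubit entangled (Bell) state to show the opposite-outcome correlation described above:

```python
import numpy as np

rng = np.random.default_rng()

# A qubit is a length-2 complex vector of amplitudes; |amplitude|^2
# gives the probability of measuring the corresponding basis state.
ZERO = np.array([1, 0], dtype=complex)          # the state |0>
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)    # Hadamard gate

def measure(state):
    """Collapse a state vector: pick a basis state with probability |amp|^2."""
    probs = np.abs(state) ** 2
    return rng.choice(len(state), p=probs)

# Superposition: H|0> = (|0> + |1>)/sqrt(2), a 50/50 coin flip on measurement.
plus = H @ ZERO
print("single qubit:", [measure(plus) for _ in range(10)])

# Entanglement: the Bell state (|01> + |10>)/sqrt(2). Basis order is
# |00>, |01>, |10>, |11>; the two qubits always disagree when measured.
bell = np.array([0, 1, 1, 0], dtype=complex) / np.sqrt(2)
for _ in range(5):
    outcome = measure(bell)                   # index into the 4 basis states
    q0, q1 = (outcome >> 1) & 1, outcome & 1  # unpack the two bits
    print(f"qubit 0 = {q0}, qubit 1 = {q1}")
```

Every run of the entangled pair prints opposite values for the two qubits, which is exactly the spin correlation described above.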
Another problem is that it poses serious security risks. According to Dr. Colin Wilmott, "Quantum computing renders our classical securities null and void." You might think otherwise, since it also strengthens security in the form of quantum cryptography (which renders classical hacking essentially useless), but most modern computers are still digital, and quantum attacks could effectively and easily breach the security of any digital system. It would also be very dangerous for such a device to fall into the hands of criminals.
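The example usually cited for this threat (my addition, not named in the post above) is Shor's algorithm, which factors large integers efficiently and would therefore break RSA-style public-key encryption. The quantum speedup lives entirely in one subroutine, order finding; the Python toy below performs that step classically (hopelessly slow at real key sizes, which is exactly the point) just to show how the order r of a number a modulo N yields N's factors:

```python
from math import gcd

def find_order(a: int, n: int) -> int:
    """Smallest r > 0 with a^r = 1 (mod n): the step a quantum
    computer performs exponentially faster than this brute force."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def factor_from_order(n: int, a: int) -> tuple[int, int]:
    """Recover two factors of n from the order of a (Shor's reduction)."""
    assert gcd(a, n) == 1, "a must be coprime to n"
    r = find_order(a, n)
    assert r % 2 == 0, "need an even order; try another a"
    y = pow(a, r // 2, n)
    assert y != n - 1, "degenerate case; try another a"
    return gcd(y - 1, n), gcd(y + 1, n)

print(factor_from_order(15, 7))  # order of 7 mod 15 is 4 -> prints (3, 5)
```

Factoring 15 is trivial, of course; the danger is that a large enough quantum computer could do the same to the 2048-bit numbers guarding today's encrypted traffic.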
Moreover, it requires a very cold operating temperature (near absolute zero), since the subatomic particles need to be close to a stationary state in order to be measured. This makes current machines anything but portable. For example, the cores of D-Wave quantum computers operate at around -460°F (-273°C), only about 0.02 K above absolute zero.
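As a quick sanity check on those figures (the ~15 millikelvin value for D-Wave below is my assumption from commonly cited numbers, not from this post), here is the conversion in Python:

```python
def fahrenheit_to_kelvin(f: float) -> float:
    """Convert Fahrenheit to kelvin (absolute zero = 0 K = -459.67 F)."""
    return (f - 32) * 5 / 9 + 273.15

# The quoted -460 F is a rounding: taken literally it lands just below 0 K.
print(fahrenheit_to_kelvin(-460))     # ~ -0.18 K
# A commonly cited D-Wave operating point matches the "about 0.02 K" figure.
print(fahrenheit_to_kelvin(-459.64))  # ~ 0.017 K
```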
Yes, it has its disadvantages, but its benefits far outweigh them. From machine learning and artificial intelligence to financial services, computational optimization, biomedical simulation, and sheer computing power, there truly is no limit to the phenomenon that is quantum computing.
Though we are still a long way from quantum computers that base their operations on millions of qubits, we take quantum leaps (no pun intended) by the minute. It is essentially one of the most revolutionary creations of mankind, waiting at our doorstep to be used effectively and efficiently. The near-limitless computational power of quantum computers will enable us to reach new frontiers currently hindered by our powerful but otherwise comparably slow digital computers.