Gartner describes quantum computing as: "[T]he use of atomic quantum states to effect computation. Data is held in qubits (quantum bits), which have the ability to hold all possible states simultaneously. Data held in qubits is affected by data held in other qubits, even when physically separated. This effect is known as entanglement." The power of AI and data analytics will complement the speed that qubits offer.

The Internet of Things, the world of trackable, readable and identifiable things connected over the internet, will be transformed when entangled with quantum computing. With trillions of devices connected, there is an ever-increasing demand for responsiveness, process efficiency and scalability, and quantum computing has a role to play in optimizing computation and network speed, powered by 5G.

Although blockchain-enabled cryptographic algorithms are available to secure IoT devices and their connections through public keys, hackers equipped with quantum computers could break those keys. On the flip side, quantum computing also has the potential to create an almost un-hackable network of devices and data. Securely encrypting and protecting IoT-connected devices, and powering them with exponential speed and analytical capability, is an imperative for both government and the private sector.

As quantum computing and IoT merge, a new ecosystem of policy issues will evolve. These include ethics, interoperability protocols, cybersecurity, privacy and surveillance, complex autonomous systems, and best commercial practices. As quantum computing capabilities advance, we should act now to prepare IoT for the quantum world. There are many areas to explore in research and development, and eventually implementation. The coming decade will provide both imperatives and opportunities to explore the quantum implications.
In 1936, Alan Turing proposed the Turing machine, which became the foundational reference point for theories about computing and computers. Around the same time, Konrad Zuse built the Z1 computer, considered to be the first electromechanical binary computer.
What happened next is history, and in our world today, computers are everywhere. Our lives are dramatically different from how they were even at the end of the 20th century, and our mobile phones have far more powerful CPUs than desktop computers did only a few years ago. The advent of the Internet of Things brings computing power into every minute detail of our lives. The world wide web has had such a transformative effect on society that many people can't even remember life before they were online.

The major catalyst behind this transformation was silicon, and its use in the production of good transistors. This occurred over a period of more than 100 years, dating from when Michael Faraday first recorded the semiconductor effect in 1833, via Morris Tanenbaum, who built the first silicon transistor at Bell Labs in 1954, to the first integrated circuits around 1960.
We are about to embark on a similar journey in our quest for building the next-generation computer. Quantum physics, which emerged in the early 20th century, is so powerful and yet so unlike anything known before that even the inventors had a hard time understanding it in detail.
Peter Shor published an algorithm in 1994 capable of efficiently solving problems in cryptography that are hard for classical computers, that is, the vast majority of computers in use today. In fact, Shor's algorithm continues to threaten the foundations of most encryption deployed across the globe. The problem was that, in 1994, there was no quantum computer in sight. In 1997, the first tiny quantum computer was built, but the field really took off only when the Canadian startup D-Wave revealed its 28-qubit quantum computer in 2007.

Similar to the trajectory of classical computing, which took more than 100 years from discovery to mass use, quantum computers are now maturing very quickly. Today, many players are engaged in a battle over who can build the first powerful quantum computer. These include commercial entities such as IonQ, Rigetti, IBM, Google, Alibaba, Microsoft and Intel, while virtually all major nation states are spending billions of dollars on quantum computing research and development.
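Shor's insight was to reduce factoring a number N to finding the period r of the sequence a^x mod N; only that period-finding step needs a quantum computer, which performs it exponentially faster than any known classical method. The sketch below is a toy classical illustration of the surrounding reduction (function names are illustrative, and the period is found by brute force, which is precisely the part a quantum computer would accelerate):

```python
import math

def order(a, N):
    """Brute-force the smallest r > 0 with a**r == 1 (mod N).
    This is the step Shor's algorithm speeds up exponentially."""
    r, x = 1, a % N
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

def shor_factor(N, a):
    """Classical sketch of Shor's reduction: given the period r of
    a mod N (with r even and a**(r//2) != -1 mod N), the factors
    of N fall out of two gcd computations."""
    r = order(a, N)
    assert r % 2 == 0, "need an even period; try another base a"
    y = pow(a, r // 2, N)
    return math.gcd(y - 1, N), math.gcd(y + 1, N)

print(shor_factor(15, 7))  # (3, 5)
```

For N = 15 and base a = 7, the powers of 7 mod 15 cycle through 7, 4, 13, 1, so the period is r = 4, and the gcds recover the factors 3 and 5. On a 2048-bit RSA modulus, the brute-force loop above is hopeless, while the quantum version is not, which is why Shor's result threatens deployed encryption.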
Quantum computers are so powerful yet so difficult to build that whoever can crack the code will have a lasting, powerful advantage. This cannot be overstated.
Considering the immense challenges of building quantum computers, I'd say we are roughly where we were around 1970 with classical computers. We have some quantum computers, but they are still pretty unreliable by today's standards. We call them NISQ devices, for Noisy Intermediate-Scale Quantum: noisy because they are still quite error-prone, and intermediate-scale because of their small qubit counts. But they work. There are a few public quantum computers available for anyone to program. IBM, Rigetti, Google and IonQ all provide public access to real quantum computing hardware with open-source tools. IBM even sells a quantum computer that you can put in your own data centre (the IBM Q System One). But these are not yet powerful enough to break RSA 2048-bit keys, and probably won't be for another 10 to 20 years.
The comparison date of 1970 works from another angle, too. In October 1969, researchers sent the first message over the internet (then called ARPANET). When they tried to send the single word "login", the system crashed after sending "l" and "o". It later recovered and the message was successfully sent. Today, we are also building a quantum communication system that communicates not bits and bytes, but quantum states that quantum computers can understand. This is important so that we can build a quantum version of the internet.
It is also important as a way of encrypting communication, since the quantum channel provides some inherent physical guarantees about a transmission. Without going into too much detail, there is a fundamental property whereby the simple act of wiretapping, or listening in on a communication, becomes detectable to the communicating parties. Not because they have a fancy system setup, but because of fundamental properties of the quantum channel.

But quantum computers are not just useful for cryptography and communication. One of the most immediate applications is in machine learning, where we are already on the cusp of a quantum advantage, meaning that a quantum algorithm will outperform any classical algorithm. Some believe a quantum advantage for machine learning could be achieved within the next 6 to 12 months. The near-term applications for quantum computing are numerous: cryptography, machine learning, chemistry, optimization, communication and many more. And this is just the start, with research increasingly extending to other areas.

Google and NASA have just announced that they have achieved 'quantum supremacy', discussed in more detail below: the ability of quantum computers to perform certain tasks that a classical computer simply cannot do in a reasonable timeframe. Their quantum computer solved a problem in 200 seconds that would take the world's fastest supercomputer 10,000 years. The problem that was solved has no practical merit, yet it demonstrates the huge potential of quantum computers and the possibility of unlocking that potential in the coming years. This opens up a completely new era in which we can focus on building quantum computers with practical benefits, and while this is still many years away, it will be the new frontier in computation.
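The tamper-evidence of a quantum channel is the basis of quantum key distribution. As an illustrative assumption (the text does not name a specific scheme), the toy classical simulation below mimics a BB84-style protocol: an eavesdropper who measures the qubits in transit corrupts roughly a quarter of the sifted key bits, which the legitimate parties detect by comparing a sample of their keys.

```python
import random

def measure(bit, prep_basis, meas_basis, rng):
    """Toy model of a qubit measurement: measuring in the preparation
    basis returns the encoded bit; measuring in the other basis
    yields a uniformly random outcome (the state is disturbed)."""
    if prep_basis == meas_basis:
        return bit
    return rng.randint(0, 1)

def bb84(n, eavesdrop, seed=0):
    """Simulate n qubit transmissions and return the error rate
    Alice and Bob observe on their sifted (matching-basis) bits."""
    rng = random.Random(seed)
    alice_bits  = [rng.randint(0, 1) for _ in range(n)]
    alice_bases = [rng.randint(0, 1) for _ in range(n)]
    bob_bases   = [rng.randint(0, 1) for _ in range(n)]

    sent_bits, sent_bases = alice_bits, alice_bases
    if eavesdrop:
        # Eve measures each qubit in a random basis and resends
        # what she saw -- disturbing half the mismatched ones.
        eve_bases = [rng.randint(0, 1) for _ in range(n)]
        eve_bits = [measure(b, ab, eb, rng)
                    for b, ab, eb in zip(alice_bits, alice_bases, eve_bases)]
        sent_bits, sent_bases = eve_bits, eve_bases

    bob_bits = [measure(b, sb, bb, rng)
                for b, sb, bb in zip(sent_bits, sent_bases, bob_bases)]

    # Sifting: keep only positions where Alice's and Bob's bases agree.
    kept = [i for i in range(n) if alice_bases[i] == bob_bases[i]]
    errors = sum(alice_bits[i] != bob_bits[i] for i in kept)
    return errors / len(kept)

print(bb84(2000, eavesdrop=False))  # no eavesdropper: error rate 0.0
print(bb84(2000, eavesdrop=True))   # eavesdropper: error rate near 0.25
```

Without the eavesdropper, every sifted bit matches; with her in the loop, about 25% of sifted bits disagree, so comparing even a modest sample of the key exposes the wiretap.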
Just recently, Google claimed to have achieved quantum supremacy.
Google claims to have built the first quantum computer that can carry out calculations beyond the ability of today's most powerful supercomputers, a landmark moment that has been hotly anticipated by researchers. A paper by Google's researchers seen by the FT, which was briefly posted earlier this week on a NASA website before being removed, claimed that their processor was able to perform a calculation in three minutes and 20 seconds that would take today's most advanced classical computer, known as Summit, approximately 10,000 years. The researchers said this meant that "quantum supremacy", the point at which quantum computers carry out calculations that had previously been impossible, had been achieved. "This dramatic speed-up relative to all known classical algorithms provides an experimental realization of quantum supremacy on a computational task and heralds the advent of a much-anticipated computing paradigm," the authors wrote.
A November 2018 report by the Boston Consulting Group said quantum computers could "change the game in such fields as cryptography and chemistry (and thus material science, agriculture and pharmaceuticals) not to mention artificial intelligence and machine learning . . . logistics, manufacturing, finance and energy". Unlike the basic binary elements of classical computers, or bits, which represent either zeros or ones, quantum bits, or qubits, can represent both at the same time. By stringing together qubits, the number of states they can represent rises exponentially, making it possible to explore millions of possibilities at once.
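That exponential growth can be made concrete with a few lines of Python: the state of an n-qubit register is described by 2^n complex amplitudes, which is also what a classical simulator would have to store. A minimal sketch (the 16-bytes-per-amplitude figure assumes double-precision complex numbers):

```python
def amplitudes(n):
    """An n-qubit register is described by 2**n complex amplitudes."""
    return 2 ** n

def classical_bytes(n, bytes_per_amplitude=16):
    """Memory a classical simulator needs to hold that state vector,
    assuming double-precision complex numbers (16 bytes each)."""
    return amplitudes(n) * bytes_per_amplitude

# Each added qubit doubles the storage a classical simulator needs.
for n in (10, 30, 53):
    print(n, amplitudes(n), classical_bytes(n) / 2 ** 30, "GiB")
```

At 30 qubits the state vector already occupies 16 GiB; at 53 qubits, the size of Google's Sycamore chip, it would take about 128 PiB, which is why exact classical simulation of such devices becomes infeasible.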
Quantum computing takes advantage of the strange ability of subatomic particles to exist in more than one state at any time. Due to the way the tiniest of particles behave, operations can be done much more quickly and use less energy than classical computers. In classical computing, a bit is a single piece of information that can exist in two states – 1 or 0. Quantum computing uses quantum bits, or 'qubits' instead. These are quantum systems with two states. However, unlike a usual bit, they can store much more information than just 1 or 0, because they can exist in any superposition of these values.
"The difference between classical bits and qubits is that we can also prepare qubits in a quantum superposition of 0 and 1 and create nontrivial correlated states of a number of qubits, so-called 'entangled states'," says Alexey Fedorov, a physicist at the Moscow Institute of Physics and Technology.
A qubit can be thought of as a point on a sphere. Whereas a classical bit can be in only two states, at either of the two poles of the sphere, a qubit can sit at any point on the sphere. This means a computer using such bits can store far more information using less energy than a classical computer.
The Story of Rigetti
The coldest place in the universe is located in sunny California.
On the outskirts of Berkeley, inside an oversized warehouse, hangs a large white pipe. It's a human-made contraption, a next-generation cryogenic refrigerator cooled to 0.003 kelvin, just north of absolute zero.
The pipe belongs to Rigetti Computing, the next contestant aiming to build useful quantum computers.
The company got going in 2013, when a physicist named Chad Rigetti decided that quantum computers were a lot closer to prime time than many suspected, and that he wanted to be the one to push the technology over the finish line.
In pursuit of his vision, Rigetti left a comfortable job as a quantum researcher at IBM, raised over $119 million in funding, and built the coldest pipe in history. Over fifty patent applications later, Rigetti now manufactures integrated quantum circuits that are directly linked to a quantum computer in the cloud.
One of the biggest and most exciting aspects about Rigetti, however, is their push towards democratization of quantum computing.
Right now, if you go to Rigetti’s website (www.rigetti.com), you can download Forest, their quantum developer’s kit. The kit provides a user-friendly interface to the quantum world.
With it, almost anyone can write a program and run it on Rigetti’s thirty-two-qubit computer. To date, over 120 million programs have already been run.
And other companies are fast following suit, as Microsoft, IBM, and Google have now rolled out quantum cloud services.
Google’s announcement has caused a major stir in the quantum computation community.
Indeed, the very notion of “quantum supremacy” is under scrutiny. “Supremacy” implies that quantum computers will replace traditional computation, rather than serve as a supplement.
By contrast, another term often used (proposed by Rigetti) is “quantum advantage.” According to the company, this concept is demonstrated when an algorithm run on a quantum computing platform “has either a faster time to solution, a better quality solution, or lower cost of classical compute compared to the best classical algorithm.”
In either case, quantum computation offers tremendous quantitative advantages over classical computers in a range of problem domains.
To give an idea of scale: if every atom in the observable universe could store one bit of information, a register of under 270 qubits would already span more basis states than there are atoms, since 2^266 exceeds the commonly cited estimate of about 10^80 atoms.
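The arithmetic behind scale claims like this can be checked directly. Taking the commonly cited order-of-magnitude estimate of 10^80 atoms in the observable universe, a short calculation finds where an n-qubit register's basis-state count crosses that figure:

```python
ATOMS_IN_UNIVERSE = 10 ** 80  # common order-of-magnitude estimate

def qubits_needed(target=ATOMS_IN_UNIVERSE):
    """Smallest n such that an n-qubit register has more basis
    states (2**n) than `target`."""
    n = 1
    while 2 ** n < target:
        n += 1
    return n

print(qubits_needed())  # 266
```

So 266 qubits is the crossover point under this estimate: 2^265 is still below 10^80, while 2^266 exceeds it.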
And already, current state-of-the-art quantum computers, including Google’s Sycamore and IBM’s Q53, have reached a count of 53 qubits.
Yet today, we have no concrete idea of what innovations might arise once quantum computing matures at scale. But what we do know is tantalizing.
Because chemistry and physics are quantum processes, computing in qubits will usher in what Oxford’s Simon Benjamin calls “a golden age of discovery in new materials, new chemicals, and new drugs.”
It will thrust aside today’s computing constraints to artificial intelligence, fundamentally transform cybersecurity, and allow us to simulate systems of unprecedented complexity.
As Chad Rigetti explains, “[The technology] change[s] the economics of research and development. Say you’re trying to create a new cancer drug. Instead of building a large-scale wet lab to explore the properties of hundreds of thousands of compounds in test tubes, you’re going to be able to do much of that exploration inside a computer.”
In other words, the gap between experimental question and any new solution—whether novel drug, optimized material, or personalized product—is about to become a whole lot smaller.
Brace yourself. The era of democratized, scalable, and cloud-accessible quantum computing has just begun.
Before quantum computers, all known realistic computing devices satisfied the extended Church-Turing thesis, which holds that any computing device built can be only polynomially faster than a regular "universal" computer; that is, any relative speedup scales only according to a power law. Designers of these "classical" computing devices increased computing performance by many orders of magnitude by making the operations faster (increasing the clock frequency) and increasing the number of operations completed during each clock cycle. While these changes increased computing performance enormously, the result is still just a (large) constant factor faster than the universal computing device.

Bernstein and Vazirani showed in 1993 that quantum computers could violate the extended Church-Turing thesis, and in 1994 Peter Shor showed a practical example of this power in factoring a large number: a quantum computer could solve this problem exponentially faster than a classical computer. While this result was exciting, at that time no one knew how to build even the most basic element of a quantum computer, a quantum bit, or "qubit", let alone a full quantum computer.

But that situation has recently changed. Two technologies, one using trapped ionized atoms (trapped ions) and the other using miniature superconducting circuits, have advanced to the point where research groups can build small demonstration quantum computing systems, and some groups are making these available to the research community. These recent advances have led to an explosion of interest in quantum computing worldwide; with this interest, however, also comes hype and confusion about both the potential of quantum computing and its current status.
It is not uncommon to read articles about how quantum computing will enable continued computer performance scaling (it will not) or change the computer industry (its short-term effects will be small, and its long-term effects are unknown).
In the field of quantum computing, computers that process information according to classical laws of physics are referred to as "classical computers", in order to distinguish them from "quantum computers", which rely upon quantum effects in the processing of information.
DGTAL.AI Inc., 16192 Coastal Highway, Lewes, Delaware
The UN has identified Sustainable Development Goals (SDGs) that need the attention of the world's communities, and technological innovations offer great solutions to address them. At DGTAL.AI we discuss the SDGs and technological solutions on a single platform for the benefit of tech enthusiasts and talents, and add value towards the solutions, thereby serving the cause of a billion people. DGTAL.AI is a not-for-profit initiative.
Copyright © 2020, DGTAL.AI Inc.
DGTAL.AI: News & Views of the Exponential Tech World