Quantum Computers: An Overview

Abstract

Quantum computing has developed over the past two decades from a visionary idea into one of the most attractive fields of quantum mechanics, combining physics, mathematics, and computer science. The nature of computation can be altered if the bits of a computer are scaled down to the size of individual atoms. Such a quantum computer may carry out a superposition of many calculations concurrently. Many computational problems can be solved using quantum computing, including applications that cannot be handled by today’s computers. This paper gives an overview of the quantum computer, a description of the qubit, the differences between quantum and silicon computers, and an illustration of how a quantum computer works.

Keywords: Quantum computer, qubit, quantum gates, Bloch sphere, Moore’s law, decoherence.

Introduction

A quantum computer is any device for computation that makes direct use of distinctively quantum mechanical phenomena, such as superposition and entanglement, to perform operations on data [1]. In a classical (or conventional) computer, information is stored as bits; in a quantum computer, it is stored as qubits (quantum bits). Research in both theoretical and practical areas continues at a frantic pace, and many national government and military funding agencies support quantum computing research to develop quantum computers for both civilian and national security purposes, such as cryptanalysis. If large-scale quantum computers can be built, they will be able to solve certain problems, for example integer factorization via Shor’s algorithm, exponentially faster than any of our current classical computers [2].

Computers reduce human effort, increase performance, and make human work easier. There have been many ways to increase the performance of computers. One way is to minimize the size of the transistors used in them; another is to use quantum computers. A quantum computer is considered to be very useful for factoring large numbers: it has the capacity to decrypt in about 20 minutes codes that would take classical computers an enormous time, such as millions of years.

Quantum computation was first the idea of Richard Feynman, who said that by using quantum mechanical effects we could achieve faster computation. This was realized when scientists tried simulating these effects on a classical computer. Another hint was the exponentially large state spaces that quantum mechanics makes available, which indicate an enormous amount of computational resources [Harrow 2012]. Quantum computing is a new field of great interest to researchers; the reason behind this emerging curiosity is its massive computational power.

There are many formulations of Moore’s law, but they all come to the same conclusion: the performance of computer chips increases as their components shrink exponentially over time.

Working

To get to grips with quantum computing, first remember that an ordinary computer works on 0s and 1s. Whatever task you want it to perform, whether it’s calculating a sum or booking a holiday, the underlying process is always the same: an instance of the task is translated into a string of 0s and 1s (the input), which is then processed by an algorithm. A new string of 0s and 1s pops out at the end (the output), which encodes the result. However clever an algorithm might appear, all it ever does is manipulate strings of bits — where each bit is either a 0 or a 1. On the machine level, this either/or dichotomy is represented using electrical circuits which can either be closed, in which case a current flows, or open, in which case there isn’t a current.

Quantum computing is based on the fact that, in the microscopic world, things don’t have to be as clear-cut as we’d expect from our macroscopic experience. Tiny particles, such as electrons or photons, can simultaneously take on states that we would normally deem mutually exclusive. They can be in several places at once, for example, and in the case of photons simultaneously exhibit two kinds of polarisation. We never see this superposition of different states in ordinary life because it somehow disappears once a system is observed: when you measure the location of an electron or the polarisation of a photon, all but one of the possible alternatives are eliminated and you will see just one. Nobody knows how that happens, but it does.

Superposition frees us from binary constraints. A quantum computer works with particles that can be in superposition. Rather than representing bits, such particles would represent qubits, which can take on the value 0, or 1, or both simultaneously. You might object that something like superposition could perhaps be achieved using only ordinary classical physics, perhaps by processing two ordinary bits at the same time, in which case quantum computing wouldn’t be that much more amazing than classical computing.
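
Before addressing that objection, it helps to see what a qubit looks like concretely. Here is a minimal Python sketch (using NumPy, under the standard state-vector picture; the numbers are illustrative, not from the paper): the state is a unit vector of two complex amplitudes, and a measurement returns 0 or 1 with the squared-amplitude probabilities.

    import numpy as np

    # A qubit is a unit vector of two complex amplitudes:
    # |psi> = alpha*|0> + beta*|1>, with |alpha|^2 + |beta|^2 = 1.
    alpha = beta = 1 / np.sqrt(2)          # equal superposition
    psi = np.array([alpha, beta], dtype=complex)

    # Born rule: measurement probabilities are the squared amplitudes.
    probs = np.abs(psi) ** 2               # -> [0.5, 0.5]

    # Each simulated measurement collapses the state to a single bit.
    rng = np.random.default_rng(seed=0)
    print(rng.choice([0, 1], size=10, p=probs))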

But there is more to quantum physics than just superposition. If you look at a system of more than one qubit, the individual components aren’t generally independent of each other. Instead, they can be entangled. When you measure one of the qubits in an entangled system of two qubits, for example, the outcome, whether you see a 0 or a 1, immediately tells you what you will see when you measure the other qubit. Particles can be entangled even if they are separated in space, a fact that caused Einstein to call entanglement ‘spooky action at a distance’.
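
A minimal sketch of this correlation, again with NumPy under the state-vector picture: the Bell state (|00> + |11>)/sqrt(2) only ever yields the outcomes 00 or 11, so reading one qubit immediately fixes the other.

    import numpy as np

    # Bell state (|00> + |11>)/sqrt(2); amplitudes over basis 00, 01, 10, 11.
    bell = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)

    rng = np.random.default_rng(seed=0)
    probs = np.abs(bell) ** 2
    for outcome in rng.choice(4, size=8, p=probs):
        q0, q1 = outcome >> 1, outcome & 1
        print(q0, q1)   # always prints 0 0 or 1 1: the qubits agree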

Entanglement means that describing a system of several qubits using ordinary classical information, such as bits or numbers, isn’t simply about stringing together the descriptions of the individual qubits. Instead, you need to describe all the correlations between the different qubits. As you increase the number of qubits, the number of those correlations grows exponentially: for n qubits there are 2^n correlations.

This number quickly explodes: to describe a system of 300 qubits you’d already need more numbers than there are atoms in the visible Universe. The idea is that, since you can’t hope to write down the information contained in a system of just a few hundred qubits using classical bits, perhaps a computer running on qubits, rather than classical bits, can perform tasks a classical computer can never hope to achieve. This is the real reason why physicists think quantum computing holds such promise.
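
The arithmetic behind this claim is easy to check. A short Python sketch (the 16 bytes per amplitude is an assumption: two 64-bit floats per complex number) of how the classical storage cost grows:

    # Memory needed to store a full n-qubit state vector classically,
    # assuming 16 bytes per complex amplitude (two 64-bit floats).
    for n in (10, 30, 50, 300):
        amplitudes = 2 ** n
        print(f"{n:>3} qubits: {amplitudes:.2e} amplitudes, "
              f"~{16 * amplitudes:.2e} bytes")

Already at 50 qubits the state vector runs to petabytes; at 300 qubits the byte count vastly exceeds the roughly 10^80 atoms in the visible Universe.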

There’s a hitch, however. While a quantum algorithm can take entangled qubits in superposition as input, the output will usually also be a quantum state, and such a state will generally change as soon as you try to observe it. ‘Nature pulls a trick here,’ says Jozsa. ‘She updates a quantum state, but then she doesn’t allow you to get all the information.’ The art of quantum computing is to find ways of gaining as much information as possible from the unobservable.

Moore’s Law

In 1965, Gordon Moore predicted that the number of transistors on a silicon chip would double every year. This was based upon empirical evidence that he witnessed in the early years of the semiconductor industry. In 1975, he modified his prediction to a doubling every two years. That held for the next forty years, until Brian Krzanich announced in 2016 that Intel was slowing down the rate to a new process generation every 2.5 years.

So the question comes up whether Moore’s Law can also be applied to quantum qubits, and early evidence suggests that indeed it may. If we take this as an assumption, we can make rough forecasts for qubit capacities in the coming years and show when a quantum computer could be used to solve certain meaningful problems. The resulting graph is shown below: [Figure: Qubit projections versus algorithm requirements, September 2016]

The upward sloping lines represent the projections for qubit density forecasts for various technologies. The adiabatic line would be a prediction for quantum annealing machines like the D-Wave computers. These have followed the Moore’s Law prediction pretty closely so far with the D-Wave 1 at 128 qubits in 2011, the D-Wave 2 at 512 qubits in 2013, the D-Wave 2X at 1097 qubits in 2015, and a 2048 qubit machine in 2017. Since this is not a universal quantum computer and has no error correction, the qubits are easier to build and the densities can be much higher.

The upward sloping lines labelled Physical or Logical represent various types of gate-level quantum computers. The Physical curve predicts the number of physical qubits that will be available. There is less historical data on these, but there are indications that they will progress rapidly too. As examples, IBM has a 5 qubit machine that is available in the cloud through the IBM Quantum Experience, and Google has demonstrated a 9 qubit machine. Both of these companies, and others, have indicated that these densities will increase rapidly, so the Physical curve maintains an improvement rate of a doubling every year for the next 10 years and a doubling every two years thereafter.

However, just as important as the number of qubits are the quality of the qubits and the amount of error correction that will be required. The expectation is that gate-level quantum computers will need a substantial amount of error correction, so that each logical qubit will consist of a number of physical qubits. Estimates for the ratio of physical to logical qubits vary widely depending upon the technology used and the fidelity of the individual qubits; I have seen estimates ranging from about 10:1 for topological qubits to 1000:1 or more for other types. So the three curves plotting the number of logical qubits represent a range of quantum computers with error correction.

The final set of lines is the double horizontal lines that show how many logical qubits would be needed to implement various algorithms. By viewing where these lines cross a specific quantum computer technology curve, one can see a rough estimate of when the algorithm may become solvable. I have shown four different types of algorithms on the graph. These are:

  1. The number of logical qubits needed to surpass the capabilities of a classical computer. This is assumed to be about 50 qubits, because it would require the classical computer to compute 2^50 states, which is an overwhelming number.
  2. The number of logical qubits needed to perform a simulation on quantum chemistry.  Estimates are that this would require at least 100 qubits.
  3. The number of logical qubits needed to implement a machine learning algorithm on a quantum computer. For this graph I am estimating it would take about 1000 qubits, although this could vary widely.
  4. Finally, the number of logical qubits needed to factor a 2048 bit number in order to break RSA encryption with a 2048 bit key. This is estimated to take at least 4000 qubits, but could be more depending upon the algorithm used.

By using these assumptions and studying the graph one can come up with some interesting conclusions. First, implementations that require heavy error correction will add considerably to the requirements and time before an algorithm can run successfully. For example, we estimate that a quantum computer with 4000 physical qubits will be built by 2023. If those qubits were perfect and required no error correction (i.e. a 1:1 logical to physical ratio) then the 2048 bit number could be factored as early as 2023. However, if heavy error correction is required with a 1000:1 logical to physical ratio, then this factoring could not be accomplished until about 2041 or about 18 years later.
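
A hedged sketch of this timeline arithmetic in Python. The starting count, doubling cadence, and the 1000:1 overhead are the assumptions stated above, not measured data; the article's own curve, which doubles faster in its early years, lands a year or two earlier than this constant-cadence estimate.

    from math import ceil, log2

    def year_reached(start_year, start_qubits, target_qubits, doubling_years):
        """Year a qubit count is reached if it doubles every doubling_years."""
        doublings = log2(target_qubits / start_qubits)
        return start_year + ceil(doublings * doubling_years)

    # 1:1 logical-to-physical ratio: 4000 qubits suffice, assumed built by 2023.
    # 1000:1 error-correction overhead: 4,000,000 physical qubits are needed,
    # i.e. log2(1000) ~ 10 more doublings at one doubling every two years.
    print(year_reached(2023, 4000, 4000 * 1000, doubling_years=2))  # ~two decades later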

So in order to make quantum computing as useful as possible as early as possible, there are three things that the industry needs to work on:

  1. Increasing the number of physical qubits.
  2. Increasing the quality of the qubits so that good results can be obtained with smaller error correction codes that have less overhead.
  3. Finding robust algorithms that may still be able to provide useful answers even when errors occur.  These algorithms could then run on machines with fewer physical qubits.

Advantages

Cryptography and Peter Shor’s algorithm

In 1994 Peter Shor discovered the first algorithm with which factorization can be performed efficiently on a quantum computer. Factorization is a complex application that, at scale, only a quantum computer could handle, and it is one of the most crucial problems in cryptography. For instance, the security of RSA, a public-key cryptosystem, depends on factoring being a hard problem. Owing to the many functional and practical capabilities of quantum computers, scientists are doing their best to build one. Maney (1998) observed that breaking current encryption, which would take on the order of centuries on prevailing computers, may take only a few years on a quantum computer.
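
The quantum speed-up in Shor’s algorithm lies entirely in finding the period r of a^x mod N; the rest is classical number theory. Below is a small Python sketch in which a classical brute-force loop stands in for the quantum period-finding step (fine for a toy N like 15, hopeless at cryptographic sizes; the helper names are this sketch’s own):

    from math import gcd

    def find_period(a, N):
        # Smallest r with a^r = 1 (mod N). This brute-force search is
        # the step a quantum computer performs exponentially faster.
        r, x = 1, a % N
        while x != 1:
            x = (x * a) % N
            r += 1
        return r

    def shor_postprocess(N, a):
        # Classical part of Shor's algorithm: turn a period into factors.
        if gcd(a, N) != 1:
            return gcd(a, N), N // gcd(a, N)   # lucky: a shares a factor with N
        r = find_period(a, N)
        if r % 2 == 1:
            return None                        # odd period: retry with another a
        y = pow(a, r // 2, N)
        if y == N - 1:
            return None                        # trivial square root: retry
        return gcd(y - 1, N), gcd(y + 1, N)

    print(shor_postprocess(15, 7))             # -> (3, 5)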

Artificial intelligence

It has been predicted that quantum computers will be much faster and will therefore perform a large number of operations in a very short span of time. Moreover, this acceleration of operations will help computers learn more quickly, even when using one of the simplest methods, the bound model for learning.

Some other benefits

Its high performance could enable complex compression algorithms, voice and image recognition, molecular simulation, true randomness, and quantum communication.

Simulation and randomness – In simulation, randomness is important. Molecular simulations are necessary for developing simulation applications for chemistry and biology.

Quantum communication – With quantum communication, both sender and receiver are alerted when an eavesdropper tries to intercept the signal. Quantum bits also permit more information to be communicated per bit. Quantum computers thus make communication more secure.

What is unique about quantum computers

Theoretically, quantum mechanics explores areas that are nearly unthinkable. For instance, it is possible that a quantum computer holds an infinite number of right answers for an infinite number of parallel universes; it just happens to give you the right answer for the universe you happen to be in at the time. ‘It takes a great deal of courage to accept these things,’ said Charles Bennett of IBM, one of the prominent quantum computer scientists. ‘If you do, you have to believe in a lot of other strange things.’ (Maney, 1998)

Disadvantages

Decoherence

When a quantum system interacts with its environment (with other particles, a particle of light for example), or when the parameters of its quantum state are measured, the superposition collapses into a classical state. This is called decoherence.

This is a huge hindrance in the process of producing a quantum computer. If the decoherence problem cannot be solved, a quantum computer will be no better than a silicon one. For quantum computers to be powerful, many operations must be performed before quantum coherence is lost; ideally, a quantum computer should complete its calculations before decohering. But if the number of errors in a quantum computer is low enough, then even when its qubits decohere it is viable to use an error-correcting code to prevent data loss.
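
A toy dephasing model makes the idea concrete. In the density-matrix picture, the off-diagonal entries are the coherences, the ‘quantumness’ of the state; each uncontrolled interaction with the environment shrinks them while the diagonal classical probabilities survive. This sketch assumes a simple constant damping factor per interaction, which is an illustrative choice, not a physical constant:

    import numpy as np

    # Density matrix of the superposition (|0> + |1>)/sqrt(2).
    rho = np.array([[0.5, 0.5],
                    [0.5, 0.5]], dtype=complex)

    gamma = 0.5   # assumed damping of coherences per environment interaction
    for step in range(4):
        rho[0, 1] *= gamma   # off-diagonal terms decay ...
        rho[1, 0] *= gamma
        print(f"step {step + 1}: coherence = {abs(rho[0, 1]):.4f}")
    # ... leaving a classical 50/50 mixture: the state has decohered.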

Many error-correcting codes exist. One of the simplest is a classical code called the repetition code. Here, 0 is encoded as 000 and 1 as 111. Then, if only one bit is flipped, one gets a state such as 011 that can be corrected back to its actual state 111. The signs of states in a quantum superposition are also important, but sign errors can be corrected as well; there is even a theory of quantum error-correcting codes.
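
A minimal runnable sketch of the classical repetition code just described, with a hypothetical noisy channel that flips each bit independently:

    import random

    def encode(bit):
        return [bit] * 3                 # 0 -> 000, 1 -> 111

    def noisy_channel(codeword, flip_prob=0.1):
        # Flip each bit independently with probability flip_prob.
        return [b ^ int(random.random() < flip_prob) for b in codeword]

    def decode(codeword):
        return int(sum(codeword) >= 2)   # majority vote fixes a single flip

    random.seed(1)
    received = noisy_channel(encode(1))
    print(received, "->", decode(received))   # any single flip is corrected

Quantum codes generalize this idea; Shor’s nine-qubit code, for example, protects against both bit-flip and sign-flip errors.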

Hardware for quantum computers

Hardware is also one of the open problems for quantum computers. Nuclear magnetic resonance (NMR) technology is the most accepted today because of some successful experiments. Other hardware approaches are based on ion traps and quantum electrodynamics (QED). All of these methods have some restrictions; what the architecture of future quantum computer hardware will be, nobody knows!

Conclusion

Experimental and theoretical research in quantum computation is escalating all over the world. New technologies for realizing quantum computers are being initiated, and new classes of quantum computation with various benefits over classical computation are frequently being discovered; some of them are expected to bear technological fruit. The quantum theory of computation is an essential part of the world view for anyone who seeks a basic understanding of quantum theory and the processing of information. It is one of the biggest steps in science and is undoubtedly revolutionizing the practical computing world.

References

  1. Gershenfeld, Neil; Chuang, Isaac L. (June 1998). ‘Quantum Computing with Molecules’. Scientific American.
  2. Quantum Information Science and Technology Roadmap, for a sense of where the research is heading.
