Conventional computing methods are the ones we use every day on our computers, cell phones, laptops, and so on. Of all computing models, conventional or classical computing solves problems in the way closest to how a person would work through them by hand. In classical computing, information is stored in bits, which take the discrete values 1 and 0. Performing operations on these discrete values on a classical machine follows the same steps as performing them by hand.
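To make the "same as by hand" point concrete, here is a small Python sketch of classical bit operations: adding or combining two numbers at the machine level is just the logic-gate version of column-by-column binary arithmetic.

```python
# Classical bits take discrete values 0 or 1, and machine operations on
# them mirror hand calculation. Combining 5 (0b101) and 3 (0b011):
a, b = 0b101, 0b011

print(a & b)   # AND of each bit pair -> 0b001, i.e. 1
print(a | b)   # OR  of each bit pair -> 0b111, i.e. 7
print(a ^ b)   # XOR of each bit pair -> 0b110, i.e. 6
print(a + b)   # full binary addition -> 0b1000, i.e. 8
```

Each operator acts bit by bit, exactly as one would when working the binary columns out on paper.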
The typical features of classical computers, such as bits, registers, logic gates, and algorithms, have analogous features in a quantum computer. In quantum computing, information is manipulated in the form of quantum bits, or 'qubits'. A qubit can be realized by a quantum particle such as an electron or photon, with its state 1 and/or 0 indicated by a property such as its spin or polarization. The nature and behavior of these particles is explained by quantum physics, which forms the basis of quantum computing. The principles of superposition and entanglement, in turn, form the basis of quantum physics itself.
In conventional computers, data is processed using transistor switches and other electronic circuitry that settle into a logic state of 1 or 0, whereas in an ideal quantum computer qubits are stored in atoms, ions, or even individual electrons and photons. These qubits are produced and controlled using laser beams, electromagnetic fields, radio waves, and an assortment of other techniques.
There is nothing amiss in conventional computers, nothing that would force us to move to another form of computing, except a trend observed by Intel co-founder Gordon Moore. According to Moore's Law, the number of transistors that fit on a chip doubles roughly every two years (often paraphrased as computing power doubling every 18 months), and the trend held for decades. More powerful computers mean packing more, ever-smaller transistors into more compact integrated circuits. Transistor sizes have now shrunk so far that they cannot get much smaller and still keep pace with Moore's Law: smaller transistors would have to operate at the subatomic scale, where the ordinary laws of physics give way to quantum effects.
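The power of that doubling trend is easy to check with rough arithmetic. The sketch below is an illustration, not measured data: it starts from the Intel 4004 of 1971 (about 2,300 transistors) and projects a doubling every two years.

```python
# Rough Moore's-Law arithmetic: transistor count doubling every ~2 years,
# starting from the Intel 4004 (1971, ~2,300 transistors).
start_year, start_count = 1971, 2300

for year in range(start_year, 2022, 10):
    doublings = (year - start_year) / 2
    print(year, int(start_count * 2 ** doublings))
```

Fifty years is 25 doublings, so 2,300 transistors grow to roughly 77 billion, which is in the same ballpark as the largest chips actually shipping today, and it shows why the trend cannot continue indefinitely once transistors approach atomic dimensions.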
Computations that would take conventional computers several years could be performed in seconds. Quantum computing promises better weather forecasting, drug discovery, logistical planning, and financial analysis, and could even aid the search for human-habitable planets. Alongside these benefits, however, we would have to cope with compromised security: a sufficiently powerful quantum computer could break much of the encryption that protects today's digital vaults, potentially exposing bank records, private communications, and passwords to third parties.