You have probably heard of it, but you likely know little about it. Some claim it will be the next technological game-changer, and they are right. It is quantum computing, and its potential rewards and risks are even greater than those of artificial intelligence.

To appreciate what quantum computing is, one must first consider the classical computing we use every day in technologies such as laptops and smartphones. Classical computing manages bits: switches that are either on or off, one or zero. Microchips are the foundational electronic devices that store and manipulate those bits, and the software running on them provides the functionality we rely upon.

Both the strength and the limitation of classical computing lie in the fact that every piece of information must be represented as a string (or sequence) of ones and zeros. This is what allows all information to be digitized (hence the term digital economy), as the first sketch at the end of this section illustrates. Yet it also constrains the efficiency of classical computation, which underpins every technology available today. Think of it as moving through a large city along a grid of streets: it would be far more efficient to cut along the diagonals, through the buildings, rather than travel around them.

Bit-string representation of information does have its advantages. One is computer security, studied in the field of cryptography; the second sketch below gives the flavour. Electronic systems in finance, government, industry and the military can be protected against bad actors because such systems are effectively locked unless one has the cryptographic key to open…
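
To make the digitization point concrete, here is a minimal Python sketch that turns a short message into the string of ones and zeros a classical computer would actually store, and back again. The helper names to_bits and from_bits are illustrative, not part of any standard library.

```python
def to_bits(text: str) -> str:
    """Encode text as a string of ones and zeros (8 bits per UTF-8 byte)."""
    return "".join(f"{byte:08b}" for byte in text.encode("utf-8"))

def from_bits(bits: str) -> str:
    """Decode a bit string produced by to_bits back into text."""
    data = bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))
    return data.decode("utf-8")

message = "qubit"
encoded = to_bits(message)
print(encoded)                     # 0111000101110101... : ones and zeros only
assert from_bits(encoded) == message
```

Everything a classical machine handles, whether a photo, a spreadsheet or a video call, ultimately passes through a representation like this one.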
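
And here is an equally minimal sketch of the "locked unless one has the key" idea: a toy one-time-pad cipher in which the same secret key both scrambles and unscrambles a message. Real systems use far more sophisticated schemes; the lock helper below is purely illustrative.

```python
import secrets

def lock(message: bytes, key: bytes) -> bytes:
    """XOR each byte of the message with the key.

    The output is unreadable gibberish to anyone without the key,
    and applying the same key a second time restores the original.
    For this toy scheme the key must be at least as long as the message.
    """
    return bytes(m ^ k for m, k in zip(message, key))

key = secrets.token_bytes(16)            # the secret "key to open"
ciphertext = lock(b"wire $1M to acct", key)
print(ciphertext)                        # scrambled bytes without the key
assert lock(ciphertext, key) == b"wire $1M to acct"
```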