## Quantum Mathematics

Assume the following equation: **x ^ y** (this reads "x to the power of y")

The fundamental difference between quantum computing and standard transistor-based computing is that a transistor-based computer can only do one operation at a time. Basic logic tells us that the more bytes you are processing, the longer it takes. Quantum computing, on the other hand, allows a single qubit to be superpositioned into both a 0 and a 1. This means that the number of computations a quantum computer could process is **2 ^ n**, where *n* is the number of qubits. Now imagine that you have 500 qubits (about the size of... nothing). With 500 qubits, you would be able to perform **2^500** operations *in a single step*.

To put this into perspective, this number translates into **3.2733906078961418700131896968276e+150** operations in a *single* step (I did that in my head by the way). This number is much, much larger than the (estimated) total number of atoms in the universe... which we all know is *only* about **1e+79**.
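Since Python integers have arbitrary precision, you can check the 2^500 figure above exactly rather than trusting a calculator's scientific notation. A minimal sketch (the comparison against 1e+79 atoms is just the rough estimate from the post):

```python
# Number of basis states 500 qubits can hold in superposition: 2^n with n = 500.
n_qubits = 500
states = 2 ** n_qubits

print(f"{states:.4e}")   # ≈ 3.2734e+150, matching the figure above
print(len(str(states)))  # 151 digits long

# Rough comparison with the estimated ~1e+79 atoms in the universe:
atoms = 10 ** 79
print(states // atoms)   # still an astronomically large number
```

The exact integer begins 3273390607896141870013189696827..., which is where the rounded value quoted above comes from.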

A while ago, I found a cool link that listed a bunch of names for large numbers, and I had to dig it back up to see how to verbally say a number this large without using scientific notation lingo. I eventually found it again and saw that the names only go up to **10^99**. I combined two of the names to come up with a funky combination of words...

3.2733906078961418700131896968276e+150 translates to approx. **3.3 sexdecillion duotrigintillion**

and

1e+79 translates to a **mere 10 quinvigintillion**

Rewind back to the real world... I just tried processing a simple calculation on my computer, **10e+32768 ^ 2**, and it bogged down my machine for about 2 seconds... C'MON, this stuff should be taking nanoseconds... right? :)
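For what it's worth, you can re-run that calculator experiment in Python, where big-integer squaring is cheap. This is a sketch, assuming the calculator input "10e+32768" meant 10 × 10^32768 (i.e. 10^32769); actual timing will vary by machine:

```python
import time

# A number with 32770 digits, standing in for the calculator's "10e+32768".
big = 10 ** 32769

start = time.perf_counter()
result = big ** 2  # squaring doubles the exponent: 10^65538
elapsed = time.perf_counter() - start

print(len(str(result)))   # 65539 digits
print(f"{elapsed:.6f} s") # typically far less than the 2 seconds above
```

On modern hardware this completes in well under a millisecond, so the 2-second stall probably says more about that calculator app than about the arithmetic itself.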

posted on Wednesday, February 15, 2006 3:45 AM