Quantum and Probabilistic
Quantum computing is a paradigm that exploits the quantum-mechanical properties of matter and light to perform computations beyond the reach of classical computers. It has the potential to revolutionize fields such as artificial intelligence, cryptography, optimization, simulation, and machine learning. However, quantum computing also requires a different way of thinking and programming than classical computing, because uncertainty, randomness, and probability sit at its core.

Classical computing is based on deterministic bits, each of which is either 0 or 1. Classical algorithms operate on these bits using logical operations such as AND, OR, NOT, and XOR. They can also use random bits, generated by a physical process or a pseudorandom algorithm, to introduce randomness into a computation. Randomized algorithms, for example, use random bits to make probabilistic choices or estimates that can improve performance or accuracy; a minimal sketch follows below.

Quantum computing is based on qubits, which can exist in a superposition of 0 and 1, that is, in both states at once until measured. Quantum algorithms operate on qubits using quantum gates such as Hadamard, Pauli-X, Pauli-Y, Pauli-Z, CNOT, and Toffoli. They also use measurements: operations that collapse a qubit into a definite state of 0 or 1, with probabilities determined by the qubit's amplitudes. Measurements serve, for example, to extract the output of a computation or to implement feedback loops that adapt the computation to previous outcomes. The two sketches after the randomized example below illustrate gates and measurement on a toy state-vector simulator.
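To make the classical side concrete, here is a minimal sketch of a randomized algorithm: the classic Monte Carlo estimate of pi. The function name and sample count are illustrative choices for this sketch, not from any particular library.

```python
import random

def estimate_pi(num_samples: int = 100_000) -> float:
    """Monte Carlo estimate of pi: draw random points in the unit
    square and count how many land inside the quarter circle."""
    inside = 0
    for _ in range(num_samples):
        x, y = random.random(), random.random()
        if x * x + y * y <= 1.0:
            inside += 1
    # The hit ratio approximates the quarter circle's area, pi / 4.
    return 4 * inside / num_samples

print(estimate_pi())  # roughly 3.14, varying from run to run
```

The answer is only probably close to pi, and more samples tighten the estimate; this trade of certainty for speed or simplicity is the hallmark of randomized algorithms.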
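On the quantum side, the gates named above are simply unitary matrices acting on a state vector. The following sketch, written in plain NumPy and assuming a two-qubit state ordered as |q0 q1>, applies a Hadamard and then a CNOT to build a Bell state. It is a toy illustration of the math, not a production simulator.

```python
import numpy as np

# Single-qubit gates as 2x2 unitary matrices.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard: creates superposition
X = np.array([[0, 1], [1, 0]])                # Pauli-X: bit flip

# CNOT on two qubits (control = q0, target = q1) in the
# basis ordering |00>, |01>, |10>, |11>.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

# Start in |00>, put q0 into superposition, then entangle the pair.
state = np.array([1, 0, 0, 0], dtype=complex)  # |00>
state = np.kron(H, np.eye(2)) @ state          # H on qubit 0
state = CNOT @ state                           # CNOT entangles q0 and q1
print(state)  # amplitude 1/sqrt(2) on |00> and on |11>
```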
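Measurement can then be modeled by sampling basis states with probability equal to the squared magnitude of each amplitude (the Born rule). The `measure` helper below is a hypothetical name introduced for this sketch; it reuses the Bell state from the previous example.

```python
import numpy as np

rng = np.random.default_rng()

def measure(state: np.ndarray, shots: int = 1000) -> dict:
    """Sample measurement outcomes: each basis state is observed with
    probability equal to its squared amplitude (the Born rule)."""
    n_qubits = int(np.log2(len(state)))
    probs = np.abs(state) ** 2
    outcomes = rng.choice(len(state), size=shots, p=probs)
    counts = {}
    for o in outcomes:
        label = format(o, f"0{n_qubits}b")  # e.g. 3 -> '11' for 2 qubits
        counts[label] = counts.get(label, 0) + 1
    return counts

# Measuring the Bell state from the previous sketch:
bell = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)
print(measure(bell))  # roughly {'00': 500, '11': 500}
```

Each individual shot collapses to a definite 00 or 11, which is why quantum programs typically run a circuit many times and reason about the resulting distribution of outcomes.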