Simulating Physics with Computers
Nov 17, 2021
Richard Feynman's 1981 talk "Simulating Physics with Computers" covers a remarkable gamut of modern questions, some of which have since been resolved, while others remain open. The talk was given at a time when the idea of quantum computation had just begun receiving attention from computer science researchers, and abstract computational models were being proposed in much the same vein as the classical models of the 1930s. It was inevitable, therefore, that physicists would have something to say about computation, considering that quantum computers would model the quantum nature of physical reality. As Feynman quips: "Nature isn't classical, dammit!"
His talk is divided into three broad parts:
- An introductory segment where he motivates the problem (Parts 1, 2, 3 below).
- A section where he proposes a quantum mechanical simulator (Part 4).
- A section where he investigates whether classical computers could simulate the quantum nature of reality (Part 5).
In this essay, we discuss these three sections in detail and conclude with a few open questions drawn from this great expository material.
INTRODUCTION
Feynman makes an important distinction between numerically approximating partial differential equations on a continuum (the classical approach) and discretely modelling reality. Instead of a continuum, one could view space as a lattice structure, paired with discontinuous jumps in time. One could then look at this finite-volume system and model it on a computer using a finite number of logical operations. Thus, already at this point, Feynman has introduced the idea of computational complexity from the point of view of physics. How would the number of logical operations scale with arbitrarily large lattice volumes? This is essentially an algorithmic viewpoint that until then had probably been studied strictly from a mathematical standpoint. Feynman makes the connection to physical reality.
So, those are the basic ingredients: a discrete lattice structure, discontinuous jumps in time, and a finite number of logical operations. This leads to the next part of his talk: the notion of simulating time.
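The scaling question above admits a back-of-envelope sketch. Everything below is a hypothetical illustration (the function name and the per-site cost constant are made up, not from the talk): with a local update rule, each lattice site costs a fixed number of logical operations per time step, so the cost per step grows linearly with lattice volume.

```python
# Hypothetical illustration: a local rule costs a constant number of
# logical operations per site, so the per-step cost scales as side ** dims.

def ops_per_step(side, dims, ops_per_site=10):
    # ops_per_site is an arbitrary constant standing in for the cost
    # of one local rule application.
    return ops_per_site * side ** dims

for side in (10, 100, 1000):
    print(side, ops_per_step(side, dims=3))
```

Doubling the lattice side multiplies the work per step by eight in three dimensions: polynomial, hence tractable, as long as the rule stays local.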
SIMULATING TIME
Any intuitive notion of time involves a computer (or one of Feynman's cellular automata) moving from one discrete state to another. The first distinction between simulation and imitation is made here, although it is a bit unclear what Feynman means by it. In anthropology, the standard dichotomy is emulation vs. imitation: an agent that imitates does not take into account its effects on its own environment, whereas in emulation the agent adapts its behaviour to the reactions of the environment.
What can be surmised is that Feynman means that, in principle, from the point of view of classical physics ("local, causal, reversible"), a computer simulation, meaning multiple runs over the input variables, is possible. Through this simulation, Feynman wishes to better approximate a function of the form

    s_i = F_i(s_j, s_k, ...)

where each state s_i is reached from other states in discrete time steps, with the function F determining which state the computer moves to. Note that there is no Markovian assumption here. Feynman does not rule out k < i < j, meaning this function may be affected by points in both s_i's past and future.
The question that obviously arises is this: how is F available a priori? How are the boundary conditions known a priori? Feynman does not directly answer this question; he is speaking about the principle here. If there were a way to "lay out" these points s, there may be choices of F that cannot be constructed in a computer. In any case, owing to the three distinguishing features of classical physics ("local, causal, reversible"), such a simulation is certainly possible.
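The three features "local, causal, reversible" can be made concrete with a toy update rule. This is my own construction, not one from the talk: a second-order cellular automaton whose next state depends on both the current and the previous state, which makes the dynamics exactly invertible.

```python
# A sketch of a "local, causal, reversible" rule F: a second-order
# cellular automaton. Knowing two consecutive states, time can be
# run exactly in either direction.

def local_rule(state):
    # F: XOR of each cell's two neighbours on a ring -- a simple local rule.
    n = len(state)
    return [state[(i - 1) % n] ^ state[(i + 1) % n] for i in range(n)]

def step(prev, curr):
    # s_{t+1} = F(s_t) XOR s_{t-1}
    return curr, [f ^ p for f, p in zip(local_rule(curr), prev)]

def step_back(curr, nxt):
    # s_{t-1} = F(s_t) XOR s_{t+1}: the same formula inverts itself.
    return [f ^ x for f, x in zip(local_rule(curr), nxt)], curr

s0 = [0, 0, 0, 1, 0, 0, 0, 0]
s1 = [0, 0, 1, 0, 1, 0, 0, 0]

a, b = s0, s1
for _ in range(10):
    a, b = step(a, b)        # run 10 steps forward
for _ in range(10):
    a, b = step_back(a, b)   # run 10 steps back
print(a == s0 and b == s1)   # True: the dynamics are exactly reversible
```

The rule is local (each cell sees only its neighbours), causal (the future is computed from the present and past), and reversible (XOR undoes itself), which is exactly the class of dynamics Feynman says a classical computer can certainly simulate.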
ON SIMULATING PROBABILITY
Nature is quantum mechanical. Therefore, any computer that aims to model it must model probability appropriately. Again, Feynman is quick to point out the memory/storage issues with even simple quantities like P(x, t), the probability of finding a particle at position x at time t. Notice what happens when we have not a single particle but R particles on N points in space: the number of configurations N^R blows up as the input size grows arbitrarily. Can we imitate rather than simulate? Here, Feynman proposes a formulation suspiciously similar to a Markov Decision Process, although he never mentions one explicitly: given a restricted (manageable) state set, the probability of a particle being in state s_i evolves as

    P_{t+1}(s_i) = Σ_j m(s_i | s_j) P_t(s_j)
where m(.) is a typical transition matrix. By the same argument, any wave function ψ of R variables will be difficult to simulate with a classical computer. But what about quantum computers?
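Both halves of the argument can be sketched in a few lines. This is my own illustration (the matrix entries and lattice sizes are arbitrary): the configuration count N^R is hopeless to store explicitly, while evolving probabilities over a small state set with a transition matrix stays cheap.

```python
# The blow-up: R particles on N lattice points give N**R configurations.
N, R = 100, 10
print(N ** R)  # 10**20 configurations -- far too many to store explicitly

# The "imitation" alternative: evolve a probability distribution over a
# small state set with a transition matrix m, where m(s_i | s_j) is the
# probability of moving to state i given state j. Two states here:
#   m = [[0.9, 0.5],
#        [0.1, 0.5]]   (columns sum to 1)
p0, p1 = 1.0, 0.0      # start with certainty in state 0
for _ in range(50):
    # P_{t+1}(s_i) = sum_j m(s_i | s_j) P_t(s_j)
    p0, p1 = 0.9 * p0 + 0.5 * p1, 0.1 * p0 + 0.5 * p1

print(p0, p1)  # converges to the stationary distribution (5/6, 1/6)
```

Because the columns of m sum to one, (p0, p1) remains a probability distribution at every step; the distribution, not any individual trajectory, is what gets simulated.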
UNIVERSAL QUANTUM SIMULATORS
What would a basic quantum simulator of nature (which is quantum mechanical) look like? One should note here that it is strange Feynman calls a quantum computer "not a Turing Machine". It is unclear what he means by this, because during the same period, in the early 80s, Paul Benioff produced multiple works extending the Turing machine to the quantum paradigm [3], [2]. Does Feynman mean it is a non-deterministic Turing Machine? That would make sense: he discusses probabilistic computers earlier in the talk, so perhaps he means one and the same thing. But it must be noted that one still needs transition matrices to relate computations on quantum Turing machines to the classical/probabilistic Turing machine paradigm. More interesting still is the question he asks: can the classification of problems by solvability for Turing-computable functions (P, NP, PPP) be extended to quantum Turing machines? He claimed this was simple, although it took close to two decades before some parts of the puzzle were figured out, namely the complexity classes BQP and QMA. The former (Bounded-error, Quantum, Polynomial time) is the class of problems solvable by a quantum Turing machine in polynomial time with some bounded error. The latter (Quantum Merlin-Arthur) is the class of problems verifiable by a quantum Turing machine in polynomial time. The relationship of BQP to QMA is analogous to that of P to NP. The known inclusions among these classes form the chain P ⊆ BPP ⊆ BQP ⊆ QMA ⊆ PSPACE.
Examples of problems in BQP include integer factorisation and the discrete logarithm problem. Examples of QMA problems include detecting insecure quantum encryption and the quantum clique problem (a quantum analogue of the clique problem for an arbitrary graph G), among others [4]. Multiple questions in this area remain open: What is the relationship of NP to BQP? What is the relationship of BPP (Bounded-error, Probabilistic, Polynomial time) to BQP?
QM SYSTEMS WITH PROBABILISTIC CLASSICAL COMPUTERS?
The final question of Feynman's talk: can quantum mechanical systems be simulated by probabilistic classical computers? The short answer: no. The reasons for this go back to the earliest experiments uncovering the strange behaviour of quantum reality - Aspect/Gisin/Zeilinger, Stern/Gerlach, Mach/Zehnder [1] - which led to observations of "entanglement" (although this term came into being much later), leading to the EPR paradox, and finally culminating in Bell's famous theorem. We summarise this history (possibly unfairly) in three very short steps:
- Given the state of a bit of information (say the polarisation of a photon) shared between two particles (two states of polarisation), and two observers, the measurement made by one observer determines the other variable's state deterministically, not probabilistically.
- To counter the observation of "spooky action at a distance", EPR postulated the presence of local, hidden variables, keeping in line with Einstein's local realism of special relativity.
- Bell's theorem states that any local, hidden variable theory leads to inequality predictions in measurements at a distance, inequalities that are violated by quantum theory (not to mention that experiments match the predictions of quantum theory).
Quantum "non-locality" does not violate special relativity, but measurement correlations in multi-variable quantum systems contradict local realist views of nature (such as local hidden-variable theories).
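The violation can be checked numerically. This sketch uses the standard singlet-state correlation E(a, b) = -cos(a - b) predicted by quantum mechanics, together with the usual CHSH measurement angles:

```python
import math

# CHSH check: any local hidden-variable theory obeys |S| <= 2, where
#   S = E(a, b) - E(a, b') + E(a', b) + E(a', b').
# Quantum mechanics predicts E(a, b) = -cos(a - b) for the singlet state.

def E(a, b):
    return -math.cos(a - b)

a, a_p = 0.0, math.pi / 2              # observer A's two angle settings
b, b_p = math.pi / 4, 3 * math.pi / 4  # observer B's two angle settings

S = E(a, b) - E(a, b_p) + E(a_p, b) + E(a_p, b_p)
print(abs(S))  # 2*sqrt(2) ~= 2.828, exceeding the classical bound of 2
```

The value 2√2 is Tsirelson's bound, the maximum quantum violation; no assignment of pre-existing local values to the four measurements can push |S| past 2.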
In short, Bell's theorem makes it impossible for any classical, probabilistic (or deterministic) Turing Machine to exactly model quantum mechanical systems.
OPEN QUESTIONS
Feynman's talk is at once influential and notationally useful. Many of the more fundamental questions posed by the physics community, who wished to integrate the mechanistic aspect of the modern computer with their conception of reality, were summarised succinctly in it. Some of the major questions he asked have since been answered (although not fully), and the dream of a modern, functional quantum computer is still in the works. Some of the open questions that can be surmised from this work are the following:
- The relationship of NP to BQP. Are problems verifiable in polynomial time solvable by a quantum computer in polynomial time? In a sense, is the question of P = NP complete (poor pun?)?
- Does Feynman's representation of Bose particles extend to Fermi particles too? What does this formulation look like?
Inevitably, the decade following this talk, and the one after, heralded fundamental results in quantum computing. These include the extension of the Church-Turing thesis to quantum computers, now known as the Church-Turing-Deutsch principle, and Shor's algorithm, which uses a quantum computer to factor integers in polynomial time, among many others.
If reality is indeed quantum mechanical (the evidence is overwhelmingly in favour), then would a practical realisation of quantum computers not constitute a step forward in developing our understanding of it? In turn, would intelligence not need to be redefined in light of such developments? In upcoming essays, we will try to reach a consensus on a definition of intelligence, and try to tie the same to the consequences of questions posed in this essay.
The original talk can be found here.
References
[1] Ulla Aeschbacher, Arne Hansen, and Stefan Wolf. Invitation to Quantum Informatics. Zurich: vdf Hochschulverlag AG an der ETH Zürich, 2020. isbn: 978-3-7281-3988-7. doi: 10.3929/ethz-b-000395060.
[2] Paul Benioff. "Quantum Mechanical Models of Turing Machines That Dissipate No Energy". In: Phys. Rev. Lett. 48 (23 June 1982), pp. 1581–1585. doi: 10.1103/PhysRevLett.48.1581. url: https://link.aps.org/doi/10.1103/PhysRevLett.48.1581.
[3] Paul Benioff. "The computer as a physical system: A microscopic quantum mechanical Hamiltonian model of computers as represented by Turing machines". In: Journal of Statistical Physics 22.5 (May 1980), pp. 563–591. doi: 10.1007/BF01011339.
[4] Adam D. Bookatz. QMA-complete problems. 2013. arXiv: 1212.6312 [quant-ph].