
The Ultimate Laptop

Sep 15, 2023

A lot of the material world can be described with an AND, OR, XOR, NOT - When is something true? When is something not true? When is something and something else true together? When are they not? The only constant is the ‘is’, to be, the state - Dasein. Boolean logic encapsulates states of particles like no other system of thought. To be, or not to be, but also to be, and not to be.

It turns out, there are interesting ways to think about states of particles in physics. A state could be arbitrary - the spin of an electron (up or down), the polarization of a photon (horizontal or vertical), but states could also be encoded into electrostatic interactions between charged particles. A state is anything that holds and dissipates information. And the moment we talk about information, about data, it is inevitable that we talk about bits, operations, and energy. And the moment we talk about energy, we talk about mass, memory, and entropy. Two orders beyond the laptop you’re reading these words on, there is mass, there is memory, there is entropy.

The need for formalizing rules led to the creation of the modern computer. The modern computer was created because there was no other way to manage the amount of data that existed in the world at the time. It’s the same for computer networks, for physical servers, and subsequently, the need for the internet - Too much data, and the material world is not enough. Today, the internet’s exponential burst in data-generating capabilities means new computational processes, such as deep learning, are required. ChatGPT was born because there was nothing else we could do.

But where does that leave the humble old bit? How much more can we do before it’s enough?

In “Ultimate Physical Limits to Computation”, Seth Lloyd at MIT declares:

Energy limits speed. Entropy limits memory.

It’s a fascinating paper because it targets the end of computation - Beyond Moore’s Law, beyond AI, beyond deep learning.

Lloyd is interested in the maximum amount of information a physical system can process, and how fast it can do so.

The starting point, as with everything, is Heisenberg.

Most people are introduced to the position-momentum form of the uncertainty principle (which bounds how precisely a particle’s position and momentum can be known at once), but another useful way to think about it is through the energy-time form:

\Delta{E}\Delta{t} \geq \frac{h}{4\pi}

where Planck’s constant makes its usual appearance. One can think about it as a change in energy states over a period of time (conversely, for an infinitesimal period of time, the energy measurement of a particle or system is meaningless). But the usefulness of this interpretation is what it allows us to do - Calculate how long it takes a system to reach a distinguishable energy state.
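As a quick numerical sanity check (my own arithmetic, not from Lloyd's paper), here is that calculation in Python, using the conventional form of the bound, ΔEΔt ≥ h/4π; the 1 eV energy spread is just an illustrative value:

```python
import math

H = 6.62607015e-34  # Planck's constant, J*s

def min_evolution_time(delta_e_joules: float) -> float:
    """Lower bound on the time a system with energy spread delta_E
    needs to evolve to a distinguishable state: dt >= h / (4*pi*dE)."""
    return H / (4 * math.pi * delta_e_joules)

# Illustrative example: an energy spread of 1 eV (~1.6e-19 J)
one_ev = 1.602176634e-19
print(min_evolution_time(one_ev))  # roughly 3.3e-16 seconds
```

Larger energy spreads mean faster transitions between distinguishable states, which is exactly the lever the rest of the argument pulls on.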

And why is that useful? Well, any logical operation on any bit requires an amount of energy. Using this interpretation, one can then bound the number of bits per second any physical system can process relating to the change in its energy states.

Thus, on a per-second basis, the total number of bit operations would be at most

\frac{2E}{\pi \hbar} = \frac{4E}{h}

This is fascinating because it means no physical system, classical or quantum, can move between distinguishable energy states in a time shorter than Δt. Which means the speed of any logical operation is bounded by the system’s energy.

So imagine The Ultimate Laptop - Mass: 1 kg. Quantum gates running amok under the hood, supreme parallelization, the edge of Moore’s Law without breaking physics. The Ultimate Laptop is the fastest, greatest computing machine ever built.

Yet, Einstein dictates the total energy available to this system (as stored in its mass) to be

E = mc^2 = 8.98 \times 10^{16} \text{ Joules}

Plugging in the rest of the constants leads to a cold, hard number:

5.42 \times 10^{50} \text{ ops/sec}

That’s all we can do. Give or take a few orders of magnitude. That’s not the limit Moore’s law imposes on us. It is also not the limit GPU architectures or fabrication engineering impose. That’s the number physics imposes on us.
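The arithmetic behind that number is short enough to sketch (a back-of-the-envelope check, not code from the paper):

```python
import math

C = 2.99792458e8        # speed of light, m/s
HBAR = 1.054571817e-34  # reduced Planck constant, J*s

def max_ops_per_second(mass_kg: float) -> float:
    """Lloyd's speed limit: a system with total energy E = m*c^2
    can perform at most 2E / (pi * hbar) operations per second."""
    energy = mass_kg * C**2
    return 2 * energy / (math.pi * HBAR)

print(f"{max_ops_per_second(1.0):.2e}")  # ~5.43e+50 ops/sec for 1 kg
```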

And what about memory?

One end of the stick is how fast we can process information, the other, inextricable end is how much we can process.

This is, at bottom, a discussion about entropy. A source of energy (such as a battery or a current) manipulates a bit of data. The manipulation uses up the energy of operation (the cost of doing business), and the remainder goes back to the battery (savings account). Lloyd uses a bunch of finance metaphors in a physics paper, which is why it’s such a fun read as well. He regularly cites the “exchange rate” of free energy, the “net profit” gained from a bit manipulation, etc. I like this approach because, at an abstract level, the laws of conservation are not just restricted to (or liberated from) physics. They apply everywhere.

The number of bits of memory relates to the entropy S through Boltzmann’s constant:

I = \frac{S}{k_B \ln 2}

where I is the number of bits of information, and k_B is Boltzmann’s constant.

Combining this equation with the bound above on operations per second leads to an elegant limit on the number of bits of information the Ultimate Laptop can store:

2.13 \times 10^{31} \text{ bits}

Lloyd does more fancy physics to show how you can do a bit better, but

10^{31}

is the ballpark.
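The conversion from entropy to bits is a one-liner; note that the entropy value below is back-solved from Lloyd’s ~2.13 × 10³¹-bit result rather than derived here, so treat it as an assumption:

```python
import math

K_B = 1.380649e-23  # Boltzmann's constant, J/K

def entropy_to_bits(entropy_j_per_k: float) -> float:
    """I = S / (k_B * ln 2): bits encodable in a system of entropy S."""
    return entropy_j_per_k / (K_B * math.log(2))

# Assumed entropy for 1 kg in 1 L, back-solved from the paper's
# ~2.13e31-bit figure (roughly 2e8 J/K); illustrative, not derived.
print(f"{entropy_to_bits(2.04e8):.2e}")  # ~2.13e+31 bits
```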

What I do not want lost in the equations and the numbers (and Lloyd doesn’t either) is the hilarious absurdity of it all - The Ultimate Laptop’s memory signature (1 kg of mass and a 1 L volume) would be akin to plasma at a billion kelvin.

This object cannot be delivered on Amazon Prime.

Then, Lloyd does more fancy physics to show that the absurdity can be mitigated by storing bit information more efficiently, the details of which are important only for engineers.

What’s important are the two numbers: the limit on speed, and the limit on memory.

When it comes to intelligence, whether artificial or human, what often gets washed away in the narrative flow is the supreme efficiency of our systems and the economies of scale.

If GPT-4 does not impress you, GPT-X will. As these limits show us, there is a long, long way to go before physics breaks down - for intelligence to emerge, Moore’s law may be the last thing we need to worry about.