
Escaping the heat death of the universe

This article covers a speculative system that would allow us to circumvent the heat death of the universe and live for an arbitrarily long time in realities of our own design. (It was largely motivated by a recent playthrough of SOMA - highly recommended.)

This article is incomplete.

Given that gravity warps spacetime, we might infer that black holes made of theoretical negative mass might induce inverted time dilation. If this were true, and if we were also able to create high-performance computers on the scale of individual particles, then we could conceivably live arbitrarily close to forever by loading our consciousness onto femtocomputers and firing them close to the aforementioned gravitational singularities. The theoretical temporal effects would accelerate computation, thus allowing us to live for arbitrarily close to an infinite amount of time.

Introduction#

In the book Schild's Ladder by Greg Egan, a technology called femtomachines is used for high-performance computation. A similar technology is employed in Cixin Liu's The Three-Body Problem, where an advanced alien civilization known as the Trisolarans produces 11-dimensional supercomputers endowed with artificial intelligence - roughly the same size as a proton.

Of course, this is science fiction - but suppose we invent such a technology in the future. It might allow us to circumvent the heat death of the universe. To be clear, we couldn't do it with femtocomputers alone. We would also require the following propositions to be true.

  1. Negative mass singularities either (A) occur naturally, or (B) can be engineered artificially.
  2. Time dilation effects around negative mass singularities accelerate the flow of time from the perspective of outside observers.

Setting aside the absurdity of these premises, let's assume for the sake of this thought experiment that they are true, and that we have femtocomputers.

Baseline estimates#

First we have to estimate upper and lower bound figures for the information storage and computational capacity of a femtocomputer. Once we have these, we can contrast this with analogous metrics for the human brain to produce a guesstimate figure on how many people we could upload, and how many years they could collectively live.

Given the speculative nature of the task, the element we select to host the computer is somewhat arbitrary. Intuitively I feel like Plutonium would be a good fit.

Why Plutonium?

To maximize the effect of time dilation, we need to get the atom as close to the singularity as possible. Accordingly, we would want to minimize the atomic radius. I used this chart to check the atomic radii of various elements. Whether this is relevant in such extreme conditions, I do not know. On the other hand, we also might want to maximize the mass of the atom, in order to maximize the computational capacity of the system; source.

Quantum Computer Precedent

Femtocomputers would differ substantially from modern quantum computers. We aren't simply executing Shor's algorithm; rather we are attempting to simulate a mind (or preferably, many minds), within at most a single atom. As such, I doubt we can make inferences about femtotech - fictional standalone systems - on the basis of the features of real quantum computers. The latter must be cooled to near absolute zero to work, require cumbersome physical support infrastructure, and are only useful when many qubits have been entangled with each other (I think(?) - don't quote me on that).

Storage estimates#

Femtocomputer storage#

Instead of calculating a specific value, I've opted to estimate upper and lower bound values. The specific process is included at the bottom of this section. The lower bound of storage capacity is \(30.5 \space bytes\) - around seven UTF-8 characters at their maximum width of four bytes. As for the upper bound... If we took every grain of sand on Earth and put the entire internet into each one, we could call that volume of data one "internet sand", or \(IS_0\) for short. If we took every grain of sand on Earth and put an \(IS_0\) into each one, we could call that volume of data \(IS_1\). At its upper bound, the femtocomputer could store just over two copies of \(IS_1\), with a capacity of ~\(1.6 \times 10^{61} \space bytes\).
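
Stated symbolically (with \(N_{sand}\) standing for the assumed number of grains of sand on Earth and \(B_{net}\) for the size of the 2025 internet in bytes - both labels mine, not from the sources):

\(IS_0 = N_{sand} \times B_{net}, \qquad IS_1 = N_{sand} \times IS_0 = N_{sand}^{2} \times B_{net}\)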

Upper & Lower Bound Calculations

We'd need to decide whether the volume out to the atomic radius counts as a storage/compute medium, or whether we are just considering the atomic nucleus. Of course, I don't know which is the case, but to be conservative (though, really, I don't know what is and isn't a conservative estimate here), I selected just the atomic nucleus.

Furthermore, let's assume that the absolute upper bound of information density is \(1 \space bit\) of information for every discrete cubic volume with sides of \(1\) Planck length. Let's assume the lower bound is \(1 \space bit\) per proton or neutron. This means that the lower bound on the storage capacity of the femtocomputer is just the number of protons plus the number of neutrons, or \(94+150=244 \space bits\), or \(30.5 \space bytes\). The upper bound calculation is slightly more complex.

For simplicity's sake, we're treating the neutron and proton radii as the same. We're also rounding the Planck length; given the speculative nature of the exercise, significant figures probably don't matter here.

First we need to calculate the total volume available to us in the nucleus - I'm using \(m^3\) to standardize units. We'll assume the nucleons are spherical; thus the volume of the nucleus is:

\(244 \times \frac{4}{3} \pi \times (8 \times 10^{-16} \space m) ^{3} = 5.2 \times 10^{-43} \space m^3\)

Each cubic Planck volume stores \(1\) bit:

\(\frac{(1.6 \times 10^{-35} \space m)^{3}}{bit} = 4.1 \times 10^{-105} \space m^3/bit\)

Because the Planck length is so small, we'll treat each cubic Planck volume as a voxel. Accordingly, we can simply divide the nuclear volume by the voxel volume to find the theoretical upper bound:

\(\frac{5.2 \times 10^{-43} \space m^3}{4.1 \times 10^{-105} \space m^3/bit} =\) ~\(1.28 \times 10^{62} \space bits \space =\) ~\(1.6 \times 10^{61} \space bytes\)

The result of this calculation is ~\(1.6 \times 10^{61} \space bytes\) - a lot of capacity, to say the least. For context, the market intelligence firm IDC estimates that the internet will be 175 zettabytes by 2025. One zettabyte is \(10^{21} \space bytes\). Dividing the femtocomputer's capacity by that 175-zettabyte figure tells us our hypothetical femtocomputer has an upper bound storage limit of ~\(9 \times 10^{37}\) internets - as of 2025.
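
If you'd like to replay the arithmetic yourself, here's a minimal Python sketch of the estimate above. It uses only the assumptions already stated in this section (244 nucleons, a ~0.8 fm nucleon radius, a rounded Planck length, one bit per Planck voxel, and IDC's 175-zettabyte internet); the constant names are mine.

```python
# Sketch reproducing the upper- and lower-bound storage estimates above.
import math

NUCLEONS = 244                 # protons + neutrons in Pu-244
NUCLEON_RADIUS_M = 8e-16       # ~0.8 femtometres, protons and neutrons treated alike
PLANCK_LENGTH_M = 1.6e-35      # rounded Planck length
INTERNET_BYTES_2025 = 175e21   # IDC's 175-zettabyte figure for 2025

nucleus_volume = NUCLEONS * (4 / 3) * math.pi * NUCLEON_RADIUS_M ** 3  # ~5.2e-43 m^3
planck_voxel = PLANCK_LENGTH_M ** 3                                    # ~4.1e-105 m^3 per bit

upper_bound_bits = nucleus_volume / planck_voxel   # ~1.3e62 bits
upper_bound_bytes = upper_bound_bits / 8           # ~1.6e61 bytes
lower_bound_bytes = NUCLEONS / 8                   # 1 bit per nucleon -> 30.5 bytes

print(f"lower bound: {lower_bound_bytes} bytes")
print(f"upper bound: {upper_bound_bytes:.1e} bytes")
print(f"2025 internets at the upper bound: {upper_bound_bytes / INTERNET_BYTES_2025:.1e}")
```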

Of course, given that the calculation is based on an entirely speculative chain of reasoning, it has very little pragmatic purpose besides serving as a waypoint for an absolute upper bound which will probably never be reached.

Note that even though the margin between the lower and upper bound would better be described as a chasm, this does not impact our analysis.

Finally, for the grain of sand calculation, I used this estimate, and applied the 2025 internet figure.
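
As a rough sketch of how the "internet sand" units play out numerically, here is the same idea in Python. The grain-of-sand count below is an assumption on my part (the ~\(7.5 \times 10^{18}\) figure often quoted for Earth's beaches) rather than the estimate linked above, and the final ratio is very sensitive to it.

```python
# "Internet sand" units, under an ASSUMED grain-of-sand count.
SAND_GRAINS = 7.5e18           # assumed grains of sand on Earth (estimates vary widely)
INTERNET_BYTES_2025 = 175e21   # IDC's 175-zettabyte figure for 2025
FEMTO_UPPER_BYTES = 1.6e61     # femtocomputer upper bound derived above

IS_0 = SAND_GRAINS * INTERNET_BYTES_2025   # one internet in every grain of sand
IS_1 = SAND_GRAINS * IS_0                  # one IS_0 in every grain of sand

print(f"IS_0 ~ {IS_0:.1e} bytes, IS_1 ~ {IS_1:.1e} bytes")
print(f"copies of IS_1 at the upper bound: {FEMTO_UPPER_BYTES / IS_1:.1f}")
```

With this particular grain count the ratio lands closer to one and a half copies than two; a smaller sand estimate pushes it back up.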

Brain storage#

According to this paper by Thomas Bartol et al., the human brain can store around \(4.7\) bits of information per synapse. According to this paper by Jiawei Zhang, the brain has around \(1000\) trillion synapses. Multiplying these two figures together comes out to around \(4.7 \times 10^{15}\) bits, or roughly \(0.6\) petabytes, which is within an order of magnitude of the \(2.5\) petabyte figure obtained if one googles "storage capacity of human brain petabytes" and runs with the first result. On the other hand, we really don't know much about the brain; the estimated figure has consistently grown over time, so it is possible the true figure is much higher.

Let's assume that the lower bound storage requirement for the human brain is \(2.5\) petabytes. Increasing this figure does a disservice to our plan to save humanity, as it means we can fit fewer people on each femtocomputer. In the interest of covering all possibilities, however, we'll add three orders of magnitude and speculate that perhaps each brain requires \(2500\) petabytes of storage space as an upper bound.
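
Putting those numbers side by side - the synapse-based estimate versus the working bounds just chosen - here is a quick sketch (variable names mine):

```python
# Per-brain storage figures used in this section.
BITS_PER_SYNAPSE = 4.7    # Bartol et al.
SYNAPSES = 1e15           # ~1000 trillion (Zhang)

synapse_estimate_bytes = BITS_PER_SYNAPSE * SYNAPSES / 8   # ~5.9e14 bytes (~0.6 PB)
BRAIN_LOWER_BYTES = 2.5e15                                 # working lower bound: 2.5 PB
BRAIN_UPPER_BYTES = 2.5e18                                 # working upper bound: 2500 PB

print(f"synapse-based estimate: {synapse_estimate_bytes:.1e} bytes")
print(f"working bounds: {BRAIN_LOWER_BYTES:.1e} to {BRAIN_UPPER_BYTES:.1e} bytes")
```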

Could the brain actually have that much 'disk space'?

Perhaps, though it is unlikely. The figure is really meant to establish an absolute upper bound. There is relevant research suggesting that "a gene crucial for learning, called Arc, can send its genetic material from one neuron to another by employing a strategy commonly used by viruses" - but this does not imply our brains use the epigenome or genome for storage.

Do they? I genuinely have no idea. If they do, then that could increase the storage capacity substantially. To be clear, I'm not saying they do - simply pointing out the uncertainty associated with these estimates. BTW, I found the linked research in a video by Anton Petrov - highly recommend.

Comparing storage#

In general, we can compute the number of people we could store on the femtocomputer with a function \(f\), which simply divides the estimated storage capacity of the computer by the storage required for a mind. Visualizing this relationship generates the following graph.

![[person-capacity-femtocomputer.png]]

[whoops, I need to find where this image was]

Unsurprisingly, the main factor is not the storage requirement of the brain, which spans only three orders of magnitude, but the storage capacity of the femtocomputer, which spans around sixty orders of magnitude.
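
For concreteness, here is a minimal sketch of the function \(f\) described above, evaluated at the four corners of the bounds derived earlier (the names are mine):

```python
def f(femto_bytes: float, brain_bytes: float) -> float:
    """Number of minds that fit: femtocomputer storage divided by storage per mind."""
    return femto_bytes / brain_bytes

FEMTO_BOUNDS_BYTES = (30.5, 1.6e61)     # femtocomputer lower/upper storage bounds
BRAIN_BOUNDS_BYTES = (2.5e15, 2.5e18)   # per-brain lower/upper storage bounds

for femto in FEMTO_BOUNDS_BYTES:
    for brain in BRAIN_BOUNDS_BYTES:
        # At the 30.5-byte lower bound the result is effectively zero people.
        print(f"femto={femto:.1e} B, brain={brain:.1e} B -> {f(femto, brain):.1e} people")
```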

Of course, we will also need to store all the assets for constructing reality; however, this is negligible compared to the brain and can be discounted. We know this because it takes no more than the storage of a single brain to hold all the assets associated with building a reality at least as complex as the one we perceive - your brain is doing it, after all. And since those assets could be reused for every consciousness, we would only need to store them once.
