Jorge Luis Borges’ story “The Library of Babel” has long been an obsession of mine. The 1941 short story[1] posits a library that contains every possible book-length[2] combination of words. It’s probably my second-favorite short story; I think about it all the time and teach it whenever I can. I once even wrote a program to output the digits of 25^1,312,000, the number of distinct books the Library of Babel contains, which produced a 2 MB text file of mostly zeros. So when my friend Tony Tulathimutte (about whom I’ve written before) asked me to consult on a “Library of Babel”-inspired essay he is writing on the algorithmic generation of literature, I was happy to help. Tony asked:
Even if 25^1,312,000 is beyond astronomically large, I’m interested in getting as close as possible to a non-theoretical implementation of the Library. Can we work on a Fermi estimate of what it would take to assemble the library? Like, if we distributed the workload to every computer on Earth, or used the world’s fastest supercomputer (China’s Tianhe-2, 33.86 petaflops), or even assembled a Douglas-Adams-style Deep Thought Computational Matrix made of human brains (the human brain runs at an estimated 36.8 petaflops)? Or if Moore’s law holds, at what point would the processing power on Earth suffice to create the Library within the lifespan of the universe?
This is a completely reasonable question, but one that illustrates just how unnatural it is to think about numbers that are “beyond astronomically large.” The number of books in the Library of Babel is so big, no set of adjectives can meaningfully capture its hugeness. After all, things like petaflops or the computational capacity of the human brain are also too big to really conceptualize. So it makes sense that one might treat them all as members in equal standing of the Numbers Too Big To Think About club. But they aren’t. Here are three illustrations of the absurd magnitude of the Library of Babel.
First we’ll look at the initial question: how long would it take to generate the Library of Babel? Instead of addressing it the way Tony suggests, though, let’s approach the problem from the opposite direction: what is the fastest it’s possible to imagine generating the Library of Babel?
The Heisenberg uncertainty principle implies that there is a smallest possible size something can be, and a shortest possible time in which something can happen. These minimum quantities are built into the basic workings of the universe, and are called the Planck units. The Planck time is equal to about 5.391 × 10^-44 seconds. It isn’t physically possible for an event to occur in less time than that. Let’s imagine that we have computers capable of generating one Library of Babel Book (LoBB) per unit of Planck time. How many of these computers? Let’s be ambitious: through some impossible alchemy, we will now turn every single atom in the observable universe into a computer capable of generating one LoBB per unit of Planck time.
There are on the order of 10^80 atoms in the observable universe. So let’s say we have that many computers… what’s that? Oh, you’re asking, “but what about dark matter?” It’s true. Scientists think there might be five times as much dark matter in the universe as there is ordinary matter. So let’s be generous and bump our total up by a factor of ten. We’ll say we have 10^81 computers, each of which generates one LoBB per unit of Planck time. So, if we have 10^81 computers, each generating about 10^43 LoBBs per second, that means we generate 10^124 LoBBs every second, or about 10^131 LoBBs per year.
There are 25^1,312,000 possible LoBBs, which is on the order of 10^1,834,097. At a rate of 10^131 LoBBs per year, it will take 10^1,833,966 years to finish making the whole Library, or on the order of 10^(10^6) years. Take a quick look at Wikipedia’s timeline of the far future. You’ll notice that the time when we finish making the Library at the fastest imaginable rate would be one of the last items on the list, coming well after the entire universe is a cold, dead, iron cinder.
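If you’d like to check that arithmetic yourself, here is the whole estimate as a few lines of Python, using the rounded inputs from above (Planck time, 10^81 computers, about 3.15 × 10^7 seconds per year); working in base-10 logarithms keeps the numbers manageable:

```python
import math

PLANCK_TIME = 5.391e-44     # seconds: the shortest physically possible interval
N_COMPUTERS = 1e81          # one computer per atom, bumped up tenfold for dark matter
SECONDS_PER_YEAR = 3.15e7

# Each computer emits one LoBB per unit of Planck time
log10_lobbs_per_year = math.log10(N_COMPUTERS / PLANCK_TIME * SECONDS_PER_YEAR)  # ~131.8

# Total number of distinct LoBBs: 25^1,312,000
log10_total_lobbs = 1_312_000 * math.log10(25)   # ~1,834,097.3

# Years to generate them all at the fastest imaginable rate
log10_years = log10_total_lobbs - log10_lobbs_per_year   # ~1,833,965.5
```

Rounding gives the figures in the text: about 10^131–10^132 LoBBs per year, and about 10^1,833,966 years to finish.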
So the answer to Tony’s question is: never.
2. World Enough
But maybe you noticed that I cheated a little. I said I would consider the fastest it’s possible to imagine generating LoBBs, but calculated based on the fastest it would be physically possible to make them. We can imagine things faster than that, though. We can imagine just snapping our fingers and–poof!–a complete Library of Babel made in an instant. So, why not? Let’s consider that case. We now have the power to instantly assemble a Library of Babel.
Assemble it… out of what? I mean, what are we going to make the literal books out of? Not out of atoms; we already said that there are, generously, 10^81 atoms worth of matter in the observable universe. Even if we could somehow encode a LoBB in every atom, we wouldn’t come close to making 10^(10^6) of them. Not even if we could make a LoBB out of every subatomic particle.
The universe just doesn’t have enough stuff in it to make the Library of Babel.
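To put a number on that shortfall: even granting ourselves a deliberately inflated 10^90 particles to work with (my assumption, well above the 10^80 atom count), one LoBB per particle covers a vanishing fraction of the Library:

```python
import math

log10_books = 1_312_000 * math.log10(25)   # ~1,834,097: books in the Library
LOG10_PARTICLES = 90                       # deliberately generous particle count

# Fraction of the Library we could encode at one LoBB per particle
log10_fraction = LOG10_PARTICLES - log10_books   # ~ -1,834,007
```

That is roughly 1 part in 10^1,834,007: rounding the amount of stuff in the universe up by ten orders of magnitude doesn’t even dent the exponent.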
3. Vaster Than Empires
So let’s add more stuff. We’ve already given ourselves the power to instantly reconfigure every atom in the universe. Why not give ourselves the power to make new matter out of nothing while we’re at it? What happens then?
Turns out, even if we could conjure enough new matter to make the Library of Babel, the universe itself would be too small to hold it.
There’s a weird and fascinating result from black hole physics called the holographic principle, which says that all the information needed to describe a volume of space, down to the minutest quantum detail, only ever takes as much space to encode as the surface area of the volume.[3] That is, if you wanted to write down all the information necessary to perfectly describe every detail of what’s inside a room, you would always be able to fit all the information on just the walls. In this way, the entire universe can be thought of as a three dimensional projection of what is, on the level of information, a strictly two dimensional system. Sort of like a hologram, which is 2D but looks 3D, a metaphor from which the principle gets its name.
In any normal region of the universe, the amount of information in a given volume will actually be much less than what you could encode on its surface area. For reasons having to do with thermodynamics that are too complicated to go into here, when you max out the amount of information a volume of space can contain, what you have is a black hole.[4] Now, remember those Planck units from the beginning? Length was one of them; there’s a smallest possible size that the laws of nature will let something be, and we can use that length to define a new unit, the Planck area. The most efficient possible encoding of information, per the holographic principle, is one bit per unit of Planck area, which is on the order of 10^-70 square meters.
The observable universe has a radius of around 4.4 × 10^26 meters. That gives it a surface area on the order of 10^53 square meters, which means it can hold 10^123 bits of information. That’s just the observable universe, though; the whole universe is much, much bigger. We aren’t sure exactly how much bigger, since the rest of it isn’t observable, but inflationary universe theory, which just got some strong confirming evidence, provides an estimate that the whole universe is 3 × 10^23 times larger than the part of the universe we can see. Carry out the same calculations, and the estimated size of the whole universe means that it can contain 10^170 bits of information. As for the Library, if you assume that it takes a string of six bits to encode one of a set of 25 characters, then the whole Library of Babel would require a number of bits on the order, once again, of 10^(10^6). Even if we demiurgic librarians do violate the law of conservation of energy to bring the Library into being, the entire universe would collapse into a black hole long before we finished our project.
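Here’s the same back-of-envelope in Python. I’m assuming a Planck length of about 1.616 × 10^-35 meters and exactly one bit per Planck area, with no extra factors, so the intermediate results land within roughly an order of magnitude of the figures above:

```python
import math

PLANCK_LENGTH = 1.616e-35            # meters
PLANCK_AREA = PLANCK_LENGTH ** 2     # ~2.6e-70 square meters
R_OBSERVABLE = 4.4e26                # radius of the observable universe, meters

# One bit per Planck area on the bounding surface
surface_area = 4 * math.pi * R_OBSERVABLE ** 2
log10_bits_observable = math.log10(surface_area / PLANCK_AREA)   # ~124

# Whole universe: linear scale factor ~3e23, so surface area grows by its square
log10_bits_universe = log10_bits_observable + 2 * math.log10(3e23)   # ~171

# The Library: ~10^1,834,097 books, 1,312,000 characters each, six bits per character
log10_bits_library = 1_312_000 * math.log10(25) + math.log10(1_312_000 * 6)  # ~1,834,104
```

The gap is the whole story: the universe tops out around 10^170 bits, and the Library needs about 10^1,834,104, which is 10^(10^6) for any reasonable rounding.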
So: the Library of Babel is so large that the universe isn’t going to be around long enough to make it. And even if it was, there isn’t enough matter and energy to do it. And even if there was, before that point all of reality as we know it would be destroyed. That is how extreme things can get when you start dealing with “beyond astronomically large” numbers.
[2] As described by Borges: 25 symbols, 80 symbols per line, 40 lines per page, 410 pages. ↩
[4] This is because, physically speaking, information is the same thing as entropy. ↩