Possible Insight

The Universe Is a Giant Computation Engine

This is my hypothesis based on a relatively consistent diet of physics books over the last couple of decades.  Now, this hypothesis is by no means original to me (see here).  But it is certainly not the dominant view among physicists, and most laypersons are probably unaware it even exists.

The latest bit of data that reinforces my belief in the Computation Engine Hypothesis is From Eternity to Here, by Sean Carroll.  In this book, Carroll tries to explain the arrow of time using the concept of entropy from the perspective of statistical mechanics.

The basic idea behind entropy is to compute the number of physical microstates (e.g., positions of each molecule of oxygen) that correspond to the same physical macrostate (e.g., the physical distribution of those molecules in a jar).  If a macrostate has lots of different corresponding microstates, it’s “ordinary” and has high entropy.  If a macrostate has only a few corresponding microstates, it’s “special” and has low entropy.  There are a lot more ways to arrange a set of oxygen molecules so they are uniformly distributed than there are to arrange them in the shape of a duck, so the former has high entropy while the latter has low entropy.
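To make that disparity concrete, here’s a minimal sketch (my own toy illustration, not from Carroll’s book) that counts microstates for N molecules that can each sit in the left or right half of a jar.  The “uniform” macrostate (half on each side) has astronomically more microstates than a “special” one (all molecules on one side).

```python
from math import comb, log

N = 100  # toy number of oxygen molecules in the jar

# A macrostate is "k molecules in the left half"; its microstate count
# is the number of ways to choose which k molecules sit on the left.
uniform = comb(N, N // 2)   # balanced macrostate: 50 left, 50 right
special = comb(N, 0)        # "special" macrostate: all 100 on the right

print(f"microstates (uniform): {uniform:.2e}")   # ~1.01e+29
print(f"microstates (special): {special}")       # exactly 1

# Boltzmann's entropy is S = k_B * ln(Omega); comparing ln(Omega) is enough.
print(f"ln(Omega), uniform: {log(uniform):.1f}")  # ~66.8
print(f"ln(Omega), special: {log(special):.1f}")  # 0.0
```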

High entropy states occur more frequently than low entropy states.  So any interaction tends to increase entropy because transitions to more common states are more likely.  Thus the arrow of time is a statistical property of dynamic behavior.
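A quick way to see this statistical arrow emerge is the classic Ehrenfest urn model.  The sketch below (again my own toy example, not Carroll’s) starts with every molecule on one side and moves a randomly chosen molecule across at each step; the system drifts toward the balanced, high-entropy macrostate without any time-asymmetry being put in by hand.

```python
import random

random.seed(0)
N, steps = 100, 2000
left = N  # start in the low-entropy macrostate: every molecule on the left

for t in range(1, steps + 1):
    # Pick one molecule uniformly at random and move it to the other side.
    # Moves toward balance are more likely simply because more molecules
    # sit on the crowded side.
    if random.random() < left / N:
        left -= 1
    else:
        left += 1
    if t % 400 == 0:
        print(f"step {t:4d}: {left} left / {N - left} right")
```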

But now there’s a problem.  One can apply this same type of analysis to the Universe as a whole (or more precisely, our “observable patch” of the Universe).  You see, it has rather low entropy compared to its maximum (which we can calculate using concepts from statistical mechanics).  There’s all this orderly clumping of matter into galaxies, solar systems, planets, animals, and humans.  And that’s just not very likely.  Now, you could try invoking the Anthropic Principle: that we wouldn’t be here to observe the Universe unless it were ordered this way.  Sorry, but no.  If all you need is an observer, it’s actually much more likely that a lone brain would materialize out of the ether due to random quantum fluctuations (a so-called “Boltzmann Brain”) than that an entire orderly Universe would arise, because the lone brain is a vastly smaller fluctuation.

Carroll has a loophole.  What if our Universe (and indeed each Universe in the “Metaverse”) spawns new Universes?  Then there is no maximum entropy and the configuration of our observable patch becomes much more likely.  Here’s how it might happen.  Even a Universe at maximum entropy still undergoes fluctuations, definitely of quantum fields and perhaps of spacetime itself.  If a quantum fluctuation to a higher vacuum energy occurred at the same time that a bit of spacetime pinched off, you would get what looks like a new universe undergoing a Big Bang.  Astronomically unlikely at any given time and place, but almost certain to happen eventually in a given Universe.

Aha!  Problem solved.  But think of the implications.  There’s a huge proliferation of Universes.  Now, add in the proliferation of different versions of the Universe from the Many Worlds Interpretation (MWI) of quantum mechanics.  Recall that the MWI explains apparently “spooky” quantum behavior by suggesting that the wavefunction does not actually collapse.  Instead, every possible outcome is realized in a different blob of amplitude, and those blobs stop interfering with each other through a process known as decoherence.  Effectively, any time a quantum particle interacts with a macroscopic object, it generates a version of the Universe for each possible outcome of that interaction.

So at the quantum level, we’ve got all this branching of the Universe every microsecond.  Then at the astrophysics level, we’ve got new Universes spawning.  Of course, this spawning also obeys the MWI, so you’ve actually got an exponential proliferation of baby Universes.  If you squint, this whole process looks like a multi-dimensional forward-chaining computation.  Every possibility in this Universe is realized, whole new Universes with slightly different rules get created, and every possibility in them is realized.
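As a cartoon of that forward-chaining picture (purely illustrative bookkeeping, not a physical simulation), the sketch below enumerates the branches produced by a handful of binary “measurement” interactions; the branch count doubles with each one.

```python
from itertools import product

# Cartoon of MWI bookkeeping: treat each interaction as a binary
# "measurement" and enumerate every branch of outcomes.
interactions = 4
branches = list(product("01", repeat=interactions))

print(f"{interactions} interactions -> {len(branches)} branches")  # 2**4 = 16
for b in branches[:3]:
    print("branch:", "".join(b))
# In the picture sketched above, each branch could itself spawn baby
# Universes with slightly different rules, multiplying the tree further.
```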

Going back to the concept of entropy, it turns out that the Thermodynamic Entropy we can calculate for objects is, up to a constant factor (Boltzmann’s constant), the same quantity as the Shannon Entropy we can calculate for information.  Shannon Entropy measures how unpredictable a piece of information is.  Think of it in terms of compression.  You can’t compress a file any smaller than its Shannon Entropy will allow.  Structured files have low entropy, and by encoding their structure, you can compress them more.  A random string of bits in a file has maximum entropy, so you can’t compress it at all.  Shannon Entropy measures how much irreducible information a message carries, just as Thermodynamic Entropy measures how much of a system’s energy is no longer available to do useful work.
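Here’s a quick sketch of that compression intuition, using Python’s standard zlib (my example, not Carroll’s): a repetitive byte string has low empirical Shannon entropy and compresses dramatically, while random bytes sit near the 8 bits-per-byte maximum and barely shrink.

```python
import os
import zlib
from collections import Counter
from math import log2

def entropy_bits_per_byte(data: bytes) -> float:
    """Empirical Shannon entropy H = -sum p(x) * log2 p(x), in bits per byte."""
    n = len(data)
    return -sum(c / n * log2(c / n) for c in Counter(data).values())

structured = b"abcd" * 4096     # highly repetitive, low entropy
random_ish = os.urandom(16384)  # high entropy, essentially incompressible

for name, data in [("structured", structured), ("random", random_ish)]:
    h = entropy_bits_per_byte(data)
    ratio = len(zlib.compress(data)) / len(data)
    print(f"{name}: {h:.2f} bits/byte, compresses to {ratio:.0%} of original")
```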

So there’s already a known equivalence between the physical and informational.  Then if you buy into Carroll’s hypothesis and the MWI, it looks like the Metaverse is trying to compute every possible outcome.  In fact, it may compute every possible outcome more than once.  An infinite number of times if it runs an infinite amount of time.  After it runs long enough, someone who could observe the whole Metaverse could actually calculate very precise odds of any outcome given any condition.  You’d be statistically omnipotent.

Never bet against a statistically omnipotent being.

Written by Kevin

August 18, 2010 at 2:45 pm

Posted in Science
