God vs The Multiverse (Part 4: The Initial Conditions)
Rabbi E. Feder & Rabbi E. Zimmer
There is another example of fine tuning in the universe we want to highlight because it is of a very different conceptual nature than the constants, and provides an independent proof of an Intelligent Designer. (For an elaboration of this point, see the first comment below.) This is regarding the initial conditions of the universe, which were set at the big bang.
We've never seen anyone (which doesn't mean they don't exist) propose either the Master Mathematical Equation theory or the Necessary Existences theory to explain the fine tuning of the initial conditions. It's not even clear how such an explanation would be formulated, as the fine tuning of initial conditions seems of a qualitatively different character than our current understanding of physical law. (It would seem, at this point, that the only alternative explanation to an Intelligent Agent is the multiverse.)
The big bang is the widely accepted model for the emergence and evolution of the universe as we know it. The arrangement of the matter and other conditions at the big bang were perfectly tuned so that the universe we see today would naturally emerge. This arrangement was highly specialized, in the sense that variations in the initial conditions would have resulted in disorder (a universe filled with black holes) instead of the ordered universe we witness today. The probability of obtaining such a state by random chance is staggeringly low.
(For those afraid of the physics, you can skip to the paragraphs after the video below and you will still follow the main point of this post. This post will be our last post that contains this much physics and math. For those interested, the following will provide a good opportunity to review or learn some physics and mathematics, and thereby have a deeper appreciation for the uniqueness of this proof.)
Someone may object: although it is highly unlikely that the arrangement of matter at the big bang would be exactly as it was, any one particular arrangement would have an equally low probability, and the universe had to have some arrangement. How do you know the initial conditions were so special?
The critical distinction we need to make in order to understand this question is between:
1) the specific arrangement of the individual parts of a system. (A collection of particles.)
2) the state of a system as a whole.
The relationship between the whole and its parts is the key concept. Some states of the whole object are contingent on a unique arrangement. For example, the meaning of this very sentence (we're treating this whole sentence as a system, with the letters as its parts) is contingent on all the letters and spaces being arranged in approximately this order. If we jumble up all the letters, the sentence as a whole loses this state (of making intelligible sense). Other states, like a meaningless jumble of letters, are independent of how the letters are arranged. Almost every random ordering of the letters will be in this state of meaninglessness.
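To make the sentence analogy concrete, here is a minimal Python sketch (the sentence string and the fixed random seed are our own illustration): shuffling the letters preserves all the individual parts, but almost certainly destroys the meaningful state of the whole.

```python
import random

sentence = "the meaning of this sentence depends on the order of its letters"
letters = list(sentence)

random.seed(0)           # fixed seed so the example is reproducible
random.shuffle(letters)  # one random arrangement of the same parts
jumbled = "".join(letters)

# The jumbled string contains exactly the same letters...
assert sorted(jumbled) == sorted(sentence)
# ...but almost certainly not in a meaningful order.
print(jumbled)
```

Any single run produces one arrangement out of an astronomical number of possibilities, and nearly all of them are in the same meaningless state.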
If we randomly scramble an object's parts, entropy is the measure of how probable a particular state of the whole object is. A state that can come about through many different arrangements is called a state of high entropy. A state that can only come about through very few different arrangements is called a state of low entropy. Entropy is thus a number which measures the likelihood of any particular state of the whole object if we randomly shuffle its individual parts. (The fact that a state of lower entropy is less probable is a direct consequence of the fundamental postulate in statistical mechanics.) We'll illustrate with an example.
If we toss 2 individual coins, we consider all the possible ways they could land (H - heads, T - tails):
(1) HH (2) TT (3) HT (4) TH.
The probability of each of these 4 outcomes is 1/4. Upon consideration we notice that outcomes (3) and (4) will appear exactly the same in terms of the whole system: 1 head and 1 tail. Thus a better way to describe the probabilities is as follows: P(0 heads)=1/4, P(1 head)=2/4, P(2 heads)=1/4. One head is more likely to occur than 0 or 2 heads because it can happen in 2 ways, while 0 or 2 heads can only occur in one way each.
We can generalize this idea to flipping 10 coins. In total, there are 2^10 = 1024 possible outcomes. Thus, the probability of obtaining any particular outcome (say, HHHHHHHHHH or HTHTHTHTHT) is 1/1024. However, there is only 1 way to get 10 heads, while there are 252 (for those mathematically inclined, 10 choose 5) ways of getting 5 heads (some examples are HHHHHTTTTT, TTTTTHHHHH, THTHTHTHTH, HTHHHTTTHT). Thus the probability of obtaining 10 heads is 1/1024, while the probability of obtaining 5 heads is the much more likely value of 252/1024, which is approximately 1/4.
Because it can only occur in 1 way, we consider the outcome of 10 heads to be highly unlikely (which, counter-intuitively, is called a low entropy state). Conversely, since 5 heads can occur in many ways, we consider it to be fairly probable (a high entropy state). The outcome of 8 heads would be somewhere in between in terms of likelihood and entropy.
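The coin counting above can be checked directly by brute-force enumeration. This is a short Python sketch (the variable names are ours) that counts, for each possible number of heads, how many arrangements produce it:

```python
from itertools import product
from math import comb

# Enumerate all 2^10 = 1024 outcomes of tossing 10 coins.
outcomes = list(product("HT", repeat=10))
assert len(outcomes) == 1024

# Count how many arrangements of the parts give each state of the whole
# (i.e., each possible number of heads).
ways = {k: sum(1 for o in outcomes if o.count("H") == k) for k in range(11)}

assert ways[10] == 1           # only 1 way to get 10 heads: a low entropy state
assert ways[5] == comb(10, 5)  # 252 ways to get 5 heads: a high entropy state

print(ways[5] / 1024)          # 0.24609375, approximately 1/4 as stated above
```

The dictionary `ways` is exactly the count of arrangements per state, and entropy tracks the size of these counts.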
In general, one can think of a low entropy state as being highly ordered and a high entropy state as being disordered. This is because there are many ways to randomly bring about a state of disordered nonsense, but only a few ways to bring about a state of meaning and order.
The second law of thermodynamics states that physical processes move a closed system from lower states of entropy to higher states of entropy. This means that over time, an object ends up in the state that can be brought about by the greatest number of arrangements. Meaning, if you start with 8 out of the 10 coins on heads, and you give them enough time and let them interact (i.e., you shake the container), you'll end up with a state of about 5 heads. While it is not theoretically impossible for the second law (which is essentially a statistical law based on probabilities) to be violated in a particular instance (e.g., the Red Sea splitting in half for a few hours), a violation of this law has never been observed (without the observers claiming they have witnessed a miracle).
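The "shaking the container" thought experiment can be simulated. The sketch below uses a simple model of our own choosing (each shake re-tosses one randomly chosen coin): starting from the ordered state of 8 heads, the system drifts toward, and then hovers around, the most probable state of about 5 heads.

```python
import random

random.seed(1)
coins = ["H"] * 8 + ["T"] * 2  # start in an ordered, lower-entropy state

heads_history = []
for step in range(10_000):
    i = random.randrange(10)         # one "shake": pick a random coin...
    coins[i] = random.choice("HT")   # ...and re-toss it
    heads_history.append(coins.count("H"))

# After an initial settling period, the system spends its time near the
# high-entropy state of roughly 5 heads.
avg = sum(heads_history[1000:]) / len(heads_history[1000:])
print(round(avg, 1))  # close to 5
```

The drift is purely statistical: nothing forbids a return to 8 heads, it is simply overwhelmingly improbable to stay there.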
When you apply this reasoning to the universe going forward in time (towards the future), you reach the conclusion that the universe will, at a point far in the future, end up in its most likely state (which is a very boring, meaningless state). This is known as the heat death of the universe, which is the state of highest entropy and the least amount of order.
The universe is currently in a state of much lower entropy than heat death. We have things in this universe with a lot of order, such as galaxies, stars, planets, life, etc.; things that are very unlikely to be attained by a random arrangement. If we extrapolate backwards in time to the big bang, we realize that based on the second law of thermodynamics, the universe must have been in an even lower state of entropy (an even more ordered, highly improbable, state than it is now).
Another way to see this point is based on the idea of meaningful states. The number of possible arrangements of all of the particles in the universe at the big bang was very, very high. Therefore, the probability of any particular arrangement occurring by chance is very, very low. However, we can divide all arrangements into two distinguishable classes: (a) those which eventually unfold to an ordered universe; (b) those which eventually unfold to a universe of total nonsense. There are very, very few arrangements in (a), and therefore these states have a low entropy and a very low probability of occurring by chance. There are many, many arrangements in (b), and therefore these states have a high entropy and a very high probability of occurring by chance.
The fact that at the big bang the universe had such a low state of entropy is like tossing up trillions of letters and having them randomly fall in the arrangement of all the Wikipedia articles. If the universe did not start off in this special, highly unlikely, low entropy state, then even if we had the same qualitative laws of physics and the same fine tuned constants of nature, we would never get a beautiful, ordered, complex universe. This is what is meant by the fine tuning of the initial conditions of the big bang.
(As an aside, this is why the infinitely cyclic universe model of big bang/big crunch was rejected in 1934, as entropy would be infinitely increasing. There is an arrow of time and it had a beginning. There are a few modern day approaches that attempt to reincarnate the theory, but as of yet they are still entirely speculative with no experimental support. In any event, the key point of fine tuning is independent of the cyclic universe. Only a genuine multiverse theory can help. More on this in later posts.)
Roger Penrose derives the probability for this initial state in his book The Emperor’s New Mind (1989). We highly encourage the more advanced reader to try to read through his basic derivation which is only a few pages mostly in English. The short video below by Roger Penrose will help explain:
The likelihood of the initial conditions of the universe (the arrangement of matter at the big bang) occurring by chance alone is the biggest number (or smallest probability) we have ever seen with regard to fine tuning: less than 1 out of 10^10^123. It is a double exponent. For those who are mathematically inclined, try to fathom how big this number really is. It makes the cosmological constant ("trillion, trillion, trillion....") seem minuscule. If you tried to write the number out using every single particle in the universe to represent a zero, you would run out of particles! It's not even close.
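Python's arbitrary-precision integers make it easy to check how hopeless writing this number out would be. The particle count of roughly 10^80 below is a commonly cited rough estimate (our assumption, not a figure from Penrose's derivation):

```python
# To write out 10^(10^123) in full, we would need 10^123 zeros:
# one particle per zero.
zeros_needed = 10**123
particles = 10**80  # rough common estimate of particles in the observable universe

# Even the *exponent* of the double exponent dwarfs the particle count.
assert zeros_needed > particles
shortfall = zeros_needed // particles
print(shortfall)  # 10^43: you'd need that many universes' worth of particles
```

The point is that the shortfall is not marginal; we are short by a factor that is itself an unimaginably large number.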
There are a few amazing things about this result. Firstly, that physics, mathematics and computer science have come to the point where we can actually calculate such a probability. Second, that the probability here is so amazingly small. Lastly, that such a fine tuned arrangement was "built in" to the big bang in order to naturally unfold to our universe. It's astounding!
As we will be moving forward in these posts with the assumption that we have sufficiently established the fact of fine tuning, both in the constants of nature and the initial conditions of the big bang, we want to mention that there is a very small minority of scientists who deny the fact of fine tuning altogether. Their view is largely rejected by the scientific community as a whole, and the mistakes in their thinking are fairly easy to see. We encourage you to look at this 76 page article by Luke Barnes that thoroughly examines and rejects the opinion of Victor Stenger. It also does an excellent job of explaining a lot of the details of fine tuning. (See pages 23-26 in particular for this post, where the author exposes the fallacies in Stenger's attack on Roger Penrose, and concludes "that Stenger has not only failed to solve the entropy problem; he has failed to comprehend it. He has presented the problem itself as its solution.")