When the Earth goes around the sun, we can imagine applying Newton's laws to predict how it's going to move. That's just like a computer algorithm. If we know the position and motion of the Earth today, we can compute its position and motion this time next year. So the laws of physics could be thought of as a computer algorithm, taking input data, processing it and delivering output data. That inevitably leads to the analogy that the universe is a really gigantic computer. And many people are enamored of that idea.
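To make the "laws as algorithm" picture concrete, here is a minimal Python sketch of exactly that input-process-output loop: today's position and velocity of the Earth go in, Newtonian gravity is applied step by step, and the state one year later comes out. The integrator, step size and constants are my own illustrative choices, not anything specified in the interview.

# Minimal sketch (illustrative, not from the interview): Newton's laws run
# as an algorithm that maps the Earth's state today to its state in a year.
# Two-body Sun-Earth problem with a leapfrog (kick-drift-kick) integrator.

import math

G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30       # mass of the Sun, kg
YEAR = 365.25 * 86400  # one year, seconds
DT = 3600.0            # time step: one hour

def acceleration(x, y):
    """Newtonian gravitational acceleration of the Earth, Sun at the origin."""
    r = math.hypot(x, y)
    a = -G * M_SUN / r**3
    return a * x, a * y

def propagate_one_year(x, y, vx, vy):
    """Input data: position and velocity today. Output data: the same, one year on."""
    ax, ay = acceleration(x, y)
    for _ in range(int(YEAR / DT)):
        vx += 0.5 * DT * ax        # half kick
        vy += 0.5 * DT * ay
        x += DT * vx               # drift
        y += DT * vy
        ax, ay = acceleration(x, y)
        vx += 0.5 * DT * ax        # half kick
        vy += 0.5 * DT * ay
    return x, y, vx, vy

# Earth starting on the x-axis at 1 AU with its mean orbital speed
x, y, vx, vy = propagate_one_year(1.496e11, 0.0, 0.0, 2.978e4)
print(f"after one year: x = {x:.3e} m, y = {y:.3e} m")

After 8,766 hourly steps the Earth comes back close to where it started, which is the point of the analogy: given the state now, the "algorithm" delivers the state later.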


The universe is just one big information processor. Wheeler called it "it from bit". Now, if you take that view – that the universe is a gigantic computer – then it leads immediately to the conclusion that the resources of that computer are limited. The universe is finite. It's finite because the speed of light is finite. There's been a finite time since the Big Bang. So if we have a finite universe, we have a computer with finite resources and hence finite accuracy. So once you recognize that the universe is a gigantic computer, then you see that the laws of physics can't be infinitely precise and perfect. There must be a certain amount of wiggle room or sloppiness or ambiguity in those laws. And the point is that the degree of error, which is inherent in the laws, depends on time. As the universe gets older, there are fewer errors because it's had longer to compute. If you go back to the first split second after the Big Bang, then the underlying errors in the laws of physics really would have been very large.

So instead of thinking of the universe as beginning magically with a bang, and the laws of physics being imprinted magically on the universe with infinite precision right from the word "go", we must instead think of the laws as being emergent with and inherent in the universe, starting out a little bit vague and fuzzy, and focusing down over time to the form that we see today.
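The interview itself gives no numbers, but the usual way to make "finite universe, finite accuracy" quantitative is the holographic bound: the number of bits the observable universe can register grows with the area of its horizon measured in Planck units, so the precision available to the "cosmic computer" grows as the universe ages. The following is a hedged back-of-the-envelope sketch of that counting; the formulas and figures are illustrative, not Davies's words in this excerpt.

% Rough information capacity of the observable universe at time t,
% taking the horizon radius R_H(t) ~ c t and the Planck length l_P:
\[
  N(t) \;\sim\; \left(\frac{R_H(t)}{\ell_P}\right)^{2}
       \;\sim\; \left(\frac{c\,t}{\ell_P}\right)^{2},
  \qquad \ell_P \approx 1.6\times10^{-35}\ \mathrm{m}.
\]
\[
  N(\text{today}) \;\sim\; \left(\frac{10^{26}\ \mathrm{m}}{10^{-35}\ \mathrm{m}}\right)^{2}
                  \;\sim\; 10^{122}\ \text{bits},
  \qquad
  N(t \approx 10^{-43}\ \mathrm{s}) \;\sim\; 1\ \text{bit}.
\]

On this counting, a Planck-time-old universe could hold roughly a single bit, so any "laws" it expressed would have been extremely fuzzy, whereas today's roughly 10^122 bits make those same laws look, for all practical purposes, exact.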

Paul Davies, in an interview for the book “Atoms and Eden” by Steve Paulson.