Archive for the ‘Science’ Category
When Planck demolished classical mechanics by postulating the discretization of energy levels, he kept the second law of thermodynamics, as the one carryover from the old world, guiding him in the new. Later in the 20th century, starting with the work of Claude Shannon, we’ve come to understand a deep link between the concepts of information and entropy.
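As a small illustration of Shannon's side of that link (my own sketch, not from the papers discussed here), Shannon entropy measures the average uncertainty, in bits, of a probability distribution — a uniform distribution over symbols maximises it, certainty drives it to zero:

```python
from collections import Counter
from math import log2

def shannon_entropy(message: str) -> float:
    """Shannon entropy, in bits per symbol, of a message's empirical distribution."""
    counts = Counter(message)
    n = len(message)
    return -sum((c / n) * log2(c / n) for c in counts.values())

# Four equally likely symbols: maximal uncertainty for a 4-symbol alphabet.
print(shannon_entropy("abcd"))  # 2.0 bits
# One repeated symbol: no uncertainty at all.
print(shannon_entropy("aaaa"))  # 0.0 bits
```

The formal parallel with Boltzmann's thermodynamic entropy, S = k ln W, is exactly the "deep link" the post refers to: both count how many microstates (or messages) are consistent with what you know.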
A paper published last year looked at the properties of potential generalised physical theories, and found that with a minimal set of constraints, the ‘information becoming useless’ sense of the second law of thermodynamics would necessarily have to hold for all such theories. A simplified description of the paper by one of the authors is here. It is, I think, hugely significant that the second law and entropy have survived the transition from classical to quantum theories, and have now been shown to hold in any possible future theory of nature, should one emerge.
I’ve always been fascinated by the link between information and entropy, and have written about it earlier in this blog. I think the indications from modern physics are that information is a fundamental constituent of the universe — as fundamental a part of the physical world as energy, quarks, leptons and bosons. And that is a crucial element of how we think about the universe around us. As John Wheeler said (and Taimur repeated) many years ago, “It from Bit”. That may well be the question to which the answer is 42!
A recent paper by Lorenzo Maccone in Physical Review Letters makes what I think is a very interesting argument about the arrow of time.
A long-standing dilemma in physics is that all physical laws work the same whether time moves forward or backward, yet we only ever experience time moving in one direction. Specifically, we only observe ‘entropy’ increasing: you can see an egg crack, or milk spill, but you will never see those phenomena reverse spontaneously. Until now, however, there has been no widely accepted theoretical basis for why that should be so.
In the paper, the author argues that processes that increase entropy leave information behind, while processes that reduce entropy necessarily involve erasing information. This means physics cannot study processes in which entropy has decreased, even if such processes were commonplace.
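The quantitative face of ‘erasing information costs you’ is Landauer’s principle (my own aside, not part of Maccone’s argument as described above): erasing one bit must dissipate at least k·T·ln 2 of heat into the environment, tying information loss directly to entropy. The bound is tiny at room temperature:

```python
from math import log

k_B = 1.380649e-23  # Boltzmann constant in J/K (exact, by the 2019 SI definition)
T = 300.0           # assumed room temperature in kelvin

# Landauer's bound: minimum heat dissipated to erase one bit of information.
E_bit = k_B * T * log(2)
print(f"Minimum cost of erasing one bit at {T:.0f} K: {E_bit:.3e} J")
# ≈ 2.87e-21 J — some twenty orders of magnitude below what today's chips dissipate per bit
```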
Finished reading this fascinating piece by Lawrence Krauss at The Edge. It is interesting in itself as a survey of the state of physics today, starting from the issue of the energy of empty space. I’m not sure, however, that his critique of String Theory as a mere formalism rather than a theory is uncontroversial… even if I, sort of, agree.
What I found most fascinating is his observation that apparently many physicists no longer recoil from a discussion of the anthropic principle as an answer to some of our basic questions. This is new… as Krauss says, and as I remember, the consensus some years ago was certainly that there is only one allowed set of laws of nature that works — that ultimately we might discover fundamental symmetries and mathematical principles that cause nature to be the way it is.
Gregory Chaitin has put up on the web his wonderful new book “Meta Math! – The Quest for Omega”:http://www.cs.auckland.ac.nz/CDMTCS/chaitin/omega.html (via “Tim Bray”:http://www.tbray.org/ongoing/When/200x/2004/06/11/Chaitin ). Some random thoughts … apart from the fact that the book itself is brilliant.
Tim Bray "writes":http://tbray.org/ongoing/When/200x/2003/10/21/HumanUniversals about "Human Universals":http://www.temple.edu/tempress/titles/864_reg.html . A list of over 400 attributes that have shown up in every human culture that anthropologists have ever looked at. Fascinating!