Information Theory and Evolution
List Price: $34.00
Your Price: $34.00
Reviews
Summary: Easy reading but misleading

Review: The book is new but out of date, based on old mythologies about the relationship of information theory to biology. Communication theory is undergoing a revolution, but you would not know that from this book, which focuses on century-old analyses. The author wishes to introduce a new term, 'thermodynamical information', which, combined with other terminological contradictions, leads to incorrect inferences and a 'just so story' about how life was inevitable. That may or may not be true, but this book sheds no light on it.

The author states: 'A flood of information-containing free energy reaches the earth's biosphere in the form of sunlight...much of it is degraded into heat, but part is converted into cybernetic information and preserved in the intricate structures which are characteristic of life.' This view is so simplistic it borders on the ridiculous. Many scientists, in estimating the probability of life purely by probability theory (which is likely not possible, but at least raises the question), have arrived only at highly improbable (indeed infinitesimal) numbers in light of modern cosmological estimates of the age of the universe and the earth. Avery has no concern with cosmology, because by equating (Gibbs) free energy with information there is, obviously, sufficient energy for life on earth from the sun? But why is absolute information relevant? Does not information require a recipient, which in turn raises the question of how that arose?

Avery makes a widespread historical error in defining information and entropy as absolutes. This leads to many myths, including 'Maxwell's demon'. In real irreversible processes:
1. In thermodynamics we have an inequality, the entropy change being greater than the heat dissipated over temperature: dS > dQ/T;
2. In information theory, what Avery (popularly) calls 'Shannon entropy' and equates with information is really 'uncertainty'.
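The 'uncertainty' in point 2 is easy to make concrete. A minimal Python sketch of my own (not from the book or the review) of Shannon's uncertainty formula, H = -Sum[p * log2(p)], in bits:

```python
import math

def uncertainty(probs):
    """Shannon uncertainty H = -sum(p * log2(p)), in bits."""
    return sum(-p * math.log2(p) for p in probs if p > 0)

# A fair coin carries one bit of uncertainty before it is tossed.
print(uncertainty([0.5, 0.5]))    # 1.0
# A heavily biased coin carries much less.
print(uncertainty([0.99, 0.01]))  # ~0.081
# A certain outcome carries none.
print(uncertainty([1.0]))         # 0.0
```

Note that this H is a property of a probability distribution, not of any physical system by itself, which is precisely why the reviewer objects to treating it as an absolute 'information'.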
'Shannon' information is the decrease (if any) in the uncertainty of a receiver (or 'molecular machine') in going from a before state to an after state: I = H_before - H_after, where H is the standard entropy-like formula (without Boltzmann's k), H = -Sum[p * log2(p)]. Comparing this with Boltzmann's entropy, S = -k * Sum[p * ln(p)], and using log2(x) = ln(x)/ln(2), one gets an inequality with Clausius's thermodynamic entropy for irreversible systems: kT * ln(2) < -Q/I. [E.g. see Dr. Tom Schneider's website.] Therefore, for every bit of information gained, heat is dissipated into the environment; 'Maxwell's demon' is a myth in real irreversible processes.

Avery devotes an entire appendix to suggesting an equality of entropy with information, and suggests that Boltzmann established the relationship. This is poppycock! All the appendix does is re-derive the usual connection between the microscopic (statistical) and macroscopic (thermodynamic) entropy relations, ending with the Clausius equality for reversible systems [and running into the Maxwell's demon myth!]. As the true relationship between information and entropy is a proportionality (an inequality for real systems), Boltzmann in no way proved the connection; no one can establish an equality. For instance, if you flip a coin, a minimum energy is dissipated, but the information gained is 1 bit whether you toss it 1 foot or 10 feet! Also, if you toss a coin 1000 times you do not get 1000 bits of information, as suggested by W. Dembski in his book 'No Free Lunch', and as Avery would have it using the absolute definition; instead H_before - H_after = 0 bits. Information is a state-function difference; otherwise it equates with entropy, which many then equate with disorder, i.e. a contradiction. It leads to the paradoxical statement by some scientists that a random text has more information than a meaningful one. More poppycock!

Further mythologies are perpetrated in Avery's dealing with entropy. As Dr.
Frank Lambert says: 'Energy is not disorder, not a measure of chaos, not a driving force. Energy's diffusion or dispersal to more microstates is the driving force in chemistry. Entropy is the measure or index of that dispersal.' Similarly, Avery refers to 'negentropy', '...which an organism...maintains in sucking orderliness from the environment.' This is a contradiction in terms: the absolute value of entropy cannot be negative (see Boltzmann's equation for S); what should be referred to instead is a decrease in entropy, i.e. dS < 0.

The riddle of life, though, is not simply solved by pointing to the free energy from the sun. As P. W. Atkins said in his excellent book on 'The 2nd Law': 'thermodynamic systems do not tend towards states of lower energy...The Universe falls upward in entropy: that is the only law of spontaneous change. The free energy is, in fact, just a disguised form of the total entropy of the Universe...The Second Law is a global [vs local] denial of the emergence of spontaneous structure.' Therefore Avery has not drawn the link between information and entropy (others have, like Shannon; see Schneider; though there does not yet appear to be an adequate book on the subject) and has not explained how life arose, anywhere, whether by free energy (necessary but not sufficient), 'thermodynamic information' (a contradiction in terms) or 'negentropy' (a further contradiction in terms). The mystery remains, both quantitatively and qualitatively.
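The coin-tossing argument above is easy to check numerically. A hedged sketch of my own, using the reviewer's state-function definition I = H_before - H_after and standard physical constants (the 'unobserved tosses' reading of the 1000-coin example is my interpretation):

```python
import math

def uncertainty(probs):
    """Shannon uncertainty H = -sum(p * log2(p)), in bits."""
    return sum(-p * math.log2(p) for p in probs if p > 0)

# One toss, outcome observed: the receiver's uncertainty falls from
# 1 bit to 0, so I = 1 bit -- regardless of how far the coin travels.
I_one_toss = uncertainty([0.5, 0.5]) - uncertainty([1.0])
print(I_one_toss)  # 1.0

# 1000 tosses, none observed: the receiver's uncertainty is unchanged
# (1000 bits before, 1000 bits after), so I = 0 bits -- not 1000.
H_per_toss = uncertainty([0.5, 0.5])
I_unobserved = 1000 * H_per_toss - 1000 * H_per_toss
print(I_unobserved)  # 0.0

# Minimum heat dissipated per bit of information gained (the reviewer's
# kT * ln(2) bound), at room temperature:
k = 1.380649e-23  # Boltzmann constant, J/K
T = 300.0         # temperature, K
print(k * T * math.log(2))  # ~2.87e-21 J per bit
```

The point of the sketch is that I depends only on the before/after distributions of the receiver, which is what makes it a state-function difference rather than an absolute quantity.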