Rating:  Summary: A clear exposition of Shannon's results by a great mathematician Review: A. Y. Khinchin was one of the great mathematicians of the first half of the twentieth century. His name is already well known to students of probability theory, along with A. N. Kolmogorov and others, from the host of important theorems, inequalities, and constants named after them. He was also famous as a teacher and communicator. The books he wrote on Mathematical Foundations of Information Theory, Statistical Mechanics, and Quantum Statistics are still in print in English translations published by Dover. Like William Feller and Richard Feynman, he combines a complete mastery of his subject with an ability to explain clearly without sacrificing mathematical rigour. In his "Mathematical Foundations" books Khinchin develops a sound mathematical structure for the subject under discussion, based on the modern theory of probability. His primary reason for doing so is the lack of mathematically rigorous presentation in many textbooks on these subjects. This book contains two papers written by Khinchin, on the concept of entropy in probability theory and on Shannon's first and second theorems in information theory, with detailed modern proofs. Like all of Khinchin's books, this one is very readable. And unlike many recent books on the subject, the price is very low. Two minor complaints: there is no index, and the typesetting could be improved.
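For readers new to the entropy concept the review mentions, here is a minimal sketch (my own illustration, not taken from the book) of Shannon's entropy H(X) = -Σ p_i log2 p_i for a discrete distribution:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H(X) = -sum(p * log2(p)) of a discrete distribution, in bits."""
    assert abs(sum(probs) - 1.0) < 1e-9, "probabilities must sum to 1"
    # Terms with p = 0 contribute nothing (the limit p*log2(p) -> 0 as p -> 0).
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries exactly 1 bit of information per toss.
print(shannon_entropy([0.5, 0.5]))   # → 1.0
# A biased coin is more predictable, hence has lower entropy.
print(shannon_entropy([0.9, 0.1]))
```

This is the quantity whose axiomatic characterization Khinchin makes rigorous in the first of the two papers.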
Rating:  Summary: A more rigorous version of Shannon's 1948 paper Review: Shannon's paper is great: easy to read (though many people misunderstand many of its concepts, and I may too) but lacking in mathematical rigor. This book redoes several of the points Shannon made, but more accurately. It requires ergodic theory and measure theory to follow every detail, though some parts may be usable even without much background. I don't think the book is perfectly edited, but I know I paid too little for the knowledge I gained from it.