Information Theory - Part I: An Introduction To The Fundamental Concepts
A**.
Interesting Read
Wonderful and Interesting Book.
J**O
like the previous ones
This book is a new release by Prof. Arieh Ben-Naim which, like his previous ones, helps the reader go deeper and deeper into the abstract concepts of thermodynamics. The author defends the position that a proper introduction to the concept of entropy can only be made through Information Theory, and indeed, the robust arguments Prof. Ben-Naim presents throughout this work to show it are overwhelming.

As we have grown accustomed from this author, the material is presented in a very pleasant and accessible way, even for the lay reader, and at the same time rigorously, thus avoiding the errors and misunderstandings commonly found in some popular science books. After a masterful introduction to Probability Theory, the author reviews the most fundamental points of Shannon's measure of information. This is done in a very pedagogical manner, with simple examples throughout that nicely illustrate the basic concepts of Shannon's theory.

In the final chapter, the author tackles, with the greatest skill, the main goals of this beautiful book, namely the derivation of the entropy function from Shannon's measure of information, and the entropy formulation of the Second Law. The connection between Information Theory and Thermodynamics emerges as a solid link, providing both insight into and support for the abstract concept of entropy. A number of mathematical arguments, unnecessary for beginners but pertinent for more advanced readers, are included as appendices.

Summarizing: a new masterpiece by Prof. Ben-Naim that I strongly recommend without any reservation to readers interested in the fundamentals of Thermodynamics.
R**N
How to measure information and understand change
Arieh Ben-Naim presents a thorough account of Shannon's method of measuring information and meticulously distinguishes it from statistical-mechanical entropy and from various erroneous interpretations. He shows how the Shannon measure of information (SMI) is used in a series of intuitive examples such as two-state coin-tossing, the 20-questions game, common probability distribution functions, and the frequency of letters in words used in different languages. With more examples he extends SMI to the two-dimensional case and to higher dimensions, explaining how conditionality and mutual information can be used to quantify information in correlated systems which may change in time or with temperature. Finally, he applies SMI to the ideal gas to show that it equals entropy in the special case when SMI is a maximum (equilibrium). He demonstrates with a number of examples that only for isolated systems can entropy predict the direction of change, whereas SMI can do so for any process. The text is helpfully supplemented by definitions of standard mathematical quantities in the introduction and relevant derivations in appendices. This is an excellent account relating powerful, fundamental concepts to everyday experience.
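The two-state coin-tossing example this reviewer mentions can be sketched in a few lines of Python (a hypothetical illustration for the reader, not code from the book): the SMI of a biased coin, H(p) = -p log2 p - (1-p) log2(1-p), reaches its maximum of exactly 1 bit for the fair coin, mirroring the point that SMI coincides with entropy only at its maximum.

```python
import math

def smi(probs):
    """Shannon measure of information (in bits) of a discrete distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Two-state coin: SMI is maximal for the fair coin (p = 0.5).
print(smi([0.5, 0.5]))  # 1.0 bit
print(smi([0.9, 0.1]))  # roughly 0.469 bits, less than the maximum
```

Any bias away from p = 0.5 lowers the SMI, which is the quantitative content behind the "20-questions" intuition: a fair coin is the hardest two-state system to guess.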
D**H
Dive into Information: the world is full of it!
Have you always been afraid of statistics? Have you sometimes been wondering what exactly Information Theory is? Are words like Entropy and Chaos related? This book will let you slide into these notions perfectly, without effort or pain, provided you know the symbols for integral, derivative, and sigma. You may smile while reading it, and better understand the accompanying drawings, illustrations, and in-between quizzes. What a splendid introduction to probability; each and every student in this field should start with it.
M**T
instructors and scientists will have a nice reference book as well as a book that presents ...
This book really digs into the Shannon Measure of Information and distinguishes it from entropy, which is a thermodynamic property, not a state of information. The book thoroughly applies probability theory to this, as well as detailing the various categories of information. Students, instructors, and scientists will have a nice reference book as well as a book that presents the material in an informative and entertaining way. Ben-Naim always delivers top-quality works, and this one is another fine example of that.
W**R
Clarity and Patience are watchwords for this author
Professor Ben-Naim guides readers in a gentle way, making his topics clear at every turn. Entropy and information are very closely related, but it's all too easy to get lost in other books. Not so with this one, as Ben-Naim again shows his mastery of what entropy is, and why the quantity should behave as it does.
T**N
Don’t buy it
Waste of time and money