Knowledge and the Flow of Information
A**R
Five Stars
Great read
R**E
Not something to spend time with.
Dretske starts with Claude Shannon's mathematical theory of communication and, from this foundation, tries to justify a semantic theory of information and explain something about perception, meaning, and belief.

In this book, Dretske reminds one of a philosopher who, upon grasping the beauty of a small portion of applied mathematics, tries to explain the world with his newly found tool. Of course he does not explain the world; only the expansive topics of perception, meaning, and belief. Dretske is clearly awed by Shannon's innovative work, and we can excuse him for that: Shannon's theory opened up an entire field of study and technology.

For those who already understand the mathematics of information theory: Dretske takes the well-defined mathematical concept of mutual information and evaluates it at a particular value. In symbols, I(S; R=ri) = H(S) - H(S|R=ri), where I(S; R=ri) is the mutual information, H(S) is the entropy of S, and H(S|R=ri) is the conditional entropy evaluated at the particular value R=ri. S and R are variables representing source and receiver messages. Dretske calls I(S; R=ri) the amount of information carried by a particular signal ri. He grounds the entire book on this concept, yet the foundation becomes nearly irrelevant once he defines the information content of a signal: "A signal r carries the information that s is F" = P(s is F | r, k) = 1.

Now Dretske turns to conditional probability for answers to deep philosophical questions. Yet philosophers and scientists are far from understanding the nature of probability itself, and it is not clear that a conditional probability of 1 says anything more than exactly that. You may say that P(X|Y) = 1 means that Y carries information about X, but that interpretation adds nothing to our understanding; it simply defines a natural-language sentence in terms of a probabilistic one. Philosophers looking to justify their work by connecting natural-language concepts to math might find this impressive.
Others will see it for what it is: using 'big words' or 'mathematical words' to sound smart. Check out Shannon's original paper on 'information' theory online: [...]
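The quantity the review describes, I(S; R=ri) = H(S) - H(S|R=ri), is easy to compute for a toy example. Below is a minimal sketch in Python; the joint distribution over two source symbols and two received signals is an invented illustration, not anything from Dretske's book.

```python
import math

# Hypothetical joint distribution P(S, R) over source symbols s1, s2
# and received signals r1, r2 (values chosen purely for illustration).
joint = {
    ("s1", "r1"): 0.4,
    ("s1", "r2"): 0.1,
    ("s2", "r1"): 0.1,
    ("s2", "r2"): 0.4,
}

def entropy(dist):
    """Shannon entropy in bits of a distribution given as {outcome: prob}."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

# Marginal P(S)
p_s = {}
for (s, r), p in joint.items():
    p_s[s] = p_s.get(s, 0.0) + p

# Conditional P(S | R = r1)
ri = "r1"
p_ri = sum(p for (s, r), p in joint.items() if r == ri)
p_s_given_ri = {s: p / p_ri for (s, r), p in joint.items() if r == ri}

# Amount of information carried by the particular signal r1:
# I(S; R = r1) = H(S) - H(S | R = r1)
info_ri = entropy(p_s) - entropy(p_s_given_ri)
print(round(info_ri, 4))  # -> 0.2781
```

Here H(S) = 1 bit (the source symbols are equiprobable), and observing r1 shifts the distribution to P(s1|r1) = 0.8, P(s2|r1) = 0.2, so the signal carries about 0.28 bits about the source. Note that Dretske's later definition, demanding P(s is F | r, k) = 1, would require the conditional distribution to collapse entirely onto one outcome.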
J**O
Five Stars
I WANT TO SEE THE TABLE OF CONTENTS OF AMAZON BOOKS!!!
B**K
Worst university press book I have ever read
I was required to read this book in grad school (I was embarrassed for the teacher, since the selection reflects on the selector). It is a genuinely awful book. The style was (for me, at least) indigestible. The main thesis of the book, that *meaning* -- as opposed to bit configurations -- can be *quantified*, is not just nonsense, but *frightening* nonsense, since quantifying everything gets funded these days. The book is worth buying only if you want to discover how appalling the tendency Joseph Weizenbaum described in his fine book "Computer Power and Human Reason: From Judgment to Calculation" can get!
T**O
One of my all time favourites
If you are fascinated by any kind of problem referring to the human mind - and want to learn anything substantial - start with this book. If the brain is the organ that delivers information (and, of course, steers the body through the environment by employing this information), Dretske simply had to be successful in obtaining a great deal of naturalistic insight into the nature of mind, completely different from those "mind is nothing but this or that type of brain-process" stupidities. By the way: Dretske's later books, especially "Explaining Behaviour" and "Naturalising the Mind", are even better. He's definitely my favourite philosopher.
A**R
Dretske my man
A good naturalistic, information-based account of (perceptual) knowledge. Not a great fan of externalism, but I still appreciate it.