Knowledge and the Flow of Information (Bradford Books)

Author: Fred Dretske
ISBN: 0262040638
Publisher: The MIT Press; 1st edition (May 28, 1981)
Language: English
Pages: 284
Category: Other
Subcategory: Humanities
Rating: 4.9
Format: lrf, rtf, azw, mobi


Established in 1962, the MIT Press is one of the largest and most distinguished university presses in the world and a leading publisher of books and journals at the intersection of science, technology, art, social science, and design.

There are a lot of dated features in the book.

This book presents an attempt to develop a theory of knowledge and a philosophy of mind using ideas derived from the mathematical theory of communication developed by Claude Shannon. Information is seen as an objective commodity defined by the dependency relations between distinct events. Knowledge is then analyzed as information-caused belief. Perception is the delivery of information in analog form for conceptual utilization by cognitive mechanisms. The final chapters attempt to develop a theory of meaning by viewing meaning as a certain kind of role.

Dretske, Fred I. (1981). Knowledge and the Flow of Information. Cambridge, Mass.: MIT Press.

Knowledge and the Flow of Information (Bradford Books). Published March 29, 1983 by The MIT Press.

What distinguishes clever computers from stupid people (besides their components)? The author of Seeing and Knowing presents in his new book a beautifully and persuasively written interdisciplinary approach to traditional problems—a clearsighted interpretation of information theory. Psychologists, biologists, computer scientists, and those seeking a general unified picture of perceptual-cognitive activity will find this provocative reading. The problems Dretske addresses in Knowledge and the Flow of Information—What is knowledge? How are the sensory and cognitive processes related? What makes mental activities mental?—appeal to a wide audience. The conceptual tools used to deal with these questions (information, noise, analog versus digital coding, etc.) are designed to make contact with, and exploit the findings of, empirical work in the cognitive sciences. A concept of information is developed, one deriving from (but not identical with) the Shannon idea familiar to communication theorists, in terms of which the analyses of knowledge, perception, learning, and meaning are expressed. The book is materialistic in spirit—that is, spiritedly materialistic—devoted to the view that mental states and processes are merely special ways physical systems have of processing, coding, and using information.
User reviews
Avarm
Great read
Painbrand
Dretske starts with Claude Shannon's mathematical theory of communication, and from this foundation, tries to justify a semantic theory of information and explain something about perception, meaning, and belief.

In this book, Dretske reminds one of a philosopher who, upon understanding the beauty in a small portion of applied mathematics, tries to explain the world with his newfound tool. Of course he does not explain the world, only the expansive topics of perception, meaning, and belief. Dretske is clearly awed by Shannon's innovative work--we can excuse him for that. Shannon's theory opened up an entire field of study and technology.

For those who already understand the mathematics of information theory, Dretske takes the well-defined mathematical concept of mutual information and evaluates it at a particular value, in symbols:

I(S;R=ri)=H(S)-H(S|R=ri), where I(S;R=ri) is the mutual information, H(S) is the entropy of S, and H(S|R=ri) is the conditional entropy evaluated at a particular value of R=ri. S and R are variables representing source and receiver messages.

Dretske calls I(S;R=ri) the amount of information carried by a particular signal ri. He grounds the entire book on this concept, yet this foundation nearly becomes irrelevant after he defines the information content of a signal: "A signal r carries the information that s is F" holds just in case P(s is F|r,k)=1, where k is the receiver's background knowledge.
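Both quantities the review mentions can be computed directly from a joint distribution over source states and signals. The sketch below uses a small hypothetical distribution (the states "F"/"G" and signals "r1"-"r3" are illustrative, not from Dretske) to show I(S;R=ri) = H(S) - H(S|R=ri) and to check the P(s is F|ri)=1 condition:

```python
import math

# Hypothetical joint distribution P(s, r) over source states s and
# received signals r (names are illustrative only).
joint = {
    ("F", "r1"): 0.25, ("F", "r2"): 0.25,
    ("G", "r2"): 0.25, ("G", "r3"): 0.25,
}

def entropy(dist):
    """Shannon entropy H of a probability distribution, in bits."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

def marginal_s(joint):
    """Marginal distribution P(S) obtained by summing out the signal."""
    out = {}
    for (s, _), p in joint.items():
        out[s] = out.get(s, 0.0) + p
    return out

def conditional_s_given(joint, ri):
    """Posterior P(S | R = ri) over source states after receiving ri."""
    pr = sum(p for (_, r), p in joint.items() if r == ri)
    return {s: p / pr for (s, r), p in joint.items() if r == ri}

def info_carried(joint, ri):
    """I(S; R = ri) = H(S) - H(S | R = ri): the surprisal reduction
    produced by this particular signal, not the average over all signals."""
    return entropy(marginal_s(joint)) - entropy(conditional_s_given(joint, ri))

# r1 settles the source completely: P(s is F | r1) = 1, so on Dretske's
# definition r1 carries the information that s is F.
print(info_carried(joint, "r1"))        # 1.0 (bit)
print(info_carried(joint, "r2"))        # 0.0 -- r2 leaves S as uncertain as before
print(conditional_s_given(joint, "r1")) # {'F': 1.0}
```

Note how the two notions come apart, which is the reviewer's point: r2 has well-defined (zero) mutual information, but only r1, with its conditional probability of 1, carries "information content" in Dretske's sense.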

Now Dretske turns to conditional probability for answers to deep philosophical questions. Yet philosophers and scientists are far from understanding the nature of probability itself, and it is not clear that a conditional probability of 1 says anything more than exactly that. You may say that P(X|Y)=1 means that Y carries information about X, but that interpretation adds nothing to our understanding--it simply defines a natural language sentence in terms of a probabilistic sentence. Philosophers looking to justify their work by connecting natural language concepts to math might find this impressive. Others will see it for what it is: using 'big words' or 'mathematical words' to sound smart.

Check out Shannon's original paper on 'information' theory online:
[...]
fetish
I was required to read this book in grad school (I was embarrassed for the teacher, since the selection reflects on the selector). It is a genuinely awful book. The style was (for me, at least) indigestible. The main thesis of the book, that *meaning* -- as opposed to bit configurations -- can be *quantified*, is not just nonsense, but *frightening* nonsense, since quantifying everything gets funded these days. The book is worth buying only if you want to discover how appalling the tendency Joseph Weizenbaum described in his fine book "Computer Power and Human Reason: From Judgment to Calculation" can get!
Nicearad
I WANT TO SEE THE TABLE OF CONTENTS OF AMAZON BOOKS!!!