Probably Approximately Correct: Nature’s Algorithms for Learning and Prospering in a Complex World

by Leslie Valiant

ISBN: 0465032710
Author: Leslie Valiant
Publisher: Basic Books; 1 edition (June 4, 2013)
Language: English
Pages: 208
Category: Technologies and Future
Subcategory: Computer Science
Rating: 4.4
Format: LIT, RTF, LRF, MOBI


"Leslie Valiant's Probably Approximately Correct is a detailed, much-needed guide to how nature brought us here, and where technology is taking us next." (George Dyson, author of Turing's Cathedral and Darwin among the Machines)

From a related technical abstract: "We define a natural model of random monotone DNF formulas and give an efficient algorithm which, with high probability, can learn, for any fixed constant γ > 0, a random t-term monotone DNF for any t = O(n^(2−γ)), as well as an efficient algorithm which with high probability can learn a random t-term DNF for any t = O(n^(3/2−γ))."
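For readers unfamiliar with the terminology, a small illustrative example may help (the formula below is my own, not taken from the abstract or the book): a monotone DNF formula is an OR of AND-terms over unnegated Boolean variables, and t counts the AND-terms.

% Illustrative 3-term monotone DNF over n = 6 variables (so t = 3):
\[
  f(x_1,\dots,x_6) \;=\; (x_1 \wedge x_3) \,\vee\, (x_2 \wedge x_5 \wedge x_6) \,\vee\, x_4 .
\]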

The e-book "Probably Approximately Correct: Nature's Algorithms for Learning and Prospering in a Complex World" by Leslie Valiant. This book can be read in Google Play Books on a computer, as well as on Android and iOS devices. Highlight text, add bookmarks, and take notes by downloading "Probably Approximately Correct: Nature's Algorithms for Learning and Prospering in a Complex World" for offline reading.

In Probably Approximately Correct, computer scientist Leslie Valiant presents a theory of the theoryless.

Probably Approximately Correct : Nature's Algorithms for Learning and Prospering in a Complex World. From a leading computer scientist, a unifying theory that will revolutionize our understanding of how life evolves and learns.

In computational learning theory, probably approximately correct (PAC) learning is a framework for the mathematical analysis of machine learning. It was proposed in 1984 by Leslie Valiant. In this framework, the learner receives samples and must select a generalization function (called the hypothesis) from a certain class of possible functions. The goal is that, with high probability (the "probably" part), the selected function will have low generalization error (the "approximately correct" part).
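As a concrete illustration of this framework, here is a minimal sketch of the standard textbook example of PAC learning a monotone conjunction by elimination. The variable names, the uniform sample distribution, and the sample-size formula m = ceil((n/ε)(ln n + ln(1/δ))) are illustrative assumptions of this sketch, not quotations from the book.

# A standard illustration of PAC learning (names and numbers are illustrative):
# learn a hidden monotone conjunction over n Boolean variables by elimination:
# start from the conjunction of all variables and, on every positive example,
# drop the variables that appear as 0.
import math
import random

def draw(n):
    """One example drawn uniformly from {0,1}^n."""
    return tuple(random.randint(0, 1) for _ in range(n))

def label(x, concept):
    """True iff every variable index in the concept is set to 1 in x."""
    return all(x[i] for i in concept)

def learn_conjunction(n, examples):
    """Elimination rule: keep only variables consistent with all positive examples."""
    hypothesis = set(range(n))
    for x, y in examples:
        if y:  # positive example
            hypothesis -= {i for i in range(n) if x[i] == 0}
    return hypothesis

def estimated_error(n, concept, hypothesis, trials=20000):
    """Monte Carlo estimate of the generalization error on fresh samples."""
    wrong = sum(label(x, concept) != label(x, hypothesis)
                for x in (draw(n) for _ in range(trials)))
    return wrong / trials

if __name__ == "__main__":
    n = 20
    concept = {0, 3, 7}                      # hidden target: x0 AND x3 AND x7
    eps, delta = 0.1, 0.05                   # "approximately" and "probably"
    m = math.ceil((n / eps) * (math.log(n) + math.log(1 / delta)))
    examples = [(x, label(x, concept)) for x in (draw(n) for _ in range(m))]
    h = learn_conjunction(n, examples)
    print("learned:", sorted(h), " estimated error:", estimated_error(n, concept, h))

With roughly this many samples, the elimination rule returns a hypothesis that, with probability at least 1 − δ, disagrees with the target on at most an ε fraction of fresh examples, which is exactly the "probably approximately correct" guarantee described above.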


The highest award given in computer science is the Turing Award, which Valiant received in 2010. When a Turing laureate writes a book about ways of understanding processes in the world, one is well advised to take an interest. Algorithms are sets of rules for how to deal with inputs. Valiant shows that, often, an algorithm can be improved if it changes along with its environment.

From a leading computer scientist, a unifying theory that will revolutionize our understanding of how life evolves and learns.

How does life prosper in a complex and erratic world? While we know that nature follows patterns—such as the law of gravity—our everyday lives are beyond what known science can predict. We nevertheless muddle through even in the absence of theories of how to act. But how do we do it?

In Probably Approximately Correct, computer scientist Leslie Valiant presents a masterful synthesis of learning and evolution to show how both individually and collectively we not only survive, but prosper in a world as complex as our own. The key is “probably approximately correct” algorithms, a concept Valiant developed to explain how effective behavior can be learned. The model shows that pragmatically coping with a problem can provide a satisfactory solution in the absence of any theory of the problem. After all, finding a mate does not require a theory of mating. Valiant's theory reveals the shared computational nature of evolution and learning, and sheds light on perennial questions such as nature versus nurture and the limits of artificial intelligence.

Offering a powerful and elegant model that encompasses life's complexity, Probably Approximately Correct has profound implications for how we think about behavior, cognition, biological evolution, and the possibilities and limits of human and machine intelligence.
User reviews
Modar
This book is by a machine learning expert. He is interested in models of learning, and particularly in their assessment in terms of computational complexity theory. The book takes seriously the role of evolution itself in the context of knowledge acquisition processes. It argues that evolution is a subset of learning processes.

Overall, the book is a reasonable one. However, the presentation is a bit dry and boring. The author apparently likes coining terms, and dislikes reviewing the work of others. As Leslie says, there is indeed a close link between the theories of evolution and learning. He correctly argues against the modern dogma of directionless evolution (since evolution and learning are linked and learning is clearly directional). Leslie argues that "fitness" provides such a direction. In fact a much stronger case than the one Leslie gives can be made - based on thermodynamics.

Overall, I am inclined to think that the book has its core thesis backwards. Instead of evolution being a subset of learning processes, learning processes are part of evolution. Most of the rest of this review focuses on this one point, because I think it is an important one.

The idea that learning is a part of evolution is an old one. James Mark Baldwin proposed that organisms could learn a behavioural trait and then see genetic predispositions to learning that behaviour amplified by evolution. This idea was later generalised by Waddington - who proposed that genes could take over the trait completely - via a process known as "genetic assimilation". We see this effect in modern times, with learned milk drinking preceding genetically encoded lactose tolerance. Overall, the course of evolution is altered significantly by individual and social learning processes.

Leslie says that "The idea that evolution is a form of learning sounds implausible to many people when they first hear it." I think this is because he has things backwards - and learning is better seen as one of the products of evolution. How does Leslie argue that evolution is part of learning - and not the other way around? Leslie confines his attention to the case of "Darwinian evolution". According to Leslie, this term refers to evolution without learning. Leslie asserts that, in Darwinian evolution, genetic variations are generated independently of current experiences - a constraint that does not apply to learning systems. Unfortunately for Leslie's thesis, this isn't the kind of evolution that Darwin believed in. Darwin was well aware of the role of learning in evolution. Indeed he formulated a theory to explain how current experiences went on to affect the next generation. Darwin's proposed "gemmules" were subsequently discredited, but they clearly show that Darwin thought that current experiences influenced heritable variation.

Leslie goes on to describe modern cultural evolution, saying that "culture also undergoes change or evolution, but this change is no longer limited by Darwinian principles". However, Darwin was, in fact, a pioneer in the discovery of cultural evolution, writing about how words and languages were subject to natural selection. Leslie argues that human culture introduced learning to evolution. He minimizes the significance of cultural inheritance in other animals and the influence of individual learning on DNA evolution via the Baldwin effect and genetic assimilation. He says that before human culture "the learning and reasoning carried out by an organism during its life had limited impact that outlived the individual". I think this is a big understatement that is not really consistent with the scientific evidence on the role of learning in evolution. Learning is important, and its impact on evolution long predates human cultural evolution.

The "Darwinian evolution" described by the author would have been foreign to Darwin. Also, we know that the idea that genetic variations are generated independently of current experiences is wrong - not least because of the role of stress in stimulating the production of mutations. This it isn't the kind of evolutionary theory that is much use for explaining what happens in nature. Why Leslie focuses on this impoverished version of evolutionary theory is not completely clear. Perhaps he really believes that this is what Darwinian evolutionary theory says. Or perhaps making Darwinism look weak makes his own field of learning seem more important.

So far, this has been mostly an argument over terminology - specifically over what the term "Darwinian evolution" refers to. This debate has limited interest - and can mostly be avoided with clear definitions. However, the problem with learning theorists placing learning centrally and denigrating the power of Darwinian evolution is that they then fail to make proper use of the insights evolutionary theory provides. In fact, Darwinism has much to say about how the brain processes responsible for animal learning work. Natural selection acts on synapses. Axon pulses are copied with variation and selection. There's competition between ideas within the brain for attention. The result is a good adaptive fit between an organism's model of its world and its environment. Interesting though these ideas are, you won't find anything like them in this book. Indeed, few machine learning experts appear to have looked into the implications of modern versions of Darwinism. Instead, Leslie sees Darwinian evolution as a primitive ladder that led to modern learning systems. He doesn't deal with the more powerful, generalized versions of evolutionary theory that also cover organisms that learn or make use of cultural transmission.
IGOT
When one is confronted with a “your money or your life” proposition from a gun-wielding thug in a dark alley, there is no temptation to poll the data on dark alley robberies in order to calculate the probability that the thief will pull the trigger. Unless one is trained to deal specifically with threats and stressful situations such as this, one quickly hands over the wallet or the purse. No sophisticated time intensive algorithms are in play in this situation. The pattern matching of the gun image and the affective capabilities of the brain take over here, provoking with incredible speed an appropriate fear response. The feedback received from this situation is that of walking away unscathed.

The point to be made here is that if one is to view the brain as a computational entity that deploys various algorithms to deal with situations like this, and with survival in general, one must come to grips with the computational complexity of these algorithms. One must acknowledge that survival entails, in many instances, that thought processes operate on time scales that can be very short, as well as on time scales that can be very long, i.e. that require much deliberation and a quantitative assessment of risks.

The author of this book is well aware of the issues with computational complexity and, via the idea (which he invented) of ‘probably approximately correct’ or PAC learning for short, has given evolutionary biologists an interesting and provocative view of evolutionary processes that addresses some of the gaps in the Darwinian paradigm.

The book is highly interesting, and its perusal will not only help the reader understand the issues at stake in the Darwinian view of evolution but will also assist the uninitiated reader in understanding PAC learning itself. In this regard the author devotes a portion of the book to PAC learning, and examples are given that illustrate it. A very plausible case for the role of PAC learning in evolutionary processes is outlined, and it should be understandable to anyone with even a modest background in computer science and mathematical logic. Readers with more background in learning theory and artificial intelligence can still appreciate the book, even though the rigorous formalism has been omitted in order to appeal to a wider audience.

Whatever the eventual impact this book has on evolutionary biology, it raises issues that should be addressed when contemplating the Darwinian paradigm, and beyond that it addresses requirements that every algorithm developer confronts in everyday practice. These involve the running time of algorithms designed for practical use, the data needed for these algorithms, which can frequently be corrupted or sparse, and the overhead generated by the algorithms, especially those deployed on information networks.
Atineda
PAC learning is an interesting paradigm, but the writer obscures his argument for it as a path to describing evolution mathematically by spending so much time repeating his view that evolution lacks explanatory/predictive value. The whole work comes off as yet another attack on evolution based on an outdated understanding, rather than the intended quest to find a fitting computational model for it.