

Hardcover: 640 pages
Publisher: Cambridge University Press; 1st edition (October 6, 2003)
Language: English
ISBN-10: 0521642981
ISBN-13: 978-0521642989
Product Dimensions: 7.4 x 1.3 x 9.7 inches
Shipping Weight: 3.3 pounds
Average Customer Review: 4.3 out of 5 stars (23 customer reviews)
Best Sellers Rank: #173,820 in Books
#25 in Books > Computers & Technology > Computer Science > AI & Machine Learning > Computer Vision & Pattern Recognition
#35 in Books > Computers & Technology > Computer Science > AI & Machine Learning > Neural Networks
#52 in Books > Computers & Technology > Computer Science > Information Theory

I find it interesting that most of the people reviewing this book seem to be reviewing it as they would any other information theory textbook. Such a review, whether positive or critical, could not hope to give a complete picture of what this text actually is. There are many books on information theory, but what makes this one unique (and, in my opinion, so outstanding) is the way it integrates information theory with statistical inference. The book covers topics including coding theory, Bayesian inference, and neural networks, but it treats them all as pieces of a unified puzzle, focusing more on the connections between these areas and their philosophical implications, and less on delving deeply into any one area.

This is a learning text, clearly meant to be read and understood. The presentation of topics is greatly expanded and includes much discussion; although the book is dense with material, the exposition is rarely terse. The exercises are absolutely essential to understanding the text. Although the author has made some effort to make certain chapters or topics independent, I think this is one book that is best worked through more or less straight. For this reason and others, it does not make a very good reference: occasionally nonstandard notation or terminology is used.

The biggest strength of this text, in my opinion, is on a philosophical level. It is my opinion, and I think a great shame, that the vast majority of statistical theory and practice is highly arbitrary. This book will provide some tools to (at least in some cases) anchor your thinking to something less arbitrary.
I am reviewing David MacKay's "Information Theory, Inference, and Learning Algorithms", but I haven't yet read it completely. It will be years before I finish it, since it contains the material for several advanced undergraduate or graduate courses. However, it is already on my list of favorite texts and references. It is a book I will keep going back to time after time, but don't take my word for it: according to the back cover, Bob McEliece, the author of a 1977 classic on information theory, recommends you buy two copies, one for the office and one for home. There are topics in this book I am aching to find the time to read, work through, and learn.

It can be used as a textbook, as a reference, or to fill in gaps in your knowledge of information theory and related material. MacKay outlines several courses for which it can be used, including his Cambridge course on Information Theory, Pattern Recognition and Neural Networks, a short course on information theory, and a course on Bayesian inference and machine learning. As a reference it covers topics not easily accessible in other books, including a variety of modern codes (hash codes, low-density parity-check codes, digital fountain codes, and many others) and Bayesian inference techniques (maximum likelihood, Laplace's method, variational methods, and Monte Carlo methods). It has interesting applications, such as information theory applied to genes and evolution, and to machine learning.

It is well written, with good problems: some help you understand the theory, others help you apply it. Many are worked as examples, and some are especially recommended. He works to keep your attention and interest, and knows how to do it; for example, chapter titles include "Why Have Sex?" and "Crosswords and Codebreaking".
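To give a rough flavor of the inference material mentioned above, here is a minimal sketch of my own (not taken from the book): a Shannon entropy calculation for a biased coin and a grid-based Bayesian posterior over a coin's bias under an assumed uniform prior. The function names and the grid resolution are my choices, purely for illustration.

```python
import numpy as np

def entropy_bits(p):
    """Shannon entropy (in bits) of a Bernoulli(p) source, e.g. a bent coin."""
    p = np.clip(p, 1e-12, 1 - 1e-12)
    return -(p * np.log2(p) + (1 - p) * np.log2(1 - p))

def coin_posterior(heads, tails, grid=None):
    """Posterior density over a coin's bias on a grid, assuming a uniform prior."""
    if grid is None:
        grid = np.linspace(0.001, 0.999, 999)
    likelihood = grid ** heads * (1 - grid) ** tails      # Bernoulli likelihood at each grid point
    posterior = likelihood / np.trapz(likelihood, grid)   # normalize numerically
    return grid, posterior

if __name__ == "__main__":
    print(f"H(0.1) = {entropy_bits(0.1):.3f} bits")       # a biased coin carries less than 1 bit per flip
    grid, post = coin_posterior(heads=3, tails=7)
    mean_bias = np.trapz(grid * post, grid)
    print(f"posterior mean bias = {mean_bias:.3f}")       # analytically (3+1)/(10+2) = 0.333 with a uniform prior
```

This is only a toy illustration of the style of calculation involved; the book itself develops these ideas far more carefully and connects them to coding and learning.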
Information Theory, Inference and Learning Algorithms
Statistical Methods for Dynamic Treatment Regimes: Reinforcement Learning, Causal Inference, and Personalized Medicine (Statistics for Biology and Health)
The Elements of Statistical Learning: Data Mining, Inference, and Prediction, Second Edition (Springer Series in Statistics)
Computer Vision: Models, Learning, and Inference
Information Processing with Evolutionary Algorithms: From Industrial Applications to Academic Speculations (Advanced Information and Knowledge Processing)
The Design of Innovation: Lessons from and for Competent Genetic Algorithms (Genetic Algorithms and Evolutionary Computation)
Algorithms in C++ Part 5: Graph Algorithms (3rd Edition) (Pt.5)
Experimental and Quasi-Experimental Designs for Generalized Causal Inference
Probabilistic Reasoning in Intelligent Systems: Networks of Plausible Inference (Morgan Kaufmann Series in Representation and Reasoning)
Understanding Machine Learning: From Theory to Algorithms
Machine Learning with Spark - Tackle Big Data with Powerful Spark Machine Learning Algorithms
Convex Analysis and Minimization Algorithms II: Advanced Theory and Bundle Methods (Grundlehren der mathematischen Wissenschaften)
Probabilistic Reasoning in Expert Systems: Theory and Algorithms
Doctor Mozart Music Theory Workbook Level 1A: In-Depth Piano Theory Fun for Children's Music Lessons and HomeSchooling: Highly Effective for Beginners Learning a Musical Instrument
Music Theory: From Beginner to Expert - The Ultimate Step-By-Step Guide to Understanding and Learning Music Theory Effortlessly
Boosting: Foundations and Algorithms (Adaptive Computation and Machine Learning series)
Learning JavaScript Data Structures and Algorithms - Second Edition
Fundamentals of Machine Learning for Predictive Data Analytics: Algorithms, Worked Examples, and Case Studies (MIT Press)
Machine Learning: The Art and Science of Algorithms that Make Sense of Data
Genetic Algorithms in Search, Optimization, and Machine Learning