This engaging textbook unites information theory and inference, two subjects that are frequently taught separately.

These subjects lie at the heart of many intriguing fields, including communication, signal processing, data mining, machine learning, pattern recognition, computational neuroscience, bioinformatics, and cryptography.

This textbook introduces information theory together with its applications. Information theory is taught alongside practical communication techniques, such as arithmetic coding for data compression and sparse-graph codes for error correction.

A toolbox of inference techniques, including message-passing algorithms, Monte Carlo methods, and variational approximations, is developed alongside applications of these tools to clustering, convolutional coding, independent component analysis, and neural networks.

The book's concluding section discusses the state of the art in error-correcting codes, including low-density parity-check codes, turbo codes, and digital fountain codes. These codes are the twenty-first-century standards for satellite communications, disk drives, and data broadcast.

Richly illustrated, filled with worked examples, and containing over 400 exercises, some with detailed solutions, David MacKay's ground-breaking book is ideal for self-learning as well as for undergraduate or graduate courses.

Along the way, there are entertaining digressions on sex, evolution, and crosswords. This is a textbook on information, communication, and coding for a new generation of students, and it offers an unparalleled entry point to these subjects for professionals in areas as diverse as computational biology, financial engineering, and machine learning.