Information Theory: A Tutorial Introduction (Tutorial Introduction Book) (English) Paperback – February 1, 2015
Originally developed by Claude Shannon in the 1940s, the theory of information laid the foundations for the digital revolution, and is now an essential tool in deep space communication, genetics, linguistics, data compression, and brain sciences. In this richly illustrated book, accessible examples are used to show how information theory can be understood in terms of everyday games like '20 Questions', and the simple MATLAB programs provided give hands-on experience of information theory in action. Written in a tutorial style, with a comprehensive glossary, this text represents an ideal primer for novices who wish to become familiar with the basic principles of information theory.
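The '20 Questions' framing is easy to make concrete: with well-chosen yes/no questions, each answer can halve the set of remaining possibilities, so identifying one item among n equally likely candidates takes about log2(n) questions, which is the entropy of the unknown choice in bits. Here is a minimal Python sketch of that idea (only an illustration; it is separate from the book's MATLAB programs):

```python
import math

# Each well-posed yes/no question can halve the remaining
# possibilities, so n equally likely candidates need about
# log2(n) questions -- the entropy of the choice, in bits.
def questions_needed(n):
    return math.ceil(math.log2(n))

for n in (2, 6, 1024, 1_000_000):
    print(f"{n:>9} possibilities -> {questions_needed(n):>2} yes/no questions")

# Note: 20 questions suffice to pin down one of 2**20
# (about a million) equally likely possibilities.
```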
"This is a really great book - it describes a simple and beautiful idea in a way that is accessible for novices and experts alike. This "simple idea" is that information is a formal quantity that underlies nearly everything we do. In this book, Stone leads us through Shannon's fundamental insights; starting with the basics of probability and ending with a range of applications including thermodynamics, telecommunications, computational neuroscience and evolution. There are some lovely anecdotes: I particularly liked the account of how Samuel Morse (inventor of the Morse code) pre-empted modern notions of efficient coding by counting how many copies of each letter were held in stock in a printer's workshop. The treatment of natural selection as "a means by which information about the environment is incorporated into DNA" is both compelling and entertaining. The substance of this book is a clear exposition of information theory, written in an intuitive fashion (true to Stone's observation that "rigour follows insight"). Indeed, I wish that this text had been available when I was learning about information theory. Stone has managed to distil all of the key ideas in information theory into a coherent story. Every idea and equation that underpins recent advances in technology and the life sciences can be found in this informative little book. " Professor Karl Friston, Fellow of the Royal Society. Scientific Director of the Wellcome Trust Centre for Neuroimaging, Institute of Neurology, University College London. "Information lies at the heart of biology, societies depend on it, and our ability to process information ever more efficiently is transforming our lives. By introducing the theory that enabled our information revolution, this book describes what information is, how it can be communicated efficiently, and why it underpins our understanding of biology, brains, and physical reality. Its tutorial approach develops a deep intuitive understanding using the minimum number of elementary equations. Thus, this superb introduction not only enables scientists of all persuasions to appreciate the relevance of information theory, it also equips them to start using it. The same goes for students. I have used a handout to teach elementary information theory to biologists and neuroscientists for many years. I will throw away my handout and use this book. " Simon Laughlin, Professor of Neurobiology, Fellow of the Royal Society, Department of Zoology, University of Cambridge, England.
Mathematics is hard. The language in which mathematicians describe their work only compounds the difficulty of learning math, for few are fluent in this succinct language. Unfortunately, it is this language that almost all introductory mathematics textbooks conform to, at the cost of the reader's comprehension. Dr. Stone overcomes this language rift by explaining the math in a friendly, familiar way. He further takes care to ensure appropriate time is spent clarifying each topic in a variety of ways (in case one explanation does not make much sense). Stone also provides appendices as a reference for readers who may need more explanation or a refresher. This kind of guidance through mathematical theory is inherently absent from the mathematical language itself, whose core is precision, brevity, and the removal of all redundancy.
It is this very thoughtful explanation and walkthrough that makes me confident in saying that Dr. James V Stone's introduction to information theory is conceivably the best book I have read; not just with regard to information theory, but with regard to mathematics (applied or otherwise) as a whole.
The reader is guided through Shannon's seminal work in a way that is accessible regardless of the reader's background (mathematics, art, biology, journalism, etc.). Dr. Stone helps the reader develop an intuition for information theory.
Such a clear, well-expounded grasp of a mathematical field is so rare these days that the feeling is difficult to describe other than to say you'll "just get it." If you have had minimal exposure to math, are helplessly confused by proofs, or feel like you just never understand, this book is for you. It is equally valuable to those versed in mathematics, as it provides an understanding that a purely theoretical approach often leaves out.
Consider how many people use "basic" mathematics intuitively to approach and solve questions in their daily lives, e.g.: if I make $x per month, how much do I earn per week? Of my weekly earnings, if I set aside $y for groceries and $z for savings, how much do I have left to spend?
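To make that everyday example concrete, here is a minimal Python sketch of the same calculation; the monthly salary and the grocery and savings amounts are hypothetical figures chosen for illustration, not values from the book:

```python
# Everyday budgeting arithmetic, made explicit.
monthly_salary = 4000.0    # hypothetical: $x per month
weeks_per_month = 52 / 12  # average number of weeks in a month

weekly_earnings = monthly_salary / weeks_per_month
groceries = 150.0          # hypothetical: $y set aside for groceries
savings = 200.0            # hypothetical: $z set aside for savings

left_to_spend = weekly_earnings - groceries - savings
print(f"Weekly earnings: ${weekly_earnings:.2f}")
print(f"Left to spend:   ${left_to_spend:.2f}")
```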
The commonplaceness of applying math in this way, by understanding the concepts behind the math rather than just plugging numbers into a formula, is exactly what you can expect to gain from reading Dr. Stone's book. By the end of even the first section of the first chapter, the reader may find that they are already grasping this intuitive understanding and applying it to the world around them. Dr. Stone helps the reader internalize the core concepts of information theory so that the math behind them becomes a tool for the reader to use, rather than something to be perplexed by.
The book is a joy to read, and a privilege to learn from.
For those who have read the review this far: I study mathematics and neuroscience. As such, I have read my fair share of mathematical textbooks, mathematical introductions, and math books for "dummies" (as I often feel myself to be). In my experience math never stops being complex, and try as many might to simplify it, none have succeeded as well as Dr. Stone. I was versed in information theory before reading this book, yet the intuition and deep understanding and appreciation for this field that I have gained from it are unheard-of. It is intuition that makes a great mathematician, and this book will teach you to think intuitively. The clarity of Dr. Stone's work is so profound that I have little other way to describe how accessible it is to all walks of life.
While many may not consider this a "mathematical textbook," let Dr. Stone's style be an example of how math should initially be taught. This book works well as a standalone text and as a supplement to more intricate texts on information theory. Do not let the word "introduction" in the title fool you: Dr. Stone manages to preserve the intricacies of the field in a way that is often overlooked.
The book starts out by defining the unit of information, the bit, and then shows how to quantify information so that it has the properties we would like it to have, namely additivity, continuity, maximality with respect to certain probability distributions, and symmetry. The author works through various examples to give a feel for how information can be measured and for how many means of communication contain a great deal of redundancy.

He first tackles discrete random variables, their entropy, and their information. Basic ideas, such as the principle that the less likely a message is, the more information it carries, lead to the construction of a measure of information that satisfies Shannon's conditions. From that construction of entropy, the author then explores its consequences with examples such as dice. The explanations are clear, and it is hard not to feel that you are really making progress on what is supposed to be a very difficult subject.

The author then moves from information to coding theory, the practical application of the subject, and introduces ideas like channel capacity (how much information can be transmitted over a noiseless channel), conditional expectations, and coding schemes that can deliver results arbitrarily close to the channel capacity. He spends some time discussing redundancy in English and coding schemes for building efficient blocks for the language, which is really interesting. One learns very quickly through the tutorials that making blocks of code equiprobable is the way to achieve the best coding efficiency and maximize information transmission.

The author then moves on to noisy channels, which are far more practical and further develop the reader's intuition. One learns how to add redundancy to codes to make them resilient to noise contamination.

Next come continuous random variables and their information. The information of a continuous variable is a more difficult subject: it can be considered infinite, since the possible events are uncountable, but noise avoids this problem, and the author explains the ideas well enough that the content remains clear. He discusses mutual information in the continuous case, giving the reader a strong understanding of joint and marginal probability distributions. With these tools, the author returns to the capacity of a noisy channel for continuous variables and revisits the ideas from discrete variables in this new light.

Finally, the author moves on from communication-related information theory to entropy in physics. These chapters are meant to give a feel for the similarities, and topics like thermodynamics and quantum information are lightly touched on.
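As a companion to that walkthrough, here is a minimal Python sketch (mine, not one of the book's MATLAB programs) of the two central quantities described above: the entropy of a discrete variable, illustrated with a fair die, and the mutual information between the input and output of a noisy binary channel. The channel's flip probability is a hypothetical value chosen for illustration.

```python
import numpy as np

def entropy(p):
    """Shannon entropy H = -sum p(x) log2 p(x), in bits."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]  # 0 * log2(0) is taken to be 0
    return -np.sum(p * np.log2(p))

# Entropy of a fair six-sided die: log2(6), about 2.585 bits.
die = np.full(6, 1 / 6)
print(f"H(fair die) = {entropy(die):.3f} bits")

# A binary symmetric channel: each input bit is flipped
# with probability f (assumed noise level for illustration).
f = 0.1
px = np.array([0.5, 0.5])    # equiprobable inputs

# Joint distribution p(x, y) = p(x) * p(y|x); rows index x.
p_y_given_x = np.array([[1 - f, f],
                        [f, 1 - f]])
pxy = px[:, None] * p_y_given_x
py = pxy.sum(axis=0)         # marginal p(y)

# Mutual information I(X;Y) = H(X) + H(Y) - H(X,Y).
mi = entropy(px) + entropy(py) - entropy(pxy.ravel())
print(f"I(X;Y)      = {mi:.3f} bits per symbol")
print(f"Capacity C  = {1 - entropy([f, 1 - f]):.3f} bits, i.e. 1 - H(f)")
```

With equiprobable inputs, the printed mutual information coincides with the channel capacity 1 - H(f), which echoes the point above that equiprobable blocks maximize information transmission.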
Information Theory is a highly readable account of what is usually a very technical subject. The reader will come away with an intuitive feel for information, for how information is transmitted across various channels, and for the coding schemes that do this effectively. The only caveat is that there are no exercises, so the reader's confidence will to some extent be false, as reading and implementing are very different things. That being said, as an introductory text before tackling more mathematical works, I think this is very helpful. I have both this book and the author's work on Bayes, and I prefer the content of this one. Definitely recommended.
This book is a great example of an author really taking the time to home in on the minimally sufficient prose and exposition needed to make a challenging topic accessible. I'd like to thank the author for writing an excellent book that enriched my understanding of this wonderful field.