Pattern Classification (English) Hardcover – October 2000
The first edition, published in 1973, has become a classic reference in the field. Now with the second edition, readers will find information on key new topics such as neural networks and statistical pattern recognition, the theory of machine learning, and the theory of invariances. Also included are worked examples, comparisons between different methods, extensive graphics, expanded exercises and computer project topics.
An Instructor's Manual presenting detailed solutions to all the problems in the book is available from the Wiley editorial department.
"…it provides a good introduction to the subject of Pattern Classification." (Journal of Classification, September 2007)
"…a fantastic book! The presentation...could not be better, and I recommend that future authors consider…this book as a role model." (Journal of Statistical Computation and Simulation, March 2006)
"...strongly recommended both as a professional reference and as a text for students..." (Technometrics, February 2002)
"...provides information needed to choose the most appropriate of the many available techniques for a given class of problems." (SciTech Book News, Vol. 25, No. 2, June 2001)
"I do not believe anybody wishing to teach or do serious work on Pattern Recognition can ignore this book, as it is the sort of book one wishes to find the time to read from cover to cover!" (Pattern Analysis & Applications Journal, 2001)
"This book is the unique text/professional reference for any serious student or worker in the field of pattern recognition." (Mathematical Reviews, Issue 2001k)
"...gives a systematic overview about the major topics in pattern recognition, based whenever possible on fundamental principles." (Zentralblatt MATH, Vol. 968, 2001/18)
"attractively presented and readable" (Journal of Classification, Vol. 18, No. 2, 2001)
With this in mind, the authors and their new coauthor David Stork set about the task of providing a revision. True to the goals of the original, the authors undertake to describe pattern recognition under a variety of topics, with several available methods covered for each topic. Important new areas are added, and older topics now deemed less significant are dropped. Advances in statistical computing, and in computing generally, also dictate the choice of topics. So although the authors are the same and the title is almost the same (note that "scene analysis" is dropped from the title), it is more like an entirely new book on the subject than a revision of the old. For a revision, I would expect to see mostly the same chapters with the same titles, only a few new chapters, and expansion of the old chapters.
Although I view this as a new book, that is not necessarily bad. In fact it may be viewed as a strength of the book. It maintains the style and clarity of the original that we all loved but represents the state-of-the-art in pattern recognition at the beginning of the 21st Century.
The original had some very nice pictures. I liked some of them so much that I used them with permission in the section on classification error rate estimation in my bootstrap book. This edition goes much further with beautiful graphics including many nice three-dimensional color pictures like the one on the cover page.
The standard classical material is covered in the first five chapters, with new material included (e.g., the EM algorithm and hidden Markov models in Chapter 3). Chapter 6 covers multilayer neural networks (a totally new area). Nonmetric methods, including decision trees and the CART methodology, are covered in Chapter 8. Each chapter has a large number of relevant references and many homework exercises and computer exercises.
Chapter 9 is "Algorithm-Independent Machine Learning," and it includes the wonderful "No Free Lunch" theorem (Theorem 9.1), a discussion of the minimum description length principle, overfitting issues and Occam's razor, bias-variance tradeoffs, resampling methods for estimation and classifier evaluation, and ideas about combining classifiers.
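The idea of combining classifiers that closes the chapter can be sketched in a few lines. This is only an illustration of majority voting, not anything taken from the book; the three threshold "classifiers" are toy rules invented for the example:

```python
# Toy sketch of classifier combination by majority vote.
# Each "classifier" is a hand-made threshold rule on a scalar input.

def clf_a(x):
    return 1 if x > 0.5 else 0   # fires on moderately large x

def clf_b(x):
    return 1 if x > 0.0 else 0   # fires on any positive x

def clf_c(x):
    return 1 if x > 0.9 else 0   # a deliberately conservative rule

def majority_vote(classifiers, x):
    """Return the label chosen by more than half of the component classifiers."""
    votes = [c(x) for c in classifiers]
    return 1 if sum(votes) > len(votes) / 2 else 0

ensemble = [clf_a, clf_b, clf_c]
print(majority_vote(ensemble, 0.7))   # clf_a and clf_b vote 1 -> prints 1
print(majority_vote(ensemble, -0.2))  # all three vote 0 -> prints 0
```

The combined rule can outperform any single member when the members err on different inputs, which is the intuition behind the chapter's treatment of ensembles.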
Chapter 10 is on unsupervised learning and clustering. In addition to the traditional techniques covered in the first edition, the authors include the many advances in mixture models.
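To make the mixture-model material concrete, here is a minimal sketch of EM fitting a two-component one-dimensional Gaussian mixture. It is a bare-bones illustration of the general technique, not the book's presentation: no convergence test, no restarts, and a crude initialisation from the data range.

```python
import math
import random

def em_gmm_1d(data, iters=50):
    """Bare-bones EM for a two-component 1-D Gaussian mixture (sketch only)."""
    mu = [min(data), max(data)]   # crude initialisation from the data range
    var = [1.0, 1.0]
    w = [0.5, 0.5]                # mixing weights
    for _ in range(iters):
        # E-step: posterior responsibility of each component for each point
        resp = []
        for x in data:
            p = [w[k] * math.exp(-(x - mu[k]) ** 2 / (2 * var[k]))
                 / math.sqrt(2 * math.pi * var[k]) for k in range(2)]
            s = p[0] + p[1]
            resp.append([p[0] / s, p[1] / s])
        # M-step: re-estimate weights, means, and variances
        for k in range(2):
            nk = sum(r[k] for r in resp)
            w[k] = nk / len(data)
            mu[k] = sum(r[k] * x for r, x in zip(resp, data)) / nk
            var[k] = max(sum(r[k] * (x - mu[k]) ** 2
                             for r, x in zip(resp, data)) / nk, 1e-6)
    return w, mu, var

random.seed(0)
data = ([random.gauss(0.0, 1.0) for _ in range(200)] +
        [random.gauss(5.0, 1.0) for _ in range(200)])
w, mu, var = em_gmm_1d(data)
print(sorted(mu))  # component means recovered near 0 and 5
```

On this well-separated toy data the estimated means land close to the true values of 0 and 5, which is the behaviour the chapter's discussion of mixture densities leads one to expect.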
I was particularly interested in that part of Chapter 9. The coverage of the topics is good, and the authors provide a number of useful references. However, I was a bit disappointed with the cursory treatment of bootstrap estimation of classification accuracy (Section 9.6.3, pages 485-486). I particularly disagree with the simplistic statement "In practice, the high computational complexity of bootstrap estimation of classifier accuracy is rarely worth possible improvements in that estimate (Section 9.5.1)". On the other hand, the book is one of the first to cover the newer and promising resampling approaches called "bagging" and "boosting", which these authors seem to favor.
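The bootstrap estimate under discussion amounts to a short resampling loop. The sketch below is a generic illustration under my own assumptions (a toy midpoint-threshold classifier and out-of-bag scoring), not the specific estimator from Section 9.6.3:

```python
import random

def train_threshold(sample):
    """Fit a toy 1-D classifier: threshold at the midpoint of the class means.
    A stand-in for any real classifier; the point here is the resampling loop."""
    c0 = [x for x, y in sample if y == 0]
    c1 = [x for x, y in sample if y == 1]
    t = (sum(c0) / len(c0) + sum(c1) / len(c1)) / 2
    return lambda x: 1 if x > t else 0

def bootstrap_error(data, n_boot=200, seed=0):
    """Average out-of-bag error rate over n_boot bootstrap resamples."""
    rng = random.Random(seed)
    errs = []
    for _ in range(n_boot):
        boot = [rng.choice(data) for _ in data]  # draw n points with replacement
        bootset = set(boot)
        oob = [p for p in data if p not in bootset]  # points left out of the sample
        if not oob:
            continue
        clf = train_threshold(boot)
        errs.append(sum(clf(x) != y for x, y in oob) / len(oob))
    return sum(errs) / len(errs)

rng = random.Random(1)
data = ([(rng.gauss(0.0, 1.0), 0) for _ in range(30)] +
        [(rng.gauss(4.0, 1.0), 1) for _ in range(30)])
print(bootstrap_error(data))  # small error for well-separated classes
```

The loop makes the computational-cost question concrete: the classifier is retrained n_boot times, which is exactly the expense the book's quoted remark weighs against the improvement in the error estimate.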
Davison and Hinkley's bootstrap text is mentioned for its practical applications and guidance on bootstrapping. However, the authors overlook Shao and Tu, which offers more in the way of guidance, and my own book, which provides some guidance on error rate estimation, is also overlooked.
My book also illustrates the limitations of the bootstrap. Phil Good's book, which the authors do mention, provides guidance, but it is very superficial and overgeneralized as a guide for practitioners. For these reasons I held back my enthusiasm and gave this text only four stars.
For a more up-to-date treatment, see McLachlan's recent book in the Wiley statistics series. Still, this book provides valuable explanations of Bayes rules and shows pictorially what the boundaries look like for linear and quadratic classifiers. In fact, I borrowed the pictures in their Chapter 2 for my book on bootstrap methods, and they appear in both the first and second editions of my book.
The authors, with the help of a third author, have recently updated this book, and I highly recommend the new edition, which maintains many of the nice features of the original.