Representation and Inference for Natural Language: A First Course in Computational Semantics (Center for the Study of Language and Information - Lecture Notes), Paperback, 2005/4/6
"An exciting combination of standard Montague techniques, modern approaches to underspecification, and the use of first order theorem provers, all in a book that can be used by advanced undergraduates or graduate students." - Robin Cooper, Gorg University"商品の説明をすべて表示する
Carl Hewitt, living on the procedural side of the river, invented a language called "Planner" and emphasized that knowledge consists in the ability to *do* things--to execute procedures.
Alain Colmerauer, living on the logical side of the river, invented a language called "Prolog", and emphasized that knowledge consists of propositions which we can reason about and from which we can draw conclusions.
On the procedural bank, Terry Winograd used Planner to create SHRDLU, a tour de force in natural language processing, which showed how to build an NLP interface that could answer an impressive range of questions about blocks on a table. It could also make and execute plans involving building things with blocks. In his writings, Winograd emphasized the procedural nature of natural language understanding.
On the logical bank, Colmerauer, Roussel, and colleagues created a French question-answering system which showed for the first time that every step of natural language processing, from tokenization to parsing to database query, can be performed by pure logical deduction.
Robert Kowalski was one of the first to perceive that both of these research programs were banks of the same river. As Hewitt observes, Prolog really can be viewed as a version of Planner. The resulting vision is a stunning synthesis: doing things can be viewed as theorem proving, and theorem proving can be viewed as doing things. There is no conflict between the procedural and logical views--indeed, they are two sides of the same coin.
Transcending these false dichotomies, these French-speaking and English-speaking researchers created what is, in the opinion of this reviewer, the mainstream of NLP research. Prolog really is the way to go if you want to do NLP research. Even hard-core Lispers like Peter Norvig tacitly agree with this. Norvig wrote a book subtitled "Case Studies in Common Lisp" which contains some cool NLP programs--but he didn't write those programs directly in Lisp. First, he implemented Prolog in Lisp, and then built his NLP programs on top of Prolog. Really, no matter what programming language you use, you'll eventually have to implement much of Planner/Prolog before you can really do natural language processing.
Through the years, there has been a series of texts embodying the forefront of this school. Michael Covington's book deserves mention in this regard. But for too long the baton has been held by Pereira and Shieber's book "Prolog and Natural-Language Analysis". Finally, at long last, the field has a book that can be considered to have once again pushed the field forward. This book by Blackburn & Bos takes its place at the vanguard of mainstream NLP research.
Long an underground classic--the book has been circulating samizdat-style in lecture-note and draft form--this book was influential even before it was published. A partial list of the innovations it introduces:
1. Unquestionably the BEST discussion of quantifier scope handling techniques ever brought together between two covers. The story of quantifier scope handling is masterfully told from Montague, through Cooper and Keller storage, to the super-ultra-postmodern techniques of constraint-based underspecification.
2. Beautiful examples of Prolog technique, including a way of separating Prolog grammars from back-end semantic processing, which allows the authors to use the same grammar for many different processing backends, some of which aren't even covered in this book.
3. Novel uses of both theorem provers _and_ model builders to handle not just reasoning tasks but also pragmatic tasks, like assessing the informativity of sentences as they follow one another in discourse.
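To make point 1 concrete, here is a minimal sketch of the Cooper-storage idea in Python (my own toy illustration, not code from the book, which works in Prolog with full lambda terms): quantifiers encountered during parsing are set aside in a "store", and retrieving them in different orders yields the different scope readings of one sentence.

```python
from itertools import permutations

def readings(core, store):
    """Enumerate scope readings by retrieving stored quantifiers in every order.

    `core` is the quantifier-free predication; `store` is a list of
    (quantifier, variable) pairs collected during parsing. The quantifier
    retrieved last ends up with widest scope.
    """
    results = []
    for order in permutations(store):
        formula = core
        for quantifier, variable in order:
            # Each retrieval wraps the formula built so far in one more quantifier.
            formula = f"{quantifier} {variable}.({formula})"
        results.append(formula)
    return results

# "Every student read a book": one core predication, two stored quantifiers,
# two retrieval orders, hence two scope readings.
for r in readings("read(x, y)", [("forall", "x"), ("exists", "y")]):
    print(r)
```

The real machinery composes lambda terms rather than strings, and Keller storage adds nested stores to handle quantifiers embedded inside noun phrases, but the order-of-retrieval idea is the same.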
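And to make point 3 concrete: the book poses consistency and informativity as jobs for a model builder and a theorem prover over first-order formulas. The following toy sketch (my own illustration, scaled down to propositional logic with brute-force model enumeration) shows the shape of both checks: a new sentence is consistent with the discourse if some model satisfies both, and informative if the discourse does not already entail it.

```python
from itertools import product

def models(formula, atoms):
    """All truth assignments (as dicts) satisfying `formula` over `atoms`."""
    return [dict(zip(atoms, values))
            for values in product([True, False], repeat=len(atoms))
            if formula(dict(zip(atoms, values)))]

def consistent(discourse, sentence, atoms):
    # Model-builder role: find one model of discourse + new sentence.
    return any(sentence(m) for m in models(discourse, atoms))

def informative(discourse, sentence, atoms):
    # Theorem-prover role: the sentence is uninformative if it already
    # holds in every model of the discourse (i.e. is entailed by it).
    return not all(sentence(m) for m in models(discourse, atoms))

atoms = ["rain", "wet"]
discourse = lambda m: (not m["rain"]) or m["wet"]          # "if it rains, it is wet"
print(consistent(discourse, lambda m: m["rain"], atoms))   # "it rains" is consistent
print(informative(discourse, lambda m: m["wet"], atoms))   # "it is wet" adds information
print(informative(discourse, discourse, atoms))            # restating the discourse does not
```

In the book itself the formulas are first-order, so these checks are handed to off-the-shelf provers and model builders instead of enumerated by brute force; the division of labor is what carries over.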
Mastering the techniques presented in this book will open up a whole new vista. Further research by this school, not covered in this book but available in research papers and lecture notes on the web, includes presupposition handling, DRT-based discourse techniques, and further advances in underspecified representations.
After reading a tutorial text on Prolog like "The Art of Prolog", if you are ready to wade into the river of mainstream NLP research, this book is an excellent place to plunge in.
Readers will find Patrick Blackburn and Johan Bos's book refreshing and informative. So much of the material out there is either completely theoretical or offers only very introductory-level examples.
Representation and Inference for Natural Language is a winner. This book presents a legitimate theoretical introduction along with well-thought-out examples and source code.
The experimental approach used in the book takes the reader through various possibilities, demonstrating their strong points and shortcomings, and then provides the reader with viable (real) solutions, both in a theoretical fashion and in implemented source code.
It has definitely helped me to implement a fairly high-quality Q&A system.
Cheers to the authors!