Boosting

Foundations and Algorithms

Paperback
$50.00 US | $66.00 CAN
On sale Jan 10, 2014 | 544 Pages | 9780262526036

About

An accessible introduction and essential reference for an approach to machine learning that creates highly accurate prediction rules by combining many weak and inaccurate ones.

Boosting is an approach to machine learning based on the idea of creating a highly accurate predictor by combining many weak and inaccurate “rules of thumb.” A remarkably rich theory has evolved around boosting, with connections to a range of topics, including statistics, game theory, convex optimization, and information geometry. Boosting algorithms have also enjoyed practical success in such fields as biology, vision, and speech processing. At various times in its history, boosting has been perceived as mysterious, controversial, even paradoxical.
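
As a rough illustration of this weak-to-strong combination, here is a minimal from-scratch sketch in Python in the spirit of the AdaBoost reweighting idea the authors developed; the synthetic dataset, the decision-stump weak learner, and the function names train_adaboost and predict are illustrative assumptions, not material from the book.

import numpy as np

def train_adaboost(X, y, n_rounds=20):
    """Illustrative sketch only. Labels y must be in {-1, +1};
    returns a list of weighted decision stumps."""
    n, d = X.shape
    w = np.full(n, 1.0 / n)                      # distribution over training examples
    ensemble = []
    for _ in range(n_rounds):
        # Pick the decision stump (feature, threshold, polarity) with the
        # smallest error under the current example weights.
        best, best_err = None, np.inf
        for j in range(d):
            for thr in np.unique(X[:, j]):
                for polarity in (1, -1):
                    pred = polarity * np.where(X[:, j] <= thr, 1, -1)
                    err = w[pred != y].sum()
                    if err < best_err:
                        best, best_err = (j, thr, polarity), err
        err = min(max(best_err, 1e-12), 1 - 1e-12)
        alpha = 0.5 * np.log((1 - err) / err)    # vote weight for this weak rule
        j, thr, polarity = best
        pred = polarity * np.where(X[:, j] <= thr, 1, -1)
        w = w * np.exp(-alpha * y * pred)        # upweight the misclassified examples
        w = w / w.sum()
        ensemble.append((j, thr, polarity, alpha))
    return ensemble

def predict(ensemble, X):
    score = np.zeros(len(X))
    for j, thr, polarity, alpha in ensemble:
        score += alpha * polarity * np.where(X[:, j] <= thr, 1, -1)
    return np.where(score >= 0, 1, -1)           # weighted majority vote

# Toy usage on a synthetic two-feature problem.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = np.where(X[:, 0] + X[:, 1] > 0, 1, -1)
model = train_adaboost(X, y)
print("training accuracy:", (predict(model, X) == y).mean())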

This book, written by the inventors of the method, brings together, organizes, simplifies, and substantially extends two decades of research on boosting, presenting both theory and applications in a way that is accessible to readers from diverse backgrounds while also providing an authoritative reference for advanced researchers. With its introductory treatment of all material and its inclusion of exercises in every chapter, the book is appropriate for course use as well.

The book begins with a general introduction to machine learning algorithms and their analysis; then explores the core theory of boosting, especially its ability to generalize; examines some of the myriad other theoretical viewpoints that help to explain and understand boosting; provides practical extensions of boosting for more complex learning problems; and finally presents a number of advanced theoretical topics. Numerous applications and practical illustrations are offered throughout.

Reviews

This excellent book is a mind-stretcher that should be read and reread, even by nonspecialists.

Computing Reviews

Boosting is, quite simply, one of the best-written books I've read on machine learning...

The Bactra Review

For those who wish to work in the area, it is a clear and insightful view of the subject that deserves a place in the canon of machine learning and on the shelves of those who study it.

Giles Hooker, Journal of the American Statistical Association

Authors

Robert E. Schapire is Principal Researcher at Microsoft Research in New York City. For their work on boosting, Freund and Schapire received both the Gödel Prize in 2003 and the Kanellakis Theory and Practice Award in 2004.

Yoav Freund is Professor of Computer Science at the University of California, San Diego. For their work on boosting, Freund and Schapire received both the Gödel Prize in 2003 and the Kanellakis Theory and Practice Award in 2004.