A Probabilistic Theory of Pattern Recognition

Springer Science & Business Media, February 20, 1997 - 638 pages
Pattern recognition presents one of the most significant challenges for scientists and engineers, and many different approaches have been proposed. The aim of this book is to provide a self-contained account of probabilistic analysis of these approaches. The book includes a discussion of distance measures, nonparametric methods based on kernels or nearest neighbors, Vapnik-Chervonenkis theory, epsilon entropy, parametric classification, error estimation, tree classifiers, and neural networks. Wherever possible, distribution-free properties and inequalities are derived. A substantial portion of the results or the analysis is new. Over 430 problems and exercises complement the material.
 

Contents

Introduction   1
The Bayes Error   9
Inequalities and Alternate Distance Measures   21
Linear Discrimination   39
Nearest Neighbor Rules   61
Consistency   91
Slow Rates of Convergence   111
Error Estimation   121
The Regular Histogram Rule   133
Kernel Rules   147
Consistency of the k-Nearest Neighbor Rule   169
Vapnik-Chervonenkis Theory   187
Combinatorial Aspects of Vapnik-Chervonenkis Theory   215
Lower Bounds for Empirical Classifier Selection   233
The Maximum Likelihood Principle   249
Parametric Classification   263
Generalized Linear Discrimination   279
Complexity Regularization   289
Condensed and Edited Nearest Neighbor Rules   303
Tree Classifiers   315
Data-Dependent Partitioning   363
Splitting the Data   387
The Resubstitution Estimate   397
Deleted Estimates of the Error Probability   407
Automatic Kernel Rules   423
Automatic Nearest Neighbor Rules   451
Hypercubes and Discrete Spaces   461
Epsilon Entropy and Totally Bounded Sets   479
Uniform Laws of Large Numbers   489
Neural Networks   507
Other Error Estimates   549
Feature Extraction   561
Appendix   575
Notation   591
Author Index   619
Subject Index   627
Copyright
