Estimation of Dependences Based on Empirical Data
Springer Science & Business Media, 28 Sep 2006 - 505 pages

Twenty-five years have passed since the publication of the Russian version of the book Estimation of Dependencies Based on Empirical Data (EDBED for short). Twenty-five years is a long period of time. During these years many things have happened. Looking back, one can see how rapidly life and technology have changed, and how slow and difficult it is to change the theoretical foundation of the technology and its philosophy. I pursued two goals writing this Afterword: to update the technical results presented in EDBED (the easy goal) and to describe a general picture of how the new ideas developed over these years (a much more difficult goal). The picture which I would like to present is a very personal (and therefore very biased) account of the development of one particular branch of science, Empirical Inference Science. Such accounts usually are not included in the content of technical publications. I have followed this rule in all of my previous books. But this time I would like to violate it for the following reasons. First of all, for me EDBED is the important milestone in the development of empirical inference theory and I would like to explain why. Second, during these years, there were a lot of discussions between supporters of the new paradigm (now it is called the VC theory) and the old one (classical statistics).
From inside the book
Page 30
... probability of deviation of a random variable t from its expected value Mt is bounded by P{|t − Mt| ≥ σx} ≤ 1/x², where σ² is the variance of the variable t. Consider now the random variable (1/ℓ) Σ tᵢ, where t₁, ..., t_ℓ is a random ...
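The snippet above states Chebyshev's inequality: the frequency of deviations of t from its mean Mt by more than σx can never exceed 1/x². A minimal sketch checking this empirically (the function name, sample distribution, and choice of x are mine, not the book's):

```python
import random

def chebyshev_check(sample, x):
    """Observed frequency of |t - Mt| >= sigma*x versus the Chebyshev bound 1/x^2."""
    n = len(sample)
    mean = sum(sample) / n
    var = sum((t - mean) ** 2 for t in sample) / n
    sigma = var ** 0.5
    freq = sum(1 for t in sample if abs(t - mean) >= sigma * x) / n
    return freq, 1.0 / x ** 2

random.seed(0)
sample = [random.uniform(0.0, 1.0) for _ in range(10_000)]
freq, bound = chebyshev_check(sample, 2.0)
assert freq <= bound  # observed frequency never exceeds 1/x^2 = 0.25
```

The bound is distribution-free, which is exactly why the book can use it without the prior information discussed in the next snippet.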
Page 34
... probability density of the random variable t = y − F(x, α*) (for example, the Gaussian law or Laplacian law). The necessity of providing this prior information is a much stronger requirement than the provision of a bound on the ...
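The choice of noise law in the snippet above matters because it fixes the loss: under Gaussian noise the maximum-likelihood estimate of a location parameter minimizes squared residuals (the sample mean), while under Laplacian noise it minimizes absolute residuals (the sample median). A toy sketch with data values of my own choosing:

```python
import statistics

# ML estimates of a location parameter under two assumed noise densities:
#   Gaussian noise  -> minimize squared residuals  -> sample mean
#   Laplacian noise -> minimize absolute residuals -> sample median
data = [1.0, 1.2, 0.9, 1.1, 8.0]  # one gross outlier

gauss_ml = statistics.mean(data)      # pulled toward the outlier
laplace_ml = statistics.median(data)  # robust to it
```

The contrast between the two estimates on contaminated data is the theme of the book's chapter on robust methods of estimating location parameters.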
Page 36
... Probability Density. Problems which are solved in probability theory on the one hand and mathematical statistics on the other are interrelated as direct and inverse. Problems in probability theory can be described by the following ...
Page 37
... probability density. We seek an approximate solution of this equation in those situations when instead of a cumulative distribution function F(z) an ... probability density. §5 The Problem of Estimating the Probability Density, 377.
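When only an empirical distribution function is available in place of F(z), one practical approximate solution is to smooth its jumps with a kernel; the contents below list this as density estimation using Parzen's method. A minimal sketch (Gaussian kernel, window width h, and sample sizes are illustrative choices of mine):

```python
import math
import random

def parzen_density(sample, z, h):
    """Parzen-window estimate of the density at z: a Gaussian kernel of
    width h smooths the jumps of the empirical distribution function."""
    n = len(sample)
    k = sum(math.exp(-0.5 * ((z - zi) / h) ** 2) for zi in sample)
    return k / (n * h * math.sqrt(2.0 * math.pi))

random.seed(1)
sample = [random.gauss(0.0, 1.0) for _ in range(5_000)]
estimate = parzen_density(sample, 0.0, h=0.3)
true_value = 1.0 / math.sqrt(2.0 * math.pi)  # N(0,1) density at 0, about 0.399
```

The estimate approaches the true density as the sample grows and h shrinks appropriately; the book treats the choice of h within its statistical theory of regularization.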
Page 39
... probability density is more general than the interpretation of results of indirect experiments. Hence it would seem unreasonable to solve the problem of minimizing the expected risk on the basis of empirical data by means of estimating ...
Contents
Methods of Parametric Statistics for | 45 |
Evaluation of Qualities of Algorithms for Density Estimation | 51 |
Unbiased Estimators | 63 |
The Problem of Estimating the Parameters of a Density | 70 |
Estimation of Parameters of the Probability Density Using | 76 |
A Remark on the Statement of the Problem of Interpreting | 83 |
On Robust Methods of Estimating Location Parameters | 91 |
Robustness of Gaussian and Laplace Distributions | 99 |
Robust Methods for Regression Estimation | 105 |
A Theorem on Estimating the Mean Vector of a Multivariate Normal Distribution | 120 |
The Gauss-Markov Theorem | 125 |
Best Linear Estimators | 127 |
Criteria for the Quality of Estimators | 128 |
Evaluation of the Best Linear Estimators | 130 |
Utilizing Prior Information | 134 |
A Method of Minimizing Empirical Risk for the Problem of Pattern Recognition | 139 |
Uniform Convergence of Frequencies of Events to Their Probabilities | 141 |
A Particular Case | 142 |
A Deterministic Statement of the Problem | 144 |
Upper Bounds on Error Probabilities | 146 |
An ε-net of a Set | 149 |
Necessary and Sufficient Conditions for Uniform Convergence of Frequencies to Probabilities | 152 |
Properties of Growth Functions | 154 |
Bounds on Deviations of Empirically Optimal Decision Rules | 155 |
Remarks on the Bound on the Rate of Uniform Convergence of Frequencies to Probabilities | 158 |
Remark on the General Theory of Uniform Estimating of Probabilities | 159 |
Sufficient Conditions | 162 |
A2 The Growth Function | 163 |
A3 The Basic Lemma | 168 |
A4 Derivation of Sufficient Conditions | 170 |
A5 A Bound on the Quantity г | 173 |
A6 A Bound on the Probability of Uniform Relative Deviation | 176 |
A Method of Minimizing Empirical Risk for the Problem of Regression Estimation | 181 |
A Particular Case | 183 |
A Generalization to a Class with Infinitely Many Members | 186 |
The Capacity of a Set of Arbitrary Functions | 188 |
Uniform Boundedness of a Ratio of Moments | 191 |
Two Theorems on Uniform Convergence | 192 |
Theorem on Uniform Relative Deviation | 195 |
Remarks on a General Theory of Risk Estimation | 202 |
Appendix to Chapter 7 Theory of Uniform Convergence | 206 |
A3 ε-extension of a Set | 214 |
A7 Corollaries | 228 |
Solution of Ill-posed Problems Interpretation | 267 |
Proofs of the Theorems | 275 |
Methods of Polynomial and Piecewise Polynomial Approximations | 285 |
The Problem of Probability Density Estimation | 292 |
Density Estimation Using Parzen's Method | 301 |
Appendix to Chapter 9 Statistical Theory of Regularization | 308 |
Estimation of Functional Values at Given Points | 312 |
Appendix to Chapter 10 Taxonomy Problems | 347 |
Algorithms for Estimating Non-indicator | 370 |
Bibliographical Remarks | 384 |
Bibliography | 391 |
Index | 397 |