Title | Statistical Learning Theory and Stochastic Optimization |
Subtitle | Ecole d'Eté de Proba |
Editors | Olivier Catoni, Jean Picard |
Video | http://file.papertrans.cn/877/876458/876458.mp4 |
Overview | Includes supplementary material: |
Series | Lecture Notes in Mathematics |
Cover |  |
Description | Statistical learning theory is aimed at analyzing complex data with necessarily approximate models. This book is intended for an audience with a graduate background in probability theory and statistics. It will be useful to any reader wondering why it may be a good idea to use, as is often done in practice, a notoriously "wrong" (i.e., over-simplified) model to predict, estimate, or classify. This point of view takes its roots in three fields: information theory, statistical mechanics, and PAC-Bayesian theorems. Results on the large deviations of trajectories of Markov chains with rare transitions are also included. They are meant to provide a better understanding of stochastic optimization algorithms in common use for computing estimators. The author focuses on non-asymptotic bounds of the statistical risk, allowing one to choose adaptively between rich and structured families of models and corresponding estimators. Two mathematical objects pervade the book: entropy and Gibbs measures. The goal is to show how to turn them into versatile and efficient technical tools that will stimulate further studies and results. |
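The two objects the description singles out, Gibbs measures and entropy, can be sketched in a standard PAC-Bayesian form (the notation below is illustrative and not necessarily the exact statement used in the book): a prior $\pi$ over parameters $\theta$ is reweighted by the empirical risk $r_n$ into a Gibbs posterior, and the Kullback-Leibler entropy term controls the gap between empirical and true risk.

```latex
% Gibbs (posterior) measure: prior \pi reweighted by empirical risk r_n
% at inverse temperature \beta
\rho_\beta(d\theta)
  = \frac{\exp\{-\beta\, r_n(\theta)\}\,\pi(d\theta)}
         {\int \exp\{-\beta\, r_n(\theta')\}\,\pi(d\theta')}

% A standard PAC-Bayesian risk bound (McAllester's form, loss in [0,1]):
% with probability at least 1-\delta, simultaneously for all posteriors \rho,
R(\rho) \;\le\; r_n(\rho)
  + \sqrt{\frac{\mathrm{KL}(\rho\,\|\,\pi) + \ln\!\big(2\sqrt{n}/\delta\big)}{2n}}
```

Here $R(\rho)$ and $r_n(\rho)$ denote the $\rho$-averaged true and empirical risks; the bound is non-asymptotic in $n$, which is the flavor of result the book develops.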
Publication date | Book 2004 |
Keywords | Estimator; Measure; Probability theory; algorithms; complexity; information theory; learning; learning theo |
Edition | 1 |
DOI | https://doi.org/10.1007/b99352 |
ISBN (softcover) | 978-3-540-22572-0 |
ISBN (eBook) | 978-3-540-44507-4 |
Series E-ISSN | 1617-9692 |
Series ISSN | 0075-8434 |
Copyright | Springer-Verlag GmbH Germany, part of Springer Nature 2004 |