Shinto Eguchi, Osamu Komori
Minimum Divergence Methods in Statistical Machine Learning: From an Information Geometric Viewpoint
- Condition: Brand new
More about Minimum Divergence Methods in Statistical Machine Learning: From an Information Geometric Viewpoint
This book explores minimum divergence methods in statistical machine learning, using information geometry to elucidate their properties. It discusses Gauss's least squares estimator, Fisher's maximum likelihood estimator, the minimum divergence estimator, and the maximum entropy model. It also introduces the U-divergence, a class of information divergences generated by an increasing, convex function, and shows how it can be used to build robust statistical procedures and boosting algorithms.
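For orientation, one standard way to write this class of divergences is sketched below; the notation follows the general U-divergence literature and may differ in detail from the book's own presentation.

```latex
% U-divergence generated by an increasing, convex function U.
% Write u = U' and xi = u^{-1}, and let p, q be densities
% (or nonnegative functions) with respect to a measure Lambda:
\[
  D_U(p, q) \;=\; \int \Bigl\{\, U\bigl(\xi(q)\bigr) - U\bigl(\xi(p)\bigr)
      - p\,\bigl(\xi(q) - \xi(p)\bigr) \Bigr\}\, \mathrm{d}\Lambda .
\]
% The choice U(t) = e^t (so xi = log) recovers the extended
% Kullback--Leibler divergence:
%   \int p \log(p/q)\, d\Lambda + \int (q - p)\, d\Lambda.
```

Since u(ξ(p)) = p, the integrand is the pointwise Bregman gap of U, hence nonnegative by convexity, with equality exactly when p = q (for strictly convex U); this is what qualifies D_U as a divergence.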
Format: Hardback
Length: 221 pages
Publication date: 16 March 2022
Publisher: Springer Verlag, Japan
This book delves into statistical machine learning, exploring minimum divergence methods for estimation, regression, prediction, and more. By employing information geometry, it sheds light on the intrinsic properties of the corresponding loss functions, learning algorithms, and statistical models. A foundational example is Gauss's least squares estimator in linear regression, obtained by minimizing the sum of squared differences between the response vector and a vector in the linear subspace spanned by the explanatory vectors. This idea extends to Fisher's maximum likelihood estimator (MLE) for exponential models, where the estimator minimizes the Kullback-Leibler (KL) divergence between the data distribution and a parametric distribution of the exponential model, with the data distribution replaced by its empirical analogue. The minimization admits a geometric interpretation in which a right triangle, satisfying a Pythagorean identity measured by the KL divergence, is preserved.
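To make the Pythagorean picture concrete, here is the identity in the form it takes for the KL divergence; this is the standard statement from information geometry, included for orientation rather than quoted from the book.

```latex
% Let D(p || q) = \int p \log(p/q), and let q be the projection of the
% data distribution p onto an exponential model M, i.e. the minimizer
% of D(p || .) over M (empirically, the MLE).  Then for every r in M
% the divergence decomposes additively, like squared side lengths in
% a right triangle:
\[
  D(p \,\|\, r) \;=\; D(p \,\|\, q) \;+\; D(q \,\|\, r).
\]
```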
We further extend this dualistic structure to the minimum divergence estimator and the maximum entropy model, which find applications in robust statistics, maximum entropy, density estimation, principal component analysis, independent component analysis, regression analysis, manifold learning, boosting algorithms, clustering, dynamic treatment regimes, and various other fields. A range of information divergence measures, including the KL divergence, is examined to quantify the deviation between probability distributions; a robustness sketch follows below.
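As an illustration of the robustness theme, the sketch below contrasts the MLE with a minimum divergence estimator built from the density power (beta-)divergence, one well-known member of the power-divergence family; the code and its function names are illustrative assumptions, not taken from the book.

```python
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import norm

# Density power divergence (beta-divergence) loss for a Gaussian
# location model N(mu, 1).  As beta -> 0 this recovers the negative
# log-likelihood; beta > 0 downweights observations with low model
# density, which is the source of the robustness.
def dpd_loss(mu, data, beta):
    f = norm.pdf(data, loc=mu, scale=1.0)
    # Integral term: \int f_mu(t)^{1+beta} dt has a closed form for
    # the unit-variance Gaussian.
    integral = (2 * np.pi) ** (-beta / 2) / np.sqrt(1 + beta)
    return integral - (1 + 1 / beta) * np.mean(f ** beta)

rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0.0, 1.0, 95),    # bulk of the data
                    rng.normal(10.0, 1.0, 5)])   # 5% gross outliers

mle = x.mean()  # maximum likelihood estimate of mu
robust = minimize_scalar(dpd_loss, bounds=(-5, 5), args=(x, 0.5),
                         method="bounded").x

print(f"MLE (pulled by outliers):          {mle:.3f}")
print(f"Minimum power-divergence estimate: {robust:.3f}")
```

With 5% of the sample shifted to 10, the MLE drifts to roughly 0.5, while the power-divergence estimate stays near the true centre 0, because observations with negligible model density contribute almost nothing to the beta-weighted term.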
By leveraging information geometry, this book provides a comprehensive framework for understanding and analyzing statistical machine learning methods, offering insights into their mathematical foundations and practical applications.
Weight: 518g
Dimensions: 235 x 155 mm
ISBN-13: 9784431569206
Edition number: 1st ed. 2022
UK and International shipping information
UK Delivery and returns information:
- Delivery within 2 - 3 days when ordering in the UK.
- Shipping fee for UK customers from £2.39. Fully tracked shipping service available.
- Returns policy: Return within 30 days of receipt for full refund.
International deliveries:
Shulph Ink now ships to Australia, Belgium, Canada, France, Germany, Ireland, Italy, India, Luxembourg, Saudi Arabia, Singapore, Spain, the Netherlands, New Zealand, the United Arab Emirates, and the United States of America.
- Delivery times: within 5 - 10 days for international orders.
- Shipping fee: charges vary for overseas orders. Only tracked services are available for most international orders. Some countries have untracked shipping options.
- Customs charges: orders delivered to addresses outside the United Kingdom may incur additional customs and duty fees during local delivery.
