Minimum Divergence Methods in Statistical Machine Learning - Komori, Osamu
Presentation: Minimum Divergence Methods in Statistical Machine Learning by Osamu Komori, hardcover (computer science).

Biography:
Shinto Eguchi received his master's degree from Osaka University in 1979 and a Ph.D. from Hiroshima University, Japan, in 1984. His career began as Assistant Professor at Hiroshima University (1984), followed by Associate Professor at Shimane University (1986) and Professor at The Institute of Statistical Mathematics (1995-2020). He is currently Emeritus Professor at The Institute of Statistical Mathematics and the Graduate University for Advanced Studies. His research interests center on statistics, including statistical machine learning, bioinformatics, information geometry, statistical ecology, parametric/semiparametric inference, and robust statistics. Recent publications include:
- A generalized quasi-linear mixed-effects model. Y. Saigusa, S. Eguchi, O. Komori, Statistical Methods in Medical Research, 31 (7), 1280-1291, 2022.
- Robust self-tuning semiparametric PCA for contaminated elliptical distribution. H. Hung, S.Y. Huang, S. Eguchi, IEEE Transactions on Signal Processing, 70, 5885-5897, 2022.
- Minimum information divergence of Q-functions for dynamic treatment regimes. S. Eguchi, Information Geometry, 1-21, 2022. ...
Summary: The cross entropy is associated with a statistical estimation method via minimization of its empirical analogue on given data. Thus any statistical divergence carries an intrinsic pairing between a generative model and an estimation method. Typically, the KL divergence leads to the exponential model and maximum likelihood estimation. It is shown that any information divergence induces a Riemannian metric and a pair of dual linear connections in the framework of information geometry. We focus on a class of information divergences generated by an increasing and convex function U, called the U-divergence. It is shown that any generator function U yields a U-entropy and a U-divergence, between which there is a dualistic structure linking the minimum U-divergence method and the maximum U-entropy model. We observe that a specific choice of U leads to a robust statistical procedure via the minimum U-divergence method. If U is chosen to be the exponential function, the corresponding U-entropy and U-divergence reduce to the Boltzmann-Shannon entropy and the KL divergence...
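As a rough illustration of the robustness point above, the following sketch contrasts maximum likelihood (the minimum KL divergence method) with a minimum density power divergence estimator, a well-known member of the U-divergence family obtained from a power-type generator U. This is not code from the book; the data, the choice of a normal location model with unit variance, the value beta = 0.5, and the grid search are all assumptions made for the example.

```python
import math

def normal_pdf(x, mu, sigma=1.0):
    # Density of N(mu, sigma^2) at x.
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def dpd_objective(theta, data, beta=0.5):
    # Empirical density power divergence objective for N(theta, 1):
    #   ∫ f_theta^{1+beta} dx - (1+beta)/beta * (1/n) Σ f_theta(x_i)^beta
    # For the unit-variance normal, the integral term is a constant in theta:
    #   (2*pi)^{-beta/2} * (1+beta)^{-1/2}
    integral = (2 * math.pi) ** (-beta / 2) / math.sqrt(1 + beta)
    emp = sum(normal_pdf(x, theta) ** beta for x in data) / len(data)
    return integral - (1 + beta) / beta * emp

# Eight inliers near 0 plus two gross outliers (hypothetical toy data).
data = [-0.3, 0.1, 0.2, -0.1, 0.05, 0.4, -0.2, 0.15, 10.0, 9.5]

# Maximum likelihood (minimum KL divergence) for the normal mean: sample average,
# which is pulled toward the outliers.
mle = sum(data) / len(data)

# Minimum density power divergence estimate via a simple grid search,
# which stays near the bulk of the data.
grid = [i / 100 for i in range(-200, 1100)]
robust = min(grid, key=lambda t: dpd_objective(t, data))

print(f"MLE (sample mean): {mle:.2f}")
print(f"Minimum DPD estimate (beta=0.5): {robust:.2f}")
```

The contrast is the dualistic trade-off mentioned in the summary: the exponential choice of U recovers likelihood, while a power-type choice downweights observations with small model density, yielding robustness against the two outliers.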