Minimum Divergence Methods in Statistical Machine Learning - Komori, Osamu



      Presentation: Minimum Divergence Methods In Statistical Machine Learning by Komori, Osamu, Hardcover - Computer Science Book

      Computer Science Book - Komori, Osamu - 01/03/2022 - Hardcover - Language: English


    • Author(s): Komori, Osamu - Eguchi, Shinto
    • Publisher: Springer Japan KK
    • Language: English
    • Publication date: 01/03/2022
    • Format: Medium, 350 g to 1 kg
    • Number of pages: 232
    • Shipping: 518
    • Dimensions: 24.1 x 16.0 x 1.9
    • ISBN: 9784431569206



    • Summary:

      This book explores minimum divergence methods of statistical machine learning for estimation, regression, prediction, and so forth, in which we engage in information geometry to elucidate their intrinsic properties of the corresponding loss functions, learning algorithms, and statistical models. One of the most elementary examples is Gauss's least squares estimator in a linear regression model, in which the estimator is given by minimization of the sum of squares between a response vector and a vector of the linear subspace hulled by explanatory vectors. This is extended to Fisher's maximum likelihood estimator (MLE) for an exponential model, in which the estimator is provided by minimization of the Kullback-Leibler (KL) divergence between a data distribution and a parametric distribution of the exponential model in an empirical analogue. Thus, we envisage a geometric interpretation of such minimization procedures such that a right triangle is kept with Pythagorean identity in the sense of the KL divergence. This understanding sublimates a dualistic interplay between a statistical estimation and model, which requires dual geodesic paths, called m-geodesic and e-geodesic paths, in a framework of information geometry. We extend such a dualistic structure of the MLE and exponential model to that of the minimum divergence estimator and the maximum entropy model, which is applied to robust statistics, maximum entropy, density estimation, principal component analysis, independent component analysis, regression analysis, manifold learning, boosting algorithm, clustering, dynamic treatment regimes, and so forth. We consider a variety of information divergence measures typically including KL divergence to express departure from one probability distribution to another. An information divergence is decomposed into the cross-entropy and the (diagonal) entropy in which the entropy associates with a generative model as a family of maximum entropy distributions...
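The connection the summary draws between maximum likelihood and KL-divergence minimization can be checked numerically. The following is a minimal illustrative sketch (not code from the book), assuming the simplest exponential model, a Bernoulli distribution: minimizing the KL divergence from the empirical distribution to the model recovers the maximum likelihood estimate, which for a Bernoulli parameter is the sample mean.

```python
import math

# Illustrative example (hypothetical data, not from the book): for a
# Bernoulli model, the minimizer of KL(empirical || model) coincides
# with the maximum likelihood estimate.

data = [1, 0, 1, 1, 0, 1, 1, 0, 1, 1]  # example coin flips
p_hat = sum(data) / len(data)           # empirical frequency of 1s

def kl_bernoulli(p, q):
    """KL divergence between Bernoulli(p) and Bernoulli(q)."""
    return p * math.log(p / q) + (1 - p) * math.log((1 - p) / (1 - q))

# Grid search over the model parameter for the KL minimizer.
thetas = [i / 1000 for i in range(1, 1000)]
theta_kl = min(thetas, key=lambda t: kl_bernoulli(p_hat, t))

# The MLE for a Bernoulli parameter is the sample mean.
theta_mle = p_hat

print(theta_kl, theta_mle)  # the two estimates agree (up to grid resolution)
```

The same duality is what the book generalizes: replacing KL with other divergences yields other estimators while preserving the geometric picture.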

      Biography:
      Shinto Eguchi received his master's degree from Osaka University in 1979 and a Ph.D. from Hiroshima University, Japan, in 1984. His career began as Assistant Professor at Hiroshima University (1984), then Associate Professor at Shimane University (1986), and Professor at The Institute of Statistical Mathematics (1995-2020). He is currently Emeritus Professor at the Institute of Statistical Mathematics and the Graduate University for Advanced Studies. His research interests are primarily in statistics, including statistical machine learning, bioinformatics, information geometry, statistical ecology, parametric/semiparametric inference, and robust statistics. Recent publications include:
    • A generalized quasi-linear mixed-effects model. Y. Saigusa, S. Eguchi, O. Komori. Statistical Methods in Medical Research, 31(7), 1280-1291, 2022.
    • Robust self-tuning semiparametric PCA for contaminated elliptical distribution. H. Hung, S.Y. Huang, S. Eguchi. IEEE Transactions on Signal Processing, 70, 5885-5897, 2022.
    • Minimum information divergence of Q-functions for dynamic treatment regimes. S. Eguchi. Information Geometry, 1-21, 2022.

      Contents:
      the cross entropy associates with a statistical estimation method via minimization of the empirical analogue based on given data. Thus any statistical divergence includes an intrinsic object between the generative model and the estimation method. Typically, KL divergence leads to the exponential model and the maximum likelihood estimation. It is shown that any information divergence leads to a Riemannian metric and a pair of the linear connections in the framework of information geometry. We focus on a class of information divergence generated by an increasing and convex function U, called U-divergence. It is shown that any generator function U generates the U-entropy and U-divergence, in which there is a dualistic structure between the U-divergence method and the maximum U-entropy model. We observe that a specific choice of U leads to a robust statistical procedure via the minimum U-divergence method. If U is selected as an exponential function, then the corresponding U-entropy and U-divergence are reduced to the Boltzmann-Shannon entropy and the KL divergence...
