Convex Optimization with Computational Errors - Alexander J. Zaslavski

        About Convex Optimization with Computational Errors by Alexander J. Zaslavski, Hardcover - Book

        Book - Alexander J. Zaslavski - 01/02/2020 - Hardcover - Language: English

      • Author(s): Alexander J. Zaslavski
      • Publisher: Springer International Publishing Ag
      • Language: English
      • Publication date: 01/02/2020
      • Format: Medium, 350 g to 1 kg
      • Number of pages: 372
      • Dimensions: 24.1 x 16.0 x 2.6 cm
      • ISBN: 9783030378219



      • Summary:
        The book is devoted to the study of approximate solutions of optimization problems in the presence of computational errors. It contains a number of results on the convergence behavior of algorithms in a Hilbert space, which are known as important tools for solving optimization problems. The research presented in the book is the continuation and further development of the author's book Numerical Optimization with Computational Errors (Springer, 2016). Both books study algorithms taking into account the computational errors which are always present in practice. The main goal is, for a known computational error, to find out what approximate solution can be obtained and how many iterations one needs for this.

        The main difference between this new book and the 2016 book is that the present book takes into consideration the fact that, for every algorithm, each iteration consists of several steps and that the computational errors for different steps are, in general, different. This fact, which was not taken into account in the previous book, is indeed important in practice. For example, the subgradient projection algorithm consists of two steps. The first step is the calculation of a subgradient of the objective function, while in the second we calculate a projection onto the feasible set. In each of these two steps there is a computational error, and these two computational errors are different in general.
        It may happen that the feasible set is simple and the objective function is complicated. As a result, the computational error, made when one calculates the projection, is essentially smaller than the computational error of the calculation of the subgradient. Clearly, an opposite case is possible too. Another feature of this book is a study of a number of important algorithms which appeared recently in the literature and which are not discussed in the previous book.
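        The two-step structure with per-step errors described above can be sketched in code. This is a minimal illustration, not code from the book: the oracle names (`subgradient`, `project`), the step size, and the error model (a random perturbation of norm at most a given bound in each step) are all assumptions made here for concreteness.

```python
import numpy as np

def subgradient_projection_step(x, subgradient, project, step_size,
                                eps_grad=0.0, eps_proj=0.0, rng=None):
    """One iteration of the subgradient projection algorithm with a
    separate (simulated) computational error in each of its two steps."""
    rng = rng if rng is not None else np.random.default_rng(0)
    # Step 1: evaluate a subgradient, perturbed by an error of norm <= eps_grad.
    g = subgradient(x)
    if eps_grad > 0.0:
        e = rng.standard_normal(x.shape)
        g = g + eps_grad * e / np.linalg.norm(e)
    # Step 2: project the gradient step onto the feasible set,
    # perturbed by an error of norm <= eps_proj.
    y = project(x - step_size * g)
    if eps_proj > 0.0:
        e = rng.standard_normal(x.shape)
        y = y + eps_proj * e / np.linalg.norm(e)
    return y

# Example: minimize ||x||_1 over the unit Euclidean ball (minimizer: 0),
# with a larger error in the subgradient step than in the projection step.
subgrad = np.sign                                   # a subgradient of the l1 norm
proj = lambda z: z / max(1.0, np.linalg.norm(z))    # projection onto the unit ball
x = np.array([2.0, -1.5])
for _ in range(50):
    x = subgradient_projection_step(x, subgrad, proj, step_size=0.1,
                                    eps_grad=0.01, eps_proj=0.001)
```

        With a fixed step size and bounded per-step errors, the iterates approach a neighborhood of the minimizer whose radius depends on both error bounds; this is the kind of quantitative statement the book establishes.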
        This monograph contains 12 chapters. Chapter 1 is an introduction. In Chapter 2 we study the subgradient projection algorithm for minimization of convex and nonsmooth functions. We generalize the results of [NOCE] and establish results which have no prototype in [NOCE]. In Chapter 3 we analyze the mirror descent algorithm for minimization of convex and nonsmooth functions under the presence of computational errors. For this algorithm each iteration consists of two steps. The first step is the calculation of a subgradient of the objective function, while in the second we solve an auxiliary minimization problem on the set of feasible points. In each of these two steps there is a computational error. We generalize the results of [NOCE] and establish results which have no prototype in [NOCE]. In Chapter 4 we analyze the projected gradient algorithm with a smooth objective function under the presence of computational errors. In Chapter 5 we consider an algorithm which is an extension of the projected gradient algorithm, used for solving linear inverse problems arising in signal/image processing. In Chapter 6 we study the continuous subgradient method and the continuous subgradient projection algorithm for minimization of convex nonsmooth functions and for computing the saddle points of convex-concave functions, under the presence of computational errors. All the results of this chapter have no prototype in [NOCE]. In Chapters 7-12 we analyze several algorithms, not considered in [NOCE], under the presence of computational errors. Again, each step of an iteration has a computational error, and we take into account that these errors are, in general, different. An optimization problem with a composite objective function is studied in Chapter 7. A zero-sum game with two players is considered in Chap...
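        The auxiliary minimization in a mirror descent iteration (Chapter 3's setting) has a closed form for the entropy mirror map on the probability simplex. A minimal sketch, not taken from the book, with exact oracles and the per-step error terms omitted:

```python
import numpy as np

def mirror_descent_step_simplex(x, g, step_size):
    """One entropy mirror descent step on the probability simplex.

    The auxiliary problem  min_y <g, y> + (1/step_size) * KL(y, x)
    over the simplex has the closed-form exponentiated-gradient
    solution computed below."""
    w = x * np.exp(-step_size * g)
    return w / w.sum()

# One step from the uniform distribution; the subgradient penalizes
# coordinate 0, so mass moves toward the other coordinates.
x = np.full(3, 1.0 / 3.0)
g = np.array([1.0, 0.0, 0.0])
x_new = mirror_descent_step_simplex(x, g, step_size=1.0)
```

        In the book's setting both the subgradient g and the solution of the auxiliary problem would carry their own computational errors, and the two error bounds enter the convergence estimates separately.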

        Biography:
        Alexander J. Zaslavski is a professor in the Department of Mathematics, Technion-Israel Institute of Technology, Haifa, Israel. He has authored numerous books with Springer, the most recent of which include Turnpike Theory for the Robinson-Solow-Srinivasan Model (978-3-030-60306-9), The Projected Subgradient Algorithm in Convex Optimization (978-3-030-60299-4), Convex Optimization with Computational Errors (978-3-030-37821-9), Turnpike Conditions in Infinite Dimensional Optimal Control (978-3-030-20177-7), Optimization on Solution Sets of Common Fixed Point Problems (978-3-030-78848-3)...

        Contents:
        Preface.- 1. Introduction.- 2. Subgradient Projection Algorithm.- 3. The Mirror Descent Algorithm.- 4. Gradient Algorithm with a Smooth Objective Function.- 5. An Extension of the Gradient Algorithm.- 6. Continuous Subgradient Method.- 7. An Optimization Problem with a Composite Objective Function.- 8. A Zero-Sum Game with Two Players.- 9. PDA-Based Method for Convex Optimization.- 10. Minimization of Quasiconvex Functions.- 11. Minimization of Sharp Weakly Convex Functions.- 12. A Projected Subgradient Method for Nonsmooth Problems.- References.- Index.
