
Approximate Bayesian Inference

Material type: Article
Language: English
Publication details: Basel: MDPI - Multidisciplinary Digital Publishing Institute, 2022
Description: 1 electronic resource (508 p.)
ISBN:
  • 978-3-0365-3790-0
  • 9783036537894
  • 9783036537900
Summary: Bayesian methods, extremely popular for statistical inference, are also becoming popular in machine learning and artificial intelligence problems. Bayesian estimators are often implemented by Monte Carlo methods, such as the Metropolis-Hastings algorithm or the Gibbs sampler. These algorithms target the exact posterior distribution. However, many modern statistical models are simply too complex for such methodologies. In machine learning, the volume of the data used in practice makes Monte Carlo methods too slow to be useful. On the other hand, these applications often do not require exact knowledge of the posterior. This has motivated the development of a new generation of algorithms that are fast enough to handle huge datasets but that often target an approximation of the posterior. This book gathers 18 research papers written by Approximate Bayesian Inference specialists and provides an overview of recent advances in these algorithms. These include optimization-based methods (such as variational approximations) and simulation-based methods (such as ABC or Monte Carlo algorithms). The theoretical aspects of Approximate Bayesian Inference are covered, in particular PAC-Bayes bounds and regret analysis. Applications to challenging computational problems in astrophysics, finance, medical data analysis, and computer vision are also presented.
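The summary mentions simulation-based approximate inference methods such as ABC. As a point of reference, below is a minimal sketch of ABC rejection sampling in Python (illustrative only, not taken from the book; the Gaussian model, the Normal(0, 10) prior, the sample-mean summary statistic, and the tolerance are assumptions made for this example).

    # Minimal ABC rejection sampler (illustrative sketch, not code from the book).
    # Toy problem: infer the mean mu of a Normal(mu, 1) model from observed data,
    # using a Normal(0, 10) prior and the sample mean as the summary statistic.
    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical observed data (an assumption made for this sketch).
    observed = rng.normal(loc=2.0, scale=1.0, size=100)
    obs_summary = observed.mean()

    def abc_rejection(n_samples, epsilon):
        """Draw approximate posterior samples of mu by rejection."""
        accepted = []
        while len(accepted) < n_samples:
            mu = rng.normal(0.0, 10.0)                           # draw mu from the prior
            simulated = rng.normal(mu, 1.0, size=observed.size)  # simulate data given mu
            # Keep mu when the simulated summary statistic is within epsilon of the
            # observed one; smaller epsilon gives a closer posterior approximation.
            if abs(simulated.mean() - obs_summary) < epsilon:
                accepted.append(mu)
        return np.array(accepted)

    posterior_draws = abc_rejection(n_samples=1000, epsilon=0.1)
    print("approximate posterior mean of mu:", round(posterior_draws.mean(), 3))

Tightening epsilon trades computation for accuracy of the approximation; the optimization-based and Monte Carlo alternatives surveyed in the book address the same trade-off by other means.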
Holdings
Item type: Electronic-Books
Home library: OPJGU Sonepat- Campus
Collection: E-Books
Status: Open Access; Available

Open Access: Unrestricted online access


Creative Commons Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/

English

