Markov Decision Processes in Practice

Author : Richard J. Boucherie
Publisher : Springer
Total Pages : 552
Release : 2017-03-10
ISBN-10 : 3319477668
ISBN-13 : 9783319477664

Book Synopsis: Markov Decision Processes in Practice, by Richard J. Boucherie

Download or read book Markov Decision Processes in Practice, written by Richard J. Boucherie and published by Springer. The book was released on 2017-03-10 with a total of 552 pages. Available in PDF, EPUB and Kindle. Book excerpt: This book presents classical Markov Decision Processes (MDP) for real-life applications and optimization. MDP allows users to develop and formally support approximate and simple decision rules, and this book showcases state-of-the-art applications in which MDP was key to the solution approach. The book is divided into six parts. Part 1 is devoted to the state-of-the-art theoretical foundation of MDP, including approximate methods such as policy improvement and successive approximation, infinite state spaces, and an instructive chapter on Approximate Dynamic Programming. The book then continues with five parts on specific, non-exhaustive application areas. Part 2 covers MDP healthcare applications, including screening procedures, appointment scheduling, ambulance scheduling and blood management. Part 3 explores MDP modeling within transportation, ranging from public to private transportation and from airports and traffic lights to car parking and charging your electric car. Part 4 contains three chapters that illustrate the structure of approximate policies for production and manufacturing systems. In Part 5, communications is highlighted as an important application area for MDP; it includes Gittins indices, down-to-earth call centers and wireless sensor networks. Finally, Part 6 is dedicated to financial modeling, offering an instructive review of how to account for financial portfolios and derivatives under proportional transaction costs. The MDP applications in this book illustrate a variety of both standard and non-standard aspects of MDP modeling and its practical use. The book should appeal to practitioners, researchers and educators with a background in, among other fields, operations research, mathematics, computer science and industrial engineering.
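
The approximate methods mentioned above, in particular successive approximation (value iteration), are simple to illustrate on a small example. The Python sketch below solves a hypothetical two-state machine-maintenance MDP by value iteration; the states, transition probabilities, rewards and discount factor are invented for illustration and are not taken from the book.

import numpy as np

# States: 0 = machine OK, 1 = machine broken.  Actions: 0 = continue, 1 = repair.
# P[a][s, s'] is the transition probability, R[a][s] the expected one-step reward.
# All numbers below are hypothetical and chosen only to keep the example small.
P = {0: np.array([[0.9, 0.1],    # continue: a working machine may break down
                  [0.0, 1.0]]),  # continue: a broken machine stays broken
     1: np.array([[1.0, 0.0],    # repair: a working machine stays working
                  [1.0, 0.0]])}  # repair: a broken machine is restored
R = {0: np.array([5.0, 0.0]),    # profit while running, nothing when broken
     1: np.array([3.0, -2.0])}   # repair costs reduce the reward
gamma = 0.95                     # discount factor

V = np.zeros(2)
for _ in range(1000):
    # Bellman optimality update: V(s) <- max_a [ R(a, s) + gamma * sum_s' P(a, s, s') V(s') ]
    Q = np.array([R[a] + gamma * P[a] @ V for a in (0, 1)])  # rows = actions, columns = states
    V_new = Q.max(axis=0)
    if np.max(np.abs(V_new - V)) < 1e-8:  # successive approximations have converged
        V = V_new
        break
    V = V_new

policy = Q.argmax(axis=0)  # greedy policy with respect to the (near-)converged values
print("Optimal values:", V)
print("Optimal policy (0 = continue, 1 = repair):", policy)

The stopping rule compares successive value estimates and ends once they differ by less than a small tolerance, which is the successive-approximation idea formalized in Part 1 of the book; policy improvement alternates evaluation of a fixed policy with such a greedy update.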


Markov Decision Processes in Practice Related Books

Markov Decision Processes in Practice
Language: en
Pages: 552
Authors: Richard J. Boucherie
Categories: Business & Economics
Type: BOOK - Published: 2017-03-10 - Publisher: Springer

This book presents classical Markov Decision Processes (MDP) for real-life applications and optimization. MDP allows users to develop and formally support approximate and simple decision rules, and this book showcases state-of-the-art applications in which MDP was key to the solution approach.

Handbook of Markov Decision Processes
Language: en
Pages: 560
Authors: Eugene A. Feinberg
Categories: Business & Economics
Type: BOOK - Published: 2012-12-06 - Publisher: Springer Science & Business Media

This volume, edited by Eugene A. Feinberg and Adam Shwartz, deals with the theory of Markov Decision Processes (MDPs) and their applications. Each chapter was written by a leading expert in the respective area.

Planning with Markov Decision Processes
Language: en
Pages: 194
Authors: Mausam Natarajan
Categories: Computers
Type: BOOK - Published: 2022-06-01 - Publisher: Springer Nature

Markov Decision Processes (MDPs) are widely popular in Artificial Intelligence for modeling sequential decision-making scenarios with probabilistic dynamics.

Markov Decision Processes with Applications to Finance
Language: en
Pages: 393
Authors: Nicole Bäuerle
Categories: Mathematics
Type: BOOK - Published: 2011-06-06 - Publisher: Springer Science & Business Media

The theory of Markov decision processes focuses on controlled Markov chains in discrete time. The authors establish the theory for general state and action spaces.

Markov Decision Processes with Their Applications
Language: en
Pages: 305
Authors: Qiying Hu
Categories: Business & Economics
Type: BOOK - Published: 2007-09-14 - Publisher: Springer Science & Business Media

Put together by two top researchers in the Far East, this text examines Markov Decision Processes - also called stochastic dynamic programming - and their applications.