Please use this identifier to cite or link to this item:
http://hdl.handle.net/123456789/2204
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Mondal, Rishov | - |
dc.date.accessioned | 2024-03-22T09:28:34Z | - |
dc.date.available | 2024-03-22T09:28:34Z | - |
dc.date.issued | 2023-05 | - |
dc.identifier.uri | http://hdl.handle.net/123456789/2204 | - |
dc.description | Embargo period | en_US |
dc.description.abstract | This thesis investigates the potential of Markov decision processes (MDP) as a tool for solving complex decision-making problems in real-life scenarios. The project delves into the application of MDP to stochastic games, specifically by analyzing an inventory duopoly with yield uncertainty, a problem drawn from operations research. The thesis also explores the role of MDP in analyzing the budget allocation problem in the Voter Model, a popular model in opinion dynamics. The study provides a comprehensive analysis of the effectiveness of MDP in solving real-life problems and highlights its benefits over other decision-making models. The project offers insights into how MDP can be effectively used to analyze and solve real-life problems and provides directions for future research in this area. | en_US |
dc.language.iso | en | en_US |
dc.publisher | IISER Mohali | en_US |
dc.subject | Markov Decision Process | en_US |
dc.subject | Operations Research | en_US |
dc.subject | Opinion Dynamics | en_US |
dc.title | Markov Decision Process and Its Applications | en_US |
dc.type | Thesis | en_US |
dc.guide | Sahasrabudhe, Neeraja | en_US |
Appears in Collections: | MP-2020 |
Files in This Item:
File | Description | Size | Format | |
---|---|---|---|---|
Need To Add…Full Text_PDF | | 15.36 kB | Unknown | View/Open |