Local Evaluation of Policies for Discounted Markov Decision Problems

Please always cite this URN: urn:nbn:de:0297-zib-11963
Providing realistic performance indicators for online algorithms for a given online optimization problem is, in general, a difficult task. Due to significant drawbacks of other concepts such as competitive analysis, Markov decision problems (MDPs) may yield an attractive alternative whenever reasonable stochastic information about future requests is available. However, the number of states in MDPs emerging from real applications is usually exponential in the original input parameters, so the standard methods for analyzing policies, i.e., online algorithms in our context, are infeasible.

In this thesis we propose a new computational tool for evaluating the behavior of policies for discounted MDPs locally, i.e., depending on a particular initial state. The method is based on a column generation algorithm for approximating the total expected discounted cost of an unknown optimal policy, of a concrete policy, or of a single action (where actions at all other states are assumed to be chosen according to an optimal policy). The algorithm determines an $\varepsilon$-approximation by inspecting only relatively small local parts of the total state space. We prove that the number of states required for providing the approximation is independent of the total number of states, which underlines the practicability of the algorithm. The approximations obtained by our algorithm are typically much better than the theoretical bounds obtained by other approaches. We investigate the pricing problem and the structure of the linear programs encountered in the column generation. Moreover, we propose and analyze several extensions of the basic algorithm aimed at obtaining good approximations quickly.

The potential of our analysis tool is exemplified for discounted MDPs emerging from different online optimization problems, namely online bin coloring, online target date assignment, and online elevator control. The results of the experiments are quite encouraging: in most cases our method provides performance indicators for online algorithms that reflect observations made in simulations much better than competitive analysis does. Moreover, the analysis reveals weaknesses of the considered online algorithms. In this way, we developed a new online algorithm for the online bin coloring problem that outperforms existing ones in our analyses and simulations.
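The local-evaluation idea can be illustrated with a small, self-contained sketch. The Python snippet below is not the column generation algorithm of the thesis; it only shows why an $\varepsilon$-approximation of a policy's total expected discounted cost from a fixed initial state needs to touch just a local part of the state space: the discounted cost beyond a horizon $T$ is bounded by $\gamma^T c_{\max}/(1-\gamma)$, so truncating there already gives an $\varepsilon$-guarantee that depends only on states reachable from the initial state within $T$ steps. The function name and the dictionary-based MDP encoding are hypothetical.

```python
import math
from functools import lru_cache

def local_policy_value(s0, policy, transitions, cost, gamma, eps):
    """eps-approximate the total expected discounted cost of `policy`
    started in state s0, inspecting only states reachable from s0
    within a finite horizon T (a truncated-horizon sketch, not the
    column generation method described in the thesis).

    transitions[s][a] : list of (next_state, probability) pairs
    cost[s][a]        : immediate cost of action a in state s
    policy[s]         : action chosen by the policy in state s
    """
    c_max = max(max(actions.values()) for actions in cost.values())
    # Smallest horizon T with tail bound gamma^T * c_max / (1 - gamma) <= eps.
    T = max(0, math.ceil(math.log(eps * (1 - gamma) / c_max) / math.log(gamma)))

    @lru_cache(maxsize=None)
    def value(s, t):
        # Expected discounted cost accumulated from step t onward,
        # truncated at horizon T.
        if t == T:
            return 0.0
        a = policy[s]
        return cost[s][a] + gamma * sum(
            p * value(s_next, t + 1) for s_next, p in transitions[s][a]
        )

    return value(s0, 0)

# Toy example: a two-state MDP evaluated from initial state 0.
transitions = {
    0: {"stay": [(0, 1.0)], "move": [(1, 1.0)]},
    1: {"idle": [(1, 1.0)]},
}
cost = {0: {"stay": 1.0, "move": 2.0}, 1: {"idle": 0.5}}
policy = {0: "move", 1: "idle"}
print(local_policy_value(0, policy, transitions, cost, gamma=0.9, eps=1e-3))
# exact value: 2 + 0.9 * (0.5 / 0.1) = 6.5
```

The algorithm of the thesis replaces this naive truncation with a column generation scheme over linear programming formulations, generating only the states needed to certify the $\varepsilon$-approximation; the sketch merely conveys why a local neighborhood of the initial state suffices.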

Metadata
Author: Andreas Tuchscherer
Document type: Dissertation
Free keywords / tags: Markov decision problem; column generation; linear programming; online optimization; performance guarantees
MSC classification: 60-XX PROBABILITY THEORY AND STOCHASTIC PROCESSES (For additional applications, see 11Kxx, 62-XX, 90-XX, 91-XX, 92-XX, 93-XX, 94-XX) / 60Jxx Markov processes / 60J05 Discrete-time Markov processes on general state spaces
68-XX COMPUTER SCIENCE (For papers involving machine computations and programs in a specific mathematical area, see Section -04 in that area) / 68Wxx Algorithms (For numerical algorithms, see 65-XX; for combinatorics and graph theory, see 05C85, 68Rxx) / 68W27 Online algorithms
68-XX COMPUTER SCIENCE (For papers involving machine computations and programs in a specific mathematical area, see Section -04 in that area) / 68Wxx Algorithms (For numerical algorithms, see 65-XX; for combinatorics and graph theory, see 05C85, 68Rxx) / 68W40 Analysis of algorithms [See also 68Q25]
Degree-granting institution: Freie Universität Berlin
Supervisors: Martin Grötschel, Jörg Rambau
Date of final examination: 12 August 2010
Publishing institution: Freie Universität Berlin
Date of first publication: 15 December 2010
Number of pages: 208
License: Creative Commons Attribution-NonCommercial-NoDerivatives