
Exploration in POMDPs

Author(s)
Dimitrakakis, Christos  
Chaire de science des données  
Vol
1
No
1
Pages
24-31
Abstract
In recent work, Bayesian methods for exploration in Markov decision processes (MDPs) and for solving known partially-observable Markov decision processes (POMDPs) have been proposed. In this paper we review the similarities and differences between those two domains and propose methods to deal with them simultaneously. This enables us to attack the Bayes-optimal reinforcement learning problem in POMDPs.
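The Bayesian exploration methods the abstract refers to maintain a posterior over the unknown model of the environment. As a minimal illustrative sketch only (not the paper's method), the snippet below keeps a Dirichlet pseudo-count posterior over the transition probabilities of a small MDP; the state/action sizes and prior strength are hypothetical choices for the example.

```python
class DirichletTransitionModel:
    """Dirichlet-count posterior over MDP transition probabilities.

    counts[s][a][s2] holds the pseudo-count for the transition
    s --a--> s2; each observed transition adds 1 to its count.
    """

    def __init__(self, n_states, n_actions, prior=1.0):
        self.counts = [[[prior] * n_states for _ in range(n_actions)]
                       for _ in range(n_states)]

    def update(self, s, a, s_next):
        """Record one observed transition."""
        self.counts[s][a][s_next] += 1.0

    def posterior_mean(self, s, a):
        """Posterior-mean estimate of P(s' | s, a)."""
        row = self.counts[s][a]
        total = sum(row)
        return [c / total for c in row]


# Example: observe the transition 0 -> 1 three times under action 0.
model = DirichletTransitionModel(n_states=2, n_actions=1)
for _ in range(3):
    model.update(0, 0, 1)
probs = model.posterior_mean(0, 0)  # counts [1, 4] over total 5
```

Acting optimally with respect to such a posterior (rather than a point estimate) is what makes exploration Bayes-optimal; the paper extends this line of reasoning from MDPs to the partially observable case.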
Publication type
journal article
Identifiers
https://libra.unine.ch/handle/20.500.14713/65599
File(s)
IAS-UVA-08-01.pdf (Main Article, Adobe PDF, 153.58 KB)

Service information scientifique & bibliothèques
Rue Emile-Argand 11, 2000 Neuchâtel
contact.libra@unine.ch
Service informatique et télématique
Rue Emile-Argand 11, Bâtiment B, rez-de-chaussée
Powered by DSpace-CRIS (libra v2.2.0)
© 2026 Université de Neuchâtel
