Motion integration in visual attention models for predicting simple dynamic scenes

Author(s)
Bur, Alexandre
Wurtz, Pascal
Muri, Rene M.
Hügli, Heinz
Publication date
2007
In
Human Vision and Electronic Imaging XII (Proceedings of SPIE), International Society for Optical Engineering (SPIE), 2007/6492/47/649219-649229
Keywords
  • visual attention
  • computer model
  • motion
  • bottom-up
  • eye movement
  • saliency

Abstract
Visual attention models mimic the ability of a visual system to detect potentially relevant parts of a scene. This process of attentional selection is a prerequisite for higher-level tasks such as object recognition. Given the high relevance of temporal aspects in human visual attention, dynamic information as well as static information must be considered in computer models of visual attention. While some models have been proposed that extend the classical static model to motion, a comparison of the performance of models integrating motion in different ways is still not available. In this article, we present a comparative study of various visual attention models combining both static and dynamic features. The considered models are compared by measuring their respective performance with respect to the eye movement patterns of human subjects. Simple synthetic video sequences, containing static and moving objects, are used to assess model suitability. Qualitative and quantitative results provide a ranking of the different models.
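As context for the kind of integration the abstract describes, the sketch below shows one simple way a static conspicuity map and a motion conspicuity map can be fused into a single saliency map by a normalized weighted sum. This is purely illustrative and not the paper's implementation; the function names, the choice of weighted-sum fusion, and the equal weights are assumptions.

    import numpy as np

    def normalize(sal_map):
        # Scale a conspicuity map to [0, 1]; a common normalization step
        # before combining maps from different feature channels.
        lo, hi = sal_map.min(), sal_map.max()
        return (sal_map - lo) / (hi - lo) if hi > lo else np.zeros_like(sal_map)

    def fuse_static_dynamic(static_map, motion_map, w_static=0.5, w_motion=0.5):
        # Combine static and motion maps by a weighted sum (one of several
        # possible integration schemes; the weights here are illustrative).
        return w_static * normalize(static_map) + w_motion * normalize(motion_map)

    # Toy example on a 4x4 frame: a static feature map (e.g. derived from
    # intensity/color/orientation) and a motion map (e.g. derived from
    # frame-to-frame motion magnitude).
    rng = np.random.default_rng(0)
    static_map = rng.random((4, 4))
    motion_map = rng.random((4, 4))
    saliency = fuse_static_dynamic(static_map, motion_map)
    print(saliency.round(2))

The resulting saliency map could then be compared against human fixation locations, which is the kind of evaluation against eye movement patterns the study performs.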
Identifiers
https://libra.unine.ch/handle/123456789/15848
10.1117/12.704185
Publication type
journal article
File(s) to download
 main article: Bur_A._-_Motion_integration_in_visual_attention_models_20090205.pdf (501.39 KB)