Bandits Meet Mechanism Design to Combat Clickbait in Online Recommendation

Author(s)
Thomas Kleine Buening
Aadirupa Saha
Christos Dimitrakakis (Chaire de science des données)
Haifeng Xu
Date issued
2024
In
The Twelfth International Conference on Learning Representations
From page
1
To page
29
Subjects
bandits; mechanism design; incentive-aware learning; Nash equilibrium
Abstract
We study a strategic variant of the multi-armed bandit problem, which we coin the strategic click-bandit. This model is motivated by applications in online recommendation where the choice of recommended items depends on both the click-through rates and the post-click rewards. Like in classical bandits, rewards follow a fixed unknown distribution. However, we assume that the click-rate of each arm is chosen strategically by the arm (e.g., a host on Airbnb) in order to maximize the number of times it gets clicked. The algorithm designer does not know the post-click rewards nor the arms' actions (i.e., strategically chosen click-rates) in advance, and must learn both values over time. To solve this problem, we design an incentive-aware learning algorithm, UCB-S, which achieves two goals simultaneously: (a) incentivizing desirable arm behavior under uncertainty; (b) minimizing regret by learning unknown parameters. We approximately characterize all Nash equilibria of the arms under UCB-S and show a $\tilde{\mathcal{O}} (\sqrt{KT})$ regret bound uniformly in every equilibrium. We also show that incentive-unaware algorithms generally fail to achieve low regret in the strategic click-bandit. Finally, we support our theoretical results by simulations of strategic arm behavior which confirm the effectiveness and robustness of our proposed incentive design.
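The interaction loop the abstract describes (each arm has a click-rate and a post-click reward mean; the learner observes clicks and post-click rewards and optimizes their product) can be sketched as below. This is a minimal illustrative sketch with a plain UCB baseline and fixed click-rates, not the paper's incentive-aware UCB-S mechanism; the function name and all parameter values are assumptions for illustration.

```python
import math
import random

def simulate_click_bandit(click_rates, post_click_means, T, seed=0):
    """Run a plain UCB learner for T rounds on a click-bandit instance.

    Arm i yields a click with probability click_rates[i]; a click then
    yields a Bernoulli post-click reward with mean post_click_means[i].
    The learner's utility per round is therefore click * post-click
    reward, with mean click_rates[i] * post_click_means[i].
    Returns the average realized utility over the T rounds.
    """
    rng = random.Random(seed)
    K = len(click_rates)
    counts = [0] * K      # times each arm was played
    sums = [0.0] * K      # cumulative realized utility per arm
    total = 0.0
    for t in range(T):
        if t < K:
            i = t  # play each arm once to initialize the UCB indices
        else:
            # standard UCB1 index: empirical mean + exploration bonus
            i = max(range(K), key=lambda a: sums[a] / counts[a]
                    + math.sqrt(2.0 * math.log(t + 1) / counts[a]))
        clicked = rng.random() < click_rates[i]
        reward = float(rng.random() < post_click_means[i]) if clicked else 0.0
        counts[i] += 1
        sums[i] += reward
        total += reward
    return total / T
```

With one clearly better arm (high click-rate and high post-click mean), the average utility should concentrate near that arm's product of the two rates; a baseline like this ignores the arms' incentives, which is exactly the failure mode the paper's incentive-unaware lower bound addresses.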
Event name
ICLR 2024
Location
Vienna, Austria
Publication type
conference paper
Identifiers
https://libra.unine.ch/handle/20.500.14713/21785
-
https://libra.unine.ch/handle/123456789/33261
File(s)
Name
6260_Bandits_Meet_Mechanism_De.pdf
Type
Main Article
Size
3.26 MB
Format
Adobe PDF
Checksum (MD5)
3af9ca951ae78f1b5f26c8c9dc7b33c8
