Multi-armed Bandit Demo

This is a simple JavaScript demonstration of some algorithms for the multi-armed bandit problem.

Some background, references, and information about the bandit problem can be found at my GitHub repository, along with the source code.
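
To give a flavour of the kind of algorithm involved, here is a minimal sketch of an epsilon-greedy agent in JavaScript. The class and method names are illustrative only and are not the demo's actual API:

    // Minimal epsilon-greedy agent (illustrative names, not the demo's actual code).
    class EpsilonGreedyAgent {
      constructor(numArms, epsilon = 0.1) {
        this.epsilon = epsilon;
        this.counts = new Array(numArms).fill(0);  // pulls per arm
        this.values = new Array(numArms).fill(0);  // running mean reward per arm
      }

      // Explore a random arm with probability epsilon, otherwise exploit the best estimate.
      selectArm() {
        if (Math.random() < this.epsilon) {
          return Math.floor(Math.random() * this.counts.length);
        }
        return this.values.indexOf(Math.max(...this.values));
      }

      // Incrementally update the mean reward estimate for the pulled arm.
      update(arm, reward) {
        this.counts[arm] += 1;
        this.values[arm] += (reward - this.values[arm]) / this.counts[arm];
      }
    }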

[The interactive demo controls appear here: drop-down menus for choosing bandits and agents, a “Run game for N steps” field, and a “Go!” button.]
Running the Demo

Bandits and agents can be chosen using the drop-down menus. New bandits and agents can be added by clicking the “+ create” link. If a bandit or agent requires parameters, these can be entered in the text areas that appear next to the drop-down menu.
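
As an example of what such parameters might be, a simple Bernoulli bandit could take a list of per-arm payout probabilities. The following sketch is illustrative and does not reflect the demo's actual code:

    // Illustrative Bernoulli bandit: each arm pays 1 with its own probability, otherwise 0.
    class BernoulliBandit {
      constructor(armProbabilities) {
        this.probs = armProbabilities;  // e.g. [0.2, 0.5, 0.8]
      }

      get numArms() {
        return this.probs.length;
      }

      // Pulling an arm returns a stochastic reward.
      pull(arm) {
        return Math.random() < this.probs[arm] ? 1 : 0;
      }
    }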

Once the bandits and agents are selected, enter the number of steps for the game in the text field on the right, then hit “Go!” to run the game.
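
Conceptually, running a game boils down to a loop of the following shape (a sketch assuming the illustrative bandit and agent classes above, not the demo's internals):

    // Play one game of `steps` pulls and return the reward earned at each step.
    function runGame(bandit, agent, steps) {
      const rewards = [];
      for (let t = 0; t < steps; t++) {
        const arm = agent.selectArm();    // agent picks an arm
        const reward = bandit.pull(arm);  // bandit pays out
        agent.update(arm, reward);        // agent learns from the outcome
        rewards.push(reward);
      }
      return rewards;
    }

    // Example: 1000 steps of epsilon-greedy against a three-armed Bernoulli bandit.
    const bandit = new BernoulliBandit([0.2, 0.5, 0.8]);
    const agent = new EpsilonGreedyAgent(bandit.numArms, 0.1);
    const rewards = runGame(bandit, agent, 1000);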

The results are plotted below the UI. Rewards for bandits are drawn in light grey; agents are drawn in colour and listed in the plot’s legend. (The “Averages” check box determines whether cumulative totals or running averages are displayed.)
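
For reference, the two views the check box switches between can be computed from the per-step rewards as follows (a sketch, not the demo's plotting code):

    // Cumulative total reward after each step.
    function runningTotals(rewards) {
      const totals = [];
      let sum = 0;
      for (const r of rewards) {
        sum += r;
        totals.push(sum);
      }
      return totals;
    }

    // Average reward per step after each step.
    function runningAverages(rewards) {
      return runningTotals(rewards).map((total, i) => total / (i + 1));
    }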