numpy_ml/bandits/README.md
# Bandits

The `bandit.py` module includes several simple multi-armed bandit environments.
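As an illustrative sketch only (not the library's actual API; the class name and constructor are hypothetical), a minimal Bernoulli multi-armed bandit environment might look like:

```python
import numpy as np


class BernoulliBandit:
    """A K-armed bandit where arm i pays 1 with probability p[i], else 0.

    Hypothetical illustration -- see `bandit.py` for the real environments.
    """

    def __init__(self, payoff_probs, seed=None):
        self.payoff_probs = np.asarray(payoff_probs, dtype=float)
        self.rng = np.random.default_rng(seed)

    @property
    def n_arms(self):
        return len(self.payoff_probs)

    def pull(self, arm_id):
        """Pull `arm_id` and return a Bernoulli 0/1 reward."""
        return int(self.rng.random() < self.payoff_probs[arm_id])


# three arms with very different payout probabilities
bandit = BernoulliBandit([0.1, 0.5, 0.9], seed=0)
rewards = [bandit.pull(2) for _ in range(100)]
```

Pulling the high-payout arm repeatedly should yield mostly 1s; the policy's job is to discover which arm that is without knowing `payoff_probs`.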

The `policies.py` module implements a number of standard multi-armed bandit policies.

1. **Bandits** (`bandit.py`)
    - MAB: Bernoulli, Multinomial, and Gaussian payout distributions
    - Contextual MAB: Linear contextual bandits
2. **Policies** (`policies.py`)
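To make the environment/policy split concrete, here is a self-contained sketch of one standard MAB policy, epsilon-greedy (the function and the toy arms below are hypothetical illustrations, not the module's actual API): with probability epsilon the agent explores a random arm, otherwise it exploits the arm with the highest empirical mean reward.

```python
import numpy as np


def epsilon_greedy(pull, n_arms, n_steps, epsilon=0.1, seed=None):
    """Explore a random arm w.p. `epsilon`; otherwise pull the arm with
    the highest empirical mean reward observed so far."""
    rng = np.random.default_rng(seed)
    counts = np.zeros(n_arms)   # pulls per arm
    means = np.zeros(n_arms)    # empirical mean reward per arm
    total = 0.0
    for _ in range(n_steps):
        if rng.random() < epsilon:
            arm = int(rng.integers(n_arms))   # explore
        else:
            arm = int(np.argmax(means))       # exploit
        r = pull(arm)
        counts[arm] += 1
        means[arm] += (r - means[arm]) / counts[arm]  # incremental mean
        total += r
    return means, total


# toy Bernoulli arms with success probabilities 0.2 and 0.8
probs = [0.2, 0.8]
rng = np.random.default_rng(0)
pull = lambda arm: float(rng.random() < probs[arm])

means, total = epsilon_greedy(pull, n_arms=2, n_steps=2000, epsilon=0.1, seed=1)
```

After enough steps the empirical means approach the true payout probabilities, and the policy spends most of its pulls on the better arm.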

## Plots
