Free delivery with the Overseas courier service on orders over €59.99
Overseas €4.99 · Post €4.99 · DPD €5.99 · GLS €3.99 · GLS parcel locker €3.49 · Box Now €4.49

Free delivery via Box Now parcel lockers and the Overseas courier service on orders over €59.99!

Regret Analysis of Stochastic and Nonstochastic Multi-armed Bandit Problems

Language: English
Binding: Paperback
Book: Regret Analysis of Stochastic and Nonstochastic Multi-armed Bandit Problems, Sébastien Bubeck
Libristo code: 04834934
Publisher: now publishers Inc, December 2012
€91.04
External stock: ships in 15-20 days

30 days to return purchased products


You might also be interested in


Reckless, Lauren Roberts / Paperback, €10.66
Happiness 1, Shuzo Oshimi / Paperback, €11.16
Dead Boy Detectives Omnibus, Toby Litt / Hardcover, €75.68
Colour Quest (R) Cityscapes, John Woodcock / Paperback, €14.95
Practical Veterinary Dental Radiography, Brook A. Niemiec / Hardcover, €130.03
The Course of Love, Alain de Botton / Paperback, €9.36
Dog Behaviour, Evolution, and Cognition, Adam Miklosi / Paperback, €81.17
Damn Delicious Meal Prep, Chungah Rhee / Hardcover, €24.12
Cursed / Paperback, €9.36
Japanese Sake Bible, Takashi Eguchi / Paperback, €15.25
Albert Camus: A Life, Olivier Todd / Paperback, €20.73
Design Principles for Photography, Jeremy Webb / Paperback, €43.67

A multi-armed bandit problem - or, simply, a bandit problem - is a sequential allocation problem defined by a set of actions. At each time step, a unit resource is allocated to an action and some observable payoff is obtained. The goal is to maximize the total payoff obtained in a sequence of allocations. The name bandit refers to the colloquial term for a slot machine (a "one-armed bandit" in American slang). In a casino, a sequential allocation problem is obtained when the player is facing many slot machines at once (a "multi-armed bandit"), and must repeatedly choose where to insert the next coin. Multi-armed bandit problems are the most basic examples of sequential decision problems with an exploration-exploitation trade-off. This is the balance between staying with the option that gave the highest payoffs in the past and exploring new options that might give higher payoffs in the future. Although the study of bandit problems dates back to the 1930s, exploration-exploitation trade-offs arise in several modern applications, such as ad placement, website optimization, and packet routing. Mathematically, a multi-armed bandit is defined by the payoff process associated with each option. In this book, the focus is on two extreme cases in which the analysis of regret is particularly simple and elegant: independent and identically distributed payoffs and adversarial payoffs. Besides the basic setting of finitely many actions, it also analyzes some of the most important variants and extensions, such as the contextual bandit model. This monograph is an ideal reference for students and researchers with an interest in bandit problems.
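To make the exploration-exploitation trade-off concrete, here is a minimal sketch (not taken from the book) of the classic UCB1 index strategy for the i.i.d. payoff setting: each arm's index is its empirical mean plus a confidence bonus that shrinks as the arm is pulled more often, so rarely tried arms keep getting explored. The Bernoulli arm means in the usage example are hypothetical.

```python
import math
import random

def ucb1(pull, n_arms, horizon):
    """Sketch of the UCB1 strategy; pull(arm) returns a payoff in [0, 1].

    Trades off exploitation (high empirical mean) against exploration
    (a confidence bonus that is large for rarely pulled arms)."""
    counts = [0] * n_arms    # times each arm has been pulled
    sums = [0.0] * n_arms    # total payoff collected per arm
    total = 0.0
    for t in range(1, horizon + 1):
        if t <= n_arms:
            arm = t - 1      # initialization: pull every arm once
        else:
            # UCB1 index: empirical mean + sqrt(2 ln t / n_pulls)
            arm = max(range(n_arms),
                      key=lambda a: sums[a] / counts[a]
                                    + math.sqrt(2 * math.log(t) / counts[a]))
        reward = pull(arm)
        counts[arm] += 1
        sums[arm] += reward
        total += reward
    return total, counts

# Usage: two Bernoulli arms with (hypothetical) means 0.3 and 0.7.
rng = random.Random(42)
means = [0.3, 0.7]
total, counts = ucb1(lambda a: 1.0 if rng.random() < means[a] else 0.0,
                     n_arms=2, horizon=2000)
```

After 2000 rounds the better arm dominates the pull counts, while the worse arm is still sampled occasionally; the gap between the total payoff and what always playing the best arm would have earned is exactly the regret that the monograph analyzes.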

Book information

Full title: Regret Analysis of Stochastic and Nonstochastic Multi-armed Bandit Problems
Language: English
Binding: Book - Paperback
Publication date: 2012
Pages: 138
EAN: 9781601986269
ISBN: 1601986262
Libristo code: 04834934
Publisher: now publishers Inc
Weight: 208 g
Dimensions: 234 x 159 x 8 mm
Give this book as a gift today
It's simple:
1. Add the book to the cart and select delivery as a gift
2. We will send you a voucher in return
3. The book is delivered to the gift recipient's address
