Archive for category Web Talks

Concentration inequalities, talk by Gábor Lugosi

Very interesting talk on concentration inequalities (understood as inequalities that bound the deviations of a function of independent random variables from its mean). Several basic results are discussed, as well as applications to a number of fields, including, of course, information theory.
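To fix ideas, a standard example of the kind of bound meant here (not necessarily the one emphasized in the talk) is McDiarmid's bounded-differences inequality: if changing any single coordinate of $f$ changes its value by at most $c_i$, then

```latex
% McDiarmid's bounded-differences inequality (standard statement):
% if |f(x_1,\dots,x_i,\dots,x_n) - f(x_1,\dots,x_i',\dots,x_n)| \le c_i
% for all i and all arguments, and X_1,\dots,X_n are independent, then
\Pr\left\{ f(X_1,\dots,X_n) - \mathbb{E}\, f(X_1,\dots,X_n) \ge t \right\}
  \le \exp\!\left( -\frac{2t^2}{\sum_{i=1}^{n} c_i^2} \right).
```

Bounds of this form quantify how tightly $f(X_1,\dots,X_n)$ concentrates around its mean.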

Milan

Sergio Verdu’s talk on relative entropy

Some time ago Jorge Silva shared with me this interesting talk. It is an overview of relative entropy (a.k.a. Kullback-Leibler divergence) and its multiple appearances in information theory, probability, and statistics, including recent results by the speaker.
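For reference, the standard definition of the relative entropy between two distributions $P$ and $Q$ on a common discrete alphabet $\mathcal{X}$ is

```latex
% Relative entropy (Kullback-Leibler divergence), discrete case:
D(P \| Q) = \sum_{x \in \mathcal{X}} P(x) \log \frac{P(x)}{Q(x)},
% with the conventions 0 \log 0 = 0 and D(P \| Q) = \infty
% if P(x) > 0 for some x with Q(x) = 0.
```

It is nonnegative, equals zero if and only if $P = Q$, and is in general not symmetric in $P$ and $Q$.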

Cheers,

Milan S. Derpich