A very interesting talk on concentration inequalities, that is, inequalities that bound the deviation of a function of independent random variables from its mean. The talk covers several basic results as well as applications to a number of fields, including, of course, information theory.
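To make the definition concrete, a canonical bound of this type (stated here for reference, not necessarily in the talk's own formulation) is McDiarmid's bounded-differences inequality: if changing the $i$-th argument of $f$ changes its value by at most $c_i$, then for independent $X_1, \ldots, X_n$,

\[
\Pr\left( \left| f(X_1,\ldots,X_n) - \mathbb{E}\, f(X_1,\ldots,X_n) \right| \ge t \right) \le 2 \exp\!\left( -\frac{2t^2}{\sum_{i=1}^{n} c_i^2} \right).
\]

Hoeffding's inequality for sums of bounded random variables follows as a special case, taking $f$ to be the sum.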
Some time ago Jorge Silva shared with me this interesting talk. It is an overview of relative entropy (also known as the Kullback-Leibler divergence) and its many appearances in information theory, probability, and statistics, including recent results by the speaker.
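For reference, for discrete distributions $P$ and $Q$ on a common alphabet, the relative entropy is

\[
D(P \,\|\, Q) = \sum_{x} P(x) \log \frac{P(x)}{Q(x)},
\]

which is nonnegative and equals zero if and only if $P = Q$ (Gibbs' inequality).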
Cheers,