Some time ago, Jorge Silva shared this interesting talk with me. It is an overview of relative entropy (also known as the Kullback-Leibler divergence, among other names) and its many appearances in information theory, probability, and statistics, including recent results by the speaker.
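
For anyone unfamiliar with the term, the standard definition for two discrete distributions P and Q on a common alphabet X is

    D(P || Q) = \sum_{x \in X} P(x) \log \frac{P(x)}{Q(x)},

with the convention 0 log(0/q) = 0 (and D = infinity when P is not absolutely continuous with respect to Q). It is nonnegative and equals zero if and only if P = Q.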

Cheers,

Milan S. Derpich