This course introduces the basic tools of Information Theory: Entropy, Relative Entropy, and Information, and highlights their utility with applications drawn from various disciplines. After introducing the basics of probability theory and information theory, we explore topics including coding, data compression, channel capacity, thermodynamics, population dynamics, gene transcription, network science, and more.