Information Theory

Language: English

Rating: 0/5 (0 ratings) · 153 students · New course

Instructor(s): Prof. Dr. Academic Educator

Last update: 2022-08-11

What you’ll learn

  • Concept of Information
  • Entropy and Mutual Information
  • Communication Channels and Channel Capacity
  • Concept of Data Compression
  • Limits of Data Compression

 

Requirements

  • Probability and Random Variables

 

Description

Today’s communication technology can be considered a result of Shannon’s work published in 1948. His paper “A Mathematical Theory of Communication,” in which entropy, mutual information, and channel capacity were defined, was a milestone in communication science. In this course, we teach the fundamental subjects of information theory. Students should have a background in probability and random variables; without it, the fundamentals of the course cannot be grasped.

In the course, we first explain what information means and how it is measured. We introduce the concept of entropy and solve various examples that clarify its meaning. Next, we explain mutual information and define channel capacity, calculating the capacities of several discrete channels. We then define capacity for the AWGN channel and derive the channel capacity expression. We also introduce typical sequences, explain the philosophy behind data compression, and present the limits of data compression. This course can be taken by anyone interested in the fundamentals of communication theory, and it is especially useful for those working in the communication/telecommunication field.
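The core quantities the description mentions (entropy, mutual information, and the capacities of the binary symmetric and AWGN channels) can be sketched in a few lines of Python. This is an illustrative sketch of the standard textbook formulas, not course material; all function names and example numbers below are our own.

```python
import math

def entropy(probs):
    """Shannon entropy H(X) = -sum p*log2(p), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def mutual_information(joint):
    """I(X;Y) = H(X) + H(Y) - H(X,Y) for a joint pmf given as a 2-D list."""
    px = [sum(row) for row in joint]
    py = [sum(col) for col in zip(*joint)]
    hxy = entropy([p for row in joint for p in row])
    return entropy(px) + entropy(py) - hxy

def bsc_capacity(p):
    """Capacity of a binary symmetric channel with crossover probability p:
    C = 1 - H(p) bits per channel use."""
    return 1 - entropy([p, 1 - p])

def awgn_capacity(snr):
    """Capacity of a real AWGN channel: C = 0.5 * log2(1 + SNR) bits per use."""
    return 0.5 * math.log2(1 + snr)

print(entropy([0.5, 0.5]))    # fair coin: 1.0 bit
print(bsc_capacity(0.5))      # useless channel: 0.0 bits
print(awgn_capacity(7))       # 0.5 * log2(8) = 1.5 bits
```

For independent X and Y the mutual information is zero, e.g. `mutual_information([[0.25, 0.25], [0.25, 0.25]])` evaluates to 0, which matches the intuition that observing Y then tells us nothing about X.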

 

Who this course is for

  • Communication, Telecommunication, Electronic and Computer Engineers

 

Course content

  • Introduction
    • What is information? How do we measure information?
    • Measurement of Information
    • Review of Discrete Random Variables
  • Entropy of Discrete Random Variables
    • Entropy
    • Joint Entropy

 
