
41 Sessions

Free Course

Self-Learning Course

Learn anywhere, anytime
Course Description

Information theory is a branch of applied mathematics, electrical engineering, and computer science involving the quantification of information. Information theory was developed by Claude E. Shannon to find fundamental limits on signal processing operations such as compressing data and on reliably storing and communicating data. Since its inception it has broadened to find applications in many other areas, including statistical inference, natural language processing, cryptography, neurobiology,[1] the evolution[2] and function[3] of molecular codes, model selection in ecology,[4] thermal physics,[5] quantum computing, linguistics, plagiarism detection,[6] pattern recognition, anomaly detection and other forms of data analysis.[7]
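
To make "quantification of information" concrete: Shannon's entropy H(X) = -Σ p(x) log2 p(x) measures the average number of bits needed per source symbol. Below is a minimal, illustrative Python sketch (not part of the course materials; the symbol probabilities are made up for the example).

    from math import log2

    def entropy(probabilities):
        # Shannon entropy in bits: H(X) = -sum(p * log2(p))
        return -sum(p * log2(p) for p in probabilities if p > 0)

    # A source emitting three symbols with probabilities 1/2, 1/4, 1/4
    # needs on average 1.5 bits per symbol; Shannon's source coding
    # theorem says no lossless code can average fewer bits than this.
    print(entropy([0.5, 0.25, 0.25]))  # 1.5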

What will you get?

High-Quality Video Content

Fun and Engaging Learning Experience

Anytime, Anywhere Learning

Relevant Content

Sample Course Video


Course Syllabus
  • Mod-01 Lec-01 Introduction to Information Theory and Coding
  • Mod-01 Lec-02 Definition of Information Measure and Entropy
  • Mod-01 Lec-03 Extension of An Information Source and Markov Source
  • Mod-01 Lec-04 Adjoint of An Information Source, Joint and Conditional Information Measures
  • Mod-01 Lec-05 Properties of Joint and Conditional Information Measures and a Markov Source
  • Mod-01 Lec-06 Asymptotic Properties of Entropy and Problem Solving in Entropy
  • Mod-01 Lec-07 Block Code and Its Properties
  • Mod-01 Lec-08 Instantaneous Code and Its Properties
  • Mod-01 Lec-09 Kraft-McMillan Inequality and Compact Codes
  • Mod-01 Lec-10 Shannon's First Theorem
  • Mod-01 Lec-11 Coding Strategies and Introduction to Huffman Coding
  • Mod-01 Lec-12 Huffman Coding and Proof of Its Optimality
  • Mod-01 Lec-13 Competitive Optimality of The Shannon Code
  • Mod-01 Lec-14 Non-Binary Huffman Code and Other Codes
  • Mod-01 Lec-15 Adaptive Huffman Coding Part-1
  • Mod-01 Lec-16 Adaptive Huffman Coding Part-2
  • Mod-01 Lec-17 Shannon-Fano-Elias Coding and Introduction to Arithmetic Coding
  • Mod-01 Lec-18 Arithmetic Coding Part-1
  • Mod-01 Lec-19 Arithmetic Coding Part-2
  • Mod-01 Lec-20 Introduction to Information Channel
  • Mod-01 Lec-21 Equivocation and Mutual Information
  • Mod-01 Lec-22 Properties of Different Information Channels
  • Mod-01 Lec-23 Reduction of Information Channels
  • Mod-01 Lec-24 Properties of Mutual Information and Introduction to Channel Capacity
  • Mod-01 Lec-25 Calculation of Channel Capacity for Different Information Channels
  • Mod-01 Lec-26 Shannon's Second Theorem
  • Mod-01 Lec-27 Discussion on Error Free Communication Over Noisy Channel
  • Mod-01 Lec-28 Error Free Communication Over a Binary Symmetric Channel
  • Mod-01 Lec-29 Differential Entropy and Evaluation of Mutual Information
  • Mod-01 Lec-30 Channel Capacity of a Bandlimited Continuous Channel
  • Mod-01 Lec-31 Introduction to Rate-Distortion Theory
  • Mod-01 Lec-32 Definition and Properties of Rate-Distortion Functions
  • Mod-01 Lec-33 Calculation of Rate-Distortion Functions
  • Mod-01 Lec-34 Computational Approach For Calculation of Rate-Distortion Functions
  • Mod-01 Lec-35 Introduction to Quantization
  • Mod-01 Lec-36 Lloyd-Max Quantizer
  • Mod-01 Lec-37 Companded Quantization
  • Mod-01 Lec-38 Variable Length Coding and Problem Solving In Quantizer Design
  • Mod-01 Lec-39 Vector Quantization
  • Mod-01 Lec-40 Transform Coding Part-1
  • Mod-01 Lec-41 Transform Coding Part-2
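
Lectures 11 through 16 in the syllabus above deal with Huffman coding; as a rough companion to those lectures, here is a minimal Python sketch of the basic Huffman construction (the symbols and probabilities are illustrative, not taken from the course).

    import heapq

    def huffman_code(probs):
        # Build a prefix code by repeatedly merging the two least probable
        # subtrees; heap entries are (probability, tie-breaker, codeword map).
        heap = [(p, i, {sym: ""}) for i, (sym, p) in enumerate(probs.items())]
        heapq.heapify(heap)
        counter = len(heap)
        while len(heap) > 1:
            p1, _, low = heapq.heappop(heap)
            p2, _, high = heapq.heappop(heap)
            merged = {s: "0" + c for s, c in low.items()}
            merged.update({s: "1" + c for s, c in high.items()})
            heapq.heappush(heap, (p1 + p2, counter, merged))
            counter += 1
        return heap[0][2]

    print(huffman_code({"A": 0.5, "B": 0.25, "C": 0.25}))
    # e.g. {'A': '0', 'B': '10', 'C': '11'} -- average length 1.5 bits,
    # matching the entropy of this example source.
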
Reviews

No Reviews