Information Theory: Introduction, Measure of Information, Information Content of a Message, Average Information Content of Symbols in Long Independent Sequences, Average Information Content of Symbols in Long Dependent Sequences, Markov Statistical Model for Information Sources, Entropy and Information Rate of Markov Sources
(Sections 4.1, 4.2 of Text 1) L1, L2, L3
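The central quantities of this module (entropy as average information content, and information rate) can be illustrated numerically; a minimal Python sketch, with made-up source probabilities:

```python
from math import log2

def entropy(probs):
    """Average information content H = -sum(p_i * log2(p_i)) in bits/symbol."""
    return -sum(p * log2(p) for p in probs if p > 0)

# A binary source with P = {0.25, 0.75}:
H = entropy([0.25, 0.75])      # about 0.811 bits/symbol
# Information rate R = r_s * H for a source emitting r_s symbols/second,
# e.g. r_s = 1000 symbols/s gives R of about 811 bits/s for this source.
```

A uniform binary source (P = {0.5, 0.5}) attains the maximum H = 1 bit/symbol, which is why entropy is maximized by equiprobable symbols.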
Source Coding: Encoding of the Source Output, Shannon's Encoding Algorithm (Sections 4.3, 4.3.1 of Text 1), Shannon-Fano Encoding Algorithm (Section 2.15 of Reference Book 4)
Source Coding Theorem, Prefix Codes, Kraft-McMillan Inequality (KMI), Huffman Codes (Section 2.2 of Text 2) L1, L2, L3
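As a quick illustration of the Huffman procedure named above, a minimal Python sketch that returns only the code lengths (the symbol set and probabilities are hypothetical, chosen so the lengths are easy to verify by hand):

```python
import heapq

def huffman_lengths(probs):
    """Huffman procedure: repeatedly merge the two least-probable entries.
    Returns the code length assigned to each symbol."""
    # Heap entries: (probability, tie-breaker, symbols in this subtree)
    heap = [(p, i, [s]) for i, (s, p) in enumerate(sorted(probs.items()))]
    heapq.heapify(heap)
    lengths = dict.fromkeys(probs, 0)
    tie = len(heap)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)
        p2, _, s2 = heapq.heappop(heap)
        for s in s1 + s2:          # each merge adds one bit to these codes
            lengths[s] += 1
        tie += 1
        heapq.heappush(heap, (p1 + p2, tie, s1 + s2))
    return lengths

# Dyadic probabilities give lengths equal to -log2(p): here 1, 2, 3, 3
lengths = huffman_lengths({'a': 0.5, 'b': 0.25, 'c': 0.125, 'd': 0.125})
```

For these dyadic probabilities the average code length equals the source entropy (1.75 bits/symbol), the equality case of the source coding theorem.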
Information Channels: Communication Channels, Discrete Communication Channels, Channel Matrix, Joint Probability Matrix, Binary Symmetric Channel, System Entropies (Sections 4.4, 4.5, 4.5.1, 4.5.2 of Text 1)
Mutual Information, Channel Capacity, Channel Capacity of the Binary Symmetric Channel (Sections 2.5, 2.6 of Text 2)
Binary Erasure Channel, Muroga's Theorem (Sections 2.27, 2.28 of Reference Book 4) L1, L2, L3
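The closed-form capacities met in this module, C = 1 − H(p) for the binary symmetric channel and C = 1 − α for the binary erasure channel, can be checked numerically; a small sketch (the crossover and erasure probabilities used are illustrative):

```python
from math import log2

def binary_entropy(p):
    """H(p) = -p*log2(p) - (1-p)*log2(1-p), with H(0) = H(1) = 0."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

def bsc_capacity(p):
    """Binary symmetric channel with crossover probability p: C = 1 - H(p)."""
    return 1 - binary_entropy(p)

def bec_capacity(alpha):
    """Binary erasure channel with erasure probability alpha: C = 1 - alpha."""
    return 1 - alpha

# A fair-coin channel (p = 0.5) carries no information, so C = 0;
# an error-free channel (p = 0) carries one bit per use, so C = 1.
```

Note the contrast the module draws out: a BSC with p = 0.1 has C ≈ 0.531 bits/use, while a BEC with α = 0.1 has C = 0.9 bits/use, since an erasure tells the receiver where the damage is.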
Error Control Coding: Introduction, Examples of Error Control Coding, Methods of Controlling Errors, Types of Errors, Types of Codes, Linear Block Codes: Matrix Description of Linear Block Codes, Error Detection and Correction Capabilities of Linear Block Codes, Single-Error-Correcting Hamming Code, Table Lookup Decoding using the Standard Array.
Binary Cyclic Codes: Algebraic Structure of Cyclic Codes, Encoding using an (n-k)-Bit Shift Register, Syndrome Calculation, Error Detection and Correction
(Sections 9.1, 9.2, 9.3, 9.3.1, 9.3.2, 9.3.3 of Text 1) L1, L2, L3
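Syndrome decoding of the single-error-correcting Hamming code can be sketched in a few lines. The (7,4) parity-check matrix below is one conventional choice (column j is the binary representation of j), not necessarily the arrangement used in the texts; with it, a single-bit error's syndrome directly names the error position:

```python
# Parity-check matrix H of a (7,4) Hamming code; column j (1-indexed)
# is the 3-bit binary representation of j, MSB in the top row.
H = [[0, 0, 0, 1, 1, 1, 1],
     [0, 1, 1, 0, 0, 1, 1],
     [1, 0, 1, 0, 1, 0, 1]]

def correct(r):
    """Correct at most one bit error in a received 7-bit word r (list of 0/1)."""
    s = [sum(h * b for h, b in zip(row, r)) % 2 for row in H]  # syndrome H.r mod 2
    pos = s[0] * 4 + s[1] * 2 + s[2]     # read the syndrome as a binary number
    out = list(r)
    if pos:                              # nonzero syndrome -> flip that bit
        out[pos - 1] ^= 1
    return out
```

This is the same idea as standard-array table lookup decoding: the syndrome indexes the coset leader (here, a single-bit error pattern), which is then subtracted from the received word.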
Convolutional Codes: Convolutional Encoder, Time-Domain Approach, Transform-Domain Approach, Code Tree, Trellis and State Diagram, The Viterbi Algorithm (Section 8.5 - Articles 1, 2 and 3, Section 8.6 - Article 1 of Text 2) L1, L2, L3
Course Outcomes:
After studying this course, students will be able to:
1. Explain the concepts of dependent and independent sources, measure of information, entropy, rate of information and order of a source
2. Represent the information using Shannon, Shannon-Fano, Prefix and Huffman encoding algorithms
3. Model the continuous and discrete communication channels using input, output and joint probabilities
4. Determine a codeword comprising the check bits computed using linear block codes, cyclic codes and convolutional codes
5. Design the encoding and decoding circuits for Linear Block codes, cyclic codes, convolutional codes, BCH and Golay codes.
Question paper pattern:
• Examination will be conducted for 100 marks with question paper containing 10 full questions, each of 20 marks.
• Each full question can have a maximum of 4 sub questions.
• There will be 2 full questions from each module covering all the topics of the module.
• Students will have to answer 5 full questions, selecting one full question from each module.
• The total marks will be proportionally reduced to 60, as the SEE component carries 60 marks.
Text Books:
1. Digital and Analog Communication Systems, K. Sam Shanmugam, John Wiley India Pvt. Ltd, 1996.
2. Digital Communication, Simon Haykin, John Wiley India Pvt. Ltd, 2008.
Reference Books:
1. ITC and Cryptography, Ranjan Bose, TMH, 2nd Edition, 2007.
2. Principles of Digital Communication, J. Das, S. K. Mullick, P. K. Chatterjee, Wiley, 1986.
3. Digital Communications - Fundamentals and Applications, Bernard Sklar, Second Edition, Pearson Education, 2016, ISBN: 9780134724058.
4. Information Theory and Coding, Hari Bhat, Ganesh Rao, Cengage, 2017.
5. Error Correction Coding, Todd K. Moon, Wiley Student Edition, 2006.