What is Entropy? and its relation to Compression
Author: Iain Explains Signals, Systems, and Digital Comms
Uploaded: 2020-11-17
Views: 14,496
Explains Entropy in information theory and gives an example that shows its relationship to compression.
Note: it seems I forgot to mention something about the third example codebook shown on the right-hand side. For the way it is shown, I should have written 1.375 for the average codeword length (not 1.25). However, the 0 at the end of the third codeword isn't actually necessary (i.e. the 110 codeword could equally have been just 11), since in that codebook no other codeword is 3 bits long and no other codeword starts with 11. So the final codeword could just be 11, and there would be no ambiguity in decoding. The average length would then be 1.25 (which is the number I wrote).
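The correction above can be checked numerically. The sketch below is a hypothetical reconstruction: the symbol probabilities (0.75, 0.125, 0.125) are an assumption chosen to reproduce the 1.375 and 1.25 figures mentioned, not values stated in the description.

```python
# Hypothetical reconstruction of the codebook correction discussed above.
# The symbol probabilities are ASSUMED (not given in the description);
# they are chosen so the averages match the 1.375 and 1.25 figures quoted.

def avg_length(codebook, probs):
    """Average codeword length in bits: sum over symbols of p_i * len(c_i)."""
    return sum(p * len(c) for c, p in zip(codebook, probs))

def is_prefix_free(codebook):
    """True if no codeword is a prefix of another, so decoding is unambiguous."""
    return not any(a != b and b.startswith(a) for a in codebook for b in codebook)

probs = [0.75, 0.125, 0.125]       # assumed symbol probabilities
original = ["0", "10", "110"]      # codebook as shown in the video
shortened = ["0", "10", "11"]      # trailing 0 dropped from the third codeword

print(avg_length(original, probs))   # 1.375 — the corrected average
print(avg_length(shortened, probs))  # 1.25  — the number written in the video
print(is_prefix_free(shortened))     # True  — "11" still decodes unambiguously
```

Because neither "0" nor "10" is a prefix of "11", the shortened codebook remains prefix-free, which is why dropping the trailing 0 costs nothing in decodability.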
If you would like to support me to make these videos, you can join the Channel Membership, by hitting the "Join" button below the video, and making a contribution to support the cost of a coffee a month. It would be very much appreciated.
Check out my 'search for signals in everyday life', by following me on social media:
Facebook: https://www.facebook.com/profile.php?...
Instagram: / iainexplains
Website: http://www.iaincollings.com
Related videos: (see: http://iaincollings.com)
• What are Channel Capacity and Code Rate?
• What is Water Filling for Communications Channels?
• What is a Gaussian Codebook?
• What is Fisher Information?
• How are Throughput, Bandwidth, and Data Rate Related?
Full categorised list of videos and PDF Summary Sheets: http://iaincollings.com