EEE315 Information Theory and Coding
Assignment 1
Date Performed: 2011.11.4
Date Submitted: 2011.11.5

Introduction

Information theory answers two fundamental questions in communication theory: what is the ultimate data compression, and what is the ultimate transmission rate of communication? These two aspects can also be regarded as the entropy H and the channel capacity C. In the early 1940s, Shannon showed that random processes have an irreducible complexity below which the signal cannot be compressed, and this he named entropy. He also argued that if the entropy of the source is less than the capacity of the channel, then asymptotically error-free communication can be achieved.

Shannon's information content

Shannon's information content (SIC for short) is also known as self-information. In information theory, it is a measure of the information content contained in a single event. By definition, the amount of SIC contained in a probabilistic event depends only on the probability of that event, and SIC has an inverse relationship with probability. The natural measure of the uncertainty of an event x is its probability, denoted by p(x). By definition, the information content of an event is

\mathrm{Info}(x) = -\log p(x)

The measure of information has some intuitive properties:
1. Information contained in events ought to be defined in terms of some measure of the uncertainty of the events.
2. Less certain events ought to contain more information than more certain events.
3. The information of unrelated events taken as a single event should equal the sum of the information of the unrelated events.

The unit of SIC is "bits" if base 2 is used for the logarithm, and "nats" if the natural logarithm is used.
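
As a quick numerical illustration, the SIC can be evaluated directly in Matlab; the event probabilities below are assumed example values, not ones taken from the assignment.

% Self-information (SIC) in bits for a few assumed event probabilities.
selfInfo = @(p) -log2(p);   % Info(x) = -log2 p(x)
selfInfo(0.5)               % a fair coin flip carries 1 bit
selfInfo(0.25)              % a 1-in-4 event carries 2 bits
selfInfo(0.01)              % a rare event carries about 6.64 bits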

Entropy

The entropy quantifies the expected value of the information contained in a message. The entropy can be viewed as:
1. A measure of the minimum cost needed to send some form of information.
2. "The amount of surprise factor" of the information, measured in bits.
3. How much energy it is worth spending to carry the information, which translates to the minimum number of bits needed to code the information.

The entropy is defined as

H(X) = -\sum_{x \in \mathcal{X}} p(x) \log p(x)

It can be viewed from a number of perspectives:
1. The average SIC of X.
2. The amount of information gained if its value becomes known.
3. The average number of binary questions needed to find out its value, which lies in [H(X), H(X)+1].

Entropy is quantified in terms of nats or bits. If the source is continuous, the entropy can be written as

H(X) = -\int_{\mathcal{X}} p(x) \log p(x)\, dx
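
For example, a source with three symbols of assumed probabilities 1/2, 1/4 and 1/4 has

H(X) = \tfrac{1}{2}\log_2 2 + \tfrac{1}{4}\log_2 4 + \tfrac{1}{4}\log_2 4 = 0.5 + 0.5 + 0.5 = 1.5 \text{ bits}

so, on average, 1.5 binary digits suffice to code one symbol of this source.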

Mutual Information

The mutual information of two random variables is a quantity that measures the mutual dependence of the two random variables. It is the reduction in the uncertainty of one random variable due to the knowledge of the other. The most common unit of measurement of mutual information is the bit, when logarithms to the base 2 are used.

Consider two random variables X and Y with alphabets A and B. The mutual information I(X;Y) is the relative entropy between the joint distribution p(x,y) and the product distribution p(x)p(y):

I(X;Y) = \sum_{x \in A}\sum_{y \in B} p(x,y) \log \frac{p(x,y)}{p(x)\,p(y)}

In the case of continuous random variables, the mutual information is

I(X;Y) = \int_X \int_Y p(x,y) \log \frac{p(x,y)}{p(x)\,p(y)}\, dy\, dx

where p(x) and p(y) are the marginal probability density functions of X and Y respectively.

Mutual information can be equivalently expressed as

I(X;Y) = H(X) - H(X|Y) = H(Y) - H(Y|X) = H(X,Y) - H(X|Y) - H(Y|X) = H(X) + H(Y) - H(X,Y)

The relationship between mutual information and the various entropies is shown in Fig. 1.

Fig. 1: Individual entropies H(X), H(Y), joint entropy H(X,Y) and conditional entropies for a pair of correlated subsystems X, Y with mutual information I(X;Y).
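
A short Matlab sketch of the discrete definition, using an assumed 2-by-2 joint distribution (equivalent to a BSC with crossover probability 0.25 driven by a uniform input), is:

% Mutual information I(X;Y) in bits from an assumed joint pmf Pxy(i,j) = p(x_i, y_j).
Pxy = [3/8 1/8; 1/8 3/8];        % assumed example joint distribution
Px  = sum(Pxy, 2);               % marginal p(x)
Py  = sum(Pxy, 1);               % marginal p(y)
I = 0;
for i = 1:size(Pxy, 1)
    for j = 1:size(Pxy, 2)
        if Pxy(i, j) > 0
            I = I + Pxy(i, j) * log2(Pxy(i, j) / (Px(i) * Py(j)));
        end
    end
end
disp(I)                          % about 0.1887 bits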

Question 1

The code for the first question is as follows:

function entropy = findEntropy(array)
% Computes the entropy (in bits) of a discrete source whose symbol
% probabilities are given in the vector array.
L = length(array);
entropy = 0.0;
for i = 1:L
    entropy = entropy + array(i) * log2(1/array(i));
end
disp('The entropy for the source is ');
disp(entropy);
end

The running result is shown below.
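
For instance, calling the function with an assumed four-symbol source distribution (not the one used in the original run):

p = [0.5 0.25 0.125 0.125];   % assumed example probabilities
H = findEntropy(p)

displays an entropy of 1.75 bits.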

Question 2

By the definition of channel capacity, the information channel capacity of a discrete memoryless channel is

C = \max_{p(x)} I(X;Y)

where the maximum is taken over all possible input distributions p(x).

The mutual information between X and Y is

I(X;Y) = \sum_{x \in A}\sum_{y \in B} p(x,y) \log \frac{p(x,y)}{p(x)\,p(y)}
       = \sum_{x,y} p(y|x)\,p(x) \log \frac{p(y|x)}{p(y)}
       = p(y=0|x=0)\,p(x=0) \log \frac{p(y=0|x=0)}{p(y=0)}
       + p(y=1|x=0)\,p(x=0) \log \frac{p(y=1|x=0)}{p(y=1)}
       + p(y=0|x=1)\,p(x=1) \log \frac{p(y=0|x=1)}{p(y=0)}
       + p(y=1|x=1)\,p(x=1) \log \frac{p(y=1|x=1)}{p(y=1)}

As Fig. 2 (the binary symmetric channel diagram) illustrates,

p(y=0|x=0) = p(y=1|x=1) = 1-p, \qquad p(y=1|x=0) = p(y=0|x=1) = p

These equations reveal that the uniform input distribution, p(x=0) = p(x=1) = 0.5, maximizes the mutual information. Therefore, the capacity of the BSC is

C = 1 - H(p) \text{ bits}, \qquad H(p) = -p \log p - (1-p) \log(1-p)

and hence

C = 1 + p \log p + (1-p) \log(1-p) \text{ bits}

Fig. 2: The binary symmetric channel with cross probability p.

The Matlab code for this question is:

p = 0:0.001:1;
C = 1 + (1-p).*log2(1-p) + p.*log2(p);
plot(p, C)
xlabel('Cross probability p');
ylabel('The capacity of a BSC');
grid on

The result is shown in Fig. 3.

Fig. 3: The capacity of a BSC with cross probability p as a function of p, where 0 < p < 1.
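
As a numerical cross-check, Matlab can sweep the input distribution q = P(X=0) for a fixed, assumed cross probability (p = 0.1 below) and confirm that I(X;Y) peaks at the uniform input with value 1 - H(p).

% Sweep q = P(X=0) for an assumed cross probability p and locate the maximum of I(X;Y).
p   = 0.1;                                  % assumed example cross probability
q   = 0.001:0.001:0.999;                    % candidate input distributions
py0 = q*(1-p) + (1-q)*p;                    % P(Y=0)
py1 = 1 - py0;                              % P(Y=1)
Hy  = -py0.*log2(py0) - py1.*log2(py1);     % H(Y)
Hyx = -p*log2(p) - (1-p)*log2(1-p);         % H(Y|X) = H(p), independent of q
I   = Hy - Hyx;                             % I(X;Y) = H(Y) - H(Y|X)
[Imax, k] = max(I);
fprintf('max I = %.4f bits at q = %.3f (1 - H(p) = %.4f)\n', Imax, q(k), 1 - Hyx);

For p = 0.1 this prints a maximum of about 0.531 bits at q = 0.500, matching C = 1 - H(0.1).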

Question 3
