EEE315 Information Theory and Coding
Assignment 1
Date Performed: 2011.11.4
Date Submitted: 2011.11.5

Introduction
Information theory answers two fundamental questions in communication theory: what is the ultimate data compression, and what is the ultimate transmission rate of communication? The answers to these two questions are the entropy H and the channel capacity C, respectively. In the early 1940s, Shannon showed that random processes have an irreducible complexity below which the signal cannot be compressed, and this he named entropy. He also argued that if the entropy of the source is less than the capacity of the channel, then asymptotically error-free communication can be achieved.
Shannon's Information Content
Shannon's information content (SIC) is also known as self-information. In information theory, it is a measure of the information contained in a single event. By definition, the amount of SIC contained in a probabilistic event depends only on the probability of that event, and SIC has an inverse relationship with probability. The natural measure of the uncertainty of an event x is its probability, denoted p(x). By definition, the information content of an event is

\mathrm{Info}(x) = -\log p(x)

The measure of information has some intuitive properties:
1. The information contained in an event ought to be defined in terms of some measure of the uncertainty of the event.
2. Less certain events ought to contain more information than more certain events.
3. The information of unrelated events taken as a single event should equal the sum of the information of the individual events.
The unit of SIC is "bits" if base 2 is used for the logarithm, and "nats" if the natural logarithm is used.
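As a quick numerical illustration (a sketch added here, not part of the original assignment; the variable names are made up), the SIC of an event with probability 1/8 is 3 bits, or about 2.08 nats:

p0 = 1/8;                % probability of the event (illustrative value)
sic_bits = -log2(p0)     % self-information in bits: 3
sic_nats = -log(p0)      % self-information in nats: 3*log(2) = 2.0794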
Entropy
The entropy quantifies the expected value of the information contained in a message. The entropy can be viewed as:
1. A measure of the minimum cost needed to send some form of information.
2. The "amount of surprise" in the information, measured in bits.
3. How much it is worth spending to carry the information, which translates into the minimum number of bits needed to encode it.
The entropy is defined as

H(X) = -\sum_{x \in X} p(x) \log p(x)

It can be viewed from a number of perspectives:
1. The average SIC of X.
2. The amount of information gained when the value of X becomes known.
3. The average number of binary questions needed to find out the value of X, which lies in [H(X), H(X)+1].
Entropy is quantified in nats or bits.
If the source is continuous, the entropy can be written as

H(X) = -\int_{X} p(x) \log p(x) \, dx
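As a short worked example (added for concreteness, not part of the original assignment), consider a source with alphabet {a, b, c} and probabilities (1/2, 1/4, 1/4):

H(X) = -\frac{1}{2}\log_2\frac{1}{2} - \frac{1}{4}\log_2\frac{1}{4} - \frac{1}{4}\log_2\frac{1}{4} = 0.5 + 0.5 + 0.5 = 1.5 \text{ bits}

This agrees with perspective 3 above: asking "is it a?" and then, if not, "is it b?" takes (1/2)(1) + (1/2)(2) = 1.5 binary questions on average.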
Mutual Information
Mutual information of two random variables is a quantity that measures the mutual dependence of the two variables. It is the reduction in the uncertainty of one random variable due to knowledge of the other. The most common unit of measurement of mutual information is the bit, when logarithms to base 2 are used.
Consider two RVs X and Y. The mutual information I(X;Y) is the relative entropy between the joint distribution p(x,y) and the product distribution p(x)p(y):

I(X;Y) = \sum_{x \in A} \sum_{y \in B} p(x,y) \log \frac{p(x,y)}{p(x)p(y)}

In the case of continuous random variables, the mutual information is

I(X;Y) = \int_{X} \int_{Y} p(x,y) \log \frac{p(x,y)}{p(x)p(y)} \, dy \, dx

where p(x) and p(y) are the marginal probability distribution functions of X and Y respectively.
Mutual information can be equivalently expressed as

I(X;Y) = H(X) - H(X|Y) = H(Y) - H(Y|X) = H(X,Y) - H(X|Y) - H(Y|X) = H(X) + H(Y) - H(X,Y)
The relationship between mutual information and the various entropies is shown in Fig. 1.
Fig. 1: Individual entropies H(X) and H(Y), joint entropy H(X,Y), and conditional entropies for a pair of correlated subsystems X and Y with mutual information I(X;Y). [figure not reproduced]
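To make the definition concrete, the following sketch computes I(X;Y) numerically from a 2x2 joint distribution (not part of the original assignment; the joint pmf and variable names are illustrative):

% Mutual information of two binary RVs from their joint pmf.
Pxy = [3/8 1/8;     % p(x=0,y=0)  p(x=0,y=1)
       1/8 3/8];    % p(x=1,y=0)  p(x=1,y=1)
Px = sum(Pxy, 2);   % marginal p(x)
Py = sum(Pxy, 1);   % marginal p(y)
I = 0;
for i = 1:2
    for j = 1:2
        I = I + Pxy(i,j) * log2(Pxy(i,j) / (Px(i)*Py(j)));
    end
end
disp(I)             % 0.1887 bits

This particular joint pmf corresponds to a uniform input sent through a binary symmetric channel with crossover probability 1/4, so the value 1 - H(1/4) = 0.1887 bits reappears in Question 2 below.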
Question 1
The code for the first question is as follows:

function entropy = findEntropy(array)
% Compute the entropy (in bits) of a discrete source whose
% symbol probabilities are given in the vector 'array'.
L = length(array);
entropy = 0.0;
for i = 1:L
    entropy = entropy + array(i) * log2(1/array(i));
end
disp('The entropy for the source is:');
disp(entropy);
end

The running result is as follows: [output not reproduced]
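A usage sketch for findEntropy (the probability vector here is illustrative, not the assignment's test data):

H = findEntropy([0.5 0.25 0.125 0.125]);   % prints and returns 1.75 bits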
Question 2
By the definition of channel capacity, the information capacity of a discrete memoryless channel is

C = \max_{p(x)} I(X;Y)

where the maximum is taken over all possible input distributions p(x).
The mutual information between X and Y is

I(X;Y) = \sum_{x \in A} \sum_{y \in B} p(x,y) \log_2 \frac{p(x,y)}{p(x)p(y)}
       = \sum_{x,y} p(y|x) p(x) \log_2 \frac{p(y|x)}{p(y)}
       = p(y=0|x=0) p(x=0) \log_2 \frac{p(y=0|x=0)}{p(y=0)} + p(y=1|x=0) p(x=0) \log_2 \frac{p(y=1|x=0)}{p(y=1)}
         + p(y=0|x=1) p(x=1) \log_2 \frac{p(y=0|x=1)}{p(y=0)} + p(y=1|x=1) p(x=1) \log_2 \frac{p(y=1|x=1)}{p(y=1)}

Fig. 2 illustrates that, for the binary symmetric channel,

p(y=0|x=0) = p(y=1|x=1) = 1 - p
p(y=1|x=0) = p(y=0|x=1) = p

These equations reveal that the mutual information is maximized by a uniformly distributed input alphabet, p(x=0) = p(x=1) = 1/2. Therefore, the capacity of the BSC is

C = 1 - H(p) bits, where H(p) = -p \log_2 p - (1-p) \log_2 (1-p)

so the capacity is

C = 1 + p \log_2 p + (1-p) \log_2 (1-p) bits

Fig. 2: The binary symmetric channel with crossover probability p. [figure not reproduced]
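Before plotting, a quick numerical sanity check of this formula (a sketch, not part of the original submission): at p = 0.5 the output is independent of the input, so C should be 0, and at p = 0.25 the value should match the mutual-information sketch given earlier. Note that at p = 0 and p = 1 the term 0*log2(0) evaluates to NaN in floating point, although the limiting capacity there is 1 bit.

Cap = @(p) 1 + p.*log2(p) + (1-p).*log2(1-p);   % BSC capacity in bits
Cap(0.5)     % 0: a completely random channel carries no information
Cap(0.25)    % 0.1887: agrees with the earlier mutual-information sketch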
The MATLAB code for this question is:

p = 0:0.001:1;
C = 1 + (1-p).*log2(1-p) + p.*log2(p);
plot(p, C)
xlabel('Crossover probability p');
ylabel('Capacity of the BSC (bits)');
grid on

The result is shown in Fig. 3.
Fig. 3: The capacity of a BSC with crossover probability p, as a function of p, for 0 <= p <= 1. [figure not reproduced]

Question 3