EEE315

Information Theory and Coding

Assignment 1

 

Date Performed:

2011.11.4

Date Submitted:

2011.11.5

 

Introduction

Information theory answers two fundamental questions in communication theory:

what is the ultimate data compression, and what is the ultimate transmission rate of communication? These two quantities are the entropy H and the channel capacity C. In the early 1940s, Shannon showed that random processes have an irreducible complexity below which the signal cannot be compressed, and he named this quantity the entropy. He also argued that if the entropy of the source is less than the capacity of the channel, then asymptotically error-free communication can be achieved.

Shannon's Information Content

Shannon's information content, abbreviated SIC, is also named self-information. In information theory, it is a measure of the information content contained in a single event. By definition, the amount of SIC contained in a probabilistic event depends only on the probability of that event, and SIC has an inverse relationship with probability. The natural measure of the uncertainty of an event X is its probability, denoted p(x). By definition, the information content of an event is

\mathrm{Info}\{X\} = -\log p(x)

The measure of information has some intuitive properties, such as:

1. Information contained in an event ought to be defined in terms of some measure of the uncertainty of that event.

2. Less certain events ought to contain more information than more certain events.

3. The information of unrelated events taken as a single event should equal the sum of the information of the individual events.

The unit of SIC is “bits” if base 2 is used for the logarithm, and “nats” if the natural logarithm is used.
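As a minimal Matlab sketch (added for illustration; the probability value is an assumed example, not one taken from the assignment), the SIC of a single event follows directly from its probability:

p = 0.25;              % assumed example probability of the event
sic_bits = -log2(p);   % Info{X} = -log2 p(x), in bits
sic_nats = -log(p);    % the same quantity in nats
disp(sic_bits)         % displays 2 for p = 0.25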

Entropy

The entropy quantifies the expected value of the information contained in a message. The entropy can be viewed as:

1. A measure of the minimum cost needed to send some form of information.

2. The “amount of surprise factor” of the information, measured in bits.

3. Or how much energy it is worth spending to carry the information, which translates to the minimum number of bits needed to code the information.

The entropy is defined as

H(X) = -\sum_{x \in \mathcal{X}} p(x) \log p(x)

It can be viewed from a number of perspectives:

1. The average SIC of X.

2. The amount of information gained if its value is known.

3. The average number of binary questions needed to find out its value, which lies in [H(X), H(X)+1].

Entropy is quantified in terms of nats or bits.

If the source is continuous, the entropy can be written as

H(X) = -\int_{x} p(x) \log p(x) \, dx
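As a quick worked check (added here, not in the original text), a fair binary source with p(0) = p(1) = 1/2 has

H(X) = -\tfrac{1}{2}\log_2\tfrac{1}{2} - \tfrac{1}{2}\log_2\tfrac{1}{2} = 1 \text{ bit},

while a biased source with p(0) = 0.9 and p(1) = 0.1 gives H(X) \approx 0.469 bits, confirming that less predictable sources carry more information per symbol.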

Mutual Information

Mutual information of two random variables is a quantity that measures the mutual dependence of the two random variables. It is the reduction in the uncertainty of one random variable due to knowledge of the other. The most common unit of measurement of mutual information is the bit, used when logarithms to base 2 are taken.

Consider two RVs X, Y. The mutual information I(X;Y) is the relative entropy between the joint distribution p(X,Y) and the product distribution p(X)p(Y).

I(X;Y) = \sum_{x \in A} \sum_{y \in B} p(x,y) \log \frac{p(x,y)}{p(x)\,p(y)}

In the case of continuous random variables, the mutual information is

I(X;Y) = \int_{X} \int_{Y} p(x,y) \log \frac{p(x,y)}{p(x)\,p(y)} \, dy \, dx

where p(x) and p(y) are the marginal probability distribution functions of X and Y respectively.

Mutual information can be equivalently expressed as

I(X;Y) = H(X) - H(X|Y)

= H(Y) - H(Y|X)

= H(X,Y) - H(Y|X) - H(X|Y)

= H(X) + H(Y) - H(X,Y)

The relationship between mutual information and the various entropies is shown in Fig. 1.

Fig. 1: Individual entropies H(X), H(Y), joint entropy H(X,Y), and conditional entropies for a pair of correlated subsystems X, Y with mutual information I(X;Y).
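As an illustrative Matlab sketch (not part of the original assignment; the joint distribution below is an assumed example), I(X;Y) can be computed numerically from a joint probability matrix:

Pxy = [0.4 0.1; 0.1 0.4];   % assumed example joint distribution p(x,y)
Px = sum(Pxy, 2);           % marginal p(x)
Py = sum(Pxy, 1);           % marginal p(y)
I = 0;
for x = 1:size(Pxy, 1)
    for y = 1:size(Pxy, 2)
        if Pxy(x, y) > 0
            I = I + Pxy(x, y) * log2(Pxy(x, y) / (Px(x) * Py(y)));
        end
    end
end
disp(I)                     % about 0.278 bits for this example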

Question 1

The code for the first question is shown as follows:

function [entropy] = findEntropy(array)
% Computes the entropy (in bits) of a discrete source whose symbol
% probabilities are given in the vector "array".
L = length(array);
entropy = 0.0;
for i = 1:L
    entropy = entropy + array(i) * log2(1 / array(i));
end
disp('The entropy for the source is');
entropy
end

The running result is shown as follows:
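A hypothetical invocation (the probability vector is an assumed example, not necessarily the one used in the original run):

>> p = [0.5 0.25 0.125 0.125];   % assumed example source distribution
>> findEntropy(p)                % returns 1.75 bits for this example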

Question 2

By definition, the information channel capacity of a discrete memoryless channel is

C = \max_{p(x)} I(X;Y)

where the maximum is taken over all possible input distributions p(x).

The mutual information between X and Y is

I(X;Y) = \sum_{x \in A} \sum_{y \in B} p(x,y) \log \frac{p(x,y)}{p(x)\,p(y)}

= \sum_{x,y} p(y|x)\,p(x) \log \frac{p(y|x)}{p(y)}

= p(y=0|x=0)\,p(x=0) \log \frac{p(y=0|x=0)}{p(y=0)} + p(y=1|x=0)\,p(x=0) \log \frac{p(y=1|x=0)}{p(y=1)} + p(y=0|x=1)\,p(x=1) \log \frac{p(y=0|x=1)}{p(y=0)} + p(y=1|x=1)\,p(x=1) \log \frac{p(y=1|x=1)}{p(y=1)}

Fig. 2 illustrates that

p(y=0|x=0) = p(y=1|x=1) = 1 - p

p(y=1|x=0) = p(y=0|x=1) = p

These equations reveal that the mutual information is maximized by a uniformly distributed input alphabet, i.e. p(x=0) = p(x=1) = 0.5. Therefore, the capacity of the BSC is

C = 1 - H(p) \text{ bits}, \quad \text{where } H(p) \triangleq -p \log p - (1-p) \log(1-p)

Therefore, C is

C = 1 + p \log p + (1-p) \log(1-p) \text{ bits}
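As a quick check (added here; it uses the convention 0 \log 0 = 0), the endpoints and midpoint evaluate to

C(0) = C(1) = 1 \text{ bit}, \qquad C(0.5) = 1 + \tfrac{1}{2}\log_2\tfrac{1}{2} + \tfrac{1}{2}\log_2\tfrac{1}{2} = 0 \text{ bits},

i.e. a noiseless channel carries one bit per use, while a channel that flips its input half the time carries none.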

Fig. 2: The binary symmetric channel with cross probability p.

The Matlab code for this question is

>> p = [0:0.001:1];
>> C = 1 + (1-p).*log2(1-p) + p.*log2(p);
>> plot(p, C)
>> xlabel('Cross probability p');
>> ylabel('The capacity of a BSC');
>> grid on
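One caveat about this session: in Matlab, 0*log2(0) evaluates to NaN, so the endpoints p = 0 and p = 1 are simply left out of the plot. If those points are wanted, the convention 0 log 0 = 0 can be applied explicitly before plotting, for example:

>> C(p == 0 | p == 1) = 1;   % a noiseless (or fully inverting) BSC carries 1 bit per use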

The result is

Fig. 3: The capacity of a BSC with cross probability p, as a function of p, where 0 ≤ p ≤ 1.

Question 3
