Information Theory and Coding (English Edition), by Liang Jianwu. Chapter 2: Information Measure (.ppt)

Chapter 2. Basic Concepts of Information Theory: the Statistical Measure of Information

Introduction: preparatory knowledge

1. Information measure
- Information is measurable; this is the foundation on which information theory is built.
- Methods of measuring information include the structural measure, the statistical measure, the semantic measure, the fuzzy measure, and so on.
- The statistical measure uses the logarithm of the probability of an event to describe an uncertain outcome and to obtain the information content of a message; it also establishes a new concept, entropy.
- Entropy is the most important concept in Shannon's information theory.

2. Mathematical model of a single-symbol discrete source
- Since such a discrete source involves only one random event, it can be represented by a discrete random variable.
- Uppercase X, Y, Z denote random variables and refer to the source as a whole; lowercase symbols such as ai denote a particular outcome of the random event, that is, one element of the source. Do not confuse the two!

3. Review

4. Mathematical background
- log(xy) = log x + log y
- log(x/y) = log x - log y

2.1 Self-Information and Conditional Self-Information

2.1.1 Self-information
- The self-information of an event ai is the negative logarithm of its probability: I(ai) = -log P(ai).
- Explanation: one is inevitably surprised when a small-probability event occurs, so the information it produces is rich. If a nearly impossible event actually occurs, it is explosive news that amazes the world. Conversely, an event that is very likely to happen is within everyone's expectation; even when it occurs it carries little information, and when an inevitable (certain) event occurs it gives no information at all.
- Properties of the self-information I(ai):
  - I(ai) is a non-negative value;
  - when P(ai) = 1, I(ai) = 0;
  - when P(ai) = 0, I(ai) = ∞;
  - I(ai) is a monotone decreasing function of P(ai).
- Joint self-information: when the source model involves two random events, the joint self-information is I(aibj) = -log P(aibj).

2.1.2 Conditional self-information
- The conditional self-information is the negative logarithm of the conditional probability, I(ai|bj) = -log P(ai|bj): the information quantity brought by one random event given that another has occurred.
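The definitions and properties above can be checked numerically. The following is a minimal Python sketch, not part of the original slides; the function name self_info, the base-2 logarithm (giving bits), and the example probabilities are my assumptions:

```python
import math

def self_info(p):
    """Self-information I(a) = -log P(a); base-2 log gives bits (an assumed convention)."""
    if p == 0:
        return math.inf  # impossible event: infinite self-information
    return -math.log2(p)

# The four properties listed on the slides:
assert self_info(1.0) == 0                 # certain event carries no information
assert self_info(0.0) == math.inf          # impossible event
assert self_info(0.25) > self_info(0.5)    # monotone decreasing in P(ai)
assert all(self_info(p) >= 0 for p in (0.1, 0.5, 0.9, 1.0))  # non-negative

# Joint vs. conditional self-information: I(ab) = I(a) + I(b|a).
# The probabilities below are hypothetical illustration values.
p_a, p_b_given_a = 0.5, 0.25
p_ab = p_a * p_b_given_a                   # chain rule: P(ab) = P(a) P(b|a)
assert math.isclose(self_info(p_ab), self_info(p_a) + self_info(p_b_given_a))
```

Note how the additivity I(ab) = I(a) + I(b|a) is just the logarithm identity log(xy) = log x + log y from the mathematical background applied to the chain rule of probability.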
