
    • Fundamentals of Information Theory (English edition; a Ministry of Industry and Information Technology "12th Five-Year Plan" textbook)
      • Authors/editors: 陈杰, 孙兵, 于泽, 周荫清
      • Publisher: Beihang University Press
      • ISBN: 9787512419728
      • Publication date: 2016/01/01
      • Pages: 153
    • Price: 11.6
  • Synopsis

        Fundamentals of Information Theory (English edition; a Ministry of Industry and Information Technology "12th Five-Year Plan" textbook), compiled by 陈杰, 孙兵, 于泽 and 周荫清, takes the basic model of a communication system as its main thread and gives a systematic, comprehensive treatment of the topics that a first course in information theory should cover. The book is organized into two parts and eleven chapters. Part 1 covers the fundamental concepts of information theory: an introduction, statistical measures of information, discrete sources, lossless coding and data compression, discrete channels and their capacity, and channel coding. Part 2 covers applications: rate distortion, continuous sources, continuous channels and their capacity, maximum entropy and spectrum estimation, and computer simulation experiments. The material distills roughly ten years of the authors' teaching and research in information theory, in both Chinese and English, and is suitable for teaching international undergraduate and graduate students.
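
        As a quick, informal illustration of the "statistical measures of information" topic named above (an example added for this listing, not an excerpt from the book), the following Python sketch computes the self-information of an event and the entropy of a discrete random variable, both in bits:

        import math

        def self_information(p: float) -> float:
            """Self-information I(x) = -log2 p(x) of an event with probability p, in bits."""
            return -math.log2(p)

        def entropy(probs) -> float:
            """Entropy H(X) = -sum p(x) log2 p(x) of a discrete distribution, in bits."""
            return -sum(p * math.log2(p) for p in probs if p > 0)

        # A binary source with P(0) = 0.9, P(1) = 0.1
        print(self_information(0.1))   # ~3.32 bits: the rarer event carries more information
        print(entropy([0.9, 0.1]))     # ~0.47 bits per symbol
        print(entropy([0.25] * 4))     # 2.0 bits: uniform over 4 symbols maximizes entropy
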
  • About the Authors

  • Contents

    Chapter 1  Introduction
      1.1  Concept of information
      1.2  History of information theory
      1.3  Information, messages and signals
      1.4  Communication system model
      1.5  Information theory applications
        1.5.1  Electrical engineering (communication theory)
        1.5.2  Computer science (algorithmic complexity)
      Exercises
    Chapter 2  Statistical Measure of Information
      2.1  Information of random events
        2.1.1  Self-information
        2.1.2  Conditional self-information
        2.1.3  Mutual information of events
      2.2  Information of discrete random variables
        2.2.1  Entropy of discrete random variables
        2.2.2  Joint entropy
        2.2.3  Conditional entropy
        2.2.4  Mutual information of discrete random variables
      2.3  Relationship between entropy and mutual information
      2.4  Mutual information and entropy of continuous random variables
        2.4.1  Mutual information of continuous random variables
        2.4.2  Entropy of continuous random variables
      Exercises
    Chapter 3  Discrete Source and Its Entropy Rate
      3.1  Mathematical model of source
        3.1.1  Discrete source and continuous source
        3.1.2  Simple discrete source and its extension
        3.1.3  Memoryless source and source with memory
      3.2  Discrete memoryless source
        3.2.1  Definition
        3.2.2  Extension of discrete source
      3.3  Discrete stationary source
        3.3.1  Definition
        3.3.2  Entropy rate of discrete stationary source
      3.4  Discrete Markov source
        3.4.1  Markov chain
        3.4.2  Transition probability
        3.4.3  Markov source and its entropy rate
      Exercises
    Chapter 4  Lossless Source Coding and Data Compression
      4.1  Asymptotic equipartition property and typical sequences
      4.2  Lossless source coding
        4.2.1  Encoder
        4.2.2  Block code
        4.2.3  Fixed length code
        4.2.4  Variable length code
      4.3  Data compression
        4.3.1  Shannon coding
        4.3.2  Huffman coding
        4.3.3  Fano coding
      Exercises
    Chapter 5  Discrete Channel and Its Capacity
      5.1  Mathematical model of channel
      5.2  Discrete memoryless channel
        5.2.1  Mathematical model of discrete memoryless channel
        5.2.2  Simple DMC
        5.2.3  Extension of discrete memoryless channel
      5.3  Channel combination
      5.4  Channel capacity
        5.4.1  Concept of channel capacity
        5.4.2  Channel capacity of several special discrete channels
        5.4.3  Channel capacity of symmetric channels
        5.4.4  Channel capacity of extended DMC
        5.4.5  Channel capacity of independent parallel DMC
        5.4.6  Channel capacity of the sum channel
        5.4.7  Channel capacity of general discrete channels
      Exercises
    Chapter 6  Noisy-channel Coding
      6.1  Probability of error
      6.2  Decoding rules
      6.3  Channel coding
        6.3.1  Simple repetition code
        6.3.2  Linear code
      6.4  Noisy-channel coding theorem
      Exercises
    Chapter 7  Rate Distortion
      7.1  Quantization
      7.2  Distortion definition
        7.2.1  Distortion function
        7.2.2  Mean distortion
      7.3  Rate distortion function
        7.3.1  Fidelity criterion for given channel
        7.3.2  Definition of rate distortion function
        7.3.3  Property of rate distortion function
      7.4  Rate distortion theorem and the converse
      7.5  The calculation of rate distortion function
      Exercises
    Chapter 8  Continuous Source and Its Entropy Rate
      8.1  Continuous source
      8.2  Entropy of continuous source
      8.3  Maximum entropy of continuous source
      8.4  Joint entropy, conditional entropy and mutual information for continuous random variables
      8.5  Entropy rate of continuous source
      8.6  Rate distortion for continuous source
      Exercises
    Chapter 9  Continuous Channel and Its Capacity
      9.1  Capacity of continuous channel
        9.1.1  Capacity of discrete-time channel
        9.1.2  Capacity of continuous-time channel
      9.2  The Gaussian channel
      9.3  Band-limited channels
      9.4  Coding theorem for continuous channel
      Exercises
    Chapter 10  Maximum Entropy and Spectrum Estimation
      10.1  Maximum entropy probability distribution
        10.1.1  Maximum entropy distribution
        10.1.2  Examples
      10.2  Maximum entropy spectrum estimation
        10.2.1  Burg's maximum entropy theorem
        10.2.2  Maximum entropy spectrum estimation
      Exercises
    Chapter 11  Experiments of Information Theory
      11.1  Measure of information
        11.1.1  Information calculator
        11.1.2  Properties of entropy
      11.2  Simulation of Markov source
      11.3  Performance simulation for source coding
        11.3.1  Shannon coding
        11.3.2  Huffman coding
        11.3.3  Fano coding
      11.4  Simulation of BSC
      11.5  Simulation of the cascade channel
      11.6  Calculation of channel capacity
      11.7  Decoding rules
      11.8  Performance demonstration of channel coding
      References
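
    The coding topics listed in Sections 4.3 and 11.3 (Shannon, Huffman and Fano coding) lend themselves to small programming exercises. As an informal sketch of one of them, and not a reproduction of the book's Chapter 11 simulation experiments, a minimal Huffman code construction in Python could look like this:

    import heapq

    def huffman_code(symbol_probs):
        """Build a Huffman code for {symbol: probability}; returns {symbol: bitstring}."""
        # Each heap entry is (probability, tie_breaker, {symbol: codeword_so_far}).
        heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(symbol_probs.items())]
        heapq.heapify(heap)
        while len(heap) > 1:
            p1, _, c1 = heapq.heappop(heap)   # two least probable subtrees
            p2, i, c2 = heapq.heappop(heap)
            # Prepend '0' to one subtree's codewords and '1' to the other's, then merge.
            merged = {s: "0" + c for s, c in c1.items()}
            merged.update({s: "1" + c for s, c in c2.items()})
            heapq.heappush(heap, (p1 + p2, i, merged))
        return heap[0][2]

    probs = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}
    code = huffman_code(probs)
    avg_len = sum(p * len(code[s]) for s, p in probs.items())
    print(code)      # e.g. {'a': '0', 'b': '10', 'c': '110', 'd': '111'} (bit labels may differ)
    print(avg_len)   # 1.75 bits/symbol, equal to the source entropy for this dyadic distribution
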
