Australia Xinhua Bookstore Online

    • Information Theory and the Central Limit Theorem (English edition; Shannon Information Science Classics)
      • Author: Oliver Johnson (UK) | Editors: Chen Liang / Liu Yeqing
      • Publisher: 世图出版公司 (World Publishing Corporation)
      • ISBN: 9787519296872
      • Publication date: 2023/01/01
      • Pages: 209
    • Price: 23.6
  • Synopsis

        Drawing on a seemingly unrelated line of research from information theory, this book presents, from a novel angle, a new method of proving the central limit theorem, and describes it in full. It first gives the reader a basic introduction to the concepts of entropy and Fisher information, then verifies these through a series of standard results on their behaviour. Through the author's distinctive approach, information theory and the central limit theorem, two apparently disconnected fields, are skilfully linked, a genuine piece of cross-disciplinary research. The book also collects results from published and unpublished papers, showing how these techniques can give a unified view of limit theorems.
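The entropic view of the central limit theorem that the synopsis describes can be illustrated numerically. The following is a rough sketch, not taken from the book: the histogram estimator and the choice of uniform summands are my own. The idea is that the differential entropy of a standardized sum of i.i.d. unit-variance variables increases toward the Gaussian maximum of ½ ln(2πe).

```python
import numpy as np

rng = np.random.default_rng(0)

def entropy_estimate(samples, bins=80):
    """Crude histogram estimate of differential entropy H(X) = -E[log f(X)]."""
    counts, edges = np.histogram(samples, bins=bins, density=True)
    widths = np.diff(edges)
    mask = counts > 0  # skip empty bins, where 0 * log 0 = 0
    return -np.sum(counts[mask] * np.log(counts[mask]) * widths[mask])

def standardized_sum(n, size=200_000):
    """(X_1 + ... + X_n) / sqrt(n) for X_i uniform on [-sqrt(3), sqrt(3)]
    (so each X_i has mean 0 and variance 1)."""
    x = rng.uniform(-np.sqrt(3), np.sqrt(3), size=(size, n))
    return x.sum(axis=1) / np.sqrt(n)

h = {n: entropy_estimate(standardized_sum(n)) for n in (1, 2, 8)}
gaussian_h = 0.5 * np.log(2 * np.pi * np.e)  # maximum entropy at unit variance
```

Because the Gaussian maximises entropy among unit-variance laws, convergence in relative entropy to the Gaussian is equivalent to the entropy of the standardized sum rising to ½ ln(2πe); in this simulation the estimates climb toward `gaussian_h` as n grows.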
  • About the Author

  • Table of Contents

    Preface
    1.Introduction to Information Theory
      1.1  Entropy and relative entropy
        1.1.1  Discrete entropy
        1.1.2  Differential entropy
        1.1.3  Relative entropy
        1.1.4  Other entropy-like quantities
        1.1.5  Axiomatic definition of entropy
      1.2  Link to thermodynamic entropy
        1.2.1  Definition of thermodynamic entropy
        1.2.2  Maximum entropy and the Second Law
      1.3  Fisher information
        1.3.1  Definition and properties
        1.3.2  Behaviour on convolution
      1.4  Previous information-theoretic proofs
        1.4.1  Rényi's method
        1.4.2  Convergence of Fisher information
    2.Convergence in Relative Entropy
      2.1  Motivation
        2.1.1  Sandwich inequality
        2.1.2  Projections and adjoints
        2.1.3  Normal case
        2.1.4  Results of Brown and Barron
      2.2  Generalised bounds on projection eigenvalues
        2.2.1  Projection of functions in L2
        2.2.2  Restricted Poincaré constants
        2.2.3  Convergence of restricted Poincaré constants
      2.3  Rates of convergence
        2.3.1  Proof of O(1/n) rate of convergence
        2.3.2  Comparison with other forms of convergence
        2.3.3  Extending the Cramér-Rao lower bound
    3.Non-Identical Variables and Random Vectors
      3.1  Non-identical random variables
        3.1.1  Previous results
        3.1.2  Improved projection inequalities
      3.2  Random vectors
        3.2.1  Definitions
        3.2.2  Behaviour on convolution
        3.2.3  Projection inequalities
    4.Dependent Random Variables
      4.1  Introduction and notation
        4.1.1  Mixing coefficients
        4.1.2  Main results
      4.2  Fisher information and convolution
      4.3  Proof of subadditive relations
        4.3.1  Notation and definitions
        4.3.2  Bounds on densities
        4.3.3  Bounds on tails
        4.3.4  Control of the mixing coefficients
    5.Convergence to Stable Laws
      5.1  Introduction to stable laws
        5.1.1  Definitions
        5.1.2  Domains of attraction
        5.1.3  Entropy of stable laws
      5.2  Parameter estimation for stable distributions
        5.2.1  Minimising relative entropy
        5.2.2  Minimising Fisher information distance
        5.2.3  Matching logarithm of density
      5.3  Extending de Bruijn's identity
        5.3.1  Partial differential equations
        5.3.2  Derivatives of relative entropy
        5.3.3  Integral form of the identities
      5.4  Relationship between forms of convergence
      5.5  Steps towards a Brown inequality
    6.Convergence on Compact Groups
      6.1  Probability on compact groups
        6.1.1  Introduction to topological groups
        6.1.2  Convergence of convolutions
        6.1.3  Conditions for uniform convergence
      6.2  Convergence in relative entropy
        6.2.1  Introduction and results
        6.2.2  Entropy on compact groups
      6.3  Comparison of forms of convergence
      6.4  Proof of convergence in relative entropy
        6.4.1  Explicit rate of convergence
        6.4.2  No explicit rate of convergence
    7.Convergence to the Poisson Distribution
      7.1  Entropy and the Poisson distribution
        7.1.1  The law of small numbers
        7.1.2  Simplest bounds on relative entropy
      7.2  Fisher information
        7.2.1  Standard Fisher information
        7.2.2  Scaled Fisher information
        7.2.3  Dependent variables
      7.3  Strength of bounds
      7.4  De Bruijn identity
      7.5  L2 bounds on Poisson distance
        7.5.1  L2 definitions
        7.5.2  Sums of Bernoulli variables
        7.5.3  Normal convergence
    8.Free Random Variables
      8.1  Introduction to free variables
        8.1.1  Operators and algebras
        8.1.2  Expectations and Cauchy transforms
      8.2  Derivations and conjugate functions
        8.2.1  Derivations
        8.2.2  Fisher information and entropy
      8.3  Projection inequalities
    Appendix A  Calculating Entropies
      A.1  Gamma distribution
      A.2  Stable distributions
    Appendix B  Poincaré Inequalities
      B.1  Standard Poincaré inequalities
      B.2  Weighted Poincaré inequalities
    Appendix C  de Bruijn Identity
    Appendix D  Entropy Power Inequality
    Appendix E  Relationships Between Different Forms of Convergence
      E.1  Convergence in relative entropy to the Gaussian
      E.2  Convergence to other variables
      E.3  Convergence in Fisher information
    Bibliography
    Index
