
Elements of Information Theory - Thomas M. Cover, Joy A. Thomas.pdf

Cover
Book Information
Contents
Preface to the Second Edition
Preface to the First Edition
Chapter 1 Introduction and Preview
Chapter 2 Entropy, Relative Entropy, and Mutual Information
2.1 Entropy
2.2 Joint Entropy and Conditional Entropy
2.3 Relative Entropy and Mutual Information
2.4 Relationship between Entropy and Mutual Information
2.5 Chain Rules for Entropy, Relative Entropy, and Mutual Information
2.6 Jensen's Inequality and Its Consequences
2.7 Log Sum Inequality and Its Applications
2.8 Data-Processing Inequality
2.9 Sufficient Statistics
2.10 Fano's Inequality
SUMMARY
Chapter 3 Asymptotic Equipartition Property
3.1 Asymptotic Equipartition Property Theorem
3.2 Consequences of the AEP: Data Compression
3.3 High-Probability Sets and the Typical Set
SUMMARY
Chapter 4 Entropy Rates of a Stochastic Process
4.1 Markov Chains
4.2 Entropy Rate
4.3 Example: Entropy Rate of a Random Walk on a Weighted Graph
4.4 Second Law of Thermodynamics
4.5 Functions of Markov Chains
SUMMARY
Chapter 5 Data Compression
5.1 Examples of Codes
5.2 Kraft Inequality
5.3 Optimal Codes
5.4 Bounds on the Optimal Code Length
5.5 Kraft Inequality for Uniquely Decodable Codes
5.6 Huffman Codes
5.7 Some Comments on Huffman Codes
5.8 Optimality of Huffman Codes
5.9 Shannon–Fano–Elias Coding
5.10 Competitive Optimality of the Shannon Code
5.11 Generation of Discrete Distributions from Fair Coins
SUMMARY
Chapter 6 Gambling and Data Compression
6.1 The Horse Race
6.2 Gambling and Side Information
6.3 Dependent Horse Races and Entropy Rate
6.4 The Entropy of English
6.5 Data Compression and Gambling
6.6 Gambling Estimate of the Entropy of English
SUMMARY
Chapter 7 Channel Capacity
7.1 Examples of Channel Capacity
7.2 Symmetric Channels
7.3 Properties of Channel Capacity
7.4 Preview of the Channel Coding Theorem
7.5 Definitions
7.6 Jointly Typical Sequences
7.7 Channel Coding Theorem
7.8 Zero-Error Codes
7.9 Fano's Inequality and the Converse to the Coding Theorem
7.10 Equality in the Converse to the Channel Coding Theorem
7.11 Hamming Codes
7.12 Feedback Capacity
7.13 Source–Channel Separation Theorem
SUMMARY
Chapter 8 Differential Entropy
8.1 Definitions
8.2 AEP for Continuous Random Variables
8.3 Relation of Differential Entropy to Discrete Entropy
8.4 Joint and Conditional Differential Entropy
8.5 Relative Entropy and Mutual Information
8.6 Properties of Differential Entropy, Relative Entropy, and Mutual Information
SUMMARY
Chapter 9 Gaussian Channel
9.1 Gaussian Channel: Definitions
9.2 Converse to the Coding Theorem for Gaussian Channels
9.3 Bandlimited Channels
9.4 Parallel Gaussian Channels
9.5 Channels with Colored Gaussian Noise
9.6 Gaussian Channels with Feedback
SUMMARY
Chapter 10 Rate Distortion Theory
10.1 Quantization
10.2 Definitions
ELEMENTS OF INFORMATION THEORY
Second Edition
THOMAS M. COVER
JOY A. THOMAS
A JOHN WILEY & SONS, INC., PUBLICATION
Copyright © 2006 by John Wiley & Sons, Inc. All rights reserved.

Published by John Wiley & Sons, Inc., Hoboken, New Jersey. Published simultaneously in Canada.

No part of this publication may be reproduced, stored in a retrieval system, or transmitted in any form or by any means, electronic, mechanical, photocopying, recording, scanning, or otherwise, except as permitted under Section 107 or 108 of the 1976 United States Copyright Act, without either the prior written permission of the Publisher, or authorization through payment of the appropriate per-copy fee to the Copyright Clearance Center, Inc., 222 Rosewood Drive, Danvers, MA 01923, (978) 750-8400, fax (978) 750-4470, or on the web at www.copyright.com. Requests to the Publisher for permission should be addressed to the Permissions Department, John Wiley & Sons, Inc., 111 River Street, Hoboken, NJ 07030, (201) 748-6011, fax (201) 748-6008, or online at http://www.wiley.com/go/permission.

Limit of Liability/Disclaimer of Warranty: While the publisher and author have used their best efforts in preparing this book, they make no representations or warranties with respect to the accuracy or completeness of the contents of this book and specifically disclaim any implied warranties of merchantability or fitness for a particular purpose. No warranty may be created or extended by sales representatives or written sales materials. The advice and strategies contained herein may not be suitable for your situation. You should consult with a professional where appropriate. Neither the publisher nor author shall be liable for any loss of profit or any other commercial damages, including but not limited to special, incidental, consequential, or other damages.

For general information on our other products and services or for technical support, please contact our Customer Care Department within the United States at (800) 762-2974, outside the United States at (317) 572-3993 or fax (317) 572-4002.

Wiley also publishes its books in a variety of electronic formats. Some content that appears in print may not be available in electronic formats. For more information about Wiley products, visit our web site at www.wiley.com.

Library of Congress Cataloging-in-Publication Data:
Cover, T. M., 1938–
Elements of information theory / by Thomas M. Cover, Joy A. Thomas. – 2nd ed.
p. cm.
"A Wiley-Interscience publication."
Includes bibliographical references and index.
ISBN-13 978-0-471-24195-9
ISBN-10 0-471-24195-4
1. Information theory. I. Thomas, Joy A. II. Title.
Q360.C68 2005
003'.54–dc22
2005047799

Printed in the United States of America.
10 9 8 7 6 5 4 3 2 1
CONTENTS

Contents v
Preface to the Second Edition xv
Preface to the First Edition xvii
Acknowledgments for the Second Edition xxi
Acknowledgments for the First Edition xxiii

1 Introduction and Preview 1
1.1 Preview of the Book 5

2 Entropy, Relative Entropy, and Mutual Information 13
2.1 Entropy 13
2.2 Joint Entropy and Conditional Entropy 16
2.3 Relative Entropy and Mutual Information 19
2.4 Relationship Between Entropy and Mutual Information 20
2.5 Chain Rules for Entropy, Relative Entropy, and Mutual Information 22
2.6 Jensen's Inequality and Its Consequences 25
2.7 Log Sum Inequality and Its Applications 30
2.8 Data-Processing Inequality 34
2.9 Sufficient Statistics 35
2.10 Fano's Inequality 37
Summary 41
Problems 43
Historical Notes 54

3 Asymptotic Equipartition Property 57
3.1 Asymptotic Equipartition Property Theorem 58
3.2 Consequences of the AEP: Data Compression 60
3.3 High-Probability Sets and the Typical Set 62
Summary 64
Problems 64
Historical Notes 69

4 Entropy Rates of a Stochastic Process 71
4.1 Markov Chains 71
4.2 Entropy Rate 74
4.3 Example: Entropy Rate of a Random Walk on a Weighted Graph 78
4.4 Second Law of Thermodynamics 81
4.5 Functions of Markov Chains 84
Summary 87
Problems 88
Historical Notes 100

5 Data Compression 103
5.1 Examples of Codes 103
5.2 Kraft Inequality 107
5.3 Optimal Codes 110
5.4 Bounds on the Optimal Code Length 112
5.5 Kraft Inequality for Uniquely Decodable Codes 115
5.6 Huffman Codes 118
5.7 Some Comments on Huffman Codes 120
5.8 Optimality of Huffman Codes 123
5.9 Shannon–Fano–Elias Coding 127
5.10 Competitive Optimality of the Shannon Code 130
5.11 Generation of Discrete Distributions from Fair Coins 134
Summary 141
Problems 142
Historical Notes 157