Contents
Preface
Development of the Book
Organization of the Book
Presentation of the Material
Use of the Book in Courses
Dependence Graphs
Acknowledgments
Notation
Sets, Scalars, and Vectors
Probability and Random Variables
Common Functions
ε–δ Notation
Matrices
Order Notation
CHAPTER 1
Introduction
1.1 NETWORK INFORMATION FLOW PROBLEM
1.2 MAX-FLOW MIN-CUT THEOREM
1.3 POINT-TO-POINT INFORMATION THEORY
1.4 NETWORK INFORMATION THEORY
1.4.1 Multiple Sources and Destinations
1.4.2 Wireless Networks
1.4.3 Interactive Communication
1.4.4 Joint Source–Channel Coding
1.4.5 Secure Communication
1.4.6 Network Information Theory and Networking
1.4.7 Toward a Unified Network Information Theory
PART I PRELIMINARIES
CHAPTER 2 Information Measures and Typicality
2.1 ENTROPY
2.2 DIFFERENTIAL ENTROPY
2.3 MUTUAL INFORMATION
2.4 TYPICAL SEQUENCES
2.5 JOINTLY TYPICAL SEQUENCES
2.5.1 Joint Typicality for a Triple of Random Variables
2.5.2 Multivariate Typical Sequences
SUMMARY
BIBLIOGRAPHIC NOTES
PROBLEMS
APPENDIX 2A PROOF OF THE CONDITIONAL TYPICALITY LEMMA
CHAPTER 3 Point-to-Point Information Theory
3.1 CHANNEL CODING
3.1.1 Channel Coding Theorem
3.1.2 Proof of Achievability
3.1.3 Achievability Using Linear Codes
3.1.4 Proof of the Converse
3.1.5 DMC with Feedback
3.2 PACKING LEMMA
3.3 CHANNEL CODING WITH INPUT COST
3.4 GAUSSIAN CHANNEL
3.4.1 Capacity of the Gaussian Channel
3.4.2 Minimum Energy Per Bit
3.4.3 Gaussian Product Channel
3.5 LOSSLESS SOURCE CODING
3.5.1 Lossless Source Coding Theorem
3.5.2 Proof of Achievability
3.5.3 Proof of the Converse
3.6 LOSSY SOURCE CODING
3.6.1 Lossy Source Coding Theorem
3.6.2 Proof of the Converse
3.6.3 Proof of Achievability
3.6.4 Lossless Source Coding Revisited
3.7 COVERING LEMMA
3.8 QUADRATIC GAUSSIAN SOURCE CODING
3.9 JOINT SOURCE–CHANNEL CODING
3.9.1 Uncoded Transmission
SUMMARY
BIBLIOGRAPHIC NOTES
PROBLEMS
APPENDIX 3A PROOF OF LEMMA 3.2
PART II SINGLE-HOP NETWORKS
CHAPTER 4 Multiple Access Channels
4.1 DISCRETE MEMORYLESS MULTIPLE ACCESS CHANNEL
4.2 SIMPLE BOUNDS ON THE CAPACITY REGION
4.3* MULTILETTER CHARACTERIZATION OF THE CAPACITY REGION
4.4 TIME SHARING
4.5 SINGLE-LETTER CHARACTERIZATION OF THE CAPACITY REGION
4.5.1 Proof of Achievability
4.5.2 Proof of the Converse
4.5.3 Coded Time Sharing
4.6 GAUSSIAN MULTIPLE ACCESS CHANNEL
4.6.1 Capacity Region of the Gaussian MAC
4.6.2 Comparison with Point-to-Point Coding Schemes
4.7 EXTENSIONS TO MORE THAN TWO SENDERS
SUMMARY
BIBLIOGRAPHIC NOTES
PROBLEMS
APPENDIX 4A CARDINALITY BOUND ON Q
CHAPTER 5 Degraded Broadcast Channels
5.1 DISCRETE MEMORYLESS BROADCAST CHANNEL
5.2 SIMPLE BOUNDS ON THE CAPACITY REGION
5.3 SUPERPOSITION CODING INNER BOUND
5.3.1 Proof of the Superposition Coding Inner Bound
5.4 DEGRADED DM-BC
5.4.1 Proof of the Converse
5.4.2 Capacity Region of the BS-BC
5.5 GAUSSIAN BROADCAST CHANNEL
5.5.1 Capacity Region of the Gaussian BC
5.5.2 Proof of the Converse
5.6 LESS NOISY AND MORE CAPABLE BROADCAST CHANNELS
5.6.1 Proof of the Converse for the More Capable DM-BC
5.7 EXTENSIONS
SUMMARY
BIBLIOGRAPHIC NOTES
PROBLEMS
CHAPTER 6 Interference Channels
6.1 DISCRETE MEMORYLESS INTERFERENCE CHANNEL
6.2 SIMPLE CODING SCHEMES
6.3 STRONG INTERFERENCE
6.4 GAUSSIAN INTERFERENCE CHANNEL
6.4.1 Inner Bounds
6.4.2 Capacity Region of the Gaussian IC with Strong Interference
6.4.3 Sum-Capacity of the Gaussian IC with Weak Interference
6.5 HAN–KOBAYASHI INNER BOUND
6.5.1 Proof of the Han–Kobayashi Inner Bound
6.6 INJECTIVE DETERMINISTIC IC
6.7 CAPACITY REGION OF THE GAUSSIAN IC WITHIN HALF A BIT
6.7.1 Injective Semideterministic IC
6.7.2 Half-Bit Theorem for the Gaussian IC
6.7.3 Symmetric Degrees of Freedom
6.8 DETERMINISTIC APPROXIMATION OF THE GAUSSIAN IC
6.8.1* QED-IC Approximation of the Gaussian IC
6.9 EXTENSIONS TO MORE THAN TWO USER PAIRS
SUMMARY
BIBLIOGRAPHIC NOTES
PROBLEMS
APPENDIX 6A PROOF OF LEMMA 6.2
APPENDIX 6B PROOF OF PROPOSITION 6.1
CHAPTER 7 Channels with State
7.1 DISCRETE MEMORYLESS CHANNEL WITH STATE
7.2 COMPOUND CHANNEL
7.3* ARBITRARILY VARYING CHANNEL
7.4 CHANNELS WITH RANDOM STATE
7.4.1 State Information Available at Both the Encoder and the Decoder
7.4.2 Extensions to the DM-MAC with State
7.5 CAUSAL STATE INFORMATION AVAILABLE AT THE ENCODER
7.6 NONCAUSAL STATE INFORMATION AVAILABLE AT THE ENCODER
7.6.1 Proof of Achievability
7.6.2 Proof of the Converse
7.6.3 Comparison with the Causal Case
7.6.4 Extensions to the Degraded DM-BC with State
7.7 WRITING ON DIRTY PAPER
7.7.1 Writing on Dirty Paper for the Gaussian MAC
7.7.2 Writing on Dirty Paper for the Gaussian BC
7.8 CODED STATE INFORMATION
SUMMARY
BIBLIOGRAPHIC NOTES
PROBLEMS
CHAPTER 8 General Broadcast Channels
8.1 DM-BC WITH DEGRADED MESSAGE SETS
8.2 THREE-RECEIVER MULTILEVEL DM-BC
8.2.1 Proof of Achievability
8.2.2 Multilevel Product DM-BC
8.3 MARTON’S INNER BOUND
8.3.1 Semideterministic DM-BC
8.3.2 Achievability of Marton’s Inner Bound
8.3.3 Relationship to Gelfand–Pinsker Coding
8.4 MARTON’S INNER BOUND WITH COMMON MESSAGE
8.5 OUTER BOUNDS
8.5.1 A BSC and a BEC
8.5.2 Binary Skew-Symmetric Broadcast Channel
8.5.3 Nair–El Gamal Outer Bound
8.6 INNER BOUNDS FOR MORE THAN TWO RECEIVERS
SUMMARY
BIBLIOGRAPHIC NOTES
PROBLEMS
APPENDIX 8A PROOF OF THE MUTUAL COVERING LEMMA
APPENDIX 8B PROOF OF THE NAIR–EL GAMAL OUTER BOUND
CHAPTER 9 Gaussian Vector Channels
9.1 GAUSSIAN VECTOR POINT-TO-POINT CHANNEL
9.1.1 Equivalent Gaussian Product Channel
9.1.2 Reciprocity
9.1.3 Alternative Characterization of K_X*
9.2 GAUSSIAN VECTOR MULTIPLE ACCESS CHANNEL
9.2.1 GV-MAC with More than Two Senders
9.3 GAUSSIAN VECTOR BROADCAST CHANNEL
9.4 GAUSSIAN PRODUCT BROADCAST CHANNEL
9.5 VECTOR WRITING ON DIRTY PAPER
9.6 GAUSSIAN VECTOR BC WITH PRIVATE MESSAGES
9.6.1 Proof of Achievability
9.6.2 Gaussian Vector BC–MAC Duality
9.6.3 Proof of the Converse
9.6.4 GV-BC with More than Two Receivers
SUMMARY
BIBLIOGRAPHIC NOTES
PROBLEMS
APPENDIX 9A PROOF OF THE BC–MAC DUALITY LEMMA
APPENDIX 9B UNIQUENESS OF THE SUPPORTING LINE
CHAPTER 10 Distributed Lossless Compression
10.1 DISTRIBUTED LOSSLESS SOURCE CODING FOR A 2-DMS
10.2 INNER AND OUTER BOUNDS ON THE OPTIMAL RATE REGION
10.3 SLEPIAN–WOLF THEOREM
10.3.1 Lossless Source Coding via Random Binning
10.3.2 Achievability Proof of the Slepian–Wolf Theorem
10.4 LOSSLESS SOURCE CODING WITH A HELPER
10.4.1 Proof of Achievability
10.4.2 Proof of the Converse
10.5 EXTENSIONS TO MORE THAN TWO SOURCES
SUMMARY
BIBLIOGRAPHIC NOTES
PROBLEMS
CHAPTER 11 Lossy Compression with Side Information
11.1 SIMPLE SPECIAL CASES
11.2 CAUSAL SIDE INFORMATION AVAILABLE AT THE DECODER
11.2.1 Proof of Achievability
11.2.2 Proof of the Converse
11.2.3 Lossless Source Coding with Causal Side Information
11.3 NONCAUSAL SIDE INFORMATION AVAILABLE AT THE DECODER
11.3.1 Proof of Achievability
11.3.2 Proof of the Converse
11.3.3 Source–Channel Coding Dualities
11.4 SOURCE CODING WHEN SIDE INFORMATION MAY BE ABSENT
SUMMARY
BIBLIOGRAPHIC NOTES
PROBLEMS
APPENDIX 11A PROOF OF LEMMA 11.1
CHAPTER 12 Distributed Lossy Compression
12.1 BERGER–TUNG INNER BOUND
12.1.1 Markov and Mutual Packing Lemmas
12.1.2 Proof of the Berger–Tung Inner Bound
12.2 BERGER–TUNG OUTER BOUND
12.3 QUADRATIC GAUSSIAN DISTRIBUTED SOURCE CODING
12.3.1 Proof of Achievability
12.3.2 Proof of the Converse
12.4 QUADRATIC GAUSSIAN CEO PROBLEM
12.4.1 Proof of the Converse
12.5* SUBOPTIMALITY OF BERGER–TUNG CODING
SUMMARY
BIBLIOGRAPHIC NOTES
PROBLEMS
APPENDIX 12A PROOF OF THE MARKOV LEMMA
APPENDIX 12C PROOF OF LEMMA 12.4
APPENDIX 12D PROOF OF LEMMA 12.6
CHAPTER 13 Multiple Description Coding
13.1 MULTIPLE DESCRIPTION CODING FOR A DMS
13.2 SIMPLE SPECIAL CASES
13.3 EL GAMAL–COVER INNER BOUND
13.3.1 Proof of the El Gamal–Cover Inner Bound
13.4 QUADRATIC GAUSSIAN MULTIPLE DESCRIPTION CODING
13.4.1 Proof of Achievability
13.4.2 Proof of the Converse
13.5 SUCCESSIVE REFINEMENT
13.5.1 Successively Refinable Sources
13.6 ZHANG–BERGER INNER BOUND
SUMMARY
BIBLIOGRAPHIC NOTES
PROBLEMS
CHAPTER 14 Joint Source–Channel Coding
14.1 LOSSLESS COMMUNICATION OF A 2-DMS OVER A DM-MAC
14.1.1 A Joint Source–Channel Coding Scheme
14.1.2 Proof of Theorem 14.1
14.1.3 Common Part of a 2-DMS
14.1.4 Three-Index Separate Source and Channel Coding Scheme
14.1.5 A Joint Source–Channel Coding Scheme with Common Part
14.2 LOSSLESS COMMUNICATION OF A 2-DMS OVER A DM-BC
14.2.1 Gray–Wyner System
14.2.2 Common Information
14.2.3 A Separate Source–Channel Coding Scheme
14.2.4 A Joint Source–Channel Coding Scheme
14.2.5 Proof of Theorem 14.4
14.3 A GENERAL SINGLE-HOP NETWORK
14.3.1 Separate Source and Channel Coding Scheme
14.3.2* A Hybrid Source–Channel Coding Scheme
SUMMARY
BIBLIOGRAPHIC NOTES
PROBLEMS
APPENDIX 14A PROOF OF LEMMA 14.1
PART III MULTIHOP NETWORKS
CHAPTER 15 Graphical Networks
15.1 GRAPHICAL MULTICAST NETWORK
15.2 CAPACITY OF GRAPHICAL UNICAST NETWORK
15.3 CAPACITY OF GRAPHICAL MULTICAST NETWORK
15.3.1 Linear Network Coding
15.3.2 Achievability Proof of the Network Coding Theorem
15.4 GRAPHICAL MULTIMESSAGE NETWORK
15.4.1 Graphical Multimessage Multicast Network
15.4.2 Graphical Multiple-Unicast Network
SUMMARY
BIBLIOGRAPHIC NOTES
PROBLEMS
APPENDIX 15A PROOF OF LEMMA 15.1
CHAPTER 16 Relay Channels
16.1 DISCRETE MEMORYLESS RELAY CHANNEL
16.2 CUTSET UPPER BOUND ON THE CAPACITY
16.3 DIRECT-TRANSMISSION LOWER BOUND
16.4 DECODE–FORWARD LOWER BOUND
16.4.1 Multihop Lower Bound
16.4.2 Coherent Multihop Lower Bound
16.4.3 Decode–Forward Lower Bound
16.4.4 Proof via Backward Decoding
16.4.5 Proof via Binning
16.5 GAUSSIAN RELAY CHANNEL
16.5.1 Upper and Lower Bounds on the Capacity of the Gaussian RC
16.6 PARTIAL DECODE–FORWARD LOWER BOUND
16.6.1 Semideterministic DM-RC
16.6.2 Relay Channel with Orthogonal Sender Components
16.6.3 SFD Gaussian Relay Channel
16.7 COMPRESS–FORWARD LOWER BOUND
16.7.1 Proof of the Compress–Forward Lower Bound
16.7.2 Compress–Forward for the Gaussian RC
16.7.3 Relay Channel with Orthogonal Receiver Components
16.8 RFD GAUSSIAN RELAY CHANNEL
16.8.1 Linear Relaying for RFD Gaussian RC
16.8.2 Amplify–Forward
16.8.3* Linear Relaying Capacity of the RFD Gaussian RC
16.9 LOOKAHEAD RELAY CHANNELS
16.9.1 Noncausal Relay Channels
16.9.2 Causal Relay Channels
16.9.3 Coherent Cooperation
SUMMARY
BIBLIOGRAPHIC NOTES
PROBLEMS
APPENDIX 16A CUTSET BOUND FOR THE GAUSSIAN RC
APPENDIX 16B PARTIAL DECODE–FORWARD FOR THE GAUSSIAN RC
APPENDIX 16C EQUIVALENT COMPRESS–FORWARD LOWER BOUND
CHAPTER 17 Interactive Channel Coding
17.1 POINT-TO-POINT COMMUNICATION WITH FEEDBACK
17.1.1 Schalkwijk–Kailath Coding Scheme for the Gaussian Channel
17.1.2* Horstein’s Coding Scheme for the BSC
17.1.3 Block Feedback Coding Scheme for the BSC
17.2 MULTIPLE ACCESS CHANNEL WITH FEEDBACK
17.2.1 Cover–Leung Inner Bound
17.2.2 Proof of the Cover–Leung Inner Bound
17.2.3 Gaussian MAC with Feedback
17.2.4 Achievability Proof of Theorem 17.2
17.3 BROADCAST CHANNEL WITH FEEDBACK
17.4 RELAY CHANNEL WITH FEEDBACK
17.5 TWO-WAY CHANNEL
17.5.1 Simple Inner and Outer Bounds
17.5.2 Dependence Balance Bound
17.6 DIRECTED INFORMATION
17.6.1 Multiletter Characterization of the DM-TWC Capacity Region
17.6.2 Proof of the Converse
17.6.3 Proof of Achievability
SUMMARY
BIBLIOGRAPHIC NOTES
PROBLEMS
APPENDIX 17A PROOF OF LEMMA 17.1
CHAPTER 18 Discrete Memoryless Networks
18.1 DISCRETE MEMORYLESS MULTICAST NETWORK
18.2 NETWORK DECODE–FORWARD
18.2.1 Sliding Window Decoding for the DM-RC
18.2.2 Proof of the Network Decode–Forward Lower Bound
18.3 NOISY NETWORK CODING
18.3.1 Deterministic Multicast Network
18.3.2 Wireless Erasure Multicast Network
18.3.3 Noisy Network Coding for the DM-RC
18.3.4 Proof of the Noisy Network Coding Lower Bound
18.4 DISCRETE MEMORYLESS MULTIMESSAGE NETWORK
18.4.1 Noisy Network Coding for the Multimessage Multicast Network
18.4.2* Noisy Network Coding for General Multimessage Networks
SUMMARY
BIBLIOGRAPHIC NOTES
PROBLEMS
CHAPTER 19 Gaussian Networks
19.1 GAUSSIAN MULTIMESSAGE NETWORK
19.1.1 Noisy Network Coding Lower Bound
19.1.2 Gaussian Two-Way Relay Channel
19.1.3 Multimessage Multicast Capacity Region within a Constant Gap
19.2 CAPACITY SCALING LAWS
19.3 GUPTA–KUMAR RANDOM NETWORK
19.3.1 Proof of the Lower Bound
19.3.2 Proof of the Upper Bound
SUMMARY
BIBLIOGRAPHIC NOTES
PROBLEMS
APPENDIX 19A PROOF OF LEMMA 19.1
APPENDIX 19B PROOF OF LEMMA 19.2
CHAPTER 20 Compression over Graphical Networks
20.1 DISTRIBUTED LOSSLESS SOURCE–NETWORK CODING
20.2 MULTIPLE DESCRIPTION NETWORK CODING
20.2.1 Cascade Multiple Description Network
20.2.2 Triangular Multiple Description Network
20.3 INTERACTIVE SOURCE CODING
20.3.1 Two-Way Lossless Source Coding
20.3.2 CFO Problem
20.3.3 Two-Way Lossy Source Coding
SUMMARY
BIBLIOGRAPHIC NOTES
PROBLEMS
APPENDIX 20A PROOF OF LEMMA 20.1
PART IV EXTENSIONS
CHAPTER 21 Communication for Computing
21.1 CODING FOR COMPUTING WITH SIDE INFORMATION
21.1.1 Lossless Coding for Computing
21.2 DISTRIBUTED CODING FOR COMPUTING
21.2.1 μ-Sum Problem
21.2.2 Distributed Lossless Computing
21.3 INTERACTIVE CODING FOR COMPUTING
21.3.1 Interactive Coding for Lossless Computing
21.4 CASCADE CODING FOR COMPUTING
21.5 DISTRIBUTED LOSSY AVERAGING
21.6 COMPUTING OVER A MAC
SUMMARY
BIBLIOGRAPHIC NOTES
PROBLEMS
CHAPTER 22 Information Theoretic Secrecy
22.1 WIRETAP CHANNEL
22.1.1 Achievability Proof of Theorem 22.1
22.1.2 Converse Proof of Theorem 22.1
22.1.3 Extensions
22.2 CONFIDENTIAL COMMUNICATION VIA SHARED KEY
22.3 SECRET KEY AGREEMENT: SOURCE MODEL
22.3.1 Secret Key Agreement from One-Way Communication
22.3.2 Achievability Proof of Theorem 22.4
22.3.3 Lower Bound on the Secret Key Capacity
22.3.4 Upper Bound on the Secret Key Capacity
22.3.5 Extensions to Multiple Nodes
22.4 SECRET KEY AGREEMENT: CHANNEL MODEL
22.4.1 Maurer’s Example
22.4.2 Lower and Upper Bounds on the Secret Key Capacity
22.4.3 Connection between Key Agreement and the Wiretap Channel
SUMMARY
BIBLIOGRAPHIC NOTES
PROBLEMS
APPENDIX 22A PROOF OF LEMMA 22.1
APPENDIX 22B PROOF OF LEMMA 22.2
APPENDIX 22C PROOF OF LEMMA 22.3
CHAPTER 23 Wireless Fading Channels
23.1 GAUSSIAN FADING CHANNEL
23.2 CODING UNDER FAST FADING
23.3 CODING UNDER SLOW FADING
23.3.1 Channel Gain Available Only at the Decoder
23.3.2 Channel Gain Available at Both the Encoder and the Decoder
23.4 GAUSSIAN VECTOR FADING CHANNEL
23.5 GAUSSIAN FADING MAC
23.5.1 Fast Fading
23.5.2 Slow Fading
23.6 GAUSSIAN FADING BC
23.7 GAUSSIAN FADING IC
SUMMARY
BIBLIOGRAPHIC NOTES
PROBLEMS
CHAPTER 24 Networking and Information Theory
24.1 RANDOM DATA ARRIVALS
24.2 RANDOM ACCESS CHANNEL
24.3 ASYNCHRONOUS MAC
24.3.1 Proof of Achievability
24.3.2 Proof of the Converse
SUMMARY
BIBLIOGRAPHIC NOTES
PROBLEMS
APPENDIX 24A PROOF OF LEMMA 24.1
APPENDIX 24B PROOF OF LEMMA 24.2
APPENDICES
APPENDIX A Convex Sets and Functions
APPENDIX B Probability and Estimation
Probability Bounds and Limits
Functional Representation Lemma
Mean Squared Error Estimation
Gaussian Random Vectors
APPENDIX C Cardinality Bounding Techniques
Convex Cover Method
Extension to Multiple Random Variables
Perturbation Method
APPENDIX D Fourier–Motzkin Elimination
APPENDIX E Convex Optimization
Bibliography
Common Symbols
Author Index
Subject Index