
Cover
Sparse and Redundant Representations: From Theory to Applications in Signal and Image Processing
Copyright
9781441970107
Foreword
Preface
Acknowledgments
Contents
Part I Sparse and Redundant Representations – Theoretical and Numerical Foundations
Chapter 1 Prologue
1.1 Underdetermined Linear Systems
1.2 Regularization
1.3 The Temptation of Convexity
1.4 A Closer Look at l1 Minimization
1.5 Conversion of (P1) to Linear Programming
1.6 Promoting Sparse Solutions
1.7 The l0-Norm and Implications
1.8 The (P0) Problem – Our Main Interest
1.9 The Signal Processing Perspective
Further Reading
Chapter 2 Uniqueness and Uncertainty
2.1 Treating the Two-Ortho Case
2.1.1 An Uncertainty Principle
2.1.2 Uncertainty of Redundant Solutions
2.1.3 From Uncertainty to Uniqueness
2.2 Uniqueness Analysis for the General Case
2.2.1 Uniqueness via the Spark
2.2.2 Uniqueness via the Mutual-Coherence
2.2.3 Uniqueness via the Babel Function
2.2.4 Upper-Bounding the Spark
2.3 Constructing Grassmannian Matrices
2.4 Summary
Further Reading
Chapter 3 Pursuit Algorithms – Practice
3.1 Greedy Algorithms
3.1.1 The Core Idea
3.1.2 The Orthogonal-Matching-Pursuit
3.1.3 Other Greedy Methods
3.1.4 Normalization
3.1.5 Rate of Decay of the Residual in Greedy Methods
3.1.6 Thresholding Algorithm
3.1.7 Numerical Demonstration of Greedy Algorithms
3.2 Convex Relaxation Techniques
3.2.1 Relaxation of the l0-Norm
3.2.2 Numerical Algorithms for Solving (P1)
3.2.3 Numerical Demonstration of Relaxation Methods
3.3 Summary
Further Reading
Chapter 4 Pursuit Algorithms – Guarantees
4.1 Back to the Two-Ortho Case
4.1.1 OMP Performance Guarantee
4.1.2 BP Performance Guarantee
4.2 The General Case
4.2.1 OMP Performance Guarantee
4.2.2 Thresholding Performance Guarantee
4.2.3 BP Performance Guarantee
4.2.4 Performance of Pursuit Algorithms – Summary
4.3 The Role of the Sign-Pattern
4.4 Tropp’s Exact Recovery Condition
4.5 Summary
Further Reading
Chapter 5 From Exact to Approximate Solutions
5.1 General Motivation
5.2 Stability of the Sparsest Solution
5.2.1 Uniqueness versus Stability – Gaining Intuition
5.2.2 Theoretical Study of the Stability of (P0)
5.2.3 The RIP and Its Use for Stability Analysis
5.3 Pursuit Algorithms
5.3.1 OMP and BP Extensions
5.3.2 Iteratively-Reweighted-Least-Squares (IRLS)
5.3.3 The LARS Algorithm
5.3.4 Quality of Approximations Obtained
5.4 The Unitary Case
5.5 Performance of Pursuit Algorithms
5.5.1 BPDN Stability Guarantee
5.5.2 Thresholding Stability Guarantee
5.6 Summary
Further Reading
Chapter 6 Iterative-Shrinkage Algorithms
6.1 Background
6.2 The Unitary Case – A Source of Inspiration
6.2.1 Shrinkage for the Unitary Case
6.2.2 The BCR Algorithm and Variations
6.3 Developing Iterative-Shrinkage Algorithms
6.3.1 Surrogate Functions and the Prox Method
6.3.2 EM and Bound-Optimization Approaches
6.3.3 An IRLS-Based Shrinkage Algorithm
6.3.4 The Parallel-Coordinate-Descent (PCD) Algorithm
6.3.5 StOMP: A Variation on Greedy Methods
6.3.6 Bottom Line – Iterative-Shrinkage Algorithms
6.4 Acceleration Using Line-Search and SESOP
6.5 Iterative-Shrinkage Algorithms: Tests
6.6 Summary
Further Reading
Chapter 7 Towards Average Performance Analysis
7.1 Empirical Evidence Revisited
7.2 A Glimpse into Probabilistic Analysis
7.2.1 The Analysis Goals
7.2.2 Two-Ortho Analysis by Candes & Romberg
7.2.3 Probabilistic Uniqueness
7.2.4 Donoho’s Analysis
7.2.5 Summary
7.3 Average Performance of Thresholding
7.3.1 Preliminaries
7.3.2 The Analysis
7.3.3 Discussion
7.4 Summary
Further Reading
Chapter 8 The Dantzig-Selector Algorithm
8.1 Dantzig-Selector versus Basis-Pursuit
8.2 The Unitary Case
8.3 Revisiting the Restricted Isometry Machinery
8.4 Dantzig-Selector Performance Guarantee
8.5 Dantzig-Selector in Practice
8.6 Summary
Further Reading
Part II From Theory to Practice – Signal and Image Processing Applications
Chapter 9 Sparsity-Seeking Methods in Signal Processing
9.1 Priors and Transforms for Signals
9.2 The Sparse-Land Model
9.3 Geometric Interpretation of Sparse-Land
9.4 Processing of Sparsely-Generated Signals
9.5 Analysis Versus Synthesis Signal Modeling
9.6 Summary
Further Reading
Chapter 10 Image Deblurring – A Case Study
10.1 Problem Formulation
10.2 The Dictionary
10.3 Numerical Considerations
10.4 Experiment Details and Results
10.5 Summary
Further Reading
Chapter 11 MAP versus MMSE Estimation
11.1 A Stochastic Model and Estimation Goals
11.2 Background on MAP and MMSE
11.3 The Oracle Estimation
11.3.1 Developing the Oracle Estimator
11.3.2 The Oracle Error
11.4 The MAP Estimation
11.4.1 Developing the MAP Estimator
11.4.2 Approximating the MAP Estimator
11.5 The MMSE Estimation
11.5.1 Developing the MMSE Estimator
11.5.2 Approximating the MMSE Estimator
11.6 MMSE and MAP Errors
11.7 More Experimental Results
11.8 Summary
Further Reading
Chapter 12 The Quest for a Dictionary
12.1 Choosing versus Learning
12.2 Dictionary-Learning Algorithms
12.2.1 Core Questions in Dictionary-Learning
12.2.2 The MOD Algorithm
12.2.3 The K-SVD Algorithm
12.3 Training Structured Dictionaries
12.3.1 The Double-Sparsity Model
12.3.2 Union of Unitary Bases
12.3.3 The Signature Dictionary
12.4 Summary
Further Reading
Chapter 13 Image Compression – Facial Images
13.1 Compression of Facial Images
13.2 Previous Work
13.3 Sparse-Representation-Based Coding Scheme
13.3.1 The General Scheme
13.3.2 VQ Versus Sparse Representations
13.4 More Details and Results
13.4.1 K-SVD Dictionaries
13.4.2 Reconstructed Images
13.4.3 Run-Time and Memory Usage
13.4.4 Comparing to Other Techniques
13.4.5 Dictionary Redundancy
13.5 Post-Processing for Deblocking
13.5.1 The Blockiness Artifacts
13.5.2 Possible Approaches For Deblocking
13.5.3 Learning-Based Deblocking Approach
13.6 Deblocking Results
13.7 Summary
Further Reading
Chapter 14 Image Denoising
14.1 General Introduction – Image Denoising
14.2 The Beginning: Global Modeling
14.2.1 The Core Image-Denoising Algorithm
14.2.2 Various Improvements
14.3 From Global to Local Modeling
14.3.1 The General Methodology
14.3.2 Learning the Shrinkage Curves
14.3.3 Learned Dictionary and Globalizing the Prior
14.3.4 The Non-Local-Means Algorithm
14.3.5 3D-DCT Shrinkage: BM3D Denoising
14.4 SURE for Automatic Parameter Setting
14.4.1 Development of the SURE
14.4.2 Demonstrating SURE to Global-Thresholding
14.5 Summary
Further Reading
Chapter 15 Other Applications
15.1 General
15.2 Image Separation via MCA
15.2.1 Image = Cartoon + Texture
15.2.2 Global MCA for Image Separation
15.2.3 Local MCA for Image Separation
15.3 Image Inpainting and Impulsive Noise Removal
15.3.1 Inpainting Sparse-Land Signals – Core Principles
15.3.2 Inpainting Images – Local K-SVD
15.3.3 Inpainting Images – The Global MCA
15.3.4 Impulse-Noise Filtering
15.4 Image Scale-Up
15.4.1 Modeling the Problem
15.4.2 The Super-Resolution Algorithm
15.4.2.1 Training Phase
15.4.2.2 Testing Phase: Scaling-Up an Image
15.4.3 Scaling-Up Results
15.4.4 Image Scale-Up: Summary
15.5 Summary
Further Reading
Chapter 16 Epilogue
16.1 What is it All About?
16.2 What is Still Missing?
16.3 Bottom Line
Notation
Acronyms
Index
Sparse and Redundant Representations
Sparse and Redundant Representations
From Theory to Applications in Signal and Image Processing

Michael Elad
The Technion – Israel Institute of Technology
Haifa, Israel
Michael Elad
The Technion – Israel Institute of Technology
Computer Science Department
32000 Haifa, Israel
elad@cs.technion.ac.il

e-ISBN 978-1-4419-7011-4
ISBN 978-1-4419-7010-7
DOI 10.1007/978-1-4419-7011-4
Springer New York Dordrecht Heidelberg London

Library of Congress Control Number: 2010933513
Mathematics Subject Classification (2010): 94A12, 62H35, 62M40, 68U10, 94A08, 94H60, 46N10

© Springer Science+Business Media, LLC 2010
All rights reserved. This work may not be translated or copied in whole or in part without the written permission of the publisher (Springer Science+Business Media, LLC, 233 Spring Street, New York, NY 10013, USA), except for brief excerpts in connection with reviews or scholarly analysis. Use in connection with any form of information storage and retrieval, electronic adaptation, computer software, or by similar or dissimilar methodology now known or hereafter developed is forbidden.
The use in this publication of trade names, trademarks, service marks, and similar terms, even if they are not identified as such, is not to be taken as an expression of opinion as to whether or not they are subject to proprietary rights.

Printed on acid-free paper

Springer is part of Springer Science+Business Media (www.springer.com)
To my loving and devoted family, My parents, Rita and Shmuel Elbaz, who have shown me the way, My children Sharon, Adar, and Yuval, who give purpose to this way, and My wife, Ritta, who constantly paves this way with love and understanding.
Foreword

A long long time ago, echoing philosophical and aesthetic principles that have existed since antiquity, William of Ockham enounced the principle of parsimony, better known today as Ockham’s razor: “Entities should not be multiplied without necessity.” This principle enabled scientists to select the “best” physical laws and theories to explain the workings of the Universe, and it has continued to guide scientific research, leading to beautiful results like the minimal-description-length approach to statistical inference and the related Kolmogorov-complexity approach to pattern recognition.

However, notions of complexity and description length are subjective concepts and depend on the language “spoken” when presenting ideas and results. The field of sparse representations, which recently underwent a Big-Bang-like expansion, explicitly deals with the Yin-Yang interplay between the parsimony of descriptions and the “language” or “dictionary” used in them, and it has become an extremely exciting area of investigation. It has already yielded a rich crop of mathematically pleasing, deep, and beautiful results that have quickly translated into a wealth of practical engineering applications.

You are holding in your hands the first guide-book to Sparseland, and I am sure you will find in it both familiar and new landscapes to see and admire, as well as excellent pointers that will help you find further valuable treasures. Enjoy the journey to Sparseland!

Haifa, Israel, December 2009
Alfred M. Bruckstein