
Recursive Estimation and Time-Series Analysis.pdf

Cover
Recursive Estimation and Time-Series Analysis, 2nd Edition
ISBN 9783642219801
Preface
Contents
Chapter 1: Introduction
1.1 The Historical Context
1.2 The Contents of the Book
1.3 Software
1.4 The Aims of the Book
Part I: Recursive Estimation of Parameters in Linear Regression Models
Chapter 2: Recursive Estimation: A Simple Tutorial Introduction
2.1 Recursive Estimation of the Mean Value of a Random Variable
2.1.1 Filtering interpretation of the recursive algorithm
2.2 Recursive Least Squares Estimation for a Single Unknown Parameter
2.3 Exercises
2.4 Summary
Chapter 3: Recursive Least Squares Estimation
3.1 The Deterministic Recursive Linear Least Squares Algorithm
3.2 The Stochastic Recursive Linear Least Squares Algorithm
3.3 Some Cautionary Comments: Multiple Collinearity and Errors-in-Variables
3.3.1 Multiple collinearity
3.3.2 Errors-in-variables, the structural model and instrumental variable estimation
3.4 Connection with Stochastic Approximation
3.5 Exercises
3.6 Summary
Chapter 4: Recursive Estimation of Time Variable Parameters in Regression Models
4.1 Shaping the Memory of the Estimator
4.1.1 The moving Rectangular Window (RW)
4.1.2 The moving Exponentially-Weighted-Past (EWP) window
4.2 Modelling the Parameter Variations
4.2.1 The complete TVP regression model
4.3 Vector Measurements
4.4 The Kalman Filter
4.5 Recursive Fixed-Interval Smoothing
4.5.1 Simple FIS estimation
4.5.2 Optimal FIS algorithms
4.5.3 Optimization of hyper-parameters
4.5.4 Implementation of the KALMSMO algorithm
4.5.5 The physical nature of FIS estimation
4.6 Final Recursive Analysis of the Walgett Data
4.6.1 Statistical methods for detecting the presence of parameter variation
4.7 Simplified Constant Gain Algorithms for Time Variable Parameter Estimation
4.8 The Estimation of Rapidly Varying Parameters
4.9 Variance Intervention
4.10 Exercises
4.11 Summary
Chapter 5: Unobserved Component Models
5.1 The Dynamic Linear Regression (DLR) Model
5.1.1 Example: DLR analysis of LIDAR data
5.2 The Dynamic Harmonic Regression (DHR) Model and Time Series Forecasting
5.2.1 Spectral analysis of the DHR model
5.2.1.1 The pseudo-spectra of Generalized Random Walk models
5.2.1.2 The pseudo-spectra of the full DHR model
5.2.1.3 Estimation in the frequency domain
5.2.2 The complete DHR estimation algorithm
5.2.3 Practical example: Signals Passed At Danger (SPAD) data
5.2.4 Extensions of DHR model analysis
5.3 The Dynamic AutoRegression (DAR) Model and Time-Frequency Analysis
5.3.1 Practical example: palaeoclimatic data analysis
5.4 Dynamic ARX and FIR Models
5.4.1 Example: DARX model estimation
5.5 Forecasting with UC models
5.6 Exercises
5.7 Summary
Part II: Recursive Estimation of Parameters in Transfer Function Models
Chapter 6: Transfer Function Models and the Limitations of Recursive Least Squares
6.1 Introduction: Direct Estimation of the State Space Model
6.2 From State Space to Observation Space
6.3 Least Squares Estimation: Its Advantages and Limitations
6.4 Least Squares Estimation: ARX Model Estimation
6.4.1 Example 6.1: Estimation of a simple ARX model
6.5 Least Squares Estimation: FIR Model Estimation
6.5.1 Example 6.2: Estimation of a simple FIR model
6.6 Identifiability
6.6.1 Choice of input signals
6.6.2 Restrictions on the system to be identified
6.6.3 The more general case
6.6.4 Noise process identifiability
6.6.5 Some concluding comments on identifiability
6.7 Recursive Estimation of Transfer Function Models
6.8 Standard Instrumental Variable (SIV) Estimation
6.8.1 Statistical properties of SIV estimates
6.9 Model Structure Identification
6.9.1 Example 6.3: Recursive SIV parameter estimation for a simple TF model
6.9.2 Example 6.4: SIV estimation of tracer experiment data
6.10 Dynamic Transfer Function Models
6.10.1 Example 6.5: DTF model estimation
6.10.2 Example: Using DTF estimation for model diagnosis
6.11 Multivariable Transfer Function Models
6.12 Exercises
6.13 Summary
Chapter 7: Optimal Identification and Estimation of Discrete-Time Transfer Function Models
7.1 Refined Instrumental Variable Estimation
7.1.1 Recursive-iterative instrumental variable estimation
7.1.2 The system TF estimation model and RIV estimation
7.1.3 The ARMA noise estimation model and IVARMA estimation
7.2 The Recursive-Iterative Algorithms
7.2.1 Implementation of the RIV and SRIV algorithms
7.2.2 Convergence of the iterative algorithms
7.2.3 Theoretical justification of the RIV method
7.3 Model Structure Identification
7.4 Input Noise and Errors-in-Variables
7.4.1 Example 7.1: Errors-in-Variables: the effects of input noise
7.5 Examples of RIV Identification and Estimation: SISO Models
7.5.1 Example 7.2: Discrete-time simulation example 1
7.5.2 Example 7.3: Discrete-time simulation example 2
7.5.3 Example 7.4: Evaluating over-parameterization
7.5.4 Example 7.5: RIV estimation of tracer experiment data
7.6 MISO Model Estimation
7.7 Examples of SRIVDD Identification and Estimation for MISO Models
7.7.1 Example 7.6: Simulation results for a 3-input MISO model
7.7.2 Example 7.7: SRIVDD modelling of a MISO winding process
7.7.3 Example 7.8: SRIVDD evaluation: Monte Carlo simulation results
7.8 Optimal Prefilters and an Adaptive Kalman Filter
7.9 Exercises
7.10 Summary
Chapter 8: Optimal Identification and Estimation of Continuous-Time Transfer Function Models
8.1 Introduction
8.2 TF Conversion Between Continuous and Discrete-Time
8.3 Hybrid Continuous-Time RIVC Estimation
8.4 The Iterative RIVC and SRIVC Algorithms
8.5 Model Structure Identification and MISO Model Estimation
8.6 Examples of RIVC Identification and Estimation
8.6.1 Example 8.1: Continuous-time simulation example
8.6.2 Example 8.2: RIVC SISO estimation of tracer experiment data
8.6.3 Example 8.3: SRIVCDD MISO modelling of the tracer experiment data
8.6.4 Example 8.4: Evaluation of a Global Circulation Model
8.7 Sampling Considerations
8.7.1 The effects of sampling interval on estimation accuracy
8.8 Exercises
8.9 Summary and Conclusions
Chapter 9: Identification of Transfer Function Models in Closed-Loop
9.1 Introduction
9.2 The Generalized Box-Jenkins Model in a Closed-Loop Context
9.3 Closed-Loop Identification and Estimation
9.3.1 Simple CLSRIV and CLSRIVC two-stage closed-loop estimation
9.3.2 Three-stage CLRIV and CLRIVC closed-loop estimation
9.3.3 Unstable systems
9.4 Simulation Examples
9.4.1 Example 9.1: Closed-loop, stable, discrete-time system estimation
9.4.2 Example 9.2: Closed-loop, stable, continuous-time system estimation
9.4.3 Example 9.3: Closed-loop, unstable and marginally stable system estimation
9.5 Exercises
9.6 Summary and Conclusions
Chapter 10: Real-Time Recursive Parameter Estimation
10.1 Prediction Error (PE) Methods and the RPEM Algorithm
10.1.1 Statistical Properties of the PEM Estimates for the BJ Model
10.2 Real-Time Recursive RRIV Estimation
10.2.1 The recursive algorithms
10.2.2 Implementation
10.2.3 Example 10.1: RSRIV estimation of TVPs in a simulated 2nd order TF model
10.3 The Extended Kalman Filter
10.3.1 The EKF in a recursive prediction error form
10.4 Computationally Intensive Methods of Recursive Estimation
10.5 Example 10.2: Data Assimilation and Adaptive Forecasting
10.5.1 The DBM model
10.5.2 Parameter updating by RRIV estimation
10.5.3 State updating by the Kalman Filter
10.5.4 Typical adaptive forecasting results
10.6 Exercises
10.7 Summary
Part III: Other Topics
Chapter 11: State-Dependent Parameter (SDP) Estimation
11.1 Introduction
11.2 SDP Identification of Nonlinear Input-Output Systems
11.2.1 Full SDP estimation
11.2.2 Parameterization and final nonlinear model estimation
11.2.3 The problem of Errors-in-Variables (EIV) estimation
11.3 SDP Identification of Purely Stochastic Nonlinear Systems
11.4 SDP Estimation in Hydrology
11.5 SDP Estimation in Forecasting and Automatic Control
11.6 Exercises
11.7 Summary and Conclusions
Chapter 12: Data-Based Mechanistic (DBM) Modelling
12.1 Introduction
12.2 A Brief Review of Data-Based Mechanistic Modelling
12.3 Model Order Reduction
12.3.1 The ALSTOM gasifier example
12.4 Large Computer Model Emulation
12.5 An Illustrative Example: DBM Modelling of Pollution Transport in a Wetland Area
12.5.1 The large simulation model
12.5.2 Emulation modelling
12.5.3 Modelling from real data
12.6 Exercises
12.7 Conclusions
Epilogue: Good, Bad or Optimal?
Appendix A: The K. F. Gauss Derivation of Recursive Least Squares
Appendix B: Basic Mathematical and Statistical Background
B.1 Matrix Algebra
B.1.1 Matrices
B.1.2 Vectors
B.1.3 Matrix Addition (or Subtraction)
B.1.4 Matrix or Vector Transpose
B.1.5 Matrix Multiplication
B.1.6 Determinant of a Matrix
B.1.7 Partitioned Matrices
B.1.8 Inverse of a Matrix
B.1.9 Quadratic Forms
B.1.10 Positive Definite or Semi-Definite Matrices
B.1.11 The Rank of a Matrix
B.1.12 Differentiation of Vectors and Matrices
B.1.13 Cholesky Decomposition
B.1.14 Singular Value Decomposition (SVD)
B.2 Statistics and Probability
B.2.1 Discrete Random Variables
B.2.2 Law of Large Numbers
B.2.3 Discrete Random Vectors
B.2.4 Conditional Probabilities
B.2.5 Continuous Random Variables and Vectors
B.2.6 The Normal or Gaussian Density Function
B.2.7 Properties of Estimators
B.2.8 The Likelihood Function and Maximum Likelihood Estimation
B.2.9 The Cramér-Rao Lower Bound
B.2.10 Maximum Likelihood Estimators: the Vector Case
B.3 Time Series
B.3.1 Gauss-Markov Random Processes
B.3.2 The State Space Model of a Linear, Discrete-time, Stochastic Dynamic System
B.3.3 The Discrete-Time Transfer Function Model
B.3.4 Continuous-time, Stochastic Dynamic Models
B.3.5 Hybrid Stochastic Dynamic Models
B.3.6 Multivariable (Multi-Input, Multi-Output) TF Models
B.3.7 Physical Interpretation of TF Models
B.3.7.1 TF Conversion Between Continuous and Discrete-Time
B.3.7.2 Frequency Domain Representations of TF Models
B.3.8 Differentiation of a TF with Respect to a Given Parameter
B.3.9 A Simple Introduction to Monte Carlo Simulation
B.3.9.1 The Generation of Random Realizations
B.3.9.2 Advanced Monte Carlo Methods
Appendix C: Stochastic Approximation
C.1 Some Extensions to Stochastic Approximation
C.1.1 Matrix gain SA and optimum algorithms
C.1.2 Continuous-time algorithms
C.1.3 Search algorithms
C.1.4 Acceleration of convergence
C.2 Summary
Appendix D: Deterministic Regularization and Stochastic Fixed Interval Smoothing
D.1 Wiener-Kolmogorov-Whittle Optimal Signal Extraction
Appendix E: The Instantaneous Cost Function Associated with the Recursive Least Squares Algorithm
Appendix F: Maximum Likelihood Derivation of the Refined Instrumental Variable Algorithm
F.1 RIV System Model Estimation Within the Context of Maximum Likelihood
Appendix G: The CAPTAIN Toolbox for Matlab: an Overview
Glossary
References
Index
Peter C. Young
Recursive Estimation and Time-Series Analysis
An Introduction for the Student and Practitioner
Second Edition
Prof. Peter C. Young
Green Meadows, Stanmore Drive
LA1 5BL Haverbreaks, Lancaster
United Kingdom
p.young@lancaster.ac.uk

ISBN 978-3-642-21980-1
e-ISBN 978-3-642-21981-8
DOI 10.1007/978-3-642-21981-8
Springer Heidelberg Dordrecht London New York
Library of Congress Control Number: 2011935045

© Springer-Verlag Berlin Heidelberg 2011

This work is subject to copyright. All rights are reserved, whether the whole or part of the material is concerned, specifically the rights of translation, reprinting, reuse of illustrations, recitation, broadcasting, reproduction on microfilm or in any other way, and storage in data banks. Duplication of this publication or parts thereof is permitted only under the provisions of the German Copyright Law of September 9, 1965, in its current version, and permission for use must always be obtained from Springer. Violations are liable to prosecution under the German Copyright Law.

The use of general descriptive names, registered names, trademarks, etc. in this publication does not imply, even in the absence of a specific statement, that such names are exempt from the relevant protective laws and regulations and therefore free for general use.

Cover design: SPi Publisher Services

Printed on acid-free paper

Springer is part of Springer Science+Business Media (www.springer.com)
To Wendy
Preface

This is a revised version of my 1984 book of the same name but, because so much time has elapsed since the publication of the first version, it has been considerably modified and enlarged to accommodate all the developments in recursive estimation and time series analysis that have occurred over the last quarter century. Also over this time, the CAPTAIN Toolbox for recursive estimation and time series analysis has been developed by my colleagues and me at Lancaster, for use in the Matlab™ software environment (see Appendix G). Consequently, the present version of the book is able to exploit the many computational routines that are contained in this widely available Toolbox, as well as some of the other routines in Matlab and its other toolboxes. The book is an introductory one on the topic of recursive estimation and it demonstrates how this approach to estimation, in its various forms, can be an impressive aid to the modelling of stochastic, dynamic systems. It is intended for undergraduate or Masters students who wish to obtain a grounding in this subject; or for practitioners in industry who may have heard of topics dealt with in this book and, while they want to know more about them, may have been deterred by the rather esoteric nature of some books in this challenging area of study. As such, it can also be considered as a primer for the eventual reading of these more advanced theoretical texts on the subject. However, it should be emphasized that the book also contains a considerable amount of novel material which does not appear in any other texts on the subject. There are many people who have influenced my work over many years and whom I wish to take this opportunity to thank.
First, my colleagues in the Environmental Science Department at Lancaster, Keith Beven, Arun Chotai, Wlodek Tych, Andrew Jarvis and Nick Chappell, as well as other colleagues in other Departments: Granville Tunnicliffe-Wilson, Peter Diggle and Jon Tawn in Mathematics and Statistics; James Taylor in Engineering; and Robert Fildes in the Management School. Of these, particular thanks are due to Keith Beven, who has continually encouraged my trespass into the hydrological world and has heavily influenced my research on both Data-Based Mechanistic (DBM) modelling and flood forecasting (see Chapter 12); Wlodek Tych, one of the major architects of the CAPTAIN Toolbox, who persuaded me to try Matlab out in the first place and who has solved many Matlab problems for me;