Autonomous Intelligent Vehicles
Preface
Contents
Part I: Autonomous Intelligent Vehicles
Chapter 1: Introduction
1.1 Research Motivation and Purpose
1.2 The Key Technologies of Intelligent Vehicles
1.2.1 Multi-sensor Fusion Based Environment Perception and Modeling
1.2.2 Vehicle Localization and Map Building
1.2.3 Path Planning and Decision-Making
1.2.4 Low-Level Motion Control
1.3 The Organization of This Book
References
Chapter 2: The State-of-the-Art in the USA
2.1 Introduction
2.2 Carnegie Mellon University-Boss
2.3 Stanford University-Junior
2.4 Virginia Polytechnic Institute and State University-Odin
2.5 Massachusetts Institute of Technology-Talos
2.6 Cornell University-Skynet
2.7 University of Pennsylvania and Lehigh University-Little Ben
2.8 Oshkosh Truck Corporation-TerraMax
References
Chapter 3: The Framework of Intelligent Vehicles
3.1 Introduction
3.2 Related Work
3.3 Interactive Safety Analysis Framework
References
Part II: Environment Perception and Modeling
Chapter 4: Road Detection and Tracking
4.1 Introduction
4.2 Related Work
4.2.1 Model-Based Approaches
4.2.2 Multi-cue Fusion Based Approach
4.2.3 Hypothesis-Validation Based Approaches
4.2.4 Neural Network Based Approaches
4.2.5 Stereo-Based Approaches
4.2.6 Temporal Correlation Based Approaches
4.2.7 Image Filtering Based Approaches
4.3 Lane Detection Using Adaptive Random Hough Transform
4.3.1 The Lane Shape Model
4.3.2 The Adaptive Random Hough Transform
A. Pixel Sampling on Edges
B. Multi-Resolution Parameter Estimating Strategy
4.3.3 Experimental Results
4.4 Lane Tracking
4.4.1 Particle Filtering
4.4.2 Lane Model
4.4.3 Dynamic System Model
4.4.4 The Imaging Model
4.4.5 The Algorithm Implementation
4.4.5.1 Factored Sampling
4.4.5.2 The Observation and Measure Models
4.4.5.3 The Algorithm Flow
4.5 Road Recognition Using a Mean Shift Algorithm
4.5.1 The Basic Mean Shift Algorithm
4.5.2 Various Applications of the Mean Shift Algorithm
Mean Shift Clustering
The Mean Shift Segmentation
Mean Shift Tracking
4.5.3 The Road Recognition Algorithm
4.5.4 Experimental Results and Analysis
References
Chapter 5: Vehicle Detection and Tracking
5.1 Introduction
5.2 Related Work
5.3 Generating Candidate ROIs
5.4 Multi-resolution Vehicle Hypothesis
5.5 Vehicle Validation using Gabor Features and SVM
5.5.1 Vehicle Representation
5.5.2 SVM Classifier
5.6 Boosted Gabor Features
5.6.1 Boosted Gabor Features Using AdaBoost
5.6.1.1 Gabor Feature
5.6.1.2 Boosted Gabor Features
5.6.2 Experimental Results and Analysis
5.6.2.1 Vehicle Database for Detection and Tracking
5.6.2.2 Boosted Gabor Features
5.6.2.3 Vehicle Detection Results and Discussions
References
Chapter 6: Multiple-Sensor Based Multiple-Object Tracking
6.1 Introduction
6.2 Related Work
6.3 Obstacles Stationary or Moving Judgement Using Lidar Data
6.4 Multi-obstacle Tracking and Situation Assessment
6.4.1 Multi-obstacle Tracking Based on EKF Using a Single Sensor
6.4.1.1 Probability Framework of Tracking
6.4.1.2 System Model
6.4.1.3 Initial Conditions
6.4.1.4 Data Association for a Single Sensor
1. Observation-to-Observation Association
2. Observation-to-Track Association
6.4.1.5 Single Track Management
6.4.2 Lidar and Radar Track Fusion
6.4.2.1 Data Alignment
6.4.2.2 Track Association
6.4.2.3 Track Fusion Algorithm
6.5 Conclusion and Future Work
References
Part III: Vehicle Localization and Navigation
Chapter 7: An Integrated DGPS/IMU Positioning Approach
7.1 Introduction
7.2 Related Work
7.3 An Integrated DGPS/IMU Positioning Approach
7.3.1 The System Equation
7.3.2 The Measurement Equation
7.3.3 Data Fusion Using EKF
References
Chapter 8: Vehicle Navigation Using Global Views
8.1 Introduction
8.2 The Problem and Proposed Approach
8.3 The Panoramic Imaging Model
8.4 The Panoramic Inverse Perspective Mapping (pIPM)
8.4.1 The Mapping Relationship Between Each Image and a Panoramic Image
8.4.2 The Panoramic Inverse Perspective Mapping
8.5 The Implementation of the pIPM
8.5.1 The Field of View of N Cameras in the Vehicle Coordinate System
8.5.2 Calculation of Each Interest Point's View Angle in the Vehicle Coordinate System
8.5.3 The Mapping Relationship Between a 3D On-road Point and a Panoramic Image
8.5.4 Image Interpolation in the Vehicle Coordinate System
8.6 The Elimination of Wide-Angle Lens' Radial Error
8.7 Combining Panoramic Images with Electronic Maps
References
Part IV: Advanced Vehicle Motion Control
Chapter 9: The Lateral Motion Control for Intelligent Vehicles
9.1 Introduction
9.2 Related Work
9.3 The Mixed Lateral Control Strategy
9.3.1 Linear Roads
1. Determining Look-Ahead Distance
2. Calculating Looking-Ahead Error
9.3.2 Curvilinear Roads
1. Existing Shape Representation
2. The Proposed Segmenting Approach of Contours
9.3.3 Calculating the Radius of an Arc
9.3.4 The Algorithm Flow
9.4 The Relationship Between Motor Pulses and the Front Wheel Lean Angle
References
Chapter 10: Longitudinal Motion Control for Intelligent Vehicles
10.1 Introduction
10.2 System Identification in Vehicle Longitudinal Control
10.2.1 The First-Order Systems
10.2.2 First-Order Lag Systems
10.2.3 Identification of Our Vehicle System
1. The First-Order System Assumption
2. Validating the First-Order Lag Assumption
3. Validating Second-Order Assumption
10.3 The Proposed Velocity Controller
10.3.1 Validating the Longitudinal Control System Function
10.3.2 Velocity Controller Design
10.4 Experimental Results and Analysis
References
Index
Advances in Computer Vision and Pattern Recognition
For further volumes: www.springer.com/series/4205
Hong Cheng
Autonomous Intelligent Vehicles: Theory, Algorithms, and Implementation
Prof. Hong Cheng
School of Automation Engineering
University of Electronic Science and Technology
610054 Chengdu, Sichuan, People's Republic of China
hcheng@uestc.edu.cn

Series Editors
Professor Sameer Singh, PhD
Research School of Informatics
Loughborough University
Loughborough, UK

Dr. Sing Bing Kang
Microsoft Research, Microsoft Corporation
One Microsoft Way
Redmond, WA 98052, USA

Advances in Computer Vision and Pattern Recognition
ISSN 2191-6586    e-ISSN 2191-6594
ISBN 978-1-4471-2279-1    e-ISBN 978-1-4471-2280-7
DOI 10.1007/978-1-4471-2280-7
Springer London Dordrecht Heidelberg New York

British Library Cataloguing in Publication Data
A catalogue record for this book is available from the British Library

Library of Congress Control Number: 2011943117

© Springer-Verlag London Limited 2011

Apart from any fair dealing for the purposes of research or private study, or criticism or review, as permitted under the Copyright, Designs and Patents Act 1988, this publication may only be reproduced, stored or transmitted, in any form or by any means, with the prior permission in writing of the publishers, or in the case of reprographic reproduction in accordance with the terms of licenses issued by the Copyright Licensing Agency. Enquiries concerning reproduction outside those terms should be sent to the publishers.

The use of registered names, trademarks, etc., in this publication does not imply, even in the absence of a specific statement, that such names are exempt from the relevant laws and regulations and therefore free for general use.

The publisher makes no representation, express or implied, with regard to the accuracy of the information contained in this book and cannot accept any legal responsibility or liability for any errors or omissions that may be made.

Printed on acid-free paper

Springer is part of Springer Science+Business Media (www.springer.com)
Preface

Over the years, the field of intelligent vehicles has become a major research theme in intelligent transportation systems, since traffic accidents are a serious and growing problem all over the world. The goal of an intelligent vehicle is to augment autonomous vehicle driving, either entirely or in part, for the purposes of safety, comfort, and energy saving. Indeed, many technologies of intelligent vehicles are rooted in autonomous mobile robots. The tasks of intelligent vehicles are even more challenging than those of indoor mobile robots, for two reasons. First, real-time perception and modeling of dynamic, complex environments challenges current indoor robot technologies. Autonomous intelligent vehicles have to complete the basic procedures of perceiving and modeling the environment, localizing the vehicle and building maps, planning paths and making decisions, and controlling the vehicle, all within a limited time to meet real-time requirements. Meanwhile, we face the challenge of processing large amounts of data from multiple sensors, such as cameras, lidars, and radars. This is extremely hard in complex outdoor environments, so these tasks have to be implemented in more efficient ways. Second, vehicle motion control faces strong nonlinear characteristics due to the vehicle's high mass, especially during high-speed driving and sudden steering. In this case, neither the lateral nor the longitudinal control algorithms of indoor robots work well.

This book presents our recent research work on intelligent vehicles and is aimed at researchers and graduate students interested in intelligent vehicles. Our goal in writing this book is threefold. First, it creates an updated reference book on intelligent vehicles. Second, this book presents not only object/obstacle detection and recognition but also vehicle lateral and longitudinal control algorithms, which benefits readers keen to learn broadly about intelligent vehicles. Finally, we put emphasis on high-level concepts while at the same time providing the low-level details of implementation. We try to link theory, algorithms, and implementation to promote intelligent vehicle research.

This book is divided into four parts. The first part, Autonomous Intelligent Vehicles, presents the research motivation and purposes and the state of the art of intelligent vehicle research; it also introduces the framework of intelligent vehicles. The second part, Environment Perception and Modeling, which includes road detection and tracking, vehicle detection and tracking, and multiple-sensor based multiple-object tracking, introduces environment perception and modeling. The third part, Vehicle Localization and Navigation, which includes an integrated DGPS/IMU positioning approach and vehicle navigation using global views, presents vehicle navigation based on integrated GPS and INS. The fourth part, Advanced Vehicle Motion Control, introduces vehicle lateral and longitudinal motion control.

Most of this book is based on our research work at Xi'an Jiaotong University and Carnegie Mellon University. During the last ten years of research, a large number of people have worked on the Springrobot Project at Xi'an Jiaotong University. I would like to express my deep respect to my Ph.D. advisor, Professor Nanning Zheng, who led me into this field. I would also like to thank: Yuehu Liu, Xiaojun Lv, Lin Ma, Xuetao Zhang, Junjie Qin, Jingbo Tang, Yingtuan Hou, Jing Yang, Li Zhao, Chong Sun, Fan Mu, Ran Li, Weijie Wang, and Huub van de Wetering. In addition, I would like to thank Jie Yang at Carnegie Mellon University, who supported Hong Cheng's research work during his stay at that university, and Zicheng Liu at Microsoft Research, who helped Hong Cheng discuss vehicle navigation with global views. I also extend our sincere and deep thanks to Zhongjun Dai, who helped immensely with figure preparation and with the typesetting of the book in LaTeX. Many people have helped by proofreading draft materials and providing comments and suggestions, including Nana Chen, Rui Huang, Pingxin Long, Wenjun Jing, and Yuzhuo Wang. Springer has provided excellent support throughout the final stages of preparation of this book, and I would like to thank our commissioning editor, Wayne Wheeler, for his support and professionalism, as well as Simon Rees for his help.

Chengdu, People's Republic of China
Hong Cheng