
Robotics Vision and Control 2nd edition.pdf

Professor Bruno Siciliano
Professor Oussama Khatib
Author Peter Corke
Foreword
Preface
Contents
Nomenclature
1 Introduction
1.1 Robots, Jobs and Ethics
1.2 About the Book
1.2.1 MATLAB Software and the Toolboxes
1.2.2 Notation, Conventions and Organization
1.2.3 Audience and Prerequisites
1.2.4 Learning with the Book
1.2.5 Teaching with the Book
1.2.6 Outline
Further Reading
Part I Foundations
2 Representing Position and Orientation
2.1 Working in Two Dimensions (2D)
2.1.1 Orientation in 2-Dimensions
2.1.1.1 Orthonormal Rotation Matrix
2.1.1.2 Matrix Exponential
2.1.2 Pose in 2-Dimensions
2.1.2.1 Homogeneous Transformation Matrix
2.1.2.2 Centers of Rotation
2.1.2.3 Twists in 2D
2.2 Working in Three Dimensions (3D)
2.2.1 Orientation in 3-Dimensions
2.2.1.1 Orthonormal Rotation Matrix
2.2.1.2 Three-Angle Representations
2.2.1.3 Singularities and Gimbal Lock
2.2.1.4 Two Vector Representation
2.2.1.5 Rotation about an Arbitrary Vector
2.2.1.6 Matrix Exponentials
2.2.1.7 Unit Quaternions
2.2.2 Pose in 3-Dimensions
2.2.2.1 Homogeneous Transformation Matrix
2.2.2.2 Vector-Quaternion Pair
2.2.2.3 Twists
2.3 Advanced Topics
2.3.1 Normalization
2.3.2 Understanding the Exponential Mapping
2.3.3 More About Twists
2.3.4 Dual Quaternions
2.3.5 Configuration Space
2.4 Using the Toolbox
2.5 Wrapping Up
Further Reading
Exercises
3 Time and Motion
3.1 Time-Varying Pose
3.1.1 Derivative of Pose
3.1.2 Transforming Spatial Velocities
3.1.3 Incremental Rotation
3.1.4 Incremental Rigid-Body Motion
3.2 Accelerating Bodies and Reference Frames
3.2.1 Dynamics of Moving Bodies
3.2.2 Transforming Forces and Torques
3.2.3 Inertial Reference Frame
3.3 Creating Time-Varying Pose
3.3.1 Smooth One-Dimensional Trajectories
3.3.2 Multi-Dimensional Trajectories
3.3.3 Multi-Segment Trajectories
3.3.4 Interpolation of Orientation in 3D
3.3.4.1 Direction of Rotation
3.3.5 Cartesian Motion in 3D
3.4 Application: Inertial Navigation
3.4.1 Gyroscopes
3.4.1.1 How Gyroscopes Work
3.4.1.2 Estimating Orientation
3.4.2 Accelerometers
3.4.2.1 How Accelerometers Work
3.4.2.2 Estimating Pose and Body Acceleration
3.4.3 Magnetometers
3.4.3.1 How Magnetometers Work
3.4.3.2 Estimating Heading
3.4.4 Sensor Fusion
3.5 Wrapping Up
Further Reading
Exercises
Part II Mobile Robots
4 Mobile Robot Vehicles
4.1 Wheeled Mobile Robots
4.1.1 Car-Like Mobile Robots
4.1.1.1 Moving to a Point
4.1.1.2 Following a Line
4.1.1.3 Following a Trajectory
4.1.1.4 Moving to a Pose
4.1.2 Differentially-Steered Vehicle
4.1.3 Omnidirectional Vehicle
4.2 Flying Robots
4.3 Advanced Topics
4.3.1 Nonholonomic and Under-Actuated Systems
4.4 Wrapping Up
Further Reading
Toolbox and MATLAB Notes
Exercises
5 Navigation
5.1 Reactive Navigation
5.1.1 Braitenberg Vehicles
5.1.2 Simple Automata
5.2 Map-Based Planning
5.2.1 Distance Transform
5.2.2 D*
5.2.3 Introduction to Roadmap Methods
5.2.4 Probabilistic Roadmap Method (PRM)
5.2.5 Lattice Planner
5.2.6 Rapidly-Exploring Random Tree (RRT)
5.3 Wrapping Up
Further Reading
Resources
MATLAB Notes
Exercises
6 Localization
6.1 Dead Reckoning
6.1.1 Modeling the Vehicle
6.1.2 Estimating Pose
6.2 Localizing with a Map
6.3 Creating a Map
6.4 Localization and Mapping
6.5 Rao-Blackwellized SLAM
6.6 Pose Graph SLAM
6.7 Sequential Monte-Carlo Localization
6.8 Application: Scanning Laser Rangefinder
Laser Odometry
Laser-Based Map Building
Laser-Based Localization
6.9 Wrapping Up
Further Reading
Toolbox and MATLAB Notes
Exercises
Part III Arm-Type Robots
7 Robot Arm Kinematics
7.1 Forward Kinematics
7.1.1 2-Dimensional (Planar) Robotic Arms
7.1.2 3-Dimensional Robotic Arms
7.1.2.1 Denavit-Hartenberg Parameters
7.1.2.2 Product of Exponentials
7.1.2.3 6-Axis Industrial Robot
7.2 Inverse Kinematics
7.2.1 2-Dimensional (Planar) Robotic Arms
7.2.1.1 Closed-Form Solution
7.2.1.2 Numerical Solution
7.2.2 3-Dimensional Robotic Arms
7.2.2.1 Closed-Form Solution
7.2.2.2 Numerical Solution
7.2.2.3 Under-Actuated Manipulator
7.2.2.4 Redundant Manipulator
7.3 Trajectories
7.3.1 Joint-Space Motion
7.3.2 Cartesian Motion
7.3.3 Kinematics in Simulink
7.3.4 Motion through a Singularity
7.3.5 Configuration Change
7.4 Advanced Topics
7.4.1 Joint Angle Offsets
7.4.2 Determining Denavit-Hartenberg Parameters
7.4.3 Modified Denavit-Hartenberg Parameters
7.5 Applications
7.5.1 Writing on a Surface
7.5.2 A Simple Walking Robot
Kinematics
Motion of One Leg
Motion of Four Legs
7.6 Wrapping Up
Further Reading
MATLAB and Toolbox Notes
Exercises
8 Manipulator Velocity
8.1 Manipulator Jacobian
8.1.1 Jacobian in the World Coordinate Frame
8.1.2 Jacobian in the End-Effector Coordinate Frame
8.1.3 Analytical Jacobian
8.2 Jacobian Condition and Manipulability
8.2.1 Jacobian Singularities
8.2.2 Manipulability
8.3 Resolved-Rate Motion Control
8.3.1 Jacobian Singularity
8.4 Under- and Over-Actuated Manipulators
8.4.1 Jacobian for Under-Actuated Robot
8.4.2 Jacobian for Over-Actuated Robot
8.5 Force Relationships
8.5.1 Transforming Wrenches to Joint Space
8.5.2 Force Ellipsoids
8.6 Inverse Kinematics: a General Numerical Approach
8.6.1 Numerical Inverse Kinematics
8.7 Advanced Topics
8.7.1 Computing the Manipulator Jacobian Using Twists
8.8 Wrapping Up
Further Reading
MATLAB and Toolbox Notes
Exercises
9 Dynamics and Control
9.1 Independent Joint Control
9.1.1 Actuators
9.1.2 Friction
9.1.3 Effect of the Link Mass
9.1.4 Gearbox
9.1.5 Modeling the Robot Joint
9.1.6 Velocity Control Loop
9.1.7 Position Control Loop
9.1.8 Independent Joint Control Summary
9.2 Rigid-Body Equations of Motion
9.2.1 Gravity Term
9.2.2 Inertia Matrix
9.2.3 Coriolis Matrix
9.2.4 Friction
9.2.5 Effect of Payload
9.2.6 Base Force
9.2.7 Dynamic Manipulability
9.3 Forward Dynamics
9.4 Rigid-Body Dynamics Compensation
9.4.1 Feedforward Control
9.4.2 Computed Torque Control
9.4.3 Operational Space Control
9.5 Applications
9.5.1 Series-Elastic Actuator (SEA)
9.6 Wrapping Up
Further Reading
Exercises
Part IV Computer Vision
10 Light and Color
10.1 Spectral Representation of Light
10.1.1 Absorption
10.1.2 Reflectance
10.1.3 Luminance
10.2 Color
10.2.1 The Human Eye
10.2.2 Measuring Color
10.2.3 Reproducing Colors
10.2.4 Chromaticity Space
10.2.5 Color Names
10.2.6 Other Color and Chromaticity Spaces
10.2.7 Transforming between Different Primaries
10.2.8 What Is White?
10.3 Advanced Topics
10.3.1 Color Temperature
10.3.2 Color Constancy
10.3.3 White Balancing
10.3.4 Color Change Due to Absorption
10.3.5 Dichromatic Reflectance
10.3.6 Gamma
10.4 Application: Color Image
10.4.1 Comparing Color Spaces
10.4.2 Shadow Removal
10.5 Wrapping Up
Further Reading
Data Sources
Exercises
11 Image Formation
11.1 Perspective Camera
11.1.1 Perspective Projection
11.1.2 Modeling a Perspective Camera
11.1.3 Discrete Image Plane
11.1.4 Camera Matrix
11.1.5 Projecting Points
11.1.6 Lens Distortion
11.2 Camera Calibration
11.2.1 Homogeneous Transformation Approach
11.2.2 Decomposing the Camera Calibration Matrix
11.2.3 Pose Estimation
11.2.4 Camera Calibration Toolbox
11.3 Wide Field-of-View Imaging
11.3.1 Fisheye Lens Camera
11.3.2 Catadioptric Camera
11.3.3 Spherical Camera
11.4 Unified Imaging
11.4.1 Mapping Wide-Angle Images to the Sphere
11.4.2 Mapping from the Sphere to a Perspective Image
11.5 Novel Cameras
11.5.1 Multi-Camera Arrays
11.5.2 Light-Field Cameras
11.6 Advanced Topics
11.6.1 Projecting 3D Lines and Quadrics
11.6.2 Nonperspective Cameras
11.7 Wrapping Up
Further Reading and Resources
Toolbox Notes
Exercises
12 Images and Image Processing
12.1 Obtaining an Image
12.1.1 Images from Files
12.1.2 Images from an Attached Camera
12.1.3 Images from a Movie File
12.1.4 Images from the Web
12.1.5 Images from Maps
12.1.6 Images from Code
12.2 Image Histograms
12.3 Monadic Operations
12.4 Diadic Operations
12.5 Spatial Operations
12.5.1 Linear Spatial Filtering
12.5.1.1 Smoothing
12.5.1.2 Boundary Effects
12.5.1.3 Edge Detection
12.5.2 Template Matching
12.5.2.1 Nonparametric Local Transforms
12.5.3 Nonlinear Operations
12.6 Mathematical Morphology
12.6.1 Noise Removal
12.6.2 Boundary Detection
12.6.3 Hit or Miss Transform
12.6.4 Distance Transform
12.7 Shape Changing
12.7.1 Cropping
12.7.2 Image Resizing
12.7.3 Image Pyramids
12.7.4 Image Warping
12.8 Wrapping Up
Further Reading
Sources of Image Data
MATLAB Notes
General Software Tools
Exercises
13 Image Feature Extraction
13.1 Region Features
13.1.1 Classification
13.1.1.1 Grey-Level Classification
13.1.1.2 Color Classification
13.1.2 Representation
13.1.2.1 Graph-Based Segmentation
13.1.3 Description
13.1.3.1 Bounding Boxes
13.1.3.2 Moments
13.1.3.3 Blob Features
13.1.3.4 Shape from Moments
13.1.3.5 Shape from Perimeter
13.1.3.6 Character Recognition
13.1.4 Summary
13.2 Line Features
13.2.1 Summary
13.3 Point Features
13.3.1 Classical Corner Detectors
13.3.2 Scale-Space Corner Detectors
13.3.2.1 Scale-Space Point Feature
13.4 Wrapping Up
MATLAB Notes
Further Reading
Exercises
14 Using Multiple Images
14.1 Feature Correspondence
14.2 Geometry of Multiple Views
14.2.1 The Fundamental Matrix
14.2.2 The Essential Matrix
14.2.3 Estimating the Fundamental Matrix from Real Image Data
14.2.4 Planar Homography
14.3 Stereo Vision
14.3.1 Sparse Stereo
14.3.2 Dense Stereo Matching
14.3.2.1 Stereo Failure Modes
14.3.3 Peak Refinement
14.3.4 Cleaning up and Reconstruction
14.3.5 3D Texture Mapped Display
14.3.6 Anaglyphs
14.3.7 Image Rectification
14.4 Bundle Adjustment
14.5 Point Clouds
14.5.1 Fitting a Plane
14.5.2 Matching Two Sets of Points
14.6 Structured Light
14.7 Applications
14.7.1 Perspective Correction
14.7.2 Mosaicing
14.7.3 Image Matching and Retrieval
14.7.4 Visual Odometry
14.8 Wrapping Up
MATLAB and Toolbox Notes
Further Reading
Resources
Exercises
Part V Robotics, Vision and Control
15 Vision-Based Control
15.1 Position-Based Visual Servoing
15.2 Image-Based Visual Servoing
15.2.1 Camera and Image Motion
15.2.2 Controlling Feature Motion
15.2.3 Estimating Feature Depth
15.2.4 Performance Issues
15.3 Using Other Image Features
15.3.1 Line Features
15.3.2 Circle Features
15.3.3 Photometric Features
15.4 Wrapping Up
Further Reading
Exercises
16 Advanced Visual Servoing
16.1 XY/Z-Partitioned IBVS
16.2 IBVS Using Polar Coordinates
16.3 IBVS for a Spherical Camera
16.4 Applications
16.4.1 Arm-Type Robot
16.4.2 Mobile Robot
16.4.2.1 Holonomic Mobile Robot
16.4.2.2 Nonholonomic Mobile Robot
16.4.3 Aerial Robot
16.5 Wrapping Up
Further Reading
Resources
Exercises
Appendices
A Installing the Toolboxes
B Linear Algebra Refresher
B.1 Vectors
B.2 Matrices
B.2.1 Square Matrices
B.2.2 Nonsquare and Singular Matrices
C Geometry
C.1 Euclidean Geometry
C.1.1 Points
C.1.2 Lines
C.1.2.1 Lines in 2D
C.1.2.2 Lines in 3D and Plücker Coordinates
C.1.3 Planes
C.1.4 Ellipses and Ellipsoids
C.1.4.1 Properties
C.1.4.2 Drawing an Ellipse
C.1.4.3 Fitting an Ellipse to Data
From a Set of Interior Points
From a Set of Boundary Points
C.2 Homogeneous Coordinates
C.2.1 Two Dimensions
C.2.1.1 Conics
C.2.2 Three Dimensions
C.2.2.1 Lines
C.2.2.2 Planes
C.2.2.3 Quadrics
C.3 Geometric Transformations
D Lie Groups and Algebras
E Linearization, Jacobians and Hessians
F Solving Systems of Equations
F.1 Linear Problems
F.1.1 Nonhomogeneous Systems
F.1.2 Homogeneous Systems
F.2 Nonlinear Problems
F.2.1 Finding Roots
F.2.2 Nonlinear Minimization
F.2.3 Nonlinear Least Squares Minimization
Numerical Issues
F.2.4 Sparse Nonlinear Least Squares
State Vector
Inherent Structure
Large Scale Problems
Anchoring
G Gaussian Random Variables
H Kalman Filter
H.1 Linear Systems – Kalman Filter
H.2 Nonlinear Systems – Extended Kalman Filter
I Graphs
J Peak Finding
Bibliography
Index
Robotics, Vision and Control – Second Edition
Peter Corke
Fundamental Algorithms in MATLAB®
Springer Tracts in Advanced Robotics Volume 118 Editors: Bruno Siciliano · Oussama Khatib
Peter Corke
Robotics, Vision and Control
Fundamental Algorithms in MATLAB®
Second, completely revised, extended and updated edition
With 492 Images
Additional material is provided at www.petercorke.com/RVC
Professor Bruno Siciliano
Dipartimento di Ingegneria Elettrica e Tecnologie dell’Informazione, Università di Napoli Federico II, Via Claudio 21, 80125 Napoli, Italy, e-mail: siciliano@unina.it

Professor Oussama Khatib
Artificial Intelligence Laboratory, Department of Computer Science, Stanford University, Stanford, CA 94305-9010, USA, e-mail: khatib@cs.stanford.edu

Author Peter Corke
School of Electrical Engineering and Computer Science, Queensland University of Technology (QUT), Brisbane QLD 4000, Australia, e-mail: rvc@petercorke.com

ISSN 1610-7438 Springer Tracts in Advanced Robotics
ISBN 978-3-319-54412-0
DOI 10.1007/978-3-319-54413-7
ISSN 1610-742X (electronic)
ISBN 978-3-319-54413-7 (eBook)
Library of Congress Control Number: 2017934638

1st ed. 2011
© Springer-Verlag Berlin Heidelberg 2011
© Springer International Publishing AG 2017

This work is subject to copyright. All rights are reserved by the Publisher, whether the whole or part of the material is concerned, specifically the rights of translation, reprinting, reuse of illustrations, recitation, broadcasting, reproduction on microfilms or in any other physical way, and transmission or information storage and retrieval, electronic adaptation, computer software, or by similar or dissimilar methodology now known or hereafter developed.

The use of general descriptive names, registered names, trademarks, service marks, etc. in this publication does not imply, even in the absence of a specific statement, that such names are exempt from the relevant protective laws and regulations and therefore free for general use.

The publisher, the authors and the editors are safe to assume that the advice and information in this book are believed to be true and accurate at the date of publication. Neither the publisher nor the authors or the editors give a warranty, express or implied, with respect to the material contained herein or for any errors or omissions that may have been made. The publisher remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Production: Armin Stasch and Scientific Publishing Services Pvt. Ltd. Chennai, India
Typesetting and layout: Stasch · Bayreuth (stasch@stasch.com)
Printed on acid-free paper

This Springer imprint is published by Springer Nature
The registered company is Springer International Publishing AG
The registered company address is: Gewerbestrasse 11, 6330 Cham, Switzerland
Editorial Advisory Board
Nancy Amato, Texas A & M, USA
Oliver Brock, TU Berlin, Germany
Herman Bruyninckx, KU Leuven, Belgium
Wolfram Burgard, Univ. Freiburg, Germany
Raja Chatila, ISIR – UPMC & CNRS, France
Francois Chaumette, INRIA Rennes – Bretagne Atlantique, France
Wan Kyun Chung, POSTECH, Korea
Peter Corke, Queensland Univ. Technology, Australia
Paolo Dario, Scuola S. Anna Pisa, Italy
Alessandro De Luca, Sapienza Univ. Roma, Italy
Rüdiger Dillmann, Univ. Karlsruhe, Germany
Ken Goldberg, UC Berkeley, USA
John Hollerbach, Univ. Utah, USA
Lydia Kavraki, Rice Univ., USA
Vijay Kumar, Univ. Pennsylvania, USA
Bradley Nelson, ETH Zürich, Switzerland
Frank Park, Seoul National Univ., Korea
Tim Salcudean, Univ. British Columbia, Canada
Roland Siegwart, ETH Zurich, Switzerland
Gaurav Sukhatme, Univ. Southern California, USA

More information about this series at http://www.springer.com/series/5208
To my family Phillipa, Lucy and Madeline for their indulgence and support; my parents Margaret and David for kindling my curiosity; and to Lou Paul who planted the seed that became this book.
Foreword

Once upon a time, a very thick document of a dissertation from a faraway land came to me for evaluation. Visual robot control was the thesis theme and Peter Corke was its author. Here, I am reminded of an excerpt of my comments, which reads, this is a masterful document, a quality of thesis one would like all of one’s students to strive for, knowing very few could attain – very well considered and executed.

The connection between robotics and vision has been, for over two decades, the central thread of Peter Corke’s productive investigations and successful developments and implementations. This rare experience is bearing fruit in this second edition of his book on Robotics, Vision, and Control. In its melding of theory and application, this second edition has considerably benefited from the author’s unique mix of academic and real-world application influences through his many years of work in robotic mining, flying, underwater, and field robotics.

There have been numerous textbooks in robotics and vision, but few have reached the level of integration, analysis, dissection, and practical illustrations evidenced in this book. The discussion is thorough, the narrative is remarkably informative and accessible, and the overall impression is of a significant contribution for researchers and future investigators in our field. Most every element that could be considered as relevant to the task seems to have been analyzed and incorporated, and the effective use of Toolbox software echoes this thoroughness. The reader is taken on a realistic walk through the fundamentals of mobile robots, navigation, localization, manipulator-arm kinematics, dynamics, and joint-level control, as well as camera modeling, image processing, feature extraction, and multi-view geometry. These areas are finally brought together through an extensive discussion of visual servo systems. In the process, the author provides insights into how complex problems can be decomposed and solved using powerful numerical tools and effective software.

The Springer Tracts in Advanced Robotics (STAR) is devoted to bringing to the research community the latest advances in the robotics field on the basis of their significance and quality. Through a wide and timely dissemination of critical research developments in robotics, our objective with this series is to promote more exchanges and collaborations among the researchers in the community and contribute to further advancements in this rapidly growing field.

Peter Corke brings a great addition to our STAR series with an authoritative book, reaching across fields, thoughtfully conceived and brilliantly accomplished.

Oussama Khatib
Stanford, California
October 2016
Preface

Tell me and I will forget. Show me and I will remember. Involve me and I will understand.
Chinese proverb

Simple things should be simple, complex things should be possible.
Alan Kay

These are exciting times for robotics. Since the first edition of this book was published we have seen much progress: the rise of the self-driving car, the Mars science laboratory rover making profound discoveries on Mars, the Philae comet landing attempt, and the DARPA Robotics Challenge. We have witnessed the drone revolution – flying machines that were once the domain of the aerospace giants can now be bought for just tens of dollars. All this has been powered by the continuous and relentless improvement in computer power and tremendous advances in low-cost inertial sensors and cameras – driven largely by consumer demand for better mobile phones and gaming experiences. It’s getting easier for individuals to create robots – 3D printing is now very affordable, the Robot Operating System (ROS) is both capable and widely used, and powerful hobby technologies such as the Arduino, Raspberry Pi, Dynamixel servo motors and Lego’s EV3 brick are available at low cost. This in turn has contributed to the rapid growth of the global maker community – ordinary people creating at home what would once have been done by a major corporation. We have also witnessed an explosion of commercial interest in robotics and computer vision – many startups and a lot of acquisitions by big players in the field. Robotics even featured on the front cover of the Economist magazine in 2014!

So how does a robot work? Robots are data-driven machines. They acquire data, process it and take action based on it. The data comes from sensors measuring the velocity of a wheel, the angle of a robot arm’s joint or the intensities of millions of pixels that comprise an image of the world that the robot is observing. For many robotic applications the amount of data that needs to be processed, in real-time, is massive. For a vision sensor it can be of the order of tens to hundreds of megabytes per second.

Progress in robots and machine vision has been, and continues to be, driven by more effective ways to process data. This is achieved through new and more efficient algorithms, and the dramatic increase in computational power that follows Moore’s law. When I started in robotics and vision in the mid 1980s, see Fig. 0.1, the IBM PC had been recently released – it had a 4.77 MHz 16-bit microprocessor and 16 kbytes (expandable to 256 k) of memory. Over the intervening 30 years computing power has perhaps doubled 20 times, which is an increase by a factor of one million.

Over the fairly recent history of robotics and machine vision a very large body of algorithms has been developed to efficiently solve large-scale problems in perception, planning, control and localization – a significant, tangible, and collective achievement of the research community. However its sheer size and complexity presents a very real barrier to somebody new entering the field. Given so many algorithms from which to choose, a real and important question is:

What is the right algorithm for this particular problem?

One strategy would be to try a few different algorithms and see which works best for the problem at hand, but this is not trivial and leads to the next question:

How can I evaluate algorithm X on my own data without spending days coding and debugging it from the original research papers?
“Computers in the future may weigh no more than 1.5 tons.” Popular Mechanics, forecasting the relentless march of science, 1949
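As a quick worked check of the computing-power claim above (a sketch only; the 18-month doubling period is an assumed Moore's-law rule of thumb, not stated in the text):

% roughly 30 years at one doubling every 1.5 years gives about 20 doublings
\[
\frac{30\ \text{years}}{1.5\ \text{years per doubling}} \approx 20\ \text{doublings},
\qquad
2^{20} = 1\,048\,576 \approx 10^{6},
\]
which is the stated factor of about one million.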