Visual Control of Wheeled Mobile Robots: Unifying Vision and Control in Generic Approaches 2014 ed. [Hardback]

  • Format: Hardback, 118 pages, height x width: 235x155 mm, weight: 3259 g, XII + 118 p., 49 illustrations (24 in color, 25 black and white)
  • Series: Springer Tracts in Advanced Robotics 103
  • Publication date: 08-Apr-2014
  • Publisher: Springer International Publishing AG
  • ISBN-10: 3319057820
  • ISBN-13: 9783319057828
  • Hardback
  • Price: 122.82 €*
  • * the price is final, i.e. no further discounts apply
  • Regular price: 144.49 €
  • You save 15%
  • Delivery from the publisher takes approximately 2-4 weeks
  • Free shipping

Vision-based control of wheeled mobile robots is an appealing field of research, from both a scientific and a societal point of view, because of its potential applicability. This book presents a formal treatment of aspects of control theory applied to vision-based pose regulation of wheeled mobile robots, in which the robot must reach a desired position and orientation specified by a target image. The problem is addressed so that vision and control are unified to achieve closed-loop stability, a large region of convergence free of local minima, and good robustness against parametric uncertainty. Three control schemes that rely on monocular vision as the only sensor are presented and evaluated experimentally. A common benefit of these approaches is that they are valid for any imaging system that approximately obeys a central projection model, e.g., conventional cameras, catadioptric systems, and some fisheye cameras; the presented control schemes are therefore generic. A minimum set of visual measurements, integrated into adequate task functions, is taken from a geometric constraint imposed between corresponding image features. In particular, the epipolar geometry and the trifocal tensor are exploited, since they apply to generic scenes. A detailed experimental evaluation is presented for each control scheme.
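
To make the problem setting concrete, the sketch below shows the standard unicycle camera-robot kinematic model that underlies this kind of pose regulation, together with a toy controller that steers the robot toward a goal position in Cartesian space. This is only an illustration, not the book's method: the schemes presented in the book close the loop directly on image measurements derived from the epipolar geometry or the trifocal tensor, so no metric pose reconstruction is needed. The code is a minimal, hypothetical Python/NumPy sketch under those assumptions.

```python
import numpy as np

# Minimal sketch (not the book's control laws): the standard unicycle
# camera-robot model used for planar wheeled robots, plus a toy controller
# that regulates position in Cartesian space. The book's schemes replace this
# Cartesian feedback with image measurements (epipoles or trifocal-tensor
# elements) obtained from the current and target views.

def unicycle_step(state, v, w, dt):
    """Euler integration of x_dot = v*cos(phi), y_dot = v*sin(phi), phi_dot = w."""
    x, y, phi = state
    return np.array([x + v * np.cos(phi) * dt,
                     y + v * np.sin(phi) * dt,
                     phi + w * dt])

def toy_position_controller(state, goal, k_v=1.0, k_w=2.0):
    """Drive the robot toward a goal position (illustrative only)."""
    x, y, phi = state
    gx, gy = goal
    rho = np.hypot(gx - x, gy - y)                      # distance to goal
    alpha = np.arctan2(gy - y, gx - x) - phi            # bearing error
    alpha = np.arctan2(np.sin(alpha), np.cos(alpha))    # wrap to [-pi, pi]
    return k_v * rho * np.cos(alpha), k_w * alpha       # (v, w)

state = np.array([0.0, 0.0, 0.0])   # start at the origin, heading along x
goal = (2.0, 1.0)                   # desired position (orientation not handled here)
for _ in range(1000):
    v, w = toy_position_controller(state, goal)
    state = unicycle_step(state, v, w, dt=0.01)
print(state)                        # position converges close to (2, 1)
```

Roughly speaking, the book's image-based formulations replace the distance and bearing errors used above with task functions built from the epipoles (Chapter 2) or from elements of the 1D trifocal tensor (Chapter 3), both of which can be computed from point correspondences between the current, initial, and target images.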

1 Introduction 1(20)
1.1 Context of the Book 1(2)
1.2 State of the Art on Visual Control 3(8)
1.2.1 Visual Control in Robotics 3(1)
1.2.2 Classical Visual Servoing Schemes 4(2)
1.2.3 Visual Servoing through a Geometric Constraint 6(2)
1.2.4 Robust Visual Servoing 8(1)
1.2.5 Omnidirectional Visual Servoing 8(1)
1.2.6 Visual Control of Mobile Robots 9(2)
1.3 Mathematical Modeling 11(10)
1.3.1 The Camera-Robot Model 11(2)
1.3.2 Central Camera Model 13(3)
1.3.3 Visual Measurement's Models: Multi-view Geometric Constraints 16(5)
2 Robust Visual Control Based on the Epipolar Geometry 21(24)
2.1 Introduction 21(2)
2.2 Pairwise Epipolar Geometry of Three Views 23(1)
2.3 Epipolar Control Law from Three Views 24(6)
2.3.1 First Step -- Alignment with the Target 25(4)
2.3.2 Second Step -- Depth Correction with Drift Compensation 29(1)
2.4 Stability Analysis 30(4)
2.5 Experimental Evaluation 34(10)
2.5.1 Simulation Results 34(8)
2.5.2 Real-World Experiments 42(2)
2.6 Closure 44(1)
3 A Robust Control Scheme Based on the Trifocal Tensor 45(24)
3.1 Introduction 45(2)
3.2 Defining a Control Framework with the 1D Trifocal Tensor 47(5)
3.2.1 Values of the 1D Trifocal Tensor in Particular Locations 49(1)
3.2.2 Dynamic Behavior of the Elements of the 1D Trifocal Tensor 50(1)
3.2.3 Selecting Suited Outputs 51(1)
3.3 1D Trifocal Tensor-Based Control Law Design 52(3)
3.3.1 First Step -- Position Correction 52(3)
3.3.2 Second Step -- Orientation Correction 55(1)
3.4 Stability Analysis 55(2)
3.5 Experimental Evaluation 57(11)
3.5.1 Simulation Results 58(3)
3.5.2 Experiments with Real Data 61(3)
3.5.3 Real-World Experiments 64(4)
3.6 Closure 68(1)
4 Dynamic Pose-Estimation for Visual Control 69(30)
4.1 Introduction 69(2)
4.2 Dynamic Pose-Estimation from a Geometric Constraint 71(11)
4.2.1 Observability Analysis with the Epipoles as Measurement 71(6)
4.2.2 Observability Analysis with the 1D Trifocal Tensor as Measurement 77(5)
4.3 Non-holonomic Visual Servoing in the Cartesian Space 82(6)
4.3.1 Control of the Position Error 83(1)
4.3.2 Stability of the Estimation-Based Control Loop 84(2)
4.3.3 Pose Regulation through Adequate Reference Tracking 86(2)
4.4 Experimental Evaluation 88(9)
4.4.1 Simulation Results 88(6)
4.4.2 Real-World Experiments 94(3)
4.5 Closure 97(2)
5 Conclusions 99(4)
A Basics of Nonlinear Control and State Estimation 103(8)
A.1 Input-Output Linearization 103(2)
A.2 A Robust Control Technique: Sliding Mode Control 105(2)
A.3 Theory of State Observability 107(2)
A.3.1 Nonlinear Continuous Systems 107(1)
A.3.2 Nonlinear Discrete Systems 108(1)
A.3.3 Discrete Piece-Wise Constant Systems 108(1)
A.4 Dynamic Pose Estimation 109(2)
References 111