
E-book: Collaborative Perception, Localization and Mapping for Autonomous Systems

  • Format: EPUB+DRM
  • Price: 135,23 €*
  • * The price is final, i.e. no further discounts apply.
  • This e-book is intended for personal use only. E-books cannot be returned.

DRM restrictions

  • Copying (copy/paste):

    not allowed

  • Printing:

    not allowed

  • Usage:

    Digital Rights Management (DRM)
    The publisher has issued this e-book in encrypted form, which means that reading it requires special software. You will also need to create an Adobe ID. More information here. The e-book can be read by 1 user and downloaded to up to 6 devices (all authorized with the same Adobe ID).

    Required software
    To read on mobile devices (phone or tablet), install this free app: PocketBook Reader (iOS / Android)

    To read on a PC or Mac, install Adobe Digital Editions (a free application designed specifically for reading e-books; not to be confused with Adobe Reader, which is probably already installed on your computer).

    This e-book cannot be read on an Amazon Kindle.

This book presents cutting-edge progress in collaborative perception and mapping by proposing a novel framework of multimodal perception, relative localization, and collaborative mapping for collaborative robot systems. The organization of the book allows readers to analyze, model, and design collaborative perception technology for autonomous robots. It presents the foundations of the field of collaborative robot systems, along with the fundamental theory and technical guidelines for collaborative perception and mapping. By providing extensive simulation and real-world experimental results across its chapters, the book significantly promotes the development of autonomous systems from individual intelligence to collaborative intelligence. The book caters to engineers, graduate students, and researchers in the fields of autonomous systems, robotics, computer vision, and collaborative perception.
1 Introduction 1(8)
1.1 Background 1(3)
1.1.1 Motivations 1(1)
1.1.2 Challenges 2(2)
1.2 Objective of This Book 4(1)
1.3 Preview of Chapters 5(4)
References 6(3)
2 Technical Background 9(20)
2.1 Collaborative Perception and SLAM 9(6)
2.1.1 Single Robot Perception and SLAM 9(3)
2.1.2 Multi-Robot SLAM 12(2)
2.1.3 Multi-Robot Map Fusion 14(1)
2.2 Data Registration and Matching 15(4)
2.2.1 Registration of Sensor Data 15(1)
2.2.2 Homogeneous Map Matching 16(1)
2.2.3 Heterogeneous Map Matching 17(2)
2.3 Collaborative Information Fusion 19(10)
2.3.1 Map Inconsistency Detection 19(1)
2.3.2 Probabilistic Information Integration 20(1)
References 21(8)
3 Point Registration Approach for Map Fusion 29(18)
3.1 Introduction 29(1)
3.2 OICP Algorithm 30(6)
3.2.1 Uncertainty in Occupancy Probability 33(1)
3.2.2 Uncertainty in Positional Value 34(2)
3.3 Transformation Evaluation and Probability Fusion 36(3)
3.3.1 Transformation Evaluation 37(1)
3.3.2 Relative Entropy Filter 38(1)
3.4 Experimental Results 39(6)
3.4.1 Registration Results 40(1)
3.4.2 Transformation Evaluation 41(2)
3.4.3 Probabilistic Map Fusion 43(2)
3.5 Conclusions 45(2)
References 45(2)
4 Hierarchical Map Fusion Framework with Homogeneous Sensors 47(30)
4.1 Introduction 47(3)
4.2 System Overview 50(1)
4.3 Map Uncertainty Modeling 50(6)
4.3.1 Individual Voxel Uncertainty 50(3)
4.3.2 Structural Edge Uncertainty 53(1)
4.3.3 Local Uncertainty Propagation 54(2)
4.4 Two-Level Probabilistic Map Matching 56(5)
4.4.1 The Formulation of the Two-Level Probabilistic Map Matching Problem 56(2)
4.4.2 Probabilistic Data Association 58(2)
4.4.3 Error Metric Optimization 60(1)
4.5 Transformation Evaluation and Probability Merging 61(2)
4.5.1 Transformation Evaluation 61(2)
4.5.2 Relative Entropy Filter 63(1)
4.6 Experimental Results 63(11)
4.6.1 Evaluation Protocol 64(1)
4.6.2 Edge Matching Analysis 65(2)
4.6.3 Full Map Matching Analysis 67(4)
4.6.4 Statistical Testing and Map Merging 71(3)
4.7 Conclusions 74(3)
References 75(2)
5 Collaborative 3D Mapping Using Heterogeneous Sensors 77(24)
5.1 Introduction 77(2)
5.2 Distributed Multi-Robot Map Fusion 79(3)
5.2.1 System Architecture 79(1)
5.2.2 System Framework Definition and Formulation 80(1)
5.2.3 Map Fusion Definition and Formulation 81(1)
5.3 Multi-Robot Map Matching 82(4)
5.3.1 Mathematical Formulation 82(1)
5.3.2 3D Occupancy Map Matching 83(1)
5.3.3 E-Step 84(2)
5.3.4 M-Step 86(1)
5.4 Time-Sequential Map Merging 86(4)
5.4.1 Uncertainty Propagation and Transformation 87(2)
5.4.2 Uncertainty Merge 89(1)
5.5 Experimental Results 90(9)
5.5.1 Evaluation Protocol 90(2)
5.5.2 Indoor Environment 92(1)
5.5.3 Mixed Environment Ground Floor 93(2)
5.5.4 Changi Exhibition Center 95(1)
5.5.5 Unstructured Environment 96(1)
5.5.6 Analysis of Experiment Results 96(3)
5.6 Conclusions 99(2)
References 99(2)
6 All-Weather Collaborative Mapping with Dynamic Objects 101(16)
6.1 Introduction 101(2)
6.2 Framework of Collaborative Dynamic Mapping 103(1)
6.3 Multimodal Environmental Perception 104(1)
6.3.1 Heterogeneous Sensor Calibration 105(1)
6.3.2 Separation of Static and Dynamic Observations 105(1)
6.4 Distributed Collaborative Dynamic Mapping 105(4)
6.4.1 Single Robot Level Definition 106(1)
6.4.2 Collaborative Robots Level Definition 107(2)
6.5 Experiments 109(6)
6.5.1 Experiments Overview 109(2)
6.5.2 Daytime Unstructured Environment 111(1)
6.5.3 Night-Time Unstructured Environment 112(1)
6.5.4 Quantitative Analysis 113(2)
6.6 Conclusions 115(2)
References 115(2)
7 Collaborative Probabilistic Semantic Mapping Using CNN 117(22)
7.1 Introduction 117(3)
7.2 System Framework 120(2)
7.2.1 The Framework of Hierarchical Semantic 3D Mapping 120(1)
7.2.2 Centralized Problem Formulation 120(1)
7.2.3 Distributed Hierarchical Definition 121(1)
7.3 Collaborative Semantic 3D Mapping 122(7)
7.3.1 Multimodal Semantic Information Fusion 123(1)
7.3.2 Single Robot Semantic Mapping 124(1)
7.3.3 Collaborative Semantic Map Fusion 125(4)
7.4 Experimental Results 129(8)
7.4.1 Evaluation Overview 129(2)
7.4.2 Open Carpark 131(1)
7.4.3 Mixed Indoor Outdoor 131(1)
7.4.4 UAV-UGV Mapping 131(2)
7.4.5 Quantitative Analysis 133(4)
7.5 Conclusion 137(2)
References 137(2)
8 Conclusions 139
8.1 Summary 139(2)
8.2 Open Challenges 141
Yufeng Yue received the B.Eng. degree in automation from the Beijing Institute of Technology, Beijing, China, in 2014, and the Ph.D. degree from Nanyang Technological University, Singapore, in 2019. He was a visiting scholar at the University of California, Los Angeles, in 2019. He served as a research fellow with the School of Electrical and Electronic Engineering, Nanyang Technological University, Singapore, from 2018 to 2020. He is currently an associate professor with the School of Automation, Beijing Institute of Technology, Beijing, China. His research interests include perception, mapping, and navigation for collaborative autonomous systems in complex environments.

Danwei Wang leads the autonomous mobile robotics research group. He received his Ph.D. and M.S.E. degrees from the University of Michigan, Ann Arbor, in 1989 and 1984, respectively, and his B.E. degree from the South China University of Technology, China, in 1982. Since 1989, he has been with the School of Electrical and Electronic Engineering, Nanyang Technological University, Singapore. He is currently a professor and the co-director of the ST Engineering-NTU Corporate Laboratory, and a senator in the NTU Academics Council. He has served as general chair, technical chair, and in various other positions at international conferences such as ICARCV and IROS. He is an associate editor for the International Journal of Humanoid Robotics and served as an associate editor of the Conference Editorial Board, IEEE Control Systems Society, from 1998 to 2005. He was a recipient of the Alexander von Humboldt fellowship, Germany. His research interests include robotics, control theory, and applications.