Similar Literature
 20 similar documents found (search time: 15 ms)
1.
Visual localization systems that are practical for autonomous vehicles in outdoor industrial applications must perform reliably in a wide range of conditions. Changing outdoor conditions cause difficulty by drastically altering the information available in the camera images. To confront the problem, we have developed a visual localization system that uses a surveyed three-dimensional (3D)-edge map of permanent structures in the environment. The map has the invariant properties necessary to achieve long-term robust operation. Previous 3D-edge map localization systems usually maintain a single pose hypothesis, making it difficult to initialize without an accurate prior pose estimate and also making them susceptible to misalignment with unmapped edges detected in the camera image. A multihypothesis particle filter is employed here to perform the initialization procedure with significant uncertainty in the vehicle's initial pose. A novel observation function for the particle filter is developed and evaluated against two existing functions. The new function is shown to further improve the abilities of the particle filter to converge given a very coarse estimate of the vehicle's initial pose. An intelligent exposure control algorithm is also developed that improves the quality of the pertinent information in the image. Results gathered over an entire sunny day and also during rainy weather illustrate that the localization system can operate in a wide range of outdoor conditions. The conclusion is that an invariant map, a robust multihypothesis localization algorithm, and an intelligent exposure control algorithm all combine to enable reliable visual localization through challenging outdoor conditions. © 2009 Wiley Periodicals, Inc.
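The multihypothesis initialization described above can be pictured with a generic particle filter over the vehicle pose. The sketch below is not the authors' observation function; it assumes a hypothetical edge-alignment score `edge_score(pose, image)` and shows only the standard predict/weight/resample cycle such a filter performs.

```python
import numpy as np

def edge_score(pose, image):
    """Hypothetical observation model: how well the 3D-edge map projected from
    `pose` aligns with edges detected in `image` (placeholder score in [0, 1])."""
    return np.random.rand()

def particle_filter_step(particles, weights, odometry, image,
                         motion_noise=(0.05, 0.05, 0.01)):
    """One predict/weight/resample cycle over (x, y, heading) pose particles."""
    n = len(particles)
    # Predict: apply odometry with additive noise so hypotheses stay diverse.
    particles = particles + odometry + np.random.randn(n, 3) * motion_noise
    # Weight: score each hypothesis against the current camera image.
    weights = weights * np.array([edge_score(p, image) for p in particles])
    weights = weights / weights.sum()
    # Systematic resampling when the effective sample size collapses.
    if 1.0 / np.sum(weights ** 2) < n / 2:
        positions = (np.arange(n) + np.random.rand()) / n
        idx = np.minimum(np.searchsorted(np.cumsum(weights), positions), n - 1)
        particles, weights = particles[idx], np.full(n, 1.0 / n)
    return particles, weights

# Initialization with a very coarse prior: spread hypotheses widely around a guess.
rng = np.random.default_rng(0)
particles = rng.normal([0.0, 0.0, 0.0], [5.0, 5.0, 0.5], size=(500, 3))
weights = np.full(500, 1.0 / 500)
particles, weights = particle_filter_step(particles, weights,
                                          odometry=np.array([0.1, 0.0, 0.0]),
                                          image=None)
```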

2.
The recent technological advances in Micro Aerial Vehicles (MAVs) have triggered great interest in the robotics community, as their deployability in missions of surveillance and reconnaissance has now become a realistic prospect. The state of the art, however, still lacks solutions that can work for a long duration in large, unknown, and GPS-denied environments. Here, we present our visual pipeline and MAV state-estimation framework, which uses feeds from a monocular camera and an Inertial Measurement Unit (IMU) to achieve real-time and onboard autonomous flight in general and realistic scenarios. The challenge lies in dealing with the power and weight restrictions onboard a MAV while providing the robustness necessary in real and long-term missions. This article provides a concise summary of our work on achieving the first onboard vision-based power-on-and-go system for autonomous MAV flights. We discuss our insights on the lessons learned throughout the different stages of this research, from the conception of the idea to the thorough theoretical analysis of the proposed framework and, finally, the real-world implementation and deployment. Looking into the onboard estimation of monocular visual odometry, the sensor fusion strategy, the state estimation and self-calibration of the system, and finally some implementation issues, the reader is guided through the different modules comprising our framework. The validity and power of this framework are illustrated via a comprehensive set of experiments in a large outdoor mission, demonstrating successful operation over flights covering more than 360 m of trajectory and 70 m of altitude change.

3.
GPS-denied closed-loop autonomous control of unstable Unmanned Aerial Vehicles (UAVs) such as rotorcraft using information from a monocular camera has been an open problem. Most proposed Vision-aided Inertial Navigation Systems (V-INSs) have been too computationally intensive or do not have sufficient integrity for closed-loop flight. We provide an affirmative answer to the question of whether V-INSs can be used to sustain prolonged real-world GPS-denied flight by presenting a V-INS that is validated through autonomous flight tests over prolonged closed-loop dynamic operation in both indoor and outdoor GPS-denied environments with two rotorcraft unmanned aircraft systems (UASs). The architecture efficiently combines visual feature information from a monocular camera with measurements from inertial sensors. Inertial measurements are used to predict the frame-to-frame transition of online-selected feature locations, and the difference between predicted and observed feature locations is used to bound, in real time, the inertial measurement unit drift, estimate its bias, and account for initial misalignment errors. A novel algorithm to manage a library of features online is presented that can add or remove features based on a measure of relative confidence in each feature location. The resulting V-INS is sufficiently efficient and reliable to enable real-time implementation on resource-constrained aerial vehicles. The presented algorithms are validated on multiple platforms in real-world conditions: through a 16-min flight test, including an autonomous landing, of a 66 kg rotorcraft UAV operating in an uncontrolled outdoor environment without using GPS, and through a Micro-UAV operating in a cluttered, unmapped, and gusty indoor environment. © 2013 Wiley Periodicals, Inc.
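The abstract mentions an online feature library that adds or removes features based on relative confidence. The sketch below is a minimal, generic version of that idea (not the authors' algorithm): confidence is assumed to grow when the IMU-predicted feature location agrees with the observation and to decay otherwise, and the library is capped at a fixed size.

```python
from dataclasses import dataclass, field

@dataclass
class Feature:
    feat_id: int
    location: tuple          # last observed location of the feature
    confidence: float = 0.5  # assumed relative-confidence measure in [0, 1]

@dataclass
class FeatureLibrary:
    capacity: int = 50
    features: dict = field(default_factory=dict)

    def update(self, feat_id, predicted, observed, match_tol=2.0):
        """Raise confidence when the IMU-predicted location agrees with the
        observation, lower it otherwise; drop features that become unreliable."""
        f = self.features.get(feat_id)
        if f is None:
            return
        error = sum((p - o) ** 2 for p, o in zip(predicted, observed)) ** 0.5
        f.confidence = min(1.0, f.confidence + 0.1) if error < match_tol \
            else max(0.0, f.confidence - 0.2)
        f.location = observed
        if f.confidence < 0.1:
            del self.features[feat_id]

    def add(self, feat_id, location):
        """Add a new feature; if full, evict the least-confident one first."""
        if len(self.features) >= self.capacity:
            worst = min(self.features.values(), key=lambda f: f.confidence)
            del self.features[worst.feat_id]
        self.features[feat_id] = Feature(feat_id, location)
```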

4.
In this paper, a generic line-of-sight-sensing (LOS)-based guidance methodology is proposed for the docking of autonomous vehicles/robotic end-effectors: A multi-LOS task-space sensing system is used in conjunction with a guidance algorithm in a closed-loop feedback environment. The novelty of the overall system is its applicability to cases that do not allow for the direct proximity measurement of the vehicle's pose (position and orientation). In such instances, a guidance-based technique must be employed to move the vehicle to its desired pose using corrective actions at the final stages of its motion. Namely, after the vehicle/end-effector has failed to move to its desired docking pose within acceptable tolerances, LOS sensors initiate short-range corrective motion commands. The objective of the proposed guidance method is, thus, to successfully minimize the systematic errors of the vehicle, accumulated after a long-range motion, while allowing it to converge within the random noise limits. An additional advantage of the proposed system is its applicability to varying vehicle mobility requirements for high-precision docking. The proposed system was successfully tested via simulation on a 6 degree-of-freedom (DOF) vehicle. Numerous simulation tests of the behavior of the vehicle under the command of the guidance algorithm were conducted, one of which is presented herein. © 2005 Wiley Periodicals, Inc.

5.
Joint simultaneous localization and mapping (SLAM) constitutes the basis for cooperative action in multi-robot teams. We designed a stereo vision-based 6D SLAM system combining local and global methods to benefit from their particular advantages: (1) Decoupled local reference filters on each robot for real-time, long-term stable state estimation required for stabilization, control, and fast obstacle avoidance; (2) Online graph optimization with a novel graph topology and intra- as well as inter-robot loop closures through an improved submap matching method to provide global multi-robot pose and map estimates; (3) Distribution of the processing of high-frequency and high-bandwidth measurements enabling the exchange of aggregated and thus compacted map data. As a result, we gain robustness with respect to communication losses between robots. We evaluated our improved map matcher on simulated and real-world datasets and present our full system in five real-world multi-robot experiments in areas of up to 3,000 m² (bounding box), including visual robot detections and submap matches as loop-closure constraints. Further, we demonstrate its application to autonomous multi-robot exploration in a challenging rough-terrain environment at a Moon-analogue site located on a volcano.

6.
This paper presents a novel solution for micro aerial vehicles (MAVs) to autonomously search for and land on an arbitrary landing site using real-time monocular vision. The autonomous MAV is provided with only a single reference image of the landing site, of unknown size, before initiating this task. We extend a well-known monocular visual SLAM algorithm that enables autonomous navigation of the MAV in unknown environments, in order to search for such landing sites. Furthermore, a multi-scale ORB-feature-based method is implemented and integrated into the SLAM framework for landing site detection. We use a RANSAC-based method to locate the landing site within the map of the SLAM system, taking advantage of those map points associated with the detected landing site. We demonstrate the efficiency of the presented vision system in autonomous flights, both indoors and in a challenging outdoor environment.
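As a rough illustration of the detection step described above, the following OpenCV sketch matches ORB features between a single reference image of a landing site and a camera frame, then uses RANSAC to estimate a homography and outline the site in the frame. It is a generic recipe, not the authors' integration with the SLAM map.

```python
import cv2
import numpy as np

def detect_landing_site(reference, frame, min_matches=15):
    """Return the corners of the reference landing site in `frame`, or None."""
    orb = cv2.ORB_create(nfeatures=1000)           # ORB is multi-scale by design
    kp_r, des_r = orb.detectAndCompute(reference, None)
    kp_f, des_f = orb.detectAndCompute(frame, None)
    if des_r is None or des_f is None:
        return None
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des_r, des_f), key=lambda m: m.distance)
    if len(matches) < min_matches:
        return None
    src = np.float32([kp_r[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp_f[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, mask = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)  # RANSAC rejects outliers
    if H is None:
        return None
    h, w = reference.shape[:2]
    corners = np.float32([[0, 0], [w, 0], [w, h], [0, h]]).reshape(-1, 1, 2)
    return cv2.perspectiveTransform(corners, H)    # site outline in the camera frame
```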

7.
Autonomous underwater vehicles are a prominent tool for underwater exploration because they can access dangerous places while avoiding risks to human beings. However, autonomous navigation is still a challenge due to characteristics of the environment that degrade sensor performance and robot perception. In this context, this paper proposes a loop closure detector addressed to the simultaneous localization and mapping problem in semistructured environments using acoustic images acquired by forward-looking sonars. The images are segmented by an adaptive approach based on analysis of the acoustic beams. A pose-invariant topological graph is built to represent the relationships between image features. Loop closure detection is achieved using a graph comparison. The approach is evaluated in a real environment at a marina. The results reveal that all loop closures in the data set are detected with high precision and that the detection is invariant to image rotation.

8.
Autonomous flight of unmanned full-size rotorcraft has the potential to enable many new applications. However, the dynamics of these aircraft, prevailing wind conditions, the need to operate over a variety of speeds, and stringent safety requirements make it difficult to generate safe plans for these systems. Prior work has shown results for only parts of the problem. Here we present the first comprehensive approach to planning safe trajectories for autonomous helicopters from takeoff to landing. Our approach is based on two key insights. First, we compose an approximate solution by cascading various modules that can efficiently solve different relaxations of the planning problem. Our framework invokes a long-term route optimizer, which feeds a receding-horizon planner, which in turn feeds a high-fidelity safety executive. Second, to deal with the diverse planning scenarios that may arise, we hedge our bets with an ensemble of planners. We use a data-driven approach that maps a planning context to a diverse list of planning algorithms that maximize the likelihood of success. Our approach was extensively evaluated in simulation and in real-world flight tests on three different helicopter systems for a duration of more than 109 autonomous hours and 590 pilot-in-the-loop hours. We provide an in-depth analysis and discuss the various tradeoffs of decoupling the problem, using approximations, and leveraging statistical techniques. We summarize the insights with the hope that they generalize to other platforms and applications.

9.
Particle Filters (PFs) have been successfully used in three-dimensional (3D) model-based pose estimation. Typically, these filters depend on the computation of importance weights that use similarity metrics as a proxy to approximate the likelihood function. In this paper, we explore the use of a two-stage 3D model-based approach based on a PF for single-frame pose estimation. First, we use a classifier trained on a synthetic data set for Unmanned Aerial Vehicle (UAV) detection and a pretrained database indexed by bounding-box properties to obtain an initial rough pose estimate. Second, we employ optimization algorithms to optimize the similarity metrics used and decrease the obtained error. We have tested four different algorithms: (a) Particle Filter Optimization (PFO), (b) Particle Swarm Optimization (PSO), (c) modified PSO, and (d) an approach based on the evolution strategies present in genetic algorithms, named Genetic Algorithm-based Framework (GAbF). To check the quality of the estimate at each iteration, we have tested several similarity metrics (color-, edge-, and mask-based) based on the UAV Computer-Aided Design (CAD) model. The framework is applied to the outdoor pose estimation of a fixed-wing UAV for autonomous landing on a Fast Patrol Boat (FPB). We extend our previous approach by adopting a better problem formulation, using Deep Neural Networks (DNNs) for UAV detection, comparing the similarity metrics used, comparing pose optimization schemes, and showing new results. Future work will focus on including this scheme in a tracking architecture to increase the accuracy of the result between observations.
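For intuition on the second (refinement) stage, the sketch below shows a plain particle swarm optimizer minimizing an arbitrary pose-similarity cost. The cost function is a placeholder; the authors' metrics (color-, edge-, and mask-based CAD comparisons) and their modified PSO variant are not reproduced here.

```python
import numpy as np

def pso_refine(cost, init_pose, bounds, n_particles=30, iters=50, w=0.7, c1=1.5, c2=1.5):
    """Minimize cost(pose) with a basic particle swarm, starting near init_pose.

    pose is a 6-vector (x, y, z, roll, pitch, yaw); bounds is the half-width
    of the search box around the initial rough estimate.
    """
    rng = np.random.default_rng(0)
    dim = len(init_pose)
    x = init_pose + rng.uniform(-bounds, bounds, size=(n_particles, dim))
    v = np.zeros_like(x)
    p_best, p_cost = x.copy(), np.array([cost(p) for p in x])
    g_best = p_best[np.argmin(p_cost)].copy()
    for _ in range(iters):
        r1, r2 = rng.random((n_particles, dim)), rng.random((n_particles, dim))
        v = w * v + c1 * r1 * (p_best - x) + c2 * r2 * (g_best - x)  # inertia + attraction
        x = x + v
        c = np.array([cost(p) for p in x])
        improved = c < p_cost
        p_best[improved], p_cost[improved] = x[improved], c[improved]
        g_best = p_best[np.argmin(p_cost)].copy()
    return g_best

# Example with a dummy cost: distance to a "true" pose stands in for a real
# rendering-based similarity metric.
true_pose = np.array([1.0, 2.0, 10.0, 0.0, 0.1, 0.3])
refined = pso_refine(lambda p: np.linalg.norm(p - true_pose),
                     init_pose=np.zeros(6), bounds=np.array([3, 3, 15, 0.5, 0.5, 1.0]))
```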

10.
This paper addresses the perception, control, and trajectory planning for an aerial platform to identify and land on a car moving at 15 km/hr. The hexacopter unmanned aerial vehicle (UAV), equipped with onboard sensors and a computer, detects the car using a monocular camera and predicts the car's future movement using a nonlinear motion model. While following the car, the UAV lands on its roof and attaches itself using magnetic legs. The proposed system is fully autonomous from takeoff to landing. Numerous field tests were conducted throughout the year-long development and preparations for the Mohamed Bin Zayed International Robotics Challenge (MBZIRC) 2017 competition, for which the system was designed. We propose a novel control system in which a model predictive controller is used in real time to generate a reference trajectory for the UAV, which is then tracked by a nonlinear feedback controller. This combination allows the UAV to track predictions of the car's motion with minimal position error. The evaluation presents three successful autonomous landings during the MBZIRC 2017, where our system achieved the fastest landing among all competing teams.
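The abstract does not specify the nonlinear car-motion model, so the sketch below uses a constant turn-rate and velocity (CTRV) model purely as a plausible example of how future car positions can be rolled out for a trajectory generator to track.

```python
import math

def predict_ctrv(x, y, heading, speed, yaw_rate, horizon, dt=0.1):
    """Roll out a constant turn-rate and velocity model over `horizon` seconds.

    Returns a list of predicted (x, y) positions that a trajectory planner or
    model predictive controller could use as a moving landing target.
    """
    states = []
    t = 0.0
    while t < horizon:
        if abs(yaw_rate) > 1e-4:
            # Exact CTRV integration over one step (arc motion).
            x += speed / yaw_rate * (math.sin(heading + yaw_rate * dt) - math.sin(heading))
            y += speed / yaw_rate * (-math.cos(heading + yaw_rate * dt) + math.cos(heading))
        else:
            # Nearly straight motion: fall back to constant velocity.
            x += speed * math.cos(heading) * dt
            y += speed * math.sin(heading) * dt
        heading += yaw_rate * dt
        states.append((x, y))
        t += dt
    return states

# A car at ~15 km/hr (4.17 m/s) gently turning; predict 2 s ahead.
future = predict_ctrv(x=0.0, y=0.0, heading=0.0, speed=4.17, yaw_rate=0.2, horizon=2.0)
```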

11.
Small unmanned aerial vehicles (UAVs) are becoming popular among researchers and vital platforms for several autonomous mission systems. In this paper, we present the design and development of a miniature autonomous rotorcraft weighing less than 700 g and capable of waypoint navigation, trajectory tracking, visual navigation, precise hovering, and automatic takeoff and landing. In an effort to make advanced autonomous behaviors available to mini- and microrotorcraft, an embedded and inexpensive autopilot was developed. To compensate for the weaknesses of the low-cost equipment, we put our efforts into designing a reliable model-based nonlinear controller that uses an inner-loop/outer-loop control scheme. The developed flight controller considers the system's nonlinearities, guarantees the stability of the closed-loop system, and results in a practical controller that is easy to implement and to tune. In addition to controller design and stability analysis, the paper provides information about the overall control architecture and the UAV system integration, including guidance laws, navigation algorithms, control system implementation, and autopilot hardware. The guidance, navigation, and control (GN&C) algorithms were implemented on a miniature quadrotor UAV that has undergone an extensive program of flight tests, resulting in various flight behaviors under autonomous control from takeoff to landing. Experimental results that demonstrate the operation of the GN&C algorithms and the capabilities of our autonomous micro air vehicle are presented. © 2009 Wiley Periodicals, Inc.
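To make the inner-loop/outer-loop idea concrete, here is a deliberately simplified cascade for one horizontal axis: an outer position loop produces a desired tilt angle, and an inner attitude loop produces a torque command. The gains, limits, and linear PD form are illustrative assumptions, not the model-based nonlinear controller of the paper.

```python
def outer_position_loop(pos_err, vel, kp=0.8, kd=0.6, max_tilt=0.35):
    """Outer loop: position error [m] -> desired tilt angle [rad], saturated."""
    tilt_des = kp * pos_err - kd * vel
    return max(-max_tilt, min(max_tilt, tilt_des))

def inner_attitude_loop(tilt_des, tilt, tilt_rate, kp=6.0, kd=1.5):
    """Inner loop: attitude error [rad] -> normalized torque command."""
    return kp * (tilt_des - tilt) - kd * tilt_rate

# One control cycle: the outer loop runs on position feedback, the inner loop
# runs (typically at a higher rate) on attitude feedback.
tilt_cmd = outer_position_loop(pos_err=1.2, vel=0.3)
torque_cmd = inner_attitude_loop(tilt_cmd, tilt=0.05, tilt_rate=0.0)
```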

12.
Recently, a mercury-free flat fluorescent lamp has been developed for LCD backlight applications, utilizing a glow-discharge mode instead of a discharge contraction. This paper proposes a lamp-driving system with a feedback loop that prevents discharge contraction and stabilizes lamp ignition and radiation. By measuring the current that flows through the lamp, the loop can adjust the current to a normal operational level and suppress the long-term excitation that causes discharge contraction. The proposed method has been verified by hardware experiments, which are compared with a conventional open-loop circuit in terms of discharge contraction time and change in luminance.
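As a schematic illustration (only) of the kind of feedback described, the following discrete PI regulator adjusts a drive level so that the measured lamp current stays at a setpoint low enough to avoid the long-term excitation associated with discharge contraction. The setpoint, gains, and limits are hypothetical, not values from the paper.

```python
def pi_current_regulator(i_measured, i_setpoint, integral, dt, kp=0.4, ki=2.0, drive_max=1.0):
    """One step of a PI loop: returns (normalized drive level, updated integral)."""
    error = i_setpoint - i_measured
    integral += error * dt
    drive = kp * error + ki * integral
    drive = max(0.0, min(drive_max, drive))  # clamp to the driver's output range
    return drive, integral

# Regulate toward a hypothetical 20 mA operating current, sampled every 1 ms.
integral, drive = 0.0, 0.0
for i_meas in [0.030, 0.026, 0.023, 0.021, 0.020]:   # simulated current samples [A]
    drive, integral = pi_current_regulator(i_meas, 0.020, integral, dt=0.001)
```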

13.
Large-scale aerial sensing missions can greatly benefit from the perpetual endurance capability provided by high-performance low-altitude solar-powered unmanned aerial vehicles (UAVs). However, today these UAVs suffer from small payload capacity, low energetic margins, and high operational complexity. To tackle these problems, this paper presents four individual technical contributions and integrates them into an existing solar-powered UAV system: First, a lightweight and power-efficient day/night-capable sensing system is discussed. Second, means to optimize the UAV platform to the specific payload and to thereby achieve sufficient energetic margins for day/night flight with payload are presented. Third, existing autonomous launch and landing functionality is extended for solar-powered UAVs. Fourth, as a main contribution, an extended Kalman filter (EKF)-based autonomous thermal updraft tracking framework is developed. Its novelty is that it allows the end-to-end integration of the thermal-induced roll moment into the estimation process. It is assessed against unscented Kalman filter and particle filter methods in simulation and implemented on the aircraft's low-power autopilot. The complete system is verified during a 26 h search-and-rescue aerial sensing mock-up mission that represents the first-ever fully autonomous perpetual endurance flight of a small solar-powered UAV with a day/night-capable sensing payload. It also represents the first time that solar-electric propulsion and autonomous thermal updraft tracking are combined in flight. In contrast to previous work that has focused on the energetic feasibility of perpetual flight, the individual technical contributions of this paper are considered core functionality to guarantee ease-of-use, effectivity, and reliability in future multiday aerial sensing operations with small solar-powered UAVs.
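For orientation, the sketch below shows a generic EKF for tracking a Gaussian thermal from vertical air-velocity measurements, a common textbook formulation; it is not the paper's framework and in particular omits the roll-moment measurement that constitutes its novelty.

```python
import numpy as np

def thermal_ekf_update(s, P, w_meas, Q=np.diag([0.01, 0.5, 1.0, 1.0]), r_var=0.3):
    """One EKF predict/update step for a Gaussian thermal updraft model.

    State s = [W, R, x, y]: updraft strength [m/s], radius [m], and thermal
    center position relative to the aircraft [m]. Measurement w_meas is the
    vertical air velocity sensed at the aircraft (e.g., from a variometer).
    """
    # Predict: static thermal assumption, so only process noise is added.
    P = P + Q
    W, R, x, y = s
    d2 = x * x + y * y
    e = np.exp(-d2 / R ** 2)
    h = W * e                                   # expected updraft at the aircraft
    # Jacobian of the measurement model w = W * exp(-(x^2 + y^2) / R^2).
    H = np.array([[e,
                   W * e * 2.0 * d2 / R ** 3,
                   -W * e * 2.0 * x / R ** 2,
                   -W * e * 2.0 * y / R ** 2]])
    S = H @ P @ H.T + r_var
    K = P @ H.T / S                             # Kalman gain (scalar measurement)
    s = s + (K * (w_meas - h)).ravel()
    P = (np.eye(4) - K @ H) @ P
    return s, P

# Start from a coarse guess and refine it with successive variometer readings.
s, P = np.array([2.0, 80.0, 30.0, 0.0]), np.diag([1.0, 400.0, 900.0, 900.0])
for w in [0.8, 1.1, 1.4]:
    s, P = thermal_ekf_update(s, P, w)
```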

14.
A nonlinear control algorithm for tracking dynamic trajectories using an aerial vehicle is developed in this work. The control structure is designed using a sliding mode methodology, which contains integral sliding properties. The stability of the closed-loop system is proved using the Lyapunov formalism, ensuring convergence in a desired finite time and robustness against unknown external perturbations from the first time instant, even for high-frequency disturbances. In addition, a dynamic trajectory is constructed with the translational dynamics of an aerial robot for autonomous take-off, surveillance missions, and landing. This trajectory respects the constraints imposed by the vehicle characteristics, allowing free initial trajectory conditions. Simulation results demonstrate the good performance of the controller in the closed-loop system when a quadrotor follows the designed trajectory. Furthermore, flight tests are conducted to validate the trajectory and the controller behavior in real time.
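To illustrate the integral sliding-mode idea in its simplest setting, the sketch below controls a unit-mass double integrator along one translational axis. The surface, gains, and boundary-layer saturation are generic textbook choices, not the control law of the paper.

```python
import numpy as np

def integral_smc(x, xd, lam=2.0, ki=1.0, K=4.0, phi=0.05):
    """Integral sliding-mode control for a unit-mass double integrator x_ddot = u.

    x  = (position, velocity, integral of position error)
    xd = (desired position, desired velocity, desired acceleration) at this instant
    A boundary layer `phi` (saturation instead of sign) limits chattering.
    """
    pos, vel, e_int = x
    pos_d, vel_d, acc_d = xd
    e, e_dot = pos - pos_d, vel - vel_d
    s = e_dot + lam * e + ki * e_int              # integral sliding surface
    sat = np.clip(s / phi, -1.0, 1.0)
    u = acc_d - lam * e_dot - ki * e - K * sat    # equivalent control + switching term
    return u

def simulate(T=5.0, dt=0.002):
    """Track x_d(t) = sin(t) under a bounded high-frequency disturbance."""
    pos, vel, e_int = 0.5, 0.0, 0.0
    for k in range(int(T / dt)):
        t = k * dt
        xd = (np.sin(t), np.cos(t), -np.sin(t))
        u = integral_smc((pos, vel, e_int), xd)
        disturbance = 0.5 * np.sin(20 * t)        # unknown perturbation (bounded by K)
        vel += (u + disturbance) * dt
        pos += vel * dt
        e_int += (pos - xd[0]) * dt
    return pos

final_position = simulate()
```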

15.
In recent years, a number of operational unmanned ground vehicles (UGVs) have been developed that can negotiate irregular terrain. They have a number of degrees of freedom (DOFs), giving them enhanced mobility, e.g., the ability to climb stairs and surmount obstacles. However, operating them remotely is complicated because their controllers are similar to conventional control pads or joysticks used in computer games or toys. It is hard for the operator to achieve an intuitive and natural feel; thus, mistakes are common. To intuitively control the locomotion of a UGV with many DOFs, a master-slave operation was implemented. A novel UGV called Kurogane, which consists of a typical crawler combined with a human-like torso section, was developed. The torso section is controlled via a wearable controller interface. In addition, the UGV is equipped with models of muscle viscoelasticity and stretch reflex, called the involuntary autonomous adaptation system, inspired by the adaptive compliance of animals. The proposed system can autonomously and flexibly react and adapt to irregular terrain in real time. Therefore, the operation of Kurogane is simple and does not require great skill or precision. Experimental results show that it performs well over a fixed step, stairs, and rough outdoor terrain. © 2013 Wiley Periodicals, Inc.
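The involuntary adaptation idea can be pictured with a simple viscoelastic joint model: torque behaves like a spring-damper around a rest angle, and a stretch-reflex-like term stiffens the joint when it is deflected quickly by the terrain. The constants and the specific form below are illustrative only and are not taken from the paper.

```python
def viscoelastic_joint_torque(theta, theta_rest, theta_dot, k=30.0, b=2.0,
                              reflex_gain=10.0, reflex_threshold=0.5):
    """Passive compliance plus a crude stretch-reflex term.

    theta, theta_dot: joint angle [rad] and angular velocity [rad/s].
    The joint resists deflection like a spring-damper; if the deflection rate
    exceeds a threshold (sudden terrain contact), the stiffness is increased.
    """
    stiffness = k + (reflex_gain if abs(theta_dot) > reflex_threshold else 0.0)
    return -stiffness * (theta - theta_rest) - b * theta_dot

# Slow deflection: mostly passive compliance.
tau_slow = viscoelastic_joint_torque(theta=0.2, theta_rest=0.0, theta_dot=0.1)
# Fast deflection (e.g., the crawler hits a step): the reflex stiffens the joint.
tau_fast = viscoelastic_joint_torque(theta=0.2, theta_rest=0.0, theta_dot=1.5)
```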

16.
This study presents computer vision modules of a multi-unmanned aerial vehicle (UAV) system, which scored gold, silver, and bronze medals at the Mohamed Bin Zayed International Robotics Challenge 2017. This autonomous system, which ran completely on board and in real time, had to address two complex tasks in challenging outdoor conditions. In the first task, an autonomous UAV had to find, track, and land on a human-driven car moving at 15 km/hr on a figure-eight-shaped track. During the second task, a group of three UAVs had to find small colored objects in a wide area, pick them up, and deliver them into a specified drop-off zone. The computer vision modules presented here achieved computationally efficient detection, accurate localization, robust velocity estimation, and reliable future position prediction of both the colored objects and the car. These properties had to be achieved in adverse outdoor environments with changing light conditions. Lighting varied from intense direct sunlight, with sharp shadows cast over the objects by the UAV itself, to reduced visibility caused by overcast skies and by dust and sand in the air. The results presented in this paper demonstrate good performance of the modules both during testing, which took place in the harsh desert environment of the central United Arab Emirates, and during the contest, which took place at a racing complex in the urban, near-sea location of Abu Dhabi. The stability and reliability of these modules contributed to the overall result of the contest, where our multi-UAV system outperformed teams from the world's leading robotics laboratories in two challenging scenarios.
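As a minimal sketch of the detection-and-prediction pipeline described above (not the authors' modules), the code below thresholds a frame in HSV space to find the centroid of a colored blob and then extrapolates a future position with a constant-velocity assumption. The color range and thresholds are hypothetical.

```python
import cv2
import numpy as np

def detect_colored_object(frame_bgr, hsv_low=(20, 100, 100), hsv_high=(35, 255, 255)):
    """Return the pixel centroid of the blob within an HSV color range, or None."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, np.array(hsv_low), np.array(hsv_high))
    m = cv2.moments(mask)
    if m["m00"] < 1e3:           # too few matching pixels: reject the detection
        return None
    return np.array([m["m10"] / m["m00"], m["m01"] / m["m00"]])

def predict_future_position(positions, timestamps, horizon):
    """Constant-velocity prediction from the last two (position, time) samples."""
    (p0, p1), (t0, t1) = positions[-2:], timestamps[-2:]
    velocity = (p1 - p0) / (t1 - t0)
    return p1 + velocity * horizon

# Example: two detections 0.1 s apart, predicted 0.5 s ahead.
track = [np.array([100.0, 200.0]), np.array([110.0, 198.0])]
future = predict_future_position(track, [0.0, 0.1], horizon=0.5)
```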

17.
Distributed as an open-source library since 2013, real-time appearance-based mapping (RTAB-Map) started as an appearance-based loop closure detection approach with memory management to deal with large-scale and long-term online operation. It then grew to implement simultaneous localization and mapping (SLAM) on various robots and mobile platforms. As each application brings its own set of constraints on sensors, processing capabilities, and locomotion, it raises the question of which SLAM approach is the most appropriate to use in terms of cost, accuracy, computation power, and ease of integration. Since most SLAM approaches are either visual- or lidar-based, comparison is difficult. Therefore, we decided to extend RTAB-Map to support both visual and lidar SLAM, providing in one package a tool allowing users to implement and compare a variety of 3D and 2D solutions for a wide range of applications with different robots and sensors. This paper presents this extended version of RTAB-Map and its use in comparing, both quantitatively and qualitatively, a large selection of popular real-world datasets (e.g., KITTI, EuRoC, TUM RGB-D, MIT Stata Center on the PR2 robot), outlining the strengths and limitations of visual and lidar SLAM configurations from a practical perspective for autonomous navigation applications.

18.
One of the steps to provide fundamental data for planning a mining effort is the magnetic surveying of a target area, which is typically carried out by conventional aircraft campaigns. However, besides the high cost, fixed-wing aerial vehicles present shortcomings, especially for drape flights over mountainous regions, where steep slopes are often present. Traditional human-crewed flights have to perform tedious and dangerous trajectories, under strict velocity and attitude constraints. In this paper, we deal with the problem of producing digital magnetic-elevation maps using autonomous and cooperative aerial robots. The proposed approach for autonomous mapping utilizes a custom-built fluxgate sensor and off-the-shelf cameras adapted for small airborne platforms. We also propose an innovative approach for generating a digital magnetic-elevation model from the gathered data. Our method was evaluated and validated in field tests in an industrial scenario to detect scrap metals in ore piles. Results show that the proposed method could reliably detect magnetic anomalies while generating accurate three-dimensional magnetic maps.
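A minimal sketch of how scattered magnetometer samples can be binned into a two-dimensional magnetic map is shown below; it is a generic gridding step, and the authors' model, which also fuses camera-derived elevation, is not reproduced here.

```python
import numpy as np

def grid_magnetic_map(xs, ys, field_nT, cell=2.0):
    """Average fluxgate readings into a regular 2D grid (NaN where no samples).

    xs, ys   : sample positions [m] from the UAV's navigation solution
    field_nT : total magnetic field readings [nT]
    cell     : grid resolution [m]
    """
    xs, ys, field_nT = map(np.asarray, (xs, ys, field_nT))
    ix = ((xs - xs.min()) / cell).astype(int)
    iy = ((ys - ys.min()) / cell).astype(int)
    grid_sum = np.zeros((iy.max() + 1, ix.max() + 1))
    grid_cnt = np.zeros_like(grid_sum)
    np.add.at(grid_sum, (iy, ix), field_nT)   # accumulate readings per cell
    np.add.at(grid_cnt, (iy, ix), 1.0)
    with np.errstate(invalid="ignore"):
        return grid_sum / np.where(grid_cnt == 0, np.nan, grid_cnt)

# Three samples along a short transect.
mag_map = grid_magnetic_map([0.0, 1.5, 4.0], [0.0, 0.5, 1.0], [48000.0, 48100.0, 48500.0])
```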

19.
Visual servoing approaches navigate a robot to the desired pose with respect to a given object using image measurements. As a result, these approaches have several applications in manipulation, navigation, and inspection. However, existing visual servoing approaches are instance-specific, that is, they control camera motion between two views of the same object. In this paper, we present a framework for visual servoing to a novel object instance. We further employ our framework for the autonomous inspection of vehicles using Micro Aerial Vehicles (MAVs), which is vital for day-to-day maintenance, damage assessment, and the merchandising of a vehicle. This visual inspection task comprises the MAV visiting the essential parts of the vehicle, for example, wheels, lights, and so forth, to get a closer look at the damage incurred. Existing methods for autonomous inspection cannot be extended to vehicles for the following reasons: First, several existing methods require a 3D model of the structure, which is not available for every vehicle. Second, existing methods require an expensive depth sensor for localization and path planning. Third, current approaches do not account for the semantic understanding of the vehicle, which is essential for identifying parts. Our instance-invariant visual servoing framework is capable of autonomously navigating to every essential part of a vehicle for inspection and can be initialized from any random pose. To the best of our knowledge, this is the first approach demonstrating fully autonomous visual inspection of vehicles using MAVs. We have validated the efficacy of our approach through a series of experiments in simulation and outdoor scenarios.
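Classical image-based visual servoing computes a camera velocity from the error between current and desired image-point features via the interaction matrix; the sketch below shows that textbook law only, not the instance-invariant formulation proposed in the paper.

```python
import numpy as np

def interaction_matrix(points, depths):
    """Stack the 2x6 point-feature interaction matrices (normalized coordinates)."""
    rows = []
    for (x, y), Z in zip(points, depths):
        rows.append([-1.0 / Z, 0.0, x / Z, x * y, -(1.0 + x * x), y])
        rows.append([0.0, -1.0 / Z, y / Z, 1.0 + y * y, -x * y, -x])
    return np.array(rows)

def ibvs_velocity(current, desired, depths, gain=0.5):
    """Camera velocity twist (vx, vy, vz, wx, wy, wz) driving feature error to zero."""
    error = (np.array(current) - np.array(desired)).ravel()
    L = interaction_matrix(current, depths)
    return -gain * np.linalg.pinv(L) @ error

# Four normalized image points observed now vs. at the desired inspection pose.
current = [(0.12, 0.10), (-0.08, 0.11), (-0.09, -0.12), (0.10, -0.09)]
desired = [(0.10, 0.10), (-0.10, 0.10), (-0.10, -0.10), (0.10, -0.10)]
twist = ibvs_velocity(current, desired, depths=[1.5, 1.5, 1.5, 1.5])
```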

20.
Li Ruikang, Huang Qiwei, Feng Hui, Hu Bo. 《机器人》 (Robot), 2020, 42(4): 416-426
To meet the need for fully autonomous operation of rotorcraft unmanned aerial vehicles (UAVs), an autonomous safe-landing system for rotorcraft UAVs over rugged terrain is constructed. The system analyzes the terrain of the landing area automatically through onboard real-time computation, searches for feasible landing spots, and performs the landing automatically. A low-cost stereo RGB-D camera is used as the depth sensor, and a truncated signed distance function (TSDF) is used to build a real-time 3D model of the landing-area terrain and to generate low-noise depth images of it. A real-time fine landing-spot search method adapted to the shape of the landing gear is designed, and a cascaded PID (proportional-integral-derivative) controller finally steers the UAV through a safe landing. The system is implemented on the DJI M100 UAV platform; a custom simulator was built for algorithm debugging, and autonomous safe landing was ultimately achieved on real rugged terrain. This work offers an effective and safe solution for emergency landing, logistics transport, and post-disaster search and rescue with rotorcraft UAVs.
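To illustrate the landing-spot search, the sketch below slides a footprint-sized window over a terrain height map (such as one extracted from a TSDF reconstruction) and ranks candidate spots by flatness. The footprint shape, thresholds, and scoring are assumptions, not the paper's method.

```python
import numpy as np

def find_landing_spot(height_map, footprint_cells=5, max_roughness=0.05, max_tilt=0.10):
    """Return (row, col) of the flattest admissible footprint-sized patch, or None.

    height_map: 2D array of terrain heights [m] on a regular grid.
    Roughness is the height standard deviation within the footprint; tilt is
    approximated by the height range across the footprint.
    """
    h, w = height_map.shape
    best, best_score = None, np.inf
    for r in range(h - footprint_cells + 1):
        for c in range(w - footprint_cells + 1):
            patch = height_map[r:r + footprint_cells, c:c + footprint_cells]
            roughness = patch.std()
            tilt = patch.max() - patch.min()
            if roughness < max_roughness and tilt < max_tilt and roughness < best_score:
                best = (r + footprint_cells // 2, c + footprint_cells // 2)
                best_score = roughness
    return best

# A mostly rough surface with one flat region.
terrain = 0.2 * np.random.rand(40, 40)
terrain[10:20, 25:35] = 0.05          # flat patch
spot = find_landing_spot(terrain)
```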
