Analyzing videos and images captured by unmanned aerial vehicles (aerial drones) is an emerging application attracting significant attention from researchers across computer vision. Currently, the major challenge is the development of autonomous operations that can complete missions and replace human operators. In this paper, we review these applications according to how drone-captured videos and images are analyzed in computer vision, categorizing them into three groups. The first group concerns remote sensing, with challenges such as camera calibration, image matching, and aerial triangulation. The second group concerns autonomous drone navigation, in which computer vision methods address challenges such as flight control, visual localization and mapping, and target tracking and obstacle detection. The third group is dedicated to using drone-captured images and videos in various applications, such as surveillance, agriculture and forestry, animal detection, disaster detection, and face recognition. Because most computer vision methods in these three categories must be designed for real-world conditions, and reproducing real drone-based conditions is not always possible, we aim to explore papers that provide databases for these purposes. Although current survey papers exist for the first two groups, none of them has focused on databases. This paper presents a complete review of databases in the first two groups, together with the works that applied their methods to these databases. Vision-based intelligent applications and their databases are explored in the third group, and we discuss open problems and avenues for future research.
An incubator is an organization that supports new ventures so that they can grow and survive during their early stages. While incubators have mainly been dedicated to information technology, life science may be the next hot spot. Are there established good practices? Are the business models in information technology and in life science comparable when it comes to start-up incubation? Leveraging our experience as practitioners in this field and using an inductive methodology, this paper proposes simple principles to help build robust incubators in life science and to contribute to disseminating an entrepreneurial approach in an industry still dominated by blue chips.
A robust Fault Diagnosis (FD) scheme for a real quadrotor Unmanned Aerial Vehicle (UAV) is proposed in this paper. First, a novel Adaptive Thau Observer (ATO) is developed to estimate the quadrotor system states and build a set of offset residuals indicating actuator faults. Based on these residuals, FD rules are designed to detect and isolate the faults as well as estimate the fault offset parameters. Second, a synthetic robust optimization scheme is presented to improve Fault Estimation (FE) accuracy; three key issues, namely modeling uncertainties, magnitude-order imbalances, and noise, are addressed. Finally, a typical rotor fault is simulated and injected into one of the quadrotor's four rotors, and experiments on the FD scheme are carried out. Unlike previous work on FD schemes for quadrotors, the proposed ATO-based FD scheme can not only detect and isolate failed actuators but also estimate fault severities. Despite the roughness of the real flight data, the FD results still achieve sufficient FE accuracy.
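The residual-based detection logic summarized above can be illustrated with a minimal sketch. The state-space matrices, observer gain, and threshold below are hypothetical placeholders for illustration only; they are not the paper's identified quadrotor model or its Adaptive Thau Observer design, which additionally includes adaptive terms.

```python
import numpy as np

# Hypothetical linearized model x' = A x + B u, y = C x (placeholder matrices,
# not the paper's identified quadrotor dynamics).
A = np.array([[0.0, 1.0], [0.0, -0.5]])
B = np.array([[0.0], [1.0]])
C = np.eye(2)
L = np.array([[2.0, 0.0], [0.0, 2.0]])   # assumed observer gain (placeholder)

THRESHOLD = 0.15                          # assumed residual threshold

def observer_step(x_hat, u, y, dt=0.01):
    """One Euler step of a Luenberger-style observer; the paper's ATO adds
    adaptive fault-offset estimation that is omitted here."""
    x_hat_dot = A @ x_hat + B @ u + L @ (y - C @ x_hat)
    return x_hat + dt * x_hat_dot

def detect_fault(y, x_hat):
    """Flag a fault when the output residual norm exceeds the assumed threshold."""
    residual = y - C @ x_hat
    return np.linalg.norm(residual) > THRESHOLD, residual
```

In such schemes, isolation then follows from examining which components of the residual vector (one per actuator channel) exceed their thresholds.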
To prevent unauthorized access to protected trusted platform module (TPM) objects, authorization protocols such as the object-specific authorization protocol (OSAP) have been introduced by the Trusted Computing Group (TCG). Under OSAP, processes trying to gain access to protected TPM objects must prove their knowledge of the relevant authorization data before access to the objects is granted. Chen and Ryan's 2009 analysis demonstrated OSAP's authentication vulnerability in sessions with shared authorization data. They also proposed the Session Key Authorization Protocol (SKAP), with fewer stages, as an alternative to OSAP, and their analysis of SKAP using ProVerif proves the authentication property. The purpose of this paper is to examine the usefulness of Colored Petri Nets (CPN) and CPN Tools for security analysis. Using OSAP and SKAP as case studies, we construct intruder and authentication property models in CPN; CPN Tools is then used to verify the authentication property under a Dolev–Yao-based attacker model. Verification of the authentication property in both models using the state space tool produces results consistent with those of Chen and Ryan.
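As a rough illustration of the proof-of-knowledge idea behind such authorization sessions, the sketch below shows a generic HMAC-based challenge-response in which a caller proves knowledge of an object's authorization secret. This is not the TCG-specified OSAP or SKAP message format; all function names and message fields are hypothetical.

```python
import hmac
import hashlib
import os

def new_session():
    """Both sides contribute nonces, as in rolling-nonce authorization sessions."""
    return {"nonce_caller": os.urandom(20), "nonce_tpm": os.urandom(20)}

def prove_knowledge(auth_secret: bytes, session: dict, command: bytes) -> bytes:
    """Caller's proof: an HMAC over the command and both nonces, keyed with the
    object's authorization secret (illustrative only, not the TPM wire format)."""
    msg = command + session["nonce_caller"] + session["nonce_tpm"]
    return hmac.new(auth_secret, msg, hashlib.sha1).digest()

def verify_knowledge(auth_secret: bytes, session: dict, command: bytes,
                     proof: bytes) -> bool:
    """Verifier recomputes the HMAC and compares in constant time."""
    expected = prove_knowledge(auth_secret, session, command)
    return hmac.compare_digest(expected, proof)
```

The vulnerability class studied by Chen and Ryan arises when the same authorization secret is shared across sessions, which is a property of how the secret is managed rather than of the HMAC construction sketched here.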
In this paper, we describe how to use geodesic energies defined on various sets of objects to solve several distance-related problems. We first present the theory of metamorphoses and the geodesic distances it induces on a Riemannian manifold, followed by classical applications in landmark and image matching. We then explain how to use the geodesic distance for new issues, which can be embedded in a general framework of matching with free extremities. This is illustrated by results on image and shape averaging and unlabeled landmark matching.
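For orientation, a schematic form of the metamorphosis energy as it is commonly written in this literature is given below; the notation (the velocity-field norm on a space V and the weight σ) follows standard presentations and is not necessarily the exact formulation used in the paper.

```latex
% Schematic metamorphosis energy on image paths I_t driven by velocity fields v_t
E(v, I) = \int_0^1 \left( \|v_t\|_V^2
  + \frac{1}{\sigma^2}\,\bigl\| \partial_t I_t + \nabla I_t \cdot v_t \bigr\|_{L^2}^2 \right) dt,
\qquad
d(I_0, I_1)^2 = \inf_{(v,I):\, I(0)=I_0,\ I(1)=I_1} E(v, I).
```

The "free extremities" setting mentioned in the abstract corresponds to relaxing one or both endpoint constraints in the infimum and adding a data-attachment term instead.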
Laurent Garcin is a former student of the Ecole Polytechnique. He obtained his Ph.D. in 2004 at the Ecole Normale de Cachan, working on matching methods for landmarks and images. He is an engineer at the French National Geographic Institute.
Laurent Younes is a former student of the Ecole Normale Superieure in Paris. He was awarded the Ph.D. from the University Paris Sud in 1989, and the thesis advisor certification from the same university in 1995. He works on the statistical analysis of images and shapes, and on modeling shape deformations and shape spaces. He entered CNRS, the French National Research Center, in October 1991, where he was a "Directeur de Recherche" until 2003. He joined Johns Hopkins University in July 2003, where he is now a professor in the Department of Applied Mathematics and Statistics and the Center for Imaging Science.
One approach to delaying the spread of the novel coronavirus (COVID-19) is to reduce human travel by imposing travel restriction policies. Understanding the actual human mobility response to such policies remains a challenge owing to the lack of a large-scale observational dataset describing human mobility during the pandemic. This study uses an integrated dataset, consisting of anonymized and privacy-protected location data from over 150 million monthly active samples in the USA, COVID-19 case data, and census population information, to uncover mobility changes during COVID-19 and under the stay-at-home state orders in the USA. The study quantifies human mobility responses with three important metrics: daily average number of trips per person; daily average person-miles travelled; and daily percentage of residents staying at home. The data analytics reveal a spontaneous mobility reduction that occurred regardless of government actions and a 'floor' phenomenon, where human mobility reached a lower bound and stopped decreasing soon after each state announced the stay-at-home order. A set of longitudinal models is then developed and confirms that the states' stay-at-home policies have only led to about a 5% reduction in average daily human mobility. Lessons learned from the data analytics and longitudinal models offer valuable insights for government actions in preparation for another COVID-19 surge or another virus outbreak in the future.
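A minimal sketch of how the three mobility metrics could be computed from trip-level location data is given below. The column names and table layout are assumptions for illustration; they are not the study's actual integrated dataset schema.

```python
import pandas as pd

def daily_mobility_metrics(trips: pd.DataFrame, devices: pd.DataFrame) -> pd.DataFrame:
    """Compute the three daily metrics described in the abstract.

    Assumed (hypothetical) schemas:
      trips:   one row per trip, columns ['date', 'device_id', 'miles']
      devices: one row per observed device-day, columns ['date', 'device_id',
               'stayed_home'] with stayed_home a boolean.
    """
    n_devices = devices.groupby("date")["device_id"].nunique()

    trips_per_person = trips.groupby("date").size() / n_devices
    miles_per_person = trips.groupby("date")["miles"].sum() / n_devices
    pct_staying_home = devices.groupby("date")["stayed_home"].mean() * 100

    return pd.DataFrame({
        "avg_trips_per_person": trips_per_person,
        "avg_person_miles": miles_per_person,
        "pct_staying_home": pct_staying_home,
    })
```

In practice such device-level metrics are typically weighted against census population to correct for uneven sample coverage, as the integrated dataset in the study does.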
The dissolution of kaolinite clay in hydrochloric acid solutions has been carried out in the presence of fluoride ions. The presence of fluoride ions activates the clay for leaching, making higher extractions possible at lower roasting and leaching temperatures. The activation energy for the leaching of clay calcined at 540°C decreases from 71 kJ/mol to 23 kJ/mol in the presence of fluoride ions. Dissolution in the presence of fluoride appears to follow a second-order reaction mechanism.
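The reported activation energies follow the usual Arrhenius treatment; the relation typically used to extract an apparent activation energy from rate constants measured at several temperatures is sketched below (the study's exact fitting procedure is not specified here).

```latex
% Arrhenius relation for the apparent leaching rate constant k(T)
k = A \exp\!\left(-\frac{E_a}{RT}\right)
\quad\Longrightarrow\quad
\ln k = \ln A - \frac{E_a}{R}\cdot\frac{1}{T}
```

The slope of a plot of ln k versus 1/T gives E_a = -R × (slope); the drop from 71 kJ/mol to 23 kJ/mol with fluoride thus corresponds to a much weaker temperature dependence of the leaching rate.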
Several Software Reliability Growth Models (SRGMs) have been developed in the literature assuming the debugging process to be perfect, implying a one-to-one correspondence between the number of failures observed and the number of errors removed. In reality, however, an error that is supposed to have been removed may cause a failure again, possibly because a new error is spawned due to imperfect debugging. When such a phenomenon exists, software reliability growth is S-shaped. In this paper, we develop a model that describes the imperfect debugging process and has the built-in flexibility to capture a wide class of growth curves. Earlier attempts at modelling such a process could capture only a particular curve; in other words, if the failure observation phenomenon is exponential, the error removal is again modelled by an exponential growth curve. The applicability of the model is shown on several data sets obtained from different software development projects.
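To make the S-shaped growth concrete, the sketch below fits the standard delayed S-shaped mean value function m(t) = a(1 - (1 + bt)e^{-bt}) (Yamada et al.) to cumulative failure counts. This is a generic SRGM from the literature, not the specific imperfect-debugging model proposed in the paper, and the failure counts are placeholder values chosen only to trace an S-shaped curve.

```python
import numpy as np
from scipy.optimize import curve_fit

def delayed_s_shaped(t, a, b):
    """Delayed S-shaped SRGM mean value function: expected cumulative failures."""
    return a * (1.0 - (1.0 + b * t) * np.exp(-b * t))

# Placeholder data: cumulative failures observed at weekly intervals.
t = np.arange(1, 13, dtype=float)
cum_failures = np.array([4, 12, 23, 33, 42, 50, 56, 61, 66, 69, 72, 74], dtype=float)

# Fit (a, b); p0 is a rough initial guess for total errors and detection rate.
(a_hat, b_hat), _ = curve_fit(delayed_s_shaped, t, cum_failures, p0=(80.0, 0.3))
print(f"estimated total errors a = {a_hat:.1f}, detection rate b = {b_hat:.3f}")
```

A flexible model of the kind described in the abstract would additionally accommodate exponential (concave) growth as a limiting case instead of being tied to one fixed curve shape.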