This paper presents a new real-time localization system for a mobile robot. We show that autonomous navigation is possible in outdoor environments with the use of a single camera and natural landmarks. To do that, we use a three-step approach. In a learning step, the robot is manually guided along a path and a video sequence is recorded with a front-looking camera. A structure-from-motion algorithm is then used to build a 3D map from this learning sequence. Finally, in the navigation step, the robot uses this map to compute its localization in real time and follows the learned path, or a slightly different path if desired. The vision algorithms used for map building and localization are first detailed. A large part of the paper is then dedicated to an experimental evaluation of the accuracy and robustness of our algorithms, based on experimental data collected over two years in various environments.
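As a rough illustration of the navigation step described above (a minimal sketch, not the authors' actual implementation), the snippet below estimates a camera pose from 2D-3D matches against a prebuilt map using OpenCV; the map layout and all names (map_points_3d, map_descriptors, intrinsics K) are hypothetical.

```python
# Minimal sketch: localize one video frame against a learned 3D map.
# The map format and variable names are assumptions for illustration.
import cv2
import numpy as np

def localize(frame_gray, map_points_3d, map_descriptors, K):
    """Estimate the camera pose for one frame from 2D-3D correspondences."""
    orb = cv2.ORB_create()
    keypoints, descriptors = orb.detectAndCompute(frame_gray, None)
    if descriptors is None:
        return None

    # Match current-frame descriptors to the map's landmark descriptors.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(descriptors, map_descriptors)
    if len(matches) < 6:
        return None  # too few 2D-3D correspondences for PnP

    pts_2d = np.float32([keypoints[m.queryIdx].pt for m in matches])
    pts_3d = np.float32([map_points_3d[m.trainIdx] for m in matches])

    # Robust pose estimation: PnP inside a RANSAC loop rejects bad matches.
    ok, rvec, tvec, inliers = cv2.solvePnPRansac(pts_3d, pts_2d, K, None)
    return (rvec, tvec) if ok else None
```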
International Journal on Document Analysis and Recognition (IJDAR) - Motivated by the increasing possibility of tampering with genuine documents during transmission over digital channels, we focus...
Many small software organizations have recognized the need to improve their software products. Evaluating the software product alone seems insufficient, since its quality is known to depend largely on the process used to create it. Thus, small organizations are asking for evaluations of their software processes as well as their products. The ISO/IEC 14598-5 standard is already used as a methodological basis for evaluating software products. This article explores how it can be combined with the CMMI to produce a methodology that can be tailored for process evaluation, so that small organizations can improve their software processes.
SM: CMMI is a service mark of Carnegie Mellon University.
Sylvie Trudel has over 20 years of experience in software. She worked for more than 10 years in the development and implementation of management information systems and embedded real-time systems. Since 1996, she has worked as a process improvement specialist, implementing best practices from the CMM and CMMI models into organizations' processes. She has performed several CMM and CMMI assessments and participated in many others using methods such as CBA IPI, SCE, and other proprietary methods. She obtained a bachelor's degree in computer science in 1986 from Laval University in Québec City and a master's degree in software engineering from the École de technologie supérieure (ÉTS) in Montréal. Sylvie is currently working as a software engineering advisor at the Centre de Recherche Informatique de Montréal (CRIM).
Jean-Marc Lavoie has been working in software development for over 10 years. In 2003, he performed and published a comparative study between the Guide to the SWEBOK and the CMMI. Jean-Marc obtained a bachelor's degree in electrical engineering. He is pursuing a master's degree in software engineering at the École de technologie supérieure (ÉTS) in Montréal while working as a software architect at Trisotech.
Marie-Claude Paré has been working in software development for 7 years. Marie-Claude obtained a bachelor's degree in software engineering from École Polytechnique in Montréal. She is pursuing a master's degree in software engineering at the École de technologie supérieure (ÉTS) in Montréal while working as a software engineer at Motorola GSG Canada.
Dr Witold Suryn is a professor at the École de technologie supérieure, Montréal, Canada (an engineering school of the Université du Québec network of institutions), where he teaches graduate and undergraduate software engineering courses and conducts research in the domains of software quality engineering, the software engineering body of knowledge, and software engineering fundamental principles. Dr Suryn is also the principal researcher and the director of GELOG : IQUAL, the Software Quality Engineering Research Group at the École de technologie supérieure. Since October 2003, Dr Suryn has held the position of International Secretary of ISO/IEC SC7 – System and Software Engineering.
Post-mission analysis of the ARIANE launcher is performed at ARIANESPACE. This activity, called the 'level 0 post-flight analysis' (PFA), is carried out after each launch by about 60 engineers working together under the leadership of ARIANESPACE.
The PFA is one of the most critical ARIANE operations, for several reasons:
- The launch rate (8 a year for ARIANE 4) leaves very little time to carry out all the verification work. Moreover, the PFA is a mandatory step before the next launch can be authorized.
- The complexity of the ARIANE launcher places very high demands on the PFA engineers. Moreover, there are problems with the availability of people with the relevant expert knowledge (characterized by substantial staff turnover during the 10-year life of ARIANE 4), which could potentially result in errors or omissions. It is therefore very important to be able to take into account the experience of preceding flights and to record the results and the knowledge accumulated for each launch.
- The quality and reliability of the PFA depend mainly on the accessibility of data and on the methodology used.
Because the PFA is still largely done manually, and does not benefit from improved methodologies or advanced technologies providing computerized support for data processing and diagnosis, ARIANESPACE has sponsored MATRA ESPACE to develop a knowledge-based system, called ARIANEXPERT, to support the PFA activity. This system combines AI techniques and numerical analysis techniques with advanced graphical capabilities.
A prototype was delivered in April 1990 and has been used for six months by ARIANESPACE during real PFAs. Several lessons have been drawn from this operational experience and are described in this paper. They concern:
- The utility and justification of the use of AI techniques, stemming mostly from the explanation capabilities and the emphasis placed on capturing expert knowledge.
- The difficulties associated with integrating such systems into ARIANE operations, due to the introduction of very new tasks.
- The user point of view, which evolved from reluctant to convinced.
This article describes, from an industrial user's point of view, how large-signal GaAs MESFET and HEMT modeling can be done accurately and efficiently for power MMIC amplifier design. The method is based on commercially available CAD tools enhanced by in-house software (e.g., small-signal parameter extraction, generation of load-pull contours). The Materka model is shown to predict accurately the large-signal characteristics of GaAs MESFETs, but not of pseudomorphic HEMTs. For these devices, a modified Angelov model is found to be adequate. A method for determining the numerous large-signal model parameters is presented. Model verification is achieved by comparing simulated data with on-wafer measurements such as static I(V) characteristics, multiple-bias S-parameters, gain compression characteristics, and load-pull contours. Results of device scaling and calculations of optimum load impedances are discussed. The close fit to the measured data proves that an excellent basis for large-signal power MMIC design has been established.
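For orientation only (the paper's modified model and fitted parameter set are not reproduced here), the sketch below shows the core drain-current equation of the basic Angelov (Chalmers) large-signal model that the authors adapt for pseudomorphic HEMTs; all parameter values are illustrative placeholders, not measured device data.

```python
# Sketch of the basic Angelov (Chalmers) large-signal drain-current equation:
#   I_ds = I_pk * (1 + tanh(psi)) * (1 + lambda*V_ds) * tanh(alpha*V_ds),
#   psi  = P1*(V_gs - V_pk) + P2*(V_gs - V_pk)^2 + P3*(V_gs - V_pk)^3.
# Parameter values below are placeholders, not fitted data.
import numpy as np

def angelov_ids(vgs, vds, ipk=0.05, vpk=-0.2, p1=1.5, p2=0.1, p3=0.05,
                alpha=2.0, lam=0.02):
    """Drain current I_ds(V_gs, V_ds) of the basic Angelov model."""
    psi = p1 * (vgs - vpk) + p2 * (vgs - vpk)**2 + p3 * (vgs - vpk)**3
    return ipk * (1.0 + np.tanh(psi)) * (1.0 + lam * vds) * np.tanh(alpha * vds)

# Example: sweep V_ds at a few gate biases to trace static I(V) characteristics.
vds = np.linspace(0.0, 5.0, 101)
curves = {vgs: angelov_ids(vgs, vds) for vgs in (-0.6, -0.4, -0.2, 0.0)}
```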
We consider the problem of assuring the trustworthiness (i.e. reliability and robustness) and prolonging the lifetime of wireless ad hoc networks using the OLSR routing protocol in the presence of selfish nodes. Assuring the trustworthiness of these networks can be achieved by selecting the most trusted paths, while prolonging the lifetime can be achieved by (1) reducing the number of relay nodes (MPRs) propagating the topology control (TC) messages and (2) considering the residual energy levels of these relay nodes in the selection process. In this paper, we propose a novel clustering algorithm and a relay node selection algorithm based on the residual energy level and connectivity index of the nodes. This hybrid model is referred to as H-OLSR. The OLSR messages are adapted to handle the cluster-head election and MPR node selection algorithms. These algorithms are designed to cope with selfish nodes that benefit from others without cooperating with them. Hence, we propose an incentive-compatible mechanism that motivates nodes to behave truthfully during the selection and election processes. Incentive retributions increase the reputation of the nodes. Since network services are granted according to nodes' accumulated reputation, the nodes should cooperate. Finally, based on nodes' reputation, the most trusted forwarding paths are determined. This reputation-based hybrid model is referred to as RH-OLSR. Simulation results show that the novel H-OLSR model based on energy and connectivity can efficiently prolong the network lifetime, while the RH-OLSR model improves the trustworthiness of the network through the selection of the most trusted paths based on nodes' reputations. These two processes together define the reputation-based clustering OLSR (RBC-OLSR) routing protocol.
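As a hedged sketch of the relay-selection idea (the paper's actual H-OLSR scoring function is not given here), MPR candidates could be ranked by a weighted combination of residual energy and connectivity index; the weights, field names, and data below are assumptions.

```python
# Hypothetical sketch: rank MPR (relay) candidates by residual energy and
# connectivity, in the spirit of H-OLSR. Weights and fields are assumptions.
from dataclasses import dataclass

@dataclass
class Node:
    node_id: int
    residual_energy: float     # remaining battery, normalized to [0, 1]
    connectivity_index: float  # e.g. fraction of 2-hop neighbors covered

def select_relays(candidates, k, w_energy=0.6, w_conn=0.4):
    """Pick the k best relay candidates by a weighted energy/connectivity score."""
    score = lambda n: w_energy * n.residual_energy + w_conn * n.connectivity_index
    return sorted(candidates, key=score, reverse=True)[:k]

nodes = [Node(1, 0.9, 0.5), Node(2, 0.4, 0.95), Node(3, 0.7, 0.7)]
relays = select_relays(nodes, k=2)  # selects nodes 1 and 3 with these weights
```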
We have investigated the transport properties, resistivity and Hall effect, in a series of underdoped GdBa2Cu3O_x thin films grown by off-axis magnetron sputtering. We find a systematic correlation between the critical temperature T_c and the inverse Hall constant R_H^{-1}, related, in simple models, to the carrier concentration n. Our experimental thin-film T_c(n) data are in good agreement with the temperature-doping phase diagram obtained for YBa2Cu3O_x single crystals. By measuring the activation energies in the liquid vortex phase, and by using a two-dimensional model for vortex dynamics, we have extracted the penetration depth of these samples, and studied the relation between the carrier concentration and the superfluid density to probe the role of phase fluctuations in superconductivity.
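For context (a standard textbook relation, not a result specific to this paper), the "simple model" link between the Hall constant and the carrier concentration is the single-band Drude result:

```latex
% Single-band (Drude) relation between Hall constant and carrier density:
% for carriers of charge e, the inverse Hall constant measures n directly.
R_H = \frac{1}{n e} \quad\Longrightarrow\quad R_H^{-1} = n e
```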
Many research initiatives carried out in production management consider process planning and operations scheduling as two separate and sequential functions. However, in certain contexts, the two functions must be better integrated. This is the case in divergent production systems with co-production (i.e. production of different products at the same time from a single product input) when alternative production processes are available. This paper studies such a context and focuses on the case of drying and finishing operations in a softwood lumber facility. The situation is addressed using a single model that simultaneously performs process planning and scheduling. We evaluate two alternative formulations: the first is based on mixed integer programming (MIP) and the second on constraint programming (CP). We also propose a search procedure to improve the performance of the CP approach. Both approaches are compared with respect to their capacity to generate good solutions in short computation times.
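As a hedged toy illustration of the MIP formulation style (not the paper's model), the sketch below selects one alternative process per lumber lot under a shared kiln-capacity limit, using the PuLP library; all data, names, and constraint choices are invented.

```python
# Toy MIP sketch: choose one alternative process per lumber lot, subject to a
# shared kiln-hour capacity, minimizing total cost. All data is invented.
from pulp import LpProblem, LpVariable, LpMinimize, lpSum, LpBinary, value

lots = ["L1", "L2", "L3"]
processes = ["fast_dry", "slow_dry"]          # alternative production processes
cost = {("L1", "fast_dry"): 8, ("L1", "slow_dry"): 5,
        ("L2", "fast_dry"): 9, ("L2", "slow_dry"): 6,
        ("L3", "fast_dry"): 7, ("L3", "slow_dry"): 4}
kiln_hours = {"fast_dry": 2, "slow_dry": 5}   # kiln time consumed per lot
capacity = 9                                   # total kiln hours available

prob = LpProblem("process_planning", LpMinimize)
x = {(l, p): LpVariable(f"x_{l}_{p}", cat=LpBinary)
     for l in lots for p in processes}

prob += lpSum(cost[l, p] * x[l, p] for l in lots for p in processes)
for l in lots:                                 # each lot gets exactly one process
    prob += lpSum(x[l, p] for p in processes) == 1
prob += lpSum(kiln_hours[p] * x[l, p]          # shared kiln capacity
              for l in lots for p in processes) <= capacity

prob.solve()
chosen = {l: p for (l, p) in x if value(x[l, p]) > 0.5}
```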
How do people work when they are collaborating to write a document? What kinds of tools do they use and, in particular, do they resort to groupware for this task? Forty-one people filled out a questionnaire placed on the World Wide Web. In spite of the existence of specialized collaborative writing tools, most respondents reported using individual word processors and email as their main tools for writing joint documents. Respondents noted the importance of functions such as change tracking, version control, and synchronous work for collaborative writing tools. This study also confirmed the great variability that exists between collaborative writing projects, whether in group membership, management, writing strategy, or scheduling.
The simulation of gross primary production (GPP) at various spatial and temporal scales remains a major challenge for quantifying the global carbon cycle. We developed a light use efficiency model, called EC-LUE, driven by only four variables: normalized difference vegetation index (NDVI), photosynthetically active radiation (PAR), air temperature, and the Bowen ratio of sensible to latent heat flux. The EC-LUE model may have the most potential to adequately address the spatial and temporal dynamics of GPP because its parameters (i.e., the potential light use efficiency and optimal plant growth temperature) are invariant across the various land cover types. However, the application of the previous EC-LUE model was hampered by poor prediction of the Bowen ratio at large spatial scales. In this study, we substituted the ratio of evapotranspiration (ET) to net radiation for the Bowen ratio, and revised the RS-PM (Remote Sensing-Penman Monteith) model for quantifying ET. Fifty-four eddy covariance towers, covering various ecosystem types, were selected to calibrate and validate the revised RS-PM and EC-LUE models. The revised RS-PM model explained 82% and 68% of the observed variation of ET at the calibration and validation sites, respectively. Using the estimated ET as input, the EC-LUE model performed well at the calibration and validation sites, explaining 75% and 61% of the observed GPP variation, respectively.
Global patterns of ET and GPP at a spatial resolution of 0.5° latitude by 0.6° longitude during the years 2000-2003 were determined using the global MERRA dataset (Modern Era Retrospective-Analysis for Research and Applications) and MODIS (Moderate Resolution Imaging Spectroradiometer) data. The global estimates of ET and GPP agreed well with other global models from the literature, with the highest ET and GPP over tropical forests and the lowest values in dry and high-latitude areas. However, comparisons with observed GPP at eddy flux towers showed significant underestimation of ET and GPP due to the lower net radiation of the MERRA dataset. Applying a procedure to correct the systematic errors of global meteorological data would improve global estimates of GPP and ET. The revised RS-PM and EC-LUE models provide alternative approaches that make it possible to map ET and GPP over large areas because (1) the model parameters are invariant across various land cover types and (2) all driving forces of the models may be derived from remote sensing data or existing climate observation networks.
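As a hedged sketch of the revised EC-LUE structure described above (the scalar forms and parameter values are illustrative assumptions, not the paper's calibrated model), GPP can be computed as absorbed PAR times a potential light use efficiency, down-regulated by the more limiting of a temperature scalar and a moisture scalar, where the moisture scalar here is the ET-to-net-radiation ratio.

```python
# Illustrative sketch of the revised EC-LUE light-use-efficiency logic.
# Scalar forms and parameter values are assumptions for illustration only.
import numpy as np

def temperature_scalar(t_air, t_min=0.0, t_max=40.0, t_opt=21.0):
    """Temperature limitation in [0, 1], peaking at the optimum t_opt (degC)."""
    num = (t_air - t_min) * (t_air - t_max)
    den = num - (t_air - t_opt) ** 2
    return float(np.clip(num / den, 0.0, 1.0)) if den != 0 else 0.0

def ec_lue_gpp(ndvi, par, t_air, et, rn, eps_max=2.14):
    """GPP = fPAR * PAR * eps_max * min(Ts, Ws); eps_max in g C per MJ PAR."""
    fpar = float(np.clip(1.24 * ndvi - 0.168, 0.0, 1.0))  # linear fPAR-NDVI assumption
    ts = temperature_scalar(t_air)
    ws = float(np.clip(et / rn, 0.0, 1.0))  # revised moisture scalar: ET / net radiation
    return fpar * par * eps_max * min(ts, ws)

# Example with invented inputs: NDVI 0.7, PAR 10 MJ m^-2 d^-1, 22 degC, ET/Rn = 0.6.
gpp = ec_lue_gpp(ndvi=0.7, par=10.0, t_air=22.0, et=6.0, rn=10.0)
```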