  Paid full text   619 articles
  Free   2 articles
  Free (domestic)   2 articles
Electrical engineering   3 articles
General   2 articles
Chemical industry   130 articles
Metalworking   19 articles
Machinery and instruments   5 articles
Building science   20 articles
Energy and power   18 articles
Light industry   74 articles
Water conservancy   1 article
Petroleum and natural gas   2 articles
Radio and electronics   56 articles
General industrial technology   133 articles
Metallurgy   22 articles
Nuclear technology   7 articles
Automation technology   131 articles
  2024   1 article
  2023   5 articles
  2022   12 articles
  2021   13 articles
  2020   9 articles
  2019   7 articles
  2018   5 articles
  2017   10 articles
  2016   18 articles
  2015   11 articles
  2014   31 articles
  2013   25 articles
  2012   42 articles
  2011   67 articles
  2010   40 articles
  2009   38 articles
  2008   36 articles
  2007   30 articles
  2006   30 articles
  2005   16 articles
  2004   23 articles
  2003   25 articles
  2002   13 articles
  2001   7 articles
  2000   14 articles
  1999   14 articles
  1998   5 articles
  1997   14 articles
  1996   8 articles
  1995   7 articles
  1994   7 articles
  1993   6 articles
  1992   5 articles
  1991   4 articles
  1990   1 article
  1989   3 articles
  1988   2 articles
  1987   2 articles
  1986   2 articles
  1984   2 articles
  1983   1 article
  1981   1 article
  1980   4 articles
  1977   1 article
  1976   2 articles
  1974   2 articles
  1973   1 article
  1967   1 article
Sorted by: 623 results found, search time 31 ms
71.
Monocular Vision for Mobile Robot Localization and Autonomous Navigation   Total citations: 5 (self-citations: 0, citations by others: 5)
This paper presents a new real-time localization system for a mobile robot. We show that autonomous navigation is possible in outdoor environments using a single camera and natural landmarks. To do this, we use a three-step approach. In a learning step, the robot is manually guided along a path and a video sequence is recorded with a front-looking camera. A structure-from-motion algorithm is then used to build a 3D map from this learning sequence. Finally, in the navigation step, the robot uses this map to compute its localization in real time and follows the learned path, or a slightly different path if desired. The vision algorithms used for map building and localization are first detailed. A large part of the paper is then dedicated to an experimental evaluation of the accuracy and robustness of our algorithms, based on data collected over two years in various environments.
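The three-step "teach and repeat" idea described in this abstract can be sketched very roughly as follows. This is a hypothetical illustration, not the paper's algorithm: the learned path is reduced to a list of keyframes (pose plus image descriptor), and localization simply returns the pose of the best-matching keyframe; a real system would match features against the SfM landmark map and solve for the camera pose.

```python
import math

def descriptor_distance(a, b):
    """Euclidean distance between two image descriptors (illustrative)."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def localize(keyframes, current_descriptor):
    """Return the pose of the keyframe whose descriptor best matches."""
    best = min(keyframes,
               key=lambda kf: descriptor_distance(kf["desc"], current_descriptor))
    return best["pose"]

# Learned path: poses (x, y, heading) with toy descriptors recorded
# during the manually guided learning run.
path = [
    {"pose": (0.0, 0.0, 0.0), "desc": [0.1, 0.9, 0.2]},
    {"pose": (1.0, 0.0, 0.0), "desc": [0.8, 0.2, 0.5]},
    {"pose": (2.0, 0.5, 0.3), "desc": [0.4, 0.4, 0.9]},
]

pose = localize(path, [0.75, 0.25, 0.5])
print(pose)  # pose of the nearest keyframe along the learned path
```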
72.
International Journal on Document Analysis and Recognition (IJDAR) - Motivated by the increasing possibility of tampering with genuine documents during transmission over digital channels, we focus...
73.
Many small software organizations have recognized the need to improve their software products. Evaluating the software product alone seems insufficient, since its quality is known to depend largely on the process used to create it. Thus, small organizations are asking for evaluations of both their software processes and products. The ISO/IEC 14598-5 standard is already used as a methodological basis for evaluating software products. This article explores how it can be combined with the CMMI to produce a methodology that can be tailored for process evaluation, helping small organizations improve their software processes. SM: CMMI is a service mark of Carnegie Mellon University. Sylvie Trudel has over 20 years of experience in software. She worked for more than 10 years in the development and implementation of management information systems and embedded real-time systems. Since 1996, she has worked as a process improvement specialist, implementing best practices from the CMM and CMMI models into organizations' processes. She has performed several CMM and CMMI assessments and participated in many others, such as CBA IPI, SCE, and other proprietary methods. She obtained a bachelor's degree in computer science in 1986 from Laval University in Québec City and a master's degree in software engineering at École de Technologie Supérieure (ÉTS) in Montréal. Sylvie is currently working as a software engineering advisor at the Centre de Recherche Informatique de Montréal (CRIM). Jean-Marc Lavoie has been working in software development for over 10 years. He performed and published a comparative study between the Guide to the SWEBOK and the CMMI in 2003. Jean-Marc obtained a bachelor's degree in electrical engineering. He is pursuing a master's degree in software engineering at École de Technologie Supérieure (ÉTS) in Montréal while working as a software architect at Trisotech. Marie-Claude Paré has been working in software development for 7 years.
Marie-Claude obtained a bachelor's degree in software engineering from École Polytechnique in Montréal. She is pursuing a master's degree in software engineering at École de Technologie Supérieure (ÉTS) in Montréal while working as a software engineer at Motorola GSG Canada. Dr Witold Suryn is a professor at the École de technologie supérieure, Montreal, Canada (an engineering school of the Université du Québec network of institutions), where he teaches graduate and undergraduate software engineering courses and conducts research in software quality engineering, the software engineering body of knowledge, and software engineering fundamental principles. Dr Suryn is also the principal researcher and director of GELOG: IQUAL, the Software Quality Engineering Research Group at École de technologie supérieure. Since October 2003, Dr Suryn has held the position of International Secretary of ISO/IEC SC7 – Systems and Software Engineering.
74.
The ARIANE launcher post-mission analysis is done at ARIANESPACE. This activity, called the 'level 0 post flight analysis' (PFA), is carried out after each launch by about 60 engineers working together under the leadership of ARIANESPACE.

The PFA is one of the most critical ARIANE operations, for several reasons:

• The launch rate (eight per year for ARIANE 4) leaves very little time to carry out all the verification work. Moreover, the PFA is a mandatory step before the next launch can be authorized.
• The complexity of the ARIANE launcher places very high demands on the PFA engineers. Moreover, there are problems with the availability of people with relevant expert knowledge (characterized by substantial staff turnover during the 10-year life of ARIANE 4), which could potentially result in errors or omissions. It is therefore very important to be able to take into account the experience of preceding flights and to record the results and knowledge accumulated for each launch.
• The quality and reliability of the PFA depend mainly on the accessibility of data and on the methodology used.

Because the PFA is still largely done manually, and does not benefit from improved methodologies and advanced technologies providing computerized support for data processing and diagnosis, ARIANESPACE has sponsored MATRA ESPACE to develop a knowledge-based system, called ARIANEXPERT, for supporting the PFA activity. This system combines AI techniques and numerical analysis techniques with advanced graphical capabilities.

A prototype was delivered in April 1990 and has been used for six months by ARIANESPACE during real PFAs. Several lessons drawn from this operational experience are described in this paper. They concern:

• The utility and justification of the use of AI techniques, stemming mostly from the explanation capabilities and the emphasis on capturing expert knowledge.
• The difficulties associated with integrating such systems into ARIANE operations, due to the introduction of very new tasks.
• The users' point of view, which evolved from reluctance to conviction.
75.
This article describes, from an industrial user's point of view, how large-signal GaAs MESFET and HEMT modeling can be done accurately and efficiently for power MMIC amplifier design. The method is based on commercially available CAD tools enhanced by in-house software (e.g., small-signal parameter extraction, generation of load-pull contours). The Materka model is shown to predict accurately the large-signal characteristics of GaAs MESFETs, but not of pseudomorphic HEMTs; for these devices, a modified Angelov model is found to be adequate. A method for determining the numerous large-signal model parameters is presented. Model verification is achieved by comparing simulated and on-wafer measured data such as static I(V) characteristics, multiple-bias S-parameters, gain compression characteristics, and load-pull contours. Results of device scaling and calculations of optimum load impedances are discussed. The close fit to the measured data shows that an excellent basis for large-signal power MMIC design has been established.
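The Materka and Angelov models mentioned above belong to a family of empirical large-signal drain-current expressions. As a much simpler illustration of that family, the sketch below evaluates a Curtice-style quadratic model; the parameter values are placeholders, not extracted device data.

```python
import math

def drain_current(vgs, vds, beta=0.05, vt=-1.5, lam=0.05, alpha=2.0):
    """Curtice quadratic model:
    Ids = beta * (Vgs - Vt)^2 * (1 + lambda*Vds) * tanh(alpha*Vds).
    Parameters here are illustrative placeholders (A/V^2, V, 1/V, 1/V).
    """
    if vgs <= vt:
        return 0.0  # device pinched off below threshold
    return beta * (vgs - vt) ** 2 * (1.0 + lam * vds) * math.tanh(alpha * vds)

# Evaluate one bias point of the static I(V) characteristic.
ids = drain_current(vgs=0.0, vds=3.0)
print(f"Ids = {ids * 1e3:.2f} mA")
```

Large-signal parameter extraction, as in the article, amounts to fitting such parameters (beta, vt, lam, alpha, and many more in the Materka/Angelov cases) to measured I(V) and S-parameter data.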
76.
We consider the problem of assuring the trustworthiness (i.e. reliability and robustness) and prolonging the lifetime of wireless ad hoc networks using the OLSR routing protocol in the presence of selfish nodes. Trustworthiness can be assured by selecting the most trusted paths, while the lifetime can be prolonged by (1) reducing the number of relay nodes (MPRs) propagating the topology control (TC) messages and (2) considering the residual energy levels of these relay nodes in the selection process. In this paper, we propose a novel clustering algorithm and a relay node selection algorithm based on the residual energy level and connectivity index of the nodes. This hybrid model is referred to as H-OLSR. The OLSR messages are adapted to handle the cluster head election and MPR node selection algorithms. These algorithms are designed to cope with selfish nodes that benefit from others without cooperating with them. Hence, we propose an incentive-compatible mechanism that motivates nodes to behave truthfully during the selection and election processes. Incentive retributions increase the reputation of the nodes; since network services are granted according to nodes' accumulated reputation, the nodes should cooperate. Finally, based on nodes' reputation, the most trusted forwarding paths are determined. This reputation-based hybrid model is referred to as RH-OLSR. Simulation results show that the novel H-OLSR model based on energy and connectivity can efficiently prolong the network lifetime, while the RH-OLSR model improves the trustworthiness of the network through the selection of the most trusted paths based on nodes' reputations. These two processes together define the reputation-based clustering OLSR (RBC-OLSR) routing protocol.
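The energy- and connectivity-aware relay selection described above can be sketched as a greedy set cover. This is a hypothetical illustration of the idea, not the paper's exact H-OLSR formulation: each candidate relay is scored by a weighted mix of residual energy and how many still-uncovered two-hop neighbours it reaches, and relays are picked until the two-hop neighbourhood is covered.

```python
def select_relays(neighbors, two_hop, energy, alpha=0.5):
    """Greedy cover of two-hop neighbours by high-scoring one-hop relays.

    neighbors: {relay: set of nodes it reaches}
    two_hop:   set of two-hop nodes that must be covered
    energy:    {relay: residual energy, normalized to [0, 1]}
    alpha:     illustrative weight between energy and coverage
    """
    uncovered = set(two_hop)
    relays = []
    candidates = set(neighbors)
    while uncovered and candidates:
        # score = alpha * energy + (1 - alpha) * fraction of uncovered reached
        best = max(
            candidates,
            key=lambda n: alpha * energy[n]
            + (1 - alpha) * len(neighbors[n] & uncovered) / len(uncovered),
        )
        if not neighbors[best] & uncovered:
            break  # nobody left covers anything new
        relays.append(best)
        uncovered -= neighbors[best]
        candidates.discard(best)
    return relays

one_hop = {"a": {"x", "y"}, "b": {"y", "z"}, "c": {"z"}}
relays = select_relays(one_hop, {"x", "y", "z"}, {"a": 0.9, "b": 0.8, "c": 0.2})
print(relays)
```

Note how the low-energy node "c" is passed over even though it covers "z": weighting by residual energy is what spreads the relay burden and prolongs network lifetime.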
77.
We have investigated the transport properties, resistivity and Hall effect, in a series of underdoped GdBa2Cu3Ox thin films grown by off-axis magnetron sputtering. We find a systematic correlation between the critical temperature Tc and the inverse Hall constant RH^-1, related, in simple models, to the carrier concentration n. Our experimental thin-film Tc(n) data are in good agreement with the temperature-doping phase diagram obtained for YBa2Cu3Ox single crystals. By measuring the activation energies in the liquid vortex phase, and by using a two-dimensional model for vortex dynamics, we have extracted the penetration depth of these samples, and studied the relation between the carrier concentration and the superfluid density to probe the role of phase fluctuations in superconductivity.
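The "simple models" relation mentioned above is the one-band expression RH = 1/(n·e), so the carrier concentration follows directly from the measured Hall constant. A minimal sketch (the Hall constant value below is illustrative, not from the paper):

```python
E_CHARGE = 1.602176634e-19  # elementary charge, in coulombs

def carrier_concentration(hall_constant):
    """Carrier density n (m^-3) from the Hall constant R_H (m^3/C),
    assuming the simple one-band model R_H = 1/(n*e)."""
    return 1.0 / (E_CHARGE * hall_constant)

n = carrier_concentration(1.0e-9)  # illustrative R_H = 1e-9 m^3/C
print(f"n = {n:.3e} carriers per m^3")
```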
78.
Many research initiatives in production management consider process planning and operations scheduling as two separate and sequential functions. However, in certain contexts, the two functions must be better integrated. This is the case in divergent production systems with co-production (i.e. production of different products at the same time from a single product input) when alternative production processes are available. This paper studies such a context, focusing on drying and finishing operations in a softwood lumber facility. The situation is addressed using a single model that performs process planning and scheduling simultaneously. We evaluate two alternative formulations: the first is based on mixed integer programming (MIP) and the second on constraint programming (CP). We also propose a search procedure to improve the performance of the CP approach. Both approaches are compared with respect to their capacity to generate good solutions within a short computation time.
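The combinatorial structure of the integrated decision, choosing a process for each order and evaluating the resulting schedule together, can be shown with a tiny brute-force sketch. This is purely illustrative: the paper uses MIP and CP models, and the order/process data below are hypothetical.

```python
import itertools

# Each order can be produced by alternative processes with different durations.
orders = {
    "o1": [{"proc": "dry-A", "dur": 4}, {"proc": "dry-B", "dur": 6}],
    "o2": [{"proc": "dry-A", "dur": 5}, {"proc": "dry-C", "dur": 3}],
}

def best_plan(orders):
    """Enumerate every combination of process choices and return the one
    minimizing total duration on a single resource (a stand-in objective)."""
    names = list(orders)
    best = None
    for combo in itertools.product(*(orders[n] for n in names)):
        total = sum(p["dur"] for p in combo)
        plan = {n: p["proc"] for n, p in zip(names, combo)}
        if best is None or total < best[0]:
            best = (total, plan)
    return best

makespan, plan = best_plan(orders)
print(makespan, plan)
```

Enumeration explodes with instance size, which is exactly why the paper turns to MIP and CP formulations with a dedicated search procedure.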
79.
How do people work when they collaborate to write a document? What kinds of tools do they use and, in particular, do they resort to groupware for this task? Forty-one people filled out a questionnaire placed on the World Wide Web. In spite of the existence of specialized collaborative writing tools, most respondents reported using individual word processors and email as their main tools for writing joint documents. Respondents noted the importance of functions such as change tracking, version control, and synchronous work for collaborative writing tools. This study also confirmed the great variability that exists between collaborative writing projects, whether in group membership, management, writing strategy, or scheduling.
80.
The simulation of gross primary production (GPP) at various spatial and temporal scales remains a major challenge for quantifying the global carbon cycle. We developed a light use efficiency model, called EC-LUE, driven by only four variables: normalized difference vegetation index (NDVI), photosynthetically active radiation (PAR), air temperature, and the Bowen ratio of sensible to latent heat flux. The EC-LUE model may have considerable potential to capture the spatial and temporal dynamics of GPP because its parameters (i.e., the potential light use efficiency and optimal plant growth temperature) are invariant across the various land cover types. However, the application of the previous EC-LUE model was hampered by poor prediction of the Bowen ratio at large spatial scales. In this study, we substituted the Bowen ratio with the ratio of evapotranspiration (ET) to net radiation, and revised the RS-PM (Remote Sensing-Penman Monteith) model for quantifying ET. Fifty-four eddy covariance towers, covering various ecosystem types, were selected to calibrate and validate the revised RS-PM and EC-LUE models. The revised RS-PM model explained 82% and 68% of the observed variation in ET for the calibration and validation sites, respectively. Using estimated ET as input, the EC-LUE model performed well, explaining 75% and 61% of the observed GPP variation for the calibration and validation sites, respectively.

Global patterns of ET and GPP at a spatial resolution of 0.5° latitude by 0.6° longitude during the years 2000-2003 were determined using the global MERRA dataset (Modern Era Retrospective-Analysis for Research and Applications) and MODIS (Moderate Resolution Imaging Spectroradiometer) data. The global estimates of ET and GPP agreed well with other global models from the literature, with the highest ET and GPP over tropical forests and the lowest values in dry and high-latitude areas. However, comparisons with observed GPP at eddy flux towers showed significant underestimation of ET and GPP due to the lower net radiation of the MERRA dataset. A procedure to correct the systematic errors of global meteorological data would improve global estimates of GPP and ET. The revised RS-PM and EC-LUE models provide alternative approaches for mapping ET and GPP over large areas because (1) the model parameters are invariant across various land cover types and (2) all driving forces of the models can be derived from remote sensing data or existing climate observation networks.
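The light use efficiency structure described above can be sketched as GPP = PAR × fPAR × εmax × min(Ts, Ws), with a temperature stress scalar Ts and a moisture scalar Ws taken here as the ET / net-radiation ratio. The coefficients below (the fPAR–NDVI regression, εmax, and the temperature limits) are illustrative placeholders, not the paper's calibrated values.

```python
def temperature_scalar(t, t_min=0.0, t_max=40.0, t_opt=21.0):
    """Temperature stress in [0, 1], peaking at the optimum temperature
    (illustrative limits; a common parabolic-ratio form)."""
    if t <= t_min or t >= t_max:
        return 0.0
    return ((t - t_min) * (t - t_max)) / (
        (t - t_min) * (t - t_max) - (t - t_opt) ** 2
    )

def gpp(ndvi, par, t_air, et, net_radiation, eps_max=2.14):
    """GPP (g C m^-2) from the four driving variables:
    GPP = PAR * fPAR * eps_max * min(Ts, Ws)."""
    fpar = max(0.0, 1.24 * ndvi - 0.168)  # linear fPAR-NDVI, illustrative
    ws = min(1.0, et / net_radiation)     # moisture scalar from ET / Rn
    ts = temperature_scalar(t_air)
    return par * fpar * eps_max * min(ts, ws)

# One grid cell, one time step (all inputs illustrative).
print(round(gpp(ndvi=0.7, par=10.0, t_air=21.0, et=3.0, net_radiation=6.0), 3))
```

The appeal noted in the abstract is visible here: every input is obtainable from remote sensing or routine meteorology, and the parameters do not vary by land cover type.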
Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号