71.
The design of fault-tolerant systems is gaining importance in large domains of embedded applications where design constraints are as important as reliability. New software techniques, based on the selective application of redundancy, have shown remarkable fault coverage with reduced costs and overheads. However, the large number of different solutions these techniques provide, and the costly process of assessing their reliability, make design space exploration a difficult and time-consuming task. This paper proposes the integration of a multi-objective optimization tool with a software hardening environment to perform automatic design space exploration in search of the best trade-offs between reliability, cost, and performance. The first tool is driven by the NSGA-II multi-objective genetic algorithm, which can pursue several design goals simultaneously. The second is a compiler-based infrastructure that automatically produces selectively protected (hardened) versions of the software and generates accurate overhead reports and fault-coverage estimates. The advantages of our proposal are illustrated through a detailed case study of a typical embedded application, the AES (Advanced Encryption Standard).
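The core of the NSGA-II-driven exploration described above is repeated non-dominated sorting of candidate hardened configurations. The Python sketch below is not taken from the paper; the objective tuple (code-size overhead, runtime overhead, 1 − fault coverage) and the candidate values are illustrative assumptions. It shows the Pareto-dominance filter that underlies such a multi-objective search over reliability/cost/performance trade-offs.

```python
def dominates(a, b):
    """a dominates b if it is no worse in every objective and strictly better in at
    least one. All objectives here are to be minimized."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(candidates):
    """Return the non-dominated configurations (the filter NSGA-II applies
    repeatedly when ranking a population)."""
    return [c for c in candidates
            if not any(dominates(o, c) for o in candidates if o is not c)]

# Hypothetical hardened versions: (size overhead %, runtime overhead %, 1 - coverage)
candidates = [(12.0, 8.0, 0.05), (30.0, 22.0, 0.01), (5.0, 4.0, 0.20), (33.0, 25.0, 0.02)]
print(pareto_front(candidates))   # the last candidate is dominated and dropped
```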
72.
Reference spectra extracted from spectral libraries can distinguish different water types in images when associated with limnological information. In this study, we compiled available databases into a single spectral library, using field water reflectance spectra and limnological data collected by different researchers and campaigns in the Amazonian region. Using an iterative clustering procedure based on the combination of reflectance and optically active components (OACs), reference spectra representative of the major Amazonian water types were defined from this library. Differences between the resulting limnological classes were also evaluated by paired t-tests at a 0.05 significance level. Finally, the reference spectra were tested for Spectral Angle Mapper (SAM) classification of waters in Hyperion/Earth Observing-One (EO-1) and Medium Resolution Imaging Spectrometer (MERIS)/Environment Satellite (Envisat) images acquired simultaneously with the field campaigns. Results showed highly variable concentrations of OACs due to the complexity of the Amazonian aquatic environments. Ten classes were defined to represent this complexity, broadly grouped into four limnological characteristics: clear waters with low concentrations of OACs (class 1); black waters rich in dissolved organic carbon (DOC) (class 2); waters with large concentrations of inorganic suspended solids (ISSs) (classes 3–7); and waters dominated by chlorophyll-a (chl-a) (classes 8–10). Using the ten reference spectra, SAM classification of the field water curves produced an overall accuracy of 86%, with the highest values observed for classes 3, 4, 6 and 7 and the lowest accuracy for classes 1 and 2. The results of the paired t-tests confirmed the class differences based on the concentrations of OACs. SAM classification of the Hyperion and MERIS images using ground-truth information resulted in overall classification accuracies of 48% and 67%, respectively, with the highest errors associated with specific portions of the scenes that were not adequately represented in the spectral library.
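The SAM classifier used above assigns each pixel to the reference spectrum with the smallest spectral angle, i.e. the arccosine of the normalized dot product between the two spectra. A minimal NumPy sketch of that rule follows; the function names are illustrative and not taken from the study.

```python
import numpy as np

def spectral_angle(pixel, reference):
    """Spectral angle (radians) between a pixel spectrum and a reference spectrum."""
    cos = np.dot(pixel, reference) / (np.linalg.norm(pixel) * np.linalg.norm(reference))
    return np.arccos(np.clip(cos, -1.0, 1.0))   # clip guards against rounding error

def sam_classify(pixel, references):
    """Return the index of the reference with the smallest angle, and that angle."""
    angles = np.array([spectral_angle(pixel, r) for r in references])
    best = int(np.argmin(angles))
    return best, angles[best]
```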
73.
A macroscopic model is presented that simultaneously estimates route flows and trip matrices for congested road networks using data on link densities instead of link flows. The advantage of this approach is that it avoids the errors that may occur in individual links' flow-cost relationships when congestion is heavy. Under the proposed methodology, both the flows and the matrices are estimated by the model using an image of the network, such as an aerial photograph, in which the number of vehicles on each link can be identified. The model itself is formulated as a maximum entropy optimization problem subject to linear constraints given by vehicle densities on the links, and is validated using analytic examples and traffic microsimulations. The results demonstrate the superiority of the link-density approach over the traditional flow-based method.
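The formulation described above can be illustrated with a toy maximum-entropy problem: route flows are chosen to maximize entropy subject to linear constraints given by observed vehicle counts on each link. The sketch below uses SciPy's SLSQP solver; the tiny network, incidence matrix, and observed counts are invented for illustration and do not come from the paper.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical toy network: 3 candidate routes sharing 2 links.
# delta[a, r] = 1 if route r uses link a (link-route incidence matrix).
delta = np.array([[1, 1, 0],
                  [0, 1, 1]], dtype=float)
k_obs = np.array([40.0, 55.0])        # observed vehicle counts per link

def neg_entropy(f):
    f = np.maximum(f, 1e-9)           # guard against log(0)
    return np.sum(f * (np.log(f) - 1))  # negative of the entropy objective

cons = {"type": "eq", "fun": lambda f: delta @ f - k_obs}  # linear density constraints
res = minimize(neg_entropy, x0=np.full(3, 10.0),
               bounds=[(0, None)] * 3, constraints=cons, method="SLSQP")
print("estimated route flows:", res.x)
```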
74.
Although traditional approaches to code profiling help locate performance bottlenecks, they offer only limited support for removing them. The main reason is the lack of detailed visual runtime information to identify and eliminate computation redundancy. We provide three profiling blueprints that help identify and remove performance bottlenecks. The structural distribution blueprint graphically represents the CPU consumption share for each method and class of an application. The behavioral distribution blueprint depicts the distribution of CPU consumption along method invocations and hints at candidate methods for caching optimizations. The behavioral evolution blueprint compares profiles of different versions of a software system and highlights performance-critical changes in the system. These three blueprints helped us to significantly optimize Mondrian, an open-source visualization engine. Our implementation is freely available for the Pharo development environment and has been evaluated in a number of different scenarios. Copyright © 2011 John Wiley & Sons, Ltd.
75.
This work proposes a reusable architecture that enables the self-configuration of a supporting infrastructure for Web server clusters using virtual machines. The goal of the architecture is to ensure service quality, evaluating how well the infrastructure complies with the application's operating restrictions and acting proportionally on the configuration of physical servers (hosts) or virtual machines. In addition, the proposal aims to save energy through the rational use of resources. A prototype of the architecture was developed and a performance evaluation carried out with two different resource management approaches. The evaluation shows that the proposal is fully functional and advantageous in terms of resource use, avoiding waste while keeping the application's quality of service within acceptable levels. The architecture also proves flexible enough to accommodate, with reasonable effort, different resource self-configuration policies.
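The abstract does not give the concrete self-configuration policy, but a simple threshold-based controller conveys the general idea: monitor a service-quality metric against the application's operating restriction and act proportionally on the number of virtual machines. The following sketch is purely illustrative; the cluster object, its methods, and the thresholds are assumptions, not part of the paper.

```python
# Hypothetical self-configuration loop for a virtualized web-server cluster.
# cluster.measure_response_time(), cluster.start_vm(), etc. stand in for real
# infrastructure calls and are assumptions, not the paper's API.

SLA_MS = 200          # operating restriction: target response time (ms)
LOW_UTIL = 0.3        # consolidate when the cluster is mostly idle

def reconfigure(cluster):
    rt = cluster.measure_response_time()    # current service quality
    util = cluster.average_utilization()    # current resource usage
    if rt > SLA_MS:
        # Quality below target: add capacity proportionally to the violation.
        extra = max(1, int(rt / SLA_MS) - 1)
        for _ in range(extra):
            cluster.start_vm()
    elif util < LOW_UTIL and cluster.size() > 1:
        # Plenty of slack: consolidate virtual machines to save energy.
        cluster.stop_vm()
```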
76.
Neural Computing and Applications - Understanding the generation of content in an online social network (OSN) at the microscopic level is highly desirable for improved management of the OSN and the...
77.
We propose a tuner, suitable for adaptive control and (in its discrete-time version) adaptive filtering applications, that sets the second derivative of the parameter estimates rather than the first derivative, as is done in the overwhelming majority of the literature.
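To illustrate how this differs from a conventional gradient tuner, the discrete-time sketch below integrates the adaptation signal twice: the correction sets the second difference of the estimate, and the rate is then integrated into the estimate itself. This is an illustrative reconstruction, not the paper's actual tuner; the damping factor and gain are assumptions added for stability.

```python
import numpy as np

def second_order_tuner_step(theta, rate, phi, error, gamma=0.01, damping=0.1):
    """One update of a tuner acting on the *second* difference of theta.

    A conventional gradient tuner would do: theta_{k+1} = theta_k - gamma * error * phi.
    Here the correction drives the rate (first difference), which in turn drives theta.
    """
    rate = (1.0 - damping) * rate - gamma * error * phi   # sets the second difference
    theta = theta + rate                                   # integrate once more
    return theta, rate

# Toy usage: identify a scalar gain from noisy measurements y = 2.0 * u.
theta, rate = 0.0, 0.0
rng = np.random.default_rng(0)
for _ in range(2000):
    u = rng.normal()
    y = 2.0 * u + 0.01 * rng.normal()
    error = theta * u - y                  # prediction error
    theta, rate = second_order_tuner_step(theta, rate, phi=u, error=error)
print(round(theta, 2))                     # should approach 2.0
```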
78.
Accessibility planning with reference to sustainability and equity principles has been advocated as the best approach to deal with the complexity of urban mobility. It has enabled the development of more sustainable and fairer policies for access provision. However, despite this paradigm shift, many planning initiatives in practice still focus on assessing alternatives and proposing solutions, instead of centering on the understanding and assessment of problems as the primary activity of planning. Therefore, to contribute to the problem-oriented paradigm in accessibility planning, this work proposes a strategic assessment methodology for problems of unequal and inequitable distribution of accessibility and mobility. The methodology relies on spatial analysis techniques and allows the characterization of accessibility and mobility conditions, as well as the diagnosis of accessibility and mobility problems and their causal relationships. It was applied to the case of Lisbon, and the results of the assessment allowed an informed reading of the problems considered. Specifically, it was found that Lisbon presents an unequal and inequitable distribution of job accessibility and of mobility by private car and public transport, and also that job accessibility, along with other transportation, land-use and socioeconomic variables, impacts the mobility levels of its citizens.
79.
This work introduces a heuristic for mixed-integer programming (MIP) problems with binary variables, based on information obtained from differences between feasible solutions as well as solutions from the linear relaxation. This information is used to build a neighborhood that is explored as a sub-MIP problem. The proposed heuristic is evaluated on 45 problems from the MIPLIB repository. Its performance, in terms of solution improvement over the results obtained after exploring 50,000 nodes of the branch-and-bound tree, is compared against that of Solution Polishing, another recombination-based heuristic for MIP problems used within the CPLEX solver, as well as against the solution obtained by running the default CPLEX branch-and-cut (B&C) method under the same time limit. The computational results indicate that the proposed method yields results that are significantly better than those obtained by the default CPLEX B&C approach and comparable to those of Solution Polishing in terms of mean solution quality. This equivalence of expected solution quality, coupled with a simpler implementation, suggests the proposed approach as a possible alternative for improving solution quality in MIP problems.
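The abstract does not spell out the neighborhood construction, but the general recipe of building a sub-MIP from solution differences can be sketched as follows: binary variables on which an incumbent and a reference solution (for example, the rounded linear relaxation) agree are fixed, and only the disagreeing ones are left free in a restricted sub-MIP. The solve_sub_mip call below is a placeholder for an actual solver and, like the rest of the snippet, is an illustrative assumption rather than the authors' method.

```python
def build_neighborhood(incumbent, reference, tol=1e-6):
    """Fix the binary variables on which both solutions agree; free the rest.

    incumbent: dict var_name -> 0/1 value of a feasible MIP solution
    reference: dict var_name -> value from another feasible solution or the LP relaxation
    Returns a dict of fixings that defines the sub-MIP neighborhood.
    """
    fixings = {}
    for var, val in incumbent.items():
        ref = round(reference[var])          # round fractional LP values to 0/1
        if abs(val - ref) < tol:
            fixings[var] = int(round(val))   # agreement: fix the variable
    return fixings                           # disagreeing variables stay free

# Usage sketch (solve_sub_mip is a hypothetical solver wrapper):
# fixings = build_neighborhood(incumbent, lp_relaxation)
# improved = solve_sub_mip(model, fixings, node_limit=50_000)
```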
80.
Blogs can be used as a conduit for customer opinions and, in doing so, can build communities around products. We attempt to realise this vision by building blogs out of product catalogues. Unfortunately, the immaturity of blog engines makes this endeavour risky. This paper presents a model-driven approach to address this drawback. It implies the introduction of (meta)models: the catalogue model, based on the standard Open Catalog Format, and blog models, which elaborate on the use of blogs as conduits for virtual communities. Blog models end up being realised through blog engines. Specifically, we focus on two types of engines: a hosted blog platform and a standalone blog platform, both in Blojsom. However, the lack of standards in a broad and constantly evolving blog-engine space hinders both the portability and the maintainability of the solution. Hence, we resort to the notion of an “abstract platform” as a way to depart from the peculiarities of specific blog engines. Additionally, the paper measures the reuse gains brought by MDE in comparison with manual coding of blogs.