  Fee-based full text   1208 articles
  Free   23 articles
  Free domestically   1 article
Electrical engineering   16 articles
General   3 articles
Chemical industry   256 articles
Metalworking   30 articles
Machinery and instruments   16 articles
Building science   156 articles
Mining engineering   2 articles
Energy and power   40 articles
Light industry   50 articles
Hydraulic engineering   4 articles
Petroleum and natural gas   1 article
Weapons industry   1 article
Radio electronics   77 articles
General industrial technology   201 articles
Metallurgical industry   88 articles
Atomic energy technology   11 articles
Automation technology   280 articles
  2024   13 articles
  2023   31 articles
  2022   20 articles
  2021   68 articles
  2020   43 articles
  2019   50 articles
  2018   44 articles
  2017   32 articles
  2016   55 articles
  2015   40 articles
  2014   66 articles
  2013   62 articles
  2012   68 articles
  2011   88 articles
  2010   62 articles
  2009   65 articles
  2008   50 articles
  2007   51 articles
  2006   29 articles
  2005   44 articles
  2004   29 articles
  2003   13 articles
  2002   15 articles
  2001   23 articles
  2000   4 articles
  1999   14 articles
  1998   19 articles
  1997   22 articles
  1996   15 articles
  1995   14 articles
  1994   5 articles
  1993   13 articles
  1992   8 articles
  1991   3 articles
  1990   5 articles
  1989   4 articles
  1986   3 articles
  1985   3 articles
  1984   6 articles
  1983   3 articles
  1981   4 articles
  1979   3 articles
  1975   2 articles
  1974   4 articles
  1971   3 articles
  1970   1 article
  1969   2 articles
  1968   2 articles
  1963   1 article
  1957   1 article
A total of 1232 results found; search took 15 ms.
81.
This paper presents a new approach for increasing the robustness of multi-channel automatic speech recognition in noisy and reverberant multi-source environments. The proposed method uses uncertainty propagation techniques to dynamically compensate the speech features and the acoustic models for the observation uncertainty determined at the beamforming stage. We present and analyze two methods for integrating classical multi-channel signal processing approaches, such as delay-and-sum beamformers or Zelinski-type Wiener filters, with uncertainty-of-observation techniques such as uncertainty decoding or modified imputation. An analysis of the results on the PASCAL-CHiME task shows that this approach consistently outperforms conventional beamformers with only a minimal increase in computational complexity. The dynamic compensation based on observation uncertainty also outperforms conventional static adaptation, without the need for adaptation data.
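The beamforming stage the abstract builds on is easy to prototype. Below is a minimal NumPy sketch of a delay-and-sum beamformer that also emits a per-sample inter-channel variance as a crude stand-in for the observation uncertainty the paper propagates; the signals, delays, and variance-based uncertainty proxy are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def delay_and_sum(channels, delays, fs):
    """Align each channel by its (integer-sample) delay and average them."""
    aligned = np.empty_like(channels)
    for c in range(channels.shape[0]):
        aligned[c] = np.roll(channels[c], -int(round(delays[c] * fs)))
    beam = aligned.mean(axis=0)        # delay-and-sum output
    # Inter-channel variance: a simple per-sample uncertainty proxy that an
    # uncertainty-decoding back end could consume (an assumption, not the
    # paper's actual uncertainty estimator).
    uncertainty = aligned.var(axis=0)
    return beam, uncertainty

# Toy example: one sinusoidal source arriving at 4 microphones with known delays.
fs = 16000
t = np.arange(fs) / fs
source = np.sin(2 * np.pi * 440 * t)
delays = np.array([0.0, 1e-4, 2e-4, 3e-4])   # hypothetical TDOAs
channels = np.stack([np.roll(source, int(round(d * fs))) + 0.1 * np.random.randn(fs)
                     for d in delays])
beam, unc = delay_and_sum(channels, delays, fs)
print(beam.shape, float(unc.mean()))
```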
82.
The analysis of large dynamic networks poses a challenge in many fields, ranging from large botnets to social networks. Because dynamic networks exhibit different characteristics, e.g., being of sparse or dense structure, or having a continuous or discrete time line, a variety of visualization techniques have been specifically designed to handle these different aspects of network structure and time. This wide range of existing techniques is well justified, as rarely is a single visualization suitable to cover the entire visual analysis. Instead, visual representations are often switched in the course of the exploration of dynamic graphs as the focus of analysis shifts between the temporal and the structural aspects of the data. To support such switching in a seamless and intuitive manner, we introduce the concept of in situ visualization, a novel strategy that tightly integrates existing visualization techniques for dynamic networks. It does so by allowing the user to interactively select, in a base visualization, a region to which a different visualization technique is then applied and embedded in the selection made. This makes it possible to change the way a locally selected group of data items, such as nodes or time points, is shown, right in the place where they are positioned, thus supporting the user's overall mental map. Using this approach, a user can switch seamlessly between different visual representations to adapt a region of a base visualization to the specifics of the data within it or to the current analysis focus. This paper presents and discusses the in situ visualization strategy and its implications for dynamic graph visualization. Furthermore, it illustrates its usefulness by employing it for the visual exploration of dynamic networks from two different fields: model versioning and wireless mesh networks.
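As a rough, purely illustrative sketch of the embedding idea (not the authors' system), the following matplotlib snippet draws a scatter "base visualization" of network nodes and overlays a selected region with an embedded timeline view of the selected nodes; the data and the selection rectangle are made up.

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)

# Base visualization: node-link-style scatter of a hypothetical network.
pos = rng.uniform(0, 1, (30, 2))
fig, ax = plt.subplots()
ax.scatter(pos[:, 0], pos[:, 1], s=30)

# The user selects a region; a different technique (a timeline) is embedded there.
sel = (pos[:, 0] > 0.6) & (pos[:, 1] > 0.6)      # hypothetical selection
inset = ax.inset_axes([0.6, 0.6, 0.38, 0.38])    # embedded at the selection
activity = rng.poisson(3, (sel.sum(), 20)).cumsum(axis=1)
for row in activity:
    inset.plot(row, linewidth=1)                 # temporal view of selected nodes
inset.set_xticks([]); inset.set_yticks([])
inset.set_title("in situ timeline", fontsize=8)
plt.show()
```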
83.
Without non-linear basis functions, many problems cannot be solved by linear algorithms. This article proposes a method to automatically construct such basis functions with slow feature analysis (SFA). Non-linear optimization of this unsupervised learning method generates an orthogonal basis on the unknown latent space for a given time series. In contrast to methods like PCA, SFA is thus well suited for techniques that make direct use of the latent space. Real-world time series can be complex, and current SFA algorithms are either not powerful enough or tend to over-fit. We make use of the kernel trick in combination with sparsification to develop a kernelized SFA algorithm that provides a powerful function class for large data sets. Sparsity is achieved by a novel matching-pursuit approach that can be applied to other tasks as well. For small data sets, however, the kernel SFA approach leads to over-fitting and numerical instabilities. To enforce a stable solution, we introduce regularization into the SFA objective. We hypothesize that our algorithm generates a feature space that resembles a Fourier basis in the unknown space of latent variables underlying a given real-world time series. We evaluate this hypothesis on a vowel classification task, in comparison to sparse kernel PCA. Our results show excellent classification accuracy and demonstrate the superiority of kernel SFA over kernel PCA in encoding latent variables.
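The paper's contribution is a kernelized, regularized SFA; purely as a minimal illustration of the underlying objective, here is plain linear SFA, which finds unit-variance projections whose temporal derivative has minimal variance. This is a sketch under simplifying assumptions, not the proposed algorithm.

```python
import numpy as np
from scipy.linalg import eigh

def linear_sfa(x, n_features=2):
    """Linear slow feature analysis: directions of slowest variation.

    x: (T, d) time series. Returns projections whose discrete time
    derivative has minimal variance, with decorrelated unit-variance output.
    """
    x = x - x.mean(axis=0)
    dx = np.diff(x, axis=0)                 # discrete time derivative
    a = dx.T @ dx / (len(dx) - 1)           # covariance of derivatives
    b = x.T @ x / (len(x) - 1)              # covariance of the signal
    # Generalized eigenproblem: the smallest eigenvalues give the slowest features.
    vals, vecs = eigh(a, b)
    return x @ vecs[:, :n_features], vals[:n_features]

# Toy data: a slow sine hidden in a fast, rotated mixture.
t = np.linspace(0, 8 * np.pi, 2000)
latent = np.stack([np.sin(0.1 * t), np.sin(5.0 * t)], axis=1)
mix = latent @ np.array([[0.8, 0.6], [-0.6, 0.8]])
slow, slowness = linear_sfa(mix, n_features=1)
print(slowness)   # small value => a slowly varying feature was recovered
```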
84.
Functionalized magnetic microspheres have promising applications in various microfluidic devices, including MEMS-scale biosensors. These particles exhibit magnetic field-induced aggregation, which can be harnessed to accomplish several practical tasks in microfluidic devices. For this, the particle aggregation needs to be well characterized. Herein, a numerical simulation of particle chaining and its experimental validation are presented. Simulations show that the particle aggregation time scales linearly with a group parameter. The predicted growth of one-, two-, and three-particle chains over time shows a similar trend to that found in the experiments. The results of the study could help predict the performance of magnetic aggregate-based lab-on-a-chip devices.
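As a loose, hypothetical illustration of field-induced chaining (not the simulation model of the paper), the toy script below moves beads on a line under a simplified neighbour attraction and counts how many chains remain over time; the force law, parameters, and geometry are all assumptions.

```python
import numpy as np

def chain_count(n=20, steps=20000, dt=1e-3, f0=0.05, d=0.1, seed=0):
    """Toy 1-D overdamped chaining of magnetic beads along the field axis.

    Neighbouring beads attract with a simplified 1/r**2 force (clamped at the
    contact distance d); beads closer than d count as one chain.
    """
    rng = np.random.default_rng(seed)
    pos = np.sort(rng.uniform(0.0, 10.0, n))
    counts = []
    for _ in range(steps):
        gaps = np.maximum(np.diff(pos), d)   # clamp to avoid a singular force
        f = f0 / gaps**2                     # attraction toward each neighbour
        vel = np.zeros(n)
        vel[:-1] += f                        # pulled toward right neighbour
        vel[1:] -= f                         # pulled toward left neighbour
        pos = np.sort(pos + vel * dt)        # overdamped explicit Euler step
        counts.append(1 + int(np.sum(np.diff(pos) > d)))
    return counts

counts = chain_count()
print(counts[0], counts[-1])   # aggregation: many short chains -> fewer, longer ones
```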
85.
Embedded wireless sensors are important components of mobile distributed computing networks, and one of their target application areas is health care. The preservation of mobility for senior citizens is one of the key issues in maintaining an independent lifestyle. Health technologies inside a car can therefore contribute both to safety (supervision of driver fitness) and to healthcare, by monitoring vital signs imperceptibly. In this paper, three embedded measurement techniques for non-contact monitoring of vital signs are investigated. Specifically, capacitive electrocardiogram (cECG) monitoring, mechanical movement analysis (ballistocardiogram, BCG) using piezo foils, and inductive impedance monitoring were examined regarding their potential for integration into car seats. All three sensing techniques eliminate the need for electroconductive contact with the human body, but require defined mechanical boundary conditions (stable distances or, in the case of the BCG, a frictional connection). The physical principles of operation, the specific boundary conditions regarding automotive integration, and the results during wireless operation in a moving car are presented. All three sensors were equipped with local intelligence by incorporating a microcontroller. To eliminate the need for additional cabling, a wireless Bluetooth communication module was added and used to transmit data to a measurement PC. Finally, preliminary results obtained during test drives on German city roads and highways are discussed.
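The paper concerns the sensing hardware itself; purely as an illustration of the kind of downstream processing the measurement PC might perform, the sketch below estimates heart rate from a synthetic BCG-like trace by simple peak picking. The signal shape, sampling rate, and thresholds are assumptions, not the authors' pipeline.

```python
import numpy as np
from scipy.signal import find_peaks

def heart_rate_bpm(signal, fs):
    """Estimate heart rate from a BCG-like trace via simple peak picking.

    Assumes one dominant peak per heartbeat and a 0.4 s refractory period.
    """
    peaks, _ = find_peaks(signal, distance=int(0.4 * fs), height=signal.std())
    if len(peaks) < 2:
        return None
    intervals = np.diff(peaks) / fs          # inter-beat intervals in seconds
    return 60.0 / intervals.mean()

# Synthetic 10 s trace: 72 bpm "heartbeats" plus road-vibration-like noise.
fs = 250
t = np.arange(10 * fs) / fs
beats = np.zeros_like(t)
beats[(np.arange(0, 10, 60 / 72) * fs).astype(int)] = 1.0
bcg = np.convolve(beats, np.hanning(25), mode="same") + 0.05 * np.random.randn(len(t))
print(heart_rate_bpm(bcg, fs))   # ~72
```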
86.
  Total citations: 1 (self-citations: 0; citations by others: 1)
Constraint-based variability modeling is a flexible, declarative approach to managing solution-space variability. Product variants are defined in a top-down manner by successively restricting the admissible combinations of product artifacts until a specific product variant is determined. In this paper, we illustrate the range of constraint-based variability modeling by discussing two of its extreme flavors: constraint-guarded variability modeling and constraint-driven variability modeling. The former applies model checking to establish the global consistency of product variants that are built by manual specification of variation points, whereas the latter uses synthesis technology to fully automatically generate product variants that satisfy all given constraints. Each flavor is illustrated by means of a concrete case study.
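To make the constraint-driven extreme concrete, here is a tiny hypothetical sketch in which all admissible product variants are generated automatically from declared options and constraints. The feature names and rules are invented for illustration; the paper's case studies use model checking and synthesis technology, not this toy enumerator.

```python
from itertools import product

# Hypothetical artifact options for a tiny product line.
features = {
    "ui":      ["cli", "gui"],
    "storage": ["sqlite", "postgres"],
    "auth":    ["none", "oauth"],
}

# Admissibility constraints: each maps a variant dict to True/False.
constraints = [
    lambda v: not (v["ui"] == "cli" and v["auth"] == "oauth"),    # CLI has no OAuth flow
    lambda v: v["storage"] == "postgres" or v["auth"] == "none",  # authenticated multi-user needs postgres
]

def admissible_variants():
    """Constraint-driven flavor: enumerate every variant satisfying all constraints."""
    names = list(features)
    for combo in product(*(features[n] for n in names)):
        variant = dict(zip(names, combo))
        if all(c(variant) for c in constraints):
            yield variant

for v in admissible_variants():
    print(v)
```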
87.
88.
The World Wide Web has turned hypertext into a success story by enabling the worldwide sharing of unstructured information and informal knowledge. The Semantic Web targets the sharing of structured information and formal knowledge, pursuing the objective of achieving collective intelligence on the Web. Germane to the structure of the Semantic Web is a layering and standardization of concerns. These concerns are reflected in an architecture of the Semantic Web, which we present through a common use case. Semantic Web data for the use case is now found on the Web and is part of a quickly growing set of Semantic Web resources available for formal processing.
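As a minimal illustration of the structured-data layer discussed above, the sketch below stores RDF-style subject-predicate-object triples in plain Python and answers simple pattern queries. The resource names are hypothetical, and no real RDF library or SPARQL engine is used.

```python
# RDF-style triples kept as plain tuples; prefixes like "ex:" are illustrative.
triples = {
    ("ex:TimBL", "rdf:type", "ex:Person"),
    ("ex:TimBL", "ex:invented", "ex:WWW"),
    ("ex:WWW", "rdf:type", "ex:InformationSystem"),
}

def match(s=None, p=None, o=None):
    """Return all triples matching the pattern; None acts as a wildcard."""
    return [(ts, tp, to) for ts, tp, to in triples
            if s in (None, ts) and p in (None, tp) and o in (None, to)]

print(match(p="rdf:type"))          # everything with an rdf:type assertion
print(match(s="ex:TimBL"))          # all facts about ex:TimBL
```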
89.
In this paper, crucial aspects of the implications and the complexity of interconnected multi-pollutant, multi-effect assessments of both air pollution control strategies and the closely related reduction of greenhouse gas emissions are discussed. The main aims of the work described here are to identify the core problems that occur when current state-of-the-art methodology is applied to integrated assessments, in this context cost-benefit assessment (CBA) as well as cost-effectiveness assessment (CEA), using sophisticated computer models, and to propose solutions to the problems identified. The approaches described demonstrate the integrated use of databases, efficient algorithms, and already existing software tools and models in a unified model framework. The first part of the paper discusses the need for new developments in one particular field of Integrated Assessment Models (IAMs): the use of (typically) country-specific single-pollutant abatement cost curves, which have been applied in a large number of modelling approaches with the aim of finding cost-effective solutions for given air quality targets. However, research conducted to find such cost-effective solutions for the non-linear problem of tropospheric ozone abatement, which deals with two primary pollutants and their rather complex relationship in forming tropospheric ozone (see Friedrich, R., Reis, S. (Eds.), 2000. Tropospheric Ozone Abatement – Developing Efficient Strategies for the Reduction of Ozone Precursor Emissions in Europe. Springer), identified basic problems of cost-curve-based approaches even in this two-pollutant case. The approach discussed here promises to solve the key problems identified, making extensive use of databases in order to provide fast, high-quality model input for CEA and CBA. In addition, the application of Genetic Algorithms is discussed as a means of addressing the extremely complex, vast solution spaces that are typical of the tasks IAMs are set to solve nowadays. As the application of the model in extensive assessment studies is currently under way, it is still too early for a full evaluation of lessons learned; however, initial tests of performance and behaviour have shown robust and promising results.
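As a hedged illustration of how a Genetic Algorithm can search such a solution space, the sketch below selects a least-cost set of synthetic abatement measures subject to an emission-reduction target. The measure data, penalty weight, and GA operators are all assumptions, not the model described in the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical abatement measures: cost and emission reduction per measure.
n_measures = 30
cost = rng.uniform(1.0, 10.0, n_measures)
reduction = rng.uniform(0.5, 5.0, n_measures)
target = 40.0                                      # required total reduction

def fitness(pop):
    """Total cost plus a heavy penalty for missing the reduction target."""
    shortfall = np.maximum(target - pop @ reduction, 0.0)
    return pop @ cost + 1000.0 * shortfall

def ga(pop_size=100, generations=200, p_mut=0.02):
    pop = rng.integers(0, 2, (pop_size, n_measures))   # bit = measure applied?
    for _ in range(generations):
        f = fitness(pop)
        # Tournament selection between random pairs of individuals.
        i, j = rng.integers(0, pop_size, (2, pop_size))
        parents = np.where((f[i] < f[j])[:, None], pop[i], pop[j])
        # One-point crossover with the next parent in the array.
        cut = rng.integers(1, n_measures, pop_size)
        mask = np.arange(n_measures) < cut[:, None]
        children = np.where(mask, parents, np.roll(parents, 1, axis=0))
        # Bit-flip mutation.
        flip = rng.random(children.shape) < p_mut
        pop = np.where(flip, 1 - children, children)
    best = pop[np.argmin(fitness(pop))]
    return best, fitness(pop).min()

best, f = ga()
print("measures selected:", best.sum(), "objective:", round(float(f), 2))
```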
90.
In this paper, our solution to the problem of modelling functionally complex communication systems at the application level, based on lightweight coordination, is extended to seamlessly capture system-level testing as well. This extension could be realized simply by self-application: the bulk of the work for integrating system-level testing into our development environment, the ABC, concerned domain modelling, which can itself be done using the ABC. The extension of the ABC to cover system-level testing was therefore merely an application development on the basis of the ABC, illustrated here in the domain of Computer Telephony Integration. The adoption of a coarse-grained approach to test design, which is central to the scalability of the overall testing environment, is the enabling aspect for system-level test automation. Together with our lightweight coordination approach, this induces an understandable modelling paradigm of system-wide test cases that is adequate for the needs and requirements of industrial test engineers. In particular, it enables test engineers to graphically design complex test cases that can, in addition, be automatically checked against their intended purposes via model checking.
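As a simplified, hypothetical illustration of checking a test model automatically, the snippet below runs an explicit-state reachability check over a toy call-setup transition system from the Computer Telephony Integration domain. Real model checkers, and the ABC's actual formalism, are far richer than this sketch; states and actions are invented for illustration.

```python
from collections import deque

# Toy labeled transition system for a call-setup test case (hypothetical).
transitions = {
    "idle":      {"offhook": "dialtone"},
    "dialtone":  {"dial": "ringing", "onhook": "idle"},
    "ringing":   {"answer": "connected", "onhook": "idle"},
    "connected": {"onhook": "idle"},
}

def check_safety(init, bad_states):
    """Explicit-state reachability: is any bad state reachable from init?

    Returns a counterexample trace (list of actions) or None if safe.
    """
    queue = deque([(init, [])])
    seen = {init}
    while queue:
        state, trace = queue.popleft()
        if state in bad_states:
            return trace
        for action, nxt in transitions.get(state, {}).items():
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, trace + [action]))
    return None

# Safety check: no "error" state is reachable in this model.
print(check_safety("idle", {"error"}))   # None => the bad state is unreachable
```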