52.
The long-term dynamic behavior of many dynamical systems evolves on a low-dimensional, attracting, invariant slow manifold, which can be parameterized by only a few variables (“observables”). The explicit derivation of such a slow manifold (and thus, the reduction of the long-term system dynamics) is often extremely difficult or practically impossible. For this class of problems, the equation-free framework has been developed to enable performing coarse-grained computations, based on short full model simulations. Each full model simulation should be initialized so that the full model state is consistent with the values of the observables and close to the slow manifold. To compute such an initial full model state, a class of constrained runs functional iterations was proposed (Gear and Kevrekidis, J. Sci. Comput. 25(1), 17–28, 2005; Gear et al., SIAM J. Appl. Dyn. Syst. 4(3), 711–732, 2005). The schemes in this class only use the full model simulator and converge, under certain conditions, to an approximation of the desired state on the slow manifold. In this article, we develop an implementation of the constrained runs scheme that is based on a (preconditioned) Newton-Krylov method rather than on a simple functional iteration. The functional iteration and the Newton-Krylov method are compared in detail using a lattice Boltzmann model for one-dimensional reaction-diffusion as the full model simulator. Depending on the parameters of the lattice Boltzmann model, the functional iteration may converge slowly or even diverge. We show that both issues are largely resolved by using the Newton-Krylov method, especially when a coarse grid correction preconditioner is incorporated.
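The contrast between the two solution strategies can be sketched on a toy fixed-point problem (a hypothetical stand-in; the paper's full model is a lattice Boltzmann simulator, not this map):

```python
import numpy as np
from scipy.optimize import newton_krylov

# Stand-in for one short burst of full-model simulation: a contraction-like
# map g whose fixed point plays the role of the desired state on the slow
# manifold. (Illustrative only -- not the paper's model.)
def g(x):
    return 0.9 * np.cos(x)

# Constrained-runs-style functional iteration: x <- g(x).
x_fi = np.zeros(4)
for _ in range(200):
    x_fi = g(x_fi)

# The same problem posed as a root find F(x) = x - g(x) = 0 and handed to
# SciPy's matrix-free Newton-Krylov solver.
x_nk = newton_krylov(lambda x: x - g(x), np.zeros(4), f_tol=1e-10)
```

When the map is a contraction the functional iteration succeeds, as here; the Newton-Krylov formulation additionally handles cases where `g` is slowly contracting or expanding, which is the failure mode the article addresses.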
53.
In the past, lane departure warnings (LDWs) were demonstrated to improve driving behaviours during lane departures, but little is known about the effects of unreliable warnings. This experiment focused on the influence of false warnings, alone or in combination with missed warnings, and of warning onset on assistance effectiveness and acceptance. Two assistance unreliability levels (33% and 17%) and two warning onsets (partial and full lane departure) were manipulated in order to investigate their interaction. Results showed that assistance, regardless of unreliability level and warning onset, improved driving behaviours both during lane departure episodes and outside of them by favouring better lane-keeping performance. Full-lane-departure onset and highly unreliable warnings, however, reduced assistance efficiency. Drivers’ acceptance of the assistance was better for the most reliable warnings and for the later warning onset. The data indicate that even imperfect LDWs (false warnings, or false and missed warnings) still improve driving behaviours compared to no assistance.

Practitioner Summary: This study revealed that imperfect lane departure warnings can still significantly improve driving performance and that warning onset is a key element for assistance effectiveness and acceptance. These conclusions may be of particular interest to lane departure warning designers.

54.
In this study, we propose a simple and efficient texture-based algorithm for image segmentation. The method computes textons and bags of words (BOWs) learned by support vector machine (SVM) classifiers. Textons are composed of local magnitude coefficients that arise from the Q-Shift Dual-Tree Complex Wavelet Transform (DT-CWT), combined with color components. In keeping with the needs of our research context, which addresses land cover mapping from remote sensing images, we use only a few small texture patches at the training stage, where other supervised methods usually train on fully representative textures. We accounted for the scale and rotation invariance of the textons, and three different invariance transforms were evaluated on the DT-CWT-based features. The largest contribution of this study is the comparison of three classification schemes within the segmentation algorithm. Specifically, we designed a new, especially competitive scheme that uses several classifiers, each adapted to a specific analysis-window size in texton quantification and trained on a reduced data set by random selection. This configuration allows quick SVM convergence and easy parallelization of the SVM bank while maintaining high segmentation accuracy. We compare the classification results with textons built from the well-known maximum response filter bank and from speeded-up robust features (SURF) as references, and show that DT-CWT textons provide better distinguishing features across the entire set of configurations tested. Benchmarks of the different method configurations were run on two substantial textured mosaic sets, each composed of 100 grey-scale or color mosaics made up of Brodatz or VisTex textures.
Lastly, when applied to remote sensing images, our method yields good region segmentation compared to the ENVI commercial software, which demonstrates that the method could be used to generate land cover maps and is suitable for various image segmentation purposes.
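The texton/BOW/SVM pipeline can be sketched on synthetic 1-D "textures" (hypothetical data standing in for DT-CWT magnitude coefficients; the clustering step and window sizes are likewise illustrative):

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Two synthetic texture classes: a periodic signal vs. white noise.
def patch(label, n=64):
    base = np.sin(np.linspace(0, 8 * np.pi, n)) if label == 0 else rng.normal(size=n)
    return base + 0.1 * rng.normal(size=n)

def windows(p, w=8):
    # Non-overlapping local feature vectors of width w.
    return np.array([p[i:i + w] for i in range(0, len(p) - w + 1, w)])

# 1) Texton dictionary: cluster local feature vectors from training patches.
train = [(patch(l), l) for l in (0, 1) for _ in range(30)]
kmeans = KMeans(n_clusters=8, n_init=10, random_state=0)
kmeans.fit(np.vstack([windows(p) for p, _ in train]))

# 2) Bag of words: each patch becomes a histogram of texton labels.
def bow(p):
    return np.bincount(kmeans.predict(windows(p)), minlength=8) / 8.0

# 3) An SVM classifies the texton histograms.
X = np.array([bow(p) for p, _ in train])
y = np.array([l for _, l in train])
clf = SVC(kernel="rbf").fit(X, y)
```

The paper's scheme differs in scale: it runs a bank of such SVMs, one per analysis-window size, each trained on a randomly reduced set, which is what makes the bank easy to parallelize.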
55.
This paper describes a nonlinear programming-based robust design methodology for controllers and prefilters of a predefined structure for the linear time-invariant systems considered in quantitative feedback theory. The controller and prefilter synthesis problem is formulated as a single optimization problem with a given performance objective and with constraints enforcing stability and the various specifications usually imposed in quantitative feedback theory. The focus is on providing constraint expressions that can be used in standard nonlinear programming solvers. The nonlinear solver then computes, in a single step, controller and prefilter design parameters that satisfy the prescribed constraints and maximize the performance objective. The effectiveness of the proposed approach is demonstrated on a variety of difficult design cases, such as resonant plants, open-loop unstable plants, and plants with variation in the time delay. Copyright © 2016 John Wiley & Sons, Ltd.
56.
System and process auditors assure – from an information processing perspective – the correctness and integrity of the data that is aggregated in a company’s financial statements. To do so, they assess whether a company’s business processes and information systems process financial data correctly. The audit process is a complex endeavor that in practice has to rely on simplifying assumptions. These simplifying assumptions mainly result from the need to restrict the audit scope and to focus it on the major risks. This article describes a generalized audit process. According to our experience with this process, there is a risk that material deficiencies remain undiscovered when said simplifying assumptions are not satisfied. To address this risk of deficiencies, the article compiles thirteen control patterns, which – according to our experience – are particularly suited to help information systems satisfy the simplifying assumptions. As such, use of these proven control patterns makes information systems easier to audit and IT architects can use them to build systems that meet audit requirements by design. Additionally, the practices and advice offered in this interdisciplinary article help bridge the gap between the architects and auditors of information systems and show either role how to benefit from an understanding of the other role’s terminology, techniques, and general work approach.
57.
An important objective of data mining is the development of predictive models. Based on a number of observations, a model is constructed that allows the analyst to provide classifications or predictions for new observations. Currently, most research focuses on improving the accuracy or precision of these models, and comparatively little research has been undertaken to increase their comprehensibility to the analyst or end-user. This is mainly due to the subjective nature of ‘comprehensibility’, which depends on many factors outside the model, such as the user's experience and his/her prior knowledge. Despite this influence of the observer, some representation formats are generally considered to be more easily interpretable than others. In this paper, an empirical study is presented which investigates the suitability of a number of alternative representation formats for classification when interpretability is a key requirement. The formats under consideration are decision tables, (binary) decision trees, propositional rules, and oblique rules. An end-user experiment was designed to test the accuracy, response time, and answer confidence for a set of problem-solving tasks involving these representations. Analysis of the results reveals that decision tables perform significantly better on all three criteria, while post-test voting also reveals a clear user preference for decision tables in terms of ease of use.
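As a tiny illustration of two of the formats compared (with invented attributes, not the study's actual tasks), the same classification function can be written as a decision table or as propositional rules:

```python
# A toy decision table: one row per combination of condition values,
# mapped to an outcome (attributes are hypothetical, for illustration).
decision_table = {
    ("high", "yes"): "approve",
    ("high", "no"):  "review",
    ("low",  "yes"): "review",
    ("low",  "no"):  "reject",
}

# The equivalent propositional rules, one per table row.
def classify(income, employed):
    if income == "high" and employed == "yes":
        return "approve"
    if income == "high" and employed == "no":
        return "review"
    if income == "low" and employed == "yes":
        return "review"
    return "reject"
```

Both encodings define the same function; the study's question is which form users can read faster, more accurately, and with more confidence.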
58.
Implementing a rule-based transformation engine involves several tasks at various abstraction levels. We present a new tool called mtom for the efficient implementation of rule-based transformations. This engine should help bridge the gap between rewriting implementations and practical applications. It aims at implementing well-identified parts of complex applications where the use of rewriting is natural or crucial. These parts are specified using rewrite rules and integrated with the rest of the application, which is kept in a classical imperative language such as C, C++ or Java. Our tool, which can be viewed as a Yacc-like pre-processor, does not depend on a given term representation; rather, it accepts the term implementations (or term-like data types) of existing applications and permits defining and executing rewrite rules over those types. From our experience, this system is well suited for industrial use as well as for implementing rule-based languages. The paper introduces several features supported by mtom.
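A minimal sketch of what a rewrite-rule specification boils down to (pure illustration in Python; mtom itself generates host-language code from Yacc-like rule syntax over the application's own term types):

```python
# Terms as nested tuples, e.g. ("plus", ("succ", ("zero",)), ("zero",)).
# Two rewrite rules for Peano addition, applied recursively to a normal form:
#   plus(zero, y)    -> y
#   plus(succ(x), y) -> succ(plus(x, y))
def rewrite(term):
    head, *args = term
    term = (head, *[rewrite(a) for a in args])  # rewrite subterms first
    if head == "plus":
        if term[1] == ("zero",):
            return term[2]
        if term[1][0] == "succ":
            return ("succ", rewrite(("plus", term[1][1], term[2])))
    return term
```

The point of a tool like mtom is that rules of this shape are written declaratively and matched against existing data types, rather than hand-coded as nested conditionals.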
59.
In this paper, the impact of outliers on the performance of high-dimensional data analysis methods is studied in the context of face recognition. Most existing face recognition methods are PCA-like: faces are projected into a lower-dimensional space in which similarity between faces is assumed to be more easily evaluated. These methods are, however, very sensitive to the quality of the face images used in the training and recognition phases; their performance drops significantly when face images are not well centered or are taken under variable illumination conditions. In this paper, we study this phenomenon for two face recognition methods, namely PCA and LDA2D, and we propose a filtering process that automatically selects the noisy face images responsible for the performance degradation. This process uses two techniques. The first is based on the recently proposed robust high-dimensional data analysis method RobPCA and is specific to recognition from video sequences. The second is based on a novel and effective face classification technique; it isolates still face images that are imprecisely cropped, not well centered, or in a non-frontal pose. Experiments show that this filtering process improves recognition rates significantly, by 10 to 30%.
60.
The medical community produces and manipulates a tremendous volume of digital data for which computerized archiving, processing and analysis are needed. Grid infrastructures are promising for addressing the challenges arising in computerized medicine, but manipulating medical data on such infrastructures faces both the problem of interconnecting medical information systems with Grid middlewares and that of preserving patients’ privacy in a wide, distributed multi-user system. These constraints often limit the use of Grids for manipulating sensitive medical data. This paper describes our design of a medical data management system that takes advantage of the advanced gLite data management services, developed in the context of the EGEE project, to fulfill the stringent needs of the medical community. It ensures medical data protection through strict data access control, anonymization and encryption. The multi-level access control provides the flexibility needed for implementing complex medical use cases. Data anonymization prevents the exposure of most sensitive data to unauthorized users, and data encryption guarantees data protection even when data are stored at remote sites. Moreover, the developed prototype provides a Grid storage resource manager (SRM) interface to standard medical DICOM servers, thereby enabling transparent access to medical data without interfering with medical practice.