2,746 results found (search time: 15 ms)
71.
The performance of state-of-the-art speaker verification in uncontrolled environments is affected by several sources of variability. Short-duration variability is very common in these scenarios and causes speaker verification performance to degrade quickly as the duration of the verification utterances decreases. Linear discriminant analysis (LDA) is the most common session-variability compensation algorithm; nevertheless, it presents some shortcomings when trained with insufficient data. In this paper we introduce two methods for session-variability compensation in the i-vector space to deal with short-length utterances. The first incorporates the short-duration variability information into the within-class variance estimation process. The second compensates the session and short-duration variabilities in two different spaces with LDA algorithms (2S-LDA). First, we analyze the behavior of the within- and between-class scatters in the first proposed method. Then, both proposed methods are evaluated on telephone sessions from NIST SRE-08 for different durations of the evaluation utterances: full (average 2.5 min), 20, 15, 10 and 5 s. The 2S-LDA method obtains good results under the different short-length utterance conditions, with an average relative EER improvement of 1.58% over the best baseline (WCCN[LDA]). Finally, we apply the 2S-LDA method to speaker verification in reverberant environments, using different reverberant conditions from the 2013 Reverb Challenge, obtaining improvements of 8.96% and 23% under matched and mismatched reverberant conditions, respectively.
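As a rough illustration of the LDA projection that the compensation methods above build on, the sketch below computes a standard LDA transform on toy "i-vectors". The data, dimensions, and regularization constant are assumptions for illustration only; this is not the paper's 2S-LDA implementation.

```python
import numpy as np

def lda_projection(ivectors, labels, n_components):
    """Standard LDA: project onto leading generalized eigenvectors of
    (Sw^-1 Sb), where Sw/Sb are within-/between-class scatter matrices.
    Toy sketch; names and regularization are illustrative assumptions."""
    classes = np.unique(labels)
    mean = ivectors.mean(axis=0)
    d = ivectors.shape[1]
    Sw = np.zeros((d, d))  # within-class scatter
    Sb = np.zeros((d, d))  # between-class scatter
    for c in classes:
        Xc = ivectors[labels == c]
        mc = Xc.mean(axis=0)
        Sw += (Xc - mc).T @ (Xc - mc)
        diff = (mc - mean)[:, None]
        Sb += len(Xc) * (diff @ diff.T)
    # Regularize Sw slightly for numerical stability, then solve Sb v = λ Sw v
    evals, evecs = np.linalg.eig(np.linalg.solve(Sw + 1e-6 * np.eye(d), Sb))
    order = np.argsort(evals.real)[::-1]
    return evecs.real[:, order[:n_components]]

rng = np.random.default_rng(0)
# Two toy "speakers", 50 sessions each, 10-dimensional i-vectors
X = np.vstack([rng.normal(0, 1, (50, 10)), rng.normal(3, 1, (50, 10))])
y = np.array([0] * 50 + [1] * 50)
W = lda_projection(X, y, 1)
print(W.shape)  # (10, 1)
```

The 2S-LDA idea described in the abstract would apply two such projections trained in different spaces (session and short-duration variability); the sketch shows only the single-stage core.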
72.
Recently, the action systems formalism for parallel and distributed systems has been extended with a procedure mechanism. This gives us a very general framework for describing different communication paradigms for action systems, e.g. remote procedure calls. Action systems come with a design methodology based on the refinement calculus. Data refinement is a powerful technique for refining action systems. In this paper we develop a theory and proof rules, based on the data refinement approach, for the refinement of action systems that communicate via remote procedures. The proof rules are compositional, so that modular refinement of action systems is supported. As an example, we study in particular the atomicity refinement of actions. This is an important refinement strategy, as it can increase the degree of parallelism in an action system. Received February 1999 / Accepted in revised form July 2000
73.
Two complex perovskite-related structures were solved ab initio from precession electron diffraction intensities. Structure models were first derived from HREM images and then confirmed independently using two- and three-dimensional sets of precession intensities. Patterson techniques prove effective for ab initio structure resolution, especially in the case of projections with no overlapping atoms. The quality of precession intensity data can be high enough to solve unknown heavy-oxide structures.
74.
Editing and manipulation of existing 3D geometric objects are a means to extend their repertoire and promote their availability. Traditionally, tools to compose or manipulate objects defined by 3D meshes have been the realm of artists and experts. In this paper, we introduce a simple and effective user interface that makes composition of 3D mesh parts easy for non-professionals. Our technique borrows from the cut-and-paste paradigm, where a user can cut parts out of existing objects and paste them onto others to create new designs. To help the user attach objects to each other quickly and simply, many applications in computer graphics support the notion of "snapping". Similarly, our tool allows the user to loosely drag one mesh part onto another with an overlap, and lets the system snap them together gracefully. Snapping is accomplished using our Soft-ICP algorithm, which replaces the global transformation in the ICP algorithm with a set of point-wise, locally supported transformations. The technique enhances registration with a set of rigid-to-elastic transformations that account for simultaneous global positioning and local blending of the objects. For completeness of our framework, we present an additional simple mesh-cutting tool that adapts the graph-cut algorithm to meshes.
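The snapping step described above starts from classical ICP, whose global rigid transform Soft-ICP relaxes into locally supported per-point transforms. As a hedged sketch of just the classical rigid core (not the authors' Soft-ICP), the following computes the optimal rigid alignment between corresponding point sets via the standard Kabsch/Procrustes method; the point sets are made-up examples.

```python
import numpy as np

def rigid_align(source, target):
    """Optimal rigid (R, t) minimizing ||R @ source_i + t - target_i||
    for known correspondences (Kabsch method). Soft-ICP, per the abstract,
    would replace this single global transform with locally supported
    per-point transforms; only the classical core is sketched here."""
    ms, mt = source.mean(axis=0), target.mean(axis=0)
    H = (source - ms).T @ (target - mt)   # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:              # guard against reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = mt - R @ ms
    return R, t

# Toy example: a tetrahedron rotated 45° about z and translated
src = np.array([[0., 0., 0.], [1., 0., 0.], [0., 1., 0.], [0., 0., 1.]])
theta = np.pi / 4
Rz = np.array([[np.cos(theta), -np.sin(theta), 0.],
               [np.sin(theta),  np.cos(theta), 0.],
               [0., 0., 1.]])
dst = src @ Rz.T + np.array([2., -1., 0.5])
R, t = rigid_align(src, dst)
print(np.allclose(src @ R.T + t, dst))  # True
```

A full ICP loop would alternate this alignment step with nearest-neighbor correspondence search; the elastic blending that makes the snap "graceful" is the paper's contribution and is not reproduced here.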
75.
Prognosis of B-Chronic Lymphocytic Leukemia (B-CLL) remains a challenging problem in medical research and practice. While the parameters obtained by flow cytometry analysis form the basis of the diagnosis of the disease, the question of whether these parameters offer additional prognostic information remains open. In this work, we attempt to provide computer-assisted support to clinical experts in the field by deploying a classification system for multiparametric B-CLL prognosis that combines various heterogeneous (clinical, laboratory and flow cytometry) parameters associated with the disease. For this purpose, we employ the naïve Bayes classifier and propose an algorithm that improves its performance. The algorithm discretizes the continuous classification attributes (candidate prognostic parameters) and selects the most useful subset of them to optimize the classification accuracy. Thus, in addition to the high classification accuracy achieved, the proposed approach also suggests the most informative parameters for the prognosis. The experimental results demonstrate that the inclusion of flow cytometry parameters in our system improves prognosis.
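The discretize-then-classify pipeline described above can be sketched as follows: equal-width discretization of continuous attributes feeding a Laplace-smoothed naïve Bayes classifier. The bin count, toy data, and discretization scheme are assumptions for illustration; the authors' specific discretization and feature-subset-selection algorithm is not reproduced here.

```python
import numpy as np

def discretize(X, n_bins=3):
    """Equal-width discretization of each continuous column
    (a stand-in for the paper's discretization step)."""
    edges = [np.linspace(col.min(), col.max(), n_bins + 1)[1:-1] for col in X.T]
    return np.stack([np.digitize(X[:, j], edges[j]) for j in range(X.shape[1])],
                    axis=1)

class NaiveBayes:
    """Categorical naive Bayes with Laplace smoothing."""
    def fit(self, X, y):
        self.classes, counts = np.unique(y, return_counts=True)
        self.prior = counts / len(y)
        self.cond = []  # per-feature tables of P(value | class)
        for j in range(X.shape[1]):
            vals = np.unique(X[:, j])
            table = {}
            for c in self.classes:
                Xc = X[y == c, j]
                table[c] = {v: (np.sum(Xc == v) + 1) / (len(Xc) + len(vals))
                            for v in vals}
            self.cond.append(table)
        return self

    def predict(self, X):
        preds = []
        for x in X:
            scores = []
            for k, c in enumerate(self.classes):
                s = np.log(self.prior[k])
                for j, v in enumerate(x):
                    s += np.log(self.cond[j][c].get(v, 1e-9))
                scores.append(s)
            preds.append(self.classes[int(np.argmax(scores))])
        return np.array(preds)

# Toy two-class data standing in for continuous prognostic parameters
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 1, (30, 2)), rng.normal(4, 1, (30, 2))])
y = np.array([0] * 30 + [1] * 30)
Xd = discretize(X, n_bins=4)
acc = np.mean(NaiveBayes().fit(Xd, y).predict(Xd) == y)
print(acc)
```

The feature-selection component the abstract mentions would wrap this classifier, e.g. with a greedy search over attribute subsets scored by cross-validated accuracy.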
76.
Customizing software to perfectly fit individual needs is becoming increasingly important in information systems engineering. Users want to be able to customize software behavior in terms familiar from their own diverse needs and experience. We present a requirements-driven approach to behavioral customization of software systems. Goal models are constructed to represent alternative behaviors that users can exhibit to achieve their goals. Customization information is then added to restrict the space of possibilities to those that fit specific users, contexts, or situations. Meanwhile, elements of the goal models are mapped to units of source code. In this way, customization preferences posed at the requirements level are translated directly into system customizations. Our approach, which we apply to an on-line shopping cart system and an automated teller machine simulator, does not assume adoption of a particular development methodology, platform, or variability implementation technique, and keeps the reasoning computation overhead from interfering with the execution of the configured application.
77.
In this paper, an extension of the natural element method (NEM) is presented to solve finite deformation problems. Since NEM is a meshless method, its implementation does not require an explicit connectivity definition. Consequently, it is well suited to simulating large-strain problems with significant mesh distortion, reducing the need for remeshing and projection of results (extremely important in three-dimensional problems). NEM has important advantages over other meshless methods, such as the interpolant character of its shape functions and the ability to reproduce essential boundary conditions exactly along convex boundaries. The α-NEM extension generalizes this behaviour to non-convex boundaries. A total Lagrangian formulation has been employed to solve different problems with large strains, assuming hyperelastic behaviour. Several examples are presented in two and three dimensions, comparing the results with those of the finite element method. NEM performs better, demonstrating its capabilities in this kind of application. Copyright © 2004 John Wiley & Sons, Ltd.
78.
A software environment, called EDEN, that prototypes a recent approach to model-based diagnosis of discrete-event systems is presented. The environment integrates a specification language, called SMILE, a model base, and a diagnostic engine. SMILE enables the user to create libraries of models and systems, which are permanently stored in the model base, where both final and intermediate results of diagnostic sessions are hosted as well. Given the observation of a physical system gathered during its reaction to an external event, the diagnostic engine performs an a posteriori reconstruction of all the possible evolutions of the system over time and then draws candidate diagnoses from them. The diagnostic method is described using a simplified example from the domain of power transmission networks. Strong points of the method include compositional modeling, support for model update, the ability to focus on any sub-system, amenability to parallel execution, management of multiple faults, and broad notions of system and observation.
79.
Presents reflections on the life and achievements of Josef Maria Brožek. The author notes that, in his way, Brožek turned the historiography of psychology into an international domain, involving in this construction researchers not only from countries of Anglo-Saxon languages but also from those of Slavic and Latin languages, including Brazil. Quoting Braque, he affirmed that "knowledge of the past enables the revelation of the present." However, he believed that "reality is not revealed if it is not sparked by a poetic beam." The poetry of life illuminated Brožek and his actions. (PsycINFO Database Record (c) 2010 APA, all rights reserved)
80.
Despite the great technical advancement of mass spectrometry, this technique has so far contributed only in a limited way to the discovery and quantitation of specific/early markers linked to free radical-mediated diseases. Unsaturated aldehydes generated by free radical-induced lipid peroxidation of polyunsaturated fatty acids, and in particular 4-hydroxy-trans-2-nonenal (HNE), are involved in the onset and progression of many pathologies, such as cardiovascular diseases (atherosclerosis, long-term complications of diabetes) and neurodegenerative diseases (Alzheimer's disease, Parkinson's disease, and cerebral ischemia). Most of the biological effects of HNE are attributed to its capacity to react with the nucleophilic sites of proteins and peptides (besides nucleic acids) to form covalently modified biomolecules that can disrupt important cellular functions and induce mutations. Considering the emerging role of HNE in several human diseases, an unequivocal analytical approach such as mass spectrometry, able to detect and elucidate the structure of protein-HNE adducts in biological matrices, is urgently needed not only to understand the reaction mechanisms of HNE but also to gain a deeper insight into its pathological role, with the aim of providing intermediate diagnostic biomarkers for human diseases. This review focuses on the state of the art of mass spectrometric applications in the characterization of HNE-protein adducts, starting from the fundamental early studies and discussing the different MS-based approaches that can provide detailed information on the mechanistic aspects of HNE-protein interaction.
In the last decade, increases in the accessible mass ranges of modern instruments and advances in ionization methods have enabled a fundamental improvement in the analysis of protein-HNE adducts by mass spectrometry, in particular by matrix-assisted laser desorption/ionization (MALDI) and electrospray ionization (ESI) tandem mass spectrometry. Recent developments and uses of combined analytical approaches to detect and characterize the type and site of interaction are highlighted, and several other aspects, including sample preparation methodologies, structure elucidation, and data analysis, are also considered.

Copyright © 北京勤云科技发展有限公司 (Beijing Qinyun Technology Development Co., Ltd.) · 京ICP备09084417号