4,168 results found (search time: 672 ms)
131.
Intuitionistic fuzzy rough sets: at the crossroads of imperfect knowledge   Total citations: 5 (self-citations: 0, citations by others: 5)
Abstract: Just like rough set theory, fuzzy set theory addresses the topic of dealing with imperfect knowledge. Recent investigations have shown how both theories can be combined into a more flexible, more expressive framework for modelling and processing incomplete information in information systems. At the same time, intuitionistic fuzzy sets have been proposed as an attractive extension of fuzzy sets, enriching the latter with extra features to represent uncertainty (on top of vagueness). Unfortunately, the various tentative definitions of the concept of an ‘intuitionistic fuzzy rough set’ that were raised in their wake are a far cry from the original objectives of rough set theory. We intend to fill an obvious gap by introducing a new definition of intuitionistic fuzzy rough sets, as the most natural generalization of Pawlak's original concept of rough sets.
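The baseline construction this abstract generalises can be made concrete. Below is a minimal sketch of Pawlak's classical (crisp) lower and upper approximations; the toy partition and target concept are invented purely for illustration:

```python
def approximations(partition, target):
    """Pawlak lower/upper approximations of `target` with respect to an
    equivalence partition: each block groups indiscernible objects."""
    lower, upper = set(), set()
    for block in partition:
        if block <= target:      # block wholly inside target: certainly members
            lower |= block
        if block & target:       # block meets target: possibly members
            upper |= block
    return lower, upper

# toy information system: six objects, indiscernibility partition
partition = [{1, 2}, {3, 4}, {5}, {6}]
target = {1, 2, 3, 5}            # concept to approximate
low, up = approximations(partition, target)
boundary = up - low              # objects whose membership is undecidable
```

The boundary region (here the block {3, 4}, which straddles the target) is exactly the "imperfect knowledge" that fuzzy and intuitionistic extensions refine into graded membership.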
132.
This paper studies the modelling of legal reasoning about evidence within general theories of defeasible reasoning and argumentation. In particular, Wigmore's method for charting evidence and its use by modern legal evidence scholars is studied in order to give a formal underpinning in terms of logics for defeasible argumentation. Two notions turn out to be crucial, viz. argumentation schemes and empirical generalisations.
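A standard formal underpinning for defeasible argumentation is Dung-style abstract argumentation. As a minimal sketch (the arguments and attack relation below are invented for illustration, not taken from any actual Wigmore chart), the grounded semantics can be computed by iterating acceptance to a fixpoint:

```python
def grounded_extension(arguments, attacks):
    """Grounded semantics: repeatedly accept every argument all of whose
    attackers are themselves attacked by an already-accepted argument."""
    attackers = {a: {x for (x, t) in attacks if t == a} for a in arguments}
    accepted = set()
    changed = True
    while changed:
        changed = False
        for a in arguments:
            if a in accepted:
                continue
            # unattacked arguments are accepted vacuously (empty attacker set)
            if all(any((d, b) in attacks for d in accepted) for b in attackers[a]):
                accepted.add(a)
                changed = True
    return accepted

# hypothetical evidence chart: generalisation G undercuts testimony T,
# rebuttal R defeats G; claim C is unattacked
args = {"T", "C", "G", "R"}
attacks = {("G", "T"), ("R", "G")}
accepted = grounded_extension(args, attacks)
```

Here R is unattacked, R defeats G, and so T is reinstated: exactly the defeat-and-reinstatement pattern that argumentation schemes with critical questions give rise to.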
133.
Chris Reed, AI & Society, 1997, 11(1-2): 138-154
The concept of argumentation in AI is based almost exclusively on the use of formal, abstract representations. Despite their appealing computational properties, these abstractions become increasingly divorced from their real world counterparts, and, crucially, lose the ability to express the rich gamut of natural argument forms required for creating effective text. In this paper, the demands that socially situated argumentation places on knowledge representation are explored, and the various problems with existing formalisations are discussed. Insights from argumentation theory and social psychology are then adduced as key contributions to a notion of social context which is both computationally tractable and suitably expressive for handling the complexities of argumentation found in natural language.
134.
Symmetric multiprocessor systems are increasingly common, not only as high-throughput servers, but as a vehicle for executing a single application in parallel in order to reduce its execution latency. This article presents Pedigree, a compilation tool that employs a new partitioning heuristic based on the program dependence graph (PDG). Pedigree creates overlapping, potentially interdependent threads, each executing on a subset of the SMP processors that matches the thread’s available parallelism. A unified framework is used to build threads from procedures, loop nests, loop iterations, and smaller constructs. Pedigree does not require any parallel language support; it is a post-compilation tool that reads in object code. The SDIO Signal and Data Processing Benchmark Suite has been selected as an example of real-time, latency-sensitive code. Its coarse-grained data flow parallelism is naturally exploited by Pedigree to achieve speedups of 1.63×/2.13× (mean/max) and 1.71×/2.41× on two and four processors, respectively. There is roughly a 20% improvement over existing techniques that exploit only data parallelism. By exploiting the unidirectional flow of data for coarse-grained pipelining, the synchronization overhead is typically limited to less than 6% for synchronization latency of 100 cycles, and less than 2% for 10 cycles. This research was supported by ONR contract numbers N00014-91-J-1518 and N00014-96-1-0347. We would like to thank the Pittsburgh Supercomputing Center for use of their Alpha systems.
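The reported low synchronisation overhead follows from steady-state pipeline arithmetic. The back-of-envelope model below is not Pedigree's actual partitioning heuristic; the stage cycle counts are illustrative numbers chosen only to show the shape of the calculation:

```python
def pipeline_speedup(stage_cycles, sync_latency):
    """Steady-state speedup of a coarse-grained pipeline over sequential
    execution: the initiation interval is limited by the slowest stage
    plus the per-iteration synchronisation cost."""
    interval = max(stage_cycles) + sync_latency   # cycles between successive results
    sequential = sum(stage_cycles)                # one result done serially
    speedup = sequential / interval
    overhead = sync_latency / interval            # fraction of interval lost to sync
    return speedup, overhead

# four hypothetical, roughly balanced stages; 100-cycle synchronisation
sp, ov = pipeline_speedup([900, 1100, 1000, 950], sync_latency=100)
```

With well-balanced stages, the sync latency is amortised over the whole stage interval, which is why a 100-cycle latency can still cost only a few percent.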
135.
Information systems methodologies are an important component of the IS infrastructure and a primary device for organizing systems development work. Evidence suggests that methodology adoption and use are problematic. This research seeks to generate insight into business users' interest in adoption through detailed examination of a case. A framework is developed for organizing relevant research findings. The field research methods are described and details of the case reported. The case highlights the role of business managers in methodology adoption and the influence of business pressures originating in the strategic environment. Analysis shows the organizing framework to require extension to include a more direct role for business decision makers. It is argued that previous research has obscured the legitimate concern of business with systems development methodologies. As business increasingly asserts its interest in and control over IS, it will be necessary to give greater consideration to the needs of business in the selection and adoption of methodologies.
136.
Assuring a high quality requirements specification document involves both an early validation process and an increased level of participation. An approach and its supporting environment which combines the benefits of a formal system specification and its subsequent execution via a rapid prototype is reported. The environment assists in the construction, clarification, validation and visualisation of a formal specification. An illustrative case study demonstrates the consequences of assertions about system properties at this early stage of software development. Our approach involves the pragmatic combination of technical benefits of formal systems engineering based techniques with the context‐sensitive notions of increased participation of both developer and user stakeholders to move us closer towards a quality requirements specification document.
137.
The paper describes the fabrication of a novel miniature sensor for electrical tomography. The sensor comprises a number of copper electrodes that are fabricated around a small hole that is etched through a silicon wafer. Copper electrodes are electroplated to fill channels that are formed in thick photo-resist on top of the silicon wafer. Electrodes with a thickness of 60 μm, surrounding a hole of diameter 300 μm, have been realised. Initial measurements have been made using a commercial LCR meter applied to an eight-electrode sensor and images of an 80 μm diameter wire have been obtained. Future work will consider the integration of measurement circuitry alongside the electrodes in order to reduce parasitic capacitances.
138.
139.
A new robust neurofuzzy model construction algorithm has been introduced for the modeling of a priori unknown dynamical systems from observed finite data sets in the form of a set of fuzzy rules. Based on a Takagi-Sugeno (T-S) inference mechanism, a one-to-one mapping between a fuzzy rule base and a model matrix feature subspace is established. This link enables rule-based knowledge to be extracted from the matrix subspace to enhance model transparency. In order to achieve maximized model robustness and sparsity, a new robust extended Gram-Schmidt (G-S) method has been introduced via two effective and complementary approaches: regularization and D-optimality experimental design. Model rule bases are decomposed into orthogonal subspaces, so as to enhance model transparency with the capability of interpreting the derived rule base energy level. A locally regularized orthogonal least squares algorithm, combined with a D-optimality criterion for subspace-based rule selection, has been extended for fuzzy rule regularization and subspace-based information extraction. By using a weighting for the D-optimality cost function, the entire model construction procedure becomes automatic. Numerical examples are included to demonstrate the effectiveness of the proposed new algorithm.
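The core machinery, forward selection of rule regressors via Gram-Schmidt orthogonalisation, can be sketched as classic orthogonal least squares with the error-reduction ratio as the selection score. This omits the paper's additions (local regularisation and the D-optimality cost); the data and regressor matrix below are synthetic:

```python
import numpy as np

def ols_forward_select(P, y, n_terms):
    """Greedy OLS: at each step, pick the column of P whose component
    orthogonal to the already-chosen basis has the largest
    error-reduction ratio (w·y)^2 / ((w·w)(y·y))."""
    selected, basis = [], []
    for _ in range(n_terms):
        best_j, best_err, best_w = None, -np.inf, None
        for j in range(P.shape[1]):
            if j in selected:
                continue
            w = P[:, j].astype(float).copy()
            for u in basis:                       # modified Gram-Schmidt step
                w -= (u @ w) / (u @ u) * u
            if w @ w < 1e-12:                     # column already spanned
                continue
            err = (w @ y) ** 2 / ((w @ w) * (y @ y))
            if err > best_err:
                best_j, best_err, best_w = j, err, w
        if best_j is None:
            break
        selected.append(best_j)
        basis.append(best_w)
    return selected

# synthetic data: only regressors 2 and 0 actually enter the output
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 5))
y = 3.0 * X[:, 2] + 0.5 * X[:, 0]
sel = ols_forward_select(X, y, n_terms=2)
```

In the paper's setting each column of the regressor matrix corresponds to a fuzzy rule, so the selection order directly yields an interpretable, sparse rule base.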
140.
The 3D reconstruction of scenes containing independently moving objects from uncalibrated monocular sequences still poses serious challenges. Even if the background and the moving objects are rigid, each reconstruction is only known up to a certain scale, which results in a one-parameter family of possible, relative trajectories per moving object with respect to the background. In order to determine a realistic solution from this family of possible trajectories, this paper proposes to exploit the increased linear coupling between camera and object translations that tends to appear at false scales. An independence criterion is formulated in the sense of true object and camera motions being minimally correlated. The increased coupling at false scales can also lead to the destruction of special properties such as planarity, periodicity, etc. of the true object motion. This provides us with a second, ‘non-accidentalness’ criterion for the selection of the correct motion among the one-parameter family.
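The independence criterion can be sketched numerically: sweep the unknown scale, compose the candidate world-frame object trajectory from the camera trajectory and the scaled relative trajectory, and keep the scale that minimises the coupling. This is a simplified illustration of the idea (the composition and correlation measure here are assumptions, not the paper's exact formulation), on synthetic motion data:

```python
import numpy as np

def select_scale(cam_motion, rel_motion, scales):
    """Pick the object scale at which camera motion and the implied
    world-frame object motion are least linearly coupled."""
    def corr(a, b):
        a = a - a.mean(axis=0)
        b = b - b.mean(axis=0)
        return float((a * b).sum() / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))
    best_s, best_c = None, np.inf
    for s in scales:
        obj_world = cam_motion + s * rel_motion   # candidate world-frame trajectory
        c = abs(corr(cam_motion, obj_world))
        if c < best_c:
            best_s, best_c = s, c
    return best_s

# synthetic check: relative motion generated at a known true scale of 2.0
rng = np.random.default_rng(1)
cam = rng.normal(size=(200, 3))               # per-frame camera translations
obj_true = rng.normal(size=(200, 3))          # independent true object translations
rel = (obj_true - cam) / 2.0                  # relative trajectory, scale ambiguous
s_hat = select_scale(cam, rel, [0.5, 1.0, 1.5, 2.0, 2.5, 3.0])
```

At any false scale a residual multiple of the camera motion leaks into the object trajectory, so the correlation dips only at the true scale.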

Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号