Full-text availability:
  Subscription: 928 articles
  Free: 29 articles
  Free (domestic): 2 articles
By subject area:
  Electrical engineering: 19 articles
  Chemical industry: 161 articles
  Metalworking: 26 articles
  Machinery and instrumentation: 20 articles
  Building science: 9 articles
  Energy and power: 28 articles
  Light industry: 73 articles
  Hydraulic engineering: 11 articles
  Petroleum and natural gas: 1 article
  Radio electronics: 117 articles
  General industrial technology: 145 articles
  Metallurgical industry: 82 articles
  Atomic energy technology: 3 articles
  Automation technology: 264 articles
By publication year:
  2024: 4 articles
  2023: 15 articles
  2022: 39 articles
  2021: 42 articles
  2020: 42 articles
  2019: 30 articles
  2018: 40 articles
  2017: 26 articles
  2016: 34 articles
  2015: 26 articles
  2014: 26 articles
  2013: 51 articles
  2012: 47 articles
  2011: 43 articles
  2010: 38 articles
  2009: 34 articles
  2008: 43 articles
  2007: 32 articles
  2006: 27 articles
  2005: 24 articles
  2004: 25 articles
  2003: 16 articles
  2002: 30 articles
  2001: 10 articles
  2000: 17 articles
  1999: 11 articles
  1998: 24 articles
  1997: 26 articles
  1996: 29 articles
  1995: 11 articles
  1994: 17 articles
  1993: 11 articles
  1992: 7 articles
  1991: 6 articles
  1989: 7 articles
  1988: 2 articles
  1987: 3 articles
  1986: 2 articles
  1985: 3 articles
  1984: 2 articles
  1983: 2 articles
  1982: 5 articles
  1981: 4 articles
  1980: 4 articles
  1979: 2 articles
  1978: 3 articles
  1977: 6 articles
  1976: 3 articles
  1967: 1 article
  1966: 1 article
959 results in total; search took 15 ms.
71.
This article describes RISCBOT (the name combines the RISC lab with the "bot" of "robot"), a modular 802.11b-enabled mobile autonomous robot built at the RISC lab (the interdisciplinary Robotics, Intelligent Sensing and Control laboratory) of the University of Bridgeport. RISCBOT localizes itself, fulfills www-enabled online user requests, and navigates to various rooms using a visual recognition algorithm. The article describes the mechanical design, hardware, and software algorithms of the robot, as well as the web-based interface for communicating with it.
72.
The 2010 CAV (Computer-Aided Verification) Award was presented to Kenneth L. McMillan of Cadence Research Laboratories for a series of fundamental contributions that resulted in significant advances in the scalability of model checking tools. The annual award recognizes a specific fundamental contribution, or a series of outstanding contributions, to the CAV field.
73.
We propose tackling a "mini challenge" problem: a nontrivial verification effort that can be completed in 2–3 years and that will help establish the notational standards, common formats, and benchmark libraries the verification community needs in order to collaborate on Hoare's 15-year verification grand challenge. We believe that a suitable candidate for such a mini challenge is the development of a filesystem that is verifiably reliable and secure. The paper argues why we believe a filesystem is the right candidate for a mini challenge and describes a project in which we are building a small embedded filesystem for use with flash memory. The work described in this paper was carried out at the Jet Propulsion Laboratory, California Institute of Technology, under a contract with the National Aeronautics and Space Administration.
74.
75.
Companies have to adhere to compliance requirements. The compliance analysis of business operations is typically a joint effort of business experts and compliance experts, and those experts need a common understanding of business processes to conduct compliance management effectively. In this paper, we present a technique that aims at supporting this process. We argue that process templates generated from compliance requirements provide a basis for negotiation between business and compliance experts. We introduce a semi-automated and iterative approach to the synthesis of such process templates from compliance requirements expressed in Linear Temporal Logic (LTL). We show how generic constraints related to business process execution are incorporated and present criteria that indicate underspecification. Further, we outline how such underspecification may be resolved to iteratively build up a complete specification. For the synthesis, we leverage existing work on process mining and process restructuring. However, our approach is not limited to the control-flow perspective but also considers direct and indirect data-flow dependencies. Finally, we elaborate on the application of the derived process templates and present an implementation of our approach.
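As a point of reference for the kind of requirement such templates are synthesized from (and only as an illustration of a single compliance rule, not of the template-synthesis technique itself), a minimal check of an LTL-style "response" pattern over a finite execution trace might look as follows; the event names are hypothetical placeholders.

```python
# Hedged sketch: checks one LTL-style "response" rule, G(trigger -> F response),
# over a finite trace. This is not the paper's synthesis algorithm.
def satisfies_response(trace, trigger, response):
    """True iff every occurrence of `trigger` is eventually followed by `response`."""
    pending = False
    for event in trace:
        if event == trigger:
            pending = True          # a new obligation is opened
        if event == response:
            pending = False         # any open obligation is discharged
    return not pending              # no obligation may remain open at trace end

# Example: every submitted claim must eventually pass a four-eyes check.
trace = ["submit_claim", "assess_claim", "four_eyes_check", "pay_claim"]
print(satisfies_response(trace, "submit_claim", "four_eyes_check"))  # True
print(satisfies_response(trace, "pay_claim", "four_eyes_check"))     # False
```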
76.
A product's visual shape parameters and aesthetic aspects are among the crucial factors for its success in the market. The types and values of the shape parameters play an important role in the visual appearance of a product, and designers tend to be deliberate when deciding them. The aesthetic aspects of a product have therefore been a matter of concern for researchers alongside its electromechanical design. The Kano model has been found to be a useful tool for establishing the relationship between performance criteria and customer satisfaction; to achieve the desired customer satisfaction, the weight of each product criterion is determined using the Kano model. This study presents an integrative design approach combining the Kano model, the Taguchi method, and grey relational analysis to obtain the optimal combination of shape parameters and aesthetic aspects. Prioritized criteria of aesthetic attributes are extracted through the proposed methodology. A case study of evolving a car profile is presented.
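For readers unfamiliar with grey relational analysis, the sketch below shows how grey relational grades can rank design alternatives once per-criterion weights (e.g., derived from a Kano analysis) are available. The criteria, scores, and weights are illustrative placeholders, not values from the study.

```python
import numpy as np

# Minimal sketch of grey relational analysis (GRA), one ingredient of the
# Kano/Taguchi/GRA approach described above.
def grey_relational_grades(data, weights, zeta=0.5):
    """Rank alternatives (rows) against an ideal reference sequence.

    data    : alternatives x criteria matrix, larger-is-better values
    weights : per-criterion weights summing to 1 (e.g., from a Kano analysis)
    zeta    : distinguishing coefficient, conventionally 0.5
    """
    x = np.asarray(data, dtype=float)
    # normalise each criterion to [0, 1] so the ideal reference sequence is all ones
    norm = (x - x.min(axis=0)) / (np.ptp(x, axis=0) + 1e-12)
    delta = np.abs(1.0 - norm)                       # deviation from the reference
    coeff = (delta.min() + zeta * delta.max()) / (delta + zeta * delta.max())
    return coeff @ np.asarray(weights, dtype=float)  # weighted grade per alternative

# Three candidate car profiles scored on three hypothetical aesthetic criteria.
scores = [[7.0, 6.5, 8.0],   # profile A
          [8.5, 7.0, 6.0],   # profile B
          [6.0, 9.0, 7.5]]   # profile C
print(grey_relational_grades(scores, weights=[0.5, 0.3, 0.2]))
```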
77.
Microsystem Technologies - Due to rapid technological development, people increasingly depend on computers and other digital equipment across many areas of application. Therefore,...
78.

With the advancement of image acquisition devices and social networking services, a huge volume of image data is generated. Using various image and video processing applications, these images are manipulated, and the originals are thereby tampered with. Such tampered images are a prime source of spreading fake news, defaming individuals, and, in some cases (when used as evidence), misleading law-enforcement bodies. Hence, before relying entirely on image data, the authenticity of an image must be verified. Works in the literature verify the authenticity of an image based on noise inconsistency; however, they suffer from confusion between edges and noise, the need for post-processing operations for localization, and the need for prior knowledge about the image. To address these limitations, a noise-inconsistency-based technique is presented here to detect and localize forged regions in an image. The work consists of three major steps: pre-processing, noise estimation, and post-processing. Two publicly available datasets are used for the experiments, and results are reported in terms of pixel-level precision, recall, accuracy, and F1-score. The results are also compared with recent state-of-the-art techniques; the average accuracy of the proposed work on the datasets is 91.70%, the highest among the compared techniques.
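A rough sketch of the general noise-inconsistency idea (not the paper's exact pre-processing, noise-estimation, or post-processing steps): estimate a per-block noise level from a high-pass residual, then flag blocks whose level deviates strongly from the image-wide median. The synthetic test image and parameters are illustrative assumptions.

```python
import numpy as np

def blockwise_noise_map(gray, block=32):
    """Per-block noise level of a grayscale image (2-D array), from a high-pass residual."""
    g = gray.astype(np.float64)
    p = np.pad(g, 1, mode="edge")
    # 3x3 box blur without any SciPy dependency
    blur = sum(p[i:i + g.shape[0], j:j + g.shape[1]]
               for i in range(3) for j in range(3)) / 9.0
    resid = g - blur                               # high-pass residual
    rows, cols = g.shape[0] // block, g.shape[1] // block
    noise = np.zeros((rows, cols))
    for r in range(rows):
        for c in range(cols):
            b = resid[r * block:(r + 1) * block, c * block:(c + 1) * block]
            noise[r, c] = 1.4826 * np.median(np.abs(b - np.median(b)))  # robust sigma
    return noise

def flag_inconsistent_blocks(noise_map, k=3.0):
    """Boolean mask of blocks whose noise level is a k-sigma outlier vs. the image median."""
    med = np.median(noise_map)
    mad = 1.4826 * np.median(np.abs(noise_map - med)) + 1e-9
    return np.abs(noise_map - med) / mad > k

# Usage on a synthetic example: a clean gradient image with a noisy patch spliced in.
img = np.tile(np.linspace(0, 255, 256), (256, 1))
img[64:128, 64:128] += np.random.normal(0, 12, (64, 64))
print(flag_inconsistent_blocks(blockwise_noise_map(img)).astype(int))
```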
79.
80.
There is growing interest in algorithms for processing and querying continuous data streams (i.e., data seen only once, in a fixed order) with limited memory resources. In its most general form, a data stream is actually an update stream, i.e., one comprising data-item deletions as well as insertions. Such massive update streams arise naturally in several application domains (e.g., monitoring of large IP network installations or processing of retail-chain transactions). Estimating the cardinality of set expressions defined over several (possibly distributed) update streams is perhaps one of the most fundamental query classes of interest; as an example, such a query may ask "what is the number of distinct IP source addresses seen in packets passing through both router R1 and router R2, but not router R3?" Earlier work addressed only very restricted forms of this problem, focusing solely on the special case of insert-only streams and specific operators (e.g., union). In this paper, we propose the first space-efficient algorithmic solution for estimating the cardinality of full-fledged set expressions over general update streams. Our estimation algorithms are probabilistic in nature and rely on a novel, hash-based synopsis data structure, termed the 2-level hash sketch. We demonstrate how our 2-level hash sketch synopses can be used to provide low-error, high-confidence estimates for the cardinality of set expressions (including operators such as set union, intersection, and difference) over continuous update streams, using space that is significantly sublinear in the sizes of the streaming input (multi-)sets. Furthermore, our estimators never require rescanning or resampling of past stream items, regardless of the number of deletions in the stream. We also present lower bounds for the problem, demonstrating that the space usage of our estimation algorithms is within small factors of the optimal. Finally, we propose an optimized, time-efficient stream synopsis (based on 2-level hash sketches) that provides similar, strong accuracy-space guarantees while requiring only guaranteed logarithmic maintenance time per update, thus making our methods applicable to truly rapid-rate data streams. Results from an empirical study of our synopsis and estimation techniques verify the effectiveness of our approach. Received: 20 October 2003; Accepted: 16 April 2004; Published online: 14 September 2004. Edited by J. Gehrke and J. Hellerstein. Sumit Ganguly (sganguly@cse.iitk.ac.in); current affiliation: Department of Computer Science and Engineering, Indian Institute of Technology, Kanpur, India.
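To make the two-level hashing idea concrete, the toy sketch below assigns each item to a geometric "level" with a first-level hash, maps it to one of B bucket counters at that level with a second-level hash, and applies deletions as counter decrements; a distinct-count estimate is then read off one level's bucket occupancy. This crude, occupancy-based estimator is only a stand-in for the paper's 2-level hash sketch and carries none of its accuracy or space guarantees.

```python
import math
import random

# Toy illustration of two-level hashing over an update stream. Assumes a valid
# stream (no item is deleted more often than it was inserted), so a bucket
# counter returns to zero exactly when every item mapped to it is fully deleted.
class ToyUpdateStreamSketch:
    LEVELS, BUCKETS = 32, 256

    def __init__(self, seed=0):
        rng = random.Random(seed)
        self._salt1, self._salt2 = rng.getrandbits(64), rng.getrandbits(64)
        self.counters = [[0] * self.BUCKETS for _ in range(self.LEVELS)]

    def _level(self, item):
        h = hash((self._salt1, item)) & 0xFFFFFFFF
        # number of trailing zero bits ~ Geometric(1/2)
        return (h & -h).bit_length() - 1 if h else self.LEVELS - 1

    def update(self, item, delta):
        """delta = +1 for an insertion of `item`, -1 for a deletion."""
        level = min(self._level(item), self.LEVELS - 1)
        bucket = hash((self._salt2, item)) % self.BUCKETS
        self.counters[level][bucket] += delta

    def estimate_distinct(self):
        """Invert the linear-counting formula at the first moderately occupied level."""
        for level in range(self.LEVELS):
            occupied = sum(c != 0 for c in self.counters[level])
            if 0 < occupied <= 0.7 * self.BUCKETS:
                at_level = -self.BUCKETS * math.log(1.0 - occupied / self.BUCKETS)
                return at_level * 2 ** (level + 1)
        return 0.0

# Quick demo: 10,000 insertions followed by 5,000 deletions.
sketch = ToyUpdateStreamSketch()
for i in range(10_000):
    sketch.update(i, +1)
for i in range(5_000):
    sketch.update(i, -1)
print(round(sketch.estimate_distinct()))  # roughly 5,000, up to estimation error
```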