Article Search
  Paid full text   1942 articles
  Free   130 articles
  Free (domestic)   8 articles
Electrical engineering   17 articles
General   1 article
Chemical industry   425 articles
Metalworking   29 articles
Machinery and instruments   39 articles
Construction science   147 articles
Mining engineering   3 articles
Energy and power   91 articles
Light industry   131 articles
Water conservancy engineering   14 articles
Petroleum and natural gas   8 articles
Radio and electronics   157 articles
General industrial technology   370 articles
Metallurgical industry   210 articles
Nuclear technology   12 articles
Automation technology   426 articles
  2023   23 articles
  2022   35 articles
  2021   87 articles
  2020   53 articles
  2019   60 articles
  2018   64 articles
  2017   55 articles
  2016   73 articles
  2015   70 articles
  2014   75 articles
  2013   144 articles
  2012   99 articles
  2011   146 articles
  2010   88 articles
  2009   86 articles
  2008   94 articles
  2007   96 articles
  2006   94 articles
  2005   79 articles
  2004   45 articles
  2003   64 articles
  2002   51 articles
  2001   23 articles
  2000   21 articles
  1999   26 articles
  1998   44 articles
  1997   23 articles
  1996   25 articles
  1995   22 articles
  1994   31 articles
  1993   14 articles
  1992   17 articles
  1991   13 articles
  1990   9 articles
  1989   15 articles
  1988   7 articles
  1987   9 articles
  1986   9 articles
  1985   19 articles
  1984   8 articles
  1983   10 articles
  1982   6 articles
  1981   7 articles
  1980   12 articles
  1979   3 articles
  1978   4 articles
  1977   3 articles
  1976   3 articles
  1973   3 articles
  1969   5 articles
Sort order: 2080 results found in total; search took 15 ms.
41.
Distributed video coding (DVC) constitutes an original coding framework designed to meet the stringent requirements imposed by uplink-oriented and low-power mobile video applications. The quality of the side information available to the decoder and the efficiency of the employed channel codes are the primary factors determining the success of a DVC system. This contribution introduces two novel techniques for probabilistic motion compensation to generate side information at the Wyner-Ziv decoder. The employed DVC scheme uses a base layer that serves as a hash to facilitate overlapped block motion estimation at the decoder side. On top of the base layer, a supplementary Wyner-Ziv layer is coded in the DCT domain. Both proposed probabilistic motion compensation techniques are driven by the actual correlation channel statistics and reuse information contained in the hash. Experimental results report significant rate savings achieved by the novel side information generation methods compared to previous techniques. Moreover, the presented DVC architecture, featuring the proposed side information generation techniques, delivers state-of-the-art compression performance.
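As a rough illustration of the kind of probabilistic motion compensation the abstract refers to, the following Python sketch fuses several candidate predictor blocks into one side-information block, weighting each candidate by its likelihood under an assumed Laplacian correlation channel with the hash. The block size, the Laplacian parameter `alpha`, and the fusion rule are illustrative assumptions, not the authors' actual algorithm.

```python
import numpy as np

def laplacian_likelihood(candidate, hash_block, alpha=0.2):
    """Score a candidate predictor block under an assumed Laplacian
    correlation channel between the hash and the unknown frame."""
    residual = np.abs(candidate.astype(float) - hash_block.astype(float))
    return float(np.exp(-alpha * residual).mean())

def probabilistic_motion_compensation(candidates, hash_block):
    """Fuse candidate predictor blocks into one side-information block,
    weighting each candidate by its channel likelihood (illustrative only)."""
    weights = np.array([laplacian_likelihood(c, hash_block) for c in candidates])
    weights /= weights.sum()
    stacked = np.stack([c.astype(float) for c in candidates])
    # Weighted average over the candidate axis -> one fused block.
    return np.tensordot(weights, stacked, axes=1)

# Toy usage: fuse three 8x8 candidate blocks against a decoder-side hash block.
rng = np.random.default_rng(0)
hash_block = rng.integers(0, 256, (8, 8))
candidates = [hash_block + rng.normal(0, s, (8, 8)) for s in (2.0, 10.0, 30.0)]
side_information = probabilistic_motion_compensation(candidates, hash_block)
print(side_information.shape)  # (8, 8)
```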
42.
Recently, a multisecret sharing scheme for secret color images among a set of users was proposed, which allows each participant to share secret color images with the rest of the participants in such a way that all of them can recover all secret color images only if all participants pool their shares. In this work, a parallel implementation of the cellular automata-based multisecret sharing scheme is proposed, using CUDA (Compute Unified Device Architecture) technology for the parallelization and taking advantage of the fact that each cell of the cellular automata can be processed independently. The processing time of the proposed scheme is analyzed, and it is shown that the proposed parallel algorithm using the CUDA architecture is more than 12 times faster than the conventional sequential algorithm. This reduction in processing time allows the practical use of the secret sharing scheme in many information security fields.
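To illustrate why a cellular automata-based scheme maps so naturally onto CUDA, here is a minimal Numba CUDA sketch of a single step of a toy memory-based, XOR-reversible cellular automaton in which every cell is updated by an independent GPU thread. The update rule, neighbourhood, and array layout are assumptions for illustration only (not the published sharing scheme), and running it requires a CUDA-capable GPU with Numba installed.

```python
import numpy as np
from numba import cuda

@cuda.jit
def ca_step(prev, curr, nxt):
    """One step of a toy memory-based, XOR-reversible cellular automaton.
    Each cell depends only on its local neighbourhood and its own past
    value, so every cell can be computed by an independent GPU thread."""
    i = cuda.grid(1)
    n = curr.shape[0]
    if i < n:
        left = curr[i - 1] if i > 0 else curr[n - 1]
        right = curr[i + 1] if i < n - 1 else curr[0]
        nxt[i] = (left ^ curr[i] ^ right) ^ prev[i]

# Host-side driver: every cell (e.g. one pixel channel of a share) is a thread.
n = 1 << 20
prev = cuda.to_device(np.random.randint(0, 256, n, dtype=np.uint8))
curr = cuda.to_device(np.random.randint(0, 256, n, dtype=np.uint8))
nxt = cuda.device_array(n, dtype=np.uint8)

threads_per_block = 256
blocks = (n + threads_per_block - 1) // threads_per_block
ca_step[blocks, threads_per_block](prev, curr, nxt)
result = nxt.copy_to_host()
```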
43.
Granular computing is a computational paradigm that mimics human cognition by grouping similar information together. Compatibility operators such as cardinality, orientation, density, and multidimensional length act both on raw data and on the information granules formed from raw data, providing a framework for human-like information processing in which information granulation is intrinsic. Granular computing is not a new computational concept; however, it has only relatively recently been formalised computationally through Computational Intelligence methods such as Fuzzy Logic and Rough Sets. Neutrosophy is a unifying field in logics that extends the concept of fuzzy sets into a three-valued logic with an explicit indeterminacy value, and it is the basis of neutrosophic logic, neutrosophic probability, neutrosophic statistics, and interval-valued neutrosophic theory. In this paper we present a new framework for creating Granular Computing Neural-Fuzzy modelling structures using Neutrosophic Logic to address the issue of uncertainty during the data granulation process. The theoretical and computational aspects of the approach are presented and discussed, together with a case study using real industrial data. The case study under investigation is the predictive modelling of the Charpy toughness of heat-treated steel, a process that exhibits very high measurement uncertainty due to the thermomechanical complexity of the Charpy test itself. The results show that the proposed approach leads to more meaningful and simpler granular models, with better generalisation performance compared to other recent modelling attempts on the same data set.
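The neutrosophic notion the abstract builds on can be stated compactly: each element carries independent degrees of truth, indeterminacy, and falsity. The following minimal Python sketch models such a triple and uses an entirely illustrative compatibility rule for deciding whether two data points may be grouped into one granule; the thresholds and the rule itself are assumptions, not the paper's method.

```python
from dataclasses import dataclass

@dataclass
class NeutrosophicMembership:
    """Neutrosophic degrees of an element: truth, indeterminacy and falsity
    are independent components, so t + i + f is not forced to equal 1."""
    t: float  # degree of truth
    i: float  # degree of indeterminacy (e.g. measurement uncertainty)
    f: float  # degree of falsity

def compatible_for_granulation(a: NeutrosophicMembership,
                               b: NeutrosophicMembership,
                               max_indeterminacy: float = 0.3,
                               max_truth_gap: float = 0.2) -> bool:
    """Toy granulation rule: merge two data points into one granule only if
    their indeterminacy is low and their truth degrees are close."""
    return max(a.i, b.i) <= max_indeterminacy and abs(a.t - b.t) <= max_truth_gap

print(compatible_for_granulation(NeutrosophicMembership(0.8, 0.1, 0.1),
                                 NeutrosophicMembership(0.7, 0.2, 0.2)))  # True
```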
44.
45.
The growing need for reliable, efficient, high-temperature hydrogen and hydrocarbon monitoring has fueled research into novel structures for gas sensing. Metal oxide semiconductor (MOS) devices employing a catalytic metal layer have emerged as one of the leading sensing platforms for such applications, owing to their high sensitivity and inherent capability for signal amplification. The limited operating temperature of such devices when silicon is used as the semiconductor has led research efforts to focus on replacing silicon with silicon carbide (SiC). More recently, MOS devices with different oxide layers exhibiting improved sensing performance have emerged. Considering the amount of research carried out in this area in recent years, it is important to elucidate the new findings and the gas interaction mechanisms that have been ascribed to such devices, and to bring together the several theories proposed by different research groups. In this paper we first highlight the needs that have driven research into SiC-based field-effect hydrogen and hydrocarbon sensors, illustrate the various structures being investigated, and describe the device evolution and current status. We provide several sensing examples of devices that make use of different oxide layers, demonstrate how their electrical properties change in the presence of the target gases, and present the hydrogen gas interaction mechanisms of these sensors.
46.
Current air quality models generate deterministic forecasts by assuming a perfect model, perfectly known parameters, and exact input data. However, our knowledge of the physics is imperfect. It is therefore of interest to extend deterministic simulation results with "error bars" that quantify the degree of uncertainty, and to analyze the impact of uncertain inputs on the simulation results. This added information provides a confidence level for the forecast results. The Monte Carlo (MC) method is a popular approach for air quality model uncertainty analysis, but it converges slowly. This work discusses the polynomial chaos (PC) method, which is more suitable for uncertainty quantification (UQ) in large-scale models. We propose a new approach for uncertainty apportionment (UA); that is, we develop a PC approach to attribute the uncertainties in model results to the different uncertain inputs. The UQ and UA techniques are implemented in the Sulfur Transport Eulerian Model (STEM-III). A typical air pollution scenario in the northeast region of the USA is considered. The UQ and UA results allow us to assess the combined effects of different input uncertainties on the forecast uncertainty. They also make it possible to quantify the contribution of individual input uncertainties to the uncertainty in the predicted ozone and PAN concentrations.
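As a minimal illustration of the polynomial chaos idea behind the UQ step, the sketch below projects a toy model of a single Gaussian-uncertain input onto probabilists' Hermite polynomials via Gauss-Hermite quadrature and reads the output mean and variance directly from the coefficients, avoiding slow Monte Carlo sampling. The model, the truncation order, and the single-input setting are assumptions for illustration and are unrelated to STEM-III itself.

```python
import math
import numpy as np
from numpy.polynomial import hermite_e as He

def model(xi):
    """Toy model output driven by one standard-normal uncertain input."""
    return np.exp(0.3 * xi) + 0.1 * xi**2

# Gauss-Hermite (probabilists') quadrature; the weight exp(-x^2/2) is
# normalised by sqrt(2*pi) so the weights integrate the standard normal.
order, n_quad = 4, 20
nodes, weights = He.hermegauss(n_quad)
weights = weights / np.sqrt(2.0 * np.pi)

# Spectral projection: c_k = E[model(xi) * He_k(xi)] / E[He_k(xi)^2],
# with E[He_k^2] = k! for probabilists' Hermite polynomials.
coeffs = []
for k in range(order + 1):
    basis_k = He.hermeval(nodes, [0.0] * k + [1.0])
    coeffs.append(np.sum(weights * model(nodes) * basis_k) / math.factorial(k))

# Output statistics follow directly from the PC coefficients,
# without any Monte Carlo sampling.
mean = coeffs[0]
variance = sum(math.factorial(k) * coeffs[k] ** 2 for k in range(1, order + 1))
print(f"PC mean ~ {mean:.4f}, PC variance ~ {variance:.4f}")
```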
47.
Population-based psychiatric admission rates vary across geographic areas, but the reasons for this variation are unknown. Insofar as Community Mental Health Centers (CMHCs) provide outpatient services that may reduce the need for hospitalization, the presence and structural characteristics of CMHCs may affect a population's psychiatric admission rates. This study uses small area analysis to examine how general hospital psychiatric admission rates are associated with CMHC characteristics. Based on a survey of all CMHCs in Iowa and corresponding small area variation data, population admission rates were found to be higher in areas closer to the CMHC and lower in outlying catchment areas, after adjusting for age, sex, and urban/rural differences in populations. There was little evidence that differences in staffing and service variables influenced admission rates, although greater CMHC staff coverage by social workers and psychiatric residents was associated with lower admission rates. The results suggest that CMHCs do not lower an area's hospitalization rate; in fact, the presence of CMHCs may promote a "supplier-induced demand" phenomenon of higher admissions.
48.

Robotic process automation (RPA) is a disruptive technology for rapidly automating already digital yet still manual tasks and subprocesses, as well as whole business processes. In contrast to other process automation technologies, RPA is lightweight and accesses only the presentation layer of IT systems to mimic human behavior. Due to the novelty of RPA and the varying approaches to implementing the technology, there are reports that up to 50% of RPA projects fail. To tackle this issue, we use a design science research approach to develop a framework for the implementation of RPA projects. We analyzed 35 reports on real-life projects to derive a preliminary sequential model. We then performed multiple expert interviews and workshops to validate and refine the model. The result is a framework with variable stages that offers guidelines with enough flexibility to be applicable in complex and heterogeneous corporate environments as well as in small and medium-sized companies. It is structured around three phases: initialization, implementation, and scaling. These phases comprise eleven stages that are relevant both within a single project and as a continuous cycle spanning individual projects. Together, they structure how to manage knowledge and support processes for the execution of RPA implementation projects.
49.
50.
Worst-case execution time (WCET) analysis is concerned with computing a bound, as precise as possible, on the maximum time the execution of a program can take. This information is indispensable for developing safety-critical real-time systems, e.g., in the avionics and automotive fields. Starting with the initial works of Chen, Mok, Puschner, Shaw, and others in the mid and late 1980s, WCET analysis has turned into a well-established and vibrant field of research and development in academia and industry. The increasing number and diversity of hardware and software platforms and the ongoing rapid technological advancement have driven the development of a wide array of distinct methods and tools for WCET analysis. The precision, generality, and efficiency of these methods and tools depend heavily on the expressiveness and usability of the annotation languages used to describe feasible and infeasible program paths. In this article we survey the annotation languages that we consider formative for the field. By investigating and comparing their individual strengths and limitations with respect to a set of pivotal criteria, we provide a coherent overview of the state of the art. By identifying open issues, we encourage further research. In this way, our approach is orthogonal and complementary to a recent article by Wilhelm et al., who provide a thorough survey of the WCET analysis methods and tools that have been developed and used in academia and industry.