131.
The evolution of the web has outpaced itself: A growing wealth of information and increasingly sophisticated interfaces necessitate automated processing, yet existing automation and data extraction technologies have been overwhelmed by this very growth. To address this trend, we identify four key requirements for web data extraction, automation, and (focused) web crawling: (1) interact with sophisticated web application interfaces, (2) precisely capture the relevant data to be extracted, (3) scale with the number of visited pages, and (4) readily embed into existing web technologies. We introduce OXPath as an extension of XPath for interacting with web applications and extracting data thus revealed—matching all the above requirements. OXPath’s page-at-a-time evaluation guarantees memory use independent of the number of visited pages, yet remains polynomial in time. We experimentally validate the theoretical complexity and demonstrate that OXPath’s resource consumption is dominated by page rendering in the underlying browser. With an extensive study of sublanguages and properties of OXPath, we pinpoint the effect of specific features on evaluation performance. Our experiments show that OXPath outperforms existing commercial and academic data extraction tools by a wide margin.
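OXPath itself is not sketched here, but the page-at-a-time evaluation strategy the abstract credits with constant memory use can be illustrated with a small, hypothetical Python crawler: each page is processed and discarded before the next is fetched, so memory stays bounded no matter how long the crawl runs. The `fetch`, `extract`, and `next_link` callbacks are illustrative stand-ins, not OXPath's API.

```python
def page_at_a_time(fetch, extract, next_link, start_url):
    """Stream extraction results one page at a time: only the current
    page is held in memory, so memory use is independent of how many
    pages the crawl visits (the property OXPath guarantees)."""
    url = start_url
    while url is not None:
        page = fetch(url)          # load/render the current page
        yield from extract(page)   # emit its records immediately
        url = next_link(page)      # find the link to follow next
        del page                   # discard the page before moving on

# Toy 'site': three pages, each with two records and a next pointer.
site = {
    "p1": {"records": ["a", "b"], "next": "p2"},
    "p2": {"records": ["c", "d"], "next": "p3"},
    "p3": {"records": ["e", "f"], "next": None},
}
results = list(page_at_a_time(site.get,
                              lambda p: p["records"],
                              lambda p: p["next"],
                              "p1"))
print(results)  # ['a', 'b', 'c', 'd', 'e', 'f']
```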
132.
Query optimizers rely on statistical models that succinctly describe the underlying data. Models are used to derive cardinality estimates for intermediate relations, which in turn guide the optimizer to choose the best query execution plan. The quality of the resulting plan is highly dependent on the accuracy of the statistical model that represents the data. It is well known that small errors in the model estimates propagate exponentially through joins, and may result in the choice of a highly sub-optimal query execution plan. Most commercial query optimizers make the attribute value independence assumption: all attributes are assumed to be statistically independent. This reduces the statistical model of the data to a collection of one-dimensional synopses (typically in the form of histograms), and it permits the optimizer to estimate the selectivity of a predicate conjunction as the product of the selectivities of the constituent predicates. However, this independence assumption is more often than not wrong, and is considered to be the most common cause of sub-optimal query execution plans chosen by modern query optimizers. We take a step towards a principled and practical approach to performing cardinality estimation without making the independence assumption. By carefully using concepts from the field of graphical models, we are able to factor the joint probability distribution over all the attributes in the database into small, usually two-dimensional distributions, without a significant loss in estimation accuracy. We show how to efficiently construct such a graphical model from the database using only two-way join queries, and we show how to perform selectivity estimation in a highly efficient manner. We integrate our algorithms into the PostgreSQL DBMS. Experimental results indicate that estimation errors can be greatly reduced, leading to orders of magnitude more efficient query execution plans in many cases. Optimization time is kept in the range of tens of milliseconds, making this a practical approach for industrial-strength query optimizers.
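The independence assumption and its failure mode can be seen in a toy example: with two perfectly correlated attributes, multiplying per-attribute selectivities (as an optimizer with one-dimensional histograms would) misestimates the conjunction, while a two-dimensional synopsis of the kind such a graphical model factors into recovers the exact count. The table and the numbers are invented for illustration.

```python
from collections import Counter

# Toy table with two correlated attributes: make determines model.
rows = [("Honda", "Civic")] * 40 + [("Toyota", "Corolla")] * 60
n = len(rows)

sel_make  = sum(1 for mk, _ in rows if mk == "Honda") / n   # 0.4
sel_model = sum(1 for _, md in rows if md == "Civic") / n   # 0.4

# Attribute value independence assumption: multiply selectivities.
est_avi = sel_make * sel_model          # 0.16 -> estimates ~16 rows

# Two-dimensional synopsis (joint distribution over both attributes),
# as one factor of the graphical model would store it:
joint = Counter(rows)
est_joint = joint[("Honda", "Civic")] / n   # 0.4 -> 40 rows, exact

print(round(est_avi * n), round(est_joint * n))  # 16 40
```

The independence estimate is off by a factor of 2.5 on a single two-column conjunction; through a chain of joins such errors compound multiplicatively.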
133.
Support for generic programming was added to the Java language in 2004, representing perhaps the most significant change to one of the most widely used programming languages today. Researchers and language designers anticipated this addition would relieve many long-standing problems plaguing developers, but surprisingly, no one has yet measured how generics have been adopted and used in practice. In this paper, we report on the first empirical investigation into how Java generics have been integrated into open source software by automatically mining the history of 40 popular open source Java programs, traversing more than 650 million lines of code in the process. We evaluate five hypotheses and research questions about how Java developers use generics. For example, our results suggest that generics sometimes reduce the number of type casts and that generics are usually adopted by a single champion in a project, rather than all committers. We also offer insights into why some features may be adopted sooner and other features may be held back.
134.
When implementing a propagator for a constraint, one must decide about variants: When implementing min, should one also implement max? Should one implement linear constraints both with unit and non-unit coefficients? Constraint variants are ubiquitous: implementing them requires considerable (if not prohibitive) effort and decreases maintainability, but will deliver better performance than resorting to constraint decomposition. This paper shows how to use views to derive propagator variants, combining the efficiency of dedicated propagator implementations with the simplicity and effortlessness of decomposition. A model for views and derived propagators is introduced. Derived propagators are proved to be perfect in that they inherit essential properties such as correctness and domain and bounds consistency. Techniques for systematically deriving propagators such as transformation, generalization, specialization, and type conversion are developed. The paper introduces an implementation architecture for views that is independent of the underlying constraint programming system. A detailed evaluation of views implemented in Gecode shows that derived propagators are efficient and that views often incur no overhead. Views have proven essential for implementing Gecode, substantially reducing the amount of code that needs to be written and maintained.
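A minimal sketch of the view idea, under the assumption of simple bounds propagation: a minus view presents a variable x as −x, so a max propagator can be derived from the min propagator via max(xs) = −min(−xs), with no dedicated max implementation. All class and method names below are hypothetical, not Gecode's API.

```python
class IntVar:
    """Finite-domain integer variable, tracked by its bounds."""
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi
    def adjust_lo(self, v): self.lo = max(self.lo, v)
    def adjust_hi(self, v): self.hi = min(self.hi, v)

class MinusView:
    """View presenting x as -x; bound updates map back through the view."""
    def __init__(self, var): self.var = var
    @property
    def lo(self): return -self.var.hi
    @property
    def hi(self): return -self.var.lo
    def adjust_lo(self, v): self.var.adjust_hi(-v)
    def adjust_hi(self, v): self.var.adjust_lo(-v)

def prop_min(z, xs):
    """Bounds propagator for z = min(xs) -- the only one we implement."""
    z.adjust_hi(min(x.hi for x in xs))   # z cannot exceed the smallest hi
    z.adjust_lo(min(x.lo for x in xs))   # z cannot go below the smallest lo
    for x in xs:                         # every x is at least z
        x.adjust_lo(z.lo)

def prop_max(z, xs):
    """Derived propagator: max(xs) = -min(-xs), obtained via minus views."""
    prop_min(MinusView(z), [MinusView(x) for x in xs])

x, y, z = IntVar(0, 10), IntVar(3, 7), IntVar(-5, 20)
prop_max(z, [x, y])
print(z.lo, z.hi)  # 3 10
```

The derived propagator inherits the bounds consistency of `prop_min` for free, which is the "perfect" property the paper proves in general.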
135.
Libraries, as we know them today, can be defined by the term Library 1.0. This describes the way resources are kept on shelves or on a computer behind a login. These resources can be taken from a shelf, checked out by the library staff, taken home for a certain length of time, absorbed, and then returned to the library for someone else to use. Library 1.0 is a one-directional service that takes people to the information they require. Library 2.0 – or L2, as it is now more commonly known – aims to take the information to the people by bringing the library service to the Internet and getting users more involved by encouraging feedback and participation. This paper presents an overview of Library 2.0 and introduces Web 2.0 concepts.
136.
The Network Mobility (NEMO) protocol is needed to support the worldwide mobility of aircraft mobile networks across different access networks in the future IPv6-based aeronautical telecommunications network (ATN). NEMO suffers, however, from the constraint that all traffic has to be routed via the home agent. The existing correspondent router (CR) protocol solves this triangular routing problem and permits packets to be routed on a direct path between the mobile network and the ground-based correspondent nodes. We identify security deficiencies of this protocol that make it unsuitable for use within the ATN. We therefore propose a new route optimization procedure, based on the CR protocol, that provides a higher level of security. We evaluate our new protocol in three ways. First, we conduct a simulation-based handover performance study using an implementation of a realistic aeronautical access technology. We then investigate the mobility signaling overhead. Finally, we specify a threat model applicable to the aeronautical environment and use it to perform a security analysis of both the old and the new protocol. We show that our protocol is not only more secure but also provides better handover latency, smaller overhead in the aeronautical scenario, and a higher level of resilience than the original CR protocol.
137.
We study a motion planning problem where items have to be transported from the top room of a tower to the bottom of the tower, while simultaneously other items have to be transported in the opposite direction. Item sets are moved in two baskets hanging on a rope and pulley. To guarantee stability of the system, the weight difference between the contents of the two baskets must always stay below a given threshold. We prove that it is $\varPi_{2}^{p}$-complete to decide whether some given initial situation of the underlying discrete system can lead to a given goal situation. Furthermore we identify several polynomially solvable special cases of this reachability problem, and we also settle the computational complexity of a number of related questions.
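For tiny instances, the discrete reachability question can be brute-forced with a breadth-first search whose moves exchange two item subsets subject to the weight-difference threshold. This toy search says nothing about the $\varPi_{2}^{p}$-hardness of the general problem, and the item names and weights are invented.

```python
from itertools import chain, combinations

def subsets(items):
    s = list(items)
    return chain.from_iterable(combinations(s, r) for r in range(len(s) + 1))

def reachable(top, bottom, weight, threshold, goal):
    """BFS over states (items at top, items at bottom). One move sends a
    subset down in one basket while a subset comes up in the other, and is
    legal only if the basket weights differ by at most the threshold."""
    start = (frozenset(top), frozenset(bottom))
    seen, frontier = {start}, [start]
    while frontier:
        nxt = []
        for t, b in frontier:
            if (t, b) == goal:
                return True
            for down in subsets(t):
                for up in subsets(b):
                    if abs(sum(weight[i] for i in down) -
                           sum(weight[i] for i in up)) <= threshold:
                        state = (t - set(down) | set(up),
                                 b - set(up) | set(down))
                        if state not in seen:
                            seen.add(state)
                            nxt.append(state)
        frontier = nxt
    return False

w = {"a": 2, "b": 3, "c": 2}
# Can a and b reach the bottom, and c the top, with weight tolerance 1?
print(reachable({"a", "b"}, {"c"}, w, 1,
                (frozenset({"c"}), frozenset({"a", "b"}))))  # True
```

Even in this three-item instance the items must shuttle back and forth several times, which hints at why the general decision problem is so hard.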
138.
This study sought to assess sediment contamination by trace metals (cadmium, chromium, cobalt, copper, manganese, nickel, lead and zinc), to localize contaminated sites and to identify environmental risk for aquatic organisms in wadis of the Kebir Rhumel basin in northeastern Algeria. Water and surficial sediments (0-5 cm) were sampled in winter, spring, summer and autumn from 37 sites along permanent wadis of the Kebir Rhumel basin. Sediment trace metal contents were measured by Flame Atomic Absorption Spectroscopy. Median trace metal concentrations in sediments followed a decreasing order: Mn > Zn > Pb > Cr > Cu > Ni > Co > Cd. Extreme values (dry weight) of the trace metals are as follows: 0.6-3.4 µg/g for Cd, 10-216 µg/g for Cr, 9-446 µg/g for Cu, 3-20 µg/g for Co, 105-576 µg/g for Mn, 10-46 µg/g for Ni, 11-167 µg/g for Pb, and 38-641 µg/g for Zn. According to world natural concentrations, all sediments collected were considered as contaminated by one or more elements. Comparing measured concentrations with American guidelines (Threshold Effect Level: TEL; Probable Effect Level: PEL) showed that biological effects could be occasionally observed for cadmium, chromium, lead and nickel levels but frequently observed for copper and zinc levels. Sediment quality was shown to be excellent for cobalt and manganese but medium to bad for cadmium, chromium, copper, lead, nickel and zinc regardless of sites.
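The TEL/PEL comparison amounts to a three-way classification per metal: below TEL, adverse effects are rarely expected; between TEL and PEL, occasionally; above PEL, frequently. A sketch, using placeholder guideline numbers (not the official TEL/PEL values) and the study's maximum concentrations as sample inputs:

```python
def effect_class(conc, tel, pel):
    """Classify a sediment concentration against TEL/PEL guidelines."""
    if conc < tel:
        return "rare"        # below Threshold Effect Level
    if conc <= pel:
        return "occasional"  # between TEL and Probable Effect Level
    return "frequent"        # above PEL

# Placeholder (TEL, PEL) pairs in microg/g -- illustrative, not official.
guidelines = {"Cd": (0.6, 3.5), "Cu": (36, 197), "Zn": (123, 315)}

# Study maxima (microg/g dry weight) as sample inputs.
sample = {"Cd": 1.2, "Cu": 446, "Zn": 641}
for metal, conc in sample.items():
    tel, pel = guidelines[metal]
    print(metal, effect_class(conc, tel, pel))
# Cd occasional / Cu frequent / Zn frequent
```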
139.
The ever accelerating state of technology has powered an increasing interest in heat transfer solutions and process engineering innovations in the microfluidics domain. In order to carry out such developments, reliable heat transfer diagnostic techniques are necessary. Thermo-liquid crystal (TLC) thermography, in combination with particle image velocimetry, has been a widely accepted and commonly used technique for the simultaneous measurement and characterization of temperature and velocity fields in macroscopic fluid flows for several decades. However, low seeding density, volume illumination, and low TLC particle image quality at high magnifications present unsurpassed challenges to its application to three-dimensional flows with microscopic dimensions. In this work, a measurement technique to evaluate the color response of individual non-encapsulated TLC particles is presented. A Shirasu porous glass membrane emulsification approach was used to produce the non-encapsulated TLC particles with a narrow size distribution. A multi-variable calibration procedure, making use of all three RGB and HSI color components as well as the proper orthogonally decomposed RGB components, achieves unprecedentedly low uncertainty in the temperature estimation of individual particles, opening the door to simultaneous temperature and velocity tracking using 3D velocimetry techniques.
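At its core, the calibration step maps color readings of a particle to temperature. A one-dimensional toy version, fitting a least-squares line from hue to temperature (the calibration points are invented, and the paper's actual procedure is multi-variable over RGB and HSI components):

```python
# Hypothetical calibration data: hue readings vs. reference temperatures.
hues  = [30.0, 45.0, 60.0, 75.0, 90.0]   # hue (degrees)
temps = [25.0, 27.0, 29.0, 31.0, 33.0]   # reference temperature (deg C)

# Ordinary least-squares fit of a line temp = intercept + slope * hue.
n = len(hues)
mean_h = sum(hues) / n
mean_t = sum(temps) / n
slope = (sum((h - mean_h) * (t - mean_t) for h, t in zip(hues, temps))
         / sum((h - mean_h) ** 2 for h in hues))
intercept = mean_t - slope * mean_h

def temp_of(hue):
    """Estimate a particle's temperature from its measured hue."""
    return intercept + slope * hue

print(round(temp_of(52.5), 2))  # 28.0 -- interpolated between points
```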
140.
The intricate structure of the iris constitutes a powerful biometric characteristic, which iris recognition algorithms exploit to extract discriminative biometric templates. Iris recognition is field-proven, but consequential open issues, e.g. privacy protection and recognition in unconstrained environments, raise the need for further investigation. In this paper, different improvements focused on template protection and biometric comparators are presented. Experimental evaluations performed on a public dataset confirm the soundness of the proposed enhancements.
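A common building block of iris comparators, shown here as a generic sketch rather than the paper's specific improvements, is the fractional Hamming distance between binary templates, minimized over small circular shifts to compensate for eye rotation between captures:

```python
def hamming_distance(a, b):
    """Fractional Hamming distance between two equal-length binary codes."""
    assert len(a) == len(b)
    return sum(x != y for x, y in zip(a, b)) / len(a)

def compare(a, b, max_shift=2):
    """Iris-style comparator: minimum distance over circular shifts of b,
    compensating for eye rotation between captures."""
    return min(hamming_distance(a, b[s:] + b[:s])
               for s in range(-max_shift, max_shift + 1))

probe   = [1, 0, 1, 1, 0, 0, 1, 0]
gallery = [0, 1, 0, 1, 1, 0, 0, 1]   # the probe rotated by one position
print(compare(probe, gallery))        # 0.0 -- a shift of one aligns them
```

Template protection schemes and improved comparators of the kind the paper studies operate on top of such distances, e.g. by transforming the templates before comparison.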