Subscription full text: 1,877 articles
Free: 52 articles
Free (domestic): 2 articles
Electrical engineering: 23 articles
General: 5 articles
Chemical industry: 312 articles
Metalworking: 39 articles
Machinery and instruments: 30 articles
Building science: 186 articles
Mining engineering: 12 articles
Energy and power: 44 articles
Light industry: 153 articles
Hydraulic engineering: 16 articles
Petroleum and natural gas: 7 articles
Weapons industry: 1 article
Radio and electronics: 227 articles
General industrial technology: 280 articles
Metallurgical industry: 167 articles
Atomic energy technology: 16 articles
Automation technology: 413 articles
2023: 7 articles
2022: 15 articles
2021: 34 articles
2020: 26 articles
2019: 32 articles
2018: 30 articles
2017: 48 articles
2016: 45 articles
2015: 25 articles
2014: 72 articles
2013: 160 articles
2012: 93 articles
2011: 107 articles
2010: 98 articles
2009: 110 articles
2008: 117 articles
2007: 115 articles
2006: 123 articles
2005: 101 articles
2004: 95 articles
2003: 77 articles
2002: 56 articles
2001: 39 articles
2000: 33 articles
1999: 37 articles
1998: 32 articles
1997: 33 articles
1996: 33 articles
1995: 16 articles
1994: 21 articles
1993: 12 articles
1992: 10 articles
1991: 9 articles
1990: 8 articles
1989: 4 articles
1988: 2 articles
1987: 9 articles
1985: 2 articles
1984: 6 articles
1983: 7 articles
1982: 3 articles
1981: 6 articles
1980: 8 articles
1979: 1 article
1978: 4 articles
1977: 4 articles
1974: 1 article
1971: 2 articles
1970: 1 article
1969: 2 articles
A total of 1,931 query results; search time 250 ms.
41.
Automating software testing activities can increase the quality and drastically decrease the cost of software development. To this end, various automated test data generation tools have been developed. The majority of existing tools aim at structural testing, while a rather limited number aim at a higher level of testing thoroughness such as mutation. In this paper, an approach to automating the generation of mutation-based test cases by utilizing existing automated tools is proposed. This is achieved by reducing the problem of killing mutants to one of covering branches. To this end, the paper draws on state-of-the-art techniques and tools suitable for covering program branches when performing mutation. Techniques and tools such as symbolic execution, concolic execution, and evolutionary testing can easily be adapted to automate test input generation for the weak mutation testing criterion, simply by utilizing a special form of the mutant schemata technique. The propositions made in this paper integrate three automated tools in order to illustrate and examine the method's feasibility and effectiveness. The obtained results, based on a set of Java program units, indicate the applicability and effectiveness of the suggested technique. They suggest that the proposed approach is able to guide existing automated tools in producing test cases according to the weak mutation testing criterion. Additionally, experimental results with the proposed mutation testing regime show that weak mutation speeds up mutant execution time by a factor of at least 4.79 compared with strong mutation.
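As an illustration of the reduction described above, the hedged Java sketch below (not the authors' actual tool chain; the class, method and mutant are invented) shows how weakly killing a single relational-operator mutant can be recast as covering a branch that is reachable only when the original and mutated expressions evaluate differently.

```java
/**
 * Hedged sketch: weak mutation killing recast as branch coverage.
 * The unit under test, the mutant and all names are illustrative only.
 */
public class MutantSchemaExample {

    // Original unit under test: returns the larger of a and b.
    static int max(int a, int b) {
        return (a > b) ? a : b;
    }

    // Schema-style instrumented version: the relational-operator mutant
    // (">" replaced by ">=") is embedded next to the original expression.
    // The branch below is taken exactly when the mutant is weakly killed,
    // i.e. when the original and mutated expressions differ at this point.
    static int maxInstrumented(int a, int b) {
        boolean original = (a > b);
        boolean mutated  = (a >= b);   // mutant M1: > replaced by >=
        if (original != mutated) {
            // Reaching this branch weakly kills mutant M1; a branch-coverage
            // tool (symbolic, concolic or search-based) only has to generate
            // an input that executes this statement, e.g. a == b.
            System.out.println("Mutant M1 weakly killed");
        }
        return original ? a : b;
    }

    public static void main(String[] args) {
        maxInstrumented(3, 3);   // covers the killing branch (a == b)
        maxInstrumented(5, 2);   // does not distinguish the mutant
    }
}
```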
42.
Past research has extensively investigated the effect of media, focusing in particular on how anonymity increases the risk-related behaviors of groups using computer-mediated communication (CMC). This study extends prior research by examining the differences in group risk-taking behaviors between face-to-face groups and completely non-anonymous CMC groups (i.e., groups working in a fully identified, synchronous CMC environment similar to the popular instant messaging systems widely used within organizations). Drawing on the "decision analysis" perspective, a key framework for understanding organizational decision-making, the study also examines the effects of the firm's risk preferences and of the type of information distribution among group members (i.e., full information known to all group members versus partial information known by only some of the members) on the groups' risk-taking behaviors. A laboratory experiment using student subjects found no differences in risk-taking behaviors between CMC and face-to-face groups; likewise, no differences were found related to how information was distributed among group members. A significant effect was found, however, for the risk preference of the firm: risk-neutral firms led groups to make riskier decisions than groups from risk-averse firms. Finally, groups within risk-neutral firms receiving partial information made riskier decisions than groups receiving full information. The implications of these results for future research and practice are examined.
43.
Lidars have the unique ability to make direct, physical measurements of forest height and vertical structure in much denser canopies than is possible with passive optical sensors or short-wavelength radars. However, the literature reports a consistent underestimate of tree height when using physically based methods, necessitating empirical corrections. This bias is a result of overestimating the range to the canopy top due to background noise and of failing to correctly identify the ground. This paper introduces a method, referred to as "noise tracking", to avoid biases when determining the range to the canopy top. Simulated waveforms, created with Monte Carlo ray tracing over geometrically explicit forest models, are used to test noise tracking against simple thresholding over a range of forest and system characteristics. Noise tracking almost completely removed the bias in all situations except for very high noise levels and very low (< 10%) canopy covers. In all cases noise tracking gave lower errors than simple thresholding and was less sensitive to the initial noise threshold. Finite laser pulses spread out the measured signal, potentially overriding the benefit of noise tracking. In the past, the laser pulse length has been corrected for by adding half that length to the signal start range. This investigation suggests that this is not always appropriate for simple thresholding, and that the results for noise tracking were more directly related to pulse length than those for simple thresholding. That this effect has not been commented on before may be due to the possible confounding impacts of instrument and survey characteristics inherent in field data. This method should help improve the accuracy of waveform lidar measurements of forests, whether using airborne or spaceborne instruments.
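The abstract does not spell out the noise-tracking algorithm, so the Java sketch below is only one plausible reading: simple thresholding takes the first waveform bin above a detection threshold set high enough to avoid noise, which misses the weak leading edge of the canopy return, while the assumed noise-tracking variant finds a bin safely above that threshold and then tracks back toward the sensor while the signal stays above the mean noise level. All names, parameters and data are illustrative.

```java
import java.util.OptionalInt;

/**
 * Hedged sketch of signal-start detection in a waveform lidar return:
 * simple thresholding versus an assumed "noise tracking" variant.
 */
public class WaveformSignalStart {

    /** First bin whose intensity exceeds the detection threshold. */
    static OptionalInt simpleThreshold(double[] waveform, double threshold) {
        for (int i = 0; i < waveform.length; i++) {
            if (waveform[i] > threshold) {
                return OptionalInt.of(i);
            }
        }
        return OptionalInt.empty();
    }

    /**
     * Find a bin above the detection threshold, then walk back toward earlier
     * bins (nearer range) while the signal remains above the mean noise level.
     */
    static OptionalInt noiseTracking(double[] waveform, double threshold, double meanNoise) {
        OptionalInt detected = simpleThreshold(waveform, threshold);
        if (detected.isEmpty()) {
            return OptionalInt.empty();
        }
        int start = detected.getAsInt();
        while (start > 0 && waveform[start - 1] > meanNoise) {
            start--;   // keep tracking the signal while it sits above the noise floor
        }
        return OptionalInt.of(start);
    }

    public static void main(String[] args) {
        // Synthetic waveform: mean noise ~0.1, weak canopy leading edge from bin 3,
        // strong return from bin 6. A detection threshold of 1.0 avoids noise spikes.
        double[] w = {0.08, 0.10, 0.09, 0.35, 0.55, 0.80, 1.20, 1.50, 1.10, 0.40};
        System.out.println("simple threshold start: " + simpleThreshold(w, 1.0));      // bin 6 (late, biased)
        System.out.println("noise tracking start:   " + noiseTracking(w, 1.0, 0.15));  // bin 3
    }
}
```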
44.
X-machines were proposed by Holcombe as a possible specification language, and since then a number of further investigations have demonstrated that the model is intuitive and easy to use, as well as general enough to cater for a wide range of applications. In particular, (generalised) stream X-machines have been found to be extremely useful as a specification method, and most of the theory developed so far has concentrated on this particular class of X-machines. Furthermore, a method exists for testing systems specified by stream X-machines, and it has been proved to detect all faults in the implementation provided that the system meets certain initial requirements. However, this method can only be used to generate test sequences from deterministic X-machine specifications. In this paper we present the theoretical basis for a method for generating test sets from non-deterministic generalised stream X-machines. Received November 1999 / Accepted in revised form September 2000
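For readers unfamiliar with the formalism, the Java sketch below gives a minimal, assumed encoding of a deterministic stream X-machine: a finite set of states whose transitions are labelled with processing functions that read an input symbol, update an internal memory and emit an output symbol. The class, interface and toy machine are invented for illustration and are not the authors' notation.

```java
import java.util.*;

/**
 * Minimal sketch of a (deterministic) stream X-machine: states, a memory,
 * and transitions labelled with processing functions (memory, input) -> (output, memory').
 */
public class StreamXMachine<I, O, M> {

    /** A processing function; returns null when it is not applicable to the input. */
    public interface ProcessingFunction<I, O, M> {
        Map.Entry<O, M> apply(M memory, I input);
    }

    private record Transition<I, O, M>(String from, String to, ProcessingFunction<I, O, M> phi) {}

    private final List<Transition<I, O, M>> transitions = new ArrayList<>();
    private String state;
    private M memory;

    public StreamXMachine(String initialState, M initialMemory) {
        this.state = initialState;
        this.memory = initialMemory;
    }

    public void addTransition(String from, String to, ProcessingFunction<I, O, M> phi) {
        transitions.add(new Transition<>(from, to, phi));
    }

    /** Consume an input stream, producing an output stream (deterministic machine assumed). */
    public List<O> run(List<I> inputs) {
        List<O> outputs = new ArrayList<>();
        for (I in : inputs) {
            for (Transition<I, O, M> t : transitions) {
                if (!t.from().equals(state)) continue;
                Map.Entry<O, M> result = t.phi().apply(memory, in);
                if (result != null) {              // function applicable: fire the transition
                    outputs.add(result.getKey());
                    memory = result.getValue();
                    state = t.to();
                    break;
                }
            }
        }
        return outputs;
    }

    public static void main(String[] args) {
        // Toy machine: counts "inc" inputs in its memory and echoes the running total.
        StreamXMachine<String, Integer, Integer> m = new StreamXMachine<>("idle", 0);
        m.addTransition("idle", "counting",
                (mem, in) -> in.equals("start") ? Map.entry(mem, mem) : null);
        m.addTransition("counting", "counting",
                (mem, in) -> in.equals("inc") ? Map.entry(mem + 1, mem + 1) : null);
        System.out.println(m.run(List.of("start", "inc", "inc")));  // [0, 1, 2]
    }
}
```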
45.
With the rapid development of computer networks, congestion has become a critical issue. Congestion usually occurs when connections' demands on network resources, i.e. buffer space, exceed what is available. In this paper we propose a new discrete-time queueing network analytical model, based on the dynamic random early drop (DRED) algorithm, to control congestion at an early stage. We apply the analytical model to a queueing network with two queue nodes. Furthermore, we compare the proposed analytical model with three well-known active queue management (AQM) algorithms, namely DRED, random early detection (RED) and adaptive RED, in order to determine which of them offers better quality of service (QoS). We also experimentally compare the queue nodes of the proposed analytical model and the three AQM methods in terms of several performance measures, including average queue length, average queueing delay, throughput and packet loss probability, aiming to determine the queue node that offers better performance.
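For context, the Java sketch below illustrates the feedback style of drop-probability update used by DRED-like controllers: the drop probability is nudged in proportion to the (filtered) error between the current queue length and a target level. It is not the paper's analytical model, and the gains, target and buffer size are assumptions chosen only for illustration.

```java
/**
 * Hedged sketch of a DRED-style drop-probability controller.
 * All parameter values are illustrative assumptions.
 */
public class DredController {

    private final int bufferSize;      // B: total buffer capacity (packets)
    private final double target;       // T: target queue length (packets)
    private final double alpha;        // control gain
    private final double beta;         // filter weight for the error signal
    private double filteredError = 0;  // low-pass filtered error
    private double dropProb = 0;       // current drop probability p(n)

    DredController(int bufferSize, double target, double alpha, double beta) {
        this.bufferSize = bufferSize;
        this.target = target;
        this.alpha = alpha;
        this.beta = beta;
    }

    /** Called once per sampling interval with the current queue length. */
    double update(int queueLength) {
        double error = queueLength - target;                       // e(n) = q(n) - T
        filteredError = (1 - beta) * filteredError + beta * error; // smooth the error
        dropProb += alpha * filteredError / bufferSize;            // proportional adjustment
        dropProb = Math.min(1.0, Math.max(0.0, dropProb));         // clamp to [0, 1]
        return dropProb;
    }

    public static void main(String[] args) {
        DredController dred = new DredController(100, 40, 0.05, 0.002);
        int[] queueSamples = {10, 30, 60, 80, 70, 50, 40, 35};
        for (int q : queueSamples) {
            System.out.printf("q = %3d  ->  drop probability = %.4f%n", q, dred.update(q));
        }
    }
}
```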
46.
47.
My early research was inspired by the mathematical semantics of Scott and Strachey. Two such topics, recounted in this paper, were the fixed-point analysis of pointer loops and the expressibility of a style of functional programming introduced by Barron and Strachey.
48.
We consider two algorithms for the barrier synchronization of N processes: the Dissemination algorithm(2) and Brooks' algorithm.(1,2) Both algorithms comprise a number of binary communications amongst the processes, organized into a sequence of stages. It is shown that Brooks' algorithm(1) requires between LogN (log2N) and 2 LogN stages, the lower bound being guaranteed only in the case that N is a power of 2 (cubic) and the upper bound seemingly needed for most other N. On the other hand, it is shown(2) that the Dissemination algorithm requires only LogN stages regardless of N, making it apparently superior to the Brooks algorithm. We introduce a network model of local barrier algorithms. Using it we obtain a rigorous correctness proof for local barrier algorithms, and show that the number of stages in the Brooks algorithm is bounded above by LogN + 1. The Brooks algorithm is therefore essentially equivalent in time complexity to the Dissemination algorithm. We then address the question of which values of N admit exactly LogN Brooks stages. We find a sufficient condition, and conjecture that it is also necessary.
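For concreteness, the following Java sketch shows a one-shot dissemination barrier of the kind analysed above: in stage k, thread i notifies thread (i + 2^k) mod N and waits for thread (i - 2^k + N) mod N, so ceil(log2 N) stages suffice for any N. It is an illustrative sketch, not code from the paper; a reusable, sense-reversing variant would need extra machinery omitted here.

```java
import java.util.concurrent.atomic.AtomicBoolean;

/** Hedged sketch of a one-shot dissemination barrier for N threads. */
public class DisseminationBarrier {

    private final int n;
    private final int stages;
    private final AtomicBoolean[][] flags;   // flags[stage][receiver]

    DisseminationBarrier(int n) {
        this.n = n;
        this.stages = (int) Math.ceil(Math.log(n) / Math.log(2));
        this.flags = new AtomicBoolean[stages][n];
        for (int s = 0; s < stages; s++)
            for (int i = 0; i < n; i++)
                flags[s][i] = new AtomicBoolean(false);
    }

    /** Called by thread id in 0..n-1; returns when all n threads have arrived. */
    void await(int id) {
        for (int s = 0; s < stages; s++) {
            int partner = (id + (1 << s)) % n;
            flags[s][partner].set(true);      // notify this stage's partner
            while (!flags[s][id].get()) {     // wait to be notified in this stage
                Thread.onSpinWait();
            }
        }
    }

    public static void main(String[] args) {
        int n = 6;                            // works for any n, not only powers of 2
        DisseminationBarrier barrier = new DisseminationBarrier(n);
        for (int i = 0; i < n; i++) {
            final int id = i;
            new Thread(() -> {
                System.out.println("thread " + id + " arrived");
                barrier.await(id);
                System.out.println("thread " + id + " released");
            }).start();
        }
    }
}
```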
49.
A silicon-carbide-based enhancement-type metal-insulator field-effect transistor with porous gate metallization has been investigated as a total NOx sensor operated in a temperature-cycling mode. This operating mode is quite new for gas sensors based on the field effect, but promising results have been reported earlier. Based on static investigations, we have developed a suitable temperature cycle (T-cycle) optimized for NOx detection and quantification in a mixture of typical exhaust gases (CO, C2H4 and NH3). Significant features describing the shape of the sensor response have been extracted and evaluated with multivariate statistics (e.g. linear discriminant analysis), allowing quantification of NOx. Additional cleaning cycles every 30 min further improve the stability of the sensor. With this kind of advanced signal processing, the influence of sensor drift and cross-sensitivity to ambient gases can be reduced effectively. Measurements have proven that different concentrations of NOx can be detected even in a changing mixture of other typical exhaust gases under dry and humid conditions. In addition, unknown concentrations of NOx can be detected based on a small set of training data. It can be concluded that the performance of GasFETs for NOx determination can be enhanced considerably with temperature cycling and appropriate signal processing.
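The abstract does not list the extracted shape features, so the Java sketch below shows one commonly assumed choice: splitting each temperature cycle's sensor response into equal segments and using each segment's mean and slope as the feature vector passed on to multivariate analysis such as LDA. All names and the synthetic data are illustrative.

```java
import java.util.ArrayList;
import java.util.List;

/**
 * Hedged sketch of shape-feature extraction from one temperature-cycle response:
 * per-segment mean and least-squares slope (an assumed, not the paper's, feature set).
 */
public class TCycleFeatures {

    /** Mean and linear slope of each of `segments` equal slices of one cycle's response. */
    static double[] extractFeatures(double[] cycleResponse, int segments) {
        List<Double> features = new ArrayList<>();
        int len = cycleResponse.length / segments;   // assumes segments divides the length, len >= 2
        for (int s = 0; s < segments; s++) {
            int start = s * len;
            double mean = 0;
            for (int i = 0; i < len; i++) mean += cycleResponse[start + i];
            mean /= len;
            // slope from a least-squares fit of response vs. sample index
            double sxy = 0, sxx = 0;
            for (int i = 0; i < len; i++) {
                double dx = i - (len - 1) / 2.0;
                sxy += dx * (cycleResponse[start + i] - mean);
                sxx += dx * dx;
            }
            features.add(mean);
            features.add(sxy / sxx);
        }
        return features.stream().mapToDouble(Double::doubleValue).toArray();
    }

    public static void main(String[] args) {
        // One synthetic cycle response (e.g. gate-voltage shift sampled over the cycle).
        double[] response = {0.10, 0.12, 0.18, 0.30, 0.45, 0.52, 0.50, 0.41, 0.28, 0.15, 0.12, 0.11};
        double[] f = extractFeatures(response, 3);   // 3 segments -> 6 features (mean, slope each)
        for (double v : f) System.out.printf("%.4f%n", v);
    }
}
```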
50.
Grain-oriented Aurivillius-phase BaBi2Nb2O9 ceramics were fabricated using Spark Plasma Sintering (SPS). Their relaxor behaviour was confirmed by a strong frequency dispersion of the dielectric response. The dielectric behaviour has been fitted using different relaxor models. The relaxor parameters are isotropic, while the dielectric constants are highly anisotropic. The piezoelectric constant d33 is zero both perpendicular and parallel to the hot-pressing direction, and the P-E response is dominated by losses. The inability to pole the samples at room temperature is consistent with the freezing temperature Tf (∼115 K) estimated from fitting the experimental data to the Vogel–Fulcher model. This suggests that it may be possible to observe piezoelectric and ferroelectric properties at very low temperatures.
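For reference, the standard Vogel–Fulcher form used to extract a freezing temperature Tf from the frequency dependence of the dielectric-maximum temperature is sketched below; the symbols f0, Ea and kB (attempt frequency, activation energy, Boltzmann constant) are the usual ones and are assumed here rather than quoted from the paper.

```latex
% Vogel--Fulcher relation: measurement frequency f versus the temperature T_m
% of the dielectric maximum; fitting (f, T_m) pairs from several frequencies
% yields the freezing temperature T_f.
\[
  f = f_0 \exp\!\left[ -\frac{E_a}{k_B \,(T_m - T_f)} \right]
\]
```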