Subscription full text: 483 articles
Free: 9 articles
Free (domestic): 1 article
Electrical engineering: 4
Chemical industry: 103
Metalworking: 2
Machinery and instrumentation: 7
Building science: 17
Energy and power: 12
Light industry: 44
Hydraulic engineering: 1
Petroleum and natural gas: 4
Radio and electronics: 45
General industrial technology: 93
Metallurgy: 90
Atomic energy technology: 4
Automation technology: 67
2022: 3
2021: 5
2020: 7
2019: 12
2018: 5
2017: 7
2016: 6
2015: 6
2014: 9
2013: 28
2012: 27
2011: 19
2010: 12
2009: 17
2008: 29
2007: 25
2006: 16
2005: 17
2004: 17
2003: 19
2002: 15
2001: 9
2000: 8
1999: 11
1998: 26
1997: 20
1996: 14
1995: 5
1994: 9
1993: 7
1992: 4
1990: 7
1988: 4
1987: 6
1986: 5
1984: 5
1983: 2
1982: 2
1981: 2
1980: 6
1979: 3
1978: 3
1977: 2
1976: 7
1975: 2
1973: 6
1970: 3
1967: 1
1966: 4
1965: 3
493 search results found (search time: 375 ms).
481.
Acoustic radiation force impulse imaging has been used clinically to study the dynamic response of lesions, relative to their background material, to focused, impulsive acoustic radiation force excitations through the generation of dynamic displacement field images. Dynamic displacement data are typically displayed as a set of parametric images, including displacement immediately after excitation, maximum displacement, time to peak displacement, and recovery time from peak displacement. To date, however, no definitive trends have been established between these parametric images and the tissues' mechanical properties. This work demonstrates that displacement magnitude, time to peak displacement, and recovery time are all inversely related to the Young's modulus in homogeneous elastic media. Experimentally, the pulse repetition frequency during displacement tracking limits stiffness resolution using the time-to-peak-displacement parameter. The excitation pulse duration also affects the time-to-peak parameter, with longer pulses reducing the inertial effects present during impulsive excitations. Material density affects tissue dynamics but is not expected to play a significant role in biological tissues. The presence of an elastic spherical inclusion in the imaged medium significantly alters the tissue dynamics in response to impulsive, focused acoustic radiation force excitations. Times to peak displacement for excitations within and outside an elastic inclusion are still indicative of local material stiffness; however, recovery times are altered by the reflection and transmission of shear waves at the inclusion boundaries. These shear wave interactions cause stiffer inclusions to appear to remain displaced longer than the more compliant background material. The magnitude of shear waves reflected at elastic lesion boundaries depends on the stiffness contrast between the inclusion and the background material, and the stiffness and size of the inclusion dictate when shear wave reflections within the lesion will interfere with one another. Jitter and bias associated with ultrasonic displacement tracking also affect the estimation of a tissue's dynamic response to acoustic radiation force excitation.
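To make the stiffness relationship concrete: in radiation-force elasticity methods, Young's modulus is commonly recovered from the shear wave speed under a linear, isotropic, nearly incompressible tissue model. The sketch below shows that standard conversion only; the function name and example values are illustrative and not taken from the paper.

```python
import math

def youngs_modulus_from_shear_speed(c_t_m_per_s: float,
                                    density_kg_per_m3: float = 1000.0) -> float:
    """Estimate Young's modulus E (Pa) from shear wave speed.

    Assumes a linear, isotropic, nearly incompressible elastic medium
    (Poisson's ratio ~0.5), for which E ~= 3 * mu with mu = rho * c_t^2.
    """
    shear_modulus = density_kg_per_m3 * c_t_m_per_s ** 2  # mu = rho * c^2
    return 3.0 * shear_modulus                            # E ~= 3 * mu

# Example: a shear wave speed of 2 m/s in soft tissue (~1000 kg/m^3)
# implies E ~= 12 kPa; stiffer media carry faster waves and shorter
# times to peak displacement, consistent with the inverse trends above.
print(youngs_modulus_from_shear_speed(2.0))  # 12000.0 Pa
```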
482.
Constructing an ultrasonic imaging system capable of compensating for phase errors in real time is a significant challenge in adaptive imaging. We present a versatile adaptive imaging system capable of updating arrival-time profiles at frame rates of approximately 2 frames per second (fps) with 1-D arrays and up to 0.81 fps with 1.75-D arrays, depending on the desired near-field phase correction algorithm. A novel feature of this system is the ability to update the aberration profile at multiple beam locations for 1-D arrays. The features of this real-time adaptive imaging system are illustrated in tissue-mimicking phantoms with physical near-field phase screens and evaluated in clinical breast tissue with a 1.75-D array. The contrast-to-noise ratio (CNR) of anechoic cysts improved dramatically in the tissue-mimicking phantoms. In breast tissue, the width of point-like targets showed significant improvement: a reduction of 26.2% on average. The brightness of these targets, however, decreased marginally, by 3.9%. For larger structures such as cysts, little improvement in features or CNR was observed, likely because the system assumes an infinite isoplanatic patch size for the 1.75-D arrays. The necessary requirements for constructing a real-time adaptive imaging system are also discussed.
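For reference, below is a minimal sketch of one common CNR definition for an anechoic cyst versus its background; the paper's exact formulation is not given here, and the regions and values are hypothetical (speckle is simulated with a Rayleigh model).

```python
import numpy as np

def cnr(inside: np.ndarray, outside: np.ndarray) -> float:
    """Contrast-to-noise ratio between a cyst region and its background.

    One common definition: |mean_in - mean_out| / sqrt(var_in + var_out),
    computed on envelope-detected image samples.
    """
    return abs(inside.mean() - outside.mean()) / np.sqrt(inside.var() + outside.var())

# Hypothetical regions of interest drawn from a B-mode image:
rng = np.random.default_rng(0)
cyst_roi = rng.rayleigh(scale=0.2, size=1000)    # dim, anechoic interior
background = rng.rayleigh(scale=1.0, size=1000)  # brighter speckle
print(f"CNR = {cnr(cyst_roi, background):.2f}")
```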
483.
The continuous tuning range of an external-cavity diode laser can be extended by making small corrections to the external-cavity length through an electronic feedback loop, so that the cavity resonance condition is maintained as the laser wavelength is tuned. Maintaining this resonance condition eliminates the mode hops that typically limit the continuous tuning range of the external-cavity diode laser. We present the design of a simple external-cavity diode laser based on the Littman-Metcalf external-cavity configuration that has a measured continuous tuning range of 1 GHz without an electronic feedback loop. To implement the electronic feedback loop, a small sinusoidal signal is added to the drive current of the laser diode, creating a small oscillation of the laser power. By comparing the phase of the modulated optical power with the phase of the sinusoidal drive signal using a lock-in amplifier, an error signal is created and used in the feedback loop to control the external-cavity length. With electronic feedback maintaining the cavity resonance condition during tuning, we find that the continuous tuning range can be extended to over 65 GHz. An experimental demonstration of this extended tuning range is presented in which the external-cavity diode laser is tuned through an absorption feature of diatomic oxygen near 760 nm.
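A toy sketch of the dither-and-demodulate idea described above: a small sinusoid on the diode current modulates the optical power, an in-phase lock-in demodulation yields a signed error signal, and an integrator drives the cavity length toward resonance. All frequencies, gains, and the plant model below are invented for illustration and are not the paper's parameters.

```python
import numpy as np

f_mod = 5e3    # dither frequency on the diode current (Hz); assumed value
fs = 1e6       # sample rate (Hz); assumed value
t = np.arange(0, 0.01, 1 / fs)        # 10 ms window = 50 full dither cycles
ref = np.sin(2 * np.pi * f_mod * t)   # lock-in reference, in phase with the dither

def lock_in_error(power: np.ndarray) -> float:
    """In-phase lock-in output: demodulate the photodiode power at f_mod.
    Its sign indicates which side of the cavity resonance the laser sits on,
    so it can serve as the error signal for the cavity-length servo."""
    return 2.0 * float(np.mean(power * ref))  # mix with reference, low-pass (mean)

# Toy closed loop: an integrator nudges the cavity length each iteration.
# Invented plant model: the dither appears on the power with amplitude
# proportional to the detuning, with slope 0.1 per detuning unit.
slope = 0.1
detuning = 0.3   # arbitrary initial cavity detuning
gain = 0.5
for _ in range(20):
    power = 1.0 + slope * detuning * ref
    detuning -= gain * lock_in_error(power) / slope
print(f"residual detuning: {detuning:.3e}")  # approaches 0: resonance maintained
```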
484.
Increasingly, modern-day software systems are being built by combining externally-developed software components with application-specific code. For such systems, existing program-analysis-based software engineering techniques may not directly apply, due to a lack of information about the components. To address this problem, the use of component metadata has been proposed. Component metadata are data and metamethods, provided with components, that retrieve or calculate information about those components. In particular, two component-metadata-based approaches for regression test selection are described: one using code-based component metadata and the other using specification-based component metadata. Results of empirical studies illustrating the potential of these techniques to reduce re-testing effort are presented. Copyright © 2006 John Wiley & Sons, Ltd.
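A minimal sketch of how code-based component metadata might drive regression test selection, assuming the component ships per-test coverage information and a modification-traversing selection criterion; the class and method names are hypothetical, not the paper's API.

```python
from typing import Dict, List, Set

class ComponentMetadata:
    """Hypothetical code-based metadata shipped with a third-party component:
    a map from test-case name to the component elements it covered."""
    def __init__(self, coverage: Dict[str, Set[str]]):
        self._coverage = coverage

    def covered_elements(self, test: str) -> Set[str]:
        return self._coverage.get(test, set())

def select_regression_tests(metadata: ComponentMetadata,
                            tests: List[str], changed: Set[str]) -> List[str]:
    """Re-select a test if it exercised any element changed in the new
    component version (the classic modification-traversing criterion)."""
    return [t for t in tests if metadata.covered_elements(t) & changed]

meta = ComponentMetadata({
    "t1": {"m1", "m2"},
    "t2": {"m3"},
    "t3": {"m2", "m4"},
})
print(select_regression_tests(meta, ["t1", "t2", "t3"], changed={"m2"}))
# ['t1', 't3']
```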
485.
Five studies argue against claims that preschoolers understand a biological germ theory of illness. In Studies 1–3, participants were read stories in which characters develop symptoms (e.g., a bellyache) caused by germs, poisons, or events (e.g., eating too much candy) and were asked whether another character could catch the symptoms from the first. Few children made judgments in terms of germs as part of an underlying causal process linking the origin of a symptom to its subsequent transmission. Some children may have reasoned simply that certain kinds of symptoms are likely to be contagious. Studies 4 and 5 undermined the claim that preschoolers understand germs to be uniquely biological causal agents. Young children did not attribute properties to germs as they did for animate beings or for plants. It is suggested that children undergo conceptual reorganization in constructing a Western adult understanding of germs. (PsycINFO Database Record (c) 2010 APA, all rights reserved)
486.
Regression testing is an expensive testing process used to validate modified software. Regression test selection and test-case prioritization can reduce the costs of regression testing by selecting a subset of test cases for execution, or by scheduling test cases to better meet testing objectives. The cost-effectiveness of these techniques can vary widely, however, and one cause of this variance is the type and magnitude of changes made in producing a new software version. Engineers unaware of the causes and effects of this variance can make poor choices in designing change integration processes, selecting inappropriate regression testing techniques, designing excessively expensive regression test suites, and making unnecessarily costly changes. Engineers aware of causal factors can perform regression testing more cost-effectively. This article reports the results of an embedded multiple case study investigating the modifications made in the evolution of four software systems and their impact on regression testing techniques. The results of this study expose tradeoffs and constraints that affect the success of techniques and provide guidelines for designing and managing regression testing processes. Copyright © 2003 John Wiley & Sons, Ltd.
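As one concrete example of the class of techniques being evaluated, here is a minimal sketch of greedy "additional coverage" test-case prioritization; it is a generic textbook heuristic under an assumed test-to-coverage map, not the specific implementations studied in the article.

```python
from typing import Dict, List, Set

def prioritize_additional_coverage(coverage: Dict[str, Set[str]]) -> List[str]:
    """Order tests greedily by how many not-yet-covered elements each adds
    ('additional' coverage prioritization); reset once coverage saturates."""
    remaining = {t: set(c) for t, c in coverage.items()}
    order, covered = [], set()
    while remaining:
        best = max(remaining, key=lambda t: len(remaining[t] - covered))
        if not (remaining[best] - covered) and covered:
            covered = set()   # all elements covered: reset and re-rank the rest
            continue
        order.append(best)
        covered |= remaining.pop(best)
    return order

cov = {"t1": {"s1", "s2"}, "t2": {"s2"}, "t3": {"s3", "s4", "s5"}}
print(prioritize_additional_coverage(cov))  # ['t3', 't1', 't2']
```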
487.
Code-coverage-based test data adequacy criteria typically treat all coverable code elements (such as statements, basic blocks, or outcomes of decisions) as equal. In practice, however, the probability that a test case can expose a fault in a code element varies: some faults are more easily revealed than others. Thus, several researchers have suggested that if one could estimate the probability that a fault in a code element will cause a failure, one could use this estimate to determine the number of executions of a code element that are required to achieve a certain level of confidence in that element's correctness. This estimate, in turn, could be used to improve the fault-detection effectiveness of test suites and help testers distribute testing resources more effectively. This conjecture is intriguing; however, like many such conjectures, it has never been directly examined empirically. If empirical evidence were to support this conjecture, it would motivate further research into methodologies for obtaining fault-exposure-potential estimates and incorporating them into test data adequacy criteria. This paper reports the results of experiments conducted to investigate the effects of incorporating an estimate of fault-exposure probability into the statement coverage test data adequacy criterion. The results of these experiments, however, ran contrary to the conjectures of previous researchers. Although incorporation of the estimates did produce statistically significant increases in the fault-detection effectiveness of test suites, these increases were quite small, suggesting that the approach might not be able to produce the gains hoped for and might not be worth the cost of its employment. Copyright © 2002 John Wiley & Sons, Ltd.
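The conjecture above rests on a simple independent-trials argument: if a fault in an element would cause a failure with probability p on each execution, then n failure-free executions yield confidence C once (1 - p)^n <= 1 - C, i.e., n >= ln(1 - C) / ln(1 - p). A minimal sketch of that arithmetic (the function name and sample values are illustrative):

```python
import math

def executions_for_confidence(p_expose: float, confidence: float) -> int:
    """Number of failure-free executions of a code element needed so that,
    if a fault with per-execution exposure probability p_expose were present,
    it would have been detected with the given confidence.

    Independent-trials model: (1 - p)^n <= 1 - confidence.
    """
    return math.ceil(math.log(1.0 - confidence) / math.log(1.0 - p_expose))

# A low-exposure element (p = 0.01) needs far more executions than a
# high-exposure one (p = 0.5) to reach 95% confidence:
print(executions_for_confidence(0.01, 0.95))  # 299
print(executions_for_confidence(0.50, 0.95))  # 5
```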
488.
489.
Many software maintenance and testing tasks involve comparing the behaviours of program versions. Program spectra have recently been proposed as a heuristic for use in performing such comparisons. To assess the potential usefulness of spectra in this context, an experiment was conducted examining the relationship between differences in program spectra and the exposure of regression faults (faults existing in a modified version of a program that were not present prior to modification, or not revealed in previous testing), and empirically comparing several types of spectra. The results reveal that certain types of spectra differences correlate strongly, at least in one direction, with the exposure of regression faults: when regression faults are revealed by particular inputs, spectra differences are likely also to be revealed by those inputs, though the reverse is not true. The results also suggest that several types of spectra that appear, analytically, to offer greater precision in predicting the presence of regression faults than other, cheaper spectra may provide no greater precision in practice. These results have ramifications for future research on, and for the practical uses of, program spectra. Copyright © 2000 John Wiley & Sons, Ltd.
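A minimal sketch of the kind of spectra comparison described above, using branch-hit traces; it also illustrates how a count-based spectrum can flag a difference that a cheaper hit-based spectrum misses. The trace format and names are hypothetical.

```python
from collections import Counter
from typing import List

def branch_spectrum(trace: List[str]) -> Counter:
    """A simple branch spectrum: how often each branch executed on one input."""
    return Counter(trace)

def spectra_differ(old_trace: List[str], new_trace: List[str],
                   hit_only: bool = True) -> bool:
    """Compare spectra of two program versions on the same input.

    hit_only=True compares only which branches were hit (a cheap spectrum);
    hit_only=False also compares hit counts (a more precise spectrum).
    """
    old, new = branch_spectrum(old_trace), branch_spectrum(new_trace)
    if hit_only:
        return set(old) != set(new)
    return old != new

# Hypothetical traces from the old and modified versions on one input:
old_run = ["b1", "b2", "b2", "b3"]
new_run = ["b1", "b2", "b3", "b3"]
print(spectra_differ(old_run, new_run))                  # False: same branches hit
print(spectra_differ(old_run, new_run, hit_only=False))  # True: counts differ
```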
490.
Test-suite reduction techniques attempt to reduce the costs of saving and reusing test cases during software maintenance by eliminating redundant test cases from test suites. A potential drawback of these techniques is that reducing the size of a test suite might reduce its ability to reveal faults in the software. Previous studies have suggested that test-suite reduction techniques can reduce test-suite size without significantly reducing the fault-detection capabilities of test suites. These studies, however, involved particular programs and types of test suites, and to begin to generalize their results, further work is needed. This paper reports on the design and execution of additional studies examining the costs and benefits of test-suite reduction, and the factors that influence these costs and benefits. In contrast to previous studies, the results of these studies reveal that the fault-detection capabilities of test suites can be severely compromised by test-suite reduction. Copyright © 2002 John Wiley & Sons, Ltd.
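For illustration, a minimal greedy set-cover sketch of test-suite reduction follows; published heuristics (e.g., Harrold-Gupta-Soffa) are more refined, and this is not the specific technique evaluated in the paper.

```python
from typing import Dict, List, Set

def reduce_test_suite(coverage: Dict[str, Set[str]]) -> List[str]:
    """Greedy set-cover sketch of test-suite reduction: keep a small subset
    of tests that still covers every requirement the full suite covers."""
    remaining = dict(coverage)
    required = set().union(*remaining.values()) if remaining else set()
    kept, covered = [], set()
    while covered != required:
        best = max(remaining, key=lambda t: len(remaining[t] - covered))
        kept.append(best)
        covered |= remaining.pop(best)
    return kept

cov = {"t1": {"r1", "r2"}, "t2": {"r2", "r3"}, "t3": {"r1", "r2", "r3"}}
print(reduce_test_suite(cov))  # ['t3']: two of the three tests eliminated
```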