  Fee-based full text   994 articles
  Free   21 articles
  Free (domestic)   1 article
Electrical engineering   13 articles
Chemical industry   177 articles
Metalworking   13 articles
Machinery and instruments   20 articles
Building science   54 articles
Mining engineering   22 articles
Energy and power   34 articles
Light industry   103 articles
Hydraulic engineering   19 articles
Petroleum and natural gas   2 articles
Radio and electronics   85 articles
General industrial technology   123 articles
Metallurgy   97 articles
Atomic energy technology   2 articles
Automation technology   252 articles
  2023   11 articles
  2022   21 articles
  2021   20 articles
  2020   11 articles
  2019   19 articles
  2018   17 articles
  2017   16 articles
  2016   23 articles
  2015   23 articles
  2014   23 articles
  2013   57 articles
  2012   47 articles
  2011   85 articles
  2010   55 articles
  2009   84 articles
  2008   57 articles
  2007   56 articles
  2006   50 articles
  2005   59 articles
  2004   37 articles
  2003   35 articles
  2002   25 articles
  2001   22 articles
  2000   14 articles
  1999   18 articles
  1998   12 articles
  1997   13 articles
  1996   11 articles
  1995   6 articles
  1994   6 articles
  1993   8 articles
  1992   9 articles
  1991   2 articles
  1990   8 articles
  1989   3 articles
  1988   2 articles
  1987   8 articles
  1986   2 articles
  1985   5 articles
  1984   4 articles
  1983   2 articles
  1980   2 articles
  1939   4 articles
  1938   3 articles
  1932   4 articles
  1930   1 article
  1929   5 articles
  1928   1 article
  1927   2 articles
  1926   3 articles
Sort order: 1016 query results in total; search time 15 ms
1.
Rob, M.A. Software, IEEE, 2003, 20(6): 94-95
The author relates his experiences in a small software development company. These experiences show that project management is challenging and that, if the project manager is an obstacle, the project is bound to fail. When it does, the failure can spell disaster for the manager and for anyone who accompanies him or her on the software development journey.
2.
In this paper we describe a framework for analysing the creation and justification of research and development (R&D) value. The 4S framework is developed for analysing the scope, scale, skills, and social-network aspects of R&D value. The framework is based on social system theory, a process contingency model, and recent R&D metrics. We present a first empirical assessment, based on a workshop, of using the 4S framework for leveraging R&D. Results that assist in assessing value creation through R&D within networks are highly relevant in high-tech industries. The multi-dimensional process approach of this framework seems promising for understanding and managing R&D value creation, but it needs further operationalisation. Case studies are described, and a Dutch network on leveraging R&D has been initiated.
3.
4.
The Normalized Difference Vegetation Index (NDVI), a measure of vegetation vigour, and lake water levels respond variably to precipitation and to its deficiency. For a given lake catchment, NDVI may be able to depict localized natural variability in water levels in response to weather patterns, and this information may be used to distinguish natural from unnatural variations of a given lake's surface. This study evaluates the potential of using NDVI and its associated derivatives (VCI (vegetation condition index), SVI (standardized vegetation index), AINDVI (annually integrated NDVI), green vegetation function (Fg), and NDVIA (NDVI anomaly)) to depict Lake Victoria's water levels. Thirty years of monthly mean water levels and a portion of the Global Inventory Modelling and Mapping Studies (GIMMS) AVHRR (Advanced Very High Resolution Radiometer) NDVI datasets were used. Their aggregate data structures and temporal co-variabilities were analysed using GIS/spatial-analysis tools. Locally, NDVI was found to be more sensitive to drought (i.e., it responded more strongly to reduced precipitation) than to water levels. It showed a good ability to depict water levels one month in advance, especially in moderate- to low-precipitation years. SVI and SWL (standardized water levels), used in association with AINDVI and AMWLA (annual mean water-level anomaly), readily identified high-precipitation years, which are also when NDVI has a low ability to depict water levels. NDVI also appears able to highlight unnatural variations in water levels. We propose an iterative approach for the better use of NDVI, which may be useful in developing an early-warning mechanism for the management of Lake Victoria and other lakes with similar characteristics.
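For readers unfamiliar with the derivative indices listed above, the following sketch shows how three of them are typically computed from red and near-infrared reflectance. The formulas are the standard definitions of NDVI, SVI, and AINDVI; the variable names and the (n_years, 12) monthly layout are illustrative assumptions, not the study's actual data layout.

```python
import numpy as np

def ndvi(nir, red):
    """NDVI = (NIR - Red) / (NIR + Red); values lie in [-1, 1]."""
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red)

def svi(monthly_ndvi):
    """Standardized vegetation index: z-score of NDVI for each calendar
    month across years. `monthly_ndvi` has shape (n_years, 12)."""
    mean = monthly_ndvi.mean(axis=0)
    std = monthly_ndvi.std(axis=0, ddof=1)
    return (monthly_ndvi - mean) / std

def aindvi(monthly_ndvi):
    """Annually integrated NDVI: sum of the monthly values within a year."""
    return monthly_ndvi.sum(axis=1)
```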
5.
The importance of software product evaluations will grow with awareness of the need for better software quality. The process used to conduct such evaluations is crucial if the results are to be applicable and to meet customers' expectations. This paper reviews a well-known evaluation process, the ISO 14598 standard, focusing on the difficulties of selecting and applying appropriate evaluation techniques. The review shows that the standard is difficult to apply in practice because it pays insufficient attention to goal definition and leaves the relationships between activities implicit. The standard also ignores the trade-off between goals and resources and pays insufficient attention to feedback. To address these deficiencies, the W-process is proposed; it extends the standard through an improved process structure and additional guidelines.
6.
In recent years, embedded memories have been the fastest-growing segment of systems-on-chip, so they have a major impact on the overall Defects per Million (DPM). Further, shrinking technologies and processes introduce new defects that cause previously unknown faults; such faults have to be understood and modeled in order to design appropriate test techniques that can reduce the DPM level. This paper discusses a new memory fault class, dynamic faults, based on industrial test results; it defines the concept of dynamic faults in terms of the fault-primitive concept. It further shows the importance of dynamic faults for new memory technologies and introduces a systematic way of modeling them. It concludes that current and future SRAM products need to consider testability for dynamic faults or leave substantial DPM on the table, and it sets a direction for further research.
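To make the notion of a dynamic fault concrete (a fault sensitized only by a sequence of operations rather than by a single access), here is a toy model of a hypothetical read-destructive fault, not one of the paper's industrial cases. The sensitizing sequence is two consecutive reads of a cell holding 1: a march element with a single read per cell misses the fault, while back-to-back reads expose it.

```python
# A toy single-bit memory cell with a hypothetical dynamic fault:
# the cell is corrupted only by two reads with no intervening write.
class FaultyCell:
    def __init__(self):
        self.value = 0
        self.reads_since_write = 0

    def write(self, v):
        self.value = v
        self.reads_since_write = 0

    def read(self):
        self.reads_since_write += 1
        out = self.value
        if self.reads_since_write == 2:
            self.value ^= 1  # second consecutive read corrupts the cell
        return out

cell = FaultyCell()
cell.write(1)
assert cell.read() == 1  # a single-read march element sees no error
assert cell.read() == 1  # the second read still returns the old value...
assert cell.read() == 0  # ...but a third access exposes the corruption
```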
7.
The detailed tuning behavior within a single ro-vibrational line of a TEA CO2 laser was studied using a wedged etalon and a pyroelectric array. Fine tuning of the laser was performed with a temperature-controlled etalon inside a three-mirror resonator. As the laser wavelength was scanned at a constant rate, single-longitudinal-mode pulses were observed ~82% of the time. Two-mode operation occurred over a narrow tuning range near the midpoint between successive cavity modes. The detailed structure of the staircase-like mode-to-mode tuning curve and the corresponding output-intensity variations are explained using a simple theoretical model.
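The mode-to-mode steps in such a staircase tuning curve are set by the resonator's longitudinal mode spacing, c/2L, while the intracavity etalon's free spectral range, c/2nd, sets the coarser periodicity. A back-of-the-envelope check with assumed parameters (the paper's actual cavity length and etalon are not given here):

```python
c = 2.998e8    # speed of light, m/s

L_cavity = 1.5                       # assumed resonator length, m
mode_spacing = c / (2 * L_cavity)    # spacing of longitudinal modes, ~100 MHz

d_etalon = 0.01                      # assumed etalon thickness, m
n_etalon = 2.4                       # assumed index (e.g., ZnSe near 10.6 um)
fsr = c / (2 * n_etalon * d_etalon)  # etalon free spectral range, ~6.2 GHz

print(f"cavity mode spacing: {mode_spacing / 1e6:.0f} MHz")
print(f"etalon FSR: {fsr / 1e9:.1f} GHz")
```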
8.
The electronics industry's shift towards lead-free solders in component manufacturing also brought the challenge of tin whiskers. Manufacturers of high-reliability and safety-critical equipment in sectors such as defence and aerospace rely increasingly on commercial-off-the-shelf (COTS) electronic components for their products and systems. Using COTS components with lead-free solder-plated terminations carries long-term reliability risks associated with tin-whisker-growth-related failures. For leaded component types such as the Quad Flat Package (QFP) and Small Outline Package (SOP), one promising solution is to “re-finish” the package terminations by replacing the lead-free solder coatings on the leads with conventional tin–lead solder. This involves subjecting the components to a post-manufacturing process known as Hot Solder Dip (HSD). One of the main concerns about adopting HSD (refinishing) as a strategy against the tin-whisker problem is the potential risk of thermally induced damage to the components subjected to this process.
9.
Comparing the aperture delay of ADCs   (Cited: 1; self-citations: 0; citations by others: 1)
In applications such as communications design and data acquisition, comparing the aperture delay among multiple analog-to-digital converters (ADCs) is very important, and the delay must therefore be measured.
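A common way to compare aperture delay between two ADCs, consistent with the measurement this abstract calls for, is to drive both converters with the same sine wave and translate the phase difference measured at the tone frequency into time. The sketch below simulates the idea; the sample rate, tone frequency, and the 250 ps skew are illustrative assumptions.

```python
import numpy as np

fs = 100e6           # assumed sample rate, Hz
f_in = 1e6           # assumed input tone, Hz (coherent: 40 cycles in 4000 samples)
n = 4000
t = np.arange(n) / fs
true_skew = 250e-12  # simulated 250 ps aperture-delay difference

adc_a = np.sin(2 * np.pi * f_in * t)                # reference ADC
adc_b = np.sin(2 * np.pi * f_in * (t + true_skew))  # ADC under comparison

def tone_phase(x, f, fs):
    """Phase of the component at frequency f, via a single-bin DFT."""
    k = np.arange(len(x))
    return np.angle(np.sum(x * np.exp(-2j * np.pi * f * k / fs)))

dphi = tone_phase(adc_b, f_in, fs) - tone_phase(adc_a, f_in, fs)
skew = dphi / (2 * np.pi * f_in)  # phase difference -> time
print(f"estimated aperture-delay difference: {skew * 1e12:.1f} ps")  # ~250.0 ps
```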
10.
In vivo measurements of the equivalent resistivities of the skull (ρ_skull) and brain (ρ_brain) are performed for six subjects using an electric impedance tomography (EIT)-based method and realistic head models. The classical boundary element method (BEM) formulation for EIT is very time-consuming; however, applying the Sherman-Morrison formula reduces the computation time by a factor of 5. Using an optimal point distribution in the BEM model to optimize its accuracy, thereby decreasing systematic errors of numerical origin, is important because the cost functions are shallow. Results demonstrate that ρ_skull/ρ_brain is more likely to lie between 20 and 50 than to equal the commonly accepted value of 80. The variation in ρ_brain (average = 301 Ω·cm, SD = 13%) and ρ_skull (average = 12230 Ω·cm, SD = 18%) is halved compared with the results using the sphere model, showing that correcting for geometry errors is essential to obtain realistic estimates. However, a factor of 2.4 may still exist between values of ρ_skull/ρ_brain for different subjects. Earlier results show the necessity of calibrating ρ_brain and ρ_skull by measuring them in vivo for each subject, in order to decrease errors associated with the electroencephalogram inverse problem. We show that the proposed method is suited to this goal.
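The factor-of-5 speedup attributed to the Sherman-Morrison formula comes from reusing a single factorization of the system matrix across rank-1 modifications: each modified system then costs only extra triangular solves, not a fresh factorization. A minimal sketch of the identity, using a random well-conditioned placeholder matrix rather than an actual BEM system:

```python
import numpy as np
from scipy.linalg import lu_factor, lu_solve

rng = np.random.default_rng(0)
n = 500
A = rng.standard_normal((n, n)) + n * np.eye(n)  # well-conditioned placeholder
u = rng.standard_normal(n)
v = rng.standard_normal(n)
b = rng.standard_normal(n)

factor = lu_factor(A)          # factor A once; reuse it for every solve below
A_inv_b = lu_solve(factor, b)
A_inv_u = lu_solve(factor, u)

# Sherman-Morrison:
# (A + u v^T)^-1 b = A^-1 b - (A^-1 u) (v^T A^-1 b) / (1 + v^T A^-1 u)
x = A_inv_b - A_inv_u * (v @ A_inv_b) / (1.0 + v @ A_inv_u)

assert np.allclose((A + np.outer(u, v)) @ x, b)  # matches the direct solution
```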