Full text (fee-based): 5205 articles
Free: 317 articles
Free (domestic): 34 articles
Electrical engineering: 51 articles
General: 19 articles
Chemical industry: 1197 articles
Metalworking: 89 articles
Machinery & instrumentation: 86 articles
Building science: 334 articles
Mining engineering: 9 articles
Energy & power: 193 articles
Light industry: 905 articles
Water conservancy engineering: 41 articles
Petroleum & natural gas: 16 articles
Radio electronics: 369 articles
General industrial technology: 972 articles
Metallurgical industry: 228 articles
Atomic energy technology: 28 articles
Automation technology: 1019 articles
2023: 54 articles
2022: 57 articles
2021: 112 articles
2020: 82 articles
2019: 84 articles
2018: 204 articles
2017: 204 articles
2016: 231 articles
2015: 191 articles
2014: 226 articles
2013: 373 articles
2012: 331 articles
2011: 338 articles
2010: 293 articles
2009: 245 articles
2008: 290 articles
2007: 280 articles
2006: 270 articles
2005: 188 articles
2004: 163 articles
2003: 173 articles
2002: 146 articles
2001: 100 articles
2000: 83 articles
1999: 84 articles
1998: 78 articles
1997: 72 articles
1996: 54 articles
1995: 48 articles
1994: 43 articles
1993: 32 articles
1992: 22 articles
1991: 21 articles
1990: 22 articles
1989: 16 articles
1987: 12 articles
1986: 21 articles
1985: 24 articles
1984: 37 articles
1983: 27 articles
1982: 19 articles
1981: 28 articles
1980: 32 articles
1979: 17 articles
1978: 13 articles
1977: 17 articles
1976: 17 articles
1975: 11 articles
1974: 14 articles
1973: 15 articles
Sort order: a total of 5556 results; search took 17 ms.
101.
This paper presents a workplace study of triage work practices within an emergency department (ED). We examine the practices, procedures, and organization through which ED staff use tools and technologies when coordinating the essential activity of assessing and sorting patients arriving at the ED. The paper provides in-depth empirical observations describing the situated work practices of triage and the complex collaborative nature of the triage process. We identify and conceptualize triage work practices as comprising patient trajectories, triage nurse activities, coordinative artefacts, and exception handling; we also articulate how these four features of triage practice constitute and connect workflows and organize and re-organize time and space during the triage process. Finally, we conceptualize these connections as an assessing-and-sorting mechanism in collaborative work. We argue that the complexities involved in this mechanism are a necessary asset of triage work, which calls for a reassessment of the concept of triage drift.
102.
A large number of network services rely on IP and reliable transport protocols. For applications that provide abundant data for transmission, loss is usually handled satisfactorily, even if the application is latency-sensitive (Wang et al. 2004). For data streams where small packets are sent intermittently, however, applications can occasionally experience extreme latencies (Griwodz and Halvorsen 2006). Since such thin-stream applications are often time-dependent, any unnecessarily induced delay can have severe consequences for the service provided. Massively Multiplayer Online Games (MMOGs) are a defining example of thin streams. Many MMOGs (like World of Warcraft and Age of Conan) use TCP for the benefits of reliability, in-order delivery, and NAT/firewall traversal. It has been shown that TCP has several shortcomings with respect to the latency requirements of thin streams because of the way it handles retransmissions (Griwodz and Halvorsen 2006). An alternative to TCP is SCTP (Stewart et al. 2000), which was originally developed to meet the requirements of signaling transport. In this paper, we evaluate the Linux-kernel SCTP implementation in the context of thin streams. To address the identified latency challenges, we propose sender-side-only enhancements that reduce the application-layer latency in a manner that is compatible with unmodified receivers. These enhancements can be switched on by applications and are used only when the system identifies the stream as thin. To evaluate the latency performance, we performed several tests over various real networks and over an emulated network, varying parameters such as RTT, packet loss, and the amount of competing cross traffic. When comparing our modifications with SCTP on Linux and FreeBSD and with TCP New Reno, our results show considerable latency improvements and indicate the need for separate handling of thin and thick streams.
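To make the thin-stream idea concrete, here is a minimal, hypothetical Python sketch of the kind of sender-side policy the abstract describes: a stream with too few packets in flight to trigger fast retransmit is classified as thin, and only then are the latency-reducing retransmission changes enabled. The threshold, flag names, and policy structure are illustrative assumptions, not the kernel implementation.

```python
# Hypothetical sketch of the thin-stream policy described above.
# The threshold of four packets in flight mirrors the fast-retransmit
# requirement (three duplicate ACKs); all names and the policy
# structure are illustrative, not the Linux-kernel implementation.

FAST_RETRANSMIT_DUPACKS = 3

def is_thin_stream(packets_in_flight):
    """A stream is 'thin' when it keeps too few packets in flight
    to recover from loss via fast retransmit."""
    return packets_in_flight < FAST_RETRANSMIT_DUPACKS + 1

def retransmission_policy(packets_in_flight):
    """Select sender-side parameters; the enhancements switch on only
    for thin streams, leaving thick streams with standard behaviour."""
    if is_thin_stream(packets_in_flight):
        return {
            "linear_timeouts": True,   # no exponential RTO backoff
            "dupack_threshold": 1,     # retransmit on the first dupACK
            "bundle_unacked": True,    # piggyback unACKed chunks
        }
    return {
        "linear_timeouts": False,
        "dupack_threshold": 3,
        "bundle_unacked": False,
    }

print(retransmission_policy(2))   # thin stream: modified retransmission
print(retransmission_policy(20))  # thick stream: standard behaviour
```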
103.
The benefits of software reuse have been studied for many years. Several previous studies have observed that reused software has a lower defect density than newly built software, but few have investigated the reasons for this phenomenon empirically. To date, we have only the common-sense observation that as software is reused over time, defect fixes accumulate, resulting in higher-quality software. This paper reports on an industrial case study in a large Norwegian oil and gas company, involving a reused Java class framework and two applications that use that framework. We analyzed all trouble reports from the use of the framework and the applications according to the Orthogonal Defect Classification (ODC), followed by a qualitative Root Cause Analysis (RCA). The results reveal that the framework has a much lower defect density overall than one application and a slightly higher defect density than the other. In addition, the defect densities of the most severe defects of the reused framework are similar to those of the applications reusing it. The ODC and RCA analyses reveal that systematic reuse (i.e. clearly defined and stable requirements, better design, hesitance to change, and solid testing) leads to lower defect densities of the functional-type defects in the reused framework than in the applications that reuse it. However, the different "nature" of the framework and the applications (e.g. interaction with other software, the amount and complexity of business logic, and the functionality of the software) may confound the causal relationship between systematic reuse and the lower defect density of the reused software. Using the results of the study as a basis, we present an improved overall cause–effect model between systematic reuse and lower defect density that will facilitate further studies and implementations of software reuse.
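As a side note on the metric used throughout the study, defect density is conventionally reported as defects per thousand lines of code (KLOC). The sketch below uses invented numbers purely to illustrate the framework-versus-application comparison; none of the figures come from the study.

```python
# Illustration of the defect-density metric only: defects per thousand
# lines of code (KLOC). All numbers below are invented; none are taken
# from the study.

def defect_density(defects, kloc):
    """Trouble reports per thousand lines of code."""
    return defects / kloc

components = {                      # (trouble reports, KLOC) - hypothetical
    "reused framework": (12, 40.0),
    "application A":    (30, 25.0),
    "application B":    (10, 35.0),
}

for name, (defects, kloc) in components.items():
    print(f"{name}: {defect_density(defects, kloc):.2f} defects/KLOC")
```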
104.
This paper is a contribution to the discussion on compiling computational lexical resources from conventional dictionaries. It describes the theoretical as well as practical problems that are encountered when reusing a conventional dictionary to compile a lexical-semantic resource in the form of a wordnet. More specifically, it describes the methodological issues of compiling a wordnet for Danish, DanNet, from a monolingual basis, and not, as is often seen, by applying the translational expansion method with Princeton WordNet as the English source. Thus, we apply as our basis a large, corpus-based printed dictionary of modern Danish. Using this approach, we discuss the issues of readjusting inconsistent and/or underspecified hyponymy hierarchies taken from the conventional dictionary, reconciling sense distinctions with the synonym sets of wordnets, generating semantic wordnet relations on the basis of sense definitions, and, finally, supplementing missing or implicit information.
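As a rough illustration of one of the methodological points above, generating a hyponymy link from a sense definition often amounts to extracting the genus term of the definition. The toy entries, pattern, and genus lexicon below are invented for illustration; real dictionary definitions require far more robust analysis.

```python
# A toy sketch of deriving hyponymy links from the genus terms of
# sense definitions. The entries and the genus lexicon are invented;
# real dictionary definitions need far more robust analysis.
import re

definitions = {
    "terrier": "small, lively dog originally bred for hunting",
    "spaniel": "dog with long ears and a silky coat",
}

GENUS = re.compile(r"\b(dog|mammal|animal)\b")  # toy genus lexicon

def hypernym_of(lemma):
    match = GENUS.search(definitions[lemma])
    return match.group(1) if match else None

for lemma in definitions:
    print(lemma, "->", hypernym_of(lemma))  # terrier -> dog, spaniel -> dog
```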
105.
The aim of this paper is to optimize a thermal model of a friction stir welding process by finding optimal welding parameters. The optimization is performed using space mapping and manifold mapping techniques, in which a coarse model is used alongside the fine model to be optimized. Different coarse models are applied, and the results and computation times are compared to gradient-based optimization using the full model. We find that space and manifold mapping reduce the computational cost significantly, because fewer function evaluations are needed and no gradient information from the fine model is required.
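A minimal sketch of the output-space-mapping idea may help: a cheap coarse model is repeatedly re-aligned with a handful of fine-model evaluations, so almost all optimization work is done on the surrogate. The one-dimensional toy models and target below are stand-ins, not the friction stir welding thermal model.

```python
# A minimal sketch of output space mapping: the expensive fine model is
# optimized mostly through a cheap coarse surrogate that is re-aligned
# with a few fine-model evaluations. The 1-D toy models and the target
# response are illustrative assumptions.
from scipy.optimize import minimize_scalar

def fine_model(x):    # expensive simulation (toy stand-in)
    return x**2 + 0.5

def coarse_model(x):  # cheap approximation with systematic bias
    return x**2

target = 2.0
x = minimize_scalar(lambda t: (coarse_model(t) - target) ** 2,
                    bounds=(0, 3), method="bounded").x
for _ in range(5):                           # only a few fine evaluations
    shift = fine_model(x) - coarse_model(x)  # additive response correction
    x = minimize_scalar(lambda t: (coarse_model(t) + shift - target) ** 2,
                        bounds=(0, 3), method="bounded").x
print(f"x = {x:.4f}, fine response = {fine_model(x):.4f}")  # response ≈ 2.0
```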
106.
eb3 is a trace-based formal language created for the specification of information systems. In eb3, each entity and association attribute is independently defined by a recursive function on the valid traces of external events. This paper describes an algorithm that generates, for each external event, a transaction that updates the values of the affected attributes in their relational database representation. The benefits are twofold: eb3 attribute specifications are automatically translated into executable programs, eliminating system design and implementation steps; and the construction of information systems is streamlined, because eb3 specifications are simpler and shorter to write than the corresponding traditional specifications, designs, and implementations. In particular, the paper shows that simple eb3 constructs can replace complex SQL queries that are typically difficult to write.
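The flavor of the translation can be suggested with a hypothetical example: an external event is mapped to a short relational transaction that updates the attributes it affects. The event, table, and column names below are invented for illustration and do not reproduce the paper's algorithm.

```python
# Hypothetical illustration of the event-to-transaction translation:
# each external event yields a transaction updating the attributes it
# affects. The event, table and column names are invented and do not
# reproduce the paper's algorithm.

def transaction_for_event(event, book_id):
    """Map an external event to the SQL updates of affected attributes."""
    if event == "Lend":
        return [
            "BEGIN;",
            f"UPDATE book SET nbLoans = nbLoans + 1, borrowed = 1"
            f" WHERE bookId = {book_id};",
            "COMMIT;",
        ]
    if event == "Return":
        return [
            "BEGIN;",
            f"UPDATE book SET borrowed = 0 WHERE bookId = {book_id};",
            "COMMIT;",
        ]
    raise ValueError(f"unknown external event: {event}")

for statement in transaction_for_event("Lend", 42):
    print(statement)
```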
107.
Proteomics analysis of serum from patients with type 1 diabetes (T1D) may lead to novel biomarkers for prediction of disease and for patient monitoring. However, the serum proteome is highly sensitive to sample processing, and serum cohorts should preferably be examined for potential bias between sample groups before proteomics biomarker research is undertaken. SELDI-TOF MS protein profiling was used for a preliminary evaluation of a biological bank with 766 serum samples from 270 patients with T1D, collected at 18 different paediatric centers representing 15 countries in Europe and Japan over 2 years (2000–2002). Samples collected 1 (n = 270), 6 (n = 248), and 12 (n = 248) months after T1D diagnosis were grouped across centers and compared. The serum protein profiles varied with collection site and day of analysis; however, markers of sample processing were not systematically different between samples collected at different times after diagnosis. Three members of the apolipoprotein family increased with time in patient serum collected 1, 6, and 12 months after diagnosis (ANOVA, p < 0.001). These results support the use of this serum cohort for further proteomic studies and illustrate the potential of high-throughput MALDI/SELDI-TOF MS protein profiling for the evaluation of serum cohorts before proteomics biomarker research.
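For readers unfamiliar with the group comparison used here, the sketch below runs a one-way ANOVA of a single protein peak intensity across the 1-, 6-, and 12-month sample groups, using the group sizes from the abstract; the intensity values themselves are simulated for illustration and are not study data.

```python
# One-way ANOVA across the 1-, 6- and 12-month groups, using the group
# sizes from the abstract; the peak intensities are simulated for
# illustration and are not study data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
month_1  = rng.normal(loc=1.00, scale=0.15, size=270)
month_6  = rng.normal(loc=1.08, scale=0.15, size=248)
month_12 = rng.normal(loc=1.15, scale=0.15, size=248)

f_stat, p_value = stats.f_oneway(month_1, month_6, month_12)
print(f"F = {f_stat:.2f}, p = {p_value:.3g}")  # p < 0.001: intensity rises
```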
108.
An original inversion method specifically adapted to estimating the Poisson coefficient of balls from their resonance spectra is described. From the study of their elastic vibrations, it is possible to characterize the balls accurately. The proposed methodology can both excite spheroidal modes in the balls and detect such vibrations over a large frequency range. Experimentally, by using an ultrasonic probe for emission (a piezoelectric transducer) and a heterodyne optical probe for reception (an interferometer), it was possible to take spectroscopic measurements of spheroidal vibrations over a large frequency range (100 kHz to 45 MHz) in a continuous regime. This method, which uses ratios between resonance frequencies, allows the Poisson coefficient to be determined independently of Young's modulus and the ball's radius and density. This has the advantage of providing highly accurate estimates of the Poisson coefficient (±4.3 × 10⁻⁴) over a wide frequency range.
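The inversion principle can be sketched as follows: because the ratio of two spheroidal resonance frequencies depends only on the Poisson coefficient, a measured ratio can be inverted without knowing Young's modulus, the radius, or the density. The tabulated forward model below is a made-up monotonic placeholder; a real table would come from solving the elastic sphere's characteristic equations.

```python
# Sketch of the inversion principle: the ratio of two spheroidal
# resonance frequencies depends only on the Poisson coefficient, so a
# measured ratio can be inverted without Young's modulus, radius or
# density. The forward model below is a made-up monotonic placeholder;
# a real table comes from the elastic sphere's characteristic equations.
import numpy as np
from scipy.interpolate import interp1d

poisson = np.linspace(0.20, 0.40, 21)   # candidate Poisson coefficients
ratio_table = 1.60 - 0.50 * poisson     # toy forward model: ratio(poisson)

invert = interp1d(ratio_table, poisson)  # measured ratio -> Poisson value

measured_ratio = 1.455                   # ratio of two measured mode freqs
print(f"estimated Poisson coefficient: {float(invert(measured_ratio)):.4f}")
```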
109.
Stretch-dominated truss and plate microstructures are contenders in the quest to realize architected materials with extreme stiffness and strength. In the low volume fraction limit, closed-cell isotropic plate microstructures meet theoretical upper bounds on stiffness but have low buckling strength, whereas open-cell truss microstructures have high buckling strength at the cost of significantly reduced stiffness. At finite volume fractions the picture becomes less clear, but both are outperformed in buckling strength by hollow truss lattice and hierarchical microstructures. Despite significant advances in manufacturing methods, hollow and multi-scale hierarchical microstructures remain challenging to build. The question is whether there exist realizable microstructures whose stiffness and strength match or even beat those of hard-to-realize hollow or hierarchical microstructures. Herein, single-scale, non-hierarchical (first-order) microstructures that beat the buckling strength of hollow truss lattice structures by a factor of 2.4, and first- and second-order plate microstructures by factors of 5 and 1.4, respectively, are systematically designed, built, and tested. The stiffness of the microstructures is within 40% of theoretical bounds and beats both truss and second-order plate microstructures. The microstructures are realized with 3D printing. Experiments validate the theoretical predictions, and additional insight is provided through numerical modeling of a CT-scanned sample.
110.
Magnetic Resonance Materials in Physics, Biology and Medicine - Signal intensity normalization is necessary to reduce heterogeneity in T2-weighted (T2W) magnetic resonance imaging (MRI) for...