Sort order: 5,386 results found (search time: 433 ms)
91.
Multi- and hyperspectral imaging and data analysis have been investigated over recent decades in various fields of application such as remote sensing and microscopic spectroscopy. However, recent developments in sensor technology and a growing number of application areas call for a more generic view of data analysis that clearly expands the current, domain-specific approaches. In this context, we address the problem of interactive exploration of multi- and hyperspectral data, treating (semi-)automatic data analysis and scientific visualization in a comprehensive fashion. In this paper, we propose an approach that enables generic interactive exploration and easy segmentation of multi- and hyperspectral data, based on the characterizing spectra of an individual dataset, the so-called endmembers. Building on the concepts of existing endmember extraction algorithms, we derive a visual analysis system in which the initially identified characteristic spectra serve as input for interactively tailoring a problem-specific visual analysis by means of visual exploration. An optional outlier detection improves the robustness of the endmember detection and analysis. Adequate system feedback for the costly unmixing of the spectral data with respect to the current set of endmembers is ensured by a novel technique for progressive unmixing and view updates, applied upon user modification. The progressive unmixing is based on an efficient prediction scheme applied to previous unmixing results. We present a detailed evaluation of our system on confocal Raman microscopy, common multispectral imaging and remote sensing data.
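The linear mixing model underlying endmember-based unmixing can be sketched briefly: each pixel spectrum is modelled as a combination of the endmember spectra, and the abundances are recovered by least squares. The following is a minimal illustration on synthetic data (plain least squares with clipping, not the paper's progressive prediction scheme):

```python
import numpy as np

def unmix(pixels, endmembers):
    """Estimate abundances for the linear mixing model
    pixel = endmembers @ abundances (one endmember per column)."""
    # Unconstrained least squares, then clip negatives and renormalise
    # as a cheap stand-in for a properly constrained solver.
    a, *_ = np.linalg.lstsq(endmembers, pixels, rcond=None)
    a = np.clip(a, 0.0, None)
    total = a.sum(axis=0, keepdims=True)
    total[total == 0.0] = 1.0
    return a / total

# Two synthetic endmember spectra over four bands, one 50/50 mixed pixel.
E = np.array([[1.0, 0.0],
              [0.8, 0.2],
              [0.2, 0.8],
              [0.0, 1.0]])
pixel = E @ np.array([0.5, 0.5])
print(unmix(pixel[:, None], E).ravel())  # abundances close to [0.5, 0.5]
```

In a real pipeline the endmember matrix would come from an extraction algorithm and the solver would enforce nonnegativity and sum-to-one constraints directly.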
92.
Snowmelt is known to cause peak concentrations of pollutants, which may adversely affect receiving water quality. High concentrations of metals and suspended solids in snow have been reported, whereas studies on organic pollutants are rare. This study aims at investigating the occurrence of anthropogenic organic compounds in urban snow in Gothenburg (Sweden). The most frequently detected organic pollutants in the collected snow samples were polycyclic aromatic hydrocarbons (PAHs), high molecular-weight phthalates, 4-nonylphenol and 4-t-octylphenol. Brominated flame retardants and chlorinated paraffins were only sporadically detected. In several snow samples, the concentrations of specific PAHs, alkylphenols and phthalates were higher than reported stormwater concentrations and European water quality standards. Pollutant source identification and sustainable management of snow are important instruments for the mitigation of organic contaminants in the urban environment.
93.
This paper presents a gradient-based optimization method for large-scale topology and thickness optimization of fiber-reinforced monolithic laminated composite structures, including certain manufacturing constraints to attain industrial relevance. This facilitates the use of predefined fiber mats and reduces the risk of failures such as delamination and matrix cracking. The method concerns the simultaneous determination of the optimum thickness and fiber orientation throughout a laminated structure with fixed outer geometry. The laminate thickness may vary as an integer number of plies, and the possible fiber orientations are limited to a finite set. The conceptual combinatorial problem is relaxed to a continuous problem and solved on the basis of interpolation schemes with penalization through the so-called Discrete Material Optimization method, explicitly including manufacturing constraints as a large number of sparse linear constraints. The methodology is demonstrated on several numerical examples.
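A commonly used DMO interpolation (the Stegmann–Lund weighting scheme, assumed here as an illustration) blends candidate material stiffnesses so that penalization drives each point of the design toward selecting exactly one candidate. The moduli below are made-up numbers:

```python
import numpy as np

def dmo_weights(x, p=3.0):
    """Discrete Material Optimization weights: each candidate's weight is
    its own penalised design variable times the penalised 'absence' of
    every other candidate, pushing mixtures toward a single choice."""
    xp = x ** p
    w = np.empty_like(xp)
    for i in range(x.size):
        w[i] = xp[i] * np.prod(1.0 - np.delete(xp, i))
    return w

# Four candidate fibre angles; the design variables nearly select no. 3.
x = np.array([0.05, 0.05, 0.90, 0.05])
E_candidates = np.array([10.0, 12.0, 15.0, 11.0])  # hypothetical moduli (GPa)
w = dmo_weights(x)
E_eff = w @ E_candidates  # effective stiffness, dominated by candidate 3
print(w.round(4), round(E_eff, 2))
```

With penalization (p > 1), intermediate mixtures contribute little stiffness, which is what makes the relaxed continuous problem converge toward a discrete material choice.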
94.
Interactive topology optimization on hand-held devices
This paper presents an interactive topology optimization application designed for hand-held devices running iOS or Android. The TopOpt app solves the 2D minimum compliance problem with interactive control of load and support positions as well as the volume fraction. Thus, it is possible to change the problem settings on the fly and watch the design evolve to a new optimum in real time. The interactive app makes it extremely simple to learn and understand the influence of load directions, support conditions and volume fraction. The topology optimization kernel is written in C# and the graphical user interface is developed using the game engine Unity3D. The underlying code is inspired by the publicly available 88- and 99-line Matlab codes for topology optimization but does not utilize any low-level linear algebra routines such as BLAS or LAPACK. The TopOpt app can be downloaded from the Apple App Store for iOS devices and from Google Play for Android, and a web version can be run from www.topopt.dtu.dk.
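The core of those 88/99-line Matlab codes is an optimality-criteria density update with a bisection on the Lagrange multiplier of the volume constraint. A rough Python sketch with mock sensitivities (an illustration of the update rule, not the app's actual C# kernel):

```python
import numpy as np

def oc_update(x, dc, dv, volfrac, move=0.2):
    """Optimality-criteria update as in the 88/99-line topology
    optimization codes: bisect the Lagrange multiplier of the volume
    constraint, with move limits and box constraints on the densities."""
    l1, l2 = 0.0, 1e9
    while (l2 - l1) / (l1 + l2 + 1e-12) > 1e-4:
        lmid = 0.5 * (l1 + l2)
        # Heuristic fixed-point step x * sqrt(-dc / (dv * lambda)).
        xnew = x * np.sqrt(np.maximum(-dc / (dv * lmid), 1e-12))
        xnew = np.clip(xnew, np.maximum(x - move, 0.0),
                       np.minimum(x + move, 1.0))
        if xnew.mean() > volfrac:
            l1 = lmid   # too much material: raise the multiplier
        else:
            l2 = lmid
    return xnew

x = np.full(100, 0.5)                 # uniform initial densities
dc = -np.linspace(1.0, 2.0, 100)      # mock compliance sensitivities
dv = np.ones(100)                     # volume sensitivities
xnew = oc_update(x, dc, dv, volfrac=0.5)
print(abs(xnew.mean() - 0.5) < 1e-3)  # volume constraint is met
```

In the full algorithm this update alternates with a finite-element solve that provides the compliance sensitivities `dc`; the interactivity of the app comes from re-running this loop as the user moves loads and supports.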
95.
The aim of this study was to evaluate the effect of a transfer technique education programme (TT), alone or in combination with physical fitness training (TTPT), compared with a control group that followed its usual routine. Eleven clinical hospital wards were cluster-randomised to either intervention (six wards) or control (five wards). Within the intervention cluster, nurses were individually randomised to TT (55 nurses) or TTPT (50 nurses); the control group comprised 76 nurses. The transfer technique programme was a 4-day train-the-trainers course preparing participants to teach transfer technique to their colleagues. The physical training consisted of supervised physical fitness training for 1 h twice per week for 8 weeks. Implementing transfer technique alone or in combination with physical fitness training among hospital nursing staff did not, when compared with the control group, show any statistically significant differences in self-reported low back pain (LBP), pain level, disability or sick leave at the 12-month follow-up. However, the individually randomised intervention subgroup (transfer technique plus physical training) significantly improved LBP-related disability (p = 0.001). Although weakened by a high withdrawal rate, the findings suggest that teaching transfer technique to nurses in a hospital setting needs to be thoroughly considered, and that other priorities, such as physical training, may be taken into consideration. The current study supports the findings of other studies that introducing transfer technique alone has no effect on LBP. However, physical training seems to help minimise the consequences of LBP and may be important in the discussion of how to prevent LBP, or its recurrence, among nursing personnel.
96.
This paper presents a workplace study of triage work practices within an emergency department (ED). We examine the practices, procedures, and organization in which ED staff use tools and technologies when coordinating the essential activity of assessing and sorting patients arriving at the ED. The paper provides in-depth empirical observations describing the situated work practices of triage work and the complex collaborative nature of the triage process. We identify and conceptualize triage work practices as comprising patient trajectories, triage nurse activities, coordinative artefacts and exception handling; we also articulate how these four features of triage practices constitute and connect workflows, and organize and re-organize time and space during the triage process. Finally, we conceptualize these connections as an assessing and sorting mechanism in collaborative work. We argue that the complexities involved in this mechanism are a necessary asset of triage work, which calls for a reassessment of the concept of triage drift.
97.
A large number of network services rely on IP and reliable transport protocols. For applications that provide abundant data for transmission, loss is usually handled satisfactorily, even if the application is latency-sensitive (Wang et al. 2004). For data streams where small packets are sent intermittently, however, applications can occasionally experience extreme latencies (Griwodz and Halvorsen 2006). As it is not uncommon that such thin-stream applications are time-dependent, any unnecessarily induced delay can have severe consequences for the service provided. Massively Multiplayer Online Games (MMOGs) are a defining example of thin streams. Many MMOGs (like World of Warcraft and Age of Conan) use TCP for the benefits of reliability, in-order delivery and NAT/firewall traversal. It has been shown that TCP has several shortcomings with respect to the latency requirements of thin streams because of the way it handles retransmissions (Griwodz and Halvorsen 2006). As such, an alternative to TCP may be SCTP (Stewart et al. 2000), which was originally developed to meet the requirements of signaling transport. In this paper, we evaluate the Linux-kernel SCTP implementation in the context of thin streams. To address the identified latency challenges, we propose sender-side only enhancements that reduce the application-layer latency in a manner that is compatible with unmodified receivers. These enhancements can be switched on by applications and are used only when the system identifies the stream as thin. To evaluate the latency performance, we have performed several tests over various real networks and over an emulated network, varying parameters like RTT, packet loss and amount of competing cross traffic. When comparing our modifications with SCTP on Linux and FreeBSD and TCP New Reno, our results show great latency improvements and indicate the need for a separate handling of thin and thick streams.
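One of the sender-side ideas behind such thin-stream enhancements can be shown as a simple decision rule (the threshold of four packets is illustrative, drawn from the general observation that standard fast retransmit needs three duplicate ACKs):

```python
def should_fast_retransmit(dup_acks, packets_in_flight):
    """Retransmission trigger sketch: a regular stream waits for three
    duplicate ACKs; a thin stream, with fewer than four packets in
    flight, may never see three dupACKs, so it fires on the first."""
    if packets_in_flight < 4:   # thin: not enough traffic for 3 dupACKs
        return dup_acks >= 1
    return dup_acks >= 3        # standard fast retransmit rule

# A game-style thin stream versus a bulk transfer, same single dupACK.
print(should_fast_retransmit(1, packets_in_flight=2))   # thin  -> True
print(should_fast_retransmit(1, packets_in_flight=20))  # thick -> False
```

Because the rule only relaxes the trigger when the stream is detected as thin, bulk transfers keep the standard behaviour, which is what makes such modifications safe to deploy on the sender side with unmodified receivers.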
98.
The benefits of software reuse have been studied for many years. Several previous studies have observed that reused software has a lower defect density than newly built software. However, few studies have investigated the reasons for this phenomenon empirically. To date, we have only the common-sense observation that, as software is reused over time, fixed defects accumulate and result in high-quality software. This paper reports on an industrial case study in a large Norwegian oil and gas company, involving a reused Java class framework and two applications that use that framework. We analyzed all trouble reports from the use of the framework and the applications according to the Orthogonal Defect Classification (ODC), followed by a qualitative Root Cause Analysis (RCA). The results reveal that the framework has a much lower defect density in total than one application and a slightly higher defect density than the other. In addition, the defect densities of the most severe defects of the reused framework are similar to those of the applications that reuse it. The results of the ODC and RCA analyses reveal that systematic reuse (i.e. clearly defined and stable requirements, better design, hesitance to change, and solid testing) leads to lower defect densities of the functional-type defects in the reused framework than in the applications that reuse it. However, the different "nature" of the framework and the applications (e.g. interaction with other software, amount and complexity of business logic, and functionality of the software) may confound the causal relationship between systematic reuse and the lower defect density of the reused software. Using the results of the study as a basis, we present an improved overall cause–effect model between systematic reuse and lower defect density that will facilitate further studies and implementations of software reuse.
99.
This paper is a contribution to the discussion on compiling computational lexical resources from conventional dictionaries. It describes the theoretical as well as practical problems that are encountered when reusing a conventional dictionary for compiling a lexical-semantic resource in the form of a wordnet. More specifically, it describes the methodological issues of compiling a wordnet for Danish, DanNet, from a monolingual basis, and not, as is often seen, by applying the translational expansion method with Princeton WordNet as the English source. Thus, we apply as our basis a large, corpus-based printed dictionary of modern Danish. Using this approach, we discuss the issues of readjusting inconsistent and/or underspecified hyponymy hierarchies taken from the conventional dictionary, sense distinctions as opposed to the synonym sets of wordnets, generating semantic wordnet relations on the basis of sense definitions, and finally, supplementing missing or implicit information.
100.
The aim of this paper is to optimize a thermal model of a friction stir welding process by finding optimal welding parameters. The optimization is performed using space mapping and manifold mapping techniques, in which a coarse model is used along with the fine model to be optimized. Different coarse models are applied, and the results and computation times are compared with gradient-based optimization using the full model. It is found that the use of space and manifold mapping reduces the computational cost significantly, because fewer function evaluations are needed and no fine-model gradient information is required.
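The space-mapping idea can be shown in one dimension: repeatedly align a cheap coarse model to the latest fine-model evaluation (parameter extraction), then re-optimise the aligned coarse model instead of the expensive one. Both models below are made-up quadratics and the extraction is a brute-force grid search; the paper's thermal models are far more elaborate.

```python
import numpy as np

def fine(x):    # "expensive" fine model (illustrative stand-in)
    return (x - 3.0) ** 2 + 1.0

def coarse(x):  # cheap surrogate with a known optimum at x = 2
    return (x - 2.0) ** 2 + 1.0

x, s = 2.0, 0.0                  # start at the coarse optimum, no shift
grid = np.linspace(-2.0, 2.0, 4001)
for _ in range(5):
    # Parameter extraction: shift the coarse model's input so its
    # response matches the latest (single) fine-model evaluation.
    s = grid[np.argmin((coarse(x + grid) - fine(x)) ** 2)]
    # Re-optimise the shifted coarse model; its argmin is 2 - s here.
    x = 2.0 - s
print(round(x, 6))  # iterates settle at the fine-model optimum x = 3
```

Each iteration costs only one fine-model evaluation, which is the source of the computational savings the paper reports: the expensive model is queried for alignment, never for gradients or line searches.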
Copyright © Beijing Qinyun Technology Development Co., Ltd. 京ICP备09084417号