71.
René-Maxime Gracien, Sarah C. Reitz, Marlies Wagner, Christoph Mayer, Steffen Volz, Stephanie-Michelle Hof, Vinzenz Fleischer, Amgad Droby, Helmuth Steinmetz, Sergiu Groppa, Elke Hattingen, Johannes C. Klein, Ralf Deichmann. Magma (New York, N.Y.), 2017, 30(1): 75-83
Objective
Proton density (PD) mapping requires correction for the receive profile (RP), which is frequently performed via bias-field correction. An alternative RP-mapping method utilizes a comparison of uncorrected PD maps and a value ρ(T1) directly derived from T1 maps via the Fatouros equation. This may be problematic in multiple sclerosis (MS) if the respective parameters are only valid for healthy brain tissue. We aimed to investigate whether the alternative method yields correct PD values in MS patients.
Materials/methods
PD mapping was performed on 27 patients with relapsing-remitting MS and 27 healthy controls, utilizing both methods, yielding reference PD values (PDref, bias-field method) and PDalt (alternative method).
Results
PDalt values closely matched PDref, both for patients and controls. In contrast, ρ(T1) differed by up to 3% from PDref, and the voxel-wise correlation between PDref and ρ(T1) was reduced in a patient subgroup with a higher degree of disability. Still, discrepancies between ρ(T1) and PDref were almost identical across different tissue types, thus translating into a scaling factor which cancelled out during normalization to 100% in CSF, yielding good agreement between PDalt and PDref.
Conclusion
RP correction utilizing the auxiliary parameter ρ(T1) derived via the Fatouros equation provides accurate PD results in MS patients, in spite of discrepancies between ρ(T1) and actual PD values.
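The correction scheme can be pictured with a short sketch: derive a water-content estimate ρ(T1) from the T1 map via a Fatouros-type relation, take the smoothed ratio of the uncorrected PD map to ρ(T1) as the receive profile, divide it out, and normalize to 100% in CSF. This is only a minimal illustration; the constants A and B, the Gaussian smoothing, and the function names are assumptions, not the authors' implementation.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def rho_from_t1(t1_map, a=0.935, b=0.341):
    """Water content rho(T1) from a Fatouros-type relation 1/rho = A + B/T1.
    A and B are illustrative placeholder constants for healthy brain tissue;
    T1 is assumed to be given in seconds."""
    return 1.0 / (a + b / t1_map)

def correct_receive_profile(pd_uncorrected, t1_map, csf_mask, smooth_sigma=5.0):
    """Sketch of the alternative RP correction: the receive profile is estimated
    as the (smoothed, slowly varying) ratio of the uncorrected PD map to rho(T1),
    divided out, and the result is normalized so that CSF corresponds to 100 %."""
    rho = rho_from_t1(t1_map)
    rp = gaussian_filter(pd_uncorrected / rho, smooth_sigma)
    pd_corrected = pd_uncorrected / rp
    pd_corrected *= 100.0 / np.median(pd_corrected[csf_mask])
    return pd_corrected
```

Any scaling error shared by all tissue types, such as the patient-related ρ(T1) bias reported above, cancels in the final CSF normalization step, which is why PDalt can still match PDref.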
72.
Computational Economics - We present a new modeling approach for house price movements as a consequence of the trading behavior of market agents. In our modeling approach, all agents are assumed to...
73.
Thorsten Falk, Ralf Heese, Christian Kaspar, Malgorzata Mochol, Daniel Pfeiffer, Michael Thygs, Robert Tolksdorf. Informatik-Spektrum, 2006, 29(3): 201-209
Abstract The importance of the Internet for job procurement is growing, on the one hand because three quarters of people of employment age are online, and on the other because ever more companies publish their job offers on the Web. However, due to the large number of openings published online, it is almost impossible for job seekers and job portals to gain an overview of the entire employment market. Since job offers lack semantically meaningful annotations, searching them and integrating them into databases is difficult. Applying Semantic Web technologies to the e-recruitment process provides advantages for all participants in the market. In this paper we describe a method for analysing the domain-specific language of an application domain. We use this method to describe the e-recruitment process and the necessary ontologies for annotating job offers and job applications. In conclusion, we present the prototypical implementation of the scenario based on Semantic Web technologies, especially semantic matching.
74.
75.
Wjatscheslaw Missal, Jaroslaw Kita, Eberhard Wappler, Frieder Gora, Annette Kipka, Thomas Bartnitzek, Franz Bechtold, Dirk Schabbel, Beate Pawlowski, Ralf Moos. Sensors and Actuators A: Physical, 2011, 172(1): 21-26
A miniaturized ceramic differential scanning calorimeter (MC-DSC) with an integrated oven and crucible is presented. Despite its small size of only 11 mm × 39 mm × 1.5 mm, all functions of a conventional DSC apparatus are integrated in this novel device, including the oven. The MC-DSC is fully manufactured in thick-film and green glass ceramic tape-based low temperature co-fired ceramics (LTCC) technology; therefore, production costs are considered to be low. Initial results using indium as a sample material show good dynamic performance of the MC-DSC. The full width at half maximum of the melting peak is 2.4 °C (sample mass approx. 11 mg, heating rate approx. 50 °C/min). Repeatability of the indium melting point is within ±0.02 °C. The melting peak area increases linearly with the sample mass up to at least 26 mg. Simulations of a strongly simplified finite element model of the MC-DSC are in good agreement with measurement results, allowing a model-based prediction of its basic characteristics.
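The peak figures quoted above (full width at half maximum, peak area versus sample mass) follow from standard analysis of the measured heat-flow curve. A generic sketch of that analysis, assuming a baseline-corrected curve, could look as follows; it is not the authors' evaluation code.

```python
import numpy as np

def melting_peak_metrics(temperature, heat_flow):
    """Full width at half maximum (in the temperature unit) and peak area of a
    single, baseline-corrected melting peak, via simple thresholding and
    trapezoidal integration. Generic analysis sketch only."""
    half_height = np.max(heat_flow) / 2.0
    above = np.where(heat_flow >= half_height)[0]
    fwhm = temperature[above[-1]] - temperature[above[0]]
    # Trapezoidal rule; the area is proportional to the melting enthalpy.
    area = np.sum(0.5 * (heat_flow[1:] + heat_flow[:-1]) * np.diff(temperature))
    return fwhm, area
```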
76.
A two-layer architecture for dynamic real-time optimization (or nonlinear model predictive control (NMPC) with an economic objective) is presented, where the solution of the dynamic optimization problem is computed on two time-scales. On the upper layer, a rigorous optimization problem with an economic objective function is solved at a slow time-scale, which captures slow trends in process uncertainties. On the lower layer, a fast neighboring-extremal controller tracks the trajectory in order to deal with fast disturbances acting on the process. Compared to a single-layer architecture, the two-layer architecture can address control systems with complex models leading to a high computational load, since the rigorous optimization problem can be solved at a slower rate than the process sampling time. Furthermore, solving a new rigorous optimization problem is not necessary at each sampling time if the process has rather slow dynamics compared to the disturbance dynamics. The two-layer control strategy is illustrated with a simulated case study of an industrial polymerization process.
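The division of labour between the layers can be sketched as a simple control loop: the rigorous economic optimization is re-solved only every n_slow samples, while a precomputed neighboring-extremal gain corrects the input at every sample. The names plant, economic_optimizer and ne_gains below are hypothetical stand-ins, so this is a structural sketch rather than the paper's controller.

```python
def two_layer_control(plant, economic_optimizer, ne_gains, n_slow=20, n_steps=200):
    """Structural sketch of the two-layer architecture.

    Upper (slow) layer: economic_optimizer(x) solves the rigorous economic
    dynamic optimization and returns nominal input/state trajectories covering
    the next n_slow samples. Lower (fast) layer: ne_gains[i] is a precomputed
    neighboring-extremal feedback matrix applied at every sampling instant to
    reject fast disturbances. All three arguments are hypothetical stand-ins.
    """
    x = plant.initial_state()
    u_nom = x_nom = None
    for k in range(n_steps):
        i = k % n_slow
        if i == 0:
            u_nom, x_nom = economic_optimizer(x)      # slow layer: re-optimize economics
        u = u_nom[i] + ne_gains[i] @ (x - x_nom[i])   # fast layer: trajectory correction
        x = plant.step(x, u)                          # process step; disturbances enter here
    return x
```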
77.
Anne Martens, Heiko Koziolek, Lutz Prechelt, Ralf Reussner. Empirical Software Engineering, 2011, 16(5): 587-622
Model-based performance evaluation methods for software architectures can help architects to assess design alternatives and save costs for late life-cycle performance fixes. A recent trend is component-based performance modelling, which aims at creating reusable performance models; a number of such methods have been proposed during the last decade. Their accuracy and the needed modelling effort are heavily influenced by human factors, which are so far hardly understood empirically. Do component-based methods allow performance predictions of comparable accuracy while saving effort in a reuse scenario? We examined three monolithic methods (SPE, umlPSI, Capacity Planning (CP)) and one component-based performance evaluation method (PCM) with regard to their accuracy and effort from the viewpoint of method users. We conducted a series of three experiments (with different levels of control) involving 47 computer science students. In the first experiment, we compared the applicability of the monolithic methods in order to choose one of them for comparison. In the second experiment, we compared the accuracy and effort of this monolithic and the component-based method for the model creation case. In the third, we studied the effort reduction from reusing component-based models. Data were collected based on the resulting artefacts, questionnaires and screen recording. They were analysed using hypothesis testing, linear models, and analysis of variance. For the monolithic methods, we found that using SPE and CP resulted in accurate predictions, while umlPSI produced over-estimates. Comparing the component-based method PCM with SPE, we found that creating reusable models using PCM takes more (but not drastically more) time than using SPE and that participants can create accurate models with both techniques. Finally, we found that reusing PCM models can save time, because the effort to reuse can be explained by a model that is independent of the inner complexity of a component. The tasks performed in our experiments reflect only a subset of the actual activities when applying model-based performance evaluation methods in a software development process. Our results indicate that sufficient prediction accuracy can be achieved with both monolithic and component-based methods, and that the higher effort for component-based performance modelling will indeed pay off when the component models incorporate and hide a sufficient amount of complexity.
78.
Vasileios Belagiannis, Xinchao Wang, Horesh Beny Ben Shitrit, Kiyoshi Hashimoto, Ralf Stauder, Yoshimitsu Aoki, Michael Kranzfelder, Armin Schneider, Pascal Fua, Slobodan Ilic, Hubertus Feussner, Nassir Navab. Machine Vision and Applications, 2016, 27(7): 1035-1046
Multiple human pose estimation is an important yet challenging problem. In an operating room (OR) environment, the 3D body poses of surgeons and medical staff can provide important clues for surgical workflow analysis. For that purpose, we propose an algorithm for localizing and recovering the body poses of multiple humans in an OR environment under a multi-camera setup. Our model builds on 3D Pictorial Structures and 2D body part localization across all camera views, using convolutional neural networks (ConvNets). To evaluate our algorithm, we introduce a dataset captured in a real OR environment. Our dataset is unique, challenging and publicly available with annotated ground truths. Our proposed algorithm yields promising pose estimation results on this dataset.
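One ingredient of such a multi-camera pipeline, recovering a joint's 3D position from its 2D ConvNet detections in several calibrated views, can be written compactly as linear (DLT) triangulation. This is only an illustrative building block under standard pinhole-camera assumptions, not the paper's full 3D Pictorial Structures inference.

```python
import numpy as np

def triangulate_joint(camera_matrices, detections_2d):
    """Linear (DLT) triangulation of one body joint from its 2D detections in
    several calibrated cameras. camera_matrices is a list of 3x4 projection
    matrices, detections_2d the matching list of (x, y) image coordinates.
    Illustrative building block only."""
    rows = []
    for P, (x, y) in zip(camera_matrices, detections_2d):
        rows.append(x * P[2] - P[0])   # each view contributes two linear constraints
        rows.append(y * P[2] - P[1])
    _, _, vt = np.linalg.svd(np.asarray(rows))
    X = vt[-1]                         # null-space vector = homogeneous 3D point
    return X[:3] / X[3]
```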
79.
Markus Eisele, Ralf Kolb, Erwin Kraus, Christian von Ehrenstein. Informatik-Spektrum, 2007, 30(6): 407-412
Abstract The buzzword NetWeaver has been in vogue in our industry for some time. As is so often the case, it is not easy to separate the marketing aspects from the underlying computer-science substance. This article aims to help with that.
80.
Ralf Östermark. Computational Economics, 1992, 5(4): 283-302
In this report we formulate a linear multiperiod programming problem and show how it can be solved by a new interior point algorithm. The conditions of convergence and applicability of the algorithm of centers are explicitly connected to the multiperiod programming problem. For the former, effective application is supported by a set of simplifying conditions stated and proved in the text. For the latter, boundedness and nontriviality under real-world conditions are demonstrated, allowing for its solution by the interior point algorithm. The use of a fast interior point algorithm is motivated by some empirical evidence from a Revised Simplex optimizer.
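For orientation, a toy multiperiod linear program of this flavour (production over two periods coupled by an inventory balance) can be stated and handed to an off-the-shelf interior-point LP solver in a few lines. The numbers are made up, and the solver is SciPy's HiGHS interior-point method, not the algorithm of centers studied in the paper.

```python
import numpy as np
from scipy.optimize import linprog

# Toy two-period production/inventory LP (illustrative numbers only).
# Variables x = [p1, p2, s1]: production per period and stock carried into period 2.
c = np.array([4.0, 6.0, 0.5])            # production costs per period, holding cost
A_eq = np.array([
    [1.0, 0.0, -1.0],                    # period-1 balance: p1 - s1 = demand_1
    [0.0, 1.0,  1.0],                    # period-2 balance: p2 + s1 = demand_2
])
b_eq = np.array([30.0, 50.0])
bounds = [(0, 45), (0, 45), (0, None)]   # per-period production capacity of 45

# 'highs-ipm' selects the interior-point variant of the HiGHS solver.
res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=bounds, method="highs-ipm")
print(res.x, res.fun)                    # optimal plan [p1, p2, s1] and total cost
```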