151.
Summary Broad-line NMR measurements give information on energy-elastic processes during the straining of hard elastic fibers, namely the shearing of the crystalline regions and the transformation of parts of the crystalline regions into amorphous ones. The amount of these transformed parts is proportional to the strain, and they recrystallize again depending on temperature and time. Sheared crystalline regions relax within a few days, even at very low temperatures (–175°C, –120°C). In the case of isotactic polypropylene, the mobility of the various groups depends not directly on the temperature but on the nearest-neighbour distances; thus methyl groups in the sheared crystalline regions undergo hindered rotation even at –175°C. Presented at the 22nd Microsymposium, Characterization of Structure and Dynamics of Macromolecular Systems by NMR Methods, Prague, July 20–23, 1981.
152.
The problem of computerized batch control of the silicon epitaxial layer deposition process has been solved using optimal stochastic control methods. A control algorithm is presented, with the main emphasis given to forecasting and compensating for the disturbing processes that act on a process unit under real operating conditions. A method of stochastic model-form identification for the process noise is developed, based on the results of a multidimensional time-series correlation analysis. The maximum-likelihood identification method is applied in order to obtain efficient estimates of the model parameters. The identification of the model form and model parameters is carried out on a rather extensive set of data obtained from the operation records of a high-capacity epitaxial unit, and the adequacy of the identified models is checked by means of a correlation analysis of the model residuals. It is demonstrated that the presented algorithm achieves results comparable to intuitive process control by an experienced operator; it thus represents a reliable and rational basis for process-control computer software development.
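The identification step can be illustrated in miniature. The sketch below is not the authors' algorithm: the AR(1) noise model, the simulated data, and the use of SciPy's general-purpose optimizer are all assumptions made for illustration of maximum-likelihood parameter estimation for process noise.

```python
import numpy as np
from scipy.optimize import minimize

def ar1_neg_log_likelihood(params, x):
    """Negative Gaussian log-likelihood of an AR(1) noise model x[t] = phi*x[t-1] + e[t]."""
    phi, log_sigma = params
    sigma = np.exp(log_sigma)          # parametrize via log to keep sigma positive
    resid = x[1:] - phi * x[:-1]       # one-step prediction errors
    n = resid.size
    return 0.5 * n * np.log(2 * np.pi * sigma**2) + 0.5 * np.sum(resid**2) / sigma**2

def fit_ar1(x):
    """Maximum-likelihood estimate of (phi, sigma) for an AR(1) noise model."""
    res = minimize(ar1_neg_log_likelihood, x0=[0.0, 0.0], args=(x,))
    phi, log_sigma = res.x
    return phi, np.exp(log_sigma)

# Simulate "process noise" with known phi = 0.8, sigma = 0.5, then recover them.
rng = np.random.default_rng(0)
x = np.zeros(2000)
for t in range(1, x.size):
    x[t] = 0.8 * x[t - 1] + rng.normal(scale=0.5)

phi_hat, sigma_hat = fit_ar1(x)
print(round(phi_hat, 2), round(sigma_hat, 2))
```

In a real setting the model form (AR order, moving-average terms) would first be chosen from the correlation analysis, as the abstract describes, before the parameters are estimated.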
153.
Marcos Faundez-Zanuy David A. Elizondo Miguel-Ángel Ferrer-Ballester Carlos M. Travieso-González 《Neural Processing Letters》2007,26(3):201-216
Biometric systems for individual authentication are increasingly becoming indispensable for protecting life and property. They provide ways of uniquely and reliably authenticating people and are difficult to counterfeit. Biometric authentication systems are currently used in the governmental, commercial, and public sectors. However, these systems can be expensive to put in place and often impose physical constraints on the users. This paper introduces an inexpensive, powerful, and easy-to-use hand-geometry-based biometric person authentication system using neural networks. The proposed system consists of an acquisition device, a pre-processing stage, and a neural network based classifier. One novelty of this work is the introduction of position-independent extraction and identification of hand-geometry features, which can be useful in problems related to image processing and pattern recognition. Another novelty is the use of error correction codes to enhance the performance of the neural network model. A dataset of scanned images of the right hand of fifty different people was created for this study. Identification rates and Detection Cost Function (DCF) values obtained with the system were evaluated, and several strategies for coding the outputs of the neural networks were studied. Experimental results show that, when using Error Correcting Output Codes (ECOC), identification rates of up to 100% and a DCF of 0% can be obtained. For comparison purposes, results are also given for the Support Vector Machine method.
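The ECOC idea can be sketched with off-the-shelf components. The example below is hypothetical: it uses scikit-learn's `OutputCodeClassifier` with logistic regression on a stock digits dataset rather than the paper's neural networks and hand-geometry data, but the principle is the same: each class is assigned a redundant binary codeword, and classification decodes to the nearest codeword.

```python
from sklearn.datasets import load_digits
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.multiclass import OutputCodeClassifier

X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Each of the 10 classes gets a random binary codeword; code_size controls
# redundancy (codeword length = code_size * n_classes), so errors of a few
# binary sub-classifiers can be corrected at decoding time.
ecoc = OutputCodeClassifier(
    estimator=LogisticRegression(max_iter=1000),
    code_size=2.0,
    random_state=0,
)
ecoc.fit(X_train, y_train)
print(round(ecoc.score(X_test, y_test), 2))
```

Increasing `code_size` trades training cost for error-correction capacity, which mirrors the performance gains the paper reports from ECOC.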
154.
In fuzzy logic, in the wider but also in the narrow sense, the problem of consistency has been discussed more or less occasionally from different points of view. Up to now, however, it has usually not been taken as really important. This may be partly caused by a rather restricted understanding of what is meant by consistency or inconsistency in fuzzy logic in the wider sense, e.g. in fuzzy control as one of its main fields of application. In this paper we deal with fuzzy theories in the realm of fuzzy logic in the narrow sense. Based on the fact that an "interactive", t-norm-based conjunction is usually considered together with the min-based conjunction, one has in a natural way two different consistency conditions. We look here at the weaker one.
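The two conjunctions mentioned above can be contrasted numerically. The product t-norm used below is a standard example of an interactive conjunction; the illustration (a sketch, not taken from the paper) shows why the two induced consistency conditions differ in strength.

```python
# Two conjunctions on fuzzy truth degrees in [0, 1].
def t_min(a, b):
    return min(a, b)        # Goedel (min-based) conjunction

def t_prod(a, b):
    return a * b            # product t-norm: a standard "interactive" conjunction

# Joint truth degree of a formula phi and its negation (1 - phi):
phi = 0.6
print(round(t_min(phi, 1 - phi), 2))    # min-based degree of "phi and not phi"
print(round(t_prod(phi, 1 - phi), 2))   # product-based degree, always <= the min-based one
```

Since the product value never exceeds the min value, a consistency condition that bounds the product-based degree of a contradiction is easier to satisfy, i.e. weaker, than the corresponding min-based condition.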
155.
José-Ceferino Ortega Alejandro Álvarez-Melcón Fernando D. Quesada 《Applied Artificial Intelligence》2013,27(5):323-350
The design of coupled resonator filters used in many telecommunication applications poses an optimization problem that can be tackled with heuristic methods. In many configurations, simple heuristic methods do not give satisfactory results, and combining local and global search methods in hybrid metaheuristics is a better approach. This article analyzes the systematic development of hybrid metaheuristic methods for the design of coupled resonator filters. Engineers normally use the MATLAB computing environment to work on the design of these devices, so the available MATLAB optimization toolboxes are used here as a basis for addressing those optimization problems. The results obtained are in general satisfactory, and the best results are obtained in the experiments with memetic algorithms, in which population-based methods (Genetic Algorithms and Scatter Search) are combined with local search methods that improve individuals in the population at different stages of the metaheuristic.
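The memetic pattern (global population search plus local refinement of individuals) can be sketched in a few lines. This is a generic illustration, not the article's MATLAB implementation; the sphere function is a hypothetical stand-in for a filter-response error.

```python
import random

def sphere(x):
    """Toy objective standing in for a filter-response error (minimize)."""
    return sum(v * v for v in x)

def local_search(x, step=0.1, iters=20):
    """Hill-climbing refinement of a single individual (the 'memetic' part)."""
    best = list(x)
    for _ in range(iters):
        cand = [v + random.uniform(-step, step) for v in best]
        if sphere(cand) < sphere(best):
            best = cand
    return best

def memetic(dim=4, pop_size=20, generations=40):
    random.seed(0)
    pop = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=sphere)
        elite = pop[: pop_size // 2]
        # Global search: blend-crossover of elite parents plus Gaussian mutation.
        children = []
        for _ in range(pop_size - len(elite)):
            a, b = random.sample(elite, 2)
            children.append([(u + v) / 2 + random.gauss(0, 0.2)
                             for u, v in zip(a, b)])
        # Local search applied to the elite individuals each generation.
        pop = [local_search(e) for e in elite] + children
    return min(pop, key=sphere)

best = memetic()
print(round(sphere(best), 3))
```

Swapping the toy objective for a circuit-simulator call, and the hill-climber for a gradient-based routine, recovers the structure the article studies.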
156.
David L. González-Álvarez Miguel A. Vega-Rodríguez 《The Journal of supercomputing》2013,66(3):1576-1612
When solving a wide range of complex scenarios of a given optimization problem, it is very difficult, if not impossible, to develop a single technique or algorithm that solves all of them adequately. In this case, it is necessary to combine several algorithms, applying the most appropriate one in each case. Parallel computing can be used to improve the quality of the solutions obtained in a cooperative algorithm model. Exchanging information between parallel cooperative algorithms alters their search behavior and may be more effective than a sequential metaheuristic. To demonstrate this, a parallel cooperative team of four multiobjective evolutionary algorithms based on OpenMP is proposed for solving different scenarios of the Motif Discovery Problem (MDP), an important real-world problem in the biological domain. As we will see, the results show that a properly configured parallel cooperative team achieves high-quality solutions for the addressed problem, improving on those achieved by the algorithms executed independently for a much longer time.
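The cooperative-team idea can be sketched sequentially. The paper uses OpenMP threads and four multiobjective evolutionary algorithms; in this hypothetical miniature, hill-climbers with different step sizes stand in for the heterogeneous team members, and broadcasting the best solution after each round models the information exchange.

```python
import random

def objective(x):
    """Toy score standing in for motif quality (higher is better)."""
    return -sum((v - 3.0) ** 2 for v in x)

def perturb(x, scale):
    return [v + random.gauss(0, scale) for v in x]

def cooperative_team(n_algos=4, dim=3, rounds=30, iters=25):
    """Each 'algorithm' searches with its own step size; after every round
    the globally best solution migrates to every team member."""
    random.seed(1)
    scales = [1.0, 0.5, 0.2, 0.05][:n_algos]     # heterogeneous behaviours
    solutions = [[random.uniform(-10, 10) for _ in range(dim)]
                 for _ in range(n_algos)]
    for _ in range(rounds):
        for i in range(n_algos):
            for _ in range(iters):
                cand = perturb(solutions[i], scales[i])
                if objective(cand) > objective(solutions[i]):
                    solutions[i] = cand
        # Cooperation step: exchange information by broadcasting the best.
        best = max(solutions, key=objective)
        solutions = [list(best) for _ in range(n_algos)]
    return max(solutions, key=objective)

best = cooperative_team()
print(round(objective(best), 3))
```

The coarse-step member makes fast early progress and the fine-step member refines the shared solution later, which is the complementarity the cooperative model exploits.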
157.
Ákos Szőke András Förhécz Gábor Kőrösi György Strausz 《Artificial Intelligence and Law》2013,21(4):485-519
Regulations affect every aspect of our lives. Compliance with regulations impacts citizens and businesses alike: they have to find their rights and obligations in a complex legal environment. The situation is further complicated when different languages and time versions of regulations must be considered. To meet these demands, we present a semantic enrichment approach which aims at (1) decreasing the ambiguity of legal texts, (2) increasing the probability of finding the relevant legal materials, and (3) facilitating the application of legal reasoners. Our approach is implemented both as a service for citizens and businesses and as a modeling environment for legal drafters. To evaluate the usefulness of the approach, a case study was carried out in a large organization, applied to corporate regulations and Hungarian laws. The results suggest that the approach can support these aims.
158.
Alejandro Montes-García Jose María Álvarez-Rodríguez Jose Emilio Labra-Gayo Marcos Martínez-Merino 《Expert systems with applications》2013,40(17):6735-6741
The present paper introduces a context-aware recommendation system for journalists that enables the identification of similar topics across different sources. More specifically, a journalist-oriented recommendation system that can be automatically configured is presented to exploit news according to expert preferences. Contextual features of news are also taken into account due to their special nature: time, current user interests, location, and existing trends are combined with traditional recommendation techniques to provide an adaptive framework that deals with heterogeneous data, yielding an enhanced collaborative filtering system. Since the Wesomender approach is able to generate context-aware recommendations in the journalism field, a quantitative evaluation comparing Wesomender's results with the expectations of a team of experts is also performed. It shows that a context-aware adaptive recommendation engine can fulfil the needs of journalists' daily work when timely, primary information must be retrieved.
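One contextual ingredient, time, can be folded into collaborative filtering by decaying old interactions. The sketch below is not Wesomender: the interaction log, half-life, and scoring rule are invented to illustrate how recency reweights a user profile before neighbour-based recommendation.

```python
import math

# Toy interaction log: (user, topic, days_ago). Recent reads should count more.
log = [
    ("ana", "politics", 1), ("ana", "economy", 2), ("ana", "sports", 30),
    ("ben", "politics", 1), ("ben", "economy", 1),
    ("eva", "sports", 2), ("eva", "culture", 3),
]

def recency_weight(days_ago, half_life=7.0):
    """Exponential time decay: a topic read `half_life` days ago counts half."""
    return 0.5 ** (days_ago / half_life)

def user_profile(user):
    prof = {}
    for u, topic, days in log:
        if u == user:
            prof[topic] = prof.get(topic, 0.0) + recency_weight(days)
    return prof

def cosine(p, q):
    dot = sum(p.get(k, 0.0) * q.get(k, 0.0) for k in p)
    na = math.sqrt(sum(v * v for v in p.values()))
    nb = math.sqrt(sum(v * v for v in q.values()))
    return dot / (na * nb) if na and nb else 0.0

def recommend(user):
    """Score unseen topics by similarity-weighted votes of the other users."""
    me = user_profile(user)
    scores = {}
    for other in {u for u, _, _ in log} - {user}:
        sim = cosine(me, user_profile(other))
        for topic, w in user_profile(other).items():
            if topic not in me:
                scores[topic] = scores.get(topic, 0.0) + sim * w
    return max(scores, key=scores.get) if scores else None

print(recommend("ana"))
```

Because ana's month-old "sports" read is decayed almost to zero, her profile is dominated by current interests, which is the adaptive behaviour the framework aims for.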
159.
F. Segovia J.M. Górriz J. Ramírez D. Salas-González I. Álvarez 《Expert systems with applications》2013,40(2):677-683
An accurate and early diagnosis of Alzheimer's disease (AD) is of fundamental importance for the patient's medical treatment. It has been shown that pathological manifestations of AD may be detected through functional images even before the patient becomes symptomatic. This fact has led researchers to propose new ways of analyzing functional data in order to build more accurate Computer-Aided Diagnosis (CAD) systems for this disorder. In this paper we present an effective approach to Single Photon Emission Computed Tomography (SPECT) feature extraction that improves the accuracy of CAD systems for AD. The proposed methodology uses a Partial Least Squares algorithm to extract score vectors and the Out-of-Bag error to select the number of scores used as features. Subsequently, a Support Vector Machine (SVM) based classifier determines the underlying class of the images, thus performing the diagnosis. In order to test this approach we used an image database with 97 SPECT images from controls and AD patients. The images were visually labeled by experienced clinicians after proper normalization. Several experiments were carried out to compare the proposed methodology with previous approaches. The results show that our method yields accuracy rates over 90%, outperforming several recently reported CAD systems for AD diagnosis.
160.
Antonio Martínez-Álvarez Felipe Restrepo-Calle Luis Alberto Vivas Tejuelo Sergio Cuenca-Asensi 《Expert systems with applications》2013,40(17):6813-6822
The design of fault-tolerant systems is gaining importance in large domains of embedded applications where design constraints are as important as reliability. New software techniques, based on the selective application of redundancy, have shown remarkable fault coverage with reduced costs and overheads. However, the large number of different solutions provided by these techniques, and the costly process of assessing their reliability, make design space exploration a very difficult and time-consuming task. This paper proposes the integration of a multi-objective optimization tool with a software hardening environment to perform an automatic design space exploration in search of the best trade-offs between reliability, cost, and performance. The first tool is driven by a genetic algorithm that can simultaneously fulfill many design goals thanks to the use of the NSGA-II multi-objective algorithm. The second is a compiler-based infrastructure that automatically produces selectively protected (hardened) versions of the software and generates accurate overhead reports and fault coverage estimations. The advantages of our proposal are illustrated by means of a complex and detailed case study involving a typical embedded application, the AES (Advanced Encryption Standard).
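The multi-objective selection at the heart of NSGA-II rests on Pareto dominance. A minimal sketch, with hypothetical (execution-time overhead, 1 - fault coverage) pairs for candidate hardened program versions, both objectives minimised:

```python
def dominates(a, b):
    """a dominates b if it is no worse in every objective and strictly
    better in at least one (all objectives minimised)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    """Rank-1 set of NSGA-II's non-dominated sorting: the trade-off curve."""
    return [p for p in points if not any(dominates(q, p) for q in points)]

# Hypothetical candidates: (overhead factor, 1 - fault coverage).
candidates = [(1.10, 0.40), (1.30, 0.15), (1.60, 0.05),
              (1.35, 0.20), (2.00, 0.05)]
front = pareto_front(candidates)
print(front)
```

Here (1.35, 0.20) is eliminated because (1.30, 0.15) is cheaper and better covered, and (2.00, 0.05) loses to (1.60, 0.05); the survivors are exactly the reliability/performance trade-offs the designer must choose among.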