Full-text access type
Paid full text | 1212 papers |
Free | 91 papers |
Free (domestic) | 2 papers |
Subject categories
Electrical engineering | 21 papers |
Chemical industry | 347 papers |
Metalworking | 25 papers |
Machinery & instruments | 39 papers |
Building science | 36 papers |
Mining engineering | 1 paper |
Energy & power engineering | 63 papers |
Light industry | 139 papers |
Hydraulic engineering | 9 papers |
Petroleum & natural gas | 3 papers |
Radio & electronics | 106 papers |
General industrial technology | 223 papers |
Metallurgical industry | 36 papers |
Nuclear technology | 4 papers |
Automation technology | 253 papers |
Publication year
2024 | 10 papers |
2023 | 29 papers |
2022 | 77 papers |
2021 | 82 papers |
2020 | 44 papers |
2019 | 64 papers |
2018 | 53 papers |
2017 | 54 papers |
2016 | 58 papers |
2015 | 49 papers |
2014 | 45 papers |
2013 | 79 papers |
2012 | 90 papers |
2011 | 100 papers |
2010 | 73 papers |
2009 | 72 papers |
2008 | 63 papers |
2007 | 67 papers |
2006 | 25 papers |
2005 | 30 papers |
2004 | 31 papers |
2003 | 14 papers |
2002 | 18 papers |
2001 | 9 papers |
2000 | 3 papers |
1999 | 9 papers |
1998 | 11 papers |
1997 | 4 papers |
1996 | 8 papers |
1995 | 3 papers |
1994 | 2 papers |
1993 | 3 papers |
1992 | 4 papers |
1990 | 2 papers |
1988 | 1 paper |
1987 | 3 papers |
1986 | 1 paper |
1985 | 1 paper |
1984 | 3 papers |
1982 | 3 papers |
1981 | 3 papers |
1979 | 1 paper |
1976 | 1 paper |
1975 | 1 paper |
1972 | 1 paper |
1957 | 1 paper |
1305 query results in total (search time: 15 ms)
61.
Henrique Rocha, Cesar Couto, Cristiano Maffort, Rogel Garcia, Clarisse Simoes, Leonardo Passos, Marco Tulio Valente 《Software Quality Journal》2013, 21(4): 529-549
Despite the relevance of the software evolution phase, there are few characterization studies on recurrent evolution growth patterns and on their impact on software properties such as coupling and cohesion. In this paper, we report a study designed to investigate whether the software evolution categories proposed by Lanza can be used to explain the growth of a system not only in terms of lines of code (LOC), but also in terms of metrics from the Chidamber and Kemerer (CK) object-oriented metrics suite. Our results show that high levels of recall (ranging on average from 52 % to 72 %) are achieved when using LOC to predict the evolution of coupling and size. For cohesion, we achieved smaller recall rates (<27 % on average).
62.
Estevez, L., Kehtarnavaz, N., and Wendt, R. III, Interactive Selective and Adaptive Clustering for Detection of Microcalcifications in Mammograms, Digital Signal Processing 6 (1996), 224–232.
This paper presents a clustering algorithm, called interactive selective and adaptive clustering (Isaac), to assist radiologists in looking for small clusters of microcalcifications in mammograms. Isaac was developed to identify suspicious microcalcification regions that are missed by other classification techniques because of false positive samples in the feature space. It comprises two parts: (i) selective clustering and (ii) interactive adaptation. The first part reduces the number of false positives by identifying the microcalcification subspace or domains in the feature space. The second part allows the radiologist to improve the results by interactively identifying additional false positive or true negative samples. Clinical evaluations of mammograms indicate the potential of this algorithm as an effective tool for bringing microcalcification areas to the radiologist's attention during a routine reading session.
63.
Enrique J. Fernandez-Sanchez, Leonardo Rubio, Javier Diaz, Eduardo Ros 《Machine Vision and Applications》2014, 25(5): 1211-1225
Background subtraction consists of segmenting moving objects in a video captured by a static camera. It is typically performed using color information, but this leads to wrong estimations due to perspective and illumination issues. We show that multimodal approaches based on the integrated use of color and depth cues produce more accurate and robust results than either data source used independently. Depth is less affected by issues such as shadows or foreground objects that resemble the background. However, objects close to the background may go undetected when only range information is used, with color information being complementary in those cases. We propose an extension of a well-known background subtraction technique that fuses range and color information, as well as a post-processing mask-fusion stage to get the best of each feature. We evaluated the proposed method on a well-defined dataset with different disparity estimation algorithms, showing the benefits of our method for fusing color and depth cues.
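The per-cue mask-fusion idea can be illustrated with a minimal sketch (NumPy only; the function name and the plain OR-combination rule are illustrative assumptions, not the paper's exact post-processing stage):

```python
import numpy as np

def fuse_masks(color_mask: np.ndarray, depth_mask: np.ndarray) -> np.ndarray:
    """Combine per-pixel foreground masks from color and depth cues.

    Depth tends to miss objects lying close to the background, while
    color fails under shadows and camouflage, so a union keeps the
    detections that either cue finds (a simplistic stand-in for the
    paper's mask-fusion stage).
    """
    return np.logical_or(color_mask, depth_mask)

# Toy 2x2 frame: color flags a shadowed pixel that depth rejects,
# depth confirms one true foreground pixel that color also sees.
color = np.array([[True, False], [True, True]])
depth = np.array([[False, False], [True, False]])
fused = fuse_masks(color, depth)
```

In a real pipeline each mask would come from a background model (e.g., per-pixel Gaussians) over its own channel, and the fusion rule would be tuned per dataset.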
64.
Leonardo A. de Andrade, Matheus R. U. Zingarelli, Rodolfo R. Silva, Rudinei Goularte 《Multimedia Tools and Applications》2014, 71(3): 1673-1697
This paper presents a new spatial compression method specifically designed for stereo videos. Different from current compressors, which simply apply known 2D compression techniques, the method proposed here takes into account the components of the spatial compression process that may affect correct depth visualization, namely Chrominance Subsampling, Discrete Wavelet Transform (DWT), and Quantization. Each component was evaluated by analyzing where data losses occur and proposing ways to balance compression ratio against image quality, minimizing losses in depth perception. The evaluations used standard objective (PSNR) and subjective (DSCQS) metrics, applied to an anaglyphic stereoscopic video base. The results showed that our method is competitive in compression rate and provides superior image quality.
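As a hedged illustration of the first component the abstract names, here is a minimal 4:2:0-style chrominance subsampling sketch (the 2x2 block-averaging rule is a common choice, not necessarily the exact variant the authors evaluated):

```python
import numpy as np

def subsample_chroma_420(chroma: np.ndarray) -> np.ndarray:
    """4:2:0-style chroma subsampling: average each 2x2 block,
    halving both chroma dimensions (luma would stay full size).
    Assumes even height and width."""
    h, w = chroma.shape
    blocks = chroma.reshape(h // 2, 2, w // 2, 2)
    return blocks.mean(axis=(1, 3))

# A 2x4 chroma plane collapses to a single row of block averages.
chroma = np.array([[10, 20, 30, 40],
                   [10, 20, 30, 40]], dtype=float)
small = subsample_chroma_420(chroma)
```

For stereo content, the paper's point is precisely that such loss-introducing steps must be tuned so the left/right discrepancies they create do not disturb depth perception.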
65.
This paper presents the STALKER knowledge base refinement system. Like its predecessor KRUST, STALKER proposes many alternative refinements to correct the classification of each wrongly classified example in the training set. However, there are two principal differences between KRUST and STALKER. Firstly, the range of misclassified examples handled by KRUST has been augmented by the introduction of inductive refinement operators. Secondly, STALKER's testing phase has been greatly sped up by using a Truth Maintenance System (TMS). The resulting system is more effective than other refinement systems because it generates many alternative refinements. At the same time, STALKER is very efficient, since KRUST's computationally expensive implementation and testing of refined knowledge bases has been replaced by a TMS-based simulator.
66.
Anna Grazia Mignani, Peter R. Smith, Leonardo Ciaccheri, Antonio Cimato, Graziano Sani 《Sensors and actuators. B, Chemical》2003, 90(1-3): 157-162
In spectral nephelometry, absorption spectroscopy and nephelometry are combined innovatively to provide simultaneous online monitoring of the color and turbidity of edible oils. The instrumentation consists of an optoelectronic device that measures the absorption spectrum of the oil sample at different angles. Data processing, carried out by principal component analysis (PCA), allows identification of the oil sample and creates a two-dimensional map as a fingerprint of oil types.
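The PCA step can be sketched generically; the random matrix below is only a stand-in for multi-angle absorption spectra, and `pca_2d` is an illustrative name, not the authors' code:

```python
import numpy as np

def pca_2d(spectra: np.ndarray) -> np.ndarray:
    """Project each row (one absorption spectrum per sample) onto the
    first two principal components, yielding the 2-D fingerprint map."""
    centered = spectra - spectra.mean(axis=0)
    # SVD of the centered data; rows of vt are the principal axes.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return centered @ vt[:2].T

rng = np.random.default_rng(0)
spectra = rng.normal(size=(8, 50))   # 8 oil samples x 50 wavelengths
coords = pca_2d(spectra)             # one (x, y) point per sample
```

Plotting `coords` gives the kind of two-dimensional map the abstract describes, where samples of the same oil type would cluster together.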
67.
Rodrigo Queiroz, Leonardo Passos, Marco Tulio Valente, Claus Hunsen, Sven Apel, Krzysztof Czarnecki 《Software and Systems Modeling》2017, 16(1): 77-96
Feature annotations (e.g., code fragments guarded by #ifdef C-preprocessor directives) control code extensions related to features. Feature annotations have long been said to be undesirable. When maintaining features that control many annotations, there is a high risk of ripple effects. Also, excessive use of feature annotations leads to code clutter, hinders program comprehension, and complicates maintenance. To prevent such problems, developers should monitor the use of feature annotations, for example, by setting acceptable thresholds. Interestingly, little is known about how to extract thresholds in practice, and which values are representative for feature-related metrics. To address this issue, we analyze the statistical distribution of three feature-related metrics collected from a corpus of 20 well-known and long-lived C-preprocessor-based systems from different domains. We consider three metrics: scattering degree of feature constants, tangling degree of feature expressions, and nesting depth of preprocessor annotations. Our findings show that feature scattering is highly skewed; in 14 systems (70 %), the scattering distributions match a power law, making averages and standard deviations unreliable limits. Regarding tangling and nesting, the values tend to follow a uniform distribution; although outliers exist, they have little impact on the mean, suggesting that central statistical measures are reliable thresholds for tangling and nesting. Following our findings, we then propose thresholds from our benchmark data as a basis for further investigations.
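A percentile-based threshold of the kind the abstract motivates can be sketched as follows (toy data; the 90th percentile is an illustrative choice, not a value from the paper):

```python
import numpy as np

def percentile_threshold(values: np.ndarray, pct: float = 90.0) -> float:
    """Derive a metric threshold from a high percentile. Unlike
    mean + k*std, a percentile stays meaningful when the metric
    distribution is heavily skewed (e.g., power-law-like scattering
    degrees, where a few features dominate)."""
    return float(np.percentile(values, pct))

# Skewed toy sample: most feature constants barely scatter, one dominates.
scattering = np.array([1, 1, 1, 1, 2, 2, 3, 5, 8, 40], dtype=float)
thr = percentile_threshold(scattering, 90.0)
naive = scattering.mean() + scattering.std()   # inflated by the outlier
```

For the roughly uniform tangling and nesting metrics, the abstract suggests that mean-based limits remain reliable; the percentile approach is only needed where the distribution is skewed.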
68.
Lukas Weber, Tobias Vinçon, Christian Knödler, Leonardo Solis-Vasquez, Arthur Bernhardt, Ilia Petrov, Andreas Koch 《Distributed and Parallel Databases》2022, 40(1): 27-45
Massive data transfers in modern data-intensive systems, resulting from low data-locality and data-to-code system design, hurt their performance and scalability. …
69.
Bruno Campello de Souza, Leonardo Xavier de Lima e Silva, Antonio Roazzi 《Computers in human behavior》2010
The present paper attempts to empirically study the cognitive impacts of Massive Multiplayer Online Role-Playing Games (MMORPGs) in uncontrolled contexts, in light of the Cognitive Mediation Networks Theory, a new model of human intelligence that aims to explain cognition as the result of brain activity combined with the information processing done by external structures such as tools, social groups, and culture. A sample of 1280 Brazilian high school students answered a form inquiring about socio-demographic information and the use of computer games, and also took a short knowledge exam and a brief psychometric test. The findings indicate that, owing to their underlying structure and sociocultural nature, MMORPGs are associated with a greater level of insertion into the Digital Age, higher levels of logical-numerical performance, and better scholastic ability. Finally, suggestions are made for future studies on the subject.
70.
We provide a discussion of bounded-rationality learning beyond traditional learning mechanisms, i.e., Recursive Ordinary Least Squares and Bayesian Learning. These mechanisms lack, for many reasons, a behavioral interpretation and, following Simon's criticism, appear to be substantively rational. In this paper, analyzing the Cagan model, we explore two learning mechanisms that appear more plausible from a behavioral point of view and somewhat procedurally rational: Least Mean Squares learning for linear models and Back Propagation for Artificial Neural Networks. Both algorithms seek a minimum of the variance of the forecasting error by means of a steepest-descent gradient procedure. The analysis of the Cagan model shows an interesting result: non-convergence of learning to the Rational Expectations Equilibrium is not due to the restriction to linear learning devices; Back Propagation learning for Artificial Neural Networks may also fail to converge to the Rational Expectations Equilibrium of the model.
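The steepest-descent rule described above can be sketched as a one-weight Least Mean Squares update (illustrative code, not the authors' implementation; `mu` is an assumed learning-rate name):

```python
import numpy as np

def lms_step(w: np.ndarray, x: np.ndarray, target: float, mu: float = 0.1):
    """One Least Mean Squares update: a gradient step that reduces the
    squared forecast error (target - w.x)**2, i.e. the steepest-descent
    idea the abstract attributes to both LMS and back-propagation."""
    error = target - w @ x
    return w + mu * error * x, error

# Learn the trivial linear forecast y = 2*x from repeated noise-free data.
w = np.zeros(1)
for _ in range(50):
    w, err = lms_step(w, np.array([1.0]), 2.0, mu=0.5)
```

In this toy setting the weight converges to the true coefficient; the paper's point is that in the Cagan model such gradient learners, linear or neural, need not converge to the Rational Expectations Equilibrium.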