Sort order: 10,000 results found; search took 15 ms.
991.
Erick Corrêa da Silva, Aristófanes Corrêa Silva, Anselmo Cardoso de Paiva, Rodolfo Acatauassu Nunes 《Pattern Analysis & Applications》2008,11(1):89-99
This paper analyzes the application of Moran's index and Geary's coefficient to the characterization of lung nodules as malignant or benign in computerized tomography images. The characterization method first uses stepwise discriminant analysis to determine which combination of the proposed measures best discriminates between benign and malignant nodules. A linear discriminant analysis procedure was then performed on the selected features to evaluate their ability to predict the classification of each nodule. To verify this application, we also describe tests carried out on a sample of 36 nodules: 29 benign and 7 malignant. A leave-one-out procedure was used to provide a less biased estimate of the linear discriminator's performance. The two analyzed functions and their combinations provided accuracy above 90% and an area under the receiver operating characteristic (ROC) curve above 0.85, which indicates promising potential for use as nodule signature measures. These preliminary results are very encouraging for characterizing nodules using the two functions presented.
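The two spatial autocorrelation statistics used above have standard textbook definitions. As a rough sketch (not the paper's implementation; the function names and toy weight matrix are illustrative), Moran's I and Geary's C can be computed from a value vector and a spatial weight matrix as follows:

```python
import numpy as np

def morans_i(x, w):
    """Moran's I: n/W * sum_ij w_ij z_i z_j / sum_i z_i^2, with z = x - mean(x)."""
    x = np.asarray(x, dtype=float)
    z = x - x.mean()
    num = (z[:, None] * z[None, :] * w).sum()
    return len(x) / w.sum() * num / (z ** 2).sum()

def gearys_c(x, w):
    """Geary's C: (n-1)/(2W) * sum_ij w_ij (x_i - x_j)^2 / sum_i z_i^2."""
    x = np.asarray(x, dtype=float)
    diff2 = (x[:, None] - x[None, :]) ** 2
    z2 = ((x - x.mean()) ** 2).sum()
    return (len(x) - 1) * (w * diff2).sum() / (2 * w.sum() * z2)
```

For positively autocorrelated data, Moran's I tends toward +1 and Geary's C toward 0; the paper feeds such statistics, computed over nodule voxels, into the discriminant analysis.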
992.
António V. Sousa, Ana Maria Mendonça, Aurélio Campilho 《Pattern Analysis & Applications》2008,11(3-4):409-423
This paper proposes a non-parametric method for the classification of thin-layer chromatographic (TLC) images from patterns represented in a dissimilarity space. Each pattern corresponds to a Gaussian-mixture approximation of the intensity profile. The methodology comprises several phases, including image processing and analysis steps to extract the chromatographic profiles and a classification phase to discriminate between two groups, one corresponding to normal cases and the other to three pathological classes. We present an extensive study of several dissimilarity-based approaches, analysing the influence of the dissimilarity measure and the prototype selection method on classification performance. The main conclusions are that the Match and Profile-difference dissimilarity measures give the best results, and that a new prototype selection methodology achieves performance similar to, or even better than, conventional methods. We also conclude that the simplest classifiers, such as k-NN and linear discriminant classifiers (LDCs), perform well, with an overall classification error below 10% for the four-class problem.
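The dissimilarity-space idea above can be sketched in a few lines: represent each pattern by its vector of dissimilarities to a set of prototypes, then run an ordinary classifier in that vector space. This is a minimal illustration under assumed names (it is not the paper's pipeline, and the toy scalar dissimilarity stands in for the Match or Profile-difference measures):

```python
import numpy as np

def to_dissimilarity_space(X, prototypes, d):
    """Embed each pattern as its vector of dissimilarities to the prototypes."""
    return np.array([[d(x, p) for p in prototypes] for x in X])

def knn_predict(D_train, y_train, D_test, k=3):
    """k-nearest-neighbour majority vote, Euclidean distance in the embedding."""
    preds = []
    for row in D_test:
        nearest = np.argsort(np.linalg.norm(D_train - row, axis=1))[:k]
        labels, counts = np.unique(y_train[nearest], return_counts=True)
        preds.append(labels[np.argmax(counts)])
    return np.array(preds)
```

The prototype selection step studied in the paper amounts to choosing which training patterns serve as the `prototypes` columns of this embedding.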
993.
Johan Montagnat, Ákos Frohner, Daniel Jouvenot, Christophe Pera, Peter Kunszt, Birger Koblitz, Nuno Santos, Charles Loomis, Romain Texier, Diane Lingrand, Patrick Guio, Ricardo Brito Da Rocha, Antonio Sobreira de Almeida, Zoltán Farkas 《Journal of Grid Computing》2008,6(1):45-59
The medical community produces and manipulates a tremendous volume of digital data for which computerized archiving, processing and analysis are needed. Grid infrastructures are promising for dealing with the challenges arising in computerized medicine, but manipulating medical data on such infrastructures faces both the problem of interconnecting medical information systems to Grid middlewares and that of preserving patients' privacy in a wide, distributed multi-user system. These constraints often limit the use of Grids for manipulating sensitive medical data. This paper describes our design of a medical data management system that takes advantage of the advanced gLite data management services, developed in the context of the EGEE project, to fulfill the stringent needs of the medical community. It ensures medical data protection through strict data access control, anonymization and encryption. The multi-level access control provides the flexibility needed to implement complex medical use-cases. Data anonymization prevents the exposure of most sensitive data to unauthorized users, and data encryption guarantees data protection even when it is stored at remote sites. Moreover, the developed prototype provides a Grid storage resource manager (SRM) interface to standard medical DICOM servers, thereby enabling transparent access to medical data without interfering with medical practice.
994.
Dolors Costal, Cristina Gómez, Anna Queralt, Ruth Raventós, Ernest Teniente 《Software and Systems Modeling》2008,7(4):469-486
An important aspect of the specification of conceptual schemas is the definition of general constraints that cannot be expressed by the predefined constructs provided by conceptual modeling languages. This is generally achieved using general-purpose languages like OCL. In this paper we propose a new approach that facilitates the definition of such general constraints in UML. More precisely, we define a profile that extends the set of predefined UML constraints by adding certain types of constraints that are commonly used in conceptual schemas. We also show how our proposal facilitates reasoning about the constraints and their automatic code generation, study the application of our ideas to the specification of two real-life applications, and present a prototype tool implementation.
995.
Simulations of extensional flow in microrheometric devices (cited 1 time: 0 self-citations, 1 other)
Mónica S. N. Oliveira, Lucy E. Rodd, Gareth H. McKinley, Manuel A. Alves 《Microfluidics and nanofluidics》2008,5(6):809-826
We present a detailed numerical study of the flow of a Newtonian fluid through microrheometric devices featuring a sudden contraction–expansion. This flow configuration is typically used to generate extensional deformations and high strain rates. The excess pressure drop resulting from the converging and diverging flow is an important dynamic measure to quantify if the device is intended to be used as a microfluidic extensional rheometer. To explore this idea, we examine the effect of the contraction length, aspect ratio and Reynolds number on the flow kinematics and the resulting pressure field. Analysis of the computed velocity and pressure fields shows that, for typical experimental conditions used in microfluidic devices, the steady flow is highly three-dimensional, with open spiraling vortical structures in the stagnant corner regions. The numerical simulations of the local kinematics and global pressure drop are in good agreement with experimental results. The device aspect ratio is shown to have a strong impact on the flow and consequently on the excess pressure drop, which is quantified in terms of the dimensionless Couette and Bagley correction factors. We suggest an approach for calculating the Bagley correction which may be especially appropriate for planar microchannels.
Electronic supplementary material: the online version of this article (doi:) contains supplementary material, which is available to authorized users.
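For orientation, the quantities named above follow standard definitions from contraction-flow rheometry; this is a sketch under that assumption (the paper's exact normalization may differ, and the function names are illustrative): the excess pressure drop is the measured total minus the fully developed contributions upstream and downstream of the contraction, and the Couette correction is that excess made dimensionless with twice the wall shear stress.

```python
def excess_pressure_drop(dp_total, dp_fd_upstream, dp_fd_downstream):
    """Excess pressure drop across the contraction-expansion:
    measured total minus the fully developed channel contributions."""
    return dp_total - dp_fd_upstream - dp_fd_downstream

def couette_correction(dp_excess, tau_wall):
    """Dimensionless Couette correction: C = dP_excess / (2 * tau_wall)."""
    return dp_excess / (2.0 * tau_wall)
```

For example, a measured drop of 100 (arbitrary pressure units) with fully developed contributions of 30 each gives an excess of 40, and a Couette correction of 2.0 at a wall shear stress of 10.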
996.
The effect of illuminance on the speed and quality (percentage of errors) with which workers assemble electronic devices was studied in an electronics factory in the Netherlands. For the study, the horizontal illuminance was alternated per work shift between 800 and 1200 lux. The first test was done during the summer and a second during the winter. A significant effect of illuminance was found: with 1200 lux at the working plane, the speed of production was 2.9% higher in the summer and 3.1% higher in the winter than with 800 lux. There was no significant effect of illuminance on the percentage of errors.
997.
998.
Gaussian mean-shift is an EM algorithm (cited 2 times: 0 self-citations, 2 others)
Carreira-Perpiñán MA 《IEEE transactions on pattern analysis and machine intelligence》2007,29(5):767-776
The mean-shift algorithm, based on ideas proposed by Fukunaga and Hostetler, is a hill-climbing algorithm on the density defined by a finite mixture or a kernel density estimate. Mean-shift can be used as a nonparametric clustering method and has attracted recent attention in computer vision applications such as image segmentation and tracking. We show that, when the kernel is Gaussian, mean-shift is an expectation-maximization (EM) algorithm and, when the kernel is non-Gaussian, mean-shift is a generalized EM algorithm. This implies that mean-shift converges from almost any starting point and that, in general, its convergence is of linear order. For Gaussian mean-shift, we show: 1) the rate of linear convergence approaches 0 (superlinear convergence) for very narrow or very wide kernels, but is often close to 1 (and thus extremely slow) for intermediate widths, and is exactly 1 (sublinear convergence) for widths at which modes merge; 2) the iterates approach the mode along the local principal component of the data points, from the inside of the convex hull of the data points; and 3) the convergence domains are nonconvex, can be disconnected, and show fractal behavior. We suggest ways of accelerating mean-shift based on the EM interpretation.
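The Gaussian mean-shift update discussed above is compact enough to show in full. Each iteration computes kernel responsibilities of the data points (the E-step, in the paper's EM reading) and moves the query point to their weighted mean (the M-step). A minimal sketch, with an assumed bandwidth parameter `h`:

```python
import numpy as np

def gaussian_mean_shift(x, data, h=1.0, iters=200, tol=1e-8):
    """Run the Gaussian mean-shift iteration from starting point x.

    E-step: weights w_i = exp(-||x - x_i||^2 / (2 h^2))
    M-step: x <- sum_i w_i x_i / sum_i w_i
    """
    x = np.asarray(x, dtype=float)
    data = np.asarray(data, dtype=float)
    for _ in range(iters):
        w = np.exp(-np.sum((data - x) ** 2, axis=1) / (2 * h ** 2))
        x_new = w @ data / w.sum()
        if np.linalg.norm(x_new - x) < tol:
            return x_new
        x = x_new
    return x
```

Started near one of two well-separated tight clusters, the iterate climbs to the local mode of the kernel density estimate, consistent with the convergence guarantees the paper derives.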
999.
Asunción Vicente M, Hoyer PO, Hyvärinen A 《IEEE transactions on pattern analysis and machine intelligence》2007,29(5):896-900
Recently, a number of empirical studies have compared the performance of PCA and ICA as feature extraction methods in appearance-based object recognition systems, with mixed and seemingly contradictory results. In this paper, we briefly describe the connection between the two methods and argue that whitened PCA may yield results identical to ICA in some cases. Furthermore, we describe the specific situations in which ICA might significantly improve on PCA.
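The connection the abstract refers to rests on whitening: ICA is commonly computed as whitening followed by a rotation, so whenever that rotation has no effect on the downstream classifier, whitened PCA and ICA coincide. A minimal sketch of PCA whitening via the SVD (illustrative, not the paper's code):

```python
import numpy as np

def whiten_pca(X, n_components=None):
    """Whitened PCA: project centred data onto its principal axes and
    rescale so the result has identity sample covariance."""
    Xc = X - X.mean(axis=0)
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)  # rows of Vt are the axes
    if n_components is not None:
        U = U[:, :n_components]
    return U * np.sqrt(len(X) - 1)
```

Because the whitened scores have identity covariance, any further orthogonal transform (such as an ICA rotation) leaves Euclidean distances between features unchanged, which is the heart of the equivalence argument.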
1000.
Formal translations constitute a suitable framework for dealing with many problems in pattern recognition and computational linguistics. The application of formal transducers to these areas requires a stochastic extension for dealing with noisy, distorted patterns with high variability. In this paper, some estimation criteria are proposed and developed for the parameter estimation of regular syntax-directed translation schemata. These criteria are: maximum likelihood estimation, minimum conditional entropy estimation and conditional maximum likelihood estimation. The last two criteria were proposed to deal with situations in which training data is sparse. The criteria take into account the possibility of ambiguity in the translations, i.e., there can be different output strings for a single input string. In this case, the final goal of the stochastic framework is to find the highest-probability translation of a given input string. The criteria were tested on a translation task with a high degree of ambiguity.
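Of the criteria listed above, maximum likelihood estimation has the simplest form for fully observed derivations: each rule probability is the relative frequency of that rule among all rules sharing its left-hand side. This sketch illustrates that idea on plain production rules (the paper's schemata pair input and output rules, so this is a simplification with assumed names):

```python
from collections import Counter

def mle_rule_probs(derivations):
    """Relative-frequency MLE: P(rule | lhs) = count(rule) / count(lhs).

    Each derivation is a list of (lhs, rhs) rule applications."""
    rule_counts = Counter()
    lhs_counts = Counter()
    for deriv in derivations:
        for lhs, rhs in deriv:
            rule_counts[(lhs, rhs)] += 1
            lhs_counts[lhs] += 1
    return {rule: c / lhs_counts[rule[0]] for rule, c in rule_counts.items()}
```

The conditional criteria proposed for sparse data modify the objective rather than this counting scheme, reweighting rules toward those that best predict the output string given the input.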