Full-text access: 3683 paid articles, 183 free articles, 4 free within China
Distribution by subject:
Electrical engineering: 24 articles
General: 2 articles
Chemical industry: 730 articles
Metalworking: 128 articles
Machinery and instruments: 79 articles
Building science: 195 articles
Mining engineering: 8 articles
Energy and power: 116 articles
Light industry: 184 articles
Water conservancy engineering: 45 articles
Petroleum and natural gas: 29 articles
Radio and electronics: 274 articles
General industrial technology: 773 articles
Metallurgical industry: 739 articles
Atomic energy technology: 16 articles
Automation technology: 528 articles
Distribution by year (articles):
2023: 37, 2022: 54, 2021: 105, 2020: 77, 2019: 91, 2018: 95, 2017: 89, 2016: 95, 2015: 84, 2014: 125
2013: 232, 2012: 177, 2011: 253, 2010: 166, 2009: 160, 2008: 178, 2007: 163, 2006: 138, 2005: 103, 2004: 87
2003: 94, 2002: 69, 2001: 52, 2000: 55, 1999: 43, 1998: 86, 1997: 72, 1996: 64, 1995: 40, 1994: 51
1993: 40, 1992: 33, 1991: 29, 1990: 39, 1989: 45, 1988: 35, 1987: 52, 1986: 29, 1985: 35, 1984: 33
1983: 32, 1982: 32, 1981: 21, 1980: 22, 1979: 28, 1978: 30, 1977: 26, 1976: 25, 1975: 23, 1973: 18
Search results: 3870 records in total (search time: 15 ms)
121.
Complex geometric features such as oriented points, lines or 3D frames are increasingly used in image processing and computer vision. However, processing these geometric features is far more difficult than processing points, and a number of paradoxes can arise. We establish in this article the basic mathematical framework required to avoid them and analyze more specifically three basic problems: (1) what is a random distribution of features, (2) how to define a distance between features, and (3) what is the “mean feature” of a number of feature measurements? We insist on the importance of an invariance hypothesis for these definitions relative to a group of transformations that models the different possible data acquisitions. We develop general methods to solve these three problems and illustrate them with 3D frame features under rigid transformations. The first problem has a direct application in the computation of the prior probability of a false match in classical model-based object recognition algorithms. We also present experimental results for the other two problems in the statistical analysis of anatomical features automatically extracted from 24 three-dimensional images of a single patient's head. These experiments confirm the importance of the rigorous requirements presented in this article.
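As a rough illustration of the "mean feature" problem for the rotational part of 3D frames, the sketch below computes an intrinsic (Fréchet) mean of a set of rotations by iterating in the tangent space at the current estimate. The convergence tolerance, the noise model, and the use of scipy's Rotation class are assumptions of this sketch, not details taken from the paper.

```python
# Minimal sketch: intrinsic (Frechet) mean of 3D rotations, iterated in the
# tangent space at the current estimate. Illustrative only; tolerances and the
# choice of scipy's Rotation class are assumptions, not the paper's algorithm.
import numpy as np
from scipy.spatial.transform import Rotation as R

def frechet_mean_rotations(rotations, tol=1e-10, max_iter=100):
    """Average rotations: map to the tangent space at the current estimate
    (rotation vectors), average, and map back with the exponential."""
    mean = rotations[0]
    for _ in range(max_iter):
        # Relative rotations expressed in the tangent space at `mean`.
        residuals = np.array([(r * mean.inv()).as_rotvec() for r in rotations])
        update = residuals.mean(axis=0)
        mean = R.from_rotvec(update) * mean
        if np.linalg.norm(update) < tol:
            break
    return mean

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Noisy measurements scattered around a reference rotation (toy data).
    reference = R.from_euler("xyz", [0.3, -0.2, 0.5])
    noise = R.from_rotvec(0.05 * rng.standard_normal((24, 3)))
    samples = [n * reference for n in noise]
    print(frechet_mean_rotations(samples).as_euler("xyz"))
```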
122.
The aim of this paper is to propose new regularization and filtering techniques for dense and sparse vector fields, and to focus on their application to non-rigid registration. Indeed, most of the regularization energies used in non-rigid registration operate independently on each coordinate of the transformation. The only common exception is the linear elastic energy, which enables cross-effects between coordinates. Yet cross-effects are essential for obtaining realistic deformations in the uniform parts of the image, where displacements are interpolated. In this paper, we propose to find isotropic quadratic differential forms operating on a vector field, using a known theorem on isotropic tensors, and we give results for differentials of order 1 and 2. The quadratic approximation induced by these energies yields a new class of vectorial filters, applied numerically in the Fourier domain. We also propose a class of separable isotropic filters generalizing Gaussian filtering to vector fields, which enables fast smoothing in the spatial domain. We then deduce splines in the context of interpolation or approximation of sparse displacements. These splines generalize scalar Laplacian splines, such as thin-plate splines, to vector interpolation. Finally, we propose to solve the problem of approximating a dense and a sparse displacement field at the same time. This last formulation enables us to introduce sparse geometrical constraints into intensity-based non-rigid registration algorithms, illustrated here on intersubject brain registration.
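To make the idea of a vectorial filter with cross-effects concrete, the sketch below applies, in the Fourier domain, the filter induced by a simple elastic-like quadratic energy on a 2D displacement field: the divergence penalty couples the two components. The energy, weights, and grid size are assumptions made for this illustration; they are not the isotropic differential forms derived in the paper.

```python
# Minimal sketch of a vectorial Fourier-domain filter with cross-effects between
# the two components of a 2D displacement field. The quadratic energy
#   E(u) = |u - u0|^2 + alpha * |grad u|^2 + beta * (div u)^2
# is minimized frequency by frequency; the beta term couples the components.
# Energy, weights, and grid size are illustrative assumptions.
import numpy as np

def elastic_like_filter(u0, alpha=1.0, beta=2.0):
    ny, nx = u0.shape[1:]
    kx = 2 * np.pi * np.fft.fftfreq(nx)
    ky = 2 * np.pi * np.fft.fftfreq(ny)
    KX, KY = np.meshgrid(kx, ky)                     # shape (ny, nx)
    u_hat = np.fft.fft2(u0, axes=(1, 2))             # FFT of each component

    # Build and invert the 2x2 system (I + alpha*|k|^2 I + beta*k k^T) per frequency.
    k2 = KX**2 + KY**2
    A = np.empty((ny, nx, 2, 2), dtype=complex)
    A[..., 0, 0] = 1 + alpha * k2 + beta * KX * KX
    A[..., 0, 1] = beta * KX * KY
    A[..., 1, 0] = beta * KY * KX
    A[..., 1, 1] = 1 + alpha * k2 + beta * KY * KY
    rhs = np.stack([u_hat[0], u_hat[1]], axis=-1)[..., None]   # (ny, nx, 2, 1)
    u_hat_filtered = np.linalg.solve(A, rhs)[..., 0]

    u = np.fft.ifft2(np.stack([u_hat_filtered[..., 0],
                               u_hat_filtered[..., 1]]), axes=(1, 2))
    return u.real

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    noisy_field = rng.standard_normal((2, 64, 64))   # (component, y, x)
    print(elastic_like_filter(noisy_field).shape)
```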
123.
An adjoint-based functional optimization technique, in conjunction with the spectral stochastic finite element method, is proposed for the solution of an inverse heat conduction problem in the presence of uncertainties in material data, process conditions and measurement noise. The ill-posed stochastic inverse problem is restated as a conditionally well-posed L2 optimization problem. The gradient of the objective function is obtained in a distributional sense by defining an appropriate stochastic adjoint field. The L2 optimization problem is solved using a conjugate-gradient approach. The accuracy and effectiveness of the proposed approach are appraised with the solution of several stochastic inverse heat conduction problems. Copyright © 2004 John Wiley & Sons, Ltd.
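As a rough, deterministic analogue of the inverse problem described above (not the paper's stochastic spectral formulation), the sketch below recovers an unknown volumetric heat source in a 1D steady conduction model by minimizing a least-squares misfit with a conjugate-gradient optimizer, computing the gradient through an adjoint solve. The forward model, grid, noise level, and regularization weight are assumptions of this sketch.

```python
# Toy deterministic analogue of adjoint-based inverse heat conduction (not the
# paper's stochastic spectral formulation): recover an unknown source f(x) in
#   -d/dx(k dT/dx) = f,  T(0) = T(L) = 0,
# from noisy temperature data, with an adjoint-computed gradient and CG.
# Grid size, conductivity, noise level and regularization weight are assumptions.
import numpy as np
from scipy.optimize import minimize

n, L, k = 60, 1.0, 1.5
h = L / (n + 1)
# Finite-difference operator for -k T'' on the interior nodes.
A = (k / h**2) * (2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1))
x = np.linspace(h, L - h, n)

f_true = np.exp(-((x - 0.35) / 0.08) ** 2)           # "unknown" source
rng = np.random.default_rng(2)
T_obs = np.linalg.solve(A, f_true) + 1e-4 * rng.standard_normal(n)

lam = 1e-6                                            # Tikhonov weight

def objective_and_grad(f):
    T = np.linalg.solve(A, f)                         # forward solve
    r = T - T_obs                                     # data misfit
    J = 0.5 * r @ r + 0.5 * lam * f @ f
    p = np.linalg.solve(A.T, r)                       # adjoint solve
    return J, p + lam * f                             # objective, gradient

result = minimize(objective_and_grad, np.zeros(n), jac=True, method="CG")
print("misfit after optimization:", result.fun)
```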
124.
A stabilized equal-order velocity-pressure finite element algorithm is presented for the analysis of flow in porous media and in the solidification of binary alloys. The adopted governing macroscopic conservation equations of momentum, energy and species transport are derived from their microscopic counterparts using the volume-averaging method. The analysis is performed in a single domain with a fixed numerical grid. The developed fluid flow scheme includes SUPG (streamline-upwind/Petrov-Galerkin), PSPG (pressure-stabilizing/Petrov-Galerkin) and DSPG (Darcy-stabilizing/Petrov-Galerkin) stabilization terms in a variable-porosity medium. For the energy and species equations a classical SUPG-based finite element method is employed. The developed algorithms were tested extensively with bilinear elements and were shown to perform stably and with nearly quadratic convergence in high-Rayleigh-number flows in varying-porosity media. Examples are shown for natural and double-diffusive convection in porous media and for the directional solidification of a binary alloy. Copyright © 2004 John Wiley & Sons, Ltd.
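For readers unfamiliar with residual-based stabilization, the display below shows the generic form in which SUPG and PSPG terms are added, element by element, to a Galerkin weak form of the momentum equation. The notation is schematic; the paper's Darcy (DSPG) term and its stabilization parameters are not reproduced here.

```latex
% Schematic residual-based stabilization of a Galerkin weak form; R_m is the
% momentum residual, w and q are velocity and pressure test functions, and the
% tau's are element-level stabilization parameters (DSPG term not shown).
\begin{aligned}
B_{\text{Galerkin}}(\mathbf{w}, q;\, \mathbf{u}, p)
&+ \sum_{e} \int_{\Omega^{e}} \tau_{\mathrm{SUPG}}\,
      (\mathbf{u}\cdot\nabla\mathbf{w}) \cdot
      \mathbf{R}_{m}(\mathbf{u}, p)\, \mathrm{d}\Omega \\
&+ \sum_{e} \int_{\Omega^{e}} \tau_{\mathrm{PSPG}}\,
      \frac{1}{\rho}\,\nabla q \cdot
      \mathbf{R}_{m}(\mathbf{u}, p)\, \mathrm{d}\Omega = 0 .
\end{aligned}
```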
125.
This paper represents a first attempt at a systematic study of sensitivity analysis for scheduling problems. Because schedules contain both combinatorial and temporal structures, scheduling problems present unique issues for sensitivity analysis. Some of the issues that we discuss have not been considered before. Others, while studied before, have not been explored in the context of scheduling. The applicability of these issues is illustrated using well-known scheduling models. We provide fast methods to determine when a previously optimal schedule remains optimal. Other methods restore an optimal schedule after a parameter change. The value of studying the sensitivity of an optimal sequence instead of the sensitivity of an optimal schedule is demonstrated. We show that, for some problems, sensitivity analysis results depend on the positions of jobs with changed parameters. We identify scheduling problems where performing additional or different computations during optimization facilitates sensitivity analysis. To improve the robustness of an optimal schedule, selection among multiple optimal schedules is considered. We discuss which types of sensitivity analysis questions are intractable because the scheduling problem itself is intractable. We also study how heuristic error bounds vary when the data of a scheduling problem is continuously modified. Although we focus on scheduling problems, several of the issues we discuss and our classification scheme can be extended to other optimization problems.
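As a concrete, simplified illustration of checking whether a previously optimal schedule remains optimal after a parameter change, the sketch below uses the single-machine total weighted completion time problem, for which a sequence is optimal exactly when jobs are in non-increasing order of the weight-to-processing-time ratio (WSPT). The job data are invented, and the paper's scheduling models may differ.

```python
# Simplified illustration: for single-machine total weighted completion time,
# a sequence is optimal iff jobs appear in non-increasing order of w/p (WSPT).
# After a processing-time change we only need to re-check that ordering.
# Job data are invented; the paper's scheduling models may differ.
from dataclasses import dataclass

@dataclass
class Job:
    name: str
    p: float   # processing time
    w: float   # weight

def is_wspt_optimal(sequence):
    """True if the given sequence is in non-increasing w/p order."""
    ratios = [job.w / job.p for job in sequence]
    return all(r1 >= r2 - 1e-12 for r1, r2 in zip(ratios, ratios[1:]))

jobs = [Job("A", 2, 6), Job("B", 3, 6), Job("C", 5, 5)]    # w/p = 3, 2, 1
print(is_wspt_optimal(jobs))         # True: the current schedule stays optimal

jobs[1].p = 8                        # parameter change: B now has w/p = 0.75
print(is_wspt_optimal(jobs))         # False: re-sequencing is needed
jobs.sort(key=lambda j: j.w / j.p, reverse=True)   # restore an optimal schedule
print([j.name for j in jobs])
```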
126.
Over the last several decades, forensic examiners of impression evidence have come under scrutiny in the courtroom due to analysis methods that rely heavily on subjective morphological comparisons. Currently, there is no universally accepted system that generates numerical data to independently corroborate visual comparisons. Our research attempts to develop such a system for tool mark evidence, proposing a methodology that objectively evaluates the association of striated tool marks with the tools that generated them. In our study, 58 primer shear marks on 9 mm cartridge cases, fired from four Glock model 19 pistols, were collected using high-resolution white light confocal microscopy. The resulting three-dimensional surface topographies were filtered to extract all "waviness surfaces", the essential "line" information that firearm and tool mark examiners view under a microscope. Extracted waviness profiles were processed with principal component analysis (PCA) for dimension reduction. Support vector machines (SVM) were used to make the profile-gun associations, and conformal prediction theory (CPT) was used to establish confidence levels. At the 95% confidence level, CPT coupled with PCA-SVM yielded an empirical error rate of 3.5%. Complementary bootstrap-based estimates of the error rate were 0%, indicating that the error rate of the algorithmic procedure is likely to remain low on larger data sets. Finally, suggestions are made for practical courtroom application of CPT for assigning levels of confidence to SVM identifications of tool marks recorded with confocal microscopy.
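The sketch below strings together PCA, an SVM, and a simple split (inductive) conformal predictor on synthetic profiles, to show how confidence levels of the kind described above can be attached to classifications. The data are synthetic, the nonconformity score is one common choice, and none of the settings are taken from the study.

```python
# Sketch: PCA dimension reduction + SVM classification + a split (inductive)
# conformal predictor that attaches a confidence level to each prediction.
# Data are synthetic and all settings are illustrative, not the study's.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

X, y = make_classification(n_samples=400, n_features=200, n_informative=20,
                           n_classes=4, n_clusters_per_class=1, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
X_fit, X_cal, y_fit, y_cal = train_test_split(X_train, y_train, random_state=0)

model = make_pipeline(PCA(n_components=15), SVC(probability=True, random_state=0))
model.fit(X_fit, y_fit)

# Nonconformity score: 1 - estimated probability of the (candidate) label.
cal_scores = 1.0 - model.predict_proba(X_cal)[np.arange(len(y_cal)), y_cal]

def prediction_set(x, significance=0.05):
    """Labels whose conformal p-value exceeds the significance level."""
    probs = model.predict_proba(x.reshape(1, -1))[0]
    labels = []
    for label, p_label in enumerate(probs):
        score = 1.0 - p_label
        p_value = (np.sum(cal_scores >= score) + 1) / (len(cal_scores) + 1)
        if p_value > significance:
            labels.append(label)
    return labels

sets = [prediction_set(x) for x in X_test]
errors = np.mean([yt not in s for yt, s in zip(y_test, sets)])
print(f"empirical error rate at the 95% level: {errors:.3f}")
```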
127.
The use of fluorescent probes that allow visualization of leukocyte-endothelial cell (EC) interactions has greatly informed our understanding of leukocyte recruitment. However, the effects of these agents on the biological functions of leukocytes are poorly described, leading to concerns about the interpretation of such data. Here we used two flow-based neutrophil adhesion assays to compare the effects of phase contrast illumination (PCI) with high intensity illumination (HII) used for fluorescence microscopy, in the presence or absence of five commonly used fluorochromes. Isolated neutrophils were either (1) perfused across P-selectin to establish a population of rolling cells, which were subsequently activated with fMLP, or (2) perfused across EC activated with TNF-alpha. In the absence of fluorescent dyes, HII did not affect levels of leukocyte adhesion; however, subsequent neutrophil behavior was dramatically altered compared with cells under PCI, for example, with markedly reduced migration velocities. In the presence of fluorescent dyes, the effects of HII were exacerbated, although the precise nature of the biological effects of these probes was agent-specific. Thus, for the first time, our experiments describe the effects of fluorescence microscopy on the separate stages of the neutrophil recruitment process and reveal a previously unsuspected effect of HII on neutrophil migration.
128.
Recent work suggests that evaporative coolers increase the level and diversity of bioaerosols, but this association remains understudied in low-income homes. We conducted a cross-sectional study of metropolitan, low-income homes in Utah with evaporative coolers (n = 20) and central air conditioners (n = 28). Dust samples (N = 147) were collected from four locations in each home and analyzed for the dust-mite allergens Der p1 and Der f1, endotoxins, and β-(1→3)-D-glucans. In all sample locations combined, Der p1 or Der f1 was significantly higher in evaporative cooler versus central air conditioning homes (OR = 2.29, 95% CI = 1.05-4.98). Endotoxin concentration was significantly higher in evaporative cooler versus central air conditioning homes in furniture (geometric mean (GM) = 8.05 vs 2.85 EU/mg, P < .01) and in all samples combined (GM = 3.60 vs 1.29 EU/mg, P = .03). β-(1→3)-D-glucan concentrations and surface loads were significantly higher in evaporative cooler versus central air conditioning homes in all four sample locations and in all samples combined (P < .01). Our study suggests that low-income, evaporatively cooled homes have higher levels of immunologically important bioaerosols than centrally air-conditioned homes in dry climates, warranting studies on health implications and other exposed populations.
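For readers unfamiliar with the quantities reported above, the short sketch below shows how a geometric mean and an odds ratio with a Wald-type 95% confidence interval are typically computed. The numbers are invented for illustration and are not the study's data.

```python
# How the reported quantities are typically computed: geometric mean of
# concentrations and an odds ratio with a Wald-type 95% CI from a 2x2 table.
# All numbers below are invented for illustration, not the study's data.
import numpy as np

concentrations = np.array([1.2, 4.8, 9.7, 2.3, 6.1])   # e.g., EU/mg
geometric_mean = np.exp(np.mean(np.log(concentrations)))
print(f"geometric mean: {geometric_mean:.2f}")

# 2x2 table: rows = exposure (evaporative cooler vs. central AC),
#            columns = outcome (allergen detected vs. not detected).
a, b = 18, 22    # exposed: detected / not detected
c, d = 10, 46    # unexposed: detected / not detected
odds_ratio = (a * d) / (b * c)
se_log_or = np.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
lo, hi = np.exp(np.log(odds_ratio) + np.array([-1.96, 1.96]) * se_log_or)
print(f"OR = {odds_ratio:.2f}, 95% CI = {lo:.2f}-{hi:.2f}")
```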
129.
Insight into service response time is important for service-oriented architectures and service management. However, directly measuring the service response time is not always feasible or can be very costly. This paper extends an analytical modeling method which uses enterprise architecture modeling to support the analysis. The extensions consist of (i) a formalization using the Hybrid Probabilistic Relational Model formalism, (ii) an implementation in an analysis tool for enterprise architecture, and (iii) a data collection approach using expert assessments collected via interviews and questionnaires. The accuracy and cost-effectiveness of the method were tested empirically by comparing it with direct performance measurements of five services of a geographical information system at a Swedish utility company. The tests indicate that the proposed method can be a viable option for rapid service response time estimates when a moderate accuracy (within 15%) is sufficient.
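As a very small illustration of estimating a service response time from expert assessments rather than direct measurement, the sketch below propagates triangular (min / most likely / max) expert estimates for the steps of a hypothetical service call through a Monte Carlo simulation. The service structure and numbers are invented, and this is not the Hybrid Probabilistic Relational Model formalism used in the paper.

```python
# Illustration only: propagate expert-assessed (min, most likely, max) durations
# for the steps of a hypothetical service call via Monte Carlo to estimate the
# end-to-end response time. This is not the paper's Hybrid Probabilistic
# Relational Model formalism; the structure and numbers are invented.
import numpy as np

rng = np.random.default_rng(3)
N = 100_000

# Expert assessments (milliseconds) for three sequential steps of the call.
steps = {
    "authentication": (5, 10, 30),
    "spatial_query":  (40, 80, 250),
    "rendering":      (20, 35, 90),
}

samples = sum(rng.triangular(low, mode, high, size=N)
              for low, mode, high in steps.values())

print(f"mean response time : {samples.mean():.1f} ms")
print(f"95th percentile    : {np.percentile(samples, 95):.1f} ms")
```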
130.