Full-text access type
Paid full text | 25,116 articles |
Free | 1,014 articles |
Free (domestic) | 328 articles |
Subject category
Electrical engineering | 514 articles |
General | 546 articles |
Chemical industry | 4,648 articles |
Metalworking | 601 articles |
Machinery and instrumentation | 794 articles |
Building science | 1,113 articles |
Mining engineering | 185 articles |
Energy and power engineering | 752 articles |
Light industry | 2,520 articles |
Hydraulic engineering | 273 articles |
Petroleum and natural gas | 187 articles |
Weapons industry | 27 articles |
Radio and electronics | 2,740 articles |
General industrial technology | 3,578 articles |
Metallurgical industry | 4,087 articles |
Nuclear technology | 200 articles |
Automation technology | 3,693 articles |
Publication year
2023 | 153 articles |
2022 | 402 articles |
2021 | 584 articles |
2020 | 342 articles |
2019 | 430 articles |
2018 | 492 articles |
2017 | 466 articles |
2016 | 531 articles |
2015 | 469 articles |
2014 | 681 articles |
2013 | 1,288 articles |
2012 | 1,066 articles |
2011 | 1,301 articles |
2010 | 1,024 articles |
2009 | 1,069 articles |
2008 | 1,063 articles |
2007 | 1,044 articles |
2006 | 902 articles |
2005 | 773 articles |
2004 | 831 articles |
2003 | 1,028 articles |
2002 | 1,301 articles |
2001 | 1,070 articles |
2000 | 660 articles |
1999 | 585 articles |
1998 | 1,391 articles |
1997 | 912 articles |
1996 | 686 articles |
1995 | 465 articles |
1994 | 365 articles |
1993 | 381 articles |
1992 | 220 articles |
1991 | 179 articles |
1990 | 168 articles |
1989 | 161 articles |
1988 | 147 articles |
1987 | 134 articles |
1986 | 133 articles |
1985 | 172 articles |
1984 | 108 articles |
1983 | 109 articles |
1982 | 105 articles |
1981 | 118 articles |
1980 | 111 articles |
1979 | 82 articles |
1978 | 58 articles |
1977 | 115 articles |
1976 | 204 articles |
1975 | 54 articles |
1973 | 58 articles |
A total of 10,000 results (search time: 15 ms)
81.
Ye Chen, Keith W. Hipel, D. Marc Kilgour, Yuming Zhu 《Environmental Modelling & Software》2009,24(5):647-654
Brownfield redevelopment (BR) is an ongoing issue for governments, communities, and consultants around the world, and an increasingly popular research topic in several academic fields. Strategic decision support now available for BR is surveyed and assessed. A dominance-based rough-set approach is then developed and used to classify cities facing BR issues according to the levels of two characteristics: BR effectiveness and BR future needs. The data for the classification are based on the widely available results of a survey of US cities. The unique features of the method are its reduced requirement for preference information, its ability to handle missing information effectively, and the easily understood linguistic decision rules it generates from a training classification provided by experts. The resulting classification should be a valuable aid to cities and governments as they plan their BR projects and budgets.
82.
A modified expectation maximization algorithm for penalized likelihood estimation in emission tomography    Total citations: 2 (self-citations: 0, by others: 2)
De Pierro AR 《IEEE transactions on medical imaging》1995,14(1):132-137
The maximum likelihood (ML) expectation maximization (EM) approach to emission tomography has been very popular in medical imaging for several years. In spite of this, no satisfactory convergent modifications have been proposed for the regularized approach. Here, a modification of the EM algorithm is presented. The new method is a natural extension of EM for maximizing likelihood with concave priors. Convergence proofs are given.
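For context, the classic (unregularized) ML-EM update that the paper extends can be sketched in a few lines. This is an illustrative plain-Python version of the standard update only; the paper's modified algorithm adds a concave prior term that is not reproduced here, and the matrix and count values in the usage line are made up.

```python
def mlem(A, y, n_iter=50):
    """Classic ML-EM update for emission tomography (illustrative sketch).

    A: system matrix as a list of rows (A[i][j] = probability that an
       emission in voxel j is detected in bin i); y: measured counts.
    """
    m, n = len(A), len(A[0])
    x = [1.0] * n                                        # initial image
    sens = [sum(A[i][j] for i in range(m)) for j in range(n)]  # sum_i a_ij
    for _ in range(n_iter):
        # forward projection, then the multiplicative EM correction
        proj = [sum(A[i][j] * x[j] for j in range(n)) for i in range(m)]
        ratio = [y[i] / max(proj[i], 1e-12) for i in range(m)]
        back = [sum(A[i][j] * ratio[i] for i in range(m)) for j in range(n)]
        x = [x[j] * back[j] / max(sens[j], 1e-12) for j in range(n)]
    return x

# For an identity system the update recovers the data exactly:
print(mlem([[1.0, 0.0], [0.0, 1.0]], [2.0, 3.0], n_iter=5))
```

The update preserves nonnegativity and the Poisson-likelihood fixed points, which is why regularized variants keep its multiplicative form.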
83.
The authors reviewed 53 patients with 70 congenital trigger digits. Three of these were seen at an early age. Most "congenital" trigger digits present later than the neonatal period. A clear difference exists between trigger thumbs and trigger fingers. In our series, thumbs were more frequently affected, 30% were bilateral, and none resolved spontaneously. The long fingers were less frequently affected, and two of them (28%) recovered without operation. All other children had an operative release of the A1 pulley of the flexor tendon sheath, with excellent results.
84.
Spectral Measurement of the Film-Substrate Index Difference in Proton-Exchanged LiNbO3 Waveguides    Total citations: 1 (self-citations: 0, by others: 1)
El Hadi K, Rastogi V, Shenoy MR, Thyagarajan K, De Micheli M, Ostrowsky DB 《Applied optics》1998,37(27):6463-6467
We report the spectral characterization of proton-exchanged lithium niobate (PE:LiNbO3) waveguides in terms of the variation of the refractive-index difference between the waveguiding layer and the substrate. The dispersion of the extraordinary refractive-index increase (δn_e) is measured from 405 to 1319 nm with several light sources. Two types of proton-exchanged waveguide, prepared under different conditions, are studied. These measurements should be of use in the optimization of PE:LiNbO3 waveguides for nonlinear optical applications, particularly second-harmonic generation in the blue-green wavelength region.
85.
A solid-state bonding technique under hot pressing was used for joining alumina to thin metal sheets of Ni, Cu, and Fe. The microstructure and microchemistry of the ceramic–metal interface and of the fracture interface were examined using scanning electron microscopy (SEM), energy-dispersive spectroscopy (EDS), and X-ray diffraction (XRD), in order to identify the adhesion mechanisms and the nature of strength-limiting flaws. Interaction between the selected metals and alumina can be physical or physico-chemical in nature: very low amounts of interfacial compounds were formed, depending on the processing conditions and on the presence of oxygen in the system. Fracture and toughness tests indicated that high ceramic–metal interface strengths (up to 177 MPa) were achieved under the adopted processing conditions and that strength and toughness were directly related. Moreover, an increase in hardening of the metal interlayer at a distance of 2–3 μm from the interface was observed in the samples with high strength values. The mechanical behaviour was related to several factors that strongly depend on the bonding conditions: plastic deformation of the metal, metal creep, metal intrusion and diffusion into alumina, and chemical reactions at the interface. © 1998 Chapman & Hall
86.
Mainella G, de Bernardis P, De Petris M, Mandiello A, Perciballi M, Romeo G 《Applied optics》1996,35(13):2246-2252
The Millimetre and Infrared Testa Grigia Observatory 2.6-m Cassegrain telescope has been designed to allow high-sensitivity observations in the millimeter spectral range. To reduce unwanted contributions from local foregrounds, we adopted a sky-chopping technique, wobbling the telescope subreflector. We describe the design and performance of the wobbling system, which supports externally forced two- and three-field square-wave modulation and offers high frequency, high amplitude, high duty cycle, low microphonics, and high stability.
87.
The SHARC framework for data quality in Web archiving    Total citations: 1 (self-citations: 0, by others: 1)
Dimitar Denev, Arturas Mazeika, Marc Spaniol, Gerhard Weikum 《The VLDB Journal: The International Journal on Very Large Data Bases》2011,20(2):183-207
Web archives preserve the history of born-digital content and offer great potential for sociologists, business analysts, and legal experts on intellectual property and compliance issues. Data quality is crucial for these purposes. Ideally, crawlers should gather coherent captures of entire Web sites, but the politeness etiquette and completeness requirement mandate very slow, long-duration crawling while Web sites undergo changes. This paper presents the SHARC framework for assessing the data quality in Web archives and for tuning capturing strategies toward better quality with given resources. We define data quality measures, characterize their properties, and develop a suite of quality-conscious scheduling strategies for archive crawling. Our framework includes single-visit and visit-revisit crawls. Single-visit crawls download every page of a site exactly once, in an order that aims to minimize the "blur" in capturing the site. Visit-revisit strategies revisit pages after their initial downloads to check for intermediate changes; the revisiting order aims to maximize the "coherence" of the site capture (the number of pages that did not change during the capture). The quality notions of blur and coherence are formalized in the paper. Blur is a stochastic notion that reflects the expected number of page changes that a time-travel access to a site capture would accidentally see, instead of the ideal view of an instantaneously captured, "sharp" site. Coherence is a deterministic quality measure that counts the number of unchanged and thus coherently captured pages in a site snapshot. Strategies that aim to either minimize blur or maximize coherence are based on prior knowledge of, or predictions for, the change rates of individual pages. Our framework includes fairly accurate classifiers for change predictions.
All strategies are fully implemented in a testbed and shown to be effective by experiments with both synthetically generated sites and a periodic crawl series for different Web sites.
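The deterministic coherence measure described in the abstract can be read as counting the pages with no recorded change inside the capture window. The sketch below is a hypothetical illustration of that reading; the function name and data layout are invented here, not taken from the SHARC paper.

```python
def coherence(download_times, change_times):
    """Count pages that did not change during the site capture.

    download_times: page -> time its snapshot was downloaded
    change_times:   page -> list of timestamps at which the page changed
    The capture window spans the first to the last download.
    """
    start = min(download_times.values())
    end = max(download_times.values())
    return sum(
        1
        for page in download_times
        if not any(start <= t <= end for t in change_times.get(page, []))
    )

# Page "a" changed mid-capture, page "b" did not: coherence is 1 of 2.
print(coherence({"a": 0.0, "b": 10.0}, {"a": [5.0]}))
```

A revisit strategy would compare each page's initial and revisit snapshots to detect such intermediate changes instead of assuming change timestamps are known.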
88.
Hans De Sterck, Killian Miller, Geoffrey Sanders 《Computing and Visualization in Science》2011,14(2):51-65
Recently, it was shown how the convergence of a class of multigrid methods for computing the stationary distribution of sparse, irreducible Markov chains can be accelerated by the addition of an outer iteration based on iterant recombination. The acceleration was performed by selecting a linear combination of previous fine-level iterates with probability constraints to minimize the two-norm of the residual using a quadratic programming method. In this paper we investigate the alternative of minimizing the one-norm of the residual. This gives rise to a nonlinear convex program which must be solved at each acceleration step. To solve this minimization problem we propose to use a deep-cuts ellipsoid method for nonlinear convex programs. The main purpose of this paper is to investigate whether an iterant recombination approach can be obtained in this way that is competitive in terms of execution time and robustness. We derive formulas for subgradients of the one-norm objective function and the constraint functions, show how an initial ellipsoid can be constructed that is guaranteed to contain the exact solution, and give conditions for its existence. We also investigate using the ellipsoid method to minimize the two-norm. Numerical tests show that the one-norm and two-norm acceleration procedures yield a similar reduction in the number of multigrid cycles. The tests also indicate that one-norm ellipsoid acceleration is competitive with two-norm quadratic programming acceleration in terms of running time, with improved robustness.
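The one-norm objective f(x) = ||Ax - b||_1 that drives the ellipsoid method above has the standard subgradient A^T sign(Ax - b). A minimal sketch of that formula follows; the example system is made up for illustration and is not the paper's Markov-chain setting.

```python
def one_norm_subgradient(A, x, b):
    """Subgradient of f(x) = ||A x - b||_1, namely A^T sign(A x - b).

    A: list of rows; x, b: vectors as lists. Where a residual
    component is exactly zero, 0 is a valid subgradient choice.
    """
    m, n = len(A), len(x)
    r = [sum(A[i][j] * x[j] for j in range(n)) - b[i] for i in range(m)]
    s = [1.0 if ri > 0 else (-1.0 if ri < 0 else 0.0) for ri in r]
    return [sum(A[i][j] * s[i] for i in range(m)) for j in range(n)]

# Residual (1, -1) under the identity system gives subgradient (1, -1):
print(one_norm_subgradient([[1.0, 0.0], [0.0, 1.0]], [1.0, 0.0], [0.0, 1.0]))
```

Each ellipsoid-method step uses such a subgradient to cut the current ellipsoid; a deep cut uses the objective value as well to shrink it faster.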
89.
I. Werbrouck, M. Antrop, V. Van Eetvelde, C. Stal, Ph. De Maeyer, M. Bats, J. Bourgeois, M. Court-Picon, Ph. Crombé, J. De Reu, Ph. De Smedt, P.A. Finke, M. Van Meirvenne, J. Verniers, A. Zwertvaegher 《Expert systems with applications》2011,38(7):8178-8185
This paper discusses the generation of a high-precision DEM (Digital Elevation Model) based on high-density airborne LiDAR (Light Detection and Ranging) data for an interdisciplinary landscape archaeological study concerning the settlement history and environment in Sandy Flanders, a region to the north of Ghent (Belgium). The objective was to create a detailed topographical surface free of artificial features and topographical artefacts, in the form of a DEM, visualizing the natural and current topography through the implementation of true ground points only. The semi-automatic removal of these features and artefacts was based on topographical vector data, visual interpretations, and slope analysis. Ultimately, two DEMs were constructed: (1) a TIN (Triangulated Irregular Network) model, whose inherently large file size restricts its usability to large-scale applications, and (2) a grid model which can be used for small-, medium-, and large-scale applications. Both datasets were used as images interpreted with ancillary data from historical sources. Their usefulness is illustrated in a case study of field patterns and microfield topography. Starting from this DEM, the approach of this landscape historical study is mainly retrogressive, i.e. starting from the landscape structures and elements that are still present in the contemporary landscape and moving into the past.
90.
The representer theorem for kernel methods states that the solution of the associated variational problem can be expressed as a linear combination of a finite number of kernel functions. However, for non-smooth loss functions, the analytic characterization of the coefficients poses nontrivial problems. Standard approaches resort to constrained optimization reformulations which, in general, lack a closed-form solution. Herein, by a proper change of variable, it is shown that, for any convex loss function, the coefficients satisfy a system of algebraic equations in a fixed-point form, which may be directly obtained from the primal formulation. The algebraic characterization is specialized to regression and classification methods, and the fixed-point equations are explicitly characterized for many loss functions of practical interest. The consequences of the main result are then investigated along two directions. First, the existence of an unconstrained smooth reformulation of the original non-smooth problem is proven. Second, in the context of SURE (Stein's Unbiased Risk Estimation), a general formula for the degrees of freedom of kernel regression methods is derived.
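For the smooth special case of squared loss, the fixed-point form is easy to see directly: minimizing (1/2)||y - Kc||^2 + (lam/2) c^T K c over the representer coefficients c gives c = (y - Kc) / lam, which can be iterated as-is when lam dominates the kernel matrix. The sketch below is a toy illustration of that iteration (kernel matrix and data invented; convergence assumed because lam is chosen large enough to make the map a contraction), not the paper's general non-smooth construction.

```python
def fixed_point_coefficients(K, y, lam, n_iter=200):
    """Iterate c <- (y - K c) / lam for kernel ridge regression.

    At the fixed point, c solves (K + lam*I) c = y. The iteration
    converges when the map is a contraction (roughly ||K|| < lam).
    """
    n = len(y)
    c = [0.0] * n
    for _ in range(n_iter):
        Kc = [sum(K[i][j] * c[j] for j in range(n)) for i in range(n)]
        c = [(y[i] - Kc[i]) / lam for i in range(n)]
    return c

K = [[1.0, 0.5], [0.5, 1.0]]   # toy kernel (Gram) matrix
y = [1.0, 2.0]
lam = 10.0
c = fixed_point_coefficients(K, y, lam)
# Verify the fixed-point condition (K + lam*I) c = y:
residual = [sum(K[i][j] * c[j] for j in range(2)) + lam * c[i] - y[i]
            for i in range(2)]
print(max(abs(r) for r in residual))
```

For non-smooth losses such as the hinge or epsilon-insensitive loss, the paper's change of variable yields analogous algebraic fixed-point equations where a closed-form solve is no longer available.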