Full-text access type | Articles |
Paid full text | 6,305 |
Free | 845 |
Free (domestic) | 23 |
Subject classification | Articles |
Electrical engineering | 58 |
General | 117 |
Chemical industry | 1,780 |
Metalworking | 96 |
Machinery & instrumentation | 114 |
Building science | 254 |
Mining engineering | 16 |
Energy & power | 751 |
Light industry | 658 |
Hydraulic engineering | 42 |
Petroleum & natural gas | 16 |
Radio electronics | 631 |
General industrial technology | 1,053 |
Metallurgical industry | 711 |
Nuclear technology | 56 |
Automation technology | 820 |
Publication year | Articles |
2022 | 55 |
2021 | 65 |
2020 | 73 |
2019 | 113 |
2018 | 135 |
2017 | 147 |
2016 | 184 |
2015 | 238 |
2014 | 253 |
2013 | 398 |
2012 | 270 |
2011 | 337 |
2010 | 486 |
2009 | 421 |
2008 | 577 |
2007 | 239 |
2006 | 247 |
2005 | 196 |
2004 | 198 |
2003 | 182 |
2002 | 190 |
2001 | 121 |
2000 | 133 |
1999 | 105 |
1998 | 121 |
1997 | 91 |
1996 | 103 |
1995 | 85 |
1994 | 96 |
1993 | 81 |
1992 | 72 |
1991 | 51 |
1990 | 71 |
1989 | 71 |
1988 | 49 |
1987 | 50 |
1986 | 47 |
1985 | 72 |
1984 | 61 |
1983 | 63 |
1982 | 52 |
1981 | 49 |
1980 | 49 |
1979 | 59 |
1978 | 31 |
1977 | 45 |
1976 | 40 |
1975 | 41 |
1974 | 31 |
1973 | 27 |
Sort by: 7,173 results found (search time: 31 ms)
161.
Philippe Coni, Jean‐Luc Bardon, Aude Gueguen, Matthieu Grossetête 《Journal of the Society for Information Display》2017,25(3):158-166
A 3D stereoscopic head‐up display using a tunable bandpass filter to perform left and right image spectral separation is presented. Using a single filter reduces the size and cost of the head‐up display optical engine and enables each spectral band to be accurately tuned. Experiments on the first prototype demonstrate the ability to tune the bandpass frequency continuously over a 30‐nm range while keeping a 20‐nm bandwidth. Such a system avoids a bulky and costly rotating wheel and enables the use of holographic optical elements, which are known to be wavelength selective.
162.
Grant E. Gunn, Claude R. Duguay, Chris Derksen, Juha Lemmetyinen 《Remote sensing of environment》2011,115(1):233-244
Algorithms designed to estimate snow water equivalent (SWE) from passive microwave measurements falter in lake-rich high-latitude environments because the emission properties of ice-covered lakes affect low-frequency measurements. Microwave emission models have been used to simulate brightness temperatures (Tbs) for snowpack characteristics in terrestrial environments, but they cannot be applied to snow on lakes because of the differing subsurface emissivities and scattering matrices present in ice. This paper examines the performance of a modified version of the Helsinki University of Technology (HUT) snow emission model that incorporates microwave emission from lake ice and sub-ice water. Inputs to the HUT model include measurements collected over brackish and freshwater lakes north of Inuvik, Northwest Territories, Canada in April 2008, consisting of snowpack properties (depth, density, and snow water equivalent) and lake ice properties (thickness and ice type). Coincident airborne radiometer measurements at a resolution of 80 × 100 m were used as ground truth to evaluate the simulations. The results indicate that subsurface media are simulated best when using a modeled effective grain size and a 1 mm RMS surface roughness at the ice/water interface, compared to using a measured grain size and a flat Fresnel reflective surface as input. Simulations at 37 GHz (vertical polarization) produce the best agreement with airborne Tbs, with Root Mean Square Errors (RMSEs) of 6.2 K and 7.9 K and Mean Bias Errors (MBEs) of −8.4 K and −8.8 K for brackish and freshwater sites respectively. Freshwater simulations at 6.9 and 19 GHz H exhibited low RMSE (10.53 and 6.15 K respectively) and MBE (−5.37 and 8.36 K respectively) but did not accurately simulate Tb variability (R = −0.15 and 0.01 respectively).
Over brackish water, 6.9 GHz simulations agreed poorly with airborne Tbs, while 19 GHz V exhibited a low RMSE (6.15 K) and MBE (−4.52 K) and improved relative agreement with airborne measurements (R = 0.47). Accounting for salinity reduced 6.9 GHz errors substantially, with RMSE dropping from 51.48 K and 57.18 K for H and V polarizations respectively to 26.2 K and 31.6 K, although Tb variability was still not well simulated. With the best results at 37 GHz, the HUT simulations show the potential to track Tb evolution, and therefore SWE, through the winter season.
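The RMSE and MBE statistics quoted in this abstract are standard error metrics. As a minimal illustration only (the brightness-temperature values below are made up, not the study's data), they can be computed as:

```python
import math

def rmse(simulated, observed):
    """Root Mean Square Error between paired simulated and observed values."""
    return math.sqrt(sum((s - o) ** 2 for s, o in zip(simulated, observed)) / len(simulated))

def mean_bias_error(simulated, observed):
    """Mean Bias Error: a negative value means the simulation underestimates on average."""
    return sum(s - o for s, o in zip(simulated, observed)) / len(simulated)

# Hypothetical brightness temperatures (K), for illustration only
sim = [240.0, 245.0, 250.0, 255.0]
obs = [242.0, 248.0, 251.0, 260.0]
print(rmse(sim, obs))             # overall error magnitude
print(mean_bias_error(sim, obs))  # sign reveals systematic over/underestimation
```

RMSE measures overall error magnitude, while MBE retains the sign, which is why the paper reports both.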
163.
Combination of sources of evidence with different discounting factors based on a new dissimilarity measure (cited 1 time: 0 self-citations, 1 by others)
Zhun-ga Liu, Jean Dezert, Grégoire Mercier 《Decision Support Systems》2011,52(1):133-141
Sources of evidence may have different reliability and importance in real decision-making applications. The estimation of discounting (weighting) factors when prior knowledge is unavailable has been studied regularly until recently. In the past, the determination of weighting factors focused only on the reliability discounting rule and depended mainly on a dissimilarity measure between basic belief assignments (bba's) represented by an evidential distance. Nevertheless, it is very difficult to characterize the dissimilarity efficiently through an evidential distance alone. Thus, both a distance and a conflict coefficient, based on the pignistic probability transformation BetP, are proposed to characterize the dissimilarity. The distance represents the difference between bba's, whereas the conflict coefficient reveals the degree of divergence between the hypotheses that the two belief functions most strongly support. These two aspects of dissimilarity are complementary in a certain sense, and their fusion is used as the dissimilarity measure. A new estimation method for the weighting factors is then presented using the proposed dissimilarity measure. In evaluating the weight of a source, both its dissimilarity with other sources and their weighting factors are considered. The weighting factors can be applied in both the importance and reliability discounting rules, but the choice of discounting rule should depend on the actual application. Simple numerical examples illustrate the interest of the proposed approach.
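The fused dissimilarity measure described above can be sketched in simplified form. The toy version below is illustrative only: it uses a Euclidean distance between pignistic (BetP) probabilities and a crude binary conflict term with an assumed equal weighting `alpha=0.5`, not the paper's actual evidential distance or conflict coefficient:

```python
def betp(bba):
    """Pignistic transformation: spread each focal element's mass uniformly over its singletons."""
    p = {}
    for focal, mass in bba.items():
        for x in focal:
            p[x] = p.get(x, 0.0) + mass / len(focal)
    return p

def dissimilarity(bba1, bba2, alpha=0.5):
    """Fuse a distance term and a conflict term (equal weights assumed by default)."""
    p1, p2 = betp(bba1), betp(bba2)
    keys = set(p1) | set(p2)
    # distance term: normalized Euclidean distance between pignistic probabilities
    dist = (sum((p1.get(k, 0.0) - p2.get(k, 0.0)) ** 2 for k in keys) / 2) ** 0.5
    # conflict term: 1 if the two sources most strongly support different hypotheses
    conflict = 0.0 if max(p1, key=p1.get) == max(p2, key=p2.get) else 1.0
    return alpha * dist + (1 - alpha) * conflict

# bba's over the frame {a, b}: source 1 favours 'a', source 2 favours 'b'
m1 = {frozenset({'a'}): 0.7, frozenset({'a', 'b'}): 0.3}
m2 = {frozenset({'b'}): 0.8, frozenset({'a', 'b'}): 0.2}
print(dissimilarity(m1, m2))
```

The point of the two terms matches the abstract: the distance captures how far apart the bba's are numerically, while the conflict term captures whether they back different hypotheses.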
164.
This paper deals with task scheduling, where each task is one particular iteration of a DO loop with partial loop-carried dependencies. Independent iterations of such loops can be scheduled in an order different from that of classical serial execution, so as to increase program performance. The approach we present is based both on a directive added to the High Performance Fortran (HPF2) language, which specifies the dependencies between iterations, and on inspector/executor support, implemented in the CoLUMBO library, which builds the task graph and schedules the tasks associated with iterations. We validate our approach with results achieved on an IBM SP2 for a sparse Cholesky factorization algorithm applied to real problems.
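The inspector step, ordering iterations from a dependency graph so that independent ones can run early, can be illustrated with a plain topological-sort scheduler (a generic sketch, not the CoLUMBO implementation):

```python
from collections import deque

def schedule_iterations(n, deps):
    """Order n loop iterations given partial loop-carried dependencies.

    deps maps an iteration index to the set of iterations it must wait for.
    Independent iterations may run in any order (here: first-come via a queue).
    """
    indegree = [0] * n
    dependents = [[] for _ in range(n)]
    for i, preds in deps.items():
        indegree[i] = len(preds)
        for p in preds:
            dependents[p].append(i)
    ready = deque(i for i in range(n) if indegree[i] == 0)
    order = []
    while ready:
        i = ready.popleft()
        order.append(i)
        for j in dependents[i]:
            indegree[j] -= 1
            if indegree[j] == 0:
                ready.append(j)
    return order

# Iteration 2 depends on 0; iteration 3 depends on 1 and 2; 0 and 1 are independent.
print(schedule_iterations(4, {2: {0}, 3: {1, 2}}))
```

In an executor phase, every iteration in the `ready` queue could be dispatched in parallel, which is where the speed-up over serial execution comes from.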
165.
Chi‐Woo Kim, Chang‐Oh Jeong, Jean‐Ho Song, Hyung‐Guel Kim 《Journal of the Society for Information Display》2001,9(3):139-143
Abstract— TFT‐LCD panels for notebook‐PC applications require a thin and light form factor, low power consumption, and good display quality, whereas desktop monitors have different requirements such as large panel size, wide viewing angle, high resolution, and high brightness. For fifth‐generation mass production, current panel technologies have to improve to cope with these requirements. In this article, various approaches to the manufacturing technologies of next‐generation TFT‐LCDs are discussed.
166.
S. Valette, J.M. Chassery, R. Prost 《IEEE transactions on visualization and computer graphics》2008,14(2):369-381
In this paper, we propose a generic framework for 3D surface remeshing. Based on a metric-driven Discrete Voronoi Diagram construction, our output is an optimized 3D triangular mesh with a user-defined vertex budget. Our approach can deal with a wide range of applications, from high-quality mesh generation to shape approximation. By using appropriate metric constraints, the method generates isotropic or anisotropic elements. Based on point sampling, our algorithm combines the robustness and theoretical strength of Delaunay criteria with the efficiency of entirely discrete geometry processing. Beyond the general framework described, we show experimental results using isotropic, quadric-enhanced isotropic, and anisotropic metrics, which prove the efficiency of our method on large meshes at a low computational cost.
167.
Several studies have stressed that even expert operators who are aware of a machine's limits may adopt its proposals without questioning them (the complacency phenomenon). In production scheduling for manufacturing, this is a significant problem, as it is often suggested that the machine build the production schedule, confining the human role to rescheduling. This article evaluates the effect of scheduling-algorithm characteristics on human rescheduling performance, the quality of which was related to complacency. It is suggested that scheduling algorithms be characterized by result comprehensibility (the result matches the scheduler's expectations in terms of the discourse rules of the information display) or algorithm comprehensibility (the complexity of the algorithm hides some important constraints). The findings stress, on the one hand, that result comprehensibility is necessary to achieve good production performance and to limit complacency; on the other hand, algorithm comprehensibility leads to poor performance due to the very high cost of understanding the algorithm. © 2008 Wiley Periodicals, Inc.
168.
J.L. Stigliani, P. Arnaud, T. Delaine, V. Bernardes-Génisson, B. Meunier, J. Bernadou 《Journal of molecular graphics & modelling》2008,27(4):536-545
The front-line antituberculosis drug isoniazid (INH) inhibits InhA, the NADH-dependent fatty acid biosynthesis enoyl-ACP reductase from Mycobacterium tuberculosis, via the formation of covalent adducts with NAD (INH-NAD adducts). While ring tautomers were found to be the main species formed in solution, only the 4S chain INH-NAD tautomer was evidenced in the crystallized InhA:INH-NAD complex. In this study we explored the binding modes and binding energies of the different isomers placed in the active site of InhA with the help of various molecular modelling techniques. Ligand and enzyme models were generated with the Vega ZZ program package. The resulting ligands were then docked individually into the InhA active site using the automated docking package AUTODOCK 3.0.5. The most relevant docked conformations were then used to compute the interaction energy between the ligands and the InhA cavity. The AM1 Hamiltonian and the QM/MM ONIOM methodologies were used and the results compared. The various tautomers docked in almost the same place where INH-NAD binds according to the earlier X-ray crystallographic studies, although some changes in ligand conformation and in ligand–protein interactions were evidenced. The lowest binding energy was observed for the 4S chain adduct, which probably represents the effective active form of the INH-NAD adducts, as compared to the 4R epimer. The two 4S,7R and 4R,7S ring tautomers show intermediate and similar binding energies, contrasting with their different experimental inhibitory potencies on InhA. As a possible explanation based on the calculated conformations, we formulated the hypothesis of an initial binding of the two ring tautomers to InhA, followed by opening of only the hemiamidal 4S,7R ring tautomer (possibly catalyzed by the Tyr158 phenolate basic group) to give the 4S chain INH-NAD tight-binding inhibitor.
Predicting ligand–protein interactions at the molecular level can be of primary importance in elucidating the mechanisms of action of isoniazid and InhA-related resistances, in identifying the effective mycobactericidal entities and, in a further step, in designing a new generation of antitubercular drugs.
169.
Digital microfluidic design and optimization of classic and new fluidic functions for lab on a chip systems (cited 2 times: 2 self-citations, 0 by others)
Yves Fouillet, Dorothée Jary, Claude Chabrol, Patricia Claustre, Christine Peponnet 《Microfluidics and nanofluidics》2008,4(3):159-165
This paper deals with microfluidic studies for lab-on-a-chip development. The first goal was to develop microsystems immediately usable by biologists for integrating complex protocols. All fluid operations are performed on nanoliter droplets handled independently and solely by electrowetting-on-dielectric (EWOD) actuation. A bottom-up architecture was used for chip design, based on developing and validating elementary fluidic designs, which are then assembled. This approach speeds up development and industrialization while minimizing the design effort and simplifying chip-fluidic programming. Dispensing reproducibility for 64 nl droplets achieved a CV below 3%, and mixing time was only a few seconds. Ease of integration was demonstrated by performing four successive 2.8-fold serial dilutions on chip. The second part of this paper concerns the development of new fluidic functions to extend the capabilities of EWOD-actuated digital fluidics. Experiments on particle dispensing by EWOD droplet handling are reported. Finally, work is shown on coupling EWOD actuation with magnetic fields for magnetic bead manipulation.
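The serial-dilution protocol quoted above (2.8-fold, performed four times) follows a simple geometric progression. A small sketch, using an arbitrary starting concentration of 100 units (the paper does not state one):

```python
def serial_dilution(c0, factor, steps):
    """Concentrations after repeated constant-factor dilutions, starting from c0."""
    concentrations = [c0]
    for _ in range(steps):
        concentrations.append(concentrations[-1] / factor)
    return concentrations

# Four successive 2.8-fold dilutions, as in the on-chip demonstration
print(serial_dilution(100.0, 2.8, 4))
```

After four steps the concentration has dropped by a factor of 2.8^4 ≈ 61, which shows why a few droplet operations suffice to span nearly two orders of magnitude.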
170.
Walid Gaaloul, Karim Baïna, Claude Godart 《Service Oriented Computing and Applications》2008,2(2-3):93-110
Web service compositions are becoming more and more complex, involving numerous interacting ad hoc services, which are often implemented as business processes themselves. By analysing such complex web service compositions, one can better understand, control, and eventually re-design them. Our contribution to this problem is a mining algorithm, based on a statistical technique, that discovers composite web service patterns from execution logs. Our approach is characterised by a “local” pattern discovery that builds partial results through a dynamic programming algorithm. The locally discovered patterns are then composed iteratively until the composite Web service is discovered. Analysing the disparities between the discovered model and the initial ad hoc composite model (delta analysis) enables initial design gaps to be detected and thus the initial Web service composition to be re-engineered.
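The idea of discovering local patterns from execution logs can be caricatured as frequency counting over traces. The sketch below (with invented service names, and plain adjacent-pair counting rather than the paper's statistical technique) only illustrates the general shape of such a miner:

```python
from collections import Counter

def local_patterns(logs, min_support):
    """Count adjacent service pairs across execution traces; keep frequent ones.

    A crude stand-in for "local" pattern discovery: frequent pairs could then
    be merged iteratively into larger composite patterns.
    """
    counts = Counter()
    for trace in logs:
        for a, b in zip(trace, trace[1:]):
            counts[(a, b)] += 1
    return {pair: n for pair, n in counts.items() if n >= min_support}

# Hypothetical execution traces of a composite service
logs = [["login", "search", "book"],
        ["login", "search", "pay"],
        ["login", "search", "book"]]
print(local_patterns(logs, min_support=2))
```

A real miner would also have to infer control-flow constructs (sequence, choice, parallelism) from such counts, which is where the dynamic programming composition described in the abstract comes in.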