41.
There are a vast number of complex, interrelated processes influencing urban stormwater quality. However, the lack of measured fundamental variables prevents the construction of process-based models, and hybrid models such as buildup-washoff models are generally crude simplifications of reality. This has created the need for statistical models capable of making use of readily accessible data. In this paper, artificial neural networks (ANNs) were used to predict stormwater quality at urbanized catchments located throughout the United States. Five constituents were analysed: chemical oxygen demand (COD), lead (Pb), suspended solids (SS), total Kjeldahl nitrogen (TKN) and total phosphorus (TP). Multiple linear regression equations were first fitted to logarithmically transformed data. Input variables were selected using a stepwise regression approach combined with process knowledge, and the variables found significant in the regression models were then used to construct ANN models. Other important network parameters, such as learning rate, momentum and the number of hidden nodes, were optimized by trial and error. The final ANN models were then compared with the multiple linear regression models. In summary, the ANN models were generally less accurate than the regression models and more time-consuming to construct, which suggests that ANN models hold no advantage over regression models for predicting urban stormwater quality.
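As a hedged illustration of the pipeline this abstract describes (log-transform, then multiple linear regression), the sketch below fits a log-linear model to synthetic storm-event data. The predictor names (drainage area, rainfall depth) and all coefficients are invented for illustration and are not taken from the paper's dataset.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical storm-event data: drainage area (ha), rainfall depth (mm),
# and an observed constituent load (kg). Purely synthetic.
area = rng.uniform(5, 500, 80)
rain = rng.uniform(2, 60, 80)
load = 0.03 * area**0.8 * rain**1.1 * rng.lognormal(0, 0.2, 80)

# Log-transform, then fit a multiple linear regression:
#   log(load) = b0 + b1*log(area) + b2*log(rain)
X = np.column_stack([np.ones_like(area), np.log(area), np.log(rain)])
y = np.log(load)
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

# Goodness of fit on the log scale
pred = X @ coef
r2 = 1 - np.sum((y - pred) ** 2) / np.sum((y - np.mean(y)) ** 2)
```

With noise this small, the recovered exponents land close to the generating values (0.8 and 1.1), which is the kind of baseline the paper compares the ANN models against.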
42.
Indices are a common and useful way to summarize a changing field for both the lay and the specialist reader, and it's time that we had them for information security.
43.
Some of the current best conformant probabilistic planners focus on finding a fixed-length plan with maximal probability. While these approaches can find optimal solutions, they often do not scale for large problems or plan lengths. As has been shown in classical planning, heuristic search outperforms bounded-length search (especially when an appropriate plan length is not given a priori). The problem with applying heuristic search in probabilistic planning is that effective heuristics are as yet lacking.

In this work, we apply heuristic search to conformant probabilistic planning by adapting planning graph heuristics developed for non-deterministic planning. We evaluate a straightforward application of these planning graph techniques, which amounts to exactly computing a distribution over many relaxed planning graphs (one planning graph for each joint outcome of uncertain actions at each time step). Computing this distribution is costly, so we apply Sequential Monte Carlo (SMC) to approximate it. One important issue that we explore in this work is how to automatically determine the number of samples required for effective heuristic computation. We empirically demonstrate on several domains how our efficient, but sometimes suboptimal, approach enables our planner to solve much larger problems than an existing optimal bounded-length probabilistic planner and still find solutions of reasonable quality.
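The sample-count question this abstract raises can be illustrated with a generic Hoeffding-bound calculation; this is a standard textbook approach, not necessarily the paper's own adaptive scheme, and the function names are ours.

```python
import math
import random

def hoeffding_samples(eps, delta):
    # Number of i.i.d. samples so that the empirical mean of a [0, 1]
    # random variable is within eps of the true mean with probability
    # at least 1 - delta (two-sided Hoeffding inequality).
    return math.ceil(math.log(2 / delta) / (2 * eps**2))

def estimate_goal_probability(simulate_once, eps=0.05, delta=0.05, seed=0):
    # Monte Carlo estimate of the probability that one sampled
    # deterministic relaxation of the uncertain actions reaches the goal.
    # simulate_once(rng) must return True/False for a single sampled run.
    random.seed(seed)
    n = hoeffding_samples(eps, delta)
    hits = sum(simulate_once(random) for _ in range(n))
    return hits / n, n
```

For eps = delta = 0.05 this prescribes 738 samples, which shows why a fixed worst-case bound can be costly and why determining the sample count automatically matters.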
44.
The objective of this paper is to propose an architecture for a CAble TeleVision (CATV) network capable of supporting two-way transmission. This evolution is necessary for the survival of the CATV industry in an era of deregulation and of the development of the B-ISDN by the telecommunications companies. A transactional communication service is then considered, and its performance is analyzed under realistic assumptions.
45.
Ribonucleic acids (RNAs) possess great therapeutic potential and can be used to treat a variety of diseases. The unique biophysical properties of RNAs, such as high molecular weight, negative charge, hydrophilicity, low stability, and potential immunogenicity, require chemical modification and development of carriers to enable intracellular delivery of RNAs for clinical use. A variety of nanomaterials have been developed for the effective in vivo delivery of short/small RNAs, messenger RNAs, and RNAs required for gene editing technologies including clustered regularly interspaced palindromic repeat (CRISPR)/Cas. This review outlines the challenges of delivering RNA therapeutics, explores the chemical synthesis of RNA modifications and carriers, and describes the efforts to design nanomaterials that can be used for a variety of clinical indications.
46.
Meng T, Entezari A, Smith B, Möller T, Weiskopf D, Kirkpatrick AE. IEEE Transactions on Visualization and Computer Graphics, 2011, 17(10):1420-1432
The Body-Centered Cubic (BCC) and Face-Centered Cubic (FCC) lattices have been analytically shown to be more efficient sampling lattices than the traditional Cartesian Cubic (CC) lattice, but there has been no estimate of their visual comparability. Two perceptual studies (each with N = 12 participants) compared the visual quality of images rendered from BCC and FCC lattices to images rendered from the CC lattice. Images were generated from two signals: the commonly used Marschner-Lobb synthetic function and a computed tomography scan of a fish tail. Observers found that BCC and FCC could produce images of visual quality comparable to CC using 30-35 percent fewer samples. For the images used in our studies, the L2 error metric shows high correlation with the judgement of human observers. Using the L2 metric as a proxy, the results of the experiments appear to extend across a wide range of images and parameter choices.
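The L2 error metric used here as a proxy for perceptual quality is straightforward to compute; a minimal sketch (function name ours) is the root-mean-square difference between two equally sized images:

```python
import numpy as np

def l2_error(img_a, img_b):
    # Root-mean-square (L2) difference between two images of equal
    # shape, used as a simple numerical stand-in for visual difference.
    a = np.asarray(img_a, dtype=float)
    b = np.asarray(img_b, dtype=float)
    return float(np.sqrt(np.mean((a - b) ** 2)))
```

A metric like this is cheap to evaluate over many reconstructions, which is what makes it attractive as a proxy once its correlation with human judgement has been established.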
47.
We consider the on-line version of the maximum vertex-disjoint path problem when the underlying network is a tree. In this problem, a sequence of requests arrives in an on-line fashion, where every request is a path in the tree. The on-line algorithm may accept a request only if it does not share a vertex with a previously accepted request. The goal is to maximize the number of accepted requests. It is known that no on-line algorithm can have a competitive ratio better than Ω(log n) for this problem, even if the algorithm is randomized and the tree is simply a line. Obviously, it is desirable to beat the logarithmic lower bound. Adler and Azar (Proc. of the 10th ACM-SIAM Symposium on Discrete Algorithms, pp. 1-10, 1999) showed that if preemption is allowed (namely, previously accepted requests may be discarded, but once a request is discarded it can no longer be accepted), then there is a randomized on-line algorithm that achieves a constant competitive ratio on the line. In the current work we present a randomized on-line algorithm with preemption that has a constant competitive ratio on any tree. Our results carry over to the related problem of maximizing the number of accepted paths subject to a capacity constraint on vertices (in the disjoint path problem this capacity is 1). Moreover, if the available capacity is at least 4, randomization is not needed and our on-line algorithm becomes deterministic.
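For intuition, a toy preemptive acceptance rule on a line (paths become intervals) can be sketched as below. This shrink-factor rule is purely illustrative: it is not the randomized algorithm of this paper or of Adler and Azar, and it makes no competitive-ratio guarantee.

```python
def online_preemptive_line(requests, shrink=0.5):
    # Illustrative preemption rule on a line: a new request [lo, hi)
    # may evict the currently accepted requests it overlaps, but only
    # if it is at most `shrink` times the length of every request it
    # would evict. Evicted (preempted) requests are gone for good.
    accepted = []
    for lo, hi in requests:
        conflicts = [(a, b) for a, b in accepted if a < hi and lo < b]
        if all((hi - lo) <= shrink * (b - a) for a, b in conflicts):
            accepted = [iv for iv in accepted if iv not in conflicts]
            accepted.append((lo, hi))
    return accepted
```

The rule captures the core trade-off of preemption: discarding a long accepted path is worthwhile only when the replacement leaves enough room to accept several shorter ones later.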
48.
Stephen M. Walls, John V. Kucsera, Joshua D. Walker, Taylor W. Acee, Nate K. McVaugh, Daniel H. Robinson. Computers & Education, 2010
Instructors in higher education are disseminating instructional content via podcasting, as many rally behind the technology's potential benefits. Others have expressed concern about the risks of deleterious effects that might accompany the adoption of podcasting, such as lower class attendance. Yet relatively few studies have investigated students' perceptions of podcasting for educational purposes, especially in relation to its two different forms: repetitive and supplemental. The present study explored students' readiness and attitudes towards these two forms of podcasting to provide fundamental information for future researchers and educators. The results indicated that students may not be as ready or eager to use podcasting for repetitive or supplemental educational purposes as we might think, but they could be persuaded.
49.
Daniel M. Batista, Luciano J. Chaves, Nelson L. S. da Fonseca, Artur Ziviani. The Journal of Supercomputing, 2010, 53(1):103-121
Modern large-scale grid computing systems for processing advanced science and engineering applications rely on geographically distributed clusters. In such highly distributed environments, estimating the available bandwidth between clusters is a key issue for efficient task scheduling. We analyze the performance of two well-known available bandwidth estimation tools, pathload and abget, with the aim of using them in grid environments. Unlike previous investigations (Jain et al.; Shriram et al., in Passive and Active Network Measurement: 6th International Workshop, PAM 2005, Springer, Berlin, 2005), our experiments consider a series of relevant metrics such as accuracy of the estimation, convergence time, degree of intrusion in the grid links, and ability to handle multiple simultaneous estimations. No previous work has analyzed the use of available bandwidth tools for the derivation of efficient grid scheduling.
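For a feel of dispersion-based measurement, here is a crude packet-pair capacity sketch (function name ours). Note that pathload and abget actually rely on more sophisticated self-loading probe techniques, so this is only a simplified illustration of the measurement idea, not how either tool works.

```python
def packet_pair_estimate(packet_size_bits, arrival_gaps_s):
    # Crude packet-pair estimate: back-to-back probe packets are spread
    # apart by the bottleneck link, so capacity can be inferred from
    # their arrival dispersion. The median gap damps cross-traffic noise.
    gaps = sorted(arrival_gaps_s)
    median_gap = gaps[len(gaps) // 2]
    return packet_size_bits / median_gap
```

The paper's metrics (accuracy, convergence time, intrusiveness) all hinge on how many such probes a tool must inject before the estimate stabilizes.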
50.
Fuzzy grey relational analysis for software effort estimation
Accurate and credible software effort estimation is a challenge for academic research and the software industry. Among the many software effort estimation models in existence, Estimation by Analogy (EA) is still one of the techniques preferred by software engineering practitioners because it mimics the human problem-solving approach. The accuracy of such a model depends on the characteristics of the dataset, which is subject to considerable uncertainty. The inherent uncertainty in software attribute measurement has a significant impact on estimation accuracy because these attributes are measured based on human judgment and are often vague and imprecise. To overcome this challenge we propose a new formal EA model based on the integration of fuzzy set theory with Grey Relational Analysis (GRA). Fuzzy set theory is employed to reduce uncertainty in the distance measure between two tuples at the k-th continuous feature, |x_o(k) - x_i(k)|. GRA is a problem-solving method used to assess the similarity between two tuples with M features. Since some of these features need not be continuous and may be of nominal or ordinal scale type, aggregating different forms of similarity measures would increase uncertainty in the similarity degree. Thus GRA is mainly used to reduce uncertainty in the distance measure between two software projects for both continuous and categorical features. Both techniques are suitable when the relationship between effort and the other effort drivers is complex. Experimental results showed that the integration of GRA with fuzzy logic produced credible estimates when compared with the results obtained using Case-Based Reasoning, Multiple Linear Regression and Artificial Neural Networks.
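The grey relational similarity this abstract builds on can be sketched as follows. This is a simplified, hedged version (function name ours) that normalizes the per-feature deltas within a single pair; full GRA takes the min/max deltas over all candidate projects, and the paper additionally fuzzifies the distance itself.

```python
def grey_relational_grade(reference, candidate, rho=0.5):
    # Grey relational analysis between a reference tuple (the project
    # to estimate) and one candidate analogue: per-feature absolute
    # deltas |x_o(k) - x_i(k)| are mapped to relational coefficients
    # and averaged. rho is the usual distinguishing coefficient.
    deltas = [abs(r - c) for r, c in zip(reference, candidate)]
    dmin, dmax = min(deltas), max(deltas)
    if dmax == 0:
        return 1.0  # identical tuples: perfect similarity
    coeffs = [(dmin + rho * dmax) / (d + rho * dmax) for d in deltas]
    return sum(coeffs) / len(coeffs)
```

A grade near 1 marks a close analogue; in an EA model the effort of the highest-grade historical projects would drive the estimate.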