3,946 results found (search time: 31 ms)
81.
James G. Phillips Rowan P. Ogeil Alex Blaszczynski 《Behaviour & Information Technology》2015,34(3):239-246
Computer mediation of communication allows interaction with events remote in space or time. However, the uptake and use of video technology requires an understanding of its effects upon willingness to take risks. To understand how responses to remote events are influenced by computer mediation, the present study compared responses to collocated outcomes with those conveyed over a videolink or as pre-recordings. Willingness to risk on an outcome was quantified using wagering behaviour during a simulated game of roulette: measuring preferred outcome format, levels of risk sought, and times required to make decisions. Participants tended to be more confident of winning and preferred the collocated version of roulette. Participants took greater risks with pre-recorded video outcomes and tended to spend more time locating bets. For videolinked outcomes, participants were more cautious, hedging their bets and taking more time deliberating over the odds. Although the amounts wagered did not change, a potential predictability in pre-recordings appears to encourage risk taking, while the reduced presence inherent in real-time videolinks engenders caution.
82.
The application of scanning transmission electron microscopy (STEM) to crystalline defect analysis has been extended to dislocations. The present contribution highlights the use of STEM on two oppositely signed sets of near-screw dislocations in hcp α-Ti with 6 wt% Al in solid solution. In addition to common systematic row diffraction conditions, other configurations such as zone axis and 3g imaging are explored, and appear to be very useful not only for defect analysis, but for general defect observation. It is demonstrated that conventional TEM rules for diffraction contrast such as g·b and g·R are applicable in STEM. Experimental and computational micrographs of dislocations imaged in the aforementioned modes are presented.
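The g·b contrast rule the abstract says carries over to STEM can be stated as a one-line check. The sketch below is generic (function name and tolerance are my own) and covers only the first-order invisibility criterion g·b = 0, not the g·R rule for stacking faults:

```python
def gb_invisible(g, b, tol=1e-9):
    """First-order invisibility criterion: a dislocation is (to first order)
    out of contrast when the diffraction vector g is perpendicular to its
    Burgers vector b, i.e. when g . b == 0."""
    return abs(sum(gi * bi for gi, bi in zip(g, b))) < tol
```

In practice residual contrast (e.g. from the edge component of a near-screw dislocation) means g·b = 0 is a guide rather than a guarantee of invisibility.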
83.
Yooyoung Lee James J. Filliben Ross J. Micheals P. Jonathon Phillips 《Computer Vision and Image Understanding》2013,117(5):532-550
The purpose of this paper is to introduce an effective and structured methodology for carrying out a biometric system sensitivity analysis. The goal of sensitivity analysis is to provide the researcher/developer with insight and understanding of the key factors—algorithmic, subject-based, procedural, image quality, environmental, among others—that affect the matching performance of the biometric system under study. This proposed methodology consists of two steps: (1) the design and execution of orthogonal fractional factorial experiment designs which allow the scientist to efficiently investigate the effect of a large number of factors—and interactions—simultaneously, and (2) the use of a select set of statistical data analysis graphical procedures which are fine-tuned to unambiguously highlight important factors, important interactions, and locally-optimal settings. We illustrate this methodology by application to a study of VASIR (Video-based Automated System for Iris Recognition), the NIST iris-based biometric system. In particular, we investigated k = 8 algorithmic factors from the VASIR system by constructing a 2^(6−1) × 3^1 × 4^1 orthogonal fractional factorial design, generating the corresponding performance data, and applying an appropriate set of analysis graphics to determine the relative importance of the eight factors, the relative importance of the 28 two-term interactions, and the local best settings of the eight algorithms. The results showed that VASIR’s performance was primarily driven by six factors out of the eight, along with four two-term interactions. A virtue of our two-step methodology is that it is systematic and general, and hence may be applied with equal rigor and effectiveness to other biometric systems, such as fingerprints, face, voice, and DNA.
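The half-fraction construction underlying the two-level part of such a design can be sketched as follows. This is the standard textbook construction (the function name is mine, not the authors' code): run a full factorial on the first k − 1 factors and generate the k-th column from their product, so only 2^(k−1) runs are needed. It covers only the 2^(6−1) piece of the paper's mixed 2^(6−1) × 3^1 × 4^1 design:

```python
from itertools import product

def fractional_factorial_2(k):
    """Half-fraction 2^(k-1) two-level design with levels coded -1/+1.
    Enumerates the full factorial on the first k-1 factors and sets the
    k-th column to their product (defining relation I = ABC...K)."""
    runs = []
    for combo in product((-1, 1), repeat=k - 1):
        last = 1
        for v in combo:
            last *= v  # generator column: product of the other columns
        runs.append(combo + (last,))
    return runs
```

For k = 6 this yields 32 runs instead of 64; every column stays balanced, and main effects are confounded only with high-order interactions.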
84.
Peyman Afshani Pankaj K. Agarwal Lars Arge Kasper Green Larsen Jeff M. Phillips 《Theory of Computing Systems》2013,52(3):342-366
Given a set of points with uncertain locations, we consider the problem of computing the probability of each point lying on the skyline, that is, the probability that it is not dominated by any other input point. If each point’s uncertainty is described as a probability distribution over a discrete set of locations, we improve the best known exact solution. We also suggest why we believe our solution might be optimal. Next, we describe simple, near-linear time approximation algorithms for computing the probability of each point lying on the skyline. In addition, some of our methods can be adapted to construct data structures that can efficiently determine the probability of a query point lying on the skyline.
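The problem statement above can be made concrete with a brute-force sketch, not the paper's improved algorithm. Assumptions of mine: points' location distributions are independent, and a minimization convention is used (a point dominates another if it is no larger in every coordinate and strictly smaller in at least one):

```python
def dominates(q, p):
    """q dominates p (minimization convention): q <= p coordinate-wise
    and strictly smaller in at least one coordinate."""
    return all(qi <= pi for qi, pi in zip(q, p)) and \
           any(qi < pi for qi, pi in zip(q, p))

def skyline_probability(points):
    """points: one distribution per point, each a list of ((x, y), prob) pairs.
    Returns, for each point, the probability it lies on the skyline,
    assuming independent point locations.  Brute force: for every possible
    location, multiply the probabilities that no other point dominates it."""
    probs = []
    for i, dist_i in enumerate(points):
        p_skyline = 0.0
        for loc, p_loc in dist_i:
            p_undominated = 1.0
            for j, dist_j in enumerate(points):
                if j == i:
                    continue
                p_dom = sum(pj for loc_j, pj in dist_j if dominates(loc_j, loc))
                p_undominated *= 1.0 - p_dom
            p_skyline += p_loc * p_undominated
        probs.append(p_skyline)
    return probs
```

This runs in roughly O(n²m²) time for n points with m locations each; the point of the paper is to beat exactly this kind of naive bound.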
85.
Mike Phillips 《Digital Creativity》2013,24(2):75-87
This paper explores the limitations of contemporary interface design and offers the potential of more profound forms of interaction by drawing on the rich and much older heritage of interactive art. Whilst HCI design is preoccupied with making the computer simpler to use, installation work, kinetic sculpture, and interactive multimedia art forms have generally been more concerned with the predicament of human/technological negotiation, whilst remaining a salient form of human communication. HCI activity sets out to make the complex systems of computing easy to understand and use, whilst interactive art often uses simple technology to make complex, inspiring and esoteric statements and experiences. In many ways, the simpler and more 'low resolution' the technology, the more immersive, acute and intimate the experience. 'Low resolution' examples such as telephone-sex-lines are explored alongside more immersive systems, such as biofeedback interfaces, and other interactive experiments drawn from the 'technic' strand of art history.
86.
Linda B. Phillips Andrew J. Hansen Curtis H. Flather 《Remote sensing of environment》2008,112(12):4381-4392
Ecosystem energy has been shown to be a strong correlate of biological diversity at continental scales. Early efforts to characterize this association used the normalized difference vegetation index (NDVI) to represent ecosystem energy. While this spectral vegetation index covaries with measures of ecosystem energy such as net primary production, the covariation is known to degrade in areas of very low vegetation or in areas of dense forest. Two of the new vegetation products from the MODIS sensor, derived by integrating spectral reflectance, climate data, and land cover, are thought to better approximate primary productivity than NDVI. In this study, we determine whether the new MODIS-derived measures of primary production, gross primary productivity (GPP) and net primary productivity (NPP), better explain variation in bird richness than the historically used NDVI. Moreover, we evaluate whether the two productivity measures covary more strongly with bird diversity in those vegetation conditions where the limitations of NDVI are well recognized.
Biodiversity was represented as native landbird species richness derived from the North American Breeding Bird Survey (BBS). Analyses included correlation analyses among predictor variables, and univariate regression analyses between each predictor variable and bird species richness. Analyses were done at two levels: for all BBS routes across natural landscapes in North America, and for routes in 10 vegetation classes stratified by vegetated cover along a gradient from bare ground to herbaceous cover to tree cover. We found that NDVI, GPP and NPP were highly correlated and explained similar variation in bird species richness when analyzed for all samples across North America. However, when samples were stratified by vegetated cover, the strength of correlation between NDVI and both productivity measures was low for samples with bare ground and for dense forest. The NDVI also explained substantially less variation in bird species richness than the primary production measures in areas with more bare ground and in areas of dense forest. We conclude that the MODIS productivity measures have higher utility in studies of the relationship between species richness and productivity, and that MODIS GPP and NPP improve on NDVI, especially for studies with large variation in vegetated cover and density.
87.
We consider two fundamental problems in dynamic scheduling: scheduling to meet deadlines in a preemptive multiprocessor setting, and scheduling to provide good response time in a number of scheduling environments. When viewed from the perspective of traditional worst-case analysis, no good on-line algorithms exist for these problems, and for some variants no good off-line algorithms exist unless P = NP. We study these problems using a relaxed notion of competitive analysis, introduced by Kalyanasundaram and Pruhs, in which the on-line algorithm is allowed more resources than the optimal off-line algorithm to which it is compared. Using this approach, we establish that several well-known on-line algorithms that have poor performance from an absolute worst-case perspective are optimal for the problems in question when allowed moderately more resources. For optimization of average flow time, these are the first results of any sort, for any NP-hard version of the problem, that indicate that it might be possible to design good approximation algorithms.
88.
L. Guan I.U. Awan I. Phillips A. Grigg W. Dargie 《Simulation Modelling Practice and Theory》2009,17(3):558-568
The provision of guaranteed QoS for various Internet traffic types has become a challenging problem for researchers. New Internet applications, mostly multimedia-based, require differentiated treatments under certain QoS constraints. Due to a rapid increase in these new services, Internet routers are facing serious traffic congestion problems. This paper presents an approximate analytical performance model in a discrete-time queue, based on closed form expressions using a queue threshold, to control the congestion caused by bursty Internet traffic. The methodology of maximum entropy (ME) has been used to characterize closed form expressions for the state and blocking probabilities. A discrete-time GGeo/GGeo/1/{N1, N2} censored queue with finite capacity, N2, external compound Bernoulli traffic process and generalised geometric transmission times under a first-come-first-served (FCFS) rule and arrival first (AF) buffer management policy has been used for the solution process. To satisfy low delay along with high throughput, a threshold, N1, has been incorporated to slow the arrival process from mean arrival rate λ1 to λ2 once the instantaneous queue length reaches the threshold; otherwise the source operates normally. This creates an implicit feedback from the queue to the arrival process. The system can potentially be used as a model for congestion control based on the Random Early Detection (RED) mechanism. Typical numerical experiments have been included to show the credibility of the ME solution against simulation for various performance measures and to demonstrate the performance evaluation of the proposed analytical model.
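The threshold mechanism described above can be illustrated with a toy discrete-time simulation. This is a deliberate simplification of mine, not the paper's analytical ME model: plain Bernoulli arrivals and geometric service stand in for the compound-Bernoulli/GGeo processes, and all names and parameters are illustrative:

```python
import random

def simulate_threshold_queue(lam1, lam2, mu, N1, N2, steps, seed=0):
    """Single-server finite queue with a congestion threshold, simplified
    from the GGeo/GGeo/1/{N1, N2} model: per-slot arrival probability is
    lam1 while the queue is below the threshold N1 and drops to lam2 once
    it reaches N1; arrivals beyond capacity N2 are blocked; service
    completes each slot with probability mu."""
    rng = random.Random(seed)
    q = blocked = arrived = 0
    for _ in range(steps):
        lam = lam2 if q >= N1 else lam1  # implicit feedback to the source
        # arrival-first (AF) policy: process the arrival before service
        if rng.random() < lam:
            arrived += 1
            if q < N2:
                q += 1
            else:
                blocked += 1
        if q > 0 and rng.random() < mu:
            q -= 1
    return {"blocking_prob": blocked / max(arrived, 1), "final_queue": q}
```

With lam2 = 0 the queue can never climb past N1, which is the low-delay behaviour the threshold is meant to enforce; the analytical model quantifies the same trade-off in closed form.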
89.
Single channel currents were activated by GABA (0.5 to 5 microM) in cell-attached and inside-out patches from cells in the dentate gyrus of rat hippocampal slices. The currents reversed at the chloride equilibrium potential and were blocked by bicuculline (100 microM). Several different kinds of channel were seen: high conductance and low conductance, rectifying and "nonrectifying." Channels had multiple conductance states. The open probability (Po) of channels was greater at depolarized than at hyperpolarized potentials and the relationship between Po and potential could be fitted with a Boltzmann equation with equivalent valency (z) of 1. The combination of outward rectification and potential-dependent open probability gave very little chloride current at hyperpolarized potentials but steeply increasing current with depolarization, useful properties for a tonic inhibitory mechanism.
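The Boltzmann fit mentioned above has a standard form, Po = 1 / (1 + exp(−zF(V − V½)/RT)). The sketch below implements it; the half-activation voltage is a hypothetical parameter of mine, since the abstract reports only the equivalent valency z ≈ 1:

```python
import math

def boltzmann_po(V_mV, V_half_mV, z, T_kelvin=295.0):
    """Open probability vs. membrane potential in Boltzmann form:
    Po = 1 / (1 + exp(-z * F * (V - V_half) / (R * T))).
    V_half_mV (half-activation voltage) is illustrative; the abstract
    gives only z ~ 1."""
    F = 96485.0  # Faraday constant, C/mol
    R = 8.314    # gas constant, J/(mol K)
    V = V_mV * 1e-3
    Vh = V_half_mV * 1e-3
    return 1.0 / (1.0 + math.exp(-z * F * (V - Vh) / (R * T_kelvin)))
```

With z = 1 the curve rises e-fold per ~25 mV near V½, reproducing the reported pattern of higher Po at depolarized potentials.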
90.
Estimates of the scintillation index, fractional fade time, expected number of fades, and mean duration of fade time associated with a propagating Gaussian-beam wave are developed for uplink and downlink laser satellite-communication channels. Estimates for the spot size of the beam at the satellite or the ground or airborne receiver are also provided. Weak-fluctuation theory based on the log-normal model is applicable for intensity fluctuations near the optical axis of the beam provided that the zenith angle is not too large, generally not exceeding 60°. However, there is an increase in scintillations that occurs with increasing pointing error at any zenith angle, particularly for uplink channels. Large off-axis scintillations are of particular significance because they imply that small pointing errors can cause serious degradation in the communication-channel reliability. Off-axis scintillations increase more rapidly for larger-diameter beams and, in some cases, can lead to a radial saturation effect for pointing errors less than 1 μrad off the optical beam axis.
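The fractional fade time under the log-normal weak-fluctuation model mentioned above follows from the standard log-normal fade formula. The sketch below is a generic implementation of that textbook relation, not the paper's specific channel estimates; the function name and dB parameterization are mine:

```python
import math

def fade_probability(fade_threshold_dB, sigma_I2):
    """Fractional fade time under the log-normal weak-fluctuation model:
    probability that the received intensity drops more than F_T dB below
    its mean.  sigma_I2 is the scintillation index; the log-irradiance
    variance is sigma_ln^2 = ln(1 + sigma_I2)."""
    sigma_ln2 = math.log(1.0 + sigma_I2)
    # threshold intensity relative to the mean, converted from the dB fade level
    ln_ratio = -(math.log(10.0) / 10.0) * fade_threshold_dB
    z = (ln_ratio + 0.5 * sigma_ln2) / math.sqrt(2.0 * sigma_ln2)
    return 0.5 * (1.0 + math.erf(z))
```

Deeper fade thresholds give exponentially smaller fade probabilities; the pointing-error effects discussed in the abstract enter through an increased effective sigma_I2 off-axis.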