Full-text availability
Paid full text | 529 |
Free | 12 |
Subject category (number of papers)
Electrical engineering | 5 |
Chemical industry | 86 |
Machinery & instrumentation | 3 |
Building science | 9 |
Energy & power engineering | 19 |
Light industry | 21 |
Hydraulic engineering | 4 |
Radio & electronics | 91 |
General industrial technology | 58 |
Metallurgical industry | 12 |
Atomic energy technology | 1 |
Automation technology | 232 |
Publication year (number of papers)
2024 | 1 |
2023 | 3 |
2022 | 4 |
2021 | 8 |
2020 | 3 |
2019 | 6 |
2018 | 12 |
2017 | 7 |
2016 | 22 |
2015 | 14 |
2014 | 26 |
2013 | 31 |
2012 | 35 |
2011 | 48 |
2010 | 40 |
2009 | 41 |
2008 | 37 |
2007 | 31 |
2006 | 31 |
2005 | 18 |
2004 | 14 |
2003 | 17 |
2002 | 18 |
2001 | 9 |
2000 | 7 |
1999 | 6 |
1998 | 8 |
1997 | 5 |
1996 | 2 |
1995 | 1 |
1994 | 3 |
1993 | 2 |
1992 | 2 |
1990 | 2 |
1989 | 7 |
1987 | 2 |
1986 | 1 |
1985 | 2 |
1984 | 3 |
1983 | 1 |
1981 | 2 |
1979 | 3 |
1978 | 1 |
1977 | 2 |
1975 | 1 |
1973 | 2 |
541 results found; search took 15 ms
11.
Albert Angel, Nick Koudas, Nikos Sarkas, Divesh Srivastava, Michael Svendsen, Srikanta Tirthapura 《The VLDB Journal: The International Journal on Very Large Data Bases》2014,23(2):175-199
Recent years have witnessed an unprecedented proliferation of social media. People around the globe author, every day, millions of blog posts, social network status updates, etc. This rich stream of information can be used to identify, on an ongoing basis, emerging stories and events that capture popular attention. Stories can be identified via groups of tightly coupled real-world entities, namely the people, locations, products, etc., that are involved in the story. The sheer scale and rapid evolution of the data involved necessitate highly efficient techniques for identifying important stories at every point in time. The main challenge in real-time story identification is the maintenance of dense subgraphs (corresponding to groups of tightly coupled entities) under streaming edge weight updates (resulting from a stream of user-generated content). This is the first work to study the efficient maintenance of dense subgraphs under such streaming edge weight updates. For a wide range of definitions of density, we derive theoretical results regarding the magnitude of change that a single edge weight update can cause. Based on these, we propose a novel algorithm, DynDens, which outperforms adaptations of existing techniques to this setting and yields meaningful, intuitive results. Our approach is validated by a thorough experimental evaluation on large-scale real and synthetic datasets.
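The core quantity in the abstract above can be made concrete. This is not the paper's DynDens algorithm, only a toy sketch of the bookkeeping it motivates: tracking the average-degree density of an entity group and seeing that a single edge-weight update of magnitude delta shifts that density by exactly delta / |group|. The entity names and weights are invented for illustration.

```python
# Toy sketch (not DynDens): average-degree density of an entity group
# under a single streaming edge-weight update.

def density(weights, group):
    """Average-degree density: total internal edge weight / group size."""
    total = sum(w for (u, v), w in weights.items()
                if u in group and v in group)
    return total / len(group)

def apply_edge_update(weights, group, edge, delta):
    """Apply one streaming edge-weight update; return (old, new) density.
    An update on an internal edge shifts density by exactly delta/|group|,
    which bounds how far any group's density can drift per update."""
    old = density(weights, group)
    weights[edge] = weights.get(edge, 0.0) + delta
    new = density(weights, group)
    return old, new

# Hypothetical co-occurrence weights between story entities.
weights = {("obama", "washington"): 3.0, ("obama", "congress"): 1.0}
group = {"obama", "washington", "congress"}
old, new = apply_edge_update(weights, group, ("obama", "washington"), 2.0)
```

The bound on per-update density change is what makes incremental maintenance cheaper than recomputation from scratch.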
12.
13.
14.
Frederik Verbist, Nikos Deligiannis, Marc Jacobs, Joeri Barbarien, Peter Schelkens, Adrian Munteanu, Jan Cornelis 《Multimedia Tools and Applications》2013,66(3):405-430
Distributed video coding (DVC) constitutes an original coding framework to meet the stringent requirements imposed by uplink-oriented and low-power mobile video applications. The quality of the side information available to the decoder and the efficiency of the employed channel codes are primary factors determining the success of a DVC system. This contribution introduces two novel techniques for probabilistic motion compensation in order to generate side information at the Wyner-Ziv decoder. The employed DVC scheme uses a base layer, serving as a hash to facilitate overlapped block motion estimation at the decoder side. On top of the base layer, a supplementary Wyner-Ziv layer is coded in the DCT domain. Both proposed probabilistic motion compensation techniques are driven by the actual correlation channel statistics and reuse information contained in the hash. Experimental results report significant rate savings achieved by the novel side information generation methods compared to previous techniques. Moreover, the presented DVC architecture, featuring the proposed side-information generation techniques, delivers state-of-the-art compression performance.
15.
Thomas Bernecker, Tobias Emrich, Hans-Peter Kriegel, Nikos Mamoulis, Matthias Renz, Shiming Zhang, Andreas Züfle 《GeoInformatica》2013,17(3):449-487
Traditional spatial queries return, for a given query object q, all database objects that satisfy a given predicate, such as epsilon range and k-nearest neighbors. This paper defines and studies inverse spatial queries, which, given a subset of database objects Q and a query predicate, return all objects which, if used as query objects with the predicate, contain Q in their result. We first show a straightforward solution for answering inverse spatial queries for any query predicate. Then, we propose a filter-and-refinement framework that can be used to improve efficiency. We show how to apply this framework on a variety of inverse queries, using appropriate space pruning strategies. In particular, we propose solutions for inverse epsilon range queries, inverse k-nearest neighbor queries, and inverse skyline queries. Furthermore, we show how to relax the definition of inverse queries in order to ensure non-empty result sets. Our experiments show that our framework is significantly more efficient than naive approaches.
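The "straightforward solution" the abstract mentions can be sketched for the inverse epsilon-range case: an object o qualifies iff every member of Q lies within distance epsilon of o, i.e. Q would appear in o's own epsilon-range result. This is only the naive baseline (no filter-and-refinement, no pruning), with invented 2D points:

```python
# Naive baseline for inverse epsilon-range queries: scan the whole
# database and keep each object whose epsilon-range result would
# contain every member of Q.
import math

def inverse_eps_range(database, Q, eps):
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    return [o for o in database
            if all(dist(o, q) <= eps for q in Q)]

# Illustrative data: (5, 5) is too far from Q to qualify.
db = [(0.0, 0.0), (1.0, 0.0), (5.0, 5.0)]
Q = [(0.0, 0.0), (1.0, 0.0)]
result = inverse_eps_range(db, Q, eps=1.5)
```

The paper's contribution is precisely avoiding this full scan via space-pruning filters; the baseline is what those filters are measured against.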
16.
Nikos D. Lagaros, Manolis Papadrakakis 《Computer Methods in Applied Mechanics and Engineering》2008,198(1):28-41
Performance-Based Design (PBD) methodologies are the contemporary trend in designing better and more economical earthquake-resistant structures, where the main objective is to achieve more predictable and reliable levels of safety and operability against natural hazards. On the other hand, reliability-based optimization (RBO) methods directly account for the variability of the design parameters in the formulation of the optimization problem. The objective of this work is to incorporate PBD methodologies under seismic loading into the framework of RBO, in conjunction with innovative tools for treating computationally intensive problems of real-world structural systems. Two types of random variables are considered: those which influence the level of seismic demand and those that affect the structural capacity. Reliability analysis is required for the assessment of the probabilistic constraints within the RBO formulation. The Monte Carlo Simulation (MCS) method is considered the most reliable method for estimating the probabilities of exceedance or other statistical quantities, albeit, in many cases, with excessive computational cost. First- or Second-Order Reliability Methods (FORM, SORM) constitute alternative approaches which require an explicit limit-state function; this type of limit-state function is not available for complex problems. In this study, in order to find the most efficient methodology for performing reliability analysis in conjunction with performance-based optimum design under seismic loading, a Neural Network approximation of the limit-state function is proposed and combined with either MCS or FORM approaches for handling the uncertainties. These two methodologies are applied in RBO problems with sizing and topology design variables, resulting in a two-orders-of-magnitude reduction of the computational effort.
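The MCS step the abstract describes is easy to illustrate. Below is a minimal sketch, not the paper's method: it estimates a failure probability P[g(X) < 0] for a toy closed-form limit-state function g = capacity - demand, with invented Gaussian capacity and demand variables. In the paper, evaluating g requires a structural analysis, which is why a neural-network surrogate for g pays off.

```python
# Plain Monte Carlo estimation of a probability of exceedance
# for a toy limit-state function g = capacity - demand.
import random

def limit_state(capacity, demand):
    # Failure when demand exceeds capacity (g < 0).
    return capacity - demand

def mcs_failure_probability(n_samples, seed=0):
    rng = random.Random(seed)
    failures = 0
    for _ in range(n_samples):
        capacity = rng.gauss(10.0, 1.0)   # hypothetical capacity variable
        demand = rng.gauss(7.0, 1.5)      # hypothetical demand variable
        if limit_state(capacity, demand) < 0:
            failures += 1
    return failures / n_samples

p_f = mcs_failure_probability(100_000)
```

With these toy distributions the exact answer is about 0.05; MCS converges to it at a rate independent of problem dimension, which is why it is "most reliable" yet expensive when each sample is a full structural analysis.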
17.
We consider how doping can be described in terms of the charge-transfer insulator concept. We discuss and compare a few models for the band structure for the doped charges. This has led us to the conclusion that the band structure stability problem is one of the main issues in any correspondence between results for the t-J model and, say, the three-band model for the slightly doped layered oxides. The stability criterion is formulated and its implications discussed. Provided a phenomenological conduction band is chosen to satisfy the criterion of stability, a detailed picture of how dopants influence the spin wave spectrum at T = 0 is presented. The basic physics for the destruction of the antiferromagnetic (AF) long-range order is rather model-independent: the long-range order (at T = 0) disappears due to the Cerenkov effect when the Fermi velocity first exceeds the spin wave velocity. We then discuss the overall spectrum of spin excitations and see that the spin wave attenuation for x < x_c (T = 0) due to Landau damping appears in the range of magnon momenta k(x) = 2m*s ± x. We also argue that in the presence of superconductivity, the Cerenkov effect is eliminated due to the gap in the spectrum. This may restore the role of the AF fluctuations as the main source of dissipation at the lowest temperatures. A brief discussion of how interaction with magnons may affect the hole spectrum concludes the paper.
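The Cerenkov-type argument in the abstract above reduces to a single velocity comparison, which can be written compactly (notation assumed here: $v_F$ for the Fermi velocity of the doped carriers, $c_s$ for the spin-wave velocity):

```latex
% Cerenkov-type criterion for the destruction of AF long-range order at T = 0:
% magnons are Landau-damped once carriers outrun them.
v_F(x) > c_s
\quad\Longrightarrow\quad
\text{spin waves Landau-damped, AF long-range order destroyed.}
```

A superconducting gap removes the low-energy particle-hole excitations that do the damping, which is why the abstract notes the effect is eliminated in the superconducting state.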
18.
Nikos Nikolaou, Michael Makridis, Basilis Gatos, Nikolaos Stamatopoulos, Nikos Papamarkos 《Image and vision computing》2010
In this paper, we strive towards the development of efficient techniques to segment document pages resulting from the digitization of historical machine-printed sources. Documents of this kind often suffer from low quality and local skew, show several degradations due to the old printing matrix quality or ink diffusion, and exhibit complex and dense layouts. To face these problems, we introduce the following innovative aspects: (i) use of a novel Adaptive Run Length Smoothing Algorithm (ARLSA) to face the problem of complex and dense document layout, (ii) detection of noisy areas and punctuation marks that are usual in historical machine-printed documents, (iii) detection of possible obstacles formed from background areas in order to separate neighboring text columns or text lines, and (iv) use of skeleton segmentation paths in order to isolate possibly connected characters. Comparative experiments using several historical machine-printed documents prove the efficiency of the proposed technique.
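For readers unfamiliar with run-length smoothing: below is a sketch of the classic, non-adaptive horizontal RLSA that the paper's ARLSA extends (ARLSA itself adapts the threshold to local content, which this sketch does not do). In a binary row, background runs of 0s shorter than a threshold are filled with 1s, merging nearby characters into word or line blobs.

```python
# Classic horizontal run-length smoothing on one binary row
# (1 = foreground ink, 0 = background). Interior background runs
# shorter than `threshold` are filled with foreground.

def rlsa_row(row, threshold):
    out = list(row)
    n = len(row)
    i = 0
    while i < n:
        if row[i] == 0:
            j = i
            while j < n and row[j] == 0:
                j += 1
            # Fill only interior gaps (bounded by ink on both sides)
            # that are shorter than the threshold.
            if i > 0 and j < n and (j - i) < threshold:
                for k in range(i, j):
                    out[k] = 1
            i = j
        else:
            i += 1
    return out

# The 2-pixel gap merges; the 4-pixel gap survives as a separator.
smoothed = rlsa_row([1, 0, 0, 1, 0, 0, 0, 0, 1], threshold=3)
```

On a full page this is applied row-wise (and column-wise with a second threshold), after which connected components give candidate text blocks.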
19.
Felix Bießmann, Frank C. Meinecke, Arthur Gretton, Alexander Rauch, Gregor Rainer, Nikos K. Logothetis, Klaus-Robert Müller 《Machine Learning》2010,79(1-2):5-27
Data recorded from multiple sources sometimes exhibit non-instantaneous couplings. For simple data sets, cross-correlograms may reveal the coupling dynamics. But when dealing with high-dimensional multivariate data there is no such measure as the cross-correlogram. We propose a simple algorithm based on kernel Canonical Correlation Analysis (kCCA), temporal kernel CCA (tkCCA), that computes a multivariate temporal filter which links one data modality to another one. The filters can be used to compute a multivariate extension of the cross-correlogram, the canonical correlogram, between data sources that have different dimensionalities and temporal resolutions. The canonical correlogram reflects the coupling dynamics between the two sources. The temporal filter reveals which features in the data give rise to these couplings and when they do so. We present results from simulations and neuroscientific experiments showing that tkCCA yields easily interpretable temporal filters and correlograms. In the experiments, we simultaneously performed electrode recordings and functional magnetic resonance imaging (fMRI) in the primary visual cortex of the non-human primate. While electrode recordings reflect brain activity directly, fMRI provides only an indirect view of neural activity via the Blood Oxygen Level Dependent (BOLD) response. Thus it is crucial for our understanding and the interpretation of fMRI signals in general to relate them to direct measures of neural activity acquired with electrodes. The results computed by tkCCA confirm recent models of the hemodynamic response to neural activity and allow for a more detailed analysis of neurovascular coupling dynamics.
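The univariate cross-correlogram that tkCCA generalises is simple to state: the Pearson correlation between x(t) and y(t + lag), swept over a range of lags. The sketch below is pure Python with invented signals; the canonical correlogram replaces the scalar correlation at each lag with a canonical correlation between multivariate signals.

```python
# 1-D cross-correlogram: correlation between x(t) and y(t + lag)
# for each lag in [-max_lag, max_lag].

def pearson(a, b):
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sum((x - ma) ** 2 for x in a)
    vb = sum((y - mb) ** 2 for y in b)
    return cov / (va * vb) ** 0.5

def cross_correlogram(x, y, max_lag):
    corr = {}
    for lag in range(-max_lag, max_lag + 1):
        if lag >= 0:
            a, b = x[:len(x) - lag], y[lag:]
        else:
            a, b = x[-lag:], y[:len(y) + lag]
        corr[lag] = pearson(a, b)
    return corr

# y is x delayed by two samples, so the correlogram peaks at lag +2,
# mimicking a delayed coupling such as the BOLD response to spiking.
x = [0, 1, 0, 0, 2, 0, 0, 1, 0, 0]
y = [0, 0] + x[:-2]
cc = cross_correlogram(x, y, max_lag=3)
```

The lag at which the correlogram peaks estimates the coupling delay; tkCCA additionally learns which dimensions of each modality carry that coupling.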
20.
Christos D. Antonopoulos, Filip Blagojevic, Andrey N. Chernikov, Nikos P. Chrisochoides, Dimitrios S. Nikolopoulos 《Journal of Parallel and Distributed Computing》2009
This article focuses on the optimization of PCDM, a parallel, two-dimensional (2D) Delaunay mesh generation application, and its interaction with parallel architectures based on simultaneous multithreading (SMT) processors. We first present the step-by-step effect of a series of optimizations on performance. These optimizations improve the performance of PCDM by up to a factor of six. They target issues that very often limit the performance of scientific computing codes. We then evaluate the interaction of PCDM with a real SMT-based SMP system, using both high-level metrics, such as execution time, and low-level information from hardware performance counters.