981.
This paper contrasts two methods for verifying the timing constraints of real-time applications. Static analysis predicts the worst-case and best-case execution times of a task's code by analyzing execution paths and simulating processor characteristics, without ever executing the program or requiring program inputs. Evolutionary testing is an iterative testing procedure that approximates the extreme execution times over several generations: by executing the test object dynamically and measuring its execution times, the search is guided toward inputs that yield gradually tighter estimates of the extreme execution times. We examined both approaches on a number of real-world examples. The results show that static analysis and evolutionary testing are complementary methods which together provide upper and lower bounds for both worst-case and best-case execution times.
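As a rough illustration of the evolutionary side, the sketch below (entirely our own construction, not the paper's tool) evolves inputs toward the worst case of a small instrumented task. The task and all parameters are hypothetical, and a comparison count stands in for measured execution time so the example is deterministic:

```python
import random

def task_cost(inputs):
    # Hypothetical "task under test": insertion sort.  We count comparisons
    # as a deterministic proxy for execution time; real evolutionary testing
    # would measure wall-clock time on the target hardware instead.
    data = list(inputs)
    cost = 0
    for i in range(1, len(data)):
        j = i
        while j > 0 and data[j - 1] > data[j]:
            data[j - 1], data[j] = data[j], data[j - 1]
            j -= 1
            cost += 1
        cost += 1  # the comparison that ends the inner loop
    return cost

def evolve_worst_case(n=16, pop_size=20, generations=40, seed=0):
    # Simple (mu + lambda) evolutionary search: fitness = execution cost,
    # so selection pressure drives the population toward slow inputs.
    rng = random.Random(seed)
    pop = [[rng.randint(0, 99) for _ in range(n)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=task_cost, reverse=True)      # fittest = slowest
        survivors = pop[:pop_size // 2]
        children = []
        for parent in survivors:
            child = parent[:]
            child[rng.randrange(n)] = rng.randint(0, 99)  # point mutation
            children.append(child)
        pop = survivors + children
    return max(task_cost(ind) for ind in pop)
```

For insertion sort on 16 elements the true worst case costs 135 comparisons (a reverse-sorted input); the evolved estimate approaches this bound from below, which is exactly the "lower bound on the worst case" role the abstract assigns to dynamic testing.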
982.
Mischak H, Apweiler R, Banks RE, Conaway M, Coon J, Dominiczak A, Ehrich JH, Fliser D, Girolami M, Hermjakob H, Hochstrasser D, Jankowski J, Julian BA, Kolch W, Massy ZA, Neusuess C, Novak J, Peter K, Rossing K, Schanstra J, Semmes OJ, Theodorescu D, Thongboonkerd V, Weissinger EM, Van Eyk JE, Yamamoto T. Proteomics – Clinical Applications 2007, 1(2):148–156
983.
We show that, for arbitrary positive integers a_1, …, a_n, with high probability the gcd of two linear combinations of these integers with rather small random integer coefficients coincides with gcd(a_1, …, a_n). This naturally leads to a probabilistic algorithm that, with high probability, computes the gcd of several integers via just one gcd of two numbers of about the same size as the initial data (namely, the above linear combinations). The algorithm can be repeated to achieve any desired confidence level.
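A minimal sketch of how such an algorithm could look (function name, coefficient bound, and the divisibility certification are our own additions, not the paper's exact procedure):

```python
import math
import random

def probabilistic_gcd(numbers, trials=8, coeff_bound=1000, seed=1):
    # Sketch of the idea above: gcd(sum x_i * a_i, sum y_i * a_i) equals
    # gcd(a_1, ..., a_n) with high probability when the x_i, y_i are
    # small random integers.
    rng = random.Random(seed)
    candidate = 0
    for _ in range(trials):
        u = sum(rng.randint(1, coeff_bound) * a for a in numbers)
        v = sum(rng.randint(1, coeff_bound) * a for a in numbers)
        # Every candidate is a multiple of the true gcd (the gcd divides
        # any linear combination), so folding candidates together can
        # only move us closer to the true answer.
        candidate = math.gcd(candidate, math.gcd(u, v))
        if all(a % candidate == 0 for a in numbers):
            return candidate  # certified: divides all inputs, so it IS the gcd
    return candidate  # still a multiple of the gcd; repeat to boost confidence
```

The certification check turns the sketch into a Las Vegas variant of the repetition the abstract describes: a candidate that is a multiple of the gcd and also divides every input must equal the gcd.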
984.
Böttger J, Balzer M, Deussen O. IEEE Transactions on Visualization and Computer Graphics 2006, 12(5):845–852
Commonly known detail-in-context techniques for two-dimensional Euclidean space enlarge details and shrink their context using mapping functions that introduce geometric compression. This makes it difficult or even impossible to recognize shapes when magnification factors differ greatly. In this paper, we propose using the complex logarithm and complex root functions to show very small details even within very large contexts. These mappings are conformal, meaning they only rotate and scale locally, thus keeping shapes intact and recognizable. They allow details that are orders of magnitude smaller than their surroundings to be shown together with their context in one seamless visualization. We address the use of this universal technique for interacting with complex two-dimensional data, considering the exploration of large graphs and other examples.
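The core mapping is easy to sketch. The toy function below (our own illustration, not the authors' implementation) applies w = log(z) around a focus point: radial distance from the focus becomes the real coordinate, so magnification falls off logarithmically while angles, and hence local shapes, are preserved:

```python
import cmath

def complex_log_zoom(points, focus, eps=1e-9):
    # Conformal detail-in-context map: translate the focus to the origin,
    # then apply w = log(z).  A point at distance r from the focus lands at
    # real coordinate log(r), so detail near the focus is expanded hugely
    # while far-away context is compressed -- without shearing shapes.
    out = []
    for p in points:
        z = complex(*p) - complex(*focus)
        if abs(z) < eps:
            z = complex(eps, 0.0)   # avoid log(0) exactly at the focus
        w = cmath.log(z)
        out.append((w.real, w.imag))
    return out
```

For example, with the focus at the origin, a point at distance 1 maps to real coordinate 0 and a point at distance e maps to 1, while a point on the positive imaginary axis keeps its angle pi/2 as the imaginary coordinate.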
985.
Georgii J, Westermann R. IEEE Transactions on Visualization and Computer Graphics 2006, 12(5):1345–1352
Recent advances in algorithms and graphics hardware have opened the possibility of rendering tetrahedral grids at interactive rates on commodity PCs. This paper extends that work by presenting a direct volume rendering method for such grids that supports both current and upcoming graphics hardware architectures, large and deformable grids, and different rendering options. At the core of our method is the idea of sampling tetrahedral elements along the view rays entirely in local barycentric coordinates. Sampling then requires minimal GPU memory and texture access operations, and it maps efficiently onto a feed-forward pipeline of multiple stages performing computation and geometry construction. We propose spawning rendered elements from one single vertex. This makes the method amenable to upcoming Direct3D 10 graphics hardware, which allows geometry to be created on the GPU. With only slight modifications, the algorithm can be used to render per-pixel iso-surfaces and to perform tetrahedral cell projection. As our method requires neither pre-processing nor an intermediate grid representation, it can efficiently handle dynamic and large 3D meshes.
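The barycentric machinery itself is straightforward to illustrate on the CPU. The sketch below (our own, not the paper's GPU code) computes the barycentric weights of a sample point in one tetrahedron via Cramer's rule and uses them for the standard inside test that sampling along a ray relies on:

```python
def barycentric(p, tet):
    # Barycentric coordinates (w0, w1, w2, w3) of 3-D point p in the
    # tetrahedron tet (four vertices): solve p = sum w_i * v_i with
    # sum w_i = 1, using Cramer's rule on the edge-vector system.
    v0, v1, v2, v3 = tet
    def sub(a, b):
        return (a[0] - b[0], a[1] - b[1], a[2] - b[2])
    def det3(a, b, c):
        return (a[0] * (b[1] * c[2] - b[2] * c[1])
              - a[1] * (b[0] * c[2] - b[2] * c[0])
              + a[2] * (b[0] * c[1] - b[1] * c[0]))
    e1, e2, e3 = sub(v1, v0), sub(v2, v0), sub(v3, v0)
    d = det3(e1, e2, e3)          # signed volume (times 6) of the cell
    rp = sub(p, v0)
    w1 = det3(rp, e2, e3) / d
    w2 = det3(e1, rp, e3) / d
    w3 = det3(e1, e2, rp) / d
    return (1.0 - w1 - w2 - w3, w1, w2, w3)

def inside(p, tet, eps=1e-12):
    # A sample lies in the cell iff every barycentric weight is >= 0,
    # which is why ray marching in these coordinates is so cheap.
    return all(w >= -eps for w in barycentric(p, tet))
```

Along a view ray the weights vary linearly, so once the entry and exit weights of a cell are known, intermediate samples are pure interpolation, needing no further geometry lookups.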
986.
This paper introduces a uniform statistical framework for both 3-D and 2-D object recognition using intensity images as input data. The theoretical part provides a mathematical tool for stochastic modeling; the algorithmic part introduces methods for automatic model generation, localization, and recognition of objects. 2-D images are used to learn the statistical appearance of 3-D objects; both the depth information and the matching between image and model features are missing for model generation. The resulting incomplete-data estimation problem is solved by the Expectation-Maximization (EM) algorithm. This leads to a novel class of algorithms for automatic model generation from projections. The estimation of pose parameters corresponds to a non-linear maximum likelihood estimation problem, which is solved by a global optimization procedure. Classification is done by the Bayesian decision rule. This work includes an experimental evaluation of the various facets of the presented approach. An empirical evaluation of learning algorithms and a comparison of different pose estimation algorithms show the feasibility of the proposed probabilistic framework.
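The incomplete-data idea can be illustrated with the textbook one-dimensional case. The sketch below (a generic EM for a two-component Gaussian mixture, not the authors' model-generation code) treats each sample's component assignment as the missing data, exactly the role played by the unknown feature matching above:

```python
import math

def em_two_gaussians(data, iters=50):
    # E-step: soft-assign each sample to the two components (the missing
    # information).  M-step: maximum-likelihood refit of means, variances
    # and mixture weights from those soft counts.
    mu = [min(data), max(data)]     # crude but deterministic initialization
    var = [1.0, 1.0]
    pi = [0.5, 0.5]
    for _ in range(iters):
        resp = []
        for x in data:
            p = [pi[k] * math.exp(-(x - mu[k]) ** 2 / (2 * var[k]))
                 / math.sqrt(2 * math.pi * var[k]) for k in range(2)]
            s = p[0] + p[1]
            resp.append([p[0] / s, p[1] / s])
        for k in range(2):
            nk = sum(r[k] for r in resp)
            mu[k] = sum(r[k] * x for r, x in zip(resp, data)) / nk
            var[k] = sum(r[k] * (x - mu[k]) ** 2
                         for r, x in zip(resp, data)) / nk + 1e-6
            pi[k] = nk / len(data)
    return mu, var, pi
```

Each iteration provably does not decrease the data likelihood, which is what makes EM attractive when part of the data (here the assignments, in the paper the depth and the matching) is unobserved.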
987.
We study a resource allocation problem where jobs have the following characteristic: each job consumes some quantity of a bounded resource during a certain time interval and yields a given profit. We aim to select a subset of jobs with maximal total profit such that the total resource consumed at any point in time remains bounded by the given resource capacity. While this case is trivially NP-hard in general and polynomially solvable for uniform resource consumptions, our main result shows NP-hardness for the case of general resource consumptions but uniform profit values, i.e., for maximizing the number of performed jobs. This result applies even for proper time intervals. We also give a deterministic (1/2−ε)-approximation algorithm for the general problem on proper intervals, improving upon the previously known 1/3 ratio for general intervals. Finally, we study the worst-case performance ratio of a simple greedy algorithm.
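One plausible greedy rule for this problem (our own sketch; the abstract does not specify which greedy the paper analyzes) selects jobs by profit per unit of resource-time and keeps a job only if the running load stays within capacity throughout its interval:

```python
def greedy_select(jobs, capacity):
    # Each job is (start, end, demand, profit).  Sort by profit density
    # (profit per unit of resource-time), then accept greedily whenever
    # the load stays within capacity at every point of the job's interval.
    chosen = []

    def load_at(t):
        return sum(d for (s, e, d, _p) in chosen if s <= t < e)

    order = sorted(jobs,
                   key=lambda j: j[3] / (j[2] * (j[1] - j[0])),
                   reverse=True)
    for (s, e, d, p) in order:
        # The load is piecewise constant and can only rise at interval
        # starts, so checking s and the starts of accepted jobs inside
        # [s, e) is sufficient.
        points = [s] + [j[0] for j in chosen if s <= j[0] < e]
        if all(load_at(t) + d <= capacity for t in points):
            chosen.append((s, e, d, p))
    return chosen
```

Greedy rules of this kind are exactly where bad worst-case ratios come from: a single high-density job can block many compatible low-density jobs whose combined profit is larger, which is the phenomenon a worst-case analysis has to bound.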
988.
Peter Holtz, Joachim Kimmerle, Ulrike Cress. International Journal of Computer-Supported Collaborative Learning 2018, 13(4):439–456
The advent of the social web brought with it challenges and opportunities for research on learning and knowledge construction. Using the online encyclopedia Wikipedia as an example, we discuss several methods that can be applied to analyze the dynamic nature of knowledge-related processes in mass collaboration environments. These methods can help in the analysis of the interactions between the two levels that are relevant in computer-supported collaborative learning (CSCL) research: the individual level of learners and the collective level of the group or community. In line with constructivist theories of learning, we argue that the development of knowledge on both levels is triggered by productive friction, that is, the prolific resolution of socio-cognitive conflicts. By describing three prototypical methods that have been used in previous Wikipedia research, we review how these techniques can be used to examine the dynamics on both levels and analyze how these dynamics can be predicted by the amount of productive friction. We illustrate how these studies make use of text classifiers, social network analysis, and cluster analysis in order to operationalize the theoretical concepts. We conclude by discussing implications for the analysis of dynamic knowledge processes from a learning sciences perspective.
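As a hint of how the social network analysis side is operationalized, the sketch below (our own toy example; thread structure and editor names are hypothetical) builds a co-participation network from discussion threads and reports each editor's degree, a crude first indicator of community roles:

```python
from collections import defaultdict
from itertools import combinations

def cotalk_network(talk_threads):
    # Undirected co-participation network: two editors are linked when
    # they posted in the same talk-page thread.  The returned degree of
    # each editor counts their distinct discussion partners.
    neighbors = defaultdict(set)
    for thread in talk_threads:
        for a, b in combinations(sorted(set(thread)), 2):
            neighbors[a].add(b)
            neighbors[b].add(a)
    return {editor: len(neigh) for editor, neigh in neighbors.items()}
```

Studies like those reviewed above would typically feed such a network into richer centrality or clustering measures; degree is merely the simplest starting point.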
989.
Joachim Kimmerle, Ansgar Thiel, Kim-Kristin Gerbing, Martina Bientzle, Iassen Halatchliyski, Ulrike Cress. Computers in Human Behavior 2013
We present an empirical analysis of a web forum in which followers of a health-related community exchange information and opinions in order to pass on and develop relevant knowledge. We analyzed how knowledge construction takes place in such a community, which represents an outsider position not accepted by majority society. For this purpose we applied the Community of Practice (CoP) concept as a guideline for our analysis and found that many well-known activities of CoPs held true for the Urkost community. The social network analysis findings also supported interpreting this community as a CoP. However, we also found that this community had a variety of structural characteristics that the CoP literature deals with insufficiently. We identified the structure of goals, roles, and communication as relevant features typical of this outsider CoP. For example, the attitude of the core members towards people of a ‘different faith’ was characterized by strong hostility and rejection. These features provided an effective basis for the development and maintenance of a shared identity in the community. Our findings are discussed against the background of the need for further development of the CoP concept.
990.