Full-text access type
Paid full text | 2181 articles |
Free | 33 articles |
Subject category
Electrical engineering | 24 articles |
General | 2 articles |
Chemical industry | 136 articles |
Metalworking | 17 articles |
Machinery and instruments | 29 articles |
Building science | 28 articles |
Mining engineering | 1 article |
Energy and power | 37 articles |
Light industry | 87 articles |
Hydraulic engineering | 7 articles |
Oil and natural gas | 1 article |
Radio and electronics | 102 articles |
General industrial technology | 163 articles |
Metallurgical industry | 1431 articles |
Nuclear technology | 33 articles |
Automation | 116 articles |
Publication year
2022 | 6 articles |
2021 | 8 articles |
2019 | 9 articles |
2018 | 22 articles |
2017 | 13 articles |
2016 | 20 articles |
2015 | 19 articles |
2014 | 27 articles |
2013 | 62 articles |
2012 | 47 articles |
2011 | 52 articles |
2010 | 36 articles |
2009 | 43 articles |
2008 | 42 articles |
2007 | 54 articles |
2006 | 26 articles |
2005 | 32 articles |
2004 | 18 articles |
2003 | 15 articles |
2002 | 17 articles |
2001 | 14 articles |
2000 | 19 articles |
1999 | 70 articles |
1998 | 463 articles |
1997 | 257 articles |
1996 | 148 articles |
1995 | 96 articles |
1994 | 95 articles |
1993 | 87 articles |
1992 | 19 articles |
1991 | 24 articles |
1990 | 28 articles |
1989 | 28 articles |
1988 | 25 articles |
1987 | 20 articles |
1986 | 24 articles |
1985 | 26 articles |
1984 | 7 articles |
1983 | 12 articles |
1982 | 14 articles |
1981 | 11 articles |
1980 | 15 articles |
1978 | 10 articles |
1977 | 31 articles |
1976 | 56 articles |
1975 | 4 articles |
1972 | 3 articles |
1967 | 3 articles |
1955 | 3 articles |
1954 | 3 articles |
Sort by: 2214 results found (search time: 15 ms)
51.
Localization of sound sources by human listeners has been widely studied, and various theories and models of the localization and hearing mechanism have been constructed. In the classical "duplex" theory, sound localization in azimuth is explained by interaural time (or, equivalently, phase) differences at low frequencies and by interaural amplitude differences at higher frequencies. Head-related transfer functions (HRTFs) offer a linear-system approach to modeling localization by representing the direction-dependent transformation the sound undergoes at each ear. Localization in elevation is explained by directional differences in the HRTFs, which also account for monaural localization. We conjecture that the HRTFs that evolved in nature (through the evolution of the shape and structure of the ear, etc.) are optimal with respect to several physically realizable criteria. In this paper, we investigate the problem of defining the design constraints that, when optimized, yield a set of HRTFs for hearing and monaural vertical localization, in an attempt to better understand and, if possible, duplicate nature's design. We take an engineer's design perspective and formulate a constrained optimization problem in which the desired set of HRTFs is optimized according to a cost function based on several criteria for localization, hearing, and smoothness, while imposing physically realizable constraints on the HRTFs such as nonnegativity and energy. The value of the cost function for a candidate set of HRTFs indicates how similar that set is to the ideal solution (measured HRTF data). The final optimization results we present are similar to the actual HRTFs measured in human subjects, and the associated cost function values are found to be almost equal, which suggests that the defined optimization criteria are quite relevant.
The significant outcome of this research is the identification of a relevant set of mathematical criteria that could be optimized in the human auditory system to facilitate good hearing and localization. These criteria, along with the associated constraints, represent the desirable characteristics of the HRTFs in an HRTF-based localization system and could lead to a better understanding and modeling of the auditory system.
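The constrained optimization described above can be illustrated with a toy sketch: a projected-gradient loop that trades a (hypothetical) flat target response against a smoothness penalty while enforcing the nonnegativity and unit-energy constraints the abstract mentions. The cost terms and parameters here are illustrative assumptions, not the paper's actual formulation.

```python
import numpy as np

def optimize_response(h0, n_iter=500, lr=0.01, lam=1.0):
    """Projected gradient descent on a toy HRTF-like magnitude response.

    cost = ||h - target||^2 + lam * ||diff(h)||^2  (deviation + smoothness),
    projected after every step onto h >= 0 (nonnegativity) and ||h|| = 1
    (unit energy). The flat 'target' term is an illustrative assumption.
    """
    h = h0 / np.linalg.norm(h0)
    target = np.ones_like(h)
    for _ in range(n_iter):
        grad = 2.0 * (h - target)
        d = np.diff(h)
        grad[:-1] -= 2.0 * lam * d   # d/dh_i of sum_i (h_{i+1} - h_i)^2
        grad[1:] += 2.0 * lam * d
        h = h - lr * grad
        h = np.maximum(h, 0.0)       # nonnegativity constraint
        h = h / np.linalg.norm(h)    # unit-energy constraint
    return h
```

The interesting part, as in the abstract, is not the optimizer but the cost: candidate responses are scored by how well they balance the competing criteria under physically realizable constraints.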
52.
53.
Zheng Z, Ahmed N, Mueller K. IEEE Transactions on Visualization and Computer Graphics, 2011, 17(12): 1959-1968
The unguided visual exploration of volumetric data can be both a challenging and a time-consuming undertaking. Identifying a set of favorable vantage points at which to start exploratory expeditions can greatly reduce this effort and can also ensure that no important structures are being missed. Recent research efforts have focused on entropy-based viewpoint selection criteria that depend on scalar values describing the structures of interest. In contrast, we propose a viewpoint suggestion pipeline that is based on feature-clustering in high-dimensional space. We use gradient/normal variation as a metric to identify interesting local events and then cluster these via k-means to detect important salient composite features. Next, we compute the maximum possible exposure of these composite features for different viewpoints and calculate a 2D entropy map, parameterized in longitude and latitude, to point out promising view orientations. Superimposed onto an interactive track-ball interface, this entropy map lets users quickly navigate to potentially interesting viewpoints, where visibility-based transfer functions can be employed to generate volume renderings that minimize occlusions. To give full exploration freedom to the user, the entropy map is updated on the fly whenever a view has been selected, pointing to new and promising but so far unseen view directions. Alternatively, our system can use a set-cover optimization algorithm to provide a minimal set of views needed to observe all features. The views so generated can then be saved into a list for further inspection or into a gallery for a summary presentation.
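As a rough illustration of the entropy-map idea, the sketch below scores view directions on a longitude/latitude grid by the Shannon entropy of per-feature exposure. Exposure is crudely approximated here by the clipped dot product of the view direction with each feature's normal; this is a stand-in assumption, since the paper computes actual visibility of k-means feature clusters.

```python
import numpy as np

def view_entropy_map(normals, n_lon=36, n_lat=19):
    """Score each (longitude, latitude) view direction by the Shannon
    entropy of the normalized per-feature exposure distribution.
    Exposure ~ clip(normal . view_dir, 0) -- a toy visibility proxy
    with no occlusion test."""
    lons = np.linspace(0.0, 2.0 * np.pi, n_lon, endpoint=False)
    lats = np.linspace(-np.pi / 2, np.pi / 2, n_lat)
    ent = np.zeros((n_lat, n_lon))
    for i, la in enumerate(lats):
        for j, lo in enumerate(lons):
            v = np.array([np.cos(la) * np.cos(lo),
                          np.cos(la) * np.sin(lo),
                          np.sin(la)])
            expo = np.clip(normals @ v, 0.0, None)
            total = expo.sum()
            if total == 0.0:
                continue           # no feature exposed from this view
            p = expo / total
            p = p[p > 0]
            ent[i, j] = -(p * np.log2(p)).sum()
    return ent
```

High-entropy cells mark views that expose many features evenly, matching the abstract's use of the map to suggest promising view orientations.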
54.
Julia Mueller, Katja Hutter, Johann Fueller, Kurt Matzler. Information Systems Journal, 2011, 21(6): 479-501
Virtual worlds, electronic environments where individuals can interact in a realistic manner in the form of avatars, are increasingly used by gamers, consumers and employees. They therefore provide opportunities for reinventing business processes. In particular, effective knowledge management (KM) requires the use of appropriate information and communication technology (ICT) as well as social interaction. Emerging virtual worlds enable new ways to support knowledge and knowing processes because these virtual environments accommodate the social aspects that are necessary for knowledge creation and knowledge sharing, so that collaboration in virtual worlds resembles real‐life activities. In this paper, we shed light on the use of Second Life (SL) as a KM platform in a real‐life setting. To explore the potential and current usage of virtual worlds for knowledge and knowing activities, we conducted a qualitative study at IBM. We interviewed IBM employees belonging to a special workgroup called ‘Web 2.0/virtual worlds’ who have experience in generating and exchanging knowledge by virtually collaborating and interacting. Our results show that virtual worlds, if they can overcome problems with platform stability, the user interface and security, bear the potential to serve as a KM platform. They facilitate global and simultaneous interaction, create a common context for collaboration, combine different tools for communication and enhance knowledge and knowing processes.
55.
Embedded control systems with hard real-time constraints require that deadlines are met at all times, or the system may malfunction with potentially catastrophic consequences. Schedulability theory can assure deadlines for a given task set when periods and worst-case execution times (WCETs) of tasks are known. While periods are generally derived from the problem specification, a task's code needs to be statically analyzed to derive safe and tight bounds on its WCET. Such static timing analysis abstracts from program input and considers loop bounds and architectural features, such as pipelining and caching. However, unpredictability due to dynamic memory (DRAM) refresh cannot be accounted for by such analysis, which limits its applicability to systems with static memory (SRAM). In this paper, we assess the impact of DRAM refresh on task execution times and demonstrate how predictability is adversely affected, leading to unsafe hard real-time system design. We subsequently contribute a novel and effective approach to overcome this problem through software-initiated DRAM refresh. We develop (1) a pure software and (2) a hybrid hardware/software refresh scheme. Both schemes provide predictable timings and fully replace the classical hardware auto-refresh. We discuss implementation details based on this design for multiple concrete embedded platforms and experimentally assess the benefits of the different schemes on these platforms. We further formalize the integration of variable-latency memory references into a data-flow framework suitable for static timing analysis to bound a task's memory latencies with regard to its WCET. The resulting predictable execution behavior in the presence of DRAM refresh, combined with the additional benefit of reduced access delays, is unprecedented, to the best of our knowledge.
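To make the scheduling angle concrete, the sketch below treats software-initiated refresh as one more periodic real-time task and applies the classical rate-monotonic utilization bound. The DRAM parameters (8192 rows, 64 ms retention window, 0.35 µs per-row refresh) are typical textbook values assumed for illustration, not figures from the paper.

```python
def refresh_task(rows=8192, retention_ms=64.0, t_ref_us=0.35, bursts=64):
    """Model software-initiated refresh as a periodic task: all rows must
    be refreshed within the retention window, split into 'bursts' jobs.
    Returns (period_ms, wcet_ms) of the resulting refresh task."""
    rows_per_burst = rows // bursts
    period_ms = retention_ms / bursts
    wcet_ms = rows_per_burst * t_ref_us / 1000.0
    return period_ms, wcet_ms

def rm_schedulable(tasks):
    """Liu & Layland sufficient rate-monotonic test for a list of
    (period, wcet) pairs: U <= n * (2^(1/n) - 1)."""
    n = len(tasks)
    u = sum(c / t for (t, c) in tasks)
    return u <= n * (2.0 ** (1.0 / n) - 1.0)
```

The point, as in the paper's approach, is that once refresh becomes an explicit task with a known period and WCET, ordinary schedulability theory can account for it instead of treating refresh as unpredictable interference.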
56.
57.
The radiation budget at the earth's surface is an essential climate variable for climate monitoring and analysis as well as for verification of climate model output and reanalysis data. Accurate solar surface irradiance data are a prerequisite for an accurate estimation of the radiation budget and for efficient planning and operation of solar energy systems. This paper describes a new approach for the retrieval of the solar surface irradiance from satellite data. The method is based on radiative transfer modelling and enables the use of extended information about the atmospheric state. Accurate analysis of the interaction between the atmosphere, surface albedo, transmission and the top-of-atmosphere albedo has been the basis for the new method, characterised by a combination of parameterisations and "eigenvector" look-up tables. The method combines high computing performance with high accuracy. The performed validation shows that the mean absolute deviation is of the same magnitude as the confidence level of the BSRN (Baseline Surface Radiation Network) ground-based measurements and significantly lower than the CM-SAF (Climate Monitoring Satellite Application Facility) target accuracy of 10 W/m2. The mean absolute difference between monthly means of ground measurements and satellite-based solar surface irradiance is 5 W/m2, with a mean bias deviation of -1 W/m2 and an RMSD (Root Mean Square Deviation) of 5.4 W/m2 for the investigated European sites. The results for the investigated African sites, obtained by comparing instantaneous values, are also encouraging. The mean absolute difference of 2.8% is even lower than that for the European sites (3.9%), but the mean bias deviation of -1.1% is slightly higher than that for the European sites (0.8%). Validation results over the ocean in the Mediterranean Sea using shipboard data complete the validation; the mean bias there is -3.6 W/m2, or 2.3%.
The slightly higher mean bias deviation over ocean results at least partly from inherent differences due to the movement of the ship (shadowing, allocation of satellite pixels). The validation results demonstrate that the high accuracy of the surface solar irradiance holds in different climate regions. The discussed method also has the potential to improve the treatment of radiation processes in climate and Numerical Weather Prediction (NWP) models, because of its high accuracy combined with high computing speed.
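The validation statistics quoted above (mean bias deviation, mean absolute deviation, RMSD) are standard and easy to reproduce; a minimal sketch, with hypothetical input arrays:

```python
import numpy as np

def validation_stats(satellite, ground):
    """Return (mean bias deviation, mean absolute deviation, RMSD)
    between satellite-derived and ground-measured irradiance, in the
    same units as the inputs (e.g. W/m^2)."""
    d = np.asarray(satellite, float) - np.asarray(ground, float)
    mbd = d.mean()                    # signed bias: satellite - ground
    mad = np.abs(d).mean()            # magnitude of typical deviation
    rmsd = np.sqrt((d ** 2).mean())   # penalizes large outliers
    return mbd, mad, rmsd
```

Reporting both MBD and MAD, as the abstract does, separates a systematic offset from the typical size of the deviations.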
58.
This paper contributes a comprehensive study of a framework to bound worst-case instruction cache performance for caches with arbitrary levels of associativity. The framework is formally introduced, operationally described, and its correctness is shown. Results of incorporating instruction cache predictions within pipeline simulation show that timing predictions for set-associative caches remain just as tight as predictions for direct-mapped caches. The low cache simulation overhead allows interactive use of the analysis tool and scales well with increasing associativity. The approach taken is based on a data-flow specification of the problem and provides another step toward worst-case execution time prediction for contemporary architectures and its use in schedulability analysis for hard real-time systems.
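The hit/miss behavior that such a framework bounds statically can be mimicked, for a straight-line address trace and a direct-mapped cache, by the toy simulation below. This is an illustrative stand-in for intuition only, not the paper's data-flow analysis, which classifies accesses (e.g. always-hit, always-miss, first-miss) without enumerating traces.

```python
def classify_accesses(trace, n_lines=4, line_size=16):
    """Simulate a direct-mapped instruction cache over an address trace
    and label each fetch 'hit' or 'miss'. n_lines and line_size are
    hypothetical cache parameters."""
    cache = [None] * n_lines          # tag (block number) stored per line
    labels = []
    for addr in trace:
        block = addr // line_size     # which memory block the fetch is in
        idx = block % n_lines         # direct-mapped: one candidate line
        if cache[idx] == block:
            labels.append('hit')
        else:
            cache[idx] = block        # fill on miss
            labels.append('miss')
    return labels
```

A static analysis, unlike this simulation, must produce a classification that is safe over *all* executions, which is where the data-flow formulation comes in.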
59.
Giesen J, Mueller K, Schuberth E, Wang L, Zolliker P. IEEE Transactions on Visualization and Computer Graphics, 2007, 13(6): 1664-1671
Visualization algorithms can have a large number of parameters, making the space of possible rendering results rather high-dimensional. Only a systematic analysis of the perceived quality can truly reveal the optimal setting for each such parameter. However, an exhaustive search in which all possible parameter permutations are presented to each user within a study group would be infeasible to conduct, and possible parameter co-dependencies add further complications. Here, we introduce an efficient user study design and analysis strategy that is geared to cope with this problem. The user feedback is fast and easy to obtain and does not require exhaustive parameter testing. To enable such a framework, we have modified a preference-measuring methodology, conjoint analysis, which originated in psychology and is now also widely used in market research. We demonstrate our framework with a study that measures the perceived quality in volume rendering within the context of large parameter spaces.
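At its core, conjoint analysis decomposes overall preference ratings into per-parameter "part-worth" utilities. The sketch below does this by ordinary least squares on a small designed experiment; it is a simplification of the methodology the paper adapts, and all names and data are hypothetical.

```python
import numpy as np

def part_worths(design, ratings):
    """OLS estimate of per-parameter part-worth utilities plus an
    intercept. Rows of 'design' are tested renderings; columns are
    0/1 indicators for rendering-parameter levels."""
    X = np.column_stack([np.asarray(design, float),
                         np.ones(len(ratings))])   # intercept column
    coef, *_ = np.linalg.lstsq(X, np.asarray(ratings, float), rcond=None)
    return coef[:-1], coef[-1]                     # (part-worths, intercept)
```

With a fractional-factorial design, far fewer rated renderings than full enumeration suffice to estimate each parameter's contribution, which is exactly the economy the abstract is after.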
60.