Full-text access type
Paid full text | 1,148 articles |
Free | 52 articles |
Free (domestic) | 3 articles |
Subject classification
Electrical engineering | 10 articles |
General | 3 articles |
Chemical industry | 281 articles |
Metalworking | 17 articles |
Machinery and instruments | 13 articles |
Building science | 66 articles |
Mining engineering | 1 article |
Energy and power | 19 articles |
Light industry | 108 articles |
Hydraulic engineering | 5 articles |
Petroleum and natural gas | 3 articles |
Radio electronics | 103 articles |
General industrial technology | 196 articles |
Metallurgical industry | 181 articles |
Atomic energy technology | 6 articles |
Automation technology | 191 articles |
Publication year
2023 | 4 articles |
2022 | 16 articles |
2021 | 37 articles |
2020 | 12 articles |
2019 | 13 articles |
2018 | 23 articles |
2017 | 26 articles |
2016 | 25 articles |
2015 | 31 articles |
2014 | 48 articles |
2013 | 89 articles |
2012 | 60 articles |
2011 | 80 articles |
2010 | 59 articles |
2009 | 59 articles |
2008 | 67 articles |
2007 | 63 articles |
2006 | 53 articles |
2005 | 25 articles |
2004 | 38 articles |
2003 | 37 articles |
2002 | 32 articles |
2001 | 19 articles |
2000 | 17 articles |
1999 | 19 articles |
1998 | 21 articles |
1997 | 8 articles |
1996 | 20 articles |
1995 | 17 articles |
1994 | 17 articles |
1993 | 11 articles |
1992 | 6 articles |
1991 | 9 articles |
1990 | 9 articles |
1989 | 7 articles |
1988 | 10 articles |
1987 | 11 articles |
1986 | 4 articles |
1985 | 9 articles |
1984 | 9 articles |
1983 | 6 articles |
1982 | 4 articles |
1981 | 6 articles |
1980 | 8 articles |
1978 | 7 articles |
1977 | 4 articles |
1975 | 10 articles |
1973 | 3 articles |
1961 | 3 articles |
1956 | 3 articles |
Sort by: 1,203 results found (search time: 15 ms)
61.
Laurence Boxer, Russ Miller, Andrew Rau-Chaplin 《Journal of Parallel and Distributed Computing》1999,58(3):477
This paper considers a variety of geometric pattern recognition problems on input sets of size n using a coarse grained multicomputer model consisting of p processors with Ω(n/p) local memory each (i.e., Ω(n/p) memory cells of Θ(log n) bits apiece), where the processors are connected to an arbitrary interconnection network. It introduces efficient scalable parallel algorithms for a number of geometric problems including the rectangle finding problem, the maximal equally spaced collinear points problem, and the point set pattern matching problem. All of the algorithms presented are scalable in that they are applicable and efficient over a very wide range of ratios of problem size to number of processors. In addition to the practicality imparted by scalability, these algorithms are easy to implement in that all required communications can be achieved by a small number of calls to standard global routing operations.
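One of the problems the abstract names, maximal equally spaced collinear points, can be illustrated with a naive sequential sketch. This is only a hypothetical brute-force illustration of the problem statement; the paper itself gives scalable parallel algorithms for it.

```python
# Find maximal runs p, p+d, p+2d, ... contained in a point set.
# Brute-force O(n^2) over starting pairs; illustrative only.
def maximal_equally_spaced_runs(points, min_len=3):
    pts = set(points)
    runs = set()
    for p in pts:
        for q in pts:
            if p == q:
                continue
            d = (q[0] - p[0], q[1] - p[1])
            # only start a run at its first element: p - d must not be a point
            if (p[0] - d[0], p[1] - d[1]) in pts:
                continue
            run = [p]
            nxt = q
            while nxt in pts:
                run.append(nxt)
                nxt = (nxt[0] + d[0], nxt[1] + d[1])
            if len(run) >= min_len:
                # sort to deduplicate the two traversal directions of a run
                runs.add(tuple(sorted(run)))
    return runs

example = [(0, 0), (1, 1), (2, 2), (3, 3), (5, 0)]
print(maximal_equally_spaced_runs(example))
```

The sort-based deduplication collapses the forward and backward traversals of the same run into one canonical tuple.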
62.
63.
Masaji Tanaka, Laurence Anthony, Toshiaki Kaneeda, Junji Hirooka 《Computer aided design》2004,36(8):723-734
Although solid models play a central role in modern CAD systems, 2D CAD systems are still commonly used for designing products without complex curved faces. Therefore, an important task is to convert 2D drawings to solid models, and this is usually carried out manually even in present CAD systems. Many methods have been proposed to automatically convert orthographic part drawings of solid objects to solid models. Unfortunately, products are usually drawn as 2D assembly drawings, and therefore, these methods cannot be applied. A further problem is the difficult and time-consuming task of decomposing 2D assembly drawings into 2D part drawings. In previous work, the authors proposed a method to automatically decompose 2D assembly drawings into 3D part drawings, from which 2D part drawings can be easily generated. However, one problem with the proposed method was that the number of solutions could easily explode if the 2D assembly drawings became complex. Building on this work, here we describe a new method to automatically convert 2D assembly drawings to 3D part drawings, generating a unique solution for designers regardless of the complexity of the original 2D assembly drawings. The only requirement for the approach is that the assembly drawings consist of standard parts such as bars and plates. In 2D assembly drawings, the dimensions, part numbers and parts lists are usually drawn, and the proposed method utilizes these to obtain a unique solution.
64.
Several recent papers have adapted notions of geometric topology to the emerging field of digital topology. An important notion is that of digital homotopy. In this paper, we study a variety of digitally-continuous functions that preserve homotopy types or homotopy-related properties such as the digital fundamental group.
Laurence Boxer is Professor of Computer and Information Sciences at Niagara University, and Research Professor of Computer Science and Engineering at the State University of New York at Buffalo. He received his Ph.D. in Mathematics from the University of Illinois at Urbana-Champaign. His research interests are computational geometry, parallel algorithms, and digital topology. Dr. Boxer is co-author, with Russ Miller, of Algorithms Sequential and Parallel, A Unified Approach, a recent textbook published by Prentice Hall.
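The notion of digital continuity mentioned in the abstract can be made concrete: under 4-adjacency in Z², a function between digital images is digitally continuous when every pair of adjacent domain points maps to equal or adjacent image points. A minimal illustrative check follows; the dict-based representation and example maps are hypothetical, not taken from the paper.

```python
def adjacent4(p, q):
    # 4-adjacency in Z^2: the points differ by 1 in exactly one coordinate
    return abs(p[0] - q[0]) + abs(p[1] - q[1]) == 1

def is_digitally_continuous(f, domain):
    # f (a dict point -> point) is digitally continuous if every 4-adjacent
    # pair of domain points maps to an equal or 4-adjacent pair of images
    for p in domain:
        for q in domain:
            if adjacent4(p, q):
                if f[p] != f[q] and not adjacent4(f[p], f[q]):
                    return False
    return True

# collapsing two adjacent points onto one point is continuous...
f = {(0, 0): (0, 0), (1, 0): (0, 0), (2, 0): (1, 0)}
# ...but tearing adjacent points apart is not
g = {(0, 0): (0, 0), (1, 0): (2, 0), (2, 0): (3, 0)}
```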
65.
Wooded hedgerows do not cover large areas but perform many functions that are beneficial to water quality and biodiversity. A broad range of remotely sensed data is available to map these small linear elements in rural landscapes, but only a few of them have been evaluated for this purpose. In this study, we evaluate and compare various optical remote-sensing data including high and very high spatial resolution, active and passive, and airborne and satellite data to produce quantitative information on the hedgerow network structure and to analyse qualitative information from the maps produced in order to estimate the true value of these maps. We used an object-based image analysis that proved to be efficient for detecting and mapping thin elements in complex landscapes. The analysis was performed at two scales, the hedgerow network scale and the tree canopy scale, on a study site that shows a strong landscape gradient of wooded hedgerow density. The results (1) highlight the key role of spectral resolution on the detection and mapping of wooded elements with remotely sensed data; (2) underline the fact that every satellite image provides relevant information on wooded network structures, even in closed landscape units, whatever the spatial resolution; and (3) indicate that light detection and ranging data offer important insights into future strategies for monitoring hedgerows.
66.
Deqing Zou, Wenrong Zhang, Weizhong Qiang, Guofu Xiang, Laurence Tianruo Yang, Hai Jin, Kan Hu 《Future Generation Computer Systems》2013,29(8):2092-2102
Virtualization is a pillar technology in cloud computing for multiplexing computing resources on a single cloud platform for multiple cloud tenants. Monitoring the behavior of virtual machines (VMs) on a cloud platform is a critical requirement for cloud tenants. Existing monitoring mechanisms on virtualized platforms either take a complete VM as the monitoring granularity, such that they cannot capture malicious behaviors within individual VMs, or they focus on specific monitoring functions that cannot be used for heterogeneous VMs concurrently running on a single cloud node. Furthermore, existing monitoring mechanisms assume that the privileged domain can be trusted to act as expected, which raises security concerns for cloud tenants because the privileged domain may not, in fact, behave as they expect. We design a trusted monitoring framework that provides a chain of trust excluding the untrusted privileged domain, by deploying an independent guest domain for monitoring and by using trusted computing technology to ensure the integrity of the monitoring environment. Moreover, the framework also provides fine-grained and general monitoring. We have implemented the proposed monitoring framework on Xen and integrated it into OpenNebula. Our experimental results show that it offers the expected functionality and introduces only moderate performance overhead.
67.
This study uses a hostage negotiation setting to demonstrate how a team of strategic police officers can utilize specific coping strategies to minimize uncertainty at different stages of their decision-making in order to foster resilient decision-making to effectively manage a high-risk critical incident. The presented model extends the existing research on coping with uncertainty by (1) applying the RAWFS heuristic (Lipshitz and Strauss in Organ Behav Human Decis Process 69:149–163, 1997) of individual decision-making under uncertainty to a team critical incident decision-making domain; (2) testing the use of various coping strategies during “in situ” team decision-making by using a live simulated hostage negotiation exercise; and (3) including an additional coping strategy (“reflection-in-action”; Schön in The reflective practitioner: how professionals think in action. Temple Smith, London, 1983) that aids naturalistic team decision-making. The data for this study were derived from a videoed strategic command meeting held within a simulated live hostage training event; these video data were coded along three themes: (1) decision phase; (2) uncertainty management strategy; and (3) decision implemented or omitted. Results illustrate that, when assessing dynamic and high-risk situations, teams of police officers cope with uncertainty by relying on “reduction” strategies to seek additional information and iteratively update these assessments using “reflection-in-action” (Schön 1983) based on previous experience. They subsequently progress to a plan formulation phase and use “assumption-based reasoning” techniques in order to mentally simulate their intended courses of action (Klein et al. 2007), and identify a preferred formulated strategy through “weighing the pros and cons” of each option. 
In the unlikely event that uncertainty persists to the plan execution phase, it is managed by “reduction” in the form of relying on plans and standard operating procedures or by “forestalling” and intentionally deferring the decision while contingency planning for worst-case scenarios.
68.
Jiao Feng, Naixue Xiong, Laurence T. Yang, Yan Yang 《Multimedia Tools and Applications》2012,56(2):227-243
With the advent of the Next Generation Network (NGN), services that were previously delivered over multiple specific network-centric architectures can converge on a single network. NGN provides AAA (Anytime, Anywhere and Always-on) access to users from different service providers, with consistent and ubiquitous provision of services as necessary. This special issue on NGN covers pervasive, grid, and peer-to-peer computing for providing computing and communication services anytime and anywhere. Applications of NGN include digital image processing, multimedia systems and services, and so on. Here we focus on digital image processing technology in NGN environments. Low-contrast structure and heavy noise can be found in many kinds of digital images in NGN environments, which makes the images vague and uncertain, especially in X-ray images. As a result, some useful tiny characteristics are weakened to the point that they are difficult to distinguish even with the naked eye. Based on the combination of a non-linear gradient-contrast operator and multi-resolution wavelet analysis, an image enhancement algorithm for these useful tiny characteristics is presented. The algorithm can enhance the tiny characteristics while limiting noise amplification. Analysis of the results shows that local regions of the image are enhanced adaptively, using the concept of gradient contrast to make the image clearer. Experiments were conducted on real pictures, and the results show that the algorithm is flexible and convenient.
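The gradient-contrast operator is not specified in the abstract; as a stand-in, a generic local-contrast stretch conveys the core idea of amplifying deviation from a local mean. This NumPy sketch is illustrative only and is not the paper's algorithm.

```python
import numpy as np

def local_contrast_enhance(img, gain=1.5):
    # pad with edge values and compute a 3x3 local mean with pure numpy
    p = np.pad(img.astype(float), 1, mode="edge")
    h, w = img.shape
    local_mean = sum(p[i:i + h, j:j + w]
                     for i in range(3) for j in range(3)) / 9.0
    # amplify each pixel's deviation from its local mean (crude contrast boost),
    # then clip back to the valid 8-bit intensity range
    out = local_mean + gain * (img - local_mean)
    return np.clip(out, 0, 255)
```

Flat regions are left unchanged (zero deviation from the local mean), while intensity edges are steepened by the gain factor.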
69.
Laurence C. Breaker, Edward M. Armstrong, Charles A. Endris 《Remote sensing of environment》2010,114(2):345-362
This study strives to establish an objective basis for image compositing in satellite oceanography. Image compositing is a powerful technique for cloud filtering that often emphasizes cloud clearing at the expense of obtaining synoptic coverage. Although incomplete cloud removal in image compositing is readily apparent, the loss of synopticity, often, is not. Consequently, the primary goal of image compositing should be to obtain the greatest amount of cloud-free coverage or clarity in a period short enough that synopticity, to a significant degree, is preserved. To illustrate the process of image compositing and the problems associated with it, we selected a region off the coast of California and constructed two 16-day image composites, one, during the spring, and the second, during the summer of 2006, using Advanced Very High Resolution Radiometer (AVHRR) InfraRed (IR) satellite imagery. Based on the results of cloud clearing for these two 16-day sequences, rapid cloud clearing occurred up to day 4 or 5, followed by much slower cloud clearing out to day 16, suggesting an explicit basis for the growth in cloud clearing. By day 16, the cloud clearing had, in most cases, exceeded 95%. Based on these results, a shorter compositing period could have been employed without a significant loss in clarity. A method for establishing an objective basis for selecting the period for image compositing is illustrated using observed data. The loss in synopticity, which, in principle, could be estimated from pattern correlations between the images in the composite, was estimated from a separate time series of SST since the loss of synopticity, in our approach, is only a function of time. The autocorrelation function of the detrended residuals provided the decorrelation time scale and the basis for the decay process, which, together, define the loss of synopticity.
The results show that (1) the loss of synopticity and the gain in clarity are inversely related, (2) an objective basis for selecting a compositing period corresponds to the day number where the decay and growth curves for synopticity and clarity intersect, and (3), in this case, the point of intersection occurred 3.2 days into the compositing period. By applying simple mathematics it was shown that the intersection time for the loss in synopticity and the growth in clarity is directly proportional to the initial conditions required to specify the clarity at the beginning of the compositing period, and inversely proportional to the sum of the rates of growth for clarity and the loss in synopticity. Finally, we consider these results to be preliminary in nature, and, as a result, hope that future work will bring forth significant improvements in the approach outlined in this study.
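The abstract's closing relationship, an intersection time directly proportional to the initial clarity gap and inversely proportional to the sum of the two rates, can be sketched with a simple linearized model. The rates below are hypothetical, chosen only to reproduce the reported 3.2-day figure, and are not the paper's fitted values.

```python
def compositing_period(c0, clarity_rate, synopticity_rate):
    # Linearized model: clarity C(t) = c0 + g*t grows while
    # synopticity S(t) = 1 - r*t decays. Setting C(t*) = S(t*) gives
    #   t* = (1 - c0) / (g + r)
    # i.e. proportional to the initial clarity gap (1 - c0) and inversely
    # proportional to the sum of the growth and decay rates, as the
    # abstract states. Rates here are illustrative placeholders.
    return (1.0 - c0) / (clarity_rate + synopticity_rate)

print(compositing_period(0.2, 0.15, 0.10))  # ≈ 3.2 days with these rates
```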
70.