1.
We show how to compute the smallest rectangle that can enclose any polygon, from a given set of polygons, in nearly linear time; we also present a PTAS for the problem, as well as a linear-time algorithm for the case when the polygons are rectangles themselves. We prove that finding a smallest convex polygon that encloses any of the given polygons is NP-hard, and give a PTAS for minimizing the perimeter of the convex enclosure. We also give efficient algorithms to find the smallest rectangle simultaneously enclosing a given pair of convex polygons.
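To make the rectangle-enclosure subproblem concrete, the sketch below (our illustration, not the paper's algorithm) computes the minimum-area enclosing rectangle of a single convex polygon by trying every edge orientation; the nearly-linear-time and PTAS results above concern the harder problem posed over a whole set of polygons.

```python
import math

def min_area_enclosing_rectangle(pts):
    """Minimum-area rectangle enclosing a convex polygon.

    Uses the classical fact that an optimal rectangle has one side collinear
    with a polygon edge, so it suffices to try every edge orientation
    (O(n^2) here; rotating calipers would bring this down to O(n)).
    `pts` is a list of (x, y) vertices of a convex polygon in order.
    """
    best = None
    n = len(pts)
    for i in range(n):
        x1, y1 = pts[i]
        x2, y2 = pts[(i + 1) % n]
        theta = math.atan2(y2 - y1, x2 - x1)
        c, s = math.cos(-theta), math.sin(-theta)
        # Rotate all vertices so the current edge is axis-aligned,
        # then measure the axis-aligned bounding box in that frame.
        xs = [x * c - y * s for x, y in pts]
        ys = [x * s + y * c for x, y in pts]
        w, h = max(xs) - min(xs), max(ys) - min(ys)
        if best is None or w * h < best[0]:
            best = (w * h, w, h, theta)
    return best  # (area, width, height, orientation in radians)

# Example with a hypothetical convex quadrilateral.
print(min_area_enclosing_rectangle([(0, 0), (4, 0), (5, 3), (1, 4)]))
```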
2.
There has been a growing interest in applying human computation – particularly crowdsourcing techniques – to assist in the solution of multimedia, image processing, and computer vision problems which are still too difficult to solve using fully automatic algorithms, and yet relatively easy for humans. In this paper we focus on a specific problem – object segmentation within color images – and compare different solutions which combine color image segmentation algorithms with human efforts, either in the form of an explicit interactive segmentation task or through an implicit collection of valuable human traces with a game. We use Click’n’Cut, a friendly, web-based, interactive segmentation tool that allows segmentation tasks to be assigned to many users, and Ask’nSeek, a game with a purpose designed for object detection and segmentation. The two main contributions of this paper are: (i) We use the results of Click’n’Cut campaigns with different groups of users to examine and quantify the crowdsourcing loss incurred when an interactive segmentation task is assigned to paid crowd-workers, comparing their results to the ones obtained when computer vision experts are asked to perform the same tasks. (ii) Since interactive segmentation tasks are inherently tedious and prone to fatigue, we compare the quality of the results obtained with Click’n’Cut with the ones obtained using a (fun, interactive, and potentially less tedious) game designed for the same purpose. We call this contribution the assessment of the gamification loss, since it refers to how much segmentation quality may be lost when we switch to a game-based approach to the same task. We demonstrate that the crowdsourcing loss is significant when using all the data points from workers, but decreases substantially (and becomes comparable to the quality of expert users performing similar tasks) after performing a modest amount of data analysis and filtering out users whose data are clearly not useful. We also show that – on the other hand – the gamification loss is significantly more severe: the quality of the results drops roughly by half when switching from a focused (yet tedious) task to a more fun and relaxed game environment.
3.
We address the self-calibration of a smooth generic central camera from only two dense rotational flows produced by rotations of the camera about two unknown linearly independent axes passing through the camera centre. We give a closed-form theoretical solution to this problem, and we prove that it can be solved exactly up to a linear orthogonal transformation ambiguity. Using the theoretical results, we propose an algorithm for the self-calibration of a generic central camera from two rotational flows.
4.
This paper presents a novel technique for three-dimensional (3D) human motion capture using a set of two non-calibrated cameras. The user’s five extremities (head, hands and feet) are extracted, labeled and tracked after silhouette segmentation. As these are the minimal number of points needed to enable whole-body gestural interaction, we henceforth refer to them as crucial points. The features are subsequently labeled using 3D triangulation and inter-image tracking. Crucial point candidates are defined as the local maxima of the geodesic distance, computed with respect to the center of gravity of the actor region, that lie on the silhouette boundary. Due to its low computational complexity, the system can run in real time on standard personal computers, with an average error rate between 4% and 9% in realistic situations, depending on the context and segmentation quality.
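As a rough illustration of the crucial-point definition (our sketch, not the authors' implementation, and assuming the centre of gravity falls inside the silhouette), the geodesic distance can be computed by a breadth-first search restricted to the binary mask, after which boundary pixels that are local maxima of that distance are kept as candidates:

```python
from collections import deque
import numpy as np

def crucial_point_candidates(mask):
    """Candidate extremities of a binary silhouette `mask` (2-D bool array):
    pixels on the silhouette boundary that are local maxima of the geodesic
    distance (4-connected BFS inside the mask) from the centre of gravity."""
    h, w = mask.shape
    ys, xs = np.nonzero(mask)
    cy, cx = int(round(ys.mean())), int(round(xs.mean()))
    dist = np.full((h, w), -1, dtype=int)
    dist[cy, cx] = 0
    queue = deque([(cy, cx)])
    while queue:                                   # BFS = geodesic distance
        y, x = queue.popleft()
        for ny, nx in ((y + 1, x), (y - 1, x), (y, x + 1), (y, x - 1)):
            if 0 <= ny < h and 0 <= nx < w and mask[ny, nx] and dist[ny, nx] < 0:
                dist[ny, nx] = dist[y, x] + 1
                queue.append((ny, nx))

    candidates = []
    for y, x in zip(ys, xs):
        nbrs = [(y + dy, x + dx) for dy in (-1, 0, 1) for dx in (-1, 0, 1)
                if (dy, dx) != (0, 0)]
        inside = [(ny, nx) for ny, nx in nbrs
                  if 0 <= ny < h and 0 <= nx < w and mask[ny, nx]]
        on_boundary = len(inside) < 8              # some neighbour is background
        is_local_max = all(dist[ny, nx] <= dist[y, x] for ny, nx in inside)
        if on_boundary and is_local_max and dist[y, x] > 0:
            candidates.append((y, x))
    return candidates
```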
5.
Numerical modelling of porous flow in a low-permeability matrix with high-permeability inclusions is a challenging task because the large ratio of permeabilities ill-conditions the finite element system of equations. We propose a coupled model where Darcy flow is used for the porous matrix and potential flow is used for the inclusions. We discuss appropriate interface conditions in detail and show that the head drop in the inclusions can be prescribed in a very simple way. Algorithmic aspects are treated in full detail. Numerical examples show that this coupled approach precludes ill-conditioning and is more efficient than heterogeneous Darcy flow.
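For orientation only, here is a minimal generic statement (ours, not the paper's exact formulation) of such a coupling: Darcy flow governs the hydraulic head h in the matrix, a potential-flow (Laplace) problem governs a potential φ in each inclusion, and the two subdomains are tied together by continuity of head and of normal flux across the interface Γ.

```latex
\begin{aligned}
\text{matrix } \Omega_m:\quad & \mathbf{q} = -K\,\nabla h, \qquad \nabla\cdot\mathbf{q} = 0,\\
\text{inclusion } \Omega_i:\quad & \mathbf{v} = -\nabla\phi, \qquad \nabla^{2}\phi = 0,\\
\text{interface } \Gamma:\quad & h = \phi, \qquad (K\,\nabla h)\cdot\mathbf{n} = (\nabla\phi)\cdot\mathbf{n}.
\end{aligned}
```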
6.
Metric Access Methods (MAMs) are indexing techniques which allow working in generic metric spaces. Therefore, MAMs are especially useful for Content-Based Image Retrieval systems based on features which use non-Lp norms as similarity measures. MAMs naturally allow the design of image browsers due to their inherent hierarchical structure. The Hierarchical Cellular Tree (HCT), a MAM-based indexing technique, provides the starting point of our work. In this paper, we describe some limitations detected in the original formulation of the HCT and propose some modifications to both the index building and the search algorithm. First, the covering radius, which is defined as the distance from the representative to the furthest element in a node, may not cover all the elements belonging to the node’s subtree. Therefore, we propose to redefine the covering radius as the distance from the representative to the furthest element in the node’s subtree. This new definition is essential to guarantee a correct construction of the HCT. Second, the proposed Progressive Query retrieval scheme can be redesigned to perform the nearest neighbor operation in a more efficient way. We propose a new retrieval scheme which takes advantage of the benefits of the search algorithm used in the index building. Furthermore, while the evaluation of the HCT in the original work was only subjective, we propose an objective evaluation based on two aspects which are crucial in any approximate search algorithm: the retrieval time and the retrieval accuracy. Finally, we illustrate the usefulness of the proposal by presenting some actual applications.
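A small sketch (hypothetical node structure, not the HCT code) illustrating the proposed redefinition: the covering radius of a node is the distance from its representative to the furthest element in the whole subtree, not merely to the furthest element stored in the node itself.

```python
from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class Node:
    representative: object
    elements: List[object]                       # objects stored directly in this node
    children: List["Node"] = field(default_factory=list)

def covering_radius(node: Node, dist: Callable) -> float:
    """Distance from the node's representative to the furthest element
    anywhere in its subtree (the corrected definition), rather than only
    to the furthest element stored in the node itself."""
    def subtree_elements(n: Node):
        yield from n.elements
        for child in n.children:
            yield from subtree_elements(child)
    return max((dist(node.representative, e) for e in subtree_elements(node)),
               default=0.0)
```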
7.
Multipliers are routinely used for impact evaluation of private projects and public policies at the national and subnational levels. Oosterhaven and Stelder (J Reg Sci 42(3), 533–543, 2002) correctly pointed out the misuse of standard ‘gross’ multipliers and proposed the concept of ‘net’ multiplier as a solution to this bad practice. We prove that their proposal is not well founded, by showing that the supporting theorems are faulty in both statement and proof. The proofs are flawed due to an analytical error, and the theorems themselves cannot be salvaged, as generic (non-curiosum) counterexamples demonstrate. We also provide a general analytical framework for multipliers and, using it, we show that standard ‘gross’ multipliers are all that are needed within the interindustry model, since they follow the causal logic of the economic model, are well defined and independent of exogenous shocks, and are interpretable as predictors of change.
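As a reminder of what the standard ‘gross’ multipliers are, here is a small numerical sketch (synthetic two-sector data, not from the paper): output multipliers are the column sums of the Leontief inverse (I − A)⁻¹, which depend only on the technical-coefficient matrix A and not on any particular exogenous shock.

```python
import numpy as np

# Hypothetical technical-coefficient matrix A for a two-sector economy:
# A[i, j] = input from sector i required per unit of output of sector j.
A = np.array([[0.2, 0.3],
              [0.4, 0.1]])

L = np.linalg.inv(np.eye(2) - A)      # Leontief inverse (I - A)^{-1}
gross_multipliers = L.sum(axis=0)     # column sums: total output generated
                                      # per unit of final demand in each sector
print(L)
print(gross_multipliers)              # approx. [2.17, 1.83] for this A
```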
8.
This paper introduces the innovative system used in the reconstruction of Lisbon after the 1755 earthquake. Its findings are based on original documents that offer a detailed picture of the principles and methods established to convert the land and properties of the old city into the new, standardized gridded Plan. Implemented in the mid-eighteenth century, the methodical system outlined here was undoubtedly an important point of reference for the large-scale urban improvement and redevelopment operations that would follow in nineteenth-century Europe.
9.
The aim of this study was to analyze the relationship between statin use, serum cholesterol levels, and prostate cancer (PCa) detection and aggressiveness. Statin use of three years or more and serum cholesterol (SC) levels were assessed in 2408 men scheduled for prostate biopsy. SC was classified as normal (NSC: <200 mg/dL) or high (HSC: >200 mg/dL). High-grade PCa (HGPCa) was defined as a Gleason score greater than 7. Statin users comprised 30.9% of those studied. The PCa detection rate was 31.2% among men on statins and 37% among non-statin users (p < 0.006). The PCa detection rate was 26.3% in men with NSC and 40.6% in those with HSC (p < 0.001). In the subset of NSC men, the PCa rate was 26.5% for statin users and 26.2% for non-users (p = 0.939), while in men with HSC, the PCa rate was 36.4% for statin users and 42.0% for non-statin users (p = 0.063). The HGPCa rate was 41.8% for statin users and 32.5% for non-users (p = 0.012). NSC men had a 53.8% rate of HGPCa, while the rate was only 27.6% in HSC men (p < 0.001). NSC men on statins had an HGPCa rate of 70.2%, while non-statin users had a rate of 41.2% (p < 0.001). The HGPCa rate for HSC men on statins was 18.8%, while the rate for non-users was 30.0% (p = 0.011). Logistic regression analysis suggested that serum cholesterol levels could serve as an independent predictor of PCa risk (OR 1.87, 95% CI 1.56–2.24) and of HGPCa risk (OR 0.31, 95% CI 0.23–0.44), while statin use could not. Statin treatment may prevent PCa detection through serum cholesterol-mediated mechanisms. A disturbing increase in the HGPCa rate was observed in statin users who normalized their serum cholesterol.
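The odds ratios quoted above come from a multivariable logistic regression; the following synthetic sketch (not the study's data or code) shows how such an OR and its 95% CI are read off a fitted model as the exponential of a coefficient.

```python
import numpy as np
import statsmodels.api as sm

# Synthetic illustration only: simulated cohort with hypothetical effect sizes.
rng = np.random.default_rng(0)
n = 2408
high_cholesterol = rng.integers(0, 2, n)          # 1 = serum cholesterol > 200 mg/dL
statin_use = rng.integers(0, 2, n)                # 1 = statin user (>= 3 years)
logit = -1.0 + 0.6 * high_cholesterol             # assumed effect; no statin effect
pca = rng.binomial(1, 1 / (1 + np.exp(-logit)))   # simulated PCa detection outcome

X = sm.add_constant(np.column_stack([high_cholesterol, statin_use]))
fit = sm.Logit(pca, X).fit(disp=0)
odds_ratios = np.exp(fit.params)                  # OR = exp(coefficient)
conf_int = np.exp(fit.conf_int())                 # 95% CI for each OR
print(odds_ratios, conf_int, sep="\n")
```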
10.
Hot Wire Chemical Vapor Deposition (HW-CVD) is one of the most promising techniques for depositing the intrinsic microcrystalline silicon layer used in the production of micro-morph solar cells. However, silicide formation at the colder ends of the tungsten wire drastically reduces the lifetime of the catalyzer, thus limiting the industrial exploitation of the technique. A simple but interesting strategy to decrease silicide formation is to hide the electrical contacts of the catalyzer in a long, narrow cavity, which reduces the probability that silane molecules reach the colder ends of the wire. In this paper, the working mechanism of the cavity is elucidated. Measurements of the thickness profile of the silicon deposited on the internal walls of the cavity have been compared with predictions from a simple diffusion model based on the assumption of Knudsen flow. A lifetime study of the protected and unprotected wires has been carried out, and the different mechanisms that determine the deterioration of the catalyzer have been identified and discussed.
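One common reading of such a model, shown here as a hedged sketch with hypothetical parameters (not the paper's model or data): in the molecular-flow regime, silane transport down the narrow cavity behaves like 1-D diffusion with a first-order loss to wall deposition, so the predicted thickness profile decays steeply away from the cavity mouth.

```python
import numpy as np

# Steady-state 1-D diffusion with wall-deposition loss:
#   D * n''(x) = k * n(x),  n(0) = n0 (open end),  n'(Lc) = 0 (closed end),
# whose solution decays into the cavity; deposited thickness ~ n(x).
D = 1.0e-3     # hypothetical effective Knudsen diffusivity (m^2/s)
k = 5.0        # hypothetical wall-deposition rate constant (1/s)
Lc = 0.05      # cavity length (m)
n0 = 1.0       # normalised silane density at the cavity mouth

lam = np.sqrt(D / k)                       # decay length of the profile
x = np.linspace(0.0, Lc, 200)
n = n0 * np.cosh((Lc - x) / lam) / np.cosh(Lc / lam)

relative_thickness = n / n[0]              # predicted thickness profile shape
print(relative_thickness[::50])            # rapid decay toward the closed end
```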