Full-text access type
Paid full text | 150 |
Free | 2 |
Subject category
Electrical engineering | 1 |
Chemical industry | 27 |
Metalworking | 1 |
Machinery & instrumentation | 2 |
Building science | 15 |
Energy & power engineering | 3 |
Light industry | 9 |
Oil & natural gas | 1 |
Radio & electronics | 13 |
General industrial technology | 22 |
Metallurgical industry | 14 |
Nuclear technology | 2 |
Automation technology | 42 |
Publication year
2023 | 1 |
2022 | 5 |
2021 | 7 |
2020 | 1 |
2019 | 1 |
2018 | 5 |
2017 | 1 |
2016 | 4 |
2015 | 5 |
2014 | 9 |
2013 | 15 |
2012 | 9 |
2011 | 9 |
2010 | 10 |
2009 | 7 |
2008 | 12 |
2007 | 5 |
2006 | 7 |
2005 | 1 |
2004 | 5 |
2003 | 4 |
2002 | 4 |
2001 | 2 |
2000 | 2 |
1999 | 1 |
1998 | 2 |
1997 | 7 |
1996 | 3 |
1995 | 1 |
1994 | 1 |
1993 | 2 |
1985 | 1 |
1980 | 1 |
1971 | 1 |
1969 | 1 |
Sort order: 152 results found, search time 6 ms
1.
Helmut Alt, Esther M. Arkin, Alon Efrat, George Hart, Ferran Hurtado, Irina Kostitsyna, Alexander Kröller, Joseph S. B. Mitchell, Valentin Polishchuk. Theory of Computing Systems, 2014, 54(4): 689-714
We show how to compute the smallest rectangle that can enclose any polygon, from a given set of polygons, in nearly linear time; we also present a PTAS for the problem, as well as a linear-time algorithm for the case when the polygons are rectangles themselves. We prove that finding a smallest convex polygon that encloses any of the given polygons is NP-hard, and give a PTAS for minimizing the perimeter of the convex enclosure. We also give efficient algorithms to find the smallest rectangle simultaneously enclosing a given pair of convex polygons.
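As a sketch of the classical single-polygon building block (not the authors' near-linear or PTAS algorithms): the minimum-area enclosing rectangle of a convex polygon always has one side collinear with a polygon edge, so a simple O(n^2) sweep over edge directions suffices. Function name and test polygon below are illustrative.

```python
import math

def min_area_enclosing_rectangle(pts):
    """Smallest-area enclosing rectangle of a convex polygon.

    Simple O(n^2) version: the optimal rectangle has one side
    collinear with a polygon edge, so try every edge direction.
    """
    best = None
    n = len(pts)
    for i in range(n):
        (x0, y0), (x1, y1) = pts[i], pts[(i + 1) % n]
        theta = math.atan2(y1 - y0, x1 - x0)
        c, s = math.cos(-theta), math.sin(-theta)
        # Rotate all vertices so the current edge lies along the x-axis,
        # then measure the axis-aligned bounding-box area.
        rot = [(c * x - s * y, s * x + c * y) for x, y in pts]
        xs, ys = zip(*rot)
        area = (max(xs) - min(xs)) * (max(ys) - min(ys))
        if best is None or area < best:
            best = area
    return best

# Unit square: the optimal enclosing rectangle is the square itself.
print(min_area_enclosing_rectangle([(0, 0), (1, 0), (1, 1), (0, 1)]))  # 1.0
```

The rotating-calipers technique brings this to O(n) per polygon, which is what makes near-linear overall bounds plausible.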
2.
Axel Carlier, Amaia Salvador, Ferran Cabezas, Xavier Giro-i-Nieto, Vincent Charvillat, Oge Marques. Multimedia Tools and Applications, 2016, 75(23): 15901-15928
There has been a growing interest in applying human computation – particularly crowdsourcing techniques – to assist in the solution of multimedia, image processing, and computer vision problems which are still too difficult to solve using fully automatic algorithms, and yet relatively easy for humans. In this paper we focus on a specific problem – object segmentation within color images – and compare different solutions which combine color image segmentation algorithms with human efforts, either in the form of an explicit interactive segmentation task or through an implicit collection of valuable human traces with a game. We use Click’n’Cut, a friendly, web-based, interactive segmentation tool that allows segmentation tasks to be assigned to many users, and Ask’nSeek, a game with a purpose designed for object detection and segmentation. The two main contributions of this paper are: (i) We use the results of Click’n’Cut campaigns with different groups of users to examine and quantify the crowdsourcing loss incurred when an interactive segmentation task is assigned to paid crowd-workers, comparing their results to the ones obtained when computer vision experts are asked to perform the same tasks. (ii) Since interactive segmentation tasks are inherently tedious and prone to fatigue, we compare the quality of the results obtained with Click’n’Cut with the ones obtained using a (fun, interactive, and potentially less tedious) game designed for the same purpose. We call this contribution the assessment of the gamification loss, since it refers to how much quality of segmentation results may be lost when we switch to a game-based approach to the same task. 
We demonstrate that the crowdsourcing loss is significant when using all the data points from workers, but decreases substantially (and becomes comparable to the quality of expert users performing similar tasks) after performing a modest amount of data analysis and filtering out of users whose data are clearly not useful. We also show that – on the other hand – the gamification loss is significantly more severe: the quality of the results drops roughly by half when switching from a focused (yet tedious) task to a more fun and relaxed game environment.
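A standard way to score a worker's segmentation against an expert ground truth is the Jaccard index (intersection over union) of the two binary masks; the abstract does not name its metric, so this is an assumed illustration:

```python
def jaccard(mask_a, mask_b):
    """Jaccard index (IoU) between two binary masks given as sets of
    (row, col) pixels; a common score for comparing a candidate
    segmentation against a ground-truth segmentation."""
    a, b = set(mask_a), set(mask_b)
    if not a and not b:
        return 1.0  # two empty masks agree perfectly
    return len(a & b) / len(a | b)

expert = {(r, c) for r in range(4) for c in range(4)}     # 4x4 object
worker = {(r, c) for r in range(4) for c in range(1, 4)}  # misses one column
print(jaccard(expert, worker))  # 0.75
```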
3.
We address the self-calibration of a smooth generic central camera from only two dense rotational flows produced by rotations of the camera about two unknown linearly independent axes passing through the camera centre. We give a closed-form theoretical solution to this problem, and we prove that it can be solved exactly up to a linear orthogonal transformation ambiguity. Using the theoretical results, we propose an algorithm for the self-calibration of a generic central camera from two rotational flows.
4.
Pedro Correa, Ferran Marqués, Xavier Marichal, Benoit Macq. Multimedia Tools and Applications, 2008, 38(3): 365-384
This paper presents a novel technique for three-dimensional (3D) human motion capture using a set of two non-calibrated cameras. The user’s five extremities (head, hands and feet) are extracted, labeled and tracked after silhouette segmentation. As they are the minimal number of points that can be used to enable whole-body gestural interaction, we henceforth refer to these features as crucial points. Features are subsequently labeled using 3D triangulation and inter-image tracking. The crucial point candidates are defined as the local maxima of the geodesic distance with respect to the center of gravity of the actor region that lie on the silhouette boundary. Due to its low computational complexity, the system can run at real-time rates on standard personal computers, with an average error rate between 4% and 9% in realistic situations, depending on the context and segmentation quality.
5.
Antonio Rodríguez-Ferran, Josep Sarrate, Antonio Huerta. International Journal for Numerical Methods in Engineering, 2004, 59(4): 577-596
Numerical modelling of porous flow in a low-permeability matrix with high-permeability inclusions is a challenging task because the large ratio of permeabilities ill-conditions the finite element system of equations. We propose a coupled model where Darcy flow is used for the porous matrix and potential flow is used for the inclusions. We discuss appropriate interface conditions in detail and show that the head drop in the inclusions can be prescribed in a very simple way. Algorithmic aspects are treated in full detail. Numerical examples show that this coupled approach precludes ill-conditioning and is more efficient than heterogeneous Darcy flow. Copyright © 2003 John Wiley & Sons, Ltd.
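A minimal sketch of the ill-conditioning the abstract refers to, using an assumed 1-D finite-difference analogue rather than the authors' finite element formulation: as the permeability ratio between inclusion and matrix grows, so does the condition number of the system matrix.

```python
import numpy as np

def fd_matrix(perm):
    """1-D finite-difference stiffness matrix for -d/dx (k du/dx) = f
    with unit spacing and Dirichlet ends; perm[i] is the permeability
    of cell i, so each interior node couples its two adjacent cells."""
    n = len(perm) - 1  # number of interior nodes
    A = np.zeros((n, n))
    for i in range(n):
        A[i, i] = perm[i] + perm[i + 1]
        if i > 0:
            A[i, i - 1] = -perm[i]
        if i < n - 1:
            A[i, i + 1] = -perm[i + 1]
    return A

# A single high-permeability inclusion in a low-permeability matrix:
for ratio in (1e0, 1e4, 1e8):
    k = np.ones(9)
    k[4] = ratio  # hypothetical inclusion cell
    print(f"permeability ratio {ratio:.0e}: "
          f"cond = {np.linalg.cond(fd_matrix(k)):.2e}")
```

The condition number grows roughly in proportion to the ratio, which is why replacing Darcy flow inside the inclusions by potential flow (removing the large coefficients from the system) helps.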
6.
Carles Ventura, Verónica Vilaplana, Xavier Giró-i-Nieto, Ferran Marqués. Multimedia Tools and Applications, 2014, 73(3): 1983-2008
Metric Access Methods (MAMs) are indexing techniques which allow working in generic metric spaces. Therefore, MAMs are especially useful for Content-Based Image Retrieval systems based on features which use non-Lp norms as similarity measures. MAMs naturally allow the design of image browsers due to their inherent hierarchical structure. The Hierarchical Cellular Tree (HCT), a MAM-based indexing technique, provides the starting point of our work. In this paper, we describe some limitations detected in the original formulation of the HCT and propose some modifications to both the index building and the search algorithm. First, the covering radius, which is defined as the distance from the representative to the furthest element in a node, may not cover all the elements belonging to the node’s subtree. Therefore, we propose to redefine the covering radius as the distance from the representative to the furthest element in the node’s subtree. This new definition is essential to guarantee a correct construction of the HCT. Second, the proposed Progressive Query retrieval scheme can be redesigned to perform the nearest neighbor operation in a more efficient way. We propose a new retrieval scheme which takes advantage of the benefits of the search algorithm used in the index building. Furthermore, while the evaluation of the HCT in the original work was only subjective, we propose an objective evaluation based on two aspects which are crucial in any approximate search algorithm: the retrieval time and the retrieval accuracy. Finally, we illustrate the usefulness of the proposal by presenting some actual applications.
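The covering-radius issue can be illustrated with a toy one-dimensional metric (hypothetical data, not HCT code): the node-local definition can leave subtree elements outside the covering ball, while the subtree-wide definition cannot.

```python
def covering_radius_node(rep, node_elems, dist):
    # Original definition: distance to the furthest element *in the node*.
    return max(dist(rep, e) for e in node_elems)

def covering_radius_subtree(rep, subtree_elems, dist):
    # Revised definition: furthest element in the node's whole *subtree*.
    return max(dist(rep, e) for e in subtree_elems)

dist = lambda a, b: abs(a - b)  # 1-D metric for illustration

# The node holds two child representatives {0, 4}, but the child with
# representative 4 itself covers elements up to 7, which fall outside
# the node-local radius of 4.
rep = 0
node = [0, 4]
subtree = [0, 1, 4, 6, 7]
print(covering_radius_node(rep, node, dist))      # 4
print(covering_radius_subtree(rep, subtree, dist))  # 7
```

A search that prunes using the node-local radius 4 would wrongly discard this branch for queries near 7; the subtree radius makes the pruning bound safe.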
7.
Monica Coll, Anna Fernandez-Falgueras, Anna Iglesias, Bernat del Olmo, Laia Nogue-Navarro, Adria Simon, Alexandra Perez Serra, Marta Puigmule, Laura Lopez, Ferran Pico, Monica Corona, Marta Vallverdu-Prats, Coloma Tiron, Oscar Campuzano, Josep Castella, Ramon Brugada, Mireia Alcalde. International Journal of Molecular Sciences, 2022, 23(20)
Molecular screening for pathogenic mutations in sudden cardiac death (SCD)-related genes is common practice for SCD cases. However, test results may lead to uncertainty because variants of unknown significance (VUS) account for up to 70% of the identified variants, owing to a lack of experimental studies. Variants affecting potential splice sites are among the most difficult to interpret. The aim of this study was to examine rare intronic variants identified in exon-flanking sequence with two main objectives: first, to validate that canonical splice-site variants produce aberrant splicing; second, to determine whether rare intronic variants classified as VUS may affect the splicing product. To achieve these objectives, 28 heart samples from SCD cases carrying rare intronic variants were studied, using custom panel sequencing of 85 SCD genes. For variants affecting the most canonical splice sites, in silico tools predicted in 100% of cases that the splicing product would be affected, possibly causing aberrant isoforms; yet 25% of these cases (1/4) showed normal splicing, contradicting the in silico results. Conversely, for the remaining variants, in silico tools predicted an effect in 0% of cases, while experiments revealed unpredicted aberrant splicing in >20% (3/14). Thus, deep intronic variants are routinely predicted to have no effect, which, based on our results, may underestimate their impact and, therefore, their pathogenicity classification and the follow-up of family members.
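For context, a common convention (assumed here, not taken from the paper) treats intronic offsets of ±1 and ±2, the canonical GT/AG dinucleotides, as canonical splice sites, and larger offsets as deep intronic. A hypothetical helper that classifies a variant written in simple HGVS-style coding notation:

```python
import re

def intronic_offset(hgvs):
    """Parse the intronic offset from a simple coding HGVS string such
    as 'c.123+1G>A' or 'c.456-20T>C'; returns a signed int, or None
    for a purely exonic variant."""
    m = re.search(r'c\.\d+([+-]\d+)[A-Z]>[A-Z]', hgvs)
    return int(m.group(1)) if m else None

def classify(hgvs):
    off = intronic_offset(hgvs)
    if off is None:
        return "not intronic"
    # Offsets +/-1 and +/-2 hit the canonical GT/AG splice dinucleotides.
    return "canonical splice site" if abs(off) <= 2 else "deep intronic"

print(classify("c.123+1G>A"))   # canonical splice site
print(classify("c.456-20T>C"))  # deep intronic
```

Real HGVS parsing has many more cases (deletions, duplications, `IVS` legacy names); this sketch covers only single-nucleotide substitutions.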
8.
Ferran Sancho. The Annals of Regional Science, 2013, 51(2): 537-552
Multipliers are routinely used for impact evaluation of private projects and public policies at the national and subnational levels. Oosterhaven and Stelder (J Reg Sci 42(3), 533–543, 2002) correctly pointed out the misuse of standard ‘gross’ multipliers and proposed the concept of ‘net’ multiplier as a solution to this bad practice. We prove their proposal is not well founded. We do so by showing that supporting theorems are faulty in enunciation and demonstration. The proofs are flawed due to an analytical error, but the theorems themselves cannot be salvaged as generic, non-curiosum counterexamples demonstrate. We also provide a general analytical framework for multipliers and, using it, we show that standard ‘gross’ multipliers are all that are needed within the interindustry model since they follow the causal logic of the economic model, are well-defined and independent of exogenous shocks, and are interpretable as predictors for change.
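The standard 'gross' output multipliers the author defends are the column sums of the Leontief inverse. A minimal sketch with a hypothetical two-sector technical-coefficient matrix:

```python
import numpy as np

# Hypothetical 2-sector technical-coefficient matrix A:
# A[i, j] is sector j's input requirement from sector i per unit of output.
A = np.array([[0.2, 0.3],
              [0.4, 0.1]])

# Leontief inverse L = (I - A)^(-1). Its column sums are the standard
# 'gross' output multipliers: total economy-wide output required per
# unit of final demand delivered by each sector.
L = np.linalg.inv(np.eye(2) - A)
gross_multipliers = L.sum(axis=0)
print(gross_multipliers)
```

Note the multipliers depend only on A, not on any particular final-demand shock, which is the independence property the abstract highlights.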
9.
This paper introduces the innovative system used in the reconstruction of Lisbon after the 1755 earthquake. Its findings are based on original documents that offer a detailed picture of the principles and methods established to convert the land and properties of the old city into the new and standardized gridded Plan. Implemented in the mid-eighteenth century, the methodical system outlined was, undoubtedly, an important point of reference for the large-scale urban improvement and redevelopment operations that would follow in nineteenth-century Europe.
10.
Ferran V. García-Ferrer, Isabel Pérez-Arjona, Germán J. de Valcárcel, Eugenio Roldán. Journal of Modern Optics, 2013, 60(5): 763-773
It is well known that the squeezing spectrum of the field exiting a nonlinear cavity can be directly obtained from the fluctuation spectrum of normally ordered products of creation and annihilation operators of the cavity mode. In this article we show that the output field squeezing spectrum can also be derived by combining the fluctuation spectra of any pair of s-ordered products of creation and annihilation operators. The interesting result is that the spectrum obtained in this way from the linearized Langevin equations is exact, and this occurs in spite of the fact that no s-ordered quasiprobability distribution verifies a true Fokker–Planck equation; that is, the Langevin equations used for deriving the squeezing spectrum are not exact. The (linearized) intracavity squeezing obtained from any s-ordered distribution is also exact. These results are exemplified in the problem of dispersive optical bistability.