1.
Within a few years, graph cuts emerged as a leading method in computer vision and graphics thanks to their efficiency in computing globally optimal solutions to popular minimization problems. The approach nevertheless remains impractical for very large-scale problems because of the memory required to store the graphs. Among the strategies for overcoming this limitation, one consists of reducing the size of these graphs by adding only the nodes that satisfy a local condition. In the image segmentation context, this means, for instance, that a node need not be considered when the unary terms are large in its neighborhood. The remaining nodes are typically located in a thin band around the boundary of the segmented object. In this paper, we review existing strategies for reducing the memory footprint of graph cuts, describe the proposed reduction criterion, and show empirically, over a large number of experiments, that the distance between the minimizer found and the global minimizer is zero or very small. We also provide extra parameters for further reducing the graphs and for removing isolated nodes due to noise.
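As a rough illustration of this kind of reduction criterion (a minimal sketch, not the authors' implementation; the threshold `delta`, window radius `r` and function names are hypothetical):

```python
# Sketch of the band-based reduction idea described above. A pixel
# contributes a node only if some pixel in its neighborhood has an
# ambiguous unary term; elsewhere the label can be decided directly.
import numpy as np

def nodes_to_keep(unary_fg, unary_bg, r=2, delta=5.0):
    """Mark the pixels kept in the reduced graph.

    unary_fg, unary_bg: 2-D arrays of unary costs for the two labels.
    A pixel is 'decided' when one unary term dominates the other by
    more than `delta`; a node is created only if some pixel within an
    r-neighborhood is undecided, which keeps a thin band of nodes
    around the object boundary.
    """
    undecided = np.abs(unary_fg - unary_bg) <= delta
    keep = np.zeros_like(undecided)
    h, w = undecided.shape
    for i in range(h):
        for j in range(w):
            i0, i1 = max(0, i - r), min(h, i + r + 1)
            j0, j1 = max(0, j - r), min(w, j + r + 1)
            keep[i, j] = undecided[i0:i1, j0:j1].any()
    return keep
```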
2.
We introduce a new method for computing conformal parameterizations, based on a recent definition of discrete conformity, and establish a discrete version of the Riemann mapping theorem. Our algorithm can parameterize triangular, quadrangular and digital meshes, and can also be adapted to preserve metric properties. Many examples in the experiments section demonstrate the efficiency of the method.
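For background, the continuous notion that discrete conformity mimics can be stated as follows (standard material, not the paper's discrete definition):

```latex
% A parameterization (u, v) is conformal when it satisfies the
% Cauchy-Riemann equations, i.e. it preserves angles locally:
\partial_x u = \partial_y v, \qquad \partial_y u = -\partial_x v
```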
3.
This paper develops an implementation of a Predual Proximal Point Algorithm (PPPA) solving a Non-Negative Basis Pursuit Denoising model. The model imposes a constraint on the ℓ2 norm of the residual instead of penalizing it. The PPPA solves the predual of the problem with a Proximal Point Algorithm (PPA), and the minimization performed at each iteration of the PPA is solved with a dual method. We prove that these dual variables converge to a solution of the initial problem. Our analysis shows that a constrained, non-differentiable convex problem is turned into a short sequence of well-behaved concave maximization problems: the functions being maximized are differentiable with Lipschitz gradients. The algorithm is easy to implement, easy to tune and more general than the algorithms found in the literature. In particular, it applies to both Basis Pursuit Denoising (BPDN) and Non-Negative Basis Pursuit Denoising (NNBPDN), and it makes no assumption on the dictionary. We prove its convergence to the set of solutions of the model and provide convergence rates. Experiments on image approximation show that the performance of the PPPA is at the current state of the art for BPDN.
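For reference, the constrained model described above can be written in the standard form below (notation ours, not necessarily the paper's: D the dictionary, y the data, ε the bound on the residual):

```latex
% Non-Negative Basis Pursuit Denoising, constrained form:
% the residual norm is constrained rather than penalized.
\min_{x \ge 0} \; \|x\|_1
\quad \text{subject to} \quad \|Dx - y\|_2 \le \epsilon
```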
4.
The French Nuclear Protection and Safety Institute (IPSN) launched the HEVA-VERCORS program in 1983, in collaboration with Électricité de France (EDF). The program is devoted to the source term of fission products (FP) released from PWR fuel samples during a sequence representative of a severe accident. The analytical experiments are conducted in a shielded hot cell of the LAMA facility at the Grenoble center of the CEA (Commissariat à l'Énergie Atomique); as simplified tests addressing a limited number of phenomena, they give results complementary to those of the more global in-pile PHEBUS experiments. Six VERCORS tests were conducted from 1989 to 1994 at higher fuel temperatures (up to 2600 K) than the earlier HEVA tests, in particular to better quantify the release of less volatile FPs. This paper gives an overview of the experimental facility, a synthesis of the FP releases from these tests, and presents, as an example, some specific results of the VERCORS 6 test, performed with high burn-up fuel (60 GWd/tU). The ongoing VERCORS HT-RT program, designed to reach fuel liquefaction temperatures, is described before conclusions are drawn.
5.
VERCORS is an analytical experimental programme focusing on the release of fission products (FP) and actinides from an irradiated fuel rod, under conditions representative of those encountered during a severe PWR accident. The 17 tests, financed jointly by EDF and IRSN, were conducted by the CEA on its Grenoble site in a dedicated high-activity cell at the Laboratory for Active Materials (LAMA) over a 14-year period (1989-2002), in three test phases. A first series of six tests (VERCORS 1 to VERCORS 6) was conducted between 1989 and 1994 on UO2 fuel heated to temperatures close to relocation. Two further test series, VERCORS HT (three tests) and RT (eight tests), were then performed alternately between 1996 and 2002 at higher temperatures, up to collapse of the fuel sample. These tests covered UO2 and MOX fuels with a variety of initial configurations (intact rods or debris beds). The programme made it possible to quantify fission product releases precisely in all the situations explored, and to identify similar behavioural patterns among the fission products, allowing them to be classified schematically into four groups of decreasing volatility:
(1) volatile FP, including fission gases, iodine, caesium, antimony, tellurium, cadmium, rubidium and silver, with very high (practically total) releases at temperatures of around 2350 °C;
(2) semi-volatile FP, comprising molybdenum, rhodium, barium, palladium and technetium, with releases of 50-100% that are very sensitive to oxygen potential and show marked redeposition near the emission point;
(3) low-volatility FP, such as ruthenium, cerium, strontium, yttrium, europium, niobium and lanthanum, with significant releases of around 3-10% on average but capable, for some elements under particular conditions, of reaching 20-40%;
(4) non-volatile FP, composed of zirconium, neodymium and praseodymium, for which no release could be measured by gamma spectrometry under the envelope conditions of the VERCORS test grids.
Actinides each have their own type of behaviour. They can nevertheless be subdivided into two categories: the first, including U and Np, with releases of up to 10% and behaviour similar to that of the low-volatility FP, and the second (Pu) with very low releases, typically less than 1%.
6.
A basic property of a simple closed surface is the Jordan property: the complement of the surface has two connected components. We call any such component a back-component, and the union of a back-component and the surface is called the closure of that back-component. We introduce the notion of a strong surface as a surface that satisfies a strong homotopy property: the closure of a back-component is strongly homotopic to that back-component. This means that we can homotopically remove any subset of a strong surface from the closure of a back-component. On the basis of results on homotopy and strong homotopy, we prove that the simple closed 26-surfaces defined by Morgenthaler and Rosenfeld and the simple closed 18-surfaces defined by one of the authors are both strong surfaces. Strong surfaces thus appear as an interesting generalization of these two notions of a surface.
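Schematically, the two properties discussed above can be stated as follows (notation ours):

```latex
% Jordan property: the complement of a simple closed surface S has
% exactly two connected components, the back-components:
\mathbb{Z}^3 \setminus S = B_1 \cup B_2, \qquad B_1 \cap B_2 = \emptyset
% Strong surface: for each back-component B, the closure B \cup S is
% strongly homotopic to B, so any subset of S can be homotopically
% removed from B \cup S.
```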
7.
We present a general framework for image restoration; despite its simplicity, certain variational and certain wavelet approaches can both be formulated within it. This permits the construction of a natural model, with only one parameter, that combines the advantages of both approaches. We give a mathematical analysis of this model, describe our algorithm and illustrate it with experiments.
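One plausible instance of such a one-parameter hybrid model, offered as a sketch rather than as the paper's exact formulation, minimizes total variation subject to a wavelet-domain constraint on the residual:

```latex
% Hypothetical hybrid model with the single parameter \tau:
% f is the observed image, \psi ranges over the wavelet basis.
\min_{u} \; TV(u)
\quad \text{subject to} \quad
|\langle f - u, \psi \rangle| \le \tau \;\; \text{for all wavelets } \psi
```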
8.
In image denoising, many researchers have tried for several years to combine wavelet-like approaches with optimization methods (typically based on total variation minimization). However, despite the well-known links between image denoising and image compression when both are solved with wavelet-like approaches, these hybrid denoising methods have no counterparts in image compression. This is the gap the present paper aims to fill. To do so, we provide a generalization of the standard image compression model. Important numerical limitations still need to be addressed, however, before such models become practical.

François Malgouyres received his PhD, supervised by J.M. Morel and B. Rougé, from ENS Cachan (France) in 2000. He spent the academic year 2000-2001 as a CAM assistant professor in the mathematics department of UCLA (USA), working within the teams of Stanley Osher and Tony Chan. Since September 2001 he has been a maître de conférences in the LAGA (mathematics laboratory) and L2TI (image processing laboratory) at the University Paris 13 (France). His teaching is in computer science. He works under the supervision of Alain Trouvé (who moved to the CMLA, ENS Cachan, France).
9.
The direct registration problem for images of a deforming surface has been well studied. Parametric flexible warps, based for instance on Free-Form Deformations or on Radial Basis Functions such as the Thin-Plate Spline, are often estimated using additive Gauss-Newton-like algorithms. The recently proposed compositional framework has been shown to be more efficient, but it cannot be directly applied to such non-groupwise warps.
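For illustration, a single additive Gauss-Newton update of the kind mentioned above might look as follows (a sketch under assumed interfaces; `warp` and `jacobian` are hypothetical callables, not from the paper):

```python
# One additive Gauss-Newton step for direct registration: refine the
# warp parameters p so the warped image matches the template.
import numpy as np

def gauss_newton_step(p, template, warp, jacobian):
    """Return the updated parameters p + dp.

    template: flattened pixel intensities of the reference image.
    warp(p):  resamples the moving image under the warp with
              parameters p, returned as a flattened array.
    jacobian(p): (n_pixels, n_params) Jacobian of the warped image
              with respect to p (image gradient times warp Jacobian).
    """
    r = template - warp(p)      # residual driving the update
    J = jacobian(p)
    # Least-squares solve of J dp ~= r (Gauss-Newton normal equations).
    dp, *_ = np.linalg.lstsq(J, r, rcond=None)
    return p + dp
```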
10.
New asymptotic formulas for the mean exit time from an almost stable domain of a discrete-time Markov process are obtained, and an original fast simulation method is proposed. The mathematical background involves large deviation theorems and approximation by a diffusion process. We are chiefly concerned with the classical Robbins-Monro algorithm. The validity of the results is tested on examples from the ALOHA system (a satellite-type communication algorithm).
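To make the setting concrete, the sketch below estimates the mean exit time of a Robbins-Monro iterate from a stable interval by naive Monte Carlo (the drift, noise model and domain are illustrative placeholders, not the paper's fast simulation method):

```python
# Mean exit time of a Robbins-Monro recursion from [-radius, radius].
import numpy as np

rng = np.random.default_rng(0)

def exit_time(x0=0.0, radius=1.0, gamma=0.1, max_steps=100_000):
    """Steps until x_{n+1} = x_n - a_n (h(x_n) + noise) first leaves
    the interval, with h(x) = x (drift pulling the iterate to 0)."""
    x = x0
    for n in range(1, max_steps + 1):
        a_n = gamma / n                  # classical decreasing step
        x -= a_n * (x + rng.normal())
        if abs(x) > radius:
            return n
    return max_steps                     # censored: no exit observed

times = [exit_time() for _ in range(100)]
print("estimated mean exit time:", np.mean(times))
```

Because exits from an almost stable domain are rare events, naive simulation of this kind quickly becomes expensive; this is exactly the difficulty a fast simulation method, such as the one proposed in the paper, is meant to address.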