In this paper we present a novel methodology based on non-parametric deformable prototype templates for reconstructing the
outline of a shape from a degraded image. Our method is versatile and fast and has the potential to provide an automatic procedure
for classifying pathologies. We test our approach on synthetic and real data from a variety of medical and biological applications.
In these studies it is important to reconstruct accurately the shape of the object under investigation from very noisy data.
Here we assume that we have some prior knowledge about the object outline represented by a prototype shape. Our procedure
deforms this shape by means of non-affine transformations and the contour is reconstructed by minimizing a newly developed
objective function that depends on the transformation parameters. We introduce an iterative template deformation procedure
in which the scale of the deformation decreases as the algorithm proceeds. We compare our results with those from a Gaussian
Mixture Model segmentation and two state-of-the-art Level Set methods. This comparison shows that the proposed procedure performs
consistently well on both real and simulated data. As a by-product we develop a new filter that recovers the connectivity
of a shape.
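The abstract does not reproduce the objective function or the deformation parameterization, so the following is only a loose, self-contained toy sketch of the general idea: a circular prototype is deformed by a non-affine transformation (here, a hypothetical radial Fourier perturbation), and the transformation parameters are fitted to a noisy synthetic image by minimizing a region-mismatch objective, with the scale of the candidate deformations decreasing as the iterations proceed. All function names, the parameterization, and the objective are illustrative assumptions, not the authors' method.

```python
import math
import random

def make_noisy_image(n=40, noise=0.15, seed=0):
    """Synthetic degraded image: an elliptical indicator plus salt-and-pepper noise."""
    rng = random.Random(seed)
    img = []
    for i in range(n):
        row = []
        for j in range(n):
            x, y = (j - n / 2) / n, (i - n / 2) / n  # coordinates in [-0.5, 0.5]
            inside = (x / 0.35) ** 2 + (y / 0.22) ** 2 <= 1.0
            v = 1.0 if inside else 0.0
            if rng.random() < noise:   # flip a fraction of pixels
                v = 1.0 - v
            row.append(v)
        img.append(row)
    return img

def radius(theta, params, r0=0.25):
    """Hypothetical non-affine deformation of a circular prototype:
    r(theta) = r0 * (1 + sum_k a_k cos(k*theta) + b_k sin(k*theta))."""
    r = 1.0
    K = len(params) // 2
    for k in range(1, K + 1):
        r += params[2 * k - 2] * math.cos(k * theta) \
             + params[2 * k - 1] * math.sin(k * theta)
    return r0 * r

def objective(img, params):
    """Illustrative objective: squared mismatch between the deformed
    template's interior and the noisy image, summed over all pixels."""
    n = len(img)
    err = 0.0
    for i in range(n):
        for j in range(n):
            x, y = (j - n / 2) / n, (i - n / 2) / n
            theta = math.atan2(y, x)
            inside = 1.0 if math.hypot(x, y) <= radius(theta, params) else 0.0
            err += (img[i][j] - inside) ** 2
    return err

def fit_template(img, n_params=6, scales=(0.2, 0.1, 0.05), sweeps=3):
    """Coordinate descent over the deformation parameters; the step size
    (deformation scale) decreases as the algorithm proceeds, mimicking
    the paper's coarse-to-fine iterative scheme."""
    params = [0.0] * n_params
    best = objective(img, params)
    for step in scales:          # decreasing deformation scale
        for _ in range(sweeps):
            for k in range(n_params):
                for delta in (step, -step):
                    trial = list(params)
                    trial[k] += delta
                    e = objective(img, trial)
                    if e < best:
                        best, params = e, trial
    return params, best
```

Running `fit_template(make_noisy_image())` deforms the circle toward the noisy ellipse and reduces the mismatch relative to the undeformed prototype; a real implementation would of course use the paper's objective and a richer transformation family.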
Francesco de Pasquale
received his Ph.D. in Applied Statistics from the University of Plymouth, United Kingdom, in 2004, with a thesis on Bayesian and template-based methods for image analysis. Since obtaining his degree in Physics at the University of Rome ‘La Sapienza’ in 1999, his work has focused on developing models and methods for Magnetic Resonance Imaging, in particular image registration, classification and segmentation in a Bayesian framework. After a two-year appointment as a Lecturer at the University of Plymouth from 2003 to 2004, he is now a post-doctoral researcher at ITAB, the Institute for Advanced Biomedical Technologies, University of Chieti, Italy, where he works on the analysis of fMRI and MEG data.
Julian Stander
was born in Plymouth, UK, in 1964. He received a BA in Mathematics with first-class honours from the University of Oxford in 1987, a Diploma in Mathematical Statistics with distinction from the University of Cambridge in 1988, and a PhD from the University of Bath in 1992. He has been a lecturer at the School of Mathematics and Statistics, University of Plymouth, since 1993, and was promoted to Reader in 2006. His fields of interest include applications of statistics such as image analysis, spatial modelling and disclosure limitation. He has published over 20 refereed journal articles.