Found 20 similar documents (search time: 15 ms)
1.
Vasileios K. Pothos Christos Theoharatos Evangelos Zygouris George Economou 《Pattern Analysis & Applications》2008,11(2):117-129
Texture classification is an important problem in image analysis. In the present study, an efficient strategy for classifying
texture images is introduced and examined within a distributional-statistical framework. Our approach incorporates the multivariate
Wald–Wolfowitz test (WW-test), a non-parametric statistical test that measures the similarity between two different sets of
multivariate data, which is utilized here for comparing texture distributions. By summarizing the texture information using
standard feature extraction methodologies, the similarity measure provides a comprehensive estimate of the match between different
images based on graph theory. The proposed “distributional metric” is shown to handle efficiently the texture-space dimensionality
and the limited sample size drawn from a given image. The experimental results, from the application on a typical texture
database, clearly demonstrate the effectiveness of our approach and its superiority over other well-established texture distribution
(dis)similarity metrics. In addition, its performance is used to evaluate several approaches for texture representation. Even though the classification results are obtained on grayscale images, a direct extension to color images is straightforward.
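The core of the WW-test is the Friedman–Rafsky generalization of the runs statistic: build a minimum spanning tree (MST) over the pooled samples and count the edges that link points from different samples. A minimal sketch (my own function name and SciPy-based implementation, not the authors' code):

```python
import numpy as np
from scipy.spatial.distance import pdist, squareform
from scipy.sparse.csgraph import minimum_spanning_tree

def ww_runs(X, Y):
    """Multivariate Wald-Wolfowitz runs count (Friedman-Rafsky form).

    Pool the two samples, build the Euclidean MST, and count the edges
    joining points from different samples; removing those edges leaves
    cross + 1 subtrees, the multivariate analogue of runs. A small runs
    count indicates well-separated distributions.
    """
    Z = np.vstack([X, Y])
    labels = np.concatenate([np.zeros(len(X)), np.ones(len(Y))])
    mst = minimum_spanning_tree(squareform(pdist(Z)))  # sparse MST
    rows, cols = mst.nonzero()
    cross_edges = int(np.sum(labels[rows] != labels[cols]))
    return cross_edges + 1
```

For two well-separated clusters the MST contains exactly one bridging edge, so the runs count drops to 2; for overlapping samples it grows toward the pooled sample size.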
Vasileios K. Pothos received the B.Sc. degree in Physics in 2004 and the M.Sc. degree in Electronics and Information Processing in 2006, both from the University of Patras (UoP), Greece. He is currently a Ph.D. candidate in image processing at the Electronics Laboratory in the Department of Physics, UoP, Greece. His main research interests include image processing, pattern recognition and multimedia databases. Dr. Christos Theoharatos received the B.Sc. degree in Physics in 1998, the M.Sc. degree in Electronics and Computer Science in 2001 and the Ph.D. degree in Image Processing and Multimedia Retrieval in 2006, all from the University of Patras (UoP), Greece. He has actively participated in several national research projects and is currently working as a PostDoc researcher at the Electronics Laboratory (ELLAB), Electronics and Computer Division, Department of Physics, UoP. Since the academic year 2002, he has been working as tutor at the degree of lecturer in the Department of Electrical Engineering, of the Technological Institute of Patras. His main research interests include pattern recognition, multimedia databases, image processing and computer vision, data mining and graph theory. Prof. Evangelos Zygouris received the B.Sc. degree in Physics in 1971 and the Ph.D. degree in Digital Filters and Microprocessors in 1984, both from the University of Patras (UoP), Greece. He is currently an Associate Professor at Electronics Laboratory (ELLAB), Department of Physics, UoP, where he teaches at both undergraduate and postgraduate level. He has published papers on digital signal and image processing, digital system design, speech coding systems and real-time processing. His main research interests include digital signal and image processing, DSP system design, micro-controllers, micro-processors and DSPs using VHDL. Prof. George Economou received the B.Sc. degree in Physics from the University of Patras (UoP), Greece in 1976, the M.Sc. 
degree in Microwaves and Modern Optics from University College London in 1978 and the Ph.D. degree in Fiber Optic Sensor Systems from the University of Patras in 1989. He is currently an Associate Professor at Electronics Laboratory (ELLAB), Department of Physics, UoP, where he teaches at both undergraduate and postgraduate level. He has published papers on non-linear signal and image processing, fuzzy image processing, multimedia databases, data mining and fiber optic sensors. He has also served as referee for many journals, conferences and workshops. His main research interests include signal and image processing, computer vision, pattern recognition and optical signal processing.
2.
Reza Drikvandi Reza Modarres Abdullah H. Jalilian 《Computational statistics & data analysis》2011,55(4):1807-1814
To test the hypothesis of symmetry about an unknown median, we propose the maximum of a partial sum process based on ranked set samples. We discuss the properties of the test statistic and investigate a modified ranked set sample bootstrap procedure to obtain its sampling distribution. The power of the new test statistic is compared with that of two existing tests in a simulation study.
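The paper's statistic is built on ranked set samples; as a rough illustration of the idea only, here is a simpler bootstrap test of symmetry about the median on an ordinary random sample (the function names and the symmetrization scheme are my own, not the authors'):

```python
import numpy as np

def symmetry_statistic(x):
    """Max partial-sum statistic for symmetry about the sample median.

    Order observations by distance from the median; under symmetry the
    signs form a fair coin-flip sequence, so large partial-sum excursions
    signal asymmetry.
    """
    d = x - np.median(x)
    signs = np.sign(d[np.argsort(np.abs(d))])
    return np.max(np.abs(np.cumsum(signs))) / np.sqrt(len(x))

def bootstrap_pvalue(x, n_boot=999, seed=0):
    """Approximate the null distribution by resampling symmetrized data."""
    rng = np.random.default_rng(seed)
    t_obs = symmetry_statistic(x)
    # Reflect the data about the median so the null of symmetry holds,
    # then resample from the symmetrized pool.
    sym = np.concatenate([x, 2 * np.median(x) - x])
    t_boot = np.array([symmetry_statistic(rng.choice(sym, size=len(x)))
                       for _ in range(n_boot)])
    return (1 + np.sum(t_boot >= t_obs)) / (n_boot + 1)
```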
3.
R. Pintelon 《Automatica》2002,38(8):1295-1311
In the general case of non-uniformly spaced frequency-domain data and/or arbitrarily coloured disturbing noise, the frequency-domain subspace identification algorithms described in McKelvey, Akçay, and Ljung (IEEE Trans. Automatic Control 41(7) (1996) 960) and Van Overschee and De Moor (Signal Processing 52(2) (1996) 179) are consistent only if the covariance matrix of the disturbing noise is known. This paper studies the asymptotic properties (strong convergence, convergence rate, asymptotic normality, strong consistency and loss in efficiency) of these algorithms when the true noise covariance matrix is replaced by the sample noise covariance matrix obtained from a small number of independent repeated experiments. As an additional result, the strong convergence (in case of model errors), the convergence rate and the asymptotic normality of the subspace algorithms with known noise covariance matrix follow.
4.
In an object-relational database management system, a query optimizer requires users to provide cost models of user-defined functions. The traditional approach is analytical; that is, it builds a cost model generated as a result of analyzing the query processing steps. This analytical approach is difficult, however, especially for spatial query operators because of the complexity of the processing steps. In this paper, a new approach that uses non-parametric regression is proposed. This approach significantly simplifies the process of building a cost model, while achieving highly accurate cost estimation. We demonstrate the simplicity and efficacy of this approach through experiments for three spatial operators—the range query, the window query, and the k-nearest neighbor query—commonly used in spatial databases, using both real and synthetic data sets.
5.
Application of bootstrap method in conservative estimation of reliability with limited samples
Victor Picheny Nam Ho Kim Raphael T. Haftka 《Structural and Multidisciplinary Optimization》2010,41(2):205-217
Accurate estimation of reliability of a system is a challenging task when only limited samples are available. This paper presents the use of the bootstrap method to safely estimate the reliability with the objective of obtaining a conservative but not overly conservative estimate. The performance of the bootstrap method is compared with alternative conservative estimation methods, based on biasing the distribution of system response. The relationship between accuracy and conservativeness of the estimates is explored for normal and lognormal distributions. In particular, detailed results are presented for the case when the goal has a 95% likelihood to be conservative. The bootstrap approach is found to be more accurate for this level of conservativeness. We explore the influence of sample size and target probability of failure on the quality of estimates, and show that for a given level of conservativeness, small sample sizes and low probabilities of failure can lead to a high likelihood of large overestimation. However, this likelihood can be reduced by increasing the sample size. Finally, the conservative approach is applied to the reliability-based optimization of a composite panel under thermal loading.
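The conservative-percentile idea can be sketched as follows (a hypothetical helper, not the paper's code): each bootstrap resample yields one plug-in estimate of the probability of failure, and an upper percentile of those estimates serves as the conservative estimate.

```python
import numpy as np

def conservative_pof(samples, threshold, n_boot=2000, conf=0.95, seed=0):
    """Bootstrap a conservative estimate of the probability of failure.

    Each resample of the response samples gives a plug-in estimate of
    P(response > threshold); the `conf`-quantile of these estimates is
    conservative with roughly probability `conf`.
    """
    rng = np.random.default_rng(seed)
    n = len(samples)
    boot = np.empty(n_boot)
    for b in range(n_boot):
        resample = rng.choice(samples, size=n, replace=True)
        boot[b] = np.mean(resample > threshold)  # plug-in failure probability
    return float(np.quantile(boot, conf))
```

With small samples and rare failures the spread of the bootstrap estimates is wide, which is exactly the regime where the paper warns of large overestimation.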
6.
Matias Salibian-Barrera 《Computational statistics & data analysis》2008,52(12):5121-5135
Robust model selection procedures control the undue influence that outliers can have on the selection criteria by using both robust point estimators and a bounded loss function when measuring either the goodness-of-fit or the expected prediction error of each model. Furthermore, to avoid favoring over-fitting models, these two measures can be combined with a penalty term for the size of the model. The expected prediction error conditional on the observed data may be estimated using the bootstrap. However, bootstrapping robust estimators becomes extremely time-consuming on moderate to high dimensional data sets. It is shown that the expected prediction error can be estimated using a very fast and robust bootstrap method, and that this approach yields a consistent model selection method that is computationally feasible even for a relatively large number of covariates. Moreover, as opposed to other bootstrap methods, this proposal avoids the numerical problems associated with the small bootstrap samples required to obtain consistent model selection criteria. The finite-sample performance of the fast and robust bootstrap model selection method is investigated through a simulation study, while its feasibility and good performance on moderately large regression models are illustrated on several real data examples.
7.
Simple bootstrap statistical inference using the SAS system.
S R Cole 《Computer methods and programs in biomedicine》1999,60(1):79-82
Nonparametric bootstrap statistical inference is a robust, computer-intensive method for generating estimates of statistical variability when formulae are not known or asymptotic assumptions are not met. A SAS macro that implements simple nonparametric bootstrap statistical inference is presented with an example. The program code is easily generalized to any SAS procedure that includes a BY statement, and to cases of clustered data.
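The macro's resample-compute-summarize pattern translates directly to other environments; here is a minimal Python sketch of the same nonparametric percentile bootstrap (function and variable names are illustrative, not from the SAS macro):

```python
import numpy as np

def bootstrap_ci(data, stat, n_boot=1999, alpha=0.05, seed=0):
    """Percentile bootstrap confidence interval for an arbitrary statistic.

    Resample the data with replacement, recompute the statistic each
    time, and read the interval off the empirical percentiles.
    """
    rng = np.random.default_rng(seed)
    n = len(data)
    boot = np.array([stat(rng.choice(data, size=n, replace=True))
                     for _ in range(n_boot)])
    return np.quantile(boot, [alpha / 2, 1 - alpha / 2])

# Example: 95% CI for the median of a small skewed sample
data = np.array([1.2, 3.4, 2.2, 8.9, 4.1, 2.7, 5.5])
lo, hi = bootstrap_ci(data, np.median)
```

Because only resampling and percentiles are involved, the same pattern works for any statistic, which is the point of the SAS macro's BY-statement generality.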
8.
《计算机光盘软件与应用》2008,(10):115-115
Web designers once lamented HTML's limited control over text. Today, all major browsers widely support version 2 of the Cascading Style Sheets (CSS) standard, making more refined and complex typography possible. Even the drop caps familiar from books, magazines, and newspapers can be produced with CSS.
9.
10.
《Environmental Modelling & Software》2007,22(1):84-96
Two-sample experiments (paired or unpaired) are often used to analyze treatment effects in life and environmental sciences. Quantifying an effect can be achieved by estimating the difference in center of location between a treated and a control sample. In unpaired experiments, a shift in scale is also of interest. Non-normal data distributions can impose a serious challenge for obtaining accurate confidence intervals for treatment effects. To study the effects of non-normality, we analyzed robust and non-robust measures of treatment effects: differences of averages, medians, standard deviations, and normalized median absolute deviations in the case of unpaired experiments, and the average of differences and median of differences in the case of paired experiments. A Monte Carlo study using bivariate lognormal distributions was carried out to evaluate coverage performances and lengths of four types of nonparametric bootstrap confidence intervals, namely normal, Student's t, percentile, and BCa, for the estimated measures. The robust measures produced smaller coverage errors than their non-robust counterparts. On the other hand, the robust versions gave average confidence interval lengths approximately 1.5 times larger. In unpaired experiments, BCa confidence intervals performed best, while in paired experiments, Student's t intervals were as good as BCa intervals. Monte Carlo results are discussed and recommendations on data sizes are presented. In an application to physiological source–sink manipulation experiments with sunflower, we quantify the effect of an increased or decreased source–sink ratio on the percentage of unfilled grains and the dry mass of a grain. In an application to laboratory experiments with wastewater, we quantify the disinfection effect of predatory microorganisms. The presented bootstrap method to compare two samples is broadly applicable to measured or modeled data from the entire range of environmental research and beyond.
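As an illustration of the interval types compared, here is a paired-design sketch using SciPy's off-the-shelf `bootstrap` routine with the BCa method (the data and variable names are invented; this is not the study's code):

```python
import numpy as np
from scipy.stats import bootstrap

rng = np.random.default_rng(1)
# Hypothetical paired experiment: treated vs. control on the same units,
# with lognormal (non-normal) responses as in the Monte Carlo study.
control = rng.lognormal(mean=0.0, sigma=0.8, size=40)
treated = control * rng.lognormal(mean=0.3, sigma=0.4, size=40)
diffs = treated - control  # paired design: work with within-pair differences

# BCa interval for the median of differences, a robust effect measure
res = bootstrap((diffs,), np.median, method='BCa',
                confidence_level=0.95, n_resamples=999, random_state=rng)
lo, hi = res.confidence_interval
```

Switching `method` to `'percentile'` or `'basic'` reproduces the simpler interval types whose coverage the paper benchmarks against BCa.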
11.
Park Seyoung Kang Jaewoong Kim Jongmo Lee Seongil Sohn Mye 《Multimedia Tools and Applications》2019,78(4):4417-4435
Multimedia Tools and Applications - In this paper, we propose an anomaly detection system of machines using a hybrid learning mechanism that combines two kinds of machine learning approaches,...
12.
One of the basic skills for autonomous robot grasping is selecting an appropriate grasping point for an object. Several recent works have shown that it is possible to learn grasping points from different types of features extracted from a single image or from more complex 3D reconstructions. In the context of learning through experience, this is very convenient, since it does not require a full reconstruction of the object and implicitly incorporates kinematic constraints such as the hand morphology. These learning strategies usually require a large set of labeled examples, which can be expensive to obtain. In this paper, we address the problem of actively learning good grasping points to reduce the number of examples needed by the robot. The proposed algorithm computes the probability of successfully grasping an object at a given location represented by a feature vector. By autonomously exploring different feature values on different objects, the system learns where to grasp each of the objects. The algorithm combines beta–binomial distributions and a non-parametric kernel approach to provide the full distribution of the probability of grasping. This information allows active exploration that efficiently learns good grasping points, even among different objects. We tested our algorithm using a real humanoid robot that acquired the examples by experimenting directly on the objects; the approach therefore deals better with complex (anthropomorphic) hand–object interactions whose results are difficult to model or predict. The results show smooth generalization even in the presence of very few data, as is often the case in learning through experience.
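The conjugate Beta–binomial core of such a model can be sketched as follows (illustrative names; the paper additionally smooths estimates across feature values with a non-parametric kernel, which is omitted here):

```python
def grasp_posterior(successes, failures, alpha0=1.0, beta0=1.0):
    """Beta posterior over the grasp-success probability at one location.

    Starting from a Beta(alpha0, beta0) prior, each observed grasp
    attempt updates the counts; the posterior stays Beta thanks to
    conjugacy with the binomial likelihood.
    """
    return alpha0 + successes, beta0 + failures

def expected_success(successes, failures):
    """Posterior mean success probability, used to rank candidate grasps."""
    a, b = grasp_posterior(successes, failures)
    return a / (a + b)
```

Because the full posterior (not just a point estimate) is available, its spread can drive the active-exploration step: locations with wide posteriors are the informative ones to try next.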
13.
Joshi A Qian X Dione DP Bulsara KR Breuer CK Sinusas AJ Papademetris X 《IEEE transactions on visualization and computer graphics》2008,14(6):1603-1610
The effective visualization of vascular structures is critical for diagnosis, surgical planning, and treatment evaluation. In recent work, we have developed an algorithm for vessel detection that examines the intensity profile around each voxel in an angiographic image and determines the likelihood that any given voxel belongs to a vessel; we term this the "vesselness coefficient" of the voxel. Our results show that our algorithm works particularly well for visualizing branch points in vessels. Compared to standard Hessian-based techniques, which are fine-tuned to identify long cylindrical structures, our technique identifies branches and connections with other vessels. Using our computed vesselness coefficient, we explore a set of techniques for visualizing vasculature. Visualizing vessels is particularly challenging because not only is their position in space important for clinicians, but it is also important to be able to resolve their spatial relationships. We applied visualization techniques that provide shape cues as well as depth cues to allow the viewer to differentiate between vessels that are closer and those that are farther away. We use our computed vesselness coefficient to effectively visualize vasculature in clinical neurovascular x-ray computed tomography based angiography images, as well as in images from three different animal studies. We conducted a formal user evaluation of our visualization techniques with the help of radiologists, surgeons, and other expert users. Results indicate that experts preferred distance color blending and tone shading for conveying depth over standard visualization techniques.
14.
Hagras H. Callaghan V. Colley M. Clarke G. Pounds-Cornish A. Duman H. 《Intelligent Systems, IEEE》2004,19(6):12-20
The Essex intelligent dormitory, iDorm, uses embedded agents to create an ambient-intelligence environment. In a five-and-a-half-day experiment, a user occupied the iDorm, testing its ability to learn user behavior and adapt to user needs. The embedded agent discreetly controls the iDorm according to user preferences. Our work focuses on developing learning and adaptation techniques for embedded agents. We seek to provide online, lifelong, personalized learning of anticipatory adaptive control to realize the ambient-intelligence vision in ubiquitous-computing environments. We developed the Essex intelligent dormitory, or iDorm, as a test bed for this work and an exemplar of this approach.
15.
16.
Adaptive non-parametric identification of dense areas using cell phone records for urban analysis
Alberto Rubio Angel Sanchez Enrique Frias-Martinez 《Engineering Applications of Artificial Intelligence》2013,26(1):551-563
Pervasive large-scale infrastructures (like GPS, WLAN networks or cell-phone networks) generate large datasets containing human behavior information. One of the applications that can benefit from this data is the study of urban environments. In this context, one of the main problems is the detection of dense areas, i.e., areas with a high density of individuals within a specific geographical region and time period. Nevertheless, the techniques used so far face an important limitation: the definition of dense area is not adaptive and as a result the areas identified are related to a threshold applied over the density of individuals, which usually implies that dense areas are mainly identified in downtowns. In this paper, we propose a novel technique, called AdaptiveDAD, to detect dense areas that adaptively define the concept of density using the infrastructure provided by a cell phone network. We evaluate and validate our approach with a real dataset containing the Call Detail Records (CDR) of fifteen million individuals.
17.
Rifat Sonmez 《Expert systems with applications》2011,38(8):9913-9917
Modeling of construction costs is a challenging task, as it requires representation of complex relations between factors and project costs with sparse and noisy data. In this paper, neural networks with bootstrap prediction intervals are presented for range estimation of construction costs. In the integrated approach, neural networks are used for modeling the mapping function between the factors and costs, and the bootstrap method is used to quantify the level of variability included in the estimated costs. The integrated method is applied to range estimation of building projects. Two techniques, elimination of the input variables and Bayesian regularization, were implemented to improve the generalization capabilities of the neural network models. The proposed modeling approach enables identification of a parsimonious mapping function between the factors and cost, and provides a tool to quantify the prediction variability of the neural network models. Hence, the integrated approach presents a robust and pragmatic alternative for conceptual estimation of costs.
18.
Nader Fallah Hong Gu Kazem Mohammad Seyyed Ali Seyyedsalehi Keramat Nourijelyani Mohammad Reza Eshraghian 《Neural computing & applications》2009,18(8):939-943
We describe a novel extension of the Poisson regression model based on a multi-layer perceptron, a type of neural network.
This relaxes the assumptions of the traditional Poisson regression model, while including it as a special case. In this paper,
we describe neural network regression models with six different schemes and compare their performances in three simulated
data sets, namely one linear and two nonlinear cases. From the simulation study it is found that the Poisson regression models
work well when the linearity assumption is correct, but the neural network models can largely improve the prediction in nonlinear
situations.
19.
A residual-based moving block bootstrap procedure is proposed for testing the null hypothesis of linear cointegration versus cointegration with threshold effects. When the regressors and errors of the models are serially and contemporaneously correlated, our test compares favourably with the Sup LM test proposed by Gonzalo and Pitarakis; indeed, shortcomings of the Sup LM test motivated the development of ours. The small sample performance of the bootstrap test is investigated by Monte Carlo simulations, and the results show that the test performs better than the Sup LM test.
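The block-resampling step underlying a residual-based moving block bootstrap can be sketched as follows (an illustrative helper, not the paper's implementation; in the test itself this would be applied to the cointegrating-regression residuals, whose serial dependence the blocks preserve):

```python
import numpy as np

def moving_block_resample(residuals, block_len, rng):
    """Draw one moving-block bootstrap series from a residual vector.

    Overlapping blocks of length `block_len` are drawn with replacement
    and concatenated, then trimmed to the original length, so short-run
    serial correlation within each block is retained.
    """
    n = len(residuals)
    n_blocks = int(np.ceil(n / block_len))
    starts = rng.integers(0, n - block_len + 1, size=n_blocks)
    blocks = [residuals[s:s + block_len] for s in starts]
    return np.concatenate(blocks)[:n]
```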
20.
Felipe Alonso-Atienza José Luis Rojo-Álvarez Alfredo Rosado-Muñoz Juan J. Vinagre Arcadi García-Alberola Gustavo Camps-Valls 《Expert systems with applications》2012,39(2):1956-1967
Early detection of ventricular fibrillation (VF) is crucial for the success of defibrillation therapy in automatic devices. A high number of detectors have been proposed based on temporal, spectral, and time-frequency parameters extracted from the surface electrocardiogram (ECG), always showing limited performance. The combination of ECG parameters from different domains (time, frequency, and time-frequency) using machine learning algorithms has been used to improve detection efficiency. However, the potential utilization of a wide number of parameters in machine learning schemes has raised the need for efficient feature selection (FS) procedures. In this study, we propose a novel FS algorithm based on support vector machine (SVM) classifiers and bootstrap resampling (BR) techniques. We define a backward FS procedure that relies on evaluating changes in SVM performance when removing features from the input space. This evaluation is carried out according to a nonparametric statistic based on BR. After simulation studies, we benchmark the performance of our FS algorithm on the AHA and MIT-BIH ECG databases. Our results show that the proposed FS algorithm outperforms the recursive feature elimination method in synthetic examples, and that the VF detector performance improves with the reduced feature set.