Similar Documents
20 similar documents found (search time: 0 ms)
1.
A surface plotting program is used to produce projections of three-dimensional data from two slightly different viewpoints, forming a stereo pair. The data can then be observed in three dimensions with the aid of a stereo viewer. The technique has been applied to sets of data arising in a number of different areas.
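The core of the technique in the abstract above can be sketched in a few lines: rotating the point set by plus and minus half a small separation angle about the vertical axis, then dropping the depth coordinate, yields a left/right orthographic pair. This is a minimal illustrative sketch, not the paper's program; the 2° separation is an assumed, typical value.

```python
import numpy as np

def stereo_pair(points, separation_deg=2.0):
    """Project 3D points (n, 3) from two slightly different viewpoints.

    Rotating the data by +/- half the separation angle about the
    vertical (y) axis and discarding depth gives an orthographic
    left/right image pair suitable for a stereo viewer.
    """
    half = np.radians(separation_deg) / 2.0
    views = []
    for angle in (-half, +half):
        c, s = np.cos(angle), np.sin(angle)
        rot_y = np.array([[  c, 0.0,   s],
                          [0.0, 1.0, 0.0],
                          [ -s, 0.0,   c]])
        rotated = points @ rot_y.T
        views.append(rotated[:, :2])   # keep (x, y); drop depth z
    return views                       # [left_view, right_view], each (n, 2)

pts = np.array([[1.0, 0.0, 0.0], [0.0, 1.0, 2.0]])
left, right = stereo_pair(pts)
```

Points with zero depth project almost identically in both views; depth shows up as a horizontal offset between the two projections, which is exactly the disparity the stereo viewer fuses.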

2.
A new approach to the problem of smoothing digitized points is presented. The smoothness of the input data is made visible by displaying the first and second differences. Local corrections are performed on the first- and second-difference curves. The improved original data points are then computed by integration.
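A minimal sketch of the idea described above, under stated assumptions: the second differences expose roughness, a simple moving average stands in for the paper's "local corrections", and two cumulative sums (anchored on the original first point and first slope) perform the integration step. The window size and the choice of moving average are illustrative, not taken from the paper.

```python
import numpy as np

def smooth_via_differences(y, window=3):
    """Smooth digitized points by correcting their second differences.

    d2 exposes roughness; a moving average 'corrects' it locally;
    cumulative sums integrate back to data, anchored on the
    original start point y[0] and start slope d1[0].
    """
    d1 = np.diff(y)                    # first differences
    d2 = np.diff(d1)                   # second differences
    kernel = np.ones(window) / window
    d2_s = np.convolve(d2, kernel, mode="same")  # local correction
    d1_rebuilt = np.concatenate(([d1[0]], d1[0] + np.cumsum(d2_s)))
    return np.concatenate(([y[0]], y[0] + np.cumsum(d1_rebuilt)))
```

On data that is already smooth (e.g. a straight line, whose second differences are zero), the correction is a no-op and the integration reproduces the input exactly.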

3.
This article presents new procedures for multisite spatiotemporal neuronal data analysis. A new statistical model, the diffusion model, is considered, whose parameters can be estimated from experimental data thanks to mean-field approximations. This work has been applied to optical recordings of the guinea pig's auditory cortex (layers II-III). The rates of innovation and internal diffusion inside the stimulated area have been estimated. The results suggest that the activity of the layer alternates between the predominance of its innovation process and that of its internal diffusion process.

4.
RGB-D sensors are capable of providing 3D points (depth) together with color information associated with each point. These sensors suffer from different sources of noise. With some kinds of RGB-D sensors, it is possible to pre-process the color image before assigning the color information to the 3D data. However, with other kinds of sensors that is not possible: the RGB-D data must be processed directly. In this paper, we compare different approaches for noise and artifact reduction: Gaussian, mean and bilateral filters. These methods are time consuming when managing 3D data, which can be a problem in real-time applications. We propose new methods to accelerate the whole process and improve the quality of the color information using entropy information. Entropy provides a framework for speeding up the involved methods, allowing certain data not to be processed if the entropy value of that data is over or under a given threshold. The experimental results provide a way to balance the quality and the acceleration of these methods. The current results show that our methods improve both the image quality and the processing time, as compared to the original methods.
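The entropy-gating idea in the abstract above can be sketched as follows: compute the Shannon entropy of each pixel's neighbourhood and apply the (expensive) filter only where the entropy falls between two thresholds. This is a hypothetical, simplified variant using a mean filter on a grayscale image; the thresholds and window size are illustrative, not the paper's values.

```python
import numpy as np

def patch_entropy(patch, bins=16):
    """Shannon entropy (bits) of the grey-level histogram of a patch."""
    hist, _ = np.histogram(patch, bins=bins, range=(0.0, 1.0))
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def entropy_gated_mean_filter(img, radius=1, low=0.5, high=3.5):
    """Mean-filter only pixels whose neighbourhood entropy lies in
    (low, high). Very flat regions (low entropy: nothing to smooth)
    and very busy regions (high entropy: likely real texture) are
    skipped, saving the cost of filtering them."""
    out = img.copy()
    h, w = img.shape
    for i in range(radius, h - radius):
        for j in range(radius, w - radius):
            patch = img[i - radius:i + radius + 1,
                        j - radius:j + radius + 1]
            if low < patch_entropy(patch) < high:
                out[i, j] = patch.mean()
    return out
```

On a perfectly uniform image every patch has zero entropy, so the gate skips all pixels and the image passes through untouched, which is exactly the speed-up the thresholding buys.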

5.
Recently, Lainiotis (1971 c), using the so-called ‘partition theorem’, obtained an optimal linear smoothing algorithm in explicit, closed-form expressions that are attractive from both a computational and an analytical point of view. Lainiotis's (1971 c) ‘partition smoothing’ algorithm is re-examined herein, and its computational and analytical advantages are studied. It is compared to the previously established two-filter smoothing algorithm of Mayne (1966), Fraser (1967), and Mehra (1968), as well as to the ‘innovation smoothing’ algorithm of Kailath and Frost (1968).

Subsequently, the so-called ‘iterative’ or ‘reprocessed’ smoothing scheme, used extensively as a data reduction process in the Apollo Space Programme, is studied using the ‘partition smoothing’ algorithm. The resulting explicit and closed-form expressions are readily amenable to interpretation and optimization, and are both theoretically interesting and practically useful. The statistical and limiting properties of the ‘partition reprocessed smoothing’ algorithm are obtained and thoroughly examined.

6.
This paper is primarily motivated by the problem of automatically removing unwanted noise from high-dimensional remote sensing imagery. The initial step involves the transformation of the data to a space of intrinsically lower dimensionality and the smoothing of images in the new space. Different images require different amounts of smoothing. The signal (assumed to be mostly smooth with relatively few discontinuities) is estimated from the data using the method of generalized cross-validation. It is shown how the generalized cross-validated thin-plate smoothing spline with observations on a regular grid (in d dimensions) is easily approximated and computed in the Fourier domain. Space domain approximations are also investigated. The technique is applied to some remote sensing data.
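The Fourier-domain computation mentioned above can be illustrated in one dimension: on a regular grid, the smoothing spline acts approximately as a per-frequency shrinkage of the data's Fourier coefficients. This is an illustrative sketch, not the paper's algorithm; the transfer function 1/(1 + λω⁴) is the standard cubic-spline form, and λ is left as a free parameter here rather than being chosen by generalized cross-validation.

```python
import numpy as np

def fourier_spline_smooth(y, lam):
    """Approximate a 1-D smoothing spline on a regular grid via the
    Fourier domain: each frequency component is shrunk by
    1 / (1 + lam * w**4). lam = 0 reproduces the data exactly;
    larger lam suppresses high frequencies more strongly."""
    n = len(y)
    w = 2.0 * np.pi * np.fft.fftfreq(n)
    shrink = 1.0 / (1.0 + lam * w**4)
    return np.real(np.fft.ifft(np.fft.fft(y) * shrink))
```

Because the zero-frequency component is never shrunk, the mean of the data is preserved for every λ, while the variance of the smoothed signal can only decrease.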

7.
T. Kailath, Automatica, 1975, 11(1):109-111
Some additions and clarifications are presented as a supplement to a recent survey in this journal of smoothing for linear and nonlinear dynamic systems.

8.
9.
Computers in Industry, 1987, 9(3):223-237
Piecewise rational quadratic curves are frequently used in many fields of computer science to represent curved shapes. For a large number of applications of computer-aided design, the representation of data with a piecewise parametric curve that lies close to the data is an important consideration. This paper presents a geometric solution to the problem of automatically generating piecewise parametric rational quadratic polynomial approximations to shapes from sampled data. The algorithm takes a set of sample points, automatically generates tangents at some points, and derives a piecewise rational quadratic curve that lies close to the data points. Combining this algorithm with the biquadratic search to subdivide the data if it cannot be represented with a single arc, we have a very stable algorithm that gives good results over a range of shapes and applications.

10.
We consider the problem of smoothing a sequence of noisy observations using a fixed class of models. Via a deterministic analysis, we obtain necessary and sufficient conditions on the noise sequence and model class that ensure that a class of natural estimators gives near-optimal smoothing. In the case of i.i.d. random noise, we show that the accuracy of these estimators depends on a measure of complexity of the model class involving covering numbers. Our formulation and results are quite general and are related to a number of problems in learning, prediction, and estimation. As a special case, we consider an application to output smoothing for certain classes of linear and nonlinear systems. The performance of output smoothing is given in terms of natural complexity parameters of the model class, such as bounds on the order of linear systems, the -norm of the impulse response of stable linear systems, or the memory of a Lipschitz nonlinear system satisfying a fading memory condition.

11.
Common carriers throughout the world are finding it difficult to satisfy current needs for rapid data transfer over telephone lines. Both the carrier and the user want quick, reliable and cheap data transmission. Thus, the maximization of line utilization is essential, and this is why the multiplexing of signals from several sources into a single data stream for transfer over one line is becoming popular. The paper describes statistical time division multiplexing techniques and explains the operation of statistical multiplexers. Typical STDM installations are examined, including interfacing statistical multiplexers with other equipment and operation under adverse conditions. The development of switching multiplexers is discussed, and the cost benefits of STDM compared with leased-line communication are estimated.

12.
Suppose the distribution of a population is characterized by a parameter Γ, whose value is not numerical, as is usually the case, but linguistic. The problem is to make a good guess about Γ, especially in the case when only linguistic data are available. In this paper the theory of fuzzy random variables is used to solve this problem: a method for the construction of consistent and unbiased estimates is given.

13.
We propose an algorithm for computing parameter estimates for a smoothing cubic spline that minimize the estimated expectation of losses. Instead of the usual assumption that the noise is centered, we use an assumption which is more realistic for many practical smoothing problems, namely that it has zero median. The problem setting is augmented by prior deterministic information in the form of constraints on linear combinations of parameters of spline functions. We obtain explicit representations of such estimates and give their qualitative interpretation. Based on the results of a numerical experiment, we establish a high degree of robustness of the solutions to the presence of outliers in the measurements, including same-sign outliers, and the possibility to fairly reliably determine the actual accuracy of the resulting estimates of spline parameters by the attained minimum risk value.

14.
J.S. Meditch, Automatica, 1973, 9(2):151-162
A survey of the field of data smoothing for lumped-parameter, linear and nonlinear, dynamic systems is presented. The survey begins with the work of Kepler and Gauss, proceeds through that of Kolmogorov and Wiener, and concludes with the studies of numerous researchers during the past 10-12 years. The purpose of the survey is to place in perspective the development of the field of data smoothing relative to the broader area of estimation theory of which it is a part.

15.
Automated forecasts are often required, in practice, using data series from which certain points are missing, or data occurring at completely irregular time intervals. For instance, in computerised inventory control, fast methods of dealing with such data are required. There is an almost complete absence in the literature of computationally efficient methods for such a situation. This paper gives an extension of single and double exponential smoothing adapted to data occurring at irregular time intervals. These extensions are shown to have modest computational requirements and little sensitivity to initial conditions. Results of tests on sample data series are given, showing only a minor decrease in accuracy with missing data, and indicating the appropriate method of choosing the smoothing parameter. Application of this method to published government time series is illustrated by two examples: firstly, river water quality data originating from samples taken at irregular time intervals and, secondly, divorce rate statistics from which certain points are missing due to summarizing of the data. Successive summarizing of these series is found to have a negligible effect on forecast accuracy, implying attractive cost-saving possibilities in data collection and publication.
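The adaptation described above can be sketched for the single-smoothing case: the weight given to the previous estimate decays with the time elapsed since the last observation, so long gaps let new data dominate. This is a common form of the adaptation (a = exp(-Δt/τ)); the paper's exact recursion and parameter choice may differ, and τ here is an assumed time constant.

```python
import math

def irregular_exp_smooth(times, values, tau=5.0):
    """Single exponential smoothing for irregularly spaced data.

    The carry-over weight a = exp(-dt / tau) shrinks as the gap dt
    since the previous observation grows, so the estimate forgets
    stale history at a rate governed by elapsed time, not by the
    number of observations.
    """
    est = values[0]
    prev_t = times[0]
    out = [est]
    for t, y in zip(times[1:], values[1:]):
        a = math.exp(-(t - prev_t) / tau)   # more decay after long gaps
        est = a * est + (1.0 - a) * y
        out.append(est)
        prev_t = t
    return out
```

With equally spaced observations this reduces to ordinary exponential smoothing with a constant coefficient, which is why the extension adds so little computational cost.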

16.
The precision of an interpretation of gas exchange records in progressive exercise is limited by the typical breath-to-breath variation in the data. Recently, two procedures have been proposed for minimizing the "noise" in the estimates of alveolar gas exchange time series data. One approach utilizes an estimate of pulmonary blood flow (Q) for smoothing purposes. The other approach utilizes an estimate of effective lung volume (V'L) for smoothing purposes. In this paper, we formulate the smoothing problem as a general linear model and demonstrate the concurrent estimation of both V'L and Q. Furthermore, we investigate the interaction between V'L and Q. Specifically, when a high value of lung volume is used (such as the subject's resting functional residual capacity) in the alveolar gas exchange algorithm, the estimate of Q is biased low and the result is a less effective smoothing of the data. In addition, we demonstrate how the Q estimate can be improved by utilizing more appropriate estimates of arterial carbon dioxide tension.

17.
Efficient query processing in traditional database management systems relies on statistics on base data. For centralized systems, there is a rich body of research results on such statistics, from simple aggregates to more elaborate synopses such as sketches and histograms. For Internet-scale distributed systems, on the other hand, statistics management still poses major challenges. With the work in this paper we aim to endow peer-to-peer data management over structured overlays with the power associated with such statistical information, with emphasis on meeting the scalability challenge. To this end, we first contribute efficient, accurate, and decentralized algorithms that can compute key aggregates such as Count, CountDistinct, Sum, and Average. We show how to construct several types of histograms, such as simple Equi-Width, Average-Shifted Equi-Width, and Equi-Depth histograms. We present a full-fledged open-source implementation of these tools for distributed statistical synopses, and report on a comprehensive experimental performance evaluation, evaluating our contributions in terms of efficiency, accuracy, and scalability.
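The simplest of the synopses named above, the equi-width histogram, can be sketched in a few lines: the value domain is split into equally wide buckets, each holding a count. This centralized sketch is illustrative only; the paper's distributed construction additionally merges per-peer counts over the overlay, which is not shown here.

```python
def equi_width_histogram(values, lo, hi, buckets):
    """Equi-width histogram over [lo, hi): the domain is split into
    `buckets` equally wide ranges and each stores a count of the
    values falling into it; out-of-range values are ignored."""
    width = (hi - lo) / buckets
    counts = [0] * buckets
    for v in values:
        if lo <= v < hi:
            counts[min(int((v - lo) / width), buckets - 1)] += 1
    return counts
```

Equi-depth histograms invert the trade-off: bucket boundaries move so that counts are equal, which tracks skewed data better at the cost of a harder (and, in the distributed setting, costlier) construction.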

18.
The application of fuzzy sets theory to statistical confidence intervals for unknown fuzzy parameters is proposed in this paper by considering fuzzy random variables. In order to obtain the belief degrees in the sense of fuzzy sets theory, we transform the original problem into optimization problems. We provide a computational procedure to solve these optimization problems. A numerical example is also provided to illustrate the possible application of fuzzy sets theory to statistical confidence intervals.

19.
Algorithms are presented for fitting data on the sphere by using tensor product splines which satisfy certain boundary constraints. First we consider the least-squares problem when the knots are given. Then we discuss the construction of smoothing splines on the sphere. Here the knots are located automatically. A Fortran IV implementation of these two algorithms is described.

20.

Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号