1.
Previous studies have shown that a random walk model is an appropriate time series model for explaining exchange rate time series. This analysis is based on the assumption that the variance of an exchange rate time series is homogeneous with respect to time. This paper shows that this assumption may be violated for exchange rate time series. The monthly exchange rate of German Deutschemark per U.S. dollar is considered. The data ranges from March 1973 to December 1984. The starting point roughly coincides with the beginning of the floating rate regime. It is seen that a non-linear model would be more appropriate than a linear model for explaining this exchange rate time series.
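The variance-homogeneity point above can be illustrated with a toy check: compare the innovation variance across the two halves of a simulated random walk whose volatility shifts. A minimal sketch on synthetic data (not the actual DM/USD series); the break point and the two volatility levels are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "exchange-rate-like" log series: a random walk whose innovation
# standard deviation jumps from 0.01 to 0.03 halfway through (hypothetical).
n = 240
sigma = np.where(np.arange(n) < n // 2, 0.01, 0.03)
log_rate = np.cumsum(rng.normal(0.0, sigma))

# First differences of a random walk should be i.i.d.; comparing the sample
# variance of the two halves is a crude check of variance homogeneity.
returns = np.diff(log_rate)
var_first = returns[: n // 2 - 1].var()
var_second = returns[n // 2 - 1 :].var()
ratio = var_second / var_first
print(ratio)  # well above 1 here, flagging non-constant variance
```

A formal analysis would use a heteroskedasticity test rather than a raw variance ratio, but the split-sample comparison conveys the idea.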

2.
The object of this paper is to present a model and a set of algorithms for estimating the parameters of a nonstationary time series generated by a continuous change in regime. We apply fuzzy clustering methods to the task of estimating the continuous drift in the time series distribution and interpret the resulting temporal membership matrix as weights in a time varying, mixture probability distribution function (PDF). We analyze the stopping conditions of the algorithm to infer a novel cluster validity criterion for fuzzy clustering algorithms of temporal patterns. The algorithm performance is demonstrated with three different types of signals.

3.
A new method for detecting regime switches between different probability distributions in financial time series is presented. In the proposed method, time series observations are divided into several segments, and a Gaussian model or a Cauchy model is fitted to each segment. The goodness of fit of the global model composed of these local models is evaluated using the Bayesian information criterion (BIC), and the division that minimizes this criterion defines the best model. Based on this method, for example, the specification with a Gaussian process in the first half and with a Cauchy process in the second half becomes applicable. Empirical applications and data-based simulations are presented to indicate the efficacy of the proposed method.
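The segment-wise model comparison described above can be sketched as follows. This is a simplified illustration, not the paper's algorithm: the Cauchy parameters come from robust quantile estimates (an assumption standing in for full maximum likelihood, which has no closed form for the Cauchy), and each segment is scored in isolation rather than searching over divisions:

```python
import numpy as np

def gaussian_bic(x):
    # Gaussian MLE (mean and variance); BIC = -2 logL + k log n with k = 2.
    n = len(x)
    var = x.var()
    loglik = -0.5 * n * (np.log(2 * np.pi * var) + 1.0)
    return -2 * loglik + 2 * np.log(n)

def cauchy_bic(x):
    # Rough Cauchy fit: median location, half-interquartile-range scale.
    n = len(x)
    loc = np.median(x)
    scale = 0.5 * (np.quantile(x, 0.75) - np.quantile(x, 0.25))
    z = (x - loc) / scale
    loglik = np.sum(-np.log(np.pi * scale) - np.log1p(z ** 2))
    return -2 * loglik + 2 * np.log(n)

rng = np.random.default_rng(1)
light = rng.normal(size=500)           # thin-tailed segment
heavy = rng.standard_cauchy(size=500)  # heavy-tailed segment

# On each segment the matching family should win (lower BIC).
print(gaussian_bic(light) < cauchy_bic(light))  # True
print(cauchy_bic(heavy) < gaussian_bic(heavy))  # True
```

With per-segment scores in hand, the method's remaining step is to minimize the summed BIC over candidate segmentations.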

4.
Sequential time series clustering is a technique used to extract important features from time series data. The method can be shown to be the process of clustering in the delay-vector space formalism used in the Dynamical Systems literature. Recently, the startling claim was made that sequential time series clustering is meaningless. This has important consequences for a significant amount of work in the literature, since such a claim invalidates those works' contributions. In this paper, we show that sequential time series clustering is not meaningless, and that the problem highlighted in these works stems from their use of the Euclidean distance metric as the distance measure in the delay-vector space. As a solution, we consider quite a general class of time series, and propose a regime based on two types of similarity that can exist between delay vectors, giving rise naturally to an alternative distance measure to Euclidean distance in the delay-vector space. We show that, using this alternative distance measure, sequential time series clustering can indeed be meaningful. We repeat a key experiment in the work on which the "meaningless" claim was based, and show that our method leads to a successful clustering outcome. Jason R. Chen received the B.E. degree from Sydney University, Australia, in 1991 and then worked mainly in the banking and finance industry until 1997. From 1997 to 2001, he completed his Ph.D. at Australian National University, Canberra, Australia, in robotics. From 2001 to the present, he has been a Research Engineer in the Research School of Information Science and Engineering at Australian National University. His research interests broadly include robotics, data mining, and AI.
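Clustering in the delay-vector space presupposes constructing the delay vectors themselves. A minimal sketch of that embedding step (the paper's alternative distance measure is not reproduced here; `dim` and `lag` are illustrative choices):

```python
import numpy as np

def delay_vectors(x, dim, lag=1):
    # Stack length-`dim` windows x[t], x[t+lag], ..., x[t+(dim-1)*lag]
    # as rows of a matrix: points in the delay-vector space.
    n = len(x) - (dim - 1) * lag
    idx = np.arange(n)[:, None] + lag * np.arange(dim)[None, :]
    return x[idx]

x = np.sin(np.linspace(0, 8 * np.pi, 200))
V = delay_vectors(x, dim=3, lag=2)
print(V.shape)  # (196, 3)
```

Any clustering algorithm can then be run on the rows of `V`; the cited critique concerns which distance to use between those rows.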

5.
A simple test for threshold nonlinearity in either the mean or volatility equation, or both, of a heteroskedastic time series model is proposed. The procedure extends current Bayesian Markov chain Monte Carlo methods and threshold modelling by employing a general double threshold GARCH model that allows for an explosive, non-stationary regime. Posterior credible intervals on model parameters are used to detect and specify threshold nonlinearity in the mean and/or volatility equations. Simulation experiments demonstrate that the method works favorably in identifying model specifications varying in complexity from the conventional GARCH up to the full double-threshold nonlinear GARCH model with an explosive regime, and is robust to over-specification in model orders.

6.
Multivariate time series are ubiquitous among a broad array of applications and often include both categorical and continuous series. Further, in many contexts, the continuous variable behaves nonlinearly conditional on a categorical time series. To accommodate the complexity of this structure, we propose a multi-regime smooth transition model where the transition variable is derived from the categorical time series and the degree of smoothness in transitioning between regimes is estimated from the data. The joint model for the continuous and ordinal time series is developed using a Bayesian hierarchical approach and thus, naturally, quantifies different sources of uncertainty. Additionally, we allow a general number of regimes in the smooth transition model and, for estimation, propose an efficient Markov chain Monte Carlo algorithm by blocking the parameters. Moreover, the model can be effectively used to draw inference on the behavior within and between regimes, as well as inference on regime probabilities. In order to demonstrate the frequentist properties of the proposed Bayesian estimators, we present the results of a comprehensive simulation study. Finally, we illustrate the utility of the proposed model through the analysis of two macroeconomic time series.
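Smooth transition between two regimes is commonly modeled with a logistic weight function whose steepness is estimated from the data. A generic sketch, with a continuous transition variable `s` standing in for the paper's categorical-series-derived one (an assumption for illustration):

```python
import numpy as np

def transition_weights(s, c, gamma):
    # Two-regime logistic smooth transition: weight of regime 2 given
    # transition variable s, location c, and smoothness gamma.
    # As gamma grows, the transition approaches an abrupt threshold switch.
    return 1.0 / (1.0 + np.exp(-gamma * (s - c)))

s = np.linspace(-2, 2, 5)
print(transition_weights(s, c=0.0, gamma=2.0))   # gradual transition
print(transition_weights(s, c=0.0, gamma=50.0))  # nearly a hard threshold
```

The conditional mean is then a weight-mixed combination of the per-regime dynamics, with `c` and `gamma` estimated jointly with the regime parameters.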

7.
The paper presents and estimates an endogenous growth model with public capital and government borrowing. Government behavior (tax rates, spending and borrowing) does not follow optimizing rules but is restricted by two fiscal regimes (rules). In the strict fiscal regime, government borrowing is used for public investment only. In the less strict regime, it can also be used for public investment and, to a certain degree, for the debt service. The growth rate differs in our model variants according to which rule is adopted. Moreover, the growth maximizing income tax rate is different from zero. For the two relevant fiscal regimes, which correspond roughly to the cases of the U.S. and Germany, the model is estimated by employing time series data from 1960.4 to 1992.1 and 1966.1 to 1995.1 respectively. The results suggest an explanation for the different time paths of economic variables in the American and German economies in the post-war period.

8.
This paper presents a semi-automatic methodology for fire scars mapping from a long time series of remote sensing data. Approximately a hundred MSS images from different Landsat satellites were employed over an area of 32 100 km2 in the north-east of the Iberian Peninsula. The analysed period was from 1975 to 1993. The results are a map series of fire history and frequencies. Omission errors are 23% for burned areas greater than 200 ha, while commission errors are 8% for areas greater than 50 ha. Subsequent work based on the resultant fire scars will also help in describing fire regime and in monitoring post-fire regeneration dynamics.

9.
Discrimination of locally stationary time series using wavelets
Time series are sometimes generated by processes that change suddenly from one stationary regime to another, with no intervening periods of transition of any significant duration. A good example of this is provided by seismic data, namely, waveforms of earthquakes and explosions. In order to classify an unknown event as either an earthquake or an explosion, statistical analysts might be helped by having at their disposal an automatic means of identifying, at any time, which pattern prevails. Several authors have proposed methods to tackle this problem by combining the techniques of spectral analysis with those of discriminant analysis. The goal is to develop a discriminant scheme for locally stationary time series such as earthquake and explosion waveforms, by combining the techniques of wavelet analysis with those of discriminant analysis.

10.
A tree-structured heterogeneous autoregressive (tree-HAR) process is proposed as a simple and parsimonious model for the estimation and prediction of tick-by-tick realized correlations. The model can account for regime shifts in the conditional mean dynamics of the realized correlation series that depend on time and on other relevant predictors. Testing the model on S&P 500 Futures and 30-year Treasury Bond Futures realized correlations, empirical evidence is provided that the tree-HAR model reaches a good compromise between simplicity and flexibility. The model yields accurate single- and multi-step out-of-sample forecasts. Such forecasts are also better than those obtained from other standard approaches, in particular when the final goal is multi-period forecasting.
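The HAR backbone of such a model regresses the series on its own averages over short, medium and long horizons. A plain HAR sketch on toy data, with the tree-structured regime splits omitted; the horizons (1, 5, 22) are the conventional daily/weekly/monthly choices, assumed here:

```python
import numpy as np

def har_design(x, horizons=(1, 5, 22)):
    # HAR regressors: trailing averages of the series over each horizon,
    # plus an intercept; the target is the next observation.
    h_max = max(horizons)
    cols = [np.array([x[t - h:t].mean() for t in range(h_max, len(x))])
            for h in horizons]
    X = np.column_stack([np.ones(len(x) - h_max)] + cols)
    y = x[h_max:]
    return X, y

rng = np.random.default_rng(2)
x = 0.5 + 0.01 * rng.standard_normal(300).cumsum()  # toy correlation-like series
X, y = har_design(x)
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(beta.shape)  # (4,)
```

The tree-HAR extension would fit separate coefficient vectors in regimes defined by threshold splits on time and other predictors.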

11.
Many current technological challenges require the capacity of forecasting future measurements of a phenomenon. This, in most cases, leads directly to solving a time series prediction problem. Statistical models are the classical approaches for tackling this problem. More recently, neural approaches such as Backpropagation, Radial Basis Functions and recurrent networks have been proposed as an alternative. Most neural-based predictors have chosen a global modelling approach, which tries to approximate a goal function by adjusting a unique model. This design philosophy can present problems when data is extracted from a phenomenon that continuously changes its operational regime or represents distinct operational regimes in an unbalanced manner. In this paper, two alternative neural-based local modelling approaches are proposed. Both follow the divide and conquer principle, splitting the original prediction problem into several subproblems and adjusting a local model for each one. In order to check their adequacy, these methods are compared with other global and local modelling classical approaches using three benchmark time series and different sizes (medium and high) of training data sets. As shown, both models prove to be useful pragmatic paradigms for improving forecasting accuracy, with the advantages of relatively low computational time and scalability to data set size.

12.
In this study we used satellite altimetry to characterize the time and space variations in water stored in or circulating through rivers, floodplains, wetlands and lakes in the major sub-basins of the Amazon basin. Using a specific methodology to rigorously select original three-dimensional (3D) data from an Environmental Satellite (ENVISAT) mission, water level time series were calculated at the crossing path of the satellite tracks with the water bodies. We took advantage of the continuous sampling of the water level along the satellite track segments that cross the watershed to analyse both spatial and temporal relationships between: (i) the river and its floodplain and (ii) different basins. In particular, this work provides evidence of water leakage between the Negro and Solimões basins at the high water stage. It highlights that the phenomenon of a secondary flood peak occurring in the water level series in the Solimões basin at rising water, known as repiquete, is caused by the equatorial rain regime of the northern upstream tributaries of the Solimões River, but is disconnected from the same phenomenon occurring within the Rio Negro basin.

13.
Accurate and timely prediction of performance parameter values is strongly needed for important complex equipment in engineering. In time series prediction, two problems urgently need to be solved. One is how to achieve accuracy, stability and efficiency together; the other is how to handle time series with multiple regimes. To solve these two problems, a random forests-based extreme learning machine ensemble model and a novel multi-regime approach are proposed respectively, and the two can be integrated to achieve better performance. First, the extreme learning machine (ELM) is used in the proposed model because of its efficiency. Then the regularized ELM and an ensemble learning strategy are used to improve generalization performance and prediction accuracy. The bootstrap sampling technique is used to generate training sample sets for multiple base-level ELM models, and the random forests (RF) model is then used as the combiner to aggregate these ELM models to achieve more accurate and stable performance. Next, based on the specific properties of turbofan engine time series, a multi-regime approach is proposed to handle them. Regimes are first separated; then the proposed RF-based ELM ensemble model is used to learn a model for each regime individually; and last, all the learned regime models are aggregated to predict the performance parameter at the future timestamp. The proposed RF-based ELM ensemble model and multi-regime approach are evaluated using NN3 time series and NASA turbofan engine time series, and the proposed model is then applied to exhaust gas temperature prediction for the CFM engine. The results demonstrate that the proposed RF-based ELM ensemble model and multi-regime approach are accurate, stable and efficient in predicting multi-regime time series, and robust against overfitting.
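The core building block, a regularized ELM trained on bootstrap samples and combined into an ensemble, can be sketched as follows. This is a hedged illustration: the combiner here is a simple average rather than the random forest the paper uses, and the data, layer sizes and regularization are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(3)

def fit_elm(X, y, hidden=50, reg=1e-2):
    # ELM: random (untrained) hidden layer, ridge-regularized linear readout.
    W = rng.normal(size=(X.shape[1], hidden))
    b = rng.normal(size=hidden)
    H = np.tanh(X @ W + b)
    beta = np.linalg.solve(H.T @ H + reg * np.eye(hidden), H.T @ y)
    return W, b, beta

def predict_elm(model, X):
    W, b, beta = model
    return np.tanh(X @ W + b) @ beta

def fit_ensemble(X, y, n_models=10):
    # Bootstrap sampling: each base ELM sees a resampled training set.
    models = []
    for _ in range(n_models):
        idx = rng.integers(0, len(X), size=len(X))
        models.append(fit_elm(X[idx], y[idx]))
    return models

def predict_ensemble(models, X):
    # Simple averaging combiner (the paper's RF combiner is not reproduced).
    return np.mean([predict_elm(m, X) for m in models], axis=0)

# Toy one-step-ahead setup: predict x[t] from the previous 5 values.
x = np.sin(np.linspace(0, 20, 400))
X = np.column_stack([x[i:i - 5] for i in range(5)])
y = x[5:]
models = fit_ensemble(X, y)
err = np.abs(predict_ensemble(models, X) - y).mean()
print(err)
```

In the multi-regime scheme, one such ensemble would be trained per separated regime and the regime models aggregated for the final forecast.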

14.
The Santa Fe Artificial Stock Market consists of a central computational market and a number of artificially intelligent agents. The agents choose between investing in a stock and leaving their money in the bank, which pays a fixed interest rate. The stock pays a stochastic dividend and has a price which fluctuates according to agent demand. The agents make their investment decisions by attempting to forecast the future return on the stock, using genetic algorithms to generate, test, and evolve predictive rules. The artificial market shows two distinct regimes of behavior, depending on parameter settings and initial conditions. One regime corresponds to the theoretically predicted rational expectations behavior, with low overall trading volume, uncorrelated price series, and no possibility of technical trading. The other regime is more complex, and corresponds to realistic market behavior, with high trading volume, high intermittent volatility (including GARCH behavior), bubbles and crashes, and the presence of technical trading. One parameter that can be used to control the regime is the exploration rate, which governs how rapidly the agents explore new hypotheses with their genetic algorithms. At a low exploration rate the market settles into the rational expectations equilibrium. At a high exploration rate it falls into the more realistic complex regime. The transition is fairly sharp, but close to the boundary the outcome depends on the agents’ initial “beliefs”—if they believe in rational expectations they occur and are a local attractor; otherwise the market evolves into the complex regime. This work was presented, in part, at the Third International Symposium on Artificial Life and Robotics, Oita, Japan, January 19–21, 1998.

15.
An efficient second-order method based on the exponential time differencing approach for solving American options under multi-state regime switching is developed and analysed for stability and convergence. The method is seen to be strongly stable (L-stable) in each regime. The implicit predictor–corrector nature of the method makes it highly efficient in solving nonlinear systems of partial differential equations arising from the multi-state regime switching model. The impact of regime switching on option prices for different jump rates and volatility is illustrated. A general framework for multi-state regime switching in multi-asset American options is provided. Numerical experiments are performed on one and two assets to demonstrate the performance of the method with convex as well as non-convex payoffs. The method is compared with some of the existing methods available in the literature and is found to be reliable, accurate and efficient.

16.
Kim and Nelson [1999. State Space Models with Regime Switching. MIT Press, Cambridge, MA] and others extended the framework of state space models involving independent regime changes to the Markov dependent case. Dealing with state space models with Markov switching is computationally costly because of the number of possible paths through the chain. Thus it is necessary to make some approximations in order to obtain a computationally feasible algorithm for estimation. The approximations depend on modified smoothing and filtering recursions that can be easily incorporated into an EM algorithm for maximum likelihood estimation. To investigate the accuracy of the approximations, we develop a new method to obtain more exact solutions, and then compare the two methods. We apply both methods to a simulated series. The result shows that employing the approximation-based algorithm not only provides accurate results but also leads to a significant reduction in computational cost. We also apply the methods to an influenza mortality series, for which we develop a model that is general enough to include most structural models useful in monitoring changes of regime. The proposed model has the flexibility to deal with a wide range of problems involving possible regime shifts in pattern that may be seen to occur in many biological, medical and epidemiological studies.

17.
Kazakhstan is the second largest country to emerge from the collapse of the Soviet Union. Consequent to the abrupt institutional changes surrounding the disintegration of the Soviet Union in the early 1990s, Kazakhstan has reportedly undergone extensive land cover/land use change. Were the institutional changes sufficiently great to affect land surface phenology at spatial resolutions and extents relevant to mesoscale meteorological models? To explore this question, we used the NDVI time series (1985-1988 and 1995-1999) from the Pathfinder Advanced Very High Resolution Radiometer (AVHRR) Land (PAL) dataset, which consists of 10-day maximum NDVI composites at a spatial resolution of 8 km. Daily minimum and maximum temperatures were extracted from the NCEP Reanalysis Project, and 10-day composites of accumulated growing degree-days (AGDD) were produced. We selected for intensive study seven agricultural areas ranging from regions with rain-fed spring wheat cultivation in the north to regions of irrigated cotton and rice in the south. We applied three distinct but complementary statistical analyses: (1) nonparametric testing of sample distributions; (2) simple time series analysis to evaluate trends and seasonality; and (3) simple regression models describing NDVI as a quadratic function of AGDD. The irrigated areas displayed different temporal developments of NDVI between 1985-1988 and 1995-1999. As the temperature regime between the two periods was not significantly different, we conclude that the observed differences in the temporal development of NDVI resulted from changes in agricultural practices. In the north, the temperature regime was also comparable for both periods. Based on extant socioeconomic studies and our model analyses, we conclude that the changes in the observed land surface phenology in the northern regions are caused by large increases in fallow land dominated by weedy species and by grasslands under reduced grazing pressure. Using multiple lines of evidence allowed us to assess whether differences in land surface phenology were mostly the result of anthropogenic influences or of interannual climatic fluctuations.

18.
In a recent paper we presented a new algorithm for hierarchical unsupervised fuzzy clustering (HUFC) and demonstrated its performance for biomedical state identification. In the present paper, a new hybrid algorithm for time series prediction applies the HUFC algorithm to group and model related temporal patterns that are dispersed along a non-stationary signal. Vague and gradual changes in regime are naturally treated by means of fuzzy clustering. An adaptive hierarchical selection of the number of clusters (the number of underlying processes) can overcome the generally non-stationary nature of real-life time series (biomedical, physical, economical, etc.).

19.

In the analysis and prediction of real world systems, two of the key problems are nonstationarity (often in the form of switching between regimes) and overfitting (particularly serious for noisy processes). This article addresses these problems using gated experts, consisting of a nonlinear gating network and several also nonlinear competing experts. Each expert learns to predict the conditional mean, and each expert adapts its width to match the noise level in its regime. The gating network learns to predict the probability of each expert given the input. This article focuses on the case where the gating network bases its decision on information from the inputs. This can be contrasted to hidden Markov models, where the decision is based on the previous state, i.e., on the output of the gating network at the previous time step, as well as to averaging over several predictors. In contrast, gated experts softly partition the input space. This article discusses the underlying statistical assumptions, derives the weight update rules, and compares the performance of gated experts to standard methods on three time series: (1) a computer-generated series obtained by randomly switching between two nonlinear processes; (2) a time series from the Santa Fe Time Series Competition (the light intensity of a laser in a chaotic state); and (3) the daily electricity demand of France (a real-world multivariate problem with structure on several timescales). The main results are: (1) the gating network correctly discovers the different regimes of the process; (2) the widths associated with each expert are important for the segmentation task, and they can be used to characterize the subprocesses; and (3) there is less overfitting compared to single networks (homogeneous multilayer perceptrons), since the experts learn to match their variances to the local noise levels. This can be viewed as matching the local complexity of the model to the local complexity of the data.
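The forward pass of a gated-experts predictor can be sketched in a few lines: a softmax gate weights the competing experts' outputs given the input. Linear experts and toy weights are assumptions for illustration; the weight update rules and width adaptation described above are omitted:

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax over the last axis.
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

rng = np.random.default_rng(4)
n_experts, d = 3, 5
Wg = rng.normal(size=(d, n_experts))  # gating weights (toy values)
We = rng.normal(size=(n_experts, d))  # one linear expert per row

x = rng.normal(size=d)
gate = softmax(x @ Wg)   # p(expert | input): the soft partition of input space
experts = We @ x         # each expert's prediction of the conditional mean
y_hat = gate @ experts   # gate-weighted mixture prediction
print(gate.sum())
```

Training would fit `Wg`, the expert parameters, and the per-expert noise widths jointly, e.g. by maximizing the mixture likelihood.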

20.
We analyze the effect of a classical random telegraph noise on the dynamics of quantum correlations and decoherence between two non-interacting spin-qutrit particles, initially entangled, and coupled either to independent sources or to a common source of noise. Both Markovian and non-Markovian environments are considered. For the Markov regime, as the noise switching rate decreases, a monotonic decay of the initial quantum correlations is found, and the loss of coherence increases monotonically with time up to the saturation value. For the non-Markov regime, evident oscillations of correlations and decoherence are observed due to the noise regime, but correlations nevertheless avoid sudden death phenomena. The oscillatory behavior becomes more and more prominent as the noise switching rate decreases in this regime, thus enhancing the robustness of correlations. Similarly to the qubit case, independent-environment coupling is more effective than common-environment coupling in preserving quantum correlations and coherence of the system for Markovian noise; meanwhile, the opposite is found for the non-Markovian one.
