Similar Literature
 20 similar records found (search time: 15 ms)
1.
This article considers the problem of testing for symmetry of the marginal distribution of weakly dependent, stationary random processes. A quantile‐based test for symmetry is proposed, which is easy to implement, requires no moment assumptions and has a standard asymptotic distribution. The finite‐sample properties of the test are assessed by means of Monte Carlo experiments. An application to financial time series is also discussed.
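As a rough illustration of how a quantile-based symmetry measure can be built (the article's exact statistic and its asymptotic normalization are not reproduced here), a minimal Python sketch:

```python
import numpy as np

def quantile_symmetry_stat(x, p=0.1):
    """Illustrative quantile-based symmetry measure: for a symmetric
    marginal distribution, Q(p) + Q(1-p) - 2*median should be near zero.
    (Hypothetical simplification; not the article's statistic.)"""
    q_lo, med, q_hi = np.quantile(x, [p, 0.5, 1.0 - p])
    return (q_lo + q_hi - 2.0 * med) / (q_hi - q_lo)

# Example: a symmetric series vs. a skewed one
rng = np.random.default_rng(0)
print(quantile_symmetry_stat(rng.standard_normal(5000)))   # close to 0
print(quantile_symmetry_stat(rng.exponential(size=5000)))  # clearly positive
```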

2.
This article proposes a general time series framework to capture the long‐run behaviour of financial series. The suggested approach includes linear and segmented time trends, and stationary and non‐stationary processes based on integer and/or fractional degrees of differentiation. Moreover, the spectrum is allowed to contain more than a single pole or singularity, occurring at both zero and non‐zero (cyclical) frequencies. This framework is used to analyse five annual time series with a long span, namely dividends, earnings, interest rates, stock prices and long‐term government bond yields. The results based on several likelihood criteria indicate that the five series exhibit fractional integration with one or two poles in the spectrum, and are quite stable over the sample period examined.
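For intuition about estimating a fractional pole at frequency zero, a simple log-periodogram (GPH-type) sketch is given below; the article itself uses likelihood criteria and also allows cyclical poles, which this sketch does not handle:

```python
import numpy as np

def gph_estimate(x, m=None):
    """Log-periodogram (GPH-type) estimate of the memory parameter d at
    frequency zero: regress log I(freq_j) on -2*log(2*sin(freq_j/2))."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    if m is None:
        m = int(n ** 0.5)                     # a common bandwidth choice
    freqs = 2.0 * np.pi * np.arange(1, m + 1) / n
    periodogram = np.abs(np.fft.fft(x - x.mean())[1:m + 1]) ** 2 / (2.0 * np.pi * n)
    regressor = -2.0 * np.log(2.0 * np.sin(freqs / 2.0))
    slope = np.polyfit(regressor, np.log(periodogram), 1)[0]
    return slope                               # estimate of d

rng = np.random.default_rng(1)
print(gph_estimate(rng.standard_normal(4096)))  # near 0 for white noise
```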

3.
Recently, to account for low-frequency market dynamics, several volatility models employing high-frequency financial data have been developed. However, in financial markets, we often observe that financial volatility processes depend on economic states, so they have a state heterogeneous structure. In this article, to study state heterogeneous market dynamics based on high-frequency data, we introduce a novel volatility model based on a continuous Itô diffusion process whose intraday instantaneous volatility process evolves depending on the exogenous state variable, as well as its integrated volatility. We call it the state heterogeneous GARCH-Itô (SG-Itô) model. We suggest a quasi-likelihood estimation procedure with the realized volatility proxy and establish its asymptotic behaviour. Moreover, to test for low-frequency state heterogeneity, we develop a Wald-type hypothesis testing procedure. The results of empirical studies suggest the existence of leverage, investor attention, market illiquidity, stock market comovement, and post-holiday effects in S&P 500 index volatility.
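The realized-volatility proxy mentioned above is simply the sum of squared intraday log returns; a minimal sketch of the proxy step only (the SG-Itô quasi-likelihood estimation itself is not shown):

```python
import numpy as np

def realized_volatility(log_prices):
    """Daily realized volatility: the sum of squared intraday log returns,
    a standard proxy for the day's integrated volatility."""
    returns = np.diff(np.asarray(log_prices, dtype=float))
    return np.sum(returns ** 2)

# Example with simulated 5-minute log prices for one trading day
rng = np.random.default_rng(2)
log_p = np.cumsum(0.001 * rng.standard_normal(78))  # 78 five-minute bars
print(realized_volatility(log_p))
```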

4.
The paper presents a study of temporal dependence in nonlinear transformations of time series. We examine the effects of parametric transformations on autocorrelation values and the persistence range, with special emphasis on long memory processes. We derive an invariance property for the order of fractional integration of transformed normal processes and propose a related specification test. Within the class of nonlinear time series transforms, we identify those which maximize autocorrelations at selected lags. This procedure is based on nonlinear canonical correlation analysis adapted to serially correlated data. The methods proposed in this paper may be applied to various financial time series that are usually transformed prior to estimation, such as returns, volumes or inter-trade durations. In examples illustrating our approach, we use series of durations between trades of the Alcatel stock on the Paris Bourse.
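To see why nonlinear transforms matter for temporal dependence, a small illustration (not the paper's canonical-correlation procedure) comparing the autocorrelations of a Gaussian AR(1) with those of a nonlinear transform of it:

```python
import numpy as np

def acf(x, max_lag=5):
    """Sample autocorrelation function up to max_lag."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    denom = np.dot(x, x)
    return np.array([np.dot(x[:-k], x[k:]) / denom for k in range(1, max_lag + 1)])

rng = np.random.default_rng(3)
n, phi = 20000, 0.8
eps = rng.standard_normal(n)
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi * x[t - 1] + eps[t]
print(acf(x))            # roughly phi**k
print(acf(np.exp(x)))    # damped relative to the Gaussian series
```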

5.
We consider a model for the discrete nonboundary wavelet coefficients of autoregressive fractionally integrated moving average (ARFIMA) processes in each scale. Because the main utility of the wavelet transform for long‐range dependent processes, as many authors have noted in the semiparametric literature, is that it approximately whitens the transformed processes in each scale, there have been few studies in a parametric setting. In this article, we propose the model from the forms of the (generalized) spectral density functions (SDFs) of these coefficients. Since the discrete wavelet transform involves downsampling, we cannot represent these (generalized) SDFs directly. To overcome this problem, we define the discrete non‐decimated nonboundary wavelet coefficients and compute their (generalized) SDFs. Using these functions and restricting the wavelet filters to the Daubechies wavelets and least asymmetric filters, we derive explicit forms of the (generalized) SDFs of the discrete nonboundary wavelet coefficients of ARFIMA processes in each scale. Additionally, we propose a model for the discrete nonboundary scaling coefficients in each scale.

6.
The results of temperature‐dependent dielectric and rheological measurements are reported on polymer‐ceramic composite films, poly(methyl methacrylate) (PMMA) : lead titanate (PbTiO3). The relaxation processes of the PMMA host matrix have been analysed using these measurements. It is found that the α‐relaxation is affected significantly more by the addition of filler than the β‐relaxation. The composite films are found to have much lower dielectric constants in comparison to the pure ceramic material. Suitable models have been used to explain the observed dielectric constant of the composite films. Rheological measurements also reveal reinforcement of the composite films due to the addition of the ceramic filler, and the results are discussed in the article. © 2012 Wiley Periodicals, Inc. J. Appl. Polym. Sci., 2013

7.
Evolution of microstructure and rheology during flow startup, and its connection to microscopic transport processes, is studied theoretically via active microrheology. At steady state, the balance between entropic, hydrodynamic, and other forces changes with flow strength, producing sustained microstructural asymmetry and non‐Newtonian rheology. However, the transition from equilibrium to steady flow is sometimes marked by overshoots in viscosity that suggest a temporally evolving competition between these rate processes. Here, we formulate and solve a Smoluchowski equation for the time‐dependent evolution of particle microstructure induced by the motion of a colloidal probe driven through a bath of colloidal spheres. The structure is then utilized to compute the time‐dependent microviscosity. Brownian diffusion always sets short‐time particle dynamics, which hinders maturation of the boundary layer. The disparity in Brownian and advective transport rates produces a reversal from flow thinning to flow thickening during startup, revealing that non‐Newtonian flow phenomenology is not instantaneously established. © 2018 American Institute of Chemical Engineers AIChE J, 64: 3198–3214, 2018

8.
9.
The purpose of this article is to develop the likelihood ratio test for structural change from an AR model to a threshold AR model. It is shown that the log‐likelihood ratio test converges in distribution to the maximum of a two‐parameter Gaussian process. This limiting distribution is novel and we tabulate the critical values. Some simulations are carried out to examine the finite‐sample performance of this test statistic. This article also includes a weak convergence result for a two‐parameter marked empirical process, which is of independent interest.

10.
11.
The time aggregation of vector linear processes that (i) contain mixed stock‐flow data and (ii) are aggregated at mixed frequencies is explored, focusing on a method to translate the parameters of the underlying continuous time model into those of an equivalent model of the observed data. Based on manipulations of a general state‐space form, the results may be used to model multiple frequencies or aggregation schemes. Estimation of the continuous time parameters via the ARMA representation of the observable data vector is discussed and demonstrated in an application to stock price and dividend data. Simulation evidence suggests that these estimators have superior properties to the traditional approach of aggregating the data to a single low frequency.

12.
A new multivariate stochastic volatility estimation procedure for financial time series is proposed. A Wishart autoregressive process is considered for the volatility precision covariance matrix, for the estimation of which a two‐step procedure is adopted. The first step is conditional inference on the autoregressive parameters and the second step is unconditional inference, based on a Newton‐Raphson iterative algorithm. The proposed methodology, which is mostly Bayesian, is suitable for medium‐dimensional data and bridges the gap between closed‐form estimation and simulation‐based estimation algorithms. An example, consisting of foreign exchange rate data, illustrates the proposed methodology.

13.
This article introduces a testing procedure for cointegration and nonlinear adjustment in a smooth transition vector error correction model. To overcome the unidentified‐parameters problem under the null of no cointegration, the Wald statistic is optimized over the unidentified parameter space. The asymptotic distribution of the test statistic is shown to be non‐standard but nuisance‐parameter‐free, and hence critical values are obtained by simulation. Simulations show that the proposed test outperforms the alternatives in small samples in terms of both size and power. An application to the exchange rate‐monetary fundamentals relationship shows that the proposed test performs considerably well. The article also finds that the nonlinear adjustment dynamics are symmetric for some currencies, in which case the speed of adjustment depends only on the size of the deviations, and asymmetric for others, in which case the adjustment depends not only on the size but also on the sign of the deviations.

14.
Several tests for detecting mean shifts at an unknown time in stationary time series have been proposed, including cumulative sum (CUSUM), Gaussian likelihood ratio (LR), maximum of F (Fmax) and extreme value statistics. This article reviews these tests, connects them with theoretical results, and compares their finite‐sample performance via simulation. We propose an adjusted CUSUM statistic which is closely related to the LR test and which links all of the tests. We find that tests based on CUSUMing estimated one‐step‐ahead prediction residuals from a fitted autoregressive moving average perform well in general, and that the LR and Fmax tests (which entail substantial computational complexity) offer only a slight increase in power over the adjusted CUSUM test. We also conclude that CUSUM procedures work slightly better when the changepoint is located near the centre of the data, but the adjusted CUSUM methods are preferable when the changepoint lies closer to the beginning or end of the data record. Finally, an application is presented to demonstrate the importance of the choice of method.
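A minimal sketch of the classical CUSUM statistic for a mean shift at an unknown time; the plain i.i.d. studentization used here is a simplification, whereas for dependent data the article CUSUMs one-step-ahead ARMA prediction residuals:

```python
import numpy as np

def cusum_statistic(x):
    """Classical CUSUM statistic: max_k |S_k - (k/n) S_n| / (sigma*sqrt(n)),
    with sigma estimated by the plain sample standard deviation."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    s = np.cumsum(x)
    bridge = s - np.arange(1, n + 1) / n * s[-1]
    return np.max(np.abs(bridge)) / (np.std(x, ddof=1) * np.sqrt(n))

rng = np.random.default_rng(4)
clean = rng.standard_normal(500)
shifted = np.concatenate([clean[:250], clean[250:] + 1.0])
print(cusum_statistic(clean))    # typically below ~1.36 (approx. 5% Brownian-bridge level)
print(cusum_statistic(shifted))  # clearly larger when a mean shift is present
```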

15.
The aim of this paper is to examine the application of measures of persistence in a range of time‐series models nested in the framework of Cramer (1961). This framework is a generalization of the Wold (1938) decomposition for stationary time series which, in addition to accommodating the standard I(0) and I(1) models, caters for a broad range of alternative processes. Two measures of persistence are considered in some detail, namely the long‐run impulse‐response and variance‐ratio functions. Particular emphasis is given to the behaviour of these measures in a range of non‐stationary models specified in discrete time. We document the conflict that arises between different measures applied to the same model, as well as the conflict arising from the use of a given measure in different models. Precisely which persistence measures are time dependent and which are not is highlighted. The nature of the general representation used also helps to clarify which shock the impulse‐response function refers to in models where more than one random disturbance impinges on the time series.
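A minimal sketch of the variance-ratio persistence measure for a single observed series (the paper's treatment covers a much wider class of non-stationary models):

```python
import numpy as np

def variance_ratio(x, k):
    """Variance-ratio measure VR(k) = Var(x_t - x_{t-k}) / (k * Var(x_t - x_{t-1})).
    For a pure random walk VR(k) is about 1; values below 1 indicate
    mean reversion, values above 1 indicate extra persistence."""
    x = np.asarray(x, dtype=float)
    d1 = np.diff(x)
    dk = x[k:] - x[:-k]
    return np.var(dk, ddof=1) / (k * np.var(d1, ddof=1))

rng = np.random.default_rng(5)
walk = np.cumsum(rng.standard_normal(10000))
print(variance_ratio(walk, 10))                        # close to 1 (I(1))
print(variance_ratio(rng.standard_normal(10000), 10))  # well below 1 (I(0) noise)
```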

16.
The existing estimation methods for the model parameters of the unified GARCH–Itô model (Kim and Wang, 2014) require a long observation period to obtain consistency. However, in practice, it is hard to believe that the structure of a stock price is stable over such a long period. In this article, we introduce an estimation method for the model parameters based on high‐frequency financial data over a finite observation period. In particular, we establish a quasi‐likelihood function for the daily integrated volatilities, and realized volatility estimators are adopted to estimate the integrated volatilities. The model parameters are estimated by maximizing the quasi‐likelihood function. We establish asymptotic theory for the proposed estimator. A simulation study is conducted to check its finite‐sample performance. We apply the proposed estimation approach to Bank of America stock price data.
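A hedged sketch of the kind of quasi-likelihood involved, with a simple GARCH(1,1)-style recursion standing in for the model-implied conditional daily volatility (the actual unified GARCH–Itô volatility dynamics are not reproduced); in practice this function would be maximized over the parameters:

```python
import numpy as np

def quasi_loglik(theta, realized_vol):
    """Sketch of a quasi-log-likelihood of the form
    -sum_i [log h_i(theta) + RV_i / h_i(theta)], with the daily integrated
    volatility replaced by its realized-volatility proxy RV_i and h_i(theta)
    an illustrative GARCH(1,1)-style recursion (omega, alpha, beta)."""
    omega, alpha, beta = theta
    rv = np.asarray(realized_vol, dtype=float)
    h = np.empty_like(rv)
    h[0] = rv.mean()
    for i in range(1, len(rv)):
        h[i] = omega + alpha * rv[i - 1] + beta * h[i - 1]
    return -np.sum(np.log(h) + rv / h)

rng = np.random.default_rng(10)
rv = 1e-4 * (1.0 + 0.3 * rng.standard_normal(250)) ** 2
print(quasi_loglik((1e-5, 0.2, 0.7), rv))
```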

17.
In this article, we revisit a time series model introduced by McElroy and Politis (2007a) and generalize it in several ways to encompass a wider class of stationary, nonlinear, heavy‐tailed time series with long memory. The joint asymptotic distribution of the sample mean and sample variance under the extended model is derived; the associated convergence rates are found to depend crucially on the tail thickness and the long memory parameter. A self‐normalized sample mean that concurrently captures the tail and memory behaviour is defined. Its asymptotic distribution is approximated by subsampling without knowledge of the tail and/or memory parameters; a result of independent interest regarding subsampling consistency for certain long‐range dependent processes is provided. The subsampling‐based confidence intervals for the process mean are shown to have good empirical coverage rates in a simulation study. The influence of block size on coverage and the performance of a data‐driven rule for block size selection are assessed. The methodology is further applied to a series of packet counts from ethernet traffic traces.
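A rough sketch of a subsampling confidence interval for the process mean based on overlapping blocks; the naive studentization below is an illustrative assumption and does not adapt to unknown tail or memory parameters the way the article's self-normalization does:

```python
import numpy as np

def subsampling_ci(x, block_len, alpha=0.05):
    """Subsampling CI for the mean using the studentized subsample statistic
    t_b = (xbar_b - xbar_n) / (s_b / sqrt(b)) over overlapping blocks."""
    x = np.asarray(x, dtype=float)
    n, b = len(x), block_len
    full_mean = x.mean()
    blocks = np.lib.stride_tricks.sliding_window_view(x, b)
    t_sub = (blocks.mean(axis=1) - full_mean) / (blocks.std(axis=1, ddof=1) / np.sqrt(b))
    lo_q, hi_q = np.quantile(t_sub, [alpha / 2, 1 - alpha / 2])
    scale = x.std(ddof=1) / np.sqrt(n)
    return full_mean - hi_q * scale, full_mean - lo_q * scale

rng = np.random.default_rng(6)
print(subsampling_ci(rng.standard_normal(2000) + 0.5, block_len=50))
```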

18.
When considering two or more time series of functional data objects, for instance those derived from densely observed intraday stock price data of several companies, the empirical cross‐covariance operator is of fundamental importance due to its role in functional lagged regression and exploratory data analysis. Despite its relevance, statistical procedures for measuring the significance of such estimators are currently undeveloped. We present methodology based on a functional central limit theorem for conducting statistical inference on the cross‐covariance operator estimated between two stationary, weakly dependent functional time series. Specifically, we consider testing the null hypothesis that the two series possess a specified cross‐covariance structure at a given lag. Since this test assumes that the series are jointly stationary, we also develop a change‐point detection procedure to validate this assumption, which is of independent interest. The most imposing technical hurdle in implementing the proposed tests involves estimating the spectrum of a high‐dimensional spectral density operator at frequency zero. We propose a simple dimension reduction procedure based on functional principal component analysis to achieve this, which is shown to perform well in a simulation study. We illustrate the proposed methodology with an application to densely observed intraday price data of stocks listed on the New York Stock Exchange.
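A minimal sketch of the empirical lagged cross-covariance operator for two functional time series observed on a common grid (the inference procedure and FPCA dimension reduction are not reproduced):

```python
import numpy as np

def empirical_cross_covariance(X, Y, lag=0):
    """Empirical cross-covariance at a given lag: X and Y are
    (n_days, n_gridpoints) arrays, and the result is the grid-by-grid
    matrix C_h(s, t) = mean over days of Xc_i(s) * Yc_{i+h}(t)."""
    Xc = X - X.mean(axis=0)
    Yc = Y - Y.mean(axis=0)
    if lag > 0:
        Xc, Yc = Xc[:-lag], Yc[lag:]
    return Xc.T @ Yc / Xc.shape[0]

rng = np.random.default_rng(7)
X = rng.standard_normal((250, 30))   # e.g. 250 days, 30 intraday grid points
Y = 0.5 * X + rng.standard_normal((250, 30))
C0 = empirical_cross_covariance(X, Y, lag=0)
print(C0.shape, np.round(C0.diagonal().mean(), 2))  # diagonal near 0.5
```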

19.
Multivariate processes with long‐range dependent properties are found in a large number of applications, including finance, geophysics and neuroscience. For real‐data applications, the correlation between time series is crucial. Standard correlation estimates can be highly biased owing to phase shifts caused by differences in the autocorrelation properties of the processes. To address this issue, we introduce a semiparametric estimation procedure for multivariate long‐range dependent processes. The parameters of interest in the model are the vector of long‐range dependence parameters and the long‐run covariance matrix, also called functional connectivity in neuroscience. This matrix characterizes the coupling between time series. The proposed multivariate wavelet‐based Whittle estimation is shown to be consistent for the estimation of both the long‐range dependence parameters and the covariance matrix, and to encompass both stationary and nonstationary processes. A simulation study and a real‐data example are presented to illustrate the finite‐sample behaviour.
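For intuition, a univariate Haar-wavelet log-variance (log-scale diagram) sketch of a memory-parameter estimate; the article's estimator is a multivariate wavelet-based Whittle procedure that also delivers the long-run covariance matrix, which this simplification does not attempt:

```python
import numpy as np

def haar_wavelet_d_estimate(x, max_level=6):
    """Estimate the memory parameter d from Haar wavelet detail variances:
    log2 Var(detail_j) grows roughly linearly in the scale j with slope 2d."""
    x = np.asarray(x, dtype=float)
    log2_vars, levels = [], []
    approx = x.copy()
    for j in range(1, max_level + 1):
        m = len(approx) // 2 * 2
        pairs = approx[:m].reshape(-1, 2)
        detail = (pairs[:, 0] - pairs[:, 1]) / np.sqrt(2.0)   # detail coefficients
        approx = (pairs[:, 0] + pairs[:, 1]) / np.sqrt(2.0)   # approximation, next scale
        log2_vars.append(np.log2(np.var(detail)))
        levels.append(j)
    slope = np.polyfit(levels, log2_vars, 1)[0]
    return slope / 2.0

rng = np.random.default_rng(8)
print(haar_wavelet_d_estimate(rng.standard_normal(2 ** 14)))  # near 0 for white noise
```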

20.
Online property prediction in industrial rubber mixing processes is not an easy task. An efficient data‐driven prediction model is developed in this work. The regularized extreme learning machine (RELM) is utilized as the fundamental soft sensor model. To better capture the distinct characteristics of multiple recipes and operating modes, a just‐in‐time RELM modeling method is developed. The number of hidden neurons and the value of the regularization parameter of the just‐in‐time RELM model can be efficiently selected using a fast leave‐one‐out strategy. Consequently, without the time‐consuming laboratory analysis process, the Mooney viscosity can be predicted online once a mixing batch has been discharged. The industrial Mooney viscosity prediction results show better prediction performance in comparison with traditional approaches. © 2017 Wiley Periodicals, Inc. J. Appl. Polym. Sci. 2017, 134, 45391.
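A hedged sketch of an RELM-style soft sensor: a random sigmoid hidden layer with a closed-form ridge readout, plus a fast leave-one-out error via the PRESS shortcut, the kind of criterion that can drive selection of the hidden-layer size and regularization parameter. All names and settings here are illustrative, not the article's:

```python
import numpy as np

def relm_fit_predict(X_train, y_train, X_test, n_hidden=50, lam=1e-2, seed=0):
    """RELM sketch: random hidden layer, ridge readout, fast LOO RMSE."""
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((X_train.shape[1], n_hidden))
    b = rng.standard_normal(n_hidden)

    def hidden(A):
        return 1.0 / (1.0 + np.exp(-(A @ W + b)))            # sigmoid activations

    H = hidden(X_train)
    G = np.linalg.solve(H.T @ H + lam * np.eye(n_hidden), H.T)  # (H'H + lam I)^-1 H'
    beta = G @ y_train                                          # output weights
    hat_diag = np.einsum('ij,ji->i', H, G)                      # leverages diag(H G)
    loo_resid = (y_train - H @ beta) / (1.0 - hat_diag)         # fast LOO residuals (PRESS)
    return hidden(X_test) @ beta, np.sqrt(np.mean(loo_resid ** 2))

rng = np.random.default_rng(9)
X = rng.standard_normal((200, 5))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(200)
pred, loo_rmse = relm_fit_predict(X[:150], y[:150], X[150:])
print(pred.shape, round(loo_rmse, 3))
```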
