1.
Abstract. A vector time series model of the form A(L)y(t) + B(L)x(t) = ε(t) is known as a vector autoregressive model with exogenous variables (VARX model) and involves a regressand vector y(t) and a regressor vector x(t). This paper provides a method for the recursive fitting of subset VARX models. It suggests the use of ascending recursions in conjunction with an order selection criterion to choose an 'optimum' subset VARX model.
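To make the idea concrete, here is a minimal sketch in Python, assuming ordinary least squares and exhaustive enumeration over a small lag grid in place of the paper's ascending recursions; it scores each candidate subset VARX model with a BIC-type criterion built on the generalized residual variance. The function name, the simulated series, and the candidate lag sets are illustrative assumptions, not the authors' algorithm.

```python
import itertools
import numpy as np

def fit_subset_varx(y, x, y_lags, x_lags):
    """OLS fit of y(t) on the chosen lags of y and x; returns coefficients and BIC."""
    T = len(y)
    p = max(list(y_lags) + list(x_lags))
    rows = np.arange(p, T)
    # Regressor matrix built from the selected lags only (the "subset").
    Z = np.column_stack(
        [y[rows - k] for k in y_lags]
        + [x[rows - k] for k in x_lags]
        + [np.ones((len(rows), 1))]
    )
    Y = y[rows]
    coef, *_ = np.linalg.lstsq(Z, Y, rcond=None)
    resid = Y - Z @ coef
    sigma = resid.T @ resid / len(Y)              # generalized residual variance
    bic = len(Y) * np.log(np.linalg.det(sigma)) + coef.size * np.log(len(Y))
    return coef, bic

# Simulate a small system: y depends on its own lags 1 and 2 and on x(t).
rng = np.random.default_rng(0)
x = rng.standard_normal((200, 1))
y = rng.standard_normal((200, 2))
for t in range(2, 200):
    y[t] += 0.5 * y[t - 1] - 0.3 * y[t - 2] + 0.4 * x[t]

# Exhaustive search over small lag subsets stands in for the ascending recursion.
best = None
for r in (1, 2):
    for y_lags in itertools.combinations((1, 2, 3), r):
        for x_lags in ((0,), (0, 1)):
            _, bic = fit_subset_varx(y, x, y_lags, x_lags)
            if best is None or bic < best[2]:
                best = (y_lags, x_lags, bic)
print("selected y-lags, x-lags, BIC:", best)
```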
2.
Abstract. Conventional methods of determining the forgetting factor in autoregressive (AR) models are mostly based on arbitrary or subjective choices. In this paper, we present two procedures for selecting the forgetting factor in subset AR modelling. The first procedure uses the bootstrap to determine the value of a fixed forgetting factor. The second procedure starts from this base and applies time-recursive maximum likelihood estimation to a variable forgetting factor. In one illustration using real exchange rates, we demonstrate the effect of the forgetting factor in subset AR modelling on ex ante forecasting of non-stationary time series. In a second illustration, the two procedures are applied to time-update forecasts for a stock market index. Subset AR models without a forgetting factor serve as benchmarks for assessing ex ante forecasting performance, and the proposed procedures are shown to deliver consistently improved forecasts.
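As a rough illustration (not the authors' bootstrap or recursive maximum likelihood procedures), the sketch below shows how a fixed forgetting factor lambda down-weights older observations in a subset AR fit via weighted least squares, and compares a few values of lambda by one-step-ahead forecast error on a hold-out segment. The function name, the simulated drifting AR(2) series, and the lambda grid are assumptions made for the example.

```python
import numpy as np

def weighted_ar_forecast(y, lags, lam):
    """One-step forecast of y[T] from y[T-k], k in lags, with weights lam**age."""
    T = len(y)
    p = max(lags)
    X = np.column_stack([y[p - k:T - k] for k in lags] + [np.ones(T - p)])
    target = y[p:]
    w = lam ** np.arange(len(target) - 1, -1, -1)   # newest observation gets weight 1
    sw = np.sqrt(w)
    coef, *_ = np.linalg.lstsq(X * sw[:, None], target * sw, rcond=None)
    x_next = np.append(y[[T - k for k in lags]], 1.0)
    return x_next @ coef

# Simulated non-stationary series: an AR(2) around a slowly drifting level.
rng = np.random.default_rng(1)
e = rng.standard_normal(400)
y = np.zeros(400)
for t in range(2, 400):
    y[t] = 0.002 * t + 0.5 * y[t - 1] - 0.3 * y[t - 2] + e[t]

# Compare fixed forgetting factors by out-of-sample one-step forecast error.
lags = (1, 2)
for lam in (0.90, 0.95, 0.99, 1.00):
    errs = [y[t] - weighted_ar_forecast(y[:t], lags, lam) for t in range(300, 400)]
    print(f"lambda = {lam:.2f}   RMSE = {np.sqrt(np.mean(np.square(errs))):.3f}")
```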
3.
Abstract. In fitting a vector autoregressive process which may include lags up to and including lag K, we may wish to search for the subset vector autoregressive process of size k (where k is the number of lags with non-zero coefficient matrices, k = 1, 2, ..., K) which has the minimum generalized residual variance. This paper provides a recursive procedure, which is initialized by evaluating all 'forward' and 'backward' autoregressions in which k = 1. The recursion then allows one to develop successively all subsets of size k = 2, k = 3, up to k = K.
The optimum subset vector autoregression is found by employing the proposed recursive procedures in conjunction with model selection criteria. This approach is used on simulated data to assess its performance and to re-examine the annual trappings of the Canadian lynx investigated by Tong (1977).
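A rough sketch of the selection idea, assuming exhaustive enumeration in place of the forward/backward recursion: for each subset size k it finds the lag set with the smallest generalized residual variance (measured as the log-determinant of the residual covariance), and the size-k winners are then compared with a BIC-type criterion. The simulated series and the choice of criterion are illustrative assumptions.

```python
import itertools
import numpy as np

def gen_resid_var(y, lags):
    """Log-determinant of the residual covariance of a subset VAR fitted by OLS."""
    T = len(y)
    p = max(lags)
    Z = np.column_stack([y[p - k:T - k] for k in lags] + [np.ones((T - p, 1))])
    Y = y[p:]
    coef, *_ = np.linalg.lstsq(Z, Y, rcond=None)
    resid = Y - Z @ coef
    return np.linalg.slogdet(resid.T @ resid / len(Y))[1], coef.size

# Simulated bivariate series whose true non-zero lags are 1 and 4.
rng = np.random.default_rng(2)
y = rng.standard_normal((300, 2))
for t in range(4, 300):
    y[t] += 0.5 * y[t - 1] - 0.4 * y[t - 4]

K, T = 4, len(y)
# For each size k, keep the subset with minimum generalized residual variance ...
winners = {k: min(itertools.combinations(range(1, K + 1), k),
                  key=lambda s: gen_resid_var(y, s)[0])
           for k in range(1, K + 1)}
# ... then compare the size-k winners with a BIC-type criterion.
for k, lags in winners.items():
    logdet, n_par = gen_resid_var(y, lags)
    print(f"size {k}: lags {lags}   BIC = {T * logdet + n_par * np.log(T):.1f}")
```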
4.
In this paper a numerically robust lattice-ladder learning algorithm is presented that sequentially selects the best specification of a subset time series system using neural networks. This extends the reach of multilayered neural networks, so that a wider range of time series situations can be modelled effectively. Recognizing that many connections between nodes in adjacent layers are unnecessary and can be deleted, we introduce inhibitor arcs, reflecting inhibitive synapses. We also allow connections between nodes whose strengths vary over time by additionally introducing excitatory arcs, reflecting excitatory synapses. Resolving both time updating and order updating leads to optimal synaptic weight updating and allows optimal dynamic node creation and deletion within the extended neural network. The paper presents two applications that demonstrate the usefulness of the process.
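The toy sketch below is far simpler than the lattice-ladder algorithm described here; it only illustrates two of the ingredients mentioned: connections between layers that can be switched off (a binary mask standing in for inhibitor arcs) and remaining connection strengths that keep adapting over time through sequential gradient updates. Layer sizes, the pruning rule, and the learning rate are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(3)
T, n_in, n_hidden = 500, 4, 6
X = rng.standard_normal((T, n_in))
target = 0.8 * X[:, 0] - 0.5 * X[:, 2] + 0.1 * rng.standard_normal(T)

W1 = 0.1 * rng.standard_normal((n_in, n_hidden))   # input-to-hidden weights
w2 = 0.1 * rng.standard_normal(n_hidden)           # hidden-to-output weights
mask = np.ones_like(W1)                            # 1 = active arc, 0 = deleted arc
lr = 0.01

for t in range(T):                                 # sequential (time-recursive) updates
    h = np.tanh(X[t] @ (W1 * mask))
    err = target[t] - h @ w2
    grad_W1 = err * np.outer(X[t], w2 * (1.0 - h ** 2)) * mask
    w2 += lr * err * h
    W1 += lr * grad_W1
    if t == 250:                                   # crude stand-in for arc deletion:
        mask[np.abs(W1) < 0.05] = 0.0              # prune persistently small weights

print("active connections:", int(mask.sum()), "of", mask.size)
print("last squared error:", float(err ** 2))
```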
5.
The recursive algorithm for selecting the optimum multivariate real subset autoregressive (AR) model [1] is generalized to apply to multichannel complex subset ARs. It is initialized by fitting all "forward" and "backward" one-lag ARs. The method then allows one to develop successively all complex subset ARs of size k (the number of lags with nonzero coefficient matrices) from 1 to K. Finally, the best subsets of each size (those with the minimum generalized residual power for that size) are compared, using any one of three model selection criteria, to find the optimum multichannel complex subset AR.
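A minimal sketch, assuming plain complex least squares rather than the recursive forward/backward algorithm: it fits a two-channel complex subset AR for a few candidate lag sets and reports the generalized residual power as the log-determinant of the residual covariance. The simulated channels and lag sets are illustrative assumptions.

```python
import numpy as np

def complex_subset_ar(z, lags):
    """Fit z(t) on z(t-k), k in lags, by complex least squares."""
    T = len(z)
    p = max(lags)
    Z = np.column_stack([z[p - k:T - k] for k in lags])
    Y = z[p:]
    coef, *_ = np.linalg.lstsq(Z, Y, rcond=None)
    resid = Y - Z @ coef
    cov = resid.conj().T @ resid / len(Y)
    return coef, np.linalg.slogdet(cov)[1]         # generalized residual power (log-det)

# Two complex channels with non-zero coefficients at lags 1 and 3.
rng = np.random.default_rng(4)
z = rng.standard_normal((300, 2)) + 1j * rng.standard_normal((300, 2))
for t in range(3, 300):
    z[t] += (0.5 + 0.2j) * z[t - 1] - (0.3 - 0.1j) * z[t - 3]

for lags in [(1,), (1, 2), (1, 3), (1, 2, 3)]:
    _, logdet = complex_subset_ar(z, lags)
    print(f"lags {lags}: log-det residual power = {logdet:.3f}")
```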