Similar Literature
20 similar documents retrieved.
1.
For the problem of non-stationary time series prediction, an online sequential extreme learning machine algorithm with generalized regularization and a forgetting mechanism is proposed. The algorithm performs online learning by incrementally learning new samples, enhances its ability to dynamically track non-stationary systems by forgetting old, invalid samples, and maintains continuous regularization by introducing a generalized $l_2$ penalty, thereby ensuring lasting stability. Simulation examples show that the proposed algorithm achieves better stability and smaller prediction errors than comparable algorithms, and is well suited to online modeling and prediction of non-stationary time series with dynamically changing characteristics.
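The sliding-window idea behind such algorithms can be sketched as follows. This is a minimal numpy illustration, not the paper's exact algorithm: a fixed random hidden layer plays the role of the extreme learning machine, old samples are dropped from a window to implement forgetting, and an l2 penalty regularizes the output-weight solve; all names and settings are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# ELM-style model: random fixed hidden layer, only output weights trained.
n_hidden, lam, window, order = 30, 1e-3, 60, 2
W = rng.normal(size=(order, n_hidden))   # random input weights (never trained)
b = rng.uniform(-1, 1, size=n_hidden)    # random hidden biases (never trained)

def hidden(X):
    return np.tanh(X @ W + b)            # hidden-layer activations

# Non-stationary series: a sine whose period slowly drifts from 40 to 20
t = np.arange(500)
y = np.sin(2 * np.pi * t / (40 - 0.04 * t))

X_buf, y_buf, errs = [], [], []
beta = np.zeros(n_hidden)
for k in range(order, len(y)):
    x_k = y[k - order:k]                         # lagged inputs
    if len(y_buf) >= 2 * order:                  # predict once warmed up
        pred = float(hidden(x_k[None, :]) @ beta)
        errs.append(pred - y[k])
    if len(X_buf) >= window:                     # forget the oldest sample
        X_buf.pop(0); y_buf.pop(0)
    X_buf.append(x_k); y_buf.append(y[k])
    H = hidden(np.array(X_buf))
    # l2-regularized least squares for the output weights over the window
    beta = np.linalg.solve(H.T @ H + lam * np.eye(n_hidden),
                           H.T @ np.array(y_buf))

mse = float(np.mean(np.square(errs)))
```

Because the window discards stale samples, the output weights keep tracking the drifting frequency instead of averaging over the whole history.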

2.
The integration of numerous monitoring points poses a significant challenge to the efficient modeling of dam displacement behavior, and multi-point synchronous prediction is an effective solution. Traditional approaches, however, usually construct a site-specific data-driven model for each monitoring point individually; these focus on single-target regression and discard the underlying spatial correlation among displacement monitoring points. This study therefore proposes a multi-input multi-output (MIMO) machine learning (ML) paradigm based on the support vector machine (SVM) for synchronous modeling and prediction of multi-point displacements across dam blocks. A novel multi-output data-driven model, termed the multi-target SVM (MSVM), is formulated through a deep hybridization of the classical SVM architecture and multi-target regression. During the initialization of the MSVM, the intercorrelation of the multiple target variables is fully exploited by decomposing and regulating the weight vectors. The MSVM is designed to capture the complex MIMO mapping from influential factors to multi-block displacements while accounting for the correlation between multi-block displacement outputs. Additionally, to avoid unreliable predictions caused by empirical parameter selection, an efficient optimization strategy based on the parallel multi-population Jaya (PMP-Jaya) algorithm, which contains no algorithm-specific parameters and is easy to implement, is used to adaptively tune the hyperparameters of the MSVM. The effectiveness of the proposed model is verified using monitoring data collected from a real concrete gravity dam, and its performance is compared with conventional single-target SVM (SSVM)-based models and state-of-the-art ML-based models. The results indicate that the proposed MSVM is much more promising than the SSVM-based models because only one prediction model is required rather than multiple site-specific SSVM-based models for different dam blocks. Moreover, the MSVM achieves better performance than the other ML-based models in most cases, providing an innovative modeling tool for dam multi-block behavior monitoring.
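The core multi-target idea, one model with shared inputs and several correlated outputs, can be illustrated with a short numpy sketch in which a multi-output ridge regression stands in for the paper's MSVM; the data, dimensions, and penalty here are synthetic assumptions, not the authors' setup.

```python
import numpy as np

rng = np.random.default_rng(1)

# Multi-input multi-output regression: shared factors drive the
# displacements of several dam blocks, so one model serves all blocks.
n, p, blocks = 200, 4, 3
X = rng.normal(size=(n, p))                    # shared influential factors
B_true = rng.normal(size=(p, blocks))          # targets share the same inputs
Y = X @ B_true + 0.01 * rng.normal(size=(n, blocks))

# One regularized solve yields coefficients for all targets at once,
# instead of fitting a separate single-target model per block.
lam = 1e-3
B_hat = np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ Y)

Y_pred = X @ B_hat
rmse_per_block = np.sqrt(((Y - Y_pred) ** 2).mean(axis=0))
```

The single shared solve is what makes synchronous multi-block prediction cheaper than maintaining one model per monitoring point.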

3.
Blast furnace gas (BFG) is an important secondary energy source in iron and steel enterprises, and accurate real-time prediction of its generation and consumption plays an important role in balancing and scheduling the BFG system. However, frequently changing operating conditions and large fluctuations in the generation and consumption data make accurate prediction challenging. Based on an in-depth analysis of the data characteristics, an online prediction algorithm using an extreme learning machine with an adaptive forgetting factor (AF-ELM) is proposed. Building on the online sequential extreme learning machine, a forgetting factor is introduced to gradually discard old samples, and a prediction-error feedback mechanism adjusts the forgetting factor adaptively, improving the method's ability to adapt to dynamic changes in operating conditions and increasing prediction accuracy. The algorithm was applied to online prediction of BFG generation and consumption in an iron and steel enterprise; experimental results show that, compared with the online sequential extreme learning machine, the proposed method maintains higher prediction accuracy under changing operating conditions and is better suited to online prediction of BFG generation and consumption.

4.
The construction of a mathematical model to predict dam deformation can provide an important basis for judging its operating condition. Owing to several time-varying factors, such as water level, temperature and aging, dam prototype monitoring data series show nonlinear and non-stationary features, which increase the difficulty of dam deformation prediction and analysis. For this reason, a novel distributed deformation prediction model (DDPM), which combines transformation ideology with structured methodology, is proposed to improve the reliability of deformation prediction. DDPM starts by considering three constituent elements of the dam deformation series using time series decomposition, and a multi-model fusion strategy is adopted: the trend, periodic and remainder components are separately predicted by constructing optimal fitting, weight window and remainder generation sub-models. The three predicted components are then aggregated into the final predicted output based on an underlying data model. The accuracy and validity of DDPM are verified and evaluated on a concrete dam in China, comparing its prediction performance with well-established models. The simulation results indicate that DDPM can not only extract more potential data features to obtain a good deformation prediction effect but also reduce the complexity of mathematical modeling. Furthermore, two additional functions of DDPM, missing value handling and anomaly detection, are discussed, ultimately realizing the integrated configuration of deformation prediction and data cleaning. The new model provides an alternative method for the prediction and analysis of dam deformation and other structural behavior.

5.
Chen  Siyu  Gu  Chongshi  Lin  Chaoning  Zhang  Kang  Zhu  Yantao 《Engineering with Computers》2021,37(3):1943-1959

The observation data of dam displacement can intuitively reflect the dam's actual service behavior. Therefore, establishing a precise data-driven model to realize accurate and reliable safety monitoring of dam deformation is necessary. This study proposes a novel probabilistic prediction approach for concrete dam displacement based on an optimized relevance vector machine (ORVM). A practical optimization framework for parameter estimation using the parallel Jaya algorithm (PJA) is developed, and various single-kernel and multi-kernel functions of the relevance vector machine (RVM) are tested to obtain the optimal selection. The proposed model is tested on radial displacement measurements of a concrete arch dam to mine the effects of hydrostatic, seasonal and irreversible time components on dam deformation. Four algorithms, including support vector regression (SVR), the radial basis function neural network (RBF-NN), the extreme learning machine (ELM) and HST-based multiple linear regression (HST-MLR), are used for comparison with the ORVM model. The simulation results demonstrate that the proposed multi-kernel ORVM model performs best for predicting displacement beyond the range of the measurement dataset used. Meanwhile, the ORVM model has the advantage of probabilistic output and can provide a reasonable confidence interval (CI) for dam safety monitoring. This study lays the foundation for the application of RVM in the field of dam health monitoring.


6.
An improved neuro-wavelet modeling (NWM) methodology is presented that aims to improve the prediction precision of the time-varying behavior of engineering structures. The proposed methodology is distinguished from the existing NWM methodology by its capability to construct optimally uncoupled dynamic subsystems in light of the redundant Haar wavelet transform (RHWT) and to optimize the neural network. In particular, two techniques, imitating the wavelet packet transform of the RHWT and reconstructing the major crests of the power spectrum of the analyzed data, are developed with the aim of constructing optimally uncoupled dynamic subsystems from time-varying data. The resulting uncoupled dynamic subsystems make the underlying dynamic law of time-varying behavior more tractable than the raw scale subwaves arising from the RHWT, and they provide a platform for multiscale modeling via individual modeling at the subsystem level. Furthermore, on each uncoupled dynamic subsystem, the optimal brain surgeon technique, in conjunction with a new dynamic mechanism refreshing, is employed to optimize the neural network, and the recombination of the modeling outcomes on each subsystem constitutes the overall model of time-varying behavior. The improved NWM methodology offers a feasible framework for multiscale modeling thanks to its flexibility, adaptability and rationality, and it is particularly useful for predicting the time-varying behavior of engineering structures. As an illustrative example, the improved NWM methodology is applied to model and forecast dam deformation, and the results show that it possesses clear advantages over existing multiscale and single-scale modeling techniques. The improved NWM methodology is promising and valuable for the safety monitoring and extreme event warning of engineering structures.

7.
A reliable online predictor is very useful to a wide array of industries for forecasting the behavior of time-varying dynamic systems. In this paper, an evolving fuzzy system (EFS) is developed for system state forecasting. An evolving clustering algorithm is proposed for cluster generation; clusters are established and modified based on constraint criteria of mapping consistency and compatibility measurement. A novel recursive Levenberg–Marquardt (R-LM) method is proposed for online training of the nonlinear EFS parameters. The viability of the developed EFS predictor is evaluated using both simulations on benchmark data and real-time tests in machinery condition monitoring and material property testing. Test results show that the developed EFS predictor is an effective and accurate forecasting tool: it can capture a system's dynamic behavior quickly and track its characteristics accurately. The proposed clustering algorithm is an effective structure identification method, and the recursive training technique is computationally efficient and can effectively improve reasoning convergence.

8.
Accurate simulation of the temperature effect is a major challenge for computational (intelligent) prediction models used for monitoring the health of high concrete dams, especially in regions with long freezing periods, distinct seasons, and occasional extreme weather. A Hydrostatic-Temperature long-term-Time (HTLT) model is proposed for better temperature effect simulation using long-term measured environmental temperatures, and the influence of temperature data sets of different time lengths on dam displacement prediction accuracy is explored with the traditional Hydrostatic-Season-Time model as a control. The Bald Eagle Search algorithm is coupled with the relevance vector machine to form a novel hybrid model (BES-RVM) for predicting the displacement response of concrete gravity dams, and the nonlinear mapping capabilities of different kernel functions are compared. Further optimized by the Successive Projections Algorithm (SPA) for feature selection, sensitive features are selected from the proposed HTLT variable sets to improve the prediction model's accuracy. The model is tested on a real concrete dam, with results showing that the BES-RVM model gives excellent prediction performance. Using the HTLT variable sets significantly reduces prediction errors, and the best-performing result comes from the two-year temperature data set. The SPA-optimized BES-RVM model greatly reduces the dimension and collinearity of the input variables and further improves prediction performance.

9.
Dam displacement is an important indicator of overall dam health. Numerical prediction of such displacement based on real-world monitoring data is common practice in dam safety assessment. However, existing methods are mainly based on statistical models or shallow machine learning models. Although they can capture the timing of the dam displacement sequence, they struggle to characterize the complex coupling relationship between displacement and multiple influencing factors (e.g., water level, temperature, and time). In addition, the input factors of most dam displacement prediction models are constructed manually based on modelers' personal experience, which leads to a loss of the valuable information, and thus prediction power, provided by the full set of raw monitoring data. To address these problems, this paper proposes a novel dual-stage deep learning approach based on a one-dimensional residual network and long short-term memory (LSTM) units, referred to herein as the DRLSTM model. In the first stage, the raw monitoring sequence is processed and spliced with convolution to form a combined sequence. After the timing information is extracted, the convolution direction is switched to learn the complex relationship between displacement and its influencing factors, and an LSTM extracts this relationship to obtain the Stage I prediction. The second stage takes the difference between the actual measurement and the Stage I prediction as input, and an LSTM extracts the stochastic features of the monitoring system to obtain the Stage II prediction. The sum of the two stage predictions forms the final prediction. The DRLSTM model requires only raw monitoring data of water level and temperature to accurately predict displacement. In a real-world comparative study against four commonly used shallow learning models and three deep learning models, the root mean square error and mean absolute error of the proposed method are the smallest, at 0.198 mm and 0.149 mm respectively, while the correlation coefficient is the largest at 0.962. It is concluded that the DRLSTM model performs well in evaluating dam health status.

10.
A certain degree of deformation is natural as a dam operates and evolves. Owing to the impact of the internal and external environment, dam deformation is highly nonlinear by nature. For dam safety, it is of great significance to analyze deformation monitoring data in a timely fashion and to predict deformation reliably. A comprehensive review of existing deformation prediction models reveals two issues that deserve further attention: (1) each environmental influencing factor contributes differently to deformation, and (2) deformation lags behind environmental factors (e.g., water level and air temperature). In response, this study presents a combination deformation prediction model considering both quantitative evaluation of influencing factors and hysteresis correction to further improve estimation accuracy. The complex relationship in deformation prediction is effectively captured through support vector machine (SVM) modeling, and a modified fruit fly optimization algorithm (MFOA) is presented for SVM hyperparameter optimization. A synthetic evaluation method and a hysteresis quantification algorithm are also introduced to further enhance the MFOA-SVM-based model with respect to contribution quantification and phase correction, respectively. The accuracy and validity of the proposed model are evaluated on a concrete dam case, where its performance is compared with existing models. The simulation results indicate that the proposed nonlinear MFOA-SVM model considering both quantitative evaluation and hysteresis correction, abbreviated SEV-MFOA-SVM, is more accurate and robust than conventional models. The model also provides an alternative method for predicting and analyzing the deformation and evolution behavior of dams and other similar hydraulic structures.

11.
To improve the prediction accuracy and timeliness of dam safety monitoring models, PSO-RVM is proposed, in which particle swarm optimization is used to tune the key kernel parameters of the relevance vector machine. The model was validated on measured alignment-line displacement monitoring data from a real project, and the accuracy, stability and credibility of its predictions were evaluated using the root mean square error, normalized mean square error and mean absolute percentage error. The results show that the generalization performance of PSO-RVM is clearly superior to that of the traditional RVM, and that its application to dam safety monitoring modeling is feasible.

12.
A polynomial-extended recursive least-squares method (PRLS) for time-varying parameters

13.
To address the chaotic nature and massive volume of network traffic, and to remedy the shortcomings of existing network traffic prediction models, a chaotic prediction model for massive network traffic data is proposed to obtain better prediction results. The model first applies wavelet analysis to the original network traffic time series for multiscale processing, yielding traffic components with different characteristics. The chaotic properties of each component are then analyzed, each component is reconstructed separately, and an extreme learning machine is used for modeling and prediction. Finally, wavelet analysis is used to superimpose the component predictions to obtain the predicted values of the original traffic data, and simulation experiments on network traffic prediction are carried out. The results show that the model's prediction accuracy exceeds 90%, far surpassing that of other traffic prediction models, and that its predictions are more stable, making it an effective tool for network traffic modeling and prediction.
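The decompose-model-recombine scheme described above can be sketched roughly as follows; in this illustrative numpy stand-in, a moving-average split replaces the wavelet analysis and a least-squares autoregressive model replaces the extreme learning machine, so every name and constant here is an assumption rather than the paper's implementation.

```python
import numpy as np

# Synthetic "traffic" series: a slow cycle plus a fast cycle
t = np.arange(300)
series = np.sin(2 * np.pi * t / 50) + 0.3 * np.sin(2 * np.pi * t / 7)

# Step 1: decompose into a smooth (low-frequency) and a detail
# (high-frequency) component, mimicking a two-scale wavelet split.
k = 11
smooth = np.convolve(series, np.ones(k) / k, mode="same")
detail = series - smooth

def ar_fit_predict(x, order=5):
    # Least-squares AR(order) model; returns one-step predictions and targets
    X = np.column_stack([x[i:len(x) - order + i] for i in range(order)])
    y = x[order:]
    coef = np.linalg.lstsq(X, y, rcond=None)[0]
    return X @ coef, y

# Step 2: model each component with its own predictor.
pred_s, true_s = ar_fit_predict(smooth)
pred_d, true_d = ar_fit_predict(detail)

# Step 3: recombine the component predictions into the final forecast.
pred = pred_s + pred_d
truth = true_s + true_d
rmse = float(np.sqrt(np.mean((pred - truth) ** 2)))
```

Modeling each scale separately lets a simple predictor handle a component whose dynamics would be harder to capture in the raw mixed series.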

14.

Horizontal displacement of hydropower dams is a typical nonlinear time-varying behavior that is difficult to forecast with high accuracy. This paper proposes a novel hybrid artificial intelligence approach, namely the swarm optimized neural fuzzy inference system (SONFIS), for modeling and forecasting the horizontal displacement of hydropower dams. In the proposed model, a neural fuzzy inference system is used to create a regression model, whereas particle swarm optimization is employed to search for the best parameters of the model. In this work, time series monitoring data (horizontal displacement, air temperature, upstream reservoir water level, and dam aging) measured over 11 years (1999–2010) at the Hoa Binh hydropower dam were selected as a case study. The data were split in a ratio of 70:30 for developing and validating the hybrid model, and performance was assessed using RMSE, MAE, and R². Experimental results show that the proposed SONFIS model performed well on both the training and validation datasets. The results were then compared with those derived from current state-of-the-art benchmark methods using the same data, including support vector regression, multilayer perceptron neural networks, Gaussian processes, and random forests; results from a differential evolution-based neural fuzzy model are also included. Since the SONFIS model outperforms these benchmarks on the monitoring data at hand, the proposed model is a promising tool for modeling the horizontal displacement of hydropower dams.
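The particle swarm optimization step used for parameter search can be sketched generically; here a simple 2-D quadratic stands in for the neuro-fuzzy model's training error, and all constants (swarm size, inertia, acceleration coefficients) are illustrative defaults rather than the paper's settings.

```python
import numpy as np

rng = np.random.default_rng(3)

def objective(p):
    # Stand-in "training error" with its minimum at (1.5, -0.5)
    return float(np.sum((p - np.array([1.5, -0.5])) ** 2))

# Standard PSO: each particle tracks its personal best, and the swarm
# shares a global best that pulls all particles toward good regions.
n_particles, n_iter, w, c1, c2 = 20, 100, 0.7, 1.5, 1.5
pos = rng.uniform(-5, 5, size=(n_particles, 2))
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_val = np.array([objective(p) for p in pos])
gbest = pbest[pbest_val.argmin()].copy()

for _ in range(n_iter):
    r1 = rng.random((n_particles, 2))
    r2 = rng.random((n_particles, 2))
    # Inertia + cognitive pull (personal best) + social pull (global best)
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = pos + vel
    vals = np.array([objective(p) for p in pos])
    improved = vals < pbest_val
    pbest[improved] = pos[improved]
    pbest_val[improved] = vals[improved]
    gbest = pbest[pbest_val.argmin()].copy()
```

In the paper's setting, `objective` would evaluate the neural fuzzy inference system's fitting error for a candidate parameter vector; the swarm mechanics are unchanged.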


15.
Dam displacements can effectively reflect a dam's operational status, so establishing a reliable displacement prediction model is important for dam health monitoring. The majority of existing data-driven models, however, focus on static regression relationships, which cannot capture long-term temporal dependencies or adaptively select the most relevant influencing factors to perform predictions. Moreover, emerging modeling tools such as machine learning (ML) and deep learning (DL) are mostly black-box models, which makes their physical interpretation challenging and greatly limits their practical engineering applications. To address these issues, this paper proposes an interpretable mixed attention mechanism long short-term memory (MAM-LSTM) model based on an encoder-decoder architecture, formulated in two stages. In the encoder stage, a factor attention mechanism is developed to adaptively select the highly influential factors at each time step by referring to the previous hidden state. In the decoder stage, a temporal attention mechanism is introduced to properly extract the key time segments by identifying the relevant hidden states across all time steps. For interpretation purposes, emphasis is placed on the quantification and visualization of the factor and temporal attention weights. Finally, the effectiveness of the proposed model is verified using monitoring data collected from a real-world dam, where its accuracy is compared with a classical statistical model, conventional ML models, and homogeneous DL models. The comparison demonstrates that the MAM-LSTM model outperforms the other models in most cases. Furthermore, the interpretation of the global attention weights confirms the physical rationality of the attention-based model. This work addresses the research gap in interpretable artificial intelligence for dam displacement prediction and delivers a model with both high accuracy and interpretability.

16.
To address the problems that the online sequential extreme learning machine (OS-ELM) suffers from unstable hidden-layer output, is prone to singular matrices, and ignores the timeliness of training samples during sequential updates, a regularized kernel-based algorithm with an adaptive forgetting factor (FFOS-RKELM) is proposed. The algorithm replaces the hidden layer with a kernel function, which produces stable output. In the initialization stage, regularization is introduced to improve the model's generalization ability by constructing a non-singular matrix; in the sequential update stage, the forgetting factor is updated automatically from newly arrived data. Applied to chaotic time series prediction and inlet NOx time series prediction, FFOS-RKELM effectively improves prediction accuracy and generalization compared with the OS-ELM, FFOS-RELM and OS-RKELM algorithms.

17.
Since most real-world processes exhibit both nonlinear and time-varying characteristics, there is a need for accurate and efficient models that can adapt in nonstationary environments. For adaptive control purposes, it is also vital that an adaptive model have a fixed, small model size. In this paper, we propose an adaptive tunable gradient radial basis function (GRBF) network for online modeling of nonlinear dynamic processes, which meets these practical requirements. Specifically, a compact GRBF model is constructed by the orthogonal least squares algorithm in training, which is capable of modeling variations of local mean and trend in the data well. During online operation, the adaptive GRBF model tracks the time-varying process's dynamics by replacing the worst-performing node with a new node that encodes the current data. By exploiting the local predictor property of the GRBF node, the new node optimization can be done extremely efficiently. The proposed approach, combining the advantages of the GRBF network structure and a fast tunable node mechanism, is capable of tracking time-varying nonlinear dynamics accurately and effectively. Extensive simulation results demonstrate that the proposed fast tunable GRBF network significantly outperforms existing state-of-the-art methods in terms of both adaptive modeling accuracy and online computational complexity.

18.
In recent years, considerable progress has been made in modeling chaotic time series with neural networks. Most of this work concentrates on the development of architectures and learning paradigms that minimize the prediction error. A more detailed analysis of modeling chaotic systems involves the calculation of the dynamical invariants that characterize a chaotic attractor. The features of the chaotic attractor are captured during learning only if the neural network learns the dynamical invariants. The two most important of these are the largest Lyapunov exponent, which contains information on how far into the future predictions are possible, and the correlation (fractal) dimension, which indicates how complex the dynamical system is. An additional useful quantity is the power spectrum of a time series, which also characterizes the dynamics of the system, and in a more thorough form than the prediction error does. In this paper, we introduce recurrent networks that are able to learn chaotic maps and investigate whether the neural models also capture the dynamical invariants of chaotic time series. We show that the dynamical invariants can already be learned by feedforward neural networks, but that recurrent learning improves the dynamical modeling of the time series. We discover a novel type of overtraining that corresponds to the forgetting of the largest Lyapunov exponent during learning, and we call this phenomenon dynamical overtraining. Furthermore, we introduce a penalty term that involves a dynamical invariant of the network and avoids dynamical overtraining. As examples we use the Hénon map, the logistic map, and a real-world chaotic series corresponding to the concentration of one of the chemicals as a function of time in experiments on the Belousov-Zhabotinskii reaction in a well-stirred flow reactor.
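The Hénon map mentioned above makes a compact worked example: its largest Lyapunov exponent can be estimated directly from the map's Jacobian along an orbit using the standard tangent-vector method (independent of the paper's neural approach; the literature value for a = 1.4, b = 0.3 is roughly 0.42).

```python
import numpy as np

# Hénon map: (x, y) -> (1 - a*x^2 + y, b*x)
a, b = 1.4, 0.3
x, y = 0.1, 0.1
for _ in range(1000):                      # discard the transient
    x, y = 1 - a * x * x + y, b * x

# Evolve a tangent vector with the Jacobian and average the log growth.
n = 20000
v = np.array([1.0, 0.0])                   # tangent vector
log_sum = 0.0
for _ in range(n):
    J = np.array([[-2 * a * x, 1.0],       # Jacobian of the map at (x, y)
                  [b, 0.0]])
    v = J @ v
    norm = float(np.linalg.norm(v))
    log_sum += np.log(norm)
    v /= norm                              # renormalize to prevent overflow
    x, y = 1 - a * x * x + y, b * x        # advance the orbit

lyap = log_sum / n                         # largest Lyapunov exponent estimate
```

A positive value confirms sensitive dependence on initial conditions, which is exactly the invariant the paper argues a well-trained network must reproduce.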

19.
In this paper, the complex relationship between environmental variables and the static response of a dam is expressed using a composition of functions, including a nonlinear mapping and a linear mapping. The environmental effect and noise disturbance are successfully separated from the monitoring data by analyzing the covariance matrix of the multivariate monitoring data of the dam response. Based on this separation process, two multivariate dam safety monitoring models are proposed. In model I, the upper control limits (UCLs) are calculated by performing kernel density estimation (KDE) on the squared prediction error (SPE) of the offline data. For new monitoring data, abnormality is judged by comparing the newly calculated SPE with the UCL; when abnormal data are detected, the SPE contribution plots and the SPE control chart of the new monitoring data are jointly used to qualitatively identify the reason for the abnormality. Model II is a dam monitoring model based on latent variables that can be calculated from the separation of the environmental and noise effects. A least squares support vector machine (LS-SVM) model is adopted to simulate the nonlinear mapping from environmental variables to latent variables. The latent variables are predicted, and the prediction interval is calculated to provide a control range for future monitoring data. The two models are applied to analyze the monitoring data of the horizontal displacement and hydraulic uplift pressure of a roller-compacted concrete (RCC) gravity dam, and the analysis results demonstrate their good performance.
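The SPE-with-control-limit idea of model I can be sketched in a few lines; in this numpy stand-in, a principal-component projection separates the common environmental effect and an empirical percentile replaces the paper's KDE-based limit, so the data and thresholds are purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic multivariate response: one common environmental driver plus noise
n, m = 500, 6
env = rng.normal(size=(n, 1))                  # shared environmental effect
X = env @ rng.normal(size=(1, m)) + 0.1 * rng.normal(size=(n, m))
X = X - X.mean(axis=0)                         # center the offline data

# Retain the leading principal direction; the residual subspace holds
# what the environmental model does not explain.
U, s, Vt = np.linalg.svd(X, full_matrices=False)
P = Vt[:1].T                                   # (m, 1) loading matrix
resid = X - X @ P @ P.T
spe = np.sum(resid ** 2, axis=1)               # offline SPE values

# Empirical upper control limit from the offline SPE distribution
ucl = float(np.quantile(spe, 0.99))

# A new observation far from the normal pattern should exceed the limit
x_new = 3.0 * rng.normal(size=(1, m))
r_new = x_new - x_new @ P @ P.T
spe_new = float(np.sum(r_new ** 2))
is_abnormal = spe_new > ucl
```

In practice the new observation would first be centered with the offline mean, and KDE would give a smoother limit than the raw percentile used here.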

20.
In this paper, a recursive subspace identification method is proposed to identify linear time-invariant systems subject to load disturbances with relatively slow dynamics. Using the linear superposition principle, the load disturbance response is decomposed from the deterministic-stochastic system response in the form of a time-varying parameter. To ensure unbiased estimation of the deterministic system matrices, a recursive least-squares (RLS) identification algorithm is established with a fixed forgetting factor, while another RLS algorithm with an adaptive forgetting factor is constructed based on the output prediction error to quickly track the time-varying parameter of the load disturbance response. By introducing a deadbeat observer to represent the deterministic system response, two extended observer Markov parameter matrices are constructed for recursive estimation; the deterministic matrices are then retrieved from the identified system Markov parameter matrices. The convergence of the proposed method is analyzed and proved, and two illustrative examples demonstrate the effectiveness and merit of the proposed identification method.
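The recursive least-squares update with a forgetting factor that this kind of tracking scheme relies on can be sketched as follows; this is a single fixed-forgetting-factor estimator tracking a slowly drifting parameter, a simplified stand-in rather than the authors' coupled two-estimator design, and all numbers are illustrative.

```python
import numpy as np

rng = np.random.default_rng(5)

# RLS with forgetting factor lam < 1: older samples are exponentially
# discounted, so the estimate can follow a time-varying parameter.
lam = 0.95
theta = np.zeros(2)            # parameter estimate
P = np.eye(2) * 1000.0         # inverse-correlation matrix (large = vague prior)

true_theta = np.array([1.0, -0.5])
for k in range(400):
    true_theta = true_theta + 0.002        # slow parameter drift
    phi = rng.normal(size=2)               # regressor vector
    y = phi @ true_theta + 0.01 * rng.normal()

    K = P @ phi / (lam + phi @ P @ phi)    # gain vector
    theta = theta + K * (y - phi @ theta)  # correct with prediction error
    P = (P - np.outer(K, phi) @ P) / lam   # discount old information

final_err = float(np.linalg.norm(theta - true_theta))
```

A smaller forgetting factor shortens the effective memory (roughly 1/(1 - lam) samples), trading noise sensitivity for faster tracking; the paper's adaptive variant adjusts this trade-off online from the prediction error.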
