A New Training Approach for Robust Recurrent Neural-Network Modeling of Nonlinear Circuits
Abstract: A new approach to developing recurrent neural-network models of nonlinear circuits is presented, overcoming the conventional limitation that training information depends on the shapes of the circuit waveforms and/or the circuit terminations. Using only a finite set of waveforms for model training, our technique enables the trained model to respond accurately to test waveforms of unknown shapes. To relate the information in the training waveforms to that in the test waveforms, we exploit an internal space of the recurrent neural network, called the internal input-neuron space. We formulate a new circuit block, combining a generic load and a generic excitation, to terminate the circuit. By sweeping the coefficients of the proposed circuit block, we obtain a rich combination of training waveforms that effectively covers the region of interest in the internal input-neuron space. We also present a new method to reduce the amount of training data while preserving the information needed for modeling. The proposed method is demonstrated through examples of recurrent neural-network modeling of high-speed drivers and an RF amplifier. It is confirmed that, for different terminations and test waveforms, the model trained with the proposed technique achieves better accuracy and robustness than models trained with existing methods.
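As a rough illustration of the coefficient-sweep idea described in the abstract, the sketch below terminates a toy nonlinear driver with a generic RC load and a generic pulse excitation, then sweeps the block's coefficients (excitation amplitude, rise time, bit pattern, load resistance and capacitance) to collect a varied set of training waveform pairs. The driver equation, the RC termination, the excitation parameterization, and all numerical values are illustrative assumptions, not the formulation used in the paper.

```python
import numpy as np

# Hypothetical nonlinear driver: output current is a saturating function of
# the port voltage and the excitation (purely illustrative equation).
def driver_current(v_port, v_exc):
    return np.tanh(2.0 * (v_exc - v_port))

def simulate(v_exc, r_load, c_load, dt=1e-2, v0=0.0):
    """Forward-Euler simulation of the driver terminated by a generic RC load.
    Returns the port-voltage waveform (model input) and the driver-current
    waveform (model output)."""
    v = np.empty_like(v_exc)
    i = np.empty_like(v_exc)
    v_prev = v0
    for k, ve in enumerate(v_exc):
        ik = driver_current(v_prev, ve)
        # Generic RC termination: C dv/dt = i - v / R
        v_prev = v_prev + dt * (ik - v_prev / r_load) / c_load
        v[k] = v_prev
        i[k] = ik
    return v, i

def excitation(amplitude, rise_steps, pattern, steps_per_bit=50):
    """Generic excitation: a pulse train whose amplitude, rise time, and bit
    pattern are controlled by swept coefficients (illustrative parameterization)."""
    bits = np.repeat(pattern, steps_per_bit).astype(float)
    kernel = np.ones(rise_steps) / rise_steps  # crude rise-time filter
    return amplitude * np.convolve(bits, kernel, mode="same")

# Sweep the coefficients of the combined load/excitation block to collect a
# rich set of training waveforms intended to cover the region of interest.
rng = np.random.default_rng(0)
training_set = []
for amplitude in (0.5, 1.0, 1.5):
    for rise_steps in (5, 20):
        for r_load in (25.0, 50.0, 100.0):
            for c_load in (0.5, 2.0):
                pattern = rng.integers(0, 2, size=16)
                v_exc = excitation(amplitude, rise_steps, pattern)
                v, i = simulate(v_exc, r_load, c_load)
                training_set.append((v, i))

print(f"collected {len(training_set)} training waveform pairs")
```

The collected (voltage, current) waveform pairs would then serve as training data for the recurrent neural-network model; the data-reduction step and the internal input-neuron-space coverage check described in the paper are not reproduced here.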