Fund:Supported by the National Natural Science Foundation of China (16K10439).

Study on Influence of Different Optimizers on Performance of LR under Gaussian Noise
XU Long-fei,YU Jin-ming.Study on Influence of Different Optimizers on Performance of LR under Gaussian Noise[J].Computer Technology and Development,2020(3):7-12.
Authors:XU Long-fei  YU Jin-ming
Affiliation:(School of Information Science and Technology,Donghua University,Shanghai 201620,China)
Abstract:At present, machine learning algorithms are applied in many fields of society, such as data mining, personalized information recommendation and natural language processing, and play an important role in people's work and life. The linear regression (LR) model is one of the common machine learning algorithms; it is simple to use, easy to understand and easy to implement, but its performance is greatly affected when noise is added to the data. Optimization methods for LR include batch gradient descent (BGD), stochastic gradient descent (SGD), mini-batch gradient descent (MBGD) and the Adam optimizer, and the performance of the final trained model depends on the optimization method, the learning rate, noise and many other factors. To study how to select an optimizer that improves the performance of the LR model under Gaussian noise, we used Python and the TensorFlow framework to compare the loss functions and computation times of several optimizers after Gaussian noise was added. The experimental results show that, with Gaussian noise added, the Adam optimizer achieves a lower loss and shorter computation time than the other optimizers.
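The comparison described in the abstract can be illustrated with a minimal sketch: fit a linear model to data corrupted by Gaussian noise, once with plain batch gradient descent and once with a hand-rolled Adam update, then compare the final mean squared error. This is not the paper's actual TensorFlow code; the synthetic data (y = 2x + 1), noise level, learning rates and step counts are assumptions chosen for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic data: y = 2x + 1 corrupted by Gaussian noise (assumed setup)
X = rng.uniform(-1.0, 1.0, 200)
y = 2.0 * X + 1.0 + rng.normal(0.0, 0.3, 200)

def mse(w, b):
    # Mean squared error of the linear model w*x + b
    return np.mean((w * X + b - y) ** 2)

def grads(w, b):
    # Gradients of the MSE with respect to w and b
    err = w * X + b - y
    return 2.0 * np.mean(err * X), 2.0 * np.mean(err)

def fit_bgd(lr=0.1, steps=500):
    # Batch gradient descent: full-dataset gradient, fixed learning rate
    w = b = 0.0
    for _ in range(steps):
        gw, gb = grads(w, b)
        w -= lr * gw
        b -= lr * gb
    return w, b

def fit_adam(lr=0.05, steps=1000, beta1=0.9, beta2=0.999, eps=1e-8):
    # Adam: bias-corrected first/second moment estimates scale each step
    w = b = 0.0
    m = np.zeros(2)
    v = np.zeros(2)
    for t in range(1, steps + 1):
        g = np.array(grads(w, b))
        m = beta1 * m + (1.0 - beta1) * g
        v = beta2 * v + (1.0 - beta2) * g ** 2
        m_hat = m / (1.0 - beta1 ** t)          # bias correction
        v_hat = v / (1.0 - beta2 ** t)
        step = lr * m_hat / (np.sqrt(v_hat) + eps)
        w -= step[0]
        b -= step[1]
    return w, b

for name, (w, b) in {"BGD": fit_bgd(), "Adam": fit_adam()}.items():
    print(f"{name}: w={w:.3f}, b={b:.3f}, MSE={mse(w, b):.4f}")
```

Both optimizers should recover parameters near (2, 1), with the final MSE close to the noise variance; the paper's point is that under noise Adam's adaptive per-parameter step sizes tend to reach a low loss faster than fixed-step gradient descent.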
Keywords:machine learning  linear regression  optimizer  loss function  Gaussian noise
This article is indexed in databases including 维普 (VIP).