Search results: 5 records
1.
Information Systems Frontiers - The way open data resources of varied type and volume are used by software applications remains only partly known. In this study, following CRoss-Industry Standard...
2.
The vigorous expansion of wind energy power generation over the last decade has also driven improvements to the surface roughness prediction models applied to high-torque milling operations. Artificial neural networks are the most widely used soft computing technique for the development of these prediction models. In this paper, we concentrate on the initial data transformation and its effect on the prediction of surface roughness in high-torque face milling operations. An extensive data set is generated from experiments performed under industrial conditions. The data set includes a very broad set of parameters that influence surface roughness: cutting tool properties, machining parameters and cutting phenomena. Some of these parameters may be related to one another or may have only a minor influence on the prediction model. Moreover, depending on the number of available records, the machine learning models may or may not be capable of modelling some of the underlying dependencies. Hence an appropriate number of input signals, and a matching prediction model configuration, must be selected. A hybrid algorithm that combines a genetic algorithm with neural networks is proposed in this paper to address the selection of relevant parameters and their appropriate transformation. The algorithm has been tested in a number of experiments performed under workshop conditions, with data sets of different sizes, to investigate the impact of the available data on the choice of data transformation. Data set size has a direct influence on the accuracy of the roughness prediction models, but also on the use of individual parameters and transformed features. The test results show significant improvements in the quality of prediction models constructed in this way, compared with standard multilayer perceptrons trained either with all the parameters or with data reduced through standard Principal Component Analysis.
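The abstract gives only an outline of the hybrid approach. As a rough illustration of the general idea, the sketch below wraps a neural-network regressor in a genetic search over binary feature masks; the toy data, GA settings and fitness definition are assumptions made here for illustration (Python with scikit-learn and NumPy), not the authors' configuration.

# Minimal sketch of GA-wrapped feature selection for a roughness regressor.
# NOT the authors' algorithm; GA settings and the fitness definition are illustrative.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

def fitness(mask, X, y):
    """Negative CV error of an MLP trained on the selected columns (higher is better)."""
    if mask.sum() == 0:                       # an empty selection is useless
        return -np.inf
    model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
    scores = cross_val_score(model, X[:, mask.astype(bool)], y,
                             cv=3, scoring="neg_mean_squared_error")
    return scores.mean()

def ga_select(X, y, pop_size=12, generations=10, p_mut=0.1):
    n_feat = X.shape[1]
    pop = rng.integers(0, 2, size=(pop_size, n_feat))        # random binary masks
    for _ in range(generations):
        fit = np.array([fitness(ind, X, y) for ind in pop])
        parents = pop[np.argsort(fit)[::-1][: pop_size // 2]]  # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = parents[rng.integers(len(parents), size=2)]
            cut = rng.integers(1, n_feat)                     # one-point crossover
            child = np.concatenate([a[:cut], b[cut:]])
            flip = rng.random(n_feat) < p_mut                 # bit-flip mutation
            children.append(np.where(flip, 1 - child, child))
        pop = np.vstack([parents, children])
    fit = np.array([fitness(ind, X, y) for ind in pop])
    return pop[int(np.argmax(fit))]

# Toy data standing in for the milling measurements (cutting parameters etc.).
X = rng.normal(size=(120, 8))
y = 0.8 * X[:, 0] - 0.5 * X[:, 3] + rng.normal(scale=0.1, size=120)
best_mask = ga_select(X, y)
print("selected feature indices:", np.flatnonzero(best_mask))

In the paper's setting the genetic search would also score candidate transformations of each input, not just whether it is included.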
3.
4.
A soft computing system for optimizing deep drilling operations under high-speed conditions in the manufacture of steel components is presented. The input data include cutting parameters and the axial cutting force obtained from the power consumption of the feed motor of the milling centres. Two different coolant strategies are tested: traditional working fluid and Minimum Quantity Lubrication (MQL). The model is constructed in three phases. First, a new strategy is proposed to evaluate and complete the set of available measurements. The primary objective of this phase is to decide whether further drilling experiments are required to develop an accurate roughness prediction model. An important aspect of the proposed strategy is the imputation of missing data, which makes it possible to exploit both complete and incomplete measurements. The proposed imputation algorithm is based on a genetic algorithm and aims to improve prediction accuracy. In the second phase, a bag of multilayer perceptrons is used to model the impact of deep drilling settings on borehole roughness. Finally, this model is supplied with the borehole dimensions, coolant option and expected axial force to produce a 3D surface showing the expected borehole roughness as a function of the drilling process settings. This plot is the output required for using the model under real workshop conditions. The proposed system is capable of approximating the optimal model for controlling deep drilling tasks on steel components for industrial use.
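Only the overall pipeline is described in the abstract. The sketch below illustrates one plausible reading of the second and third phases: a bag of multilayer perceptrons trained on bootstrap resamples, then swept over a grid of two drilling settings to build the data behind the 3D roughness surface. The input variables, the fixed 300 N axial force and all network settings are illustrative assumptions, not the paper's setup.

# Minimal sketch of the "bag of multilayer perceptrons" idea and the grid sweep
# behind the 3D roughness surface. Feature layout and network sizes are assumed.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)

# Toy stand-in data: [feed rate, cutting speed, axial force] -> borehole roughness.
X = rng.uniform(low=[0.05, 20.0, 100.0], high=[0.30, 80.0, 600.0], size=(200, 3))
y = 4.0 * X[:, 0] + 0.002 * X[:, 2] + rng.normal(scale=0.05, size=200)

def train_bag(X, y, n_models=10):
    """Train each MLP on a bootstrap resample of the measurements."""
    bag = []
    for i in range(n_models):
        idx = rng.integers(0, len(X), size=len(X))
        net = MLPRegressor(hidden_layer_sizes=(20,), max_iter=3000, random_state=i)
        bag.append(net.fit(X[idx], y[idx]))
    return bag

def bag_predict(bag, X):
    """Average the individual network predictions."""
    return np.mean([net.predict(X) for net in bag], axis=0)

bag = train_bag(X, y)

# Sweep feed rate and cutting speed at a fixed expected axial force (300 N assumed)
# to obtain the roughness surface used on the workshop floor.
feed = np.linspace(0.05, 0.30, 25)
speed = np.linspace(20.0, 80.0, 25)
F, S = np.meshgrid(feed, speed)
grid = np.column_stack([F.ravel(), S.ravel(), np.full(F.size, 300.0)])
roughness = bag_predict(bag, grid).reshape(F.shape)
print("predicted roughness range:", roughness.min(), "-", roughness.max())

In practice the grid would be generated for the borehole dimensions and coolant option requested on the shop floor, and the resulting surface plotted with a standard 3D plotting routine.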
5.
In response to the increasing number of cyber threats, novel detection and prevention methods are constantly being developed. One of the main obstacles hindering the development and evaluation of such methods is the shortage of reference data sets. This work proposes a way of testing network threat detection methods. It includes a procedure for creating realistic reference data sets describing network threats, as well as the processing and use of these data sets in testing environments. The proposed approach is illustrated and validated on the problem of spam detection. Reference data sets for spam detection are developed, analysed and used both to generate the requested volume of simulated traffic and to analyse it with machine learning algorithms. The tests take into account both the accuracy and the performance of threat detection methods under real load and constrained computing resources.
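The abstract does not specify the detection algorithms or the test harness. As a minimal sketch of the kind of experiment described, the code below trains a TF-IDF plus Naive Bayes spam detector on a stand-in reference set and measures both accuracy and throughput; the inline data and the chosen detector are assumptions for illustration, not the authors' reference corpus or method.

# Minimal sketch of a test harness: a labelled spam/ham reference set is used both
# to train a detector and to measure its accuracy and throughput. The tiny inline
# data set and the TF-IDF + Naive Bayes detector are illustrative assumptions.
import time
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Stand-in reference data set: (message, label) with 1 = spam, 0 = legitimate.
messages = [
    ("win a free prize now", 1), ("cheap meds online", 1),
    ("limited offer click here", 1), ("urgent account verification required", 1),
    ("meeting moved to 3pm", 0), ("please review the attached report", 0),
    ("lunch tomorrow?", 0), ("minutes from yesterday's call", 0),
] * 50  # replicate to mimic a larger generated traffic volume

texts, labels = zip(*messages)
X_train, X_test, y_train, y_test = train_test_split(
    texts, labels, test_size=0.3, random_state=0, stratify=labels)

detector = make_pipeline(TfidfVectorizer(), MultinomialNB())
detector.fit(X_train, y_train)

# Accuracy on the held-out part of the reference set.
pred = detector.predict(X_test)
print("accuracy:", accuracy_score(y_test, pred))

# Crude throughput measurement: messages classified per second under this load.
start = time.perf_counter()
detector.predict(X_test)
elapsed = time.perf_counter() - start
print("throughput:", len(X_test) / elapsed, "messages/s")

Scaling the replicated message list up or down is one simple way to mimic testing under different traffic volumes and resource constraints.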