Similar Literature (20 results)
1.
Soft computing tools can aid knowledge mining in predicting and classifying the properties of various parameters when designing composite preforms in Powder Metallurgy (P/M) manufacturing. In this paper, an integrated PRNET (PCA-Radial basis functional neural NET) model is proposed in several versions to select the relevant parameters for preparing composite preforms and to predict the deformation and strain-hardening properties of Al–Fe composites. The predictability of this model is 67.89% higher than that of conventional models. A new PR-filter is proposed by slightly modifying the conventional filters of RBFNN, which improves the power of PRNET even when the raw data are highly non-linear, interrelated and noisy. Moreover, fixing the range of input parameters for classifying the properties of composite preforms can be automated with fuzzy logic. Such models avoid expensive experimentation and risky environments when preparing sintered composite preforms. Thus, supported by these soft-computing tools, the manufacturing of composites in the P/M lab is simplified and requires minimal energy.

2.
This paper presents a system for monitoring and prognostics of machine conditions using soft computing (SC) techniques. The machine condition is assessed through a suitable 'monitoring index' extracted from the vibration signals. The progression of the monitoring index is predicted using an SC technique, namely the adaptive neuro-fuzzy inference system (ANFIS). A comparison with a machine learning method, support vector regression (SVR), is also presented. The proposed prediction procedures have been evaluated on benchmark data sets, and their prognostic effectiveness has been illustrated using previously published data on several types of machine faults. The performance of SVR was found to be better than that of ANFIS for the data sets used. The results help in understanding the relationship among machine conditions, the corresponding indicating features, the level of damage/degradation, and their progression.
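Neither the ANFIS nor the SVR model is specified in the abstract, but the core task, one-step-ahead prediction of a monitoring index from its recent history, can be sketched with a simple least-squares autoregressive predictor. This is a stand-in for the paper's methods, not a reproduction of them, and the index values below are invented:

```python
# One-step-ahead prediction of a machine "monitoring index" from its
# recent history, using an AR(2) model fitted by ordinary least squares.
# A simple stand-in for the ANFIS/SVR predictors; data are illustrative.

def solve3(m, v):
    """Solve a 3x3 linear system by Gauss-Jordan elimination with pivoting."""
    a = [row[:] + [v[i]] for i, row in enumerate(m)]
    for col in range(3):
        pivot = max(range(col, 3), key=lambda r: abs(a[r][col]))
        a[col], a[pivot] = a[pivot], a[col]
        for r in range(3):
            if r != col:
                f = a[r][col] / a[col][col]
                a[r] = [x - f * y for x, y in zip(a[r], a[col])]
    return [a[i][3] / a[i][i] for i in range(3)]

def fit_ar2(series):
    """Fit x[t] ~ a*x[t-1] + b*x[t-2] + c via the normal equations."""
    rows = [(series[t - 1], series[t - 2], 1.0) for t in range(2, len(series))]
    y = [series[t] for t in range(2, len(series))]
    ata = [[sum(r[i] * r[j] for r in rows) for j in range(3)] for i in range(3)]
    aty = [sum(r[i] * v for r, v in zip(rows, y)) for i in range(3)]
    return solve3(ata, aty)

# A slowly degrading monitoring index (hypothetical vibration feature).
index = [0.10, 0.12, 0.15, 0.19, 0.24, 0.30, 0.37, 0.45, 0.54]
a, b, c = fit_ar2(index)
next_value = a * index[-1] + b * index[-2] + c
print(round(next_value, 3))  # → 0.64
```

In the papers surveyed, the same role is played by far more flexible learners (ANFIS, SVR); the point here is only the prediction setup, past index values in, next index value out.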

3.
This paper presents a hybrid soft computing modeling approach: a neurofuzzy system based on rough set theory (RST) and genetic algorithms (GA). To address the curse of dimensionality in neurofuzzy systems, rough set theory is used to obtain a reduced fuzzy rule set; both the number of condition attributes and the number of rules are reduced. A genetic algorithm is used to obtain the optimal discretization of continuous attributes. The fuzzy system is then represented by an equivalent artificial neural network (ANN). Because the initial parameters of the ANN are reasonable, training converges quickly. After rule reduction, the ANN structure is small and not fully weight-connected. The RST- and GA-based neurofuzzy approach has been applied to the practical task of building a soft-sensor model for estimating the freezing point of light diesel fuel in a fluid catalytic cracking unit.
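As a rough illustration of the rough-set reduction step (a greedy toy version, not the paper's algorithm): an attribute is dropped whenever the remaining attributes still discern the decision classes. The decision table below is entirely hypothetical:

```python
# Greedy rough-set attribute reduction on a toy decision table.
# An attribute is dispensable if, without it, every indiscernibility
# class still maps to a single decision value.

def partitions(rows, attrs):
    """Group row indices by their values on the given attribute columns."""
    groups = {}
    for i, row in enumerate(rows):
        groups.setdefault(tuple(row[a] for a in attrs), set()).add(i)
    return list(groups.values())

def consistent(rows, attrs, decisions):
    """True if every indiscernibility class has a single decision value."""
    return all(len({decisions[i] for i in block}) == 1
               for block in partitions(rows, attrs))

def reduct(rows, decisions):
    attrs = list(range(len(rows[0])))
    for a in list(attrs):
        trial = [x for x in attrs if x != a]
        if consistent(rows, trial, decisions):
            attrs = trial  # attribute a is dispensable, drop it
    return attrs

# Condition attributes (columns) and a decision per row (toy data).
rows = [(0, 1, 0), (0, 1, 1), (1, 0, 0), (1, 1, 1)]
decisions = ["low", "low", "high", "high"]
print(reduct(rows, decisions))  # → [0]
```

The greedy pass is order-dependent and finds one reduct, not all of them; in the paper this pruning is what shrinks both the condition attributes and the rule set before the ANN is built.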

4.
Power control is a fundamental procedure in CDMA mobile radio communication systems. In multiservice CDMA systems, power control should minimise the transmission power of each connection, in order to limit the multiple-access interference while achieving the desired SIR levels. This paper starts from a transmitted-power allocation algorithm (TPAA) that considers a set of uplink transmissions to be supported by the system. The TPAA is then used to train an Elman neural network, which, owing to its internal characteristics, is applicable in the time-critical context of power control. Simulations and numerical results are analysed to provide a solid basis for employing our scheme in the power control of CDMA systems.
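The TPAA itself is not given in the abstract. As background, the classic distributed SIR-balancing update that such power-allocation schemes build on can be sketched as follows; the link gains, noise level and SIR target are illustrative, and the target must be feasible for the number of interfering users:

```python
# Iterative uplink power control toward a target SIR: each user scales
# its power by (target / current SIR). A classic distributed update,
# shown as background for TPAA-style allocation; data are illustrative.

def sir(p, gains, noise, i):
    """SIR of user i: own received power over interference plus noise."""
    interference = sum(gains[j] * p[j] for j in range(len(p)) if j != i)
    return gains[i] * p[i] / (interference + noise)

def power_control(gains, noise, target, iters=50):
    p = [1.0] * len(gains)
    for _ in range(iters):
        p = [p[i] * target / sir(p, gains, noise, i) for i in range(len(p))]
    return p

gains = [1.0, 0.8, 0.6]  # uplink link gains (hypothetical)
noise = 0.01
target = 0.4             # desired SIR per user (feasible for 3 users)
p = power_control(gains, noise, target)
print([round(sir(p, gains, noise, i), 3) for i in range(3)])  # → [0.4, 0.4, 0.4]
```

The iteration converges to the minimum-power allocation meeting the target whenever the target is feasible; the appeal of a trained neural network, as in the paper, is producing such an allocation without running the iteration online.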

5.
The discipline of Software Engineering is abstract and complex, with all its endeavors cast in a knowledge-intensive environment. It is not surprising that a number of important initiatives have attempted to address the burning need for solid development tools and comprehensive environments supporting in-depth analysis. The objective of this study is to discuss the role of Computational Intelligence (CI) and visual computing, viewed as a sound methodological and algorithmic environment for knowledge-oriented Software Engineering. CI itself is regarded as a synergistic consortium of granular computing (including fuzzy sets) promoting abstraction, neurocomputing supporting various learning schemes, and evolutionary computing providing important faculties of global optimization. By its very nature, CI embraces a diversity of design paradigms; in particular, it promotes a top-down approach (exploiting fuzzy sets first and afterwards working in the neural-network environment) or a bottom-up style (where these two technologies are used in the reverse order). Visual computing is inherently associated with CI: it is human-centric, and fuzzy sets make visualization activities feasible. Fuzzy sets are treated as a graphic means of accepting information from users and as a vehicle for visualizing results in a linguistic manner. Software Engineering and CI are highly compatible: both are knowledge-intensive and human-oriented, and both must deal with various manifestations of the abstract world of software constructs and thought processes. This multifaceted conceptual compatibility is a prerequisite for developing the vital synergistic links that bring CI technology into Software Engineering. The symbiosis accrues considerable benefits for both technologies by posing new categories of challenging and highly stimulating problems.
The facet of visual computing is essential in the handling of software processes and software products. The intent of this study is to provide a general overview of this new development in Software Engineering. In particular, we highlight a number of selected, highly visible trends occurring at the junction of CI and Software Engineering. Furthermore, we discuss several specific applications of CI technology to software cost estimation, the analysis of software measures, and neural models of software quality. Support from the Natural Sciences and Engineering Research Council of Canada (NSERC) and the Alberta Software Engineering Research Consortium (ASERC) is gratefully acknowledged.

6.
The contemporary design process requires a new computational intelligence or soft computing methodology involving intelligence integration and hybrid intelligent systems for design, analysis and evaluation, and optimization. This paper first discusses the need to incorporate intelligence into an automated design process and the various constraints designers face when embarking on industrial design projects. It then frames the design problem as optimizing the design output against constraints, using soft computing and hybrid intelligent system techniques. A soft-computing-integrated intelligent design framework is developed, and a hybrid dual cross-mapping neural network (HDCMNN) model is proposed using a hybrid soft computing technique based on cross-mapping between a back-propagation network (BPNN) and a recurrent Hopfield network (HNN), supporting the modeling, analysis and evaluation, and optimization tasks in the design process. The two networks perform different but complementary tasks: the BPNN decides whether the design problem is a type 0 (rational) or type 1 (non-rational) problem, and its output-layer weights are then used as the energy function for the HNN. The BPNN represents design patterns, trains classification boundaries, and passes its network weight values to the HNN, which uses them to evaluate and modify or re-design the design patterns. The developed system provides a unified soft-computing-integrated intelligent design framework with both symbolic and computational intelligence, and it has self-modifying and self-learning functions. Within the system, only one network training is needed to accomplish the evaluation, rectification/modification, and optimization tasks in the design process. Finally, two case studies illustrate and validate the developed model and system.

7.
8.
Software reliability prediction by soft computing techniques
In this paper, ensemble models are developed to forecast software reliability accurately. The ensembles comprise various statistical techniques (multiple linear regression and multivariate adaptive regression splines) and intelligent techniques (a backpropagation-trained neural network, a dynamic evolving neuro-fuzzy inference system, and TreeNet). Three linear ensembles and one non-linear ensemble are designed and tested. In experiments on software reliability data from the literature, the non-linear ensemble outperformed all the other ensembles as well as the constituent statistical and intelligent techniques.
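A minimal sketch of the linear-ensemble idea: combine each base model's outputs by a simple average and by inverse-MSE weighting. The base models named in the abstract (MLR, MARS, BPNN, DENFIS, TreeNet) are not reimplemented here, and the predictions and observations below are invented:

```python
# Linear ensembling of reliability predictors: given each base model's
# predictions on validation data, combine them by simple averaging and
# by weighting each model with the inverse of its MSE. Data are invented.

def mse(pred, actual):
    return sum((p - a) ** 2 for p, a in zip(pred, actual)) / len(actual)

actual = [10.0, 12.0, 15.0, 19.0]  # observed cumulative failure counts
base_preds = {
    "model_a": [9.0, 12.5, 14.0, 20.0],
    "model_b": [11.0, 11.0, 16.5, 18.5],
    "model_c": [10.5, 12.2, 15.3, 18.8],
}

# Linear ensemble 1: simple average of the base predictions.
avg = [sum(p[i] for p in base_preds.values()) / len(base_preds)
       for i in range(len(actual))]

# Linear ensemble 2: weight each model by the inverse of its MSE.
weights = {k: 1.0 / mse(p, actual) for k, p in base_preds.items()}
total = sum(weights.values())
weighted = [sum(weights[k] * base_preds[k][i] for k in base_preds) / total
            for i in range(len(actual))]

print(round(mse(avg, actual), 3), round(mse(weighted, actual), 3))
```

On this toy data both ensembles beat the best single model; the paper's non-linear ensemble goes further by learning the combination itself (e.g. with another network) instead of fixing the weights.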

9.
In this paper, we compare a conventional power control scheme with soft computing-based approaches in a mobile communications application. At the base station, a 'bang-bang' control strategy and a neural network-based predictive control method are employed. In addition, a full-power command transmission mode, a single-bit command transmission mode, and a fuzzy logic-based power command enhancement unit are considered. Based on simulation experiments, we quantitatively evaluate the performance of various combinations of these control methods and command transmission modes, and finally draw conclusions on the optimal configuration.

10.
This paper surveys the application of soft computing (SC) techniques in engineering design. Within this context, fuzzy logic (FL), genetic algorithms (GA) and artificial neural networks (ANN), as well as their fusion, are reviewed to examine the capability of soft computing methods to effectively address various hard-to-solve design tasks and issues. These tasks and issues are studied in the first part of the paper, accompanied by results from a survey performed in several industrial enterprises. The second part extensively reviews the literature on the application of SC techniques in engineering design. Although this review cannot be exhaustive, it may serve as a valuable guide for researchers interested in engineering design who wish to explore the opportunities offered by fuzzy logic, artificial neural networks and genetic algorithms for improving both the design outcome and the design process itself. An arithmetic method is used to evaluate the review results, to locate the research areas where SC has already produced considerable results, and to reveal new research opportunities.

11.
As the scope of oil and gas exploration keeps expanding, the objects studied in well-log interpretation are becoming increasingly complex, and traditional methods based solely on either hard computing or soft computing face serious challenges in log interpretation. This paper proposes four modes of fusing soft computing with hard computing. The separation mode of soft-hard computing fusion is applied to oil/gas-bearing pattern recognition for three wells in an oilfield (Oilsk81, Oilsk83 and Oilsk85). The comparison shows that, in this oil region, soft computing outperforms hard computing in recognizing oil/gas-bearing patterns and can identify better well-log data sets.

12.
A hybrid network of evolutionary processors (HNEP) is a graph where each node is associated with an evolutionary processor (a special rewriting system), a set of words, an input filter and an output filter. Every evolutionary processor is given a finite set of one type of point mutation (insertion, deletion or substitution of a symbol) that can be applied to certain positions of a string over the domain of the set of these rewriting rules. The HNEP functions by rewriting the words found at the nodes and then re-distributing the resulting strings according to a communication protocol based on a filtering mechanism; the filters are defined by certain variants of random-context conditions. HNEPs can be considered both as language generating devices (GHNEPs) and as language accepting devices (AHNEPs). In this paper, improving previous results, we prove that any recursively enumerable language can be determined by a GHNEP and an AHNEP with 7 nodes. We also show that the families of GHNEPs and AHNEPs with 2 nodes are not computationally complete.
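The three point mutations an evolutionary processor may apply can be sketched directly. This toy version applies one rule at every admissible position of a word and returns all resulting strings; the filters and the communication protocol of a full HNEP are omitted:

```python
# The three HNEP point mutations: substitution, insertion and deletion
# of a single symbol, applied at every admissible position of a word.
# Each function returns the set of all strings one rewriting step away.

def substitutions(word, old, new):
    """Replace one occurrence of `old` by `new`, at each position."""
    return {word[:i] + new + word[i+1:]
            for i in range(len(word)) if word[i] == old}

def insertions(word, sym):
    """Insert `sym` at each of the len(word)+1 positions."""
    return {word[:i] + sym + word[i:] for i in range(len(word) + 1)}

def deletions(word, sym):
    """Delete one occurrence of `sym`, at each position."""
    return {word[:i] + word[i+1:]
            for i in range(len(word)) if word[i] == sym}

print(sorted(substitutions("aba", "a", "c")))  # → ['abc', 'cba']
print(sorted(insertions("ab", "c")))           # → ['abc', 'acb', 'cab']
print(sorted(deletions("aba", "a")))           # → ['ab', 'ba']
```

A node in an HNEP iterates one such rule over its word set, and the random-context filters then decide which of the resulting strings may leave the node or enter its neighbours.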

13.
The conventional two-dimensional (2-D) histogram-based Otsu's method gives unreliable results for multilevel thresholding of brain magnetic resonance (MR) images, because the edges of the brain regions are not preserved by the local averaging process involved. Moreover, some useful pixels inside the off-diagonal regions are ignored in the calculation. This article presents an evolutionary gray gradient algorithm (EGGA) for optimal multilevel thresholding of brain MR images. More edge information is preserved by computing a 2-D histogram based on the gray gradient. The key to our success is the use of the gray-gradient information between the pixel values and the pixel average values to minimize the information loss; a speed improvement is also achieved. Theoretical formulations are derived for computing the maximum between-class variance from the 2-D histogram of the brain image, and a new fitness function is proposed for the EGGA. A novel adaptive swallow swarm optimization (ASSO) algorithm is introduced to optimize the fitness function; its performance is validated on twenty-three standard benchmark test functions and found to be better than that of swallow swarm optimization (SSO). The optimum threshold value is obtained by maximizing the between-class variance using ASSO. Our method is tested on 100 slices of the standard axial T2-weighted brain MRI database of Harvard Medical School. Its performance is compared with Otsu's method based on both the one-dimensional (1-D) and the 2-D histogram, and the results are also compared among four different soft computing techniques. The results obtained using our method are better than those of the other methods, both qualitatively and quantitatively.
The benefits of our method are: (i) the EGGA exhibits better objective-function values; (ii) the EGGA provides significantly improved results; and (iii) higher computational speed is achieved.
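The 2-D gray-gradient histogram and the ASSO optimizer are specific to this paper, but the underlying criterion, choosing a threshold that maximizes the between-class variance, is classical Otsu thresholding. A 1-D, single-threshold sketch with exhaustive search in place of ASSO (the toy histogram stands in for real image data):

```python
# Classical 1-D Otsu thresholding: pick the threshold that maximizes the
# between-class variance w0*w1*(mu0 - mu1)^2 of the two gray-level classes.
# The paper extends this criterion to a 2-D gray-gradient histogram and
# optimizes it with ASSO; here we search exhaustively on a toy histogram.

def otsu_threshold(hist):
    """Return the threshold t maximizing between-class variance; class 0
    is levels 0..t, class 1 is levels t+1..end."""
    total = sum(hist)
    global_mean = sum(i * h for i, h in enumerate(hist)) / total
    best_t, best_var = 0, -1.0
    w0 = 0.0    # probability mass of the background class
    sum0 = 0.0  # cumulative weighted intensity of the background class
    for t in range(len(hist) - 1):
        w0 += hist[t] / total
        sum0 += t * hist[t] / total
        w1 = 1.0 - w0
        if w0 == 0 or w1 == 0:
            continue
        mu0, mu1 = sum0 / w0, (global_mean - sum0) / w1
        var_between = w0 * w1 * (mu0 - mu1) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

# Bimodal toy histogram over 8 gray levels: dark peak at 1, bright peak at 6.
hist = [2, 10, 4, 1, 1, 4, 10, 2]
print(otsu_threshold(hist))  # → 3
```

Multilevel thresholding generalizes this to several thresholds, which makes exhaustive search expensive and motivates swarm optimizers such as ASSO.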

14.
15.
This paper proposes a novel soft-computing framework for human–machine system design and simulation based on hybrid intelligent system techniques. The complex human–machine system is described by human and machine parameters within a comprehensive model. Based on this model, procedures and algorithms for human–machine system design, economic/ergonomic evaluation, and optimization are discussed in an integrated CAD and soft-computing framework. Combining individual neural and fuzzy techniques, the neuro-fuzzy hybrid soft-computing scheme implements a fuzzy if-then rule block for human–machine system design, evaluation and optimization via a trainable neural fuzzy network architecture. For training and test purposes, assembly tasks were simulated and carried out on a self-built, multi-adjustable laboratory workstation with a flexible motion measurement and analysis system. The trained neural fuzzy network can predict the operator's postures and joint angles of motion for a range of workstation configurations, and can also be used for the design, layout and adjustment of human assembly workstations. The developed system provides a unified, intelligent computational framework for human–machine system design and simulation. Case studies for workstation design and simulation are provided to illustrate and validate the developed system.

16.
Data analysis techniques have traditionally been conceived to cope with data described as numeric vectors, because numeric vectors have a well-defined and clear geometric interpretation that facilitates mathematical analysis. However, state-of-the-art research on topics of fundamental importance, such as smart grids, networks of dynamical systems, biochemical and biophysical systems, intelligent trading systems, multimedia content-based retrieval systems, and social network analysis, deals with structured and non-conventional information, providing richer and hence more complex patterns to be analyzed. As a consequence, representing patterns by complex (relational) structures and defining suitable, usually non-metric, dissimilarity measures is becoming consolidated practice in related fields. However, as the data sources become more complex, the ability to judge data quality (or reliability), and the related interpretability, can be seriously compromised. For this purpose, automated methods able to synthesize relevant information while rigorously describing the uncertainty in the available datasets are very important: information granulation is the key aspect in the analysis of complex data. In this paper, we discuss our general viewpoint on the adoption of information granulation techniques in the context of soft computing and pattern recognition, conceived as a fundamental approach to the challenging problem of automatic modeling of complex systems. We focus on the specific setting of processing so-called non-geometric data, which diverges significantly from what has been done so far in the related literature. We highlight the motivations and the founding concepts, and finally provide a high-level conceptualization of the proposed data analysis framework.

17.
Soft computing is an interdisciplinary area that focuses on the design of intelligent systems to process uncertain, imprecise and incomplete information. It mainly builds on fuzzy set theory, fuzzy logic, neural computing, optimization, evolutionary algorithms, and approximate reasoning, among others. Information granularity is generally regarded as a crucial design asset, which helps establish a better rapport between the resulting granular model and the system under modeling. Human centricity is an inherent property of people's view of a system, a process, a machine or a model. Information granularity can be used to reflect people's level of uncertainty, which gives it a pivotal role in soft computing; indeed, the concept of information granularity has greatly facilitated the development of both the theory and the applications of soft computing. A number of papers on recent advances in the theoretical development and practical application of information granularity in soft computing are highlighted in this special issue. The main objective of this study is to collect as much research as possible on human centricity and information granularity within the theories and applications of soft computing, review the main ideas of this literature, compare the advantages and disadvantages of the methods, and identify the relationships among these theories and applications.

18.
Cloud computing is a recent and significant development in the domain of network applications with a new information technology perspective. This study develops a hybrid model to predict the motivators influencing the adoption of cloud computing services by information technology (IT) professionals. The research proposes a new model by extending the Technology Acceptance Model (TAM) with three external constructs, namely computer self-efficacy, trust, and job opportunity. One of the main contributions of this research is the introduction of a new construct, Job Opportunity (JO), for the first time in a technology adoption study. Data were collected from 101 IT professionals and analyzed using multiple linear regression (MLR) and neural network (NN) modeling. Based on RMSE values, the NN models were found to outperform the MLR model. The MLR results identified computer self-efficacy, perceived usefulness, trust, perceived ease of use, and job opportunity as predictors of cloud computing adoption, in that order, whereas the NN models ranked the best predictors as job opportunity, trust, perceived usefulness, self-efficacy, and perceived ease of use. The findings confirm the need to extend the fundamental TAM when studying a recent technology like cloud computing. This study will provide insights to IT service providers, government agencies, academicians, researchers and IT professionals.
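RMSE, the criterion used above to compare the MLR and NN models, is straightforward to compute; the predicted and observed scores below are invented for illustration:

```python
# Root-mean-square error, the model-comparison criterion in the study:
# the model with the lower RMSE fits the observed responses better.
# All values below are invented; neither the MLR nor the NN is refitted.
import math

def rmse(pred, actual):
    return math.sqrt(sum((p - a) ** 2 for p, a in zip(pred, actual))
                     / len(actual))

actual   = [3.0, 4.0, 5.0, 4.0]  # observed adoption-intention scores
mlr_pred = [2.5, 4.5, 4.0, 4.5]  # hypothetical MLR predictions
nn_pred  = [2.8, 4.2, 4.7, 4.1]  # hypothetical NN predictions

print(round(rmse(mlr_pred, actual), 3),
      round(rmse(nn_pred, actual), 3))  # → 0.661 0.212
```

Here the NN's lower RMSE mirrors the study's finding that the NN models outperformed the MLR model on this criterion.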

19.
Precise determination of the effective angle of shearing resistance (φ′) is a major concern and an essential criterion in the design of geotechnical structures such as foundations, embankments, roads, slopes, excavations and liner systems for solid waste. The experimental determination of φ′ is often very difficult and expensive, and requires extreme caution and labor. Therefore, many statistical and numerical modeling techniques have been suggested for estimating φ′. However, they typically consider no more than one parameter, in a simplified manner, and do not provide consistently accurate predictions of φ′. This study explores the potential of Gene Expression Programming (GEP), Artificial Neural Network (ANN) and Adaptive Neuro-Fuzzy (ANFIS) computing paradigms for predicting the φ′ value of soils. Data from consolidated-drained triaxial tests (CID) conducted in this study, from various projects in Turkey, and from the literature were used for training and testing the models. Four basic physical properties of soils, namely the percentage of fine grains (FG), the percentage of coarse grains (CG), the liquid limit (LL) and the bulk density (BD), were presented to the models as input parameters. The performance of the models was comprehensively evaluated using several statistical criteria. The results revealed that the GEP model is a promising approach for predicting the angle of shearing resistance of soils; the statistical performance evaluations showed that it significantly outperforms the ANN and ANFIS models in both training performance and prediction accuracy.

20.
There is a common misconception that the automobile industry is slow to adopt new technologies such as artificial intelligence (AI) and soft computing. In reality, many new technologies are deployed and brought to the public through the vehicles people drive. This paper provides an overview and a sampling of the many ways the automotive industry has utilized AI, soft computing and other intelligent system technologies in domains as diverse as manufacturing, diagnostics, on-board systems, warranty analysis and design.
Oleg Gusikhin received the Ph.D. degree from the St. Petersburg Institute of Informatics and Automation of the Russian Academy of Sciences and the M.B.A. degree from the University of Michigan, Ann Arbor, MI. Since 1993, he has been with the Ford Motor Company, where he is a Technical Leader at the Ford Manufacturing and Vehicle Design Research Laboratory, engaged in functional areas including information technology, advanced electronics manufacturing, and research and advanced engineering. He has also been involved in the design and implementation of intelligent control applications for manufacturing and vehicle systems. He is the recipient of the 2004 Henry Ford Technology Award, holds two U.S. patents, and has published over 30 articles in refereed journals and conference proceedings. He is an Associate Editor of the International Journal of Flexible Manufacturing Systems, a Certified Fellow of the American Production and Inventory Control Society, and a member of IEEE and SME.
Nestor Rychtyckyj received the Ph.D. degree in computer science from Wayne State University, Detroit, MI. He is a technical expert in Artificial Intelligence at Ford Motor Company, Dearborn, MI, in Advanced and Manufacturing Engineering Systems. His current research interests include the application of knowledge-based systems to vehicle assembly process planning and scheduling. His responsibilities include the development of automotive ontologies, intelligent manufacturing systems, controlled languages, machine translation and corporate terminology management. He has published more than 30 papers in refereed journals and conference proceedings, and is a member of AAAI, ACM and the IEEE Computer Society.
Dimitar P. Filev received the Ph.D. degree in electrical engineering from the Czech Technical University, Prague, in 1979. He is a Senior Technical Leader, Intelligent Control and Information Systems, with Ford Research and Advanced Engineering, specializing in industrial intelligent systems and technologies for control, diagnostics and decision making. He conducts research in systems theory and applications, modeling of complex systems, and intelligent modeling and control, and has published 3 books and over 160 articles in refereed journals and conference proceedings. He holds 14 granted U.S. patents and numerous foreign patents in the area of industrial intelligent systems. He is the recipient of the 1995 Award for Excellence of MCB University Press and has been awarded the Henry Ford Technology Award four times for the development and implementation of advanced intelligent control technologies. He is an Associate Editor of the International Journal of General Systems and the International Journal of Approximate Reasoning, a member of the Board of Governors of the IEEE Systems, Man and Cybernetics Society, and President of the North American Fuzzy Information Processing Society (NAFIPS).
