Similar documents
20 similar documents found.
1.
Neural Computing and Applications - Knowledge of groundwater level is very important in studies dealing with utilization and management of groundwater supply. Earlier studies have reported that ELM...

2.
Neural Computing and Applications - Medical diagnosis using machine learning techniques has received great attention over the last two decades. The detection of skin cancer based on visual information...

3.

Classification systems such as the rock mass rating (RMR) are used to evaluate rock mass quality. This paper evaluates RMR using a fuzzy clustering algorithm in order to improve the linguistic and empirical criteria of the RMR classification system. In the proposed algorithm, membership functions were first extracted for each RMR parameter from questionnaires filled out by experts. The RMR clustering algorithm was then constructed by weighting each parameter according to its percentage importance in the RMR classification system. At no stage of the proposed algorithm was empirical judgment used to determine the classes of the RMR system. According to the results obtained, the proposed algorithm is a powerful tool for modifying the rock mass rating system and can be generalized in future research.

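As a minimal illustration of the weighting idea (not the paper's elicited membership functions), the sketch below aggregates per-parameter ratings with the standard maximum ratings of the five basic RMR parameters used as importance weights, then assigns fuzzy quality-class memberships with triangular functions; the ratings and class definitions are illustrative assumptions.

```python
def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Maximum ratings of the five basic RMR parameters, used here as importance weights.
weights = {"strength": 15, "RQD": 20, "spacing": 20, "condition": 30, "groundwater": 15}

# Hypothetical field ratings for one rock mass (each on its own 0..max scale).
ratings = {"strength": 10, "RQD": 15, "spacing": 12, "condition": 20, "groundwater": 10}

# Normalise each rating, weight it by parameter importance, and rescale to 0..100.
normalised = {p: ratings[p] / weights[p] for p in weights}
score = 100 * sum(weights[p] * normalised[p] for p in weights) / sum(weights.values())

# Fuzzy membership of the aggregated score in three illustrative quality classes.
classes = {"poor": (0, 20, 45), "fair": (30, 55, 75), "good": (60, 85, 100)}
memberships = {name: round(tri(score, *abc), 2) for name, abc in classes.items()}
print(f"aggregated score = {score:.1f}", memberships)
```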

4.
Applied Soft Computing, 2007, 7(3): 728-738
This work is an attempt to illustrate the utility and effectiveness of soft computing approaches in handling the modeling and control of complex systems. Soft computing research is concerned with integrating artificial intelligence tools (neural networks, fuzzy technology, evolutionary algorithms, …) in a complementary hybrid framework for solving real-world problems. There are several approaches to integrating neural networks and fuzzy logic into a neuro-fuzzy system. The present work concentrates on the pioneering neuro-fuzzy system, the Adaptive Neuro-Fuzzy Inference System (ANFIS). ANFIS is first used to model non-linear knee-joint dynamics from recorded clinical data. The established model is then used to predict the behavior of the underlying system and for the design and evaluation of various intelligent control strategies.
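For readers unfamiliar with ANFIS, the sketch below shows the first-order Sugeno (TSK) inference step whose antecedent and consequent parameters ANFIS learns from data; the membership parameters and rule consequents are illustrative placeholders, not the knee-joint model from the paper.

```python
import numpy as np

def gaussmf(x, c, sigma):
    """Gaussian membership function."""
    return np.exp(-0.5 * ((x - c) / sigma) ** 2)

def tsk_predict(x1, x2, rules):
    """rules: list of ((c1, s1), (c2, s2), (p, q, r)) -> first-order Sugeno output."""
    firing, outputs = [], []
    for (c1, s1), (c2, s2), (p, q, r) in rules:
        w = gaussmf(x1, c1, s1) * gaussmf(x2, c2, s2)   # rule firing strength (product T-norm)
        firing.append(w)
        outputs.append(p * x1 + q * x2 + r)             # linear (first-order) consequent
    firing = np.array(firing)
    return float(np.dot(firing, outputs) / (firing.sum() + 1e-12))  # normalised weighted sum

# Two illustrative rules over two inputs.
rules = [((-1.0, 1.0), (-1.0, 1.0), (0.5, 0.2, 0.0)),
         (( 1.0, 1.0), ( 1.0, 1.0), (1.0, -0.3, 0.1))]
print(tsk_predict(0.4, -0.2, rules))
```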

5.
Multilevel thresholding is the method applied to segment a given image into distinct sub-regions when the gray-value distribution of the pixels is not clearly separated. The segmentation results are affected by factors such as the number of thresholds and the threshold values. Hence, this paper proposes different methods for determining optimal thresholds using optimization techniques, namely GA, PSO, and a hybrid model. Parallel algorithms are also proposed and implemented for these methods to reduce the execution time. From the experimental results, it is inferred that the proposed methods take less time to determine the optimal thresholds than existing methods such as the Otsu and Kapur methods.
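To make the objective concrete, the sketch below finds two thresholds by maximising Otsu's between-class variance over an image histogram with an exhaustive search; a GA or PSO would optimise the same objective instead of enumerating every threshold pair, and the toy trimodal histogram stands in for a real image.

```python
import numpy as np

def between_class_variance(hist, thresholds):
    """Between-class variance of a 256-bin histogram split at the given thresholds."""
    p = hist / hist.sum()
    levels = np.arange(len(hist))
    mu_total = np.dot(p, levels)
    bounds = [0] + list(thresholds) + [len(hist)]
    var = 0.0
    for lo, hi in zip(bounds[:-1], bounds[1:]):
        w = p[lo:hi].sum()
        if w > 0:
            mu = np.dot(p[lo:hi], levels[lo:hi]) / w
            var += w * (mu - mu_total) ** 2
    return var

def otsu_two_thresholds(hist):
    """Exhaustive search over all (t1, t2) pairs; a GA/PSO would sample this space instead."""
    best, best_t = -1.0, (0, 0)
    for t1 in range(1, 255):
        for t2 in range(t1 + 1, 256):
            v = between_class_variance(hist, (t1, t2))
            if v > best:
                best, best_t = v, (t1, t2)
    return best_t

# Toy trimodal histogram standing in for a real image histogram.
rng = np.random.default_rng(0)
pixels = np.concatenate([rng.normal(60, 10, 3000), rng.normal(128, 12, 3000), rng.normal(200, 8, 3000)])
hist, _ = np.histogram(np.clip(pixels, 0, 255), bins=256, range=(0, 256))
print(otsu_two_thresholds(hist.astype(float)))
```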

6.
Tight turbine steam temperature control is a necessity for obtaining long lifetime, high efficiency, high load-following capability and high availability in power plants. The present work reports a systematic approach to the control strategy design of power plants with non-linear characteristics. The presented control strategy is developed based on PI control optimized with genetic algorithms (GAs), and the performance and robustness of the GA-based PI controller (GAPI) are investigated. In order to design the controller, an effective neuro-fuzzy model of the de-superheating process is developed based on recorded data. Results indicate a successful identification of the high-order de-superheating process as well as improvements in the performance of the steam temperature controller.
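The sketch below shows the basic GA-tunes-PI idea: candidate (Kp, Ki) pairs are scored by simulating a closed loop and minimising the integral of squared error. The first-order plant, cost function, and GA settings are illustrative placeholders, not the identified de-superheating model.

```python
import numpy as np

def ise_cost(kp, ki, setpoint=1.0, dt=0.01, t_end=10.0, K=2.0, tau=1.5):
    """Integral of squared error for a PI loop around the plant dy/dt = (K*u - y)/tau."""
    y, integ, cost = 0.0, 0.0, 0.0
    for _ in range(int(t_end / dt)):
        e = setpoint - y
        integ += e * dt
        u = kp * e + ki * integ
        y += dt * (K * u - y) / tau          # Euler step of the plant
        cost += e * e * dt
    return cost

rng = np.random.default_rng(1)
pop = rng.uniform(0.0, 5.0, size=(30, 2))    # population of (Kp, Ki) pairs
for gen in range(40):
    fitness = np.array([ise_cost(kp, ki) for kp, ki in pop])
    parents = pop[np.argsort(fitness)[:10]]  # truncation selection
    children = []
    while len(children) < len(pop) - len(parents):
        a, b = parents[rng.integers(10)], parents[rng.integers(10)]
        alpha = rng.uniform(size=2)
        child = alpha * a + (1 - alpha) * b  # blend crossover
        child += rng.normal(0.0, 0.1, size=2)  # Gaussian mutation
        children.append(np.clip(child, 0.0, 5.0))
    pop = np.vstack([parents, children])

best_kp, best_ki = pop[np.argmin([ise_cost(kp, ki) for kp, ki in pop])]
print(f"best Kp = {best_kp:.3f}, Ki = {best_ki:.3f}")
```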

7.
The motor unit action potentials (MUPs) in an electromyographic (EMG) signal provide a significant source of information for the assessment of neuromuscular disorders. With recent developments in computer-aided EMG equipment, different methodologies in the time and frequency domains have been followed for the quantitative analysis of EMG signals. In this study, the usefulness of different feature extraction methods for describing MUP morphology is investigated. In addition, soft computing techniques are presented for the classification of intramuscular EMG signals. The proposed method automatically classifies the EMG signals as normal, neurogenic, or myopathic. Multilayer perceptron neural network (MLPNN), dynamic fuzzy neural network (DFNN), and adaptive neuro-fuzzy inference system (ANFIS) based classifiers were also compared with respect to their accuracy in classifying EMG signals. Concerning the impact of the features on EMG signal classification, different results were obtained through analysis of the soft computing techniques. The comparative analysis suggests that ANFIS modelling is superior to DFNN and MLPNN on at least three points: a slightly higher recognition rate, insensitivity to overtraining, and consistent outputs demonstrating higher reliability.

8.
This paper presents the application of soft computing techniques for strength prediction of heat-treated extruded aluminium alloy columns failing by flexural buckling. Neural networks (NN) and genetic programming (GP) are the soft computing techniques used in the study; gene-expression programming (GEP), an extension of GP, is employed. The training and test sets for the soft computing models are obtained from experimental results available in the literature. An algorithm is also developed for the optimal NN model selection process. The proposed NN and GEP models are presented in explicit form for use in practical applications. The accuracy of the proposed soft computing models is compared with existing design codes and is found to be higher.

9.
Software reliability prediction by soft computing techniques
In this paper, ensemble models are developed to accurately forecast software reliability. Various statistical techniques (multiple linear regression and multivariate adaptive regression splines) and intelligent techniques (backpropagation-trained neural network, dynamic evolving neuro-fuzzy inference system and TreeNet) constitute the ensembles presented. Three linear ensembles and one non-linear ensemble are designed and tested. Based on the experiments performed on software reliability data obtained from the literature, it is observed that the non-linear ensemble outperformed all the other ensembles as well as the constituent statistical and intelligent techniques.
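As a minimal illustration of a linear ensemble, the sketch below combines the predictions of several constituent models with weights fitted by least squares on a validation set and compares the result with a simple average; the constituent predictions are synthetic stand-ins for the statistical and intelligent models used in the paper.

```python
import numpy as np

rng = np.random.default_rng(2)
y_val = rng.uniform(0, 1, 50)                                  # validation targets
# Three noisy "constituent model" predictions with different error levels.
preds = np.column_stack([y_val + rng.normal(0, s, 50) for s in (0.05, 0.10, 0.20)])

# Linear ensemble: least-squares weights (plus intercept) over the constituent outputs.
X = np.column_stack([preds, np.ones(len(y_val))])
w, *_ = np.linalg.lstsq(X, y_val, rcond=None)
ensemble = X @ w

print("simple average MSE:  ", np.mean((preds.mean(axis=1) - y_val) ** 2))
print("weighted ensemble MSE:", np.mean((ensemble - y_val) ** 2))
```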

10.
Multimedia Tools and Applications - Cyberbullying is bullying carried out in the digital realm. It has become extremely detrimental as social media and the internet have become more popular and...

11.
Applied Soft Computing, 2008, 8(2): 906-918
For many soft computing methods, we need to generate random numbers to use either as initial estimates or during the learning and search process. Recently, results for evolutionary algorithms, reinforcement learning and neural networks have been reported which indicate that the simultaneous consideration of randomness and opposition is more advantageous than pure randomness. This new scheme, called opposition-based learning, has the apparent effect of accelerating soft computing algorithms. This paper proves this advantage both mathematically and experimentally and, as an application, applies it to accelerate differential evolution (DE). By taking advantage of random numbers and their opposites, the optimization, search or learning process in many soft computing techniques can be accelerated when there is no a priori knowledge about the solution. The mathematical proofs and the results of the conducted experiments confirm each other.
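The core operation is simple, as the sketch below shows for population initialisation: for each random candidate x drawn from [lo, hi], its opposite lo + hi - x is also evaluated, and the fitter half of the combined set seeds the search; the same check can also be applied periodically during the DE run. The sphere objective and bounds are illustrative.

```python
import numpy as np

def sphere(x):
    """Illustrative objective: minimum 0 at the origin."""
    return float(np.sum(x ** 2))

rng = np.random.default_rng(3)
dim, pop_size = 5, 20
lo, hi = -5.0, 5.0

pop = rng.uniform(lo, hi, size=(pop_size, dim))   # random initial candidates
opposite = lo + hi - pop                          # their opposite points
candidates = np.vstack([pop, opposite])
fitness = np.array([sphere(x) for x in candidates])
init_pop = candidates[np.argsort(fitness)[:pop_size]]  # fittest half seeds the DE run

print("best initial fitness:", sphere(init_pop[0]))
```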

12.
Wide acceptance of mobile phones and their resource-hungry applications has highlighted the resource limitations of mobile devices. In this regard, cloud computing has provided mobile phones with unlimited resources to help them overcome their constraints and support a wider range of applications, so mobile devices can outsource their tasks to public or local clouds. To accommodate the exponential growth of requests, user requests should be distributed to different cloudlets and then transparently and dynamically redirected to servers according to the latest network and server status. Therefore, finding the best place to offload is crucial to both the functionality and the performance of the system. However, accurate and timely parameters describing network and server status are difficult to obtain, so traditional algorithms cannot perform effectively and efficiently. As a solution, in this paper an adaptive neuro-fuzzy inference system is proposed and trained to assign tasks to servers efficiently. The trained system is robust to imprecise context information and tolerant of measurement noise and errors. We consider improving both system performance and user quality-of-service parameters. Simulation results demonstrate that, compared with other server selection schemes, the proposed scheme can achieve higher resource utilization (the percentage of time a server is busy), provide better user-perceived quality of service, and efficiently deal with network dynamics. The simulation results also show that the proposed algorithm outperforms the compared works by about 30% in the best case and about 8.93% in the worst case.

13.
With growing technology, fault detection and isolation (FDI) has become one of the most interesting and important research areas in modern control and signal processing. The accomplishment of specific missions, such as waste treatment in nuclear reactors or data collection in space and underwater missions, makes reliability especially important for robotics, and this demand forces researchers to adapt available FDI studies on nonlinear systems to robot manipulators, mobile robots and mobile manipulators. In this study, two model-based FDI schemes for robot manipulators using soft computing techniques that integrate Neural Networks (NN) and Fuzzy Logic (FL) are proposed. Both schemes use M-ANFIS for robot modelling. The first scheme isolates faults by passing residual signals through a neural network. The second scheme isolates faults by modelling faulty robot behaviour for defined faults and combining these models in a generalized observers scheme (GOS) structure. The performance of these schemes is tested on a simulated two-link planar manipulator, and simulation results together with a comparison against some important FDI specifications are presented.

14.
Text mining or analytics is important for various applications such as market analysis and biomedical purposes because it enables the efficient retrieval of information from large datasets. During the analysis, increasing the dimensionality of the data reduces the performance of the entire system because it may retrieve irrelevant text, which creates errors. Therefore, this paper introduces big data and data mining techniques to analyse large volumes of information while mining texts, emails, blogs, online forums, news, and call centre documents. Initially, the data are collected from various sources and contain noise, which is removed by applying normalization techniques. Data mining techniques eliminate the irrelevant information and noise, and the relevant features are selected using a rough set-based particle swarm optimization algorithm. The selected features are clustered using a fuzzy set with the particle swarm optimization algorithm, which improves the efficiency of the mining process. Then, the efficiency of the system is evaluated on the University of California Irvine Machine Learning Repository knowledge process mining database using the sum of the intra-cluster distances, the mean squared error rate, and the accuracy.

15.
Heart Rate Variability (HRV) is a physiological phenomenon consisting of the oscillation in the interval between consecutive heartbeats. Based on HRV analysis, cardiology experts can assess both cardiac health and the condition of the autonomic nervous system that controls heart activity and, consequently, try to prevent cardiovascular mortality. In this scenario, one of the most widely accepted and low-cost diagnostic procedures useful for deriving and evaluating HRV is the electrocardiogram (ECG), i.e., a transthoracic interpretation of the electrical activity of the heart over a period of time. With the advent of modern signal processing techniques, the diagnostic power of the ECG has increased greatly owing to the huge number of features that are typically extracted from the ECG signal. Even though this expanded set of features could allow medical staff to diagnose various pathologies accurately, it is too complex to manage manually, and for this reason methods for feature representation and evaluation are necessary to support medical diagnosis. Starting from this consideration, this paper proposes an enhanced ECG-based decision-making system exploiting a collection of ontological models representing the ECG and HRV feature sets, together with a fuzzy inference engine based on the Type-2 Fuzzy Markup Language, capable of evaluating the ECG and HRV properties of a given person and inferring detailed information about his or her health quality level. As shown in the experimental section, where the proposed approach has been tested on a set of students under examination, the diagnostic framework yields good performance in terms of both precision and recall.
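For context, two of the standard time-domain HRV indices that typically appear in such feature sets are straightforward to compute from the RR-interval series, as the sketch below shows; the interval values are synthetic, not taken from the paper's data.

```python
import numpy as np

# Synthetic RR intervals in milliseconds (time between consecutive heartbeats).
rr = np.array([812, 790, 835, 805, 820, 798, 842, 815], dtype=float)

sdnn = rr.std(ddof=1)                        # SDNN: overall variability of the RR series
rmssd = np.sqrt(np.mean(np.diff(rr) ** 2))   # RMSSD: short-term, beat-to-beat variability

print(f"SDNN = {sdnn:.1f} ms, RMSSD = {rmssd:.1f} ms")
```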

16.
Until recently, local governments in Spain were using machines with rolling cylinders to verify taximeters. However, the condition of the tires can lead to errors in the process, and the mechanical construction of the test equipment is not compatible with certain vehicles. Thus, a new measurement device needs to be designed. In our opinion, the verification of a taximeter will not be reliable unless measurements taken on an actual taxi run are used. GPS sensors are intuitively well suited for this process because they provide the position and the speed independently of the car devices under test. But there are legal problems that make the use of GPS-based sensors difficult: GPS coordinate measurements do not exactly match the real coordinates and, generally speaking, we are not given absolute tolerances. We cannot know whether the maximum error is always lower than, for example, 7 m; however, we might know that 50% of the measurements lie in a circle with a radius of 7 m, centered on the real position. In this paper we describe a practical application where these legal problems have been solved with soft computing based technologies. In particular, we propose to characterize the uncertainty in the GPS with fuzzy techniques, so that we can reuse certain recent algorithms, originally intended for genetic fuzzy systems, in this new context. Specifically, we propose a new method for computing an upper bound on the length of the trajectory, taking into account the vagueness of the GPS data. This bound is computed using a modified multiobjective evolutionary algorithm that can optimize a fuzzy-valued function. The accuracy of the measurements is further improved by combining them with restrictions based on the dynamic behavior of the vehicles.
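A simplified, crisp version of the bounding idea is sketched below: if each true position lies within a known radius of its GPS fix, then by the triangle inequality each true segment is at most the measured segment plus both radii. The fuzzy, multiobjective treatment in the paper refines this; the coordinates and radii here are illustrative.

```python
import numpy as np

# GPS fixes in a local metric frame (metres) and a per-fix uncertainty radius.
fixes = np.array([[0.0, 0.0], [50.0, 10.0], [120.0, 15.0], [200.0, 40.0]])
radius = np.array([7.0, 7.0, 5.0, 7.0])

measured = np.linalg.norm(np.diff(fixes, axis=0), axis=1)          # measured segment lengths
upper_bound = np.sum(measured + radius[:-1] + radius[1:])          # worst-case true length

print(f"measured length = {measured.sum():.1f} m, upper bound = {upper_bound:.1f} m")
```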

17.
The layout design of satellite modules is considered to be NP-hard. It is not only a complex coupled system design problem but also a special multi-objective optimization problem. The greatest challenge in solving this problem is that the function to be optimized is characterized by a multitude of local minima separated by high-energy barriers. The Wang-Landau (WL) sampling method, an improved Monte Carlo method, has been successfully applied to many optimization problems. In this paper we use the WL sampling method to optimize the layout of a satellite module. To accelerate the search for a globally optimal layout, a local search (LS) based on the gradient method is executed each time the Monte Carlo sweep produces a new layout. By combining the WL sampling algorithm, the LS method, and heuristic layout update strategies, a hybrid method called WL-LS is proposed to obtain a final layout scheme. Furthermore, to improve the efficiency of the algorithm significantly, we propose an accurate and fast method for computing the overlapping depth between two objects that overlap each other (two rectangular objects, two circular objects, or a rectangular and a circular object). The rectangular objects are placed orthogonally. We test two instances, using first 51 and then 53 objects. For both instances, the proposed WL-LS algorithm outperforms methods in the literature. Numerical results show that the WL-LS algorithm is an effective method for layout optimization of satellite modules.
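The overlap computation for the two simplest cases can be sketched directly, as below for two circles and two axis-aligned rectangles; the paper's exact formulation (including the mixed rectangle-circle case) may differ, so these are just the standard geometric expressions.

```python
import numpy as np

def circle_circle_depth(c1, r1, c2, r2):
    """Overlap depth of two circles: positive when they overlap, zero otherwise."""
    d = np.linalg.norm(np.asarray(c1, dtype=float) - np.asarray(c2, dtype=float))
    return max(0.0, r1 + r2 - d)

def rect_rect_overlap(center1, half1, center2, half2):
    """Overlap area of two axis-aligned rectangles given centres and half-sizes."""
    dx = half1[0] + half2[0] - abs(center1[0] - center2[0])
    dy = half1[1] + half2[1] - abs(center1[1] - center2[1])
    return max(0.0, dx) * max(0.0, dy)

print(circle_circle_depth((0, 0), 3.0, (4, 0), 2.0))     # 1.0: circles interpenetrate by 1
print(rect_rect_overlap((0, 0), (2, 1), (3, 0), (2, 1)))  # 2.0: overlap region is 1 x 2
```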

18.
This interdisciplinary research is based on the application of unsupervised connectionist architectures in conjunction with modelling systems, and on determining the optimal operating conditions of a new high-precision industrial process known as laser milling. Laser milling is a relatively new micro-manufacturing technique for the production of high-value industrial components. The industrial problem is defined by a data set relayed through standard sensors situated on a laser-milling centre, a machine tool for manufacturing high-value micro-moulds, micro-dies and micro-tools. The new three-phase industrial system presented in this study is capable of identifying a model of the laser-milling process based on low-order models. The first two steps are based on the use of unsupervised connectionist models. The first step involves the analysis of the data sets that define each case study, to identify whether they are informative enough or whether the experiments have to be performed again. In the second step, a feature selection phase determines the main variables to be processed in the third step. In this last step, the study provides a model of the laser-milling procedure based on low-order models, such as black-box models, in order to approximate the optimal form of the laser-milling process. The three-step model has been tested with real data obtained for three different materials: aluminium, copper and hardened steel. These three materials are used in the manufacture of micro-moulds, micro-coolers and micro-dies, high-value tools for the medical and automotive industries among others. As the model inputs are standard data provided by the laser-milling centre, the industrial implementation of the model is immediate. Thus, this study demonstrates how a high-precision industrial process can be improved using a combination of artificial intelligence and identification techniques.

19.
Neural Computing and Applications - Image segmentation using multilevel thresholding (MT) is one of the leading methods. Although, as most techniques are based on the image histogram to be...

20.
The purpose of this article is to demonstrate the use of feedforward neural networks (FFNNs), adaptive neural fuzzy inference systems (ANFIS), and probabilistic neural networks (PNNs) to discriminate between earthquakes and quarry blasts in Istanbul and its vicinity (the Marmara region). The tectonically active Marmara region is affected by the Thrace-Eskişehir fault zone and especially the North Anatolian fault zone (NAFZ). Local MARNET stations, which were established in 1976 and are operated by the Kandilli Observatory and Earthquake Research Institute (KOERI), record not only earthquakes that occur in the region, but also quarry blasts. There are a few quarry-blasting areas in the Gaziosmanpaşa, Çatalca, Ömerli, and Hereke regions. Analytical methods were applied to a set of 175 seismic events (2001-2004) recorded by the stations of the local seismic network (ISK, HRT, and CTT stations) operated by the KOERI National Earthquake Monitoring Center (NEMC). Of the 175 records, 148 are related to quarry blasts and 27 to earthquakes. The data sets were divided into training and testing sets for each region. In all the models developed, the input vectors consist of the peak amplitude ratio (S/P ratio) and the complexity value, and the output is a determination of either earthquake or quarry blast. The success of the developed models on regional test data varies between 97.67% and 100%.
