Similar Literature
20 similar records found
1.
The last few decades have seen a phenomenal increase in the quality, diversity and pervasiveness of computer games. The worldwide computer games market is estimated to be worth around USD 21bn annually, and is predicted to continue to grow rapidly. This paper reviews some of the recent developments in applying computational intelligence (CI) methods to games, points out some of the potential pitfalls, and suggests some fruitful directions for future research.

2.
The 3rd ICIM'2014 aims to offer a meeting opportunity for researchers, students, and industry practitioners from around the world to present recent results and discuss the latest advances and new trends in the theory and application of intelligent machines with both efficient and flexible behaviour. The topics of interest include, but are not limited to: computational intelligence; intelligent hybrid systems; document analysis and recognition;

3.
Recent financial and debt crises have made credit risk management one of the most important issues in financial research. Reliable credit scoring models are crucial for financial agencies to evaluate credit applications, and they have been widely studied in machine learning and statistics. In this paper, a novel feature-weighted support vector machine (SVM) credit scoring model is presented for credit risk assessment, in which an F-score is adopted for feature importance ranking. To account for the mutual interaction among modelling features, random forest is further introduced for measuring relative feature importance. These two feature-weighted versions of the SVM are tested against the traditional SVM on two real-world datasets, and the results confirm the validity of the proposed method.
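As an editorial illustration (not code from the cited paper), the sketch below shows one common reading of F-score feature weighting before SVM training: compute the Chen-Lin F-score per feature and scale each feature by it. The weighting scheme, the toy data, and the helper name `f_scores` are assumptions for illustration.

```python
# Minimal sketch: F-score feature weighting + SVM (assumed scheme, toy data).
import numpy as np
from sklearn.svm import SVC

def f_scores(X, y):
    """Chen-Lin F-score of each feature for a binary-labelled dataset."""
    pos, neg = X[y == 1], X[y == 0]
    num = (pos.mean(0) - X.mean(0)) ** 2 + (neg.mean(0) - X.mean(0)) ** 2
    den = pos.var(0, ddof=1) + neg.var(0, ddof=1)
    return num / (den + 1e-12)              # guard against zero variance

# toy credit data: 200 applicants, 5 features, binary good/bad label
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=200) > 0).astype(int)

w = f_scores(X, y)
clf = SVC(kernel="rbf").fit(X * w, y)       # feature-weighted SVM
print("F-scores:", np.round(w, 3))
```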

4.
The 2008 IEEE World Congress on Computational Intelligence (WCCI 2008) will be held at the Hong Kong Convention and Exhibition Centre during June 1-6, 2008. WCCI 2008 will be the fifth milestone in this series, with a glorious history stretching from WCCI 1994 in Orlando, WCCI 1998 in Anchorage, and WCCI 2002 in Honolulu to WCCI 2006 in Vancouver. Sponsored by the IEEE Computational Intelligence Society; co-sponsored by the International Neural Network Society, the Evolutionary Programming Society, and the Institution of Engineering and Technology; and composed of the 2008 International Joint Conference on Neural Networks (IJCNN 2008), the 2008 IEEE International Conference on Fuzzy Systems (FUZZ-IEEE 2008), and the 2008 IEEE Congress on Evolutionary Computation (CEC 2008), WCCI 2008 will be the largest and most influential technical event on computational intelligence in the world. It will provide a stimulating forum for thousands of scientists, engineers, educators and students from all over the world to disseminate their new research findings and exchange information on emerging areas of research in the field. WCCI 2008 will also create a pleasant environment for participants to meet old friends and make new ones who share similar research interests.

5.
We note that several existing algorithms based on the normalized least-mean-square (NLMS) algorithm, which aim to reduce its computational complexity, are all inherited from the solution of the same optimization problem under different constraints. A new constraint is analyzed to replace the extra searching technique in the set-membership partial-update NLMS algorithm (SM-PU-NLMS), which seeks a variable number of updated coefficients for a further reduction of computational complexity. We obtain a closed-form expression for the new constraint, requiring no extra searching technique, and use it to derive a novel set-membership variable-partial-update NLMS (SM-VPU-NLMS) algorithm. The SM-VPU-NLMS algorithm achieves faster convergence and a smaller mean-squared error (MSE) than the existing SM-PU-NLMS. The closed-form expression can also be applied to the conventional variable-step-size partial-update NLMS (VSS-PU-NLMS) algorithm; the resulting variable-step-size variable-partial-update NLMS (VSS-VPU-NLMS) algorithm is likewise verified to further reduce computational complexity. Simulation results verify that our analysis is reasonable and effective.
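As an editorial illustration of the algorithm family discussed above, here is a minimal NumPy sketch of the baseline set-membership NLMS update (update only when the a priori error exceeds a bound γ). The variable-partial-update constraint derived in the paper is not reproduced; the data and function name are assumptions.

```python
# Minimal sketch of set-membership NLMS system identification (assumed setup).
import numpy as np

def sm_nlms(x, d, order=8, gamma=0.05, delta=1e-6):
    """Identify an FIR system from input x and desired response d."""
    w = np.zeros(order)
    for n in range(order - 1, len(x)):
        u = x[n - order + 1:n + 1][::-1]   # regressor [x[n], ..., x[n-order+1]]
        e = d[n] - w @ u                   # a priori error
        if abs(e) > gamma:                 # set-membership: update only when
            mu = 1.0 - gamma / abs(e)      # the error bound is violated
            w += mu * e * u / (u @ u + delta)
    return w

rng = np.random.default_rng(1)
h = rng.normal(size=8)                     # unknown channel
x = rng.normal(size=5000)
d = np.convolve(x, h)[:len(x)] + 0.01 * rng.normal(size=len(x))
print(np.round(sm_nlms(x, d) - h, 3))      # residuals should be near zero
```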

6.
Rough Sets, Their Extensions and Applications
Rough set theory provides a useful mathematical foundation for developing automated computational systems that can help understand and make use of imperfect knowledge. Despite its relative recency, the theory and its extensions have been widely applied to many problems, including decision analysis, data mining, intelligent control, and pattern recognition. This paper presents an outline of the basic concepts of rough sets and their major extensions, covering variable-precision, tolerance, and fuzzy rough sets. It also shows the diversity of successful applications these theories have enabled, ranging from finance and business, through biology and medicine, to physics, art, and meteorology.
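As an editorial illustration of the basic construct the survey builds on, the sketch below computes Pawlak's lower and upper approximations, where indiscernibility classes are objects with identical attribute values. The variable-precision and fuzzy extensions are omitted; the toy objects are assumptions.

```python
# Minimal sketch of rough-set lower/upper approximations (toy data).
from collections import defaultdict

def approximations(objects, attrs, target):
    """objects: {name: {attr: value}}; target: set of object names."""
    classes = defaultdict(set)                 # indiscernibility classes
    for name, values in objects.items():
        classes[tuple(values[a] for a in attrs)].add(name)
    lower = {x for c in classes.values() if c <= target for x in c}
    upper = {x for c in classes.values() if c & target for x in c}
    return lower, upper

objs = {"o1": {"colour": "red", "size": "big"},
        "o2": {"colour": "red", "size": "big"},
        "o3": {"colour": "blue", "size": "big"}}
print(approximations(objs, ["colour", "size"], {"o1", "o3"}))
# ({'o3'}, {'o1', 'o2', 'o3'}): o3 certainly belongs; o1, o2 only possibly
```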

7.
In this paper a new theory, referred to as decrease-radix design (DRD), is proposed, which arose from research on the logic units of the ternary (tri-valued) optical computer. Based on the proposed theory, the principles and regulations of DRD for building carry/borrow-free multi-valued operation units are also presented. The research work arrives at the following important conclusion: let D be a special state contained among n physical informative states; then any of the n^(n×n) carry/borrow-free n-valued operation units can be realized by composing some of the n×n×(n-1) simplest basic operating units according to the DRD regulations proposed in this paper. The systematic design procedure is illustrated step by step with the design of a tri-valued logic optical operating unit, and the actual architecture, procedure, and experimental results of our sample tri-valued logic operating unit are given. Finally, a reconfigurable model of a ternary logic optical processor is introduced. The theory proposed in this paper lays a solid foundation for the design of reconfigurable carry/borrow-free operating units in ternary optical computers and can be widely used as a design reference for a variety of multi-valued logic operating units.
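As a quick arithmetic check of the counts quoted above (an editorial addition, not from the paper), the ternary case n = 3 gives:

```python
# Counts stated in the abstract, evaluated for the ternary case n = 3.
n = 3
print(n ** (n * n))      # 19683 possible carry/borrow-free tri-valued units
print(n * n * (n - 1))   # 18 simplest basic operating units to compose from
```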

8.
Parameter identification is a key requirement in the automated control of unmanned excavators (UEs). Moreover, a UE operates in unstructured, often hazardous environments and therefore requires a robust parameter-identification scheme for field applications. This paper presents the results of a research study on parameter identification for UEs. Three identification methods (the Newton-Raphson method, the generalized Newton method, and the least-squares method) are applied and compared for prediction accuracy, robustness to noise, and computational speed. The techniques are used to identify the link parameters (mass, inertia, and length) and friction coefficients of a full-scale UE. Using experimental data from a full-scale field UE, the values of the link parameters and the friction coefficients are identified, and some of the identified parameters are compared with measured physical values. Furthermore, the joint torques and positions computed by the proposed model using the identified parameters are validated against measured data. The comparison shows that both the Newton-Raphson method and the generalized Newton method are better in terms of prediction accuracy; the Newton-Raphson method is computationally efficient and has potential for real-time application, while the generalized Newton method is slightly more robust to measurement noise. The experimental data were obtained in collaboration with QinetiQ Ltd.
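As an editorial illustration of the regression step shared by the compared methods, the sketch below identifies the parameters of a one-link arm, tau = I*qdd + b*qd + m*g*l*sin(q), which is linear in (I, b, m*g*l) and so solvable by least squares. The excavator in the paper is multi-link; the trajectory and parameter values here are assumptions.

```python
# Minimal sketch of least-squares parameter identification (one-link model).
import numpy as np

rng = np.random.default_rng(2)
t = np.linspace(0, 10, 2000)
q = np.sin(t); qd = np.cos(t); qdd = -np.sin(t)          # measured trajectory
theta_true = np.array([4.0, 0.8, 12.0])                  # [I, b, m*g*l]
Phi = np.column_stack([qdd, qd, np.sin(q)])              # regressor matrix
tau = Phi @ theta_true + 0.05 * rng.normal(size=len(t))  # noisy torques

theta_hat, *_ = np.linalg.lstsq(Phi, tau, rcond=None)
print(np.round(theta_hat, 3))   # recovered parameters, close to theta_true
```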

9.
An Analysis of Formal Methods for Mobile Computing Based on Temporal Logic
Wei Jun. Computer Science, 2000, 27(6): 22-27
At present, mobile computing is widely considered a new computing paradigm supported by advanced computational technologies. One of the main characteristics of this paradigm is the ability to dynamically change the bindings of hardware and/or software components, that is, mobility. Although many languages have appeared claiming to support mobile computing, the requirements and features of the paradigm are still scarcely understood, so much research is being devoted to formal models and methods for mobile computing. In this paper, we analyze a temporal-logic-based formalism for mobile computing, Mobile UNITY. First, we summarize the features of mobile computing and argue that mobile systems should be decoupled, strongly autonomous, and context-dependent, with new requirements for location transparency. Then we introduce Mobile UNITY and its extensions to UNITY, from syntax to computational semantics, analyzing in particular the correspondence between the language structures of Mobile UNITY and abstractions of mobile features such as location, mobility, and transient interactions. Finally, we conclude by noting some of its deficiencies in supporting mobile computing.

10.
The search for patterns, or motifs, in data represents a problem area of key interest to finance and economics researchers. In this paper, we introduce the motif tracking algorithm (MTA), a novel immune-inspired pattern-identification tool that is able to identify unknown motifs of unspecified length which repeat within time series data. The power of the algorithm comes from the fact that it uses a small number of parameters with minimal assumptions about the data being examined or the underlying motifs. Our interest lies in applying the algorithm to financial time series data to identify the unknown patterns that exist there. The algorithm is tested on three separate data sets, and its particular suitability to financial data is shown by applying it to oil-price data. In all cases, the algorithm identifies the motif population in a fast and efficient manner thanks to an intuitive symbolic representation. The resulting population of motifs is shown to have considerable potential value for other applications such as forecasting and algorithm seeding.
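As an editorial illustration of the symbolic approach the abstract alludes to (a generic SAX-style discretisation, not the MTA itself), the sketch below maps a series to an alphabet and counts repeated words; the bin count, word length, and helper names are assumptions.

```python
# Minimal sketch: symbolic discretisation + repeated-word counting (toy data).
import numpy as np
from collections import Counter

def symbolise(series, bins=4):
    """Map each point to a letter by equal-width amplitude binning."""
    edges = np.linspace(series.min(), series.max(), bins + 1)[1:-1]
    return "".join("abcd"[i] for i in np.digitize(series, edges))

def motifs(series, word=5, min_count=3):
    s = symbolise(series)
    counts = Counter(s[i:i + word] for i in range(len(s) - word + 1))
    return {w: c for w, c in counts.items() if c >= min_count}

rng = np.random.default_rng(3)
x = np.tile(np.sin(np.linspace(0, 2 * np.pi, 20)), 5) + 0.1 * rng.normal(size=100)
print(motifs(x))   # recurring symbolic words mark candidate motifs
```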

11.
Performance metrics and models are prerequisites for scientific understanding and optimization. This paper introduces a new footprint-based theory and reviews the research of the past four decades leading up to it. The review groups past work into metrics and their models, in particular those of reuse distance, metrics conversion, models of shared cache, performance and optimization, and other related techniques.
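As an editorial illustration of the reuse-distance metric the survey reviews: for each access, count the distinct addresses touched since the previous access to the same address (infinite on first use). Real tools use O(log n) tree structures; this linear scan is for clarity only.

```python
# Minimal sketch of reuse-distance computation over a memory trace.
def reuse_distances(trace):
    last = {}                      # address -> index of its previous access
    out = []
    for i, a in enumerate(trace):
        if a in last:
            out.append(len(set(trace[last[a] + 1:i])))  # distinct addresses between
        else:
            out.append(float("inf"))                    # first-ever access
        last[a] = i
    return out

print(reuse_distances(list("abcab")))  # [inf, inf, inf, 2, 2]
```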

12.
The theory of parameterized computation and complexity is a recently developed subarea of theoretical computer science, aimed at practically solving a large number of computational problems that are theoretically intractable. The theory is based on the observation that many intractable computational problems in practice are associated with a parameter that varies within a small or moderate range. Therefore, by taking advantage of such small parameters, many theoretically intractable problems can be solved effectively and practically. On the other hand, the theory has also offered powerful techniques for deriving strong computational lower bounds for many problems, thus explaining why certain theoretically tractable problems cannot be solved effectively and practically. Parameterized computation and complexity has found wide application in areas such as database systems, programming languages, networks, VLSI design, parallel and distributed computing, computational biology, and robotics. This survey gives an overview of the fundamentals, algorithms, techniques, and applications developed in the research on parameterized computation and complexity. We also report the most recent advances and highlights, and discuss further research directions in the area.
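As an editorial illustration of the flagship parameterized algorithm, the sketch below decides k-Vertex-Cover with a bounded search tree in O(2^k * m) time, showing how a small parameter k tames an NP-hard problem; the graph encoding is an assumption.

```python
# Minimal sketch: bounded-search-tree algorithm for k-Vertex-Cover.
def vertex_cover(edges, k):
    """Return a cover of size <= k as a set, or None if none exists."""
    edges = [e for e in edges if e[0] != e[1]]   # drop self-loops
    if not edges:
        return set()                             # nothing left to cover
    if k == 0:
        return None                              # budget exhausted
    u, v = edges[0]                              # some endpoint must be picked
    for pick in (u, v):
        rest = [e for e in edges if pick not in e]
        sub = vertex_cover(rest, k - 1)
        if sub is not None:
            return sub | {pick}
    return None

print(vertex_cover([(1, 2), (2, 3), (3, 4)], 2))  # {1, 3}
print(vertex_cover([(1, 2), (3, 4), (5, 6)], 2))  # None: 3 vertices needed
```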

13.
This paper studies passivity-based consensus analysis and synthesis for a class of stochastic multi-agent systems with switching topologies. Based on Lyapunov methods, stochastic theory, and graph theory, different new storage Lyapunov functions are proposed to derive sufficient conditions for mean-square exponential consensus and stochastic passivity of multi-agent systems under two different switching cases. By designing passive time-varying consensus protocols, solvability conditions for the passivity-based consensus protocol synthesis problem, i.e., passification, are derived using linearization techniques. Numerical simulations are provided to illustrate the effectiveness of the proposed methods.
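As an editorial illustration of the setting only (not the paper's stochastic passivity analysis or passification design), the sketch below simulates a basic discrete-time consensus iteration under a switching topology with small additive noise; the Laplacians, step size, and switching signal are assumptions.

```python
# Minimal sketch: consensus under switching topologies (assumed graphs).
import numpy as np

L1 = np.array([[1, -1, 0, 0], [-1, 2, -1, 0],   # Laplacian of path 1-2-3-4
               [0, -1, 2, -1], [0, 0, -1, 1]])
L2 = np.array([[1, 0, 0, -1], [0, 1, -1, 0],    # Laplacian of pairs (1,4),(2,3)
               [0, -1, 1, 0], [-1, 0, 0, 1]])
rng = np.random.default_rng(4)
x = rng.normal(size=4)                # initial agent states
eps = 0.3                             # step size, below 1/(max degree)

for k in range(200):
    L = L1 if k % 2 == 0 else L2      # arbitrary periodic switching signal
    x = x - eps * (L @ x) + 0.01 * rng.normal(size=4)   # noisy consensus step
print(np.round(x, 3))                 # states cluster near a common value
```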

14.
Foreword
This special issue of the Journal of Computer Science and Technology is devoted to Bioinformatics. This emerging research field connects classical topics in computer science, including the theory of algorithms, database theory, artificial intelligence, systems theory and others, to the solution of computational problems arising in biology.

15.
Mining frequent patterns from datasets is one of the key successes of data mining research. Currently, most studies focus on data sets whose elements are independent, such as the items in a market basket. However, objects in the real world often have close relationships with each other, and extracting frequent patterns from these relations is the objective of this paper. The authors use graphs to model the relations and select a simple type for analysis. Combining graph theory with algorithms for generating frequent patterns, a new algorithm called Topology, which can mine these graphs efficiently, is proposed. The performance of the algorithm is evaluated through experiments with synthetic datasets and real data. The experimental results show that Topology does the job well. At the end of the paper, potential improvements are mentioned.
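As an editorial illustration of frequent-pattern counting on graph data (a much-simplified stand-in, not the Topology algorithm itself), the sketch below counts labelled edges that occur in at least `min_sup` of the graphs; real graph miners extend this to larger subgraph patterns.

```python
# Minimal sketch: support counting of labelled edges across a graph dataset.
from collections import Counter

def frequent_edges(graphs, min_sup=2):
    support = Counter()
    for g in graphs:                    # g: set of labelled edges
        for edge in set(g):
            support[edge] += 1          # one count per graph containing edge
    return [e for e, c in support.items() if c >= min_sup]

g1 = {("A", "B"), ("B", "C")}
g2 = {("A", "B"), ("C", "D")}
g3 = {("A", "B"), ("B", "C")}
print(frequent_edges([g1, g2, g3]))     # ('A','B') and ('B','C') are frequent
```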

16.
Preface
This is a special issue with special interest in computability and computational complexity, with extensions to information and network analysis. It consists of seven invited submissions, each of which has been carefully reviewed. The paper "The future of computer science" by John E. Hopcroft, Sucheta Soundarajan, and Liaoruo Wang surveys some important topics and paradigm shifts in computer science, with a new focus on large-scale data and network analysis. The paper "Holographic reduction: A domain changed application and its partial converse theorems" by Mingji Xia raises the new problem of the converse of Valiant's holographic reductions and partially answers it, making a significant contribution to the theory of holographic computation. The paper "On strings with trivial Kolmogorov complexity" by George Barmpalias studies the degree-theoretic complexity of sets of strings with low algorithmic complexity, establishing a few fundamental results in the area of algorithmic randomness. The paper "Color-coding and its applications: A survey" by Jianxin Wang, Qilong Feng, and Jianer Chen gives a comprehensive survey of this important topic in parameterized computation. The paper "Rent-or-buy network design problem and the Sample-Augment algorithm: A survey" by Peng Zhang surveys approximation algorithms for the recently active topic of rent-or-buy network design. The paper "Kalimulin pairs of ω-enumeration degrees" by Ivan N. Soskov and Mariya I. Soskov studies the new phenomena of ω-enumeration degrees, a natural extension of the classical enumeration degrees. The paper "A toolkit for generating sentences from context-free grammars" by Zhiwu Xu, Lixiao Zheng, and Haiming Chen presents a toolkit for parsing context-free grammars and generating valid sentences. I would like to thank all the authors for their contributions to this excellent issue, which reflects both our traditional research on computability and computational complexity and the extensions of the theory to information and networks. I am grateful to Prof. Ruqian Lu for inviting me to organize this special issue, and I thank both Prof. Ruqian Lu and the members of the Editorial Board of the International Journal of Software and Informatics for their cooperation throughout the preparation of this special issue.

17.
The Joint Conference is organized in this format for the seventh time. The main goal of the conference is to provide a multidisciplinary forum for researchers from industry and academia to discuss state-of-the-art topics in system theory, control, and computing, and to present recent research results and prospects for development in this evolving area. The ICSTCC conference emerged from the fusion of three conferences that were separately organized by University

18.
This paper gives a brief introduction to the principle of principal component analysis (PCA). Then, according to information-entropy theory and making full use of the inherent characteristics of the eigenvalues of the data's correlation matrix, the second information function, the second information entropy, and the geometry entropy under PCA are first proposed, by which the information features of PCA are quantified. In addition, two new concepts, information rate (IR) and accumulated information rate (AIR), are proposed to describe the degree of information-feature extraction achieved by PCA. Finally, through a simulated practical application, the results show that the proposed method is efficient and satisfactory. It provides a new approach to information-feature extraction for pattern recognition, machine learning, data mining, and so on.
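As an editorial illustration of one natural reading of IR and AIR (the paper's exact entropy-based definitions may differ), the sketch below takes each correlation-matrix eigenvalue's share of the total eigenvalue mass as the information rate and its running sum as the accumulated rate.

```python
# Minimal sketch: eigenvalue-share reading of IR and AIR under PCA (toy data).
import numpy as np

rng = np.random.default_rng(5)
X = rng.normal(size=(500, 4))
X[:, 1] = X[:, 0] + 0.1 * rng.normal(size=500)   # inject a correlated feature
R = np.corrcoef(X, rowvar=False)                 # correlation matrix of data
eig = np.sort(np.linalg.eigvalsh(R))[::-1]       # eigenvalues, descending

ir = eig / eig.sum()                             # information rate per component
air = np.cumsum(ir)                              # accumulated information rate
print(np.round(ir, 3), np.round(air, 3))
# Components are retained until AIR exceeds a chosen threshold, e.g. 0.95.
```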

19.
20.
In the last decade, financial market forecasting has attracted great interest among researchers in pattern recognition. Usually, the data used for analysing the market, and then betting on its future trend, are provided as time series; this aspect, along with the high fluctuation of this kind of data, rules out the direct use of very efficient and popular classification tools such as the well-known convolutional neural network (CNN) models Inception, ResNet, AlexNet, and so on, forcing researchers to train new tools from scratch. Such operations can be very time consuming. This paper exploits an ensemble of CNNs, trained over Gramian angular field (GAF) images generated from time series related to the Standard & Poor's 500 index future, with the aim of predicting the future trend of the U.S. market. A multi-resolution imaging approach is used to feed each CNN, enabling the analysis of different time intervals for a single observation. A simple trading system based on the ensemble forecaster is used to evaluate the quality of the proposed approach. Our method outperforms the buy-and-hold (B&H) strategy in a time frame where the latter provides excellent returns. Both quantitative and qualitative results are provided.
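As an editorial illustration of the GAF encoding the abstract describes, the sketch below computes the Gramian angular summation field: rescale the series to [-1, 1], map values to angles, and form the pairwise cosine-sum matrix that serves as the image input; the toy price series is an assumption.

```python
# Minimal sketch: Gramian angular summation field (GASF) of a time series.
import numpy as np

def gasf(series):
    lo, hi = series.min(), series.max()
    s = (2 * series - hi - lo) / (hi - lo)        # rescale into [-1, 1]
    phi = np.arccos(np.clip(s, -1, 1))            # polar (angular) encoding
    return np.cos(phi[:, None] + phi[None, :])    # G[i, j] = cos(phi_i + phi_j)

prices = np.cumsum(np.random.default_rng(6).normal(size=64))  # toy series
img = gasf(prices)
print(img.shape)   # (64, 64) image; one such image per time resolution
```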
