Similar Documents
20 similar documents found.
1.
This paper investigates the application of evolutionary multi-objective optimization to two-dimensional procedural texture synthesis. Genetic programming is used to evolve procedural texture formulae. Earlier work used multiple feature tests during fitness evaluation to rate how closely a candidate texture matches visual characteristics of a target texture image. These feature test scores were combined into an overall fitness score using a weighted sum. This paper improves on that research by replacing the weighted sum with a Pareto ranking scheme, which preserves the independence of feature tests during fitness evaluation. Three experiments were performed: a pure Pareto ranking scheme, and two Pareto experiments enhanced with parameterless population divergence strategies. One divergence strategy is similar to that used by the NSGA-II system, and scores individuals using their nearest-neighbour distance in feature space. The other strategy uses a normalized, ranked abstraction of nearest-neighbour distance. A result of this work is that acceptable textures can be evolved much more efficiently, and with less user intervention, with multi-objective evolution than with the weighted-sum approach. Although the final acceptability of a texture is ultimately a subjective decision of the user, the proposed use of multi-objective evolution is useful for generating for the user a diverse assortment of possibilities that reflect the important features of interest. Brian J. Ross, Ph.D.: He is a professor of computer science at Brock University, where he has worked since 1992. He obtained his B.C.Sc. at the University of Manitoba, Canada in 1984, his M.Sc. at the University of British Columbia, Canada in 1988 and his Ph.D. at the University of Edinburgh, Scotland in 1992. His research interests include evolutionary computation, machine learning, language induction, concurrency, computer graphics, computer music and logic programming. Han Zhu, M.Sc.: She is a programmer analyst at Total System Service Company, where she has worked since 2003. She obtained her B.Sc. at Brock University, Canada, in 2002, and her M.Sc. at the University of Western Ontario, Canada, in 2003.
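To make the ranking scheme above concrete, here is a minimal Python sketch (not the authors' implementation) that assigns Pareto ranks to candidates scored on several feature tests and breaks ties with a nearest-neighbour distance in feature space; the toy population and all function names are assumptions for illustration.

```python
import math

def dominates(a, b):
    """a dominates b if it is no worse on every feature test and better on at least one.
    Lower scores are assumed to be better."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_ranks(scores):
    """Rank 0 = non-dominated front; later fronts are ranked after removing earlier ones."""
    remaining = set(range(len(scores)))
    ranks, rank = {}, 0
    while remaining:
        front = {i for i in remaining
                 if not any(dominates(scores[j], scores[i]) for j in remaining if j != i)}
        for i in front:
            ranks[i] = rank
        remaining -= front
        rank += 1
    return ranks

def nearest_neighbour_distance(scores):
    """Divergence bonus: distance in feature space to the closest other individual."""
    return [min(math.dist(a, b) for j, b in enumerate(scores) if j != i)
            for i, a in enumerate(scores)]

# Toy population: each row is the feature-test error vector of one evolved texture formula.
population = [(0.2, 0.9), (0.3, 0.4), (0.8, 0.1), (0.5, 0.5), (0.9, 0.9)]
ranks = pareto_ranks(population)
crowding = nearest_neighbour_distance(population)
# Selection prefers low Pareto rank, then a large nearest-neighbour distance (more diverse).
order = sorted(range(len(population)), key=lambda i: (ranks[i], -crowding[i]))
print(order)  # -> [0, 2, 1, 3, 4]: non-dominated, diverse candidates first
```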

2.
Evolving dynamic Bayesian networks with Multi-objective genetic algorithms
A dynamic Bayesian network (DBN) is a probabilistic network that models interdependent entities that change over time. Given example sequences of multivariate data, we use a genetic algorithm to synthesize a network structure that models the causal relationships that explain the sequences. We use a multi-objective evaluation strategy with a genetic algorithm. The multi-objective criteria are a network's probabilistic score and structural complexity score. Our use of Pareto ranking is ideal for this application, because it naturally balances the effect of the likelihood and structural simplicity terms used in the BIC network evaluation heuristic. We use a basic structural scoring formula, which tries to keep the number of links in the network approximately equivalent to the number of variables. We also use a simple representation that favors sparsely connected networks similar in structure to those modeling biological phenomena. Our experiments show promising results when evolving networks ranging from 10 to 30 variables, using a maximal connectivity of between 3 and 4 parents per node. The results from the multi-objective GA were superior to those obtained with a single-objective GA. Brian J. Ross is a professor of computer science at Brock University, where he has worked since 1992. He obtained his BCSc at the University of Manitoba, Canada, in 1984, his M.Sc. at the University of British Columbia, Canada, in 1988, and his Ph.D. at the University of Edinburgh, Scotland, in 1992. His research interests include evolutionary computation, language induction, concurrency, and logic programming. He is also interested in computer applications in the fine arts. Eduardo Zuviria received a BS degree in Computer Science from Brock University in 2004 and an MS degree in Computer Science from Queen's University in 2006, where he worked as a teaching and research assistant. Currently, he is attending a Ph.D. program at the University of Montreal. He holds a diploma in electronics from a technical college and has worked for eight years in the computer industry as a software developer and systems administrator. He has received several scholarships, including the Ontario Graduate Scholarship, the Queen's Graduate Scholarship and an NSERC-USRA scholarship.
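A small illustrative sketch of the two objectives described above, assuming a structural penalty that simply measures how far the link count strays from the number of variables; this exact formula and the example numbers are assumptions, not taken from the paper.

```python
def structural_score(num_links, num_variables):
    """Assumed structural penalty: networks whose link count strays from the
    number of variables score worse (the paper's exact formula may differ)."""
    return abs(num_links - num_variables)

def objectives(log_likelihood, num_links, num_variables):
    """Two objectives for Pareto ranking: negated likelihood (minimise) and structure penalty."""
    return (-log_likelihood, structural_score(num_links, num_variables))

# Candidate A fits the data better; candidate B is structurally simpler.
a = objectives(log_likelihood=-120.5, num_links=18, num_variables=10)
b = objectives(log_likelihood=-131.0, num_links=11, num_variables=10)
print(a, b)  # a = (120.5, 8), b = (131.0, 1): neither dominates, so both survive Pareto ranking
```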

3.
Ant colony optimization (ACO for short) is a meta-heuristic for hard combinatorial optimization problems. It is a population-based approach that exploits positive feedback as well as greedy search. In this paper, ideas from genetic algorithms (GA for short) are introduced into ACO to present a new binary-coding-based ant colony optimization. Compared with typical ACO, the algorithm replaces the problem's parameter space with a coding space, which links ACO with GA so that results from GA can be applied to ACO directly. Furthermore, it can solve not only general combinatorial optimization problems, but also other problems such as function optimization. Based on the algorithm, it is proved that if the pheromone remainder factor ρ satisfies ρ≥1, the algorithm is guaranteed to converge to the optimum, whereas if 0<ρ<1, it is not. This work is supported by the Science Foundation of Shanghai Municipal Commission of Science and Technology under Grant No.00JC14052. Tian-Ming Bu received the M.S. degree in computer software and theory from Shanghai University, China, in 2003. He is now a Ph.D. candidate at Fudan University in theoretical computer science. His research interests include algorithms, especially heuristic and parallel algorithms, quantum computing and computational complexity. Song-Nian Yu received the B.S. degree in mathematics from Xi'an University of Science and Technology, Xi'an, China, in 1981, and the Ph.D. degree, under the guidance of Prof. L. Lovasz, from Lorand University, Budapest, Hungary, in 1990. Dr. Yu is a professor in the School of Computer Engineering and Science at Shanghai University. He was a visiting professor in the Department of Computer Science at the Nelson College of Engineering, West Virginia University, from 1998 to 1999. His current research interests include parallel algorithm design and analysis, graph theory, combinatorial optimization, wavelet analysis, and grid computing. Hui-Wei Guan received the B.S. degree in electronic engineering from Shanghai University, China, in 1982, the M.S. degree in computer engineering from China Textile University, China, in 1989, and the Ph.D. degree in computer science and engineering from Shanghai Jiaotong University, China, in 1993. He is an associate professor in the Department of Computer Science at North Shore Community College, USA. He is a member of IEEE. His current research interests are parallel and distributed computing, high performance computing, distributed databases, massively parallel processing systems, and intelligent control.
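The following Python sketch shows one plausible reading of a binary-coded ACO, with a pheromone value per (bit position, bit value) and an evaporation step controlled by the remainder factor ρ; the parameter values and the OneMax objective are illustrative assumptions, not the paper's algorithm.

```python
import random

def binary_aco(fitness, n_bits=16, n_ants=20, n_iters=50, rho=0.9, q=1.0, seed=0):
    """Minimal binary-coded ant colony optimisation sketch (illustrative only).
    rho is the pheromone remainder factor mentioned in the abstract."""
    rng = random.Random(seed)
    tau = [[1.0, 1.0] for _ in range(n_bits)]   # tau[i][b]: pheromone for bit i taking value b
    best, best_fit = None, float("-inf")
    for _ in range(n_iters):
        ants = []
        for _ in range(n_ants):
            bits = [1 if rng.random() < tau[i][1] / (tau[i][0] + tau[i][1]) else 0
                    for i in range(n_bits)]
            ants.append((fitness(bits), bits))
        it_best_fit, it_best = max(ants, key=lambda a: a[0])
        if it_best_fit > best_fit:
            best_fit, best = it_best_fit, it_best
        for i in range(n_bits):
            tau[i][0] *= rho                     # evaporation keeps a fraction rho of the trail
            tau[i][1] *= rho
            tau[i][it_best[i]] += q              # reinforce the iteration-best solution
    return best, best_fit

# Toy objective: count of ones (OneMax).
print(binary_aco(lambda bits: sum(bits)))
```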

4.
A database session is a sequence of requests presented to the database system by a user or an application to achieve a certain task. Session identification is an important step in discovering useful patterns from database trace logs. The discovered patterns can be used to improve the performance of database systems by prefetching predicted queries, rewriting the current query or conducting effective cache replacement. In this paper, we present an application of a new session identification method based on statistical language modeling to database trace logs. Several problems of the language-modeling-based method are revealed in the application, including how to select values for the parameters of the language model, how to evaluate the accuracy of the session identification result, and how to learn a language model without well-labeled training data. All of these issues are important in the successful application of the language-modeling-based method for session identification. We propose solutions to these open issues. In particular, new methods for determining an entropy threshold and the order of the language model are proposed. New performance measures are presented to better evaluate the accuracy of the identified sessions. Furthermore, three types of learning methods, namely learning from labeled data, learning from semi-labeled data and learning from unlabeled data, are introduced to learn language models from different types of training data. Finally, we report experimental results that show the effectiveness of the language-model-based method for identifying sessions from the trace logs of an OLTP database application and the TPC-C Benchmark. Xiangji Huang joined York University as an Assistant Professor in July 2003 and became a tenured Associate Professor in May 2006. Previously, he was a Post Doctoral Fellow at the School of Computer Science, University of Waterloo, Canada. He did his Ph.D. in Information Science at City University in London, England, with Professor Stephen E. Robertson. Before he went into his Ph.D. program, he worked as a lecturer for four years at Wuhan University. He also worked for three and a half years in the financial industry in Canada doing e-business, where he was awarded a CIO Achievement Award. He has published more than 50 refereed papers in journals, book chapters and conference proceedings. His Master's (M.Eng.) and Bachelor's (B.Eng.) degrees were in Computer Organization & Architecture and Computer Engineering, respectively. His research interests include information retrieval, data mining, natural language processing, bioinformatics and computational linguistics. Qingsong Yao is a Ph.D. student in the Department of Computer Science and Engineering at York University, Toronto, Canada. His research interests include database management systems and query optimization, data mining, information retrieval, natural language processing and computational linguistics. He earned his Master's degree in Computer Science from the Institute of Software, Chinese Academy of Sciences, in 1999 and his Bachelor's degree in Computer Science from Tsinghua University. Aijun An is an associate professor in the Department of Computer Science and Engineering at York University, Toronto, Canada. She received her Bachelor's and Master's degrees in Computer Science from Xidian University in China. She received her PhD degree in Computer Science from the University of Regina in Canada in 1997.
She worked at the University of Waterloo as a postdoctoral fellow from 1997 to 1999 and as a research assistant professor from 1999 to 2001. She joined York University in 2001. She has published more than 60 papers in refereed journals and conference proceedings. Her research interests include data mining, machine learning, and information retrieval.
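As a rough illustration of entropy-threshold session identification (not the paper's algorithm), the sketch below trains a smoothed bigram model over query templates and opens a new session whenever a request's surprise exceeds a threshold; the smoothing scheme, the threshold value and the query names are all assumptions.

```python
import math
from collections import defaultdict

class BigramSessionizer:
    """Toy language-model-based session identification: a bigram model over query
    templates, with a session boundary declared when per-request surprise (in bits)
    exceeds a threshold."""

    def __init__(self, threshold_bits=2.0, alpha=0.1):
        self.threshold = threshold_bits
        self.alpha = alpha                    # add-alpha smoothing (assumed)
        self.bigram = defaultdict(lambda: defaultdict(float))
        self.vocab = set()

    def train(self, sessions):
        for s in sessions:
            for prev, cur in zip(["<s>"] + s[:-1], s):
                self.bigram[prev][cur] += 1
                self.vocab.update((prev, cur))

    def surprise(self, prev, cur):
        counts = self.bigram[prev]
        total = sum(counts.values()) + self.alpha * len(self.vocab)
        p = (counts.get(cur, 0.0) + self.alpha) / total if total else 1.0
        return -math.log2(p)

    def split(self, log):
        sessions, current, prev = [], [], "<s>"
        for q in log:
            if current and self.surprise(prev, q) > self.threshold:
                sessions.append(current)      # high surprise => start a new session
                current = []
            current.append(q)
            prev = q
        if current:
            sessions.append(current)
        return sessions

model = BigramSessionizer()
model.train([["login", "browse", "checkout"], ["login", "search", "logout"]])
print(model.split(["login", "browse", "checkout", "login", "search", "logout"]))
# -> [['login', 'browse', 'checkout'], ['login', 'search', 'logout']]
```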

5.
In this paper, an evolutionary classifier fusion method inspired by biological evolution is presented to optimize the performance of a face recognition system. Initially, different illumination environments are modeled as multiple contexts using unsupervised learning, and an optimized classifier ensemble is then searched for each context using a Genetic Algorithm (GA). For each context, multiple optimized classifiers are searched; each of these is referred to as a context-based classifier. An evolutionary framework comprising a combination of these classifiers is then applied to optimize face recognition as a whole. Evolutionary classifier fusion is compared with a simple adaptive system. Experiments are carried out using the Inha database and the FERET database. Experimental results show that the proposed evolutionary classifier fusion method gives superior performance over methods that do not use evolutionary fusion. Recommended by Guest Editor Daniel Howard. This work was supported by an INHA UNIVERSITY Research Grant. Zhan Yu received the B.E. degree in Software Engineering from Xiamen University, China, in 2008. He is currently a master's student in the Intelligent Technology Lab, Computer and Information Department, Inha University, Korea. His research interests include image processing, pattern recognition, computer vision, machine learning, statistical inference and computing. Mi Young Nam received the B.Sc. and M.Sc. degrees in Computer Science from Silla University, Busan, Korea in 1995 and 2001 respectively, and the Ph.D. degree in Computer Science & Engineering from Inha University, Korea in 2006. She is currently a post-doctoral researcher in the Intelligent Technology Laboratory, Inha University, Korea. Her research interests include biometrics, pattern recognition, computer vision and image processing. Suman Sedai received the M.S. degree in Software Engineering from Inha University, Korea, in 2008. He is currently a doctoral student at the University of Western Australia, Australia. His research interests include image processing, pattern recognition, computer vision and machine learning. Phill Kyu Rhee received the B.S. degree in Electrical Engineering from Seoul University, Seoul, Korea, the M.S. degree in Computer Science from East Texas State University, Commerce, TX, and the Ph.D. degree in Computer Science from the University of Louisiana, Lafayette, LA, in 1982, 1986, and 1990 respectively. During 1982–1985 he worked at the System Engineering Research Institute, Seoul, Korea as a research scientist. In 1991 he joined the Electronic and Telecommunication Research Institute, Seoul, Korea, as a Senior Research Staff member. Since 1992 he has been an Associate Professor in the Department of Computer Science and Engineering of Inha University, Incheon, Korea, and since 2001 he has been a Professor in the same department. His current research interests are pattern recognition, machine intelligence, and parallel computer architecture. Dr. Rhee is a member of the IEEE Computer Society and KISS (Korea Information Science Society).
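A toy sketch of GA-based ensemble selection for one context, under the assumption that the chromosome is a bitmask over the available classifiers and fitness is majority-vote accuracy on held-out data; the paper's actual encoding and fitness function are not specified here.

```python
import random

def evolve_ensemble(classifiers, val_x, val_y, pop=20, gens=30, p_mut=0.1, seed=0):
    """Toy GA searching for a subset (bitmask) of classifiers whose majority vote
    performs best on validation data for one illumination context."""
    rng = random.Random(seed)

    def accuracy(mask):
        chosen = [c for c, m in zip(classifiers, mask) if m]
        if not chosen:
            return 0.0
        correct = 0
        for x, y in zip(val_x, val_y):
            votes = [c(x) for c in chosen]
            if max(set(votes), key=votes.count) == y:   # majority vote
                correct += 1
        return correct / len(val_y)

    population = [[rng.randint(0, 1) for _ in classifiers] for _ in range(pop)]
    for _ in range(gens):
        parents = sorted(population, key=accuracy, reverse=True)[: pop // 2]
        children = []
        while len(children) < pop - len(parents):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, len(a))              # one-point crossover
            child = a[:cut] + b[cut:]
            children.append([1 - g if rng.random() < p_mut else g for g in child])
        population = parents + children
    return max(population, key=accuracy)

# Toy context: three weak threshold classifiers on a single feature.
clfs = [lambda x: int(x > 0.3), lambda x: int(x > 0.5), lambda x: int(x > 0.9)]
xs = [0.1, 0.4, 0.6, 0.8, 0.95]
ys = [0, 0, 1, 1, 1]
print(evolve_ensemble(clfs, xs, ys))
```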

6.
This paper examines two seemingly unrelated qualitative spatial reasoning domains: geometric proportional analogies and topographic (land-cover) maps. We present a Structure Matching algorithm that combines Gentner's structure-mapping theory with an attribute-matching process. We use structure matching to solve geometric analogy problems that involve manipulating attribute information, such as colors and patterns. Structure matching is also used to creatively interpret topographic (land-cover) maps, adding a wealth of semantic knowledge and providing a far richer interpretation of the raw data. We return to the geometric proportional analogies and identify the alternative attribute-matching processes that are required to solve different categories of problems. Finally, we assess some implications for computationally creative and inventive models. Diarmuid P. O'Donoghue, Ph.D.: He received his B.Sc. and M.Sc. from University College Cork in 1988 and 1990, and his Ph.D. from University College Dublin. He has been a lecturer at the Department of Computer Science, NUI Maynooth, since 1996 and is also an associate of the National Centre for Geocomputation. His interests are in artificial intelligence, analogical reasoning, topology, and qualitative spatial reasoning. Amy Bohan, B.Sc., M.Sc.: She received her B.Sc. from the National University of Ireland, Maynooth in 2000. She received her M.Sc. in 2003 from University College Dublin, where she also recently completed her Ph.D. She is a member of the Cognitive Science Society. Her interests are in cognitive science, analogical argumentation, geometric proportional analogies and computational linguistics. Prof. Mark T. Keane: He is Chair of Computer Science and Associate Dean of Science at University College Dublin. He is also Director of ICT at Science Foundation Ireland. Prof. Keane has made significant contributions in the areas of analogy, case-based reasoning and creativity. He has published over 100 publications, including 16 books, that are cited widely. He is co-author of a Cognitive Science textbook, written with Mike Eysenck (University of London), that has been translated into Portuguese, Hungarian, Italian and Chinese and is now entering its fifth edition. Prof. Keane is a fellow of ECCAI (European Co-ordinating Committee on Artificial Intelligence) and received the Special Award for Merit from the Psychology Society of Ireland for his work on human creativity.

7.
Merging uncertain information with semantic heterogeneity in XML
Semistructured information can be merged in a logic-based framework [6, 7]. This framework has been extended to deal with uncertainty, in the form of probability values, degrees of belief, or necessity measures, associated with leaves (i.e. text entries) in XML documents [3]. In this paper we further extend this approach to modelling and merging uncertain information that is defined at different levels of granularity of XML text entries, and to modelling and reasoning with XML documents that contain semantically heterogeneous uncertain information on more complex elements in XML subtrees. We present the formal definitions for modelling, propagating and merging semantically heterogeneous uncertain information and explain how they can be handled using logic-based fusion techniques. Anthony Hunter received a B.Sc. (1984) from the University of Bristol and an M.Sc. (1987) and Ph.D. (1992) from Imperial College, London. He is currently a reader in the Department of Computer Science at University College London. His main research interests are knowledge representation and reasoning, analysing inconsistency, argumentation, default reasoning and knowledge fusion. Weiru Liu is a senior lecturer at the School of Computer Science, Queen's University Belfast. She received her B.Sc. and M.Sc. degrees in Computer Science from Jilin University, P.R. China, and her Ph.D. degree in Artificial Intelligence from the University of Edinburgh. Her main research interests include reasoning under uncertainty, knowledge representation and reasoning, uncertain knowledge and information fusion, and knowledge discovery in databases. She has published over 50 journal and conference papers in these areas.

8.
The present contribution describes a potential application of Grid Computing in Bioinformatics. High-resolution structure determination of biological specimens is critical in the biosciences for understanding biological function. The problem is computationally intensive, and distributed and Grid Computing are thus becoming essential. This contribution analyzes the use of Grid Computing and its potential benefits in the field of electron microscope tomography of biological specimens. Jose-Jesus Fernandez, Ph.D.: He received his M.Sc. and Ph.D. degrees in Computer Science from the University of Granada, Spain, in 1992 and 1997, respectively. He was a Ph.D. student at the Bio-Computing unit of the National Center for Biotechnology (CNB) of the Spanish National Council of Scientific Research (CSIC), Madrid, Spain. He became an Assistant Professor in 1997 and, subsequently, Associate Professor in 2000 in Computer Architecture at the University of Almeria, Spain. He is a member of the supercomputing-algorithms research group. His research interests include high performance computing (HPC), image processing and tomography. Jose-Roman Bilbao-Castro: He received his M.Sc. degree in Computer Science from the University of Almeria in 2001. He is currently a Ph.D. student at the BioComputing unit of the CNB (CSIC) through a CSIC Ph.D. grant, in conjunction with the Dept. of Computer Architecture at the University of Malaga (Spain). His current research interests include tomography, HPC, and distributed and grid computing. Roberto Marabini, Ph.D.: He received the M.Sc. (1989) and Ph.D. (1995) degrees in Physics from the University Autonoma de Madrid (UAM) and the University of Santiago de Compostela, respectively. He was a Ph.D. student at the BioComputing Unit at the CNB (CSIC). He worked at the University of Pennsylvania and the City University of New York from 1998 to 2002. At present he is an Associate Professor at the UAM. His current research interests include inverse problems, image processing and HPC. Jose-Maria Carazo, Ph.D.: He received the M.Sc. degree from Granada University, Spain, in 1981, and his Ph.D. in Molecular Biology from the UAM in 1984. He left for Albany, NY, in 1986, coming back to Madrid in 1989 to set up the BioComputing Unit of the CNB (CSIC). He was involved in the Spanish Ministry of Science and Technology as Deputy General Director for Research Planning. Currently, he remains engaged in his activities at the CNB, the Scientific Park of Madrid and Integromics S.L. Immaculada Garcia, Ph.D.: She received her B.Sc. (1977) and Ph.D. (1986) degrees in Physics from the Complutense University of Madrid and the University of Santiago de Compostela, respectively. From 1977 to 1987 she was an Assistant Professor at the University of Granada, from 1987 to 1996 an Associate Professor at the University of Almeria, and since 1997 she has been a Full Professor and head of the Dept. of Computer Architecture. She is head of the supercomputing-algorithms research group. Her research interest lies in HPC for irregular problems related to image processing, global optimization and matrix computation.

9.
We suggest the use of ranking-based evaluation measures for regression models, as a complement to the commonly used residual-based evaluation. We argue that in some cases, such as the case study we present, ranking can be the main underlying goal in building a regression model, and ranking performance is then the correct evaluation metric. However, even when ranking is not the contextually correct performance metric, the measures we explore still have significant advantages: they are robust against extreme outliers in the evaluation set, and they are interpretable. The two measures we consider correspond closely to non-parametric correlation coefficients commonly used in data analysis (Spearman's ρ and Kendall's τ), and they both have interesting graphical representations which, similarly to ROC curves, offer various useful views of model performance in addition to a one-number summary in the area under the curve. An interesting extension which we explore is to evaluate models on their performance in "partially" ranking the data, which we argue can better represent the utility of the model in many cases. We illustrate our methods on a case study of evaluating IT Wallet size estimation models for IBM's customers. Saharon Rosset is a Research Staff Member in the Data Analytics Research Group at IBM's T. J. Watson Research Center. He received his B.S. in Mathematics and M.Sc. in Statistics from Tel Aviv University in Israel, and his Ph.D. in Statistics from Stanford University in 2003. In his research, he aspires to develop practically useful predictive modeling methodologies and tools, and to apply them to solve problems in business and scientific domains. Currently, his major projects include work on customer wallet estimation and analysis of genetic data. Claudia Perlich received an M.Sc. in Computer Science from the University of Colorado at Boulder, a Diploma in Computer Science from the Technische Universitaet Darmstadt, and her Ph.D. in Information Systems from the Stern School of Business, New York University. Her Ph.D. thesis concentrated on probability estimation in multi-relational domains that capture information about multiple entity types and the relationships between them. Her dissertation was recognized as an additional winner of the International SAP Doctoral Support Award Competition. Claudia joined the Data Analytics Research group at IBM's T.J. Watson Research Center as a Research Staff Member in October 2004. Her research interests are in statistical machine learning for complex real-world domains and business applications. Bianca Zadrozny is currently an associate professor in the Computer Science Department of Federal Fluminense University in Brazil. Her research interests are in the areas of applied machine learning and data mining. She received her B.Sc. in Computer Engineering from the Pontifical Catholic University in Rio de Janeiro, Brazil, and her M.Sc. and Ph.D. in Computer Science from the University of California at San Diego. She has also worked as a research staff member in the data analytics research group at the IBM T.J. Watson Research Center.
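For readers who want the two coefficients side by side, here is a small self-contained Python sketch that computes Spearman's ρ (Pearson correlation of the ranks) and a simple Kendall's τ (ties ignored) for a toy set of wallet-size estimates; the numbers are invented for illustration.

```python
def ranks(values):
    """Average ranks (1-based); ties share the mean of their rank positions."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman_rho(y_true, y_pred):
    """Pearson correlation of the two rank vectors."""
    rt, rp = ranks(y_true), ranks(y_pred)
    n = len(rt)
    mt, mp = sum(rt) / n, sum(rp) / n
    cov = sum((a - mt) * (b - mp) for a, b in zip(rt, rp))
    vt = sum((a - mt) ** 2 for a in rt) ** 0.5
    vp = sum((b - mp) ** 2 for b in rp) ** 0.5
    return cov / (vt * vp)

def kendall_tau(y_true, y_pred):
    """(concordant - discordant) pairs over all pairs (ties ignored for brevity)."""
    n = len(y_true)
    conc = sum((y_true[i] - y_true[j]) * (y_pred[i] - y_pred[j]) > 0
               for i in range(n) for j in range(i))
    disc = sum((y_true[i] - y_true[j]) * (y_pred[i] - y_pred[j]) < 0
               for i in range(n) for j in range(i))
    return (conc - disc) / (n * (n - 1) / 2)

actual = [120, 15, 300, 80, 45]       # e.g. realised wallet sizes (toy values)
predicted = [100, 30, 280, 60, 90]    # model estimates; only their ordering matters here
print(spearman_rho(actual, predicted), kendall_tau(actual, predicted))  # 0.9, 0.8
```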

10.
We present a monitoring system which detects repeated packets in network traffic, and which has applications including detecting computer worms. It uses Bloom filters with counters. The system analyzes traffic in the routers of a network. Our preliminary evaluation of the system involved traffic from our internal lab and a well-known historical data set. After appropriate configuration, no false alarms are obtained under these data sets, and we expect that low false-alarm rates are possible in many network environments. We also conduct simulations using real Internet Service Provider topologies with realistic link delays and simulated traffic. These simulations confirm that this approach can detect worms at early stages of propagation. We believe our approach, with minor adaptations, is of independent interest for use in a number of network applications which benefit from detecting repeated packets, beyond detecting worm propagation. These include detecting network anomalies such as dangerous traffic fluctuations, abusive use of certain services, and some distributed denial-of-service attacks. P. van Oorschot (Ph.D. Waterloo, 1988) is a Professor in the School of Computer Science at Carleton University, and Canada Research Chair in Network and Software Security. He is the founding director of Carleton's Digital Security Group. He has worked in research and development in cryptography and network security, including at Bell-Northern Research (Ottawa) and at Entrust Technologies (Ottawa) as VP and Chief Scientist. He is coauthor of the standard reference Handbook of Applied Cryptography. His current research interests include authentication and identity management, network security, software protection, and security infrastructures. J.-M. Robert is a Principal Security Researcher at Alcatel in Ottawa, Ontario. His research interests are network and telecom infrastructure security, focusing mainly on denial-of-service attacks and worm propagation. Previously, Dr. Robert worked as Security Director for the North American Development Center of Gemplus International as well as a Professor at the Université du Québec à Chicoutimi. Dr. Robert received a Ph.D. in Computer Science from McGill University. M. Vargas Martin is an Assistant Professor at the University of Ontario Institute of Technology (Oshawa, Canada), with faculty appointments in Business and Information Technology as well as Engineering and Applied Science. He was previously a post-doctoral researcher at Carleton University supported in part by Alcatel Canada. He holds a Ph.D. in Computer Science (Carleton University, 2002), a Master's degree in Electrical Engineering (Cinvestav, Mexico, 1998), and a Bachelor of Computer Science (Universidad Autónoma de Aguascalientes, Mexico, 1996). His current research interests include network and host-based intrusion detection and reaction, mitigation of denial-of-service (DoS) and distributed DoS attacks, Web modeling and optimization, Internet connectivity, and interconnection protocols.
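A minimal counting-Bloom-filter sketch in Python of the repeated-packet idea described above; the filter size, hash construction and alarm threshold are illustrative assumptions, not the authors' configuration.

```python
import hashlib

class CountingBloomDetector:
    """Counting Bloom filter that flags payloads seen at least `threshold` times
    (with some false-positive probability due to counter sharing)."""

    def __init__(self, size=1 << 16, hashes=4, threshold=3):
        self.size = size
        self.hashes = hashes
        self.threshold = threshold            # sightings before raising an alarm
        self.counters = [0] * size

    def _indexes(self, payload: bytes):
        digest = hashlib.sha256(payload).digest()
        for k in range(self.hashes):          # derive k independent-ish indexes
            chunk = digest[4 * k: 4 * k + 4]
            yield int.from_bytes(chunk, "big") % self.size

    def observe(self, payload: bytes) -> bool:
        """Record one sighting; return True if the payload appears to be repeating."""
        idx = list(self._indexes(payload))
        for i in idx:
            self.counters[i] += 1
        # The minimum counter is an upper bound on this payload's true count.
        return min(self.counters[i] for i in idx) >= self.threshold

detector = CountingBloomDetector()
worm_like = b"GET /default.ida?XXXXXXXX"      # hypothetical repeated payload
print([detector.observe(worm_like) for _ in range(4)])   # [False, False, True, True]
print(detector.observe(b"GET /index.html"))              # fresh payload -> False (w.h.p.)
```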

11.
Data extraction from the web based on pre-defined schema
With the development of the Internet, the World Wide Web has become an invaluable information source for most organizations. However, most documents available from the Web are in HTML form, which was originally designed for document formatting with little consideration of content. Effectively extracting data from such documents remains a non-trivial task. In this paper, we present a schema-guided approach to extracting data from HTML pages. Under this approach, the user defines a schema specifying what is to be extracted and provides sample mappings between the schema and the HTML page. The system then induces the mapping rules and generates a wrapper that takes the HTML page as input and produces the required data in the form of XML conforming to the user-defined schema. A prototype system implementing the approach has been developed. Preliminary experiments indicate that the proposed semi-automatic approach is not only easy to use but also able to produce a wrapper that extracts the required data from input pages with high accuracy.
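A toy Python sketch of the schema-guided idea: a left/right delimiter rule is induced from one user-provided sample mapping and then applied as a wrapper that emits XML. This landmark-style induction is an assumed simplification for illustration, not the paper's algorithm.

```python
import re
from xml.etree.ElementTree import Element, SubElement, tostring

def induce_rule(sample_html, sample_value):
    """Induce a left/right delimiter rule from one sample mapping (toy heuristic:
    take up to 12 characters of context on each side of the example value)."""
    pos = sample_html.index(sample_value)
    left = sample_html[max(0, pos - 12): pos]
    right = sample_html[pos + len(sample_value): pos + len(sample_value) + 12]
    return re.compile(re.escape(left) + r"(.*?)" + re.escape(right), re.S)

def wrap(html, rules, root_tag="records"):
    """Apply the induced rules to a page and emit XML conforming to the user's schema."""
    root = Element(root_tag)
    for field, rule in rules.items():
        for match in rule.findall(html):
            SubElement(root, field).text = match.strip()
    return tostring(root, encoding="unicode")

# One labelled sample page provides the mappings for two schema fields.
sample = "<li><b>Title:</b> Data Extraction <i>Price:</i> $12</li>"
rules = {
    "title": induce_rule(sample, "Data Extraction"),
    "price": induce_rule(sample, "$12"),
}
page = "<li><b>Title:</b> Wrapper Induction <i>Price:</i> $30</li>"
print(wrap(page, rules))
# -> <records><title>Wrapper Induction</title><price>$30</price></records>
```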

12.
Many algorithms in distributed systems assume that the size of a single message depends on the number of processors. In this paper, we assume in contrast that messages consist of a single bit. Our main goal is to explore how the one-bit translation of unbounded-message algorithms can be sped up by pipelining. We consider two problems. The first is routing between two processors in an arbitrary network and in some special networks (ring, grid, hypercube). The second problem is coloring a synchronous ring with three colors. The routing problem is a very basic subroutine in many distributed algorithms; the three-coloring problem demonstrates that pipelining is not always useful. Amotz Bar-Noy received his B.Sc. degree in Mathematics and Computer Science in 1981, and his Ph.D. degree in Computer Science in 1987, both from the Hebrew University of Jerusalem, Israel. Between 1987 and 1989 he was a post-doctoral fellow in the Department of Computer Science at Stanford University. He is currently a visiting scientist at the IBM Thomas J. Watson Research Center. His current research interests include the theoretical aspects of distributed and parallel computing, computational complexity and combinatorial optimization. Joseph (Seffi) Naor received his B.A. degree in Computer Science in 1981 from the Technion, Israel Institute of Technology. He received his M.Sc. in 1983 and Ph.D. in 1987 in Computer Science, both from the Hebrew University of Jerusalem, Israel. Between 1987 and 1988 he was a post-doctoral fellow at the University of Southern California, Los Angeles, CA. Since 1988 he has been a post-doctoral fellow in the Department of Computer Science at Stanford University. His research interests include combinatorial optimization, randomized algorithms, computational complexity and the theoretical aspects of parallel and distributed computing. Moni Naor received his B.A. in Computer Science from the Technion, Israel Institute of Technology, in 1985, and his Ph.D. in Computer Science from the University of California at Berkeley in 1989. He is currently a visiting scientist at the IBM Almaden Research Center. His research interests include computational complexity, data structures, cryptography, and parallel and distributed computation. Supported in part by a Weizmann fellowship and by contract ONR N00014-85-C-0731. Supported by contract ONR N00014-88-K-0166 and by a grant from Stanford's Center for Integrated Systems; this work was done while the author was a post-doctoral fellow at the University of Southern California, Los Angeles, CA. This work was done while the author was with the Computer Science Division, University of California at Berkeley, and supported by NSF grant DCR 85-13926.

13.
In this paper we give several improved universality results for two important classes of P systems: P systems with catalysts and evolution-communication P systems. First, the result from Reference 14), stating that six catalysts ensure universality, is improved in two ways: using bistable catalysts and using moving catalysts. Specifically, universality can be reached with one bistable catalyst and two usual catalysts (using five membranes), as well as with one moving catalyst and three membranes, or with two moving catalysts and only two membranes. The second part of the paper deals with evolution-communication P systems, and we also give improved universality results for this type of system, in terms of the weight of symport/antiport rules, the number of membranes, or the number of catalysts. Shankara Narayanan Krishna: She is an Assistant Professor in the Dept. of Computer Science & Engg., IIT Bombay, India. Her research interests are Natural Computing and Formal Methods. Andrei Paun, Ph.D.: He obtained his bachelor's degree in Mathematics and Computer Science from the University of Bucharest, Romania. He obtained his Ph.D. degree in Computer Science at the University of Western Ontario, Canada, under the supervision of Prof. Dr. Sheng Yu, with the thesis "Unconventional Models of Computation: DNA and Membrane Computing". After graduation he received a postdoctoral fellowship from NSERC, Canada, and after six months he accepted an assistant professor position in the US at Louisiana Tech University.

14.
XML has already become the de facto standard for specifying and exchanging data on the Web. However, XML is by nature verbose, and thus XML documents are usually large in size, a factor that hinders its practical usage, since it substantially increases the costs of storing, processing, and exchanging data. In order to tackle this problem, many XML-specific compression systems, such as XMill, XGrind, XMLPPM, and Millau, have recently been proposed. However, these systems usually suffer from one of two inadequacies: they either sacrifice performance in terms of compression ratio and execution time in order to support a limited range of queries, or they perform full decompression prior to processing queries over compressed documents. In this paper, we address the above problems by exploiting the information provided by a Document Type Definition (DTD) associated with an XML document. We show that a DTD is able to facilitate better compression as well as generate more usable compressed data to support querying. We present the architecture of XCQ, a compression and querying tool for handling XML data. XCQ is based on a novel technique we have developed called DTD Tree and SAX Event Stream Parsing (DSP). The documents compressed by XCQ are stored in Partitioned Path-Based Grouping (PPG) data streams, which are equipped with a Block Statistics Signature (BSS) indexing scheme. The indexed PPG data streams support the processing of XML queries that involve selection and aggregation, without the need for full decompression. In order to study the compression performance of XCQ, we carry out comprehensive experiments over a set of XML benchmark datasets. Wilfred Ng obtained his M.Sc. (Distinction) and Ph.D. degrees from the University of London. His research interests are in the areas of databases and information systems, including XML data, database query languages, web data management, and data mining. He is now an assistant professor in the Department of Computer Science, the Hong Kong University of Science and Technology (HKUST). Wai-Yeung Lam obtained his M.Phil. degree from the Hong Kong University of Science and Technology (HKUST) in 2003. His research thesis was based on the project "XCQ: A Framework for Querying Compressed XML Data." He is currently working in industry. Peter Wood received his Ph.D. in Computer Science from the University of Toronto in 1989. He previously studied at the University of Cape Town, South Africa, obtaining a B.Sc. degree in 1977 and an M.Sc. degree in Computer Science in 1982. Currently he is a senior lecturer at Birkbeck and a member of the Information Management and Web Technologies research group. His research interests include database and XML query languages, query optimisation, active and deductive rule languages, and graph algorithms. Mark Levene received his Ph.D. in Computer Science in 1990 from Birkbeck College, University of London, having previously been awarded a B.Sc. in Computer Science from Auckland University, New Zealand, in 1982. He is currently Professor of Computer Science at Birkbeck College, where he is a member of the Information Management and Web Technologies research group. His main research interests are Web search and navigation, Web data mining and stochastic models for the evolution of the Web.
He has published extensively in the areas of database theory and web technologies, and has recently published a book called ‘An Introduction to Search Engines and Web Navigation’.

15.
The two existing approaches to detecting cyber attacks on computers and networks, signature recognition and anomaly detection, have shortcomings related to the accuracy and efficiency of detection. This paper describes a new approach to cyber attack (intrusion) detection that aims to overcome these shortcomings through several innovations. We call our approach attack-norm separation. The attack-norm separation approach engages in the scientific discovery of data, features and characteristics for cyber signal (attack data) and noise (normal data). We use attack profiling and analytical discovery techniques to generalize the data, features and characteristics that exist in cyber attack and norm data. We also leverage well-established signal detection models in the physical space (e.g., radar signal detection) and verify them in cyberspace. With this foundation of information, we build attack-norm separation models that incorporate both attack and norm characteristics. This enables us to take the least amount of relevant data necessary to achieve detection accuracy and efficiency. The attack-norm separation approach considers not only activity data, but also state and performance data along the cause-effect chains of cyber attacks on computers and networks. This enables us to achieve a detection adequacy lacking in existing intrusion detection systems. Nong Ye is a Professor of Industrial Engineering and an Affiliated Professor of Computer Science and Engineering at Arizona State University (ASU), and the Director of the Information Systems Assurance Laboratory at ASU. Her research interests lie in security and Quality of Service assurance of information systems and infrastructures. She holds a Ph.D. degree in Industrial Engineering from Purdue University, West Lafayette, and M.S. and B.S. degrees in Computer Science from the Chinese Academy of Sciences and Peking University in China, respectively. She is a senior member of IIE and IEEE, and an Associate Editor for IEEE Transactions on Systems, Man, and Cybernetics and IEEE Transactions on Reliability. Toni Farley is the Assistant Director of the Information and Systems Assurance Laboratory and a doctoral student of Computer Science at Arizona State University (ASU), Tempe, Arizona. She is studying under a Graduate Fellowship from AT&T Labs-Research. Her research interests include graphs, networks and network security. She holds a B.S. degree in Computer Science and Engineering from ASU. She is a member of IEEE and the IEEE Computer Society. Her email address is toni@asu.edu. Deepak Lakshminarasimhan is a Research Assistant at the Information and Systems Assurance Laboratory and a Master of Science student in Electrical Engineering at Arizona State University (ASU), Tempe, Arizona. His research interests include network security, digital signal processing and statistical data analysis. He holds a B.S. degree in Electronics and Communication Engineering from Bharathidasan University in India.

16.
17.
Internet video streaming is a widely popular application; however, in many cases, congestion control facilities are not well integrated into such applications. In order to be fair to other users who do not stream video, rate adaptation should be performed in response to congestion. On the other hand, the effect of rate adaptation on the viewer should be minimized, and this extra mechanism should not overload the client or the server. In this paper, we develop a heuristic approach to unicast congestion control. The primary feature of our approach is a two-level adaptation algorithm that utilizes the packet loss rate as well as receiver buffer data to maintain satisfactory buffer levels at the receiver. This is particularly important if the receiver has a limited buffer, as in mobile devices. When there is no congestion, fine-grained adjustments are carried out at the packet level to maintain the best buffer levels. Depending on the level of congestion and the receiver buffer level, rate shaping that involves frame discard and, finally, rate adaptation by switching to a different pre-encoded video stream are carried out. An additive-increase multiplicative-decrease policy is maintained to respond to congestion in a TCP-friendly manner. The algorithm is implemented, and performance results show that it has adaptation ability suitable for both local area and wide area networks. E. Turhan Tunali received a B.Sc. degree in Electrical Engineering from Middle East Technical University and an M.Sc. degree in Applied Statistics from Ege University, both in Turkey. He then received a D.Sc. degree in Systems Science and Mathematics from Washington University in St. Louis, U.S.A. in 1985. After his doctoral study, he joined the Computer Engineering Department of Ege University as an assistant professor, where he became an associate professor in 1988. During the period 1992–1994, he worked in the Department of Computer Technology of Nanyang Technological University, Singapore, as a Visiting Senior Fellow. He then joined the International Computer Institute of Ege University as a Professor, where he is currently the director. In the period 2000–2001 he worked in the Department of Computer Science of Loyola University of Chicago as a Visiting Professor. His current research interests include adaptive video streaming and Internet performance measurements. Dr. Tunali is married with an eighteen-year-old son. Aylin Kantarci received B.Sc., M.Sc. and Ph.D. degrees, all from the Computer Engineering Department of Ege University, Izmir, Turkey, in 1992, 1994 and 2000, respectively. She then joined the same department as an assistant professor. Her current research interests include adaptive video streaming, video coding, operating systems, multimedia systems and distributed systems. Nukhet Ozbek received a B.Sc. degree in Electrical and Electronics Engineering from the School of Engineering and an M.Sc. degree in Computer Science from the International Computer Institute, both at Ege University, Izmir, Turkey. From 1998 to 2003 she worked in the DVB team of Digital R&D at Vestel Corporation, Izmir, Turkey, which produces telecommunication and consumer electronics devices. She is currently a Ph.D. student and a research assistant at the International Computer Institute of Ege University. Her research areas include video coding and streaming, multimedia systems and set-top box architectures.
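A rough Python sketch of the two-level idea described above: an AIMD adjustment driven by the packet loss rate, nudged by how far the receiver buffer sits from a target fill level, plus a coarse switch between pre-encoded streams. All constants and function names are assumptions, not the authors' parameters.

```python
def adapt_rate(current_kbps, loss_rate, buffer_level, target_buffer=0.5,
               min_kbps=200, max_kbps=2000, alpha=50, beta=0.5):
    """Additive increase / multiplicative decrease on the sending rate, corrected
    by the receiver buffer's distance from a target fill level (0..1)."""
    if loss_rate > 0.01:                       # congestion signal: back off multiplicatively
        rate = current_kbps * beta
    else:                                      # no congestion: probe upward additively
        rate = current_kbps + alpha
    # Fine-grained correction: a draining buffer lowers the requested rate,
    # a comfortably full buffer lets us hold or raise it.
    rate *= 1.0 + 0.2 * (buffer_level - target_buffer)
    return max(min_kbps, min(max_kbps, rate))

def pick_stream(rate_kbps, encodings=(400, 800, 1600)):
    """Coarse-grained adaptation: switch to the highest pre-encoded stream
    the current rate can sustain."""
    return max((e for e in encodings if e <= rate_kbps), default=min(encodings))

r = adapt_rate(current_kbps=1000, loss_rate=0.03, buffer_level=0.2)
print(r, pick_stream(r))   # back off and drop to a lower pre-encoded stream
```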

18.
It is likely that customers issue requests based on out-of-date information in e-commerce application systems. Hence, transaction failure rates can increase greatly. In this paper, we present a preference update model to address this problem. A preference update is an extended SQL update statement in which a user can request the desired number of target data items by specifying multiple preferences. Moreover, the preference update allows easy extraction of criteria from a set of concurrent requests and, hence, optimal decisions for the data assignments can be made. We propose a group evaluation strategy for preference update processing in a multidatabase environment. The experimental results show that group evaluation can effectively increase the customer satisfaction level at acceptable cost. Peng Li is the Chief Software Architect of didiom LLC. Before that, he was a visiting assistant professor in the computer science department at Western Kentucky University. He received his Ph.D. degree in computer science from the University of Texas at Dallas. He also holds a B.Sc. and an M.S. in Computer Science from the Renmin University of China. His research interests include database systems, database security, transaction processing, distributed and Internet computing, and e-commerce. Manghui Tu received a Bachelor of Science degree from Wuhan University, P.R. China in 1996, and a Master's degree in Computer Science from the University of Texas at Dallas in 2001. He is currently working toward the PhD degree in the Department of Computer Science at the University of Texas at Dallas. Mr. Tu's research interests include distributed systems, grid computing, information security, mobile computing, and scientific computing. His PhD research focuses on data management in secure and high-performance data grids. He is a student member of the IEEE. I-Ling Yen received her BS degree from Tsing-Hua University, Taiwan, and her MS and PhD degrees in Computer Science from the University of Houston. She is currently an Associate Professor of Computer Science at the University of Texas at Dallas. Dr. Yen's research interests include fault-tolerant computing, security systems and algorithms, distributed systems, Internet technologies, E-commerce, and self-stabilizing systems. She has published over 100 technical papers in these research areas and received many research awards from NSF, DOD, NASA, and several industry companies. She has served as a Program Committee member for many conferences and as Program Chair/Co-Chair for the IEEE Symposium on Application-Specific Software and System Engineering & Technology, the IEEE High Assurance Systems Engineering Symposium, the IEEE International Computer Software and Applications Conference, and the IEEE International Symposium on Autonomous Decentralized Systems. She is a member of the IEEE. Zhonghang Xia received the B.S. degree in applied mathematics from Dalian University of Technology in 1990, the M.S. degree in Operations Research from Qufu Normal University in 1993, and the Ph.D. degree in computer science from the University of Texas at Dallas in 2004. He is now an assistant professor in the Department of Computer Science, Western Kentucky University, Bowling Green, KY. His research interests are in the areas of multimedia computing and networking, distributed systems, and data mining.

19.
In this paper we introduce the logic programming language Disjunctive Chronolog, which combines the programming paradigms of temporal and disjunctive logic programming. Disjunctive Chronolog is capable of expressing dynamic behaviour as well as uncertainty, two notions that are very common in a variety of real systems. We present the minimal temporal model semantics and the fixpoint semantics for the new programming language and demonstrate their equivalence. We also show how proof procedures developed for disjunctive logic programs can easily be extended to apply to Disjunctive Chronolog programs. Manolis Gergatsoulis, Ph.D.: He received his B.Sc. in Physics in 1983, and the M.Sc. and Ph.D. degrees in Computer Science in 1986 and 1995 respectively, all from the University of Athens, Greece. Since 1996 he has been a Research Associate in the Institute of Informatics and Telecommunications, NCSR 'Demokritos', Athens. His research interests include logic and temporal programming, program transformations and synthesis, as well as the theory of programming languages. Panagiotis Rondogiannis, Ph.D.: He received his B.Sc. from the Department of Computer Engineering and Informatics, University of Patras, Greece, in 1989, and his M.Sc. and Ph.D. from the Department of Computer Science, University of Victoria, Canada, in 1991 and 1994 respectively. From 1995 to 1996 he served in the Greek army. From 1996 to 1997 he was a visiting professor in the Department of Computer Science, University of Ioannina, Greece, and since 1997 he has been a Lecturer in the same department. In January 2000 he was elected Assistant Professor in the Department of Informatics at the University of Athens. His research interests include functional, logic and temporal programming, as well as the theory of programming languages. Themis Panayiotopoulos, Ph.D.: He received his Diploma in Electrical Engineering from the Department of Electrical Engineering, National Technical University of Athens, in 1984, and his Ph.D. on Artificial Intelligence from the same department in 1989. From 1991 to 1994 he was a visiting professor at the Department of Mathematics, University of the Aegean, Samos, Greece, and a Research Associate at the Institute of Informatics and Telecommunications of the 'Demokritos' National Research Center. Since 1995 he has been an Assistant Professor at the Department of Computer Science, University of Piraeus. His research interests include temporal programming, logic programming, expert systems and intelligent agent architectures.

20.
This paper proposes a novel method of analysing the trajectories followed by people while they perform navigational tasks. The results indicate that modelling trajectories with Bézier curves provides a basis for the diagnosis of navigational patterns. The method offers five indicators: goodness of fit, average curvature, number of inflexion points, lengths of straight-line segments, and area covered. Study results obtained in a virtual environment show that these indicators carry important information about user performance, specifically spatial knowledge acquisition. Corina Sas is a Lecturer in the field of human–computer interaction in the Computing Department at Lancaster University. She holds bachelor's degrees in Computer Science and Psychology and an M.A. in Industrial Psychology from Romania. She received her Ph.D. degree in Computer Science from University College Dublin in 2004. Her research interests include user modelling, adaptive systems, data mining, spatial cognition, user studies and individual differences. She has published in various journals and international conferences in these areas. Nikita Schmidt is a Postdoctoral Research Fellow at University College Dublin (UCD). He received his Ph.D. degree from UCD in 2004 and his M.Sc. from St-Petersburg State University, Russia, in 1994. His research interests include pervasive, ubiquitous and location-aware computing, embedded systems, hardware-close software development and tree-structured data. His work experience is a mix of industry and academia.
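A small numpy sketch of the kind of analysis described above, assuming a single cubic Bézier segment fitted by least squares with chord-length parameterisation; it reports a goodness-of-fit residual, average curvature and an inflexion count. The toy trajectory is invented for illustration, and the study may segment and fit trajectories differently.

```python
import numpy as np

def bernstein_matrix(t):
    """Cubic Bernstein basis evaluated at parameters t in [0, 1]."""
    t = t[:, None]
    return np.hstack([(1 - t) ** 3, 3 * (1 - t) ** 2 * t, 3 * (1 - t) * t ** 2, t ** 3])

def fit_cubic_bezier(points):
    """Least-squares fit of one cubic Bézier segment to a 2-D trajectory."""
    d = np.r_[0, np.cumsum(np.linalg.norm(np.diff(points, axis=0), axis=1))]
    t = d / d[-1]                                    # chord-length parameterisation
    A = bernstein_matrix(t)
    ctrl, *_ = np.linalg.lstsq(A, points, rcond=None)
    residual = np.mean(np.linalg.norm(A @ ctrl - points, axis=1) ** 2)  # goodness of fit
    return ctrl, residual

def curve_indicators(ctrl, samples=200):
    """Average curvature and number of inflexion points of the fitted curve."""
    t = np.linspace(0, 1, samples)[:, None]
    p0, p1, p2, p3 = ctrl
    d1 = 3 * ((p1 - p0) * (1 - t) ** 2 + 2 * (p2 - p1) * (1 - t) * t + (p3 - p2) * t ** 2)
    d2 = 6 * ((p2 - 2 * p1 + p0) * (1 - t) + (p3 - 2 * p2 + p1) * t)
    cross = d1[:, 0] * d2[:, 1] - d1[:, 1] * d2[:, 0]
    speed = np.linalg.norm(d1, axis=1)
    curvature = np.abs(cross) / np.maximum(speed, 1e-9) ** 3
    inflexions = int(np.sum(np.sign(cross[:-1]) * np.sign(cross[1:]) < 0))
    return curvature.mean(), inflexions

# Toy trajectory: a gently curving walk through a virtual room.
path = np.array([[0, 0], [1, 0.2], [2, 0.8], [3, 1.8], [4, 3.2]], dtype=float)
ctrl, err = fit_cubic_bezier(path)
print(err, curve_indicators(ctrl))
```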
