Similar Documents
20 similar documents found (search time: 15 ms)
1.
This paper introduces a new algorithm, RP, for mining association rules. By dividing the database into m partitions, RP counts itemsets of different sizes in the same pass of scanning over the database. The total number of passes over the database is only (k + 2m - 2)/m, where k is the size of the largest itemset, which is much less than the k passes otherwise required.
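As a quick, hedged illustration of the claimed saving (not code from the paper), the sketch below evaluates the pass-count formula, read here as (k + 2m - 2)/m, against the k passes a straightforward level-wise scan would need; the function name is ours.

```python
def rp_scan_passes(k: int, m: int) -> float:
    """Database passes claimed for the RP algorithm: (k + 2m - 2) / m,
    versus the k passes of a size-by-size (level-wise) scan."""
    return (k + 2 * m - 2) / m

# e.g. longest itemset k = 10, database split into m = 4 partitions:
print(rp_scan_passes(10, 4))   # 4.0 passes instead of 10
```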

2.
POTENTIAL: A highly adaptive core of parallel database system (cited: 1; self: 1; others: 0)
POTENTIAL is a virtual database machine based on general computing platforms, especially parallel computing platforms. It provides a complete solution to high-performance database systems through a 'virtual processor - virtual data bus - virtual memory' architecture. Virtual processors manage all CPU resources in the system, on which various operations run. The virtual data bus is responsible for managing data transmission between associated operations, and forms the hinge of the entire system. Virtual memory provides efficient data storage and buffering mechanisms that conform to data reference behaviors in database systems. The architecture of POTENTIAL is very clear and has many good features, including high efficiency, scalability, extensibility, and portability.

3.
This paper introduces the design and implementation of BCL-3, a high-performance, low-level communication software system running on a cluster of SMPs (CLUMPS) called DAWNING-3000. BCL-3 provides flexible and sufficient functionality to fulfill the communication requirements of the fundamental system software developed for DAWNING-3000, while guaranteeing security, scalability, and reliability. Important features of BCL-3 are presented in the paper, including special support for SMP and heterogeneous network environments, semi-user-level communication, reliable and ordered data transfer, and scalable flow control. A performance evaluation of BCL-3 over Myrinet is also given.

4.
The information dissemination model is becoming increasingly important in wide-area information systems. In this model, a user subscribes to an information dissemination service by submitting profiles that describe his interests. There have been several simple kinds of information dissemination services on the Internet, such as mailing lists, but they provide only a crude granularity of interest matching: a user whose information need does not exactly match certain lists will receive either too many irrelevant or too few relevant messages. This paper presents a personalized information dissemination model based on HowNet, which uses a Concept Network-Views (CN-V) model to support information filtering, user-interest modeling, and information recommendation. A Concept Network is constructed from the user's profiles and the content of documents; it describes concepts and their relations in the content and assigns different weights to these concepts. Usually the Concept Network is not well arranged and useful relations are hard to find in it directly, so several views are extracted from it to represent the important relations explicitly.
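As a rough sketch of the idea (not the paper's HowNet-based construction), the toy below builds a weighted concept network from document concept lists, using co-occurrence counts as a stand-in for relation weights, and extracts a "view" that keeps only the stronger relations; all names and the weighting scheme are our own simplifications.

```python
from collections import defaultdict

class ConceptNetwork:
    def __init__(self):
        self.weight = defaultdict(float)    # concept -> accumulated weight
        self.relation = defaultdict(float)  # (concept, concept) -> strength

    def add_document(self, concepts, doc_weight=1.0):
        """Weight concepts and relate every pair co-occurring in a document."""
        for c in concepts:
            self.weight[c] += doc_weight
        for i, a in enumerate(concepts):
            for b in concepts[i + 1:]:
                self.relation[tuple(sorted((a, b)))] += doc_weight

    def view(self, min_strength):
        """Extract a 'view': only the relations strong enough to be useful."""
        return {pair: w for pair, w in self.relation.items() if w >= min_strength}

net = ConceptNetwork()
net.add_document(["database", "index", "query"])
net.add_document(["database", "query", "optimizer"])
print(net.view(min_strength=2.0))   # {('database', 'query'): 2.0}
```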

5.
Mobility management is a challenging topic in mobile computing environments. Studying how mobiles cross the boundaries of location areas is important for evaluating the costs and performance of various location management strategies. Hitherto, several formulae have been derived to describe the probability distribution of the number of location-area boundaries crossed by a mobile, and some of them are widely used in analyzing the costs and performance of mobility management strategies. Utilizing the density evolution method of vector Markov processes, we propose a general probability formula for the number of location-area boundaries crossed by a mobile between two successive calls. Several widely used formulae turn out to be special cases of the proposed formula.
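The abstract does not reproduce the formula itself, but the quantity it describes is easy to estimate empirically. The Monte Carlo sketch below assumes exponential call inter-arrival times and exponential residence time per location area (our assumptions, chosen because they give a closed-form geometric distribution to check against); the paper's density-evolution derivation is far more general.

```python
import random

def crossing_distribution(lam, mu, trials=200_000, kmax=6):
    """Estimate P(K = k): K = location-area boundaries crossed between calls."""
    counts = [0] * (kmax + 1)
    for _ in range(trials):
        t_call = random.expovariate(lam)    # time until the next call
        k, t = 0, random.expovariate(mu)    # time of the first crossing
        while t < t_call:
            k += 1
            t += random.expovariate(mu)     # residence time in the next area
        if k <= kmax:
            counts[k] += 1
    return [c / trials for c in counts]

print(crossing_distribution(lam=1.0, mu=2.0))
# Memoryless check: P(K = k) = (mu/(lam+mu))**k * lam/(lam+mu)
#                            = (2/3)**k * (1/3) -> 0.333, 0.222, 0.148, ...
```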

6.
7.
Data extraction from the web based on pre-defined schema (cited: 7; self: 1; others: 7)
With the development of the Internet, the World Wide Web has become an invaluable information source for most organizations. However, most documents available from the Web are in HTML, a form originally designed for document formatting with little consideration of content, so effectively extracting data from such documents remains a non-trivial task. In this paper, we present a schema-guided approach to extracting data from HTML pages. Under the approach, the user defines a schema specifying what is to be extracted and provides sample mappings between the schema and the HTML page. The system then induces the mapping rules and generates a wrapper that takes the HTML page as input and produces the required data as XML conforming to the user-defined schema. A prototype system implementing the approach has been developed. Preliminary experiments indicate that the proposed semi-automatic approach is not only easy to use but also able to produce wrappers that extract the required data from input pages with high accuracy.
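A minimal sketch of what such a generated wrapper might look like, assuming the induced mapping rules can be expressed as one regular expression per schema field; the paper induces these rules from sample mappings, whereas the patterns here are hand-written stand-ins, and the schema and page are invented for illustration.

```python
import re
from xml.etree.ElementTree import Element, SubElement, tostring

# One regex per schema field, standing in for the induced mapping rules.
schema = {
    "title": r"<h2>(.*?)</h2>",
    "price": r'<span class="price">(.*?)</span>',
}

def wrap(html: str) -> bytes:
    """Apply the rules to an HTML page; emit XML conforming to the schema."""
    root = Element("record")
    for field, pattern in schema.items():
        m = re.search(pattern, html, re.DOTALL)
        if m:
            SubElement(root, field).text = m.group(1).strip()
    return tostring(root)

page = '<h2>Data Extraction</h2><span class="price">$12</span>'
print(wrap(page))
# b'<record><title>Data Extraction</title><price>$12</price></record>'
```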

8.
The leakage current of CMOS circuits increases dramatically as technology scales down and has become a critical issue for high-performance systems. Subthreshold, gate, and reverse-biased junction band-to-band tunneling (BTBT) leakages are considered the three main components of total leakage current. Up to now, accurately estimating the leakage current of large-scale circuits within acceptable time has remained unsolved, even though accurate leakage models have been widely discussed. In this paper, the authors first examine the stack effect in CMOS technology and propose a new, simple gate-level leakage current model. A table-lookup-based total leakage current simulator is then built on the model. To validate the simulator, accurate leakage currents are simulated at the circuit level using the popular simulator HSPICE for comparison. Further studies, including maximum leakage current estimation, minimum leakage current generation, and a high-level average leakage current macromodel, are introduced in detail. Experiments on the ISCAS85 and ISCAS89 benchmarks demonstrate that the two proposed leakage current estimation methods are very accurate and efficient.
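A minimal sketch of the table-lookup step, assuming a pre-characterized per-gate leakage table keyed by gate type and input vector; the input dependence is exactly where the stack effect shows up (stacked off transistors leak least). The numbers are illustrative, not from any real characterization or from the paper.

```python
# Illustrative per-gate leakage (nA) keyed by (gate type, input vector).
LEAKAGE_NA = {
    ("NAND2", (0, 0)): 1.2,   # both series NMOS off: stack effect, least leaky
    ("NAND2", (0, 1)): 3.9,
    ("NAND2", (1, 0)): 4.1,
    ("NAND2", (1, 1)): 9.8,
    ("INV",   (0,)):   5.0,
    ("INV",   (1,)):   6.3,
}

def total_leakage(netlist):
    """netlist: iterable of (gate_type, input_vector) for one input state."""
    return sum(LEAKAGE_NA[(gate, vec)] for gate, vec in netlist)

circuit = [("NAND2", (0, 0)), ("NAND2", (1, 1)), ("INV", (1,))]
print(total_leakage(circuit), "nA")   # 1.2 + 9.8 + 6.3 = 17.3 nA
```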

9.
Digital Image Watermarking Based on Discrete Wavelet Transform (cited: 7; self: 0; others: 7)
This paper addresses digital watermarking, a recently popular research topic, and presents methods for embedding digital watermarks by modifying frequency coefficients in the discrete wavelet transform (DWT) domain. First, the current progress of digital watermarking is briefly introduced. Then, starting from Pitas's method and discarding his pseudo-random-number technique, the authors use a digital image scrambling technology as preprocessing for watermarking, and discuss how to embed a 1-bit digital image as a watermark in the frequency domain. Finally, another watermarking method is given in which 3-D DWT is used to transform a given digital image. The experimental results show that the proposed methods are robust to a large extent.
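A minimal sketch of coefficient-domain embedding, assuming a hand-rolled one-level 2-D Haar transform and additive embedding of a 1-bit-per-coefficient watermark into the diagonal detail subband; the paper's scrambling preprocessing and 3-D DWT variant are omitted, and the detection shown is a naive differential check.

```python
import numpy as np

def haar2(x):
    """One-level 2-D Haar: approximation and H/V/D detail subbands."""
    a = (x[0::2, 0::2] + x[0::2, 1::2] + x[1::2, 0::2] + x[1::2, 1::2]) / 4
    h = (x[0::2, 0::2] + x[0::2, 1::2] - x[1::2, 0::2] - x[1::2, 1::2]) / 4
    v = (x[0::2, 0::2] - x[0::2, 1::2] + x[1::2, 0::2] - x[1::2, 1::2]) / 4
    d = (x[0::2, 0::2] - x[0::2, 1::2] - x[1::2, 0::2] + x[1::2, 1::2]) / 4
    return a, h, v, d

def ihaar2(a, h, v, d):
    """Exact inverse of haar2."""
    x = np.empty((2 * a.shape[0], 2 * a.shape[1]))
    x[0::2, 0::2] = a + h + v + d
    x[0::2, 1::2] = a + h - v - d
    x[1::2, 0::2] = a - h + v - d
    x[1::2, 1::2] = a - h - v + d
    return x

def embed(image, bits, alpha=2.0):
    """bits: binary watermark with the same shape as the detail subband."""
    a, h, v, d = haar2(image.astype(float))
    d = d + alpha * (2 * bits - 1)        # +alpha for 1, -alpha for 0
    return ihaar2(a, h, v, d)

img = np.random.rand(8, 8) * 255
wm = np.random.randint(0, 2, (4, 4))
marked = embed(img, wm)
# Naive detection: recover the sign of the diagonal-detail change.
_, _, _, d0 = haar2(img)
_, _, _, d1 = haar2(marked)
print((np.sign(d1 - d0) > 0).astype(int) == wm)   # all True
```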

10.
This paper presents a test resource partitioning technique based on an efficient response compaction design called the quotient compactor (q-Compactor). Because the q-Compactor is a single-output compactor, high compaction ratios can be obtained even for chips with a small number of outputs. Theorems for the design of the q-Compactor are presented to achieve full diagnostic ability, minimize error cancellation, and handle unknown bits in the outputs of the circuit under test (CUT). The q-Compactor can also be moved to the load board so as to compact the output response of the CUT even during functional testing; the number of tester channels required to test the chip is therefore significantly reduced. Experimental results on the ISCAS '89 benchmark circuits and an MPEG-2 decoder SoC show that the proposed compaction scheme is very efficient.
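One plausible reading of a quotient compactor, sketched under our own assumptions: the scan-out stream is divided by a fixed characteristic polynomial over GF(2), and the quotient bits form the single compacted output. The paper's actual design rules, X-handling, and diagnosis theorems are not reproduced here.

```python
def gf2_quotient(stream, poly):
    """Divide the bit stream (MSB first) by poly over GF(2); emit quotient bits."""
    rem = 0
    deg = poly.bit_length() - 1
    out = []
    for bit in stream:
        rem = (rem << 1) | bit
        if rem >> deg & 1:          # leading term present: subtract (XOR) poly
            rem ^= poly
            out.append(1)
        else:
            out.append(0)
    return out

response = [1, 0, 1, 1, 0, 0, 1, 0]        # raw scan-out bits of the CUT
print(gf2_quotient(response, poly=0b1011)) # [0, 0, 0, 1, 0, 0, 0, 0]
# Check: (x^7 + x^5 + x^4 + x) / (x^3 + x + 1) has quotient x^4 (bits 10000),
# which appears after the first deg(poly) = 3 structurally-zero output bits.
```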

11.
A major overhead in software DSM (distributed shared memory) is the cost of remote memory accesses necessitated by the protocol as well as induced by false sharing. This paper introduces a dynamic prefetching method implemented in the JIAJIA software DSM to reduce the system overhead caused by remote accesses. The prefetching method records the interleaving string of INV (invalidation) and GETP (getting a remote page) operations for each cached page and analyzes the periodicity of the string when a page is invalidated on a lock or barrier. A prefetching request is issued after the lock or barrier if the periodicity analysis indicates that GETP will be the next operation in the string, and multiple prefetching requests to the same host are merged into a single message. Performance evaluation with eight well-accepted benchmarks on a cluster of sixteen PowerPC workstations shows that the prefetching scheme can significantly reduce page fault overhead, achieving a performance increase of 15%-20% in three benchmarks and around 8%-10% in another three. The average extra traffic caused by useless prefetches is only 7%-13% in the evaluation.
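A minimal sketch of the periodicity test behind the prefetch decision, assuming the per-page history is a string of 'I' (INV) and 'G' (GETP) events and that the shortest period consistent with the whole string is used for prediction; the merging of requests per host is omitted.

```python
def next_event_if_periodic(history: str):
    """Return the predicted next event, or None if no period fits."""
    n = len(history)
    for p in range(1, n // 2 + 1):        # try the shortest period first
        if all(history[i] == history[i % p] for i in range(n)):
            return history[n % p]         # continue the periodic pattern
    return None

page_history = "IGIGI"                    # page was just invalidated again
if next_event_if_periodic(page_history) == "G":
    print("issue prefetch for this page") # predicted next op is GETP
```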

12.
In Part 2 of the Advanced Audio Video coding Standard (AVS-P2), many efficient coding tools are adopted in motion compensation, such as new motion vector prediction, symmetric matching, and quarter-precision interpolation. However, these new features greatly increase the computational complexity and memory bandwidth requirements, making motion compensation a difficult component in the implementation of an AVS HDTV decoder. This paper proposes an efficient motion compensation architecture for the AVS-P2 video standard up to Level 6.2 of the Jizhun Profile. It has a macroblock-level pipelined structure consisting of an MV predictor unit, a reference fetch unit, and a pixel interpolation unit. The proposed architecture exploits the parallelism in the AVS motion compensation algorithm to accelerate operations and uses a dedicated design to optimize memory access. It has been integrated in a prototype chip fabricated in TSMC 0.18-μm CMOS technology, and the experimental results show that the architecture achieves real-time AVS-P2 decoding for HDTV 1080i (1920 × 1088, 4:2:0, 60 fields/s) video. The design works at a frequency of 148.5 MHz with a total gate count of about 225K.
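As a software illustration only (the paper describes hardware), here is a toy quarter-pel sample using bilinear weighting; the real AVS-P2 interpolation uses longer multi-tap filters, but even this simplified version shows why sub-pel motion compensation inflates memory bandwidth: each output pixel reads a neighborhood of reference pixels.

```python
import numpy as np

def quarter_pel(ref, y, x, dy, dx):
    """Sample reference frame at (y + dy/4, x + dx/4), with 0 <= dy, dx <= 3."""
    fy, fx = dy / 4.0, dx / 4.0
    return ((1 - fy) * (1 - fx) * ref[y, x] +
            (1 - fy) * fx       * ref[y, x + 1] +
            fy       * (1 - fx) * ref[y + 1, x] +
            fy       * fx       * ref[y + 1, x + 1])

ref = np.arange(16.0).reshape(4, 4)
print(quarter_pel(ref, 1, 1, dy=2, dx=1))   # 7.25: half-pel down, quarter right
```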

13.
A Novel Computer Architecture to Prevent Destruction by Viruses (cited: 1; self: 0; others: 1)
In today's Internet computing world, illegal activities by crackers pose a serious threat to computer security. It is well known that computer viruses, Trojan horses, and other intrusive programs may cause severe and often catastrophic consequences. This paper proposes a novel secure computer architecture based on security-codes. Every instruction/data word is extended with a security-code denoting its security level, and external programs and data are automatically tagged with security-codes by hardware when entering the computer system. An instruction with a lower security-code cannot run or process instructions/data with a higher security level, and security-codes cannot be modified by normal instructions. With minor hardware overhead, the new architecture can effectively protect the main computer system from destruction or theft by intrusive programs such as computer viruses. For most PC systems this amounts to a 1-bit increase in word length for registers, memory, and the hard disk.
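A minimal software model of the policy (the paper enforces it in hardware), assuming a 1-bit trust tag per word: externally loaded words are tagged untrusted on entry, and an instruction may not run on or modify a word with a higher security level. Class and function names are ours.

```python
TRUSTED, UNTRUSTED = 1, 0

class Word:
    """A machine word carrying a 1-bit security-code alongside its value."""
    def __init__(self, value, tag):
        self.value, self.tag = value, tag

def load_external(value):
    """'Hardware' tags everything entering the system as untrusted."""
    return Word(value, UNTRUSTED)

def execute(instr: Word, operand: Word):
    if instr.tag < operand.tag:
        raise PermissionError("security-code violation: untrusted code "
                              "touching a higher-security word")
    return instr.value(operand.value)     # run the instruction on the operand

kernel_data = Word(42, TRUSTED)
virus = load_external(lambda x: x * 0)    # intrusive program from outside
try:
    execute(virus, kernel_data)
except PermissionError as e:
    print(e)                              # blocked before it can destroy data
print(execute(Word(lambda x: x + 1, TRUSTED), kernel_data))   # 43
```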

14.
OpenMP on Networks of Workstations for Software DSMs (cited: 3; self: 0; others: 3)
This paper describes the implementation of a sizable subset of OpenMP on networks of workstations (NOWs), in which a source-to-source OpenMP compiler (AutoPar) targets the JIAJIA home-based shared virtual memory (SVM) system. The paper suggests some simple modifications and extensions to the OpenMP standard to account for the differences between SVM and the SMP (symmetric multiprocessor) platforms at which the OpenMP specification is aimed. The OpenMP translator is based on an automatic parallelization compiler, so it is possible to check the semantic correctness of OpenMP programs, which is not required of an OpenMP-compliant implementation. AutoPar is measured on five applications, including both programs from the NAS Parallel Benchmarks and real applications, on a cluster of eight Pentium II PCs connected by 100 Mbps switched Ethernet. The evaluation shows that parallelization by annotating OpenMP directives is simple and that the performance of the generated JIAJIA code is still acceptable on NOWs.

15.
In this paper the problem of blending parametric surfaces using subdivision patches is discussed. A new approach, named removing-boundary, is presented: it generates piecewise-smooth subdivision surfaces by discarding the outermost quadrilaterals of the open meshes derived in each subdivision step. The approach is then employed both to blend parametric bicubic B-spline surfaces and to fill n-sided holes. It is easy to produce piecewise-smooth subdivision surfaces with both convex and concave corners on the boundary, and the limit surfaces are guaranteed to be C^2 continuous on the boundaries except at a few singular points. The blending method is thus very efficient, and the blending surfaces it generates are of good quality.
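A minimal sketch of the removing-boundary step on a regular quad grid, with simple midpoint (linear) subdivision standing in for the paper's smooth subdivision rules: refine, then drop the outermost ring of quadrilaterals so boundary artifacts never propagate inward.

```python
import numpy as np

def subdivide(grid):
    """Linear midpoint refinement of an (n, m, 3) control grid."""
    n, m, _ = grid.shape
    out = np.empty((2 * n - 1, 2 * m - 1, 3))
    out[0::2, 0::2] = grid
    out[1::2, 0::2] = (grid[:-1, :] + grid[1:, :]) / 2
    out[0::2, 1::2] = (grid[:, :-1] + grid[:, 1:]) / 2
    out[1::2, 1::2] = (grid[:-1, :-1] + grid[:-1, 1:]
                       + grid[1:, :-1] + grid[1:, 1:]) / 4
    return out

def removing_boundary_step(grid):
    refined = subdivide(grid)
    return refined[1:-1, 1:-1]            # discard the outermost quadrilaterals

g = np.random.rand(4, 4, 3)
for _ in range(3):
    g = removing_boundary_step(g)
print(g.shape)                            # (11, 11, 3): refinement outpaces trimming
```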

16.
1 Introduction  Multicast communication, which refers to the delivery of a message from a single source node to a number of destination nodes, is frequently used in distributed-memory parallel computer systems and networks [1]. Efficient implementation of multicast communication is critical to the performance of message-based scalable parallel computers and switch-based high-speed networks. Switch-based networks, or indirect networks, based on some variations of multistage interconnection networks (MINs), have emerged as a promising network architecture for construct…

17.
ARMiner: A Data Mining Tool Based on Association Rules (cited: 3; self: 0; others: 3)
In this paper, ARMiner, a data mining tool based on association rules, is introduced. Beginning with the system architecture, its characteristics and functions are discussed in detail, including data transfer, concept hierarchy generalization, mining rules with negative items, and re-development of the system. An example of the tool's application is also shown. Finally, some issues for future research are presented.

18.
This paper proposes a novel Chinese-English Cross-Lingual Information Retrieval (CECLIR) model, PME, in which a bilingual dictionary and comparable corpora are used to translate the query terms. The proximity and mutual information of term pairs in the Chinese and English comparable corpora are employed not only to resolve translation ambiguities but also to perform query expansion so as to deal with out-of-vocabulary issues in CECLIR. The evaluation results show that the query precision of the PME algorithm is about 84.4% of that of monolingual information retrieval.
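A minimal sketch of the mutual-information part of the disambiguation, assuming co-occurrence counts from a comparable corpus are already collected (the counts below are invented): for an ambiguous term, pick the candidate translation that scores highest against the other query terms' translations. The paper additionally uses proximity, which is omitted here.

```python
from math import log

N = 100_000                                  # corpus window count (illustrative)
count = {"bank": 500, "shore": 120, "interest": 800, "river": 300}
cooccur = {("bank", "interest"): 90, ("shore", "interest"): 2,
           ("bank", "river"): 5, ("shore", "river"): 60}

def pmi(a, b):
    """Pointwise mutual information from raw co-occurrence counts."""
    c = cooccur.get((a, b), 0) or cooccur.get((b, a), 0)
    if c == 0:
        return float("-inf")
    return log(c * N / (count[a] * count[b]))

def disambiguate(candidates, context_terms):
    """Choose the translation that co-occurs most strongly with the context."""
    return max(candidates,
               key=lambda t: sum(pmi(t, c) for c in context_terms))

# An ambiguous query term with two candidate translations, financial context:
print(disambiguate(["bank", "shore"], ["interest"]))   # bank
```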

19.
Eliciting requirements for a proposed system inevitably involves the problem of handling undesirable information about customers' needs, including inconsistency, vagueness, redundancy, or incompleteness. We term the requirements statements involved in such undesirable information non-canonical software requirements. In this paper, we propose an approach to handling non-canonical software requirements based on Annotated Predicate Calculus (APC). Informally, by defining a special belief lattice appropriate for representing the stakeholders' belief in requirements statements, we construct a new form of APC to formalize requirements specifications. We then show how the APC can be employed to characterize non-canonical requirements. Finally, we show how the approach can be used to handle non-canonical requirements through a case study. Kedian Mu received his B.Sc. degree in applied mathematics from Beijing Institute of Technology, Beijing, China, in 1997, his M.Sc. degree in probability and mathematical statistics from Beijing Institute of Technology in 2000, and his Ph.D. in applied mathematics from Peking University, Beijing, China, in 2003. From 2003 to 2005, he was a postdoctoral researcher at the Institute of Computing Technology, Chinese Academy of Sciences, China. He is currently an assistant professor at the School of Mathematical Sciences, Peking University, Beijing, China. His research interests include uncertain reasoning in artificial intelligence, knowledge engineering and science, and requirements engineering. Zhi Jin was awarded her B.Sc. in computer science from Zhejiang University, Hangzhou, China, in 1984, and studied for her M.Sc. in computer science (expert systems) and her Ph.D. in computer science (artificial intelligence) at the National Defence University of Technology, Changsha, China; she was awarded the Ph.D. in 1992. She is a senior member of the China Computer Federation. She is currently a professor at the Academy of Mathematics and System Sciences, Chinese Academy of Sciences. Her research interests include knowledge-based systems, artificial intelligence, requirements engineering, ontology engineering, etc. Her current research focuses on ontology-based requirements elicitation and analysis. She has about 60 papers published, including co-authoring one book. Ruqian Lu is a professor of computer science at the Institute of Mathematics, Chinese Academy of Sciences. His research interests include artificial intelligence, knowledge engineering, and knowledge-based software engineering. He designed the "Tian Ma" software systems, which have been widely applied in more than 20 fields, including national defense and the economy. He has won two first-class awards from the Chinese Academy of Sciences and a national second-class prize from the Ministry of Science and Technology. He has also won the sixth Hua Lookeng Prize for Mathematics. Yan Peng received his B.Sc. degree in software from Jilin University, Changchun, China, in 1992. From June 2002 to December 2005, he studied for his M.E. in software engineering at the College of Software Engineering, Graduate School of the Chinese Academy of Sciences, Beijing, China, and was awarded the M.E. degree in 2006. He is currently responsible for CRM (customer relationship management) and BI (business intelligence) projects in the BONG. His research interests include customer relationship management, business intelligence, data mining, software engineering, and requirements engineering.
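A toy model of the annotation idea, assuming a four-valued belief lattice (unknown, believed, disbelieved, conflict) in the spirit of APC; the paper constructs a richer lattice tailored to stakeholders' beliefs, so this sketch only shows how conflicting annotations surface non-canonical requirements.

```python
JOIN = {("believed", "disbelieved"): "conflict",
        ("disbelieved", "believed"): "conflict"}

def join(a, b):
    """Least upper bound in the toy lattice: unknown <= believed/disbelieved <= conflict."""
    if a == b:
        return a
    if "unknown" in (a, b):
        return b if a == "unknown" else a
    return JOIN.get((a, b), "conflict")

def annotate(statements):
    """statements: list of (atom, belief) pairs. Combine beliefs per atom."""
    beliefs = {}
    for atom, b in statements:
        beliefs[atom] = join(beliefs.get(atom, "unknown"), b)
    return beliefs

reqs = [("response_time<2s", "believed"),      # stakeholder A
        ("response_time<2s", "disbelieved")]   # stakeholder B
print(annotate(reqs))   # {'response_time<2s': 'conflict'} -> non-canonical
```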

20.
Advances in wireless and mobile computing environments allow a mobile user to access a wide range of applications. For example, mobile users may want to retrieve data about unfamiliar places or local life styles related to their location. These queries are called location-dependent queries. Furthermore, a mobile user may be interested in getting the query results repeatedly, which is called location-dependent continuous querying. Such a continuous query emanating from a mobile user may retrieve information from a single zone (single-ZQ) or from multiple neighbouring zones (multiple-ZQ). We consider the problem of handling location-dependent continuous queries, with the main emphasis on reducing communication costs and making sure that the user gets correct current-query results. The key contributions of this paper include: (1) proposing a hierarchical database framework (tree architecture and supporting continuous query algorithm) for handling location-dependent continuous queries; (2) analysing the flexibility of this framework for handling queries related to single-ZQ or multiple-ZQ and proposing intelligent selective placement of location-dependent databases; (3) proposing an intelligent selective replication algorithm to facilitate time- and space-efficient processing of location-dependent continuous queries retrieving single-ZQ information; (4) demonstrating, using simulation, the significance of our intelligent selective placement and selective replication model in terms of communication cost and storage constraints, considering various types of queries. Manish Gupta received his B.E. degree in Electrical Engineering from Govindram Sakseria Institute of Technology & Sciences, India, in 1997 and his M.S. degree in Computer Science from the University of Texas at Dallas in 2002. He is currently working toward his Ph.D. degree in the Department of Computer Science at the University of Texas at Dallas. His current research focuses on AI-based software synthesis and testing. His other research interests include mobile computing, aspect-oriented programming and model checking. Manghui Tu received a Bachelor of Science degree from Wuhan University, P.R. China, in 1996, and a Master's degree in Computer Science from the University of Texas at Dallas in 2001. He is currently working toward the Ph.D. degree in the Department of Computer Science at the University of Texas at Dallas. Mr. Tu's research interests include distributed systems, wireless communications, mobile computing, and reliability and performance analysis. His Ph.D. research work focuses on dependable and secure data replication and placement issues in network-centric systems. Latifur R. Khan has been an Assistant Professor in the Computer Science Department at the University of Texas at Dallas since September 2000. He received his Ph.D. and M.S. degrees in Computer Science from the University of Southern California (USC) in August 2000 and December 1996, respectively. He obtained his B.Sc. degree in Computer Science and Engineering from Bangladesh University of Engineering and Technology, Dhaka, Bangladesh, in November 1993. Professor Khan is currently supported by grants from the National Science Foundation (NSF), Texas Instruments and Alcatel, USA, and has been awarded the Sun Equipment Grant. Dr. Khan has more than 50 articles, book chapters and conference papers focusing on the areas of database systems, multimedia information management and data mining in bio-informatics and intrusion detection.
Professor Khan has also served as a referee for database journals and conferences (e.g. IEEE TKDE, KAIS, ADL, VLDB). He is currently serving as a program committee member for the 11th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining (SIGKDD 2005), the ACM 14th Conference on Information and Knowledge Management (CIKM 2005), the International Conference on Database and Expert Systems Applications (DEXA 2005) and the International Conference on Cooperative Information Systems (CoopIS 2005), and is program chair of the ACM SIGKDD International Workshop on Multimedia Data Mining, 2004. Farokh Bastani received the B.Tech. degree in Electrical Engineering from the Indian Institute of Technology, Bombay, and the M.S. and Ph.D. degrees in Computer Science from the University of California, Berkeley. He is currently a Professor of Computer Science at the University of Texas at Dallas. Dr. Bastani's research interests include various aspects of ultra-high-dependability systems, especially automated software synthesis and testing, embedded real-time process-control and telecommunications systems and high-assurance systems engineering. Dr. Bastani was the Editor-in-Chief of the IEEE Transactions on Knowledge and Data Engineering (IEEE-TKDE). He is currently an emeritus EIC of IEEE-TKDE and is on the editorial board of the International Journal of Artificial Intelligence Tools, the International Journal of Knowledge and Information Systems and the Springer-Verlag series on Knowledge and Information Management. He was the program co-chair of the 1997 IEEE Symposium on Reliable Distributed Systems, the 1998 IEEE International Symposium on Software Reliability Engineering, the 1999 IEEE Knowledge and Data Engineering Workshop and the 1999 International Symposium on Autonomous Decentralised Systems, and the program chair of the 1995 IEEE International Conference on Tools with Artificial Intelligence. He has been on the program and steering committees of several conferences and workshops and on the editorial boards of the IEEE Transactions on Software Engineering, IEEE Transactions on Knowledge and Data Engineering and the Oxford University Press High Integrity Systems Journal. I-Ling Yen received her B.S. degree from Tsing-Hua University, Taiwan, and her M.S. and Ph.D. degrees in Computer Science from the University of Houston. She is currently an Associate Professor of Computer Science at the University of Texas at Dallas. Dr. Yen's research interests include fault-tolerant computing, security systems and algorithms, distributed systems, Internet technologies, E-commerce and self-stabilising systems. She has published over 100 technical papers in these research areas and received many research awards from NSF, DOD, NASA and several industry companies. She has served as a Program Committee member for many conferences and as Program Chair/Co-chair for the IEEE Symposium on Application-Specific Software and System Engineering & Technology, the IEEE High Assurance Systems Engineering Symposium, the IEEE International Computer Software and Applications Conference, and the IEEE International Symposium on Autonomous Decentralized Systems. She has also served as a guest editor for a theme issue of IEEE Computer devoted to high-assurance systems.
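A minimal sketch of query routing in such a zone hierarchy, under our own simplifications: databases sit at selected tree nodes, a single-ZQ is answered at the lowest database-holding ancestor of its zone, and a multiple-ZQ at the lowest database-holding node covering all queried zones. The paper's placement and replication policies are not modeled.

```python
class Zone:
    def __init__(self, name, parent=None, has_db=False):
        self.name, self.parent, self.has_db = name, parent, has_db

    def ancestors(self):
        """Yield the zone itself, then its chain of parents up to the root."""
        z = self
        while z:
            yield z
            z = z.parent

def serving_db(zones):
    """Lowest node that has a database and covers every queried zone."""
    common = None
    for z in zones:
        chain = list(z.ancestors())
        common = chain if common is None else [a for a in common if a in chain]
    return next(a for a in common if a.has_db)

root = Zone("city", has_db=True)
east, west = Zone("east", root, has_db=True), Zone("west", root)
e1, e2, w1 = Zone("e1", east), Zone("e2", east), Zone("w1", west)

print(serving_db([e1]).name)        # east  (single-ZQ stays local)
print(serving_db([e1, e2]).name)    # east  (multiple-ZQ within one subtree)
print(serving_db([e1, w1]).name)    # city  (crosses subtrees -> go up)
```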

