Inference of mass ontology based on cloud computing
Cite this article: QU Zhen-xin, YU Chuan-ming. Inference of mass ontology based on cloud computing[J]. Journal of Computer Applications, 2011, 31(12): 3324-3326. DOI: 10.3724/SP.J.1087.2011.03324
Authors: QU Zhen-xin  YU Chuan-ming
Affiliation: School of Information and Safety Engineering, Zhongnan University of Economics and Law, Wuhan 430073, China
Funding: National Natural Science Foundation of China; Fundamental Research Funds for the Central Universities
Abstract: To solve the problem of inference over mass ontology, a new algorithm was proposed on top of a cloud computing platform. The ontology schema was transformed into a graph structure and a corresponding inference strategy was designed. With Map/Reduce as the computing model, the inference rules were rewritten and an inference algorithm was designed: inference was completed in a single iteration during the Map phase, and duplicate triples were eliminated during the Reduce phase. This solves the inference problem for mass Resource Description Framework Schema (RDFS) ontologies within reasonable time. The experimental results show that inference over one hundred million triples takes less than four minutes, which demonstrates that the algorithm is effective.

Keywords: cloud computing   mass   ontology   inference
Received: 2011-06-10
Revised: 2011-07-20

This article has been indexed by CNKI, Wanfang Data, and other databases.
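Note: the abstract only outlines the Map/Reduce inference scheme, so the following minimal Python sketch illustrates that idea rather than the authors' implementation. It assumes the RDFS schema is small enough to be held in memory by every mapper (with its subClassOf/subPropertyOf closure pre-computed), applies a subset of the RDFS rules (rdfs2, rdfs3, rdfs7, rdfs9) to each instance triple in a single Map pass without chaining rule applications, and uses the Reduce step only to eliminate duplicate triples. All identifiers and the toy data are hypothetical.

# Illustrative sketch (not the authors' code): an in-memory Map/Reduce simulation
# of the strategy described in the abstract.
from collections import defaultdict

RDF_TYPE = "rdf:type"

def load_schema(schema_triples):
    """Index the schema as adjacency maps (the 'graph structure').
    The transitive closure of subClassOf/subPropertyOf is assumed
    to be pre-computed, so one Map pass suffices."""
    sub_class = defaultdict(set)   # class -> its super-classes
    sub_prop = defaultdict(set)    # property -> its super-properties
    domains = defaultdict(set)     # property -> domain classes
    ranges = defaultdict(set)      # property -> range classes
    for s, p, o in schema_triples:
        if p == "rdfs:subClassOf":
            sub_class[s].add(o)
        elif p == "rdfs:subPropertyOf":
            sub_prop[s].add(o)
        elif p == "rdfs:domain":
            domains[s].add(o)
        elif p == "rdfs:range":
            ranges[s].add(o)
    return sub_class, sub_prop, domains, ranges

def map_phase(triple, schema):
    """Map step: emit the input triple plus everything a subset of the
    RDFS rules (rdfs2, rdfs3, rdfs7, rdfs9) derives from it."""
    sub_class, sub_prop, domains, ranges = schema
    s, p, o = triple
    yield (s, p, o)                          # keep the input triple
    if p == RDF_TYPE:
        for c in sub_class.get(o, ()):       # rdfs9: propagate types upward
            yield (s, RDF_TYPE, c)
    else:
        for q in sub_prop.get(p, ()):        # rdfs7: super-property triples
            yield (s, q, o)
        for c in domains.get(p, ()):         # rdfs2: type the subject
            yield (s, RDF_TYPE, c)
        for c in ranges.get(p, ()):          # rdfs3: type the object
            yield (o, RDF_TYPE, c)

def reduce_phase(mapped_triples):
    """Reduce step: keep each derived triple once (duplicate elimination)."""
    return set(mapped_triples)

if __name__ == "__main__":
    schema = load_schema([
        ("ex:Student", "rdfs:subClassOf", "ex:Person"),
        ("ex:enrolledIn", "rdfs:domain", "ex:Student"),
    ])
    instance_triples = [
        ("ex:alice", "ex:enrolledIn", "ex:cs101"),
        ("ex:alice", RDF_TYPE, "ex:Student"),
    ]
    closure = reduce_phase(t for i in instance_triples
                             for t in map_phase(i, schema))
    for t in sorted(closure):
        print(t)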