Research and Implementation of Web Vulnerability Detection Technology Based on Rule Base and Web Crawler
Cite this article: DU Lei, XIN Yang. Research and Implementation of Web Vulnerability Detection Technology Based on Rule Base and Web Crawler [J]. Netinfo Security, 2014(10): 38-43.
Authors: DU Lei  XIN Yang
Affiliation: Information Security Center, Beijing University of Posts and Telecommunications, Beijing 100876, China
Funding: National Natural Science Foundation of China [61121061, 61161140320]; Fundamental Research Funds for the Central Universities
Abstract: Web applications provide services over the HTTP or HTTPS protocol and have become one of the mainstream directions of software development, but a variety of security vulnerabilities in Web applications, such as SQL injection and XSS, have also been exposed, causing significant economic losses. To address Web site security, this paper studies common Web vulnerabilities such as SQL injection and XSS and proposes a new detection method: a technique that combines a vulnerability rule base with a Web crawler to detect SQL injection and XSS. The crawler traverses the site over HTTP, following URL links to collect page information. For each collected link, it reads the rules in the rule base one by one, constructs test links capable of exposing vulnerabilities, and automatically issues GET and POST requests against them; this repeats until every rule in the rule base has been read and applied. The crawler and regular expressions are then used to gather further page information, and the above process is repeated, thereby detecting SQL injection and XSS vulnerabilities. This method enriches the means of Web vulnerability detection, increases the number of pages that can be checked, and covers both HTTP GET and HTTP POST requests. Finally, experiments verify the feasibility of applying this technique to Web site security testing: it can accurately detect whether a site contains SQL injection or XSS vulnerabilities.
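As a rough illustration of the link-construction step described above, the following sketch substitutes rule-base payloads into the query parameters of a crawled URL to produce candidate test links for GET probing. The rule strings, function name, and rule-base format are hypothetical; the paper does not specify its actual rule syntax.

```python
from urllib.parse import urlsplit, parse_qsl, urlencode, urlunsplit

# Hypothetical rule base: each rule is a raw payload string to be
# substituted into a page parameter (illustrative values only).
SQLI_RULES = ["' OR '1'='1", "1' AND '1'='2"]

def build_test_urls(page_url, rules):
    """For each query parameter of a crawled URL, substitute every rule
    payload in turn, yielding candidate URLs for vulnerability probing."""
    scheme, netloc, path, query, frag = urlsplit(page_url)
    params = parse_qsl(query, keep_blank_values=True)
    urls = []
    for i, (name, _value) in enumerate(params):
        for payload in rules:
            mutated = list(params)
            mutated[i] = (name, payload)  # inject payload into one parameter
            urls.append(urlunsplit((scheme, netloc, path,
                                    urlencode(mutated), frag)))
    return urls
```

A crawler would feed every discovered link through such a function and issue GET (and, for form targets, POST) requests against the resulting URLs.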

Keywords: Web crawler  SQL injection  XSS vulnerability  rule base

Research and Implementation of Web Vulnerability Detection Technology Based on Rule Base and Web Crawler
DU Lei,XIN Yang.Research and Implementation of Web Vulnerability Detection Technology Based on Rule Base and Web Crawler[J].Netinfo Security,2014(10):38-43.
Authors:DU Lei  XIN Yang
Affiliation: (Information Security Center, Beijing University of Posts and Telecommunications, Beijing 100876, China)
Abstract: Web applications are applications that provide services over the HTTP or HTTPS protocol, and Web development has become one of the mainstream trends in software engineering; however, a variety of security vulnerabilities in Web applications, such as SQL injection and XSS, have gradually been exposed, causing considerable economic losses. To address Web site security, this paper studies common Web vulnerabilities such as SQL injection and XSS and presents a novel detection method that combines a vulnerability rule base with a Web crawler. The crawler uses the HTTP protocol and URL links to traverse a site and acquire page information. For each obtained link, it reads the rules in the rule base one by one, constructs test links capable of exposing vulnerabilities, and automatically issues GET and POST requests against them; this process repeats until every rule in the rule base has been read and applied. The crawler and regular expressions are then used to gather further page information, and the process above is repeated, achieving detection of SQL injection and XSS vulnerabilities. This method enriches the means of Web vulnerability detection, increases the number of pages tested, and covers both HTTP GET and HTTP POST requests. Finally, experiments verify the feasibility of using this technique for Web site security testing: it can accurately detect whether a site contains SQL injection or XSS vulnerabilities.
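The response-analysis step the abstract mentions, using regular expressions on crawled pages, could be sketched as below. The error-message signatures and the reflection check are common heuristics for this kind of scanner; they are illustrative assumptions, not the paper's actual signature set.

```python
import re

# Hypothetical signature set: regexes matching database error messages
# that commonly leak into responses when a SQL injection probe succeeds.
SQL_ERROR_SIGNATURES = [
    re.compile(r"You have an error in your SQL syntax", re.I),  # MySQL
    re.compile(r"unclosed quotation mark", re.I),               # SQL Server
    re.compile(r"ORA-\d{5}", re.I),                             # Oracle
]

def classify_response(body, xss_payload=None):
    """Return the suspected vulnerability types for one HTTP response body."""
    findings = []
    if any(sig.search(body) for sig in SQL_ERROR_SIGNATURES):
        findings.append("sql_injection")
    # XSS is suspected when the injected probe is reflected back unescaped.
    if xss_payload and xss_payload in body:
        findings.append("xss")
    return findings
```

In a full scanner, this check would run on the response to every test request generated from the rule base, and positive findings would be recorded against the originating URL and parameter.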
Keywords:Web Crawler  SQL injection  XSS vulnerabilities  rule base
This article is indexed in databases including VIP (维普) and Wanfang Data (万方数据).