Similar Documents
20 similar documents found (search time: 609 ms).
1.
Completeness has long been a difficult problem in dynamic malware analysis. To address the incomplete behavior coverage of existing dynamic analysis methods, this paper proposes an environment-recognition-oriented completeness analysis method for malicious code: data-flow information gathered during execution is used to identify environment-sensitive branch points, and execution environments capable of triggering hidden behaviors are then constructed, improving the completeness of behavioral analysis. Results on 50 malware samples show that the method effectively shortens analysis time, captures more comprehensive behavior information, and improves both analysis efficiency and completeness.
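As a rough illustration of the idea, the sketch below (hypothetical trace format, instruction names, and environment APIs; not the paper's implementation) propagates taint from environment queries through a recorded trace and flags conditional branches whose outcome depends on the environment, i.e. the points a hidden-behavior-triggering environment would target.

```python
# Minimal sketch (assumed data model): find environment-sensitive branch points
# by propagating taint from environment-query results through a recorded trace.

# A trace entry is (instruction, destination, sources); all names illustrative.
trace = [
    ("call GetSystemTime", "eax", []),          # environment query -> taint source
    ("mov",  "ebx", ["eax"]),
    ("cmp",  "flags", ["ebx", "0x07E0"]),
    ("jz",   None,  ["flags"]),                 # branch depends on environment
    ("mov",  "ecx", ["0x1"]),
    ("cmp",  "flags", ["ecx", "0x1"]),
    ("jnz",  None,  ["flags"]),                 # branch independent of environment
]

ENV_QUERIES = ("GetSystemTime", "GetLocaleInfo", "gethostname")

def sensitive_branches(trace):
    tainted = set()
    hits = []
    for i, (insn, dst, srcs) in enumerate(trace):
        if insn.startswith("call") and any(q in insn for q in ENV_QUERIES):
            tainted.add(dst)                    # result of an environment query
            continue
        src_tainted = any(s in tainted for s in srcs)
        if insn.startswith("j"):
            if src_tainted:
                hits.append(i)                  # conditional jump on tainted data
        elif dst is not None:
            # propagate or kill taint through ordinary data flow
            (tainted.add if src_tainted else tainted.discard)(dst)
    return hits

print(sensitive_branches(trace))  # -> [3]: the time-dependent branch
```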

2.
This paper surveys mainstream malware analysis methods at home and abroad, examines the main difficulties currently facing malware detection, and proposes an automated functional-module segmentation method for binary malware tailored to the industrial Internet. The method performs dynamic analysis with a hidden-Markov-model-based algorithm for automatically partitioning functional modules and supports homology (same-origin) determination of malware, overcoming the time-consuming, labor-intensive, and coarse-grained nature of traditional malware analysis. A prototype system demonstrates automated segmentation and comparative validation on multiple types of cross-platform malware.
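A minimal sketch of the segmentation step, assuming toy per-instruction features and the third-party hmmlearn package (the paper's actual feature set and HMM formulation are not specified here): fit an HMM over the trace and cut modules wherever the decoded hidden state changes.

```python
# Minimal sketch: segment a dynamic instruction trace into functional modules
# by fitting an HMM and cutting the trace where the hidden state switches.
import numpy as np
from hmmlearn import hmm   # pip install hmmlearn

# Toy per-instruction feature vectors (e.g., opcode class, operand count);
# real features would come from the dynamic trace of the binary.
rng = np.random.default_rng(0)
trace_features = np.vstack([
    rng.normal(0.0, 0.3, size=(40, 2)),   # behaves like module A
    rng.normal(3.0, 0.3, size=(40, 2)),   # behaves like module B
    rng.normal(0.0, 0.3, size=(40, 2)),   # module A again
])

model = hmm.GaussianHMM(n_components=2, covariance_type="diag",
                        n_iter=100, random_state=0)
model.fit(trace_features)
states = model.predict(trace_features)

# A module boundary is placed wherever the decoded hidden state changes.
boundaries = [i for i in range(1, len(states)) if states[i] != states[i - 1]]
print("module boundaries at trace positions:", boundaries)
```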

3.
An Effective Analysis Method for Cisco IOS Image Injection Attacks
For Cisco IOS image injection attacks, a virtualization-based malware analysis method is proposed. Building on a study of virtualization technology, the virtualized analysis platform CDAP (Cisco dynamic analysis platform) is designed and implemented to provide a runtime environment for the IOS system; on this basis, instruction interception and filtering together with data tracking are used to analyze malicious code injected into IOS images. The method can analyze Cisco IOS images of multiple models and versions that have suffered injection attacks. Experimental results demonstrate its effectiveness.

4.
Using the concept of a reference monitor and the capabilities of a virtual machine monitor, a new method for protecting kernel integrity is proposed. A reference-monitor module is added to the virtual machine monitor, turning it into a reference monitor; the guest operating system kernel runs on top of it in non-privileged mode, so that modifications to certain resources must be validated by the reference monitor running in privileged mode, thereby preventing malicious code from modifying the kernel. Unlike traditional defenses against malicious code, which can only detect that kernel integrity has already been violated, this approach blocks malicious modifications to the kernel in the first place.

5.
With the continuing development of network and artificial intelligence technologies, malicious code poses a growing threat to cyberspace security and, in turn, to the economy and national security. The exponential growth in the number of malicious programs has greatly increased the analysis workload, and traditional detection approaches struggle with today's increasingly complex cyberspace environment. This paper proposes a malware visualization detection method based on deep transfer learning: malicious code is rendered as images using computer vision techniques, and deep transfer learning combined with object detection is used to detect and classify characteristic malware fragments. Experimental results show that this visualization-based approach outperforms traditional malware classification methods in detection accuracy, detection speed, and recognition capability.
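A common way to realize the visualization step, sketched below with an assumed file path and image width (the paper's exact rendering and detection pipeline may differ): map the raw bytes of a binary to a grayscale image that a pretrained vision model could later be fine-tuned on.

```python
# Minimal sketch: render a malware binary as a grayscale image so that a
# pretrained vision model can be fine-tuned on the resulting pictures.
import numpy as np
from PIL import Image

def binary_to_image(path, width=256):
    """Map raw bytes to pixel rows of a fixed width; zero-pad the last row."""
    data = np.frombuffer(open(path, "rb").read(), dtype=np.uint8)
    rows = -(-len(data) // width)                      # ceiling division
    padded = np.zeros(rows * width, dtype=np.uint8)
    padded[:len(data)] = data
    return Image.fromarray(padded.reshape(rows, width), mode="L")

# Example (hypothetical file name): save a sample as a PNG for a CNN to classify.
# binary_to_image("sample.exe").save("sample.png")
```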

6.
Research on a Trusted-Computing-Based Malware Defense Mechanism
Following the idea of transitive trust in the TCG specifications, a malware defense mechanism is proposed: integrity measurement is applied to every object before it is executed to prevent malware from spreading, and execution permissions on objects are strictly controlled to prevent malware from running, slowing its propagation and limiting the scope of damage, so that system integrity is preserved. The defense mechanism is designed and implemented using trusted computing technology.
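A minimal sketch of execution-time integrity measurement in this spirit (illustrative whitelist and file path; a real system would anchor measurements in a TPM rather than a Python dictionary):

```python
# Minimal sketch: measure an object's integrity before execution by hashing it
# and checking the digest against a whitelist of known-good measurements.
import hashlib

KNOWN_GOOD = {
    # digest of an approved object -> label (placeholder entry: SHA-256 of empty data)
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855": "empty-file",
}

def measure(path):
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

def may_execute(path):
    """Allow execution only if the measurement matches a known-good digest."""
    return measure(path) in KNOWN_GOOD
```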

7.
With the growth of the mobile Internet, malicious code targeting the Android platform has increased sharply. Existing Android malware analysis methods mostly focus on feature-based detection, lack a unified and systematic methodology, and rarely address malware classification. Against this background, the concept of malware genes is proposed, analyzing malicious code through fragments that carry functional information. Based on the characteristics of Android software, genes are extracted separately from the code section and the resource section, with code-section genes formalized using use-def chains. Detection and classification frameworks built on malware genes are then proposed, with a support vector machine learning over the genes; they achieve high detection and classification accuracy, including a detection recall of 98.37%, validating the role of malware genes in homology analysis.
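As a toy illustration of the classification step, assuming a bag-of-genes text representation and scikit-learn (the gene names below are invented, not the paper's):

```python
# Minimal sketch: represent each APK by a bag-of-genes vector and train an SVM
# detector over the gene features.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.svm import LinearSVC
from sklearn.pipeline import make_pipeline

# Each sample is a space-separated list of gene identifiers (illustrative).
samples = [
    "gene_send_sms gene_read_contacts gene_http_post",   # malicious
    "gene_encrypt_files gene_http_post gene_hide_icon",  # malicious
    "gene_show_ui gene_read_prefs",                      # benign
    "gene_show_ui gene_http_get gene_read_prefs",        # benign
]
labels = [1, 1, 0, 0]

detector = make_pipeline(CountVectorizer(token_pattern=r"\S+"), LinearSVC())
detector.fit(samples, labels)
print(detector.predict(["gene_send_sms gene_hide_icon"]))  # expected: [1]
```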

8.
A malware defense model for open system environments is proposed. The system is divided into a trusted domain and an untrusted domain: the trusted domain consists of labeled objects and authorized subjects, the untrusted domain of unlabeled objects and unauthorized subjects. To confine low-integrity information to the untrusted domain and guard the trusted domain against infiltration and attack by malicious code, subject authorization rules, object access rules, and subject communication rules are defined. To let the trusted domain exchange information safely with the outside world, a trusted integrity component is introduced, consisting of a security-check component and a trust-promotion component: the former checks the security of every object about to enter the trusted domain, and the latter moves objects judged safe into the trusted domain and raises their integrity level, improving system availability without compromising security.
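One possible reading of the model's rules, sketched with invented object attributes and a placeholder security check (not the paper's formal definitions):

```python
# Minimal sketch: objects carry an integrity level and a domain tag; a
# trusted-domain subject may only use trusted-domain objects, and an untrusted
# object is promoted only after it passes a security check.
from dataclasses import dataclass

@dataclass
class Obj:
    name: str
    integrity: int      # higher = more trusted
    trusted: bool       # which domain the object currently belongs to

def security_check(obj):
    # Placeholder; a real system would scan and verify the object here.
    return obj.integrity >= 0

def promote(obj, target_level=1):
    """Trust-promotion component: admit a checked object into the trusted domain."""
    if not obj.trusted and security_check(obj):
        obj.trusted = True
        obj.integrity = max(obj.integrity, target_level)
    return obj.trusted

def may_access(subject_trusted, obj):
    """Object access rule: trusted subjects only touch trusted-domain objects."""
    return obj.trusted if subject_trusted else True
```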

9.
Modeling and analyzing malicious code on the Android platform is a major focus of current mobile-terminal security research. After summarizing, classifying, and extracting the behaviors of common malware and formally describing those behaviors, a colored Petri net (CPN) based malware modeling method is proposed that can describe the whole process of malicious code from installation and loading to malicious execution. Finally, the Bean Bot malware is modeled, and the CPN Tools simulator is used to analyze properties of the model such as reachability and boundedness. Experiments show that the method accurately captures the runtime process of malicious code and supports deeper analysis of its mechanisms.
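A stripped-down sketch of the life-cycle model as a plain place/transition net (colors, guards, and the CPN Tools analyses are omitted; place and transition names are illustrative):

```python
# Minimal sketch: model "install -> load -> malicious execution" as transitions
# that fire when their input places hold tokens.
marking = {"apk_present": 1, "installed": 0, "loaded": 0, "executed": 0}

# transition name -> (consumed places, produced places)
transitions = {
    "install": (["apk_present"], ["installed"]),
    "load":    (["installed"],   ["loaded"]),
    "execute": (["loaded"],      ["executed"]),
}

def fire(name):
    inputs, outputs = transitions[name]
    if all(marking[p] > 0 for p in inputs):        # enabled?
        for p in inputs:
            marking[p] -= 1
        for p in outputs:
            marking[p] += 1
        return True
    return False

for t in ["install", "load", "execute"]:
    fire(t)
print(marking)  # -> {'apk_present': 0, 'installed': 0, 'loaded': 0, 'executed': 1}
```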

10.
Malware classification is a feature-based analysis method that automatically assigns malicious code to family categories. Fusing and deeply processing multi-dimensional malware features is both a development trend and a difficult problem in malware classification research. This paper proposes a high-dimensional feature fusion method for malware classification: static binary-file features and disassembly features are extracted, the locality-sensitive idea of SimHash is borrowed to fuse and process the multi-dimensional features, and a typical machine learning method is then trained on the fused feature vectors. Experimental results and analysis show that the method suits malware classification scenarios with high feature dimensionality but few samples, and that it improves the time performance of classification learning.
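A generic SimHash sketch (standard technique, with assumed feature names and weights; not the paper's exact fusion scheme) showing how weighted high-dimensional features collapse into a fixed-size fingerprint for a downstream classifier:

```python
# Minimal sketch: fold a weighted feature set into a fixed-size SimHash
# fingerprint whose bits can serve as a compact fused vector.
import hashlib

def simhash(features, bits=64):
    """features: dict mapping feature string -> weight."""
    v = [0.0] * bits
    for feat, weight in features.items():
        h = int(hashlib.md5(feat.encode()).hexdigest(), 16)
        for i in range(bits):
            v[i] += weight if (h >> i) & 1 else -weight
    return sum(1 << i for i in range(bits) if v[i] > 0)

sample = {"opcode:push": 3.0, "import:CreateRemoteThread": 5.0, "section:.rsrc": 1.0}
print(f"{simhash(sample):016x}")   # 64-bit fused fingerprint
```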

11.
Many software engineering applications require points-to analysis. These client applications range from optimizing compilers to integrated program development environments (IDEs) and from testing environments to reverse-engineering tools. Moreover, software engineering applications used in an edit-compile cycle need points-to analysis to be fast and precise. In this article, we present a new context- and flow-sensitive approach to points-to analysis where calling contexts are distinguished by the points-to sets analyzed for their call target expressions. Compared to other well-known context-sensitive techniques it is faster in practice, on average, twice as fast as the call string approach and an order of magnitude faster than the object-sensitive technique. In fact, it proves to be only marginally slower than a context-insensitive baseline analysis. At the same time, it provides higher precision than the call string technique and is similar in precision to the object-sensitive technique. We confirm these statements with experiments using a number of abstract precision metrics and a concrete client application: escape analysis.

12.
Source code analysis is a very important means of analyzing software security defects. This paper analyzes the techniques and evolution of source code analysis tools and concludes with a summary of the theory and practice of source code analysis.

13.
Alias analysis is very important for data-flow analysis, program optimization, and the implementation of analysis tools. This paper proposes a demand-driven, flow-insensitive algorithm for the pointer alias problem. By constructing a program expression graph (PEG), the alias question is turned into deciding whether two pointer nodes are connected; unlike traditional alias analyses, the approach needs neither to build alias sets nor to intersect them, which improves the efficiency of pointer alias analysis.
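A simplified sketch of the demand-driven query (toy assignment edges rather than a real PEG): build an undirected graph from pointer assignments and answer may-alias queries by checking connectivity.

```python
# Minimal sketch: turn pointer assignments into undirected edges between
# expression nodes and answer an alias query on demand by reachability.
from collections import defaultdict, deque

# e.g. from the statements: p = &x; q = p; r = &y
edges = [("p", "&x"), ("q", "p"), ("r", "&y")]

graph = defaultdict(set)
for a, b in edges:
    graph[a].add(b)
    graph[b].add(a)

def may_alias(u, v):
    """Demand-driven query: BFS from u, succeed if v is reachable."""
    seen, frontier = {u}, deque([u])
    while frontier:
        n = frontier.popleft()
        if n == v:
            return True
        for m in graph[n]:
            if m not in seen:
                seen.add(m)
                frontier.append(m)
    return False

print(may_alias("q", "&x"))   # True: q and p both reach &x
print(may_alias("q", "r"))    # False: no connection through assignments
```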

14.
《Ergonomics》2012,55(11):1787-1800
The role of cognitively oriented tasks in the workplace continues to increase as automation of physical task components advances. Difficulties in automating the operator's cognitive processes have placed a renewed emphasis on the human component in advanced manufacturing systems. While traditional task analysis techniques have made significant contributions to improving productivity when important task elements are visually observable, their focus on manual task procedures makes them less effective for cognitively oriented activities. This research has made a first attempt at integrating techniques from several disciplines to develop a cognitive task analysis methodology. The utility of this combined approach is examined for a new system being tested in the United States Postal Service. This task requires operators to encode, via a keyboard, addresses presented on a video display terminal. Results support the hypothesis that, for cognitively oriented tasks, a consensus based analysis technique (the Position Analysis Questionnaire) can be significantly improved by including data from task analysis, provided the methodology is suitable for identifying non-physical task components.

15.
M. H. Williams 《Software》1982,12(5):487-491
The researcher who knows little about computers but wants to conduct a survey and analyse the results by computer can land himself in some difficulty if he does not appreciate some of the problems of computerization. This paper describes a system which is designed to aid such a person by providing assistance with the design of the questionnaire, the capturing of the data and the final analyses.

16.
The size of today's programs continues to grow, as does the number of bugs they contain. Testing alone is rarely able to flush out all bugs, and many lurk in difficult-to-test corner cases. An important alternative is static analysis, in which correctness properties of a program are checked without running it. While it cannot catch all errors, static analysis can catch many subtle problems that testing would miss. We propose a new space of abstractions for pointer analysis, an important component of static analysis for C and similar languages. We identify two main components of any abstraction, how to model statement order and how to model conditionals, then present a new model of programs that enables us to explore different abstractions in this space. Our assign-fetch graph represents reads and writes to memory instead of traditional points-to relations and leads to concise function summaries that can be used in any context. Its flexibility supports many new analysis techniques with different trade-offs between precision and speed. We present the details of our abstraction space, explain where existing algorithms fit, describe a variety of new analysis algorithms based on our assign-fetch graphs, and finally present experimental results that show our flow-aware abstraction for statement ordering both runs faster and produces more precise results than traditional flow-insensitive analysis.

17.
Focusing on 15 sub-provincial cities, 42 indicators are selected, including the share of primary industry, population density, per-capita green space, landscaped area, number of hospitals, municipal construction area, and local fiscal revenue. Using SPSS as the computing tool, factor analysis is applied to simplify the evaluation indicators: the correlation matrix is computed to judge the feasibility of factor analysis, factor loadings are obtained by principal component analysis, the factors are rotated to yield more meaningful interpretations, factor scores are computed, and the results are used to calculate Mi...
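A rough equivalent of the described workflow using scikit-learn instead of SPSS, with random stand-in data for the 42 real indicators (the rotation argument requires scikit-learn 0.24 or later):

```python
# Minimal sketch: standardize the indicator matrix, fit a factor-analysis model
# with varimax rotation, and compute factor scores for each city.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
X = rng.normal(size=(15, 6))        # 15 cities x 6 illustrative indicators

X_std = StandardScaler().fit_transform(X)
fa = FactorAnalysis(n_components=2, rotation="varimax")  # needs sklearn >= 0.24
scores = fa.fit_transform(X_std)    # factor scores, one row per city
loadings = fa.components_.T         # indicator-by-factor loadings

print("loadings shape:", loadings.shape)        # (6, 2)
print("first city's factor scores:", scores[0])
```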

18.
A New Method for Global Data-Flow Analysis

19.
Cost analysis statically approximates the cost of programs in terms of their input data size. This paper presents, to the best of our knowledge, the first approach to the automatic cost analysis of object-oriented bytecode programs. In languages such as Java and C#, analyzing bytecode has a much wider application area than analyzing source code since the latter is often not available. Cost analysis in this context has to consider, among others, dynamic dispatch, jumps, the operand stack, and the heap. Our method takes a bytecode program and a cost model specifying the resource of interest, and generates cost relations which approximate the execution cost of the program with respect to that resource. We report on COSTA, an implementation for Java bytecode which can obtain upper bounds on cost for a large class of programs and complexity classes. Our basic techniques can be directly applied to infer cost relations for other object-oriented imperative languages, not necessarily in bytecode form.

20.
易定 《微机发展》2006,16(9):112-114
Data analysis is the process of discovering hidden information or knowledge in massive data. Based on a data analysis system that assists police in solving cases, the requirements and implementation of data analysis tasks are studied in depth. The proposed approach is to first plan the analysis and refine the analysis functions, then use two complementary means, multi-perspective data pivoting and intelligent analysis, so that the system offers complete analysis capabilities from micro and macro, quantitative and qualitative angles. The study offers general guidance on developing data analysis systems of practical value.

