Similar literature (20 results)
1.
Secure hash functions play a fundamental role in cryptographic and Web applications. They are mainly used, within digital signature schemes, to verify the integrity and authenticity of information. In this paper, we propose a simple and efficient keyed hash function based on a single chaotic map. Theoretical and simulation results demonstrate that the suggested scheme satisfies all cryptographic requirements of secure keyed hash functions such as strong confusion and diffusion capability, good collision resistance, high sensitivity to message and secret key, etc. Furthermore, it is fast and can be easily implemented through software or hardware. Moreover, the length of the hash value is flexible without any impact on the algorithm. This function is shown to have better statistical performance than many existing hash functions. Thus, the suggested hash function seems to be a good candidate as a secure keyed hash function for use in cryptographic applications.
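For orientation, here is a minimal sketch of what a keyed hash driven by a single chaotic map can look like. The logistic map, the byte-injection rule, and the thresholded bit extraction below are illustrative assumptions, not the scheme proposed in the paper:

```python
def chaotic_keyed_hash(message: bytes, key: float, n_bits: int = 128) -> str:
    """Illustrative keyed hash driven by the logistic map x -> 4*x*(1-x).

    The secret key seeds the chaotic state, every message byte perturbs the
    state before further iterations, and hash bits are thresholded from the
    trajectory. Generic sketch only, not the algorithm from the paper.
    """
    x = key % 1.0 or 0.232323              # keep the secret seed inside (0, 1)
    for b in message:
        x = (x + (b + 1) / 257.0) % 1.0    # inject the message byte
        for _ in range(16):                # extra iterations for diffusion
            x = 4.0 * x * (1.0 - x)
    bits = []
    while len(bits) < n_bits:
        x = 4.0 * x * (1.0 - x)
        bits.append(1 if x > 0.5 else 0)   # one hash bit per iteration
    return "".join(map(str, bits))

print(chaotic_keyed_hash(b"hello world", key=0.654321))
print(chaotic_keyed_hash(b"hello worle", key=0.654321))  # one-byte change
```

Message and key sensitivity in such schemes come from the exponential divergence of nearby chaotic trajectories; the flexible digest length corresponds to how many bits are drawn from the trajectory.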

2.
Li  Yantao  Li  Xiang  Liu  Xiangwei 《Neural computing & applications》2017,28(6):1405-1415

We present a fast and efficient hash algorithm based on a generalized chaotic mapping with variable parameters in this paper. We first define a generalized chaotic mapping by utilizing a piecewise linear chaotic map and trigonometric functions. Then, we convert the arbitrary-length message into the corresponding ASCII values and perform 6-unit iterations with variable parameters and message values based on the generalized chaotic mapping. The final hash value is obtained by cascading extracted bits from iteration state values. We extensively evaluate the proposed algorithm in terms of distribution of hash value, sensitivity of hash value to the message and secret keys, statistical analysis of diffusion and confusion, analysis of birthday attacks and collision resistance, analysis of secret keys, analysis of speed, and comparison with other algorithms. The results illustrate that the suggested algorithm is fast, efficient, and simple enough, and has good confusion and diffusion capabilities, strong collision resistance, and a high level of security.


3.
In this paper, a novel algorithm for image encryption based on hash function is proposed. In our algorithm, a 512-bit long external secret key is used as the input value of the Salsa20 hash function. First of all, the hash function is modified to generate a key stream which is more suitable for image encryption. Then the final encryption key stream is produced by correlating the key stream and plaintext, resulting in both key sensitivity and plaintext sensitivity. This scheme can achieve high sensitivity, high complexity, and high security through only two rounds of diffusion process. In the first round of diffusion process, an original image is partitioned horizontally to an array which consists of 1,024 sections of size 8 × 8. In the second round, the same operation is applied vertically to the transpose of the obtained array. The main idea of the algorithm is to use the average of image data for encryption. To encrypt each section, the average of other sections is employed. The algorithm uses different averages when encrypting different input images (even with the same sequence based on hash function). This, in turn, will significantly increase the resistance of the cryptosystem against known/chosen-plaintext and differential attacks. It is demonstrated that the 2D correlation coefficients (CC), peak signal-to-noise ratio (PSNR), encryption quality (EQ), entropy, mean absolute error (MAE) and decryption quality can satisfy security and performance requirements (CC <0.002177, PSNR <8.4642, EQ >204.8, entropy >7.9974 and MAE >79.35). The number of pixel change rate (NPCR) analysis has revealed that when only one pixel of the plain-image is modified, almost all of the cipher pixels will change (NPCR >99.6125 %) and the unified average changing intensity is high (UACI >33.458 %). Moreover, our proposed algorithm is very sensitive with respect to small changes (e.g., modification of only one bit) in the external secret key (NPCR >99.65 %, UACI >33.55 %). It is shown that this algorithm yields better security performance in comparison to the results obtained from other algorithms.
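The NPCR and UACI figures quoted above follow their standard definitions; a short sketch of how the two metrics are typically computed for a pair of 8-bit cipher images (generic measurement code, not the paper's implementation):

```python
import numpy as np

def npcr_uaci(c1: np.ndarray, c2: np.ndarray):
    """NPCR/UACI between two 8-bit cipher images of identical shape.

    NPCR: percentage of pixel positions whose values differ.
    UACI: mean absolute pixel difference normalised by 255, as a percentage.
    """
    diff = c1.astype(np.int16) - c2.astype(np.int16)
    npcr = 100.0 * np.count_nonzero(diff) / diff.size
    uaci = 100.0 * np.abs(diff).mean() / 255.0
    return npcr, uaci

# Example with random stand-in "cipher images"; the real test encrypts two
# plain images differing in one pixel (or uses two keys differing in one bit).
c1 = np.random.randint(0, 256, (256, 256), dtype=np.uint8)
c2 = np.random.randint(0, 256, (256, 256), dtype=np.uint8)
print(npcr_uaci(c1, c2))
```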

4.
Combining the advantages of a dual chaotic system with those of traditional hash functions, a new construction for a keyed one-way hash function is proposed. The method couples the tent map and the Logistic chaotic map into a dual chaotic system that generates chaotic sequences; these sequences serve as dynamic parameters, replacing the fixed parameters of conventional hash algorithms in the round-function computation that produces the hash digest. The results show that the proposed method has a large key space, good one-wayness, and strong sensitivity to initial values and keys.

5.
刘冶  潘炎  夏榕楷  刘荻  印鉴 《计算机科学》2016,43(9):39-46, 51
In the era of big data, applying image retrieval to large-scale data is an active research area. In recent years, image hashing algorithms have attracted wide attention in large-scale image retrieval systems because they improve retrieval efficiency while reducing storage. Existing supervised hashing algorithms have shortcomings: mainstream supervised methods rely on hand-crafted image features produced by a feature extractor, and the resulting loss of image information degrades hashing performance and handles semantic similarity in image datasets poorly. With the rise of deep learning on large-scale data, some studies have tried to learn supervised hash functions with deep neural networks, improving hashing quality, but these methods require complex, dataset-specific network designs as well as long training times on large amounts of data, which limits their use on large-scale datasets. To address these problems, a fast image hashing algorithm based on deep convolutional neural networks is proposed. By designing an efficient solver for the optimisation problem and using a large pretrained deep network, the algorithm improves hashing quality while markedly shortening the training time of the complex network. Experiments on several image datasets show that, compared with existing baseline algorithms, the proposed algorithm achieves substantial improvements in both hashing quality and training time.
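As a hedged illustration of the general idea — learning binary codes on top of a large pretrained network instead of hand-crafted features — the sketch below puts a small hashing layer after a pretrained backbone and takes the sign of its output as the code. The ResNet-18 backbone, the 48-bit code length, and the tanh relaxation are placeholder choices, not the architecture or solver proposed in the paper:

```python
import torch
import torch.nn as nn
from torchvision import models

class DeepHashHead(nn.Module):
    """Binary codes from a pretrained backbone plus a small hashing layer.

    Generic sketch: backbone, code length and training objective are
    placeholders rather than the design described in the paper.
    """
    def __init__(self, n_bits: int = 48):
        super().__init__()
        # Pretrained weights are downloaded on first use.
        backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
        self.features = nn.Sequential(*list(backbone.children())[:-1])  # drop fc
        self.hash_fc = nn.Linear(backbone.fc.in_features, n_bits)

    def forward(self, x):
        f = self.features(x).flatten(1)      # pretrained deep features
        return torch.tanh(self.hash_fc(f))   # relaxed codes; sign() at retrieval

model = DeepHashHead()
codes = torch.sign(model(torch.randn(2, 3, 224, 224)))  # +1/-1 hash codes
```

Because only the hashing layer has to be trained from scratch, reusing a pretrained backbone is what keeps training time short relative to designing and training a full network per dataset.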

6.
In 2007, the US National Institute of Standards and Technology (NIST) announced a call for the design of a new cryptographic hash algorithm in response to vulnerabilities like differential attacks identified in existing hash functions, such as MD5 and SHA-1. NIST received many submissions, 51 of which were accepted to the first round. 14 candidates were left in the second round, out of which five candidates have been recently chosen for the final round. An important criterion in the selection process is the SHA-3 hash function security. We identify two important classes of security arguments for the new designs: (1) the possible reductions of the hash function security to the security of its underlying building blocks and (2) arguments against differential attack on building blocks. In this paper, we compare the state of the art provable security reductions for the second round candidates and review arguments and bounds against classes of differential attacks. We discuss all the SHA-3 candidates at a high functional level, analyze, and summarize the security reduction results and bounds against differential attacks. Additionally, we generalize the well-known proof of collision resistance preservation, such that all SHA-3 candidates with a suffix-free padding are covered.

7.
A number of encryption systems work by combining each plaintext bit with a hash function of the last n ciphertext bits. Such systems are self-synchronising in that they recover from ciphertext errors with an error extension of n. We show firstly that if the hash function is a tree function, then the system is vulnerable to a chosen ciphertext attack and, under certain circumstances, to a chosen plaintext attack; secondly, that all hash functions are equivalent to some tree function; thirdly, that whether or not this gives a computable attack on a given algorithm depends on the connectivity of a graph associated with the hash function; and, fourthly, the implications for DES, for RSA key selection, and for algorithm design in general.
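A toy model of the construction analysed here: each plaintext bit is XORed with a keyed hash of the previous n ciphertext bits, so decryption resynchronises after n correct ciphertext bits. The 1-bit hash used below is an arbitrary placeholder, not one of the tree functions discussed in the paper:

```python
def bit_hash(window, key):
    """Placeholder keyed 1-bit hash of the last n ciphertext bits."""
    acc = key
    for i, b in enumerate(window):
        acc ^= b << (i % 8)
        acc = ((acc * 131) + 17) & 0xFF
    return acc & 1

def self_sync_encrypt(plain_bits, key, n=8):
    cipher, window = [], [0] * n              # window of the last n ciphertext bits
    for p in plain_bits:
        c = p ^ bit_hash(window, key)
        cipher.append(c)
        window = window[1:] + [c]
    return cipher

def self_sync_decrypt(cipher_bits, key, n=8):
    plain, window = [], [0] * n
    for c in cipher_bits:
        plain.append(c ^ bit_hash(window, key))
        window = window[1:] + [c]             # feedback comes from the ciphertext
    return plain

bits = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1]
enc = self_sync_encrypt(bits, key=0x5A)
assert self_sync_decrypt(enc, key=0x5A) == bits
```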

8.
Perceptual hash functions are important for video authentication based on digital signatures, verifying the originality and integrity of videos. They derive hashes from the perceptual contents of the videos and are robust against the common content-preserving operations on the videos. The advancements in the field of scalable video coding call for efficient hash functions that are also robust against the temporal, spatial and bit rate scalability features of these coding schemes. This paper presents a new algorithm to extract hashes of scalably coded videos using the 3D discrete wavelet transform. A hash of a video is computed at the group-of-frames level from the spatio-temporal low-pass bands of the wavelet-transformed groups-of-frames. For each group-of-frames, the spatio-temporal low-pass band is divided into perceptual blocks and a hash is derived from the cumulative averages of their averages. Experimental results demonstrate the robustness of the hash function against the scalability features and the common content-preserving operations as well as the sensitivity to the various types of content differences. Two critical properties of the hash function, diffusion and confusion, are also examined.
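A rough sketch of the block-average step described above, using a plain temporal mean as a crude stand-in for the spatio-temporal low-pass band; the actual scheme obtains the band with the 3D DWT and applies its own cumulative-average rule:

```python
import numpy as np

def gof_perceptual_hash(gof: np.ndarray, block: int = 16) -> np.ndarray:
    """Toy hash of a group-of-frames with shape (T, H, W): one bit per block.

    Stand-in for the paper's scheme: averaging over time approximates the
    temporal low-pass band, and each block mean is compared with the running
    (cumulative) mean of the blocks seen so far.
    """
    low = gof.mean(axis=0)                           # crude temporal low-pass
    h, w = low.shape
    means = [low[i:i + block, j:j + block].mean()
             for i in range(0, h - block + 1, block)
             for j in range(0, w - block + 1, block)]
    cum = np.cumsum(means) / np.arange(1, len(means) + 1)
    return (np.array(means) > cum).astype(np.uint8)  # one bit per perceptual block

gof = np.random.rand(8, 64, 64)                      # a group of 8 frames, 64x64
print(gof_perceptual_hash(gof))
```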

9.
To address security problems such as privacy leakage, incomplete information, and data tampering during data aggregation and transmission in crowd-sensing networks, a privacy-preserving data aggregation algorithm based on distributed compressed sensing and hash functions is proposed. First, distributed compressed sensing is used to take sparse measurements of the sensed data and remove redundancy. Second, a one-way hash function is applied to the measurement values, and the resulting hash is padded into the measurements together with unconstrained camouflage data, hiding the true sensed data. Finally, after the sink node removes the camouflage data, it recomputes the hash of the sensed data and verifies data integrity. Simulation results show that the algorithm protects both confidentiality and integrity while greatly reducing communication overhead, and offers strong applicability and scalability in practice.

10.
SHACAL-2, derived from the standard hash algorithm SHA-2, has the longest block length and key length among the three newly selected European block-cipher standards and is considered to offer the highest security strength. The original proposal describes the algorithm in the style of a one-way hash function; this paper instead presents the complete encryption process of SHACAL-2 in the conventional form of a block cipher, points out the algorithm's relatively poor diffusion, gives the correspondence between the block cipher's encryption process and the hash algorithm's compression process, and provides the detailed decryption process with matching encryption/decryption test data as a reference for implementation, supplementing the corresponding round constants. The necessity of the long block and key lengths is examined further, and it is noted that the encryption and decryption structures are not similar. Finally, the four European and American block-cipher standards are compared and analysed.

11.
Construction of a text hash function based on dual chaotic maps
A text hash function algorithm based on the chaotic Logistic map and the skew tent map is proposed. The algorithm partitions the plaintext into blocks and converts them to ASCII codes; each code is used as the number of Logistic-map iterations, and the iterated value then seeds the skew tent map for further iteration, after which a 128-bit hash value is extracted from the generated values according to fixed rules. The algorithm's one-wayness, confusion and diffusion, and collision behaviour are analysed through simulation; theoretical analysis and experiments show that it meets the performance requirements of a hash function.
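A minimal sketch of the iteration scheme just described, with placeholder parameters: each character's ASCII code sets the number of logistic-map iterations, the resulting value seeds a skew tent map, and hash bits are thresholded from the tent trajectory. The paper's exact parameters and 128-bit extraction rule are not reproduced here:

```python
def skew_tent(x: float, p: float = 0.4) -> float:
    """Skew tent map on [0, 1] with break point p."""
    return x / p if x < p else (1.0 - x) / (1.0 - p)

def dual_chaotic_hash(text: str, x0: float = 0.6, n_bits: int = 128) -> str:
    """Toy text hash: ASCII codes drive logistic iterations whose output seeds
    a skew tent map; bits are thresholded from the tent trajectory and folded
    to n_bits. Parameters and extraction rule are illustrative only."""
    bits, x = [], x0
    for ch in text:
        for _ in range(ord(ch) + 1):        # iteration count taken from the character
            x = 4.0 * x * (1.0 - x)         # logistic map
        y = x
        for _ in range(8):
            y = skew_tent(y)
            bits.append(1 if y > 0.5 else 0)
    out = [0] * n_bits                       # fold the bit stream to a fixed length
    for i, b in enumerate(bits):
        out[i % n_bits] ^= b
    return "".join(map(str, out))

print(dual_chaotic_hash("hello"))
```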

12.
Network filtering is a challenging area in high-speed computer networks, mostly because lots of filtering rules are required and there is only a limited time available for matching these rules. Therefore, network filters accelerated by field-programmable gate arrays (FPGAs) are becoming common, where the fast lookup of filtering rules is achieved by the use of hash tables. It is desirable to be able to fill up these tables efficiently, i.e. to achieve a high table-load factor in order to reduce the offline time of the network filter due to rehashing and/or table replacement. A parallel reconfigurable hash function tuned by an evolutionary algorithm (EA) is proposed in this paper for Internet Protocol (IP) address filtering in FPGAs. The EA fine-tunes the reconfigurable hash function for a given set of IP addresses. The experiments demonstrate that the proposed hash function provides high-speed lookup and achieves a higher table-load factor in comparison with conventional solutions.
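A simplified picture of what such tuning optimises: a small parameterised hash over 32-bit IPv4 addresses whose fitness is the table-load factor reached before the first collision. Random search stands in for the evolutionary algorithm here, and the hash form and single-slot table policy are illustrative assumptions:

```python
import random

def param_hash(ip: int, params, table_bits: int = 10) -> int:
    """Parameterised mix of a 32-bit address; `params` is what an EA would tune."""
    m1, m2, r = params
    h = ((ip * m1) ^ ((ip >> r) * m2)) & 0xFFFFFFFF
    return h >> (32 - table_bits)             # index into a 2**table_bits table

def load_factor(ips, params, table_bits: int = 10) -> float:
    """Fraction of a single-slot table filled before the first collision."""
    used, placed = set(), 0
    for ip in ips:
        slot = param_hash(ip, params, table_bits)
        if slot in used:
            break
        used.add(slot)
        placed += 1
    return placed / (1 << table_bits)

ips = [random.getrandbits(32) for _ in range(800)]       # address set to accommodate
candidates = [(random.randrange(1, 1 << 16) | 1,          # odd multipliers
               random.randrange(1, 1 << 16) | 1,
               random.randrange(1, 31)) for _ in range(200)]
best = max(candidates, key=lambda p: load_factor(ips, p))
print(best, load_factor(ips, best))
```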

13.
We consider a two-tier content distribution system for distributing massive content, consisting of an infrastructure content distribution network (CDN) and a large number of ordinary clients. The nodes of the infrastructure network form a structured, distributed-hash-table-based (DHT) peer-to-peer (P2P) network. Each file is first placed in the CDN, and possibly, is replicated among the infrastructure nodes depending on its popularity. In such a system, it is particularly pressing to have proper load-balancing mechanisms to relieve server or network overload. The subject of the paper is popularity-based file replication techniques within the CDN using multiple hash functions. Our strategy is to set aside a large number of hash functions. When the demand for a file exceeds the overall capacity of the current servers, a previously unused hash function is used to obtain a new node ID where the file will be replicated. The central problems are how to choose an unused hash function when replicating a file and how to choose a used hash function when requesting the file. Our solution to the file replication problem is to choose the unused hash function with the smallest index, and our solution to the file request problem is to choose a used hash function uniformly at random. Our main contribution is that we have developed a set of distributed, robust algorithms to implement the above solutions and we have evaluated their performance. In particular, we have analyzed a random binary search algorithm for file request and a random gap removal algorithm for failure recovery.
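A compact sketch of the two rules described above: a family of hash functions is reserved per file, replication uses the smallest-index unused function, and a request picks one of the used functions uniformly at random. In the paper the number of used functions is not known centrally and is discovered with distributed algorithms such as a random binary search; the plain dictionary below exists only to keep the sketch short, and names like node_id and k_max are assumptions:

```python
import hashlib
import random

def node_id(filename: str, i: int) -> str:
    """The i-th hash function for a file: one base hash salted with the index."""
    return hashlib.sha1(f"{filename}#{i}".encode()).hexdigest()

class ReplicaDirectory:
    def __init__(self, k_max: int = 32):
        self.k_max = k_max
        self.used = {}                        # filename -> number of hash functions in use

    def replicate(self, filename: str) -> str:
        """When demand exceeds capacity, use the smallest-index unused hash."""
        k = self.used.get(filename, 1)
        if k >= self.k_max:
            raise RuntimeError("no unused hash functions left")
        self.used[filename] = k + 1
        return node_id(filename, k)           # node that will hold the new replica

    def request(self, filename: str) -> str:
        """A client picks one of the currently used hash functions at random."""
        k = self.used.get(filename, 1)
        return node_id(filename, random.randrange(k))

d = ReplicaDirectory()
d.used["big.iso"] = 1                          # file initially placed with hash 0
print(d.replicate("big.iso"), d.request("big.iso"))
```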

14.
唐燕  闾国年  张红 《计算机应用》2016,36(11):3093-3097
The Advanced Marking Scheme (AMS) is an effective algorithm for IP traceback of distributed denial-of-service (DDoS) attacks, but because it compresses edge addresses with hash functions, AMS suffers from high complexity, poor confidentiality, and a high false-positive rate. To improve traceback efficiency, an AMS algorithm based on multidimensional pseudo-random sequences is designed. On the router side, an edge-sampling matrix implemented entirely in hardware replaces the original hash functions to compress and encode IP addresses; on the victim side, the edge compression codes are combined with an edge-weight computation to output the attack-path graph. In simulations, the multidimensional pseudo-random-sequence AMS performs essentially the same as the original algorithm while effectively reducing false positives and quickly detecting forged paths. The experimental results show that the proposed algorithm offers high confidentiality, fast computation, and strong resistance to attack.

15.
This paper analyses the problem of dual authentication of video clips and watermarks in broadcast monitoring and proposes a highly secure, robust video watermarking algorithm based on the 3D discrete wavelet transform. The algorithm generates a video hash value, signs the copyright information and the hash with a private key, and builds the watermark to be embedded from the resulting signature and hash, achieving dual authentication of both the video clip and the watermark in broadcast monitoring. Theoretical analysis and experimental results show that the algorithm offers good security and robustness for broadcast-monitoring applications.

16.
A GIS spatial index construction algorithm based on dynamic hashing
Building on the construction methods of dynamic hashing and the traditional quadtree spatial index, and combining the strengths of both, this paper proposes a spatial index construction algorithm based on dynamic hashing. Instead of building the index through the traditional quadtree's inefficient recursive comparison of spatial objects, the method constructs the spatial index with computationally efficient binary bit operations and bit comparisons on dynamically extended hash values. Practice shows that the algorithm greatly reduces index construction time, improves efficiency, and has high practical value.
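As a concrete example of indexing by bit operations rather than by recursive comparison of objects, the sketch below uses Z-order (Morton) bit interleaving of grid coordinates as a stand-in for the paper's dynamically extended hash values; keys that share a long common prefix fall into the same quadtree cell:

```python
def interleave_bits(x: int, y: int, levels: int = 16) -> int:
    """Z-order (Morton) key: interleave the bits of grid coordinates x and y.

    Illustrates index construction by bit operations; the paper's dynamic-hash
    extension scheme itself is not reproduced here.
    """
    key = 0
    for i in range(levels):
        key |= ((x >> i) & 1) << (2 * i)
        key |= ((y >> i) & 1) << (2 * i + 1)
    return key

def spatial_key(lon: float, lat: float, levels: int = 16) -> int:
    """Map a coordinate to integer grid cells, then to its interleaved key."""
    gx = int((lon + 180.0) / 360.0 * ((1 << levels) - 1))
    gy = int((lat + 90.0) / 180.0 * ((1 << levels) - 1))
    return interleave_bits(gx, gy, levels)

# Nearby points produce keys with a long shared prefix (same quadtree branch).
print(bin(spatial_key(116.39, 39.91)))
print(bin(spatial_key(116.40, 39.92)))
```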

17.
To overcome the inefficiency and high time cost of traditional image retrieval algorithms, an image retrieval algorithm based on PCA hashing is proposed. Specifically, the original high-dimensional data are first reduced in dimensionality by combining PCA with manifold learning; a hash function and binarisation thresholds are then obtained via a minimum-variance rotation, converting the original data matrix into a matrix of hash codes. Finally, sample similarity is obtained by computing Hamming distances between samples. Experiments on three public datasets show that the proposed hashing algorithm outperforms existing algorithms on multiple evaluation metrics.
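A bare-bones version of this pipeline, leaving out the manifold-learning step and the minimum-variance rotation and simply thresholding centred PCA projections at zero; the descriptor dimensionality and code length below are placeholders:

```python
import numpy as np

def pca_hash_train(X: np.ndarray, n_bits: int = 32):
    """Learn a PCA projection: returns the data mean and top principal directions."""
    mean = X.mean(axis=0)
    _, _, vt = np.linalg.svd(X - mean, full_matrices=False)
    return mean, vt[:n_bits].T                          # d x n_bits projection

def pca_hash_encode(X, mean, proj):
    return ((X - mean) @ proj > 0).astype(np.uint8)     # binarise at zero

def hamming(a, b):
    return np.count_nonzero(a != b, axis=-1)

X = np.random.rand(1000, 128)                 # stand-in for image descriptors
mean, proj = pca_hash_train(X, n_bits=32)
codes = pca_hash_encode(X, mean, proj)
query = pca_hash_encode(X[:1], mean, proj)
top10 = np.argsort(hamming(codes, query))[:10]          # retrieval by Hamming distance
print(top10)
```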

18.
This article proposes a new cryptographic hash function, called RC4-BHF, which is designed to be both fast and secure. This is a new attempt to design a cryptographic hash function based on the RC4 algorithm. Since vulnerabilities have been discovered in many of the existing hash functions, it is beneficial to construct a hash function which has a different internal structure, and RC4-BHF is such a new hash function. Moreover, RC4-BHF is suitable for ultra-low power devices, such as sensor nodes, which are normally equipped with 8-bit processors, where most other hash functions cannot be implemented efficiently or are not applicable. RC4-BHF can run much faster compared to the existing well-known hash functions and is exceptionally fast on 8-bit processors.

19.
A fast and effective method for stitching video image sequences
To address the slow processing speed of existing methods for stitching video image sequences, a fast and effective stitching algorithm based on SURF features is proposed. The algorithm replaces the traditional SIFT operator with the more robust and computationally efficient SURF operator for feature-point extraction. For feature matching, a matching algorithm based on hash mapping and a bidirectional nearest-neighbour distance ratio is proposed, which quickly and effectively establishes correspondences between feature points. To remove mismatches caused by moving objects, random sample consensus (RANSAC) is used to eliminate outliers and keep the matches valid; the global motion parameters between video frames are then estimated by least squares, and the frames are finally stitched into a panorama. Experimental results show that the stitching algorithm is fast, effective, and robust, and has high practical value.
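A plain-NumPy sketch of the bidirectional nearest-neighbour distance-ratio test mentioned above: a correspondence is kept only when it passes the ratio test in both directions. SURF extraction and the hash-bucketed candidate lookup are omitted, and the 0.7 ratio is an assumed threshold:

```python
import numpy as np

def ratio_matches(d1: np.ndarray, d2: np.ndarray, ratio: float = 0.7):
    """Index pairs (i, j) whose descriptors pass the distance-ratio test both ways.

    d1 and d2 hold one descriptor per row (e.g. 64-D SURF vectors).
    """
    dists = np.linalg.norm(d1[:, None, :] - d2[None, :, :], axis=2)

    def one_way(d):
        order = np.argsort(d, axis=1)
        best, second = order[:, 0], order[:, 1]
        rows = np.arange(len(d))
        ok = d[rows, best] < ratio * d[rows, second]    # ratio test
        return {(int(i), int(best[i])) for i in np.where(ok)[0]}

    forward = one_way(dists)                            # image 1 -> image 2
    backward = {(i, j) for (j, i) in one_way(dists.T)}  # image 2 -> image 1, swapped
    return sorted(forward & backward)                   # mutually consistent pairs only

d1, d2 = np.random.rand(50, 64), np.random.rand(40, 64)
print(ratio_matches(d1, d2)[:5])
```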

20.
Based on chaos theory and the properties of one-way hash functions, an algorithm for constructing a one-way hash function from a Hénon-like chaotic map is proposed, and its security is discussed. The algorithm is sensitive to initial values and irreversible, and it generates a 256-bit one-way hash value for an original message of arbitrary length. With this algorithm, the hash value of a given plaintext message can be computed easily.
