Similar Documents
20 similar documents found.
1.
A simple nonlinear fractal algorithm is proposed that simplifies the algorithm of Popescu et al. and addresses the problem of a small compression dictionary. Experimental results show that the algorithm is simple and practical, yielding good compression performance and high-quality reconstructed images.
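For context, the baseline scheme such algorithms simplify pairs each range block with a domain block under a contractive affine map. A minimal sketch in Python; this is the generic scheme, not the paper's simplified algorithm, and the block sizes and toy dictionary are illustrative assumptions:

```python
import numpy as np

def encode_range_block(range_blk, domain_blks):
    """Find the domain block and affine map (scale s, offset o) that best
    approximates one range block -- the core step of fractal encoding.
    (Real coders use larger, downsampled domain blocks; toy sizes here.)"""
    best = None
    r = range_blk.ravel().astype(float)
    for idx, d_blk in enumerate(domain_blks):
        d = d_blk.ravel().astype(float)
        # Least-squares fit r ~ s*d + o
        var = d.var()
        s = ((d - d.mean()) @ (r - r.mean())) / (var * d.size) if var > 0 else 0.0
        s = np.clip(s, -1.0, 1.0)          # keep the map contractive
        o = r.mean() - s * d.mean()
        err = float(np.sum((s * d + o - r) ** 2))
        if best is None or err < best[0]:
            best = (err, idx, s, o)
    return best  # (error, domain index, scale, offset)

rng = np.random.default_rng(0)
range_blk = rng.integers(0, 256, (4, 4))
domains = [rng.integers(0, 256, (4, 4)) for _ in range(8)]   # toy dictionary
print(encode_range_block(range_blk, domains))
```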

2.
An optimized chain-code compression scheme for thinned binary fingerprint images (cited: 1; self-cited: 1; cited by others: 0)
李超, 杜赓, 杨义先, 钮心忻. 《计算机应用》(Journal of Computer Applications), 2006, 26(10): 2357-2359.
An optimized Freeman chain-code compression algorithm suited to binary images of line-structured ridge patterns is proposed: Huffman coding of the Freeman differential chain code. In contrast to the conventional Freeman chain code, the proposed algorithm is a hybrid scheme combining the Freeman chain code, differential coding, and Huffman coding. Theoretical analysis and experiments on thinned binary fingerprint images show that it outperforms existing chain-code compression algorithms on such images, and that it also outperforms other compression algorithms on line-structured ridge images. Its average code length is 1.7651 bits, below the 3-bit average of both the 8-direction Freeman chain code and the Freeman differential chain code.
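The core of the scheme is easy to sketch: difference the 8-direction chain code modulo 8 so that gentle ridge turns concentrate probability near 0, then Huffman-code the differences. A minimal Python illustration (the toy chain and the resulting code lengths are illustrative, not the paper's fingerprint data):

```python
import heapq
from collections import Counter

def differential_chain(code):
    """8-direction Freeman chain code -> differential code (mod 8).
    Ridges turn gently, so differences cluster near 0 and compress well."""
    return [code[0]] + [(c - p) % 8 for p, c in zip(code, code[1:])]

def huffman_lengths(symbols):
    """Return {symbol: code length in bits} for a Huffman code."""
    heap = [[w, [[s, 0]]] for s, w in Counter(symbols).items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        lo, hi = heapq.heappop(heap), heapq.heappop(heap)
        for pair in lo[1] + hi[1]:
            pair[1] += 1                      # one level deeper in the tree
        heapq.heappush(heap, [lo[0] + hi[0], lo[1] + hi[1]])
    return {s: max(length, 1) for s, length in heap[0][1]}

chain = [0, 0, 1, 1, 1, 2, 1, 1, 0, 7, 0, 0]   # toy ridge trace
diff = differential_chain(chain)
lengths = huffman_lengths(diff)
avg = sum(lengths[s] for s in diff) / len(diff)
print(f"average code length: {avg:.3f} bits (vs 3 bits for raw Freeman)")
```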

3.
The trade-off between image fidelity and coding rate is reached with several techniques, but all of them require an ability to measure distortion. The problem is that finding a general enough measure of perceptual quality has proven to be an elusive goal. Here, we propose a novel technique for deriving an optimal compression ratio for lossy coding based on the relationship between information theory and the problem of testing hypotheses. The best achievable compression ratio determines a boundary between achievable and non-achievable regions in the trade-off between source fidelity and coding rate. The resultant performance bound is operational in that it is directly achievable by a constructive procedure, as suggested in a theorem that states the relationship between the best achievable compression ratio and the Kullback-Leibler information gain. As an example of the proposed technique, we analyze the effects of lossy compression at the best achievable compression ratio on the identification of breast cancer microcalcifications.
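As a rough illustration of the quantity involved, the sketch below computes the Kullback-Leibler information gain between the gray-level distributions of an original and a degraded image. The histogram binning and noise model are stand-in assumptions; the paper's achievability bound itself is not reproduced here:

```python
import numpy as np

def kl_divergence(p_img, q_img, bins=256):
    """D(P||Q) in bits between the gray-level distributions of two images."""
    p, _ = np.histogram(p_img, bins=bins, range=(0, 256), density=True)
    q, _ = np.histogram(q_img, bins=bins, range=(0, 256), density=True)
    eps = 1e-12                      # avoid log(0) and division by zero
    p, q = p + eps, q + eps
    p, q = p / p.sum(), q / q.sum()
    return float(np.sum(p * np.log2(p / q)))

rng = np.random.default_rng(1)
original = rng.normal(128, 30, (64, 64)).clip(0, 255)
lossy = (original + rng.normal(0, 5, original.shape)).clip(0, 255)  # stand-in
print(f"KL information gain: {kl_divergence(original, lossy):.4f} bits")
```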

4.
The Bullwhip Effect, the increasing variability of orders traveling upstream in the supply chain, has been shown to be a severe problem for many industries. The inventory policy of the various nodes is an important contributory factor to this phenomenon, and hence significantly impacts their financial performance. This fact has led to a large amount of research on replenishment and forecasting methods aimed at exploring their suitability under a range of environmental factors, e.g. the demand pattern and the lead time. This research approaches the issue by considering the supply chain as a whole. We study the interaction between four widely used inventory models in five different contexts defined by customer demand variability and safety stock. We show that the concurrence of distinct inventory models in the supply chain, a common situation in practice, may alleviate the inefficiencies derived from the Bullwhip Effect. In this sense, we demonstrate that the performance of each policy depends not only upon the external environment but also upon its position within the system and upon the decisions of the other nodes. The experiments were carried out via an agent-based system whose agents simulate the behavior of the different supply chain actors. This technique proves to offer a powerful and risk-free approach for business exploration and transformation.
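A heavily simplified sketch of how such experiments measure the effect: each node runs an order-up-to policy with a moving-average forecast, and the bullwhip ratio Var(orders)/Var(demand) is tracked up the chain. All parameters are illustrative; this is not the paper's agent-based system:

```python
import numpy as np

def simulate_echelon(demand, lead_time=2, window=4, safety=1.0):
    """One node with an order-up-to policy; returns the orders it places.
    Bullwhip = Var(orders) / Var(demand), amplifying up the chain."""
    inventory, pipeline, orders = 50.0, [0.0] * lead_time, []
    for t, d in enumerate(demand):
        inventory += pipeline.pop(0) - d          # receive, then ship
        hist = demand[max(0, t - window + 1):t + 1]
        forecast = float(np.mean(hist))
        target = forecast * lead_time + safety * float(np.std(hist))
        order = max(0.0, target - inventory - sum(pipeline))
        pipeline.append(order)
        orders.append(order)
    return np.array(orders)

rng = np.random.default_rng(2)
demand = rng.normal(20, 2, 200)
retail_orders = simulate_echelon(demand)            # retailer sees customers
wholesale_orders = simulate_echelon(retail_orders)  # wholesaler sees retailer
for name, x in [("retail", retail_orders), ("wholesale", wholesale_orders)]:
    print(name, "bullwhip ratio:", round(float(x.var() / demand.var()), 2))
```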

5.
In this paper a locally adaptive wavelet image coder is presented. It is based on an embedded human visual system model that exploits the space- and frequency-localization properties of wavelet decompositions to tune the quantization step for each discrete wavelet transform (DWT) coefficient according to the local properties of the image. A coarser quantization is performed in areas of the image where the visibility of errors is reduced, decreasing the total bit rate without affecting the resulting visual quality. The size of the quantization step for each DWT coefficient is computed by taking into account the multiresolution structure of wavelet decompositions, so no side information needs to be sent to the decoder and no prediction mechanisms are required.

Perceptually lossless as well as perceptually lossy compression is supported: the desired visual quality of the compressed image is set by means of a quality factor. Moreover, the technique for tuning the target visual quality allows the user to define arbitrarily shaped regions of interest and to set a different quality factor for each one.
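A minimal sketch of the underlying idea, assuming a one-level Haar decomposition and hand-picked per-subband quantization steps; the paper derives its steps from an HVS model instead:

```python
import numpy as np

def haar2d(x):
    """One-level 2-D Haar transform -> (LL, LH, HL, HH) subbands."""
    a, b = x[0::2, :], x[1::2, :]
    lo, hi = (a + b) / 2, (a - b) / 2                # vertical low/high pass
    ll, hl = (lo[:, 0::2] + lo[:, 1::2]) / 2, (lo[:, 0::2] - lo[:, 1::2]) / 2
    lh, hh = (hi[:, 0::2] + hi[:, 1::2]) / 2, (hi[:, 0::2] - hi[:, 1::2]) / 2
    return ll, lh, hl, hh

def quantize_subbands(img, steps=(4, 8, 8, 16)):
    """Coarser steps where errors are less visible (high-frequency bands)."""
    return [np.round(band / q) * q
            for band, q in zip(haar2d(img.astype(float)), steps)]

rng = np.random.default_rng(3)
img = rng.integers(0, 256, (8, 8))
for name, band in zip(("LL", "LH", "HL", "HH"), quantize_subbands(img)):
    print(name, "nonzero coefficients:", int(np.count_nonzero(band)))
```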

6.
Progressive coding is a desirable feature for image database telebrowsing or image transmissions over low bandwidth channels. Furthermore, for some applications, exact image reconstruction is required. In this paper, we show that most of the lossless and progressive coders can be described by a common nonlinear subband decomposition scheme. This unifying framework provides useful guidelines for the analysis and improvement of the considered decomposition methods. Finally, we compare the respective performances of these methods in terms of Shannon entropy for several images and also evaluate their compression ability when combined with a hierarchical coding technique.
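The comparison metric is easy to state in code: the first-order Shannon entropy of the (decomposed) data in bits per pixel. A minimal sketch over raw gray levels, purely illustrative:

```python
import numpy as np

def shannon_entropy(img):
    """First-order entropy in bits/pixel from the gray-level histogram."""
    hist, _ = np.histogram(img, bins=256, range=(0, 256))
    p = hist[hist > 0] / img.size
    return float(-np.sum(p * np.log2(p)))

rng = np.random.default_rng(4)
flat = np.full((64, 64), 128)                      # perfectly uniform image
noisy = rng.integers(0, 256, (64, 64))             # i.i.d. uniform noise
print(f"flat: {shannon_entropy(flat):.2f} bpp, "
      f"noisy: {shannon_entropy(noisy):.2f} bpp")
```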

7.
The JPEG standard for still-image compression is now widely used, but as a general-purpose algorithm it is relatively complex to implement and unsuitable in some settings. Based on a detailed analysis of the JPEG compression pipeline, this paper improves its quantization and coding modules and proposes a simple compression algorithm based on the discrete cosine transform (DCT). At low compression ratios (below 15:1) the algorithm compresses well, yet is much simpler to implement than JPEG.
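A bare-bones sketch of the DCT-plus-uniform-quantization path such an algorithm follows; the quantization step and block contents are illustrative, and the paper's actual quantization and coding modules are not reproduced:

```python
import numpy as np

def dct_matrix(n=8):
    """Orthonormal type-II DCT matrix C, so Y = C @ X @ C.T."""
    k, i = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
    c = np.cos((2 * i + 1) * k * np.pi / (2 * n)) * np.sqrt(2.0 / n)
    c[0, :] = np.sqrt(1.0 / n)
    return c

def compress_block(block, step=20):
    """DCT, uniform quantization, inverse DCT -- a bare-bones JPEG-like path."""
    c = dct_matrix(block.shape[0])
    coeffs = c @ (block.astype(float) - 128) @ c.T   # level shift, then DCT
    quantized = np.round(coeffs / step)
    return c.T @ (quantized * step) @ c + 128        # reconstruction

rng = np.random.default_rng(5)
block = rng.integers(0, 256, (8, 8))
rec = compress_block(block)
print("max reconstruction error:", float(np.abs(rec - block).max()))
```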

8.
Cholera is an intestinal disease characterized by diarrhea and severe dehydration. While cholera has largely been eliminated in regions that can provide clean water, adequate hygiene, and proper sanitation, it remains a constant threat in many parts of Africa and Asia. In this paper, we develop an agent-based model that explores the spread of cholera in the Dadaab refugee camp in Kenya, where poor sanitation and housing conditions contribute to frequent cholera outbreaks. We model the spread of cholera by explicitly representing the interaction between humans and their environment, and the progression of the epidemic with a Susceptible-Exposed-Infected-Recovered (SEIR) model. Results from the model show that the spread of cholera grows radially from contaminated water sources and that seasonal rains can trigger outbreaks. This modeling effort highlights the potential of agent-based modeling for exploring the spread of cholera in a humanitarian context.
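For reference, the compartmental backbone of such a model can be sketched in a few lines. This aggregate, Euler-stepped SEIR is only an illustration with made-up rates; the paper's model is agent-based with explicit environmental transmission:

```python
def seir_step(s, e, i, r, beta=0.4, sigma=0.2, gamma=0.1, dt=1.0):
    """One Euler step of SEIR dynamics; in a cholera setting, beta would
    scale with exposure to contaminated water (illustrative rates here)."""
    n = s + e + i + r
    new_exposed   = beta * s * i / n * dt
    new_infected  = sigma * e * dt
    new_recovered = gamma * i * dt
    return (s - new_exposed, e + new_exposed - new_infected,
            i + new_infected - new_recovered, r + new_recovered)

state = (9990.0, 0.0, 10.0, 0.0)          # camp of 10,000 with 10 index cases
for day in range(120):
    state = seir_step(*state)
print("total ever infected (recovered):", round(state[3]))
```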

9.
何同林, 尤春艳, 郑鹏. 《计算机应用》(Journal of Computer Applications), 2007, 27(6): 1452-1454.
An encoder structure is described that combines motion compensation with a quasi-3D wavelet transform formed by a 1D transform along the temporal axis and a 2D intra-frame transform. For image sequences with slow motion, and guided by the characteristics of the experimental data, a coding strategy is proposed that combines improved zerotree coding with pixel-level coding. Computer simulations show that the method effectively reduces computational complexity and improves execution efficiency, while keeping reconstruction quality above 29 dB at a compression ratio of 400:1.

10.
Vector quantization (VQ) is an effective data-compression technique; its simple algorithm and high compression ratio have made it widely used in compression coding. Based on a study of the gray-level characteristics of image blocks, a composite mean/VQ coding algorithm is proposed that classifies blocks by smoothness: smooth blocks are coded by their mean, and non-smooth blocks by vector quantization. This saves storage for smooth codewords, improves codebook storage efficiency, and greatly increases coding speed. In addition, a codeword rotation and inversion (2R) compression algorithm reduces codebook storage to 1/8, and the search is optimized with an extended nearest-neighbor block search (EBNNS) algorithm. With image quality preserved, the overall system encodes images about 7.7 times faster on average than full-search vector quantization.
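The composite decision is simple to sketch: a variance threshold routes smooth blocks to mean coding and textured blocks to a full-search VQ. The threshold and random codebook below are illustrative assumptions, and the 2R codebook reduction and EBNNS search are not reproduced:

```python
import numpy as np

def encode_block(block, codebook, smooth_thresh=25.0):
    """Smooth blocks -> mean only; textured blocks -> nearest VQ codeword."""
    if block.var() < smooth_thresh:
        return ("mean", float(block.mean()))
    flat = block.ravel().astype(float)
    dists = np.sum((codebook - flat) ** 2, axis=1)   # full search
    return ("vq", int(np.argmin(dists)))

rng = np.random.default_rng(6)
codebook = rng.integers(0, 256, (64, 16)).astype(float)   # 64 codewords, 4x4
smooth = np.full((4, 4), 200) + rng.integers(-2, 3, (4, 4))
textured = rng.integers(0, 256, (4, 4))
print(encode_block(smooth, codebook), encode_block(textured, codebook))
```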

11.
In this paper we present a novel hardware architecture for real-time image compression implementing a fast, searchless iterated function system (SIFS) fractal coding method. In the proposed method and corresponding hardware architecture, domain blocks are fixed to a spatially neighboring area of range blocks in a manner similar to that given by Furao and Hasegawa. A quadtree structure, covering from 32 × 32 blocks down to 2 × 2 blocks, and even to single pixels, is used for partitioning. Coding of 2 × 2 blocks and single pixels is unique among current fractal coders. The hardware architecture contains units for domain construction, zig-zag transforms, range and domain mean computation, and a parallel domain-range match capable of concurrently generating a fractal code for all quadtree levels. With this efficient, parallel hardware architecture, the fractal encoding speed is improved dramatically. Additionally, attained compression performance remains comparable to traditional search-based and other searchless methods. Experimental results, with the proposed hardware architecture implemented on an Altera APEX20K FPGA, show that the fractal encoder can encode a 512 × 512 × 8 image in approximately 8.36 ms operating at 32.05 MHz. Therefore, this architecture is seen as a feasible solution to real-time fractal image compression.
Corresponding author: David Jeff Jackson.
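The quadtree partitioning that the architecture parallelizes can be sketched as a variance-driven split from 32 × 32 down to 2 × 2 blocks. The threshold and test image are illustrative; the domain construction and the parallel range-domain match are omitted:

```python
import numpy as np

def quadtree(img, x, y, size, thresh=100.0, min_size=2, out=None):
    """Split a block into 4 quadrants while it is too non-uniform to code."""
    if out is None:
        out = []
    block = img[y:y + size, x:x + size]
    if size <= min_size or block.var() <= thresh:
        out.append((x, y, size))              # leaf: one fractal code here
        return out
    half = size // 2
    for dx, dy in ((0, 0), (half, 0), (0, half), (half, half)):
        quadtree(img, x + dx, y + dy, half, thresh, min_size, out)
    return out

rng = np.random.default_rng(7)
img = rng.integers(0, 256, (32, 32)).astype(float)
img[:16, :16] = 120                           # one flat quadrant stays whole
leaves = quadtree(img, 0, 0, 32)
print(len(leaves), "leaf blocks; sizes:", sorted({s for _, _, s in leaves}))
```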

12.
This paper presents an effective compression method suitable for transmitting still images over public switched telephone networks (PSTN). Compression algorithms reduce the number of pixels or the number of gray levels of a source picture, and thereby reduce the amount of memory needed to store the source information, or the time needed to transmit it over a channel of limited bandwidth. We first review several current standards and then choose the lossy DCT-based JPEG compression method, which our studies indicate is among the most suitable. However, it is not directly applicable to image transmission over ordinary telephone lines, so it must be modified considerably for our purposes. From Shannon's information theory we know that for a given information source, such as an image, there is a coding technique that permits the source to be coded with an average code length as close to the entropy of the source as desired. We have therefore modified the Huffman coding technique to obtain a new optimized version that is fast and easily implemented. We then apply the DCT and the fast DCT (FDCT) for compression of the data. We have written and analyzed C++ programs for image compression/decompression that give a very high compression ratio (50:1 or more) with an excellent SNR. In this paper, we present the necessary modifications to the Huffman coding algorithm and the results of simulations on typical images.

13.
To address the high complexity and long encoding time of fractal image coding, a method is proposed that represents image blocks with orthogonal sparse coding and extracted texture features. First, an orthogonal sparse transform of the gray levels improves reconstruction quality and decoding time. Second, a correlation-coefficient matrix measuring coefficient-of-variation features between range blocks and domain blocks reduces redundancy and encoding time. Simulation results show that, compared with conventional fractal image coding, the method reconstructs images with higher quality and encodes them faster.

14.
Fractal image compression is a recent tool for encoding natural images. It builds on local self-similarities and the generation of copies of blocks under mathematical transformations. The technique is interesting in both theory and application, but it has a drawback that hinders real-time use: the high resource requirements when encoding large amounts of data. Heuristic algorithms, by contrast, are a family of approaches for solving hard optimisation tasks with reasonable resource consumption, characterised by fast convergence and reduced search complexity. The purpose of this paper is to provide, for the first time, a more detailed study of the Wolf Pack Algorithm for fractal image compression. The whole image is treated as a search space divided into blocks; scouting wolves explore the space to find smaller blocks similar to a given block according to its parameters, traversing the whole space and selecting the blocks with the best fitness. The process stops after a fixed number of iterations, or when the lead wolf's solution no longer improves. Results show that, compared with exhaustive search, the proposed method greatly reduces encoding time while achieving a rather good compression ratio. The experiments performed show its effectiveness on this problem, and a brief comparison with different methods establishes this advantage.
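A toy rendering of the scouting step: random "wolves" sample candidate domain positions, fitness is the matching error, and the best position becomes the lead wolf. This omits the calling and besieging behaviours of the full Wolf Pack Algorithm, and all parameters are illustrative:

```python
import numpy as np

def scout_search(img, range_blk, n_wolves=40, iters=20, rng=None):
    """Random scouts propose domain positions; fitness = matching error."""
    if rng is None:
        rng = np.random.default_rng()
    h, w = img.shape
    bs = range_blk.shape[0]
    best_err, best_pos = np.inf, None
    for _ in range(iters):
        ys = rng.integers(0, h - bs, n_wolves)
        xs = rng.integers(0, w - bs, n_wolves)
        for y, x in zip(ys, xs):
            cand = img[y:y + bs, x:x + bs].astype(float)
            err = float(np.mean((cand - range_blk) ** 2))
            if err < best_err:
                best_err, best_pos = err, (int(y), int(x))  # new lead wolf
    return best_pos, best_err

rng = np.random.default_rng(8)
img = rng.integers(0, 256, (128, 128)).astype(float)
range_blk = img[40:48, 40:48].copy()      # a block that exists in the image
print(scout_search(img, range_blk, rng=rng))
```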

15.
A new data compression algorithm for binary images is presented. It is fully two-dimensional and has a hierarchical format for describing macroscopic image structure.

An image is described as a set of uniform rectangles plus a set of explicit point-by-point data lying outside those rectangles. The basic problem involved is finding the largest possible rectangle in a given region; for this purpose an image pyramid is employed.

Simulation results show a very good compression ratio. The algorithm has many other interesting characteristics and can be applied to image data storage, image editing, and image transmission.
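The pyramid test is straightforward to sketch for square power-of-two blocks: a cell at level k is uniform exactly when its four children are uniform and share one value. A minimal illustration (the rectangle extraction built on top of the pyramid is not reproduced):

```python
import numpy as np

def uniformity_pyramid(img):
    """Level k cell = (is_uniform, value) for the 2^k x 2^k block below it."""
    levels = [(np.ones_like(img, dtype=bool), img.copy())]
    while levels[-1][0].shape[0] > 1:
        u, v = levels[-1]
        u4 = u[0::2, 0::2] & u[1::2, 0::2] & u[0::2, 1::2] & u[1::2, 1::2]
        same = ((v[0::2, 0::2] == v[1::2, 0::2]) &
                (v[0::2, 0::2] == v[0::2, 1::2]) &
                (v[0::2, 0::2] == v[1::2, 1::2]))
        levels.append((u4 & same, v[0::2, 0::2]))
    return levels

img = np.zeros((8, 8), dtype=int)
img[6, 7] = 1                                  # one stray pixel
for k, (u, _) in enumerate(uniformity_pyramid(img)):
    print(f"level {k}: {int(u.sum())}/{u.size} uniform blocks")
```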


16.
During the last decade, research on modeling and simulation of infrastructure systems has primarily focused on the performance of their technical components, largely ignoring non-technical factors such as human operators and consumers. Yet the human operator has become an essential part of daily operation and of ensuring the security and reliability of these systems, and human error has played a major role in some of the most significant technological incidents of the past century. Developing a modeling approach capable of assessing human performance in a comprehensive way has therefore become crucial. In this paper, an agent-based hierarchical modeling approach is proposed that aims at explicitly modeling the impacts of human performance on the operation of infrastructure systems. Within this approach, the cognition component plays a major role; for this purpose, an analytical method based on the Cognitive Reliability and Error Analysis Method (CREAM) is developed using a knowledge-based approach. The proposed approach is a pilot work exploring ways to simulate the performance of human factors in infrastructure systems. Its applicability is demonstrated by a validation experiment using the electric power supply system as an exemplary system.

17.
In this paper we present a new lossless image compression algorithm. To achieve the high compression speed we use a linear prediction, modified Golomb–Rice code family, and a very fast prediction error modeling method. We compare the algorithm experimentally with others for medical and natural continuous tone grayscale images of depths of up to 16 bits. Its results are especially good for large images, for natural images of high bit depths, and for noisy images. The average compression speed on Intel Xeon 3.06 GHz CPU is 47 MB/s. For large images the speed is over 60 MB/s, i.e. the algorithm needs less than 50 CPU cycles per byte of image.
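Two of the building blocks, left-neighbour linear prediction and Golomb-Rice coding of the mapped residuals, can be sketched as follows. The Rice parameter is fixed here, whereas the paper adapts its code family and uses a fast error model:

```python
def rice_encode(value, k):
    """Golomb-Rice code: unary quotient + k-bit binary remainder."""
    q, r = value >> k, value & ((1 << k) - 1)
    return "1" * q + "0" + format(r, f"0{k}b") if k else "1" * q + "0"

def encode_row(row, k=2):
    """Predict each pixel from its left neighbour, zigzag-map the signed
    residual to a non-negative int, then Rice-code it."""
    bits = []
    prev = 128                                    # mid-gray starting predictor
    for x in row:
        e = int(x) - prev                         # prediction residual
        m = 2 * e if e >= 0 else -2 * e - 1       # signed -> unsigned map
        bits.append(rice_encode(m, k))
        prev = int(x)
    return "".join(bits)

row = [100, 101, 103, 102, 102, 110, 108]
code = encode_row(row)
print(f"{len(code)} bits for {len(row)} pixels "
      f"({len(code) / len(row):.2f} bpp vs 8 bpp raw)")
```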

18.
This paper describes a color image compression technique based on block truncation coding using pattern fitting (BTC-PF). The high degree of correlation between the RGB planes of a color image is reduced by transforming them to O1O2O3 planes. Each Oi plane (1 ≤ i ≤ 3) is then encoded using the BTC-PF method, with the size of the pattern book and the block size selected according to the information content of the corresponding plane. The result of the proposed method is compared with that of several BTC-based methods and found superior. Though the proposed method is a spatial-domain technique, it is also compared with JPEG compression, one of the most popular frequency-domain techniques: the quality of its reconstructed images is a little inferior to JPEG's. Decoding time is another important criterion when the compressed image must be decoded frequently for various purposes; as the proposed method requires negligible decoding time compared to JPEG, it is preferred over JPEG in those cases.
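For orientation, classic two-level BTC on a single block (preserving the block mean and standard deviation) looks like this; the pattern-fitting stage and the O1O2O3 transform of BTC-PF are not reproduced:

```python
import numpy as np

def btc_block(block):
    """Classic BTC: keep a 1-bit mask plus two reconstruction levels chosen
    to preserve the block mean and standard deviation."""
    x = block.astype(float)
    mean, std = x.mean(), x.std()
    mask = x >= mean                              # 1 bit per pixel
    q, n = int(mask.sum()), x.size
    if q in (0, n):                               # flat block: a single level
        return mask, mean, mean
    low = mean - std * np.sqrt(q / (n - q))
    high = mean + std * np.sqrt((n - q) / q)
    return mask, low, high

rng = np.random.default_rng(9)
block = rng.integers(0, 256, (4, 4))
mask, low, high = btc_block(block)
rec = np.where(mask, high, low)
print("mean preserved:", bool(np.isclose(rec.mean(), block.mean())))
```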

19.
Backfill is the excavated material from earthworks and constitutes over 50% of the construction waste in Hong Kong. This paper considers a supply chain consisting of construction sites, landfills, and commercial sources, whose operators seek cooperation to maximize backfill reuse and improve waste recovery efficiency. Unlike the ordinary material supply chain in manufacturing industries, the supply chain for backfill involves many dynamic processes, which increases the complexity of analyzing and solving the logistics problem. This study therefore attempts to identify an appropriate methodology for analyzing the dynamic supply chain in order to facilitate backfill reuse. A centralized optimization model and a distributed agent-based model are proposed and implemented, and their performance is compared. The centralized optimization model can obtain a global optimum but requires complete information to be shared by all supply chain entities, a barrier to implementation; moreover, whenever the backfill supply chain changes, it must reconfigure the network structure and recompute the optimum. The distributed agent-based model focuses on task distribution and cooperation between business entities in the backfill supply chain: decision making and communication between construction sites, landfills, and commercial sources are emulated by autonomous agents, which cooperate through a negotiation algorithm to optimize the supply chain configuration and reduce backfill shipment cost. A comparative study indicates that the agent-based model is better suited to studying the dynamic backfill supply chain, owing to its decentralized optimization and fast reaction to unexpected disturbances.

20.
IP core design for the JPEG image compression algorithm (cited: 2; self-cited: 0; cited by others: 2)
王镇道, 陈迪平, 文康益. 《计算机应用》(Journal of Computer Applications), 2005, 25(5): 1076-1077, 1080.
Based on a two-dimensional DCT algorithm derived by matrix factorization, an IP core implementing the JPEG image compression algorithm is designed. Each module and the complete IP core are described and simulated at the RTL level in Verilog-HDL; the experimental results verify the correctness of the design.
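The row-column factorization such designs start from is the separability of the 2-D DCT: Y = C X Cᵀ, i.e. 1-D DCTs over the rows and then the columns. A small numerical check against the direct double-sum definition (the IP core's further factorizations of C are not shown):

```python
import numpy as np

def dct_1d_matrix(n=8):
    """Orthonormal 1-D DCT-II matrix."""
    k, i = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
    c = np.cos((2 * i + 1) * k * np.pi / (2 * n)) * np.sqrt(2.0 / n)
    c[0, :] = np.sqrt(1.0 / n)
    return c

n = 8
c = dct_1d_matrix(n)
rng = np.random.default_rng(10)
x = rng.integers(0, 256, (n, n)).astype(float)

# Direct 2-D definition (double sum over both indices) for comparison.
y_direct = np.zeros((n, n))
for u in range(n):
    for v in range(n):
        y_direct[u, v] = sum(c[u, i] * c[v, j] * x[i, j]
                             for i in range(n) for j in range(n))

# Separable form: 1-D DCT over rows, then over columns.
print("row-column factorization matches direct 2-D DCT:",
      bool(np.allclose(c @ x @ c.T, y_direct)))
```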
