Performance improvement of the SPIHT coder
Affiliation: 1. Agency for Defense Development, Yuseong P.O. Box 35-5, TECH-3-4, Daejeon 305-600, South Korea; 2. Department of Electronics Engineering, Chung-Nam National University, 220 Koong-Dong Yuseong-Ku, Daejeon 305-764, South Korea
Abstract: The set partitioning in hierarchical trees (SPIHT) coder is one of the state-of-the-art wavelet-based image compression coders. To improve its performance, this paper proposes a pre-processing method that applies the discrete sine transform or the discrete cosine transform to the wavelet coefficients in the highest- and next-highest-frequency subbands before SPIHT encoding. Experimental results show that the proposed method increases the peak signal-to-noise ratio by up to 0.4 dB for textured images compared with the original SPIHT coder.
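The pre-processing described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes a Python environment with PyWavelets and SciPy, a biorthogonal 4.4 wavelet, five decomposition levels, and an 8×8 block DCT applied to the two finest subband levels; the paper may instead transform each subband as a whole or use the DST, and the SPIHT encoder itself is left as an external step.

```python
import numpy as np
import pywt
from scipy.fft import dctn


def block_dct(subband, block=8):
    """Apply an orthonormal 2-D DCT to non-overlapping blocks of a subband."""
    out = subband.copy()
    h, w = subband.shape
    for i in range(0, h - h % block, block):
        for j in range(0, w - w % block, block):
            out[i:i + block, j:j + block] = dctn(
                subband[i:i + block, j:j + block], norm='ortho')
    return out


def preprocess_for_spiht(image, wavelet='bior4.4', levels=5):
    """Wavelet-decompose an image and DCT-transform the two finest levels.

    The block size, wavelet, and level count are illustrative assumptions.
    """
    coeffs = pywt.wavedec2(np.asarray(image, dtype=np.float64),
                           wavelet, level=levels)
    # coeffs = [cA_n, (cH_n, cV_n, cD_n), ..., (cH_1, cV_1, cD_1)];
    # the last two tuples hold the next-highest- and highest-frequency subbands.
    for lvl in (-1, -2):
        coeffs[lvl] = tuple(block_dct(sb) for sb in coeffs[lvl])
    return coeffs  # pass these coefficients to a SPIHT encoder
```

A matching decoder would invert the block DCT on the same subbands after SPIHT decoding and before the inverse wavelet transform.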
Keywords:
This article has been indexed by ScienceDirect and other databases.