Perceptual Image Compression with Block-Level Just Noticeable Difference Prediction
ACM Transactions on Multimedia Computing, Communications, and Applications (IF 5.1). Pub Date: 2021-01-28. DOI: 10.1145/3408320
Tao Tian, Hanli Wang, Sam Kwong, C.-C. Jay Kuo

A block-level perceptual image compression framework is proposed in this work, consisting of a block-level just noticeable difference (JND) prediction model and a preprocessing scheme. Specifically, block-level JND values are first deduced by applying the Otsu method to the variation of block-level structural similarity (SSIM) values between two adjacent picture-level JND values in the MCL-JCI dataset. After the JND value for each image block is generated, a convolutional neural network–based prediction model is designed to predict block-level JND values for a given target image. A preprocessing scheme is then devised to modify the discrete cosine transform coefficients during JPEG compression according to the distribution of block-level JND values of the target test image. Finally, the test image is compressed at the maximum JND value across all of its image blocks, given the initial quality factor setting. The experimental results demonstrate that the proposed block-level perceptual image compression method achieves a 16.75% bit saving compared to the state-of-the-art method at similar subjective quality. The project page can be found at https://mic.tongji.edu.cn/43/3f/c9778a148287/page.htm.
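The first step of the pipeline relies on Otsu's method to separate per-block SSIM-variation scores into two classes. The following is an illustrative sketch, not the authors' code: the `otsu_threshold` function and the toy score data are hypothetical, and serve only to show how a between-class-variance-maximizing threshold can split blocks into "perceptually stable" and "noticeably changed" groups.

```python
import numpy as np

def otsu_threshold(values, bins=64):
    """Return the threshold that maximizes between-class variance (Otsu's method)."""
    hist, edges = np.histogram(values, bins=bins)
    hist = hist.astype(float)
    total = hist.sum()
    centers = (edges[:-1] + edges[1:]) / 2
    sum_all = (hist * centers).sum()
    best_t, best_var = centers[0], -1.0
    w0 = 0.0    # cumulative weight of class 0 (below threshold)
    sum0 = 0.0  # cumulative weighted sum of class 0
    for i in range(bins - 1):
        w0 += hist[i]
        sum0 += hist[i] * centers[i]
        if w0 == 0:
            continue
        w1 = total - w0
        if w1 == 0:
            break
        mu0 = sum0 / w0
        mu1 = (sum_all - sum0) / w1
        between = w0 * w1 * (mu0 - mu1) ** 2
        if between > best_var:
            best_var, best_t = between, centers[i]
    return best_t

# Toy per-block SSIM-variation scores: a cluster of stable blocks (small
# variation) and a cluster of blocks whose quality changes noticeably.
rng = np.random.default_rng(0)
scores = np.concatenate([rng.normal(0.02, 0.005, 50),
                         rng.normal(0.15, 0.01, 50)])
t = otsu_threshold(scores)
changed = scores > t  # blocks whose SSIM variation crosses the JND boundary
```

In the paper's setting, this thresholding is applied to SSIM variations between adjacent picture-level JND points of MCL-JCI, so that each block receives its own JND label rather than inheriting the picture-level one.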
