Neural Network-Based Video Compression Artifact Reduction Using Temporal Correlation and Sparsity Prior Predictions
Quantization in lossy video compression may incur severe quality degradation, especially at low bit-rates. Developing post-processing methods that improve the visual quality of decoded images is of great importance, as they can be directly incorporated into any existing compression standard or paradigm. We propose in this article a two-stage method for video compression artifact reduction: a texture detail restoration stage followed by a deep convolutional neural network (CNN) fusion stage. The first stage operates patch by patch. For each patch in the current decoded frame, one prediction is formed under the sparsity prior, which assumes that natural image patches can be represented by sparse activation of dictionary atoms. Under the temporal correlation hypothesis, we also search for the best-matching patch in each reference frame and select the matches with richer texture detail to tile motion-compensated predictions. The second stage stacks the predictions from the first stage, together with the decoded frame itself, into a tensor and uses a deep CNN to learn the mapping from this tensor to the original uncompressed image. Experimental results demonstrate that the proposed two-stage method markedly improves, both subjectively and objectively, the quality of compressed video sequences.
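As a concrete illustration of the first (texture detail restoration) stage described in the abstract, the sketch below shows how motion-compensated candidate patches might be gathered from reference frames and ranked by texture detail. This is a minimal sketch under stated assumptions, not the authors' implementation: the SAD matching cost, the variance-based texture measure, the search-window size, and all function names are illustrative choices, and grayscale (luma-only) frames are assumed. The sparsity-prior prediction of the same stage, formed by coding each patch over a learned dictionary, is omitted here.

```python
import numpy as np

def best_match(patch, ref, center, search=8):
    """Exhaustively search a (2*search+1)^2 window in a reference frame for the
    candidate patch with the smallest sum of absolute differences (SAD)."""
    ph, pw = patch.shape
    cy, cx = center
    best, best_sad = None, np.inf
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            y, x = cy + dy, cx + dx
            if y < 0 or x < 0 or y + ph > ref.shape[0] or x + pw > ref.shape[1]:
                continue
            cand = ref[y:y + ph, x:x + pw].astype(np.float64)
            sad = np.abs(cand - patch).sum()
            if sad < best_sad:
                best, best_sad = cand, sad
    return best, best_sad

def temporal_predictions(decoded, refs, top, left, size=8, keep=3):
    """For one patch of the decoded frame, gather the best-matching patch from
    each reference frame and keep those with the most texture detail, measured
    here (as a stand-in for the paper's criterion) by local variance."""
    patch = decoded[top:top + size, left:left + size].astype(np.float64)
    candidates = [best_match(patch, r, (top, left))[0] for r in refs]
    candidates = [c for c in candidates if c is not None]
    candidates.sort(key=lambda c: float(np.var(c)), reverse=True)
    return candidates[:keep]
```

Tiling the kept candidates at the patch's position in per-candidate prediction frames would yield the motion-compensated predictions that, together with the sparsity-prior prediction, feed the fusion stage.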
Main Authors: | Wei-Gang Chen, Runyi Yu, Xun Wang |
---|---|
Format: | Article |
Language: | English |
Published: | IEEE, 2020-01-01 |
Series: | IEEE Access |
Subjects: | Compression artifact reduction; convolutional neural networks; high efficiency video coding; sparse representation; temporal correlation |
Online Access: | https://ieeexplore.ieee.org/document/9180276/ |
id | doaj-0d46f38b1ed543949b49e0a2b6ff5e25 |
---|---|
record_format | Article |
spelling | DOAJ record doaj-0d46f38b1ed543949b49e0a2b6ff5e25, updated 2021-03-30T04:45:34Z. English. IEEE, IEEE Access, ISSN 2169-3536, 2020-01-01, vol. 8, pp. 162479-162490, DOI 10.1109/ACCESS.2020.3020388, IEEE article 9180276. Wei-Gang Chen (ORCID 0000-0002-9332-0972) and Xun Wang: School of Computer and Information Engineering, Zhejiang Gongshang University, Hangzhou, China; Runyi Yu (ORCID 0000-0003-1925-8442): Department of Electrical and Electronic Engineering, Eastern Mediterranean University, Mersin, Turkey. Title, abstract, keywords, and online access URL as given in the fields below. |
collection | DOAJ |
language | English |
format | Article |
sources | DOAJ |
author | Wei-Gang Chen, Runyi Yu, Xun Wang |
author_sort | Wei-Gang Chen |
title | Neural Network-Based Video Compression Artifact Reduction Using Temporal Correlation and Sparsity Prior Predictions |
title_sort | neural network-based video compression artifact reduction using temporal correlation and sparsity prior predictions |
publisher | IEEE |
series | IEEE Access |
issn | 2169-3536 |
publishDate | 2020-01-01 |
description | Quantization in lossy video compression may incur severe quality degradation, especially at low bit-rates. Developing post-processing methods that improve visual quality of decoded images is of great importance, as they can be directly incorporated in any existing compression standard or paradigm. We propose in this article a two-stage method, a texture detail restoration stage followed by a deep convolutional neural network (CNN) fusion stage, for video compression artifact reduction. The first stage performs in a patch-by-patch manner. For each patch in the current decoded frame, one prediction is formed based on the sparsity prior assuming that natural image patches can be represented by sparse activation of dictionary atoms. Under the temporal correlation hypothesis, we search the best matching patch in each reference frame, and select several matches with more texture details to tile motion compensated predictions. The second stage stacks the predictions obtained in the preceding stage along with the decoded frame itself to form a tensor, and proposes a deep CNN to learn the mapping between the tensor as input and the original uncompressed image as output. Experimental results demonstrate that the proposed two-stage method can remarkably improve, both subjectively and objectively, the quality of the compressed video sequence. (An illustrative sketch of the fusion stage follows this record.) |
topic | Compression artifact reduction; convolutional neural networks; high efficiency video coding; sparse representation; temporal correlation |
url | https://ieeexplore.ieee.org/document/9180276/ |
work_keys_str_mv | AT weigangchen neuralnetworkbasedvideocompressionartifactreductionusingtemporalcorrelationandsparsitypriorpredictions; AT runyiyu neuralnetworkbasedvideocompressionartifactreductionusingtemporalcorrelationandsparsitypriorpredictions; AT xunwang neuralnetworkbasedvideocompressionartifactreductionusingtemporalcorrelationandsparsitypriorpredictions |
_version_ | 1724181181580181504 |
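The description field above also outlines the second stage: the stage-one predictions are stacked with the decoded frame along the channel axis to form a tensor, and a deep CNN learns the mapping from that tensor to the uncompressed frame. Below is a minimal sketch of such a fusion network, assuming PyTorch; the layer count, channel widths, residual formulation, and training step are placeholder assumptions rather than the paper's actual architecture.

```python
import torch
import torch.nn as nn

class FusionCNN(nn.Module):
    """Toy fusion network: the decoded luma frame and K prediction maps are
    stacked along the channel axis and mapped to a restored frame. Depth and
    widths here are placeholders, not the paper's architecture."""
    def __init__(self, num_predictions=3, features=64):
        super().__init__()
        in_ch = 1 + num_predictions          # decoded frame + K predictions
        self.body = nn.Sequential(
            nn.Conv2d(in_ch, features, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(features, features, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(features, features, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(features, 1, 3, padding=1),
        )

    def forward(self, decoded, predictions):
        # decoded: (N, 1, H, W); predictions: (N, K, H, W)
        x = torch.cat([decoded, predictions], dim=1)
        # Predict a residual and add it back, a common choice for restoration.
        return decoded + self.body(x)

# Minimal training step against the original (uncompressed) frame.
model = FusionCNN(num_predictions=3)
opt = torch.optim.Adam(model.parameters(), lr=1e-4)
decoded = torch.rand(2, 1, 64, 64)
preds = torch.rand(2, 3, 64, 64)
original = torch.rand(2, 1, 64, 64)
loss = nn.functional.mse_loss(model(decoded, preds), original)
opt.zero_grad(); loss.backward(); opt.step()
```

In practice the prediction tensors would come from the first-stage sparsity-prior and motion-compensated predictions rather than random data; the random tensors here only keep the snippet self-contained and runnable.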