Optical Flow-Based Fast Motion Parameters Estimation for Affine Motion Compensation

This study proposes a lightweight solution for estimating the affine parameters in affine motion compensation. Most current approaches start from an initial approximation obtained by standard motion estimation, which estimates only the translation parameters. From there, iterative methods are used to find the best parameters, but they require a significant amount of time. The proposed method speeds up the process in two ways: first, by skipping the evaluation of affine prediction when it is unlikely to bring any coding-efficiency benefit, and second, by estimating better initial values for the iterative process. We use the optical flow between the reference picture and the current picture to quickly estimate the best encoding mode and obtain a better initial estimate. Compared with the state of the art, the method halves the encoding time over the reference, with a loss in coding efficiency below 1%.
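The abstract's key idea is to derive initial affine parameters from the optical flow between the reference and current pictures. A minimal sketch of that fitting step, assuming a dense flow field and a 6-parameter affine model (the function name and the synthetic data below are illustrative, not the authors' implementation):

```python
import numpy as np

def fit_affine_from_flow(flow):
    """Least-squares fit of a 6-parameter affine motion model to a dense
    optical-flow field. flow has shape (H, W, 2) with flow[y, x] = (dx, dy).
    Returns (a, b, c, d, e, f) such that
        dx ~ a*x + b*y + c,   dy ~ d*x + e*y + f.
    """
    h, w = flow.shape[:2]
    ys, xs = np.mgrid[0:h, 0:w]
    # Design matrix: one row [x, y, 1] per pixel.
    A = np.stack([xs.ravel(), ys.ravel(), np.ones(h * w)], axis=1)
    px, _, _, _ = np.linalg.lstsq(A, flow[..., 0].ravel(), rcond=None)
    py, _, _, _ = np.linalg.lstsq(A, flow[..., 1].ravel(), rcond=None)
    return (*px, *py)

# Synthetic check: build a flow field from known affine parameters
# (a slight zoom plus a translation) and recover them.
true_params = (0.05, 0.0, 1.5, 0.0, 0.05, -2.0)
h, w = 16, 16
ys, xs = np.mgrid[0:h, 0:w]
flow = np.empty((h, w, 2))
flow[..., 0] = true_params[0] * xs + true_params[1] * ys + true_params[2]
flow[..., 1] = true_params[3] * xs + true_params[4] * ys + true_params[5]
est = fit_affine_from_flow(flow)
```

In an encoder, parameters fitted this way would presumably seed the iterative affine search (and gate whether affine prediction is tried at all) instead of starting from a translation-only estimate.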

Bibliographic Details
Main Authors: Antoine Chauvet, Yoshihiro Sugaya, Tomo Miyazaki, Shinichiro Omachi
Format: Article
Language: English
Published: MDPI AG, 2020-01-01
Series: Applied Sciences
Subjects: block-based coding; video coding; H.265/HEVC; affine motion compensation
Online Access: https://www.mdpi.com/2076-3417/10/2/729
id doaj-168498a4ec854fc7be38ce66583a6bab
record_format Article
citation Applied Sciences, vol. 10, no. 2, article 729, 2020. DOI: 10.3390/app10020729
affiliation Graduate School of Engineering, Tohoku University, Sendai, Miyagi 980-8579, Japan (all four authors)
collection DOAJ
language English
format Article
sources DOAJ
author Antoine Chauvet
Yoshihiro Sugaya
Tomo Miyazaki
Shinichiro Omachi
title Optical Flow-Based Fast Motion Parameters Estimation for Affine Motion Compensation
publisher MDPI AG
series Applied Sciences
issn 2076-3417
publishDate 2020-01-01
description This study proposes a lightweight solution for estimating the affine parameters in affine motion compensation. Most current approaches start from an initial approximation obtained by standard motion estimation, which estimates only the translation parameters. From there, iterative methods are used to find the best parameters, but they require a significant amount of time. The proposed method speeds up the process in two ways: first, by skipping the evaluation of affine prediction when it is unlikely to bring any coding-efficiency benefit, and second, by estimating better initial values for the iterative process. We use the optical flow between the reference picture and the current picture to quickly estimate the best encoding mode and obtain a better initial estimate. Compared with the state of the art, the method halves the encoding time over the reference, with a loss in coding efficiency below 1%.
topic block-based coding
video coding
h.265/hevc
affine motion compensation
url https://www.mdpi.com/2076-3417/10/2/729