A Content-Motion-Aware Motion Estimation for Quality-Stationary Video Coding
Main Authors: ,
Format: Article
Language: English
Published: SpringerOpen, 2010-01-01
Series: EURASIP Journal on Advances in Signal Processing
Online Access: http://asp.eurasipjournals.com/content/2010/403634
Summary: Block-matching motion estimation has been actively developed for years, and many papers have presented fast block-matching algorithms (FBMAs) to reduce computational complexity. Nevertheless, their results, in terms of video quality and bitrate, vary considerably with content; very few FBMAs deliver stationary or quasi-stationary video quality across different motion types of video content. Instead of using multiple search algorithms, this paper proposes a quality-stationary motion estimation with a unified search mechanism: a content-motion-aware motion estimation for quality-stationary video coding. Under the rate-control mechanism, the proposed motion estimation, based on a subsampling approach, adaptively adjusts the subsample ratio to the motion level of the video sequence to keep the degradation of video quality low. The proposed approach is a companion to all kinds of FBMAs in H.264/AVC. As the experimental results show, the proposed approach produces stationary quality. Compared with the full-search block-matching algorithm, the quality degradation is less than 0.36 dB while the average power saving is 69.6%. When the proposed approach is applied to the fast motion estimation (FME) algorithm in the H.264/AVC JM reference software, it saves 62.2% of the power consumption while the quality degradation is less than 0.27 dB. (An illustrative sketch of the adaptive subsampling idea follows this record.)
ISSN: 1687-6172, 1687-6180
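
The abstract describes the core mechanism only at a high level: a subsample-based block-matching cost whose subsample ratio tracks the sequence's motion level under rate control. The C sketch below is one plausible reading of that idea, not the paper's actual method; the function names, the 16x16 block size, the step values, and the motion-level thresholds are all illustrative assumptions.

```c
#include <stdint.h>
#include <stdlib.h>

/* Subsampled sum of absolute differences (SAD) for a 16x16 block.
 * `step` is the subsample ratio along each axis: step = 1 visits every
 * pixel (full SAD), step = 2 visits 1/4 of the pixels, step = 4 visits
 * 1/16. Skipping pixels trades matching accuracy for computation/power. */
static uint32_t sad_subsampled(const uint8_t *cur, const uint8_t *ref,
                               int stride, int step)
{
    uint32_t sad = 0;
    for (int y = 0; y < 16; y += step)
        for (int x = 0; x < 16; x += step)
            sad += abs((int)cur[y * stride + x] - (int)ref[y * stride + x]);
    return sad;
}

/* Hypothetical motion-level-to-subsample-ratio mapping: low-motion
 * content tolerates aggressive subsampling, while fast motion needs
 * denser sampling to keep quality degradation low. The thresholds are
 * placeholders, not values from the paper; `motion_level` could be,
 * e.g., the mean motion-vector magnitude of the previous frame. */
static int choose_subsample_step(double motion_level)
{
    if (motion_level < 1.0) return 4;  /* nearly static content  */
    if (motion_level < 4.0) return 2;  /* moderate motion        */
    return 1;                          /* fast motion: full SAD  */
}
```

In a full encoder, a cost function like `sad_subsampled` would stand in for the full SAD inside whatever FBMA search loop is in use (full search, or the FME in the JM reference software), which matches the abstract's framing of the technique as a companion to existing FBMAs rather than a new search pattern.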