Bayesian motion estimation using video frame differences

Date

1998-05

Publisher

Texas Tech University

Abstract

In modern video compression, two major research directions are under intense scrutiny because of their potential to significantly improve existing techniques. The first focuses on new orthogonal transformations of the video signal, such as wavelets, to repack the signal energy more efficiently. The second concentrates on motion prediction and compensation to remove or reduce temporal redundancy in successive video frames. In this respect, motion estimation, which may refer to image-plane motion (2-D motion) or object motion (3-D motion), is one of the fundamental problems in digital video processing and has been the subject of much research effort since the early 1980s.

Because of the ill-posed nature of the problem, motion estimation algorithms require supplementary assumptions (models) about the structure of the 2-D motion field. In this study, 2-D motion estimation is formulated as a Bayesian estimation problem, and a stochastic smoothness constraint is introduced by modeling the 2-D motion vector field in terms of a Gibbs distribution.
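As a rough illustration of this formulation (the symbols are introduced here for exposition only: d denotes the 2-D motion field, g_{k-1} and g_k two successive frames, U the Gibbs energy, and V_c the clique potentials), the Bayesian estimate and the Gibbs prior can be written as

\hat{d} = \arg\max_{d}\; p(d \mid g_{k-1}, g_k) \;\propto\; p(g_k \mid d, g_{k-1})\, p(d),
\qquad
p(d) = \frac{1}{Z}\exp\{-U(d)\}, \quad U(d) = \sum_{c \in \mathcal{C}} V_c(d),

where the clique potentials V_c penalize differences between neighboring motion vectors, which is one common way of encoding the smoothness constraint.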

The motion vector model proposed in this dissertation is a globally smooth model based on vector Markov random fields, and the estimation criterion is maximum a posteriori (MAP) probability, in which the a posteriori probability of the motion field is maximized given the input data. In contrast to other studies, successive video frame differences are used in this dissertation to estimate the motion. The MAP estimate is computed through simulated annealing, in which the solution space is sampled by means of the Gibbs sampler, as sketched below. Bad data are rejected using a variant of the local outlier rejection method.
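A minimal sketch of such an annealing/Gibbs-sampler procedure is given below (in Python/NumPy). The squared displaced-frame-difference data term, the 4-neighborhood smoothness term, the candidate displacement set, and the cooling schedule are illustrative assumptions introduced here for exposition; they are not the dissertation's exact formulation, and the outlier rejection step is omitted.

import numpy as np

rng = np.random.default_rng(0)

# Illustrative sketch only: a simple annealed Gibbs sampler over a dense motion field.

def dfd(prev, curr, d, y, x):
    # Displaced frame difference at pixel (y, x) for candidate vector d = (dy, dx).
    h, w = curr.shape
    yy = int(np.clip(y - d[0], 0, h - 1))
    xx = int(np.clip(x - d[1], 0, w - 1))
    return float(curr[y, x]) - float(prev[yy, xx])

def local_energy(prev, curr, field, y, x, d, lam=10.0):
    # Data term (squared DFD) plus a smoothness term over the 4-neighborhood,
    # playing the role of the Gibbs clique potentials.
    e = dfd(prev, curr, d, y, x) ** 2
    h, w = curr.shape
    for dy, dx in ((-1, 0), (1, 0), (0, -1), (0, 1)):
        ny, nx = y + dy, x + dx
        if 0 <= ny < h and 0 <= nx < w:
            e += lam * float(np.sum((d - field[ny, nx]) ** 2))
    return e

def gibbs_map_estimate(prev, curr, candidates, n_sweeps=50, t0=4.0, alpha=0.95):
    # MAP motion estimate via simulated annealing: each site is resampled from its
    # local conditional distribution (Gibbs sampler) while the temperature is lowered.
    h, w = curr.shape
    field = np.zeros((h, w, 2))
    t = t0
    for _ in range(n_sweeps):
        for y in range(h):
            for x in range(w):
                energies = np.array(
                    [local_energy(prev, curr, field, y, x, d) for d in candidates]
                )
                p = np.exp(-(energies - energies.min()) / t)
                p /= p.sum()
                field[y, x] = candidates[rng.choice(len(candidates), p=p)]
        t *= alpha  # cooling schedule
    return field

# Example candidate set: integer displacements in a small search window.
candidates = np.array(
    [(dy, dx) for dy in range(-2, 3) for dx in range(-2, 3)], dtype=float
)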

Experimental results are presented comparing the proposed simulated annealing algorithm with gradient-descent-based algorithms on natural and computer-generated images containing natural and synthetic motion.
