Automotive top-view image generation using orthogonally diverging fisheye cameras

Date

2016-05

Abstract

Advanced Driver Assistance Systems (ADAS) can greatly assist drivers by providing a quick and easy way to visualize the vehicle's entire 360-degree surroundings. We introduce a new camera set-up for a surround-view imaging system that may be part of an ADAS. This set-up involves four wide-angle fisheye cameras with orthogonally diverging camera axes, which captures the entire 360 degrees around the vehicle in four images: front, rear, and two lateral views. Simple perspective transforms can be used to convert these images into a synthesized top-view image, which displays the scene as viewed from above the vehicle. These transforms, however, are typically derived using a basic calibration procedure that is only capable of correctly mapping ground-plane points in captured images to their corresponding locations in the top-view image; consequently, all off-the-ground points look distorted. We present a new method for calibrating a top-view image in which objects and off-the-ground points are accurately represented. We also present a method for using specifically designed disparity search bands to segment the scene in the overlapping field-of-view (FOV) regions between adjacent cameras, each pair of which is effectively a stereo imaging system. Such wide-baseline stereo systems with orthogonally diverging camera axes make stereo matching difficult, and traditional correspondence algorithms cannot reliably generate the dense disparity maps that could be computed in a parallel stereo set-up involving cameras that follow a rectilinear model. We segment the scene into the ground plane, objects of interest, and the background, and show that our new virtual camera calibration parameters can be applied to represent objects in the scene in a more realistic manner.
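As a rough illustration of the ground-plane mapping the abstract describes (not the authors' specific calibration procedure), the sketch below estimates a homography from four hypothetical ground-plane point correspondences and warps a single undistorted camera image into a top-view canvas. The OpenCV calls, point coordinates, file names, and canvas size are all assumptions made for the example.

```python
# Minimal sketch (not the paper's method): map ground-plane points from one
# fisheye-corrected camera image to a bird's-eye (top-view) canvas with a
# homography. Point coordinates and image paths are hypothetical.
import cv2
import numpy as np

# Four ground-plane points in the (already undistorted) camera image, e.g.
# corners of a calibration mat painted on the pavement.
src_pts = np.float32([[420, 710], [860, 705], [980, 950], [300, 960]])

# Where those same points should appear in the top-view image, in pixels.
# Choosing a rectangle here enforces the "viewed from directly above" geometry.
dst_pts = np.float32([[500, 300], [700, 300], [700, 500], [500, 500]])

# The resulting homography is valid only for points on the ground plane;
# anything off the ground appears stretched, which is exactly the distortion
# the abstract points out for basic top-view calibration.
H = cv2.getPerspectiveTransform(src_pts, dst_pts)

img = cv2.imread("front_camera_undistorted.png")       # hypothetical input image
top_view = cv2.warpPerspective(img, H, (1200, 1200))   # hypothetical canvas size
cv2.imwrite("front_camera_topview.png", top_view)
```

In a full surround-view system, one such warp would be computed per camera and the four warped images composited on a common ground-plane canvas; representing off-the-ground objects without distortion is the problem the calibration method introduced in the abstract addresses.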
