Computer Vision, Embedded Systems

The Spericam V2 is a 360° camera with six 1280 × 960 px image sensors. It features on-board stitching to a 4096 × 2048 px equirectangular panoramic video stream at 60 frames per second. At the time of production, this combination of resolution and frame rate had never been achieved before. Creating a spherical video stream from six sensors introduces several challenges. The two most important are the stitching problem (how to join the six images seamlessly) and color consistency (matching the color balance between the six sensors).


Stitching

Stitching two images seems simple: just make sure they overlap, then shift them around until they match perfectly. This is where the parallax effect comes in. As you may know, the image from your left eye differs from the image from your right eye because their positions are slightly different. Humans actually use this difference to determine the 3D geometry of the space close around us. The illustration below shows how the left-right order of objects can differ per eye (or per sensor). You can test this yourself with a simple experiment: hold one finger right in front of your nose, about 10 cm away, and hold another finger at arm's length. Now close each eye alternately and you will see that the left-right order of your fingers swaps.
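The finger experiment can be sketched numerically. The snippet below is an illustrative model only, with made-up eye spacing and finger positions: it computes the horizontal bearing of a near and a far object from each eye and shows that their left-right order swaps.

```python
import math

def bearing(eye_x, obj_x, obj_z):
    """Horizontal angle (degrees) from an eye at (eye_x, 0) to an object at (obj_x, obj_z)."""
    return math.degrees(math.atan2(obj_x - eye_x, obj_z))

# Two "fingers": one 10 cm away, slightly left of the nose; one 60 cm away, slightly right.
# Positions and the 6.4 cm eye spacing are assumptions for illustration.
near = (-0.01, 0.10)   # (x, z) in metres
far  = ( 0.02, 0.60)

for eye_x, name in [(-0.032, "left eye"), (0.032, "right eye")]:
    a_near = bearing(eye_x, *near)
    a_far  = bearing(eye_x, *far)
    order = "near finger left of far" if a_near < a_far else "near finger right of far"
    print(f"{name}: near at {a_near:+.1f}°, far at {a_far:+.1f}° -> {order}")
```

The same geometry applies to two adjacent camera sensors: an object close to the camera lands in a different relative position in each sensor's image, which is exactly what makes naive overlap-and-shift stitching fail.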

Such images are hard to stitch because they don't overlap consistently. Looking at the issue more carefully, stitching should be defined as combining multiple images from different viewpoints as if they were taken from a single point in the center of those viewpoints. With a proper data set, a neural network could be trained to stitch these images as well as possible, especially when previous or future frames are taken into account.
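One way to picture that definition is to work backwards from the output: for every pixel of the equirectangular frame, compute the viewing direction from the single virtual center point and decide which sensor should supply that pixel. The sketch below is scaled down and assumes a cube-face sensor layout, which is not necessarily the Spericam's actual geometry; it does a nearest-axis assignment, where a real stitcher would blend sensors in the overlap regions instead of picking just one.

```python
import numpy as np

# Scaled-down equirectangular output (the real stream is 4096 x 2048)
W, H = 512, 256
u = (np.arange(W) + 0.5) / W          # 0..1 across the width
v = (np.arange(H) + 0.5) / H          # 0..1 down the height
lon = (u - 0.5) * 2 * np.pi           # longitude: -pi..pi
lat = (0.5 - v) * np.pi               # latitude: +pi/2..-pi/2
lon, lat = np.meshgrid(lon, lat)

# Unit viewing direction for every output pixel, from the virtual center point
dirs = np.stack([np.cos(lat) * np.sin(lon),
                 np.sin(lat),
                 np.cos(lat) * np.cos(lon)], axis=-1)

# Six sensor optical axes (assumed cube-face layout, for illustration only)
axes = np.array([[1, 0, 0], [-1, 0, 0],
                 [0, 1, 0], [0, -1, 0],
                 [0, 0, 1], [0, 0, -1]], dtype=float)

# Which sensor "owns" each output pixel: the one whose axis is closest to the view direction
owner = np.argmax(dirs @ axes.T, axis=-1)
print(owner.shape)   # one sensor index per output pixel
```

The parallax problem lives precisely at the borders between these ownership regions: the two neighbouring sensors disagree there, and that disagreement is what a trained network (or a classical seam-and-blend algorithm) has to resolve.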

Color consistency

To squeeze out the maximum image quality, Brain Builders delved deeply into the theory of CMOS sensors. We reduced (temperature-dependent) black-level noise, static noise and random noise as much as possible, using a combination of factory calibration and online adjustments. After these improvements, we removed lens distortion, chromatic aberration and vignetting using an automated factory calibration process.
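In code, a raw pipeline along these lines might look roughly as follows. This is a minimal sketch with made-up calibration data: a scalar black level, a static dark frame standing in for fixed-pattern noise, and a per-pixel gain map standing in for vignetting correction.

```python
import numpy as np

rng = np.random.default_rng(0)
# Stand-in for one sensor's raw frame (1280 x 960, values in sensor counts)
raw = rng.integers(64, 1024, size=(960, 1280)).astype(np.float32)

# Made-up calibration data, for illustration only:
black_level = 64.0                                                  # factory-calibrated, temperature-adjusted online
dark_frame  = rng.normal(0.0, 0.5, raw.shape).astype(np.float32)    # static (fixed-pattern) noise per pixel
gain_map    = np.ones(raw.shape, dtype=np.float32)                  # per-pixel vignetting correction (flat here)

# Subtract black level and fixed-pattern noise, then flatten vignetting
corrected = (raw - black_level - dark_frame) * gain_map
corrected = np.clip(corrected, 0.0, None)   # random noise can push dark pixels below zero
```

Lens distortion and chromatic aberration removal would follow as a geometric resampling step per colour channel, which is omitted here.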

Due to tiny misalignments of the Bayer filter on the sensor, the color balance of each sensor varies slightly. Normally this is hardly visible, and a camera's auto white balance will compensate for these variations. But per-sensor auto white balance doesn't work in a multi-sensor camera: each sensor would manage its own color balance, resulting in a colorful patchwork in the stitched image. We therefore disabled the sensors' auto white balance and color-calibrated each sensor in the factory, resulting in a color correction matrix (CCM) per sensor. A global white balance algorithm, covering all sensors, was then applied.
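A minimal sketch of this two-stage approach follows, with made-up images and identity CCMs standing in for the factory calibration data. The global step here uses a gray-world white balance; the text does not specify which algorithm the camera actually uses.

```python
import numpy as np

rng = np.random.default_rng(1)
# Six sensor images (random stand-ins, scaled down from 1280 x 960)
images = [rng.random((240, 320, 3)).astype(np.float32) for _ in range(6)]
# Per-sensor 3x3 CCM from factory calibration (identity matrices here, for illustration)
ccms = [np.eye(3, dtype=np.float32) for _ in range(6)]

# Stage 1: per-sensor colour correction brings all sensors to a common colour response
corrected = [img @ ccm.T for img, ccm in zip(images, ccms)]

# Stage 2: ONE set of channel gains computed over ALL sensors together,
# so every sensor is white-balanced identically (gray-world assumption)
mean_rgb = np.mean([img.reshape(-1, 3).mean(axis=0) for img in corrected], axis=0)
gains = mean_rgb.mean() / mean_rgb
balanced = [img * gains for img in corrected]
```

Because the gains are shared, neighbouring sensors can no longer drift apart in colour, which is what prevents the patchwork effect in the stitched panorama.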
