Stereo Camera

Stereo Disparity

Humans are able to perceive depth because we have two eyes that view the scene from slightly different positions.

When the camera moves (or when a second camera views the same scene), the apparent movement of objects between the images forms a pixel disparity: nearby objects shift a lot, distant objects shift only a little. By calculating the disparity, we can quantitatively determine which objects are far away and which are close (see the depth sketch below).

  • similar to the parallax effect
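The disparity-to-depth relation follows the standard pinhole stereo model, Z = f·B / d, where f is the focal length in pixels, B is the camera baseline, and d is the disparity. A minimal sketch of this relation; the variable names and example values below are illustrative, not taken from the original notes:

  #include <iostream>

  // Depth from disparity under the pinhole stereo model.
  // focal_px: focal length in pixels, baseline_m: distance between cameras in meters,
  // disparity_px: measured pixel disparity. Larger disparity means a closer point.
  double depthFromDisparity(double focal_px, double baseline_m, double disparity_px) {
      return focal_px * baseline_m / disparity_px;
  }

  int main() {
      // Example: 700 px focal length, 12 cm baseline, 35 px disparity -> Z = 2.4 m.
      std::cout << depthFromDisparity(700.0, 0.12, 35.0) << " m\n";
      return 0;
  }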

Ways to compute this:

Disparity Map

  // Scale the result and write it to disk. Disparities are in Q10.5 fixed-point
  // format, so dividing by 32 converts them to real-valued disparities. The
  // resulting range, 0 to params.stereo_params.maxDisparity, is then mapped to
  // 0-255 so the map can be stored as an 8-bit image.
  cv_disparity.convertTo(cv_disparity, CV_8UC1, 255.0 / (32 * params.stereo_params.maxDisparity), 0);
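One common way to compute such a disparity map is OpenCV's semi-global block matcher. The sketch below is a minimal example, not the pipeline the snippet above comes from; the file names and parameter values (64 disparity levels, 9x9 blocks) are placeholders:

  #include <opencv2/opencv.hpp>

  int main() {
      // Rectified left/right images of the same scene (assumed inputs).
      cv::Mat left  = cv::imread("left.png",  cv::IMREAD_GRAYSCALE);
      cv::Mat right = cv::imread("right.png", cv::IMREAD_GRAYSCALE);
      if (left.empty() || right.empty()) return 1;

      // Semi-global block matching: minDisparity=0, 64 disparity levels, 9x9 blocks.
      auto sgbm = cv::StereoSGBM::create(0, 64, 9);

      // StereoSGBM outputs 16-bit disparities scaled by 16 (fixed point).
      cv::Mat disp16;
      sgbm->compute(left, right, disp16);

      // Divide by 16 to get real disparities, then map 0..64 to 0..255 for display,
      // mirroring the convertTo call in the snippet above.
      cv::Mat disp8;
      disp16.convertTo(disp8, CV_8UC1, 255.0 / (16.0 * 64));
      cv::imwrite("disparity.png", disp8);
      return 0;
  }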