I'm working on an OpenCV program to find the distance from the camera to a rectangle with a known aspect ratio. Finding the distance to a rectangle as seen from a forward-facing view works just fine:
The actual distance is very close to the distance calculated by this:
    d = c · (w_target · p_image) / (2 · p_target · tan(θ_fov / 2))
Here w_target is the actual width (in inches) of the target, p_image is the width (in pixels) of the overall image, p_target is the widest extent (in pixels) of the detected quadrilateral, and θ_fov is the horizontal field of view of our webcam. The result is multiplied by an empirical correction constant c.
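As a sanity check, the formula above can be written as a small function. This is just a sketch of the pinhole-style estimate described here; the parameter names and the default c = 1 are my own choices, not anything from a library:

```python
import math

def distance_to_target(w_target, p_image, p_target, fov_deg, c=1.0):
    """Estimate distance (same units as w_target) to a forward-facing target.

    w_target: actual width of the target (e.g. inches)
    p_image:  width of the whole image, in pixels
    p_target: widest extent of the detected quadrilateral, in pixels
    fov_deg:  horizontal field of view of the camera, in degrees
    c:        empirical correction constant
    """
    theta = math.radians(fov_deg)
    return c * (w_target * p_image) / (2.0 * p_target * math.tan(theta / 2.0))
```

For example, a 20-inch target that spans half of a 640-pixel-wide frame from a 60° camera comes out at 20 / tan(30°), about 34.6 inches, which matches the intuition that the target subtends half the field of view.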
The issue occurs when the target rectangle is viewed from a perspective that isn't forward-facing:
The difference in actual distance between these two orientations is small, but the detected distance differs by almost 2 feet.
What I'd like to know is how to calculate the distance consistently, accounting for different perspective angles. I've experimented with getPerspectiveTransform, but that requires me to know the resulting scale of the target, and I only know the aspect ratio.