
VR180 Technique: Fixing Stereo Disparity Issues in Insta360 EVO or other VR180 cameras


Insta360 EVO (reviewed here), a popular VR180 / 360 camera, appears to have a stereo disparity issue that results in a hyperstereo effect even for distant objects. Fortunately, it is easily resolved with the right software, which can also be used with other VR180 or 3D 360 cameras that exhibit this issue.

Here is a video by my friend Hugh Hou from CreatorUp that demonstrates the issue. He shows how to solve it with Mistika VR.

Not everyone has Mistika VR. But the good news is that you can also adjust the stereo disparity using a free built-in effect in Adobe Premiere CC 2019 called VR Projection. One of its parameters is called Disparity Adjustment. I applied a 0.5 adjustment (tip: you'll need to type it in instead of using your mouse) and here is the result:

[Side-by-side anaglyph comparison: uncorrected (left) vs. corrected (right)]

On the left side, you see the unedited image in anaglyph format. You can see there is stereo disparity in the very distant cliffs and in the distant parts of the path, which is an unnatural stereo effect (in the real world, distant objects should not appear in stereo). In Premiere, I applied the VR Projection effect and set the Disparity Adjustment to 0.500. The result is on the right side, which appears to have eliminated the unnatural stereo disparity.

You may notice that the side-by-side image now has a line between the two halves. That is because adjusting the stereo disparity has a cropping effect on each side of the photo. In any case, the line is invisible when the image is viewed in VR.
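If you're curious what a disparity adjustment is doing under the hood, it is essentially a horizontal shift of each eye's image so that distant (homologous) points line up, with the vacated columns left blank — which is where the cropping effect comes from. Below is a minimal numpy sketch of this idea; the function name, sign convention, and shift direction are my own illustration, not Premiere's actual implementation:

```python
import numpy as np

def adjust_disparity(sbs_frame: np.ndarray, shift_px: int) -> np.ndarray:
    """Slide each eye of a side-by-side stereo frame horizontally by
    shift_px pixels to realign distant points. The vacated columns are
    filled with black, which is the "cropping effect" that produces a
    visible line in the adjusted side-by-side image.

    Illustrative sketch only: real tools (Premiere's VR Projection,
    Mistika VR) work in the camera's projection and may shift in the
    opposite direction depending on the sign of the adjustment.
    """
    h, w, _ = sbs_frame.shape
    half = w // 2
    if not (0 <= shift_px < half):
        raise ValueError("shift_px must be smaller than one eye's width")

    left = sbs_frame[:, :half]
    right = sbs_frame[:, half:]
    left_shifted = np.zeros_like(left)
    right_shifted = np.zeros_like(right)

    if shift_px > 0:
        # Slide the left eye right and the right eye left, bringing
        # homologous far points closer together.
        left_shifted[:, shift_px:] = left[:, :half - shift_px]
        right_shifted[:, :half - shift_px] = right[:, shift_px:]
    else:
        left_shifted[:] = left
        right_shifted[:] = right

    return np.concatenate([left_shifted, right_shifted], axis=1)
```

A shift of even a few pixels changes where your eyes converge for the whole scene, which is why small values like 0.5 in Premiere can make a noticeable difference in a headset.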

For more information on the Insta360 EVO, or for other tutorials, please subscribe here.

About the author

Mic Ty

6 Comments


  • “a popular VR180 / 360 camera, appears to have a stereo disparity issue that results in a hyperstereo effect even in distant objects.”

    I don’t see how that’s even possible. Stereo disparity is directly related to lens separation, and significant parallax only occurs at close range and tapers off pretty quickly within a few hundred feet. So I don’t see how two lenses close together could record parallax differences of such magnitude as to cause a hyperstereo effect. The only way this might occur is if the lenses are actually diverging, which could be due to the lenses not fully extending out when going from the 360 position to the 180 position.
    Is that what’s going on here?

    Also, with all due respect to Hugh, objects at infinite distance are at… infinite distances (that is… several hundred feet away). His studio example shows a background that is very close and exhibits parallax as it should. The background is therefore the “far” point, not the “infinity” point. The difference between the Qoocam and Evo shots may just be due to a smaller interaxial on the Qoocam. But again, I also wonder if the optical axes of the lenses on the Evo are perfectly parallel when they are in 3D 180 mode.

    But watching this clip again, I wonder if what you guys are actually talking about is just limiting the distance between far points in the left and right images so as to avoid eye divergence when viewing in a VR headset… For example at 6:14 Hugh slides the images sideways and says “less disparity and no disparity”. That’s not quite the case. Disparity does not change when doing this. All he is doing is varying the distance between the left and right image and superimposing the subject that is furthest away in his shot. That changes nothing to the stereo disparity in the scene. It just brings the left and right images closer together so as to avoid eye divergence in the headset. So I am wondering if the right terms are being used here.

    Also, the discomfort he mentions is not caused by the eyes converging but by the eyes diverging because the far points in the picture are a greater distance apart than the optical axes of the viewer lenses. Converging is natural, diverging is not.

    • Thanks Francois. In Premiere, I adjust the “disparity adjustment,” and it directly addresses the issue. So I think it’s really “disparity.” 🙂 And when I say hyperstereo, I mean the stereo effect is exaggerated beyond normal. You can see this effect if you look at the two anaglyph images side by side. The uncorrected image has an exaggerated stereo effect.

      • When I look at your anaglyphs, all I see is that, on the right anaglyph, the right image seems to have been slid to the left so that infinity points match. If that’s all that was done, it isn’t a change in disparity, it’s mainly a realignment of the images.

        However, if there is an actual distortion in the picture itself that’s something else. This is what remains unclear to me. However there is no way I can think of where there would be parallax differences at far distances using a camera that has its lenses close together. At such distances, both lenses see exactly the same thing. So this is why I question the use of the term “hyperstereo”, which refers to a picture taken with a wide lens separation.

        > In Premiere, I adjust the “disparity” adjustment, So i think it’s really “disparity”.

        Well, it’s not binocular disparity since there is no change in parallax. I think that here, the term “disparity” is not used the way it normally is when talking about stereo depth.

          • I looked over all the clips and also took some caps that I examined in SPM. There are no problems in terms of “stereo disparity” and there is no hyperstereo effect. A problem may occur if the homologous points in a stereo pair are too wide apart – that is, if they are wider apart than the optical axes of the viewer lenses. This would force the eyes to diverge. However, if the homologous points at a distance are slightly closer together or cross each other, the image should remain comfortable to view even though it might bring the point of convergence of near objects slightly closer.

            From what I understand, Hugh Hou seems to be saying that the left and right images are too close together so that, in a headset, it brings the point of convergence much closer than it would be in real life. If that’s the case, that would only be obvious with subject matter that is extremely close to the camera – and by that, I mean, about a foot away.

          • Thanks Francois. FWIW, Insta360 inspected the samples and said they are fixing the issue, which they found in earlier production runs.