360 Camera Reviews

Ultimate 360 Camera Ranking chart with PERPETUAL ratings (updated: February 21, 2018)

Ultimate 360 camera comparison and ranking chart

Here is the 360 camera ranking and rating chart.  Please click on the column heading “Photo,” “Video,” or “Usability” to sort the chart into a camera ranking.  To understand the rankings and ratings, please see below.

I updated the Ultimate 360 Camera Comparison Tool to include scores for most of the cameras in three categories: i) photo, ii) video, and iii) usability (features and workflow).   If you sort the table according to those fields, you will see the ranking for that criterion.  Here is a more detailed explanation of the scores and what they mean.
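To make concrete what sorting by a column does, here is a minimal sketch in Python. The camera names and the 9.8/8.8 photo scores come from examples later in this article, but the data structure itself is my own illustration, not the actual chart's code:

```python
# Hypothetical subset of the comparison table (scores are illustrative,
# drawn from examples mentioned elsewhere in this article).
cameras = [
    {"name": "Insta360 ONE", "photo": 8.8},
    {"name": "Xiaomi Mi Sphere", "photo": 8.8},
    {"name": "Panono", "photo": 9.8},
]

# Sorting by one criterion (here "photo") produces the ranking for it,
# just like clicking that column heading in the interactive chart.
ranking = sorted(cameras, key=lambda c: c["photo"], reverse=True)
for camera in ranking:
    print(f'{camera["photo"]:.1f}  {camera["name"]}')
```

Ties (like the two 8.8 scores) simply stay in their original order; as explained below, identical scores mean the cameras are approximately equivalent for that criterion, not literally identical.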

GoPro Fusion review and comparison

EXPLANATION OF CRITERIA

Photo: this includes photo quality (resolution, dynamic range, stitching quality, aberrations, resistance to flare, etc.)  as well as photo features, such as manual exposure, built-in HDR, or the ability to shoot raw.
Video: this includes video quality as well as video features such as stabilization.  A high score primarily means the camera produces high-quality video; the camera’s video-related features (e.g. stabilization) are a secondary factor.
Usability (features & workflow): this is a score that reflects practicality and convenience, with an emphasis on workflow and features. Useful features increase the score, while practical limitations such as limited memory decrease it.  Workflow means the process required from shooting all the way to sharing.  A high rating for usability means the camera is convenient or easy to use and/or has useful features.

MEANING OF SCORES

Based on experience, not specs: The scores reflect how well a particular camera performs based on my experience using it, compared with my experience using my other 360 cameras (I currently have forty-one 360 cameras and panoramic heads), using best practices for each camera (e.g. staying outside the minimum stitching distance).  The specifications have virtually no impact on the scores: if a camera has a higher nominal resolution but its image quality is poor, it will score poorly regardless of its resolution.  Please note that not all of the cameras I own have been included in the chart yet; I will keep expanding the database as time and weather permit.

A qualitative description of the scores would be as follows (note: the scores are not limited to 10 points):

10 – amazing compared to consumer 360 camera standards in February 2018 (very large improvement over a camera considered ‘excellent’ in February 2018)
9 – excellent compared to consumer 360 camera standards in February 2018 (very large improvement over a camera considered ‘good’ in February 2018)
8 – good / average compared to consumer 360 camera standards in February 2018
7 – far below average compared to consumer 360 camera standards in February 2018 (much worse than a camera considered ‘good’ in February 2018)
6 – dismal compared to consumer 360 camera standards in February 2018  (much worse than a camera considered ‘far below average’ in February 2018)
0.1 difference: slight difference
0.3 difference: noticeable difference
0.5 difference: significant difference
1.0 difference: a very large difference
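The mapping above can be sketched as a tiny helper function (a hypothetical illustration of mine, not code from the chart):

```python
def describe_gap(score_a: float, score_b: float) -> str:
    """Map the absolute score difference to the qualitative labels above."""
    diff = round(abs(score_a - score_b), 1)
    if diff >= 1.0:
        return "very large difference"
    if diff >= 0.5:
        return "significant difference"
    if diff >= 0.3:
        return "noticeable difference"
    if diff >= 0.1:
        return "slight difference"
    return "approximately equivalent"

# Panono (9.8) vs. Xiaomi Mi Sphere (8.8) for photo:
print(describe_gap(9.8, 8.8))  # very large difference
```

Note the rounding to one decimal place: the scores themselves are given in tenths, so the gap is classified at the same precision.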

Identical scores don’t mean that the cameras have literally identical performance, but that they are approximately equivalent once their strengths and weaknesses are taken into account for that criterion.  For example, I rated both the Insta360 ONE and the Xiaomi Mi Sphere 8.8 for photo.  This doesn’t mean they have the same photo quality: I find the Mi Sphere has more consistent sharpness and less noise in the shadows, while the Insta360 ONE appears to have better overall dynamic range, plus Adobe DNG raw shooting and stitching.  For those reasons, I would say it’s a toss-up between them for photo.  Note: identical scores between different criteria DO NOT imply the same performance.  In other words, a camera rated an 8 for photo does not have photo quality equal to the video quality of another camera rated an 8 for video.  Each criterion is scored independently, so please do not compare the score of one criterion with the score of a different criterion.

A higher score means I prefer that camera for that particular criterion.  If you sort the comparison table by a criterion (photo, video, or features/workflow), the result is my ranking of the cameras for that criterion.  Nonetheless, there are situations where I would still use a lower-scoring camera (e.g. the Xiaomi instead of a Panono for photo).

The scores are also a rough approximation of the degree of difference between the cameras.  A difference of 0.1 is very subtle; most people will need to review samples carefully side by side to notice it.  A difference of 0.2 or 0.3 means the higher-scoring camera is noticeably better, although reasonable minds could differ and might find the lower-ranked one better for some reason.  A difference of 0.5 means the higher-scoring camera is significantly better in my opinion.  A difference of 1.0 is a very large difference: in my opinion, no knowledgeable person could consider the lower-scoring camera superior to the higher-scoring one for that criterion.  So when I say that the Panono is better than the Xiaomi for photos, you can see from their scores of 9.8 versus 8.8 that there is a huge gap between them.

There are some cameras that I haven’t scored yet, such as the Kandao Obsidian R, VRDL360, and MADV Mini, because I’m still evaluating them.  I also haven’t scored the Vuze and LucidCam: they are so unique that comparing them to other cameras would not be very meaningful (though I might add scores for them in the future).

SCORES ARE CONSTANTLY UPDATED; THE FUTURE

360 cameras can be updated with additional features or improvements.  Whenever a camera is updated, I will update the scores.

360 camera technology will also continue to improve.  How will the scores be affected?  A camera that earns excellent scores today may be considered mediocre in the future, and a mediocre camera of the future may equal or exceed the performance of a camera considered excellent today.  For these reasons, in my opinion it is unfair to give a camera a permanent rating such as “Gold” or “A,” which could incorrectly imply that the camera is superior to a future one rated “B.”

Instead, to account for improvements, I will keep increasing the scores beyond 10 as needed.  This approach lets you compare future cameras against past cameras on the same rating scale.  Again, the scores are NOT on a 10-point scale; as cameras improve, the scores will continue to increase indefinitely.  For example, a camera with a very large improvement over the Insta360 Pro’s video (rated 9.5) would get a video score of 10.5.

The rating system is therefore PERPETUAL — a rating today can be compared with a rating of a future camera, indefinitely.  This is the first and only perpetual rating system in the 360 camera industry.

I’ll try to update the review of each camera to show its scores and my rationale for them.  In the meantime, you can compare the cameras yourself side by side for photos and for videos.  Do you agree with the rankings?  Did any of them surprise you?  Are there rankings you strongly disagree with?  Let me know in the comments!

About the author

Mic Ty

13 Comments


  • Nice comparison table, I would add that Pano5+1 mk II and Panohero H5B can’t be used for dynamic scenes. Perhaps this isn’t clear to everyone.

    • ok thanks Chris. I’ve modified the table to state this (although I said “difficult,” not impossible 🙂 ).

      Best regards,
      Mic

  • Hey Mic,

    After reading over the list, I have to ask: did I miss a firmware or software update that added optical flow stitching for that camera?

      • Hi Rich! The Samsung Gear 360 uses optical flow stitching as long as you stitch with Action Director. You can test this by placing your hand in the stitch line and moving it near and far; you’ll see the hand stitches correctly up to a certain point. Template-based stitching, by contrast, can only stitch correctly at one exact distance. The exception is if you use the Street View app to stitch; that stitching is template-based.
        Best regards,
        Mic

        • I’m pretty sure that Action Director does not use optical flow, unless there was/is an update that I’ve missed. If that were the case, I wouldn’t have made such a fuss about Mystika VR so many months ago.

          • Hi Rich. Try my moving hand test. Having said that, some stitching algorithms are better than others, so I’m not surprised that Mystika can stitch it better. Best regards, Mic

  • Thank you for this comparison! It makes things much clearer.

    But I believe that the lack of 5-6K video resolution, data overlays, and in-camera video stitching are huge drawbacks of my Mi Sphere camera. To my taste, these drawbacks would cost 1-1.5 video points in comparison to the Virb 360, not 0.2 points as in your table.

    • Thank you very much Vladimir. I appreciate the feedback.
      A while back, I actually did a 360 video comparison between Mi Sphere and Virb. You can see here https://www.youtube.com/watch?v=syDrnktJ2xA When I posted it, most people could not see a big difference between them. Some people even thought Mi Sphere was better.

      It is true that Virb has overlays, but then again, it is also very susceptible to flare, and stitching is not as good as the Xiaomi. Xiaomi also has a bit better dynamic range than Virb. See here https://360rumors.com/2018/02/gopro-fusion-details-leaked-specifications-features-price-release-date-revealed.html#comparison So those are some of the factors that reduced the gap between Xiaomi and Virb.

      Best regards,
      Mic

      • Actually you’re right; the technical quality may not be much different.

        But recently I tried to record a business meeting at work, and stitching 30 minutes of 3.5K video took an insanely long time on my PC. If I had a camera with on-board video stitching, it would have made a huge difference. Unfortunately, such cameras are very rare; most rely on post-stitching, even the expensive ones, like the GoPro.

        • Thanks Vladimir. Yes, the in-cam stitching of the Virb is a very useful feature, which is why I rated it very high for features & workflow.
          Best regards,
          Mic

  • Love the site! Found this page a bit confusing, as the link to the comparison table is challenging to find. You might consider hyperlinking the large image at the top and/or creating a more obvious link. I think you’ll find a lot more clickthroughs. Thx!