Here is the 360 camera ranking and rating chart. Click on the column heading “Photo,” “Video,” or “Usability” to sort the chart into a camera ranking for that criterion. To understand the rankings and ratings, please see the explanation below.
I updated the Ultimate 360 Camera Comparison Tool to include scores for most of the cameras in three categories: i) photo, ii) video, and iii) usability (features and workflow). If you sort the table according to those fields, you will see the ranking for that criterion. Here is a more detailed explanation of the scores and what they mean.
EXPLANATION OF CRITERIA
Photo: this includes photo quality (resolution, dynamic range, stitching quality, aberrations, resistance to flare, etc.) as well as photo features, such as manual exposure, built-in HDR, or the ability to shoot raw.
Video: this includes video quality as well as video-related features. A high score means that the camera is very good for video, primarily because of its video quality; the camera’s video-related features (e.g. stabilization) are a secondary factor.
Usability (features & workflow): this is a score that reflects practicality and convenience, with an emphasis on workflow and features. Useful features increase the score, while practical limitations such as limited memory decrease it. Workflow means the process required from shooting all the way to sharing. A high rating for usability means the camera is convenient or easy to use and/or has useful features.
MEANING OF SCORES
Based on experience, not specs: The scores reflect how well a particular camera performs based on my experience using it, compared with my experience using my other 360 cameras (I currently have forty-one 360 cameras and panoramic heads), using best practices for each camera (e.g. staying outside the minimum stitching distance). The specifications have virtually no impact on the scores: if a camera has a higher nominal resolution but its image quality is poor, it will score poorly, regardless of its resolution. Please note that not all of the cameras I own have been included in the chart yet. I will keep expanding the database as time and weather permit.
A qualitative description of the scores would be as follows (note: the scores are not limited to 10 points):
10 – amazing compared to consumer 360 camera standards in February 2018 (very large improvement over a camera considered ‘excellent’ in February 2018)
9 – excellent compared to consumer 360 camera standards in February 2018 (very large improvement over a camera considered ‘good’ in February 2018)
8 – good / average compared to consumer 360 camera standards in February 2018
7 – far below average compared to consumer 360 camera standards in February 2018 (much worse than a camera considered ‘good’ in February 2018)
6 – dismal compared to consumer 360 camera standards in February 2018 (much worse than a camera considered ‘far below average’ in February 2018)
0.1 difference: slight difference
0.3 difference: noticeable difference
0.5 difference: significant difference
1.0 difference: a very large difference
Identical scores don’t mean that the cameras have literally identical performance, but that they are approximately equivalent when their strengths and weaknesses are taken into account for that criterion. For example, for photo, I rated the Insta360 ONE and Xiaomi Mi Sphere both 8.8. This doesn’t mean that they have the same photo quality. I find that the Mi Sphere has better consistency of sharpness and less noise in the shadows. On the other hand, the Insta360 ONE appears to have better overall dynamic range, and it has Adobe DNG raw shooting and stitching. For those reasons, I would say it’s a toss-up between them for photo.

Note: Identical scores between different criteria DO NOT imply the same performance. In other words, a camera rated an 8 for photo doesn’t have photo quality equal to the video quality of another camera rated an 8 for video. Each criterion is scored independently; please do not compare the score of one criterion with the score of a different criterion.
A higher score means I prefer that camera for that particular criterion. If you sort the comparison table according to a criterion (photo, video, or features/workflow), you get my ranking of the cameras for that criterion. Nonetheless, there are some situations where I would use a lower-scoring camera (e.g. the Xiaomi instead of a Panono for photo).
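As a rough illustration, sorting the table by one criterion is just an ordinary descending sort on that column. The sketch below uses only photo scores mentioned in this article (Panono 9.8, Insta360 ONE and Xiaomi Mi Sphere 8.8); the data structure is my own assumption, not the actual comparison tool.

```python
# A minimal sketch of sorting cameras by one criterion to get a ranking.
# Scores are the photo scores quoted in the article.
cameras = [
    {"name": "Insta360 ONE", "photo": 8.8},
    {"name": "Xiaomi Mi Sphere", "photo": 8.8},
    {"name": "Panono", "photo": 9.8},
]

# Sorting by the "photo" column, highest first, yields the photo ranking.
# Python's sort is stable, so tied cameras keep their original order.
ranking = sorted(cameras, key=lambda cam: cam["photo"], reverse=True)

for place, cam in enumerate(ranking, start=1):
    print(f'{place}. {cam["name"]} ({cam["photo"]})')
```

The same sort with a different `key` (e.g. a "video" column) would produce the ranking for that criterion instead.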
The scores are also a rough approximation of the degree of difference between the cameras. A difference of 0.1 is very subtle; most people will need to review samples carefully side by side to notice it. A difference of 0.2 or 0.3 means that the higher-scoring camera is noticeably better, although reasonable minds could differ and might find the lower-ranked one better for some reason. A difference of 0.5 means that the higher-scoring camera is significantly better in my opinion. A difference of 1.0 means a very large difference: in my opinion, no knowledgeable person could consider the lower-scoring camera superior to the higher-scoring camera with respect to that criterion. This way, when I say that the Panono is better than the Xiaomi for photos, you can see from their scores (9.8 versus 8.8) that there is a very large gap between them.
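The 0.1 / 0.3 / 0.5 / 1.0 ladder above can be written out as a small lookup. This is only a sketch of how I map a gap to a label: the function name and threshold code are my own; the labels come from the list above.

```python
def describe_gap(score_a: float, score_b: float) -> str:
    """Translate a score difference into the qualitative labels above.

    Rounding to one decimal mirrors the 0.1 granularity of the scores
    themselves (and sidesteps floating-point noise in the subtraction).
    """
    gap = round(abs(score_a - score_b), 1)
    if gap >= 1.0:
        return "very large difference"
    if gap >= 0.5:
        return "significant difference"
    if gap >= 0.3:
        return "noticeable difference"
    if gap >= 0.1:
        return "slight difference"
    return "toss-up"

# Panono (9.8) vs. Xiaomi Mi Sphere (8.8) for photo:
print(describe_gap(9.8, 8.8))  # very large difference
```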
There are some cameras that I haven’t scored yet, such as the Kandao Obsidian R, VRDL360, and MADV Mini, because I’m still evaluating them. I also haven’t scored the Vuze and LucidCam; they are so unique that comparing them to the other cameras would not be very meaningful (though I might add scores for them in the future).
SCORES ARE CONSTANTLY UPDATED; THE FUTURE
360 cameras can be updated with additional features or improvements. Whenever a camera is updated, I will update the scores.
360 camera technology will also continue to improve. How will the scores be affected? A camera that earns excellent scores today may be considered mediocre in the future. Conversely, a mediocre camera of the future may have equal or better performance than a camera that is considered excellent today. For these reasons, in my opinion it is unfair to give a camera a permanent rating such as “Gold” or “A,” which could incorrectly imply that the camera is superior to a future one rated “B.”
Instead, to account for improvements, I will keep increasing the scores beyond 10 as needed. This approach will allow you to compare future cameras against past cameras on the same rating scale. Again, the scores are NOT on a 10-point scale; as cameras improve, the scores will continue to increase indefinitely. For example, a camera that is a very large improvement over the Insta360 Pro’s video (rated 9.5) would get a video score of 10.5.
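In code terms, the open-ended scale just means scores are never clamped at 10. A hedged sketch using the Insta360 Pro figure quoted above (the function itself is my own illustration, not part of the comparison tool):

```python
def score_after_very_large_improvement(reference_score: float) -> float:
    """A 'very large' improvement corresponds to a +1.0 step on the scale.

    Note there is deliberately no cap at 10: the scale is open-ended,
    so future cameras stay comparable to past ones.
    """
    return round(reference_score + 1.0, 1)

# Insta360 Pro video is rated 9.5; a very large improvement lands at 10.5.
print(score_after_very_large_improvement(9.5))  # 10.5
```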
The rating system is therefore PERPETUAL — a rating today can be compared with a rating of a future camera, indefinitely. This is the first and only perpetual rating system in the 360 camera industry.
I’ll try to update each camera’s review to show its scores and my rationale for them. In the meantime, you can compare the cameras yourself side by side for photos and for videos. Do you agree with the rankings? Did any of them surprise you? Are there rankings that you strongly disagree with? Let me know in the comments!