360 Camera News and Info

Circle Optics Hydra is a 12K 360 camera with “zero stitching” (updated Nov. 25, 2019)


Circle Optics Hydra is a professional 360 camera with 12K resolution, 12-bit video, and no stitching required.

Conventional professional 360 cameras use several fisheye lenses and multiple sensors to capture separate videos that are then stitched into a single 360 video. For high-resolution cameras, stitching can be very time-consuming and is susceptible to stitching errors caused by parallax.
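To get a feel for why parallax matters, here is a rough back-of-the-envelope sketch. The 6 cm lens spacing is a made-up figure for illustration, not a Hydra spec: the angular disparity between two adjacent lenses shrinks as the subject moves farther away, which is why multi-lens rigs specify a minimum stitching distance.

```python
import math

def parallax_shift_deg(baseline_m: float, distance_m: float) -> float:
    """Angular disparity between two lenses separated by baseline_m
    when both view a subject distance_m away."""
    return math.degrees(math.atan2(baseline_m, distance_m))

# Hypothetical 6 cm spacing between adjacent no-parallax points.
baseline = 0.06
for d in (0.5, 1.0, 3 * 0.3048, 5.0):  # 3 ft is roughly 0.91 m
    print(f"{d:5.2f} m -> {parallax_shift_deg(baseline, d):5.2f} deg")
```

At half a meter the two lenses disagree by several degrees, which is enough to produce visible seams; by a few meters the disparity is a fraction of a degree and easy to hide.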

The Circle Optics Hydra uses a completely different design to capture high-resolution 360 video with no stitching required. Here is a product video:

Here are its specifications:
– field of view: 360 x 330 degrees
– eleven f/2.0 “channels”
– 12K resolution (72 megapixels)
– 60fps
– 12-bit video
– minimum object distance: 3 feet
– 12K live view and live streaming capable
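As a quick sanity check of the listed figures, "12K" at the usual 2:1 equirectangular aspect ratio works out to exactly 72 megapixels, assuming "12K" here means a 12,000-pixel-wide frame (vendors vary on what "K" counts):

```python
# Assumed 12K equirectangular frame: 12,000 px wide, 2:1 aspect ratio.
width = 12_000
height = width // 2          # equirectangular frames are 2:1
megapixels = width * height / 1e6
print(megapixels)            # 72.0, matching the quoted 72-megapixel figure
```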

Here are some more photos of the Hydra:

[Instagram embed: a post shared by Circle Optics (@circleoptics)]

The Hydra appears to be slightly smaller than the Insta360 Pro 2.

[Instagram embed: a post shared by Circle Optics (@circleoptics)]

How does it work? (updated)

Circle Optics claims that the Hydra captures a single image and has “zero stitching.”  A company representative gave the following additional information: “I like to think of Hydra like a compound eye (which have also evolved to polygons), but go into different sensors instead of one big one. So we have 11 different sensors in Hydra, and we aim to have one equirectangular output with our final product. Therefore the sensors aren’t a big cost factor – it is all the glass (a given in the OPI industry).   As for the parallax and 3ft min. obj. distance – my understanding is this comes from a calculation done looking at our current lens designs and assembly tolerances. With tighter tolerances and more building experience behind us, we can drop this distance significantly, but for our first model 3ft it is.”

Given that there are 11 different sensors, it appears that there is stitching involved, although the stitching seems to happen in real time.

Price and availability

Price and availability have not yet been announced, but given the specs, the custom optics, and the presumably large, high-resolution sensors, I speculate that it could be quite expensive. The current plan is to rent it out on a day-by-day basis to studios and VR producers. I’ll update this post as I find out more info. Here is the official website. Thanks to 360video.it for the additional info!

 

About the author

Mic Ty

10 Comments


  • So I can’t edit my previous post with the link to the patent, but based on reading the patent that I linked in another comment:

    They are using multiple sensors, not a single sensor

    The key here is that the optical arrangement of the lens and sensor is such that the nodal point (or no-parallax point) is BEHIND the sensor. This allows them to set multiple cameras around the NPP.

    Reading the patent, the assertion that “no stitching” is occurring is marketing fluff and false. Just like panoramas taken with a nodal head need stitching, so will the images from the sensor.

    HOWEVER – since these images will be taken simultaneously with a fixed lens arrangement and there will be no parallax issues, all of the photometric corrections and lens parameters (distortion, etc.) can be pre-calibrated and the stitching process becomes a fairly simple warp algorithm. No need for the optical flow stitching/depth map creation tricks that are needed for multilens units where the no-parallax points are at different locations.
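A minimal sketch of what such a pre-calibrated warp looks like, assuming precomputed per-pixel lookup tables (the calibration below is a trivial placeholder, not Circle Optics' actual method): once the maps are built offline, each frame needs only a single gather per output pixel, with no optical flow or depth estimation.

```python
import numpy as np

def build_maps(h_out, w_out, h_in, w_in):
    """Stand-in for a one-time per-camera calibration: lookup tables
    mapping each output (equirectangular) pixel to a source pixel.
    A real calibration would bake in lens distortion and orientation;
    here we just use a fixed nearest-neighbor resample as a placeholder."""
    ys = np.linspace(0, h_in - 1, h_out)
    xs = np.linspace(0, w_in - 1, w_out)
    map_y, map_x = np.meshgrid(ys, xs, indexing="ij")
    return map_y.round().astype(int), map_x.round().astype(int)

def warp(frame, map_y, map_x):
    """Per-frame work: one fancy-indexed gather per output pixel."""
    return frame[map_y, map_x]

# Tiny demo: remap a 4x6 "sensor" frame into an 8x12 output.
frame = np.arange(24).reshape(4, 6)
map_y, map_x = build_maps(8, 12, 4, 6)
out = warp(frame, map_y, map_x)
print(out.shape)  # (8, 12)
```

Because the maps never change after calibration, the per-frame cost is just a memory gather, which is what makes real-time output plausible.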

      • I’ll take a look at your video tonight. (I’m on the go and make a point of not using YT on mobile data)

        So they may have far less parallax than other cameras, but perhaps still have some.

        Something ate my other post with the link to the patent, so perhaps if I don’t have a link, things will work better – If you search for Zakariya Niazi or simply US20190289209A1 on Google Patents, you can read their patent which goes into deep technical detail.

        I had assumed that their 3 foot minimum object distance was dictated by depth of field issues, but other aspects of the design might be responsible instead per the information Mic received from a company rep.

        Achieving an optical solution for a fairly wide-angle lens where the NPP is behind the sensor IS pretty unique – this is usually only seen in longer telephoto lenses (I believe one of Nikon’s 300/2.8 lenses has the NPP something like 200mm behind the lens mounting flange, and hence 150mm or so behind the sensor plane.) – but it’s questionable whether this optical uniqueness is beneficial.

It seems to me like this will be one of those cases where advanced mechanical complexity to reduce computational requirements is only a temporary advantage that dissipates rapidly as signal processing capabilities continue to advance. Look at gigabit Ethernet – originally a different standard (1000Base-TX) than what eventually became widely deployed was looking like the frontrunner due to the hardware costing less, but in the end the cost of 1000Base-T’s signal processing hardware at the endpoints dropped far more rapidly than the cost of 1000Base-TX’s Cat6 cable requirements. As a result, TX is a mere footnote in history.

• Very interesting! I’m happy to see how technology in this field is advancing. I hope this solution will bring down prices as well.

• Thanks Robert. I think for professionals, the image quality is the be-all and end-all. They don’t care how it looks, as long as it works. 🙂