VR News and Info

iPhone 15 Pro’s most important feature: what are SPATIAL VIDEOS?


Apple launched iPhone 15 Pro today.  Its most important new feature is the ability to capture spatial videos.  What are they?

Although the iPhone 15 Pro looks like any other iPhone, it has a groundbreaking new feature: the ability to capture spatial videos.  Spatial videos are not just “3D” videos.  They are 6DOF videos (i.e., with 6 degrees of freedom).

That means you can move while viewing the video and the perspective will change accordingly, just like the Princess Leia hologram from Star Wars.

Spatial videos change perspective as you move around

What is the difference between spatial video and 3D?

Is “spatial video” just a fancy name for plain old stereoscopic 3D videos or maybe VR180 videos?  No, spatial video is very different from 3D.  With 3D videos and VR180 videos, the perspective never changes regardless of how you view them.  If you have two sticks, one behind the other, and you took a 3D or VR180 image of them, you’ll never see the stick at the back even if you move your head to one side or the other.

In the GIF animation above, you can see that the guitar’s headstock occludes the lady in some frames but not in the frontal view.  Similarly, the lady is to the left or the right of the cliff in the background, depending on the perspective.  3D and VR180 videos cannot capture this change in perspective.

To be clear, this doesn’t mean that you’ll be able to see the back of the head of the subjects in the video above.  That’s because the spatial video was captured from only one point.  If you move far enough to the side, the spatial video will appear like hollow mannequins chopped vertically.  That is why Apple puts a frame around the spatial video, to limit the range of your movement.

How does iPhone 15 Pro capture 6DOF spatial video?  What about older iPhones?

In the Apple launch video, they showed only the iPhone 15 Pro’s two lenses.  Moreover, the two lenses are less than an inch apart, which means they have very little parallax (the stereoscopic 3D effect won’t be very noticeable unless you are shooting macro subjects).  How could the two lenses possibly capture 6DOF spatial video?
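To see why such a narrow baseline gives so little parallax, you can plug numbers into the standard stereo disparity relation for an idealized pinhole camera pair. The focal length and ~19 mm baseline below are illustrative guesses, not published Apple specs:

```python
# Stereo disparity for an idealized pinhole camera pair:
#   disparity (pixels) = focal_length_px * baseline_m / depth_m
def disparity_px(focal_length_px: float, baseline_m: float, depth_m: float) -> float:
    return focal_length_px * baseline_m / depth_m

# Assumed values for illustration only (not Apple specs):
f_px = 3000.0      # focal length expressed in pixels
baseline = 0.019   # ~19 mm between the two lenses

print(disparity_px(f_px, baseline, 0.3))  # macro subject at 30 cm: large disparity
print(disparity_px(f_px, baseline, 3.0))  # subject at 3 m: an order of magnitude less
```

The disparity falls off linearly with distance, which is why a sub-inch baseline only produces a convincing stereo effect for close subjects.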

The answer is a depth sensor.  Every iPhone Pro since the iPhone 12 Pro has had LiDAR (Light Detection and Ranging), which uses lasers to detect depth.  iPhones also have Face ID, which uses lasers to analyze depth with even more precision.  That depth information, combined with the stereoscopic image from the two lenses, is enough to create 6DOF video.  In fact, there are already apps in the App Store that enable LiDAR-equipped iPhones to capture 6DOF video.
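The core idea — that a per-pixel depth map lets you re-render a scene from a slightly shifted viewpoint — can be sketched in a few lines. This is a toy example with made-up camera intrinsics, not Apple’s actual pipeline:

```python
# Toy sketch of depth-based novel-view rendering: back-project a pixel with
# known depth into 3D, shift the virtual camera, and re-project. Repeating this
# for every pixel is the essence of 6DOF playback from RGB + depth.
def backproject(u, v, depth, fx, fy, cx, cy):
    """Pixel (u, v) with known depth -> 3D point in camera coordinates."""
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return (x, y, depth)

def project(point, fx, fy, cx, cy):
    """3D point -> pixel coordinates."""
    x, y, z = point
    return (fx * x / z + cx, fy * y / z + cy)

# Made-up intrinsics for illustration:
fx = fy = 1000.0
cx = cy = 500.0

p = backproject(600, 500, depth=2.0, fx=fx, fy=fy, cx=cx, cy=cy)
# Move the virtual camera 10 cm to the right (subject shifts left in its frame):
shifted = (p[0] - 0.1, p[1], p[2])
print(project(shifted, fx, fy, cx, cy))  # the pixel moves left: parallax
```

This also shows the limitation mentioned above: pixels that were occluded from the capture point have no depth or color data, which is why moving too far to the side reveals holes.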

So does that mean other iPhones with LiDAR will also get spatial video?  In the demos, the iPhone 15 Pro captures 6DOF videos from several feet away from the subject.  My iPhone 13 Pro doesn’t seem capable of capturing 6DOF video from such distances.  I believe this means the iPhone 15 Pro’s depth sensors are powerful enough to cover longer distances, besides being possibly more detailed (similar to the Face ID sensors).

For older iPhones, it would be technically possible for Apple to enable spatial videos, but only at shorter distances, such as a couple of feet away.  I don’t think they will do it, though, because it would dilute their marketing message and make the iPhone 15 Pro’s spatial video appear to be less of an innovation.

What’s the catch?

The catch is that you’ll need to use the $3499 Apple Vision Pro headset to see the spatial videos.  The Vision Pro headset itself can also capture spatial videos, but people were saying it’s unrealistic to capture spontaneous moments by first donning a headset.  Well, the iPhone 15 Pro addresses that issue and you bet they knew that from the beginning. Apple’s plan for world domination is coming together nicely…

Spatial videos will be available for iPhone 15 Pro at the end of this year.  But Vision Pro is not coming out until next year, so how would users see the spatial videos?  The answer is through AR.  You’ll be able to see the spatial videos as an overlay in your room.

What do you think of spatial videos?  Are they the next big thing for imaging, or will they be only for tech geeks?  Let me know in the comments!

About the author

Mic Ty




    • Hi it’s possible but someone has to make an app for it. And I’m not sure if Apple will make it easy for people to view spatial videos on other devices.

      • The 3D videos made on iPhone can be played back on Quest 3. Meta released an update the day Vision Pro was released.

  • The iPhone 15 Pro’s most important feature is undoubtedly its groundbreaking capability for capturing and viewing SPATIAL VIDEOS. Spatial videos revolutionize the way we experience content by adding an extra dimension to our videos. With spatial videos, the iPhone 15 Pro can capture not only the visuals but also the depth and spatial audio of a scene. This means that when you watch these videos on your device, you’ll feel like you’re right there in the moment, with objects and sounds coming from different directions. It creates an immersive and lifelike experience, whether you’re watching your favorite movie or reliving a cherished memory. Spatial videos mark a significant leap forward in smartphone technology, and the iPhone 15 Pro is leading the way in this exciting new era of content creation and consumption.

  • Hmmm, the lenses are what, millimeters apart?  How does it see “around” something?  Is it being generated with AI?
    The sample is shot at 12′… a view change of about 20-30 degrees, and the lady didn’t move, just the behind-the-scenes gimbal camera.
    Trying to unwrap the physics of this sorcerer’s gack

      • Mic Ty, this article needs a full revision. Spatial video is 3D stereoscopic video; Apple is just labelling the video process differently from what it is. Too many terms with the same definition can create confusion for novices. Let’s be fair here.

    • Simple answer: it’s not. It’s just plain old stereoscopic video. Same technique as for hundreds of years. There’s no 6dof, there’s barely any depth because the lenses are too close. They’re looking to fix that (hopefully) with the iPhone 16. But even still, this is just vr180.

      There’s incredible hype around this for no reason.

  • If Microsoft or GoPro had just announced this, it would be forgotten right after release… But now I don’t know… How do they plan to popularize it, when the capture technology arrives at the end of this year but the device to watch it on won’t come until next year… And at that cost it will be just for the one in 5,000 people who love gadgets and buy everything new… I don’t think this will be successful within two years… Maybe with the next iPhone, when they make glasses) and you still have to put on a headset to see those quick moments… When something takes extra steps, it doesn’t become part of daily life… It’s like vlogging on a 360 camera: you have to mount it and reframe, versus just shooting on a smartphone and getting the picture right away… without dancing with a shaman to see the video)

    • I think at the beginning, it will be too expensive for most people to enjoy. But I think Apple will create an “Apple Vision” (not Pro) that will be much more affordable.

      • Simply, I can’t believe it. It’s impossible to have a “real” video, I mean one equal to reality, with those lenses, from a fixed point, without moving the iPhone. Even with a dozen LiDARs.
        You cannot physically view the “sides” of the subject.
        This will simply be a stupid gadget; you can already see what the effect will be. On Facebook you can already find photos that “shift” the viewing position by a few degrees, created with some software, I don’t remember which.
        Do you remember the old 3D postcards with the lenticular system? The effect is the same. In the GIF animation you can see all the cliff behind the lady, but if the lady was in front, and the IPHONE WAS NOT MOVED, how can you have all the cliff?
        I wonder… AI generative fill. Something like that, and I repeat, Not Real.

  • This is freakin’ awesome!!!
    Sure, there will be people who lack the long-range vision that Apple clearly has. They want it now, and because it is going to evolve over whatever timeline Apple has in mind, they will not see how this is the start of an insane revolution in how we share images we find.
    Bring it on.

  • Quite funny how they are rediscovering 3D phones, 10 years later. OK, they put a little bit of LiDAR depth information into their AI app to fake 6DOF.
    More important, which format and standard will they use to store and play back the files? It was hard to record and play back 3D stuff in the past. Having another proprietary Apple format is not the way to go.
    And as somebody who has seen good 180 and 360 3D stuff, I can’t imagine this will impress me in any way. My EGO is collecting dust with its tiny viewing angle😉

    • Indeed. Facebook already offers this feature using AI/content generation to simulate a warped perspective and create this 3D effect. With the physical limitations of capture it is physically impossible to create an accurate model of the subject which extends more than a couple degrees either side of the perpendicular. The generative textures/fill with the depth information captured will be enough to fool most people, though.

    • Hi Jarno. This is not just going to be 3D. The perspective will change as you move, which is not possible on normal 3D or VR180.

      • The GIF animation in this article has nothing to do with the advertised “spatial video”. This animation was produced from two video frames recorded when a single lens camera was rolling on a dolly around guitar players (i.e. those are two different frames captured in two consecutive but separate time moments … hands of the players and lady’s hair are in different positions in those two frames) … that “look-around” effect was produced in the very old/classic way known for ages (no LIDAR needed to do that).
        When it comes to the LIDAR depth sensor used in the iPhone 15, apparently it’s the IMX611 (made by Sony) with XY resolution at just 140×170 pixels. Surprisingly, Sony doesn’t provide the Z (depth) resolution of that sensor. An interesting discussion about that LIDAR is in the comments of the following article:

        • … looks like the interface for posting comments on this page doesn’t like a “star” character … my earlier comment about XY resolution of LIDAR component must be re-written to 140×170 pixels.

  • Wrong assumption. LiDAR won’t be used at all. It’s called Apple HEVC Stereo Video (MV-HEVC) introduced in WWDC 2023.

    • Luckily this time, the MV-HEVC standard has nothing to do with Apple’s closed “garden” ecosystem. Luckily this time, Apple decided to use an official international standard established in 2014. MV-HEVC stands for Multi View High Efficiency Video Coding. It was developed by a working group of ISO/IEC MPEG (Moving Picture Experts Group) and ITU-T VCEG (Video Coding Experts Group).
      LIDAR data can be stored in that format as one of the views and be adequately used by LIDAR-aware playback software.

  • If you have an iPhone 15 Pro / Max, you can now record spatial videos (Apple’s new iOS 17.2 public beta) and analyze them. Maybe then we can solve the puzzle. Unfortunately I don’t have one 🙂
