
Why the iPhone is the best 3D device without a 3D display

UPDATE: more news, this time about the AR capabilities. ARKit 6 improves the depth awareness of the environment and makes it available to developers and their Apps; for example, scanning an entire house in seconds.

It is no secret that the iPhone has become a great device for taking 3D photos and videos. The excellent quality of the cameras, the great night mode, and the Live Photos feature make it easier to create stunning stereo pairs than on other devices, and some features aren’t even possible on Android devices (not even through third-party Apps). But with the new iOS 16 features, this claim becomes bolder than ever.

But let’s start with what you can already do without the iOS 16 update.

First and foremost, iPhone Apps can use two lenses at a time to capture 3D photos and 3D videos, just like a dedicated 3D camera. With i3DMovieCam, you can take photos and record videos in real time in various 3D formats (or stream to a PC as a 3D webcam). The resulting 3D media fits the full 16:9 surface of any 3D display. You can adjust parallax easily without needing a stereoscopic display, aligning the images according to the subject’s distance before you start capturing. We recommend checking the App’s help to learn how to take control of the results before recording. The developer tried to make the same App for Android, only to find out that it is impossible: the Android SDK does not allow using two cameras at once in any way, not even for previewing images.
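For the developers among you, here is a minimal sketch (our own, not i3DMovieCam’s actual code) of how two-lens simultaneous capture works on iOS through Apple’s AVCaptureMultiCamSession API; device pairing and error handling are simplified:

```swift
import AVFoundation

// A simplified two-lens capture session. Pairing the wide and ultra-wide
// back cameras is an assumption for illustration; other pairings work too.
func makeStereoSession() -> AVCaptureMultiCamSession? {
    guard AVCaptureMultiCamSession.isMultiCamSupported else { return nil }
    let session = AVCaptureMultiCamSession()
    session.beginConfiguration()

    for deviceType in [AVCaptureDevice.DeviceType.builtInWideAngleCamera,
                       .builtInUltraWideCamera] {
        guard let device = AVCaptureDevice.default(deviceType, for: .video, position: .back),
              let input = try? AVCaptureDeviceInput(device: device),
              session.canAddInput(input) else { return nil }
        // Production code must also pick an activeFormat whose
        // isMultiCamSupported flag is true; omitted here for brevity.
        session.addInputWithNoConnections(input)

        let output = AVCaptureVideoDataOutput()
        guard session.canAddOutput(output) else { return nil }
        session.addOutputWithNoConnections(output)

        // Wire each camera to its own output so both streams arrive in parallel.
        guard let port = input.ports(for: .video,
                                     sourceDeviceType: deviceType,
                                     sourceDevicePosition: .back).first else { return nil }
        let connection = AVCaptureConnection(inputPorts: [port], output: output)
        guard session.canAddConnection(connection) else { return nil }
        session.addConnection(connection)
    }

    session.commitConfiguration()
    return session
}
```

A stereoscopic App would then attach sample buffer delegates to both outputs and compose the side-by-side frame from the two streams.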

If you own a ProMa King Tablet, you will find some sample 3D photos taken with i3DMovieCam for iPhone included in its storage.

As many of you already know, photos taken in Portrait mode store depth information. Its official use is to add bokeh and change the focus of the image, but for us 3D enthusiasts it opens up better uses. Stereo Photo Maker can use the depth map of these images to edit and generate a standard 3D image or view it on the Looking Glass, not to mention the official Looking Glass tool that displays the iPhone’s Portrait mode photos as holograms, or even turns them into animated holograms. Lume Pad owners can also import these images into their tablet to watch them in the 4V Lightfield format. Take into account that shooting directly in 3D should produce better results than Portrait mode photos, though the TrueDepth camera of iPhones is among the best available for creating photos with depth maps.
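If you want to play with these depth maps yourself, here is a minimal sketch of extracting one from a Portrait photo using Apple’s documented ImageIO/AVFoundation path (simplified; note that some photos store depth rather than disparity as auxiliary data):

```swift
import AVFoundation
import ImageIO

// Pull the disparity map out of a Portrait mode HEIC/JPEG -- the same data
// Stereo Photo Maker and the Looking Glass tools consume.
func depthData(fromPortraitPhotoAt url: URL) -> AVDepthData? {
    guard let source = CGImageSourceCreateWithURL(url as CFURL, nil),
          let info = CGImageSourceCopyAuxiliaryDataInfoAtIndex(
              source, 0, kCGImageAuxiliaryDataTypeDisparity) as? [AnyHashable: Any],
          let depth = try? AVDepthData(fromDictionaryRepresentation: info) else {
        return nil
    }
    // Normalize to 32-bit depth for easier processing downstream.
    return depth.converting(toDepthDataType: kCVPixelFormatType_DepthFloat32)
}
```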

Now things become more interesting. Some Apps use these depth maps in a way that seems like magic: they let you literally add virtual studio lighting to the image at any position in space, so you can illuminate your subject from angles different from those of the original photo. You get a virtual professional lighting rig just by positioning lights in 3D space with the properties you want: direction, brightness, size, color, and so on. These lights cast realistic shadows, consistent with the position of objects in space. You can try these amazing features with Apps like Focos or Apollo; a toy sketch of the underlying idea follows the image below.

You can do this with the free version of Focos; it would be awesome in stereoscopic 3D.
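How can a flat photo be relit at all? Here is a toy sketch of the general idea (the actual pipelines of Focos and Apollo are proprietary and far more sophisticated): estimate a surface normal from the depth gradient at each pixel, then shade it against a virtual light with Lambert’s cosine law.

```swift
import simd

// Toy relighting: depth gradient -> surface normal -> Lambertian shading.
// `depth` is a row-major width x height buffer, as from a Portrait photo.
func relight(depth: [Float], width: Int, height: Int,
             lightDirection: SIMD3<Float>) -> [Float] {
    let light = normalize(lightDirection)
    var shading = [Float](repeating: 0, count: depth.count)
    for y in 1..<(height - 1) {
        for x in 1..<(width - 1) {
            let i = y * width + x
            // Central finite differences approximate the depth gradient.
            let dzdx = depth[i + 1] - depth[i - 1]
            let dzdy = depth[i + width] - depth[i - width]
            let normal = normalize(SIMD3<Float>(-dzdx, -dzdy, 1))
            // Lambert: brightness falls off with the angle to the light.
            shading[i] = max(0, dot(normal, light))
        }
    }
    return shading // multiply against the original pixels to relight them
}
```

Realistic shadows require a further step, testing for occluders along each light ray through the depth buffer, which is where depth quality really pays off.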

The AR features of the iPhone are also among the best in the industry. Even models without LiDAR achieve good accuracy when calculating the depth of real space, so AR Apps work almost flawlessly. And the LiDAR sensor of the latest models allows you to 3D scan practically anything, or to take measurements instantly and automatically generate a CAD map of a room, among other amazing uses like recording video with depth, from which you can later remove the background or add CGI and 3D models in your favorite video editor, faster than Michael Bay adding explosions in post to his latest movie.
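For the curious, here is a minimal sketch of reading per-frame LiDAR depth through ARKit’s documented scene depth API (rendering and recording omitted):

```swift
import ARKit

// Streams LiDAR depth frames alongside the color camera.
final class DepthReader: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Scene depth is only offered on LiDAR-equipped devices.
        guard ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) else { return }
        let config = ARWorldTrackingConfiguration()
        config.frameSemantics = .sceneDepth
        session.delegate = self
        session.run(config)
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // depthMap is a CVPixelBuffer of metric distances (Float32, meters),
        // aligned with frame.capturedImage -- the basis of video-with-depth.
        if let depth = frame.sceneDepth?.depthMap {
            let w = CVPixelBufferGetWidth(depth)
            let h = CVPixelBufferGetHeight(depth)
            print("Depth frame: \(w)x\(h)")
        }
    }
}
```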

We’re pretty sure all these features are shaping the future of Mixed Reality glasses, like the rumored Apple Glasses.

And with yesterday’s presentation, the use of depth becomes even more prominent. With iOS 16, the main subject of a photo can lightly overlap the clock on the lock screen. Portrait mode photos do this automatically; for photos without depth metadata, artificial intelligence is used, an AI at least as good as Leia and its 2D-to-4V converters.

But there’s a new killer feature: you can drag a subject in a photo and drop it onto another App, and that subject instantly appears without any background, saving you several minutes of editing if you wanted to do it manually, and with more accuracy than using your mouse or finger to trace the borders of the subject you want to isolate. A taste of the underlying segmentation follows the image below.

You used to spend long minutes doing this manually; not anymore.
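Apple does not expose the exact subject-lifting model to developers, but Vision’s person segmentation request, available since iOS 15, gives a taste of the same idea; a hedged sketch:

```swift
import Vision

// One request, one soft matte separating the person from the background.
func personMask(for cgImage: CGImage) -> CVPixelBuffer? {
    let request = VNGeneratePersonSegmentationRequest()
    request.qualityLevel = .accurate   // trade speed for edge quality
    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try? handler.perform([request])
    // Pixel values near 1 mark the subject, near 0 the background.
    return request.results?.first?.pixelBuffer
}
```

The iOS 16 drag-and-drop feature generalizes this to arbitrary subjects (pets, objects), not just people.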

Remember the Apps that add virtual lights to your photos? With upcoming updates, you will be able to use the iPhone as a webcam for your Mac and dynamically change virtual lights while videoconferencing. But, of course, we prefer to use NDI HX Capture to stream i3DMovieCam’s full-screen SBS image to any PC or Mac: a 3D webcam is way better than a 2D one with virtual illumination 🙂

The TrueDepth camera not only allows scanning objects, but also interacting with 3D models in real time.
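Here is a minimal sketch of that real-time interaction using ARKit’s documented face tracking API, the same data that drives Memoji:

```swift
import ARKit

// Live TrueDepth face tracking: each update carries a full face anchor
// whose blend shapes can drive a 3D model in real time.
final class FaceTracker: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let face as ARFaceAnchor in anchors {
            // blendShapes maps expressions (jawOpen, eyeBlinkLeft, ...) to 0...1.
            let jawOpen = face.blendShapes[.jawOpen]?.floatValue ?? 0
            print("Jaw open: \(jawOpen)")
        }
    }
}
```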

Apple Maps now features a realistic 3D view, a new and beautiful schematic 3D view, and the Look Around view, which looks like a 360° 2D image, but when you pan around you can see how nearer objects block the farther ones; all three features rely on 3D scans of the real world. Another feature, already present for some years, is the 3D Memoji, which uses the TrueDepth camera in real time. Finally, there was a small detail in the presentation: a lock screen background of the Earth that rotates in 3D to another angle when you unlock the screen.

This schematic 3D view is so beautiful, even limited by the iPhone’s 2D screen.

Imagine if Apple introduced a glasses-free stereoscopic display on a new iPhone. That would be a killer device, and a dream for us.

All this 3D goodness is designed to put depth into everything, so when Apple finally releases its Mixed/Augmented Reality glasses, it will already have a working ecosystem instead of having to create everything from zero.

But what if, in addition to projecting 3D objects and information layers onto the real world, the Apple Glasses also augmented the iPhone’s screen? They could make 3D images pop out of the iPhone’s screen without blocking the real world. It would not be glasses-free, but it would be cool too.

With glasses or glasses-free, the iPhone’s ecosystem urgently needs stereoscopic visualization to realize the full potential of its 3D technologies.

Ooh!
