Do you have a dual-lens smartphone? Maybe you already have hundreds of 3D/holographic photos


As you may have read in our other articles, depth maps are being used in more and more places. They are part of the workflow of some Virtual/Mixed Reality and Augmented Reality apps; Google Camera (Lens Blur mode) and Seene use them, and now Camarada does too. Lenticular 3D/holographic postcard images are mostly generated from depth maps instead of from separate photos for every view (if you find a 3D/holographic postcard with big "jumps" between one point of view and the next, it is because a couple of photos were used instead of a depth map).

Depth maps can now be calculated much more precisely than they could some time ago. That is why holographic devices like the RH1 and Ultra-D can exist now, why light field cinemas may replace 3D with holographic images, and why smartphones can now produce better bokeh photos than before. The future looks promising.

 

That use of depth maps on dual-lens smartphones to calculate bokeh is the key to extracting depth information and generating a 3D or even holographic image. A depth map can generate an unlimited number of views of the same image: if you need only one extra point of view for 3D, you can generate it (and you can adjust the separation to increase or decrease the strength of the 3D effect), and if you need 45 points of view for a Looking Glass display, you can extract all of those views from the same depth map. The problem (a big one) is that almost any handling of that file will likely destroy this information. Even moving the file from one folder to another can kill the metadata. When you look at a .jpg file, it is often no longer an exact copy of the original: most software today rewrites the file, ignoring the extra metadata or even modifying EXIF attributes, and the operating system itself can do the same. If you want to use that metadata, you need to avoid any processing of the file; you can't treat the image like an image, because most image programs will strip that metadata.
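As a rough illustration of what "treating the image like a file" means, here is a small sketch (Python; the file name is a placeholder) that only reads the raw bytes of the JPEG and never decodes or re-saves it. It looks for Google's GDepth XMP namespace, which Google Camera's Lens Blur uses to embed the depth map, and counts JPEG start-of-image markers, since some dual-camera phones simply append the depth image to the end of the file. A normal EXIF thumbnail already adds one extra marker, so only a count above two is really suggestive; the exact layout varies by vendor, so treat this purely as a quick check.

```python
# quick_depth_check.py - a minimal sketch; "IMG_bokeh.jpg" is a placeholder name.
# The file is only read as raw bytes, never decoded or re-saved, so this check
# cannot strip the embedded depth data.

from pathlib import Path

GDEPTH_NS = b"http://ns.google.com/photos/1.0/depthmap/"  # GDepth XMP namespace (Google Camera Lens Blur)
JPEG_SOI = b"\xff\xd8\xff"                                 # JPEG start-of-image marker

def inspect_bokeh_jpeg(path: str) -> None:
    data = Path(path).read_bytes()
    has_gdepth = GDEPTH_NS in data      # depth map embedded as XMP metadata?
    soi_count = data.count(JPEG_SOI)    # above 2 hints at an appended second image (1 = photo, 1 = EXIF thumbnail)
    print(f"{path}: GDepth XMP: {has_gdepth}, JPEG streams found: {soi_count}")

if __name__ == "__main__":
    inspect_bokeh_jpeg("IMG_bokeh.jpg")  # hypothetical bokeh photo from a dual-lens phone
```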

 

On Android devices you need to open your bokeh image as a file or document, not through the gallery.

 

StereoPhoto Maker and Depthy.me can generate 3D images from those bokeh photos. Even Facebook has copied Depthy.me/Seene (though you can still edit your bokeh photos with Depthy to improve them before publishing), so you can use Seene-like images on that data-trafficking social network. But you have to upload them straight from the original folder. The Facebook app can access your device's file system directly (for bad reasons, very bad ones actually), so it bypasses the usual gallery handling and can open them directly. On Depthy you need to upload the image as a file: instead of browsing the gallery, browse your files (via Documents on Android, or over a USB cable from the phone to the PC). In StereoPhoto Maker you have to open the file directly from the device connected to the PC by USB.

 

In StereoPhoto Maker you open the file by clicking Edit -> Depth map -> Open Jpeg include depth map (load the file directly from your device; don't copy it to the PC).

 

Once the image is recognized and loaded in Depthy or StereoPhoto Maker, you can save it as a 3D photo, and you can also adjust the depth or make an animated GIF.
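For the curious, the sketch below approximates what these tools do when you adjust the depth strength or export a wobble GIF: each generated view shifts pixels horizontally in proportion to their depth value, and the maximum shift controls how strong the 3D effect looks. It assumes you have already exported the colour photo and its greyscale depth map as two separate, same-sized files (the file names are placeholders, and the brighter-is-nearer convention is an assumption), and it uses a very naive warp with no hole filling, so expect artefacts around depth edges.

```python
# views_from_depth.py - naive sketch of depth-based view synthesis.
# "photo.jpg" and "depth.png" are placeholder names for an exported colour
# image and its greyscale depth map at the same resolution.

import numpy as np
from PIL import Image

def synthesize_views(photo_path, depth_path, n_views=8, max_shift=12):
    """Generate n_views images by shifting pixels horizontally in proportion
    to depth; max_shift (in pixels) sets the strength of the 3D effect."""
    img = np.asarray(Image.open(photo_path).convert("RGB"))
    depth = np.asarray(Image.open(depth_path).convert("L"), dtype=np.float32) / 255.0
    # assumption: brighter depth values mean "nearer"; use (1.0 - depth) if yours is the opposite

    h, w, _ = img.shape
    xs = np.arange(w)
    views = []
    for k in np.linspace(-1.0, 1.0, n_views):               # virtual camera swept from left to right
        out = np.zeros_like(img)
        for y in range(h):
            shift = (k * max_shift * depth[y]).astype(int)  # nearer pixels move more
            new_x = np.clip(xs + shift, 0, w - 1)
            out[y, new_x] = img[y, xs]                      # crude forward warp, holes stay black
        views.append(Image.fromarray(out))
    return views

if __name__ == "__main__":
    frames = synthesize_views("photo.jpg", "depth.png", n_views=8, max_shift=12)
    # n_views=2 would give a stereo pair; 45 views could feed a Looking Glass quilt
    wiggle = frames + frames[-2:0:-1]                       # ping-pong order for a Depthy-style wiggle GIF
    wiggle[0].save("wiggle.gif", save_all=True, append_images=wiggle[1:], duration=80, loop=0)
```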

If, after following these steps, you can't load the depth map image, it means your device uses the depth map only internally and discards it after generating the bokeh photo (most likely if your device doesn't let you change the bokeh focus of the photo afterwards), or it saves the depth map outside the file, or in a different format that other apps can't recognize.
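If the quick check from earlier in the article reported more than two embedded JPEG streams but the apps still refuse the file, one crude thing you can try is dumping everything after the last start-of-image marker into its own file and looking at the result. This is purely an assumption about how some phones pack the data; it will not help when the depth map is stored only as XMP or in a proprietary format, and you may simply end up with the EXIF thumbnail.

```python
# dump_last_jpeg.py - crude sketch: write out whatever follows the last JPEG
# start-of-image marker, in case the phone appended the depth image at the end.
# "IMG_bokeh.jpg" and "maybe_depth.jpg" are placeholder names; the source file
# is only read, never modified.

from pathlib import Path

def dump_last_jpeg(src: str, dst: str) -> None:
    data = Path(src).read_bytes()
    last = data.rfind(b"\xff\xd8\xff")   # offset of the last start-of-image marker
    if last <= 0:
        print("Only one JPEG stream found; the depth map is probably discarded or XMP-only.")
        return
    Path(dst).write_bytes(data[last:])
    print(f"Wrote {dst} - open it and check whether it looks like a depth map.")

if __name__ == "__main__":
    dump_last_jpeg("IMG_bokeh.jpg", "maybe_depth.jpg")
```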

 

We hope you are lucky and have a device that stores the depth map correctly.
Is your device a lucky one? Please comment and share which device you used. Or use our chat.

