When you view 3D stereo photos (parallel side-by-side or MPO formats) on your phone with my Google Play Store Android 3D/VR stereo photo viewer app, you would normally use a stereoscope or Cardboard viewer. It turns out you don't need one: you can use the "free viewing" technique instead.
My Samsung Galaxy S6 phone has a horizontal display width of about 4.5 inches, which roughly matches the width of my glasses. Because the center-to-center distance between the left and right eye images on the screen nearly matches the distance between my eyes, I can free view 3D parallel side-by-side photos and actually see them in stereo.
Maybe this works for me because I'm nearsighted. With my glasses off, I place the phone very close to my eyes, relax my focus to a point beyond the phone, and then very slowly move the phone away until the photo comes into focus and I see a 3D image. This is the same technique used to view stereograms, like those in the "Magic Eye" book series of 3D illusions. In my opinion, stereo photos are easier to free view than stereograms.
When I use the 3D/VR stereo photo app in VR mode while free viewing, I can zoom into the photo and pan left and right to see other areas of the image close up. In the Settings menu I turn off Lens Distortion Correction because I'm not using Cardboard VR lenses; with correction disabled, the photo appears distortion-free in VR mode.
I sometimes also use a Bluetooth-connected mouse to move the photo around and zoom in; the TeckNet BM307 Wireless Mouse works great. With the mouse I don't have to tilt the phone to zoom, and tilting the phone makes it harder for me to hold my 3D vision while free viewing. The app Settings also let you turn off the head-movement option while still using a mouse or other Bluetooth controller to move the photo and zoom in.
Another improvement I found for free viewing is to wear a pair of reading glasses bought from a pharmacy/drug store. With the reading glasses I can get closer to the phone and see a bigger image: +3.75 magnification lets me see a photo in focus less than 5 inches from the screen. If you are farsighted, you will probably need reading glasses to view the phone at such a close distance.
iPhone owners can use the free viewing technique with side-by-side photos too. You don't need a viewer app, although one helps if you want to zoom into a photo; pinch-zooming with your fingers will not show the photo correctly in 3D stereo.
If you want to learn more about how to free view, check out this link:
Or do a Google search for "how to free view (3D) images".
Of course, DO NOT free view, or use a viewing app and stereoscope combination, if it gives you eye strain or eye discomfort.
As people get older, some may lose their ability to see in 3D and have poor depth perception, making driving difficult. This is because the brain, not the eyes, controls stereoscopic vision. I wonder whether stereoscopic vision is learned from birth and whether it can be relearned when it diminishes or is lost.
|Example Stereogram generated from flower photo|
If you would like to experiment with stereogram creation using the Processing language and development environment (SDK), I uploaded the code to GitHub for you. I converted the code from an open-source Java desktop project to Processing sketches. The example stereogram above was created with the Processing code from a flower photo I took.
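To give a flavor of what the sketches do, here is a minimal sketch of the core single-image random-dot stereogram idea, written as plain Java rather than the repository's Processing code. It uses the simplified "constraint copying" form of the classic algorithm; the depth values, separation, and class name here are illustrative assumptions, not the actual project code.

```java
import java.util.Random;

public class Stereogram {
    /**
     * Generate a simple single-image random-dot stereogram (SIRDS).
     * Depth values range from 0 upward; a larger depth gives a smaller
     * left/right separation, so that region appears closer to the viewer
     * when free viewed with parallel eyes. This is a simplified sketch,
     * not the repository's exact algorithm.
     */
    static int[][] generate(int[][] depth, int maxSep, long seed) {
        int h = depth.length, w = depth[0].length;
        Random rng = new Random(seed);
        int[][] pix = new int[h][w];
        for (int y = 0; y < h; y++) {
            for (int x = 0; x < w; x++) {
                // Separation shrinks as depth grows (nearer features).
                int sep = maxSep - depth[y][x];
                if (x < sep) {
                    pix[y][x] = rng.nextInt(2);  // fresh random dot (0 or 1)
                } else {
                    pix[y][x] = pix[y][x - sep]; // copy the constrained pixel
                }
            }
        }
        return pix;
    }

    public static void main(String[] args) {
        // A flat (all-zero) depth map yields output periodic with maxSep.
        int[][] depth = new int[4][64];
        int[][] img = generate(depth, 16, 42L);
        System.out.println(img[0][20] == img[0][4]); // prints "true"
    }
}
```

In a real sketch you would map the two dot values to colors and draw the array to the screen; the depth map would come from a grayscale image.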
Original Project Code:
|3D side by side photo of fashion show dancers|
Using the WiFi trigger, the cameras did not perform well: many 3D photos were poorly synchronized, and some shots were missed by one or both cameras. See my analysis below.
Each camera has a 16 mm fixed focal length lens (24 mm equivalent on a 35mm full-frame camera) with manual settings of 1/80 sec, f/4, ISO 1600. The inter-axial spacing between the cameras was increased from the rig's minimum of 86 mm to about 120 mm for a better hyper-stereo effect, given how far the subjects were from the camera. Photos were shot in raw, post-processed in Lightroom, and aligned for 3D stereo with Stereo Photo Maker.
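A common rule of thumb for choosing the stereo base is the "1/30 rule": the inter-axial separation should be roughly 1/30 of the distance to the nearest subject. The tiny calculation below applies it; the distances are illustrative, not measurements from the shoot.

```java
public class StereoBase {
    /**
     * Rule-of-thumb stereo base (the "1/30 rule"): inter-axial
     * separation of about 1/30 of the nearest-subject distance.
     * Wider bases exaggerate depth (hyper-stereo).
     */
    static double recommendedBaseMm(double nearestSubjectMm) {
        return nearestSubjectMm / 30.0;
    }

    public static void main(String[] args) {
        // A 120 mm base suits a nearest subject about 3.6 m away.
        System.out.println(recommendedBaseMm(3600.0)); // prints 120.0
    }
}
```

By this rule, widening the rig from 86 mm to 120 mm simply shifts the ideal nearest-subject distance from about 2.6 m out to about 3.6 m.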
The 3D camera rig was attached to a very lightweight tripod, and at times I was able to lift it high above all the official event photographers in front of me to keep the foreground clutter out of the shot. Here the remote Bluetooth/WiFi combination was useful. In retrospect, I should have used a long shutter cable or PocketWizard remote triggers.
With the 24 mm equivalent lens I was able to capture the entire view of the models, stage, and audience. The show floodlights lit the scene well from my vantage point at the end of the runway, so a flash was not necessary.
The public show space had a large crowd (I'm guessing 300+) watching in a multi-store retail location (with its own WiFi networks), and everyone had their phone out taking pictures, emailing photos, or whatever. So I think there was a lot of interference from WiFi and Bluetooth signals. I assumed the cameras and phone were close enough to each other to override any interference, but I misjudged the outcome based on my home tests.
It seems my Sony Z1s phone's WiFi hot-spot defaults to channel 1 in the 2.4 GHz band; I should have changed this channel setting. Going back the next day with a phone WiFi analyzer, I saw at least 6 strong WiFi signals on channel 1 at my shooting location, plus many more strong signals on other channels. I would recommend a portable router with a strong signal, if one is available, instead of using a phone.
I also discovered a major problem: the NX500 is slow to respond to shutter requests triggered over WiFi, because the camera pauses while saving files to the SD card and does not immediately process network shutter trigger requests. This problem does not occur with wired remote triggers. With fast-moving models it was not possible to consistently capture the best photo moment. For rapid shooting with twin cameras, I would recommend a wired remote shutter release instead of a WiFi trigger.
WiFi control of multiple cameras can have advantages in situations where you can take your time shooting, but not at public fashion shows, where wired remote shutter releases should work more consistently. Unfortunately, the NX500 also cannot take many shots rapidly, a feature needed for this kind of event.
I enjoyed writing the code for the Remote Capture app and learned some new techniques, so all was not lost; it was worth the effort. Maybe you will find the app useful for your photo projects. Please let me know how it works out for you. Thanks!
In the past, when I wrote an Android app using the open-source Processing-Android SDK, I could not combine Android GUI elements with my Processing code: it was not possible to mix Android GUI features with a Processing sketch on the same display screen. I had to rely on special-purpose GUI libraries written by volunteers to add more elaborate graphical elements to an app.
During the past year, the volunteer Processing-Android development team modified the internal code to use Android Fragments instead of Activity classes, allowing Android GUI elements to be combined with Processing sketches within a single screen display in an app.
Searching the Internet, I could not find any examples of code that combined the two GUIs, so I forked the Processing Android demo code on GitHub.com as a base for some new examples I wrote that combine Processing sketches with Android GUI elements. Use this link to see the Android Studio project code:
For Processing coders it's a little more complicated, because you will have to use the Android Studio SDK instead of the Processing SDK to combine the code development libraries.
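As an illustration of the Fragment-based approach, here is a minimal sketch of an Activity that puts a native Android Button above a Processing sketch, assuming the processing-android library's `PFragment` and `CompatUtils` classes; `MySketch` is a placeholder for your own `PApplet` subclass, and this is not taken from my example repository.

```java
// MainActivity.java — a minimal sketch, assuming the processing-android
// library (PFragment, PApplet). "MySketch" is a hypothetical PApplet
// subclass you would supply.
import android.os.Bundle;
import android.view.ViewGroup;
import android.widget.Button;
import android.widget.FrameLayout;
import android.widget.LinearLayout;
import androidx.appcompat.app.AppCompatActivity;
import processing.android.CompatUtils;
import processing.android.PFragment;
import processing.core.PApplet;

public class MainActivity extends AppCompatActivity {
    private PApplet sketch;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);

        // Regular Android GUI: a vertical layout with a native Button
        // above the frame that will host the Processing sketch.
        LinearLayout root = new LinearLayout(this);
        root.setOrientation(LinearLayout.VERTICAL);
        Button button = new Button(this);
        button.setText("Native Android button");
        root.addView(button);

        FrameLayout frame = new FrameLayout(this);
        frame.setId(CompatUtils.getUniqueViewId());
        root.addView(frame, new LinearLayout.LayoutParams(
                ViewGroup.LayoutParams.MATCH_PARENT, 0, 1f));
        setContentView(root);

        // The sketch runs inside a Fragment attached to the frame,
        // so it shares the screen with the Button above it.
        sketch = new MySketch(); // your PApplet subclass
        PFragment fragment = new PFragment(sketch);
        fragment.setView(frame, this);
    }
}
```

The key difference from the old Activity-based approach is that the sketch is just one view in the layout, so any standard Android widget can sit alongside it.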
Here are photos of my landscape-configured twin Samsung NX500 camera 3D stereo rig mounted on a tripod. (I also have a portrait-configured rig, described in a blog post here.) The left-eye camera is mounted upside down to get the smallest possible camera separation distance. The rig has an inter-axial camera lens separation of 86 mm with the camera strap lugs left on; cutting them off would have allowed an 82 mm inter-axial separation.
The cameras are mounted on a Z-bar-shaped support I designed and had 3D printed in rigid plastic by Shapeways.com. A large painter's sanding-paper handle holds the support mount for comfort. The whole rig weighs about 3 lbs with the 16 mm fixed lenses and UV protection filters.
The Sony Z1S smartphone runs a Remote Capture app (Android) that sends broadcast shutter control commands over the phone's WiFi hot-spot local network to the firmware-modified NX500 cameras.
The pictured Bluetooth remote shutter device sends key presses for focus and shutter commands to the app, which then sends the WiFi broadcast commands to the cameras. To my knowledge it is not possible to control the NX500 directly from a Bluetooth controller without a firmware modification; even if it were possible, you would need two controllers wired together for 3D, or one for each camera you want to trigger.
Without the Bluetooth shutter release controller, I tap the phone screen to control the camera operations, which exposes more features (like video).
With this setup I do not need to carry an external portable WiFi DHCP router with me, the phone does it all using WiFi Hot-spot!
The Remote Capture app can also control the twin cameras in video mode to make 3D videos. More cameras can be added to the WiFi network for multiple simultaneous views of a scene, including additional Android smartphone cameras running the Open Camera Remote app.
I have not measured the NX500's shutter synchronization time over WiFi. It is acceptable for non-flash shoots, but not as good as a wired shutter/focus remote. Of course, both cameras have to be locked in focus to get the best synchronization; this is the reason for the focus hold feature in the Remote Capture app. I find synchronization works best in the NX500's manual mode, not auto mode. Capturing the best quality 3D stereo photo depends a lot on subject movement and camera synchronization.
Today on the Google Play Store, I published a free Android open-source camera app: Open Camera Remote. My camera app is a modified version of Mark Harman's open-source Open Camera app for Android phones and tablets. The Open Camera Remote app listens for broadcast camera shutter trigger commands when connected to a local WiFi network.
Used with the WiFi Remote Capture app, you now have the capability to take photos and videos simultaneously with multiple Android smartphone cameras. The WiFi Remote Capture app uses a local WiFi network to transmit capture/shutter release commands to all cameras connected to the same network. You can use this pair of apps for 3D stereo photography, multi-angle video shoots, and multi-camera VR panoramic image capture.
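The trigger mechanism can be sketched as plain UDP datagrams: one sender broadcasts a command and every listener on the network fires its shutter. The command string, port number, and class names below are hypothetical illustrations of the pattern, not the actual wire protocol of my apps; the demo in `main` uses loopback so it runs on one machine.

```java
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;
import java.nio.charset.StandardCharsets;

public class ShutterBroadcast {
    // Hypothetical port; the real apps' protocol is not documented here.
    static final int PORT = 57000;

    /**
     * Send one trigger command as a UDP datagram. In the field, the
     * target would be the subnet broadcast address of the phone's
     * hot-spot network, so every connected camera receives it at once.
     */
    static void sendTrigger(String command, InetAddress target) throws Exception {
        try (DatagramSocket socket = new DatagramSocket()) {
            socket.setBroadcast(true);
            byte[] data = command.getBytes(StandardCharsets.UTF_8);
            socket.send(new DatagramPacket(data, data.length, target, PORT));
        }
    }

    /** Block until one command arrives; each camera/phone runs this side. */
    static String receiveTrigger(DatagramSocket socket) throws Exception {
        byte[] buf = new byte[256];
        DatagramPacket packet = new DatagramPacket(buf, buf.length);
        socket.receive(packet);
        return new String(packet.getData(), 0, packet.getLength(),
                StandardCharsets.UTF_8);
    }

    public static void main(String[] args) throws Exception {
        // Loopback demo: a local listener stands in for a camera.
        try (DatagramSocket listener = new DatagramSocket(PORT)) {
            sendTrigger("SHUTTER", InetAddress.getLoopbackAddress());
            System.out.println(receiveTrigger(listener)); // prints SHUTTER
        }
    }
}
```

Because UDP broadcast is connectionless, adding another camera is just a matter of joining the same WiFi network and listening on the agreed port; no per-camera pairing is needed.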
At a social gathering, party, photo shoot, or other group event, with all the participants running the Open Camera Remote app and connected to the same local WiFi network, you can trigger all the cameras to take pictures at the same instant.
The WiFi Remote Capture app now works with two types of cameras:
1. Samsung NX500/NX1 Cameras
2. Android smart-phones running the Open Camera Remote App
|Screen shot of WiFi Remote Capture Android App|
Here's a link to my new free Android app, which triggers photo focus/shutter and video record/pause operations on firmware-modified Samsung NX500/NX1 cameras.
The app sends shutter control commands to multiple cameras on a local WiFi network via broadcast messages. I use it for 3D photography with a home-made twin NX500 camera 3D rig. It eliminates control wires, but I sometimes had to carry a portable WiFi DHCP router, a phone-charging battery to power the router, and a smartphone to run the Remote Capture app.
Now, with updated app version 1.3, I only need to carry a phone to run the app and use its mobile hot-spot feature as the local WiFi network. With this hot-spot DHCP feature available on the phone, I no longer need to carry a separate router.
Some phones, like my Samsung S6 (T-Mobile), require a SIM card to use the WiFi hot-spot feature, but I was able to run the WiFi hot-spot local network on a Sony Z1S phone (T-Mobile) without a SIM card. That phone also runs the Remote Capture app.
Detailed information about usage and the code you need for the NX500 camera can be found at: