|3D Sunset After Thunderstorm, Andy Modla photographer|
|Tekla3D Free Viewing QR Code (www.tekla3d.com)|
Given a phone with this approximate screen dimension, you can free view on your phone, similar to the way you would view Magic Eye stereogram puzzles. See my blog post about free viewing.
With free viewing you do not use a VR headset!
So I thought it would be helpful to have a QR code that can set up your Google VR/Cardboard apps for free viewing. Basically it tells the app there is very little lens distortion, because you are not using a headset! I created the QR code above with the Google Cardboard Viewer Profile Generator.
I tried this setting on my phone with a few Cardboard apps and it works; for example, with the 3D/VR Reel HGI Fashion app. It may help to have reading glasses to see the phone screen up close; I used +2.0 magnification. I think it helps if the app does not require a lot of head movement. The experience is not as immersive as using a VR headset, but it is better for 3D photos.
If you encounter eye strain or any discomfort, do not free view. Not everyone can do this.
Let me know if this works for you. Good luck!
|Italian Cloister Garden at Cathedral of St. Andrew, 3D parallel photo|
The Italian Cloister Garden at St. Andrew's Cathedral, Glasgow, Scotland, has a stunning mirrored memorial sculpture, fountain, and two-hundred-year-old olive tree (from Tuscany) commemorating the loss of about 800 British-Italian World War II internees. They died in the 1940 sinking of the cruise ship Arandora Star, torpedoed by a German U-boat off the coast of Ireland while carrying them to Canada. Each mirrored steel slab, like a gravestone, has Gospel quotations on one side and classic Italian poetry on the other.
3D parallel photos
Memorial Information - Italian Cloister Garden
News article about a historical fiction book describing the tragedy - http://www.barganews.com/2014/02/12/arandora-star-fiction-serves-truth/
Cathedral renovation news - http://www.catholicherald.co.uk/news/2011/04/07/cathedral-receives-a-glorious-makeover/
We enjoyed a street performance by Clanadonia, a Scottish tribal drum and pipes band.
|Graffiti Street Painting by Artist Fuse|
Walking around downtown Glasgow, on a beautiful sunny day in May, we came across this graffiti street painting. At first I thought it was a Banksy, but later realized it's by the artist Fuse. The painting is located near 48 Miller Street across from the Tobacco Merchant's House (national landmark) at 42 Miller St.
|3D parallel photo of Graffiti Street Painting by Artist Fuse|
Here is the painting location photo in 3D stereo. This was my Henri Cartier-Bresson decisive moment 😵
For more information about the artist Fuse in Glasgow see this article:
You can see a Google street view showing the blank wall on the right of the photo here:
Today I captured this photo of tree bark with my phone camera. It's a peeling American Sycamore tree. Looking at this 2D photo may be puzzling: you can't tell for sure whether the center section is a spot where the bark has already peeled away. Looking at the tree straight on with one eye closed does not answer the question either, and circling the tree isn't an option when all you have is the photo.
In this parallel (side-by-side) 3D photo you can clearly see that the peeling bark has not dropped off the tree yet. It is about 35 cm (14 inches) long.
In this second 3D photo you can see where the bark peeled away, revealing green bark underneath. The bark peel is curved.
More information about Sycamores can be found at https://www.nycgovparks.org/news/daily-plant?id=19242
I shot the 3D photos using the cha-cha method, captured with the Open Camera Remote app on a Samsung S7 phone. The camera app was launched from my 3D/VR Stereo Photo Viewer app. I had to wait for a windless early evening to use the cha-cha technique.
Next I aligned the left and right photos I captured with Stereo Photo Maker software. For the most comfortable stereo effect I placed the peeling bark just behind the stereo viewing window (phone display screen). Then I copied the aligned photos back to my phone for viewing with the 3D/VR Stereo Photo Viewer.
Here I post the camera's largest-resolution photo, cropped, so you can zoom in to see the bark in great detail. In 3D it's very realistic. Sometimes I think 2D photos are boring.
|Street Painting Artist|
I especially like the photo above because the sun highlighted the partially finished work, showing the artist's work in progress in contrast with the inspiration source material. It may look flat in 2D, but I think it looks good in 3D because the depth perception further defines and contrasts the artist, chalk materials, drawing method, shading umbrella, and artwork.
To capture these photos, I used twin Samsung NX500 cameras with Samsung 16 mm prime lenses (24 mm full-frame equivalent) and circular polarizing filters. Several people asked me about the camera rig. It uses a wired shutter release, and each camera was set to f/8, 1/160 sec with variable auto ISO.
|Tracy Lee Stum|
Tracy Lee Stum, from California, autographed her book "The Art of Chalk" for me. She painted the 3D anamorphic chalk art of the ironworkers on the beam above.
An example of 3D anamorphic Chalk Art by Rod Tryon, Street Painter
I wonder if the above 3D photo of 3D anamorphic art looks more realistic when viewed stereoscopically. With Stereo Photo Maker I adjusted the stereo window to center on the spider's head (middle ground with zero parallax). This helped make the spider's web appear over the opening.
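Setting the stereo window this way amounts to shifting the left and right images horizontally so a chosen feature ends up with zero disparity. Here is a minimal sketch of that arithmetic in plain Java (the feature coordinates are made up for illustration; the sign convention depends on whether the feature sits in front of or behind the window):

```java
// Compute horizontal shifts that place a chosen feature at zero
// parallax (on the stereo window), splitting the correction
// between the left and right images.
public class StereoWindow {
    // xLeft, xRight: the feature's x coordinate in each image.
    // Returns {leftShift, rightShift} in pixels.
    static int[] zeroParallaxShift(int xLeft, int xRight) {
        int disparity = xRight - xLeft;   // nonzero disparity = off the window
        int half = disparity / 2;
        return new int[] { half, -(disparity - half) };
    }

    public static void main(String[] args) {
        // Example: spider's head at x=412 (left image), x=428 (right image).
        int[] s = zeroParallaxShift(412, 428);
        System.out.println(s[0] + "," + s[1]);   // 8,-8
    }
}
```

Stereo Photo Maker does this (plus rotation and vertical alignment) automatically, but the underlying idea is just this pair of opposite horizontal shifts.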
Street performance artist Randy Orwig at work.
More 3D photos from the festival are here:
On your Android phone copy this link above into the clipboard and view 3D photos with my 3D/VR Stereo Photo Viewer app in your Google Cardboard VR headset.
Last Sunday I attended a photography workshop with a Hunger Games inspired fashion theme sponsored by the South Jersey Photography Co-Op in Oaklyn, NJ. Thanks to Elysian Models (Samantha, Cassie, Kenzie, Angie, and Molly), the event organizers Bill and Scott, hairstylist Cyndi Griffiths, and make-up-artist Chelsea Banks who created the looks.
I used twin Samsung NX500 cameras to capture 3D stereo photos, like the one above. The NX500 cameras used pairs of Nikon 50 mm f/1.8 and Nikon 24 mm f/2.8 lenses with adapters that required manual focus. When I got the focus right, the resulting photos were very sharp. Wirelessly triggered strobes or constant light fixtures provided the lighting. The left and right images were aligned and edited with Stereo Photo Maker software.
I placed my best photos into a Google Android app for viewing with a Cardboard Google VR headset. You can download the free app from the Google play store at:
The app is about 100 MB in size, the largest allowed without expansion files. The app is this large because the photos are 4K+ high resolution.
The app features a Zoom mode: tilt or roll your VR headset clockwise to zoom in (magnify the image) and counter-clockwise to zoom out (shrink it). Pressing the phone's volume up key toggles between variable and fixed zoom modes. Toggle the volume up key to save the zoom level and continue viewing photos at your preferred fixed magnification for all photos. You can also use Bluetooth key controllers or a mouse wheel to change the zoom level.
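The roll-to-zoom idea can be sketched as a simple mapping from headset roll angle to a clamped magnification factor. This is an illustrative sketch only, not the app's actual code; the sensitivity constants are assumptions:

```java
// Illustrative roll-to-zoom mapping: clockwise roll increases
// magnification, counter-clockwise decreases it, clamped to a range.
public class RollZoom {
    static final double MIN_ZOOM = 1.0, MAX_ZOOM = 4.0;
    static final double DEG_PER_STEP = 15.0;   // assumed sensitivity
    static final double ZOOM_PER_STEP = 0.25;  // assumed step size

    static double zoomForRoll(double rollDegrees) {
        double zoom = 1.0 + (rollDegrees / DEG_PER_STEP) * ZOOM_PER_STEP;
        return Math.max(MIN_ZOOM, Math.min(MAX_ZOOM, zoom));
    }

    public static void main(String[] args) {
        System.out.println(zoomForRoll(0));    // 1.0 (level headset)
        System.out.println(zoomForRoll(30));   // 1.5 (rolled clockwise)
        System.out.println(zoomForRoll(-30));  // 1.0 (clamped at minimum)
    }
}
```

A fixed-zoom toggle, like the one the volume key controls, would simply freeze the last value returned by a mapping like this.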
You can re-position the photo viewing window to an area of interest by moving your headset up/down or left/right. I call this technique "swivel viewing", since sitting in a swivel chair works well. This fashion photo app is limited to swivel viewing.
For my 3D/VR Stereo Photo Viewer app, I named my original headset viewing motion technique "couch viewing" because you could lie on a couch and still re-position and zoom the viewing window without a lot of movement. Both of these viewing techniques are available in my 3D/VR Stereo Photo Viewer app found at
This app also has new 360 3D panoramic viewing modes with three types of 3D 360 photo projections available: Equirectangular, Mercator, and Cylindrical.
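For readers curious how these three projections differ, here are the standard textbook mappings from a viewing direction (longitude λ, latitude φ, in radians) to a vertical image coordinate. This is the conventional math for each projection name, not necessarily the app's exact implementation:

```java
// Standard forward mappings for the three named projections.
// x is simply the longitude in all three; they differ only in y.
public class Projections {
    static double equirectY(double lat)  { return lat; }                 // linear in latitude
    static double mercatorY(double lat)  {                               // stretches toward poles
        return Math.log(Math.tan(Math.PI / 4 + lat / 2));
    }
    static double cylindricalY(double lat) { return Math.tan(lat); }     // central cylindrical

    public static void main(String[] args) {
        // Sanity check: at the equator (lat = 0) all three agree on y = 0.
        double eps = 1e-9;
        boolean agree = Math.abs(equirectY(0)) < eps
                     && Math.abs(mercatorY(0)) < eps
                     && Math.abs(cylindricalY(0)) < eps;
        System.out.println(agree);   // true
    }
}
```

The practical difference is at high latitudes: equirectangular keeps the poles at finite height, while Mercator and central cylindrical both stretch toward infinity near the poles.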
Here are the other photos you will find in the 3D/VR fashion app, but presented here at a lower resolution:
Updated 2017-07-08: added a link to a YouTube video on the Magic Eye free-viewing technique and a book on learning to see in 3D (Susan Barry).
When you view 3D stereo photos (parallel, side-by-side, MPO formats) on your phone with my Google Play Store Android 3D/VR stereo photo viewer app, you would normally use a stereoscope or Cardboard viewer.
It turns out you don't need a stereoscope or Cardboard viewer after all; instead you can use the "free viewing" technique.
My Samsung Galaxy S6 phone has a horizontal display width that approximates the width of my glasses. Because my phone display width is about 4.5 inches, and the center to center distance between the left and right eye images on the phone nearly matches the distance between my eyes, I can free view 3D parallel side-by-side photos and actually see them in stereo.
Maybe this works for me because I'm nearsighted. With my glasses off I place the phone very close to my eyes, relax my eye focus to a point beyond the phone, and then very slowly move the phone away until the photo comes into focus and I see a 3D image. This is the same technique used to view stereograms, as found in the "Magic Eye" book series, which are multiple volumes of 3D illusions. Stereo photos are easier to free view than stereograms, in my opinion.
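The geometry behind this can be checked with a quick calculation. The 4.5 inch screen width is my phone's; the 63 mm average interpupillary distance (IPD) is a commonly quoted adult figure, not something I measured:

```java
// Quick check: center-to-center spacing of the two half-images on a
// phone screen versus average eye separation (IPD).
public class FreeViewCheck {
    public static void main(String[] args) {
        double screenWidthIn = 4.5;     // Galaxy S6 display width (approx.)
        double mmPerInch = 25.4;
        // Each half-image is centered in its half of the screen, so the
        // two image centers sit half the screen width apart.
        double centerSpacingMm = (screenWidthIn / 2.0) * mmPerInch;
        double averageIpdMm = 63.0;     // commonly quoted adult average
        System.out.println(Math.round(centerSpacingMm) + " mm spacing vs "
                + Math.round(averageIpdMm) + " mm IPD");
    }
}
```

Because the ~57 mm image spacing is just under the average IPD, the eyes can fuse the pair with parallel (or very slightly converged) lines of sight, which is why this phone size happens to work for parallel free viewing.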
When I use the 3D/VR stereo photo app in VR mode while free viewing, I can zoom into the photo and pan left and right to see other areas of the image close up. In the Settings menu I turn off Lens Distortion Correction because I'm not using Cardboard VR lenses; with it off, the photo is distortion-free in VR mode.
I sometimes also use a Bluetooth connected mouse to move the photo around and zoom into the photo. The TeckNet BM307 Wireless Mouse works great. When I use the mouse I don't have to tilt the phone to zoom in. Tilting the phone makes it harder for me to maintain my 3D vision when free viewing. Also the app Settings allow you to turn off the head movement option, but still use a mouse or other Bluetooth controller to move the photo and zoom in.
Another improvement on free viewing I found is to wear a pair of reading glasses bought from a pharmacy/drug store. With the reading glasses I can get closer to the phone and see a bigger image. I use +3.75 magnification to see a photo in focus less than 5 inches from the phone screen. If you are far-sighted you will probably need reading glasses to view the phone at a close distance.
And iPhone owners, you can use the free-viewing technique with side-by-side photos too. You don't need a viewer app, although an app helps if you want to zoom into a photo. Spreading a photo with your fingers to magnify it will not show the photo correctly in 3D stereo.
If you want to learn more about how to free view check out this link
Or do a Google search for "how to free view (3D) images".
An excellent video explaining stereograms, Magic Eye, and free viewing is on YouTube:
Magic Eye: The optical illusion, explained
Of course, DO NOT free view or use a viewing app and stereoscope combination if this gives you eye strain or eye discomfort.
As people get older, some may lose their ability to see in 3D and have poor depth perception, making driving difficult. This is because the brain, not the eyes, controls stereoscopic vision. I wonder if stereoscopic vision is a skill learned from birth, and whether it can be relearned when it diminishes or was never learned.
Since posting this blog I read a book by Susan Barry, "Fixing My Gaze," in which she explains how she learned to see in 3D stereo in middle age.
|Example Stereogram generated from flower photo|
If you would like to experiment with stereogram creation using the Processing language and development environment, I uploaded the code to GitHub for you. I converted the code from an open-source Java desktop project to Processing sketches. The example stereogram above was created with the Processing code using a flower photo I made.
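To give a feel for what the sketches do, here is a bare-bones version of the classic single-image random-dot stereogram (SIRDS) idea in plain Java. This is a simplified sketch of the general algorithm, not the code from my repository: pixels a "separation" apart along each scanline are forced to the same value, and the separation varies with the depth map:

```java
import java.util.Random;

// Minimal one-row autostereogram (SIRDS) generator: the core trick
// behind Magic Eye images. Nearer depth values get a smaller
// separation, which the eye perceives as closer when free viewing.
public class Sirds {
    // depth[x] in 0..1 (0 = far, 1 = near); returns one row of 0/1 dots.
    static int[] row(double[] depth, int baseSep, int depthRange, long seed) {
        Random rnd = new Random(seed);
        int[] pix = new int[depth.length];
        for (int x = 0; x < depth.length; x++) {
            int sep = baseSep - (int) (depth[x] * depthRange);
            if (x >= sep) pix[x] = pix[x - sep];  // repeat the linked pixel
            else pix[x] = rnd.nextInt(2);         // seed with random dots
        }
        return pix;
    }

    public static void main(String[] args) {
        double[] flat = new double[40];           // constant (far) depth
        int[] r = row(flat, 8, 4, 42);
        // With constant depth, the row repeats with period baseSep:
        boolean periodic = true;
        for (int x = 8; x < r.length; x++) periodic &= (r[x] == r[x - 8]);
        System.out.println(periodic);             // true
    }
}
```

A full generator runs this linking logic for every row of an image (and handles hidden-surface cases); replacing the random dots with tiled photo texture gives the patterned stereograms like the flower example above.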
Original Project Code:
|3D side by side photo of fashion show dancers|
Using the WiFi trigger, the cameras did not perform well: many 3D photos were poorly synchronized, and others were missed by one or both cameras. See my analysis below.
Each camera has a 16 mm fixed focal length lens (24 mm equivalent on a 35 mm full-frame camera) with manual settings of 1/80 sec, f/4, ISO 1600. The inter-axial spacing between the cameras was increased from the rig's 86 mm minimum to about 120 mm for a better hyper-stereo effect, given how far the subjects were from the camera. Photos were shot in raw, post-processed in Lightroom, and aligned for 3D stereo with Stereo Photo Maker.
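A common guideline for choosing the inter-axial spacing is the "1/30 rule": the stereo base should be roughly 1/30 of the distance to the nearest subject. I'm citing it here as a general rule of thumb, and the example subject distances are illustrative back-calculations, not measurements from the show:

```java
// "1/30 rule" of thumb for stereo baseline selection.
public class StereoBase {
    // baseline ≈ nearest-subject distance / 30
    static double baselineMm(double subjectDistanceM) {
        return subjectDistanceM * 1000.0 / 30.0;
    }

    public static void main(String[] args) {
        // The rig's 86 mm minimum suits nearest subjects from about 2.6 m,
        // and the widened 120 mm spacing from about 3.6 m.
        System.out.println(Math.round(baselineMm(2.6)));  // 87
        System.out.println(Math.round(baselineMm(3.6)));  // 120
    }
}
```

So widening the rig to 120 mm matches a runway scene where the nearest models stay several meters away; up close, the same spacing would produce excessive parallax.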
The 3D camera rig was attached to a very lightweight tripod, and at times I was able to lift it high above all the official event photographers in front of me to keep the foreground clutter out of the shot. Here the remote Bluetooth/WiFi combination was useful. In retrospect, I should have used a long shutter cable or PocketWizard remote triggers.
With the 24 mm equivalent lenses I was able to capture the entire view of the models, stage, and audience. The show floodlights lit the scene well from my vantage point at the end of the runway, so a flash was not necessary.
The public show space had a large crowd (I'm guessing 300+) in a multi-retail-store business location (with WiFi networks), and everyone had their phone out taking pictures, emailing photos, or whatever. So I think there was a lot of interference from WiFi and Bluetooth signals. I assumed the cameras and phone were close enough to override any interference, but I misjudged the outcome based on my home tests.
It seems my Sony Z1s phone's WiFi hot-spot defaults to channel 1 in the 2.4 GHz band. I should have changed this channel setting. Going back the next day with a phone WiFi analyzer, I saw at least 6 strong WiFi signals on channel 1 at my shooting location, plus many more strong signals on other channels. I would recommend a strong-signal portable router, if one is available, instead of using a phone.
I also discovered a major problem: the NX500 camera responds slowly to shutter requests triggered over WiFi, because a delay occurs while the camera saves files to the SD card, during which it does not immediately process network shutter trigger requests. This problem does not occur with wired remote triggers. With fast-moving models it was not possible to consistently capture the best photo moment. For rapid shooting with twin cameras I would recommend a wired remote shutter release instead of a WiFi trigger.
WiFi control of multiple cameras can have advantages in situations where you can take your time shooting, but not for public fashion shows 😢, where wired remote shutter releases should work more consistently. And unfortunately the NX500 cannot take a lot of shots rapidly, a feature needed for this kind of event.
I enjoyed writing the code for the Remote Capture app and learned some new techniques, so all was not lost; it was worth the effort. Maybe you will find the app useful for your photo projects. Please let me know how it works out for you, thanks!
In the past, when I wrote an Android app with the open-source Processing-Android SDK, I could not combine Android GUI elements with a Processing sketch in a single display screen. I had to rely on special-purpose GUI libraries written by volunteers to add more elaborate graphical elements to an app.
During the past year the open-source Processing-Android development team modified the framework's internals to use Android Fragments instead of Activity classes, thereby allowing Android GUI elements to be combined with Processing sketches within a single screen in an app.
Searching the Internet, I could not find any examples of code that combined the GUIs. I forked the Processing Android Demo code on GitHub to use as a base for some new examples I wrote that combine Processing sketches with Android GUI elements. Use this link to see the Android Studio project code:
For Processing coders it's a little more complicated, because you will have to use Android Studio instead of the Processing development environment to combine the code and libraries.
Here are photos of my landscape-configured twin Samsung NX500 camera 3D stereo rig mounted on a tripod. (I also describe a portrait-configured rig in a blog post here.) The left-eye camera is mounted upside down to get the smallest possible camera separation. The rig has an inter-axial lens separation of 86 mm with the camera strap lugs left on; cutting them off would have allowed 82 mm.
The cameras are mounted on a Z-bar support I designed and had 3D printed in rigid plastic by Shapeways.com. A large painter's sanding-paper handle holds the support mount for comfort. The whole rig weighs about 3 lbs with the 16 mm fixed lenses and UV protection filters.
The Sony Z1s smartphone runs my Remote Capture app (Android), which sends broadcast shutter control commands over the phone's WiFi hot-spot local network to the firmware-modified NX500 cameras.
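The appeal of a broadcast trigger is that one datagram reaches every camera on the hot-spot network at once. Here is a self-contained Java sketch of that idea; the "SHUTTER" command string and port number are hypothetical (the actual protocol between the app and the modified firmware may differ), and the exchange is demonstrated over loopback rather than a real broadcast address so it runs anywhere:

```java
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;
import java.nio.charset.StandardCharsets;

// One UDP datagram stands in for the shutter trigger; on the real rig
// it would be sent to the hot-spot's broadcast address so both cameras
// receive it simultaneously.
public class ShutterTrigger {
    public static void main(String[] args) throws Exception {
        InetAddress addr = InetAddress.getByName("127.0.0.1");
        try (DatagramSocket camera = new DatagramSocket(0, addr); // "camera" listener
             DatagramSocket phone = new DatagramSocket()) {       // "phone" sender
            byte[] cmd = "SHUTTER".getBytes(StandardCharsets.US_ASCII);
            phone.send(new DatagramPacket(cmd, cmd.length,
                                          addr, camera.getLocalPort()));
            byte[] buf = new byte[16];
            DatagramPacket in = new DatagramPacket(buf, buf.length);
            camera.receive(in);   // blocks until the trigger arrives
            System.out.println(new String(in.getData(), 0, in.getLength(),
                                          StandardCharsets.US_ASCII));
        }
    }
}
```

For a real broadcast you would call `socket.setBroadcast(true)` and send to the subnet's broadcast address instead of loopback; since UDP gives no delivery guarantee, synchronization depends on both cameras being idle and ready when the packet arrives.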
The pictured Bluetooth remote shutter device sends key presses for focus and shutter commands to the app, which then sends the WiFi broadcast commands to the cameras. To my knowledge it is not possible to use the Bluetooth controller to trigger the NX500 camera directly without a firmware modification. If a direct Bluetooth shutter remote were possible, you would need two controllers wired together for 3D, or one for each camera you want to trigger.
Without the Bluetooth shutter release controller, I tap the phone screen to control the cameras, which makes more features available (like video).
With this setup I do not need to carry an external portable WiFi DHCP router with me; the phone does it all with its WiFi hot-spot!
The Remote Capture app can also control the twin cameras in video mode to make 3D videos. More cameras can be added to the WiFi network for multiple simultaneous views of a scene, and additional Android smartphone cameras can join the network using the Open Camera Remote app.
I have not measured the NX500 cameras' shutter synchronization time when using WiFi. It is acceptable for non-flash shoots, but not as good as a wired shutter/focus remote. Of course both cameras have to be locked in focus to get the best synchronization; this is the reason for the focus-hold feature in the Remote Capture app. I find synchronization works best in the NX500's manual mode, not auto mode. A lot depends on subject movement and camera synchronization to capture the best quality 3D stereo photo.