Lake Worth Street Painting Festival 2017

Street Painting Artist
A large group of street painting artists, working in ephemeral chalk, gathered in Lake Worth, Florida, February 25-26, to draw their paintings on the pavement during the Lake Worth Street Painting Festival. We went mid-day Sunday, when many works were not yet finished. It was an amazing show of street art stretching for blocks and blocks of Lake Worth. My goal as a 3D photographer was mainly to show the artists at work and the spectators, not always the completed paintings.

I especially like the photo above because the sun highlighted the partially finished work while showing the artist's work-in-progress and inspiration source material in contrast. It may look flat in 2D, but I think it looks good in 3D because the depth perception further defines and contrasts the artist, chalk materials, drawing method, shading umbrella, and artwork.

To capture these photos, I used twin Samsung NX500 cameras with Samsung 16 mm prime lenses (24 mm full-frame equivalent) and circular polarizing filters. Several people asked me about the camera rig. It uses a wired shutter release, and each camera was set to f/8, 1/160 s with variable auto ISO.

More work-in-progress...

Tracy Lee Stum
Tracy Lee Stum, from California, autographed her book "The Art of Chalk" for me. She painted the 3D anamorphic chalk art of the ironworkers on the beam above.

An example of 3D anamorphic Chalk Art by Rod Tryon, Street Painter 

I wonder if the above 3D photo of 3D anamorphic art looks more realistic when viewed stereoscopically. With Stereo Photo Maker I adjusted the stereo window to center on the spider's head (middle ground with zero parallax). This helped make the spider's web appear over the opening.

Street performance artist Randy Orwig at work.

More 3D photos from the festival are here:

On your Android phone, copy the link above to the clipboard and view the 3D photos with my 3D/VR Stereo Photo Viewer app in your Google Cardboard VR headset.


3D/VR HG Inspired Fashion Shoot Photos

Last Sunday I attended a photography workshop with a Hunger Games inspired fashion theme sponsored by the South Jersey Photography Co-Op in Oaklyn, NJ. Thanks to Elysian Models (Samantha, Cassie, Kenzie, Angie, and Molly),  the event organizers Bill and Scott, hairstylist Cyndi Griffiths, and make-up-artist Chelsea Banks who created the looks.

I used twin Samsung NX500 cameras to capture 3D stereo photos, like the one above. The NX500 cameras used pairs of Nikon 50 mm f/1.8 and Nikon 24 mm f/2.8 lenses with adapters that required manual focus. When I got the focus right, the resulting photos were very sharp. Wireless-triggered strobes or constant light fixtures provided the lighting. The left and right images were aligned and edited with Stereo Photo Maker software.

I placed my best photos into a Google Android app for viewing with a Cardboard Google VR headset. You can download the free app from the Google play store at:

The app is about 100 MB in size, the largest Google Play allows without expansion files. The app is this large because the photos are 4K+ high resolution.

The app features a Zoom mode: tilt (roll) your VR headset clockwise to zoom in (magnify the image) and counter-clockwise to zoom out (shrink it). Pressing the phone's volume-up key toggles between variable zoom and fixed zoom; use it to lock in a zoom level and continue viewing all photos at your preferred fixed magnification. You can also use Bluetooth key controllers or a mouse wheel to change the zoom level.
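To give a feel for the tilt-to-zoom idea, here is a small Java sketch. The mapping below (an exponential curve over the roll range) is my own illustration, not the app's actual code:

```java
// Hypothetical tilt-to-zoom mapping: headset roll angle drives an
// exponential zoom factor, so level = 1x, full clockwise roll magnifies
// to 2x, and full counter-clockwise roll shrinks to 0.5x.
public class TiltZoom {
    public static double zoomForRoll(double rollDegrees, double maxRollDegrees) {
        // Clamp the roll to the usable range, then normalize to -1..1.
        double t = Math.max(-1.0, Math.min(1.0, rollDegrees / maxRollDegrees));
        // Exponential response feels smoother than a linear one near 1x.
        return Math.pow(2.0, t);
    }
}
```

With a 45-degree usable roll range, a level headset gives 1x, +45 degrees gives 2x, and -45 degrees gives 0.5x.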

You can re-position the photo viewing window to an area of interest by moving your headset up/down or left/right. I call this technique "swivel viewing", since a swivel chair works well. This fashion photo app is limited to swivel viewing. 

For my 3D/VR Stereo Photo Viewer app, I named my original headset viewing motion technique "couch viewing" because you can lie on a couch and still re-position and zoom the viewing window without much movement. Both of these viewing techniques are available in my 3D/VR Stereo Photo Viewer app, found at

This app also has new 360 3D panoramic viewing modes, with three types of 3D 360 photo projections available: equirectangular, Mercator, and cylindrical.
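For readers curious how these projections differ, here is a rough Java sketch (my own illustration, not the app's actual code) of how a unit viewing direction maps to texture coordinates under each projection:

```java
// Mapping a unit viewing direction (x, y, z) to texture coordinates for
// three common 360-photo projections. u is normalized 0..1 in all three;
// v is normalized for equirectangular, while Mercator and cylindrical v
// stretch without bound toward the poles (which is why those projections
// usually crop the top and bottom of the sphere).
public class Projections {
    // Equirectangular: longitude and latitude map linearly to u and v.
    public static double[] equirectangular(double x, double y, double z) {
        double lon = Math.atan2(x, z);   // -PI .. PI
        double lat = Math.asin(y);       // -PI/2 .. PI/2
        return new double[] { (lon / Math.PI + 1.0) / 2.0,
                              (lat / (Math.PI / 2.0) + 1.0) / 2.0 };
    }

    // Mercator: same u, but v stretches logarithmically toward the poles.
    public static double[] mercator(double x, double y, double z) {
        double lon = Math.atan2(x, z);
        double lat = Math.asin(y);
        double v = Math.log(Math.tan(Math.PI / 4.0 + lat / 2.0));
        return new double[] { (lon / Math.PI + 1.0) / 2.0, v };
    }

    // Cylindrical: v is the height on a unit cylinder (tangent of latitude).
    public static double[] cylindrical(double x, double y, double z) {
        double lon = Math.atan2(x, z);
        double lat = Math.asin(y);
        return new double[] { (lon / Math.PI + 1.0) / 2.0, Math.tan(lat) };
    }
}
```

Looking straight ahead (direction (0, 0, 1)) lands at the center of the image in all three projections; the differences only show up as you look toward the zenith or nadir.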

Here are the other photos you will find in the 3D/VR fashion app, but presented here at a lower resolution:

Free Viewing 3D Stereo Photos

When you view 3D stereo photos (parallel, side-by-side, MPO formats) on your phone with my Google Play Store Android 3D/VR stereo photo viewer app, you would normally use a stereoscope or Cardboard viewer. 

You don't need a stereoscope or Cardboard viewer after all; instead, you can use the "free viewing" technique.

My Samsung Galaxy S6 phone has a horizontal display width, about 4.5 inches, that approximates the width of my glasses. Because the center-to-center distance between the left and right eye images on the screen nearly matches the distance between my eyes, I can free view 3D parallel side-by-side photos and actually see them in stereo.
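The geometry behind this is simple. Here is a back-of-the-envelope check (my own rough numbers, not exact measurements): for parallel side-by-side viewing, each half-image center sits a quarter of the display width in from its edge, so the two centers are half the display width apart.

```java
// For a parallel side-by-side pair filling the screen, the left and right
// image centers are separated by half the display width. Converting to mm
// lets us compare against typical human interpupillary distance (~60-65 mm).
public class FreeView {
    public static double centerSpacingMm(double displayWidthInches) {
        return displayWidthInches * 25.4 / 2.0;  // half the width, in mm
    }
}
```

For a 4.5 inch wide display this gives about 57 mm, just under a typical eye spacing, which is why parallel free viewing is comfortable on a phone this size but gets harder on wider screens.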

Maybe this works for me because I'm near-sighted. With my glasses off I place the phone very close to my eyes, relax my eye focus to a point beyond the phone, and then very slowly move the phone away until the photo comes into focus and I see a 3D image. This is the same technique used to view stereograms, as found in the "Magic Eye" book series, which are multiple volumes of 3D illusions. In my opinion, stereo photos are easier to free view than stereograms.

When I use the 3D/VR stereo photo app in VR mode while free viewing, I can zoom into the photo and pan left and right to see other areas of the image close up. In the Settings menu I turn off Lens Distortion Correction because I'm not using Cardboard VR lenses. This option makes the photo distortion-free in VR mode.

I sometimes also use a Bluetooth-connected mouse to move the photo around and zoom in. The TeckNet BM307 Wireless Mouse works great. With the mouse I don't have to tilt the phone to zoom in; tilting the phone makes it harder for me to maintain my 3D vision when free viewing. The app Settings also let you turn off the head-movement option but still use a mouse or other Bluetooth controller to move the photo and zoom.

Another improvement I found for free viewing is to wear a pair of reading glasses bought from a pharmacy/drug store. With the reading glasses I can get closer to the phone and see a bigger image; I use +3.75 magnification to see a photo in focus less than 5 inches from the screen. If you are far-sighted you will probably need reading glasses to view the phone at a close distance.

And iPhone owners, you can use the free viewing technique with side-by-side photos too. You don't need a viewer app, although one helps if you want to zoom into a photo. Pinch-expanding a photo with your fingers to magnify it will not display it correctly in 3D stereo.

If you want to learn more about how to free view, check out this link
Or do a Google search for "how to free view (3D) images".

Of course, DO NOT free view, or use a viewing app and stereoscope combination, if it gives you eye strain or eye discomfort.

As people get older, some may lose their ability to see in 3D and have poor depth perception, making driving difficult. This is because the brain, not the eyes, controls stereoscopic vision. I wonder if stereoscopic vision is a learned skill from birth and whether it can be relearned when it diminishes or is lost.


Example Stereogram generated from flower photo

If you would like to experiment with stereogram creation using the Processing language and development environment (SDK), I uploaded the code to GitHub for you. I converted the code from an open-source Java desktop project to Processing sketches. The example stereogram above was created with the Processing code using a flower photo I made.
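If you want the gist of the algorithm before diving into the repository, here is a minimal random-dot stereogram sketch in plain Java. It is a simplified illustration of the classic SIRDS technique, not the actual code from my repository:

```java
import java.util.Random;

// Minimal random-dot stereogram generator. depth[y][x] runs from 0.0 (far)
// to 1.0 (near). Each output pixel is either constrained to equal the pixel
// one "separation" to its left, or filled with a random dot. Because the
// separation shrinks where the scene is nearer, the eyes fuse the repeating
// pattern into apparent depth.
public class Sirds {
    public static int[][] generate(double[][] depth, int eyeSep, long seed) {
        int h = depth.length, w = depth[0].length;
        int[][] out = new int[h][w];
        Random rnd = new Random(seed);
        for (int y = 0; y < h; y++) {
            for (int x = 0; x < w; x++) {
                // Nearer points get a smaller left/right separation.
                int sep = (int) Math.round(eyeSep * (1.0 - depth[y][x] / 2.0));
                int left = x - sep;
                if (left >= 0) {
                    out[y][x] = out[y][left];   // constrained pixel
                } else {
                    out[y][x] = rnd.nextInt(2); // free pixel: random dot
                }
            }
        }
        return out;
    }
}
```

A flat (all-zero) depth map produces a pattern that simply repeats every eyeSep pixels, which is the flat background plane you see around the hidden shape in a Magic Eye image.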

Original Project:

Original Project Code:


Fashion Show 3D Stereo Capture Using WiFi

Dancers warming up fashion show audience

3D side by side photo of fashion show dancers
Last week Delray Beach, Florida hosted #DelrayFashionWeek, with proceeds benefiting the Delray Beach Achievement Centers for Children & Families. I went to the opening public fashion show in downtown Delray Beach with my 3D camera rig to photograph the runway action. My goal was to try out the WiFi Remote Capture app as the shutter trigger for the twin Samsung NX500 cameras.

Using the WiFi trigger, the cameras did not perform well: many 3D pairs were poorly synchronized, and some shots were missed by one or both cameras. See my analysis below.

Each camera has a 16 mm fixed focal length lens (24 mm full-frame equivalent) with manual settings of 1/80 s, f/4, ISO 1600. The inter-axial spacing between the cameras was increased from the rig's 86 mm minimum to about 120 mm for a better hyper-stereo effect, given how far the subjects were from the camera. Photos were shot in raw, post-processed in Lightroom, and aligned for 3D stereo with Stereo Photo Maker.
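For reference, a common guideline for sizing the stereo base is the "1/30 rule" (a general rule of thumb, not necessarily how I chose my spacing): the lens separation should be roughly 1/30 of the distance to the nearest subject.

```java
// The "1/30 rule" of thumb for stereo photography: the inter-axial base
// should be about 1/30 of the distance to the nearest subject in the frame.
public class StereoBase {
    public static double ruleOfThirty(double nearestSubjectMm) {
        return nearestSubjectMm / 30.0;
    }
}
```

By this rule, a 120 mm base corresponds to a nearest subject about 3.6 m away, which roughly matches runway shooting distances and explains why widening the rig beyond its 86 mm minimum improved the depth effect.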

The 3D camera rig was attached to a very lightweight tripod, and at times I was able to lift it high above all the official event photographers in front of me to keep the foreground clutter out of the shot. Here the remote Bluetooth/WiFi combination was useful. In retrospect, I should have used a long shutter cable or PocketWizard remote triggers.

With the 24 mm equivalent lenses I was able to capture the entire view of the models, stage, and audience. The show floodlights lit the scene well from my vantage point at the end of the runway, so a flash was not necessary.

The public show space had a large crowd (I'm guessing 300+) watching in a multi-retail-store business location (with its own WiFi networks), and everyone had their phone out taking pictures, emailing photos, or whatever. So I think there was a lot of interference from WiFi and Bluetooth signals. I thought the cameras and phone were close enough to override any interference, but I misjudged the outcome based on my home tests.

It seems my Sony Z1s phone's WiFi hot-spot defaults to channel 1 in the 2.4 GHz band; I should have changed this channel setting. Going back the next day with a phone WiFi analyzer, I saw at least 6 strong WiFi signals on channel 1 at my shooting location, plus many more strong signals on other channels. I would recommend a strong-signal portable router, if one is available, instead of using a phone.

I also discovered a major problem: the NX500 is slow to respond to shutter requests when triggered over WiFi, because a delay occurs while the camera saves files to the SD card, during which it does not immediately process network shutter trigger requests. This problem does not occur with wired remote triggers. With fast-moving models it was not possible to consistently capture the best photo moment. For rapid shooting with twin cameras I would recommend a wired remote shutter release instead of a WiFi trigger.

WiFi control of multiple cameras can have advantages in situations where you can take your time shooting, but not at public fashion shows, where wired remote shutter releases should work more consistently. Unfortunately, the NX500 also cannot take a lot of shots rapidly, a camera feature needed for this kind of event.

I enjoyed writing the code for the Remote Capture app and learned some new techniques, so all was not lost; it was worth the effort. Maybe you will find the app useful for your photo projects. Please let me know how it works out for you, thanks!

Combine Android GUI with Processing Sketches

In the past, when I wrote an Android app using the open-source Processing-Android SDK, I could not mix Android GUI elements with a Processing sketch on the same display screen. I had to rely on special-purpose GUI libraries written by volunteers to add more elaborate graphical elements to an app.

During the past year, the open-source Processing-Android development team volunteers modified the internal structure of Processing-Android to use Android Fragments instead of Activity classes, thereby allowing Android GUI elements to be combined with Processing sketches within a single screen display in an app.

Searching the Internet, I could not find any examples of code that combined the two GUIs. So I forked the Processing Android Demo code to use as a base for some new examples I wrote that combine Processing sketches with Android GUI elements. Use this link to see the Android Studio project code:

For Processing coders it's a little more complicated, because you will have to use the Android Studio SDK instead of the Processing SDK to combine the code development libraries.

Twin NX500 Camera 3D Stereo Rig Using Remote Capture

Here are photos of my landscape-configured twin Samsung NX500 camera 3D stereo rig mounted on a tripod. (I also have a portrait-configured rig, described in a blog post here.) The left-eye camera is mounted upside down to get the smallest possible camera separation distance. The rig has an inter-axial lens separation of 86 mm with the camera strap lugs left on; cutting them off would have allowed 82 mm.

The cameras are mounted on a 3D-printed support I designed in a Z-bar format and had manufactured in rigid plastic. A large painter's sandpaper handle holds the support mount for comfort. The whole rig weighs about 3 lbs with the 16 mm fixed lenses and UV protection filters.

The Sony Z1S smart-phone runs a Remote Capture app (Android) that sends broadcast shutter control commands over the phone's WiFi hot-spot local network to the firmware-modified NX500 cameras.

The pictured Bluetooth remote shutter device sends key presses for focus and shutter commands to the app, which then sends the WiFi broadcast commands to the cameras. To my knowledge, it is not possible to use the Bluetooth controller to control the NX500 directly without a firmware modification. If a direct Bluetooth shutter remote were possible, you would need two controllers wired together for 3D, or one for each camera you want to trigger.
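To illustrate the broadcast idea, here is a small Java sketch of sending one UDP datagram that reaches every camera on the hot-spot subnet at once. The command string, port, and protocol here are invented for illustration; the real Remote Capture app and the NX500 firmware mod use their own protocol:

```java
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;
import java.nio.charset.StandardCharsets;

// Hypothetical broadcast shutter trigger: one UDP packet sent to the
// limited-broadcast address is delivered to every listener on the local
// network, so all cameras receive the command at (nearly) the same time.
public class ShutterBroadcast {
    public static byte[] encode(String command) {
        return command.getBytes(StandardCharsets.UTF_8);
    }

    public static void send(String command, int port) throws Exception {
        try (DatagramSocket sock = new DatagramSocket()) {
            sock.setBroadcast(true);
            byte[] data = encode(command);
            DatagramPacket pkt = new DatagramPacket(
                data, data.length,
                InetAddress.getByName("255.255.255.255"), port);
            sock.send(pkt);  // every listening camera gets this one packet
        }
    }
}
```

The appeal of broadcast over per-camera connections is that the trigger does not have to loop over the cameras one at a time, so a third or fourth camera can be added without adding trigger latency.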

Without the Bluetooth shutter release controller, I tap the phone screen to control the camera operations with more features available (like video).

With this setup I do not need to carry an external portable WiFi DHCP router with me; the phone does it all using its WiFi hot-spot!

The Remote Capture app can also control the twin cameras in video mode to make 3D videos. More cameras can be added to the WiFi network for multiple simultaneous views of a scene, and additional Android smart-phone cameras can join the network using the Open Camera Remote app.

I have not measured the NX500's shutter synchronization time over WiFi. It is acceptable for non-flash shoots, but not as good as a wired shutter/focus remote. Of course, both cameras have to be locked in focus to get the best synchronization; this is the reason for the focus hold feature in the Remote Capture app. I find synchronization works best in the NX500's manual mode, not auto mode. A lot depends on subject movement and camera synchronization to capture the best quality 3D stereo photo.