Write Google Cardboard VR Apps with Processing





Processing is a programming language, library, and environment for visual artists, designers, and programmers. The Android app shown here, written with Processing, demonstrates my work merging the Processing for Android library with the Google Cardboard SDK. Now you can write Cardboard VR apps using Processing.

This open source code can be found at: https://github.com/ajavamind/Processing-Cardboard   

The example code builds the demonstration app above, which displays a stereo photo cube in front of a stereo photo background. In a Cardboard viewer, the user can change the viewing angle and size of the cube with head movement. A screen tap brings the cube back to its original viewing location. Tilting the viewer left or right changes the cube's size. Moving a finger on the display rotates the cube for a different view.




Processing Sketches Using Android Studio


Processing, the programming language, library, and development environment for artists and visual designers, can generate Android apps. This feature lets you see your visual programming creations on an Android phone or tablet. If you have some experience with Android Studio, you might want to use it instead of Processing's development environment to develop your Android Processing code, gaining access to a debugger, a powerful source editor, and more.

I posted an example Processing Android sketch project on GitHub as a starting point you can use for writing Processing code in Android Studio.

https://github.com/ajavamind/ProcessingAndroidSketch

Eclipse has been used as an alternative to the Processing development environment, and a tutorial at https://processing.org/tutorials/eclipse/ describes how to do it. There is also a web page of processing-android information with a paragraph on using Eclipse for Processing development at:
https://github.com/processing/processing-android/wiki   Here I take it a step further and show how I use Android Studio as an alternate development environment for Processing.

First I created a new blank Activity project in Android Studio, choosing minimum target API 16 (Android 4.1). Next I downloaded a zip file of the Processing-Android libraries from https://github.com/processing/processing-android

From the unzipped archive, processing-android-master.zip, I copied the core source code in the "processing" folder at K:\downloads\processing.org\processing-android-master\core\src to my Android project at C:\Users\Andy\Documents\projects\android\ProcessingSketch\app\src\main\java. I did this instead of building a separate jar file for Processing-Android so I could study, better understand, and tinker with the internal workings of Processing.

Processing-Android keeps some data resource files (OpenGL shaders) alongside its Java source files, and Android cannot load resources from the Java source tree. The solution is to create an "assets" folder under your project's "main" folder, add the subfolders "processing/opengl" beneath it, and move the "processing/opengl/shaders" folder with its contents into the "opengl" folder you created. I then modified all the paths to these shader resources in the PGraphicsOpenGL.java file by prefixing the path strings with "/assets/". Finally, synchronize the project so it sees the folders you changed or added.
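From a Unix-style shell, the folder moves described above look roughly like this (the paths assume the standard Android Studio project layout; adjust them to match your own project):

```shell
# Where the shaders live in the copied Processing source tree,
# and where Android expects loadable resources to go.
SRC=app/src/main/java/processing/opengl/shaders
DST=app/src/main/assets/processing/opengl

# Create the assets path, then move the shader folder out of the Java sources.
mkdir -p "$DST"
if [ -d "$SRC" ]; then
  mv "$SRC" "$DST"/
fi
```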

For the kinds of sketches I might want to experiment with, I modified the AndroidManifest.xml file by adding the following permissions and features:

<uses-permission android:name="android.permission.NFC" />
<uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE" />
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
<uses-permission android:name="android.permission.BLUETOOTH"/>
<uses-permission android:name="com.sonymobile.permission.SYSTEM_UI_VISIBILITY_EXTENSIONS"/>
<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.VIBRATE" />
<uses-feature android:glEsVersion="0x00020000" android:required="true" />
<uses-sdk android:minSdkVersion="16" android:targetSdkVersion="19"/>


In the MainActivity.java file I modified MainActivity to extend PApplet as follows:

public class MainActivity extends PApplet {
    private static String TAG = "MainActivity";

Add the import for the PApplet class if Android Studio did not add it automatically.

In MainActivity.java's onCreate(), I also removed the content view setting by commenting out the line:

  //setContentView(R.layout.activity_main); 

This change prevents overwriting Processing's content view; otherwise you will not see Processing's visuals. I also made other minor updates, such as making onCreate() public.

Now in the MainActivity.java file I added the Processing settings(), setup(), and draw() functions. You can only call size() and fullScreen() in settings(), not in setup(). I used size(1920, 1080, OPENGL) to match my phone's display and also called fullScreen().

All these changes become clear in the example I posted on GitHub. The example code draws lines on the screen as you move your finger, which Processing treats as the mouse.
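Putting the pieces together, the modified MainActivity looks roughly like this. This is a sketch of the idea rather than the exact GitHub code; method names follow the Processing API, and the settings() method applies to Processing versions that support it:

```java
import android.os.Bundle;
import processing.core.PApplet;

public class MainActivity extends PApplet {
    private static String TAG = "MainActivity";

    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        // setContentView(R.layout.activity_main); // removed so Processing's view is used
    }

    public void settings() {
        // size() and fullScreen() belong here, not in setup()
        size(1920, 1080, OPENGL);
        fullScreen();
    }

    public void setup() {
        background(0);
    }

    public void draw() {
        stroke(255);
        if (mousePressed) {
            // on Android, your finger acts as the mouse
            line(pmouseX, pmouseY, mouseX, mouseY);
        }
    }
}
```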

Android Studio is an advanced integrated development environment (IDE), so you get a debugger and source code tools not available in the Processing IDE. If you normally write code with the Processing IDE, you will find these very handy tools to have.

Stereo Photo Cube App for Cardboard Viewer


Here's a screen display from an Android app I wrote with the Processing language/library framework. It displays a photo cube. When viewed with a Google Cardboard VR viewer (used as a stereoscope), the cube appears in 3D, and since one of the photos is a left-and-right stereo pair, that photo also appears in 3D within the cube. The cube can be rotated with the arrow keys on a Bluetooth keyboard connected to the phone, or by dragging your finger across the screen.

You can try this yourself with code I posted on GitHub at:

The stereo display framework code is based on work done at:

I simplified the Stereo.java library code to use only Processing P3D/OPENGL library calls. I did this because I could not get the CreativeCodingLab implementation to build as written against the OpenGL ES libraries it required. It turned out that Processing-Android's P3D/OPENGL has everything I need, for this project at least.
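For anyone curious what the P3D stereo rendering boils down to: the core of an off-axis stereo camera is computing an asymmetric frustum for each eye. The helper below is a generic sketch of that math (after Paul Bourke's well-known formulation), not necessarily what Stereo.java does:

```java
public class StereoMath {
    /**
     * Frustum edges {left, right, bottom, top} for one eye of an
     * off-axis stereo pair. eye is -1 for the left eye, +1 for the right;
     * convergence is the distance to the zero-parallax plane.
     */
    static double[] stereoFrustum(double fovY, double aspect, double near,
                                  double convergence, double eyeSep, int eye) {
        double top = near * Math.tan(fovY / 2.0);
        double halfW = aspect * top;                      // half-width of the near plane
        double shift = 0.5 * eyeSep * near / convergence; // horizontal frustum offset
        // Each eye's frustum is shifted toward the center line.
        double left  = -halfW - eye * shift;
        double right =  halfW - eye * shift;
        return new double[] { left, right, -top, top };
    }
}
```

In Processing terms, you would offset the camera sideways by half the eye separation for each eye and then apply frustum(left, right, bottom, top, near, far) while drawing that eye's half of the screen.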

Built with Processing 2.2.1 in Android mode (target 4.0.4), running on a Sony Z1S phone with Android 5.0.2.


Using Google Cardboard as a Stereoscope

Homemade Cardboard Viewer, rear view phone mount area

As a device for viewing 3D photos, the Google Cardboard VR (Virtual Reality) viewer has its limitations. The lenses do not have a wide enough field of view to completely see a side-by-side (parallel) 3D photo that fills a phone's entire display. I think the design intention was to create an immersive experience in which the Cardboard wearer moves to view the entire image as it shifts in response to head movement. Cardboard is foremost a virtual reality viewer, not a stereoscope.

Berezin Stereo Photography Products sells a rectangular lens (product 3791) with a better field of view for viewing parallel 3D photos on phones. Having used this lens with my phone, I found viewing 3D photos much better than with the Cardboard lenses. The lens has very little barrel distortion, so software corrections are not needed.

Berezin 3DVu lens, before being hacksawed in half
My next step was to mount the Berezin 3D viewer lens in a Cardboard viewer. I downloaded plans for making my own Cardboard viewer and altered the design to fit the lenses. This required rectangular holes for the lenses and an increased lens-to-phone distance for normal-sighted persons.

The Berezin 3D viewer lenses come in a plastic frame that does not provide enough lens separation for Cardboard. I solved this by cutting the frame in half with a hacksaw and mounting each frame half, with its lens, in my Cardboard. I did not try to remove the lenses from the plastic frame for fear of destroying them. Conveniently, the plastic frame provided a surface for hot-gluing to the cardboard. I increased the distance from the lens to the phone by 38 mm, adjusting the Cardboard viewer layout plans before I cut out my custom version.

I'm very pleased with the resulting stereoscope. My 3D photos look much better when Cardboard is used as a simple stereoscope because I can see the full-screen image without moving. I also found my modified viewer better for looking at photo spheres and other VR Cardboard apps that use head movement. The trade-off of using the Berezin lenses in my Cardboard viewer is that the screen is not fully immersive for virtual reality; that is, I can see the black walls inside the viewer. But I did not find this to be a distraction.

Morris Arboretum Insect Sculptures 3D Photos

David Rogers (artist), Big Bugs Exhibit: I photographed these side by side (parallel) stereo 3D photos of insect sculptures on display at the Morris Arboretum, Philadelphia, PA, August 30, 2013.

You will need a stereoscope viewer to see in 3D.















FujiFilm W3 3D Stereo Movie Viewing With Cardboard

UPDATED with additions and corrections (2015/4/4, 2015/4/12 )

This post describes procedures I used to prepare a FujiFilm W3 camera 3D stereo movie for viewing with Google's Cardboard VR viewer on my Android Sony Xperia Z1S smart phone.

First download Stereo Movie Maker (SMM for Windows) from Muttyan's site:  http://stereo.jpn.org/eng/stvmkr/index.html
I use this program to convert and re-size a FujiFilm W3 AVI movie file to a side by side version for viewing with Cardboard.

Next I downloaded the 32-bit ffdshow from http://sourceforge.net/projects/ffdshow-tryout/ to provide Stereo Movie Maker with video codecs for reading and writing the movie file. Without it, SMM gives an error about a missing MJPEG decoder. To configure it I used information supplied by http://3dstereophoto.blogspot.com/2011/05/fujifilm-finepix-real-3d-w3.html  Make sure you run the VFW (Video for Windows) configuration program, not the video decoder configuration program, and enable the libavcodec decoder for MJPEG.

Now open your video file using SMM:


Next, align the views with Adjust > Auto Alignment, since the W3's two cameras may be slightly misaligned. Also make any other adjustments you might want, such as View > Auto Color Adjustment to fix the brightness, for example.



Save your movie in side-by-side format. I chose to re-size for the Cardboard viewer on my phone's 1920x1080 display; the SMM re-size parameters are 960x540 (each eye). If I had a phone with a larger display, like the Samsung Galaxy S6 with its 2560x1440 screen, the per-eye size would be 1280x720, and since that is nearly the resolution of W3 movies out of the camera (1196x720), little re-sizing would be needed.
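The per-eye arithmetic is simply half the display in each dimension. A hypothetical helper to illustrate (not part of SMM):

```java
public class EyeSize {
    /** Per-eye frame size for a side-by-side movie on a given display. */
    static int[] perEyeSize(int displayWidth, int displayHeight) {
        // Each eye gets half the display width; halving the height as well
        // keeps each eye's image at the display's own aspect ratio.
        return new int[] { displayWidth / 2, displayHeight / 2 };
    }
}
```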

Unfortunately my codec is not licensed for the HD 2560x1440 size. Initially I used 960x540 for the re-size. I then found I could crop to a 4:3 aspect ratio and avoid re-sizing, giving approximately 960x720 per eye. With this aspect ratio my video looked best in the Cardboard viewer.


A video compression window will appear. If you leave the video uncompressed, the file size is very large and playback can suffer. I chose the GoPro codec since I own a GoPro and have their software installed on my computer; the quality is very good if you take the time to configure it. I tried several video compression formats, but only had success with the GoPro-CineForm and Cinepak codecs. You can get a GoPro codec from here.





After saving the movie file, I transferred it to my Z1S via a USB cable. A dialog box appears recommending conversion; accept it. The transfer program converts the file to an MP4 format that the Sony phone can show. With the MAGIX Movie Edit Pro 2014 video editor I can also convert from AVI to MP4, so as not to rely on the phone transfer to do this.

On the phone I used the VLC for Android media player app for viewing the movie. https://play.google.com/store/apps/details?id=org.videolan.vlc
I also tried Cardboard Theater, but found VLC much easier to use.


Celebrating PI Day


And Einstein's birthday and any other irrational activities you can think of!