VR Development Software
#TODO - link out to this page from Home, under "VR Development Software"... can be included under the main subheader, in the short summary describing "VR Development Software"
Things to Compare
Martin, optional second deliverable: results of comparison of Hello World implemented in Blender and Unity [table of numbers]
Jacob, results of comparison of Hello World (Labyrinth) implemented in Unity and MinVR [table of numbers and narrative]
Charles, results of generating a full VR experience.
Comparison of Labyrinth Project in Unity and MinVR (Jacob)
See the chart and lists below for detailed breakdowns.
If you're a beginner, don't use MinVR. Unity is a fairly powerful tool that is well-documented and easily ported to any HMD. Any programmer should be able to make their own programs in it in a matter of hours, with no installation headaches and few or no unexpected bugs. Any question is probably answered on Stack Exchange or Unity's own forums. Unity is perfect for a fun weekend project.
BUT, if you want to program for the YURT, or are a hardcore graphics programmer, you should use MinVR. MinVR lets you port your application to ANY VR platform, from Cardboard to Vive to the YURT, with fairly little effort. Installation can be difficult for some people, but it is constantly getting better. People who are familiar with graphics programming won't have an issue writing code for it, but for someone trying to jump into game or VR development, the learning curve is extremely steep. This arguably makes it a more rewarding experience, and running a program in the YURT is extremely satisfying, but it is only for people who are serious.
I also experimented with A-Frame. Anecdotally, it is significantly easier to create a world in than either of the other two options, but it is difficult to get user input from Google Cardboard. It is the easiest to get started with, and its documentation falls somewhere between Unity's and MinVR's, so it is a very reasonable choice for people who don't want to put in the work that Unity demands. That being said, I think it can accomplish less than Unity.
Tilting the plane independently of the ball proved to be a challenge. If the plane moved a lot in one frame, it would pass through the ball, and the ball would drop below it, falling forever. To work around this, I made the ball a child of the plane, which keeps it from falling through. But this distorts the ball when the plane is tilted, and I was unable to fix the distortion.
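The fall-through behavior described above is a classic "tunneling" artifact of discrete physics steps: if an object covers more than the collider's thickness in a single step, no sampled position ever overlaps the collider. A toy sketch in plain Python (illustrative only, not Unity's actual solver):

```python
# Toy 1-D model of tunneling: with discrete time steps, a fast mover can
# skip past a thin slab because no sampled position lies inside it.
# (Illustrative only; this is not Unity's collision code.)

def crosses_plane(y_start, velocity, dt, steps, plane_y=0.0, thickness=0.1):
    """Return True if any sampled position lies inside the plane slab."""
    y = y_start
    for _ in range(steps):
        y += velocity * dt
        if plane_y - thickness <= y <= plane_y + thickness:
            return True
    return False

# Slow object: some sampled position lands inside the slab.
print(crosses_plane(y_start=1.0, velocity=-1.0, dt=0.05, steps=40))   # True

# Fast object: each step moves 2.5 units, skipping the 0.2-unit slab.
print(crosses_plane(y_start=1.0, velocity=-50.0, dt=0.05, steps=40))  # False
```

In Unity, the usual remedies are switching the ball's Rigidbody to continuous collision detection or shrinking the fixed timestep, both of which avoid the scale distortion that re-parenting introduces.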
It would be difficult to port to another, more advanced VR experience.
Building the product and uploading it to a phone to test the VR functionality is a slow process.
Interface development is tricky. Since it takes so long to upload the project to a phone, iterative design becomes quite tedious.
The built-in physics doesn't apply gravitational acceleration or drag realistically, and it's hard to override or improve these behaviors.
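For comparison, the behavior the physics bullet above asks for, gravitational acceleration plus a drag term, fits in a few lines. A minimal sketch using semi-implicit Euler integration with illustrative constants (not Unity's internals):

```python
# Semi-implicit Euler integration of gravity plus linear drag.
# Constants are illustrative, not Unity's.

G = -9.81    # m/s^2, gravitational acceleration
DRAG = 0.5   # 1/s, linear drag coefficient
DT = 0.01    # s, fixed time step

def step(y, v):
    """Advance one time step: update velocity first, then position."""
    v += (G - DRAG * v) * DT   # gravity plus drag opposing motion
    y += v * DT
    return y, v

y, v = 10.0, 0.0
for _ in range(1000):          # simulate 10 seconds
    y, v = step(y, v)

# With drag, velocity levels off near the terminal value G / DRAG = -19.62 m/s.
print(round(v, 2))
```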
While the physics is flawed, it is quick and easy to use.
Graphical interface makes iterative design quick and easy.
Online documentation is fantastic, and the community of creators provides help and (sometimes free!) content.
Drivers, etc., installed with application.
Documentation is extremely sparse, written by only two people.
Testing on the target device is limited to Brown students/faculty and requires a supercomputer account and swipe access.
Iterative design is tedious because the application has to be closed and recompiled for every minor change.
No physics package.
Manual shaders are an extreme pain.
Unexpected bugs, like a partial removal of X11 from months prior, can cause hours-long issues that are extremely difficult to trace and, because of the limited user base, have likely never been encountered before. The same limited user base makes any bug hard to resolve.
Anything made in MinVR will run on a variety of VR devices.
Greatly rewarding, as any product could be the first of its kind to run on a multi-million-dollar VR device.
Control over minutiae of program behavior (shaders, physics, etc.).
Unreal Engine 4 (Charles and Jen) #TODO
A-Frame
Good for quick and simple tasks
Extremely accessible hardware-wise—browser based, compatible with low-end and high-end headsets
Works even without headsets—tilt controls on mobile, click and drag on desktop
Easy to pick up compared to other development software—HTML/simple JS based with intuitive element creation
Quick deployment—no compilation necessary
Camera, shaders, etc. taken care of within the A-Frame tags
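The element-creation style described above is easiest to see in A-Frame's canonical hello-world scene, sketched below (the release version in the script URL is illustrative):

```html
<!-- Minimal A-Frame scene: each entity is a plain HTML tag whose
     attributes configure position, rotation, size, and color. -->
<html>
  <head>
    <script src="https://aframe.io/releases/1.5.0/aframe.min.js"></script>
  </head>
  <body>
    <a-scene>
      <a-box position="-1 0.5 -3" rotation="0 45 0" color="#4CC3D9"></a-box>
      <a-sphere position="0 1.25 -5" radius="1.25" color="#EF2D5E"></a-sphere>
      <a-plane position="0 0 -4" rotation="-90 0 0" width="4" height="4" color="#7BC8A4"></a-plane>
      <a-sky color="#ECECEC"></a-sky>
    </a-scene>
  </body>
</html>
```

The camera, lighting, and WebVR/WebXR plumbing are injected by A-Frame automatically, which is what "taken care of within the A-Frame tags" refers to.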
Documentation relatively sparse—no dedicated docs like some other development software
Limited usage past intermediate level rendering—skill ceiling lower than software such as Unity
Need physical controllers for full capability, somewhat limiting accessibility
Extremely easy to use with a variety of VR hardware without too much setup (after a little familiarization with the interface)
Quick development; good for prototyping.
Uses Python, which is a very accessible language
Physics/collisions come out of the box (only a few lines necessary to set them up)
A lot of ready-to-use models
It is hard to find examples and tutorials online
When developing for VR, it is better to have VR hardware connected to the computer during development; without it, development is more annoying.
Unity (SteamVR Plugin vs. Oculus Integration) (Brandon)
Before delving into the differences between the OVR and SteamVR plugins, it should be noted that there is much overlap between the two SDKs. Both provide various resources to programmers who import them, but documentation for each is relatively sparse, leaving much for programmers to figure out themselves.
Also worth noting the expected completion times:
Expected time to complete Throw A Ball with SteamVR: 1.5-2 hours
Expected time to complete Throw A Ball with Oculus: 30-45 minutes
Able to be used on both Oculus and Vive headsets
More tutorials - although not many, there are several informative tutorials for SteamVR that made getting started much easier.
Easier to understand - when reading through the scripts provided by the SteamVR plugin versus those provided by Oculus OVR, I had a much easier time following and breaking down the SteamVR code line by line.
Needs SteamVR to run - while this is a prerequisite for the Vive anyway, on Oculus it requires enabling third-party apps.
While this poses little to no problem on the development side, it could be an issue for developers hoping to reach the Oculus store: SteamVR isn't endorsed by Oculus, and applications not made with the Oculus SDK cannot be put on the store.
While SteamVR supports Oculus controllers, and applications can add Oculus support without too much extra effort, controller-related code is still written in terms of Vive controllers, which can be a source of confusion.
Little help for developers beyond tutorials - unlike Oculus, there is no dedicated forum for answering questions for developers using specifically the Unity SteamVR plugin.
A very annoying popup shows up every 5 minutes in Unity telling you "you did the right thing!" unless you choose to ignore it.
Oculus Integration Advantages
Can be directly run on headset (as opposed to using a third party application like Steam)
Has far more resources available compared to SteamVR - Unlike the SteamVR plugin, where you must write your own grab script, the OVR plugin already has the script for grabbing, and the player can already move around using the joysticks.
This is just the tip of the iceberg, as OVR has pre-made functionality for things such as sound integration, player avatar, etc.
Code is very well documented - Almost every single function has a summary describing its purpose
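To make "write your own grab script" concrete: the core logic is small. Below is a language-agnostic sketch in plain Python (a real Unity version would be a C# MonoBehaviour hooked to the SDK's controller events; all names here are invented for illustration):

```python
# Sketch of basic grab-and-throw logic: on grip press, attach the object
# to the hand; on release, detach it and hand over the controller's
# tracked velocity so the object can be thrown.
# (Plain Python stand-in, not actual SteamVR or OVR API code.)

class Ball:
    def __init__(self):
        self.kinematic = False            # True = physics disabled
        self.velocity = (0.0, 0.0, 0.0)

class Hand:
    def __init__(self):
        self.held = None
        self.velocity = (0.0, 0.0, 0.0)   # tracked controller velocity

    def on_grip_down(self, obj):
        obj.kinematic = True              # disable physics while held
        self.held = obj

    def on_grip_up(self):
        if self.held is not None:
            self.held.kinematic = False   # re-enable physics
            self.held.velocity = self.velocity  # inherit throw velocity
            self.held = None

hand, ball = Hand(), Ball()
hand.on_grip_down(ball)
hand.velocity = (0.0, 2.0, 5.0)   # controller moving up and forward
hand.on_grip_up()
print(ball.velocity)              # prints (0.0, 2.0, 5.0)
```

The essential trick is the last line of on_grip_up: transferring the controller's tracked velocity to the released object is what makes throwing feel natural.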
Oculus Integration Disadvantages
Only works for Oculus
As mentioned in SteamVR's advantages, the Oculus Integration code has a much higher learning curve and is therefore difficult to work with - possibly because more assets are being imported, and in order for those assets to work together smoothly, the code requires much more complexity.
In terms of tutorials, there is almost nothing on using the SDK in your own code the way there is for SteamVR.
As a final comment on the evaluation of Oculus Integration vs. the SteamVR plugin: based on personal experience with both, I would strongly recommend that programmers new to C# or Unity work with SteamVR, whether they are targeting Oculus or Vive. A programmer with a more advanced understanding of C# and Unity, however, might find OVR easier to pick up. The main trade-off is that while SteamVR is easier to understand, the many additional scripts and provided assets in the Oculus Integration give programmers a deeper understanding once they start working with the SDK.
Unity XR SDK vs Oculus Integration (OVR) (Jennifer)
Disadvantages of OVR
OVR doesn’t provide great support for beginners in VR development
The package lacks up-to-date examples (I had a hard time getting the sample scene working)
It's very script-heavy but lacks proper documentation/references for those scripts
There aren't many tutorials available online on using OVR
It's difficult to edit the features from OVR scripts, because many scripts do not have customizable options (e.g., changing the controller button of the grab script) or functions to override them.
Advantages of OVR
OVR offers many pre-made features specific to the Oculus platform. Personally, I would only consider using OVR if you depend on Oculus platform features such as Account Linking, Avatars, or Guardian integration. Even then, many of these services can be implemented without Oculus, which in many cases is preferable anyway.
Disadvantages of Unity XR
Frequent version updates cause existing solutions and references to become outdated quickly
Advantages of Unity XR
Unity XR is beginner-friendly
users don't need to write a single line of code to create a VR app
plentiful documentation and resources online (Unity itself provides a great tutorial)
Unity XR is more expandable and customizable
Unity vs. Unreal Engine (Brandon)
Between Unity and Unreal Engine, my knowledge is heavily skewed toward Unity; however, the following comparison factored greatly into my decision to switch from Unreal Engine to Unity. As a side note, another reason for my decision was a preference for C# in Unity over C++ in Unreal, as C# more closely resembles Java, a language I am familiar with. Also, although I list Unreal Engine's "Blueprints" as an advantage, I personally prefer pure code to a visual representation. My general feeling from comparing the two engines was that Unity is more developer-oriented, while Unreal is more artist-oriented.
Extremely vivid design - this was a huge plus for Unreal over Unity, as the more realistic the design looks, the more immersive the VR gets
Unreal Engine's Blueprints - scripts are represented as flowcharts in which functions and values have connections drawn between them representing program flow. This allows for a more intuitive interface, making it easier for non-developers to understand the logic behind a program.
Unreal Engine is very slow at times, especially when first starting up, and when opening the code editor
It is difficult to find proper documentation on Unreal Engine
Unity3D is extremely well documented, providing a relatively easy-to-navigate reference for all functions, methods, and data types.
More applications in VR are made using Unity
Unity's capabilities graphics-wise pale in comparison to Unreal Engine's