Diving into development
Developing for HoloLens requires a Windows OS and Visual Studio. Most holographic apps are built in Unity with C#, but you can deploy 2D Universal Windows Platform apps to the device as well. There’s a HoloLens emulator, and personal editions are available for both Unity and Visual Studio, so you can start fiddling with them for free if you feel the need to scratch the AR itch. Microsoft provides plenty of resources for getting started with HoloLens development: tutorials that show you how to implement each of the input methods, design guidelines for holographic UIs and more. Additionally, I recommend checking out MixedRealityToolkit (MRTK, formerly HoloToolkit) and the Vuforia SDK (integrated into Unity as of version 2017.2) to ease your journey into AR.
MixedRealityToolkit is a set of scripts and components you can add to a project to configure your Unity settings for HoloLens, add hand dragging to holograms and implement the gaze cursor, among other things. The Vuforia SDK is a ready-made solution for implementing image, object and text recognition. If you want to set up a Unity project for HoloLens without the MRTK, you’ll have to enable Windows Holographic support in the Unity editor yourself. It’s also recommended to set the quality level to Fastest, since the device has mobile-level performance, and to set the camera’s near clipping plane to 0.85 m to ensure the holograms won’t get too close for comfort. When developing with Unity, you first build your application in Unity, then open the generated solution in Visual Studio and deploy to the device from there.
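The camera-related settings above can also be applied from a script. This is a minimal sketch (the class name is mine, not from the MRTK) that you could attach to the main camera in a fresh HoloLens scene:

```csharp
using UnityEngine;

// Illustrative helper, assuming a manual (non-MRTK) project setup.
public class HoloLensCameraSetup : MonoBehaviour
{
    void Start()
    {
        var cam = GetComponent<Camera>();

        // Holograms are drawn over the real world, so the camera should
        // clear to transparent black instead of a skybox.
        cam.clearFlags = CameraClearFlags.SolidColor;
        cam.backgroundColor = new Color(0f, 0f, 0f, 0f);

        // Keep holograms from rendering uncomfortably close to the user.
        cam.nearClipPlane = 0.85f;
    }
}
```

The MRTK applies equivalent settings for you, which is a good reason to use it even for small experiments.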
Fiddling & Tinkering
As I use a MacBook Pro for development, I decided to get started with HoloLens development by setting up Windows 10 in Parallels Desktop. Setting up the development environment and running Unity on a virtualized Windows was surprisingly hassle-free, and even though the Unity editor wasn’t silky smooth in such a setup, it was totally usable.
After about 10 days of working at Vincit, we had our first weekly meeting on HoloLens development. We went over what I had learned so far, I showed some tutorial projects I had completed and we planned the first demo application I would develop for HoloLens. We decided the first demo should fetch data from a REST API and display it to the HoloLens user, and settled on showing the weather in a window. From there, I looked up how to display text to the user, and as this was a holographic app, I of course wanted the window to be a 3D window with the text laid neatly on it.
Getting the data from OpenWeatherMap’s API was simple enough, but parsing it was not. Unity ships its own JsonUtility for working with JSON data, but it can’t handle nested JSON objects and its main purpose is serialization. I looked into popular .NET and C# JSON libraries and tried implementing a few of them. Some seemed to work well and had no hiccups when testing the app in the Unity editor, but threw errors when I tried to deploy the app to HoloLens. I ended up finding a working library called SimpleJSON in the Unify Community Wiki.
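To give an idea of why SimpleJSON worked where JsonUtility struggled, here’s a sketch of parsing an OpenWeatherMap current-weather response with it. The field names follow OpenWeatherMap’s JSON (nested objects and an array — exactly what JsonUtility can’t traverse); the helper class is my own illustration:

```csharp
using SimpleJSON;

// Hypothetical parsing helper using the SimpleJSON library
// from the Unify Community Wiki.
public static class WeatherParser
{
    public static string Describe(string json)
    {
        JSONNode root = JSON.Parse(json);

        // "weather" is an array of objects, "main" a nested object —
        // SimpleJSON lets you index straight into both.
        string description = root["weather"][0]["description"];
        double tempKelvin = root["main"]["temp"].AsDouble;

        return string.Format("{0}, {1:0.0} °C", description, tempKelvin - 273.15);
    }
}
```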
With the parsing problem out of the way, it was time for another weekly progress meeting. So far I had managed to make a rough prototype of the app: a plain cube showing London’s weather forecast for the day. The app wasn’t going to blow anyone’s mind as it was, so we set goals to make it more engaging and interactive.
I started by making the cube/window more pleasing to the eye with a nice material from the MRTK, then added buttons next to the window to let the user get the weather for different cities. The buttons needed animations to clearly indicate their state to the user, and the click action needed a sound. It’s little things like this that make the experience interactive. To add to the engagement, I later gave the window simple animations, like a spinning sun and moving clouds, to reflect the weather.
The screenshot didn’t capture the colors very well and overall the visibility of the hologram is better on the device
On top of the button click action, I made the window hand-draggable, with an icon to indicate when the window is being moved, and added voice commands for changing the city and bringing the window in front of the user.
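Voice commands like these can be wired up with Unity’s built-in KeywordRecognizer (from UnityEngine.Windows.Speech), which is what the HoloLens tutorials use. A minimal sketch — the phrases and the fetch callback are illustrative, not the actual commands from my app:

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.Windows.Speech;

// Illustrative voice-command component; attach to the weather window.
public class WeatherVoiceCommands : MonoBehaviour
{
    KeywordRecognizer recognizer;
    readonly Dictionary<string, System.Action> commands =
        new Dictionary<string, System.Action>();

    void Start()
    {
        commands.Add("Show London", () => { /* fetch and display London's weather */ });
        commands.Add("Come here", () =>
        {
            // Bring the window two meters in front of the user's gaze.
            var cam = Camera.main.transform;
            transform.position = cam.position + cam.forward * 2f;
        });

        recognizer = new KeywordRecognizer(new List<string>(commands.Keys).ToArray());
        recognizer.OnPhraseRecognized += args => commands[args.text]();
        recognizer.Start();
    }
}
```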
I fiddled a bit with the Vuforia SDK as well, trying its text and image recognition tools with the weather window app. For the text recognition, I printed BERLIN in half-a-cat-sized letters on an A4 sheet and wanted the app to fetch Berlin’s weather when it recognized the word. The text recognition didn’t work as well as I would’ve liked, however, so I moved on to image recognition. Inspired by Vincit’s then-recent visit to Berlin, I printed a picture of the Brandenburg Gate, and sure enough, when I held it in front of the HoloLens I heard the familiar chime of an image being recognized (I had added the image to the app’s image database beforehand).
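Reacting to a recognized image in Vuforia means implementing its ITrackableEventHandler interface on the image target. A sketch under the Vuforia-of-2017 API — the class name and the chime/weather calls are placeholders:

```csharp
using UnityEngine;
using Vuforia;

// Illustrative handler; attach to the image target GameObject.
public class BrandenburgGateHandler : MonoBehaviour, ITrackableEventHandler
{
    void Start()
    {
        // Subscribe to tracking state changes for this image target.
        GetComponent<TrackableBehaviour>().RegisterTrackableEventHandler(this);
    }

    public void OnTrackableStateChanged(TrackableBehaviour.Status previousStatus,
                                        TrackableBehaviour.Status newStatus)
    {
        if (newStatus == TrackableBehaviour.Status.DETECTED ||
            newStatus == TrackableBehaviour.Status.TRACKED)
        {
            // Image recognized: play the chime and fetch Berlin's weather here.
        }
    }
}
```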
Towards the end of the summer I got temporary reinforcements in the form of vincitizen Ossi from Turku and together we worked on a killer app that was sure to make us rich. “What if you could use the HoloLens to count cards while playing blackjack?” was the core idea for our project. We hacked away, looking up probability tables and analysing different card counting methods. When the time came to implement the card recognition feature, we started out with Vuforia based on my earlier positive experiences with it. We soon found out that the playing cards we used didn’t have distinct enough features to make for a good experience with Vuforia’s image recognition on HoloLens. Both of us got busy with other projects soon after, and sadly the project has been untouched ever since.
All in all, the Vuforia SDK left me with a good impression thanks to its ease of use and I recommend trying it out with HoloLens for image recognition capabilities.
Comfort and UX
Like I mentioned in the User experiences section of the previous post, it can take a while for your eyes to adjust to the holograms. You can reduce the discomfort by following the guidelines: place your holograms between 1.8 m and 5 m from the user, and maintain a constant 60 fps.
Another tip for improving the UX of your app is to avoid static UI elements. Having the holograms behave in a way the user would expect, in a “natural” way, can make for a more comfortable experience. For example, if you move a hologram, it’s better to animate the transition from point A to point B, than to make it suddenly disappear in one place and appear in the next.
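The move-instead-of-teleport idea can be sketched as a simple coroutine; the duration and the SmoothStep easing are my own arbitrary choices:

```csharp
using System.Collections;
using UnityEngine;

// Illustrative component: moves a hologram smoothly instead of
// snapping it from point A to point B.
public class SmoothMover : MonoBehaviour
{
    public IEnumerator MoveTo(Vector3 target, float duration = 0.5f)
    {
        Vector3 start = transform.position;
        for (float t = 0f; t < 1f; t += Time.deltaTime / duration)
        {
            // SmoothStep eases in and out, which reads as more "natural"
            // than a linear slide or an instant jump.
            transform.position = Vector3.Lerp(start, target, Mathf.SmoothStep(0f, 1f, t));
            yield return null;
        }
        transform.position = target;
    }
}
```

You would kick this off with StartCoroutine(MoveTo(newPosition)) wherever the hologram needs to relocate.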
Overall, if you’re planning on developing 3D apps for HoloLens, you should definitely take a look at Microsoft’s guidelines on hologram comfort and best practices.
When deploying the app to the device from Visual Studio, make sure you do so with the Release configuration. Building and deploying a Debug build can make your app take a big performance hit. I had a case where I got an error when deploying the Release build, but found I was able to deploy the Debug build just fine. Deploying the Debug build, however, dropped the framerate from 60 fps to a crawl; I switched back to Release and was once again able to deploy from there with no problems.
Live streaming from the device is possible, though it caps the HoloLens framerate at 30 fps, making the experience less smooth. Microsoft has provided two ways of streaming so far. One is through the browser in the HoloLens Device Portal, but in my experience it has 4–6 s of lag. The other is through the Microsoft HoloLens desktop app, where the lag is noticeably shorter at around 0.5–1.5 s, and you can choose between a higher-quality or a faster stream. Even with the high-quality stream, the resolution and framerate are far from great, but it gets the job done.
I should also mention that I had problems connecting to the HoloLens quite often with the desktop app.
As HoloLens’ availability was expanded to more countries in Europe (including Finland) a little while ago and MixedRealityToolkit’s latest version works with Unity 2017.2, now’s a great time to start working on those AR apps.