Music for TrinityVR

Having previously acquired an M.Mus in Production of Popular Music, I thought it would be good to contribute some original music to the app. This not only demonstrates the breadth of skills in the project, but also serves a practical purpose. The photospheres are stationary, and so is their ambience; the videospheres, however, involve Adrian walking with the camera, which changes the audio ambience as he goes. Unfortunately, the microphone built into the camera is quite low-fidelity, picking up wind noise and rumble from footsteps far too easily. Music could replace this ambience in the videospheres instead. In addition, Ying found the ambience recorded in the Museum Building and Long Room to be inadequate. We had already intended to have the Chapel constantly playing music recorded there, and it was decided that this concept would suit all indoor areas, making them feel a little more distinct from the Trinity exterior.

All pieces of music were created and processed in Avid Pro Tools 12, except for Captain O’Kane, which was created in Logic Pro X, due to its more appropriate virtual instrument selection.

Trinity Dawn

My initial idea with music was to create sounds in the style of the ambient music genre. This genre was pioneered by Brian Eno (in works such as Ambient 1: Music for Airports (1978)), and places an emphasis on creating atmosphere through sound over traditional music structures like rhythm and melody. I wanted to sculpt a piece that was simple and unobtrusive, which felt both light and comforting. I call the finished work Trinity Dawn, as it seemed to naturally coalesce with Adrian’s 360º photography of Trinity, taken at a rather beautiful dawn. The music features long, deep and swelling tones, accompanied by a slow harp melody from which evolve light, crystalline sounds.

Ancient Stones

The harp became an important idea to me in thinking about music and sound for this project. The harp is not only a quintessentially Irish symbol, but also a symbol of Trinity. Something about the gentle lull of the harp also evokes an older, more ancient world. For this reason, I composed Ancient Stones. The song is driven mostly by a rhythmic harp and a simple melody with a universally traditional sound. Originally, the outdoor videos were to alternate between this song and Trinity Dawn. However, when it became more appropriate to have music in the indoor areas, I moved this song there and named it accordingly, inspired by the Museum Building's old marble and limestone, as well as its ancient fossils and relics.

Captain O’Kane

Ying conducted research on Irish harp music and found some very nice pieces. In particular, she transcribed the melody of Captain O’Kane, a traditional harp piece by Turlough O’Carolan. However, the virtual instruments available in Cubase left much to be desired. I found a full score, including the rhythmic and harmonic accompaniment, and transcribed it in Logic Pro X, which comes with some very good quality sampled instruments. I included some extra voices, an Irish tin whistle and a hammered dulcimer, which added to the traditional sound, and wrote a string accompaniment to enhance the second repetition of the tune. The striking, poignant piece came to reside in the Long Room, as well as in the introduction video Hailee created.


Chapel Music

There were a few different ideas for the music in the Chapel, but unfortunately time was against developing a particularly interesting feature out of it. However, my sister-in-law is a member of one of the Trinity choirs, and she offered me several tracks of her choir that had been recorded in the Chapel itself. I processed these in Pro Tools, making sure to enhance the lush natural reverberation that the Chapel possesses. I created a script which iterates through these songs in a list, at the user's discretion. The songs, which are mostly religious in nature, include Faire Is The Heaven, Bring Us O Lord God, Agnus Dei (Lamb of God), Ubi Caritas and Pastime in Good Company.
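The script itself is very simple; a minimal sketch of the idea (the class and field names here are illustrative, not taken from the actual script) could look like this:

```csharp
using UnityEngine;

// Illustrative sketch: cycles through a list of choir recordings on demand.
public class ChapelMusic : MonoBehaviour
{
    public AudioClip[] songs;        // the choir tracks, assigned in the Inspector
    public AudioSource audioSource;  // the AudioSource in the Chapel sphere

    private int current = 0;

    // Called by a UI marker when the user asks for the next song.
    public void PlayNext()
    {
        audioSource.Stop();
        audioSource.clip = songs[current];
        audioSource.Play();
        current = (current + 1) % songs.Length;  // wrap around to the first track
    }
}
```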


Recording the Narration

Hey there, Enda here!

For the narration, I arranged to record a family friend, Alan Condell. Alan and his wife Marie have been neighbours of my parents in Monkstown for over twenty years now. My mum recommended Alan to me, as he has a very articulate voice, full of dulcet tones. Alan generously donated his time and voice to this project.

I offered to record Alan at his home, and brought with me a variety of sound recording equipment along with my laptop. The two most vital pieces were the microphone and the preamp.

The microphone I used is the Shure SM7B. This is a dynamic mic whose original version dates from the 1970s, and it has been popular in the recording industry and in radio ever since. The mic has a reputation as an excellent vocal mic, especially for men, though its applications are flexible. It is frequently used by radio presenters as it gives them a warm, full-bodied sound. I thought this mic would be ideal, being one of the best in my collection, in addition to its pleasant tone.

[Image: The Shure SM7B]

The preamp I used is the Golden Age Project Pre-73 (hereafter the GAP73). A preamp provides gain to the microphone signal, in order to bring its amplitude up to a recording level. Not all preamps are the same, however: they may be prized for their transparency, or for how they colour a sound. The GAP73 is a modern take on the vintage Neve 1073 preamp. Its sound is fairly coloured, creating an extra-warm tone while still keeping excellent definition. I find its characterful sound really complements the SM7B.

[Image: the front plate of the GAP73]

Hailee had prepared the script from Jill's research, and I had proofread it. There were around 60 points in all, each lasting from 20 seconds to a minute or more, though most erred on the shorter side. Thankfully recording with Alan went perfectly, as he is an excellent speaker, and the process took less than two hours from setup to completion (though much of that time was spent having a good chat with Alan!).

I recorded, edited and mixed the audio in Avid Pro Tools 12, my DAW of choice. Editing involved topping and tailing each audio clip, as well as cutting out breaths and mouth noise. For mixing, I used a variety of plugins to enhance the audio and give it a professional quality:

RBass: this plugin is especially useful on spoken-word vocals, particularly when they have to stand on their own. It generates low-frequency harmonics from the source, giving it a deeper and more solid sound.


FabFilter Pro-Q: the Pro-Q is my go-to EQ plugin, as it's very versatile yet easy to use. Using this I cut any super-low or subharmonic frequencies (below 40Hz) and also dipped the low mids around 250Hz. This kept the low end under control and made the low mids a bit clearer.


Slate Digital VMR: the Virtual Mix Rack by Slate hosts a variety of plugin modules in one plugin slot. In this case, I used the Virtual Console Collection mix buss module, which emulates the sound of the summing amplifier on the mix buss of a vintage console (the model I chose is a Neve). This imparts a subtle, very pleasing sonic change, lightly adding some harmonics and giving the low and high ends a nice finish. I then used Revival, which added some further crispness and clarity to the top and some fullness to the low mids.


FabFilter Pro-C: again from FabFilter, my go-to compression plugin, especially when smooth and transparent compression is required. This plugin levelled off Alan's voice, giving it a more consistent amplitude: lowering phrases or syllables that were too loud while simultaneously raising ones that were too quiet.


FabFilter Pro-L: this limiter plugin was used to provide a consistent loudness across all the audio clips. Like the Pro-C, it has excellent transparency. In this case, I brought the clips up to a peak value of -6dBFS, which still leaves headroom for any other audio sources playing simultaneously.


It’s all well and good to write about audio, but it can only truly be understood with our ears! Unfortunately, this blog cannot host audio, so you’ll just have to listen to Alan’s lovely voice on the app itself!

 

Fade In / Fade Out

Hey there, Enda here!

It might seem like a little thing, and this will be a short entry, but the topic of this blog post is definitely something I found made the app feel ‘real’ and professional. Transitions between events in a virtual space need to be smooth: when things pop in and out of existence, it can be quite jarring for the user. Instead, transitions often use fades to make the change gradual rather than instant.

Fading is a crucial part of the design of this app, as it affects the aesthetic ‘feel’ of so many objects. For example, fades are used in the following situations:

  • Fading photographs in and out, in time with their narration
  • Fading Blender stills in and out, in time with their narration
  • Fading the whole screen from black when the scene starts
  • Fading the whole screen to black when the scene ends
  • Fading in the loading screen and its animation

Fading is possible through linear interpolation (commonly known as a ‘lerp’). Lerping is a mathematical function that traces a line between two points, creating a smooth change between the values.
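In its simplest form, a lerp between two values a and b by a factor t (where t runs from 0 to 1) works like this:

```csharp
// Linear interpolation: t = 0 returns a, t = 1 returns b, values in between lie on the line.
float Lerp(float a, float b, float t)
{
    return a + (b - a) * t;   // e.g. Lerp(0f, 10f, 0.25f) == 2.5f
}
```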

 

[Diagram: linear interpolation between two points]

Unity has built-in lerping functions. The main one is Mathf.Lerp, but there is also Color.Lerp, specifically for colour changes, which is what our fade script uses.

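In simplified form, the fade looks something like this sketch (not the script verbatim; the variable names approximate the original):

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.UI;

// Illustrative sketch: fades a full-screen UI Image from clear to black over time.
public class ScreenFade : MonoBehaviour
{
    public Image fadeImage;      // full-screen overlay image
    public float fadeSpeed = 1f; // multiplier applied to Time.deltaTime

    public IEnumerator FadeToBlack()
    {
        float progress = 0f;
        while (progress < 1f)
        {
            progress += fadeSpeed * Time.deltaTime;
            // Color.clear is RGBA(0,0,0,0); Color.black is RGBA(0,0,0,1)
            fadeImage.color = Color.Lerp(Color.clear, Color.black, progress);
            yield return null; // wait for the next frame
        }
    }
}
```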

In this case, the code loops in a while loop, changing the colour from clear (RGBA(0,0,0,0)) to black (RGBA(0,0,0,1)) by the value of ‘progress’, which is a multiplier times Time.deltaTime. Time.deltaTime is the amount of time in seconds the last frame took to complete; our app runs at approximately 60FPS.

These fades create a smoothness that makes the app a more pleasant experience. They also allow me to bring up full-screen overlays (such as the loading screen below), which can hide scene changes that would otherwise be awkward in VR.

[Image: the TrinityVR loading screen]

Bugs, horrible bugs! (Troubleshooting)

Hey there, Enda here.

So I’ve definitely fallen off the regular schedule of blogging, since the project has escalated and been incredibly busy! However, I plan to address a few different topics over the next few blog posts and catch up.

This post will talk about bugs and how to troubleshoot them; this project has had more than a few, which led to some very despairing situations. Bugs are a daily reality in programming; if you’re lucky, when you find a bug, it’s something small and you immediately recognise what has gone wrong (most likely because you put it there in the first place!). However, on this project I have dealt with two ‘major’ bugs, which were proving catastrophic for the project.

Bug 1: Easy Movie Textures

As mentioned in previous posts, I’ve been using a third-party plugin, Easy Movie Textures, in order to create film textures on mobile devices. This is because Unity still does not natively support such a feature, although it really should! As a third-party plugin, its behaviour is not entirely predictable, and it certainly lacks the robust testing that Unity’s native features go through. However, it took some sleuthing to even pinpoint the bug as an Easy Movie Texture issue.

The problem manifested in early builds: UI buttons that moved the camera would stop working. It appeared in builds only, and everything behaved as expected in the editor. My first thought was that this was a UI issue, as we had already encountered UI problems with Unity and the GoogleVR SDK. The bug initially seemed to be random, but further examination revealed that it only happened after a video transition had been completed; this was the first clue that it was something to do with the Easy Movie Texture asset. As it transpired, I was able to establish that the variable containing the current state of the movie (ready, playing, stopped, ended) did not behave in builds as it did in the editor. This meant that the ‘if’ statement used to detect when the movie was over was constantly being triggered, which in turn constantly called a function that moved the camera to a particular set of coordinates, forcing the camera to become stuck in one place. The next UI marker would move the camera when clicked, but only for a single frame, as it was then forced back to its previous location by the still-running script.
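For illustration, a typical guard against an end-of-video check firing every frame looks something like the sketch below. The state check here is a placeholder, not the plugin's actual API, and this is not necessarily the fix we shipped:

```csharp
using UnityEngine;

// Illustrative sketch only: 'VideoHasEnded()' stands in for however the plugin reports state.
public class VideoEndWatcher : MonoBehaviour
{
    public Transform cameraRig;
    public Vector3 nextSphere;    // destination photosphere coordinates
    private bool hasMovedCamera;  // guard so the move happens exactly once

    void Update()
    {
        if (VideoHasEnded() && !hasMovedCamera)
        {
            cameraRig.position = nextSphere;
            hasMovedCamera = true; // without this flag the branch fires every frame,
                                   // pinning the camera in place (the bug we hit)
        }
    }

    bool VideoHasEnded()
    {
        return false; // placeholder; a real check would query the movie texture's state
    }
}
```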

The issue was reported to Unity, who recreated the problem but could not identify why it happened. Thankfully, I was able to discover a solution myself in the end. This bug was an important lesson for me: it taught me that behaviour in the Unity Editor and behaviour in a build will not necessarily match 1:1. As a result, I became more cautious about, and more understanding of, the differences between the two environments.

 

Bug 2: Crashing & Memory Leaks

This bug induced more than a bit of panic! As builds got bigger and bigger, we noticed that they would crash more frequently. Initially this was attributed to whatever the user was doing at the time, but eventually we noticed the app crashed regardless of what the user was doing; the crashes seemed inevitable. By using the debug tools in Xcode while the phone was hooked up, I was able to find some shocking numbers.

 

[Images: Xcode memory warning, and memory use in VR mode after nine minutes]

Unfortunately, our app was eating up the phone’s memory. I discovered that this is called a ‘memory leak’: memory is continually being allocated but never released. When memory use hits a critical point, the app is killed so that the phone can preserve its basic functions.

I did a lot of testing with various kinds of builds, to see how memory behaved and to try to identify the source of the leak. I discovered two interesting things: when VR mode was disabled, there was no leak, and when no Unity UI elements were present (in VR mode), there was also no leak. Instead, memory use looked as one would expect:

[Image: memory use with VR mode disabled, after three minutes]

It was clear something was wrong with Google VR’s VR mode specifically, in combination with Unity’s UI. This led me to do a lot of querying online and with various people who might have had answers. Blissfully, I eventually stumbled across a closed Google VR thread: as it turns out, the issue lies with Unity, in any version newer than 5.3.1f. That version came out in February, so we had been using an incompatible version since the start of the project! Thankfully, this meant that rolling the project back to an older version of Unity saved it.

Despite being a very stressful event, the memory leak bug gave me a better understanding of Unity, Google VR, programming and memory management – especially for mobiles, which have so little memory to spare.

Building Trinity in Unity

Hey there, Enda here! We’ve come a long way with this project over the last few weeks, and we thankfully now have a full prototype app. It includes all of the locations (except the Chapel, which we have yet to gain access to) and videos (currently forward movement only), as well as the interaction markers with placeholder narration and images. The 360º photography and video are currently also only placeholders, but they have been valuable for building a good understanding of the medium, especially the considerations to keep in mind when shooting video. While this will all be expanded on in the coming weeks, it is good to have a tangible product and to be working on the master copy of the project rather than throwaway prototypes.

The project is built from several different scenes, which are akin to levels in video games. A scene loads all of its content when it is accessed, and the content of the previous scene is unloaded; this is very important for app optimisation. As seen below, I have broken TrinityVR up into several scenes, each containing at least two photospheres and the appropriate videos, for a total of 7 main scenes and 14 main spheres.
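Loading a scene from a script is essentially a one-liner with Unity's SceneManager; a minimal sketch of the kind of loader used for this (the field name is illustrative):

```csharp
using UnityEngine;
using UnityEngine.SceneManagement;

// Illustrative sketch: load another scene by name; the previous scene's content is unloaded.
public class LoadSceneExample : MonoBehaviour
{
    public string sceneName; // set per marker in the Inspector

    public void Load()
    {
        SceneManager.LoadScene(sceneName);
    }
}
```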

[Image: a list of all the scenes we are using in our app]

Each scene contains all of its required objects. Only a certain few variables are static, which means they maintain their values across all scenes. Examples are the boolean for VR mode, as well as booleans which track where the user has been and is coming from, in order to situate them correctly: leaving the Long Room should leave you in Fellows Square, for instance, but the default initial location for that scene is outside the Berkeley. There are also some objects that are used in every scene, but it is simpler to recreate them than to transfer them between scenes. These include the camera, the home panel, the event system (which controls gaze input and touch) and the system sounds (such as UI sounds). Below is an example of the kind of objects in a scene and how their hierarchy is organised; some objects are simply empties, used to help organise and group other objects together.
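A static variable survives scene loads because it belongs to the class itself rather than to any object in a scene. A minimal sketch of the pattern (the names here are illustrative, not the project's actual variables):

```csharp
// Illustrative sketch: static fields persist across scene loads because they belong
// to the class itself, not to any GameObject in a particular scene.
public static class AppState
{
    public static bool vrMode;           // whether the user chose VR mode at startup
    public static bool cameFromLongRoom; // used to pick the right spawn point in the next scene
}
```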

[Image: the scene hierarchy. Some top-level objects, such as 09museumBuilding01, group all other objects within them and keep the scene well organised.]

The meat of each scene is its spheres, which contain either a 360º photo or a video. The photospheres also feature all the interaction markers, both for movement to other areas and for tour interaction. Moving within an area causes the camera to jump to the video sphere (labelled VUP for a forward-moving video, VDN for a return video). The video plays when entered and moves the camera on to the appropriate photosphere when it has finished playing. The structure of this can be seen in the sample scene from the Museum Building below.

[Image: the Museum Building scene, demonstrating how the camera can move between its two areas using quite simple mechanics]

Being able to explore the app in a rough form has given us a lot of relief as well as allowed us to generate new and exciting ideas for it, now that the essential backbone has been laid.

Until next time,

Enda

Scripting in C#

Hi, Enda here. I thought for this week’s entry I would talk about scripting for Unity in C#, as well as go through some of the scripts that I’ve been required to make for this project.

C# is a general-purpose, object-oriented programming language, best known for its use in Microsoft’s .NET framework. It is also one of the two languages in which one can write custom scripts in Unity, the other being JavaScript (sometimes known as UnityScript in the Unity community). In terms of syntax, I have found that C# is very similar to Java (much like Processing, which we used earlier in the year and which is built upon Java). As such, I find C# a very clean and readable language to work with.

C# and Java are similar languages that are typed statically, strongly, and manifestly. Both are object-oriented, class-based, and designed with semi-interpretation or runtime just-in-time compilation, both use garbage-collection, and both are curly brace languages, like C and C++.(https://en.wikipedia.org/wiki/Comparison_of_C_Sharp_and_Java)

In Unity, C# is written by default in the accompanying software, MonoDevelop. This thankfully provides clever autocompletion, as well as the ability to peer into objects and see their functions, making writing and research much more streamlined.

[Image: a typical view of Unity’s MonoDevelop]

An important concept in C# (as well as other languages, such as Java) is access modifiers. Access modifiers control how a variable or method can be accessed: its scope and its protection. In Unity, the access levels I’ve used so far are listed below (see the short sketch after the screenshot):

  • Public: can be accessed from anywhere, including by other scripts. Importantly, public variables appear in the Inspector, which means information unique to that instance of the script can be changed there.
  • Private: only accessible from within the class in which it is declared. Its scope is limited to the containing code block, which can be a class, a function or an if-statement, among others; a variable declared between curly braces can only be seen and read within that block or its nested blocks.
  • Protected: similar to private, but can also be accessed from derived classes as well as the class in which it is declared.
  • Static: not strictly an access level, but a modifier that can be combined with the above. A static variable lives with the class itself rather than with individual instances, so it is a single entity shared by all instances of the class.
[Image: the Unity Inspector view of a script that includes three public variables]

Two other important ideas are coroutines and InvokeRepeating. Because Unity runs in real time, a function is often needed that repeats over time, i.e. every frame. It is possible to handle situations like this by adding code to the Update function, which executes on a frame-by-frame basis. However, it is often more convenient to use a coroutine for this kind of task, as it is more easily controlled. A coroutine is like a function that has the ability to pause execution and return control to Unity, but then continue where it left off on the following frame. A coroutine can be started and stopped by other functions. InvokeRepeating is similar, as it:

Invokes the method methodName in time seconds, then repeatedly every repeatRate seconds.

However, it cannot be controlled as easily as a coroutine and is best used for a function that should repeat but does not need to be stopped.
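A small sketch contrasting the two approaches (the timings and method names are arbitrary):

```csharp
using System.Collections;
using UnityEngine;

public class RepeatingExamples : MonoBehaviour
{
    void Start()
    {
        // Coroutine: can pause itself with yield and be stopped with StopCoroutine.
        StartCoroutine(PulseEveryTwoSeconds());

        // InvokeRepeating: calls PlayAmbience after 5 seconds, then every 30 seconds.
        // Simpler, but harder to control once it is running.
        InvokeRepeating("PlayAmbience", 5f, 30f);
    }

    IEnumerator PulseEveryTwoSeconds()
    {
        while (true)
        {
            Debug.Log("pulse");
            yield return new WaitForSeconds(2f);
        }
    }

    void PlayAmbience()
    {
        Debug.Log("ambience");
    }
}
```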

Most of the coding I’ve done has been around manipulating different variables of Unity objects, changing them based on different criteria. Here is a brief overview of the scripts I’ve written so far:

  1. audioFadeOut.cs – a coroutine, triggered by other scripts, that smoothly fades out audio over a time set by a public variable, instead of stopping the audio abruptly, which can cause an audible click (see the sketch below the list).
  2. audioMarker.cs – used to play audio attached to UI markers. It checks to see if the audio is already playing; if it is, it calls audioFadeOut. It also checks to see if any other audio is playing first (by checking for a particular tag) and fades that audio, in order to prevent multiple audio markers being played at the same time.
  3. audioRandomPlay.cs – to be used with environmental sounds. It uses InvokeRepeating to play audio which starts and repeats within a random time range.
  4. audioRandomPlayGvr.cs – as before, but specifically for GvrAudioSource (Google VR’s 3D sound source) rather than Unity’s native AudioSource.
  5. audioTrigger.cs – detects collision between the camera and destination (such as a photo or video sphere) and starts playback on collision. Fades out sound after collision has ceased. Useful for starting environmental sounds and music.
  6. gvrHeadRot.cs – used to extract the current X and Y rotation of the camera – then used in homeIcon.cs.
  7. homeIcon.cs – uses the camera’s rotation to update the rotation of the home panel, so that its icon is always facing forward. Uses linear interpolation to move the panel smoothly, organically lagging behind the camera rotation.
  8. loadScene.cs – loads a new scene in Unity when triggered.
  9. navTrigger.cs – moves both the camera and the home panel to a new set of coordinates when triggered. Used to transition between spheres.
  10. videoTrigger.cs – similar to audioTrigger, it detects collision and then plays back the video of the video sphere. When the video has ended it is reset and the camera and home panel are also moved to a new set of coordinates (to a photosphere).
  11. videoTriggerToScene.cs – as above, but instead changes scene when the video ends, rather than moving the camera coordinates.
  12. vrChangeCamera.cs – changes the camera view mode between VR and full screen view.
  13. vrTrueFalse.cs – stores the public static variable of the VR mode being on or off, so that it applies across all scenes, as well as the functions to change that variable.
[Image: the first screen of the app, where the user chooses whether to use VR mode, utilising both vrChangeCamera.cs and vrTrueFalse.cs]

While there may need to be adjustments made to the scripts in the future, for the time being, between these scripts and Unity’s native functions, we have what we need technically for this project.

Until next time,

Enda

Unity UI & Video

This week I spent my time refining the UI elements of our project in Unity. While advancements have been made, they are certainly still subject to change, especially following some user testing.

 

UI Marker Design

Our goal has always been to make the UI markers fairly unobtrusive, giving the user an uncluttered space in which to view their virtual environment. I’ve added an extra ‘echo’ animation to the markers when they are not highlighted, in order to draw further attention to them while still having them take up as little space as possible. The video below shows a marker in its ‘normal’ phase, then highlighted and ‘clicked’.

 

Home Panel UI Design

The ‘home’ panel serves two purposes: it acts as a quick way for the user to get back to the main gallery (still in development) and select a new scene, and it acts as a bottom plate to obscure the camera rig, which cannot be removed from the shot (at least in video). The panel tracks the user’s head movement so that it always faces forward, meaning the user can easily find the home function at any time. I’ve also included a VR mode button, which the user can toggle (shown in the videos for clarity). However, swapping the mode seems to cause the app to crash occasionally, so it remains to be seen whether the stability of that function can be improved.
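The tracking itself is essentially the lerp-based lag described for homeIcon.cs in the scripting post: each frame, the panel's yaw is eased toward the camera's yaw. A minimal sketch (names illustrative):

```csharp
using UnityEngine;

// Illustrative sketch: rotate the home panel toward the camera's current yaw,
// lerping so it lags slightly behind head movement rather than snapping.
public class FollowHeadYaw : MonoBehaviour
{
    public Transform cameraTransform; // the VR camera
    public float followSpeed = 2f;

    void Update()
    {
        float targetYaw = cameraTransform.eulerAngles.y;
        float currentYaw = transform.eulerAngles.y;
        float newYaw = Mathf.LerpAngle(currentYaw, targetYaw, followSpeed * Time.deltaTime);
        transform.rotation = Quaternion.Euler(0f, newYaw, 0f);
    }
}
```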

 

Area Travel

Finally, and most importantly, we have implemented 360º video in the app! This was only made possible by purchasing the Unity asset Easy Movie Textures. Thankfully, the asset works very easily on iOS (Android is still untested). Creating the travel transition successfully required some additional custom scripting: the video is loaded but does not start until the user’s view has moved to its sphere, and once the video has ended the camera is moved again to a new photosphere. For the illusion to be complete, the photography must match the start and end of the video very closely. While the user’s head movement helps mask the change, it may be necessary to add a brief camera blur effect or fade.

 

That’s all for this week! For the next two weeks I think most development in Unity will be making a strong prototype, bringing together all current techniques, for the mid-point presentation.

Enda