The Intro Video

Early in the project, we realized that much of the information we wanted to convey throughout the tour could be tied to buildings, making its placement intuitive and simple. But some information, mainly contextual and concerning wider history, didn’t really fit anywhere along the route. This gave rise to the idea of a timeline, where a synopsis of Trinity’s 400+ years of history could live.

In June, we wrestled with how this timeline would fit within the larger app. Would the user walk into it in Front Square? Would it live over the portals? Would it be its own sphere separate from the tour itself? Ultimately, we decided we wanted something inclusive, brief, and easy to interact with, so we chose to make a video.

That work fell to me. Alongside the rest of the POI scripts, I wrote a script for the intro video. The script kept with the timeline theme, basically walking the viewer through the highlights of Trinity and Irish history. After having Enda and Adrian check the script for “Americanisms,” I downloaded After Effects and planned on taking full advantage of the 7-day trial. With everything else going on in the app, it was 3 days in before I actually got started. That didn’t leave much time, so I based most of my work on a timeline template, thinking this would be more efficient. Looking back now, I realize I barely used any of it (of course).

These animated infographics help to illustrate changes in Trinity over time

Before starting this part of the project, I had forgotten how annoying After Effects’ render times can be. I wasn’t able to make any precomps because I was still waiting on the final narration (and therefore the final timing), so the RAM playback was painfully slow. The timeline used a combination of historical images, nested videos, animated infographics, and Blender renders.

 

The latter proved to be the biggest headache of all. I wanted to illustrate how Front Square once looked surrounded by red-brick buildings, so I used a model of the Rubrics building I’d made in SketchUp. Unfortunately, SketchUp exports do not play well with Blender, and each of the models contained well over 10,000 objects. This slowed my computer considerably, and placing each of them was very trying. But it worked out, eventually. I used Adrian’s 360 photo of Front Square as the environment background and placed the models within the world convincingly enough, with a centrally located camera spinning as four of the models rose from the ground in a quadrangle.

Blender render illustrating the recreation of the red-brick quadrangle in Front Square

I also animated a lot of graphics in the video. I keyframed each of them by hand to coincide with the narration, illustrating things like the divide between Catholics and Protestants, Britain’s imperial standing, and the kinds of educational offerings the school provided at first. All of this information could have been conveyed through narration alone, but I think engaging, animated visuals are more apt to keep the viewer entertained and paying attention.
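Mechanically, keyframing graphics to narration boils down to evaluating a property between timed keyframes. Here is a toy linear version in Python (After Effects also offers eased interpolation, and the times and opacity values below are invented for illustration, not taken from our project):

```python
def evaluate(keyframes, t):
    """Linearly interpolate a property between (time, value) keyframes.

    `keyframes` is a list of (time_seconds, value) pairs sorted by time.
    Before the first keyframe the first value holds; after the last, the last.
    """
    if t <= keyframes[0][0]:
        return keyframes[0][1]
    for (t0, v0), (t1, v1) in zip(keyframes, keyframes[1:]):
        if t <= t1:
            # Fraction of the way between the surrounding keyframes
            return v0 + (v1 - v0) * (t - t0) / (t1 - t0)
    return keyframes[-1][1]

# Fade an infographic in over a second as the narrator introduces it,
# hold it, then fade it back out:
opacity = [(4.0, 0.0), (5.0, 100.0), (9.0, 100.0), (10.0, 0.0)]
```

Hand-syncing in AE is the same idea: you drop keyframes at the narration's timestamps and let the interpolation fill in the frames between.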

Intro video in Front Square

Blender Models and Animations

It’s been a busy few weeks. With the groundwork laid and the memory leak in Unity fixed, it was time to focus on ways to flesh out the app. I’ve been focusing on creating Blender models to add context to the narrations and more multimedia to the scenes.

This sign will swing on the campanile to illustrate a story marker there

The Blender models and animations will be triggered by markers and will coincide with narration, adding a little more to the tour than just listening and looking. Some are stills that fade in and out, while others are animated within Blender. Because animating and texturing in Unity is beyond me, I wanted to know whether it would be possible to texture and animate the models in Blender, render them out, and fit them into the photospheres. This was complicated, though, because our Unity world is only an illusion of 3D. Trying to fit both 2D stills and 3D animations within a 3D-looking world that’s actually 2D made my brain hurt, and the meshes I made often had to be altered to force a perspective that fit the world. But through some complex finagling and a harsh lesson in camera perspective, Enda and I figured out a way to fit both within the spheres. Because it’s easy to animate location and opacity within Unity, many of my planned animations became stills that were animated as sprites within Unity to swing or fade in and out.

This Blender animation will be ‘green screened’ into the scene at Library Square

The remaining animation of the crumbling gravestones in Library Square uses the Easy Movie Textures “Chromakey” plug-in to “sit” within the scene. This one was especially challenging, as I wanted to include the shadows within the scene, but needed to render the plane image with transparency to better fit the world. Manually matching the slope of the square and the camera perspective may have turned me off Blender for the rest of my life.
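For the curious, the core of a chroma key fits in a few lines. This is a generic green-dominance version, not the Easy Movie Textures shader itself (whose math I haven't seen), but it shows the idea and why green-tinged pixels like rendered shadows can get keyed out along with the backdrop:

```python
import numpy as np

def chroma_key_alpha(rgb, threshold=1.15):
    """Return an alpha mask: 0 where green clearly dominates (keyed out),
    1 elsewhere.

    rgb: float array of shape (H, W, 3) with values in [0, 1].
    threshold: how much the green channel must exceed the other two.
    """
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    green_dominant = (g > r * threshold) & (g > b * threshold)
    return np.where(green_dominant, 0.0, 1.0)
```

A pure-green backdrop pixel is removed and a neutral gray shadow survives, but a shadow with a green cast can cross the threshold and vanish too, which is exactly the problem described with the gravestone shadows below.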

 

The complete list of Blender meshes:

  • Rubrics building
    • Made to illustrate how Front Square originally looked with red-brick buildings
    • Copied from a building schematic via SketchUp
    • Individual sections textured by hand
  • Graveyard animation
    • Crumbling gravestones illustrate burial place of monks from All Hallows in Library Square
    • Basic Celtic Cross mesh downloaded from Blendswap but then manually overhauled in order to use Cell Fracture on the mesh
    • Rigid Body physics animations in Blender, manually keyframed to control the speed and direction of the crumbling
    • All textures downloaded from cgtextures.com and then UV-mapped to individual sections of the mesh; keyframed cracks added by hand
    • Originally, I had this animation composited to render only the gravestones and their shadows, but leaving out the plane on which they sit. Unfortunately, once I green-screened this in FCP and then Enda applied the Chromakey plugin, the shadows disappeared as well (too much green in them, apparently). Instead, I textured out the green in the shadows by adding a dirt ground made by photoshopping an alpha-enabled PNG onto the plane on which the gravestones sat.
    • The use of Chromakey means that when the animation fades in it has a bit of a green tinge, but I’m not complaining at this point.
  • “For Sale” sign
    • Will swing on campanile in Library Square to accompany story of race up the structure
    • Mesh made by hand, manually “splintered” using Boolean modifier
    • UV-wrapped texture including Photoshopped “For Sale” font
  • Picture frames in Museum Building
    • Accompany story of one-sided portraits along the staircase
    • Meshes made and textured by hand
    • Individually UV-wrapped historical portraits of soldiers sourced from derjudenfrage.deviantart.com, tpc.dodlive.mil, http://www.dailymail.co.uk
  • Plants on top of Arts Block
    I based the design for the Arts Block plants on images like this
    • Illustrate the original design of the Arts Block, which was based on the Hanging Gardens of Babylon
    • Plants and trees appear on different roofs of the Arts Block
    • Grass made using particles in Blender
    • Originally I was going to animate this using a force-field physics animator, but the Blender scene had nearly 5,000,000 vertices and was really slowing down my computer. We opted to make it a still and fade it in and out within Unity, to save time and to hopefully keep my computer from blowing up.
      This is how they turned out

  • Original Long Room ceiling
    • Illustrate what the Long Room looked like before the addition of a renovated ceiling
    • The forced perspective of this made me (and probably Enda) want to weep. Trying to get a 2D render of this to look as if it spanned the entire depth of the Long Room was very difficult
    • I ended up having to make the mesh an actual wedge that fit within the perspective from the camera point in the second Long Room scene so that it appeared to extend into the distance. It was somewhat helpful for things like this to use Adrian’s actual images as environment textures within Blender, although everything did tend to change once we brought it into Unity
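The forced-perspective trick behind the Long Room ceiling wedge is just similar triangles: an object's apparent (angular) size falls off linearly with distance from the camera, so a flat render placed close to the camera can stand in for distant geometry if it is scaled down by the ratio of the distances. A toy calculation (the 60 m and 5 m figures here are illustrative, not measurements from our scene):

```python
def forced_perspective_scale(true_distance, placed_distance):
    """Scale factor so an object placed at `placed_distance` from the camera
    subtends the same angle as the real object would at `true_distance`.

    Apparent size is proportional to size / distance, so we need
        size_placed / placed_distance == size_true / true_distance
    """
    return placed_distance / true_distance

# The far end of a ceiling that should read as 60 m away, painted onto a
# plane 5 m from the camera, must shrink to 5/60 of its true size:
scale = forced_perspective_scale(true_distance=60.0, placed_distance=5.0)
```

The wedge shape falls out of applying this per depth slice: nearby parts of the ceiling shrink less than far parts, tapering the mesh toward the "distant" end.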

Now that the models all have the proper perspective and have been rendered out, the next step is to tie them to narration markers within Unity and fade them in and out at the appropriate times.

-Hailee

 

Green Screen Fun

Vivienne had mentioned at our mid-project review that we might want to try to add a more personal touch to the project. Because of this, I wanted to try to include short anecdotes from people about different buildings or spaces on campus. Originally, we thought maybe it could just be different voices narrating their own personal stories, but I thought that was a bit too much listening on top of the narration that’s already there. I’m a big fan of adding more visuals whenever possible, so I talked with Enda about the possibility of adding people within the scene. He’s sick of my ideas by now, for sure, but he had noticed that the Easy Movie Textures plugin included a chromakey option. We tested some green-screen footage we found on YouTube and were able to make a person appear within the photosphere.

I’d used green screens in my undergrad and felt comfortable with the even lighting these shots need to work. As it happens, Enda had access to a convenient green screen studio just off campus, so Jill set to work gathering interested parties through on-campus resources like the GSU and SU. We also reached out to Trinity alums with whom we had personal ties. We had a few replies, so we drew up a short list of people whose experiences we were most interested in: a former Trinity Scholar, a Trinity singer, a Global Room tour guide, a former accommodation office employee, and an English lecturer (you might be able to pick out a few familiar faces in the footage). The green screen studio already had a pretty solid setup with a camera and lights, but after testing some footage from there, we realized that FCP7 (what I use for editing) had problems natively importing the AVCHD footage from their camera. They also had a few wonky lights, although their diffusion was really nice for faces.

Instead, we used the school’s camera and lights (but hijacked their diffusion). I used two backlights to give even coverage to the green screen behind the subject, then a key and fill for their faces. In the past I’ve worked in green screen cycs, where you can separate the subject from the screen a bit more and have more control over the shadows. This studio didn’t need to capture the subject’s feet, so it was set up more for mid to close-up interviews, and the green didn’t cover much of the floor. We remedied this by buying several sheets of large green paper from Eason’s and having each subject stand on them. Although the green didn’t match completely, it still works in the Chromakey.

Enda and I ran the shoot last Wednesday and managed to have everyone come in for 30-minute slots. Some took longer than others, and some were definitely more comfortable in front of the camera than others, but we got some good stuff in the end. I wanted the testimonials to come out naturally, so we started each interview with a conversation to get a rough idea of what each person wanted to talk about, then ironed out a script from what they said. It meant going over each story a few times, but it worked out.

Original problematic pants

Green pants now brown thanks to AE magic

After the shoot, I edited each of the testimonials in FCP. Because each subject told their stories a few times, I was able to edit the takes together into cohesive pieces. I was also able to cut out a few of the extra “umms” and pauses without too much visual hiccuping. The one hitch was that one subject wore pants that were a gray-green, and the Chromakey made them transparent. Thankfully, the subjects are mostly stationary, so I was able to use a tracking mask in After Effects to layer the footage on top of itself and then use a color change effect to turn the pants brown. It’s not perfect if you look closely, but because the people are relatively small within the scene, it’s acceptable (I hope).
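The pants fix amounts to: inside a mask that tracks the pants, find the green-leaning pixels and swap their color for a brown of matching brightness. A rough numpy sketch of that idea (After Effects' color change effect works differently under the hood; this is just the gist, and the brown target value is invented):

```python
import numpy as np

def recolor_green(rgb, mask, target=(0.45, 0.32, 0.20)):
    """Within `mask`, replace green-dominant pixels with `target` (a brown),
    scaled by each pixel's brightness so shading is preserved.

    rgb: (H, W, 3) float array in [0, 1]; mask: (H, W) bool array.
    """
    out = rgb.copy()
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    greenish = mask & (g > r) & (g > b)
    # Per-pixel brightness of the offending pixels, kept as a column so it
    # broadcasts against the 3-channel target colour.
    luma = rgb[greenish].mean(axis=-1, keepdims=True)
    out[greenish] = luma * np.asarray(target) / np.mean(target)
    return np.clip(out, 0.0, 1.0)
```

Restricting the change to the tracked mask is what keeps the (intentionally green) backdrop untouched while the pants shift hue.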

I adjusted the aspect ratio to be vertical to cut out the unnecessary footage and ensure the entire background was green. Then I passed the clips off to Enda to edit the sound. He returned the edited WAVs to me, and I pasted them in and added fades so that the subjects appear to fade in and out. I exported these as ProRes MOVs and then HandBraked them to MP4s so Enda could add them to the scenes. He’s currently working on having them triggered by the markers so they appear on command. So far it looks pretty realistic, minus the Ghostbusters-esque green glow that accompanies the fades. I may need to fix that in the future, but we’ll see.

One of our subjects in the sphere

-Hailee

 

 

Scripting & More…

The past two weeks have been the final stage of laying the groundwork before the project really starts coming together. In the coming two weeks we will (hopefully!) begin to see all of our hard work start to transform into something tangible (and awesome).

Despite running into a Unity bug that kept us from constructing the full test walkthrough we had wanted, we’ve still been making lots of progress. I referred to Jill’s research to write the scripts for all the POIs throughout the tour and for the intro video we’ll have in the initial photosphere. I then rewrote these scripts twice based on Enda’s scratch recordings of them and on feedback we received during the midterm presentation. I organized the scripts into a spreadsheet that includes the location of each POI and the kinds of multimedia (image, video, animation) that will accompany each.

Also in the last few weeks as the skeleton of the project has formed, we’ve been thinking of some ways to really make the tour stand out, and to make it as engaging as possible. Because of this, I want to construct animations in Blender for some of the POIs that will play alongside the narrations. Only some of the POIs lend themselves to this, but I’m thinking crumbling tombstones in Library Square where the monks were buried, a red-brick quadrangle in Front Square to show what the area once looked like, and putting a “For Sale” sign on the campanile with the story of the race up the structure, for example. I think it would add a sense of history “coming to life.” We’ll see how it fits into the schedule, but I’m set to begin this weekend.

Also, based on Vivienne’s comment at our mid-project review, I wanted to find a way to add a bit of a personal touch to the project. Originally I had thought perhaps adding quotes of anecdotes from previous students would work, but then I decided we could be a bit more ambitious. I have worked with green screens in the past, and wanted to see whether we could record people actually giving testimonials and then stick them into the photospheres to tell their stories. Enda had seen that the Easy Movie Textures plugin had a Chromakey shader, and after a few tests with green screen footage ripped from YouTube, he was able to make the person look like they were really in the scene. Even better, his dad’s work has a full green screen studio, equipped with a camera and lights. We’re going to test it out ourselves on Tuesday, with the hope of getting people in and recorded by the end of the week. I’m excited to get back into a studio and do some shooting!

The final add-on idea is a companion website, which we had discussed at the beginning of the summer. I wanted the website to be a bit more than a marketing landing page, so after speaking with Enda we decided we should also have the ability for people to submit questions that hadn’t been answered in the tour. These questions would be added to a database, where we could potentially have an expert answer them. This would serve the dual purpose of allowing users to engage with Trinity beyond the initial tour, and showcase our programming and web development skills. Jill and Ying have happily agreed to take on this challenge.

Speaking of shooting, the steadicam (Stayblcam) finally arrived from the States. Adrian was able to do some test shooting with it last week, and while it’s not as smooth as a dolly, it definitely helps with some of the up-and-down motion we were having problems with initially. Thanks to Jill and her hard work securing filming permissions, he’s set to do the final shooting this week. Combining this along with the ambient sound that Ying has been designing should really help bring the project to the next level.

Designs and Content

This week we were able to test the route exteriors with the camera, which gave us a foundation for analyzing potential problems with the project. We found that we may need a stabilizer for the video footage, as the bouncing of unstabilized walking is a little nauseating. We also found that the number of videos and photos caused the app to crash, so we’ll need to find an alternative way of loading the assets within Unity.
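The crash came from holding every photo and video in memory at once. A common alternative, sketched here generically in Python since our actual Unity loading code isn't shown, is to load assets on demand and evict the least recently used ones once a memory budget is hit:

```python
from collections import OrderedDict

class AssetCache:
    """Load assets lazily and keep only the most recently used few in memory.

    `loader` is any function mapping an asset name to its decoded data;
    `capacity` bounds how many decoded assets are held at once.
    """
    def __init__(self, loader, capacity=3):
        self.loader = loader
        self.capacity = capacity
        self._cache = OrderedDict()

    def get(self, name):
        if name in self._cache:
            self._cache.move_to_end(name)          # mark as recently used
        else:
            self._cache[name] = self.loader(name)  # load on first request
            if len(self._cache) > self.capacity:
                self._cache.popitem(last=False)    # evict least recently used
        return self._cache[name]
```

In Unity terms the same shape would mean loading each sphere's media when the user approaches it and unloading the spheres behind them, rather than instantiating everything at startup.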

App icon design

I continued to work in Photoshop to test out some app icon and portal designs. The icon is looking pretty good so far, but I’m still wrestling with the portals. I think they should gel well enough with the app design, but I haven’t figured out how to make them portal-looking without being cheesy. It’s a lot to imagine without any context, though we’re talking about playing with animations, so that if the user gazes at a portal, it moves or becomes colored. We want something clean-looking and easily understood, but I think we may have to wait until we have a full test run-through of the route before nailing down the design.

Portal possibility

Finally, I worked with Jill to begin writing the content for the project. She’s gathered a lot of the historical facts about Trinity and its place within Irish history, so now it’s time to start turning those facts into narrations. We’ve decided to scrap the timeline idea in favor of an intro video that succinctly introduces Trinity in the cultural and historical context of Ireland, but there’s so much information that paring it down while keeping it interesting may be a challenge. I’m also looking forward to the possibility of using After Effects to add visual effects to the POIs and the intro video to make the experience more immersive.

-Hailee

Week 3 – Organization

This week had a heavy design focus. We experimented with different UI designs and sounds for POIs and navigation. We discussed design possibilities for the gallery and decided we wanted a clean, modern-looking design for the “portals” that jump between the different exhibits. We also looked ahead to things we’ll need to figure out in the near future, including collecting archived photos and creating a repository of sounds. I put together a list of everything we’ll need to focus on in the coming weeks.

We’ll soon need to make some decisions about when to use narration vs. music, and the possibilities for using videos as transitions. We’re looking forward to receiving the camera so we can start testing the footage and seeing what’s possible as far as stringing together photos and videos.

Hailee

Week Two – Understanding Unity & Mapping the Tour

After presenting our idea to Mads on Monday, we created a shopping list of equipment we’ll need for the project. The first, of course, is the 360° camera. Though we were originally looking at the LG 360, additional research showed that the Ricoh Theta S has far better picture quality and is better suited to our needs. We also requested the Easy Movie Textures plugin for Unity, which allows 360° videos to be played on Android and iOS, as this is not yet a native capability (we’re really on the cutting edge here!). Finally, we requested a stabilizer rig for shooting the 360° video, though the technology is so new that there are few rigs on the market for it. Many are made for GoPros or smartphones but not for a 360° scope, so the hardware is often very apparent in test footage. The LUUV stabilizer rig bills itself as “the first 360° video stabilizer,” but as a Kickstarter project it’s not yet in production. Though it lists a July 4 ship date, we have concerns about whether it will meet the timeframe we need for this project. We’ll have to test the camera when it comes in, see what kind of stabilization the video needs, and decide from there.

Unity and Tech

In the meantime, we’re continuing to focus our energies on testing the technology, generating content, and thinking about user interaction. This week, Enda set up a tech demo on his phone to show how users are able to interact with navigation icons that animate on selection and then transport the user to a new scene. He’ll elaborate a bit more on the tech piece himself as it’s over my head, but I do know it looks cool!

UI and Design

Testing light source interactions

These scripting requirements are heavily influenced by our choices in design and user interaction, however, so we’re really trying to nail those down. For example, how do we show users that a certain spot will trigger an interaction? As this is VR, there are no established UI standards yet. We played around with having a building or object within the scene “glow” if it would trigger an interaction, but felt that limited our options too much. We experimented with using light sources within Unity to highlight interaction points, but it didn’t seem intuitive enough, and we really wanted clear interactions that didn’t require much explanation. Good UI is innately understood UI, and that led us to the conclusion that we should use what people are already familiar with. Cue Google-inspired icons and material design à la the Easter Rising tour. We want to be sure not to impede the views, however, so in the coming weeks we’ll be exploring animations where the icons begin as small, colored 2D dots, growing slightly and revealing a photo/video/narration icon as a user’s gaze moves towards them. We’ll have to test this within the Cardboard environment before making any final decisions.

Photography by http://adrianlangtry.com
“Designing” icon possibilities
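The gaze-reveal behavior reduces to one measurement: the angle between where the user is looking and the direction of the icon. A sketch of that logic in plain Python (our Unity scripts aren't shown here, and the 10°/25° falloff angles are illustrative, not final design values):

```python
import math

def gaze_angle(gaze, icon):
    """Angle in degrees between the gaze direction and the direction to an icon.

    Both arguments are 3D direction vectors (not necessarily unit length).
    """
    dot = sum(g * i for g, i in zip(gaze, icon))
    norm = math.sqrt(sum(g * g for g in gaze)) * math.sqrt(sum(i * i for i in icon))
    # Clamp to avoid domain errors from floating-point rounding.
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))

def icon_scale(angle_deg, inner=10.0, outer=25.0):
    """1.0 (fully grown, icon revealed) inside `inner` degrees of the gaze,
    0.0 (small dot) beyond `outer`, linear in between."""
    if angle_deg <= inner:
        return 1.0
    if angle_deg >= outer:
        return 0.0
    return (outer - angle_deg) / (outer - inner)
```

Running `icon_scale(gaze_angle(...))` every frame and feeding the result to the dot's size gives the grow-as-you-look behavior without any tap or click.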

Content and Research

A huge part of this project will be establishing the exhibit routes, researching the POIs, and developing scripts for narration. After Jill and I (Hailee) went on a walking tour of campus to try to establish the stopping points and exhibit routes, we realized it was a bit more complex than we’d originally thought. Visually, the angles didn’t line up as we’d hoped, especially for exteriors like the Museum Building and Library Square. We’d also need to think about how to handle information that isn’t directly tied to objects within scenes–for example, how do we relay the founding of the school, or general historical overviews of certain areas? Perhaps we’ll put those in narrations during the video transitions, or set up a “hear more” interaction within the scenes.

Placeholder image of Front Square

After taking everyone on a walkthrough of the route we had determined, Adrian and I took placeholder 360 stills of the stops we had all agreed upon using the Google Street View app. While these images are nowhere near perfect (with all the tourists around, lots of people wound up stitched together with the wrong legs or faces!), they did give us the ability to print out each of the scenes and start compiling information for each using a sophisticated system of Sellotape and Post-it notes. Not only does this help us manage the multitude of information on the history of Trinity, but covering the walls in images and content really gives our little “office” some personality and focus!

Photography by http://adrianlangtry.com
Old-school content management system

Hailee

Week 1 – Establishing the Concept

One week down! We spent much of this first week fleshing out our project concept by determining team members’ interests and backgrounds, familiarizing ourselves with software, and laying out potential workflows.

Between Enda and Jill’s background in sound design, Adrian’s experience with photography, my (Hailee’s) knowledge of videography and project management, and Enda and Ying’s desire to get involved in the technical side of production, we have a very well-rounded team. We wanted to capitalize on the team’s broad skill set while constructing a project that’s consumer-oriented and practical, yet also cutting-edge.

Still-video-still navigation

Working mainly from Adrian’s initial project pitch and his design team’s 360° photography website, we came up with the idea for a VR tour of Trinity using a combination of 360° photos and videos, innovative sound design, and interactions that gave users the feeling of truly “being there.” We would use the Unity game engine to “stitch” these media together into different areas of Trinity called “exhibits.” Within these exhibits, users would navigate from still image to still image by way of 360° videos that provide the illusion of movement.
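The still-video-still structure is essentially a graph: nodes are 360° stills, and each directed edge carries the transition video that plays while "walking" between them. A minimal Python sketch of that data structure (the scene and file names are made up for illustration; the real implementation lives in Unity):

```python
class Tour:
    """Graph of 360° stills connected by 360° transition videos."""
    def __init__(self):
        self.edges = {}  # (from_still, to_still) -> transition video file

    def connect(self, a, b, video):
        self.edges[(a, b)] = video

    def walk(self, a, b):
        """Return the playback sequence for moving from still `a` to still `b`."""
        if (a, b) not in self.edges:
            raise ValueError(f"no transition from {a} to {b}")
        return [("still", a), ("video", self.edges[(a, b)]), ("still", b)]

tour = Tour()
tour.connect("front_arch", "front_square", "arch_to_square.mp4")
```

Keeping edges directional matters here: walking Front Arch to Front Square needs different footage than the reverse, since a 360° video still has a direction of travel.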

At each still image, they would be able to interact with points of interest which would provide information about the history, culture, and stories of the areas they’re experiencing.

Possible exhibits include:

Museum Building
  • Front Arch
  • Front Square
  • Library Square
  • The Museum Building
  • The Long Room
  • The Chapel
  • The Exam Hall
  • The Dining Hall

This technology has natural applications to the €5.1 billion-per-year Irish tourism industry. With this in mind, we determined that this product could be used by organizations like Fáilte Ireland or TCD itself to market the college, Dublin, or Ireland as a whole to potential tourists who either a) can’t afford to visit from overseas at the time, or b) are deciding between spending their holidays in Ireland or another country. (We also think the tour has potential uses for prospective students, but decided for the purposes of this project to keep the audience more focused.) As such, we want the tour to be a realistic “teaser” of what TCD, Dublin, and Ireland as a whole have to offer.

Project statement: “An interactive, immersive, informative 3D virtual reality app that allows tourists to realistically experience the best parts of Trinity from the comfort of their homes”

 

After several meetings, we determined the extent of the technology we will be using:

  • Unity 5 with binaural audio recording to create an immersive experience
  • Google Cardboard and Google VR SDK for Unity

    Screen Shot 2016-06-22 at 11.35.30 AM
    360 still image captured using Google Street View app (not as useful as actual 360 camera)
  • 360° photography and videography (using Photoshop, FCP, and possibly Adobe After Effects)
  • Possibility: a 3D modeled “narrator” who can interact with the user and relay stories about certain stops (will depend on the amount of time we have left once the more integral parts of the project are completed).
  • Content should include a variety of media, including narrated scripts, photos, videos, and audio recordings

The many facets of the project and our limited timeframe mean that we will need to tackle multiple aspects at once. While we’re waiting on the 360° camera from the school, Enda and Ying will focus on the technical side of working within Unity and familiarizing themselves with C#, while Adrian, Jill, and I work on the content and visuals of the tour, including research and visual framing. We took a campus tour on Friday to begin mapping out how the photos and videos will be stitched together and to determine possibilities for points of interest (POIs) we might offer at each stop. Because the field of VR UI is so new, there is no real template for what works and what doesn’t, so we will be making these decisions as a team.

With so many development tracks occurring at the same time, we set up a free account with the project management software Taiga.io, which allows the entire team to keep each other up to date on development as it occurs, as well as to track the broad ideas and specific tasks that spring up over time. We’ve got some great ideas, and we don’t want anything to fall by the wayside!

Hailee