Ross Monroe
Student and Creative Technologist

Power of Trust AR

 

Project: Power of Trust AR

Client/Agency: Electric Coffin

Location: Seattle, WA

Role: Software Development


Overview:

For the 2018 Seattle Design Festival, several Seattle design agencies partnered to create Power of Trust, a sculpture that invited attendees to reflect on their personal responsibility as designers in the public space. The piece consisted of three industrial fan blades, originally intended to cool a nuclear reactor, set above a pond of water.

The blades were finished in chrome and positioned with their concave sides facing inward, toward an inlet that let attendees walk into the center of the sculpture and see their reflection from all sides. This prototype let the agencies explore refurbishing these fan blades on a small scale, using only three of the two hundred and twenty they had available.

Even though only three were in use, the agencies wanted to showcase the vast quantity of these blades by merging the physical sculpture with augmented reality, where the other two hundred and seventeen fan blades could be viewed.

I was brought on to create the augmented reality app that placed a model of all the fan blades around the physical sculpture. The Power of Trust AR app allowed visitors to view and walk among all of the fan blades together on display in Pioneer Square.

Development:

With the festival only a couple of weeks away and the augmented reality being an add-on to the full sculpture, I had to define the minimum viable product that would meet the needs of the event. The agencies' ideal conditions were:

  • Open access, so anyone could view the AR side.

  • The app continued the theme of reflection.

  • The model was positioned to surround the physical installation.

With these in mind, it was decided to use an iPad that staff would have available for those viewing the sculpture. A cross-platform solution would have taken too much time to develop, and web-based AR was still in its infancy, with limited functionality and poor stability.

It took a couple of days to do proper research on the Swift programming language as well as on designing for AR environments. Thankfully, Apple's ARKit was very easy to work with, and plenty of tutorials, documentation, and examples existed to get me started.

I broke down my approach into several stages:

  • Create a basic AR app using tutorials to display an object in AR

  • Add the ability to anchor the object by placing it with touch (the first two stages are sketched after this list)

  • Replace the test model with the actual model

  • Stabilize the app

  • Refine the physical appearance of the model
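
As a rough illustration of the first two stages, here is a minimal sketch of an ARKit/SceneKit view controller that runs plane detection and anchors a model where the user taps. The asset name blades.scn and node name blades are hypothetical placeholders, not the project's actual files.

```swift
import UIKit
import ARKit
import SceneKit

class ARViewController: UIViewController {
    let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        view.addSubview(sceneView)

        // Track the world and detect horizontal surfaces to place the model on.
        let config = ARWorldTrackingConfiguration()
        config.planeDetection = .horizontal
        sceneView.session.run(config)

        let tap = UITapGestureRecognizer(target: self, action: #selector(placeModel(_:)))
        sceneView.addGestureRecognizer(tap)
    }

    @objc func placeModel(_ gesture: UITapGestureRecognizer) {
        // Hit-test the tap against detected planes (the ARKit 1.x-era API).
        let point = gesture.location(in: sceneView)
        guard let hit = sceneView.hitTest(point, types: .existingPlaneUsingExtent).first,
              let scene = SCNScene(named: "blades.scn"), // hypothetical asset name
              let blades = scene.rootNode.childNode(withName: "blades", recursively: true)
        else { return }

        // Anchor the model at the tapped point on the plane.
        let t = hit.worldTransform.columns.3
        blades.position = SCNVector3(t.x, t.y, t.z)
        sceneView.scene.rootNode.addChildNode(blades)
    }
}
```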

The main issue in the final stage was getting the resolution and sizing of the model accurate enough that it looked appropriate but didn't crash the app. Loading well over two hundred fan blades is not the easiest of tasks for a mobile device, and on top of that, each fan blade had a reflective skin.
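
One common way to keep a scene like that manageable, assuming every blade can share a single source geometry (an assumption about the approach, not the project's actual model pipeline), is to clone one blade node so the geometry is reused rather than duplicated:

```swift
import SceneKit

// Lays out `count` copies of one blade in a grid. clone() shares the
// underlying geometry, so memory stays close to a single blade's cost.
func makeBladeField(from blade: SCNNode, count: Int, spacing: Float) -> SCNNode {
    let field = SCNNode()
    for i in 0..<count {
        let copy = blade.clone()   // shares geometry with the original
        copy.position = SCNVector3(Float(i % 15) * spacing, 0, Float(i / 15) * spacing)
        field.addChildNode(copy)
    }
    // Flattening merges the children so SceneKit can batch draw calls.
    return field.flattenedClone()
}
```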


The reflection on the fan blades was created using physically based rendering (PBR). PBR applies several layers of textures to a model that give it real-life reflective qualities. The reflection itself comes from an environment map (a 360-degree photo) added to the texture set. Before the event, I went to the installation site and captured two 180-degree panoramas, which I combined to make the environment map. This way, the blades reflected the actual place where the app would be used, making the experience even more immersive.
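
A minimal sketch of how such a chrome-like skin can be set up with SceneKit's physically based lighting model; the environment-map filename is a hypothetical placeholder for the stitched panoramas:

```swift
import SceneKit

func applyReflectiveSkin(to blade: SCNNode, in scene: SCNScene) {
    let material = SCNMaterial()
    material.lightingModel = .physicallyBased
    material.metalness.contents = 1.0   // fully metallic, like chrome
    material.roughness.contents = 0.1   // near-mirror finish
    blade.geometry?.materials = [material]

    // The environment map is what the chrome surfaces actually reflect.
    scene.lightingEnvironment.contents = "pioneer_square_env.jpg" // hypothetical name
}
```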

I also added a wonderful ARKit feature: light estimation. It takes the lighting it observes through the camera and matches the model's lighting to the same intensity, making the models appear even more realistic.
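
Enabling it is a small addition to the session setup; a minimal sketch:

```swift
import ARKit

func configureLighting(for sceneView: ARSCNView) {
    let config = ARWorldTrackingConfiguration()
    config.isLightEstimationEnabled = true          // estimate real-world light each frame
    sceneView.automaticallyUpdatesLighting = true   // apply the estimate to the scene
    sceneView.session.run(config)
}
```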

Execution:

During the Seattle Design Festival, the art installation was staffed so the AR experience could be shared with visitors and provide further explanation of the sculpture and what drove its creation. The staff would place the model so it was oriented properly, then pass the device on to visitors, letting them wander and view the digital side of the installation.

To let as many people as possible view the AR experience, the app was loaded onto multiple iPads so the staff could have several visitors viewing the experience at once.

Knowing that this weekend-long design festival had long hours, the last tweak I made to the app was to stop the AR scene once the model was loaded. The reflections would still work properly, but this allowed the devices to run the app longer by conserving battery life.
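
One plausible way to implement that pause, assuming the surface scanning is what gets stopped (the actual implementation may have differed), is to re-run the session without plane detection once the model is anchored:

```swift
import ARKit

func stopPlaneDetection(in sceneView: ARSCNView) {
    let config = ARWorldTrackingConfiguration()
    config.planeDetection = []   // stop scanning for new surfaces
    sceneView.session.run(config)
}
```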

Results:
This was one of those situations where the minimum viable product was the end product, but it met the requirements of the agencies. It functioned reliably, could be placed over the physical installation, and continued the theme of reflection.

It's estimated that a couple hundred people were able to view the experience, and most of them were so amazed by the realistic reflections that they thought the reflections were being created in real time. It was such a satisfying feeling to watch someone view the AR side of the app and hear them exclaim it was almost like magic.

Reflecting on this project, I created a short list of items that would need to be addressed to create a more mature product if it had moved into further development:

  • User interface with buttons to reset the scene and plane detection.

  • Higher poly count on the models for more realism.

  • Reduced load times.

  • Object recognition, so the app would automatically place and anchor the model when it sees the physical installation (sketched after this list).

  • Rendering only what's in view to reduce processing load and increase battery life.

  • Testing across Apple devices.

  • Distributing the app on the Apple App Store for individual availability.
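
For the object-recognition item, ARKit 2's object detection would be one route; a sketch assuming a scanned ARReferenceObject of the sculpture stored in a hypothetical resource group named "Sculpture":

```swift
import ARKit

func runWithObjectDetection(on sceneView: ARSCNView) {
    let config = ARWorldTrackingConfiguration()
    if let objects = ARReferenceObject.referenceObjects(inGroupNamed: "Sculpture",
                                                        bundle: nil) {
        config.detectionObjects = objects
    }
    sceneView.session.run(config)
    // When ARKit recognizes the sculpture, it adds an ARObjectAnchor;
    // the delegate's renderer(_:didAdd:for:) can attach the blade model there.
}
```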