Augmented Reality

Love Anthem

An interactive Augmented Reality audio-visual healing experience that takes the audience through my personal love journey in NYC.


Concept Development · AR Interaction Design · AR Foundation Development · Unity · Mapping · 3D Modeling · Audio Editing


AR Foundation · Unity · C4D · Adobe Audition


Oct 2019 - Dec 2019



Love Anthem is a 5-minute AR walk in Washington Square Park that takes the audience from one spot in the park to the arch by having them follow the emotional rollercoaster of my love life in NYC, visualized as a trail in AR.

When the audience follows the trail, made of a transparent tube, with their phone aimed at it, the tube fills with color and the original song, with my original poem narrated in my voice, plays. The experience reaches its climax when the trail brings the audience to the fountain and has them look up to see a word in front of the arch that sums up the end result of my love journey here.

The physical pose of the audience following the trail while staring at their phone is a portrayal of people falling in love: focusing with tunnel vision on only the new connection that was born, neglecting the other wonderful things happening in the world as possibilities, which can look funny from other people's perspective. Eventually, when the trail brings the audience to the huge tube word in the sky in front of the arch, the audience's pose also opens up, symbolizing the moment when they finally see the whole picture and can take in the world instead of focusing on one fragment of their life.



To keep exploring the merging of the physical and digital worlds, I wanted to go deeper into the AR realm and create an AR experience that:

  • Visualizes my emotions and memories - my theme, something I'm passionate about
  • Is healing - transcending my frustration into something beautiful
  • Is interactive - wanted to explore the interactivity of AR
  • Is location-based - inspired by Apple's AR walk
Inspired by Swiss artist Pipilotti Rist's Apple AR walk


Exploration of Interactivity in AR.

Creating something that touches people has always been my passion, and I heard that...

The best way to make it universal is to make it personal.

so my idea was...

  • An audio/visual healing journey of the stories throughout my love life

I grew up in Taipei, spent a quarter of my life in Tokyo, and now live in NYC. As someone who is very sentimental and affectionate, I have numerous love stories in each city, epic or trivial.

Therefore, I wanted to create a series of huge AR word sculptures, each summing up my love life in one city, set across the skylines of Taipei, Tokyo, and NY respectively, vehemently whispering my love stories there. With that, I hoped to evoke some feelings in each audience member's heart.

  • Each city would have an original theme song built around an original poem that tells my love stories

I wanted the song to play as the audience fills in the huge AR word sculpture, made of a transparent tube, in the sky. As the audience moves their phone and fills color into the transparent tube that forms the word summing up my love life in that city, the original theme song of my love life there, put together by my musician friend Nick Gregg, would play, along with my voiceover whispering each of my love stories poetically into the audience's ears. I started with NYC.

  • Visualization of my emotional rollercoaster

The trail of transparent tube leading to the word revealed at the end visualizes my emotional rollercoaster during my situationships in NYC, so the audience not only sees my breakups but also physically follows and draws my emotion graph.

  • Physical element of AR

"Why use this new medium?" Nick asked. Besides wanting to augment the environment and the existing sounds of the city, we thought more closely about how the physical pose of an audience member using AR could also tell the story and convey the feeling. Physically following the trail helps pace the music, since the music plays as the tube fills with color. We wanted the pose of the audience following the tube trail, from close to the ground up into the air and stretching at the end, to lead them to open their eyes and rejoin the world: a metaphor for opening up the tunnel vision of people in love, having them realize the beauty of the world and that the new connection they built is just a small fragment of life.

The pose of the audience of my AR public sculpture also takes into consideration how the user is perceived by the people around them. A user closely following the trail and coloring the transparent tube, carefully aiming their phone at it without looking at the world around them, can seem ridiculous to passersby, just as people in love, myself included, seem funny to friends and others when we have our rose-colored glasses on and tunnel vision.


Audio Story Beats.

For my trail to visualize my emotional rollercoaster, and for Nick to put together an original soundtrack that follows my ups and downs throughout the piece, I mapped out my emotional rollercoaster with time on the x-axis and my emotional state on the y-axis. There are three love stories along the time axis, with three distinct guys from different countries and ethnicities.


Asset Preparation - 3D Modeling.

At first, I wanted the word to be "Vulnerable", as that has been how I have felt throughout my love stories here.

I used C4D to make the tube version of the word.


First Pancake - MVP.

After consulting with mixed reality artist Sarah Rothberg, we broke down the technical steps for my project's key interaction and built a first MVP that made sure the interaction worked, planning to expand on it later:

  1. Activate the audio with raycasting (treating the sculpture as one mesh)
  2. Raycast it as many meshes
  3. Paint with patterns
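The key interaction from step 1 can be sketched as a Unity script roughly like the one below. This is a minimal sketch only, assuming a script attached to the AR camera; identifiers such as `filledMaterial` and `tubeAudio` are hypothetical, not the project's actual names.

```csharp
using UnityEngine;

// Sketch: cast a ray from the screen center each frame; when it
// hits a tube mesh, swap in the "filled" material and start the
// audio. filledMaterial and tubeAudio are hypothetical names.
public class TubeActivator : MonoBehaviour
{
    public Material filledMaterial; // colored "filled" look for the tube
    public AudioSource tubeAudio;   // original song + voiceover

    void Update()
    {
        // Ray from the middle of the phone screen
        Ray ray = Camera.main.ViewportPointToRay(new Vector3(0.5f, 0.5f, 0f));
        if (Physics.Raycast(ray, out RaycastHit hit))
        {
            // One mesh first (MVP step 1); later each tube
            // segment becomes its own mesh (step 2).
            var meshRenderer = hit.collider.GetComponent<Renderer>();
            if (meshRenderer != null)
            {
                meshRenderer.material = filledMaterial;
                if (!tubeAudio.isPlaying) tubeAudio.Play();
            }
        }
    }
}
```

Note that `Physics.Raycast` only registers hits on objects with colliders, which is one reason a transparent mesh can appear to "ignore" the ray if its collider setup is off.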

I used AR Foundation and developed the huge AR word sculpture in Unity.

..getting the raycast to work at all with AR Foundation: at first it would not register on the transparent mesh
..reverse-engineered and troubleshot with the solid metallic mesh, the state before I made it transparent
..and finally the raycast worked on the transparent mesh!

Then I replaced the color selection with audio playback, colored the mesh to make it visually healing, and finally scaled it to the right size and placed it at a reasonable distance.

Scaling is actually a lot more challenging than it sounds. Since every change had to be built to my phone before I could see the result, it was very time-consuming just to adjust the sculpture to the right height and size so that it looked like it was up in the air but not too far away, following us wherever we go the way the moon does.
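One way to cut down on those build-to-phone cycles is to expose the placement values as Inspector fields instead of hard-coding them. The sketch below assumes that approach; all numbers and names are placeholders, not the project's final values.

```csharp
using UnityEngine;

// Sketch: tunable placement for the word sculpture, so height and
// distance can be adjusted in the Inspector between on-site tests.
// All values below are placeholder assumptions.
public class SculpturePlacer : MonoBehaviour
{
    public Transform wordSculpture;
    public float distance = 120f; // meters in front of the session origin
    public float height = 25f;    // meters up in the air

    void Start()
    {
        // Place the word ahead of and above the AR session origin,
        // rotated to face back toward the starting point.
        wordSculpture.position = new Vector3(0f, height, distance);
        wordSculpture.rotation = Quaternion.LookRotation(-Vector3.forward);
    }
}
```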


Challenge - Mapping.

Yet, the biggest challenge is mapping the 3D model and the trail to the location for the AR walk experience.

I chose Washington Square Park as where the AR walk would take place because some of my stories happened around the area.
I wanted it to be an AR walk that takes the audience on this audio-visual healing journey with my love stories in NYC.
Washington Square Park always gave me this magical feeling which I thought would match the audio-visual healing experience I was trying to create.

To map, I started by walking around the park, timing my walk, and deciding the distance, route, and pace I wanted the experience to have.

I also calculated the length of each story's narration in the poem and the audio so that it proportionally matched the real duration of each situationship.

I spent three hours on a sunny weekday at the park sitting right next to the legendary piano guy, reflecting on the past three love experiences in NY, writing my poem while recording the sound for Nick to compose.

The poem, as the script of my voiceover, was intended to immerse the audience in my love stories at the park and, by augmenting the sound of the park (mixing the prerecorded sounds with the actual sounds happening there in real time), to evoke their imaginations and emotions.

To map the large-scale 3D sculpture onto the park across 200 meters more efficiently and accurately, after several attempts at building and testing in the park below 40°F with freezing hands, I created a reference plane in both C4D and Unity to speed up modeling at the right scale and mapping at the right distance and location. What I ended up doing was memorizing the positions of some park landmarks relative to features of my 3D sculpture while testing on site with my phone, documenting them for my records, and then adjusting the 3D model and aligning it with those landmarks on the map plane in C4D and Unity, based on my observations and the earlier documentation, until the AR experience was mapped out as closely as I had designed it.

After around ten more rounds of going back and forth (scaling the 3D model relative to the map plane in C4D and Unity indoors, so my hands were not frozen; building to my phone; testing in the park; noting landmarks and screen recording; then scaling and adjusting again), from daytime into the evening, the mapping was finally close to what I wanted.



After Nick composed the music using the sounds I had prerecorded at the park, I edited it in Adobe Audition together with the voiceover, which I recorded on Zoom multiple times to get the right emotions across. Once I swapped this audio file into Unity, Love Anthem was born!

Since it is such a personal piece, and I am still pondering whether I should publish my AR public sculpture, I created some storyboards so people could get a sense of what the AR walk experience is like, and asked my designer friend Ella Chung to help me shoot the documentation in below-32°F weather.

Talented Ella helping document my project with bare hands below 32°F, with my AR sculpture above her head


Improvement for Interactivity.

One flaw of the current state is that the audience can only activate the colored tube and audio through the middle of the screen. I am planning to attach a much thicker version of the current trail 3D model as a transparent game object onto the existing one, and link the raycast script to that transparent game object, so that it is much easier for the audience to activate the raycast, color the tube, and hear the audio consistently.
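In Unity, that fix could look something like the sketch below: a duplicate, thicker tube mesh whose renderer is disabled, so it stays invisible but still carries a collider that gives the raycast a much larger hit target. All identifiers here are hypothetical, not the project's actual names.

```csharp
using UnityEngine;

// Sketch of the planned fix: an invisible, thicker duplicate of
// the tube acts as a generous hit target for the raycast, while
// the thin visible tube only changes appearance. Identifiers
// (visibleTube, filledMaterial, tubeAudio) are hypothetical.
public class TubeHitProxy : MonoBehaviour
{
    public Renderer visibleTube;    // the thin, visible tube segment
    public Material filledMaterial; // colored "filled" look
    public AudioSource tubeAudio;   // song + voiceover

    void Start()
    {
        // Hide the thick proxy mesh but keep its collider active.
        GetComponent<Renderer>().enabled = false;
    }

    // Called by the raycasting script when the proxy collider is hit.
    public void Activate()
    {
        visibleTube.material = filledMaterial;
        if (!tubeAudio.isPlaying) tubeAudio.Play();
    }
}
```

The design idea is to separate what the audience sees (the thin transparent tube) from what the raycast tests against (the thick invisible collider), so aiming tolerance improves without changing the visual.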