Experiential Design: Task 03

| 03.06.2025 - 24.06.2025 (WEEK 7-WEEK 10)
| Experiential Design | Bachelor of Design in Creative Media
| Experience Design Project Proposal
Tang Chin Ting [ 0366473 ]

INSTRUCTIONS


SUBMISSION

Task 03: Project MVP Prototype
Instructions
Once their proposal is approved, students work on a prototype of their project. The prototype enables them to discover limitations they may not have anticipated, and they must think creatively about how to overcome those limitations to materialize their proposed ideas. The objective of this task is for students to test the key functionality of their project.
  • The output may not necessarily be a finished, visually designed project.
  • Students will be gauged on their prototype's functionality and their ability to creatively think of alternatives to achieve the desired outcome.
Requirements
  • Screen Design visual prototype (Unity)
  • Functioning MVP of the App Experience
Submission
  • Video walkthrough and presentation of the prototype
  • Online posts in your E-portfolio as your reflective studies
Progress
This is a group assignment that Ming En and I worked on together. Our topic is about creating an AR route guide for freshmen and visitors at Taylor’s University. The idea is that users can scan a building with their device, select their destination, and have the AR route guide appear directly on their screen to help them navigate the campus.

A review of our Figma prototype, which we created in Task 02:
Figure 1.1 Task 02 Prototype in Figma
We initially planned to create the route scan using the Mapper app to capture the environment, along with the Immersal SDK engine in Unity for building the AR experience. However, we discovered that the Mapper app can only capture up to 100 photos in one session. This limitation meant that our original route—from the library to the Grand Hall—was too long to capture completely, since it would require far more photos than allowed.

Because of this restriction, we decided to revise our plan by changing both the starting and ending points. We chose a shorter, more manageable route—from E7.14 classroom to D7.03 classroom. This new route is much easier to scan within the photo limit and still allows us to demonstrate the AR navigation concept effectively.

Instead, we used Unity's AR Foundation framework for the whole project.
Figure 1.2 Using AR Foundation Manager in Unity
We used the Measure app on the iPhone to record the distances for each specific segment of the route. This helped us accurately scale the 3D route path in Unity. For the AR route itself, we created a simple 3D cube object. We adjusted its width and scale to resemble a path on the ground, making it easier for users to follow the guide visually in the AR experience.
Figure 1.3 Create AR Route in Unity
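We laid out and scaled the path segments manually in the editor, but the same idea can be expressed in code. The sketch below is purely illustrative (the class name `RoutePathBuilder` and its fields are hypothetical, not part of our project): it stretches a flattened cube to each measured segment length and places the segments end-to-end.

```csharp
using UnityEngine;

// Illustrative sketch: lays out stretched cubes end-to-end using
// real-world segment lengths (in metres) measured with the Measure app.
// All names and values here are hypothetical.
public class RoutePathBuilder : MonoBehaviour
{
    public GameObject segmentPrefab;   // a flattened cube used as the path
    public float[] segmentLengths;     // measured lengths, e.g. { 6.5f, 12f, 4.2f }
    public float[] segmentYRotations;  // heading of each segment in degrees

    void Start()
    {
        Vector3 cursor = transform.position;
        for (int i = 0; i < segmentLengths.Length; i++)
        {
            Quaternion rot = Quaternion.Euler(0f, segmentYRotations[i], 0f);
            Vector3 dir = rot * Vector3.forward;
            // Place each cube at the segment midpoint, then stretch it
            // along its local Z axis to the measured length.
            GameObject seg = Instantiate(segmentPrefab,
                cursor + dir * segmentLengths[i] * 0.5f, rot, transform);
            seg.transform.localScale = new Vector3(0.5f, 0.02f, segmentLengths[i]);
            cursor += dir * segmentLengths[i];
        }
    }
}
```

Doing it in code would make it easier to re-measure and rebuild the route later, at the cost of losing the visual feedback of dragging cubes in the Scene view.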
Route Arrow 3D Object
We added route arrows along the path to help guide users more clearly. To create the arrow, I used two 3D cube objects in Unity and adjusted their scale and rotation to form an arrow shape that would fit well on the route path. I also wrote a custom script that places these arrows automatically along the entire route and added a floating animation to make them appear more interactive and noticeable in the AR experience.

The script made it much easier for me to fine-tune the behavior of the arrows. It includes two adjustable fields: one for controlling the floating speed of the animation and another for setting the spacing between each arrow along the path. This flexibility allowed me to quickly test and modify the appearance and movement of the arrows to achieve the desired visual effect.
Figure 1.4 Create Route Arrow in Unity
Figure 1.5 Scripting for the Route AR in Unity
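The arrow script described above can be sketched roughly as follows. This is a minimal reconstruction, not our exact code; the names (`RouteArrowSpawner`, `floatSpeed`, `spacing`) are assumptions based on the two adjustable fields mentioned.

```csharp
using UnityEngine;

// Hypothetical sketch of the arrow script: spawns arrow prefabs at a fixed
// interval along a straight route and bobs them up and down with a sine wave.
public class RouteArrowSpawner : MonoBehaviour
{
    public GameObject arrowPrefab;   // the two-cube arrow shape
    public Transform routeStart;
    public Transform routeEnd;
    public float spacing = 1.5f;     // metres between arrows (Inspector field)
    public float floatSpeed = 2f;    // speed of the floating animation
    public float floatHeight = 0.05f;

    void Start()
    {
        Vector3 dir = (routeEnd.position - routeStart.position).normalized;
        float length = Vector3.Distance(routeStart.position, routeEnd.position);
        for (float d = 0f; d <= length; d += spacing)
            Instantiate(arrowPrefab, routeStart.position + dir * d,
                Quaternion.LookRotation(dir), transform);
    }

    void Update()
    {
        // Bob every child arrow; assumes the arrows all sit at the
        // parent's ground height, so local Y can be overwritten directly.
        float offset = Mathf.Sin(Time.time * floatSpeed) * floatHeight;
        foreach (Transform arrow in transform)
        {
            Vector3 p = arrow.localPosition;
            arrow.localPosition = new Vector3(p.x, offset, p.z);
        }
    }
}
```

Exposing `spacing` and `floatSpeed` as public fields is what makes the quick Inspector tweaking described above possible without recompiling.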
Arrived Destination Label 
I found this 3D object on Google and placed it at the final point of the route to indicate the destination. To make it more dynamic and visually engaging, we added a floating animation through a custom script in Unity. The script lets us easily adjust both the speed and duration of the animation, drawing attention to the endpoint and enhancing the overall interactivity of the AR experience.

Click here for the 3D object link.
Figure 1.6 Design the Arrived Destination Label in Unity
Figure 1.7 Scripting for The Label in Unity
We also added a Particle System to our scene under the ARRouteModel Variant object. It creates a glowing, sparkle-like effect that makes the final destination object stand out.
Figure 1.8 Adding Material Effect for the Label in Unity

UI Interface

For the UI interface of our app, we designed and implemented four main pages: the Onboarding page, Home page, Location page, and Profile page. We began by creating a new canvas in Unity to serve as the foundational layout for all the UI elements. Then, we designed each interface using panels. 

Figure 1.9 Creating UI Canvas Interface in Unity

Figure 1.10 Creating UI Canvas Interface in Unity (1)

When designing buttons, we encountered a limitation in Unity: we couldn't directly adjust the border radius of a button. To maintain our design, we exported rounded rectangle-shaped button graphics from Figma and imported them into Unity as images, which preserved our initial design.

Figure 1.11 Creating the Buttons Link to Another Scene in Unity

Additionally, we imported the font into Unity to better align with the overall design and tone of the app. The font choice plays an important role in reinforcing the app's identity and enhancing user experience, so we made sure to match it closely with our original design plan.

Figure 1.12 Import Font Type in Unity

To enable smooth navigation between pages, we wrote scripts to link each button to its panel. In these scripts, we wrote functions that activate the target panel while deactivating the others, ensuring only one page is visible at a time. We attached these scripts to the buttons through the Unity Inspector, then assigned the appropriate methods to the OnClick() event of each button. This lets users navigate seamlessly across the UI, from onboarding to home, from home to location, and so on, by simply tapping the buttons.

Figure 1.13 Creating The Function for Each Button
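The panel-switching function described above might look something like this. It is a sketch under assumptions: the class and panel names (`PageNavigator`, `homePanel`, etc.) are hypothetical stand-ins for our actual script.

```csharp
using UnityEngine;

// Hypothetical sketch of the navigation script wired to each button's
// OnClick() event: shows one UI panel and hides all the others,
// so only one page is ever visible at a time.
public class PageNavigator : MonoBehaviour
{
    public GameObject onboardingPanel;
    public GameObject homePanel;
    public GameObject locationPanel;
    public GameObject profilePanel;

    // Public methods assignable to Button OnClick() in the Inspector.
    public void ShowOnboarding() { Show(onboardingPanel); }
    public void ShowHome()       { Show(homePanel); }
    public void ShowLocation()   { Show(locationPanel); }
    public void ShowProfile()    { Show(profilePanel); }

    void Show(GameObject target)
    {
        onboardingPanel.SetActive(false);
        homePanel.SetActive(false);
        locationPanel.SetActive(false);
        profilePanel.SetActive(false);
        target.SetActive(true);
    }
}
```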

Background Music

We decided to implement background music that plays during walking or exploration to create a more immersive atmosphere while users navigate the campus in AR.


First, we prepared a looping background music file in .mp3 format that matched the tone and mood of the app: a track with a simple piano melody.


This helps to reduce user stress, enhance focus, and make the walking experience more enjoyable as users navigate the campus. We hope this can encourage users to slow down and absorb the environment around them, and create an immersive AR journey.


After that, we imported the audio file into Unity by dragging it into the Assets folder. Next, we created an empty GameObject in the scene and renamed it. We added an Audio Source component to this GameObject and assigned our background music clip to the Audio Clip field. To make sure the music plays continuously, we enabled both "Play On Awake" and "Loop" options in the Audio Source settings.

Figure 1.14 Adding Sound Effect in Unity
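We configured the Audio Source entirely in the Inspector, but the same setup can be done in code. This sketch is only an illustration of the equivalent script-based approach (the class name `BackgroundMusic` is hypothetical):

```csharp
using UnityEngine;

// Illustrative alternative to the Inspector setup described above:
// a looping AudioSource that starts playing when the scene loads.
[RequireComponent(typeof(AudioSource))]
public class BackgroundMusic : MonoBehaviour
{
    public AudioClip musicClip;   // the looping piano track
    [Range(0f, 1f)] public float volume = 0.5f;

    void Awake()
    {
        AudioSource source = GetComponent<AudioSource>();
        source.clip = musicClip;
        source.loop = true;         // equivalent to ticking "Loop"
        source.playOnAwake = true;  // equivalent to "Play On Awake"
        source.volume = volume;
        source.Play();
    }
}
```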

Testing Phase

Once we had most of the features implemented, we built the AR navigation app to a mobile phone and began testing it on-site at our campus. For our test route, we walked from classroom E7.14 to D7.03. This let us experience the app in a real environment and see how well the guidance system worked.


During the first few tests, we noticed that some features weren’t responding as expected. For example, the route and arrows didn’t appear properly.


After several rounds of testing and tweaking, we were finally able to get everything working the way we imagined. The arrows floated smoothly along the path, and the gradient color on the route looked clear and attractive. Testing on-site really helped us identify and fix issues that we wouldn’t have noticed in the editor alone.

Figure 1.15 Testing Phase

Final Video Presentation

Figure 1.16 Final Video Presentation

REFLECTION

This was a valuable learning experience for me because it was my first time designing an AR route guidance system in Unity. Ming En and I initially planned to use the Mapper app and Immersal SDK to map our route and build the AR experience, but the Mapper app only allows 100 photos per scan, which made it impossible to map our original long route from the library to the Grand Hall. Even after we shortened the route from E7.14 to D7.03, we still faced technical issues getting the Immersal SDK to work as desired.

Because of these complications, we switched to AR Foundation in Unity instead. It was a significant change of direction, but in the long run it made the development process easier. We used the Measure app on the iPhone to record real-world distances along the route so we could scale our 3D path accurately in Unity. For visual navigation, we created cube-shaped path segments and scripted floating arrow objects with adjustable speed and spacing. The arrows direct users along the AR path in an interactive and intuitive way.

We added a destination marker at the endpoint, enhanced with a particle system and a floating effect to draw attention. This project taught me the importance of adapting quickly when technical limitations arise, and of choosing the right tools for the project's requirements. I also saw how small considerations like animation and readable markers can greatly improve the user experience in AR navigation. Overall, this assignment gave me hands-on practice with AR Foundation in Unity and a better understanding of how interactive AR guides can be implemented.
