As one of my first augmented reality explorations in Unity, the goal of this project was to work with many types of assets and understand how different media can be combined to get the most out of the engine.
The experience uses AR Foundation's image tracking to anchor a digital model to a physical card, and offers clickable links to my social accounts and website.
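Under the hood, this kind of setup hangs off AR Foundation's ARTrackedImageManager, which raises an event whenever a reference image is detected. A minimal sketch (the prefab and field names are my own placeholders, and the exact event API varies a little between AR Foundation versions):

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Spawns the digital overlay on top of the physical card
// whenever its reference image is detected.
public class CardSpawner : MonoBehaviour
{
    [SerializeField] ARTrackedImageManager trackedImageManager;
    [SerializeField] GameObject cardPrefab; // digital overlay (placeholder)

    void OnEnable()  => trackedImageManager.trackedImagesChanged += OnChanged;
    void OnDisable() => trackedImageManager.trackedImagesChanged -= OnChanged;

    void OnChanged(ARTrackedImagesChangedEventArgs args)
    {
        foreach (var trackedImage in args.added)
        {
            // Parent the overlay to the tracked image so it follows the card.
            Instantiate(cardPrefab, trackedImage.transform);
        }
    }
}
```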
The final product also incorporates video animations, particle effects, and an animated avatar to achieve a futuristic look.
The initial inspiration for this project was a post on social media with a similar concept. This example was pretty basic, but it gave me a lot of interesting ideas about what the full potential of an AR business card could be.
Much of my interest in augmented reality centers on applications with real utility, and an AR business card seemed like a great first step into the space.
I started with some quick sketches in Procreate to develop a storyboard that outlined my initial thoughts on the experience. I tried to stay conscious of usability from the start, considering text and button size as well as where a user might want to grab the card.
To dial in the design with a higher level of fidelity, I mocked up my ideas for the digital part of the experience in Figma. At the same time, I began designing the physical card with Illustrator and Photoshop.
Creating these initial mock-ups was important because it let me bring my ideas into Unity and do some early testing to confirm the basic functionality of the experience.
This turned out to be valuable because my early design for the physical card did not have enough distinct, high-contrast features to track reliably. I used what I learned to work through a few iterations and optimize the image for better tracking performance.
With a functional base for my prototype, I was excited to move on to the animations, effects, and interactions that would make this project stand out. Referencing my initial storyboard, I strategized how I would achieve the desired result.
I used a combination of familiar and unfamiliar tools, hoping to learn new techniques while also figuring out how to fold the tools I already knew into the Unity workflow.
I brought each asset into Unity and used the built-in timeline to create the animation sequence I had outlined. To program the interactions, I used the UI event system.
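The clickable links boil down to wiring a handler into Unity's UI event system. A minimal version of one of those interactions (the button reference and URL here are placeholders, not my actual links) might look like:

```csharp
using UnityEngine;
using UnityEngine.UI;

// Opens a profile URL when its corresponding UI button is tapped.
public class LinkButton : MonoBehaviour
{
    [SerializeField] Button button;
    [SerializeField] string url = "https://example.com"; // placeholder

    void Awake()
    {
        // Application.OpenURL hands the link off to the device's browser.
        button.onClick.AddListener(() => Application.OpenURL(url));
    }
}
```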
With all these pieces together in one place, I finally had a full prototype of my idea to test and refine.
To reveal any weak points in my design, I put it to the test, allowing others to demo my prototype. Here are a few key takeaways from that process:
While some of these issues are a quick fix in Unity, others will require a more thoughtful approach and continued testing to validate.
For example, the choice to use white text with a transparent background shaped a big part of the overall design. To preserve that style, it would be worth experimenting with a darker gradient as a backdrop to improve contrast.
Another interesting problem is helping users understand that the buttons are tappable. Without hand tracking to support hover animations, I think a good cue would be a subtle button animation every couple of seconds.
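That periodic cue could be as simple as a coroutine that pulses the button's scale on a timer. A rough sketch of the idea (the interval and scale values are arbitrary starting points, not tested numbers):

```csharp
using System.Collections;
using UnityEngine;

// Gently scales a button up and back down every few seconds
// to hint that it is interactive.
public class ButtonPulse : MonoBehaviour
{
    [SerializeField] float interval = 3f;      // seconds between pulses
    [SerializeField] float pulseScale = 1.1f;  // peak scale multiplier
    [SerializeField] float pulseDuration = 0.4f;

    IEnumerator Start()
    {
        Vector3 baseScale = transform.localScale;
        while (true)
        {
            yield return new WaitForSeconds(interval);
            for (float t = 0f; t < pulseDuration; t += Time.deltaTime)
            {
                // Sine curve eases the scale out to the peak and back.
                float s = Mathf.Sin(t / pulseDuration * Mathf.PI);
                transform.localScale = baseScale * Mathf.Lerp(1f, pulseScale, s);
                yield return null;
            }
            transform.localScale = baseScale;
        }
    }
}
```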
Whether I decide to do another prototype for this project or use my findings to improve the next one, I'm really satisfied with the result and what I learned along the way.
I'm looking forward to applying these lessons and continuing the ongoing loop of iterating, learning, and applying my knowledge to build better AR experiences in the future.