Experience Design, UX Design - Smartwatch, Speculative Design, Wearable Design

Thesis Project in collaboration between Royal College of Art x Imperial College London, and featured in Royal College of Art Grad Show 2021

TEAM Independent Project

DATE 2021


The future of personalized immersion will enable humans to be augmented and empowered to make time-efficient, customized decisions about how they engage with their data and with the environments of their cities. Today, people spend much of their time outdoors searching, making decisions, and wayfinding on a smartphone, losing precious time in the process.


Today, activity and purchase decisions in local and international travel begin with an Instagram post. Tech companies like Pinterest, Instagram and LinkedIn have reported a 276 percent increase in small-town travel, with an emphasis on people wanting more unexpected destinations, local knowledge, and eco-friendly means.


Encounter's design process followed a methodology driven by user research and stakeholder feedback in testing the multimodal design system. Encounter has garnered validation and interest from industry experts across the fields of wearable technology, voice user interfaces, machine learning, and color psychology, from companies such as WearWorks, Google, and Pear Sports.

The Future of Personalized Immersion

Encounter is an AI-driven augmented reality experience that helps people discover serendipitous experiences and geotag audio memories in their cities by leveraging their digital information to enhance their physical journeys through voice assistance. Encounter’s geotagged audio connects users with a rich tapestry of lived experience in their physical environment for a timeless connection to loved ones.

Visual Design: Smartwatch App

Users begin the journey by wearing a custom hearable that connects to a smartwatch application. Encounter has three layers: the first reflects your routine or previously tried encounters, the second recommends encounters based on your social media and manually entered data, and the third creates serendipitous encounters, such as the perfect sunset view at exactly the right time.
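The three layers above can be sketched as a simple data model. This is a minimal illustration, not the production system; the class and field names (`Layer`, `Encounter`, `by_layer`) are hypothetical.

```python
from dataclasses import dataclass
from enum import Enum

class Layer(Enum):
    """The three layers of the Encounter feed (names hypothetical)."""
    ROUTINE = 1        # routine or previously tried encounters
    RECOMMENDED = 2    # based on social media and manually entered data
    SERENDIPITOUS = 3  # e.g. the perfect sunset view at the right time

@dataclass
class Encounter:
    title: str
    layer: Layer
    lat: float
    lon: float

def by_layer(encounters, layer):
    """Filter a feed down to a single layer."""
    return [e for e in encounters if e.layer == layer]

feed = [
    Encounter("Morning coffee spot", Layer.ROUTINE, 51.501, -0.187),
    Encounter("Sunset over the Serpentine", Layer.SERENDIPITOUS, 51.505, -0.175),
]
titles = [e.title for e in by_layer(feed, Layer.SERENDIPITOUS)]
```

In this sketch, surfacing a serendipitous encounter is just a filter over the feed; in practice each layer would draw on different data sources, as described above.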

— Encounter is an AI-driven digital assistant that augments your journey with serendipitous experiences and geotagged audio. The system comprises a hearable device, a smartwatch, and the Encounter application.
Bone Conduction Hearable

Users can effortlessly navigate a city using a voice assistant designed around the needs of people on the move, as personalized serendipitous encounters extend their physical outings into meaningful experiences.

— Encounter works through a customized hearable device and a smartwatch application to provide information on physical experiences near you, along with walking time.
— Encounter's bone conduction adhesive hearable was designed using flexible electronics to feel like a second skin.
— Encounter Audio-Visual Interaction: When the Encounter voice assistant is asked a question, it pushes only the information people need directly to their watch.
Geolocated + Spatial Audio

Users can record and leave geotagged audio files for friends and family through their phone or watch, creating opportunities to revisit places through new perspectives.

— Encounter Geo-located Audio: Encounter's geolocated audio allows friends and family to send audio memories into physical locations for users to discover on their journey.
— Encounter Spatial Audio: Encounter's spatial audio navigation uses a virtual corridor to guide people to a destination and can be prompted through voice.
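One way geolocated audio like this could be triggered is with a simple proximity check: an audio memory plays when the user walks within a small radius of its geotag. The sketch below assumes this geofence approach; the function names, the 25 m radius, and the sample coordinates are all illustrative, not Encounter's actual implementation.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two WGS84 points."""
    r = 6371000.0  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def audible_memories(user_lat, user_lon, memories, radius_m=25.0):
    """Return the geotagged audio memories within the trigger radius."""
    return [m for m in memories
            if haversine_m(user_lat, user_lon, m["lat"], m["lon"]) <= radius_m]

memories = [
    {"id": "bench-message", "lat": 51.50642, "lon": -0.17703},  # near Royal Albert Hall
    {"id": "first-date",    "lat": 51.51120, "lon": -0.11870},  # several km away
]
nearby = audible_memories(51.50640, -0.17700, memories)
```

A spatial-audio "virtual corridor" could build on the same distance function, panning the audio left or right depending on the bearing from the user to the destination.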
The Encounter Experience

Encounter demonstrates the potential of personalized immersion to save precious time through meaningful, customized experiences rather than extensive planning. Encounter presents a vision of the future of X Reality (XR) and Human-Computer Interaction (HCI).

Grad Show Exhibitions

Physical exhibitions were held at the Royal College of Art and Imperial College London. An interactive map was designed using NFC tags to recreate a small-scale version of the Encounter city experience in the exhibition space.

Design Methodology

A speculative design approach was initially used to scope the technology and context for each component of the multimodal design system. Through this multimodal approach, the components were individually validated by industry experts and users before being tested in a consolidated prototype.


With the advent of augmented and mixed reality interfaces, there is a unique opportunity to expand experiences beyond screen-based interactions towards reality-based interactions.

User Research + Expert Engagement

Surveys were conducted with 67 participants and synthesized using sentiment and keyword analysis software. Encounter has garnered validation and interest from industry experts in wearable technology, voice user interfaces, machine learning, and color psychology, from leading companies including Google, Panasonic, Pear Sports, and CultureTrip.

— Cities in which participants would find Encounter useful, in the context of local travel or exploration abroad.
— Information sources participants typically rely on for recommendations and navigation in a new city.
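The keyword analysis mentioned above can be approximated with a naive frequency count over survey responses. This is a minimal sketch of the general technique, not the specific software used in the project; the stopword list and sample responses are illustrative.

```python
import re
from collections import Counter

STOPWORDS = {"the", "a", "an", "to", "and", "of", "i", "in", "on", "for", "my"}

def keyword_counts(responses):
    """Naive keyword analysis: lowercase, tokenize, drop stopwords, count."""
    words = []
    for text in responses:
        words += [w for w in re.findall(r"[a-z']+", text.lower())
                  if w not in STOPWORDS]
    return Counter(words)

responses = [
    "I use Google Maps and Instagram to plan my route",
    "Instagram posts and blogs for hidden places",
]
top = keyword_counts(responses).most_common(3)
```

Sorting the counts surfaces the terms participants mention most often, which can then be grouped into the themes reported in the findings.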
Persona + User Journey

The target personas for Encounter include the local explorer looking for spontaneous encounters on familiar journeys. By solving for the extreme case of a local rediscovering familiar places, Encounter could also be applied to the tourist.

Secondary Research

Secondary research was conducted and culminated in a thesis report on the design approach and process.

Multimodal Interaction Mapping

Multimodal design allows for multiple modes of interaction for the user. Because Encounter was conceived as a digital assistant that could be used anywhere, interaction mapping was done on a specific journey from Bayswater to the Royal Albert Hall. This journey uncovered the necessary interactions across human, visual, auditory, and haptic feedback channels.


Visual designs were created using secondary research on glanceable notifications and validated with color theory experts. The voice interface was tested through stakeholder feedback and prototyped with Google Assistant. The hearable design process was informed by secondary research, participatory design, and expert interviews. An immersive walk and paper prototypes tested the overall interaction system of the multimodal design.