Product Designer at Muse
User Research, User experience design, Interaction design, Interface design, Data visualization, Animation
Integrated Session Results With Muse
About Muse
At Muse, we live at the heart of human-centered technology. Our products are designed to make our users' lives easier, more connected, and a lot more mindful. With a state-of-the-art EEG system that uses advanced signal processing algorithms, Muse gives you feedback about your meditation in real time by translating your brain signals into the sounds of weather. When your mind is calm and settled, you'll hear calm and peaceful weather. When your mind is active, the weather will get louder. Monitoring changes and trends in each state over time helps you track your progress.
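The real-time feedback loop above can be sketched in a few lines. This is a purely illustrative model, not Muse's actual signal processing: `weather_volume`, `session_trend`, and the calm threshold are all hypothetical names and values chosen to show the idea of mapping a calm score to weather intensity and tracking trends over time.

```python
# Illustrative sketch only (not Muse's actual algorithm): map a per-second
# "calm" score derived from EEG into a weather-sound volume, so an active
# mind produces louder weather, and compute a simple per-session statistic.

def weather_volume(calm_score: float) -> float:
    """Map a calm score in [0, 1] (1 = fully calm) to a volume in [0, 1].

    A calmer mind yields quieter weather; an active mind, louder weather.
    """
    calm_score = min(max(calm_score, 0.0), 1.0)  # clamp to valid range
    return 1.0 - calm_score

def session_trend(calm_scores: list[float]) -> float:
    """Fraction of the session spent calm (score above a fixed cutoff),
    the kind of per-session number a user could track over time."""
    CALM_THRESHOLD = 0.7  # hypothetical cutoff
    calm_seconds = sum(1 for s in calm_scores if s > CALM_THRESHOLD)
    return calm_seconds / len(calm_scores)
```

The inverse mapping (calm in, volume out) is the core of the feedback loop: the user hears the consequence of their mental state within the same second it is measured.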
Discovery
After the launch of the Muse 2, we saw an influx of customer cases where users wanted to get brain data from the EEG sensors in the device regardless of which experience (mind, heart, body, breath) they had chosen. My team dove deeper into this user request to figure out the actual "user needs" underneath the "user wants".
Qualitative User Interviews
Conducted a series of qualitative user interviews with a range of users: some were fairly new to meditation, while others had been practicing for 10+ years. Most had tried other meditation apps, but preferred Muse because of the additional biofeedback and data that the wearable provides.
Insights
Most of our customers are focused on thought-related practice, where Mind is a key part of their routine.
Because brainwave feedback (EEG) is the unique product offering, people expected to see their brain data at the end of their meditation session, regardless of the type.
Advanced users who were comfortable with the core experience wanted deeper insights so they could deepen their meditation practice.
Users wanted to look for correlations in their biofeedback data (e.g. does my heart rate affect the calmness of my mind?).
The Challenges
1. Universal Signal Quality Check (USQC)
The first challenge was to have users complete a signal quality check (SQC) for all the sensors (EEG, PPG, accelerometer) in their Muse device at the same time. This meant we would have to merge the individual SQCs for each experience into one comprehensive universal signal quality check.
2. Integrated Session Results
We then had to integrate 4 different types of complex biofeedback in a way that allows users to look for correlations in their data.
1. Universal Signal Quality Check (USQC)
Discovery
After conducting internal user testing, I validated some of my concerns about the original mind SQC and its onboarding.
Insights
SQC onboarding was too lengthy (7 minutes).
Highly technical information was unnecessary and confusing.
Not being able to skip onboarding or know progress was frustrating.
Colours differentiating the sensors gave users the impression that the sensors all did different things when in fact they’re all EEG sensors. Plus the recent addition of the other Muse 2 experiences (body, heart and breath) only added to the confusion around colour mapping.
Always-present tooltips gave users the impression that they had to adjust the headband, prolonging the SQC process.
Conceptualization
After sharing the insights to get buy-in from stakeholders on the feature, I began exploring a variety of configurations, data visualizations, layouts and colour palettes in search of the optimal universal signal quality graphic.
User Testing
Conducted rapid prototyping and user testing over a 5 day sprint with existing Muse users and people who had never used the product.
The Goal
Determine if the user was able to:
Map the sensors in their Muse device to the graphic
Comprehend the different states (none, weak, good) of signal quality check
Understand how to troubleshoot connection issues while in signal quality check
Insights
It was challenging for the user to pay attention to multiple animated graphics at once.
Users didn’t realize secondary SQC options were tappable.
Signal quality check should be universal and not specific to experience type, so that the graphic could scale with the Muse experiences as they evolved.
Combined graphic was easier for users to understand.
Location of the PPG sensor was misleading; it did not accurately represent its placement on the device.
Body (accelerometer) was not needed in USQC: if the other sensors were able to establish good signal, it was a given that the accelerometer would work.
Needed to keep the troubleshooting tips.
Mid Fidelity Designs V1
Conducted 3 user tests
This version of the USQC graphic had a simplified shape that more closely resembled the form factor and mirrored the actual placement of the sensors on the Muse 2.
Insights
Bright colours used to create visual separation were ultimately distracting and users tried to attach a deeper meaning to them.
Close up of the headband made it hard for users to correctly identify the sensors.
Mid Fidelity Designs V2
Conducted 5 user tests
In this version I attempted to split the onboarding information apart further and provide pagination to indicate progress. After conducting user testing, I validated my decision to use a minimal colour palette.
Insights
There was still too much copy to get through.
Splitting the users' attention between the copy and the image was an issue.
Grid pattern was still not a clear enough indication of weak signal.
Headband diagram was too small to be useful.
Users were prematurely adjusting the headband when specific sensors were not producing good signal quality, but in the process would disconnect the other sensors that already had a good or weak connection.
We needed to introduce instant feedback on each sensor's status: a sensor must first establish a good connection with the user before the device can test its signal quality, and this two-step process was not clear.
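The two-stage status described above can be sketched as a small state model. This is a hypothetical illustration, not the shipped implementation: the enum values, thresholds, and function name are all assumptions chosen to show that contact is checked before signal quality is graded as none, weak, or good.

```python
# Hypothetical sketch of the two-stage sensor status: contact first,
# then signal quality (none / weak / good). Thresholds are illustrative.

from enum import Enum

class SensorStatus(Enum):
    NO_CONTACT = "no contact"   # headband not yet touching the skin
    NONE = "none"               # contact made, but no usable signal
    WEAK = "weak"
    GOOD = "good"

def sensor_status(has_contact: bool, signal_strength: float) -> SensorStatus:
    """Grade a single sensor. Contact is checked first so the UI can
    give instant feedback before any signal-quality test runs."""
    if not has_contact:
        return SensorStatus.NO_CONTACT
    if signal_strength < 0.2:
        return SensorStatus.NONE
    if signal_strength < 0.6:
        return SensorStatus.WEAK
    return SensorStatus.GOOD
```

Surfacing the `NO_CONTACT` state separately is what gives users the instant feedback the insight calls for, so they stop adjusting the headband once every sensor at least has contact.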
The Solution
The solution was to create a universal signal quality check (USQC) not specific to experience type with a simplified shape that more closely resembled the form factor and mirrored the actual placement of the sensors. The onboarding for USQC would be condensed into 3 short videos which only included essential information and avoided technical jargon.
Final USQC
Conducted 4 user tests
The final USQC introduced dots as visual cues which indicated the current status of the corresponding sensor in the headband. During onboarding, videos replaced text, allowing users to focus on the graphic while listening to the instructions.
Insights
The different shades of teal were correctly interpreted by everyone as representing various degrees of signal quality.
The shape of the diagram makes it very clear where the sensors are and is reported to work better than the full circle.
Perceived as being very efficient and quick, speedier than before.
When asked, 75% of existing users preferred the look of the new design and 80% found it easier to use.
2. Integrated Session Results
Discovery
Each of the existing Muse 2 experiences (mind, heart, body and breath) provided the user with a breakdown of its own unique data at the end of a meditation session.
Mid Fidelity Designs V1
Conducted 4 user tests
This version used a tabbed view of results which allowed users to compare data sets.
Insights
Users wanted to see graphs in relation to each other so they could easily look for correlations in their data.
Discoverability of navigation was low.
Mid Fidelity Designs V2
Conducted 3 user tests
This version still used tabs for the user to navigate through their secondary data, but the graph for their chosen experience remained ever-present.
Insights:
It wasn't immediately clear to users what the connection was between the tabs and which graph they affected (top or bottom).
Secondary legend was too far down and needed to be closer to the graph.
Mid Fidelity Designs V3
Conducted 3 user tests
Here I explored overlapping data sets to help users find correlations in their data.
Insights:
Too much data in one graph.
Double legend on the y-axis was confusing to users.
The legend's proximity to the graph was helpful.
The Solution
The solution was to show the user a high-level view of their session, but allow them to dig deeper into the data if they desired. Sparklines and corresponding statistics would provide the user with a holistic view of their meditation session.
Final Integrated Session Results
Conducted 4 user tests
The final integrated session results introduced sparklines, which showed a minimal amount of mind, heart, body and breath data side by side so users could gain a holistic view of their meditation session. The expanded view also allowed users to scroll through each full graph and zoom into specific points of their session to uncover deeper insights.
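A sparkline view like the one described depends on reducing each full-resolution biosignal to the same small number of points so the four traces line up for visual comparison. The sketch below is an assumption about how such downsampling could work, not the product's actual data pipeline; the function name and bucket count are hypothetical.

```python
# Hypothetical sketch: downsample a full-resolution session trace to a
# fixed number of bucket averages so mind, heart, body, and breath
# sparklines share the same horizontal resolution.

def downsample(samples: list[float], points: int = 30) -> list[float]:
    """Reduce a trace to `points` bucket averages (or fewer if short)."""
    if len(samples) <= points:
        return list(samples)
    bucket = len(samples) / points
    out = []
    for i in range(points):
        lo, hi = int(i * bucket), int((i + 1) * bucket)
        chunk = samples[lo:hi] or samples[lo:lo + 1]  # guard empty slice
        out.append(sum(chunk) / len(chunk))
    return out
```

Averaging within each bucket preserves the overall shape of the session (the "holistic snapshot") while the expanded view would still query the original full-resolution data for zooming.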
Insights
Users felt like the sparklines provided a holistic snapshot of their meditation session.
Expanded view was perceived as being for more experienced meditators.
Users remembered points in their sessions when their mind became active but were excited to see how their heart and body data were also connected.
Tools: Sketch, Invision Studio, Adobe Illustrator, Adobe After Effects