Six weeks (part-time)
This project was an assignment for Designlab’s UX Academy. The brief was to design a mobile application that used augmented reality (AR) to help millennial consumers buy furniture that truly fits their space.
The clickable prototype I tested appears to the left. It's designed to test the following tasks, so not every affordance is clickable:
Get up to speed on the product landscape and understand potential users.
Based on insights from the research data, get clear on which problems to solve and who to solve them for.
Quickly build a prototype to test whether users can accomplish core tasks, matching visual design fidelity to project needs.
Test the prototype with users representing the target demographic. Draw insights from this data to spotlight usability issues and reframe design questions.
The research phase included both secondary and primary research. Secondary research included a quick survey of the home furnishings purchase space and a more focused analysis of competitors and adjacent products. I also conducted user interviews to get a closer look at how users were actually buying furniture.
Home furnishings are a big business, and one in which a digital transformation has already happened. Bigger brands increasingly deliver omnichannel experiences, and consumers are comfortable shopping online.
The value of the US home furniture market as of 2019.
More than three quarters of revenue comes from online sales
Several apps already allow consumers to visualize furniture in AR. IKEA's Place app was successful, but that success hasn't translated into widespread adoption of AR. Other than Houzz, there are few examples of AR-enabled apps that offer furniture from multiple manufacturers and designers.
Although home decor maintains a solid presence on social media, few of the larger manufacturers have meaningful connections from their sites. DecorMatters gives users the opportunity to be their own decorators, but with 2D tools rather than AR.
Interviews yielded rich qualitative data about both what participants said and what they did across complex purchase journeys. Notably, participants were generally renting apartments and weighing purchase decisions in terms of future housing plans. Some displayed more interest in interior design, while others saw furniture as a practical necessity with the potential to improve their home life. All commented on the dizzying array of furniture buying choices.
I interviewed six participants from the millennial demographic.
Average interview length
I used affinity mapping to find insights in the interview data I gathered. Two general patterns of purchasing behavior, motivations, frustrations and needs emerged, corresponding with the personas shown below.
Enthusiast customers use social media and other digital resources to keep up with design trends outside of more focused furniture searches. In developing a sense of style, they also valued seeing furniture in the context of decorated rooms rather than on its own.
This led to the collections feature, which puts human-curated sets of furniture at the center of the app, headed by photo-real renderings of the furniture together in a space.
All participants used measurement and other visualization techniques like drawing or taping to judge fit. This was most pronounced when shopping for multiple furniture items. More engaged users also highlighted that fit was about the relationship of the furniture's scale to the room, indicating the creative dimension of fit. Overhangs, overlaps, and pinched spaces were also hard to judge without 3D representations.
These observations suggested that features supporting easy visualization of multiple items in AR were important.
All users lamented the overwhelming number of choices in buying furniture online. A curated approach to the overall catalog was clearly welcome, even if different personas might browse it in different ways.
Users infrequently bought furniture on first sight (or sit). Once a specific need was identified, most would buy within a month or two, sometimes less. However, many were also putting off dream purchases because of constant mobility due to work or family. Those most enthusiastic about interior design would bookmark items for purchases years away.
This underlined that bookmarking would be important, and would need to support saving multiple items.
A user like Emma seemed both underserved by current products and the most likely to engage, so she became the primary persona. However, I realized that both personas could be served with relatively similar features.
Users matching the Emma persona would likely use the collections to do a broader search for new items, while more casual shoppers would likely search for more specific items like “couch”.
Based on these user journeys, I created a mid-fidelity clickable prototype to test with users who matched the target demographic.
Patterns borrowed from curated shopping apps helped users get to Heem’s collections more quickly.
I chose a final design that mixed the efficiency (and discoverability) of carousels on mobile with the impact of large images showcasing the collections.
A search bar and category-based browsing allowed users like Ken to do targeted and utilitarian browsing based on room or item type.
Initially, I explored tabs as a way to present options for selecting different categories of furniture to add to the scene. I opted instead to adapt the image carousel found in popular consumer AR-enabled apps like Instagram and Snapchat, adding tabs above it instead of buttons.
Some UI references for the carousel design
AR prototyping can be complicated. Fortunately, core user tasks could mostly be tested without moving objects in 3D space. Usability testing with a mid-fidelity Figma prototype helped validate navigation, icon choice, and other UI design details without any motion.
Although I didn’t test with high-fidelity wireframes, I created them as a way to better articulate the concept with branding and better graphics. Ideally, these could be applied to the next iteration of clickable prototypes.
Testing with participants at home could also shed light on challenges for type legibility over different backgrounds.
I tested the mid-fidelity prototype with five participants in the user demographic using remote, moderated testing. Users were able to complete most tasks, but sometimes struggled in illuminating ways.
Users depended heavily on copy to decipher unfamiliar icons. Labeling the “AR” icon in the bottom nav did wonders: every user understood the view-in-AR icon because of that label.
To make viewing multiple items in AR as seamless as possible, I created this feature to let users add items without leaving the AR view. Users displayed some confusion about the meaning of “similar,” although all were able to intuit that it would add some kind of item. Based on feedback, I changed the copy to the more general but accurate “add item.” Users did not seem to need to know that this add button would present collections and matching items first in order to use it.
Users displayed a little confusion about the meaning of sets and collections, and about the fact that sets were accessible under favorites. However, this only prevented one user from finding them. When looking for an item they had previously saved to a set, most users assumed that items saved to a set would also appear in favorites.
To make this more intuitive, I renamed favorites in the bottom nav to “Saved Items.”
A suite of machine learning algorithms enables experiences like those found in eBay, Etsy, and other online shopping tools where users can find related items based on camera images. Real-time classification and segmentation models can locate known items within an image, as background blur tools do within video conferencing software like Zoom. How could this technology make an AR shopping experience even more intuitive and immersive?
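To make the matching idea concrete, here is a minimal sketch of the catalog-matching step, assuming detection output (labels plus confidence scores) comes from an off-the-shelf on-device model; the catalog contents and function names are hypothetical, not part of the actual Heem design.

```python
# Hypothetical sketch: map objects detected in a camera frame to catalog
# items by label. In a real app, "detections" would come from an on-device
# classification/segmentation model; here they are hard-coded for illustration.

CATALOG = {
    "sofa":  ["Alvar 2-seat sofa", "Nomad sectional"],
    "lamp":  ["Orb floor lamp"],
    "table": ["Slab coffee table"],
}

def match_detections(detections, min_confidence=0.6):
    """Return catalog matches for each confidently detected, known object."""
    matches = {}
    for det in detections:
        if det["confidence"] >= min_confidence and det["label"] in CATALOG:
            matches[det["label"]] = CATALOG[det["label"]]
    return matches

frame_detections = [
    {"label": "sofa", "confidence": 0.92},   # known item, confident -> matched
    {"label": "plant", "confidence": 0.80},  # not in catalog -> skipped
    {"label": "lamp", "confidence": 0.41},   # below threshold -> skipped
]
print(match_detections(frame_detections))
```

The confidence threshold matters in practice: surfacing low-confidence matches in the viewport would undermine the "instant" feel the feature is meant to create.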
I created the following frames to sketch out an auto-matching feature, which would allow users to instantly find matching furniture for their room without leaving the 3d viewport.
The above designs take a seemingly straightforward approach where users can intuitively tap on identified objects to see matching ones from the catalog.
However, this can require users who are already holding the phone with one hand (as many naturally do in AR) to shift hand positions, moving the on-screen targets, as they raise their other hand to tap on the items. Although hard to describe, you've likely experienced this if you've ever scanned a QR code on an iPhone.
How could this experience avoid shifting hand positions after scanning around a room?
In this design, users can fully control the experience with their thumb as they hold the phone upright. As objects are detected in the room, they populate the swipeable carousel. Maintaining consistency with other AR UI elements, the center icon is tappable, activating the object placement screen with a set of matched objects.