Real Time Mapping

Initially, I was only tasked with designing an animation for the empty state of the home screen that would be replaced after a map was generated.

The "North Star" cleaning run experience that we designed included a Real Time Mapping feature; the robot would show its location and generate a map in real time during the first cleaning run.

This was quickly ruled out as out of scope for release. The absence of this content created a new challenge: designing a placeholder animation to use until Real Time Mapping was in scope.
A GIF of a map being generated by a robot vacuum in an app UI. The UI shows the transition from pause to play and shows that the robot's name is Roberto Ostinelli. The robot is on its first cleaning run.
A high-level screenshot of two tables, one for "before clean" and one for "during clean." The tables' cells are colored yellow, green, or red to indicate whether a feature is available in the app.
Below are before- and during-cleaning screenshots from the mobile apps of Roborock, Ecovacs, iRobot, and Neato Robotics.

Competitor analysis

With two problems to account for, I started analyzing our competitors. What did we use in our old app? Who had Real Time Mapping? What did our competitors show when they didn't have Real Time Mapping? Were those placeholder videos valuable to their users?

Two of our competitors had Real Time Mapping; our old app and iRobot did not. Overall, it seemed that the information provided before and during cleans without Real Time Mapping was not very interesting or informative to the user. This looked like an opportunity.

V1: The Interactive Approach

Real Time Mapping is in itself a unique interactive experience. You can watch your whole house be built on a screen by a tiny robot! It's cool! It's interesting! And we didn't have it. How can we still provide this exciting experience?
After brainstorming for a while, I pitched the idea of a minigame. I had game design experience, and a small endless runner that served cleaning tips to the user would be simple to code.
Five mobile screens present a gamified approach to cleaning:

Screen 1 (left): Shows a robot in cleaning mode with a "Pause" button.
Screen 2 (middle): Displays a countdown with instructions to "Tilt your phone to pick up all the floor filth."
Screens 3-5 (right): Show the gameplay, with the robot picking up various small objects on the floor like food crumbs and toy pieces.
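The tilt mechanic shown in those screens could be sketched roughly as below. This is a hypothetical illustration, not the shipped prototype: the function names, the 45-degree tilt range, and the speed constant are all assumptions.

```typescript
// Hypothetical sketch of the endless-runner tilt mechanic.
// The phone's left/right tilt angle (degrees, as a DeviceOrientation
// "gamma"-style value) is mapped to the robot's horizontal speed,
// and a 1-D overlap test decides when a piece of floor filth is picked up.

// Clamp tilt to a playable range (±45° assumed) and scale to pixels per frame.
function tiltToVelocity(gammaDeg: number, maxSpeed = 8): number {
  const clamped = Math.max(-45, Math.min(45, gammaDeg));
  return (clamped / 45) * maxSpeed;
}

// The robot collects an item when their horizontal spans overlap.
function pickedUp(
  robotX: number,
  robotWidth: number,
  itemX: number,
  itemWidth: number
): boolean {
  return robotX < itemX + itemWidth && itemX < robotX + robotWidth;
}
```

A game loop would read the device orientation each frame, advance the robot by `tiltToVelocity(gamma)`, and surface a cleaning tip whenever `pickedUp` fires.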

V2: The Informational Approach

Okay, maybe a game isn't the best idea. Back to square one. I went through endless iterations trying to create a perfect cleaning run video that provided informational messaging to the user. We can at least make the wait time of a cleaning run valuable, right?

Next, it was time to decide what would play during an actual cleaning run. Unlike the default screen, this video would be shown during every cleaning run until Real Time Mapping was implemented. The team decided to use this time as an opportunity to give users helpful hints.
This image shows a sequence of six mobile screens documenting the robot's mapping process:

Top row: Starts with the robot generating the first map, with various stages of map visualization, from a basic outline to a more detailed room map.
Bottom row: Displays close-up views of the robot scanning furniture and objects, eventually generating a full map of the space.

User Testing

We finally nailed down a design and animation. Time to user test. We integrated testing of the default and cleaning animations into our July user testing.
We asked our participants what their first impressions of the default screen were. The animation was successful; all 5 participants understood that objects needed to be picked up before a cleaning run.
Full User Test
User Feedback and Clean Start Screen
A mobile screen on the left shows a prompt to "Pick up small objects and loose cords before starting a clean," with a robot image and a "Start" button below. To the right is a user comment from "Techie" explaining that, based on personal experience, they would pick up larger objects like chairs to ensure the vacuum gets to where they want it to go.
Unfortunately, the Cleaning Run Video was not as successful. Only 1 of 5 participants noticed the information bubbles throughout the video. The participants understood that the video was not a live feed of the robot mapping, but there was confusion about the fidelity and color of the map, especially when the final map was presented at the end of the cleaning run.
Screen 1 (left): A "Generating map" screen shows the progress of the robot creating a map, with some areas colored in red and orange. A "Pause" button is available at the bottom.
Screen 2 (right): Displays the completed robot map, now more detailed, with user comments on the side. The comments mention assumptions that the vacuum is running around the room, confusion about pixelation, and how the generated map looks very different from the animation.

Final Revisions

I reviewed the user study feedback with stakeholders. The informational content was completely overlooked by participants and would seem redundant if played during every cleaning run. After some discussion, we decided to revisit the aesthetic of the animations and take a more minimal approach: we would keep the default screen's informational content, but opt for a simple animation of the robot during cleaning runs.