Design Process - Ask Questions, Define Hypothesis & Measure Results
To build a user-centric product, we introduced a design framework learned from Julie Zhuo that comprises three major components: explore, define, and measure. Throughout the project, we constantly explored the problem space via business meetings and interviews, defined our hypotheses, and measured our outcomes using data. The goal of each component is as follows:
a. Explore - What people problem are we solving? The keyword here is people. We wanted to ensure we keep the Foodbuy analysts at the center of our project at all times. Asking this question frequently ensured we kept our assumptions in check and aligned as a team.
b. Define - How do we know this is a real problem? The keyword to focus on here is real. To formulate our hypothesis, we held regular business meetings with all our stakeholders and gathered both qualitative and quantitative data. This ensured our team was solving the right problem.
c. Measure - How will we know if we’ve solved the problem? In order to select the best solution under a given set of constraints, we set metrics that allowed us to compare the effectiveness of each solution.
Research - What We Learnt About Our Users
Extensive research was conducted during the initial few weeks of the project, primarily led by the managers, to understand the needs and operating circumstances of our clients. Some of the UX methods we used to better understand the intent behind the answers provided were the Five Whys and the Abstraction Ladder. Given the large scale of the project, these meetings and interviews were conducted on-site as moderated sessions. After analyzing and synthesizing the gathered data, three key shortcomings of the legacy system stood out:
a. Inflexible Design Reduced Productivity - The legacy analytics tool was the primary breadwinner for the company and was used extensively by five internal teams within Foodbuy. There were multiple use cases and user types within each team, so the application needed to serve different user groups while remaining flexible.
b. Broken End-to-End Workflows Reduced Effectiveness - The legacy system did not provide a mechanism for teams to collaborate, which caused nuanced information to slip through email exchanges. The new tool needed to support collaboration between the five internal teams.
c. Information Density Caused Frustration - As a financial tool holding information on millions of contracts, the application was dense with data. Furthermore, the right information wasn't shown at the right time, which was a major cause of frustration among analysts and team leads.
Research In Action - Design Principles
To be intentional in our design, we synthesized our research into design principles that we used throughout the project to address the needs and circumstances of our users. These principles were intentionally kept abstract so we could apply them to any design decision. We had three design principles for this project:
a. Tools For Professionals - We knew our users would be working in this professional tool for eight hours a day, so we wanted to leverage muscle memory in our design. Creating repeatable patterns and placing certain types of actions in consistent locations made them easier for users to find; as a result, we increased their efficiency.
b. Robust & Reliable - Given the tool's extensive use and its many user types and use cases, we considered scalability in every decision, ensuring each design choice could grow to meet the needs of our users.
c. Put Right Information In Right Step - Given our users' frustration, we leveraged the power of defaults to make sure we provided the right information at the right step. This allowed us to tackle the information density problem and significantly reduce user frustration.
Each design cycle comprised whiteboarding and low-fidelity prototypes that supported testing workflows at a high level. Each design critique session revealed improvements, which were then carried forward to a high-fidelity prototyping session. Below is a collection of sample workflows refined over multiple design iterations.
To present the right information at the right time, we emphasized interaction design early in our process. Through micro-interactions, we crafted an experience that presented essential information optimally. The use of motion and meaningful transitions helped our users understand the application. We gathered this feedback using the think-aloud method.
Testing Design Solutions - AB Tests
With each design critique session, we improved our solutions. To measure the relative performance and effectiveness of our designs, we used Hotjar to run multiple versions simultaneously and gathered quantitative results for each design solution.
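Hotjar reports the behavioral numbers for each variant; deciding whether the difference between two variants is real (and not noise) is typically done with a standard two-proportion z-test. Below is a minimal sketch of that test in Python; the function name and the task-completion counts are purely illustrative assumptions, not Foodbuy data or Hotjar output.

```python
from math import sqrt, erf

def two_proportion_z_test(success_a: int, n_a: int,
                          success_b: int, n_b: int) -> tuple[float, float]:
    """Two-sided two-proportion z-test.

    Compares conversion/completion rates of variants A and B.
    Returns (z_statistic, p_value).
    """
    p_a, p_b = success_a / n_a, success_b / n_b
    # Pooled proportion under the null hypothesis (rates are equal)
    p_pool = (success_a + success_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF (via erf)
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical counts: variant A 48/120 task completions, variant B 72/115
z, p = two_proportion_z_test(48, 120, 72, 115)
print(f"z = {z:.2f}, p = {p:.4f}")
```

With these illustrative numbers, the small p-value would indicate that variant B's higher completion rate is statistically significant, so it would be the design carried forward.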
a. Evaluating Design Via Communication Effectiveness - I learned that a decent proxy for measuring design is its effectiveness in internal communication. As designers, our fundamental tool is empathy, and its application should first happen internally before moving toward our users. We are responsible for articulating the north star to our engineers and product managers. Once that is done effectively, there is internal alignment among teams, and design solutions are intentional and fulfill their true responsibility.
b. Overvaluing Simplicity and Style At The Cost of Clarity - As designers, we are attracted to motion design and sleek interactions. These sleek interactions weren't received well by our users because they couldn't understand their affordances. I realized this early, during our first testing session, and learned the importance of using clear labels rather than obscure icons even at the cost of some visual appeal. By doing this, we significantly increased the application's usability.