UX Architect
Workload Replay

I led the design of this product from concept to delivery. My user research and prototyping defined the product's scope and produced an intuitive design. The product was an immediate success in the marketplace, and customers praised its ease of use.

Process and Results

1. Identified business problems

Identifying the business problem and customer pain points is the first step in my user research. For OWR, I used contextual interviews and scenario walkthroughs to gather input, then worked with product management to articulate and validate the value proposition. The chart below shows customer validation of the importance of the pain points listed to the right: we were addressing problems that significantly impacted their business.

2. Conducted user research

Through user research, I produced a set of use cases that I later validated with additional customers. Beyond the importance ratings (chart below), the interviews gave insight into the goals users hoped to achieve. Both the ratings and the goal insights drove later design direction and details; one such insight led us to seamlessly integrate a data-cloning tool.

3. Developed user scenarios

I used visualizations of user scenarios to communicate our design thinking and validate our process model with users. As the charts below show, I look at the detailed ratings and correlate them with specific user comments. Why do some users rate a scenario high while others rate it low? The answers yield key design insights, like those we gained from the responses to the data privacy question.

4. Designed scenario-based mockups

Based on the scenarios and use cases generated in the user research, I designed scenario-based mockups and iteratively reviewed them with users. The final scenario-based reports, for example, were highly rated, and beta customers praised their value.

5. Conducted usability tests

One of our goals for the release was for users to be up and running (install, configure, capture, replay, and report) in three hours or less. As the charts here show, it took a few rounds of testing and fixes to meet that goal for the S-Tap install and configuration.

6. Performed complexity analysis

Complexity analysis is a tool developed by IBM colleagues that we use for a type of quantitative heuristic evaluation. We count factors such as how many inputs are required, how many new concepts are introduced, and how often the user must shift context (e.g., moving from a GUI to a command line to finish a task). Summing these counts yields a complexity score for the task, and we then look for ways to reduce it. In this instance, we reduced the complexity of the Cross Network Replay task by 80%.
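The scoring idea above can be sketched in a few lines of Python. The factors, weights, and before/after counts here are illustrative assumptions, not IBM's published method; the point is only the shape of the calculation: weight each counted factor, sum the weighted counts into one score per task, and compare designs by percent reduction.

```python
from dataclasses import dataclass

@dataclass
class TaskMeasure:
    """Counts gathered while walking through one task, step by step."""
    inputs_required: int   # values the user must supply
    new_concepts: int      # concepts introduced for the first time
    context_shifts: int    # e.g., GUI -> command line -> GUI

# Illustrative weights (assumed for this sketch).
WEIGHTS = {"inputs_required": 1, "new_concepts": 2, "context_shifts": 3}

def complexity(m: TaskMeasure) -> int:
    """Sum the weighted factor counts into a single complexity score."""
    return sum(w * getattr(m, name) for name, w in WEIGHTS.items())

def reduction(before: TaskMeasure, after: TaskMeasure) -> float:
    """Percent reduction in complexity between two designs of a task."""
    b, a = complexity(before), complexity(after)
    return 100.0 * (b - a) / b

# Hypothetical counts for a task like Cross Network Replay.
before = TaskMeasure(inputs_required=20, new_concepts=6, context_shifts=6)
after = TaskMeasure(inputs_required=4, new_concepts=3, context_shifts=0)
print(f"{reduction(before, after):.0f}% reduction")  # prints "80% reduction"
```

A weighted sum keeps the evaluation cheap and repeatable: two evaluators walking the same task should arrive at the same counts, which is what makes this a quantitative complement to a traditional heuristic review.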

7. Gathered quantitative feedback

8. Validated final design with users