nFocus - Capstone Project



The following project was completed during the senior year of my bachelor’s in Human Centered Design and Engineering as my capstone project. Our project was sponsored by nFocus Solutions and focused on the Report Center tool in their product TraxSolutions. Taking into account our users’ needs and their frustrations with key usability problems in the current product, the redesign aimed to create a more efficient and effective workflow for the user. The project was presented at the HCDE Open House, where it took home the People’s Choice award.

You can view the clickable prototype of the final design at the bottom of the page.

You can watch a video about the project here.



User Research

Methods: Heuristic Evaluations, User Interviews, Usability Testing, Affinity Analysis

Our research question was: What information and reporting tools are most important to satisfy the goals of our nonprofit users?

We defined our users as light, data-entry, and admin-level users at national nonprofits across the United States. Our answer to this question came to center on the inefficiencies users experience and the functionality they need to create a report that is ready to present. Because this was an iterative, “blue sky” design project focused on improving the Trax reporting tool, we expected to reframe our research questions and goals along the way.

We began by evaluating the usability of the product with a heuristic evaluation, following Jakob Nielsen’s 10 Heuristics for User Interface Design. The evaluation covered select pages from the Report Center, such as the pages shown to the left. Next, our team conducted semi-structured interviews with real users of the current product to better understand the users and their context. Participants also completed a short usability test, which helped uncover usability problems occurring during their workflow. Finally, we ran an affinity analysis, consolidating our research and pulling out our main findings.

Our research yielded four overarching key findings:

  • How users think about their organization’s impact

  • Lack of consistency in the Report Center

  • Issues with visibility of system status

  • Need for flexibility in data manipulation to build custom reports

Additional findings included our users’ lack of use of the favorites feature, users participating in training others, and frequent use of nFocus’ customer service, which users rated highly. The combination of heuristic evaluation and user interviews successfully answered our research question and gave us a clear perspective on the users’ experience. Through this process, we defined four design principles: transparency, language, future use, and guidance.

From our user research we also found two key user groups to design for if this redesign was to be successful: more experienced admin-level users, who interact with larger sets of data and assist local users, and more basic users, who work with smaller sets of data and have less expertise with the product. An interesting insight was that while our research participants were fairly expert users of the product, many Trax users turn over in as little as six months and therefore need to learn the system quickly.

After conducting our research, we answered our research question: users need access to a specific variety of metrics about their program participants to guide their programming and satisfy their stakeholders. Their current process is inefficient because the software forces them into workarounds.

Through our user research and conversations with our sponsor about the design scope, we realized that our initial question was too limiting for this stage of the project. A purely usability-driven improvement to the Report Center would constrain our users’ workflows. We therefore revised our research question going forward: How can we determine and evaluate a higher-level system design while keeping our users’ workflows and goals in mind?



Ideation

Methods: Sketching, User Scenarios and User Workflows, User Requirements, Sitemap

Throughout our ideation phase, we laid the groundwork for the design phase through our tasks and deliverables. We first synthesized our user research into four user scenarios that helped us understand our users’ workflows. We then defined a list of user requirements based on our design guidelines, to ensure we kept the functionality our users need and captured some of what they want. Finally, we completed comparative analyses, sketched possibilities, and ideated workflows to create a sitemap describing our higher-level system design.

Our design guidelines that influenced our solution and requirements were:

  • Transparency: We want the user to know and understand the process of generating a report and the outcome of different actions

  • Language: We need to speak the user’s language. The product should use terms that the user is familiar with and that are applicable to their needs

  • Adaptability: We want the user to be able to adapt their reports to their goals and needs while using the product

  • Future Use: We need to design for and understand that generated reports are being used to communicate outcomes and may pass through many hands who will need to understand it

  • Guidance: We need to design to guide the user to complete a task or generate a report. This allows for a standardized process across all users who use the product

From our analysis of the user scenarios, brainstorming sessions, and comparative analysis, we decided to move forward with a dashboard-based design. By combining the dashboard and Report Center pages in Trax, we believe our users will be able to access and visualize their data and reports much more easily than before. On the dashboard, users will be able to see graph visualizations of selected reports, access their saved reports, and search for any reports in the system. The added functionality we believe will be most beneficial includes custom notes on reports, a list of most recent reports, and admin functionality for pushing preset reports and visualizations to branches.


Design and Iteration

Methods: Sketching, Wireframes, Visual Design, Prototyping, Usability Testing, Iteration

Role: Project Manager / Design Lead

Before the design phase started, we met with our sponsor to get feedback on our ideation deliverables as well as deeper information on the current tool. With the new information from our sponsor and some preliminary sketched designs, we iterated upon our current information architecture. With a new information architecture, we went deeper into the design process.

Our design phase focused on the system redesign, informed by our results from the previous phases. We began by sketching different screens, both individually and collaboratively, to explore a variety of options for the user interface. These sketches guided us as we created our wireframes. Once the whole system was wireframed, we assembled a clickable prototype of a core workflow, which we used for usability testing and design iteration. Finally, we created hi-fidelity mockups of key screens to demonstrate our vision for a full product redesign and release.

For our usability test, we used an array of methods to gain valuable user feedback. We designed the prototype to represent a user flow of generating TraxSolutions’ most popular report, the Participant Alpha Roster.

Our testing sessions looked like this:

  • Gave participants a scenario and task that generated the Participant Alpha Roster

  • Asked scripted questions

  • Showed hi-fidelity mockups

  • Asked scripted questions

These questions were based on the participants’ opinions of the screens and workflows and focused on the design goals we developed in the user research phase. We also demonstrated the usability test for our sponsor and received further feedback and suggestions, allowing us to iterate on our protocol before testing was conducted.

Four usability tests were conducted:

  • 2 TraxSolutions Users

    • Previously interviewed during user research

    • Conducted virtually, using online meeting services and screen sharing

  • 2 University of Washington Students

    • No previous experience with Trax

    • Conducted in person, on the UW campus

Our usability tests yielded many results, both positive and negative. On the positive side, participants enjoyed the clean look of the new system, the added functionality that made it more efficient and personalized, and the new organization. We also received feedback on what needed improvement: surfacing functionality that exists in the current system but was not apparent in our design, showing processes in more depth with additional wireframes, and improving the clarity of system status.

Participants were particularly intrigued by the report dashboard, since its design bridges the current system’s report list and dashboard. Most enjoyed having quick access available on the page, but felt those items should be listed before the ‘All Reports’ section, since they were deemed more useful. Based on usability testing and sponsor feedback, we modified the dashboard, mainly reorganizing the items on the page to better reflect user priorities.


Final Design

Review the clickable prototype below: