# 2020, Summer - Understanding the OFA Experience

Issue #113

## What We Wanted to Learn

### Goals

We wanted to…

1. Determine OFA's view of "Best Practices" regarding TANF report submission from STTs.
2. Identify which errors and/or omissions most frequently lead to resubmission and manual data correction efforts.
3. Evaluate the usability of core Admin UI tasks.

### Expected Outcomes

We expected to learn:

* Best practices, from OFA's perspective, for TANF report creation and submission.
* Which data points cause poor data quality, and how that is mitigated through resubmission.
* STT pain points and frustrations when creating and resubmitting TANF reports, from OFA's perspective.
* Whether the admin user interface meets expectations.

The full research plan can be found here 🔒.

## Who We Talked To

We wanted to talk to OFA analysts who had well-rounded experience with and exposure to a variety of administrative tasks and data submission scenarios. Specifically, we worked with the TDP Product Manager to recruit members of the OFA data team who…

* Could authoritatively speak to the process of receiving TANF data submissions from STTs.
* Worked regularly with STTs to resubmit TANF data as a result of data errors and quality issues.
* Had experience with territories and tribes in addition to states.

We interviewed three OFA Analysts in total.

## What We Did

### Interviews

We conducted three 90-minute remote interviews with OFA Analysts.

* The first 60-70 minutes were used for Q&A.
* The remaining time was used for concept testing with initial admin UI mockups.
* Two of the three interviews were recorded to fill out notes. (Participants had the opportunity to opt out of recording their sessions.)
* The recordings were password-protected, then made available to the user research team to augment notes.
* As soon as notes were augmented, the videos were deleted.

### OFA Synthesis Workshop

After drafting a top-level synthesis of our observations, we conducted a workshop with the team members who participated in the interviews to expand on that synthesis, co-create insights based on the observations, and brainstorm project impacts.

### Artifacts

The full conversation guide can be found here 🔒.

The conceptual prototype of the admin UI can be found here.

The output of the synthesis workshop can be found here 🔒.

## What We Learned

### Summary of Insights and Impacts

This summary gives the reader quick access to top-level findings and the recommended actions based on those findings. More detail about these findings and the supporting data behind each observation/insight can be found in the full synthesis document here 🔒.

#### Best Practices – Report Submission

Observations / Insights:

* Alerting OFA when STTs submit data would likely reduce ambiguity and repetitive manual communication, increasing OFA efficiency.
* If "fatal errors" were more informative at the outset, Analysts would need to spend less time investigating.
* STTs that have more capacity (e.g., resources, funds, tools) are more likely to fix data errors.
* Updated training docs (for how to help STTs) would reduce OFA time/effort when assisting STTs.

Project Impacts – Potential TDRS Features:

* Consider including notification functionality for OFA when STTs submit their data (see the sketch below).
* Consider including logic that helps explain the nature of fatal errors.
* Though the quality and quantity of training material is not in the scope of the TDRS project, we may want to consider adding a way to access training resources from the OFA admin view.
* Consider the learning curve of the new system we're designing and building, and avoid introducing new "hurdles" from a training standpoint.

Project Impacts – Future Research:

* Interview STTs who represent different levels of engagement to gain a deeper understanding of what engagement is and what leads to data errors, and to validate engagement hypotheses (e.g., more engagement = fewer errors).
* Look for opportunities for the TDRS project to reduce errors for low-engagement STTs.
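To make the notification idea concrete, here is a minimal sketch in Python. Everything in it (the `Submission` shape, the analyst-assignment mapping, the `notify_ofa` hook) is a hypothetical stand-in rather than TDRS code; it only illustrates the kind of alert that could replace today's repetitive manual check-ins.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

# Hypothetical assignment of STTs to the OFA analysts who cover them.
ANALYST_EMAILS = {
    "Alaska": ["analyst.one@example.gov"],
    "Guam": ["analyst.two@example.gov"],
}

@dataclass
class Submission:
    stt_name: str       # the state, tribe, or territory that submitted
    file_name: str
    submitted_at: datetime

def notify_ofa(submission: Submission, send=print) -> None:
    """Alert the covering analysts that new data has arrived.

    `send` stands in for whatever delivery mechanism is chosen
    (email, in-app notification, etc.).
    """
    message = (
        f"{submission.stt_name} submitted {submission.file_name} "
        f"at {submission.submitted_at:%Y-%m-%d %H:%M} UTC"
    )
    for analyst in ANALYST_EMAILS.get(submission.stt_name, []):
        send(f"To {analyst}: {message}")

notify_ofa(Submission("Alaska", "section1_fy2020_q4.txt", datetime.now(timezone.utc)))
```

Because `send` is injected, the same hook could drive email, in-app alerts, or both once a delivery channel is chosen.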
#### Data Points That Cause Poor Data Quality

Observations / Insights:

* File names, headers, and trailers are often problematic.
* Larger reports (with lots of data) present the most opportunities for error.
* STTs store data differently from one another, which leads to errors in coding.
* OFA analysts cite inconsistent errors from the system.

Project Impacts – Potential TDRS Features:

* Consider self-service features for helping STTs understand, diagnose, and fix their own errors, as a way to mitigate the risk of errors due to STT data storage differences, larger reports, file names, headers, and trailers (see the sketch below).

Project Impacts – Future Research:

* Make sure tribes are represented in STT research to understand specific differences in submission.
* Investigate the nature of inconsistent errors.
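As one way to picture such a self-service check, here is a minimal sketch. It assumes a transmission-file layout in which the first line is a HEADER record and the last line is a TRAILER record; that layout, the function name, and the messages are all our assumptions for illustration, not TDRS behavior.

```python
def explain_file_problems(lines: list[str]) -> list[str]:
    """Return plain-language explanations instead of bare error codes."""
    problems = []
    if not lines:
        return ["The file is empty."]
    # Assumed layout: files start with a HEADER record...
    if not lines[0].startswith("HEADER"):
        problems.append(
            "The first line should be a HEADER record, but it begins with "
            f"{lines[0][:20]!r}. Check that the file was exported in the "
            "expected format and was not truncated."
        )
    # ...and end with a TRAILER record.
    if not lines[-1].startswith("TRAILER"):
        problems.append(
            "The last line should be a TRAILER record. Large files are "
            "especially prone to losing their trailer when split or hand-edited."
        )
    return problems

# Example: a file that lost its trailer.
print(explain_file_problems(["HEADER...", "record 1", "record 2"]))
```

A check like this could turn an opaque "fatal error" into an explanation an STT can act on without a call to OFA, and the same pattern extends to file names and record counts.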
#### Perceptions of STT Pain Points

Observations / Insights:

* No feedback mechanisms exist to inform STTs about the status of their submission.
* High turnover in some STTs leads to higher error rates.
* STTs lack training and resources.

Project Impacts – Potential TDRS Features:

* Consider including notification functionality for STT data submission, such as submission status and data errors.
* Though the quality and quantity of training material is not in the scope of the TDRS project, we may want to consider adding a way to access training resources from the STT view.

Project Impacts – Future Research:

* Validate OFA perceptions during STT research.
#### Perceptions of the Admin User Interface

Observations / Insights:

* The columns/data for each user seem adequate.
* The role of "Data Prepper" seems too generic.
* The list of users was seen as potentially helpful for communication purposes.
* There is a need to be able to filter the information in the admin screens in different ways.

Project Impacts – Potential TDRS Features:

* Maintain the current fundamental structure and organization of the screen design.
* Represent nuances to the Data Prepper role.
* Consider including contact mechanisms for each user.
* Include mechanisms for viewing data by region as well as by individual STT (see the sketch below).

Project Impacts – Future Research:

* Investigate nuances to the Data Prepper role.
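The snippet below sketches what a region-level pivot might look like. The record shape and sample data are invented for illustration; the takeaway is that if each user or STT row carries its ACF region, the admin screens can filter on either the region or the individual STT.

```python
from collections import defaultdict

# Invented sample rows; in practice these would come from the admin
# view's data source.
users = [
    {"name": "A. Analyst", "stt": "Maine", "region": 1},
    {"name": "B. Prepper", "stt": "Vermont", "region": 1},
    {"name": "C. Prepper", "stt": "Texas", "region": 6},
]

def by_region(records):
    """Group rows by ACF region so the admin view can filter on it."""
    grouped = defaultdict(list)
    for record in records:
        grouped[record["region"]].append(record)
    return dict(grouped)

print(by_region(users)[1])  # both Region 1 rows
```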

## Next Steps

Based on these observations, insights, and impacts, we have added the following issues to the backlog.

Additionally, our plan is to add issues to the product backlog to explore the following:

* Notification functionality for OFA when STTs submit their data
* Logic that helps explain the nature of fatal errors to OFA analysts
* A way to access training resources from the OFA admin view
* Self-service features for helping STTs understand, diagnose, and fix their own errors
* Notification functionality for STT data submission, such as submission status and errors
* A way to access training resources from the STT view
* Representing nuances to the Data Prepper role in the UI
* Contact mechanisms for each user in the OFA Admin view
* Mechanisms for viewing data by region as well as by individual STT