Protect Texas Together Usability Study

This usability study was conducted to evaluate the University of Texas at Austin’s Protect Texas Together mobile app and to recommend improvements based on usability testing and user feedback.

Objectives and Goals

Key Objectives

  • Location Privacy – Investigate perspectives on location privacy features and the importance of location privacy
  • Features – Gather thoughts about using the app’s features on a daily basis and opportunities to encourage daily engagement
  • Contact Tracing – Explore thoughts on contact tracing and gain deeper insight into how it could be implemented
  • Data & Resources – Learn what value the resources section provides to users and which pieces of information they find most useful

Methodology

Our Process

Evaluate

Searching for pain

We searched for key pain points to focus on via a heuristic evaluation of the existing designs.

Analyze

Seeing what others do

We performed a comparative analysis of both direct and indirect competitors within the space

Interview

Collecting people’s thoughts

After identifying areas to investigate, we interviewed a variety of users and tested mockups with them for usability.

Synthesize

Creating answers

Based on our research, we synthesized our findings and turned them into recommendations.

The Heuristic Evaluation

Evaluation scale

For our evaluation, we chose a scale from one to five to rate usability issues throughout the application. This scale allowed us to distinguish issues of varying severity across the evaluation.

  • 1 – Positive application of heuristic principles; no solution needed
  • 2 – Cosmetic problem with a low-priority solution
  • 3 – Minor usability problem with a low-priority solution
  • 4 – Major usability problem with a high-priority solution
  • 5 – Critical usability problem that must be fixed as soon as possible
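
Purely to illustrate how ratings on this scale might be recorded, the sketch below models it as a small TypeScript type; the Severity and Finding names are hypothetical and not the team’s actual tooling.

    // Hypothetical model of the one-to-five severity scale described above.
    type Severity = 1 | 2 | 3 | 4 | 5;

    const severityLabels: Record<Severity, string> = {
      1: 'Positive application of heuristic principles; no solution needed',
      2: 'Cosmetic problem with a low-priority solution',
      3: 'Minor usability problem with a low-priority solution',
      4: 'Major usability problem with a high-priority solution',
      5: 'Critical usability problem that must be fixed as soon as possible',
    };

    // A single finding ties a screen and a heuristic to a severity rating.
    interface Finding {
      screen: string;     // e.g. "Log book"
      heuristic: string;  // e.g. "Visibility of system status"
      severity: Severity;
      note: string;
    }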

Heuristics used

We based our heuristic evaluation on the Nielsen Norman Group’s 10 usability heuristics for user interface design and added notes about accessibility to the evaluation.

Heuristic Definitions

Visibility of system status

The system should always keep users informed about what is going on, through appropriate feedback within reasonable time.

Match between system and the real world

The system should speak the users’ language, with words, phrases and concepts familiar to the user, rather than system-oriented terms. Follow real-world conventions, making information appear in a natural and logical order.

User control and freedom

Users often choose system functions by mistake and will need a clearly marked “emergency exit” to leave the unwanted state without having to go through an extended dialogue. Support undo and redo.

Consistency and standards

Users should not have to wonder whether different words, situations, or actions mean the same thing. Follow platform conventions.

Error prevention

Even better than good error messages is a careful design which prevents a problem from occurring in the first place. Either eliminate error-prone conditions or check for them and present users with a confirmation option before they commit to the action.

Recognition rather than recall

Minimize the user’s memory load by making objects, actions, and options visible. The user should not have to remember information from one part of the dialogue to another. Instructions for use of the system should be visible or easily retrievable whenever appropriate.

Flexibility and efficiency of use

Accelerators — unseen by the novice user — may often speed up the interaction for the expert user such that the system can cater to both inexperienced and experienced users. Allow users to tailor frequent actions.

Aesthetic and minimalist design

Dialogues should not contain information which is irrelevant or rarely needed. Every extra unit of information in a dialogue competes with the relevant units of information and diminishes their relative visibility.

Help users recognize, diagnose, and recover from errors

Error messages should be expressed in plain language (no codes), precisely indicate the problem, and constructively suggest a solution.

Help and documentation

Even though it is better if the system can be used without documentation, it may be necessary to provide help and documentation. Any such information should be easy to search, focused on the user’s task, list concrete steps to be carried out, and not be too large.

What we found

Through our evaluation, we found 1 blocker, 15 high-severity, 14 medium, and 11 cosmetic problems in the application.

The Competitive Analysis

We compared direct competitors and market-adjacent applications to identify what the UT HornSense app does well and where it needs to improve. We rated suggested changes on a scale ranging from minimally important changes to critical features.

What we found

We identified five major features that need to be addressed, but none that block the deployment of the application completely.

Market Contact Tracing

The Austrian COVID application does a good job of breaking down what contact tracing is and gives it a memorable marketing term.

Inform Users

The NHS Covid app shows a screen summarizing how the app works and the user’s role in it, and it keeps users informed of every step and process in the app.

Quick Logging

The current log book within the application is convoluted and hard to use. We should simplify it and create a way for people to quickly log their locations or tests.

Exposure Risk Information

Show users potential exposure risks based on previously logged locations where they may have been exposed.
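
Purely as an illustration of this recommendation, the sketch below shows one way an exposure-risk check could work: comparing a user’s logged visits against reported exposures at the same place within a time window. The LocationLog and ExposureReport types, the findExposureRisks helper, and the two-hour window are all hypothetical, not part of the existing app.

    // Hypothetical types for illustration; the real app's data model may differ.
    interface LocationLog {
      placeId: string;   // e.g. a campus building code
      visitedAt: Date;
    }

    interface ExposureReport {
      placeId: string;
      reportedAt: Date;  // time the reported exposure is believed to have occurred
    }

    // Flag any logged visit that falls within a configurable window (two hours here)
    // of a reported exposure at the same place.
    function findExposureRisks(
      logs: LocationLog[],
      reports: ExposureReport[],
      windowMs: number = 2 * 60 * 60 * 1000
    ): LocationLog[] {
      return logs.filter((log) =>
        reports.some(
          (report) =>
            report.placeId === log.placeId &&
            Math.abs(report.reportedAt.getTime() - log.visitedAt.getTime()) <= windowMs
        )
      );
    }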

User Interviews

Interview Screening

For our usability testing, we needed to recruit people who would have the highest probability of needing to use the Protect Texas Together application.

Who we screened for

  • Undergrads: 5
  • Graduate Students: 5
  • Covid Aware: 70%
  • Location: Austin, TX

Who we interviewed

  • Undergrads: 5
  • Graduate Students: 3
  • Non-UT Grad Students: 5
  • Total: 13 participants

Testing Format

1. Interview Medium

One-on-one recorded interviews held as video calls over Zoom.

2. Time

We limited each interview to a roughly 35–45 minute block, depending on how the conversation flowed.

3. Questionnaires

We asked each participant to complete both a pre-test and a post-test questionnaire.

4. Scripting

We used a general script for consistency across interviews but allowed ourselves to deviate when a conversation led to valuable insights.

Detailed Task Findings

We used card sorting, affinity diagramming, and insight-extraction techniques to organize the interview data at the task level.

Recommendations

Privacy and Location

Problems

  • Users found location information too invasive and did not want UT to have this data about them
  • Users found it confusing and difficult to input their location in the app

Proposed Solutions

  • Use the onboarding process to clearly and concisely explain why UT needs location information and how it will benefit the user
  • Simplify the location input screen to either collect the user’s location automatically or make the location input a single tap, as sketched below
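
A minimal sketch of the one-tap idea, assuming the app could rely on the standard Geolocation API (navigator.geolocation); the logCurrentLocation helper and the saveLocation callback are hypothetical placeholders, not the app’s real code.

    // Hypothetical one-tap location logging using the standard Geolocation API.
    // saveLocation stands in for whatever the app uses to persist a log entry.
    function logCurrentLocation(saveLocation: (lat: number, lon: number) => void): void {
      if (!('geolocation' in navigator)) {
        console.warn('Geolocation is not available on this device.');
        return;
      }
      navigator.geolocation.getCurrentPosition(
        (position) => {
          // One tap: coordinates are logged directly, with nothing else to fill in.
          saveLocation(position.coords.latitude, position.coords.longitude);
        },
        (error) => {
          // If the user declines the permission prompt, fall back to manual entry.
          console.warn('Could not read location:', error.message);
        },
        { enableHighAccuracy: false, timeout: 10_000 }
      );
    }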

User Input Survey

Problems

  • Users found this process tedious and would not complete the action without a mandate from the school
  • Users found that many questions asked in the survey should be answered by the school rather than the user

Proposed Solutions

  • Consider an offline process run by the school in which a student cannot get on campus without completing the survey
  • Consider adding an incentive program where students could get discounts on school products by filling out the survey
  • Users found the “Resources” page useful. Consider keeping that page locked until students fill out the survey
  • Consider advertising and media pages that teach users about the usefulness of the surveys before they start using the app
  • Sync the information UT already has through each UT eID so that the user has to fill out as little information as possible

Usefulness of Resources

Problems

  • Users found the Resources page useful but questioned the validity of the information
  • Users want more information regarding Covid statistics

Proposed Solutions

  • Clearly explain to the user the sources of the information in Resources
  • Give users the ability to drill down into specific case increases by location or by college within UT
  • Collaborate with other organizations making similar applications so that COVID statistics can be shared and users can see information about other areas
  • Allow users to compare their town or school to others doing the same thing

Additional Resources