Exceptional User Experiences
Introduction
We build trust by incorporating user feedback into every decision, ensuring our products are accessible and reliable and that they exceed expectations. When we go above and beyond what the business asks for, we create solutions that are not only functional but also resonate with users.
We monitor performance, adjust based on real-world usage, and communicate openly with users to keep them informed about upcoming changes, ensuring they feel confident in the tools and solutions we provide.
Read the attributes below to access the detailed guidance, tools, and resources for each.

Exceptional User Experiences Attributes:
3.1 We actively seek to understand how all kinds of users experience our products (e.g., through site visits, telemetry, regular usability testing)
OIT expects all product teams to leverage a variety of techniques to understand user experiences.
During initial research and vision creation, teams should consider in-person site visits to gather feedback from users directly. During development, teams should leverage regular usability testing to iteratively shape the creation of new features. Where applicable, teams may also track quantitative user experience telemetry to understand how users engage with their systems.
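Where telemetry applies, even a small amount of structured event capture goes a long way. The sketch below illustrates the idea; the /api/telemetry endpoint, event names, and fields are hypothetical placeholders, not an OIT standard.

```typescript
// A minimal sketch of client-side usage telemetry. The /api/telemetry
// endpoint and event fields are hypothetical placeholders.
interface UsageEvent {
  name: string;                                // e.g., "form_step_completed"
  timestamp: string;                           // ISO-8601
  sessionId: string;                           // anonymized session identifier
  properties: Record<string, string | number>; // event-specific details
}

function trackEvent(event: UsageEvent): void {
  const payload = JSON.stringify(event);
  // sendBeacon queues the request so events survive page unloads;
  // fall back to fetch with keepalive if the beacon is rejected.
  if (!navigator.sendBeacon('/api/telemetry', payload)) {
    void fetch('/api/telemetry', { method: 'POST', body: payload, keepalive: true });
  }
}

// Usage: record that a user finished step 2 of a claims intake form.
trackEvent({
  name: 'form_step_completed',
  timestamp: new Date().toISOString(),
  sessionId: 'anon-1234',
  properties: { form: 'claims-intake', step: 2, durationMs: 48210 },
});
```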
Best Practices
- Schedule in-person site visits to better understand user needs
- Conduct live or in-person usability tests where appropriate to watch users physically engage with systems and provide feedback
- Integrate usability testing into the development workflow (e.g., before each release)
- Analyze telemetry data (if applicable) to understand bottlenecks and other issues
Guiding Questions
- Who are the product’s end users? Is there more than one kind of end user?
- In what phase of your lifecycle does it make the most sense for your team to conduct a site visit? How often should your team conduct site visits?
- How can you best integrate usability testing into your team’s software development lifecycle?
- What types of telemetry data might best inform your understanding of the end-user experience? Do you already collect such data, or do you need to set up new infrastructure to enable its collection?
Key Contacts
- Contact 1
- Contact 2
- Contact 3
3.2 We measure end-user satisfaction for all products using common measurement tools and methods
Measuring our successes and failures is critical to understanding whether we are delivering on our goals. Teams should collect qualitative and quantitative metrics to develop a robust picture of end-user satisfaction.
Best Practices
- Identify the most appropriate way to measure end-user satisfaction and regularly collect those metrics
- Collect qualitative metrics (e.g., sentiment, improvement to the user journey, adoption) throughout the development process and after deployment to inform feature and functionality prioritization
- Work with Custom Development platforms or adopt OIT-recommended tooling to incorporate monitoring and logging into your system
- Identify the most relevant quantitative user experience metrics (e.g., time-on-task, click-through rates, journey completion rate, page visits) to capture satisfaction; a sketch of one such metric follows this list
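As a concrete illustration of one such quantitative metric, the sketch below computes a journey completion rate from raw telemetry events; the event names and shapes are hypothetical.

```typescript
// A minimal sketch of one quantitative metric, journey completion rate,
// computed from telemetry events; event names are hypothetical.
interface UsageEvent {
  name: string;
  sessionId: string;
}

function journeyCompletionRate(events: UsageEvent[]): number {
  const started = new Set(
    events.filter((e) => e.name === 'journey_started').map((e) => e.sessionId),
  );
  const completed = new Set(
    events.filter((e) => e.name === 'journey_completed').map((e) => e.sessionId),
  );
  // Only count completions whose start was also observed.
  let finished = 0;
  for (const id of completed) {
    if (started.has(id)) finished += 1;
  }
  return started.size === 0 ? 0 : finished / started.size;
}

// Example: 3 sessions start the journey, 2 finish it -> rate = 2/3 ≈ 0.67.
```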
Guiding Questions
- How can we tell if our products are being used?
- How can we tell if our products are successfully fulfilling their vision?
- What metrics do we need to justify product improvements, resource allocation, or other funding needs?
- What end-user data do we need to make decisions about new features to add in future phases?
- How can we track our product’s success relative to others within the same product line or portfolio?
Runbooks
- Measuring User Experience: A guide for technical team members on the importance of and steps to collecting and analyzing user feedback
- Measurement and User Testing: A guide for Product Managers on leveraging existing data, identifying new data to collect, and gathering feedback directly from users to inform prioritization and make their product more effective
Useful Links:
Key Contacts:
- Contact 1
- Contact 2
- Contact 3
3.3 We maintain a ranked list of top user pain points based on user behaviors (e.g., telemetry, usability testing) and user feedback for all products
As part of our promise to meet the needs of Veterans, their families, and caregivers, it is important that we ruthlessly prioritize our work. Product teams are expected to synthesize information from telemetry, usability testing, and other metrics into a ranked list of pain points that stand between users and the product vision.
Best Practices
- Synthesize results from telemetry, usability testing, and metrics into a list of pain points
- Prioritize the list of pain points
- Integrate the prioritized list of pain points into feature prioritization, roadmaps, and sprint planning; regularly update rankings
- Share common pain points with Product Line Managers so they can be shared across the broader product line
Guiding Questions
- Where do end-users seem to have issues when interacting with an application?
- Where do users get stuck when they are making their way through a new user experience?
- What are the common complaints or requests received from user feedback?
- What prioritization framework makes sense to rank pain points for your product?
- What inputs might make sense for your prioritization framework?
- Which pain points make sense to share back to a broader group?
Use Case
The Veteran Experience Service (VES) portfolio in Product Delivery Service (PDS) set out to understand how the Veteran users of their products experience those products, and to catalogue, rank, and address commonly cited unmet needs.
Each unmet need is presented from the Veteran’s perspective, is clearly written, and answers the following questions:
- At what point do you encounter issues interacting with our service?
- What task are you currently unable to achieve with our service?
- What specific obstacle is preventing you from achieving your task with our service?
The needs are then prioritized based on the following criteria:
- Impact: Level of impact that the unmet need has on the Veteran’s life.
- Reach: Number of Veterans that experience a specific step in the VA service journey per year according to our most recent available source.
- Research confidence: The robustness of all the research reviewed for an unmet need.
The highest-priority items are those that have the largest impact, affect the most Veterans, and have met the threshold for the types and quantity of reviewed research.
The lowest-priority items are those with a lower impact on a smaller group of Veterans and that require additional research to identify the full scope and complexity of the problem.
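To illustrate how criteria like these can combine into a single ranking, here is a minimal scoring sketch; the numeric scales and the multiplicative formula are assumptions for illustration, not the VES team’s published method.

```typescript
// An illustrative scoring sketch loosely modeled on the Impact / Reach /
// Research-confidence criteria above. The scales and the multiplicative
// formula are assumptions, not the VES team's actual method.
interface UnmetNeed {
  title: string;
  impact: number;     // 1 (minor inconvenience) to 5 (major impact on a Veteran's life)
  reach: number;      // Veterans encountering this journey step per year
  confidence: number; // 0.0 (anecdotal) to 1.0 (robust, multi-source research)
}

function priorityScore(need: UnmetNeed): number {
  // Confidence discounts the score, so weakly evidenced needs rank lower
  // until further research raises certainty.
  return need.impact * need.reach * need.confidence;
}

const ranked: UnmetNeed[] = [
  { title: 'Cannot check claim status on mobile', impact: 4, reach: 250_000, confidence: 0.9 },
  { title: 'Confusing direct-deposit form', impact: 3, reach: 40_000, confidence: 0.5 },
].sort((a, b) => priorityScore(b) - priorityScore(a));
```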
While this work is ongoing, it is an excellent example of how teams and portfolios can compile all kinds of inputs to help prioritize their work.
Useful Links:
- Visit the unmet needs microsite to learn more about current Veteran needs.
Key Contacts:
3.4 We incorporate end-user feedback, not just business requirements, into our product decisions, and we adjust plans when needed
We aim to design products that truly meet our users’ needs, beyond simply fulfilling predetermined business requirements. User feedback allows us to create products that are effective, accessible, and aligned with real-world user experiences. This means we actively gather, analyze, and incorporate end-user feedback to adapt our strategies, requirements, and roadmaps.
Best Practices
- Establish clear channels for feedback collection through surveys, feedback forms, and user interviews at various stages of the product lifecycle
- Identify actionable insights from collected data by categorizing feedback into themes that directly inform design or functionality adjustments
- Engage cross-functional teams, including developers, designers, and analysts, in the feedback loop to ensure all aspects of the product reflect the simplest solution that meets users’ needs
- Create a flexible product roadmap [link to product roadmap attribute] that allows for adjustments based on emerging user insights, prioritizing features that address pressing user needs
- Evaluate the impact of feedback-driven changes by measuring user satisfaction and performance to guide future iterations
Guiding Questions
- How are we ensuring user feedback consistently informs our product decisions?
- What insights from users indicate a need to pivot or adjust our product approach?
- Are the channels for capturing user feedback (e.g., surveys, interviews, analytics) effective and comprehensive?
- How flexible is our product roadmap, and do we have mechanisms for incorporating user-driven changes?
Runbooks
- Research at VA: A VA.gov-specific guide for teams designing digital services for Veterans to incorporate research into their work
Use Case
The VA.gov platform supports its researchers and designers with extensive resources to ensure that experiences are consistent and user-centric. This process is enforced via the platform’s collaboration cycle: a series of touchpoints that ensure products meet platform standards across design, user research, information architecture, and accessibility.
A core component of the collaboration cycle is the Design Intent touchpoint, where teams present a problem statement, user workflow, and lo-fi prototype to the platform Governance Team. This meeting presents an opportunity for teams to consider the end-user early on, collaborating with the platform team to ensure the flow makes sense before beginning development.
After the Design Intent meeting, teams may be asked to complete a Research Review, in which they 1) leverage existing user research repositories gathered by the platform, and 2) conduct new research with users. The platform provides several resources for conducting research, including research checklists, research plan templates, prototype certification (pre-user testing), and lists of Veterans who have volunteered for user testing.
Useful Links:
Key Contacts:
3.5 We deploy changes iteratively, using techniques such as feature flags and beta channels
To ensure an optimal user experience at release, and to avoid unnecessary disruption or system outages, teams should deploy in a highly iterative manner. Product teams should consider using techniques such as feature flags, which allow teams to turn functionality on and off even after release, and beta channels, which allow teams to release non-final software to a small group of users for testing.
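As a minimal illustration of the feature-flag idea, the sketch below gates functionality behind an in-code flag store; the flag names and store are hypothetical, and real teams would more likely use a managed flag service or centralized configuration.

```typescript
// A minimal feature-flag sketch; the flag store and flag names are
// hypothetical placeholders.
type FlagName = 'new_claims_wizard' | 'beta_dashboard';

const flags: Record<FlagName, { enabled: boolean; allowGroups: string[] }> = {
  new_claims_wizard: { enabled: false, allowGroups: ['beta-testers'] },
  beta_dashboard: { enabled: true, allowGroups: [] },
};

function isEnabled(flag: FlagName, userGroups: string[] = []): boolean {
  const f = flags[flag];
  // A flag is either on globally or on selectively for allow-listed groups,
  // letting teams ship dark code and turn it on (or back off) after release.
  return f.enabled || userGroups.some((g) => f.allowGroups.includes(g));
}

// Usage: show the new wizard only to beta testers until general release.
if (isEnabled('new_claims_wizard', ['beta-testers'])) {
  // renderNewClaimsWizard(); // hypothetical render call
}
```

The same allow-list mechanism can back a beta channel: enroll a small group of users, gather their feedback, then widen the group or enable the flag globally for the final release.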
Best Practices
- Break up large releases on the product roadmap into several smaller releases
- Use feature flags when releasing a new feature to avoid any disruption to the end-user experience
- Consider creating a beta channel to identify and resolve additional bugs before the final release
Guiding Questions
- How might you break up a large release into several smaller releases? Can any new features or functionality be decoupled from each other?
- Where might it make sense to add a feature flag to a new release? If you still want certain user groups to be able to access a “hidden” feature, who might those user groups be?
- How large will your beta channel audience be? How long should the beta testing period be?
- Is a migration from monolith to microservices on your roadmap? Should it be?
- Is there an opportunity to migrate your application to a Custom Development platform or build a CI/CD pipeline?
Useful Links:
Key Contacts:
3.6 We use common design frameworks, including those required by Federal law
All applications developed by OIT should use VA’s comprehensive design framework, available at design.va.gov. This framework contains:
- A content guide, including guiding principles and search engine optimization (SEO) best practices
- Base information about styles used within VA, including color palettes and font families
- A component, pattern, and template library, with plug-and-play elements that can be used by both developers and UX designers (see the sketch below)
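As a sketch of what “plug-and-play” looks like in practice, the snippet below renders a design system button through its React bindings; the package import path and component props are assumptions here and should be verified against the current design.va.gov component documentation.

```tsx
// Sketch only: import path and props should be checked against design.va.gov.
import { VaButton } from '@department-of-veterans-affairs/component-library/dist/react-bindings';

export function SubmitRow({ onSubmit }: { onSubmit: () => void }) {
  // The component ships with VA color palette, typography, and
  // accessibility behavior, so teams avoid hand-rolled styling.
  return <VaButton text="Submit claim" onClick={onSubmit} />;
}
```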
Best Practices
- Consult the relevant section(s) of design.va.gov depending on your role. At a minimum, ensure product design is compliant with the “Foundation” section
- Utilize the U.S. Web Design System (designsystem.digital.gov) if a desired component, pattern, or template is not yet available on design.va.gov
- Follow the Federal Website Standards (including pending and draft standards) outlined at digital.gov
- Consider contributing back to the VA design community by reaching out to the design.va.gov team
Guiding Questions
- Are you fully utilizing the VA-specific design resources made available to you, including Figma and React components?
- Have you implemented a design consistent with the standards laid out in the “Foundation” section of design.va.gov?
- Are your applications compliant with the Federal Website Standards?
Useful Links:
Key Contacts:
- UX Design CoP (OCTO)
3.7 We build products with accessibility in mind from the beginning and ensure solutions meet or exceed Section 508 standards
At VA, we “design our websites and other digital tools to meet or exceed the Section 508 technical requirement standards.” Teams are expected to conform to guidelines provided by the World Wide Web Consortium (W3C), specifically the Web Content Accessibility Guidelines (WCAG) 2.2, and design and develop with these principles in mind rather than validating compliance at the end of the lifecycle.
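One way to shift accessibility left is to run automated WCAG scans on every build. The sketch below assumes Playwright paired with axe-core via the @axe-core/playwright package; the page URL is a placeholder.

```typescript
// A sketch of an automated WCAG scan using Playwright + axe-core; the URL
// is a placeholder and the axe tag names should be verified for your version.
import { test, expect } from '@playwright/test';
import AxeBuilder from '@axe-core/playwright';

test('page has no detectable WCAG 2.2 AA violations', async ({ page }) => {
  await page.goto('https://staging.example.va.gov/'); // placeholder URL
  const results = await new AxeBuilder({ page })
    .withTags(['wcag2a', 'wcag2aa', 'wcag22aa']) // WCAG 2.x A/AA rule sets
    .analyze();
  expect(results.violations).toEqual([]); // fail the build on any violation
});
```

Automated scanners catch only a subset of WCAG issues, so checks like this complement, rather than replace, the manual checklists and audits referenced below.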
Best Practices
- Visit the Federal Section 508 page on developing software and websites to find accessibility trainings, better understand requirements, and find recommendations for automated tools and testing techniques
- Review VA training courses on Section 508 requirements across job functions
- Review the Digital VA Section 508 checklists to manually ensure accessibility compliance
- Contact the Digital VA Section 508 office for additional support, resources, and accessibility audits
- Use a Custom Development platform with automated accessibility checks
Guiding Questions
- Do you have a complete understanding of the accessibility requirements for your application?
- Did you validate that your application is compliant with WCAG 2.2 standards (either by visiting the W3C website directly or using the Digital VA Section 508 checklists)?
- Have you reached out to the Section 508 office to request an accessibility audit, or do you have certified contract staff who can validate accessibility?
- Is there an opportunity to use a solution that automates your accessibility checks?
- What changes can you make to improve accessibility beyond Section 508?
- Have you considered all possible end-users and abilities?
Training
- Tech Tuesday 12 | Human-Centered Design: The Role of Plain Language and Content Design (Feb 28, 2023)
- Tech Tuesday 14 | Human-Centered Design and Accessibility: More Than Just 508 Compliance (Mar 28, 2023)
- Tech Tuesday 23 | Ace your 508 Audit, Start your HCD Accessibility Journey (Aug 15, 2023)
- Training Options
Useful Links:
Key Contacts:
- 508 Program Manager: Chet Frith – Chet.Frith@va.gov
- Digital VA Section 508 office
- #accessibility-help
3.8 We build trust with our users by monitoring performance, driving continuous improvement, and keeping them informed about upcoming changes to their experience
Reliability is one of the most critical aspects of development at VA: outages can affect millions of users across countless mission-critical processes, with further downstream impacts possible. Ultimately, VA strives for all systems to achieve “three-nines” of uptime (i.e., the application or system is functional 99.9% of the time).
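To make that target concrete, the arithmetic behind an availability figure is simple; this quick sketch computes the downtime budget implied by “three-nines”:

```typescript
// Allowed downtime implied by an availability target.
function allowedDowntimeHoursPerYear(availability: number): number {
  const hoursPerYear = 365.25 * 24; // ≈ 8,766 hours
  return hoursPerYear * (1 - availability);
}

console.log(allowedDowntimeHoursPerYear(0.999).toFixed(1)); // "8.8" hours/year, about 44 minutes/month
```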
Our response to issues is as important as their identification. Teams should have communication procedures in place to ensure that customers and end-users are informed in advance – or as soon as possible in the case of an outage – of any potential impact to service, and that they are updated regularly.
Best Practices
- Proactively communicate with end-users about major updates, scheduled maintenance, or reliability issues
- Lay out plans to communicate planned outages with end-users
- Develop robust communication protocols as part of your Major Incident Management process (see attribute 2.1) that ensure end-users are informed as soon as possible of impacts or potential impacts to their service
Guiding Questions
- If you were the business or end-user, would you want to know about this change or possible outage?
- Is there a robust communication element in your incident management protocol? If so, who is responsible for executing that plan?
- Do all user types need to be informed directly by your team? What information can you provide your business partner for them to communicate to their customers and end-users?
- What is the critical information your end-user needs to know?