Prepare Slides & Videos for assessment #1192

teeceeas opened this issue Dec 18, 2024 · 0 comments


Assessors need to understand the story of the service if they are to provide meaningful advice and governance. We have asked assessors what they want to see from teams in order to ascertain whether they meet the Standard. This should be read in conjunction with the Beta guidance from CDDO.

Remember, not all of this information needs to be included; it's about telling a story in a way that makes sense for your service and your context. Some of the information may not be covered in the presentation; it may appear in the briefing form, tech call or demo instead.

  1. Understand users and their needs

Key things to demonstrate

  • Remind us of your users and their key needs.
  • Walk us through the user journey.
  • Who did you conduct research with? How many participants? How many rounds of research? Did you duplicate participants in any rounds of research? How did you recruit users for research and testing?
  • How have you tested the offline or assisted digital journey end to end with users? Have you done any user research into any paper elements of the journey?

Other things you might want to consider evidencing

  • Has anything changed since Alpha?
  • What research methods did you use?
  • How do you share what you know about users with the team? Personas? User profiles? Where do these live? How are they used by the team?
  • Has the unhappy path been tested? And error messaging?
  • Has the content been tested? What did you learn?
  • What devices do your users use? Which ones have you conducted research with?
  • Are there any gaps in your knowledge about users and their experience of your service? How will you fix those going forward?
  2. Solve a whole problem for users

Key things to demonstrate

  • Be explicit about the problem(s) you are solving.
  • What steps have you taken to understand your users’ wider end to end journey?
  • What is the effect of your service on related services upstream and downstream?

Other things you might want to consider evidencing

  • How are you collaborating with services, teams and organisations outside of your team that play a part in the wider journey?
  • Have you considered the admin or internal users, and what have you learned about them?
  • How does your service connect with others, especially on GOV.UK? Have you agreed a subdomain name with GOV.UK?
  • What constraints have you had to work within and how are these being addressed by the wider organisation?
  • How will users find out about the service? How are you encouraging users to adopt the service? Will that approach work at scale?
  3. Provide a joined up experience across all channels

Key things to demonstrate

  • How have you involved operational staff or other colleagues with your design and user testing?
  • How have you ensured the non-digital parts of this service join up with the digital parts?
  • What steps have you taken to understand overlaps in patterns and data between your service and others?
  • Have you tested the handoffs between different channels?

Other things you might want to consider evidencing

  • What other services are connected to this one? How do they join up? Have you conducted user research into the entries, exits and segues between these?
  • How have you tested this service, or part of the service, across multiple channels? Can you show us how users who aren't digital would interact with this service?
  • Are there other services or teams impacted by your service? What mitigation have you put in place?
  • Can staff from one channel see applications via another channel? (e.g. phone support giving an update on an online application)
  • Have you collaborated with service areas that use similar or identical patterns? Have you used common design components? Is the copy, content and process consistent across post, telephone and online applications?
  • What steps have you taken to understand shared data with other services?
  4. Make the service simple to use

Key things to demonstrate

  • How have you ascertained whether the service is intuitive?
  • Where are the points in the service where users struggled the most? What support are you offering here?
  • Have you tested the unhappy paths?

Other things you might want to consider evidencing

  • What mechanisms do you have in place if users need help or training on the service?
  • How have you named your service?
  • Why did you decide to use a (new) pattern and what evidence do you have that it works?
  • How did you test any error messages?
  • What other services are similar and have you learned anything from them?
  5. Make sure everyone can use the service

Key things to demonstrate

  • What is the team's understanding of how users with accessibility needs (e.g. physical or cognitive disability, neurodiversity, low literacy or numeracy, low digital literacy or access, non-native speakers) experience the service?
  • What research has been carried out with participants with these needs?
  • What steps have you taken to ensure that your service works for those with accessibility issues? Have you done any automated accessibility testing? What are your plans for an accessibility audit?
  • What happens if a user can't use the service?

Other things you might want to consider evidencing

  • What do you offer in the way of assisted digital?
  • Are there any groups of users you have not tested this with?
  • Is your service available in Welsh? If not, why not?
  • Can someone still apply if they don't have an email address, a fixed postal address or a phone number?
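On automated accessibility testing: teams would normally use a dedicated tool such as axe-core or pa11y, but the kind of rule those tools automate can be sketched with the standard library alone. The checker below is a hypothetical illustration (not a substitute for a real audit) that flags `<img>` tags missing an `alt` attribute, one of the most common WCAG failures.

```python
# Minimal sketch of one automated accessibility rule, using only the
# standard library. Real audits should use a dedicated tool (e.g. axe-core).
from html.parser import HTMLParser


class AltTextChecker(HTMLParser):
    """Flags <img> tags that are missing an alt attribute (WCAG 1.1.1)."""

    def __init__(self):
        super().__init__()
        self.violations = []

    def handle_starttag(self, tag, attrs):
        if tag == "img" and "alt" not in dict(attrs):
            self.violations.append("img missing alt text")


def check_alt_text(html: str) -> list:
    """Return a list of violations found in the given HTML fragment."""
    checker = AltTextChecker()
    checker.feed(html)
    return checker.violations


page = '<p>Hi</p><img src="a.png"><img src="b.png" alt="Chart of uptake">'
print(check_alt_text(page))  # the first <img> has no alt text
```

A check like this is cheap enough to run on every build, which is the kind of evidence assessors tend to ask for under "automated accessibility testing".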
  6. Have a multidisciplinary team

Key things to demonstrate

  • Who is in your team?
  • What non-digital roles have you brought in?
  • What does your team look like going forward?

Other things you might want to consider evidencing

  • What roles have you found you don’t need full time?
  • Were there any disciplines or roles you didn’t feel you had sufficient access to?
  7. Use agile ways of working

Key things to demonstrate

  • How do you prioritise what you work on?
  • How do you organise your delivery?
  • What constraints prevented you from working in an agile way?

Other things you might want to consider evidencing

  • What are your plans for rolling out the service to all users?
  • How can you demonstrate working in the open?
  • What collaborative tools are you using?
  • What compromises have you had to make?
  • What constraints have you had to work within?
  • What decisions have been passed down from above?
  8. Iterate and improve frequently

Key things to demonstrate

  • What have you learnt from testing with users and how are you feeding this research into design iterations of your service?
  • How have you iterated your designs based on research? Can you show a couple of examples of when you have iterated on the basis of user feedback?

Other things you might want to consider evidencing

  • How do the designers and researchers work together?
  • What did you throw away based on user research?
  • How do you know when to stop iterating?
  • When did you make a decision without research? What were the circumstances?
  • What are the budget plans for continuous improvement going forward?
  9. Create a secure service which protects users’ privacy

Key things to demonstrate

  • What measures have you taken to secure your service and its data?
  • What are your top threats and the mitigations you've put in place?
  • Is there a Privacy Policy?
  • Is there a DPIA?
  • Has an IT Health Check been done? What was the result?

Other things you might want to consider evidencing

  • Describe authorisation: Secure Sign On? Roles and permissions?
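When describing authorisation, assessors usually want to see the roles-and-permissions model, not just the sign-on mechanism. The sketch below is purely illustrative: the role names and permission strings are hypothetical, not taken from any particular service.

```python
# Illustrative sketch of a tiny role-based access check, the kind of
# authorisation model ("roles and permissions") assessors may ask about.
# All role and permission names here are hypothetical examples.
ROLE_PERMISSIONS = {
    "caseworker": {"view_application", "update_application"},
    "read_only": {"view_application"},
    "admin": {"view_application", "update_application", "manage_users"},
}


def is_allowed(role: str, permission: str) -> bool:
    """Return True if the given role grants the given permission."""
    return permission in ROLE_PERMISSIONS.get(role, set())


assert is_allowed("caseworker", "view_application")
assert not is_allowed("read_only", "update_application")
```

Being able to show a table like `ROLE_PERMISSIONS` (who can do what, and why) is a concise way to evidence this point in a slide.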
  10. Define what success looks like and publish performance data

Key things to demonstrate

  • How are you measuring success?
  • How do you know you have improved the user experience and solved users’ problems?
  • Do you have baseline data to compare against?

Other things you might want to consider evidencing

  • What did you measure before you started Beta and what do you measure now? Do they differ and why?
  • How are you sharing metrics about your service?
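The before/after comparison assessors look for here can be as simple as a single metric against a baseline. The sketch below uses a completion-rate metric with invented figures, purely to show the shape of the evidence.

```python
# Hedged sketch: comparing a Beta metric against its pre-Beta baseline,
# the kind of before/after evidence point 10 asks for. All figures invented.
def completion_rate(started: int, completed: int) -> float:
    """Share of started applications that were completed."""
    return completed / started if started else 0.0


baseline = completion_rate(started=1200, completed=780)   # pre-Beta baseline
current = completion_rate(started=1500, completed=1125)   # during Beta

print(f"Completion rate moved from {baseline:.0%} to {current:.0%}")
```

Whatever the metric (completion rate, cost per transaction, user satisfaction), the point is the same: define it before Beta so the comparison is meaningful.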
  11. Choose the right tools and technology

Key things to demonstrate

  • A technical architecture diagram
  • A description of the tech stack: list the main technology choices of the core service
  • How are you managing any limitations caused by your technology choices?

Other things you might want to consider evidencing

  • Describe your CI/CD, covering types of tests, approval steps, deployment method, hosting
  • Configuration/infrastructure management - could you rebuild an environment quickly and easily?
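When describing CI/CD, one concrete thing to show is a post-deployment smoke check gating releases. The sketch below is an assumption-laden illustration: the `/healthcheck` endpoint name and the JSON response shape are hypothetical, not from the issue.

```python
# Sketch of a post-deployment smoke check a CI/CD pipeline might run before
# approving a release. The healthcheck response shape here is an assumption.
import json


def healthy(body: str) -> bool:
    """Return True if a healthcheck JSON body reports an 'ok' status."""
    try:
        payload = json.loads(body)
    except json.JSONDecodeError:
        return False
    return payload.get("status") == "ok"


assert healthy('{"status": "ok", "version": "1.4.2"}')
assert not healthy('{"status": "degraded"}')
assert not healthy("<html>error page</html>")  # non-JSON means unhealthy
```

In a real pipeline this would run against the freshly deployed environment; a failure would block promotion to production.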
  12. Make new source code open

Key things to demonstrate

  • List your repo URLs and explain if any aren't public.

Other things you might want to consider evidencing

  • Are you coding in the open?
  13. Use and contribute to open standards, patterns and components

Key things to demonstrate

  • What is your approach to open standards and common platforms?
  • List the common components, tools and patterns you have adopted or rejected.
  • What design patterns are you using?

Other things you might want to consider evidencing

  • Have you defined guardrails?
  • Have you contributed to open standards?
  • Have you shared any learnings with the GOV.UK Design System team? Have you used department standards? Can you justify the use of a non-standard pattern?
  14. Operate a reliable service

Key things to demonstrate

  • What are your processes for alerts, monitoring and logs?
  • How confident are you that you can run this at scale? What are you basing that confidence on? E.g. load testing, autoscaling.
  • How will you deal with incidents and fallbacks? What are your backup plans if the service is unavailable?

Other things you might want to consider evidencing

  • How are you collating metrics or analytics?
  • What is your incident management process?
  • How do you deploy your software? How long does it take to deploy code to production?
  • What tests are carried out on hosting and config management?
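The alerting question above can be made concrete with even the simplest rule: fire when the error rate over a window of requests exceeds a threshold. The sketch below is illustrative only; the 5% threshold is an invented example, and a real service would express this as a rule in its monitoring stack rather than application code.

```python
# Illustrative sketch of the simplest possible alerting rule: alert when the
# error rate over a request window exceeds a threshold. Threshold is invented.
def should_alert(errors: int, requests: int, threshold: float = 0.05) -> bool:
    """Alert when more than `threshold` of requests in the window failed."""
    if requests == 0:
        return False  # no traffic in the window, nothing to judge
    return errors / requests > threshold


assert not should_alert(errors=3, requests=100)  # 3% - within tolerance
assert should_alert(errors=12, requests=100)     # 12% - page someone
```

Being able to state your thresholds, who gets paged, and what the runbook says next is exactly the evidence this point asks for.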