Hannah Hall, Customer Empathy Manager, SBS

Skipton Building Society & Digital Accessibility - Creating a Society where no one is left out


This is the first in our new Accessibility Leaders blog series, where we showcase organizations leading the way in digital and web accessibility. As a company passionate about promoting digital inclusion and bringing equal access to the web, we will be giving experts and industry leaders the chance to share best practice alongside practical thoughts and solutions that can be used by IT, Marketing and Inclusion professionals, as they pursue the goal of increasing accessibility within their own organizations. Our first guest blog is from Hannah Hall, Customer Empathy Manager from Skipton Building Society, who showcases the research and impact of her company's Accessibility Initiative.



Skipton Building Society was established in 1853, has over a million customers, and is the fourth largest Building Society in the UK. With 88 branches across England and Scotland, it's very much a traditional Building Society in the sense that we provide savings products, mortgages, financial advice and insurance. Read on to find out about our Digital Accessibility journey.

In terms of our accessibility journey up until today, I usually start by talking about the work done by the Society around Empathy. Back in 2016 we identified empathy as something we wanted to focus on as an organization, and an “Empathy Audit” was completed which resulted in actions to help us improve. As these actions became embedded within the culture of the Society, in 2017 we took this work to market by beginning our partnership with Alzheimer’s Society and working towards becoming a Dementia Friendly organization. We continue this work to this day and are proud of how colleagues have supported it and become Dementia Friends to help support our customers.

We then began to realize that this work helped more people than just those impacted by dementia and that we needed to take a more holistic view, which is where, in 2018, the Accessibility initiative began. We realized we were doing lots of things to support our customers, but never under the umbrella of accessibility. At this point we were not sure, as a business, how to measure ourselves, which resulted in us becoming members of the Business Disability Forum. We then completed a self-assessment so we could benchmark ourselves and create a comprehensive plan to move forward.

One of the drivers behind this focus was that, through internal research, we found accessibility impacts 50% of our customers (yes, that high!). This figure has the potential to be higher still, as it only looked at six conditions and doesn’t include any situational, temporary, unregistered or undiagnosed conditions. When it came to influencing decisions and stakeholders, this figure and the potential impact on our customers earned us good levels of buy-in.

As part of our plan to progress with accessibility, a theme that came up throughout was that we were not asking customers, specifically those with accessibility requirements, what works for them and what doesn’t. This topic will be the theme for the rest of the blog!

Moving forward

We began digital user testing throughout 2019. I am just going to talk about two sets of testing in this blog, both of which we worked on with the Research Institute for Disabled Consumers (RiDC). The first set was our Customer Panel Recruitment and Welcome Email review: users went through the process of receiving an email inviting them to join our Customer Panel. Our panel is a group of customers who have agreed to take part in research, predominantly digitally based. The users then went on to complete a joining survey and finally received our marketing “Welcome Email” to feed back on.

The feedback from the survey showed us where there were opportunities to be more inclusive, but it also supported some of the changes we had already implemented. It is worth mentioning at this point that for the final stage of the testing, the “Welcome Email” section, we split the group in half: half of the users received an email that was “less accessible” and the remaining half received an email that was more accessible. The differences included line and character length, different fonts and link designs.

To start, there were opportunities within the language we used: rather than “we would like to hear you”, we have moved to “we would like to know”. Something small, but it could have an impact on the customer’s experience of our brand. We also had a lot of feedback around the color contrast between the text and the background color, as well as the styling of the text within the emails and surveys; it is safe to say it wasn’t overly consistent. We also found that within the survey, which is hosted by an external supplier, there were issues with the tick boxes working correctly with assistive software, which led to there being no audible cue to tell participants that they had selected a box. Lastly, the links were not clear and the alt text on the image did not add any value for the user.

Most of this feedback came from the “less accessible” email, with no mention of some of these issues from the participants who received the second email. This evidence has supported the work that has been done to improve the accessibility of our emails, and we are currently working on creating guidelines to improve the accessibility and consistency of our surveys. We have also used the feedback to inform other work throughout the Society, including the release of new email templates for our operational emails.
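Some of the email findings above, such as images missing alt text and links whose wording gives no clue where they lead, lend themselves to a first-pass automated check before human testing. The sketch below is purely illustrative (it is not Skipton's actual tooling, and the generic-phrase list is an assumption), using only Python's standard library:

```python
# Hypothetical first-pass audit of an HTML email: flags images with no alt
# text and links whose visible text is a generic phrase like "click here".
# Note: alt="" is valid for purely decorative images; this sketch flags
# empty alt too, so flagged items still need a manual review.
from html.parser import HTMLParser

GENERIC_LINK_TEXT = {"click here", "here", "read more", "more"}  # assumed list

class EmailAuditParser(HTMLParser):
    """Collects images lacking alt text and links with generic wording."""
    def __init__(self):
        super().__init__()
        self.missing_alt = []      # src of each image with no/empty alt
        self.vague_links = []      # href of each link with generic text
        self._current_href = None
        self._link_text = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "img" and not attrs.get("alt", "").strip():
            self.missing_alt.append(attrs.get("src", "<unknown>"))
        elif tag == "a":
            self._current_href = attrs.get("href", "<unknown>")
            self._link_text = []

    def handle_data(self, data):
        # Only collect text while inside an <a> element.
        if self._current_href is not None:
            self._link_text.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._current_href is not None:
            text = " ".join("".join(self._link_text).split()).lower()
            if text in GENERIC_LINK_TEXT:
                self.vague_links.append(self._current_href)
            self._current_href = None

def audit_email(html: str):
    """Return ([images missing alt], [links with vague text])."""
    parser = EmailAuditParser()
    parser.feed(html)
    return parser.missing_alt, parser.vague_links
```

A check like this only catches mechanical issues; whether the alt text or link wording actually adds value, as the panel feedback showed, still needs real users.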

The second piece of testing we completed was on our app, one of our newest developments, so it was great to see how users with different accessibility needs found it. We had planned to complete the testing before the app went live; however, we had some challenges with location, so it was conducted just after launch. The users were asked to complete three different journeys: registering to use the app, exploring the various help options, and lastly completing different transfers and payment options.

Real change

Overall, we were pleased with the feedback we received; however, there were areas where the app could be improved. Within the journeys there were some challenges with color contrast. This wasn’t related to the text but to icons and symbols within the journeys that were generally accompanied by text.
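For context, contrast findings like this are usually measured against WCAG 2.1, which requires a ratio of at least 3:1 for icons and other non-text content and 4.5:1 for regular text. A small sketch of the standard WCAG formula, for illustration only (the colors in the usage note are made up, not Skipton's palette):

```python
# Sketch of the WCAG 2.1 contrast-ratio calculation. Ratios range from
# 1:1 (identical colors) to 21:1 (black on white).
def relative_luminance(rgb):
    """WCAG relative luminance of an sRGB color given as 0-255 ints."""
    def channel(c):
        c = c / 255
        # Piecewise sRGB linearization defined by WCAG.
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (channel(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    """Contrast ratio between two colors; order does not matter."""
    lighter, darker = sorted(
        (relative_luminance(fg), relative_luminance(bg)), reverse=True
    )
    return (lighter + 0.05) / (darker + 0.05)
```

For example, a mid-grey icon (118, 118, 118) on white scores roughly 4.5:1, so it passes the 3:1 non-text threshold, while a lighter grey would fail even though the text beside it passes.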

The labeling for screen readers throughout the app was inconsistent, and some entry fields were labeled as buttons, leaving users unsure whether they needed to enter information or tap. One of the most challenging areas we need to improve is our Terms and Conditions. They are notoriously long, complicated and inaccessible; however, there are things we can do to improve them. The users needed more information to make the experience easier, such as the fact that it takes 22 scrolls to reach the bottom, that the “accept” button does not appear until you reach the end, and the ability to move between paragraphs if needed.
 
This was our most recent piece of testing, and we plan to prioritize when the improvements will be implemented. As an organization we are still at the beginning of our journey and have a way to go, but we have made a start. After one of our pieces of testing, a colleague said “Without this research, us trying to improve the experience is just a theory…” and he couldn’t have been more right!

If you'd like to begin your own digital accessibility journey, try Browsealoud today.
