Seamless accessibility: improve user experience with end-to-end business support

Have you ever had an online experience of great quality, only to find the accompanying business service less than satisfactory? I recently went through just such an experience with a popular cultural festival. There was an inconsistency in the ticketing process: an online ticket purchase still required a visit to a ticket collection point to pick up a printed ticket. Some of these points were difficult to reach and poorly signposted. It got me thinking how one inadequately supported aspect of an otherwise fantastic event was sticking out like a sore thumb! Perhaps allowing customers to print their tickets at home would resolve the issue? Or offering a mobile e-ticket, which would also be more environmentally friendly?

A similar thought was mentioned by a speaker at the Accessibility Scotland 2016 conference that I attended recently. Accessibility expert Mark Palmer highlighted:

Accessibility needs end-to-end support in a business and web accessibility is just one aspect of it

Citing the example of booking a flight for a customer travelling with a guide dog (still not a fully online process with some airlines, and quite laborious), he explained that unless the surrounding business processes are well designed, there is little point in getting just the IT part right. If a software implementation delivers a perfectly accessible web-based system for ordering a product, but the ordering process requires the user to travel in person to an inaccessible collection point to pick up that product, the purpose is defeated. Yes, we do want the web accessibility requirements fully addressed, but there should be an associated review of the business set-up as well.

Come to think of it, I can see many examples around me where the quality of an online experience is not followed through in the delivery of the actual service or business process. Take priority seat booking with some of the low-cost airlines, which still requires customers to wait in a long queue to make sure they get to keep their hand luggage with them on the aircraft. Another instance: when I booked classes for my son with a local swimming company, which had marketed its website on all its flyers, the highly presentable website had no option to pay the fees online. The transaction did not end with my online activities; I had to follow up with a phone call to make the payment. While this could be true of any online service, the same principle applies to accessibility: user experience as a topic is not limited to the web part of the customer's journey in accessing a service.

An excellent example of getting this idea right is the 'Accessible Tourism' initiative by the public sector organisation Visit Scotland. The aim of the project is to encourage tourism businesses to make the full visitor experience accessible. From practical tips on supporting a disabled person using their facilities to case studies of success stories, extensive information is provided to help businesses make the overall experience fully accessible. This measure is to be appreciated as a step in the right direction, encouraging businesses to think through the end-to-end user experience.

Can you see such processes around you where the overall service experience is inconsistent with the online service?

It could be a project you are part of, or an experience as a customer or end user. Can you imagine the frustration of such an experience? Perhaps it's something we should bring to the attention of our clients and project teams who are on such missions. Perhaps project managers and business analysts need to look at this more closely? After all, it is the end-to-end user experience that ensures customer loyalty and complete user satisfaction.

Leave a reply below or contact me by email.

Don’t forget the lampposts 

I was recently reviewing a 'complete' checklist for testing web applications, but at no point in the list was accessibility even mentioned, despite the list being quite thorough in other areas. I would like to kid myself that this was just an oversight, but it is sadly all too common, even though accessibility has been a legal requirement for well over a decade. Of course, a legal requirement whose enforcement is practically unheard of is rarely a motivation for an organisation to spend more money on something. That being said, there is strong evidence of the real business benefit of accessible services and information being available to the ten million disabled people in the UK (two million of whom have sight problems), but that would be a whole other article.

When testing for accessibility is carried out, it is done to a set of guidelines. The W3C[1] WAI[2] WCAG 2.0[3] are widely regarded as the guidelines to use, with the middle-road AA standard the most sought-after level. While the AA standard is quite adequate for the majority (AAA is better and readily achievable with a little extra effort), testing that relies solely on the guidelines does not guarantee the final product is accessible and usable. It is entirely possible to have an accessible website that is very difficult to use.

When I was a child, my mother used to paint the front door of our house a bright colour in the belief, unbeknownst to me, that this was necessary for me to be able to find my way home from school. When I asked about our door and this was explained to me I thought that it was a really silly reason and promptly told her, “You just need to count the lampposts”.

This may seem like quite a bizarre anecdote to throw into a web accessibility article. However, my point is that…

just because you expect someone to do something one way does not mean they have not already found their own preferred way to do it.

The same applies to people with disabilities accessing websites and applications. The developers may intend a site/app to be accessed in a specific way but, particularly for non-visual users, the content order and the methods they use will be quite different and will vary according to personal preference.

Your test team can ensure the site/app designs follow the WAI guidelines, and your content authors can be trained in how to maintain the accessibility standards of your site/app, but until you perform real user testing you will not know whether you have completely succeeded in your goal.

There is no substitute for having a couple of dozen people test your site with various technologies and tell you all the things that annoy them about it, as they will all do so in a slightly different way.

Many of the issues that arise during accessibility testing come from developers who have not been properly trained in the HTML features for accessibility and who implement them incorrectly, which only serves to aggravate users and drive them away from the site. This has become a particular problem with the increasing reliance on JavaScript without proper alternatives in place and, most recently, the use of ARIA[4] in HTML5. ARIA has many potential benefits, particularly for fast navigation using screen readers, but when implemented poorly it can render a site extremely unpleasant to use.

Having worked in accessibility testing for over 13 years and having a lifetime’s experience of visual impairment I can’t help but feel depressed at times at how little regard is given to web accessibility.

The need for systems to be fully accessible will only increase due to the growth in essential services being provided via web applications.

With a little training and care, it is simple to implement accessibility at the early development stages, providing a superior product that will benefit customers and users alike (and fewer headaches for me will be a nice bonus too).

If you have any comments about this topic, please leave a reply below or contact me by email.

Footnotes

[1] World Wide Web Consortium
[2] Web Accessibility Initiative
[3] Web Content Accessibility Guidelines 2.0
[4] Accessible Rich Internet Applications

The Testing Pyramid

Agile has totally changed the focus of testing and introduced automation to support its development practices. I have been involved with software development teams for nearly thirty years. Over that time I have seen many different methods and practices come and go, but testing remained centred on manual testing. That is, until Agile software development arrived.

An Agile, iterative approach to software development means you have to test all your software all the time, ensuring that what you have just built has not broken a previously written piece of functionality. Agile automates testing at all layers of the application, and this approach is fast overtaking the traditional manual approach. The automated tests that previously existed were focused on the front end, the most volatile part of the application. They were also written after the application was built, and therefore did not give the short feedback loop we rely on in Agile software development. The Testing Pyramid puts a different testing emphasis on each of the application layers and focuses effort at the beginning of the development cycle rather than at the end.

The Testing Pyramid

Looking around the web you will see various implementations of the Testing Pyramid, with different names for the levels and different types of tests at each level. The one I use is the basic three-level pyramid – Unit Tests, Service Tests and UI Tests. As with any pyramid, each level builds on the solidity of the level below it.

From a technical viewpoint, one can look at these three levels as small, medium and large. Small tests are discrete unit tests that use mocks and stubs to stand in for collaborating objects, and are typically written in one of the xUnit frameworks. Service tests are the medium-sized tests and interface with one other system, typically a database or a data bus. The large tests, the UI tests, collaborate with many sub-system parts and support the end-to-end scenarios.
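
To make the "small" level concrete, here is a minimal sketch of a discrete unit test that swaps a collaborator for a hand-rolled stub, written against JUnit 5. The ExchangeRateSource and PriceConverter names are invented purely for illustration:

    import org.junit.jupiter.api.Test;
    import static org.junit.jupiter.api.Assertions.assertEquals;

    // The collaborator the class under test depends on.
    interface ExchangeRateSource {
        double rateFor(String currency);
    }

    // The class under test: converts prices using the rate source.
    class PriceConverter {
        private final ExchangeRateSource rates;
        PriceConverter(ExchangeRateSource rates) { this.rates = rates; }
        double convert(double amount, String currency) {
            return amount * rates.rateFor(currency);
        }
    }

    class PriceConverterTest {
        @Test
        void convertsUsingTheSuppliedRate() {
            // A stub stands in for the real rate source, keeping the
            // test small, fast and isolated from external systems.
            ExchangeRateSource stub = currency -> 1.25;
            PriceConverter converter = new PriceConverter(stub);
            assertEquals(125.0, converter.convert(100.0, "USD"), 0.001);
        }
    }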

Personally I look at the levels in a functional way:

  • UI Tests: Does the software work as expected by the end user?
  • Service Tests: Does the software meet the acceptance criteria on the user story?
  • Unit Tests: As a developer does the software work as I expect?

Unit Tests

Unit tests are the foundation of the Testing Pyramid. This is where the bulk of the tests are written, using one of the xUnit frameworks – for example, JUnit when developing in Java. They ask the question "Are we building the product right?". When writing software I like to take a Test Driven Development (TDD) approach. TDD is a design approach to development where the tests are written first and the code is then written to make them pass (a minimal sketch follows the list of benefits below). There are a number of benefits to taking this approach:

  • High test coverage of the written code
  • Encourages the use of Object Oriented Analysis and Design (OOAD) techniques
  • Allows you to move forward with confidence knowing the functionality works
  • Debugging is minimised because a test takes you straight to the problem
  • Developers take more responsibility for the quality of their code
  • Because it is written to support specific tests, the code works by design not by coincidence or accident
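
To illustrate, here is a minimal test-first sketch in JUnit 5; the PriceCalculator class and its discount rule are hypothetical. In TDD the tests below are written first and fail, and the simplest implementation is then written to make them pass:

    import org.junit.jupiter.api.Test;
    import static org.junit.jupiter.api.Assertions.assertEquals;

    class PriceCalculatorTest {

        @Test
        void appliesTenPercentDiscountToLargeOrders() {
            PriceCalculator calculator = new PriceCalculator();
            assertEquals(108.0, calculator.totalFor(120.0), 0.001);
        }

        @Test
        void leavesSmallOrdersUndiscounted() {
            PriceCalculator calculator = new PriceCalculator();
            assertEquals(50.0, calculator.totalFor(50.0), 0.001);
        }
    }

    // The simplest code that makes both tests pass – written after
    // the tests, so it works by design rather than by accident.
    class PriceCalculator {
        double totalFor(double orderValue) {
            return orderValue >= 100.0 ? orderValue * 0.9 : orderValue;
        }
    }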

Service Tests

I see Service Tests as supporting the acceptance criteria on the user story. They ask the question "Are we building the right product?". When writing these tests, I like to take advantage of the Given/When/Then BDD format of the acceptance criteria and use one of the BDD test frameworks, typically Cucumber (a sketch follows the list below). I also adopt specification by example, that is, using real-world examples in the acceptance criteria. This approach gives a number of benefits:

  • Assurance that all stakeholders and delivery team members understand what needs to be delivered through greater collaboration
  • The avoidance of re-work through precise specifications and therefore higher productivity
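
As a sketch of what this looks like in practice, here are hypothetical Cucumber step definitions in Java for a Given/When/Then acceptance criterion. The scenario, the Account class and the amounts are all invented for illustration:

    import io.cucumber.java.en.Given;
    import io.cucumber.java.en.When;
    import io.cucumber.java.en.Then;
    import static org.junit.jupiter.api.Assertions.assertEquals;

    // Step definitions for a hypothetical acceptance criterion:
    //   Given an account with a balance of 100
    //   When 40 is withdrawn
    //   Then the balance is 60
    public class WithdrawalSteps {

        private Account account;

        @Given("an account with a balance of {int}")
        public void anAccountWithABalanceOf(int balance) {
            account = new Account(balance);
        }

        @When("{int} is withdrawn")
        public void isWithdrawn(int amount) {
            account.withdraw(amount);
        }

        @Then("the balance is {int}")
        public void theBalanceIs(int expected) {
            assertEquals(expected, account.getBalance());
        }
    }

    // Minimal domain class so the sketch is self-contained.
    class Account {
        private int balance;
        Account(int balance) { this.balance = balance; }
        void withdraw(int amount) { balance -= amount; }
        int getBalance() { return balance; }
    }

Because the step text mirrors the real-world examples in the acceptance criteria, stakeholders can read the scenarios directly and confirm the right product is being built.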

UI Tests

The User Interface is the most volatile part of the application and should not contain any business logic. For these reasons, the least emphasis on automated testing should be here. That does not mean there is no testing: I like to automate the key user journeys through the system using one of the UI testing frameworks, e.g. WebDriver. UI testing demonstrates that all the subsystems are talking to each other.
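
By way of example, here is a minimal sketch of an automated key user journey using Selenium WebDriver in Java. The site, the element names and the page title are all hypothetical:

    import org.openqa.selenium.By;
    import org.openqa.selenium.WebDriver;
    import org.openqa.selenium.chrome.ChromeDriver;

    // A key user journey: search a hypothetical site and check that
    // the results page appears, proving the subsystems talk to each
    // other end to end.
    public class SearchJourneyTest {

        public static void main(String[] args) {
            WebDriver driver = new ChromeDriver();
            try {
                driver.get("https://www.example.com");
                driver.findElement(By.name("q")).sendKeys("testing pyramid");
                driver.findElement(By.id("search-button")).click();
                if (!driver.getTitle().contains("Results")) {
                    throw new AssertionError("Search journey failed");
                }
            } finally {
                driver.quit();
            }
        }
    }

Note how the test asserts only that the journey completes; the detailed business logic has already been covered lower down the pyramid, keeping this layer thin.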

I use manual testing for the look and feel, checking that the UI behaves as expected, and exploratory testing to find those hidden nuances.

Test Delivery

Unit tests and Service tests should be delivered within the iteration by the delivery team. As part of their development practices, developers write the unit tests when building the functional code. Ideally a test-first, TDD approach should be used.

My conclusion?

The Testing Pyramid inverts the traditional approach to testing. The focus is at the beginning of the development process, with the developer taking responsibility for the quality of their code. This is a very different way of looking at the problem from the traditional approach, where code is handed over to the tester, who is then assumed to be responsible for its quality.

The early identification of defects gives two major business benefits:

  • Issues are discovered early in the development process, reducing the cost of defect fixing associated with late discovery
  • Issues are resolved early, negating the need to postpone the production release because of problems identified late

References

http://www.mountaingoatsoftware.com/blog/the-forgotten-layer-of-the-test-automation-pyramid

http://martinfowler.com/bliki/TestPyramid.html

http://www.agilecoachjournal.com/index.php/2014-01-28/testing-2/the-agile-testing-pyramid/

UX Designers HQ: UX Hackathon

Have you ever woken up in the morning and wondered “Did that actually happen or was I dreaming?” That was my exact thought as I groggily arose from my warm bed recently, unsure of whether or not I had attended a game-changing event in the world of UX. It wasn’t until I scrolled through the abundance of notifications on my iPhone with the tag #CFUXHACKATHON attached that I realised not only had the event actually happened, right here in central London, but also that I was not alone in my feeling of fulfilment.

Many members of the digital community have walked away from an event with a UX focus still with so many questions left unanswered. UX is such a complex discipline that it is hard to keep up with, or even retain, the information being delivered on stage by the UX thought leader of the moment. The digital community in the UK has long awaited an event from which they can leave and go to bed knowing more, understanding more and being more than they were when they woke up that morning. Just as hope was slowly beginning to drift away, along came the meetup group UX Designers HQ: London. The organisers (Career Foundry) promised an intense, knowledge-filled six hours of UX-iness in the form of a UX Hackathon, believed to be the first of its kind and scale held in London. Was this the event that the community had been looking for? We all waited in anticipation for the date of the six-hour UX Hackathon to be announced: Wednesday 25th February 2015 at 6pm.

6pm midweek? Haven't we all got work in the morning?

Despite the fact that it was a work night, 100+ members of the digital community arrived, representing some of the biggest technology companies globally and bringing all levels of ability, knowledge and experience in UX.

The Career Foundry team had asked me and six other UX professionals to mentor the twelve teams during the hackathon, based on our experience practising UX in specialist areas related to the briefs that the teams would be working from. The seven of us also formed the expert judging panel, where we provided critique and scored the final presentations from our UX Hackers.

Mentors/Judges:

  • Jay Tulloch – UX Designer at Sopra Steria
  • Yael Levey – Senior UX Designer at the BBC
  • Sandra Sears – UX Designer for TalkTalk
  • Andy Iosifescu – Freelance Interaction Designer
  • Neil Sampson – Professional UX Designer
  • Paola Miani – Senior User Experience Consultant at IG
  • James Walters – UX Lead at Open Inclusion

The night flowed extremely smoothly with the teams getting acquainted and well and truly stuck into the tasks at hand.

The Hackathon consisted of five stages:

  1. User research and prep
  2. User testing 1: User Interviews
  3. Divide, coordinate & conquer: value proposition, user flows and information architecture
  4. User testing 2: paper prototyping
  5. Iteration & pitch

As a mentor, it was my job to add value to the teams and help direct them through their design brief, providing them with the in-depth UX knowledge and methodology required for them to really understand the needs and goals of their ideal target users. I soon found myself being called to different tables to provide insight and expertise. This was great! It meant that the information I was sharing not only made sense but was genuinely valued. By the end of the night, all twelve teams had confidently presented their final designs to us judges, and there was an uproar of applause from the audience for everyone involved.

The feedback from the attendees, both in person and online, has been incredible: thanking Sopra Steria for sharing our expert knowledge and experience in the UX field, which they found invaluable during the tasks. Participants and mentors alike displayed a keen interest in the great work that we are all doing in digital transformation at Sopra Steria. The organisers have been widely praised for the event, and they are planning a full three-day UX Hackathon in nine months' time for over 300 participants! This is another event that could provide Sopra Steria with the opportunity to further increase our influence as thought leaders, as we continue to make our own transition from the New European Leader to the New Global Leader in Digital Transformation.

Read on for a timetable of events.

STAGE ONE: USER RESEARCH

Competitor Analysis

Teams researched three competitors who shared the same goals as the team's business concept. It was important for the teams to understand their competitors in order to begin to form a picture of their target users. I asked the teams to think about:

  • Who are their competitors communicating to?
  • How are their competitors communicating?
  • Why are their competitors communicating in this way?
  • What is their competitors' message?
  • Is there a story behind their business?
  • Do their competitors' values match theirs?

User Personas

After identifying three of their closest competitors, the teams were asked to create a single persona describing the ideal target user of their product or service. It was important for the teams not to be distracted by the look, age and name of their persona. They needed to look deeper into who the person actually was and think about:

  • What are their interests?
  • Who do they socialise with?
  • Where do they socialise?
  • What are the motivations for using their product/service?
  • Goals: what does their persona want to be or do?
  • Is this why they are using the product/service?

Card Sorting

Card sorting was a fun exercise that really brought the teams together. They rolled up their sleeves, got stuck into the task and were asked to maintain focus on their vision for the product. Once they had grouped their ideas into categories, they carefully prioritised the features, ideas and pages that were must-haves for the MVP of their product/service.

STAGE TWO: USER INTERVIEWS

The teams went out into the big wide world (the table next door) to find representatives of their target audience and ask them questions based on the team's assumptions about how their users would interact with the product or service. The task was very insightful, and the teams soon realised that the answers they received were not those they had first expected.

STAGE THREE: BRANDING, USER FLOWS AND IA

A huge task which required the teams to be very organised. Based on the data obtained from the user interview stage, the teams went about creating the optimum user journey through their proposed product. From this, they could then begin to develop the information architecture and place in it the features and ideas they had prioritised for the MVP. When constructing the IA, I discussed with the teams the importance of how information is delivered to the user through content (labelling, hierarchy, tagging, grouping), which allowed them to question some of their own decisions and assumptions, and provided a starting point for the next round of user testing.

STAGE FOUR: USER TESTING w/PAPER PROTOTYPES

Another fun task, but one of the most valuable. The teams tested the paper prototypes of their proposed user journeys and interactions with users from other teams. None of the teams got the design right first time, and that was OK, because they gained invaluable insight into how their users will actually use their products and what their expectations really are.

STAGE FIVE: ITERATION AND PITCH

The teams now had the opportunity to refine their designs before the final presentation; it was critical to the success of their products that they utilised the helpful feedback obtained from the user testing stage. The data from the testing would help them direct their iterations towards the needs of their ideal target users.

Once they were satisfied that their final design was a perfect fit for the needs of their users, the teams began to organise how they would pitch their vision to the audience and, more importantly, the judges.

All twelve teams pitched extremely well and delivered against the goals of the brief. Some of the pitches were long, some were short, some were fun… and some not so much. At the end of the night there was not a single person without a smile, and there was a brief moment where everyone could see the satisfaction in the faces of their team mates. The whole room congratulated one another, and it was clear that the night was one none of us would ever forget.

FEEDBACK

Apologies in advance for the "cringe factor" of the following images. I feel strongly that this feedback is important for us to see: it does not only reflect the work that I did on the night, it is also reflective of the awesomeness of us as a UX and Innovation team at Sopra Steria, and of the work we have all done over the last two years, building our Digital Practice from the ground up to get us into spaces such as this one.