The Missing Information Story
Huw Alexander looked at the data from the perspective of suppliers who wanted to use the objective data from audit results to prioritise improvements in their accessibility information. Meanwhile, Alistair McNaught was intrigued by the variability in the scores - some publishers had a range of scores that equalled or exceeded their mean.
Was this inconsistent auditing? Or was something more fundamental at play?
Making changes to a product is a long-term project. Making changes to product information is quick and easy.
Everyone recognises that making changes to the accessibility of your product is a medium- to long-term goal. It has co-dependencies with other priorities and may involve changes to workflows, processes and quality assurance. None of this is trivial.
By contrast, improving the accessibility statement may take an afternoon and have an immediate beneficial impact on end users.
Three key findings emerged from the ASPIRE data:
Most publishers provide far less accessibility information than a university or college requires in order to support their learners efficiently.
Most publishers are missing a significant marketing opportunity because their digital products almost invariably have accessibility benefits (compared to hard copy print) that could – and should – be marketed.
Even when a supplier has an accessible product and useful accessibility information, the information is often hard to find or is spread across unrelated parts of the website, creating a big variability in end user experience.
By downloading the raw data in Excel format, publishers can explore the scores they achieved for the different accessibility criteria. This is an easy way to prioritise the information you want to provide to end users. The graph below shows, from left to right, the key ‘missing information’ that depressed scores across most publishers.
In rank order, the missing information was: copying and printing limits, file navigability via a tagged structure, response times to accessibility inquiries, image descriptions, magnification and reflow compatibility, digital rights management (via third-party distributors), currency of accessibility information, licence terms (including the level of student detail needed and recognition of accessible Intermediate Copies), alternative format DRM restrictions, available formats and engagement with accessibility services.
These were missing in more than 80% of audited suppliers. Even the most basic contact information and recognition of accessibility were absent in 60% and 63% of suppliers respectively.
All suppliers should have access to this ‘missing information’. Some of it will come from their production team and some from the rights department, but the critical thing is for it to be communicated to end users or those supporting them.
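To see how a publisher might use the downloaded data to set priorities, here is a minimal sketch. The criterion names and scores below are purely illustrative, not real audit data, and the spreadsheet's actual layout may differ: the point is simply that ranking criteria by average score surfaces the biggest information gaps first.

```python
# Hypothetical example: rank audit criteria by average score so the
# lowest-scoring (most 'missing') information can be prioritised.
# All names and numbers here are invented for illustration.

criterion_scores = {
    "Copying and printing limits": [0, 0, 1],
    "Image descriptions": [0, 1, 1],
    "Contact information": [2, 1, 2],
    "Available formats": [0, 0, 0],
}

# Average each criterion across audits, then sort ascending:
# the lowest averages are the gaps to fix first.
averages = {name: sum(scores) / len(scores)
            for name, scores in criterion_scores.items()}
priorities = sorted(averages, key=averages.get)

for name in priorities:
    print(f"{name}: {averages[name]:.2f}")
```

Fixing the top of such a list is often the quickest win, since providing missing information is far faster than changing the product itself.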
Figure 1: Average missing scores by criteria.
The way accessibility information is organized and presented can have a significant impact on the end-user experience - and the ASPIRE score.
There was significant variability around the average in some of the scores. This was initially thought to be due to the crowd-sourced nature of the data and the inherent variability between auditors making value judgements.
However, looking at the website URLs of publishers with very variable scores suggests a different explanation. In Figure 2 below, the example of Elsevier illustrates several salient points:
Figure 2: Variable scoring
The two lowest marks were for URLs where no accessibility information was found. In this case the auditor went to the main website and found no signposting to accessibility information. This resulted in a score of zero. The highest score (16) was for a URL found via a Google search.
Two auditors audited exactly the same URL, but one marked more generously (or the other more stingily!), resulting in scores of 5 and 12 for the same information. We expected there to be variation, which is why we set up a system to ensure, as far as possible, that multiple audits were done for each supplier.
Elsevier's Science Direct site has an innovative and highly effective approach to providing key accessibility information. However, some of the information that library staff had deemed important was still missing. It could be argued that Elsevier focused on information more relevant to the end-user than the library, but the project scored on the basis of both.
The Impact of Disaggregated Information
Where different URLs lead to different levels of information, scores may be significantly depressed. Figure 3, below, illustrates the difference in scores when the range around the mean is taken into account.
For simplicity, we have assumed that the range is equally distributed around the mean. This is not always true, but it is a close enough approximation to demonstrate the point: consolidating and signposting only your best information would give a much better score.
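The arithmetic behind this can be sketched with the scores mentioned in the Elsevier example above (two zeros where no signposting was found, 5 and 12 for the same URL, and 16 for the best page). Treating these as one publisher's set of audit scores is an assumption for illustration only:

```python
from statistics import mean

# Illustrative scores: different auditors reached the publisher via
# different URLs and scored the information they found there.
scores_by_url = [0, 0, 5, 12, 16]   # two URLs had no signposting at all

current_mean = mean(scores_by_url)   # depressed by the zero-scoring URLs
best = max(scores_by_url)            # the best accessibility page found

# If every route signposted that single best page, all audits would find
# the same information and the mean would rise to the best score.
consolidated_mean = mean([best] * len(scores_by_url))

print(current_mean, consolidated_mean)
```

Here the unconsolidated mean is 6.6 against a potential 16, showing how disaggregated information drags down a score that the publisher's best page already earns.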
Figure 3: The range around the mean
Use your own scores in the Publishers&files-Scores tab of the downloaded spreadsheet (or the ‘commonly missing’ information in figure 1) to identify areas where you can provide better information.
Search your own website for accessibility information on your product (don’t get mixed up with accessibility information about the website!).
Check that information of differing quality isn’t scattered across different places.
Rationalize the accessibility information that is available so that it is easy to find and comprehensive.
Remember that even small changes can make a big difference to your score. Be honest about the work you are doing and tell your accessibility story.