
ASPIRE stories


The Student Perspective Story


This case study by Annette Moore, Content Delivery Manager at the University of Sussex, summarizes the Platform and Aggregator scores for the accessibility 'Feature Set' questions in the final part of the audit form.


Consider this a 'thought experiment': if a product's accessibility were judged solely by what its published accessibility information tells users, would a disability-savvy library purchase it?


The ‘Feature Set’ questions consisted of seven accessibility features central to providing a better reading experience for students with print impairments. The features and their beneficiaries are listed below:


  1. Image descriptions (visual impairment, some specific learning difficulties),

  2. Navigation and reading order (all print impairments),

  3. Screen reader compatibility (blindness),

  4. Text-to-speech compatibility (visual impairment, concentration difficulties and dyslexia),

  5. Magnification and reflow (visual impairment and dyslexia),

  6. Colors and contrast options (visual impairment and dyslexia),

  7. Response times to an accessibility query (all).


The questions in this section of the audit form score the Platform or Aggregator on the INFORMATION provided about the accessibility features or services described.

“In practice, the platform may offer accessibility features that are not described fully in the accessibility statement, but from the perspective of the Student Experience, it is important that the user is informed - before accessing the platform - of the range of accessibility features available.”

The scoring, therefore, is based ONLY on information provided in the accessibility statement. There were four options for scoring under this section: Poor = 0, Fair = 1, Good = 2 and Excellent = 3. With seven features each scored out of a maximum of 3, the maximum total score for the 'Feature Set' questions was 21. Where no accessibility statement could be found, a score of 'Poor = 0' was given.
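To make the arithmetic concrete, here is a minimal sketch (in Python, not the audit's own tooling) of how a single platform's 'Feature Set' total and percentage could be calculated from the seven ratings; the example ratings are hypothetical, not taken from the audit data.

```python
# A minimal sketch of the 'Feature Set' scoring described above: seven
# features, each rated Poor/Fair/Good/Excellent (0-3), giving a maximum of 21.

RATING_VALUES = {"Poor": 0, "Fair": 1, "Good": 2, "Excellent": 3}

FEATURES = [
    "Image descriptions",
    "Navigation and reading order",
    "Screen reader compatibility",
    "Text-to-speech compatibility",
    "Magnification and reflow",
    "Colors and contrast options",
    "Response times to an accessibility query",
]

MAX_SCORE = len(FEATURES) * max(RATING_VALUES.values())  # 7 x 3 = 21


def feature_set_score(ratings):
    """Return (total, percentage of maximum) for one platform's ratings.

    Any feature with no information is treated as 'Poor' = 0, matching the
    rule that a missing accessibility statement scores zero.
    """
    total = sum(RATING_VALUES[ratings.get(f, "Poor")] for f in FEATURES)
    return total, 100 * total / MAX_SCORE


# Hypothetical example: good documentation for most features, nothing said
# about response times (so that feature defaults to Poor = 0).
example = {
    "Image descriptions": "Good",
    "Navigation and reading order": "Excellent",
    "Screen reader compatibility": "Excellent",
    "Text-to-speech compatibility": "Good",
    "Magnification and reflow": "Good",
    "Colors and contrast options": "Fair",
}

total, pct = feature_set_score(example)
print(f"Feature Set score: {total}/{MAX_SCORE} ({pct:.2f}%)")  # 13/21 (61.90%)
```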

Top Scorers

Out of a possible maximum score of 21 for the 'Feature Set' questions, only four platforms scored over 50%, highlighted in green in the table below. These were EBSCO eBooks (76.19%), Kortext (71.43%), VitalSource (65.71%) and Cambridge Core (53.97%). Of the remainder, six platforms scored between 25% and 50%, and 44 platforms scored less than 25% of the maximum score.


Figure 1: Top 10 - Percentage of maximum score obtained for ‘Feature Set’

The URL Lottery

Each Platform or Aggregator was audited several times by different reviewers, and the 'Range' and 'Number of Audits' were recorded for each one. The 'Range' is the difference between the lowest and highest total scores for the 'Feature Set'. A separate analysis of the data showed that the biggest variation in the range came from the URL at which the auditor found the information.
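As a rough illustration of how the 'Range' and 'Number of Audits' can be derived when several reviewers audit the same platform, here is a short sketch; the platform names and totals below are invented, not the audit's data.

```python
# Illustrative only: derive 'Number of Audits' and 'Range' (highest minus
# lowest Feature Set total) for each platform from individual audit results.
from collections import defaultdict

# (platform, Feature Set total) pairs - invented example data
audit_totals = [
    ("Platform A", 16), ("Platform A", 9), ("Platform A", 14),
    ("Platform B", 4), ("Platform B", 5),
]

by_platform = defaultdict(list)
for platform, total in audit_totals:
    by_platform[platform].append(total)

for platform, totals in by_platform.items():
    print(f"{platform}: {len(totals)} audits, range {max(totals) - min(totals)}")
```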


Average scores were depressed where accessibility information was fragmented across different URLs. Some platform providers could raise their score immediately by ensuring their 'best' accessibility URLs are clearly signposted and their worst are simply retired!


Figure 2: Top 10 - Range + Number of Audits

Reporting + Currency

Although a few suppliers reported on a good range of accessibility features, the averages were very disappointing, dragged down by the complete absence of information from many suppliers. If we take all the scores across all the platforms and convert them to a percentage of the total score available for each accessibility feature, we get the following:


Figure 3: Accessibility features - total score as a percentage of maximum possible score.

Figure 3 shows that screen reader compatibility scores highest for reporting, but at only 19%. Overall, the 'average' platform lost 89% of the possible marks. Navigation and reading order was the next highest, but still scored only 17% of the possible marks.


All the other features were reported at between 10% and 15%, except for response times, which scored a mere 7.5%.
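The percentages behind Figure 3 come from a simple normalisation: each feature's combined score across the 54 platforms, divided by the maximum available for that feature. Below is a sketch under the assumption of one (averaged) 0-3 score per platform per feature, using illustrative totals rather than the real audit data.

```python
# Sketch of the Figure 3 normalisation. Assumes one (averaged) score of 0-3
# per platform per feature; the totals below are illustrative only.

NUM_PLATFORMS = 54                       # platforms/aggregators audited
MAX_AVAILABLE = NUM_PLATFORMS * 3        # 162 marks available per feature

feature_totals = {                       # invented example values
    "Screen reader compatibility": 31,
    "Navigation and reading order": 28,
    "Response times to an accessibility query": 12,
}

for feature, total in feature_totals.items():
    print(f"{feature}: {100 * total / MAX_AVAILABLE:.1f}% of the maximum possible")
```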


As mentioned above, it is important to note that there may be a difference between the information about the feature set within the Accessibility Statements and the availability of those features on the actual platform. In this context, the answers to the audit question regarding 'Currency of Accessibility Information' are relevant. The three categories for this question were:


•    Last updated within the last year – score 2.
•    Last updated more than a year ago – score 1.
•    No information provided – score 0.


Surprisingly, 43 of the 54 platforms or aggregators had an average score of less than 1; 7 scored an average of between 1 and 1.5, and 4 scored an average of greater than 1.5.

Conclusion

  • Suppliers undersell their accessibility due to generally poor accessibility statements.


  • Disabled students looking for information receive significantly less help than they might expect.


  • Few suppliers seem to have a process for updating their accessibility information.

Full Breakdown by Feature

The 'Feature Set' part of the audit questionnaire focused on the core concerns of a disabled student. It covers a subset of the data, so the scores here do not represent the total scores. The rank order is, nonetheless, very similar to the final ranking, for an obvious reason: a supplier giving good information on core accessibility criteria is likely to give good information on secondary criteria as well.


The final ranking is shown in the chart below, where the maximum score for this subset of the data is 21.


Note that a low score does not imply poor accessibility, merely poor information about accessibility. But this raises a final question...

“Is it acceptable that a purchasing institution should be unable to determine the accessibility of a product without first paying for it and then devoting staff resource to testing it?”

The final ranking is shown below, with only four suppliers - EBSCO, Kortext, VitalSource and Cambridge (Cambridge Core and Cambridge Companions Online) - gaining more than 40% for the accessibility information they provided.


Figure 4: Average Scores by Platform for Accessibility Feature Set.
