The Design for All Research Group at Middlesex University has produced a report, "Declaring conformance on web accessibility", asking the question: can website accessibility declarations be trusted?
Sadly, the conclusion was no, for both self-declared and third-party certifications, confirming the findings of earlier studies. In a sample of 100 European government and commercial sites claiming conformance to accessibility standards, more than 95% were found to have accessibility issues. The study used our automated tool, SortSite, in conjunction with manual testing performed by the accessibility group at the Shaw Trust (see the report for details of the methodology).
The results on accessibility conformance mirror what we see with sites claiming to be valid HTML. About 30% of sites displaying the W3C "Valid HTML" and "Valid XHTML" badges fail validation. Although false validation claims are a smaller proportion than false accessibility claims, they are more surprising, since:
- The validation test is fully automated, so it should be easy to run
- Using the W3C "Valid HTML" badge on invalid pages is a breach of the W3C's terms of use
In practice, ensuring an entire site is accessible and conforms to standards is tough to do manually. Even a medium-sized site with 10,000 pages takes over 200 working days to test by hand: at 10 minutes per page for accessibility and web standards checks, that's 100,000 minutes, or roughly 1,667 hours, which works out to about 208 eight-hour days. To underscore the point, the original version of the report was published as an untagged PDF, making it hard to use with a screen reader. A quick run through an accessibility checklist or automated checking tool could have prevented this.
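The effort estimate above is easy to reproduce. Here is a minimal sketch of the arithmetic, using the same illustrative assumptions (10 minutes per page, 8-hour working days); the function name and defaults are ours, not from the report:

```python
def manual_test_days(pages: int, minutes_per_page: float = 10,
                     hours_per_day: float = 8) -> float:
    """Estimate the working days needed to manually test every page.

    Assumptions (illustrative, matching the article's example):
    10 minutes of accessibility and standards testing per page,
    8 working hours per day.
    """
    total_hours = pages * minutes_per_page / 60
    return total_hours / hours_per_day

if __name__ == "__main__":
    # A medium-sized 10,000-page site:
    print(f"{manual_test_days(10_000):.0f} working days")  # → 208 working days
```

Even halving the per-page time only brings this down to about 104 days, which is why automated tooling is a practical necessity at this scale.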