The Disability Rights Commission (DRC) has reacted with hostility to findings showing that it fails basic accessibility compliance requirements, and its response adds confusion to an already confused market.
For a copy of the full report, please email email@example.com.
The Disability Discrimination Act (DDA) requires businesses and organisations to make their websites accessible to all users, particularly disabled people. Yet the DRC, the Royal National Institute for the Blind (RNIB) and the Royal National Institute for the Deaf (RNID), the supposed standard bearers for website accessibility, continue to fail even the most basic A/AA requirements. The findings and reports produced by SiteMorse showed, for example, that the DRC’s website has repeatedly failed both A and AA over the last few months.
The DRC’s chairman, Bert Massie, wrote a sharply worded letter, published in the Guardian in May 2004, in which he claimed that the tests had “caught them on a bad day; the error was not on their own site but that of a sponsor…” The letter explained why the DRC felt that the tests were flawed. SiteMorse then re-ran its checks and found problems as basic as the image description on the letter from the organisation’s own chief executive, Bob Niven.
Patrick Edwards, Head of Media at the DRC, gave me a hostile reception when interviewed about the test results. He seemed to place usability testing through human interaction above web accessibility compliance. He caused confusion when he implied in a telephone conversation that there are no legal standards for website accessibility, even though a number of DRC speeches and documents (including comments made in the Guardian) clearly state that organisations and businesses have a legal duty to make their sites accessible to the UK’s 8-10 million disabled people.
Nicholas Le Seelleur, director of SiteMorse, said: “The DRC made no comment about the number of other problems identified across their site, including broken links, pages that do not work correctly and missing images.”
Massie even said in April 2004: “Organisations that offer goods and services on the Web already have a legal duty to make their sites accessible.” Transcripts of speeches made at Islington’s Business Design Centre at that time on ‘The Web: Access and Inclusion for Disabled People’ emphasise the legal duty of organisations and businesses to ensure that their websites are accessible, or they could potentially face prosecution for discrimination under the DDA.
Now I wouldn’t claim to be an expert, but Edwards’ comments and tone during the interview and in an email that followed the conversation were shocking and puzzling. “You asked whether there were legal standards for web site accessibility and there are not”, he later said, adding: “There are duties under part 3 of the Disability Discrimination Act 1995 (the 1999 duties) which require services provided over the web which make it unreasonably difficult for disabled people to receive a service to make reasonable adjustments. The law does not spell out what these adjustments are.”
“There are web guidelines for accessibility and I expressed the view, also supported in our report, that meeting technical access requirements was not sufficient in itself to guarantee that the disabled people could actually use the sites. Our formal investigation revealed that where minimal access standards were achieved in line with ‘A’ level registration – the lowest, there were unfortunately few sites that achieved a standard above this – significant numbers of disabled users still had problems using the site. I made it clear to you that involving disabled people in testing sites allied to meeting technical compliance was a surer way of ensuring that sites were able to be used by disabled people.”
He also claimed that SiteMorse’s reports on the DRC’s site were erroneous, and that they actually reflected an external site that was only linked from its own. However, when asked to show what the errors in the reports were, he refused to provide further information or comment until further evidence was supplied. SiteMorse says it will be sending this to him in due course.
Le Seelleur commented: “The Child Support Agency and many others have found that our tools have helped them to achieve higher levels of web accessibility compliance; our tests and reports have proven themselves to be very accurate and even more accurate than those of rival tools like Bobby.”
Access all areas?
“A small travel agent operating from a first storey premises may never be required to put in a lift. It may simply be too expensive - or indeed not physically feasible. But they will need to look at providing an alternative means, such as an accessible website or telephone service.”
The Guardian, Social Care Comment, Friday October 1, 2004
The RNIB: Keen to Improve
The RNIB, which was more welcoming, appeared keener to get to the bottom of the problem. It also rates usability highly, suggesting that usability and accessibility are of equal importance and two sides of the same coin. Its Press Officer, Bill Alker, supports a combined approach to testing and believes that user evaluation is ‘under-utilised’. He said that his organisation’s site “conforms to the RNIB's ‘See it Right’ accreditation, which is based on the WAI Web Content Accessibility Guidelines (see http://www.rnib.org.uk/seeitrightaudit for more details).”
Alker advises that companies and organisations should “involve their potential users of their web site in testing designs from the earliest stage, including disabled users. Frequent user testing can ensure 'real usability' for disabled people, and the results are much more useful than those produced by using automated testing tools alone.” Great emphasis is placed on ‘likely human behaviour’. He also claimed that automated tools cannot test the appropriateness of ALT text descriptions.
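That limitation is easy to demonstrate. A minimal sketch in Python (using hypothetical markup, and nothing resembling SiteMorse’s or anyone else’s actual tooling) shows what an automated check can and cannot do: it can flag an image with no ALT attribute at all, but it will happily pass an ALT text that is present yet useless to a screen-reader user.

```python
from html.parser import HTMLParser

class AltTextChecker(HTMLParser):
    """Collects <img> tags that lack an alt attribute entirely.

    This mirrors the limits of an automated WCAG checkpoint 1.1 test:
    a *missing* alt attribute is detectable, but whether a present
    alt text actually describes the image needs human judgement.
    """
    def __init__(self):
        super().__init__()
        self.missing_alt = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            if "alt" not in attrs:
                self.missing_alt.append(attrs.get("src", "(no src)"))

checker = AltTextChecker()
checker.feed('<img src="logo.png">'
             '<img src="chart.png" alt="Sales chart for 2004">'
             '<img src="photo.jpg" alt="image">')
print(checker.missing_alt)  # → ['logo.png']
```

Note that the third image passes the automated check even though “image” tells a blind user nothing, which is precisely the gap user testing is meant to close.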
Confused Definitions and Poor Awareness
Both the DRC and the RNIB doubt the validity and accuracy of automated testing, even though SiteMorse – a specialist in this field – has conducted research and received client feedback showing that clients have made significant improvements to their sites since using the company’s web accessibility testing tools.
Questions are also raised about whether the standards are fully understood, whether there is sufficient awareness of the need for website accessibility compliance, and whether the definitions of usability and accessibility are clear. The two terms often seem to be used interchangeably, and SiteMorse wonders whether web accessibility compliance providers and service provision companies fully understand them either. This limited understanding and the confusion over the requirements make it difficult to tender for developments.
“We receive a high volume of complaints from organisations who are confused by, or who feel they have been misled by, the results produced by some automated tools. The RNIB agrees that there is no single test for web accessibility. Automated checking tools fail to address the user experience in a meaningful way, and may produce false positive and false negative results,” Alker said. He recommends a combined approach to assessing the accessibility of websites, while emphasising that “compliance to the WAI’s guideline is crucial”.
Bobbie Johnson also writes in The Guardian, “The Disability Discrimination Act was enacted five years ago, but widespread confusion over its terms meant many people thought it did not affect websites until this October. In fact, web accessibility has been part of the law all along.”
Unwittingly Misled By Providers?
A number of organisations and companies claim to achieve web accessibility compliance. SiteMorse’s stringent monthly tests, league tables and in-depth reports, however, often show that they actually fail even some of the most basic requirements. Sometimes this is down to the tool they use; many tools only scratch the surface and can, as a result, provide misleading data. Some web design and development agencies, designers, and accessibility compliance and service provision companies may also unwittingly mislead their clients.
There is certainly a case for more web designer and developer training in this area, because their awareness of the issues may be limited. Yet sometimes they may simply ignore, or be required to ignore, web accessibility requirements in favour of a more creative design, and therefore fail to test their designs and developments for compliance. However, they should be aware that there is an economic, financial and legal case for making sure that disabled people can access their websites.
Bobbie Johnson explains in his Guardian report: “The moral imperative may now be a legal compulsion, but many businesses are beginning to understand accessibility issues in a way that is more familiar to them: money. There are an estimated 9m people in Britain suffering from some form of disability, and such figures get the attention of marketing men and bean counters. No company can afford to alienate or exclude an audience of that magnitude.”
Evidence: The League Tables (available on request; please email firstname.lastname@example.org)
It is interesting to note that in SiteMorse’s league table for January 2005, ‘Testing and Ranking of the Accessibility Compliance and Service Provision Companies’, Red Ant and Nomensa are the only ones to pass the automated tests checking for A and AA compliance – demonstrating that building compliant websites is achievable. Sitting right at the bottom of the league is FHIOS.
The top site in the report was Red Ant, which achieved full compliance for accessibility (based on the automated tests). Gavin Massey, Head of Accessibility at Red Ant, commented:
“We are obviously very pleased with this result. SiteMorse testing is a key part of our development process. We have always had the policy 'to practice what we preach' and have found SiteMorse reports to be essential when aiming for WCAG AA+ for the mandatory 'valid HTML' testing. Other popular products just don't offer this.”
“While the importance of automated testing is clear, it does not guarantee true accessibility until it's put to the test by people with disabilities. Red Ant works with a number of disabled Internet users to ensure our websites offer the highest levels of accessibility to everyone. We encourage the competition and awareness these league tables bring.”
The table below clearly shows the DRC in tenth place, having failed both A and AA levels of compliance; its site also failed both modem and ADSL download speed tests, although it passed the meta data tests. Below the DRC is the RNIB in twelfth position, having failed the A/AA and meta data tests, although it managed to pass the modem and ADSL tests. The RNIB’s site is let down by a slower Rank Response, defined as the average time taken by the server to respond to a request.
They Could Do Better
The DRC, RNID and RNIB should at least do more to pass the basic levels of web accessibility compliance. They are, after all, supposed to be the exemplary organisations promoting the issue, and the DRC expects to see a number of test cases going through the courts at some point. With this in mind, Edwards should further clarify his position on the legal side of web accessibility.
While there may be no legal duty to make a site usable, there is a legal duty to make websites accessible to disabled people. Those that ignore this could find themselves in court accused of discrimination. Compliance is not just a moral imperative: poor accessibility has a financial and economic impact too.
Tests of the organisations’ own sites include:

www.open4all.org – failed A (Priority 1)
- Tested 3rd June – failed A/AA
- Failing A from September ’04 to the latest test on 28th December
- Tested 17th January, 14.45 – failed A/AA

www.disabilityaware.org – failed A (Priority 1)
- Reached A compliance in mid-May 2004; failed AA (Priority 2) on every test to date

www.drc-gb.org – failed A (Priority 1)
- Tested every month, with summaries emailed in November, December and January
- In line with the letter, failed A on 15th April for two weeks
- Failed AA (Priority 2) on every test to date
- Tested 17th January, 14.35 – failed A/AA

Summaries have been sent at no charge, and the organisations have been given an open offer a number of times to review and use the product to improve their own compliance.
About SiteMorse™ (www.SiteMorse.com)
SiteMorse™ is a leading automated website testing service with unique website diagnostic testing, monitoring and reporting facilities.
SiteMorse™ technology can provide detailed diagnostic reports for websites, covering function and performance, HTML compliance checking, availability monitoring, brand, corporate and company compliance, and accessibility testing.
By acting in the same way as a user’s browser, SiteMorse™ requests every combination of every item from every page on a web server. The results are identified exactly by page, line, type and link to produce exact details of problems or failures. SiteMorse™ technology is non-site specific and requires no downloads, configuration or technical support services.
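As a rough illustration of that browser-style crawling approach (a sketch only, not SiteMorse’s actual implementation), the following Python fragment walks a hypothetical in-memory “site” the way a browser follows links and image references, reporting anything that points at a page which does not exist – the broken links and missing images the league tables flag.

```python
from html.parser import HTMLParser

# Hypothetical in-memory "site": path -> HTML body. A real checker
# would fetch these over HTTP, exactly as a user's browser does.
SITE = {
    "/": '<a href="/about">About</a> <a href="/missing">Broken</a> <img src="/logo.png">',
    "/about": '<a href="/">Home</a>',
    "/logo.png": "(binary image data)",
}

class LinkCollector(HTMLParser):
    """Gathers every link href and image src from one page."""
    def __init__(self):
        super().__init__()
        self.refs = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "a" and "href" in attrs:
            self.refs.append(attrs["href"])
        if tag == "img" and "src" in attrs:
            self.refs.append(attrs["src"])

def crawl(site, start="/"):
    """Visit every reachable page once; return (page, ref) pairs that 404."""
    seen, queue, broken = set(), [start], []
    while queue:
        path = queue.pop()
        if path in seen or path not in site:
            continue
        seen.add(path)
        collector = LinkCollector()
        collector.feed(site[path])
        for ref in collector.refs:
            if ref not in site:
                broken.append((path, ref))  # dead link or missing image
            elif ref not in seen:
                queue.append(ref)
    return broken

print(crawl(SITE))  # → [('/', '/missing')]
```

A production tool would of course also record the line number and item type for each failure, as the article describes, but the exhaustive follow-every-reference loop is the core of the technique.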
SiteMorse™ produces independent monthly website rankings in sectors including the FTSE 100, Central and Local Government, the NHS, Banking and Finance, Solicitors and Insurance firms. SiteMorse™ automated web testing has been used in the annual ‘Better Connected’ report by the Society of Information Technology Management (SOCITM) and is the benchmark for all Local Authority websites.
Lastly, does the DRC have the clout to enforce greater accessibility when it seems to place usability above everything else, and while it continues to fail basic levels of compliance? This is not what a standard-bearing organisation should do. Both organisations could do better.
Published on: 12:00AM on 21st January 2005