This paper highlights the need for quantitative accessibility measurement and proposes three application scenarios in which quantitative accessibility metrics are useful: Quality Assurance within Web Engineering, Information Retrieval, and accessibility monitoring. We propose a quantitative metric that is automatically calculated from the reports of automatic evaluation tools. To assess the reliability of the metric, 15 websites (1,363 web pages) are measured using the results yielded by two evaluation tools: EvalAccess and LIFT. Statistical analysis of the results shows that the metric is dependent on the evaluation tool. However, Spearman's test shows a high correlation between the results of the different tools. We therefore conclude that the metric is reliable for ranking purposes in the Information Retrieval and accessibility monitoring scenarios, and can also be partially applied in a Web Engineering scenario.
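The correlation analysis mentioned above can be sketched with Spearman's rank coefficient, which compares how two tools *order* the same sites rather than the raw scores they assign. The sketch below uses hypothetical per-site scores (the paper's actual EvalAccess and LIFT data differ) and assumes no tied values:

```python
# Hedged sketch: Spearman's rank correlation between accessibility scores
# produced by two tools for the same set of sites. Scores are invented for
# illustration; the paper's measured data are not reproduced here.

def spearman_rho(xs, ys):
    """Spearman's rho via the rank-difference formula (assumes no ties)."""
    n = len(xs)
    rank = lambda v: sorted(range(n), key=lambda i: v[i])
    rx = [0] * n
    ry = [0] * n
    # Assign each site its rank position within each tool's score list.
    for r, i in enumerate(rank(xs)):
        rx[i] = r
    for r, i in enumerate(rank(ys)):
        ry[i] = r
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n * n - 1))

tool_a = [0.91, 0.45, 0.78, 0.62, 0.30]  # hypothetical scores from tool A
tool_b = [0.85, 0.50, 0.55, 0.70, 0.28]  # hypothetical scores from tool B
rho = spearman_rho(tool_a, tool_b)  # rho == 0.9: only sites 2 and 3 swap ranks
```

A rho close to 1 means the two tools rank the sites almost identically even when their absolute scores diverge, which is exactly the property that makes the metric usable for ranking-based scenarios.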