The UK has committed to reporting on national progress towards the Sustainable Development Goals. But how should the UK approach the 232 statistical indicators adopted as the global way of measuring progress? The Office for National Statistics consultation is just the start of the conversation, says Graham Long from the Institute for Sustainability at Newcastle University.

Measuring the statistical indicators that were adopted as the global way of monitoring progress towards the Sustainable Development Goals (SDGs) is the task facing the Office for National Statistics (ONS) - and the topic of their recently launched consultation.

The spirit of this consultation is welcome: after all, the UK has agreed that review of the Goals should be “inclusive”, “participatory”, and “support reporting by all stakeholders” (see para. 74 of the SDG agreement). For respondents, though, the questionnaire might look dauntingly expansive, posing questions that many may feel ill-equipped to answer. The SDG agenda is large and interlinked, and hard to generalise about. Respondents cannot be expected to know the extent of available UK data, how big the gaps are, or where they are.

The indicator challenge

Initially, the task at hand for the UK might appear overwhelming - so many indicators, so little time. But in four respects, scrutiny of the SDG indicators themselves might yield guidance for UK reporting.

(i) First, approximately 30 of the UN indicators are effectively binary: they are yes/no questions. The detail behind these issues should not be underestimated, but the UK clearly has universal health coverage, freedom of information laws, functioning human rights commissions, and so on. It may be that measurements like these can be dealt with swiftly, at least initially.

(ii) Second, some of the targets and indicators are focused on developing countries – like those on incoming development aid – and are not appropriate for the UK’s situation. Although the agenda is universal, there is no requirement that every SDG target apply equally to every country. This matters for what the UK should report on.

(iii) Third, around 80 of the UN’s indicators do not yet have an internationally agreed methodology, so the ONS cannot be expected to report on them; for these “tier III” indicators, the ONS should instead help to develop the methodology and prepare to report on them in the future.

(iv) Fourth, the UN has already gathered data on some of these indicators for the UK context, available in their global SDG indicator database.

Each of these points generates its own stream of work:

(i) Binary indicators are the beginning, not the end, of a critical conversation about the detail of UK structures, laws and policies.

(ii) The UK could and should choose to measure its impact on developing countries on many of these indicators, for example by disaggregating the kinds of projects UK aid supports, or by reporting on the UK’s record in supporting reform of global governance.

(iii) UK indicators will need to be aligned with the developing global methodology, with near-neighbour stand-in indicators used in the short term.

(iv) Ongoing engagement with the UN’s data collection is required to ensure maximum compatibility and accuracy.

Reasons for prioritisation

The considerations for and against prioritising among all these indicators might also seem overwhelming. In the consultation, the ONS offers several criteria to guide data development (question 10) and several ways of dividing up the data (question 8), but these options are not exhaustive. The SDG agenda itself is, again, one source of guidance on these issues.

At the level of goals, reporting must respect the universality, breadth, and interlinked nature of the SDGs. The need for “universal” and “balanced” review across the goals (para. 74 of the SDG agreement) invites the ONS to fill the biggest gaps first, goal by goal, to allow effective assessment of the whole agenda. Establishing key baselines early is critical, as Parliament has noted. This might mean prioritising goals where the lowest proportion of UK indicators are in place. It might also mean prioritising the targets most central to the ambition of each goal - though agreeing on these would itself be an issue for consultation. We do not know where the ONS judges the gaps to be. At first sight, though, the UK gathers plenty of data on educational outcomes, health, and environmental protection, whereas goals on sustainable production and consumption, governance, or inequalities might be less well covered.

At the level of individual indicators, some of those covered less well are especially relevant to the UK's circumstances. To highlight a few examples: the target for inclusive growth – in effect, measuring progress towards socioeconomic equality in the UK; the measurement of multidimensional poverty; the extent of food insecurity in the UK; levels of public confidence in politics; better measurement of the UK's resource footprint; and the measurement of corruption and illicit flows in the UK and its overseas territories. In all these areas of current political and public debate, the UK's commitment to the SDGs is salient, and the importance of the ONS as an independent source of data on these topics could be a reason to prioritise reporting.

Another issue respondents are asked to wrestle with is the disaggregation of data to identify and highlight disadvantaged and discriminated-against groups in the UK. But prioritising among different disadvantaged groups, and among different dimensions of disadvantage, is inherently problematic. If the ONS is asking “which of these issues of discrimination is most important to you?”, it is not clear that our views as respondents should count for much when set against the idea of equal human dignity.

If there is a basis for deciding where to start further disaggregation work, it lies in the SDGs' injunction to start with those “furthest behind first” (para. 4). Who is “furthest behind” in the UK, and in what ways, is precisely the question the SDGs task us with focusing on. And it is in these terms that any case for prioritisation of disaggregative work should be made.

Any such justification should expect to be subject to critical scrutiny and debate. The same is true however the ONS chooses to tread a path through these issues of reporting and prioritisation. The current consultation on early steps in national measurement serves to start this conversation, but drawing action-guiding conclusions from it might prove difficult. The next step will be for the ONS to lay out its approach to national measurement, and the reasons underpinning it - and then to listen to stakeholders on both what the ONS proposes to measure, and why.

This blog was written for UKSSD by Dr Graham Long of the Institute for Sustainability at Newcastle University

Have your say on the ONS approach to national reporting: respond to their consultation.

Do you still have questions? Join us on 12 September for our webinar with the ONS on their consultation.