A responsible approach to monitoring responsible research and innovation
From the outset of the SUPER MoRRI journey, a key question has preoccupied the project team: how can RRI be monitored in a responsible way?
Three years into the project, the discussion, debate, exploration, and experimentation are still going on! We remain committed to providing robust data and information for our monitoring framework and to presenting all our outputs as responsibly as possible. So, what does this mean, in concept and in practice? This blog post attempts to spell out the essentials of the SUPER MoRRI approach to responsibility.
Our conceptual approach to the challenge of making our own practice responsible is based on two overlapping concepts. The first of these is responsible quantification, which can be defined as ensuring that the data and information that we provide to users as a resource are prepared, presented, and made interpretable in appropriate ways. Our approach picks up on many of the key elements of debates about responsible metrics. Because our monitoring framework will use a variety of quantification techniques and will present information in formats other than numeric indicators, we use a concept that covers a responsible approach to both the process and the products of our quantification effort. In addition, responsible quantification also means that the primary data collected by SUPER MoRRI, which underpin the resources provided, will be available to prospective users under FAIR (findable, accessible, interoperable, reusable) principles. Guidance on the responsible use of these data will also be made available.
The second concept guiding our responsible approach is designed to support users to interpret the quantitative outputs we provide. The potential for perverse outcomes from poor design, misapplication, or inappropriate use of indicators presents a significant challenge, as was constructively addressed in the Leiden Manifesto. The SUPER MoRRI approach to this challenge will be to support users of its indicators and other data-driven elements, by providing specific tools to support the appropriate and credible interpretation of the information provided. We call this conceptual innovation credible contextualisation, and it has two main aspects.
First, credible contextualisation recognises that there are no universal, context-free metrics, indicators, or other quantifications (of RRI or anything else). Rather, data used in indicators are gathered in a specific context. The degree to which any quantification can be used as a comparator or as a benchmark, for example, depends on the degree of de-contextualisation this quantification can credibly stand. Generally, the further you move away in time and space from a specific action or intervention of interest that you wish to monitor, the more likely it is that the complex dynamics of broader societal factors will influence or ‘over-determine’ the outcomes or impacts that you might be seeking to attribute to that action or intervention.
Second, credible contextualisation recognises that indicators should be developed in ways that are relevant and meaningful in specific use contexts. In the case of the development of new indicators or other quantifications, a co-creation phase will be conducted to ensure the relevance and meaningfulness of indicators to users. This will involve bringing together some potential users of the indicator to critically reflect on the work-in-progress and offer their advice. This process will be iterative and involve both co-creation of the final indicator and the form in which it will be presented. This phase should also improve credible contextualisation of an indicator, by bringing it into line with user perspectives on what the information can be understood to mean and how it can be useful, and doing so during the development process ahead of its final definition and presentation.
Presentation of SUPER MoRRI indicators will include three components: an indicator fiche; a description of potential interpretive models; and complementary information to support user understanding and interpretation of the indicator. The indicator fiche will include all relevant technical information, including the data source(s), the metric used to calculate the score, and the indicator coverage. Interpretive models will describe how the indicator might be interpreted in relation to monitoring RRI but will also discuss contextual factors relevant to interpreting the substantive results. In the case of new RRI indicators, the interpretive model will explain the rationale for the creation of the indicator and how it is perceived to support RRI. For indicators that are time-series – or have the potential for future replication to create time-series – the model will describe what a change in the indicator can be reasonably understood to mean. Further information will be provided to try and ensure the credible contextualisation of the indicator. Complementary information will include descriptions or links to information on specific conditions, such as regional or national contexts or stages of policy cycles that will support users in understanding the limits of what the indicator can be thought to signify validly and reliably. Involving users in the development phase of the indicator will also help to guide the design and production of these supporting elements as work progresses.
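The three-component presentation described above can be thought of as a simple data structure. The sketch below illustrates this in Python; all class and field names are hypothetical, chosen for illustration rather than drawn from any actual SUPER MoRRI specification, and the example values are invented.

```python
from dataclasses import dataclass, field

@dataclass
class IndicatorFiche:
    """Technical metadata for an indicator (hypothetical field names)."""
    name: str
    data_sources: list[str]  # where the underlying data come from
    metric: str              # how the score is calculated
    coverage: str            # e.g. countries and years covered

@dataclass
class IndicatorPresentation:
    """The three presentation components: fiche, interpretive
    models, and complementary contextual information."""
    fiche: IndicatorFiche
    interpretive_models: list[str] = field(default_factory=list)
    complementary_info: list[str] = field(default_factory=list)

# Example: how the indicator discussed below might be packaged.
presentation = IndicatorPresentation(
    fiche=IndicatorFiche(
        name="Share of women researchers in R&D",
        data_sources=["Eurostat"],
        metric="women researchers / total researchers (headcount)",
        coverage="EU member states, annual",
    ),
    interpretive_models=[
        "Rising shares may reflect gender equality policies, "
        "but also historical labour-market differences.",
    ],
    complementary_info=[
        "Notes on national R&D labour-market contexts.",
    ],
)
```

Keeping the fiche, the interpretive models, and the complementary information together in one object mirrors the idea that an indicator should never travel without its context.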
Finally, to illustrate our approach, let’s take one indicator as an example. The share of women researchers working in R&D is an indicator that reflects women’s horizontal participation in the highly skilled labour force. These data are compiled at the national level by Eurostat and, despite gaps in the series, provide a picture of the proportion of women scientists, engineers, and technicians working in R&D, which can also be broken down by sectors of the economy. This indicator appears on the surface to be highly ‘objective’ and ‘universal’, suggesting that the information can be safely de-contextualised and that comparisons at the national level are meaningful and appropriate.
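The arithmetic behind this kind of indicator is straightforward, which is part of what makes it look so ‘objective’. The sketch below computes the share of women researchers overall and by sector; the headcount figures are entirely made up for demonstration and do not come from Eurostat.

```python
# Illustrative headcounts (invented, NOT real Eurostat data):
# sector -> (women researchers, total researchers)
headcounts = {
    "higher education": (420, 900),
    "government":       (150, 320),
    "business":         (230, 780),
}

def share_of_women(women: int, total: int) -> float:
    """Share of women researchers, as a percentage of headcount."""
    return 100.0 * women / total

# Sectoral breakdown, rounded to one decimal place.
by_sector = {
    sector: round(share_of_women(w, t), 1)
    for sector, (w, t) in headcounts.items()
}

# National-level aggregate across all sectors.
total_women = sum(w for w, _ in headcounts.values())
total_all = sum(t for _, t in headcounts.values())
overall = round(share_of_women(total_women, total_all), 1)

print(by_sector)  # per-sector percentages
print(overall)    # overall percentage
```

Note that the aggregate hides exactly the kind of sectoral and historical variation the next paragraphs discuss: two countries with the same overall share can have very different sectoral compositions behind it.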
But SUPER MoRRI looks at this information from a particular perspective, and we need to bring our credible contextualisation tools to bear on it. First of all, to what extent can we interpret these data as indicating support for RRI? What do high levels of women’s participation, or rising trends in this participation, mean? From the RRI perspective, the interpretive model could be something along the lines of: progressive policies and practices – such as gender equality plans in organisations, improved communication and more role models highlighting the possibilities of STEM education for girls at all education levels, and improved research career conditions for women who have career breaks due to family formation – have reduced and continue to reduce discrimination against women in research and innovation. This is a perfectly valid interpretive model that can be used, judiciously, in interpreting indicators of women’s participation in the R&D labour market.
However, there are also other elements that should be considered in interpreting these data. For example, there is great variation in the historical participation in R&D occupations across European countries. In some countries, such occupations were not historically prestigious or well remunerated and were not coveted by male workers. There has also been great cultural variation in social perceptions of, and possibilities for, women’s access to higher education, often depending on class or other forms of social stratification. These determining factors, which can vary significantly by national or cultural context, are already ‘baked into’ the data used in an indicator of women’s participation in the R&D workforce. The extent to which an interpretive model based on RRI-related institutional transformation can reliably be considered to account for the levels or the trends in the data requires a considered assessment.
Providing SUPER MoRRI users with the tools and information needed to make such considered and credible assessments of SUPER MoRRI data is thus a challenge to which we will devote much energy, as we seek to set a new benchmark for a responsible approach to monitoring. We look forward to bringing you more news about our efforts in the coming months.
- March 2, 2022