BLOG POST

A Call to Improve the Quality and Transparency of Women’s Economic Empowerment and Gender Equality Indexes

As gender equality has gained increased prominence as an important human rights and global development priority, our teams at CGD and Data2X have seen a proliferation of reports, tools, and indexes that seek to track countries’ progress (or lack thereof) in improving various women’s economic empowerment and gender equality outcomes. In a joint paper, Measuring Women’s Economic Empowerment: A Compendium of Selected Tools, we reviewed 20 of these indexes (which we refer to as “population monitoring” or “PM” tools in the report) to better understand the information they reflect and the way they were developed.

Since the compendium’s publication, CGD and Data2X have launched a learning collaborative on women’s economic empowerment measurement. Through the collaborative’s work, we have taken a deeper dive into a subset of these indexes (those with the most complete and transparent information) with the aim of identifying how these and the wider array of gender indexes can be strengthened going forward, from both a transparency and a validity perspective. Here we summarize the findings of our audit of four indexes and our key takeaways to inform good practice standards for the future.

What did the audits do?

Many users of these gender indexes—whether they are policymakers examining how their own country compares to others in its ranking, or advocates pushing governments to make improvements in particular areas of women’s economic empowerment or gender equality—may assume that the quantitative data reflected in indexes is objective and can be taken at face value. But the reality is much more complex: country rankings result from a multitude of choices made by indexes’ developers, including the gender outcome they are trying to measure, what specific indicators they choose to include (e.g., an indicator on non-agricultural employment versus one on employment status as a reflection of women’s economic participation), the data sources they rely on, and the methods they use to translate raw data into country rankings.

The audits assessed the four gender indexes with respect to their conceptual frameworks, measurement models, indicators and data sources, country rankings, adjustments to the raw data, use of multivariate analysis, aggregation formulas, external validity and transparency. We used sensitivity analysis to assess how developers’ various decisions affect the country rankings and their criterion validity, with the latter estimated by the correlations between the index’s country rankings and the country rankings of a set of external “criterion indicators” (widely available country-level measures with which the index would be expected to be most closely correlated).
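
To make the criterion-validity calculation concrete, here is a minimal sketch in Python (pandas and SciPy) of how such a rank correlation can be estimated. The file and column names are hypothetical; this illustrates the general approach rather than reproducing the auditors’ actual code.

  import pandas as pd
  from scipy.stats import spearmanr

  index_df = pd.read_csv("index_scores.csv")      # hypothetical columns: iso3, index_score
  criterion = pd.read_csv("criterion.csv")        # hypothetical columns: iso3, criterion_value

  # Match countries on a shared ISO alpha-3 code, then compare the two rankings.
  merged = index_df.merge(criterion, on="iso3", how="inner")
  rho, p = spearmanr(merged["index_score"].rank(ascending=False),
                     merged["criterion_value"].rank(ascending=False))
  print(f"Estimated criterion validity (Spearman rho): {rho:.2f}")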

What did the audits find?

Our analysis, summarized in the technical brief, shows that the country rankings of all four indexes have moderate to high estimated criterion validity. However, the audits also reveal that these indexes share several problems, including:

  • Incomplete transparency: raw data are not always made available, and technical documentation is not always complete and accurate
  • Missing values in their indicators, which are addressed in a variety of ways but without a clear explanation of how developers’ choices affect the country rankings
  • Absence of sensitivity analysis assessing the effects of the decisions developers made in constructing the indexes
  • Absence of multivariate analysis supporting the strong assumptions made in their measurement models

Recommendations to strengthen indexes’ transparency

Our main message here is that developers and funders of gender indexes need to set transparency guidelines so that users can easily access the basic information about how the indexes were constructed and can assess how developers’ choices affect the country rankings. Two sets of actions, if followed by developers, would substantially increase gender indexes’ data transparency and validity:

1. Index developers should make raw data readily available for interested users to download, in a form that facilitates analysis

The downloadable data should be provided in the format of at least one widely used statistical package (e.g., R, Stata, SPSS, or SAS) and, when feasible, in CSV format, so that users can analyze the data without having to invest a lot of time converting, recoding, and re-labeling the downloaded database to prepare it for analysis. The “country” variable should include standard UN country names when available and ISO alpha-2 or ISO alpha-3 country codes identifying the countries and territories represented in the data sets, to facilitate comparisons with external data sets (e.g., the UN’s Human Development Indicators). The site providing the downloadable data should also clearly identify the person(s) to contact for questions about the data or the technical report and should clearly list any restrictions on the use of the data.
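
As an illustration of why standard codes matter, the hypothetical Python sketch below merges a downloaded index file with an external data set on an ISO alpha-3 code column and flags countries that fail to match; all file and column names are assumptions made for the example.

  import pandas as pd

  index_data = pd.read_csv("gender_index_download.csv")   # hypothetical: iso3, country, index_score
  external = pd.read_csv("external_indicators.csv")       # hypothetical: iso3, life_expectancy

  merged = index_data.merge(external, on="iso3", how="left", indicator=True)

  # With standard ISO codes, countries missing from the external data set are easy to spot.
  unmatched = merged.loc[merged["_merge"] == "left_only", "country"]
  print("Countries without a match in the external data set:", list(unmatched))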

2. Index developers should publish a technical report

The technical report should clearly explain the assumptions and choices made with respect to the conceptual framework, the measurement model, data sources, modifications to the raw data, formulas used to aggregate indicators into the index, and the results of analyses done to test the validity of the index. Once equipped with the technical report and the raw data, an analyst should be able to reproduce not only the country rankings but also the normalized indicators and overall index values. If some of the information needed to do so is provided in earlier technical reports, that information should be clearly referenced in the current technical report.
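
To illustrate what this kind of reproducibility might look like in practice, the sketch below normalizes hypothetical raw indicators with a min-max rescaling and aggregates them with equal weights. Real indexes may use different normalizations, weights, or aggregation formulas; spelling those out is precisely the technical report’s job.

  import pandas as pd

  raw = pd.read_csv("raw_indicators.csv")            # hypothetical: iso3 plus one column per indicator
  indicators = [c for c in raw.columns if c != "iso3"]

  # Normalize each indicator to a 0-1 scale across countries (min-max rescaling).
  normalized = raw.copy()
  for col in indicators:
      lo, hi = raw[col].min(), raw[col].max()
      normalized[col] = (raw[col] - lo) / (hi - lo)

  # Aggregate with equal weights; the actual weights would come from the technical report.
  normalized["index"] = normalized[indicators].mean(axis=1)
  normalized["rank"] = normalized["index"].rank(ascending=False)
  print(normalized[["iso3", "index", "rank"]].sort_values("rank").head())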

In addition, index developers are encouraged to address the methodological problems identified in the audits by carrying out multivariate and sensitivity analyses and by using systematic rather than ad hoc methods to adjust the raw data.
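
As one example of what such a sensitivity analysis could look like, the hypothetical sketch below drops one indicator at a time, recomputes an equal-weight index, and reports how closely each resulting ranking tracks the baseline ranking; the input file and column names are assumptions, not any developer’s actual data.

  import pandas as pd
  from scipy.stats import spearmanr

  norm = pd.read_csv("normalized_indicators.csv")    # hypothetical: iso3 plus normalized indicator columns
  indicators = [c for c in norm.columns if c != "iso3"]

  baseline_rank = norm[indicators].mean(axis=1).rank(ascending=False)
  for dropped in indicators:
      kept = [c for c in indicators if c != dropped]
      alt_rank = norm[kept].mean(axis=1).rank(ascending=False)
      rho, _ = spearmanr(baseline_rank, alt_rank)
      print(f"Dropping {dropped}: rank correlation with the full index = {rho:.2f}")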

The way forward

The time is ripe for formulating good practice standards, and in particular for tackling the issue of transparency, an often overlooked feature that affects the quality of all data, including gender data. This was one of the main takeaways of our audits and is our main message here: developers and funders need to set transparency guidelines so that all can easily access the basic information about how gender indexes were constructed, which assumptions or normative judgements lay behind the choice of indicators, and how those choices affected the country rankings. In short, users should have sufficient information to assess the credibility of an index’s country rankings.

We have spelled out above what could be the basic elements of transparency guidelines for the development of women’s economic empowerment-related indexes. For the full version of the draft transparency guidelines we have proposed, click here (updated May 2022). The next step is to socialize and refine these proposed ideas with the gender data community, which will ultimately endorse and use these guidelines. Stay tuned and send your feedback.

Disclaimer

CGD blog posts reflect the views of the authors, drawing on prior research and experience in their areas of expertise. CGD is a nonpartisan, independent organization and does not take institutional positions.