As a result of societal norms that have historically favoured men, a Gender Data Gap has emerged. What might be assumed to be an objective data set may in fact be compromised by design flaws that fail to properly account for the experiences of women. We sometimes inadvertently design algorithms with the same human biases we hope to eliminate from the decision-making process.
The issues of the Gender Data Gap fall into two buckets: the first is that issues affecting both men and women are framed, discussed and addressed through a male lens; the second is that we often have missing or poor-quality data on issues that disproportionately affect women, which makes such issues harder to understand, assess and mitigate. In her book ‘Invisible Women’ (Criado-Perez, Caroline. Invisible Women: Data Bias in a World Designed for Men. New York: Abrams Press, 2019), Caroline Criado-Perez discusses various instances of this phenomenon, for example:
- In car accidents, women are 47% more likely than men to be seriously injured, since cars have historically been designed and crash-tested using male dummies.
- 71% of women wear Personal Protective Equipment (PPE) that has been designed for the male form.
- Smartphones and laptop keyboards tend to be designed around the average male handspan.
The Gender Data Gap has implications across all areas of society, but what is its relevance to the workplace? In the last five years, organisations have begun to use AI programs in the first stage of interview screening, where the technology ‘reads’ and assesses CVs against the role profile. The more responsibility we give to the AI systems we design, the more we should interrogate the objectivity of our data models to avoid unintentionally propagating gender inequalities.
Eliminating the Gender Data Gap in your organisation can open up new opportunities to collect valuable data insights. Whilst most large companies now rely on ‘Big Data analytics’ to understand their supply chains, manage risk and better predict market trends, some large organisations have begun applying data analytics to better interpret employee behaviour. ‘People analytics’ is the practice of collecting large amounts of employee data to detect patterns and trends in areas such as employee wellbeing, performance and turnover. Google provides a leading example of what ‘People analytics’ makes possible. Taking a new data-led approach within its HR function, Google discovered that young women were twice as likely to quit as the average employee, due to what it called the ‘parent gap’ (Bohnet, Iris. What Works: Gender Equality by Design. Cambridge: Harvard University Press, 2016). Armed with this insight, Google was able to make targeted changes to the firm’s parental leave policies and remove the disparity.
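The core idea behind a finding like Google’s is simple: disaggregate the metric rather than report a single firm-wide average. A minimal sketch, using entirely invented figures (the group labels, records and threshold are illustrative, not Google’s data or method):

```python
# Hypothetical illustration of gender-disaggregated people analytics:
# compare attrition rates per group instead of reporting one aggregate.
# All records below are invented for the sake of the example.
from collections import Counter

# Each record: (group label, True if the employee left within the year)
records = [
    ("women_under_30", True), ("women_under_30", True),
    ("women_under_30", False), ("women_under_30", False),
    ("other", True), ("other", False), ("other", False),
    ("other", False), ("other", False), ("other", False),
]

def attrition_by_group(rows):
    """Return {group: fraction who left} for each group in rows."""
    totals, leavers = Counter(), Counter()
    for group, left in rows:
        totals[group] += 1
        if left:
            leavers[group] += 1
    return {g: leavers[g] / totals[g] for g in totals}

rates = attrition_by_group(records)
overall = sum(left for _, left in records) / len(records)

# The aggregate figure hides the gap the per-group view exposes.
print(f"overall: {overall:.0%}")          # 30% across the whole sample
for group, rate in sorted(rates.items()):
    print(f"{group}: {rate:.0%}")
```

In this toy sample the overall attrition rate is 30%, while the disaggregated view shows 50% for one group against 17% for the rest: the kind of disparity an aggregate number alone would never surface.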
When it comes to improving the employee experience, a data-driven effort can help identify and address the root causes of systemic issues. With careful approaches to improving data inclusivity, valuable insights will follow, bringing firm-wide benefits. However, these efforts will be undermined if our data pool lacks integrity and fails to represent minority employees. To achieve data inclusivity, we must collect gender-disaggregated data, refrain from allowing the default to be male, and design data models with female collaborators to ensure the female perspective is truly accounted for.
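One practical guard against a data pool that under-represents a group is a representation check run before any analysis. The sketch below is a hypothetical example (the function name, threshold and sample are assumptions, not an established method): it flags any group whose share of the sample falls below a chosen bar, so conclusions are not drawn from data that quietly defaults to one demographic.

```python
# Hypothetical sketch: flag under-represented groups in a sample before
# analysing it, so a skewed data pool is caught early. The 30% threshold
# is an arbitrary illustrative choice.
from collections import Counter

def representation_gaps(labels, min_share=0.3):
    """Return {group: share} for groups below min_share of the sample."""
    counts = Counter(labels)
    total = len(labels)
    return {g: n / total for g, n in counts.items() if n / total < min_share}

# Invented sample: 8 male records, 2 female records.
sample = ["male"] * 8 + ["female"] * 2
gaps = representation_gaps(sample)
print(gaps)  # women make up only 20% of this sample, below the 30% bar
```

A check like this is deliberately crude; its value is simply forcing the question "who is missing from this data?" before the analysis, rather than after its conclusions have been acted on.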