CityReads ∣ 330: Why Data Science Needs Feminism
Data feminism is for everyone.

D'Ignazio, Catherine and Lauren F. Klein. 2020. Data Feminism. Cambridge, MA: MIT Press.
https://mitpressonpubpub.mitpress.mit.edu/data-feminism
https://blogs.lse.ac.uk/impactofsocialsciences/2020/10/04/book-review-data-feminism-by-catherine-dignazio-and-lauren-f-klein/

Mathematician Christine Mann Darden, whose story served as an inspiration for the book Hidden Figures, began working as a human computer at NASA's Langley Research Center in the summer of 1967. Early in her career, Darden realized that while she held the same qualifications and did the same work as her male colleagues, she had not been promoted. She consulted Gloria Champine, an employee of the Equal Opportunities Office, who visualized data on all employees' qualifications and ranks by gender. Having identified a systemic issue, Champine took it up with senior management, who finally gave Darden her overdue promotion. Dr. Christine Darden went on to become the first African-American woman to hold Senior Executive Service rank at Langley and was a director when she retired from NASA in 2007.
Christine Darden in the control room of the Unitary Plan Wind Tunnel at NASA's Langley Research Center in 1975.
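Champine's intervention was, at bottom, a simple disaggregation: compare rank and qualifications side by side, broken down by gender. The sketch below shows what such a comparison might look like with present-day tools; the records, column names, and numbers are entirely hypothetical and are not drawn from NASA data or from the book.

```python
# A hypothetical sketch of a Champine-style disaggregation: rank versus
# qualifications, broken down by gender. All records below are invented.
import pandas as pd

records = pd.DataFrame({
    "gender":     ["F", "F", "F", "F", "M", "M", "M", "M"],
    "degree":     ["MS", "MS", "PhD", "MS", "MS", "BS", "PhD", "MS"],
    "experience": [12, 9, 15, 13, 11, 8, 14, 10],   # years of experience
    "rank":       [2, 2, 3, 2, 4, 3, 5, 4],          # higher = more senior
})

# Average rank and experience for each qualification level, split by gender.
summary = (records
           .groupby(["degree", "gender"])
           .agg(mean_rank=("rank", "mean"),
                mean_experience=("experience", "mean"),
                n=("rank", "size")))
print(summary)
```

Laid out this way, equally qualified women holding systematically lower ranks becomes visible at a glance, which is the kind of evidence Champine brought to senior management.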
Dr. Christine Darden's experience offers a guide for challenging power and working toward justice; Darden models data feminism. In Data Feminism, Catherine D’Ignazio and Lauren F. Klein use an intersectional feminist lens to examine unequal power structures in the realm of data and to highlight attempts made to rectify them. Showing how the data we collect reflect our unequal society, the book is a call to action that will particularly benefit feminists seeking to learn how activism can contribute to a more equitable form of data science. You can read the whole book on the MIT Press website (https://data-feminism.mitpress.mit.edu/).

The word data dates to the mid-seventeenth century, when it was introduced to supplement existing terms such as evidence and fact. Identifying information as data, rather than as either of those other two terms, served a rhetorical purpose: it converted otherwise debatable information into the solid basis for subsequent claims. But what information needs to become data before it can be trusted? Or, more precisely, whose information needs to become data before it can be considered as fact and acted upon?

Data feminism is a way of thinking about data, both their uses and their limits, that is informed by direct experience, by a commitment to action, and by intersectional feminist thought. The starting point for data feminism is something that goes mostly unacknowledged in data science: power is not distributed equally in the world. Those who wield power are disproportionately elite, straight, white, able-bodied, cisgender men from the Global North.
Computer science has always been dominated by men, and the situation is worsening. The share of bachelor's degrees in computer science awarded to women in the United States peaked in the mid-1980s at 37 percent, and the ratio of men to women has risen steadily in the years since. (The report in question treated gender as a binary, so it contains no data about nonbinary people.)

The work of data feminism is first to tune into how standard practices in data science serve to reinforce existing inequalities, and second to use data science to challenge and change the distribution of power. Underlying data feminism is a belief in and commitment to co-liberation: the idea that oppressive systems of power harm all of us, that they undermine the quality and validity of our work, and that they hinder us from creating true and lasting social impact with data science.

Data feminism can help remind us that before there are data, there are people—people who offer up their experience to be counted and analyzed, people who perform that counting and analysis, people who visualize the data and promote the findings of any particular project, and people who use the product in the end. There are also, always, people who go uncounted—for better or for worse. And there are problems that cannot be represented—or addressed—by data alone. And so data feminism, like justice, must remain both a goal and a process, one that guides our thoughts and actions as we move toward remaking the world.

Data feminism isn't only about women. It takes more than one gender to have gender inequality and more than one gender to work toward justice. Likewise, data feminism isn't only for women. Men, nonbinary, and genderqueer people are proud to call themselves feminists and use feminist thought in their work. Moreover, data feminism isn't only about gender. Intersectional feminists have keyed us into how race, class, sexuality, ability, age, religion, geography, and more are factors that together influence each person's experience and opportunities in the world. Finally, data feminism is about power—about who has it and who doesn't. Intersectional feminism examines unequal power, and in our contemporary world, data is power too. Because the power of data is wielded unjustly, it must be challenged and changed.

Data Feminism is built around seven core principles. Individually and together, these principles emerge from the foundation of intersectional feminist thought, and each of the following chapters is structured around a single principle. The seven principles of data feminism are as follows:

1. Examine power. Data feminism begins by analyzing how power operates in the world.
2. Challenge power. Data feminism commits to challenging unequal power structures and working toward justice.
3. Elevate emotion and embodiment. Data feminism teaches us to value multiple forms of knowledge, including the knowledge that comes from people as living, feeling bodies in the world.
4. Rethink binaries and hierarchies. Data feminism requires us to challenge the gender binary, along with other systems of counting and classification that perpetuate oppression.
5. Embrace pluralism. Data feminism insists that the most complete knowledge comes from synthesizing multiple perspectives, with priority given to local, Indigenous, and experiential ways of knowing.
6. Consider context. Data feminism asserts that data are not neutral or objective. They are the products of unequal social relations, and this context is essential for conducting accurate, ethical analysis.
7. Make labor visible. The work of data science, like all work in the world, is the work of many hands. Data feminism makes this labor visible so that it can be recognized and valued.

Each of the following chapters takes up one of these principles and introduces key feminist concepts such as the matrix of domination, situated knowledge, and emotional labor, drawing on examples from the field of data science, especially examples from people at the margins, whether because of their gender, sexuality, race, ability, class, geographic location, or any combination of those (and other) subject positions, to show how the principle can be put into action.

One example highlighted in the book is the lack of data on maternal mortality, particularly for Black mothers, which has meant that the horrifying maternal mortality rates for Black women were never flagged as a cause for concern. Serena Williams's experience of giving birth became a landmark moment for many women of color who realized they had not been the only ones whose pain had been ignored by doctors. Estimates suggest that maternal mortality for Black women may be more than three times that of white women, and the failure to take their pain seriously has cost many of these women their lives.

Among the powerful projects described in the book is a counterdata initiative led by María Salguero to record cases of femicide (gender-based killings of women and girls) in an open, accessible manner. The lack of government-published data on the subject prompted Salguero to sift through newspaper articles and Google Alerts, finding every instance she could and logging each one on a map.
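Salguero's map was compiled by hand from news reports rather than by code, but the underlying workflow, logging each case as a structured record with a location and then plotting it, can be sketched as follows. The file name and column names below are hypothetical, not Salguero's actual data.

```python
# An illustrative sketch only: a counterdata log of cases plotted as points.
# The CSV file and its columns are hypothetical.
import pandas as pd
import matplotlib.pyplot as plt

# Each row is one logged case: date, location, source article, coordinates.
cases = pd.read_csv("femicide_log.csv", parse_dates=["date"])

# A minimal "map": one point per logged case, by longitude and latitude.
fig, ax = plt.subplots(figsize=(6, 6))
ax.scatter(cases["longitude"], cases["latitude"], s=10, alpha=0.6)
ax.set_xlabel("Longitude")
ax.set_ylabel("Latitude")
ax.set_title("Logged cases (one point per record)")
fig.savefig("cases_map.png", dpi=150)
```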
Another example is the 'Gender Shades' project, in which a team of researchers led by Joy Buolamwini and Timnit Gebru found that Black women were roughly 40 times more likely than white men to be misclassified by commercial facial analysis technology. The research quickly prompted IBM to launch its 'Diversity in Faces' project, which aimed to build facial recognition technology that is racially fair and accurate; however, IBM has since abandoned the project after broader discussions of the unethical use of such software in racial profiling and mass surveillance.
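Mechanically, a disparity like the one Gender Shades reported comes from computing error rates separately for each intersectional subgroup and comparing them. The sketch below illustrates that kind of disaggregated audit; it is not the researchers' code, and the counts are invented for illustration.

```python
# A hypothetical sketch of a disaggregated audit in the spirit of Gender
# Shades: per-subgroup error rates from some classifier, then their ratio.
# All counts below are invented for illustration.
import pandas as pd

results = pd.DataFrame({
    "subgroup": ["darker_female", "darker_male", "lighter_female", "lighter_male"],
    "n_wrong":  [120, 40, 24, 3],
    "n_total":  [400, 400, 400, 400],
})

# Error rate within each subgroup: fraction of misclassified examples.
results["error_rate"] = results["n_wrong"] / results["n_total"]
print(results)

# The headline disparity is the ratio between the worst- and best-served groups.
ratio = results["error_rate"].max() / results["error_rate"].min()
print(f"disparity ratio: {ratio:.1f}x")
```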
CityReads ∣ Notes On Cities
"CityReads", a subscription account on WeChat, posts our notes on city reads weekly.
Please follow us by searching "CityReads".