by Mirusha Yogarajah
A few months ago, Mark Zuckerberg’s congressional testimony sparked widespread conversation about Facebook, technology, and privacy. It is now clear that Facebook enabled significant privacy breaches aimed at swaying voters in the 2016 American election, and Cambridge Analytica is arguably partly responsible for the election of Donald Trump. In light of this, it is worth asking: does the development of technology and AI interventions threaten democracy itself? There are already well-known methods of discriminating in elections (e.g., voter ID laws and gerrymandering). Could Facebook’s analytics be the latest tool for political operatives who stand to gain by pitting people against one another?
The design and implementation of public policy tends to overwhelmingly emphasize the need for data. Consequently, data collection is a priority for the top policy institutions around the world. However, we must treat this focus with caution. Data can be biased or inaccurate, it can be weaponized through technology, and it can be racist.
Dr. Latanya Sweeney, Professor of Government and Technology in Residence at Harvard University, investigated discrimination in online ad delivery after a personal encounter with a Google search of her own name. She found that names “assigned primarily to black babies… generated ads suggestive of an arrest in 81 to 86 percent of name searches on one website and 92 to 95 percent on the other.” There were exceptions to this rule – the name “Dustin,” which tends to be given to white babies, “generated an ad suggestive of arrest 81 and 100 percent of the time.” Employers can conduct a Google search to gauge whether a candidate has a criminal record and decide whether to extend a job offer. It is racist for a search on an innocent Deandre Washington to surface an advertisement suggesting she might have an arrest record while the ads shown for a Mary Williams are perfectly ordinary. The systemic implications of racially biased AI technology put American (and Canadian) values of fairness and equal treatment at risk.
Joy Buolamwini has done work on the colourist and gender biases built into recognition software. Consider, for instance, hands-free sinks in washrooms: as a person of colour, I struggle to trigger the sensor to release water, a difficulty my white and lighter-skinned counterparts do not share. This is the same class of sensing technology that Buolamwini researches. Facial detection algorithms are typically built and tested with white faces as the reference far more often than darker-skinned faces. The problem may have incidental advantages – darker-skinned people may be subjected to less invasive data collection than their lighter-skinned counterparts, for instance – but given the evolving role of technology in modern society, such benefits are far outweighed by the barriers these systems create.
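To make the pattern concrete, here is a minimal, hypothetical sketch of the kind of audit that exposes these gaps: measure a detector’s miss rate separately for each skin-type group. The dataset, records, and function names below are illustrative assumptions, not Buolamwini’s actual pipeline; her Gender Shades study, which used a similar per-group methodology on commercial gender classifiers, reported error rates of up to roughly 35 percent for darker-skinned women versus under 1 percent for lighter-skinned men.

```python
# Hypothetical audit sketch: compare a face detector's miss rate
# across skin-type groups (illustrative only; not Buolamwini's code).
from collections import defaultdict

# Each record: (image_id, skin_type_group, detector_found_face).
# In a real audit these would come from a labelled, balanced benchmark;
# here the records are stand-ins.
results = [
    ("img001", "lighter", True),
    ("img002", "lighter", True),
    ("img003", "darker", False),
    ("img004", "darker", True),
    # ... many more labelled examples ...
]

def error_rate_by_group(records):
    """Fraction of faces the detector *missed*, per skin-type group."""
    totals, misses = defaultdict(int), defaultdict(int)
    for _, group, detected in records:
        totals[group] += 1
        if not detected:
            misses[group] += 1
    return {g: misses[g] / totals[g] for g in totals}

print(error_rate_by_group(results))
# A large gap between groups is the signal that the training data
# under-represented darker-skinned faces.
```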
Northpointe Inc. claims that its algorithm can score an individual’s risk of recidivism and pretrial risk. At the Black Policy Conference hosted at Harvard University in April 2018, Buolamwini projected a black woman’s face on the wall; Northpointe’s tool had scored her as a high risk of eight out of ten. By contrast, a white man was scored a low-risk three by the same algorithm. The black woman had four juvenile misdemeanor arrests, while the white man had grand theft auto and armed robbery charges. When asked to release the algorithm, Northpointe Inc. argued that it is proprietary, and thus releasing it would hurt profits. How exactly does a company decide which individuals are more “dangerous to society”? The answer, too often, is racism encoded in the inputs.
AI technology can be equitable. However, this requires transparency and accountability, terms thrown around in the public sector but never prioritized on the public agenda. Technological development needs to incorporate more of the faces and hands of dark-skinned people, instead of continuing the current pattern of marginalization. One concrete step toward transparency and accountability is being cognizant of proxies in algorithms. For example, one of the indicators tracked by risk algorithms like Northpointe’s is reports of abuse. Such reports can be made anonymously, meaning claims do not have to be substantiated, and black families are 3.5 times more likely to be reported than white families. A model that ingests report counts therefore ingests race by another name: reports of abuse are a proxy for race. We need ethical algorithms.
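The danger of proxies is easy to demonstrate. The sketch below is a hypothetical simulation (Northpointe’s real inputs and weights are proprietary and unknown, and all names and rates here are illustrative, tied only to the 3.5x reporting gap cited above): a rule that never reads the race field still reproduces a racial disparity, because the feature it does read is correlated with race.

```python
# Hypothetical demonstration of a proxy variable (illustrative only;
# not Northpointe's actual model, which is proprietary).
import random

random.seed(0)

def simulate_family(race):
    """True underlying risk is identical across races; only the chance
    of *being reported* differs (the 3.5x surveillance gap)."""
    report_rate = 0.35 if race == "black" else 0.10  # 3.5x disparity
    reported = random.random() < report_rate
    return {"race": race, "reported": reported}

population = [simulate_family("black") for _ in range(10_000)] + \
             [simulate_family("white") for _ in range(10_000)]

# A "race-blind" risk rule that flags any reported family.
def risk_flag(person):
    return person["reported"]  # race is never consulted...

flagged = {"black": 0, "white": 0}
for p in population:
    if risk_flag(p):
        flagged[p["race"]] += 1

print(flagged)  # roughly 3,500 black vs 1,000 white families flagged
# The rule never reads the race field, yet its outcomes reproduce the
# racial disparity, because "reported" is a proxy for race.
```

Removing the protected attribute from the inputs, in other words, does nothing if a correlated feature remains; an ethical algorithm has to account for how its inputs were generated.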
Buolamwini proposes storytelling as a way to keep the public attuned to the complexities of data analysis and program evaluation. We can use the devices that oppress us to document the ways in which they do. Lived experiences need to be included in the data in order for any research strategy to call itself interdisciplinary. We must also continue to put external pressure on software and technology companies so that equity in technology becomes a salient issue. Additionally, we must pressure our governments to ensure that transparency and accountability are not only demanded of the private sector, but are a priority in the public service as well.
At the Black Policy Conference, William Isaac, a PhD candidate in political science at Michigan State University, noted that data isn’t inherently malevolent. The American and Canadian constitutions, for example, both mandate that a census be conducted regularly in order to apportion electoral representation. However, in a technology-driven ecosystem, institutions and the data they collect can be a force for evil and oppression just as easily as they can be a force for good. Using data responsibly requires diversifying the people who collect and analyze it. Data has the capacity to drive equity, but those with power must give us a seat at the table.
—
Mirusha Yogarajah is a Tamil kid doing her Master of Public Policy at the University of Toronto. She cares deeply about policy innovation and social equity. She is the Editor-in-Chief of SPICYWTR Mag, plays Bananagrams and has an affinity for cheese plates. She writes to heal.