“I believe that human dignity must be at the center of any development in artificial intelligence. Through conversation and advocacy, we can end unfair detention and prevent technology from undermining human rights.”
In business and public policy, computer algorithms (often called artificial intelligence, or AI) are increasingly used to cut costs and improve efficiency. Yet organizations often rely on these algorithms without accounting for the factors they leave out. In the American justice system, judges use algorithms to score the “dangerousness” of defendants and justify detention before a trial even occurs. The data behind these scores is flawed and limited, and has been shown to discriminate based on age, gender, race, and income status, among other factors.
Justice requires that all people receive equal treatment under the law. When AI is built and deployed without proper guidance, or when it encodes the biases of its creators, it quickly becomes a tool that reinforces racism and reduces people to numbers and blunt categorizations. These algorithms, the data used to build them, and the contexts in which they are deployed all demand interrogation.
To combat the use of unjust technology, Landecker Democracy Fellow Greg is building a platform that centers the human stories behind AI in criminal justice and fights to prioritize democratic, anti-racist standards in AI development. The project is more than a space to engage with AI policy.
In today’s fractured political environment, we need spaces to discuss novel problems without the baggage of ideology. These conversations can spark the realization that people from diverse backgrounds and political perspectives share many of the same values.
The project will unite people in collaborative reform, elevate better technology, and protect human dignity.
The project will feature three distinct pieces:
- First, Greg will gather perspectives and stories from people who are closely involved with the use of AI within the American criminal justice system.
- Next, he will create a website that curates these stories and highlights the consequences of this technology for human beings.
- Lastly, he will develop a network of activists and advocates for fair and democratic AI through online discussions and a policy campaign that leverages stories and data to develop better AI policy.
Updated December 2021.