Event Recap / March 2023

Tech and Justice: Navigating Harms in the Digital Age

On February 15, 2023, the Center for Long-Term Cybersecurity convened an online panel discussion that brought together representatives from Berkeley Underground Scholars, an organization of UC Berkeley students who were formerly incarcerated, to discuss the technology-mediated harms that disproportionately affect communities impacted by the criminal legal system. 

In a conversation moderated by Ji Su Yoo, a PhD student at the UC Berkeley School of Information, the panelists discussed a wide range of topics, including privacy and surveillance, online financial scams, and the impact of these harms on post-incarceration life. The event also highlighted potential solutions and preventative measures that could address these issues.

Shani Shay, the Incarceration to College Program Coordinator at the UC Berkeley Underground Scholars, discussed the impact of new forms of surveillance that have emerged with the rise of digital technologies. “For anyone who has read The New Jim Crow, it talks about how racism is built into our [legal] code, so we don’t even recognize it, because racism and white supremacy are so insidious,” Shay said. “The whole technique of surveillance started from surveilling slaves…. It’s important to be mindful that it’s not only affecting people that look like me, it affects so many of us from so many different backgrounds. Surveillance is very often the key that allows for those apparatuses to be so powerful.”

Eli Martinez, a fourth-year undergraduate Sociology major with a Public Policy minor, explained that he studies the intersection of technologies of policing, social theory, and labor; his current research focuses on the experiences and post-incarceration employment prospects of individuals who have engaged in wildland firefighting while incarcerated in California’s fire camps. Martinez emphasized that the harms of technology affect not only formerly incarcerated people, but also their families. “Sometimes we don’t consider the families of these individuals,” he said.

He also noted that negative experiences with technology can erode trust in institutions more broadly. “Aside from the social and physical harms that we’re starting to see associated with technology-mediated harms, there’s also a growing body of literature showing how it’s eroding the trust that individuals have with these institutions,” Martinez said. “Even minimal contact with the criminal legal system can fuel what’s called ‘systems avoidance.’ This is where individuals avoid surveilling institutions, like banks, hospitals, the labor market, and even educational systems. The unintended consequences of these technologically mediated harms include that these populations go on to avoid our communities and institutions, which then only further entrenches social marginalization and accelerates downward mobility.”

Conan Minihan, a first-generation college student majoring in Data Science at UC Berkeley, studies a range of issues involving the criminal justice system, such as risk-assessment algorithms used in bail review and the use of tracking apps as an alternative to awaiting trial in county jail. “In our data science class, we studied the COMPAS algorithm, and the California prison realignment initiative,” he explained. “It was surreal, because I was a data point in both of these, and I have an intimate knowledge of how that data was collected. I like to bring my perspective of not just being a data scientist, but also having been formerly incarcerated.”

Minihan explained that the continual tracking of data about individuals can have adverse impacts throughout their lives. “Many people in county jails, state prisons, and federal prisons are diagnosed with mental illnesses,” he said. “You have a system where, in order to get mental health care, custody has to be informed. Then later, you put conditions on custody in your housing, or on your probation. How this data about people is transmitted and how it affects their decisions about receiving care has effects on their health and longevity, and on their families and communities. That data is transmitted and kept on them and made available to people who shouldn’t have it. I don’t really know how that is legal.”

Shay explained that the use of technology in the probation system leads to a lack of connection and inaccurate decision-making. “When I was on probation, my probation officer was a kiosk,” Shay said. “I would come to a kiosk, I would press a button and say I was there, and that was my check-in. That probation officer never got to know me, and if you know anything about probation, probation officers have almost complete control over your life. If you do anything that can be interpreted as a violation, then you’re violating probation. It’s very important that the person that has control over your body knows you very personally. The kiosk never knew me. I had a newborn daughter who was seven months old, and when I was taken to jail, I remember being taken aback that this lady didn’t even know me. She didn’t know my experience, because I was only checking into that kiosk.”

Shay also described how she won the Oakland housing lottery, only to be denied because of a background check that was out of date and inaccurate. “That almost broke me,” she said.


“Apps and companies are using people’s data to deny them access to something everyone else has access to,” Minihan said. “People who are already in danger of homelessness and housing [in]security are denied [housing at] more places. There’s an acceleration, just because this data is spread so far and wide.”

“It’s the negation of a basic human right,” Martinez added. “In the most wealthy country in the history of the world, we’re denying a basic human right to individuals or populations that are already marginalized and stigmatized.”