New report combines social work and artificial intelligence to address racial bias in housing for people experiencing homelessness

December 05, 2023 / by Michele Carroll
Racial inequities and the impacts of systemic bias are starkly evident in the population of people experiencing homelessness in Los Angeles, but a new report details a proposed method of collaboration between human and technological systems that could eliminate racial bias in housing allocation. The USC Center for AI in Society (CAIS), a joint venture between the USC Suzanne Dworak-Peck School of Social Work and the USC Viterbi School of Engineering, released the results of a three-year research project, in partnership with the California Policy Lab at UCLA and the Los Angeles Homeless Services Authority (LAHSA).
The Coordinated Entry System Triage Tool Research and Refinement (CESTTRR) report provides a series of recommendations that combine human social science expertise with artificial intelligence to improve and optimize the intake assessment process for fairer, more equitable housing allocation.
“We’ve made an important step forward for Los Angeles in addressing the really challenging social problems of racial bias and homelessness, and we’ve done it in a way that is both technologically innovative and also driven by the values of the community,” said Eric Rice, professor at USC Social Work, co-director of CAIS and project lead for CESTTRR. “It’s a great example of social science and data science coming together to do more than either discipline could do on their own.”
Funded by the Conrad N. Hilton Foundation, Home for Good Funders Collaborative and The Homeless Policy Research Institute, CESTTRR is the latest transdisciplinary research initiative in CAIS’s ongoing mission to harness the use of artificial intelligence to create social good, community impact and a more just, healthy and sustainable world. It aligns with USC's Frontiers of Computing initiative to accelerate advanced computing research and training, one of the "moonshot goals" set by President Carol L. Folt.
“The partnership between social work and engineering allows us to go beyond the data, to understand the human side behind it, and to create AI solutions tailored to the specific needs of the population,” said Phebe Vayanos, associate professor at USC Viterbi, co-director of CAIS and lead for the USC Data Science and Computerized System Design Team on CESTTRR. “As a result, we have built a system that can better align with the community’s values and more efficiently and fairly allocate limited housing resources. Our proposed system is also more transparent which helps build trust and improve participation.”
Asking the right questions, the right way
In 2018, LAHSA’s Ad Hoc Committee on Black People Experiencing Homelessness identified a need to examine the Coordinated Entry System (CES) Triage Tools for issues related to structural racism in homelessness and housing. Black individuals were not being allocated housing resources or exiting homelessness at the same rates as their white counterparts, indicating that the assessment tools were not fully capturing vulnerability for particular populations. According to the 2023 Greater Los Angeles Homeless Count, Black people account for 7.6% of the total population of Los Angeles County but 31.7% of the county’s homeless population.
Rather than turn their back on an inconvenient problem, LAHSA leadership asked the hard question: is the system broken and, if so, how do we fix it?
“It would have been very easy to say ‘We don’t want to know that this tool is broken because we don’t want to deal with the consequences of knowing that.’” Rice said. “But LAHSA wants to do the right thing. When it was confirmed that the CES tool wasn’t working the way that it should, they said, ‘Tell us how to make it better.’”
For Marina Genchev, director of systems and planning at LAHSA, the ability to collaborate with researchers who are invested in the homeless community and rooted in a deep understanding of the human crisis of homelessness made for a different, more relevant product.
“They understand that this is not science for science’s sake,” Genchev said. “We may be talking about huge quantities of data and where AI or prediction can come into place, but it is never a pure science exercise, it is always a human exercise using data.”
A priority for the research team was to optimize the human intake process, comparing the information gathered against outcomes data to identify weaknesses and improve the specific questions asked during assessment for high-risk vulnerability, as well as when and how these questions are presented to clients. Using the results of this initial analysis of the triage tool, the team narrowed the intake process down to the 19 questions that would most accurately predict future adverse events for a client and their likelihood of exiting homelessness if they received housing resources. The questions were modified to reduce or eliminate unintentional racial bias, and guidelines were provided on best practices for administering the tool, reducing client stress during intake and increasing the probability of capturing accurate information about each client’s vulnerability and needs.
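The report does not publish its selection method, but the general idea of ranking intake questions by how well each predicts later adverse events can be sketched roughly as follows. Everything here is illustrative: the data is synthetic, and the scoring rule is a deliberately simple stand-in for the statistical modeling the research teams actually used.

```python
# Hypothetical sketch: rank yes/no intake questions by how strongly each
# one separates clients who later had an adverse event from those who
# did not, then keep the top 19. Not the CESTTRR methodology.
import random

random.seed(0)

N_CLIENTS, N_QUESTIONS, K = 500, 40, 19

# Synthetic intake data: each client answers yes/no to every question.
answers = [[random.randint(0, 1) for _ in range(N_QUESTIONS)]
           for _ in range(N_CLIENTS)]
# Synthetic outcome: 1 = an adverse event (e.g. ER visit) observed later.
outcomes = [random.randint(0, 1) for _ in range(N_CLIENTS)]

def predictiveness(q_index):
    """Absolute difference in adverse-event rate between clients who
    answered yes vs. no to this question -- a crude predictiveness score."""
    yes = [o for row, o in zip(answers, outcomes) if row[q_index] == 1]
    no = [o for row, o in zip(answers, outcomes) if row[q_index] == 0]
    if not yes or not no:
        return 0.0
    return abs(sum(yes) / len(yes) - sum(no) / len(no))

scores = {q: predictiveness(q) for q in range(N_QUESTIONS)}
top_questions = sorted(scores, key=scores.get, reverse=True)[:K]
print(f"Kept {len(top_questions)} of {N_QUESTIONS} questions")
```

In practice a team would use proper statistical models and validate against held-out outcome data, but the shape of the task is the same: score every candidate question against observed outcomes, then retain the most predictive subset.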
“Improving the triage tool used in Los Angeles, and making the process more equitable, is a key first step since this is the first entry point for people seeking out services,” explained Janey Rountree, executive director of the California Policy Lab at UCLA, which developed algorithms to augment the vulnerability assessment. “We were glad to play a role in this important work and are eager to see the impact it can have.”
Rice points out that the combination of sensitive, personal data being collected by overwhelmed case managers from stressed and potentially mistrustful clients can lead to inaccuracies that create unintentional bias. A hypothetical example would be a Black man with a criminal history who feels uneasy revealing it to his case manager during intake, believing the information makes him less likely to be prioritized for housing. In fact, this information is an important predictor of high-risk vulnerability: individuals with this background are often more likely to continue on a downward spiral into repeat institutionalization, or even death, if left on the streets.
“You can have the best data science model in the world, but if the data that’s being collected is biased, you’re going to run into bias,” Rice said. “What we’ve been trying to do with this process is to help create less bias in the model, but also help create less bias in the way that we collect the data.”
An integral part of the CESTTRR project was the establishment of a Community Advisory Board (CAB) made up of frontline case managers who collect the intake data, resource “matchers” who allocate housing based on the intake data, and individuals with lived experience of homelessness who could speak directly to bias encountered through the triage tool. The CAB members provided invaluable perspective and helped the research team develop a truly community-driven refinement of the triage tool, grounded in a first-hand understanding of what fairness really means to people living through these circumstances and how that could be better reflected in the intake system.
“It was an amazing collaborative experience,” said Debra Jackson, a member of the CAB who matches housing allocation as program manager for the St. Joseph Center in Venice, California. “The USC team really pulled everybody together. With the new wording we can capture the client vulnerability while developing a relationship and engaging them in a different way.”
Beginning with the end in mind
An important element when creating a data science model is to begin with the end goal in mind, Vayanos explained. For the CESTTRR research, this involved a deeper understanding of how members of the CAB and other stakeholders defined vulnerability and risk within the context of housing allocation, and then anchoring the artificial intelligence to a specific outcome in order to achieve racial equity against that metric. Similar answers emerged repeatedly throughout this process: if vulnerable, high-risk individuals were not provided with housing, they would end up in emergency rooms, on psychiatric holds, or dead. If a housing intervention was received, these outcomes were less likely to occur.
This objective measure of adverse events, and of vulnerability to those outcomes, became the metric on which to anchor the system. The existing vulnerability assessment in the triage tool was less accurate at predicting negative outcomes for Black and Latinx clients than for white clients, and therefore systematically failed to identify high-risk people of color for housing resources.
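The kind of accuracy gap described here can be checked with a simple per-group comparison: score each client's predicted risk against the outcome that actually occurred, separately by group. The records and threshold below are invented to make the pattern visible; the report's actual evaluation is far more rigorous.

```python
# Illustrative check for a per-group predictive accuracy gap.
# All data is synthetic; "A" and "B" are placeholder group labels.
def group_accuracy(records, group):
    """Fraction of clients in `group` whose predicted risk flag
    (score >= 0.5) matches the observed outcome."""
    hits = [(r["score"] >= 0.5) == r["adverse_event"]
            for r in records if r["group"] == group]
    return sum(hits) / len(hits)

records = [
    {"group": "A", "score": 0.8, "adverse_event": True},
    {"group": "A", "score": 0.3, "adverse_event": False},
    {"group": "B", "score": 0.4, "adverse_event": True},   # high-risk client the tool missed
    {"group": "B", "score": 0.6, "adverse_event": False},
]
gap = group_accuracy(records, "A") - group_accuracy(records, "B")
print(f"accuracy gap: {gap:.2f}")
```

A nonzero gap means the tool is systematically worse at flagging risk for one group, which is exactly the failure mode that left high-risk Black and Latinx clients under-prioritized.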
“We have always thought of AI as making tools that can assist humans to do their jobs better, not replace humans as decision makers, especially around important issues like housing that can literally be a life-or-death outcome for people,” Vayanos said.
The CESTTRR report presents two data science models that address different needs, allowing LAHSA and other communities to adapt the criteria and inputs to their own preferences and human priorities. The first, developed by the California Policy Lab team co-led by Rountree, is a predictive model that links existing administrative data from touchpoints throughout the County of Los Angeles to predict future adverse outcomes and elevate those clients’ priority for available housing resources. This model helped to optimize the intake questions while also producing greater validity through less reliance on self-reported data.
The second model, developed by the USC team led by Vayanos, broadened the anchoring outcome to homelessness itself, creating a solution designed to produce equity across all groups in successfully exiting homelessness. When tested, this model reduced overall homelessness over time by increasing the number of individuals able to successfully exit homelessness, thereby freeing up additional resources for others.
“We are not applying existing solutions to new problems,” Vayanos said. “These are problems that need new research in order to be effectively addressed. By just changing who gets the resources in a heavily constrained system, we can actually increase the number of people who exit homelessness by three percent.”
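The core insight, that reallocating the same fixed stock of housing can raise total exits, can be illustrated with a toy allocation rule. The probabilities below are invented, and the real system described in the report is far richer (and, crucially, enforces equity across groups rather than maximizing exits alone); this is only a minimal sketch of the constrained-resource idea.

```python
# Toy sketch: with a fixed number of housing units, choose recipients to
# maximize expected successful exits. Synthetic probabilities; not the
# CESTTRR model, which also incorporates fairness constraints.
def allocate(clients, units):
    """Greedy: give units to the clients whose exit probability improves
    most when housed (uplift), up to capacity; return expected exits."""
    ranked = sorted(clients,
                    key=lambda c: c["p_exit_housed"] - c["p_exit_unhoused"],
                    reverse=True)
    housed = {c["id"] for c in ranked[:units]}
    return sum(c["p_exit_housed"] if c["id"] in housed else c["p_exit_unhoused"]
               for c in clients)

clients = [
    {"id": 1, "p_exit_housed": 0.9, "p_exit_unhoused": 0.2},
    {"id": 2, "p_exit_housed": 0.6, "p_exit_unhoused": 0.5},
    {"id": 3, "p_exit_housed": 0.8, "p_exit_unhoused": 0.1},
]
expected_exits = allocate(clients, units=2)
```

With two units, prioritizing the clients with the largest uplift (1 and 3) yields more expected exits than, say, first-come-first-served; the same capacity, differently allocated, helps more people leave homelessness.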
Rice and Vayanos are already expanding the reach of the CESTTRR report’s recommendations beyond Los Angeles, developing implementations with communities in Missouri and Washington. Vayanos has released a Python software package for social service agencies looking to increase the fairness, efficiency and transparency of their policies, and is finalizing another that allows local communities to adapt the CESTTRR models to their own priorities. The pair also continue to meet with the CAB for further guidance on outcome priorities and how best to address them.
“The hope is really to elevate their voices and make them heard among policymakers,” Vayanos said.
While the CESTTRR report does not presume to solve all of the problems associated with homelessness, the models it presents significantly move the needle by ensuring that the limited housing resources available within Los Angeles County are allocated fairly to those who are most vulnerable, regardless of race or ethnicity.
The heartbreaking challenge of this work is that, ultimately, it is a supply issue. Rice notes that while LAHSA has one of the most successful housing placement records in the country, providing housing solutions for around 25,000 people annually, approximately 75,000 people still sleep on the streets of Los Angeles every night.
“We have a long way to go to solve homelessness,” Rice said. “But we’re doing something to make for a more equitable, fair and community-driven process that will help to serve people experiencing homelessness — no matter who they are — in a more thoughtful and meaningful way.”