
Unveiling the Truth: State Auditor's Report Sparks Debate on Homelessness Spending


iAUDIT! - Ever since it was released on April 9, the State Auditor’s report on homelessness programs has caused quite an uproar. Some critics claim it is proof of massive waste and fraud in city and county homelessness agencies. Others say its recommendations to improve reporting and accountability merely reflect administrative problems, and that the programs themselves are working. To understand what its 53 pages really say, we need to examine the report from a few different angles.

First, it’s important to know what the audit was intended to do, and where the numbers come from. It was not a financial or fraud audit. Rather, as the introduction says, “The Joint Legislative Audit Committee requested an audit of the State’s homelessness funding, including an evaluation of the efforts undertaken by the State and two cities to monitor the cost-effectiveness of such spending”. The key phrase is “an evaluation of the efforts undertaken by the State and two cities to monitor the cost-effectiveness of such spending”. In other words, it is a performance audit of the State’s efforts to measure the effectiveness of the programs it funds. Fraud auditing is a specialized field that concentrates on financial processes, and nowhere in this report is the word “fraud” used.

The audit reviewed homelessness programs from the state’s perspective, primarily through the activities of the California Interagency Council on Homelessness (CA-ICH). Although it focused on four jurisdictions (Los Angeles, Santa Clara/San Jose, San Diego, and San Francisco), the audit assessed how well CA-ICH did at requiring homeless agencies to provide accurate and meaningful program statistics. Some people erroneously believe the numbers, especially the $24 billion figure, applied to specific county programs; they do not. That is the total five-year expenditure included in the report. Stated simply, the audit says that by failing to require Continuum of Care agencies (primarily counties) to report accurate and valuable data, CA-ICH is unable to show what the effect, if any, of the expenditures has been.

The report’s findings bring up the issue of waste. Like fraud, the word “waste” is not mentioned in the audit.  Waste can be a subjective term. Some people think any funds spent on homelessness are wasted, because they believe most unhoused people are homeless by choice or because of their poor life choices, and deserve no assistance at all. But waste can also mean not spending enough. If a program should spend $1,000 per person but leaders choose to spend $500, that $500 will be wasted because it will not produce the desired outcome. If your tire has a nail in it, constantly filling it with air might be cheaper than a new tire, but you will still need a new tire, and driving on a deflated tire will just make things worse. 

It is also important to understand the $24 billion figure.  As the report said, the audit covers “financial information…related to all state-funded homelessness programs”. Local initiatives like Measures H, HHH, and ULA are not included in the audit because they are not state programs. So, for example, the $1.3 billion the City of Los Angeles spends on homelessness is an amalgam of federal, state, county, and local funding, including the City’s General Fund. It is not unusual for municipal governments to leverage and combine funding sources, especially for housing construction. This makes separating the specific uses of a given funding source difficult for anyone other than a government financial expert.

So, if the audit does not say $24 billion was spent fraudulently or wastefully, what does it say? Again, looking at the report itself, it says CA-ICH has failed to show homelessness programs have had any effect on homelessness because it has not been collecting reliable or relevant data. In essence, the report says there is simply no way of telling what benefit spending $24 billion has had on homelessness. It does note homelessness increased during the period covered by the audit. Those who defend the current system could claim the programs work, and it’s just a matter of collecting the right data to prove it. Critics can claim with equal legitimacy that all $24 billion was wasted on ineffective and inefficient programs. It is unlikely either of those extreme views is correct.

To assess the effectiveness of LA’s homelessness programs, we need to consider the state audit in relation to data available at the local level.  In response to a federal judge’s order, the City and County have created websites that should include performance data. Because they have recently been set up, neither website is particularly user-friendly. They should be organized so users can intuitively follow the funding, descriptions, and outcomes of various programs, but at the moment display mostly invoices, budget sheets, and limited metrics reports.  Nevertheless, the websites offer some valuable insights. 

Both websites show the City and County are following proper financial procedures, so there is little possibility of outright fraud. The County website, in particular, shows payments were made based on detailed invoices referencing properly approved contracts. In a purely financial audit, there would be a low level of concern about fraud. But remember, financial audits focus on processes and procedures, not outcomes. Neither website tells us what we’re getting for the money being spent.

As a source of meaningful outcome measures, neither website has much to offer. As of this writing, the County’s website contains little more than invoices and long lists of shelter capacity and use, which are workload measures, not outcomes. The City includes some performance data, but as I’ve written previously, it tells us little about outcomes. The data are particularly weak when it comes to showing long-term housing stability. Although some reports show housing placements, there are no statistics on how long people stay housed. In addition, the state audit and the City and County websites all note that what is being counted are placement actions, not necessarily the number of individuals served. Based on previous reports from the City Controller’s office, we know numbers provided by LAHSA tend to be unreliable, especially when it comes to the number of people housed and shelter/interim housing occupancy. Therefore, on any given day, we don’t know how many people are in shelters or in permanent housing facilities.

One indicator of program effectiveness can be found in the state audit report. Appendix B shows interim and permanent housing data from fiscal year 2019-20 through March 2023, or roughly four fiscal years. In those four years, 274,817 people, or an average of 68,704 per year, were moved into interim or transitional housing. Another 46,363 were placed in permanent housing, an average of about 11,590 per year. However, of that number, 35,711, or 77 percent of the total, went into rapid rehousing programs like Inside Safe, and cannot be defined as truly housed. Remember, these are numbers for all the agencies in the audit, not just Los Angeles; they must be considered in terms of LA’s homeless population, which, according to LAHSA, was around 75,500 in 2023. Statewide and over those four years, 46,363 people were permanently housed. That’s just 61 percent of LA County’s homeless population for one year.
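The averages and percentages above follow directly from the Appendix B figures as quoted in the report; a quick calculation sketch (using only the numbers cited in this column, not independent data) shows how they are derived:

```python
# Figures as quoted from Appendix B of the state audit (FY 2019-20 through March 2023)
interim_total = 274_817    # placements into interim/transitional housing
permanent_total = 46_363   # placements into permanent housing
rapid_rehousing = 35_711   # permanent placements that were rapid rehousing
years = 4                  # roughly four fiscal years covered
la_homeless_2023 = 75_500  # LAHSA's 2023 count of LA County's homeless population

print(interim_total // years)                           # 68,704 interim placements per year
print(permanent_total // years)                         # about 11,590 permanent placements per year
print(round(100 * rapid_rehousing / permanent_total))   # 77 percent were rapid rehousing
print(round(100 * permanent_total / la_homeless_2023))  # 61 percent of LA's 2023 count
```

Note these are placement actions, not unique individuals, so the per-year averages likely overstate the number of distinct people served.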

By training and as a matter of professional objectivity, auditors don’t speculate in the absence of facts and evidence. The state audit correctly says the success or failure of most state programs cannot be proven because of CA-ICH’s poor data practices. It is never a good thing when a government agency has no empirical evidence of success, but the state’s audit wasn’t intended to be a deep dive into every county’s performance. However, there are some local-level data points we can examine.

Focusing on LAHSA’s numbers, we can infer the lack of effectiveness in LA’s homelessness programs. According to the 2023 PIT count, the number of unsheltered homeless people increased 14 percent, and the number of chronically homeless grew by 18 percent, while the overall number of homeless people increased nine percent. Unsheltered homelessness now accounts for 73 percent of LA County’s homeless population. Because of their acute needs, the unsheltered and chronically unhoused populations should be the ones targeted with the most intense outreach resources, yet their numbers increased disproportionately to the general homeless population. If current programs were effective, we would expect to see declines, or at least stabilization, in the number of unsheltered and chronically homeless.

Once again turning to the City’s homelessness program website, I have already documented how the shelter and housing numbers do not support the success stories we hear from the City and LAHSA. Numbers and percentages don’t relate to one another, and exits back into homelessness far exceed the number of people claimed to be housed. Worse yet, we don’t know how many people in the shelter-housing system are unique individuals. Almost every housing measure has a caveat that it includes an unknown number of repeat clients. The Homeless Management Information System (HMIS) is supposed to track individuals as they move through the service system, but as I explained in an October 2023 column, the HMIS is plagued by inconsistent data rules and a lack of shared information. LAHSA and other agencies claim it is extraordinarily difficult to track people because of HIPAA and other rules, yet the City of Chattanooga, Tennessee has managed to do just that, helping local officials make informed decisions about homelessness strategies.

Performance audits like the State Auditor’s report aren’t meant to play “gotcha” games and assign blame. But they are about accountability and the effective use of public resources. They should be used as tools to help officials make the changes needed to achieve the highest degree of success. However, audits cannot serve their purpose unless they have good data to work with and elected officials willing to make needed changes. Apparently, at the state and local levels, we have neither.

(Tim Campbell is a resident of Westchester who spent a career in public service and managed a municipal performance audit program. He focuses on outcomes instead of process.)