In a week-long series, prominent thinkers will look at ways to harness the private sector or extract more from a recalcitrant public sector in order to combat poverty and inequality. In the fifth post, Matthew Klein, co-chair of the Human Service Data Project, describes a new initiative to give nonprofits and human service agencies information that can help them do more with less.
With needs rising and government agency budgets shrinking, the pressure is on human service providers to do more with less, and on government to ensure that it is deploying its limited resources to the best effect.
This pressure, while increasingly acute, has been building for some time and is reflected in the social sector's focus on "measurable results." As any nonprofit can tell you, private funding streams and government contracts have increasingly asked service providers to quantify what they achieve. The rationale behind the trend is straightforward: funding -- especially in a time when dollars are tight -- should flow to more effective organizations. So some objective criteria are needed to assess which agencies are, in fact, the high performers.
The interest in measurable results is not only imposed by funders. Many providers, who are engaged day in and day out with trying to improve the lives of clients in challenging circumstances, are mission-driven and want to assess their progress in order to continually improve. They want to track what's working and what isn't and learn from others who are having success where they are struggling.
Yet most would agree that current performance reporting practices for government-funded social services rarely meet the needs of the people who rely on them. In most cases, funding decision makers do not get information that would allow them to compare organizations against one another, and individual service providers are unable to benchmark themselves against others to understand their work in context or learn from best-in-class organizations.
Why not? Two problems figure heavily. First, the metrics used to define performance are themselves not standardized. So, for example, a group of job training organizations might include those that define "participants" as anyone who signs up to receive services and those that define "participants" as those who pass a screening and trial period. Without clarifying distinctions like these, assessing organizations' placement rates for participants is not meaningful.
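To see how much these definitional differences matter, consider a sketch with invented numbers: the same hypothetical job training program reports two very different "placement rates" depending on whom it counts as a participant.

```python
# Hypothetical illustration of the definition problem described above.
# All figures are invented for this example.

signups = 200        # everyone who signed up to receive services
screened = 120       # those who also passed a screening and trial period
placements = 90      # people ultimately placed in jobs

# Provider-style definition A: a "participant" is anyone who signs up.
rate_broad = placements / signups      # 90 / 200

# Provider-style definition B: a "participant" must pass screening first.
rate_narrow = placements / screened    # 90 / 120

print(f"Broad definition of participant:  {rate_broad:.0%}")   # 45%
print(f"Narrow definition of participant: {rate_narrow:.0%}")  # 75%
```

The identical program looks either mediocre or outstanding, which is exactly why comparing placement rates across organizations is meaningless until the underlying terms are standardized.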
Second, information reporting typically only goes one way -- from the service provider to the government funding agency. Even when a given funding contract does strictly define performance terms and all of the funded agencies measure success the same way, they rarely find out how their reported results rate against others. Providers don't know whose practices to emulate.
The experiment in NYC
In New York City, a cross-sector initiative called the Human Service Data Project is under way to improve how performance is reported and data is shared. In 2009, Mayor Michael Bloomberg expressed concern about the need to keep the city's nonprofit community strong given the economic downturn, and his Deputy Mayor for Health and Human Services, Linda Gibbs, convened a task force of nonprofit leaders to identify action items. The need for better, more timely information about the performance and financial health of human service agencies emerged as a priority, and the Human Service Data Project, dubbed HSData, was launched in response.
Along with Louisa Chafee, Director for Management Innovation in the Office of Deputy Mayor Gibbs, I co-chair the HSData Project and we report to Deputy Mayor Gibbs. The effort involves government agencies, private funders, and literally hundreds of human service agencies that have participated in surveys, focus groups, and working sessions to shape the goals and substance of the project. Based on this input, HSData is working to allow human service providers to:
- Track performance results in relation to common measures to facilitate peer learning and progress within their respective fields. This involves identifying those core results that practitioners believe are important and meaningful, ensuring these metrics have clear definitions so that everyone means the same thing when they use the same words, incorporating these metrics into the city's ongoing funding processes, and reporting back to providers how their results compare to their peers. The process of standardizing metric definitions is happening first in three sub-sectors: workforce development, senior services, and adult alternative-to-incarceration programs.
- Analyze financial information to gain insights about their own fiscal practices. This analysis should help nonprofits assess their fiscal health with reference to other, similar organizations and articulate the full, true cost of program activities so they can better understand the implications of operating with public funding.
- Store institutional documents in an electronic repository to reduce the redundant exchange of paperwork with multiple funding sources. One of the major complaints from providers is the burden of regulatory compliance, and online document sharing is a low-hanging-fruit solution that everyone agrees is overdue.
In my day job, I direct a small foundation that focuses on social innovation in New York City, and from my perspective there are a few aspects of HSData that I think are particularly exciting.
First, the effort is working to align the perspectives of multiple sectors. Instead of policymakers developing outcome definitions in a vacuum, Deputy Mayor Gibbs has asked service providers to specify the results that are truly important and meaningful for their clients and then to help create a set of common definitions that accommodate important programmatic nuances. So a concept like "job placement rate" will not be used overly broadly, but will be broken out based on provider input to clarify what is meant by "job" (full time or part time, with retention or not, etc.) or "rate" (of all graduates of the program? Of original enrollees? And so on) and allow for more relevant apples-to-apples comparisons. It's impressive that a city is embracing such a bottom-up process to influence its funding approach.
Second, the scale of New York City's human service sector provides an opportunity for standardized outcome definitions to gain traction. The idea of common metrics is not new; in fact, HSData is inspired by the important work of other organizations like P/PV's Benchmarking Project and the Urban Institute's Outcome Indicators Project. But nonprofits have little reason to dedicate the time and resources to follow a specific outcome taxonomy unless there are funding implications attached to it. In New York City, over 5,000 health and human service organizations have paid employees, and about 1,300 of the groups are funded at least in part through city contracts. With its purchasing power, the city is in the extraordinary position to generate widespread use of common definitions that can bring coherence to a fragmented space.
Third, HSData represents an early example of applying open data principles to social services. As technology advances, there are a growing number of solutions that allow databases to talk to each other. But information can only flow between databases when the fields and definitions in each are the same. The common metrics of HSData can hopefully allow more automated ways for providers and government to share information. Even better would be if the data (with appropriate anonymity) were made available more broadly to be analyzed and presented visually. Open data in action, for example in weather, teacher performance, crime and many other areas, can bring new insights and benefits to the public. It's possible that standardized data about performance results, if accessible to broader analysis, could lead to new thinking about the underlying services.
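The interoperability point above can be sketched concretely. In this hypothetical example (the field names and figures are invented, not HSData's actual schema), two providers report against the same agreed-upon fields, so their records can be pooled and compared with no manual translation:

```python
# Hypothetical sketch: a shared metric schema lets records from different
# providers flow into one analysis. Field names and data are invented.

COMMON_FIELDS = {"provider", "enrollees", "full_time_placements"}

provider_a = {"provider": "A", "enrollees": 120, "full_time_placements": 90}
provider_b = {"provider": "B", "enrollees": 150, "full_time_placements": 105}

def placement_rate(record):
    """Compute a placement rate for any record using the common schema."""
    # The calculation works only because both providers mean the same
    # thing by "enrollees" and "full_time_placements".
    missing = COMMON_FIELDS - record.keys()
    if missing:
        raise ValueError(f"record missing common fields: {missing}")
    return record["full_time_placements"] / record["enrollees"]

for rec in (provider_a, provider_b):
    print(rec["provider"], f"{placement_rate(rec):.0%}")
```

Once every funded agency reports in this shared vocabulary, the same small piece of analysis runs across hundreds of organizations, which is what makes benchmarking and broader open-data analysis feasible.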
HSData is still a work in progress, and it remains to be seen whether all the benefits will come to fruition. There's no doubt, though, that in an age of austerity the motivation behind HSData and efforts like it -- to make performance data more meaningful, more easily shared, and more important in the allocation of resources -- is only going to get stronger.
Matthew Klein is the Executive Director of Blue Ridge Foundation New York, a social innovation fund in New York City that incubates start-up organizations with high potential ideas for increasing equal opportunity and economic mobility.