The Power of AI and Predictive Analytics in Health and Human Services

Do predictive analytics and AI have something to offer social service organizations? In this blog, we explore current and future implementations of AI and machine learning in social service delivery.

There are two ways to fight fires. The first is to wait until a fire starts, and then rush to put it out. The other is to take preventive measures to keep the fire from starting in the first place. Rushing around putting out fires without taking preventive measures is by far the more dangerous and expensive option: it yields poorer outcomes and incurs higher costs.

The challenge is knowing what preventive measures to take. For effective prevention, you need to know things like when and where fires are most likely to occur, and the biggest contributing factors to fire risk—the data that allow you to predict and prevent fires.

The field of health and human services (HHS) is no different. If we’re providing services solely in reaction to crises, we’re just putting out fires, and with the same result: poorer outcomes, higher costs.

Major progress in care coordination and case management has already transformed how we provide healthcare and social services. But we’re only beginning to see the potential of artificial intelligence (AI) and machine learning to deliver the predictive analytics that would let us truly stop putting out fires and start preventing them.

AI and machine learning can help identify trends and patterns that are difficult to tease out with more traditional methods. Automating data analytics can also free up countless hours of tedious manual work combing through data, time that can instead be spent serving at-risk populations.

AI at Work

AI-driven, predictive analytics can identify important risk factors, determine the optimal time to screen individuals who are at risk, and decide which populations or individuals to prioritize—processes that are critical for providers to get the most mileage out of limited resources.
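To make that concrete, here is a minimal sketch of what a risk-scoring and prioritization workflow might look like. It is illustrative only: the file name, feature columns, and outcome label below are hypothetical stand-ins, not references to any particular dataset or product.

# Minimal, illustrative sketch of a predictive risk-scoring workflow (Python).
# All file names, column names, and thresholds are hypothetical examples.
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Hypothetical client records: demographics, service history, and a
# binary label marking whether a crisis occurred within 90 days.
df = pd.read_csv("client_records.csv")
features = ["age", "prior_visits", "missed_appointments", "housing_instability"]
X, y = df[features], df["crisis_within_90_days"]

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=42
)

# Train a simple model and check how well it separates high- and low-risk cases.
model = GradientBoostingClassifier()
model.fit(X_train, y_train)
print("AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))

# Score the full caseload and surface the highest-risk clients first,
# so limited outreach resources go where they are most likely to help.
df["risk_score"] = model.predict_proba(X)[:, 1]
priority_list = df.sort_values("risk_score", ascending=False).head(50)

The ranked list at the end is the practical payoff: outreach staff can start with the clients the model flags as highest risk rather than working through a caseload in arbitrary order.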

There’s already evidence to show the AI approach is working. Harvard University’s Teamcore group, whose mission is “AI for social good,” has been working on many projects to improve outcomes for HHS providers and organizations.

In India, where maternal mortality is high, the Indian nonprofit ARMMAN is using artificial intelligence to better provide critical care services to new and expecting mothers. The organization had noticed that many participants in its service programs were dropping out. Teamcore built an AI system for ARMMAN designed to help the nonprofit identify participants at risk of dropping out. The system significantly improved engagement between expecting mothers and critical services, decreasing drop-out rates by more than 30%.

Teamcore also set about using AI-driven statistical modeling to inform policies that would optimize COVID-19 contact tracing, identify targeted quarantines rather than blanket shutdowns, and more strategically control the spread of the disease. The team was able to assess the impact of policies and identify less-effective actions that limited population mobility without adequately serving to limit transmission.

Another Teamcore project used AI techniques to create a predictive model that could help case workers prevent suicide among active-duty service members and homeless youth. By analyzing social networks and identifying stress-inducing transitions, this model could be a vital tool for identifying early warning signs and help service organizations provide targeted, life-saving interventions.

But AI Isn’t Flawless

As useful as AI can be, it can also be flawed. Unrecognized bias in the model, for instance, could unintentionally exclude or deprioritize traditionally marginalized communities. AI errors or poor models could overlook dangerous situations and result in harmful outcomes. Holes in security could result in data breaches or unauthorized release of sensitive, protected information.

Writing for the Brookings Institution, economists Michael Lokshin and Nithin Umapathi identify significant issues that HHS organizations need to be aware of.

First, it’s critical that your data, and the systems that manage it, be current and high quality. As Lokshin and Umapathi state, “The quality of administrative data profoundly affects the efficacy of AI. In Canada, the poor quality of the data created errors that led to subpar foster placement and failure to remove children from unsafe environments. The tendency to favor legacy systems can undermine efforts to improve the data architecture.”

Technology keeps advancing rapidly; the longer you hold on to legacy systems, the further behind your data architecture will fall.

Second, as with any database that captures and stores personally identifiable information (PII) or other sensitive information, organizations need to be very careful with data privacy and security. “The Florida Department of Child and Family collected multidimensional data on students’ education, health, and home environment,” write Lokshin and Umapathi.

“However, this data has since been interfaced with the Sheriff’s Office’s records to identify and maintain a database of juveniles who are at risk of becoming prolific offenders. In such cases, data integration creates new opportunities for controversial overreach, deviating from the intentions under which data was originally collected.”

Dr. Jim Watson, president and CEO of Genesis Physicians Group, warns in an interview with Healthcare IT News that it’s critical your custom AI and machine learning models be trained on the specific population you serve, rather than using an off-the-shelf solution that is trained on general population data. Without being trained on the right data, the predictive model will be much less accurate because it will not factor in the unique situations specific to your community.

And if an off-the-shelf model doesn’t draw on the same data types or sources that your organization uses, whatever predictive power it has can degrade even further.
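One low-cost safeguard is to evaluate any pretrained model against your own historical records before its scores drive decisions. Below is a hedged sketch of that check, assuming a hypothetical vendor-supplied model file and local records using the same illustrative columns as above; the names are placeholders, not real products.

# Illustrative check: does a pretrained, general-population model actually
# hold up on your own community's data? File and column names are hypothetical.
import joblib
import pandas as pd
from sklearn.metrics import roc_auc_score

local = pd.read_csv("local_client_records.csv")         # your organization's history
general_model = joblib.load("off_the_shelf_model.pkl")  # vendor-supplied model

features = ["age", "prior_visits", "missed_appointments", "housing_instability"]
X, y = local[features], local["crisis_within_90_days"]

# If this score falls well below the vendor's reported performance, the model is
# likely missing factors specific to your community and should be retrained
# (or fine-tuned) on local data before it informs real decisions.
print("AUC on local population:", roc_auc_score(y, general_model.predict_proba(X)[:, 1]))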

Opportunities Abound

Knowledge is power. With the proper foresight and care in selecting and developing the right predictive analytics tools, the potential benefits are vast.

What calculators did for the field of mathematics, AI and machine learning could do for the field of HHS. Targeted interventions, preventive service provision, and strategic resource deployment can help organizations of all sizes—from small nonprofits to government agencies—to craft intelligent, efficient policies and programs that finally allow HHS providers to stop putting out fires and start preventing them.

Eccovia’s recently announced ClientInsight is a data warehouse and business-intelligence platform that represents its ongoing effort to help HHS providers and organizations reduce costs and achieve better outcomes for the people they serve. ClientInsight is hosted in Azure and uses AI and machine learning libraries—trained on your data—to provide you with better, clearer data and tease out critical insights and patterns that can inform better business strategies.

To learn more, check out our website and talk to one of our experts.
