AI & Ethics: An Analysis of the Allegheny County Screening Tool
According to public records, more than 15,000 calls are made annually to report concerns about child health and safety in Allegheny County, Pennsylvania (Allegheny DHS, 2016). To manage these referrals, the county implemented the Allegheny Family Screening Tool (AFST) in 2016, a predictive algorithm that helps phone screeners at Child Protection Services (CPS) respond to calls and identify the families most in need of further investigation. The algorithm maps dozens of data points about the child's family members and calculates a score between 1 and 20 representing the predicted risk to the child's safety and welfare in the home. The tool's goal is to more accurately predict which children are at the highest risk of danger and should be removed from their homes (Allegheny DHS, 2016).
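To make the tool's mechanics concrete, below is a minimal sketch of how a weighted risk score of this kind might be computed. The feature names, weights, and logistic form are my own invented assumptions for illustration; this is not the AFST's actual model.

```python
# A minimal, hypothetical sketch of a weighted risk score on a 1-20 scale.
# Feature names, weights, and the logistic form are invented for illustration;
# this is NOT the actual AFST model.
import math

def risk_score(features: dict[str, float], weights: dict[str, float]) -> int:
    """Combine weighted data points and map the result to a 1-20 score."""
    # Weighted sum of the family's data points (a simple linear form is assumed).
    z = sum(weights[name] * value for name, value in features.items())
    # Squash the sum into a probability between 0 and 1.
    probability = 1 / (1 + math.exp(-z))
    # Rescale to the 1-20 band that screeners see.
    return max(1, min(20, round(probability * 20)))

# Invented example: each feature is a normalized data point about the family.
features = {"prior_referrals": 0.8, "public_benefits_use": 0.5, "household_size": 0.2}
weights = {"prior_referrals": 1.4, "public_benefits_use": 0.9, "household_size": -0.1}
print(risk_score(features, weights))  # prints 16 for these invented inputs
```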
This first-of-its-kind algorithm has been both praised and criticized for its role in decision-making on behalf of the county. By analyzing the strengths and weaknesses of both human cognition and machine learning, we can begin to see how CPS screeners and the AFST can best work together. In this paper, I will argue that the county should continue using the algorithm under three conditions. First, and most critically, the tool should inform human decision-making, but never replace it as a standalone solution; second, the system must support long-term family needs; and third, its decisions and outcomes should be explainable.
Inform, Not Make, Decisions
The Allegheny Family Screening Tool is currently used by county screeners as one of many touchpoints that guide their decision to investigate a family and, if necessary, remove a child from the home. When a screener receives a call, they make their own assessment of the situation and then look up the Family Screening Score calculated by the algorithm. If the score indicates a high risk, they will recommend an in-person home visit before any further action is taken with the child. This structure, characterized by active checks and balances between the algorithm and the county workers, is essential to the tool's long-term success.
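A minimal sketch of this human-in-the-loop structure is below. The threshold and decision labels are my own assumptions, not the county's actual policy, but they illustrate how a score can inform the screener without deciding for them.

```python
# Hypothetical sketch of the screening workflow; the threshold and labels
# are assumptions for illustration, not Allegheny County policy.
HIGH_RISK_THRESHOLD = 15  # assumed cutoff on the 1-20 scale

def screen_referral(screener_assessment: str, afst_score: int) -> str:
    """The algorithm informs the screener's decision; it never replaces it."""
    if afst_score >= HIGH_RISK_THRESHOLD:
        # A high score triggers a recommendation, not an automatic action:
        # a worker still visits the home before anything happens to the child.
        return "recommend in-person home visit"
    if screener_assessment == "serious concern":
        # The human can escalate even when the score is low.
        return "recommend in-person home visit"
    return "screen out and document the call"

print(screen_referral("no immediate concern", afst_score=18))
```

Note that the human path can escalate even when the score does not, which is precisely the check-and-balance this structure provides.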
In her Medium article "The Point of Collection," Mimi Onuoha (2016) explains, "Software thrives on abstraction. It flattens out individual variations in favor of types and models." In the case of the Allegheny Family Screening Tool, decisions are being made that impact families, communities, and the greater society. That weight can only be felt by a human, which is why screeners should be trained to understand the algorithm's shortcomings, as well as their own essential role in decision-making.
Algorithms are excellent at analyzing large volumes of data points and producing 'informed outcomes,' whereas humans are much better at contextualizing and recognizing nuances in data. How might we look at a recovering drug addict differently than an algorithm would? Is he or she active in the community, or known for erratic behavior? By working together, the two systems can leverage each other's strengths to provide thoughtful and holistic recommendations, much as Coons imagined when he described the "collaboration [between humans and technology] as a symbiotic dance" (Cardoso Llach, 2015). Just think: there is no such thing as stepping on an algorithm's toes. By design, we must work together, both as lead and follower, to create harmony.
Some might argue that algorithms are designed to be objective, free from the faults of human judgement (Hurley, 2018), and can therefore work more accurately on their own. As New York Times author Dan Hurley (2018) writes, "What screeners have is a lot of data…and the human brain is not that deft at harnessing and making sense of all the data." While it is true that computers can draw connections that humans cannot, it is also the case that only people can humanize the decision-making process. This is not new. Across industries, recruiters have been using screening tools to identify candidates for new jobs since the early 2000s. Yes, they first use algorithms to surface qualified candidates – but then they reach out and schedule a set of phone calls before recommending the individual to the company. Even universities, like Carnegie Mellon, use quantitative GRE scores to qualify candidates for admission, but not without assessing qualitative assets like previous work experience and a statement of purpose. "Perfect on paper" might be good enough for an algorithm, but it doesn't always cut it in the real world. At Allegheny County CPS, workers can apply their cognitive strengths by asking critical questions at key decision points and visiting families in person to clarify data-driven judgements.
Support Long-term Family Needs
To date, Allegheny County and third-party researchers have shown that the AFST is supporting the county's goal of more accurately identifying children at high risk, but they have also recognized overt biases in the data itself (Courtland, 2018). Although challenging, the fairness and accuracy of the algorithm must continue to be monitored by objective human parties. In tandem, in order to fully support the long-term welfare of children and families, the county must begin to proactively address systemic inequalities.
Third-party researchers are currently involved in the development of the AFST and serve as an essential check on the department's commitment to the wellbeing of all families. In the case of Allegheny County, they have helped identify that families of color are more affected by family separation than white families (Courtland, 2018). This has led to discussions about data collection, unearthing the reality that predominantly poor families use public services, which means their data is more widely collected and represented in the tool (Courtland, 2018). These imbalances highlight a larger-scale challenge that cannot be solved with a reactionary algorithm or any single-step solution.
Systemic inequalities, such as generational poverty, are wicked problems – a term coined by 20th-century planner Horst Rittel (Rittel & Webber, 1973) to describe a type of ill-defined, complex, and systemic problem. Allegheny County should use Terry Irwin's transition design approach to address these challenges. Irwin (2018) explains that "wicked problem resolution requires myriad interventions at multiple levels of scale…[they] always have their roots in the past because it takes years, decades, and even longer for problems to become wicked." Some critics might argue that proving transition design's success is much more difficult than proving the success of an algorithm. While I agree that it is difficult to measure quickly, such an approach can bring greater social and economic benefits to the county in the long term. Only proactive approaches like this one can prevent the conditions that lead to child welfare concerns.
Make it Explainable
The Allegheny Family Screening Tool is unique in that it was co-developed by an economist and is operated by the county, not a third-party company. In addition, the county has disclosed the variables that are weighed to determine a score. This is important because an algorithm needs to be held accountable (Courtland, 2018) when it is used to make decisions about people.
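Disclosed variables make this kind of accountability tractable: with a simple weighted model like the one sketched earlier, a score can be decomposed into per-variable contributions that a family or an auditor can inspect. The sketch below assumes that same invented linear form and is illustrative only, not the county's actual method.

```python
# Hypothetical sketch: decomposing a score into per-variable contributions.
# Assumes the invented linear form sketched earlier; NOT the AFST's method.
def explain_score(features: dict[str, float], weights: dict[str, float]) -> None:
    """Print each disclosed variable's contribution to the weighted sum."""
    contributions = {name: weights[name] * value for name, value in features.items()}
    # Largest influences (positive or negative) are listed first.
    for name, value in sorted(contributions.items(), key=lambda kv: -abs(kv[1])):
        print(f"{name}: {value:+.2f}")

explain_score(
    {"prior_referrals": 0.8, "public_benefits_use": 0.5, "household_size": 0.2},
    {"prior_referrals": 1.4, "public_benefits_use": 0.9, "household_size": -0.1},
)
# prior_referrals: +1.12
# public_benefits_use: +0.45
# household_size: -0.02
```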
Across the globe, lawmakers are passing bills to make the information used in software more transparent. Just this year, France's president made all algorithms used by the government open access, and Europe passed the General Data Protection Regulation to promote algorithmic accountability (Courtland, 2018). These legislative updates reflect a clear and immediate need for governments and agencies both to engage deeply in the creation of their algorithms in order to understand them, and to open their doors to public input.
Although I agree with this approach, the greater challenge is finding knowledgeable developers and designers to create these algorithms. Currently, UX designers have poor mental models of machine learning (Dove, Halskov, Forlizzi, & Zimmerman, 2017), which makes it very difficult for them to design a transparent system. Some critics might argue that understanding statistical inference is beyond the limits of a designer, and that computer scientists and economists are the ones who should deeply understand these systems. While I agree that a designer's role is not rooted in mathematical inference, I also recognize that machine-generated algorithms are increasingly used to make critical decisions that impact people's lives. As human-centered creators, we should study this emerging area of technology in order to build solutions that can be explained, tested, and fundamentally made accountable to their stakeholders.
Emerging Model
The Allegheny Family Screening Tool is just one of thousands of algorithms used by governments and businesses to make decisions about people. Allegheny County's predictive tool is an emerging model for accountable algorithms, because the county has leveraged the tool's affordances and relied on human action where it falls short.
This is reflected in the county's decision to keep human decision-making in the process, rather than deploy the algorithm as an automated solution. In addition, the county is working with public data and measurable goals, while also inviting critical researchers to test for biases. These conditions do not make the algorithm anywhere near perfect, but they do offer an example of how we can better analyze and measure computer-aided decisions in our data-driven society.
Sources:
Allegheny County Department of Human Services. (2017, July 20). Allegheny Family Screening Tool frequently asked questions. Retrieved October 25, 2018, from https://www.alleghenycountyanalytics.us/index.
Courtland, R. (2018, June 20). Bias detectives: The researchers striving to make algorithms fair. Retrieved from https://www.nature.com/articles/d41586-018-05469-3
Cardoso Llach, D. (2015). Builders of the Vision: Software and the Imagination of Design. London and New York: Routledge.
Dove, G., Halskov, K., Forlizzi, J., & Zimmerman, J. (2017). UX design innovation: Challenges for working with machine learning as a design material. In Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems (CHI '17), 278–288.
Hurley, D. (2018, January 02). Can an Algorithm Tell When Kids Are in Danger? Retrieved from https://www.nytimes.com/2018/01/02/magazine/can-an-algorithm-tell-when-kids-are-in-danger.html
Onuoha, M. (2016, February 10). The point of collection. Data & Society: Points. Retrieved from https://points.datasociety.net/the-point-of-collection-8ee44ad7c2fa
Rittel, H. W. J., & Webber, M. M. (1973). Dilemmas in a general theory of planning. Policy Sciences, 4, 155–169.
Irwin, T. (2018). The emerging transition design approach. DRS2018.