Oregon drops AI tool used in child abuse cases

Oregon child welfare officials will stop using an algorithm to help decide which families are investigated by social workers, opting instead for a new process that officials say will lead to better and more racially equitable decisions.

The move comes weeks after an Associated Press review of a separate algorithmic tool used in Pennsylvania, which originally inspired Oregon officials to develop their model and was found to flag a disproportionate number of Black children for “mandatory” neglect investigations when it was first put in place.

The Oregon Department of Human Services announced to staff via email last month that after a “thorough analysis,” agency hotline workers would stop using the algorithm in late June to reduce disparities in families being investigated for child abuse and neglect by child protective services.

“We are committed to continuous improvement in quality and fairness,” Lacey Andresen, the agency’s deputy director, said in the May 19 email.

Jake Sunderland, a spokesman for the department, said the existing algorithm “would no longer be needed” because it cannot be used with the state’s new screening process. He declined to provide further details on why Oregon decided to replace the algorithm and would not elaborate on any related disparities that influenced the policy change.

This story, supported by the Pulitzer Center on Crisis Reporting, is part of an ongoing Associated Press series, “Tracked,” that investigates the power and consequences of decisions made by algorithms on people’s daily lives.

Decisions by hotline workers regarding reports of child abuse and neglect mark a critical moment in the investigation process, when social workers first decide whether families should face state intervention. The stakes are high – failing to respond to an allegation could end in a child’s death, but scrutinizing a family’s life could set them up for separation.

From California to Colorado and Pennsylvania, as child welfare agencies use or consider implementing algorithms, an AP review has identified concerns about transparency, reliability and racial disparities in the use of the technology, including its potential to reinforce bias in the child protection system.

US Senator Ron Wyden, a Democrat from Oregon, said he had long been concerned about the algorithms used by his state’s child welfare system and contacted the department again after the Pennsylvania story to ask about racial bias – a pervasive concern with the growing use of artificial intelligence tools in child protective services.

“Making decisions about what should happen to children and families is far too big a task to hand over to untested algorithms,” Wyden said in a statement. “I am pleased that the Oregon Department of Human Services is taking the concerns I have raised about racial bias seriously and is suspending the use of its screening tool.”

Sunderland said Oregon child welfare officials had long considered changing their investigative process before making the announcement last month.

He added that the state recently decided the algorithm would be completely replaced by its new program, called the Structured Decision-Making Model, which aligns with many other child welfare jurisdictions across the country.

Oregon’s Safety Screening Tool was inspired by the influential Allegheny Family Screening Tool, named for the county surrounding Pittsburgh, and aims to predict the risk that children will end up in foster care or be the subject of an investigation in the future. It was first implemented in 2018. Social workers consult the numerical risk scores generated by the algorithm – the higher the number, the greater the risk – when deciding whether another social worker should go out to investigate the family.

But Oregon officials tweaked their original algorithm to rely only on internal child welfare data in calculating a family’s risk, and deliberately tried to address racial bias in its design with an “equity correction.”

In response to Carnegie Mellon University researchers’ findings that the Allegheny County algorithm initially flagged a disproportionate number of Black families for “mandatory” child neglect investigations, county officials called the research “hypothetical” and noted that social workers can always override the tool, which was never meant to be used alone.

Wyden is a lead sponsor of a bill to establish transparency and national oversight of software, algorithms and other automated systems.

“With the livelihoods and safety of children and families at stake, the technology used by the state must be fair – and I will continue to monitor it,” Wyden said.

A second tool Oregon developed — an algorithm to help decide when foster children can be reunited with their families — remains on hiatus as researchers rework the model. Sunderland said the pilot was paused months ago due to inadequate data, but that there was “no expectation that it will be reinstated soon.”

In recent years, while under the watch of a crisis oversight committee ordered by the governor, the state agency — which is now preparing to hire its eighth new director of child protection in six years — considered three additional algorithms, including predictive models that sought to assess a child’s risk of death and serious injury, whether children should be placed in foster care, and if so, where. Sunderland said the agency never built those tools, however.

Follow Sally Ho and Garance Burke on Twitter at @_sallyho and @garanceburke.

Contact the AP Global Investigation Team at [email protected] or https://www.ap.org/tips/