New Algorithms Perpetuate Old Biases in Child Welfare Cases

I believe unfair predictive software may have influenced an investigator’s recommendation to take my children away. I’m surely not alone.

  • Some child welfare agencies now use predictive analytics to help decide whether a child is at risk in their home.

    Visual: Michał Parzuchowski/Unsplash

In 1956, author Philip K. Dick published a short story called “The Minority Report.” Set in a near-future U.S., the piece describes a felony prevention system called “Precrime,” which harnesses the psychic powers of three mutated humans, dubbed “precogs.” The precogs predict egregious crimes like rape and murder before they ever take place. Then, Precrime detectives arrest the future assailants before they have the chance to complete the act. The targeting of people who haven’t committed an offense is justified by an inspired faith in the precogs’ psychic abilities. “The precogs are never wrong,” is a narrative refrain.


That justification begins to unravel when Precrime Commissioner John Anderton is faced with a devastating prediction: that he is destined to commit murder. Incredulous, he decides to investigate how the precogs could have gotten him wrong, and he ultimately discovers that the three precogs do not always see eye-to-eye. Sometimes one precog disagrees with the others over whether the accused will go through with the crime, a discovery that undermines the entire Precrime system. But Anderton decides the public benefit of Precrime is worth the risk of occasionally arresting innocent citizens, and to preserve the public’s faith in the system, he sacrifices himself — and his victim — by fulfilling the precogs’ prediction.

To my knowledge, no branch of the actual U.S. government has access to psychic mutants. But, increasingly, child welfare agencies are behaving as though they do. Agencies in several states now use predictive analytics to help decide whether a child is at risk in their home. By placing undue faith in algorithms, however, the agencies are performing a moral calculus akin to Anderton’s, except without the element of self-sacrifice.

It’s no secret that only about one-fifth of investigated maltreatment allegations are ever substantiated by a court or child welfare agency. According to the U.S. Department of Health and Human Services, for example, more than 100,000 kids were removed from their families in 2001 based on allegations that turned out to be unfounded. That’s to say nothing of the substantiated cases that are overturned on appeal, or the families that give in to the demands of a child welfare agency even if they don’t agree with court findings. Predictive analytics have been introduced into the child welfare industry with the promise of making the system more just. But in reality they make it easier to target already vulnerable populations — and make it more difficult to appeal decisions that were made in error.

Cathy O’Neil, a mathematician and the author of “Weapons of Math Destruction,” says that “data scientists are translators of moral choices.” Algorithms do not provide divine insight into people’s objective moral good. Rather, they predict patterns from the data they are fed — data that depend on which information is available and on which information the programmers deem relevant. This means that the people who design predictive child-welfare algorithms, however well-intentioned, insert personal, historical, and systemic biases into equations that, by definition, perpetuate those biases. Thousands of unwitting families are affected in the process. Families like mine.

One hundred and sixty days ago, my three- and four-year-old daughters were removed from my custody, on a temporary but indeterminate basis, because authorities in Broward County, Florida, deemed them unsafe in my care. It began when I left the girls with their grandparents for three days while I was away in Miami. My mother-in-law phoned an abuse hotline and accused me of going to Miami to use drugs, an accusation that was given immediate weight because I had been treating a diagnosed substance use disorder with pharmacotherapy and counseling for the past five years. I was subsequently charged with placing my daughters at imminent risk of serious harm.

My accuser’s claim was later countered by a series of negative drug panels, including urine and hair analyses, the latter of which can detect substance use dating back 90 days. The investigator made no attempt to contact me before deciding to bring me before a judge, and when the judge ruled to place my children with their paternal grandparents, she did not acknowledge my negative drug tests. Instead, she stated that my “skill with language” indicated that I wasn’t trustworthy, and that I was “clearly drug-seeking” because I admitted to almost buying marijuana, which I’d occasionally used legally in the past.

It didn’t matter that no evidence ever arose to substantiate the allegations of drug use made against me, nor that, by the investigator’s own admission, my daughters showed no signs of abuse or neglect. What mattered was that somewhere, somehow, I was deemed high risk.

It turns out Florida is one of the first states in the country to implement predictive analytics as part of its child welfare system. Although my investigator would not reveal her methodology to me when I asked, I believe predictive analytics were used to determine my risk status and may have influenced the investigator’s recommendation to take my children away.


The cost of such biased and haphazard family separations can’t be ignored. Even short-term separations have the potential to significantly impact the children involved. This is especially true of children under three. Claudia Gold, pediatrician and author of “The Developmental Science of Early Childhood,” describes what young children experience when separated from their parents as “annihilation of the self.” She goes on to say that this can cause problems later in life, such as impulsivity, difficulty regulating emotions, low self-esteem, and attention deficit issues.

My own daughters have already shown signs of anger and depression since we’ve been separated. Earlier this month, my four-year-old bit her grandmother, after shouting that she was angry because she’d taken her from me. Last weekend, when I showed my three-year-old a photo of her that I use as my phone’s lock-screen image, she squirmed and refused to look at it.

To better understand which families are bearing the brunt of these traumas, I talked with Emily Putnam-Hornstein, an associate professor of social work at the University of Southern California who helped create a child welfare algorithm that’s being used in Allegheny County, Pennsylvania. She tells me the system isn’t designed to rate the severity of a maltreatment report — that’s still up to the human staff. Rather, it predicts the likelihood that a family will require future interventions from child welfare authorities, information that’s supposed to help call screeners decide which allegations to pass through for an investigation.

Putnam-Hornstein describes her work in hopeful terms, as a tool for positive social change. To hear her speak it is almost to believe it, but she also admits that more black and Hispanic families are being investigated than white families. Additionally, the 131 indicators that feed into the algorithm include records for enrollment in Medicaid and other federal assistance programs, as well as public health records regarding mental-health and substance-use treatments. Putnam-Hornstein stresses that engaging with these services is not an automatic recipe for a high score. But more information exists on those who use the services than on those who don’t. Families who don’t have enough information in the system are excluded from being scored.

In Florida, the Department of Children and Families contracted with private analytics giant SAS to do a research project to predict which families are most likely to cause the death of a child through maltreatment. According to the SAS website, they used “criminal histories, behavioral health data, [and] drug and alcohol treatment data,” all of which, due to privacy laws, had to be sourced from public health databases. As a result, the behavioral-health and drug-treatment statistics don’t include privately insured patients. In other words, it’s a little bit like Philip K. Dick’s fictional precogs, if one or two of the precogs had visions only about poor people.
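The selection effect described above — public data sources making poor families more visible to the algorithm — can be sketched with a toy simulation. Everything here is hypothetical and invented for illustration: the base rates, the observation probabilities, and the naive scoring rule are not SAS’s or any agency’s actual model. The point is only that when two groups have identical underlying risk but one is far more heavily recorded, the recorded data alone will make that group look riskier.

```python
import random

random.seed(0)

# Two hypothetical groups with IDENTICAL underlying risk, but families in
# the "public" group (e.g., those using Medicaid or public clinics) have
# far more of their events captured in government databases.
BASE_RISK = 0.05          # same true event rate for both groups (assumed)
OBSERVATION = {"public": 0.9, "private": 0.2}  # chance an event is recorded

def make_family(group):
    true_event = random.random() < BASE_RISK
    observed = random.random() < OBSERVATION[group]
    # Only observed events ever enter the dataset an algorithm would see.
    return {"group": group, "recorded_event": true_event and observed}

families = ([make_family("public") for _ in range(10_000)] +
            [make_family("private") for _ in range(10_000)])

def recorded_rate(group):
    """Naive 'risk score': the event rate visible in the recorded data."""
    members = [f for f in families if f["group"] == group]
    return sum(f["recorded_event"] for f in members) / len(members)

print("public: ", round(recorded_rate("public"), 4))
print("private:", round(recorded_rate("private"), 4))
# Although both groups were constructed with the same true risk, the
# "public" group's recorded rate comes out several times higher.
```

Any model trained or calibrated on these records would rank the more-surveilled group as higher risk, which is the disparity the precog analogy is pointing at.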

O’Neil thinks that algorithms like the one being used in Allegheny have the potential to help correct the discrimination taking place in child welfare investigations, but they have to be used differently for that to happen. She says that “sometimes when you start using data-driven methods, the flaws in the results highlight the flaws in the underlying system.” She suggests that if those trends are analyzed in the name of gathering knowledge rather than predicting behavior, they can be used to correct systemic bias. As these algorithms function now, they appear to simply perpetuate disparities.

“The Minority Report” gives us terrifying insight into what happens when a justice system believes too fervently in its internal knowledge base. Ironically, child welfare agencies have decided to ignore that admonition in favor of new forms of prediction that target poor families, families of color, parents with physical and mental health disabilities, and parents who have undergone treatment for an addiction.

For me, the cost of that decision has been 160 days — 160 days and counting since I’ve been alone with my daughters, or held them without someone watching. 160 days since I’ve put them to bed. Since I’ve comforted them after a nightmare. 160 days since I’ve given them a bath or made them breakfast. Since I’ve fought with my four-year-old over shoes.

I’m forgetting the words to “Frozen.” I’m forgetting the smell of their hair. Sometimes now they call me grandma instead of mom. Sometimes they call her mom. 160 days.

Elizabeth Brico is a freelance writer whose work has appeared in Tonic/VICE, Talk Poverty, Politico Magazine, Vox, and more. When she isn’t covering addiction and social justice, she can usually be found reading, writing, or watching speculative fiction.

UPDATE: This piece was updated to clarify that the work SAS performed for the Florida Department of Children and Families was a research project. That company’s software has not been operationalized in the state of Florida, according to an SAS representative.

