Automating the Forced Removal of Children in Poverty

Quote 1:

Where the line is drawn between the routine conditions of poverty and child neglect is particularly vexing. Many struggles common among poor families are officially defined as child maltreatment, including not having enough food, having inadequate or unsafe housing, lacking medical care, or leaving a child alone while you work. Unhoused families face particularly difficult challenges holding on to their children, as the very condition of being homeless is judged neglectful.

Quote 2:

The AFST sees the use of public services as a risk to children. A quarter of the predictive variables in the AFST are direct measures of poverty: they track use of means-tested programs such as TANF, Supplemental Security Income, SNAP, and county medical assistance. Another quarter measure interaction with juvenile probation and CYF itself, systems that are disproportionately focused on poor and working-class communities, especially communities of color. The juvenile justice system struggles with many of the same racial and class inequities as the adult criminal justice system. A family’s interaction with CYF is highly dependent on social class: professional middle-class families have more privacy, interact with fewer mandated reporters, and enjoy more cultural approval of their parenting than poor or working-class families.

Quote 3:

We might call this poverty profiling. Like racial profiling, poverty profiling targets individuals for extra scrutiny based not on their behavior but rather on a personal characteristic: living in poverty. Because the model confuses parenting while poor with poor parenting, the AFST views parents who reach out to public programs as risks to their children.

Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor by Virginia Eubanks

First They Came for the Poor

…one day in early 2000, I sat talking to a young mother on welfare about her experiences with technology. When our conversation turned to EBT cards, Dorothy Allen said, “They’re great. Except [Social Services] uses them as a tracking device.” I must have looked shocked, because she explained that her caseworker routinely looked at her purchase records. Poor women are the test subjects for surveillance technology, Dorothy told me. Then she added, “You should pay attention to what happens to us. You’re next.”

Dorothy’s insight was prescient. The kind of invasive electronic scrutiny she described has become commonplace across the class spectrum today.

Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor by Virginia Eubanks

Technology Is Not Politically Neutral

The proposed laws were impossible to obey, patently unconstitutional, and unenforceable, but that’s not the point. This is performative politics. The legislation was not intended to work; it was intended to heap stigma on social programs and reinforce the cultural narrative that those who access public assistance are criminal, lazy, spendthrift addicts…Technologies of poverty management are not neutral. They are shaped by our nation’s fear of economic insecurity and hatred of the poor; they in turn shape the politics and experience of poverty.

Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor by Virginia Eubanks

A Feedback Loop of Injustice

Marginalized groups face higher levels of data collection when they access public benefits, walk through highly policed neighborhoods, enter the health-care system, or cross national borders. That data acts to reinforce their marginality when it is used to target them for suspicion and extra scrutiny. Those groups seen as undeserving are singled out for punitive public policy and more intense surveillance, and the cycle begins again. It is a kind of collective red-flagging, a feedback loop of injustice.

Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor by Virginia Eubanks