(Bloomberg) -- Liberals are outraged by moves at both the federal and state level to require people who receive public assistance — food stamps, housing aid, Medicaid and other programs — to work. But those who think that so-called workfare is a dangerous development peculiar to the present must reckon with the fact that such arrangements have long been the norm, not the exception, in the U.S.
Contemporary American attitudes toward welfare and work date back to medieval and early modern England, when Parliament first passed “Poor Laws” that punished beggars and sought to put them to work. By the 17th century, workhouses — where the poor would toil in exchange for basic subsistence — became increasingly common.
The rise of the workhouse reflected fears that poor relief could create a cycle of dependency and indolence. Forcing the poor to work would, in theory, instill habits of industry and self-discipline that would guarantee an escape from poverty. Unfortunately, no such thing happened: The numbers of the poor only multiplied amid the wrenching economic adjustments of the 17th and 18th centuries.
Nonetheless, many of the approaches to poverty that originated in Britain took root in the American colonies, persisting even after the United States became independent. In the 1820s, for example, several states began to study the problem of what was quaintly known as “pauperization” — the growing numbers of poor people who could no longer support themselves. Like their British counterparts, American officials became obsessed with the fear that the needy might become dependent on public assistance unless there was some goad to become self-sufficient.
In Massachusetts, state senator Josiah Quincy III wrote a famous report in 1821 that drew a sharp distinction between the “impotent poor” (people genuinely incapable of working) and the “able poor” (individuals who could work). Quincy argued that no-strings-attached benefits would end up “destroying the economical habits and eradicating the providence of the laboring class of society.”
Quincy argued for the creation of public institutions — workhouses or “houses of industry” — where “work is provided for every degree of ability in the pauper, and thus the able poor made to provide, partially at least, for their own support…”
Other states came to the same conclusion. For the remainder of the century, most states maintained workhouses. Their track record was dismal: Poorly run and often corrupt, they did little to propel people out of poverty. Nonetheless, this kind of “aid” remained the norm well into the 1920s.
The Great Depression challenged the conventional wisdom on relief, and for a brief period, it looked as though a new kind of welfare state might emerge. Unlike earlier kinds of workfare, New Deal employment programs — the Public Works Administration, the Works Progress Administration and the Civilian Conservation Corps — didn’t have the stigma associated with the older workhouse; they simply aimed to put people back to work. The New Deal also introduced programs that gave direct cash payments to families in which fathers were “deceased, absent or unable to work” (Aid to Dependent Children, or ADC).
But this new approach never fully escaped the workfare tradition. As historian Eva Bertram has observed, Southern Democrats starting in the 1930s worked to modify this legislation in ways that would leave the recipients subject to workfare requirements. For example, legislators eliminated a provision that would have explicitly exempted ADC recipients from work requirements.
Moreover, states retained considerable authority in implementing these kinds of benefits. In the South, local authorities usually rejected applications for ADC benefits, especially if the applicant was black. One federal official reported in 1939 that communities in the region “see no reason why the employable Negro mother should not continue her usually sketchy seasonal labor or indefinite domestic service rather than receive a public assistance grant.”
In the 1950s and 1960s, attempts to expand welfare benefits provoked an intense backlash — not among Republicans, but among Southern Democrats and a handful of allies. In 1967, these politicians successfully imposed workfare requirements when the Aid to Families with Dependent Children program (the successor to ADC) came up for reauthorization, adding a provision that made it mandatory for states to impose work requirements.
Throughout the 1970s and 1980s, workfare provisions crept into a host of federal welfare programs. In 1971, for example, Congress passed what became known as the Talmadge work incentive amendments, which required AFDC mothers with school-age children to register for work and training programs. In 1988, Congress passed a bill that required most mothers receiving AFDC to work or enroll in training; Democratic Senator Daniel Patrick Moynihan, a sponsor of the bill, promised that it would “turn the welfare program upside down.”
These efforts would eventually culminate in the Personal Responsibility and Work Opportunity Reconciliation Act of 1996, which replaced AFDC with the aptly named Temporary Assistance for Needy Families program. President Bill Clinton hailed the law as fulfilling his pledge to “end welfare as we know it.” In reality, though, the bipartisan welfare reform was simply reviving a centuries-old tradition of workfare.
The latest efforts by the Donald Trump administration, Congress and some states may well inaugurate a further expansion of workfare. This may be bad policy, creating more problems than it solves, as liberals claim. But in gearing up for this battle, they need to know that these proposed changes are not a break from the past. They are part of a deep and powerful historical tradition that has, with rare exception, tied welfare to work.
©2018 Bloomberg L.P.