Child Welfare: A Brief History
by Linda Gordon, Ph.D., New York University, New York, NY
Children have been central to the development of welfare programs in the United States. Indeed, sympathy for poor and neglected children was crucial in breaking through the strong free-market individualism that has been mobilized repeatedly to condemn public aid to needy children. Recurrently, poverty has been diagnosed as a product of individual immoral character, even legitimated as a punishment for immorality, and thus removed from the arena of public policy. A distinction between the “deserving” and “undeserving” poor, and a fear that aid might encourage immorality, underlay the policies of both charity and local government relief programs. The notion of the “innocence” of children, as opposed to their caretakers, has worked to modify that harsh refusal.
Until the early 20th century, provisions for needy children were primarily local and private. In the colonial period, responsibility for children fell to the public primarily when they were orphaned. But many of these children were not orphans in our contemporary meaning: many had at least one living parent, almost always a mother, who was unable to provide for them. When young, these children were typically given to the care of relatives, with the local Overseers of the Poor sometimes contributing to their support. Older children were indentured to work for other families. This kind of apprenticeship bore no stigma in the 17th and 18th centuries, and children from prosperous families were often indentured in this way in order to learn a craft from adults who would be less indulgent, it was believed, than parents.
Orphanages developed as indenturing declined. By 1800 there were some seven orphanages in the US; by the 1830s, 23; by the 1850s, more than 70. Most were privately run by religious and charitable groups, but municipalities frequently contributed to costs, and older orphans were often expected to work in order to defray expenses. The Civil War greatly expanded demand for orphanages, and the supply grew to over 600 by 1880. Increasingly, orphanages were expected not only to provide minimum food and shelter but also to instill good character, by preaching piety, thrift, self-reliance, sobriety, and hard work. They practiced rigid scheduling and strict discipline, including ample physical punishment, as a means, they believed, of requiring the orphans to internalize these values. Children were marched in lines to meals and toileting, often dressed identically, their hair cut identically (or shaved in the eternal battle against lice), seated on wooden benches to eat spartan food from tin plates. Meanwhile, by the mid-19th century, urbanization made the needs of poor children more visible than ever.
In the 1870s it was estimated that 150 children a month were simply abandoned, and thousands more were malnourished, ill-clothed, left unattended in unheated tenements or simply fending for themselves on the streets. Simultaneously the filthy and abusive conditions of the orphanages were attracting reformers’ attention, and conditions were probably even worse for children who resided in almshouses along with their parents. So child-welfare experts began to condemn congregate care institutions and to recommend a return to placing children in families. In the mid-19th century orphanages began systems of “placing out” children with private families, the beginnings of what we would come to label foster care. Charles Loring Brace, founder of the New York Children’s Aid Society in the 1850s, is usually identified as the author of this reform, a man possessed by a messianic sense of his power to uplift the poor by remolding their children’s character. He may have initiated the “orphan trains,” by which eastern urban orphans were shipped out to western families. Typically, their placements were informal apprenticeships, and they were often exploited as workers by their new guardians. By 1875 Brace’s Emigration Department was exporting 4000 orphans a year. Orphan trains continued until 1930, when criticism of the lack of oversight to protect children finally stopped them.
Another private initiative for children was the child-protection movement of the 1870s. Deriving from Humane Societies formed to protect animals from abuse, Societies for the Prevention of Cruelty to Children arose throughout the US and Europe. Often motivated by women active in moral reform and temperance, they focused at first on egregious assaults on children and identified the source of the problem as hard-drinking, brutal, lower-class fathers. Soon, however, the new agencies were led by the evidence their investigations uncovered, and by the complaints of children and their mothers, to extend their dominion to take in child neglect and domestic violence against women. They then encountered the most irreducible problem of child protection: that very often the greatest damage to children comes from poverty, rather than neglectful or abusive caretakers; but child-welfare agencies have no power to combat poverty and the inequalities that create it.
In an example of the way that many private foundations and charities have functioned as arms of the state, the agents of these SPCCs were treated as experts and deferred to by the courts, although they had no accountability to the public. It was only in the 1960s that government agencies began accepting responsibility for child abuse and neglect. In 1974 the federal Child Abuse Prevention and Treatment Act provided funds and required certain professionals (doctors, social workers, teachers) to report suspected child abuse. Since then, mandatory reporting has been criticized because it floods agencies with many times more complaints than they are funded to handle, increasing the caseload for social workers and thus leaving some serious cases without intervention; and also because mandatory reporting has been shown to be biased, with single mothers, the poor, and people of color more likely to be reported even in cases where the evidence of abuse was equally great among more privileged families.
The first precedent for public monetary aid to needy children developed in the decade 1910-20 in the form of the “mothers’ pension” laws. Networks of organized women, influenced by the settlement houses and by the final campaigns of the woman-suffrage movement, became aware that the majority of children in orphanages and foster care were not orphans, but rather had been relinquished by poor lone mothers who could not both care for and earn for their children. (Organized day care for children was rare at that time.) Most mothers intended these institutional placements as temporary, but often lost track of their children or were unable to reclaim them. Outraged that mothers and children were being separated even when there were no allegations of neglect, women activists lobbied for state programs of aid to lone mothers. By 1920, forty states had established mothers’ pensions. But these programs aided only a tiny proportion of those in need, discriminated against immigrants and nonwhites, and never provided stipends adequate to support a mother and children decently. Blacks, although much more in need on average than whites, were only 3% of recipients, and in the west, Hispanics and American Indians were usually excluded altogether. At a time when 93,000 families received help nationwide, there were at least 1.5 million female-headed families with children. Nevertheless, these programs established the precedent for public aid to children.
Moreover, in 1912 these activists got the U.S. Children’s Bureau established in the new Department of Labor. A major achievement, this agency functioned both as a promoter of child welfare and as an agency fighting for women and for a welfare state in general. Its investigations of child and maternal health, plus the final victory of woman suffrage, produced the Sheppard-Towner Act of 1921, providing public health nursing to poor mothers and children in rural areas. Although it was repealed seven years later in response to a campaign by the American Medical Association, Sheppard-Towner reduced infant mortality considerably in those few years. A decade later the Children’s Bureau investigations of child labor contributed in a major way to the prohibition of child labor and the extension of compulsory education. The Great Depression of the 1930s bankrupted many states, and with them their mothers’ pension programs, just as the need for help for children soared. The result was the federalization of these programs in Title IV of the 1935 Social Security Act. That law is usually identified with its social insurance programs, old-age pensions and unemployment compensation, both of which were aimed at male heads of family and were not means-tested, therefore carrying no stigma.
The Children’s Bureau network succeeded in winning one program for mothers and children, in Title IV of the Social Security Act: Aid to Dependent Children (ADC). It was a skimpy and only partially federally funded program, in part because its authors envisioned it as only temporary, assuming that lone motherhood would decline as a new welfare state eliminated poverty. In contrast to the social-insurance titles, ADC was not only means-tested but also morals-tested, requiring invasive surveillance of recipient families; its benefits were below the lowest prevailing minimum wages; it was funded primarily by the states through property and sales taxes; and it was not an entitlement but a public charity, of which the recipient had to prove herself deserving. The federal government contributed one-third of ADC funds, the rest provided by the states, and the program was administered locally. One result was the same exclusion of nonwhites that the mothers’ pensions had practiced. Far from shrinking, ADC grew geometrically, particularly as a civil rights consciousness led nonwhites to demand citizenship rights. ADC remained the main program for poor children until its repeal in 1996. In its sixty years, as it grew in its number of recipients, its stipends declined.
With no automatic cost-of-living adjustment, such as that attached to the old-age pensions, ADC, or “welfare” as it came to be called, trapped its recipients in poverty and stigma. Federal funds for foster care were first provided in 1961. With the repeal of ADC, the number of needy children removed from their families has grown once again. As legal adoption became common in the 20th century, adoptive parents came to prefer infants; older children often moved through several foster families, and when they “aged out” as teenagers, were left literally without family. Another approach, pioneered by New York state in 1965 and supported by the federal Adoption Assistance and Child Welfare Act of 1980, was to subsidize adoptions. Subsidies challenged the assumption that permanent kinship required financial independence and acknowledged the high costs of raising children, especially those who needed ongoing medical and psychological help. Yet just as adoption subsidies encouraged stability for children, a 1977 decision in a class action suit maintained instability for foster children. In Smith v. OFFER, the U.S. Supreme Court decided that foster parents could not oppose children’s removal or expect a default preference for keeping their families intact, as birth families could, no matter how long-lasting and deep the ties between foster parents and children.
In the last decades of the 20th century, child-welfare policy veered between two goals that are often in conflict: child protection and family preservation. In the 1970s and 1980s, pressure from progressive social movements and discourse emphasized the rights of poor parents and recommended aid to intact families. In the 1990s, child safety and removal from caretakers dominated the discourse, and, not coincidentally, proved cheaper. ADC, repealed in 1996, was replaced with Temporary Assistance for Needy Families (TANF). TANF offers time-limited benefits and then forces single parents into the labor force; like ADC a joint federal-state program, TANF provisions vary by state. Many single mothers have been successfully pushed into jobs, but most of them are minimum-wage and insecure jobs, without benefits, and many of these families have been made poorer rather than better-off by the employment of the mothers. More recent legislation has made piecemeal adjustments. The Promoting Safe and Stable Families Amendments of 2001 attempted to speed adoptions, prohibited race-based segregation of adoptable children, and increased incentive payments. Foster Care Independence programs authorized services for teenagers aging out of foster homes.
But the United States remains a country with a poor child-welfare record. Child poverty increased by 12 percent between 1999 and 2004: 18 percent of children were living in families with incomes beneath the minimum poverty level established by federal statisticians. Because of the federal structure of the US, there is enormous variation in child welfare throughout the country. But nowhere is the record good: in Mississippi, 52 percent of children live beneath the income level required to meet basic needs; in New York, 41 percent; in California, 42 percent; in Wisconsin, 35 percent. Measured internationally, America’s child poverty rate is the highest in the developed world: five times higher than that in Norway, Sweden, and Finland, twice that of Spain, and 50 percent higher than that in Italy, Ireland, or the United Kingdom.
Nina Bernstein, The Lost Children of Wilder: The Epic Struggle to Change Foster Care.
Linda Gordon, Heroes of Their Own Lives: The Politics and History of Family Violence.
Linda Gordon, Pitied But Not Entitled: Single Mothers and the History of Welfare, 1890-1935.
Timothy A. Hacsi, Second Home: Orphan Asylums and Poor Families in America.
Dorothy Roberts, Shattered Bonds: The Color of Child Welfare.
How to Cite this Article: Gordon, L. (2011). Child welfare: A brief history. Social Welfare History Project. Retrieved from http://socialwelfare.library.vcu.edu/programs/child-welfare-overview/