Addictive behavior has become even more damaging to poor communities through its major role in spreading that modern plague, AIDS. HIV infection is not, of course, a behavior, but the high incidence of drug addiction and sexually transmitted diseases among the urban poor means that a disproportionate number of those infected with HIV are African American. Difficulties in accessing health care mean that a disproportionate number of them will develop full-blown AIDS. Although African Americans are only 12 percent of the population, they now account for nearly half (47 percent) of new AIDS diagnoses. In the United States in 2000, almost two-thirds of the women and almost two-thirds of the children reported with AIDS were African American. Among infected young people (ages thirteen to twenty-four), almost two-thirds are African American.22
While the Centers for Disease Control and Prevention does not break down its AIDS statistics by economic class, its reports make it all too clear that poverty, and not race, is the determining factor in this disproportionate rate of HIV infection among African Americans.23 Increasingly, AIDS is becoming a disease of the poor.
“WHY DON’T THEY JUST GET JOBS?”
The poor are frequently criticized for their lack of initiative and motivation, but after eighteen years in the inner city, I must disagree. Most inner-city residents are hardworking, scratching out a living for themselves and their families by stacking several part- or full-time jobs at minimal wages one on top of another. In fact, it always surprises me that so many of them are as motivated as they seem to be. Given all the strikes against the poor, any realistic look into the future is bound to seem grim indeed. In the end, though, high aspirations usually collide with the reality of limited vocational options. Like most people in our individualistic culture, the poor ultimately blame themselves for their lack of success, and can easily lose whatever self-confidence they have been able to muster. What little public assistance exists is often administered in ways that make it difficult to move back into the world of self-sufficiency, especially when self-sufficiency is defined as a series of exhausting jobs that don’t pay a living wage.
Middle- and upper-class perceptions that ghetto residents lack proper motivation have many sources, not the least of which is our belief that anybody can “make it” in America, which leads directly to the assumption that there must be something wrong with anyone who doesn’t. But as their dialect indicates,24 black inner-city residents are severely isolated from the rest of society and so, not surprisingly, can lack certain social and job-related skills necessary for life in the wider society. If one has seen relatively few people get up in the morning and go to work on a regular basis, if one has not lived in an environment where punctuality is important or necessary, if one has not learned “appropriate” deference toward superiors, if one has not even learned how to deliver excuses in a sincere and believable manner, then one will be misunderstood. Most of us could not say where we learned such skills, but we have learned to dress well for a job interview even if the place to which we are applying has few employees who dress well, even if the job we are applying for will not require us to dress well. We will make sure that we are absolutely on time at work each and every day during the first weeks or months on the job, a probationary period during which we know that even reasonable excuses for tardiness are likely to be dismissed. During that probationary period we know we should take few breaks and appear eager to work. If one has not learned such behavioral skills, one’s behavior may very well be misread as disrespectful, lazy, or slovenly.
The middle-class perception of many poor people is that “they don’t want to work.” In my experience, that is rarely the case, but cross-cultural miscommunication is easy.
VICTIMS TO BLAME
The poverty and hopelessness of life in the ghetto make it difficult for residents to develop self-esteem by conforming to the values and ideals of the larger society or to gain prestige in a socially acceptable fashion. When asked by pollsters, many ghetto residents continue to hold to the values of the wider culture, though they may find it impossible to live by them. They consider education important, but inferior schools and other obstacles to formal learning mean that fewer than half of them graduate from high school and only a fraction go on to vocational school or college. Marriage has been a goal, but all the issues discussed above lead to single-parent families. People have respect for the law, but for those on welfare bureaucratic rules make staying within the letter of the law virtually impossible, the lack of adequate jobs makes illicit work alluring, and the militarization of law enforcement, which turns the ghetto into a battleground, foments anger and resentment toward the law and its representatives.
Trying to live up to a set of values without real hope becomes painful. Gradually, within the ghetto, a parallel status system has developed, particularly among the young, in opposition to wider cultural norms. Perhaps oversimplifying, Massey and Denton nevertheless come close to the truth:
If whites speak Standard American English, succeed in school, work hard at routine jobs, marry, and support their children, then to be “black” requires one to speak Black English, do poorly in school, denigrate conventional employment, shun marriage, and raise children outside of marriage. To do otherwise would be to “act white.”25
The extent of this oppositional value system varies. Raising children outside of marriage, for instance, has certainly become the norm. Similarly, learning Standard English is not very high on anyone’s priority list. On the other hand, only certain subgroups publicly aspire to do poorly in school or avoid conventional employment. The growth of an oppositional value system in the ghetto has received an enormous amount of attention in the media, so the young man lounging on the street corner unwilling to work has become for the wider public the face of the ghetto. But it is also true that, as poverty continues to strangle generation after generation, this oppositional culture has become ever more established, and members of the ghetto who continue to hold the values of the wider society come under increasing pressure to change.
Are individual behaviors an important factor in inner-city poverty? Of course. But historically, the negative structural forces came first and have never gone away. Fifty years ago, before urban renewal and the interstate highway program, before the jobs moved away, before the upper and middle class moved out, before a multitude of societal forces struck with devastating effect, the African-American ghetto was a far different place. In attacking poverty we certainly must confront the realities of “ghetto-related behavior,” but we must not become confused about root causes. Mere survival within the “surround” indicates enormous strength and resilience. Observe carefully in any inner-city neighborhood, and you will see many strong, resourceful, independent people who are not only keeping their heads above water but doing their best to strengthen the community as well. The problem is that these people are swimming against an overwhelming current of forces that constantly threatens to overpower even the strongest.
Four
WELFARE IN MODERN AMERICA
There is an enduring myth that earlier in our history government stayed out of the business of welfare. When people were in trouble, according to the myth, extended families and neighbors helped out. When that wasn’t enough, charities stepped in to see a person through. Part of the problem today, we’ve come to believe, is that we depend on the state to do things that family, friends, and charity used to take care of.
In fact, the care of Americans in need has always involved some combination of state aid, private institutional support, and purely voluntary assistance. In both England and colonial America, local governments provided assistance to the destitute, and this practice continued well into the nineteenth century. To stave off riots or other civil disturbances, city governments often provided food or shelter during hard economic times. From the beginning, there was public concern about the cost of this relief and its effects on taxes. The publicly supported poorhouses of the nineteenth century were spectacularly unsuccessful attempts to reduce the financial burden on government. Local governments hoped to reduce welfare costs by bringing everyone who required support under one roof and creating economically self-sustaining communities. Far from being self-sustaining, however, the poorhouses cost the government considerably more than the previous meager welfare payments had. Even the federal government got involved in an early form of welfare after the Civil War, offering pensions to veterans of that war and their surviving families. These benefits were later extended to veterans of all wars. The first “widows’ pensions” were available to the wives of those veterans. By the time the program was discontinued, prior to World War I, its cost had risen to 18 percent of the total federal budget, far greater than any comparable program since.
In the late nineteenth century, some employers became part of the welfare state by offering pensions to employees. Later, as part of a “welfare capitalism” movement in the decades before the Great Depression, some larger companies experimented with pensions and other benefits as a way of maintaining company loyalty and retiring older, less productive employees. These benefits could never be counted on, however, because they depended upon the continuing economic health and largesse of an employer. It was only with the growth of unions during the early decades of the twentieth century that employers were brought firmly into the administration of the welfare state through pensions, disability insurance, and health coverage.
But to say that some mixed forms of public assistance have always existed is not to say that they were ever adequate. While some employers have continued to provide good pensions and health benefits, concern that the “undeserving poor” would take advantage of overly generous programs and fear that the cost of providing adequate welfare services would be exorbitant have generally kept public programs at levels that do little more than keep people from starvation and utter destitution.
WHAT IS “POVERTY”?
We talk glibly of poverty without defining our terms, but definitions are important. In this book what I mean by poverty is having an income below the federally determined poverty level. This is the official definition and the one most commonly used in the United States. It is important to be aware, however, that this official poverty level severely understates the actual number of people who live in what most Americans would intuitively consider poverty.
The “official poverty level” first seeped into government parlance in 1961, when Mollie Orshansky, a staff analyst at the Social Security Administration, needed an objective definition for statistical work she was doing. She reasoned that the financial inability to purchase an adequate diet would be generally considered poverty. In the 1950s, the United States Department of Agriculture (USDA) estimated that the average American family spent about a third of its income on food. Every year the USDA also estimated the cost of a minimally adequate diet. Orshansky, therefore, defined the poverty level as the cost of a minimally adequate diet multiplied by three. That definition stuck, and without real evaluation became the official government standard, which is revised annually, using updated USDA estimates of food costs. Although levels are calculated for various family sizes, when used by itself the term “poverty level” usually refers to the amount a family of four would need to stay out of poverty, which in 2001 was $17,650.
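To make the formula concrete: dividing the 2001 threshold of $17,650 by three implies a minimally adequate food budget of roughly $5,880 a year for a family of four, or about $113 a week.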
Unfortunately, Orshansky’s definition is too simplistic for the weight it has had to bear over the last forty years. There are numerous problems. First, the poverty level is held to be the same throughout the continental United States, although the cost of living varies enormously. Someone living on a farm in South Carolina needs less money to live than a person living in the inner city of New York.
Second, non-cash income like food stamps and housing subsidies was only minimally available in 1961 and is, by definition, excluded from the calculations. A family with an income just below the poverty line who receives food stamps and a housing voucher is clearly better off than another family with an income just over the poverty line who receives neither of these benefits, but the former is considered poor and the latter is not.
Third, taxes are not taken into account, so neither the expense of taxes nor the income from the Earned Income Tax Credit changes one’s “income” for purposes of the calculation.
But by far the biggest problem with the poverty level is that it is obsolete. The relative costs of different expenses have changed significantly in the past fifty years. Utility costs have risen faster than the cost of food, as have housing costs. A one-bedroom apartment in the Washington, D.C., area (at the government fair market rent of $716 a month) would consume 61 percent of the poverty-level income for a family of three. If food still takes 33 percent of their budget, that leaves only 6 percent, or $71 a month, for all other expenses, including childcare and health care. Technology (washers, dryers, kitchen appliances, television, computers) now eats up a larger portion of expenditures. Probably the biggest single issue, however, is childcare. Because most women with children stayed at home in the 1950s, the cost of childcare, now significant for young families, is still not included in the calculation.
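The arithmetic shows just how tight this budget is: rent of $716 a month comes to $8,592 a year, which at 61 percent of income implies a poverty threshold of roughly $14,100 for a family of three. After 61 percent for rent and 33 percent for food, the remaining 6 percent amounts to about $850 a year, or roughly $71 a month, to cover everything else.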