And we’re back.

So about a year ago, I went on blog hiatus. I was thinking of a master plan, and that takes time, obviously.

Over the last 12 months, I wrapped up “What Went Wrong: The Betrayal of the American Dream.” That beast was a multi-year project I was editing at the Investigative Reporting Workshop, tied to a book by two incredible journalists I can now happily count as friends and mentors, Don Barlett and Jim Steele. Their book came out in July, and promptly landed on the NYT bestseller list.

I also won a fellowship from the Alicia Patterson Foundation (yay!). I’m incredibly honored, and incredibly excited to have a year to focus on the civil court system, and the challenges of self-represented litigants. That’s something I’ve written about before, and I feel (am) so lucky to be able to dig into this without daily deadline and editing pressures. I’m focusing on one courthouse, the 36th District Court in Detroit, where I’ve been visiting on and off for about 5 years now. I’m grateful to the staff and judges at the court who have welcomed me, and to the many lawyers who have patiently explained things to me, and most of all to the people handling eviction, debt collection, foreclosure and other very real, very hard personal crises who still took the time to talk to a reporter in the hallway.

As usual, I’m going to promise to try to be better about posting here, particularly (but not only) when I write things. I’ll try.

An article about something besides economic doom and gloom

In addition to obsessing about the economy and the courts, I also like to make things. And I like to write about making things. Living in DC, though, my desire to make things is sometimes stymied by the fact that the District has just one fabric store, as I explain in the Washington City Paper.

Allison Lince-Bentley, of Bits of Thread. Photo by Darrow Montgomery for the Washington City Paper.

To get to the District’s only fabric store, you must walk through a metal gate, past two buzzers, and up a narrow set of carpeted stairs, into two crammed rooms above a McDonald’s. The store has a limited selection, with pieces of gently used fabric at prices that match their secondhand nature. Material hangs from posts made of plastic tubing, an odd assortment of textures and patterns. Plastic bags full of thread are sorted by color, and piled on plastic bins. Mismatched glass jars full of buttons sit on a windowsill, and a stack of vintage sewing magazines lies on a folding table near the door. It’s an unorthodox setup.

The Adams Morgan shop is a collaboration between the Bits of Thread sewing studio and Scrap D.C., a local nonprofit that rescues potentially useful arts and craft supplies headed to the dump. It’s an amazing example of creative reuse, but with most of the fabrics cut into lengths too short for a dress, it’s not the ideal resource for an aspiring seamstress. Yet in a city so full of makers and craftsters, this tiny room of off-cuts is pretty much all we’ve got.

The piece goes on to explore the glory days, when DC had something like three fabric stores, why they got pushed out, and why they can’t come back.

Now & then: Is the Great Recession so different from the Great Depression?

Published Dec. 24, 2011 with The Philadelphia Inquirer and New America Media.

Some call this moment the Great Recession. As the hardship has lingered, others have begun calling it the Little Depression. But equating the hard times of the 1930s with the hard times of today is mostly overblown rhetoric. Or is it?

On the surface, the comparisons are obvious: a period of great wealth and exuberance, followed by a stock market crash. After the crash, widespread economic pain. Millions of people out of work, thousands of homes lost. Families going hungry.

But much has changed. There is now Social Security, unemployment insurance, Medicare and Medicaid, none of which existed when the Depression hit. Breadlines and shantytowns, emblems of the Depression, are nowhere to be seen. Today, though, there is great hardship out of view. Behind closed doors, apartments and shelters are overcrowded, and cupboards are bare.

In interviews with dozens of people who lived through the Great Depression, both similarities and differences between the eras emerge.

“People have so much more now than people had during the Depression,” says Luanne Durst, who was born in 1931 in Rice Lake, Wis. But, she says, “I think it’s relative. I think it’s a different level of want, but it’s there nonetheless.”

“I think it’s a scary time for people,” Durst says. “I think it’s got to be pretty much like it was then.”

“I think it was much worse,” says Martha Rutherford, who was born in Portland, Ore., in 1935. “Well,” she clarifies, “overall, I think it was much worse. I think some people are in terrible straits now. But back then it was pretty much everybody, I think.”

There were certainly more people unemployed during the Depression than today, although it’s hard to make direct comparisons because the Bureau of Labor Statistics didn’t start tracking unemployment as we know it until the 1940s. What we do know is that unemployment rose from 3.2 percent in 1929 to a staggering 20.9 percent in 1933, according to a research paper from the Federal Reserve Bank of Dallas. The high-water mark of unemployment in this recession was 10.1 percent, in October of 2009.

During the Depression, many people made their own work, or tried.

“Guys would be coming around from different neighborhoods, knocking on doors,” says Saul Coplan, who was born in Philadelphia in 1932.  “’Can we mow your lawn? Would you like us to clean your windows? Is there anything we can do? Want us to shovel coal into the coal furnace?’  And it was sad.”

Rutherford’s house was near train tracks and she remembers hoboes knocking on the back door. “Maybe they didn’t look clean, but they were always humble. Never scary,” she says. “And they would ask for work, and, of course, my parents didn’t have any extra money to give them work. They were barely scraping by.”

Paul Rees, who was born in 1929, grew up in Weymouth, Mass., where his father ran a small mechanic shop and gas station. He credits his father’s entrepreneurship with their relative stability during the Depression and thinks that self-reliance is out of reach for many today.

“So many people don’t work for themselves and can’t because of the governmental regulation and the inability to cope with the taxation problems and sales taxes,” he says. “None of that affected people during the Depression.”

Both during the Depression and today, the unemployment numbers understate the problem. There are far more people not working, or working less than they want to be, than are counted among the unemployed. In November of 2011, some 7 million people had either given up looking for work, or were working part time but seeking full-time jobs — on top of the 13.3 million officially unemployed. By all accounts, the same was true during the Depression.

‘What’s wrong with me’ was and is common

For those who couldn’t find work, the sense of shame was profound.

“People had been encouraged to kind of look down on the poor, and think that they were responsible for their own troubles” during the boom years of the ’20s, says historian Robert McElvaine. “That had been sort of internalized by people who had been doing reasonably well.”

“And so when the bottom fell out,” he says, their reaction was to think “that there must be something wrong with me.”

That embarrassment often led to social isolation, McElvaine says. “Especially after a long time of being unemployed,” people would “cut off some of their social contacts, not want to see other people because they were kind of ashamed of their own situation.”

The same is true today, says David Eliott, communications director for US Action, a left-leaning advocacy organization. He compiled a recent report featuring quotes from dozens of unemployed workers, and asked some of those quoted if they would be willing to speak with reporters. “A lot of people were so stigmatized that they’re saying no, there’s no way I could talk to the media,” he says. “There was a sense of shame and embarrassment at being unemployed, even though they’re unemployed through no fault of their own.”

Then, as now, the pain of unemployment was not spread equally. Communities of color were hit particularly hard.

“More black people than white lost their jobs. More black people than white became underemployed,” says Cheryl Greenberg, a professor at Trinity College who has written two books about African-Americans and the Depression. In New York City, for example, the unemployment rate was 25 percent in 1932, Greenberg says. In Harlem, it was 50 percent. “In every city and at every stage of the Depression, whatever the white unemployment rate was,” she says, “the black unemployment rate was always higher.”

Today, the unemployment rate for blacks is 15.5 percent. For whites, it is 7.6 percent.

In part, the higher unemployment rate for blacks during the Depression was attributable to increased competition for even low-paying jobs. Jobs at the bottom, which “had always been bad, were at least black jobs. Because white people could get better jobs. Suddenly in the Depression, every job became open to white people,” Greenberg says. When jobs became available, many white business owners prioritized hiring white workers over blacks. For blacks, “their hold on even those terrible jobs declined,” she says.

Despite that, some felt that the black community suffered less than whites during the Depression because they were already used to making do with little.

“I think it is true that for black people, the Depression was less of a change” than for many whites, she says. But that’s because “they had been in Depression all along,” she adds. Many African-Americans had been struggling for decades, she notes, leaving them with few resources and savings to fall back on. “But it’s not like they had gone from riches to rags. They had gone from rags to fewer rags,” Greenberg says.

“The erosion of the middle class for blacks is different than for whites because there were some things blacks never had,” agrees Paul Ingram. Ingram, who is black, lived in Detroit during the Depression.

“Scraps. That was all that was left for our ancestors. And they learned to take their food, and transfer it into a decent meal. As a kid growing up, my mother would take neck bones and boil them in a pot, get potatoes and boil them. Get some celery and make a gravy and eat it with biscuits,” he says. “The nickname was puzzle bones because you had to reach all in and tear them apart to find the meat.”

For all Americans, the Depression was particularly bitter after a period of optimism and prosperity — a story familiar to many whose homes shot up in value during the early 2000s. But in the 1920s as in the 2000s, that wealth was not evenly distributed.

“The concentration of income at the very top peaked in 1928, ’29 and peaked again in 2007, 2008. It seems like it’s more than coincidence that both times that was followed by the collapse of the economy,” says McElvaine, a professor of history at Millsaps College and the author of two books on the Great Depression. “Whatever you think about such a concentration of income as a moral question — and I happen to think that it’s bad, morally — economically, it just doesn’t work in a consumption-based economy to have it so concentrated.”

Living off the land and helping neighbors

Unable to consume goods through the cash economy, many people provided for themselves during the Depression.

Far more people were farming then than now, which meant that more people were able to grow their own food. Even people who lived in towns often had farming relatives who could provide staples in a pinch. That produce was often essential to their survival.

“The farmers were in a good position because they could raise their own food. Including animals, not just the gardens. My mother and my aunts, everybody canned a lot before they had the frozen food,” says Margaret Deitrich, who lived in rural Colorado during the Depression.

“And there used to be a lot of farmers,” says Deitrich, who was born in 1934. “There are the big corporations, but the individual farmers aren’t near as many as there were when I was growing up. You don’t have a grandparent nearby with a farm.”

In 1930, there were 6.5 million farms, with 12.5 million working farmers, according to the U.S. Department of Agriculture. By 2000, there were just 2.1 million farms, with just under 3 million farmers. (2010 figures are not yet available.)

The population was also far more rural in the 1930s, with 44 percent of Americans then living in rural areas. As people moved off the farms and into cities, their ability to grow food for themselves diminished. Now, just 23 percent of Americans live in a rural area, according to the U.S. Census Bureau.

City dwellers did grow food where they could, forebears of today’s urban agriculture movement. Many new city dwellers were African-American former farmers who had come North as part of the Great Migration.

“They grew groceries in their backyard, if they had a backyard. They would grow beets and cabbages,” says Joseph Jackson, who is black and was born in 1924 in Detroit.

“People weren’t so concerned as much about how their grass looked as much as how their greens looked,” agrees Ingram.

What people had, they shared.

“My parents felt very blessed. They weren’t religious people, but they felt very blessed, for their economic security, although it was very small. They never turned anyone away hungry, never, ever. I remember that as a little kid,” says Rutherford.  “And there were a lot of hungry people.”

In 2010, nearly one in six American households was food insecure, according to the USDA, meaning that people were unable to afford enough food for everyone in their household.

People also shared housing. Then as now, doubling up was common. Rees and his family had rooms in the home of a widower and his sons; Rees’s mother took care of the house and children.

“That was not uncommon in the New England area,” Rees says. “A lot of families did that with relatives. In our case they were not relatives, but it was a way to survive until we could afford to get something on our own.”

During the current recession, the number of doubled-up households has risen by more than 18 percent, according to the U.S. Census Bureau. In 2011, 69.2 million adults were living in a combined household — more than 21 percent of all households nationwide.

The years of joblessness and underemployment, the crowded homes, the persistent hunger: All took an emotional toll.

“It was called a depression for a reason,” says Coplan. “People were depressed. It was hard to see people smile. Kids smiled. Kids were always having fun. We didn’t know how bad it was until much later.”

The federal government knew just how bad things were and launched aggressive spending programs to boost the economy. The Works Progress Administration and the Civilian Conservation Corps hired millions of workers directly, engaging in everything from photography to construction to forestry. The Depression also saw the development of a core social program now being contested: Social Security. The program was created by the 1935 Social Security Act, which also introduced unemployment insurance and aid to dependent children.

The era also ushered in the Federal Deposit Insurance Corporation, offering a government guarantee on bank deposits. The creation of the FDIC followed a period of massive turmoil in the banking sector, with a run on banks, a government-imposed “bank holiday,” and widespread bank failures.

The banking troubles hit store owners, like Saul Coplan’s father in Philadelphia. One day in late 1928, Coplan says, his father went to his furniture store and found it padlocked, along with the third-floor apartment he lived in. He went to the bank, his son says, and confronted the teller. The teller, a friend from high school, told him that “the board of directors thought you weren’t going to be able to make the payments, so they called the whole note,” Coplan recalls. His father, who had never missed a payment, lost his home, his store and its entire inventory. “He was frosted at banks from that point on,” Coplan says.

Even children were affected by the bank failures.

“They took my Christmas savings when the bank crashed,” says Jackson. “I had to be 5 or 6. I had a little savings account and I’d put a nickel or a dime in the account and I went to the bank and the bank was closed.”

“That bank never did open,” he adds.

Between the fall of 1929 and the end of 1933, some 9,000 banks suspended operations, according to the FDIC. To date, 412 banks have failed since the start of the 2007 recession, although today all deposits up to $250,000 are insured against loss.

Ultimately, though, it wasn’t banking reforms, unemployment payments or the WPA that ended the Depression. It was World War II.

The Depression’s legacies

In New England, the economic impact hit before war was formally declared. Roosevelt was “building up along the coast, in military establishments, in the Navy and the Army,” says Rees. “My dad worked in Camp Edwards, which was the new jumping-off place to go to Europe in World War II, and that was built before we went into the war.”

In Philadelphia, the boost came later. “Until the war started, there wasn’t much change. People were still out of work,” says Coplan.

Frank Luke, who was born in Honolulu in 1935, remembers the wartime boost to his grandfather’s restaurant.

“He had a little hole in the wall: a couple of tables, a counter, a kitchen in the back. But the location was opposite Fort DeRussy,” says Luke, who is Chinese-American. Soldiers kept business booming, he says. It was “sweet-and-sour spare ribs that sent me and my sisters to college.”

While it was war spending that lifted America into prosperity, both those who have studied the Depression and those who lived through it say the social programs mattered, and matter still.

For one thing, McElvaine says, social supports made it clear that joblessness “was a societal-wide, economy-wide problem and not just a personal failing.” More broadly, he says, the social programs of the Depression “make it easier for people to survive but also keep the economy from getting worse. Because they are the programs that put some buying power into the hands of people who are going to spend it and keep stimulating the economy.”

That’s a matter of controversy today. The notion of giving money directly to poor people, through jobs programs or benefits, is by no means universally supported. Unemployment insurance is being cut in many states and federal unemployment insurance extensions are slated to expire at the end of this year. Cuts to Social Security and Medicare, long politically untouchable, are now being considered.

Rutherford, who lived through the Depression, now depends on one of its key legacies.

“I can’t imagine what would happen if we didn’t have Social Security because we don’t have a lot of savings,” she says.

But the legacy of the Depression-era programs goes beyond individual survival. The spending laid the foundation for future growth, Greenberg says.

“Our whole infrastructure was rebuilt during the Depression, thanks to the New Deal. We had airports and highways and post offices and schools that were built because we took unemployed people and put them to work,” she says. Now, “what worries me is that because those programs don’t exist, not only are we not giving those people skills, or feeding them, we’re not even prepared for the recovery once it happens.”

With deficits dominating the discourse in Washington, a massive outpouring of federal cash is all but unimaginable today. President Obama proposed his American Jobs Act in early September, but neither the bill as a whole nor its component parts have gained traction in Congress.

“We need somebody like Roosevelt to say ‘I don’t care what you think, I’m going to push this through and I have the people behind me,’ ” says Coplan. But he thinks that’s unlikely. Now, he says, “money has become the ruler. And I fear for our country.”

Reporter Michael Lawson contributed to this story.

This story was produced with help from sources in the Public Insight Network from American Public Media.

The original version of this article incorrectly stated that Medicaid was part of the 1935 Social Security Act. The program was enacted as part of a 1965 addition to the legislation.


New data show fewer children, more seniors in poverty


Graphic by Melanie Taube, Investigative Reporting Workshop

Source: U.S. Census Bureau.

Published Nov. 7, 2011 at the Investigative Reporting Workshop

Numbers released today (pdf) by the Census Bureau paint a fresh and complex picture of poverty in America. For the first time, the figures count the impact of benefits like food stamps, tax credits and housing assistance. And for the first time, the data reflect not just income but spending, factoring in medical expenses and child-care costs.

Under the new measure, the number of children in poverty is lower than under the traditional poverty calculations. The number of seniors in poverty is higher.

At a time when many benefit programs are facing deep cuts, the data shows that the social safety net is having a big impact, particularly for children. And it shows that for seniors, the economic situation may be far more dire than previously understood.

“The main driving force behind this measure was to give policy makers a handle on the effectiveness of policies,” says Kathleen Short, a researcher at the Census Bureau and the author of the new report.

The new numbers reflect many “policies that are aimed at the people whose incomes are at the bottom end of the income distribution. The official measure simply did not include a lot of those programs.”

What the data show

According to the new data, more people overall are in poverty than under the official measure, both as a percentage of the population and in raw numbers. But among specific age groups, differences emerge. For children, the new measure lowers the poverty rate by more than 4 percentage points. Among the elderly, the poverty rate rises by almost 7 percentage points.

“In the past we’ve certainly seen that story that the elderly are not as poor as children. But it’s often because the benefits that are not included in the official measure are targeted at families with children,” Short says.

Two measures of poverty

Under the new measure, poverty among children fell, but poverty among adults and seniors rose. For children, food stamps and tax credits are major factors in lowering the poverty rate. Among seniors, out-of-pocket medical costs pushed many below the poverty line.

Graphic by Melanie Taube, Investigative Reporting Workshop

Source: U.S. Census Bureau 2010 data

Seniors tend to have incomes just above the official poverty line, Short says, while households with children are more likely to be below the official poverty measure. The supplemental measure counts benefits that lift children up, and counts expenses that drop seniors into poverty.

Because so many seniors are living on the precipice of poverty, “anything you subtract from their income is likely to bring them below the line,” she adds.

The main drain on seniors’ income is medical costs, and once those costs are subtracted, the poverty rate for seniors skyrockets. Without counting out-of-pocket medical costs, just 8.6 percent of seniors are in poverty. Once those costs are factored in, the rate rises to 15.9 percent. The new measure doesn’t do much to change how much money the elderly are taking in, since Social Security benefits are already counted under the official measure; what changes is that the demands on that income are more clearly reflected.

Highest rates loom in the West

Poverty rates were higher in the Northeast and West under the supplemental measure than under the official measure, and lower in the South and Midwest. The official measure calculates poverty in the South at 17 percent, and at 15.4 percent in the West. Official rates in the Northeast are 12.9 percent, and 14 percent in the Midwest.

Graphic by Melanie Taube, Investigative Reporting Workshop

Source: U.S. Census Bureau 2010 data

For households with children, the new measure generally lifts their income significantly, while only slightly raising their expenses. Low-income families can get thousands of dollars through the Earned Income Tax Credit, which is counted under the supplemental measure, but not under the official poverty calculations. Without the credit, 22.4 percent of children are living in poverty, according to the new data. With the credit, that number drops to 18.2 percent. Food stamps also have a big impact for children: Without the food assistance, the poverty rate for children rises by 3 percentage points.

There are racial and regional differences as well. The poverty rates for non-Hispanic whites, Asians and Hispanics are all higher under the supplemental measure than under the official poverty rates, while those for blacks are lower. The poverty rates in the Northeast and West rise under the supplemental measure and fall in the South and Midwest.

Comparing old and new poverty rates

The official poverty measure, first developed in the 1960s, simply takes an estimate of spending on food and multiplies it by three. The official measure takes into account only cash income, pre-tax, which means that any non-cash benefits, or any post-tax spending, gets left out. (See our earlier story on the history of the poverty line.)
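To make the mechanics concrete, here is a minimal sketch of the official measure’s logic. The dollar figures are invented for illustration; the real thresholds vary by family size and are set by the Census Bureau.

```python
# Illustrative sketch of the official poverty measure's logic:
# Orshansky's rule of a food budget multiplied by three, compared
# against pre-tax cash income only. Dollar figures are hypothetical.

def official_threshold(monthly_food_budget):
    """A year of food spending, times three."""
    return monthly_food_budget * 12 * 3

def is_poor_official(pretax_cash_income, monthly_food_budget):
    # Only pre-tax cash income counts: food stamps, tax credits,
    # medical bills and child-care costs never enter the equation.
    return pretax_cash_income < official_threshold(monthly_food_budget)

# A family spending $500 a month on food faces an $18,000 threshold:
print(is_poor_official(17_000, 500))  # → True
print(is_poor_official(19_000, 500))  # → False
```

Everything the supplemental measure accounts for, benefits on the income side and unavoidable expenses on the spending side, is invisible to this calculation.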

How benefits impact the count

The inclusion of the Earned Income Tax Credit and food stamps (or Supplemental Nutrition Assistance Program benefits) significantly lowers the poverty rate under the supplemental measure. The tax credit drops the poverty rate by 2 percentage points, and food stamps by 1.7 points. On the expenses side, the inclusion of out-of-pocket medical spending increases the overall poverty rate by more than 3 percentage points.

Graphic by Melanie Taube, Investigative Reporting Workshop

Source: U.S. Census Bureau

The new measure is far more nuanced. The supplemental measure includes a wide range of in-kind government benefits that can functionally raise household income, including the Earned Income Tax Credit, heating and housing assistance, WIC benefits for women and young children and food stamps. It also takes into account factors that can drop household income, including payroll taxes, child care and commuting expenses, and perhaps most significantly, out-of-pocket medical expenses.
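As a rough sketch of the resource side of that calculation (the field names and dollar amounts here are hypothetical, not the Census Bureau’s actual categories or figures):

```python
# Hypothetical sketch of how the supplemental measure tallies a
# family's resources: in-kind benefits count toward income, while
# taxes, work expenses and medical bills are subtracted back out.

def supplemental_resources(family):
    additions = (family["cash_income"]
                 + family["food_stamps"]
                 + family["tax_credits"]            # e.g. the EITC
                 + family["housing_heating_aid"])
    subtractions = (family["payroll_taxes"]
                    + family["child_care_commuting"]
                    + family["out_of_pocket_medical"])
    return additions - subtractions

family = {
    "cash_income": 20_000, "food_stamps": 3_000, "tax_credits": 4_000,
    "housing_heating_aid": 0, "payroll_taxes": 1_500,
    "child_care_commuting": 2_000, "out_of_pocket_medical": 500,
}
print(supplemental_resources(family))  # → 23000
```

Under the official measure, only this family’s $20,000 in cash income would be visible; the supplemental measure sees both the $7,000 in benefits and the $4,000 in unavoidable expenses.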

The alternative measure also has a new, broader definition of a household unit, counting essentially all people who share an address. The official measure counts only “people who are related by either birth, marriage or adoption,” Short says. The new definition accounts for households where families are doubled-up — a group whose numbers rose significantly since the recession began, according to earlier Census data.

It also means people who are unmarried but living together, for example, now count as one household unit. “We are now taking account of the fact that people who identify themselves as unmarried partners are likely to be sharing resources,” Short says.

The expanded household unit may explain why fewer African-Americans are below the poverty line under the supplemental measure, since “they are more likely to be in the new units,” Short notes, and the larger households may include people who are bringing in income.

While the supplemental measure is currently for research purposes only, the data is certain to be dragged into fights around cuts to public benefits and social programs. The calculation itself is also likely to be scrutinized, says Ron Haskins, a senior fellow at the Brookings Institution.

“We’re not at the end of the trail for sure. There are going to be objections to the measure,” Haskins says.

But, he adds, “I would say today is a good day.”

Poverty numbers get an update

Published Nov. 3, 2011, at the Investigative Reporting Workshop

When the Census Bureau released new poverty figures in September, the stats made headlines. More Americans in poverty than ever before. A poverty rate not seen since 1993. One in five children living in poverty, one in 10 children in deep poverty.

But the real picture of American poverty is more complicated than the recent census numbers suggest. Those figures, and all the ones before them, were based on calculations made in 1963 by Mollie Orshansky, a researcher at the Social Security Administration. She took USDA estimates of how much a family needed to spend on food for a month, multiplied that by three, and voila, the poverty line. The numbers have been a bone of contention ever since.

Now that decades-old data is getting some competition.

On Monday, the Census Bureau is putting out a second set of numbers, the Supplemental Poverty Measure. These numbers will take into account how much a family spends in four categories: food, shelter, clothing and utilities. That’s not based on the food budget times three, but on actual consumer spending data.

The supplemental measure also factors in other basic needs, adding 20 percent onto the total cost of those essentials — a calculation known as “plus a little more” in Census-speak. The “plus a little more” is for “things like shampoo, personal needs for yourself, maybe some transportation to the grocery store or books you need for school,” says Kathleen Short, a research economist in the social, economic and housing statistics division of the Census Bureau.
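The threshold side of the calculation can be sketched the same way (the figures below are illustrative, not actual consumer spending data):

```python
# Hypothetical sketch of the supplemental threshold: spending on
# food, clothing, shelter and utilities, "plus a little more."

def supplemental_threshold(food, clothing, shelter, utilities):
    basics = food + clothing + shelter + utilities
    return basics * 1.20  # the extra 20% covers other personal needs

# $20,000 a year on the four basics implies a $24,000 threshold:
print(supplemental_threshold(6_000, 1_000, 10_000, 3_000))  # → 24000.0
```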

There’s another huge change: many government benefits will be included in family resources for the first time. Right now, unemployment benefits, social security payments and other cash transfers are counted as part of income. But the current calculations leave out things like food stamps, the Earned Income Tax Credit, or other forms of non-cash, pre-tax aid. The new numbers also subtract work-related expenses like child care and commuting costs, as well as out-of-pocket medical expenses. That last one is huge, particularly for seniors.

In other words, the supplemental poverty measure should give a much more realistic picture of what resources families do and don’t have to meet their basic needs.

Orshansky, who died in 2006, may well have been pleased with the new companion to her famous figures. She knew that the poverty line wasn’t perfect from the start. For one thing, she based her calculations on the USDA’s “economy plan,” which was designed “for temporary or emergency use when funds are low,” not for planning everyday family meals, as she wrote in a landmark 1965 article (pdf). That’s why she also calculated a second, higher line, based on the “somewhat less stringent low-income plan,” writes Gordon Fisher, a program analyst in the Department of Health and Human Services who has published numerous papers about Orshansky. But the Office of Economic Opportunity went with the lower, economy plan, Fisher writes, and that was what stuck. He speculates that the OEO preferred the lower line because it “yielded approximately the same number of persons in poverty” as the definitions already used by the President’s Council of Economic Advisers. The more generous measure would have meant more poor people.

The poverty line was supposed to measure income levels that allowed people to eat “the minimal diet that could be expected to provide adequate nutrition” and still have enough to cover “all other living essentials,” Orshansky wrote. To that end, she proposed adding a whopping 15 cents per person per day to the poverty line, “to allow for the husband in a family to buy coffee at work or for children to buy snacks,” Fisher writes (pdf). Her supervisor vetoed the idea, according to Fisher. Living in poverty, then and now, really means the barest of bones.

Orshansky’s definitions will still be used; the supplemental poverty figures are, essentially, for information purposes only. But while her calculations are still the federal standard, her warnings about their use, and their shortcomings, have fallen into obscurity.

“Poverty has many facets, not all reducible to money,” she wrote 46 years ago. “The poor have been counted many times,” she continued. “It remains now to count the ways by which to help them gain a new identity.”


Working, but still poor

Published Sept. 13, 2011 at the Investigative Reporting Workshop

From the president to Congress to nearly every neighborhood in America, the focus today is on job creation. But for millions of Americans, just having a job doesn’t mean prosperity or anything like it.

Nearly one in six Americans lived in poverty in 2010,  according to data released today by the Census Bureau. (See our highlights of the Census data.) That’s 46.2 million people, the highest number ever recorded in the 52 years that poverty estimates have been calculated.

The rise in poverty may be attributable in part to the nation’s persistently high unemployment rate.  “We have an increase in the number of people who did not work at all last year,” said Trudi Renwick, chief of the poverty statistics branch of the Census Bureau. “That might be the single most important factor” behind the higher poverty rate, she said.

But working people are also struggling, as data from the Bureau of Labor Statistics reveals.

More are working, and more are ‘working poor’

The number of people in the workforce for 27 weeks or more grew 30 percent from 1987 to 2009, but the number of those workers living in poverty grew 65.3 percent.

Source: Bureau of Labor Statistics, Annual Social and Economic Supplement, Current Population Survey

Graphics by Melanie Taube, Investigative Reporting Workshop.

“When you have this type of labor market weakness, you’ve got a reinforcing effect on the working poor,” said James Borbely, an economist with the Bureau of Labor Statistics. “Even people just above that would be affected. The middle class certainly hasn’t been without its hardships,” he said. “People are just getting by. And I’d say the working poor are barely getting by.”

The Investigative Reporting Workshop examined data from the Bureau of Labor Statistics to look at trends among the working poor, going back to 1987. As shown in the accompanying graphics, the number of people in the workforce since then grew by almost 30 percent, but those working and living in poverty grew by more than 65 percent.

Rate of working poor climbs

The poverty rate has grown, too, from 5.5 percent of the people in the labor force in 1987 to 7 percent in 2009.

Source: Bureau of Labor Statistics, Annual Social and Economic Supplement, Current Population Survey

The “working poor” are those who spent at least 27 weeks in the workforce, working or looking for work, but whose incomes still fell below the official poverty level. And those incomes are low: the 2009 poverty threshold was $10,956 for a single person, and just $21,954 for a family of four.

There were 10.4 million people among the working poor in 2009, according to a Bureau of Labor Statistics report, “A Profile of the Working Poor,” released earlier this year (see report). That’s 1.5 million more than in 2008. How can so many people with jobs still be so destitute? Government reports cite low earnings, unemployment and involuntary part-time work as three of the labor market’s biggest problems. In 2009, 7 percent of the working poor experienced all three.
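The arithmetic behind these figures is easy to check. Here’s a back-of-the-envelope sketch using only the numbers cited in this article; the implied labor-force base is an inference from the 7 percent rate, not a published BLS figure:

```python
# Back-of-the-envelope checks on the BLS working-poor figures cited above.
# All inputs come from the article; nothing here is new data.

working_poor_2009 = 10.4e6  # in the labor force 27+ weeks, below the poverty line
working_poor_2008 = working_poor_2009 - 1.5e6  # "1.5 million more than in 2008"

# A 7 percent working-poor rate implies a labor-force base (27+ weeks) of:
labor_force_2009 = working_poor_2009 / 0.07  # roughly 148.6 million

# Year-over-year growth in the number of working poor:
growth_2008_to_2009 = (working_poor_2009 - working_poor_2008) / working_poor_2008

print(f"Implied labor force (27+ weeks): {labor_force_2009 / 1e6:.1f} million")
print(f"Working poor grew {growth_2008_to_2009:.1%} from 2008 to 2009")
```

The one-year jump works out to roughly 17 percent, a reminder of how sharply the recession pushed workers below the line.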

Many workers work year-round, but in part-time jobs. And involuntary part-time work is a major contributor to poverty. The number of people working part time who want to work full time has nearly tripled in the last 10 years, according to the Bureau of Labor Statistics. In 2001, 3.3 million Americans worked part time but sought full-time jobs. The figure trended upward in 2006 and climbed to 4.6 million in December of 2007.

In the last few years, the number has risen again, hitting a high of 9.5 million in September 2010. It stands now at 8.8 million. “Historically speaking that’s extremely high,” Borbely said.

Twice as many people are now involuntarily part-time as they were before the start of the recession. “That’s something we have to consider a contributing factor to that working poor number,” he said.

Who are the working poor?

Minorities are hardest hit

Source:  Bureau of Labor Statistics, Annual Social and Economic Supplement, Current Population Survey

And women are more likely to be among the working poor …

Source:  Bureau of Labor Statistics, Annual Social and Economic Supplement, Current Population Survey

… especially if they are heads of households with children

Source:  Bureau of Labor Statistics, Annual Social and Economic Supplement, Current Population Survey

What job a worker has is, not surprisingly, a major factor in determining whether or not his or her income falls below the poverty line. Workers in the service sector are more likely to be poor than any other category of employed person. More than 13 percent of service workers were classified as working poor in 2009, and “service occupations, with 3.2 million working poor, accounted for nearly one-third” of all the working poor in the country.

“The more low-paying jobs are concentrated in the service sector,” Borbely said. “Those jobs tend to be available and accessible to people with lower education, and it tends to skew toward lower education in the working poor. The labor force is always growing, but is the number of high-paying jobs keeping pace? It hasn’t been anywhere near where it needs to be to keep pace.”

Workers in natural resources, construction, and maintenance occupations also struggled to stay above water, with 9.7 percent of such workers living below the poverty line.

Most of the few jobs created since the start of the recession are low-wage jobs, according to research (pdf) by the National Employment Law Project, a workers’ advocacy group. Most of the jobs lost since the recession began were middle-wage jobs, the group found.

Demographically, black and Hispanic workers were “about twice as likely as white or Asian workers to be poor,” the BLS report notes. Women workers were more likely to be poor than men (7.5 percent vs. 6.6 percent). Young workers were far more likely than older workers to be poor, “in part because their earnings are lower and their unemployment rate is higher,” according to the report.

Among the groups most likely to be working poor are single women with children. More than a quarter of families headed by a female worker lived below the poverty line in 2009.

The Bureau of Labor Statistics report also shows that in 2009:

• Those with a college education were much less likely to be among the working poor. Among college graduates, 2.1 percent were classified as working poor, but for those with less than a high school diploma, the rate was 20.3 percent. Black women workers with less than a high school diploma were particularly likely to be poor, with 31.8 percent falling below the poverty line. The rate for black male workers with less than a high school diploma was 22.5 percent.

• Families with children younger than 18 were four times more likely than those without children to be poor, even if one family member was working 27 weeks or more. And women who head those households were far more likely to be among the working poor.

• Full-time workers were less likely to be among the working poor than part-timers.

Still, full-time work is no guarantee. Of the people who worked at least 27 weeks in full-time wage and salary jobs, 3.8 percent, or 4.2 million, were classified as working poor, little changed from 2008, when the rate was 3.6 percent.

Although the recession officially ended in June 2009, wages have not picked up. “We’ve puttered along at an extremely modest pace,” Borbely said.

The slow recovery sets the Great Recession apart. The economy fell sharply in the 1980s, but that downturn was “followed by an equally sharp recovery,” he said, unlike recent years. Downturns in the mid-’90s affected employment, but again, recovery was relatively quick. And after the 2001 recession, the recovery was propped up by construction and a housing market that stayed strong.

But even during the growth years, the gains were not shared evenly. Poverty rose in most years between 2001 and 2009, even as the economy grew. Median household income has fallen more than 7 percent since its peak in 1999, according to the Center on Budget and Policy Priorities.

“Even before the recession began, a growing number of Americans were being left behind,” said Robert Greenstein, president of the center.