August 31, 2006

Fat Cats!

I'll admit it. It's sometimes hard to talk about global inequality. Not because the subject leaves me queasy or uncomfortable, but because it's difficult to present the relevant facts in a way that hasn't already been done a thousand times before. Declare that a billion people live on less than a dollar a day and eyes start glazing over. Something new is needed. Something novel. So here's something I stumbled across while clicking around on the faculty page of the economics department at Northeastern (don't ask).

In this paper, M. Shahid Alam estimates—and this is back of the envelope on his part, so quibble if you want—that Americans spent roughly $360 billion on their dogs and cats in 2003. That number includes about $30 billion in direct expenditures, as estimated by the American Pet Products Manufacturer's Association (a figure that has risen to nearly $40 billion this year), plus time spent taking care of those pets, which Alam values at $10 per hour—a good deal given the lavish attention many dogs and cats receive.

$360 billion, needless to say, is a staggering amount. Consider that those 143 million dogs and cats had, in 2003, a total income larger than the economy of Pakistan (which, that year, had about 160 million people and a GDP of roughly $306 billion [PPP]). Obviously pet economies and people economies aren't strictly comparable, but let's just go with it. Those well-fed and much-adored pets had a per capita consumption of about $2,532, while the 2.3 billion people who lived in the lowest-income countries had a per capita consumption of about $2,190. Roughly a third of the people on the planet, apparently, would be materially better off as a household pet in the United States. Hmmm...
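
For anyone who wants to check the arithmetic, here's a quick back-of-the-envelope sketch using only the rounded figures quoted above (the totals are Alam's estimates; the small gap between this per-pet number and his $2,532 is just rounding in the $360 billion figure):

```python
# Back-of-the-envelope version of Alam's pet-spending comparison, using the
# rounded figures quoted in the post. Illustrative only.

direct_spending = 30e9        # APPMA estimate of direct pet spending, 2003
imputed_care_time = 330e9     # caretaking time valued at $10/hour (Alam)
total_pet_spending = direct_spending + imputed_care_time   # ~$360 billion

num_pets = 143e6              # U.S. dogs and cats, per the post
poorest_per_capita = 2_190    # per capita consumption, lowest-income countries

per_pet = total_pet_spending / num_pets
hours_per_pet = imputed_care_time / 10 / num_pets   # implied hours of care

print(f"Per-pet 'consumption': ${per_pet:,.0f}")                  # ~$2,500
print(f"vs. ${poorest_per_capita:,} per person in the poorest countries")
print(f"Implied care time: {hours_per_pet:,.0f} hours per pet per year")
```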

UPDATE: No, this post wasn't meant in all seriousness. Yes, comparing the "incomes" of dogs and humans is problematic. But consider this! Private contributions towards development assistance in the United States amount to about $34 billion annually. That's less than the amount we spend directly on pets each year. Not that I'm against pets or anything. I just thought it was striking. That's all.
-- Brad Plumer 6:07 PM || ||

August 30, 2006

Why the Long Decline?

Rummaging around through Mark Thoma's archives, I noticed he had a nice post last week laying out various possible reasons for the decline of union membership in the United States since the 1970s. Explanations include the idea that workers are relatively well-off today, thereby lessening the demand for unions; the fact that globalization has destroyed labor's bargaining power; the massive downsizing in manufacturing over the past thirty years; staunch employer opposition to organizing; and of course, laws and government policies that are, by and large, hostile towards unionization.

Now some economists have tried to assign relative weights to each, and smart people can argue about this list all day. (Globalization, for instance, seems like a poor explanation given that the United States is actually less exposed to trade than smaller yet still highly-unionized countries such as the Netherlands.) But set that aside. Asking why unions have declined over the last thirty years is sort of the wrong question. All monopolies tend to decline over time. The growth of unions, rather than their stagnation, is the striking fact that, I think, really demands explanation.

It doesn't get much attention, but there are only a few select points in American history in which union density actually rocketed upward. Unions multiplied and prospered during both world wars because the government helped organize the "war industries" in exchange for no-strike pledges. And during both the Progressive and Depression periods, union density grew largely because upstart radical unions—the IWW during the 1910s, factions of the CIO during the 1930s—challenged the AFL's primacy. (In many cases, union density expanded because employers chose to negotiate with the collaborationist AFL—or let the AFL charter company unions—in order to head off attacks by the more radical unions.)

And that's pretty much it. After 1945, the CIO's postwar organizing strategy—focusing on public employment, retail, and southern industries—fell victim to the anti-union Taft-Hartley Act of 1947, the demise of Operation Dixie at the hands of southern racism, and labor infighting that led to a Cold War purge of left-leaning unions. Eventually the CIO merged with the AFL, and private sector union density has withered ever since, as the federation has acted as a monopoly more interested in gains for its own members—and understandably so—than in expanding union density. (Public sector unions, on the other hand, have had more success.)

Two proposed solutions for labor's decline have been put forward in the present day. In Congress, Rep. George Miller (D-CA) has introduced the Employee Free Choice Act, which would require employers to recognize a union once a majority of workers sign authorization cards (the "card check" process, making it easier for employees to organize), speed up the union certification process, and increase penalties for employers who violate labor law. Among labor leaders themselves, Andy Stern and the Change to Win breakaway faction of the AFL-CIO have pledged to spend more money on organizing rather than politics.

Both ideas sound promising, but in reality will probably have limited effect. Unions use different estimates, but a decent rule of thumb is that it costs about $2,000 to organize a new member. Adding one million new members would cost $2 billion—or about 30 percent of the AFL-CIO's total member dues prior to the split. One million new members is nothing to sneeze at, but it would also add merely a point to total union density in the United States—hardly enough to make a dent in labor's long decline. (I also agree with Robert Fitch that some of Stern's specific organizing tactics—such as making political contributions to convince governors to add home-care workers to the SEIU's rolls—have had only mixed results.)
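
Just to make the back-of-the-envelope math explicit, here's a quick sketch. The $2,000-per-member rule of thumb and the "30 percent of dues" share come from the paragraph above; the workforce figure is my own round-number assumption, not a statistic from the post:

```python
# Rough sketch of the organizing-cost arithmetic. Cost per member and the dues
# share are from the post; the workforce size is an assumed round number.

cost_per_member = 2_000
new_members = 1_000_000
organizing_cost = cost_per_member * new_members        # $2 billion

total_dues = 6.7e9          # back-solved from the post's "about 30 percent"
workforce = 130_000_000     # assumed U.S. wage and salary workers, mid-2000s

print(f"Cost of 1M new members: ${organizing_cost / 1e9:.0f} billion "
      f"(~{organizing_cost / total_dues:.0%} of dues)")
print(f"Added union density: ~{new_members / workforce:.1%}, "
      "a bit under one percentage point")
```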

At any rate, I'll probably write more on this later, because I need to get to work. Basically, I do think unions need to make a serious comeback in the United States—for many of the reasons outlined by Nathan Newman here. But history suggests that it will take more than modest improvements in both organizing strategy and congressional legislation to achieve the same dramatic upsurge in union density that took place during the Progressive era and the 1930s. What it will take is, I think, an extremely difficult question.
-- Brad Plumer 8:18 AM || ||

August 29, 2006

Single-Payer Feuds

The big news out of California today is that the state legislature is on the verge of passing Sen. Sheila Kuehl’s single-payer health care bill, which, as the San Francisco Chronicle puts it, "would eliminate private insurance plans in California and establish a statewide health insurance system that would provide coverage to all Californians." (I’m not sure that’s quite accurate—the lengthy description here suggests that patients could still purchase private insurance to cover whatever services single-payer doesn’t; so it's more of a French-style system than Canadian.)

Anyway, the bill hasn't garnered nearly enough support to override the inevitable veto from Schwarzenegger, so I haven't bothered studying the policy itself. (If it were ever signed into law, though, it would be a good opportunity to prove once and for all that government-run health insurance can actually work.) I will note something interesting, however: According to this page, the bill has the support of eight locals in the SEIU.

I find that interesting only because Andy Stern, president of the SEIU, has generally opposed single-payer. In a June 17 op-ed in the Wall Street Journal, Stern argued that the existing "employer-based system of health coverage is over." But in a June 16 speech at Brookings he also said that "we need to find a system that is not built on the back of government…. I don't think we need to import Canada or any other system." So where does he stand? Perhaps most crucially, Stern, along with other Change to Win officials, participated in Schwarzenegger's July 25 Health Care Summit with various corporate executives and the like, a meeting widely viewed as an attempt to undermine support for Kuehl's bill.

So… it's interesting, though not terribly surprising, that the California SEIU and Change to Win unions aren't following Stern's lead—they voted to support Kuehl's bill at the California Labor Federation convention in July, and a number of them are angry at Stern's positions. Now conflicts between labor leaders and the rank-and-file are hardly novel, but it seems like this divide—along with similar rifts, no doubt, in other unions—could become much more significant if, say, Democrats ever take over Congress and get serious about discussing health care reform.
-- Brad Plumer 11:27 AM || ||

August 27, 2006

The Ozone Near-Miss

Lest you doubt the left’s pieties are now a religion, try this experiment: go up to an environmental activist and say “Hey, how about that ozone hole closing up?”
Frankly, I haven't a clue what Mark Steyn's trying to say here. Activists should be insulted that they helped ban CFCs and close the ozone hole? Er, okay... But this does remind me of something interesting I recently read about the topic.

In 1974, two chemists, Mario Molina and Frank Sherwood Rowland, discovered that CFCs could potentially destroy the ozone layer and expose the whole planet to far more ultraviolet radiation. The media quickly picked up on the discovery, and annual growth in the production of CFCs dropped from 10 percent to zero, as the public started boycotting hairspray products with CFCs and the like, while the U.S. and European governments eventually got together to control the production of CFCs in refrigeration under the Montreal Protocol. The ozone layer started recovering, albeit slowly.

Anyway, that story's well known. What I didn't know, though, was that if Molina and Rowland hadn't made their discovery when they did, and CFC production had continued growing at its projected rate, the entire ozone layer would have been severely depleted by about 1984. A mere decade! It was quite the close call. It's also noteworthy that, at the time, the principal CFC manufacturer attacked Molina and Rowland's evidence, only to let up after developing an alternative to CFCs. Conservative and industry "skeptics" held out for longer, although they too were eventually proven wrong. You'd think environmentalists would have serious bragging rights after this episode, but apparently not.
-- Brad Plumer 8:03 PM || ||
Was Welfare Reform a Fraud?

While I was moving, the tenth anniversary of welfare reform came and went without too much fanfare. Many observers simply declared that welfare reform had "worked," because caseloads had declined. But what kind of metric is that? The whole point of reform was to push people off welfare. Of course caseloads were going to decline—that's what the law did. We should be asking if families are now better off because of it. I'm sure you can guess my answer but I'll go through the numbers anyway.

Nation-wide, welfare caseloads declined from 4.4 million families in 1996 to about 2 million families today, as states began kicking people off the rolls. So what happened to all of those families? The Urban Institute conducted a massive survey on this, and found that in 2002, of those who left welfare:
  • 57 percent were working (about 40 percent full-time)
  • 26 percent had returned to welfare
  • 14 percent had "no employment income, no working spouse, and no cash welfare or public disability benefits." (Presumably some of these families receive other benefits from America's stingy safety net, such as housing assistance, food stamps, or WIC grants.)

Welfare "reform" has obviously failed the last group, people who by and large are in poor physical and mental health and unable to work. That's about 300,000 families so far, and the number has grown since 2000. Many more families who still receive assistance will likely face a similar situation once they run up against the five-year time limit and get kicked off welfare. That's especially true if Republicans in Congress get their way and pass rules that would make it harder for states to evade these time limits.

Now consider the first group, those who went back to work. Their median wage was about $8 per hour in 2002, or about $16,000 per year. Many families do better, many do worse. For a single mother, though, that's not enough to raise children on—it's hardly a wonder that the poverty rate among single working mothers increased slightly during the "booming" economy in the late 1990s, as women were pushed off welfare to find jobs that paid little. And those jobs are often precarious; an illness or a broken car can easily mean getting fired, with only a shredded safety net to fall back on.

So has work made families better off? The MDRC has done some valuable case studies on the subject and found that, for instance, in Los Angeles "families were not substantially better off financially, even though many parents went to work." And California has one of the more generous state programs around. The flip side, too, is that now those parents—many of them single mothers—have less time to spend with their kids, and someone has to pay for child care.

And what about the children? Well, in Minnesota, which also has one of the more generous TANF programs in the country, reform "had no overall effect on the elementary school achievement of very young children." (Some disadvantaged children in the state saw gains, although note that Minnesota was one of the few states that spent more under its reformed TANF program than it did under the old AFDC system.) The Urban Institute study also noted that "children's outcomes were largely unchanged" nationwide. Just because welfare reform wasn't nearly as disastrous as critics predicted doesn't mean it's done a lot of good.

And what about poverty reduction? In 1999, after welfare reform plus a roaring economy plus new work-support programs like the EITC, the poverty rate was 11.8 percent, which was... the exact same rate as in 1979, under the "bad old welfare" system. (1979, note, was only two years before the Reagan administration started slashing AFDC benefits.) Moreover, the black poverty rate—which "reform" was supposed to help reduce—declined faster in the three years before 1996 than it did in the three years after. So what, exactly, did welfare reform accomplish, apart from pushing people who genuinely need assistance off the rolls and saving the government a few bucks?

Many liberals will say that welfare reform can and will be a stunning success story if only we increase the Earned-Income Tax Credit and raise the minimum wage and provide heaps more funds for child care and heaps more money for job-training and so on. Well, no kidding. Cup Noodle makes a great meal if it comes with a side of steak. If people are going to be forced to work, then the government should make work pay. If society isn't willing to do this, then cash assistance is the way to go.

Over at TNR, meanwhile, Jon Cohn argued this week that welfare reform was a positive thing because it got an unpopular monkey off the Democratic Party's back and gave them "more political room" to propose new anti-poverty programs. That could be, although it seems like a difficult counterfactual to prove. I'll note, however, that the flagship anti-poverty program being pushed by Democrats right now is a proposal to raise the minimum wage to $7.15 an hour by 2008. In real terms, this will be equivalent to about $5.60 in 1997 dollars, when the minimum wage was last raised to... $5.15. A radical shift? Really?
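
For the curious, here's roughly how that last calculation works. The 2.25 percent annual inflation rate below is my own illustrative assumption, not an official CPI projection; a different deflator shifts the 1997-dollar figure a bit:

```python
# Sketch of the real-value adjustment behind the minimum-wage comparison.
# The assumed inflation rate is illustrative, not an official CPI series.

proposed_wage_2008 = 7.15
old_minimum_1997 = 5.15
years = 2008 - 1997                    # 11 years
assumed_annual_inflation = 0.0225      # ~2.25% a year, an assumption

deflator = (1 + assumed_annual_inflation) ** years
in_1997_dollars = proposed_wage_2008 / deflator

print(f"$7.15 in 2008 is worth about ${in_1997_dollars:.2f} in 1997 dollars, "
      f"versus the ${old_minimum_1997:.2f} minimum set that year")
```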
-- Brad Plumer 1:09 PM || ||

August 26, 2006

Is Yucca Mountain Fair?

I really liked this post over at Alas, a Blog, on whether storing all of the nation's nuclear waste at Yucca Mountain constitutes an "acceptable risk" or not. Basically, there are two different ways of looking at this question. Scientists and geologists can get together, look at the details, and determine whether or not it's relatively safe to store nuclear waste at the site. For most people, it's hard to enter this debate without getting mired in a morass of technical questions.

(Personally, I've been convinced by a friend who has studied this issue in depth that there are still a lot of problems with the Yucca proposal. I'd just add the rather banal observation that planning for 10,000 years is a task that borders on the absurd. Less than 10,000 years ago, a bunch of people built pyramids with warnings about Egyptian curses should anyone trespass. Explorers since then have, of course, disregarded those warnings and entered the pyramids. Who's to say our warnings won't be shrugged off in a similar fashion a few millennia from now?)

But there's also a fairness question at issue with Yucca Mountain that gets less play. Why, after all, should Nevada of all places be the state to hold the nation's nuclear waste (especially since the state doesn't have any nuclear reactors)? In practice, the Department of Energy was ordered to consider Yucca—and Yucca only—for a storage facility back in 1987 because the congressional delegation from Nevada was fairly junior at the time. (Another proposal, if I recall, was to build pyramids out of nuclear waste in Texas, but that was squashed by Texas lawmakers.)

That procedural unfairness, among other things, is what concerns many of those opposed to the Yucca repository. So when supporters say that Yucca makes sense and cite this and that scientific study, they're talking past their opponents in many ways. But whether there's ever an acceptable way to decide the fairness question (someone, after all, has to bear the risks of nuclear waste storage) is unclear. The D.C. Circuit Court very recently ruled that Yucca was chosen by a fair process, but regardless of whether the opinion's correct, it certainly hasn't changed any minds in Nevada.

Anyway, I certainly don't know how to settle this, but I thought I'd mention the curious tale of how France is dealing with these questions. French people tend to trust nuclear power much more than Americans do. A Frontline report found that this might be because, to generalize a bit, French people trust their scientists and technocrats more than we do. France is also much more reliant on nuclear energy than the United States, so even though the French public is just as wary of the dangers of nuclear power as we are, they're more accepting of it.

But the question of storing nuclear waste has still been extremely contentious in France. People living in rural areas don't like the idea of their countryside becoming a dump for radioactive waste—partly because it conflicts with traditional ideas about desecrating the earth and the like. But French politicians have recently changed their rhetoric and announced that they plan to store the nuclear waste only temporarily—and watch over it—rather than bury it and forget about it, and the public has reportedly become more receptive to the idea. So making a concerted effort to address those fairness concerns can sway popular opinion to some extent.
-- Brad Plumer 6:55 PM || ||
The Paradox of Craigslist

So I've moved to D.C. and, after a week of nonstop and mostly frantic searching, finally found a decent place to live in a gentrifying neighborhood that seems nice enough. For that we can all thank Craigslist. And since it's too hot to do anything else today, I may as well write some posts. So let's start by talking about… Craigslist.

I've now used Craigslist to find housing—by which I mean roommates offering a room in a group house or apartment—in four different major cities (New York, Boston, San Francisco, D.C.) over the course of about five years. And it's becoming… progressively more difficult. More and more "sellers" are using Craigslist these days, which sounds good at first, because that means there are more options for the apartment-seeker. But there's also been exponential growth in the number of seekers. You would think this makes the overall market more efficient, but I'm not sure.

Five years ago, when I was looking for a place in Brooklyn, I could reply to a week's worth of ads and get replies from each person. The people offering housing were only receiving about fifteen or twenty replies per post, after all. But nowadays, you put up an ad offering a room in your house and get 150 replies, minimum. So you can't reply to everyone, and have to spend time selecting maybe 20 or so folks to meet in person. Odds are you'll stage an open house, where you can meet everyone in a single afternoon and try to see who would make a good roommate.

But for certain apartment-buyers, open houses can be excruciating. Everyone there is jostling to appear more outgoing and fun than the next guy, and the traits that make you successful at getting recognized and liked at an open house aren't necessarily congruent with the traits that make you a good roommate. (Bribes are also not uncommon at open houses—baked goods, especially.)

Now I didn't mind so much, but I imagine this can all deter many "buyers" from even bothering to check out certain places. And "sellers" can, conceivably, get bad matches from open houses (since they're meeting everyone very quickly and getting a potentially misleading impression). But the alternative to an open house is to schedule one-on-one interviews with multiple roommates—which can take way too much time.

The end result is that people get desperate—especially buyers, who, with so much competition, find that they aren't getting many replies to their inquiries and are getting rejected after interview after interview—and end up accepting the first place offered, especially if they need housing by a certain date. (This happened to me, although I'm very happy with my pick.) So people may well settle on suboptimal living situations.

Now I don't think this is very different from other types of markets (although I imagine the old newspaper-ad days had fewer people replying to each ad, and so avoided the problems mentioned above), and in truth, it all works pretty well, but Craigslist may have been better from the buyer's point of view five years ago, when fewer people knew about it. In other words, maybe Barry Schwartz is onto something.
-- Brad Plumer 6:28 PM || ||

August 17, 2006

Basketball and Racism

Michael Perelman puts forward evidence of racial discrimination in the NBA:
Even in sports, one of the few venues where society associates Blacks with excellence, Blacks still face discrimination. For example, in cities where the population is more White, professional basketball teams hire fewer Black players (Brown, Spiro, and Keenan 1991).

In an unpublished paper, Dan Rascher and Ha Hoang found that after adjusting for a number of factors, Black basketball players have a 36 percent higher chance of being cut than Whites of comparable ability. This statistic reinforces the widely held impression that while teams will want to employ the Black superstar for a better chance of winning, they will prefer a higher mix of White players on the bench to please their predominately white audience.
This might well be true. Here's a 2005 paper entitled "Are NBA Fans Becoming Indifferent to Race," published in the Journal of Sports Economics, that charts a few trends. During the 1990s, teams that played in whiter cities really did tend to have a greater number of white players. The correlation is higher for starters than it is for benchwarmers, as you'd expect—"star" players who are white, in fact, are especially likely to end up in cities with relatively higher white populations.

Meanwhile, white players are less likely to be traded away from teams with a larger population of whites in their market area (and black players more likely to be traded away). And the kicker is that, controlling for other factors, teams that better "matched" the racial composition of their market area boosted their home-game attendance revenue. That dovetails somewhat with this 2001 paper, which, judging from the abstract, finds that the size of television audiences for NBA games increases with increasing participation by white players.

Now granted, none of this is conclusive. I don't know much about the NBA, but it seems like there could be all sorts of perfectly benign reasons why the Utah Jazz, playing in a lily-white city, just happened to be 44 percent white during the 1990s (luck of the draft, existing contractual obligations, the way the free agent pool happens to shake out). But it also doesn't seem so unrealistic to think that a black fan base would rather see black players (and a white fan base white players), and that teams respond to this preference.

On the brighter side, racial discrimination in pay seems to have largely vanished during the 1990s, unlike the previous decade, when white players were paid more than blacks for the same level of performance. Here's a long-ish paper on the topic if anyone's interested.
-- Brad Plumer 4:43 PM || ||

August 16, 2006

Nationalize the Defense Industry?

Okay, time for a real post. I'm sitting in a café trying to look for housing on Craigslist, reading the news only cursorily, and haven't really been following the Lebanon-Israel ceasefire or the latest ginned-up terror threat closely enough to say anything insightful. But John Stanton had a piece in Counterpunch back in July arguing that we should nationalize the defense industry, and I thought I'd comment.

The case for nationalizing the defense industry appears straightforward. Private enterprise, I think, is only "better" than state control of the means of production insofar as it's more efficient at producing whatever goods need to be produced. But the U.S. defense industry is anything but efficient: no-bid contracts, cost overruns, and foundering projects are so commonplace that they barely warrant headlines anymore. And private industry hasn't exactly produced a top-notch military: as Stanton points out, the Pentagon has yet to enter the information age.

So I doubt that converting certain departments of Boeing and General Dynamics into "publicly-controlled, nonprofit [entities]" would make them any less effective at creating weapons and the like. But what would be the benefit of doing so? Stanton lists a lot of benefits—and most of them sound convincing—but this one seems a little too pat:
Converting the companies to publicly-controlled, nonprofit status would introduce a key change: it would reduce the entities' impetus for aggressive lobbying and campaign contributions. Chartering the defense contractors at the federal level would in effect allow Congress to ban such activities outright, thereby controlling an industry that is now a driving force rather than a servant of foreign policy objectives.
Now, having spent the past few months researching various aspects of the defense budget process for a project that will hopefully see the light of day soon, I'll fully agree: the means by which a defense bill becomes law in Congress are utterly grotesque. The United States spends way, way too much on defense, and much of that money goes to worthless projects even by hawkish standards. (That is, even if you do believe that the United States should continue running a global empire—I don't, but some do—there's no sound justification for, say, spending billions on new stealth fighters.) And contractors play a role in perpetuating this bloat. But, I tend to think, it's only a supporting role.

At a very basic level, members of Congress like spending money. They especially like spending money in their own districts, which is far more valuable to their re-election chances than corporate contributions. And the defense appropriations process gives them a perfect chance to spend that sort of money. Bureaucrats in the Pentagon, meanwhile, enjoy expanding their own budgets for parochial reasons. Often, mid-level military officials who push various pet projects on a compliant Congress are responsible for some of the worst bloat in the defense budget.

Corporations obviously aren't innocent bystanders here. The Boeing tanker-lease deal, in which Congress nearly wasted $30 billion on an entirely useless rental scheme, was conceived and pushed by the aircraft manufacturer itself (Air Force officials never asked for the lease until they were coaxed by members of Congress, who were in turn coaxed by Boeing). But that's something of an egregious case. Even if you could wave a magic wand and banish all contributions and lobbying by defense contractors, "foreign policy objectives" still wouldn't be a guiding force in the defense budget process. But other than that, I think Stanton's onto something.
-- Brad Plumer 9:38 PM || ||
Disappearing Act

I can't imagine my personal life is of interest to many readers, but a few people have wondered if I've disappeared for good, so here's a quick explanation. As of last week I'm no longer working as an editor at Mother Jones. So anyone trying to reach me should now send an email to brad -dot- plumer -at- gmail -dot- com.

The other news, and my excuse for not writing much, is that in a week I'll be moving to Washington D.C. to start work as a reporter-researcher for the apparently much-maligned Joe Lieberman Weekly. And in case anyone's curious, yes, I do still think Israel's war against Lebanon was not just strategically stupid but also morally abhorrent, that the U.S. invasion of Iraq was a criminal enterprise from start to (if it ever comes) finish, and that Lieberman's defeat was a grand thing. But check back in a month or so to see if I've "mysteriously" changed my mind about all of this. (No, seriously, I like TNR a great deal, and am happy to be going there, even if I don't agree with every position they take.)

Anyway, I don't know if the new job will mean I'll have to shutter up this blog for good, but at the very least it may mean a patchy spell for a bit. In the meantime, this New York Times piece on the Poincaré conjecture actually manages to make topology sound thrilling. And it is thrilling! But that's hard to convey sometimes...
-- Brad Plumer 1:01 PM || ||