November 30, 2005

Broken Windows

Matthew Yglesias' post on "broken windows" policing reminded me to link to a great Legal Affairs debate on the subject, between Bernard Harcourt and David Thacher. Both seem to agree that the social science research is quite inconclusive as to whether "broken windows" policing—cracking down on public disorder, graffiti, panhandling, and the like—actually helps reduce the crime rate. Intuition suggests that it should, but the research doesn't bear that out.

Anyway, Harcourt says "broken windows" policing is a waste of time, and diverts resources away from more serious police functions. Thacher says that even if it doesn't reduce crime, order maintenance is still worth doing in itself—because disorder is gross, and can destroy public spaces (which, in turn, affects the poor the most). That sounds reasonable, but there are two things to be wary of here. In practice, "broken windows" policing usually relies on a major increase in misdemeanor arrests, even when that's not the intent. And inevitably, the people arrested are disproportionately minorities—a disparity that can have very poisonous effects. Plus, as Harcourt points out, perceptions of disorder are often shaped by race: a recent study, "Seeing Disorder," found that people perceive more disorder in neighborhoods with more black and Latino residents, and in places with higher levels of poverty, even when there isn't, in fact, more disorder.

But even so, Thacher still makes the case that order maintenance in public spaces is necessary and right, and "soft" order maintenance measures—social services, building proper public facilities, etc.—won't always do the trick. He also points out that even if law enforcement is a racially unjust system, it still might be better than other options:
No one relishes the use of coercion to deal with problems like disorder, whether the coercion is exercised by police or by anyone else. Institutionalizing the mentally ill and many of the other alternatives to order maintenance policing that you describe in your second paragraph obviously raise their own civil liberties concerns, as you clearly acknowledge. (Michel Foucault… recognized these dangers as clearly as anyone, as have those he inspired.) One advantage of police order maintenance is that no one can fail to recognize that it is coercive.

Therapeutic solutions like institutionalization make it dangerously easy to jump too quickly to the claim that "it's for their own good," but it's harder to kid ourselves that way about police order maintenance. It puts us on our guard. It forces us, hopefully, to recognize the need to face the tough questions you rightfully asked yesterday: Do we really have good reason to intervene against this disorder? Is this really wrongful conduct? Or are we just being finicky, intolerant—even racist?
That's a clever argument, but very, very dubious. David Cole's No Equal Justice makes a good case that those who have the luxury of debating the justice system have in fact been very, very good at "kid[ding] ourselves" about disparities in the system—especially since the many double standards in law enforcement "allow the privileged to enjoy constitutional protections from police power without paying the costs associated with extending those protections across the board to minorities and the poor." It's allowed people to avoid "tough questions" about the tradeoff between liberty and security. Plus, voters seem to be especially irrational about crime control in a way they aren't about many other things. In no other policy field, it seems, are the experts who know how to control crime, deal with drug use, reduce incarceration rates, etc., so thoroughly ignored. It's true that the subtle coercion inherent in other social policies is often ignored, but to say that law enforcement policy "puts us on our guard" seems wrong.

But at any rate, it's a nuanced debate, far better than what I've presented here, and definitely worth reading in full.
-- Brad Plumer 10:56 PM
Again with the Tea Leaves

I don't know what to say about the president's new "Strategy for Victory" in Iraq. There's nothing new here, besides hope that everything ends up working out somehow. But that aside, this Time piece on Iraq is actually pretty informative, and has some reporting on the various Pentagon plans being floated for withdrawing troops:
There isn't one plan, but several, each containing various options for Army General George Casey, the top U.S. military officer in Iraq. Pentagon officials acknowledged last week that the number of U.S. troops could be cut to 100,000 by the end of 2006. But Casey will face two "decision points" next year--one in March, when he can fully assess the effects of the Dec. 15 election, the other in June, when major U.S. units have to be told if they will deploy.

At this stage, almost no one is talking about a rapid, large-scale troop drawdown. Inside the Pentagon, officers privately caution that troop levels could even rise if Iraqi security forces don't shape up as expected, if the insurgency grows more fierce or--of greatest concern--if civil strife evolves into full-fledged civil war. In fact, a senior Pentagon official tells TIME that Secretary of Defense Donald Rumsfeld asked his planners last week to make sure they have a contingency option if things go very badly in Iraq next year.

Even if the U.S. does decide to withdraw troops, it won't simply flee. Washington is spending millions on fortifying a few Iraqi bases for the long haul. "The challenge for us is, what is the right balance--not to be too present but also not to be underpresent. This will require constant calibration," U.S. Ambassador to Iraq Zalmay Khalilzad tells TIME. Indeed, last August, Army chief of staff Peter Schoomaker said that as many as 100,000 Army troops could remain in Iraq for four years.
Presumably this also depends on what the new Iraqi government wants, which may be what comes up in that March "decision point." I have no idea whether it's even possible to keep 100,000 troops in Iraq for the next four years, or whether a "contingency option"—sending more troops in if civil war breaks out?—is at all feasible. I assume a lot of soldiers would have to go back for fourth or fifth tours, with all the bad effects that will have. Still, Time doesn't make it sound like a massive drawdown is in the cards, though admittedly it's like reading tea leaves here.

Also, Elaine Grossman of Inside the Pentagon recently reported on actual debates within the military about strategy in Iraq. Officers "have complained privately that the military strategy seemed adrift, lacking clear objectives or measurable progress." Originally the strategy involved killing lots of insurgents. Then in spring 2004 the military shifted focus, making Iraqi troop training its first priority. But after that started, insurgent attacks on civilians rose dramatically, so now the military wants to focus on protecting civilians. Ambassador Zalmay Khalilzad is apparently very interested in Andrew Krepinevich's "oil-spot" theory and very recently had a "Red Team" of analysts recommend a new strategy along those lines. Part of this new approach, it seems, would involve recruiting Sunni tribal leaders for security purposes—"with mixed results" so far. (They're also putting together three "provincial reconstruction teams"; why this wasn't done before, I have no idea.)

So that seems to be where things are heading (again, tea leaves...), although not surprisingly, many people think it's way too late for this "new strategy" to work, especially since the new plan will, it appears, rely heavily on local militias—death squads—to "protect local populations," not exactly a recipe for stability. Not to mention the fact that there are a lot of things in the country, especially on the political front, that the U.S. can no longer control...
-- Brad Plumer 4:03 PM

November 29, 2005

When Lobbyists Attack

This Mark Schmitt post brings up a good point. Yesterday I posted on how the defense appropriations process was heavily swayed by the $40 million spent on lobbying each year. Not that I know personally; most of that comes from reading Wastrels of Defense, by Winslow Wheeler, the former national security staffer for Pete Domenici. He's in the know, and if he's saying there's a ton of "legal corruption" going on, there probably is. (Plus, he makes a good case.)

Still, in theory it's possible that many lobbying dollars have far less insidious effects. In the case Schmitt mentions, Byron Dorgan (D-ND) wrote a letter requesting funds for a school desired by a Louisiana Indian tribe, and two weeks later Jack Abramoff told the tribe, a client of his, to send Dorgan $5,000 in campaign contributions. That could be corruption—i.e., Dorgan was paid to write the letter—but it might just be that Dorgan was going to write the letter anyway, seeing as how he's always done a lot of work for Native Americans, and Abramoff knew this, and so he had the tribe send some money to make it look like his lobbying efforts were worthwhile, even though Abramoff had done nothing. I guess it's inevitable that the people who hire lobbyists will be the ones getting ripped off now and again, although that hardly means that all campaign contributions are innocent.
-- Brad Plumer 10:05 PM
Utopia and Fascism

Ellen Willis' essay, arguing that the world needs more "utopian thinking," is very much worth a read. She's right that utopian thinking has gotten a bad rap ever since Hannah Arendt and Isaiah Berlin argued that the Soviet Union and Nazi Germany came about all because people rejected "liberal pluralism" in favor of a "monolithic ideology." As a quick aside, I agree, monolithic ideology's bad news, but the idea that fascism came about because of "utopian thinking" has always seemed incorrect to me. Robert Paxton's The Anatomy of Fascism made this point nicely:
Fascist regimes functioned like an epoxy: an amalgam of two very different agents, fascist dynamism and conservative order, bonded by shared enmity toward liberalism and the Left, and a shared willingness to stop at nothing to destroy common enemies.
Mind you, this isn't the only way to think about fascism; it all depends, I suppose, on whether it's seen as a revolutionary movement, or a counterrevolutionary one. Paxton believes the latter, and his argument rests on the fact that while fascist movements may have been ideological—although hardly very philosophically coherent—they only actually came to power in Germany and Italy, in places where the conservative elite feared the rise of the left. (In Romania and Austria, fascist movements were crushed by conservatives.)

And once in power, the two fascist regimes didn't really have a coherent philosophy of governing, apart from a drive towards national and social domination—and, in the Nazi case, racial purity (one could call the Holocaust a "utopian" project, although I don't know how much that clarifies). Neither pursued a utopian political order as such: both Hitler and Mussolini pretty much used existing state institutions as needed, in an ad hoc fashion. There was no philosopher of fascism, or system builder, except for maybe Carl Schmitt—though it would be hard to call him utopian. So I don't know, the idea that fascism is a brand of "utopian thinking" doesn't seem quite right.
-- Brad Plumer 7:36 PM
Death Squads

Reading the New York Times' report on Shiite death squads in Iraq—which seem to have semi-official backing from the Interior Ministry and have killed or abducted a reported "700 Sunni men" over the past four months—it's hard to figure out how this all got started. Laura Rozen suggests that Pentagon officials had planned for death squads (or as they put it, the "Salvadoran option") all along. But this bit from the Times suggests that things aren't quite going according to plan:
American officials, who are overseeing the training of the Iraqi Army and the police, acknowledge that police officers and Iraqi soldiers, and the militias with which they are associated, may indeed be carrying out killings and abductions in Sunni communities, without direct American knowledge.
Praktike points out that the original Newsweek piece on the Salvadoran option wasn't exactly correct, and the United States may have never intended to create "death squads" per se. In 2003, Special Forces veteran James Steele was charged with organizing "special police commando units" that were mainly supposed to target insurgent leaders. Those units, of course, drew heavily from Shiite and Kurdish militias, including, no doubt, the Iranian-backed Badr Brigade. Meanwhile, some of the Badr militiamen running a torture camp in Baghdad may have been trained by American interrogators, but that doesn't mean they were intentionally trained as death squads. (U.S. forces uncovered the torture palace, after all.) And some of this death squad behavior probably came about because of genuine infiltration by the Badr Brigade, Mahdi Army, etc., without U.S. knowledge.

Either way, the end result was the same. The Pentagon's early attempts at a "dirty war" have pretty clearly spiraled out of control, and the death squads seem intent on going far beyond anything the U.S. ever envisioned. SCIRI leader Abdul Aziz al-Hakim has been chafing at U.S. efforts to rein him in. The Interior Minister, Bayan Jabr, no longer shares information with the U.S. It never seems to have occurred to anyone that fanatical Shiite militias would be somewhat less than charitable about policing their former Sunni tormentors. (Or maybe it did, and there was nothing they could do about it.) Yet again the Pentagon's discovering what a few "bad apples" can do if given half a chance.

Back when the Newsweek article on the "Salvadoran Option" was first published, Jason Vest wrote an important piece noting that military analysts have long concluded that the "death squads" in El Salvador, far from being a brutal force that just so happened to be effective, actually prolonged the conflict against the leftist insurgency there. It seems, reading the Times, that current military officers are aware that an Iraqi army filled with Iranian-backed thugs carrying out reprisal killings and running torture camps isn't going to end the violence in Iraq either. On the other hand, according to Seymour Hersh, the president sure seems enthusiastic about backing Shiite butchers so long as they "complete the mission." As one official described the president's thinking, the battle against the insurgency "may end up being a nasty and murderous civil war in Iraq, but we and our allies would still win." Lovely.

Maybe Bush will get his way, and that's how the U.S. will stake out its exit. Even if saner voices prevail, though, it's not clear that they can actually do anything about it. As the Times reported back in August, the U.S. is already wary of giving the Iraqi army heavy weaponry in part because they're worried that some of the Shiite groups will use them for "civil conflict". But if they don't arm the security forces, then there goes the exit strategy. On the other hand, Jim Lobe reports that Zal Khalilzad is going to start chatting directly, for the first time ever, with the Iranians about stabilizing Iraq. In a former age, this was known as the John Kerry policy, but I guess real men wait until the car is totaled before asking for directions.
-- Brad Plumer 6:47 PM
Messiahs and Empire

In the Fall issue of Dissent, John Judis revisits a perennially fascinating topic: the religious roots of American foreign policy. There are a couple of ways to look at this issue. Judis spent a lot of time in his last book, The Folly of Empire, discussing how America's image of itself as God's chosen nation—as Abraham Lincoln said, the "last, best hope of earth"—has inspired the idea that the United States has a "calling" to transform the world. Not only that, but America often casts its conflicts in terms of good and evil, which, as Walter Russell Mead pointed out, means that it tends to fight viciously and rarely, if ever, accepts defeat.

These religious ideals certainly aren't the only things driving American policy (external events and dangers matter too), but they play a large part; after all, far and away most foreign policy thinkers believe that the United States does have a duty to transform the world. Sometimes those ideals do more good than harm. Often not. As Judis points out, the main differences among the various schools are tactical—liberal internationalists, for example, tend to value multilateralism and a sense of prudence, believe in the magical healing powers of global capital, and don't usually descend into the trance-like messianism of George W. Bush, as recently described by Seymour Hersh:
In recent interviews, one former senior official, who served in Bush's first term, spoke extensively about the connection between the President's religious faith and his view of the war in Iraq. After the September 11, 2001, terrorist attacks, the former official said, he was told that Bush felt that "God put me here" to deal with the war on terror. The President’s belief was fortified by the Republican sweep in the 2002 congressional elections; Bush saw the victory as a purposeful message from God that "he's the man," the former official said. Publicly, Bush depicted his reelection as a referendum on the war; privately, he spoke of it as another manifestation of divine purpose.
It's insane. But as Judis touches on, and Anatol Lieven really dives into, a president with a divine sense of purpose is hardly the scariest religious impulse in American foreign policy. That honor belongs to the various forms of millennialism in the United States, which can hold that it's America's duty to bring about the millennium—as the 18th century preacher Jonathan Edwards said, "the dawning, or at least the prelude, of that glorious work of God… shall begin in America"—a mentality which can incline one towards reckless and revolutionary foreign policy. Alternatively, the broad prophetic belief in the Rapture—as depicted in the Left Behind series, which has sold 62 million copies—tends to foster paranoia and national aggression of the worst sort. Here's Lieven:
Not only is this [prophetic] tradition deeply and explicitly hostile to the Enlightenment and to any rational basis for human discourse or American national unity, it cultivates a form of insane paranoia toward much of the outside world in general. Thus The End of the Age, a novel by the Christian Rightist preacher and politician Pat Robertson, features a conspiracy between a Hillary Clintonesque first lady and a Muslim billionaire to make Antichrist president of the United States. Antichrist, who has a French surname, was possessed by Satan, in the form of the Hindu god Shiva, while serving with the Peace Corps in India.
It would be wrong to think these sorts of views have no effect on shaping Republican foreign policy—or Democratic foreign policy, for that matter—although it's hardly a necessary leap from millennialism to a neoconservative foreign policy that's intent on revolutionizing the world through armed aggression. Many populists over the years have put the millennial impulse in the service of a more isolationist foreign policy—in which, as William Jennings Bryan put it, "Our mission is to implant hope in the breast of humanity, and substitute higher ideals for the ideals which have led nations into armed conflict." In other words, the U.S. will set a good example from afar. Pat Robertson has advocated something similar, although he also seems to dabble in political assassination these days.

At any rate, religion—especially the crazier varieties—won't up and vanish from the United States anytime soon. So Judis argues that the most successful American policymakers will "focus on America's objectives, as given by the millennialist framework, while still retaining a complex and non-apocalyptic view of means and ends, capabilities and challenges." Or, as the theologian Reinhold Niebuhr put it, one must have a sense of irony towards the "pretentious elements in our original dream."

That may be. On the other hand, the standard-bearers of a "non-apocalyptic view of means and ends, capabilities and challenges" tend to be the "realists" within the foreign policy establishment—those, like Jeane Kirkpatrick or Condoleezza Rice, who believe that America must remake the world in its own image, but should be cautious about how to do so. These include the officials who dissuaded George W. Bush from pushing a hard-line, neoconservative stance against Russia and China in early 2001, and who convinced the president to adopt a more pragmatic approach towards Iran, North Korea, and Syria—stepping back from messianic talk of "evil" regimes—in 2005. But even if these "realists" don't drink from the same millennial Kool-Aid as the neoconservatives, they very much serve their own master: namely, American militarism. Lieven again:
This relative caution on the part of Realists in the U.S. establishment reflects in part the nature and interests of the U.S. military-industrial and security elites. These elites are obviously interested in the maintenance and expansion of U.S. global military power, if only because their own jobs and profits depend on it. Jobs and patronage also ensure the support of much of Congress, which often lards defense spending bills with weapons systems the Pentagon does not want and has not even asked for, to help out senators and congressmen whose states produce these systems. And as already noted, to maintain a measure of wider support in the U.S. media and public, it is also necessary to maintain the perception of certain foreign nations as threats to the United States and a certain minimum and permanent level of international tension.

But a desire for permanent international tension is different from a desire for war, especially a major international war which might ruin the international economy. The American generals of the Clinton era have been described as "aggressive only about their budgets." The American ruling system is therefore not a Napoléonic or Moghul one. It does not actively desire major wars, because it does not depend on major victorious wars for its own survival, and it would indeed be threatened by such wars even if the country were victorious. Small wars are admittedly a different matter.
(Needless to say, an approach that fosters a "minimum and permanent level of international tension," with, say, China, without intending to go to war can still lead to war—even by accident.)

So these seem to be the broad constraints on American foreign policy. On the one hand, there's the widespread, quasi-religious idea that America is the chosen nation called on to transform the world, with all the genuinely noble and appallingly ugly impulses that brings. On the other hand, there's a security establishment that doesn't buy into these millennial fantasies, but still remains committed to the endless "maintenance and expansion of U.S. global military power." Zbigniew Brzezinski, who is held in high regard by the Democratic establishment, sees the world as a "Grand Chessboard," which gives a sense of what that's all about. If this is all correct, then any hopes for a sensible foreign policy in the near future are probably foolish. A lot will have to change before then. Fortunately, Barbara Rossing's book on why the Rapture is bad theology is now out in paperback. A good stocking-stuffer, for sure.
-- Brad Plumer 4:20 PM

November 28, 2005

Dukestir

The Duke Cunningham story, I'm guessing, will mostly focus on the corruption angle. And why not? You have a congressman on the defense appropriations subcommittee taking bribes in exchange for helping a defense contractor win contracts. What a sleazeball, etc. But in an ideal world, some attention would get focused on the much larger problem of the defense appropriations process in general, which makes this sort of corruption almost inevitable.

Defense contractors live or die on the contracts Congress decides to hand out. Most of them have grown up under the command economy that is the annual defense appropriations bill, and wouldn't know how to survive in the free market. Understandably, then, contractors tend to put a lot of effort into lobbying and influencing legislators. Between 1997 and 2004, the top 20 defense contractors made $46 million in campaign contributions, and spent $390 million on lobbyists—and were rewarded for their efforts with $560 billion in contracts. Then there's a permanent revolving door between government and the defense industry, which is laid out in gory detail by the Project on Government Oversight. A lot of money gets sloshed around, and while most of it isn't flat-out bribery, it often comes close. Under the circumstances, what happened with Cunningham was bad and illegal, but not completely out of step with the larger trend.
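A quick back-of-envelope ratio, using only the figures above (my arithmetic, so treat it as rough): $46 million in contributions plus $390 million on lobbyists comes to about $436 million in influence spending over those eight years, set against $560 billion in contracts, or on the order of $1,300 in contracts for every dollar spent. That ratio doesn't prove the spending bought the contracts, but it does suggest why the industry treats lobbying as a bargain.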

Even more interesting than Cunningham, meanwhile, is MZM Inc., the company that bribed him. The Los Angeles Times reports that the company has received "$163 million in federal contracts, mostly for classified defense projects involving the gathering and analysis of intelligence." Just to be clear, a firm that bought a house for a corrupt Congressman is doing "classified" intelligence work. Okay, then.
-- Brad Plumer 5:01 PM
The Two Welfare States

Via Mark Thoma, a good counter-intuitive point by Paul Krugman about the American welfare state:
We like to think of ourselves as rugged individualists, not like those coddled Europeans with their oversized welfare states. But as Jacob Hacker of Yale points out in his book "The Divided Welfare State," if you add in corporate spending on health care and pensions - spending that is both regulated by the government and subsidized by tax breaks - we actually have a welfare state that's about as large relative to our economy as those of other advanced countries.

The resulting system is imperfect: those who don't work for companies with good benefits are, in effect, second-class citizens.
Since I have Hacker's book on my desk, here are the numbers: In 1995, Germany and Sweden each spent about 27 percent of GDP on after-tax social welfare expenditures—of which roughly two percentage points was "private" spending (i.e., by employers). The United States, meanwhile, spent about 25 percent of GDP—of which roughly eight percentage points was private.

Granted, the American "divided welfare state" is comparable to or better than Swedish-style socialism for those workers who happen to have stable jobs with good pensions, 401(k)'s, corporate day care, and gold-plated health care benefits. But for everyone else, it's inequitable, regressive, and a source of uncertainty for those increasingly at risk of losing their jobs. (Indeed, it's becoming a worse deal even for many workers with stable jobs, as they face greater cost-sharing for health care or as companies default on their pension funds, or what not.) In the old days, though, businesses loved it, and some of them even backed the creation of entitlements like Medicaid and Medicare as ways of reinforcing the status quo.

But now many companies look ready to shift more of the welfare burden onto the government, and move towards a European-style welfare state, although Daniel Gross has noted that there are two types of companies here—those that, like GM, want the government to pick up their health care and pension costs, and those, especially newer high-tech companies, that still want tiny government. Meanwhile, 60 percent of workers still get their health benefits through their company (although that number's declining), so it's not clear how many voters actually want to shift to a European-style welfare state, at least right now. Change won't be easy, although the opportunity is certainly there. Also, those companies that no longer want to be on the hook for, say, health care costs aren't necessarily going to push for single-payer, or France-style health care. They could just as easily be convinced to agree to the Bush administration's horrible HSA proposal, which would shove people onto the open insurance market. That, I think, is going to be a major fight.

One other miscellaneous point: It's worth noting that most of the legislation that expanded the employer-centered "private" welfare state was passed with very little public debate. The Revenue Act of 1978, for instance, slipped in an obscure provision creating 401(k) plans, a provision the CBO estimated would have a "negligible effect on budget receipts"—today it costs the government over $100 billion a year, for a program that mostly benefits the well-off. It would be naïve to pretend that public opinion has an all-powerful effect on legislation, but even by those standards there's been a breathtakingly small amount of oversight of the expansion of the private welfare state.
-- Brad Plumer 2:13 PM

November 27, 2005

Out of Iraq

In the wake of the Bush administration's pseudo-announcement that it has plans to draw down troops from Iraq by the end of 2006, here are two worthwhile posts on the subject, by Dan Darling and Mark Safranski. It really does seem that the wretched state of the Reserves and the National Guard, along with GOP concerns about the 2006 midterms, is the main driver here.

As to what comes next, I won't try to predict. John Robb thinks the U.S. is going to stage a "controlled chaos exit," relying on Shiite and Kurdish paramilitaries to keep order as the Iraqi state dissolves. Ayad Allawi is worried that death squads will run rampant. Juan Cole reports that the U.S. may make a stronger push to negotiate with Sunni insurgents. And Seymour Hersh is reporting that the U.S. will continue to use massive airpower to bomb insurgents, and whoever else happens to get in the way, after the draw-down. Understandably, he—along with a number of Air Force officers, apparently—thinks this is a bad idea.

Well, maybe any or all of those things will come to pass. Nadezhda seems to have the best prediction here, though: "[F]or at least the next six months, it's hard not to predict a continued absence of a clear strategy. In turn, that means a continued reliance on messy improvisation, with the quality of outcomes in part dependent on the talents of various improvisers." How does the saying go? "I don't see any method at all here, sir."
-- Brad Plumer 8:22 PM
Standing in Line

A few weeks ago, kactus of the excellent OurWord.org had a moving post on the often sadistic bureaucracies that welfare recipients are forced to navigate, sometimes on a daily basis: "I've decided that there must be a giddy sense of power that comes from being able to command poor people to stand in line, at the drop of a hat." Definitely worth reading in full, and I meant to link to it before, but forgot until coming across a related passage in David K. Shipler's The Working Poor:
The system is also plagued by welfare cheats. They are not people who receive welfare illicitly. The more damaging welfare cheats are the caseworkers and other officials who contrive to discourage or reject perfectly eligible families. These are the people who ask a working poor mother a few perfunctory questions at the reception desk, then illegally refuse to give her an application form, despite the law's provision that anyone of any means may apply. It is a clever tactic, say the lawyers, because they cannot intervene on behalf of a client who has applied.

The welfare cheats are the officials who design Kafkaesque labyrinths of paperwork that force a recipient of food stamps or Medicaid or welfare to keep elaborate files of documents and run time-consuming gauntlets of government offices while taking off from work. "I have clients with daily planners that are filled more than mine are," said Ellen Lawton, an attorney at [a nutrition] clinic.

If you want to stay on welfare, you have to provide pieces of paper proving that your children have been immunized and are attending school. If you want food stamps, you have to deliver pay stubs and tax returns. If you want a job, you need day care for your children, and if you can't afford it, you have to get a day-care voucher, and if you want a voucher, you have to prove that you're working. Getting a voucher involves multiple visits to multiple offices—during working hours, of course. Caught in this Catch-22, one mother put herself on waiting lists at infant day-care centers all over the city; meanwhile, her caseworker told her that she had to get a job before she could get day care paid for. Lawton quoted the caseworker: "So if you're on a waiting list, you need to find somebody who's gonna watch your kid."

Every demand for a document provides an opportunity for a cutoff, because no matter how meticulous a recipient may be, pieces of paper seem to get lost in the bureaucracy. "I just had a client like this last week," said Lawton. "She had received three different notices informing her in three different ways that she was being cut off. One of the issues was that she hadn't provided a certain piece of paper about her attendance at a [job-training] program. And she said she had provided the paper, but they lost it. Fine, we provided another piece of paper. She receives another notice that she's going to be cut off. Well, it's actually a different computer system that's generating notice, so she has to take time off from her program to go and get another piece of paper, bring it to the office…. Being poor is a full-time job, it really is."

It also promotes absurdity. One mother, desperate to get her asthmatic child out of a harmful apartment, obtained a letter from her pediatrician saying the house was making the child sick, which technically qualified her for emergency assistance, Zotter said. But the welfare department's receptionist turned her away three times, telling her that she already had housing and couldn't even apply for temporary shelter as long as she wasn't homeless. The mother seriously considered moving out and making herself homeless to qualify. As the lawyer was explaining forcefully to a caseworker how the welfare department had broken the law, "she gave up and she moved to Atlanta, because she said she just didn't feel like the system was helping her."

Just under half such cases can be solved with an attorney's phone call, Zotter estimated. One involved the mother of another patient who was denied an application for emergency food stamps. "If you're really low income you can get food stamps within twenty-four to forty-eight hours," Zotter said, "and then they do your verification and see if you really qualify. And they wouldn't let her apply for it. I just called them up and said, this is her income, she has no resources, she qualifies for this, you have to give it to her. And they did."

Blessed are the poor who have lawyers on their side.
A few scattered thoughts. First, especially with Medicaid, it's worth noting that many programs have perverse incentives on this front. When states are facing budget crunches, they certainly try to make things difficult and deter as many low-income families as possible from signing up—and these efforts are usually greatest during downturns, when more people than usual need coverage. Those eligible also tend to be, as Shipler notes, the easiest people to deter. This partly explains why some 10 million people who are eligible for Medicaid are not enrolled, although there are lots of other reasons. On the other hand, if everyone were enrolled in one universal health care system, there would be no need for extensive verification, and much of the bureaucracy could be slashed. (And Medicaid's administrative costs are already lower than the private sector's.) Not to mention the fact that no one would be turned away.

Second, it's very likely that the conservative push to hand programs such as Medicaid entirely over to the states will give those states even greater incentive to save money by deterring people from signing up. (On a related note, see this paper by Michael Bailey for evidence that "devolution" creates a race to the bottom, as states try to reduce the level of services they provide in order to dissuade recipients from moving to their states.)

There's also no reason to think the increasing drive to privatize social services would improve the "Kafkaesque labyrinth." The situation kactus describes seems like a result of prevailing cultural attitudes towards welfare recipients: that, as she says, they should "wait in line without complaint, cuz that's what you get for daring to be poor and looking for a handout." Shipler's book gives a similar sense, that attitudes towards the poor play a huge role here. See this interview with Richard Moffitt for one theory on why this cultural shift came about. (Although he seems to overlook a point or two...)
-- Brad Plumer 7:24 PM

November 26, 2005

Abramoff and Friends

For anyone who, like me, hasn't been keeping up with all the Abramoff-related scandals in Washington, Matthew Continetti has two very readable overviews in the Weekly Standard. The SunCruz affair in particular, which ended in a mob slaying, seems like a very big deal. Although it's hard to see what the point was of bribing Rep. Bob Ney to insert mean things about a casino owner in the Congressional Record. Obviously it's unethical, but is this really the best way to intimidate a stubborn casino seller into selling his boat? It's a bit unclear. And Ney's defense seems to be that he wasn't bribed to make those remarks; he just often puts random things handed to him by lobbyists into the Congressional Record. Fun times...
-- Brad Plumer 10:56 PM
Enduring Dynamics

Thanksgiving weekend without the family has basically been time for reading a bunch of books I've been meaning to read. One is Anatol Lieven's excellent America Right or Wrong: An Anatomy of American Nationalism, which tries to diagnose American foreign policy—always a fun game. In the introduction Lieven takes issue with those on "the Left" who lump Bush's foreign policy in with Bill Clinton's, seeing them both as "reflecting the enduring dynamics and requirements of an imperial version of American capitalism." To this he says:
This analysis is indeed partly true, but in emphasizing common goals, left-wing analysts have a tendency to lose sight of certain other highly important factors: the means used to achieve these ends; the difference between intelligent and stupid means; and the extent to which the choice of means is influenced by irrational sentiments which are irrelevant or even contrary to the goals pursued. Of the irrational sentiments which have contributed to wrecking intelligent capitalist strategies—not only today, but for most of modern history—the most important and dangerous is nationalism.
That seems right, and as someone who thinks that the foreign policies of both the Bush and Clinton administrations do reflect, in part, "the enduring dynamics and requirements of an imperial version of American capitalism," and have many things in common, I think it's worth trying to sketch this point out a bit.

I imagine the "left-wing" storyline Lieven's discussing would argue, more or less, that the "enduring dynamic" of American politics since 1979, both at home and abroad, has been class warfare. This story would start in the 1970s, when inflation endangered the share of national wealth reserved for the top 1 percent of Americans—a share which plunged from 35 percent in the 1960s down to under 20 percent in 1976. (Real wages, meanwhile, reached historic highs during the 1970s.) For a time, Lord Keynes' vision of the "euthanasia of the rentier" was perilously close to becoming a reality, and that just wouldn't do.

Luckily for the rentier, Paul Volcker came to the rescue in 1979, using the Fed to break inflation's back—along with many an American worker. And Ronald Reagan's parallel attack on labor and progressive taxation helped ensure that the upper classes would never again have to endure the trauma of the '70s. Thomas Edsall's 1984 book on the Volcker/Reagan backlash, The New Politics of Inequality, argues that these moves were partially the result of a concerted drive on the part of the upper classes. For example: "During the 1970s, business refined its ability to act as a class, submerging competitive instincts in favor of joint, cooperative action in the legislative arena." By the end of Reagan's term, that class had certainly restored itself to power—the top 1 percent owned 40 percent of the wealth once again. (A parallel restoration occurred under Thatcher in England.) While it's easy to get too conspiratorial here—presidents only have so much power, after all—there's surely something to Edsall's account.

Continuing the story: Bill Clinton's two terms saw "neoliberalism" go global, as they say. One leftist explanation for the '90s boom is that investors were doing stunningly well pillaging countries overseas. Gérard Duménil and Dominique Lévy have noted that U.S. holdings abroad skyrocketed in the '80s and '90s, and, at points during the Clinton era, American firms were making as much in profits overseas as they were at home. Then there was the shadier sort of pillaging described in Confessions of an Economic Hit Man. Even the Ming Dynasty at its peak was never so efficient at exacting tribute from foreigners.

Not only that, but the prescriptions of the "Washington Consensus," as they say, minted new upper classes abroad—in Mexico and Russia, among others—and brought about a string of debt crises and devaluations in the '80s and '90s that seemed to do little more than redistribute wealth to the first world, as investors swooped in and bought up foreign assets at bargain prices. As Eric Toussaint notes in Your Money or Your Life, since 1980, "fifty Marshall Plans (over $4.6 trillion) have been sent by the peoples at the Periphery to their creditors in the Center." Whether there are any redeeming features of this system is another debate. (I think there are.)

So far, so tedious. But within this "Leftist" view of the past 25 years—which, in part, I agree with—it's sometimes difficult to see how foreign policy conforms to the "enduring dynamic." Often it doesn't. In the Reagan years, foreign policy was motivated by a blend of Cold War concerns and pro-business intervention. That fits. But Bush I and Clinton are harder cases. It's not always obvious how their foreign policy moves benefited, say, business interests. The first Gulf War secured the Middle East oil supply, true, but James Mann's Rise of the Vulcans makes clear that that was hardly the only consideration. Kosovo and Bosnia, meanwhile, had a string of mixed motives guiding intervention—some genuinely humanitarian—as did many of Clinton's foreign policy moves. Military power hasn't always advanced the neoliberal "agenda" as efficiently as leftists would expect. (Indeed, some of those interventions—like Bosnia and Kosovo—did a lot of good, whatever their motives.)

In general, I'd agree with Andrew Bacevich that American foreign policy tends to be driven by a self-sustaining militarism, along with what Lieven sees as nationalism and the "American Creed," the idea that America's duty is to spread freedom and liberty around the world. The "requirements of an imperial version of American capitalism" gets a role too, but different presidents will chart very different courses, depending on ideology, specific events, their level of competence, etc. There's a lot of randomness involved, and those courses won't always follow the "demands of capital." Mark Engler recently pointed out that Bush's foreign policy, and the "war on terror," have been very bad for business, which seems intuitively true. The upper classes may have little to quarrel with when it comes to Republican rule, but the "liberal internationalism" of the Clinton years—with its focus on relentless globalization, undemocratic market institutions, stable alliances, and a vast empire of bases abroad—was arguably much better for business worldwide. Again, this was a function of competence, ideology, specific events, etc.

Now Lieven's very convincing when he argues that the current Bush's foreign policy has been mobilized, to a greater degree than his predecessor's, by nationalism, which, in many cases, "wreck[s] intelligent capitalist strategies." It's reminiscent of the days prior to World War I, when the reigning pacifists were German and British business elites, who were overruled by inept politicians and nationalist sentiment.

But one thing Lieven might have noted is that, insofar as this "neoliberal revolution" over the past twenty years has led to increasing economic instability at home, stoking the fires of nationalism might be one of the few reliable ways that a politician like Bush can gain electoral support. Certainly liberal pundits who suggest that the Democratic Party must focus on foreign policy to win more elections believe this. In that case the tension between neoliberalism and nationalism might end up being an "enduring dynamic," with the Clinton years being a bit of an aberration. I can't really say. It's worth noting that the sort of leftist structural views discussed here (and by Lieven) often end up being much too rigid and quite wrong.

-- Brad Plumer 10:55 PM
Underground Gun Markets

Two weeks ago San Francisco passed a ban on guns within city limits, the strictest such ban in the country: not only are all sales banned, but everyone must turn in his or her handgun by next April Fool's Day. The betting line, it seems, is that the new law won't survive a court challenge. From a practical standpoint, though, many opponents of the law have argued that at any rate these sorts of bans won't affect gun ownership, or gun violence, in the city, since criminals will still be able to buy guns from either the underground market or outside city limits. Only law-abiding citizens will be affected, etc.

That sort of logic usually makes sense—after all, bans on narcotics don't seem to have any appreciable effect on the drug market—but it's probably not quite right. Four researchers—Philip Cook, Jens Ludwig, Sudhir Venkatesh, and Anthony Braga—have just put out a study looking at Chicago's underground gun market, and found that a ban on handguns in that city seriously increased the amount of "friction" in the gun market, making it much harder for the average person to get access to guns.

Unlike with drugs, the relatively small number of buyers, sellers and transactions in an underground gun market creates "thinness" in the market, which leads to serious transaction costs. Repeat business in the gun market is rare, since generally you just need one gun, so most buyers don't know where to go, or who to trust. It's hard to advertise, after all. Plus, many buyers in the underground markets don't know the first thing about guns, and will often buy guns that may not even work, just for show. (Ammo is even harder to find; for obvious reasons you're generally not allowed to load a gun and "test it" during a sale, and many youths don't even know what sort of ammo they need.) And friction creates more friction—a lack of sellers reduces the number of buyers, which in turn discourages sellers. (Moreover, many people who want a gun, especially for "unlawful purposes," don't ever end up leaving the city, so buying a gun legally from the suburbs isn't usually an option.)

Now here's the caveat: gangs, obviously, have readier access to gun markets, but for a variety of reasons gangs don't just start giving guns to anyone who wants one, and that includes low-level gang members. Not only are gangs wary of hostile takeovers, but gun violence is often bad for business—it scares customers and attracts the police. So regulation is very tight, especially within gangs, where only about 25 percent of all members even have a gun. But that's still a lot of people with access, which probably explains why Chicago doesn't have much lower levels of gun violence than other cities, even if it is harder for the average person to get a gun.

In that case, it's reasonable to think that increasing the friction in gun and ammo markets, coupled with some sort of collective-deterrence strategy against gangs—as happened in Boston's Operation Ceasefire—would reduce gun violence. It's hard to tell. But either way, it's an interesting study, and a useful corrective to the view that it's "impossible" to regulate the gun market.
-- Brad Plumer 5:51 PM
Organizing Temp Workers

The other day I pointed to research showing that temp work wasn't really the best way for anyone, especially those kicked off the welfare rolls, to move up the ladder and land a steady job. That brings up the obvious question, "Well, why don't temp workers organize?" What are the obstacles? Especially since about 27 percent of the workforce is employed in "nonstandard" arrangements.

As it turns out, back in 2000, the National Labor Relations Board actually ruled in MB Sturgis that temp workers who perform the same job as regular employees at a company share a "community of interests," and that unions can organize and negotiate for these workers. That obviously affects a lot of things, including wages and benefits, but one of the biggest is that contingent workers could have access to employer-provided training programs that can eventually help them get full-time employment. Here's the sort of thing temp workers could, in theory, take advantage of:
In the telecommunications industry, where complex new technologies are constantly being introduced, large employers such as Verizon and SBC Communications have long partnered with the Communications Workers of America (CWA) to offer telecom workers advanced technical training. Now, through the union’s CWA/NETT academy, other workers can take advantage of an array of programs offering training in the highly specialized skills telecom employers need. After only three years, CWA/NETT has already graduated more than 1,000 workers.

Some smaller firms have adopted a similar approach. For example, Tucker Technology Inc., an Oakland, California, information-technology firm, offers its clients a range of hardware- and software-design and sophisticated installation services. Through its partnership with the CWA, Tucker is able to train the cabling technicians, design engineers, and other workers it needs.
The catch, though, is that Sturgis is no longer valid. The Bush-appointed NLRB overturned it in November of 2004—ruling instead that temp workers could join forces with permanent employees and unionize only if both the temp agency and the employer agree to it. In other words, never. Granted, the relative impact of this is rather small, since in practice Sturgis mainly affected those temp workers in unionized industries—roughly 10 percent—but it still matters. While we're at it, one might ask why the "independent" Federal Reserve sets aside five FOMC votes for the heads of regional Reserve Banks, institutions owned by commercial banks, while the NLRB certainly doesn't have anywhere near half its seats reserved for unionists, but obviously I just don't understand democracy very well.
-- Brad Plumer 4:58 PM

November 23, 2005

Life of Caesar

In the L.A. Weekly, Vince Beiser has an article on the life cycle of a Caesar salad, looking at all the underpaid labor and overtoxic chemicals that go into making one, from start to finish. Good piece. It's also worth quoting Mark Krikorian's bit on the economics of immigrant farmworkers: "Foreign labor is concentrated in the harvest of fresh fruits and vegetables. And if you look at the actual numbers, you'd see that since labor accounts for such a small part of the retail price of fruits and vegetables, giving farm workers a 40 percent raise would increase grocery costs for the typical American consumer by $8 a year. Eight dollars. A year." Okay, then.
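Since Krikorian's point is arithmetic, here's a rough sketch of how an estimate like that gets built. The inputs are my own illustrative assumptions, not his figures: if farm labor accounts for roughly 6 percent of the retail price of fresh produce, and a typical household spends around $350 a year on fresh fruits and vegetables, then a 40 percent raise passed through fully works out to about 0.06 × 0.40 × $350 ≈ $8.40 a year. The exact inputs matter less than the structure: a big raise applied to a small cost share stays small at the checkout counter.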
-- Brad Plumer 6:48 PM
True Believers

In the latest New Republic, Jason Zengerle has a lot of fun with Stephen Hayes, the Weekly Standard writer who's made a career out of insisting that Saddam Hussein really did have links with al-Qaeda. It's worth being clear about the various distinctions here, though. Hayes has tried to refute the idea that Saddam had "no" connection with al-Qaeda. He's probably right. It's hard to believe there weren't a few winks and nudges here and there. I certainly don't know. But what Hayes definitely hasn't shown is that there was a connection worth invading over, which, when you get down to it, is sort of the heart of the issue.

And it's a pretty high bar. Say al-Qaeda and Saddam Hussein did have a "collaborative relationship," as Hayes alleges, and bin Laden was asking Iraq to hook him up with some Chinese mines, or whatever. Is this a "threat"? Sure, it could be. Are there ways of dealing with it short of invading Iraq? Of course. Disrupting al-Qaeda in Afghanistan and Pakistan would have, presumably, helped put a damper on any possible relationship. Catching bin Laden would have helped. Clamping down on terrorist finances, focusing intelligence on monitoring the relationship, tightening sanctions on Iraq, heck, making Saddam paranoid with messages that bin Laden is screwing his daughter. What do we spend $44 billion a year on if not for this?

I certainly don't know if those things would all work; we'd have to ask the experts. But surely invading Iraq isn't the only way to disrupt a "collaborative relationship." Not to mention the fact that the actual invasion appears to have strengthened al-Qaeda, so relationship or no, this wasn't really the optimal response. The case for war, meanwhile, was built around the suggestion that Saddam was going to hand bin Laden a nuclear bomb any day now. That would have raised harder questions about war, but it was always ridiculous, and as it turns out—even in Hayes' universe—wildly untrue.
-- Brad Plumer 6:20 PM
Aid and Pessimism

Sam Rosenfeld has a very good Tapped post about aid to Africa, noting that while turning poor African countries into democracies with 10 percent GDP growth a year is very hard, spending a bit of money to provide them with bed nets for malaria is not. That's right. I think, though, he's attacking a straw man here. Very few "aid critics," even William Easterly, think that modest steps like sending malaria nets to Africa are useless. Easterly would probably laud it as the sort of thing we should be doing. But that's not what people like Jeffrey Sachs are proposing.

Sachs argues that you can't solve one poverty problem without solving a whole host of others, and wants to send nations not just malaria nets but trees that replenish nitrogen in the soil, rainwater-harvesting systems, better health clinics, etc. etc. The UN Millennium Project is very broad, and as such, is open to the usual criticisms. In fact, critics of Jeffrey Sachs sometimes cite the Gates Foundation's malaria net work as their preferred, more modest alternative. See the end of this piece, for instance. It seems like a distinction without a difference, but these are heated debates.

Now as it happens, I think Sachs' broad approach is a good one. Even if only an eighth of UN aid makes its way to those who need it, that's still a lot more than before. And as I reported here, aid to developing countries is generally more effective than it's given credit for, and much of the squandered trillions in African aid in years past can be explained away by the fact that there was a Cold War going on, and first world nations didn't exactly hand out aid with, ah, humanitarian ends in mind. Yes, there are a lot of sorely-needed ways to improve the aid process, and aid alone won't save any country, but on balance, it does more good than harm. (The benefits of increased trade, meanwhile, while certainly positive and worth reaping, are generally overstated.) Plus, at the margins, you get stuff like malaria nets that have concrete results.

But that's not to say aid—even very modest aid like providing malaria nets—won't do any harm whatsoever. Unless African countries can figure out how to grow, they'll remain dependent on humanitarian aid, which, while not the worst thing in the world, is a dangerous place to be. And too much squandered aid—even if it's still doing some good—can discourage donors from working in Africa. It can even help prop up dictatorships or encourage corruption. There are a lot of things to worry about. But it's true, the pessimists about aid to Africa sometimes go too far.
-- Brad Plumer 4:24 PM
GM, Socialism, Redux

Okay, okay, yesterday's post on health care and GM was very badly expressed, so let me try again, the short(er) version. Any health care reform that can free employers from the burden of providing insurance for their workers—let's say, Medicare-for-all—will make the economy far more efficient in all sorts of good ways. Very good ways. It will also be far, far more equitable from a health standpoint. So yes, bring on reform.

But like any redistribution of wealth, reform will create winners and losers. A company like GM will "win" and become more "competitive" if, after reform, it can lower the total compensation it pays to workers. Presumably it thinks that it can. Arguably, the UAW would try to fight for that pot of money suddenly "freed up" by health care reform. Certainly my local would do exactly that, especially if we all suddenly faced higher taxes to pay for Medicare-for-all or whatever. Unions tend to be justifiably proud of their hard-fought health benefits, and won't give up that money easily. Management will want that freed-up money to go to profits, obviously, or R&D, or whatever. Maybe they can learn to share. Lion, lamb, etc.

In an abstract "perfect" market, total compensation wouldn't change at all (since that's the price of labor, determined by supply and demand), and health care reform wouldn't reduce the labor costs of businesses one whit. We don't have "perfect" markets in the real world. So figuring out who the winners and losers are all depends on where and on whom the taxes and costs fall after the dust all settles. In the short run, the transition could get ugly. In the long run, we have a more sensible and equitable system. That's all I'm saying.

If health care reform, like Medicare-for-all, can lower total health care costs without hurting health outcomes—and the European experience suggests this is very likely—then the winners will win much more than the losers lose. Think of it like free trade. Maybe we'll all be winners. One can hope. By the way, it seems that DLC darling Jacob Hacker has proposed his own plan for universal Medicare. Here's a magazine article where he justifies it. It's good stuff. I back it. Maybe someone other than Dennis Kucinich will propose something similar in 2008. One can hope.
-- Brad Plumer 2:41 PM || ||
Postwar Europe

In this week's New Yorker, Louis Menand reviews Tony Judt's Postwar and turns up a few surprising tidbits. Why, for instance, was Europe so peaceful and stable and prosperous after World War II? Judt credits, more than anything else it seems, the previous thirty years of what was essentially a "ruthlessly efficient" ethnic cleansing campaign:
The tidier Europe that emerged, blinking, into the second half of the twentieth century had fewer loose ends. Thanks to war, occupation, boundary adjustments, expulsions, and genocide, almost everybody now lived in their own country, among their own people. . . . The stability of postwar Europe rested upon the accomplishments of Josef Stalin and Adolf Hitler. Between them, and assisted by wartime collaborators, the dictators blasted flat the demographic heath upon which the foundations of a new and less complicated continent were then laid.
Obviously that's not the only theory out there to explain Europe's postwar success, and since nationalism—what counts as one's "own people," as Judt puts it—tends to be as much an artificial construct as anything else, it's not as if "war, occupation, boundary adjustments, expulsions, and genocide" are necessary for peace and prosperity. Still, gruesome to think about, not least with an eye towards modern-day Iraq. On another note, this part—Menand's words, not Judt's—seems pretty unconvincing:
Western Europe became a place of social planning, nationalized economies, and strong states not because democratic socialism was in the Continental genes but because there were no reserves of private capital and few viable non-governmental institutions around to put the world back together again. The "European model," Judt says, was mostly an accident. There was no great political vision; necessity and pragmatism ruled the day. As Armstrong wrote, you cannot eat ideology.
That doesn't seem right. Even long prior to the war Europe had a much more expansive welfare state than the United States—see Table 2.2 of this paper; the difference in social spending across the two continents wasn't much greater in 1960 than it was in 1937.

There are all sorts of historical reasons other than "Continental genes" to explain why the United States has always had a smaller welfare state than the "European model": until the 1930s the U.S. was dominated by a Supreme Court that protected private property and business at all costs, and path dependency followed from that; Roosevelt and the New Dealers, along with the mainstream AFL, co-opted and effectively destroyed more radical and socialist movements in this country; and ethnic and racial divisions fractured the American labor movement and influenced public attitudes towards welfare, etc.

Even prior to World War II, European revolutions and incidents of social unrest were very effective at spurring new social spending—especially since, given that those countries were so much smaller, elites faced greater danger from rioting than they do in this country. There's a long tradition of it. I'm sure there were some accidental components to the creation of postwar Western European welfare states, but it wasn't all an accident.
-- Brad Plumer 2:03 PM || ||

November 22, 2005

You Are Getting Sleepy...

Evidently, the scientists who study human consciousness are in the habit of making advances and breakthroughs every now and again. This long quote on how cognition might work is from a very fascinating New York Times piece today on hypnosis research:
[N]ew research on hypnosis and suggestion is providing a new view into the cogs and wheels of normal brain function… One area that it may have illuminated is the processing of sensory data. Information from the eyes, ears and body is carried to primary sensory regions in the brain. From there, it is carried to so-called higher regions where interpretation occurs.

For example, photons bouncing off a flower first reach the eye, where they are turned into a pattern that is sent to the primary visual cortex. There, the rough shape of the flower is recognized. The pattern is next sent to a higher - in terms of function - region, where color is recognized, and then to a higher region, where the flower's identity is encoded along with other knowledge about the particular bloom.

The same processing stream, from lower to higher regions, exists for sounds, touch and other sensory information. Researchers call this direction of flow feedforward. As raw sensory data is carried to a part of the brain that creates a comprehensible, conscious impression, the data is moving from bottom to top.

Bundles of nerve cells dedicated to each sense carry sensory information. The surprise is the amount of traffic the other way, from top to bottom, called feedback. There are 10 times as many nerve fibers carrying information down as there are carrying it up.

These extensive feedback circuits mean that consciousness, what people see, hear, feel and believe, is based on what neuroscientists call "top down processing." What you see is not always what you get, because what you see depends on a framework built by experience that stands ready to interpret the raw information - as a flower or a hammer or a face.

The top-down structure explains a lot. If the construction of reality has so much top-down processing, that would make sense of the powers of placebos (a sugar pill will make you feel better), nocebos (a witch doctor will make you ill), talk therapy and meditation. If the top is convinced, the bottom level of data will be overruled.

This brain structure would also explain hypnosis, which is all about creating such formidable top-down processing that suggestions overcome reality.
Very cool!

Continue reading "You Are Getting Sleepy..."
-- Brad Plumer 9:39 PM || ||
"Rehabilitation"

On a random suggestion from a blog comment somewhere on the internet, I started reading Jennifer Mittelstadt's From Welfare to Workfare, which, among other things, revisits the debates over welfare back in the 1950s and 1960s, when liberal reformers conceived of welfare as a form of income support for needy mothers, recognized the value of work and looked for ways to support it among low earners, and tried to avoid moral hazard. All sorts of smart progressive ideas were making the rounds back then, many reaching across what's seen as the liberal-conservative divide on the subject today.

But what started as an extremely thoughtful and nuanced debate eventually went down in flames when neoconservatives hijacked it and seized on reform as a means of "rehabilitating" women who were too "dependent" on welfare—by shunting them into jobs as soon as possible. On this view, Bill Clinton's welfare reform bill in 1996 was a rousing success, since it did after all reduce welfare rolls, which, for many, seems to be a good in itself. Mittelstadt's book traces a lot of the gendered and racialized assumptions that went into this view, which deserves a post or two in itself. (Here's one short article on the topic.)

At any rate, this isn't really the place to revisit the debate over welfare reform. No doubt there will be time for that if and when Congress ever decides to reauthorize the 1996 bill (they've put it off several times already). But it was a way of introducing an interesting new paper by Susan Houseman and David Autor, which examines the temp agencies often used by states to get welfare recipients into work as soon as possible:
We find that moving welfare participants into temporary help jobs boosts their short-term earnings. But these gains are offset by lower earnings, less frequent employment, and potentially higher welfare recidivism over the next one to two years. In contrast, placements in direct-hire jobs raise participants' earnings substantially and reduce recidivism both one and two years following placement. We conclude that encouraging low-skilled workers to take temporary help agency jobs is no more effective - and possibly less effective - than providing no job placements at all.
But hey, at least welfare rolls are reduced! Sort of. No, not good. On the other hand, before getting too excited about those "direct-hire jobs" mentioned above, Harry Holzer of Brookings has argued that even in those, many lucky welfare-to-work "recipients" get stuck on the lowest rung of the job ladder, with permanently low earnings, in part because they generally get "placed" into the first job that comes along—jobs which often offer little prospect of advancement. Whether these stagnant jobs are "better" for the person in question than remaining on welfare depends on whether, for instance, the pay and "sense of achievement," etc., outweigh the increased child care and transportation costs, increased income volatility, etc. (among a thousand other things). For some they obviously are; for many they're not.

Ideally, low-earners would find jobs with decent pay and actual upward mobility, but those jobs aren't exactly growing on trees, so it takes time, continuous training and support, a tight employment market, and perhaps even financial incentives for decently-paying employers to hire former welfare recipients. (The 2004 Senate authorization bill actually had a provision of this sort.) But judging from the most recent welfare bill that came out of the House, the debate is still in the clutches of 1960s neoconservatives and their offspring, so these aren't problems we're likely to worry about anytime soon.

Continue reading "Rehabilitation"
-- Brad Plumer 9:03 PM || ||
Does GM Need Socialism?

Conventional wisdom in this country has it that American businesses are uncompetitive partly because they have to spend so much on health insurance for their workers. Here's a common variation, from Dean Bakopoulos:
[W]e must implement a system that guarantees universal healthcare. American industry — from National Steel to Starbucks — would benefit from having the burden of health insurance lifted off its back. Why else would GM be aggressively investing in nationalized-healthcare Canada while U.S. plants shut down?
Why indeed? I certainly don't know. But I'm not convinced that the conventional wisdom is entirely right. At least let's hash it out. There's reason to think that national healthcare wouldn't, in itself, make American businesses more competitive. Or rather, it can only do so by radically restraining costs.

Let's say that each year GM paid each worker $40,000 and spent $5,000 per worker on health insurance. That's a major drag, right? Well, look. Say national health insurance is then created, some system that doesn't rely on employers. Depending on how it's financed, GM could still be on the hook for that $5,000, so long as total worker compensation doesn't change—which it shouldn't, so long as it's set by the market. Maybe companies will now pay that $5,000 in wage form, to attract the same caliber workers as before (or because unions demand it). Or maybe the new insurance system will be financed by payroll taxes or individual mandates, in which case the company might have to pay each worker $45,000 so that the workers can cover the cost. But total compensation shouldn't change, in the abstract world.
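Just to make that arithmetic concrete, here's a toy sketch in Python of the point above. The figures are the made-up ones from this post, not real GM numbers, and the abstract "perfect market" assumption is doing all the work:

    # Toy arithmetic for the total-compensation point above.
    # All figures are the hypothetical ones from the post, not real GM data.
    WAGE = 40_000              # cash wage per worker, pre-reform
    EMPLOYER_PREMIUM = 5_000   # employer-paid health premium per worker

    total_comp_before = WAGE + EMPLOYER_PREMIUM   # 45,000

    # Scenario A: reform financed by an employer-side tax that happens to
    # equal the old premium. GM's labor bill is unchanged.
    employer_tax = 5_000
    total_comp_a = WAGE + employer_tax

    # Scenario B: reform financed on the worker side (payroll tax or
    # individual mandate). To attract the same workers, GM raises cash
    # wages by roughly the old premium. Labor bill again unchanged.
    total_comp_b = WAGE + 5_000

    assert total_comp_before == total_comp_a == total_comp_b == 45_000

In other words, in the abstract case reform just relabels the same $45,000 as wages, premiums, or taxes; whether GM actually comes out ahead depends on the real-world frictions discussed below.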

Alternatively, those companies that currently don't provide any insurance can, after reform, share the burden with companies like GM, via taxes or whatever. But then you're just taking from one company to help out another—American businesses overall don't necessarily become more competitive. There are probably ways to redistribute the load that make sense, and that's why we have policy wonks, but the point is there's nothing prima facie business-friendly about this.

In reality, of course, things would look far more complicated. The current tax system muddies the picture, and some health insurance systems are more efficient than others. National health insurance is probably cheaper, in the aggregate, than our current system, in which case everyone would be paying less, and businesses obviously would become more competitive. But what if the new system were more expensive—given that 45 million new people would need to be covered? GM's fortunes would depend largely on how the system was financed and how good it was at controlling costs. European companies are more competitive on this front presumably because Europe rations its health care and so spends less (with similar, if not better, health outcomes). If we could do that, it wouldn't matter quite as much how health care was delivered—cutting costs is where the benefits to business would primarily lie.

There's another aspect here. Right now, when insurance premiums go up each year (and they go up faster than wages do), GM usually has to cover the increase, unless it wants to shift some of the cost onto workers, a move that usually causes a big stir and is somewhat hard to pull off. But if GM were paying its workers entirely in wages, with the government handling health insurance, then GM might be able to get away with skipping the "necessary" wage increases whenever there was a premium hike. In that case, GM would save money and become more profitable by giving its employees an effective pay cut: workers would get, say, a payroll tax increase or premium hike from the government, but not enough of a corresponding wage increase from GM to cover it. But who knows.
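To put rough numbers on that premium-growth dynamic, here's another toy sketch; the growth rates are assumptions for the sake of the example, not actual figures:

    # Made-up illustration of the premium-growth point above.
    # Growth rates below are assumptions, not real data.
    WAGE = 40_000
    PREMIUM = 5_000
    PREMIUM_GROWTH = 0.10   # premiums rising ~10% a year (assumed)
    WAGE_GROWTH = 0.03      # wages rising ~3% a year (assumed)

    # Today: GM absorbs the premium hike on top of the ordinary raise.
    labor_cost_now = WAGE * (1 + WAGE_GROWTH) + PREMIUM * (1 + PREMIUM_GROWTH)

    # After reform: workers cover the equivalent hike through taxes or
    # premiums; if GM grants only the ordinary raise on the full $45,000,
    # the difference comes out of workers' pockets and shows up as savings.
    labor_cost_after = (WAGE + PREMIUM) * (1 + WAGE_GROWTH)
    print(round(labor_cost_now - labor_cost_after))   # ~350 per worker, per year

Small per worker, but multiplied across a big workforce, and with the gap compounding as premiums keep outpacing wages, that's the kind of quiet savings GM could pocket.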

I certainly think a national health insurance system is necessary in this country, one not tied to employment. It would help workers move from job to job more easily while remaining insured, and would guarantee that everyone had insurance. It's fair, moral, decent, etc. And it would likely be progressive, which the current tax deductions for employer-backed insurance certainly aren't. And so on. But would it be a boon for American businesses? It primarily depends on whether the new system could control costs and prevent premiums from rising so quickly.

UPDATE: There are a number of good (and, I think, persuasive) comments to the contrary that are worth reading...

Continue reading "Does GM Need Socialism?"
-- Brad Plumer 2:01 PM || ||

November 21, 2005

i-Pod Pollution

Here's something you might not often consider: "[T]he flipside of Moore's law is that consumer appetite means we're junking technology like there's no tomorrow… Ninety per cent of our electronic waste is thrown into landfill, particularly scary when you think that each computer contains several hundred toxic chemicals. According to EU figures, consumer electronics are responsible for 40 per cent of the lead found in landfills." The upside is that, thanks to all this new technology, electronic equipment is usually smaller, meaning less waste, and gizmos like the i-Pod mean that there are fewer CDs we need to throw out. But which effect prevails?
-- Brad Plumer 7:01 PM || ||
Robot Factory

Brent Staples has an interesting New York Times piece today that looks at one of the hidden strengths of the Japanese educational system—the fact that it actually takes the time to train and develop teachers:
[There is] growing interest in the Japanese teacher-development strategy in which teachers work cooperatively and intensively to improve their methods. This process, known as "lesson study," allows teachers to revise and refine lessons that are then shared with others, sometimes through video and sometimes at conventions. In addition to helping novices, this system builds a publicly accessible body of knowledge about what works in the classroom.

The lesson-study groups focus on refining methods that improve student understanding. In doing so, the groups go step by step, laying out successful strategies for teaching specific lessons. This reflects the Japanese view that successful teaching is the product of intensive teacher development and self-scrutiny. In America, by contrast, novice teachers are often presumed competent on Day One. They have few opportunities in their careers to watch successful colleagues in action. We also tend to believe that educational change would happen overnight - if only we could find the right formula. This often leaves us prey to fads that put schools on the wrong track.
This seems so commonsensical that one wonders whether American schools really are so deficient in this regard. A Google search brings up an old Joanne Jacobs post with an excerpt from a subscriber-only Education Week piece that suggests, if I'm reading it right, that American teacher development too often focuses on "generic teaching techniques" rather than more valuable specifics—"what teachers must cover and [how] students think about that content." It goes on: "[R]esearchers also have a hunch that it’s important for teachers to engage in learning sessions collectively—maybe with other teachers from the same department or grade—so that they can meet later to reflect on what they learned." That doesn't really answer the question, but it seems to suggest they don't already do this.

Meanwhile, this very short policy brief points out some of the problems with the Japanese school system, including the oft-heard critique that Japanese schools turn their students into robots and stifle their creativity and individuality. Interestingly, in the 1990s, Japan's Ministry of Education adopted a "loose education" system, which trimmed textbooks, reduced workloads, and gave kids Saturdays off from school. But the country's test scores started slipping, and pretty soon parents and teachers were rebelling, putting stricter standards back in place. Rote-intensive learning is making a comeback. So there seem to be serious trade-offs here.
-- Brad Plumer 5:40 PM || ||
Leviathan

In Foreign Affairs this month, Alexander Cooley has a good piece on the politics of American basing agreements that's worth a read. He does make the good point that the U.S. often seeks out basing agreements with authoritarian regimes, apparently on the theory that these countries will be more reliable allies from a military standpoint. But in fact, dictatorships can be unreliable, as Uzbekistan showed a few months ago, and Cooley argues that the U.S. is less likely to criticize a non-democratic regime for bad behavior if it has bases there, making reform less likely. (On the other hand, one might argue that, unless the United States has some sort of working relationship with an authoritarian country, whether it be military or economic ties, there's no hope of encouraging any sort of reform in those countries.)

Cooley then argues that democracies are in fact much more reliable footholds for our vast basing empire: since agreements are negotiated openly, those bases are less likely to aggravate extremists or provoke a backlash. And they're actually more stable, since opposition leaders are less likely to campaign against a democratically negotiated American presence than against one negotiated by an authoritarian regime, as is now happening in South Korea. From a strategic standpoint, that's valuable. It's true, bases in democracies get a bad name because Turkey wouldn't let itself be used as an invasion platform in 2003, but that was something of an exception.

So it's an interesting piece, but it's not clear how much this advice applies to the current American basing empire. The Pentagon controls at least 725 military bases in about 130 countries around the world, valued at some $118 billion and employing half a million people. From a foreign policy standpoint, some of the bases seem to serve good purposes, some of them serve dark purposes—the ring of bases in Central Asia certainly have an "it's all about oil (and gas)" feel to them—but most of them seem to exist just to exist, and grow, and expand, as all bureaucracies tend to do. In time they create their own rationale for being there.

And few people have really taken the time to figure out whether this basing madness is all necessary for foreign policy—whether we actually need listening stations and covert operations in every corner of the earth, or whether they just exist for their own sake, because the military and its intelligence agencies are, for lack of a better phrase, addicted to control, addicted to seeking military and intelligence solutions to every problem, addicted to expanding their budgets every year. I certainly don't know. (Donald Rumsfeld seems to believe that the footprint of the empire needs to be reduced, but not its omnipotence.) If the latter is the case, then democracy, non-democracy, whatever; policymakers won't much care where the bases go, just so long as they're pervasive. Indeed, Cooley's piece seems to be searching for the right basing arrangement to carry out a preferred foreign policy, but it seems just as plausible that things tend to work the other way around, with the bases ending up driving foreign policy.
-- Brad Plumer 3:35 PM || ||
Is Withdrawal On Its Way?

Christopher Dickey, who seems to have a good ear for goings-on in the Middle East, has this twist on the debate over when and how the U.S. should withdraw from Iraq. Basically, he says, the United States is no longer in the driver's seat:
So topsy-turvy is the policy at this point that we’re not going to imagine leaving until the Iraqi government demands that we go—and you can be sure the Iraqis who are now taking power will do just that. When? As soon as they and their Iranian allies have consolidated their hold on the southern three fourths of the country and its oil. [n.b. And not, as the Pentagon prefers, when a national Iraqi army can take over security.] ...

The Bush administration no longer sets the agenda in Iraq, in fact, and hasn’t for at least two years. The watershed came in November 2003 when there was a dramatic spike in U.S. casualties and Washington suddenly scrambled together a policy for transferring sovereignty back to Iraqis instead of pocketing it indefinitely for the Pentagon and the oil companies, as originally intended. The American invasion, which was supposed to be proactive, has led to an occupation that is entirely reactive, and it’s clear—or ought to be—that the castles in the air constructed by Wolfowitz and his friends have been blown away by facts on the ground.
There's certainly evidence that that's the case. Ahmad Chalabi, who is almost certainly an Iranian ally of some sort—if only a temporary ally—and may well become prime minister in December elections, has already suggested a tentative deadline for withdrawal, telling Congress that 2006 should be a "period of significant transition" for the United States, echoing language in a recent Senate defense bill. Moqtada al-Sadr is uniting Sunni and Shiite radicals in Iraq in support of U.S. withdrawal. The Pentagon even has a plan to do so, if necessary, drawing down to about 80,000 troops by 2006. (On the other hand, maybe Chalabi won't be prime minister after all: polls show that Ibrahim Jaaferi, who seems to want the U.S. to stay, is still pretty popular.)

It's hard to say what the end result would be of a forced drawdown plus the pro-Iranian Shiites "consolidat[ing] their hold on the southern three fourths of the country." Chaos, probably. War, maybe. Ezra Klein argues that if the United States got out in front on this and pre-empted the Shiites by withdrawing before said consolidation happened, it would force Chalabi and his belligerent Shiite allies to play nice with the Sunni insurgency. That's certainly possible. On the other hand, a troop drawdown could just as easily spur each and every Iraqi party to panic, grab whatever gun or armed ally it can find, and make war more, rather than less, likely. Sabrina Tavernise reports today that already "20 cities and towns around Baghdad are segregating" by sect, an ominous sign. Trying to predict how Iraqis will react to future American actions seems pretty dubious. It's much safer to predict that whenever the troop drawdown comes, it will be conducted no less incompetently than every other aspect of the war so far.
-- Brad Plumer 1:59 PM || ||
Tom Sawyer, Sadist

The American Scholar magazine doesn't seem to have a website, or web presence of any sort, but in the print edition there was an amusing article about how Mark Twain's The Adventures of Tom Sawyer wasn't very funny, in a humorless liberal sort of way. Basically you have this kid who's being skinned, lashed, hit, tanned, and cracked upside the head by his sadistic Aunt Polly until he becomes a docile instrument of production. As the lady says, "It's mighty hard to make him work Saturdays, when all the boys is having a holiday, but he hates work more than he hates anything else, and I've got to do some of my duty by him."

Now that's all very obvious to anyone who reads the novel, and the interesting part is how Tom the Marxist subverts all that, but the book becomes cruel when, in turn, Tom starts brutalizing animals. He torments a little pinch-bug and then claps his hands and giggles sadistically when the pinch-bug starts harming a dog. Then he force feeds a cat some nasty medicine or other that obviously causes the beast no small discomfort. Now I don't have a serious position on animal rights, but I can see how this might all be appalling from a certain standpoint. On the other hand, Nikolai Gogol made a career out of showing how bureaucrats who were tormented by their bosses in turn torment their inferiors, and that was usually funny, though he at least made you feel bad about it. Oh well. Interesting little magazine.
-- Brad Plumer 1:14 PM || ||

November 18, 2005

Torture is Awesome, Clearly

Via Hilzoy, ABC News is running a partial list of the "harsh interrogation techniques" that were authorized by the CIA in 2002 for use on detainees. Here's one:
Long Time Standing: This technique is described as among the most effective. Prisoners are forced to stand, handcuffed and with their feet shackled to an eye bolt in the floor for more than 40 hours. Exhaustion and sleep deprivation are effective in yielding confessions.
And then there's this old reliable:
Water Boarding: The prisoner is bound to an inclined board, feet raised and head slightly below the feet. Cellophane is wrapped over the prisoner's face and water is poured over him. Unavoidably, the gag reflex kicks in and a terrifying fear of drowning leads to almost instant pleas to bring the treatment to a halt.

According to the sources, CIA officers who subjected themselves to the water boarding technique lasted an average of 14 seconds before caving in. They said al Qaeda's toughest prisoner, Khalid Sheik Mohammed, won the admiration of interrogators when he was able to last between two and two-and-a-half minutes before begging to confess.
Well, bully for him. But I don't get it. "Confess"? Why would the CIA want Khalid Sheikh Mohammed to confess under torture? Why would that even help them? They already assume he's guilty. They can't use confessions obtained this way in court, ever. A confession doesn't seem useful at all. The CIA wants information, not confessions, and as a never-ending parade of officials has acknowledged, torture is terrible at dredging up information.
According to CIA sources, Ibn al Shaykh al Libbi, after two weeks of enhanced interrogation, made statements that were designed to tell the interrogators what they wanted to hear. Sources say Al Libbi had been subjected to each of the progressively harsher techniques in turn and finally broke after being water boarded and then left to stand naked in his cold cell overnight where he was doused with cold water at regular intervals.

His statements became part of the basis for the Bush administration claims that Iraq trained al Qaeda members to use biochemical weapons. Sources tell ABC that it was later established that al Libbi had no knowledge of such training or weapons and fabricated the statements because he was terrified of further harsh treatment.

ABC News was told that at least three CIA officers declined to be trained in the techniques before a cadre of 14 were selected to use them on a dozen top al Qaeda suspects in order to obtain critical information. In at least one instance, ABC News was told that the techniques led to questionable information aimed at pleasing the interrogators and that this information had a significant impact on U.S. actions in Iraq.
There you go. But at least that guy confessed. Would an innocent detainee caught in the system "confess" as well? Sure, after a bit of waterboarding he'll confess to anything. That way, the CIA can justify holding him indefinitely, put him through a few more mock executions, and pump him for bad information that risks getting soldiers in Iraq killed. This is depraved. The ABC report also notes that at least one detainee died of hypothermia after a junior military officer in Afghanistan "learned" and then "misused" these techniques. Oops. Well, no doubt that detainee "confessed" as well, so no worries, eh?

No, really, why is this still being debated? How many more news reports like this do we have to read? Torture isn't effective, the CIA certainly can't use it effectively, not now, not ever, it's misleading our soldiers, it's depraved, and it's turning American officers into sadists. Little wonder "at least three CIA officers declined to be trained in the techniques"—three officers, by the way, who seem to have a lot more guts than anyone else in this whole sordid affair. At least they can make clever t-shirts: "I tortured al-Libbi and all we got was this lousy war in Iraq." Bloody hell.

Continue reading "Torture is Awesome, Clearly"
-- Brad Plumer 10:29 PM || ||