This is probably obvious by now, but I’m pretty much without a computer or internet connection here in Tokyo, so I probably won’t be posting much, if anything, until after the new year. Hopefully the United States doesn’t go totalitarian before I get back.
By the way, Japan is just as absurd as I remembered it growing up. The other day my brother and I saw a homeless guy in a bright pink hooded sweatshirt with the word “HOBO” emblazoned on the back, hurling a handbag up and down the sidewalk. Akihabara, meanwhile, seems to be run entirely by robots. And the girls dressed up as Little Bo Peep in Harajuku are a bit freaky. On the other hand, you really can’t beat being able to get canned hot coffee from vending machines in the subway stations. (Speaking of which, if anyone knows anything about Japan, I ask: who is the “boss” on all the Boss Coffee machines all over the city? Lenin? I was thinking some Cold War-era coffee plantation owner-turned-Latin-American dictator. He looks a bit like a younger Efrain Rios Montt, perhaps.)
Blecch, sorry for the lack of posts lately. I've been trying to catch up on work all day and it didn't help that I got back from the San Quentin death-penalty rally at about 4:30 last night. (Yes, yes, I hear that debate's alive and kicking.) But as a random link, Carrie McLaren's piece in Stay Free! magazine this month, on the history of the uproar over subliminal advertising, is very interesting. In the end, she argues, all the popular outcry in the '70s and '80s over the possibility that advertisers were sneaking subliminal "Eat Popcorn!" messages into every fifth frame of a movie ended up blinding people to the more sinister ways in which advertising really does manipulate consumer desires and the like.
By the way, I certainly didn't know that U.S. companies spent $1.074 trillion on marketing in 2005—about 9 percent of GDP. That's a damn lot of marketing. But hey, so long as wealth in this country keeps getting redistributed upwards, wages keep stagnating, and credit cards keep getting maxed out, it will get harder and harder to get the masses to keep buying stuff, so this is a pretty good investment, no?
While we're on the topic, though, Juliet Schor's paper on the "commercialization of children" is pretty interesting. I've never really been sold on the anti-consumerism movement—after all, surely the problem with, say, Nike is on the production end, where overpriced sneakers are made by 10-year-old girls with bleeding fingers in draconian sweatshops, and not the fact that people like owning "stylish" clothes—but Schor has been making some decent counterarguments over the years (see here as well).
Of course, it's also much easier for a college professor to make anti-consumerism arguments—their status doesn't depend quite so much on material possessions after all; in an anti-consumerist culture, they'll be at the top of the pecking order. Well, that and the fact that any anti-consumerist movement always becomes commercialized anyway, as David Brooks enjoys pointing out. Myself, I enjoy the Citigroup billboards all around this city saying things like, "Remember when you were young and carefree and money didn't matter [!!] ?" This from the company that covered for Enron and defrauded consumers by the fistful. Where I was going with all this, I've forgotten.
Oh good, the eternal question: What to do about Iran's nuclear program? Kevin Drum picks up reports that Israel is considering air strikes against Iran's facilities to stop the program, and thinks it's unlikely. That's probably true: A while back Kenneth Pollack pointed out to me that it's not even clear that Israel can conduct days' or weeks' worth of bombing raids from 1,000 miles away, unless its pilots are allowed to refuel their jets over Iraq with our permission—in which case the U.S. might as well bomb Iran itself. My guess is that the Israeli government keeps leaking these stories to spur the EU and United States into keeping pressure on Tehran, and isn't actually serious, but who knows?
At any rate, going to war with Tehran sounds like a terrible idea, but so long as analysts are convinced that a nuclear-armed Iran would start setting off bombs all over the region and annihilate Israel at the first opportunity, then of course people will think that the benefits outweigh the costs here. But honestly, why would Iran do that? Are Khamene'i, Rafsanjani, and the rest of the leadership in Tehran really willing to see Iran incinerated just to stick it to Israel? Why?
For the record, I think it would be a bad thing if Iran got nuclear weapons. It's a bad thing when anyone gets nuclear weapons. It's a bad thing that the United States has nuclear weapons—or France, or Russia, or Israel. They're horrific weapons, and even the slightest risk that someone crazy might detonate one is unacceptable. But that goes for every country. Ideally the U.S. would recognize this and get behind global arms control. Bennett Ramberg, a State Department official in Bush's first term, has laid out a proposal for a nuclear-free zone in the Middle East, which would include disarming both Israel and Iran. There might be practical problems with Ramberg's plan, but for once it would be nice to see world leaders take this sort of approach to disarmament.
The Bush administration sees things differently. As one defense expert close to the administration told William Potter, the current White House has replaced the Clinton administration's "undifferentiated concern about proliferation" with one that "is not afraid to distinguish between friends and foes." In other words, there are "good" proliferators, like India and Israel, and "bad" ones, like Iran. The former can build away, but the latter need to be stopped at all costs. To put it lightly, that's insane. Among other things, it ignores the fact that today's friends are often tomorrow's foes. The goal has to be, as much as possible, a universal reduction in nuclear weapons, by multilateral means—as has been done with the NPT for the past 30 years.
Arms-control treaties may have their flaws—although the NPT has worked much better than people think—but in the end, something like it is the only practical "solution" for Iran in the long run. If the U.S. fears the possibility that Iran might develop nuclear weapons, then eventually it will need IAEA inspectors in Iran, under an NPT regime, to make sure that doesn't happen. There's no getting around this fact. Even after an air strike against Iran, inspectors would still be needed to make sure the program doesn't reappear (after all, Saddam Hussein quickly ramped up his nuclear program after the Osirak raid). Unless the U.S. plans on bombing Iran every few years, the arms-control treaty route is the only way to go. But getting there requires so many steps—a détente between Washington and Tehran, and probably the U.S. releasing its grip on the Middle East—that it seems unlikely at this point.
MORE: Here's yet another "pacifist" proposal for dealing with Iran: Jack Boureston and Charles Ferguson argue that the United States should offer to help Tehran with its current, perfectly legal, nuclear program, in order to improve the safeguards and monitoring. Seems worth a look.
Shakespeare's Sister points to a vaguely humorous story about a police officer who Tasered his partner in the leg during a dispute over whether or not to stop for soft drinks. This isn't exactly Aesop's fables, but there is a moral here regarding supposedly "non-lethal" forms of weaponry. In theory, they're supposed to prevent situations from escalating to the point where deadly force is required—giving law enforcement intermediate options. But in practice "non-lethal" weapons often end up substituting for talking or other non-coercive methods of interacting with people.
The numbers tell the story: In Phoenix in 2003, the introduction of Tasers reduced the number of people killed by police from 13 to 9. But there was a 22 percent increase in the number of incidents where police used force (160 more incidents). A similar thing happened in Cincinnati—more force, but fewer deaths—and a monitor found that police often used Tasers as an option of "first resort." In 2004, Amnesty International reported that Tasers are often used against people who pose no serious threat (unruly schoolchildren, for instance).
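(To put rough numbers on that tradeoff: the baseline incident count isn't reported directly above, but it's implied by the 22 percent figure. A quick back-of-the-envelope sketch, with all figures as cited:)

    # Phoenix, 2003: what a 22% increase of 160 incidents implies about
    # the baseline, and the size of the lethal/non-lethal tradeoff.
    # All figures are the ones cited above.
    extra_incidents = 160
    pct_increase = 0.22
    baseline = extra_incidents / pct_increase
    print(f"Implied baseline: {baseline:.0f} use-of-force incidents")  # ~727

    deaths_before, deaths_after = 13, 9
    print(f"Deaths avoided: {deaths_before - deaths_after}; "
          f"extra force incidents: {extra_incidents}")

In other words, roughly four lives saved against 160 additional uses of force: the substitution effect in miniature.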
That's all fairly well known. But it's even more important to think about this in a military and foreign-policy context. The Pentagon is in the midst of developing a whole array of new "non-lethal" weapons to control crowds: Tasers, stun devices, heat rays, decibel blasters. On one level, this all seems reasonable: so long as the U.S. is going to continue its vast empire around the world, as seems inevitable, and do a lot of peacekeeping and nation-building, then it's better for soldiers to manage crowds with "non-lethal" weapons than to fire assault rifles every which way. But that's only one consideration. The other is that "non-lethal" weapons—stun guns and the like—could start being substituted for old-fashioned diplomacy, and become options of "first resort." Here's an illustrative anecdote from Carl Conetta:
In April 2003, a company of the 101st Airborne Division used innovative tactics to defuse a potential confrontation with a hostile crowd outside a Shiite holy site in Najaf. The crowd mistakenly thought the soldiers were about to storm the Tomb of Ali. As tensions escalated, the unit commander, Colonel Chris Hughes, had his troops go down on one knee before the crowd, point their weapons at the ground, and smile. After five minutes, he had the troops walk slowly back to their base, instructing them to point their guns at no one, but to smile at everyone. Reportedly, the Iraqis then intermingled with the troops, patting them on the back and giving them thumbs-up signs. In the early days of the war, British troops used similar tactics to defuse potential confrontations in Basra.
More than tactically clever, this approach is consistent with a campaign strategy that puts a premium on building popular consensus and cooperation, which is pivotal to the success of peace, stability, and humanitarian operations. Of course, use of a "pain ray" to disperse the crowd would have been quicker and less risky for the troops. But it probably would have had a distinctly detrimental effect at the strategic level, undermining relations between the US mission and the Shia majority in Iraq.
Right, exactly. The same scrutiny should be applied to any military technology that promises "non-lethal"—or, even more dubiously, "slightly less lethal"—techniques. Will future weapons of war be able to reduce casualties as promised? Hopefully. But perhaps not always. The Pentagon has been developing "precision targeting" for years, in order to minimize civilian casualties when it starts bombing other countries. Fair enough. But the catch is that while each individual bomb may produce less collateral damage nowadays, the military has also begun targeting a greater number of sites, including targets it might have considered out of reach before, like "dual-use" sites. These two trends can easily cancel each other out.
I forget the citation, but an arms control expert once pointed out that the number of casualties, including civilians, in the initial phase of Operation Iraqi Freedom in 2003—some 11,000 to 15,000 dead—was very comparable to that from other modern wars with similar numbers of combatants and timeframes. That includes the 1967 and 1973 Arab-Israeli wars, the 1965 and 1971 India-Pakistan wars, and the conventional phase of the 1978 Vietnamese-Cambodian war.
That puts things into stark relief, I'd say. War isn't more "humane" today than it was 40 years ago, and supposed improvements like "precision targeting" won't necessarily improve things from a humanitarian point of view. What advances in weaponry will do is help to reduce American casualties, which is good, but depending on how you think about it, could have the side-effect of making the American public more likely to support war. So there's every reason to be skeptical of new military technology that promises "non-lethal" or "less lethal" forms of war.
The New Republic makes the case for somewhat-open borders (they don't have the stomach to fling the doors open all the way, but okay), in part by noting that the demand for unskilled jobs will outpace the rate of immigration over the coming decade, so we need more workers. Matthew Yglesias responds:
Supply and demand doesn't really dictate anything. If you had an airtight immigration control system, there wouldn't just be oodles of undone jobs. There'd be some combination of higher wages, higher labor costs, less overall GDP growth, etc.
He's right that there wouldn't ever be "oodles of undone jobs." As for the rest, though, it brings up the question: what happened during the 1920s, when the United States, in a bout of xenophobia, cracked down on immigration? Proponents of immigration controls often cite this period as proof that restrictions "work." After all: during the "roaring twenties" productivity was high, unemployment was low, nifty new consumer goods were practically spilling off the shelves, and, on the progressive side, labor unions were able to expand, supposedly unburdened by ethnic and racial rifts. (Michael Lind, among others, has made this argument.)

Well, let's take a look, then. Certainly one effect of immigration controls was that businesses in the 1920s, finding unskilled workers in short supply, did what you'd expect: they substituted capital for labor. As an example, Byron Lew and Bruce Cater have argued that this explains why American farmers adopted the tractor in the 1920s more quickly than their counterparts in pro-immigration Canada. This economic shift was certainly good for some people—relatively skilled white-collar and blue-collar workers captured the bulk of the income gains here. But it's less certain whether relatively poor and unskilled blue-collar workers saw any rise in real wages between 1920 and 1929. And overall, inequality increased dramatically.
It's difficult to say how much of all of this was due to immigration, and how much might have happened anyway, but at any rate it's worth noting that restrictions weren't quite as good for unskilled workers in the 1920s as one might expect. (Conversely, immigration during the 1990s wasn't nearly as bad for unskilled native workers as one would expect—perhaps in part because, as Ethan Lewis has argued, firms adopted technological innovations to take advantage of the increased supply of unskilled labor, though this may have slowed firms from adopting changes to boost productivity among skilled workers.)
Now obviously during the Depression, World War II, and afterwards, wage and income inequality decreased dramatically, and eventually in the 1940s the floor started lifting for unskilled workers. Brad DeLong takes a crack at an explanation:
[Jeffrey] Williamson and [Peter] Lindert guess that about half of the reduction in wage inequality was due to a shift in the character of technological progress after 1929. Before 1929, productivity growth had been concentrated in manufacturing industry and other sectors that required a relatively skilled workforce. As the economy's structure shifted toward these sectors where productivity was growing most rapidly, demand for and returns to skills and capital rose and demand for unskilled labor fell. After 1929, productivity growth was much more balanced: productivity in agriculture and in service sector jobs that relied on unskilled labor more than skilled labor also grew rapidly.
The other half of the levelling comes from demographic factors: fewer children and more education per child both shifts the distribution of skills within the population—making more skilled and fewer unskilled workers—and diminishes the supply of unskilled workers. The levelling of the wage distribution was also encouraged by the growth in the number of jobs that were relatively low-skilled yet also paid high wages.
Perhaps continued immigration restrictions helped here. Now much of this "levelling" started to deteriorate in the 1970s, not long after Lyndon Johnson loosened immigration controls in 1965 and as women continued to enter the labor market (although it's worth noting that "unskilled" and lower-income women have been working since time immemorial). How did this affect things? One could note that most of the influx of unskilled workers into the labor market in the 1970s went into the service industry, which, according to Robert Gordon, did experience a productivity slowdown during that time. On the other hand, William Nordhaus argues that the bulk of the productivity slowdown in the 1970s was concentrated in energy-intensive industries, so the oil shock, and not immigration, seems to be the greater factor here.
At any rate, this is a very crude sketch, and hardly the complete story. But one possible lesson, it seems, is that a restricted supply of unskilled labor may well lead to real productivity gains. (And a surplus of unskilled workers may hurt those gains.) One way to tighten the supply, then, is to restrict immigration, as was done from the 1920s to 1965. But another way is for the government to loosen immigration controls and simply pursue full employment—through monetary policy, public works programs (preferably not involving military expansion), and other government policies, along with better education to get whatever proportion of skilled to unskilled workers is needed. If that's possible to achieve, then the effect should be essentially the same. I haven't seen anyone make this argument, so very likely it's wrong, but it doesn't seem wrong.
Stacy Mitchell's piece on Wal-Mart makes plenty of decent points—certainly it would be nice if local zoning boards didn't kowtow to corporations at every turn, for instance—but her paean to small businesses in this paragraph looks like a sacred cow ripe for the gutting:
We've lost tens of thousands of independent businesses over the last decade and, with them, an important part of the fabric of American life. Small businesses contribute significantly to the vitality of local economies. They nurture social capital, disperse wealth and vest decision-making in local communities rather than corporate headquarters. They are the means by which generations of families have pulled themselves into the middle class.
It seems like the only thing you can ever get anti-globalization activists and the Chamber of Commerce to agree on is the unimpeachable virtue of small business. Now they may well "nurture social capital, disperse wealth, and vest decision-making in local communities." That's possible. But small businesses also tend to pay their workers less, offer fewer benefits, are much, much harder for unions to organize, and are often more dangerous places to work than large corporations. They're rarely more innovative, and they aren't really the "motor" behind job growth in America—at least in manufacturing, a Federal Reserve Board study done in 1997 found that "net job creation… displays no systematic relationship to employer size," and big firms tend to create more durable jobs, partly because they engage in more "planning," that old socialist bugbear.
The point isn't to pile on small businesses—they're great and many obviously have advantages over monstrous corporations, especially for their owners. Would that everyone could be his or her own boss. But some of the enthusiasm here ought to be tempered, I think. Especially since the pagan god of small business gets invoked every single time a progressive policy idea comes gurgling out of the faucet. "No, we can't raise taxes, it will hurt small businesses." "No, we can't have national health care, it will hurt small businesses." Eff that.
Apropos of Steven Spielberg's upcoming movie on the Munich massacre in 1972, a Time reporter apparently has a new book out arguing that the Israeli reprisals ordered by Golda Meir were completely ineffective (this is Wikipedia's summary):
Aaron J. Klein (who based his book in large part on rare interviews with key Mossad officers involved in the reprisal missions) contends that the Mossad got only one man directly connected to the massacre. The man, Atef Bseiso, was shot in Paris as late as 1992. Klein goes on to say that the intelligence on Zwaiter, the first Palestinian to die, was "uncorroborated and improperly cross-referenced. Looking back, his assassination was a mistake." He elaborates, stating that the real planners and executors of Munich had gone into hiding along with bodyguards in East-bloc and Arab countries, where the Israelis couldn't reach them.
Meanwhile, it was lesser Palestinian activists who happened to be wandering around Western Europe unprotected that were killed. "Israeli security officials claimed these dead men were responsible for Munich; P.L.O. pronouncements made them out to be important figures; and so the image of the Mossad as capable of delivering death at will grew and grew." The operation functioned not just to punish the perpetrators of Munich but also to disrupt and deter future terrorist acts, writes Klein. "For the second goal, one dead P.L.O. operative was as good as another." Klein quotes a senior intelligence source: "Our blood was boiling. When there was information implicating someone, we didn't inspect it with a magnifying glass."
Huh, the real perpetrators go untouched, lesser figures are rounded up and killed instead, and no one bothers inspecting key intelligence with "a magnifying glass." We could draw parallels but it seems much too tricky. It's not clear if Spielberg uses Klein's book for his movie—surely it changes the movie if we know that, in the end, the assassins are just knocking off a bunch of lesser activists, no?
MORE: Also, read this fantastic post at Arms and Influence.
With regard to the post below on "open-source unionism" and the catch-22 involved with labor law, a reader asks a good question: How did unions get so strong the first time around? After all, it's not like business is any more hostile to organized labor today than it was back when Carnegie's Pinkertons were machine-gunning steel workers. So why should it be any more daunting for unions today?
My understanding is that the growth of unions in the twentieth century mostly came in "spurts," and depended to a good degree on a lot of contingent factors. During World War I, for instance, the Allied countries needed to make nice with labor in order to fight their little war, and so things like the War Labor Board were created. The 1920s were a darker time for labor, but the Depression and World War II once again happily steered the government back to labor-friendly territory. Labor density also rose so dramatically in the 1930s and 1940s in part because a new form of union had suddenly appeared—the industrial union. The same thing had happened in the 1880s: a new form of organizing, the "local assembly," had given labor its big boost.
Looked at this way, the usual prescriptions for bolstering union strength—labor-friendly legislation and better organizing—sometimes seem to fall a bit short. After all, labor density still declined during the Clinton years, and while Clinton was no FDR, he was about as labor-friendly as any Democrat likely to gain the presidency in the near future. It's also worth looking soberly at the limits of regular old organizing. Richard Freeman has estimated that if it costs about $2,000 to organize a worker, then adding 1 million new members would require 40 percent of all union dues—about what the SEIU wanted to spend—but that would add only a point to labor density in the United States. A million new members would be a phenomenal gain—as would card checks and other labor-friendly measures—but not nearly enough on their own to reverse the decline.
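(A back-of-the-envelope version of Freeman's arithmetic, for the curious. The cost-per-worker and dues-share figures are his; the total-dues and workforce numbers are my own rough assumptions, just for illustration:)

    # Freeman's organizing arithmetic, roughly. Cost per worker and the
    # 40% dues share come from the estimate above; the workforce size is
    # an assumed mid-2000s ballpark, not Freeman's figure.
    cost_per_worker = 2_000            # dollars to organize one new member
    new_members = 1_000_000
    organizing_bill = cost_per_worker * new_members     # $2 billion

    dues_share = 0.40                  # ~40% of all union dues
    implied_total_dues = organizing_bill / dues_share   # ~$5 billion/year

    workforce = 130_000_000            # assumed U.S. workforce
    density_gain = new_members / workforce * 100
    print(f"Organizing bill: ${organizing_bill/1e9:.0f} billion")
    print(f"Implied total dues: ${implied_total_dues/1e9:.0f} billion")
    print(f"Density gain: {density_gain:.1f} percentage points")

Spend two-fifths of all dues, in other words, and you move the density needle by less than a point.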
Freeman and Rogers' proposal for "open-source unionism" points to at least one way forward—a "new form" akin to local assemblies and industrial unions—for an age where a shrinking manufacturing base is decimating union strength, and the government has been trampling on labor since the Reagan years. (Although it's worth pointing out that their proposal will only work if companies are dissuaded from cracking down on pro-union workers.) I don't know how one would put their open-source proposal into practice—I'm way too far down on the labor food chain for that—but maybe if everyone claps their hands and says they believe in fairies or something.
Via Nathan Newman, an organization called American Rights at Work has just released a new report showing how widespread union-busting is among American employers. Some of the findings:
30% of employers fire pro-union workers.
49% of employers threaten to close a worksite when workers try to form a union.
51% of employers coerce workers into opposing unions with bribery or favoritism.
82% of employers hire union-busting consultants to fight organizing drives.
91% of employers force employees to attend one-on-one anti-union meetings with supervisors.
As Nathan says, a majority of American workers would likely join a union if given the option. Most aren't given the option. ARW argues that major changes to labor law are needed to change this—including establishing "card checks" as a process for union organization, whereby a workplace would be unionized if a majority of workers simply signed a card, plus much tougher penalties for any employer that violated labor laws. But there's also something of a catch-22 here: Effective pro-labor legislation will be very difficult to pass in Congress without a strong labor movement agitating for it, but it's hard for the labor movement to become strong so long as the law is biased against unions.
So what to do, what to do? One of my favorite "out of the box" labor proposals comes from Joel Rogers and Richard Freeman, who have argued that "open-source unionism" is the way forward:
[Right now,] workers typically become union members only when unions gain majority support at a particular workplace. This makes the union the exclusive representative of those workers for purposes of collective bargaining. Getting to majority status… is a struggle. The law barely punishes employers who violate it, and the success of the union drive is typically determined by the level of employer resistance. Unions usually abandon workers who are unsuccessful in their fight to achieve majority status, and they are uninterested in workers who have no plausible near-term chance of such success.
Under open-source unionism, by contrast, unions would welcome members even before they achieved majority status, and stick with them as they fought for it--maybe for a very long time. These "pre-majority" workers would presumably pay reduced dues in the absence of the benefits of collective bargaining, but would otherwise be normal union members. They would gain some of the bread-and-butter benefits of traditional unionism--advice and support on their legal rights, bargaining over wages and working conditions if feasible, protection of pension holdings, political representation, career guidance, access to training and so on.
And even in minority positions, they might gain a collective contract for union members, or grow to the point of being able to force a wall-to-wall agreement for all workers in the unit. … Joining the labor movement would be something you did for a long time, not just an organizational relationship you entered into with a third party upon taking some particular job, to expire when that job expired or changed.
I don't really know what the upsides and downsides of this proposal are—it looks like all upside to me, but it's certainly worth debating, rather than waiting around hoping that pro-labor Democrats will ever regain power and fiddle with the law.
Yesterday, Kevin Drum took Wesley Clark to task for his New York Times op-ed laying out how to "change course" in Iraq. Indeed, read as an actual policy proposal, Clark's op-ed had nothing more than a few decent ideas punctuated by bouts of wishful thinking. It's nothing to get excited about. On the other hand, it's probably not an actual policy proposal. I'm no expert, but I doubt that politicians and potential presidential candidates ever write op-eds in the New York Times because they feel they have some novel idea or strategy they think no one else has considered yet. This goes double for the current war in Iraq. The first thing to remember is that this president will never, ever listen to the Democrats on Iraq, no matter what they say, no matter how crafty their ideas, no matter how sparkling their proposals. On a basic level, it would be awfully naïve for someone like Clark to play dumb and pretend that the White House will heed his advice. Let's assume he's not that naïve.
In that case, the point of any op-ed of this sort is political positioning, pure and simple, as it should be. If you think the Republican Party has driven this country into a tree, if you think the Iraq war is a disaster that will take decades to recover from, and if you think the president won't listen to reason, then the only way forward is to lay the groundwork to win elections in 2006 and 2008. Call it "overly political," but it's also right and rational. And Clark might well be helping here. Back in August, Charlie Cook explained the basic poll dynamics at work over Iraq:
For a month or two, there has been a theory circulating among those that watch polls that the American public can be broken down into four distinct groups: those that have always been against the war; those who were for it but now believe we've blown it and should pull out; those who supported the war, believe the invasion was successful but think that the aftermath has been completely blown, yet would hate to see us withdraw immediately and lose all we've invested; and those that have always been for the war.
Pollsters say that the first group -- always against -- makes up about 30 percent of the electorate, while the second group -- those that started off in favor of the war but now see it as a lost cause -- includes about 20 percent. These two categories total half of all voters in opposition.
The third group -- those that are conflicted because they see the effort as doomed and casualties increasing, yet still hate to see us 'cut and run' -- makes up another 25 percent.
Assume Cook is right on this. Then here's what we have: those Democrats who are advocating withdrawal can, at best, capture support from the first two groups, and, at the very least, cause these groups to turn against the president and, presumably, the Republican Party. But that's still only 50 percent of the electorate on the Democratic side—hardly enough to recapture Congress.
So that leaves the third group of voters—"those that are conflicted because they see the effort as doomed… yet still hate to see us 'cut and run.'" This group seems to consist of what Walter Russell Mead once called "the Jacksonians": war-mongering Americans who would rather cut off their left arm than see America lose a war, but who are having real doubts about Iraq. Many in this group may never vote Democratic. But some of them might. At the very least, "hawkish" Democrats like Clark and Biden might be causing this group to doubt the president's resolve, especially if the Jacksonians can be convinced that the president isn't doing everything he needs to do to win.
If Mead is right, then nothing infuriates Jacksonians more than a president waging a half-hearted war. Over the past year, Bush has been facing this fury. My guess is that the president's recent uptick in the polls has come about because his Iraq speech shored up support among more than a few Jacksonians, who were persuaded that he would do what it takes to win. But that bounce may not last. What hawks like Biden and Clark seem to be doing—regardless of whether they truly believe we should stay or not—is chipping away at this certainty. (They also probably erode support for the war itself, which is good news for the withdrawal advocates.)
Is it a problem that Democrats aren't at all unified on Iraq, that you have Feingold saying one thing and Clark another? Robin Wright and Mike Crowley sure seem to think so. I don't see why. There isn't going to be a national election in 2006. Different House and Senate Democrats will inevitably end up holding various positions on the war to appeal to their particular districts and states. There might be a vague "official Democratic stance," but individual candidates will diverge from that. And that's as it should be—people in North Dakota may have different views about the war from people in Connecticut, and there's no reason why, in an electoral system like ours, both have to vote for some Democrat toeing a single party line.
In 2008, of course, Democrats will be nominating someone who will actually be in a position to do something about Iraq, if we're still bogged down there. In that case, let primary voters decide. American democracy is nothing to brag about, but it's still a decent way to choose among competing views.
At long last, we're going to have a real, honest-to-goodness war on Christmas. It's about time. I've always been a bit mystified that the biggest holiday in the universe was such a wilting violet as to constantly need saving—first the Grinch had to save Christmas, then Ernest, and I think at one point the job got pawned off on Elmo. Come on. But now that there's going to be an actual war, I'm curious to see what cartoon character they'll trot out to play defense. Perhaps the Smurfs, though we all know they're a bunch of anti-yuletide communists.
For my part, I'll be going back home to Japan for the holidays—for the first time in five years—waging war on Christmas overseas. Won't be easy, though; I see they've brought in the "Xmas Gojira" this year:
Try to declare war on that. While we're at it, Adam Cohen's piece on the "Christmas wars" is really quite good.
A new Health Affairs report out today. Good news and bad news. Here 'tis. Good: those Americans living in relatively well-off communities, along with retirees on Medicare, are enjoying better access to health care than ever before. Cardiac and orthopedic surgery, in particular, are making great strides. Smiles all around.
And... the bad: both the uninsured and Medicaid recipients are finding it increasingly hard to get basic care, especially since the last recession, as states face budget crunches. For instance:
Adhering to commitments to not give up hard-won gains in eligibility, most state Medicaid agencies have used other techniques, including reducing or freezing provider payments, eliminating certain benefits, instituting copayments, setting service limits such as total inpatient days or prescriptions covered, shrinking periods of guaranteed eligibility, and narrowing the time window for reapplying for coverage renewal.
Medicaid payment reductions and freezes have exacerbated problems with access to key services such as mental health and dental care, as well as many types of specialty care. Applying copayments, eliminating benefits, and setting arbitrary limits on services is seen by some observers as "cost shirking," which leaves providers caring for these patients in the position of either dropping them or absorbing the cost of their uncompensated care. More commonly, providers avoid undertaking care for these patients to evade such discomforting situations.
The reduced access to dental care is a critical problem. Malcolm Gladwell touched on this a while back in his New Yorker article on health insurance, but the bad effects of tooth decay, common among those who can't afford to see a dentist, start to multiply very quickly. First your teeth start turning brown and rotting, then you're pulling them out with pliers to stop the pain ("They'll break off after a while, and then you just grab a hold of them, and they work their way out"), then you can't eat fruits and vegetables, which invariably leads to further health problems, and then you can't ever land a job that requires you to be seen by other human beings—such as a bank teller, or a receptionist—since no employer will hire a receptionist with brown stumps in his or her mouth.
Dentures are sometimes an option, but many state Medicaid programs won't cover dentures unless all your teeth have been yanked out with pliers, and if the dentures are made incorrectly and don't fit quite right, an adjustment can cost hundreds of dollars—it's usually cheaper just to toss the ill-fitting dentures in a drawer than shell out $200. So "problems with access to… dental care" are a big deal.
The Health Affairs study also notes that public mental health services have been cut in recent years. In Orange County, pop. 3,000,000, for instance, the county mental health agency has a crisis inpatient unit of exactly 10 beds. I'm guessing it's obvious how and why these cuts can be devastating, but it's worth adding that in the absence of a decent public mental health system, throwing a person in jail often becomes the primary way to treat the mentally ill—after all, state health budgets may have an upper bound, but the sky's the limit for the correctional system, even during a downturn. Which is great, because prison mental health services are often only slightly less humane than kicking a homeless guy in the stomach.
A while back, Witold Rybczynski wrote an article in Slate about how urban sprawl was inevitable, had happened throughout history, and was impossible to stop. Naturally, it was pointed out that while that might be true, not all sprawl is created equal. Some forms are worse than others. A random bit of clicking around the World Bank's site dredged up this old study, which makes some of the differences here a bit more palpable.
To see what we're dealing with here, take Boston (for some reason there's incomplete data on San Francisco). Boston's already a fairly spread-out city, but if its "population centrality" were as spread out as Atlanta's, Bostonians would be driving about 9 percent more. Public transit also matters: If Boston had Atlanta's rail system, total driving would increase by about 5 percent. (If it had a transit system as shoddy as, say, Dallas', there would be even more driving.) The distribution of jobs to housing matters too. Boston has a very even mix in this regard, but if it were to become more unbalanced like, say, Washington D.C., driving would increase by roughly 9 percent. (I'm eyeballing the calculations here.)
That doesn't seem like such a big deal, but taken together, these changes start to have a real impact—the authors point out that if you could wave a magic wand and make Atlanta similar to Boston, then total driving per household would decrease by 25 percent. Obviously no one has a magic wand, and maybe people prefer living in Atlanta-type cities to Boston-type cities, although who can fathom why, but it's certainly something to consider.
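(For what it's worth, here's how I'm eyeballing the compounding. Treating the individual effects as multiplicative is my assumption, not necessarily the study's method:)

    # Stacking the individual urban-form effects multiplicatively.
    # The three percentages are the eyeballed figures from above; the
    # multiplicative combination is an illustrative assumption.
    centrality = 1.09   # Boston with Atlanta-style population centrality
    transit    = 1.05   # Boston with Atlanta's rail system
    jobs_mix   = 1.09   # Boston with a D.C.-style jobs-housing imbalance

    combined = centrality * transit * jobs_mix
    print(f"Combined increase in driving: {(combined - 1) * 100:.0f}%")  # ~25%

Which is roughly the same magnitude as the authors' 25 percent figure for the Atlanta-to-Boston direction.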
More fascinating news on the military budget front. The Wall Street Journal reports today (page A1) that the Air Force is going to cut 30,000 to 40,000 personnel over the next decade in order to save some of its big-ticket weapons programs:
To stay within its expected budget, the Air Force is planning to cut at least 30,000, and perhaps as many as 40,000, uniformed personnel, civilians and contractor-support staff through fiscal 2011, military officials said. The exact composition of the cuts isn't known, though their thrust is clear: "This is one way to pay the bills without messing around with our weapons programs," said one official involved in the Air Force budget.
It looks like the Air Force will use that money to keep both the Joint Strike Fighter program—a costly purchase that was once called "unexecutable" by the GAO—and a system of "missile-warning satellites" built by Lockheed that is years late and has cost "more than three times as much as its initial $3 billion budget." Both programs were under fire from the Pentagon, but it seems the Air Force managed to deflect the axe by going after personnel instead. And yes, in case anyone was wondering, "the shift is good news for the nation's major defense contractors." As I recall, about a month ago, the Army proposed something similar in response to the Pentagon's demands for $11.7 billion in cuts: the service offered to reduce its force structure, at a time when the Army is already stretched thin, rather than touch its own procurement budget.
Now perhaps someone knows better than I, and surely there are a lot of ins and outs involved here, but it sure looks like defense contracts are dictating the type of military we have, rather than the other way around...
David Cloud's report in the New York Times today about the Navy's plans for future shipbuilding is interesting for two reasons. First, there's the surface reason: the Navy appears to be scaling back its plans for a war with China, and adjusting its procurement budget so that the service can play "a greater role in counterterrorism and humanitarian operations." A while back, Defense Industry Daily did a profile of the Littoral Combat Ships that seem to be the centerpiece of this new look, so that gives some of the flavor here. But Cloud's report is also of interest because you can detect undertones of the Navy's growing budgetary conflict with Congress.
The Navy has been slowly losing the battle to preserve its funding over the past few years, partly because both the Bush administration and many members of Congress don't seem to believe that ships and submarines are wholly relevant to the so-called global war on terror, and partly because shipbuilding has become hideously expensive. The Navy's FY2006 budget requested only four new ships, scaled back from the six they were planning to request the year before, and well below the eight per year that naval officials would reportedly prefer. (Figure it this way: if ships have an average life span of 35 years, then the Navy needs to buy about eight new ships every year just to maintain the current fleet of around 281.)
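(The steady-state arithmetic behind that parenthetical, sketched out. The fleet size and life span are the figures above; the rest follows:)

    # Steady-state fleet replacement: with a 35-year average life span,
    # the Navy must buy fleet_size / life_span ships a year to stand still.
    fleet_size = 281
    life_span = 35      # years

    replacement_rate = fleet_size / life_span
    print(f"Ships needed per year: {replacement_rate:.1f}")   # ~8.0

    # And the long-run fleet implied by the FY2006 request of 4 ships/year:
    print(f"Implied fleet at 4 ships/year: {4 * life_span} ships")  # 140

At four ships a year, in other words, the fleet eventually shrinks toward 140, which is presumably why naval officials are unhappy.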
But Congress has had other ideas, and in March asked the Navy to provide two future shipbuilding plans: one, for 325 ships, seems to be the one the Navy is telling the Times about, for obvious reasons. The other was a much smaller plan to pare the current fleet down to 260 ships by 2035. The CBO compared the two plans here: one major difference is that the 325-ship plan has far more "surface combatants," which, for anyone less-than-convinced about the coming war with China or Russia, probably look a bit less than necessary. Judging from recent defense budget debates, Congress appears to be leaning closer to the 260-ship plan, with the Senate as always being more generous with spending than the House.
The main concern seems to be that shipbuilding costs are ballooning far beyond what anyone envisioned. Not surprisingly, the Soviet-style command economy hasn't worked very well for the defense industry. The Navy had earlier proposed, as a way of keeping its costs down, that two shipyards compete for the right to build all DD(X) destroyers, rather than let the two facilities "share" production, without any incentive to compete. Naturally, twenty members of Congress opposed the idea, since competition would defeat the "share the wealth" mentality that supports inefficient contracts. At least this year, the House tried to put a stop to some of these inefficiencies by imposing cost ceilings on future ships, but ultimately these debates often come down to which members of Congress need contracting jobs for their constituents, and how powerful those members are. (After all, the supposedly "budget-conscious" House also marked up an additional $2.5 billion for two Arleigh Burke class destroyers.)
Worth noting, though: it's always a leap of faith to try to guess whether Congress is really scaling back the procurement budget, as it appears to have done for FY2006. The Center for Strategic and Budgetary Assessments has pointed out that many procurement costs may have just been shoved into the future. A favorite trick is "incremental" funding, where Congress appropriates only part of the money needed to complete a ship, figuring that the rest will get added on in later years (after all, if the Navy only needs a couple million more to finish up a nuclear submarine, who's going to say no?).
Back to Cloud's original story, though. It's also not exactly clear whether the Navy really needs to build 32 new ships, at a cost of an extra $13 billion a year, by 2020. I certainly don't know, but it's a question worth asking. A June 2005 CRS report on naval transformation lays out some of the issues being debated here. On the one hand, the Navy has been trying to change in a lot of ways you'd expect, so that it can do things like operate in coastal areas and improve its response time. This certainly came in handy with the Navy's rapid response to the tsunami in Indonesia last year, a move that probably did as much for national security as anything else the military's done since 2002. On the other hand, the Navy seems to be placing a fair bit of emphasis on "strategic defense against ballistic missiles," which usually leads to some dubious spending decisions.
Eugene Robinson wrote a column a few days ago saying that he's against the death penalty, but doesn't see why Stanley "Tookie" Williams—the founder of the Crips who has since turned his life around on death row—should get special attention from the anti-death penalty crowd. "For me," he writes, "this case just reinforces my belief that there is no way the death penalty can be fairly applied," since unless you have Snoop Dogg on your side, no one cares about your fate. It's a reasonable point, though I partly disagree. If you believe that Williams really has turned into a peacemaker and anti-gang spokesman these days, then his story helps illustrate some of the problems with letting the state decide when a person's life is no longer worth living.
Still, Robinson has a point. One could expand his point even further, though, and say that it's not always clear why death penalty cases should get extra-special attention. Many of the pragmatic arguments against the death penalty apply to the criminal justice system as a whole. Yes, defendants in capital cases, especially minorities, are often convicted after shoddy trials with poor representation and biased juries—but mistakes and unfair trials happen all the time, in all sorts of cases for all sorts of crimes. (The only difference I can think of is that juries that are "death qualified," i.e., qualified to sit on capital cases, have been shown to be more likely to convict.)
It's true that the death penalty is "irreversible," so mistakes in capital cases are especially glaring, but sending an innocent person to prison for 30 years is irreversible as well, happens more often, and in many cases isn't exactly a "nicer" fate than death. About 25 years ago the New Republic ran an article arguing that the old Islamic code of justice—cutting off the hand of a thief, etc.—was barbaric, but perhaps not more barbaric than the American criminal justice system. If anything, it's more equitable, since a severed hand is an equal punishment for all (sort of), but the horribleness of a prison sentence depends on how well you can survive prison.
So yes, it's arbitrary that of 20,000 homicide convictions each year, the state chooses 100 or so to kill. But it's equally arbitrary that, say, a man convicted of armed robbery can also be, in effect, killed by the state because his prison guards think it's "funny" to stick him in a cell with a known rapist. I'm certainly against the death penalty, but it's useful to remember from time to time that it's only a relatively small awful aspect of a larger awful system.
Via Iraq the Model, here's an online questionnaire to see which Iraqi party most represents your political views—if you were an Iraqi, that is.
My top match turned out to be the Iraqi Communist Party (76%), which is a more random outcome than it seems—most of the questions concern federalism and the like, not who should control the means of production. Interestingly, the ICP seems to be marginally pro-occupation here: it's "neutral" on whether the U.S. should leave after the December election, "neutral" on whether it's okay for Iraqis to resort to American forces for security, and "disagree" on the idea that killing Americans is a legitimate right for Iraqis. It's actually a pretty good way to figure out where each of the parties stands on various issues.
Over in the MaxSpeak comment section, Barkley Rosser is making a bunch of good points about tax reform. For starters, he and commenter Spencer point out that almost 90 percent of the "important" investments in the United States—i.e., buildings, factories, equipment, the stuff that actually makes the country grow—are financed by corporations from retained profits, and not individuals or investors. Doug Henwood has made a similar argument elsewhere, adding that even when firms do turn outside for cash, they go first to banks, then the bond market, and only last to the stock market. Even in the peak year of 2000, IPOs accounted for just 5% of nonresidential fixed investment, and while venture capital does help young firms, it also only provides around 1% of investment.
Blah, blah, blah. At any rate, those numbers seem to imply that repealing the corporate income tax would probably have the greatest effect on investment in this country. I might be fine with that, especially since most companies nowadays seem to waste an inordinate amount of everyone's time either dodging the corporate tax in Bermuda or carving out loopholes in Congress. (Although I shudder to think what idle corporate lobbyists would get up to once they no longer had to meddle with the tax code.) By contrast, higher taxes on personal income or capital gains would probably have relatively little effect on investment.
Of course, Barkley Rosser also points out that since the economic effects of all of these different tweaks are probably vastly exaggerated, the best thing might just be to keep the current system more or less in place**—once enough revenue is raised to close the deficit and pay for spending—and then leave it the hell alone so that people can actually have time to figure out how the system works and then get on with their lives. Good point! I guess it tends to get overlooked that tax accountants and policy wonks have a vested self-interest in constantly fiddling with the tax code in order to make themselves relevant to the world.
(**Mind you, if I was allowed one "tweak," I would exempt the first $10,000 of income from payroll taxes of any sort, and eliminate the "Trust Funds"—which have been squandered on tax cuts anyway—and make Social Security and Medicare pay-as-you-go systems once again. If on the off-chance the system ever faces a shortfall someday, we can raise the cap on the upper end. But... I don't have a magic wand.)
Via Ezra Klein, who has a lot of interesting things to say about this, here's the progressive case for Wal-Mart, by Jason Furman. The paper makes some solid points, though as Max Sawicky points out, it's unlikely that the effect of Wal-Mart's low prices can compensate for rock-bottom wages for the lowest income groups. But set that aside. The weakest part of the paper, I think, is that Furman seems to believe that because Wal-Mart has some positive and progressive effects on the economy, it can also become a progressive company.
If Wal-Mart were committed to the welfare of its more than 1.3 million "associates," as it calls its workers, then it would push to expand these public programs. Instead, Wal-Mart and the Walton family have generally worked against the progressive issues that would benefit its employees, including funding campaigns advocating the repeal of the estate tax. Recently, Wal-Mart has come around to endorsing a higher minimum wage, but this limited step is outweighed by its consistent funding for attempts to roll back progressive priorities that would benefit its workers.
This seems almost willfully naïve. Of course Wal-Mart works "against the progressive issues that would benefit its employees." Why wouldn't it? In as much as Wal-Mart can be said to be an entity that "wants" and lobbies for things, it will want a) lower corporate taxes, b) lower taxes for its managers and owners, c) lax corporate and workplace regulation, and d) a governing party that can distract the working class with cultural issues. Now which party delivers that?
Ezra seems to think that Wal-Mart can somehow be convinced to become a progressive ally and support universal health care in order to lower its labor costs. Nonsense, I say. If Wal-Mart's worried about its health care costs it will stump for a Republican health care plan that leaves workers at the mercy of the sharks in the open insurance market and ensures that Wal-Mart, along with its managers and shareholders, pay as little as possible for the scheme. Back to Furman:
Finally, Wal-Mart should obey labor laws that bar gender discrimination, unpaid overtime and environmental laws like the Clean Air Act. I do not know whether or not Wal-Mart has systematically broken these laws; to the degree it has, the remedy is through the legal system, not through restricting the expansion of Wal-Mart into new areas or applying new mandates to a single company.
Aargh. That ignores the fact that Wal-Mart, along with other corporate interests, gets involved in politics precisely to neuter the legal system that's supposed to act as a "remedy"—trying to limit its liability from class-action lawsuits through tort "reform," and trying to weaken workplace regulations, and so on. More to the point, the legal system as currently structured is inadequate for dealing with corporations like Wal-Mart. The company has shown no remorse for anything it has done, any of the workers it has mistreated, any of the laws it has broken, and considers paying fines simply a cost of doing business.
This is why many people support the death penalty for corporations. In the short term it would be disastrous for many people, not least the company's 1.3 million employees, if Wal-Mart's corporate charter was revoked and its assets auctioned off. But understandably, many people feel that drastic action is necessary to teach corporations that you don't lock workers in the store after hours, you don't treat undocumented workers like indentured servants, and you don't let children operate heavy machinery just so that economists can gush over your "productivity growth." Now there are certainly problems with the corporate death penalty, and personally I have mixed feelings on it, but the idea gets at the general problem here: you can't just ask Wal-Mart to play nice.
That's exactly why, as Matt Yglesias says, unions are so important—so that there is some countervailing power fighting against corporate-friendly changes to the legal system and hideous labor practices. That's why the SEIU wants to unionize the company, so it can tip the balance of power, and Wal-Mart stands in the way. Trying to "get along" won't get the job done.
Now Ezra's totally right to say that there are real problems with simply demanding that Wal-Mart offer its workers health care—employer-based health care is a very flawed system. Better to get a more rational universal health care system. But quite naturally, unions don't think universal health care is ever going to come, and fear that any such system would inevitably be weakened by corporate lobbyists hired by, among others, Wal-Mart. Ezra's trying to think through what a more rational long-term strategy to get to universal health care would look like. I'd love to see it, and think it's a worthwhile project. But I don't think it does any good to expect that Wal-Mart can ever be "persuaded" to fight for the working class and single-payer health care. It's still a war and always will be.
Michael Fumento says that we're all a bunch of Chicken Littles for worrying about avian flu. Well, that's good news if true. Bad news if not. I can't say. Mostly I just wanted to quote this bit of morbid goodness:
True, no retelling of [the 1918 flu epidemic] is without anecdotes of apparently healthy young people simply dropping dead, such as the man who boarded the trolley car feeling fine only to leave in the company of the grim reaper. But even these probably didn't die from a direct attack of the virus, writes [John] Barry. Rather, "victims' lungs were being ripped apart . . . from the attack of the immune system on the virus." That explains in great part why an extraordinary number of young people died – they have stronger immune systems.
The sad thing is, the only "clever" quip I could think of in response was, "We had to destroy the lungs in order to save... &c.," which suggests that it's time to step away from the computer.
The "tough" new immigration proposals put on the table by the Bush administration look like a mix of bad and good: appeasing the base with beefed-up border security measures and comforting Hispanic voters with amnesty for illegal immigrants and the like. I don't know whether it will pass or not. On the policy side, punitive border security rarely seems to work, but that's what the American people seem to want. Over in Democrat land, meanwhile, the McCain-Kennedy provisions for amnesty, guest worker quotas, and citizenship tracks for immigrants, while imperfect, are perhaps the best "liberal" policies one can realistically hope for right now.
So that's what Congress is up to. But maybe it's worth drawing up a wild-eyed, far-left position on immigration, if only to broaden the debate a bit. Immigration, as we know, has very large benefits—not only does it provide a colossal subsidy to the destination country, which is freed from the expensive task of creating labor power, but it also helps the source country as well: remittances benefit many developing countries much more than foreign aid or free trade does. (Nancy Birdsall has estimated that proposals to boost labor mobility could transfer some $200 billion or more to poor countries, along with skills, work experience, and, when immigrants return home, the creation of a middle class that could agitate for better governance.)
That said, workers who worry that immigration can put downward pressure on wages in the United States have a point, although the evidence that it actually does so is often ambiguous. True, some of these worries are motivated by xenophobia or racism, but the underlying economic concerns shouldn't be dismissed. Immigrants, meanwhile, are heavily exploited by the current system. Both are problems. Mainstream liberal policies, such as national health care and worker retraining, may soothe the pain caused by all this "creative destruction," but ultimately the balance of power needs to shift towards the worker and wage-earner. That's our guiding populist principle here.
Point one: "Guest worker" policies and quotas, à la McCain-Kennedy, are problematic, in that they give business undue power over labor, both because "guest workers" will always have very little leverage against the employers that bring them in, and because quotas will allow businesses to control the flow of labor and import additional immigrants illegally as needed—and those workers have no leverage at all. The Cornyn-Kyl proposal to allow guest workers to stay here for five years before they have to leave is even more impractical, since workers will inevitably stay on illegally when their time expires, and business will have greater power over the workers they bring in. That depresses wages and leads to exploitation.
In contrast, open immigration, allowing workers to come and stay legally and indefinitely, with amnesty for all and open paths to citizenship, is both legitimately pro-growth, just like population growth, and better for wages, by giving immigrants more leverage and autonomy—including the ability to organize. That brings us to…
Point two: Open immigration will only work alongside policies to promote full employment and labor rights. So long as businesses can take advantage of immigration as a way to create "slack" in the labor market, they will be able to hold wages down. Full employment policies shift the balance towards the worker, and they're perfectly realistic—it's not as if there's a fixed number of jobs in existence, so in theory there's no upper limit on how many immigrants can come in, although in practice limits would be set. Immigrant organizing is also crucial here.
There is no reason to think we'd necessarily be "swamped" with immigrants under looser immigration controls. Just because more people can immigrate doesn't mean everyone will. Teresa Hayter noted that in the postwar period England had open borders with its former colonies, and only 0.6 percent of those countries' populations migrated to the industrial countries, despite the obvious attractions. Obviously opening the floodgates from Mexico would be different, but it still seems manageable. And on the plus side, it would reduce our billion-dollar immigration-control costs and eliminate the deadly trade in human trafficking, along with raising the worker-to-retiree ratio in the United States and alleviating our Social Security and Medicare imbalances.
Point three: Immigration must be open for white-collar workers too. Dean Baker is very good on this. The medical profession, through licensing, keeps immigrant doctors who trained abroad from working in America. Why? If I need brain surgery, sure, I'll go see a Harvard Medical School grad with AMA credentials, if I can afford it. But if I just have a sore knee, I'd rather pay $20 to see the immigrant doctor who just came over from Bangladesh. She can treat me. The same goes for an "uncredentialed" immigrant lawyer who can offer basic legal advice. Good stuff all around. Ending white-collar protectionism would also reduce income inequality.
Point four: There must be better enforcement against employers who exploit immigrant workers. Whatever our immigration controls look like, undocumented whistleblowers should be protected from reprisals. The threat of an investigation by the INS or the Department of Labor discourages many immigrants from complaining about unpaid wages or illegal overtime.
Point five: Voting rights for immigrants. Why not? They work here. They have kids enrolled in our schools. Local policy affects them. I agree with Jamin Raskin that there is no reason why non-citizen residents should not be allowed to vote at the very least in local elections. (He would limit voting at state and federal levels to citizens only.)
Granted, all of those steps are wildly, wildly unrealistic at present—although point four is something that should be pushed for no matter what—but as eventual goals to aim for, I think it's a good outline. It's not at all complete—I've said nothing about security issues, which are both real and very hard to reconcile with an "open borders" immigration policy, or about how to deal with the "brain drain" in source countries. And perhaps some of the above is flat wrong. Comments welcome, as always.
One serious challenge, it seems, will be tamping down animosity against, say, immigrants from Central and South America. This is daunting in the best of times. Even where Hispanic immigrants aren't affecting native wages, on account of immigrants getting slotted into unskilled jobs no one else wants, native backlashes are cropping up everywhere, as they have throughout history. Rationally speaking, there is no reason for "cultural discomfort" with immigration, as Dan Drezner argued here—within a few generations Hispanic immigrants assimilate just like centuries' worth of (also reviled) European immigrants did before them. Still, xenophobia is an absurdly powerful impulse, and as Digby says, liberals should try to avoid feeding the uglier populist currents here.