Who knew there was a journal called Human Rights & Human Welfare? Well, there is, and it's fascinating. Richard McIntyre's essay "Are Workers Rights Human Rights and Would It Matter If They Were?" has some good stuff, notably his review of Robert Ross' investigation into American sweatshops. If you go by the GAO's definition—"a business that regularly violates wage or child labor laws and safety and health laws"—then there are 250,000 sweatshop workers in the United States. That figure excludes a good number of people working under poor conditions for low pay, but it's a start. But I'm less convinced as to the reason these sweatshops exist:
Undocumented immigration is a condition for the new American sweatshop. But, as Ross points out, large numbers of immigrants need not mean sweatshops, as the Puerto Rican wave of immigration to New York City in the 1950s demonstrated. It is globalization, with the accompanying decline in union power and in the State’s inability (or unwillingness) to regulate labor markets that has encouraged the growth of sweatshops, and the concentrated power of the mass retailers that has called them into being.
It seems—at least at times—more complicated than that. For instance: In 1997, a Labor Department report found that two-thirds of the garment shops in New York City were sweatshops, with twelve-hour days and seven-day weeks and wages under the federal minimum. Fun times for all. Partly that's a consequence of the decline in monitoring. During the Carter administration, the Labor Department had 1,600 wage-and-hour inspectors; that was slashed to 700 under Reagan, and only bumped up slightly to 940 or so under Clinton.
What's striking, though, is that many of those shops were actually union shops, run by UNITE-HERE (which was, ironically, organizing protests against sweatshops in the Third World at the time). In fact, three-fourths of the union's shops were in violation of the law. UNITE had basically stopped enforcing its contracts.
As Robert Fitch tells it in Solidarity for Sale, his interesting-yet-sketchy book on labor corruption, this was partly because locals such as UNITE's 23-25 were controlled by the mob and had been colluding with and even receiving bribes from employers not to enforce contracts. (Seven members of the Lucchese crime family were convicted over the racket; no UNITE officials were ever indicted.) What's more, in Fitch's telling, many of the mob-dominated garment unions in New York have been doing this for a long, long time—receiving bribes in exchange for allowing contractors to pay sub-minimum wages and violate labor laws.
Of course, mob-controlled unions are hardly the whole story. (In any case, UNITE has seriously cleaned itself up over the past few years—although Fitch claims that factory conditions in unionized Chinatown have only worsened since 2001, and the official decline in union sweatshops is only due to a change in Labor Department reporting requirements). More importantly, perhaps, over the past fifty years New York garment unions like UNITE and the ILGWU have made it their explicit strategy to lower wages in order to keep contractors from moving out of the city. Here's Fitch:
To maintain bottom-feeding wage levels, the social-democratic [ILGWU leader David] Dubinsky embraced the most Dickensian version of capitalism—piece-rate wages, no-strike pledges, five-year contracts, opposition to the minimum wage, and opposition to government aid—complete with up-to-date Fagins in the form of the city's major Mafia families. This low-road strategy led naturally to a reliance on illegal immigrants and made the New York City industry more and more vulnerable to cheap foreign imports….
Of the cheap labor strategy, at least it could be said that it kept enough of the industry in place so that New York City officials could retain their power and retire on fat pensions. But the costs were great. The measures required to keep wages low destroyed the moral substance of the union. Even poorer waves of unskilled immigrant workers had to be imported who would find life at the bottom acceptable. The once-vibrant political life of the union had to be extinguished. With constitutional changes, opposition to the leadership became difficult.
But isn't this strategy sometimes the only way to go? In the face of globalization—or, at the very least, mobile capital—won't unions sometimes, or even often, have to agree to lower standards in order to preserve their membership base? Isn't that Ross' whole point above?
Maybe, but consider this. According to a 1999 report, the vast majority of garment shops in San Francisco, despite being mostly non-unionized, still manage to pay the city minimum of $8.15 an hour and comply with labor laws, while in New York unionized workers were making under $5.15 an hour and repeatedly subject to wage and hour violations. The United States, meanwhile, imports billions of dollars of clothes from Northern Italy each year, where garment workers make two or three times what their counterparts in New York make. So it's not clear that globalization always has to create a race to the bottom. At the very least, to slap a hackneyed conclusion on this rambling little post, "it's complicated."
A few weeks ago, I threw together some numbers and statistics suggesting that the French protesters might not be so misguided, and France-style labor protections might not cause high unemployment after all. Now David Howell and John Schmitt of EPI have a new paper getting into this in more depth.
The super-novel point here is that France's youth unemployment-to-population ratio (8.6) is actually nearly identical to that in the United States (8.3). France's "official" youth unemployment rate is higher primarily because very few French students enrolled in school actually work, while a lot of our college kids get jobs, so the ratio of unemployed youths to working youths is higher in France than it is here. Different numbers measure different things.
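Since the rate-versus-ratio distinction trips people up, here's a quick illustrative sketch. The numbers are invented for illustration, not actual French or American statistics:

```python
# Two hypothetical countries with the SAME share of jobless youth, but
# different student work habits: in "A" most students also hold jobs
# (large labor force); in "B" few students work (small labor force).

def metrics(unemployed, employed, population):
    labor_force = unemployed + employed
    rate = unemployed / labor_force   # headline unemployment rate
    ratio = unemployed / population   # unemployment-to-population ratio
    return rate, ratio

rate_a, ratio_a = metrics(unemployed=8, employed=52, population=100)
rate_b, ratio_b = metrics(unemployed=8, employed=22, population=100)

print(f"A: rate {rate_a:.1%}, ratio {ratio_a:.1%}")  # A: rate 13.3%, ratio 8.0%
print(f"B: rate {rate_b:.1%}, ratio {ratio_b:.1%}")  # B: rate 26.7%, ratio 8.0%
```

Same number of unemployed young people out of the same population, but country "B" posts a headline rate twice as high, simply because its denominator (the labor force) is smaller.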
Okay, so why do so few French high school and college students work? Maybe it's because they can't find jobs. Or maybe it's because they don't need to—their public universities are more heavily subsidized, after all. Interestingly, though, the percentage of 20- to 24-year-olds who aren't in school and are unemployed is actually a bit lower (14.1) than it is in the United States (14.4). That seems like the main number to worry about, and France seems to be about average on that front.
It's also worth noting that the share of young French adults still enrolled in education is much higher than it is in the United States (51.1 versus 35.0 percent). Again, whether that's because French kids like school or because they have no other options is up in the air. But even if it's because they have no other options, perhaps being "forced" to stay in school isn't so bad: According to OECD data, French workers are, on average, 6 to 16 percent more productive than American workers. Work less; study more—maybe that's the way to go.
Apparently, CIA investigators found out that Mary McCarthy was the one who leaked information on the agency's secret prisons by hooking her boss up to a lie-detector test. Or rather, that was part of the investigation process—it's not clear that the polygraph itself actually helped uncover anything. Mark Kleiman has all the substantive commentary anyone could hope to have on the larger story here; I'd rather talk about the polygraph itself.
Do these things actually work? The National Academy of Sciences tackled this question back in 2002, in a study that seems to have a definitive air about it. Polygraphs are used in three basic situations: Ferreting out specific information about specific events (e.g., after a crime), employee screening, and pre-employment screening. As far as ferreting out specific information, the NAS found, polygraphs "can discriminate lying from truth-telling at rates well above chance, though well below perfection." Good, not great. For the latter two purposes, it's not so good (more on that in a bit).
But it's very, very far from perfect. If a "security threat" is smart, according to the NAS report, and knows how to beat a polygraph, he or she often can. Aldrich Ames passed two CIA polygraph tests before he was caught. What's more, most experts don't even seem to share the NAS' (very limited) confidence in the polygraph for any purpose. Only 36 percent of experts in a 1997 survey in the Journal of Applied Psychology thought the test was based on "sound psychological principles." And the Supreme Court ruled in 1998 that polygraph evidence can't be used in court because "there is simply no consensus that [it's] reliable."
Most of the time, the polygraph seems to work only because people think it's effective and confess to crimes or indiscretions before taking the test; a Pentagon study concluded that the bulk of its information at security screenings comes before anyone gets strapped in. On the other hand, the test's deterrence value seems low; leaks in the CIA increased during the '90s, despite increased use of the test. Unfortunately, as the NAS points out, no alternatives have outperformed the polygraph, so it's still widely used.
Steven Aftergood had an interesting column in Science back in 2000 arguing that the polygraph—which was increasingly being used at nuclear laboratories—had less to do with security and more to do with "exercising political control through intimidation" over nuclear scientists. Someone else can answer whether that's what's going on in the CIA now that "dozens of employees" have been given "special polygraphs" since January as Porter Goss cracks down on anti-Bush sentiment in the agency.
Anyway. Another related—and quite serious—problem with polygraphs, it seems, is the fact that the test is still used to do pre-employment screening in the government. For this sort of screening, the NAS found, the lie-detector test is extremely unreliable, since the questions being asked tend to be less specific and involve more ambiguity. ("Have you ever passed along classified information?" etc.) That's why Congress passed the Employee Polygraph Protection Act in 1988, making it illegal for private employers to subject employees to polygraphs, after hearing testimony that over 400,000 workers had wrongly flunked the test and suffered as a result.
But the law exempted government agencies: The CIA, the FBI, not to mention hundreds of state and local agencies, all use the polygraph as part of the hiring process. And a lot of people get disqualified because of it—many presumably wrongly. There's a whole website, antipolygraph.org, where people complain about having their careers derailed by an inaccurate lie-detector test. (An applicant who was simply and understandably nervous about being tested could easily fail, for instance.) In 1997, the FBI estimated that 20 percent of its applicants were rejected because of their polygraph results—and those people then have trouble finding government employment anywhere else—despite the fact that the test is known to be inaccurate. Still, for whatever reason, people have a lot of faith in it.
Good, I was sitting around waiting for my chance to play the unreasonable environmentalist. And lo, Patrick Moore writes in the Washington Post over the weekend that the U.S. needs to start ramping up production of nuclear power plants if we ever want to reduce carbon emissions. And he's certainly not going to let a bunch of tree-hugging Naderites get in the way:
When I attended the Kyoto climate meeting in Montreal last December, I spoke to a packed house on the question of a sustainable energy future. I argued that the only way to reduce fossil fuel emissions from electrical production is through an aggressive program of renewable energy sources (hydroelectric, geothermal heat pumps, wind, etc.) plus nuclear.
….Here's why: Wind and solar power have their place, but because they are intermittent and unpredictable they simply can't replace big baseload plants such as coal, nuclear and hydroelectric. Natural gas, a fossil fuel, is too expensive already, and its price is too volatile to risk building big baseload plants. Given that hydroelectric resources are built pretty much to capacity, nuclear is, by elimination, the only viable substitute for coal. It's that simple.
Well, that's nice. But I want numbers first. How many power plants are we talking about? In 2004, Stephen Pacala and Robert Socolow argued in a much-discussed Science article that the world's carbon output will rise from about 7 billion tons today to 14 billion tons in 2054. In order to keep carbon concentrations under 500 ppm by mid-century, and avoid bad global warming scenarios, the world should be emitting only 7 billion tons of carbon by 2050.
Now in order to replace one billion tons of those emissions with nuclear energy, Pacala and Socolow estimate that the world would have to add an additional 700 GW in nuclear capacity, double what's produced at present by 440 reactors. So that means roughly 880 new reactors, unless technology makes our plants much more efficient and the like. Since the United States is responsible for roughly a fourth of all carbon output, we'll say that the U.S. would need to build around 220 new nuclear reactors by 2050. Just to cut future emissions by a seventh.
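For anyone who wants to check the reactor math, here's a back-of-the-envelope sketch. The 700 GW figure and the "double the current 440 reactors" relationship come from the paragraph above; the average plant size is inferred from those numbers, not an official figure:

```python
# Back-of-the-envelope reactor arithmetic. The average plant size is
# implied by the "700 GW is double current output" claim, not measured.

current_reactors = 440
current_capacity_gw = 350   # implied: 700 GW of new capacity is "double" present output
added_capacity_gw = 700     # capacity needed to displace 1 billion tons of carbon

avg_gw_per_reactor = current_capacity_gw / current_reactors   # ~0.8 GW per plant
new_reactors_world = added_capacity_gw / avg_gw_per_reactor   # ~880 worldwide

us_share = 0.25             # rough U.S. share of global carbon output
new_reactors_us = new_reactors_world * us_share               # ~220 in the U.S.

print(round(new_reactors_world), round(new_reactors_us))  # 880 220
```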
(Keep in mind, too, that Pacala and Socolow are using only one set of estimates of the carbon impact of nuclear power. That's totally valid, but opinions do differ. The Chartered Institution of Water and Environmental Management in Britain, for instance, estimates that the total carbon footprint of nuclear power, once you factor in extraction, decommissioning, and waste storage, could be much greater than once believed. There's also a question of whether there's enough uranium in the dirt to power all those new plants; granted, that problem could be solved by building fast-breeder reactors, but those have serious proliferation issues.)
How much would all those new reactors cost? A nuclear power plant currently being proposed in Calvert County, near Washington D.C., is estimated to cost $2.6 billion—and that's likely a low estimate (this new plant in South Carolina will cost "$4 to $6 billion"). The Department of Energy is chipping in $300 million for licensing costs, but critics already fear that the federal government will be forced to pick up cost overruns down the road. During debates over the energy bill, pro-nuclear Senators like Pete Domenici were proposing subsidies to pay for half of all nuclear construction costs, since private investors shy away from building reactors. A rough guess is that helping to fund 220 reactors would cost taxpayers $300-500 billion at a minimum. And that's not counting security and the like.
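Here's the rough arithmetic behind that guess. The per-plant figures come from the estimates above; the $4.5 billion high-end number and the half-of-costs subsidy share are my assumptions, the latter modeled on the Domenici-style proposal:

```python
# Rough subsidy arithmetic. The $4.5B figure is an assumed midpoint of the
# "$4 to $6 billion" range; the 50% subsidy share is the proposed policy,
# not an enacted one.

reactors = 220
low_cost_bn = 2.6      # Calvert County estimate, $ billions per plant
high_cost_bn = 4.5     # assumed midpoint-ish of the South Carolina range
subsidy_share = 0.5    # subsidies covering half of construction costs

low = reactors * low_cost_bn * subsidy_share     # about $286 billion
high = reactors * high_cost_bn * subsidy_share   # about $495 billion
print(f"${low:.0f}-{high:.0f} billion in subsidies")
```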
Now fine, if that's what it takes to avert global warming, that's what it takes. But there's also a question of opportunity costs. Every dollar spent building nuclear reactors is a dollar not spent funding other renewable energy sources. And that might be a poor trade-off. I'll quote British journalist Hannah Bullock:
The [British] government has done its sums and reckons that, by 2020 – the earliest time a new nuclear programme could come on stream – it’ll be cheaper to cut a tonne of carbon dioxide emissions through wind or combined heat and power (CHP) technologies – or simply through energy efficiency. These greener options could even have a negative net cost, the Performance and Innovation Unit (PIU) worked out. But if an expensive new nuclear build programme draws investment away from the development of clean technologies like these, we could see higher, rather than lower, emissions in the medium term.
I don't know exactly what study Bullock's talking about here, but it's something to consider. The choice isn't just between building new nuclear reactors or keeping all our carbon-spewing coal plants in place, which is what David Roberts is getting at here.
At any rate, I wouldn't rule out nuclear power completely, but everything I've seen indicates that it could only ever be a very small yet fairly expensive part of a larger plan to reduce carbon emissions, and that the opportunity costs may be quite high. And that's without diving into the very real concerns about nuclear safety and disposal of radioactive materials—all areas that need real improvements, as this paper by Matthew Bunn of the Belfer Center summarizes in a non-shrill fashion.
Fred Kaplan has a good column about the recent spate of retired generals calling for Donald Rumsfeld's head. On the one hand, no one wants to see a repeat of the 1960s, when the Joint Chiefs of Staff, against their better judgment, failed to speak up and dissuade Johnson and McNamara from hurtling the country into Vietnam. If military leaders think something has gone badly awry in the Pentagon, the public should probably know.
On the other hand, it's perfectly reasonable to get a bit leery when generals suddenly start speaking out against civilian government. During the 1990s the military became quite politicized—a development that Bill Clinton, ironically, helped start when he took the unprecedented step of getting endorsements from 20 retired generals in his 1992 campaign, to counteract his image as a pot-smoking draft-dodger. Just like they do now, Democrats made a fetish of men in uniform. The flipside was that once in office, Clinton was loath to challenge his generals—they had more credibility on security issues, after all.
The upshot was that the military enjoyed inordinate influence over a not-insignificant part of foreign policy during the '90s. Partly that was because the Goldwater-Nichols Act of 1986 made the military more powerful by making the Chairman of the Joint Chiefs of Staff "principal military advisor to the president, the NSC, and the secretary of Defense." Partly that was because, by all accounts, Les Aspin and William Perry were relatively aloof and inattentive Defense Secretaries.
Whatever the cause, the military seemed to have more sway than usual. Colin Powell felt free to write a Foreign Affairs article describing "his" foreign policy in early 1993, and the military went into open near-revolt over letting gays serve in the military. Later on, the Joint Chiefs opposed the land-mine treaty because it would hurt our readiness in North Korea; they opposed the International Criminal Court for fear that U.S. soldiers could be prosecuted—an unlikely event, but whatever; they opposed the Anti-Ballistic Missile Treaty and pushed for missile defense systems over the objections of the rest of the world; they opposed the ban on child soldiers. And the president caved on all of these issues.
Meanwhile, as Dana Priest reported in her excellent book, The Mission, regional Commanders-in-Chief were essentially handling diplomacy in their little parts of the world, as State Department funding dwindled and no one attempted to rein them in. And, as Andrew Bacevich has noted, during the 1990s there was the odd spectacle of more and more retired generals appearing on television to criticize the president, and the politicization of the officer corps.
Now it's a very large leap from a couple of retired generals speaking out against a disastrous Secretary of Defense after a long reticence to the creation of a full-blown military state. I don't think there will be a coup tomorrow. Certainly many active generals take civilian control of the military very seriously—Kaplan notes that many of them remember what happened when Gen. Douglas MacArthur tried to force a public showdown with Harry Truman. And if a bit of grumbling gets Rumsfeld fired, that would be a good thing—although presumably Bush would just replace him with someone equally disastrous. Still, there's decent reason to worry about the scenario Kaplan sketches here:
Rumsfeld's arrogance, his "casualness and swagger" as Gen. Newbold put it—which have caused so many strategic blunders, so much death and disaster—have started to tip some officers over the edge. They may prove a good influence in the short run. But if Rumsfeld resists their encroachments and fights back, the whole hierarchy of command could implode as officers feel compelled not merely to stay silent but to choose one side or the other. And if the rebel officers win, they might find they like the taste of bureaucratic victory—and feel less constrained to renew the internecine combat when other, less momentous disputes arise in the future.
Maybe he's just worrying too much. It seems like it would be better not to find out.
UPDATE: Related, and much recommended, is military historian Richard Kohn's 1994 article, "Out of Control: The Crisis in Civil-Military Relations."
This Washington Post op-ed on "The Myth of the 'Boy Crisis'" was pretty good overall, but I was more intrigued by this bit at the end:
The Department of Defense offers a better model [for schools]. DOD runs a vast network of schools on military bases in the United States and abroad for more than 100,000 children of service members. And in those schools, there is no class and race gap. That's because these schools have high expectations, a strong academic focus, and hire teachers with years of classroom experience and training (a majority with master's degrees).
Really? Apparently so. In 2000, Education Week reported that "black and Hispanic students at the 154 Department of Defense schools overseas—along with 70 others operated by the military on U.S. soil—do better than their counterparts almost anywhere in the United States." And the Christian Science Monitor noted in 2003 that the schools' 112,000 students "outperform the national average," despite being poorer as a group than students at most U.S. schools. Eighty percent of base students go to college, compared with 67 percent nationally.
So that's impressive. How come military schools are so good? It turns out that Jenny LaCoste-Caputo of the San Antonio Express-News investigated this only a few weeks ago, looking at San Antonio's three school districts on military bases—all of them with sterling educational reputations. Among other things, students come from families that, while not rich, have a steady paycheck; everyone has access to good health care, and there's a "culture of discipline." (Students supposedly "know their bad behavior can have serious consequences for their parents.")
But another major difference comes down to money. The military school districts have "twice as much money to spend per student as the average Texas district"—not to mention fewer students in the first place. That means smaller classes, more aides, more assistants, more ESL classes. Across the country, the disparities aren't quite as stark, but base schools still spend $9,500 per student, as compared to a national average of $8,800 in public schools. No one can really say exactly how much of the success of military base schools is due to money and how much due to the other stuff, but that's obviously a big gap.
In Reason the other day, Jacob Sullum reminded us that large tobacco companies have used court settlements to muscle out smaller competitors. Here's what a Washington Times report had to say about this a year ago:
Critics say the 1998 Master Settlement Agreement has not turned out as expected as big players accused of wrongdoing wound up with market protections while small businesses were forced to pay up….
Under the deal, the big cigarette companies agreed to restrict their marketing, fund stop-smoking efforts and make annual payments to the states for 25 years.
But attorneys on both sides knew the large companies would raise cigarette prices to make their payments, so they built in protections to prevent companies outside the agreement from taking too much market share.
Sullum thinks this is all to the bad—the small tobacco companies that didn't engage in massive fraud and mislead their consumers are being penalized, while the four major tobacco companies that did all the wrongdoing now get to set up a state-enforced cartel as a reward. That sounds pretty bad, but let's see.
According to the Times, the states don't much care about cartels and the like. Their ultimate goal is to reduce cigarette consumption, and blocking out smaller competitors and forcing the big companies to raise their prices could do just that. It seems to have worked. One other consideration, though, is that helping four large tobacco companies acquire even greater wealth and influence will make further tobacco regulation even harder, since these companies will have greater resources for lobbying and court battles. (Indeed, after lots of litigation, the major cigarette companies may soon get their annual payments under the MSA reduced by $1.2 billion.)
Here's something else. The MSA also required the four major tobacco companies to stop advertising to kids—no more cartoon characters, for instance. Youth smoking rates saw a massive drop-off in 2002, after increasing steadily throughout the 1990s, and many people think the advertising restrictions were significant. But perhaps not. The New England Journal of Medicine found a few years ago that young people probably had the same exposure to cigarette advertising in magazines as they did prior to the MSA. Looking around, meanwhile, it seems the American Spectator was scoffing about advertising bans back in 1999:
But, frivolous libertarian complaints notwithstanding, wouldn't a federal settlement be a boon to the health of the country by funding countless health initiatives and stop-smoking programs? Not likely. First, in six European countries where tobacco ads have been completely banned, overall tobacco sales have increased.
Why on earth would that be? A-searching we go. David Cutler and Edward Glaeser recently asked the question, "Why Do Europeans Smoke More than Americans?" One-half of the difference, they found, can be accounted for by differences in beliefs about the health effects of smoking. "Europeans are generally less likely to think that cigarette smoking is harmful." What that has to do with tobacco cartels, I can't say, but it's all very curious.
I'd highly recommend this post by Mark Thoma, which discusses research suggesting that unskilled immigration doesn't necessarily depress wages for low-income native workers here in America. That seems counterintuitive—after all, an increase in the supply of labor should cause wages to drop, no?—but Alan Krueger and David Card apparently think that maybe, just maybe, all that immigration increases demand for goods and services, which in turn helps wages and employment.
Well, that would be nice, if true. My general take, which isn't very original, is that even if it isn't true, there are better ways to help low-wage workers than to shut down immigration, which in any case is extremely difficult and likely to be harmful to developing countries like Mexico. Mickey Kaus, in turn, thinks there aren't any better ways, and says that the usual suspects—more education, progressive tax brackets, the EITC, a higher minimum wage, and "public provision of services"—won't help those people who can't find jobs because immigrants are creating slack in the labor market.
Well, he's probably right on that narrow point (although not on others). But why stop there? Why can't the government pursue full employment policies to create a tight labor market? As Jared Bernstein argues in this old Prospect article—and as Kaus acknowledges—full employment is one of the best wage-boosting, anti-poverty programs that has ever been devised. And it doesn't seem like you have to restrict immigration (which shouldn't be any different from normal population growth) to achieve it: an expansionary monetary policy, federal spending, and massive public-sector employment should do the trick. Maybe we could revive the 1946 Employment Act, as it was originally conceived. Except that that would probably be considered "socialism," eh?
Paul Krugman is still uneasy about large-scale immigration, judging from his column on Friday. Unskilled immigration, he says, depresses the wages of low-skilled workers. Well, yes, but again, with properly-designed policies—living wages, full employment, labor laws that allow unions to flourish, earned-income tax credits, and the like—I think you can mitigate this, while preserving the very, very large benefits immigration brings for immigrants and the countries that send them. It's awfully odd to think that shutting the border is really the best possible thing we can do for low-skilled native workers.
But okay, we've been over that. This passage in Krugman's Friday piece, on the other hand, is new and deserves comment:
Imagine, for a moment, a future in which America becomes like Kuwait or Dubai, a country where a large fraction of the work force consists of illegal immigrants or foreigners on temporary visas -- and neither group has the right to vote. Surely this would be a betrayal of our democratic ideals, of government of the people, by the people. Moreover, a political system in which many workers don't count is likely to ignore workers' interests: it's likely to have a weak social safety net and to spend too little on services like health care and education.
This isn't idle speculation. Countries with high immigration tend, other things equal, to have less generous welfare states than those with low immigration. U.S. cities with ethnically diverse populations -- often the result of immigration -- tend to have worse public services than those with more homogeneous populations.
Well, I agree. Creating a Dubai-style underclass of disenfranchised immigrants who have few rights and even less voice in the country they help prop up is an awful idea. That's why everyone should oppose "guest worker" policies that allow companies to import a captive labor force that is here at the mercy of its employers and can't bargain for better wages, speak out against shoddy work conditions, or organize and strike. But I'd go even farther. Why should non-citizens have to be disenfranchised? Why not just let anyone living here legally vote?
It seems a bit crazy, but it's worth putting out there. Non-citizen immigrants seem to be constitutionally barred from voting at the federal level in any case, but nothing's stopping anyone from giving them the vote in state and local elections. And why not? Presumably immigrants should have a say in, for instance, what goes on in the schools they're sending their kids to. And it's perfectly possible: Takoma Park in Maryland allows non-citizens to vote, although I don't think it's affected voter participation or local politics very much there. (San Francisco has considered similar measures at various points, too—it's unsettling, by the way, that 4.6 million people in California, one-fifth of the state population, can't vote.)
Who knows, a bit of civic participation might even make immigrants more "patriotic" or "assimilated" or whatever it is nativists worry about. (Even though the evidence shows that even Hispanic immigrants are assimilating just fine.) At the very least, non-citizen voting would help prevent the United States from turning into another Dubai. It's just not very likely to happen, although maybe a well-placed and influential New York Times columnist could do his part to help this idea gain momentum...
I don't really know whether the riots in France are going to create a political crisis in Paris or what, but I do know I'm not quite convinced by the "sensible" view on this side of the Atlantic that France absolutely needs to make it easier to fire young people if it wants to reduce unemployment. Of particular interest is this 2004 paper put out by researchers at the Center for Economic and Policy Research, which looks at data from the OECD countries and finds no evidence that, in general, "employment protection" laws have much impact on unemployment rates. (The paper also criticizes a much-cited IMF study that found just such an impact.)
That doesn't make a ton of sense at first glance—intuitively, one would think that if companies could fire people more easily, they'd be more likely to hire people in the first place—but then again, if employment is mostly determined by overall demand for goods and services, then maybe regulations governing the hiring and firing of employees don't matter all that much in the grand scheme of things. The European Central Bank has been keeping interest rates rather high over the past few years, and maybe that does more to explain France's high unemployment.
It's interesting that other European countries have high levels of employment protection, yet still manage happily low levels of unemployment—Denmark, Austria, the Netherlands. Perhaps deregulation's not the answer after all. The CEPR paper argues that, for instance, changes to labor market institutions probably "contributed nothing at all" to the drastic reduction in unemployment in Ireland between 1980 and 1998. Meanwhile, Olivier Blanchard and Thomas Philippon argue that the main reason for a similar fall in unemployment in the Netherlands during that period was a national agreement by Dutch unions to moderate their wage demands—and not deregulation per se. So I don't necessarily see conclusive evidence that European unemployment is high because labor laws are too "rigid" and "inflexible."
Now all that said, granted, in this particular case, Chirac's latest proposed revision to the labor reform—instituting a one-year trial period during which companies could fire young workers with just cause—doesn't seem very draconian or unreasonable. Then again, I don't follow French politics very closely, and if, say, people are looking at this as a potential first step on the march towards creating a more "flexible," American-style labor market in France, I can see why they'd oppose it. And they should.
MORE: This DailyKos diary is worth a read, noting among other things that the true unemployment rate among French youths is greatly exaggerated—although I think the author's neglecting to count people who are discouraged from finding jobs—and that the rate of job creation and destruction in France is the same as in the United States. One can also add that in any case the originally-proposed reform would let employers fire workers for race- or gender-related reasons. Don't we have laws against even that here in the "dynamic" ol' U.S. of A.? [UPDATE: Oops, that last point isn't quite right; see comments.]
I'm a reporter at The New Republic, mostly covering green issues, apocalypses, general doom. This is my personal site. I also post regularly at The Vine, TNR's enviro-blog.