What's the deal with Robert Parry of Consortium News? According to Wikipedia, he broke a number of Iran-Contra stories as a reporter for the Associated Press in the 1980s, so he's not some random crackpot. But he also seems to have access to a fairly different stable of sources than most journalists. See, for instance, his report today:
Military and intelligence sources continue to tell me that preparations are advancing for a war with Iran starting possibly as early as mid-to-late February. The sources offer some differences of opinion over whether Bush might cite a provocation from Iran or whether Israel will take the lead in launching air strikes against Iran’s nuclear facilities.
But there is growing alarm among military and intelligence experts that Bush already has decided to attack and simply is waiting for a second aircraft carrier strike force to arrive in the region--and for a propaganda blitz to stir up some pro-war sentiment at home. ...
According to intelligence sources, Bush’s Iran strategy is expected to let the Israelis take a lead role in attacking Iran's nuclear facilities in order to defuse Democratic opposition and let the U.S. intervention be sold as defensive, a case of a vulnerable ally protecting itself from a future nuclear threat.
I don't know if Israel's actually preparing to bomb Iran in "mid-to-late February," but the stated logic about "defus[ing] Democratic opposition" seems depressingly plausible. I generally don't think that AIPAC and other hawkish pro-Israel groups are pushing the White House towards a confrontation with Iran, if only because I doubt the White House needs the push. Iran poses a threat to U.S. hegemony in the Middle East, and that's reason enough for Bush to adopt a belligerent stance towards Tehran. But AIPAC is quite adept at reining in Democratic critics of Israel, and in this context, that can certainly help grease the skids for war. I also wouldn't mind hearing more about this:
Unconfirmed reports suggest Vice-President Dick Cheney has cut a deal with Saudi Arabia to keep oil production up even as prices fall, to undercut Iran's main source of foreign currency.
In The Washington Note's comment section the other day, Dan Kervick argued at some length that Saudi Arabia has been running "a very effective propaganda campaign" against Iran here in Washington. I don't see any examples, although Hassan Fattah did report last month that the former Saudi ambassador to the United States, Prince Turki al-Faisal, was forced to resign because he believed "the only solution to the problem is in negotiating with the Iranians." The Saudis allied against Turki have "sought to closely back the Bush administration as it seeks a toughened policy on Iran."
I also thought I had either recently read or heard a theory that the recent "surge" in Baghdad was chiefly aimed at placating the Saudis, who are getting increasingly worried about Iranian influence in Iraq, but I can't find a link, so let's consign that one to the rumor bin. At any rate, I usually just assume this administration is perfectly capable of doing crazy things without outside pressure.
It's been a while since I've carped about some arcane (but sinister!) thing the World Bank or IMF has been doing. So let's do that. Every year since the fall of 2003, the World Bank has put out a ranking of 175 countries around the world, based on the "ease of doing business" within their borders. The "Doing Business" report has become quite influential since its inception—both the Bank and the IMF often refer to the rankings when they make policy recommendations to developing countries—and it's worth a closer look.
In 2005, to take one example, a World Bank assessment worried that Ecuador ranked much too low in the Doing Business rankings, partly because of its poor "flexibility of firing" grade—apparently, it's too difficult to fire people at will in Ecuador. So the Bank suggested that the country "consider measures aimed at reducing rigidities in the labor market, particularly with regard to firing restrictions, mandatory profit sharing, and employer-subsidized retirement." These "reforms" were pushed even though only 14 percent of Ecuadorian firms saw labor-market regulations as a major problem. And Ecuador's hardly alone. The International Confederation of Free Trade Unions (ICFTU) has compiled a long list of countries with similar experiences.
So what's the deal with these rankings? Let's look at the most recent report: "Doing Business 2007: How to Reform." The countries listed are ranked in ten categories related to taxation, licensing, financial regulation, legal institutions, and—the contentious one—labor. In the "Employing Workers" category, countries get a higher ranking if they have a low minimum wage, if they don't restrict the number of hours an employee can work, and if they allow companies to fire employees at will, among other things.
Countries with poor or non-existent labor standards seem to do particularly well. Saudi Arabia, for instance, doesn't allow workers to organize—a violation of the International Labor Organization's (ILO) fundamental principles—and yet it receives a very high score in the "Employing Workers" category. The Marshall Islands and Palau both notched nearly perfect scores—even though neither country has much in the way of health and safety standards (Palau doesn't even ban child labor). If a country wants to boost its rankings in this category, apparently all it needs to do is go consult with the Heritage Foundation and hack away all its labor standards.
Now, it's certainly possible that labor deregulation is always for the best, and brings only good things, as the Doing Business report implies. But how would we know? The report doesn't provide any sort of cost-benefit analysis of these regulations. There's no mention of the potential downsides to overly-long working hours or a skimpy minimum wage. No one points out that many countries don't offer unemployment benefits, so severance pay requirements and advance dismissal notices—two things frowned upon by Doing Business—are often the only available substitute. No one notes that discrimination still remains a major problem in many countries, and junking certain firing protections could have real consequences.
Interestingly enough, not everyone at the World Bank even believes that labor deregulation is always and everywhere an unqualified good. After all, the Bank's 2006 World Development Report sensibly notes that labor markets don't work like commodity markets do, and can often have "unfair and inefficient outcomes when the bargaining position of the workers is weak." And yes, the WDR report warns against too-rigid work rules, but still argues that some labor regulations "can improve market outcomes and lead to significant equity gains." But the Doing Business report appears to be more influential than the WDR, and doesn't share this sort of nuance. Instead, it always pushes deregulation, without hesitation. From all accounts, it seems to be having an effect.
Okay, I don't quite get the point of this piece, "5 Myths about Suburbia and Our Car-Happy Culture," by Ted Balaker and Sam Staley of the Reason Foundation. They have some nice facts to share, but the moral of the story, it seems, goes something like this: As countries get wealthier, they'll drive more, so therefore we shouldn't do anything to reduce carbon emissions. Okay... On the other hand, it is nice to see a few libertarians admit that clean-air regulations have actually improved air quality in the United States since 1970. Baby steps, I guess.
I'm also a bit annoyed by the argument—first made by Bjorn Lomborg—that we shouldn't do anything to stop global warming because we could save more lives, in a more cost-effective manner, by addressing malaria and water-borne diseases. But why not do both? Where are the numbers that say curbing CO2 emissions will be so prohibitively expensive that the United States won't be able to address any other problems in the world?
Consider this: The Lieberman-McCain bill now under consideration in the Senate would come close to reducing U.S. emissions down to the levels necessary for the country to do its part to avert drastic climate change. It's not perfect, but it's decent. The EIA estimates that the bill would take an $89 billion bite out of the United States' inflation-adjusted GDP in 2025. That's not nothing, but it's still less than a year's worth of occupation in Iraq. Alternatively, we could shutter some of our 700 (and growing) military outposts around the world and scrounge up some cash. It's not like there are only two choices here.
Honestly, I've read a lot of arguments of the form, "If we really cared about the poor, we'd be doing X instead of curbing emissions," but I can't say I see a lot of conservatives or libertarians lobbying hard for the United States to do X. On the subject of malaria, for instance, the only writings I can find by either of these Reason people is an interview Balaker did in 2005 with an opponent of foreign aid, who insists that Africans will figure out how to deal with malaria on their own, as soon as they're no longer "suffocated with aid."
As best I can tell, modern-day commentators on the Middle East mostly disregard Edward Said's views on "Orientalism"—that is, his argument that Western scholarship on the region has been shaped by imperialistic attitudes and riddled with prejudice. (Okay, the argument's a tad more complex than that—among other things, he suggested that imperial Europe in the 19th century partly defined itself in opposition to its idea of "the Orient".) His thesis was fairly influential in academic circles, but it doesn't get much mention in the outside world, except when it gets blamed by conservatives for debasing Middle Eastern studies.
Robert Irwin's recent book, Dangerous Knowledge, sort of follows that line of attack, arguing that Said "libeled generations of scholars" who were mostly writing in "good faith". The book does take a fascinating jaunt through Middle Eastern studies, but I'm not exactly qualified to assess his larger argument. I did, however, like Lawrence Rosen's take on all this--he seems to think that Said was too savage, but identifies a bunch of biases in current scholarship on the Middle East, including this one:
No self-respecting contemporary Orientalist comments on a text or a word without showing its "original" meaning, commonly implying that, whatever accretions may have attached themselves over the centuries, the word’s true meaning is its first meaning. This is the attitude, for example, of Bernard Lewis in The Political Language of Islam and almost every entry in The Encyclopedia of Islam.
Rather than seeing language in its living—to say nothing of psychological—context, it is thought all but sufficient to access its earliest meaning to grasp its timeless essence. This focus on origins has allowed the philological enterprise to be carried along—without the need for interdisciplinary insight—into the ongoing appraisal of texts. This approach does not necessitate, as Said would have it, a biased view of the mentalités of modern Arabic speakers, but it can certainly perpetuate the assumption that the scholar knows better than the speaker the true meaning of his or her expression.
I can't think of any good examples off the top of my head, but this habit does seem to crop up in various essays and papers on the Middle East or the Arab world, and the point's well-taken.
Last week, a U.S. official told Laura Rozen that the Bush administration was planning to "confront Iran in every way but direct armed conflict, using all means short of war." No mention, though, of how the White House planned to ensure that these new moves--confronting Iranian networks in Iraq, doubling its naval power in the Gulf, finding "covert ways to counter Hezbollah in Lebanon"--actually do stop short of war.
After all, say the U.S. military decides to "kill or capture Iranian operatives inside Iraq," as seems to be the new policy. Iran could very likely retaliate. So the United States steps things up another notch. Iran does likewise. And so on. It wouldn't be inconceivable for the two countries to find themselves in a full-blown conflict in short order. Indeed, it seems more likely, at this point, that the United States will stumble into war with Iran rather than initiate one out of the blue.
So that brings us to Dafna Linzer's piece in the Washington Post today, which briefly mentions the "safety measures" the Bush administration is taking to avoid this scenario:
In meetings with Bush's other senior advisers, officials said, Rice insisted that the defense secretary appoint a senior official to personally oversee the program to prevent it from expanding into a full-scale conflict. Rice got the oversight guarantees she sought, though it remains unclear whether senior Pentagon officials must approve targets on a case-by-case basis or whether the oversight is more general.
That doesn't sound overly comforting--especially since we also learn that the new Iran policy "is based on the theory that Tehran will back down from its nuclear ambitions if the United States hits it hard in Iraq and elsewhere, creating a sense of vulnerability among Iranian leaders." In other words, the White House expects Iran to fold, in awe of American military might, rather than strike back. And hey, it's not like rosy assumptions have ever led us astray in the past...
Right after the mid-terms, I wrote a churlish little piece for TNR predicting that John Dingell, the incoming chair of the House Energy and Commerce Committee, would pose an obstacle for Democrats trying to do something about global warming. That's partly because he's always been skeptical about emissions controls, especially on automobiles (hey, he represents Dearborn!). Anyway, now it seems Nancy Pelosi is also concerned about this:
One sign of the Democrats' determination to move on climate bills occurred when a Democratic Congressional aide confirmed that Speaker Nancy Pelosi wanted to create a special committee on climate, apparently an end run around Representative John D. Dingell, the Michigan Democrat who is chairman of the Energy and Commerce Committee.
Mr. Dingell, through an aide, Jodi Seth, said Wednesday that such committees were "as relevant and useful as feathers on a fish."
Yikes. We'll see how this all shakes out. As far as I can tell, the new committee would have the power to subpoena and hold hearings, but wouldn't be involved in drafting legislation. So Dingell hasn't really been sidelined. I'd also like to know if this bit of turf-burglary makes him more or less inclined to help Pelosi push climate bills forward.
While we're on the topic, Dave Roberts of Grist has a nice little run-down of the various climate bills under consideration in the Senate. Note that Jeff Bingaman's plan to stabilize emissions is being flogged as the "centrist" bill, but as this New York Times chart shows, it doesn't come anywhere close to reducing U.S. emissions down to the levels necessary for the country to do its part to avert drastic climate change. The McCain-Lieberman-Obama bill does somewhat better. The only proposal that (nearly) reaches the targets, it seems, is Bernie Sanders' bill. I doubt that will get far, though it will at least make the McCain-Lieberman-Obama bill look moderate by comparison.
I'm sure most people have seen this New York Times headline: "REBUKE IN IRAN TO ITS PRESIDENT ON NUCLEAR ROLE." It seems that Iran's Supreme Leader, Ali Khamenei, now wants Mahmoud Ahmadinejad "to stay out of all matters nuclear." As with all things in Iranian politics, the move's hard to decipher: Does Khamenei actually want to defuse tensions with the West, or is he just trying to improve Iran's public image? Ever since the United Nations slapped sanctions on Iran last month, Khamenei has been rather reticent about his country's nuclear program--only once insisting that the country would never relinquish it--and Ali Larijani, the country's chief nuclear negotiator, has also sounded somewhat less inflammatory.
The Guardian had a more detailed report about all this on Tuesday, noting that 150 members of Parliament have sharply criticized Ahmadinejad--both for getting embroiled in a dispute with the UN Security Council and for running the economy into the ground. (Relatedly, Roger Stern, an economist at Johns Hopkins, has recently argued that Iran's oil economy is facing something of a "death spiral.")
It would be easy to read too much into this--no one thinks that Iran's ready to surrender its nuclear program and make nice with the United States just because a bunch of MPs are pissed at Ahmadinejad--but it's a reminder that in some respects, at least, Iran is much like any other country, with its own concerns and political disputes, not just single-mindedly obsessed with the destruction of Israel and the West. That first view, recall, basically underpins the case of those who think the United States ought to try talking with Iran, rather than whatever hostilities the White House is trying to provoke right now. It's also a reminder that Ahmadinejad, lunatic though he may be, has much less power than commonly thought--and may even be on his way out.
For anyone who's curious, I have a piece up at The New Republic today on socially responsible investing (no need to subscribe to read it). It's basically a riff on the Los Angeles Times' investigation into the Gates Foundation's seedy investments. I'm actually much more positive about SRI than the headline seems to imply, but I do think there are some limits to the whole idea that are worth keeping in mind.
The AP has an alarming story today about how sensitive military equipment sold through the Defense Department's surplus sales program has been finding its way to China and, in at least one case, Iran. Federal investigators are especially worried that the surplus sales are poorly regulated, and that Iran could find the parts it badly needs to maintain its F-14 fleet--which was bought from the United States in the 1970s when the two countries were still allies--through this market. Bad news.
It's also familiar news. Last year, the GAO discovered that private citizens could buy "two launcher mounts for shoulder-fired guided missiles, several types of body armor, a digital signal converter used in naval electronic surveillance," and other fun toys from the Defense Department's surplus inventory, without hassle. Body armor was being offered for a fraction of the price that military units in Iraq had to pay for it. And a leaked 2003 GAO report revealed that stuff needed to make biological weapons could be bought on the Internet for pennies on the dollar.
Call me crazy, but maybe it's time to revisit the whole idea behind the Pentagon's surplus sales program. The most comprehensive report on the subject, compiled in 1996 by the Federation of American Scientists, found that the United States had given away--or sold at steep discount--$8.7 billion worth of arms considered "surplus" between 1990 and 1996. Not only are weapons being dumped on the world market, but in many cases, "the services appear to be giving away useful equipment to justify procurement of new weaponry."
To take one present-day example, the Pentagon is spending billions to acquire a new fleet of F-22 Raptor stealth fighters. The Air Force and Navy have justified the program, which has turned into something of a boondoggle, by pointing to the spread of U.S.-built F-15, F-16, and F-18 fighters around the world--many of which were originally given away by the services in surplus auctions. (One mildly infamous promotional pamphlet for the F-22, which circulated some years back, stressed the need to maintain U.S. "air superiority" by pointing to worrisome countries around the world that were either adversaries or potential adversaries. It turned out that most of those countries were worrisome because they had fleets of U.S.-built F-16s.)
Arms-control experts often argue that the United States could just keep and upgrade many of its existing weapons rather than auctioning them off at bargain prices. (In 1994, a GAO report suggested that the Pentagon could simply upgrade its existing fleet of F-15s and still defeat potential foes for years to come, rather than invest in the F-22.) Defense contractors, though, see giveaways as "loss leaders" that give cash-strapped countries a taste for buying U.S. arms, setting the stage for future sales down the road. Plus, they put upward pressure on the procurement budget. So it's doubtful that the surplus sales program is going anywhere anytime soon, risks and all.
There's plenty of interest in this GAO report on why the pharmaceutical industry hasn't been terribly innovative over the past decade—or at least not as innovative as one might expect. Between 1993 and 2004, drug companies boosted their spending on R&D by 147 percent. But the number of new drug applications submitted to the FDA only rose by 7 percent. What's more, only 12 percent of new drug applications during that time were classified as "highly innovative," while 60 percent were in the least innovative category, and were basically modifications of existing drugs.
Why is this? The GAO interviewed a bunch of experts and industry analysts to figure out what sort of things were hampering drug-development efforts. I'll do a quick summary of some of the theories bandied about in the report. All or some or none of these might be true (some sound much more plausible to me than others):
1. The science of disease is incredibly difficult, and it could just be that the drug industry is going through a tough phase right now—possibly because companies have been focusing on "complex and chronic diseases such as cancer" that have higher failure rates. Plus, they haven't quite mastered all the new-fangled technology—such as genomics—but once they do, innovation will really take off.
2. There's an industry-wide lack of trained researchers "who possess the ability to effectively translate basic discoveries into new drugs." In particular, the number of physician-scientists—people who have both medical and research degrees and can bring the two realms together—declined by 22 percent from 1983 to 1998. Some blame academic institutions for not providing enough financial incentives to convince medical students to pursue these tracks.
3. Over the past ten years, drug companies have increasingly concentrated on producing "blockbuster drugs"—that is, drugs intended for large populations that have the potential to reach over $1 billion in annual sales. This sort of work can crowd out research on other smaller, potentially innovative drugs that would benefit more limited portions of the population, or be used to treat less-visible diseases.
(Moreover, since multiple drug companies are often pursuing the same blockbuster drug for the same blockbuster market, one company will often beat the others to the punch, and the losing companies will have to discontinue all the research they've done.)
4. Given that drug development has become more costly and more complex of late, many companies prefer to focus on "me too" drugs that are basically variations on drugs already on the market. These drugs are less innovative, but they're also less risky to develop.
5. As a follow-up, the current intellectual property system encourages "me too" drugs, since companies can obtain new patents for their drugs regardless of whether the drug in question is a significant improvement on what's already out there. Instead, they can slightly tweak an existing drug and voila—they get a patent and the profit that comes with it.
6. According to industry analysts, the recent rash of mergers and acquisitions in the drug industry might have negatively affected drug development. Quite often, a newly-merged company will simply cancel several of an acquired company's ongoing projects because there are new revenue expectations to meet. In one 2003 survey, nearly half of all clinical research organizations reported that they had had drug development projects cancelled due to mergers.
7. Many of the FDA's rules concerning drug safety and effectiveness are unclear, and that uncertainty may sometimes lead companies to abandon drugs in the early stages rather than risk bringing a drug all the way to the final stage only to get rejected by the FDA. The fuzzy rules, though, are partly due to an honest lack of consensus about how to measure drug safety. Other industry officials have also complained that the FDA takes too long to review a drug, although most of these rules were put in place for safety purposes. (I'd note, though, that the FDA is, to a large extent, in the pocket of the drug industry, and hardly the most adversarial force around.)
Anyway, that's the rough overview. Obviously there are a lot of things that could be fiddled with—one of GAO's recommendations was to create a multi-tier patent system, in which a drug company could receive, say, a 30-year patent for a drug considered highly innovative, but a much shorter patent for "me too" drugs that don't much improve on existing drugs. As an alternative, Dean Baker has proposed a publicly-funded alternative to the patent system that could work just as well, or better.
In comments to my earlier post on Somalia, there was some back and forth over what role, exactly, the United States has played in the Ethiopian invasion. To that end, I thought I'd pass on this report by Daveed Gartenstein-Ross, a terrorism consultant (via Cursor.org):
The al-Qaeda affiliated Islamic Courts Union's (ICU) rapid retreat in the face of Ethiopia's military campaign in Somalia has puzzled many observers. How could the Ethiopians roll up the jihadists so quickly? Pajamas Media has learned that one significant factor is that U.S. air and ground forces covertly aided the Ethiopian military since its intervention began on Christmas day.
U.S. ground forces have been active in Somalia from the start, a senior military intelligence officer confirmed. "In fact," he said, "they were part of the first group in."
These ground forces include CIA paramilitary officers who are based out of Galkayo, in Somalia's semiautonomous region of Puntland, Special Operations forces, and Marine units operating out of Camp Lemonier in Djibouti.
If Gartenstein-Ross' report is accurate, the United States probably wasn't taken by surprise over the Ethiopian offensive. At least, the Pentagon wasn't. Dahir Jibreel, a foreign secretary for Somalia's transitional government, said that U.S. planes have been striking targets since Christmas, and claims that the U.S. and Ethiopia had been planning this for months beforehand.
On the other hand, comments by various State Department officials before the conflict started suggest that a number of U.S. diplomats really wanted to avoid war and settle the ongoing inter-Somalian disputes peacefully. Were they lying? Or were the State Department and Pentagon just on totally different pages, with the former not knowing that the latter was gearing up for war? That's a disturbing scenario. Details are all still hazy, though.
Over at Tapped, Robert Farley responds to my earlier fear that having an even bigger military would cause the U.S. to adopt an even more hawkish foreign policy agenda. He's not convinced:
More generally, I just don't care for the argument that we ought to structure our military around the fear that a government will use it unwisely rather than around a careful analysis of national values and interest.
As an empirical question, I'm not sure there are any good examples of "wagging" in the sense that having military capabilities caused a state to eschew a better diplomatic option in anything but the most obvious sense. The development of military capabilities follows aggressive intent, not the other way around.
Okay. At the same time, though, much of the military's force structure wasn't put in place because someone in the Pentagon laid out specific foreign policy goals and then Congress judiciously decided what capabilities the military would need in order to pursue those goals. Much of the military was put together to address needs or goals or vague threats that might arise in the future. And many weapons systems were built because Lockheed Martin wanted more money and so decided to lobby the head of the appropriations committee to finance various new projects. Here's one example.
So lots of things—militarism, defense contractors, members of Congress who want to preserve bases or arms manufacturing plants in their districts—put upward pressure on the size of the United States armed forces. You don't get enough nukes to destroy the world a dozen times over and a military the size of the next 14 countries combined through a "careful analysis of national values and interest" alone. Now, as that outsized military grows and grows, I wouldn't be surprised if Pentagon planners take a look at all the new tools at their disposal and decide that maybe now some additional foreign policy pursuits might also be feasible. I doubt it's always a straightforward process, but here's one example.
Currently, there's much debate over whether China will pose a strategic threat in the future. Many experts say it won't. Nevertheless, the Navy and the Air Force, both of which need to justify their massive budgets, are consistently pushing the direst possible assessment of China's military capabilities. Now I don't think that high-ranking officials are consciously overstating the threat to fend off budget cuts. But let's say some mid-level strategist puts together a Power Point presentation showing that China will wreak havoc in 2015 unless the United States keeps a whole bunch of warships around as deterrence. I'd imagine this presentation will garner much more attention than one arguing that China is unlikely to pursue an aggressive foreign policy and the Navy probably has too many ships. And so the China hawks slowly gain sway.
In this case, the U.S. military's force structure isn't the only thing driving its foreign policy, but the two are interconnected, and there are elements of tail-wagging at work. Meanwhile, Chalmers Johnson's book, The Sorrows of Empire, argues that America's far-flung military bases have become self-sustaining enterprises that in turn often justify further interventions in those regions. I wasn't totally persuaded by Johnson's case, at least not as stated, but that's another tail-wagging-the-dog argument if anyone's interested.
Here's something I read offhand recently, and it sounded noteworthy, although I'll have to see if it's true. Apparently the labor inspection systems in Latin America and North America run on very different principles. In the United States, if an employer violates some safety standard, it gets fined or sanctioned by the government. Often, as with various coal mines in West Virginia, the employer simply prefers to pay the fine rather than fix the problem, and basically has the freedom to make that choice. But usually, employers adopt health and safety codes to avoid the penalties.
In various Latin America countries, on the other hand, employers can't just pay a fine and let that be that. The business is required to fix the problem, and has to work out a plan, with the help of inspectors, to comply with the rules. Penalties are mainly used only in cases when employers are breaking the law repeatedly and flagrantly. Otherwise, inspectors act more like consultants—designing a plan to help the company put in place adequate labor protections without becoming uncompetitive.
Now, again, I'd like to research the details before commenting more, but it seems like a fairly intriguing model for the developing world, and I'd be curious if anyone else knows anything about this. I'm skeptical that greater flexibility on labor standards always benefits everyone, rather than merely providing a way for businesses to trample over health and safety standards in the name of "economic efficiency", and at any rate, many Latin American countries have all sorts of appalling labor-standard problems, so it's not like the model works wonderfully everywhere. (Many countries also have too few labor inspectors, and corruption is an issue.) But it struck me as something worth exploring further.
Hmmm, freaky. Earlier today, Ann and I were riding the lovely subway when the train ground to a halt just outside the Mount Vernon stop. After a few minutes they ushered everyone in the front four cars out through the front, which had already reached the platform. Figuring there was some annoying technical issue afoot, we got off the train, exited the station, and just walked the rest of the way, blissfully unaware, it seems, that the back two cars of the train had actually derailed and crashed against the tunnel wall, trapping 60 people in the dark for an hour and sending 16 people to the hospital. Good to know that when catastrophe strikes I'm totally oblivious...
The New York Times reports on all the nifty new technologies that are making Japanese homes more and more energy-efficient. In part, high energy prices have "spurred the invention and development of things like low-energy washing machines and televisions and high-mileage cars... Japanese factories also learned how to cut energy use and become the most efficient in the world."
I particularly liked the heating system that uses sensors to direct heat only toward rooms with people in them. Presumably technology like that would become more commonplace in the United States if a carbon tax were ever put in place. In 2001, the average Japanese household used 4,177 kilowatt-hours of electricity, compared with 10,655 kilowatt hours in the United States. It's difficult to tell, though, how much of that difference is due to all of the fuel-saving gadgets, and how much to people living in smaller houses and constantly doing stuff on their own to save energy, such as recycling bath water for the laundry or putting on a sweater rather than cranking up the heat.
I still don't quite know what to make of Ethiopia's recent invasion of Somalia. Does anyone? Jonathan Edelstein lays out a few possible scenarios now that Ethiopia has driven the ruling Islamic Courts Union out of power and ushered the warlords from the transitional government back in. Here's his prognosis:
I'd still rate the most likely outcome as a sham Ethiopian withdrawal followed by an extended counterinsurgent conflict, with the [transitional government] remaining ineffectual and internally divided while the Islamist militias wage a guerrilla struggle with substantial public support.
This, in turn, will ensure that Eritrea continues to support local proxies against the Ethiopians, and that fighters from the greater Middle East will continue to be attracted by the widely reported (albeit erroneous) portrayal of the conflict as one pitting Somali Muslims against Ethiopian Christians. And needless to say, a prolonged counterinsurgency is the type of conflict that nobody wins, with major impact on regional food security as well as widespread death and displacement.
Oh, good. Death and displacement. Anyone hankering for less gloom might want to read his other, more optimistic scenario, in which Ethiopia withdraws, Uganda sends in a peacekeeping force to support the transitional government, and the African Union starts disarming Somalia's heavily-armed clans and militias. Now, according to this morning's headlines, a bunch of Somalis started rioting and setting stuff on fire after being told they might have to turn in their assault rifles, so the odds look long for the sunny prediction, but take your pick.
Anyway, enough with predictions. I'm still trying to figure out how this war got going in the first place. The full story is long and complex, but here's one aspect that strikes me as either an appalling screw-up or something more sinister. In early December, recall, the Islamic Courts had taken control of most of Somalia, save for the town of Baidoa, where the transitional government had shacked up, protected by Ethiopian soldiers. That month, the United States pushed a resolution through the UN Security Council intended to bolster the transitional government—which had been considered an ally in the "war on terror" for quite some time.
The UN resolution partly lifted the arms embargo on Somalia, and authorized a regional force to protect the transitional government. Many Somalis interpreted it as legitimizing the 8,000 Ethiopian troops already in the country, and it seems plausible that it inflamed the situation, even though it technically barred Ethiopia from taking part in the peacekeeping force. (At the time, onlookers such as the International Crisis Group warned that it would lead to further conflict if passed.) Other Security Council members, such as Britain, refused to cosponsor the resolution, finding it too inflammatory. Among other things, it came a week before the ICU and the transitional government were scheduled to meet for talks in Khartoum, effectively torpedoing that route.
One strange bit is how the resolution was passed in the first place. It came with a 90-page report from a Monitoring Group that built the case against Somalia's ruling Islamists. As best as I can tell, the group consisted of four observers who relied largely on Ethiopian and U.S. intelligence, and made a number of sensational claims that were disputed by experts—including a claim that the ICU had sent 720 fighters to fight alongside Hezbollah in Lebanon (how a relatively small militia struggling to maintain order at home could afford to send that many fighters abroad is unclear) and invited two Iranian agents to come look for uranium in exchange for arms.
Perhaps these claims are true. Then again, scare stories about U.S. foes don't have the best track record of late. But the State Department decided that negotiations with the ICU were out of the question, the UN resolution was passed, and Ethiopia eventually invaded. (Ironically, the Monitoring Group had warned strongly against loosening the arms embargo, also warning that it could lead to war.) That's not to say the UN Resolution caused the war, as Ethiopia had its own reasons for declaring the ICU a security threat, but it sure seems like the Security Council mishandled an opportunity to defuse the crisis.
(Many Ethiopians, for their part, believe the invasion mainly took place so that Prime Minister Meles Zenawi could "distract people from a vast array of internal problems and to justify further repression of opposition groups." Dubiousness all around.)
Anyway, who knows how this will all turn out? Perhaps a handful of Somalis will come to resent the United States' role in all this and sign up for jihad against the West. Or maybe the same warlords who spent decades trashing Somalia will put the country on a path to peace and stability. But I'm curious to know what people thought the UN resolution would accomplish, because it sure looks like it was a rather provocative move, rammed through the Security Council over serious objections, based on what looks like a fishy intelligence assessment. Maybe that's unfair. But I haven't seen much good reporting on what, exactly, the United States expected to happen.
I must've missed it when it came out, but this story is more than a little unsettling. Last month, The Project on Government Oversight charged that a nuclear warhead almost exploded in 2005 while it was being dismantled at the government's Pantex facility near Amarillo, Texas:
The Project on Government Oversight says it has been told by knowledgeable experts that the warhead nearly detonated in 2005 because an unsafe amount of pressure was applied while it was being disassembled ...
An investigator for Project on Government Oversight says the weapon involved was a W-56 warhead with 100 times the destructive power of the atomic bomb dropped on Hiroshima.
The plant itself, which is managed by private contractor BWX Technologies and used to decommission nuclear weapons, was fined $110,000 for violating safety procedures during disassembly, and "is now being investigated by the Department of Energy for a number of other alleged safety problems."
The Los Angeles Times followed up: Employees characterized conditions at the plant as "degraded," and a letter to the president of BWXT complained that engineers were required to work up to 84 hours a week, and production technicians 72 hours a week. According to the Times, the letter also stated that "some managers lacked specific experience in handling nuclear weapons." But BWXT denied the allegations, and Energy Secretary Samuel Bodman expressed "confidence that Pantex will continue its outstanding work." No need for alarm!
In other nuclear safety news, Linton Brooks, head of the National Nuclear Security Administration, has recently been dismissed "because of security breakdowns at the Los Alamos laboratory and other facilities." In June, Brooks was reprimanded for failing to report a security breach of computers in New Mexico that resulted in the theft of personal data for 1,500 workers. Last fall, "[d]uring a drug raid, authorities found classified nuclear-related documents at the home of a former lab employee with top security clearance." Exciting times in nuclear safety and security, no?
This is rather stunning. Devah Pager and Bruce Western recently conducted an experiment in New York City in which they sent a bunch of black, white, and Latino male volunteers out with fabricated resumes to apply for various low-wage jobs. To mix things up, some of the volunteers had criminal records on their resumes, others didn't.
Well, they found that white applicants were more likely to get callbacks from employers than Latino applicants with comparable resumes, and Latinos were more likely to get callbacks than black applicants. Typical discrimination, one might say. But it gets worse: even white applicants with felony convictions are more likely to get callbacks than black applicants. Pager did a similar study a few years ago in Milwaukee and found the same thing: A white male with a felony conviction is more likely to get a callback than a black male with a clean record. (Black men with felony convictions are, of course, totally screwed and almost never get callbacks.)
This story is at least six months old, but Betsy Bowman and Bob Stone had an article in Dollars & Sense last year about the growth of "cooperatives"—worker-owned businesses—in Venezuela. So far, over 108,000 co-ops representing 1.5 million members have registered with the state. Most of them are small, set up through government loans, but here's how the larger ones are set up:
Since January 2005... MINEP has stood ready to help workers take control of some large factories facing bankruptcy. ... In one instance, owners of a shuttered Heinz tomato processing plant in Monagas state offered to sell it to the government for $600,000. After factoring in back wages, taxes, and an outstanding mortgage, the two sides reached an amicable agreement to sell the plant to the workers for $260,000, with preferential loans provided by the government.
In a more typically confrontational example, displaced workers first occupied a sugar refinery in Cumanacoa and restarted it on their own. The federal government then expropriated the property and turned it over to cooperatives of the plant's workers. The owners' property rights were respected inasmuch as the government loaned the workers the money for the purchase, though the price was well below what the owners had claimed. Such expropriated factories are then often run by elected representatives of the workers alongside government appointees.
So it doesn't seem like the government is just seizing existing factories willy-nilly and handing them over to workers—mostly just the ones about to close or go bankrupt, which amounts to a mere 700 so far. (Of course, if workers in healthy firms ever decided to lobby to "cooperatize" their employers, that would certainly create controversy.) Some co-ops, meanwhile, are required to set aside a part of their profits to fund health, education, and housing for the local population, although the details here are unclear.
Anyway, it's an interesting sidenote to James Surowiecki's New Yorker column this week about how Chavez's government isn't really all that radical, even if a few lefty ideas are being put into practice. I was curious, though, as to how many worker-owned firms there were in the United States. Back in May, the New York Times ran a story on Reflexite, which is owned by its 500 employees and seems to do well. "Employees who own a big share of their company," the Times reports, "are more likely to innovate, stay focused on quality and hold management accountable."
More common are ESOP firms, described by Gar Alperovitz in America Beyond Capitalism. Various bits of legislation since 1974 gave tax incentives to companies that contribute stock to an employee trust, known as an Employee Stock Ownership Plan. In some companies, workers own virtually all of the stock, and hence, the whole company—as with Gore-Tex, which has 6,000 employees and generally wins plaudits as being a great place to work. All told, about 3,000 of the 11,000 ESOP firms in the country are majority-owned by workers.
Now only about 40 percent of these companies give full voting rights to their worker shareholders, but even so, one 1998 survey found that median hourly pay at ESOP firms was 4 to 18 percent higher than pay for comparable work at other firms. Early studies suggest that combining worker ownership with employee participation actually boosts productivity. And the concept has been backed by everyone from William Greider to William F. Buckley. Several state programs, such as the Ohio Employee Ownership Center, already do what the Ministry of Popular Economy in Venezuela does—provide support for worker ownership—on a smaller scale.
Anyway, this is a sort of rambling post with no real point. I'd be interested in a more direct comparison between worker-owned firms and cooperatives in, say, Venezuela. For one, I imagine ESOP firms aren't nearly as democratic, seeing as how higher-salaried workers will likely earn more stock and hence, have more voting power (it's not clear how power is split in Venezuelan co-ops). And I'd be interested to know more about Bowman and Stone's claim that cooperatives have helped lower the unemployment rate in Venezuela. To what extent? And yes, before it comes up, I know Chavez's human rights record is less than sterling, but that seems rather irrelevant to this issue, at least.
Sorry to be a downer, but this NPR report on Jose Padilla is easily the most disturbing thing I've read all day. Padilla, as we've learned, has basically gone insane during his time in U.S. military custody, after four years of stress positions and "total sensory deprivation." But here's the government's defense:
The government maintains that whatever happened to Padilla during his detention is irrelevant, since no information obtained during that time is being used in the criminal case against him.
Er... so there was no real point in holding him indefinitely, without charges? Is that what's being said here? Not exactly:
Indeed, there are even some within the government who think it might be best if Padilla were declared incompetent and sent to a psychiatric prison facility. As one high-ranking official put it, "the objective of the government always has been to incapacitate this person."
Digby points out that the Soviet Union had a term for this: Psikhushka, psychiatric hospitals, which were used to "isolate political prisoners from the rest of society, discredit their ideas, and break them physically and mentally." Here's Wikipedia: "The 'treatment' included various forms of restraint, electric shocks, a range of drugs... that cause long lasting side effects, and sometimes involved beatings." As with Padilla, the objective was always to incapacitate the person. But hey, comparing U.S. policy with Soviet Russia is overly shrill, so maybe we should just drop it.
John Schmitt and Ben Zipperer of CEPR have a new paper estimating that "one in seven union organizers and activists are illegally fired while trying to organize unions at their place of work." Technically speaking, it's against the law to fire a worker for being involved in an organizing drive. In practice, though, employers just have to cop to a small fine—back pay averaging around $2,500 per firing—if they break this law. Most businesses are happy to pony up and take out a few key pro-union employees, which usually disrupts the campaign and intimidates the other workers. It's certainly cheaper than granting union recognition.
Under the Bush administration, the CEPR paper found, illegal firings have risen substantially—largely because the current National Labor Relations Board has little interest in punishing union-busting employers. It's no wonder so many unions prefer card-check elections, in which a majority of workers simply have to sign cards signaling their approval in order to win union representation. The current system of NLRB elections drags on for much too long, giving the employer time to stall, figure out who the culprits are, and do a few illegal firings—which, it seems, happens quite frequently.
Via Ezra, the Wall Street Journal reports on a sparkling new computerized scheduling system being used by Wal-Mart, Radio Shack, and other retailers. The program will track customer habits, predict surges and lulls in store activity, and adjust worker schedules accordingly. It's good for the stores, which can now avoid having too many employees on the floor during down times, but some observers fear it could be hell on the workers:
But while the new systems are expected to benefit both retailers and customers, some experts say they can saddle workers with unpredictable schedules. In some cases, they may be asked to be "on call" to meet customer surges, or sent home because of a lull, resulting in less pay. The new systems also alert managers when a worker is approaching full-time status or overtime, which would require higher wages and benefits, so they can scale back that person's schedule.
That means workers may not know when or if they will need a babysitter or whether they will work enough hours to pay that month's bills. Rather than work three eight-hour days, someone might now be plugged into six four-hour days, mornings one week and evenings the next.
There's nothing wrong with computerized scheduling per se. In an ideal world the program would try to foresee when things will be busy and when they won't, and simply draw up a more predictable schedule. In restaurants, for instance, too many waiters often show up on slow days, in which case either someone has to go home or everyone has to split the lower-than-usual tip jar. If a computer could prevent this from happening, wait staff would benefit. Wal-Mart, on the other hand, seems to be using the new system to put workers "on call" or limit their hours. That's very different, and exceedingly pernicious.
Wal-Mart—like many other retailers—already demands that employees set aside large chunks of time each week to be available to work, even though they may only end up working a fraction of that time. To a lesser degree, this was the case in every retailer I've ever worked for: Workers had to be "on call" occasionally and got sent home if things were slow. Ideally, employees would get paid at least something for sitting at home "on call", but without bargaining power, they can't really demand this. They just suffer through the uncertainty instead. (Erratic scheduling also takes a nice little toll on one's health—an added bonus.)
Another problem, of course, is that 40 percent of Wal-Mart's employees will soon be part-time workers. Many of them—and many of the full-time workers, too—need to find second or even third jobs to make ends meet. Of course, it becomes near-impossible to find another job when you have to sit around "on call" and can't predict your schedule from week to week. Ah, but at least the Bureau of Labor Statistics can record an uptick in "productivity", and economists can then sit around and wonder why median wages aren't going up too. So it's all good...
Anyway, Ezra has the right take on this: Service workers need "the bargaining power and voice to demand--and receive--better treatment." Interestingly, back in the 1930s, many of the striking unions were concerned less with wages than with mistreatment at the hands of capricious and tyrannical employers. Of course, Harry Bennett's "servicemen," who would beat Ford factory workers for talking on duty, were a fair bit more odious than Wal-Mart managers poring over schedules, but the principle's basically the same.
He's hardly the first to point this out, but Robert Fisk says that now that Saddam Hussein's been executed, he can no longer expose the West's role in propping him up—or explain how the United States sold Iraq the chemicals he would later use to gas the Kurds in Halabja. Or tell a million other rather inconvenient tales.
That's certainly true, although if we're talking about secrets that Saddam will now take to his grave, I've always been interested in his meeting with April Glaspie, the U.S. ambassador to Iraq under the first Bush administration, which took place on July 25, 1990, a month before Iraq's invasion of Kuwait. By some accounts, it sure seems like Glaspie basically implied that the U.S. would stay neutral in the event of war. "We have no opinion on your Arab-Arab conflicts, such as your dispute with Kuwait," reads one version of the transcript of the meeting.
In retrospect, the first Gulf War against Iraq has always been the "no-brainer" war—the war all right-thinking people agree was perfectly justified. Or so goes the conventional wisdom. But it's entirely possible that the first Bush administration might have prevented the whole war in the first place by warning Saddam in no uncertain terms that an invasion of Kuwait wouldn't be tolerated. Then again, James Akins, the Saudi ambassador at the time, told PBS that no diplomat in Glaspie's position would have ever threatened war. And in a later interview, Joe Wilson insisted that Saddam "did not think he was getting a green or a yellow light."
So maybe Kenneth Pollack's right and the invasion of Kuwait was a wholly irrational act that couldn't have been deterred. (And perhaps world history would've taken a turn for the worse had there been no Gulf War and Iraq had developed nuclear weapons.) Still, it would have been valuable to at least try and get a fuller picture of this story from Saddam himself. I suppose people could still ask Tariq Aziz, since he was at the July 25 meeting too, and he hasn't been executed yet.
This is really quite the story. For several years now, a bunch of towns in rural Pennsylvania have been passing various "anti-sludge" laws to prevent corporations from dumping toxic waste all over their fields. These include bans preventing companies with previous environmental violations from operating in town, and laws requiring companies to test their waste for safety. Well, businesses weren't too pleased, and some of them sued, arguing that the laws violated their rights to "equal protection, due process... and rights guaranteed under the commerce clause."
In return, the townships of Licking and Porter did something rather radical, passing a town ordinance stripping corporations of their personhood. Bam! "Corporations shall not be considered to be 'persons' protected by the Constitution of the United States." As Barry Yeoman notes, the Supreme Court has recognized these protections for well over a century:
In the landmark 1886 Supreme Court case Santa Clara v. Southern Pacific, a railroad company refused to pay a special county tax in California, arguing (much as sludge hauler Synagro would do in Pennsylvania more than a century later) that to treat it differently from everyone else violated its constitutional rights. Speaking from the bench, Chief Justice Morrison Waite announced, "The court does not wish to hear argument on the question whether the provision in the 14th Amendment… applies to these corporations. We are all of the opinion that it does."
After Santa Clara, federal judges began granting more and more rights to nonliving "persons." In 1922, the Supreme Court ruled that the Pennsylvania Coal Co. was entitled to "just compensation" under the Fifth Amendment because a state law, designed to keep houses from collapsing as mining companies tunneled under them, limited how much coal it could extract. In 1967 and 1978, businesses prevailed in Supreme Court cases citing the search-and-seizure provisions of the Fourth Amendment as protection against fire and workplace safety inspections.
Corporate lawyers have also taken a shine to the First Amendment. In 1978, the Supreme Court agreed with corporations claiming that the state could not limit their political spending in an antitax campaign. Almost two decades later, a federal appellate court struck down a Vermont law requiring that milk from cows treated with bovine growth hormone be so labeled. Dairy producers had a First Amendment right "not to speak," the court said. In California, Nike invoked the First Amendment to fight a lawsuit arguing that the company's public relations materials misrepresented sweatshop labor conditions.
Honestly, I don't really know the ins and outs of this issue. It's plausible that stripping corporations of all their constitutional 'rights' would lead to some serious headaches. On the other hand, it's a gross injustice when a town can't even require sludge-dumping corporations to test their toxic waste to make sure kids who ride their bikes through the local fields don't get sick. Either way, the finer points aren't up for debate at the moment. The Licking and Porter ordinances don't stand a chance in the courts. The Supreme Court certainly isn't going to overturn Santa Clara anytime soon.
So what's the point? Activists are hoping that these anti-personhood ordinances will generate further discussion, and prompt more grassroots organizing and local lawmaking, and hopefully, one day, create pressure for changes to state constitutions. Maybe even the federal one. It's all very much thinking about the long run. But it's not entirely quixotic. Already, according to Yeoman, Democratic parties of Maine, New Hampshire, and Washington have passed resolutions opposing corporate personhood. It's a start, at least.
Last year, Goldman Sachs handed out $16.4 billion in bonuses to all its employees. As Laurent Fischer points out, "That amount exceeds all corporate donations to charity in 2005. It is also nearly the entire worldwide amount spent annually to fight the global HIV/AIDS epidemic." But, of course, that money definitely should not go to HIV-prevention in Africa. It should go to those hardworking Goldman Sachs execs who dreamed up the bright idea of advising state governments to privatize their highways over the wishes of voters while, on the side, getting rich investing in toll roads...
Actually, never mind. The point of this post isn't to call for more class warfare. We'll save that for another day. Mostly I wanted to link to Emily Oster's piece in Esquire about AIDS in Africa, because it seemed to make some important points.
Here's the gist: To a certain degree, and without exaggerating things, HIV is relatively hard to transmit. Drug users sharing needles are at very high risk. Sex workers are at very high risk. Gay and bisexual men are at high risk. So are women having unprotected sex with high-risk men. But all else being equal, an HIV-free heterosexual man who has sex with a woman actually has a relatively low (though hardly nonzero) chance of contracting the virus. There is one major exception, though: If the sexual partners have other STDs—and the open sores that come with them—transmission becomes much, much more likely.
Well, Oster discovered that the difference between HIV infection rates in the United States and Africa can be largely explained by a greater prevalence of other STDs, especially herpes, in Africa. Because many STDs cause open sores, HIV transmission is four to five times more likely for each sexual encounter in Africa. If Oster's right, that means one of the most effective HIV-prevention programs might actually be to just treat those other STDs. The good news is that herpes, say, can be treated for as little as $3.50 a year, as opposed to the $300 a year it costs to treat AIDS. And it could save more lives. Although it's not clear that donors would get as enthused about opening their checkbooks to pay for herpes treatment.
On the other hand, it's hard to figure out how big a role STDs play, and how important sexual behavior is. In another section, Oster suggests that Africa needs to "encourage safe sexual behavior" in order to reduce HIV infection rates down to U.S. levels. The problem is that "there have been very limited changes in sexual behavior on average." Granted, the United States bears some blame for promoting abstinence programs that don't work in Africa, but Oster says the biggest problem is simply poverty—people living in desolate areas with low life expectancies have little incentive to change their behavior. Not the happiest of conclusions.
The latest fashionable policy idea is to expand the size of the U.S. Army by adding anywhere from 30,000 to 90,000 more troops. The president likes it. Democrats like it. All indicators point to go. So it's nice to see Gordon Adams and John Diamond are taking the time to point out that this idea is, in fact, a terrible one.
Or, at the very least, the debate over troop levels is a terrible one. Congress shouldn't be talking about adding more troops until it has a decent sense for why the Pentagon might need those troops. All those new soldiers, after all, won't be trained in time to help out in Iraq. So what will they be for? One could see the case for additional troops if, say, the United States planned to invade and occupy more countries in the future. But if so, that should be stated clearly. As Adams and Diamond say, "If this is about invading Iran, or carrying out a land war with China... then maybe we need to have a debate about that strategy, not slip it in sideways by expanding the Army without agreeing on the mission."
In any case, I tend to doubt that 90,000 more troops would make future Iraq-style invasions feasible—let alone desirable. On the other hand, if the United States government had a bigger Army at its disposal, it would view the military as the solution to an even greater number of foreign policy problems. As Madeleine Albright once asked Colin Powell, "What's the point in having this superb military you are always talking about if we can't use it?" That's some serious tail-wagging-the-dog action there. Why bother seeking out a non-military solution in this or that little corner of the world when we can always send in a couple thousand troops?