A little while back, I saw a copy of A Free Nation Deep in Debt: The Financial Roots of Democracy lying around the office. Looked thrilling, but it also clocked in at nearly 600 pages—roughly sixty times the amount I can read without developing mysterious headaches and begging for mercy. Sad, but true. Fortunately, James K. Galbraith wrote a quick four-page review of the thing, so we can see what the big deal is.
James MacDonald's grand unified theory of the universe, basically, is that democracies generally arise because countries go into debt when they need to fight wars and stuff. Historically, states have needed extra cash, so they go to the citizens, and since it's hard to borrow a lot of money and not give your creditors the right to vote, voila, democracy. Wealthy oil states need not apply. That's the nickel version.
It's an interesting idea. Might even be true. It does help explain why so many European states expanded suffrage after the First World War; they had to issue a bunch of war bonds to their citizens during the 1910s, after all, and couldn't very well continue to suppress the vote. The theory also predicts that democracies will generally out-survive other forms of government, because, with readier access to credit, they have the resources both to fight expensive wars and to invest in public infrastructure. And as Rabelais once pointed out, the grand thing about being a debtor is that people care a great deal about your well-being.
Galbraith likes the book, but wonders what will happen now that the United States government depends less and less on American citizens to finance its public debt, and more and more on China and Japan. "Can a country… be truly democratic if it is in hock to banks and foreigners?" That seems like an odd question, but maybe it will make more sense if I read the book.
Sometimes the only thing to do is quote, so let's quote. Jack Balkin:
I am puzzled by and ashamed of the Democrats' moral cowardice on this [torture] bill. The latest version of the bill blesses detainee abuse and looks the other way on forms of detainee torture; it immunizes terrible acts; it abridges the writ of habeas corpus-- in the last, most egregious draft, it strips the writ for alleged enemy combatants whether proved to be so or not, whether citizens or not, and whether found in the U.S. or overseas.
This bill is simply outrageous. I doubt whether many Democratic Senators or staffs have read the bill or understand what is in it. Instead, they seem to be scrambling over themselves to vote for it out of a fear that the American public will think them weak and soft on terror.
The reason why the Democrats have not been doing very well on these issues, however, is that the public does not believe that they stand for anything other than echoing what the Republicans have been doing with a bit less conviction. If the Republicans are now the Party of Torture, the Democrats are now the Party of "Torture? Yeah, I guess so." Not exactly the moral high ground from which to seek office.
I don't actually know whether a strong anti-torture stance will help the Democrats win elections. But if it's an electoral loser, then elections probably aren't worth winning anyway. Because—surprise!—a country that tortures suspects who have no way to challenge their detention in court is still a sick and depraved nation even with Nancy Pelosi in the Speaker's chair. As Atrios says, if Barack Obama and other supposedly upstanding "faith-based" Democrats don't even try to put a stop to this thing, "that's the last time I want any fucking lectures from the faith and values [crowd]." It's all unseemly and shrill but entirely apt.
Okay. Sometime soon, the Supreme Court will decide whether public employee unions can spend worker dues on political causes without permission. Some people say it's unfair for unions to spend money on causes their members don't believe in, and that the state should be able to stop the practice. These people often just happen to be right-wingers hoping to dry up a major source of funding for the Democratic Party. Shocking, I know. Then again, it might be a good thing for labor if unions were forced to be more responsive to their members. What do I know?
But if more democracy is good for unions, surely it ought to be good for everyone else too. Why not require that all corporations get permission from their employees before spending money on, say, lobbyists? After all, every dollar spent cozying up to Senator Tax Loophole is a dollar not spent on wages. It's a "due," in other words. There's no difference. And surely the shareholders should also have a say in any and all corporate political giving, no? Permission slips all around.
Honestly, I wouldn't be surprised if labor unions performed better with more accountability in certain respects. (In a more benign world, it might even be worth experimenting with entirely voluntary union dues, as in some European countries—except that this isn't a more benign world.) Either way, though, research suggests that more "democratic" corporations perform better. No doubt the wisdom of crowds can help corporations meddle more "intelligently" in the political world too. I mean, so long as we have to have meddling. I'm maybe 40 percent serious.
Back at my old job, in the course of doing research on the defense budget, I came across a few congressional staffers who mentioned that John McCain would often rail loudly and stridently for hours and hours against all the waste and fraud in the annual defense bills, but when it came time to do something about it, he'd usually just sit back and fold his hands. There were a few exceptions—his staff did do an exceptional job untangling the Boeing tanker scandal—but that was the basic pattern. Bloviate and do nothing. He's a real man of "principle," you see.
And now we find, via Marty Lederman, that McCain's latest "compromise" with the torture-happy Bush administration pretty much fits that pattern perfectly. The Senate's latest bill would eliminate habeas rights for detainees, and, crucially, allow the president to "interpret the meaning and application of the Geneva Conventions." U.S. courts wouldn't be able to step in if Bush's interpretation was, you know, totally and utterly wrong and illegal. The administration is confident that the "compromise" will let it keep running its interrogation program—which, from all accounts, violates the Geneva Conventions. The "dissident" Republicans in the Senate just gave the thumbs up to war crimes. How bold of them.
So... we can see where this is going. The real scandal is that some unshaven lefty on the streets somewhere might dub this whole thing "fascist." Such foul language. Tsk tsk.
It's common for aid organizations to define the "extremely poor" as those people around the world who subsist on less than a dollar a day. The U.N. Millennium Project estimates that there are 1.1 billion people below this threshold, and wants to halve the number of "extremely poor" by 2015. Who can argue with that? The problem, though, is that the "dollar a day" metric is terribly imprecise, as David Singh Grewal mentions in a thoughtful essay on globalization:
In fact, for such an assessment [of global poverty], the dollar-a-day measure is "meaningless," as the economist Sanjay Reddy and the philosopher Thomas Pogge have argued in a series of papers, because the figure uses purchasing power parity (PPP) ratios in order to convert foreign purchasing power into U.S. dollars.
The problem is ultimately one of aggregation: to value real purchasing power (the ability to buy goods and services) in one currency in terms of another, economists compare representative baskets of goods and services—cars, movie tickets, doctor visits, heads of lettuce, and so on—purchased in each country for a given amount of currency. But many of the goods and services that enter into these calculations may be irrelevant to the poor, because no poor person consumes them, and so they skew poverty calculations that include them. Reddy and Pogge conclude that it is impossible to construct a meaningful poverty line of the dollar-a-day type that uses PPP ratios.
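To make the aggregation problem concrete, here's a minimal toy sketch in Python, using entirely made-up prices for a hypothetical country (call it Localia) rather than anything drawn from the actual PPP data. The point is only that the conversion factor, and with it the local value of the "dollar a day" line, depends heavily on which basket you choose:

```python
# Toy illustration of the Reddy-Pogge aggregation problem.
# A bilateral PPP conversion factor is the local-currency cost of a
# reference basket divided by the dollar cost of the same basket.
# All prices below are invented.

goods = {
    # name:      (us_price_$, local_price, basket_quantity)
    "rice":       (1.00,   10.0, 5),
    "lettuce":    (1.50,   12.0, 2),
    "haircut":    (15.00,  30.0, 1),
    "movie":      (10.00, 150.0, 1),  # rarely consumed by the poor
    "car_rental": (50.00, 900.0, 1),  # rarely consumed by the poor
}

def ppp_factor(names):
    """Local-currency cost of a basket divided by its dollar cost."""
    dollars = sum(goods[n][0] * goods[n][2] for n in names)
    local = sum(goods[n][1] * goods[n][2] for n in names)
    return local / dollars

broad = ppp_factor(goods)                          # the full basket
poor = ppp_factor(["rice", "lettuce", "haircut"])  # what the poor buy

print(f"broad basket: $1 of purchasing power = {broad:.1f} local units")
print(f"poor basket:  $1 of purchasing power = {poor:.1f} local units")
```

With these invented numbers the two baskets disagree by a factor of three, so a "dollar a day" poverty line drawn with the broad basket would count a very different set of people as extremely poor than one drawn only from the goods poor households actually consume. That, in essence, is Reddy and Pogge's complaint.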
I'd probably go even further than Reddy and Pogge. The mere fact that a person lives on $1 a day tells us relatively little about that person's actual condition. After all, a dollar a day will go much further in a stable country with generous social services than in a basket case like Angola. On the flip side, World Bank studies have charted a decline in "extreme poverty" since the 1980s, ostensibly thanks to free markets and the like. But most of that decline has occurred in China and India, where the amount of money needed to maintain a decent standard of living may well have risen even more dramatically, so that even, say, $3 a day still leaves one in dire straits.
But let's take this further. Consider the people living in extreme poverty—a billion people or so. Many of them live in rural areas. They grow their own food and live in their own homes and have their own communities. To be sure, they're all really fucking poor: they suffer from economic insecurity and varying degrees of political insecurity, and they could undoubtedly use better health care and education. But without at all trying to romanticize this sort of existence, living in a farming or fishing community could, conceivably, be preferable to life as a Third World urban-dweller in an overcrowded slum, foraging for trash to sell on the black market, stealing food, and contracting various exotic diseases for lack of clean water.
Economic growth, of course, usually uproots rural life. It's been happening for decades in the developing world. Corporations move in and start appropriating natural resources; trade and large agribusinesses put farmers out of business; young folks are lured away by jobs in the city; all sorts of processes disrupt traditional communities. Urbanization proceeds apace. Regardless of the merits of this process, though, it's entirely possible that many people end up worse off in the short term—pushed into a life of urban squalor, with no escape in sight—even as "extreme poverty," the number of those living on less than $1 a day, declines.
Now lots of developed countries have gone through the changes just described, and even though the transition period can, like puberty, get ugly, it might well be the only way countries can get rich and prosper. I don't think that's true (in fact, I think that urbanization and slum-ification will be much more painful for the Third World than it was for, say, Britain, because there's no "New World" to siphon off all those excess city-dwellers brought about by industrialization), but that's not the point—the point is that official "extreme poverty metrics" could easily obscure much of what's going on, and might even steer experts towards faulty prescriptions in the never-ending quest to "make poverty history".
Weapons of mass destruction are, of course, bad things. The name is a dead giveaway. But over on the Guardian's new blog, Richard Wilson writes about what's probably an even more important arms control issue: the unregulated global trade in small arms. Cheap light weapons—assault rifles, submachine guns, grenades, and the like—are responsible for some 300,000 deaths around the world each year. Here's Wilson:
"It's the white people supplying the weapons in Africa - now you're going to feel what it's like," my sister Charlotte was told, shortly before being gunned down by members of the Forces pour la Liberation Nationale (FNL) armed group in war-torn Burundi. The UK post-mortem found that she had been shot seven times in the back with an eastern European semi-automatic rifle. Her killers may have been illiterate members of a ragtag peasant army, but they knew where the guns were coming from. ...
More than 300,000 people - mostly civilians - have died in Burundi's bloody conflict since 1993. In the wider region - Rwanda, Uganda and the Democratic Republic of Congo - the death toll runs into the millions. The financial cost, too, is devastating. Across Africa, $15bn is lost every year through the impact of war, cruelly undermining prospects for economic development. Poverty, inter-ethnic rivalries, and a culture of impunity all play a part in fuelling the violence. But without the ready and abundant supply of guns and ammunition, these conflicts would be far less deadly.
That's an appropriately dark take, I think. When it comes to gun control here in the United States, I'm sometimes swayed by anti-gun-control arguments. Sometimes. Abroad, though, that whole "guns don't kill people" slogan sounds cheap and risible. Without easy access to cheap weapons, armed militias would rarely have the means to, say, try to take over a state—as Charles Taylor's gang did in Liberia, and as the Taylor-backed RUF did in Sierra Leone. Without a global surfeit of guns that cost less than the price of a live chicken, people couldn't afford to arm child soldiers, or drag civil wars out for decades. The small arms trade really is the root of much evil.
But global gun control is a tricky matter. Amnesty International recently looked at the increasingly sophisticated black market in small arms, which has been supported by states that employ private contractors to broker arms deals and deliver the guns. Basically, governments are nurturing a complex network of private intermediaries—arms brokers, logistics firms, transport companies—who then turn around and use those same networks to traffic illegally in small arms. It's all highly unregulated. It's all a mess.
One would think that the UN would be an ideal forum for clamping down on the global small arms trade. But the United States, spurred on by the NRA, has been implacably opposed to global arms control. John Bolton made that clear in 2001, in a stunning anti-arms control speech to the UN. China and Russia have also staunchly opposed sensible regulations, such as creating a global registry to track arms and ammunition, so that trafficking networks can be uprooted more easily. (Russian arms manufacturers still depend heavily on the global arms trade, especially since the end of the Cold War.)
Then there's the fact that arms control will inevitably founder so long as the supply of guns continues to expand. In 2001, Bolton said the U.S. would oppose "measures that would constrain the legal trade and legal manufacturing of small arms and light weapons." Seems fair, no? What's wrong with "legal"—government to government—arms deals? Well, for one, as AI pointed out, the "legal" trade has helped bolster the black market. Second, arms sold to governments—say, assault rifles sold to Algeria by Italy—often end up in private hands due to "leakage". Sixty percent of the 639 million small arms in existence are now privately owned. Most didn't start that way.
Now the "war on terror" has given G-8 governments a new excuse to ramp up their sales small arms to "friendly" governments. In the U.S., Congress seems less eager nowadays to restrict weapons sales to repressive countries such as, say, Egypt and Pakistan—countries notorious for "losing" weapons that end up in arms bazaars and private hands. Granted, regulating these sorts of deals wouldn't be enough to stop the trade in light weapons—after all, hundreds of millions of privately-owned small arms are still floating around, and they can always migrate from conflict to conflict even if the U.S. never sold another gun to anyone—but it would be a start.
So... it's a complex issue. I certainly don't know how one would go about fixing things, although a lot of smart people no doubt do. But on the off chance that, say, a leading superpower were concerned about creating a liberal world order and stemming the sharp rise in the number of failed states, it might seem appropriate to take a keen interest in the small arms trade. But that would mean taking on arms manufacturers and, inevitably, the NRA. It's also sort of a dull issue—less exciting than confronting evildoers and gearing up for war, that's for sure—so that's a bit of a problem.
In the 1970s, Section 8—the federal rental-assistance program—was started as an alternative to public housing. Government officials thought that massive housing projects were just clustering the poor into cramped and crime-ridden environments, and that a better solution would be to bulldoze all that public housing and give the families housing vouchers instead. Then poor families could be "scattered" across the country, placed in better neighborhoods with better schools for their kids. It would be a great success. Even some conservatives liked it.
Or at least that was the idea. In practice, it hasn't always worked that way. Fred Kelly had a fantastic article in the Charlotte Observer recently, examining the data on Section 8 recipients in Charlotte and finding that the vast majority of them "are clustered in 10 ZIP codes already burdened with crime and blight." Only a handful get placed in nicer neighborhoods. So much for the whole "scatter the poor" theory.
By law, cities aren't supposed to do this—Baltimore and Chicago have both been sued over this exact issue—but obviously it's happening. To some extent, blame lies with the Bush administration for creating a "funding crisis" in the Section 8 program. With less money, local housing authorities increasingly can't afford to place people in middle-class neighborhoods (and many people who need housing don't receive assistance at all). But the city of Charlotte, it seems, also hasn't made much effort to build affordable housing around the city. I wonder if they're exceptionally dysfunctional in this regard or not. I'm guessing not.
They certainly don't reach the most surprising conclusions the world has ever seen, but Jonathan Gruber and Daniel Hungerman have an interesting paper on "blue laws"—laws that states historically enacted to limit commercial activity on Sunday, in order to get everyone to go to church. The authors find that once states began repealing their blue laws (the Supreme Court upheld the laws in 1961, but states scrapped them one by one in the decades that followed), people started going to church far less frequently. That seems commonsensical—when there are other things to do on Sunday, church sounds less appealing—but now there's data too, so that's fun.
The authors also discover that drinking and drug use increase among the people who stop attending church after the blue laws are repealed. In the past, Gruber has done a bunch of work showing a strong correlation between church participation and a bunch of positive outcomes, such as improved health and lower criminal activity, so this sort of jibes with that. Sitting in a pew, praying, and belting out the odd hymn, it seems, makes you an upstanding citizen.
Or does it? I'd like to get a fuller picture of what's going on here. What is it about churches that leads to such positive outcomes? After all, as I've noted before, churches actually provide a fair amount of income insurance for their members. If times are tough—if you lose your job, say, or your child contracts some horrible illness—people in the congregation will pass the hat around and help you out. Research shows that this happens frequently, and that the effect is quite significant. Surely that would explain some of these positive outcomes, no?
Anyway, the Gruber-Hungerman paper discusses competition between secular businesses and churches. There's also a question of how churches compete with public services. I've always assumed that in many parts of the country, people join big megachurches, say, because they often provide certain valuable services—like child care and support networks. This, in turn, can make churchgoers less supportive of the welfare state, since they already receive many services through the church. And if people are all voting to hack up the welfare state, then more and more people, in turn, become reliant on church. It's a vicious—or virtuous—cycle. But does what I just described actually happen? It would make for a fascinating study, I think.
So. This morning I decided I'd get up and make myself a nice spinach and mushroom omelette for breakfast. Yum yum. Then I log on to the vast and mighty internet and see that spinach across the nation has been contaminated with E. coli, leaving spinach-eaters susceptible to "bloody diarrhea, kidney failure, and—in rare cases—death." Not so yum! At any rate, I'm not really worried, seeing as how I had already eaten half the bag a week ago without incident and I tend to cook my spinach pretty well, but on the off chance that things take a turn for the gruesome, I'll be sure to do some fantastic E. coli blogging, so stay tuned...
In an ideal world, one could, say, pick up a newspaper and understand what was going on in this country. In this world, that's too much to ask. Last week, I was too busy to follow the news much, so I brought home a Friday New York Times to cure my ignorance (okay, in truth, I wanted the crossword puzzle, but the news section was a side benefit). Here's a headline: "AN UNEXPECTED COLLISION OVER DETAINEES." The article suggests that Republican Senators are challenging Bush's burning desire to have unlimited powers to detain and torture anyone at will. Sounds like good news, no?
Except, of course, that's not exactly the case. In reality, McCain, Graham, and Warner are pushing a bill in the Senate that somewhat curtails Bush's power. But it still gives him, say, 80 percent of what he wants. And what he wants is appalling. The GOP bill would strip habeas corpus rights for any alien who has been "properly detained as an enemy combatant" (even detainees, hilzoy reminds us, who aren't actually threats) and eliminate their right to challenge their detention in court. Even if they're innocent, as detainees commonly are. It would also weaken the definition of "war crimes," making only "grave breach[es]" of Common Article 3 of the Geneva Conventions a no-no, rather than any violation whatsoever.
It's also still not clear what makes someone an "unlawful enemy combatant." The Senate bill's definition includes those who are "engaged in hostilities against the United States." But what sorts of hostilities? It doesn't quite specify. Aziz Huq argues that under the current language, a grandmother sending money to Cuba—which is, of course, an "enemy" of the United States—could probably be detained, without legal recourse. Obviously that specific scenario doesn't seem likely, but Congress isn't supposed to pass bills that simply hope and trust that the president won't abuse his power and start locking up grannies left and right.
So that's bad. But what's the fighting all about? Well, the Bush administration, for its part, specifically wants to be able to ignore parts of Common Article 3 of the Geneva Conventions, so that it can still engage in the following interrogation techniques:
Threaten detainees with violence and death, and threaten their family members with violence and death.
Put detainees in "cold cells," where they stand naked in a near-freezing room and are doused with cold water..
Shackle detainees in standing positions for more than 40 hours, or put detainees in painful stress positions, or force detainees to undergo sleep deprivation.
As Marty Lederman says, this is the key issue. Three Republicans in the Senate have said no, the president can't just ignore the Geneva Conventions—and hence undermine a hugely important international treaty—just because he wants to employ some tactics of dubious efficacy. Twenty-nine retired military officials and CIA leaders have also said this is a terrible idea. So now Bush is pouting and saying, fine, in that case, I won't have the CIA interrogate any terror suspects, and if there's an attack on America, well, that's your fault. Our hero.
So in that sense, McCain, Warner, and Graham are trying to do something positive (an earlier McCain bill on this matter gave the administration too much leeway). But their bill still undermines the rule of law in many important respects. Worse, given that it weakens judicial oversight of the Bush administration's overseas detentions, the Senate bill would make it hard to actually enforce the prohibition on torture. They're still willing to trust an administration that has repeatedly flouted the law—and Congress—on this issue. But let's give the Senators a hand because they're all "principled," or something.
Thanks to my newfound and wholly corrupt pledge to link to Alternet more often, I just happened across a great story by Chris Levister about prison labor. The prison-industrial complex now employs some 80,000 prisoners, and the number's growing rapidly. A large number of those workers earn less than minimum wage:
For the tycoons who have invested in the prison industry, it has been like finding a pot of gold. They don't have to worry about strikes or paying unemployment, health or worker's comp insurance, vacation or comp time. All of their workers are full time, and never arrive late or are absent because of family problems; moreover, if prisoners refuse to work, they are moved to disciplinary housing and lose canteen privileges. Most importantly, they lose "good time" credit that reduces their sentence.
Bonanza! Now one might reasonably wonder whether this captive, ultra-cheap labor force might be pulling down wages and standards for everyone else. After all, while inmates making goods for domestic consumption must be paid a "prevailing wage," that law doesn't apply to exports. So companies can outsource production to the local penitentiary and pay their "workers" next to nothing:
Prisoners now manufacture everything from blue jeans, to auto parts, to electronics and furniture. Honda has paid inmates $2 an hour for doing the same work an auto worker would get paid $20 to $30 an hour to do. Konica has used prisoners to repair copiers for less than 50 cents an hour. … Clothing made in California and Oregon prisons competes so successfully with apparel made in Latin America and Asia that it is exported to other countries.
The whole racket seems rather perverse—prisons have effectively become domestic sweatshops. But what should be done about it? Well, the government could obviously require that prisoners who manufacture exports be paid a prevailing wage too. That could result in fewer companies contracting with prisons at all, which might be bad for prisoners, seeing as how these programs are popular and purportedly help them build job skills and the like. But that's just the standard argument against raising the minimum wage, a fear that sounds sensible in theory but almost never pans out in practice. Plus, who says prison sweatshops are the best way to "help" prisoners out anyway?
(Then again, seeing as how popular opinion currently tilts so overwhelmingly against anything smelling like "rehabilitation," maybe the only way prisoners will ever actually receive job-training and the like is if corporations can make a buck off it. Cynical thought, but perhaps not unrealistic.)
Of course, the real moral of the piece is that way too many people are in prison. There are currently two million prisoners in the United States, and "some experts believe that the number of people locked up in the U.S. could double in the next 10 years"—many for minor, non-violent drug crimes and the like. The War on Drugs: truly the gift that keeps giving. But now that businesses are finding it so lucrative, what are the odds of that changing anytime soon? I wonder if corporations have ever lobbied for "tough on crime" policies because they need the cheap labor. Seems a bit paranoid, but hardly impossible.
In a long and thoughtful post on income inequality in the United States, Reihan Salam had this to say:
My sense is that many on the left are bothered by the massive increase in income and wealth among the top 1 percent because they believe this threatens the democracy by giving a small handful of households outsized influence. This could be true.
Well, yes, that's certainly one of a couple reasons why inequality bothers me, but why stop at saying it could be true? I think it's certainly true. Political scientists have done a lot of work on how income inequality leads to political inequality (which in turn can lead to further income inequality) and their findings are very much worth reviewing.
First, some statistics. Between 1979 and 2003, the income of the richest one percent of Americans more than doubled, the income of the middle fifth grew by only 15 percent, and the income of the poorest fifth barely budged, according to the CBO. By the late 1990s, the richest one percent of American households held a third of all wealth in the economy and took in 14 percent of the country's income—a greater share than at any point since the Great Depression. These days you can't swing a dead gerbil without hitting some leftist faithfully reciting these figures, but I thought I'd repeat them anyway.
In politics, this all matters very much. Larry Bartels of Princeton has studied the voting record of the Senate between 1989 and 1994—a time, note, when Democrats controlled Congress. He found that Senators were very responsive to the preferences of the upper third of the income spectrum, somewhat less attentive to the middle third, and wholly unresponsive to the policy preferences of the poorest third of Americans. In one striking example, Bartels discovered that Senators were only likely to vote for a minimum wage increase if and when their wealthier constituents favored it—the views of those directly affected by the hike had "no discernible impact."
Nor is this pattern limited to domestic policy. Lawrence Jacobs of the University of Minnesota and Benjamin Page of Northwestern have found that the foreign policy views of the executive and legislative branches are primarily influenced by business leaders, policy experts—whose think tanks are often funded by businesses—and, to a lesser extent, organized labor. Jacobs and Page found that the views of the broader public have essentially zero impact on the government when it comes to tariffs, treaties, diplomacy, or military action. Walter Russell Mead once argued that "Jacksonian" nationalism in the heartland drove American foreign policy, but the data doesn't back him up. Business still calls the shots.
To some extent, these findings are likely a result of the fact that elected officials tend to hail from the upper classes, and so tend to be the sort of people who worry more about the burden the estate tax imposes than about, say, food insecurity or too-high heating bills. In 2003, financial records revealed that 40 senators and 123 representatives were millionaires. This shouldn't be surprising. Without publicly financed elections, it takes a good deal of personal wealth to run for office—the average Senate campaign in 2006 will cost about $10 million, minimum, according to a University of Washington study.
But that's only the most obvious way economic power begets political power. Consider the fact that wealthy Americans are far more likely to vote: 86 percent of those in families with incomes over $75,000 reported voting in 1990, compared to only 52 percent of those in families with incomes under $15,000. Whether that's because the well-off are more likely to believe that government will work for them—evidently a sound assumption—or because they have more time and opportunity to inform themselves and do their civic duty is unknown.
But it's not just voting. People in the $75,000 bracket are much more likely to join a political advocacy group like the NRA or the NAACP (73 percent vs. 29 percent), and much more likely to make campaign contributions (56 percent vs. 6 percent). Indeed, in the 2000 election, 95 percent of donors who made substantial campaign contributions came from households earning over $100,000. While high-income donors don't usually bribe politicians to do their bidding, they do get more face time with their representatives, during which they can frame issues and concerns in ways amenable to their interests.
This counts for a lot. For one, as Dean Baker has argued, conservatives have actively used the nanny state to redistribute wealth upwards over the years—these are the policies we get when the wealthy control Congress. Second, politicians tend to think more about their better-off constituents in everything they do, which inevitably biases policy. George W. Bush once reportedly told Jim Wallis, "I don't understand how poor people think." I can find no better example of how massive income disparities undermine democracy.
So much for peak oil, huh? Mark Morrison of BusinessWeek thinks we no longer need to worry about the oil supply running out, because Chevron and other companies just found a "mammoth" oil field deep beneath the Gulf of Mexico. And in the future, Morrison says, energy companies will find even more oil deposits deep beneath the ocean floor, and we'll be able to dart around in our SUVs forever. Of course, actually burning all those newfound hydrocarbons might destroy the planet, but other than that, we're totally fine!
Unless, of course, Chevron's wrong. Randy Kirk has written an interesting Energy Bulletin memo wondering if the Gulf of Mexico "discovery" is just hype. Curiously, the range given for the Gulf discovery is between 3 billion and 15 billion barrels, yet the new field "is comprised of no single field of more than 300 million barrels. An entire area of as much as 15 billion barrels with no 'giant' over 1 Bn bar oil field is unusual." At 300 million barrels apiece, after all, even the low-end estimate would require at least ten separate fields, and the high end fifty.
So maybe Chevron's exaggerating things. But why oh why would they do that? Well, in just a few weeks, the Senate is going to vote on whether to lift a 25-year ban on most offshore drilling. It's the usual story: Environmentalists have concerns about opening up the coasts to drilling, but oil companies stand to make a lot of money if the Senate lifts the moratorium. And the vote's close right now. The "discovery" of a gargantuan field beneath the Gulf could probably sway a few wavering politicians. Suspicious, no? Kirk sees a rather... interesting parallel:
[T]he [Chevron] announcement is reminiscent of the Mexican "huge oil discovery" announced last year, of a possible 10 billion barrels, which was quietly revised this year to around 43 million barrels, a downward revision of 99.57%. This similar "discovery" was made in Mexico last year a few months before the Mexican parliament was to vote on Pemex (state oil co)'s budget and rights to expand drilling.
It's almost like the oil industry's version of a well-timed terror alert...
Criticizing the media usually isn't my thing, but the Washington Post has an absolutely appalling front-page story today about "pregnancy crisis centers." Anyone who knew nothing about these centers—like, apparently, the National Review's John Miller—would come away from the article with the impression that religious conservatives have merely set up clinics for the purpose of giving women sonograms and talking them out of abortion. Miller calls it "an example of women exercising their freedom" and doesn't see what the big deal is.
The big deal is that these centers are actively trying to mislead women. The centers are designed to look like abortion clinics—and are often placed next to abortion clinics, or housed in a building where a clinic used to be—which is why so many pregnant women mistakenly wander in. And rather than gently dissuading women from having abortions with actual facts, they lie and scare-monger. The Post reporter cites a study conducted by Rep. Henry Waxman about the centers, but never mentions that Waxman's staff actually called 23 "crisis pregnancy centers" and found that 20 of them gave outright misleading information about the risks of abortion.
Nor does the Post mention concerns that many centers go far beyond merely trying to dissuade women from having an abortion and actually harass some of the women who walk in under the impression that they're at Planned Parenthood. Amanda Marcotte wrote a great piece about the centers for Alternet that included some of these tales. Seems worth mentioning, no? But nothing in the Post.
And, of course, pregnancy crisis centers are demonstrably harmful. As the Post mentions, the centers offer sonograms, with the intention of dissuading women from having an abortion. But, again, unmentioned is the fact that the staff at these centers usually have no real medical training. So whoever's giving the sonogram won't know to look for signs that the pregnancy could endanger the woman's health—which is, you know, the whole point of a sonogram, as opposed to taking pretty pictures for propaganda purposes. But the experience can give a woman the dangerous impression that she's received proper prenatal care, and prevent her from getting real care later on.
But because it's easy to buy a sonogram machine and have some untrained Bible-thumper operate it, pregnancy crisis centers can easily become more widespread than clinics burdened by the obligation to provide actual health care. Some right-wingers liken the pregnancy crisis centers to Starbucks—it's just the free market in action (never mind that the centers have received $30 million in federal money since 2001). Yeah, sure, it's as if Starbucks could put all the other coffee shops out of business by serving up mud and fooling customers into thinking it was actual coffee. It would be cheap, but it would also be wrong.
Anyone who's been paying even cursory attention to the debacle in Iraq knows that the United States military doesn't do counterinsurgency very well. In fact, the military's awful at it. What's more, officials in the Army have long refused to try to improve things on this front—despite the fact that the military has lost in both Vietnam and Iraq partly because of its ineptitude at counterinsurgency. Instead, they prefer to focus on winning conventional wars, which the military does very well, but which are rarely fought these days. But why do military leaders do that?
In a new paper for Cato, Jeffrey Record of the Army War College blames it on the "strategic culture" in the military. American strategists have long preferred to "separate war and politics"—believing that if they just kill lots of bad guys, the rest will sort itself out later. When the Pentagon planned for a war against Saddam Hussein's army but neglected to plan for postwar operations, it wasn't a fluke: military leaders have always thought that way. And that culture is unlikely to change. So Record thinks the U.S. should just give up even trying to wage small, counterinsurgency-heavy wars like we've seen in Iraq, unless there's some truly, truly dire need.
It's a smart paper, but I think there are also structural reasons why the military is so terrible at counterinsurgency. For one, there's an institutional bias towards spending more and more money on fancy new weapons every year. Congress loves them, because fancy new weapons create jobs. Pentagon bureaucrats love them, because they get bigger budgets and new toys. Contractors and lobbyists love them for obvious reasons. So procurement spending takes up an inordinate share of the defense budget, at the expense of things that could potentially help the military become more adept at counterinsurgency—like an increase in peacekeepers and linguists.
There's also the fact that U.S. military deployments tend to be logistically very complex. Any occupation these days will need lots of support staff, supply lines, convoys, etc. All that stuff needs guarding. And that means that inevitably, something like the Green Zone will be set up, where soldiers reside, aloof from the local people and their culture—and that runs at cross-purposes with winning "hearts and minds" and the like. Plus there's the military's understandable desire to avoid casualties at all costs, which again, militates against having soldiers blend in with the local population. There's also public opinion to consider.
At any rate, I do think Record's right. It's very, very unlikely that the military will suddenly become good at counterinsurgency. The latest Quadrennial Defense Review was probably the best chance that proponents of counterinsurgency had to drastically change the direction of the military. After all, soldiers were dying daily in Iraq because the military was futilely trying to win a fourth-generation war using conventional means. Surely if things were ever going to change, it would be then. But, of course, the 2006 QDR merely ended up hyping the "threat" from China and largely focused on building up the military's conventional capabilities further.
Record's paper dovetails nicely with an essay Andrew Bacevich recently wrote, arguing that Islamist movements in the Middle East are increasingly learning to wage sophisticated insurgency campaigns that are impossible for conventional Western militaries to defeat. Israel never vanquished Hezbollah. Iran will give the United States equal trouble or more if we ever try to invade. "The inhabitants of [the Middle East]," Bacevich says, "now have options other than submission or collaboration." We can no longer, it seems, impose our will on recalcitrant Middle Eastern countries by force.
Fortunately, as Bacevich rather sensibly points out, we don't have to impose our will on these countries. They don't pose existential threats. Iran, at the moment, is surrounded by American troops, rather than vice versa. Whatever problems are associated with its nuclear program can be handled by perfectly dovish means or, if worst comes to worst, deterrence. Ideally, the United States would just accept that it really can no longer use its military power to meddle in the Middle East. Regardless of whether we should or not, it's just not feasible. Somehow, I think we'll survive just fine.
A new documentary, it seems, reveals some double standards in the ratings handed out by the MPAA:
For example, scenes of women receiving oral sex… are far more likely to earn a film an NC-17 rating than images of their male counterparts getting similarly serviced. Same-sex couples are also subject to far harsher scrutiny than straight ones. A split-screen montage in the [documentary] makes this point by juxtaposing near-identical scenes from two different films at a time, one featuring gay sex, the other straight; invariably, the gay movie gets an NC-17 rating, the straight one a PG-13.
I'm shocked! A good follow-up question, of course, is whether this disparity merely reflects certain misogynistic and homophobic beliefs and attitudes in this country, or whether it actually reinforces and bolsters those beliefs. I guess it's a tricky thing to figure out, especially since Americans in general seem to be getting less misogynistic and homophobic over time. But again, is that because, for instance, more and more sympathetic gay characters are cropping up in films and television? Which causes which?
At any rate, I tend to fret more about the oft-cited fact that the MPAA cracks down harder on sex than it does on violence. Movies are pretty shockingly and gratuitously violent these days. And I don't think that makes individual people more violent, or more likely to commit crime, or more likely to kill someone. But it could matter on a political level. America, needless to say, has a belligerent warmongering streak ten miles wide. War—at least when we're kicking ass—is appallingly popular in this country. Should we blame movies for this?
Here's another thought. Antulio Echevarria once pointed out that the U.S. military "is geared to fight wars as if they were battles, and thus confuses the winning of campaigns... with the winning of wars." British strategist Colin Gray has argued that Americans "are wont to regard war and peace as sharply distinctive conditions" and during wartime tend to focus solely on defeating enemies on the battlefield, worrying about peace and politics later—essentially believing all that stuff will sort itself out once ass has been duly kicked. That, of course, is a mistaken notion, as Iraq made clear. But it also sounds like the plot of roughly every action movie ever made. And those action movies are pretty damn popular in this country. Coincidence?
I'm just wondering. Then again, Charles Loeffler pointed out last year that, contrary to what experts feared, the popularity of shows such as CSI and Law and Order has virtually no effect on the way people think about criminal law, or how they act when serving on juries, so maybe this semi-Marxist notion that our ruling elites are using Hollywood to foster an imperialist ideology among the masses is totally wrong. Still, it sounds plausible.
Writing in the Nation, Liza Featherstone tries to think through the implications of Wal-Mart's entrance into the organic food market. One potential fear she neglects to mention is that the store could one day start lobbying the Department of Agriculture to weaken standards on what can actually be sold under the organic label. That's hardly an idle fear: the USDA has long favored agribusiness as a rule—take, for example, its regulations that allow milk from cows cramped in tiny feedlots to be considered "organic". (Advocacy groups have pointed out that when most people think of organic milk, they think of cows allowed to graze out in the pasture, but the USDA has rebuffed their suggestions.)
And Wal-Mart, as one would expect, will inevitably exploit whatever loopholes do exist. Featherstone notes that Wal-Mart now gets its organic milk from Horizon Foods, a company that provides cheap milk by cramming its cows into factory farms in Idaho. Meanwhile, family-scale farmers, who are less likely to cut corners and have more incentive to treat the land better—in other words, the ideal "organic" farmers—may well get priced out of the organic market altogether. Often, for instance, it takes too much money and effort to even get certified by the USDA in the first place.
Another concern is that Wal-Mart will manage to defeat the whole purpose of organic food—helping the environment and all that jazz—by contracting with farms in countries far, far away. Transporting food long distance uses up a stunning amount of energy—food in the United States today typically travels about 1,500 miles to reach our plates, up 25 percent from 1980. According to Featherstone, industry observers think that much of Wal-Mart's "organic" food will come from Chile, Kenya, and China. (On the other hand, China may have to clean up a lot of its groundwater in order to set up organic farms, so it's not all bad.)
So… who knows how this will all shake out. I'm pessimistic. Sure, Wal-Mart has promised to behave itself, but that means about as much as you'd expect it to mean. Wal-Mart's whole strategy is to slash prices by outsourcing many of its costs onto other entities—the environment, say, or its workers. The idea behind organic farming, by contrast, is to make the consumer pay all of those costs, since cheap products aren't really cheap when others are shouldering the bill. Expecting the two philosophies to coexist happily seems like a stretch, to say the least.
So... Labor Day weekend is upon us. How fun. Come to think of it, Labor Day has always seemed like a strange holiday to me. Not because of the cause in question—far from it—but because of its dubious origins. Workers in the late 19th century spent years demanding a national holiday celebrating their efforts. In 1882, laborers in New York even took an unpaid holiday to march in Union Square in support of (among other things) just such a holiday. But nothing happened.
It wasn't until the Pullman strike of 1894 that things actually changed. Pullman, Illinois, is, I've always thought, a fascinating town. There's the fact that George Pullman created it to house his workers and shield them from the moral depravities of Chicago. There's the fact that it was a feudal state, where workers lived in Pullman-owned row houses, were kept in debt slavery, bought goods from Pullman-owned stores, drew paychecks from Pullman-owned banks, and had their rent—set by Pullman rather than the market—garnished automatically. All aspects of their lives were regulated. Calling it "totalitarian" may be hyperbole, but not by much.
Anyway, any basic history book will describe how the Pullman company took a hit during the depression that began in 1893, workers saw their paychecks shrink while their rents held firm, and, led by Eugene Debs and the American Railway Union, they went on strike in 1894. Riots broke out. Trains stopped running. Railway executives panicked. So Grover Cleveland sent in 12,000 troops to break the strike. Thirteen workers were killed, Debs was hauled off to jail, the ARU was dismantled, and industrial unions were effectively finished as a force for the next 40 years. We know the drill.
Sadly for the politicians, though, gunning down the working class has never looked good right before an election. And 1894 was an election year. So Congress rushed to pass legislation establishing Labor Day, and Cleveland signed the bill in an attempt to look kinder and gentler to the voters. It didn't work—the Democrats were routed that fall, and Cleveland's own party dumped him two years later—but now we have a holiday that both celebrates a good cause and was essentially a political gambit meant to paper over state-backed repression of organized labor. Like I said, it's always seemed a bit odd.
I tend to think that one reason Western Europe has had a robust labor movement, while the United States has not, is that in the early days the American government was less ashamed to use troops to mow down strikers and organizers. In the 19th century the Knights of Labor were an impressive and truly progressive American labor organization that fought for broad-based social change—running pro-labor candidates for office and organizing women, (some) minorities, and unskilled workers, unlike the exclusionary unions that came later. But the movement was destroyed by state-backed violence and in their place arose the parochial and conservative trade unions. Tempting to imagine what might have been.
The Pullman strike also reminds me of a curious fact about the labor movement during the Depression. In 1933, during the first wave of industrial uprisings, many of the striking unions weren't much concerned about wages and hours. They were more concerned about the tyranny of the workplace—at Ford's factories, for instance, security chief Harry Bennett's 'servicemen' would beat assembly workers on the spot simply for talking in line. Capricious and brutal foremen were another issue. Many company towns, meanwhile, were Soviet-style dictatorships run by steel barons, and early strikes often focused simply on basic civil liberties rather than overtime or benefits. Pullman, by these standards, was actually one of the less severe offenders.
I'm a reporter at The New Republic, mostly covering green issues, apocalypses, general doom. This is my personal site. I also post regularly at The Vine, TNR's enviro-blog.