February 02, 2009

The Great Dinosaur Wars

Excuse me, but I'm on a Wikipedia bender lately. I just can't stop clicking. So here we go: The Bone Wars? This was a real thing? Oh yes. Yes it was. In the latter third of the 19th century, we learn, a pair of rival American paleontologists, Edward Drinker Cope and Othniel Charles Marsh, "used underhanded methods to out-compete each other in the field, resorting to bribery, theft, and destruction of bones."

This wasn't as gruesome as it sounds: The rivalry did spur a race to excavate, which in turn led to the discovery of 142 new species of dinosaur, including childhood favs like Triceratops, Stegosaurus, and Allosaurus. A word of appreciation, by the way, for Allosaurus: The dinosaur books I gobbled up as a kid were usually divided into three periods: the Triassic (lame chicken-sized dinosaurs), the Jurassic (now these were terrible lizards…), and finally the glorious Cretaceous (oh hell yes). The Allosaurus was a late Jurassic beast, usually shown scarfing down chunks of Brontosaurus meat, and he signified a child's very first encounter with the truly massive, peerless carnivores of yore—dominating the imagination for a brief while until you flipped a few pages and stumbled across the Cretaceous-era Tyrannosaurus rex, king of all predators. (I understand now they have Spinosaurus and other gigantic brutes that put T. rex to shame—well, T. rex is still top lizard in my book.)

Anyway, back to our dueling paleontologists:
On one occasion, the two scientists had gone on a fossil-collecting expedition to Cope's marl pits in New Jersey, where William Parker Foulke had discovered the holotype specimen of Hadrosaurus foulkii, described by the paleontologist Joseph Leidy; this was one of the first American dinosaur finds, and the pits were still rich with fossils.

Though the two parted amicably, Marsh secretly bribed the pit operators to divert the fossils they were discovering to him, instead of Cope. The two began attacking each other in papers and publications, and their personal relations soured. Marsh humiliated Cope by pointing out his reconstruction of the plesiosaur Elasmosaurus was flawed, with the head placed where the tail should have been. ... Cope tried to cover up his mistake by purchasing every copy he could find of the journal it was published in.
In the end, Marsh won the Bone Wars, digging up and naming 80 new species of dinosaur, versus Cope's piteous 56. But paleontology itself was the real loser—not only was the profession's good name blighted for decades by their tomfoolery, but "the reported use of dynamite and sabotage by employees of both men destroyed or buried hundreds of potentially critical fossil remains." Oh dear.

Even worse, the two miscreants would often hastily slap all sorts of bones together in their race to unveil new species, assembling incorrect skeletons that sowed confusion for decades. Marsh, it turns out, was the wiseguy who fit the wrong skull on a looming sauropod's torso and named his creation Brontosaurus (it later had to be rechristened Apatosaurus, once it was given its correct head—the real tragedy is that the new name was clumsier and not quite so fearsome: "Apatosaurus" means "deceptive lizard"; right, uh-huh, like there's anything deceptive about that dude.)
-- Brad Plumer 10:49 PM
Outsourcing the Brain

In a chat with The Philosopher's Magazine, David Chalmers lays out his theory of the "extended mind," arguing that our cognitive systems are more than just what's crammed inside that gray matter in our skulls. The mind should be thought of as a larger, extended system that can encompass even gadgets like the iPhone:
"The key idea is that when bits of the environment are hooked up to your cognitive system in the right way, they are, in effect, part of the mind, part of the cognitive system. So, say I'm rearranging Scrabble tiles on a rack. This is very close to being analogous to the situation when I'm doing an anagram in my head. In one case the representations are out in the world, in the other case they're in here. We say doing an anagram on a rack ought be regarded as a cognitive process, a process of the mind, even though it's out there in the world."

This is where the iPhone comes in, as a more contemporary example of how the extended mind works.

"A whole lot of my cognitive activities and my brain functions have now been uploaded into my iPhone. It stores a whole lot of my beliefs, phone numbers, addresses, whatever. It acts as my memory for these things. It's always there when I need it."

Chalmers even claims it holds some of his desires.

"I have a list of all of my favorite dishes at the restaurant we go to all the time in Canberra. I say, OK, what are we going to order? Well, I'll pull up the iPhone—these are the dishes we like here. It's the repository of my desires, my plans. There's a calendar, there's an iPhone calculator, and so on. It's even got a little decision maker that comes up, yes or no."
Fine, I buy it. (Scrabble racks!) The original paper that Chalmers and Andy Clark wrote on the subject invented an Alzheimer's patient named Otto who jots things down in his notebook so that he won't forget them. When you or I need to conjure up an address or name, we just summon it from some musty alcove in our memory. When Otto needs a fact, he flips through his spiral notebook. We consider memory part of the mind, so why can't we say that Otto's notebook is part of his mind? What's the difference? It's a short hop to declaring that an iPhone can be part of one's extended mind, too.

Maybe all this just means that "mind" is an unduly foggy term. Let's all thank the relevant deities I'm not a philosopher and don't have to tear my hair over this. It is odd, though, how memory works in the Internet age. When I'm writing about energy issues, say, I never bother memorizing all the nitty-gritty details of how utility decoupling works or what the latest data on snowfall in Greenland says. Instead, I'll just get a feel for the main concepts and remember where on the Internet I can find more information in a pinch.

Now, sure, I do the same thing with books or notes, but I rely on this method much more often now—and, of course, the Internet's always available, making the need to memorize stuff even less acute. My head's filled with fewer actual facts and more search terms or heuristics. Everyone does it. Emily Gould recently wrote an indispensable post about Mind Google, which works like this: Rather than reaching instinctively for your iPhone to settle some trivia dispute with your friends (who played Liz in Garfield: A Tale of Two Kitties again?), you just rack your futile brain about it and then, later, when you're in the bathroom gazing at the tiles and thinking about something else entirely, the answer magically slaps you in the face. ("Jennifer Love Hewitt—of freaking course!")

So some facts are still up in the old noggin. I mean, it's not like you can just do without facts altogether. It's just more efficient to outsource a larger chunk of stuff to that great hivemind on the Web. Probably a net plus. Though I wonder if someone like (oh, say) Thomas Pynchon could've written Gravity's Rainbow if instead of cramming all that stuff about rocket science and German ballads and Plato and god knows what else in his skull, he was always saying to himself, "Eh, I don't need to remember this, I'll just look it up on Wikipedia later..."

One more thing about the extended mind. It ought to include other people, too. Shouldn't it? I know I have a bunch of stories and anecdotes I can't tell—or at least tell well—without some of the other participants around to help me fill in the details and color. In effect, I have to recreate the memory with someone else. I once read somewhere a devastating article about how older widows essentially lose a large chunk of their memory after their partner dies, because there are many events they could only recollect "together," with their spouse. No, wait, maybe this was in a novel? Or a This American Life podcast? Oy, my brain's a barren wasteland—off to wheedle the answer out of the Internet.
-- Brad Plumer 11:08 AM
It's Hue You Know


Here's a good Monday diversion: Wikipedia's master list of color names. Learn your cobalts and ceruleans and cyans. Some of the terms on this list are fairly obscure: Would you ever, in a million years, have guessed what color "Razzmatazz" is? (It's a sort of magenta.) But some of the names are hilariously perfect—"old rose" denotes the hue of aging, sickly rose petals. And some of the terms made me realize how easily I could distinguish shades I hadn't even realized were distinguishable until I saw the names for them: I could picture right away the difference between "Islamic green" (think Saudi flag) and "shamrock green."

More clicking around... There's also a complete list of every single crayon color that Crayola has seen fit to christen over the years. Some of the names have been changed, sometimes mysteriously so. In 1990, "maize" (a childhood favorite of mine, handy for drawing golden retrievers and buried sea-treasure) was replaced by the weedier name "dandelion." The color itself changed slightly, too: Appropriately, dandelion looks more like the drab patches of yellow you'd see on an abandoned lot while maize evokes sunny Ukrainian wheat fields. Why'd they get rid of it?

Some of the Crayola color names are frivolous and totally inapt: I have no use for "laser lemon" or "Fuzzy Wuzzy brown" or even "radical red" (which, perhaps to protect kids from creeping Bolshevism, actually denotes a watery hue that's more pinkish than Pinko). But some of the Crayola names should be added to the original master list of English color names—"Granny Smith green" is an amazing term, instantly visualized, and as best I can tell has no analogue on the grown-ups' list of colors.

A few clicks later, we learn that Hungarian is the only known language to have two "basic" words for red, "piros" and "vörös." Now, it's not so unusual for a language to have a different set of basic color terms than English does: For instance, both "blue" and "azure" are basic color terms in Russian, while only "blue" is a basic color term in English (in the sense that native English speakers consider azure a type of blue, which is not the case with, say, green). What makes the Hungarian case odd is that red is the most basic color—it's the third color languages acquire a word for, after black and white. Also, there's this queer business:
* Expressions where "red" typically translates to "piros": a red road sign, the red line of Budapest Metro, a holiday shown in red in the calendar, ruddy complexion, the red nose of a clown, some red flowers (those of a neutral nature, e.g. tulips), red peppers and paprika, red card suits (hearts and diamonds), red traffic lights, red stripes on a flag, etc.

* Expressions where "red" typically translates to "vörös": red army, red wine, red carpet (for receiving important guests), red hair or beard, red lion (the mythical animal), the Red Cross, the novel The Red and the Black, the Red Sea, redshift, red giant, red blood cells, red oak, some red flowers (those with passionate connotations, e.g. roses), red fox, names of ferric and other red minerals, red copper, rust, red phosphorus, the color of blushing with anger or shame, etc.
You know, those lists made sense for a few short moments, and then I was lost again.
-- Brad Plumer 10:58 AM

February 01, 2009

Recycled Cities, Prefab Homes

Over at The Nation, architect Teddy Cruz walks us through Tijuana, which has been built, in large part, using waste materials from neighboring San Diego—garage doors, box crates, old tires, even abandoned post-World War II bungalows that are loaded onto trailers and shipped south. Who knew you could recycle an entire city? There's even a YouTube clip, by filmmaker Laura Hanna:


That video, incidentally, was included in the Home Delivery: Fabricating the Modern Dwelling exhibit at MoMA, which was reviewed by architecture critic Sarah Williams Goldhagen in the current issue of The New Republic.

Goldhagen's review, by the way, is provocative and very much worth reading. She observes that, compared with other countries, prefabricated homes are curiously rare in the United States. (In Sweden, by contrast, 85 percent of homes are largely prefab, and yes, Ikea offers models.) In theory, mass-produced homes could be cheaper, the quality could be more consistent, and—here's the bonus environmental angle—they're likely to be a great deal greener than the usual contractor-built ("stick-built") variety. There's no reason prefabs have to be hideous, either, or shoddy, or drearily uniform in appearance. Trouble is, building codes vary so widely from town to town in this country that it's impractical for prefab homes to take hold. Is that a good thing? Goldhagen says probably not:
What these exhibitions demonstrated is that there is no dearth of splendid ideas for prefabricated mass-produced housing. Were the best of these ideas adopted by the market—and in the right climate, more good ideas would be forthcoming—there is no doubt that the bulk of newly constructed houses would be less toxic to the environment, better built, and last longer with fewer repairs. They would more fully and attractively accommodate the ways people currently live.

Why do we continue to settle for residential litter in ever-more-degrading landscapes? That is the question these exhibitions fail to address. But of course it is not an architectural question. Absent new social policies, better ideas by savvier architects will change little. For quality affordable mass-produced housing to be built, we need to create different conditions for a mass market. A new legislative structure must clear away the obstacles presented by non-standard, municipally controlled building codes and create enforceable national standards for prefab-friendly, environmentally responsible manufacturing and construction practices. Incentives must be offered so that the entrenched and intransigent construction industry, which has made plenty of money on its poorly conceived, shoddily built, environmentally toxic houses, will re-configure itself.

If the necessary legislation were passed and new market incentives put in place, and the designers and manufacturers of prefabricated homes made all the real innovations in quality and reduction of price that the automobile industry has made since the Model T, who would walk away from a better designed and better built home for less money?
Et voilà—the future of homebuilding in the United States? I have no clue, but assuming energy and environmental issues continue to be central in the coming years (and why wouldn't they be?), I'd expect the question to bubble up now and again. Greening the construction industry and our built environment, after all, pretty much has to be a big part of any push to nudge down those greenhouse-gas emissions.

(Cross-posted to The Vine)
-- Brad Plumer 11:37 AM