Astrogator's Logs

New Words, New Worlds

Archive for the 'Science' Category

Damp Squibs: Non-News in Space Exploration

Saturday, January 5th, 2013


Biologists interested in space exploration are consistently relegated to the back of the stellar tour bus, if we’re allowed on at all. We’re Luddites who harsh everyone else’s squee, you see. We keep pointing out that radiation is not kind to living tissue, whether gametes or neurons; that uploading to silicon chassis is not possible as an alternative to carbon bodies; that human babies cannot be hatched and reared by robots at planetfall; that living on extrasolar planets poses huge problems and dilemmas even if they’re quasi-compatible. And that since FTL and warp drive are and will always remain science fiction, we need to at least tackle, if not solve, some of these issues before we launch crewed starships on long exploratory or migratory journeys. This year, there were two non-news items in the domain that brought these matters once again to the fore.

The earlier of the two was the disclosure that “NASA scientists might achieve warp drive” based on Alcubierre’s theoretical concept (by using a Jovian weight’s worth of exotic matter as likely to exist as stable wormholes). Beyond its terminally wobbly foundation, the concept also doesn’t take into account that such folding of space would destroy nearby star systems (and almost certainly also the starship) via distortion of the local spacetime and/or massive amounts of radiation. It’s also unclear how the starship could be steered from within the “negative energy” or “tachyonic matter” bubble. This means that even if fast space travel were possible using this method, it would still take lifetimes to safely reach a planet within a system because local travel would by necessity be at sublight speed.

More recently came the non-news that radiation causes… brain malfunction, as if the terms “free radical” and “radiation damage” had not been in the biomedical vocabulary since before I entered the discipline in the mid-seventies (let alone the in-your-face evidence of the Hiroshima and Nagasaki holocausts or the Chernobyl meltdown). Radiation, especially the high-energy portion of the spectrum, breaks atomic bonds directly and indirectly by producing free radicals. Free radicals start chain reactions: lines of descendants, each of which can damage a biomolecule. Radiation causes mutations in the DNA, which is bad enough, but it can also result in other errors: protein misfolding, holes in cell membranes, neuron misfiring. And although cells have several repair mechanisms to counter these insults, those mechanisms evolved for the radiation burdens of earth.

All these effects at the molecular/cellular level converge into two large rivers: for dividing cells, cancer; for non-dividing cells (most prominently gametes and brain neurons), death. Kill enough cells, past the brain’s ability to rewire and reroute, and you get neurodegeneration: if the most affected region is the substantia nigra, Parkinson’s; if the cerebellum, ataxia; if the hippocampus and parts of the cortex, Alzheimer’s; if the frontal lobe, frontotemporal dementia; if the oligodendrocytes of the myelin sheath, multiple sclerosis. Incidentally, radiation also affects electronic devices – something to keep in mind for even short interstellar journeys.


On earth, we are subject to a good deal of radiation from natural causes (radon, solar flares) as well as human-made ones (industrial, occupational, medical, airport X-ray machines). Cosmic radiation constitutes about 5-10% of our total exposure. That will be very different in space, where bombardment by galactic cosmic rays will be both chronic and acute. And whereas cosmic radiation on earth is moderated by the solar wind, the earth’s magnetic field and the layers of atmosphere, none of these protections will be present on a starship. Shielding options are inadequate or, like warp drive, sheer fantasy – which makes this risk one of the major showstoppers to star travel. The best candidate is the most low-tech: water.

Scientific papers that discuss these outcomes, from both inside and outside NASA, have been around since at least the early nineties. So what exactly is new in this study that is making the customary rounds in various space enthusiast sites and blogs? In a word, nothing. In fact it’s a bits-and-pieces study that reaches minuscule, non-surprising conclusions. The adage “labored as if for an elephant and brought forth a mouse” is particularly apt here. As for the originality of its discoveries/conclusions, it’s like hitting someone’s head repeatedly against a cement wall and concluding that such blows eventually cause, um, skull fractures.

At the same time, the authors of the study decided to gild their tinfoil lilies. They used a double transgenic mouse strain engineered to develop amyloid plaques of the Alzheimer’s-associated variety. Despite this loading of the dice, they saw changes in plaque size and numbers and in amyloid processing only in the male irradiated mice. Even the small shifts they saw are far less important than laypeople think: for a while now, the consensus in the field has been that plaques may be neutral warehouses. In particular, plaques seem to be a sidebar for sporadic Alzheimer’s, which accounts for 90-95% of disease cases. Many people have heavy amyloid plaque loads with zero cognitive impairment. As is often the case with mouse studies, the authors subjected the animals to overwhelming amounts of the perturbing parameter (in this case, iron nuclei) that nevertheless represent a simplified subset of what they’d encounter in a real journey. Finally, they saw neither inflammatory microglial activation nor changes in amyloid clearance. They did see changes in a couple of behavioral tests, although in most of them the error bars overlap, which means “not statistically significant”.
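On that last point, a toy numerical sketch (the scores below are invented for illustration, not the paper’s data) shows why overlapping error bars flag a weak effect: when the group means are separated by less than the combined standard errors, the test statistic stays far below any conventional significance threshold.

```python
# Toy illustration with invented behavioral scores (NOT the paper's data):
# compare two groups via Welch's t statistic, computed from scratch.
import math

def mean_sem(xs):
    """Return the mean and standard error of the mean of a sample."""
    m = sum(xs) / len(xs)
    var = sum((x - m) ** 2 for x in xs) / (len(xs) - 1)  # sample variance
    return m, math.sqrt(var / len(xs))

control    = [52, 48, 55, 50, 49, 53]   # hypothetical sham-irradiated scores
irradiated = [50, 54, 47, 56, 51, 50]   # hypothetical irradiated scores

m1, sem1 = mean_sem(control)
m2, sem2 = mean_sem(irradiated)

# Welch's t: mean difference scaled by the combined standard errors.
t = abs(m1 - m2) / math.sqrt(sem1 ** 2 + sem2 ** 2)

# If the +/- SEM bars overlap, the mean difference is smaller than the
# summed standard errors -- a rough visual proxy for a weak effect.
bars_overlap = abs(m1 - m2) < (sem1 + sem2)
print(f"t = {t:.2f}, error bars overlap: {bars_overlap}")
```

With these numbers the means differ by a fraction of a point while each SEM exceeds one point, so the bars overlap heavily and t is nowhere near the ~2 usually needed for p < 0.05 at this sample size. (Strictly speaking, overlapping SEM bars are a heuristic rather than a formal test, which is why the sketch also computes t.)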

The obvious experiment that might give remotely useful results would be to do such studies with a mouse strain that is not merely wild-type but aggressively outbred. However, that would still be superfluous, even if we set aside the limited usefulness of mouse models for human brain function. We already know what would happen during long interstellar journeys, and more or less why. I propose that we use the time and funds spent on irradiating guaranteed-to-develop-disease mice to develop effective, and preferably low-key, shielding. Radical-clearing drugs are also an option, although the favorite defaults bristle with their own host of problems (teratogenicity for retinoids, tumorigenesis for mitochondrial boosting). As with most complex problems, there are no silver bullets to counteract the iron-nuclei ones of galactic radiation. It will have to be done the hard, slow way – or not at all.


Relevant papers:

H White (2012). Warp Field Mechanics 101.

JD Cherry, B Liu, JL Frost, CA Lemere, JP Williams, JA Olschowka, MK O’Banion (2012). Galactic Cosmic Radiation Leads to Cognitive Impairment and Increased Aβ Plaque Accumulation in a Mouse Model of Alzheimer’s Disease. PLoS One 7(12):e53275

The Solstice after the Supposed End of Days

Friday, December 21st, 2012


For aeons it took us sailing, we never sank,
a thousand times we changed captains.

We never paid account to cataclysms,
we went full ahead, through everything.

And on our mast as eternal lookout
we have the Great Chief, the Sun.

From “The Crazy Ship” by Odysséas Elytis

Image: Sunrise and Orion over the temple of Kukulkan; Chichén Itzá, Yucatán, Mexico (from the NASA APOD; credit and copyright: Stéphane Guisard and UNAM/INAH)

Grandmothers Raise Civilizations

Wednesday, October 31st, 2012

Several attributes of human women are routinely posited as evolutionary enigmas because they tend to be placed in the “not really necessary” and/or “inconvenient” bins: hidden ovulation (How’s a guy to know a kid is his?? Ergo, chastity belts and purdahs!); orgasms (Who cares, as long as the kids come out?); and living past menopause (Done with heir production and no longer eye candy — discard!).

However, it turns out these attributes are not that enigmatic unless you believe that teleology drives evolution. It looks increasingly like the bright red buttocks of our primate relatives are actually a recent acquisition, and hidden ovulation is the earlier default. Some cultures have solved the kinship problem: brothers act as fathers to their sisters’ children, to whom they are unequivocally related. Orgasms are equally explicable once you accept the simple fact that the clitoris is the equivalent of the penis, including the associated excitability and sensitivity (which is why female genital mutilation is identical to a penectomy, not to foreskin circumcision). As for living longer than the contents of one’s ovaries, which is a third of women’s lifespan once they’re past the risky childbirth years, it may have to do with what made us human in the first place. So says the grandmother hypothesis, first intimated by George C. Williams of antagonistic pleiotropy fame and later elaborated by Kristen Hawkes and her colleagues in the late nineties, after observations of the Hadza people in Tanzania.

Back in the fifties and in today’s evo-psycho groves, the fashion has been to posit the nuclear family as the kernel unit of primordial humanity. If you take the crucial details of humans into account (unique birth risks, extended neoteny, unusual nutritional requirements, necessity for higher-order skill acquisition), you realize that the possibility of such a unit seeing offspring reach adulthood is close to nil. Not surprisingly, when anthropologists look carefully and past their own cultural blinders at less technologically endowed human groups, the scaffolding they see is always communal. As Sarah Blaffer Hrdy said, it really does take a village to raise a child.

Such a configuration is not problem-free: it’s vulnerable to tyranny of conformity as well as the devastation that can be wrought by charismatic sociopaths. Nevertheless, it allows distribution of infant care, overlap of skills, quasi-fair apportioning of resources and monitoring of emerging imbalances. And grandmothers, maternal ones in particular, play a crucial role in all of these.

The grandmother hypothesis postulates that the presence of grandmothers allowed more children to reach adulthood, because grandmothers not only foraged for their daughters’ older offspring but also socialized them, taught them important skills and transmitted knowledge and experience. It also postulates that older children had to develop ways to compel caretaker attention, giving rise to the enlarged frontal lobe unique to humans. So the hypothesis argues that female longevity is essentially a “quality over quantity” fitness adaptation that in turn favored descendants of women who fit this profile.

There is, of course, a competing hypothesis far more beloved of Tarzanists. The hunting hypothesis, demolished by Sally Slocum, postulates that hunting became better than foraging as a means of sustenance when resources became scarcer in Africa; and that coordinating the hunt (versus, say, figuring out which berries weren’t poisonous) led to natural selection for bigger brains as well as ushering in the female adoration of “alpha males” who brought home the only protein that supposedly counts.

Kristen Hawkes recently published the results of a mathematical simulation of the grandmother hypothesis. The algorithms did not include brain size, hunting or pair bonding. The model showed that grandmother effects alone are sufficient to double life spans in less than sixty thousand years. Not surprisingly, one requirement is natal homing: living close enough to the maternal grandparents that grandmothers can exert their humanizing effects. This fits with the observation that rigidly patrilocal and patrilineal societies which completely obliterate female kinship networks have often gone for quantity over quality, essentially reducing women to incubators that can always be exchanged for newer models – and that some of these societies used to discard infant girls and older women literally like garbage. Other societies went the opposite route, treating older women like honorary almost-men (allowing them to keep sacred objects, for example, though few were made council heads) once they were no longer “tainted” by menstruation.

Those who had grandmothers almost certainly remember the stories they told and the moderating influence they exerted on the family. I never met either of mine. Both died young; tuberculosis hollowed one, fire consumed the other. I did get to know my father’s stepmother, a gentle too-religious soul who was one of the first Greek women to become a teacher. She tried her best, but was not strong enough to counteract my mother’s fierceness, which I have internalized by now. I wonder if I would have been more adjusted to social expectations had my other grandmothers been around, wielding the authority of blood kinship. Given my other non-adaptive core attributes, I suspect the answer is no.

Selected papers:

Slocum, Sally. (1975, reissued 2012). Woman the Gatherer: Male Bias in Anthropology. In Anthropological Theory: An Introductory History. R. Jon McGee and Richard L. Warms, eds. Pp. 399-407. New York: McGraw-Hill.

Hawkes, Kristen. (2003). Grandmothers and the evolution of human longevity. American Journal of Human Biology 15 (3): 380–400.

Images: 1st, Grandmother Storyteller by Ada Suina (Wheelright Museum, Santa Fe, NM); 2nd, Pakistani grandmother with her three-day-old grandchild (credit: Adek Berry, AFP).

Why We May Never Get to Alpha Centauri

Wednesday, October 24th, 2012

(sung to the glam tune of The Low Spark of High-Heeled Boys)

Last week, astronomers announced that Alpha Centauri B may have an earth-sized planet in tight orbit. Space enthusiasts were ecstatic, because the Alpha Centauri triplet (a close binary, Alpha A and Alpha B, circled by Proxima) is the closest star system to ours at a distance of 4.3 light years. The possible existence of such a planet buttresses the increasing evidence that planetary systems form around every possible configuration: in particular, binary systems had been traditionally discounted as too unstable to maintain planets. Terms like “in our back yard” and “stone’s throw” were used liberally and many expressed the hope that the discovery might spur a space exploration renaissance.

As with many such discoveries, the caveats extend from here to Proxima. The planet’s existence has been inferred from the primary’s wobble, rather than from direct observation. This means that independent confirmation will be required to pronounce it definitively real. The lifespan of such a planetary system remains an open question. The specifics of the system (including the reason that a wobble was detectable) suggest that the planet, if present, is closer to Alpha B than Mercury is to the sun – which in turn means that it would be tidally locked, awash with the primary’s radiation and too hot for liquid water. Last but decidedly not least, it would take us about eighty thousand years to get there with our current propulsion systems. Depending on one’s definition, eighty thousand years exceeds the entire length of human civilization by a factor of two to ten.
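The eighty-thousand-year figure is easy to sanity-check. A minimal sketch, assuming a Voyager-1-class cruise speed of about 17 km/s as the stand-in for “current propulsion systems” (that speed is my assumption, not a figure from the announcement):

```python
# Back-of-the-envelope travel time to Alpha Centauri at a Voyager-like speed.
# The 17 km/s cruise speed is an assumed stand-in for "current propulsion".
LIGHT_YEAR_KM = 9.4607e12   # kilometres in one light year
DISTANCE_LY = 4.3           # distance to the Alpha Centauri system
SPEED_KM_S = 17.0           # roughly Voyager 1's heliocentric speed
SECONDS_PER_YEAR = 3.156e7

distance_km = DISTANCE_LY * LIGHT_YEAR_KM
travel_years = distance_km / SPEED_KM_S / SECONDS_PER_YEAR
print(f"{travel_years:,.0f} years")  # on the order of 75,000-80,000 years
```

Which lands squarely in the “about eighty thousand years” range; a faster probe shortens this proportionally, but nothing chemically propelled changes the order of magnitude.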

So besides the fully justified calls for an immediate robotic probe mission, cue the “solutions” of FTL, warp drive and uploading in addition to those within the realm of the possible (nuclear fusion, light sails, long generation ships… I’m even willing to put Bussard ramjets in this bin). Lest you think such suggestions pop up only on places like io9 or singularitarian lists, I assure you that talks examining such scenarios were delivered with totally straight faces at both last year’s and this year’s Starship Symposium. The warp drive scenario got a boost when a NASA-linked lab announced that they thought they could sorta kinda fold space… if they could get enough strange matter (as in: a few stellar masses’ worth) and manage to stabilize it beyond the usual nanosecond lifetime. Then again, a NASA-linked lab gave us the “arsenic bacteria” cowpat, so nothing of this kind surprises me any longer.

Science fiction has been the entry portal for many scientists and engineers. The sense of wonder and discovery that permeates much of SF makes people dream – and then makes them ask how such dreams can become real. The problem arises when science fiction is confused or conflated with real science, engineering and social policy. When that happens, our chances of ever reaching Alpha Centauri decrease steeply, for at least two reasons: the fantasies make people impatient with/contemptuous of real science and technology; and when this pseudo-edginess substitutes for real science, you get real disasters. The recent sentencing of six Italian geoscientists to years in jail for “failing to predict” an earthquake with casualties speaks to both these points. So does the story of the Haida community that allowed a “businessman” to dump tons of iron into its coastal waters, based on his assurance it would improve conditions for its salmon fisheries. The resulting potentially lethal algal bloom has become visible from space.

Propulsion systems are an obvious domain where fiction (and the understandable fond wish) is still stronger than fact, but there are others. One is using space opera terraforming paradigms for geoengineering. (“Stan Robinson did it in the Mars trilogy, why not us?”) Another is using cyberpunk novels to argue for economic solutions – think of Greenspan’s belief in Rand’s Übermenschen fantasies. More recently, Damien Walter, a Guardian columnist, earnestly urged the head of the British Labour party to bypass austerity and resource limitations by… implementing ideas from Banks, Stross and Doctorow (Walter also wrote a column about women writing hard SF and used a man as his star example; between him and Coren, it looks like elementary reasoning is not a particularly strong suit at the Guardian). Commenters added Herbert’s Dune to the list, using swooning terms about the politics and policies it portrays. (“Banks’ Culture does it, why not us?”) Just intone “3-D printing!” or “Me Messiah!” over a rock pile, with or without Harry Potter’s wand, and hey-presto: post-scarcity achieved, back to toy universes and customized sexbots! I won’t go over the semi-infinite transhumanist list (uploading, genengineering for “virtue” etc), having done so before.

A related problem that looks minor until you consider social feedback is the persistent mantra that SF has been forced willy-nilly to become inward-gazing and science-illiterate because… reality moves too fast, thereby instantly dating predictive fiction. Much of this is justification after the fact, of course – writers “must focus on maintaining their online presence” so who has time for background research? – but the basal argument itself is invalid. There’s exactly one domain that’s moving fast: technology that depends on computing speed, although it, too, is approaching a plateau due to intrinsic physical limits. To give you an example from my own field, I’ve worked on dementia for more than twenty years. During this time, although we have learned a good deal (and some of it goes against earlier “common sense” assumptions, such as the real role and toxicity of tangles and plaques) we have not made any progress towards reliable non-invasive early diagnosis of dementia, let alone preventing or curing it. The point here is not that we never will, but that doing so will require a lot more than the mouth farts of stage wizards, snake-oil salesmen or pseudo-mavens.

When faced with these facts, many people fall back to the Kennedy myth: that we went to the moon because of the vision of a single man with the charisma and will to make it reality. Ergo, the same can be done with any problem we set our sights on but for those fun-killin’ Luddites who persist in harshing squees (file this under “unclear on concepts” and “perpetual juvenility”). Messianic strains aside, there were very specific reasons that made the Apollo mission a success: it was tightly focused; it had no terrestrial repercussions; it was the equivalent of gorilla chest-beating, another way of establishing dominance vis-à-vis the USSR; and it was done in an era when the US was flush with power and confidence – the sole actor involved in WWII not to have suffered enormous devastation of its home ground. The outcomes of the “war on cancer”, “war on drugs” and “war on terrorism” (to name just three of many) illustrate how quickly or well such an approach works when applied to complex long-range problems with constellations of consequences.

Mind you, as a writer of space opera I’m incorrigibly partial to psionic powers and stable wormholes (in part because they’re integral to mythic SF). And the possible existence of a planet in the Alpha Centauri system is indeed a genuine cause for excitement. But I know enough to place the two in separate compartments, though they’re linked by the wish that one day we have propulsion systems that let us visit Alpha Centauri in person, rather than by proxy.


Selected related articles

The Double Helix: Why Science Needs Science Fiction
SF Goes MacDonald’s: Less Taste, More Gristle
Miranda Wrongs: Reading Too Much into the Genome

“Arsenic” Life, or: There is TOO a Dragon in my Garage!
The Charlatan-Haunted World

Images:
1st, Alpha Centauri A and B seen over the limb of Saturn (JPL/NASA); 2nd, the algal bloom in the NW Pacific after the iron dump (NASA/Wikimedia Commons); 3rd, real science: The Curiosity Mars rover (Maas Digital LLC/National Geographic)

Bridge Struts in Pink Pantalets

Wednesday, October 3rd, 2012

My readers know by now that I’m not “feminine” as defined by western mainstream culture. It wasn’t a conscious effort on my part. I was instinctively allergic to being girly. I didn’t like the brittle plastic feel of dolls (though woolly bears and tigers crowd my bed even now), I detest all pink except the salmon-nectarine hues of dusk and dawn, I took to formal math like a goose (not a gander) to water – the silly stories made up to “soften it” gave me hives – and I’ve always loved and excelled in structural toys and puzzles, including those that supposedly derange female brains: namely, mentally rotating objects.

A question that comes up constantly in the circles I frequent is “Why aren’t more girls following STEM paths?” (STEM=Science Technology Engineering Mathematics). In many ways it reminds me of that other vexed question: “Why are First Worlders getting more obese?” In both cases, the question foci (girls/overweight people) are caught in severe double binds: the desired goal (becoming a woman who enters a STEM domain/having a healthy weight – which is not the same as a “socially desirable” weight) is strewn with obstacles that are almost entirely external and so systemic as to constitute the equivalent of the atmosphere; and both success and failure at following each path carry heavy personal costs [before anyone starts shrieking about “fatphobia”, read You Can Have Either Sex or Immortality where I discuss the grave dangers of excessive thinness. I intend to write a counterpoint follow-up to that at some point; this time we’ll focus on girls and STEM.]

To put it bluntly, a girl/young woman who wishes to follow a STEM vocation sets herself up for a lifelong drizzle of frustration, belittlement and harassment. At all points she will be reminded she’s unnatural, like a dog prancing on its hind legs; that women cannot achieve “true greatness” (however defined) in STEM. She may be actively attacked, from verbal insults to outright physical assaults. She will be given less mentoring, less salary, fewer plum positions and first-ranking journal publications, even fewer awards, promotions and perks – and she will be expected to be the default parent, if she wants a family. Her credentials and credibility will always be questioned, even if she gets a Nobel. This holds for the so-called First World as well and in fact it’s getting worse rather than better (economic downturns and fundie religiosity tend to do that). Given all this, the fact that women do make up a significant proportion of STEM is actually a near-miracle.

I was reminded of this issue recently when I had reason to look into games aimed at familiarizing very young girls with STEM before the age at which they start to get turned off science or risk being labeled unfeminine. A preliminary point is that such efforts may be “making holes in the water” because the sad fact is that when enough women enter a discipline, it gets automatically re-classified as “female” and its perceived value and social/financial rewards plummet. This is true regardless of content: from doctors in the former Soviet Union to personal assistants to writers of what is arbitrarily labeled “soft” SF (which, ironically, includes almost all biologically-focused work because, you know, only pointed and exploding objects are hard SF).

That aside, the attempts to create STEM-relevant toys that are “girl-friendly” show the desire to counteract gender targeting, which starts in the cradle and never subsides, as well as the unavoidable pitfalls of such coding. Unquestionably, narrowing the STEM gender gap is more than worthwhile. At the same time, the guiding principles of this concept give me serious pause.

Pitches of such products are abrim with bluntly essentialist statements like “Boys have strong spatial skills, which is why they love construction toys so much. Girls, on the other hand, have superior verbal skills. They love reading, stories, and characters.” and “The set features soft textures, curved edges and attractive colors which are all innately appealing to girls.” As a corollary, such toys/games are aggressively girly (bubble-gum pink features prominently) and their characters are usually so whitebread that they could cause snow blindness. This is nothing new, of course – just read Tom Engelhardt’s trenchant and still sadly relevant 1986 essay about gender coding in children’s TV programs. This domain hasn’t moved an inch since the fifties. Given how formative early socializing is, the rarity of women engineers, in particular, should not really be so surprising.

Had I seen such games when I was their target age, I’d have walked right past them (and I threw them summarily away when I received them as gifts). For one, I used Erector sets and suchlike as enthusiastically as I read stories; for another, the bland blondness endemic in such toys codes for “daffy airhead” in my culture. These products are explicitly geared to appeal to parents anxious to “correctly” socialize their children. And despite their excellent intentions, they reinforce the incredibly problematic “separate but equal” status quo even as they try to combat it.

In some ways, these games are younger-cohort variations of the concept that it’s a good idea to have sex-segregated schools if they enable girls to gain a foothold in areas traditionally closed/hostile to them. Of course, this approach worked if you went to the Ivy League Seven Sisters before they went co-educational – or to my competition-entry elite high school, whose explicit mission was to create future nation leaders. On the other hand, my sister went to a public school whose math teacher decided not to teach the girls – because “housewives don’t need algebra.” So sex-segregated education works, kinda, but only if you’re a princess or, at minimum, a mandarin-to-be.

Proponents of this approach argue that “girly” identity is often established by age 5 and therefore girls need to be coaxed back to problem-solving (as if traditionally “feminine” occupations like cooking and doing laundry are not problem-solving and don’t require spatial and suchlike skills… but we’ll put that aside). As far as I know, Tarzanist bleatings excepted, no correlation has been established between early girliness and later inclination to science, nor are the two mutually exclusive at any age (this also depends on how “girliness” is defined). On the other hand, if your parents, teachers and peers punish you in a myriad small or large ways if you don’t behave “as you should” gender-wise, it’s a foregone conclusion you will tack your lifeboat accordingly. Unless you’re like me, in which case you’ll get even more stubborn – and pay the price.

I think the only real solution to this problem is to tone down the gender essentialism of both “halves” and see to it that girls (and, more importantly, their parents) receive the message that it’s okay to browse the “blue aisles”, where STEM-relevant games are not an explicit insult to basic intelligence. Of course, the ideal would be to tone down (better yet, erase) gender essentialism at all times and places and deem “non-masculine” things of equal value, but I recognize that for the pipe dream it is.

Junk DNA, Junky PR

Wednesday, September 19th, 2012

Note: this article first appeared as a guest blog post in Scientific American. It got showcased in a few places. Not surprisingly, some were dissatisfied: those who think scientists (especially dark non-Anglo female ones) should be just technicians with no larger contextual views of their work; those who cling to old notions of DNA functions; those enamored of miracle “cures”; and, needless to say, creationists of all stripes.

A week ago, a huge, painstakingly orchestrated PR campaign was timed to coincide with multiple publications of a long-term study by the ENCODE consortium in top-ranking journals.  The ENCODE project (EP) is essentially the next stage after the Human Genome Project (HGP).  The HGP sequenced all our DNA (actually a mixture of individual genomes); the EP is an attempt to define what all our DNA does by several circumstantial-evidence gathering and analysis techniques.

The EP results purportedly revolutionize our understanding of the genome by “proving” that DNA hitherto labeled junk is in fact functional, and that this knowledge will enable us not only to “maintain individual wellbeing” but also to miraculously cure intractable diseases like cancer and diabetes.

Unlike the “arsenic bacteria” fiasco, the EP experiments were done carefully and thoroughly.  The information unearthed and collated with this research is very useful, if only a foundation; as with the HGP, this cataloguing quest also contributed to development of techniques. What is way off are the claims, both proximal and distal.

A similar kind of “theory of everything” hype surrounded the HGP, but in the case of the EP the hype has been ratcheted up severalfold, partly due to the increased capacity for rapid, saturating online dissemination.  And science journalists who should know better (in Science, BBC, NY Times, The Guardian, Discover Magazine) made things worse by conflating junk, non-protein-coding and regulatory DNA.

Biologists – particularly those of us involved in dissecting RNA regulation – have known since the eighties that much of “junk” DNA has functions (to paraphrase Sydney Brenner, junk is not garbage).  The EP results don’t alter the current view of the genome; they just provide a basis for further investigation. Their definition of “functional” is “biochemically active” – two very different beasts – and the functions (let alone any disease cures) will require exhaustive independent authentication of the EP batch results.

Additionally, the findings were embargoed for years to enable the PR blitz – at minimum unseemly when public funds are involved. On the larger canvas, EP signals the increased siphoning of ever-scarcer funds into mega-projects that preempt imaginative, risky work.  Last but not least, the PR phrasing choices put wind in the sails of creationists and intelligent design (ID) adherents, by implying that everything in the genome has “a purpose under heaven”.

What did the study actually do?  The EP consortium labs systematically catalogued such things as DNase I hypersensitive and methylated sites, transcription factor (TF) binding sites and transcribed regions in many cell types.  Unmethylated nuclease-sensitive DNA is in the “open” configuration – aka euchromatin, a state in which DNA can discharge its various roles.  The TF sites mean little by themselves: to give you a sense of their predictive power, any synthetically made DNA stretch will contain several such sites.  Whether they have a function depends on a whole slew of prerequisites.  Ditto the transcripts, of which more anon.
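To make the point about the weak predictive power of TF sites concrete, here is a small illustrative sketch (mine, not from the EP papers; it assumes equal base frequencies and an exact-match motif, both idealizations) of how often a short motif is expected to occur by pure chance in random DNA:

```python
# Expected chance occurrences of an exact k-mer motif in random DNA,
# assuming equal base frequencies (an idealization).

def expected_hits(motif_len: int, seq_len: int) -> float:
    """Expected count of a specific motif_len-mer in a random sequence."""
    return (seq_len - motif_len + 1) / 4 ** motif_len

# Typical TF core motifs are 6-8 bp; in a 10 kb synthetic stretch:
print(round(expected_hits(6, 10_000), 1))   # ~2.4 chance hits
print(round(expected_hits(8, 10_000), 2))   # ~0.15
```

Real TF recognition is fuzzier than an exact match, which only raises the chance-hit count – hence sites “mean little by themselves.”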

Let’s tackle “junk” DNA first, a term I find as ugly and misleading as the word “slush” for responses to open submission calls. Semantic baggage aside, the label “junk” was traditionally given to DNA segments with no apparent function.  Back in the depths of time (well, circa 1970), all DNA that did not code for proteins or proximal regulatory elements (promoters and terminators) was tossed on the “junk” pile.

However, in the eighties the definition of functional DNA started shifting rapidly, though I suspect it will never reach the 80% used by the EP PR juggernaut.  To show how the definition has drifted and expanded – and how its meaning has been muddied into a term of art clear mostly to the workaday splicers et al who keep abreast of trendy interpretations that may elude the laity – let’s meander down the genome buffet table.

Protein-coding segments in the genome (called exons, which are interrupted by non-protein-coding segments called introns) account for about 2% of the total.  That percentage increases a bit if non-protein-coding but clearly functional RNAs are factored in (structural RNAs: the U family, r- and tRNAs; regulatory miRNAs and their cousins).

About 25% of our DNA is regulatory and includes signals for: un/packing DNA into in/active configurations; replication, recombination and meiosis, including telomeres and centromeres; transcription (production of heteronuclear RNAs, which contain both exons and introns); splicing (excision of the introns to turn hnRNAs into mature RNAs, mRNA among them); polyadenylation (adding a homopolymeric tail that can dictate RNA location); export of mature RNA into the cytoplasm; and translation (turning mRNA into protein).

All these processes are regulated in cis (by regulatory motifs in the DNA) and in trans (by RNAs and proteins), which gives you a sense of how complex and layered our peri-genomic functions are. DNA is like a single book that can be read in Russian, Mandarin, Quechua, Maori and Swahili.  Some biologists (fortunately, fewer and fewer) still place introns and regions beyond a few thousand nucleotides up/downstream of a gene in the “junk” category, but a good portion is anything but: such regions contain key elements (enhancers and silencers for transcription and splicing) that allow the cell to regulate when and where to express each protein and RNA; they’re also important for local folding that’s crucial for bringing relevant distant elements in correct proximity as well as for timing, since DNA-linked processes are locally processive.

But what of the 70% of the genome that’s left?  Well, that’s a bit like an attic that hasn’t been cleaned out since the mansion was built.  It contains things that once were useful – and may be useful again in old or new ways – plus gewgaws, broken and rusted items that can still influence the household’s finances and health… as well as mice, squirrels, bats and raccoons.  In bio-jargon, the genome is rife with duplicated genes that have mutated into temporary inactivity, pseudogenes, and the related tribe of transposons, repeat elements and integrated viruses. Most are transcribed and then rapidly degraded, processes that do commandeer cellular resources.  Some are or may be doing something specific; others act as non-specific factor sinks and probably also buffer the genome against mutational hits.  In humans, such elements collectively make up about half of the genome.
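Tallying the rough figures above makes the genome “budget” explicit – a back-of-the-envelope sketch (all values approximate, taken from the percentages quoted in the text; the categories overlap):

```python
# Back-of-the-envelope genome bookkeeping with the rough figures in the text
# (fractions of the whole human genome; approximate, categories overlap).
coding     = 0.02   # protein-coding exons
regulatory = 0.25   # cis signals: packing, replication, transcription, splicing...
attic      = 1.0 - coding - regulatory   # the leftover "attic"
repeats    = 0.50   # transposons, pseudogenes, repeats (mostly within the attic)

print(f"attic: {attic:.0%} of the genome")           # roughly 73%
print(f"repeats and kin: {repeats:.0%} of the genome")
```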

So even bona fide junk DNA is not neutral and is still subject to evolutionary scrutiny – but neither does every single element map to a specific function.  We know this partly because genome size varies very widely across species whereas the coding capacity is much less variable (the “C-value paradox”), partly because removal of some of these regions does not affect viability in several animal models, including mice. It’s this point that EP almost deliberately obfuscated by trumpeting (or letting be trumpeted) that “junk DNA has been debunked”, ushering in “a view at odds with what biologists have thought for the past three decades.”

Continuing down the litany of claims, will this knowledge help us cure cancer and diabetes?  Many diseases are caused not by mutations within the protein-coding regions but by mutations that affect regulation.  Unmutated (“wild-type”) proteins at the wrong time, place or amount can and do cause disease: the most obvious paradigm is trisomy 21 (Down syndrome) but cancer and dementia are also prominent members in this category, which includes most of the slow chronic diseases that have proved refractory to “magic bullet” treatments.  Techniques that allow identification of changes in regulatory elements obviously feed into this information channel. So a systematic catalogue of regulatory elements across cell types is a prerequisite to homing in on specific stretches known or predicted to have links to a disease or disease susceptibility.

A few potential problems lurk behind this promising front.  One is that the variation among normal individual genomes is great – far greater than expected.  There’s also the related ground-level question of what constitutes normal: each of us carries a good number of recessive-lethal alleles.  So unless we have a robust, multiply overlapping map of acceptable variability, we may end up with false positives – for example, classifying a normal but uncommon variant as harmful.  Efforts to create such maps are currently in progress, so this is a matter of time.

Two additional interconnected problems are assigning true biological relevance to a biochemically defined activity and disentangling cause and effect (this problem also bedevils other assays – the related SNP [single nucleotide polymorphism] technique in particular).  To say that a particular binding site is occupied in a particular circumstance does not show a way to either diagnostics or therapeutics.  “Common sense” deductions from incomplete basic knowledge or forced a priori conclusions have sometimes led to disasters at the stage of application (the amyloid story among them – in which useless vaccines were made based on the mistaken assumption that the plaques are the toxic entities).

The pervasive but clearly erroneous take-home message of “a function for everything” harms biology among laypeople by implying ubiquitous purpose.  It also feeds right into the perfectibility concept that fuels such dangerous nonsense as the Genetic Virtue Project.  Too, it will attract investors who will push sloppy work based on flimsy foundations.  Of course, it’s funny to see creationists fall all over themselves to endorse the EP results while denying the entire foundation that gives raison d’être and context to such projects.  As for ID adherents, they should spend some time datamining genome-encompassing results (microarray, SNP, genome-wide association studies, deep sequencing and the like) to see how noisy and messy our genomes really are.  I’d be happy to take volunteers for my microarray results – might as well use the eagerness to do real science!

What the EP results show (though they’re not the first or only ones to do so) is how complex and multiply interlinked even our minutest processes are.  Everything discussed in the EP work and in this and many other articles takes place within the cell nucleus, yet the outcomes can make and unmake us.  The results also show how much we still need to learn before we can confidently make changes at this level without fear of unpredicted/unpredictable side effects.  That’s for the content part.  As for the style, it’s true that some level of flamboyance may be necessary to get across to a public increasingly jaded by non-stop eye- and mind-candy.

However, people are perfectly capable of understanding complex concepts and data even if they’re not insider initiates, provided they examine them without wishing to shoehorn them into prior agendas.  Accuracy does not equal dullness and eloquence does not equal hype.  The EP results are important and will be very useful – but they’re not paradigm shifters or miracle tablets and should not pretend to be.

Citations:

Brenner S (1990).  The human genome: the nature of the enterprise.  In: Human Genetic Information: Science, Law and Ethics (CIBA Foundation Symposium No. 149).  John Wiley and Sons Ltd.

ENCODE Project Consortium, Bernstein BE, Birney E, Dunham I, Green ED, Gunter C, Snyder M (2012).  An integrated encyclopedia of DNA elements in the human genome. Nature 489:57-74. doi: 10.1038/nature11247.

Stamatoyannopoulos JA (2012). What does our genome encode?  Genome Res. 22:1602-11.

Useful Analyses and Critiques:

Birney, E.  Response on ENCODE reaction.  (Bioinformatician at Large, Sept. 9, 2012).

Note: Ewan Birney is one of the major participants in the ENCODE project.

Eddy, S.  Encode says what? (Cryptogenomicon, Sept. 8, 2012).

Eisen M. This 100,000 word post on the ENCODE media bonanza will cure cancer (Michael Eisen’s blog, Sept. 6, 2012).

Timmer, J.  Most of what you read was wrong: how press releases rewrote scientific history (Ars Technica, Sept. 10, 2012).

Two More Borders Crossed

Monday, September 17th, 2012

My article on junk DNA and the recent huge PR noise associated with it just appeared in Scientific American as a guest post.  I will reprint it here toward the end of the week, to give SciAm its time lead.

This has also occasioned the opening of (groan) a Twitter account that will be essentially an adjunct to the blog, as are my FB and LJ accounts.  The handles are Helivoy (LJ) and AthenaHelivoy (TW) — but if you’re tracking the blog, all else is redundant.

Cool Cat by Ali Spagnola

The Psychology of Space Exploration: A Review — Part 2

Monday, September 17th, 2012

by Larry Klaes, space exploration enthusiast, science journalist, SF aficionado.

Note: this is a companion piece to Those Who Never Got to Fly.

Part 1

To give some examples of what I feel is missing and limited in representation in Psychology of Space Exploration, there is but a brief mention of what author Frank White has labeled the “Overview Effect”. As the book states, this is the result of “truly transformative experiences [from flying in space] including sense of wonder and awe, unity with nature, transcendence, and universal brotherhood.”

Clearly this is a very positive reaction to being in space, one which could have quite helpful benefits for those who are exploring the Universe. The Overview Effect might also have an ironic down side, one where a working astronaut might become so caught up in the “wonder and awe” of the surrounding Cosmos away from Earth that he or she could miss a critical mission operation or even forget what they were originally meant to do. Mercury astronaut Scott Carpenter may have been one of the earliest “victims” of the Overview Effect during his Aurora 7 mission in 1962. Apparently his very human reaction to being immersed in the Final Frontier in part caused Carpenter to miss some key objectives during his mission in Earth orbit and even overshoot his landing zone by some 250 miles. Carpenter never flew in space again, despite being one of the top astronauts among the Mercury Seven. It would seem that in those early days of the Space Race, having the Right Stuff did not include getting caught up with the view outside one’s spacecraft window, at least so overtly.

Image: Buzz Aldrin. Credit: NASA

Another item largely missing from Psychology of Space Exploration is the effects on space personnel after they come home from a mission. Edwin “Buzz” Aldrin, who with Neil Armstrong became the first two humans to walk on the surface of the Moon with the Apollo 11 mission in 1969, is one of the earliest examples of publicly displaying the truly human side of being an astronaut.

Although not revealed publicly until 2001 by former NASA flight official Christopher C. Kraft, Jr., in his autobiography Flight: My Life in Mission Control, the real reason Aldrin was not selected to be the first one to step out of the Apollo Lunar Module Eagle onto the Moon was due to the space agency’s personal preference for Armstrong, who Kraft called “reticent, soft-spoken, and heroic.” Aldrin, on the other hand, “was overtly opinionated and ambitious, making it clear within NASA why he thought he should be first [to walk on the Moon].”

Even though Aldrin was a fighter pilot during the Korean War, earned a doctorate in astronautics at the Massachusetts Institute of Technology (MIT), and played an important role in solving the EVA issues that had plagued most of the Gemini missions – work critical to the success of Apollo and beyond – his failure to follow the unspoken code of the Right Stuff kept him from making that historic achievement.

Aldrin would later throw the accepted version of the Right Stuff for astronauts right out the proverbial window when he penned a very candid book titled Return to Earth (Random House, 1973). The first of two autobiographies, the book revealed personal details as had no space explorer before and few since, including the severe depression and alcoholism Aldrin went through after the Apollo 11 mission and his departure from NASA altogether several years later, never to reach the literal heights he accomplished in 1969 or even to fly in space again. Although Aldrin would later recover and become a major advocate of space exploration, he is not even given a mention in Psychology of Space Exploration. In light of what later happened with Nowak and several other astronauts in their post-career lives, I think this is a serious omission from a book that is all about the mental states of space explorers.

The other glaring omission from this work is any discussion of the human reproductive process in space. NASA has been especially squeamish about this particular behavior in the Final Frontier. There is no official report from any space agency with a manned program on the various aspects of reproduction among any of its space explorers, only some rumors and anecdotes of questionable authenticity.

As with so much else regarding the early days of the Space Age, that may not have been an issue with the relatively few (primarily male) astronauts and cosmonauts confined to cramped spacecraft for a matter of days and weeks, but this will certainly change once we have truly long duration missions, space tourism, and non-professionals living permanently off Earth. As with daily life on this planet, there will be situations and issues long before and after the one aspect of human reproduction that is so often focused upon. Unfortunately, outside of some experiments with lower animals, real data on this activity vital to a permanent human presence in the Sol system and beyond is absent.

I recognize that Psychology of Space Exploration is largely a historical perspective on human behavior and interaction in space. As there have been no human births yet in either microgravity conditions or on another world and the other behaviors associated with reproduction are publicly unknown, this work cannot really be faulted for lacking any serious information on the subject. What this does display, however, is how far behind NASA and all other space agencies are in an area which will likely be the determining factor in whether humans expand into the Cosmos or remain confined to Earth.

So Far Along, So Far to Go

What the Psychology of Space Exploration ultimately demonstrates is that despite real and important improvements in how astronauts deal with being in space and the way NASA views and treats them since the days of Project Mercury, we are not fully ready for a manned scientific expedition to Mars, let alone colonizing other worlds.

Staying in low Earth orbit for six months at a stint aboard the ISS as a standard space mission these days gives an incomplete picture of what those who will be spending several years traveling to and from the Red Planet across many millions of miles of space will have to endure and experience. If an emergency arises that requires more than what the mission crew can handle, Earth will likely be a distant blue star for them rather than the friendly globe occupying most of their view which all but the Apollo astronauts have experienced since 1961.

Image: Jerrie Cobb poses next to a Mercury spaceship capsule. Although she never flew in space, Cobb, along with twenty-four other women, underwent physical tests similar to those taken by the Mercury astronauts with the belief that she might become an astronaut trainee. All the women who participated in the program, known as First Lady Astronaut Trainees, were skilled pilots. Dr. Randy Lovelace, a NASA scientist who had conducted the official Mercury program physicals, administered the tests at his private clinic without official NASA sanction. Cobb passed all the training exercises, ranking in the top 2% of all astronaut candidates of both genders. Credit: NASA.

Regarding this view of the shrinking Earth from deep space, the multiple authors of Chapter 4 noted that ISS astronauts took 84.5 percent of their mission photographs at their own initiative, driven by their own motivation and choices. Most of these images were of our planet moving over 200 miles below their feet. The authors noted how much of an emotional uplift it was for the astronauts to image Earth in their own time and in their own way.

The chapter authors also had this to say about what an expedition to Mars might encounter:

As we begin to plan for interplanetary missions, it is important to consider what types of activities could be substituted. Perhaps the crewmembers best suited to a Mars transit are those individuals who can get a boost to psychological well-being from scientific observations and astronomical imaging. Replacements for the challenge of mastering 800-millimeter photography could also be identified. As humans head beyond low-Earth orbit, crewmembers looking at Earth will only see a pale-blue dot, and then, someday in the far future, they will be too far away to view Earth at all.

Now of course we could prepare and send a crewed spaceship to Mars and back with a fair guarantee of success, both in terms of collecting scientific information on that planet and in the survival of the human explorers, starting today if we so chose to follow that path. The issue, though, is whether we would have a mission of high or low quality (or outright disaster) and if the results of that initial effort of human extension to an alien world would translate into our species moving beyond Earth indefinitely to make the rest of the Cosmos a true home.

The data recorded throughout Psychology of Space Exploration clearly indicate that despite over five decades of direct human expeditions by many hundreds of people, we need much more than just six months to one year at most in a collection of confined spaces repeatedly circling Earth. This will affect not only our journeys and colonization efforts throughout the Sol system but certainly should we go with the concept of a Worldship and its multigenerational crew as a means for our descendants to voyage to other suns and their planets.

This book is an excellent reflection of NASA in its current state and human space exploration in general. As with the agency’s manned space program since the days when the Mercury Seven were first introduced to the world in 1959, we have indeed come a long way in terms of direct space experience, mission durations, gender and ethnic diversity, and understanding and admitting the physiological needs of those men and women who are brave and capable enough to deliberately venture into a realm they and their ancestors did not evolve in and which could destroy them in mere seconds.

Having said all this, what I hope is apparent is that we now need a new book – perhaps one written outside the confines of NASA – that will address in rigorous detail the missing issues I have brought to light in this piece. This request, and the subsequent next steps in our species’ expansion into space – which will also eventually take place beyond the organizational borders of NASA – cannot help but improve our chances of becoming a truly enduring and universal society in a Cosmos where certainty and safety are not guaranteed to beings who remain confined physically and mentally to but one world.

The Psychology of Space Exploration: A Review — Part 1

Thursday, September 13th, 2012

by Larry Klaes, space exploration enthusiast, science journalist, SF aficionado.

Note: this is a companion piece to Those Who Never Got to Fly.

Early on the morning of February 5, 2007, several officers from the Orlando Police Department in Florida were summoned to the Orlando International Airport, where they arrested a female suspect. This woman was alleged to have attacked another woman she had been stalking while the latter sat in her car in the airport parking lot. Judging by the various items later found in the vehicle the suspect had used as transportation to the Sunshine State all the way from her home in Houston, Texas, her ultimate intent was to kidnap and possibly conduct even worse actions upon her victim.

While such a criminal incident is sadly not uncommon in modern society, what surprised and even shocked the public upon learning what happened was the occupation of the perpetrator: She was a veteran NASA astronaut, a flight engineer named Lisa Nowak who had flown on the Space Shuttle Discovery in July of 2006. As a member of the STS-121 mission, Nowak spent almost two weeks in Earth orbit aboard the International Space Station (ISS), performing among other duties the operation of the winged spacecraft’s robotic arm.

It seems that the woman who Nowak went after, a U.S. Air Force Captain named Colleen Shipman, was in a relationship with a male astronaut named William Oefelein. Nowak had also been romantically involved with Oefelein earlier, but he had gradually broken off their relationship and started a new one with Shipman. Oefelein would later state that he thought Nowak seemed fine about his ending their affair and moving on to another woman. However, by then it was painfully and very publicly obvious that Oefelein had not thoroughly consulted enough with his former companion on this matter.

NASA would eventually dismiss Nowak and Oefelein from their astronaut corps, the first American space explorers ever formally forced to leave the agency. NASA also created an official Code of Conduct for their employees in the wake of this publicity nightmare.

Now I have no documented proof of this, but I strongly suspect that the Nowak incident played a large but officially unacknowledged role in the creation of a recent offering from the NASA History Program Office, a book titled Psychology of Space Exploration: Contemporary Research in Historical Perspective (NASA SP-2011-4411), edited by Douglas A. Vakoch, a professor in the Department of Clinical Psychology at the California Institute for Integral Studies and the director of Interstellar Message Composition at The SETI Institute.

Quoting from a NASA press release (11-223), which appeared about the same time as the book:

Psychology of Space Exploration is a collection of essays from leading space psychologists. They place their recent research in historical context by looking at changes in space missions and psychosocial science over the past 50 years. What makes up the “right stuff” for astronauts has changed as the early space race gave way to international cooperation.

The book itself is available online in several formats.

From the Right Stuff to All Kinds of Stuff

It may seem obvious to say that astronauts are as human as the rest of us, but in fact our culture has long viewed those who boldly go into the Final Frontier atop a controlled series of explosions otherwise known as a rocket in a much different and higher regard than most mere mortals. Even before the first person donned a silvery spacesuit and stepped inside a cramped and conical Mercury spacecraft mated to a former ICBM for a brief arcing flight over the Atlantic Ocean in 1961, NASA’s first group of human space explorers – known collectively as the Mercury Seven – were being presented from their very first press briefing in 1959 as virtual demigods who had the right skills and mental attitude to brave the unknown perils of the Universe.

Image: The Mercury Seven stand in front of an F-106 Delta Dart. Credit: Wikimedia Commons.

The Mercury Seven astronauts were not just men: They were an elite breed of space warriors ready to conquer the Cosmos who also represented the best that the United States of America had to offer when it came to their citizens, their technology, and their science. The nation’s first space explorers may have been ultimately human and limited in various ways, even flawed, but the agency’s goal was to keep any issues in check through their missions at the least and preferably during their full tenure with NASA.

By the time of Nowak’s incident, astronauts may not have been the demigods of the days of Mercury, Gemini, and Apollo, but they were still looked upon as highly capable people who ventured to places few others have gone and who did not give in to human passions beyond a few moments of wonder at the Universe, realistic or not. This is why Nowak’s and Oefelein’s behaviors were so shocking to the public even four decades after the first generation of space explorers.

There are two reasons why I brought up the dramatic events of 2007 with Lisa Nowak: The first is my aforementioned hypothesis that what took place between the former astronaut and her perceived romantic rival led to NASA feeling the need to examine their policies regarding the human beings they send into space and formally documenting the resulting studies.

The second reason is that Psychology of Space Exploration needed more of these personal stories about the astronauts and cosmonauts. Now certainly there were some of these throughout the book: The Introduction to Chapter 1 relays a tale about a test pilot who was applying to be an astronaut who told an evaluating psychiatrist about the time the experimental aircraft he was flying started spinning out of control. The pilot responded to this emergency by calmly leafing through the vehicle’s operating manual to solve the immediate problem, which he obviously did.

Nevertheless, more of these kinds of stories would not only have made the book less dry in the places where it sagged, but would also have added immeasurably to its information content.

As just one example, in Chapter 2 on page 26, the author mentions (from another source) that the Soviet space missions “Soyuz 21 (1976), Soyuz T-14 (1985), and Soyuz TM-2 (1987) were shortened because of mood, performance, and interpersonal issues. Brian Harvey wrote that psychological factors contributed to the early evacuation of a Salyut 7 [space station] crew.”

The problem here is that the book then moves on without going into any details about exactly what happened to curtail these missions. Knowing what took place could mean the difference between a secure, functioning crew and disaster on future space ventures, especially the really long duration missions that will be a necessity as we move past our Moon.

Incidentally, the author noted that the Soviets, who were usually reticent about giving out technical details or goals for most space missions, manned and robotic, were more open when it came to the experiences of their cosmonauts and showed more interest in their physiological condition in confined microgravity than NASA often did with their astronauts.

The Soviet space program also had a longer period of actual experience with humans living aboard space stations, starting in 1971 with Salyut 1 (or Soyuz 9 in 1970, if you want to count that early endurance-record jaunt) – experience NASA did not match between its three Skylab missions in 1973-1974 and its joint involvement with the Soviet Mir station in the 1990s. Having the details from that era would be of obvious benefit and interest.

Image: The Mir station hovering over Earth. It deorbited on March 21, 2001. The station was serviced by Soyuz and Progress spacecraft as well as U.S. space shuttles, and was visited by astronauts and cosmonauts from 12 different nations. It endured 15 years in orbit, three times its planned lifetime. Credit: NASA.

Granted, as with a collection of research papers such as this, there are plenty of references. Finding the stories this way is not a problem if you are doing your own research and using Psychology of Space Exploration as a reference source, but for the more casual reader it could be a bit of a disappointment when these items are not readily available.

While I think most people who want to learn more about how our space explorers are affected by and respond to and during their missions into the Final Frontier will find something of interest and value throughout this book, Psychology of Space Exploration is largely a reference work that goes into levels of certain details as befitting literature of its type while missing a number of others which I think are just as important for a comprehensive view of human expansion into space, both in the past, the present, and most vitally the future.

The ultimate goal of putting people into space is eventually to create a permanent presence of our species beyond Earth. That is the grand aim even if their initial underlying purposes were more geared towards engineering and geopolitical goals. This is similar to the history of the early navigators who crossed the Atlantic Ocean from Europe to the New World, for they too had other plans initially in mind, although the ultimate result was the founding of the many nations that exist in the Western Hemisphere today.

Part 2

The Charlatan-Haunted World

Sunday, August 26th, 2012

In the larger context of how sciency blather shapes culture, including speculative literature, it’s interesting to juxtapose two movement gurus, Ray Kurzweil and Deepak Chopra. Many consider them very different but in fact they’re extremely similar. Essentially, both are prophet-wannabes who are attempting to gain legitimacy by distorting science to fit a cynically self-aggrandizing agenda.

Chopra goes the faux grand unification route; Kurzweil belongs to the millenarian camp, including his habit of setting goals that ever recede: the year we become optimized by nanobots… the year we upload our minds to silicon frames… the year we welcome our AI overlords. The Singularity and the complete reverse-engineering of the human brain were slated for 2010; now the magic year is 2045. Sound familiar?

Both men are embodiments of Maslow’s dictum that if all you have is a hammer, everything looks like a nail. Chopra’s hammer is power of mind over matter; Kurzweil’s, Moore’s Law with “exponential” as its abracadabra. It’s easy to laugh at Chopra’s blatant misuse of quantum mechanics and his idea that we can destroy tumors with sheer thought power. Most of the biocentrist vaporings of Chopra, Lanza et al can be dealt with by one word: decoherence. Conversely, Kurzweil’s ignorance of basics is so obvious to a biologist that seeing him being taken seriously makes you feel you’re in a parallel universe. For the rest of this article, I will focus on transhumanism (TH) and just briefly linger on salient points many of which I’ve covered before in detail.

I’ve often said that cyberpunk is the fiction arm of TH, but upon reflection I think it would be more accurate to say TH is a branch of cyberpunk SF if not fantasy – and not a particularly original one, at that. At the same time, even its own adherents are starting to publicly admit that TH is a religion. After all, its wish list consists of the same things humans have wanted since time immemorial: immortality and eternal youth. Eternally perky breasts and even perkier penises. Those lucky enough to attain these attributes will frolic in Elysian Fields of silicon or in gated communities like today’s Dubai or tomorrow’s seasteads. Followers of other religions have to wait patiently for paradise; transhumanists can gain instant bliss by thronging to Second Life. Or as that famous Sad Children cartoon says, “In the future, being rich and white will be even more awesome.”

Transhumanists posit several items as articles of faith. All these items require technology indistinguishable from magic – and in some cases, technology that will never come to pass because of intrinsic limitations. Transhumanists call unbelievers Luddites — funny, given that many who object to the cult approach are working scientists or engineers. Among the TH tenets:

1. Perfectibility: “optimization” of humans is not only possible but also desirable.

1a. Genes determine high-order behavior: intelligence, musical talent, niceness. This has gone so far that there is a formal TH proposal by Mark Walker to implement a Genetic Virtue Program; in cyberpunk SF you see it in such laughable items as Emiko having “dog loyalty genes” in Paolo Bacigalupi’s inexplicably lauded The Windup Girl. Basing a genetic program on the concept that genes determine high-order behavior is like planning an expedition to Mars based on the Ptolemaic system. Genomes act as highly networked ensembles and organisms are jury-rigged. Furthermore, optimization for one function in biological systems (across scales) makes for suboptimality at all else.

1b. There’s one-to-one mapping between hormones or evolutionary specifics and behavior. Most of these generalizations  come from research on non-humans (mice for hormones; various primates for evolution) and lead to conclusions like: people can become lovesome by judicious applications of oxytocin or murderous by extra helpings of testosterone; and to the evopsycho nonsense of “alpha male rape genes” and “female wired-for-coyness brains”. This is equally endemic in what I call grittygrotty fantasy, but it seems to be at odds with TH’s willingness to entertain the concepts of gender fluidity and sculpting-at-will.

1c. Designer genetic engineering will come to pass, including nanotech that will patrol us internally. Genetic engineering is already with us, but it will take time to fine-tune it for routine “vanity” use. Of course, we already have nanites – they’re called enzymes. However, cells are not this amorphous soup into which nanoships can sail at whim. They’re highly organized semi-solid assemblages with very specific compartments and boundaries.  The danger for cell and organ damage shown in the cheesy but oddly prescient Fantastic Voyage is in fact quite real.

2. Dualism: biological processes can be uncoupled from their physical substrates.

2a. Emotions are distinct from thoughts (and the former are often equated with the non-cortical Four Fs). This aligns with such items as the TH obsession with sexbots and proxy relationships through various avatars — and the movement’s general fear and dislike of the body. Of course, our bodies are not passive appendages but an integral part of our sensor feedback network and our sense of identity.

2b. It is possible to achieve immortality and continuity of consciousness by uploading, which might as well be called by its real name: soul – as Battlestar Galumphica at least had the courage to do. It should go without saying that uploading, even if a non-destructive implementation ever became possible, would create an autonomous copy.  I still boggle at Stross’ pronouncement that “Uploading … is not obviously impossible unless you are a crude mind/body dualist. // Uploading refutes the doctrine of the existence of an immortal soul.”

3. Dogma: invalid equivalences and models for complexity.

3a. The brain is a computer. This leads to fantasies that “expansion of capabilities” (however defined) and such things as uploading or “stigmata” (that is, leakage between VR and reality) are possible. The fundamental point is that the brain is not a computer in any way that is useful to either biology or computer science, starting with the fact that a brain is never a blank chassis that passively accepts software. Also, it’s one thing to observe that the cerebellum contains four types of neurons, another to talk of stacks. The black noise on this has reached such a level that I cringe whenever I hear people discuss the brain using terms like “Kolmogorov complexity”.

3b. Sentient AI and animal uplift will not only come to pass, but will also produce entities that are remarkably similar to us. Connected to this are the messianic ravings of the extropians, who envision themselves as essentially overseers in plantations, as well as David Pearce’s “imperative” that any issues will be ironed out with such things as contraceptives for sentient rabbits and aversion therapy for sentient cats that will turn them into happy vegans. However, cat intestines are formed in such a way that they need meat to survive. If they must be medicated non-stop (let alone mangy from malnutrition), much better to design a species de novo. Crowley’s Leos and Linebarger’s Underpeople were both more realistic and more humane than the equivalent TH constructs.

Like all religions, TH has its sects and rifts, its evangelicals and reformists. Overall, however, the shiny if mostly pie-in-the-sky tech covers a regressive interior: TH hews to triumphalism, determinism and hierarchies. Interestingly, several SF authors (most notably Iain Banks) see TH applications as positive feedback loops for a terminal era of plenty: infinite resources courtesy of nanites, infinite flexibility in identities and lifestyles. However, I think that we’re likelier to see some of this technology become real in two contexts: an earth running out of resources… and people in long-generation starships and quasi-terrestrial exoplanets.

In both cases, we may have to implement radical changes not for some nebulous arbitrary perfection, or as a game of trust/hedge fund playboys, but when we’re in extremis and/or for a specific context. For example, the need to hibernate on an ice-bound planet or survive on toxic foodstuffs. Because TH is essentially a futuristic version of Manifest Destiny, it’s an unsuitable framework for exploring low-key sustainability alternatives. But TH does itself even fewer favors by harnessing stale pseudoscience to its chariots of the gods.  People like Kurzweil have the education and intelligence to know better, which makes them far more culpable than brain-dead ignorant haters like Akin.

Note: This article is an adaptation of the talk I gave to Readercon 2012 this July.  A panel discussion followed the talk; the other participants were John Edward Lawson, Anil Menon, Luc Reid and Alison Sinclair.

Related articles:

Equalizer or Terminator?
Miranda Wrongs: Reading Too Much into the Genome
Ghost in the Shell: Why Our Brains Will Never Live in the Matrix
“Are We Not (as Good as) Men?”
Won’t Anyone Think of the Sexbots?!
That Shy, Elusive Rape Particle

Images: 1st, Mike Myers as Maurice Pitka in The Love Guru; 2nd, flowchart from The Talking Squid, who adapted an original by Wellington Grey; 3rd, The Transhumanist by movement member Sandberg — appropriately enough, part of a Tarot card set.

Fresh Breezes from Unexpected Quarters

Tuesday, August 14th, 2012

As I get older, it’s harder to find books or films that surprise me – pleasantly, that is. I went to see The Dark Knight Rises (TDKR) and The Bourne Legacy (TBL) more as a means to avoid the New England summer humidity and give my cortex a chance to cool down between edits of my SF anthology. Both films had the expected scads of sound and fury, yet one of them managed to surprise me. To be clear, I’ve read neither the Miller comics nor the Ludlum or Lustbader books; so those who plan to use arguments of the type “But this is explained on page 4 of issue 13!” can save their breath.

I detest Christopher Nolan’s ponderous dourness. The only film of his I found remotely intriguing was The Prestige. Auteur pretensions aside, the closest relatives of Nolan’s Batman opus are the abysmal Star Wars prequels. The two trilogies share pretty much everything: the wooden dialogue, the cardboard characters, the manipulative sentimentality, the leaden exposition, the cultural parochialism, the nonsensical plot, the worshipping of messiahs and unaccountable privileged elites, the contempt for “mundanes” and democratic structures, the dislike of women and non-hierarchical relationships. To be sure, Nolan’s second Batman film boasted the unforgettable performance of Heath Ledger’s Joker. But TDKR should have been called Bat Guano or Darth Vader Meets the Transformers.

The reactionary politics (Billionaires and police know best! People left unherded devolve instantly to mob rampaging and kangaroo courts!) are bad enough. So are the obvious telegraphings and pious ersatz-mythic strains (“Rise! Rise! Rise!” — and of course, sob, the orphan boys). But the film is dull, unfocused, lumbering and messy even within its own frame: why the elaborate (and totally fallow) Wall Street takeover if Bane intends to blow the city up anyway? The protracted mano-a-mano between Batman and Bane is frankly dumb. All Batman has to do is rip out Bane’s breathing muzzle – incidentally, a lousy way to deliver pain meds. The reversals of the two women (antagonist becomes ally and vice versa… and the villain, naturally, is the one who removes her clothes) are so much by the numbers that I felt literally itchy. The hero rejoins the living twice, once as Bruce, once as Batman, for zero reasons of either plot or emotional logic.

The two male protagonists are boring one-note ciphers. Batman doesn’t earn his increasingly stale angst; even less so the unquestioning loyalty of his long-suffering allies (as laid bare in a great analysis of Gary Stuism). Christian Bale isn’t capable of more than one facial expression anyway – in Terminator Salvation he was more wooden than Sam Worthington, which is a real achievement. Needless to add, he has zero chemistry with either of the romantic interests put on Bruce Wayne’s silver spoon. Nolan criminally wastes Tom Hardy, who can really act: he made a feral, magnetic Ricki Tarr in the remake of Tinker, Tailor, Soldier, Spy and the one glimpse of his face when he’s about to be swallowed by the raging crowd in TDKR shows what he’s capable of. The Bane/Vader parallel is obvious: the raging slave of great ability who dares to love above his station and heroically serves a cause intrinsically hostile to him – yet is demonized because he doesn’t fit the Messiah profile, first due to his “wrong” pedigree, later due to severe mutilations that limit his potential. The equivalence is made plain by several touches beyond the breathing mask, including the camera lingering on the frantically kicking feet of someone in his grip.

Oddly enough, both principal women fare fractionally better as characters, despite (or because of?) Nolan’s palpable disinterest in them. Anne Hathaway’s Selina Kyle owes more to Charlize Theron’s slinky yet formidable Aeon Flux than to past Catwomen, which is to the good. On the other hand, the obsessive zooming on her ass while she’s maneuvering the Batbike is emetic (when Batman does it, his nether cheeks are decorously covered by his cape — which finally gives a reason for its existence). There is also a hint that she’s bisexual, which makes her truly intriguing. But for my money, Marion Cotillard’s Talia al Ghul is hands down the most arresting presence in the entire Batman film parade. I’d be happy to see a whole film with her as the protagonist. Hell, a trilogy. Although I’d have preferred that she had gone after her mother’s killers rather than her father’s – especially taking into account her father’s shabby treatment of her savior, to say nothing of his daffy agenda (“cleansing the earth of humanity” using nuclear weapons: unassailable logic, if you’re five years old).

Despite its superficial similarity to TDKR, TBL is a very different beast; I agree with MaryAnn Johanson that it’s high-quality fanfic – specifically, AU fanfic with OCs (in English: alternative universe with original characters). Don’t misunderstand me, it’s far from perfect. It’s uneven, lumpy and ends on a blatant “To Be Continued” note. Nevertheless, it has four great assets besides its intricate interweaving of the Bourne prequel threads: the two principals, Jeremy Renner as Aaron Cross and Rachel Weisz as Dr. Marta Shearing, come across as complex persons – even setting aside the lagniappe of Oscar Isaac as Aaron’s fellow enhanced killing machine; Aaron’s plight is more relevant, interesting and wrenching than that of Jason Bourne; the dialogue is snappy, non-generic, character-specific; and it gets its science as right as Hollywood possibly can. As is often the case with me, I’m in the minority. TBL’s Rotten Tomatoes rating is significantly lower than TDKR’s, in part because many reviewers (like orthodox fanfic readers) want canon, not AU; some have also opined that Jeremy Renner lacks Matt Damon’s charisma.

To each his own. To me at least, Damon has the charisma of a particle board plank. Renner, on the other hand, with his lived-in pug/cherub face, comes across as truly dangerous: you’re never sure if he will kiss or kick, yet you trust him when his smile reaches his eyes – a volatility he engaged to stunning effect in The Hurt Locker and to single-handedly elevate The Town into something eminently watchable. Weisz, on her part, radiates intelligence and competence in whatever role she appears, from The Mummy to Agora to The Constant Gardener. She is one of the very few actors who’s entirely believable as a working scientist.

What makes Aaron’s plight closer to my heart and to real life is that he’s in a Flowers for Algernon situation: he got brain damage during his tour of duty, which made him ripe for the poisoned apple of the top secret augmentation program; for him, stopping the medications that leash him to his handlers is equivalent to a sentence of living death. This pegs the jeopardy meter far harder than Jason Bourne’s thriller-cliché amnesia. When Aaron decides to renounce his newly won freedom for the sake of keeping Marta safe, we feel that real stakes are involved. Aaron and Marta are true partners with equally instrumental overlapping skills. Marta does not spend any length of time impersonating quivering jello, nor does she get relegated to the helpmate slot – though knowing Hollywood’s stance on fully human women, I tremble for her fate in the inevitable sequel.

The science is stunningly accurate for a Hollywood film. That’s a real lab in the chilling massacre scene; when Marta injects Aaron with the viral stock that might cut his indenture bonds, she withdraws it from a real cryovial. When she described the delivery problems of viral vectors, I didn’t wince once and the enhancement route she outlined (mitochondrial ratcheting) is in the domain of the possible. She made one error when she segued into brain function: it’s plasticity, not elasticity… but I’ll take it over ANY other Hollywood science in my memory banks. Nor are the slippery slopes ignored: Marta knows that she let her fervent wish to do cutting-edge science override her moral judgment, choosing to close her eyes to the applications of her work.

In the end, Aaron is kin not to Jason Bourne but to the fascinating loners that we glimpse all too briefly in the Bourne franchise: the Professor (Clive Owen), Jarda (Marton Csokas), Outcome 5 (Oscar Isaac). It occurs to me, of course, that these guys fall in my snacho category… which may be one more reason why I liked TBL far more than TDKR.

Images: Tom Hardy as Ricki Tarr (Tinker, Tailor, Soldier, Spy); Marion Cotillard as Édith Piaf (La Vie en Rose); Jeremy Renner as William James (The Hurt Locker); Rachel Weisz as Kathryn Bolkovac (The Whistleblower).

Those Who Never Got to Fly

Thursday, July 26th, 2012

Sally Kristen Ride, one of the iconic First Others in space flight, recently died at the relatively young age of 61: she was the first American woman to fly in space. Her obituary revealed that she was also the first lesbian to do so. Like other iconic First Others (Mae Jemison comes to mind), Sally Ride was way overqualified – multiple degrees, better than her male peers along several axes – and she also left the astronaut program way before she needed to (more about this anon). Even so, Ride remained within the orbit of space exploration activities, including founding NASA’s Exploration Office. She also served on the boards that investigated the Challenger and Columbia disasters; Ride was the only public figure to side with the whistleblowing engineer of Morton-Thiokol when he warned about the problems that would eventually destroy Challenger.

When Sally Ride was chosen for her first mission – by an openly sexist commander who still had to admit she was by far the most qualified for the outlined duties – the press asked her questions like “Do you weep when something goes wrong on the job?” This was 1983, mind you, not the fifties. The reporters noted that she amazed her teachers and professors by pulling effortless straight As in science and – absolutely relevant to an astronaut’s abilities – she was an “indifferent housekeeper” whose husband tolerated it (she was married to fellow astronaut Steve Hawley at the time). Johnny Carson joked that the shuttle launch got postponed until Ride could find a purse that matched her shoes.

Ride and Jemison had to function in this climate but at least they went to space, low-orbit though it had become by then. There were forerunners who never got to do so, even though they were also overqualified. I am referring, of course, to the Mercury 13.

This was the moniker of the early core of women astronauts who trained in parallel with the Mercury 7 and outperformed them – except, as is often the case, they did so in makeshift facilities without official support. Here’s the honor roll call of these pioneers whose wings were permanently clipped (the last names are before marriages changed them): Jane Briggs, Myrtle Cagle, Geraldyn Cobb, Janet Dietrich, Marion Dietrich, Mary Wallace Funk, Sarah Gorelick, Jerrie Hamilton, Jean Hixson, Rhea Hurrle, Irene Leverton, Gene Nora Stumbough, Bernice Trimble.

The Thirteen, never officially part of NASA (they were selected by William Lovelace, who designed the NASA astronaut tests, and the initiative was supported by private donations), had to have at least 1000 hours of flying experience. They underwent the same physical and psychological tests as the men and did as well or better at them: all passed phase I, several went on to phase II, and two completed the final phase III. Those who stopped did so not because they failed phases II or III, but because they lacked the resources to attempt them.

When the Thirteen gathered at Pensacola to show their abilities, the Navy instantly halted the demonstration, using the excuse that it was not an official NASA program. The women, some of whom had abandoned jobs and marriages for this, took their case to Congress. Several people – among them “hero” John Glenn – testified that women were not eligible to fly in space because 1) they didn’t have the exact advanced degrees specified by NASA (neither did Glenn, but he got in without a whisper) and the agency would not accept equivalents and 2) they were prohibited from flying military jets (yet women flew such jets from factories to airfields in WWII; when some of the Mercury 13 flew military jets to qualify, NASA simply ratcheted up that rule).

Space aficionados may recall that the Mercury program’s nickname was “man in a can” – the astronauts had so little control that engineers had to manufacture buttons and levers to give them the illusion of it. Nevertheless, NASA made military jet piloting experience a rule because such men, notorious cockerels, were considered to have The Right Stuff – and Congress used this crutch to summarily scuttle the Mercury 13 initiative, although there was brief consideration of adding women to space missions to “improve crew morale” (broadly interpreted).

It took twenty years for NASA to decide to accept women as astronauts. Just before it did so, hack-turned-fanboi-prophet Arthur C. Clarke sent a letter to Time crowing that he had “predicted” the “problem” brought up by astronaut Mike Collins, who opined that women could never be in the space program, because the bouncing of their breasts in zero G would distract the men. When taken to task, Clarke responded that 1) some of his best friends were women, 2) didn’t women want alpha-male astronauts to find them attractive?? and 3) libbers’ tone did nothing to help their cause. Sound familiar?

Women have become “common” in space flight – except that women still make up only 11% of all spacenauts. Furthermore, given that the major part of today’s space effort goes not to Mars or even the Moon but to scraping fungus off surfaces of the ISS or its equivalents, being an astronaut now is closer to being a housecleaner than a hero. We haven’t come so far after all, and we’re not going much further.

I’m one of the few who believe that women’s rights and successful space exploration (as well as maintenance of our planet) are inextricably linked. As I wrote elsewhere:

“I personally believe that our societal problems will persist as long as women are not treated as fully human. Women are not better than men, nor are they different in any way that truly matters; they are as eager to soar, and as entitled. The various attempts to improve women’s status, ever subject to setbacks and backlashes, are our marks of successful struggle against reflexive institutionalized misogyny. If we cannot solve this thorny and persistent problem, we’ll still survive — we have thus far. However, I doubt that we’ll ever truly thrive, no matter what technological levels we achieve.”

This holds doubly for space exploration – for the goals we set for it, the methods we employ to achieve it and the way we act if/when we reach our destinations.

Addendum: I did not discuss Valentina Tereshkova, who was both the first woman cosmonaut and the first civilian to fly into space, because I wanted to keep the focus of this article on NASA. Nevertheless, I should mention her, as well as Svetlana Savitskaya, the first woman to do a space walk, whose first mission preceded that of Sally Ride.

Sources and further reading

Martha Ackmann, The Mercury 13: The True Story of Thirteen Women and the Dream of Space Flight

Julie Phillips, James Tiptree Jr.: The Double Life of Alice B. Sheldon (one source of the Clarke “distracting breasts” incident and also excellent in its own right)

Site dedicated to the Mercury 13: http://www.mercury13.com/

2nd Image: some of the Mercury 13, gathered to watch the launch in which Eileen Collins was the first woman to pilot a space shuttle mission. Left to right: Gene Nora Stumbough, Mary Wallace Funk, Geraldyn Cobb, Jerri Hamilton, Sarah Gorelick, Myrtle Cagle, Bernice Trimble.

The Other Half of the Sky

Tuesday, July 17th, 2012

I made three appearances in this year’s Readercon: I gave a talk about transhumanism, I was part of a panel that discussed time travel and — last but very decidedly not least — we officially unveiled the SF anthology I am editing. We now have a publisher, as enthusiastic about the project as we are: Candlemark and Gleam, headed by Kate Sullivan. Kay Holt of Crossed Genres, my co-editor in this venture, put together a neat flyer for which she did artwork that reminds me of black-figure Attic vases.

The anthology will bear the title The Other Half of the Sky. Here’s what I said in my outline:

“Women may hold up more than half the sky on earth, but it has been different in heaven: Science fiction still is very much a preserve of male protagonists, mostly performing by-the-numbers quests.

The Other Half of the Sky offers readers heroes who happen to be women, doing whatever they would do in universes where they’re fully human: Starship captains, planet rulers, explorers, scientists, artists, engineers, craftspeople, pirates, rogues…

As one of the women in Tiptree’s “Houston, Houston, Do You Read?” says: “We sing a lot. Adventure songs, work songs, mothering songs, mood songs, trouble songs, joke songs, love songs – everything.” Everything.”

The panel flowed like a sea swell. Four of the authors invited to participate in the anthology (Sue Lange, Ken Liu, Vandana Singh and Joan Slonczewski) discussed it along with Kay and me. Alex Jablokov, another of the invited authors, was also there to lend moral support. We discussed why we embarked on the venture, why we think it covers less-trodden ground and how each author conceived their story within the framework I constructed.

Each participant brought up unique and interesting items pertinent to the larger concerns of the anthology. Among them: interactions with aliens that play out differently from the standard “colonize/annihilate” mode; the reciprocal influence of language and perceptions; the fact that you can have space opera with “regular” people as protagonists, rather than Chosen Ones; the complex requirements for space travel and their intersection with our needs on this planet.

The audience was eager to know when the anthology will appear (spring 2013, barring unexpected obstacles) and asked if we plan a series! So we seem to have struck a chord — maybe even a new melody on the old instrument. I want to thank everyone who helped create this intricate tapestry of a discussion.

Image: art for the anthology flyer for Readercon by Kay Holt.

“Arsenic” Life or: There Is TOO a Dragon in My Garage!

Tuesday, July 10th, 2012

GFAJ-1 is an arsenate-resistant, phosphate-dependent organism — title of the paper by Erb et al, Science, July 2012

Everyone will recall the hype and theatrical gyrations which accompanied NASA’s announcement in December 2010 that scientists funded by NASA astrobiology grants had “discovered alien life” – later modified to “alternative terrestrial biochemistry” which somehow seemed tailor-made to prove the hypothesis of honorary co-author Paul Davies about life originating from a “shadow biosphere”.

As I discussed in The Agency that Cried “Awesome!”, the major problem was not the claim per se but the manner in which it was presented by Science and NASA and the behavior of its originators. It was an astonishing case of serial failure at every single level of the process: the primary researcher, the senior supervisor, the reviewers, the journal, the agency. The putative and since disproved FTL neutrinos stand as an interesting contrast: in that case, the OPERA team announced it to the community as a puzzle, and asked everyone who was willing and able to pick their results apart and find whatever error might be lurking in their methods of observation or analysis.

Those of us who are familiar with bacteria and molecular/cellular biology techniques knew instantly upon reading the original “arsenic life” paper that it was so shoddy that it should never have been published, let alone in a top-ranking journal like Science: controls were lacking or sloppy, experiments crucial for buttressing the paper’s conclusions were missing, while other results contradicted the conclusions stated by the authors. It was plain that what the group had discovered and cultivated were extremophilic bacteria that were able to tolerate high arsenic concentrations but still needed phosphorus to grow and divide.

The paper’s authors declined to respond to any but “peer-reviewed” rebuttals. A first round of eight such rebuttals, covering the multiple deficiencies of the work, accompanied its appearance in the print version of Science (a very unusual step for a journal). Still not good enough for the original group: now only replication of the entire work would do. Of course, nobody wants to spend time and precious funds replicating what they consider worthless. Nevertheless, two groups finally got exasperated enough to do exactly that, except they also performed the crucial experiments missing in the original paper: for example, spectrometry to discover if arsenic is covalently bound to any of the bacterium’s biomolecules and rigorous quantification of the amount of phosphorus present in the feeding media. The salient results from both studies, briefly:

– The bacteria do not grow if phosphorus is rigorously excluded;
– There is no covalently bound arsenic in their DNA;
– There is a tiny amount of arsenic in their sugars, but this happens abiotically.

The totality of the results suggests that GFAJ-1 bacteria have found a way to sequester toxic arsenic (already indicated by their appearance) and to preferentially ingest and utilize the scant available phosphorus. I suspect that future work on them will show that they have specialized repair enzymes and ion pumps. This makes the strain as interesting as other exotic extremophiles – no less, but certainly no more.

What has been the response of the people directly involved? Here’s a sample:

Felisa Wolfe-Simon, first author of the “arsenic-life” paper: “There is nothing in the data of these new papers that contradicts our published data.”

Ronald Oremland, Felisa Wolfe-Simon’s supervisor for the GFAJ-1 work: “… at this point I would say it [the door of “arsenic based” life] is still just a tad ajar, with points worthy of further study before either slamming it shut or opening it further and allowing more knowledge to pass through.”

John Tainer, Felisa Wolfe-Simon’s current supervisor: “There are many reasons not to find things — I don’t find my keys some mornings. That doesn’t mean they don’t exist.”

Michael New, astrobiologist, NASA headquarters: “Though these new papers challenge some of the conclusions of the original paper, neither paper invalidates the 2010 observations of a remarkable micro-organism.”

At least Science made a cautious stab at reality in its editorial, although it should have spared everyone — the original researchers included — by retracting the paper and marking it as retracted for future reference. The responses are so contrary to fact and correct scientific practice (though familiar to politician-watchers) that I am forced to conclude that perhaps the OPERA neutrino results were true after all, and I live in a universe in which it is possible to change the past via time travel.

Science is an asymptotic approach to truth; but to reach that truth, we must let go of hypotheses in which we may have become emotionally vested. That is probably the hardest internal obstacle to doing good science. The attachment to a hypothesis, coupled with the relentless pressure to be first, original, paradigm-shifting can lead to all kinds of dangerous practices – from cutting corners and omitting results that “don’t fit” to outright fraud. This is particularly dangerous when it happens to senior scientists with clout and reputations, who can flatten rivals and who often have direct access to pop media. The result is shoddy science and a disproportionate decrease of scientists’ credibility with the lay public.

The two latest papers have done far more than “challenge” the original findings. Sagan may have said that “Absence of evidence is not evidence of absence,” but he also explained how persistent lack of evidence after attempts from all angles must eventually lead to the acceptance that there is no dragon in that garage, no unicorn in that secret glade, no extant alternative terrestrial biochemistry, only infinite variations at its various scales. It’s time to put “arsenic-based life” in the same attic box that holds ether, Aristotle’s homunculi, cold fusion, FTL neutrinos, tumors dissolved by prayer. The case is obviously still open for alternative biochemistry beyond our planet and for alternative early forms on earth that went extinct without leaving traces.

We scientists have a ton of real work to do without wasting our pitifully small and constantly dwindling resources and without muddying the waters with refuse. Being human, we cannot help but occasionally fall in love with our hypotheses. But we have to take that bitter reality medicine and keep on exploring; the universe doesn’t care what we like but still has wonders waiting to be discovered. I hope that Felisa Wolfe-Simon remains one of the astrogators, as long as she realizes that following a star is not the same as following a will-o’-the-wisp — and that knowingly and willfully following the latter endangers the starship and its crew.

Relevant links:

The Agency that Cried “Awesome!”

The earlier rebuttals in Science

The Erb et al paper (Julia Vorholt, senior author)

The Reaves et al paper (Rosemary Redfield, senior author)

Images: 2nd, Denial by Bill Watterson; 3rd, The Fool (Rider-Waite tarot deck, by Pamela Colman Smith)

That Shy, Elusive Rape Particle

Saturday, May 26th, 2012

[Re-posted modified EvoPsycho Bingo Card -- click on image for bigger version]

One of the unlovely things that has been happening in Anglophone SF/F (in line with resurgent religious fundamentalism and erosion of democratic structures in the First World, as well as economic insecurity that always prompts “back to the kitchen” social politics) is the resurrection of unapologetic – nay, triumphant – misogyny beyond the already low bar in the genre. The churners of both grittygrotty “epic” fantasy and post/cyberpunk dystopias are trying to pass rape-rife pornkitsch as daring works that swim against the tide of rampant feminism and its shrill demands.

When people explain why such works are problematic, their authors first employ the standard “Me Tarzan You Ape” dodges: mothers/wives get trotted out to vouch for their progressiveness, hysteria and censorship get mentioned. Then they get really serious: as artists of vision and integrity, they cannot but depict women solely as toilet receptacles because 1) that has been the “historical reality” across cultures and eras and 2) men have rape genes and/or rape brain modules that arose from natural selection to ensure that dominant males spread their mighty seed as widely as possible. Are we cognitively impaired functionally illiterate feminazis daring to deny (ominous pause) SCIENCE?!

Now, it’s one thing to like cocoa puffs. It’s another to insist they are either nutritional powerhouses or haute cuisine. If the hacks who write this stuff were to say “Yeah, I write wet fantasies for guys who live in their parents’ basement. I get off doing it, it pays the bills and it has given me a fan base that can drool along with me,” I’d have nothing to say against it, except to advise people above the emotional age of seven not to buy the bilge. However, when they try to argue that their stained wads are deeply philosophical, subversive literature validated by scientific “evidence”, it’s time to point out that they’re talking through their lower digestive opening. Others have done the cleaning service for the argument-from-history. Here I will deal with the argument-from-science.

It’s funny how often “science” gets brandished as a goad or magic wand to maintain the status quo – or bolster sloppy thinking and confirmation biases. When women were barred from higher education, “science” was invoked to declare that their small brains would overheat and intellectual stress would shrivel their truly useful organs, their wombs. In our times, pop evopsychos (many of them failed SF authors turned “futurists”) intone that “recent studies prove” that the natural and/or ideal human social configuration is a hybrid of a baboon troop and fifties US suburbia. However, if we followed “natural” paradigms we would not recognize paternity, have multiple sex partners, practice extensive abortion and infanticide and have powerful female alliances that determine the status of our offspring.

I must acquaint Tarzanists with the no-longer-news that there are no rape genes, rape hormones or rape brain modules. Anyone who says this has been “scientifically proved” has obviously got his science from FOX News or knuckledraggers like Kanazawa (who is an economist, by the way, and would not recognize real biological evidence if it bit him on the gonads). Here’s a variation of the 1986 Seville Statement that sums up what I will briefly outline further on. It goes without saying that most of what follows is shorthand and also not GenSci 101.

It is scientifically (not politically) incorrect to say that:
1. we have inherited a tendency to rape from our animal ancestors;
2. rape is genetically programmed into our nature;
3. in the course of our evolution there has been a positive selection for rape;
4. human brains are wired for rape;
5. rape is caused by instinct.

Let’s get rid of the tired gene chestnut first. As I’ve discussed elsewhere at length, genes do not determine brain wiring or complex behavior (as always in biology, there are a few exceptions: most are major decisions in embryo/neurogenesis with very large outcomes, like Down syndrome, aka trisomy 21). Experiments that purported to find direct links between genes and higher behavior were invariably done in mice (animals that differ decisively from humans), and the sweeping conclusions of such studies have always had to be ratcheted down or discarded altogether – though the corrections invariably appear in lower-ranking journals than the original effusions.

Then we have hormones and the “male/female brain dichotomy” pushed by neo-Freudians like Baron-Cohen. They even posit a neat-o split whereby too much “masculinizing” during brain genesis leads to autism, too much “feminizing” to schizophrenia. Following eons-old dichotomies, people who theorize thusly shoehorn the two into the left and right brain compartments respectively, assigning a gender to each: females “empathize”, males “systematize” – until it comes to those intuitive leaps that make for paradigm-changing scientists or other geniuses, whereby these oh-so-radical theorists neatly turn the tables and both creativity and schizophrenia get shifted to the masculine side of the equation.

Now although hormones play critical roles in all our functions, it so happens that the cholesterol-based ones that become estrogen, testosterone, etc. are two among several hundred that affect us. What is most important is not the absolute amount of a hormone, but its ratios to others and to body weight, as well as the sensitivity of receptors to it. People generally do not behave aberrantly if they don’t have the “right” amount of a sex hormone (which varies significantly from person to person), but if there is a sudden large change to their homeostasis – whether this is crash menopause from ovariectomy, post-partum depression or heavy doses of anabolic steroids for body building.

Furthermore, as is the case with gene-behavior correlation, much work on hormones has been done in mice. When similar work is done with primates (such as testosterone or estrogen injections at various points during fetal or postnatal development), the hormones have essentially no effect on behavior. Conversely, very young human babies lack gender-specific responses before their parents start to socialize them. As well, primates show widely different “cultures” within each species in terms of gender behavior, including care of infants by high-status males. It looks increasingly like “sex” hormones do not wire rigid femininity or masculinity, and they most certainly don’t wire propensity to rape; instead, they seem to prime individuals to adopt the habits of their surrounding culture – a far more adaptive configuration than the popsci model of “women from Venus, men from Mars.”

So on to brain modularity, today’s phrenology. While it is true that there are some localized brain functions (the processing of language being a prominent example), most brain functions are diffuse, the higher executive ones particularly so – and each brain is wired slightly differently, dependent on the myriad details of its context across time and place. Last but not least, our brains are plastic (otherwise we would not form new memories, nor be able to acquire new functions), though the windows of flexibility differ across scales and in space and time.

The concept of brain modularity comes partly from the enormously overused and almost entirely incorrect equivalence of the human brain to a computer. Another problem lies in the definition of a module, which varies widely and as a result is prone to abuse by people who get their knowledge of science from new-age libertarian tracts. There is essentially zero evidence of the “strong” version of brain modules, and modular organization at the level of genes, cells or organ compartments does not guarantee a modular behavioral outcome. But even if we take it at face value, it is clear that rape does not adhere to the criteria of either the “weak” (Fodor) or “strong” (Carruthers) version for such an entity: it does not fulfill the requirements of domain specificity, fast processing, fixed neural architecture, mandatoriness or central inaccessibility.

In the behavioral domain, rape is not an adaptive feature: most of it is non-reproductive, visited upon pre-pubescent girls, post-menopausal women and other men. Moreover, rape does not belong to the instinctive “can’t help myself” reflexes grouped under the Four Fs. Rape does not occur spontaneously: it is usually planned with meticulous preparation and it requires concentration and focus to initiate and complete. So rape has nothing to do with reproductive maxima for “alpha males” (who don’t exist biologically in humans) – but it may have to do with the revenge of aggrieved men who consider access to women an automatic right.

What is undeniable is that humans are extremely social and bend themselves to fit context norms. This ties to Arendt’s banality of evil and Niemöller’s trenchant observations about solidarity – and to the outcomes of Milgram and Zimbardo’s notorious experiments which have been multiply mirrored in real history, with the events in the Abu Ghraib prison prominent among them. So if rape is tolerated or used as a method for compliance, it is no surprise that it is a prominent weapon in the arsenal of keeping women “in their place” and also no surprise that its apologists aspire to give it the status of indisputably hardwired instinct.

Given the steep power asymmetry between the genders ever since the dominance of agriculture led to women losing mobility, gathering skills and control over pregnancies, it is not hard to see rape as the cultural artifact that it is. It’s not a sexual response; it’s a blunt assertion of rank in contexts where dominance is a major metric: traditional patriarchal families, whether monogamous or polygynous; religions and cults (most of which are extended patriarchal families); armies and prisons; tribal vendettas and initiations.

So if gratuitous depictions of graphic rape excite a writer, that is their prerogative. If they get paid for it, bully for them. But it doesn’t make their work “edgy” literature; it remains cheap titillation that attempts to cloak arrant failures of talent, imagination and just plain scholarship. Insofar as such work has combined sex and violence porn as its foundation, it should be classified accordingly. Mythologies, including core religious texts, show rape in all its variations: there is nothing novel or subversive about contemporary exudations. In my opinion, nobody needs to write yet another hack work that “interrogates” misogyny by positing rape and inherent, immutable female inferiority as natural givens – particularly not white Anglo men who lead comfortable lives that lack any knowledge to justify such a narrative. The fact that people with such views are over-represented in SF/F is toxic for the genre.

Further reading:

A brief overview of the modularity of the brain/mind
Athena Andreadis (2010). The Tempting Illusion of Genetic Virtue. Politics Life Sci. 29:76-80
Sarah Blaffer Hrdy, Mothers and Others: The Evolutionary Origins of Mutual Understanding
Anne Fausto-Sterling, Sex/Gender: Biology in a Social World
Cordelia Fine, Delusions of Gender
Alison Jolly, Lucy’s Legacy: Sex and Intelligence in Human Evolution
Rebecca Jordan-Young, Brain Storm: The Flaws in the Science of Sex Differences
Kevin Laland and Gillian Brown, Sense and Nonsense: Evolutionary Perspectives on Human Behaviour
Edouard Machery and Kara Cohen (2012). An Evidence-Based Study of the Evolutionary Behavioral Sciences. Brit J Philos Sci 63: 177-226

Internet Scofflaw: Breaking the Blogging Commandments

Thursday, May 3rd, 2012

I have the bad habit of site-jumping when a topic snags my interest.  Recently, starting with a tale of blatant plagiarism by a top YA book reviewer (who issued the standard non-apology and accused her victims of being mean to her, thereby setting up a bullying spree by her followers), I found myself skimming the plagiarized pieces.  Two dealt with blogging don’ts.  Those who know me will guess the rest: I looked up “blogging no-nos” in Google.

Several sites later, suffice it to say that the advice is as harmonious as a skua rookery.  There are, however, a few near-consensus points for non-business blogs:

  1. content über alles (if only);
  2. fast loading good, pop-ups and multi-clicks bad (unless they help the site’s hit count);
  3. also bad: spelling mistakes, eye-hurting design and music autoplay (the latter makes it hard to secretly net-surf at work, for one);
  4. well-chosen pictures are mandatory (a thousand words and so forth);
  5. so is replying to all comments and having painless spam filters (everyone’s whims must be catered to the max, otherwise they won’t keep reading the blog);
  6. don’t exceed a certain length (below 1,000 words good, below 750 even better – after all, people are busy surfing);
  7. use social media – newsletters, Share buttons, Twitter (establish a presence!);
  8. do 10-Things lists, polls, contests (with awards);
  9. update frequently or risk being forgotten (people must be constantly entertained, after all);
  10. find a content niche and stick with it like a burr (or else no community for you).

Now, the first four are common sense and should be obvious – though judging from what I saw during this particular dip, they’re not.  This observance-in-the-breach includes the common associated clause of “don’t be negative” for point 1: if anything, flamewars seem to feed blogs like dry twigs feed brushfires.  However, I break the last six with abandon and in full consciousness.  This may explain why my blog flipflops wildly at various ranking sites, and why I haven’t yet been awarded a Pulitzer or a regular column at, say, Nature or Tor.

Points 5-9 can only be followed if your blog and the activities it promotes are the focus of your entire existence – or you’re paid what passes for a pro rate (whatever that is, in today’s “content yearns to be free” mindset).  It does so happen that I don’t live in my parents’ basement pushing Xbox buttons: I have a research lab and an academic job that demand more than passing attention.  Besides, I’ve seen Twitter, Facebook and Livejournal close up and found them less than enticing.  “Loyalties” that spring from social media are shallow and brittle.  It takes more than exchanging snarky soundbites to build sturdy alliances that go beyond “Like” or “Headdesk”.

More fundamentally, having entered the last third of my life, I sometimes tire of old issues springing up again and again like dragon’s teeth: the relentless fundamentalist war on women’s rights in this country and elsewhere; grittygrotty SF/F authors calling their pornokitsch fiction subversive and invoking “rape modules in male brains” (although I and several others tackled this from the writing angle and I intend to discuss it in a near future post from the biological angle); young women and self-labeled “progressive” men saying that feminism is passé, having achieved its goals (equal pay? easy access to contraception?); fanboiz whining that I’m elitist because I don’t like Avatar, Accelerando or the pronouncements of Kurzweil, or that I’m hard on armchair tourist authors who get famous (or at least solvent) from tone-deaf depictions of non-Anglo cultures.  Which brings us to the major issue, point 10: content focus.

What people write on their blogs depends on their goals.  Some use them as pulpits, others as public diaries, yet others as marketing tools (“Here are my Hugo nominations, now go vote!”).  Focused-content blogs tend to become watering holes for the like-minded.  In some cases, their owners become oracles to a worshipping group of reader-acolytes.  Personally, I’m interested (more than casually) in several domains: science, history and language, literature and the arts, space exploration, politics.  I also believe that none of these strands can be examined in isolation.  To give a recent example, my critique of the John Carter film included all these angles – and when I was asked to take out “the review bits” for possible reprinting on a popsci-oriented blog, I realized I couldn’t do so without essentially rewriting the entire piece.

I’m also allergic to acolytes because at some point they take you over.  Not that women attain prophet status without becoming Ayn Rand or the equivalent, mind you – women who denigrate their own, thereby becoming pillars of the status quo.  Being a non-Anglo woman who is a non-joiner by temperament and falls between more stools than I can either avoid or count, I’m reconciled to the idea that if I were a man I would probably be knee-deep in accolades, awards and groupies eager to have my babies.  But I’m happy to be a feral nomad instead.  “I cannot be tethered, while I still hear the night winds moan and call.” [1]

So here we are – done in less than 1,000 words this time!  Bottom line: this blog will continue to be unapologetically eclectic in topic selections but neither a diary nor a collection of laundry lists. It’s a salon where friends and passing guests gather for conversations, subject to my tides of mood and health; a review along classic lines that reflects its opinionated editor’s interests and viewpoints.  For me it is a window to the world.  All kinds of neat things alight here as I sing for my own pleasure.  And that’s good enough for a pagan outlaw loner like me.

[1] From Though I Grow Old with Wandering… in Realms of Fire

Images: 1st, Curious Cat (Jane Burton); 2nd, self-explanatory; 3rd, how I see the blog.

 

Looking at John Carter (of Mars) — Part 2

Thursday, April 12th, 2012

by Larry Klaes, space exploration enthusiast, science journalist, SF aficionado (plus a coda by Athena)

Part 1

Burroughs’ Influences

ERB had several strong influences while creating the fictional world of Barsoom. One came from his experiences in the late 1890s as an enlisted soldier with the 7th U.S. Cavalry at Fort Grant in Arizona (still a US territory at the time). The vast desert landscape of the Southwest served as a geophysical model for his drying and dying Mars. The surrounding Native American population became the Tharks. The native women – whom he found to be haughty, beautiful, and very proud – may also have served as ERB’s involuntary muses for Dejah Thoris.

ERB’s other prominent influence for the formation of Barsoom came from a fellow who was also a resident of Arizona around the same time: Percival Lowell. A member of a very prominent Boston Brahmin family, Lowell became fascinated with Mars after the Italian astronomer Giovanni Schiaparelli reported observing a series of long, straight dark lines on the Red Planet starting in 1877. His intense and focused interest in Mars (along with his wealth) led Lowell to build a professional observatory in remote Flagstaff, Arizona, where he felt he could properly study our neighboring world to better discern its compelling features.

Lowell and others soon came to the conclusion that such formations had to be artificial in nature. Lowell believed that a race of beings much older, wiser, and more advanced than humanity dwelt on Mars. These Martians built a vast network of giant canals to bring water from their arctic regions of ice to their cities on and near the equator. Their plan was to stave off extinction as their ancient world began to dry up, taking the native flora and fauna with it in the process. Lowell and his followers thought they were witnesses to the last great act of an alien civilization.

Lowell’s hypotheses about Mars were not completely pulled out of thin air, for his ideas were based on a combination of contemporary thoughts and observations. From what astronomers could see through their telescopes about the Red Planet from their vantage point on Earth many millions of miles away, the fourth world from the Sun appeared to be more like our globe than any other place in the Sol system. Mars possessed two white polar caps, an axial tilt and rotation rate very similar to Earth’s, and light and dark regions which changed in color, shape, and size through the long Martian seasons. Many conjectured that these mobile surface markings were the life cycles of native plants or even the migration of animals.

Another idea popular at the time was the Nebular Theory of solar system formation. This theory held that the outer worlds cooled and condensed first, ages ago, from the cosmic cloud of dust and debris that would become our Solar System. These places would thus develop the conditions to support life sooner than the worlds closer to the warming Sun. As a result, the outer planets would also find themselves less able to sustain their ecosystems sooner than the inner planets. This is why Lowell could conclude there were canal-building intelligences on Mars without ever being able to see such beings and learn whether he was correct.

Whether Percival Lowell was eventually right or wrong about the true state of the Red Planet ultimately mattered little to authors such as ERB and H. G. Wells. They found in Lowell’s ideas a fertile field for their imaginary worlds, though of course in Wells’ case, the Lowellian conditions on Mars served as a literal springboard for his octopus-like inhabitants to seek a better place to live, by force no less, thus creating the alien invasion scenario that remains popular to this very day. The only major difference between Wells’ creatures and their fictional descendants is that they now spring (mostly) from worlds circling other suns.

In contrast, ERB’s Martians remained on Barsoom despite the similarly debilitating environmental situation. There was and is a lot of high technology across Barsoomian society in both the novel and the film, including aerial flying machines, but the Martians did not seem to focus on space travel, if you exclude the Therns’ guarded method of celestial transportation. Nevertheless, at least Helium appears to have had some rather powerful ground-based telescopes, as in the film version Dejah Thoris eventually realized that John Carter was a native of Jasoom, while in the novel the princess was well aware of human civilization on Earth long before Carter arrived on her world.

Obviously the main reason I am emphasizing the John Carter connection with Lowell’s Mars is its important influence in bringing about the world of Barsoom. My other motive for bringing up the era defined by what Lowell created, pursued, and essentially preached about the Red Planet – namely from the latter half of the nineteenth century to July of 1965, when the American robotic probe Mariner 4, with its relatively crude images of the planet’s surface and other measurements, revealed a shockingly Moon-like Mars – is to highlight a period of astronomical history that is both fascinating in its own right and a relevant lesson in our current pursuit of extraterrestrial life.

John Carter did give some tantalizing hints about the Lowell era of Mars at the beginning and end of the film, very briefly displaying some real early hand-drawn maps of the planet. Included among these charts was one of the famous Lowell maps of the Martian canals, upon which, it turns out, ERB rather closely modeled many of the city-states and other features of Barsoom. See here for the details.

I also took special pleasure in noting that John Carter’s tomb looked rather similar to the mausoleum in which Percival Lowell was buried on Mars Hill at his Flagstaff observatory in 1916.  It is these touches, and the obvious indication that someone did their historical research, that I appreciate very much.

While it is clear to us (and a number of astronomers from that era) that Lowell went much too far in speculating on what the Martian canals were all about (sadly, even the canals turned out not to be real but rather optical illusions caused by real surface features being just beyond the resolution of most telescopes), his influence and imagination were the important catalyst in spurring both classic works of fiction and the people who would go on to study and explore the real Red Planet. A film about that era could be quite successful in my opinion. Certainly there would be enough real excitement, romance, and drama to work from.

Final Thoughts – The White Messiah

When Athena initially asked if I was interested in writing a review of John Carter, we briefly touched upon the “White Messiah” complex that exists in films such as Avatar, Dances With Wolves, and certainly the John Carter series.  Of course one could not create a John Carter story absent its white male American hero without radically changing the focus and point of what ERB was trying to do (in addition to making a living at writing): to get American boys to become manlier, like their forebears were presumed to be.

While researching John Carter, I read that ERB was concerned about the growing population move from the farms and fields to more urban areas.  ERB felt that boys who were not able to spend their youths hunting, fishing, and partaking in other outdoor activities were in danger of losing their manhood and possibly becoming – gasp – intellectual sissies!  So ERB conceived of a character that would inspire young males to become bold, daring, and adventurous (along with pursuing beautiful women) under the guise of an entertaining plot.

I have my doubts that this idea was actively considered or even known of by the makers of the John Carter film.  If anything, the snachismo concept Athena has written about here in her blog was quite in play:  John Carter was still indeed a manly man, but he was also shown to have a sensitive and caring side, including a back story that did not exist in the novel so far as I know.  And for a “Gentleman from Virginia” of the Nineteenth Century, Carter recognized and respected Dejah Thoris’ numerous abilities, despite her being – gasp – a woman.

The White Messiah idea does have some literal merit for John Carter (note the initials).  This article in Slate magazine goes into some interesting and revealing depth on the subject.  One has to wonder why our society seems to always be waiting and hoping for one particular individual (or even an advanced ETI) to come along and save the rest of us from ourselves?  Is it just because we are social mammals hardwired to defer authority to an Alpha Male?

While works like John Carter were not really aimed at exploring this topic, they can stir us to move beyond these basic plots and concepts to create our own ideas and stories of worlds and beings who think and operate in ways different from our current culture.  After all, that is one of the key features of science fiction, to imagine alternate scenarios and societies and see how they might play out.

It was nice to see on the big screen a fairly well done rendition of and tribute to a series that inspired so much of our popular science fiction stories today.  Now that a century has passed, I think it is time for cinematic science fiction to start graduating to more complex and daring concepts, which we did see a few times in the pre-Star Wars era.  If done and sold right, I think audiences are becoming sophisticated enough to handle stories outside the mainstream “comfort zone”.  At the very least, perhaps next time we will have a story about a Dejah Thoris type who simultaneously inspires young women and saves the world.

Athena’s coda: I already expressed my views of how well-made/progressive I deemed the JCM film in part 1.  ERB is one of the forefathers of the grittygrotty contingent in SF/F.  Its members are invariably linked with regressive tropes, evopsycho paradigms that extol reactionary mores as universal (the Alpha Male canard among them – there are no such creatures in the human species, biologically speaking) and hack writing.  I won’t list names, lest I spread the disease; nevertheless, it’s indicative that this contingent went ballistic because the JCM film updated the novel to lighten its deeply reactionary nature vis-à-vis women and non-whites.

Percival Lowell’s social prominence and wealth allowed him to indulge in his passionate hobby, and concrete good came of it: namely, the discovery of Pluto (he could have spent his money on golf clubs or financing conservative politicians).  However, it was already widely accepted during Lowell’s heyday that the Martian canals (a mistranslation of Schiaparelli’s original term, which meant channels) were natural formations.  It’s entirely likely that his “maps” of Mars and Venus were in fact depictions of his retinal blood vessels.

Mars, by dint of all its intrinsics as they gradually unfolded before us, has been a perennial object of fascination.  The issue of whether it once did or still does harbor life has not been resolved and I, for one, am all for a crewed expedition that will not only attempt to definitively answer this question but will also be useful in showing up the pitfalls and limitations of longer space travel.

On the art side, it’s true that there hasn’t yet been a film depiction of Mars that does it justice.  The obvious candidate (for a series rather than a standalone film, given its length) is Stan Robinson’s Mars trilogy.  But for my taste, the hands-down choice would be Alexander Jablokov’s River of Dust: it shows a Mars that harbors a precarious but culturally vibrant underground human colony after a failed terraforming attempt, and it overflows with mythic echoes, dramatic situations that matter, exciting ideas, unique settings and vivid characters.

Images: Lowell’s “map” of the Mars south pole (1904); Lowell’s mausoleum; Valles Marineris, one of the largest canyons in the solar system (NASA/JPL); Alex Jablokov’s marvelous River of Dust

Looking at John Carter (of Mars) — Part 1

Sunday, April 8th, 2012

by Larry Klaes, space exploration enthusiast, science journalist, SF aficionado (plus a dissenting coda by Athena)

There is an interesting parallel between John Carter as the main character of the Mars series of adventure novels begun by Edgar Rice Burroughs (from here on called ERB) one century ago this year, and the recently released Disney film of the same name.

Both arrived on their respective worlds – the fictional man Carter on planet Mars, a.k.a. Barsoom, and the motion picture John Carter in cinemas all over planet Earth (a.k.a. Jasoom) – with relatively little fanfare. Both Carters initially encountered natives who had no real idea who they were and were ready to kill them off. Yet somehow both survived their hostile environments and slowly earned the understanding and respect of their newfound worlds, eventually going on to change things for the better and having a wild time in the process.

Now of course the film version still has a long road ahead to achieve its equivalent of what the novel hero achieved in his fictional and serialized lifetime. To be honest, I do not know if it will ever become as popular and influential as the novels were in their day, if for no other reason than too many other fictional series influenced by the ERB works have left their much stronger mark on the cultural mindset in the intervening century. In addition, while John Carter is better than I feared, the very ironic fact that it looks rather derivative of the very genre it spawned may permanently hobble its journey across the cinematic and cultural landscape.

So why should I make a big deal out of a film and series that its parent company will probably write off as a financial loss, one with which most of today’s audience is almost totally unfamiliar, and whose core plot, in truth, was not terribly original or new even when ERB produced its first installment back in 1912?

For the following reasons: The film did not become the bloated mess that I thought Hollywood was going to turn it into (and which many film critics – who, I suspect, would not know or understand science fiction and its history if it proverbially bit them – continue to insist it is, while mentioning its big budget in the same breath). The John Carter series deserves to be honored, understood, and appreciated for all it has done both for science fiction/fantasy and for influencing later real scientists like Carl Sagan, who talked about his love for the series as a youth in an episode of Cosmos. Finally, the real story and history behind the influences – hinted at in the film – that spawned John Carter and affected our views of life on Mars and elsewhere are more than worthy of being reintroduced to new generations as well.

The plot of John Carter is essentially that of ERB’s first novel in the series, A Princess of Mars: Confederate war veteran and Gentleman of Virginia John Carter goes into a cave in the Southwestern United States and wakes up millions of miles away on the planet Mars. There he meets several of the remaining native populations on that world, all of whom are battling with each other and the elements as Mars is slowly drying up. Carter’s Earth-developed muscles allow him to jump quite high and punch very hard in the lower Martian gravity, abilities which quickly earn him the awe and respect of key natives. In the end, our hero defeats the bad guys, wins the hand of the beautiful Princess of Helium, Dejah Thoris, and then involuntarily ends up back on Earth.  Carter spends most of his Jasoomian exile trying to get back to Mars and his wife, which he eventually does.

I must confess: I did not read any of the John Carter novel series until rather recently, despite knowing about them for most of my life. I am not a big fan of fantasy fiction and that is what I considered these works to be. I also assumed that the prose would not have aged well in the intervening decades.

I have since read the first novel and, like the film, found it to be not as bad as I feared. Both were rather entertaining and I found myself actually caring about the characters, always a key point for me with any story. As just one example, I recall being both surprised and moved when it was revealed in the novel that Sola was the daughter of Tars Tarkas.

Based on past experience with Hollywood’s efforts at science fiction (and John Carter really is basically SF and not fantasy), along with Disney’s historical habit of making major changes in their productions to suit their intended audiences and their less-than-stellar promotional efforts for this film, I expected John Carter to be an expensive and flashy mess, one that was as much about the original A Princess of Mars as the “re-imagined/re-invented” Star Trek film from 2009 was about the original Star Trek television series: a shell resembling the franchise but full of hot air and junk underneath. Instead I witnessed a film that actually got the main characters and plot points, along with the essence and feel of the novel – no small feat there. I just wish that more people were aware of this and could appreciate it. Ironically, science fiction is starting to become more “acceptable” to the mainstream audience due to the reimagined Battlestar Galactica and especially The Hunger Games series, whose first film came out right after John Carter and financially steamrolled our Martian hero and every other current movie in its path.

I found the film to capture the feel and look of the novel as I and others imagined it quite well. From the flying battle cruisers to the appearance and behavior of the warrior Tharks, this cinematic Barsoom is one that ERB himself, I think, would have said matched his vision of his creation well.

There were a few notable changes from the novel, most of which only make sense in light of the medium and era. One was the addition of clothing on John Carter and the residents of Barsoom. In the novel, most natives went either naked or nearly so and did not even think twice about being in such a state (Dejah Thoris only wore strategically-placed ornate jewelry, for example). John Carter even arrived on Mars sans clothing. For obvious reasons the film could not replicate this situation from the novel; besides, it probably would have been too distracting even if such a thing were allowed by the modern film industry.

The women of Barsoom fared rather well from their “modernization” in the film, though it should be noted that even in the first novel I did not find them to be just the damsels-in-distress one might be led to believe from the decades of artwork depicting that alien world.

The two main Martian city-states depicted in the film, Helium and Zodanga, employed female soldiers as readily as male ones. I had to wonder whether this was because the Martian environment was dying and people and resources were in ever-dwindling supply, but no one ever seemed to question or even react to the idea of women in their military. The audience was not given enough cinematic time to learn very much about these societies in any event.

The Thark Sola was an intelligent and compassionate individual in addition to being a strong warrior. She endured a good deal of suffering from her harsh culture to remain true to herself and her beliefs. Sola also became open to new ideas as the story progressed, such as flying, despite her father and chief Tars Tarkas earlier intoning that “Tharks do not fly!”

The most notable woman of the series is of course Dejah Thoris. While she remained a beautiful princess and the focus of John Carter’s admiration and desire, for the film Dejah also became a highly capable scientist as well as a warrior who more than held her own in battle. When the Helium leadership was ready to cave in and acquiesce to the Zodanga leader’s demand to marry Dejah in the hope of saving their society from defeat and destruction, Dejah was the only one who not only balked at this forced union but also saw how uniting Helium with the more barbaric city-state of Zodanga would undermine her culture and eventually doom all the people of Barsoom.

Dejah’s demonstrated scientific knowledge and technical skills were strong enough that the main “bad guys” of the film, the highly advanced species known as the Thern, considered Dejah to be a serious impediment to their plans for Barsoom while simultaneously admiring her abilities. As for the actor who played the Princess of Helium, Lynn Collins was an excellent choice for the character. She not only played Dejah with intelligence and an air of royal nobility; Collins’ years of martial arts training also showed convincingly in her numerous scenes of hand-to-hand combat – including the several occasions when Carter got behind Dejah for protection!

The Thern are another cinematic modification from the novel. In A Princess of Mars, Martian natives make a trip down the River Iss when they feel ready to pass on from this life. They believe at the end of that river is where they will meet the goddess Issus and go on to a paradisiacal afterlife. Instead the mythology and the journey are a trap set by the Thern, descendants of the White Martians, who use monstrous creatures such as the white apes to kill and eat the unwary pilgrims and enslave or consume in turn those who survive the ordeal.

In the film, the Thern are an advanced alien race (they appear as humanoids but can also shapeshift) who travel from one inhabited world to another and “feed” off the energies expended by the native populations as they struggle with each other and use up or neglect their planet’s natural resources. One Thern named Shang implies to Carter that Earth and humanity are next on their menu once they are done with the dying Barsoom.

The Thern have a very interesting and quite alien technology which looks like a tangled mass of blue fibers, whether it is one of their structures or a weapon (Dejah Thoris recognizes its artificial nature). They also travel between worlds by sending “copies” of themselves, similar to a fax, using a medallion that operates on specific verbal commands. Whereas in the novel Carter mysteriously arrives on Mars after simply falling asleep in a cave, in the film our hero is accidentally transported to Barsoom by Thern technology. While of course there is no actual explanation given as to how the mechanism works, the audience is at least handed some kind of plausible reason for Carter’s celestial journey that is no worse than using a faster-than-light drive for a fictional starship. Besides, the JC series is all about the destination, not the journey.

The film version of the Thern left me wondering if perhaps there are advanced societies in the galaxy who view other alien species as lesser creatures to either be ignored or utilized for their own purposes. While they held some genuine admiration for Dejah Thoris, I got the impression that their whole attitude about using Barsoom until it was dry and dead and all the other worlds they have come across could be summed up as “It’s nothing personal, it’s just business.”

I have often wondered if an advanced ETI, using the Kardashev Type 2 or 3 labels for simplification, would mow over whole worlds and species as they developed their interstellar existence in the same way a construction crew would run over an ant colony on their building site. I would like to think that such sophisticated and experienced beings would be a bit more sensitive than that, but we are still so very clueless about anyone else in the Milky Way galaxy and beyond.

Athena’s afterword: Unsurprisingly, my view of John Carter (henceforth JCM) is far more jaundiced than Larry’s.  JCM is dull, curiously inert, with zero frisson or sensawunda despite the non-stop eye candy.  Although the novel it’s based on predated and influenced Star Wars, Avatar, etc, it was a given that the film’s late arrival would doom it to looking stale unless its makers were truly bold.  Pressing Pixar’s Stanton into service made success a possibility, but Disney’s standard hackery prevailed: the deletion of crucial words from the film’s title (Mars, because other films with Mars in their titles bombed; Princess, because… it might give the film girl cooties) signals this fatal lack of conviction.

True, JCM is not a total failure; however, given its semi-infinite budget and the longueurs recognized even by its champions, this is a pathetically low bar.  It’s a near-failure even as film space opera — which by tradition has low standards for coherence, opting instead for assaultive FX pyrotechnics.  Of course, JCM’s science is non-existent even within its own silly framework (example: the intermittence and variability of Carter’s locust-like jumping abilities).  At least, unlike Cameron’s Na’vi, Stanton forbore from putting breasts on female Tharks.  In fact, JCM’s core failure lies in its clumsy, generic narrative and its paper-thin worldbuilding and characters, for whom it’s impossible to care.  Additionally, by being mostly faithful to the novel, JCM’s makers reproduced its highly problematic underpinnings.

The cultures in JCM are based (snore) on ancient Rome and the Celtic and Germanic nations that opposed it – as filtered through the lens of someone who learned history from comics or fifties Hollywood films.  JCM’s obvious muscular-Christian underdrone further underlines its poverty of imagination.  There is no internal logic to the conflicts: they must simply exist, so that 1) we can see the neat-o flying machines and 2) the savior can become indispensable and lead his disciples to victory.  Its pace is as lumbering as its six-legged war beasts; neither its tone nor its visuals ever coalesce.  The relentless battles and fights are choppy and muddy.  The dialogue is clunkier than that of Lucas (a feat I considered impossible), the characters look and speak like Pharaonic wooden statues and the two leads have as much chemistry as pet rocks.  The aptly named Taylor Kitsch, blander than lo-fat cottage cheese, doesn’t deserve Lynn Collins’ hot chili, and the best that can be said about Thoris and Sola is that neither is a bimbo… or a blonde.

The clichés that literally sink JCM have dogged Hollywood space operas even in their self-labeled progressive incarnations like Star Trek: the White Messiah who out-natives the natives and has their princesses begging for his babbies; the lone feisty-but-feminine metal-bikini-clad woman among a sea (desert?) of men, bereft of any female interactions; the total absence of mothers, when even the non-dyadic Thark family structure gets twisted into providing Sola with a father; natives as noble savages who prevail, Ewok-like, over much superior technology once they choose the right (non-native) leader; hierarchical dog-eat-dog warrior societies; imperial rule by charismatic autocrats as the sole viable method of governance; the dog-like mascot whose sugary cuteness could elicit a full-blown diabetic coma.

People will undoubtedly try to argue that ERB was “a man of his time”.  This is an excuse used ad nauseam for other SF/F “founders” such as Tolkien – who in fact was deemed a regressive throwback even by his own circle before he got canonized into infallibility by his acolytes.  Ditto for ERB.  As one example, John Carter is a “gentleman of Virginia” who served with distinction in the Confederate Army.  Romantic lost causes aside, it means that ERB deliberately made his hero someone who chose to uphold the institution of slavery.  And, of course, the names… oh, how they thud!  Zodanga.  Woola. Tardos Mors.  Barsoom (rhymes with bazoom and va-va-voom, underscored by the Frazetta opulent pornokitch depictions so adored by Tarzanist evopsychos).

Such material can be salvaged in only two ways: either by radical re-imagining (which briefly was the route of the Battlestar Galactica reboot before it collapsed under its maker’s pretensions) or by being played as stylish high camp (which was why the Flash Gordon 1980 remake was such a breath of fresh air).  Like a good bone structure underneath flesh gone to flab, there were glimpses of what might have been had Stanton and his paymasters been braver.  But that would be a parallel universe where Barsoom truly came alive.  Stanton tries to elicit extra sympathy (and remind us of Wall-E) by dedicating JCM to Steve Jobs – but his latest opus resembles a clunky, bloated Microsoft PC.  It makes me once again think how immensely grateful I am that The Lord of the Rings was not directed by an American.

Images: John Carter (Taylor Kitsch) realizing that not even super-jumping abilities will get him out of this mess; Dejah Thoris (Lynn Collins) all undressed up with nowhere to go; Dejah Thoris and John Carter trying to find escape clauses in their contracts; Sola (voiced by Samantha Morton) in WTF? posture.

Part 2

The Asymptotic Approach

Monday, March 19th, 2012

The first round of the NIH budget petition that I discussed in my previous entry fell 400 signatures short by the deadline. Research scientists are nothing if not tenacious, so a second round has begun. I think this will make it, but it speaks volumes about the US public’s acceptance/understanding/appreciation of biomedical research that scientists can’t collect 25,000 signatures in a month — even a shorter one like February.

Speaking of tenacity in a more cheerful context, Chris Jones recently spoke with me on Trek.fm about life in concentric circles, starting with extremophiles on Earth, moving out to Mars, Europa, Titan, Enceladus… then onto solar systems beyond ours, whether populated by watery Neptunes or super-Earths.

“And If I Cried Out, Who Would Hear Me…?”

Monday, March 12th, 2012

– Rainer Maria Rilke, the first line of The Duino Elegies

You may recall I wrote about the condition of biomedical research a while ago: Of Federal Research Grants and Dancing Bears.

The NIH, the sole major funding source for such research, has stagnated for the last decade. People are trying to get a measly single-digit increase this year to staunch the bleeding. To be considered, the relevant petition must gather 25,000 signatures by this Sunday, March 18. If you care about basic research or therapeutic applications, follow this White House link. You will need to create an account, but the only things they request are your name and e-mail. You can also boost this signal, if you wish. This part of the future may still be in our hands, if we don’t sit passively by.  Thank you.

Image: Allies, Susan Seddon Boulet.