Psych Symptoms Less Biology “Than a Kind of Language”

 

Demoniaque, by Rubens. Courtesy Wellcome Images

Is mental illness a product of biology or culture? Ethan Watters, over at Pacific Standard, argues that whatever the biology involved in mental illness, its expression and our responses to it are shaped by culture far more than we realize.

The resounding lesson of the history of mental illness is that psychiatric theories and diagnostic categories shape the symptoms of patients. “As doctors’ own ideas about what constitutes ‘real’ disease change from time to time,” writes the medical historian Edward Shorter, “the symptoms that patients present will change as well.”

Watters, author of the superb Crazy Like Us: The Globalization of the American Psyche, is among the sharpest observers of culture’s powerful but unacknowledged effect on how we view and experience mental illness. This latest contribution looks particularly at hysteria — a presentation and diagnosis unique to its time a century ago, and one that has since vanished:

Women by the tens of thousands, after all, displayed the distinctive signs: convulsive fits, facial tics, spinal irritation, sensitivity to touch, and leg paralysis. Not a doctor in the Western world at the time would have failed to recognize the presentation. “The illness of our age is hysteria,” a French journalist wrote. “Everywhere one rubs elbows with it.”

Hysteria would have had to be included in our hypothetical 1880 DSM for the exact same reasons that attention deficit hyperactivity disorder is included in the just-released DSM-5. The disorder clearly existed in a population and could be reliably distinguished, by experts and clinicians, from other constellations of symptoms.

So did psychiatry make these people sick? Watters (and I’m with him here) argues no: these diagnostic categories don’t create mental distress out of nowhere, but the expectations and responses of both psychiatry and the wider culture can shape, and sometimes seem to outright dictate, the expression of that distress.

Some critics of the DSM-5, he notes, worry that its new, sometimes looser criteria will increase the number of Americans diagnosed with mental illness.

But recent history doesn’t support these fears…. In fact, as psychologist Gary Greenberg, author of The Book of Woe, recently pointed out to me, the prevalence of mental health diagnoses actually went down slightly [since the DSM-5’s publication]. This suggests that the declarations of the APA don’t have the power to create legions of mentally ill people by fiat, but rather that the number of people who struggle with their own minds stays somewhat constant.

What changes, it seems, is that they get categorized differently depending on the cultural landscape of the moment. Those walking worried who would have accepted the ubiquitous label of “anxiety” in the 1970s would accept the label of depression that rose to prominence in the late 1980s and the 1990s, and many in the same group might today think of themselves as having social anxiety disorder or ADHD.

Viewed over history, mental health symptoms begin to look less like immutable biological facts and more like a kind of language.

I think it’s hard to overstate this issue — in fact, I think one can’t understand what we call psychiatry and mental health and illness without recognizing how powerfully culture shapes the expression of psychic distress. I looked at similar issues in a post responding to the “Batman movie killings” in Aurora last year: Batman Returns: How Culture Shapes Muddle Into Madness. The crucial story there concerned a young woman who began to feel violent urges when, and only when, she was ostracized from her profession after she received a diagnosis of schizophrenia:

Her confusion and disorientation and anxiety — her schizophrenia — rose from complex sources. But her anger rose in large part from an alienation that came hand-in-glove with our society’s definition of what she experienced — with the mere application of the word schizophrenia. And her ideas about expressing that anger rose directly from models of action brought to her from the media, and which expressed, in their violence and their repetitive, replicative nature — each bloody rampage imitating others — deep and multiple strains of our culture.

Watters’s post is The Problem With Psychiatry, the ‘DSM,’ and the Way We Study Mental Illness, at Pacific Standard.

My post exploring the issue from other angles is Batman Returns: How Culture Shapes Muddle Into Madness, here at Neuron Culture.


A Calm Eye on the Selfish Gene Storm

Lizard v grasshopper

Over at the Genetic Literacy Project, editor Kenrick Vezina offers a particularly level-headed and constructive consideration of the debate over the fitness of the selfish-gene metaphor that my Aeon article “Die, Selfish Gene, Die” raised.

Violence of the title aside, [Dobbs’s] point was not that we should go out and gather up all the copies of biologist Richard Dawkins’s seminal book and burn them. Instead, he argues that in light of the myriad biological phenomena that take place outside the simple one-to-one gene-makes-trait paradigm, we might want to devise a new narrative. A new story that more easily accounts for the ability of grasshoppers to become locusts without altering their DNA, or for culture to act as a mechanism of cross-generational inheritance.

[snip]

While fully acknowledging the good it has done, Dobbs and several of the scientists he consults argue that the selfish-gene concept as broadly understood may now have become oversimplified and ossified; an obstacle to a richer understanding of evolutionary science. Ultimately Dobbs proposes some sort of “social genome” framework for further development. This pissed a lot of people off.

From here Vezina looks at the pushback from Richard Dawkins, Jerry Coyne, and Steven Pinker; notes the pivotal role The Selfish Gene played in his own love of science; and describes his reaction to Pinker’s dismissals of my article. Then, moving toward his crux, he notes:

First, Dobbs isn’t guilty of horribly botching any facts, of sensationalism or of anything that would be considered a major crime in journalism. Second, everyone agrees that the selfish gene was a useful metaphor, so no one is trying to take anything away from Dawkins or his most famous idea. Third, most of the responses focus on the question of whether or not the gene is a necessary, immortal vessel of inheritance.

His final concern is with the perspectives offered earlier this week at Aeon by Robert Sapolsky, Karen James, Laura Hercher, and John Dupré. As he says, all four contributors make fascinating points, and 

since none of the panelists actually seem to be wrong in any significant way here, it’s all about varying perspective. This is where the fourth response, from Hercher, hits home.

Then Vezina articulates something I liked about Hercher’s response but hadn’t quite identified even to myself.

She, as a genetic counselor, is in the unique position of needing to communicate both the power of genes and their non-deterministic nature to patients dealing with genetic screening results. Hers is perhaps the most humanizing voice in this debate. She sees the kerfuffle raised in the wake of the article, including the opposition of Pinker, Coyne, and Dawkins — and to a lesser extent, I suspect, of her peers on the Aeon panel — as distracting from the real concern raised by Dobbs’s piece. “There is a pressing need,” she writes, “to create a language in which to discuss the complex relationship between genes and traits, which is accessible to the non-scientist.”

Which is indeed my main point, above all. And here, says Vezina,

is where Dobbs’s response and my own synch up perfectly. Science, he rightly notes, is built on stories. The facts are merely the beats the story must hit, the parameters it must be told within, but always science is trying to find the best story (read: hypothesis, idea, theory) to make sense of the world as we understand it.

Whether or not it’s time to dethrone the selfish gene as the reigning metaphor seems to me slightly irrelevant. It’s more important that we be willing to have the conversation. What the more vitriolic response to Dobbs’s piece showed — especially the dismissive and condescending response from some of the people we hold up as leaders in science communication — is that the real ossification is not in the ideas of the selfish gene but in the people who defend it as holy ground.

Go to Genetic Literacy Project for The selfish gene debate: The power of stories in science and society.

Uncommon Reading – Eudora Welty on Virginia Woolf on Hemingway

The manuscript book in which Woolf wrote “Mrs. Dalloway.” Photo by the author.

One of the underrated pleasures of the internet is all the old stuff we can now read — goodies that 20 years ago you could read only by going to a major library and hunting. A few weeks back, someone or something pointed me to a sort of English major’s dream hiding within The New York Times of 21 September 1958:  Eudora Welty reviewing Virginia Woolf reviewing other books. We get to see one highly original and lively mind peeking into the workings of another as that other peers into yet others.  The entire short piece is a pleasure, but let me call out three choice bits here:

1. Welty on Woolf on Hemingway. Lord above. In the mid-1920s, Woolf reviewed Hemingway’s “Men Without Women,” his second story collection. This was before it had become dead-clear that Hemingway, like Woolf, would profoundly change 20th-century fiction. And here Welty is hysterical:

Of course, an editor who sent out a new book called “Men Without Women” to Virginia Woolf knew what he was doing. The New York Herald Tribune received a very farsighted review. (The title she decided merely “to stare out of countenance.”) She thought Hemingway’s characters talked too much, but she would, “if life were longer,” care to read the stories again. Hemingway “lets his dexterity, like the bullfighter’s cloak, get between him and the fact…. But the true writer stands close up to the bull and lets the horns – call them life, truth, reality, whatever you like – pass him close each time.”

I like the bit about Hemingway’s dexterity getting in the way. He might have said something similar about Woolf, which would have made both of them wrong, since the veil they cast between us and the material is part of their magic: like one of those Instagram filters that actually makes everything more intense.

2. Welty on Woolf’s basic and rich gift:

In the early pieces there are no early sentences…. She scatters treasure everywhere she reads. “The novelist [of all those practicing the arts] … is terribly exposed to life …. He can no more cease to receive impressions than a fish in mid-ocean can cease to let the water run through his gills.”

Which is Woolf herself, of course. Which gets me to

3.  Woolf’s sensitivity, which is of special interest to me as I write my book on temperament and human sensitivity to experience:

What a beautiful mind! That was the thing. Lucid, passionate, independent, acute, proudly and incessantly nourished, eccentric for honorable reasons, sensitive for every reason, it has marked us forever. Hers was a sensitivity beside which a Geiger counter is a child’s toy made of a couple of tin cans and a rather common piece of string. Allow it its blind spots, for it could detect pure gold. It could detect purity. In the presence of poetic fire it sent out showers of sparks of its own. It was a mind like some marvelous enchanter’s instrument that her beloved Elizabethans might have got rumor of and written poems about.

Get the rest at the Times: Uncommon Reader.

Also: The Quiet Greatness of Eudora Welty, a fine page at the National Endowment for the Humanities.

Elsewhere at Neuron Culture:

Virginia Woolf Was a Plant Sensitive and Tough

The Agony of Editing Virginia Woolf’s Early Journals

How To Pick Apart Great Writing: Joan Didion on Ernest Hemingway

 

Selfie Showdown: Colin Powell v Stanley Kubrick

Making the rounds today is the highly charming selfie of a young Colin Powell:

Colin Powell selfie

Powell’s photo has that good power a selfie can have: it is a picture not just of someone’s face but also, as becomes especially apparent when viewed later on, of a person’s aspirations and possibilities. For that reason it reminded me of this youthful selfie of Stanley Kubrick, which I came across a few months ago while researching mid-century Leica cameras:

Kubrick selfie

Scaring that up, meanwhile, led me to another charming photo: Kubrick and his daughter photobombing their own selfie, which doubles as a photo of Jack Nicholson on the set of The Shining.

Kubrick photobombs Nicholson

Game to Kubrick.

Fortune Favors the Bold and Anxious — in baboon learning in this one study, at least.

Photo: Alecia Carter / Tsaobis Baboon Project.

At her Zoologic blog, over at my old haunts at Wired Science Blogs, Mary Bates looks at an absolutely fascinating study of how temperament is linked to certain behavior. In this case, researchers measured two independent traits in baboons, boldness/shyness and (separately) anxiety/calmness — and found that in new situations, baboons who were both bold and anxious learned the most. Very clever experiment by Alecia Carter of the University of Cambridge and colleagues, and a clear, skilled, absorbing write-up from Bates.

Carter and her colleagues had given all the baboons “personality tests” to measure two traits, boldness and anxiety. They assessed boldness by looking at a baboon’s response to a new food (such as a hard-boiled egg dyed green); the bolder the individual, the more time he or she spends inspecting a new food. They assessed anxiety by presenting the baboons with a taxidermied venomous snake; in this test, more anxious individuals spend more time investigating the potential threat. Boldness and anxiety are stable personality traits and are independent in baboons, meaning a bolder baboon is just as likely to be anxious as a shy baboon.

After figuring out where individual baboons fell on these two personality traits, the researchers looked at whether the traits were related to the time spent watching a demonstrator or the subsequent ability to then solve the task being demonstrated.

They found bolder and more anxious individuals were more likely to learn about a novel foraging task from another baboon — despite the fact that shy baboons watched the demonstrators just as much as bold baboons, and calm baboons paid even more attention to the demonstrators than anxious baboons. This means that an individual’s ability or interest in watching a demonstrator does not necessarily translate to then solving the task. All personality types seemed to collect social information, but bolder and more anxious baboons were better at using it.

This looks like a rich line of study, with layers of implications and new questions to follow. As Bates notes,

These results suggest that when performing behavioral studies of animals, researchers might have to consider the personality of individual animals before making judgments of their cognitive abilities. “Animals may perform poorly not because they aren’t clever enough to solve the task, but because they are too shy to interact with it,” Carter says.

Carter was especially surprised by one observation during the study: she wasn’t able to test some individuals because they were never close enough to a demonstrator to observe the task. “It seemed as if some of the baboons never foraged in the presence of a knowledgeable individual,” she says. “I was surprised that a baboon could be so limited by with whom they spent time.”

This observation is inspiring Carter’s next line of research, investigating whether baboons’ social networks may prevent them from learning from others.

Get the rest at Baboon Personality Predicts Social Learning, by Bates, who is @mebwriter on Twitter.

The Carter study is Personality predicts the propensity for social learning in a wild primate, at the fine and innovative open-access publisher PeerJ.

 

Dead or Alive? The Selfish Gene, Reconsidered

Evolutionary models. Photo courtesy Aeon Magazine, all rights reserved.

After my Aeon essay “Die, Selfish Gene, Die” last December sparked much discussion, the editors asked biologists Robert Sapolsky and Karen James, genetic counselor Laura Hercher, and philosopher John Dupré to offer their thoughts on the subject. Their smart, well-considered, well-crafted responses  are published today. My hearty thanks to all four of these contributors, to Aeon editor Brigid Hains for collecting their responses, and to the many, many who have contributed constructively to the lively and fascinating discussion of genes and evolution that this is part of.

Some snips:

Robert Sapolsky:

…[I]n this scenario, the much-vaunted genome inside that cell is being regulated by some other guy’s pee.

It ultimately makes no sense to ask what a gene does, only what it does in a particular environment; remember what turns grasshoppers into locusts. It is the triumph of context.

Laura Hercher:

In September last year, the National Institutes of Health in the US announced a grant of $25 million to examine the impact of DNA sequencing in newborns. Some of those parents are going to get results that suggest that the little bundle they are bringing home from the hospital is at risk for cancer, heart disease, autism. How important is it for parents to understand the limitations of the test? We have a minute, two minutes, maybe a year, to think about that question before we start talking about pre-natal DNA sequencing.

Karen James:

The answer to Dobbs’s question ‘Why bother rewriting the genome to evolve?’ then is ‘Because there is no other way’. The interactions among genes, and between them and the environment, are indeed far more sophisticated and ramified than what we learnt in high school, but evolution is, and indeed must be, gene-centric.

John Dupré:

The real relevance of the complexity of gene expression and regulation systems, as well as epigenetic inheritance, is that these provide multiple possible ways in which changes to the system might be stabilised without involving changes in DNA sequence.

These examples call into question a remarkable and insufficiently discussed idea in The Selfish Gene, the idea that DNA forms immortal coils. Dawkins argues that only genes replicate with sufficient fidelity to stabilise an evolutionary process. But why should evolution, a process of change, require something immortal at its heart?

My own follow-up is there as well. An excerpt:

Science’s true job and modus operandi is to find and articulate the most compelling story consistent with the facts. Naturally scientists must revise and sometimes replace these stories as research reveals new facts and dynamics.

Dawkins knows this, and in The Selfish Gene he tells a compelling story indeed. But, in an age when research is revealing the genome’s conversation with the outside world, and with itself, to be far more complex than we ever suspected, does the selfish-gene story remain the most compelling one we can offer about genetics and evolution?

That’s my question.

Get the whole kit at Aeon: Selfish gene, dead or alive?

___

Earlier posts on the subject:

‘Die, Selfish Gene, Die,’ with links

Jerry Coyne Mucks Up and Misreads “Die, Selfish Gene, Die”

 

The Story Behind the Story of “My Mother’s Lover”

My mom in Hawaii. Photo by Norman Zahrt.

Quick publicity note: Snap Judgment, the popular NPR storytelling show and podcast, has a segment this week in which I describe how I uncovered the story behind “My Mother’s Lover,” the Atavist piece that became a #1 best-selling Kindle Single.

You can find the Snap Judgment segment here.

You can read the story itself either via Kindle or, far better, methinks, in its gorgeous Atavist presentation, with photographs, film, audio, maps, and a timeline. Enjoy.

Neil Young, Bruce Springsteen, and Time’s Vain Delusions


A couple weeks ago I found myself staring at an undersized image on my iPhone: the wee tiny cover of Neil Young’s 1979 album Rust Never Sleeps, which I was listening to yet again. Rust deepens nicely over time, because time is its subject. The title track* is about remaining rustless as you age. In “Powderfinger,” another particularly strong track, a youth faces death, which is the moment, as Dylan supposedly said, in which you’re as old as anyone can get. This young man, “just turned twenty-two,” has mixed feelings about this. There’s a beautiful anger in Young’s voice as he resumes singing after the second of the song’s raggedly lyrical guitar solos. But this youth accepts his end both grudgingly and gracefully (“Remember me to my love / I know I’ll miss her…”): a counter to the title track’s insistence on constant regeneration.

Sleeve of Rust Never Sleeps LP, with playlist

I first bought Rust in 1987, and for years the album cover often leaned on the wall behind my turntable as the LP was spun, flipped, and spun again: Neil’s one-man version of “Hey Hey, My My” to open the acoustic A side, followed by “Ride My Llama” (“I’d like to take a walk, not around the block…”), “Pocahontas,” and “Sail Away”; flip the disc to get “Powderfinger” lyrically lighting up the album’s electric side, and the title track’s furious, grungy, amplified version closing it. Time, loss, escape, death, rebirth. Flip it again: repeat.

I rarely saw the album cover once digital came. I’m not quite sure why last month I kept gazing at the tiny iPhone-size version. But I was entranced in a different way with this image of Neil: with his lean easy tallness and his hair black and his smooth skin, weaving, I imagined, the Powderfinger solo I was listening to at that moment. He looks so young. He was in his thirties. Many a morning he probably looked in the mirror and thought to himself, as one does starting about then, The years are starting to show. But just look at this man: So beautiful. So young.

Yet, as I think anyone who has followed Neil since then would agree, he was then a lesser man than he would become. Magnificent — yet a younger version of himself. Whole, but not complete.

__

A few days after gazing at tiny Neil, somehow diverted by Twitter to YouTube, I encountered another time-tinted stage presence: Bruce Springsteen singing “Thunder Road” in Perth just 10 days earlier, on February 7 of this year. Bruce, at the end of another epic evening, sixty-something and soaked in sweat, came out for an encore. (May we all look so good at that age, or ever, in tight wet tee-shirts.) Instead of bringing the band out for this wall-of-sound hit, he sang it solo acoustic. He was out there not to recreate a younger version of himself, but to look back at it.

“Thunder Road” is from Born to Run, an album that threw Springsteen into a fame-making machine he hated. Hated. Next to the title track, “Thunder” is arguably the album’s most adolescent composition. It’s a paean to the idea that in youth you can remake the world simply by driving away from your past with enough horsepower. As if your history would not ride in the trunk.

Born to Run followed Bruce around a long time. A dozen years later (the same year I bought Rust Never Sleeps, by chance), Bruce released Tunnel of Love, which he once called his first album for grown-ups. I would discover Tunnel only a few years later, at a timely time. No illusions here, mister, about leaving your past behind. Tunnel is all about living with ghosts and mistakes and baggage and limitations. He sneers at the kinds of hopes he once held and fueled:

Some girls they want Handsome Dan
or some good-looking Joe.
On their arms some girls like
a sweet-talking Romeo.

Well ’round here, baby,
I learned you get what you can get.
So if you’re rough enough for love,
Honey, I’m tougher than the rest.

The title track, “Tunnel of Love,” is a work of stealth. It opens like a tinkly teenage carnival ride and sucks you into a dark tour of middle age. You must embrace even new love with the stench of the old still upon you:

“Well the house gets haunted and the
ride gets rough.
You got to ride down, baby,
into that tunnel of love.”

On the other hand, those rough enough and ready can find moments of contentment, as in “All That Heaven Will Allow”: 

Some people wanna die young, man,
Young and gloriously.
Well, let me tell you now, mister:
Hey buddy, that ain’t me.

Which is all a pit stop, along Bruce’s never-ending drive, on the way to a night in Perth, 27 years later and 39 years after Born to Run, where he takes “Thunder Road” and reshapes an adolescent anthem into an homage to age. It’s not just that he plays it solo and slow. He reforges all the crucial phrasings and melodic turns and dynamics to turn them downward. Where once they rose in defiance, now they fall.

He has replaced his love song to youth with two other songs.

One is a love song to his fans, many of whom, of course, have been aging with him for years. When early on he pauses to let them sing a line on their own (“Show a little faith, there’s magic in the night…”; it’s just after the 1:00 mark in the video above, best viewed if you take in the 60 seconds before as well), it’s as if all these people singing with him go inside him to get the lines and bring out new energy. The sense of connection is astounding. Listen to them sing; look at his face when with closed eyes he hears them. What a happiness this man has made.

He’s also delivering another love song: a Tunnel of Love-level love song sung to the young Jersey musician who, writing this tune 40 years before, let himself think that you could outdrive your troubles — or at least use that proposal to seduce a girl and yourself and a planet full of fans. This 2014 rendering gently taps that earlier Bruce on the shoulder to warn and console him. The Bruce in Perth, singing to the old (young) Bruce in Jersey, has accepted his age, and he has never been more beautiful. When he lets the audience sing, “Maybe we’re not that young anymore,” it’s one of the song’s few rising lines. It blooms as a statement not of loss but of gain, and it seems to fill his heart.

Aging brings a thousand cruelties. Perhaps the most bizarre is the intermittent realization of a fact we so fiercely resist: that we are always younger versions of ourselves — rough drafts of the people we will become. Try to entertain that notion for a second. It’s hard. It stays hard. I am now closer to Bruce’s age than to Bruce’s age back then, and I still can’t do it. If you’d asked me to try when I was in my 20s or 30s, I’d have inwardly scoffed, even though I was paying ruinous interest rates on my early bad bets; that junk in the trunk was quite a load. No wonder we flinch. Even if part of you knows that at some level you’re mostly at sea, your operating assumption is that you have this thing down; you’re older now; you got this. A useful delusion. It’s too dispiriting to confront how incomplete we are, to contemplate how much we lack because we haven’t lived longer. And no one’s in a hurry to reach the end.

Yet somehow life gets better anyway.

My 12-year-old reads everything and seems to remember every word. Two weeks ago, as we were riding a ski lift up into a bright icy sky, he asked me about something he’d read recently, probably in Muse. He said, Dad, Dad — Did you know there’s this theory about time? It’s that time doesn’t really exist. It’s that time is really just a human invention, a conceptual trick we created to keep ourselves from going crazy because we can’t deal with everything happening at once.

I said I didn’t know that, but it made sense.

*The title track’s title is “Hey Hey, My My,” but the album’s title, Rust Never Sleeps, is from the song’s take-away line.

Leonardo da Vinci and the Power of Ignorance

In his fine, short Leonardo da Vinci: A Life, written for the Penguin Lives series, Sherwin Nuland wonders if there are times when good scientists or entire disciplines, psychology among them, sound idiotic in their speculations because the empirical underpinnings of their fields are simply not yet in place.

[Leonardo] has been criticized, now and in his own time, for finishing so little of what he started. And yet, how could it have been otherwise, at least in the areas of his scientific work? The probings of his mind had gone well beyond the supporting knowledge and technology of his era. Had much more been available, it would certainly have released his genius to fly as far in reality as it did in his conjectures and fantasies. Kenneth Keele, the foremost authority on Leonardo’s anatomical studies, once sent me a paragraph extracted from a letter to a mutual friend, in which he described his own feelings about these matters, aroused while he was working on some of Leonardo’s manuscripts:

At every page I am fascinated by his intelligent questions and answers. But I find myself realizing that however intelligent, however full of instinctive weight the questions are, when the supporting base of knowledge is not there the answers are bound to contain errors.  This makes my tale inevitably one tinged with sadness; and the more Leonardo struggles within his chains of ignorance the sadder it becomes. Especially is this so because though he breaks his fetters in many places he never escapes from them. I wonder if in a number of fields (I would cite sociology, psychology, thanatology) we are not in a rather similarly sad state today with the fetters being no less powerful for being unknown to us, even unfelt.

Sociology, psychology, thanatology; I would add much of genetics to that list as well. I would offer too that the proper response is not to refrain from speculating, but to do so, and to use the speculation to push both theory and experiment, as Darwin did so long ago with coral reefs and then species.

Note 03/04/14, 1:38pm EST: Fixed several dictation-typos (dictos?) I failed to catch earlier; hazards of posting while kid-wrangling. Apologies for any confusion to earlier readers.