Last week’s spat between Nicholas Carr and Steven Pinker generated a lot of attention — and, happily, delivered a couple of the more lucid framings yet of the debate over whether digital culture makes us shallow, as Carr argues in his new book, or simply represents yet another sometimes-distracting element that we can learn to deal with, as Pinker countered in a Times Op-Ed last Thursday.
I sympathize with both arguments; I see Carr’s point but feel he overplays it. I find digital culture immensely distracting. I regularly dive down rabbit holes on my computer, iPhone, and iPad, taking wandering, shallow paths much like those Carr describes. Yet I remember getting distracted by other things — newspapers, magazines, favorite books I’d already read, tennis matches, conversations with neighbors — as a young adult in the dark dark pre-Internet era. So instead of avoiding my book(s) by reading tweets and blog posts, I reread some favorite passage about Eric Shipton exploring Nepal, watched Wimbledon, or phoned my sister to see how grad school was going. As Pinker notes,
distraction is not a new phenomenon. The solution is not to bemoan technology but to develop strategies of self-control, as we do with every other temptation in life.
I agree. Twitter indeed offers endless, easy, and lasting distraction; it calleth as I compose. But 20 years ago, so too called The Sporting News and the New York Review of Books, my tennis racket, my binoculars, my bicycle, and my Gibson ES-345, a stack of books I hadn’t read and several bookshelves full I had, not to mention all the people I could find to talk to if I took a long enough walk. I didn’t work any more steadily or deeply back then than I do now, once I get going. But now I am far less isolated socially and intellectually, even living in rinky-dink Montpelier, than I was back then living in large university towns. I don’t mean to dismiss Carr’s concerns altogether. But I side with Pinker and Jonah Lehrer in being skeptical that the Net is working a fundamental, singular, bad bad voodoo on how we think.
I bring to this a bit of history: about a year or 18 months ago, I had several discussions with an editor (at Wired, of all places; this was going to be a sort of anti-Wired piece) about doing a story exploring a more tightly constrained version of Carr’s argument. I would flesh out the notion that consuming digital culture, even just reading words on the net instead of on the page, likely wired the brain differently than reading print did. I pitched the story because I wondered whether that was happening to me; reading on the web felt different, and perhaps it shaped brain and cognitive development accordingly.
Perhaps things have changed since then, but at the time, we decided against doing the story because in a couple of days of surveying the literature and making phone calls to people who studied reading from a neuroscientific point of view … well folks, I could not find anyone with data showing such rewiring. Yes, people were doing the sorts of fMRI studies that showed the brain activating a bit differently when reading the web or following links than when reading print; they showed, in other words, that the experience was different. But no one had data showing the sort of change I would consider “rewiring” — that is, showing that reading on the web, or immersing oneself in digital culture in general, actually created a different course of brain or cognitive development. Again, possibly things have changed since; perhaps I’d find those studies if I read Carr’s book — though for what it’s worth (quite a bit, in my book), Jonah Lehrer did read it and came to the same conclusion I did: the data doesn’t clear the bar.
So went my own Shallows argument of 12 or 18 months ago: into the round file. I started with a feeling that the web was rewiring my brain — and failed to find data supporting my dark suspicions.
But oh wait — I got distracted. I want to address here not so much the heart of the Pinker-Carr argument as one particular point Carr made in his response to Pinker, one I found off-key — not so much because it doesn’t apply (though it doesn’t, for reasons we’ll see), but because it leans on a false dichotomy that I think we need to lay to rest. I refer to this:
Pinker, it’s important to point out, has an axe to grind here. The growing body of research on the adult brain’s remarkable ability to adapt, even at the cellular level, to changing circumstances and new experiences poses a challenge to Pinker’s faith in evolutionary psychology and behavioral genetics. The more adaptable the brain is, the less we’re merely playing out ancient patterns of behavior imposed on us by our genetic heritage.
Wuh-oh, trouble: Carr casts a strong opposition here between inherited cognitive powers and in-play adaptability, genes and plasticity.* At the most immediate level, of course, he’s attacking Pinker’s “faith in evolutionary psychology and behavioral genetics,” and perhaps that’s all Carr means here — that Pinker objects because Pinker feels threatened, and Pinker feels threatened because he’s wed to a false nature-or-nurture dichotomy. Yet Carr himself seems intimately tied to the same dichotomy when he writes, “The more adaptable the brain is, the less we’re merely playing out ancient patterns of behavior imposed on us by our genetic heritage.” He seems to be saying not that Pinker is wrong to draw the contrast, but that Pinker is on the wrong side of it.
And so Carr sets adaptability against genetic heritage. Carr has stronger arguments, and I think he needs to set this one aside. For the most vital part of the “genetic heritage” he cites is the very adaptability or plasticity he likes to emphasize. We’re successful (as a species, and generally as individuals) precisely because our brains learn readily and — as Carl Zimmer points out nicely in a recent essay — both brains and genes fluidly adapt to a surprising range of environments and challenges. Adaptability exists not despite our genes but because of them.
Nick Carr is a bright guy, and I suspect that at some level, perhaps many levels, he recognizes this. Indeed, in the very next paragraph of his piece he notes that, to understand human thought,
we need to take into account both the fundamental genetic wiring of the brain – what Pinker calls its “basic information-processing capacities” – and the way our genetic makeup allows for ongoing changes in that wiring.
This clearly recognizes that genes underlie our behavioral and neural plasticity. Yet Carr’s earlier language, which casts the brain’s adaptability as incompatible with our genetic heritage, ignores that recognition. He seems to insist on a false split between nature and nurture.
Possibly I’m misreading him here. Possibly he misspoke. But I suspect that Carr — hardly alone in doing so — fell back on a nature-versus-nurture framework for pondering human thought and behavior that, though deeply ingrained, is being proven false by the highly fluid conversation between genes and experience that researchers are exposing. Possibly he does so just to make a point; certainly that’s how he deploys the idea here. And goodness knows, among the attractions of the nature-or-nurture debate is that it lets you argue incessantly about a dichotomy that even your own argument betrays as false.
For what it’s worth, Louis Menand, reviewing Pinker’s The Blank Slate in 2002, accuses Pinker of the same muddle.
Having it both ways [that is, sometimes insisting that nature trumps nurture, and at other times citing nurture’s power to override nature] is an irritating feature of “The Blank Slate.” Pinker can write, in refutation of the scarecrow theory of violent behavior, “The sad fact is that despite the repeated assurances that ‘we know the conditions that breed violence,’ we barely have a clue,” and then, a few pages later, “It is not surprising, then, that when African American teenagers are taken out of underclass neighborhoods they are no more violent or delinquent than white teenagers.” Well, that should give us one clue. He sums the matter up: “With violence, as with so many other concerns, human nature is the problem, but human nature is also the solution.” This is just another way of saying that it is in human nature to socialize and to be socialized, which is, pragmatically, exactly the view of the “intellectuals.”
The nature-or-nurture debate exerts a strong pull. I’m tempted to say it seems to be in our genes. Yet while resolving puzzles is in our genes, the nature-or-nurture debate is not; it’s prominent and perennially hot because each side offers a seemingly viable explanation of behavior and, even more important, because it carries the horrid legacies of racism, the Holocaust, and 20th-century eugenics. It’s as much political as it is scientific. But we’re at a place where science, anyway, would let us set it aside.
*Unlike Pinker or my friend Vaughan Bell, I don’t find neuroplasticity a dirty word. Though it’s often used badly and sloppily, neuroplasticity, along with plain old plasticity, provides a useful shorthand for reminding us that both our brains and our behavior are more malleable and changeable than the neuroscience and psychology of a couple of decades ago recognized. It also reminds us — implies, anyway — that some of us are more mentally and behaviorally plastic and capable of change than others.