Carr, Pinker, the shallows, and the nature-nurture canard

Last week’s spat between Nicholas Carr and Steven Pinker generated a lot of attention — and, happily, delivered a couple of the more lucid framings yet of the debate over whether digital culture makes us shallow, as Carr argues in his new book, or simply represents yet another sometimes-distracting element that we can learn to deal with, as Pinker countered in a Times Op-Ed last Thursday.

 

I sympathize with both arguments; I see Carr’s point but feel he overplays it. I find digital culture immensely distracting. I regularly dive down rabbit holes on my computer, iPhone, and iPad, taking wandering, shallow paths much like those Carr describes. Yet I remember getting distracted by other things — newspapers, magazines, favorite books I’d already read, tennis matches, conversations with neighbors — as a young adult in the dark dark pre-Internet era. So instead of reading tweets and blog posts instead of writing my book(s), I reread a favorite passage about Eric Shipton exploring Nepal, watched Wimbledon, or phoned my sister to see how grad school was going. As Pinker notes,

distraction is not a new phenomenon. The solution is not to bemoan technology but to develop strategies of self-control, as we do with every other temptation in life.

I agree. Twitter indeed offers endless, easy, and lasting distraction; it calleth as I compose. But 20 years ago, so called too The Sporting News, the New York Review of Books, my tennis racket, my binoculars, my bicycle, and my Gibson ES-345, along with a stack of books I hadn’t read and several bookshelves full that I had, not to mention all the people I could find to talk to if I took a long enough walk. I didn’t work any more steadily or deeply back then than I do now, once I get going. But now I am far less isolated socially and intellectually, even living in rinky-dink Montpelier, than I was back then living in large university towns. I don’t mean to dismiss Carr’s concerns altogether. But I side with Pinker and Jonah Lehrer in being skeptical that the Net is working a fundamental, singular, bad bad voodoo on how we think.


Ozzy Osbourne. Now genomics is getting somewhere.


Ozzy Osbourne, preparing to grasp the meaning of his genome.

There’s been much attention lately to the failure of genomic advances to yield many medical ones. From rock’n’roll comes hope.

THE mystery of why Ozzy Osbourne is still alive after decades of drug and alcohol abuse may finally be solved.

The 61-year-old former Black Sabbath lead singer — who this week begins his health advice column in The Sunday Times Magazine — is to become one of only a few people in the world to have his full genome sequenced.

In addition to giving Osbourne information that could help prevent diseases, it is hoped the results will provide insights into the way drugs are absorbed into the body.

I suppose stranger things have happened. Now they need to do Keith Richards. Any rare variants those two share … well, I’m not sure WHAT you’d do with that.

via The Sunday Times

The New York Review goes bloggy


A still from Visconti’s The Leopard, via NYRB

This is not new, but seems to me overlooked (and underlinked) in the blogosphere: The New York Review of Books — a long, longtime favorite of mine — has a blog stable that offers a nice variety of goodies. The current line-up gives a sense of the range: a piece on Mexican art by Alma Guillermoprieto; Sue Halpern’s beef about the iPad, which I elaborated on earlier; pieces on the Vatican, Iraq, and Pakistan; and a leisurely travel post on Palermo that begins, “Everything in Palermo is slow except the traffic, which is as confusing as a video game and just as fast.”

Locals & tourists living mashed all together

 


A clever fellow named Eric Fischer tapped Flickr geotag data to map where locals and tourists take photos in major cities. Above — the wiggly Thames declares it instantly to lovers of this city — is London, where I’m moving (for a time, anyway) in just a few weeks. Blue denotes photos taken by locals, red by tourists. H/t Ferris Jabr, Boing Boing, and Kottke.

Gleanings: mayfly radar, tennis memoir, et alia

 


Unbelievable! Department, via SciencePunk: Giant mayfly swarm caught on radar

NYRB reviews what sounds like an especially moving memoir from Andre Agassi.

Whatever It Takes Department, via Ed Yong: Superstitions can improve performance by boosting confidence.

The climate-change doubt industry and its roots – http://bit.ly/an4cAr, via @stevesilberman

RitaRubin: Study: Have bad habits? U r more likely 2 blame health problms on your genes. ‘Cause u can’t do anything 2 change them http://bit.ly/ad6iRy. Damned interesting if true.

techreview: Genetic Testing Can Change Behavior http://bit.ly/c1jKSU

sarcastic_f: Criminal offenders w/ psychopathic tendencies impaired in emotional but not cognitive theory of mind http://j.mp/a1fsza OFC involvement? This does not surprise me. (See my Times Magazine article “The Gregarious Brain” and search for “theory of mind.”)



 

iPad, therefore iKludge

 


Don DeLillo’s Players, as marked up by David Foster Wallace.
Courtesy Harry Ransom Center, University of Texas at Austin.

I just sat down to air a complaint about reading on the iPad when I discovered that Sue Halpern had done much of my work for me:

For all its supposed interactivity, the iPad is a surprisingly static machine, especially for reading. … One of the guilty pleasures of an actual, ink-on-paper book is the possibility of marking it up—underlining salient passages, making notes in the margins, dog-earing a page. While it’s true that some electronic book platforms for the iPad allow highlighting (it even looks like you’ve used a fat neon yellow or blue or orange marker), and a few—most notably Kindle and Barnes and Noble but not iBooks—allow you to type notes, they barely take advantage of being digital. It is not possible to “capture” your notes and highlights, to organize, compile, arrange, or to print them out. Until there is a seamless way to do this, marginalia will remain sequestered in the margins, and the promise of electronic books will be unrealized.

This plaint struck me a few weeks ago when — eating pasta in Palo Alto, as it happens — I was reading The Selfish Gene. I was enjoying both meal and book immensely, and, thinking fondly of my faithful readers here, wanted to share some of it with you, and to harvest salient passages for my own research as well. So I was pleased, exploring this new device, to find that the iPad’s Kindle program offers a highlighting feature.

Later, however, when I wanted to share these passages, I found what Halpern complains about: My highlighted passages appeared to be as locked up in the book as they would be if I’d highlighted a print copy.


Gleanings: genetic goofs & mutts, and where good ideas come from

 


The photo above is one of several posted by NeuroDojo, who has a lovely post on them.

Genetic Future ponders the 23andMe Oops-wrong-data event. Turns out it was a flipped plate.

“I’m frankly astonished that this was possible at an accredited genotyping facility – plate-flipping is an age-old problem, but trivial to prevent with good plate/machine design that only allows plates to be loaded in a single orientation.”

Same source carried a good strong early account of this mix-up as well.

Genomeboy ponders a dog’s life, as glimpsed through its genome.

Steven Berlin Johnson gives a peek at his new book, Where Good Ideas Come From.


Are we living in a neuroculture?


Andrew Carnie, Magic Forest, 2002, via Neuroculture.org

 

Do we live in a neuroculture? Of course we do!

Coming from a blog named Neuron Culture, this is obviously a set-up question — my excuse to call attention to a post by Daniel Buchman that offers a brief review article on the question.

It seems that everywhere I look nowadays, I’m seeing images of, or reading descriptions of, the brain in some shape or form.

Buchman links (at the post’s bottom, as is now the practice at NCore) to several good reads and sites, including Neuroculture.org, which has some lovely stuff, and — curse those paywalls — a Nature Neuroscience essay on the subject (subscription required) from last year and a more recent paper with the lovely title “The Birth of the Neuromolecular Gaze” (subscription required).

Free and fun, however, is Marco Roth’s Rise of the Neuronovel, at n+1. It’s a good read, even if, like me, you don’t completely agree. (I see now that Jonah Lehrer didn’t go for it either; worth checking, as is Roth’s response there.) But I call it out here not to differ, but to a) bring it forward and b) add something to Roth’s several sharp-eyed observations about Ian McEwan’s Saturday. (I think Roth is a bit too hard on McEwan, but that’s not what I’m after here; and I suspect Roth sensed what I’m calling out here but left it unremarked because he had other dishes he wanted to cook.)


Gleaned – acupuncture, HeLa, linkage debates, psychos, and the FBI

A biological basis for acupuncture, or more evidence for a placebo effect? Ed Yong ponders acupuncture, placebos, and context. This I like, and there’s a nice meta dimension here as well: placebos being all about context.

Abel Pharmboy reports on Marking the magnificent memory of Henrietta Lacks. A nice account of what sounds like a lovely ceremony. Among other things, it testifies to the potential power of the book.

Much ado about links and where they best belong. Starting points: Rough Type: Nicholas Carr’s Blog: Experiments in delinkification, which gets examined by ReadWriteWeb in The Case Against Links, and more critically by Matthew Ingram, who ponders Nick Carr’s Retreat From the Internet. I’m of two minds on this. It strikes me that in some types of posts links are best used in-line, while in others they might best serve both writer and reader if they’re held till the end.

As Vaughan Bell notes, a leading psychopath researcher is threatening to sue critics. This adds to a disturbing trend in which researchers sue other researchers who differ with them, taking to court scientific matters that should be settled through research and discussion.

And speaking of being careful about what you study: The splendid anthro blog Savage Minds notes that “A senior at Pomona College flying out of Philly was detained and handcuffed by the TSA and then questioned by the FBI … for carrying Arabic flash cards in his pockets. So start putting your terrorist language materials in your checked luggage!”

 

 

Neuron Culture top 5 hits for May

Suspicious minds

In reverse order:

5.  David Sloan Wilson, pissing off the angry atheists.

"I piss off atheists more than any other category, and I am an atheist." This sparked some lively action in the comments.

4. Lively or not, Wilson and Dawkins lost fourth place to snail jokes.

A turtle gets mugged by a gang of snails. 

3. A walking tour that lets you See exactly where Phineas Gage lost his mind

 

2. “Push” science journalism, or how diversity matters more than size

We’re constantly told — we writers are, anyway — that people won’t read long stories. They’re hard to sell to editors, probably because editors believe they’re hard to sell to readers. I think I read once — can’t recall where, don’t know if it’s true, we’re trusting my hippocampus here, which is a frail thing — that a major online news magazine found that readership of its stories reliably fell off as the stories went past the 1,000-word mark.

That’s probably true. Yet if a long story is written with care, plenty of people read it. The Times’s most-popular-stories list consistently includes long features among its top three entries. (My depression story was there for several days.) Clearly length does not always dissuade. Yet the idea that it does dissuade holds strongly enough that writers seldom get the opportunity to write long — and thus to include the goods that will carry some readers through a science story.

1. Who you gonna believe, me — or my lyin’ fMRI?

As I noted in an earlier article on the overreach of forensic science, juries tend to be overly credulous about anything offered as forensic or scientific evidence. And other studies show that imaging studies generate an extra layer of overcredulousness. (On those, see Dave Munger and Jonah Lehrer.) So when an ‘expert’ shows a jury a bunch of brain images and says he’s certain the images show a person is lying (or not), the jury will give this evidence far more weight than it deserves.