When “Chasing Ice” finished, my 10-year-old son, sitting next to me in the almost empty theater, said, “That was sobering.” He was right: Sobering, but also beautiful and inspiring.
“Chasing Ice” documents both the earth’s current warming and one man’s obsessive efforts to show that warming in terms everyone can understand: visual, immediate, dramatic. National Geographic photographer James Balog says he was a bit of a climate skeptic himself until he took an assignment in 2005 and 2006 photographing the retreat of a single glacier in Iceland for a National Geographic story. Seeing the glacier’s retreat with his own eyes (and in his photographs) convinced him. He figured that if he could show the same thing on many glaciers around the globe, he could convince other skeptics that climate change was real and serious. So he organized the Extreme Ice Survey to document global warming with time-lapse photographs of retreating glaciers. The film shows this effort — and some of the truly stunning images they captured, both in stills and in live video. The film’s most renowned segment left me truly drop-jawed.
Some see this as an antidote to a sort of cognitive resistance that discourages us from acknowledging changes or risks that can’t be directly perceived or that seem distant in time. The role of such thinking in climate-change skepticism was called into question in May 2012 by an interesting paper out of Yale. That paper found that neither scientific literacy nor supposedly rational modes of thought made people more likely to acknowledge climate change. Rather, in a manner that brings to mind the Kill Whitey studies of morality, people tend to take the view most harmonious with whatever peer groups or political cultures they identify with. We subscribe to a view that we’re comfortable with socially, culturally, and politically, then backfill the reasoning.
So it’s possible this film may leave your climate-skeptic friends cool on the whole global-warming thing. Then again, it may “work,” for the film makes a particularly strong case with its combination of ingenious graphics; a story of a very nice guy pursuing an idealistic obsession with lots of sexy choppers, crampons, and cameras; and some of the most stunning and beautiful earth footage I’ve ever seen. This is one of the few movies where I was moved to the verge of tears by the imagery’s sheer beauty.
More info is at the film’s website, which also has some lovely photos. (So do Balog’s main site and the Extreme Ice Survey page.) Don’t despair if the site’s “See the Film” page doesn’t list nearby or current showings; that listing seems out of date or incomplete and did not, for instance, include the screening my son and I enjoyed here in Vermont. You might try Moviefone or similar sites instead.
In any case, I highly recommend this film. It may or may not change your mind if you’re a skeptic. (So, you know, it’s safe….) But the footage and film will likely blow your mind regardless.
“You are different with a gun in your hand; the gun is different with you holding it.” – Bruno Latour
I’ve been on an Elmore Leonard tear of late. Right now I’m reading his Split Images, in which a good cop and a good magazine journalist (heh) try to stay a step ahead of a rich gun nut named Robbie Daniels. Our hero and heroine cross Daniels’ path as the car-parts heir is discovering that he enjoys shooting people. Daniels doesn’t just go out one day and buy a gun and become both killer and gun nut. The guns come first: a curated and cabineted collection kept in drawers that slide out so that Daniels can show them off, recumbent in plush, to admirers like his new gun pal and accomplice Walter Kouza, a bad-news cop who rather enjoys shooting people himself:
There must have been two dozen handguns in there, a showcase display against dark velvet.
“Jesus,” Walter said.
There were Smith and Wesson thirty-eights and three-fifty-sevens, in Chief Special and Combat Masterpiece models, two- and four-inch barrels. He had a Walther P thirty-eight, a Beretta nine-millimeter Parabellum. He had Llama automatics, several, including a thirty-two and a forty-five. A Llama Comanche three-fifty-seven, an Iver Johnson X300 Pony, a Colt forty-five Combat Commander, a Colt Diamondback and a Detective Special. He had a big goddamn Mark VI Enfield, a Jap Nambu that looked like a Luger. Christ, he had a ten-shot Mauser Broomhandle, nickel-plated, a Colt single-action Frontier model, a couple of little Sterling automatics. Walter’s gaze came to rest on a High Standard Field King model, an ordinary twenty-two target pistol except for the barrel. The original five-and-a-half-inch barrel had been replaced by a factory-made suppressor, or silencer, that was at least ten inches long, fabricated in two sections joined together.
A minute later he opens the cabinet with the machine guns and assault rifles.
Even as Leonard reveals these weapons, late in Chapter One, we recognize them as the guns of Chekhov’s maxim: “If in the first act you have hung a pistol on the wall, then in the following act it should be fired. Otherwise don’t put it there.”
Once put in the play, a gun must be put in play.
Is there a sense too in which a real-life gun, once put in hand, must be fired? If so, how deeply does this expectation, this foreshadowing of action, soak into its holder? How might the surrounding culture (hunting versus “tactical”) and type of gun shape these expectations? One holds a gun by its grip. Is there a sense in which the gun grips the holder, so that gun begets gunner?
Evan Selinger, a philosopher who focuses on technology, explored these questions a few months ago in an article he wrote just after the Aurora shooting. The piece demands a bit of us, for it asks us to reconsider the shape of something whose shape we’re certain we already know. But the dimensions of which he writes, being of the human mind, are not as plain as we might like to think. As America now re-examines its relationship with firearms, more seriously, it appears, than it has for many a year, I wanted to put this idea back in play, so I’m posting it here at Neuron Culture. You’ll find it below.
Thanks and a huzzah to Dr. Selinger, whose other work you can find here, and to Alexis Madrigal and the team at The Atlantic’s Tech channel, who originally ran and illustrated this piece and happily encouraged this reposting. The piece originally ran there on July 23, 2012, under the title “The Philosophy of the Technology of the Gun.”
________
We Grip the Gun and the Gun Grips Us
by Evan Selinger
[original version ran July 23, 2012]
The tragic Colorado Batman shooting has prompted a wave of soul-searching. How do things like this happen? Over at Wired, David Dobbs gave a provocative answer in “Batman Movies Don’t Kill. But They’re Friendly to the Concept.” I suspect Dobbs’s nuanced analysis about causality and responsibility won’t sit well with everyone.
Dobbs questions the role of gun culture in steering “certain unhinged or deeply a-moral people toward the sort of violence that has now become so routine that the entire thing seems scripted.” But what about “normal” people? Yes, plenty of people carry guns without incident. Yes, proper gun training can go a long way. And, yes, there are significant cultural differences about how guns are used. But, perhaps overly simplistic assumptions about what technology is and who we are when we use it get in the way of us seeing how, to use Dobbs’s theatrical metaphor, guns can give “stage directions.”
Instrumentalist Conception of Technology
The commonsense view of technology is one that some philosophers call the instrumentalist conception. According to the instrumentalist conception, while the ends that technology can be applied to can be cognitively and morally significant, technology itself is value-neutral. Technology, in other words, is subservient to our beliefs and desires; it does not significantly constrain much less determine them. This view is famously touted in the National Rifle Association’s maxim: “Guns don’t kill people. People kill people.”
The NRA maxim “Guns don’t kill people. People kill people,” captures the widely believed idea that the appropriate source to blame for a murder is the person who pulled the gun’s trigger.
To be sure, this statement is more of a slogan than a well-formulated argument. But even as a shorthand expression, it captures the widely believed idea that murder is wrong and the appropriate source to blame for committing murder is the person who pulled a gun’s trigger. Indeed, the NRA’s proposition is not unusual; it aptly expresses the folk psychology that underlies moral and legal norms.
The main idea, here, is that guns are neither animate nor supernatural beings; they cannot use coercion or possession to make a person shoot. By contrast, murderers should be held responsible for their actions because they can resolve conflict without resorting to violence, even during moments of intense passion. Furthermore, it would be absurd to incarcerate a firearm as punishment. Unlike people, guns cannot reflect on wrongdoing or be rehabilitated.
Beyond Instrumentalism: Gun Use
Taking on the instrumentalist conception of technology, Don Ihde, a leading philosopher of technology, claims that “the human-gun relation transforms the situation from any similar situation of a human without a gun.” By focusing on what it is like for a flesh-and-blood human to actually be in possession of a gun, Ihde describes “lived experience” in a manner that reveals the NRA position to be but a partial grasp of a more complex situation. By equating firearm responsibility exclusively with human choice, the NRA claim abstracts away relevant considerations about how gun possession can affect one’s sense of self and agency. In order to appreciate this point, it helps to consider the fundamental materiality of guns.
In principle, guns, like every technology, can be used in different ways to accomplish different goals. Guns can be tossed around like Frisbees. They can be used to dig through dirt like shovels, or mounted on top of a fireplace mantel, as aesthetic objects. They can even be integrated into cooking practices; gangster pancakes might make a tasty Sunday morning treat. But while all of these options remain physical possibilities, they are not likely to occur, at least not in a widespread manner with regularity. Such options are not practically viable because gun design itself embodies behavior-shaping values; its material composition indicates the preferred ends to which it “should” be used. Put in Ihde’s parlance, while a gun’s structure is “multistable” with respect to its possible uses across a myriad of contexts, a partially determined trajectory nevertheless constrains which possibilities are easy to pursue and which of the intermediate and difficult options are worth investing time and labor into.
A gun’s excellence simply lies in its capacity to quickly fire bullets that can reliably pierce targets.
With respect to the trajectory at issue, guns were designed for the sole purpose of accomplishing radical and life-altering action at a distance with minimal physical exertion on the part of the shooter. Since a gun’s mechanisms were built for the purpose of releasing deadly projectiles outwards, it is difficult to imagine how one could realistically find utility in using a gun to pursue ends that do not require shooting bullets. For the most part, a gun’s excellence simply lies in its capacity to quickly fire bullets that can reliably pierce targets. Using the butt of a gun to hammer the nail into a “Wanted” post (a common act in the old cowboy movies) is an exceptional use.
What the NRA position fails to convey, therefore, are the perceptual affordances offered by gun possession and the transformative consequences of yielding to these affordances. To someone with a gun, the world readily takes on a distinct shape. It not only offers people, animals, and things to interact with, but also potential targets. Furthermore, gun possession makes it easy to be bold, even hotheaded. Physically weak, emotionally passive, and psychologically introverted people will all be inclined to experience shifts in demeanor. Like many other technologies, Ihde argues, guns mediate the human relation to the world through a dialectic in which aspects of experience are both “amplified” and “reduced”. In this case, there is a reduction in the amount and intensity of environmental features that are perceived as dangerous, and a concomitant amplification in the amount and intensity of environmental features that are perceived as calling for the subject to respond with violence.
French philosopher Bruno Latour goes so far as to depict the experience of possessing a gun as one that produces a different subject: “You are different with a gun in your hand; the gun is different with you holding it. You are another subject because you hold the gun; the gun is another object because it has entered into a relationship with you.” While the idea that a gun-human combination can produce a new subject may seem extreme, it is actually an experience that people (with appropriate background assumptions) typically attest to when responding to strong architectural configurations. When walking around such prestigious colleges as Harvard and the University of Chicago, it is easy to feel that one has suddenly become smarter. Likewise, museums and sites of religious worship can induce more than a momentary inclination towards reflection; they can allow one to view artistic and spiritual matters as a contemplative being.
flickr/robertnelson
The Brave One
The points about guns made by Ihde and Latour are poignantly explored in the 2007 film The Brave One. Unfortunately, many critics examined the film through a humanist lens, and bounded by its conceptual limitations, offered damning reviews. Many depicted the movie as a hyperbolic revenge film. All they saw was a gun-blazing Jodie Foster playing a character named Erica Bain who copes with a violent assault (that kills her fiancé and leaves her in a three-week coma) by moving through one scene after another of gratuitous vigilante violence, using an illicitly acquired 9mm handgun to settle scores and punish criminals that the law cannot touch. A stir was even caused by the following so-called “liberal” remarks that Foster made during an interview:
I don’t believe that any gun should be in the hand of a thinking, feeling, breathing human being. Americans are by nature filled with rage-slash-fear. And guns are a huge part of our culture. I know I’m crazy because I’m only supposed to say that in Europe. But violence corrupts absolutely.
The critics failed to grasp a point that Foster herself underscored in numerous interviews. Despite its market-driven name, the film is not primarily about human virtues or vices. It does not try to discern whether there is an essential experience of bravery or cowardice, and the extent to which characters in the film personify such ideals. Rather, it is an existential meditation that centers on what Foster calls a “deeper and scarier” theme. Looking beyond the explicit plot and its correlative bursts of visually disturbing depictions of violence, it becomes possible to recognize that the film explores the anti-essentialist thesis that people are not unified subjects, but instead are beings with fluid and re-negotiable identities. Especially in the face of trauma, people can abandon old lives and start new ones. In the case at issue, Erica goes from being a woman who lives a relatively disembodied existence — a radio host who collects the sounds of New York City by blending into its background; a minor celebrity who refuses an offer to appear on television by suggesting that she is more of a voice than a seductive face; and a lover who, at the beginning of the film, is visually contrasted with an athletic-looking, long-haired, male-nurse fiancé — to someone who can kill in cold blood without experiencing the quintessential physical sign of remorse, shaky hands.
By depicting Erica’s metamorphosis as a shift away from disembodiment that is brought about by means other than consciousness-raising or personal affirmation, The Brave One challenges the instrumental conception of technology. Erica’s transformation is so explicitly and thoroughly dependent upon technological mediation that the audience is led to infer that without the gun, she would be radically debilitated by her beating; her fate would lie in becoming an apartment-bound recluse.
Reflecting on the centrality of technological mediation to the plot, Foster uses phenomenological language and tells the media that the gun “opens up a world” in which Erica is viscerally “materialized” and therein drawn to dangerous situations (e.g., late-night trips to a convenience store and subway) where there is an increased likelihood of encountering violence. Since Erica enters these places because of a technologically induced desire, and not because she is deliberately seeking retribution, it may be fitting to consider the gun, as Latour might suggest through his notion of “symmetry,” one of the “actors” in the film.
To be sure, The Brave One is just a movie. It isn’t a scientific study and it does feature a character who has come undone. But if philosophers like Ihde and Latour are right, we’ve got more in common with her than most are willing to admit. And this possibility ups Dobbs’s already high metaphorical ante.
___________
Evan Selinger is an associate professor of philosophy at Rochester Institute of Technology. He discusses these ideas further, post-Sandy Hook, in a December 19, 2012 conversation with MSNBC’s Ned Resnikoff, “What can philosophy of technology tell us about the gun debate.” You can read more of Selinger’s writing here and keep up with him at Twitter here.
Neuron Culture’s comment policy: Please feel free to comment, but keep it civil, evaluate messages rather than messengers, and, ideally, read the thing you’re commenting on.
Who owns your memories? You’d think that would be you. But in a short interview with Claudia Dreifus at the New York Times, neuroethicist Matthew Liao notes that to the extent our devices are serving as outsourced personal memory banks, you may be sharing ownership with, say, Facebook:
Lately, you’ve been writing about this question: Do people own their memories? Most of us think, “Of course we do.” Why are you bringing this up?
Because there are some new technologies coming where we may be able to enhance cognition and memory with implanted chips. Right now, if you work for a company, when you quit, your boss can take away your computer, your phone, but not your memory.
Now, when we come to a point when an employee gets computer chip enhancements of their memory, who will own it? Will the chip manufacturer own it as Facebook owns the data you upload on their products at present?
Even today, some people claim that our iPhones are really just extensions of our minds. If that’s true, we already lack ownership of that data. Will a corporate employer own the chip and everything on it? Can employers selectively take those memories away? Could they force you to take propranolol as a condition of employment so that you don’t give away what they define as corporate secrets?
I keep up with the somewhat overblown neuro-ethicist beat pretty well, and most of what I read (including in this interview) is fairly well-trodden. This one popped out at me as new, and — within the forward-looking realm of neuroethics — possibly more consequential than some of the other material.
Studying Ethical Questions as the Brain’s Black Box Is Unlocked.
Image by David Dobbs, copyright 2010. Every single right imaginable reserved.
Someone just directed me to Maggie Koerth-Baker’s examination of a riveting example of the differences between a life lived and a life recalled — in this case, the life memorialized, fictively, in Laura Ingalls Wilder’s splendid Little House on the Prairie. Koerth-Baker, a Little House fan (like much of my household), writes of coming across a transcript of a speech Wilder gave in 1937, when she’d become famous for the book series. For reasons apparent, that series did not include this well-told tale:
There were Kate Bender and two men, her brothers, in the family and their tavern was the only place for travelers to stop on the road south from Independence. People disappeared on that road. Leaving Independence and going south they were never heard of again. It was thought they were killed by Indians but no bodies were ever found.
Then it was noticed that the Benders’ garden was always freshly plowed but never planted. People wondered. And then a man came from the east looking for his brother, who was missing.
… In the cellar underneath was the body of a man whose head had been crushed by the hammer. It appeared that he had been seated at the table back to the curtain and had been struck from behind it. A grave was partly dug in the garden with a shovel close by. The posse searched the garden and dug up human bones and bodies. One body was that of a little girl who had been buried alive with her murdered parents. The garden was truly a grave-yard kept plowed so it would show no signs. The night of the day the bodies were found a neighbor rode up to our house and talked earnestly with Pa. Pa took his rifle down from its place over the door and said to Ma, “The vigilantes are called out.” Then he saddled a horse and rode away with the neighbor. It was late the next day when he came back and he never told us where he had been. For several years there was more or less a hunt for the Benders and reports that they had been seen here or there. At such times Pa always said in a strange tone of finality, “They will never be found.”
But was even that the true goods? Well, no, Maggie KB explains, because while the Benders were serial killers who were themselves brought to sudden justice by vigilantes, Pa wasn’t among the latter: The Wilders left Kansas two years before the Benders were exposed.
So why tell people that you left a story out of your memoir, when that story is not true?
Koerth-Baker has the answer — a speculative one, which is the only kind possible — in the remainder of Little House on the Prairie, serial killers, and the nature of memoir. Don’t miss it, or the rest of her splendid stuff.
IMAGE: The excavated grave of one of the Bender’s victims. From the Kansas Memory site.
Neurotransmitters are highly important and mind-bendingly complex. That’s why now and then I hip-check writers who boil neurotransmitters down to simple stories. Neurotransmitters are multi-purpose messengers.
They’re versatile enough, in fact, to get hijacked by parasitic worms that use them to enslave perfectly innocent shrimplike creatures called gammarids. As Carl Zimmer relates in “Parasites Use Sophisticated Biochemistry to Take Over Their Hosts,” posted last week at the Times:
Other parasites manipulate their hosts by altering the neurotransmitters in their brains. This kind of psychopharmacology is how thorny-headed worms send their hosts to their doom.
Their host is a shrimplike crustacean called a gammarid. Gammarids, which live in ponds, typically respond to disturbances by diving down into the mud. An infected gammarid, by contrast, races up to the surface of the pond. It then scoots across the water until it finds a stem, a rock or some other object it can cling to.
The gammarid’s odd swimming behavior allows the parasite to take the next step in its life cycle. Unlike baculoviruses, which go from caterpillar to caterpillar, thorny-headed worms need to live in two species: a gammarid and then a bird. Hiding in the pond mud keeps a gammarid safe from predators. By forcing it to swim to the surface, the thorny-headed worm makes it an easy target.
Simone Helluy of Wellesley College studies this suicidal reversal. Her research indicates that the parasites manipulate the gammarid’s brain through its immune system.
The invader provokes a strong response from the gammarid’s immune cells, which unleash chemicals to kill the parasite. But the parasite fends off these attacks, and the host’s immune system instead produces an inflammation that infiltrates its own brain. There, it disrupts the brain’s chemistry — in particular, causing it to produce copious amounts of the neurotransmitter serotonin.
Serotonin influences how neurons transmit signals. Dr. Helluy proposes that the rush of serotonin triggered by the thorny-headed worms corrupts the signals traveling from the eyes to the brain. Normally, an escape reflex causes the gammarid to be attracted to the darkness at the bottom of its pond. Thorny-headed worms may cause their host to perceive sunlight as darkness, and thus swim up instead of down.
Whether humans are susceptible to this sort of zombie invasion is less clear. It is challenging enough to figure out how parasites manipulate invertebrates, which have a few hundred thousand neurons in their nervous systems. Vertebrates, including humans, have millions or billions of neurons, and so scientists have made fewer advances in studying their zombification.
That’s not even the weirdest thing in the story. Get the goods at Parasites Use Sophisticated Biochemistry to Take Over Their Hosts – NYTimes.com.
Photo by EraPhernalia Vintag, used with permission Some rights reserved.
As the fallout over DSM-V continues, UK psychologist Graham Davey probes the question:
Once scientists establish a paradigm in a particular area this has the effect of (1) framing the questions to be asked, (2) defining the procedures to answer them, and (3) mainstreams the models, theories and constructs within which new facts should be assimilated. I suspect that once a paradigm is established, even those agencies and instruments that provide the infrastructure for research contribute to entrenching the status quo. Funding bodies and journals are good examples. Both tend to map on to very clearly defined areas of research, and at times when more papers are being submitted to scientific journals than ever before, demand management tends to lead to journal scope shrinkage in such a way that traditional research topics are highlighted more and more, and new knowledge from other disciplinary approaches is less likely to fertilize research in a particular area.
This led me to thinking about my own research area, which is clinical psychology and psychopathology. Can we clinical psychology researchers convince ourselves that we are doing anything other than trying to clear up the status quo in a paradigmatic approach that hasn’t been seriously questioned for over half a century – and in which we might want to question its genuine achievements?
It’s a smart, lengthy prod. Clinical psychology is arguably stalled, and cognitive psychology and neuroscience have shown us some fascinating things about how we think, and even feel, but little, aside from the gains from cognitive behavioral therapy, that advances either diagnosis or treatment. (There are some exceptions; I’m among those who think the work of Helen Mayberg, which is based primarily on hard-won imaging findings, may produce actionable advances that could help a lot of people. But those are still potential, and not yet in hand.)
In any case, Davey’s nicely measured essay is the sort of thing clinical psychology and psychiatry need more of right now. If the DSM-5 controversy pushes this sort of inquiry, perhaps the book will serve a purpose yet.
Get more at Graham Davey’s Blog: Mental health research: Are you contributing to paradigm stagnation or paradigm shift?.
Over at Slate I have a story, “The New Temper Tantrum Disorder,” about the “Disruptive Mood Dysregulation Disorder” I wrote about more briefly here a couple weeks ago, when DMDD was still a proposed diagnosis. Last week the DMDD diagnosis was approved for inclusion in the American Psychiatric Association’s forthcoming Diagnostic and Statistical Manual, Fifth Edition — and some in the field are upset:
[T]he alterations the APA announced for DSM-5 this week sparked unusually ferocious attacks from critics, many of them highly prominent psychiatrists. They say the manual fails to check a clear trend toward over-diagnosis and over-medication — and that a few new or expanded diagnoses defy both common sense and empirical evidence. This medicine is not going down well.
Nothing burns the critics worse than “Disruptive Mood Dysregulation Disorder,” a new diagnosis for kids 6 to 18 years old who three or more times a week have “temper outbursts that are grossly out of proportion in intensity or duration to the situation.” It actually started out as “temper dysregulation disorder with dysphoria” (tantrums, plus you feel bad) but got changed so as not to openly malign tantrums. But the diagnosis still focuses on tantrums, and critics say it is so broad and baggy that it’s ridiculous — and dangerous. Duke University psychiatrist Allen Frances, who chaired the task force that produced DSM-IV, says the DMDD diagnosis “will turn temper tantrums into a mental disorder.” In a recent blog post at Huffington Post, Frances put DMDD at the top of his list of DSM-5 diagnoses we should “just ignore,” because “a new diagnosis can be more dangerous than a new drug.” Clinical social worker and pharmacist Joe Wegmann called DMDD a diagnosis based on “no credible research” that would help drive a “zealous binge” of over-diagnosis.
Is the outcry legitimate? Or are Frances and Wegmann just having themselves their own conniption fit?
As the story reveals, the outcry is shrill, but likely not out of proportion, and speaks to far deeper problems in psychiatry. Do saunter over and get the whole thing at Slate. The comments thread is growing richer by the second.
Autism, famously fuzzy, seems to defy most attempts at definition, treatment, understanding. It’s often easier to spot the ideas and writing about it that don’t make sense than to find and fully embrace those that do. That’s what makes writers such as Emily Willingham and Steve Silberman and Amy Harmon so invaluable: They show us the possibilities within the confounds; that the fuzziness is richly textured.
So too does a new story by Gareth Cook*, a Pulitzer-winning journalist whose article “The Autism Advantage” appeared today in the early-online version of this coming Sunday’s New York Times Magazine. This wonderfully smart, richly reported, finely turned piece explores with unusual skill what may be autism’s central paradox — the difficulty of discerning a person who meets the world through such different perceptual, social, and communicative prisms. Take, for instance, this story from the principal of a school that has embraced the task of teaching children with autism:
The Specialisterne school uses Legos, too. Frank Paulsen, a red-haired man with a thin beard who is the school’s principal, told me about a session he once led in which he handed out small Lego boxes to a group of young men and asked them to build something that showed their lives. When the bricks had been snapped together, Paulsen asked each boy to say a few words. One boy didn’t want to talk, saying his construction was “nothing.” When Paulsen gathered his belongings to leave, however, the boy, his teacher by his side, seemed to want to stay. Paulsen tried to draw him out but failed. So Paulsen excused himself and stood up.
The boy grabbed Paulsen’s arm. “Actually,” he said, “I think I built my own life.”
Paulsen eased back into his seat.
“This is me,” the boy said, pointing to a skeleton penned in by a square structure with high walls. A gray chain hung from the back wall, and a drooping black net formed the roof. To the side, outside the wall, two figures — a man with a red baseball cap and a woman raising a clear goblet to her lips — stood by a translucent blue sphere filled with little gold coins. That, the boy continued, represented “normal life.” In front of the skeleton were low walls between a pair of tan pillars, and a woman with a brown ponytail looked in, brandishing a yellow hairbrush. “That is my mom, and she is the only one who is allowed in the walls.”
The boy’s teacher was listening, astonished: In the years she’d known him, she told Paulsen later, she had never heard him discuss his inner life. Paulsen talked to the boy, now animated, for a quarter of an hour about the walls, and Paulsen suggested that perhaps the barriers could be removed. “I can’t take down the walls,” the boy concluded, “because there is so much danger outside of them.”
Get the whole thing at The Autism Advantage.
*Disclosure: I first came to know Gareth’s work when he was editing the Ideas column at the Boston Globe. Later he took over editing of Mind Matters, a Scientific American online department I founded. And yet later I signed a book contract with an editor who happens to be married to Mr. Cook. I feel confident I’d be highly impressed with this autism feature anyway. The Lego story alone: It’s not every day one comes across something that beautiful.
Should soldiers who survive suicide attempts be court-martialed — tossed from the military in shame? It’s a sticky question that gets stickier on examination. USA Today looks at it through the prism of a case in which a Marine private was court-martialed and convicted of “self-injury” after he slit his wrists in a barracks in Okinawa in 2010:
He was convicted under the Uniform Code of Military Justice’s Article 134, known as the General Article, because the judge found his self-injury was prejudicial to good order and discipline and brought discredit upon the service.
At least one judge on the military’s high court agreed with that argument. “You don’t think that the public will think less of the military if people are killing themselves? …There’s literature out there that these things come in waves,” said Judge Margaret Ryan.
Underpinning the case is the question of why the military criminalizes attempted suicide when it does not treat successful suicide as a crime.
“If (Caldwell) had succeeded, like 3,000 service members have in the past decade, he would have been treated like his service was honorable, his family would have received a letter of condolence from the president and his death would have been considered in the line of duty. Because he failed, he was prosecuted,” noted Navy Lt. Michael Hanzel, the military lawyer representing Caldwell.
Suicides among active-duty troops have soared in recent years, from fewer than 200 in 2005 to 309 in 2009, and a spike this year has put 2012 on track to set a new record high.
The court seems to be weighing, from a particularly military point of view, a two-stage question that is actually quite slippery: Within the military’s serve-the-group culture, will punishing suicide attempts actually reduce suicides by stigmatizing the act of attempting it? Or will stigmatizing the act actually raise the rate because it will also stigmatize the impulse or thought — or depression generally — and thus prevent people from seeking help? In general, stigmatizing any given behavior does tend to reduce it, and one might expect that to be all the more true in rules-based subcultures like the military. This is probably part of why those in the military are generally more law-abiding across the board.
But is it possible, even within the rules-based military culture, to stigmatize suicide without stigmatizing depression and thus discouraging treatment? My guess is No. But if the USA Today story is paraphrasing the trial fairly, the court is struggling with just these questions. I suspect the military may find no way to be truly consistent here, or to come up with any rulings or policies that hold no difficult contradictions or confounds. As it is, the military’s stance is already all a-hoo: Soldiers who succeed in committing suicide are laid to rest with full military honors, while those who fail (but succeed, as it were, in living) are subject to court-martial.
A reminder here to keep comments respectful, please.
via Military court wrestles with punishing suicide attempts.
Years ago, I tried crossing a downhill street plated with glare ice (friction is one of our few weapons against gravity) and could no more walk across that street than I could fly. And for the first time, I understood what gravity was capable of. It doesn’t negotiate, it can’t be avoided, it runs this place like an absolute dictatorship.
Anne Finkbeiner, channeling Haldane*, on Falling.
*“You can drop a mouse down a thousand-yard mine shaft; and on arriving at the bottom, it gets a slight shock and walks away. A rat is killed, a man is broken, a horse splashes.” From On Being the Right Size, by JBS Haldane.