How does culture shape the expression of mental illness or anomalous mental states? I’ve explored that question several times at Neuron Culture, sometimes provoking sharp objections to the idea that culture has any effect at all on the expression of psychosis — some people are just crazy, the response goes, and culture has little or nothing to do with it. Some of that response was due to poor argument on my side; I tried to make it more substantive in Batman Returns: How Culture Shapes Muddle Into Madness.
In Hearing Voices in Accra and Chennai, a recent talk broadcast from the Culture, Mind, Brain conference, Stanford’s Tanya Luhrmann looks at roughly this same issue through another prism: that of the voices people with schizophrenia sometimes hear. It offers yet another example of how both the experience of what we call schizophrenia and other people’s responses to it vary by culture.
A couple weeks ago I was delighted to find a more comprehensive piece by Luhrmann at Wilson Quarterly, Beyond the Brain, in which she looks at how mainstream psychiatry’s biological model of schizophrenia is starting to recognize this heavy influence of culture, with increasing recognition of schizophrenia as a “biocultural” phenomenon. Luhrmann’s opening account of one particular patient, Susan, shows how different responses from culture can shape the course of schizophrenia, even within the U.S.:
Susan was a success story. She was a student at the local community college. She had her own apartment, and she kept it in reasonable shape. She did not drink, at least not much, and she did not use drugs, if you did not count marijuana. She was a big, imposing black woman who defended herself aggressively on the street, but she had not been jailed for years. All this was striking because Susan clearly met criteria for a diagnosis of schizophrenia, the most severe and debilitating of psychiatric disorders. She thought that people listened to her through the heating pipes in her apartment. She heard them muttering mean remarks. Sometimes she thought she was part of a government experiment that was beaming rays on black people, a kind of technological Tuskegee. She felt those rays pressing down so hard on her head that it hurt. Yet she had not been hospitalized since she got her own apartment, even though she took no medication and saw no psychiatrists. That apartment was the most effective antipsychotic she had ever taken.
The idea that culture shapes the experience and expression of anomalous mental states is also explored in Ethan Watters’ Crazy Like Us: The Globalization of the American Psyche, and with particular directness in the blogosphere lately by N, a blogger who at the fascinating Ruminations on Madness often examines the sometimes ill fit of our culture with his/her schizophrenia. N responded to the Batman shootings with the extraordinary piece maeror meror, which I wrote about in Batman Returns: How Culture Shapes Muddle Into Madness. S/he now responds to Luhrmann’s “Beyond the Brain” essay with Return of the Social: Rewriting the recent history of schizophrenia — a post that adds value by taking exception to some of Luhrmann’s argument without dismissing it. Among the many striking things there:
[R]esearch in public attitudes in fact shows a steep increase in the affirmation of biomedical causal beliefs regarding schizophrenia over the past two decades. (Climbing not only in the 90s, but every year since.) Georg Schomerus and colleagues’ (2012) important recent meta-analysis helpfully aggregates this data. Not only have biomedical beliefs and aetiological attributions increased dramatically over the last two decades; social acceptance and inclusion, in parallel, has declined even further.
I find this a bracing pair of reads. The current jabbering about the relationship between madness and violence might improve if more people were familiar with the ideas raised in these essays. Both essays might strike newcomers to the discussion as dense at points, and the differences between the two authors as obscure or academic, since Luhrmann and N disagree on where we are in the pendulum swing between environmental and biological causes of madness. But it seems safe to say they would agree that — as the recent Culture, Mind, Brain conference suggests — there’s more attention now to how environment (including culture) and biology work together, and less to either-or explanations.
Edits: Jan 15, 2013: Corrected a few typos, changed a muddy phrase or two, and removed a couple redundancies.
Last week, in Embrace Your Dangerous Genome, I argued (following Virginia Hughes’ Slate article) that we should press for more openness and less worry about people receiving their genomic information. In my focus on making that point I failed to note that I was speaking mainly about people who want to unearth their genomic information via so-called personal genomics outfits like 23andme — and not about people who find themselves confronting bad genetics news in medical situations in which genetic testing became part of diagnostic efforts. I should have made that distinction, for it’s an important one: There’s a difference between facing bad news when you’ve asked for it and facing it because you got sick — especially if the news is especially grave.
The post stirred some objections from people who are painfully familiar with that latter scenario, such as genetic counselors in medical settings, who sometimes find they must tell someone that they have, for instance, the gene variant that dooms one to Huntington’s disease — among the grimmest of findings.
One such counselor is Laura Hercher, of the Joan H. Marks Human Genetics Program at Sarah Lawrence College, who kindly agreed to make the case here for a sort of selective paternalism in medical genetic counseling. My sincere thanks to her for providing this perspective.
The Case for Paternalism in Genetic Testing
by Laura Hercher
“So,” the journalist interviewing me asked, “you are for paternalism, then?”
Pause. I feel the linguistic walls closing in on me. Are you for paternalism? Are you against life? When did you stop beating your wife?
The case against paternalism in genetics is a cause célèbre among many scientists and science writers. The argument generally paints a picture of the medical professional as some sort of hybrid nanny-thug, protecting consenting adults from viewing their own genomes as though they were small children begging to play with knives, and at the same time fighting a bare-knuckle brawl with DTC companies to defend their turf as the only legitimate explicators of medical significance (because fortunes ride on the right to explain complex inheritance patterns and probability to the worried well. Please.).
The genome is not such a scary place and we have a right to our own genetic information. This case is made by many, including Virginia Hughes at Slate, Razib Khan at Discover, and Daniel MacArthur at Wired. These arguments are smart, well-written, ethically unassailable to a point – but at the same time contextualized to a very distinct set of circumstances, which can be frustrating to genetic counselors who work in settings where the scientifically literate, information-seeking consumer with time on his hands and an interest in genomics is not – safe to say – the average patient.
All of these critiques are set in the world of 23andMe, which is a perfectly good world, but one where in my imagination the sky is orange and the sun is blue. It’s not a real place. For one thing, in 23andMe-world there is no necessity of making a profit (They don’t. Do the math). I think it is wonderful that they sprung for scans for Parkinson’s patients and full sequencing for 50 people with PD-associated LRRK2 mutation. It’s lovely. It’s just not a business model. It’s not a medical model.
And the customer base has a similarly unreal quality for those of us in traditional patient-care settings. Who are these people, who read in-depth explanations of residual risk, and fill out surveys detailing their health and family history? As a genetic counselor, my experience is with people who come for assessment or testing because they were concerned about something specific – or more likely, because some doctor told them they should – and not out of intellectual curiosity. On a good day, our patients know their family history. And most of them are simply hoping that if you find something wrong they can just take a pill and don’t need to see another doctor, because they had a hell of a time getting off work for this appointment and enough is enough. So, if it’s no emergency – yeah, we’re done.
Which brings me back to this issue of paternalism. I agree that it makes no sense to put up obstacles for inquisitive and motivated individuals who wish to query their genome for information, however laced with uncertainty or peril. But forgive us if our first thoughts are often about how to help (yes, and to protect) the patients we see, in the medical setting. Science literacy is rare. The desire to use web-based tools to analyze their own DNA sequence is vanishingly rare. And a sentence like “Your risk of type II diabetes is decreased by the allele that you carry, in a gene that accounts for an estimated 1.5% of the heritability of the disease” is regularly interpreted as “You will not get type II diabetes.” So we worry about the effect that getting this information may have on the people who live where the sky is blue and the sun is yellow. Sue us.
(And by the way, they will sue us too – another difference between the real world and genetics as a social media project.)
Regulation of genetic testing, beyond standards for laboratory competence, is an idea on the wrong side of history. I get that. How can you tell people they can’t access information that is carried in their own cells? But I come to this conclusion with some reluctance and concern. I know that for most people it will work out fine. I know that the REVEAL study that looked at the impact of giving out sometimes devastating information about genetic susceptibility to Alzheimer’s disease and other early work suggest that people can handle this information. Of course, early work is done, by necessity, on the early adopters who are first to seek out genetic testing … but not to quibble.
Still, I cannot participate in the full-throated enthusiasm. David Dobbs, making the case that we should set our worries about news being “more than many people can handle” aside, wrote here, “A very few people, meanwhile, may learn they carry the gene that makes it absolutely certain that, should you live to your 30s or 40s (possibly sooner), you’ll develop Huntington’s disease, which is highly unpleasant and invariably fatal.” Highly unpleasant? Huntington’s disease is a degenerative condition that shows up as a rule between 30 and 45, a slow descent that begins with a few tics and grimaces and progresses to the loss of self on every level – cognitive, behavioral, physical. Eventually, patients are overwhelmed by involuntary movement – paralyzed by movement rather than stillness – until finally the movements wane, a rigor sets in, and a mute, rigid wait for the end begins. The whole span of the thing might be 15 years. Yes, unpleasant.
I worked on a project years ago that involved interviewing people at risk for HD, many of whom knew their genetic status. People who have the gene for Alzheimer’s disease are lucky, said one, because they can just put some pills away with a note that says ‘if you can’t remember what these are for, take the whole bottle.’ When you give out results to HD testing, you don’t usually have to tell them too much about the disease. After all, this is a family condition – they generally know. They know how it feels to be embarrassed by a twitching parent. They know what the end stage looks like. And so imagine sitting across from that person – 20 or 25 years old, or maybe 35 and hoping for the all-clear to have a family –or 40 and feeling like maybe they’ve gotten lucky – and having to tell them, “I’m so sorry…” It sticks with you.
So, two things about Huntington’s disease. One, thankfully, is that while it has often been proposed as a model for the risks of pre-symptomatic testing, it turns out that very few things are like HD. What makes it different from almost everything else is that the test is so definitive. If you have the gene, and you don’t get hit by a bus, you will get the disease. This lack of hope is the thing that makes it all so intolerable – and also what brings me to my second point: there has been some good news lately about potential treatments. For a neurodegenerative condition like HD, treatment is most effective before signs of disease appear – which will entirely change the calculus on testing.
So, yes – more information, not less, is the way of the future, for so many reasons. But I will throw in a plea for understanding that sometimes the opposition is not merely protecting an information fiefdom, but responding to their own previous experience. Sometimes, I get a little protective. I guess that’s paternalism. I plead guilty – guilty, with an explanation.
Laura Hercher is a genetic counselor, and a faculty member at the Joan H. Marks Human Genetics Program at Sarah Lawrence College, focusing on the social, legal and ethical impact of genetic testing and technology. She recently completed three years as chair of the National Society of Genetic Counselors Ethics Advisory Group. Her works include original research, commentary, and the occasional foray into journalism; she blogs at TheDNAExchange.com, and her first novel, Anybody’s Miracle, is due out in May. She’s on Twitter at @laurahercher.
When I first heard that the medical examiner for the Sandy Hook shooting had asked for a genetic profile of the apparent shooter, Adam Lanza, I suggested on Twitter that this would end badly: The results will almost certainly be close to meaningless, but any variants formerly associated with particular behaviors, rightly or wrongly, will get blown horrifically out of proportion. (Prepare yourself for “The Shooter Gene.”) This has the potential to set back popular understanding of both genetics and psychopathy for years. A fine editorial today in Nature sees it likewise, saying the effort is “misguided and could lead to dangerous stigmatization.”
Over at Mind Hacks, writer and psychiatrist Vaughan Bell digs in a bit more, tracing other efforts to find answers in the genes of famous killers, all of those efforts fruitless. This is a bad way to try to understand a particular killing and a bad way to think about genetics. As Bell puts it,
There is a valuable science of understanding how genetics influences violent behaviour but analysis of individual killers will tell us very little about their motivations.
It does, however, reflect a desire to find something different in people who commit appalling crimes. Something that is comprehensible but distinct, alien but identifiable.
This may give us comfort, but it does little to provide answers. In the midst of tragedy, however, the two can easily be confused.
No easy answer: Editorial at Nature
The search for a genetic killer « Mind Hacks, by Vaughan Bell
Image: Spirea in Sandy Hook, Connecticut. by Dougtone. Some rights reserved
Virginia Hughes is “sick of reading about the dangers of the genome.” So she complains over at Slate, eloquently, and I’m sick right with her.
Hughes, who blogs at National Geographic and is among our sharper followers of genetics, doesn’t mean “dangers” as in hazardous habits of actual genomes: She means the overhyped danger of “The DNA Dilemma,” as a recent Time cover called it, which is essentially whether regular everyday people can deal with genetic test results:
The primary question [the Time story] raises—how much information is too much information?—has been dominating commentaries about genetic testing in the medical literature.
But this is the wrong question, or at least one that’s becoming increasingly irrelevant. The personal genomics horse has bolted, and yet many paternalistic members of the medical community are still trying to shut the barn door. In doing so, they’re fostering a culture of DNA fear when what we really need is a realistic and nuanced genetics education.
Five years ago, pre-23andme, one might reasonably wonder whether people would freak out over genetic results. Your genome can bear some pretty nasty news. My own 23andme tests, for instance, revealed a gene variant that doubles my statistical risk of getting Alzheimer’s, hiking it to over 14 percent. One person close to me, meanwhile, found she has a gene variant wildly raising her risk of breast and ovarian cancer. A very few people, meanwhile, may learn they carry the gene that makes it absolutely certain that, should you live to your 30s or 40s (possibly sooner), you’ll develop Huntington’s disease, which is highly unpleasant and invariably fatal.
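The arithmetic behind that “doubled” risk is worth seeing explicitly, because a relative risk can sound far scarier than the absolute risk it implies. Here is a minimal sketch; the ~7.2 percent baseline is an assumption back-figured from the “over 14 percent” in the text, not a figure from 23andme:

```python
# Relative vs. absolute risk: a "doubled" risk only means as much
# as the baseline it doubles.

def absolute_risk(baseline: float, relative_risk: float) -> float:
    """Absolute risk given a baseline rate and a relative-risk multiplier."""
    return baseline * relative_risk

# Assumed baseline lifetime Alzheimer's risk consistent with the text.
baseline_alzheimers = 0.072

doubled = absolute_risk(baseline_alzheimers, 2.0)
print(f"Doubled risk: {doubled:.1%}")
print(f"Chance of never developing it: {1 - doubled:.1%}")
```

The point of framing it this way: even after the doubling, the odds of never developing the disease remain above 85 percent, which is why a headline relative risk needs its baseline attached to be meaningful.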
No wonder some people worried, some 5 years ago, that such news might be more than many people can handle. Today, though, it’s clear we should set those worries aside.
To start with, most genetic testing companies offer the option to simply not learn your results on specific risk genes, such as Alzheimer’s or Huntington’s. Few choose that option; most want to know, even if there’s no cure and the news isn’t actionable. Meanwhile, much information you get is actionable. My friend who got the bad news about her breast-cancer gene, for instance, chose to have a double mastectomy and have her ovaries removed. The operation beat her up horribly, but she did it without hesitation and was immediately glad she did. The operation erased an 80% chance of getting a very nasty cancer. She’s plenty smart enough to know she was better off knowing.
Almost everyone tested agrees. As Hughes points out, a recent study of 2000 people in the New England Journal of Medicine found that, lo and behold, even though all of them got tested and received pretty complicated news, “Nobody freaked out.” We humans apparently learned a long time ago to deal with bad news and uncertainty. Hughes rightly argues we should set aside these worries and spend time thinking instead about using genetic testing to educate people about genetics.
I agree — and in fact I’d add that one’s own genetic-test results offer a particularly memorable demonstration of something that too often gets overlooked in reporting about genetics: with very few exceptions, genes work in complicated and decidedly non-deterministic fashion.
I have ranted here before about how readily the media embraces simplistic pictures of how single genes generate even enormously complex traits and behaviors, so that a gene variant that affects a neurotransmitter or hormone that operates in scores of systems in brain and body gets boiled down to a slut gene or a feminist molecule. A personal genetic test shows the idiocy of such reductionism. It shows you that even genes affecting relatively simple physical traits, such as height or hair color, are highly probabilistic and subject to interactions with not just other genes but diet, exercise, family history, and God knows what else. I felt much less alarmed about my supposed 15% Alzheimer’s risk, for instance, when I saw that I also carry genes making it more likely I’d be short and bald. To read such news standing at over six feet and with a full head of hair delivers a particularly powerful lesson about the ambiguities of genetics.
Cited: See Virginia Hughes, Ethics of genetic information: Whole genome sequencing is here, and we need logistics for sharing results. – Slate Magazine
Adam Green’s profile of pickpocket Apollo Robbins in The New Yorker has rightly generated much buzz this week: Robbins is a master, and Green does his rich story justice.
The New Yorker produced an accompanying video of Robbins showing Green how he can relieve him of his watch, his wallet, and even his cellphone (from his front pocket) without being detected. It’s a fun clip that gives some hint of how Robbins manipulates the spotlight of attention in a way that allows him to “work in the dark.”
For sheer entertainment and wonder value, however, I actually prefer this older clip of Robbins fleecing New York Times science writer George Johnson in a less rehearsed and controlled situation. This occurred at a special MindScience conference on the neuroscience of magic. Johnson was by chance called to the front of the room to serve as a mark. And Robbins simply cleans him out. You watch — and watch again. Even the second (or third) time through, you’ll miss most or all of the moves Robbins uses to empty Johnson’s pockets.
For more on the neuroscience of magic, see the lab page of Susana Martinez-Conde, a visual attention scientist who, with her husband and collaborator Stephen Macknik, organized the conference and, with writer Sandra Blakeslee, wrote the excellent Sleights of Mind.
Too good to be true — yet it’s true. Type a search term into Gizoogle, and the site will gangsta-lingo-ize both the search results and the pages you go to. The fun is endless. I, for instance, become Dizzy Dobbs, lyricist n’ journalist, and author of “My f__in Muthaf___’s Lover.” (Gizoogle doesn’t use blanks.*)
Dizzy Dobbs, lyricist of Reef Madness n’ tha #1 Kindle-Single bestsella My fuckin Muthafuckaz Lover, writes features n’ essays for publications includin tha Atlantic, tha New York Times Magazine, Nationizzle Geographic, Wired, tha Guardian, n’ other publications. Right back up in yo muthafuckin ass. Several of his stories done been chosen fo’ leadin anthologies, da most thugged-out recent bein “Beautiful Domes,” bout adolescents, which ran up in tha Nationizzle Geographic n’ was selected fo’ both Eccoz Da Best American Science Freestylin 2012, edited by Michio Kaku, and Marinerz Da Best American Science n’ Nature Writrin 2012, edited by Don Juan Ariely yo. Dude be also lyricist of tha #1 Kindle Single bestsella My fuckin Muthafuckaz Lover.
Dude is now freestylin his wild lil’ fourth book, Da Orchid n’ tha Dandelion (Houghton Mifflin Harcourt) which expandz on his crazy-ass much-discussed feature fo’ tha Atlantic, “Da Orchid Children.” It will follow both scientists n’ some rather extraordinary ‘regular’ gangstas as they grapple wit emergin ideas bout how tha fuck genes n’ culture shape temperament, behavior, evolution, n’ destiny.
His most recent previous book, Reef Madness (Pantheon, 2005), looks at a long-ass argument dat Charlez Darwin had bout how tha fuck coral reefs form; Oliver Sacks found it “buckwildly written, almost unbearably poignant.” Dude lives up in Vermont, wit frequent trips ta New York, London, DC, n’ other points distant.
We can thank @TomChivers and @alokjha for bringing this to our attention. Do check it out.
*I’m using blanks instead of expletives so that I personally directly am not befouling the ears of young people reading this blog, such as my 11-year-old son, who reads this despite directions not to. Boy better watch his mouf.
Yesterday, the U.S. Congress, like Wile E. Coyote on an especially lucky day, scrambled back onto the fiscal cliff that Congress itself had jumped off of, having built the cliff last year as a thing that no one would be stupid enough to jump off of. So it may seem a futile time to ask that we make policy-making a bit more rational.
We writers are a stubborn lot, though, so I’m publishing below my foreword to The Geek Manifesto, a wonderful book that makes just that appeal. The Geek Manifesto, written by Mark Henderson, formerly a London Times science editor and now communications director at the Wellcome Trust, has been a sensation in Britain, finding its way through a hivemind campaign to every member of Parliament and stirring a much-needed high-profile debate about how to better integrate evidence-based thinking into public life. Henderson was kind enough to ask me to write a foreword for the U.S. edition, which is currently in e-book form and soon to be in print*; the publisher liked it enough to include it as an afterword in the new UK paperback edition, which is out tomorrow.
A Geek Manifesto for America
Foreword to the US edition by David Dobbs
Nothing illuminates like a close analog.
So I found during a recent year in London, as I watched scientists and science-writing colleagues there, including Mark Henderson, the author of this book, wrestle with translated forms of the threats that haunt those of us in America who would like to see our country run according to honest airing of fact and principle rather than lies and fear. In the US, we deal with virulent creationism, medicine-by-advertising, and deeply institutionalized resistance to the reality of climate change; in the UK, the assault on empirical thinking ranges from ridiculous prescriptions for colonic irrigation to the sublimely sad savaging of researchers by those who would have us ignore drought, fire and the melting of the ice caps. In noisy pubs and cafés, on the slightly less noisy sidewalks outside pubs and cafés, and in quieter halls of inquiry such as universities and the Royal Society, Henderson, colleagues and I talked and ranted and laughed about these things and, comparing notes, saw how alike the battles raged in Old England and New America.
Alike – but with instructive differences. The Brits’ climate-change battles may lack the vitriol of ours, but they lost early fights over genetically modified foods that we have won (so far), and the UK faces an even stiffer challenge than we do in calming fears about vaccines and autism. We Yanks must endure legislators trying to ban the use of temperature-change data in laws about climate change (thanks, North Carolina), but Henderson must listen to arguments in Parliament that the government should bloody well do something, my Lord, and right away, please, to counter ‘the awesome power of the moon’, because the full moon, astrologists tell us, makes bad people act worse and also endangers innocents: it makes their blood run thin, so surgeons can’t save them when the moon-maddened bad people cut them up (see Chapter 2). We Yanks hardly hold the market on irrationality.
To these battles Henderson brings deep knowledge, steady determination and good humour. Despite the head-banging idiocy of many of the problems he writes about, he amiably outlines how any democratic English-speaking nation can bring more evidence-based thinking into every realm of public life – government, education, healthcare, the economy, the environment, even politics. He wields lightly his wonk and tells vividly his tales of hope and woe. American parents and teenagers will cheer, for instance, at Henderson’s accounts of how teaching self-control to children early on can boost them all their lives, and how one British high school raised its grades simply by starting classes at 10 a.m., the better to suit adolescent diurnal rhythms. And anyone will be engrossed and then disturbed by his account of how Britain’s Forensic Science Service cracked a gruesome murder case – but faces closure, because forensics, incredibly enough, cost money.
Henderson has made a real mark in England with this book, managing to get copies to every single member of Parliament (including the astrology-crazed) and shaping debate and policy. His book can help here, too. We Americans have won some great victories in pushing policy along empirical lines. We have prevented millions of deaths because our laws heed evidence that seat belts save lives and that secondhand smoke, smog and polluted water take them. We have fended off most efforts to ban perfectly healthy genetically modified foods, allowing us to benefit from better nutrition. And the healthcare reforms in the recently passed Affordable Care Act will save lives, grief and money by designing healthcare policy according to what treatments actually work best.
But we’ve far to go. Too many policies fly in the face of facts or empirical principle. Medicine remains driven far more by business concerns than by data. We still forbid needle exchanges for addicts despite overwhelming evidence that we should do otherwise. We let pharmaceutical companies push new, more expensive drugs, even though they work no better than cheaper drugs we already have. We dally on climate change. We base education policy on what’s convenient or traditional, rather than what’s effective. And we often starve science budgets, even though science drives the economy.
We should do better. The London libraries, universities and noisy pubs in which Henderson, his British mates and I exchanged war stories, lamented denial and laughed about quackery were kin to – and sometimes were – the venues in which the compatriots of John Locke, Francis Bacon and the two Charlies, Darwin and Lyell, forged the principles of empiricism in the eighteenth and nineteenth centuries. The fight for empiricism was then, and is now, more than a fight for science. It is a fight for a society based on defensible argument and a fair exchange of views – a fight for democracy.
Henderson articulates with bracing clarity how science’s central principle – that evidence should trump authority, and reason trump rumour – can help improve the clumsy, cranking machinery that produces law, policy and other frameworks of public life. This is the real value of science in the public realm: in its elevation of evidence over authority, science is insistently democratic. It steers authority not to those who hold more power, but to those who hold better evidence. Like democracy, science sometimes gets messy – but it’s a productive mess that is ultimately liberating. Like the principle of liberty, the principle of empiricism moves us always towards a better, freer, healthier world.
And a more enjoyable world: Henderson is a warm and funny man with a lively mind. His book goes down like a good British ale: reassuringly familiar – the stuff is quite clearly beer – but different enough from one’s home brew to spark new interest and perspective. You’ll emerge from The Geek Manifesto as I did from talking science with Henderson over pints of Bishops Finger and London Pride: refreshed, stimulated, and embracing with new energy the problem of how to give empirical thinking a bigger place in public life.
I wrote fewer features than usual this year, to give more time to my work on “Orchids and Dandelions,” the book I’m now slaving away at. But I was still able to find ten pieces I wrote either here or for other publications that I feel good about offering in a “best of 2012” spirit. Hope you enjoy, and thanks, immensely, for reading.
Are all years in science and science writing this weird? Just on my own self-assigned beats, I found myself writing and thinking about bad science and bad writing about vaginas; madness and murder; a psychiatric manual that actually accelerates psychiatry’s race into the weeds; and the media’s insistence on simple, clean fables about both neuroscience and genetics at a time when most fields are increasingly recognizing how very little they actually know.
Probably most years are this weird — but the weirdness sure jumps out when you sit down and chat about it with some other people who keep track of these things. So I did when Maggie Koerth-Baker, Maryn McKenna, Brian Switek and I sat down last week to talk about the year in science with Skeptically Speaking’s Desiree Schell. The topics run the gamut from the DSM to dinosaurs, abortion to zoology, pandemics to poison and punditry, as Schell tries and mostly succeeds in herding the cats that be our brains. Got to be something you like. You can listen via the link below; any trouble here, grab a listen at the episode’s page at Skeptically Speaking.