So the Jews did it after all.
OK, scratch out that pluralizing “s.”
Not that it will make any difference to diehard antisemitic conspiracy nuts – “the men who taste Jews in their sandwiches.”
Those types must have squirmed with glee when the Daily Mail reported that Jack the Ripper’s identity had finally been revealed thanks to DNA testing.
The mentally deranged Kosminski was 25 years old, an immigrant (likely from Russia’s Pale of Settlement), a sometime-hairdresser – and a Jew.
I put off writing this article when I first heard that Joan Rivers, one of my comic icons, had been rushed to the hospital after a botched outpatient procedure last week. I didn’t want to think about having to say goodbye to Joan, to bid farewell to yet another icon of an age gone by, a powerhouse who managed to be a cultural force until her last breath. The only solace we can muster is in knowing that, for these ten reasons at least, Joan’s memory will be a blessing.
10. Joan never grew old or gave up.
At 81, she was as attuned to pop culture, politics, and current events as a 20-year-old. A self-made fashionista, the comedian never retired, sat in a chair, or gave in to technology. Joan will forever be a role model to women who refuse to trade style for a shapeless muumuu and an office chair for a rocking chair. In her later years she paired up with Melissa, illustrating that mothers and daughters really can work together and get along. She was a modern Bubbe, surrounded by her children and grandchildren as she took the world by storm.
Genie, you're free. pic.twitter.com/WjA9QuuldD
— The Academy (@TheAcademy) August 12, 2014
1. Matt Walsh at his eponymous blog: “Robin Williams didn’t die from a disease, he died from his choice”
I’m not normally one to write a blog post about a dead celebrity, but then I suppose there is no such thing.
There are only living celebrities, not dead ones. In death, wealth and prestige decay and we are brought into a new reality, the only reality there is or ever was — one which, for much better or much worse, doesn’t care at all about our popularity or our money.
The death of Robin Williams is significant not because he was famous, but because he was human, and not just because he left this world, but particularly because he apparently chose to leave it.
A terrible, monstrous atrocity. It disturbs me in a deep, visceral, indescribable way. Of course it disturbs most people, I would assume. Indeed, we should fear the day when we wake up and decide we aren’t disturbed by it anymore.
We tend to look for the easiest answers. It makes us feel better to say that depression is only a disease and that there is no will and choice in suicide, as if a person who kills himself is as much a victim as someone who succumbs to leukemia.
2. Jim Geraghty at National Review: “Robin Williams and Our Strange Times: Does our society set the stage for depression?”
The constant online presence would lead to a world of nonstop instant reaction, where everyone could immediately transmit the first thought that popped into his head in response to news. Everyone’s first reaction would become his defining reaction, particularly if it’s dumb or knee-jerk. If it was racist, sexist, hateful, or obnoxious, even better. Those horrified would then share and retweet it to their friends and followers, spreading the perception that the world was overpopulated with hateful idiots, and that average Americans — or average human beings! – were rather nasty, ignorant creatures unworthy of respect or affection. Many people would quickly and easily forget that the people who comment on Internet websites represent a small slice of the population, a fraction predisposed to getting pleasure from posting shocking, obnoxious, or hateful material.
The widespread perception that almost everyone else was a moron — why, just look at the things people post and say on the Internet! – would facilitate a certain philosophy of narcissism; we would have people walking around convinced they’re much smarter, and much more sophisticated and enlightened, than everyone else.
3. Bryan Preston at the PJ Tatler responding to Walsh: “Chasing Shadows in the Death of Robin Williams”
Anyone who has seen true mental illness up close knows that the idea of choice gets bent and blurred.
I’ve seen Alzheimer’s Disease up close. It’s not depression, but it is a different disease of the same organ, the brain. Alzheimer’s sufferers do not choose to lurch from the present to three decades into the past in an instant. They don’t choose to forget who you are, what your name is, who they are, where they are, everything they have ever known and everyone they have ever loved. They don’t choose to become hostile to those they love who are caring for them. They are not choosing any of that. Yet what is happening in their brains impacts their behavior and can be incredibly frustrating and crushing for their loved ones. It’s heart-breaking, one of the most heart-breaking things a person can experience.
There is no more choice in that than there is choice to come down with cancers unrelated to behavior. There is no more choice in that than the choice to grow old, see your organs wink out one by one, as you approach the end. Did the boy who was diagnosed with cystic fibrosis, an organ disease which will probably kill him in his 20s, choose that? Depression, like Alzheimer’s, is a disease of an organ, the brain. Where choice begins and ends in the mind of someone with clinical depression is quite blurry. I don’t pretend to know where it is. Depression is the ultimate mind game, only your own brain is working deviously against itself.
Inevitably, Robin Williams’ suicide saw the “raising awareness about mental health issues” camp fighting it out online with the “he was a selfish git” crowd.
When the latter reject the “disease model” of addiction and mental illness — people like Theodore Dalrymple — they do so prompted by a laudable instinct:
They think depressed people or addicts use the “disease” model to avoid taking responsibility for their actions.
This is a bit like the New Atheists’ concept of “God” as “an old man in the sky.” They proudly and loudly reject that concept, seemingly unaware (despite their alleged sophistication and superior education) that so do most actual believers.
Likewise, few addicts who accept the disease model (and not all do) use it as a “get out of jail free” card.
It’s called “How It Works” not “How It Lounges on the Couch Eating Cheetos and Watching Judge Judy.”
“Some of us thought we could find an easier, softer way, but we could not…”
Making amends, taking inventory, doing service and even prayer and meditation are exercises in responsibility and action.
Robin Williams apparently did all those things and stayed clean and sober for 20 years.
Then he “went out” in 2006 and was never the same.
Or, as Catholics like to say when they can’t explain something: “It’s a mystery…”
(If you say it in a somber enough voice, and include the “…”, it sounds satisfyingly deep.)
No one could have predicted that Oscar-winning comedian Robin Williams would kill himself.
Or could they?
When someone commits suicide, the reaction is often the same. It’s disbelief, mixed with a recognition that the signs were all there. Depression. Maybe talk of ending one’s life.
Now, by studying people who think about committing suicide, as well as the brains of people who actually did, two groups of genome researchers in the U.S. and Europe are claiming they can use DNA tests to actually predict who will attempt suicide.
While claims for a suicide test remain preliminary, and controversial, a “suicide gene” is not as fanciful as it sounds.
The problem is that suicide samples are small, and I often wonder how much gender plays a role in the lack of studies and data on suicide:
“We seem to be able to predict suicidal behavior and attempts, based on seeing these epigenetic changes in the blood,” says Kaminsky. “The caveat is that we have small sample sizes.”
Kaminsky says that following the report, his e-mail inbox was immediately flooded by people wanting the test. “They wanted to know, if my dad died from suicide, is my son at risk?” he says….
The bigger problem, says Dracheva, is that there are simply not enough brains of suicide victims to study. Unlike studies of diabetes or schizophrenia, where scientists can call on thousands or tens of thousands of patients, suicide studies remain small, and their findings much more tentative.
It’s because they don’t have DNA from enough people who committed suicide that researchers, including those at Hopkins and Max Planck, have had to try connecting the dots between DNA and whether or not people have suicidal thoughts. Yet there’s no straight line between the contemplation of suicide and actually doing it.
Of the more than 38,000 suicides in this country, over 30,000 are by men, yet the suicide studies remain small? Why?
10. Written on the Wind (1956)
Douglas Sirk’s soapy melodramas had an element of tongue-in-cheek camp that later came to be appreciated as sly subversion, and in this one Bacall played along beautifully as a canny Manhattan career woman in the advertising business who marries the scion (Robert Stack) of a wild oil clan while secretly making time for the poor outsider (Rock Hudson) who has worked his way up in the family business.
10. Deconstructing Harry (1997)
Williams’ only Woody Allen film is essentially a series of sketches in which Allen works out his demons. Williams is in the film for only a few minutes but he makes them count in a brilliant bit part as Mel, a film actor whose life is such a blur that he has literally gone out of focus.
The world mourns the passing of one of the truest talents of all time – Robin Williams. The Juilliard-trained comedian and actor won an Oscar, two Emmys, five Grammys, and — dearest to me — became a Disney Legend in 2009. Williams made his struggles with depression and addiction public, yet he was unable to overcome them. But here at PJ Lifestyle, we’re going to celebrate his life. Here are Robin Williams’ ten best performances. I hope you’ll take as much comfort in these wonderful moments as I have.
10. The Crazy Ones (2013-2014)
One of the most underrated television series of the past season paired Williams with Sarah Michelle Gellar as father-and-daughter partners in an advertising agency. The Crazy Ones featured a terrific ensemble, sharp writing, and plenty of space for Williams to let loose. Williams had his best moments on the show when he had the chance to blend his trademark humor with sweet sentiment (as in the clip above). He couldn’t have a much better alter ego than the character of Simon Roberts — he and the writers even made recovery from addiction a huge part of the character. The Crazy Ones showed such promise, and it’s such a shame that CBS didn’t see fit to give it a second chance.
I know: with the Obama presidency unraveling in a disaster for America and the world, it seems absurd to waste a blog post on the death of actor James Garner. But bear with me. This is a blog on the culture. It was the culture, dominated by leftists, that helped make this catastrophic presidency possible. Garner’s death underscores part of what went wrong.
The star of the ’50s TV western Maverick and the ’70s private eye show The Rockford Files died at 86 over the weekend. He was a wonderfully charming and entertaining actor who made some fine movies (The Great Escape, The Americanization of Emily) but was only truly a star on the small screen. In this, he resembled two other favorites of mine: David Janssen, who starred in The Fugitive and Harry O, and Darren McGavin, who starred in Mike Hammer, The Outsider, and The Night Stalker.
I’m not sure — no one’s really sure — what made an actor more suitable for the small screen than for the movies back in the day, or why some could move comfortably between one and the other. Garner, Janssen, and McGavin all had a limited range and a set number of out-sized mannerisms. But that was true of John Wayne and Clint Eastwood too, two of the biggest movie stars of all time. Maybe something about Garner and the others was just more recognizable and knowable and human than what we saw in movie stars when there actually were movie stars. Wayne, Eastwood — even more actorly stars like Brando and Pacino — all had something huge and iconic about them. No matter how well they played their parts, they were always more personae than persons. You could imagine hanging out with Garner. You could only dream about being John Wayne.
The distinction between what the law permits and what the law enjoins is often blurred. An absence of proscription is sometimes mistaken for prescription. The more the law interferes in our lives, the more it becomes the arbiter of our morality. When someone behaves badly, therefore, he is nowadays likely to defend himself by saying that there is no law against what he has done, as if that were a sufficient justification.
The recent Supreme Court decision in the cases of Burwell v. Hobby Lobby Stores and Conestoga Wood Specialties Corp. v. Burwell illustrates the difficulties when two or more rights clash irreconcilably. The complex issues involved were the subject of an article in a recent edition of the New England Journal of Medicine. The matter is still far from settled. It seems to me likely that the Supreme Court will one day reverse itself when its philosophical (or ideological) composition has changed.
The two corporations were owned by strongly religious people. Corporations of their size were enjoined by the government to provide their staff with health insurance which would cover contraceptive services. However, some contraceptive methods violated the religious beliefs of the owners of the companies. Did the companies have the right to except these methods from the policies that they offered to their staff (who, incidentally, numbered thousands, many of whom would not be of the same religious belief)?
10. Water Fluoridation
Back in the 1950s water fluoridation was, for some, a big thing. A very big thing. We had the threat of communism. We had the threat of “The Bomb.” And then, according to some, we had the related threat of fluoridation.
It is not our purpose here to cover all the arguments, pro and con, that put this, or any of the other 10 “awful, terrible, frightening things that are going to kill us all,” on our list, but to point out that none of them yet has. There has to be a lesson (or maybe a few) in this.
One is that there are many things on this earth that are dangerous. As movie gangsters used to say, “No one gets outta here alive!”
There are many things that are poisonous to us if we are exposed to them in large quantities. But, interestingly, some of those very same things prove beneficial when we are exposed to them in smaller amounts.
For example, swallow a whole bottle of aspirin and it will make you sick, or may even kill you. But taking a prescribed dose of two tablets after, say, a protracted discussion about the dangers of water fluoridation and the secret cabal supposedly behind it can do wonders.
(Okay, so where did I put that aspirin bottle?)
If the future were knowable, would we want to know it? When I was young, a fortune teller who predicted several things in my life that subsequently came true predicted my age at death. At the time it seemed an eternity away, so I thought no more of it, but now it is not so very long away at all. If I were more disposed to believe the fortune teller’s prediction than I am, would I use my remaining years more productively or would I be paralyzed with fear?
In a recent edition of the New England Journal of Medicine a question was posed about a 45-year-old man in perfect health (insofar as health can ever be described as perfect) who asked for genetic testing about his susceptibility to cancer, given a fairly strong family history of it. Should he have his genome sequenced?
A geneticist answered that he should not: to have his entire genome sequenced would lead to a great deal of irrelevant and possibly misleading information. But if the family history were of cancers that themselves were of the partially inherited type – more factors than genetics are involved in the development of most cancers – then the man might well consider having the relevant part of his genome, namely that part with a known predisposing connection to the cancers from which his family had suffered, sequenced.
This is not a complete answer, however. Two obvious questions arise: is the additional risk clinically as well as statistically significant, and if the risk is known, can anything practicable and tolerable be done to reduce it? There is no point in avoiding a risk if doing so makes your life a misery in other respects. You can avoid the risk of a road traffic accident or of being mugged on the street altogether by never leaving your house, but few people would recommend such drastic avoidance.
Finding out the number of accidents and fatalities in our beautiful national parks isn’t easy. The National Park Service doesn’t want to scare away visitors, so it doesn’t offer a handy guide to the number of tourists who fall, drown, are trampled, or are eaten while visiting our wild places. However, there’s enough data in news reports and studies to come up with a top ten list of our most dangerous national parks. Which do you think tops the list? Yellowstone? Yosemite? Denali? Take a look and see if you’re as surprised as I was. Let’s start with number 10.
10.) The Delaware Water Gap National Recreation Area, New Jersey and Pennsylvania
This lovely river valley encompasses 67,000 acres of land on both sides of the Delaware River in New Jersey and Pennsylvania. Varied species of birds, mammals, and fish call this area home, and nearly two million visitors a year come to enjoy boating and other water sports.
That’s what’ll kill ya in the Delaware Water Gap, which begins our Top 10 Most Dangerous National Parks. Failing to wear a lifejacket while on the river is the number one cause of death. Adding alcohol consumption ups the risk. The Delaware River looks tranquil but can have unexpected currents which can overwhelm a swimmer. Keep the life jackets on and enjoy this beautiful (and only occasionally deadly) recreation area.
If a Martian were to land on earth to study humanity, one of the things that would no doubt surprise him about our race is the pleasure it takes in contemplating its own extinction by various catastrophic means: the crash into earth of a giant asteroid, climate change, or the spread of new, virulent and untreatable diseases, especially those caused by viruses that emerged from the African jungle.
Of all the viruses to have emerged of late, Ebola is the most frightening. It comes in several varieties of differing virulence, with (according to a recent article in the New England Journal of Medicine) death rates ranging from a “low” 40 percent to over 70 percent. Among monkeys the death rate can be 100 percent.
Before Ebola there was Marburg, so named because it was first recognized among laboratory workers in Marburg, Germany. This virus is spread from fruit bats to monkeys to humans, and I happened to be in Rhodesia (as it was then still called) when there was an epidemic of the disease there and 33 percent of the patients died. I remember the reaction in the hospital, somewhere between panic and pride that it should be in the eye of a world-publicized storm. The question on everyone’s mind was whether it could spread on a large scale from Africa to Europe and North America. Could the virus escape its ecological niche?
Nearly half a century ago, in 1965, the Rolling Stones wrote a song called Mother’s Little Helper. The words went:
Kids are different today, I hear ev’ry mother say
Mother needs something today to calm her down
And though she’s not really ill, there’s a little yellow pill
She goes running for the shelter of a mother’s little helper
And it helps her on her way, gets her through her busy day…
And if you take more of those
You will get an overdose
No more running for the shelter of a mother’s little helper
They just helped you on your way
Through your busy dying day…
The pill was Valium (diazepam), and the yellow pill was 5 milligrams – as it still is. White is 2 milligrams and blue is 10.
The song was not great poetry, perhaps, but for pop music it was prescient pharmacovigilance, the epidemiological study of the adverse effects of drugs: though strictly speaking overdoses of diazepam are not dangerous. Many thousands of people have taken overdoses of diazepam in attempts to kill themselves with it, but few have succeeded unless they took something else with it.
However, it has long been known that diazepam and other similar drugs cause falls in the elderly, and such falls are often the precursor of death. It has also been suspected that, by some unspecified mechanism, diazepam (and sleeping draughts of all kinds) promote death.
A paper in a recent edition of the British Medical Journal compares the death rates of primary care patients who were prescribed diazepam-like medicines and hypnotics more than once with those of patients who were never prescribed them (patients prescribed them only once were excluded because it was possible that they had never taken them, which was unlikely if they were prescribed them twice). The authors compared the records of 37,000 of the former with 63,000 of the latter. They attempted to match them for such variables as age, social class, sex, and medical and psychiatric history. They followed the patients for an average of 7.6 years.
Noel Sheppard, prominent conservative media critic and one of the founding contributors and editors at Newsbusters, died of cancer March 28. He was 53 years old.
Newsbusters publisher Brent Bozell posted a short, elegant tribute to his friend and colleague:
Our Noel Sheppard passed away yesterday (Friday) morning at about 5:00 AM. Say a prayer for the soul of a man we’ll all miss professionally, and many, many of us will miss personally as well. Noel was not just a force of nature, he was a very good man.
How quickly this all happened. Just two months ago, Noel wrote a piece about suddenly getting cancer at 53 called “Cancer’s Ray of Hope.” Nine days ago, he wrote us and said he was interested in writing about his “progress” — and he put “progress” in quotes. We were all wishing for better news, and really couldn’t imagine this was a battle that would end this way.
Noel joined us and was introduced to us by Matt Sheffield at the founding of NewsBusters in 2005, and he became our Associate Editor. It must be said that no blogger here was more prolific and more popular.
Matt Sheffield, one of Newsbusters’ founders, penned a tribute to Sheppard that describes why he will be missed so dearly:
Noel never intended to become a professional blogger once he began submitting pieces online. But just as America turned out to love reading blogs, Noel took to the new medium like a fish to water. Eventually, it became a full-time gig for him as he sold his financial planning business to pursue blogging full-time for us at NewsBusters as a mid-life career change.
It was a perfect combination. Noel loved attention and NewsBusters readers loved his work, making him by far the blog’s most popular writer. Very frequently, he single-handedly brought in half of the site’s traffic each month.
In an earlier time, Noel would’ve been an ace reporter or well-known editor, such was his talent for spotting the hot story and writing about it in an engaging way. He also had the rare ability to make dry subjects interesting.
Sadly, Noel’s combination of brio, intelligence, and popular touch is all too rare in the conservative world. Noel and I spoke many times about the fact that too many conservatives and libertarians seem more interested in getting read by Republican congressional staffers than by millions of their fellow Americans. My upcoming book on the future of the American Right is inspired by many of these conversations. (For those interested in some of our preliminary thoughts on the topic, see this piece we published together in the American Spectator in 2012.)
Like everything he did, Noel threw himself into his career as a writer, literally blogging at least one post a day on NB before he fell ill with cancer and was admitted to the hospital in January. Weekend readers could always count on Noel to have something new and interesting for them to read.
Sheppard’s nose for news and his ability to distill the essence of a story into a few well-written paragraphs that were enlightening as well as thought-provoking made for a rare combination. He will be missed at Newsbusters, but also around the right side of the internet.
Meir Har-Zion, an iconic Israeli military figure, died at 80 on March 14. He never pursued a political career and you probably haven’t heard of him. Indeed, his military exploits were mostly confined to a three-year period in the 1950s. Yet his fame in Israel never wore off, and a 2005 poll ranked him 15th out of the 200 greatest Israelis of all time.
Moshe Dayan—another iconic Israeli figure who was a chief of staff, defense minister, and foreign minister—called Har-Zion “the finest of our commando soldiers, the greatest Jewish warrior since Bar Kochba,” referring to the leader of the 2nd-century-CE revolt against Rome. It was Dayan who had Har-Zion appointed an officer even though he had never undergone officers’ training.
In eulogizing Har-Zion, current defense minister Moshe Yaalon called him “one of the greatest warriors in the history of the IDF—an audacious, distinctive commander whose influence in molding generations of fighters and units was pivotal.”
Love it or hate it, the AMC hit series The Walking Dead is a mirror of our culture. The show is nominally an apocalyptic zombie series, but it is really about how people deal with a total societal collapse.
The answer is: Badly. Usually very badly.
Episode #14 of season 4, “The Grove,” is a thoughtful and tragic examination of what a society should or can do with a psychopath. (Spoilers!) Set in the woodlands of the American South after a zombie apocalypse, the episode follows a group of five refugees who find a cabin where they can stop and rest for a few days. There, disturbed young Lizzie goes homicidal. She stabs another little girl to death. Her mother figure, Carol, then asks her to “look at the flowers” while she prepares to execute her, the only solution possible in their terrible new world.
The clues were all there, laid out carefully in past episodes. The girl had an obsession with capturing and cutting up live rats. She had sudden outbreaks of violent rage and anger. She was fascinated with zombies and couldn’t distinguish between the living and the dead.
The clues are all here in the real world as well, and we are no better at preventing the slaughter when a mentally disturbed person decides to kill. The Sandy Hook killer, the Aurora theater killer, the murderer at Virginia Tech, the killers at Columbine High School, all exhibited distinct indicators of violence and psychosis. All of these killers were under psychiatric care and on medically prescribed drugs. Each of them showed signs like little Lizzie on The Walking Dead, and her path ended the same as theirs, in blood.
In “The Grove,” just as in America today, we wait until a disturbed person becomes a killer and only then do we do something about them. Only then do they receive the confines of a cell or a grave. We can do better than this. Unlike Carol on The Walking Dead, we have options.
In the heartbreaking and frightening essay “I am Adam Lanza’s Mother,” the mother of a mentally disturbed boy explains how she cannot find care for him. “With state-run treatment centers and hospitals shuttered, prison is now the last resort for the mentally ill.” This mother doesn’t want to put her innocent (but violent and disturbed) twelve-year-old boy in prison. Would you like to live in a world where people are jailed for crimes they might commit? Instead, we need to re-build our mental health care system in this country and that includes treatment centers and hospitals. If we don’t, we will continue to endure the slaughter of innocents at the hands of the mentally ill.
“I went down a tunnel, I saw a light.” It has become such a standard part of near-death-experience accounts that it’s almost a cliché. Near-death experiencers report moving through a (usually dark) tunnel and emerging at a different place, where they may encounter a being of light, deceased relatives, heavenly landscapes, a review of their lives—usually some combination, or all, of those elements.
The tunnel experience is common though not universal. Some NDErs seem to go directly to the other world, without passing through a tunnel. Tunnels seem to be considerably less common in Hindu NDEs. Japanese NDErs report moving along rivers instead of tunnels.
At least in Western NDEs, though, tunnels are more common in cases where NDErs are actually clinically dead. All this suggests that the tunnel is a metaphorical representation of the transition from one world to the other. What is, however, universal is that the world encountered in the NDE is very different from the earthly one—and in the vast majority of cases, a lot better. To such an extent that NDErs—no matter what their earthly responsibilities and attachments—usually want to stay in the transcendent world, and regret—sometimes quite painfully—having to return to the earthly one.
Skepticism is the right attitude if it means you insist on real, strong proof before being persuaded of something. It is not a good attitude if it means you’re set to deny and belittle proof of something no matter what.
Skeptics about whether near-death experiences are real tend to be in the second category. Millions of people have undergone them since the 1960s; a good summary of the confirmative evidence that arises from this vast trove of experience is here.
As The Blaze told it:
Brian Miller, 41, was hospitalized after suffering a major heart attack. While he was doing well at first, his heart eventually went into a deadly arrhythmia called ventricular fibrillation, described by the Mayo Clinic as “a … rhythm problem that occurs when the heart beats with rapid, erratic electrical impulses.”
From that point, Miller was out cold. As a nurse affirmed, “He had no heart rate, he had no blood pressure, he had no pulse…. His brain had no oxygen for 45 minutes….”
In other words, Miller was in the state known as clinical death. Before the advent of modern CPR techniques, there would have been a simpler name for it: death. He would have been seen as beyond any hope of revival.
Brian Miller, of course, revived—but how it happened, and even whether the medical team’s efforts were solely responsible for it, is not at all clear.
A moment’s reflection is all that should be necessary to convince anybody that our passions are not necessarily engaged by public controversies in proportion to the numerical or statistical importance of the question in hand. The debate over euthanasia and physician-assisted suicide (PAS) is deeply impassioned everywhere; but not even the most enthusiastic advocate of euthanasia supposes – at least not yet – that the question will ever affect more than a very tiny percentage of people.
The fact is that man is an animal that quarrels over symbols, and euthanasia is as much a matter of symbolic as of practical importance. How else are we to explain the fact, cited in an article in a recent edition of The Lancet about the new Belgian law extending the benefits of euthanasia to children, that there have been dozens of bills before the Belgian parliament desiring either to extend or to limit the scope of the current euthanasia legislation?
Reading the article and the articles to which it was linked, I came across two statements, one startling and the other importantly revealing. The startling fact was the following:
Recent studies have shown that the proportions of deaths that are the result of euthanasia or PAS in Oregon, the USA as a whole, and The Netherlands are 0.09%, 0.4%, and 3.4%, respectively.
Assuming this to be no misprint, why should the rate of physician-assisted suicide be more than four times higher in the United States as a whole than in Oregon, which is one of only four states (with a total of only 5 percent of the U.S. population between them) to permit it? Is it under-reported in Oregon? Is it carried out surreptitiously and illegally elsewhere? Are all the figures so inexact as to be virtually bogus? And if they are bogus, what does that tell us about the whole matter?
Another question is why there should be nearly forty times as many deaths by euthanasia and PAS in the Netherlands as there are in Oregon. Is unbearable end-of-life suffering forty times more frequent in Amsterdam than in Portland? This is prima facie most unlikely. The pattern of disease in most western countries is very similar, and in both Oregon and the Netherlands cancer is by far the most common cause of requests for easeful death. Is there something sinister in the disparity?
“HEAVENLY Father,” take to thee
The supreme iniquity,
Fashioned by thy candid hand
In a moment contraband.
Though to trust us seem to us
More respectful—“we are dust.”
We apologize to Thee
For Thine own Duplicity.
That’s by Emily Dickinson, the wonderful 19th-century American poet who churned out almost two thousand poems in near-total obscurity, too shy to publish more than a handful of them during her lifetime.
“Heavenly Father” is a retort, couched in acid irony, and also a plaint. We are not supposed to be anything much—dust, iniquity. Creating us was a momentary lapse, a glitch. The father is not presumed to be proud of what he has wrought.
And yet, if the creations are that flawed, why blame them for their failings? It seems like a double insult—to be fashioned as something iniquitous, then also held accountable for it. Dickinson raised here a profound question about moral responsibility and the relationship of the creator to his imperfect handiwork.
The poetess died at 55 in 1886, and “Heavenly Father” is considered one of her later poems. That means she wrote it about a hundred years before the publication in 1975 of Raymond Moody’s Life After Life, the first major, groundbreaking book on near-death experiences. At that time, thanks to advances in resuscitation medicine in the 1960s, there was a sudden surge in the numbers of people—ordinary people, not mystics or spiritualists—saying they had had a direct experience of the deity. They gave descriptions of a being more logical, or reasonable, than the one Dickinson had accosted.
When I was young enough still to consider myself rational, I was irritated by patients who tried any remedy in desperation to save themselves from their fatal disease. I have long since mellowed and when an acquaintance of mine with glioblastoma, a rapidly fatal brain tumor, decided recently to go to India to try Ayurvedic medicine, all I could do was wish him luck – sincerely so. After all, the scientific medicine — which he would continue to take while there — offered him little enough hope, a few months at most. (This case, incidentally, illustrates an important point: alternative medicine, so called, is not generally alternative, it is additional.)
Two trials of a very expensive monoclonal antibody, bevacizumab, in glioblastoma, published recently in the New England Journal of Medicine, make disappointing or even dismal reading. This antibody is directed at vascular endothelial growth factor, which promotes the growth of new blood vessels; glioblastoma is a tumor particularly rich in new blood vessels, and so it was hoped that by preventing them from forming, tumor growth would be either prevented or at least slowed. Early results were promising, but as has so often been the way in the history of medicine, early promise is not fulfillment of promise.
In one trial, for example, 637 patients with this terrible tumor were randomized to conventional treatment plus placebo or conventional treatment plus bevacizumab. Although the latter had a slightly longer period free of progression of the tumor, their overall length of survival was not increased, and indeed they suffered so many more side effects that the overall quality of their lives was worse. The patients taking bevacizumab survived on average 15.7 months; those taking placebo survived 16.1 months. The authors of the paper conclude:
In conclusion, we did not observe an overall survival advantage with first-line use of bevacizumab in patients with newly diagnosed glioblastoma. Furthermore, higher rates of neurocognitive decline, increased symptom severity, and decline in health-related quality of life were found over time among patients who were treated with bevacizumab.
This makes rather odd the concluding words of an editorial that accompanies the trials in the Journal:
Finally, it is worth noting that despite its limitations, bevacizumab remains the single most important therapeutic agent for glioblastoma since temozolomide. Ongoing and future trials will better define how and when it should be used in this population of patients for whom so few treatment options currently exist.
Clearly the viewpoint of the oncological researcher is not that of the sufferer of the disease: he is looking far into the future, while the poor patient (all the poorer if he has to pay for his drugs) is thinking rather less far ahead.