Parents teach you how to live.
And Pragmatists teach you how to live in the world.
Pedants and Prudes teach you how to live
in the world of the past.
And Prophets teach you how to live
in the world of the future.
But Philosophers teach you how to die.
Parents want to fill you with food and values.
And Preachers want to fill you with faith.
Politicians want to fill you with hope.
And Pundits want to fill you with opinions.
But Philosophers want to fill you with doubt.
Parents care about your dreams.
And Partners want you to care about their dreams.
Preachers and Poets want you to dream their dreams.
And Pessimists want you to dream their nightmares.
But Philosophers just want you to wake-the-fuck-up!
Patricians run for The Cure.
And Politicians run for office.
Players run for the ball.
And Prophets run from The Call.
But Philosophers walk. Slowly.
Parents prepare you for the life they didn’t have.
And Preachers prepare you for the life you ought to have.
Pessimists prepare you for the life they’ve had.
And Pragmatists prepare you for the life
you’re likely to have.
But Philosophers prepare you for the death
you’re sure to have.
Jesus answered, “Do you think that these Galileans were worse sinners than all the other Galileans because they suffered this way? I tell you, no! . . . Or those eighteen who died when the tower in Siloam fell on them—do you think they were more guilty than all the others living in Jerusalem? I tell you, no!”—Luke 13:2-5
“Well, guys, didn’t really ask for the death sentence. But if that’s all you’ve got, I’ll take it in black.” That’s what my sarcastic friend said, before blowing out the candles on her 32nd birthday cake. She was just like that: you know, the kind of person who simply refuses to take life seriously, the kind of person who can turn anything into a joke, even a breast cancer diagnosis.
Of course there were chinks in her body armor, cracks in her bulletproof butch persona; and, in them, we could see the fear and the terror and the doubt peeking out at us, like shy forest creatures with big eyes. They were there: right there: in the quivering corners of her sardonic smile.
When I saw her two years later, in the palliative care unit, she said she thought the oncologist’s diagnosis was the worst diagnosis she would ever hear; until, that is, she got a second diagnosis from her uncle, the bible-thumping fundamentalist, with the “Jesus Loves You” t-shirt: “The wages of sin are death,” he thundered through the phone. “God’s punishing you for being a lesbian! You brought this cancer on yourself.” Before hanging up on his dying niece, he promised to pray for her.
Worse still, she said, was a third diagnosis she got from her high-strung, Prius-driving sister, the health-obsessed housewife with rock-hard yoga abs. Her sister was doing a “cleanse” when she got the call. Maybe that’s why she was on edge. Maybe that’s why she got mad at her little sister, for having cancer. Maybe that’s why she told her off: “Can’t live like you’ve been living, and get away with it forever. Been telling you for years now: quit the lesbo-fat-is-beautiful shit, drop 40 pounds, and get in shape! You brought this cancer on yourself.” Before hanging up on her dying sister, she mumbled something under her breath, something my friend couldn’t quite make out, something about how hard this was gonna be on the kids (but my friend had no kids).
Words like “fascist” and “communist” aren’t particularly useful when it’s hard to tell the difference between life in Stalin’s Soviet Union, Hitler’s Germany, and Mussolini’s Italy. That’s why Hannah Arendt said we needed a new name for this 20th-century manifestation of an age-old problem. I’m referring, of course, to the problem of evil, which is always, to some extent, a problem of naming.
We found ourselves similarly situated at the funeral, as we gazed down upon the lifeless body of our 34-year-old friend. Because words like “secular” and “religious” aren’t particularly useful when health-nuts and fundamentalists start seeing eye-to-eye, when the heartlessness coming out of the health-club is indistinguishable from the heartlessness coming out of the church, when the metaphysics of the yoga retreat converge with the metaphysics of the bible camp.
But we don’t need a new word like “totalitarianism” for her uncle’s diagnosis, nor do we need a new word for her sister’s diagnosis. Plenty of nasty old words will do, though I can’t, for the life of me, seem to settle on one. I keep looking for a word and yet all I seem to find is a scripture: “if I have prophetic powers, and understand all mysteries and all knowledge, and if I have all faith, so as to remove mountains, but have not love, I am nothing.”
“In Heaven’s name, Hollingsworth,” cried I, getting angry, and glad to be angry, because so only was it possible to oppose his tremendous concentrativeness and indomitable will, “cannot you conceive that a man may wish well to the world, and struggle for its good, on some other plan than precisely that which you have laid down? And will you cast off a friend, for no unworthiness, but merely because he stands upon his right, as an individual being, and looks at matters through his own optics, instead of yours?” “Be with me,” said Hollingsworth, “or be against me! There is no third choice for you.”—Nathaniel Hawthorne, The Blithedale Romance (1852)
Scott Nearing lived to be 100 years old. One would be hard-pressed to find a single progressive twentieth-century cause that he did not advocate at one time or another. Nearing participated in the labor movement, pacifism, socialism, the women’s liberation movement, civil rights, communism, and, for the second half of his life, environmentalism, organic farming, and the natural health movement. He also found time to write over fifty books, hundreds of pamphlets and articles, and a novel. He was a religious virtuoso who had a habit of getting himself kicked out of institutions. In 1915, Scott Nearing, then a professor of economics, was fired from the University of Pennsylvania for protesting against child labor; in 1917, he was fired from the University of Toledo for protesting against the First World War; and in 1930, he was expelled from the Communist Party for writing a book contradicting Lenin and suggesting that the Soviet Union was an imperialistic power.
Making Scott Nearing toe the party line, any party line, was a difficult task. “Although collectivism is part of his creed,” one journalist observed, “try as he will to cooperate with his fellow men, he cannot play the party man. Neither the Right nor the Left has been able to make him conform.” It is likely that Scott Nearing’s first marriage, to the feminist and women’s rights activist Nellie Seeds, ended in large part because he could not compromise his ideals for anyone. When he found their lifestyle too opulent he gave away almost all his clothes, began dressing very simply and eating a Spartan vegetarian diet out of nothing but the same wooden bowl and spoon—this, in protest against his own family. Not surprisingly, Scott Nearing’s single-mindedness made him awkward socially. “He abhorred gossip and small talk, avoiding commonplace trivia,” wrote Helen; “he was not an easy or avid conversationalist.” Scott saw most friendliness as a form of affectation, which he disapproved of as much as “dancing and dress clothes.” Although Helen had religious virtuoso tendencies, it took a while for Scott to convert her to radicalism. “There were times,” she later wrote in her memoir, “when he had to poke or pull me along toward his own rare intense level of dedication.”
Like Scott Nearing, Helen Knothe came from a wealthy northeastern family. As a young woman, she looked forward to a career as a concert violinist. Knothe was, as she said, artistic and musical. She had lived in Europe for years, spoke several languages, and was well versed in Eastern mysticism and the occult. She had studied for a number of years under the Indian guru Jiddu Krishnamurti. Helen Knothe had a serious side, but she was for the most part a fun-loving, spontaneous, free spirit. She once, on impulse, took herself off to Australia to join a commune. By contrast, Scott Nearing’s obsessive-compulsive personality was legendary. He told his friend Upton Sinclair that Sinclair could stay in his summer cottage for as long as he wanted, but that he could not touch any of the tools, as they “might be mislaid.” Helen once jokingly exclaimed, “I bet you even fold up your toilet paper neat and square.” He confessed that he did. Scott Nearing had a lifelong love affair with order. Carefree Helen Knothe nevertheless fell in love with him. She was in her mid-twenties. He was in his mid-forties. “For more than fifty years,” she remembered, “my life was Scott-centered.” They were not officially married until 1948, however, when Nellie Seeds died, leaving Scott Nearing a widower.
To say that Scott Nearing was at a low point in his life in 1928 would be an understatement. He had recently separated from his first wife, his professional career was over (he had now been fired by two universities), journals declined his articles, and he was broke. Knothe and Nearing’s first years together were tough, made worse by the onset of the Great Depression. After living in a cold-water flat in New York’s Lower East Side for a few years, they became thoroughly disillusioned with American society. They concluded that they were living in a social order based upon perverse values: competition, acquisition, conspicuous consumption, aggression, and war-making. Theirs was a society that butchered for food and murdered “for sport and power.” They resolved to emancipate themselves from American society on ethical grounds: “The closer we have to come to this social order the more completely are we a part of it. Since we reject it in theory, we should, as far as possible, reject it also in practice. On no other basis can theory and practice be unified.” They fixed upon moving back to the land as the only viable solution. Thus, in 1932, Helen Knothe and Scott Nearing moved to rural Vermont, purchased a derelict farm with what little money they had left, and began the herculean task of creating a self-sufficient homestead out of a patch of tired New England soil.
“Sentiment without action,” Edward Abbey once said, “is the ruin of the soul.” The Nearings could not have agreed more. In their manifesto, Living the Good Life: How to Live Sanely and Simply in a Troubled World (1954), they contended that as long as theory was divorced from practice it did violence to the soul by dividing “the personality against itself.” “The most harmonious life,” they argued, “is one in which theory and practice are unified.” “We desired to liberate and dissociate ourselves as much as possible, from the cruder forms of exploitation: the plunder of the planet; the slavery of man and beast; the slaughter of men in war, and of animals for food.”
Since they disapproved of all these “forms of exploitation” they could not in good conscience enjoy any of the spoils. The Nearings claimed that they had tried to live an ethical life in an urban setting and found it impossible. Invariably they encountered the same obstacles: “complexity, tension, strain, artificiality, and heavy overhead costs.” It was, they maintained, “virtually impossible to counter city pressures and preserve physical health, mental balance and social sanity through long periods of city dwelling.” More importantly, the costs of living in the city “were payable only in cash, which had to be earned under conditions imposed upon one by the city—for its benefit and advantage.” As long as they remained in the city they would be more or less in “the system’s clutches,” helpless cogs in an “impersonal, implacable, merciless machine operated to make rich men richer and powerful men more powerful.”
The Nearings insisted that they were doing more than merely saving their own souls when they left the city; they were not “shirking obligations” or “seeking to escape.” They adamantly maintained that their errand into the wilderness was a political act; life on Forest Farm was an argument: the personal was altogether political. “We believed that we could make our contribution to the good life more effectively in a pre-industrial, rural community than in one of the great urban centers.” As the Nearings saw it, their first major contribution was to stop adding to the problem; their second was to prove that it was possible to live a harmless life, by creating a viable alternative to America’s wasteful lifestyle.
The Nearings argued that people could save the planet by living simply and conscientiously. Many of those who had become disillusioned with political change in the early 1970s found this message empowering. “I read, in 1976, Living the Good Life,” wrote one follower, “and then saw you shortly thereafter speaking in Boston. It was during this time, and directly due to you, that I became absolutely clear that I as a single individual . . . could accomplish whatever I set out to do. And what I set out to do was to participate in the healing of the planet.” “Up to this time,” she added, “I had thought that I hadn’t nearly enough of the ‘stuff’ it would take to achieve this. I made it too lofty a goal for myself by placing it only in the hands of those who apparently had large amounts of ‘power and influence.’” “You have shown,” enthused another, “that actions speak louder than words. Doing is more important than knowing and knowledge which cannot be translated into action is of little worth.” By removing themselves to the wilderness and teaching by example the Nearings were perhaps unknowingly continuing a long New England tradition—they were building a city on a hill. But this city on a hill was adamantly anti-city.
The Nearings raised a stone house with their own hands and named their Vermont homestead Forest Farm. “We were not young,” they would later write, “but we were adventurous.” Within a couple of years, they were almost completely self-sufficient, living off the vegetables and fruits they grew on the farm and describing outsiders as “visitors from the outside world.” When nearby Stratton Mountain developed into a popular ski resort virtually overnight in the early 1950s, the Nearings relocated to Maine. Tellingly, however, the Maine homestead was also called Forest Farm, highlighting the fact that “Forest Farm” was an idea and not a specific place; it was an ecologically conscious way of life that unified theory and practice.
Living the Good Life (1954) was the memoir of Helen and Scott’s two-decade-long (1932-1952) Vermont project, a manifesto published on the centenary of Henry David Thoreau’s Walden; or, Life in the Woods (1854). Comparisons were made almost immediately, with journalists often referring to the Nearings’ book as the Walden of the twentieth century. Although sometimes resenting the comparison, the Nearings did much to encourage it. They quoted from Walden at length in Living the Good Life and had clearly been inspired by Thoreau. Regardless, the Nearings had no pretensions to originality. “We were trying out a life style,” they wrote in 1979, “that was not new in history, but was new in our generation.”
The Nearings’ commitment to the ecological logic of individual responsibility was not born full-grown; they grew into it. Over the years the gulf between their private lives and public lives narrowed, as every aspect of their existence succumbed to the gravitational pull of their ideals. They eventually, for instance, boycotted all of the food-oriented holidays. On Christmas and Thanksgiving, when most Americans made merry, the Nearings fasted. “We do it,” Helen Nearing wrote in 1980, “as a protest against the folly of feasting, against the national gluttony of overfed people overeating.” Feast days were obscene in “a world where there are people who are starving”—“in a world where there is enough to go around, but it is not shared equally.”
Life at Forest Farm became a spectacle, a political act, from the food the Nearings ate to the house they lived in; even their lithe physically fit bodies became prooftexts and arguments for their way of life. “The whole of our lives so far has been our message,” Helen Nearing wrote in 1995. The rural utopia described in Living the Good Life (1954) became a veritable new frontier in the minds of many of the environmentally conscious youth that came of age in the 1970s. Much like the frontier that Frederick Jackson Turner envisioned a century ago, homesteading came to be seen as an ever-present possibility, a comforting thought; a potential escape route from the complexities of modern life; something to fantasize about on a bad day. It was a pressure valve that provided for the safe and largely apolitical release of the socially disaffected. Rather than mobilize politically in the urban centers to push for structural change, idealists were encouraged to withdraw from politics, remove to the country, and live a self-sufficient existence. Even if most of those who entertained the homesteading fantasy never actually moved back to the land, the idea that they could had a powerful effect on the imagination of a generation of ecologically-conscious North Americans.
The Nearings became countercultural celebrities in the 1970s. Forest Farm, in turn, became a sacred place. A trip to Forest Farm became de rigueur for many young homesteaders. World Forum claimed in 1973 that “visiting them” was “like a pilgrimage.” Forest Farm became “a shrine for the faithful”; “a Mecca for people attracted to living a sane and simple life close to the land.” “Every year,” declared Booklegger magazine, “hundreds of young people, long-haired and knap-sacked, make the journey to the Nearings’ Forest Farm at Harborside. They leave with renewed vision.” Advertisements like this one appeared in hippie homesteading magazines: “Leaving for the Nearings on the 25th. Have room for two.”
The Nearings’ fame was at least partly due to the popular press. Journalists regularly constructed antecedents for the counterculture. In 1970, the Buffalo Evening News declared that Scott “was a dropout from society 40 years before it was ‘in,’” while The Nation said of the Nearings—“Whether they know it or not, they are by way of becoming an ‘in’ couple.” About half a year later, the New York Times ran an article on the Nearings entitled, “They Lived Today’s Ideas Yesterday.” They would eventually be touted as “the elder statesmen” of the homesteading movement and “The Counter Culture’s Pioneers.” Establishing their credentials as the bona fide forerunners of the homesteading movement, the Boston Herald American wrote, “Scott and Helen Nearing [had] already had their fill of a fledgling rat race . . . long before the world heard of ‘hippies,’ when Timothy Leary was still in knee pants.” Harold Henderson was blunter: “they left the city well before doing so became fashionable.” Thus, after years of obscurity, Helen and Scott Nearing became countercultural celebrities; as People magazine put it, “the Nearings suddenly became chic radicals.”
Pilgrims to Forest Farm regularly employed explicitly religious language to describe their experience at the seaside homestead. As if they had just returned from a visit to Machu Picchu or Chartres Cathedral, Mark Jackson and Karen Roberts recalled their demeanor: “Speaking in guarded whispers, we felt as if we were on sacred ground, blessed to be there.” “Art in twentieth-century America has many forms,” declared another, “and I view the Nearing homestead as one of them.”
The Nearings were personally revered just as much as (if not more than) their work of art, Forest Farm. In an article chronicling the activities of a homesteading conference, Jack Aley claimed that many homesteaders “worship Scott Nearing as a living folk hero” and refer to him colloquially as “the seer.” He noted that a solemn silence prevailed when Scott addressed the crowd of 2,000 young people at the conference, and that “several young women close to the speaker’s platform had beatific smiles on their faces.”
Ellie Thurston believed that she was in the presence of a holy woman when she met Helen Nearing for the first time: “I began to feel a little shy,” she confessed, “or maybe awe-struck is the word, as we followed the famous Helen Nearing into her very simple, wood-heated kitchen.” “I just couldn’t help but feel somewhat humble,” averred Thurston, “in the presence of the very Mother of the homesteading movement.” She maintained that the Nearings were “practically the founders of today’s ever-growing back-to-the-land movement.” “The name Nearing,” Thurston observed, “is a household word among young back-to-the-landers, with the significance that the name Sigmund Freud would have in psychological circles.” “They’ve been referred to,” she added, “as the ‘senior gurus’ of the homesteading movement.”
Visitors reported feeling spiritually transformed after spending time at Forest Farm. Mary Beth Fielder and her companion were filled with something akin to religious ecstasy after a day with the Nearings. Fielder described the experience in a letter to Helen. “After saying goodbye we stopped at a beach a few miles down the road and as the clouds changed color over the ocean we cried, laughed and prayed that we, like you, would have the courage and perseverance to bring our inner visions into reality.”
For Alice Ellison, visiting Forest Farm was a redemptive experience and an important catalyst for personal growth: “meeting you and seeing your beautiful home have helped me to make some solid changes in my own life. . . . You have touched me profoundly.” Likewise, Sharon Watson wrote, “Since being with you I have been rethinking my garden, diet and general way of life . . . . It is wonderful to see a place where life is in such harmony and that feels so true and balanced. This is what I want in my life.” If mankind is “to work harmoniously together” for a better future, averred Robert Brown, we must all take heed of “your living example.” What is striking about these sentiments is how thoroughly apolitical they are. A pilgrimage to Forest Farm led to introspection and guilt-ridden repentance, not political action.
The Nearing homestead was at times so overrun with long-haired onlookers that it took on the appearance of a countercultural theme park for the ecologically conscious. On these days organic farming was transformed into a spectator sport. Ellie Thurston described one of these particularly crowded days: “300 people crowded in and around the garden, cameras flashing, movie film rolling, tape recorders humming,” while “Scott leaned on his hoe” and preached the gospel of homesteading. Commenting on the pilgrimage phenomenon at the end of the decade, the Nearings remarked, “Before we moved from Vermont to Maine, the trickle of visitors had become a stream. During the next years in Maine it became a flood.
By the 1970s the number of visitors, by head count, has ranged between 2,000 and 2,500 in the course of a year. It often reached dozens in a day.” Most pilgrims arrived, like Sheila and Richard Garrett, “uninvited and unannounced,” counting upon the Nearings’ open-house policy. They were “come-seers,” and like everyone else, they were welcomed, fed, and invited to join in the work of the day or do nothing—provided that they did not get in the way of those who were working, especially the Nearings. Irrespective of these restrictions, the number of visitors continued to grow, and the situation became so overwhelming that in 1976 the Nearings were forced to put up a sign indicating that they would receive guests only from three to five o’clock in the afternoon. Even this arrangement became too much to bear, and so, in 1978, they required that all visitors give prior notice via the mail (the Nearings, of course, did not have a telephone). They even declared 1978 a “sabbatical” year. The Nearing Edict of 1978 was not altogether successful. Many of the faithful continued to show up unannounced. The decree did, however, significantly curtail the exhausting ritual.
The Nearings worried that the lax cultural values of the 1970s might be fundamentally incompatible with the austerity of the homesteading life. They admired the pilgrims’ idealism, but found their work ethic wanting. As Helen Nearing once put it, many came to Forest Farm, but few stayed: “They said we worked too hard. They wanted to lie in a hammock and discuss the Good Life.” Most homesteaders refused to submit to the Nearings’ highly regimented lifestyle, preferring a more relaxed pace. They also dissented from the Nearings’ teetotaling ways as well as their aversion to swearing. Scott Nearing actually resigned from the advisory board of the War Resisters League when the word “shit” appeared in the League’s magazine. The Nearings abhorred “dances and beer parties.” They were exactly the sort of “puritanical, sour, righteous” Old Left radicals that Charles A. Reich contrasted with the free-spirited radicals of the counterculture in The Greening of America (1970).
To the young people who visited them, many of the Nearings’ pet peeves seemed old-fashioned, dated relics from a bygone era. The Nearings’ moralism vis-à-vis health and the body, however, made them positively au courant. “It is unnecessary for us to say,” the Nearings once declared, “that the difference between good health and bad is the difference between the success and failure of almost any long-term human project.” In Living the Good Life, the Nearings quoted (favorably) an English medical doctor named G. T. Wrench, who likened disease to a “censor” that “pointed out” those errant individuals whose lifestyles were “faulty.” Ellie Thurston found this attitude to be widespread among the leaders of the homesteading movement. At one homesteading convention, she wrote, “I couldn’t help but speak up in defense of falling ill occasionally—some of the diehards honestly seem to think homesteaders are immune to any kind of disease because of our ‘healthful’ way of life.”
The analogies used to differentiate The Good Life from its opposite frequently revolved around notions of purity, hygiene and cleanliness. One follower tellingly described the Nearings as “a clear stream in a polluted river.” In the emerging moral economy of health and wellness of the 1970s, to be a vegetarian and a nonsmoker, to eat organic food and drink water from a spring, was not simply to be living a healthful life, it was to be pure, clean, undefiled, unpolluted, and in a certain important sense, righteous and good. Much of this moralism comes through in Helen Nearing’s cookbook, Simple Food for the Good Life (1980), which Food & Wine magazine described as, “The funniest, crankiest, most ambivalent cookbook you’ll ever read.” In the book’s lengthy diatribe against the eating of meat, Helen Nearing describes the “savage” and “repulsive” custom of consuming “putrefying corpses” as “unethical” and “unhygienic,” and expresses disgust at “the ghoulish practice of making . . . stomachs the burial ground for dead bodies.” It was this holier-than-thou tone that most bothered socialist critic Jigs Gardner about the Nearings: “They make people feel like sinners by endowing what one would think of as neutral acts—eating, for instance—with a strongly moral tone.” “Without directly saying so,” Scott Nearing “makes you feel like an epicene degenerate for enjoying [white] bread.” “Moralism,” averred Gardner, “lurks everywhere in the militant Simple Liver’s world: I’m better than you are because I’m a vegetarian . . . or because I wear old clothes, and so on and so on . . . . Using a woodstove, growing a vegetable garden . . . entitle one to a feeling of sanctimonious superiority to the mass of yahoos out there in suburban Consumerland.” What Gardner grasped was that the Nearings saw all of these personal lifestyle choices in explicitly moral terms. Consider, for instance, how Scott Nearing dealt with the death of his son, John Scott. 
In 1976, John Scott died of a heart attack. He was 71 years old. Though asked to come, Scott refused to attend his son’s funeral. Sitting out the funeral was, he claimed, a protest against his son’s unhealthy lifestyle. He went so far as to write a nasty letter to his dead son’s daughter, which, in essence, maintained that her father got what he deserved.
The Nearings’ intolerance toward physical weakness grew more marked with age as they began to take full credit for their undeniably impressive vitality. Continuing the Good Life (1979) was full of references to their health: “We homesteaded in Vermont for nineteen years without having a family doctor. We have homesteaded in Maine for more than twenty-five years equally free of permanent medical advice because we have been chronically well.”
The Nearings came to view their “abounding health” as an act of the will, a conscious choice, rather than as a fortuitous byproduct of both their healthy lifestyle and their ample good fortune. When a doctor friend proposed that Scott Nearing begin receiving monthly vitamin B12 injections, he responded curtly: “If I did this I would be trying to prolong my life under medical supervision for the rest of my life. Thank you, but I would rather die much earlier than follow such a course. . . . If I cannot stay well by a normal diet and temperate living, the sooner I die the better for me and the society of which I am a member.” There is in this statement—and most of the Nearings’ thoughts on health and wellness—an echo of Scott Nearing’s onetime interest in eugenics. In The Super Race: An American Problem (1912), he declared that the “perpetuation of hereditary defect [was] infinitely worse than murder.” “The murderer,” he argued, “robs society; the mentally defective parent curses society, both in the present and in the future, with the taint of degeneracy. The murderer takes away a life; but the feeble-minded parent passes on to the future the seeds of racial decay.” We must, Scott Nearing concluded, do everything in our power to ensure that the “scum of society” do not have children.
For most of the 1970s, when the homesteading movement was in full bloom, Scott Nearing was in his nineties and Helen Nearing was in her seventies. Yet they continued to build houses out of stone with their bare hands, dig sizable ponds with shovels, maintain a massive garden virtually all year round, cut and split all their own firewood, and do physical work on a daily basis that regularly exhausted young people in their twenties. The Nearings’ vitality granted legitimacy to their way of life among the health conscious. “Your health,” one homesteader declared, “is evidence of how successful your outlook on life can make you physically as well as mentally.” Their bodies had with age become arguments and prooftexts for their way of life.
The Nearings used their age and remarkable health to browbeat and belittle homesteaders (usually ex-homesteaders) who publicly pointed out the difficulties and physical discomfort associated with moving back to the land. For instance, the March 1979 issue of Country Journal contained a number of letters to the editor from crestfallen homesteaders who had tried their best, failed, given up, and moved back to the city. Helen Nearing responded to these letters in the following issue with derision: “Scott and I know it is hard work subsisting by your own sweat on a homestead. We’ve done it for half a century, but who’s crying?” “Not we,” she crowed. “If we oldsters can stand it, what’s wrong with your various authors that they should creep back to the protection of the city? What’s all this grousing about hard work . . . ? Scott, at ninety-six, enjoys it. ‘Good exercise,’ he says. ‘Keeps the blood boiling.’”
The Nearings responded with anything but sympathy when homesteaders complained that being back to the land in the middle of a bitterly cold winter could be miserable. One homesteader, suffering from chronically cold feet, wrote to the Nearings quite clearly looking for consolation and advice. Helen Nearing responded callously, practically blaming the woman for her condition: “It seems to us that your circulation is probably inadequate. We’ve lived in New England through more than 40 years’ worth of subzero winters, and neither Scott nor I have ever suffered from chronically cold feet. In fact, I often pad around on our stone floors—in the dead of winter—barefoot!”
There was something almost monstrous about Helen and Scott Nearing. Their idealism was all-consuming. This was especially true of Scott Nearing. He loved teaching: yet he engaged in activities that led to his dismissal from two different universities. He thought child labor and the First World War were wrong—and he was going to protest against them, come what will. Scott Nearing loved Communism: yet he wrote a book contradicting Vladimir Lenin and suggesting that the Soviet Union was an imperialistic power. That book resulted in his expulsion from the Communist Party. He thought, at the time, that certain aspects of Soviet foreign policy were wrong, and he was going to say so, come what will. The Communist Left turned its back on Nearing in the 1930s; but he did not turn his back on the Soviet Union. Scott Nearing defended Stalin well into the 1950s, long after the American Left had repudiated him. One can only assume that he loved his son John: yet he was willing to sacrifice even this love on the altar of his idealism. Scott Nearing disowned his son because he disapproved of John’s politics and eating habits.
How are we, who are not consumed by similar passions, to make sense of people like Helen and Scott Nearing? How are we to make sense of these lovers of discomfort and discord? The Nearings took the logic of individual responsibility to many of its more radical conclusions. In their personal lives and in their little City on a Hill, Forest Farm, we can see so much of the beauty and ugliness of the purpose-driven life.
“death is something that always has to be enclosed by an elaborate set of explanations. It is an ancient litigation, this turning of horror into stories, and it is a lonely piece of work, trying to turn the stories back into horror, but somebody has to do it—especially now that God has reverted to a state of fire.”—Tony Hoagland, “Fire,” What Narcissism Means to Me (2003)
Funerals suck. And they hurt. On so many different levels. If you were really close to the deceased, you’re probably in a great deal of pain. Probably devastated. The sense of loss can be so all-consuming, so overwhelming, so suffocating. You feel like you’re drowning in it.
No one prepares you for how profoundly physiological it is. Your chest tightens so much that your breathing grows shallow—frightfully shallow, almost asthmatic. Your head swims with a dizziness that’s halfway between roller-coaster gross and the drunk spins. You shake uncontrollably from time to time, for no apparent reason. And you feel really nauseous, so much so that you think you might puke—and sometimes do.
Even if you weren’t particularly close to the deceased, seeing others in so much pain—especially people you know and love—usually triggers a powerful empathetic response. Before you know it, you are, quite literally, feeling their pain.
But we have another reason for detesting funerals; a reason which is less noble, less respectable, and less socially acceptable; a reason which is, well, sort of selfish: funerals force us to confront our own mortality. And this makes all of them more or less uncomfortable. But some funerals are considerably more uncomfortable than others, and I think I know why.
There are, to my mind, three distinct types of funeral: (1) funeral for the sinful, (2) funeral for the elderly, and (3) funeral for the innocent. Each is shaped by our moral assessment of the recently deceased.
1) Funeral for the Sinful: Few, if any, view this person’s death as accidental. All to the contrary, the deceased was morally compromised in some way. We blame them for what happened. Sad as it is, we all know they had it coming. After all, they made bad decisions, and these bad decisions led them to this untimely end (e.g., the addict who overdosed on heroin, the drunk driver who careened off a cliff, the career criminal stabbed to death in prison, the chain-smoker felled by lung cancer). This kind of funeral is, existentially speaking, by far the easiest funeral to attend. Sure, if you use heroin on a regular basis, attending the funeral of a fellow addict might be profoundly unsettling. But in all likelihood, the vast majority of the people at the funeral do not use heroin, and, as a consequence, they don’t have to worry about dying of a heroin overdose. It’s easy to avoid facing up to your own mortality at a funeral for the sinful, so long as you yourself do not engage in the sinful practice in question.
2) Funeral for the Elderly: You can’t blame an old person for dying at 94. Nor can you deny the fact that their fate will one day be yours. But there’s no reason to dwell on this thought. After all, you’re in your 30s or 40s or 50s, and 94 seems so very far away.
3) Funeral for the Innocent: Existentially speaking, this is by far the most difficult funeral—the degree of discomfort is outrageously high—because the deceased cannot be plausibly blamed for their death (e.g., the 20-year-old athlete who drops dead of a heart attack as a result of a rare genetic defect; the 36-year-old hospital employee who was accidentally shot by a police officer whilst riding his bicycle to work; the 32-year-old mother of three who dies of breast cancer despite a lifetime of clean living, yoga, and veganism; or the 43-year-old conference participant who randomly chokes to death on a piece of steak at dinner). “When it comes to death,” Epicurus maintained, “all men live in a city without walls.” When the innocent die, we’re forced to remember this.
At the funeral of a sinful man, death’s voice is but a whisper: barely audible and easily ignored. At the funeral of an elderly relative, death speaks to you in a voice that’s clear and unmistakable—yet strangely distant, and oddly unconvincing. But at the funeral of an innocent man, death grabs you by the shoulders and shouts in your face: “You could be next! Yes, you! This could happen to you! Today, tomorrow, or the day after that! So don’t get too comfortable!”
“Oh, the fatal curiosity of the philosopher, who longs, just once, to peer out and down through a crack in the chamber of consciousness. Perhaps he will then suspect the extent to which man, in the indifference of his ignorance, is sustained by what is greedy, insatiable, disgusting, pitiless, and murderous—as if he were hanging in dreams on the back of a tiger. ‘Let him hang!’ cries art. ‘Wake him up!’ cries the philosopher”—Friedrich Nietzsche, “On the Pathos of Truth” (1874)
Like Horst Hutter, I maintain that images are a kind of food which must be properly digested like any other. Some things—an ISIS beheading, child pornography, a drone-strike snuff film—are extremely hard to digest; so hard, in fact, that they can leave you with a species of spiritual indigestion, which manifests itself in bad dreams, generalized anxiety, and PTSD. The decidedly disturbing sculptures that comprise Bevan Ramsay’s Soft Tissue leave you with haunting images that take weeks to properly digest.
Like touring a sweatshop, a garbage dump, or a factory farm, Soft Tissue forces you to remember that we are indeed “sustained by what is greedy, insatiable, disgusting, pitiless, and murderous.” Even so, it would be a mistake to view Ramsay as yet another preachy moralist trafficking in the pornography of pain. Nothing could be further from the truth. Ramsay is an artist and a philosopher—which means that his loyalties are deliciously divided: the artist in him wants to let us hang in “dreams on the back of a tiger,” whilst the philosopher in him wants to wake us up! This creative tension runs through all of Ramsay’s work, but it’s never been quite so obvious as it is in Soft Tissue. There’s something undeniably erotic and sensual about these sculptures. Yet, at one and the same time, we find ourselves—in equal measure—repulsed by them. In the Bible, this strange, ambivalent mixture of fear and wonder is known as awe.
—John Faithful Hamer, The Village Explainer (2016)
“When a man who is happy compares his position with that of one who is unhappy, he is not content with the fact of his happiness, but desires something more, namely the right to this happiness, the consciousness that he has earned his good fortune, in contrast to the unfortunate one who must equally have earned his misfortune.”—Max Weber, The Sociology of Religion (1922)
I was born on a hippie commune in 1974, where yoga, free love, and vegetarianism blended together seamlessly with Christian mysticism and Tibetan Buddhism. We left the commune a week or two before John Lennon was shot. It was 1980, we were in North Carolina, and I was six years old. I realize now, and only in retrospect, that I have been going back to the commune, as a scholar, over and over again. Much of my academic odyssey through the history of America was a personal attempt to make sense of the strange epoch that produced me. As an undergraduate history major, I was attracted most of all to the 1830s and 1840s, which witnessed an extraordinary flowering of utopian experimentation, radicalism, and reform. I was amazed to learn that my parents’ generation—the baby boomers—were not the first to dabble in free love, vegetarianism, and Eastern mysticism. They were also not the first group of relatively privileged white middle-class people to turn their backs on traditional politics in favor of personal development. I have always agonized over the ethics of this inward turn. Is the personal really as political as proponents of this strategy say, or is this move essentially irresponsible and narcissistic?
My first attempt to answer this question came in the form of an undergraduate Honours Thesis entitled “Antislavery Realpolitik: Salmon P. Chase, the Kansas-Nebraska Debate of 1854, and the Politics of Reform on the Eve of Republicanism.” I used the letters, diaries, and speeches of the prominent abolitionist to demonstrate how a profoundly moral individual might choose the messy road of democratic politics, knowing full well the compromises and disappointments that it would entail. Chase’s willingness to grapple with the moral complexity of political engagement astonished me and held my attention for quite some time. In graduate school, however, I soon found myself gravitating back toward the other kind of abolitionist, who would have little or nothing to do with politics or compromise. Under the tutelage of my advisor, Ron Walters, my fascination with radical abolitionism gave way to a more wide-ranging interest—spanning two centuries—in those who have spurned political activism for a personal, utopian approach to social reform.
My first-year paper, a major rite of passage at Johns Hopkins, focused on the lives of Helen Knothe Nearing (1904-1995) and Scott Nearing (1883-1983), two socialists who moved from New York City to rural Vermont at the height of the Great Depression. In their homesteading manifesto—Living the Good Life (1954)—the Nearings insisted that we could all change the world for the better by withdrawing from politics, moving back to the land, and living a self-sufficient existence. Living the Good Life became a classic among hippie homesteaders soon after it was republished in 1970. The Nearings, in turn, became countercultural celebrities and their New England homestead, Forest Farm, became a sacred place. Thousands of long-haired idealists made the pilgrimage to Forest Farm in the 1970s. I thought that my research into the Nearings at the Boston University Special Collections would lead to a dissertation on hippie homesteading. But something in the Nearing Papers kept drawing my attention away from the back-to-the-land movement: the Nearings were obsessed with food, health, and disease. All of these concerns came together in what was commonly referred to in the 1970s as the natural health movement.
I began to track the relationship between radical ideas about politics and radical ideas about health. What I expected to find was a necessary connection between the two. After all, I reasoned, many nineteenth-century reformers—such as William Lloyd Garrison (1805-1879), Theodore Dwight Weld (1803-1895), Susan B. Anthony (1820-1906), and Elizabeth Cady Stanton (1815-1902)—were vegetarians who obsessed over food purity, as were many of the radicals and reformers during the Progressive Era. Upton Sinclair (1878-1968), Jack London (1876-1916), and Scott Nearing are obvious examples. The list could go on. Still, I must ultimately confess that this expectation rang true largely because it accorded with my personal experience.
I was brought up in a single-parent household where progressive politics and health consciousness seemed to go hand-in-hand. My mother was a feminist who railed against Reagan, wrote angry letters to politicians, and went to demonstrations to protest against this and that; she championed the cause of the mentally ill, wrote an M.A. thesis on literary decolonization, and composed environmentalist hymns such as “Hugged by a Tree” and “World with Whales.” But she also ate sprouts and tofu, popped large quantities of vitamin pills, shunned meat, avoided hydrogenated fats, burned incense, played the guitar, sang folk songs, went to a Buddhist temple, wrote a book about a Tibetan lama, and interrogated the food labels at the grocery store mercilessly. And she was not alone. Virtually all of the activists that I had known were vegetarians of some description who fretted over chemical additives, organic food, genetic engineering, food purity, and disease causation. My experience with the Student Labor Action Committee (SLAC) is a case in point.
When my wife and I joined SLAC in 1999, the organization’s main objective was to force Johns Hopkins to abide by Baltimore’s living-wage legislation. Baltimore was one of the first cities in the United States to enact an ordinance of this kind. The law stipulated that every organization that received municipal money—directly or indirectly—had to pay its workers “a living wage”. In Charm City at the time, a living wage was judged to be about $10.00/hr (the minimum wage was, then, less than $6.00/hr). Johns Hopkins was unaffected by the living-wage legislation. Even so, we maintained that as the largest employer in the state of Maryland and the recipient, each year, of over a quarter of a billion dollars in federal money, the administration had a moral obligation to abide by the law regardless of whether or not it technically applied to Johns Hopkins. We organized demonstrations and letter-writing drives, yet to no avail: the administration simply would not budge. In 2000, we staged a sit-in and occupied JHU’s Garland Hall (the president’s building) for 17 days. A number of local restaurants kindly donated food—delicious food—but much of it went to waste because it contained meat. Back then, before the kids, my wife and I were strict vegetarians, as were most of the members of SLAC. We subsisted on organic food, drank echinacea tea daily, went to a farmer’s market every Saturday, and spent a great deal of money on vitamins. For these reasons, and others, I was quite sure of what my research into the relationship between health reform and the Left would unearth, long before I started digging.
Initially, what I found supported my preconceived notions. As I had suspected, an interest in a recurring cluster of unconventional ideas about food, health, and disease was indeed something that united a diverse group of radicals and reformers on the political Left. Still, this was not enough. To prove that there was a connection between liberal-left politics and popular health reform, I had to demonstrate that the influence of these ideas was confined, or at least largely confined, to those on the political Left. This would, I thought, be relatively easy to prove. But I was wrong. Indeed, before long I realized that the influence of these ideas extended far beyond the countercultural world of faded blue jeans, brown rice, and yoga mats, even during the early 1970s. Moreover, it soon became clear that these ideas were more than just un-conventional or un-orthodox, which is to say that they were united by more than what they were not. By and large, I found that these ideas had a grammar, a syntax, and a logic—the logic of individual responsibility; they shared a common style too, which seemed to echo the rhythms and sounds of the Old Testament, not the Old Left. Taken together, this recurring cluster of ideas about food, health, and disease constituted a fairly coherent belief system—an ideology of natural health—which can be stated as a series of existential propositions: First, every human being is in possession of a free will regarding health. Second, good health and a long life are rewards for a certain kind of behavior. Third, certain lifestyle decisions lead inexorably to the salvation of the body—that is, good health and a long life—while others lead to sickness and ill health. Chance has little or nothing to do with this process, and allowances are only rarely made for mitigating factors such as hereditary predisposition. Commitment to these three propositions was, I discovered, what held the various factions of the natural health movement together.
The ideology of natural health was not incompatible with liberal-left politics in the twentieth century. My own experience bears witness to this. Even so, I found that its emphasis on individual responsibility often made it much more compatible with a socially conservative outlook, especially during the 1980s. But to linger too long on this point would be a mistake. Much like evangelical Protestantism and the New Age movement, the natural health movement was a challenge to the modern scientific worldview, not merely the post-New Deal liberal faith. Sprout-eating health gurus, crystal-gazing spiritualists, and sweaty televangelists had at least one common goal: to make the world a meaningful place for the American people. They wanted to banish the confusion and uncertainty that they thought modern science had engendered, and they all achieved a certain measure of success, though we are here interested primarily in the contributions made by the mainstream leaders of the natural health movement.
I am not the first to compare twentieth-century health reform to religion, nor am I likely to be the last. Thus far, however, the value of this comparative approach has not been fully realized because those who have employed it have done so in a fuzzy and imprecise manner. At its worst, the comparison is used in such a way as to stretch the definition of religion so much that it ceases to be meaningful. If, for example, mall-shopping is a religion—as a friend of mine from New Zealand once argued at a dinner party—then what could possibly not be described as a religion? If everything is a religion, then nothing is. Funny as it was, my friend’s disquisition taught me much more about his highly unconventional definition of religion than it did about shopping. Analogical reasoning can only yield significant insight when all key terms are used in a more or less conventional way. In this instance, however, adhering to a conventional definition of religion is not enough. There are a wide variety of religious traditions in this world, and an even wider variety of ways in which those traditions are interpreted, so religious analogs can be found for many secular practices.
If all religion is fair game, as it was for mythologist Joseph Campbell, and one’s knowledge of religion is encyclopedic, then finding the religious in the secular is fairly easy—too easy in fact—since one can selectively draw upon a wealth of potential examples. Thus, if an analysis of the similarities between religion and American health reform is to be truly meaningful and suggestive, it must be hemmed in by the limitations of culture and history, time and place; a specific religious tradition must be identified, with demonstrable ties to twentieth-century America, and this tradition must be used, throughout, as the only point of comparison. To that end, let me state from the outset that I believe the philosophical origins of twentieth-century health reform are to be found in the religious traditions of the West, and not, as is commonly assumed, in those of the East.
Health reformers such as Jerome Irving Rodale, Adelle Davis, Carlton Fredericks, Adolphus Hohensee, Robert David Rodale, and Mark Harris Bricklin drew heavily upon language and concepts derived from Judaism and Christianity, not from Buddhism, Hinduism, Confucianism, or Taoism. They were especially indebted to many of the elements of the Protestant tradition that came together during the Second Great Awakening and continue to inform the modern evangelical worldview, such as: 1) the rejection of predestination in any form; 2) the concomitant emphasis upon free will and individual responsibility; 3) the belief that we are all in need of salvation from a corrupt and unclean world; 4) the notion that living a virtuous life in this world is hard work and ceaseless struggle; 5) the idea that feeling more or less inadequate on a regular basis is an indispensable characteristic of the virtuous life, as it wards off complacency and propels the individual toward perfection; 6) the belief that the wayward cravings of the body are, ultimately, the individual’s worst enemy; 7) the notion that the spiritual must be incorporated into everyday life; and, finally, 8) the idea that conviction ought to lead to repentance. Being convinced of the truth of the gospel of health was not enough; it had to lead to behavior modification and lifestyle changes.
Health reformers employed the biblical language of sin and redemption, in part, simply because of who they were. They were products of the Judeo-Christian tradition, and, as such, this language came naturally to them. But there were other—less spontaneous but more practical—reasons for expressing their message in such a manner. The storytellers of the natural health movement knew their audience. They understood that they were preaching the gospel of health to the most religious country in the industrialized world; to a country where prayer is common and church attendance remains high; to a country shaped by centuries of Protestantism; to a country where many still believe in Good and Evil, angels and demons, salvation and damnation, Heaven and Hell. They understood that by and large, in America, Protestant theology is optional, but Protestant psychology is not. Even so, health reformers departed from the mainstream American Protestant tradition in one critical way: they embraced a secular form of perfectionism.
Like most Protestants, health reformers posited the existence of a fateful Fall from Grace: it was, for them, the Fall into the industrialized world of machines, pesticides, lazy living, obesity, pollution, white sugar, cancer, and Wonderbread: “What’s going on is that we are all becoming exiles from nature. Not being driven out of our native land by a cruel tyrant, but simply walking away of our own volition, enticed by nothing more than cuteness and convenience,” wrote one health reformer in 1978. “In fact,” he added, “the biggest threat to our health and well-being today is this gradual shift from a natural lifestyle to a technological, synthetic—call it phony if you will—lifestyle. It’s constricting our arteries and our spirits. It’s giving us depression and diabetes. It’s making us passive, fat, bored and lonely. It’s making us allergic, addicted, and malcontent.” Unlike the Fall described in the Book of Genesis, the Fall described by health reformers was reversible. The evil brought into the world by modernity could be exorcised. The gates to the Garden of Eden could be reopened. Human nature was not hopelessly flawed. One could retrain the body to crave that which was healthy.
If the ideology of natural health was a sort of Protestantism, it was a peculiar Protestantism wherein God was optional, The Fall was reversible, and original sin was altogether absent. These were major breaks with the Augustinian worldview that has informed mainstream Protestantism since the Mayflower. Health reformers departed from the Augustinian tradition decisively when they rejected original sin and all of its secular analogs. The intolerance towards human frailty that has so characterized the thought of twentieth-century health reformers can be traced back to this root cause. Health reformers have as a rule failed to appreciate their own limitations, just as they have failed to account for the limitations of the human beings who they have judged so harshly. As we shall see, the natural health movement was liberating and undeniably empowering for many Americans, especially women. Even so, it gave rise to a new orthodoxy, with a decidedly unforgiving approach towards aging, mothering, and disease. As one alternative medicine provider put it in 1977: “You’re going to have to take the blame for everything once you get your body back.” In health-conscious circles across America, tragedies such as cancer, heart disease, depression, schizophrenia, crib death and miscarriage were redefined as punishments meted out to those who failed to obey the natural laws of health.
Between 1970 and 2016, millions of Americans embraced concerns that were once the exclusive province of a quirky subculture: health-food stores and health clubs proliferated; anti-smoking campaigns won astounding victories; vegetarianism and breastfeeding became much more common—and the demand for organic food, vitamin and mineral supplements, water filters, exercise gear and alternative health care created multi-million-dollar industries. Pre-1970 America had its fair share of health nuts, exercise gurus, vegetarians, anti-smoking activists, organic farmers, and alternative health-care providers. Yet even the most widespread of these health enthusiasms never affected anything more than a small percentage of the American population. Twentieth-century health reformers succeeded in doing what generations of reformers before them had failed to do: they broke through to the masses and helped define mainstream American attitudes toward food, health, and disease. “History,” observed the authors of Panic in the Pantry (1975), “is full of food fads. But our current preoccupation with organic and additive-free food has assumed truly unprecedented dimensions. Sylvester Graham would have been pleased—and probably a bit envious.”
“there was a mighty tempest in the sea, so that the ship was like to be broken. . . . the mariners were afraid . . . . And they said every one to his fellow, Come, and let us cast lots, that we may know for whose cause this evil is upon us. . . . the lot fell upon Jonah. . . . So they took up Jonah, and cast him forth into the sea:”—Jonah 1:4-15 (King James Version)
Magic fish aside, what strikes the modern reader as odd about the Jonah story is the mariners’ unthinking assumption: that human behavior was to blame for the storm. The idea that the weather could be altogether indifferent to their welfare was foreign to these ancient mariners. The storm had to have a meaning because they inhabited a world that was in every respect meaningful, a world without mere coincidence, a world that was yet to be disenchanted. A romantic longing for that meaningful, enchanted world—where bad things only happen to bad people—has been a hallmark of health-conscious America for well over a century.
In The Laws of Health (1857), for example, nineteenth-century health reformer William Alcott declared that there were at least two things that we could all be sure of: namely, “that, if the wicked do not live out half their days, it is because of their wickedness”—and, that “if the infirmities of age come upon us, it is because we have disobeyed, either intentionally or ignorantly, the Divine laws.” Alcott was here articulating one of his Judeo-Christian culture’s most fundamental assumptions. Philosopher Susan Sontag correctly stressed that in “the world envisaged by Judaism and Christianity, there are no free-standing arbitrary events. All events are part of the plan of a just, good, providential deity . . . . Every disaster or calamity must be seen either as leading to a greater good or else as just and adequate punishment fully merited by the sufferer.” “Sicknesse comes not by hap or chance,” as one seventeenth-century New England Puritan put it, “but from mans wickednesse.”
It took the West centuries to move away from this moralistic Judeo-Christian worldview toward a scientific one that recognizes the often accidental and arbitrary nature of human suffering. Anthropologist Lucien Lévy-Bruhl maintained that “our distinctive achievement,” as moderns, “was to invent the idea of natural death and actually believe in it.” For Lévy-Bruhl, “the defining feature of primitive mentality is to try to nail a cause for every misfortune; and the defining feature of modernity, to forbear to ask.” Continuing this thought, Mary Douglas and Aaron Wildavsky emphasized that the “concept of the accident rate and the normal chances of incurring disease belongs to the modern, scientific way of thinking. Faced by statistical averages there is no point in my asking why a particular illness should have struck me.” Douglas and Wildavsky suggested that had Lévy-Bruhl lived to see the 1970s, he would have been astonished to see moderns “asking those famous primitive questions as if there were no such thing as natural death, no purely physical facts, no regular accident rates, no normal incidence of death.” In health-conscious America, the older way of thinking—“the primitive mentality”—made a spirited comeback in the twentieth century.
The Meaning of Death in a World without Chance
Humans are probably the only creatures that can see death coming for them, so to speak, years before it arrives. The human brain has evolved a capacity for foresight that is, to the best of our knowledge, unrivaled in the animal kingdom. We can anticipate problems and opportunities long before they happen and plan accordingly. In some parts of the world, for instance, people plant and tend trees whose fruit will be tasted only by their grandchildren. Foresight is an evolutionary adaptation that has given the human species a tremendous competitive advantage. Even so, foresight has its costs. Humans are perhaps the only intelligent animals that fret about death when they are in perfect health and safety. Most people fear death. Often, they also fear the mental and physical decline that so frequently precedes it. Alternative health-care providers have consistently taken advantage of these fears. They have informed people that decline is not inevitable and that the human lifespan can be vastly extended.
William Alcott went so far as to claim that the mortal life of a human being need not ever come to an end. All death was failure, as far as he was concerned, even that of the famously long-lived patriarch Methuselah, who, according to the Book of Genesis, met his end at the ripe old age of 969. Alcott maintained that if “Methuselah suffered from what we call the infirmities of age, it was his own fault. God, his Creator, never intended it. The very common belief,” added Alcott, “that old age must necessarily bring with it infirmities, besides being a great mistake, reflects dishonor to God.” Twentieth-century health reformers were, for the most part, considerably more reasonable than Alcott. Still, they set unrealistic longevity standards that few people, if any, have been able to achieve. For instance, Gayelord Hauser (1895-1984), bestselling author of Look Younger, Live Longer (1950), claimed that if we took care of ourselves we could all live to be 140 years old. Adolphus Hohensee (1901-1967), always prone to hyperbole, believed 180 to be more accurate. Prevention writers, with relative restraint, usually pegged the “normal” human lifespan at 120. Proclamations such as these gave graying Americans a kind of hope that doctors could not in good conscience provide.
Harvard nutritionists Elizabeth Whelan and Fredrick Stare maintained that licensed medical professionals were in this instance, as in so many others, competing at a distinct disadvantage. As men and women of science they were obliged to tell their ageing patients the unpleasant truth: namely, that some diseases thus far cannot be cured and some suffering cannot yet be alleviated; “that popping vitamin pills every few hours, or avoiding white bread and refined sugar, is not going to either cure or prevent degenerative disease;” that “there simply are no wonder supplements or magic potions;” and that until “medical science advances far beyond its present state, humans will necessarily continue to die from something.” Although reasonable, this analysis proved a bitter pill that many Americans refused to swallow. Instead, more and more people came to believe that one could bargain with death. They were encouraged in this belief by the remarkable promises made by health reformers.
For better or for worse, the ideology of natural health has permeated virtually every facet of American society in the last four decades: health-food stores and health clubs have proliferated; anti-smoking campaigns have won astounding victories; breastfeeding and vegetarianism have become much more common; and the demand for organic food, herbal remedies, vitamin and mineral supplements, exercise gear and alternative health care has created massive industries. The ideology has resonated particularly well in the United States because its emphasis upon individual responsibility is, at bottom, largely a secular restatement of deeply-rooted Judeo-Christian assumptions about the meaning of suffering and the capacity for choice. Health gurus such as Jerome Rodale, Adelle Davis, Carlton Fredericks, and the editors of Prevention—“America’s Leading Health Magazine”—promised much to health-conscious Americans. They maintained, for instance, that ageing—a human experience so thoroughly fraught with danger and uncertainty—could be controlled by the right mixture of vitamins, exercise, organic food, dietary restrictions, and positive thinking. Although the natural health movement provided new choices and a sense of self-mastery to many, especially women, its success has spread a new orthodoxy across America, with a harsh and unforgiving approach toward ageing, obesity, motherhood, disease and death. Health reformers such as Robert Rodale helped redefine tragedies such as cancer, heart disease, depression, schizophrenia, crib death and miscarriage as punishments meted out to those who failed to obey the natural laws of health. They promised to free the American people from the tyranny of Western Medicine. Yet they replaced Doctor God with an equally demanding deity: Mother Nature.
When Bad Things Happen to Good People
Every so often, the promises propounded by the preachers of prevention proved problematic. Freshly published results based on a well-designed scientific study might, for instance, demonstrate that a beloved supplement like vitamin C or echinacea does not, in fact, cure the common cold. Still, nothing has proven more problematic than the death of a leader. The death of Jim Fixx is a case in point. In the early 1980s, Jim Fixx was the most well-known fitness promoter in America. His best-known work, The Complete Book of Running (1977), broke numerous records for non-fiction sales. More than perhaps anyone else, Fixx popularized running as sport, religion, and lifestyle. His sinewy physique was legendary. And his superior health was assumed. Indeed, that is why his early death came as such a shock. Fixx was running along a quiet tree-lined street in Vermont on a sunny July day in 1984 when he dropped dead of a heart attack. He was just 52 years old. Remarking upon the intrinsic irony of Fixx’s death proved irresistible for irreverent late-night comedians such as Denis Leary. In No Cure for Cancer (1992), Leary maintained that Fixx’s death was a refutation of the health-conscious lifestyle. Health enthusiasts were always telling Americans, he claimed, that if they would only give up their bad habits and replace them with good ones they could add an extra ten or twenty years to their lives. “Hey,” Leary retorted, “I got two words for you, okay. Jim Fixx. Remember Jim Fixx? The big famous jogging guy? Jogged fifteen miles a day. Did a jogging book. Did a jogging video. Dropped dead of a heart attack. When? When he was fucking jogging, that’s when!” In 2004, twenty years after Fixx’s death, famed running instructor Hal Higdon posted an editorial on his website that took issue with those critics of long-distance running who characterized Fixx’s death as an indictment of the sport. Jim Fixx’s father, Higdon observed, died of a heart attack when he was just 43 years old. 
Had he not taken up running when he did, it is likely, Higdon argued, that Jim Fixx would have died of a heart attack at 43, too. Running, he maintained, probably added nine years to Fixx’s life. In 1987, three years after Fixx’s death, health writer Carlton Fredericks, famous for his syndicated radio show “Good Health,” also died of a heart attack. He was 76 years old, which is not a particularly good score for a health reformer. But it was an open secret by then that he had been smoking heavily on the sly for years, so his death was fairly easy to explain away. The death of Prevention magazine’s founder Jerome Rodale was, by contrast, much more problematic.
Jerome Rodale once argued that all of the teachings of health guru Horace Fletcher (1849-1919) were inherently suspect because of his untimely demise. Fletcher, famous for his advocacy of extreme mastication, was 70 years old when he died. “To be proof of his system,” Rodale maintained, “he should not have died before 90.” Thinking along similar lines, Prevention‘s executive editor claimed in 1974 that the teachings of 80-year-old health promoter Gayelord Hauser were to be heeded because his longevity and vitality were “a superb testimonial to the value of the nutritional principles he has been writing and lecturing about for over 50 years.” “If more of us followed his simple advice,” he added, “we might also find ourselves as ageless and vital as Gayelord Hauser.” Jerome Rodale said that he intended to live until the ripe old age of 102, so that he could say that he had lived in three different centuries (he was born in 1898). He insisted that his healthy lifestyle would allow him to achieve this goal. Alas, he suffered a massive heart attack, in 1971, while he was being interviewed on the Dick Cavett Show.
Though the episode was never aired, Cavett recalls that Rodale was a splendid guest: “He was extremely funny for half an hour, talking about health foods, and as a friendly gesture he offered me some of his special asparagus, boiled in urine. I think I said, ‘Anybody’s we know?’ while making a mental note to have him back.” Among the many remarkable things that Rodale said on the show, Cavett remembers these as the most startling: “’I’m in such good health . . . that I fell down a long flight of stairs yesterday and I laughed all the way.’ ‘I’ve decided to live to be a hundred.’ And the inevitable ‘I’ve never felt better in my life!’” Rodale slumped over slightly to one side while Cavett was interviewing another guest—Pete Hamill, a columnist for The New York Post. Initially, Cavett thought that Rodale had fallen asleep. With characteristic wit, Cavett exclaimed: “Are we boring you, Mr. Rodale?” The studio audience burst into laughter. Of course it soon became clear that Rodale was not sleeping. Like Robert Atkins, Jerome Rodale was 72 years old when he died. His passing sent shock waves through health-conscious America. “Rodale,” as one panicked follower put it, “has ruined the health-food industry by dying.”
The death of Adelle Davis three years later was even more problematic. At the time of her death, Davis was by far the most well-known health guru in America. As with Jim Fixx, her superior health was assumed. Even so, Davis died of bone cancer at the age of 70—in 1974, when the life expectancy of a white woman in America was 76.7 years. Mean-spirited critics were quick to note that not only did Davis fail to improve upon her expected longevity; she barely even made it into her seventies. Detractors of the natural health movement had a field day with the deaths of Jerome Rodale and Adelle Davis. These deaths were a public relations disaster that required a great deal of energy to explain away. In much the same way that Hal Higdon defended Jim Fixx in 2004, defenders of Jerome Rodale insisted in the wake of his death that he had in fact lived a virtuous life, but that his congenitally weak heart could only take him so far. The argument had merit. Rodale’s father, Michael Cohen, died of a heart attack at fifty-one. Eerily, his oldest brother, Archie Cohen, also died of a heart attack at fifty-one. Cardiac arrest claimed his brother Solomon at sixty-two, his brother Joe at fifty-six, his sister Tina at sixty-four, and his sister Sally at fifty-eight. Given this dismal family history, Prevention writers maintained that Rodale would in all likelihood have died in his early fifties, had he not taken such good care of himself. He had added twenty years to his life. This was cause for celebration. Rodale’s death at seventy-two, they argued, was nothing that the health conscious need be ashamed of.
Explaining Adelle Davis’s death was more difficult. She had died at the relatively early age of seventy, more than six years below the national average. How could someone who lived such a healthy life develop bone cancer? Davis was initially shocked when she received the diagnosis in 1973. But she soon came up with an explanation that left her belief system intact. True to form, Davis blamed herself. Her cancer had come about, she argued, as a result of two important lapses in judgement. The first was acquiescing in numerous x-rays over the years. Insurance companies required them for periodic examinations, but, she reasoned, they were carcinogenic and she should have known better. Davis claimed that the second transgression took place long ago in her youth. She maintained that she had eaten well on the Indiana farm where she grew up, but that she had turned to junk foods when she left home for college. She had continued upon this nutritionally unsound path for much of her twenties. She insisted that her life since then had been thoroughly virtuous. But clearly, she lamented, the damage was already done. Davis concluded that she was now paying for the sins of her youth. Prevention magazine accepted her analysis and praised “the great lady of the natural health movement” for battling it out better than most. Even in death, Adelle Davis upheld the principle of personal responsibility, fittingly, since she had been among the health reformers most insistent on placing that burden on individuals. As luck would have it, health-conscious America’s nemesis Fredrick John Stare managed to outlive all of his lifelong enemies. He died in 2002 at the ripe old age of 91.
Uncertainty and the Moral Imagination
A healthy respect for uncertainty has been conspicuously absent in health-conscious America. And it is here, I think, that the psychological origins of the movement’s heartlessness are to be found. To put it plainly: If you don’t believe in luck, you probably don’t believe in compassion either. True compassion stems from an awareness of your own limitations, and from a careful assessment of the limitations of the person you wish to judge; it stems, as well, from an honest appreciation of the good fortune that has helped you achieve whatever it is that you have achieved. “To respond with compassion,” philosopher Martha Nussbaum rightly observes, “I must be willing to entertain the thought that this suffering person might be me”; viz., compassion requires “a sense of one’s own vulnerability to misfortune.” Thinking along similar lines, Jean-Jacques Rousseau argues, in Émile (1762), that the moral imagination of a child ought to be shaped by an awareness of “the vicissitudes of fortune”: “Make him understand well that the fate of these unhappy men can be his . . . . Unsettle and frighten his imagination with the perils by which every man is constantly surrounded.” Only thus, avers Rousseau, can a man be made humane. Yet this is precisely the kind of moral reasoning that has been consistently opposed by the leaders of the natural health movement.
Luminaries like Adelle Davis and the Rodales claimed, time and again, that good health is not a matter of luck or fate; it is a decision—a decision made by self-disciplined individuals. A healthy physique is, in health-conscious circles, an accurate indicator of a person’s moral worth. Conversely, a diseased body is seen as fundamentally aberrant; “it is foolish to be ill,” thunders Adolphus Hohensee. “We can have health,” declared another, “or we can have disease. It’s all up to us.” Of course it’s not all up to us. Things fall apart. People get sick and die, often for no apparent reason. The world we inhabit can be an unpredictable place. This horrifies most of us. So we ignore it when we can, and deny the evidence of experience when we cannot. When all else fails, we embrace illusions of total control. Our current obsession with health and wellness is just one of those illusions. There have been others in the past, and so long as our desire to control the capricious revolutions of the Wheel of Fortune remains intact, there will be others in the future.
In The Future of an Illusion (1927), Freud argues that our colossal attempts to make sense of suffering and death are ultimately fueled by a childish fear of growing up. We are afraid of leaving the home where we “felt so warm and cozy.” We do not wish to come to terms with the arbitrary nature of existence. Illusions fulfill this wish. They coddle us, stunt our psychological growth, and allow us to prolong our childish fantasies well into adulthood. All of this renders us, at least as far as Freud is concerned, pretty pathetic: “A person cannot remain a child for ever; eventually the child must go out into what has been called ‘hostile life’. The process might be termed ‘education for reality’.” The resigned realists of the future would, Freud hoped, resolutely reject all infantilizing illusions. Like St. Paul, they would “put away childish things.” This widespread cultural maturation would lead, in turn, to the dawning of a more enlightened age led by sober scientists. But alas, the somber souls that Freud idealized remain—especially in the United States—little more than a melancholy morbid minority. Freud’s adults cannot help but feel like misfits and outsiders in 21st-century America. Even in these difficult economic times, their pessimistic worldview places them at odds with the morals and mores of the mainstream. The same could manifestly not be said about health-conscious Americans. Their ideas are, at present, a central feature of our shared experience. Between 1970 and 2015, millions of Americans embraced concerns that were once the exclusive province of a quirky subculture. Visit them if you like: they’re there, right now, on the blessèd isles of health-conscious America, quixotically asserting their freedom over fate and fortune. Freud would be horrified by their optimism. Rousseau would be horrified by their lack of compassion.
“You’re going to have to take the blame for everything once you get your body back.”
—John Feltman, Prevention: The Magazine for Better Health (July 1977)
In the 1970s, the success of the natural health movement spread a new orthodoxy across North America, with an unforgiving approach to motherhood. Countercultural health gurus like Adelle Davis helped redefine tragedies such as crib death and miscarriage as punishments meted out to mothers who failed to obey the natural laws of health. They promised to free modern women from the tyranny of Western Medicine. Yet they replaced Doctor God with an equally demanding deity: Mother Nature.
If you want to know where we went wrong, Adelle Davis’s bestselling advice book Let’s Have Healthy Children (1951) is a great place to start. Written “primarily for the expectant mother,” Let’s Have Healthy Children is an extended diatribe against American mothers, who, Davis inveighed, “seem to have shifted the responsibility for their children’s health entirely onto the shoulders of these physicians.” She insisted that this error threatened the very strength of the American nation on the world stage. It had to be stamped out; “the responsibility for the infant’s health must again be shouldered by the mother.” The truth, argued Davis, was that “every woman, by her choice of foods before and during pregnancy, largely determines the type of baby she will produce.” The crux of her message to mothers was clear and unambiguous: “The responsibility is yours.”
Failing to heed Davis’s call could prove disastrous. In one cautionary tale, she told the story of Margaret: “She was thrilled to be pregnant again but had refused to eat intelligently. More than once I had tried to get her to improve her diet. ‘Phooey on that stuff,’ she would answer gaily. ‘I have two beautiful children, and I ate anything I wanted when I was pregnant with them.’” Margaret’s luck ran out a couple of months later: “In her seventh month,” recounted Davis, “she developed toxemia. Her baby was born dead, and she was frightfully ill. If she and other pregnant women could know more about nutrition and recognize the danger warnings, this tragedy and thousands like it could be avoided.”
Let’s Have Healthy Children, despite its cheerfully nonchalant title, is preachy, highly prescriptive, and even, at times, downright angry. A more accurate title might have been: You’d Better Have Healthy Children! Davis addressed herself directly to mothers in a harsh, insulting manner, completely devoid of warmth or compassion. It is inconceivable to me how any mother could have read this book and not come away feeling grossly inadequate. Davis proffered a perfectionist ideal that was as unrealistic as it was unattainable. She argued that anything less than an easy pregnancy, a flawless birthing experience, and a perfect child was completely unacceptable and inexcusable. Davis maintained that a healthy newborn could be “expected to meet the following conditions: Be perfectly formed without defects. . . . Sleep soundly. . . . Cry little. . . . During the first year, he continues to sleep soundly, cry little, and eat with a good appetite.” The healthy child was, she avowed, a happy child: “he smiles early, [and] laughs aloud by the age of six months; his tears are rare and of short duration; he is neither irritable nor whiny, but relaxed and happy.” He should also, averred Davis, be free of all the supposedly normal afflictions of infancy. Included in her exhaustive list of “abnormalities” were cradle cap, colic, diaper rash, diarrhea, constipation, colds, infections, allergies, eczema, indigestion, vomiting, smelly stools, and thrush. “At no time,” she added, “has he needed or been given antibiotics, aspirin, tranquilizers, or drugs of any kind.” The number of ways that each reader’s child deviated from this standard, Davis maintained, made manifest the degree to which she had failed as a mother.
At first glance it is difficult to understand why anybody other than a masochist would read Let’s Have Healthy Children from cover to cover. Yet upon further consideration it becomes clear that Adelle Davis’s effectiveness as an author of prescriptive literature stemmed precisely from her harsh tone, and, perhaps more importantly, from her adroit assessment of the type of women who comprised her target audience. All mothers worry about the efficacy of their mothering to some extent. But for the most part it has been educated middle-class women with adequate leisure time who have consulted advice literature. When a woman buys and reads a “how to” book on mothering she enters a self-selected group. She has indicated by that very act that she is open to new and perhaps unorthodox ideas. She probably lacks confidence in her mother’s advice and would like to distance herself from her mother’s parenting style. She believes truth is often to be found in books. And she does not think that she is adequately prepared for what is to come. Let’s Have Healthy Children played on all of these insecurities. Yet at the same time Davis’s book was also empowering. For it argued that the power to create a perfect child was in every woman’s hands. A happy result could be guaranteed by the right kind of behavior. Pregnancy, childbirth, and child rearing—experiences so fraught with danger, uncertainty, and the unknown—could be controlled. Adelle Davis’s message was in this respect comforting. Control is what people who are attracted to the health-conscious lifestyle are looking for, even if its costs are high, even if it means that they will have to shoulder an awesome weight of responsibility. A sense of control is precisely what Adelle Davis gave her readers.
The editors of Prevention magazine published many of the letters that they received in a section entitled “Mailbag,” which often engulfed a goodly portion of the magazine. The lion’s share of these letters consisted of testimonials, such as the following from Janet Stensel, a woman who had recently given birth to her first child: “All during my pregnancy I followed Prevention’s suggestions for a safe, comfortable pregnancy and healthy baby. I avoided caffeine, alcohol, diuretics, common household drugs and junk foods. I added good sources of protein to my meals along with prenatal supplements . . . .” Moreover, alleged Stensel, “I faithfully practiced exercises to strengthen my back muscles.” She insisted that the “benefits reaped from this regimen were tremendous.” “I had,” she claimed, “enough stamina to work full time . . . right up until the day before the delivery. And I still had enough energy to go home and bake my own recipes for cheese breads and wheatgerm muffins!” Apparently, halcyon days persisted to the end: “Early one Monday morning last November, my uncomplicated speedy labor and delivery . . . produced our first child—a healthy 6½-pound son.” Declared another proud parent, who had adhered to a similarly virtuous regimen: “I sailed through my pregnancy starting with no morning sickness at all and ending with a very smooth labor and delivery.” “My labor and delivery [was] a breeze, too,” crowed yet another. “In fact, the whole thing took only three hours. And just 10 days after my daughter was born I was out jogging.” The didactic function of letters of this kind was to promote and normalize precisely that kind of ideal birthing experience that Adelle Davis described in Let’s Have Healthy Children. 
“In my opinion,” Davis groused, “labor should be measured in minutes, not in hours; and prolonged labor is typical of women whose diets have been inadequate during pregnancy.” If an expectant mother was doing everything right, Davis maintained, she should not experience any nausea, nor, in fact, should she develop varicose veins, hemorrhoids, leg cramps, exhaustion, or stretchmarks. Contrary to popular belief, she insisted that these were not normal side-effects of pregnancy; they were all “abnormalities” that could be avoided by virtuous behavior.
With labor, delivery, and nine months of pregnancy behind her, the health-conscious mother could now, assuming all went well, gaze into her child’s lovely eyes and breathe a sigh of relief. But not for long. The woman would soon realize that she had merely left one exacting jurisdiction and entered another. Choices of monumental importance lay ahead. For instance, health gurus insisted that her toddler could not consume any commercially-canned baby food. Instead, she would have to make her own mash from high-quality organic ingredients. As the child grew older, the very same authorities claimed that she would have to be sure to give him or her plenty of vitamin and mineral supplements. Still, the new mother’s most important decision was immediate: to breastfeed or not to breastfeed?
“Every mother,” Adolphus Hohensee maintained, “who resorts to artificial feeding when she is capable of nursing her child will be held responsible . . . on judgement day.” Health reformers have never wavered in their denunciation of bottle feeding. “To say it is deplorable that countless mothers will not nurse their offspring is putting it mildly,” declared Hohensee. “These mothers are lacking in some of the essential qualities of motherhood.” In health-conscious circles, the good mother was the mother who breastfed her children—she was a mother who understood, in the words of one Prevention writer, that “breast milk is God’s gift to babies.” The zealous founders of La Leche League—a breastfeeding advocacy organization—argued, as did Carlton Fredericks and Mark Bricklin, that it was fundamentally irresponsible to bottle feed a baby. They claimed that the mother who chose to bottle feed consigned her child to a lifetime of unnecessary suffering. Her child was, they alleged, more likely to develop a weight problem later on in life, as well as allergies, asthma, eczema, learning disabilities, and a host of other medical conditions.
In the short term, the health conscious maintained, in the words of one Prevention staffer, that “bottle-fed babies are far more likely to be victims of crib death than are breast-fed babies.” “Many of the unexplained ‘crib deaths’ have been attributed to cow’s milk,” asserted Prevention in 1971. “Mother’s milk contains food factors that are designed specifically for the human baby,” the article continued. “At a critical time, their presence could spell the difference between life and death.” “The number of infants,” read another Prevention article published in the same year, “who actually owe their lives to breastfeeding is probably quite high.” As usual, Adelle Davis took the most unequivocal position. “Crib death,” she insisted, “which takes the lives of some 20,000 seemingly healthy infants each year, does not occur among babies who are breast-fed.”
In the long term, even if a bottle-fed baby managed to survive infancy, Davis maintained, he or she would never be as happy, warm, friendly, or emotionally stable as a breastfed baby. She claimed that a wide variety of deviant behaviors could be traced back to bottle feeding, a practice that “often causes compulsive eating, drinking, and smoking, and results in obesity and alcoholism. It can take the form of psychosomatic illnesses such as arthritis or asthma,” she added. “Certainly,” Davis continued, “it plays a role in such social problems as crime and drug addiction. There are, for example, fewer child delinquents among children who have been breast-fed than among bottle-fed ones.” The damage done to those who were deprived of the breast as youngsters, she argued, manifested itself in adulthood in dysfunctional interpersonal relationships. Adults who were bottle fed as children found it hard, Davis alleged, to “give and receive warmth and love.” “The child,” she reasoned, “who has not been nursed unconsciously feels a lifelong rejection, knows less security, and has more difficulty in adjusting as a social being. He unconsciously harbors hostilities toward his mother which prevent a close relationship between the two.”
Bottle-fed babies tend to be ugly, declared Davis, in the twenty-fifth chapter of Let’s Have Healthy Children, aptly titled “Your Child Has the Right to be Beautiful.” “Children should be beautiful,” she insisted. But this was so often not the case in the United States: “I have been repeatedly impressed by the gorgeous, smiling babies seen in every European country and in the Orient. Almost without exception each child is beautiful. In contrast, the pinched, pale, unsmiling, narrow faces of American babies break my heart. In all other countries except ours, most babies are nursed.” This “tragedy of ugliness” was, Davis maintained, much more serious than was usually recognized. “Almost every child hates himself if he is not reasonably attractive.” Mothers had an obligation, therefore, to raise attractive children “with superb minds and beautiful bodies,” she wrote. “The goal outlined here,” she observed, “is no higher than the standard of perfection farmers and pet breeders expect of their animals.” Besides, beautiful children were, at bottom, a testament to virtuous mothering: “As these superior children go into the community and eventually into the world at large, their straight bodies, excellent bone structure, attractive appearance, athletic prowess, mental alertness, and quick grasp of social needs are all advertisements of your own efficacy as parents.” Permissive mothers who allowed their children to eat junk food could, she contended, expect a lifetime of resentment from their adult children. Referring to a woman she knew, who allowed her son to eat junk food, Davis wrote: “the time would come when this boy would hate his mother for allowing him to grow up with a nearly deformed body.”
Davis claimed her beautiful daughter was a perfect example of what American babies could look like if American mothers breastfed their children. Let’s Have Healthy Children included a number of photographs of her daughter as a baby, as a toddler, and as a high-school graduate. Her daughter was, quite clearly, an adorable baby, a cute toddler, and a beautiful young woman. Davis took credit for it. “A mother,” she argued, “largely determines whether her children will be beautiful or homely, depending on the adequacy of the diet during the first few months.” Still, beautiful babies had to be properly nourished after birth, too. “When nutrients are deficient or poorly absorbed,” wrote Davis, “a baby who had been beautiful at birth often becomes homely by the time he is three to six months old; and once allowed to develop, this homeliness remains throughout his entire lifetime.”
Prevention writers maintained that the bottle-fed baby was more likely to be an underachiever in the dog-eat-dog world of work. Indeed, failure to succeed in the capitalist marketplace was often, they chided, the result of motherly neglect. The poverty that was a necessary concomitant of this failure was therefore also, they deduced, the mother’s fault. “Poor nutrition,” declared Joan Jennings, “makes poor brains and poor brains make poor people.” In one fascinating Prevention fable—“Only Well-Nourished Babies Achieve Their Potential”—this line of reasoning was taken to some decidedly illiberal conclusions. The article tells the tale of two boys named Paul and Mike who grew up together as close friends. We learn that, as adults, they “are still good friends, but Mike is now a clerk and Paul is his supervisor.” “Even though the two men have practically everything in common,” the narrator claims that “Paul was able to develop himself more fully than Mike.” The unnamed storyteller insists that the difference between the two men has “nothing to do with any negligence on Mike’s part, for it stems from something that happened long before Mike had reached the use of reason. Mike’s mother took the option,” we are told, “recommended by her pediatrician of feeding her infant a formula, while Paul’s mother followed her instincts and breast-fed him.” Although Mike seems perfectly fine to those around him—“Mike wasn’t retarded. He never failed a grade in school, and he has lived a normal life.”—the fabler insists that appearances are, in this instance, deceiving. The mistakes that Mike’s mother made long ago have, in fact, had dire consequences: Mike is paying the price for his mother’s woeful lack of judgement each and every day of his boring, mediocre life.
Had young Mike been “breast-fed when the only things he knew were eating, sleeping and crying,” the raconteur confidently assures us, “the additional nutrient elements that were intended for him by nature would have laid a foundation for a richer and more satisfying life.” Apparently, from time to time Mike wonders “why he is working for Paul instead of with him.” “He doesn’t realize,” the narrator solemnly declares, “that the decision was made for him a long time ago.” This stunningly didactic piece concludes with a question, posed directly to the expectant mother who is, presumably, still weighing her options: “Why be the one responsible for making your child wonder why he is the clerk and his friend is the supervisor?” “Most people,” lamented Adelle Davis, “believe that intelligence is largely inherited and if a child fails to be smart, it is because he had the wrong ancestors.” Nothing, she claimed, could be further from the truth. Stupid children, she insisted, are the result of bad mothers, and they are a constant drain upon American society. “The cost to our nation of supporting malformed persons is staggering.”
The natural health movement seemed so naturally allied with feminism in the 1970s. Health reformers shrewdly diagnosed the gendered nature of conventional medicine, time and again, and they promised women freedom from the oppression of patriarchal professionals. Yet health reformers forged new forms of oppression that were often more onerous than those that they replaced. As many women discovered, much to their chagrin, striving to be an Earth Mother Goddess wasn’t particularly liberating. If the biomedical model of health disenfranchised the female patient, took away all of her responsibility, and placed it in the hands of a patriarchal male doctor, the natural health movement’s backlash against that model swung the pendulum all the way in the opposite direction. Women became personally responsible for every aspect of their child’s fate. If a woman had a difficult pregnancy, a miscarriage, a complicated delivery, or a child born with birth defects, the assumption among the health conscious was that she must have done something wrong. Perhaps she had a glass of wine at Christmas or forgot to take her prenatal vitamins. Chance and bad luck had little place in this worldview. If something went awry, anything, there was a reason; someone was guilty, and that someone was almost always a woman. We’ve replaced Doctor God with an equally demanding deity: Mother Nature.