This article was originally published in the January/February 2012 issue of Unity Magazine, and is reproduced here with permission.
Atheism isn’t really new. It’s as old as the idea of God itself. At the dawn of history, the first time someone said “there is a God,” the guy standing next to him said “no there isn’t.” And we’ve been arguing about it ever since.
In the ten years since 9/11, a raft of writers have published best-selling books championing the well-worn idea that God is an invention of our overactive collective imagination, an invention humanity would be a lot better off without.
At the head of the pack of the so-called New Atheists is Richard Dawkins, whose book The God Delusion, published in 2006, spent 51 weeks on the New York Times best-seller list and has since sold over 2 million copies worldwide. Dawkins is an evolutionary biologist with little patience for any truth-claim that cannot be supported by empirical evidence. For him, belief in the virgin birth, creationism and the existence of an invisible cosmic overlord is utterly groundless, and worse – “Religion,” said Dawkins in a recent New York Times interview, “teaches you to be satisfied with non-answers.” In other words, religion makes us stupid.
Dawkins is not alone in his critique of the traditional Judeo-Christian-Islamic God. He joins a brilliant and esteemed list of philosophers including Hume, Sartre, Camus, Schopenhauer, Nietzsche, Nagarjuna, Mill, Chomsky, Santayana and Foucault.
Other famous atheists range from the not at all surprising (Joseph Stalin, Karl Marx, Sigmund Freud) to the unexpected (Bill Gates, Thomas Edison, Helen Keller). Thoughtful, inventive, creative and courageous people throughout history have, sometimes at great personal and professional risk, dared to question the central paradigm of Western civilization – that the God of Abraham, Moses, Jesus and Muhammad is real.
But atheism doesn’t just ask questions – it asserts answers. By making a specific truth claim, namely that there is no God, atheism is vulnerable to the same criticism it levels against theism. Whether you claim there is a God or not, you still have to supply evidence to support your claim and present that evidence in a framework we can all accept. The devil is always in the details.
Where Dawkins’s brand of atheism falls short is in its misjudgment of the human capacity to know. For Dawkins, religion is a failed science – a science utterly without evidence or sound hypotheses. What Dawkins is unwilling to consider is the possibility that religion and science do not share a common epistemology. The process by which one establishes knowledge or certainty in science is utterly different from the process by which one establishes knowledge or certainty in religion. Scientific certainty is founded solely on empirical, that is, sensory evidence, whereas religious conviction is founded on externally unverifiable inner experience. Religious claims are therefore prone to a host of criticisms from an empirical epistemological stance. To scientists like Dawkins, religion is nothing more than a long list of misunderstandings amplified through time and concretized by tradition. Gone from even the realm of consideration is the possibility that there are ways of apprehending reality other than through sensory data and conceptual thought. What if non-sensory awareness or direct, unmediated experience carries its own epistemological weight? As Native American philosopher Vine Deloria puts it, “We may misunderstand, but we do not misexperience.” Learning to humbly trust the authority of our own inner awareness gives birth to an epistemology unbound by mere intellect and the limiting mechanics of logic.
Ironically, atheism does religion a great favor by laying bare the absurdities inherent in any attempt to conceptualize the ground of being. If the formless ground of being that we commonly personify as God is the source of all reality (including our conceptual minds), then of course any mere concept of God falls woefully short of the reality it purports to describe, leaving all such concepts susceptible to ridicule.
Whether we like Dawkins’s conclusion or not, any thinking person understands and appreciates the urgent importance of his inquiry. Throughout history, the God idea has done as much harm as good. Religious wars, oppression, conquests and crusades have left us battered and bloodied. Given the rise in popularity of atheism in the post-9/11 world, it is clear that a great number of people are frustrated by religion, especially fundamentalism in all its many forms. Atheists like Dawkins capture a wide audience because they deftly skewer outdated and outmoded God-concepts that never really worked anyway. In other words, the God-concept attacked by atheism is a God-concept many of us have already left behind – the angry, judgmental, anthropomorphic God (think Michelangelo’s Sistine Chapel) who commands unquestioning obedience to an endless list of confusing and often conflicting dictates administered by an authoritarian church. It’s a shame, however, that in their haste to abandon religion so many people have cut ties with their innate spirituality as well.
A genuinely scientific and open-minded approach to the God question would allow for the possibility that while the existence of God cannot be proven within the narrow bounds of empirical science, God may still exist. In this sense Dawkins does not disappoint. Dawkins believes that evolution is progressive and inherently leads to increasingly complex forms. The emergence of conscious beings from the primordial ooze strongly suggests the possibility of significant future evolutionary development. If there was no God “in the beginning,” could there be one now or in the future? “Yes,” says Dawkins, “it is highly plausible that in the universe there are God-like creatures,” and if there aren’t, there could be someday. Such is the power and potential of evolution. Admittedly, these are not the sort of gods that populate creation myths the world over but are rather the result of a long, unguided process of mutation and natural selection of advantageous traits – the culmination of evolution, not its genesis.
What Dawkins is unwilling to concede, despite millennia of experiential evidence, is that God-consciousness is not just a future possibility, the end-point of eons of evolutionary progress, but the starting point of it all. If God-consciousness is the source of everything, and even more to the point the essential nature of everything, then it is impossible to turn God into a mere concept, let alone a logically sound one. Trying to define God is like trying to see your own eyes. “The source of consciousness cannot be an object in consciousness,” said Nisargadatta Maharaj in his classic of Vedanta philosophy, I Am That. “To know the source is to be the source.” In other words, we cannot turn God into a thought because God is the very act of thinking itself. Asking us to explain God is like asking a fish to explain water. We cannot point to a disembodied thing called God because God is what everything is. This brand of religious philosophy, dismissively and misleadingly called pantheism by mainstream theologians, offers a third alternative to the tired theism/atheism debate.
By challenging an outmoded concept of God and the crippling propensity of mainstream religious doctrine to jettison rational thought, Dawkins is performing an invaluable service. Arguably, he is helping us all move forward out of millennia of dogmatic authoritarian hearsay and toward a spirituality grounded firmly in experiential knowing. As Jung famously remarked, “Religion is a defense against the experience of God,” and as such it ought to be critically examined by all who wish to deepen their authentic spiritual practice. Dawkins’s well-reasoned attack on traditional religious belief is pushing us away from the shallow end of the pool and into deeper waters. From here we can see the other side.
In any debate, theological and otherwise, the goal is not to eliminate dissension and compress the baffling complexity of reality down to a single, simplistic proposition. No matter how deep our longing, humanity’s search for meaning cannot be reduced to an up-or-down vote on the existence of God. The object of thoughtful discourse is to allow conflicting truth claims to polish each other to a shining luster in the rough-and-tumble give and take of rigorous yet mutually beneficial dialogue. And in the great sorting, the chaff is left on the granary floor, laying bare the wheat that nourishes us all on the long road to wisdom. Moving past simple scenarios of this or that, we finally begin to appreciate the need to grow beyond slavish attachment to rigid opinions or positions. Maybe the question of God’s existence can never be answered to everyone’s satisfaction. “The great and most important problems in life are utterly unsolvable,” said Carl Jung; “they can never be solved, but only outgrown.” Rather than childishly regarding the new atheism as either true or false, we might see it as yet another facet of the unfolding of evolutionary consciousness, a welcome corrective to our natural tendency to cling to old narratives and conceptual frameworks that no longer serve our highest good.
Thursday, December 22, 2011
Simple Misunderstanding
At big family gatherings Aunt Sally always prepared a ham. As her older sisters watched, she would carefully cut a large chunk off the end of the ham before placing it in her over-sized roasting pan. Being gracious house guests, none of her sisters said a word, deferring to their host’s culinary wisdom. After many years the oldest sister Martha finally spoke up.
“Why do you always cut off the end of the ham before roasting it?” Martha asked.
“Because that’s how mom always did it,” Sally replied. “It makes the ham more delicious.”
Martha went out to the living room to fetch their old mother.
“Mom,” said Martha, “Sally cuts the end off the ham like you always did because you said it tastes better that way. Is it true, does that make it taste better?”
“Oh no dear,” said their old mother as she ambled into the kitchen, “I had to cut the end off the ham so it would fit into my tiny roasting pan.”
As individuals, families and societies, we are often bedeviled by past practices that no longer have meaning and worse – they’ve been clothed in the unassailable garb of tradition and now lie beyond reproach. Cutting off the end of the ham did nothing to improve the flavor. It was just an empty ritual based on a simple misunderstanding.
In his illuminating book Guns, Germs, and Steel, Jared Diamond recounts the story of how we all got stuck with the QWERTY keyboard on our computers. Named for the first six letters on the left end of the upper row, the QWERTY keyboard was first designed in 1873 with the express purpose of slowing down typists. The levers of those early typewriters were prone to jam, so in order to make typing as difficult and awkward as possible the most commonly used letters were scattered all around the keyboard instead of being placed conveniently in the center row. To make matters worse, a disproportionate share of the most common letters was assigned to the left hand, the weaker hand for most people. By the 1930s the mechanical issues had been resolved and the levers no longer jammed, and trials showed that a more sensibly arranged keyboard could double typing speed and cut typing effort by 95 percent. But it was too late. The QWERTY keyboard was deeply entrenched in the culture, and there was no going back. The efficiency of typists throughout the twentieth century was sacrificed on the altar of “but we’ve always done it this way.” Even computer designers adopted the awkward QWERTY configuration for their keyboards. Introducing a new keyboard at this point would be commercial suicide. No one would buy it. We like our absurdly designed and maddeningly difficult keyboards just the way they are.
The larger question Jared Diamond raises in Guns, Germs, and Steel is this: in the evolution of human societies, why do some cultures embrace technological innovation while others remain entrenched in old ways of thinking and deeply committed to outmoded and inefficient behaviors? The same question could apply to each one of us individually. Why do we mindlessly cut the end off the ham even though our pan is plenty big enough to hold the whole thing?
The answer is right in front of us. We are habitual creatures and do not embrace change, no matter how beneficial. We don’t like learning new things because we don’t like feeling incompetent and awkward. Both as individuals and societies we’ve become attached to our thought-systems and past practices.
Another dynamic that impacts technological progress is the fact that new inventions are sometimes ignored because they simply do not align with current cultural values or needs. Gunpowder and guns were invented in Asia long before they were ever seen in the West. But as tools of warfare, guns never caught on in medieval Japan. Guns were seen as crude and dishonorable under the samurai code, an ethos that celebrated the elegant choreography of swordsmanship and the rare courage of elite warriors. Killing your enemy from a distance by blasting lead balls through steel tubes dehumanized the ancient art of honorable combat. Technology must always serve the deepest needs of a people, not the other way around.
It is also the case that invention is rarely born from necessity. When Thomas Edison invented the phonograph in 1877, no one “needed” a phonograph. He was just messing around. He certainly was not trying to invent the music industry (although that’s what he did) – recording music was the last thing on his mind. As inventors often do, Edison completely misunderstood the wider applications of his own invention. He simply wanted to record the last words of dying people, record books for the blind, announce the time and teach spelling. Edison was convinced the phonograph would have no lasting commercial value. It was only later that some ingenious entrepreneurs used Edison’s technology to invent the jukebox. Soon there were jukeboxes in bars and restaurants all across America, swallowing the coins of patrons thirsty to hear the latest popular song. The record industry was born, and music would never be the same.
But it is never simple. Technological innovation does not drive culture, as is often assumed. We embrace or reject new gadgets based on their affinity with our current value system. Sometimes rock-throwing tribes do not adopt bow-and-arrow technology even though they’re surrounded by enemy tribes that do, simply because they prefer the old way of doing things. There’s no judgment here. Technological progress is not an unmitigated good. The samurai settled regional conflicts by sending one warrior from each of the warring states to engage in a battle to the death, with each side accepting the outcome. That would be like locking Rambo and Osama bin Laden in a room: whoever walked out would be the winner, and no one else would have to die. Does anyone really think we do it better now?
The lessons from these stories seem clear. But that doesn’t make them easy to learn. On the one hand, we sometimes embrace changes that erode our most cherished values, allowing technology to shape humanity instead of the other way around. In that case, change is bad. On the other hand, most of the time, like Aunt Sally, our unwillingness to innovate, improve and change is rooted in a deeply irrational and unconscious attachment to ways of thinking that no longer serve our highest good. We simply do not have the eyes to see the myriad ways we are caught in a web of ignorance, tradition and conformity, nor the wisdom to see when change is good. Maybe the Buddha was right when he characterized our attachment to old ways of thinking, being and doing as a disease of the ego. Somewhere along the way we came to believe that our current patterns of thought and behavior defined and embodied our identity, and that to alter or abandon these patterns would be to alter or abandon ourselves. This was a fracture our ego simply could not endure. But down deep we know that we are not bound by our thoughts or our patterned behaviors. Beneath the layers of social conditioning and fear-based attachment we are infinitely free. Sometimes change is good. Sometimes change is bad. Wisdom is the capacity to discern which is which. Let’s hope we stop cutting off the end of the ham for no reason. Let’s hope we finally grow out of this simple misunderstanding.