I had a blast back in August singing in a Village Harmony adult camp. The obvious highlight for me was singing the solo part of a gospel number, “Ain't Got Time To Die.” It felt good, was a fun stretch for me, I'm told it sounded good... and on reflection it was a very odd choice for me.
As I've discussed earlier, though I belong to a denomination that many think is a Christian sect, I am not a professed Christian, nor do I carry many of the hallmarks of such: I do not accept Jesus as my savior, nor do I accept God as Father, or believe most of the stories in the gospels as literal, if-you-had-been-there-with-a-video-recorder-you'd-have-seen-it-too truth. I'm some flavor of agnostic, one with pretty strong non-theistic sensibilities. Deistic, maybe, but... here I am really enjoying singing gospel tunes.
OK, there's really nothing new here. I learned “Swing Low, Sweet Chariot” and all those other spiritual standards back in grade school. They were cultural artifacts—good songs from the African-American tradition. I learned a lot of them off of Weavers records: Lee Hays was a lapsed minister's son. I sing and love Christmas carols. I sang Vivaldi's Gloria and Schubert's Mass in G in school. And so on, and so on. You'll have a hard time singing choral music in this society without singing music meant for church services—but over time we've developed a framework where if it's sung in concert, it doesn't count—the whole performance has a big frame, a set of quote marks around it, just as performing in HMS Pinafore doesn't suggest you have any experience as a sailor.
Out of concert, and the frame is not so clear. When the Blind Boys of Alabama opened for Peter Gabriel a few years ago, we were all singing and dancing in the aisles, and then one of them said something to the effect of his feeling the power of the Lord and this whole hall praising Jesus, and OK fine, who am I to say otherwise, but it felt a little awkward because, well, I was singing along but I didn't mean the words literally.
Or my atheist/pagan fellow singer who got in a huff about all the religious songs—old-time gospel, mainly—that cropped up in a row at a pub sing. Or the fellow singer at the camp who wondered what his fellow Jewish friends would think about him singing gospel with such gusto.
The “frames and quotes” thing only goes so far. I find a lot of the white-folky versions of spirituals I grew up with pale and even a little offensive. I joke about forming an "agnostic gospel choir" for people like me who love to sing the songs but aren't interested in being the house choir for a faith we don't really share. But as I think about it, the built-in insincerity would end up showing, one way or another. It would be fake.
Because what makes gospel work is something I just don't have that explicitly: utter commitment. Not that gospel singers are free from sin, or perfected saints in any sense, but when they sing, and sing well, it requires the whole body to dig in and hold up the song, and the lyrics are about as un-ironic as you can get. And that's part of the appeal, and it's something I and a lot of urban liberals like me simply don't carry around with us in any sort of coherent package.
Friday, November 11, 2011
Saturday, October 29, 2011
Feelings
I had a dream a few nights ago, where I was some sort of volunteer assistant teacher in an inner city school. The kids in my group were all African-American boys, about second or third or fourth grade. They had a series of little books about feelings on the table near them, and they were really pissed off about having to read them. Their objections amounted to, "Don't you go telling me what to feel, asshole." Probably not in that language, but I could feel their rage coming off them.
And so I tried talking with them, saying, "You know, of course you have a right to feel what you feel, but do you really always want to be drawn into a fight whenever you feel mad, or burst into uncontrollable tears when you feel sad? And when someone else is mad, do you have to just go with getting mad right back and getting into a fight with them?" I think that's what I said, or something like that. Hard to remember; it was a dream. And I woke up before I could hear any sort of reaction from them.
I've had a couple of heated discussions on Facebook lately. One was with a guy in my neighborhood arguing that conceal-carry laws are good: he carries a gun as he walks around the neighborhood and it makes him feel safer. I'm not a fan of conceal-carry, but it turns out most of our energy about this comes not from facts but from communal beliefs: he's a passionate defender of individual liberties, while I tend towards a passionate interest in communality and mutual responsibility. When you get to statistical studies, having a firearm is more dangerous to the carrier because of household accidents and moments of passion, and in terms of public safety, conceal-carry is a statistical wash.
But here's the thing I noticed about our back-and-forth: he came out of the box spitting mad—calling names, making accusations, saying things that weren't threats but carried the structure of threats ("If you... then I..."). And of course he has a "right to his feelings," but what I was seeing was how much his anger in and of itself washed over the relationship. It almost instantly stopped being just his anger. It was anger that I also had to deal with.
We use the word "feelings" to describe emotions, and this makes sense for little kids who are just learning about themselves: "What do you feel?" is a really good question to help little kids step back from themselves and name the churning mass of stuff inside them.
But I'm wondering about the use of that word in adults, because feelings in a group of people are more like waves: they aren't felt by you as an individual, they are emanated. They are like germs: sometimes your neighbor gets infected, sometimes her immune system kicks in with its own anti-emotion. But none of us live in emotional bubbles. Even those of us who try to end up emanating our own weird little "can't touch me" vibe.
The other Facebook discussion was with a friend of a friend, about this letter, and it quickly turned into a debate about tyranny (taxation) vs. reckless individualism (anti-taxation). And the guy I had the tête-à-tête with was pretty hyperbolic. He's clearly been through the comments-section school of political commentary and debate.
If you read the comments section of pretty much any article on the internet that touches on politics, you know the language: a group of villains is named, fear-and-anger-inducing words are invoked, and either a plea for divine retribution or a call to arms concludes. These are the tools we use to try and win arguments. Except they utterly fail at that. They help us gather allies, and maybe we swing one or two people who are confused and unsure where they stand, but they don't turn anyone from the enemy camp, because they make it clear the enemy camp is the enemy.
When Jon Stewart made his plea for civility and less hyperbole ("These are hard times, not the end times...") this summer, I was interested to see some of my left-wing friends get pissed off because to them Stewart seemed to be saying "Stop fighting for what is right." And I didn't really know what to say to that, because of course we want people to fight for justice. And liberty. And freedom. And communal responsibility.
But who are they fighting? And how do you fight a demagogue, or a whole sea of demagogues? When we say we are going to fight, we invoke a specific set of analogies: there is a battle, there is an enemy, there is going to be some kind of combat. There's a poster/t-shirt slogan, "fighting for peace is like f***ing for virginity," which makes the point crudely, but the problem is, we don't know how to talk about large structural issues except by fighting.
And I think the root of the problem is the tidal-emotion thing I started this post off with: when I am passionate about something, a lot of what you—my audience—are paying attention to is the passion. The work of understanding the something itself does not happen in the presentation; it happens in our own internal processing, as we fit what we hear into our internal puzzle of unanswered questions.
And so I wonder about the place of passion in public debate. It seems to me that we need to open more of a place for testimony from personal experience, and for clear, interesting delineations of the field of debate. But that's me. Actually, I was bowled over by this discussion of the divided mind, from a recent talk at the Royal Society of Arts. It may sound boring from the title, but the conclusion about the sort of balancing needed in our world is profound:
Saturday, October 15, 2011
Physical Maps
I'm recovering from NACIS 2011, which as usual was wonderful and rich as a source of ideas and techniques and wonderful conversations with fellow cartographers and mapheads.
The thing that kept coming back to me this year is how we often leave aside the idea that maps are physical objects, or at least are experienced as physical objects. It's easy in this electronic world to get caught up in the content that streams to us via our screens, and learn to ignore the screen itself, or at least allow it to fall to a different level of consciousness.
A map is our word for a kind of information transmission: we talk about map makers and map users, about map-generating technology, the language of maps, the meaning and power of maps. Every link in that chain, from the physical ground of our discussion, through the physical means of recording, the physicality even of computers and their electronic guts, exists in physical form. It is grounded in stuff.
This used to be so self-evident as to be an absurd statement. Phrases like "buying a map" or "reading a map," "folding a map" or "publishing a map," represented physical processes that were the primary concern of map makers and users. In fact, we as a map culture were so caught up in these physicalities that it was kind of a surprise and a jolt to be reminded a generation ago that there was something abstract, ineffable, and grounded in symbol about map-making.
This is to me one of the huge changes the digital revolution has brought about. We now mostly accept that maps are images, texts, arguments, or propositions. The public no longer talks about "folding that paper up like a road map" because our children have no more idea what we're talking about than they do when older folks talk about "dialing someone" on the telephone.
We need to be reminded of the physicality of maps. At a session at NACIS, I made the point that a technique of cross-hatching that Patrick Kennelly presented (a really cool idea, by the way) would carry more of the rich texture of the art prints he was using as examples if he actually made copper-plate intaglio prints from them. And the conversation then turned to how you could add texture in Photoshop and so make them look more like old prints. And I held my tongue. The point is, an actual copper-plate print, in its physicality, looks and feels different than even the most interesting plotter print—they may look the same on the projection screen at a conference, but their physical appearance in the world is not the same. Physicality matters.
Saturday, September 24, 2011
Traditional Marriage
If you wander the Minnesota Renaissance Festival, you'll probably run into a group of men dressed in white, with bells strapped to their shins, dancing while waving handkerchiefs or clashing sticks. They are Morris dancers.
I am also a Morris dancer. Morris dancing is an old tradition, but we hedge a bit on exactly how old it is. Passers-by ask, "Where does this come from?" and "When is this supposed to be?" The answer they want to hear is "It's from England, and it's veeeerrry old," but the answer I want to give is "It's from here and now," because we perform in what we folky types call a "living tradition."
A living tradition is passed down over time, but we expect change in its patterns. We celebrate freshness within the old forms we love. Think of bluegrass, or ballet, or French cooking: in each case, there’s a reverence given to old ways of doing things, and a sense of joy when a new variation on an old theme is introduced.
What's the opposite of a living tradition? A fossilized, hidebound tradition? It isn’t simply conservatism—people find real life in many conservative traditions, where in each presentation of an ancient unalterable text or ritual, the devotee hears something deep and vital. A tradition truly dies when it becomes separated from life—when it is empty of meaning for its participants, when it holds together a group that exists for no good reason. Or when it has become a lie.
Marriage is (or ought to be) a living tradition.
Marriage may be grounded in seemingly unchanging forms, and in words that have been said for a very long time. But the world itself and what it means to live in the world are constantly changing, and so does marriage. What it means to be a husband or a wife is different for me and my wife than it was for our parents, and their marriages were different from those of their parents.
My religious community, the Society of Friends (Quakers), has a strong sense of tradition. If our ideas seem odd to outsiders, it's not because they are new. From their founding, Quakers rejected the idea of ordained ministers acting as intermediaries between people and God. Quaker weddings had no officiant standing between the couple and that which joined them. We still have no officiants today, and we can honestly say we marry the same way Quakers have been marrying for almost 400 years.
And yet, things do change. We no longer "disown" members who marry non-Quakers, as Friends Meetings used to do. We marry couples who have been living together unmarried, which would have appalled our forebears. And we marry same-gender couples. My congregation, Twin Cities Friends Meeting, has been doing so for 25 years.
This is the witness I want to bear as a member of this congregation: Recognizing marriage between two people of the same sex does not undercut traditional marriage. My opposite-sex marriage (also under care of this Meeting) is strengthened by the same living tradition under which my friends' same-sex weddings are celebrated, and by the examples of those marriages.
The idea that marriage must be protected from change is a lie. The implication that my friends’ same-sex marriages are not legitimate is a lie. And the suggestion that we are corrupted by the growth and change in our living traditions is not just a lie. It is a lie that, if followed, ends in the death of those traditions.
Those who believe these lies need to ask themselves: Is your sense of marriage’s fragility bound up in a tradition to which you no longer fully subscribe? Look to the strength and life in the tradition of marriage and welcome same-sex couples. Don’t just reject the proposed Constitutional amendment. Legalize same-sex marriage. Do it now.
Friday, September 2, 2011
New blog: Measured Words
I've started a new blog, Measured Words, which I describe below. I'm not abandoning this one, but wanted to do this to impose a structure on an idea that's been floating around in my mind for a while. I hope you'll join me over there.
Words are not sticks and stones, but we use words to get people to throw sticks and stones. Words are like magic—that's why spells are so central to the trope of magic. And words, in order to work, in order to work their magic, have to mean something.
Words are slippery. Words are malleable. They are not the rocks beneath our collective understanding we want them to be, because they shift in their meaning, subject to our changing wants and our collective will (or lack of will). But we still use them, because we don't seem to have any better tool at hand to work that magic, to reshape our world to meet our desires.
We lie with words, and we tell the truth with words. What makes those words into truth or untruth is not the words themselves, but how well those words match up to the things they describe. And we have gotten way too lazy about making that connection.
And so this blog. In each entry I will pick a word and try to get at what we mean, and sometimes what we ought to mean, when we invoke it. Some of the words are at the forefront of political speech (jobs, freedom, government); others are parts of my particular life (Quaker, map, folk). And still others I expect to pick up just because they pique my interest.
Monday, August 15, 2011
Blind Spot
There's an old trick where you place a dark spot on a white wall, then sit back and with one eye open, look slightly to the left or right, and at some point, the spot will simply disappear from view. This marks the small area (scotoma) on the retina where there are no visual receptors (no cones or rods) because that's where the optic nerve connects the retina to the brain.
I think we each have points like this in our psychic landscape, which cannot be approached in the direct way we know how to approach most of the world, not because they are too painful (that's another story—see below) but because they simply contradict our ways of understanding; they are incomprehensible because they are in the blind spots of our comprehension.
The annihilation of being is the big one for most people. Of course we can see death all the time; all living things die. But we cannot understand what it means to die, because it would be to imagine not imagining, to think about not thinking—ever again.
We construct all sorts of ways to bridge this blank spot, but at root it is almost impossible to understand a world without a self. That is to say, a story with no narrator, a picture not drawn from a point of view. So when a character in a story (or, in the particular case I'm thinking of, a play I saw last week) considers his or her undoing, and the creator is portraying this as straightforwardly as possible, there comes a kind of gray moment, when the artist (and character) is simply lost.
All of this assumes that the "soul" does in fact die, that consciousness, the self, does not have an immortal component. And I suspect that the power of that "blind spot" is a big part of the impetus to discover alternatives to total death of the self, whether immortality of the soul, or reincarnation, or some other process by which something happens after the end.
Well, something does happen to the body of course: it decomposes and—one way or another—is eaten. And that eating is a root of horror. There was an interesting discussion on Minnesota Public Radio's Midmorning recently, with the author of the hot new werewolf novel, The Last Werewolf. My question for him was about the horrific effect of having a sympathetic character become meat, how viscerally painful this is for the audience, and how he as a writer used—or at any rate dealt with—this horror. He said that it was specifically being eaten that, to his mind, was the horror: that all you have worked for in your life is summed up in being a meal for some other creature, and that this was in a way the key to horror as a genre and as a tool. I think he was spot on. Like death, the prospect that we (or our bodies, if our sense of self is gone) will be consumed elicits a visceral turn of the stomach.
It is not, however, as powerful a blind spot, because we can in fact imagine being captured in a great monstrous maw like a bird in a cat's jaws. It's painful and horrible but the horror is comprehensible.
I wrote earlier about Diana Wynne Jones' Fire and Hemlock, and about my troubles with the ending. In the denouement, she pulls from T.S. Eliot's Four Quartets an image of Nowhere as a place, in her book an eddying gray horror, a pool at the foot of a garden, the maw of Hell — not a fiery place but an utterly empty negation of everything, good and bad. I think this is the blind spot, and perhaps this is why I find the ending of the book unsatisfying: it takes us up to the lip of a visible impossibility, and then uses a sort of rule-manipulating trick to turn us away, pull us through and out. In the end, that horror is simply left behind, unaddressed.
I recently read William Styron's Darkness Visible, an account of his own deep clinical depression. The book was recommended to me as the truest and clearest description of clinical depression a friend had ever read. It is an excellent book, but one of the things it makes very clear is that depression in itself is indescribable: you can approach it, you can say something about it, but it is a pain of absence, an experience of void, and as such is not really possible to put into words, because the words fill a space in the audience's heads that is simply missing in the sufferer. Depression is like a blind spot of the self, a place that by definition cannot be held and looked at directly. It can be described in the descent, and—as Styron notes, quoting Dante in his return from the Inferno—in the ascent back out of it, but because description is itself something, the void cannot be captured in words.
Is there any way out of these blind spots? If the analogy were perfect, one could just open the other eye. If one trusted the vision of others, one could ask what they saw, but no-one else can truly see our selves from the inside, or be a sufferer of depression for the sufferer. People describe near-death experiences, but these experiences are unsatisfactory because they are about someone else's negation, not ours. Our blind spots are places where our frame of understanding is fundamentally personal, and because we are conscious in some essential way within our own bodies, there is no sure-fire way to add the equivalent of parallel vision. Even a close companionship like Styron had with his wife can't bridge the disease, though of course it sure can't hurt either. It probably saved his life—his realization as he considered suicide that he couldn't just do this selfishly to those he loved. But it didn't cure or offer a window to his condition.
Buddhist practice, with its focus on non-self and non-being, maybe comes closest. But here I fall short, never having really studied such practices. And my understanding is that in Buddhist meditation, the goal is a stilling of self so one can experience the not-self, not the prospect of the soul's extinguishment.
Perhaps the key to addressing these blind spots is to think of them not in terms of their being things we see, but products of how we look. That is to say, it is not self-negation, or death, that we cannot see, but our way of seeing that keeps us from seeing death. The idea—and this is really just an untested idea on my part—that depression is similar in kind to the gray space around the idea of the absence of self, suggests that there is something organic in us, as there clearly is in depression, that makes our seeing unclear. If we saw the world differently—as some who believe in an immortal soul do, for instance—that nothingness would not be a gray and shimmering horror.
What the blind spots do show pretty definitively to me at least, is that description, the set tools we use to say what the world is, has inherent paradoxical limits. It's not that we won't look at them—in the way we won't look at being eaten, or at any of a number of bogeymen and women we set up as furniture in our psychic household—it's that description itself is housed within a finite, mortal frame and cannot therefore see the absence of that frame itself.
I think we each have points like this in our psychic landscape, which cannot be approached in the direct way we know how to approach most of the world, not because they are too painful (that's another story—see below) but because they simply contradict our ways of understanding; they are incomprehensible because they are in the blind spots of our comprehension.
The annihilation of being is the big one for most people. Of course we can see death all the time; all living things die. But we cannot understand what it means to die, because it would be to imagine not imagining, to think about not thinking—ever again.
We construct all sorts of ways to bridge this blank spot, but at root it is almost impossible to understand a world without a self. That is to say, a story with no narrator, a picture not drawn from a point of view. So when a character in a story (or, in the particular case I'm thinking of, a play I saw last week) considers his or her undoing, and the creator is portraying this as straightforwardly as possible, there comes a kind of gray moment, when the artist (and character) is simply lost.
All of this assumes that the "soul" does in fact die, that consciousness, the self, does not have an immortal component. And I suspect that the power of that "blind spot" is a big part of the impetus to discover alternatives to total death of the self, whether immortality of the soul, or reincarnation, or some other process by which something happens after the end.
Well, something does happen to the body of course: it decomposes and—one way or another—is eaten. And that eating is a root of horror. There was an interesting discussion on Minnesota Public Radio's Midmorning recently, with the author of the hot new werewolf novel, The Last Werewolf. My question for him was about the horrific effect of having a sympathetic character become meat, how viscerally painful this is for the audience, and how he as a writer used—or at any rate dealt with—this horror. He said that specifically it was being eaten that to his mind was the horror: that all you have worked for in your life is summed up in being a meal for some other creature, and that this was in a way the key to horror as a genre and as a tool. I think he was spot on. Like death, the prospect that we (or our bodies if our sense of self is gone) will be consumed elicits a visceral turn of the stomach.
It is not, however, as powerful a blind spot, because we can in fact imagine being captured in a great monstrous maw like a bird in a cat's jaws. It's painful and horrible but the horror is comprehensible.
I wrote earlier about Diana Wynne Jones' Fire and Hemlock, and about my troubles with the ending. In the denouement, she pulls from T.S. Eliot's Four Quartets an image of Nowhere as a place, in her book an eddying gray horror, a pool at the foot of a garden, the maw of Hell — not a fiery place but an utterly empty negation of everything, good and bad. I think this is the blind spot, and perhaps this is why I find the ending of the book unsatisfying: it takes us up to the lip of a visible impossibility, and then uses a sort of rule-manipulating trick to turn us away, pull us through and out. In the end, that horror is simply left behind, unaddressed.
I recently read William Styron's Darkness Visible, an account of his own deep clinical depression. The book was recommended to me as the truest and clearest description of clinical depression a friend had ever read. It is an excellent book, but one of the things it makes very clear is that depression in itself is indescribable: you can approach it, you can say something about it, but it is a pain of absence, an experience of void, and as such is not really possible to put into words, because the words fill a space in the audience's heads that is simply missing in the sufferer. Depression is like a blind spot of the self, a place that by definition cannot be held and looked at directly. It can be described in the descent, and—as Styron notes, quoting Dante in his return from the Inferno—in the ascent back out of it, but because description is itself something, the void cannot be captured in words.
Is there any way out of these blind spots? If the analogy were perfect, one could just open the other eye. If one trusted the vision of others, one could ask what they saw, but no one else can truly see our selves from the inside, or be a sufferer of depression for the sufferer. People describe near-death experiences, but these experiences are unsatisfactory because they are about someone else's negation, not ours. Our blind spots are places where our frame of understanding is fundamentally personal, and because we are conscious in some essential way within our own bodies, there is no sure-fire way to add the equivalent of parallel vision. Even a close companionship like Styron had with his wife can't bridge the disease, though of course it sure can't hurt either. It probably saved his life—his realization as he considered suicide that he couldn't just do this selfishly to those he loved. But it didn't cure or offer a window into his condition.
Buddhist practice, with its focus on non-self and non-being, maybe comes closest. But here I fall short, never having really studied such practices. And my understanding is that in Buddhist meditation, the goal is a stilling of self so one can experience the not-self, not the prospect of the soul's extinguishment.
Perhaps the key to addressing these blind spots is to think of them not in terms of their being things we see, but products of how we look. That is to say, it is not self-negation, or death, that we cannot see, but our way of seeing that keeps us from seeing death. The idea—and this is really just an untested idea on my part—that depression is similar in kind to the gray space around the idea of the absence of self, suggests that there is something organic in us, as there clearly is in depression, that makes our seeing unclear. If we saw the world differently—as some who believe in an immortal soul do, for instance—that nothingness would not be a gray and shimmering horror.
What the blind spots do show, pretty definitively to me at least, is that description, the set of tools we use to say what the world is, has inherent paradoxical limits. It's not that we won't look at them—in the way we won't look at being eaten, or at any of a number of bogeymen and women we set up as furniture in our psychic household—it's that description itself is housed within a finite, mortal frame and cannot therefore see the absence of that frame itself.
Thursday, July 7, 2011
Silverbacks
In my world, whether Dominique Strauss-Kahn raped the hotel housekeeper or had consensual sex with her, he's still politically damaged goods—what kind of trust do you place in a potential leader who has unprotected sex with a total stranger on a moment's notice?
Well, if you're a gorilla, you respect him (perhaps grudgingly) as the silverback leader of the tribe. And there's some part of us that recognizes silverbacks among us, and accepts them into leadership positions. Perhaps this is why so many male politicians get tripped up by exercising their sexual desires—they were chosen for their silverback qualities, and now here we are punishing them for them.
But why should these two be necessarily connected? Strauss-Kahn didn't make his way to the top of the IMF and France's Socialist Party on the size of his "harem." Even among those who accept that powerful men have mistresses, it is expected that they will be discreet about it.
I find myself thinking about the side-effects of domestication. When you breed wolves into dogs, one of the side effects of becoming part of the human household is a sort of perpetual puppyhood. In fact, you can correlate certain kinds of breed-related gentleness with the degree of puppy-like physical characteristics: floppy ears, shorter snout, rounder body. (A couple of starter sources on this: Temple Grandin's Animals Make Us Human, and the excellent Nature documentary "The Secret Life of the Dog.")
Is male sexual aggressiveness tied to wider social leadership qualities? Does promoting faithfulness and lack of sexual aggression give us milquetoasts? It makes a certain amount of sense—the leader of a gang or a tribe proves his kingship by having his pick of the women.
But this is far from the only model of human social organization with deep roots. The model of a chieftain who rules by loyalty and punishment is matched by that of the council circle—an egalitarian model where getting too far above oneself is a recipe for a group smackdown.
What I observe is that these models move back and forth. The silverback king model makes more sense when there is immediate threat, and the group needs to move quickly and responsively. Think of a platoon in battle or a group of escapees from slavery or prison—adrenaline pumps, and you do what the leader tells you, or you are dead.
By contrast, the egalitarian model makes sense when life is stable, and threats to life are longer-term—harvest, hunt, and child-rearing. Instead of adrenaline-fueled survival instincts, we take time to consider and plan, and good planning means listening and considering advice, something that doesn't happen as effectively when we are worried about Darth Vader enforcing his will upon us...
The kingship model also makes sense when the population becomes too large to manage by consensus. In a mass society, you can mitigate this by choosing a council to govern the larger group, either by election or tradition. But when selection to this council becomes competitive, it is the silverbacks who will tend to put the energy into getting onto the council, and suddenly you don't have a group of co-operators, but a bunch of people trying to be top gun.
When the language of egalitarianism becomes embedded in a competitive political system, you thus end up with strange cognitive dissonances: Anthony Weiner on one hand brilliantly calling out outrageous anti-democratic abuses by his opponents, while on the other hand playing out an aggressive primate mating ritual online; Michele Bachmann and Sarah Palin's strange combination of driven personal ambition and endorsement of traditional stay-at-home motherhood; calls for bipartisanship alongside constant (and often personal) political attacks.
Now, it isn't fair to say that religious conservatives are somehow promoting aggressive promiscuity. Because clearly the orthodoxy says you should keep your pants on if you want to go to heaven. And good behavior is enforced by shame—the tearful admission of sin has become almost routine in scandals involving politicians. But the purist sense of human behavior—the sense that we ought to be above primate wrestling in the mud—which much of modern conservatism is grounded upon, might be a big part of the problem. Especially when that purism becomes embodied in political and social structures that are driven by the energy of those primate combats.
And that I think gets to the root of the silverback problem: We depend on silverback models of leadership to keep us together and to give us drive, but we also want to feel a sense of rational or spiritual community in which we are all treated as equals. And these two models simply do not play well together.