The first three chapters of The Brothers Lionheart, by Astrid Lindgren, were serialized in very early issues of Cricket magazine, which is where I ran into it. I got it as a book for Christmas the winter of fourth grade. The story is a sad one, and it gets sadder the older I am. I remember crying over it when I was nine.
The book does what I've never seen any other book do in resolving the "how to get rid of the parents" question. It kills the kids. Given that much of the book is grounded in the basic heroism of being human ("Some things you have to do or you're nothing but a bit of filth" is a running line), the heroic deaths of Jonathan Lionheart make sense. What Lindgren does that's so unusual is to then follow on to their next adventure.
The last time I read the book, several years ago, I got a strong whiff of Scandinavia's time in World War II. Lindgren doesn't acknowledge this as a source, but just as Tolkien's Sauron and Saruman owe a great deal to the fascist and communist totalitarians of the mid-20th Century, so too with Lindgren's Tengil. But the whole book has an oddly stylized quality to it, and so the evil doesn't carry the same visceral punch you might expect.
What does pack a punch is the juxtaposition of joyful love of life, and death. I remember the scene at the beginning of chapter 3, when the 10-year-old narrator, who has arrived in Nangiyala after dying of tuberculosis, discovers he has straight legs, no cough, and can swim and ride—it's like a miracle story or a fairy tale, but told in the first person. It's utterly deliriously beautiful. On the other hand, the ending, where the brothers agree to commit suicide, to "jump into Nangilima," this next world's next world, just puts a capstone on a story where, before they turn 14 and 11 respectively, they both will die twice. It's very unsettling, the more you think about it.
Astrid Lindgren has written about her sources for the book: a train ride along Lake Fryken with a sunrise of unearthly beauty, and a cross in a cemetery in her home town of Vimmerby, memorializing two brothers who died young.
This is what I take away from this book: the terrible sad beauty of boys who know how to do good in the face of evil, who know how to be the hero of the saga, and who give their lives not happily, but willingly, because in the story they are living it is the necessary thing to do. All those boys, a long line of them, following a clear road that was paved before they arrived, and not coming back.
Tuesday, December 25, 2012
Sunday, December 23, 2012
Old Books 1: Siddhartha
Siddhartha, by Hermann Hesse, was the first explicitly spiritual book I read that grabbed me. Actually, such books make a pretty short list—explicit spirituality tends to turn me off in prose.
The last time I read it through was probably in college. I wrote a paper for Religion 101 about viewing the novel through the lens of the seven Kundalini chakras. In conventional Kundalini practice, the goal is an upward movement from the root (animalistic tooth-and-claw existence, centered in the anal area) up through the body and out of the crown (the top of the head, representing unity with universal consciousness). My point was that the character of Siddhartha begins as a young Brahmin living in those top chakras, without really living into the worldly chakras. Then he shifts course and becomes a successful businessman and lover. Finally, in despair, he finds a middle road, a balance point centered in the direct experience and love of other people. Once his heart is full, he can understand the unity of all...
That was when I was 20.
On rereading it, I tried summarizing the plot to Ingrid, who asked, "Does he have anyone?" And the answer is, not really. Not as a companion. Indeed it's a major point of his journey that he lose his childhood best friend, his lover, and his son. Only the loss of his son really breaks his heart.
Siddhartha's journey is one I think I internalized as a model for a "spiritual path," and it is really a problem... that the path is about going solo. Not that there's anything inherently wrong with going solo, and it does make it more possible to follow the thread through the labyrinth, making the sharp corner turns that that kind of devotion to a path implies. But it does imply that the "true path" is a lonely one.
One of the books I'll be reading later, Diana Wynne Jones's The Homeward Bounders, makes a similarly bleak point... the narrator's final line is "But you wouldn't believe how lonely you get."
There's something off about this. I buy Siddhartha's journey as absolutely authentic to who he is, but it's worth noting that his final teacher, the one who shows him how to listen and understand the river of time as one unity, is a widower and ferryman, not a lonely hunter. Vasudeva had been happily married. No, the point is more that this lonely path was Siddhartha's choice. There was something in him that demanded he go solo; it wasn't something built into the path towards full enlightenment.
But I was reminded what a beautiful book this is. At each stage, Siddhartha goes through anguishes of despair, followed by a liberating joy in what is to come and in having become something new. That's the other thing that strikes me now... how many times in his life he felt the joy of "now I've finally gotten it right!" It made me think how great it would be to bottle that specific feeling, separate from actually discovering something new. I'd love to see a philosopher or psychologist of religion looking just at that transformative moment.
I also am sorry for his father and wife and son. I can't imagine it's a great thing to love someone whose spiritual path is higher than yours.
Sunday, December 16, 2012
Angry
It's been eleven years since I was this angry, sad, and generally rendered incapable of much useful work. It's shocked me how hard the last two days have been: I mean, yes, it's awful—twenty first graders shot dead in their classroom, and the teachers and principal and so on. Of course it's awful. But there have been lots of awful things over the last ten years.
I'm not alone in this. Something about the events in Newtown has made us as a nation viscerally, boiling-over angry in a way that we haven't seen since 9/11/01. We are grief-stricken in a way we don't know what to do with.
I realized tonight that underneath the weeping for those 20 six- and seven-year-olds is something bigger. I am weeping for my country. I am weeping for the sense that this is becoming a place that isn't mine anymore. But I'm not from anywhere else. This is my home. I'm an American.
It isn't cheap political rhetoric. I spent a few days in Toronto in September, and it was such a startling weight off of me, walking through the streets of a very urban, gritty, full-of-urban-problems city, and not feeling the sense of anxiety that hangs even over my nice hometown of Minneapolis. It was like losing a headache I'd forgotten was there.
Toronto's no paradise. Canada's no paradise. I'm probably never moving to Canada. But I just don't get how so many people, including some of my friends, look at Canada and Western and Northern Europe, and sneer at universal health care and pooh-pooh the lack of gun violence. I could quote figures at you, but I don't want to here. That's not the point. The point is, I felt more at home and at peace in a strange city than I do in my own front yard. I found that profoundly unsettling.
I am angry, angrier than I've been willing to admit to myself. I cover it up pretty well most of the time, I think—both from others and myself—but what I've seen in some of my liberal friends—the bitterness and fatalism—well, I worry I'm coming down with it too. I love my country, and I want it to be a place of love and peace. That's the picture I grew up with, and as I get older, I realize most of my fellow Americans have either given up on that vision as childish, or never had it in the first place. Instead it's a nation filled with demons needing to be stomped out with vigor. No dream of a better place in the here and now, just a resigned sigh that the battle is never won, and hope for peace in the next world.
But we're the nation that made a great industry out of dreams and fantasies. You'd think we'd know better, that we could learn to harness this great national talent for self-invention, and become a nation of Ray Bradburys. But we're not. We produce Ray Bradburys in a way no other country could, but the fantasies we adopt as our national scripts are full not of magic and hope, but of moralizing and fear and brimstone.
We are not the Greatest Nation on Earth. Whoever said that anyway? It sounds like a P. T. Barnum line. It's cheap boasting, and we've always been good at that. But we've also been good at self-deprecation, and we've been sorely lacking that in our national debate lately, outside of Comedy Central. Maybe we were the greatest nation on earth for a while after World War II, but we didn't even get to enjoy it, because we were so consumed with hate for dissent and fear within ourselves.
I love my country, but my country lies to itself. It hates itself. It's like loving someone with anorexia: their body image doesn't match their body, and becomes an ugly tool of self-mutilation, instead of a guide to positive change.
I am angry that we need revisionists like Howard Zinn (We who live in a nation that prides itself on clarity and practical know-how. No fancy theories with abstract thises and thats—we leave that to the old world. No outdated, ossified social hierarchies). But we need the Howard Zinns to show us how we have lied and lied again to ourselves. Lies upon lies. No fancy theories, just plain bald-faced ignorance of evidence and stubbornness. We let people say science is just someone's opinion, and all opinions are equal, and so it doesn't matter a whit how much research and effort you've done.
Jonathan Haidt thinks liberals don't care about sanctity and loyalty and respect. We do. I do anyway. And it hurts to think that what was sacred, what I want to be loyal to and respect, has been dragged through the filth, betrayed my loyalty, and forfeited my respect.
I want to live in a country where ideology is not king, especially ideology that masks rapaciousness and greed. I know we're never going to be rid of ideologues, and that's OK. But the floor of our national sense of self is rotting from underneath, and all we seem to be able to summon the collective will to do is tap on the floor with our foot and complain about the funny smell, and argue about whose job it is to hire the contractor and whether we really ought to pay for new sills.
And weep when twenty children fall through the hole and into the basement, gone forever.
Friday, November 30, 2012
Folk songs
I've sung the song "Good Morning Mister Railroad Man" (also called "Danville Girl") practically every night for ten years. I learned it from a recording of Cisco Houston, Woody Guthrie's sometime companion, that I listened to a lot as a child (here's a recording of the two of them singing a different version of the song together in 1944). I learned it word for word and inflection for inflection from that one recording.
Is it a folk song? Well, it was cobbled together by hobos from bits and pieces, and adapted by Houston and Guthrie. Who knows where those pieces came from, or whom they passed through. According to Michael Cooney, this process through an oral tradition, like a game of telephone, is what makes a folk song a folk song.
But I think there's something deeper. It's important, for example, for many of my folk friends, to get the words right. Make sure you respect the tradition by getting it right. And there are a lot of good resources out there: collections of texts and field recordings and scholarly editions... for folkies there's always the wonderful Mudcat Cafe, where you can find many many variants of this song.
And so, though there is no "right" version of this song, you can pick out any of the selection on view, most transcribed from recordings, and you can learn to reproduce that version.
In an oral tradition, without recording technology or writing, such reproduction is meaningless. Performances vary, the exact word order varies in places, verses get transposed... The fixing of a song in place is in itself meaningless. Songs in this world are inherently fluid.
So why do I feel a little guilty when I want to mess with what someone else has written? We have copyright laws, which depend on reproduction—hence the "copy" in "copyright." They also serve to fix a form in time: a work "published" on such and such a date becomes a definitive version. I respect copyright (I spent summers as a teen working on copyright filing issues). I've come to think of this fixed, published form as privileged over the fluid forms that, I've come to realize, are a more natural shape for ideas to move amongst us.
I write songs very very slowly—I probably have eight decent ones to show for almost thirty years of trying. Part of my problem is that I don't actually write songs very well—they need to develop pretty much complete in my head, and if I push too hard, and especially if I go writing things down too soon, it kind of spoils the soup. Good thing I have a day job.
I think my song composing process is a little like the fluidity of the folk process; I need to forget a song almost, then think of it again, only maybe get it a little bit "righter." Over and over—I finally "got right" a song a couple years ago I'd mostly written in college, first as a lullaby for Joseph (why do all the Christmas songs focus on Jesus and Mary? Yes, I know the reason, but it hardly seems fair to patient, kind old Joseph.), then as an imagined fisherman's lullaby. Finally, twenty years on, I came back to it and was able to make it sound right—just a couple tweaks to the first verse was all it needed, but it needed them.
And now, I think of it as fixed, and I think of that fixity as proper—for me. It's the same for jokes—there are jokes I love that I say the same way every time. They are fixed for me, because they fit me. If someone else learned them from me, and it wasn't written or recorded, I expect they'd learn them differently. On the other hand, especially in our well-recorded world, there are performances one learns verbatim. Hang around with my old Morris team and you'll be able to reproduce Monty Python and the Holy Grail almost word for word, and inflection for inflection. These have left the folk process, and in their brilliant final-cut versions, are not so much jokes as routines.
And the great rich songs of the living folk tradition have bits that last like hardened gems, never changing—not always the punch line, but particular memorable turns of phrase, perfect rhymes, or the repeated formula that holds a song together.
Here's the paradox at the heart of my ramblings here: we want to remember, but if we remember too well, we remove the water the memory is swimming in. Our care, our collecting and sorting and knowing of songs is great, but it also means the songs themselves—not new, original songs, but living, changing existing songs—stop moving and growing. We need to be able to forget them enough to have to struggle to remember, and so let them grow and change to fit us.
A song that's gone through the mill, and come out weathered and smooth, that's a traditional song or a folk song. But even more, a song still floating in that tide, still bouncing around between people, changing and adapting—that's a living folk song. Most of what we sing as folk songs is taxidermy.
Sunday, November 25, 2012
Conservatism and liberalism
I've been ruminating since the election on what exactly people who call themselves conservatives actually mean by that. It turns out conservatism is a relatively new word, only emerging as a common term in 1819 in the title of the French ultra-royalist journal Le Conservateur. It is a term that really only began to make sense in the wake of the French Revolution, with its wholesale destruction of not only an established political order, but much of the national cultural riches that had grown up around that order. Not to mention the Terror.
People had words for “old guard” before 1819; conservatism represents half of a dynamic that's been going on since the dawn of time: some people want to do things the way they have been done habitually, and others want to change those patterns. At root, it's a basic biological function: continuity vs. adaptation. Both valuable, both valid, together regularly in conflict.
But something happened to the political world of Europe (and the Americas) in what we now call the “revolutionary era.” Revolution itself became a self-conscious cause. Some revolutions were specifically about a change of regime, but for the first time the idea that people ought to revolt and establish a new order organically had real on-the-ground success—established governments were founded not on the overthrow of a particular tyrant or governing identity group (ethnic, religious, etc.), but on the overthrow of a “class” of people. “Conservatism,” I believe, was a reaction to revolutionism as a philosophy.
“Tradition” is a word that gets bandied about a lot in conservative circles. In America, “traditional family values” is a code phrase for family structures and social mores grounded in the nuclear, church-going family. “Traditional marriage” is used to denote a life-long commitment between a man and a woman, producing offspring biologically, sanctified in a religious institution (see my article about this usage here).
But “traditional” is also a word used a lot in the folk music and dance community. It connotes being part of a style or form that's been passed down from person to person. But folk music as a field was also heavily influenced by leftist political activism: union organizers adapted hymns into union songs, desegregation activism was founded in part on singing songs like “We Shall Overcome,” and the entire counterculture of the late 1960s and early 1970s included heavy doses of the electrified rediscovery of folk music. It still does.
The thing is, traditions can be revolutionary. As the Wikipedia article on conservatism points out, “In the United States, conservatism is rooted in the American Revolution and its commitment to conserve the rights and liberties of Englishmen. Most European conservative writers do not accept American conservatism as genuine; they consider it to be a variety of liberalism.” So also we see the Chinese Communist Party engaged in a struggle between “reformers” and the revolutionary “old guard.”
Really, any revolution that sustains itself long enough to become the new established order carries this possibility. How else could we have a “Rock and Roll Hall of Fame?” And so we have a fundamentally confused idea: “conservatives” fighting for policies, developed in the 1980s, that aim to dismantle “liberal” institutions assembled in the 1930s and 1940s.
Conservatism doesn’t really make sense as a working philosophy in and of itself. Conservatism means maintaining institutions generally, but without being attached to some specific identity or institution, it could just as easily mean the old guard in Beijing as the Dalai Lama's retinue in Dharamsala; ultra-Orthodox Jewish settlers or Salafist Saudis; singers from the Sacred Harp or singers from Wobbly songbooks.
So: conservative… what? What is being preserved? And the thing is, we almost all want something preserved, collected, retained, remembered… we don't by and large want to completely forget. Even the monsters of cultural memory erasure—Taliban who destroyed Buddhist monuments, Pol Pot's educide, the iconoclasms that have swept the Christian world for centuries—have their own ideal past they wish to purify and make clear. They are conservative even as they destroy.
I betray my biases. I was raised a self-confessed liberal. I was raised to sneer at Ronald Reagan and Jerry Falwell and Barry Goldwater. And here’s the thing: there is no such thing as purity of conservatism or liberalism. We all want some things to remain, and some to change. The American Republican party was formed by the joining of the business-conservative Whigs with the “Radical Republicans” who rallied around slavery abolition. Revolution and plutocracy wrapped up in one package. We liberals who rail against the “former Party of Lincoln” forget this. Lincoln stood for abolition (eventually) but also for railroad baronies. Nixon was for overthrowing Allende in Chile and for forming the EPA.
I sing old songs and new songs that sound like old songs, after listening to new rock music on the radio. I sit in silent worship in a 350-year-old practice, while wrestling with my own very modern non-theism. I live in a 122-year-old house, typing in a blog entry over wi-fi.
Conservatism for conservatism's sake makes no sense, just as liberalism for liberalism's sake ends up as a grand mushiness. We as humans need traditions and institutions and patterns to make sense, and we need to be able to edit those patterns as the world they describe changes. And for any given tradition, we need conservatives working in the “old style” at the same time as liberal avant-gardists push the envelope. This is true in the arts, and it’s true in public life.
I need to give up my own broad liberalism for something more nuanced. And I hope some of my conservative friends will consider doing the same.
Thursday, November 22, 2012
Things Fall Apart
It didn't have to end this way. And, actually, it didn't end this way.
The nightmare of my childhood and young adult years was all-out nuclear war: the end-of-the-world scenario younger viewers will recognize from the end of Terminator 3. I remember it most vividly from The Day After and Threads, American and British what-if-there-were-a-nuclear-war movies.
The horror of that vision is so absolute: nothing but irradiated dirt, burnt corpses, smoldering ruins... and the presence of that nightmare lurked in the background for half a century. It still lurks today, even further in the background, though Russia and the US seem like unlikely all-out enemies today.
But in the wake of that vision of the End of Everything, there was the question, what comes afterward? What about the survivors? And the answers we were given were just as awful: a breakdown of order, summary execution of looters (that was a scene in Threads that stopped me cold and still runs through my head sometimes), shorter brutish lifespans, nuclear winter, ruined crops, starvation...
And Mad Max. Or young Don Johnson in that most peculiar film A Boy and His Dog. The world turned desert, every man for himself. Kind of like Conan the Barbarian's world, only in the imagined future, not the imagined past.
This is a world where everyone is an orphan or a widow/er, where no-one whom we survivors meet (because you and I will be part of the lucky 5%, right?) is a friend or family. So even more than Conan's world, it's the world of B-grade westerns, full of suspicious gun-toting strangers.
Here's the thing: most of the real horrors of the world don't happen with the breakdown of a larger society. They happen when that larger society is kidnapped by psychopaths with a Theory: Aryan superiority, collectivization, the legitimacy of the Protocols of Zion, the Tutsi Menace... When that Theory is enacted, hundreds of thousands can be efficiently murdered. When the mass societies—which may do these terrible things but mostly just serve to organize people into ever-more-efficient machines for making things—break down, they tend, sooner or later, to re-form as small societies. These small societies may wage regular low-level warfare on each other, but my point is things do NOT completely fall apart for very long.
European explorers and long-distance traders in the Americas of the 16th to 18th centuries came across well-organized groups of Indians. They appeared, in fact, to be a permanent part of the primeval wilderness. What they did not realize was that the primeval wilderness had been a lot less wild only a few generations earlier, before waves of disease destroyed a huge proportion of the population (50%? 70%? more?). By the time those Europeans penetrated the interior of the country, whole nations had vanished, and what the Europeans encountered were the survivors. What they took as natural poverty was the poverty of the children of refugees from a holocaust.
But they didn't see savage anarchy. They thought they saw savages, but savages with intricate kinship structures, a religious life, stories and arts and costumes and dances and villages and all those things that early anthropologists loved to collect and write down. And these survivors had organized political alliances in the fast-changing landscape, entire new tribes sometimes formed out of the decimated remnants of old tribes.
The end is not the end. The collapse of a state, or a church, or an economy, or of any institution, doesn't mean zombies shambling in the streets. Or rather, it only means shambling zombies for people so devoid of social imagination that life is literally meaningless without the collapsed entity. And sadly, if that's the case... those are the people who ought to be sympathetically treated as zombies. Not the poor survivors out looking to re-form some kind of society and feed themselves and their family and friends as best they can.
Sunday, November 4, 2012
God Is a Really Good Story
I've been struggling for months, not with my non-theism, but with a way to frame it that reflects my sense of "yes." Our family has been attending a different Friends Meeting from the one we attended for over a decade, one founded several years ago out of a desire for a more explicitly theist worship. A number of our friends have been involved in the group. We tried it one Sunday, and just kept going.
It would make a good narrative, I suppose, or anyway a more stock narrative, to say I was somehow converted or convinced. I haven't been, not in the way that is usually meant. But I have been sitting with a kind of "disturbance in the Force" that predates our joining Laughing Waters worship. I've spent most of this year trying to get to where this disturbance is coming from.
I had an image come to me in worship last Sunday. It was simple, but it's a puzzle, and it's not leaving me alone. It's a question, which I know is concealed behind an impenetrable wall. The question, not the answer. I don't know what the wall is (I assume it's metaphorical), and I don't know what the question is. So like the character in Kafka's "Before the Law," I'm waiting.
One thing that has been coming more clearly to me is how I stand in relation to God. I've been getting clearer and clearer over the last few years in my non-theism, moving from a "Who the heck knows" attitude to an "I'll be really surprised if there is a God." I've been enjoying Frank Turner's joyful "no!" in his song "Glory Hallelujah."
And within the last few months, I've come to see God as a really central, vital—and fictional— character. A really really important figure who doesn't exist factually. That feels right to me. Because it's not that Yahweh is undeserving of respect. Jesus and the Holy Spirit too. The stories in the Bible, and all the saints stories, and all the stories of what faith can do... all important.
I think the big mistake is actually trying to bring factuality into religious discussion, as if reproducible evidence will make it work better. It doesn't. Personal witness, yes. Scientific proofs, no. Why should this be?
I think we often assume that what is factual is more real than what is found in stories. As They Might Be Giants says, "Science Is Real."
But while science is about real things in the sense that it's about things we can share even with strangers, it isn't a very rich internal language, or even much of a language for intimate social interactions, as between a parent and child, or between lovers or even close friends. Part of what makes intimate human relationships work is specifically the non-reproducible results: the specific moments shared and not in need of public justification.
It's these non-reproducible results—having specific human feelings of love, hate, anger, joy, calm, or fury at specific times and places—that story-telling works with. They are not recipes, despite what folklorists and mythologists want us to think. If they were, they would have been written as recipes or maps. Instead, stories are about specific characters with whom we can parallel our own specific experiences.
Science is a kind of discussion we can have with strangers, and as such it is essential. It's hard to imagine a world in which we didn't have the dollar, and the degree centigrade, and the meter and liter and so on and on... a world where we couldn't trust in a platform of common discourse that can go as far as things like human rights.
And this is why human rights, while they are vital, are not enough. Invoking rights means we are strangers who are trying not to hurt each other. We need more than that, and most of us have more than that, in the form of love.
Love is not rational, as Mr Spock and Mr Data found over and over in Star Trek. It does not submit to measurement. But for human beings, it is clearly essential.
So how do we get from this point to a fictional God?
I was flabbergasted a month or so ago by a video of US Representative Paul Broun, who serves on the House Committee on Science, Space, & Technology; the video got a wide mocking audience via Facebook.
It's flabbergasting because I've come to think of this kind of science-bashing as coming from ignoramuses, not medical doctors with public policy reach. But the more I think about the kind of religious absolutism this represents (the notion that scriptural knowledge trumps science and law, that God's truth surrounds and contains whatever petty knowledge we mortals can hope to obtain), the less it looks like the problem is that surround. The problem seems to me to be the idea that factualism is the true container for the human experience.
Facts are bits of reality we can wrap our minds around, and share with strangers. Wonder and love— and religion—are what we share with friends. So when we try and make religion factual, try to make that the gold standard for legitimate discourse, we alienate ourselves from the very intimate moments of love and transcendence we're trying to get to.
Why do we try to tell each other God is real in one way or another? Well, God is real in the same sense that our love is real, or our anger, or our fear. God is real in the same sense that any truly powerful, heart-changing story is real. Don't tell me that what rips my heart out at the end of The Shawshank Redemption, or what made me feel a deep ache of mourning that lasted for days at the end of Stephen King's 11/22/63, isn't real. Of course they're real. But they are real fictions.
So. This, it seems to me, is a task before us: to confront literalism—the demand that what is important is always factually true. It isn't. And that confrontation needs to include a robust counterproposal to the argument of literalism. It needs to not try and toss out God, but to make God's proper place respectable again. A shining throne? Not my style, but if that works for you, fine. I don't care for the big royal medal ceremony at the end of Star Wars either. No, what I mean is not what kinds of trappings God deserves, but what kind of genuine respect God-stories deserve, without requiring they sound like lab reports.
And hand-in-hand with this is a recognition that the way the universe works that we have learned through scientific experiment—the truly universal and factual world—is a truer framework for the world we enter as strangers.
Thursday, October 18, 2012
On Critique
This is a paper I gave (am giving?) at NACIS's annual meeting in Portland, Oregon. I hope you will find it of use.
---
Critique is basically looking at something someone has made, usually in the presence of the maker, and talking about why it does and doesn’t work. Critique is a core methodology for teaching fine arts, which was my college major. Critiques have become a core part of NACIS, including the CartoTalk gallery, the Map-off, and roundtable map talks. Whether it’s with colleagues or with clients, we all engage in feedback. We all need the outside point of view that allows us to see our maps freshly.
However, I want to start by reading from a 1997 interview with Stephen Sondheim, from the TV series Inside the Actors Studio. For those unfamiliar with Sondheim, he is probably the greatest living writer of Broadway musicals, with his heyday in the 1970s and 1980s.
INTERVIEWER
When you were ten and your parents divorced, your mother moved to Pennsylvania and it was there at the age of eleven that you encountered Jimmy Hammerstein and were welcomed into the family of Oscar and Dorothy Hammerstein. I understand you’ve said that if Hammerstein had been a geologist, you would have become a geologist.
STEPHEN SONDHEIM
Yes. He was a surrogate father and a mentor to me up until his death. When I was fifteen, I wrote a show for George School, the Friends school I went to. It was called “By George” and was about the students and the faculty. I was convinced that Rodgers and Hammerstein couldn’t wait to produce it, so I gave it to Oscar and asked him to read it as if he didn’t know me. I went to bed dreaming of my name in lights on Broadway, and when I was summoned to his house the next day he asked, Do you really want me to treat this as if I didn’t know you? Oh yes, I said, to which he replied, In that case, it’s the worst thing I’ve ever read. He saw me blanch and continued, I didn’t say it was untalented, but let’s look at it. He proceeded to discuss it as if it were a serious piece. He started right from the first stage direction; and I’ve often said, at the risk of hyperbole, that I probably learned more about writing songs that afternoon than I learned the rest of my life. He taught me how to structure a song, what a character was, what a scene was; he taught me how to tell a story, how not to tell a story, how to make stage directions practical.
Of course when you’re fifteen you’re a sponge. I soaked it all up and I still practice the principles he taught me that afternoon. From then on, until the day he died, I showed him everything I wrote, and eventually had the Oedipal thrill of being able to criticize his lyrics, which was a generous thing for him to let me do.
This is a different kind of critique than most of us will ever get. I'm not going to suggest that the Oscar Hammerstein of maps—I don’t know who that would be anyway—is going to offer you the secrets of his or her trade. But I want to hold up this example for clues about how we can best position ourselves to receive and give critique. Specifically, I think there are social barriers that often keep our critiques from being what they could be.
The first thing I want to call your attention to is that Sondheim and Hammerstein were crystal clear about the field in which they were playing. Hammerstein wrote Broadway musicals, and Sondheim wanted to win at that very specific game. If he had wanted to write a Gilbert and Sullivan operetta, or an art-song cycle, or radio jingles, Hammerstein might have been supportive, but would not have been able to offer such specific advice, grounded in experience. Same if Sondheim had wanted to write something new and revolutionary, to jump tracks. Sondheim wanted to perform within a specific, evolving but established tradition.
Too often in cartography, we act as if “knowing how to make maps” is a singular tradition. And there is a basic unity to modern cartography. But I make urban navigational maps as my primary specialty, with a secondary specialty making regional thematic point maps, in both cases within the context of a for-profit map publishing company. Pretty much everything else I do is a kind of experiment. I shouldn’t really be trying to pass myself off as an expert on choropleth maps, or mountain hiking maps, or geological maps, or any other of a hundred other map types.
That phrase “map type” sounds like it’s all about definitional categories, but it’s also about context—how do you sell the map, or otherwise get it out to the audience? Broadway musicals were defined at least as much by the business structure of producing a show as by artistic theory. I can tell you this kind of issue is central to commercial map publishing, that actual cartography is only the most obvious part of the job. Whatever branch of the map world we’re in, we know this, but somehow in discussing cartography, it’s harder to bring that element of our work—call it the business end, maybe—into our critiques. Maybe it feels like some kind of sell-out, the tainting of pure mappage by the finger of mammon, or pandering to the public, or maybe it’s because we really don’t understand who we’re making maps for—we just keep making maps and dang it, the public keeps using them. But that public is who we’re making maps for, so if we pay as close attention as we can to how that public is being reached, it should help us make maps that do a better job.
Our audience—who we’re making the map for—is not just the end user. Hammerstein’s critique, if it was anything like his 1949 essay Notes on Lyrics, was focused not just on lyrics as text, or on the experience of the audience, but also on the work of the singer. His critique of Sondheim was broadly structural: how to compose a scene, how to structure a song. Your map is probably part of a presentation your client or organization will be making to someone. It’s in no way shameful to factor that wider performance into your critique. It’s not a sell-out unless you personally cannot stomach the client or organization’s message.
---
I was a studio art student in college, as I mentioned earlier, and critique remains a crucial part of studio education. For me, the frame for critique was as much collegial as professorial, about getting students to talk critically about each other’s work. I want to mention an interesting book from 2001, Why Art Cannot Be Taught: A Handbook for Art Students, by James Elkins. It’s actually just about the only literature on modern post-secondary art teaching. It makes the point that because fine art is no longer about following specific technical traditions—learning to paint like your master—the idea that art teachers can critique from a position of stylistic authority breaks down: we expect art students to find an original voice, rather than be measured on how they can speak in the copied voice of the teacher, or of canonic Old Masters. And so while art teachers are still teachers, and often lead a discussion, they focus in part on students learning from each other. This is an oversimplification of course, but that to me is the nut of the situation. And thus the title of Elkins's book, Why Art Cannot Be Taught.
We don’t really have the problem of lack of technical traditions in cartography—there is still a clearly defined broad visual language in our field, and the ideal of self-expression doesn’t loom large. Cartographers largely share the same tools and tricks. And yet in our critiques, in forums like NACIS and CartoTalk, we maintain a strong collegial quality: even when there are clearly newbies in the room, asking for basic feedback, we feel like we need to respect their work. And we do, but too often, we also feel the need to be nice. We don’t want to hurt feelings by putting ourselves in the position of Hammerstein, telling people that their map is garbage.
This is not as much of a problem with clients. Clients are usually willing to tell you when a map you made for them is not what they had planned on paying you for. The trouble is, they often have a hard time telling you what they want done differently. They probably believe they don’t know map design, and so they don’t know what is possible. Often they have no design background: “This map is too, you know, blah-ish, or too, kind of, hard to read, or just not PUNCHy enough.... You figure it out. You’re the cartographer.”
And so we find ourselves in this weird position: we are the authority on making maps—the stand-in for Hammerstein—this is why they hired us after all; and they are the ones with the authority of the checkbook who aren’t happy with what we are producing for them. And that can be really awkward.
More than ten years ago, I was at a party at my brother's house, and a friend of his, really insistently, came to talk with me. I'd never met him before, but he was happy to tell me at length why my map stunk. It was Professor Pathfinder’s Downtown Denver, and although my brother still cringes at the memory, it was a really useful critique, because it centered not around how I was violating some preconceived standards, but on how he had a lousy experience trying to use my map.
You cannot deny any user their experience. You can try to cast it as illegitimate or ill-informed—an outlier experience—but you can’t say they didn’t have it. And to me, that’s the best way out of the incoherent client critique trap: focus on their experience and how that isn’t working. You’re then essentially doing what I said earlier: working within a shared audience-centered environment.
But what about the niceness trap? Most people I’ve dealt with are not as clear and direct as my brother’s friend. They simultaneously want to be friendly, and think your map has serious issues. Critiques often come from positions of power—teachers, bosses or clients—and despite stereotyping, most people aren’t interested in playing the heavy with people they have a continuing relationship with. I think it’s the fear of being seen as mean that inspires a lot of forced collegiality.
Hammerstein was in his late 40s, a very successful lyricist. Sondheim was 15, clearly ambitious, but also clearly in awe of Hammerstein. But Sondheim was a family friend. He taught the elder Hammerstein to play chess. He spent a lot of weekends there. Hammerstein was “Uncle Ocky” to him, a surrogate father.
There was a moment at the beginning of the critique, and it almost seems like a formality, but I think it’s actually central. Hammerstein says, “Do you really want me to treat this as if I didn’t know you?” and Sondheim replies, “Oh yes.” What they’re doing is agreeing to play-act. By agreeing that Hammerstein will treat this script by his young friend as if he wasn’t a friend, they’ve set up a privileged space, an afternoon-long social parenthesis, where Hammerstein can impart some really important information. Without that parenthesis, with Sondheim being the crushed teenager and Hammerstein the rejecting parent-figure, would that exchange of information have been possible? I don’t think so.
And this is the weird crux of hearing critique: You have to be able to pretend the critiquer really knows what he or she is talking about—and you have to be able to trust them. And this is where that social parenthesis is so useful. It’s a kind of suspension of belief, from which both participants are released at the end of the discussion.
When you’re in a position of direct power—employer and employee, teacher and student—that parenthesis is really hard to come by, which I think is actually a great case for seeking outside critique. It’s hard to pretend that the person giving you the grade is coming at you as a total stranger, or that you aren’t judging your employee on their performance when you are in fact... judging their performance.
This is where that most common comment I got from fellow cartographers on critique comes from: “focus on the work, not the person.” I want to suggest two approaches here. The first is the focus on user experience. Then your judgments aren’t about you, the judge; they’re about an amorphous “them”. The second, which I like even better, partly because it seems more honest, is to focus on the Work, with a capital W. How do we get better maps to exist, in general? This makes the critique a joint exercise. It also takes performance pressure off that pathetic excuse for a map you have sitting between you. You still don’t want it to fail, but its failure is part of a process, not the failed summit of Everest.
If you put these two things together—the social parenthesis, and a focus on user experience—you come up with an interesting model that’s not too different from any good performance: the audience enters the performance space not knowing the performer, but being mostly willing to trust that the performer will reveal something important or at least interesting. The performer’s job is simultaneously to connect—to reward the audience's trust with a sense that they are in fact on our side—and to tell us something new. We then leave the performance changed.
---
And then there’s the critique we all desire, deep-down.
Folks who have learned the lesson of positive reinforcement will start off by telling you how great your map is, and then say “BUT”... And then you’ll get to the real point of the critique—what isn’t working and how to make it work. Other times, you’ll get a thumbs-up for doing something you’ve already been telling people you do well. Call these “affirmations”, perhaps. They’re great, they boost the ego, but they aren’t news. They don’t change you.
And every so often, you’ll get a comment that’s like a little moment of grace. You’ll be praised for something that you weren’t really paying attention to.
I had this great moment on CartoTalk, a couple years ago, where Derek Tonn made some really lovely comment about my hand at label placement. Totally out of nowhere, and I had certainly never thought, “well, if all else fails, I can always go into label placement, because I’m good at it.” What that little unexpected bit of praise did was to make me pay attention to something I was doing and taking for granted. There’s a singing and dancing friend of mine who is great about this: when I first knew him, he just seemed a little overexcited about stuff, but now I can see that what he’s doing is pointing out moments we might just take for granted, things like, “Wasn’t that just a perfect night and it was so cool to see four sets of parents and kids each leading songs.”
Like I said, it’s like grace, in that you can’t force it. Having to find nice things to say doesn’t work when you just don’t see them.
And the key to seeing them is to give yourself time. This is one of the biggest problems with most critiques: they’re scheduled, and you’re in busy, get-it-done-on-budget mode. A really good critique requires a kind of mental state that allows the critiquer to wander over the work, and allow observations, both positive and negative, to develop. And it’s in that time and space that you can find surprises, and find structures and qualities the map-maker wasn’t looking for, and then suss those out a little bit internally before you blurt them out.
To conclude:
Some of you may be familiar with the TV show Mystery Science Theater 3000. The frame story doesn’t matter here: the central premise is that a bad movie is shown to a human and his robot friends (they’re puppets). We see them in silhouette, and they wisecrack their way through the show, taking it down a usually well-deserved notch or three.
So we have a performance—the bad movie—that fails. Then we have another performance on top of that, a commentary on failure and how we can still enjoy a work that is bad. The wisecrackers enter the performance ready to mock, and we are happy to mock with them—it’s a comedy, as most bearable works about failure are. Most of us do not want to be that bad movie, and if the movie is our friend’s work, we don’t want to be the wisecrackers. We really don’t want to be in a comedy when it comes to work with people we care about.
But we are in a comedy. We all make bad maps, one way or another—or, to speak more gently, deeply imperfect maps. We all have to look at imperfect maps other people make. We don’t have to be wise-cracking jerks about it, but if we aren’t able to admit this basic fact, we’ll get nowhere: that we are all at least partly Stephen Sondheim, aged 15—full of potential, not necessarily knowing what we’re doing, wanting desperately to please and to be successful, wanting people to think we’re more than competent, and always in need of some good critique.
Friday, September 28, 2012
City on a Hill
I've been listening to Bruce Springsteen's most recent album, Wrecking Ball. It's a good album. But the thing that is interesting to me is how he is writing in large part about the Idea Of America, and how so many people are frustrated by the lack of that idea's implementation.
The songs are pretty much all from a point of view: the characters may take on a mythic Everyman quality, but the idea is that they are spoken from a singular point of view. They aren't hymns; they aren't meant to be expressions of collective truth, even when the first person plural is invoked in the first song "We Take Care of Our Own."
Why is this important? Because we each (most of us) carry around our own little America, and that is the "shining city on a hill" Reagan invoked 30+ years ago. And each Perfect Union is a little different. Most of ours conveniently leave out some constituent type of Actual American—the kinds of people that drive you, personally, nuts.
I was struck by how conveniently the otherwise beautiful and heart-wrenching 11/22/63 by Stephen King gave space to black people but had NONE in the school where the main character was teaching. Were there really no black students in small-town Texas in 1963? None? It's not that black people are absent, but they are there as phantoms, as evidence of how deeply flawed America was around 1960. But the evidence was news and signs and white people talking, not actual flesh-and-blood black people.
But then, it would have been a different story. And all stories are selective. All of them. That's part of how stories work: they allow us to pull a singular pattern out from the overwhelming swirl of life, and look at that pattern's shape.
So I invoke this idea of a multitude of Little Americas in peoples' hearts not to say we should try to do differently. I don’t think we can. I think when organizations try and make the little dreams into one big dream—and then into reality—monstrous things happen. Reichs are created, and People's Republics. Places where people desperately wear a mask of someone else’s dream, in hopes they will be allowed to survive.
At the White Privilege Conference last year, there was a roundtable discussion with a group of well-respected radical activist academics. And I was struck by the comment of one of them, to the effect that she had given up on America. As far as she was concerned, it was irredeemable, with the sheer quantity of injustice and blood on its hands. And I understand exactly where she's coming from. The United States has a lot to answer for, most of which it probably never will. Certainly not in full. But I thought how sad that comment made me feel.
I don't want to live in your perfect America. You probably don't want to live in mine. But it’s the fact that each of us (or most of us, disillusioned folks aside) has a perfect America we carry around with us that makes it possible for us to go on trying to be part of this whole. I think that the sense of a perfected anything is necessary to go on caring. Even if we know that perfection will never come to be.
I'm reminded of "Ramadan", from Neil Gaiman's The Sandman (collected in Fables and Reflections), where the Caliph of the Baghdad of the 1001 Nights surveys his perfect Baghdad. He knows it will not last, and so he persuades the Lord of Dream to bottle that perfect city, the city of his dream, and preserve it forever.
This is the false balance. When we try to protect our dream Americas, when we fight hard to keep them from bumping into the hard realities of the real America and changing in the process, becoming maybe less perfect, we are like people taking a living tradition and putting it under glass. We're like someone who shoots the last animal of its kind so we will have a good stuffed specimen in the zoo. It's not what animals are for, or traditions, or dream worlds.
Sunday, April 22, 2012
(Sometimes) Forget
I've been talking to first-graders at my son's elementary school this month, about maps and the Titanic (we did a Titanic Reference Map in 1998), and what I do when I make maps. I learned back in preschool that getting too technical at this age just loses them, so I tried to use concrete examples instead of "the real words." This process can be really illuminating to me as well, opening up the true meaning of concepts I've learned to gloss over.
"Generalization" for example. The Titanic Reference Map includes deck plans of the ship, color coded by broad usage: 1st class cabins and common areas, same for 2nd class and 3rd class; crew quarters and utility...
I traced these shapes from deck plans published as part of the investigations into the disaster, plans that include placement of bathroom fixtures, dining room tables, etc. Part of my job, then, was simplification, a kind of generalization. A kind of selective forgetting.
I grew up with the idea that it was good to remember as much as possible, that it was sad that we had forgotten so much, that so much history had been lost. This idea is deeply embedded in my values. And I'm coming to realize that it's not entirely true.
What is the value in forgetting? Well, for one things, it means we have some real say in the stories we retell. If there had been strong documentary evidence (including video interviews and reams of contemporary commentary), about King Lir, would Shakespeare have been as free to construct his narrative around his own deep understanding of human relationships? And what about his "history plays," where there was more documentary evidence available to him... Shakespeare lived in a world where the fictionalization of even relatively contemporary stories was the norm, but he arrived on the cusp of an age—our age—where people paid more attention to the lines between fiction and non-fiction, where the rules around non-fiction became stricter, and where people took documentary non-fiction more seriously as a gauge of how to live in the "real world."
I've been intrigued, in my limited recent reading into North American Indian history (as it is has been reconstructed), how quickly the "historically accurate" truths of peoples' origins fade into legend: the Cahokia culture had collapsed less than two centuries before Columbus, but by the time of European contact with the Indians of the area in the 17th century, the mound-makers were people of myth. By contrast, we know the names of architects who built European monuments 2000 years old, let alone the kings who ordered them built.
Is there advantage in literate memory? Well, we certainly think so. For one thing, it's easier to prevent repeats of long-term disastrous behavior. As George Santayana famously wrote,
And what of our personal knowledge? I have every composition I ever wrote on a computer—every sketch, every half-baked idea, every email—somewhere on a disc or my hard-drive. Everything. I will never ever read all these things. In fact, I will probably read almost none of them again.
When I was in high school, I started a journal/notebook, which I kept up, off and on, for ten years. Will I ever profit from looking through all those pages of stuff? I mean, it might be fun, but perhaps a small selection of those 20-odd volumes might be of some interest. I think I was inspired by things like Coleridge's notebooks, which are treasures, scrupulously edited and notated. But Coleridge didn't write them thinking they were going to to be looked at and researched and (for God's sake) published in a scholarly edition 200 years on. They were a working tool, and if they had stopped being useful, if they had in fact become a drag upon his work, I suspect they would have been given the heave-ho.
We mourn the loss of the library of Alexandria. But have we gone too far the other direction? Do we remember too much? Are we headed towards a future where, like the angels in John Crowley's Great Work of Time, we long to be able to forget, to not have been made to remember?
Where is the balance? Is there a principle that can be invoked? What should we remember, and what should we allow to be forgotten?
"Generalization" for example. The Titanic Reference Map includes deck plans of the ship, color coded by broad usage: 1st class cabins and common areas, same for 2nd class and 3rd class; crew quarters and utility...
I traced these shapes from deck plans published as part of the investigations into the disaster, plans that include placement of bathroom fixtures, dining room tables, etc. Part of my job, then, was simplification, a kind of generalization. A kind of selective forgetting.
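For the technically inclined, here is a minimal sketch of that kind of selective forgetting in code, using Python and the shapely library. It is emphatically not the workflow behind the actual Titanic map; the deck-plan outline and the tolerance below are invented purely for illustration.

    # A toy illustration only: cartographic generalization as "selective forgetting."
    # Assumes the shapely library is installed; coordinates and tolerance are made up.
    from shapely.geometry import Polygon

    # Pretend this is a traced deck-plan outline, full of tiny jogs for fixtures.
    detailed = Polygon([
        (0, 0), (10, 0), (10, 0.2), (10.2, 0.2), (10.2, 4),
        (9.8, 4), (9.8, 4.1), (6, 4.1), (6, 4), (0, 4),
    ])

    # Simplification drops every vertex that deviates from the overall shape by
    # less than the tolerance (in map units): the detail is deliberately forgotten.
    generalized = detailed.simplify(tolerance=0.3, preserve_topology=True)

    print(len(detailed.exterior.coords) - 1, "vertices traced")
    print(len(generalized.exterior.coords) - 1, "vertices kept")

The point of the toy is only that what gets thrown away is a design decision, made on purpose, not an accident of the tracing.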
I grew up with the idea that it was good to remember as much as possible, that it was sad that we had forgotten so much, that so much history had been lost. This idea is deeply embedded in my values. And I'm coming to realize that it's not entirely true.
What is the value in forgetting? Well, for one thing, it means we have some real say in the stories we retell. If there had been strong documentary evidence (including video interviews and reams of contemporary commentary) about King Leir, would Shakespeare have been as free to construct his narrative around his own deep understanding of human relationships? And what about his "history plays," where there was more documentary evidence available to him... Shakespeare lived in a world where the fictionalization of even relatively contemporary stories was the norm, but he arrived on the cusp of an age—our age—where people paid more attention to the lines between fiction and non-fiction, where the rules around non-fiction became stricter, and where people took documentary non-fiction more seriously as a gauge of how to live in the "real world."
I've been intrigued, in my limited recent reading into North American Indian history (as it has been reconstructed), by how quickly the "historically accurate" truths of peoples' origins fade into legend: the Cahokia culture had collapsed less than two centuries before Columbus, but by the time of European contact with the Indians of the area in the 17th century, the mound-makers were people of myth. By contrast, we know the names of architects who built European monuments 2000 years old, let alone the kings who ordered them built.
Is there advantage in literate memory? Well, we certainly think so. For one thing, it's easier to prevent repeats of long-term disastrous behavior. As George Santayana famously wrote,
Progress, far from consisting in change, depends on retentiveness. When change is absolute there remains no being to improve and no direction is set for possible improvement: and when experience is not retained, as among savages, infancy is perpetual. Those who cannot remember the past are condemned to repeat it. (The Life of Reason, 1905)

And yet, there is such a thing as too much retention. Consider the Incas, who maintained their dead kings and all their descendants as living royal households. By the time of Pizarro's appearance on the scene, there had been at least 11 Inca kings, and the weight of maintaining all those monarchs' households was becoming a drag on the economy. That weight may have contributed to the Inca civil war, which Atahualpa had just won when Pizarro arrived and which had generally weakened the empire, helping make the conquest possible.
And what of our personal knowledge? I have every composition I ever wrote on a computer—every sketch, every half-baked idea, every email—somewhere on a disc or my hard-drive. Everything. I will never ever read all these things. In fact, I will probably read almost none of them again.
When I was in high school, I started a journal/notebook, which I kept up, off and on, for ten years. Will I ever profit from looking through all those pages of stuff? I mean, it might be fun, but perhaps a small selection of those 20-odd volumes might be of some interest. I think I was inspired by things like Coleridge's notebooks, which are treasures, scrupulously edited and annotated. But Coleridge didn't write them thinking they were going to be looked at and researched and (for God's sake) published in a scholarly edition 200 years on. They were a working tool, and if they had stopped being useful, if they had in fact become a drag upon his work, I suspect they would have been given the heave-ho.
We mourn the loss of the library of Alexandria. But have we gone too far in the other direction? Do we remember too much? Are we headed towards a future where, like the angels in John Crowley's Great Work of Time, we long to be able to forget, to not have been made to remember?
Where is the balance? Is there a principle that can be invoked? What should we remember, and what should we allow to be forgotten?
Saturday, April 14, 2012
Careful of that dead thing
I had a terrifying dream last week. I was driving my family in the car, on a nearby nondescript suburban road (County Rd C in Roseville, MN, if you care). It was late twilight and cloudy. Suddenly ahead of us, there was a burst of flame: the afterburner engaging as a jet fighter swooped up and to the right. It startled my wife, who yelled. There were flashes in the sky, like lightning behind a cloudbank.
Then off in the distance, way off in the distance, ahead of us and slightly to the left, was a blinding blue-white flash, with a shockwave visible pushing away from it. I knew right away it was a nuclear explosion. Someone had set off an atomic bomb. My immediate question was, what do we do, where do we go? Do I turn the car around and run like hell for home? Would I make it? Would the shockwave get us this far away? Would more bombs explode?
This was the shared cultural nightmare from my growing-up years: nuclear armageddon. I don't remember actually having nightmares about it then—I remember nightmares where I watched passenger jets crash nearby, coming in low and screaming and flying all wrong, and then a cloud of smoke from behind a line of trees. But not the Big One. Neither is it really a daylight nightmare for me, and it hasn't been since glasnost. Terrorist attacks and pandemics are what tend to set me off in the same way today.
What the heck?
---
I've been coming back over and over this spring to how we tend to avoid awareness of mortality—not just ours, but the mortality of those entities we are part of. In particular, when we found an institution, we seldom build into that institution's structure the assumption that it will one day be dissolved. Most legal entities have procedures built into their generic type: how to dissolve a foundation, corporation or church. But when we found most institutions, we expect them to go on "in perpetuity."
I think I had forgotten how viscerally overwhelming it is to actually face the end of our own bodily life. No philosophy, no rationality, just an overwhelming urge to figure out how to go on living; how to get out of this dangerous situation now.
As I keep moving forward in this exploration (can I really call it that? seems like pretty random wandering much of the time), I need to bear this in mind: the subject of endings can touch off a panicked response that seems to come out of left field. No-one who is not facing excruciating pain wants to die. And no-one who feels their very life depends on a larger organization will therefore respond well to suggestions that the organization ought to be left for dead.
---
I really enjoyed, earlier this week, listening to Kevin Kling talk about what to him was a new and revelatory way of thinking about storytelling, as part of an interview with Krista Tippett on On Being. He says:
Well... with this post-traumatic stress a few months ago, after years and years, it came back with a vengeance. And I went to a therapist and she said, "You got to understand... it's not time [that heals]— it... doesn't work, it sits in such a deep place that it's not triggered in ways you would think. It's not something that time heals. It will come back." And so what she had me do, which was so right fit just with my weird, Jungian sensibility, she had me tell the story of my motorcycle accident.
It was a bit more complicated than this. She told me the story, but instead of hitting the car, I missed the car, and I went to where I was going. And by retelling the story and having a different outcome, I started sleeping better. I started, all of a sudden the post-traumatic stress really dissipated in a significant way. And it was because I retold the story in another way that had me survive in another way.
Now the struggle with me is, I still wake up in the morning with my arm not working, with all these things. So there's a reality, and then there's another story I've created. And it really seems to fit with the way we work as, as humans, especially these days. We need to rewrite our stories sometimes just so we can sleep at night.
...but it's not the reality. But we can't live in the story that makes us sleep, but we need it to sleep. And so that's my struggle now, putting those two together, taking the myths we form to make ourselves feel better and fitting it with the reality that we live in.

And I think that about sums it up.
Tuesday, March 20, 2012
Zombies and Faeries
It's been rattling around in the back of my mind, to ask: Why zombies? Why now? I think the answer (or an answer) finally occurred to me tonight.
I'm not a watcher of horror, but at some point I found this trailer for the remake of Dawn of the Dead, and it's stuck with me. What disturbs me is the familiar (the neighbor kid) turning into a predatory monster. And unlike other movie monsters, zombies (and similar phenomena like the virus in 28 Days Later) are about us becoming monstrous.
To me, this seems entirely in keeping with a broader sense I was brought up with, that the most dangerous threats are not from foreigners or monsters from beyond, but from our own carelessness and rage. Zombieism, when it has a "scientific" explanation within a movie, usually has something to do with dangerous research... it's a dystopian pandemic story, like Contagion. When it simply is, without explanation, it's like a haunting.
"When there's no more room in Hell, the dead will walk the Earth" is the line from the promotional poster for the original Dawn of the Dead. This makes zombies something alien, but the story itself is about reanimated corpses: in this sense vampires, but stupid, shambling vampires, out for something much less subtle that a vampire's neckbite.
Anyway, as a broad explanation, this makes sense to me. I don't know how much power is really left in foreign monsters, in this era of instant communication and global social networking.
And it makes me wonder about the construct of Faerie, as intelligence-apart-from-humanity. In the Scottish traditions I know best, Faerie is situated where humans are not: under the hill, under the sea, in the deep forest... In a world where none of these things seems that exotic, where James Cameron is about to dive the Marianas Trench to make a movie, the idea seems kind of quaint.
Perhaps a parallel to the zombie phenomenon is in order: where the seduction of Faerie in the ballad tradition comes from its alienness, perhaps a modern version is fully homo sapiens, but has made itself "other" by rejecting... what? The Faerie rejection of religiously-based morality doesn't really work as an alienating force any more. But something like that.
This requires further thought.
Saturday, February 25, 2012
Nakedness
Over and over and over we get hung up on the questions of "who are we?" and "what kind of person am I?" When we name ourselves as a group, we seem to need to ask why some people are inside that group and others are outside. How do the outsiders get in? Will I ever be forced outside? Can I be a part of this group and of that other group over there?
Over and over and over, the cries of protest and rage by people who have felt the pain of exile—that is to say, all of us. We are afraid of being alone, angry that our people might decide that we are not their people.
Maggie Harrison's been creating the latest variation on this stir, telling blog readers "YOU ARE NOT A QUAKER (so please stop calling yourself one)." All the comments and responses are heartfelt, but I find myself sliding into a tiring deja vu-like state. Who is a Quaker blah blah blah, how can we all call ourselves Quakers blah blah blah, how dare you call me out as not a Quaker blah blah blah... on and on and on.
I do like the foundational idea of Maggie's work. It is to shed the clothing we have put over our spiritual nakedness: the pretense we know what we're doing, which covers the shame we feel for being imperfect.
How I long to take all the names I wear—Quaker, non-theist, Democrat, American, Cartographer, White, Male, Straight—and take them off one by one like pieces of clothing, to be able to stand there, shivering slightly because it is February in Minnesota. And be joined by others who have also taken off their name-clothes. Not so we can have some kind of nameless orgy here on the tundra, but so we can see each other a little more truly, even just for as long as it takes before our toes begin to freeze.
And then, eventually, I and my fellows will put on at least some of those name-clothes again, and go off into the world, because we homo sapiens need these labels. This is how we make ourselves into a people. This is how we say who we are.
So why do I find myself wearied? I feel like this is becoming old territory to me, and I want to stop going around in circles, coming back to the same arguments. I want to move on.
I want three simple, difficult things:
1. I want to be allowed to wear the labels that fit me. I want to be an American, a Minnesotan, a Quaker, a straight male, of European descent, a geek, a morris dancer, a cartographer, a father, a son, a husband, someone of middling economic means, a Case, a resident of Northeast Minneapolis, a beer-drinker, a lover of various musics, a song-leader, a person, a member of the species homo sapiens... I want to be able to wear any and all of these labels, and I want to be able to participate in the groups these labels imply that I belong to.
2. I want to not have outsiders to these groups assume they know what it means when I wear any label. I want people to approach unfamiliar identities with humility—either curiously, or seeking some other label they can use if they just don't have the energy to learn about the unfamiliar label.
3. I want my fellows in the groups I belong to, to recognize that identity groups are fluid. Organizations may not be: it may be necessary to keep the inside/outside relationship of the group clear. But no organization will ever be able to exactly line itself up with people who hold identities, not least because all those identities are themselves fluid, and depend on spending time and attention: I might move to St Paul, and while I would then hold Northeast Minneapolis as a dear place in my heart, I would have less say in what Northeast means, being absent.
I think this last is really important, and points to a shortcut in our name-labels we too easily take. We think that holding an identity ought to be like earning a medal, that it ought to secure a relationship to a group, permanently. If you've earned it, you can put it away in a drawer and pull it out when needed. And some relationships are like that: the conversation picks up where it left off years ago. But some don't. Most don't.
I have a recurring dream, where I go back to my old college, and I'm so out of place. I haven't checked my mailbox for months or years. I'm not sure where my stuff is: I still have a dorm room, but haven't used it for a long time, and I need to find my stuff to bring it to where I'm living now. Professors have changed, and I don't know what courses to take for that one last term I need to get my degree.
Identity labels are a shorthand for membership in a group, for belonging. If we do not act on those labels, if we do not live them out, we pull away from holding them. This is a terrifying prospect for most of us, because it means we are that much more alone. Even typing the second sentence in this paragraph, part of me was saying, "Nooooooo!"
Being a Quaker, to go back to Maggie Harrison, feels like it ought to belong to the territory of common practice, or creed (or belief anyway), or following a common teacher or guide. All these things we recognize as the hallmarks of "religious" commonality. But some of it, perversely, is just wearing the same "clothes," the very clothes Maggie urges us to cast off. We are Quakers because we choose to wear that label.
Can real nakedness be the basis of identity? Could it be that all the stuff we use to bind ourselves together is getting in the way? If we wish to utterly open ourselves to truth, to go out naked into February, do we need to shed the very name itself? Is the truest Quaker the one who accepts no common identity—no meetinghouse, no clerk, nothing? How could such a Quakerism survive? How would it avoid frostbite?
Thursday, February 16, 2012
Purity test
In college, a "purity test" made the rounds. It's probably still making the rounds, 25 years later. It consisted of hundreds of "Have you ever?" questions: what kinds of sex have you had, what kinds of resticted substances have you ingested, where and with how many people... It went on and on, a litany of sins major and minor. Getting a high score gained you collegiate street cred. Either that or a trip to the emergency room.
"Purity" is closely associated with sinlessness, not just in our society but pretty much species-wide. A virgin maiden wears white to show she is spotless. Ritual cleansing before worship in a temple appears almost everywhere. And taboo foods and substances are "unclean."
Today, you'll see similar associations between "pure" and "natural" in consumer products. This is odd, because until recently, purity was clearly an unnatural phenomenon, requiring human or superhuman intervention. Mostly. Pure clear streams ran out of the rocks, of course, and pure ore was sometimes found embedded in rocks. But most of our physical purity is manufactured (think of Ivory Soap's trademark "99 44/100% Pure").
Nature is not pure, or at any rate, organic nature is not pure. Our body depends on bacteria in our gut to digest our food, and on trace elements in water to fill out our nutritional needs. We live now—and always have—in a soup of organic and inorganic ingredients, a constantly shifting mixture of bits and pieces.
What we want to avoid instinctively is pollution. We want to keep most of the infectious germs out of our respiratory and digestive systems so our immune system does not become overwhelmed. We want to keep toxic chemicals from subverting and breaking down the processes our internal chemistry is constantly churning to keep us intact and functioning. Pollution is mostly a matter of degree, not of true purity.
We like purity because it fits how our brains work. We like discrete objects and clearly delineated ideas. We like rules and laws because when we lose our sense of structure, we literally feel lost. And so when we say what exactly something is, and when we can even say that is all that it is, we feel more secure in the universe.
It's a running theme in this blog, but the trouble seems to come when we then take that categorization and reimpose it on the universe: purifying populations; purifying ourselves of sinfulness; purifying toxins, creating lethal concentrations of them. Purity—real, created purity—belongs in the world of ideas, and in the dead world of inorganic chemistry, not in our living world.
"Purity" is closely associated with sinlessness, not just in our society but pretty much species-wide. A virgin maiden wears white to show she is spotless. Ritual cleansing before worship in a temple appears almost everywhere. And taboo foods and substances are "unclean."
Today, you'll see similar associations between "pure" and "natural" in consumer prodcuts. This is odd, because until recently, purity was clearly an unnatural phenomenon, requiring human or superhuman intervention. Mostly. Pure clear streams ran out of the rocks, of course, and pure ore was sometimes found embedded in rocks. But most of our physical purity is manufactured (think of Ivory Soap's trademark "99 44/100% Pure").
Nature is not pure, or at any rate, organic nature is not pure. Our body depends on bacteria in our gut to digest our food, and on trace elements in water to fill out our nutritional needs. We live now—and always have—in a soup of organic and inorganic ingredients, a constantly shifting mixture of bits and pieces.
What we want to avoid instinctively is pollution. We want to keep most of the infectious germs out of our respiratory and digestive systems so our immune system does not become overwhelmed. We want to keep toxic chemicals from subverting and breaking down the processes our internal chemistry is constantly churning to keep us intact and functioning. Pollution is mostly a matter of degree, not of true purity.
We like purity because it fits how our brains work. We like discrete objects and clearly delineated ideas. We like rules and laws because when we lose our sense of structure, we literally feel lost. And so when we say what exactly something is, and when we can even say that is all that it is, we feel more secure in the universe.
It's a running theme in this blog, but the trouble seems to come when we then take that categorization and reimpose it on the universe: purifying populations; purifying ourselves of sinfulness; purifying toxins, creating lethal concentrations of them. Purity—real, created purity—belongs in the world of ideas, and in the dead world of inorganic chemistry, not in our living world.
Labels: formal systems, power, purity
Location: NE Minneapolis, MN