Showing posts with label objective/subjective. Show all posts

Saturday, April 14, 2012

Careful of that dead thing

I had a terrifying dream last week. I was driving my family in the car, on a nearby nondescript suburban road (County Rd C in Roseville, MN, if you care). It was late twilight and cloudy. Suddenly ahead of us, there was a burst of flame: the afterburner engaging as a jet fighter swooped up and to the right. It startled my wife, who yelled. There were flashes in the sky, like lightning behind a cloudbank.

Then off in the distance, way off in the distance, ahead of us and slightly to the left, was a blinding blue-white flash, with a shockwave visible pushing away from it. I knew right away it was a nuclear explosion. Someone had set off an atomic bomb. My immediate question was, what do we do, where do we go? Do I turn the car around and run like hell for home? Would I make it? Would the shockwave get us this far away? Would more bombs explode?

This was the cultural shared nightmare from my growing-up years: nuclear armageddon. I don't remember actually having nightmares about it then—I remember nightmares where I watched passenger jets crash nearby, coming in low and screaming and flying all wrong, and then a cloud of smoke from behind a line of trees. But not the Big One. Neither is it really a daylight nightmare for me, and hasn't been since glasnost. Terrorist attacks and pandemics are what tend to set me off in the same way today.

What the heck?

---
I've been coming back over and over this spring to how we tend to avoid awareness of mortality—not just ours, but the mortality of those entities we are part of. In particular, when we found an institution, we seldom build into that institution's structure the assumption that it will one day be dissolved. Most legal entities have procedures built into their generic type: how to dissolve a foundation, corporation or church. But when we found most institutions, we expect them to go on "in perpetuity."

I think I had forgotten how viscerally overwhelming it is to actually face the end of our own bodily life. No philosophy, no rationality, just an overwhelming urge to figure out how to go on living; how to get out of this dangerous situation now.

As I keep moving forward in this exploration (can I really call it that? seems like pretty random wandering much of the time), I need to bear this in mind: the subject of endings can touch off a panicked response that seems to come out of left field. No-one who is not facing excruciating pain wants to die. And so no-one who feels their very life depends on a larger organization will respond well to suggestions that the organization ought to be left for dead.

----
I really enjoyed, earlier this week, listening to Kevin Kling talk about what to him was a new and revelatory way of thinking about storytelling, as part of an interview with Krista Tippett on On Being. He says:
Well... with this post-traumatic stress a few months ago, after years and years, it came back with a vengeance. And I went to a therapist and she said, "You got to understand... it's not time [that heals]— it... doesn't work, it sits in such a deep place that it's not triggered in ways you would think. It's not something that time heals. It will come back." And so what she had me do, which was so right fit just with my weird, Jungian sensibility, she had me tell the story of my motorcycle accident.
It was a bit more complicated than this. She told me the story, but instead of hitting the car, I missed the car, and I went to where I was going. And by retelling the story and having a different outcome, I started sleeping better. I started, all of a sudden the post-traumatic stress really dissipated in a significant way. And it was because I retold the story in another way that had me survive in another way.

Now the struggle with me is, I still wake up in the morning with my arm not working, with all these things. So there's a reality, and then there's another story I've created. And it really seems to fit with the way we work as, as humans, especially these days. We need to rewrite our stories sometimes just so we can sleep at night.
...but it's not the reality. But we can't live in the story that makes us sleep, but we need it to sleep. And so that's my struggle now, putting those two together, taking the myths we form to make ourselves feel better and fitting it with the reality that we live in.
And I think that about sums it up.

Tuesday, August 10, 2010

Lies, damned lies, and plagiarism

This paragraph stuck out at me in Stanley Fish's latest piece on plagiarism on the NY Times web site:

And if there should emerge a powerful philosophical argument saying there’s no such thing as originality, its emergence needn’t alter or even bother for a second a practice that can only get started if originality is assumed as a baseline. It may be (to offer another example), as I have argued elsewhere, that there’s no such thing as free speech, but if you want to have a free speech regime because you believe that it is essential to the maintenance of democracy, just forget what Stanley Fish said — after all it’s just a theoretical argument — and get down to it as lawyers and judges in fact do all the time without the benefit or hindrance of any metaphysical rap. Everyday disciplinary practices do not rest on a foundation of philosophy or theory; they rest on a foundation of themselves; no theory or philosophy can either prop them up or topple them. As long as the practice is ongoing and flourishing its conventions will command respect and allegiance and flouting them will have negative consequences.
Seems to me the same argument could be made about "objectivity" or "aesthetics" amongst other ideas discussed in this blog. The point, that a standard need not be somehow supported by the fundamental structure of the universe, but can be constructed largely for the needs and desires of a group of people, parallels the idea of maps as propositions or arguments rather than statements of fact.

The point I would make is that it is important to note that we are talking about the formal rules and criteria of judging communications about a subject, not about the subject itself. In the subject of the article, not attributing a quote (plagiarism) is not the same as faking lab results. In the same way, making a map with a bias is different from making a map with errors. One is untrue to the "objective" rules of map discourse, and may be disparaged within the map community for this. The other is untrue to the physical subject of the map, and is a lie.

Saturday, July 4, 2009

Harley's article on his "Favourite Map" online

I've referred to J.B. Harley's article "My Favourite Map. The Map as Biography: Thoughts on Ordnance Survey Map, Six-Inch Sheet Devonshire CIX, SE, Newton Abbot" a few times. It was published in The Map Collector in 1978, and Kunstpedia.com is putting articles from that magazine online. I requested they add Harley's article and poof! Boudewijn Meijer did. Here it is! Wow, that was fast. Enjoy!

Sunday, June 28, 2009

Un-personed

I had a good exchange with John Krygier recently—thought-provoking as usual. It got me thinking more seriously about the experience of maps as performance. I know very little about performance theory, and much that I have seen I find frankly impenetrable. But I know a little about performance itself from having performed. So what I'm going to outline here is a framework that may well overlap what more experienced theorists have outlined. In any case, it's getting my thoughts down in a more thought-out form. Any recommendations of relevant and not-too-thickly-jargony performance literature are welcome.

---

The aspect of performance I've been reflecting on is the centrality of the performer. Humans pay more attention to (and have more cognitive tools to explore) other humans than any other subject. So it makes sense that looking at another person is qualitatively different from looking at something that a person has made. An actor is different from a stage setting, no matter how elaborate that set.

I've made the analogy before of cartography being fundamentally about the "stage setting" for a performance about space, that performance not necessarily being performed within the map. Well, any serious performer will tell you setting is an integral part of performance (for that matter, so is the audience). The whole thing, the entire constructed experience, is the performance.

And yet, there is something different about the designated "performer." It's a person, and so we instinctively pay more attention to that person. I think it may be that simple.

To me, this puts a new spin on the whole idea of attempts at "objectivity," in which the biases and idiosyncrasies of individuals are intentionally de-emphasized. The idea is, while maintaining a clearly human-made voice, to partly "un-person" that voice. It's not exactly the same as what I'm describing, but it is a useful device in a number of ways.

First, it allows the user to put her or himself directly into the performer role. Thus a "base map" is like a karaoke track. It functions a lot like the "voice" of a recipe. I had an interesting discussion with my wife Ingrid about this the other night. She reads a lot of food writing, and she confirms that it is common practice, even when the prose style is very fluid and personal, to then drop out of that personal voice into the "recipe voice", in which instructions are neutral. The goal is to de-emphasize the personal viewpoint of the author and to put the reader directly into the driver's seat.

Second, it allows for the creation of the idea of a "common truth." This drives many contemporary carto-critics crazy, because they believe the common truths modern cartography has been emphasizing are fundamentally false, leading us straight to the destruction of our ecosystem and so ourselves. But on a smaller scale, it is often useful to have available a "referee voice." It's why we've always had a role in our societies for judges of one sort or another. And by putting off the personal voice and adopting an un-personed voice, we make that more possible.

I'll admit that second one is a loaded bomb. Before you all pile on, let me just ask you to consider, not whether it is right and good for us to do this, but whether it is a basic human reaction to seek someone speaking in a "neutral" voice.

I'm not sure exactly how the idea of anonymous monastic performances done for the glory of God (the Book of Kells, for example) fits into this, but I think it does.

---

The other thing that's been on my mind is the privileged place of performance. Larry Shiner (whom I've discussed earlier) talks about the creation of contemplative frames for the fine arts (the concert hall, the gallery wall, the silent library) as being a big part of those fine arts' distinction from "craft" or "artisan" work. Something analogous happens whenever we recognize a performance is taking place. It is different from ordinary social space: we do not expect performers to have the same relationship to those around them as they would when they are not performing. Some of it is a matter of allowing for concentration, but some of it is also that performances are specifically about "setting aside space" to allow for a different experience.

It feels very like the suspension of disbelief that is essential to fiction.

---

And that's all the ideas I have energy for tonight. I'm going to call it good.

Saturday, December 13, 2008

Pragmatism

Tobin Harshaw, in the NY Times, takes on the question of political pragmatism vs. ideology, surveying current blogosphere opinion on the subject in light of the coming Obama presidency. An interesting read, paralleling faintly some of my earlier thoughts on the nature of "usefulness" in the context of eugenics. Harshaw opens with a quote from Christopher Hayes in The Nation:

In the wake of the 9/11 attacks, “pragmatists” of all stripes–Alan Dershowitz, Richard Posner–lined up to offer tips and strategies on how best to implement a practical and effective torture regime; but ideologues said no torture, no exceptions. Same goes for the Iraq War, which many “pragmatic” lawmakers–Hillary Clinton, Arlen Specter–voted for and which ideologues across the political spectrum, from Ron Paul to Bernie Sanders, opposed. Of course, by any reckoning, the war didn’t work. That is, it failed to be a practical, nonideological improvement to the nation’s security. This, despite the fact that so many willed themselves to believe that the benefits would clearly outweigh the costs. Principle is often pragmatism’s guardian. Particularly at times of crisis, when a polity succumbs to collective madness or delusion, it is only the obstinate ideologues who refuse to go along. Expediency may be a virtue in virtuous times, but it’s a vice in vicious ones.

There’s another problem with the fetishization of the pragmatic, which is the brute fact that, at some level, ideology is inescapable. Obama may have told Steve Kroft that he’s solely interested in “what works,” but what constitutes “working” is not self-evident and, indeed, is impossible to detach from some worldview and set of principles. Alan Greenspan, of all people, made this point deftly while testifying before Henry Waxman’s House Oversight Committee. Waxman asked Greenspan, “Do you feel that your ideology pushed you to make decisions that you wish you had not made?” To which Greenspan responded, “Well, remember that what an ideology is, is a conceptual framework with the way people deal with reality. Everyone has one. You have to–to exist, you need an ideology. The question is whether it is accurate or not.”

Yup.

Guest Post: Keith Harrison

Keith Harrison is an emeritus professor at Carleton College. My wife (who was an English major) took a class or two from him, but my only connection with him was doing the poster announcing his convocation "How to Stop Your Papers from Killing You (and me)"... Which I of course missed. But a few weeks ago I was visiting our friends the Heimans in Northfield, and discovered they are publishing a book by Keith based on the concept he was then developing, which is essentially an attack on Everything You Ever Learned About Writing School Essays: the "hourglass" model, removing sense of personal voice, outlining first... provocative stuff. Mark Heiman was looking for notes, so I took it home and read the proof, and realized I had erred badly in missing the convo. I wrote to Keith and told him what I thought, and he responded with what amounts to a blog entry. So with his permission I'm posting it here:

The Subj/Obj opposition has puzzled me since the time I heard an English teacher call a poem by Shelley ‘a subjective lyric’. I couldn’t understand what he meant, and said so, and learned nothing from his reply. Much later, reading Bronowski and Polanyi, and a host of others, I got thinking about it again. I believe (especially since Heisenberg) it’s a pseudo-distinction, and certainly in the humanities a useless pis-aller. Whether in cartography or poetry I believe all we can do is to give versions of that part of the world which takes our attention. In spite of what many scientists actually assume in their practice, if not in their belief-system, there’s no god’s eye view of the world. We are not (at our best??) cameras, for reasons that should be transparent to anyone who thinks a little about it. Scientists hate that thought because it ushers in the dreaded C-Word, as Murray Gell-Mann puts it. What in the hell do we do with consciousness, which is after all the most fundamental fact of our being here? The answer that scientists often give is that you have to regard it, as Freud does the mind, as an epi-phenomenon of the body. Or, in the case of Crick, you dismiss the question as trivial. Generally speaking, you’re better off to forget about it and get on with the "real work". The trouble is that, as writers, we can’t do that because it doesn’t make sense. We are here and we have to tell stories - all kinds of stories - about what we experience. Part of my brief is that because we have been trained to think of ourselves as non-persons and because we have tried hard to do that, the result is the kind of prose that pours out of our colleges by the truck-load. In most student-essays there’s nobody home and when you ask the simple question—where did this dogma of ‘impersonality’ come from?—it’s not possible to find a satisfactory answer, except: we have always done it that way.
But if essays are really forms of narration (stories), questions of accuracy inevitably arise. Why is my version of the auto bail-out more accurate than another’s? Or less? Interesting questions. Not, I would maintain, because mine is more objective (whatever that means), but maybe because it has a wider explanatory range, because it is more consistent with many other ‘explanations’. Consistency does seem to be a key, but clearly not a self-sufficient one (people used to be consistent about phlogiston). I could go on but will stop (on this question) with this: there seems to me nothing wrong with either a scientist or any other person declaring him or herself to be a largely ignorant person trying to make a somewhat intelligible "version" of one part of the world we all live in. Yet our dominies, our Strunks and Whites, and the greater part of our professoriat, would argue very strenuously against that assumption. We must tell the truth, be objective etc. There’s always the ghost in the machine, even when we take God away. The belief is very powerful. Someone must know the truth. It’s got to be there. Doesn’t it? Even Dawkins falls for the delusion.

Now for something provocative. I’m more and more convinced that beneath all our professional ‘belief’ in objectivity, five-paras, the forbidden ‘I’, and on and on, is a deeply entrenched commitment to the status quo. In other words that commitment is based on a political belief which is almost invisible and, because of that, all the more powerful. This is the elephant in the room. We have taken our binary oppositions (heredity v. environment, nature v. culture) so much for granted that we’ve become stupefied and stunted in our thinking on very important matters. When one considers the brief given implicitly to most student writers, but NEVER examined, it goes something like this: You don’t know much about the recent history of Madagascar but your task is to write about it AS IF you do know something about it (you will get the vast bulk of your knowledge from sources, of course) and AT THE SAME TIME you should write as if you are not a person and must never use the first person. The brief is doubly incoherent at root. No wonder students hate writing essays but being, essentially, survivors they will find the best way to get under the wire. The most common practice is to string together a series of ‘quotes’ (properly acknowledged, of course) and to try to give the impression that the essay has an author, but not really, because the ‘author’ doesn’t really know anything. One can hardly imagine a more futile dry loop, a more complete waste of time. To ‘succeed’ in this exercise requires an imagination as dense as that of George Bush or Bill Kristol or Larry Summers. Its main driving power is an unflinching commitment NOT TO THINK.

Against this ‘method’ of writing a paper I would propose the following. Get interested, get very interested in a topic, put yourself on the line as you think about it. Work. (If you can’t find a topic please do something else. Anything. But DON’T start writing until you are really involved.) Stand firm in your own partial knowledge, ask real questions. Use your genuine ignorance as your strength. Explore. Use quotations to help shape your own ideas, questions, puzzles. This is your essay; it cannot be written by your sources. Use your essay as an authentic exploration of a question which matters to you. Remember that most teachers cannot write. They have been trained to think in very proscribed modes for reasons which become clear as you think about the whole purpose of education which, in the words of our sometime Governor, Arne Carlson, is to produce ‘successful units for deployment in the economic sphere.’

You were surprised by my ‘weird’ ideas on outlining. Another reader was delighted to find that it’s okay to use the first person in an essay, a third felt relieved that it’s alright to end her essay at the end and not at the beginning as she usually does. More questions: what do the words ‘alright’ and ‘okay’ mean in these sentences? More still: a university is a place where we should ask questions, sure. But not questions about sacred matters like this, or patriotism, and on and on.

In the teeth of all the conformism I have found in fifty years of teaching I want to join in the exciting task of helping students be authentic persons, in whatever they do. We (all students) have to give ourselves permission to be alive, questioning, foibled, ignorant, occasionally savvy, always fully ‘here’. Bloody difficult task. Our systems have made it an almost impossible one. Most schools have a corpse in the basement, and another one in the brain-pan. (Another full essay needed here). To cut to the essential thought: A revolution, what Blake called a Mental War, seems necessary.
Whew.

And there you have, in sum, his new book. My only comment (I viscerally agree with most of what Keith says) is to go back to objectivism (the cult of objectivity) as a way of creating common ground based in verifiable experience. Whatever the culture of science may have become (and I hope to have more to say on this soon), the fundamental core of science is the idea of the repeatable experiment. And the idea of objectivity comes out of this sense that if I drop two cannonballs from the Tower of Pisa, from the second of planet Foozbain, or the top of Mount Doom, they will land on the ground at the same time, regardless of their varying mass. This skeleton of "verifiable facts" seems to me to be the basis of the whole shooting match: the language of cartography, the voiceless essay, journalistic objectivity...

It's all pidgin, and placed against the previous context of a common language based on divine and miraculous explanations for things, it makes a lot of sense. It makes conversations about practical matters possible for a broader range of people. The trouble comes when we start wanting to insert lyrical, subjective content into this pidgin, because that content is adamantly non-repeatable. Conversely, we can get in trouble if we hide behind "objectivity" in order to get our selfish way (see Wood's critique of cartography).

And when we insist that all discourse be carried out under this rubric, even when what we are talking about doesn't need the pidgin to be able to cross a cultural divide, we (as Keith points out) stifle real creative work, which needs to be carried out by a whole person, not just the part that can be translated into pidgin.

Thursday, September 25, 2008

No Man's Land

We've tossed around words like "neutrality" and "arbitrariness" and "objectivity" on this blog a fair amount. I've been arguing that the value of "the Grid," the "neutral" framework on which we compile common knowledge of the world, is that it provides a pidgin, a non-native language of commerce. It provides common ground, so to speak.

The phrase "no man's land" popped into my head this morning, and I find it resonating.

What the phrase evokes most of all to me is the dreadful no-man's-lands of World War I, the muddy, bloody plains of death. No man's land is not a place to live in, it's a place to separate peoples who cannot live together. Instead of creating a shared space, it creates an unclaimed space. To be blunt, there is no love there.

I was trying to conjure up an alternative to the scientific, objectivist way of finding common ground, asking "what other common grounds are there?" The obvious one is personal contact. The way you make the stranger into a non-stranger is to spend time with him/her. Host and guest. Or neighbor and neighbor. Not that I really know my neighbors all that well, but communion can be achieved through common work, even among strangers.

The opposite of no man's land then is "the commons" where we all graze our livestock— "we" in this case meaning the shareholders of the commons. Not everyone everywhere, but everyone in the village, everyone working on the same project.

How does a no man's land become a commons? I think of this literally happening in the story of Christmas 1914 in the trenches of World War I, where the guns stopped and soldiers from opposite trenches met, traded songs and cigarettes and played soccer. The story brings tears to my eyes still, like the hopeful/exhausted refrain of the Decemberists' "Sons and Daughters": "here all the bombs fade away..." [actually the lyrics sheet says "Hear all the bombs, they fade away," but I hear otherwise] but that's another blog entry.

But to think of it, Christmas 1914 depended on the majority of soldiers sharing a common religion. They both celebrated Christmas, neither side wanted to be shooting when they would rather have been home with family. I'm guessing things would have been different if the Gallipoli campaign had happened in 1914. But no, a truce to allow clearing of bodies did happen. So sometimes you can appeal to commonality as a species.

No-man's-land implies the opposite of itself—territory. I need to do some studying about the evolution of modern ideas of property and territory, because they clearly aren't universal. Nomadic tribal societies, while they wanted to keep their own hunting grounds for themselves, did not allocate land to individual "owners." And many settled societies have had owners as equivalent to rulers (see lords and serfs). As small freeholdings became more common in Europe, how did the idea of territory change, and when did "commons" arise as an alternative to private property (or is that how it worked at all)? Like I said, I need to learn more. (I note with interest a reference in the wikipedia discussion page on the article Property to "Richard Schlatter's by now classic Private Property: The History of an Idea. London: George Allen & Unwin, 1951, or for a more current perspective Laura Brace's The Politics of Property. Edinburgh University Press, 2004")

So how do you get from no man's land to commons? And (to get back to the general theme of things here), where does the cool light of "objectivity" fit in? The question, I think, is to what end neutrality is invoked. By itself, "neutral territory" can mean the no-man's land of World War I; or its cold modern alternative, the DMZ; or Switzerland, or the town common. Is the difference a matter of scale and dispute, or is there something else going on in the range of possibilities?

Tuesday, September 23, 2008

Selective memory

I've been involved in a debate on wikipedia recently (Avoid Academic Boosterism), and it raised for me a longer-standing question about the nature of reference materials (including maps). As I said on the wiki,

I've been trying to figure out what's really going on here, and not just here but in general in the whole "selectivity" business in college guides and so on. Here's what I think: It is not seemly and polite to talk about a college's "mythos," but that's what's going on. One of the important things about Harvard is that "Harvard Aura" and the same is true of other "selective" institutions. You go there, you know you're hanging out with future Nobel Prize winners, or at least with people who can plausibly sound like future Nobel Prize winners. And this is known in the public at large.

So my question is, how can we talk about this in wikipedia? Some colleges have "Wobegon University in popular culture" sections, but these are mostly lists of mentions on TV. Seems to me this is the place to mention "aura", and in some cases there are specific examples to bring up: Robert Pirsig and the University of Chicago, Paper Chase and Yale. For smaller schools, not so much. I went to Carleton College, which has the reputation as the highest-caliber college in Minnesota. But there's no movie or popular book that backs this up, and no news outlet wants to tick off alumni of other places unnecessarily by saying things like "Minnesota's top college". Maybe reference here to less-rigorous college guides (like College Prowler or The Insider's Guide to the Colleges) is in order, under the rubric of "How Carleton College is talked about," separate from verifiable stats.

The point is, if we can find some way to talk about reputation that isn't the article defining that reputation, I think that will get at a lot of the underlying issues here.


Which gets us right back to "what kinds of things can you put in encyclopedias" or maps or other reference works? By classifying things and then only accepting those classes of information that can be part of a communication pidgin (verifiable, supportable, can-be-agreed upon), we leave out a whole lot. But by including those things, we lose the cross-community communication that reference material allows.

It's a dilemma. Well, it's the same damned dilemma I keep talking about here, from another angle.

Friday, August 8, 2008

Theological diversity

I started another blog a little while ago, to talk about Quaker issues, but I'm finding this discussion and that one are coming together and interweaving in my thinking and my life to a distracting degree. So I'm giving it up and just bringing that discussion over here. There's only a handful of posts over there if you're interested. And if this stuff is utterly uninteresting to you, well, sorry but there it is.

I was reading Joe's excellent comment tonight while gearing up for a session on "theological diversity" at Meeting tomorrow. In our meeting (and among FGC Friends in general) there's been a resurgence over the last decade or so in Jesus-centered worship and ministry. For those of us for whom Jesus is not the central exemplar and teacher, and who may have signed on with the Quakers to get away from dogmatic Christians, it's been a little weird. But that resurgence has in my experience been gentle, not proselytizing, not hegemonistic. It has all been about individuals being open about the center of their universe.

To me it feels like the shoe is now on our foot (those of us who have held a more universalist point of view), to come clean about our centers, instead of using old hegemonizing, power-grabby, churchy attitudes as straw-men. The Bible doesn't especially speak to you as scripture? OK, then what does speak to you as scripture? Outward religious ritual isn't your thing? What formal, regular recognition of the universe and where we fit in it does, then?

Where this fits into the discussion of maps and architecture is that I think it presents a model for how to carry that balancing act forward. The argument shouldn't be between objectivists and subjectivists. It shouldn't even really be an argument at all. The work, as I'm coming to see it, is to theoretically explore how those two ways of dealing with the universe interact, and to build practices that respect each. And that involves (as a cartographer) simultaneously respecting the traditions and knowledge we've built up over the centuries, and recognizing that cartography is (and when used properly can only be) a structure, upon which centers can be constructed. Instead of isolating ourselves from that center-building, we need to really look at how we can be part of the subjective, center-building, all-too-human process of Making the World.

Wednesday, August 6, 2008

Steig said it all

I was reflecting on the way home last night on the differences between a scientific and a subjective perspective, and how a picture of the world from the latter necessarily makes where we are (or where the author/artist/audience is) the center of the universe. And then this morning I was reading the Caldecott Award acceptance speech by William Steig, who was given the award in 1970 for Sylvester and the Magic Pebble. The book has been reissued with a fresh set of illustrations based on the original watercolors, and has a copy of the speech at the back.
Art, including juvenile literature, has the power to make any spot on earth the living center of the universe; and unlike science, which often gives us the illusion of understanding things we really do not understand, it helps us to know life in a way that still keeps before us the mystery of things. It enhances the sense of wonder. And wonder is respect for life. Art also stimulates the adventurousness and the playfulness that keep us moving in a lively way and that lead to useful discovery.
So there you are.

What I was reflecting on again last night is how cartographic maps and other scientific communication intentionally leave out any content that is personal, including point of view. As I've discussed earlier, the idea is to create a pidgin point of view that bridges individual subjective points of view and personal biases. But by leaving out the personal and subjective, we also leave the "mystery of things" out of the discussion. We depend on that discussion happening elsewhere.

It has been suggested that this lack of the subjective, of the "mystery of things," in cartography is a fault. But I want to suggest that the problem is the separating of cartography into its own little ontological niche. Cartography is part of a wider discussion, and it performs a valuable role. But it is a role, and not the entirety of the play.

Thursday, July 31, 2008

Sunday in the Park with Joe

We had a lovely visit in New Haven with Joe and family. Well, New Haven was pretty muggy, but their apartment was pleasantly air-conditioned and it was great to have time to catch up with everyone.

Joe and I like to talk when we visit, and we managed to sneak a few longer conversations into the visit. Basically rehashing previous discussions, and I think (Joe can argue with me) realizing we were saying the same thing more or less, but in a different dialect. I think. Maybe. Ten days later and my memory grows dimmer... And those little kids are distracting.

Our main sticking point had to do with the idea of neutrality/arbitrariness. I like the former because to me it evokes Switzerland: there's nothing inherent in that patch of land that makes it "neutral," it's just agreed that it is, and so it functions usefully as a place intentionally outside of international alliances and conflicts. Joe likes arbitrary in part because of its root in "arbitration": an arbitrary decision is originally one reached through arbitration. But I think we realized we basically agree, that utter objectivity/neutrality is impossible, but that finding pidgins and setting arbitrary benchmarks allows people to work with one another.

Where the conversation really got interesting, I think, is where it veered into religion. It feels to me like a lot of the background to this blog is at root religious: the conflict between objectivists and subjectivists looks a lot like the conflict between universalists and "specifists" in my Quaker meeting and in the world as a whole. To me, the point is not which one is right; the point is that both are necessary, and that the goal should be a Grand Theory of Everything that holds them together. That might look like nestling one inside the other, or explaining one as a social function and the other as a personal function, or one as temporally long-term and the other as momentary.

Who can say...

****

For jollies, I also attach a link Joe sent me from Archinect, rethinking the architectural plan, another orthographic representational school. A map, really, of planned space. Anyway, some intriguing suggestions for how to turn the Plan on its head and bring it out of the camphor-filled cubbyhole it's been relegated to.

My only question as a cartographer is, where's the ground?

Saturday, March 29, 2008

360°

Been thinking some about arbitrariness, re Joe's recent comments:
I used the arbitrary exactly because it implies a judgement, on somebody's part, as to how one will divide one's representation of reality by marking it. That's really important!
To me "arbitrary" also implies decisions not really based in the subject at hand. The decision to use the equator and the poles as the basis for a global grid has a different level of arbitrariness than the decision to use the pole-to-pole meridian that goes through Greenwich, England. The four cardinal directions aren't arbitrary; they're based on the direction of the earth's rotation, and they appear to be nearly universal. On the other hand, north as up is arbitrary.

I argued in one of my last posts that the decision to use 360° of longitude was somewhere in between. It follows, of course, from the common convention of 360° in a circle, but my assumption was that 360 was the number of choice because it divides so evenly (360 = 2 × 2 × 2 × 3 × 3 × 5; its divisors include 2, 3, 4, 5, 6, 8, 9, 10, 12, 15, 18, 20, 24, 30, 36, 40, 45, 60, 72, 90, 120, and 180). Well, the consensus seems to be that the circle has 360° because the Babylonians had a base-60 number system, and that this in turn comes from the proportions of a hexagon inscribed in a circle. Which seems a little more arbitrary than I had thought.
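Just to check that arithmetic, here's a quick sketch in Python (the `divisors` function is my own throwaway, nothing standard):

```python
# List the divisors of n by trial division, to see why 360 divides so evenly.
def divisors(n):
    return [d for d in range(1, n + 1) if n % d == 0]

print(divisors(360))       # 1, 2, 3, 4, 5, 6, 8, 9, 10, 12, ... 120, 180, 360
print(len(divisors(360)))  # 24 divisors; no smaller number has as many
```

Number theorists call numbers like this "highly composite": 360 has more divisors than any positive integer below it, which is exactly what makes it handy for carving a circle into halves, thirds, quarters, fifths, sixths and so on without fractions.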

Which brings up questions of the arbitrariness of any given counting system: we use decimal (base-10) for most of our everyday activities, but binary (base-2) and hexadecimal (base-16) are pretty universal in the world of computers. And while base 10 is arbitrary in the sense that it "just happens" that we generally have ten fingers and ten toes, the choice wasn't totally arbitrary to those who began counting on those fingers and toes.
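For the record, here is one and the same quantity written out in those three bases (another quick Python sketch, since that's the language I have at hand):

```python
# One quantity, three numerals: the value stays the same,
# only the (arbitrary?) base of the notation changes.
n = 360
print(format(n, 'b'))       # binary:      101101000
print(format(n, 'x'))       # hexadecimal: 168
print(int('101101000', 2))  # and back to decimal: 360
```

The number doesn't care how we write it, which is rather the point: the choice of base is a convention agreed upon so we can trade figures with one another.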

So again, it's a question of perspective. One person's arbitrary is another's fundamental. From our modern, detached viewpoint, the Greenwich meridian is truly arbitrary; all longitudes are equal. The meridian was established at the Royal Observatory in 1851, and because Britain's empire was approaching its peak, it quickly became a global commonplace. It was made the international standard at an international convention in 1884. It surprised me how late that standard was set. But in the sense that the Royal Observatory really was the center of world standards at the time, it certainly isn't totally arbitrary.

There's an interesting history and a list of other meridians on Wikipedia.

This in turn leads to an interesting article on the Washington meridian(s), and from there to a discussion of where those straight lines that form so much of the American West actually come from. Most, interestingly, are based not on modern longitude (being older than the Greenwich standard) but on degrees west of Washington. But which meridian in Washington? The Capitol or the Naval Observatory?

Arbitrary? Well, in a universal sense, yes, but in the sense that American national identity is centered on that most symmetric of capital cities, it's not arbitrary at all, any more than the circular arc boundary of Delaware and Pennsylvania, nominally centered on the steeple of the old state house at New Castle.

But then there are those boundaries based on rivers or mountain crests or other actual markings on the land. These seem even less "arbitrary."

And then there's the whole idea of boundaries. I still often bear in mind Matthew Edney's description of the early 17th-century boundary between France and the Holy Roman Empire: you would go riding east from Paris, and for a while every estate owed allegiance to the King of France. Then after a while, every now and then you'd come across one loyal to the Holy Roman Emperor. As you approached the Rhine, the mix would become pretty even, and eventually, as you crossed into what is now Germany, pretty much everyone was a Holy Roman by affiliation. So when you see a historic map of that era that shows a boundary line, the line is a modern artifact (AND arbitrary!); national boundaries then were often soft.

I think it would be really cool to do a world political map that also reflected relative loyalty to the central government: Somalia as a very light color, Japan as quite vivid, and the regional variation: Tibet lighter than Shanghai. Maybe just eliminate national boundaries altogether, and do a dot-scatter map of populations and loyalties.

Then again, it'd be hard to keep up to date.

Friday, February 29, 2008

Maps and Violence

I'm still working on Steven's comment from a few days ago. The thing I found so hard to digest was the violence imputed directly to "the grid." I respect Steven's work a lot (sorry, part of my initial confusion was also not knowing who you were, Steven), and knowing he was the one making the response makes it clearer where he is coming from.

I had a similar reaction to elin o'Hara slavick's Bomb after Bomb: A Violent Cartography, a selection of which she presented at NACIS last year. Her art is a deeply felt indictment of bombing as an anonymous evil: bombs kill without the killer having to face the consequences directly. Her maps with stains and wounds painted over them were really powerful stuff. The problem I had was her equation of cartography with the violence it enabled: she spoke of hegemonic mapping, of mapping as the anonymous-making of space. Maps as complicit in murders and bombings.

So here's my question: when, in general, are whole abstract systems responsible for the evil that people do while using them? There are a lot of powerful arguments for language (for example) being responsible for violence and for other perversions of humanity (think of Newspeak in Orwell's 1984). But Orwell himself was writing in language to make that point; he was not indicting language per se, but the control and manipulation of language from above.

Much cartocriticism works from the vantage point of cartography as the exercise of such power: modern cartography arose out of military and political power struggles, out of a desire to control. But one of the peculiar things about it is that while power has been cartography's sponsor, the resulting maps themselves were in a sense a democratic, decentralizing visual expression.

One of the huge cultural shifts over the last umpteen hundred years, but especially over the last 500, has been the shift from lord-and-vassal relationships to citizen-and-citizen relationships. Of course power still exists and is exercised, but in the West only the mad kings and their followers these days seriously believe that God gives the right of power to kings, and that it flows down from them like mana. I find it hard to imagine a world in which my basic legitimacy as a person was based in my relationship to my lord and master rather than in the assumption that "I am a person and so I count."

Modern cartography, including the grid, reflects this humanist point of view in that space is not privileged. We don't make New York City bigger just because it's more important; a mile is a mile is a mile. A kind of spatial one-citizen-one-vote. Classed symbols are made larger and smaller not by ordainment but by quantitative measurement.

So. Maps and violence. Maps, the grid, and violence, I should say.

I feel the grid. I am frustrated by the inability of my pidgin graphic tongue to speak poetry. But that isn't what cartography was built for, and almost no one speaks pidgin as a first language. Pidgins evolved to deal with places like New Guinea, with its hundreds and hundreds of languages: sure, it would be great to sit down with everyone we meet and take the time to learn the nuances of their mother tongue, but we are here to trade our goods for a goat.

It is easy to work backwards from the horrors that have resulted from some uses of cartography (and yes, it is true: bombing in the modern sense would not be possible without cartography). But I would suggest the opposite is also true. In my better moments making maps, I feel like a native guide to a new place. I don't speak my charge's native graphic tongue, but in my pidgin I can get him or her to a warm place to eat and rest. At its best, cartography is like this in general: a plain language, reduced to the smallest vocabulary you can get away with, which lets strangers meet and be cordial and hospitable, and which smooths whatever business they need to do.