I ran across a new article in Wired about Ray Kurzweil’s ideas about immortality — using technology to transform ourselves into everlasting containers of our essential being — and it occurred to me that the concept behaves much like the horcrux from Harry Potter and the Half-Blood Prince.
Maybe Arthur C. Clarke was right about technology and magic?
I had no idea Google had a repository of all its special-occasion logos.
Adam Gopnik has an excellent piece on C.S. Lewis in this week’s New Yorker: Prisoner of Narnia.
He reminds us of a few important things to keep in mind about Lewis (he’s viewed differently in Britain, for instance), and discusses his brand of religious belief, and how it kept him in a sort of internal tension between belief and myth.
Gopnik manages to articulate something that’s always bugged me about the Narnia stories as “Christian” allegory:
Yet a central point of the Gospel story is that Jesus is not the lion of the faith but the lamb of God, while his other symbolic animal is, specifically, the lowly and bedraggled donkey. The moral force of the Christian story is that the lions are all on the other side. If we had, say, a donkey, a seemingly uninspiring animal from an obscure corner of Narnia, raised as an uncouth and low-caste beast of burden, rallying the mice and rats and weasels and vultures and all the other unclean animals, and then being killed by the lions in as humiliating a manner as possible—a donkey who reëmerges, to the shock even of his disciples and devotees, as the king of all creation—now, that would be a Christian allegory. A powerful lion, starting life at the top of the food chain, adored by all his subjects and filled with temporal power, killed by a despised evil witch for his power and then reborn to rule, is a Mithraic, not a Christian, myth.
Now who’s going to write *that* story?? I’d like to see it. But, alas, I probably won’t. Instead I’ll see Lewis’s stories further glorified in film.
It’s not that I don’t like his stories. They’re fine, really. Old-fashioned, but fine, and quite inspired and beautiful in places. But I don’t think they’re very accurate or helpful as Christian allegory.
Philip Pullman, the author of the “His Dark Materials” books, has made clear his own feelings on the Narnia books. In the wake of Disney’s working so hard to publicize the new Narnia films, and evidently to capitalize on the huge evangelical Christian market for the stories, Pullman has been pretty strident. In the Guardian:
‘If the Disney Corporation wants to market this film as a great Christian story, they’ll just have to tell lies about it,’ Pullman told The Observer.
Pullman believes that Lewis’s books portray a version of Christianity that relies on martial combat and outdated fears of sexuality and women, and that they portray a religion that looks a lot like Islam in unashamedly racist terms.
‘It’s not the presence of Christian doctrine I object to so much as the absence of Christian virtue. The highest virtue, we have on the authority of the New Testament itself, is love, and yet you find not a trace of that in the books,’ he said.
Well, I think that may be a bit harsh. You do find certain kinds of love, but not precisely the mix I happen to find in the Gospels. In fact, great swaths seem to be missing.
At any rate, I think as fantasy the stories are pretty successful. I don’t hold them in holy reverence like so many do, though. But I think that until I read these articles, I was sort of afraid to admit that out loud, for some reason.
On their book-brand site, John Seely Brown and John Hagel have a nicely articulate PDF up for grabs for anyone who wants to register. The article is called “Interest Rates vs. Innovation Rates.”
Here’s a nice bit:
In their relentless quest for efficiency, companies have tended to shy away from the edge. Edges represent uncertainty, while executives crave predictability. Edges generate friction as employees explore, experiment and tinker with unfamiliar needs and opportunities, while management roots out friction wherever it can. With appropriate management techniques, friction can become highly productive, generating valuable innovation and learning.
I’ve been hearing lots of level-headed, wise CEOs lately say that they’re not interested in the “cutting edge” or that super-radical “bleeding edge”: that it’s not always prudent to be first or do something for the sake of novelty or hype.
But that depends on how you define the edges. The way they talk about edges, typically, is as a straw-man concept. (Who the heck *does* want to do something new for its own sake??)
The problem is when that kind of thinking makes us comfortable with the status quo and slow, tunnel-visioned, incremental improvement. What I like about this article is that Hagel & Brown are redefining what “edge” means.
The point is that edges represent the intersection of established ways of doing things with new needs and new possibilities. It is this intersection that creates a fertile ground for innovation and capability building. Employees are forced out of their comfort zones and pushed to question and refine traditional practices.
Asking hard questions about the edges is a great way to start.
There’s a glowing paean over on AlterNet (via Common Ground) on ‘Howl’ at Fifty, about Allen Ginsberg’s first public reading of his seminal poem.
The article says he “brought American poetry back to life,” but I’d have to disagree. The Beats certainly helped things along, but American poetry was doing quite well already, thank you. The Modernists had kicked plenty of ass before WWII, as had the Fugitives (who, if you count the critics who were born of their numbers, were unbelievably influential) and then after WWII (and somewhat before) the Black Mountain School was powerfully influential, setting the stage for a lot of what the Beats celebrated.
Evidently what the article’s author means is that the Beats “… brought poetry down from the sacrosanct halls of the academy. It took poetry off the musty printed page into the lives of listeners.”
It engaged more regular folks on the street? That would seem like an odd way to define “back to life” — we don’t hold other art forms to that standard. Charlie Parker injected new life into American music, but only people who loved Jazz “got it” at the time.
This is terribly ignorant … poetry was far from sacrosanct and academic. It was thriving in America, in journals and small magazines and readings, correspondence and publishing. There’s something about the Beat-to-Hippie cultural event that allows people to glom onto it and not think any more about anything else. Because everything else is “musty” or “academic.” Hey, maybe I sound like a curmudgeon here, but that’s just lazy thinking. And it’s the sort of thing for which Ginsberg would’ve cuffed you about the ears.
I met Allen Ginsberg, and had the blessing of spending a couple of days in his company, about 12 years ago. He was downright spry, and in constant search of macrobiotic food (“for the diabeetus”), and furtively snapping pictures with his Leica. He wore a suit the whole time.
He ran a poetry workshop on campus one day (yeah, a campus, one of those musty academic ones) in which he drilled everyone on basic poetics and referred to some very old examples of poetry as models. In fact, he mentioned none of his contemporaries when discussing the best influences for poetry, that I remember. And when he performed his poems, he took about 15 minutes of the reading to sing some of William Blake’s “Songs of Innocence and Experience” with his little squeezebox for accompaniment.
I love the Beats. I just don’t worship them as the one great thing to happen in literature in the last 60 years. And I don’t think any of them would welcome the sort of simplistic sanctimony in which so many little beatlings hold them.
So, when I find myself hearing a lot of “populist” poetry — say, for example, the stuff on HBO’s “Def Poetry” — I realize that the most successful poems, the ones people respond to the best, still hold to the same practices that all those musty poets used in their best work: specific, arresting imagery; effective rhythm; original and provocative insight. Yeah, this happens in many forms and styles, but it doesn’t happen in a vacuum, whether the writers/performers and audience know it or not.
All that said … “angelheaded hipsters burning for the ancient heavenly connection to the starry dynamo in the machinery of night” is still pretty dang cool.
South Street in Philly had a Day of the Dead parade on Sunday. It was small but charming. This picture was actually in a shop window, but there are a couple of shots of the skeleton puppets in the parade on my Flickr stream.
Unfortunately my real camera broke, so all I could get were these fuzzy phone-cam shots.
This is just grand: The Haunted Mansion – Secrets
An in-depth history and explanation of the Disney Haunted Mansion.
As a kid, I used to have some of the most delightful nightmares about this attraction. And while in my childhood dreamscape I’d sometimes just have happy dreams of Disney in general, the Haunted Mansion was second only to the (in my dreams, hyperbolically fantastic) Magic Shop for causing fireworks in my brain.
Happy Halloween!!!
I’m late in acknowledging this, but October is “Lupus Awareness Month.”
People I care very much about struggle with this disease, an auto-immune disorder that is very complex and often debilitating.
There are so many causes out there, and so many needs. But chances are, someone you know is challenged with this disease. I encourage you to click on the link and read about it. The more people are aware, the better.
One problem with the participation model is that so much of it is fueled by idealists. Well, it’s not totally a problem, because we need idealists. But it makes the “movement” behind the model seem naive to the more realistic and/or cynical.
I like to think I’m more cynical than not, though I often surprise myself with an occasional rash of gullibility. (I’ve posted stuff here that, looking back, I have to roll my eyes at.)
Wikis (just google it if you need context) have their orthodox fanatics, for sure. There are tons of people who are appalled that anyone would sully a simple, elegant wiki with login authorization or levels of access, for example. It’s not “the Wiki Way!”
But that’s like the first person to invent the wheel being angry when someone adds an axle.
Reflecting its somewhat idealistic origins, the wiki concept started with complete openness, and is now having to mature into other less specialized uses with more general audiences (or, for that matter, more specialized audiences), and it’s having to adapt some more “closed” genetic code for those environments. (Kind of like wheels eventually need tires, brakes and differentials….)
Over at The Register, Andrew Orlowski sneeringly opines about Wikipedia, saying that the “inner sanctum” is finally admitting the system isn’t flawless and that all things Wiki aren’t necessarily holy and beyond criticism. Well, yeah, if that’s true, that’s a good thing. He also points out that Wikipedia is now getting replicated all over the web in spots like Answers.com. I would agree that this is probably not a good idea — taking the wiki articles out of the context of Wikipedia leads people at other sites to think that this is “published” official information, whereas on Wikipedia itself they would at least understand that it’s editable by anybody, and therefore somewhat less official.
What I don’t especially appreciate about the article is the patronizing stance. It’s dismissive of the participation model, treating it as so much pot-induced hippie talk, when in fact many knowledgeable and experienced people have endorsed this concept and used it in the real world. (Two examples: John Seely Brown’s “Eureka” knowledge network at Xerox, and CAVNET in the US military.) Are these the idealistic Wiki concept, with no vetting or hierarchy? No. But the “ideal” wiki in any serious real-world endeavor is a straw-man example. Peter Morville recently wrote a more reasoned and informed take on Wikipedia in his column on Authority. See? There are mature perspectives on the usefulness of the model that don’t buy into the hype.
Nick Carr (whose blog I hadn’t even seen until yesterday, but which has suddenly gotten me sucked into its comments here and there) had this to say in
The law of the wiki:
It reveals that collectivism and intelligence are actually inversely correlated. Here, then, is what I’ll propose as the Law of the Wiki: Output quality declines as the number of contributors increases. Making matters worse, the best contributors will tend to become more and more alienated as they watch their work get mucked up by the knuckleheads, and they’ll eventually stop contributing altogether…
I commented the following: I don’t think it’s necessarily true that the number of participants decreases the quality of the output. It depends on the subject and the people involved. Two physicists can write an entry on Quantum Mechanics, but twenty of them can fill it out with all kinds of specialized information that just two wouldn’t have to offer. But that’s because it’s a circumscribed topic relevant to a somewhat narrow community of practice (which is actually what wikis were sort of created to support to begin with).
Once you start opening things up to *anybody* about *anything* — i.e. “vulgar” interests — you will end up with mediocre writing and information about topics that appeal to the lowest common denominator. That’s just how human systems structure themselves.
So yes, once you take the “pure wiki” out of the rarified environment of specialists or small groups, you definitely have to impose some top-down structure and peer review. I don’t think anyone but the most absurdly wide-eyed idealists would say differently.
He answers that he was indeed talking about the purely democratic wiki, and also asks whether, even in the specialized info areas, there could be such a thing as “too many” contributors, enough to erode quality.
I think that’s entirely possible, sure. But to some degree I wonder if it misses the point.
In fact, naming Wikipedia after an encyclopedia kind of misses the point too. I think it’s part of what’s throwing so many people off. But I’m not sure what else to call it, really. Other than “The Big General Knowledge Wiki.”
To my knowledge, even the first simple “idealistic” wikis were never meant as “authoritative” sources of knowledge (and by authority here, I mean the conventionally agreed upon and empirically verified facts on a topic). It was a place for quick gatherings over certain topics, simple entry and editing, for groups of people who wanted a place where all their thoughts could stick for later viewing and shaping. That’s great … and that’s what Wikipedia really is, just very very large. It does its job very well: it acts as a quick, simple repository of what people are thinking or communicating about particular topics. It’s not made for eloquence or even necessarily permanence.
Wikipedia is an experiment in taking that totally open environment (the wiki) and seeing what happens if we layer into it elements of traditional knowledge-authority-making. And (I think wisely) the organizers started with as little structure as possible rather than assuming it needed too much. Because it’s a living, breathing entity that can change over time through the actions of its community, it didn’t have to be a Parthenon. It just had to be a decent Quonset hut. The other features can be added and tweaked as needed.
As for the messiness and unevenness, I think people need to get over it. If you prefer your information pre-packaged and pre-authorized, go to the traditional sources (not that they’ll all be correct, but at least you probably won’t get in *trouble* for it — you can always say “I checked in a real book!” Kind of like all those companies that will use Linux, but only if they’ve bought it from IBM — because that feels more official and less risky and messy.)
All Wikipedia does is take the relatively invisible market of ideas, the activity of the knowledge hive, and make it visible, in all its messy glory. “Official” knowledge came from the same hive — but the activity wasn’t out there for everyone to watch. It was in academic conferences or the editorial meetings at People magazine.
But we still need “peer reviewed” authoritative decision-making around what information should be referred to when somebody wants “the right answer.”
So, Wikipedia, I think, ought to explain this somehow to its users. Now that the community using it has gone way beyond the early-adopter crowd, and Google searches on all kinds of things are pointing to Wikipedia in the first ten results, they probably should let people know: what you read here was put here mostly by people like you. Always check primary sources, etc., etc.
At some point, Wikipedia needs to have ways of denoting how “official” something is — maybe a coding system, where the Quantum Physics page has several physicists looking after it, but the J-Lo page is basically a free-for-all?
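To make that concrete, here is a minimal sketch of the kind of coding system I have in mind. Everything in it is hypothetical: the “steward” idea, the thresholds, the labels. It’s just one way such a scheme could be expressed, not anything Wikipedia actually does.

    # A hypothetical "officialness" scheme: the more vetted stewards
    # watching a page, the higher its vetting label. All names and
    # thresholds here are made up for illustration.
    from dataclasses import dataclass, field

    @dataclass
    class Article:
        title: str
        stewards: set = field(default_factory=set)  # vetted experts watching the page

        def vetting_label(self) -> str:
            # Arbitrary thresholds, purely illustrative.
            if len(self.stewards) >= 3:
                return "expert-reviewed"
            if len(self.stewards) >= 1:
                return "lightly reviewed"
            return "open free-for-all"

    quantum = Article("Quantum mechanics", stewards={"physicist_a", "physicist_b", "physicist_c"})
    jlo = Article("Jennifer Lopez")

    print(quantum.title, "->", quantum.vetting_label())  # expert-reviewed
    print(jlo.title, "->", jlo.vetting_label())          # open free-for-all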
I believe in the Wiki Way. I do. I just think it’s only one virtue among many, and that it has to be shaped to meet the demands of different contexts.