We Live Here

The article I wrote for the August/September 2006 ASIS&T Bulletin is up. Thanks to Stacy Surla and the gang at the Bulletin for helping me get it into shape. I’m pleased to say it’s sharing space with a lot of really excellent writing.

It’s weird to read it now, in a way. It’s a snapshot of where my head was 2-3 months ago, and my thoughts about the topic have since changed somewhat. Not drastically, just natural drift (hopefully some evolution?). If I can get my wits about me I’ll write about it here.

In my last post, I opined at excruciating length about how so much of what makes one’s message in corporate life effective is the context and how one plays that context. It has to do with much more than appearance, which is just one factor; it’s about presence. That self-assurance that in some people seems arrogant or cocky but in others makes you want to defer to their judgment automatically.

Con artists use this very well. It’s a ‘confidence’ game, after all, and the con artist understands intuitively that confidence in oneself is necessary in order for others to have confidence in you.

Ann Coulter is one such con artist. She’s peddled her (relatively speaking, when compared to other political pundits) photogenic looks, rapier tongue and unapologetic attitude into a lucrative, powerful career as one of the most televised dilettantes alive. Oh, and she writes books too.

I have a hard time imagining Coulter sitting at a laptop surrounded by piles of meticulously perused research. I have an easier time imagining her spewing vitriol into a tape recorder and paying some hack(s) to edit it into something coherent, then running out to find anything in print that might be used as evidence. At least, that’s how her prose reads to me.

There’s a big difference between thoughtful, reasoned prose based on thorough research and crude polemic dressed up as respectable political opinion. That’s why I doubt Coulter would’ve gotten far in her career if she’d just written books. Like a trashy pop singer, she owes her career to her TV appearances.

And it’s in those appearances where she performs brilliantly. Not that I think she’s brilliant. She’s a brilliant performer. She’s smart, certainly, but I think she actually believes she’s making intelligent, logical arguments, which signals to me that she’s not really as smart as she thinks.

That said, the lack of logical argumentation in her rantings doesn’t seem to be a problem. She knows she can get away with so much because she’s amazing at manipulating conversations. For example, on the rare occasion that someone argues with her or contradicts something she’s said, she weasels out of it by one of several stratagems: 1. impugn the honor of the other person by making some outrageous, straw-man assertion about them for daring to contradict her; 2. impugn the intelligence of the other person by quoting “facts” the other person doesn’t know, pointing to her book and squawking “I have XX pages of footnotes on this,” leaving the other person stammering and wondering if maybe they haven’t really done their homework; 3. make some other outrageous claim about someone not even in the room in order to derail the conversation. (This last one was evidenced most recently when she asserted that Clinton was a latent homosexual.)

Why does she get away with it? I think it’s in her delivery. In her utter and complete confidence. Couple that with a very quick mind (again, brilliant and quick are two different things) that can pop a comeback at an interviewer faster than the Williams sisters can nail a poorly lobbed serve, and time and time again you see people stumble over themselves trying to get around her. And she thrives on it; you can see it in her face. The television interview is her favorite element, and she plays it like a virtuoso.

In an honest person, this combination would be admirable. But in someone who twists others’ words in order to fuel her unfounded pronouncements and allegations, it’s insidious.

The second trick works especially well. Claiming that you have numbers and research to back up your claims is a great way to shut other people up, especially if it’s during a TV taping or live interview when they can’t go and check your facts. She does it a lot, according to Media Matters.

This was hardly the first time Coulter and her defenders have offered the large number of footnotes contained in her book as “evidence” of the quality of her scholarship. Also on July 7, Terence Jeffrey, editor of conservative weekly Human Events, defended Coulter’s book on CNN’s The Situation Room by citing her “19 pages of footnotes.” And when similar questions were raised about her 2002 book, Slander: Liberal Lies About the American Right (Crown, June 2002), Coulter repeatedly cited her “35 pages of footnotes” as evidence that her claims were accurate.

The same Media Matters article goes on to check these oft-cited footnotes, and finds them lacking.

Media Matters’ analysis of the endnotes in Godless revealed that Coulter routinely misrepresented the information of her sources, as well as omitted inconvenient information within those same sources that refuted her claims. Coulter relied upon secondary sources to support many of her claims, as well as unreliable or outdated information.

In addition to demonstrating her poor scholarship, this analysis also made clear Coulter’s lack of respect for her readers, who she clearly assumed would believe anything she wrote, as long as there was a citation attached to it.

That last bit is awfully accurate. People really do swallow a lot if it has the appearance of authority, and they rarely bother to look beneath the veneer. I’m guilty of it frequently. Who has the time? And if you’re predisposed to believe the points they’re making anyway, why not roll with it? We all walk around assuming that big publishing houses would never publish something that wasn’t well-documented.

I’m sure there’s plenty of left-leaning stuff published with similar weaknesses. What steams me about Coulter, though, is that she’s so hateful. She delights in polarizing people, and in objectifying and criminalizing her opposition. What she does is only a couple of tiny steps away from the sort of hate speech people use for minorities when they call them “vermin.” Calling ‘liberals’ things like “Godless” and “Traitors” is the sort of talk that one uses to start wars or pogroms.

What really disturbs me is that this woman is paraded as a real expert, as someone we should listen to, along with the other professional windblowers from both sides of the political spectrum, just because her antics grab viewers.

Obviously, the woman shovels a lot of crap; she’s got a real problem with that shovel. But nobody’s going to talk about it. I’d love to see the networks and news shows jumping on her about this stuff, but they didn’t do it for her last books, so why would they for this one?

It makes me nostalgic for the days when we had no 24-hour news channels that had to fill their hours no matter what. Back when the nightly news was 30 minutes, and that was it, there was at least some vetting of sources. Can you imagine CBS circa 1975 wasting even 15 seconds getting an opinion from a hack like Coulter? Not that things were perfect in 1975, by a long shot.

I don’t have a pithy wrapup for this post… just a pleading hope that, in the same way people get sick of so many other things and then move on, maybe we’ll all get sick of this and leave people like Coulter to the dust heap of “what were we thinking?”

For a year or so now, “innovation” has been bobbing around at the very top of the memepool. Everybody wants to bottle the stuff and mix it into their corporate water supplies.

I’ve been on the bandwagon too, I confess. It fascinates me — where do ideas come from and how do they end up seeing the light of day? How does an idea become relevant and actionable?

There’s a recent commercial for Fedex where a group of pensive executives are sitting around a conference table, their salt-and-pepper-haired, square-jawed CEO (I assume) sitting at the head of the group and a weak-chinned, rumpled and dorky underling sitting next to him. The CEO asks how they can cut costs (I’m paraphrasing) and the younger, dorky guy recommends one of Fedex’s new services. He’s ignored. But then the CEO says exactly the same thing, and everybody nods in agreement and congratulates him on his genius.

The whole setup is a big cliche. We’ve seen it time and again in sitcoms and elsewhere. But what makes this rendition different is how it points out the difference in delivery and context.

In looking for a transcript of this thing, I found another blog that summarizes it nicely, so I’ll point to it and quote here.

The group loudly concurs as the camera moves to the face of the worker who proposed the idea in the first place. Perplexed, he declares, “You just said what I just said only you did this,” as he mimics his boss’s hand motions.
The boss looks not at him, but straight ahead, and says, “No, I did this,” as he repeats his hand motion. The group of sycophants proclaims, “Bingo, Got it, Great.” The camera captures the contributor, who has a sour grimace on his face.

(Thanks Joanne Cini for the handy recap.)

What it also captures is the reaction of an older colleague sitting next to the grimacing dorky guy who gives a little nod to him that shows a mixture of pity, complicity in what just happened, and a sort of weariness that seems to say, “yeah, see? that’s how it works young fella.”

It’s a particularly insightful bit of comedy. It lampoons the fact that so much of how ideas happen in a group environment depends on context, delivery, and perception (and here I’m going to pick on business, but it happens everywhere in slightly different flavors). Dork-guy not only doesn’t get the language that’s being used (physical and tonal), but doesn’t “see” it well enough to even be able to imitate it correctly. He doesn’t have the literacy in that language that the others in the room do, and feels suddenly as if he’s surrounded by aliens. Of course, they all perceive him as alien (or just clueless) as well.

I know I’m reading a lot into this slight character, but I can’t help it. By the way, I’m not trying to insult him by calling him dork-guy — it’s just the way he’s set up in the commercial; I think the dork in all of us identifies with him. I definitely do.

In fact, I know from personal experience that, in dork-guy’s internal value matrix, none of the posturing means a hill of beans. He and his friends probably make fun of people who put so much weight on external signals — they think of it as a shallow veneer. Like most nerdy people, they assume that your gestures, haircut or tone of voice don’t affect whether you win the chess match. But in the corporate game of social capital, “presence” is an essential part of winning.

Ok, so back to innovation. There’s a tension among those who talk and think about innovation between Collective Intelligence (CI) and Individual Genius (IG). To some degree there are those who favor one over the other, but I think most people who think seriously about innovation and try to do anything about it struggle with the tension within themselves. How do we create the right conditions for CI and IG to work in synergy?

The Collective Intelligence side has lots of things in its favor, especially lately. With so many collective, emergent activities happening on the Web, people now have the tools to tap into CI like never before — when else in history did we have the ability for people all over the world to collaborate almost instantaneously in rapid conversation, discussion and idea-vetting? Open Source philosophy and the “Wisdom of Crowds” have really found their moment in our culture.

I’m a big believer too, frankly. I’m not an especially rabid social constructivist, but I’m certainly a convert. Innovation (outside of the occasional bit that’s just for an individual privately) derives its value from communal context. And most innovations that we encounter daily were, in one way or another, vetted, refined and amplified by collaboration.

Still, I also realize that the Eureka Moments don’t happen in multiple minds all at once. There’s usually someone who blurts out the Eureka thought that catalyzes a whole new conversation from that “so perfect it should’ve been obvious” insight. Sometimes, of course, an individual can’t find anyone who hears and understands the Eureka thought, and their Individual Genius goes on its lonely course until either they do find the right context that “gets” their idea or it just never goes anywhere.

This tension between IG and CI is rich for discussion and theorizing, but I’m not going to do much of that here. It’s all just a very long setup for me to write down something that was on my mind today.

In order for individuals to care enough to have their Eureka thoughts, they have to be in a fertile, receptive environment that encourages that mindset. People new to a company often have a lot of that passion, but it can be drained away long before their 401k matching is vested. Is what these people are after personal glory? Well, yeah, that’s part of it. But they also want to be the person who thought of the thing that changed everybody’s lives for the better. They want to be able to walk around and see the results of that idea. Both of these incentives are crucial, and they’re both important ingredients in the care and feeding of the delicate balance that brings forth innovation.

Take the Fedex commercial from above. The guy had the idea and he’ll see it executed. Why wouldn’t he be gratified to see the savings in the company’s bottom line and to see people happier? Because that’s only part of his incentive. The other part is for his boss, at the quarterly budget meeting, to look over and say “X over there had a great idea to use this service, and look what it saved us; everybody give a round of applause to X!” A bonus or promotion wouldn’t hurt either, but public acknowledgement of an idea’s origins goes a very very long way.

I’ve worked in a number of different business and academic environments, and they vary widely in how they handle this bit of etiquette. And it is a kind of etiquette. It’s not much different from what I did above, where I thanked the source of the text I quoted. Maybe it’s my academic experience that drilled this into me, but it’s just the right thing to do to acknowledge your sources.

In some of my employment situations, I’ve been in meetings where an idea I’ve been evangelizing for months finally emerges from the lips of one of my superiors, and it’s stated as if it just came to them out of the blue. Maybe I’m naive, but I usually assume the person just didn’t remember they’d heard it first from me. But even if that’s the case, it’s a failure of leadership. (I’ve heard it done not just to my ideas but to others’ too. I also fully acknowledge I could be just as guilty of this as anyone, because I’m relatively absent-minded, but I consciously work to be sure I point out how anything I do was supported or enhanced by others.) It’s a well-known strategy to subliminally get a boss to think something is his or her own idea in order to make sure it happens, but if that strategy is the rule rather than the exception, it’s a strong indicator of an unhealthy place for ideas and innovation (not to mention people).

But the Fedex commercial does bring a harsh lesson to bear — a lesson I still struggle with learning. No matter how good an idea is, it’s only as effective as the manner in which it’s communicated. Sometimes you have no control over this; it’s just built into the wiring. In the (admittedly exaggerated, but not very much) situation in the Fedex commercial, it’s obvious that most of the dork-guy’s problem is he works in a codependent culture full of sycophants who mollycoddle a narcissistic boss.

But perhaps as much as half of dork-guy’s problem is that he’s dork-guy. It’s possible that there are some idyllic work environments where everyone respects and celebrates the contributions of everyone else, no matter what their personal quirks. But chances are it’s either a Kindergarten classroom or a non-profit organization. And I happen to be a big fan of both! I’m just saying, I’m learning that if you want to play in certain environments, you have to play by their rules, both written and unwritten. And I think we all know that the ratio of unwritten-to-written is something like ten-to-one.

In dork-guy’s company, sitting up straight, having a good haircut and a pressed shirt mean a lot. But what means even more is saying what you have to say with confidence, and an air of calm inevitability. Granted, his boss probably would still steal the idea, but his colleagues will start thinking of him as a leader and, over time, maybe he’ll manage to claw his way higher up the ladder. I’m not celebrating this worldview, by the way. But I’m not condemning it either. It just is. (There is much written hither and yon about how gender and ethnicity complicate things even further; speaking with confidence as a woman can come off negatively in some environments, and for some cultural and ethnic backgrounds, it would be very rude. Whole books cover this better than I can here, but it’s worth mentioning.)

Well, it may be a common reality, but it certainly isn’t the best way to get innovation out of a community of coworkers. In environments like that, great ideas flower in spite of where they are, not because of it. The sad thing is, too many workplaces assume that “oh we had four great ideas happen last year, so we must have an excellent environment for innovation,” not realizing that they’re killing off hundreds of possibly better seedlings in the process.

I’ve managed smaller teams on occasion, sometimes officially and sometimes not, but I haven’t been responsible for whole departments or large teams. Managing people isn’t easy. It’s damn hard. It’s easy for me to sit at my laptop and second-guess other people with responsibilities I’ve never shared. That said, sometimes I’m amazed at how ignorant and self-destructive, as a group, some management teams can be. They can talk about innovation or quality or whatever buzzword du jour, and they can institute all sorts of new activities, pronouncements and processes to further said buzzword, but do nothing about the major rifts in their own ranks that painfully hinder their workers from collaborating or sharing knowledge; they reinforce (either on purpose or unwittingly) cultural norms that alienate the eccentric-but-talented and give comfort to the bland-but-mediocre. They crow about thinking outside the box, while perpetuating a hierarchical corporate system that’s one of the most primitive boxes around.

Ok, that last bit was a rant. Mea Culpa.

My personal take-away from all this hand-wringing? I can’t blame the ‘system’ or ‘the man’ for anything until I’ve done an honest job of playing by the un/written rules of my environment. It’s either that, or play a new game. To me, it’s an interesting challenge if I look at it that way; otherwise it’s just disheartening. I figure either I’ll succeed or I’ll get so tired of beating myself against the cubicle partitions, I’ll give up and find a new game to play.

Still, eventually? It’d be great to change the environment itself. Maybe I should go stand in front of my bathroom mirror and practice saying that with authority? First, I have to starch my shirts.

This is unbelievably creepy …

David Byrne Journal: 8.2.06: American Madrassas

Saw a screening of a documentary called Jesus Camp. It focuses on a woman preacher (Becky Fischer) who indoctrinates children in a summer camp in North Dakota. Right wing political agendas and slogans are mixed with born again rituals that end with most of the kids in tears. Tears of release and joy, they would claim — the children are not physically abused. The kids are around 9 or 10 years old, recruited from various churches, and are pliant willing receptacles. They are instructed that evolution is being forced upon us by evil Godless secular humanists, that abortion must be stopped at all costs, that we must form an “army” to defeat the Godless influences, that we must band together to insure that the right judges and politicians get into the courts and office and that global warming is a lie.

Oz-IA 2006

If there’s any chance you can make it to a terrific IA conference in Australia, definitely check out Oz-IA 2006. Dates: Saturday, September 30th & Sunday, October 1st

The industrious Eric Scheid tells me that “We’ve now announced the conference program, and it’s quite exciting – lots of practical sessions, by practitioners, for practitioners. Over the next few weeks we’ll be expanding the detail on each session.”

Just in case anyone has forgotten, the dream of the neocons was quite different from what has actually transpired in Iraq.

They honestly thought their twisted ideology was going to result in the perfect case study for their beliefs (and line their wallets in the process). Far from the privatized utopia they were expecting, their experiment on the flesh and blood residents of Iraq has instead resulted in an ever escalating death toll.

Here’s a bit from the article by Naomi Klein in Harper’s way back in 2004.

In only a few months, the postwar plan to turn Iraq into a laboratory for the neocons had been realized. Leo Strauss may have provided the intellectual framework for invading Iraq preemptively, but it was that other University of Chicago professor, Milton Friedman, author of the anti-government manifesto Capitalism and Freedom, who supplied the manual for what to do once the country was safely in America’s hands. This represented an enormous victory for the most ideological wing of the Bush Administration. But it was also something more: the culmination of two interlinked power struggles, one among Iraqi exiles advising the White House on its postwar strategy, the other within the White House itself.

Flash

Just one more Sunday with my daughter, dwindling now. Another week of work and daycamp, then a return drive to NC on Saturday, and our month together for 2006 will be over.

This time it went so fast. That’s a cliche, I know. But it did.

At the amusement park last week, kids lined up to ride the big swing ride over and over again. Parents tired of riding it with them, as I did, and so we all stood and watched, our necks crooked upward at our children flung in the wide circle. The sun was starting to drop, and some of us had cameras out trying to pluck images of our children from the screaming, sweaty orbiting ring, one frame at a time.

Cameras flashed. Parents yelled up “yes, I see you! hang on!” Then another flash. And we all had that frozen smile on our faces, the one where the mouth is all joy and wonderment, but the eyes behind the cameras say “I will keep this moment, this moment will never change” (flash) “stop” (flash) “yes, stop there and there” (flash) “and that moment, that smile, I’ll keep that one” (flash) “and that one too! oh god so many look at them pass, too fast” (flash) “too fast, too fast.”

The New Yorker has a very good article on Wikipedia this week. It acknowledges both the positive and negative aspects of the site. I have to agree that Wikipedia will never supplant the usefulness of a peer-reviewed traditional publication, but it will serve as a useful foil.

Over breakfast in early May, I asked Cauz for an analogy with which to compare Britannica and Wikipedia. “Wikipedia is to Britannica as ‘American Idol’ is to the Juilliard School,” he e-mailed me the next day. A few days later, Wales also chose a musical metaphor. “Wikipedia is to Britannica as rock and roll is to easy listening,” he suggested. “It may not be as smooth, but it scares the parents and is a lot smarter in the end.” He is right to emphasize the fright factor over accuracy. As was the Encyclopédie, Wikipedia is a combination of manifesto and reference work. Peer review, the mainstream media, and government agencies have landed us in a ditch. Not only are we impatient with the authorities but we are in a mood to talk back.

One point the article makes clear is that Wikipedia is, if defined mainly by writing activity, a community where people discuss things. The talk and discussion pages get more use than the actual articles. And that’s part of what I really love about it. Wikipedia (like the Web in general) records and makes explicit all the tacit conversations that go into collective truthmaking.

Evidently the guy who started Wikipedia with Jimmy Wales, Larry Sanger, is now working on a new project — a hybrid of Wikipedia-like openness with editorial peer review. Depending on how that’s handled, it could be extremely powerful. And why couldn’t Wikipedia be the breeding ground of what eventually ends up there?

Anyway, the article also makes the point that Encyclopedias have always been challenges to hegemonies …

In its seminal Western incarnation, the encyclopedia had been a dangerous book. The Encyclopédie muscled aside religious institutions and orthodoxies to install human reason at the center of the universe—and, for that muscling, briefly earned the book’s publisher a place in the Bastille. As the historian Robert Darnton pointed out, the entry in the Encyclopédie on cannibalism ends with the cross-reference “See Eucharist.”

It’ll be strange to look at something like Wikipedia one day and think of it as a dusty, traditional way of sharing knowledge. But for now, it’s fun to watch the fight.

O Solo Veto

The world is going to the crapper in the Middle East right now, so in a way part of me wonders why I’m obsessing over this issue, but it’s important. Like everybody else I’m wondering how President Bush has managed never to veto a single thing in all his years in office.

I mean, suppose you’d hired a quality control officer in your company and, unlike every q.c. officer before him, he’d never flagged a single defect, saying “well, I got the factory to change everything to my specifications before anything had to be sent back.” Would you be suspicious? I would. Either the guy is a genius who just reinvented your quality capabilities or he’s slacking. And there aren’t that many geniuses in the world.

Anyway, this stem cell thing … there are many reasoned arguments on both sides. I’ve heard some very decent and rational people explain how, if you define life as beginning at conception, an embryo is a human being and therefore should be protected under the law. Fair enough. But if that’s the case, why do we dispose of so many of them?

According to the legislation that was vetoed, there are thousands of them disposed of every year. The legislation only sets boundaries saying we can use the ones that would’ve been disposed of for research, and only if the donors agree to it. These would never be implanted in a woman. If they’re all human life, why are they being disposed of to begin with?

Part of U.S. Congressman Mike Castle’s letter to Bush:

* The stem cells were derived from human embryos that have been donated from in vitro fertilization clinics, were created for the purposes of fertility treatment, and were in excess of the clinical need of the individuals seeking such treatment. Prior to the consideration of embryo donation and through consultation with the individuals seeking fertility treatment, it was determined that the embryos would never be implanted in a woman and would otherwise be discarded.
* The individuals seeking fertility treatment donated the embryos with written informed consent and without receiving any financial or other inducements to make the donation.

This logic goes unmentioned in the administration’s denouncements.

What we’re really witnessing is a calculated pandering to ignorance. I don’t think Bush is pandering, though — I think he really believes each blastocyst is a human child crying out for a uterus. He’s swallowing whole the dogma spoon fed to him by Rove, especially. (Rove, who has been distorting the science to begin with — and we know Bush won’t actually read anything for himself, so whatever Rove says, Bush takes as gospel.)

This is frightening to me because of the implications — that even with a Republican majority in Congress passing this bill, the President still sees it as his responsibility to be the voice of his version of God for our nation. I can’t find the link right now, but it’s on record that at least four senators who spoke against the bill invoked God’s name saying the Creator would be very displeased and would do bad things to America if we passed it.

Ben Franklin and the rest of them are rolling in their graves.

The pandering is possible because of the semantics involved. What do we mean by “life” and “child”? Who gets to decide if a blastocyst is a child or not? Obviously, in reality, it’s more complicated than “cell a plus cell b equals Junior.” Nature doesn’t treat it that way; even the Bible doesn’t treat it that way (it refers to life as “breath” not blastocysts; so much of scripture is misquoted, mistranslated and misinterpreted to support all kinds of views that I’ve given up trying to even discuss it in those terms with anyone). And evidently our own laws don’t treat it that way either, because the law allows the discarding of these blastocysts in fertility clinics.

This is a way for an administration that has championed so much death to doubletalk their way into being all about life, to hold onto their shredding political base by pandering to the ignorant, superstitious and misguided who keep putting them in office.

I don’t necessarily mean “ignorant” as an insult, either. Everybody can’t be an expert on everything. People are busy with their regular lives. In an information saturated world, we depend on sound bites to navigate the terrain. I confess that, listening to the bits and pieces coming over the airwaves, I too figured using embryos for research sounded creepy to me. But being informed about it with an open mind goes a long way toward understanding it’s not so simple, especially when you weigh the benefits.

The NIH has an excellent overview here.

Over four hundred thousand blastocysts are out there, frozen, and a tiny fraction of them are ever “adopted” for attempted impregnation. Plus, if I understand Castle’s letter quoted above, only cell groups that are flagged by donors as ok for research would ever be used.

It’s a slippery ontological question: who decides a group of cells is a human person and who doesn’t? If someone is brain-dead, and the family insists the person is alive enough to still be the person they knew, should the government be allowed to pull the plug anyway? Probably not. Then why would the government be allowed to decide the converse — that a microscopic blastocyst is a person when the people who created it say it isn’t? It’s uncomfortable to discuss, but necessary.

However, rational discussion is impossible with the rampant disinformation and ignorance being spread (by both sides, in some instances, but the *science* and logic are on the pro-research side, it seems to me). The most ridiculous stuff is coming from the silly portion of the right wing, such as Limbaugh claiming that you have to have abortions to get stem cells.

Why am I angry about this? Because of the same reasons that most of the country should be up in arms about it. Because I have people I love who could be helped by this research — the brightest light in the dark tunnel of medicine for so many people with diseases that don’t respond to anything as simple as a miracle vaccine. It’s the same reason Arlen Specter breaks with his more extreme Republican brethren on the issue on the Senate floor. Because for him it’s a matter of life and death, but not in the sense of superstition and theory:

“There are some 400,000 frozen embryos, and the choice is discarding them or using them to save lives; Sen. Brownback and I had a debate where he challenged me on when life began, and I retorted, suffering from Hodgkin’s cancer myself, that the question on my mind was when life ended. Life will never begin for these embryos, because there are 400,000 of them and, notwithstanding millions of dollars appropriated to encourage adoption, only 128 have been adopted; so those [potential] lives [of the remaining embryos] will not begin, but many other lives will end if we do not use all the scientific resources available.”

This is real-world thinking. The kind of thinking that stands up and makes adult, difficult choices about the reality of the world around us. My stepfather (with Alzheimer’s disease) and others close to me with things like immunological disorders could be helped by this research. My daughter and I just sent flowers to a funeral of a loved one who died from complications after a stem-cell procedure that could’ve been improved if the research hadn’t been stymied for the last five years.

But any such morally responsible thinking is precluded by the insidious, manipulative drivel piped into the conversation by dogmatic fundamentalists who believe the cells from our bodies belong to the government, not us. And they’re so effective at this twisting of logic that even my own mother (my ailing stepfather’s wife, who has to face their last years together under the weight of Alzheimer’s) is convinced that her President is a saint who would never do her wrong.

Yeah, that’s why I’m angry.

419 Fun

I read this article in the New Yorker a few weeks ago, about “The Perfect Mark” — a man who, in spite of being relatively intelligent, fell for one of the Nigerian “419” scams. (The ones where you get an email saying “I have a million dollars in a blind account and need your help to get it out.”)

I’d always assumed these scams were just quick one-hit stabs at getting a credit card number. I had no idea how deep they actually go, and how sophisticated they are. They strung this guy along for years (and he still wants so much to believe that the characters are real!) and did such an amazing job of reality-twisting. Now when I see these emails, I’m no longer merely amused and puzzled but a little creeped out that I’ve just gotten mail from murderous, organized criminals.

So it feels especially wonderful to run across this site, “Welcome to the 419 Eater” where some clever soul conned the cons. He basically tells them “yeah I wish I could help you, but I’m in the middle of this really big deal that’s making me even more money” and baits them into a similar trap. Only in his case, it’s just an extended practical joke. (He never invites anyone to fly to his country then robs and kills them, for example.)

In this one: http://www.419eater.com/html/john_boko.htm they manage to get a scammer to think they’ll make thousands of dollars by carving strange things out of wood, then claim a hamster has eaten the goods. Hysterical.

I’ve been working for a couple of months now on an article for the ASIS&T Bulletin (American Society for Information Science and Technology). It started out as an article version of my “Clues to the Future” presentation, but I soon realized that 1) I couldn’t really explain the same stuff very well in a 4000-word article, and 2) to do so would be a bit redundant with the presentation itself (which is fairly well explained in the text part of the pdf download). I also realized (I guess this is 3) that there were other things I really wanted to say but hadn’t yet figured out how to articulate, and this was a good incentive and/or opportunity to do so.

After eight weeks of banging my head against a few walls (both real and virtual), and thanks to the extreme patience of an editor, I think it may at least form the beginnings of what I’ve had rolling around in my brains.

Writing it was hard. Period. It always has been, at least to do it well. This was true when I wrote fiction or poetry (mostly in a previous life), but I found it especially hard writing a long-form essay for print. I’ve been so used to writing PowerPoint presentations and blog posts that I was quite out of practice at developing my ideas with any rigor. And I’m still not sure how effectively I’ve articulated this stuff, but so it goes. One of my favorite quotations ever is E.M. Forster’s “I don’t know what I think until I see what I say,” or some version of that. It’s so very true — the act of putting an idea into parsable language inevitably changes it, for good or ill, but hopefully for the better.

Writing about things that don’t yet have a solid, agreed-upon vocabulary is especially difficult. I used to curse the philosophy texts I read as a student because they were so full of neologisms — especially those from pesky Germans like Heidegger — but now I sort of understand: they were trying to express ideas that hadn’t been expressed yet, and needed rubrics by which to signify them without re-explaining each time.

The piece is called “We Live Here: Games, Third Places, and the Information Architecture of the Future.” Egad, now that I see it here it sounds awfully pompous.

When it’s published I’ll post a link or excerpt or something.

Mao Mao Mao

There’s been a lot of buzz over the last week or so about Jaron Lanier’s “DIGITAL MAOISM: The Hazards of the New Online Collectivism”
[http://edge.org/3rd_culture/lanier06/lanier06_index.html] in which he warns of a sort of irrational exuberance about “collective intelligence.”

I found myself taking mental notes as I read it, ticking off what I agreed and disagreed with and why. But then I read Douglas Rushkoff’s response:
http://edge.org/discourse/digital_maoism.html#rushkoff

And I realized he’d already expressed everything in my tick-list, and then some, and better than I would’ve.

Lanier’s essay and all the responses to it at Edge are excellent reading for anyone who thinks deeply about what the Internet means to the social fabric, culture, learning and history.

Just a couple of personal reactions:

I found myself feeling a little chastened reading Lanier’s essay. I knew going in what it was about and expected to disagree with most of his points, but I ended up realizing I had been guilty of some of the foolishness he calls us on, and agreeing with most of what he says.

But then I thought about what I’ve actually believed on the subject and realized, I don’t think I’ve ever thought or said the collective is superior to the individual. Only that “architectures of participation” allow even more individuals to participate in the marketplace of ideas in ways that they simply couldn’t have before. Lanier runs the risk of equating “collective intelligence” with “collectivism” — which is a bit like equating free-market capitalism with Social Darwinism (itself a misnomer).

His main bugbear is Wikipedia. I agree there’s too much hype and not enough understanding of the realities of Wikipedia’s actual creation, use and relevance. But I think that’ll sort itself out over time. It’s still very new. Wikipedia doesn’t replace (and never will) truly authoritative peer-reviewed-by-experts information sources. Even if people currently reference it like it’s the highest authority, over time we’ll all learn to be more authority-literate and realize what’s OK to reference at Wikipedia and what isn’t (just as War of the Worlds tricked thousands in the early days of radio — you really can’t imagine that happening now, can you?).

One thing Lanier doesn’t seem to realize, though, is that Wikipedia isn’t faceless. Underneath its somewhat anonymous exterior is an underculture of named content creators who discuss, argue, compromise and whatever else in order to make the content that ends up on the site. Within that community, people *do* have recognizable personalities. In the constrained medium of textual threaded forums, some of them manage to be leaders who build consensus and marshal qualitative improvement. They’re far from anonymous, and the “hive” they’re a part of is much closer to a meritocracy than Lanier seems to think.

Not that Wikipedia’s perfect, and not that it meets the qualifications of conventional “authoritative” information sources. But we’re all figuring out what the new qualifications are for this kind of knowledge-share.

At any rate, his essay is very good and has important stuff we have to consider.
