Change in Plans
Posted on January 3, 2003 Posted by John Scalzi
So, there’s been a slight change of plans. As you may remember (surely 2002 isn’t too hazy yet), I serialized my most recent science fiction novel, Old Man’s War, here in December, and this month I was going to put it up as shareware, a la Agent to the Stars. Well, I won’t be doing that. The reason for this is that, well, I kind of sold it. Instead of being available as shareware, Old Man’s War will be available either later this year or early next year in a hardcover edition from Tor Books, publishers of (among others) Orson Scott Card, Robert Jordan, Steven Brust and Teddy Roosevelt. Yes, really, Teddy Roosevelt. It’s a reissue, I think, not one of those L. Ron Hubbard-esque “dictating from beyond the grave” situations.
Am I happy? What a silly question. I’m just glad this is a text medium, so you can’t see the footprints on my desk from where I was dancing on it (yes, I could make a little videocam movie. But, no). And it’s actually a two book deal, so I get to write at least one more novel and have someone pay me for it. As they say, it beats a sharp poke in the eye. Or in the groin. Or just about anywhere else, for that matter. Sharp poking: Bad. Two book deal: Good.
I’m also pleased to say that the book deal comes as a direct result of having the book up here on the Web site; the editor who made the offer (Patrick Nielsen Hayden, who in addition to being the Senior Editor of Tor is the author of the Electrolite blog) did so after reading chapters on the site and then downloading the complete book (and by doing so, Mr. Nielsen Hayden’s ranking on my list of People Who Can Ask For a Kidney and Not Be Dismissed Out of Hand has shot up rather dramatically over the last few days. And I can assure you, it’s a very short list).
I’m not 100% positive on this, so please don’t hit me if I’m wrong. But I’m pretty sure it’s the first time that an SF novel that’s been published on a personal Web site has been picked up by a major publisher for traditional publication. It’s not the first book that’s been derived from a personal Web site, of course — James Lileks, for one, turned part of his site into a successful non-fiction book (and has another one coming), and Pamela Ribon’s upcoming novel is in part mined from her Web site entries. But it might be the first time for a novel that was presented in more or less completed form to make the jump. If this is the case, then as you may imagine I’m rather pleased about it. One always likes to be the first at something, or at least near the top of that list. If it’s not, of course, I’m still pleased for me.
(Addendum: MJ Rose reminds me that she did it first, with her erotic-tinged novel “Lip Service.” In 1998, even. But it wasn’t science fiction, so I can still cling to being the first in that genre until someone inevitably comes to knock me off that perch. Come and get me!)
It’s also a confirmation of something I wrote just prior to serializing the novel on the site. I wrote:
There’s also another reason I’m putting it online, which is simply that I’d like to advance the possibility that something like this — self-published and online — doesn’t necessarily have to be automatically shoved into the “loser” box. Over the last year, I’ve been spending a lot of time listening to and reviewing independent music through my IndieCrit site, and speaking out of a decade and a half of critical experience, much of this independently-released (and indeed largely “self-published”) music is as good or better than the music that is being shot out of the major music labels. Why shouldn’t independently-released novels have the same chance of reasonable quality? Someone’s got to start making the case for it, and why not me.
And indeed, the fact this sale was possible at all is yet another example of the maturation of the online medium. Many folks still see the online medium as the medium for people who can’t or won’t get published any other way, but that hasn’t been the case for a while now. Over the last year in particular several of the more prominent bloggers have seen their online bylines become useful in transferring their writing into the mainstream media; by the same token a number of old-line writers and journalists used the “blog” format to ratchet up their reputations (my high school classmate Josh Marshall being a fine example of that).
For talented and committed writers, writing on one’s own site is a true alternative medium which can be used for one’s overall gain, and not simply as a catchall for otherwise unusable or unpublishable material. People who write well, and write online, no longer need to feel at an inherent disadvantage to those who write well, and write in a traditional medium (bad writers, alas for them, are still stuck).
Now, before someone gets it into their head that my publishing my novel online was some sort of controlled, Machiavellian plan to forgo the SF publishing slush pile, and that Mr. Nielsen Hayden fell right into my trap, bwa hah hah hah, I’d like to be clear: I’m not nearly that organized, and I don’t know that this sort of thing is easily reproducible. Old Man’s War is a good book, but there are lots of good books out there waiting to be looked at, and typically speaking, your book has a much better chance of getting bought if you actually set it in front of an editor rather than waiting for an editor to come on by.
So in short: Boy, did I ever get lucky. I really cannot emphasize that enough, and to drive this point home, I’ll note that Agent has been online for going on four years and hasn’t had so much as a nibble, traditional publishing-wise. I don’t want to suggest we all ditch the traditional methods of getting work published. They’re traditional because they tend to work (also, if people start hounding Mr. Nielsen Hayden or any other SF editor to swing by their Web site and look at their novel, I don’t want it said it’s because I said that’s what they should do. Please, be nice to the editors. Give them love. And jewelry).
What I am saying is clearly we’ve gotten to the point where it’s no longer the smart thing to automatically dismiss writing online — even an online novel — as “not good enough.” Sometimes, it is good enough. It’s just that simple. I’m happy to be one of the guys who gets to be the case in point for that.
The Child on the Train
Posted on January 1, 2003 Posted by John Scalzi
About a week after Krissy completed the first trimester of her pregnancy, she went in to the doctor to have a routine checkup for herself and her baby. While she was being examined, the doctor had difficulty finding the baby’s heartbeat. This in itself was not unusual — at just over three months, a fetus is still a small thing. The sound of its nascent heartbeat is easy to lose in the other sounds of the body. But by the next day, Krissy had begun to spot and bleed, and shortly thereafter she miscarried. As with nearly a quarter of all pregnancies, the processes that form and shape a life had stopped at a certain point well short of completion, and for whatever reason this child would not be born. It was a death in the family.
By and large, we kept the matter to ourselves, telling the people who needed to know — family and close friends — but otherwise saying nothing. I had written about Krissy’s pregnancy on my Web site, as I had written about Krissy’s first pregnancy — and why not, since a pregnancy (at least in the context of a happily married and financially secure couple) is a happy thing. For a writer, there’s a lot of material to discuss, so long as it’s done in a tasteful manner that doesn’t have one’s pregnant wife planning to beat one in the head with a pan. But a miscarriage is obviously something different. There’s no way to write on one’s Web site, in a breezy and conversational style, that a pregnancy has ceased.
Even if there were, the event was too close and too personal to share in that way. Celebration should be public, by definition, but grief is a fragile thing. Grief is a small, difficult and necessary visitor that dwells in your home for some little time, and then has to be gently encouraged to depart. Crowds make it nervous and inclined to stay put. We didn’t want that. We figured anyone who learned of it later would understand. We held our grief close and then after enough time, bid it farewell and set it on its way.
And it is gone; its time in our house was brief. Our friends, our family, and most of all our daughter helped see to that. One cannot stand in the face of such fortunate circumstances as we have and wish to cling to grief. There is too much that is good in our lives together to stay sad for long. So we didn’t.
Were you to express your condolences to us today, we would of course thank you for them — we know they’re sincere and we know they’re meant from the heart. But we would hope you would also understand when we said “thank you” and then chatted with you about something else entirely, it’s not because we are pained about revisiting the grief. It’s that the grief is like a shirt that is six sizes too small. It fit once, but it doesn’t fit now, and trying to get it back over our heads would be an exercise in futility.
I mention the miscarriage now primarily because this is around the time that Krissy would have been due, and various correspondents have been asking about it. When I write back that Krissy has miscarried, they’re all deeply apologetic for bringing up what they (not unreasonably) assume is a painful topic. And of course, it’s not their fault at all, since I mentioned the pregnancy but not the miscarriage. I really don’t want anyone else to feel horrifyingly embarrassed because of my decision not to discuss certain information.
I also want to avoid scenes like the one I had in October, in which I was standing around with a circle of casual acquaintances. One of them was discoursing about the danger of asking other casual acquaintances about their personal lives, since there’s always something horrible that’s happened — and no sooner did this acquaintance finish saying this than she asked me how Krissy’s pregnancy was coming along. Rarely has someone posited a statement and proved it with such brutal efficiency. I felt bad that my omission put her in such a situation. So now it’s out there.
I should mention that the fact that we’ve left behind the grief of the miscarry does not mean the event is forgotten; or perhaps it’s better to say that the child we lost is not now nor ever will be forgotten by us. It is, as I’ve said, a death in the family, and while the small absence it created is small indeed, it is yet still an absence. It doesn’t go away, and even though we see it without grief, we recognize it exists. It would be wrong to pretend it does not.
If I could describe to you what a miscarry feels like from an emotional point of view, I would ask you to imagine a dream in which you are standing on a train station platform. While you are waiting, you look through the dirty windows of the train car in front of you and see a small child looking back at you. The child’s face is indistinct because of the condition of the windows, but what you can see looks achingly familiar. For a moment, the child is separated from you by only that single, dirty pane of glass. Then the train starts to move, and the child starts to move with it.
And you realize that the reason you’re on the platform at all is because you’re waiting for your own child to arrive, a child you have yet to meet. And you realize that you could have claimed that child as your own. And you know that whatever child eventually comes to you, you will love that child like the sun loves the sky, like the water loves the river, and the branch loves the tree. The child will be the greater whole in which you dwell.
But it will never be that child, the one you could only glimpse, the one who went away from you. All you can do is remember, and hope with everything in your heart that the child who went away from you finds another who will love it as the sun loves the sky, the water loves the river, and the branch loves the tree. You pray and you hope and you never forget. That’s what you do. That’s what I do.
Four Years Old
Posted on December 23, 2002 Posted by John Scalzi
Athena woke up today to her mother and father thrusting a birthday cake at her and singing “Happy Birthday.” This confused her since she didn’t know that it was her birthday — we don’t tell her until it happens, partly because it’s fun to be surprised, and partly because this way it’s easier to manage expectations. She already knows Christmas is coming up; contemplating Christmas and her birthday together would be enough to cause her little head to pop. In a few years, of course, she’ll probably learn to dislike it because of the whole “one gift for Christmas and birthday” thing people will pull, but hopefully between now and then we’ll have a chance to impress upon her the idea that it’s not the gifts that count. In the meantime, however, we got her this really cool talking globe.
It will not come as news that I adore my child. Indeed, this is the position that one is assumed to have, and I’m quite happy to conform to expectations. Athena is exactly the child I would have wished for: Smart, strong, stubborn beyond belief, and curious in all the senses that the word can be applied to a child. She’s also heartbreakingly beautiful, loving and happy (except when she’s not. And then she’s very not. But usually this passes quickly). She’s not a perfect child, just perfect for me. The idea that I get to help this quirky little human get a running start at the world fills me with a sense of enduring and constant joy.
The bitterest of childless folks that marinate in various “childfree” groups online seem to think that feeling of joy is akin to getting a lobotomy, and as much as I’m inclined to dismiss that as the feculent bleating of the terminally selfish, I will grant that they’re on to something, even if once they’re on it they wallow in it. What they’re onto is the fact that parents fetishize their children, which annoys everyone else (even the folks who don’t endemically hate children for dark, squirrelly reasons of their own) and is really no good for the kids either.
But I think all this fetishism is something different from the joy I’m talking about. And honestly, I don’t know what’s at its root, since I don’t much understand it myself. I’m not likely to become one of those parents who schedules their kid’s life from 5:30 in the morning until 8:00 at night and who then whines, in a petulantly prideful way, about how much time and energy they have to devote to their children’s well-being. I’m certainly not going to be one of those parents who tries to block out anything in a culture that’s not child-safe, if for no other reason than that most of the things that these types of parents see as a threat are things I see as an opportunity to teach my daughter how to mock. It’s already working; when the commercials come on the TV, she typically turns to me and says “Would you mute these, please, daddy? These commercials are evil.” She’s well on her way.
A good friend of mine who is also a parent recently commented that as much as she loves and adores her child, there’s a part of her that misses the freedom she had before, including the ability to actually have a thought process that goes for more than 20 seconds before it’s interrupted by a little human asking for a juice box. And it’s true enough that parenting fundamentally means giving up part of yourself into the service of someone else, who is at first too young, and in the teenage years typically too self-oriented, to conceive of the idea that you also belong to a world with which they do not intersect. Everyone remembers the disorientation we felt when we first realized our parents had more going on in their lives than we had ever suspected. Now we’re on the other side of the equation.
I’m pretty confident my friend will figure out the right balance in time; she’s smart and if nothing else, in time the kids go to school, and you have several hours in a day in which you can link thoughts together without interruption. For me, it’s always been that what I’ve gained from being a parent has outweighed what I’ve had to put aside. This is partly due to my own personal inclinations — I’m not now nor have ever been one of the 24-hour party people, for whom the introduction of a small person would mean schedule crimpage, and the things I do enjoy are easy enough to juggle along with family. But I also simply suspect I’m one of those people who is reasonably well-designed for parenthood. At the very least, I can be pedantic, which suits having a kid. In the short term, at least, my kid enjoys hearing me rattle on about stuff.
I’m looking forward to the next few years. I was pleasantly surprised by Athena’s early childhood — I thought my assumed particular skills as a parent would be wasted on the infant and toddler years. I learned that they weren’t, and that the skills I thought I had were different from the skills I actually do have. And I do know that I already miss Athena as a baby.
But now Athena’s asking questions about the world — really good questions — and she’s at the point where she’s able to understand answers. And she asks a lot of questions. And boy, am I ready. Finally, all those years of learning pointless information are going to pay off, since it’s no longer pointless. It serves to expand my daughter’s knowledge of the world, and she’s loving the fact that her world is expanding. It’s fun.
I can’t wait to see what the next year brings for Athena and to us as her parents. We are blessed and lucky. And hopefully we can make our daughter feel the same way. We’ll try. It’s what you want as a parent: The hope, the effort and the enduring joy.
Happy birthday, Athena. I love you.
Al, Trent, Andrew
Posted on December 16, 2002 Posted by John Scalzi
Some folks have written in asking what I think of various events that have transpired in the last week, so I thought I’d briefly comment on three of them.
First: Al Gore. I’m not entirely surprised that Gore has decided not to seek re-election (heh heh) in 2004, since if he did run he’d lose. He’d not only lose because Bush is legitimately popular in his own right now — although he is, and Gore would lose on that datum alone — but because Democrats now approach Gore like a stinky old-fashioned herbal medicine that you take because it’s supposed to be good for you, even though you suspect it doesn’t do anything useful at all.
Make no mistake, Gore would win the 2004 Democratic nomination, on the backs of hardcore Democrats who would pull the lever for him for the same reason legions of Star Wars geeks trudge joylessly to George Lucas’ latest betrayal of their trust: Because that’s what’s expected of them, and because if they didn’t, they’d be admitting that their former investment of time and energy was a complete waste. Meanwhile, the rest of the pool of potential Democratic voters, who are not glumly enthralled by Democratic Jedi mind tricks, will get a look at Gore’s reheated visage and say: Screw this, let’s go catch The Matrix. Gore’s a loser, baby.
Anyway, Gore’s better off where he is. Right now there’s still a sizable chunk of people who feel vaguely that the man got screwed out of a job; better to ride that wave of disassociated pity to a posh sinecure on the lecture circuit and a kingmaker perch in Democratic politics, than lose unambiguously and stink up the room like the second coming of Mike Dukakis. And God forbid that he should run again and it comes down to Florida once more.
Second: Trent Lott. The Washington Post yesterday offered the tantalizing suggestion that Lott might resign from the Senate if he were pushed out of the Majority Leader post, a sort of “mutually assured destruction” thing since the Mississippi governor is a Democrat and would almost certainly appoint someone of his own party to the Senate. Then all it takes is one reasonably liberal Northern Republican Senator to defect and it’s another two years of the Democrats sticking a knife into the Republican agenda and yanking vigorously back and forth. Lott’s people deny he would do any such thing, of course, but God. Who wants to cross that line to find out for sure?
I feel vaguely sorry for Lott, the same way I feel sorry for someone who followed the “greed is good” mantra all the way to an 18-month insider trading stint at a minimum security prison: It’s not that the person isn’t getting what they deserve, but you pity them that in their heart they don’t understand what it is they’ve done that’s so damn wrong. Lott has apologized his brains out, and he’s under the impression that sooner or later all that apologizing is going to take — Americans are famously forgiving, after all.
But the thing about that is people prefer to have the impression that when one apologizes, one is actually sorry about the thing one has said or done. Lott distinctly gives the impression that he’s apologizing because he knows that’s what’s required of him so he can get back to more important things, like ramming conservative judges through confirmation hearings. And of course it doesn’t help that every time he apologizes, another news story surfaces of him supporting segregation in college, or palling around with white supremacists, or suggesting the coalitions of people who’d like to box black folks up like veal “have the right ideas” about states’ rights and what not, wink, wink, nod nod. That kind of cuts the legs out of his whole “Racism is a bad thing” line.
Do I think Lott is a racist? Well, at the very least, I do suspect that Lott thinks of black people the way that conservative Republicans my age and slightly older think of gays and lesbians — that whole “why, this person seems agreeable enough, and look, I’m not even thinking about the fact he’s gay at all” sort of thing. The folks in this situation deal with gays by concentrating on the trivial matters at hand in front of them and desperately not thinking of that gay person in any other context — say, at home with their partners, slicing tomatoes for a salad or watching HBO or talking on the phone or having red-hot oral sex on the stairwell. Replace “gay” with “black” and you get an idea of where Lott is coming from. It’s sort of like being told not to think about a white elephant, and so of course that’s exactly what you do. “White Elephant,” of course, being oddly appropriate here.
Do I think Lott should step down? Well, of course I do, but I’d want him to step down no matter what; as a general rule I’m against antediluvian helmet-heads laying out a legislative agenda that is inimical to any number of my deeply-held beliefs. This is just icing on the cake. But on the other hand, if he steps down (and doesn’t resign the Senate in a huff), he’ll just be replaced with another GOPer who does not have as many political liabilities. So as with many people who are not GOPers, I’m perfectly happy for him to stay where he is. He’s going to be a fine poster boy to motivate people in 2004, both within the GOP to root out people like him, and outside the GOP to punt the GOP back into minority status.
The good news is that one way or another Lott’s gaffe is costing him and the GOP. The question is the ultimate cost and when it’s paid up. I’m just as happy to have it later as sooner.
Third: Andrew Sullivan. Sullivan held a pledge week for his blog last week, saying in essence that if a certain small percentage (1% or so) of his readership didn’t kick in $20 a year, he’d roll up his blog and go back to writing articles for people who actually paid him money. Apparently the threat worked, since Sullivan is going to announce later this week that he’s cleared enough in contributions to keep his blog going.
A number of anti-Sullivan types have gotten themselves into a tizzy about this, but I’m really hard-pressed to see why. Like Sullivan, I’m a professional writer; I get paid to write. Therefore I can’t see what possible reason one should have against a writer getting paid. Sullivan’s had the benefit of seeing how other various revenue models have worked online, and he’s trying one that allows maximum choice for the readers and doesn’t require every single reader to consider paying.
So if people want to voluntarily pay Sullivan money, why should anyone else care? His not-so-subtle threat that he’d pull the plug might have seemed unseemly to some, but if the man doesn’t get paid for his writing, he doesn’t eat. If he can get some portion of his audience to support the blog he enjoys writing and they enjoy reading, more power to him.
Having said that, I don’t think the average Joe Blogger should start thinking that Andrew Sullivan’s success extracting cash from readers is going to translate to a blogoverse-wide rain of money. Very few bloggers have Sullivan’s audience, and even he is smart enough to realize that he’s lucky if he gets 1% of his audience to chip in. 1% of most bloggers’ audience comes to a few dozen people, and most people aren’t going to be able to suggest a recommended contribution of $20 like Sullivan has. There’s also the matter of content — i.e., whether people have content that’s worth supporting with cash. No offense, but most don’t.
Outside of Sullivan, I suspect no more than two or three bloggers could actually extract a serious amount of cash from their readers through a “pledge week,” and those bloggers are the usual suspects of Glenn Reynolds, James Lileks, and possibly Josh Marshall (all three of whom have contribution buttons but have not tried pledge weeks). Everyone else would get diddly, and yes, I include myself in that assessment. Right now, as it happens, I’m getting a nice flow of cash in from people buying the downloadable version of my current novel, but given the asking price ($1.50) and the size of my daily audience, well, let’s just say I’m not giving up my day job. Which is fine, since I like my day job.
(Speaking of which, the final bit of the Rough Guide to the Universe text is in to the publishers — I’m done, I’m done, hallelujah, I’m done. Now all I have to do is wait for the publication and the book tour. In the meantime, I have another “Uncle John” book to contribute to, and at least one other book project brewing for the year. It’s not a bad life.)
Posted on December 3, 2002 Posted by John Scalzi
One of the nice things about writing something mildly controversial, such as the Big Bang and Creationism or Confederate idiocy, is that it brings in a number of new readers, many of whom are not familiar with my rhetorical style and are therefore shocked at how mean and unfair I am to whatever position it is that they have that I don’t. So let’s talk about being “fair” for a moment.
Basically, for the purposes of the Whatever, I’m wholly uninterested in it. Complainants about my unfairness have suggested that as a journalist (or having been one in the past), I should know something about being fair and objective. Well, I admit to having been a journalist now and again, although when I worked at the newspaper I was primarily a film critic and a columnist, jobs which were all about being subjective. So I wouldn’t go entirely out of my way to trumpet my own rich personal history of journalistic endeavors. I can do traditional journalism, and when I do it, I do a very good job of it. But it’s never been my main thing; opinion is what I got paid for in my time as a journalist.
This space is not about journalism; never has been, never will be. It’s about whatever’s on my brain at the moment (hence the name), and it makes no pretense of being anything else. This gets written in the interstitial time between paid writing assignments; it’s meant to be a venting mechanism and a practical way to keep writing in a certain style — the writer’s equivalent of doing scales — so that when I do this sort of thing on a paid basis (it does happen), I’m ready to go.
But ultimately it’s all about me: I pick the topics, I comment on the topics, and the basis for the comments is whatever I’m thinking about the subject. I. Me. Mine. It’s all me, baby. What’s going on in my head is inherently unfair because it comes from my own, singular point of view; I don’t try to consider every point of view on a subject when I write about something here: I don’t have the time, for one thing, and for another thing I don’t have the inclination.
If you have your own opinion, don’t expect me to air it for you, unless you understand that typically when I present other people’s points of view here it’s to point out why they are so very wrong wrong wrong. Expecting me or anyone to validate your point of view out of the goodness of our hearts seems a dangerously passive thing to do. You have a functioning brain and an Internet connection; get your own damn Web page. Don’t worry, I won’t expect you to be “fair,” either.
But I doubt that many of the people who want me to be “fair” are actually asking for actual fairness, anyway. What they want is some sort of murmured polite dissent to whatever beef-witted thing they want to promulgate, something that implicitly suggests that their ideas have legitimacy and should be discussed reasonably among reasonable people.
To which my response is: Well, no. Your opinion that whatever it is you want to foist on the world is reasonable does not mean that I have to agree, or treat it with the “fairness” you think it deserves. Rest assured that I am “fair” to the extent that I give every idea I encounter the respect I think it rates.
To take the two most recent examples of this, by and large Creationism (from a scientific point of view) is complete crap; therefore I am rightfully critical of attempts to teach it (or its weak sister “intelligent design”) in science classes. Likewise, denying that the Confederate flags represent evil is pure twaddle and I’m not required to treat the idea that they don’t with anything approaching seriousness. You may not like this position, but ask me if I care. If you want me to treat your ideas with more respect, get some better ideas.
(Somewhat related to this, I’ve noticed that most of the people bitching about “fairness” to me tend to be conservative in one way or another. This makes sense as the topics I’ve been writing about recently fall into the conservative camp. However, inasmuch as conservatives have written the manual on how to demonize those who hold unconforming views — please refer to Newt Gingrich on this — this position strikes me as awfully rich. Not every single conservative person can be held responsible for the rhetorical attack-dog manner of many public conservatives, of course. But on the other hand, I’m not particularly moved by complaints of my mild version here. It’s like someone from a family of public gluttons castigating someone else for going back to the buffet for a second helping.)
I’m likewise not responsible for your reading comprehension of what I’ve written. I do of course try to be coherent — it’s a good thing for a writer to attempt — but what I write and what you think I wrote can be two entirely separate things. More than one person saw what I wrote about Creationists the other day as a general broadside on Christians and Christianity. However, had I wanted to take a broadside swack at Christians in general, I would have written “Christians” rather than “Creationists” — the two words not being synonymous, after all.
Another good example of this is when I mention a particular stance is likely caused by ignorance. Well, no one likes to be called “ignorant,” since the common opinion is that people who are ignorant are also typically dumber than rocks. However, ignorance does not imply stupidity; it merely implies lack of knowledge. Ignorance is correctable; stupidity, unfortunately, is typically irreversible. The good news is that rather more people are ignorant than stupid, which means there’s hope. So if you’re ignorant, congratulations! You can work on that.
I’m happy to clear up any misunderstandings or offer any clarifications if you have questions; send along an e-mail, I’ll respond if I can. But generally, in terms of my writing here, I tend to be a strict constructionist — what I mean to say is usually in the text itself.
I recognize that a lot of people will consider my utter lack of concern regarding “fairness” here as proof that I’m unreasonable or uninterested in hearing other points of view, but again, that’s another assumption over which I have no control. Likewise people may assume that I’m exactly like I write here, which is also not entirely accurate; what’s here is just one aspect of my total personality, not the complete picture. It does no good to assume that people are only what they write, but I’m not going to lose sleep over it if you think that about me. I can accept a certain amount of unfairness. Life, after all, is famous for not being fair.
Big Bang Belief
Posted on November 26, 2002 Posted by John Scalzi 2 Comments
Astronomy magazine, to which I subscribe, asks on this month’s cover: Do you believe in the BIG BANG? 5 reasons you should. I was initially a little confused by the cover, in that with the exception of a couple of unregenerate Hoyle-loving steady-staters out there, probably the entirety of the magazine’s 185,000-member subscriber base has already signed off on the whole Big Bang thing; it’d be like Parenting magazine having a cover story that asked if its readers believed in pregnancy.
But of course, the article is not for Astronomy’s regular readers, per se. It has a two-fold aim. The first is to lure whatever Creationists might be lurking near the magazine rack into opening up the magazine and getting a point of view on the genesis of the universe without the Genesis interpretation. I think this is sort of sweet, since I don’t think most Creationists really want to challenge their beliefs; after all, Jesus didn’t tell them to question, merely to believe. But you can’t blame the Astronomy editors for making the effort.
The second aim is to give non-Creationist parents some reasonable ammunition at the next school board meeting, when some Bible-brandishing yahoo demands the science curriculum be changed to give equal footing to whatever damn fool brew of mysticism and junk science they’ve cobbled together this year to make an end-run around the separation of church and state, and someone rational needs to step in and point out what evidence exists to suggest the Big Bang actually happened.
In that case, the object is not to convince a Creationist of the veracity of the Big Bang; any Creationist who shows up at a school board meeting is already a lost cause in terms of rationality. The idea is to appeal to the school board members that the Big Bang is not interchangeable with the idea that God whipped up the universe in seven days or that the universe was vomited up by a celestial cane toad that ate a bad fly or whatever other pleasant, simple teleological shortcut one might choose to believe.
In this case, again, I appreciate Astronomy’s intent; it’s nice to know they believe a school board might be amenable to reason. Personally, however, I would skip the middleman preliminaries, which is what such an appeal to reason would be. I’d go straight to the endgame, which would be to inform the school board that if it went ahead and confused science and theology, I’d be more than pleased to drag in the ACLU and make it take all the tax money it was planning to use on football uniforms and use it to pay lawyers instead. I’m not at all confident of a school board’s ability to follow science, but I’m pretty sure most of its members can count money. And here in Ohio, at least, they sure do love their football.
Astronomy notes that based on an NSF survey, less than a third of Americans believe in the Big Bang. Part of the problem comes from most people simply not paying attention in science class — evidenced by the fact that only 70% of Americans believe in the Copernican theory, which posits that the Earth is in orbit around the Sun, and you’d have to be fairly ignorant and/or inattentive not to believe that. Another part of the problem comes from the idea that the Big Bang might somehow conflict with religious beliefs — that the end result of accepting the Big Bang as a theory is an eternity of Satan cramming M-80s behind your eyeballs and cackling, “You want a Big Bang? I’ll give you a Big Bang,” before lighting the fuse with his own pinky finger. But a large part of it also has to do with language itself, and how it’s used to confuse.
For example, the word “theory.” Commonly speaking, “theory” equates to “whatever ridiculous idea that has popped into my head at this very moment” — so people have theories about UFOs, alligators in the sewers, the Kennedy Assassination, the healing power of magnets and so on. The somewhat debased nature of the word “theory” is what allows Creationists and others to say “it’s just a theory,” about evolution or the Big Bang or whatever bit of science is inconvenient to them at the moment, implicitly suggesting that as such, it should be paid little regard.
However (and Astronomy magazine has a nice sidebar on this), the word “theory” means something different to scientists than it does to the average Joe. In the world of science, the initial crazy idea that you or I would call a theory is a “hypothesis”; it’s not until you can provide strong, verifiable evidence that the universe actually conforms to your hypothesis that you’re allowed to say it’s an actual theory. So to recap: Crazy idea = hypothesis; crazy idea + independently verifiable facts to back it up = theory.
The Big Bang is a theory not because it’s just this zany idea a bunch of astronomers thought up one night while they were smoking dope in the observation dome; it’s a theory because a preponderance of evidence out there in the universe suggests this is how the universe was created — to the near exclusion of other hypotheses. It’s a theory to the same extent that gravity is a theory, and be warned that if you don’t believe in gravity, you’ll probably fall right on your ass.
“Believe,” incidentally, is another problem word, since its common usage is synonymous with “I have faith,” and faith, by its nature, is not particularly evidentiary. Someone who says “I believe in Jesus,” is declaring faith in Christ, whose nature is ineffable. One wouldn’t say that one has faith in the Big Bang — and rightly so.
Fundamentally, one doesn’t “believe” or have faith in much of anything as it regards science, since as a process science isn’t about believing at all. It’s about testing and verifying, discarding what doesn’t work, and refining what does work to make it better describe the nature of reality. For a scientist, a belief functions at the level of a hypothesis, which is to say, it’s an idea that requires testing to determine whether it accurately models reality.
Even at their current stage of understanding about it, it’s probably not accurate to say that scientists “believe” in the Big Bang theory, to the extent that there are still holes in the theoretical model that need to be plugged and scientists working to plug them (Astronomy magazine points out these holes, as it should, since doing so doesn’t expose the weakness of the Big Bang theory, but the strength of the scientific process). If it turns out that the Big Bang theory is ultimately incompatible with the data, it’ll have to be thrown out and something more accurate created to replace it.
Asking whether one “believes” in the Big Bang doesn’t really answer any questions — it merely suggests that the Big Bang is itself part of a faith-based system, equivalent to a belief in Christ or Allah or Buddha or whomever. This is another piece of semantic ammunition that Creationists and others like to use: That science is just another system of “belief,” just another species of religion. Not only is science not just another species of faith, it’s not even in the same phylum. Faith is a conclusion. Science is a process. This is why, incidentally, the two are not ultimately incompatible, just as driving somewhere is not inherently incompatible with having a fixed home address.
If I were putting together a poll on the Big Bang, I wouldn’t ask people if they believed in it. I would ask them, based on the evidence, what model of universal creation best described its current state. I’d make sure I left space for the “I have no idea” option. I believe — and this is just hypothesis, not a theory — that the data from that question would be informative.
Flags and the Confederacy (Again)
Posted on November 25, 2002 Posted by John Scalzi 15 Comments
I’m still getting a lot of mail from Confederate partisans over my recent posts on how the Confederacy was evil, and so are its flags. Most of these apologists are spieling out lines suggesting that, yes, yes, fine, the Confederacy did institutionalize slavery. But today its flags mean entirely different things, like pride and heritage and (inevitably) states’ rights over federal rights. Why can’t we (meaning, presumably, the folk not in the states of the former Confederacy and the descendants of the people the Confederacy explicitly enslaved) just get over it? My God, haven’t the decent white folk of the South suffered enough? They lost their country, after all.
Well, let me make a counter-suggestion, which is that I’ll start trying to forget that the Confederate flag is fundamentally evil, if the Confederacy-pushers will acknowledge that the Confederacy was in fact, a big fat loser, and therefore any of its symbols are less than fertile ground for positive associations.
Loooooooooooser. And it isn’t just a loser in war. Although it is that, let’s not forget — and it lost that war big. Sure, they kept it close in the first half, but after that it was a blowout. The North had a deeper bench. Even a post-game late hit on the North’s general manager (while he was in his luxury suite, for God’s sake!) couldn’t change that fact. But even tossing aside the war, the Confederacy is a loser in so many other ways it’s hard to know where to begin. But let’s begin anyway, shall we?
States’ rights: Loser. The Confederacy so bungled the states’ rights issue that it ended up establishing the primacy of the federal government over states, and additionally ensured that no other state could ever secede from the Union again. Oh, and then the former Confederate states were subjected to a rather unfortunate period of time (it’s called Reconstruction) where they had about as many states’ rights as the District of Columbia. So, in all, not a particularly shining example for states’ rights.
This is where Confederate partisans grumble that yeah, but technically the Confederacy was right on the constitutionality of secession. Well, kids, two things: One, nuh uh. Clearly that was a matter open to interpretation, which is why you had to fight a war about it (which — did I mention? — you lost). Two, even if the Confederacy were technically right on secession, this is a really stupid argument anyway. What, like the United States is just going to go, “Gee, okay, what we’d really like is to have a hostile neighbor to the south of us, competing with us for land on this here North American continent?” I mean, Christ, people. Get a grip.
Clearly we think the Colonists were in the right when they drafted up the Declaration of Independence and suggested that we and Britain had to go our own ways. But they still had to fight a war regarding the matter — and win it. I don’t recall the Colonists being shocked, shocked when Britain didn’t exactly roll over and cheerfully lose a few thousand miles of North American coastline. They knew what they were getting into. So it’s a little silly to suggest that the Confederates, either then or now, should feel otherwise. It’s just whining.
When it comes to things like land and constitutions, being right is half the battle; the other half of the battle is the actual battle you have to fight to enforce your claim. The Confederacy lost that part, which is just as well, because they were way off base with that whole secession thing to begin with. Bad premises, bad results.
Heritage: Loser. Let’s be honest here. There is almost no truly Confederate heritage, if only because the Confederacy in itself didn’t last long enough to generate any while it was an ongoing concern, and while it was around, it was too busy trying to survive to do much of anything else. There is of course a rich heritage of Confederania now, but it exists entirely as the fly-blown leavings from the Confederate corpse, rather than the fruits of a living tree, and that’s not entirely the same thing.
Confederate partisans try to backdate Confederate heritage to before the Confederate era, but I don’t think that is something we should cede to them. There is indeed an antebellum Southern culture, but the participants in that culture did not equate their culture with the political entity known as the Confederacy, since that entity didn’t exist. If they didn’t, I don’t see why the rest of us should make that equation, either.
Part of the whitewash campaign of the Confederate partisans is to try to sell the idea that Confederate symbols somehow encompass the entire history of the South, and they don’t, neither prior to the Confederacy nor after. Let’s remember that Confederate and Southern are not synonyms. Southern heritage is a fine thing; Confederate heritage is not. Using the symbols of the latter to represent the former is presumptuous.
Pride: Loser. Proud of what? Of the fact the Confederacy precipitated a civil war that killed hundreds of thousands of men on both sides of the battle? Which — let’s never forget — it lost? Of constitutionally enslaving black people? Of being the cause of the devastation and occupation of the Southern states by Union troops and carpetbaggers?
Oh, yes, Confederate friends, that last one was your fault. We know all about that whole “War of Northern Aggression” line you’ve got going down there, as if you were just sitting there minding your own business when all of a sudden Sherman popped up and started, like, burning things. However, allow me to suggest that from the point of view of the United States, trying to make off with half the country, as you did, seemed like a fairly aggressive maneuver at the time. I’ll be happy to know if you disagree, since then you won’t mind if I come over and take over half of your house, preferably the half with the hot tub.
Individual Southerners feel pride in ancestors who went out and fought (and sometimes died) for the Confederate side of the war, which as I’ve mentioned before is just fine. But I don’t see how one can ignore the fact that all those Johnny Rebs would have been safe as houses had the Confederacy never existed. Prior to December of 1860, it’s not as if the armies of the north were perennially massed at the Mason-Dixon line, champing at the bit to torch the south, and the poor southerners had no choice but to hoist grandpappy’s musket and slug it out at Antietam.
Many of the Confederate apologists with whom I’ve corresponded maintain that their ancestors fought and died to protect their homes, not for the ideals of the Confederacy, and I suspect that in many cases that’s probably true. It still stands that, whatever their personal reasons for fighting, they fought because of the fact of the Confederacy, which was an evil institution, for reasons I’ve outlined before. Essentially, these people fought and died because an unnecessary and wholly evil entity invited trouble to their doorstep. Someone needs to explain to me why one should feel pride in that.
(Anyway, I do think there needs to be a line drawn in terms of responsibility. Not every Confederate soldier was fighting simply to protect the homestead; at least a few here and there had to believe in the principles of the Confederacy or at the very least the right of the Confederate states to go their own way. These people were wrong, however bravely they may have fought. It’s well and good that they were defeated, since the “independence” they would have bought was rotten to begin with.)
The only real pride one should have as a Confederate partisan is Loser Pride, in which one invests one’s energy in a perennially losing entity primarily as an exercise in existential humility; i.e., Cubs fans. But even Cubs fans have the possibility for glory in that the Cubs are an ongoing concern. The Confederacy, on the other hand, is deader than a gay bar in Branson and will stay that way. It will never be anything but a loser.
Useful Flags: Loser! Look, the Confederacy was so screwed up that it couldn’t even get its flags right. The first official Confederate flag was the Stars and Bars, which was rather too similar to the flag of the United States; it made things even more confusing on the battlefield than they already were. So, the Confederacy decided on another flag, which was largely white. The problem with this flag was that it pretty much looked like a flag of surrender — it was that whole “field of white” thing it had going. Obviously this was problematic if in fact you weren’t trying to surrender, or alternately, if you were, since the Union folks wouldn’t be able to tell right off whether you were giving up or fixin’ to stab them with your bayonets, so they’d be better off shooting you just to be sure.
So out comes a third flag, which, unfortunately for the Confederacy, came out just about the time the Confederacy was imploding from total loserness and teetering on the cusp of non-existence. Shortly thereafter, another flag flew at the Confederate capital, Richmond, and other points south: The flag of the United States of America. And personally I’m hard-pressed not to see that as a vast improvement.
Given the voluminous evidence of the total loser-osity of the Confederacy, you’ll understand why every time I get a letter from someone proclaiming the Confederate flags to be a positive symbol, I just get flummoxed. Frankly, it’s difficult to think of any flags anywhere at any point in time that are as steeped in complete failure on as many social, cultural and political levels as these are. It’s just so damn sad that people are still out there trying to delude themselves otherwise.
The only explanation I can come up with that makes any sense is that certain people from the south simply cannot think rationally about the Confederate flags, much in the same way that certain otherwise totally rational Christians freak out about the fact they’re descended from stooped, hooting proto-primates just like the rest of us. It’s a blank spot in their brain in which they choose not to allow thought of any sort.
Fine. As I’ve said before, if you want to believe that the Confederate flags represent anything but an evil and ultimately pathetically inept institution, and all the consequent stupidity that followed through its use by segregationists, morons and demagogic flag wavers who’d rather rile up the easily excitable than actually make the South a better place for all its citizens, then by all means go right ahead. We’ll agree to disagree.
But please don’t write to me saying that the meaning of the Confederate flag has changed or should change. Short of wiping out the history of the Confederacy itself and pretending it never existed, this isn’t going to happen. The Confederate flag is a symbol of evil, and like most symbols of evil it’s much better used as a reminder of the damage evil can do than as a misplaced symbol of pride.
The Confederate flags are the symbols of losers, and those who glorify losers. I really wouldn’t have it any other way.
Posted on November 4, 2002 Posted by John Scalzi 7 Comments
If you are over the age of 18, a US citizen, and you’re not voting on Tuesday, you are a stinky stinky moron, and my response, should I ever hear you bitch about the government of the United States, will be to laugh like a drunken horse right in your face. If nothing else, taking a few minutes out of your day to vote will entitle you to two to six years of unmitigated kvetching about your system of government. Talk about a return on your investment.
If you are under the age of 18, not a US citizen, and are voting on Tuesday — well, that’s just no good for anyone. Go shoot some pool or something.
I vote. Almost as important, I am politically independent. I’ve always been registered as an independent. Part of this is due to my contrarian nature regarding joining any organization; once you join something, the people in it start wanting you to do things with them. The next thing you know, your weekends and Wednesday evenings are given over to bake sales and irritating rallies and stuffing envelopes until your fingers are sausage-shaped agglomerations of paper cuts. When I die, I can guarantee you that my list of regrets won’t include not spending more time doing any of those things. Probably the only organization I see myself joining at some point is the PTA, although that will be exclusively a preventative measure, to head off any attempts to ban Huck Finn or the Harry Potter books before someone has to haul in the local branch of the ACLU, and my tax dollars go to pay lawyers rather than to educate my child.
But part of it is due to the fact I dislike political parties. This is usually for one or more of the following three reasons: Their overall platforms, their tenuous relationship to the actual principles of democracy, and their general emphasis on getting an agglomeration of their kind elected rather than finding the best representatives for the people those representatives are supposed to serve. Some parties set me on edge more than others — I’ve never made any secret that I distrust the GOP to such an extent that I tend to think that people who register Republican have some sort of unfortunate brain damage that keeps them from thinking clearly — but to be clear, I don’t like any of them. Ultimately political parties are about someone else telling me how I should exercise my franchise, based on the idea that they’ve got so many other people planning to vote the same way. Or to put it another way: “50 million Dubya fans can’t be wrong!” Well, yes, they can.
The one fly in the ointment here is that generally speaking enough people do register for and support political parties (specifically the Republicans and the Democrats) that here in the US those are the flavors you get when it comes to election day; indeed, over the history of my voting, I don’t think I’ve ever voted for someone who wasn’t of one or the other party (despite my general paranoid suspicion of the GOP, I have voted for Republicans in the past and would do it again in the future if — as in the past — there were compelling reasons to do so. See, that’s what being independent is all about). Certainly tomorrow I’ll be voting for candidates from those parties. But as is often the case, it’s not so much that I’m voting for a particular candidate as voting against another, and putting my defensive vote into a bin where it can do the most good. This set-up is hardly my fault. The problem isn’t that I’m not part of a political party, it’s that so many other people are.
Look at it this way: If you registered as an independent, more candidates would have to think independently and focus on what actually works for their constituency — an actual representative democracy rather than one where the parties offered their candidates just enough leeway from the party platform not to alienate the voters in their district. Political races would get proportionately less “soft money” from the outside world, meaning the would-be politicians would have to actively engage in grass-roots campaigning, which again means the candidates would have to be more responsive to the needs of the constituency.
There’d be less political triangulation in the primary season, in which moderate editions of political party candidates get the bounce to appease the hard-line nutbags that constitute the political baggage of both parties — or are bounced through the maneuverings of the other party, which is hoping for an opposing candidate that alienates the maximum number of moderate voters. Party politics would also enter far less into the sausage-grinding process of law-making, since the concern the lawmakers would have is whether they’re pleasing the people back home, not the party. We’d see ever-shifting alliances of politicians, based on specific issues, rather than an inflexible platform that grown men and women have to be politically “whipped” into adhering to.
Where’s the downside here? Is there a real downside to the end, not only of the two major political parties, but of all political parties altogether? Certainly not for the individual. You can still have your own political beliefs, you know. You can still be a “pro-choice” card-carrying member of the ACLU without being a Democrat; you can still be a “pro-family values” gun-toting member of the NRA without being a Republican; you can still be a “pro-polyamory” Ayn Rand-worshiping dateless freak without being a Libertarian. And you wouldn’t have to put those little cardboard signs on your lawn every two years.
Just think about it the next time you go to register. If you really want better political discourse in this country, do it by making the politicians listen to you and your neighbors, not your party affiliation. Vote independent. Hell, the reduction in political junk mail as you’re taken off the party mailing list will be worth it alone.
The Confederacy is Evil
Posted on October 28, 2002 Posted by John Scalzi 9 Comments
Based on a (very good and civil, mind you) e-mail conversation I had over the weekend, I think now is a fine time to expand some points I made here over a year ago, when I wrote my “Southern Heritage is a Crock” column. So here we go:
The Confederate States of America was a fundamentally evil institution. Period, end of sentence. That’s “evil,” spelled “E-V-I-L.” “Evil,” as in “morally reprehensible,” “sinful,” “wicked,” “pernicious,” “offensive” and “noxious.” “Evil,” as in “the world is a demonstrably better place without this thing in it.” Evil. That’s right, evil. Once again, for those of you who haven’t figured it out yet: Evil. And for those of you yet hard of hearing, the ASL version:
Really, I don’t know how much clearer I can make it.
The CSA was a fundamentally evil institution because it codified slavery into its system of government; N.B.: Article IV, Section 2 of the Constitution of the Confederacy. And lest you think this was just some sort of namby-pamby sop thrown into the CSA constitution to please the slave-holders, let’s go to the historical record, to a speech by CSA Vice-President Alexander Stephens in March of 1861, in which he discussed the CSA Constitution at great length. The entire text is here, but allow me to excerpt considerably (and to place emphasis on the relevant passages) from Stephens’ comments about slavery and its role in the CSA, both in its constitution and in its very formation:
“The new constitution has put at rest, forever, all the agitating questions relating to our peculiar institution — African slavery as it exists amongst us — the proper status of the negro in our form of civilization. This was the immediate cause of the late rupture and present revolution. [US President Thomas] Jefferson in his forecast, had anticipated this, as the “rock upon which the old Union would split.” He was right. What was conjecture with him, is now a realized fact. But whether he fully comprehended the great truth upon which that rock stood and stands, may be doubted.
“The prevailing ideas entertained by him and most of the leading statesmen at the time of the formation of the old constitution, were that the enslavement of the African was in violation of the laws of nature; that it was wrong in principle, socially, morally, and politically. It was an evil they knew not well how to deal with, but the general opinion of the men of that day was that, somehow or other in the order of Providence, the institution would be evanescent and pass away. This idea, though not incorporated in the constitution, was the prevailing idea at that time. The constitution, it is true, secured every essential guarantee to the institution while it should last, and hence no argument can be justly urged against the constitutional guarantees thus secured, because of the common sentiment of the day. Those ideas, however, were fundamentally wrong. They rested upon the assumption of the equality of races. This was an error. It was a sandy foundation, and the government built upon it fell when the “storm came and the wind blew.”
“Our new government is founded upon exactly the opposite idea; its foundations are laid, its cornerstone rests upon the great truth, that the negro is not equal to the white man; that slavery — subordination to the superior race — is his natural and normal condition. This, our new government, is the first, in the history of the world, based upon this great physical, philosophical, and moral truth. This truth has been slow in the process of its development, like all other truths in the various departments of science. It has been so even amongst us. Many who hear me, perhaps, can recollect well, that this truth was not generally admitted, even within their day. The errors of the past generation still clung to many as late as twenty years ago. Those at the North, who still cling to these errors, with a zeal above knowledge, we justly denominate fanatics. All fanaticism springs from an aberration of the mind — from a defect in reasoning. It is a species of insanity.
“One of the most striking characteristics of insanity, in many instances, is forming correct conclusions from fancied or erroneous premises; so with the anti-slavery fanatics; their conclusions are right if their premises were. They assume that the negro is equal, and hence conclude that he is entitled to equal privileges and rights with the white man. If their premises were correct, their conclusions would be logical and just — but their premise being wrong, their whole argument fails.
“I recollect once of having heard a gentleman from one of the northern States, of great power and ability, announce in the House of Representatives, with imposing effect, that we of the South would be compelled, ultimately, to yield upon this subject of slavery, that it was as impossible to war successfully against a principle in politics, as it was in physics or mechanics. That the principle would ultimately prevail. That we, in maintaining slavery as it exists with us, were warring against a principle, a principle founded in nature, the principle of the equality of men. The reply I made to him was, that upon his own grounds, we should, ultimately, succeed, and that he and his associates, in this crusade against our institutions, would ultimately fail. The truth announced, that it was as impossible to war successfully against a principle in politics as it was in physics and mechanics, I admitted; but told him that it was he, and those acting with him, who were warring against a principle. They were attempting to make things equal which the Creator had made unequal.“
Lots of Confederate sympathizers like to say that what the Confederacy was really about was states’ rights, and all that. But I don’t know. Let’s put on one side a bunch of Confederate sympathizers who understandably want to downplay their fetish’s unfortunate association with that whole “people owning people” thing. And on the other side, let’s put the CSA’s second-highest executive, speaking about a Constitution he helped create, specifically discussing the role of slavery in his country’s formation. When it comes to what the Confederacy was really about, who are you going to believe?
Yes, the United States had slavery (and continued to have it, even during the Civil War; that Emancipation Proclamation thing of Lincoln was only effective in rebellious states), and isn’t blameless of other nasty habits, including brushing the natives off land it wanted to own. However, the United States did not codify evil into its Constitution by enshrining the practice of slavery; as Stephens proudly notes, it took the CSA, among all other countries in the world, to do that. The United States has done evil, but is not fundamentally evil in its formulation, as is the CSA.
It comes to this: When someone tells you the Confederacy was about something other than people owning people, they’re either being intentionally disingenuous or (more charitably) are ignorant about the deep and abiding role slavery had in the formation of the CSA. It was about other things, too. But, and in an entirely appropriate, non Godwin-izing use of this particular political entity, the Third Reich was about more than just exterminating the Jews. It just happens that that’s the one cornerstone policy of the Reich that, you know, sort of stands out.
Given that the CSA is a fundamentally evil institution, it’s clear that any of its trappings are symbols of evil, including those flags Confederate sympathizers love so well. This is a pretty cut and dried thing: If the answer to the question “Was this symbol/flag/insignia/whatever used as an identifying object by the Confederate States of America?” is “Yes,” then it is, in point of fact, a racist and evil symbol. If you’re wearing such a symbol or otherwise endorsing it in some public way, it’s not unreasonable for people who see you wearing such symbols (particularly the descendants of former slaves) to wonder if you’re either racist and somewhat evil yourself or, alternately, just plain dim.
If you have an ancestor who fought for the CSA, then, yes, he fought for an evil institution — but no, I don’t think it makes that individual evil in himself. I think it’s perfectly reasonable and right for the descendants of Confederate soldiers to note the bravery and valor with which they fought, and to commemorate their individual efforts on the field. I think it would be nice if they additionally noted that it was sad that the government for which they fought was ultimately undeserving of their blood and defense, and that it was rightfully expunged from the world, but that’s another matter entirely.
(My correspondent this weekend asked me an interesting question as to whether a memorial for American soldiers who died in combat should include names of Confederate soldiers — the genesis of this question being some fracas he’d heard about at a northern university that was putting together such a memorial. My response is that it shouldn’t, for the reason that either the CSA was its own country, in which case its soldiers weren’t “American” soldiers (“American” understood to refer to citizens of the US), or it wasn’t its own country and the Confederate soldiers were in open and treasonous rebellion, and as a general rule one does not commemorate traitors, particularly ones whose rebellious actions ultimately caused the deaths of hundreds of thousands. I don’t have a problem with such memorials in formerly Confederate territory, but the rest of the United States is not obligated to follow suit.)
Now, look: I understand that for a lot of Confederacy fans, it really isn’t about race or anything else other than pride for the South. My response to that is: Groovy. Go for it. Love the South. What y’all need to do, however, is get some new symbols, some that don’t hearken back to the Dixie Days, when you went to war for the right to keep owning people. The Confederacy was evil, and now it’s dead, and its being dead is front and center the best thing that there ever was about it. There is the South, and there is the Confederacy, and a good thing for you and for the rest of us would be the realization that these two things don’t have to be synonymous.
We Need New Constellations
Posted on October 25, 2002 Posted by John Scalzi 19 Comments
I’ve been spending the last few days working with the constellations, drafting images for the cartographers over at Rough Guides to turn into actual star charts (hint: It’s easier to do when you’re making screenshots off of astronomy software, as I’ve been doing. Yes, you have to get permission from the software makers before you do this sort of thing. Yes, I did). There are 88 officially recognized constellations, but I ended up with 69 charts, on account that I paired up several of the smaller and/or less impressive constellations. Sad to say, many constellations just don’t rate their own star chart.
It’s not like they care, mind you. They’re just abstract representations of earthly objects projected into the sky by humans, using stars that have only a passing relationship to each other. Stars that look close in our night sky can be hundreds of light years apart; it’s that whole “space is three dimensional” thing (and actually, space is four dimensional — some stars we see in the sky may already be long-dead and gone, it’s just taking a while for the news to reach us, thank you very much Dr. Einstein).
I don’t think most people realize how many strange and pointless constellations are sitting up there in the sky. In a way, this is only natural (said, of course, ironically): Most of us live in urban areas, where light pollution and other sorts of pollution conspire to blank out fainter stars from our view. I remember living in Chicago and looking up and being able to see nothing but the 10 or 20 brightest stars — really not enough to go naming constellations by. Since many of the more obscure constellations are composed mainly of faint stars, why should people know them? When it comes to constellations, you can’t know what you can’t see.
The other reason is that constellations just don’t mean to people what they used to. When you’ve got PlayStation 2, what do you need with the constellation Vulpecula? (This is not a knock on PlayStation 2, said the Chief Entertainment Media Critic for Official US PlayStation Magazine, quickly, before he can get fired for disloyalty.) If you can make out and recognize the Big Dipper (which, strictly speaking, is an asterism, not a constellation), or maybe Taurus or Orion, you’re doing just fine.
Still, it’s interesting to know what weird and freaky objects are up there in the sky. For example, did you know that there’s a giraffe walking around near the celestial north pole? It’s the constellation Camelopardalis (pictured above), which, being circumpolar as it is, is always hovering in the night sky here in the northern hemisphere. Its near neighbors include two bears, a bobcat, a dragon, and a guy carrying around a couple of goats. I think it’s a little out of place.
The fact of the matter is that Camelopardalis is a fairly recent constellation, created just a few hundred years ago by an astronomer who noticed that there was this wide swath of space with no constellation in it; he just spotted a few dim stars (none higher than 4th magnitude, which means you won’t be able to see them in the suburbs), strung ’em together, and there you have it — instant constellation.
Other lesser-known constellations in the northern sky: Delphinus and Equuleus (the dolphin and horse, respectively), Sagitta (the arrow) and Vulpecula (the fox), Corvus and Crater (a crow and a cup, and they actually share a mythological story together), Canes Venatici (hunting dogs) and Coma Berenices (Berenice’s hair, and isn’t that a weird one: A wig in space). The thing about these constellations is that if you can identify one of them, you’re probably the sort of person who can identify them all. Not that there’s anything wrong with that, mind you. I am writing an astronomy book, you know. I want you to know these things.
The earth’s southern hemisphere has a lot of unfamiliar constellations for most of us, but that’s to be expected, since most people on the planet live in the northern hemisphere, that is, above the equator, and thus there are constellations down under that we never see: Chameleon, Pavo, Apus, Hydrus, Tucana, Octans — all circumpolar to the South Pole.
Be that as it may, the southern hemisphere has a lot of constellations that seem a little odd in their own right; many of them were described and created during the Age of Exploration (when the Europeans hopped in their ships to travel the world and surprise the natives of other lands with Jesus and smallpox), and so describe scientific objects: Microscopes, telescopes, compasses, air pumps, carpenter’s levels, chisels, pendulum clocks and octants. A fan of rationality though I may be, I’m not at all impressed with any of these: I want the night sky to be filled with wild animals and mythological heroes, not to resemble Galileo’s laboratory.
Given the fact that so many constellations are dim and/or obscure and/or just plain lame, I have an idea. I say we yank most of the constellations. I figure we have to keep the signs of the zodiac, otherwise we’ll have to fund an Omnibus Astrologers’ Assistance Bill in Congress, and then keep on some of the most obvious constellations in both hemispheres: Orion, Centaurus, Ursae Major and Minor, Crux, and so on. Say, the top 25 or 30 constellations get to stay. The rest: Gone. Then we start voting on new constellations — and by “we” I mean pretty much the whole planet. You may not know this, but the night sky is officially pretty damn Eurocentric, up to and including the parts that can’t actually be seen from Europe (although there is a Native American in the southern sky — Indus — and I bet he’s surprised to be so far from home). It can’t hurt to let the voting power of China or India put in a constellation or two (or three, whatever).
The only rules I’d put in would be that the new constellations couldn’t be of real people — thus avoiding the constellations Mao, Elvis and Dale Earnhardt — and that we’d pretty much want to avoid any technological advance of, oh, the last 100 years. That way we’re not stuck with the constellations TiVo, Nintendo or Cell Phone. Other than that, let ’em rip. We’ll let the astronomers keep the old constellations, of course, because there’s no point in having to rename the entire sky for scientific purposes. It’s like how they use Metric and stuff. You know, just because they do doesn’t mean we have to. And it’ll get people looking up at the sky again. That’s not bad.
Oh, come on. It’ll be fun. You won’t miss dumb ol’ Camelopardalis anyway.
The Coming War
Posted on October 11, 2002 Posted by John Scalzi 4 Comments
Since it looks like we’re heading toward one, here’s my take on war.
1. It should be done if it’s necessary. For now, I’ll be vague as to what constitutes “necessary” because it’s very much open to interpretation.
2. If you’re going to do it, then you should make sure your opponent ends up as a grease spot on the wall, and that his country is reformulated so that it never ever bothers you again.
In the best of all worlds, both of these are fulfilled; you have no choice but to go to war, and you squash your opponent like a plump grape underneath a sledgehammer. But to be entirely honest, if I had to choose between the two of these, I’d pick number 2, if only because if we must participate in an unjust war, ’tis better it was done quickly. That way the stench of our pointless involvement is over quickly, and we expend as little matériel as possible (not to mention, you know, the deaths of those who fight our wars for us are kept to a minimum). Also, if you have the first, but not the second, what you end up with is a long-standing crapfest that you will ultimately have to revisit, whether you wish to or not.
Such as it was with the Gulf War. I’m not a terribly big fan of that war, but I’m perfectly happy to cede the point that it was necessary to some great extent. Yes, it was a war about oil. Thing is, while we can argue about the need to reduce our oil consumption (I tend to think the greatest advance in technology in the last couple of decades is the coming age of fuel cell and alternate energy cars), ultimately we still do need oil, and certainly needed it in 1990.
And of course it’s not like it was just a war about oil on our side of the fence; had Kuwait’s primary export been goat meat, Saddam would have been less likely to get all fired up about reintegrating the lost 19th province of Iraq. The Gulf War also offered the added attraction of the possibility of turning Saddam into a fine particulate mist with the aid of a well-placed smart missile. He’s a morally disagreeable enough person, and his regime largely worthless enough to have made the case for its dismantling persuasive.
The Gulf War took place while I was in college, and I remember being at candlelight vigils in the quads, not to pray that the US stopped the madness of the attack, but that we kicked the righteous hell out of the Iraqis and that it would all be over quickly. I had a brother in the Army, who was over there in the fight. The longer the fighting went on the better the chance something bad would happen to him. Fortunately, it was over quickly, and we learned what happens when a large but poorly-trained, badly-equipped army goes head-to-head with a highly-trained, massively-equipped army: The poorly-trained army loses people by a ratio of more than 100 to 1. We squashed the Iraqi army, all right.
But we didn’t squash Saddam or his regime, and ultimately, I find this inexplicable. Saddam should not have been allowed to continue to rule. His personal detention (to say the least) and the dismantling of his political machine should have been part of any surrender. War isn’t a football game, after all, where the losing coach gets to try to rebuild for next season. Particularly in Saddam’s case, where he was the aggressor; he started it. The penalty for starting a war (which, to be clear, you then lose, miserably) should be a nice 8×8 cell with no phone privileges until you die.
Lacking the will to depose Saddam, we (and by we I mean the US and the UN) should have been willing to back up the weapons inspectors with the immediate and massive threat of force. Simply put, any facility that the weapons inspectors were denied entry to should have been bombed into pebble-sized pieces within 15 minutes of the inspectors leaving the area. Aggressive countries that have been defeated in war do not have the luxury of “national dignity” or whatever it is you want to call it. The fact that we just spent more than a decade letting a hostile regime jerk the world around is angrifying (a new word. Use it. Love it).
Let’s turn our attention to the new war we’ll be having soon. Toward the first point, is this war absolutely necessary? I doubt it. I think it would be much more useful to swarm the country with weapons inspectors and high-altitude bombers that track their every destination. After the first few times Saddam’s precious presidential palaces are turned into powder when the inspectors are turned back, they’ll get the clue. I see nothing wrong with reminding Iraq on the point of a missile of its obligation to let us look anywhere for anything. Clearly they won’t like it, but, you know. So what.
Many suggest that the purpose of the coming war will be to assure that Iraq cannot ever threaten any of us, but this achieves the same goal at lesser cost (and without exposing our military to undue chance of death). If indeed containing that threat were the goal of the upcoming war, this works just as well, and will have the additional value of being what was actually the correct response anyway, and only the better part of a decade late.
However, it’s clear that Dubya wants a war for purposes not related to weapons containment; indeed, his administration is utterly uninterested in that aspect of the Iraq problem, except as a convenient trope to sell the war to inattentive voters. Dubya wants regime change, and I can sympathize. Saddam has been in power a decade longer than he should have been, and I can think of worse uses of the American military than clearing out bad governments around the world. If Dubya said something along the lines of “First we get rid of Saddam, and then we’re going to pay a call to Robert Mugabe,” well, that’s a barricade that I’d be inclined to rush.
I’m not holding my breath on that pronouncement, however. Ultimately I suspect that Dubya wants Saddam out as part of a father-avengement thing, although what Bush I needs to be avenged for is unclear; Bush I isn’t dead at the hand of Saddam, after all, nor injured, nor in fact seriously put out in any recognizable way. I believe at best Dubya is avenging his father’s taunting at the hands of Saddam. If that’s the case, Dana Carvey had better go to ground as quickly as humanly possible. This is of course a poor reason to send a nation into war, but Dubya does have the advantage of a decade’s worth of stupidity in dealing with Iraq providing him with some actual legitimate reasons to plug Saddam.
Let’s get down to brass tacks. On balance, the end results of fighting this war will be (cross fingers) the removal of Saddam and the dismantling of his political state and (incidentally) a clearing out of whatever weapons capability that may exist. For those reasons, I’m not opposed to fighting a war with Iraq now. Be that as it may, even those people who fully support a war against Iraq are rather painfully aware that the stated reasons that the Dubya administration wants to gear up for war are window dressing for a revenge fantasy. It is possible to fight a just war for less than entirely just reasons. We’re about to do it.
Just, necessary or not, let’s hope that this war is total, complete and ends with Saddam dead or in chains, his system smashed, and Iraq occupied in the same manner as Japan or Germany was at the end of WWII, with an eye toward making the revamped country successful and benign (the scariest things to come out of Japan and Germany in the last 55 years, after all, were Godzilla and the Scorpions, respectively). Anything less will be, in a word, unforgivable. If we mean to wage war, let’s wage war like we mean it.
Posted on October 8, 2002 Posted by John Scalzi 4 Comments
I’ve had a long and somewhat excruciating journey back from San Francisco, although thanks to the standard airline practice of overbooking and begging for volunteers, I am now the owner of a free trip to anywhere in the continental US. Depending on future travel plans, I actually made a profit on the trip. So it’s not all bad. Be that as it may, I have to play catch-up on a number of things, including invoicing my clients for the month. In short, no recap of my journey (I can assure you, however, that I had a really fabulous time). Maybe later.
Instead, I present to you an article I wrote for the Dayton Daily News, which appeared this last Sunday. It’s a response to a NY Times article by author Joseph Epstein, in which he suggested that everyone who thinks they might want to write a book should just keep that book to themselves. As you might expect, I think Epstein’s opinion on the matter is entirely full of crap.
I’d’ve linked to the article on the DDN site if it were up there, but it’s not. I’m presenting it here instead. This is the “unedited” copy; it differs slightly from the printed version, which was edited for space and does not have me using the phrase “shove it” — which to be entirely honest, I didn’t really expect to survive the editing process anyway. Anyway, here we go.
By John Scalzi
Author Joseph Epstein had a message for would-be authors this week: Drop dead.
“According to a recent survey, 81 percent of Americans feel they have a book in them — and that they should write it,” Epstein wrote in the September 28 edition of the New York Times. “As the author of 14 books, with a 15th to be published next spring, I’d like to use this space to do what I can to discourage them.”
And discourage them he does. Epstein — a professor at Northwestern whose most recent book, curiously enough, is called Snobbery: The American Version — notes that every year 80,000 books are already published in the United States, “most of them not needed, not wanted, not in any way remotely necessary.” Many people who want to write a book, Epstein suggests, do so with the idea of leaving something for posterity, and to proclaim their personal significance to the world. However, Epstein notes, “Writing a book is likely, through the quickness and completeness with which one’s book will die, to make the notion of oblivion all the more vivid.”
Ultimately, Epstein concludes, “Misjudging one’s ability to knock out a book can only be a serious and time-consuming mistake. Save the typing, save the trees, save the high tax on your own vanity. Don’t write that book, my advice is, don’t even think about it. Keep it inside you, where it belongs.”
Well, as the author of or contributor to several books, I’d like to offer a counter-proposal for you would-be authors: As nicely as humanly possible, tell author Joseph Epstein to take his advice and shove it. There are many things this world has too much of, but books and storytellers are not two of them.
Epstein is right about some things. Most of the people who think they want to write a book never will. Of those who start, most will give up about 50 pages in, when they realize writing a book is actually work. Most of those who manage to finish writing a book will never see their book published, or will have to resort to vanity presses, and most copies of the book will sit in the boxes in which they were delivered. Of those authors who do get published (and get paid for it), most will have the dubious pleasure of watching the book disappear off bookstore shelves in a few short months, to migrate to the remainders bin or be sent off to be pulped into paper towels. If you want immortality, negotiate with your higher power, not a book publisher.
But to say that book-writing is difficult and the publishing industry is competitive is not the same as saying that people should not write books. That’s like saying that because most people will never get signed by a major label or make an album, they shouldn’t bother to learn an instrument. Or that since most people will never be hired as a chef or open a restaurant, they should just stick to microwave meals. Thing is, most people have figured out that they’ll never be a four-star chef or a rock star. Most people don’t even worry about it. In each case, the skill is its own reward.
That’s why people should write books. They should write books because it shows a love of language and because writing is a skill worth having. I don’t think anyone would argue that we as a people should leave literacy and self-expression up to the professionals; among other things, that’s a fine way to narrow down that professional class.
People should also write books because despite Epstein’s implicit dismissal, every human being has a story to tell, and most of us have more than one. Admittedly, most people can’t write well enough to write a whole book. Most people can’t knit a sweater or compose a song, either — but could with time, effort and encouragement. Likewise, writing is a skill that improves with practice. Could having 81% of the American population working on their writing skills really be such a bad thing?
Anyway, here’s a secret writers don’t want you to know: Good writers are frequently not the professionals. As just one famous example, “Harry Potter” author J.K. Rowling was a divorced mother on public assistance before she started writing, scratching out pages in a café while her daughter napped. Presumably Epstein would have encouraged her to smother Harry Potter in the literary womb. Good writers come from everywhere; good stories — and good books — are often where we least expect them.
Let me provide another example closer to home. There’s a guy down the street from me named Darrell Gambill. He’s not a professional writer; he has a farm and works as a machinist at Goodyear. He had a story he wanted to write, about boxers and guardian angels. So he wrote it. His book, The Lion’s War, was published last year. I don’t know how well The Lion’s War is doing; I don’t expect to see it on the bestseller lists or taught in classrooms around the US, or made into a feature film. But so what? The author wrote the story he wanted to tell. I’m glad he didn’t save the typing, or the trees, or the tax on his own vanity. His book is outside, which, contrary to Epstein’s opinion, is where books belong.
Noah’s Flood: Did It Happen?
Posted on October 3, 2002 Posted by John Scalzi 4 Comments
Finally got my copy of Uncle John’s Bathroom Reader Plunges Through the Universe, and was mildly surprised to learn that the cover art was different from what is pictured on Amazon. It’s actually blue and black. So if you’ve ordered it, don’t be shocked when it looks different. It’s a feature, not a bug.
With that, I’m out of here for a few days. I’m off to San Francisco to see a few folks and to speak at JournalCon 2002; I’ll be on the panel discussion “Writing for Fun and Profit.” That’s fair since I do both. I’ll be back on Monday but probably won’t update this site until Tuesday at the earliest but more likely next Wednesday. Until then — well, it’s a big Internet. I’m sure you’ll keep yourself amused. Here’s one last science article to send you off.
Did Noah’s Flood Really Happen?
Some think they’ve found the historical event that launched the legend of Noah’s Ark. Others aren’t so sure.
You know the story of The Flood, of course: One day God, annoyed with humanity, decides that what the Earth really needs is a good long soak. So He commands His faithful servant Noah to build an ark to hold two of every species (except clean animals and birds, of which he needs to carry seven pairs each — a detail many people forget); once that’s accomplished, God unleashes a flood with rain that lasts for the fabled 40 days and 40 nights.
Many Christians take this account as the gospel truth. Others, however, wonder if the story of Noah isn’t rooted in some more local and less globally catastrophic event — one memorable enough to spawn a series of flood legends. Besides the Biblical story of the flood, other civilizations in the Eastern Mediterranean area also had significant flood legends, including the Greeks (who have Zeus creating a flood to punish the wicked), and the Sumerians and Babylonians, whose flood legends also include a righteous family, and an ark filled with creatures (the Babylonian version even had the ark’s owner, a fellow named Utnapishtim, release birds to find land).
In 1999, two Columbia University researchers named William Ryan and Walter Pitman put out a book called Noah’s Flood, which offered a tantalizing suggestion: The flood in question happened near the Black Sea around 7,000 years ago. At this time, the theory goes, glaciers left on the European continent from the last ice age melted, sending their runoff into the Mediterranean Sea. As the Mediterranean Sea swelled, it breached the land at the Bosporus Strait, near where Istanbul stands. This breach released a flood of water into a freshwater lake that sat where the Black Sea is today. This freshwater lake was quickly inundated with salty Mediterranean water (at the rate of six inches per day) and grew to the present size of the Black Sea within a couple of years — bad news for the humans whose homes and villages were situated on the shores of the former freshwater lake, and certainly memorable enough to be the basis for many a flood legend.
Ryan and Pitman’s flood theory appeared to get a major boost in 2000, when famed underwater explorer Robert Ballard discovered the remnants of human habitation in 300 feet of water, 12 miles into the Black Sea, off the coast of northern Turkey. Ballard also found evidence of the Black Sea changing from fresh water to salt water: Sets of freshwater shells that dated back 7,000 years, followed by saltwater shells that dated back 6,500 years. Somewhere between those times, it seemed, the Black Sea was born out of a freshwater lake. It’s also roughly the right time frame for Noah’s famous flood.
Aside from the obvious housing problems that this rising tide of saltwater presented anyone living on the edge of the freshwater lake, it would also have the rather unfortunate side effect of killing anything that lived in the freshwater lake itself — most creatures that live in freshwater environments will die off in saltwater environments (and vice-versa).
However, the newly arriving saltwater species wouldn’t have been much better off: Salt water is denser than fresh water, so the new water from the Mediterranean sank under the fresh water, and the oxygen exchange between these levels of water was pretty much blocked. Any saltwater creatures that came along for the ride eventually suffocated. All those dead animals probably made the Black Sea a stinky place to be for a while. The silver lining here, however, is that oxygen-free water makes for a fabulous medium to preserve shipwrecks. Any boat that’s sunk to the bottom of the Black Sea since about 5500 BC is still there, unmolested by local marine life.
So, case closed, right? We’ve found the famous Biblical flood? Not so fast: In May of 2002 a group of scientists published an article in GSA Today, the magazine of the Geological Society of America, refuting the idea of a sudden flood of Mediterranean seawater flooding into the Black Sea area. Their contention is that based on mud samples they’ve found in the Marmara Sea (just on the other side of the Bosporus Strait from the Black Sea), there has been interaction between the Mediterranean and the Black Sea area for at least 10,000 years — suggesting that the Black Sea filled in over a much slower period of time: About 2,000 years or so. So while the water levels in the Black Sea definitely rose, the rate of their rise wouldn’t constitute a “flood” by any conventional standard of the word.
That’s where the debate stands at the moment — those who think the Black Sea was created in two years, and those who contend it was created in two thousand. In the scientific search for Noah’s Flood, the jury is still out.
Posted on October 2, 2002 Posted by John Scalzi 2 Comments
I’m continuing my cavalcade of science-related pieces in honor of the release of Uncle John’s Bathroom Reader Plunges Through the Universe, to which I contributed a significant number of the articles contained therein. The pieces you’re reading here this week are rather similar in tone and content to the pieces you’ll find in the Uncle John’s books (explicitly in the case of the articles I wrote, and implicitly even in the ones I didn’t, since they wouldn’ta bought so many of my pieces if they didn’t fit the general tone of their book), so if you like it, consider getting the book (click on the graphic for an Amazon link). Remember: You don’t have to only read it in the bathroom. Also remember the contest I’m running: The winner will get a whole stack of Scalziana. Yes, that’s a word. At least it is now.
In an Alternate Universe the Cubs Win the World Series Every Year
Ready to get your mind blown? Get a load of this: The “Many Worlds Interpretation” of quantum physics.
Chicago Cub fans are a long-suffering lot: Their beloved Cubbies have been choking for almost a century now, failing every year since 1908 to win the World Series. And there’s no relief in the form of Chicago’s other team, the White Sox, which have found themselves similarly throttled since 1917. At least their misery is shared by Boston, whose Red Sox have been laboring under the “Curse of the Bambino” since 1918.
But what if we told you Cubs and Sox fans that your misery is unfounded — and that in fact your teams have won the World Series? Not just since 1908 (or 1917, or 1918), but every single year since. That’s right. Each of these teams. The World Series. Every. Single. Year. It’s true.
“Not in this world,” you say. And you know what? You’re exactly right. Not in this world. But in other worlds, and in other universes, each of which has been created from our universe. It’s all possible under something called the “Many Worlds Interpretation” of the universe, thanks to the deeply freaky and unsettling nature of quantum physics.
Here’s how it works. Think about an electron. Now, most people think of an electron as a little ball that whirls around the nucleus of an atom, very quickly. But a quantum physicist would tell you that in reality, an electron isn’t a ball, but a haze of mathematical probabilities, all of which exist at the same time. The electron could be at point A, or it could be at point Z, or at any (and every) point in between. It’s only when you make the effort to observe the electron — to actually look at the thing — that the electron “decides” where it wants to be, and picks one of its possible locations to be at. For folks who don’t regularly dabble in quantum physics, the idea of a sub-atomic particle “deciding” to physically exist only when you observe it is more than a little creepy. But, hey, that’s how the universe is; we’re just telling you about it.
Up until the 1950s, scientists handled the idea of an electron (or any quantum event) collapsing into one possibility by suggesting the idea of multiple theoretical “ghost worlds,” in which the electron shows up at a different point, one for every point it’s possible for that electron to collapse into. However, these “ghost worlds” don’t actually exist; they’re just a theoretical construction that’s convenient to use. Well, in 1957, a Princeton graduate student named Hugh Everett asked: What if these “ghost worlds” actually existed?
In Everett’s theory, an electron collapses into a single point when it’s observed, just like it always does. But the event also creates entirely new alternate universes, in which the electron collapses to a different point — so the universes that are created are exactly the same, except for the position of that one single electron. How many universes are created? One for every single possibility. Depending on the quantum event, that can be quite a few universes, just from a single electron collapsing. Consider how many electrons exist in the universe (our universe), and you’re presented with a staggering number of universes being created, every instant, throughout the entire span of time that our universe has existed. And that’s just with the electrons (there are, of course, other quantum events).
Again, this idea is truly wild. But the thing is, the physics on this theory checks out. It really is possible that the universe works this way. The catch (and there’s always a catch) is that there’s no way to test it. Any universes that are created from the quantum splittings are impossible for us to visit or observe.
What happens with these possible “other” universes? Well, they just keep existing — away from us, in their own space. There’s no reason to assume that what happens in those universes from the instant they split off from our own is what happens in our universe. In alternate universes, anything can — and as far as we know, anything does — happen. In a universe that split off from our own in 1908, it’s perfectly conceivable the Cubs came back in 1909 to beat the Pittsburgh Pirates to the NL pennant — and then took the Series again from the hapless Detroit Tigers for the third year running. And then came back in 1910 (which they did in our universe, incidentally), and won the Series again (which they did not). And again in 1911, and in 1912, and so on and so on. Admittedly, this would get boring for anyone who’s not a Cubs fan. But don’t worry, guys. In other universes, your team is the one that wins every single year, or (if you choose not to be greedy about it), any year you wish for it to win.
This doesn’t just work with baseball, either. Ever wonder what it would have been like if you’d married someone else? You did — in another universe. Wanted to be an astronaut? You are — in another universe. Wanted to race a monkey-navigated rocket car across the Bonneville Salt Flats? You did it, baby. Just not here. Yes, it’s a little sad the other yous are having all the fun. On the other hand, think of all the other yous that are sleeping on steam grates or doing time in the big house for petty larceny or woke up in the hospital with their bodies amputated from the third vertebra down and a doctor saying “What the hell were you doing, letting a monkey navigate your rocket car?” You’ll realize this world is not so bad. Even if the Cubs don’t have a chance.
Posted on October 1, 2002 Posted by John Scalzi 7 Comments
I’m continuing my cavalcade of science-related pieces in honor of the release of Uncle John’s Bathroom Reader Plunges Into the Universe, to which I contributed a significant number of the articles contained therein. The pieces you’re reading here this week are rather similar in tone and content to the pieces you’ll find in the Uncle John’s books (explicitly in the case of the articles I wrote, and implicitly even in the ones I didn’t, since they wouldn’ta bought so many of my pieces if they didn’t fit the general tone of their book), so if you like them, consider getting the book (click on the graphic for an Amazon link). Remember: You don’t have to only read it in the bathroom. Also remember the contest I’m running: The winner will get a whole stack of Scalziana. Yes, that’s a word. At least it is now.
Have We Got a Disease For You!
Looking for a little something to make you stand out from the infectious crowd? One of these maladies may just do the trick.
We know how it is. You want to be different from the other guy. Everyone else is walking around with a cold or a flu — your standard issue rhinovirus or influenza bug — but you want something different. Something that you’re just not going to catch on any street corner. Well, then, come on down. Right now we’ve got a nice suite of diseases, maladies and genetic conditions that will make you stand out in the crowd, if only because you’ll have to be locked in a sterile room with two or three levels of biological isolation protocols placed between you and the outside world. Won’t that be fun? Oh, don’t worry. Some of these diseases and maladies aren’t even fatal.
Carotenosis: Let’s start off with something relatively benign, shall we? Carotenosis comes from over-ingestion of beta carotene, a pigment that you’ll find in vegetables such as carrots — your body turns it into vitamin A, which is, generally speaking, a good thing. Ingest too much beta carotene, however (say you eat nothing but carrots and drink nothing but carrot juice, just because you were curious to see what would happen) and eventually your skin will turn orange. That’s carotenosis, a real example of “you are what you eat.” Carotenosis won’t kill you, it’ll just make you look funny, but massive doses of vitamin A can cause: Nausea, vomiting, irritability, hair loss, weight loss, liver enlargement, menstrual problems, bone abnormalities, and stunted growth for the kids. So if you find yourself turning orange, lay off the rabbit food for a couple of days.
Hereditary Methemoglobinemia: You say orange really isn’t your color? How do you feel about blue? This genetic malady causes a malformation in the hemoglobin molecule in your blood, reducing your blood’s capacity to carry oxygen. This turns arterial blood sort of brownish, and in folks of a Caucasian stripe, this will give your skin a distinct — and distinctive! — bluish tinge. True, your blood’s not exactly richly oxygenated, but that’ll just give you a fashionable appearance of ennui. But there is a catch: Hereditary methemoglobinemia is recessive, so by and large it’s prevalent only where (oh, how to put this delicately), a family tree has a few too many recursive branches. This was the case in the most famous case of this ailment, the Fugate family of Kentucky, where a high incidence of cousin-marrying eventually caused a number of Fugates to be blue, and not just in the traditional “I’m feeling mighty low” sense.
Want blue skin but would rather not have father-uncles and sister-cousins? There is also acquired methemoglobinemia, which you can get by exposure to certain toxic chemicals. However, the side effects of this variant are headache, fatigue, tachycardia, weakness and dizziness at low levels of exposure, followed by dyspnea, acidosis, arrhythmias, coma, and convulsions at higher levels, which is then followed by death. Speaking of feeling blue.
Kuru: Enough with this skin color nonsense, you say. Give me a truly distinctive disease! Fine, if you really want to make an impression, try on kuru for size. Even the name tells you it’s something truly nasty, since “kuru” means “trembling with fear” in the language of the Fore, the New Guinea highland tribe in which the disease reached epidemic proportions in the middle of the last century. Kuru’s first symptoms are headaches and joint pains, followed several weeks later by difficulty in walking, and uncontrolled trembling while asleep or while stressed (which would be most of the time, considering). Tremors become progressively worse, confining the patient to bed. This is followed by total loss of the ability to swallow or eat, and after that you’re just a hydrating IV drip away from doom. Oh yes, you’ll definitely be the belle of the ball with this one.
One minor detail, which would be how you catch kuru in the first place: You have to eat brains. Specifically, human brains. Even more specifically, human brains already infected with kuru. This is how the Fore got it — as part of their funeral rituals, they ate the brains of their dead. Not quite up for a Hannibal Lecter moment? Well, fine. Let’s move on then, shall we.
Necrotizing Fasciitis: Or as you know it, flesh-eating bacteria! The funny thing is (and that’s funny as in ironic, not funny as in “non-stop chucklefest”), the affliction does live up to its name: The bacteria involved in necrotizing fasciitis (which include the usually somewhat less virulent Group A streptococcus that give us run-of-the-mill ailments like strep throat) can actually eat through an inch of flesh in the space of an hour. What will make you truly paranoid is that early symptoms of necrotizing fasciitis are remarkably similar to flu symptoms, including vomiting, diarrhea, dehydration, weakness, muscle pain, and fever.
It’s the second set of symptoms — very painful infection around a cut or a bruise and/or a rapidly growing infection around said bruise — that will have you rocketing towards the doctors and praying that Western Civilization’s rampant misuse of antibiotics in everything from bathroom soaps to livestock feed hasn’t caused your personal area of infection to be packed with drug-resistant bacteria that will simply laugh cruelly at whatever it is the doctor administers to fight them.
The good news here is that the odds of your flu-like symptoms devolving into necrotizing fasciitis are a couple hundred thousand to one (the odds are somewhat worse if you’ve just had chicken pox, however). If you really want to reduce your fretting, wash any cut or scrape you get with warm soapy water, and keep the wound dry and bandaged, just like you’re supposed to. And rethink your desire to have a truly unique disease. After considering necrotizing fasciitis, a nice run-of-the-mill cold is beginning to look mighty attractive.
Posted on September 29, 2002 Posted by John Scalzi 1 Comment
All right! One of my books for 2002 is now out: Uncle John’s Bathroom Reader Plunges Into the Universe (my other book for 2002, The Rough Guide to the Universe, now looks like it’s going to be released in 2003. Which is good — too much product released at the same time is no good, especially when both books have the word “Universe” in the title). Of course, I advise you all to run out and buy this book as soon as possible — if you can’t wait to get to a bookstore or are otherwise incapacitated (you’re being held down by stoats, say), then head on over to Amazon.com. I’m all about facilitating purchases.
The Uncle John books, if you’re not familiar with the series, are compilations of short articles (sized just right for light bathroom reading, hence the title); this particular one has a science theme — not just astronomy, but also health and earth sciences. I should note for the sake of clarity that I am not the “Uncle John” of the title: Indeed, technically, this is not my book at all. I am but a mere contributor. However, I wrote 40 articles in the book, which by page count is about a quarter of its total, and I think what I’ve written is pretty interesting. And I have very high regard for the Uncle John’s folks, so even if I hadn’t written a fair chunk of this book, I’d want you to go out and buy it anyway.
So what did I write about? Here is a sampling of the titles of articles I wrote for this one:
*Cool Astronomical Terms to Make Friends and Impress People
*Read a Weather Map Like a Pro
*How to Make a Black Hole
*”You Think I’m Mad, Don’t You?” (Mad scientist movies)
*The Body’s Second String (Little-known organs and systems)
*Big Moments in Forensics
*10 SF Books Even Nongeeks Would Love
And there are 33 others spread around the book. No, I’m not going to tell you which ones they are. I want you to guess.
In fact, let’s make this a contest. Go out and buy Uncle John’s Bathroom Reader Plunges Into the Universe (or, if you’re cheap and can weather annoyed bookstore staff, thumb through it at the store) and then send me the list of articles you think I wrote. The person who gets the most correct will win a John Scalzi Multimedia Gift Pack, which includes an autographed copy of The Rough Guide to the Universe (which is solely written by yours truly), an autographed copy of The Rough Guide to Money Online (a classic of the online money management genre!), a personally-burned CD compilation of Musicforheadphones plus extra tracks, and an electronic copy of Old Man’s War, the novel I’m currently shopping around. It’s a fabulous gift pack with a street value of, oh, I don’t know, $28 or thereabouts. The winner will get it sent whenever it is I get my author copies of Rough Guide to the Universe.
The rules: First, you have to send your list of guesses to me by December 31, 2002. Second, put “Universe Article Guesses” as your e-mail subject header, so I can filter them to a special mailbox and keep track of them. Third, if you were on the list of readers that I sent the Uncle John articles to while I was writing them, obviously you’re not eligible (and if you are one of these people, don’t tell anyone the titles of the articles; that’s just not fair). In the event of a tie, I’ll pick a winner by flipping a coin or whatever. No purchase necessary, but you’ll look fairly cheap if you don’t.
To give you a taste of the tone of the articles in the book, all this week I’ll be posting articles that I wrote for Uncle John’s Bathroom Reader Plunges Into the Universe but which didn’t make the final cut for whatever reason (4 didn’t make it; 40 did. I have no complaints). The first one is below. I’ll post another on Tuesday, one on Wednesday and one on Thursday (after which I’ll be out for a few days while I travel). So enjoy, and good luck with the contest.
You Smell Great!
Thinking about getting that pheromone-laden cologne? Hold that thought.
There’s a new special ingredient to cologne these days: Pheromones — chemicals your body secretes, or so you’re being told, that can help you attract the sort of hot mate that will get all slobbery with little or no prompting (or even noticeable social skill) on your part. And you think to yourself: Finally. That whole flowers-and-chocolate-and-pretending-to-be-interested-in-the-conversation thing was killing me. And off you go, to buy your pheromone cologne and let the chemicals do the talking for you. Well, before you pull out your credit card, let’s have a quick reality check about pheromones, humans, and you.
First off: Yes, pheromones really do exist, and they are chemicals that living things give off, not unlike a scent, in order to communicate with other members of their species. These pheromone communications are all over the board: Ants and termites, for example, will use pheromones to lay down a trail that other ants and termites can follow. Queen bees use pheromones to signal bee pupae that they’re going to be worker bees and not queens themselves. Wounded minnows will release pheromones to alert the rest of the school of fish to danger, a sort of fish version of the wounded soldier who says arrrrgh, I’ve been shot, go on without me.
However, many species use pheromones specifically to attract sexual partners. Insects are famous for this: Certain species of moths are so sensitive to a female moth’s pheromones that just a couple of molecules of it can get them running (well, flying. You know what we mean). Male wild boars have a pheromone that will actually cause a female of the breed to lock her hind legs into a sexually receptive position: No flowers-and-chocolate routine needed there. Even non-animals get into the act: Fungi, slime molds and algae all use pheromones to make themselves super-sexy to other fungi, slime molds and algae. It’s not love, the fungi/slime mold/algae says to its excited new friend. It’s just pheromones.
So there you have it: Pheromones = instant sex appeal, right? Sure, if you’re a slime mold. But it’s never been proven that humans use pheromones to make themselves more attractive to the opposite sex. In fact, until 1998, it wasn’t even clear that humans were receptive to pheromones at all. There were several reasons for this, not the least of which was that the organ used by many animals to receive pheromone signals — a thing called the vomeronasal organ — is all but non-existent in humans. What small vomeronasal organs we have are tiny notches tucked away in our noses, and it’s not at all clear that they’re connected to anything.
What changed that was a study performed at the University of Chicago by researchers Martha K. McClintock and Kathleen Stern. While an undergraduate at the U of C in the early 70s, McClintock noted that the menstrual cycles of the women in her dormitory eventually synced up (it is, by the way, very typical U of C undergraduate behavior to notice this sort of thing), and suspected pheromones might have something to do with it. To check this, she collected sweat samples from nine women (by having them wear gauze in their armpits), and noted where in their menstrual cycle those women were. Then she took those sweat samples and daubed them under the noses of 20 other women. Yes, yes, total icks-ville. Science is not for the squeamish.
What she found was that the women who sniffed the sweat had their menstrual cycles noticeably lengthened or shortened, depending on what sweat they were sniffing. Sweat from women early in their cycle caused the sniffers to shorten their own cycles, while sweat from women later in their cycle had an opposite effect. There you had it: The first strong indication that humans can and do pay attention to pheromones. McClintock’s study left open many questions, such as how exactly the pheromones did their signaling, or even whether pheromones would work on other people who weren’t actively sniffing sweaty gauze. But those are details to be worked out.
There are some indications that humans use pheromones (or something very much like them) in helping to determine mates — in other studies, women appear to be attracted to the smell of men who have immune systems that are different from their own (this study involved women sniffing sweaty shirts). But again, it’s important to note that so far, there’s no conclusive study that specifically identifies a human pheromone that actually makes one sex more attractive to the other sex (or one sex attractive to the same sex, if you want to go that direction).
What does this mean for your pheromone-laced cologne? Basically that it’s a waste of money. The only thing we’re reasonably certain human pheromones do is manipulate the menstrual cycle, and generally speaking, that’s not something you really want to fiddle with, for everyone’s peace of mind. Your best course of action at this point is to stick with your current cologne and try to brush up on your social skills. Hey, people have been finding love the old-fashioned way for millennia, without the use of pheromones (so far as they knew). It could work for you too. Flowers and chocolate can’t hurt either.
Posted on September 26, 2002 Posted by John Scalzi
Here’s an interesting question for you: Considering that the music industry essentially dictates the shape of the youth culture, how can it be so thickheadedly clueless about talking to teens about file sharing? The latest music industry salvo in this direction is a Web site called MusicUnited.org, which is designed to bring home the point that nearly all file sharing is illegal and wrong. Let’s take a moment and discuss all the ways that this site is going to fail miserably.
1. It’s not a cool site. It’s not cool in its intent, of course, since its intent is to keep kids from doing something they want to do, which is to share files with each other. But you can get past that if you can get your message across. The site totally screws this up right from the beginning: One of the headlines on the front page says of file sharing: “It’s illegal and it’s a drag!”
A drag? I mean, good Lord. I’m 33 and I winced when I saw that. It immediately calls to mind your junior high health teacher trying to use hep slang to tell you about why drugs are bad. The worst thing an adult can ever do when speaking to “the kids” is try to use current slang and fail (the second worst thing is to use it and use it correctly, and yet still sound like you have no clue). The site immediately sets itself up to be mocked purely on the basis of how it presents its message, which means the message won’t even get considered.
2. The site threatens. Despite the nice (but too conservatively-designed) graphic design, the textual tone of the site is one of distinct and total menace. Every bit of text reinforces ominously that file sharing is illegal (and wrong), and that there are severe penalties if you’re caught: The site’s favorite bit of trivia in this respect is the maximum penalty for copyright violations, which is five years in the stony lonesome and a $250,000 fine. “Don’t you have a better way to spend five years and $250,000?” asks the site.
Please. The minute the music industry actually ever pressed for the maximum sentence for copyright violations to be imposed on an actual teenager is the minute the shit really hits the fan. No one in their right mind believes that the penalty for a college student downloading the White Stripes album from Kazaa should be half a decade of prison rape and being traded in the exercise yard for a carton of Kools. If the RIAA actually pressed for this for a single casual downloader of music, the backlash of public opinion would destroy the music industry. They know it, and more importantly the kids know it, too. Waving around a big threat stick when you have no ability to use it makes you look sad, desperate and weak, which is certainly no way to get a teenager to listen to you.
3. The site romanticizes file-sharing. The music industry is using the same style of rhetoric against file-sharing as responsible adults used against drug use in the 60s and 70s, during which time, you’ll recall, the kids made drug use pretty much the cornerstone of youth culture. Because anything that really pisses off the grownups is worth doing more than once.
Now, this is not going to be an exact analogy, and thank God for that, since the last thing the world needs is a Cheech & Chong-like pair of wacky file sharers making movies about ripping off the music industry. But it’s good enough, and is certainly more than enough to make the kids feel that by downloading Vanessa Carlton, they’re striking a blow against the Man, or whatever it is the kids are calling “the Man” these days.
The site additionally compromises its position by featuring an area that details the civil and criminal penalties parents can face when teens download files, thereby informing the kids that here is yet another way that they can get back at their parents for having birthed them and forcing them to grow up in suburbia. Good move.
4. The site picks the wrong musicians to plead its case. On the site and in a newspaper ad that runs today, the music industry hauled out the stars to make its point, featuring quotes by Britney Spears, Nelly, Dixie Chicks and (wait for it) Luciano Pavarotti. This is supposed to reflect the diversity of the musicians who want you not to share files. The problem is, each of these artists is a multi-platinum artist whose net worth is in the millions. Britney Spears is worth over $100 million personally, as she noted recently in a People interview. The kids are not going to be sympathetic to a bunch of millionaires complaining that money is being taken from them. I know this because I’m not sympathetic to them.
The sort of musicians who should be highlighted in a campaign like this are the ones who actually will get hurt by file sharing: New musicians, musicians with smaller followings, musicians who aren’t already millionaires. The Web site features a couple of these, hidden so far down that their quotes are buried. But you tell me, which of these quotes is more compelling to you?
“Would you go into a CD store and steal a CD? It’s the same thing, people going into the computers and logging on and stealing our music. It’s the exact same thing, so why do it?” — Britney Spears
“I live with my drummer and guitarist and we have no money. Our survival is based solely on the purchase of our music. Music is not free. Even the street performer gets a dime in his box.” — James Grundler, Singer/Songwriter, Member of Paloalto.
Personally, I think the “Dude, I’d like to eat” line from a struggling musician carries rather a bit more moral weight than the “Golly, it’s like stealing from a CD store!” line from a 20-year-old woman who has more money than she can reasonably expect to spend in a lifetime. If nothing else, the kids who want to be musicians will feel closer to the situation of the guy in Paloalto than to Britney.
The final problem, however, is one that the music industry made for itself, which is the widely-held perception that music is both absurdly expensive and that the vast majority of the money that gets paid for a CD goes to everyone but the people who actually make the music. The reason for the perception is that it’s true. Why should a kid believe that $18 is a fair price for a CD when he or she can burn one at home for about 50 cents? The economics of record contracts are now common knowledge as well, and when a kid realizes that his or her favorite band can sell millions of CDs and still be in the hole to the record company, there hardly seems to be an incentive to support a system that appears to screw the people who make the music.
The site notes that making an album these days can cost $1 million or more, but this doesn’t argue against pirating music, it argues against spending so damn much to make a record. I review indie albums every week on my IndieCrit site, and the sound quality of a sizable percentage of those recordings rivals anything you’ll hear from a major label. I can guarantee you those indie artists aren’t spending a million making their CDs. They’re also not to blame for creating a system of promoting music that requires an outlay of hundreds of thousands of dollars to get music added to the playlists of ever-more consolidated radio stations, which play ever-safer music.
I’m not suggesting the kids are striking a blow for artists’ rights by boycotting the unfair system. That’d be a little much. Most of them just like not having to pay for the music. It’s more money they can spend on video games. But it wouldn’t hurt if the music industry wasn’t perceived as a bloated, vaguely vampiric entity that appears to survive by sucking the life force out of the people who make the music that kids respond to.
If I were the music industry, I’d scrap the MusicUnited.org site and try for something that starts with the assumption that the kids aren’t the enemy who have to be threatened, but are actually reasonably intelligent people who might be persuaded to spend money to support their favorite musicians if it could be intelligently explained to them why this is actually a good thing to do. In the meantime, the site is the music industry equivalent of “Just Say No” — The right message, perhaps, but the utterly wrong way to say it.
Posted on September 20, 2002 Posted by John Scalzi 1 Comment
Krissy came home the other night with Who Moved My Cheese? It was pressed onto her at work by one of the managers at her new place of employment, who told her that all new hires were actively encouraged to read it (Here’s a clue to the sensible Midwestern frugality of her new place of work: Rather than buying a copy for every new hire, which would cost $20 a pop at list price, they simply lend out the same copy over and over). My understanding is that it’s arguably the number one business motivational book on the market. Well, I’m in business, and I prefer to be motivated, so I read it. And now I can say, if this is what people are using to motivate themselves in corporate America today, no wonder the Dow is where it’s at. It is, without exception, the stupidest book I have ever read.
The motivational lessons in the book come in the form of a parable, suitable for reading to your three-year-old, about four creatures in a lab-rat maze. Two of them are mice, and two of them are little mice-size humans, and they eat the cheese that’s placed in a certain location in the maze. Eventually, the amount of cheese decreases and then disappears. The mice, who noticed the decreasing amounts of cheese, take off through the maze to find more cheese. The little humans, on the other hand, bitch about the loss of cheese and reminisce about the days when cheese was plentiful. Eventually one of the humans gets off his ass and heads out to find more cheese, and in doing so, has a motivational epiphany every few steps, which he feels compelled to scrawl on the walls of the maze.
Eventually he finds more cheese in the maze, as well as the mice, who have grown fat and happy with their new store of food. The little human considers going back for his friend, but then decides that, no, his friend must find his own way through the maze. He leaves his old pal to starve, as that’s almost certainly what his dim, stubborn friend does, and feels all shiny and self-important for finding his new cheese.
The entire parable is framed with a conversation between several friends, one of whom is telling the parable, and the rest of whom spend the parable’s epilogue wondering how they ever got through their professional and personal lives without hearing about the cheese (an interesting rhetorical cheat, incidentally — the author is confirming the usefulness of the book by creating characters that are helped by its philosophy, but which don’t actually exist in the real world. This is a very Ayn Rand thing to do).
The overall idea of the book is that change is inevitable and if you’re smart, when it happens you won’t spend much of your time bitching about how you don’t like change; instead you’ll adapt to the change and get on with your life. The “cheese” represents all the things you’ve come to rely upon. Well, let me save you 20 bucks and boil the lesson of the book down to exactly five words: Shit Happens. Deal With It.
Also, the book throws in a few other lessons, which are hopefully unintended:
1. Life is a maze that has been laid out without your control or consent. The best you can do is run through it and hope you run into the things that make you happy.
2. You have no control over the things that make you happy — their quantity and quality are controlled totally by outside forces, with whom you cannot interact, and which have no interest in your needs.
3. The mice in the parable understood that the “cheese” was decreasing but neither informed the little humans nor seemed interested in helping them once the cheese was gone. Mice represent the “other.” You cannot trust the “other.” Stick to your own kind (alternately, the mice represent management, who know more about the reality of the situation, and the little humans are the rank-and-file, intentionally kept in the dark by management. Either way: Not to be trusted).
4. The one little human found more cheese but decided not to return to help his friend, rationalizing that it was up to his friend to find the way. Moral: Once you’ve got yours, you don’t need to share. It’s not your responsibility to share your knowledge with others, even if the cost of sharing that knowledge is trivial and doing so will immeasurably improve their lives (i.e., in this case, keep the other little human from starving to death).
In other words, the formulation of the book posits a world that is confusing and sterile, in which the things that might make us happy all exist outside of ourselves, and in which the ultimate successful qualities are selfishness and paranoia. I wonder how popular this book was at Enron and Global Crossing.
Look, people. If you ever find your “cheese” decreasing, don’t run around frantically in a maze, looking for something else to replace it. Simply learn to make cheese. Which is to say, be responsible for creating your own happiness internally instead of relying on something outside of you to provide it, and living in fear that it will go away. This way, when the cosmic forces take away your cheese, you can look up and say, screw you and your stinkin’ maze. I’ll move when I damn well feel like it.
Even better, you won’t have to compete with others for your cheese. Heck, eventually you’ll have surplus cheese to give to your friends who might be starving for some. You can teach them to make cheese, too. Give a man a piece of cheese, and he has cheese toast for a day. Teach him how to make cheese, and you’ve got a life-long fondue party pal.
Mmmm. Fondue. Much better than scampering blindly through a maze. Or paying $20 for a book that condescendingly tells you that’s what you should be doing with your life.
Bob Greene Redux
Posted on September 18, 2002 Posted by John Scalzi
Interesting feedback from the Bob Greene thing the other day. Aside from the journalistic schadenfreude of watching Bob Greene fall — which is considerable, so that’s a warning to all of you who wish you had his career up until last weekend — the largest spate of e-mail I got about it came from 40-plus-year-old men who wanted me to know that they don’t like 18-year-old girls. Not at all. My universal response to these fellows was: Good for you. I’m sure your wives are proud.
As it happens, I’m not so keen on 18-year-olds myself; in the grand scheme of things, procuring one today would be more trouble than it’s worth. This has nothing to do with their physical charms (about which I’ll comment in a minute) and pretty much everything to do with the fact that at the age of 33, the only two things I have in common with the typical 18-year-old girl are that we are both human and speak the same language, plus or minus a couple dozen words of slang. To be terribly male about it, I suppose I could have sex with an 18-year-old if I had to. I just wouldn’t enjoy the post-coital conversation very much. So if it’s all the same I’ll pass. Fortunately for me, there are not great throngs of 18-year-old hotties at my door, licking the window panes to entice me to let them come up for a romp. You can imagine my relief.
Over at Slate, Mickey Kaus begs to differ about my point concerning Greene’s encroaching mortality being a consideration for his boinking a teenage woman; Kaus writes:
“Why do men — like Scalzi here, or Warren Beatty in Shampoo (or whoever wrote Warren Beatty’s lines in Shampoo) — have to explain their desire to have sex with attractive women in terms of a struggle against mortality (“middle-age-death-denying” in Scalzi’s words)? You mean they wouldn’t have sex with young women if they were in good shape and knew they were going to live to be 300? They didn’t want to have sex with young women when they were young themselves? It’s sex! Millions of years of evolution have designed men to want it and enjoy it. It’s stupid to try to explain this urge in some highfalutin’ literary or spiritual way — and revealing that even relatively no-BS men like Scalzi (or Nick Hornby in High Fidelity, to name another) feel that they have to.”
Let’s separate this out. There’s the first point, on which Kaus is entirely correct, which is that boinking hot young women is really its own excuse. You all know the drill concerning the genetic and cultural reasons for this, so let’s pretend I’ve made all those points so we can move on. There is the point to be made here that (some) men are turned off by the yawning chasm in life experience between themselves and the average 18-year-old, and therefore prefer the company of women nearer their own age. As I mentioned earlier: Good for them.
On the other hand: Provide a man with the brain of a 45-year-old woman (yes, he’ll suddenly become smarter, ha ha ha, thank you very much) and tell him he can put it either into the body of a fit, attractive 45-year-old woman, or into the body of a fit, attractive 18-year-old woman. Let’s all not pretend that the 45-year-old body is going to do anything but sit there with a blinking neon “vacancy” sign flashing over its head. In a perfect world (for men) women would hover around age 23 forever (In a perfect world for women, I expect you’d see a lot more variation in age, from a Heath Ledger 22 to a Pierce Brosnan 49, with the median being a Brad Pitt 38).
Still, conceding this point, which I readily do, doesn’t mean that middle-age dudes still don’t actually see (or at least rationalize) porking the young as a fist in the snout of death. It’s not especially highfalutin’ to point it out, it’s actually pretty sad and common. If you’re thinking about death, or how you’ve squandered your potential in middle management or wherever, you want to do things that make you feel alive. Having sex with young women is the male mid-life crisis version of the Make-A-Wish Foundation. It doesn’t keep you from dying, but at least you get to go to the Magic Kingdom one more time.
Whether this is the particular case with Bob Greene is another matter entirely. As journalist Nancy Nall notes on her site, Greene has had a reputation as a skirt-chaser for a while now, so if these scandalous rumors are true, he’s merely pursuing a modus operandi honed over decades (eeeeew). In which case Kaus carries the day. This encounter really is less about middle-aged angst than it is just about making a fast and easy booty call on the Youth of America: Dinner and dessert. Let’s hope it was at least an expensive dinner. Taking the girl out to Harold’s Chicken Shack before slipping her the drumstick would just be chintzy and sad.
Moving away from the realm of horndog newspaper columnists and the teenage girls they cavort with, let me take a minute to bow down to my own superfabulous wife, who as you may know started a new full-time job on Monday. She was at the job roughly six hours before she got a promotion into another department; the department had an opening, saw her resume and made a (barely) internal hire. This is a testament both to Krissy’s fabulousness and her new company’s ability to judge talent. I’m pleased because at this rate of ascent, Krissy will be able to support us all on her income alone by about this time next week. Which means I can retire and spend more time on the important things, which are, of course, video games. Yes, I’m aware that this last statement means that if anyone in this relationship is going to be traded away for a new hot and young plaything, it’s going to be me. It’s a risk I’m willing to take.
Bob Greene Gets Canned
Posted on September 16, 2002 Posted by John Scalzi
Header in my Spam box today: “Barnyard animal rapers take it to the extreme!!!” Jesus. Aren’t they there already?
Speaking of taking it to the extreme, Chicago Tribune columnist Bob Greene resigned his position over the weekend because someone blabbed to the Tribune (in an anonymous e-mail, no less) that Ol’ Bob had a sexual encounter with a teenage girl a decade ago (he would have been in his mid-40s at the time). He had met the girl in connection with his newspaper column. Interestingly enough, it’s that last part that seems to be the smoking gun, not that she was a teenage girl and he was a middle-aged guy with what looks like a bad haircut, although all of that looks bad enough. Apparently she was the age of consent, even if she was a teenager (there’s a couple of years where those two overlap). But having sex with someone you meet in connection with a story is a no-no.
That Bob Greene would have sex with a teenager while he was huffin’ and puffin’ away at middle age is not much of a surprise. First off, he’s a guy, and if the average 40+ guy gets a chance to boink an 18-year-old without penalty (or in this case, a penalty delayed by several years), he’s going to take it. Undoubtedly he’ll have a good rationalization (we always do, and Greene, being a writer, probably has a better one than most), but to cut to the chase, he’ll do it because she’s hot and young, and because during middle age the Veil of Male Self-Deception, even at maximum power, can no longer hide the fact that one day the man will die, and that between now and then, the number of truly hot young women he can have without paying for them is small and getting smaller, fast. So that’s reason number one.
Reason number two that it’s not at all surprising is that Bob Greene is, by self-appointment, Boomer America’s Newspaper Columnist. Well, was. Anyway, as a chronicler of the Boomer Nation observing itself, it was only a matter of time. Boomers have never done anything that wasn’t eventually about them; it’s the funky never-ending narcissism thing they’ve got going. No, that doesn’t make the Boomers evil — every generation has its annoying tics (my generation, for example, has a tendency to whine like kicked puppies being shown the boots that will get them in the ribs), and this is the Boomers’. Also, rather unwisely, the Boomers made a fetish of their youth when they were younger — hey, they were young, what did they know — and they’re not handling the inevitable decrepitude well. Narcissism + Getting Older = Irrational Behavior, often involving younger women in ill-advised trysts. As Boomer America’s Newspaper Columnist, how could Greene not do this? He’s just staying true to his demographic.
Reason number three is that Bob Greene telegraphed the idea he’d do (or did, depending on the timeline) something like this a decade ago in his perfectly awful novel All Summer Long. The story involves three life-long high-school chums, who when confronted with the stirrings of middle-age do what all newly-middle-aged men do in mediocre quasi-autobiographical fiction written by newly-middle-aged Boomer men: Take a long vacation away from their families and responsibilities to “find themselves” on America’s byways. This, of course, often involves extracurricular sex with hot babes. In the case of Bob Greene’s obvious stand-in inside the novel (a nationally well-known TV journalist named “Ben”), this means having sex with a graduate student roughly half his age. In real life, Greene diddled with a high school student closer to a third his age, but, speaking as a writer, one always tries to make oneself look better in fiction.
Now, Greene didn’t have to follow through on the whole sex-with-a-much-younger woman thing just because he wrote about it. Mystery writers write about killing people all the time; most of them don’t actually attempt to follow through. But sex with a younger woman won’t kill you (just your career) and anyway let’s revisit points one and two here. It wasn’t inevitable, but when a guy draws himself a roadmap and hands himself the keys to the car, it’s not entirely surprising he ends up in Whoops-I-slept-with-someone-my-daughter’s-age-ville, looking for a motel that rents by the hour.
Be all that as it may, I do have to wonder what the problem is here. Greene’s sleeping with a teenage woman is gross to think about, but they were both of legal age, and even if she was just barely so, “just barely so” counts as legal. So far as I know, Greene applied no coercion other than his not-especially-dazzling celebrity, and as everyone knows, if a great many celebrities didn’t do that (especially the not-especially-dazzling ones, and especially ones, like Greene, who have a face for radio) they wouldn’t get any action at all; they’re just as lumpy and furtive as the rest of us.
Journalistically speaking, having sex with someone in one of your stories isn’t very smart and is definitely suspension-worthy (a nice long “leave of absence” would have been good), but it’s not a crime. From what I can tell, Greene even waited until after he had written about the woman to hit her up. The Tribune is labeling it a “breach of trust” between journalist and subject, but if he did in fact wait until after he had written about her (and did not write about her post-boinkage), where is the breach? What I see is simply middle-age-death-denying sex, which God knows is common enough. Unseemly, sad and more than a little creepy, but there are worse things a journalist can do. Hell, it’s not plagiarism.
There’s probably more here than what we know now; that’s my only guess. It’s worth noting that the Trib didn’t fire Greene; he apparently offered to resign and the resignation was accepted. If I were a corporate suit, I’d’ve taken the resignation too, since it was an easy way to distance my company from Greene’s compromising position.
Also, I think Greene should have been cut as a columnist years ago, not because he’s morally tainted, but because he’s a boring columnist. He stopped being interesting and started being filler long before he did his questionable after-school activities. From a purely utilitarian point of view, there’s no downside to Greene hightailing it out of town, excepting that there will be the painfully rationalized mea culpa six months down the road as part of Greene’s inevitable comeback (America loves a reformed sinner).
But based on what we know now, this isn’t the way Greene should go out. If he needed to be yanked, he should have been yanked on the merits of his writing (or lack thereof), not because of sex he had a decade ago with a legal adult who apparently gave her consent after she was no longer his journalistic subject. Greene is getting popped on a dubious technicality, and though I would have never imagined I’d say something like this, I think he probably deserves better. Getting canned for being a boring columnist would probably have been harder on the ego, but at least it would have been a reasonable excuse for getting escorted from the building. I won’t much miss Greene’s columns, but even I wish he could have had a better final act.
Whatever Everyone Else is Saying