To Begin With…

My editor didn’t much care for the three-part prologue I wrote for Book 3 of my four-volume epic. I’m pretty sure she’s right. What I was doing was setting up the story, rather in the way that one would assemble scaffolding. Two of the sections of the prologue introduce the reader to very minor characters — characters who are never seen or heard from again. The things these characters do are certainly relevant to the later action, but nothing they do is dramatic. Minor characters and no drama — not an effective opening. The third section of the prologue is an excerpt from the autobiography (written many years later) of one of the secondary characters. It’s by way of being a quick summary of some of the action in Book 2 and a couple of hints about what is to come. There’s no action at all in that section.

The conventional wisdom on how to start a novel — or, for that matter, a short story — is to begin in medias res. That’s Latin for “in the middle of things.” Don’t lead up to the main conflict of the story; dump us into the middle of it!

I have a few multi-volume series in my Wall of Paperbacks, so tonight I went and looked at how other authors start Book 2 or Book 3. One starts with an invading army pouring over a hill while the king watches from the battlements of his castle. Okay, we know where that’s going. Another starts with a deadly standoff — two good guys and two bad guys aiming crossbow bolts at one another, from extremely close range. Gets your attention, doesn’t it?

Two factors need to be weighed, I think, in considering how to start Book 2 or Book 3 in a series. There may be other factors, but these two leap to mind.

First, how directly do the events at the start of this book follow the events at the end of the previous book? Some series are loosely tied, each book standing more or less on its own. In that case it’s easy enough to just start the new story with fresh action — no need to bog down the opening with an explanation of how the characters came to be doing whatever they’re doing. But in other series the action is closely linked, each book leading directly to the next.

That’s what I have. The beginning of Book 2 follows the end of Book 1 by only a few hours, and the segue from Book 2 to Book 3 is equally taut. In that situation, a fresh and dramatic opener becomes a bit of a challenge, because the previous book had an ending. The action had resolved. And then suddenly shit goes boom? That would be tricky to set up.

Second, do we anticipate that the reader will have finished with the previous book only a few days before and will remember most of the salient details, or do we anticipate that some months may have passed, in which case the reader is likely to need reminding in order to understand what the heck is going on?

There’s also the size of the cast in the opening scene to consider. An opening scene with six or eight people is almost bound to be a mistake, because the author has to drag the action to a screeching halt in order to make sure the reader knows who everybody is. But if Book 3 follows Book 2 with a lapse of only a few hours, and if Book 2 ended with a whole bunch of characters in the same place at the same time, they’re still going to be onstage in the dramatic opening of Book 3.

Pardon me while I devote fifteen seconds to feeling sorry for myself. Okay, I’m fine now.

No matter how those factors line up, though, an opening needs some sort of tension. As Holmes says to Watson, “The game’s afoot!”

This is why some editors, and some readers, despise prologues. They want the author to get on with the story right now, with no hemming and hawing. (George R. R. Martin starts every book of A Song of Ice and Fire, the Game of Thrones series, with a prologue, but what does he know?)

If you’re writing literature, of course, the game that’s afoot may be very much more subtle than a confrontation with crossbows, but a careful examination of almost any well-written novel is likely to reveal that the author is very carefully setting up the psychological, emotional, or cultural conflicts to be explicated in the book.

I’m not trying to write literature. I’m just trying to tell a good story. But if I ditch the prologue of Book 3, the opening scene is going to be a banquet with more than a hundred people, among whom will be Kyura, Meery, Spindler, Benagat, Dunny, Strudabend, Iknizer, Farin, and several characters who are new and need to be introduced. There is tension in this scene, but it’s going to be a mess.

Whatever. As I like to tell my cello students, if playing the cello was easy, everybody would do it. I think that applies pretty well to writing novels, too.

Muddle Me This

I don’t often post links to other writing blogs, but I enjoyed this essay on the BookBaby blog. The number one enemy of the writer, blogger Dawn Field suggests, is unclear thinking. If you don’t envision the details of a scene clearly, how are you ever going to describe it to your readers? If you aren’t clear about why your characters are doing whatever they’re doing, not only will your characters likely be flopping back and forth on the page like a loose fire hose, you won’t know how to describe their actions — with gentle verbs, with angular verbs, with hesitations or without, and so forth.

It’s a tell-tale sign, and I’ve seen it in more than one novel: the writer describes a room by saying, “There were three or four portraits of ancestors on the wall.” Or perhaps “three or four straight-backed chairs” or “two or three French windows that opened on the garden.” That writer has not envisioned the room clearly. It’s not hard to count the chairs or the portraits! The result is vagueness.

That may seem a trivial example, and it is, but if you find yourself writing that sentence or anything like it, I hope you’ll pause and take stock. Do you have the details of the scene firmly in mind? If there are ten or twelve portraits on the wall, then fine — vagueness is appropriate. But “three or four”? That’s lazy writing.


One of my weaknesses as a fiction writer is that I want things to make sense. I want to understand how the events in the story could and would actually unfold. Okay, except for the unicorns or zombies or whatever — their existence I feel no need to explain. But given the presence of a unicorn in a story, I want the characters’ reactions to it to be realistic. And not just the characters. If the unicorn is in a fenced paddock and is later found running free, I insist on knowing exactly how it got out.

How high a fence can a unicorn jump? These details matter to me.

Curiously, I’m also a fan of Doctor Who. My love for the series is not diminished in any way by the fact that the plots make no sense at all. Loose ends are left flapping in the breeze. Any sort of jerry-rigged five-word explanation can be used, and will be, to explain the latest howling absurdity.

I don’t know if the BBC has script conferences, but if there were a script conference for Doctor Who and anybody ever said, “But is it plausible?”, the miscreant would be dragged out behind the building and shot. Plausibility is not just irrelevant in Doctor Who, it would be anathema.

Where the series succeeds, and brilliantly, is in the emotions that each scene arouses. We’re not bothered that the Doctor is obviously dead and ten minutes later is alive again (with a five-word explanation). What the writers are aiming at are the twin emotional peaks of terror and grief followed by amazement and celebration. Those emotions are the pivot on which the series succeeds. Well, that and the special effects.

To be sure, the endless stream of bizarre plot complications couldn’t possibly be handled any other way. But that’s not the point I’m driving at here. The point is, viewers don’t care — and the writers know they don’t.

Most people have a simple, primitive view of the world they live in. They’re not equipped to understand chains of logical reasoning, or even to notice glaring logical flaws. They look at the world and see a good guy, a bad guy, danger, thrills, victory, and not much beyond that. Our Republican propaganda experts know this; our current alleged president could not possibly have been elected by a nation where voters understood and were swayed by logic, truth, or science.

But that’s a side issue. The point, for a fiction writer, is this: Every scene needs to have some sort of emotional core. Its pulse needs to beat. If you can manage that, your readers probably won’t care that the unicorn has jumped the fence. Most of them won’t even notice.

Are We Having Fun Yet?

Since this blog is mostly about writing, I’m going to try to turn this anecdote into a writing tip, but you’ll have to bear with me for a minute. A few months ago I joined the local Unitarian Church. As a card-carrying atheist, my choices in the Sunday morning worship department are rather limited; fortunately, the local UUs, as we’re supposed to call ourselves, are an active congregation, very inclusive and welcoming — and I get to practice my sight-singing with the secular humanist hymns.

We have an excellent minister (he’s leaving in July — boo, hiss!), but sometimes the services are presented by members of the congregation, or by guests. Yesterday we had a guest, an aspiring Unitarian minister named Claire Eustace. Her announced topic for the sermon was “Let’s Play!” Now, I’m one of the more playful geezers you’re likely to run into, so I was ready to be inspired.

Except, not. Ms. Eustace began her sermon with a long list of the awful things that are going on in the world. Our alleged president was mentioned. Global warming was mentioned. Discrimination against LGBTQ people was mentioned. I’m sitting there thinking, if you’ll pardon the phrase, “What the fuck?”

After a couple of minutes she suggested that we all wave our arms and make silly noises while we paint the sanctuary with imaginary colors — quite silly, but at least she’s getting onto the topic now. But no. After that we’re back to another litany of misery courtesy of the really awful world we all live in. I’m dyin’ inside.

And then, mirabile dictu, she mentions how important it is to be spontaneous. This was my cue. Fortunately, I was on an aisle seat. I jumped up and started dancing down the aisle snapping my fingers, back to the back of the sanctuary and out the door. I wandered around outside for ten minutes and went back in just as she was finishing.

It’s not often you get to make a post-modern editorial statement on the spur of the moment while also saving your sanity, but I was locked and loaded. I wanted to get out of there!

My read of the situation is that most likely Ms. Eustace is a painfully serious person, and was trying to apply somebody’s advice (perhaps the advice of a therapist) that she lighten up. But here’s the thing about play: Play is not — repeat, NOT — a way of giving ourselves a break from the soul-destroying crises to which we’re exposed in the daily news. Play is just play. That’s all it is.

A baby goat does not frolic to distract itself from the knowledge that it may soon be eaten by a puma. It frolics quite simply because frolicking feels good. That’s all play is, Ms. Eustace. It’s about feeling good. It’s not a response to anything; it is an end in itself.

Consider how we humans use the word. We play board games and card games. We play music. And sometimes we go to a play, where the people onstage play parts. Why do we do all this stuff? Because it’s fun.

The lesson for me, as a writer, is that if I’m not having fun writing, I’m doing it wrong. Writing is not about making a point. It’s not about proving anything, or inspiring people. Nothing like that. Writing fiction is a form of play.

That doesn’t mean it’s always easy! Sometimes writing is painful. But I would hope that it’s painful because the paragraph or the chapter is going badly, not because I’m writing about things that are inherently painful. I know some writers devote themselves to exploring painful emotions, impossible family conflicts, and so forth. I have nothing to say to those writers, other than to misquote Fleetwood Mac: “You go your way, I’ll go mine.”

For me as a reader, if a novel doesn’t shine with a spirit of playfulness, I’m going to set it down and not pick it up again. We can’t all be Terry Pratchett, but life is too short to spend it grinding around in the muck.

New Words or Old?

One of the important ways to tie together the novels in a series is to use similar titles. It’s a marketing trick, but a useful one, and it’s at least 80 years old. (Starting in 1933, Erle Stanley Gardner wrote Perry Mason novels whose titles all began with The Case of.)

I’ve been working on a four-novel series using made-up compound words in the titles: The Leafstone Shield, The Ribbonglass Tree, The Heartsong Fountain, and The Firepearl Chalice. The one I’ve never much liked is the word “ribbonglass.” And now that I have an excellent cover artist (Karri Klawiter) working on the series, she’s struggling with how to make a visual image of a Ribbonglass Tree. So now I’m thinking, should I change it?

I think I’m going to change it. My leading candidate, after listing about 30 options, is The Rainbow Tree. Oddly enough, it appears nobody has ever used that title for a novel — that’s the good part. The bad part is, “rainbow” is not a made-up word. It describes the tree well, and has the added benefit that it’s a shorter word, which means it can be put on the cover in a larger font. Also, the other three words are all joinings of single syllables (leafstone, heartsong, firepearl), so the meter of “rainbow” fits better than “ribbonglass.”

But do I dare go with a familiar word? My list of bad alternatives includes lightdance, shimmerflower, glowfruit, silverapple, and sparkglow. The least clumsy of the bunch is shimmerflower, but it’s an even longer word, so it would be just horribly small on the cover.

Along the way I had to change a made-up word that’s not in a title. I had a thing called tumblerock — a patch that might be the size of your back yard, or larger, in which boulders ceaselessly tumble over one another, occasionally colliding and shooting out shards of rock. Tumblerock is deadly, unless you know the right magic spell to pass through it unharmed.

But the tumblerock patches are centuries old, and it occurred to me belatedly that after only a few decades, each of them would be surrounded by a ring of gravel several meters high. Having to climb up over the gravel to reach the tumblerock would be bad storytelling. It would be undignified. It would be silly. So I needed a replacement for tumblerock.

I’m probably going to end up with air-tangles. It’s much the same concept, but now space itself is contorted and in constant motion rather than a bunch of boulders. An air-tangle is just as deadly if you walk into it, but it doesn’t produce inconvenient amounts of gravel. I’m still wondering what you’ll be walking on if you know the magic spell and enter the air-tangle. Walking on air?

Are You “Serious”?

The Romans put it this way: de gustibus non est disputandum. In English, “There’s no arguing over matters of taste.” Of course, we often engage in such arguments, even though doing so is pointless.

My thoughts about this were triggered by a discussion of music, but they seem to relate somewhat to writing too, so I’m going to put my rambling, incoherent commentary into this, my mostly-about-writing blog. The connections, such as they are, will appear further down the page.

A small minority of music lovers, found primarily but not exclusively in university music departments, is passionately dedicated to the composition and enjoyment (if that’s the right word) of music that is ugly and difficult. Those who love the stuff don’t consider it ugly, of course. If pressed, they may admit that it’s difficult, or at least that it’s an acquired taste. The phrase “acquired taste” unpacks to mean, “If you had listened to as much of this music as I have, and knew as much about it as I do, you’d love it too.” This way of looking at it puts the cart before the horse, though. I suspect that listeners need already to have an affinity for ugly, difficult music in order to get very far with listening to or learning about it. Or at least, those who are introduced to it for the first time need to be motivated by a desire of some sort — perhaps the desire for a good grade, or the desire to be surrounded by sounds that express their chaotic, dystopian view of the world.

Meanwhile, most lovers of classical music are happy to subsist on a steady diet of Bach, Haydn, Mozart, Beethoven, and Brahms, with occasional side dishes of Vivaldi, Fauré, and Debussy. I had the temerity to suggest to a couple of my Facebook friends that there are reasons for this, and that the reasons are rooted not in listeners’ familiarity with the standard canon of classical music, nor in a conservative approach to culture, but rather in the nature of the human nervous system. Our capacity to understand music, it seems to me, relies heavily on our ability to perceive musical patterns, and to store them in short-term memory so that their relations to other patterns can be examined retrospectively.

Where music has no perceptible patterns, it cannot be understood. It cannot properly be said to be saying anything. It can express incoherence, rage, bafflement, or ennui, but not much else.

This way of looking at music, which seems quite obvious to me, gave offense to the people I was conversing with. One of them responded, I have to say, rather abusively. He felt it necessary to insult me for having denigrated his beloved art form.

Needless to say, this is not how an intellectual discourse should be conducted. If I’m wrong about how music is perceived (or about how ugly, difficult music is perceived), then fine — please show me where I have erred. Insulting me does not allow me to amend my thinking.

Part of the problem is that when a group of people shares a passionate interest in something, be it a religion, a genre of music, a favorite author, or the success of a sports team, those people tend to look down on those who don’t share their passion. They may react to those who feel differently in any of several ways — by dismissing the outsiders as ignorant, by getting angry at them, or simply by huddling together in their feeling of superiority. If they understand, in some dim subconscious way, that the outsiders have a valid point of view, they’re more likely to get defensive and angry in order to preserve the supposed integrity of their view. This is why fans of opposing soccer teams start riots. On some level, the rioting fans understand that their beloved team is exactly like the other team in every respect.

I think that was what was happening today — not the soccer fans part, that was an aside; I mean the defensive in-group part. I think the fellow who felt it necessary to insult me knows, though he would never admit it, that the music he likes is ugly, difficult, and basically meaningless. That it’s rubbish. If he were comfortable with his love of that music, I don’t think he would have reacted that way. If someone says to me that Bach or Haydn is boring and meaningless, I don’t find it necessary to belittle their intelligence or dismiss them as misguided. I don’t hurl bricks, either real or metaphorical, at them. I just smile and move on.

I don’t find it necessary to display an emotional attachment to this music, because its value is simply obvious. Yes, your enjoyment will be vastly improved if you know more about it and listen to more of it, but its value and meaning are right there, in the dots on the page. No defense of Bach, Haydn, or Mozart is needed.

Instrumental music is a peculiar art form in that, with a few isolated exceptions such as Saint-Saëns’s Carnival of the Animals, it’s entirely abstract. (We’ll leave opera out of this discussion, for purposes of clarity.) For this reason, a piece of music does rely on the listener to understand the idioms that make up its style. Writing in general, and storytelling in particular, is not abstract in that way. One could tell the same story (say, the story of Romeo and Juliet) in a dozen wildly different idioms, and it would still be exactly the same story. A story is not about its idioms or style in the way that a piece of music is.

You could, of course, arrange a piece of classical music — let’s say Beethoven’s Third Symphony — for synthesizers, or a saxophone septet, or a ukulele band, or even a sufficiently virtuosic doo-wop vocal ensemble. It would still be recognizably the same piece. You couldn’t do the same thing with a piece of difficult modern music that relies on sonorities (masses of crash cymbals played with mallets, say) for its effect, because sonorities don’t translate in the same way.

Difficult “modern” writing is far less common, and far less esteemed, than difficult “modern” music. If you aren’t telling a meaningful story in a comprehensible way, readers will toss your book aside. James Joyce’s Finnegans Wake is perhaps the best-known example of difficult modern writing. It’s impenetrable — and it has inspired no imitators at all. Outside of university courses on modern literature, has anybody ever read Finnegans Wake? I doubt it. Why would anybody bother?

But because music is abstract to begin with, composers who feel a need to venture deeper into abstraction have no obvious anchor to hold them back. Anything goes.

If we’re going to blame someone for this deplorable trend, I suppose we have to blame Beethoven. I love his music — but his influence over 19th century classical composition could hardly be overestimated. Beethoven popularized the notion that the greatest music was music that overthrew the earlier conventions of music — that went further. That broke new ground. That revolutionized the art form.

He himself did all that. It was the age of revolution. The American and French Revolutions were fresh news, and the Romantic movement in literature, with its idolization of the Hero, was building up steam. Beethoven’s stance was heroic. He revolted against the polite conventions of the music of the preceding generation, and did a spectacular job of it.

For more than a hundred years after Beethoven’s death, classical composers were gripped with the belief that to be significant as artists, they had to produce work that was new and different and revolutionary. That they had to break fresh ground. Within the confines of the classical style, that became more and more difficult, and eventually their efforts became absurd. Schoenberg jettisoned harmony theory in favor of the 12-tone row, a sterile effort that today is taken seriously only by a few diehards. John Cage went even further, discarding the formal restrictions of serialism, harmony, and form in favor of complete randomness. Cage’s music cannot be comprehended, because it doesn’t say anything. It was meticulously designed so as to destroy any attempt to understand it.

Well, that was certainly revolutionary, wasn’t it? Arguably it was deeply insulting to listeners … but it was revolutionary, no doubt about that. As H. G. Wells said, “If anything is possible, then nothing is interesting.” Cage’s early music is interesting; his later work is not.

While Schoenberg was busily bemoaning the death of tonality, of course, jazz musicians on the other side of the ocean hadn’t received the memo. They were joyously creating entirely new kinds of music that were entirely tonal — that used traditional chord progressions in ways that had never before been imagined. Schoenberg had crawled out on a limb and then sawed it off.

This is one of the problems with the culture of modern “serious” classical music. It insists on taking itself seriously, and as a result it has to ignore everything that has happened in pop music for the past 75 years. Not all classical composers have fallen into this trap, of course. In the 1920s, Stravinsky and other composers in Paris were well aware of American jazz, as was Aaron Copland. Coming from the other direction, Frank Zappa fearlessly shuffled the abstract sonorities of Edgard Varèse into doo-wop and progressive rock. Crossovers tend to work well! But insisting that your music be entirely new and groundbreaking is a recipe for failure and obscurity. New is not necessarily better than old. The golden ideal of progress has, in recent years, been revealed as a sham and a grave danger.

I’m reminded of a pithy observation made some years ago by a blues guitarist named Big Bill Broonzy. (I once asked Chris Strachwitz of Arhoolie Records who had said it. He told me it was Broonzy.) Broonzy was once interviewed by a white musicologist. This was in the 1950s, and we may imagine, if we like, that the musicologist was from Harvard and was wearing glasses with black rims. The fellow, whoever he was, said, “Tell me, Mr. Broonzy — do you consider your blues a form of folk music?”

Broonzy thought for a moment and replied, “It’s all folk music. I never heard a horse sing none of it.” Leaving aside the deft way the guitarist deflected a racist question, we’re left with an important truth: It’s all folk music! Beethoven is folk music, and so are Varèse and Xenakis. That being the case, there’s no pedestal on which to place “serious” classical music of the difficult variety. It’s all folk music. And if you can’t walk out of the dance hall whistling the tune, let’s face it: It’s a pretty pathetic excuse for folk music.

No, you don’t have to compose music that sounds like Haydn or hip-hop, nor to write novels that read like Ian Fleming or Jacqueline Susann, nor to paint the way Rembrandt did, in order to produce work that is interesting. But it is necessary to pursue meaning, and to produce work whose meanings can be comprehended by reasonably attentive listeners, readers, or viewers. By folks.

Unfortunately, the world of “serious” classical music seems to be infested by poseurs who find that challenge too difficult, or at any rate uncongenial. It’s not exactly true that Milton Babbitt wrote an essay called “Who Cares If You Listen?” That title was added to the essay by the editors of High Fidelity magazine. His own title was “The Composer As Specialist.” But in applying that headline they weren’t entirely ignoring what Babbitt was saying. As the Wikipedia entry puts it, “Babbitt’s suggestion in the article for the composer of ‘advanced music’ is ‘total, resolute, and voluntary withdrawal from this public world to one of private performance.'”

As a footnote, Babbitt’s notorious article was published in 1958, the same year Bill Broonzy died.

The world of jazz and pop music doesn’t suffer from this isolationist attitude. Nor is it of much concern among writers of fiction. Writers of fiction understand that if you write gibberish, nobody is going to read your work. Some writers manage to find deep meanings, others are satisfied with shallow platitudes, but word salad is not highly regarded. A mash-up of Romeo & Juliet with Oliver Twist and The Sun Also Rises, sentences and single words chosen at random — or not at random but according to some arcane formula — and tossed into a blender on their way to the printed page, is not going to gain you any followers. Or at least, you’ll have to forgive me if I hope it doesn’t.

I have better things to do than listen to music that adamantly refuses to speak to me. And it’s really quite pointless for anybody to try to convince me that a piece by a “serious” composer of difficult noise music is as meaningful as a string quartet by Haydn. It just isn’t. If you think it is, you’ve redefined the word “meaningful” in a way that makes it meaningless.

That way madness lies.

Bad Memes

There is a parasite that causes ants to crawl out to the tips of grass blades. (It’s neither a virus nor a bacterium, as it turns out, but a flatworm, the lancet liver fluke.) This change in the ant’s behavior is of no value to the ant — the value is to the parasite. A cow eats the grass and ingests the ant. The fluke needs to be in the digestive tract of a grazing animal in order to reproduce.

There’s a similar parasite, a protozoan called Toxoplasma gondii, that causes rodents to become fearless around cats. Same deal. The infection results in the death of the rodent, which is now behaving in an irrational manner, charging around in front of the cat instead of running. The behavior is, however, advantageous to the parasite, which needs to be in the digestive tract of the cat in order to reproduce.

These examples explain a great deal about the current political environment in the United States.

I’ve been reading about memes. The idea is, a meme is not a physical thing like a virus, but it can act in an analogous way. A meme is a pattern of mental behavior, and the pattern can either reproduce successfully by spreading through a human culture, or it can die out. Patterns of behavior that are well suited to the human brain tend to spread. Those patterns are called memes.

The behavior, in many cases, is verbal behavior. An idea that lodges successfully in your brain and urges you to speak (or write) that idea so that your fellow humans can ingest it is going to survive. That’s how the meme reproduces. It’s evolution in action.

A meme that causes your brain to bypass the fact-checking process has an advantage. It’s more likely to survive, because it’s streamlined. Fact-checking is not only expensive biologically (in terms of brain effort), fact-checking can also kill bad memes. So if the meme can bypass fact-checking by appealing to your emotions, it’s more likely to survive.

This is how the idea of “God” has become so pervasive. It appeals to our emotions. The “God” idea has to bypass fact-checking in order to survive, because fact-checking would kill it.

Many conservative ideas survive in exactly the same way: Fact-checking would kill them. Racial bigotry, for example (a very popular meme among conservatives), appeals to our fear of the stranger. Someone who is Not Like Me And My Friends is a source of fear. The meme — the idea that other races are inferior to mine — hijacks that fear and uses it to reproduce itself, spreading through a population.

This morning I got into one of those pointless Facebook wrangles with a fellow who insisted on thinking that there’s a debate about global warming. There isn’t, not really. The details are still somewhat unsettled, but the facts are clear. The polar ice caps are melting (fact). Carbon dioxide is a greenhouse gas (fact). Human activity produces an enormous excess of carbon dioxide (fact). Therefore, humans are causing global warming. Them’s the facts.

What seems to happen is that the meme “global warming is a hoax” has hijacked the emotional mechanism that says, “I’m just as smart as anybody else.” And also, “My friend says this, and I trust my friend.” Those are simple, emotionally appealing ideas. The meme uses those emotions to spread itself. In the absence of those emotions, fact-checking would be a lot more likely to kick in. Fact-checking would destroy the meme.

I feel sorry for people whose brains have been hijacked by bad memes. Also, those people scare me, because they’re dangerous. They’re in the grip of these mentally transmitted infectious ideas, and they’re quite likely to destroy all that’s good in our shared future.

They’re rodents dancing fearlessly in front of the cats. And they don’t know why they’re doing it. Memes are mind control agents. Check yourself.

Dragon Chow

At some fairly early point in the development of my series of fantasy novels (no, they’re still not ready for publication — rewrites are ongoing) I made what in retrospect appears to have been a stunningly bad decision. It made sense at the time, because it has a certain mythic resonance, but I didn’t think it through.

I decided to send my intrepid heroine off on a detour to a land ruled by dragons. The dragons eat people; they consider humans a tasty treat. There are whole villages of peasants whose fate is to be eaten. And of course it’s a staple of heroic fantasy that the hero has to rescue people who are in danger and distress. Leaving those people behind as dragon chow would not make for a Happy Ending.

And now I’m forced to consider what an awful plot problem I’ve created for myself.

To see the problem clearly, we need to do the numbers. Let’s suppose the entire population of human-munching dragons consists of 2,000 of the beasts. Let’s further assume that on average, a dragon eats a human only 50 times per year, subsisting otherwise on cattle or sheep.

That’s 100,000 humans per year. In order to replenish the population loss due to dragon predation, 100,000 new babies will have to be born (and not die due to infant mortality) each year. If a woman has a surviving baby, on average, every two years, the population of peasants will have to include 200,000 women of childbearing age. We don’t need that many men; men are expendable. So maybe the adult population is 250,000, plus more than a million children between the ages of newborn and 14.
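For anyone who wants to check the arithmetic, here’s the back-of-the-envelope version in a few lines of Python. Every figure is one of the assumptions above (2,000 dragons, 50 humans per dragon per year, one surviving baby per woman every two years), not anything canonical about dragon ecology:

```python
# Back-of-the-envelope dragon-predation demographics.
# All inputs are the assumptions stated in the text.

dragons = 2_000
humans_eaten_per_dragon_per_year = 50

# Annual population loss to predation.
deaths_per_year = dragons * humans_eaten_per_dragon_per_year

# To break even, births must replace deaths. If each woman
# produces one surviving baby every two years, you need twice
# the annual birth count in women of childbearing age.
births_needed_per_year = deaths_per_year
women_of_childbearing_age = births_needed_per_year * 2

# Children aged 0-14, at a steady 100,000 surviving births per year.
children_under_15 = births_needed_per_year * 14

print(deaths_per_year)            # 100000
print(women_of_childbearing_age)  # 200000
print(children_under_15)          # 1400000
```

Which is how you end up with 200,000 women, a quarter-million adults once you add a modest allowance of men, and well over a million children.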

Even assuming my heroine can arrange to hold the dragons at bay while a population that size is evacuated, where are they going to end up? We know from news stories in our own world that resettling refugees in already populated areas is not easy to manage. And if the refugees are taken off to an area that isn’t already settled, there’s no infrastructure. There are no stored supplies of grain or anything else in a land where nobody is living. And the refugees won’t be able to subsist by hunting and gathering, because (a) having worked in the potato fields all their lives, they’re not trained hunters and (b) they don’t know what local wild plants are edible.

Fortunately, my heroine has access to well-trained wizards. But I’m not sure how even a crackerjack team of wizards could feed more than a million people on an ongoing basis while they get organized to plant a crop in their new location and then wait months for it to ripen. Even cutting the numbers by 90% still leaves an intractable plot problem: Feeding 125,000 refugees is not materially easier, in a novel at least, than feeding 1,250,000 of them. Either way, you have to have a very large and viable source of food.

This is why I hate writing fiction. I have a fiendish talent for wanting the story to make sense. Pardon me while I go chew on nails for a while to calm down.

The Gods

Here’s a tip. I’m sure I’ve mentioned this before, but it bears repeating: If you’re writing fantasy, or really any type of plotted fiction, do not — repeat, do not — include a god or gods in your story. Gods and plot do not mix.

Why, you may ask. Isn’t a god a fantasy creature par excellence? Well, yes. The problem is, gods are too darn powerful. The mainspring of plot is that the lead character (who we’ll assume is human, or something vaguely humanish) has a serious problem. The actions that the lead character takes to solve the problem form the plot. But a god can solve your hero’s problem with the wave of his, her, or its little finger. No more problem! No more plot! And thus, no more novel. Your novel lies on the floor, twitching faintly. It’s pushin’ up daisies. It’s joined the Choir Invisible. It is no more.

Real theologians and real worshipers have this same difficulty, of course. Why does “God” allow suffering, when he/she/it could prevent it? One popular answer is that suffering is instructive. Another popular answer is that because we have free will (supposedly), we’re creating our own suffering. I could go into detail about the weaknesses in those really bizarre and pathetic theories, but this is not the time or place for it. If you want to put both a god and a theological justification of that sort into your book, you’re welcome to do so — but at that point you’ll be writing a religious tract, not a plotted novel, and it’s likely to be boring. It will bore me, for sure.

Anyway, the supposedly real “God” in our real world never does anything at all. Actions: zero. If you put a god into your novel as a source of some action or other, the god is just another character, however vast or dimly visible. A character who is powerful enough to solve your Big Plot Problem, but who lets your lead character suffer instead, is basically a Bad Guy. That’s an evil character. For a human to triumph over an evil god — that’s a viable plot, I’m sure, but given a god’s enormous powers, it’s not a plot whose details I would be keen to try to work out.

You could also write about the conflict between good gods and evil gods; that would certainly be a viable plot, but it wouldn’t be about human beings. The humans in the story would just be pawns. Most human readers want the human characters to be the movers and shakers, so that may not work too well as a plot. You’re welcome to try it, of course.

Fantasy literature is generally about a world that inherently has some sort of moral order. (Our real world doesn’t.) The elements of fantasy, be they unicorns or vampires, tend to be either good or evil, unless you’re Terry Pratchett, of course, in which case they’re just good fun. But it’s best, I think, to let the moral order be inanimate, implied, or exhibited by ordinary, limited beings. A god will only get you in deep shit.
Make Up Your Mind, Please

Just a quick kvetch today, a follow-up to yesterday’s post about how you have to trust your own vision for a story and not just follow an editor’s suggestions blindly. As already noted, the editor who is working on my novel series is giving me a lot of very useful feedback. But once in a while I find myself twitching, running in circles, or tearing out the few hairs that remain on my old gray head.

On page 139 she says, “Be careful of giving away the plot in dialogue.” In this scene a character is advancing an idea about something that’s going to happen a few chapters later, and yes, the editor is absolutely right. I slashed those sentences, so as to have the eventual unfolding of events be more surprising.

Then on page 144, still in the same chapter, my heroine is sending a telegram (yes, this is a fantasy epic with telegrams) in order to learn something important. She declines to explain to her companion what the telegram is about. And here the editor says, “This scene is laying groundwork for future plot events, but it’s not feeling immediately relevant or like it’s advancing the story.”

Would you like to have your cake and eat it too? Gee, that would be swell. If my heroine explains to her companion what the telegram is about, I’ll be giving away the plot in dialogue. Aarrggh!