Thursday, November 29, 2012

The Subjectivity of Writing

If asked, every American would admit that art is subjective. Beauty is in the eye of the beholder, one man’s trash is another’s treasure, etc. Then they proceed to identify a series of movies, books, and authors that are “just absolute crap.” For the author who actually cares about improvement, this shift of judgment leads to problems. The subjectivity of subjectivity is one of the more confusing aspects of working to become a “better” writer.

I say there is no such thing as quality. “Good or bad” is better said as “effective or not.” When I told my teacher that senior year of college, his immediate response, similar to most people’s, was, “Then how come the other professors and I can watch the freshmen’s auditions and agree on who is good and who isn’t?”

My response being, “Way to be supportive, you abusive hack."

The response, of course, being a silent one.

There is an obvious answer: the three faculty members were notorious for neither having their own opinions nor defending them, each being a special brand of yes-man. So when discussing the talents, or lack thereof, of the students who paid 100,000 dollars to be there, they would systematically start agreeing with each other without even being aware of it.

But to just leave the answer at, “You’re just easily influenced,” would be lying about the reality of the situation.

You could bring several individuals into a room, give them a “bad” story, and, without any prompting, see them agree on not liking it. Though I don’t believe that there is such a thing as actual quality, I recognize when a story is not up to my standards. In fact, if I truly believed that there was no such thing as good or bad, I would never edit a story, because that would be a form of concealing the truth. My first draft would be what my mind is telling me to write about. The child-like view I sought to get rid of is how I truly perceive reality. The typos and grammar errors and Freudian slips reveal more about the English language than anything consciously corrected. The long, boring tangents into daily activities illustrate what my concerns are. Making changes to a draft to make it “better” wouldn’t make any sense.

There are two aspects that control the quality of a work and combat the subjectivity of "good."

One, culture; two, purpose.

Every book is written for a reason. It has a purpose unto itself, an intended point, a goal it hopes to achieve, whether or not the author is aware of it. Even stories of conversation, recollections, and gossip have some sort of motivation. That intention could be as simple as being entertaining or as complex as enlisting help for the starving children in Africa.

No one writes, speaks, or even acts without a motive. We may not notice it consciously, but it’s still there. A book wants to do something—make the reader laugh, like the main character, or even just provoke thought. In that sense, a bad book does not achieve its goals; a good one does.

This, of course, still creates subjectivity in two parts. Not all authors have the same goals, and not all readers will be affected by the same things. If, for example, the goal was to make someone cry, a style that is incredibly effective for one soul is a waste of time for another.

But it can be agreed that a book should do something for the reader, and a story that barely affects him at all isn’t worth a read.

It gets complicated, however, because if we start defining quality by the success of an objective, we would first and foremost have to know what the author’s objective was and assume he’s not lying about it. Secondly, just because an author did what he set out to do does not mean that it will be a fantastic book. If he paid attention only to exploring a problem and none to being at all interesting, he’s hurt himself pretty badly.

Of course, we could argue that entertaining people and getting them to read the book would help fuel his goal, but either way, you get my gist. If an author wants to do something (say, make money) and he does it (say, by using his books to transport drugs across the border), he’s not necessarily going to get the Nobel Prize for that one, nor should he.

The second aspect of agreed-upon “quality” has to do with culture.

Let me put it this way: if a person were to walk into work on Wall Street wearing sagging jeans and a football jersey, would the majority of the population recognize it as weird?

I’m going to answer that rhetorical question with an absolute yes. They may not say it’s weird, they may not even think the actual word “weird,” but they would notice, and, with the exception of a few open-minded liars, everyone would agree that it is unusual.

But why? What is the practical use of a suit that makes it fit for that situation? Little, outside of simple appearance. We expect it because so many people follow that trend, and they really only follow it because of the expectation. Who’s going to buy 10,000 dollars of stock from a man who looks like he is waiting for the Super Bowl? Unless, perhaps, he’s actually going to be playing in the game, but that’s a whole other issue altogether.

Yet it’s not some concrete rule of the universe. Bring someone over from a village in Zimbabwe whose culture has its own fashion laws, and he’s not going to immediately know that the outfit is inappropriate, and he probably won’t be able to tell you what the man “should” be wearing.

It’s the same for art. The brain incorporates certain things as normal and has certain expectations to maintain that normalcy. Of course, and here’s the problem, no one wants to read a book that can only be described as “normal”; readers demand that the author defy expectations constantly. So an expectation is to defy expectations.

But this pressure to challenge the standards of protocol still has a great deal of regulations attached to it. The “weirdness” must be deliberate, and it must have a reason. Why? Because someone who walks into Wall Street with a gold and green jersey to make a statement about individuality is very different from someone who doesn’t know how to dress properly. Unfortunately, it’s sometimes hard to distinguish a statement from naivety, which often gives credit to those who don’t deserve it and discredit to those who do.

Quality is contextual. There are too many people with too many personalities and too many different expectations to appease all of them with one concrete rule. Art cannot be classified as right or wrong; it just appeals to the largest number of people it can. In order to do that, it requires the upkeep of appearances and consideration of basic sociology. An author must be willing to use and manipulate preexisting notions unless his plan is to just depend on dumb luck.

Wednesday, November 28, 2012

Utilizing Normal and the Abnormal in Writing

Both of these words have negative connotations. The label of “normal” indicates that there is such a thing as “should” in cultural and personal traditions and decisions, that people who are abnormal have something wrong with them, and that, at least in America, being normal is the same as being bland.

In writing, however, normalcy is a powerful tool that can solve a whole group of problems once an author understands it.

In a story, “normal” is what the reader ignores. “Abnormal” indicates importance.

For example, let’s describe a bathroom:

Jack opened the door and looked inside. The room was quaint: a white toilet, a white sink, and pink tiles spanning across the floor.

What is the reader going to think is the most important aspect of that description? The pink draws attention. The atypical coloring says more about the owner of the room than anything else described. We have a good indication that not only is it not Jack’s house (partially because of the words “looked inside”), but it is probably a woman’s, or a woman-dominated household. It gives the reader an image that bleeds out beyond just the room, allowing for details of life besides just what he’s seeing.

Which leads us to the first use of abnormalcy. When the author defies some expectations, the world seems more vivid. When the author only operates within the “norm,” the reader starts to remember that he is making this up on the spot.

It is also important to notice how normalcy affects image and “abstract space.”

If we were to describe a different bathroom as:

Jack went to the restroom and looked inside. There was a toilet. He stared at it for a moment before shrugging and walking in.

The important information seems to be that there was only a toilet, the importance being placed not on the object itself, but on the absence of others.

If we were to say, however:

Jack went to the restroom and looked inside. There was a coffin. He stared at it for a moment before slowly shutting the door and walking away.

The important part is clearly about the object. A reader is likely to picture the rest of the room with it, but what is vital for her to know is that there, for some reason, is a coffin in the room.

Where does this come into play? How is this remotely beneficial?

It is a great tool for getting readers to notice and remember things that they might otherwise forget, which, oftentimes, will be a problem. Subtle foreshadowing can often be forgotten, especially in the world of the novel, where the reader will put a book down for days before picking it up again.

When I was in college, the university did a staging of Hamlet. Now, what is important to know about Hamlet is that it is big, in all senses of the word. Four hours long with a thirty-person cast and settings that range from Denmark to England, it can be a beast to put on.

Of course, most people cut it down, removing the whole plot issue of Fortinbras and cutting out the third time it delivers the same piece of information.

In order to do the show, the directors double-cast a hell of a lot of people, especially since the few actors they had were dropping out like flies anyway. This meant that we’d see someone trying to kill Hamlet one minute, then consoling him the next. Or that might have been an actual plot point. I don’t know.

Anyway, the director had an idea. At the end where, spoiler alert, everyone is sprawled out dead, a Player (one of the actors hired by Hamlet) walks on stage, picks up the crown and puts it on his head. Which apparently makes him the king now, because that’s how crowns work.

There were many reasons why the audience didn’t get it. It was Shakespeare, for one. Half the script had been removed. Most had only snuck in the back ten minutes prior to pretend they had been there the whole time. And of course, mainly, no one knew who he was supposed to be because he had played six other characters.

But even if they had single-cast the show, and they hadn’t cut the script, and the ending bit did have a logical flow, it is important to realize that this “Player” was a tall, white, brown-haired man in a cast of 20 other tall, white, brown-haired men.

The audience would not have remembered him from his three seconds on stage to know who he was.

This is where authors can abuse the wonders of racism, sexism, and all forms of visual stereotypes, utilizing them as a tool to maneuver a plot point.

Had he been the only black man in the cast, they would have remembered him.

Partial joking aside, this is something important to realize. Authors and filmmakers have a tendency to assign “normal” traits to their characters, with the exceptions of deliberate choices, jokes, and niches being filled.

For instance, when we watch a movie, the protagonist and the guy who gives him his coffee could easily be mistaken for one another, as long as that protagonist isn’t of Gerard Butler fame. There will be a few stocky, balding men, but only as the funny sidekick, rarely as a random extra.

Extras, of course, need to fade into the background, so it makes sense why we wouldn’t want a “strange” looking man on the subway; he would distract from the action.

Of course, part of the idea of normalcy is that it is normal to have abnormalcy in certain situations. Basic stereotypes come into play, so it is common for us to see Persian taxi drivers with no lines or motivation for the racially insensitive casting. (Allegedly insensitive. Alleged by me. Because I’ve never heard a Persian actor complain.) Normal is contextual, obviously, and that needs to be considered as well, which requires us to examine our more insulting thoughts. It also becomes a huge problem because “normal” for fiction and “normal” for reality are often very different.

So, when I say others have the tendency to make their characters all similar due to “normalcy,” it is, of course, a generalization that only requires a casual glance to keep in check. It is also more important in visual arts, because often a novelist can get away without describing a good portion of his characters, and it is even more important in drawing than in movie making because, hey, you’re not going to find a hundred actors who look exactly as we think they should.

To reiterate, there are three ways to utilize an understanding of normalcy:
            1. Draw attention to something.
            2. Draw attention away from something.
            3. Create more diverse and detailed worlds.

Our brain’s efficiency depends on this ability to gloss over the ordinary in favor of the new. It works by filtering thousands of pieces of information, learning to recognize what it can ignore and what it should think about. An author who wants to manipulate feelings, deliver information, and tell a story in the best way possible must consider the controversial and contextual concept of what we perceive as normal.

Friday, November 16, 2012

Choice versus Mistake

When sitting in a creative writing class, or any sort of formal group discussion, a common problem is the fixation on what makes a work unique. With things like the sparkling vampires of Twilight, dual titles, or the decision to kill off a beloved character, the lines between choice and mistake are blurred. Is the Styrofoam Edward something that “ruined” the book? Or is it just something noticeable, the more subtle mistakes being ignored in favor of the obvious?

Choices can destroy a concept easily, and changing a decision like a disco-ball vampire could be the easiest form of improvement. However, that doesn’t mean it is the best form. People lean toward criticizing choices and risks over subtler and maybe more influential details.

To give a less controversial example than Twilight, let’s talk cursing. Swearing is the perfect example of a preference that zaps all attention to itself.

When reading the play God of Carnage for the theatre I worked at, I found the (irrationally popular) play boring and painful. There were many things to be said as to why, such as its predictability, its point being to hide the conflict for the first half, and its unlikeable and stereotypical characters, but I found one very interesting thing when the members of the board had to talk about it. They only gave a fuck about the “fucks” given. Twenty minutes of people complaining about the language and nothing else. The play was rejected for the season and that was the end of it.

Can swearing hurt a book? Absolutely. But it is still a matter of taste. It will hurt a story in that people will not like the choice being made, not because of the negative consequences swearing has.

Mistakes are things like typos, breaking continuity, sounding like it’s being made up on the spot, and essentially anything that directly leads to an undesired response from the audience, of which there are five: boredom, confusion, wasted time, insult from pretension, and expulsion from immersion.

Some of these are more subjective than others. An author will never want the audience to be bored, but every once in a while confusion can be fun.

What does swearing do to the audience? It makes them mad that the author chose to swear. What do sparkling vampires do to the audience? They make them mad that the author chose to make the vampires sparkle. Neither is dull (unless we were to go on and on about it), neither ruins the clarity of what is happening (except as to the reason why), these tiny details don’t distract from the point of the story, and it’s not like the audience is complaining about being condescended to. It could be that it’s just unconvincing and jarring enough to remind you that someone is choosing these things, ruining the consistency of the world, but really, if you’re that far into the book and you find this hard to believe with no qualms about anything else, you’re not going to care.

Like most things in writing, there is no obvious line between the two. Sometimes a choice is also a mistake; e.g., infusing the token female into a script may lead the audience to be expelled from the story, thinking only about why the author chose to make the character a girl. Often, what makes the book unique from others of its kind may not be benefiting the audience’s reception. Thus it becomes harder to decipher the difference.

Why would we bother though? What is important about identifying a choice or a mistake? Nothing much in the writing process, but when it comes to editing, it can solve a lot of internal dilemmas.

What is the difference between being yourself and brashly being stubborn? What is the difference between selling out and playing the game? What is the difference between keeping your voice and being arrogant?

Recognizing that people will often comment on the obvious before the important gives a clear indication of what direction we need to take, whether the advice is unusable, or whether it just needs to be dissected before it can be implemented. It tells us that advice we’re put off by doesn’t have to be taken as-is to be usable, and that pushing further into the conversation will bring out more important notes from the reluctant critic.

Let’s take a more convoluted example into consideration. A story has a lot of characters. That in itself is a stylistic choice that has no direct consequence or reward without context. When receiving feedback, the author hears, “You have too many characters.”

Now, as a choice, it means that it’s a matter of taste, which indicates that she may receive the opposite response from a secondary critic. This flaw in one man’s eyes may be a virtue in someone else’s. So it isn’t a “problem” per se, but that certainly doesn’t mean there isn’t one. A mistake needs to be changed; a choice doesn’t. In this case, as the number of characters is a choice, finding out the underlying problem (“I was confused by all the characters” or “I was bored because of all the different back stories”) will allow the author to take that specific suggestion or solve the problem in another manner. It may not be optimal to remove cursing from a novel on the grounds that having a bunch of people in prison speaking like they’re in a book club won’t particularly benefit the atmosphere, but it might behoove the author to have a character who doesn’t swear and gives legitimate arguments why others shouldn’t. It also, of course, might be a hopeless cause that the writer realizes he’s not going to win.

Actually utilizing feedback is not as easy as the outsider tends to believe. It is not just about eschewing egos or emotionlessly analyzing the advice. Quality is subjective; it constantly changes, not just with each different viewer, but with their moods, their ages, their cultures, their perceptions, and with comparisons. Not only does accurate editing require a great understanding of self, but of others, and often, feedback is not cut and dried. If my fellow writers in my peer groups and classes were to take every piece of advice they got, they would create the most homogenized work this side of binary. But if they were to ignore it all, there would be no reason to be there at all.

Tuesday, November 6, 2012

Writing is Not Like Losing Weight

From fairy tales to modern media, from the Ugly Duckling to Buffy the Vampire Slayer, from Ratatouille to Star Wars, stories sing the praises of innate talent over any sort of hard work. There are obvious logistical reasons for this; it is easier to write a story that says success is due to ability than one that says “this is how you succeed.” A novel’s plot line, in fact, goes, “a character is trying to do something,” and “this is why it is hard,” i.e., in literary terms, the objective and the conflict. In many circumstances it is something that not everyone could do, and, if the author wanted to indicate success was due to actions, not fate, the character would have to take actions that other characters wouldn’t have thought of and done themselves, and, furthermore, that the readers haven’t tried themselves.

American culture is strange because it is supposed to be the place “where the streets are paved with gold,” where anyone can build themselves up to a better life. If you ask one of us if we depend on fate, we will say no. If you ask us if practicing is important, we will say yes. And yet, if we were told that we are terrible at something, we would give up. Teachers will make horrible claims: “Out of all my years of teaching, I’ve only seen one student who could direct.” Stephen King says that a bad writer will never become a mediocre writer and a good writer will never be a great writer, but maybe, just maybe, a mediocre one can become good. Discouragement, and legitimizing discouragement through “you’re not meant to succeed,” is too popular for its own good.

The main reason it is so hard to get students to go the extra mile in art is because of this subconscious belief in fate. If we are meant to do something, then we should have been given the talent. The greats, the elites, the Gods who have succeeded, are not the same as us. They make it look easy. “Are you telling me that every actor on Broadway sits down and marks where he wants to speed up and slow down?” a student asked me once.

And there is the problem. We treat bettering ourselves at the arts like losing weight. You eat right, exercise a lot, and, with a great deal of effort, willpower, and care, you manage to get to the place where you want to be. But then, if you want to stay there, you must keep it up. You start to slip, you go back to your old ways, you go back to your old weight. If you don’t put the same energy into it, you’ll be back where you started. Once you begin to try, you must keep trying. You will be held back by your own genetics, forced to try just as hard to keep up with, let alone surpass, those with higher metabolisms.

This isn’t true of practicing. We practice so we don’t have to practice. An author who requires a lot of extraneous, outside work, who needs to sit down and diagram sentences, make luxurious outlines, conscientiously motivate characters, and spend hours making the right word choice, will not have to do that for the rest of his life.

Say, for instance, the writer overuses words. He takes the subject of the conversation, in this case, “writer,” and proceeds to repeat it in every sentence. He does not bother to think of synonyms or assume that the reader will know what he is referencing. It sounds like he did not carry the last thought into the next sentence. In reality, it sounds like he is writing rather than speaking. This is a common problem and one readily solved. At first he will fix it in the editing process; then he will begin to watch for it. Then he will begin to hear it without thought. Over the years he will start to fix it without even having to pause. Though in the beginning he needed to reread it, circle it, look it up, and change it, toward the end it is likely that he won’t even need to take a moment to consider it.

Of course the problem won’t vanish entirely. Every once in a while he’ll happen across the mistake, or will have to spend a long time thinking of a synonym, but my point is, the effort won’t be nearly the same or as constant.

If someone is a terrible writer, they can get better. Everyone does get better. In fact, everyone starts out terribly. If they didn’t, publishers would just go to kindergarten classes and take samples. Now, some people will take longer to get better. Some people will quit before they can. Some people will refuse to try to improve. And some people have further to go than others. But if this is truly a passion, it shouldn’t matter where an author starts out. Talent isn’t a gauge that some people get a head start on. Talent isn’t something that a person can bank on. Art is too subjective to sit there and say, “Well, he’s already ten points up on you, so you might as well not bother.” Not everyone works hard, not everyone likes it, not everyone has your point of view. Art is about a variety of styles, subjects, and perspectives, each of which requires different abilities, and no one starts out with all that’s required. The problem with fatalism is that it discourages trying, for those with talent or without, and sometimes, trying is the only real talent that stands between success and failure.