I was arguing with a friend. It was one of those stupid, horrible arguments where one moment you’re pulling out of the Target parking lot, wearing your cool new sunglasses, two beautiful women squeezed into the bench seat in the cab of your truck, and the next moment everybody’s sulking and the only exchanges are muted monosyllables about where the next turn is. There is never — never — a good reason to insult the beautiful women squeezed into the bench seat of your truck. That’s just foolish.
So, ladies, this is for you.
The real argument started days earlier. I was home on leave from Iraq, with nothing but an assault pack and a laptop case and about one-and-a-half sets of civilian clothes, plus one sweater bought by my mom that looks like a sweater your mom would buy you. The three of us had gone shopping to find me some threads. We started at the Salvation Army, where they vetoed the one and only thing I tried on, a ratty old cardigan. When the thrift store yielded no fruit, we continued on to the mall. We ended up in Banana Republic, where they found many gorgeous things for me to wear, most of them well out of my price range. Apparently, I have expensive taste — I didn’t like any of the $25 shirts, only the $60 shirts.
But — and here is where I was not fair to my personal shoppers — I also secretly despise my expensive tastes. Because it seems that what I like is upscale clothing that looks like you could have found it in a thrift store. I like Weezer Chic. I like retro patterned shirts, I like sweaters, I like corduroys, I like Chuck Taylors, I like flared leather pimp coats. All of which, except maybe Chucks, are now being sold at outrageous prices in suburban malls so that kids with lots of money can look like rock stars who used to dress out of the salvage bin because they didn’t have any money, which gives me all sorts of heebie-jeebies about legitimacy and authenticity and so on.
We irritated the Banana Republic staff by taking out tons of stuff and not buying any of it. Eventually, my personal shoppers gave up and sent me to Old Navy and Target by myself, saying gently, “It kind of seems like you’re just going to buy what you want anyway.” To prove my shopping bona fides, I brought home a nice pair of jeans and a very pretty, subtle white-on-white dress shirt. Otherwise, I bought largely in my standard mode, Nondescript Casual.
So — floating around in the back of my head are all these issues with class and authenticity, and there’s me in a happy new pair of sunglasses ($10 at the hippest local secondhand shop), and we’re in Target again for something else, and the ladies demand, in a firm, teasing way, that I take off the sunglasses now that I’m indoors. It’s a must, they say. Looks kind of egotistical, they say. We’re here to keep you from committing this kind of fashion faux-pas, they say.
I should let this go.
We’re in the truck, almost shouting at one another, about whether there are “rules,” about whether dress matters, whether style matters, me saying it’s a superficial concern and it’s patronizing to tell other people what to wear, her saying that it’s not patronizing to help people look nice and feel good about themselves. Then I say that style in clothing is not an objective thing, that it changes with the moment, and also that when you say that someone’s clothes are hideous or wrong or that they’re committing a fashion faux-pas or that they’re breaking some kind of rule you’re coming really close to making a value judgement about someone’s character.
“I don’t, actually,” I say. “I really don’t ever want to make those kinds of judgements. I reserve the word ‘bad’ for work that’s dishonest, that’s actually immoral in some way.”
Frustrated by my obtuse position, she says, “What about language? You judge people all the time who break the ‘rules’ in language — you know you do. If someone has poor grammar you pass judgement on them!”
“I…” [pause for moment of honest reflection] “…try not to.”
But she has me — although that doesn’t make either of us feel any better.
I admire precision in language. I strive for it in my own writing and even my day-to-day speech; it genuinely bothers me when I find myself unable to summon the words to communicate exactly what I mean, when I have to settle for old formulae or, worse, have other people try to guess at the ends of my sentences. And I passionately believe that the rules of language, though arbitrary, provide a means of constructing precise, communicable abstractions of our thoughts, feelings, and spiritual movements. The better we understand these rules, the more easily we can be understood and understand others. Sure, most of the time a reader can figure out from context that you meant “their shoes” instead of “they’re shoes” — but it slows down the process of communication, makes it a heavy, leaden activity, one to be avoided, or engaged in only to the very least degree possible. Or consider the use of punctuation in the last sentence, the way the em dash provides us with a dramatic break between a thought and its antithesis, while the commas separate three subtly different passes at the same idea. I am not talking about the studied rhetoric of the Sophists, but the natural, God-given possibility of expressing a novel thought cleanly and straightforwardly.
Fashion and art are probably also kinds of language. Indeed, there’s a compelling argument to be made that they may have been the first language.
Spoken language requires a complex set of evolutionary processes to have occurred before it’s even technically feasible, and it requires social transmission over time to be sensible. This process seems to have begun several hundred thousand years ago, as our ancestors developed into anatomically modern humans. But spoken language may not have developed until quite recently — perhaps as little as 50,000 years ago, suggesting quite a bit of lag time between the physical potential for speech and the development of language itself.
But there are, surely, other kinds of communication through signs. We evolved, as hunter-gatherers, to distinguish fine visual details, in order to recognize plants, dangerous insects, predators (or prey) hiding in the grass, and so on. The mental processes for distinguishing practical meaning in subtle visual signs probably developed early on. But at what point did we invest visual signs with abstract meaning?
At an ancient burial site in Israel, the final resting places of seemingly post-Neanderthal humans were marked with red ochre, a clay-based paint, suggesting that the people of that time had made the conceptual link between a particular visual symbol and death or the dead.
It is, of course, entirely unknown what meaning may have been vested in the color — whether it was intended to honor the dead, or merely to put a seal on the exit door of life (as if to say, “Yes, we believe this state is final, and these people will not come back to us”). But the link itself is interesting. The idea of marking the dead, of leaving a sign for future visitors to see — there’s a depth to this that already leaps out of the merely animal realm.
It may be that we started using visual signs as a way to send practical messages. Imagine a group of paleolithic hunters, versed in tracking, alert for signs of edible game. Perhaps there is some tell-tale way in which a gazelle breaks a twig as he cuts his winding path through the forest. Certainly he leaves hoofprints, spoor — to our single-minded hunters, the ancient world’s equivalent of the CSI team, even the way a leaf has been chewed reveals the identity of the chewer. Is it not possible then, that hunters would develop a system of imitating or reproducing animal signs, eventually abstracting or elaborating them (say, a pyramid of small stones standing in for a pile of droppings), as a means of passing messages to one another? “Deer here.” “Wolves close by — watch out!”
I like to think that the human interest in art may have begun in gift-giving. At first, of course, gifts as a sign of friendship, kinship, or love may have been largely practical — food, tools, and so on. But once human minds were capable of making the leap between physical objects and abstractions, it’s possible that even tokens representing practical gifts became common. Gourds, shells, flowers, and so on may have represented the foods associated with them. A natural next step, then, would have been for the beloved to carry or wear these tokens of love, whether by way of returning the signification or merely as a status symbol — “Look how many people love me and would give me their food!”
This, of course, leads us to another possible evolutionary reason for the development of art — sheer ostentatious display, after the manner of the peacock. A noticeable and intricate display, costing time, effort, and brainpower, indicates an abundance, indeed a surplus of fitness, good genes and possibly the ability to care for young. The guy wearing the nicely tailored lion fur is indicating that he’s handy, smart, and probably fun at parties, while the guy wearing the sloppy, hastily-assembled, and odd-smelling hyena pelts is clearly lacking in ambition or talent or both — unless, of course, he makes those simply fantastic horn carvings, in which case his slovenly dress is perhaps mitigated.
Perhaps long before we ever spoke, we were developing other kinds of language, in the objects we made, the music we played, and of course the clothes we wore. From then until now, self-decoration has been the most universal and most significant visual art form, the one in which we all engage, to some extent or another. What you put on your body is a sign or a sign-system that describes you to the outside world. You can use that sign-system ironically, of course — hip young artists who develop a fondness for preppie clothing from the ’50s — or deceptively — clean-cut serial killers — or playfully — David Bowie’s experiments in androgyny — but one’s appearance is always a sign, always a kind of language. Its vocabulary is certainly less arbitrary than that of spoken language (for example, certain materials, such as gold, hemp, and leather, have economic or political importance beyond the realm of fashion), and its ability to signify is hardly less effective because its grammar may change over the decades or centuries. There is, then, a genuine science of fashion — a real ability to read its codes and manipulate them to communicate, both through direct signification (the “X” caps that were popular in the ’90s) and through the creation of an aesthetic sensibility or persona (see Bjork).
Do I judge people for an inability to use language? It’s hard to say. Certainly languages have rules — the very structure of the brain demands certain categories. Parts of speech, for example — there are in every language I know actors and objects (nouns) and actions (verbs). I know of no language which goes so far as to have separate words for, say, a dog as the subject of a sentence and a dog as the object of a sentence, though the word for “dog” may have different prefixes, suffixes, or infixes depending on its role in the sentence. Similarly, verbs may in most cases be passive or active, but there are not, generally, two separate categories of verbs for the two functions. So there are, seemingly, stable categories in the human brain for nouns and verbs. Then there are prepositions, which, though they rarely have easy counterparts in other languages (“ala” in Arabic, for example, sometimes means “on,” but it can’t be used in all situations calling for “on” in English, and sometimes means “above” as well), are nonetheless a distinct category in almost every language.
On the other hand, some seemingly stable categories turn out not to be so. In Korean, every adjective is also a verb (“The car reds” is a perfectly legitimate sentence in Korean), while a huge percentage of the verbs follow the formula, “to do [some noun].” There is, in other words, no verb “swim” — Korean can muster up only a noun from the Chinese that means “swimming” and the “do” verb pasted lamely on the end.
So even the basic building blocks of language are not rock-solid — is it any wonder that within our own language, all rules should be plastic, subject to transmutation? Indeed, setting rules only invites poets to break them. Partly this is because creative re-thinking of the rules enables some dramatic effects: e e cummings is, of course, famous for abandoning capitalization and severely convoluting his word order, and Stephen King often uses (some would say overuses) sentence fragments in the way a hip-hop DJ drops fragments of music into a beat.
But just as significantly, both ordinary people and artists often reject the established norms of language as a way of rejecting the ethical failures of the dominant culture. Black abolitionist writers — Frederick Douglass, Harriet Jacobs — and early civil rights activists — W.E.B. Du Bois, Booker T. Washington — and even literary writers like Richard Wright often wrote about the most horrific injustices in conservative, drawing-room prose, perhaps in part to prove that black writers were the equals of whites. But there was always a counter-strain in African-American arts and letters, starting of course in the subversive songs of slaves and finding a fiercely independent voice in leaders like Marcus Garvey and Malcolm X, who in their roles as preachers folded uniquely black cadences into their nonetheless disciplined prose. Black English, a variety of Standard American English influenced by West African languages (especially in its conjugation of progressive verbs), initially arose as a dialect like any other. And while the grammatical features of Black English have rarely been adopted by the American community at large, its vocabulary, especially its slang, has been a wellspring of American expression since at least the Jazz Age. But in the 1960s and ’70s it became a kind of language of rebellion, a cultural marker that at first bestowed “cool” on its users as black artists rose to prominence, but that later also signified a certain stand against the establishment. Over time white kids from Andover began sounding like jazz club owners or, in extreme cases, Stokely Carmichael. And to this day, the fact that white suburban boys selected the language of Snoop Dogg as their patois of choice has not robbed Black English of its revolutionary spark — or at least, using certain elements of Black English is still a necessary proof of authenticity in certain fields.
No one expects Chali 2na or Black Thought to write a letter to the electric company in the same kind of English they use in their songs, while rappers who rhyme in perfectly Standard English are inevitably novelty acts.
Fashion, of course, has its own rebels. Some are just innovators. But at times people deliberately reject, not merely the old-fashioned, but the idea of “looking good” altogether. They do so often as a way of rejecting a dominant society that, perhaps, has rejected them. The punk movement is the most prominent recent example — many people were attracted to punk as much for its creative rejection of norms of beauty as for its two-chord attack on the pretentious rock music of its day. Suddenly it was okay to be bald, to be scarred or tattooed or pierced, even to be fat. It was okay, even preferable, to look like you had just climbed out of a dumpster. For women in particular, perhaps, the strategy of radical, planned ugliness was liberating, and for everyone involved it had the added benefit of putting a thumb in the eye of a social system perceived as corrupt, dehumanizing, and moribund.
But the people I was thinking of during my ill-fated argument are not really the punks, who are more than capable of defending themselves. (And in any event, punk style quickly wrapped around to touch the other end of the spectrum — a 12-inch mohawk is arguably as high-maintenance and image-conscious as anything in runway couture.) I was thinking of the middle-aged man who wears a Speedo to the beach, the pudgy girl in the midriff-baring shirt, the guy who wears black socks with sandals, even in summer, the Asian kid with the scruffy beard that never quite grows in. I want to take a stand for mixed plaid, for denim shorts, for too much green eyeshadow and bright red lipstick. I want to be the champion of the mullet.
Dress and grooming are our most literally personal forms of expression — when I decorate my person, I can’t help but reflect some conception of myself as a whole. Perhaps I dress sexy, or dress for success, or dress like a rock star. Maybe I dress down to show that I’m relaxed and friendly — or maybe I always wear a suit in order to be the dominant presence in any room. If I am a doctor or a soldier or a judge, I wear a uniform imparting to me the special authority of my role in society. If I am part of the Nation of Islam, I wear the distinctive bowtie. If I am Eddie Izzard, I cross-dress. And so on.
The problem is that all of these are forms of signification. That is, in each case, I intend to communicate something to the outside world about myself. But dress may be the only kind of communication which has no form of silence. One may speak or not speak, write or not write, and very few of us do either one all the time. Still fewer make art, and indeed making and sharing art is basically an invitation to comment. But everyone wears clothes. (Nudists hardly count, as their nudity in a clothed culture is very obviously a kind of statement, while nudity among tropical peoples is almost always accompanied by elaborate decorative processes of other kinds, including body paint and outlandish penis sheaths.) There is no neutral option, no way to avoid being judged on this expression of self, which may be no expression at all, merely what’s comfortable or what’s clean, or simply what amuses or pleases me at the moment.
Perhaps it’s just my inner nerd standing up for himself, but I am almost irrationally belligerent about people criticizing the way I or anyone else chooses to dress — no matter how mild or well-intentioned the criticism. It’s a bit of a hot button, but I think for good reasons. I have no objection to style — I actually like to dress up and attract attention, like anyone else. But when I do it, I’m in the same frame of mind that I’m in when I make art; I’m consciously trying to elicit a response in someone else. In fact, whenever I dress up or wear something a little flashy, I think of it as costuming, and I get the same giddy feeling of impersonation that I used to when I acted on stage in high school. But the rest of the time? Do I want to be on stage every hour of the day? Do I want to perform for an audience every time I step out my door? Personally, I wish we all wore gray utility jumpsuits except on special occasions. (You can see, of course, why the military is in many ways the right profession for me. On the other hand, no one is more hypercritical about appearance than a soldier, which suggests that even in a jumpsuit world people would seek difference and distinction.) Failing that, I often dress, as noted above, in a nondescript style that doesn’t attract either positive or negative attention.
That wasn’t always the case, to be sure — friends who’ve known me since high school can testify to the Afro, the loud Hawaiian shorts, the dangly earrings, the pink do-rag, the Powdermilk Biscuits T-shirt that ended up in my official senior photo, and the ancient red-and-blue-striped polyester tie that was always on my person somewhere (a token from my first major crush, a girl with her own quirky sense of style). None of these things was intended as a countercultural statement per se — my political awareness came later, and my status as a fashion misfit probably contributed to my leftist leanings, rather than the other way around. No, I simply wore what I liked, what I thought looked good, or just what made me happy.
And if what made me happy made others laugh…? Well, I was hurt by that, but I tended to dig my heels in — nothing would ensure that I’d wear something over and over again like critical comments from teachers or peers.
Since then I’ve mostly retired my stubborn streak, or at least tried to let it do its job on what I hope are more important issues. Anyway, surely we’ve all left high school behind by now. But this issue comes up again and again in adult life. When I was seventeen, I spent a month or more applying for summer jobs, to no avail. I eventually ended up “working” for my parents, painting the house, which was mortifying for me and, I think, disappointing for them. The following summer, I cut off my long, curling ponytail, and I was able to get a job easily. I have no way of knowing what other factors were in play — the maturity and confidence that I’d gained from a year of college, a slightly different economy — but I can’t help feeling the hair was the kicker, for two reasons. First, even as late as 1992, suburban Atlanta could be a conservative place. Second, I think my simple, rough ponytail, which in any event never quite contained all my curls, was a signal, perhaps, of a decision not to pick a hairstyle, not to care for my hair in any particular way. When I went to a hairdresser and got a fashionable cut, I was signalling, on an almost subliminal level, my willingness to participate in the normal rituals of social behavior. The question of whether I was signifying this more to my potential employers or to myself is probably the heart of the matter.
Perhaps because I’ve learned to tone down and, when necessary, fit in, I find something admirable in some people’s ability to dress as they choose, however “badly,” with whatever lack of grace or taste. We are not all artists, should not all be required to submit to criticism (whether overt or implicit) for our aesthetic works. Let those who put themselves on display for evaluation receive all fair judgement, but as for the rest, let us try to turn off that part of our mind that compulsively seeks communication in every detail, whether such was intended or not.
But what about actual communication? I accept the punk aesthetic in language — I accept the creative rejection of norms for a higher purpose or to take a stand for individual liberty. But can I accept the abandonment of those norms by those who simply don’t care?
My well-dressed friend called my number pretty well. Mistakes and sloppiness in language grind at me — even my own. The other day, after sending an email, I noticed that I had ended a question with a period. That’s going to bother me for a while. I often re-read my blog posts and correct errors, or even bad phrasing, months after they were posted.
But I proofread these posts because they’re intended for public consumption. The real question is why I should care about spelling, punctuation, and grammar mistakes — my own or others’ — in private emails, or during chats. I’m a notoriously slow IM chatter — and, in fact, I don’t really like the medium — precisely because I write full sentences and paragraphs. In this way, perhaps, I’m like the above-mentioned person who always wears a suit, only I’m wearing it to the beach, in the sauna, on the farm…. You might think this is just a writer’s tic, but I’ve known plenty of writers and academics who in their professional lives use scrupulously correct English, but who in their personal correspondence write prose as unkempt and rumpled as, well, a stereotypical writer or academic.
Studying languages has mellowed me a little on this point. Six years of constructing rickety conversation and feeble prose in Portuguese, Korean, and Arabic have given me some perspective on how badly, exactly, you can express yourself and still be understood — how little information, really, is carried in the average sentence, and how easily people can guess your meaning. For basic life needs, at least, and for most expressions of concrete reality, accurate and exact phrasing is just so much swag.
Yet I still aspire to correctness in foreign languages, just as in my native language I uphold it on occasions both appropriate and ridiculous. Out of politeness and a general unwillingness to interfere in the affairs of others, I don’t point out people’s mistakes unless they ask for help. But I’m marking them, in my mind. Each one eats at me a little.
Surely this is a natural and human tendency, as natural and as necessary to our survival as picking out the good fruit from the rotten. Human beings, weak, slow, and near-sighted, survive largely on their judgement, especially on their ability to spot deviations from patterns. On the other hand, perhaps it’s time to let that instinct go a little — or at least to alter it, so that we can clearly distinguish between the aesthetic (or even technical) error and a character flaw in the person who made it. Maybe it doesn’t matter whether people in their everyday speech and writing can’t float a thought elegantly or speak in complete paragraphs. Maybe it doesn’t matter any more than whether they wear overall-cutoffs and suede boots.
I couldn’t help thinking, there in the truck and later on reflection, that something about language does matter — that respect and love for language matters more than respect and love for one’s own appearance. I think the love I bear for a good sentence or paragraph or a well-fashioned lyric has something to do with the external and abstract nature of speech. When I dress well, that is, it seems to me, the end of it — I have decorated myself well. It may be pleasing to others, and there is a moral good in that. But when I have expressed a thought well, I have created (or at least been a conduit for) something new, and then I have decorated that external thing, and given it as a gift. Once the thought is out there, I may receive praise or shame for it, but in truth it no longer has anything to do with me; it belongs to the community.*
But when I wanted to defend this view, I failed to express my thoughts clearly. How pitiful is that? I certainly failed to make my words pleasing, or in any way like a gift. And because my pride was still bound up in a perceived slight to my fashion sense, I failed to let the idea go out from me and become a common possession. Instead it was something I held onto with all my strength and fought for. And so of course there is a moral component to linguistic competency: if your motives aren’t pure, it poisons your prose. Being incompetent with language doesn’t necessarily make you a bad person. But the least trace of selfish motive can easily corrupt your ability to express the truth.
We make up, of course, my friend and I. (The other young lady, who’s been trying to mediate, seems to come out of it unscathed, if a little rattled.) I tell her that I have a hair trigger about people criticizing me for being different, and she tells me that she has a hair trigger for people de-valuing what she’s good at and implying that she’s dumb. We brush it off, although a day or two later she jokingly mentions that now she feels awkward shopping around me. A few days after that, when she’s complaining good-naturedly about how expensive it is to buy stage clothes as a musician, I offer to teach her how to dress like the Ramones for next to nothing. We put it behind us.
But just to be safe, I take off my sunglasses indoors now.
*Of course, there are obviously many perfectly valid forms of expression that can’t be placed into purely external objects. Acting, for example, is always intimately tied up with the physicality and personality of the actor. But I think painters and writers are allowed at least the illusion that we could set our works down in the middle of the woods, with no name anywhere on them, and they would still mean as much to whoever found them as if that person knew who we were and what we looked like. This may be bunk — certainly Harlan Ellison’s writing is very much about Harlan Ellison, whatever he may say to the contrary, and of course the whole stock in trade of the memoirist is his own personal significance. This distinction, therefore, is obviously mostly a psychological one, and surely there are people whose creation of an externalized persona and obvious visual style is dictated by unselfish motives. Also, see excellent comment below about the breast cancer survivor.