I’ve learnt not to post anything controversial to Facebook. I’m conflict-averse, like any good native-born Midwesterner, and I also think that the medium of Facebook is inappropriate to debate. It’s not well designed for it. Kittens and puppies, I always say.
But sometimes it seems I can’t help myself. And so today I posted an article about belly-dancing that struck a chord with me. You see, one time I saw a performance as part of a larger event that appalled me. I didn’t know going in that the belly-dancing would be part of the evening’s festivities. And when these white women swiveled out onto the stage, not in haremesque attire associated with the art form, but in kimonos and geisha makeup for a “kabuki-inspired” performance, I raged out of the auditorium. I had fooled myself into thinking that we had somehow got beyond yellowface.
Now, this Japanese take on a minstrel show was beyond the bounds of decency. But it made me think: what about belly-dancing itself? Many performers are not of Middle Eastern descent. Is it okay for them to practice this art?
To answer my question, I just started paying attention to what my friends of Middle Eastern descent had to say on the subject. Not that belly-dancing came up in conversation all the time, and not that I broached the subject with them. But on occasion, a snippet of opinion surfaced, and, over time, I pieced the snippets together.
And the consensus was that it was not okay.
And this is the sort of thing that often has creative types like myself up in arms. An aesthetic can’t be owned by one culture to the exclusion of all others, so the argument goes. If so, we wouldn’t have English-language haiku, or the Asian influences present in Impressionist art. And without the intermingling of European and African influences, we wouldn’t have jazz or rock. So much would be lost, the argument continues, if we all held to some strict, politically correct standard of artistic segregation. Besides, the artist should be completely free to use whatever methods or aesthetic she wants; creativity is paramount.
I argue that there is something more important than creativity–yes, even for artists. For there is an identity more fundamental than “artist”: human being. And for human beings to survive, let alone thrive, they must be able to live and work together in community. Our social nature, our ability to think in terms greater than the individual, is one of the chief reasons we have evolved to this point, and is key to our continued survival.
Respect is also the key to understanding the concept of appropriation. And the key to respect is listening. Simply put: if the consensus of a group to which you do not belong is that it’s okay for others to make use of an artistic expression originating in or representative of that group, go for it! Have fun.
But if the consensus of that group is that an expression is not okay, knock it off.
To the best of my knowledge, there has never been a groundswell of discontent from Japanese people about speakers of other languages using the form of haiku — even as the form is sometimes stripped of its original intent as a meditation upon nature.
The presence of East Asian influences in Impressionist art came out of the larger European movements of Orientalism and Internationalism in the late 19th century, which developed as a direct result of European colonization in East Asia. It’s important in the study of the Impressionist era to bear this troublesome history in mind. However, to the best of my knowledge, there have not been any recent calls from Chinese, Japanese, and other East Asian artists to dismiss Monet’s Water Lilies or Van Gogh’s The Starry Night in the way we now do, say, Al Jolson in The Jazz Singer or Mickey Rooney in Breakfast at Tiffany’s. (Though we might want to talk about Gauguin’s objectification of Tahitian women in his work.)
With regard to the musical examples I offered above, jazz and rock, it’s important to bear in mind that artistic movements do, indeed, develop organically. Cultural cross-pollination created jazz, rock, and many other movements, musical and otherwise. To the best of my knowledge, there has not been a consensus from African American (and in the case of jazz, also Jewish) communities that those who do not belong to their communities shouldn’t perform these genres — even as the audience for both jazz and rock grew increasingly white over the decades. An academic critique of, for instance, Elvis Presley and his complicated history with African American performers is worthwhile, but there has not been any great advocacy from the African American community that whites should quit listening to his music (though I half-wonder whether some younger readers could list five of his songs — even kings get dethroned eventually).
To go back to my initial example, one could argue that the performers I saw that night were simply artists practicing a form of artistic syncretism. But the Asian American community has been resolute in its rejection of yellowface performance. And a growing number of people of Middle Eastern descent are decrying the appropriation of belly-dancing.
Even as I declared a certain black-and-white rubric regarding what to do and what not to do, notice that I’ve presented my examples with nuance and exceptions. Human beings are by nature complex, their histories, both personal and collective, tortuous and at times torturous. No one’s going to get all of this right 100% of the time, and group consensus also involves those who dissent. But the goal is not perfection, or “correctness,” but respect.
It’s tricky business. And it’s very much involved in what I do with my life. I’m a creative person across a few media. For instance, I designed this ballcap. (Sorry for the shameless plug.) I’ve been interested in sports branding for most of my life, but it wasn’t until I discovered the online sports-concept community (and the existence of graphic-design freeware) about four years ago that I took up my hobby in earnest. And as I engaged with my fellow designers, I discovered a sharp divide within the community regarding the use of Native American imagery in the branding of a team, whether real (like the baseball team in Cleveland or the NFL team in Washington) or fictional (I imagined my ballcap for a baseball team in Charlotte). And while some designers like myself decry, for instance, the questionable moves of the Washington NFL ownership, others not only insist that the branding is intended to honor Native Americans even as Native Americans claim otherwise — exactly what the ownership maintains — but persist in using such imagery in their own fictional concepts. On which point, I will simply say it doesn’t matter what you believe if that belief is contrary to fact. And the fact is that the consensus of Native Americans — with, yes, a bit of dissent, an issue meriting its own essay — is that such branding is disrespectful, full stop. So, to my fellow designers, I simply want to say: stop.
I also design jewelry. Mostly, I practice what is called assembly, meaning that I put together manufactured pieces in original designs — I don’t smelt metal or melt glass or anything like that. (Another shameless plug for my work is here, though at this exact moment the work is not for sale.) Another popular and lucrative style of jewelry design is bead-stitching, much of which was first developed by Native Americans. It’s a style I’ve thought about doing, though I wonder if I’d have the patience for it. But I’m not going to take it up for the time being, for the simple fact that I presently live in a community with a large Native American population, many of whom practice bead-stitching as a source of livelihood. I have decided that to do so right now would be disrespectful to the Native American community in that I would be using my hobby to undercut their ability to earn a living — in spite of the fact that, to the best of my knowledge, the local Native American community has not come out against white people making and selling bead-stitched jewelry.
And, really, that’s what all of this comes down to: personal decisions. But none of us live alone; the personal decisions of all of us over time aggregate to build a culture. And it behooves us all to build a culture that edifies rather than destroys, on a foundation of respect rather than of selfishness.
I’m a mutt. My roots are flung all across southern Europe, western Europe, and western Africa. My family has been in the United States so long that it’s probably safe to say that my “people” aren’t from anywhere other than America. And if there is such a thing as a distinct American ethnicity (apart from Native American ethnicity), then I’m a likely archetype.
It’s not the only thing mutted about me. My dialect is almost literally all over the map.
A couple of weeks ago, an amazing study of dialects came out of North Carolina State University. This elegant and thorough study, best known for its eye-catching maps that are a lot clearer than one often finds in the fields of sociolinguistics and dialectology, caught fire across the internet, appearing most notably on Huffington Post and Business Insider. We even discussed the study in my Advanced Writing class.
The study endlessly fascinates me. I have long been interested in linguistics, to the point that a friend of mine and I devised our own language some years back. A lot of it is because, well, I talk funny. When I’ve spoken with professional linguists, they say that my dialect sounds something like a cross between North Dakota, Cleveland, and Maine. I even throw in some things that are way out there–a lot of Canadian “eh” and British “brilliant”.
There are a lot of reasons why I talk the way I do. If you dive into the maps in the study, you will see that my hometown, Bloomington, Indiana, is very much a borderland, a fact which any linguist will confirm. You’ll notice that, for a lot of the word usages that were studied, the numbers are roughly even. There is a line, roughly equivalent to Interstate 70 through Ohio, Indiana, and Illinois, that divides the North Midland dialect and the South Midland dialect. Bloomington is also a college town, attracting people from all over the country and all over the world. I grew up hearing many different dialects, and in my adolescence, I particularly took up with a household from Brooklyn and a household from Boston.
Then there was my moving around in adulthood. For instance, what do you call a carbonated beverage? If you check out the study’s map, you’ll see that Bloomington, Indiana is about evenly divided between “coke”, “soda”, and “pop”. Now, in my family, who came from further south, it was called “coke”. (And some of my friends and relatives also said “sodie pop”, and I’m always surprised that that usage never shows up in such studies. I guess it’s too rare.) But then I moved off to St. Louis at age 18, where virtually everyone says “soda”. The word stuck, although when I lived in England briefly (how I picked up “brilliant” to mean “cool”), I discovered I needed to say “fizzy pop” to be understood. But I’ve lived in Minnesota for nine years, where most everyone I know says “pop”, and yet “soda” has stuck with me.
Another test is “you guys” vs. “you all” vs. “y’all” vs. plain old “you”. “You guys” holds a slight majority in Bloomington, though my relatives an hour south more often said “y’all”. I’ve had mostly African American neighbors about half of the time I’ve lived in Minneapolis, and I’ve picked up “y’all”–but curiously, I never did from my family. My pronunciation of “I” has become more Southern for the same reason.
Some of it was a matter of choice. In third grade, during reading time, my teacher pointed out that “either” and “neither” could be pronounced with an “e” sound or an “i” sound, and I decided that very day I would use the one I heard less often, and have said the words with an “i” ever since.
Which leads me to the point of this post. Something else that some linguists have picked up on in my dialect is that it sounds affected, like I’m trying to put on airs. Now, they don’t think I’m trying to do this; rather, they think these are subconscious habits. The main reason they think so is that they notice I change dialects even from one sentence to the next.
Like I said, they don’t think I’m trying to do this. They think it’s subconscious. And now, after having studied some linguistics, I finally understand why.
In any culture, there is what is called the prestige dialect. A prestige dialect is the one you’re supposed to have if you expect to climb the socioeconomic ladder. As an example, they say that, if you want to make something of yourself in New York City, you can’t actually sound like you’re from New York City. A lot of us are aware of the idea without necessarily labeling it a prestige dialect. In America, we have what’s called a “newscaster dialect”. It’s not really an actual dialect–though some say it most resembles the dialect of Des Moines, Iowa. However, if you wish to advance as a newscaster, sounding like you’re from Brooklyn or Atlanta is right out. So this dialect wields a lot of influence in media, which influences how we talk. We associate having “no accent” (there’s not really such a thing) with power and influence and belonging to the upper classes.
I think I picked up on this at a very early age, and tried to sculpt the way I speak into something other than what I heard around me. I also cannot overstate the power of television on my upbringing. As my father cut us off socially to hide the abuse, television, where the newscaster dialect holds sway, was my only window into how other people talked. And, looking back, I think at least some of my schoolteachers tried to “correct” the more Southern parts of us kids’ speech. Then again, with a university renowned for its school of education, not all of my teachers were from southern Indiana.
And so I went through life accruing what I thought sounded like the way people talked who were above me socially. I’m almost certain it’s how I picked up the more East Coast/New England parts of my dialect. Where I’m from, such a dialect means you’re most likely associated with the university, and thus you are educated.
And I wanted desperately to be educated. I entered kindergarten functioning at a fourth-grade level. But, rather than offer me any enrichment, the principal told my parents that the teachers couldn’t do their job with me in the classroom, so their goal was to dumb me down to the level of the other students for the sake of classroom management. By the age of 13, my father out of the picture and my mother disabled, we found ourselves in public housing. In my neighborhood, trying to get out of there was frowned upon; you were “thinking you’re better than everyone else.” My mother didn’t understand the mentality–she thought that everyone living there deserved better than what the neighborhood had to offer.
But, at some point in the past few years, something clicked. I picked up a bit of a drawl–living in Minnesota!–that gets even stronger when I go home to visit. I started using the word “ain’t” in the hope that my awful, horrible first-grade teacher (who deserves about a half-dozen blog posts of her own) might roll in her grave. I quit caring about how I might impress people with the way I sound.
And I wish we all would just give it up. Last semester I researched the subject of dialect discrimination for class. It’s an ugly thing, primarily because it ensures that people remain in the class into which they were born. We have plenty of mechanisms that do that job in our society as it is. If we, as an American culture, truly hold to the Horatio Alger principle that success comes largely through hard work, then we must dismantle the impediments that keep the hard work of certain groups of people from receiving its just reward.
Don’t believe that such things exist in America? I could write volumes on the subject, but I’ll close out with this one simple fact I stumbled across yesterday: An adult born into wealth is 2.5 times more likely to be wealthy without a college degree than an adult born into poverty with a college degree.
Not a one of us is intrinsically any better or worse than the next person. We all have something valuable to share with our species, and justice demands that honest work deserves honest reward.
PS: For a nice, quick-and-dirty study of American and Canadian dialects, check out this great blog post: http://dialectblog.com/northamerican-accents/
In my online Adolescent Lit class the other day, we were asked to read two essays regarding the value of young-adult literary awards created especially for works that showcase the writing and stories of racial and ethnic minorities. The first essay was written by a white male scholar, who believed that such awards prefer subject matter over literary merit, and thus run a great risk of rewarding inferior writing. The second essay was written by an African American female author as a direct rebuttal, explaining the history of how mainstream awards have repeatedly dismissed the efforts of non-white authors and illustrators.
For my class, we were to post our response as to which side won the debate in the class “discussion,” which functions like a message board. I wrote that the field of literature was an extension of the field of academia, which exists as the result of centuries of white privilege and institutional racism. The vast majority of whites are, for many reasons, ignorant of the privileges they are afforded in society based solely on the color of their skin. Moreover, a person has no ground on which to claim what is appropriate for a group to which he does not belong, particularly if he belongs to a group that has historically oppressed the group in question. (This is simply a matter of respect in my book.) For these reasons, in the class discussion, I made the bold assertion that the first essayist did not even have the right to an opinion in the matter.
I awaited a mob of classmates, charging with virtual pitchforks, ready to pillory me for daring to suggest that someone doesn’t have the right to an opinion. I waited in vain. Most of my classmates–interestingly, including many who are not white–appreciated my perspective, and stated that they hadn’t even considered the angle of white privilege. Only one student rebutted my claim that the first essayist didn’t have the right to an opinion, since, as we are so often told, everyone always has the right to an opinion.
I, of course, disagree. For example, I have the right to an opinion about matters of taste. But even then, that only goes so far. I may not like what someone is wearing, for instance, but I don’t necessarily have the right to air my opinion about it, especially if doing so belittles the other person (and so often it does). In fact, if someone is walking down the street stark naked, the only reason that should be my business is if that person is too cold–then, it is my moral obligation as a fellow human being to ensure they are warm.
And then there are matters in which it doesn’t even make sense for opinion to come into play–and yet it seems almost everyone, in their postmodern, it’s-all-what-you-believe mentality, thinks otherwise. Many of these matters have to do with what a person has the right to know.
I do not have the right to know what two (or more) consenting adults do in the bedroom.
I do not have a right to know how your genitalia look or how they function. I do not have the right to an opinion as to whether your genitalia should match what I think your gender is, or even what you think your gender is.
I could go on, but I need to get a move on with my day. Blogger Dan Pearce has a great list that delves further into this issue. I’m not sure I agree with every item on it, but overall, it is great food for thought.
Like I said, there is much more I want to say on this subject, but that will have to wait for other days.
Many across the United States are aware that Minnesota is in the midst of a nasty battle over a constitutional amendment (one the Republican-led state legislature felt so important that they shut down the state government entirely, via lack of a budget, until the amendment was put to ballot). The amendment would restrict the rights of emancipated adults to engage in civil contract (the contract of legally-recognised marriage) and restrict the religious freedoms of churches that wish to follow their beliefs by marrying same-sex couples. This concerns me. Not because I am gay (I don’t think I’m quite marriable). If I were straight as an arrow, I believe this would still concern me. See, this battle is being fought by and large by people who claim they are fighting it in the name of Jesus. When I read the Gospels, I see that Jesus repeatedly lambasted those who wanted to embroil themselves in other people’s moral affairs. He actually praised the Pharisees for their practices of personal piety, but then condemned them for legislating everyone else’s morality ad nauseam. This Jesus has been forgotten somewhere along the way.
I’m also concerned because I live in the Twin Cities, what I sometimes call a “gay bubble,” and so many I converse with are utterly convinced that the anti-marriage amendment will be defeated, no problem, and they’ll point to some random poll to prove it. Yet every single poll I’ve encountered has said that, though the race is tight, the amendment looks likely to pass. Moreover, I know a number of people working at various levels of Minnesotans United for All Families who all confirm my assertion and refute that of my acquaintances. I think living in this gay bubble blinds people to the attitudes outside it.
But as concerned as I am about this amendment, I am even more concerned about a second ballot issue which has garnered less national attention. The proposed amendment would require a “photo I.D.” in order to vote. Can’t afford the fee for a photo I.D.? Well, just head down to your DMV and they’ll make you a special, free voter ID. It sounds innocuous enough, doesn’t it? But then the truth rears its ugly head….
The hue and cry that got this proposed amendment put on next week’s ballot was a claim of rampant voter fraud. Extensive studies have demonstrated that this rampant voter fraud simply doesn’t exist. The claim that it’s easy-peasy to just go down to the DMV? Never mind the many obstacles that can keep someone from the DMV. Just ask the good folks in Wisconsin. They were told they could do just as Minnesotans are being told: go down and get your free ID at the DMV. But then the DMV employees were instructed specifically by the state government to do everything possible to *discourage* applicants from obtaining these voter IDs. (Watch this video to see these tactics in action.) Or voters of a certain political persuasion (read: Democrat) had their closest DMVs taken away from them outright. Never mind that the implementation of this special voter ID will cost in the neighbourhood of $50 million, with no clue as to how to fund it.
If this sounds like a diatribe against the Republicans, it is somewhat, but only because they are the ones who have seized upon this issue. (I have diatribes I can write against the Democrats, but those will have to wait for other writings.) Look at the stats across the country. There is a clear correlation between the ease with which one can vote in a particular state and the likelihood that that state will favour one party or the other in elections. (I say this having come from Indiana, one of the more dependably Republican states, which is also one of the hardest states in the country in which to vote. For example, you have to be registered at least thirty days before election time, and the polls close at 6 p.m.—the earliest in the entire country.) The Republican leadership are well aware of this correlation, and have admitted as much. So, on the surface, this appears to be a matter of one political party subverting the political process to gain control, which is in itself repugnant. (For the record, I agree with George Washington in thinking that political parties are an inherently bad idea.)
But the heart of the issue is much more insidious than a simple power play. It is nothing less than the assertion that some human beings are inherently inferior to other human beings. A couple of months ago, I haphazardly ended up in a debate (I hate debate, or rather what is mislabelled as debate these days) on Facebook with a friend of a friend (there is no enemy like a friend’s friend). I gave him my personal account of how, two years ago, I was nearly turned away at the polls under the existing laws for reasons related entirely to poverty. And this friend of a friend asserted that he didn’t care. He didn’t care about whether circumstances beyond my control kept me from the polls. Furthermore, he stated that he could hear a million stories that were the same, and they still wouldn’t change his mind about ensuring that this repugnant amendment becomes enshrined in the Minnesota state constitution.
He stated it right there: he believes his Story is more important than mine, or those of the hypothetical million others, and by extension, *he* is more important than I or the million others are. And I maintain that the belief that some human beings are inherently better or worse than others lies at the core of most of our social ills.
And that is what this fight—what many fights—are about. It would take unmitigated gall to walk up to someone and say, “Yeah, you know? You could vote just fine last year, but I’m taking away your ability to vote next year.” Of course, most backers of this amendment would dare not express such unmitigated gall to someone’s face, instead hiding behind the anonymity of the ballot box and the socioeconomic, racial, and cultural cloisters that keep nearly all of us from ever truly learning the experience of anyone whose Story isn’t like our own.
Last night I went to a Halloween party. As I rode the bus through increasingly conservative neighbourhoods out to the inner-ring Saint Paul suburb of my hosts, I saw on a number of lawns a maddening sight that was the impetus for writing this article: signs, side-by-side, one saying to “Vote No” on the anti-marriage amendment, but to “Vote Yes” on the voter suppression amendment. This repeated sight angered me because the posters of the signs could not see that both of these amendments are cut from the same cloth of inequality: that homosexually-coupled individuals are inherently inferior and don’t deserve to live lives of the same quality as their heterosexually-coupled counterparts, and that the poor, the disabled, the elderly, college students, and anyone else who doesn’t “fit” (everyone who stands to be marginalised by this amendment) are inherently inferior and don’t deserve to participate in one of the foundations of a functional democracy. Both of these amendments maintain that some people are fundamentally inferior to others, an assertion that undermines the very notion of democracy.
And so, I turn back to my earlier illustration of all of us hiding in our own little homogeneous cloisters. We have the gay man, the lesbian, the ally who will fight tooth and nail for their own rights and those of people close to them, but who are at best indifferent to the rights of those who do not run in their own circles. And that is repugnant.
To vote no on both of these amendments is to affirm the dignity and equality of all our citizenry and to support democracy. It is the absolute least we can do. May we do this and far, far more to uplift our species.
A final note: this is my last word on the subject. And I will not be lured into what-passes-for-debate-today on the subject, because there is no possible way you can convince me that some human beings are inherently better or worse than others.
Edited 28 Oct 12 to add a link regarding Indiana voting shenanigans.
Edited 5 Nov 12: I also want to add that supporters of the amendment have stood on the idea that the amendment will “reduce voter fraud.” The evidence of voter fraud is virtually nonexistent, far smaller than the statistical margin of error. Yet this amendment would remove from thousands the ability to vote in order to sift out one or two voting cheats. From a mathematical standpoint, this makes no sense.
Edited 5 November: Fact-checked; the figure of “hundreds of millions” for implementation of Voter ID measures has been brought down to “in the neighbourhood of $50 million.” Still way too much for an unnecessary measure.
Originally published here in March 2011, though this version has been thoroughly proofread and edited. The original was dashed off in a hurry, so I hope this revision demonstrates my editing abilities, if nothing else.
Human beings today seem to communicate primarily in two ways. We either share personal narrative, or we “debate”–though it does not merit the name. True debate is measured, calm, well-researched, and deliberate. What we have instead, coming from all sides, are name-calling, belittlement, anger, resentment, hatred, malice, insults, and every curse of hellish fate you can imagine.
These “debates” develop as we lose sight of our mutual humanity. We do this by mentally converting fellow human beings into labels, into abstractions. We call each other “liberal”, “conservative”, “gay”, “straight”, “Christian”, “Muslim”, “American”, “Chinese”, on and on it goes.
It is easy to go to war against an abstraction (why do you think they call them “casualties” and “collateral damage”, rather than “deaths”?), to oppress an abstraction, to abuse the rights of an abstraction. An abstraction does not share your breath and your DNA and your heartbeat. And if we behave as if the world consists of nothing but groups of abstractions, a “them”, and a small number that we call “us”, there’s nothing to keep us from blowing “them” to smithereens. We should just drop the nukes and call it a day.
However, it does not have to be this way.
We may well be hardwired to think of each other in terms of our differences rather than our similarities. But we also have amazing minds that often transcend their wiring. What if we stretch our minds beyond the capacity to label? If our differences, and the way we use them to dehumanise each other, are speeding the destruction of our species, what are our similarities, and how might those similarities save us?
It’s not our genetics (for example, not all human beings have 46 chromosomes). It’s not our physical composition. It’s certainly not the way we look, dress, think, or believe. The one thing that all human beings share is Story.
By Story, I mean the personal narrative that each of us carries. It is the unique path that has brought us to where we are. It is the tale of our triumphs and tragedies, events both momentous and mundane, the things that shaped our decisions, beliefs, and character. Not only is Story the only thing that we all share, but, in a very real sense, it is the only thing that any of us has. You can lose your job, your home, your possessions, your family and friends, you can lose absolutely everything–but no-one and nothing can take away your Story.
So, if focussing on our differences hastens the destruction of our species, would focussing on the commonality of Story save it? First off, it is very easy for me to share my Story with someone who closely identifies with me–who shares my labels. The trick–for all of us–is to learn to transcend our boundaries in our sharing, to share with those who don’t share our labels, and to start seeing each other in terms of one label only: fellow human beings.
In this spirit, I am working hard not to engage in debate but to share Story. And I fail. A lot. But to keep trying in hopes of success is all I can do. And I know that I can’t force anyone to share their Story with me. But what I do know is that I’m not responsible for what others do, only what I do. And if I have the option of choosing actions that can make the world a worse place or a better place, I choose the latter.
Last night I indulged in a carton of Ben & Jerry’s–perhaps not the smartest thing for a man trying to lose weight, but it’s not an everyday thing. As I decided on my flavour (“Late Night Snack”, fantastic), I noticed that one new fluffernutter-inspired concoction had been rechristened from “Cluster Fluff” to “What A Cluster”. This did not surprise me. The company had recently been pressured by the conservative activist group One Million Moms to change the name of its latest flavour, “Schweddy Balls”, inspired by a Saturday Night Live sketch. As of today, however, that name remains on the Ben & Jerry’s website (though, personally, I think the idea of putting chocolate and rum together sounds kind of disgusting). Even so, despite having used salacious flavour names in the past*, the company apparently felt compelled to change the name of “Cluster Fluff”.
This censorious behaviour echoed an online conversation I’d had earlier in the day with a good friend in Canada. He had recommended a website for me to check out, and though I was certain it would include no “graphic” imagery, I figured it would still be blocked on library computers. I told him so, to his shock and consternation. After all, this was a library, a purveyor of information to the masses and a cultural institution with a long history of standing against censorship. If Canada doesn’t censor public internet use in this way, surely the United States wouldn’t, either. I then explained that in the United States, the federal government can reduce a public library’s federal funding if it does not install “nannyware” filters in its computer labs. (Some American libraries have simply chosen to forego the federal funding, on principle.) I illustrated this attitude in American culture with the catchphrase of Helen Lovejoy, the pastor’s wife on The Simpsons: “Won’t somebody please think of the children?!” My friend replied that people should focus on raising their own children, not other people’s.
I’m undecided on how I feel about his statement. On one hand, as they say, “It takes a village to raise a child.” Children grow up, not in the bubble of their parents’ watch, but in society at large, and we fool ourselves if we think our actions have no influence at all on the next generation. On the other hand, how one chooses to parent, how one chooses the values to inculcate in one’s children–we consider these sorts of choices a hallmark of a free society, and, so the argument goes, if someone wants to raise their child more “precociously” than another, then so be it. And yet, this view is also used to enforce attitudes that really do harm society: “I’m raising my child to stand against homosexuality, and rules that say ‘gay’ students get ‘special protection’ from bullying are undercutting my right to raise my child as I want.”
What I am sure of is that it is absurd to believe one can raise a child in a protective bubble in perpetuity. There is a difference between, say, giving your twelve-year-old pornography (ignoring the fact that some of the Bible is quite pornographic) and that twelve-year-old discovering it simply by being a member of society. Children are going to find out about the real world no matter how much their parents protect them. It is the parents’ job first to build up values so that their children can handle “the real world” when–not if–they encounter it, and then to discuss issues in an age-appropriate manner when–not if–they come up.
The challenge comes when a segment of society believes it is (literally) their God-given responsibility to act as God’s mouthpiece in any and all situations, to hand out the judgements and punishments in God’s place. To this, I can only reply that, in a great many situations–from the woman about to be stoned for adultery to his many encounters with the Pharisees–Jesus told people to mind their own business when it came to others’ morality, and to focus on their own.
As an aside, just to make my personal statement about censorship, allow me to say that, if you were not aware of what “Cluster Fluff” refers to, it’s a play on the phrase “clusterfuck”, which generally refers to a complex and intractable situation.
*”The company has had other controversially named flavors as well — Karmel Sutra and Hubby Hubby (in support of gay marriage) — for example. But Schweddy Balls has received much publicity-generating attention.” Read more here.
Occupy Wall Street began modestly enough, a couple hundred seemingly homogeneous folks gathering to protest in the largest city in the United States. As such, it needn’t have garnered much more news coverage than a curiosity, perhaps buried in “news of the weird”. Yet though, in one month’s time, the protest has swollen to thousands, its message and mission have grown more focussed, and it has carried out clear and positive actions (such as the demonstration that successfully averted some foreclosures), one would think from the amount and nature of mainstream media coverage that Occupy Wall Street remains a small gaggle of unfocussed, vaguely angry “young people”–a myth.
The internet, though, isn’t (wholly) mainstream media. What is interesting is that one need not look far online to find this same unanalysed and untrue trope–of “lazy”, misguided youth, small in number and devoid of real purpose–perpetuated in comment sections across the web. I believe this is a direct result of the message the mainstream media have pushed since the beginning: one sees the same eerily similar phrasings repeated over and over. The mainstream media established the meme of the “spoilt young middle-class rabble-rousers”, and it has been perpetuated through repetition.
It makes sense, though. The mainstream media are owned by the same corporations that the protesters stand against–and if there is one enemy the protesters consider themselves to have in common, it is corporations. Through consolidation and buyouts over the years, the power to disseminate news has concentrated itself into fewer and fewer hands, leaving the news we get every day more and more “corporate”–even as the internet, smartphones, and the like have placed the power to spread news in more and more hands. The mainstream media outlets, quite simply, are not going to bite the hands that feed them by relating news stories that run counter to their own self-interest.
This, however, is nothing new. The myth that news media are, or should be, “objective” is taught from childhood. But, first off, human beings are not by nature fully objective entities. We are a conflated, confused mishmash of beliefs, principles, and goals, and there is no real way around it. I think also of times past when the media would go to absurd lengths to promote their own economic well-being. At the turn of the twentieth century, it was not uncommon for a major city newspaper to print up a made-up story of, say, an elephant escaping the zoo and rampaging through the city, causing panic and, yes, increased newspaper sales. The fact that the last paragraph would read, “The preceding was a complete work of fiction”, did nothing to keep people from spreading the rumour of the escaped elephant that they knew their cousin’s cousin saw yesterday–after all, aren’t we taught in school that, when writing news, you put the least important information at the end?
The mainstream media are putting Occupy Wall Street in the last paragraph, hoping that, if we ignore it, we will keep believing them and buying their products.