Friday, September 19, 2008

Does "Evil" Do Nuance? Teshale's Comment

A quick apology for the lateness of this post: I went on a mini-vacation
and then didn’t check the blog.

My colleague Stanek presents, as usual, an ultimately cheerful and uplifting
post for my humble self to consider! I will add at the outset that my personal religious beliefs will color this post; I believe in God, and I am a Christian, so I do believe that absolute evil exists in the world. Regarding Heaven and
Hell, they exist, respectively, to reward those who believe in
God and follow His law, and to punish those who do not. The idea is not necessarily that people who are Good go to Heaven, and people who are Bad go to Hell, but more that people who are following God's law are doing the proper thing, and are rewarded for it. What makes Hell so awful is not only the whole lake-of-flame thing, but a permanent, irrevocable separation from God. If you take it as true that a God does exist, then this would be pretty awful.
(For more on this, I direct you to Ted Chiang’s short story Hell Is The Absence Of God, which is pretty good considering I did not write it.) Many see the obvious loophole of the Deathbed Convert, but if God is infallible, I figure surely He would have seen that coming.

Stanek takes what sounds to me like a Durkheimian view of morality: religion arises as a result of society. If something benefits everybody, and everybody chips in doing it, it both benefits me by proxy and involves
less personal work. As you already know, I disagree, and the reason is something that Stanek mentioned at the end of his post—inconsistency. I hate slippery-slope arguments, as they seem very fearmongery, and remind me of hysterical anti-drug
commercials. But in this case I think it applies. If you rationalize one thing,
you can rationalize anything if you really want to, and it’s much more likely
that you’ll rationalize something to help yourself, and explain away
possible collateral damage. Alright-- but does your ability to rationalize something necessarily make it okay? Does your ability to argue that something is not evil under certain circumstances make it not evil? Does this mean that truthiness really does exist?! If, essentially, good and bad are what I can argue, what exactly keeps this web from collapsing at the whim of someone especially charismatic? It seems like the world eventually checks itself (albeit by sometimes overcompensating in the opposite direction).

I always thought of it as a sort of infinite slide rule, where you have a great many things that people disagree upon as bad or good, and then, heading in one direction, you have things that more and more people agree are just bad. These things are fewer in number, however, and at the very, very far end of the spectrum you have something almost all cultures have deemed bad, albeit with arguments on just how very bad it is. (See below.) You'll never get every single person to always say it is bad, full stop, but the overwhelming majority is such that you can probably round a bit. The worst thing ever is infinitely far out, and no person has reached it. The same is true in the opposite direction. So I definitely think there are nuances, but on the other hand, some greys are so very dark that they look quite black, and some greys are so light they're close to white.
If one doesn’t subscribe to the idea that there’s a God that dictates
morality, there’s the biological argument: that
something like altruism stems from the suspicion that the person drowning
over there might share our genes. This seems a bit funky to me,
because of the worldwide incest taboo; incest would, theoretically,
make dead sure your own personal genes survived. (If for some reason you
find yourself curious about the reasoning behind said incest taboo, feel
free to have a gander at that Wikipedia entry.)
Anyway, I quote Stanek’s previous post:

The natural conclusion, I think, is that no one owns anything except
through force (or the threat of force supplied by their parent state), no one
has any inherent rights that can be violated, and thus no action can be
taken that is objectively "wrong" or "evil."


I agree with the first point, that nobody really owns anything, although
for very different reasons (I believe that everything we have comes from
God, so there’s no point in being excessively greedy about hoarding
anything, it’s not actually YOURS). This second point, however, about
nobody having inherent rights, I dispute—I think that people have the right
to live. I don’t mean this in a Roe vs. Wade sense, but in the sense
of the death penalty, or even something as simple as a revenge killing. Even if the universe is a huge bunch of atoms careening
meaninglessly in space, the meaningless careening of atoms that
eventually led to life is outside of humanity’s, let alone society’s,
jurisdiction. Is it possible to counter what the universe has set into motion? This is getting quite existential and excessively late-night-college-philosophy-session for me.
I end with another short story recommendation, Robert Charles
Wilson’s “The Cartesian Theater”*, that addresses the existence of evil
indirectly. If you’re not already familiar with the idea of the Cartesian Theater, I won’t tell the true subject of the story, as it’ll give things away a little. But I will say that it addresses Stanek’s ending point, that “One can't help but feel that "evil"-- soul-crushing, gut-wrenching awfulness that doesn't lend itself to mere
rational analysis--exists; no, not in an absolute sense but in some
nuanced way.” I would just say “It’s something more powerful than you can possibly imagine,” but that is, of course, not an intellectually satisfying argument. My bad. It seems to me that if one feels this way, and the explanation for why one feels this way isn’t satisfying, then the explanation, however good it may be, isn’t complete…it’s a tricky thing.




*You can read it in Futureshocks, ed. Lou Anders, or Science Fiction: The Best of the Year 2007, ed. Rich Horton.

Friday, September 12, 2008

Does “Evil” Do Nuance? -- Stanek

Not too long ago, the two major-party presidential candidates were invited to sit down and take the mandatory quadrennial religious test for office in violation of the Constitution at an event called the Saddleback Presidential Candidates Forum. Here's one of the hard-hitting softballs they were lobbed: “Does evil exist?” The answer the candidates gave—and that nearly every human being on the planet would give—was a resounding yes. But is there an absolute evil that we can all agree upon?

There's an old public administration maxim that goes by the name of Miles's law: “Where you stand depends on where you sit.” It applies as much to matters of moral certainty as it does to budgetary negotiations: perspective matters. I'm far from being an expert on theology but I've been under the impression that, for the significant chunk of the population that accepts the Judeo-Christian cosmology, Satan was as close to being the personification of evil as it gets. And yet Theistic Satanists worship Satan as a deity, a bringer of knowledge, purveyor of freedom, and proponent of equality. If some can question the evilness of a guy whose nickname is “evil” with a “d” tacked on in front, how can we hope to find points of agreement on the existence of absolute evil?

My own sense is that these archetypal dichotomies—black and white, good and evil, right and wrong—are socially useful but not principles of the universe itself. Where does morality come from? Unless you believe in an anthropomorphic (cognitively, if not physically) god handing down tablets with laws engraved on them, it's difficult to see from where moral laws could derive their authority. If there is no Heaven to reward good deeds and no Hell to punish misdeeds (and I don't believe there are), then we lose the ability to draw those stark, absolute lines. Yet I believe many atheists and agnostics are moral people who do subscribe to the notion that some things are right and some things are wrong.

Many times people with or without faith will justify a sort of morality using the Golden Rule: "Do unto others as you would have them do unto you," or some variant. In practice, this often morphs into its cousin, the Silver Rule: "Don't do to others what you would not have them do to you." Maybe a good rule of thumb but hardly an absolute principle. Under that logic, the actions of people like the Columbine killers are morally excusable because they intended—indeed, wanted—to be killed themselves. We could call this the Anti-Hypocrisy formulation of morality because its only concern seems to be self-consistency. But since I tend to think everyone is inconsistent on some level, it's tough to see any truly moral human being under this rule. Furthermore, it assumes either an inherent "goodness" or a certain equality between human beings; if I can get away with doing things to others that they couldn't possibly do back to me, wherein lies the merit of the Golden or Silver Rules? The principle loses any practical value in guiding one's actions and all we're left with are reasons of empathy to avoid doing “evil.” I've heard the suggestion that conscience dictates morality: I know killing is wrong because if I killed someone I would feel bad. Again, that might be a good personal reason not to kill but a moral system with this as its cornerstone is rather fragile. Indeed, it explicitly endorses the view that morality is little more than an individual-specific frame of mind.

The (likely uninformed) view I take on the matter is that morality arose and persists because it has great social value. Society needs rules—both implicit and codified—to grease the wheels and secure a degree of cohesion. A sort of selection principle operates: those who reject the norms accepted by the majority soon find their way out of society, be that through social and geographical marginalization, imprisonment, or execution. In particular, it's very useful to be able to invoke a Boundless Source for a set of rules rather than simply invoking the state. The state has limited power; it can't be everywhere and enforce every law (though, of course, it's interesting to ponder the ramifications of this statement becoming less and less true). But if the laws are absolute and ultimately enforced by the Almighty—well, then it's best not to violate them even if you can get away with it in the short term. Historically, there have been times when significant portions of the population came to see a gap between what was legal and what was moral. The abolitionist movement is probably the most famous example, though I imagine most social movements--and certainly any instances of civil disobedience--contain a large degree of this (even today, arguments for minimum wage increases, anti-poverty efforts, universal health care, etc. are sometimes presented in moralistic terms). All this tells us is that many people do not accept the notion that what is "right" derives from man's laws. Indeed, some minor civil laws are, in part, made to be broken--the fines provide essential revenue streams for cities.

There was a paper a few years ago in the journal Faith and Philosophy called “God and Moral Order.” It presented a scenario that I found very interesting:

Suppose Ms. Poore has lived many years in grinding poverty. She is not starving, but has only the bare necessities. She has tried very hard to get ahead by hard work, but nothing has come of her efforts. An opportunity to steal a large sum of money arises. If Ms. Poore steals the money and invests it wisely, she can obtain many desirable things her poverty has denied her: cure for a painful (but nonfatal) medical condition, a well-balanced diet, decent housing, adequate heat in the winter, health insurance, new career opportunities through education, etc. Moreover, if she steals the money, her chances of being caught are very low and she knows this. She is also aware that the person who owns the money is very wealthy and will not be greatly harmed by the theft. Let us add that Ms. Poore rationally believes that if she fails to steal the money, she will likely live in poverty for the remainder of her life. In short, Ms. Poore faces the choice of stealing the money or living in grinding poverty the rest of her life.


The author concludes that it would be morally wrong for this woman to steal the money. I cannot agree. Theft can be detrimental to society in any number of ways and, as such, society should not tolerate it. And it is in all of our interests, as members of society, to make sure others do not steal. But in our capacity as individuals against society--in the absence of an absolute morality handed down from an Absolute--there is no reason, other than the fear of legal punishment (which is absent in this example), not to do what we can get away with. I suppose we might call this one the Hypocritical formulation. Ms. Poore isn't doing anything objectively wrong in taking the money. It seems especially clear-cut in this case where the motives are presented as "pure"--in that society would approve of the way Ms. Poore intends to spend the money--and no one is significantly hurt by her actions. The "right" to one's wealth or property is conferred by the state and the entire concept is predicated upon the state being more powerful than individuals who might want to take it. However, if the state is ill-equipped to protect your property "rights" and you yourself are unable to do so, well, your "rights" aren't worth a dime. Indeed, history is replete with states depriving other states of property or resources by force (where did the American Southwest come from?).

The natural conclusion, I think, is that no one owns anything except through force (or the threat of force supplied by their parent state), no one has any inherent rights that can be violated, and thus no action can be taken that is objectively "wrong" or "evil." Certain things are frowned upon or even outright condemned by society--in many cases, I would think, the sentiment is drummed up by those who stand to gain the most from that attitude becoming prevalent--and we come to think of them as "bad." But they simply are what they are in an amoral universe. Thus, on a rational level, I take a sort of George Carlin-esque view of rights (he hits it at 8:33 in the video, though the whole clip is worth watching): in actuality, anything goes. You can do anything you please, though society intervenes to put the brakes on that. In an absolute sense, then, there aren't any rules and there can be no true "evil." There are only the rules we impose on each other (and perhaps attempt to skirt when no one is looking) to abate the scariness of the situation. This is where the intellect leads me.

But there's more to a human being than the intellect. One can't help but feel that "evil"--soul-crushing, gut-wrenching awfulness that doesn't lend itself to mere rational analysis--exists; no, not in an absolute sense* but in some nuanced way. At the very least, the sort of cold analysis that leads to a nihilistic and gray world view doesn't lend itself well to my efforts to conceptualize the Good Society. Like I said, everyone is inconsistent on some level.



*It should be clear by now that I don't believe in absolutes.

Wednesday, September 10, 2008

Can Great Art Be Created With A Great Budget? -- Stanek's Comment

Being something of a heretic, I’ll play Devil’s Advocate on this one. Since “art” can be a bit of a slippery concept, let me preface my comments by saying that I am not the quintessential pretentious snob: I don’t subscribe to the notion that if you’ve heard of it, it isn’t good. Quite the opposite, I’m a Philistine. I don’t know what makes a piece of music, or an oil painting, or a novel technically brilliant. Indeed, I doubt I could even describe objectively how to write a decent paper; I intuit mine (making me a terrible proofreader of other people’s). Thus, I experience art first and foremost on an emotional level.* I judge art in a singularly vulgar fashion: based primarily on how it makes me feel.

The world is a lonely place, never more so than when you’re immersed in a crowd of people. Begging your pardon, Mr. Donne, but every man is indeed an island. An island linked to others by a fragile network of ephemeral bridges but surrounded by black waters all the same. Art is one of the few exceptions to this. It affords us a comforting illusion, a pleasant semblance of commonality. When I read a novel or stand before a painting, I do so in isolation. The connection I make with the piece is unique and entirely mine. But it feels like I’ve tapped into something universal, some kind of Jungian collective unconscious that tells me I am indeed "a piece of the continent, a part of the main." Enjoying the fruits of another’s creativity is perhaps the only way to truly get inside the head of another person, if only because instead of telling you what someone else is experiencing it shows you by inspiring the same feeling in you (regardless of whether or not the artist himself ever actually felt anything remotely similar—your perception that he did is all that matters). With great art, the joys, the triumphs, the fears, the sadness, the hopes of another blend seamlessly with your own for a brief time, uniting you with not only the artist but with all those that came before who (you imagine) shared an experience of the same sort. And that's powerful.

Money can do plenty of things in this world but it can’t make you feel ("You’re a bastard in a basket!"). Bumping up a budget doesn’t make a work any more meaningful or important to its creator. Does anyone doubt that the Star Wars prequels—for all of their expensive CGI—lack much of the heart the older trilogy had? Indeed, funding levels that make an artist too comfortable risk removing some of the great tensions that give a work its emotional import. Can an artist on whom status and wealth are lavished credibly emote on issues like the great social, political, and economic disparities that weigh on our society? I don’t know. Perhaps some can. There will undoubtedly be exceptions, as Teshale notes. But, in general, excess money—like any stimulus applied for too long—dulls the senses and clouds the mind. A class of overly well-fed artists may lose its ability to feign connections with the great mass of people, in which case the bell tolls for them and only them.


*I should also make it clear that I recognize that not all entertainment is art and not all art is great. But, in the words of porn-addict Justice Potter Stewart, I know it when I see it. Except that in actuality I probably do not.

Tuesday, September 9, 2008

Can Great Art Be Created With A Great Budget? Teshale

For a vast stretch of history, fantastic art was in fact created on a great budget. In those times, of course, it was through a system of patronage, and great art was not the concern so much as great art honoring a certain person, but nonetheless, a very, very striking amount of money went into funding artists like Rembrandt and Da Vinci. In modern times, however, the relationship between the budget of a film (or novel) (or album) and its quality seems, after a certain level of funds, to become inverse. Important work is being made by people like Zana Briski, who won an Academy Award in 2005 for her film on the children of prostitutes, Born into Brothels-- but note, also, the many fellowships she has been awarded. I find myself wondering how much work she's received as a result of her Oscar win. Ultimately I suspect not very much-- it's all very nice that people admire your life's work, but a statue means little when you can't afford to do the work that won it for you in the first place.

The other day, I happened upon an old article from the Guardian, discussing the (argued) uselessness, according to Charlie Kaufman, of film schools and screenwriting seminars. One point the article's author, Mr. Patterson, makes is a good one-- that great art comes from a hunger, a desire, and more often than not that hunger can't be taught in schools. It's clear that great art comes from a desire to make something amazing, and from constantly practicing at doing so; that desire isn't instilled in you by being poor, however, and you can be wealthy and still write or paint or film something very, very good. Paul Thomas Anderson, who is mentioned in the Guardian article, surely had a fair-sized budget for There Will Be Blood (rental of oil rig: $2000/day; salary of Daniel Day-Lewis: $2 million; catering: $375,000; etc.) and it's both some of his best and most successful work.
It's interesting, and no doubt a coincidence, that the article is paired with a review of Christopher Columbus' bloated, safe, by-the-books rendering of Harry Potter and the Sorcerer's Stone (based, of course, on my 1998 novel Harry Potter and the Philosopher's Stone).


The cliche of the starving artist is an old one, but I think an inaccurate one. The reason a starving artist is starving is rarely because people don't understand his genius (there are, of course, exceptions), but because people simply don't understand what he's trying to do. Since I believe one purpose of art is to help people understand the world better, or to think of it in a new way, failing to do so should be seen, perhaps, as something to think about, not a sign of legitimacy. I'm not, however, saying that an artist should dumb down his or her art-- that's not true at all. What makes any type of art great is innovation, not incomprehensibility. (Becoming successful, without trying to be, doesn't cheapen artwork-- it just means that it's reached more people.) Usually this innovation comes about through necessity-- having to distinguish yourself from the hordes of others who also think they have something to say, or not having enough money to execute your original ideas. In this sense, money limits artistic expression in a certain way, because if you know you're going to get paid either way, there is little outside reason to challenge yourself. In addition, there is the risk of indulgence-- the more successful someone is, the less likely they are to be challenged, as there is a sense they've already "proved" themselves with their earlier work.
But the presence of money or yes-men doesn't necessarily mean that artistic talent is compromised. The quality of art depends on the passion of the artist, and how deeply they love what they do. The type of people who make great art make it regardless of how much money or freedom they have (or don't have), because they love doing what they do so much that they'll find ways to do it, because they can't not do it. Making money from art is so unutterably difficult that it weeds out those who are serious from those who think, say, it'd be fun to be a director or a rock star. Thus, great art can be made on a great budget-- because the quality of the art depends on the person's talent and dedication to their craft, which is apparent no matter what the situation.

Wednesday, September 3, 2008

What Is The Relationship Between Science and Religion? Teshale's Comment

I must say, I am in agreement with Stanek this time round, and so apologize for the short length of this response.

As Aristotle said: everything in moderation. Science and religion, I believe, can exist together, because science usually asks "how" where religion asks "why." Religion in some form will always exist, though science changes-- religion initially answered all "why" types of questions (Why does the sun come up? What's wrong with my cows? Why does my head hurt so bad?). The fact that it still holds such power, for so many people, and provides an answer where science struggles (Why is there evil in the world? Where did everything come from? Why does pretty much everyone in the world believe that a select few things are Wrong?) suggests that it's not leaving anytime soon. Ultimately, as has been suggested before, science is in a way a form of religion; both attempt to explain why things are the way they are. That being said, an issue with this argument is that religion informs everyone's thinking in some way or another, and so it's difficult to distance oneself from it in making arguments. Religion in an abstract sense doesn't just have an influence on science; it influences whole cultures, and it influences thinking.
I don't mean to get into the question of whether morals are inherent in all humans or socially constructed, but my point is that some form of belief system-- whether it be science, Buddhism, a feeling that we're all one and life goes on-- will always be carried home with you, so to speak. But Science and Religion-- organized religion-- struggle with each other. The main issue always seems to be the more "mundane" aspects, if you can call them mundane-- the sheer breadth of different subsections of Christianity, for instance, illustrates that. In my experience, what most people object to most is being told that what they believe is wrong-- whether it's an atheist being told he's going to hell, or a Christian being told he is a fool and an idiot.
It seems as if people's inherent nature is to lean one way or the other, and so I don't know if the two can coexist on a broader scale. There will always be a Richard Dawkins and there will always be a Pat Robertson, and they will always get more attention than those who are in the middle. And if someone's in the middle, they please neither side, really. I think the middling nature of a more general spiritual belief is such that it struggles against the inflexibility, and subsequent strength, of any type of organized belief system on a broad scale.

Hey, I have a lot of answers, but I don't have all of them.

Monday, September 1, 2008

What is the Relationship Between Science and Religion? -- Stanek

In my meanderings the other day I happened upon a list of Muslim scientists, which--at first glance--might seem a bit odd. I say "odd" because there is a certain conditioning that sometimes leads us to believe not only that scientists wear the same hat in the lab 9-5 that they do after clocking out for the night, but that they should do so. In short, the argument goes, the mode of thinking required by scientific inquiry should intrude on all aspects of a scientist's thinking. And thus being a [insert any religion] scientist would seem like some sort of contradiction. But it's a five o'clock world and scientists are allowed to realize that there are indeed unscientific things in this world (like fashion or the appeal of reality TV).

The people in that wiki list might be Muslim and scientists or they may well be Muslim scientists. To be done well, in the least biased way possible, natural science has to rely only on what we can observe--no miracles, nothing outside that which we can very-nearly-but-not-quite-literally hold in our hands. A deus ex machina won't get you even partial credit for a physics question you're unable to answer (or, as Laplace succinctly stated the philosophy when asked where God fit into his model of planetary orbits, "I had no need of that hypothesis"). But it doesn't follow that just because this is the way we have to do science, this is the way we have to view the universe. A scientific perspective of the universe does not necessarily imply an atheistic view. Science says we can only deal with the tangible; it doesn't say the tangible is all that is or all that can ever be. That's not to say that this sort of world view (sometimes called metaphysical naturalism) isn't popular among some scientific types--including, for a long time, myself--but only that it doesn't follow logically from scientific thinking.

Though some are loath to admit it--lest the romantic picture of a struggle between science and religion for the intellectual soul of humanity be shattered--science owes a tremendous debt to religion. Priests, monks, and otherwise religiously-minded participants are responsible for some of the great advances of science; many people don't realize that the big bang theory of cosmology was invented by a Catholic priest in the 1920s, while a trio of atheist astronomers developed and pushed the competing steady state theory until long after the idea had worn out its welcome. But the debt I speak of reaches much further than mere personalities, men and women who happened to embrace both religion and science. It is a creed shared by both the most atheistic of scientists and the most starry-eyed of religious believers. It is the simple belief that the universe has a mind and we can read it.

Einstein is famously quoted as saying "The most incomprehensible thing about the universe is that it is at all comprehensible." To the religious-minded, the universe makes such perfect, coherent sense and possesses an underlying order because an intelligence lies behind it all. Indeed, some religious adherents have pointed to the mere fact that science can exist at all as evidence of a higher power. To the atheistic scientist, the fact that science can exist--that an underlying simplicity and order hold sway and we can lay down with mathematical precision a series of rules that seem to govern the entire universe--is both exhilarating and deeply puzzling. Physical science implicitly embraces a "God-as-lawgiver" conception of the universe, regardless of whether it explicitly endorses the "God" bit (though I think it would be a grave mistake to limit ourselves to considering only the Judeo-Christian conception of a deity in thinking through this).

A decade ago, Stephen Jay Gould advanced the notion that science and religion are non-overlapping magisteria, meaning they cover entirely different domains of inquiry and thus do not--cannot--conflict. The empirical and spiritual are not competing for the same scraps. Regardless of whether that's true--and it has been debated--science and religion do overlap in at least one fundamental way: they're both driven by the same forces. Furthermore, both are after a deeper understanding of reality and, I dare say, they want to glean some bona fide meaning from whatever picture of reality emerges (every equation invites interpretation). Physics pursues that understanding by seeking a single unifying law from which all else follows; religion goes a step further by looking for the source from which such a principle would arise. So it is indeed possible to have Muslim scientists or Christian scientists or scientists who pursue spiritual fulfillment far from the organized religions*. The same human factors--confusion, uncertainty, terror--push people toward science and religion but neither discipline by itself can assuage our anxieties. As long as human beings retain the essentials of their humanity, neither science nor religion is going away.


*I tend to think of "religion" in terms of an abstract mode of thought rather than a concrete social institution like an organized religion. So, while grievances could be lodged against specific organized religions, "religion" as I've tried to use the word in this post is a bit apart from that.

Saturday, August 30, 2008

Will the 2012 Opening Ceremonies Be Anywhere Near As Good As the 2008 Opening Ceremonies? -- Stanek's Comment

Apologies for the long delay; it's been a terribly busy week. I must profess a certain lack of interest in this subject; neither the Olympics nor London are particularly exciting to me and putting them together is potentially a bigger snoozefest than the Mark Warner keynote at the Democratic convention. However, with the traditional American political season kickoff this Labor Day Weekend, I'm drawn to a certain aspect of the Olympic games: the real point of that pageantry, the real stakes in the high-speed races between men-fish, the real reason we pretend for a week that running is an interesting sport that requires commentary from Bob Costas.

That reason, of course, is that someone figured out how to take a game of Risk and turn it into a two-week athletic competition. That's right, the endgame is nothing short of world domination, won one synchronized diving medal at a time (well, I guess you probably win two at a time). The tradition goes back at least as far as the 1936 games when He-Who-Must-Not-Be-Named(-Lest-Godwin's-Law-Smite-Me) used the occasion to both showcase his nation's progress and popularize the snazziest dance of the '30s, the goose-step. Subsequent games were marked by boycotts, political killings, and always, always showboating. Fans the world over were dismayed when the first Post-Cold War games--the 1992 Barcelona Summer Olympics--eliminated what had hitherto been one of the most popular events: the pissing contest (only the United States and the Soviet Union were allowed to participate).*

Stepping in to fill the void this year was that new up-and-comer, China. Amid the dazzling displays of hive-mindery, government-stolen youth, and more bling than Midas' collection of Braille books was a simple message: "America, if we can hit that bull's eye, the rest of the dominoes will fall like a house of cards. . . Checkmate!" (translation from the original Chinese was provided by Zapp Brannigan) In other words, if we can indeed give you a run for your money in this game of Risk in 2008, in a few years you'll be back to playing Checkers (Chinese or otherwise). Leave the global strategizing to us; your day is done. Nothing spells an end to hegemony like a bronze medal.

It doesn't help that to much of the world's citizenry--especially Americans--the Olympic Games have a greater legitimacy than the United Nations. Russia invaded Georgia? Perhaps we can resolve this issue on the uneven bars? The world apparently stands as one only for a two-week period in even-numbered years. But for those who can't demonstrate their importance in the world through their medal count, another option exists: hosting privileges! Presumably in their desperation to prove they still deserve a permanent seat on the U.N. Security Council, the Brits will try and put on a hell of a show in 2012. But--barring an opening ceremony prominently featuring Gropecunt--they will fail to live up to the example set by the Chinese. They would be wise to lower expectations and instead aim at beating their successor city, Chicago (or Madrid or Rio de Janeiro or Tokyo). The crumbled empire has a better chance of besting the crumbling empire than it does the ascending empire. Though, let's face it: Chicago will still kick London's ass.


*This event usually ended in a tie.

Tuesday, August 26, 2008

Will the 2012 Opening Ceremonies Be Anywhere Near As Good As the 2008 Opening Ceremonies? Teshale

*The general idea of this essay has been broached, by myself, in much abbreviated form elsewhere on the internet. Just so you know. I frequent forums. Do not let this dissuade you from perusing the following, dear reader.

So, with a giant memory tower of people, the latest Olympic ceremonies have come to a close. I hear there were some sporting competitions between ceremonies. Unsurprisingly, China offered organization, enthusiasm, and manpower beyond already high expectations; NBC commentary estimated that there were as many as a million volunteers at the Games this year. The two thousand and eight men who drummed their way through the opening ceremonies were awesome to behold, not to mention all the other dancers. Apparently the drummers were told to smile as much as possible because otherwise they appeared rather threatening, but as I watched them perform, I’m sure I was not the only one sitting in my pajamas at home feeling a little unnerved at just how unbelievably in sync all those people were. Truly, the whole world was united, in feeling slightly inadequate. Imagine how the British, hosts of the 2012 games, felt. From the sleepy town of Fenny Compton, which seems to be the most stereotypical Small Village I Have Never Heard Of I have ever heard of, to the halls of Whitehall, that sound you hear is an entire country collectively flagellating itself for the seemingly inevitable letdown of London 2012.

Now, the Olympics were not only an opportunity for the host nation to try and outdo everyone else in how bombastic their ceremonies were. They were quite well-organized, with few sport-related controversies (apart from the 13-year-old... uh... 16-year-old gymnasts) and offered moments to fill years’ worth of inspirational Nike advertisements, although they could not quite manage to unite the world in peace and harmony for their whole run. Events were all on time. America watched the ascension of Michael Phelps into the heavens to sit at the seat of Olympus with his father, the mighty Zeus.** Kenyans celebrated their first marathon gold. The Japanese upset the Americans in the softball finals, and the Nigerians did not upset the Argentinians in the soccer finals. I myself laughed delightedly at the cruelty of it all, as some Australian guy very cheekily snatched the gold away from the Chinese in men’s 10-meter diving to prevent their clean sweep of diving events, literally with his last dive of the evening (four 10s, you guys. FOUR!!). Most everyone seemed satisfied with the affair, and the Chinese got a chance to show the world that their country was not simply a gigantic factory for all the stuff in your house, “guarded” from dissension by a firewall and overly friendly police.

The closing ceremonies were not quite as jaw-dropping as the opening, purely because at this point it was known what to expect rather than because of any failings on the hosts’ part. I do wonder why those drummers were wearing bike helmets, though, and I’d like to get my hands on one of those green lightbulb suits. But they did give the world a glance at what the 2012 Olympics might be like.

The next Olympics, as the double-decker bus and bowler-hatted dancers and twirling umbrellas and David Beckham gurning tipped everyone off, will take place in London. Watching the 8-minute transition performance, one is struck by the difference between the two countries’ styles. China: manpower by the thousands. Incredible coordination. Breathtaking, if slightly lofty artistry. London: Jimmy Page. Of course there were dancers, wearing delightfully bizarre outfits meant to represent different segments of the new host city (I am sure I saw a dancer wearing a pinstripe suit with a graph of quarterly earnings on, I assume to represent the City) (as opposed to the city); there was a choir of young, hip-looking kids singing My Country, ’Tis of Thee; there was an adorable little girl who represented the…I don’t know, there’s always an adorable little girl at the Olympics; all standard ceremony stuff, all very well. But all of this paled in comparison to a middle-aged man who, even while being forced to mime playing, still rocked your face off.

Now anyone who knows anything about me knows that I am a crazy Anglophile. I can’t explain it. Look, the whole thing about inexplicable loves is that they are inexplicable, don't you judge me. So, in the interests of full disclosure I will admit that I am a sucker for most British things (apart from oppressive colonization, and racism, and blood sausage). I do not know how you can’t appreciate a city with a
suburb called Cockfosters, one that named its red light district this. (This is not limited to London: see my book Rude Britain [2005] for more information.) But it seems that the people tearing their hair in grief about how awful their ceremonies are going to be are Londoners themselves, not anyone else. We love that Swinging London red double-decker bus stuff. (By we: I mean I.) Granted, they are the ones who will have to follow the Chinese show, not any of the other six billion of us, so they have a right to wonder. And granted, they are completely correct in thinking they cannot top the acrobatics and syncopation of the Chinese. But all of this self-hatred, while very British, is pointless. Nobody expects them to be as good as the Chinese at elaborate man movements, because almost no-one is as good as the Chinese at elaborate man movements. When the youth of the world, in answer to Jacques Rogge’s call, assemble four years from now in London, the British should stick to their strengths to welcome ‘em in their opening ceremony. These are: music, self-deprecation, probably fashion, soccer, and trendsetting. Don’t believe me about that last one? Two words for you: Swinging London(…baby.) All of these things were addressed in the Brits’ handover ceremony, which got the job done of stirring up some excitement for the next Olympics.

If all else fails, make the 2012 opening ceremonies a two-hour Led Zeppelin performance. There is nobody on Earth who will say the opening ceremonies were not awesome if this happens. Either way, I suspect Londoners’ harshest critics will be themselves-- four years from now people will probably only remember that the Chinese put on a spectacular show, recalling few of the actual events comprising it. The 2012 Opening Ceremonies will probably not be as awe-inspiring as the 2008 ceremonies, but I’m sure the Brits can work out an entertaining spectacle. And, there will always be David Beckham, and no doubt, good fireworks. (The British know fireworks.) (Too soon?)




**Props where they’re due: one of my friends and her family came up with that.

Note: Overworked, overconventioned (long live Teddy K!) Stanek's response will be...later.

Monday, August 18, 2008

Technology: Oppressive Big Brother or Harmless Little Sister? -- Teshale's Comment

I agree with Stanek's argument that technology has the potential to create the semblance of free discourse. I believe, however, that the issue with technology today is not necessarily that it creates apathy in and of itself. I would argue that meaningful (for varying values of "meaningful") associations are alive and well on the internet. There are hundreds of message boards and online forums and communities for every sort of interest. Tocqueville's quote is even more apt nowadays; I myself accidentally stumbled upon a forum of sneeze fetishists the other day. Without the internet, probably none of these people would have ever met, chiefly because no one on earth would ever admit to having a sneeze fetish.

The problem with the internet is also one of its main attractions-- it does not require you, as a person, to adapt if you don't want to. This is not to say that those who participate in online communities are all antisocial nerds too awkward to carry on a basic conversation in real life; that is just patently untrue. Granted, if someone happened to be such a person, they could get away with it if they wanted to. However, people who are already quite garrulous in real life will not turn into loners because they find people they like talking to online. There are plenty of sports nerds in bars, plenty of music nerds who hang out together, plenty of film nerds that form clubs at school. I think that the internet is generally useful for someone who develops a very keen interest in a specific or unusual thing-- 18th century firearms, kiteboarding, writing stories about Kirk and Spock doin' it, whatever. One of the good things about the internet is the sheer number of people on it; statistically you're bound to find a group of people that share your interest, however bizarre. On the other hand, the fact that it's so easy to find a group online means you don't have to bite the bullet and strike up a conversation with someone at school (i.e. participate in real life), which you will have to do sooner or later. There is a good case to be made for saying it's healthier to bowl in a league than by yourself, but on the other hand, just because a group of people like bowling doesn't necessarily mean they'll like each other.

Facebook is an interesting example. It's generally agreed to be crap now, although the date of its change is disputed (when the network opened to non-Harvard students? To high school students? To everyone? When it got ads? When it was bought out? When they started making all those stupid apps?). Why is it crap, though? Because most can tell that the point of it is no longer, despite what Zuckerberg insists, to create a social network online. It's simply an advertising tool for corporations; if people do meet, it's in spite of the changes, or in reaction to them. Even when it only allowed college students, it was very difficult to create genuine friendships on Facebook, because it was difficult to gauge how into (band) or (film) another person was. Facebook seemed, essentially, an attempt to transfer real-world networking online without taking into account how real-world networking works (i.e., conversation). It was the equivalent of introducing two people at a party. I'm sure people have become good friends using Facebook, and I suppose it's useful to immediately know if someone in your class likes the same band as you, but this sort of knowledge is useless if that band is, say, the Beatles.

I think it comes down to whether or not you believe it's possible to genuinely be friends with someone you have never met in real life. Many online communities operate upon the assumption that this is true, and have had meetups, conventions, and so on, so I don't hold with the idea that it usually creates a false sense of community. (Would you really want to hang out with that guy down the street for any other reason than you both have Xboxes?) In its best form, it fulfils all those cliches about bringing people together from all over the world in a sort of glorified pen-pal network. In its worst, it can exacerbate the worst or most harmful of intrinsic tendencies in people. I personally think its pros outweigh its cons-- I have spoken to a lot of interesting people online, almost all* of whom were/are normal people-- but only just. The internet, like I've said, isn't really good or bad-- it will just let people do interesting or harmful things a bit more easily.



*As in real life, sometimes you just meet mad people.

Saturday, August 16, 2008

Technology: Oppressive Big Brother or Harmless Little Sister? -- Stanek

Since I first picked it up,* Orwell's Nineteen Eighty-Four has been one of my favorite pieces of literature. It is a work that—among many other things—affords a deeply disturbing glimpse of the doors technology flings open for a totalitarian society. Omnipresent government-monitored telescreens, an effective state-oriented governing ideology, and a near-total monopoly on the flow of information supply the social control needed for those with power to retain it, for a time at least. Of course, I'm selling the cleverness of Ingsoc's architects short but space and attention spans (mine most of all) are limited.

The real 1984 has come and gone (as has Walter Mondale but not, strangely, Geraldine Ferraro) but Orwell's point still rings true: the relentless march of technological progress makes it ever-easier for the few to exert control over the many. Not to say that this has not been the case throughout human history; it is simply much easier today to place thoughts in someone's head that are not their own or to monitor their activities. And, while any comparisons drawn between the direction the U.S. is taking today and Orwell's novel are alarmist at best, I would be remiss if I didn't call attention to the gradual nudging of select civil liberties toward the chopping block in favor of technological supremacy. It can occasionally be infuriating to see leaders who should know better failing to go to bat for a sacrosanct piece of paper. Who among us is not tempted to use Emmanuel Goldstein as a metaphor for the latent political courage of his favorite political party? Still, it should be stressed that the unsavory political potential of modern technology is nowhere near to being utilized to its full capacity anywhere in the world that I'm aware of, but then I don't get out much.

The natural flip-side to this coin is that technology can be a defensive weapon against overbearing governments. This is part of the reason that some governments have placed restrictions on the import of cryptography technology; the shield such tools provide to political dissidents—or any citizen who believes he has a right to privacy—is something some states do not feel comfortable allowing. More than that, technology (specifically the interwebs) is supposedly the great democratizer, the vanguard of Lady Liberty. Any complete nobody with a bag of Cheetos and an internet connection (i.e. people unlike Teshale and myself) can hear and voice ideas. And the give-and-take of democracy is the antithesis of totalitarianism.

But democracy is about talking to people, not at them (and make no mistake, since the comments here are so severely underutilized, I am most definitely talking at you). Over a decade ago, Robert Putnam wrote a paper, which he later spun into a book (coauthored with Teshale, I think), called “Bowling Alone: America's Declining Social Capital.” His point, grossly truncated, was that the connections between us are eroding. People bowl more now than they used to but they do it alone these days instead of in leagues, engagement in politics and government has plummeted to worrying lows, and so on. Among the culprits Putnam identifies is the "technological transformation of leisure." The growing popularity over the past half century of isolating technology has been crowding out bona fide human interaction. For a little perspective (particularly vis-à-vis democracy), I'll share a note made by Putnam in the paper:

When Tocqueville visited the United States in the 1830s, it was the Americans' propensity for civic association that most impressed him as the key to their unprecedented ability to make democracy work. "Americans of all ages, all stations in life, and all types of disposition," he observed, "are forever forming associations. There are not only commercial and industrial associations in which all take part, but others of a thousand different types--religious, moral, serious, futile, very general and very limited, immensely large and very minute. . . . Nothing, in my view, deserves more attention than the intellectual and moral associations in America."


Today, sometimes it seems that Google is my closest friend, the Alfred to my Bruce Wayne (that's right, the Michael Caine incarnation). The Vietnam-era mass mobilizations of young adults have apparently been replaced by a plethora of “I bet I can find 1,000,000 people for ___, lolz!!!1!!11!” Facebook groups. Everything from groceries to brides (probably) can be ordered online. You don't even have to round up eight people and two Xboxes for a rousing game of Halo anymore. The Internet will take care of you. But the sealing of people into technological bubbles that sometimes overlap but often are merely echo chambers is not conducive to democracy. In addition, I've seen no indication that the Internet is exempt from Michels' iron law of oligarchy. Even in a medium where everyone has a voice, some voices will drown out the rest.

In the world of Nineteen Eighty-Four, perhaps the most palpable element is the isolation. The protagonist, Winston Smith, notices it acutely because there is little else to occupy his mind. Apparently even Orwell could not envision a world in which the technology used to impose the isolation could simultaneously fill minds with all manner of distractions and white noise to the point that the isolation becomes nearly imperceptible. Whether corroding democracy by buttressing totalitarianism or fostering apathy and disengagement, technology may well be a wolf in sheep's clothing, an agent with profoundly anti-democratic potential in the guise of a democratizing force. Buyer beware.



*Actually, I first read a free version online. How that fact impacts the debate here is a question I'll leave up to you.

Tuesday, August 12, 2008

Should People Read Books Before Seeing the Movie? -- Stanek's Comment

Teshale raises a timeless but ultimately irrelevant question: do I have to read the book first? The answer, she rightly concludes, is absolutely not. Movies are more easily enjoyed when viewed only on a superficial level. Never tell me the book was better; frankly my dear, I don't give a damn. Shit, I paid eight bucks to catch a flick, not read a book on a screen. Movies are about over-salted, under-buttered, over-priced buckets of popcorn; sticky armrests awkwardly shared with a stranger not gracious enough to leave the obligatory open seat between you; jibber-jabberers that fail to appreciate that “please don't spoil the movie by adding your own soundtrack” does not just mean turn your phone off, it means shut the fuck up(!); obnoxiously distinct laughers who fail to trail off with the rest of the crowd; littering; etc.

You most certainly do not get that experience from the book. And if you do, your reading habits leave serious room for improvement. Besides, if you have to ask the question of whether or not you should read the book, the answer is decidedly no. Suppose the movie can only be appreciated by fans of the book: well, in that case forget it. It does not want or need you, and you do not need to surrender your dignity--by giving in and reading the book--just so you can catch a matinee. And honestly, it's for readers who enjoyed the book, not for posers who might come to like the book someday. Get over yourself. On the other hand, an adaptation that is accessible to everyone is designed for people like yourself. Why insult the moviemaker's hospitality? Reading the book under those circumstances is not merely imprudent, it is positively rude.

Now I started off by saying the question is ultimately irrelevant. Be honest with yourself: broaching the question is merely a formality. You were never the literary type. Sure, you may consider yourself an intellectual (though not a public intellectual, like Teshale) and perhaps you even have a stack of novels you've bought for “when I have the time.” But you do not really want to read the book first. Books are inscrutable; they revel in being unjudgeable based only on their covers. Movies are your friend, mercifully willing to do your thinking for you (unlike those haughty books!). They will tell you whether or not you want to try giving the book a read. If trusting the movie's judgement is wrong, hey you don't wanna be right. Teshale may view this attitude as a cardinal sin but I place it all under the blanket category of “meh.”

Having said all that, I want to end on a note of caution: some of you may have the silly idea that if someone coughed up the money to adapt a book into a movie there must be a reason. Ergo either the movie will be good or—the worst case scenario—the movie will be a bad adaptation of a good book. Unfortunately, there is no guarantee that this rosy scenario will play out in reality. I present one of the finest examples of a crappy book being adapted into one of the worst films ever made: Battlefield Earth. You may be asking yourself why anyone would make this tripe. Well, Vinnie Barbarino is a force of nature and people do crazy things, not just for love but for their crazy cultish sci-fi religions (the late Isaac Hayes—hello, children!—went so far as to hang up his animated chef's hat for it and it was humanity's loss). But they also do it to fraudulently overinflate their budget to scam investors. To each his own (don't worry, the story ended happily: the studio went bankrupt).

Should People Read Books Before Seeing The Movie? Teshale

Stephen King is one of the few authors whose books I have not secretly ghostwritten for fun. King's output is almost as impressive as his net worth; he has maintained a successful career for thirty years across multiple platforms-- novel, short story, and film. King is best known, most likely, for his horror and supernatural writings-- It, The Shining, The Dark Tower and so on-- but he has also written more realistic fiction, such as a short story dealing with 9/11 ("The Things They Left Behind"), baby boomers (Hearts in Atlantis), etc.
The latter novel brings me to my topic today.

The book-to-film adaptation is a long-heralded Hollywood tradition. Stephen King is no stranger to it, and his novels have, on the whole, fared well when translated (The Shining an obvious example, but Hearts in Atlantis was not bad). This is probably for reasons of economics; Stephen King is rich enough, and powerful enough, that he has the luxury of being choosy about whom he options film rights to. In any case, many adaptations have not been so lucky.
In fact, many adaptations of books suck ass. The League of Extraordinary Gentlemen, for instance. A great idea, a perfectly filmable story, is turned into a series of explosions occasionally punctuated by Sean Connery phoning it in. A Series of Unfortunate Events was a series of bizarre and unnerving and clever novels, rich and dark as any Victorian melodrama, and they were robbed of their bite by a gurning Jim Carrey in a series of disguises.
I have not seen Cold Mountain, but then again I have not read it. Screw that.

There are three reasons a crappy adaptation is unfortunate. One, if people are not aware of the book until after the film comes out, they are under the misapprehension that the book is equally crappy. (It is painful to me that many great books are made into films that remove whatever made the book interesting in the first place; the book is then blamed for the resulting box office failure.) Two, if people are aware of the book, they are disappointed that its adaptation sucked so hard. Three, in the case of literary adaptations, it allows vaguely pretentious people to brag about how they had read the novel before the film was made. For these reasons, I advise against reading a book before seeing a movie, no matter how much a bookish friend may plead with you.

The solution, rather, is to read the book after watching the movie. This benefits authors, who might get an extra print run and some publicity. This benefits filmgoers, who get an interesting story to read. There is no direct benefit for filmmakers, who have already taken people's money, but they are still happy. Take There Will Be Blood, which I believe is the Great American Film. How many people had read Oil! before 2007? How many people (who are not history teachers) had even heard of it? I certainly hadn't--I'd only heard of The Jungle, in American history class in high school. But now I want to read it. Maybe I will even learn something, although I know a lot of things already.
Or note the case of The Dark Is Rising by Susan Cooper. Many bookish people read it when they were young, myself included. Most people enjoyed it. It was an interesting take on Arthurian and Welsh mythology, with kids who had problems and argued and acted like real kids, even if they were unwilling Chosen Ones. The filmmakers decided, hey, to hell with that characterization BS! To hell with different! We'll make the main kid American, because it's hard for American kids to identify with British kids. We'll give him cliched superpowers like stopping time or whatever, because kids love cliched superpowers. We'll make his main problem getting some girl, because what kids like watching is some douchebag with stupid hair using superpowers to impress a girl. (Actually, maybe that is what people want to see.) One could very easily advise a person exiting the theater after watching this tripe, tearing their hair in grief and anger at its crapitude, to read the book, as it is better. It may even be slightly cheaper, depending on whether a used bookstore is within walking distance.

Even if the adaptation is pretty good, like V for Vendetta or The Shining, or even extremely good in its own right, like The Godfather, the film usually loses compared to the book. Reading the original is overwhelmingly a step up, and people don't read enough anyway. Reading the book afterwards allows you to wash the bad taste of a crap film out of your brain, and occasionally it may even pleasantly surprise you, as in cases where the ending of the book is completely changed for seemingly no real reason. (Not like I'm bitter, or anything.) Instantly lives are improved and reading comprehension increases; instead of a nation of sluggards with myopic eyesight, we will create a nation of sluggards with myopic eyesight who have now read some sweet-ass books.* A vast improvement.

*There is one exception to this rule. That exception is people who have read Eragon by Christopher Paolini, and/or have seen the film.

Saturday, August 9, 2008

Political...Scientists? Teshale's Comment

I agree with Stanek's point that science and politics are two vastly different animals, like unicorns and sharks, or pandas and Bill O'Reilly. However, I do not believe bureaucrats are the answer to bridging the gap. Such a union, in the hands of an expert, might well integrate seamlessly, like a mermaid where the human bit is the politics and the fish bit is the science. But is it not more likely that one will end up with some horrifying, Moreau-esque creation? Consider, rather, that bureaucrats may be the cause of the gap. Their jobs could well depend on keeping people apart--if this is achieved, they can act as the middleman. Most bureaucracies, in fact, seem to devolve into elephantine mazes of rules and regulations, where nothing is really feasible without the proper forms (as noted in my film Brazil [1985]).

I suppose I take umbrage at Stanek's usage of the word "bureaucrat," which in my eyes carries a rather negative connotation. Max Weber defined the bureaucrat's task as the following: "Bureaucratic control is the use of rules, regulations, and formal authority to guide performance. It includes such things as budgets, statistical reports, and performance appraisals to regulate behavior and results."
A bureaucrat's task is to make sure a political body runs smoothly and efficiently--not necessarily to determine any particular policy, but to ensure that policy is implemented well. I would propose that Stanek's idea is a sound one, but the task should fall to someone rather more dynamic: someone unbound by these regulations, whose office, intelligence, and skills allow him or her to seamlessly transition between these two jobs.

This task, I propose, would be best accomplished by enlightened absolutism. Some call it "benevolent despotism," but this is negative thinking. Since people are, of course, human--with all the awkward emotions being human entails--I suggest a giant supercomputer could be the answer to this issue. Giant supercomputers are nothing if not rational, and instilling a set of rules a la Asimov's Three Laws should ensure the computer does not do anything awkward, like become sentient. A drawback to this suggestion is that giant supercomputers always seem to develop megalomaniacal tendencies, but--to be fair--it's not as if humans never develop these tendencies either. With a giant computer controlling policy and society, we could a) be sure that this computer would have society's interests at heart, b) employ the men and women stuck in paper-pushing jobs to better effect, and c) allow, in the resulting collapse of society, for an entertaining and exciting post-apocalyptic world where a solitary man (or woman) can battle evil robots to save the world. Apocalypses are very in right now, as the success of my novel The Road (2006) can attest. I believe in giving the people what they want, and my proposal does so, while addressing the problem of politics and science unduly influencing each other.



(Edited. Er, for one word.)

Friday, August 8, 2008

Political. . .Scientists? -- Stanek

There was a time, nightmarish as the idea may seem, before the separation of science and state (thank you very much, πth amendment). In the eighteenth century, British polymath Joseph Priestley—perhaps most famous for his contribution to the soda vs. pop wars—found time away from the laboratory to write a treatise on liberal political theory, the Essay on the First Principles of Government. The content of it is not important, and I will not pretend to have read it. I simply wish to point out the novelty of it: a man of scientific thought and a man of political thought, both grotesquely trapped in the same body.

Today we are doing a much better job of ensuring that the political world is expunged of the offensive presence of scientific facts. In between tirelessly working to expose climate change as a hoax and searching for pristine natural habitats to rape, the former chairman (and current ranking member) of the Senate Committee on Environment and Public Works, James Inhofe, has drawn comparisons between environmentalism and the Third Reich, the Environmental Protection Agency and the Gestapo, and an EPA administrator and Tokyo Rose. On the other hand, Inhofe represents an important data point for scientifically-minded observers wishing to prove that Godwin's Law is the one immutable law in the universe. The current president—a man of unimpeachable integrity, ahem—informs us that biological science is effectively equivalent to religious mythology and both should be offered up to students in science classes as competing “theories.” We have been graciously reminded by a political appointee at NASA headquarters that eighty years of big bang cosmology is merely someone's opinion. We need not even touch upon the unparalleled genius and unrivaled policy efficacy of abstinence-only sexual education. But honorable mention must be made of the former Senate majority leader (and bona fide medical doctor) who suggested that HIV might be spread through tears. I shudder to think of the public health disasters represented by Bambi and Old Yeller.

But like a viral YouTube video, science cannot be entirely kept out, not even by the near-foolproof eyes-shut, ears-plugged strategy. Insidiously and nearly unnoticed, it slips in. Yes, our government is being Rickrolled by science as we speak. The culprits are those noble drones who grease the squeaky wheels of government: the bureaucrats. These are the rank-and-file NASA and EPAers whose paychecks depend on them understanding rather than misunderstanding. Freer from the nefarious demands of ideology than their hand-picked political overlords, bureaucrats are free to do the job no one elected them to do. They are mercenaries in the war between science and politics, and it is only through them that one side--Geeks or Hacks--will gain the upper hand. They may well be our last hope that a seamless and functional fusion of the scientific and the political is possible; that the polis will one day be presented with unaltered science with which to make informed decisions about the larger questions facing it. Tearing down the wall between science and politics will require going beyond politicians who let ideology manipulate science, and scientists who denounce the impurity of politics. The challenge demands someone who transcends both worlds, someone singularly bold and unafraid to bridge the gap: me.

Note: Due to some entertainment and wireless issues, Teshale's response will not be up until tomorrow (Saturday).

Tuesday, August 5, 2008

Arcade Games: A Thing of the Past? -- Stanek's Comment

I agree that the convenience of home consoles poses the greatest threat to the arcade. Nobody wants to spend an evening getting jostled by greasy pre-teens when they can get the same gaming experience at home on the couch in their underwear. Especially when they can simply download the greatest arcade game ever designed, Geometry Wars. It is nearly perfect: stunning visuals, no unnecessary distractions like plots or sanitized geopolitical backdrops, and absolutely no illusion that beating the game is possible (that's right, it packs in important life lessons).

Of course there are certain advantages to gaming in an arcade. Using the dinky joystick on a controller does not compare to mowing down alien foes with a bolted-down plastic Uzi. And if you are not playing Cruis'n USA (or the equivalent) with a steering wheel and a gas pedal, then you are undeniably doing it wrong. Arcades also offer the opportunity to cream opponents face-to-face. On the other hand, the rise of online multiplayer gaming makes it easy to pwn dozens of n00bs per hour without leaving the...basement. But I digress; what will it take to keep arcades relevant?

A successful arcade for the twenty-first century will need to incorporate three crucial elements:
1) Eliminate the need for change. It jangles, it falls and rolls under the machine, and only one denomination has FDR on it. And any place that uses tokens in lieu of quarters should be razed to the ground.
2) Use a bouncer. Keeps out the riff-raff. As in, children.
3) Have a full-service bar. And a place where I can grab a steak. Saving the galaxy (or perhaps playing some sort of virtual golf) can be stressful, and something to take the edge off is essential.

Luckily, such an arcade exists: Dave and Buster's, where buzzed adults outnumber snot-nosed kids. After getting past the bouncer, you load your money onto Power Cards instead of fumbling for change. For the mildly anti-social, the annoyance of other people being there can easily be dulled by the beer. On top of that, they have some fairly incredible games. I cannot simulate the cockpit of a 747 in my living room, but D&B can provide me with the experience of crashing one onto a runway. If arcades want to stay alive, that is what they will have to do: give people opportunities to maim and destroy in a realistic setting they cannot replicate at home. Most people have already been desensitized by the rest of the visual media; just give them an outlet.

Monday, August 4, 2008

Arcade Games: A Thing Of The Past? Teshale

In 1990, SNK released NAM-1975 for the Neo Geo console. The game was initially available in arcades; SNK developed a home version in the hopes that gamers would find the title enjoyable enough to continue playing it at home. (I assume that the release of this particular title is directly related to the release of my book The Things They Carried [1990].)

SNK's business model was ultimately unviable for two reasons, both of which concern the entire purpose of arcade consoles: to take your money. To do so, a game must 1) be intriguing or entertaining enough to draw you in, which is usually simple, as the glut of fighting games on the market indicates; and 2) trick you into believing it is possible to beat it, while making doing so a long, drawn-out process. With NAM-1975, only half of this seems to be true. While the story, an unusual mishmash of Apocalypse Now-meets-GoldenEye (but written, oddly enough, before GoldenEye), is serviceable enough for a shooter, the actual work of eliminating enemies is tedious and unrewarding. The final boss is extremely difficult to beat, according to reviews on http://www.neo-geo.com, and the game does not allow you to continue during that battle. One or the other is understandable; but creating an impossible-to-beat final boss, and then forcing the gamer to begin the game all over again if killed, is...why? Why would you do that? Why would you make a game so unutterably difficult and then reward the player, who suffers ruined eyesight and sore thumbs in the hopes that their efforts will come to fruition, with a gigantic pixelated middle finger? Why, SNK?! COME ON!!!

As a young woman, I wasted many hours at the local arcade playing entertaining but ultimately pointless rounds of Gauntlet Legends, Fist of the North Star, and even Aerosmith's Revolution X--but how many are doing the same today, ten years later, when they could waste their lives in the comfort of their own homes? The extreme popularity of home consoles such as the Wii and the Xbox suggests that the arcade era is on its last legs. More and more consumers are investing in consoles for the home, where, despite the relatively high up-front price, games themselves are fairly inexpensive and payment is one-time-only. Young gamers have large amounts of disposable income to spend, and the general trend suggests their pocketbooks are sounding the death knell for arcades. Online multiplayer options present all of the original perks of the arcade game without the two drawbacks--having to see other people, and having to keep a large collection of quarters in one's pockets. Thus, a previously lucrative business is unable to draw in the crowds of the 1980s and is now struggling to stay relevant.

Sunday, August 3, 2008

Introduction to the Stanek-Teshale Blog

Mike and Salom™ is a major new social, political, and economic phenomenon. Our style is fresh, our insights penetrating, our physical features striking (and our, er, my lies bold). Moreover, we're entertaining sometimes.

We have decided to start a blog to explore various random topics--though there is an ingenious method to the madness--and meditate on the nature of reality. Or society. Or something. Much like our colleagues (er, friends? acquaintances? fine: people we've heard of), Becker and Posner, we'll be blogging in a dialogic format. We'll post on Mondays or, alternatively, whenever we feel like it. The first post will be whenever Teshale gets around to it.

Teshale (er, teshle?) is some kind of writer, and possibly a CIA psy-ops agent. The less you and I know, the safer we'll be. Teshale is co-host of the someday-going-to-be-a-hit radio show Mike and Salom in the Morning and a partner in an upstart consulting firm. Stanek lacks direction and has a major that seems okay with this. He is Teshale's co-host and all-around business partner. His name generally goes first because it just sounds better that way.

We'd like to close by thanking anyone who reads this. Ever. You're fantastic. Now start your RSS feeds.