Thursday, July 31, 2008

Two and a Half Things That Aren't Funny

I'm no expert or anything (Adam Carolla says that everyone believes themselves to have a great sense of humor and only 10 percent of people actually do), but I've been watching this season's Last Comic Standing, and it's a bleak bunch. Other than Iliza Shlesinger--and even she's more hot than funny--and that bald British guy who plays the double bass, no one makes me laugh. The contestants on LCS fall into two traps of modern comedy, both of which are getting more and more popular. These two things are not funny:

(a) "OMG THAT'S SO RANDOM." Really, I think this mistake is the product of overshooting. Detail is funny, no doubt. One of my favorite commercials right now is the Cars.com spot where the woman tells the salesman that she mixed up brownies with horse laxatives. The detail makes that joke funny. "Horse laxatives" is 50 percent funnier than just plain "laxatives." (The joke would be even funnier if it were "donkey laxatives," both because "donkey" features the komedy k sound and because donkeys are inherently funnier animals than horses.)

But there's a limit, and once a joke begins getting too detailed, it moves into "OMG THAT'S SO RANDOM" territory and stops being funny. One of the contestants on LCS, for example--I can't remember his name, but it's the guy with the "creepy stalker" persona--said that he can't get any laughs because he "smells like old squash." That joke doesn't make me laugh because it's not tied to anything in reality--it's a detail that could have been funny if it came from something real, but it doesn't. "Smells like failure" would be funny because the comedian actually does.

The most overrated comedy on television, Family Guy, does this all the time. Think of the psychotic monkey that lives in Chris Griffin's closet. That running gag assumes either that we think monkeys are funny in and of themselves or that we'll be amused by the "randomness" of the reference. Neither is true, at least for me.

For even more examples, look at Napoleon Dynamite--which I truly and utterly abhor--or that Chuck Norris website that was so popular until Chuck tossed his support behind Mike Huckabee.

(b) The Straight Reference. We've seen an increase in this type of "humor" with the Date/Superhero/Epic/Disaster/Scary Movie franchises. There are some fantastic parody movies out there--The Naked Gun and Airplane! being the two best--but these recent comedies aren't really parodies because they deal with recognition rather than with humorous change. That is, they just present us with the reference as if our recognition of a bald Britney Spears is funny in and of itself. It's not.

In stand-up comedy, impressionists are more often than not guilty of this. Marcus on LCS gets way too much mileage out of his stock Christopher Walken impersonation. (News flash, professional comedians: Since everyone can do Walken, Jimmy Stewart, and Arnold Schwarzenegger, those impersonations are no longer useful or entertaining.) A good impressionist can make me laugh until my sides hurt, but doing the impression is worth nothing without snappy writing. Marcus isn't smart or snappy, and so he comes off as a frat-boy douchebag. He entertained the Playboy Bunnies (and I knew he'd get the immunity), but a good rule of thumb should be that if the Girls Next Door think you're the funniest thing ever, you ain't.

Hey, once again, Family Guy is guilty of this. The show frequently just presents us with unchanged references to pop culture. There's the time the Griffin kids sing "Sixteen Going On Seventeen," and then there's the time Stewie sings "I've Grown Accustomed to Her Face." I kept waiting for the joke, and one never came. But the show filled two minutes with each of these bits.

Oh, and a bonus warning: Parody can be very funny, but only if you use it to get at the truth. I loved "Weird Al" Yankovic growing up, and he's fantastic at this. "Smells Like Nirvana," for example, is funny because of what it says about the source song--it crawls inside of it and understands it. Likewise, "Living with a Hernia" (a parody, of course, of James Brown's "Living in America") is great not only because of the research Yankovic must have done into hernias but also because of the way it plays with James Brown's mannerisms. Maybe his screeches and shouts are the result of a medical condition.

God's Pottery, mercifully eliminated from Last Comic Standing, got an obscene number of laughs from people who have apparently never met an evangelical. The evangelical virginity movement is ripe for parody, it's true (King of the Hill has a wonderful and insightful episode called "Luanne Virgin 2.0"), but God's Pottery has a tin ear and no affection for the movement. (Parody doesn't work as well if it's mean-spirited.) They seem to have formed their impressions of Christians from Rod and Todd Flanders on The Simpsons. Rod and Todd are funny characters, but not because they're like most evangelicals; they're funny because they are exactly the type of child Ned Flanders (who, while exaggerated, is a much more rounded and true-to-life character) would have.

Once again, an example from Family Guy: The Griffin family moves to the Southeast with the Witness Protection Program. The writer of the episode had obviously never been south of Rhode Island because the episode doesn't ring true with anything in reality. One bit that made me particularly livid was a throwaway line: The radio DJ says, "That was Merle Haggard, with 'I Kissed My Sweetie with My Fist.'" Obviously, the writer has no understanding of or affection for Haggard in particular or country music in general. There are very few country songs promoting spousal abuse (and almost none since the '40s), and Haggard is exactly the type of singer who would never sing one. Parodists with a tin ear for their topics end up being aggressively unfunny.

Monday, July 28, 2008

Judge Vilhelm Replies


I've been reading noted literary critic James Wood's first and only novel, The Book Against God, and am enjoying it as much as I enjoy his criticism in The New Yorker. It's the story of Thomas Bunting, a seventh-year PhD student in philosophy who can't finish his dissertation because of a much more pressing and elaborate project, the Book Against God, which is pretty much exactly what it sounds like.

I'm not too far into the novel just yet, but I suspect what we've got here is a portrait of the Kierkegaardian religious sphere removed from belief in God. (I doubt Wood, who I think is an atheist, intended such a meaning, but thank the Lord or evolution for the Death of the Author.) Thomas Bunting is, in my early analysis, a sociopath, a man who lies compulsively and feels no remorse, who can't find a philosophical difference "between lying to one's wife and lying to a corporation." This statement horrified me at first, but upon further analysis, it's akin to what I was taught growing up in a Southern Baptist church: No hierarchy of sins exists, and theologically if not practically, they are all the same. (I am interested to know if other brands of Christianity teach such an idea.) At any rate, Thomas ends up as a moral absolutist of sorts, even though he rejects morality.

So it makes sense that he describes Nietzsche as "one of my favorite philosophers," since his views echo the German's, specifically those of the essay "On Truth and Lying in a Non-Moral Sense." Nietzsche, a consummate atheist, argues that since consciousness "has no further mission that might extend beyond the bounds of human life," truth is
A mobile army of metaphors, metonymies, anthropomorphisms, in short a sum of human relations which have been subjected to poetic and rhetorical intensification, translation, and decoration, and which, after they have been in use for a long time, strike people as firmly established, canonical, and binding; truths are illusions of which we have forgotten that they are illusions.
Morality is therefore irrelevant to Nietzsche, and the prohibition against lying is a social construction and has no intrinsic value.

(I've often said that Nietzsche is the atheist I respect the most because he is the one who is unafraid to face directly the logical conclusion of materialism, that is, that there is no such thing as right and wrong or as value. We have no reason to do "the right thing," and further, we have no grounds on which to condemn anyone for doing something we find horrifying.)

Wood's book has value not just on its own terms, then, but because of what it suggests about Nietzsche, i.e., that he exists in the religious sphere without belief in God. But first: What on earth do I mean by the religious sphere? I wrote my thesis using this schema, so please bear with me as I ramble on . . .

Kierkegaard posits three spheres of existence, each of which telescopes and contains the one that precedes it. He does so chiefly in three books: Either/Or, Fear and Trembling, and Stages on Life's Way. The first sphere, the sphere in which most of us exist, is the aesthetic. The aesthete defines himself entirely by himself. He lives in his own head and has little genuine connection to the outside world. His relationships are thus necessarily strained, and indeed, the aesthetic resembles what Heidegger would later call the being-against-one-another (exactly what it sounds like). The aesthete believes himself to be happy but is deeply alienated--his sole goal in life is pleasure and more specifically, to avoid boredom, which after all is "the root of all evil."

A person enters the ethical sphere when he commits to a particular idea or theory, when he attaches himself to something outside of himself. Kierkegaard's example is of marriage, but the ethical could be any belief system--most religious folks are in the ethical sphere rather than the religious. (We shall see why in just a moment.) Walker Percy says that the great ethicists of our age are the artist and the scientist, and he suggests that both of them are in quiet despair, especially the artist. Reason and logic also belong to the ethical sphere.

The religious sphere is entered only when a person lays aside the ethical sphere in what Kierkegaard calls a "teleological suspension of the ethical" (later theologians would call it the "leap of faith," a term which Mr. K never uses). Kierkegaard's famous example is Abraham, who turned his back on the laws of society when God told him to sacrifice his son. One lays aside the ethical not to serve oneself but to serve God, and it's very important that this is not a teleological abandoning of the ethical but merely a suspension--Abraham is a knight of faith not because he was willing to sacrifice his son but because he knew that God would restore him. The religious sphere is a realm of pure subjectivity, and a person is enthralled only to God. Not very many people make it, and Kierkegaard says nothing, as far as I know, about what a person stuck in the religious with no belief in God would look like.

But I think Wood gives us just that, and here's why: Thomas Bunting's wife dislikes people "for only two reasons: it's either something murkily musical, or something elusively ethical." But he is unable to understand her system of demerits and demotion, and this failure to understand suggests a failure to understand both the aesthetic and the ethical. (The reasons for the ethical should be obvious, but Kierkegaard suggests that music belongs to the aesthetic sphere.) With those two spheres eliminated, Thomas has only one place to live, the religious, and given his affinity for Nietzschean thought, I don't think it's unreasonable to suggest that Nietzsche, too, exists in this sphere.

This connection explains the similarities between Kierkegaard and Nietzsche--I have often said that their ideas are basically the same, although they start from opposite places. They may meet here in the religious sphere, even if Nietzsche is unaware of it.

Monday, July 21, 2008

A Brief Prose-Hymn to John Lasseter


Lasseter is, of course, the head of Pixar and the new head of Disney animation. He directed Toy Story and Cars, among other movies, and more importantly, he's the one who's bringing back Disney's 2D animation department. (Leave it to the man who brought computer animation into the public consciousness to bring back the old style.)

And so the first 2D Disney film in half a decade is coming next year--The Princess and the Frog features 1920s New Orleans, a Randy Newman soundtrack, the directors of The Little Mermaid and Aladdin, and the first-ever African-American Disney princess. I'm geeking out.

Oh: Lasseter's other awesome move is to feature animated shorts before films, the way Disney did for decades and the way Pixar always has. I am not sure if these shorts will appear before live-action films, but I hope so, and apparently National Treasure 2 featured a classic-style instructional Goofy short.

I love you, John Lasseter. Here's to you and the new Disney Renaissance that you are certain to bring on.

Last Year's Model

I've resisted computer animation for years; it lacks the depth, ironically enough, of traditional 2D animation; too often, it ends up looking hollow and insincere, and its practitioners and apologists--many of them, anyway--are so infatuated with the technology that they end up leaving by the wayside insignificant things like story and character development. I've not seen Space Chimps, for example, nor do I plan to, but it looks like a perfectly unholy combination of poor writing and hollow effects. As for the bafflingly acclaimed Shrek series--it confuses cynicism with a legitimate critique of Disney mythology, and it features the most grating voice acting I've ever heard. It's clever without being smart, an attack on franchising that turned into a franchise itself. I hate these movies, I truly hate them, and Disney's own Enchanted, a much more affectionate tweaking of the princess trope, really shows how petulant and juvenile the Shrek movies are.

Disney screwed up, too, though--as movies like Brother Bear and Home on the Range foundered at the box office, Robert Iger and company shut down the Mouse's 2D animation department, only to produce Chicken Little and Meet the Robinsons, themselves greeted by tepid commercial reception. Disney apparently fails to understand that the third dimension doesn't help anything if the writing isn't there. This fall's Bolt, starring John Travolta and Miley Cyrus, looks pretty terrible but may surprise me. I hope so.

The exceptions to the general artistic failure of computer animation are the Pixar films, all of which--especially 2003's Finding Nemo and 2004's The Incredibles--have been whip-smart, innovative (both visually and thematically), and genuinely emotional in a way that DreamWorks features consistently fail to be. Pixar also has a penchant for choosing voice actors who are not huge stars, or at least not at the top of public consciousness, to great effect. Think of Craig T. Nelson anchoring The Incredibles or Albert Brooks as the emotional center of Finding Nemo or Don Rickles gruffing his way through the Toy Story movies. Pixar casts its actors because they're right for their parts, not because it wants to capitalize on star power. These actors don't overplay their roles or use goofy voices for no reason--I'm looking at Eddie Murphy and Mike Myers, respectively--and so the films end up with an understated tone, and understatement is where the real emotion is.

I was certain Pixar's latest, WALL-E, would be an artistic success and a commercial failure, though. The cartoon-going public, after all, made the Shrek films hits, and I doubted their patience and ability to appreciate a film with limited dialogue--the main character says only about seven words the entire movie. Also, giving emotion to robots is notoriously difficult (as we learned from the Futurama DVD commentaries). But I didn't doubt Pixar's ability to make it happen--just the public's ability to do the work to identify with the robots. But my predictions, thankfully, turned out to be dead wrong: The movie opened at number one and made $63 million its first weekend, and it's still at number six a month after its release (ahead of Wanted, Kung Fu Panda, and Space Chimps). It's not going to be the studio's most successful feature, but it's going to do well.

More importantly, it's their best movie, and one of the best cartoons I've ever seen. The animation, particularly in the first half hour, is breathtaking, and the backgrounds are especially well done. (Backgrounds in animation are to some extent like bass in a rock band, in that most people don't pay a huge amount of attention to them but they subtly make the entire experience. Pixar's famed attention to detail makes the studio great at backgrounds, and WALL-E utilizes the talents of Finding Nemo director Andrew Stanton and Jim Reardon, late of The Simpsons, two of the best in the biz.) The animation loses some of its majesty once the humans show up--3D animation has a hard time making non-creepy human beings, though it worked well in The Incredibles--but the robots always look good and are beautifully designed.

A central, though unspoken, conflict in the movie involves generational technology, particularly the difference between WALL-E and the rest of the robots. WALL-E is a 1960s vision of the future, the kind of thing you might find in Tomorrowland. He's perhaps a little clunky, a little out-of-date, and he appears to malfunction on a semi-regular basis. The newer robots, on the other hand, look like iPods--they're coated with glossy Apple white and are all curves and smooth edges. They certainly work better than WALL-E, but the movie suggests that their efficiency is to their detriment. To put it briefly, WALL-E's malfunctioning gives him a soul; he's so poorly designed that he develops a personality. (This makes him human, of course; Nietzsche suggests in "On Truth and Lying in a Non-Moral Sense" that human consciousness is a similar accident of evolution.) But WALL-E's personality, in the ultra-modern world he will briefly inhabit, makes him something of a liability, as he's in trouble as soon as he sets his suitcase down. At the very least, he seems old-fashioned once he gets to the space cruiser. His job performance (he's a garbage-bot, a rolling trash compactor left to clean up the mess evacuating humans have made of Earth) is hardly at its peak, as he spends a good part of each day collecting artifacts with which he decorates his sad and lonely apartment. He spends his nights watching a videotape of Hello, Dolly!, an old musical set in an even older time and stored on what is, in the movie's far future, a ridiculously outdated technology. He's learned from Dolly that life's greatest pleasure is holding hands, and he practices on himself. If humanity has an essence, say the existentialists, it's alienation; and so WALL-E is human, alone on a deserted planet with only Babs Streisand and a cockroach for company. (Leave it to Pixar to make those two palatable!)

The new robots are not evil, exactly, not even the Autopilot device who serves as the movie's de facto villain. But they're efficient, ruthlessly so; they're programmed to get the job done, and such a mission has little room for personality. The new robots aren't dehumanized--they'd have to be human for that--but they're kept from attaining humanity. At least until they meet WALL-E. Last year's model shines a light on every robot he meets, starting with EVE, who's a ruthless killing machine until WALL-E gives her a nickname. And when she retreats into her mission and becomes totally non-responsive, WALL-E sits with her for days in the rain and the heat, watching over her. Once she sees her own security footage of his sacrifice, bam, she's human. Something similar happens to the valet robot who angrily cleans up after WALL-E; when the latter shakes his hand and asks his name, he gains a personality he didn't formerly have. Carl Jung talks about the zeroes who make up most of the world and the ones who bring them into the clear light of day; WALL-E is certainly a one, and probably the only one in the movie, at least until he starts making ones from zeroes.

If the new robots are ruthlessly efficient, the humans in the movie are disgustingly sedentary, completely unable to make a move--they are so alienated that they are entirely unaware of their alienation. They ride around their cruise ship on recliners, watching television and drinking their food so they won't even have to exercise their jaws. They're so wrapped up in their chair-mounted screens that they don't even know they're sitting by a swimming pool. WALL-E's status as a Jungian one is reinforced when he knocks John Ratzenberger off his chair and turns off Kathy Najimy's screen--the two make a legitimate human connection (they hold hands, of course) and become heroes in their own right when utopia turns to dystopia.

The people who criticize the film's supposed attack on the obese miss the point (indeed, at least three of the film's stars--Ratzenberger, Najimy, and captain Jeff Garlin--are pretty overweight, and so is Pixar head John Lasseter). The film is not a criticism of obesity in and of itself; it uses obesity as an image of alienation. After all, the futuristic humans are not just fat--they are sedentary, and in being sedentary they are unable to make any connections, be they intellectual, emotional, or spiritual. WALL-E's heroism comes from his old-fashionedness; he seems to exist in an era where folks were industrious without being ruthless, where human connection was the main thing, where technology did not run our lives for us. Whether that era existed or not, I can't say, but WALL-E does a great job of making us believe it did, and in doing so, it eases our alienation for a little while. It makes us move, makes us want to hold hands with someone or maybe everyone.

Wednesday, July 9, 2008

The Itch Plato Can't Scratch

I'm beginning to tire of reading Plato, but I've only got four or five dialogues (including The Laws) left to go. I'd like to plow through and finish them all before the semester starts. We'll see if that happens.

I'm in the middle of Philebus right now, in which Plato claims that pleasures and pains can be true or false to the extent that they are "based upon realities" (40D). Socrates manages to convince his debate opponents, who by this time in Plato's works are becoming less and less like real people and more like cardboard stand-ins who tell Socrates "yassuh"--or maybe all of Athens was simply tired of Socrates and agreed with him merely to shut him up. I, at any rate, am unconvinced.

Plato talks quite a bit about hope and dread as pleasure and pain, so let's use these as examples. I can look forward to winning the lottery, say, and I can plan what I will do with my millions. (Since it's me in this example, I will probably fill a notebook with lists, perhaps color-coded.) But I'm not going to win the lottery, not even if I enter it--that hope, that pleasure is not at all based upon reality. But it's still pleasure. I will still gain satisfaction, however temporary, from my detailed million-dollar budget. When they announce the winning numbers Saturday night, it will become immediately apparent to me that my pleasure was based on a false reality. That doesn't mean it wasn't pleasure.

Likewise, if I panic--as I often do--and fear that I'm going to exhaust my meager resources and overdraw my checking account, the pain and the terror this causes me are no less painful and terrifying for being built on a (usually) false assumption. No, the pain is real, and the reality of the pain (to reiterate, as opposed to the reality of the basis of the pain) only adds to the reality of my pleasure when I check my account balance and find an acceptable number of electronic dollars. Pain and pleasure, emotionally speaking, exist whether or not they are attached to legitimate, that is to say, real, objects. Plato's suggestion to the contrary is an attempt to bend emotion to the intellect--an admirable attempt, but one that's not actually going to lessen emotional pain.

The argument gets stickier when we're talking about physical pain, so let me talk briefly about the physiological and psychological bases for physical pain. (I am no neuroscientist, so bear with me.) The June 30 issue of the New Yorker features a piece called "The Itch," which deals with a young woman who has an itch with no physiological basis. That her pain is not based on a physical reality does not stop her from seeking real comfort and pleasure--she scratches her head so much and so deeply that
One morning, after she was awakened by her bedside alarm, she sat up and, she recalled, "this fluid came down my face, this greenish liquid" . . . Only in the Emergency Department at Massachusetts General Hospital, after the doctors started swarming, and one told her she needed surgery now, did M. learn what had happened. She had scratched through her skull during the night--and into her brain.
M.'s itch, it turns out, is caused by nothing in particular, a variation on "phantom limb pain," in which a person whose arm or leg has been amputated nevertheless feels pressure and, oftentimes, pain on the affected area. I can't think of any better example of real pain caused by nothing in the realm of reality.

Unless it's this: When I read "The Itch," I began itching wildly all over my body. The power of suggestion caused a real pain in me based on nothing in reality. One could, I suppose, respond that I was responding to something unreal. But then there's Samuel Hafenreffer's 1660 definition of "itch," which has apparently not been improved upon in the intervening years: "An unpleasant sensation that provokes the desire to scratch." Unreality begets reality.

Sunday, July 6, 2008

It Comes and Goes

I've got a dead-person crush on Suzanne Pleshette, best known for her role as Emily Hartley on The Bob Newhart Show. Today I stumbled across this email she supposedly sent to a friend of hers when she and her third husband, Newhart's Tom Poston, were both seriously ill in the last years of their lives:

BAD NEWS

I lost all of my hair
I look like shit
Tom has a catheter in his dickie
We have round-the-clock nurses, a walker and a wheelchair

GOOD NEWS
I'm saving a fortune on bikini waxes
Tom has lost all peripheral vision so he doesn't know
At his age we're just glad he has a lump in his pants
We're madly in love
And we feel lucky.

AIN'T LIFE GRAND!!!!!!!


What a woman. If I could only travel back in time . . .

Oh, and my fiancee is behind me 100 percent on this, I assure you.

Wednesday, July 2, 2008

O (Record #75)

O
Damien Rice

Vector, 2002

This is another of those records, like Maroon, that I am embarrassed and shocked to find on my list. After all, that pretentious Canadian snob Nick Perreault joyously told me that Damien Rice creates the “safest music ever.” Fair enough. “Artistic” indie girls the world over love O and swoon as they dream of rescuing Rice from his heartbreak. Fair enough. But for whatever reason, for a few months in 2004, O meant the world to me.

2004 was a weird year for me. Between two life-defining terrible relationships, I had a brief interlude in which I was single and uninterested but surrounded by new friends, and still undiagnosed as bipolar. I had a complete emotional breakdown one night in early February, prompted by God knows what. But a few weeks later, I made the 45-minute drive to the nearest record store and bought O.

And O is a record on the cusp of complete emotional breakdown. Rice sings every syllable as though he’s about to crack, fragile and wavering. I didn’t have a girl to attach most of these songs to at first, but the feeling was right. I remember driving back from Atlanta, alone on Valentine’s Day, and listening to “Cheers Darlin’,” a halfhearted anthem to bitterness that managed to work for me anyway.

I eventually found a girl to attach them to, and I remember that in the early stages of that relationship I went fly-fishing with my friend Garrett. As we stood by the 40-degree river before dawn, I tried not to tell him the story but kept hearing lines from O in my head:
And so it is the colder water
The blower’s daughter
The pupil in denial
I can’t take my eyes off of you
And especially “Cannonball”:
It’s not hard to fall
And I don’t want to scare her
It’s not hard to fall
And I don’t want to lose
I did--I told Garrett, and I lost the girl, five or six times, actually--and if I didn’t think much about O after my 18-hour move to Omaha (the last time I remember listening to this record in earnest), it still makes me shiver in the 6 a.m. chill.

Say what you will about Damien Rice—and I’ve said plenty, starting with the overindulgence of two (terrible) hidden bonus tracks and ending with that aggravating dramatic gasp he does almost constantly—the man knows how to record. I remember hearing that he recorded O in his bedroom, but you’d never guess it. The acoustic guitar sounds crisp and full, and I’m not sure I’ve ever heard a percussion sound I liked better. He soaks the record in strings--mostly just cello, but he slathers the full 1940s film score orchestra onto “Amie”--and while I’m usually not a fan of that sound, it really works here.

Then there are the duets. Rice is accompanied on nearly every song by the vocals of one Lisa Hannigan, about whom I know nothing and who, I am told, no longer plays with him. A pity. Her vocals bring his back down to earth--they’re soft but not weepy, strong but not intrusive. Also, twist of fate: That girl I thought about down by the river was named Lisa. Not a pity.

I think, for a certain segment of the population—and I am not sure if I am part of this group or not--Rice is the indie-rock equivalent of Roy Orbison. He lacks Orbison’s glorious range, but he produces those same mini-symphonies of teenage loss and heartbreak. These days, I couldn’t care less what he’s doing, and I’ll probably never buy or listen to another of his records, but for that stretch of time, I needed to hear someone more banged-up than me—even if it was mostly for dramatic effect.

O SONG-BY-SONG
Delicate ****
Volcano ****
The Blower’s Daughter ****
Cannonball *****
Older Chests ****
Amie ****
Cheers Darlin’ ****
Cold Water ****
I Remember ****
Eskimo **
Prague **
Silent Night **