Turn on the TV or radio, or log onto the Internet, and you can’t avoid the numerous remembrances of 2011, including tributes to those who bid adieu to this world in the year about to end. The deaths of Steve Jobs and Elizabeth Taylor received the most attention, but I’ll miss Peter Falk the most (Hail, Columbo! And The In-Laws!). Others who passed on this year include Jerry Leiber, who with Mike Stoller wrote “Hound Dog” and “Jailhouse Rock” and many other rock ‘n’ roll classics, actor Cliff Robertson, and Sidney Lumet, who directed more than his share of great movies, but whom I will always remember for Network, the most prescient film of the 1970s. Christopher Hitchens discovered, as everyone does in that final moment before death, that there is a God, but the revelation came too late for him to write about. Too bad. Imagine the book that could have resulted. What title would he have given it? Maybe something on the order of Uh-Oh, God Is Great, After All or Guess Who I Met in Hell?
Barack Obama interrupted regular programming to tell us that Osama bin Laden, the supposed mastermind behind the 9/11 attacks (but actually the scapegoat), had been killed, but no one saw a picture of the corpse. Then there was Muammar Gaddafi, dragged through the streets of Libya and murdered by his own people, undoubtedly with an assist from the CIA, which has played a role in numerous revolutions during its 64-year history, including this year's so-called Arab Spring. The CIA was the secret hand behind at least two major revolutions in Iran. In 1951, Prime Minister Mohammed Mossadegh nationalized the oil industry; two years later the agency, with backing from the likes of David Rockefeller, mobilized Iranians who were hostile to the ruling powers and had him overthrown, restoring power to Mohammad Reza Pahlavi, the Shah, who would be ousted himself in 1979. In his memoirs, the Shah expressed the belief that the CIA was involved in his overthrow, too. In his place, the Ayatollah Khomeini took charge, turning Iran into the stronghold of Islamic radicalism that it remains today. Egypt’s recent revolution brought the Muslim Brotherhood to power there, and now that Gaddafi has been kicked to the curb and killed, a more extreme Muslim government will almost certainly take the reins in Libya, a guarantee that events in the Middle East will continue to make headlines in 2012 and beyond.
The year 2011 was a dismal one on all fronts, with more than a few of the “birth pangs” that the Bible warns are a sign of the end-times. The events in the Middle East may have the most damaging consequences in the long run, but don’t tell that to Japan. The earthquake that rocked the country in March registered a magnitude of 9.0, assuring it a place as one of the worst natural disasters in history. In Luke 21:11, Jesus warned of the last days that “There will be great earthquakes in diverse places.” Those who dismiss Bible prophecy as so much superstitious claptrap would argue that there have always been earthquakes and natural disasters, but shortly after the 8.8 earthquake in Chile on February 27, 2010, World News and Prophecy put it in frightening perspective:
“If we look at the 12 strongest earthquakes registered in the world since measurements of them began some 300 years ago, four - or a third of the list - have occurred within the last six years.”
I don’t know which of those 12 earthquakes dropped off the list following this year’s disaster, but it’s now safe to say that five of the twelve strongest earthquakes on record occurred within a span of seven years.
What does the world have in store for 2012?
© 2011 Brian W. Fairbanks
VISIT MY KINDLE STORE AT AMAZON
Friday, December 30, 2011
Wednesday, December 21, 2011
Countdown to Christmas
Christmas is only four days away, but it doesn't seem like Christmas, and hasn't since my parents passed away more than five years ago. There's Christmas music on the radio, including that duet by Bing Crosby and David Bowie (“Little Drummer Boy/Peace on Earth”), but I’ve found it more of a nuisance than a pleasure. Still, I'm always amused to see the video clip of that musical odd couple standing side by side in front of a piano singing that medley on Crosby’s final Christmas special, which aired a few weeks after his death. Broadcast in 1977 and titled Bing Crosby's Merrie Olde Christmas, it was the subject of much curiosity due to Bowie's appearance. Pop stars had joined Der Bingle on TV shows before, but Bowie was no ordinary hit maker. The fact that he was a bona fide rock and roll star, and, therefore, not the kind of performer then prominent on prime-time television, made his guest shot on a Crosby Christmas special unusual to begin with, but his androgynous, bisexual reputation made it downright bizarre.
It’s doubtful Crosby was familiar with Bowie, and the decision to recruit him for the special was made by the producers, who enlisted three members of the show's musical team to write a countermelody to "The Little Drummer Boy" after Bowie made it clear he disliked the popular song. The result was "Peace on Earth," which Bowie sang while Crosby handled the more familiar "ba-bump-bump-bump-bum" of the other song.
Bowie was in the midst of his most musically experimental phase at the time, having recorded the cold, melancholy Low and its follow-up Heroes during this period (Bowie also performed the title track from the latter album on the Crosby hour). Bowie’s ethereal music was never more alienated or detached than it was on these striking albums. Singing with lovable old Bing would have shocked his audience at any time, but never more so than in 1977, when he was creating some of his darkest music. And here he was, on a Christmas special no less, dueting with Bing Crosby on “The Little Drummer Boy.” Thin, somewhat effeminate, and more than a little otherworldly in appearance, Bowie provided a stunning contrast to the rumpled crooner in a comfy sweater. Surprisingly, their joint venture was a success. Their styles did not clash, and the recording, though made strictly for the CBS-TV broadcast and not intended for release on vinyl (the introduction of the compact disc was still five years away), is a memorable one. Five years later, RCA (then Bowie's label) released it as a single. It's now a staple in the season of good cheer.
There's little that's cheery about “O Come, O Come, Emmanuel,” the traditional Christmas carol that has no competition as my favorite yuletide song. Its richly melancholic melody has made it a favorite with some jazz musicians, but you do not find it being performed by many popular recording artists. Joan Baez offered a superb rendition on an album of Christmas songs she released in the ‘60s, but other than Peter, Paul and Mary, I can’t think of another mainstream performer who has attempted it. It may be too solemn, even a little depressing, certainly in comparison to "Rockin' Around the Christmas Tree" and "Grandma Got Run Over By a Reindeer," two songs I positively loathe.
If you don't hear "O Come, O Come Emmanuel" much on the radio, it's still more frequently played than the selections from the Rotary Connection's 1968 album, Peace. Apart from an electrified version of “Silent Night,” most of the material was original, yet it remains one of the finest Christmas albums ever recorded. The songs are moody and contemplative, highlighted by two exceptional pieces: “Sidewalk Santa” and “Shopping Bag Menagerie.” Unfortunately, the album has long been out of print and is hard to find on CD. If the Rotary Connection is known at all today, it's likely because its lineup included the late Minnie Riperton, who had a huge hit with "Lovin' You" ("It's easy 'cause you're beautiful") back in 1975. Her daughter, Maya Rudolph, went on to fame herself on Saturday Night Live.
© 2011 Brian W. Fairbanks
VISIT MY KINDLE STORE AT AMAZON
Tuesday, December 20, 2011
The Err to Gore Vidal
In the February 2010 Vanity Fair, Christopher Hitchens cast his critical eye on Gore Vidal, a man whose “tough-mindedness” and “subversive wit” he had greatly admired. Hitchens recalled participating in a panel discussion on the life and work of Oscar Wilde at which the moderator proposed that Vidal was the Oscar Wilde of our time, “and, really, once that name had been mentioned, there didn’t seem to be any obvious rival.”
If one had been looking for Gore Vidal’s successor, Hitchens had no obvious rival either, and even now, with his voice permanently stilled, none seems likely to emerge. Vidal himself once championed Hitchens as his heir, but then came 9/11. The attack on New York and Washington, D.C., had many ramifications, the most serious of which - the shredding of the Bill of Rights and the shedding of blood in Iraq - Vidal addressed in his controversial pamphlets, Perpetual War for Perpetual Peace and Dreaming War. One of its more frivolous consequences was the end of the two writers’ mutual admiration society.
Suddenly, Hitchens regarded Vidal as a “crackpot” for proposing that the Bush administration had advance knowledge of the attacks and took merciless advantage of them to justify an invasion of Iraq and the suspension of many of our civil liberties. Hitchens wrote that “if it’s true even to any degree that we were all changed by September 11, 2001, it’s probably truer of Vidal that it made him more the way he already was. . .” As an example, Hitchens referred to Vidal’s previously stated belief that Franklin Roosevelt ignored warnings that an attack on Pearl Harbor was imminent, knowing that such a tragedy would rally support for America’s entry into World War II.
Prior to 9/11, Hitchens was seemingly on the political left, contributing to such progressive publications as The Nation. A closer look at some of his activities suggests that the heart of a right-winger was beating in his chest years before that day in 2001. While the left marched in step, supporting Bill Clinton even as he was impeached for lying under oath in the Monica Lewinsky case, Hitchens joined the conservative choir, admirably so in my view, condemning him in a book, No One Left to Lie To: The Triangulations of William Jefferson Clinton.
Hitchens didn't really change after 9/11. As he said of Vidal, “it made him more the way he already was . . .” Hitchens' words, which I replaced with an ellipsis - “and accentuated a crackpot strain that gradually asserted itself as dominant” - perfectly describe Hitchens himself, who told USA Today that 9/11 was “an attack on America and its ideals.” George W. Bush had said that the terrorists “hate our freedom.” Different words expressing the same ridiculous sentiment.
Hitchens, the self-described contrarian, became an unofficial publicist for the Bush administration and its policies. Since he wasn’t on the government’s payroll, he was free to make statements that were more inflammatory, such as his description of the enemy as “Islamofascists.” Hitchens, like most Bush cheerleaders, failed to acknowledge that U.S. involvement in regions where we have no business being was, as Patrick Buchanan has stated, asking for trouble.
It’s a pointless argument, however, since the facts support the claim made by Vidal and others that the Bush administration had prior knowledge of the attacks. Of course, Hitchens, like others in the mainstream media, ridiculed such beliefs as unworthy of anyone but a “crackpot.”
Hitchens appears to have been something far more dangerous than a crackpot. He was a disinformation specialist. They come in all shapes and sizes, all colors and creeds, and can be found on the left, the right, and in the center of every political party. Hitchens, I stated previously, “wasn’t on the government payroll,” but many journalists are secretly employed by the CIA, and have been since the days of the company’s forerunner, the OSS. In a 1977 article for Rolling Stone, Carl Bernstein reported that the CIA’s “assets” included employees of virtually every major newspaper, magazine, and TV network, all of whom could be called on to do the agency’s bidding. Whether or not Hitchens was one of them, he certainly supported their agenda, something the true contrarian, Gore Vidal, never did.
In attempting to explain why Hitchens demoted him from idol to crackpot, Vidal told an audience that “I didn’t die. I just kept going on and on and on.”
And on he goes, a man without an heir, but he never really had one in Hitchens. Sure, they were both witty and had a gift for words, but Hitchens worked hard to counter Vidal’s most important message. Perhaps when Vidal called Hitchens his heir, he forgot to use the spellcheck. He may have meant “err.”
© 2011 Brian W. Fairbanks
VISIT MY KINDLE STORE AT AMAZON
Sunday, December 18, 2011
The Incredible Shrinking Movie Theater
When I was a young lad, I practically lived at the movies. Correction: I lived in the movies. The real world was merely something to pass through on my way into and out of the theater. Not anymore. The movie-going experience is not what it used to be and the movies are not to blame. The culprit is the movie theater.
When I was growing up in the late Sixties, downtown Cleveland, Ohio was home to such theaters as the Hippodrome, the Allen, the Loew’s State and Ohio, and the Palace, most of them located in an area still known as Playhouse Square. The Palace was especially well named because, like all of these theaters, it was fit for a king. Plush architectural wonders, they made a trip to the movies an event even when the attraction was as mediocre as Murderers’ Row, a ridiculous spy thriller starring Dean Martin as Matt Helm, a secret agent less super, and certainly less sober, than James Bond, whose fantastic adventures were all the rage during that decade. Yet I clearly recall seeing this turkey during my first visit to the Hippodrome in December 1966. I recall it more vividly than many superior movies I’ve seen since precisely because of the theater in which I saw it.
The downtown movie palaces are gone forever. The State, Ohio, and Palace are still standing, but not as movie houses. The Hippodrome, however, exists only in memory. After closing its doors in 1980, it became a target of the wrecking ball to make room for a parking lot. Considering the grade-Z exploitation movies the Hippodrome played throughout much of the Seventies (blaxploitation and kung fu flicks), its demolition was a mercy killing. The Allen, boarded up for years, reopened its doors, but only for live theater, not for the beam of a 35mm projector.
That’s a shame. These were special, distinctive theaters, completely unlike the multiplexes that dominate the industry today. Seeing a movie at the Hippodrome or the State was comparable to dinner at an elegant restaurant. Today’s multiplex cinemas are closer in spirit to McDonald’s. In the past, the theater where you saw a film was almost as important as the movie itself. If you saw the James Bond thriller Thunderball at the Loew’s State, as I did in December 1965, chances are you’d remember that you saw it at the State. Today, it’s doubtful anyone can accurately say where they saw a particular movie because the theaters all look and feel the same, and the movie playing at one multiplex is usually playing at all of them.
The neighborhood theaters of the past were not as impressive as the downtown movie palaces, but they had their own individual charms. My favorite was the Garden, located at West 25th Street and Clark Avenue, since it was in its darkened auditorium that I was introduced to the glorious world of cinematic make-believe, a world in which I spent some of the happiest times of my youth.
Today’s theaters are barely worthy of the name. The old theaters that do survive have been split up into multi-screen mini theaters, while the newer models built in the past thirty years are buried inside those ghastly shopping malls that are as lacking in character as the theaters themselves. And not just one theater, but often more than a dozen under one roof, all as unimaginatively numbered as so many of the sequels that play on their screens.
Those screens are another source of irritation. In the Fifties, when television lured Americans away from the movies by offering free entertainment in the living room, the motion picture industry fought the threat by offering what could not be found on that square box. CinemaScope, Cinerama, VistaVision (“Motion Picture High Fidelity”), and other innovative processes that required larger and wider screens were introduced. Now that television, as well as cable and home video, has proven victorious, theater screens are smaller than ever. It’s as though theatrical releases are little more than promotional tools to hype the disc or download scheduled for release a few months later.
Even walking through the lobby of those great old theaters was more pleasurable than being anywhere in one of the shoeboxes of today. The walls were adorned with large, colorful posters of the current and coming attractions. Just as the screens started shrinking, so did those posters. They’re not as eye-catching as they once were, either. The poster for Thunderball featured panels of exciting artwork that conveyed the action in which star Sean Connery engaged on screen in a way that may be more memorable than the movie itself. Posters for more recent 007 films simply feature photographs of the star posing with a gun and maybe a Bond girl or two.
Well, kids, let me sum up by saying you don’t know what you missed. I may sound like an old codger drawing unfavorable comparisons between modern reality and an idealized past, but the fact is a night at the movies isn’t what it used to be. Today, you buy your overpriced ticket, see the movie and maybe a coming attraction or two, and are then rushed off the property. Considering the quality of too many of today’s films, who needs to be rushed? Chances are you’ll be hurrying out the door with no encouragement from management for fear of being locked in overnight and subjected to another viewing of The Human Centipede by a deranged projectionist.
Murderers’ Row: now that was a good movie!
PS: On a bright note, the Capitol, a grand old neighborhood theater on the west side of Cleveland, has been restored and is a full-time movie house (with digital projection rather than film, which could be the subject of another nostalgic blog entry). It's a multiplex with three screens, but the two smaller theaters are upstairs. If you see a movie in the main auditorium, it's just like old times, especially on those Sunday mornings when, for five bucks, you can see everything from Citizen Kane and North by Northwest to The Pink Panther (the original with Peter Sellers) and The Bad Seed.
© 2011 Brian W. Fairbanks
VISIT MY KINDLE STORE AT AMAZON
Saturday, December 17, 2011
Christopher Hitchens Goes Forth to Meet His Maker
In the morning when I lift my tired body from my bed and go about the business of preparing for another day, I prefer absolute silence, but usually I turn on the radio even though I loathe the voices of those morning personalities. All I want is to return to the comfort of my bed, but these jokesters sound delighted to have braved the frigid autumn air to report to a workplace. I endure their cheerfulness because if an earth-shaking event - an earthquake, a terrorist attack - has occurred while I slept, I want to be aware of it. There wasn’t anything comparable in the news on Friday morning, but I did learn that Christopher Hitchens had died the day before at age 62, finally succumbing to the cancer with which he was diagnosed in June 2010.
Hitchens had been around a long time, writing for such “progressive” publications as The Nation, but he really seemed to capture the public’s attention in the past decade, when his bass voice and elegant prose turned from concerns of the left to championing George W. Bush and his War on Terror. Suddenly, Hitchens, whom Gore Vidal once considered his heir, smeared Vidal as a crackpot for believing, as many do, that Bush and company were complicit in the attacks of 9/11 and that the War on Terror was really a war on us - the citizens of the United States. Hitchens was now welcome to spout off on the radio shows of such conservative talkers as Dennis Prager and Michael Medved, even though both men are observant Jews and supporters of Christianity, while Hitchens was an outspoken atheist and critic of religion. It was in the latter guise that Hitchens was best known in the final years of his life, mainly owing to his best seller, God Is Not Great: How Religion Poisons Everything.
I never read any of Hitchens’ denunciations of religion, but the title of his book could make one suspect that he wasn’t as much of an atheist as he claimed. If God does not exist, it’s irrelevant to speak of His greatness or lack thereof. That which does not exist is nothing. It has no character whatsoever. Furthermore, if there is no God, would anyone even consider the question of His existence? C. S. Lewis didn’t think so, and neither do I. No one would worship God or choose competing idols to fill the void that has existed since Adam and Eve’s expulsion from the Garden of Eden. There would be no icons in any field. We would feel no need to look up to anyone, even a parent.
Hitchens stepped into eternity unrepentant, supposedly touched by those who prayed on his behalf, but steadfast in his denial of God. He insisted that any reports of a deathbed conversion should be dismissed as a lie or the words of a “raving, terrified person whose cancer has spread to the brain.” He also expressed no regrets about the booze and particularly the cigarettes that may have played a part in the cancer that killed him. “Writing is what’s important to me, and anything that helps me do that - or enhances and prolongs and deepens and sometimes intensifies argument and conversation - is worth it to me.”
© 2011 Brian W. Fairbanks
VISIT MY KINDLE STORE AT AMAZON
Friday, December 9, 2011
Classic Scrooge
Bah. Humbug. I'm tempted to utter Charles Dickens' famous expletive a lot during the Christmas season. Holiday displays featuring bears and bicycles bring out the Scrooge in me. (Bears? Bicycles? Am I missing the Christmas connection there?) Sometimes it's a version of A Christmas Carol. Have you seen the lumbering musical with Albert Finney? Or the one with Bill Murray that plays like a Saturday Night Live sketch stretched to the limit? There are versions with Mickey Mouse, the Muppets, Jim Carrey, and even the Fonz, er, Henry Winkler. Dickens' classic is in the public domain so anyone can make a movie or stage a theatrical version without requesting permission or paying the author's estate one cent. Even stingy Ebenezer would like that deal. But for my money, there are only two versions as enduring as the tale itself. One is a live action black-and-white theatrical feature. The other is a cartoon made for television.
The feature, produced by England's Renown Pictures in 1951, is known as Scrooge in its native U.K., but was retitled A Christmas Carol in the U.S. and elsewhere. It stars Alastair Sim, whom the late Shel Silverstein once called “the greatest comic actor. . . a genius.” Sim is also the only actor to make enough of an impression as Scrooge to claim the role as his own. The actor shows us Scrooge's hard exterior, but also gives us a glimpse of the sensitive young man he once was, a man bruised by life and determined to protect himself from further disappointment. Scrooge's transformation from cold miser to the man who "knew how to keep Christmas well" is believable because Sim creates a three-dimensional figure.
The rest of the cast is also splendid, with Mervyn Johns (best known for 1946's classic Dead of Night) a perfect Bob Cratchit, and skeletal Ernest Thesiger, good old Dr. Pretorius from Bride of Frankenstein, as Mr. Stretch, the undertaker as cadaverous as his clients.
Since A Christmas Carol is a ghost story as much as a Christmas tale, the mood needs to be just right, and director Brian Desmond Hurst captures it perfectly. That mood is what really sets this film apart from other adaptations. This is no jolly sleigh ride but a fairly dark tale, often depressing until its joyous climax. When it debuted in U.S. theaters, the film was praised by The New York Times for its "somber and chilly atmosphere" while Variety panned it for being too grim. The lack of contrived good cheer hurt it at the box office, but the film acquired a strong following after it began to appear on television.
Unfortunately, a colorized version that dilutes the film's power has become too prominent on television in the last decade, and the original black-and-white version can often be seen only on video.
If you really want your Scrooge in color, try Mr. Magoo's Christmas Carol. First aired on U.S. television on December 18, 1962, the hour-long animated film is unique in that it casts a cartoon character, the bumbling, dangerously near-sighted Magoo, as Scrooge, and does so without compromising the integrity of Dickens' creation. The show is well produced, with many memorable songs, and it, too, has a suitably dark tone at times, although it is considerably more upbeat than the 1951 film. If this version isn’t entirely faithful to Dickens, it is nonetheless true to his spirit and is a fine way to introduce children to this beloved literary classic.
A Christmas Carol has been popular with filmmakers since the earliest days of the cinema, with the first known screen version appearing in 1901. New versions will continue to be produced as long as we celebrate Christmas, but it's doubtful anyone will offer a version to challenge those of Mr. Sim or Mr. Magoo.
Originally published at Paris Woman Journal
© 2003 Brian W. Fairbanks
VISIT MY KINDLE STORE AT AMAZON
Tuesday, December 6, 2011
3 Forgotten Christmas Movies
The promise of peace on earth is never fulfilled at Christmas, but everything is possible in Christmas movies. George Bailey is saved from suicide every Christmas Eve, Ebenezer Scrooge always greets Christmas morning as a kinder, gentler man, and Bing Crosby rises from the dead to sing "White Christmas" in black and white (Holiday Inn) and color (White Christmas). Like the sound of reindeer on the rooftop and sleighbells in the snow, Christmas movies are a part of the holiday landscape. It wouldn't really seem like Christmas without It's a Wonderful Life and Miracle on 34th Street, but while those classics are among the most heralded Christmas movies, there are other, less popular, but still delightful films sure to brighten your holiday season.
1940's Remember the Night is the charming tale of a prosecuting attorney who takes a shoplifter to his family's farm for Christmas because he doubts he can successfully convict her on Christmas Eve and doesn't wish to see her spend the holidays in jail. The shoplifter, a product of a broken home, experiences a Christmas unlike any she's known before. Meanwhile, the prosecutor falls in love with her, making his professional duties all but impossible. Four years later, the film's stars, Fred MacMurray and Barbara Stanwyck, made one of film noir’s most cynical couples in Double Indemnity, but here they display a warmth that is as endearing as the Preston Sturges screenplay.
In 1949's Holiday Affair, Janet Leigh is a widowed mother responsible for getting department store clerk Robert Mitchum fired during the Christmas season. Mitchum is soon competing with Wendell Corey for the affections of both Leigh and her son in this modest but effective Christmas romance. It's especially nice to see big, bad Bob Mitchum in such a gentle role, and his tough guy image keeps the sugar coated story from becoming too sweet.
Bob Hope is as synonymous with Christmas as Bing Crosby. In addition to starring in annual holiday TV specials and entertaining the troops, he starred in The Lemon Drop Kid, a delightful but now relatively forgotten 1951 comedy that deserves to be revived every Christmas. In this Damon Runyon story, Hope is a scam artist whose gang poses as charity sidewalk Santas in order to pay off a debt to a gangster. Like Scrooge, Hope sees the error of his ways by the time the credits roll, but not before providing some big laughs and introducing the classic Christmas carol, "Silver Bells."
All three of these films are available on video and are sure to brighten the holiday season as much as the more famous Christmas movies. Watch them, and, who knows, they may become as much a tradition in your home as It's a Wonderful Life.
© 2003 Brian W. Fairbanks
Originally published at Paris Woman Journal
VISIT MY KINDLE STORE AT AMAZON
Sunday, December 4, 2011
Kurt Vonnegut: And So He Went
I read Slaughterhouse-Five in a college literature class. It was only then that I realized that the phrase “And so it goes” did not originate with Linda Ellerbee of NBC News’ Overnight, the sassy late-night news hour that ran from 1982 to 1983. I liked the book, but wasn’t impressed enough to read anything else that Kurt Vonnegut had written. Years later, I did read a few of his essays (in one, he praised Helena Blavatsky, the witch credited with introducing the occult to America, as “quite wonderful”), and a slim volume, God Bless You, Dr. Kevorkian (chosen because it was slim and wouldn’t require more than an hour to read). By then, he was being published by the comparatively small Seven Stories Press since the major houses had written him off as a spent force.
I always liked his hair. Like me, he had a head of thick, unruly curls, and still did when he died at age 84 in April 2007. I always liked that he was an unrepentant and unapologetic chain smoker (another trait I share, although he preferred Pall Malls while I favor Camels). Somehow, he avoided the diseases associated with the nicotine habit, although I do recall reading that he had emphysema in his later years, but a case too mild to require traveling with an oxygen tank. However, I did not like his rather smug, knee-jerk liberalism, often expressed in cheap shots like the one he took at George W. Bush on The Daily Show with Jon Stewart. I don’t remember his exact words, but then they weren’t very memorable, pretty much limited to calling Bush an “idiot.” I wasn’t a fan of Bush 43 either, but to go on TV and suck up applause for such a trite insult is unworthy of anyone, but especially of a man who made his name with words. Real clever, Kurt.
A self-professed “humanist,” a philosophy he described as promoting decency and kindness without concern for rewards in an afterlife, Vonnegut was not always decent and kind himself, even at times when decency and kindness would not have required much effort. In And So It Goes - Kurt Vonnegut: A Life, Charles J. Shields describes Vonnegut’s behavior at a 1983 speaking engagement at Oxford. Taking questions from the audience, he was asked if he kept any “tools of the trade” on his desk, perhaps a favorite dictionary. It’s as legitimate a question as asking a writer where he gets his ideas, and an easier one to answer. Vonnegut treated it as an opportunity to get a cheap laugh at the expense of an admirer. As Shields writes:
“Kurt chuckled, apparently amused by such a jejune question - a favorite dictionary? The audience murmured and laughed in sympathy.”
The fan who asked the question felt humiliated. After the audience stopped chuckling, Vonnegut finally provided an answer.
“No, he said finally, he had no ‘favorite dictionary,’ dismissing the notion by shaking his curly head.”
Angry that the big shot author he idolized would embarrass him publicly, the student who posed the question wrote Vonnegut the next day: “It is your prerogative to piss on everything till doomsday, Mr. Vonnegut: but why do it in public? And why do it pretending to be doing something else?”
A few weeks later, Vonnegut sent a reply, along with a check, a refund for the cost of admission to his speaking engagement, not, he insisted, as a gesture of apology. Vonnegut was angry that he had been taken to task for his rudeness. Years later, the student, by then a reporter for Newsweek, found himself having lunch with Vonnegut and several others at Rockefeller Center. Before leaving, he reminded Vonnegut of their previous correspondence, believing both could laugh it off. Vonnegut didn’t laugh. “Oh, I remember,” he said. “Funny, you don’t seem like an asshole.” Of course not. On both occasions, the asshole was Vonnegut.
Before becoming a successful novelist, one who I always assumed was highly regarded in literary circles (alas, the literati generally dismiss his work as belonging to a phase that college students go through before abandoning him for more serious writers), Vonnegut was a corporate writer (so was I for a time), a public relations man at General Electric. He got the job on the recommendation of his brother, Bernard, a chemist credited with co-inventing a process of “cloud seeding” that could manipulate the weather. (Gee, I wonder what those who chuckle at conspiracy theories involving chemtrails would say about that?)
Vonnegut, like many another liberal, would later rail against corporations like GE. But, like many another liberal - Bella Abzug, are you listening from beyond the grave? - he owned stock in them, fattening his bank account on profits from, among others, “Dow Chemical, the sole maker of napalm during the Vietnam War; and Multitrust Real Estate Fund, a development of apartment complexes and shopping centers in six cities.”
Shields claims these investments were not really inconsistent with Vonnegut’s beliefs. “He believed in free enterprise,” he writes. “It had made his forebears rich. And he recognized that many ideas of Western freedom are intrinsically tied to capitalism.”
I guess if you’re going to devote a chunk of your life to writing a 513-page bio (including the index, etc.), you’re tempted to make excuses for your subject, but for Vonnegut to profit from a war he publicly opposed, which he certainly did by investing in Dow, is hypocrisy. Even investing in a company that built shopping malls and had to mow down nature to do it should raise eyebrows. Vonnegut was an environmentalist, after all, who said, “I think the earth’s immune system is trying to get rid of us . . . we are a disease on the face of this planet.”
It was only after he achieved literary success that Vonnegut even began to think about a political stance. Since the audience for his work was primarily young and liberal, he adopted a pose that would appeal to that demographic. He let his close-cropped hair grow into an unruly mop and grew a mustache. But he continued to favor suits, ties, and black wingtips, attire that, if favored by anyone under 30 in the late ‘60s and early ‘70s, could only mean they were flag-waving, Nixon-supporting Republicans. That wouldn’t describe Jefferson Airplane, the psychedelic rock band responsible for “White Rabbit,” whose lead singer, Grace Slick, once plotted to slip LSD into the tea at a White House reception hosted by Tricia Nixon. As a “hero of the counterculture,” Vonnegut received an invitation from the band to brainstorm ideas for their next album. “The vibrations were just awful,” he remembered. “I wanted out as fast as possible. . . . They may have had funny ideas about who I am on the basis of my books, and I turned out not to be that way at all.”
Some fans who never met him personally saw through the facade. After receiving a “fill in the blanks” form in response to a request for Vonnegut to make a personal appearance, a fan wrote an angry letter to his agent. “Maybe I’m silly but I thought he’d be different. I thought he’d care just a little. How wrong I was. He’s just a capitalist like everyone else. No time for someone truly interested, for someone who truly cares.”
To hear others tell it, Vonnegut gave Hal Holbrook a run for his money by consciously playing Mark Twain in public. Critics had first pointed out Vonnegut’s resemblance to Twain in appearance (the hair, mostly, and the mustache), and the fact that they shared Midwestern roots and a sense of humor encouraged Vonnegut to play up the similarities and pattern himself after the man born Samuel Clemens. Much in demand for public speaking engagements, Vonnegut did his Mark Twain schtick, an act he also trotted out for fellow writers. At a party for PEN, an organization claiming to promote free speech but actually designed to further a liberal political agenda, novelist Hilary Masters recalled Vonnegut “doing his Mark Twain imitation, baggy white suit, bushy hair, and flowing mustache. He was standing a little apart, maybe aloof, like an icon of some kind . . . My attitude toward Vonnegut was that he was something of a poseur and that his impersonation of Twain was almost a theatrical device.”
Poseur. My thoughts exactly. And so it goes.
© 2011 Brian W. Fairbanks
Saturday, December 3, 2011
Nick Adams was a rebel
“Johnny Yuma was a rebel. He roamed through the West.”
Those words were sung over the credits of The Rebel, a half-hour western drama from the early 1960s that is probably familiar to most baby boomers. It doesn’t resonate as strongly as a Beatles song, but it resonates. In an episode of Seinfeld, Kramer absent-mindedly sings the theme on the phone after he's put on hold. It might have been scripted, but it could have just been a bit of improv by Michael Richards, an actor old enough to remember when the show starring Nick Adams originally aired. It helps that the theme was sung by Johnny Cash, a bona fide music legend whose death in 2003 was marked with all the hoopla one expects post-Elvis.
The Rebel's star died in February 1968 at age 36, and though his passing was not a media event, it was front-page news, certainly in Cleveland, Ohio. The Plain Dealer ran the headline, “Actor Nick Adams Found Dead,” near the bottom of its front page. A few days later, The Cleveland Press’ "Showtime," an entertainment tabloid included in the Friday edition, featured a eulogy by a showbiz columnist titled “Then His Star Began to Fade,” detailing Adams’ struggle to achieve stardom in Hollywood and how he fell short of realizing his ambition while still coming closer than most to grabbing the brass ring.
As a poor kid growing up in Pennsylvania, Adams escaped at the movies, idolizing John Wayne and James Cagney, in whose footsteps he hoped to follow. In Hollywood, he hooked up with another wannabe star, James Dean. If the more sensational accounts of their history are accurate, the two survived by hustling gay men around Hollywood and Vine. In Hollywood Babylon Revisited, author Darwin Porter claims Adams used his rumored “well hung” status to land auditions and movie roles. However he went about realizing his dream, he succeeded in being cast in Mister Roberts with Cagney and Henry Fonda, though he’s a blink-and-you’ll-miss-him presence. He was a little more prominent, though not by much, in Rebel Without a Cause, James Dean’s follow-up to East of Eden, which had rocketed him to stardom in early 1955. By the time Rebel was released later that same year, Dean was dead and about to be reborn as the object of one of the first celebrity death cults.
A born hustler, Adams exploited his Dean connection to work his way into the shadow of Elvis Presley, who admired Dean and inherited his place as the misunderstood rebel whose sneer was but a cover for a sensitive, wounded heart. It is primarily through his association with these two legends that Adams is known today, but he had a few memorable roles in such films as Picnic, Pillow Talk, and The Last Wagon. His finest moment came in a comedic part, that of Private Ben Whitledge, Andy Griffith’s bespectacled pal in the hilarious No Time for Sergeants. But such a role wasn’t compatible with the image that Adams wanted to project, that of the tight-jawed hero like those he worshiped in his movie-going youth. If casting directors didn’t see the 5'7" actor in the Duke Wayne mold, he’d have to take matters into his own hands, and did, by co-creating The Rebel with Andrew J. Fenady, and successfully selling the idea to ABC.
Popular for a time, The Rebel survived only two seasons (1959-1961) on the then-struggling network, but Adams wasn’t through. After landing a minor supporting role in Twilight of Honor, a 1963 courtroom drama dashed off to cash in on Richard Chamberlain’s popularity as TV’s Dr. Kildare, Adams seized the opportunity to position himself for an Oscar nomination as best supporting actor. He hustled, buying $8,000 worth of trade ads to promote himself, and successfully bought a nomination. It was a classic example of chutzpah, all the more remarkable since much of Adams’ role in the film wound up on the cutting room floor. Melvyn Douglas won for Hud. If Douglas hadn’t triumphed, singer Bobby Darin would likely have won for an impressive turn as a traumatized soldier in Captain Newman, M.D.
With Hollywood concerned that so many productions were now being filmed on foreign soil, Adams made some noise by telling reporters he would only accept parts in movies made at home. Trouble is, he wasn’t receiving offers in the States, so he did an about-face and went to England to co-star with Boris Karloff in Die, Monster, Die!, an above-average adaptation of an H.P. Lovecraft story released by American International in 1965, the same year Allied Artists tossed out Young Dillinger, with Adams in the title role, as one half of a drive-in double bill.
His co-star in the latter film was a young actor for whom Adams was a mentor: Robert Conrad, already popular due to his role on TV’s Hawaiian Eye, and about to win more fans as James T. West, the secret service agent of The Wild Wild West. Conrad managed to squeeze Adams in as a guest star on a pair of episodes at a time when Adams’ only other options were such made-in-Japan atrocities as Frankenstein Conquers the World and Monster Zero. By then, Adams was more familiar to TV audiences as a celebrity contestant on PDQ, a game show hosted by Dennis James.
The circumstances of Adams’ death in February 1968 remain a subject of debate, with some mystery aficionados suggesting he was knocked off, perhaps by someone intent on covering up some shady business practices. He was found slumped against the wall of his bedroom, eyes open and staring. An autopsy found a combination of drugs in his system, usually a sign of either an accidental overdose or suicide.
Before his death, he managed two more film roles. Mission Mars, a shoddy space travel flick with Darren McGavin, may not have been released theatrically (it never arrived in my hometown, anyway), but Fever Heat, a racing drama for Paramount, was unloaded at drive-ins four months after his death, supported by a re-issue of The Sons of Katie Elder.
Adams comes to mind because The Rebel is being revived on MeTV (Memorable Entertainment Television), a network whose programming is a sort of Greatest Hits from TV’s past (look for it on one of the many sub-stations that popped up after the switch from analog to digital broadcasting). Nick Adams may be little more than a blip on the radar screen of pop culture today, but there was a time when, for a half-hour of network prime time each week, he was the brave, somewhat surly hero he always wanted to be. And now, thanks to reruns, he is sneering heroically again.
POSTSCRIPT: Allyson Adams, daughter of the late Nick Adams, recently found a diary written by her father during the eight days he accompanied Elvis Presley when the latter was being celebrated by his hometown of Tupelo, Mississippi. Titled The Rebel and the King, the book can be pre-ordered, and more information found, at the following link: http://therebelandtheking.com/
© 2011 Brian W. Fairbanks
VISIT MY KINDLE STORE AT AMAZON
Thursday, November 3, 2011
Who's Bob Hope?
This TV, a network catering to the substations that began to appear after the switch from analog to digital broadcasting, recently showed I’ll Take Sweden, a Bob Hope flick with Tuesday Weld and Frankie Avalon (on break from American International’s Beach Party series to give the United Artists comedy some “youth appeal”). It’s one of several films that the ski-nosed comedian made for United Artists in the 1960s (Call Me Bwana, Eight on the Lam, Boy, Did I Get a Wrong Number, and The Private Navy of Sgt. O’Farrell were among the others), a decade in which he thrived on television as the host of numerous NBC specials and the annual Academy Awards, but found his big screen appeal on the wane. The movie audience by that time was growing increasingly younger and had turned its attention to more contemporary comic personalities like Peter Sellers. Even Hope’s older fans were finding his movies embarrassing as he continued to play the would-be Lothario even in his mid-60s. In his final big-screen starring vehicle, 1972's Cancel My Reservation, released by Warner Bros., Hope, then nearing 70, tells someone he’s 42, and, no, it wasn’t meant as a joke, nor as a variation on Jack Benny’s running gag that he was always 39.
In I’ll Take Sweden, Hope plays his age. He’s married to Dina Merrill. Tuesday Weld plays his daughter, who is being wooed by Avalon, of whom Hope probably disapproves. I say “probably” because I wasn’t following the intricacies, such as they were, of the plot. Seeing Hope in this movie made me wonder if the comic, once a godlike figure in American culture, and a controversial one for his outspoken conservative politics and support of the Vietnam War, is known to anyone under 30 today.
On a recent episode of The Late Show with David Letterman, bandleader Paul Shaffer made a reference to Hope when pop star Britney Spears made an unscheduled appearance, something Hope had done many times on The Tonight Show during the reign of Johnny Carson.
“Who’s Bob Hope?” she asked.
Who’s Bob Hope?!!!
In the 1960s and ‘70s (and in earlier decades), such a question would have been unthinkable. Britney Spears may not seem typical of her generation in being unfamiliar with a man whose presence was impossible to avoid when I was growing up, but I suspect she is. The days when the three major TV networks (CBS, NBC, ABC) dominated American culture are long gone. Even a casual TV viewer would likely have been exposed to Bob Hope in the ‘60s and ‘70s, either through his Christmas specials, his old movies (and some of the older ones, like My Favorite Spy and those Road movies with Bing Crosby and Dorothy Lamour, were good ones) on local television late shows, themselves a remnant of the past, or through his regular visits to the evening and afternoon talk shows. Even if you didn’t actually watch these shows, you would have been aware of them from the commercial breaks during Star Trek or The Man from U.N.C.L.E.
Hope’s NBC specials were ratings powerhouses, always placing in the top 10 of the weekly Nielsen ratings. In the late ‘70s, however, his drawing power began to deteriorate. His much-ballyhooed trip to China, the subject of a 1979 special, was creamed in the ratings, even losing to ABC’s airing of Woody Allen’s Annie Hall, an irony considering that Hope was a mainstream establishment figure with wide appeal, and Allen, who idolized Hope, was a comparatively fringe performer whose appeal was to a younger, more intellectually inclined (and pretentious) clique.
Hope continued cranking out specials into the ‘80s with disappointing results, then retired the next decade, finally passing on at age 100 in 2003. Hope’s death was front page news and received massive media coverage, but NBC, the network at which he spent several lucrative decades, didn’t bother to commemorate his career with a prime-time special compiling clips from his many specials. Times had changed and tastes had shifted. Hope was a historical figure of interest mainly to viewers outside the age bracket (18-49) that appeals to advertisers, and, therefore, worthy of a news report but not even an hour of prime time.
Time waits for no one, as the Rolling Stones observed, and time passed Bob Hope by, as it is passing by many icons of the 20th century. The icons include performers, but also products that were once essential to our daily lives. Timex may have sponsored many of Bob Hope’s TV specials, but the wristwatch is no longer a necessity to a generation raised in the age of the cell phone. Many young people don’t wear watches. They simply whip out their cell phones to check the time, and since it’s displayed in digits, knowing that it’s, say, 2:30 when the small hand is on the 2 and the big hand is on the 6 is almost useless. Newspapers have been a staple of American life since the 1800s, but their considerable influence was usurped by the Internet, and most, even such venerable institutions as The New York Times, are unlikely to survive with their power intact, if they survive at all. Magazines are still around, but their circulation and revenue, dependent on advertising dollars, are dwindling, and their future looks increasingly precarious. Newsweek has been losing $20 million a year and has been put up for sale by its owner, The Washington Post. Time, which debuted in 1923 and on whose cover an appearance was once thought to mark the pinnacle of fame, may be doing better, but no one really cares about “making” the cover anymore. I’m more likely to read it online than in print. Movies survive, but 35 mm film, on which movies have been shot and printed since their invention in the last years of the 19th century, is likely to become obsolete as digital becomes the format in which movies are made and projected. Television is still with us, but those three networks - ABC, CBS, and NBC - that once dominated the industry must now compete with Fox, which started in the ‘80s as a minor niche network but now has the single most popular prime-time program thanks to American Idol. Now there are four networks, and while they still command the largest chunk of the viewing audience, cable channels have made steady inroads. Some younger viewers watch their favorite shows online.
Also competing with television is home video, introduced to the market in 1976 with the debut of the Betamax. The Betamax soon lost out to VHS, which became the format of choice among consumers. Tape has since been replaced by discs, first DVDs and now Blu-ray. Eventually, discs will disappear as movies are downloaded via the Internet. The accessibility of movies in these formats threatens movie theaters as they become less important. Will they disappear, too? Technology has had the greatest impact on the recording industry. It was in 1877 that Thomas Edison invented sound recording, which soon became available to the public on cylinders. Discs made of wax and rubber soon took their place, and pre-recorded music was available to the public on discs that played on phonographs at 78 rpm (which stands for “revolutions per minute”). The long-playing 33 1/3 disc was introduced in the middle of the 20th century, along with the 45 rpm single that traditionally contained two songs, one per side. Reel-to-reel tape made its debut shortly thereafter, followed by 8-track cartridges, then cassettes. The vinyl record dominated until the ‘80s when the compact disc, a small metallic object little bigger than the average palm, took its place. Now, however, music is being downloaded from the Internet, sometimes illegally. Disc sales are way down, and the major recording companies, including such famous names as RCA, Capitol, and Columbia, are almost certainly going to crash and burn as recording artists bypass them completely and market their music themselves. And that music is heard more often than not on the Internet or played on tiny hand-held devices like the MP3 player, rather than on the radio. It’s a new world.
It’s a world in which many people are likely to ask the same question that Britney Spears posed to Paul Shaffer: "Who’s Bob Hope?"
© 2011 Brian W. Fairbanks
VISIT MY KINDLE STORE AT AMAZON
Saturday, October 22, 2011
Halloween at "Horror Hotel"
Although set in February, the month of Candlemas Eve, Horror Hotel is a perfect movie for Halloween. From 2006, here are my thoughts on one of my favorite chillers.
The poster for the 1963 U.S. release of Horror Hotel, a tiny reproduction of which appears on the cover of Navarra’s Triple Feature Horror Classics, Volume 5, was one of the most misleading pieces of advertising ever devised for a film’s marketing campaign.
“Just ring for doom service!” the tag line reads. A key on which the film’s title is imprinted dangles from a skeleton’s hand surrounded by a grotesque face with fangs, saliva visible in its open mouth.
The poster gives no clue to the atmosphere that director John Moxey and cinematographer Desmond Dickinson bring to this tale of modern day witchcraft in the village of Whitewood, Massachusetts. A movie that basks in morbidity and sends chills down the spine is depicted as a grade B scream fest on the order of a William Castle production, something like The Tingler that inspires more giggles than groans. The Motion Picture Association of America (MPAA), which gives its seal of approval to films and their advertising, may have rejected an ad campaign that hinted at the movie’s portrayal of human sacrifice. When released in the U.K. as City of the Dead in 1960, this chiller may have benefitted from a more honest and effective promotional campaign, but I’m sure children were forbidden from attending even with parents.
By the early ‘60s, horror films were generally regarded as kid stuff in the U.S., fodder for Saturday matinees and drive-in triple features. The U.K. had a ratings system long before Hollywood introduced its self-policing system in 1968, and the colorfully gory titles produced by Hammer were frequently slapped with an “adults only” label in their native Great Britain. Once they traveled overseas to the States, they were open to all audiences, perhaps due to simple economics. The early Universal horror films, Dracula and Frankenstein, which established the horror genre, appealed to adults as well as younger audiences. As the genre deteriorated into less sophisticated territory (Frankenstein Meets the Wolfman), the fan base narrowed to teens and kids.
By the late ‘50s, when Universal released its backlog of pre-1949 titles to television, Hammer was reviving the moribund genre with full color remakes. At the same time, Forrest J. Ackerman began publishing Famous Monsters of Filmland, a magazine devoted to horror and science-fiction films, soon to be followed by the competing Castle of Frankenstein whose approach was more sophisticated, geared more to film buffs who admired horror and science-fiction, and less to kids who simply liked monsters.
But the kids who liked monsters, and preferred building Aurora model sets of Dracula and Frankenstein to slapping the glue on plastic airplanes, won out, so Horror Hotel’s advertising campaign was directed at them, as were the ads for The Head, the German film that comprised the second half of the double bill when Horror Hotel opened stateside in June 1963, just in time for summer vacation and the start of the drive-in season. I saw it at the Pearl Road Drive-In in Cleveland, Ohio with my family, and it made a strong impression on me that was only strengthened when I saw it again a few years later on the Friday night late movie hosted by Houlihan and Big Chuck on WJW-TV8. It was a mainstay on local television throughout the ‘60s and ‘70s, then disappeared in the ‘80s, its owners having failed to renew the copyright. But it continued to haunt the imagination of those who saw it on television, usually in the wee hours, the perfect time for a film that casts such a macabre, creepy spell.
Horror Hotel was finally released on home video in the ‘90s, and I checked in again in 2004 to find it every bit as effective as I remembered.
Horror Hotel is a triumph. Filmed entirely on a soundstage at Shepperton Studios by Vulcan Productions, which would change its name to Amicus for such future shockers as Dr. Terror’s House of Horrors and The House That Dripped Blood, it is unmatched for its atmosphere. The fog machine never worked harder than it did when used to create the crypt-like ambience of Whitewood. The Raven’s Inn, the private hotel built on the exact spot where Elizabeth Selwyn was burned at the stake on March 3, 1692 (a dramatic sequence that opens the film), is dark and deathly quiet except for the ticking of the clock in the lobby and the solemn chanting that visiting college student Nan Barlow tells the proprietor, Mrs. Newliss, she hears coming from beneath the trap door in her room.
“There’s nothing under there but earth,” Mrs. Newliss says, and points to the trap door’s lack of a ring as evidence. The ring later appears dangling in the window shortly before the film’s most frightening moment: Nan Barlow’s descent via the cobwebbed stairs beneath the Raven’s Inn where she meets her terrifying fate.
Horror Hotel has a few hokey moments. When Professor Barlow (Dennis Lotis) invades the witches’ coven below the Raven’s Inn, he empties his handgun into his forbidding colleague, the undead witch, Professor Driscoll (Christopher Lee). The bullets have no effect, so what does Barlow do? The same thing the gangsters did on TV’s The Adventures of Superman when realizing the Man of Steel could not be deterred with mere bullets: Barlow throws his gun at Driscoll! Even more laughable is Driscoll’s reaction: he ducks!
Then there’s that bookstore operated by the granddaughter of the reverend who warns visitors to Whitewood that the devil lives and is worshiped there. Considering that the adjoining church has no congregation, and all of the townsfolk appear to be witches, who is the clientele? Nan Barlow (Venetia Stevenson) stops in seeking books for her college thesis on the history of witchcraft in Whitewood, but she only borrows a dusty antique volume (A Treatise on Devil Worship in New England) that she can’t afford to buy. Ken Jones’ jazz score, heard as Barlow drives to Whitewood, has been criticized, but since it seems to be emanating from the car radio, it’s not inappropriate. Still, it contributes nothing to the film, quite unlike Douglas Gamley’s eerie choral music that opens the film and is heard during the witches’ rituals. It has the effect of a cold, dead hand on your shoulder.
The acting is excellent, much better than required for a low-budget horror show, an indication that the producers envisioned a quality project and sought only the best talent.
Venetia Stevenson, the perky blonde whose disappearance while researching witchcraft for a college assignment paves the way for the fiery climax, has an innocent charm that contrasts well with the saturnine Patricia Jessel, whose strong features attract and repel at the same time. Despite her fourth billing, Jessel is the true star of Horror Hotel, and her performance as Elizabeth Selwyn/Mrs. Newliss would have been a mere caricature of a witch in lesser hands. But Jessel was a Tony Award winner as best supporting actress in the 1956 Broadway production of Agatha Christie’s Witness for the Prosecution, and her performance suggests she did not regard her role in a “horror movie” as a lark.
When Nan Barlow asks Mrs. Newliss if Elizabeth Selwyn was really burned as a witch on the site where the Raven’s Inn now stands, Jessel doesn’t merely say “She was,” but answers with a mix of pride and pain, her facial expression matching both. She conveys both the haughty delight she takes in having been condemned for her beliefs and the sorrow she feels about her persecution.
Christopher Lee as her accomplice is no mere boogeyman, but a weary sort, impatient with those, like Professor Barlow, who mock and condescendingly dismiss his teachings as nonsense. Dennis Lotis, previously known as a pop singer, is also impressive, never more so than when he follows his sister’s trail and, in the cavern below the Raven’s Inn, discovers the corpse of Lotti, the mute servant sympathetically played by Ann Beach, whose attempts to warn the guests of their host’s true nature lead to her doom.
Tom Naylor as Nan’s boyfriend, Betta St. John as the heroine of the second half of the film, and Norman Macowan as Reverend Russell are also excellent.
Then there’s Valentine Dyall as Jethro Keane, who hitches a ride with Nan Barlow on her way to Whitewood, only to disappear into mist when they reach the cemetery. Dyall, a character actor with a distinctive baritone, later played the caretaker in 1963's The Haunting, a more prestigious horror film that would make for a perfect double feature with Horror Hotel.
Horror Hotel is a superbly crafted thriller that received scant attention from critics (The New York Times panned it, along with The Head, in a review so brief that I suspect the critic didn’t bother to see the movie), but it has inspired devotion among those who discovered it, quite unexpectedly, through its television airings, and now regard it as the scariest movie they’ve ever seen.
© 2011 Brian W. Fairbanks
VISIT MY KINDLE STORE AT AMAZON
Thursday, October 6, 2011
Bob Dylan, Art Thief?
Since when is it considered plagiarism to make a drawing or painting from a photograph?
Bob Dylan came under fire this week when it was revealed that several of his paintings on display at the Gagosian Gallery in New York were based on photographs, many retrieved from Flickr, and at least one of which copies the work of Henri Cartier-Bresson.
Other than Dylan's dubious claim that "I paint from real life," and that the paintings are a "visual journal" of his travels, my reaction is, so what?
Drawing from photographs is a fairly standard practice among high-school art students, as well as for artists without access to live models. Dylan is neither, but if creating a painting from a photograph is theft, then it could be argued that a photographer is a thief whenever he aims his camera at any subject other than himself.
If he photographs a bridge and doesn't acknowledge those who built it, well, he's a plagiarist, is he not?
If he photographs a building and does not acknowledge the architect, as well as receive his permission to duplicate his work on film, he's stealing the architect's work, right?
If that's the case, a photojournalist who captures a crowd scene on film needs the permission of every individual in the photo, all of whom have the right to decide if that photo can be published. They are also entitled to financial compensation if the photographer is paid for that published work. It's even been said that the camera steals the soul of its subjects. If that doesn't entitle a photographer's subject to damages, well, there's something wrong with our legal system.
If Dylan is a plagiarist, he's in good company.
Did Andy Warhol request permission from the Campbell Soup Company before creating his legendary "soup can"? Did Campbell's share in any profit that Warhol made from selling his work or reproductions thereof?
What about the manufacturers of Brillo, whose box became another memorable Warhol piece? Somebody created the original design, although it's doubtful Brillo gave him credit or anything but a flat fee or paycheck.
How about Warhol's piece depicting an electric chair? Should he have acknowledged whoever built it and been required to pay a licensing fee?
And let's not overlook Warhol's silkscreens of Elvis Presley and Marilyn Monroe, two superstars whose images are worth millions and who did not agree to pose for him. Warhol depicted Elvis in a scene from the 1960 Twentieth Century Fox film, Flaming Star, and Monroe from the same company's 1955 film, The Seven Year Itch. Not only was Warhol stealing from Elvis and Marilyn, he was ripping off Twentieth Century Fox, as well as the photographers, film directors, and even the screenwriters who might be able to claim ownership of those images.
As for music, where's the name of Lennon and McCartney on David Bowie's 1975 song, "Young Americans," which quotes "I read the news today, oh boy" and its accompanying melody from the Beatles' "A Day in the Life"?
Joe DiMaggio earned a mention in Paul Simon's "Mrs. Robinson." Was DiMaggio consulted beforehand? Did he have the right to demand that the line in which his name appears be excised? For that matter, what about all of us who comprise the “nation” whose lonely eyes Simon said were turned to Joltin’ Joe? What right does Simon have to suggest my eyes are lonely or that they are turned to a baseball player?
Dylan has written songs about several famous public figures, including Billy the Kid, Rubin "Hurricane" Carter, mobster Joey Gallo, and Hattie Carroll and her alleged killer, William Zantzinger. Do they or their descendants have the right to demand a share of the royalties as well as a credit on the songs that told their stories?
What about an artist who makes a collage using photos from various newspapers and magazines? Is he a plagiarist if he hasn't received permission from those publications?
The answer to all of the above questions is NO!
Dylan should have been more straightforward about where he found his inspiration for those paintings, but they're still HIS paintings. Even if an old photograph was the catalyst, they are new creations. This controversy probably has less to do with giving credit where credit is due than with money, and the possibility of milking a millionaire in a plagiarism lawsuit.
Like the brouhaha over Dylan's concerts in China earlier this year, this is much ado about nothing. Oops, that's a quote from Shakespeare. "It's plagiarism, pure and simple," Rob Oechsle, the owner of that Flickr account told The Los Angeles Times regarding Dylan's use of photos posted online. "If a writer were to use a phrase from Shakespeare, and not credit him, or attribute it in any way," Oechsle said, "that's what they'd be accused of."
Well, not really, Mr. Oechsle, or "Okinawa Soba" as he calls himself. (That sounds a little like Kimosabi to me, and Mr. Oechsle does not acknowledge Tonto or the Lone Ranger on his Flickr page.) Many of Shakespeare's phrases are so common that it's possible to quote him without knowing it, just as I don't know the origin of the phrase I quoted earlier about the soul-stealing capabilities of a camera. I've heard it plenty of times, and never from anyone who cited its author. Some quotes are so famous that attribution isn't necessary. In jest, a film buff might quote Clark Gable's famous line from Gone With the Wind ("Frankly, my dear, I don't give a damn") or the line many people believe Humphrey Bogart said to Dooley Wilson in Casablanca ("Play it again, Sam") without mentioning the films, simply because it's assumed everybody knows them. And naming the movie might not be enough for Mr. Oechsle. You'd also have to list the screenwriters and the author of the original book or play from which the screenplays were adapted, and maybe the director and film company, too. Of course, to do so would be utterly ridiculous, much like the controversy over Dylan's paintings.
If the Gagosian Gallery were presenting an exhibit of Dylan photos, and it turned out those photographs came from Flickr or the portfolio of Henri Cartier-Bresson, that would be a scandal worth examining. But paintings from photographs? If a picture is really worth a thousand words, it's also worth a painting or two, and when the painting is by an artist of Dylan's standing, the photographer should feel complimented.
(By the way, the accompanying photo is an ink drawing I made of Dylan 21 years ago from a photo I found in a book. I don't know who took the original photo. If anyone does, let me know and I'll give him credit, lest I piss off Okinawa Soba.)
© 2011 Brian W. Fairbanks
VISIT MY KINDLE STORE AT AMAZON
Wednesday, August 31, 2011
Five Easy Pieces and the tone of the times
Before the 1992 release of A Few Good Men, in which Jack Nicholson delivered the much quoted line, "You can't handle the truth," his most famous screen moment was probably in 1970's Five Easy Pieces, the first film in which he claimed top billing after being noticed in a supporting role in the previous year's Easy Rider. It's the chicken salad sandwich scene that was chosen to represent the film during the clips from the best picture nominees at that year's Academy Awards.
Nicholson and Karen Black, as his sort-of girlfriend, stop at a diner with two hitchhikers they've picked up. Nicholson orders toast, which is not on the menu, and a contentious exchange follows in which he explains to the waitress how she can fill his order without breaking the rules that she insists on following.
"You have bread, and a toaster of some kind?" he tells the frustrated woman, then instructs her to "hold the chicken," bring him a check for the chicken salad sandwich "and you haven't broken any rules."
"You want me to hold the chicken?" she asks in a tone of defiant sarcasm.
"I want you to hold it between your knees," he sneers.
With that remark, the waitress orders Nicholson and his companions to leave, and points to a sign that states the management's policy ("We reserve the right to refuse service").
"Do you see this sign?" Nicholson says, then violently clears the table of its glassware and metal utensils.
More than 40 years later, Nicholson's character looks more like an obnoxious boor than the rebel he may have seemed originally. He's a rude bully abusing a low-wage employee who is, after all, only doing her job. In 1970, when the film was released, it played a little differently, its rage colored by the tone of the times. To the young people of what would have been called the "counterculture," Nicholson's character wasn't merely abusing a waitress, but taking on Nixon, Vietnam, the assassins of JFK, RFK, and MLK, along with all the pointless rules of our straitjacketed, repressed society. He was "stickin' it to the man," challenging the "system," kicking the Establishment's ass, and putting down the whole rotten "scene" (add a "man" after "scene" for added hipness, 1970s style).
Nicholson's character was the rebel seizing his freedom and ignoring the "sign" that the waitress points to in her defense. "Signs," a hit by the Five Man Electrical Band that reached the airwaves in the summer of 1971 ("Do this, don't do that, can't you read the sign?"), may even have been inspired by the scene. Once Nicholson and his companions leave the diner, one of the hitchhikers praises his miniature act of rebellion. He's less impressed with himself than she is, and recognizes the futility of, shall we say, fighting the system. In regard to that toast without the chicken, he says, "Well, I didn't get it, did I?"
A movie, like any creative work, does not exist in a void. It reflects its time, but time moves on while the work of art is static, frozen on canvas, on a page, or on frames of film. If its truths are universal, the art transcends the time in which it was made and communicates as effectively to the generations that follow as it did to its own. Five Easy Pieces, like that previous Nicholson movie with "easy" in the title, is probably a satisfying film to watch even now, four decades after its release, but its power has almost certainly diminished. In 1970, some of its strength was in its symbolism, which may not have even been deliberate, but what it symbolized for many members of its original audience has changed. Nixon is gone, Vietnam is over, and our government, though actually more oppressive than ever, is seen as less corrupt and authoritarian, certainly by filmmakers, most of whom lean left politically and are inclined to see a Democratic president as a man in a white hat, especially when he's black.
© 2011 Brian W. Fairbanks
VISIT MY KINDLE STORE AT AMAZON
Saturday, August 27, 2011
The end of the world is not being televised
As of 3:48 p.m., the end of the world is not being televised. Maybe there’s a network on cable that successfully negotiated with God - or the devil - for exclusive rights, or maybe Hurricane Irene’s arrival on the east coast has been delayed, or maybe it arrived on schedule but had already weakened by the time it reached New York. All I know is that life goes on, or television does, and there’s been no mention of cataclysm on NBC, where golf is being televised, or ABC, which is presenting the Little League World Series (I’m not kidding), or Fox, where a Major League game has just concluded. The digital converter box on my analog TV set isn’t picking up the signal from the CBS affiliate, so I have no clue what’s going on there.
When I went to bed this morning, I was fully expecting to wake up to live news coverage resembling the tornado sequence in the Kansas prelude to The Wizard of Oz, and now I’m wondering if, like Dorothy’s stay in Oz, it was all a dream.
© 2011 Brian W. Fairbanks
VISIT MY KINDLE STORE AT AMAZON
Tuesday, August 16, 2011
Elvis Remembered
Thirty-four years ago, August 16 also fell on a Tuesday. It was a hot, muggy day throughout most of the Midwest, and by afternoon the news went forth that Elvis Presley had died. The hip-shaking Southern boy who made rock 'n' roll an international sensation, one that would soon become the dominant force in music, was found on the floor of his bathroom at Graceland, the gaudy mansion in Memphis, Tennessee that is now on the National Register of Historic Places, visited yearly by fans from around the globe.
It's a bit mind-boggling to realize that Elvis has been a dead icon for much longer than he was a living one. By the time of his death in 1977, he had spent twenty-one years in the spotlight, and now thirty-four years have passed since he proved he was always as mortal as the rest of us. It would also be difficult for the generations that followed to realize the impact Presley had on the world way back in 1956. A clue can be found in the complete episodes of The Ed Sullivan Show, released on DVD several years ago and reviewed below:
Elvis: The Ed Sullivan Shows (2006)
Elvis Presley made no fewer than nine network television appearances before performing on The Ed Sullivan Show the evening of September 9, 1956, but most of America first saw him then. Despite a stiff demeanor and a tendency to pronounce show as “shoe,” Sullivan was the ringmaster of American entertainment. His Sunday night variety program was an institution in the days when television was still a three-channel proposition. Appearing on his show was an important break for any entertainer. It was tantamount to receiving the show business seal of approval.
But Sullivan originally did not approve of Presley and vowed he wouldn’t touch the singer with a ten-foot pole. Despite selling more records faster than any recording artist in history, Presley was more than hot. He was scorching. The swivel hips that earned him the nickname “Elvis the Pelvis” (which he despised, calling it “childish”) and his expressive singing style made him a lightning rod of controversy. One journalist compared his stage act to that of a stripper. However, when Presley appeared on The Steve Allen Show, which was scheduled opposite Sullivan on Sunday nights, the ratings went through the roof. Sullivan reversed himself and offered Presley a then-record $50,000 to make three appearances on his show.
Just how shocking Presley was in 1956 was never apparent in the frequently recycled clips of his performances. Now, thanks to Image Entertainment’s three-disc DVD set, Elvis: The Ed Sullivan Shows, his performances can be seen in their proper context.
Ironically, a car accident prevented Sullivan from being present that first night. Charles Laughton, the brilliant British stage and screen actor (and husband of Elsa Lanchester, The Bride of Frankenstein), was the guest host, kicking off the proceedings by reading some poetry followed by limericks. The Brothers Amin, an acrobatic act, came next, then Dorothy Sarnoff performed a song from Broadway’s The King and I. After a commercial, Laughton, standing before a wall of Presley’s gold records, introduced the man whom a record 72 million viewers tuned in to see.
Wearing a plaid jacket, a guitar slung over his chest like a machine gun, Presley blasts his way into “Don’t Be Cruel,” and it’s a little like Moses parting the Red Sea. Prior to Elvis, entertainment didn’t have to be rated with letters signifying what age group should be permitted to watch. Families watched TV and listened to music the same way they went to the movies: together. Now Elvis came to drive them apart.
Teenagers love him, of course, especially the girls, and what was there not to like? Handsome, but in a way men had not been before; threatening, yet still somehow tame, as if his mask of menace was only meant to conceal a wounded heart. He is, after all, very well-mannered, saying “Yes, sir” and thanking “Mr.” Laughton. What was one to make of this guy with the unusual name, the pompadour, and the long sideburns?
“He just does this,” Ed Sullivan would say while shaking his body on the October 28 show, “and everybody yells.” Presley looks a little more sinister this time in his dark suit, and he offers reprises of “Don’t Be Cruel,” “Love Me Tender,” and “Hound Dog” while also introducing one of his sultriest numbers, “Love Me.”
What did Dorothy Sarnoff think? And Senor Wences, who was on the bill the same night Presley appeared a second time?
Clearly, show business had been rocked into a new dimension.
His third and final appearance for Sullivan came on January 6, 1957 on a show that also featured Carol Burnett, one of the few stars on these episodes whose wattage would increase in future years. By now, the country was clearly divided into two camps: those who championed the King of Rock and Roll, and those who condemned him. Sullivan was now in the former camp, surprising audiences and Elvis himself by proclaiming him a “good, decent boy.”
But there was no turning back. Soon, people would be talking about the “generation gap” and, later, “youth culture.” The gap would widen in the ‘60s, with even Presley taking his place among the old guard, but it started here. With the release of Elvis: The Ed Sullivan Shows on DVD, it’s now possible to properly assess the earth-shaking impact Presley had in the more innocent era of the 1950s.
© 2011 Brian W. Fairbanks
VISIT MY KINDLE STORE AT AMAZON
Other posts on Elvis:
Remembering Elvis
August 16, 1977
Elvis Is Everywhere