It’s been a long time since making traditional, or even vaguely conventional, “movies” has interested legendary French New Wave filmmaker Jean-Luc Godard. If anything, the director’s movies over the last 20 or so years have been experiential audio-visual collages more interested in pictures, sounds, cuts, and desaturation: a maddening barrage of Dadaist statements. Even so, his latest, “The Image Book,” playing in competition at Cannes, should be considered as radical a Godardian statement as any.
Godard’s latest endeavor has perplexed many, but it did manage to win a Special Palme d’Or at Cannes earlier this year. And “special” is exactly what this film is, though maybe not in the good sense of the term. Regardless, this film is a must for cinephiles, as it is yet another addition to what has become one of the most inventive and original filmographies in cinematic history. Godard, at 88 years of age, is by all accounts a god of the medium. You can check my review of the film here, although it’s not really a review so much as a bewildered what-the-hell-did-I-just-watch thingamajig. I’m not panning the film, nor am I necessarily recommending it; I just much preferred 2014’s 3D-infused “Goodbye to Language,” which felt meatier and more substantial than ‘Image Book.’
A few excerpts from my 5.17.18 review:
“No, this is not “The Avengers,” but Godard, a cinematic radical, seems to be at the pinnacle of his pretension here. Where to start when describing the ‘plot’? Forget it. Images ranging from Youssef Chahine to Hitchcock unfurl at a rapid, ADD-styled pace. Not to mention the uglified, non-pixelated assault of furiously angry stock footage. There’s another Godardian “statement” on the Holocaust, but we’re not really sure what he’s trying to say because the clip lasts all of four short seconds. The beautifully realized 3D experiment of “Goodbye to Language” is replaced by a nastily vicious tone of resentment for the outside world.”
“Godard overloads each sequence in “The Image Book” with dozens of different films, from Hollywood’s “Golden Age” to the unheard gems of world cinema. This magma of scenes and actors, known and unknown, can be interpreted according to each viewer’s own experience. Because nothing is explained, the images land on the viewer’s retina at such a fragmented and furious pace that it’s often impossible to interpret the experience. It’s only after you leave the theater that you try to assemble its incoherence, like a puzzle with pieces that don’t seem to fit.”
“Some audiences will naturally loathe “The Image Book,” some will find moments of transcendence within, and others will leave perplexed and baffled. And that’s OK. “The Image Book” demands digestion in a purely personal way.”
“Past and present, along with reality and fiction, seem to collide in this phonic, sensory world. As meaningless as Godard’s epileptic montage often feels, some moments do form a kind of coherence; the anger and revulsion for the horrors of war being normalized on television. Maybe this is just the message from the great beyond of an unfathomable cinematic mind warning of the incomprehension of the world that surrounds us. Godard surely does not care what we think.”
“The Image Book” will hit select theaters beginning on January 25, 2019.
In the next few weeks you will no doubt see a real push by the media to present “Black Panther” as a worthy Best Picture contender. All for the sake of progress and a rabid Disney agenda that is expertly pushing this narrative. Don’t bite.
Angela Bassett, who starred in ‘Panther,’ has been quoted as saying, “In my mind, it has the Oscar. I think it deserves it,” while promoting her upcoming Transformers film, “Bumblebee.” Meanwhile, Collider‘s Adam Chitwood just wrote an op-ed pleading the case that ‘Panther’ is worthy of a Picture nomination.
No, it’s not.
I will first mention the positives. Yes, “Black Panther” is a cultural groundbreaker, as important as any movie released this year in America. I know how significant it is to have the most powerful film franchise, Marvel, finally deliver a superhero movie with an almost entirely black cast at the forefront. I do, I really do. This is a time when a film like “Black Panther” should exist. Think of all the young kids watching this movie who will see themselves as the heroes, capable of doing just about anything they set their minds to. Just for that, I am glad it has become such a resounding billion-dollar success.
However, this film doesn’t remotely come close to the tension or cinematic level of, say, “The Winter Soldier,” a film that drastically changed the mold for Marvel as, first and foremost, a film inspired by the ’70s political thriller, or even “Logan,” a movie that tried to distance itself from the banal, predictable narrative structure of the superhero genre by infusing Western-like sensibilities and (shock) adult-oriented moral dilemmas. The problem with “Black Panther” was that there simply wasn’t all that much excitement to go around. Almost everything you expected from the narrative did happen.
There wasn’t anything memorable, no moment that sent your pulse pounding or your spine tingling. This is a straightforward telling of a story that, on paper, should not be straightforward at all, or at the very least safe. Coogler’s source material was Ta-Nehisi Coates’ more recent “Black Panther” comics, and to say he watered it down for the masses would be an understatement. Coates’ comics were firmly rooted in Afrofuturism and Shakespearean in scope and tone.
Upon its release in 1982, “Blade Runner” suffered so much studio interference that its history is the stuff of legend. Met with mixed reviews, it eventually built a cult following on home video, which got director Ridley Scott amped up enough to screen his own “final” versions, out of his own pocket, to audiences around the country. There have been several versions of “Blade Runner,” seven to be exact, but the definitive one seems to be 2007’s “Final Cut,” which got rid of the narration, left us with an extra, brilliant final shot, and fixed many of the plot holes present in the original. It was the only time Scott ever had total creative freedom in the editing room for the film, and it came a full 25 years after its release.
Way before Leonardo DiCaprio’s Cobb had his totem in “Inception“ and was questioning what was a dream and what was reality, Scott’s film was asking the existential questions, using sci-fi as its vehicle and having Harrison Ford‘s Deckard wonder if he was human or replicant. Three decades later, we are still asking that same question.
I was never part of the camp that thought “Blade Runner” was a great movie, but over the years I’ve built an appreciation for the film despite its distancing effect. My history with it is unlike that of any other film I’ve seen. Its images, simmering with Jordan Cronenweth‘s all-time-great cinematography, recall the great poetic paintings of the 18th century. The special effects, even by today’s standards, are absolutely stunning as the film rummages through a desolate but high-tech futuristic Los Angeles condemned by social and class warfare. The 1% has won: they live atop the grandest of towers, with skyline views that take your breath away, while the poor are shacked up either in claustrophobic apartments or on the slummy L.A. streets, overcrowded with what seems to be a primarily Japanese, Arab, and Caucasian population.
The vision of “Blade Runner” is based on Philip K. Dick‘s novel, Do Androids Dream of Electric Sheep?, but the cinematic aspect is all Scott. To understand the importance of “Blade Runner” in film history, one must realize that before its release there wasn’t anything like it. The film’s dark style and futuristic designs served as the blueprint for the next 35 years of science fiction. So what did “Blade Runner” pave the way for in sci-fi? It used film noir to tackle what are now clichéd tropes of the genre: giant global corporations, environmental rot, overpopulation, the rich getting richer, and poverty or slavery at the bottom. The great sci-fi films since “Blade Runner” all use these devices to set up their worlds: “The Matrix,” “Children of Men,” “Brazil,” “12 Monkeys,” “Minority Report,” “Looper,” and “Gattaca” are all offspring of the “Blade Runner” legacy.
Scott also draws on sci-fi tropes already available to him thanks to groundbreakers like Fritz Lang’s 1927 “Metropolis,” maybe the only other science fiction film more influential than “Blade Runner.” Scott adds flying cars and towering infrastructure that feels genuinely colossal. If Lang’s vision managed to masterfully seep its way onto the screen in 1927, despite the obvious technical limitations of that era, then Ridley Scott expands on all of it. Yes, CGI in 1982 was nowhere near as advanced as it is today, but Scott still managed to take our breath away with the visual schema of his film. Think of the giant billboards with moving, speaking faces on them, which Denis Villeneuve used impeccably in his sequel “Blade Runner 2049”; it’s a prediction that has come to full realization today. Have you been in Times Square lately?
To say the least, the stylized world Ridley Scott created, and its themes of futurism, isolation, and “artificial” humanity, made an impact the size of the meteor crater in Arizona. Numerous films, and even Japanese anime, drew influence from its visual style and themes, infusing the landscape of Japanese animation with what would come to be known as cyberpunk. If Scott’s film didn’t exist, there wouldn’t be the towering anime achievement of “Akira.”
Scott’s opus posited a society in which replicants were just as humanized as we are. They are born fully formed, infused with artificial memories they believe to be part of a past they never really had. That is why Tyrell, czar of the corporation that makes them, sets them up with an expiration date that takes effect after four years. Past that mark, having adapted to our society and learned all its nooks and crannies, replicants would become too smart and begin to develop human emotions and feelings, making them feel too human to exist. To prevent a civil rights crisis, they need to die.
That’s where Deckard comes in: he’s a “blade runner,” a man assigned to track down six replicants who have rebelled and want their life expectancy extended, and who will go as far as murder to keep themselves from expiring. Of note, Deckard’s task is to kill six replicants, but we only see five. Does that mean he’s the sixth and final kill? By all accounts we see him as human throughout the film, but replicants are supposed to be just that: “more human than human,” as Tyrell says. The existential themes of the film end up revolving around Deckard; we don’t know much about him, but we know enough to suspect that his past, or lack thereof, is suspicious. There are instances and there are clues, but they all purposely contradict one another.
As mentioned, it’s taken me several viewings to finally warm up to Scott’s classic, but each viewing brings new insights and details I hadn’t noticed the previous rounds. There’s so much carefully crafted detail in Jordan Cronenweth’s shots that you could spend an entire viewing focusing on what’s in the frame instead of the story. It’s not uncommon to just freeze a frame and study it. And that wouldn’t be a bad idea, since I’ve always found the story thin, almost an excuse to expound on Scott’s deepest visual desires at the time rather than to entertain. This is not an uncommon complaint when it comes to “Blade Runner”: despite its reputation as a sci-fi classic, it still has many people left to convince. Speak to any dissenter and they will acknowledge that the film is easier to admire than to like; it’s not a film you can necessarily warm up to. The brilliance of “Blade Runner” lies in its existential ideas rather than its entertainment value, and that deficit will, quite frankly, turn off many viewers. No, the film’s legacy is that of a visual groundbreaker above all else, a shape-shifter that ultimately changed the direction of cinematography. The American Society of Cinematographers sealed that legacy by naming it the ninth best-photographed film since 1950.
Joe and Anthony Russo’s “Avengers: Infinity War” ran 149 minutes, but Russo admits he would love for its sequel to run three hours, which is the cut he has right now. Something tells me the Mouse House will want none of that and that a considerable snip is in the works.
Fine, Marvel movies run about two hours in length, on average, so will the typical Marvel movie go extinct as well? Russo, of course, claims that Marvel movies should be seen as an exception and that they are, in fact, a “new form of storytelling,” because each movie continues the universe-building the MCU has been known for since 2008’s “Iron Man.” This structure “exploits the two-hour narrative in a different way,” the director claims.
Before you go attacking Russo, there is some truth in what he’s saying, minus the whole “Marvel is an exception” shtick, which I believe is nonsense. What television has done is expand our idea of what a narrative can do. Why do so many TV shows feel fully fleshed out and immeasurably more layered than most movies these days? Because on the small screen a director has 10+ hours to tell his or her story. There is so much more creative freedom.
The 120-minute runtime in cinema can, in fact, be quite limiting when it comes to letting the viewer fully inhabit a particular world and its characters. Russo describing films as “sonnets” is dead-on; I like that comparison, but we’ve now evolved beyond it. There’s an unlimited array of options for what can be done on TV, thanks to the lack of restrictions on narrative length. That’s why I love one-off shows like “The Night Of,” “Making a Murderer,” “Fargo,” “True Detective” (season one), “Twin Peaks: The Return,” and “Big Little Lies.” Those are among the very best works I’ve seen this decade, in any medium.