Thoughts on Gemini Man, and its High Frame Rate
Posted on October 12, 2019 Posted by John Scalzi 28 Comments
My daughter asked me if I wanted to go see Gemini Man with her last night, and I did, not because I thought it would be a gripping action film with just a tinge of science fiction (which is what it’s promoted as), but because I’m a cinema nerd and director Ang Lee shot the film at 120 frames a second, i.e., a much higher rate than the standard 24 frames per second used for the usual cinematic outing. I wanted to see what it looked like, and whether it would add anything to the experience.
The personal answer to this question: well, I thought it looked cool, anyway; and no, not really.
I’ll get to that in a minute, but first, the story: Will Smith is a 51-year-old assassin who feels he’s lost a step and wants to retire, but of course when you’re a professional assassin you can’t just retire, so the government, in the form of Clive Owen, sends an assassin to take him out, an assassin who just happens to be a clone of Smith’s character (this is not a spoiler, it’s all over the trailers and posters). Action scenes and bog standard plot twists ensue, and Mary Elizabeth Winstead and Benedict Wong are along for sidekick and comic relief duties respectively.
It’s fine. Director Ang Lee works beneath his level, but since his level is “two-time Oscar winner” it’s all still perfectly competent. The script has major holes in it but the movie doesn’t slow down to let you think about them, so that’s well enough, and the action scenes move along at an agreeable clip. Smith, Winstead, Wong and Owen are all attractive presences on screen, and the CGI’d younger version of Smith is credible enough both in physical detail and performance not to be distracting. It’s fine. Fine is fine. I don’t know that I will remember this movie a week from now, but while I was watching it I was reasonably entertained. Fair enough.
But for me, the thing I wanted to see was the high frame rate, and how it contributed (or didn’t) to the movie. There are purists who dislike movies being screened at higher than 24 frames a second because they think that 24fps is an essential part of cinematic grammar — it’s what gives cinema its “feel,” and higher frame rates make everything feel like a cheap soap opera. Personally, I’m meh on this; 24fps is a historical artifact, and there’s no particular reason to be tied to it these days, when nearly all theater projectors are digital and movies can be recorded and shown at higher frame rates if the filmmakers want. Moreover, I’m pretty sure that younger people don’t see high frame rates as a negative; if they see something at 60fps or above, they don’t think “soap opera” — a reference which doesn’t mean anything to them since soap operas mostly don’t exist anymore — they think “video games.” And in video games, the higher the fps, the better. Why not the same in movies?
With that said, if you’re going to go out of your way to record your movie at a higher frame rate, I think it helps to have a reason. I’m not tied to the 24-frames-per-second rate, but there’s nothing wrong with it, either. If you’re going to deviate from it — and call attention to that deviation — it’s worth it to have a good reason for doing so.
As far as I can see, there wasn’t any particularly good reason to go with the higher frame rate for Gemini Man. Yes, everything on screen moved more smoothly, and if you’re not used to higher frame rates, it can give the illusion of hyper reality. But the novelty of that wears off quickly enough, and then it becomes a question of whether the additional frames help with cinematography, or action sequences or special effects or anything else. And here, it didn’t, really. The action sequences, in particular, were not so complicated or choreographed that a higher frame rate added clarity to their execution; I suspect they would have been equally effective at 24fps. I was aware of the additional smoothness in these scenes (especially the slow motion bits), but I wasn’t seeing how it mattered, aesthetically or functionally.
So, in the end, the higher frame rate of Gemini Man was… fine. The movie worked fine with it, and it would have worked just fine without it. It neither harmed nor added real value to the movie or the story. Does it make me think that high frame rate movies are the wave of the future? Not really, no. It doesn’t argue against the idea, either. It’s now just another tool in the filmmaker toolbox. Something they can do, if they want to, or not if they don’t. Like 3D, which, incidentally, I saw Gemini Man in, and which, like the high frame rate, neither added nor detracted from this particular movie and story.
This is the second film I’ve seen in theaters at a higher frame rate; the first was The Hobbit, which I went out of my way to see in “48HFR,” as it was advertised at the time. I liked it there and thought it suited the movie, but then I saw the subsequent Hobbit installments in regular 24fps and did not feel the lack of frame rate in any particular way. I’m still waiting for the movie for which a higher frame rate is actually critical for the cinematic experience. Maybe the upcoming Avatar sequels? Say what you will about Avatar, but for my money there was a distinct differential in experience between the 2D and 3D versions of that movie, and the 3D version was noticeably more affecting. I understand Cameron is shooting the sequels at 60fps, and if there’s any filmmaker who can make those higher frame rates pay off, it’s probably him. We’ll see.
In the meantime: Gemini Man is a perfectly adequate way to burn off two hours in the theatre. If you like Will Smith, it’s very Will Smithy. There are worse things.
My reaction to the higher film rate in “The Hobbit” wasn’t that it looked like a soap opera, but that it felt like they left the ‘action news’ filter on. But then again, I’m not a youngin’, so maybe that was the point. ;-)
I think I like your review, of the movie and not just the tech, better than what was published in the local paper. The local review was pretty hard on it for what seemed to me to be pretty secondary reasons. I’m not sure folks who go for this kind of film are looking for any depth of plot. Nice SF seasoning to a run-and-gun plot. I haven’t seen it, but I think it does raise the question of genetic engineering attempting to improve the human organism. Not novel, but a refresh of the concept.
I remember advice about scotch from back in the day and the advice was, if you can’t taste the difference between (for example) Johnnie Walker Black and Johnnie Walker Green, why pay more for Green?
I know humans are analog creatures but what is the usual human eye-brain processing fps rate? If we can’t tell the difference between 60 and 120 fps, does it even matter?
I want to see Gemini Man, but the local theater doesn’t show it in English (at least, not currently – maybe it will next week). Hopefully they will sooner or later!
I wonder how many purist nimrods complained about talkies changing the frame rate from 16 fps?
“24fps is more ‘movie like'” is the visual version of “vinyl sounds better than digital”.
I wish they wouldn’t call stereoscopic movies “3D”. First, because since I lack stereoscopic vision, they do nothing for me. But also because the terminology will confuse people when and if we ever have fully holographic movies — truly 3D.
I know a lot of people had a problem with the Hobbit movies in theatres where they were shown at 48fps. I saw it at 24, and my only problems were with them being such bad movies.
Why are there so many movies about assassins?
As one of those people who play games at 60 fps or else! I wonder if the much higher frame rate has to do with those CGI sequences you mentioned. To get /them/ looking completely realistic, the director may have had to shoot the whole lot at that frame rate. Or perhaps the frame rate was necessary because of the scenes in which CGI and live action appear together. Would be interesting to know.
I didn’t like the high frame rate in The Hobbit, not because I am any sort of historical purist, but because the high frame rate (combined with the 3D sequences) triggered my motion sickness something chronic. Not as bad as Avatar, where just the 3D filmmaking alone quite literally made me feel dizzy just looking at it (oddly, watching Avatar in black and white with the colour turned off made it watchable without giving me dizzy spells and nausea; weird). I am not looking forward to our high frame rate, 3D-optimised future. All my life, moral guardians have been trying to get me to watch fewer movies and less TV; now the motion picture industry is going to achieve that for them.
TL;DR High frame rates and 3D make me feel physically unwell. #NotAFan
It seems that Lee’s preferred format for screening this movie is the 120fps 4K in which it was shot, and apparently no theater in the US is showing it in that format. Hard to say how different the experience might be, but I would leave some room for the possibility.
This sounds much like my opinion of the difference between Blu Ray video on a big tv and 4K video on a big tv.
Considering the all-star combination of writer, director, producer and actor, I was a little disappointed in Gemini. True, it was “fine”. I noticed nothing in how the film looked (I saw it in IMAX), and I’m not sure what looking like a soap opera even means, but it doesn’t sound like something positive.
Cameron is the only director/producer who seems willing to put some time and work into 3D. I thought it added to Alita as well (Cameron produced). Animated movies in 3D are a complete and utter waste with the possible exception of Polar Express. I think 3D can add to the cinema experience in some cases but very few people seem to be creative with it. It needs a visionary to lead it forward.
IMHO, between increased frame rate presentations on the one hand and slo-mo / “bullet time” technology on the other, both producers and consumers of movies are spoiled for choices.
The thing is, you have to take into account the reason he did the HFR, and it had nothing to do with smoothness of motion in the action. The reason for it was so that the de-aged Will Smith looked more lifelike. So it’s probably not possible to know if this purpose was a success or not unless you’re willing to watch the film twice in relatively quick succession.
I watched it last night in IMAX, which wasn’t HFR. I think I’m going to try and peek in on the HFR showing tomorrow night when I’m seeing Joker.
That being said, the lifelikeness ebbed and waned in my showing. There are spots I’d specifically like to see for comparison, but I’m not willing to watch the whole thing to target them, especially since it has to be done in 3D.
“Personally, I’m meh on this; 24fps is a historical artifact, and there’s no particular reason to be tied to it these days…”
Hmm. Kinda like those who refuse to use the Oxford comma…
My reaction to THE HOBBIT was the same as many others; I thought it looked like a soap opera or news video, and therefore looked “cheap.” The other reaction I had, which I haven’t seen anyone else mention, is that I could see the layers of effects. It was like looking at successive layers of animation cels piled on top of one another.
I have to admit, I don’t see the difference between 30 and 60 fps when I’m playing games on my computer, so it all seems like a bit of a waste to me.
Most of the resistance to high frame rate in cinema isn’t that it looks like a “soap opera”*, but that it makes things look too real in a way that reveals that they aren’t. It makes the sets look too much like sets and makes the presence of the camera too obvious, making movies look like making-of documentaries instead.
* That’s the criticism of frame interpolation/motion smoothing on flatscreen TVs.
So…reading between the lines of your highly opaque, hard to decipher reactions to this one, I infer that you thought it was…fine?
*nyuk nyuk*
From what I understand, the increased frame rate helps with eye strain when the movie is in 3D. That’s why Peter Jackson said he shot in HFR.
Personally I loved the Hobbit movies in 3D/HFR and saw all three in that format. One thing I think it adds is smooth motion and a more pleasing image. After getting used to it (I grew up with 24fps) I see so much judder whenever the camera pans at 24fps, and it’s hard to watch that way.
I wonder how much of the problem with increased frame rate might be similar to the problems with early digital sound recording: figuring out how to deal with what the medium does to the content. Early digital recordings were said to sound cold or brittle, and while I was never one of those “vinyl sounds better” guys (I bought and enjoyed “Bop Till You Drop” immediately and have the early 3M/Sound 80 digital LPs), I did come across Denon PCM classical recordings that were not pleasant to listen to in pretty much the ways that the dissenters described. But it’s been decades since I came across a classical digital recording with those problems.
The analysis that I found satisfying was that the microphone and mixing setups that worked fine with analog were not necessarily going to work for digital–which is apparently unforgiving–and the engineers had to figure out how to wrangle what the technology could “hear.”
Similarly, we have noticed when a TV series moved from film to video and/or digital recording, and how some series producers have managed to tame the flat look–probably by adjusting the lighting. (The transition was pretty obvious in Midsomer Murders.) I also recall the change in the way even filmed TV series were lit, so that they looked more like theatrical film than the old flat-lit style that was perhaps inherited from studio-shot color production, where the lights had to be so bright that news anchors had to wear “TV blue” instead of white shirts.
So: Does high-frame-rate video deliver technical advantages (as, say, increased resolution might) that need to be balanced with aesthetic concerns and expectations? I would think that digital recording allows all manner of post-production manipulation, even if, say, the lighting proved unsatisfactory.
Russell Letson:
I think you’re right here — when “next generation” tech meets “last generation” practices, there’s a gap where the artifice is noticeable, and in being noticeable, is a negative. Then the practices catch up and everything is better, until the next generation of tech.
In my world we call this kind of movie a beer and snacks movie worthy of watching with friends on DVD, or at the discount theater when you can see it for cheap.
I never saw any of the Hobbit movies in 48fps, but I really wished I had. I rarely notice 24fps but the strobing on some of the action scenes at least in the first movie was just AWFUL. (The rabbit-sled chase across the downs is the one that sticks most vividly in my memory.)
However, doing so *now* would require both special equipment which I don’t have as well as a willingness to slog through them again, and I wasn’t up to that even when I binged their making-of documentaries a year or two ago. The documentaries at least were quite good—I’d wanted more from the Lord of the Rings’s documentaries, and The Hobbit’s were about right.
Interesting, I actually hadn’t heard about the higher frame rate on Gemini. I might have to see it just for that, although I agree the higher rates do make the movies seem cheaper to my eyes. As opposed to not really capitalizing on the frame rate in Gemini, have you seen the animated Spider-Man: Into the Spider-Verse? The work they did with effectively decreasing the frame rate by animating on the twos for parts of the film, or for certain characters in scenes versus other characters, really has some payoff. I could feel that something was different the first time I watched it, way before I heard about their choices in animation and frame rate.
Ah, the perfect place to drop my kvetch (disclaimer: I’ve only seen the trailer): “Guys, if you’re gonna CGI a human character, you need more resolution around the mouth.” I’m not hearing-impaired but I depend heavily on lipreading, and the deadlips in a lot of CGI drives me nuts. Even Pixar! They used to be really good at this, but that seems to have slipped in the last couple of movies I’ve seen from them.
Also: the chase scene in the trailer looks like they need to work on their mass/momentum emulation. It had a lot of the motion issues I remember from the first Spiderman movie, when they hadn’t really modeled mass and momentum particularly well, yet.
Can’t believe how many posts are made about a film’s resolution. I used to watch shows and movies based on plots and storytelling. I’m an old lady of 40-ish. Guess I’m lost to the world of silly adults criticizing films using big words without knowing their real meaning.
“Early digital recordings were said to sound cold or brittle”
Huh. Not sure exactly what would cause this, but digitization of a raw analogue signal can cause aliasing: any frequency content above half the sampling rate folds back down as artifacts that aren’t in the original signal. The solution is to put a low-pass filter on the signal before it gets converted to digital. The design of the filter can be a bit subjective, with some tradeoffs to balance. Maybe the early digital recordings had bad (or no) anti-aliasing filters.
Back when Compact Discs were brand spanking new, ASICs were struggling to keep up with the technology, so some corners may have been cut. The original chipsets basically only had enough memory to store a second or so of audio, and the chip had to speed up and slow down the RPM of the disc to keep that tiny little buffer just full enough so that the audio data wouldn’t underflow or overflow. (First-gen CD chips treated the disc like a record player, trying to read data on the fly, in real time, at playback speed.) A badly designed chip back then might have overrun its buffer and dumped data here and there, resulting in that crappy audio.
Nowadays, ASIC memory is dirt cheap and one can easily store an entire CD on a chip, no problem. And A/D design for audio signals is pretty much a “cooked” science, so there is very little one could screw up at the design level. First-gen stuff, though, was definitely working some kinks out of the system.
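To make the aliasing point concrete, here’s a minimal sketch in Python (the numbers are purely illustrative and not tied to any real recording chain): a tone above half the sampling rate, digitized without an anti-aliasing filter in front of the converter, shows up at a frequency that was never in the source.

```python
# Minimal aliasing sketch. Assumed, illustrative numbers only: a 30 kHz tone
# sampled at the CD rate of 44.1 kHz, with no anti-aliasing filter in front.
import numpy as np

sample_rate = 44_100          # CD sampling rate, Hz (Nyquist limit = 22_050 Hz)
tone = 30_000                 # ultrasonic tone, above the Nyquist limit

t = np.arange(0, 0.05, 1 / sample_rate)    # 50 ms of sample instants
samples = np.sin(2 * np.pi * tone * t)     # "record" the tone with no filtering

# Find the strongest frequency actually present in the sampled signal.
spectrum = np.abs(np.fft.rfft(samples))
freqs = np.fft.rfftfreq(len(samples), 1 / sample_rate)
print(f"{tone} Hz tone shows up at ~{freqs[spectrum.argmax()]:.0f} Hz")
# Prints ~14100 Hz, i.e. 44100 - 30000: an alias that was never in the source.
```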
The way I see frame rate, the standard will probably settle on 120 fps. Not because people can see at 120 fps, but because 120 is an integer multiple of both 24 and 30 fps, so both of those formats can upconvert to 120fps with zero distortion, and a 120fps display can play both 24 and 30 fps source material without conversion distortion.
Probably will see some new (incompatible) compression formats to deal with all that data. But we are rapidly approaching the point where standards far exceed the perception limits of humans.
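And to spell out that frame-rate arithmetic, a quick sketch (Python again, purely illustrative; the source and display rates here are just assumed common values, not a claim about any standard):

```python
# Why a 120 fps display handles 24, 30, and 60 fps material cleanly:
# 120 is the least common multiple, so every source frame is held for a
# whole number of refreshes and there is no pulldown judder.
from math import lcm   # math.lcm takes multiple arguments in Python 3.9+

sources = [24, 30, 60]           # common source frame rates
display = lcm(*sources)          # 120: smallest rate divisible by all of them
print(f"display rate: {display} fps")

for fps in sources:
    repeats = display // fps     # whole number of refreshes per source frame
    print(f"{fps} fps source: each frame held for exactly {repeats} refreshes")
```

The same arithmetic is why 3:2 pulldown exists for showing 24fps film on 60Hz video: 60 isn’t an integer multiple of 24, so frames have to be held for uneven numbers of refreshes, which is where the judder comes from.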