Is Motion Smoothing Ruining Cinema? (vulture.com) 347

With TVs now delivering images faster than movies, TV manufacturers have tried to make up for that discrepancy via a digital process called motion smoothing. Whether you've realized it or not, you've likely watched a movie in motion smoothing, as it's now the default setting on most TVs sold in the United States. Bilge Ebiri from Vulture says that while this feature was well-intentioned, "most people hate it." He argues: "Motion smoothing transforms an absorbing movie or narrative TV show into something uncanny. The very texture of what you're watching changes. The drama onscreen reads as manufactured, and everyone moves like they're on a daytime soap -- which is why it's sometimes called the 'soap-opera effect.' In other words, motion smoothing is fundamentally ruining the way we experience film." From the report: Motion smoothing is unquestionably a compromised way of watching films and TV shows, which are meticulously crafted to look and feel the way they do. But its creeping influence is so pervasive that at the Cannes Film Festival this May -- the same Cannes Film Festival that so valorizes the magic of the theatrical experience and has been feuding with Netflix for the past two years -- the fancy official monitors throughout the main festival venue had left motion smoothing on.

That seems like a funny oversight, but it's not surprising. "There are a lot of things turned on with these TVs out of the box that you have to turn off," says Claudio Ciacci, lead TV tester for Consumer Reports, who makes sure to switch smoothing off on the sets he evaluates. "It's meant to create a little bit of eye candy in the store that makes customers think, at first glance, Hey, look at that picture, it really pops. But when you finally have it at home, it's really not suitable." He notes that most people don't fiddle much with their settings because motion smoothing isn't easy to find on a TV menu. (It's also called something different depending on the manufacturer.) Which gets to the heart of the problem: As more and more people watch movies at home instead of in theaters, most won't bother trying to see the film as it was intended to be seen without the digital "enhancements" mucking it up. "Once people get used to something, they get complacent and that becomes what's normal," says filmmaker Reed Morano. And what films were supposed to look like will be lost.
Mark Henninger, editor of the online tech community AVSForum, suggests TV manufacturers "just put a couple of buttons on the remote that are direct surface level -- TV, movie, sports, or whatever." The industry's reluctance, he says, has as much to do with uncertainty as anything else. "Manufacturers don't know who to listen to. They don't know if it should be the reviewers, their own quality-assurance lab, or user complaints."
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • Here's a tip... (Score:5, Insightful)

    by BoogieChile ( 517082 ) on Wednesday July 24, 2019 @09:06PM (#58982716)

    Listen to the ones who are buying the TVs, not the ones who get to use them for free.

    • Listen to the ones who are buying the TVs, not the ones who get to use them for free.

      Yes, or as the end of the submission puts it: "Mark Henninger, editor of the online tech community AVSForum, suggests TV manufacturers "just put a couple of buttons on the remote that are direct surface level -- TV, movie, sports, or whatever." The industry's reluctance, he says, has as much to do with uncertainty as anything else. "Manufacturers don't know who to listen to. They don't know if it should be the reviewers, their own quality-assurance lab, or user complaints." "

      On my tv motion smoothing can

      • by Xenx ( 2211586 )
        I'm not saying you're definitely lying, but based on every TV remote I've used I can't imagine what you're talking about is what the summary was talking about. Yes, I can use a remote to go into a menu to then change over to a different mode. However, in my 38 years I've never had a TV remote with a single button press on the remote to make the change between screen settings. I can't say no TV has ever done it, just that it's never been an option I've seen on the dozens of different TVs used. Both are anecdotal, but I believe my experience kind of counters your claim that it's a common feature.
        • I'm not saying you're definitely lying, but based on every TV remote I've used I can't imagine what you're talking about is what the summary was talking about. Yes, I can use a remote to go into a menu to then change over to a different mode. However, in my 38 years I've never had a TV remote with a single button press on the remote to make the change between screen settings. I can't say no TV has ever done it, just that it's never been an option I've seen on the dozens of different TVs used. Both are anecdotal, but I believe my experience kind of counters your claim that it's a common feature.

          I didn't say single button/key press (that might be nice) - "Tools" button, scroll from a short list, and then "Enter" is what I do. Then again, this may be just as fast as a single button: the "Tools" button is left of the remote center and easily accessed without looking for it, and who knows where that "single-button settings control" would be hidden - maybe under a slide-out cover?

          • You're not using the cheap plasticky remote that came with the TV now, are you?

            Those are the ones the summary is talking about, not some fancy Logitech home hub universal remote.

        • by jrumney ( 197329 )
          Mine has a single button for switching in and out of "Sports Mode", which ramps the contrast up to ridiculous levels to make the ball stand out, and a "Quick Menu" button that gives you access to the other display and audio modes within three button presses.
        • I've only had three lcd TVs so far, and two of them were Aquos displays, but all three of them had mode and scale buttons on the remote. The third one was a Vizio. Maybe just don't buy Sylvania

    • by Mr. Dollar Ton ( 5495648 ) on Wednesday July 24, 2019 @09:36PM (#58982810)

      Don't write a 1000 word summary without an actual explanation of what this "motion smoothing" thing is.

      • by ShanghaiBill ( 739463 ) on Wednesday July 24, 2019 @09:57PM (#58982880)

        Don't write a 1000 word summary without an actual explanation of what this "motion smoothing" thing is.

        It would also be nice if they explained why it is so bad, and why, if it is so bad, most people don't even notice it.

        I have no idea if my TV has this feature or not. Should I care?

        Perhaps my cinema experience has already been ruined without my knowledge. How can I find out?

        • by Anonymous Coward on Wednesday July 24, 2019 @10:07PM (#58982906)

          It isn't bad. Well, once in a while it can be done poorly, but for the most part, most of the time, it simply makes the video smoother in a pleasant way.

          Think about watching a film at 10 FPS. It would be super stuttery, right? 24 is much better and we don't see as much stutter, but it still isn't as smooth as we can perceive. Smoothing it out into higher frame rates makes it more pleasant to watch, just as 24 is more pleasant than 10.

          That's all it is. It's only bad if it's implemented poorly, like on some low end sets.

          • No, it is not pleasant. You are a crazy person. Everything looks like it's sliding across the screen. The original source was not shot with motion smoothing in mind. It is objectively not what is intended.

            • by MikeS2k ( 589190 ) <mikes2 AT ntlworld DOT com> on Thursday July 25, 2019 @02:24AM (#58983470)

              Actual video that has been shot in 60fps looks lovely, although for high fantasy movies there are complaints that it looks "too real" - actors on a set. This should improve with better sets as people get used to working in 60fps.

              24fps or 30fps video that has been "upscaled" or interpolated to 60fps looks like utter garbage. To me, it looks like those old videos shot in the 1920s with hand-cranked cameras: what was smooth motion now seems to change speed randomly, slightly speeding up then slowing down. It just looks uncanny; I can't believe anyone ever thought this nonsense was a good idea.

              • 30fps translated into 60fps should be unnoticeable to anyone.

                The problem comes with converting that 24fps into 60fps - it's not easy to do as there's no direct mapping, so you end up having to do your best, making each frame last for 2.5 frames on average, which means some will be there for 3 frames and others for 2.

                25fps is even worse.

                This is where the juddery effect comes in: it's not so much the frame rate, but the frame rate displayed via a technology that can only match it imperfectly. And there's no re
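The repeat-count arithmetic described above can be sketched in a few lines (a hypothetical `pulldown_counts` helper; it assumes a plain repeat-frame cadence with no interpolation):

```python
# Sketch: repeat-count cadence for showing 24 fps film on a 60 Hz display.
# Each film frame must cover 60/24 = 2.5 display refreshes on average, so
# the counts alternate between 2 and 3 (the classic 3:2 pulldown cadence).
def pulldown_counts(src_fps=24, dst_fps=60, n_frames=8):
    counts, emitted = [], 0
    for i in range(1, n_frames + 1):
        due = i * dst_fps // src_fps   # display refreshes due after frame i
        counts.append(due - emitted)
        emitted = due
    return counts

print(pulldown_counts())  # -> [2, 3, 2, 3, 2, 3, 2, 3]
```

Running the same helper for a 25 fps source gives the even less regular cadence [2, 2, 3, 2, 3, 2, 2, 3], which is why the poster calls 25fps "even worse".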

                • This was all true a few years ago, but technology has caught up. The vast majority of late-model TVs tested here [rtings.com], both LED and OLED, display judder-free 24p from a native 24p source, and most even from 24p media that was converted and transported in 60p/60i.

                  As I understand it the former was largely driven by 120Hz refresh rates, since then all frames of a 24p source are simply displayed 5:1. And the latter is just a signal processing exercise of detecting the pulldown pattern in the source and throwing ou

        • Until motion smoothing is found to be causing brain tumors, I think it's here to stay.
        • Whether or not you like the movie, I found a good example scene in Scott Pilgrim vs. The World, during the first time where Scott and Knives are shown in a music store browsing the selections together, not too far into the movie. There is a tracking shot parallel to the music aisle, showing lots of store aisles and signs of music genres in the background. The first time I saw it on my TV (a Samsung D6400 with motion smoothing), the effect was distractingly obvious. The only way I could stop the effect wa
          • "Also, to get the motion smoothing effect, you may have to be watching the video stream from an HD source like a Blu-Ray disc"

            Other way around, probably. Motion smoothing is essentially framerate upsampling with interpolation. It shouldn't do anything if the source is already at an appropriate framerate. If you have an HD source and an HD TV, motion smoothing should be a no-op.

            Though many TVs are now 120Hz, which means you may get the effect from a 60Hz HD source.

            The effect may be noticeably different for

          • by 605dave ( 722736 )

            Thank you! I could never figure out why game mode looked the best, and it's simply because it turns off all the other crap.

        • Perhaps my cinema experience has already been ruined without my knowledge. How can I find out?

          Do you pay more than 10 cents to go to the theater on the weekend like my mom did? You do??

          It's been ruined.

        • by AmiMoJo ( 196126 ) on Thursday July 25, 2019 @03:03AM (#58983566) Homepage Journal

          Motion smoothing is where the display takes the original frame rate of the video, say 24 fps for movies, and adds extra frames to increase the rate to some higher multiple, typically 120 fps. The extra frames are created by looking at two original frames and interpolating the motion between them.

          This makes the video look very smooth and clear. It looks good in showrooms, but also a little bit fake or like a soap opera recorded on video tape rather than film stock, so some people don't like it. It's kinda like the "bass boost" button on hifi, audiophiles scoff at it but many consumers like it.

          This is really an argument about what movies should look like. When movies are re-mastered they often remove the film grain, and the result is very smooth and clear looking but purists argue that the imperfections added something. Not just movies in fact, there is currently a re-re-re-release of Dragon Ball Z that is getting the same treatment and people are complaining that it's too clean.

          • I've had a similar issue in making low-bitrate encodes of animated material*. In order to get the most efficient encode possible, I used some pretty elaborate filtering to remove all the artifacts from previous encoding and processing. I'm good enough that I could get the image absolutely perfect, artifact-free, with absolutely no loss of details. And yet it looks... wrong. The pure uniformity of color regions seems unnatural.

            *I will neither confirm nor deny the legality of my reasons for doing this.

        • "Perhaps my cinema experience has already been ruined"

          Directors now seem so determined to demonstrate that their cinematic masterpiece is not to be confused with TV that you can put most films into one of two categories - "blue" or "yellow" - depending on the colour of the filter they've employed to reinforce their artistic credentials. Both look odd and seem to show nothing but the self-regard of the filmmaker.

          I'd happily accept a bit of motion-smoothing if I could actually see things in the right colours.

        • It would also be nice if they explain why it is so bad, and why, if it is so bad, most people don't even notice it.

          Most people WILL notice it, except that neophytes will not be able to pinpoint what's going on. My brother got a new 4K TV screen that does motion smoothing, and after looking at a motion-smoothed 1080p picture, he declared, "this is why 4K makes the difference".

          Motion smoothing is extremely bad because using motion smoothing interpolation to convert 24fps film into 60fps or 120fps (or smoot

      • by Jody Bruchon ( 3404363 ) on Wednesday July 24, 2019 @11:05PM (#58983064)
        Indeed. What they're calling "motion smoothing" is really just motion interpolation. [wikipedia.org] If a TV supports 60fps and the movie is in 24fps (I'm ignoring the matter of telecine because the TV auto-detects telecine and removes it), the TV will look at two adjacent frames, look at the differences between them (possibly using previously decoded frames to help), attempt to extrapolate motion vectors from those changes, and recalculate new frames at the higher frame rate based on the existing frames and the calculated motion vectors. The original frames aren't generally used because they won't necessarily correspond to frames in the destination frame rate (24fps to 60fps means 2.5 60fps frames for every 24fps frame, so only every fifth 60fps frame can line up with a 24fps frame correctly), so what you end up seeing tends to be almost entirely recalculated frames. It gives buttery smooth movement where before you would see heavily motion-blurred jumps between the frames, but because it's based on algorithmically guessed motion vectors, it always looks a little unnatural. Notably, what I just described is also how frame-rate-changing plugins for video editors, like Twixtor Pro, function under the hood.

        This technique completely disintegrates when a motion vector can't be calculated to fill in every part of the interpolated frames. In this case, the interpolation engine has no clue what to do, so instead of interpolated motion vectors, it just fades the failed portion from one frame to the next frame. This is why you'll see the motion interpolation on your TV work fine for most things, but then some guy in an action movie has the audacity to swing his arm in an arc very quickly and the movement between frames is too large to safely consider as a motion, so you get 60fps everywhere but where the arms are moving at 24fps with a weird-looking fade effect. If you've ever noticed this, it's annoying as hell and really ruins the effect, looking way worse than if the source was left alone. It's also the main reason I turned that shit off on my TV within a week of buying it.
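The fade fallback described above can be illustrated with a toy sketch (a hypothetical `interpolate_midframe` helper, using 1-D "frames" instead of the 2-D blocks a real interpolator matches):

```python
# Toy mid-frame interpolator: where a motion vector was found, sample the
# previous frame shifted halfway along the vector; where matching failed
# (vector is None), cross-fade between the two frames instead.
def interpolate_midframe(prev, nxt, vectors):
    mid = []
    for i, v in enumerate(vectors):
        if v is None:
            mid.append((prev[i] + nxt[i]) / 2)           # fade fallback
        else:
            mid.append(prev[(i - v // 2) % len(prev)])   # motion-compensated
    return mid

prev = [0, 255, 0, 0, 0]    # bright pixel at index 1
nxt  = [0, 0, 0, 255, 0]    # it moved +2, now at index 3
good = interpolate_midframe(prev, nxt, [2] * 5)     # vector found everywhere
bad  = interpolate_midframe(prev, nxt, [None] * 5)  # matching failed
print(good)  # [0, 0, 255, 0, 0] -- the pixel lands halfway, motion looks smooth
print(bad)   # [0.0, 127.5, 0.0, 127.5, 0.0] -- a double-exposure "fade" ghost
```

The `bad` output is exactly the weird-looking fade the comment describes: instead of moving, the object appears half-transparent in both its old and new positions.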
        • Then my TV doesn't have it, or my eyes are too slow.

          Either way I don't care, especially if it can be turned off.

        • I haven't bought a TV in 10 years. Can interpolation algorithms in modern TVs deal with subtitles in the video signal? When the picture is panning while the subtitles are stationary, it tends to look really bad, because you have pixels with different motion vectors very close to each other and the interpolation algorithm doesn't know what to do.

          • Subs are not hard-coded, they are added on after the picture processing. So yes.

            If you are watching something with hard-coded subs, get a better source.

        • but because it's based on algorithmically guessed motion vectors, it always looks a little unnatural.

          A great summary but I disagree with this part. The "algorithmically guessed" frames look no more or less unnatural than native 60fps footage including stuff that was specifically shot at high frame rates for artistic direction (e.g. Hobbit, and ... errr... fine, I'll say it, Porn).

          Unless it breaks down as you described with the fading the whole unnatural uncanny valley bit comes only from the fact that it's not the motion we are used to seeing on the screen.

        • ... to simply take before and after pixel values in the bookending frames and do a simple linear interpolation between them for the manufactured frames?
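That naive approach is easy to sketch (a hypothetical `smooth_double` helper; real sets estimate motion vectors rather than blending, because blending makes moving objects ghost instead of move):

```python
# Naive "motion smoothing": double the frame rate by inserting a 50/50
# linear blend between each pair of original frames.
def smooth_double(frames):
    """frames: list of equal-length lists of pixel intensities."""
    out = []
    for a, b in zip(frames, frames[1:]):
        out.append(a)
        out.append([(pa + pb) / 2 for pa, pb in zip(a, b)])  # midpoint frame
    out.append(frames[-1])
    return out

clip = [[0, 0], [10, 20], [20, 40]]   # three 2-pixel "frames"
print(smooth_double(clip))
# -> [[0, 0], [5.0, 10.0], [10, 20], [15.0, 30.0], [20, 40]]
```

For smoothly changing values this looks fine, but any object that jumps position between frames shows up as a semi-transparent double image in the blended frames.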

        • by AmiMoJo ( 196126 )

          No TVs I've ever seen work like that. They never increase the frame rate to 60Hz, it's always an exact multiple of the source frame rate and they always display the source frames in full.

          60, 30, 24 -> 120

          59.94, 29.97 -> 119.88

          50, 25 -> 100

          60Hz displays showing 24Hz input use frame duplication. For broadcast they duplicate fields of the interlaced image, which modern TVs have to detect and discard because it would look terrible. It only really works on CRTs.

          On 50Hz displays they usually just speed t
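The rule described above - the panel runs at an exact integer multiple of the source frame rate - can be sketched as a hypothetical `panel_rate` helper (integer rates only, sidestepping the 59.94-style fractional cases):

```python
# Pick the highest panel refresh rate that is an exact integer multiple
# of the source frame rate, up to the panel's maximum (assumed 120 Hz).
def panel_rate(src_fps, panel_max=120):
    return src_fps * (panel_max // src_fps)

print(panel_rate(24))        # 120 (5x)
print(panel_rate(30))        # 120 (4x)
print(panel_rate(25, 100))   # 100 (4x)
```

This matches the comment's table: 24/30/60 all map to 120, and 25/50 map to 100 on a 100 Hz panel.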

        • Interesting. So it sounds like you could say that motion interpolation is an attempt to increase the "detail" of the movie image in transition. It seems ironic, then, that in video games today it's common to have a motion blur effect, which does the opposite; it decreases detail by blurring the image during motion. I tend to turn motion blur off for that reason. Likewise, it's ironic that other effects are added to make a video game look more like a film, such as lens flares and depth of field. This brings
    • Listen to the ones who are buying the TVs, not the ones who get to use them for free.

      Listening to customers is a bad idea. They should be listening to an industry calibration standard for motion.

      Display a calibration pattern on a new TV out of the box and you'll find the colors are oversaturated and the image oversharpened (ditto for color temp and brightness level). Marketing people are pushing all kinds of ridiculous tweaks on purpose because they look superficially "better" to customers.

      • Most of it's just to ensure the TV is tuned for brightly lit stores with the customer standing farther away than they would at home.

        You shouldn't expect a TV to work well at both Best Buy and in your living room without some tweaking.

  • by xxxJonBoyxxx ( 565205 ) on Wednesday July 24, 2019 @09:10PM (#58982724)
    >> Is Motion Smoothing Ruining Cinema?

    No, Marvel Phase 4 will ruin cinema. Looking forward to telling the neighbor kids about the golden age of comic book movies and to get off my lawn.
  • 24fps?! (Score:5, Insightful)

    by Anonymous Coward on Wednesday July 24, 2019 @09:18PM (#58982744)

    Dear movie industry.

    Stop your bleating and film your media at a decent frame rate, so we don't have to fix your outdated crap with fancy flawed algorithms.

    Signed,
    2019

  • by SirAstral ( 1349985 ) on Wednesday July 24, 2019 @09:19PM (#58982756)

    I personally like all the stupid shaky cam shit so I can get a headache and barf on the person in front of me so they can barf on the one in front of them. /poe

    Smoothing is but a small respite from the idiocy that is current Cinema. I am tired of overly loud cinemas that need to jack the volume up so you can still hear the actors talk only to have your ear drums blown out when special effects happen.

    I would want TVs and Players to auto smooth audio for me so that dialog in the movie along with all the bullshit explosions are at the same decibel levels. I don't need my neighbors to know I am watching Die Hard! I don't want to hear them watching it either!

    • That's AGC (automatic gain control), which is responsible for audio. It has nothing to do with motion smoothing.
      • I was talking about that along with Motion Smoothing, should have made my post a bit more clear. I will look into those features.

        Not sure what my Blu-ray player is capable of yet, so I have not even thought to try.

    • I am tired of overly loud cinemas that need to jack the volume up so you can still hear the actors talk only to have your ear drums blown out when special effects happen.

      I would want TVs and Players to auto smooth audio for me so that dialog in the movie along with all the bullshit explosions are at the same decibel levels.

      Literally all home cinema gear has this already. Turn the damn thing on and stop complaining. I don't want my experience ruined because you want to get along with your neighbours. Supporting dynamic range compression is a requirement for both Dolby and THX certifications for home. Depending on equipment you may even have the ability to apply it twice, once in the TV and once in your source.

      RTFM.

  • I'd happily vote (Score:4, Informative)

    by evanh ( 627108 ) on Wednesday July 24, 2019 @09:49PM (#58982852)

    for all films to be made at 100 fps. They're way too shuddery as is.

    As others have said, the interpolation is a decent stop-gap, albeit not perfect.

    • ^^ THIS.

      The "magic" number for silky smooth frame rate seems to be between 96 and 120 Hz (inclusive). Exact multiples of 24: either 4x or 5x respectively. There are decreasing returns past 144 Hz (some gamers can detect 240 Hz.)

      24 fps looks like crap because it is stuttery as hell -- especially pans and quick camera rotations. Interpolating extra frames looks worse.

      The Hobbit at 48 fps is NOT a good example because it was < 60 fps.

    • for all films to be made at 100 fps. They're way too shuddery as is.

      Sure; since flatscreen TVs run at 60 or 120 - and soon 240 - let's choose a frame rate that doesn't divide cleanly into any of those, so that we continue to have this fucking problem. (Are you sure you're not a bureaucrat in real life??)

  • Nope. (Score:2, Redundant)

    by dohzer ( 867770 )

    Nope, comic book franchises and blockbuster remakes are ruining cinema on their own.

  • The answer is "NO" (Score:5, Insightful)

    by PopeRatzo ( 965947 ) on Wednesday July 24, 2019 @10:07PM (#58982904) Journal

    Because of course it is. Motion smoothing is not ruining cinema. A lack of good ideas and over-reliance on previous intellectual property is ruining cinema. A singular focus on blockbuster films that can have sequel after sequel is ruining cinema.

    And even with these challenges, there is still magnificent cinema to be had. You gotta dig a little bit, but it's definitely there. If you want good cinema, go look for some instead of watching jackoffs play video games or act an ass on YouTube.

    If you have a public library card, or have any affiliation with a university, you should check out the streaming service called "Kanopy". It's free, and you got your Criterion Collection for the best in film history, you've got your weird-ass indie features, you got shorts, documentaries, avant garde, the whole enchilada. Go check it out right now. Just a little while ago, I watched Ang Lee's directorial debut on Kanopy. It was a movie called "Pushing Hands" and it's about a tai chi master who comes to the US. It's cool and simple and there are no superhero muscle boys in tights or Dwayne Johnson exploding buildings (not that there's anything wrong with either of those things).

    • A singular focus on blockbuster films that can have sequel after sequel is ruining cinema.

      ...or to put it another way, one-of-a-kind blockbusters are effectively becoming soap operas so a soap opera effect is actually ruining cinema.

    • by account_deleted ( 4530225 ) on Thursday July 25, 2019 @10:08AM (#58984928)
      Comment removed based on user account deletion
  • 24 fps is archaic (Score:5, Interesting)

    by djbckr ( 673156 ) on Wednesday July 24, 2019 @10:14PM (#58982936)
    24 fps was, at the time, a good compromise of cost, quality, and technical feasibility. Recently the Hobbit movies were released at 48 fps, and while I enjoyed them, I felt they were *still* too jittery. I think the minimum frame rate should be 60 these days.
    • 24 fps was, at the time, a good compromise of cost, quality, and technical feasibility. Recently the Hobbit movies were released at 48 fps, and while I enjoyed them, I felt they were *still* too jittery. I think the minimum frame rate should be 60 these days.

      Is your TV capable of accepting a display mode at an integer multiple of 48 fps?

      I checked mine. It advertises 24, 25, 30, 50 and 60 hertz at various resolutions. No 48, 96, 144...etc. Without a match the jitter you are experiencing likely has nothing to do with content.

      • by Zuriel ( 1760072 )

        The good news is since variable refresh rate tech is now a part of the HDMI and DisplayPort standards, we're closing in on an actual permanent solution to frame rate issues.

        A TV that supports HDMI VRR will just run the panel at a multiple of the source's speed. 48 Hz? Sure. 43.6 Hz? No problem. 13 Hz? Display each frame 4 times, run the panel at 52 Hz. A TV that goes up to 144 Hz and has a big enough VRR range will handle basically anything with no jitter, tearing or interpolation artifacts.

        It's now support
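The VRR selection logic described above can be sketched as a hypothetical `vrr_panel_rate` helper (with an assumed 40-144 Hz panel range; real panels vary):

```python
# Sketch of VRR frame pacing: repeat each source frame just enough times
# that the resulting panel rate lands inside the panel's supported range.
def vrr_panel_rate(src_fps, lo=40, hi=144):
    repeats = 1
    while src_fps * repeats < lo:
        repeats += 1
    rate = src_fps * repeats
    return (repeats, rate) if rate <= hi else None  # None: out of range

print(vrr_panel_rate(48))   # (1, 48) -- panel runs natively at 48 Hz
print(vrr_panel_rate(13))   # (4, 52) -- each frame shown 4 times
print(vrr_panel_rate(24))   # (2, 48)
```

Because the panel rate is always an exact multiple of the source rate, every frame is displayed for the same duration - no 3:2-style uneven cadence, hence no judder.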

    • One of the things you might find bothersome about the Hobbit movies wasn't the framerate, although a coworker thought it was.

      It was the digital post-processed makeup they put on the cast. The coworker said it was "distracting" but thought it was due to the higher framerate. You can see the same "distracting" effect in that latest Superman movie where they had to digitally remove the actor's mustache.
      • by AmiMoJo ( 196126 )

        It was just your standard bad CGI movie. Bad plot, bad dialogue, too much CGI bullshit where the characters had no weight and the impossible camera movements made it look completely fake.

        That last one is underappreciated. The reason Marvel went to all the trouble of filming live action actors in costume and then completely replaced their clothing and even entire bodies with CGI versions was to at least make sure the camera movement was realistic, because "video game camera" syndrome makes a scene look reall

    • by AmiMoJo ( 196126 )

      It's a bit more complicated than just frame rate.

      When you watch a moving object on screen the clarity with which you can see it depends on a number of factors. Frame rate, how fast the transition between frames is (very important for LCDs), and the settings of the camera which can increase or reduce motion blur.

      Then you have human perception. Jackie Chan is the master of fight scene cinematography because he understands that you need both fast motion and enough clarity to allow the viewer to follow the acti

    • I think it may pay off to shoot film at higher frame rates. But this discussion is not about that. It's about smoothing film that was already intentionally shot at 24fps. Doing so is akin to digitally remastering an oil painting to look like a photograph, completely ruining the experience the artist originally intended.

      • by AmiMoJo ( 196126 )

        Intentionally shot at 24 fps, or shot at 24 fps because even Peter Jackson couldn't get anything higher to be widely adopted, or even made available on the Blu-ray release. Really, the Hobbit Blu-ray is 24 fps.

  • by SvnLyrBrto ( 62138 ) on Wednesday July 24, 2019 @10:26PM (#58982970)

    24fps is not some universal ideal for visual perfection. Rather, it was merely a cost-cutting move because film used to be expensive and the studios were cheap and inclined to cut every corner. It is the bare minimum frame rate at which a motion picture will not look like jittery crap, that's all. The only reasons higher frame rates look "wrong" are 1) dodgy interpolation algorithms (Which, to be fair, should rightly be criticized.), and 2) that we're just not used to them because movies have traditionally been 24fps (Insert "Old man yells at cloud." picture with Abraham Simpson here.).

    If Hollywood were to switch to 48fps or higher across-the-board, and keep it up for five years or so, we'd all get used to it and 24fps would look as weird as 48 does now.

    • If Hollywood were to switch to 48fps or higher across-the-board, and keep it up for five years or so, we'd all get used to it and 24fps would look as weird as 48 does now.

      While it's true people will get used to anything no matter its objective qualities, the problem with 48 fps is that interpolation happens in the first place. When I watch 24 fps content, the display mode is changed to match the frame rate of the content. This is currently not possible with 48 fps because no such display mode exists.

      If Hollywood wants a new standard they should pick 50 or 60 Hz so everyone doesn't have to buy a new TV or learn to live with "weird".

    • by Solandri ( 704621 ) on Wednesday July 24, 2019 @10:48PM (#58983018)

      Rather, it was merely a cost-cutting move because film used to be expensive and the studios were cheap and inclined to cut every corner.

      The biggest reason was completely practical - you could only fit about 15-20 minutes of film onto a single reel of a 24 fps movie [howstuffworks.com]. Someone had to sit in the projection room, watch for the cue mark (the little round shape which flashes in the upper right corner [youtube.com]), and manually turn on a second projector which was ready with the next reel of the film. The first projector had to be turned off at the same time, and prepped with the reel after that. The first reel then had to be rewound so it'd be ready for the next showing (about 30 min after the current showing ended).

      If they'd increased it to 48 fps, this would've required a reel change every 7-10 minutes and increased the number of reels per movie from about 6 to 12. 60 fps would've needed a reel change every 6-8 minutes, and 15 reels. The higher fps rates only really became practical once movies ditched film and switched to digital storage.
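The reel arithmetic roughly checks out, assuming a full 2,000 ft reel of 4-perf 35 mm film at 16 frames per foot (a full reel gives about 22 minutes at 24 fps; in practice reels often ran shorter, matching the 15-20 minutes quoted above):

```python
# Reel runtime vs. frame rate: a 2000 ft reel of 4-perf 35 mm film holds
# 16 frames per foot, so higher frame rates burn through reels faster.
FRAMES_PER_REEL = 2000 * 16   # 32,000 frames

for fps in (24, 48, 60):
    minutes = FRAMES_PER_REEL / (fps * 60)
    print(f"{fps:>2} fps: {minutes:4.1f} min per reel")
```

Doubling the frame rate halves the runtime per reel, which is exactly why 48 or 60 fps would have doubled or more the reel changes per showing.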

      • Your logic is predicated on the reel size, frame size, and frame spacing being predefined. And the rewinding is a non-issue; if you had two or more reels per film, you'd have the first reel ready by the time the second reel ends, rewinding at the same pace it plays. Rewinding is the simpler operation, not needing a precise pacer or the jerky movement of the projection. It's not even necessary for projection reels and normal recording reels to be the same length. Also, there would be no reason to pick 48 in
    • Yup. 60 fps should be the minimum now. Low framerates suck. I leave motion interpolation enabled on my TV because interpolation to a higher framerate is more pleasing to my eye than low-framerate jittery, stuttering, blurry crap. Movies are supposed to put you in another reality, supposed to put you in the story. Well, the higher the framerate, the more convincing/realistic that view into the other universe should be. When I saw The Hobbit movies in the theatre at 48 fps it was glorious. If it were 60

    • 48 fps still has judder for camera pans and fast camera rotations. The *bare* minimum is 60 fps. But even that looks stuttery as hell compared to 100+ fps.

      * https://www.red.com/red-101/ca... [red.com]

      The sweet spot for actually smooth motion is between 96 fps and 120 fps, which are exact multiples of 24: 4x and 5x respectively. Anything past 144 fps gives quickly diminishing returns.

      Console plebs have been stuck at 30 fps for most games -- the smoother ones use 60 fps like most fighting and driving games; PC master race ha

  • It is usually referred to as motion blur, and it is cheaper than making the refresh rates higher. Plasma TVs used to have refresh rates of 600 Hz versus the 120 Hz motion-smoothing stuff of today. I found that with a 3D TV you needed to turn that off in order to get rid of the artifacts.
  • I enjoyed motion smoothing on my old 36" Philips CRT TV with its Pixel Plus / Natural Motion. Had it for ten years and loved it. It smoothed 25 fps PAL DVDs to 50 fps, and for NTSC DVDs it even smoothed out the 3:2 pulldown. There were really no perceivable artifacts - even animation with relatively few frames (like Yellow Submarine) looked better.

    Now I have an LG OLED C8. It also has motion smoothing. It's nice - when it works. I don't know if it's due to the fact that HD/4K content has many more pixels to inter

  • Viewers perceive 60fps video as not-cinema, and think it looks cheap.

    Even if it works, motion smoothing turns a 24fps movie into a 60fps straight-to-video.

    • by Zuriel ( 1760072 )

      Viewers think 60 FPS video looks cheap because they're constantly shown cheap 60 FPS video. There's nothing about 60 FPS that magically makes the video look cheap.

      If TV stations thought lower frame rates would make their content look better and get them more viewers and more advertising dollars, they'd all switch to 12 FPS overnight.

  • by Miamicanes ( 730264 ) on Thursday July 25, 2019 @12:07AM (#58983190)

    The biggest casualties of motion-smoothing are things like pendulums, swinging objects, waving hands, etc. The algorithm knows something has changed direction, but has no way of guessing HOW FAR it moved along its predicted path before reversing, or how it decelerated or accelerated while reversing.

    A few years ago, I used MVtools2 to smooth an 8mm film of my parents' wedding from 16fps to 60fps. The effect on people dancing or waving at the camera is hilarious, though in other parts, after I stabilized & color-corrected it, it almost looked like a camcorder scene from a modern-day wedding.

    Someday, when I can get pro(-sumer) software tools to do "guided" motion-vector interpolation (with me helping out the algorithm when it gets confused & hand-tweaking away the arc-motion-artifacts), I might try to redo the interpolation.

    Pro tip: 8mm color film is grainy. 720p is pretty close to its "real" resolution. 1080p is high enough to start replicating grain. At 2160p, you're *literally* scanning the film grain with roughly 2x-4x oversampling. 8mm scanned at 2160p (at its native 16 fps) with 48-bit RGB + a 16-bit infrared channel (to help detect scratches), frame by frame, including the sprocket holes(*), is about as "future-proof" as archival scanning of old family films gets.

    (*) You can always crop out the sprocket holes later, but most old 8mm cameras exposed the film out to the very edges. If you're doing motion-stabilization, that out-of-frame detail can come in handy, as long as the stabilization algorithm recognizes sprocket holes for what they are.

    Pro tip #2: if you're digitizing VHS, you NEED a VERY high bitrate & at LEAST 720x480 (NTSC) to avoid mangling what little detail VHS *has*. VHS is NOISY... and noise doesn't compress well. Keep the vertical resolution at 480 (NTSC) or 540/576 (PAL), leave the horizontal res at 720 (for 2x-3x luma oversampling), and don't skimp on the bitrate (8-16 Mbps, MINIMUM). VHS might have a "real" NTSC resolution of 160-240x480, but I can *guarantee* that digitizing VHS to 240x480 @ 8 Mbps WILL look visibly inferior to the original. Storage is cheap... do it right, do it once, then render a second copy for casual viewing if desired. And save your high-quality copy to single-layer non-LTH BD-R to keep it passively safe for the next hundred years.
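    To see why the bitrate matters, here's a back-of-the-envelope bits-per-pixel check (a rough heuristic for sanity-checking capture settings, not a codec model):

```python
def bits_per_pixel(width: int, height: int, fps: float, mbps: float) -> float:
    """Average bitrate budget per pixel per frame.

    Noisy sources like VHS compress poorly, so they need a much larger
    per-pixel budget than clean digital video to avoid visible mangling.
    """
    return mbps * 1_000_000 / (width * height * fps)

# NTSC capture at 720x480, ~30 fps, at the 8 and 16 Mbps floors above:
for mbps in (8, 16):
    bpp = bits_per_pixel(720, 480, 29.97, mbps)
    print(f"{mbps:2d} Mbps -> {bpp:.2f} bits/pixel/frame")
```

Under one bit per pixel at 8 Mbps: comfortable for clean footage, but tight once you're spending bits encoding analog noise, which is the point being made above.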

    • by AmiMoJo ( 196126 )

      The algorithm knows something has changed direction, but has no way of guessing HOW FAR it moved along its predicted path before reversing, or how it decelerated or accelerated while reversing.

      That's not how it works, at least not in any modern implementation.

      The display buffers a frame ahead, and delays audio to match. So it has a start point and an end point, it knows exactly how far everything moved and there is no path prediction, only interpolation.
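      A toy sketch of that interpolation-between-known-endpoints idea (purely illustrative: real TVs estimate per-block motion vectors in hardware, and these function names are invented):

```python
def interpolate_position(p_start, p_end, t):
    """Linearly interpolate a tracked point between two buffered frames.

    t is the fractional time of the synthesized frame (0 < t < 1).
    Both endpoints come from real frames, so there is no prediction
    and no overshoot past either one -- only in-betweening.
    """
    return tuple(a + (b - a) * t for a, b in zip(p_start, p_end))

# A pendulum bob's position in two consecutive 24 fps frames:
start, end = (100.0, 50.0), (120.0, 50.0)

# Synthesize the three in-between frames of a 24 -> 96 fps conversion:
for i in range(1, 4):
    print(interpolate_position(start, end, i / 4))
```

The catch described above still applies, though: if the bob reversed direction *between* the two real frames, a straight-line interpolation never reaches the true turning point.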

  • Barf (Score:5, Informative)

    by markdavis ( 642305 ) on Thursday July 25, 2019 @12:39AM (#58983268)

    I know I will be in the minority (or perhaps not), but I can't stand motion smoothing. I hate it with a passion and will not use it, and will not watch anything with it. My TV has it off and I will not buy any equipment that doesn't allow me to turn it off. I notice it instantaneously and being forced to watch such video makes no difference to my level of hatred.

    Motion blurring/smoothing/interpolation/whatever makes the scenes look plastic, fake, and strange to me. Everything looks cheap and soap-opera-like, with a feeling like it was shot on a $200 video camera. It makes it very hard to suspend disbelief and is impossible for me to ignore or "tune out".

    Could it be because I have been used to 24/30 FPS my entire life? Perhaps. Could it be that I am "ruined" because of that? Perhaps. Whatever the cause, I still want the choice to not be subjected to high-frame-rate video. Interestingly, nobody I have "helped" with finding and turning off the setting on their TVs has had any negative reaction to the change. About half loved it (turning it off) and the other half couldn't tell the difference (amazing to me).

    And yes, the high frame rate also ruined my "Hobbit" experience in the theater. The 3D was great work, but it could not make up for the negative of the high frame rate. I am very happy that HFR didn't catch on.

    • by Zuriel ( 1760072 )
      I mean, you can just discard frames to get down to whatever frame rate you like. You can watch at 10Hz, or 480p, or black and white. That's fine, if that's what floats your boat. That doesn't mean we should avoid improving anything because some people are used to things being the best we could manage in the 1940s.
      • >"I mean, you can just discard frames to get down to whatever frame rate you like."

        I thought about that years ago, wondering if it would have the same effect. It does get a bit technical with HOW the frames are grabbed, but it might work. It wouldn't help in theaters, but it would at home, if such a mode were effective and available.
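        A naive frame-decimation sketch of what such a mode might do (illustrative only; the function name is invented, and a real player would also need a display refresh rate compatible with the result):

```python
def decimate(frames, src_fps, dst_fps):
    """Drop frames to convert src_fps material down to dst_fps.

    Keeps the source frame whose timestamp falls at (or just before)
    each destination frame's timestamp -- the simplest answer to
    "HOW the frames are grabbed".
    """
    n_out = round(len(frames) * dst_fps / src_fps)
    step = src_fps / dst_fps
    return [frames[min(len(frames) - 1, int(i * step))]
            for i in range(n_out)]

# 60 fps down to 24 fps: keep one frame in every 2.5 on average.
print(decimate(list(range(10)), 60, 24))
```

Note the uneven keep pattern when the ratio isn't an integer, which is exactly why the result can still judder.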

  • "Once people get used to something, they get complacent and that becomes what's normal,"

    I've noticed the same thing in music with autotune. I find the few artists who sing with a natural voice, wavers and all, refreshing. People who have been brought up listening to autotune seem to find a natural voice strange and unfinished.

  • Nobody wanted 3d TV. Nobody wanted motion smoothing. But the manufacturers decided we need it all.

    • It's a marketing arms race that involves ticking as many technical checkboxes in the marketing literature as they possibly can. Honestly, nobody asked for built-in smart-TV features either - or at least there are still a ton of people who don't want the "smart" built into the TV - but there you have it now on every TV panel anyway.

  • Easier Solution (Score:5, Insightful)

    by Going_Digital ( 1485615 ) on Thursday July 25, 2019 @03:26AM (#58983612)
    Instead of making the user make the choice, content creators should agree on a standard set of metadata flags. These flags would tell the TV about the content - whether it is a movie or sports, for example - and the TV would configure its settings appropriately; a savvy user could still override them if they had reason to.
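    A minimal sketch of what such flags could look like (field names are hypothetical; no such standard existed at the time of this thread):

```python
from dataclasses import dataclass

@dataclass
class ContentFlags:
    """Hypothetical per-title metadata a broadcaster or streaming app
    could embed so the TV can pick sensible defaults."""
    content_type: str         # e.g. "film", "sport", "news"
    native_fps: float         # e.g. 23.976 for most films
    allow_interpolation: bool # creator opt-in for motion smoothing

def apply_defaults(flags: ContentFlags) -> dict:
    """Derive TV picture settings: film gets smoothing off unless the
    creator explicitly opted in; sport may keep it on."""
    smoothing = flags.allow_interpolation and flags.content_type != "film"
    return {
        "motion_smoothing": smoothing,
        "refresh_multiple_of": flags.native_fps,
    }

print(apply_defaults(ContentFlags("film", 23.976, False)))
print(apply_defaults(ContentFlags("sport", 59.94, True)))
```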
  • Something like 10 years ago, when sales of 1080p flat TVs took off, the manufacturers struggled for marketing ideas to distinguish their products from everyone else's. And sadly somebody realized that they could load in a motion-interpolation algorithm that detects 24 fps film and delivers a motion-smoothed picture instead of the standard AAABB (look it up) pulldown on a 60 Hz screen. Of course, once company X created this, manufacturers Y and Z rushed to do it as well. The next step in
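  The AAABB cadence mentioned above - standard 3:2 pulldown, which maps 24 fps film onto a 60 Hz display - can be sketched like this:

```python
def pulldown_3_2(frames):
    """Expand 24 fps frames to 60 Hz by repeating them in a 3, 2, 3, 2...
    pattern (the 'AAABB' cadence): every pair of film frames becomes
    five display refreshes, so 24 frames fill exactly 60 refreshes."""
    out = []
    for i, frame in enumerate(frames):
        out.extend([frame] * (3 if i % 2 == 0 else 2))
    return out

print(pulldown_3_2(["A", "B", "C", "D"]))
# -> ['A', 'A', 'A', 'B', 'B', 'C', 'C', 'C', 'D', 'D']
```

The uneven 3/2 repetition is what produces the characteristic 24-on-60 Hz judder that motion smoothing then tries to paper over.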

  • I just went down a rabbit-hole on this, because I was curious what the real difference is. There are a variety of examples on YouTube; some of them show the same scene at normal speed, and then in slow-motion, so that you can see what is really happening. I also listened to a couple of rants on the subject. Looks to me like it really comes down to these points:

    - Motion-smoothing works marvelously on simple scenes containing simple, well-defined objects. A football flying across a field of green grass - grea

  • Some people just don't see it, it's crazy. I guess it's just lack of attention to detail: the other day I was watching something with my girlfriend, and there were clear synchronization issues between the video and the sound, making it look like a cheap dub. Likewise, she cannot see the effects of motion smoothing, or the sickening 3D/HDR in cinemas.

    It gets even worse when I visit my old parents; they were sold a lot of overpriced equipment they don't need, and have motion smoothing, bad gamma ca

  • Jeez, TVs are not just TVs anymore - what a bunch of rubbish all this is.
    Smart TVs are bad enough, but they keep adding stupid features for unknown reasons (well, maybe except as mentioned in the summary - to look good in the showroom).

    These days you're just better off using a big monitor as your TV.

  • Only teens go to see a movie in theaters and their damn cellphone use during the movie ruined it for the rest of us, so we don't go anymore.

"What is wanted is not the will to believe, but the will to find out, which is the exact opposite." -- Bertrand Russell, _Sceptical_Essays_, 1928

Working...