Which is best: 120Hz or 60Hz?
The refresh rate dictates the maximum frame rate your monitor can display. If you have a 120Hz television, it can refresh the image 120 times per second, but your frame rate can never exceed your refresh rate in terms of fps; the refresh rate is the display's overall capacity. A TV capable of only 60Hz will still try to run films at their original frame rate, which is typically 24fps.
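To make that arithmetic concrete, here's a minimal Python sketch. It's purely illustrative (the function name is made up, not from any TV's firmware), but it shows how a fixed refresh rate relates to content frame rate:

```python
# Illustrative sketch: how a fixed refresh rate relates to content frame rate.

def refreshes_per_frame(refresh_hz: float, content_fps: float) -> float:
    """How many display refreshes each content frame must cover."""
    return refresh_hz / content_fps

# A 120Hz panel redraws 120 times per second, so 24fps film divides evenly:
print(refreshes_per_frame(120, 24))  # 5.0 -> each filmed frame shown 5 times

# A 60Hz panel does not divide evenly into 24fps:
print(refreshes_per_frame(60, 24))   # 2.5 -> frames alternate between 3 and
# 2 refreshes, the classic "3:2 pulldown" that can cause visible judder
```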
Now, when watching content on a television with a refresh rate higher than the standard 24fps, you may end up with something called motion interpolation, or motion smoothing. Basically, this adds additional frames in between the ones the TV show or film was actually shot with. As most films are made in 24fps, a 120Hz television would add an additional 4 frames in between each pair of filmed frames. While the process is meant to help smooth the picture and minimize any motion blur, it can end up making things look unnaturally hyper-real.
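Here's a rough sketch of where those 4 extra frames come from, using a naive linear cross-fade. Real TVs use motion estimation rather than a plain blend, so treat this only as an illustration of the frame math:

```python
import numpy as np

# Naive "interpolation": cross-fading between two filmed frames. Real TVs
# motion-estimate objects instead of blending whole images; this just shows
# where the 4 extra frames for 24fps -> 120Hz come from.

def interpolate(frame_a: np.ndarray, frame_b: np.ndarray, steps: int):
    """Yield `steps` synthetic frames between frame_a and frame_b."""
    for i in range(1, steps + 1):
        t = i / (steps + 1)                     # 0 < t < 1
        yield (1 - t) * frame_a + t * frame_b   # weighted average of frames

frame_a = np.zeros((1080, 1920))   # stand-ins for two consecutive frames
frame_b = np.ones((1080, 1920))

# 120Hz / 24fps = 5 display slots per filmed frame, so 4 in-between frames:
extra = list(interpolate(frame_a, frame_b, steps=4))
print(len(extra))  # 4
```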
One exception was The Hobbit, which was actually filmed at a faster 48fps. It ended up being widely viewed in 24fps, as initial testing suggested that made for a better viewing experience. In most cases, a 60Hz television is going to be absolutely fine for your purposes. Many of the smart TVs being released today have a 60Hz refresh rate. This is largely because broadcast television has historically been tied to the local mains frequency: 60Hz in North America and 50Hz in Europe. The same applies to higher refresh rates too — in the UK, you might see a TV advertised as 100Hz, but it can typically handle a 120Hz signal for video games just fine.
Most televisions that are 120Hz will have the option for you to reduce the rate back down to 30 or 60Hz if you want to. For the purposes of this article, 50 and 60 work the same, as do 100 and 120. For my own sanity, and ease of reading, I'm going to stick with 60 and 120, but feel free to read those as 50 and 100 if you're in the UK, Australia or any place that has 50Hz electricity. So are these higher refresh numbers just another case of "more is better!" marketing?
Not entirely. Interestingly, the motion blur you see on a TV is largely created by your brain. Basically, your brain notices the motion and makes assumptions as to where that object, or the overall image, is going to be in the next fraction of a second. The problem with LCD and current OLED TVs is that they hold each image there for a full 60th of a second, so your brain smears the motion, thinking the image should be moving, when in fact it's just a series of still images.
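As a back-of-the-envelope illustration (the panning speed here is hypothetical), you can estimate how far a moving object travels while a single frame is held on screen — roughly the smear your eye-tracking perceives:

```python
# Rough sketch of why sample-and-hold displays smear motion. Numbers are
# illustrative; see Blur Busters for real measurement methodology.

def hold_time_ms(refresh_hz: float) -> float:
    """How long each still image stays on screen."""
    return 1000.0 / refresh_hz

def blur_extent_px(pan_speed_px_per_s: float, refresh_hz: float) -> float:
    """Distance an object moves while one frame is held -- roughly how far
    your eye-tracking gets ahead of the static image."""
    return pan_speed_px_per_s * (1.0 / refresh_hz)

print(hold_time_ms(60))          # ~16.7 ms per held frame
print(blur_extent_px(960, 60))   # 16 px of perceived smear at 960 px/s
print(blur_extent_px(960, 120))  # 8 px -- half the hold time, half the blur
```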
It's actually quite fascinating, but the details are beyond the scope of this article. I recommend checking out Blur Busters' great article for more info. The motion blur we're talking about here, despite coming from your brain, is caused by how the television works. This is separate from whatever blur the camera itself creates. Some people aren't bothered by motion blur.
Some don't even notice it. Others, like me, do notice it and are bothered by it. Fortunately, it can be minimized. Refresh rate itself is really only part of the solution. Just doubling the same frames doesn't actually do much for reducing motion blur. Something else is needed. There are two main methods.
The first is frame interpolation, where the TV itself creates brand-new frames that are sort of hybrids of the frame that came before, and the one that comes after.
This can fool your brain enough that it doesn't blur the image. Depending on how aggressive the interpolation is, however, it can lead to the soap opera effect, which makes movies look like ultra-smooth reality TV shows. Some viewers like the effect, but it's generally hated by film buffs and others who pay close attention to image quality.
There are different levels of this processing: a little might reduce motion blur somewhat without causing undue harm to the quality of the image.
Or, on the other end of the "dial," it's cranked up so that there's even less motion blur, but the movement is hyper-realistic and, for many, distractingly unreal. Some TVs let you choose how much of this processing gets applied to the image; others have just a single setting. More on these settings further down.
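To picture how such a "dial" might behave, here's a hypothetical Python model. The setting names and the simple blend math are invented for illustration and don't correspond to any manufacturer's menu; real TVs vary how aggressively they motion-estimate, not just a crossfade:

```python
import numpy as np

# Hypothetical model of a TV's interpolation "dial" (names are made up).

def apply_smoothing(original: np.ndarray,
                    interpolated: np.ndarray,
                    strength: float) -> np.ndarray:
    """strength=0.0 leaves the film untouched; strength=1.0 is the full
    'soap opera' look."""
    strength = min(max(strength, 0.0), 1.0)   # clamp user input to [0, 1]
    return (1.0 - strength) * original + strength * interpolated

SETTINGS = {"off": 0.0, "low": 0.25, "medium": 0.5, "high": 1.0}

original = np.zeros((4, 4))        # stand-ins for real frames
interpolated = np.ones((4, 4))
frame = apply_smoothing(original, interpolated, SETTINGS["low"])
print(frame[0, 0])                 # 0.25 -> mostly the original image
```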
The other alternative is black frame insertion (BFI) or a scanning backlight. This is where all or part of the backlight of the TV turns off (goes black) between frames, so your eye isn't staring at a held still image. Do note, however, that the jump from there to even higher frame rates will not necessarily be as stark. Random blind tests have shown that the average user is likely to notice a perceptible difference — at least in gaming-related applications.
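A quick sketch of why BFI helps (the duty-cycle values here are illustrative): shortening how long each frame is actually lit shortens the hold time your brain smears across:

```python
# Sketch of how black frame insertion (BFI) shortens effective persistence.
# "lit_fraction" is the share of each refresh the backlight is actually on.

def effective_persistence_ms(refresh_hz: float, lit_fraction: float) -> float:
    return (1000.0 / refresh_hz) * lit_fraction

print(effective_persistence_ms(60, 1.0))   # ~16.7 ms: no BFI, full hold
print(effective_persistence_ms(60, 0.5))   # ~8.3 ms: half of each frame black
print(effective_persistence_ms(120, 0.5))  # ~4.2 ms: 120Hz plus BFI
```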
A blind study conducted by Hardware.info came to a similar conclusion, and Nvidia has also found a positive correlation between higher refresh rates and player performance.
As a graphics hardware manufacturer, the company does have a vested interest in arriving at this conclusion. Either way, a high refresh rate monitor will likely deliver a much better experience than a 60Hz display. As with any new technology, high refresh rate displays were extremely difficult to manufacture when they first appeared.
For years, the only way to get a good high refresh rate experience was to pay a steep premium for a top-of-the-line gaming monitor. These days, though, the manufacturing processes and technology have become widespread enough that high refresh rate displays can be found on other consumer electronics, including smartphones, laptops, and even tablets.
Apple was one of the first companies to adopt high refresh rates on mobile hardware. In the years since, high refresh rate displays have become ubiquitous on smartphones — even mid-range ones. Discerning users can almost immediately notice a difference after switching to a higher refresh rate display. However, not all high refresh rate experiences are created equal. While the tech is quite easy to find these days, it still requires competent hardware to deliver fluid experiences: if a phone's chipset can't render frames fast enough to keep up with the screen, the higher refresh rate goes to waste. In those cases, you're better off purchasing a phone equipped with a better processor.
For several years, gaming consoles offered a standard 60Hz output. Even then, the vast majority of games only managed to deliver half that many frames per second, or 30fps.