Had this reflection that 144Hz screens were the only type of screen I knew of that was not a multiple of 60: 60Hz - 120Hz - 240Hz - 360Hz.
And in the middle, 144Hz. Is there a reason why they all follow this 60 rule, and if so, why is 144Hz here?
Submitted 1 year ago by Kyoyeou@slrpnk.net to [deleted]
With computer displays, the only limitation is hardware. If I had to hazard a guess, 144Hz is there because that's approximately the maximum supported on the widest range of hardware, and 144Hz crystals were widely available and therefore cheap. Kind of how there's a huge market for rollerblade ball bearings: pretty much all power tools use them. They are simply everywhere because they are cheap.
I was really hoping you were Lemmy’s 1996 rage in the cage account making every conversation about ball bearings
Tell me more about the ball bearing industry please!
Also subscribing for roller blade ball bearing facts
Haha, very little experience with that. But I do know rollerblade bearings are now the most popular bearings, thanks to low prices driven by their initial popularity. Kind of how the 18650 cell became popular because of laptops and is now virtually everywhere, including EVs. It all comes down to manufacturing at scale.
Not sure which part you are interested in. I did learn about them in school, so perhaps I do have some knowledge you might find interesting.
Divide. They needed buffer room because 30, 60, or 120Hz aren't always exactly 30, 60, or 120Hz. Like you said, 144 was just the cheapest that met or exceeded spec.
I have a 160Hz screen.
Mine is an odd number, 165hz
75hz here. I thought it was pretty weird. It’s basically extra spicy 60hz.
72Hz was used as a refresh rate for CRT monitors back in the day. 144 = 72 × 2.
It is likely a holdover from that era. I think from there, it was a multiple of 24Hz so movie content scaled smoothly without tearing before vsync? Last part is a guess.
Old reel projectors actually flashed their light at 72Hz. They had to turn off the light to move the reel to the next frame so you couldn't see the pictures moving up off the screen, and human eyes are better at spotting quickly flashing lights than they are at spotting microstuttery motion, so flashing the bulb once per frame at 24Hz in a dark room was headache inducing. The solution they came up with was just to flash the bulb 3 times per frame, which is 72Hz.
I had to scroll a bit to make sure this answer was here before I wrote the same. 👍
Wait until you find out why 24.9 was a standard and still is for most of the movies. Logical at the time, completely retarded today.
Is that the same reason that 30fps and 29.97 fps are two different things?
They are related. Black and white TV ran at 30 frames per second for easy timing, since the USA grid is 60Hz, but the introduction of color caused interference with the chroma signal. The signal was made backwards compatible: the luminance channel stayed black and white, while color TVs used two additional channels for color information. The whole 29.97 thing was the result of halving 60/1.001 ≈ 59.94 (quick check after the next paragraph). That 0.1% slowdown was there to prevent dot crawl, or chroma crawl. So all of today's 29.97 videos, even digital ones, are in fact due to backwards compatibility with B&W TVs, which no longer exist, and it's certainly pointless when it comes to digital formats.
On the other hand, 24fps was just a convenience pick. It was easily divisible by 2, 3, 4, 6… and it was good enough, since film stock was expensive. Europe ran its power grid at 50Hz, so TV there used 25… and movies stuck with 24, which was good enough and close enough to all the others. They still use this frame rate today, which is a joke considering you can get 8K resolution video but with the frame rate of a lantern show from last century.
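If you want to see it spelled out, all the NTSC "drop" rates fall out of the same 1/1.001 slowdown; a quick sketch:

```python
# The NTSC-era rates are just the nominal rates slowed by a factor of 1/1.001.
for nominal in (60, 30, 24):
    print(f"{nominal:>2} fps nominal -> {nominal / 1.001:.3f} fps")  # 59.940, 29.970, 23.976
```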
It's actually 23.976 and yes it's because of NTSC frame rates. But increasingly things are shot now at a flat 24p since we're not as tied down to the NTSC framerate these days.
Well, share with the class
Did so, in another comment in this thread. :)
ITT: A ton of people who think computer displays can only sync at a single clockrate for some reason.
Fun fact, quite a few monitors can be overclocked simply by creating a custom resolution. I have a 32" Thinkvision that officially only supports 1440p 60hz but it’s fine running at 70hz when asked to.
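If anyone wants to try that on Linux/X11, here's a rough sketch automating it with cvt and xrandr from Python; the DP-1 output name and the 2560x1440@70 target are placeholders from my setup, and there's no guarantee a given panel will accept the mode (worst case you get a black screen until the timeout):

```python
# Rough sketch, Linux/X11 only: build and apply a custom 70 Hz mode with cvt + xrandr.
import subprocess

WIDTH, HEIGHT, RATE = 2560, 1440, 70
OUTPUT = "DP-1"  # assumed connector name; check the output of `xrandr` for yours

# cvt prints a comment line, then e.g.: Modeline "2560x1440_70.00"  361.75  2560 ...
cvt = subprocess.run(["cvt", str(WIDTH), str(HEIGHT), str(RATE)],
                     capture_output=True, text=True, check=True)
modeline = next(line for line in cvt.stdout.splitlines() if line.startswith("Modeline"))
name, *timings = modeline.split()[1:]
name = name.strip('"')

subprocess.run(["xrandr", "--newmode", name, *timings], check=True)        # register the mode
subprocess.run(["xrandr", "--addmode", OUTPUT, name], check=True)          # attach it to the output
subprocess.run(["xrandr", "--output", OUTPUT, "--mode", name], check=True) # switch to it
```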
The reason 60Hz was so prominent has to do with the power line frequency. Screens originated as cathode ray tube (CRT) TVs that were only able to use a single frequency, which was the one chosen by TV networks. They chose the power line frequency because it minimizes flicker when recording under lights powered at that same frequency, and you want to play back at the same frequency for normal content.
This however isn't as important for modern monitors. You have other image sources than video content produced for TV, which benefit from higher rates but don't need to match a multiple of 60. So nowadays manufacturers go as high as their panels allow; my guess is 144 exists because that's 6 × 24Hz (the latter being the "cinematic" frequency). My monitor for example is 75Hz, which is 1.5 × 50Hz, 50Hz being the European power line frequency, but the refresh rate is variable anyways, so it can match whole multiples of the content frequency dynamically if desired.
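A toy check of which common refresh rates divide evenly by typical content frame rates (both lists are just examples I picked):

```python
# Which refresh rates are an even multiple of film (24), PAL (25), or NTSC (30)?
# Illustrates why 144 Hz (6 x 24) and 75 Hz (3 x 25) aren't as arbitrary as they look.
CONTENT_FPS = [24, 25, 30]
REFRESH_RATES = [60, 75, 120, 144, 165, 240, 360]

for hz in REFRESH_RATES:
    multiples = [fps for fps in CONTENT_FPS if hz % fps == 0]
    print(f"{hz:>3} Hz -> even multiple of: {multiples or 'none'}")
```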
The numbers are a maximum, and software can lower the rate or split it up. I worked in a visualization lab and we would often mess with the refresh rates. That being said, sometimes you could alter it and the screen would not respond (show an image), so there must be some limitations.
The 60 rule is actually based on a 59.94 rule, and it's not really a rule, just a standard that stuck around. What's better than 60? Two times 60! What's better than two times 60? Four times sixty!
With VRR you can run certain screens that get sold right now at exactly 91.3 fps if you want, it's just extremely impractical.
CRT monitors actually used to run at higher refresh rates (120Hz CRTs came way before anything close to flat panels were introduced) but the shitty limitations of the first ten years of flat panels changed the way displays were used and marketed.
It's not 60fps by chance. It was chosen because the power grid in the USA is 60Hz, so it was an easy and cheap way to synchronize frames without additional timing hardware. As for 59.94, that was a 0.1% slowdown introduced when color TV was added, to prevent cross-talk in the chroma channel. Weird solution, but it worked out fine. Today there's no reason to sync anything with the power grid, but 60 is still a very convenient number as it's easily divisible by many others.
Of course, and that’s why European Youtubers will sometimes upload video in 4k@50fps (because the grid frequency in Europe is 50Hz instead, and that’s what a lot of cameras are configured for to prevent banding).
Syncing with the power grid is one of those great ideas that led to some very silly side effects, like that time the grid frequency in Europe sagged for a while and everyone's alarm clocks started drifting. Grid operators increased the frequency slightly over the following months to correct the clocks as well, so if you had adjusted your alarm clock you'd need to adjust it again!
Funny effect, though - many cheap electronics (think coffee makers and microwave ovens) use the line frequency as a time base. Taking a 60Hz or 50Hz appliance and plugging it into the other causes the clock to be off.
My iMac says it’s 77hz.
Display refresh rates, measured in hertz (Hz), significantly impact the smoothness of motion on screens. A display with a 60Hz refresh rate updates the image 60 times per second, each update representing a frame. Thus, at its maximum capacity, a 60Hz display shows 60 frames in one second. In contrast, a 360Hz display updates its image 360 times per second, allowing for the potential display of 360 frames in the same duration. This rapid succession of frames results in a markedly smoother visual experience, as the human eye perceives motion more fluidly when more frames are displayed per second.
Conversely, a display with a lower refresh rate, like 24Hz, refreshes the image just 24 times per second. This lower frequency results in a more ‘choppy’ or stuttered visual experience due to the fewer number of frames presented each second.
Analogous to a film projector, increasing the frame rate for smoother motion requires the film to move faster through the projector. However, without additional frames in the source material, this would simply speed up the playback. To maintain normal playback speed while achieving a higher frame rate, the source material must contain more frames. For instance, to sustain standard playback speed on a 360Hz display (which is 6 times faster than a 60Hz display), the source needs to provide six times as many frames per second.
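A tiny sketch of that scaling, assuming a 60fps source (numbers purely illustrative):

```python
# To fill every refresh of a faster display at normal playback speed,
# the source must supply proportionally more frames per second.
SOURCE_FPS = 60  # assumed source frame rate
for display_hz in (60, 120, 240, 360):
    factor = display_hz / SOURCE_FPS
    print(f"{display_hz:>3} Hz display needs {factor:.0f}x the frames ({display_hz} fps source)")
```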
The unit “hertz” means “per second”. A higher value is still one second, but more events per second.
All of the frames in the number (30, 60, 144, 360, etc) are shown in one second. So for 360 Hz you’re seeing a new frame every 1/360 = 0.0028 seconds vs 1/60 = 0.017 seconds which gives a smoother transition from frame to frame
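Same arithmetic for a few common rates, as a quick sketch:

```python
# Time between frames is simply 1 / refresh_rate.
for hz in (24, 30, 60, 144, 240, 360):
    print(f"{hz:>3} Hz -> {1000 / hz:.2f} ms per frame")
```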
Wow that’s crazy logical!
I'm sick in my bed right now and forgot I used to have a bad 75Hz screen 😅
Thank you!