That seems like a controversial statement you’re making there. Back in the day, during the advent of 3D games in the late 90s and early 2000s, average frame rates were generally somewhere between 30 and 60 fps, sometimes lower. That was a choice the game developer (mostly) made for you: in addition to the limitations of the target platform, they had a limited “performance budget” and therefore had to choose between more fps, more polygons, more effects, or something else, and more often than not they chose the latter. Menus and cutscenes often ran at a higher frame rate than the game itself.
For example, in Unreal Tournament (1999), maps were often designed with a limited number of objects to keep performance stable. The general recommendation was to display no more than 100-150 objects at a time, depending on their size and complexity.
In the era before high definition, regional video formats standardized television performance. In Europe that format was PAL, with a refresh rate of 50 Hz and a frame rate of 25 fps. In North America it was NTSC, with a higher refresh rate of 60 Hz and a standard frame rate of 30 fps.
I personally have a very expensive gaming PC with a similar 4K screen, and from that I can conclude that 30 fps at 4K is perfectly playable as a minimum, as long as there are no drops. It’s enough frames to do everything you’re supposed to do while gaming. Does that mean you have the best gaming experience? No, of course not. It is genre-dependent, and within a genre it also depends on whether the game is a fast-paced first-person shooter or an RTS where you calmly click away from above without too many units.
However, ideally you want as high a frame rate as possible. Even between 240 and 144 Hz, I notice the former is noticeably smoother, and it can make a difference to your performance. Especially in fast-paced games, a higher frame rate helps you spot sooner that something is happening (something coming into play), and gives you fresher frames to shoot at, so you can click the right spot faster.
For example, Linus Tech Tips ran a test in which five people played CS:GO on three monitors (60, 144, and 240 Hz), and it also showed that, on average, you’re better off with a higher refresh rate: most subjects’ reaction times improved with each step up.
The added value of a higher-Hz screen is certainly there, and I can’t say where the limit of effective use lies, but in any case it is higher than the 240 Hz currently found on the market. It seems to me that a 500 Hz screen could definitely be an effective addition for gamers with a good use case (fast-paced shooters, future VR?, flying drones).
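To put those refresh rates in perspective: the time between frames shrinks quickly at first and then flattens out, which is part of why the limit of effective use is hard to pin down. A quick frame-time calculation (my own illustration, not from any test cited above) shows the diminishing returns:

```python
# Frame time in milliseconds for a given refresh rate: 1000 ms / Hz.
# The jump from 60 to 144 Hz saves far more time per frame
# than the jump from 240 to 500 Hz.
def frame_time_ms(hz: float) -> float:
    return 1000.0 / hz

for hz in (30, 60, 144, 240, 500):
    print(f"{hz:>3} Hz -> {frame_time_ms(hz):.2f} ms per frame")
# 30 Hz  -> 33.33 ms, 60 Hz -> 16.67 ms, 144 Hz -> 6.94 ms,
# 240 Hz -> 4.17 ms, 500 Hz -> 2.00 ms
```

Going from 60 to 144 Hz shaves off roughly 9.7 ms per frame, while going from 240 to 500 Hz only saves about 2.2 ms, so each extra hertz buys you less.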
In my opinion, 4K30 is perfectly playable as long as you don’t drop below that minimum, with all the bells and whistles turned on, especially in single-player games. Do you want to play competitively? Then you have to make a trade-off between those bells and whistles and performance. 30 fps is fine, but it has been shown that, especially in first-person shooters, more fps results in better performance, and you may well get by with lower resolutions or settings. That’s not to say 30 fps is unplayable, but you play better at a higher frame rate.
Not to forget an important detail: the price, which will ultimately determine to a large extent whether the above is acceptable (when it’s cheap) or a disappointment (when you’ve paid a premium price and expected more in return).
See also: jdh009 in “BOE Showcases First Full HD Monitor with 500Hz Refresh Rate”
[Comment edited by jdh009 on 11 March 2023 23:13]