Have you ever seen your computer run at 70+ FPS? What you're seeing isn't actually that fast. If the counter shows a high rate, the screen is still only showing frames at its refresh rate. Look at the table below to see what I mean.
Table:
FPS Rate: 30 Actual Rate: 30
FPS Rate: 50 Actual Rate: 50
FPS Rate: 70 Actual Rate: 70*
FPS Rate: 100 Actual Rate: 70*
*Please note that not all computer monitors are the same so the screen's refresh rate may be lower or higher.
After looking at this table you should see that it's a bad idea to pay extra just to make your game run at 200 FPS.
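To make the table concrete, here's a minimal sketch in Java (assuming, like the table, a hypothetical 70 Hz monitor) of why the 'actual' rate tops out at the refresh rate:

```java
// Minimal sketch: the screen cannot show more unique frames per second
// than it refreshes, so the "actual" rate is capped at the refresh rate.
// The 70 Hz figure is just the hypothetical monitor from the table above.
public class DisplayedRate {
    static int displayedFps(int renderedFps, int refreshHz) {
        return Math.min(renderedFps, refreshHz);
    }

    public static void main(String[] args) {
        int refreshHz = 70; // assumed monitor; 60 Hz is more common in practice
        for (int fps : new int[] {30, 50, 70, 100}) {
            System.out.println(fps + " FPS rendered -> " + displayedFps(fps, refreshHz) + " FPS actually shown");
        }
    }
}
```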
Isn't this... like extremely common knowledge that anyone who is concerned with maxing FPS would already know about?
This is common knowledge (or at least, it should be) for anyone familiar with computers and/or gaming. The most common refresh rate for modern monitors is 60 Hz, not 70, so I'm not sure why you listed that as the maximum 'actual rate'.
Most gamers and people who have built PCs already know this. FPS is limited by the monitor's refresh rate.
That isn't exactly true. The program is still running at whatever framerate it reports; the extra frames just aren't visible when that number is higher than what the monitor can output.
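As a rough sketch of that distinction (numbers are made up: a loop 'rendering' at roughly 100 FPS in front of an assumed 60 Hz display), the program really does all that work, the screen just never shows most of it:

```java
// Rough sketch: the loop "renders" a frame every 10 ms (~100 FPS) for one
// second, but a 60 Hz display could only ever show about 60 of those frames.
// The sleep is a stand-in for the real work of drawing a frame.
public class RenderedVsShown {
    public static void main(String[] args) throws InterruptedException {
        final int refreshHz = 60; // assumed monitor refresh rate
        int rendered = 0;
        long end = System.currentTimeMillis() + 1000;
        while (System.currentTimeMillis() < end) {
            Thread.sleep(10); // pretend to render one frame (~100 FPS)
            rendered++;
        }
        System.out.println("Frames rendered by the program:    " + rendered);
        System.out.println("Frames a 60 Hz display could show: " + Math.min(rendered, refreshHz));
    }
}
```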
To be fair, 4K being 3840x2160 isn't really all THAT obvious. I still see people who think 4K is 4096x2048.
Well, there are actually two 4K resolutions, though neither is 4096x2048. The 'normal' 4K (4K UHD) is 3840x2160, which is what most TVs and displays use, while the original 4K (DCI 4K, named so because it's roughly a doubling of the 2K film standard) is 4096x2160 and is used for film projection and recording, among a few other things.
There are actually two good reasons to run at a higher FPS than your screen's refresh rate (see the sketch after this list).
1. FPS tends to dip in certain game areas more than others depending on visible geometry, effects, etc. Running at a higher FPS lowers the chance that such a slowdown ever dips below your refresh rate, and dropping below it causes much more visual stutter than if your FPS were somehow locked to a consistent 30.
2. Input lag. This mainly matters for first-person shooters, but input is processed at whatever rate the game runs, because games update and draw one step at a time. Even if you can't see something happen, you can click one or two frames earlier and, for instance, have a shot go off. Some people would say this can be taken to unnecessary levels (and I sort of agree), but humans are very good at guessing about motion. There are real cases where a professional FPS player clicks at the exact moment to land an incredibly far-away shot just because the input was more responsive, their brain interpolating the target's position between frames even though the refresh rate means they can't see it move step by step.
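Here's the sketch mentioned above, a bare-bones illustration of point 2: since input is typically sampled once per frame, the worst-case delay between a click and the frame that processes it is roughly one frame time, and that shrinks as FPS goes up even on a fixed-refresh screen. The numbers are just arithmetic, not measurements:

```java
// Sketch of why higher FPS reduces input lag: with input polled once per
// frame, a click can wait up to one full frame time before it is processed.
public class InputLatency {
    public static void main(String[] args) {
        for (int fps : new int[] {30, 60, 120, 240}) {
            double frameMs = 1000.0 / fps; // worst-case wait for the next update
            System.out.printf("%d FPS -> up to %.1f ms before a click is processed%n", fps, frameMs);
        }
    }
}
```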
I thought this was common knowledge; it's mainly that people are confident their GPU can run that well. Some monitors can support 100 Hz, but some can only do 30. It really depends on the screen resolution; most are 1080p at 60 Hz.
I would recommend capping about 20% over your refresh rate, even if the human eye only needs 30-ish.
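As a quick sketch of that rule of thumb (the 20% headroom is just the suggestion above, not a measured value):

```java
// Sketch of the suggestion: cap the framerate about 20% above the monitor's
// refresh rate so momentary dips are less likely to fall below it.
public class FrameCap {
    static int suggestedCap(int refreshHz) {
        return (int) Math.round(refreshHz * 1.2); // 20% headroom
    }

    public static void main(String[] args) {
        for (int hz : new int[] {30, 60, 100, 144}) {
            System.out.println(hz + " Hz monitor -> cap around " + suggestedCap(hz) + " FPS");
        }
    }
}
```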
smh tbh fam
That's impossible.