Recently I got a new 240 Hz monitor. I set it to run at 120 Hz because my graphics card does not support 240 Hz. I log in to Minecraft only to find my FPS sitting at 30 no matter how much I change the FPS limiter.
I looked it up on Google and it mentions something about an Nvidia control panel, but my setup doesn't have any Nvidia hardware, so that is not an option. Google also told me to look at my monitor's display settings, and those were also at 120 Hz. I do not understand what is capping my Minecraft FPS at 30. Can somebody please help me find a way to fix this?
Post a screenshot with F3 enabled and your JVM arguments.
Arguments: -Xmx2G -XX:+UnlockExperimentalVMOptions -XX:+UseG1GC -XX:G1NewSizePercent=20 -XX:G1ReservePercent=20 -XX:MaxGCPauseMillis=50 -XX:G1HeapRegionSize=32M
Screenshot: https://imgur.com/a/kJ5oJe7
The screen shows 54 FPS
You can reduce the render distance to 8 chunks to increase your FPS.
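If the in-game video settings are laggy to navigate, the render distance can also be changed while the game is closed by editing `options.txt` in the `.minecraft` folder. This is a sketch of my own, not from the thread; the file path is an assumption you would adjust for your install:

```python
# Sketch: lower the render distance outside the game by editing options.txt
# (run only while Minecraft is closed). The default path in the usage
# example below is an assumption; point it at your own .minecraft folder.
from pathlib import Path

def set_render_distance(options_path: Path, chunks: int = 8) -> None:
    """Rewrite the renderDistance line in Minecraft's options.txt."""
    lines = options_path.read_text().splitlines()
    lines = [
        f"renderDistance:{chunks}" if line.startswith("renderDistance:") else line
        for line in lines
    ]
    options_path.write_text("\n".join(lines) + "\n")

# Example (assumed Windows default location):
# set_render_distance(Path.home() / "AppData/Roaming/.minecraft/options.txt", 8)
```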
The screenshot shows you are using an Intel HD 4600 GPU, which is an older and fairly weak integrated GPU.
I understand that, but with my old monitor it used to show 100 to 150 FPS, and this monitor shouldn't be much different. Since my old monitor was 75 Hz and this one is set to 120 Hz, it should show at least 200.
FPS mostly depends on the GPU, and yours is very old.
This is not at all how things work. Your monitor's refresh rate is the maximum FPS it can display (without screen tearing, i.e. showing parts of multiple frames at once, which is where vsync comes in). It is not in any way related to how many frames you actually get; that is entirely down to how fast your GPU and CPU can process things. It is a lot like the common myth that the amount of RAM determines FPS: as long as the game has enough, adding more has no impact on performance, just as a monitor with a higher refresh rate will not let you see more frames if the game can't render them.
If you are getting worse performance with a new monitor, it is probably because the resolution is higher and your GPU is limited by pixel fill rate; you need a more powerful GPU that can draw more pixels per frame. Since you appear to have an integrated GPU in a desktop, you can install a proper dedicated GPU (I suggest NVIDIA, as AMD GPUs have had known issues with Minecraft); even a relatively low-end dedicated card will be far better. A major weakness of integrated GPUs is that they use system RAM, so they share the memory bus with the CPU, and system RAM has much lower bandwidth than the dedicated memory on a graphics card.
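The interaction between render speed, refresh rate, and vsync can be sketched numerically. This is my own simplified model (classic double-buffered vsync, which snaps to integer divisors of the refresh rate), not anything from Minecraft's code, but it shows how a 120 Hz monitor plus vsync plus a slow GPU can land on exactly 30 FPS:

```python
# Sketch (simplified model, not Minecraft's actual renderer): how the FPS
# you see emerges from render speed, refresh rate, and vsync.

def displayed_fps(render_fps: float, refresh_hz: float, vsync: bool) -> float:
    """Approximate FPS the player perceives under a simple vsync model."""
    if not vsync:
        # Without vsync the GPU presents frames as fast as it renders them;
        # the monitor may show (torn) parts of several frames per refresh.
        return render_fps
    if render_fps >= refresh_hz:
        # Vsync caps presentation at the refresh rate.
        return refresh_hz
    # Classic double-buffered vsync: a frame that misses a refresh waits for
    # the next one, so FPS drops to an integer divisor of the refresh rate.
    divisor = 1
    while refresh_hz / divisor > render_fps:
        divisor += 1
    return refresh_hz / divisor

# A 75 Hz monitor with vsync off can still report 140+ FPS:
print(displayed_fps(140, 75, vsync=False))   # 140
# A 120 Hz monitor with vsync and a GPU managing only ~35 FPS
# snaps down to 120/4 = 30:
print(displayed_fps(35, 120, vsync=True))    # 30.0
```

Under this model, a GPU rendering anywhere between 30 and 40 FPS on a 120 Hz vsynced monitor will sit at a flat 30, which matches the symptom in the original post.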
I used the same GPU and everything with my old monitor, though, so I don't understand why it is doing this. Not only that, but the resolution is exactly the same.
This is not how monitors work.
A monitor set to 120 Hz cannot display 200 frames per second, because it only refreshes 120 times per second; it's simple math.
Your monitor is behaving exactly as it should if it is capped at 120 Hz with vsync.
And no offence, but your GPU is not the bee's knees; it is not at all unexpected that your frame rate is capping out at 30 FPS because of it.
There's nothing wrong with your monitor; your GPU is just too weak to render the average and minimum frame rates you want.
If you want better performance, your only option is a better computer.
Keep the monitor though.
Then explain why my old monitor was only 75 Hz yet showed 140+ frames?
Because the graphics hardware is what generated those 140+ frames, and your machine obviously didn't have any form of vsync switched on, so the frame rate could exceed the maximum refresh rate of your old monitor.
And you must have been using a different computer at the time if you got a higher frame rate than you're getting now; either that, or you haven't set this computer up properly, resulting in poorer performance.
I will take your word for it and say it is my desktop computer, so for Christmas or something I will get a new computer that can deliver the performance this monitor can handle in-game. I have also looked up why my computer could only run at 120 Hz instead of 240 Hz, and it confirmed that my GPU is not strong enough.