So as the title says, I only get 140 FPS in-game, even on the lowest settings.
I closed all apps running in the background and it didn't change anything.
My FPS isn't locked; it's set to unlimited.
I want to play at 144 FPS because of my monitor, but it still feels laggy, as if I were on 60 Hz (I've set my refresh rate to 144 Hz).
It would be nice if somebody could tell me what I should do.
My specs:
i5-9600K
GTX 1060 6 GB
16 GB 3200 MHz RAM (2x8)
1 TB HDD
Test it with other GPU driver versions.
In-game, press F3 - the GTX should be displayed on the right side (see the example below).
Also make sure the Nvidia Overlay is disabled.
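As a rough guide (the exact string is an assumption here - it varies by driver and game version), the renderer line on an Nvidia card reads something like:

GeForce GTX 1060 6GB/PCIe/SSE2

If the right side of the F3 screen names the Intel integrated graphics instead, the game is rendering on the iGPU rather than the GTX, which alone could explain low performance.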
Can someone please help? :/
People are trying to help and you were asked to try things; honestly, 140 FPS is more than enough, and the missing 4 FPS isn't going to make any difference.
In my experience, setting the FPS limit close to the monitor's refresh rate gives very bad results, with visible stutter occurring at the difference between the two (i.e. 4 Hz in the case of 140 FPS on a 144 Hz display); otherwise, with vsync on, a game that can't hold a steady 144 FPS will be locked to 72 FPS, since double-buffered vsync falls back to the next whole divisor of the refresh rate (144 / 2 = 72).
Also, I'd consider only 140 FPS at minimum settings to be completely unacceptable with those specs; I have a weaker system yet get around 1000 FPS at 8 chunks and 400 FPS at 16 chunks in a highly demanding environment, and that's without any of the threaded optimizations present in newer versions (1.8 moved rendering to its own thread, and the game now uses more modern OpenGL functions; of course, this doesn't mean newer versions will run better, and there are bug reports regarding performance issues in the latest versions).
Can you provide a screenshot? Does this only happen in one world, or in general? For example, I've seen people who had hundreds of entities listed in the debug screen, which will definitely cause lag - my uncapped FPS drops by a factor of 3 when I'm at my main base, and all it has are a few dozen animals and villagers plus a few hundred chests and signs, the latter also being a major source of lag. I did the same test in this link and my FPS went all the way down to 40, despite my own optimizations, which halved the time taken to render a chest.
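To read the entity count yourself, look for the line in the F3 debug screen that starts with E: (the exact layout here is an illustrative assumption, since it varies between versions):

E: 120/500

The second number is the total entities loaded; seeing hundreds there is a red flag for entity lag.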
Hey,
so I know, I have 143 now. But! I see other people on YouTube getting 1000+ FPS in Minecraft with the same settings I have. I know it's just one FPS, and I couldn't see 1000 FPS anyway, but I'm worried about why my computer can't handle that.
My answer to your question is that I actually only play Bedwars. No matter what I change, it bounces between 140-143 FPS and sometimes drops to 120. I can set everything to Fancy and turn shaders on and still have 140; the same when I set everything to low - still 140. I haven't locked my FPS; I just don't know what to do.
I know I play on cracked Minecraft (TLauncher), but back when I had a bad office laptop I was able to play Minecraft at 200+ FPS, and now I'd say I have a pretty "overkill" setup for Minecraft (I mean for normal gaming, not blowing up a million TNT at once, you know).
What I noticed is that when the game is maximized and I'm actually looking at it, my CPU usage is 8-9%, but when I minimize it, the CPU usage goes up to 30%. When I click back into the game I briefly see 2800+ FPS, but it quickly goes back down to 140 FPS.
Why are you allocating 14G to Minecraft?
As I took the screenshot, I saw in the upper right corner that my Minecraft uses 480 MB of RAM, while in TLauncher I had set 6034 MB of RAM.
I opened TLauncher's settings and added -Xmx6G -Xms6G to the arguments.
Even though I changed it to 6 GB of RAM, nothing changed in-game.
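For reference, those arguments just set the Java heap size; a minimal sketch of how they end up on the launch command (the jar name is a placeholder, not TLauncher's actual command line):

java -Xms6G -Xmx6G -jar minecraft_client.jar

-Xms is the heap the JVM starts with and -Xmx is the most it can grow to; the in-game counter shows current usage, not the allocation, so a reading of ~480 MB just means the game doesn't need more than that yet.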
Using FreeSync or G-Sync, which is compatible with Nvidia GTX 1060 video cards and supported monitors, should eliminate the stuttering even if the frame rate drops to 140, below the maximum of 144 Hz on the OP's display. It's actually the best way to keep the frame rate in sync with the monitor without causing tearing or excessive input lag.
The alternative is to cap the frame rate to something that can remain consistent. But 140 FPS with regular V-Sync on a 144 Hz monitor isn't good, because, as you said, it would get halved to about 72 FPS, which is far worse than the suggestion of mine that got removed in an earlier post. Having the frame rate be half of your monitor's refresh rate negates the benefit of having the monitor set to 144 Hz in the first place.
I tried turning V-Sync and G-Sync on yesterday, but I still only have ~140 FPS :/
My problem is that Minecraft is barely using my CPU at all; when Minecraft is running, it uses around 8-9% of my CPU. I turned all 6 of my cores on, but it's still just ~140 FPS.
I wouldn't worry about it too much; if you've got adaptive sync on, then 140 FPS is fine, as long as it doesn't keep alternating between that and something drastically lower 1% of the time. Note that your minimum frame rate is just as important as the average and the max.
But a frame rate that stays within 120 to 140 most of the time, even during heavy combat situations (i.e. when you're battling Ghasts or Piglins in the Nether, or the Ender Dragon in the End), is good in my opinion, as long as the monitor has FreeSync or G-Sync active. You're getting much better performance in this game than on the console ports, or even on some weak computers.
Expensive computer setup, but still pirating mc? Impressive...
I don't know, 140 FPS feels so laggy and choppy.
And I don't think you guys get the point: I saw people on YouTube with the same setup as mine (i5-9600K, GTX 1060 6 GB, 16 GB RAM), and they can play with 1000+ FPS without it being choppy/laggy.
I would really like to play Minecraft without any lag, you know. I think my computer is just underperforming or something.
Ey bro, I started playing Bedwars a week ago and I like it. I'm thinking about buying Minecraft because my friends play on Hypixel.
And you're right, after 8 years it would be nice to buy it! xD