I recently got a gaming laptop (an MSI GE63VR Raider with an i7-7700HQ and a GTX 1070) as a nice upgrade from my 2013 MacBook Pro, which was running Windows through Boot Camp. Upon loading up the game I found that the fps I was getting was nowhere near what I should be getting with these specs. Sure, Minecraft is an unoptimised game, and the fps I was getting (300, 350 max) was decent, but I kid you not, my MacBook with a GT 750M somehow gets consistently better fps (700+ fps when bridging in Bedwars on Hypixel, where you are looking into the void). I have been tearing my hair out trying to find a solution and have spent the last two days on this. I have tried:
- Changing the GPU Minecraft uses to NVIDIA instead of Intel. The setting is applied, but my fps stays at 350, and while Minecraft reports it is using the dedicated GPU, I have noticed that the integrated GPU is always under more load than the dedicated one: https://imgur.com/a/iA6WPVl
- Making sure my laptop was on the High Performance power plan and that both Windows and all my drivers were up to date
- Rolling the drivers back to the oldest version available for my device
- Uninstalling and reinstalling Minecraft (yes, I did wipe the .minecraft folder)
- Doing a clean reinstall of Windows
- Many more 'quick fixes' I saw on forums that didn't work for me
I think the main issue stems from NVIDIA Optimus, which controls the switching between the integrated GPU (Intel) and the dedicated GPU (NVIDIA). So I also had the idea of disabling the integrated GPU, so that the only GPU doing any work would be the dedicated one. It turns out you can't disable it except from the BIOS, and my BIOS is ancient and DOESN'T HAVE THAT OPTION. What's more, I also considered updating my laptop's BIOS. It turns out some people with the same laptop model tried to update their BIOS, apparently did nothing wrong, and bricked their motherboards!
I know many of you are thinking 'bro, chill, 300 fps is good enough', but a. I want to get the most performance (and my money's worth!) out of my laptop, and b. again, my MacBook Pro gets better fps, which is embarrassing AND proof that I do have the potential to get better fps.
Link to an fps test with an Alienware laptop with the same specs (the link is timestamped as well; look on the right side of the video: )
Any feedback/suggestions/tips are appreciated. Minecraft is my life and I am seriously considering selling the laptop at this point, even though I enjoy the hell out of it. Please help change my mind.
Post a screenshot with F3 enabled and your JVM arguments.
F3: https://imgur.com/a/9s0gtcq
JVM: -Xmx2G -XX:+UnlockExperimentalVMOptions -XX:+UseG1GC -XX:G1NewSizePercent=20 -XX:G1ReservePercent=20 -XX:MaxGCPauseMillis=50 -XX:G1HeapRegionSize=32M
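(Side note on where JVM arguments go: in the vanilla launcher they are set per profile in the 'JVM Arguments' field. Launched by hand, the equivalent would look roughly like the sketch below; `minecraft.jar` is a placeholder, since the launcher normally assembles the real classpath and main class itself.)

```shell
# Illustrative only: the launcher supplies the real classpath and main
# class; minecraft.jar is a placeholder path, not an actual file.
java -Xmx2G \
     -XX:+UnlockExperimentalVMOptions \
     -XX:+UseG1GC \
     -XX:G1NewSizePercent=20 \
     -XX:G1ReservePercent=20 \
     -XX:MaxGCPauseMillis=50 \
     -XX:G1HeapRegionSize=32M \
     -jar minecraft.jar
```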
The laptop's monitor is connected to the integrated graphics output, so video from the dedicated GPU must pass through the integrated graphics.
If the integrated GPU were disabled, there would be no picture on the monitor. Don't disable it; the manufacturer has done a good job of preventing you from disabling integrated graphics.
Windows 10 renders desktop composition on the main display, which is driven by the integrated graphics, so it will never be idle the whole time. There is always some work to do.
Considering the GTX 1070 is roughly 10 times faster than the Intel HD Graphics 630, 23% load on the GTX 1070 is much more work than 76% load on the Intel HD Graphics.
The game clearly renders on the NVIDIA graphics. It does not lie.
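A quick back-of-the-envelope check of that load comparison (the ~10x speed ratio is taken from the reply above and is only a rough assumption, not a measured number):

```python
# Back-of-the-envelope check: is 23% load on a GTX 1070 more absolute
# work than 76% load on an Intel HD 630? The ~10x speed ratio is a
# rough assumption from the discussion, not a measured number.

IGPU_SPEED = 1.0    # Intel HD Graphics 630 as the baseline
DGPU_SPEED = 10.0   # GTX 1070, assumed ~10x the iGPU

igpu_work = 0.76 * IGPU_SPEED   # 76% load on the iGPU
dgpu_work = 0.23 * DGPU_SPEED   # 23% load on the 1070

print(f"iGPU absolute work: {igpu_work:.2f}")              # 0.76
print(f"dGPU absolute work: {dgpu_work:.2f}")              # 2.30
print(f"dGPU is doing {dgpu_work / igpu_work:.1f}x more")  # ~3x
```

So even at 23% utilisation, the 1070 would be doing roughly three times the work of the nearly-saturated iGPU under these assumptions.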
It still doesn't make sense that I can't get over 400 fps, though. Even when I put the render distance down to 2, a computer of this calibre should be more than capable of getting well above 1000 fps in certain situations (such as looking directly at the sky).
I have a DESKTOP with an i9-9900K and a 2060 and I don't get 1000 FPS in 1.14; your laptop certainly won't. It's most likely throttled to below normal spec, and newer versions of MC run very poorly. You have plenty of FPS, and in fact more FPS can actually hurt game performance. Turn on VSync and limit your FPS to something sane. You'll be fine.
Does allocating more RAM help (for 8 GB of RAM, allocate 4; for 16 GB, allocate 8)?
I play competitive Minecraft so I usually play on 1.8.9, but even on 1.14 I'm sure you can get 1000+ fps with a 9900K and a 2060 if you turn the render distance down to 2. I have a desktop with a Ryzen 5 1600 (overclocked to 3.8 GHz) and a 1070 (not overclocked) which can easily hit 1000 fps in a superflat world and 2000 fps when I zoom in on the sky. Obviously nobody needs 1000+ frames, or even 500+, but I want to see my hardware deliver its expected performance (i.e. get my money's worth). Otherwise what's the point of buying expensive hardware? You could run Minecraft on a Mac if you wanted to.
P.S. I have sold the laptop. For anybody who wants to buy this laptop and play Minecraft, I would strongly urge you to reconsider. FPS-wise there is something very wrong; it must be some sort of throttling on the device. It remains to be seen whether this is an MSI thing or just this particular computer (which was bought second hand).
EDIT: IGNORE THIS, I'M STILL TRYING TO WORK THIS FORUM. READ MY REPLY BELOW.
I'm sure you could easily get 1000+ fps with those specs by turning the render distance down to something ridiculous like 2, but that's beside the point. You're saying nobody needs these framerates (1000+), and I'm not disagreeing with you, but I'm saying that I SHOULD be getting those framerates with my specs. I have a desktop rig with a Ryzen 5 1600 (overclocked to 3.8 GHz) and a GTX 1070 (not overclocked) which can easily get 1000+ fps in a superflat world (which cannot be matched on my laptop, trust me, I've tried) and even 2000+ fps when zooming in on the sky or the void. I'd like to be getting framerates that are proportional to my specs; otherwise what's the point of buying better hardware if you're not going to see fps gains?
P.S. I've sold my laptop. It wasn't worth the insane amount of hassle to get the thing up to speed with my fps expectations. For anyone out there thinking of buying this laptop to play Minecraft, I would strongly urge you to reconsider. I haven't been able to test many laptops, so I cannot confirm where the issue lies, but I suspect it may be either this particular model or MSI laptops in general.
The game has changed a lot. Your fps expectations are way out of whack, even for a 'gaming' laptop. Regardless of clock speeds or hardware, a laptop will always be slower due to power and thermal constraints. It's also possible that you lost the silicon lottery with one or both parts. It's also possible the laptop is using a cheap motherboard or RAM, both of which significantly hurt performance. There are so many factors other than hardware too, like software, bloatware, and drivers. Without two identical systems you are making an unfair comparison. And while I CAN get 1000+ FPS at minimum settings on a flat world, why would I even want more than the screen supports?
Moral of the story: you bought and sold a laptop based on unfair (and unreasonable) expectations of desktop performance using different hardware.
Edit: these are what you were matching up? I'm honestly shocked you thought the Intel was going to be anywhere near as powerful:
http://www.cpu-world.com/Compare/744/AMD_Ryzen_5_1600_vs_Intel_Core_i7_Mobile_i7-7700HQ.html
You might be right about the cheap mobo/RAM or the silicon lottery, though. However, bloatware and drivers were definitely not the issue: the first thing I did was a clean reinstall of Windows, and driver-wise I tried everything, from uninstalling and reinstalling the latest drivers to rolling back to the original drivers that shipped with the laptop.
Linked below is a video of someone getting around 1000 fps on a laptop with worse specs than mine. If anything, this proves that:
a. My laptop should easily have been attaining the same fps (and more)
b. My expectations were NOT unreasonable
c. This low-fps issue might have been an MSI-exclusive thing? Or maybe, as you said, it could have been a lost silicon lottery, a cheap motherboard/RAM, etc.
Granted, he might've turned the settings down a little to reach those crazy numbers, but keep in mind that I did too, and no matter what I tried I was not able to get anywhere close to that number.
One thing I forgot to mention before: I was messing around with the NVIDIA Control Panel settings, and I noticed that in certain situations (like when I was testing fps while strip-mining in a cave), selecting the Intel HD Graphics for Minecraft would give me MORE fps than selecting the 1070. E.g. strip-mining in a cave I would get maybe 500 fps using Intel graphics, and in the same situation around 350 fps on the 1070. Any thoughts on this?
It is possible that your frame rate is limited by the data-transfer speed between components.
A frame rendered on the dedicated GPU must be copied to system memory over the PCIe bus, which is slower than the link between the iGPU and system memory.
Maybe your laptop has an 8-lane PCIe connection to the GPU while some other laptops have a 16-lane connection.
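To put rough numbers on that copy cost (everything here is an illustrative assumption: 1080p frames, 32-bit colour, theoretical PCIe 3.0 rates, and nothing else using the bus), a bandwidth ceiling can be sketched like this:

```python
# Sketch of the bandwidth argument: under Optimus, every frame rendered
# on the dGPU is copied back so the iGPU can display it. All numbers are
# illustrative assumptions (1080p, 32-bit colour, theoretical PCIe 3.0
# rates, frame copy as the only traffic on the bus).

WIDTH, HEIGHT, BYTES_PER_PIXEL = 1920, 1080, 4
FRAME_BYTES = WIDTH * HEIGHT * BYTES_PER_PIXEL   # ~8.3 MB per frame

PCIE3_X8_GBPS = 7.9     # PCIe 3.0 x8, ~7.9 GB/s theoretical
PCIE3_X16_GBPS = 15.8   # PCIe 3.0 x16

def max_copied_fps(link_gbps: float) -> float:
    """Upper bound on frames/sec if the frame copy were the only limit."""
    return link_gbps * 1e9 / FRAME_BYTES

print(f"x8 ceiling:  {max_copied_fps(PCIE3_X8_GBPS):.0f} fps")   # ~952
print(f"x16 ceiling: {max_copied_fps(PCIE3_X16_GBPS):.0f} fps")  # ~1905
```

Under these assumptions, an x8 link tops out near 1000 copied frames per second, which is at least in the same ballpark as the fps ceilings being discussed; real systems would hit other limits first.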