I just bought a GTX 1050 Ti 4 GB GDDR5 and I thought it would run Minecraft 10x faster than my old GPU (a GT 730 2 GB DDR3), but unfortunately it only gives me about a 20 FPS boost. I already updated my graphics driver to the latest version and updated Java to JRE 8u241 x64. I was wondering if my CPU is bottlenecking my GPU, but my CPU is powerful enough to play the game...
But I can play Skyrim (with an ENB and an enhanced ultra graphics mod) at ultra settings, GTA V at ultra settings, and Need for Speed at ultra settings, all at a stable 60+ FPS.
Here are some comparisons:
GT 730 2gb DDR3
GTX 1050ti 4GB GDDR5
My PC Specs
OS Name: Microsoft Windows 7 Ultimate
Version: 6.1.7601 Service Pack 1 Build 7601
Other OS Description: Not Available
OS Manufacturer: Microsoft Corporation
System Name: KATSU-PC
System Manufacturer: MSI
System Model: MS-7721
System Type: x64-based
BIOS Version/Date: American Megatrends Inc. V8.5, 11/1/2018
SMBIOS Version: 2.8
Windows Directory: C:\Windows
System Directory: C:\Windows\system32
Boot Device: \Device\HarddiskVolume1
Locale: United States
Hardware Abstraction Layer Version: 6.1.7601.24545
User Name: Katsu-PC\Katsu
Time Zone: Malay Peninsula Standard Time
Installed Physical Memory (RAM): 8.00 GB
Total Physical Memory: 5.96 GB
Available Physical Memory: 3.63 GB
Total Virtual Memory: 17.2 GB
Available Virtual Memory: 14.1 GB
Page File Space: 11.2 GB
Page File: C:\pagefile.sys
Download OptiFine for 1.15 by clicking on preview versions.
Once the download completes, double-click the jar file to auto-patch.
I don't like modding my game because I'm scared of my world getting corrupted/glitched... When I used OptiFine back in 2019, my world got destroyed... Thankfully, I still have a backup from 2018.
There's no code in OptiFine that affects world generation; it is purely client-based. Worlds are hosted by either an internal server (your PC) or an external server (i.e. multiplayer servers).
If anything, use a resource pack with a resolution lower than 16x16.
Dedicate more RAM to Minecraft (4 GB and up should be OK).
You may also try updating Java to the latest version and see if that resolves your issue (assuming you are still on the Java-required client).
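For what it's worth, how much memory the game was actually given can be verified from inside the JVM itself (the F3 debug screen reports the same figures). A minimal sketch; the `-Xmx4G` launcher argument mentioned in the comment is an assumption about how the allocation was set, not something from this thread:

```java
public class HeapCheck {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        // Max heap the JVM may grow to; controlled by -Xmx
        // (e.g. -Xmx4G in the launcher's "JVM Arguments" field).
        long maxMiB = rt.maxMemory() / (1024 * 1024);
        // Heap reserved from the OS so far, and the part actually in use.
        long reservedMiB = rt.totalMemory() / (1024 * 1024);
        long usedMiB = (rt.totalMemory() - rt.freeMemory()) / (1024 * 1024);
        System.out.println("max=" + maxMiB + " MiB, reserved=" + reservedMiB
                + " MiB, used=" + usedMiB + " MiB");
    }
}
```

If the printed max is far below what the launcher was told to allocate, the argument isn't being applied.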
Singleplayer worlds usually have more lag than multiplayer servers because your computer has to do the calculations. Test your FPS on a multiplayer server and see if it improves. If it does, something involving the calculations is the reason your FPS is down. I'd check your hard drive to see if it's still good, or check your CPU again.
I already allocated 4 GB of RAM for MC, updated my graphics driver, updated Java, set my CPU to high performance, and set my Nvidia 3D settings as low as possible. I think I'm going to use OptiFine again... I think it will help me. I always blamed it for my world corruption... and I was wrong.
I have 78 FPS in singleplayer now... within my city... Can I ask something? Can torches make the game lag?
I'm not sure if this will help, but TheMasterCaver is a regular on these forums, and he'll tell you that allocating more memory isn't always better and can actually make things worse. It'll cost you two minutes of your time, but try setting your memory back to a 2 GB allocation. It looks like you have only 8 GB of RAM installed, if I'm reading that right... your system may be getting bogged down since you're trying to allocate half of it to the game. If that doesn't help, you can always set it back to 4.
Minecraft is actually more CPU-dependent, to an extent, so a new video card isn't always going to be as much of a boost as you'd think.
1.15 has known issues with FPS performance:
MC-164123 Poor FPS performance with new rendering engine
That report mentions the GPU not increasing its clock, but even with that fixed, FPS is still significantly lower than in older versions (for reference, I got twice the FPS and 5 times the server tick performance in 1.6.4 on a computer that would be 14 years old by now*). From your screenshots the internal server is having issues as well, since the tick time should be less than 50 ms, especially when standing still, which suggests that something in the world may be causing poor performance; try creating a new world and see if performance improves. Entities do not appear to be an issue, since 124-145 is normal.
*It is completely unfair to claim that the game runs poorly because its "...core components were coded by a single guy..."; Notch hasn't worked on the game for nearly a decade, with countless major rewrites since then (at least two for the rendering engine, in 1.8 and 1.15, the latter of which doesn't even support the long-deprecated rendering that Notch used). The current development team is more to blame; see "Minecraft 1.8 has so many performance problems that I just don't know where to start" (they make a typo there; the "Notch" engine is not pre-1.3 but pre-1.8, unless they are referring to the internal server, but older versions performed worse in my experience, as expected since the game was truly single-threaded back then).
Also, biome blend should have no impact on performance unless you are moving around: the issue with it is that it has to average the biome colors for every grass/leaf/water block within the blend area (so 5x5 means checking 25 points), which only occurs on a chunk update; otherwise the exact same amount of data is being sent to the GPU (the same is true of smooth lighting).
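To make that cost concrete, the averaging step described above looks roughly like this, run once per block on a chunk update rather than per frame. This is a sketch with made-up names; the packed-RGB representation is an illustrative assumption, not Mojang's actual internals:

```java
public class BiomeBlend {
    // Average packed 0xRRGGBB colors over a (2r+1) x (2r+1) neighborhood,
    // as biome blend does once per grass/leaf/water block on a chunk update.
    static int blend(int[][] colors, int x, int z, int r) {
        int red = 0, green = 0, blue = 0, n = 0;
        for (int dx = -r; dx <= r; dx++) {
            for (int dz = -r; dz <= r; dz++) {
                int c = colors[x + dx][z + dz];
                red += (c >> 16) & 0xFF;
                green += (c >> 8) & 0xFF;
                blue += c & 0xFF;
                n++; // r = 2 gives a 5x5 area, so 25 samples per block
            }
        }
        return (red / n) << 16 | (green / n) << 8 | (blue / n);
    }
}
```

Since the result is cached in the rebuilt chunk geometry, a higher blend radius only costs more when chunks update, which is why it shouldn't affect FPS while standing still.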
It's also important to note that Minecraft was coded in Java. Java is not really a good language for a game at all, since you can't control memory allocation in Java. Most games are written in C, so there's only so much you can do to optimize the game.
With every update there will be a decrease in performance because of game calculations; this is inevitable. I highly suggest renting a server and hosting your singleplayer world there. Then you'll see a huge increase in performance, because the server will do the calculations instead of your PC.
This is completely wrong; why does 1.6.4 allocate only 1/10 or less of the memory of newer versions? Because the current developers have no idea how to properly code:
The previous Minecraft releases were much less memory hungry. The original Notch code (pre-1.3 [pre-1.8]) was allocating about 10-20 MB/sec, which was much easier to control and optimize. The rendering itself needed only 1-2 MB/sec and was designed to minimize memory waste (reusing buffers, etc.). The 200 MB/sec is pushing the limits and forcing the garbage collector to do a lot of work, which takes time.
The general trend is that the developers do not care that much about memory allocation and use "best industry practices" without understanding the consequences. The standard reasoning being "immutables are good", "allocating new memory is faster than caching", "the garbage collector is so good these days" and so on.
The old Notch code was straightforward and relatively easy to follow. The new rendering system is an over-engineered monster full of factories, builders, bakeries, baked items, managers, dispatchers, states, enums and layers. Object allocation is rampant, small objects are allocated like there is no tomorrow. No wonder that the garbage collector has to work so hard.
Between 18w05a (the latest currently working version) and 18w06a - 18w15a, we get a 7x slowdown: not great, but acceptable. With 18w16a, we get another 5x slowdown, for a total of 35x: Amidst becomes almost too slow to be usable.
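The "reusing buffers" technique in the quote above is simply the difference between handing the garbage collector a brand-new array every frame and filling one allocated up front; a minimal illustration (all names are mine, not from either codebase):

```java
public class BufferReuse {
    // Allocated once; refilled on every call instead of churning the heap.
    final float[] scratch = new float[8192];

    // Allocation-heavy style: a new array per call, which is what drives
    // the MB/sec allocation rates quoted above and keeps the GC busy.
    float[] buildFresh(int count) {
        float[] buf = new float[count];
        for (int i = 0; i < count; i++) buf[i] = i * 0.5f;
        return buf;
    }

    // Reuse style: fill the preallocated buffer and report how much is valid.
    int buildReused(int count) {
        for (int i = 0; i < count; i++) scratch[i] = i * 0.5f;
        return count;
    }
}
```

Both styles compute identical data; only the reuse style does it without generating garbage, which is the entire point being argued here.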
Likewise, I've made very impressive improvements by minimizing object allocation, with memory usage down to 100 MB on Far render distance, absolutely unheard of for modern versions (the OP's screenshots show well over 1 GB being used). Note that this is still based around the "Notch" engine (same rendering methods and the like, just cleaned up):
Note that memory usage should be higher in the Nether since there are more blocks loaded but it is still only around 100 MB; I could even allocate just 256 MB and not have any issues:
This is an older screenshot from when I was optimizing sign rendering, with a mere 39 MB being used at the time (in a mostly empty Superflat world); there is certainly no reason to ever have issues with signs and the like causing lag when hundreds of them still give 1500 FPS. I also don't tick tile entities that don't need ticking, for better server-side performance:
Note also that the hundreds of new features I've added in TMCW have had virtually no impact on performance or memory usage, even without any of my optimizations, many of which are literally order-of-magnitude improvements (for example, Fancy clouds render faster than Fast clouds do in vanilla, as do leaves when smooth lighting is enabled, all because of improved algorithms). Even game startup, which should take longer with more assets to load, is still nearly instant, as is world load time (the last time I played on a recent version, 1.13, it took longer to load an existing world than vanilla 1.6.4 takes to load itself or generate a world).
Not only that, somebody claims to have made a mod that exceeds Bedrock's performance, enabling render distances of up to 256 chunks (I can't imagine ever trying to run that), despite being on a newer version and with Forge (which itself is so bad that this optimization mod's author claims it runs worse than vanilla even with just Forge installed, so it is pointless to make a vanilla version; some say to install Forge just to use OptiFine, but you should never do that).
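The "don't tick tile entities that don't need ticking" point amounts to only registering entities that actually implement a tick method in the per-tick list; a hypothetical sketch (the interface and class names are made up for illustration, not the game's real ones):

```java
import java.util.ArrayList;
import java.util.List;

public class TickDemo {
    interface TileEntity {}
    interface Tickable extends TileEntity { void tick(); }

    // Static content: a sign never has per-tick work to do.
    static class Sign implements TileEntity {}

    // Has per-tick work (e.g. smelting progress), so it implements Tickable.
    static class Furnace implements Tickable {
        int progress;
        public void tick() { progress++; }
    }

    // Only Tickable entities ever enter the tick loop, so hundreds of
    // signs add zero per-tick cost on the server side.
    static List<Tickable> tickList(List<TileEntity> all) {
        List<Tickable> ticking = new ArrayList<>();
        for (TileEntity te : all) {
            if (te instanceof Tickable) ticking.add((Tickable) te);
        }
        return ticking;
    }
}
```

The filtering happens once when the tick list is built, so the per-tick cost scales with the number of entities that genuinely need updating rather than with everything placed in the world.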
How can you say I'm completely wrong without even understanding how Java manages memory?
https://www.javatpoint.com/memory-management-in-java
Java automatically manages memory; as a result, developers aren't able to manually manage memory, unlike in C.
You can't compare 1.6.4 and current versions at all; there's simply more stuff in newer versions (an oversimplification). This is a false equivalence fallacy.
Some opinions with regard to Java game development:
https://softwareengineering.stackexchange.com/questions/55104/why-isnt-java-more-widely-used-for-game-development
https://gamedev.stackexchange.com/questions/25492/is-java-viable-for-serious-game-development
It's fairer to say that it isn't the whole story, rather than telling me I'm completely wrong. I do believe that the game being written in Java is a factor in why it's not really optimizable.
Comparing a game whose core components were coded by a single guy to a AAA game is like comparing apples to oranges.
Well, I pointed out what is wrong between the two screenshots of Minecraft; I just mentioned those three games because it's surprising.
Turn off mipmapping and biome blend.
any other options?
https://optifine.net/downloads
Are you having mouse sensitivity problems too?
And are you using a gaming mouse?
... Nope, I'm just using a regular mouse... and it doesn't cause a single problem at all.
TheMasterCaver's First World - possibly the most caved-out world in Minecraft history - includes world download.
TheMasterCaver's World - my own version of Minecraft largely based on my views of how the game should have evolved since 1.6.4.
Why do I still play in 1.6.4?