1. I put together a modpack for MC version 1.12.2 with 207 mods that play together nicely, using Forge ....2705. The pack includes OptiFine so I can use shaders. Some recently added mods that affect world generation have made the pack need more RAM than I have installed; 24 GB (system RAM usage is at 3 GB outside of Java). Most RAM-related posts I have read advise relatively small RAM allocations in the Java arguments, so I'm wondering why, based on my own experience, there seems to be such a disconnect between the advice and reality. Looking at both Afterburner and the Windows 10 Task Manager performance monitors, I can see that Java seems to ignore the allocation parameters altogether and uses as much RAM as it needs anyway. Long story short, I'm awaiting a fourth 8 GB RAM module so I can play my pack again.
2. I tested vanilla 1.12.2 with the same Forge and OptiFine versions to measure the differences across a sample of resource packs. I also learned that the shader uses no additional RAM. At the risk of publishing old news, here is what I found:
- default uses 1.7 GB
- 32x32 textures use the same
- 64x64 use 2.3 GB
- 128x128 use 2.9 GB
I was surprised to see so little difference. I did not know that the extra load from shaders and resource packs is carried almost entirely by the GPU.
I would appreciate any feedback if you've had greatly different experiences, and any explanation of why the Java RAM allocation arguments seem somewhat irrelevant.
I find it completely impossible to believe that any modpack, no matter how large, needs even 1 GB of memory unless it uses crazy high-resolution textures or something. Mod creators should realize that every time you double the resolution of a texture its memory footprint quadruples, and the memory usage is the uncompressed size; e.g. 16 x 16 x 4 bytes (32 bits per pixel) = 1024 bytes for a 16x16 texture, 4096 bytes for a 32x32 texture, and so on. For example, the following is an analysis from the VisualVM tool that is included with the JDK:
This shows that most of the memory in use is taken up by byte arrays, which is no surprise since the game uses them to store chunk data in memory: each chunk section has a 4 KB array for block IDs and three 2 KB arrays for light and metadata, or 10 KB per section (12 KB if block IDs above 255 are used, which adds another 2 KB array). The second part is a breakdown of the memory used by biomes, one of the largest users of memory as a group; even hundreds of biomes have little impact on total memory usage (presumably the BiomeGenBase[] entry is counting more than just the biome instances, each of which exists as a single globally shared static instance, and there are far fewer than 754 biomes). The same goes for blocks, most of which used less than 1 KB (though each 16x16 texture uses 1 KB by itself, which was presumably counted elsewhere). Entities were perhaps the largest user of memory per instance, around 1-2 KB each, with dozens of instances of many types (unlike blocks, biomes, and items, which all use globally shared static instances, each entity is a separate instance), but their overall memory usage was still less than 1 MB.
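For reference, the arithmetic behind those numbers works out as follows (a quick back-of-envelope sketch of my own; the array sizes are as described above, not taken from the game's code):

```java
public class MemoryEstimates {
    public static void main(String[] args) {
        // A 16x16x16 chunk section: one byte per block for IDs,
        // plus three half-byte-per-block "nibble" arrays for
        // metadata, block light, and sky light.
        int blockIds = 16 * 16 * 16;        // 4096 bytes
        int nibbleArray = blockIds / 2;     // 2048 bytes each
        int perSection = blockIds + 3 * nibbleArray;
        System.out.println("Per section: " + perSection + " bytes");        // 10240

        // The extended block ID array (IDs above 255) adds one more
        // nibble array on top of that:
        System.out.println("With extended IDs: " + (perSection + nibbleArray) + " bytes"); // 12288

        // Uncompressed texture memory is width x height x 4 bytes (RGBA),
        // so doubling the resolution quadruples the footprint:
        for (int res = 16; res <= 128; res *= 2) {
            System.out.println(res + "x" + res + ": " + (res * res * 4) + " bytes");
        }
    }
}
```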
Interestingly, the single class with the highest memory usage and instance count, ChunkPosition, is to 1.6.4 what BlockPos is to 1.8 and later versions, where it is undoubtedly a far greater contributor to memory usage:
public class ChunkPosition
{
    /** The x coordinate of this ChunkPosition */
    public final int x;

    /** The y coordinate of this ChunkPosition */
    public final int y;

    /** The z coordinate of this ChunkPosition */
    public final int z;

    public ChunkPosition(int par1, int par2, int par3)
    {
        this.x = par1;
        this.y = par2;
        this.z = par3;
    }
}
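To put the per-instance cost in perspective, here is a rough estimate for a typical 64-bit HotSpot JVM with compressed object pointers (the header and alignment sizes here are assumptions about the JVM, not values measured with a profiler):

```java
public class ChunkPositionFootprint {
    public static void main(String[] args) {
        int header = 12;            // object header: mark word + compressed class pointer
        int fields = 3 * 4;         // three int fields: x, y, z
        int raw = header + fields;  // 24 bytes
        int aligned = (raw + 7) / 8 * 8; // rounded up to 8-byte object alignment
        System.out.println("~" + aligned + " bytes per instance");

        // A million short-lived positions would cost on the order of 24 MB,
        // before counting the references that point at them.
        System.out.println("1,000,000 instances ~ " + (aligned * 1_000_000L / 1_000_000) + " MB");
    }
}
```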
I'd sure love to see an analysis of the memory usage of a large modpack to see just what is using so much memory; even with how bad Forge and vanilla 1.8+ are at memory usage, and how poorly coded most mods are, I can't imagine what could be using so much memory (nor can I even imagine running such a modpack). There is a reason why TMCW is optimized to run on a 32-bit OS with 512 MB allocated (this is with the game running), and even if I got a new 64-bit computer I wouldn't use that as an excuse to stop optimizing it. Not that I've had any issues despite adding hundreds of new features (I can even go up to a 20 chunk render distance with only 512 MB allocated), since, as mentioned above, they simply don't need much memory: the memory used by chunk data is only affected by the number of chunks loaded (and how high the terrain is), not the number of unique blocks or entities, since block IDs are simply pointers to the actual block instances, one per type of block:
Almost all of FoamFix focuses on optimizing memory usage patterns common in Forge and mods. Porting it to vanilla would have very few benefits, as vanilla by itself is more efficient than even Forge+FoamFix.
As for memory usage outside the heap: this is mainly used by the JVM itself, as well as native libraries (e.g. OpenGL) and anything else that uses native memory. For example, the game uses a DirectByteBuffer to allocate memory for rendering, which allocates native memory outside the heap, so if I had to guess, some mod(s) are allocating huge amounts of memory outside of the JVM heap for some purpose. This is also why mods like shaders advise leaving plenty of memory free outside the heap (in my experience, with either vanilla or modded 1.6.4, the Java process uses around twice as much memory as shown in-game):
- Java heap size
This can be set in Minecraft Launcher - Edit Profile - JVM Arguments
-Xmx1024M or -Xmx1G is usually enough. 2048M is plenty, depending on render distance and mods.
Don't set it too high.
ShadersMod and OpenGL require a lot of memory outside the Java heap.
Make sure you have enough memory remaining outside java heap.
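As a minimal illustration of why that advice exists (a standalone sketch, not the game's actual rendering code), a direct buffer is carved out of native memory, so it never counts against the -Xmx heap limit:

```java
import java.nio.ByteBuffer;

public class OffHeapDemo {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        long heapBefore = rt.totalMemory() - rt.freeMemory();

        // 64 MB of native memory; this is essentially invisible to the
        // heap statistics below, because it lives outside the Java heap.
        ByteBuffer direct = ByteBuffer.allocateDirect(64 * 1024 * 1024);

        long heapAfter = rt.totalMemory() - rt.freeMemory();
        System.out.println("Direct buffer capacity: " + direct.capacity() + " bytes");
        System.out.println("Heap growth: " + (heapAfter - heapBefore) + " bytes");
        // Run with e.g. -Xmx128M: the 64 MB allocation still succeeds,
        // because it comes from native memory, not the heap.
    }
}
```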
Also, regarding in-game memory (at least): many world generation mods ignore the rules of world generation, such as placing decorations with an offset of 8 blocks from the provided chunk coordinates and not making them more than +/- 8 blocks in size (larger features are to be placed the way vanilla structures are, where only blocks that intersect a mask are placed), so that they do not overflow into unloaded chunks. Such chunks will be loaded if accessed, and in extreme cases this causes an endless chain of recursive chunk generation until the game crashes from a stack overflow or runs out of memory; even if that doesn't happen, it will load a lot of chunks into memory, causing both very slow world generation and increased memory usage. Modders should at least use a mapping tool like Minutor to make sure that chunks are only being loaded around the player; better yet, do what I did and add a check in the code for recursive chunk loading during chunk population:
https://www.reddit.com/r/feedthebeast/comments/5x0twz/investigating_extreme_worldgen_lag/
Otherwise, you can try splitting the modpack in half and loading each half separately to check memory usage, repeating until you observe more normal usage; the culprit mod is then in the other half (it could also be that the problem only happens when two particular mods are loaded together).
You clearly have great knowledge of and experience with this issue. I am just a casual player, unfamiliar with how Java functions and how Minecraft utilizes it. I described my personal experience with the pack I put together; I do not know how to optimize the pack. I find your suggestion to split the pack into smaller pieces and check memory usage very interesting. Perhaps if I have time I will do this and let you know what I find. For what it's worth, I based my pack on DireWolf20's 1.12.2 pack, added some mods that improve its immersive qualities, and then OptiFine and the shader. The mods I added recently were, for example, TechReborn, Mekanism, and Rockhounding, and I'm assuming their ore generation is what raised my RAM usage. I added 8 more GB of RAM to my computer tonight and was able to play the pack again. While Minecraft was loading the pack I watched the RAM usage in Task Manager and Afterburner. Judging from the difference between what Java used and the total RAM displayed by AB, it looked like Java was using up to 17 GB and the rest of my computer another 5-6 GB. When I loaded my new save, the total RAM number in AB reached over 30 GB; after I played for 30 minutes or so I noticed it had dropped to around 18 GB, so I assume MC was then using about 14 GB. Thanks for your feedback and educational information. I'm going to have to read it over several times to really understand it. Thanks.