I have 2 GB of RAM on my PC and allocated 1400 MB to Minecraft, which is my limit, but Minecraft is still slow. So I added 10 GB of virtual memory, but I don't know how to give it to Minecraft, because the limit didn't change from 1400 MB to 10 GB. How can I allocate virtual memory to it?
Virtual memory is something that is available system-wide. It makes a portion of your hard drive present itself as RAM, but that is a horrid option. It may help you load apps that require lots of RAM and would otherwise just run out, but since virtual memory operates at hard drive speed, it can slow things to a crawl. In other words, virtual memory is a solution for when you want something to just work, no matter how slowly. There are algorithms that try to keep the most frequently used data in physical memory, but you are still going to take a severe performance hit.
Now, if you have virtual memory set up, your application will just see it as memory and use it as needed. You can't tell MC to use virtual memory specifically; you can tell it to use more memory, and it will end up using virtual memory too. But if the goal is to make things run faster, you are not going to achieve that. You will see severe slowdowns any time MC dips into the virtual memory pool.
1st, the only thing that I understood from this is that virtual memory is used to make things work that otherwise wouldn't work at all.
2nd, if you mean that it is supposed to make things faster, why doesn't it change my FPS from 0-10 to something higher? And the ZERO is real - my game is super laggy and the virtual memory didn't do anything.
My PC details:
Processor: Intel(R) Celeron(R) CPU 847 @ 1.10GHz
Memory: 2.00 GB (1.83 GB usable)
Windows 8.1 Pro N
thanks
Yes, that is just ancient. In fact, memory is probably the least of your worries here; vanilla, unmodded Minecraft runs fairly well with only 1 GB of RAM allocated. You are limited mainly by your CPU/GPU speeds.
If you only have 2 GB of RAM I suggest allocating less memory than the default; 512 MB is more than enough at the settings that you'll be able to use. The main factor is render distance, which increases memory usage quadratically: a 32 chunk render distance loads 169 times more chunks than 2 chunks, and nearly 10 times more than 10 chunks (the recommended minimum due to a bug with mob spawning). They recommend allocating 2 GB for 32 chunks, which scales down to about 200 MB for 10 chunks - and that is indeed about how much memory the game actually uses.
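As a sanity check on those ratios, here is a back-of-the-envelope sketch (a simplification, not the game's actual chunk loader): the loaded area at render distance r is a (2r + 1) x (2r + 1) square of chunks.

```python
# Chunks loaded at a given render distance: a (2r + 1) x (2r + 1)
# square centered on the player (a simplification of the real loader).

def loaded_chunks(render_distance):
    side = 2 * render_distance + 1
    return side * side

print(loaded_chunks(2))   # 25
print(loaded_chunks(10))  # 441
print(loaded_chunks(32))  # 4225

# Ratios quoted above:
print(loaded_chunks(32) / loaded_chunks(2))   # 169.0
print(loaded_chunks(32) / loaded_chunks(10))  # ~9.58
```

The 169x and ~10x figures in the post fall straight out of this squared relationship.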
Of course, there is also the option of playing an older version; the system requirements have increased greatly over the years. The minimum requirements for 1.6 called for a Pentium 4, which came out in 2000 - probably older than most people who play the game! This is actually part of the reason why I still play 1.6.4: my older computer had major lag issues with 1.8 (plus an odd framerate issue with 1.7, though that wasn't actually lag), and the newer one I eventually got is also now below the system requirements. People still blame "notch code" for the game's poor performance, but much of that code has since been rewritten - in a much worse way.
Playing on a server can also help, since your computer doesn't need to handle all of the game logic, which runs server-side. In singleplayer the internal server also keeps a separate cache of loaded chunks, so memory usage per loaded chunk is more than twice as high - more, in fact, since the server additionally keeps the spawn chunks loaded. (The singleplayer client and server could share one chunk cache, but that would introduce contention issues with two threads accessing the same data at once; in any case the memory usage is not significant until you go to higher render distances - you are much more limited by your CPU/GPU.)
Also, nobody has mentioned Optifine, which can help. Besides its general optimizations: as mentioned above, the server keeps the spawn chunks loaded, and Optifine can disable that to save memory and resources (this does mean that farms won't work in the spawn chunks). Note that only newer versions of Optifine, from around 1.8 on, do this. Older versions of vanilla also had "Advanced OpenGL", which doubled FPS on my older computer, but that usually only helps on Nvidia GPUs, and only when the GPU is the limiting factor, as it increases CPU load.
Well, what if your computer is above the requirements, like mine is? Would the virtual memory make a difference?
You never, ever want to have to use virtual memory (or more accurately swap, as virtual memory is a more general concept used by modern operating systems), especially for a real-time application like Minecraft, since it is swapped to/from disk, which can be on the order of 100,000 times slower than system RAM - especially when accessing lots of random bits of data, as Minecraft does constantly (disks are better at large sequential accesses). Even an SSD is insanely slow by comparison (for reference, 60 FPS is 16.67 ms per frame):
L1 cache reference 0.5 ns
Branch mispredict 5 ns
L2 cache reference 7 ns 14x L1 cache
Mutex lock/unlock 25 ns
Main memory reference 100 ns 20x L2 cache, 200x L1 cache
Compress 1K bytes with Zippy 3,000 ns 3 us
Send 1K bytes over 1 Gbps network 10,000 ns 10 us
Read 4K randomly from SSD* 150,000 ns 150 us ~1GB/sec SSD
Read 1 MB sequentially from memory 250,000 ns 250 us
Round trip within same datacenter 500,000 ns 500 us
Read 1 MB sequentially from SSD* 1,000,000 ns 1,000 us 1 ms ~1GB/sec SSD, 4X memory
Disk seek 10,000,000 ns 10,000 us 10 ms 20x datacenter roundtrip
Read 1 MB sequentially from disk 20,000,000 ns 20,000 us 20 ms 80x memory, 20X SSD
https://gist.github.com/jboner/2841832
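To put those numbers into per-frame terms, here is simple arithmetic on the figures above (nothing Minecraft-specific):

```python
# Compare the latency figures above against a 60 FPS frame budget.

FRAME_MS = 1000 / 60                   # ~16.67 ms per frame

ram_ref_ms   = 100 / 1_000_000         # 100 ns main memory reference
ssd_4k_ms    = 150_000 / 1_000_000     # 150 us random 4K SSD read
disk_seek_ms = 10_000_000 / 1_000_000  # 10 ms hard drive seek

print(f"RAM references per frame: {FRAME_MS / ram_ref_ms:,.0f}")
print(f"SSD 4K reads per frame:   {FRAME_MS / ssd_4k_ms:,.0f}")
print(f"disk seeks per frame:     {FRAME_MS / disk_seek_ms:.2f}")
```

A single hard-drive seek consumes well over half of a 60 FPS frame budget, which is why a game that starts paging to disk stutters so badly.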
In other words, you always want to make sure that you have enough free RAM to hold the entire game and all its associated libraries, including the JVM itself - which at typical settings means about twice the memory you've allocated to Java (e.g. -Xmx1G = 2 GB free, -Xmx2G = 4 GB free, etc.). It also helps to have enough free RAM left over for the OS's disk cache to operate effectively (modern OSes use any free RAM to cache recently accessed files, and the game continuously loads and saves chunks as you move around). Otherwise, the only practical solutions are to upgrade your system, allocate less memory and/or use memory optimization mods, or end unnecessary processes.
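For concreteness, the allocation limit is the JVM's -Xmx flag (and -Xms sets the starting heap size). In the vanilla launcher it lives in the profile's "JVM arguments" field; a standalone launch would look something like this (the jar name here is a placeholder for illustration, not an actual file from this thread):

```
java -Xms512M -Xmx512M -jar minecraft_client.jar
```

Some people set -Xms equal to -Xmx to avoid heap resizing, though opinions differ on how much that helps.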
Virtual memory sucks; as TMC said, it is much slower than real RAM. And even in the case of an SSD, not only would it still be slow, it would also wear the flash cells out much faster. In the end that is a worse outcome than on a hard drive, purely because a hard drive survives the rewrite cycles involved whereas an SSD doesn't - after a few petabytes of writes you'll have to buy a new one.
In both cases it's better to buy actual RAM and allocate that to the game.
Depending on whether it is vanilla or modded, and on the render distance, you will want to start with a minimum of 4 GB of system memory.
8 GB of total RAM (with at least 1 GB allocated to Minecraft, which is the default in 1.16 64-bit Java) is recommended because of things like your operating system and background processes.
If you go with a 32 chunk render distance I would suggest allocating 2 GB, because more information has to be stored in memory when more chunks are loaded at once; in fact I would say this is the most memory-intensive part of the game, whereas things like redstone, lighting and mob AI tend to be more CPU/GPU dependent.
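Scaling that 2 GB figure by the number of loaded chunks gives a rough rule of thumb (my extrapolation from the numbers in this thread, not an official recommendation):

```python
# Scale the "2 GB at 32 chunks" suggestion by loaded chunk count,
# (2r + 1)^2. A rough rule of thumb, not an official formula.

BASELINE_MB = 2048  # suggested allocation at 32 chunk render distance

def suggested_mb(render_distance):
    chunks = (2 * render_distance + 1) ** 2
    baseline_chunks = (2 * 32 + 1) ** 2
    return BASELINE_MB * chunks / baseline_chunks

for r in (8, 10, 16, 32):
    print(f"{r:2d} chunks -> ~{suggested_mb(r):.0f} MB")
```

At 10 chunks this lands around 215 MB, consistent with the ~200 MB figure mentioned earlier in the thread.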
"with at least 1gb allocated to Minecraft which is the default in 1.16 64bit Java"
The default has been increased to 2 GB since around 1.13, a version which is much more resource-intensive than older ones, though memory usage had already been increasing significantly since around 1.8; back in the 1.6 days Optifine even recommended allocating only 350 MB:
"Launch Minecraft with less memory (yes, really). Usually it does not need more than 350 MB and runs fine on all settings with the default texture pack."
Interestingly, they mention swapping as a reason to allocate less; back then a lot of people still used 32-bit systems with at most 3 GB of usable memory, if they even had that much installed. (I myself had issues with the game running out of memory with 1 GB allocated - which in this case meant a lack of process address space, not Java heap space - but otherwise I never had issues with overall system RAM, with around 1 GB still available while playing):
"By default java allocates way too much memory (1GB) which may get swapped to disk and the overall performance may suffer a lot."
In any case, you shouldn't allocate much more than the game actually uses, since managing a larger heap reduces garbage collector performance. This is also why my own mod recommends allocating 512 MB, of which around half is actually used, giving a good amount of headroom. Other JVM arguments may also matter; the ones I recommend were the defaults until around 1.13, which switched to the G1 garbage collector instead of CMS - better for memory-intensive programs, but requiring more memory for its own bookkeeping.
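For reference, the pre-1.13 launcher defaults alluded to here used the CMS collector, along these lines (quoted from memory, so treat the exact set of flags as approximate - though each flag shown is a real JVM option, and CMS itself was removed entirely in JDK 14):

```
-Xmx1G -XX:+UseConcMarkSweepGC -XX:+CMSIncrementalMode -XX:-UseAdaptiveSizePolicy -Xmn128M
```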
Also, there is another area where a lack of memory can significantly impact performance: video RAM, specifically dedicated VRAM. If you exceed it, the GPU swaps to system RAM, causing extreme FPS drops. As an illustration, my framerate went from around 80 FPS to 5 FPS when I enabled Fancy leaves in my mod's "mega tree" biomes; it didn't happen all at once but after a period of time, with FPS returning to normal if I looked away - a classic sign of VRAM paging. (Integrated GPUs that use system RAM are not affected, but they also don't perform as well in general, since they have no high-speed graphics memory, or only a very limited amount.) In this case, using Fast graphics can help - not necessarily everything on Fast; Optifine lets you selectively set leaves, etc. to Fast or Fancy - otherwise a better GPU with more VRAM is necessary. Even modern high-end GPUs can have issues with very large texture packs; another issue here is the "maximum texture size", which is limited to 16384x16384 on many GPUs and represents the maximum combined size of every texture, since the game stitches them all together into a single atlas.
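The atlas limit is easy to quantify (illustrative math only - real packs also include animation frames and mipmaps, which eat additional space):

```python
# How many square textures of a given resolution fit into a single
# 16384 x 16384 atlas, ignoring padding, animations and mipmaps.

ATLAS_SIDE = 16384

def textures_per_atlas(texture_side):
    per_row = ATLAS_SIDE // texture_side
    return per_row * per_row

for size in (16, 64, 256, 512):
    print(f"{size}x{size}: {textures_per_atlas(size):,} textures fit")
```

At 512x512 only about a thousand textures fit, which is why very high-resolution packs can blow past the limit on many GPUs.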
Odd - that was the amount of memory I used to allocate to Minecraft Java most of the time, so I assumed the default allocation hadn't changed since then. Thanks for correcting me on that.
2 GB is the bare minimum I would ever consider allocating to that version of the game, even on earlier versions, because I tend to use high render distances, so the chunk loading system eats up a lot of memory. If the heap is too small, the garbage collector keeps having to flush out data to make room for more; too little RAM hurts performance a lot more than having an excessive amount does, at least on most modern PCs.
Thankfully I play Bedrock Edition, where memory allocation is done automatically - the game uses as much as it needs - which is another advantage this version has over the Java version. If every game required us to allocate memory at the software level, that would be a complete nuisance that most gamers simply don't wish to put up with.
It is reasonable to expect gamers to know and install the hardware needed to run their games, but it isn't reasonable to expect every gamer to toy around with Java settings just to get a game to work as it should; this is an inconvenience that shouldn't exist in modern PC games, IMO.
Vanilla MC as it is runs quite fine on 1 GB. But when running with a few Forge mods - most notably AbyssalCraft - I would get major lag spikes, especially after a dimension swap (e.g. going into the Nether). Enabling the lagometer showed that the spikes were due to garbage collection. Bumping the allocation to 4 GB got rid of the issue and the game ran smoothly again.
You can get away with 1 GB allocated out of the total if you run vanilla and use a low render distance. I would suggest 1 GB per player if it is for a server, but this is pretty much already implied on the Gamepedia wiki page; if you're playing with 10 people, no less than 8 GB is recommended for a Windows-based Minecraft server, and I would say you should go a bit higher to reserve memory for the OS.
As has been discussed, though, and agreed by everyone in the replies to the OP, virtual memory should not be used as a replacement for insufficient RAM. Virtual memory exists to page out unused applications so that the operating system can use system memory efficiently and dedicate it to active applications.
Windows Task Manager and/or the Game Bar will give you enough information on whether a memory upgrade is needed: if more than half of your RAM is in use at idle and you only have 4 GB, it is very likely you're going to have problems with Minecraft.