Or play an older version; those specs are perfectly capable of running 1.6.4 or earlier, which had even lower requirements, although I'd recommend an NVIDIA GPU instead (in fact, the recommended requirements called for hardware that is now up to 18 years old). Even modern hardware struggles with the latest versions because of how unoptimized they are. For example, this person has an RTX 2080 Super and a Ryzen 7 1700 yet gets only 60 FPS and a 24 ms server tick time at a render distance of 17 (not that high), with the game using more RAM than your computer has; meanwhile, I allocate only 512 MB to 1.6.4 with no issues, which should leave enough free system RAM as long as you don't have other programs or background processes running:
(IMO, this is the result of sloppy programming and the assumption that everybody can simply buy a new computer or upgrade every year, which only serious gamers do, let alone can afford. The addition of more content is often claimed to be the main reason for the increase in resource demands, but that has little effect on memory usage in a game like Minecraft, where the vast majority of memory is used to store block data, whose size depends on the number of blocks loaded, not on how many different types of blocks there are.)
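To make that point concrete, here is a rough back-of-the-envelope sketch in Java. The numbers assume the 1.6.4-era storage format of one byte of block ID plus three nibbles (metadata, block light, sky light) per block; the class name is my own, not from the game:

```java
// Rough estimate of block-data memory vs. render distance, assuming the
// 1.6.4-era format: 1 byte block ID + 3 nibbles (metadata, block light,
// sky light) per block = 2.5 bytes per block. Note that nothing here
// depends on how many block *types* exist, only on how many are loaded.
public class ChunkMemoryEstimate {
    public static void main(String[] args) {
        final double bytesPerBlock = 1 + 3 * 0.5;   // 2.5 bytes per block
        final int blocksPerChunk = 16 * 16 * 256;   // 65,536 blocks per chunk
        for (int renderDistance : new int[] {8, 10, 16, 32}) {
            int side = 2 * renderDistance + 1;      // chunks along each axis
            long chunks = (long) side * side;       // total loaded chunks
            long bytes = (long) (chunks * blocksPerChunk * bytesPerBlock);
            System.out.printf("render distance %2d: %4d chunks, ~%3d MB%n",
                    renderDistance, chunks, bytes / (1024 * 1024));
        }
    }
}
```

This counts raw block data only (no object overhead or entities), but it shows why loaded volume, not content variety, dominates: doubling the render distance roughly quadruples memory use.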
Note that this is with mods, albeit my own non-Forge mods, which do not have the issues that plague many mods (generally poorly written code with no regard for memory usage):
On the other hand, performance was absolutely terrible on any newer version. Some change to rendering in 1.7 caused severe stutter, once every 10 frames, regardless of any in-game settings, control panel settings, etc., and so severe that the effective framerate looked 10 times lower. Then 1.8 was unbelievably laggy; in part because of this I never upgraded to a newer version, even after getting a new computer, which actually gets a higher FPS in 1.8 than in 1.6.4 unless I enable "Advanced OpenGL" (1.8 has native occlusion culling, while older versions used an OpenGL function that worked best on NVIDIA GPUs; AMD/Intel would often perform worse, which is why it was optional. This also explains why some people say that 1.8 performs better). 1.7 also had a known issue with stuttering, though not the framerate issue mentioned above.
Note that both of these are with Optifine, while 1.6.4 did not have it installed; not that Optifine had any effect on the periodic lag spikes (I mainly used it for features like clear water, better fog, individual Fast/Fancy options, fixes for chunk rendering issues, and zooming):
That's not to say that 1.6.4 never had issues; enabling Fancy leaves in my mod's "mega tree" biomes would kill performance due to running out of dedicated VRAM (only 256 MB), and using Far render distance risked an out-of-memory error because a 32-bit OS can only provide around 1.5 GB of RAM to a process (the JVM would fail to start with more than about 1.3 GB allocated):
If I were in charge of development, you'd still be able to run the latest version on the same hardware:
This is at maximum settings (Fancy, 16 chunk render distance); 536 FPS, 2.5 ms server tick, 232/365 MB memory usage:
(note that vanilla 1.6.4 is effectively limited to a 10 chunk render distance, but I fixed that; even Optifine doesn't work properly until you go above 16)
Contrast that to this, from somebody with much better specs than mine - easily more than an order of magnitude worse in every respect (FPS, tick time, memory):
Also, this is a heap dump analysis I made using Java VisualVM, showing that the majority of memory is taken by byte arrays used to store block IDs, metadata, and light levels; things like individual blocks/items/entities are far down the list ("retained" shows how much memory is referenced not just by the object itself but by everything it references; Chunk has instances of ChunkSection, which in turn hold the byte arrays that store the data). The size of the heap dump was also 164 MB, suggesting that is the actual amount of memory in use and anything above that is dead objects awaiting garbage collection; this means that 75% of memory is used by chunks:
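The storage layout the heap dump reveals can be sketched like this; the class and field names are illustrative stand-ins, not the actual decompiled ones:

```java
// Illustrative sketch (not the actual game classes) of why chunk data
// dominates the heap: each 16x16x16 section holds flat byte/nibble
// arrays, so retained size scales with loaded blocks, not block types.
public class ChunkSectionSketch {
    static final int BLOCKS = 16 * 16 * 16;  // 4096 blocks per section

    byte[] blockIds = new byte[BLOCKS];      // 1 byte per block
    byte[] metadata = new byte[BLOCKS / 2];  // 4 bits per block, packed
    byte[] blockLight = new byte[BLOCKS / 2];
    byte[] skyLight = new byte[BLOCKS / 2];

    // ~10 KB per section; a full-height chunk has 16 sections
    // (~160 KB of raw data before object overhead), regardless of
    // how many different block types the game defines.
    int bytesPerSection() {
        return blockIds.length + metadata.length
                + blockLight.length + skyLight.length;
    }

    // Nibble accessor: two 4-bit metadata values are packed per byte.
    int getMetadata(int index) {
        int packed = metadata[index >> 1] & 0xFF;
        return (index & 1) == 0 ? packed & 0xF : packed >>> 4;
    }
}
```

In VisualVM terms, each ChunkSection's "retained" size is essentially these four arrays, which is why the byte[] entries sit at the top of the list while Block instances barely register.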
How much memory does a block use? More than indicated, because the texture itself uses at least 1 KB (16x16 pixels at 32 bits per pixel), but still far from what is needed to store chunk data (some of the blocks listed have a size of 0 because I replaced the vanilla class with my own, like BlockAnvilFix in place of BlockAnvil, or because they have subclasses):
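For reference, the 1 KB figure is just width times height times bytes per pixel:

```java
// Where the "at least 1 KB per texture" figure comes from: a 16x16
// texture at 32 bits (4 bytes, e.g. RGBA) per pixel. Compare this
// per-*type* cost to the per-*block* chunk data, which scales with
// how much of the world is loaded.
public class TextureMemory {
    public static void main(String[] args) {
        int bytesPerTexture = 16 * 16 * 4;  // 1024 bytes = 1 KB
        System.out.println(bytesPerTexture + " bytes per 16x16 texture");
    }
}
```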
My PC:
RAM: 2 GB
Graphics card: ATI Radeon HD 4200
CPU: AMD Athlon(TM) II X2 220 Processor
Minecraft 1.8 has so many performance problems that I just don't know where to start.
MC-164123 Poor FPS performance with new rendering engine
To give an idea of just how much more demanding newer versions are: I used to have a computer with an Athlon 64 X2 4200+ and a GeForce 7600 GS, both dating from 2006 but still above the recommended system requirements (a 2.6 GHz Pentium D or Athlon 64, both of which had about half the overall performance and a lower single-thread rating despite the 4200+ running at only 2.2 GHz, and a GeForce 6000 series GPU; no specific model was given, but a 7600 GS outperforms a 6800 GT):
TheMasterCaver's First World - possibly the most caved-out world in Minecraft history - includes world download.
TheMasterCaver's World - my own version of Minecraft largely based on my views of how the game should have evolved since 1.6.4.
Why do I still play in 1.6.4?