Let's assume that the average height of each chunk, from bedrock, is just above sea level. Sea level is 64, so let's assume that 70 is the average height of terrain.
That would make each chunk hold, on average, 70 x 16 x 16 blocks.
That's 17920 blocks. Each of those has 8 vertices. 8 vertices per block per chunk, and you get 143360 vertices per chunk.
Taken from the minecraft wiki, you load a 21 x 21 chunk area around you. That's 441 chunks at any time.
8 vertices per block. 143360 vertices per chunk, and 17920 blocks per chunk. 441 chunks loaded at any time.
143360 vertices per chunk times 441 chunks.
that all rolls together into a grand, whopping...
63221760 vertices loaded at any given time. Sixty-three million, two-hundred twenty-one thousand, seven-hundred and sixty vertices loaded at any time, PLUS the fact that more are being loaded and deleted every time your character takes a step!
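If you want to double-check the arithmetic, here's a quick sketch that just multiplies the same assumed numbers together (70-block average height, 16x16 chunk footprint, 8 vertices per block, 21x21 loaded chunks):

```java
// Sanity check of the numbers above; all values are the post's assumptions.
public class VertexEstimate {
    public static void main(String[] args) {
        long blocksPerChunk = 70L * 16 * 16;        // 17,920 blocks (assumed average height x chunk footprint)
        long verticesPerChunk = blocksPerChunk * 8; // 143,360 vertices (8 cube corners per block)
        long loadedChunks = 21L * 21;               // 441 chunks in the 21 x 21 loaded area
        System.out.println(verticesPerChunk * loadedChunks); // prints 63221760
    }
}
```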
Then you throw in all the per-tick calculations, particle effects, animations, mobs, pathfinding, and the textures, and you SHOULD realize that...
most games don't have NEARLY this amount of stuff going on at any time. So before you make a rant about how you can run certain games on "max settings" or other similar arguments, remember this post.
Granted, the Minecraft code is not the best (according to some modders), and it could be improved. However, not all of Minecraft's bad performance is because of coding problems; Notch and Jeb are not inept coders, far from it. A lot of it is because the game itself is simply resource intensive.
Oh, and this is NOT about the "server lag" created by the merge. The problems with the merge are due to the code not being finished and the client and server failing to synch properly.
This post only refers to framerate and resource usage.
To be fair, Minecraft is a 3D game with a near-infinite, procedurally generated world that is entirely destructible, and that experience can be played cooperatively online with up to several hundred players. Even now, few games (if any?) can claim all of those traits. In that respect at least, I find Minecraft to be a real development achievement.
I'm sorry, but no, just no. Blocks loaded into memory are not anywhere near equivalent to blocks rendered. Every game has a large number of per-tick processes; most probably have more than Minecraft. Even with the number of blocks, the polygon count per block is really low, and the number of polygons rendered at any given time is low compared to games with high-quality graphics right now. Not to mention it's just rendering basic polygons with low-res textures and doing very little of the other effects that other games do.
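To put a rough number on that gap, here's a sketch using the OP's own 70 x 16 x 16 chunk and the idealized case of a perfectly flat surface where only the top faces touch air:

```java
// Contrast between blocks held in memory and faces that would actually be drawn.
// Assumes the OP's 70 x 16 x 16 chunk and an idealized flat surface.
public class LoadedVsRendered {
    public static void main(String[] args) {
        long blocksInMemory = 70L * 16 * 16;            // 17,920 blocks stored per chunk
        long facesIfNothingCulled = blocksInMemory * 6; // 107,520 faces if every block face were drawn
        long facesOnFlatSurface = 16L * 16;             // ~256 top faces actually exposed on a flat chunk
        System.out.println(facesIfNothingCulled / facesOnFlatSurface); // ~420x gap between "loaded" and "drawn"
    }
}
```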
If the graphics and processing in Minecraft were nearly as intense as you're making them out to be, there's no way it would be able to run in a Java VM, which adds its own restrictions and overhead, at any kind of playable quality.
Do some research before making wild claims about how amazing the efficiency of minecraft is.
Funny how a single phrase flushes most of OP's calculations down the toilet.
There is, however, a lot of stuff unrelated to rendering that has to be done: plants (including grass) need to be checked for growth, mobs wander everywhere (pathfinding in a dynamic environment is difficult), thousands of blocks need to be loaded/unloaded as the player moves through chunks, etc.
Minecraft has to be CPU intensive by its very nature. Sloppy coding just makes it worse.
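Here's a heavily simplified sketch of that per-tick work; the types and method names are invented for illustration and are not Minecraft's actual code:

```java
import java.util.List;

// Heavily simplified sketch of per-tick CPU work. These interfaces and method
// names are made up for this example; they are not Minecraft's actual code.
public class TickSketch {
    interface Chunk { void applyRandomBlockTicks(); } // crop/grass/sapling growth checks
    interface Mob   { void updateAI(); }              // pathfinding around terrain players keep changing

    static void runGameTick(List<Chunk> loadedChunks, List<Mob> loadedMobs) {
        loadedChunks.forEach(Chunk::applyRandomBlockTicks); // every loaded chunk, every tick
        loadedMobs.forEach(Mob::updateAI);                  // every loaded mob, every tick
        // ...plus loading/unloading chunks as players cross chunk borders.
    }
}
```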
Minecraft is a voxel game and doesn't use vertices. You may want to do more research and then rewrite your post. A voxel is a point in space; in Minecraft that voxel is just scaled up. One of the issues with Minecraft's performance actually has to do with the fact that you're loading a texture onto a voxel. Only the sides in contact with air or water have their textures rendered.
World holes happen when the texture isn't applied to the voxel, which is why you can still walk into and onto them.
If we take a flat chunk, you have a 16*16 area, which means the game has to load a texture 256 times. If you then lift that flat chunk up by 1 block, all the edge blocks have an extra face showing and the corners an extra 2. That's an extra 64 textures to load, giving a total of 320 instances of a texture the game has to load for that chunk. If you were to add caves in that chunk, this number increases even more.
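To make those face counts concrete, this just recomputes the 256 and 320 figures from above:

```java
// Reproduces the counts above: a flat 16x16 layer shows 256 top faces;
// raising that layer by one block also exposes its 64 perimeter side faces.
public class FlatChunkFaces {
    public static void main(String[] args) {
        int topFaces = 16 * 16;       // 256 upward faces on a flat 16 x 16 layer
        int sideFaces = 4 * 16;       // 64 side faces around the raised edge
        System.out.println(topFaces);             // 256
        System.out.println(topFaces + sideFaces); // 320
    }
}
```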
So how many is this for the 21 x 21 chunk area with flat chunks? That's 112896 texture instances to load without caves (your superflat world). I'm not sure how many bytes a 16*16 image is, but all you need to do is multiply it by 112896 and that's how much information it needs to dump into your RAM.
This is why the textures for the blocks are 16*16 by default. If you go up to 32*32 textures, you are quadrupling the pixel data per texture, and the load on your PC/Mac goes up with it.
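To put a very rough number on that, assuming uncompressed 32-bit RGBA pixels and going along with the assumption that every visible face keeps its own copy of the texture (which a real renderer wouldn't actually do):

```java
// Rough memory figure under the post's assumptions: 256 visible faces per flat chunk,
// 441 loaded chunks, and one private copy of an uncompressed RGBA texture per face.
public class TextureMemoryEstimate {
    public static void main(String[] args) {
        long faceInstances = 256L * 441; // 112,896 visible faces across a flat 21 x 21 chunk area
        long bytes16 = 16L * 16 * 4;     // 1,024 bytes for an uncompressed 16x16 RGBA image
        long bytes32 = 32L * 32 * 4;     // 4,096 bytes: 32x32 has four times the pixels
        System.out.println(faceInstances * bytes16 / (1024 * 1024)); // ~110 MiB under these assumptions
        System.out.println(faceInstances * bytes32 / (1024 * 1024)); // ~441 MiB
    }
}
```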
Minecraft doesn't require a graphics card. This is because all this information doesn't need to be worked out, just loaded and saved to RAM.
The main performance issues in Minecraft are mobs, crops, lighting, and particles. These are things that have to be calculated every tick and can easily overload your CPU if there's too much going on in your loaded chunks.
Removing mobs increases your FPS by a good 30-40%, if not more. If your machine doesn't have to constantly track these mobs and whatnot, your game can apply textures to the blocks faster, which increases world loading speeds.
Then we get to the final yet most important performance factor: Minecraft is coded in Java, and Java is slower than the big daddy, C++. If Minecraft were programmed in another, more efficient language, performance would more than likely increase (as long as the programming wasn't bad).
It's ironic that the more holes there are in the world, the more it has to process.
I would say a large group of mobs slows my computer by, I don't know, 40%, and rain slows it by 80%. Rain is an animation, right? Is that why we can't turn it off even when we put particle effects on 'minimal'?
Speaking as an experienced game developer, Minecraft does indeed perform quite well given everything that it does. However, there are a lot of small parts of it that could undoubtedly be optimized a lot more.
I believe that only happens if you have Advanced OpenGL turned on. Some older machines won't support that.
What Advanced OpenGL is supposed to do is avoid rendering even the exposed surfaces that you can't see.
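Roughly, the idea looks like this; the names and flags below are invented for illustration, not the actual Advanced OpenGL code path:

```java
import java.util.List;

// Minimal sketch of occlusion culling: chunk sections whose faces are hidden behind
// other terrain are never sent to the renderer. The boolean flags stand in for a
// real frustum test and a GPU occlusion query.
public class OcclusionCullSketch {
    record Section(String name, boolean inViewFrustum, boolean hiddenBehindTerrain) {}

    public static void main(String[] args) {
        List<Section> sections = List.of(
                new Section("hillside in front of you", true, false),
                new Section("ravine behind the hill", true, true),
                new Section("terrain behind the camera", false, false));

        sections.stream()
                .filter(Section::inViewFrustum)
                .filter(s -> !s.hiddenBehindTerrain())
                .forEach(s -> System.out.println("render " + s.name()));
        // Only the hillside is drawn, even though the ravine's faces are exposed to air.
    }
}
```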
However, part of my point still stands about how much stuff is going on all the time.