That's complete and utter rubbish. A 6850 + i7 980 will give you about 33 fps in Crysis on high details at 1920x1080. Minecraft is running at way over 100 fps for you, and your monitor is most likely only capable of displaying 60 fps anyway (i.e. a 60 Hz refresh rate), so it's impossible to see the difference between 110 and 130 fps.
As for your laptop example, you have an Intel integrated GPU, which is terrible for OpenGL 3D graphics. Get one with a decent Nvidia or AMD GPU and you won't have a problem.
90% of the 'performance issues' people have with Minecraft seem to stem from trying to run the game on laptops that weren't designed for gaming and so only have a crappy Intel GPU. Any modern system with a discrete graphics card or decent quality Nvidia/AMD integrated GPU can play the game just fine.
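The refresh-rate point above can be shown with quick arithmetic. This is a rough sketch, not anything from the game itself; `visible_fps` is a made-up helper name:

```python
# A 60 Hz monitor refreshes every 1/60 s (~16.7 ms). Frames finished between
# refreshes are never shown, so anything above the refresh rate is invisible.
refresh_hz = 60

def visible_fps(rendered_fps, refresh_hz=refresh_hz):
    """The display can show at most one new frame per refresh."""
    return min(rendered_fps, refresh_hz)

for fps in (110, 130):
    print(f"{fps} fps rendered -> {visible_fps(fps)} fps actually displayed")
```

Both 110 and 130 come out as 60 visible frames per second on a 60 Hz panel, which is why the difference can't be seen.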
I am aware that the Crysis bit was ********; I put it out to see who would catch it. :wink.gif:
As for the laptop, I am aware of how **** Intel's integrated stuff is; however, it plays more graphically demanding games at a higher fps.
Most games today have lots of particles and 2D polygons with really fancy textures to make them look 3D.
Minecraft generates, renders, and calculates thousands upon thousands of blocks at any given time. Now, they may only be 16x16 textures, but it adds up. Plus all of the mobs and per-tick calculations to carry out game functions...
There is a whole lot going on in Minecraft. Quite a bit more than in some other modern games that are hailed as "computer intensive".
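The "thousands upon thousands of blocks" claim holds up to a back-of-envelope estimate. The numbers below are illustrative assumptions (chunk size from the game's known 16x16 footprint, a 128-block world height from that era, and a roughly 25x25-chunk loaded area for a far render distance):

```python
# Back-of-envelope estimate of how many blocks Minecraft keeps loaded.
# All numbers are illustrative: a chunk is 16x16 blocks, assumed 128 tall,
# and a far render distance is assumed to load about a 25x25 chunk area.
CHUNK_WIDTH = 16
CHUNK_HEIGHT = 128
LOADED_CHUNKS = 25 * 25

blocks_per_chunk = CHUNK_WIDTH * CHUNK_WIDTH * CHUNK_HEIGHT  # 32,768
total_blocks = blocks_per_chunk * LOADED_CHUNKS

print(total_blocks)  # roughly 20 million blocks tracked at once
```

Even if only a fraction of those ever reach the GPU, the game still has to track them, which is where the per-tick cost comes from.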
You make a decent point. I will give you that one.
The FPS Performance I've experienced in Minecraft has done nothing but increase as the game has developed. And I play on a laptop. A non-gamer friendly laptop. I don't know what to tell you.
I've been a Minecraft player since Beta 1.7, still my favorite game!
Twitter: @EvanLange7737
I'd be willing to sacrifice my peak FPS moments if it meant smoothing out the lag spikes as well. I'd rather have a game that's running at a consistently average rate of FPS than one that's really high with frequent drops.
Me, too! Even though my computer and graphics card aren't the greatest, Minecraft runs pretty well most of the time for me. The lag spikes can be killer, though (literally, if there are hostiles around).
A pretty big part of it is Java being weirdly coded. Also, I'm pretty sure Minecraft itself was never really coded with good FPS in mind.
But at LEAST we have options to decrease particles, delete clouds, etc etc.
Also, I do think that Notch gathering info from people would help them actually pinpoint what the issue might be with performance... but people are tin-foil-hat types about it. Mojang technically HAS your account details already, since they're probably stored in a secure file somewhere. I doubt there's much else they could take.
Minecraft is a voxel-based engine, and voxels are a much more graphically intensive way of rendering a game's visuals than polygon models alone. Most games use techniques like tessellation and bump mapping to make a model look more detailed than it really is; in Minecraft, on the other hand, every surface of every block is actual geometry. That is a lot for your graphics card to handle. Hope this was helpful.
Yeah, I think people forget this. They see low resolution textures and assume that it'll be an easy game to run.
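For what it's worth, voxel renderers usually avoid drawing every surface by culling faces buried between two solid blocks. This is a minimal sketch of that idea, not Minecraft's actual code; `visible_faces` and the grid layout are invented for illustration:

```python
# Minimal sketch of hidden-face culling in a voxel grid: a block face is
# only worth drawing if the neighbouring cell is empty (air) or past the
# edge of the grid. This is the standard trick for cutting rendered quads.
AIR, SOLID = 0, 1

def visible_faces(grid, x, y, z):
    """Count faces of the block at (x, y, z) that touch air or the grid edge."""
    if grid[x][y][z] == AIR:
        return 0
    faces = 0
    for dx, dy, dz in [(1,0,0), (-1,0,0), (0,1,0), (0,-1,0), (0,0,1), (0,0,-1)]:
        nx, ny, nz = x + dx, y + dy, z + dz
        in_bounds = (0 <= nx < len(grid) and
                     0 <= ny < len(grid[0]) and
                     0 <= nz < len(grid[0][0]))
        if not in_bounds or grid[nx][ny][nz] == AIR:
            faces += 1
    return faces

# A 3x3x3 cube of solid blocks: the centre block is fully buried.
grid = [[[SOLID] * 3 for _ in range(3)] for _ in range(3)]
print(visible_faces(grid, 1, 1, 1))  # 0 -- fully buried, never drawn
print(visible_faces(grid, 0, 1, 1))  # 1 -- only its outward face
```

So a solid mountain is mostly skipped; it's the exposed surface area (caves, forests, overhangs) that actually costs frames.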
This also depends on what you have created in your world. If there is a lot of redstone and piston machinery, it takes the game more resources to update each tick. If you press F3 you should see a chart of what the CPU cycles are being spent on. Maybe you have a mod that adds a lot of entity logic as well.
There is also a graphics setting to limit the fps. Make sure it's set to Fast or Balanced if your machine can handle it.
Also, you have to remember that Minecraft doesn't use multithreading, so for people like you who have several averagely fast processor cores working together instead of one really fast one (which is the most common setup for gaming computers today), Minecraft is only harnessing the power of one core. You mention OptiFine in your OP, but how many modules are you using? Only the first one, Smooth? Try using the module that adds multithreaded graphics; that should improve it.

On the other hand, I mainly play Minecraft on my MacBook on Fast/Normal graphics. I get about 20 FPS with occasional lag spikes due to my mods. I noticed that 1.8 took a huge FPS hit because of the new lighting code. 1.0 improved that by a lot, and I didn't really see any difference between 1.0 and 1.1.

About the whole voxel thing: you could try "Advanced OpenGL", which only renders things that are in view. It improves FPS by about 5-10 for me when I'm just walking around, but because it has to re-render things whenever you break or place a block, it lags horrendously when you mine, load chunks, or do anything but stand still and look back and forth.
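The "multithreaded graphics" idea boils down to doing expensive chunk rebuilds off the main thread so the frame doesn't stall. Here is a small sketch of that pattern; the names (`mesh_worker`, `rebuild_queue`, the chunk IDs) are all invented for illustration, not OptiFine's or Minecraft's own:

```python
# Sketch of offloading chunk-mesh rebuilds to a worker thread: the main
# (game) thread queues dirty chunks and keeps going, while the worker does
# the expensive work in the background.
import queue
import threading

rebuild_queue = queue.Queue()
finished = []

def mesh_worker():
    while True:
        chunk = rebuild_queue.get()
        if chunk is None:                      # shutdown sentinel
            break
        finished.append(f"mesh({chunk})")      # stand-in for expensive meshing
        rebuild_queue.task_done()

worker = threading.Thread(target=mesh_worker)
worker.start()

# Main thread: queue dirty chunks instead of blocking the frame on them.
for chunk_id in ("chunk_0_0", "chunk_0_1", "chunk_1_0"):
    rebuild_queue.put(chunk_id)

rebuild_queue.join()     # wait here only so we can show the results
rebuild_queue.put(None)
worker.join()
print(finished)          # all three chunks meshed off the main thread
```

In a real renderer the finished meshes would be handed back to the GL thread for upload, since most OpenGL calls must happen on one thread.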
Thoughts? Suggestions? Random acts of petty fanboyish hatred?
I play on an Alienware-built i5 750 with an ATI Radeon HD5870, 6GB of RAM.
Back in the earlier stages of Minecraft I ran it at around 170 or so FPS on average, sometimes up to even 250+. In 1.7.3 it was very common for me to be getting 150-180 FPS. After 1.8 hit, I was running at just above 100, and now in the full release I average about 80-100 FPS.
Now of course this is a very playable frame rate, but I would still like those 150+ frames for much smoother play.
I also noticed that Fancy graphics with large amounts of trees around drop the frame rate by nearly half.
From my experience I can narrow the cause of this bad performance down to the terrain generation, which changed a great deal between 1.7.3 and 1.8. With all the extra detail the game has to render, plus the mass of transparent objects like the cobwebs in mineshafts and the thick forests, it's very stressful on the GPU, because even what you can't see is still being rendered.
OG Member of BrenyBeast's private BeastCraft Server
Well, I have a 4-year-old desktop with only 2GB of RAM running Windows XP, and I can run a 128x texture pack at around 45 fps, which is actually 5 fps faster than my monitor can output, so it always plays nice and smooth for me :smile.gif:
I don't see the point of people complaining when they have over 100 fps; it's just ridiculous, to be honest. That's most likely faster than your monitor can display, so you wouldn't notice it being faster or slower anyway.
Advanced OpenGL, when turned on, will prevent the game from rendering unseen objects and voxels.
Wow, what monitor are you using? Must be a really good one to be able to display over 150 FPS.
I'm fairly certain no make of CPU runs at 12.8 GHz.
On topic: yeah, Minecraft has gotten slower for me. But I do tend to run it with an ass-load of mods...
There are more Windows-based laptops on the market, though :/
As far as I can tell it only prevents rendering unseen chunks. Either way there's no difference in FPS for me with it on or off.