In the thousands? Overkill much? You should be playing Crysis 1 or 2, not this, lmao. I get around a base of 30 FPS with Fast graphics and Far view distance. It's actually worse if I set it to anything lower than Normal.
Just for kicks, I can get ~2600 with settings on low...
And on average with everything maxed out and the smooth lighting, ~500 while running around
As for the person saying you can't tell above 30 FPS... what are you, blind? Why else are TVs going to 240Hz? People can tell the difference. Retard.
And Justin, with the Black Ops comparison? No. The game has complete crap graphics (I'm serious, they are garbage). I play Crysis 2 with all specs maxed and vsync at 60 FPS, and it doesn't drop. Way to make a failed comparison.
This was posted on another thread, but I thought I'd throw it in here too. Someone figured out that if you go into your .minecraft folder and delete all the screenshots you've taken, it gives you more FPS. I went from an average of about 80 to 100 after I deleted the 7 or so screenshots I had in my folder.
All tests done at 1080p.
Standard computer screens have a refresh rate of 60Hz, so no, you cannot see the difference on your computer. As for super-high-refresh-rate screens, well, you can see bright-to-black image flashes easily into the several hundreds of FPS, but when is a game/movie/TV show going to be perfectly bright white? Seeing as Minecraft doesn't have anything pure white in it, an FPS over about 120 (if your screen is even capable of it) is really unneeded. But yes, your eye can easily see FPS WAY over 30 with any picture that's not dark.
It's interesting for me. On Windows 7, I got around 200-500 FPS, and on Ubuntu, 500-800 with Fancy and Far graphics while normally playing. Now that 1.3 is out, I get about 140 FPS always. I did have the BetterLight mod on before, and my framerate is not locked. Do I care? Hell no.
...It just seems that chunk updates since 1.3 kill mine... Might consider looking at a RAMdisk...
Don't bother. I have 24GB of RAM, and a RAMdisk slows my performance down (I created a 4GB RAMdisk, I had plenty to spare). The reason for that is probably that Windows already puts MC into RAM, so when it tries to retrieve and write, it is using the same bus (essentially cutting the max speed in half... if it ever reaches it). Then again, I'm talking about an FPS drop of maybe 30-40 out of 500... so perhaps for you it will push it up, if your HDD is the issue (doubtful; with the new update, chunk loading seems a bunch smoother).
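If anyone wants to sanity-check whether the HDD is even the bottleneck before messing with a RAMdisk, here's a rough little timing sketch. This is just my own throwaway code, not anything from Minecraft, and the file paths are made up: point them at a real region file in your save and at a copy of the same file on your RAMdisk.

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

public class ChunkReadTimer {
    public static void main(String[] args) throws IOException {
        // Hypothetical paths: replace with a real region file and its RAMdisk copy.
        Path onHdd = Paths.get("saves/World1/region/r.0.0.mcr");
        Path onRamDisk = Paths.get("R:/region/r.0.0.mcr");

        System.out.println("HDD:     " + readMillis(onHdd) + " ms");
        System.out.println("RAMdisk: " + readMillis(onRamDisk) + " ms");
    }

    // Read the whole file once and report how long it took.
    private static long readMillis(Path file) throws IOException {
        long start = System.nanoTime();
        byte[] data = Files.readAllBytes(file);
        long elapsed = (System.nanoTime() - start) / 1_000_000;
        System.out.println(file + " -> " + data.length + " bytes");
        return elapsed;
    }
}
```

If the two reads come out nearly identical (likely, since the OS will have the file cached in RAM anyway), the RAMdisk isn't going to buy you anything.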
ApexProxy, agreed. The last study I read said that humans can detect a ~260Hz bright flash after being adjusted to a dark room with very little to no light (even if said bright flash is a full image, they can detect features/parts of the image). Again, this isn't a constant 260Hz, just a flash. But I think we should consider that it's motion in game that we are looking for... so the difference between 60-120-240Hz is noticeable, at least to some people, especially out of the corner of one's eye. The concentration of rods in the outer parts of the eye allows for higher-frequency light/dark (flicker) detection, which gets overwhelmed if one looks directly at something. Try being in a dark room and compare looking at something from the corner of your eye with looking straight at it: the object will be brighter when seen from the corner of your eye. The same property allows for higher-frequency detection (also at lower levels of light overall).
It's only noticeable if you have a screen that can display it at that refresh rate. Anyone with a screen that runs at a 60Hz refresh will not notice any difference over 60 FPS... if vsync is on; otherwise you may have frames updating out of sync with your monitor's refresh. In that case 60+ could look smoother, only because it gives more chances to display a frame in sync with your screen.
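Here's a toy back-of-the-envelope model (purely my own sketch, nothing to do with Minecraft's code) of how stale the frame a 60Hz screen grabs tends to be when the render loop isn't synced to the refresh. It only illustrates why rendering above 60fps can still shave a little latency even though the screen shows just 60 images a second:

```java
import java.util.Random;

public class RefreshStaleness {
    public static void main(String[] args) {
        // A 60Hz screen shows a new image every ~16.7 ms. If the render loop
        // isn't synced to the screen, the frame grabbed at each refresh is,
        // on average, about half a frame-time old. Estimate that by sampling
        // a random offset between "last frame finished" and "refresh happened".
        Random rng = new Random(42);
        int samples = 100_000;
        for (double fps : new double[] {30, 60, 120, 300}) {
            double frameTime = 1000.0 / fps;   // ms between rendered frames
            double totalAge = 0;
            for (int i = 0; i < samples; i++) {
                totalAge += rng.nextDouble() * frameTime;
            }
            System.out.printf("%.0f fps -> displayed frame is ~%.1f ms old on average%n",
                    fps, totalAge / samples);
        }
    }
}
```

Roughly: ~17 ms stale at 30fps, ~8 ms at 60fps, ~2 ms at 300fps. Still only 60 distinct images per second on the screen either way.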
Before the 1.3 update I was getting about 100+ FPS on Far and Fancy.
Now, thanks to the 1.3 update, it's about 35 FPS on Far and Fancy (no Smooth Lighting, because it decreases FPS, and it looks creepy anyway in my opinion).
Oh yeah, and always with F11 (fullscreen = moar FPS). I also updated my drivers just today.
That's the only thing I hate about this update. I am cool with beds and everything else. In fact, I could live with these FPS... if it weren't for the fact that apparently 90% of people have gotten a BOOST in FPS rather than a decrease... it's just not cool...
Also, what's the most important thing to have for Minecraft? CPU? RAM?
I thought it was assumed that a 60Hz screen could only display that much. I only make the argument for those of us playing on higher-Hz screens (plasma screens have been at 480Hz for a while now). No, I don't own a plasma. I wish. Anyway, yes, I stick with vsync. It is definitely better to limit the screen to smooth, visible frames (provided one already has sufficient FPS).
Well, I would say Graphics Card > RAM > CPU (but depending on how fast your CPU and RAM are, you might swap those two). imo
My FPS averages around 75, and that's normal for most games. But after watching it for a while, it's spiked into the thousands, and it fluctuates between 50-250 when I'm walking about.
Yeah. Not that noticeable over 30fps :tongue.gif: Even 20!
There is quite a large, noticeable difference between 30fps and 60fps, and even between 60fps and 100fps.
Tip: If you really want good video output, turn on vsync. If your PC is capable, you will be locked at either 60fps with a 60Hz LCD or 120fps with a 120Hz LCD. 60Hz is the most common.
Being locked at 60fps with vsync will remove video tearing (oh god, I hate tearing) and is far better than attempting to increase your e-penis with 99999999 FPS; there's absolutely no point.
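For anyone curious what that looks like in code: Minecraft is built on LWJGL, and in LWJGL 2 it's basically one call. This is only a standalone illustrative sketch, not how you'd change the game itself:

```java
import org.lwjgl.LWJGLException;
import org.lwjgl.opengl.Display;
import org.lwjgl.opengl.DisplayMode;
import org.lwjgl.opengl.GL11;

public class VsyncDemo {
    public static void main(String[] args) throws LWJGLException {
        Display.setDisplayMode(new DisplayMode(800, 600));
        Display.setTitle("vsync demo");
        Display.create();
        Display.setVSyncEnabled(true);  // presentation now waits for the monitor's refresh

        while (!Display.isCloseRequested()) {
            GL11.glClear(GL11.GL_COLOR_BUFFER_BIT);
            // ... draw your scene here ...
            Display.update();           // swap buffers; with vsync on this blocks until the next refresh
            // Display.sync(60);        // alternative: a plain frame limiter NOT tied to the refresh
        }
        Display.destroy();
    }
}
```

With setVSyncEnabled(true), the buffer swap inside Display.update() is what actually waits for the refresh; Display.sync() just sleeps to cap the loop rate, which is why the two behave differently with respect to tearing.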
You have a fundamental misunderstanding here.
Vsync will stop the video card (or the CPU, if rendering isn't being offloaded) from page-flipping the current frame to the next complete frame available in the drawing space when it is already doing so at <insert screen Hz here> frames per second. You won't magically get "locked" to anything, but you will be limited to it. It has no impact at all if you're unable to reach it.
Secondly, as long as the person sending data to the screen has ANY idea about blitting, syncing and backbuffering, you will not get tearing. Ever.
I would also hesitate to say that the difference between 60 and 100fps is "large" as well. But this is harder to quantify.
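Here's a minimal sketch of what "backbuffering plus flipping a complete frame" means, using plain Java's BufferStrategy. Again, just an illustration of the mechanism, not anything Minecraft actually does:

```java
import java.awt.Canvas;
import java.awt.Color;
import java.awt.Dimension;
import java.awt.Graphics;
import java.awt.image.BufferStrategy;
import javax.swing.JFrame;

public class PageFlipDemo {
    public static void main(String[] args) throws InterruptedException {
        JFrame frame = new JFrame("page-flip demo");
        Canvas canvas = new Canvas();
        canvas.setPreferredSize(new Dimension(640, 480));
        frame.add(canvas);
        frame.pack();
        frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
        frame.setVisible(true);

        canvas.createBufferStrategy(2);             // front buffer + back buffer
        BufferStrategy buffers = canvas.getBufferStrategy();

        int x = 0;
        while (true) {
            Graphics g = buffers.getDrawGraphics(); // always draw into the hidden back buffer
            g.setColor(Color.BLACK);
            g.fillRect(0, 0, 640, 480);
            g.setColor(Color.WHITE);
            g.fillRect(x = (x + 2) % 600, 220, 40, 40);
            g.dispose();
            buffers.show();                         // flip: only a complete frame ever becomes visible
            Thread.sleep(16);                       // crude ~60fps pacing for the demo
        }
    }
}
```

The point of the pattern is that the visible surface only ever receives finished frames; whether the flip itself also waits on the monitor's refresh is up to the driver/toolkit.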
*stuff about how he is better, how 40 isn't a real number and how vsync requires a CPU/GPU to do less work*
Have you ever programmed something, used a computer, or played a game? You do understand what vsync is, right? It isn't actually preventing the GPU from doing work. In fact, it causes the GPU to do extra work by keeping sync and holding a steady framerate, in many cases lowering FPS overall (so if someone was getting ~50 FPS, they might end up with ~45, etc.). Further, for someone with a video card plenty capable of handling "thousands of frames per second", those frames are still written into the video buffer... the buffer just hasn't been (and won't be) flipped into view... vsync simply prevents the buffered video from being displayed at higher than a steady (or max) rate. So essentially, all of those thousands of frames are still rendered, placed into the buffer, then discarded like a cheap ***** while the vsync timer is still counting down. You think the GPU just halts when vsync is enabled? It doesn't; how else would it buffer without a huge software rewrite?
And if someone knows about blitting/syncing/backbuffering? What? So someone (Notch, in this case) has to design a software solution that runs in a virtual machine (the JVM), works across operating systems and a huge variety of hardware, and coexists with numerous other applications hogging system resources? No? That's what enabling vsync is for. It's written into the driver settings... so that people don't actually have to worry about it in a desktop environment.
On a final note, the CPU has nothing to do with vsync; never mention it again. The CPU doesn't handle visual output, ever. Seriously, every system has at least a designated set of memory and/or a visual core/library for handling output. Just in case you try to be a dumbass and state that the CPU can do software visual whatever: that is processing, it is NOT output. There are no pins that connect your CPU to VGA/DVI/HDMI/etc.; it doesn't have output to a display.
Yes puddi, I know who you are. IRC has been nice and peaceful lately no thanks to you. Found a new server to spam? :biggrin.gif:
I'm so proud... -.-
Fancy, Far, AO, I'm getting 40-50fps. It'll get up to 90+ if I stand still. Any number of chunk updates though and it just dies back to 30 or so.
Betterlight never hurt my fps this bad.
Turn AO off... If I stand still, I get 80-160fps, seemingly randomly. Move much at all, and it's back to 30. Chunk updates seem irrelevant to it.
Tiny, Fast, no AO: I get 400fps. Walking enough to cause 80+ chunk updates drops it to 300fps.
Fancy on Tiny is 300fps, minus however many chunk updates (100 chunk updates is roughly 200fps).
Fancy on Far is 50-100fps.
It just seems that chunk updates since 1.3 kill mine... Might consider looking at a RAMdisk
6870
i5 2400 (4x3.06GHz)
4GB RAM
I love my laptop :3
Umm... given. Of course. :happy.gif: