Something changed between 12w snapshots and 1.3 prerelease, and maybe that helps some people and not others. One thing I noticed is that in 12w snapshots the client was still single threaded and would max out one core. 1.3 PR seems to be multithreaded (multiple active java processes) and maybe even throttled in Balanced mode on a fast computer.
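One way to sanity-check the single- vs multi-threaded observation above (on Linux) is to count the client's threads and watch per-core load. A rough sketch, assuming the game shows up as a process named `java` (adjust the name if your launcher differs):

```shell
# Rough check of how many threads the Minecraft client is running with.
# Assumes the game appears as a process named 'java' - adjust if needed.
pid=$(pgrep -n java)      # newest matching 'java' process
ps -o nlwp= -p "$pid"     # NLWP = number of threads in that process

# Per-core load, to see whether one core is pegged or the work is spread out.
# Prints 3 reports at 2-second intervals; needs the 'sysstat' package.
mpstat -P ALL 2 3
```

A single-threaded client will show one core near 100% in mpstat even when NLWP is greater than one (the JVM always has housekeeping threads), so watch the per-core numbers rather than the thread count alone.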
For an i5 650 3.2 GHz 2-core with hyperthreading (4 cores to any OS), 8 GB, GTX 550 Ti, 64-bit Linux, Fancy, Balanced, Normal view: the 12w snapshots maxed out one core at generally 150-250 fps, occasionally in the 300s to over 400. 1.3 PR seems to run consistently around 125 fps, none of the cores is maxed, and the GPU runs somewhat cooler. Either might dip into the 70s when first loading, but that is not an issue for a 60 Hz 1080p HDTV.
AMD C-50 1 GHz 2-core tablet PC (netbook size), 2 GB, ATI HD 6250 graphics, 10.1" 1280x800, 64-bit Linux, Fast, Max FPS, Normal, Particles: Decreased, everything else ON except Advanced OpenGL, runs 1.2.5 at a somewhat jerky 15-20 fps. Both recent 12w snapshots and 1.3 PR run a smoother 20-30 fps, and 1.3 PR seems to make better use of both cores to reduce the speed drop while loading chunks (Linux is booted and running from an SD card slower than most hard drives).
I know that this may not go for everyone, but one problem I had was with the current Nvidia drivers and EVGA Precision software 3.0.2 and 3.0.3. The video cards (depending on whether you run multiple cards) underclock themselves when not under heavy or demanding use, and on rare occasions don't clock back up to their normal 763 MHz; they stay at 50 MHz. So when I started the game it was laggy, and coincidentally I had just installed Minecraft 1.3. I started playing again after I uninstalled 1.3 and put 1.2.5 back, and it was still laggy. So I checked EVGA Precision and my card wasn't even reporting normal clocks. I restarted the computer, installed 1.3 again, and everything worked flawlessly.
So, if any of you have Nvidia cards and use the EVGA software, I'd say check that out. You never know.
i7-2600 CPU (quad core, 3.4 GHz)
16GB RAM
MSI GeForce GTX570 Twin Frozr III OC Edition
MSI P67A-GD55 Motherboard
OS: Windows 7 Ultimate 64-Bit
I usually play on 'full' settings - far render, smooth lighting, etc.
My FPS in 1.1:
My FPS in 1.2.5:
Though normally in 1.2.5 I'd be around 500 when there was no chunk generation going on.
In 1.3 I'm getting an average FPS of 30-40, even after increasing the allocated RAM to 8GB, on fast graphics, particles minimal, render distance normal (the figures above are all on Far), smooth lighting off, performance on Max FPS - both in SP and SMP.
I understand that single player is now effectively running a server and that some FPS drop is expected - but from 500 to 30? Seriously? There's something going on here. My boyfriend has an almost identical rig to mine, the only difference being the motherboard, and he's getting a steady 150fps. If I play on Linux (instead of, as in the screenies above, Win 7 Ultimate 64-bit), I get about 80-100fps. I have friends on PCs with half the specs of mine and they're getting 200fps.
Hardware-related issue?
Edit - some additional information: updated Java (32- and 64-bit), updated graphics drivers, removed and re-installed a fresh Minecraft, overclocked the PC (no details, just the magic 'overclock' button on the MoBo), raised the Java process priority from Normal to High, changed allocated RAM from the default to 8GB.
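For anyone trying the RAM change by hand rather than through the launcher: the heap allocation is set with JVM flags. A typical hand-launched command looks something like this; the jar path/name is illustrative, not from this thread, and will vary by setup:

```shell
# Illustrative launch command - the minecraft.jar path/name will differ
# on your machine. -Xms/-Xmx set the initial and maximum Java heap
# (here 1 GB to start, 8 GB cap, matching the 8GB mentioned above).
java -Xms1G -Xmx8G -jar minecraft.jar
```

Note that -Xmx only raises the ceiling; if FPS doesn't change, the game probably wasn't heap-limited in the first place.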
1000 FPS!?!?! Mine increased, by the way, and this computer hasn't been upgraded since 2005. It's now 40-70, up from 20-30.
Yeah. That's why I'm thinking there's no way I should have dropped down to 30 now. But like I said, I have friends on dual-core processors, gfx cards that are several years old and 8GB RAM that are getting 200fps on 1.3 on full settings. :/
Ah. As I suspected, both Smooth Lighting and Fancy graphics are turned off.
Regardless, Crystal's system specs are significantly lower than mine and yet when I run MC on 'low' settings, I'm still getting 30fps where she's getting 200.
I've never turned smooth on, a holdover from my days with a laptop where my max fps was 12 with Optimine/Optifog. Must have left Fancy off while testing before. Will check and post another. Finding the dichotomy interesting!
Honestly I have no solid idea. There's a lot of variables involved between everyone's PC and how well they're running the software installed on them.
You should. It looks like you have the breathing room for it.
Indeed. I'm hoping a lot more people will post their system specs, MC setup and resulting framerates between versions so that we can narrow something down.
Still doesn't account for the issues others are having, folks with specs MUCH greater than mine.
So many environmental and hardware variables, it's a combination of things in most cases. As Pixie stated above her BF has nearly the same PC hardware wise and is getting 150fps.
I have an i7-860 @ 2.8 GHz, 4 GB of RAM, a GTX 470, Windows 7 64-bit Ultimate and 64-bit Java. I have not told Java to launch Minecraft with any extra RAM.
1.3.1 performs very well. I'm not using mods of any kind, and the default texture pack ATM. Far view distance, Moody brightness with smooth lighting, Advanced OpenGL not enabled.
I have not had any discernible increase in FPS, and my computer is only TWO YEARS OLD! I don't want to get a new computer!
That's slightly comforting.
Intel Core 2 Duo CPU (E7400 @ 2.8 GHz)
8GB RAM
Nvidia GTX 270 OC
OS: Windows 7 Home Premium 64-bit
Hope the hardware info may help narrow down the issue. Pix (and others I've heard from) should be getting plenty more fps than I'm getting.
Edit - And one of them is CrystalCrow, above
The first image has nothing on screen while the second does. That explains a fair part of that dichotomy.
At max settings? I can't believe that.
Well, for clarification's sake then, in 1.1 my standard FPS on 'full' settings was 500-700 when moving, 700-1100 when not doing much.
Believe it.
The window is larger than 20" on a 32" LCD TV.
Edit: Bah. Too late.
by c0yote
I tried it with terrible results. I gave my wife my glasses for a second, a creeper showed up and now my wife is pregnant.
Stupid 3D..