With the release of Catalyst Beta 12.7, AMD appears to be beating the GTX 680 in many titles. Can't wait to see what Nvidia's response is! http://www.tomshardw...rk,3232-19.html
In fact, according to that testing, there was only one game it consistently outperformed the 680 in. I imagine Tom's Hardware's unreleased findings were made using artificial benchmarks (things like FurMark, 3DMark, and other mostly pointless tools) rather than actual games.
The 7970 GHz Edition idles at the same temp as a 690; its load temp is higher than the others except the 590 and 6990. At idle it is 3-14 watts under the 680 and 670, but under load it is 43 watts over the 680.
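For scale, that 43 W load delta can be turned into a rough yearly energy cost. This is just a back-of-the-envelope sketch: the hours per day and electricity price are assumptions, not numbers from any review.

```python
# Rough yearly cost of the 7970 GHz Edition's extra load draw vs. the 680.
# Only the 43 W delta comes from the review; the rest are assumed values.
extra_watts = 43        # load power delta (W), from the numbers above
hours_per_day = 3       # assumed daily gaming time
price_per_kwh = 0.12    # assumed electricity price ($/kWh)

kwh_per_year = extra_watts * hours_per_day * 365 / 1000
cost_per_year = kwh_per_year * price_per_kwh
print(f"{kwh_per_year:.1f} kWh/year, ${cost_per_year:.2f}/year")
```

Under those assumptions it works out to roughly 47 kWh and under $6 a year, so the power gap is more a heat/noise concern than a money one.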
“These are the voyages of the starship Enterprise. Its continuing mission: to explore strange new worlds, to seek out new life and new civilizations, to boldly go where no one has gone before.” -Gene Roddenberry
Look at the high resolution benchmarks, guys. The 7970 (usually the GHz Edition, but sometimes the vanilla card) consistently comes in ahead of the 680 at 2560x1600.
2560x1600 is really the only resolution at which the 680 and 7970 should be compared, as they're both rather overpowered for anything less than that.
Sarcasm? Nvidia does struggle with high-res (above 1920x1200) and multi-monitor.
Your point is null. Most gamers don't use these cards.
The number of people who use high-end graphics cards is a lot larger than the number who use high-res monitors.
Just so you know, the share of people who use 1200p+ is 1.2%, based on the Steam hardware survey.
Well, at low resolutions, the difference between the cards is hardly noticeable, as they both provide very playable framerates. The only real way to see the full potential of the 7970 and 680 is to actually use a high resolution monitor.
Take BF3, for example. At 1680x1050, all the 7970s pull above 78 FPS average on the Ultra preset. The GTX 680 performs higher, but it's obviously not noticeable: it doesn't quite reach the 120 FPS needed for 3D Vision, but it doesn't go below 60 FPS. Without actually seeing the framerate, you probably wouldn't be able to tell the difference on your run-of-the-mill 60Hz display.
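A quick way to see why the gap is invisible on a 60 Hz panel is to convert the framerates mentioned above into frame times; a minimal sketch:

```python
# Convert FPS to milliseconds per frame for the rates discussed above.
def frame_time_ms(fps):
    return 1000.0 / fps

for label, fps in [("60 Hz refresh", 60), ("7970 in BF3", 78), ("3D Vision target", 120)]:
    print(f"{label}: {fps} FPS -> {frame_time_ms(fps):.1f} ms/frame")
```

On a 60 Hz display, anything above 60 FPS just means a fresh frame is ready for every 16.7 ms refresh, so 78 FPS and, say, 90 FPS end up looking identical.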
It is still noticeable, but at the moment most games have been out for a while. As more next-gen games come out, we will see more demand on the cards, especially in the next few years with new consoles.
At 1080p with only 4x AA it's only getting 78 on average; it's still going to dip.
According to PCPartPicker, every monitor at that res is over $1K and has a horrible response time:
The dip will not be very noticeable, though. I highly doubt it would drop below ~50 FPS.
And let's be honest here. Why the **** would you spend $500 on a GPU with a 1680x1050 monitor?
Also note that that is in singleplayer; MP is a whole other ballgame. I can get a solid 75 with my 6970, but I have dips into the 30 FPS range.
I am talking about 1200p area here.
My 560 Ti 448-core delivers more than playable performance (50-100 FPS depending on the scene) in 64-man MP, with almost everything maxed out at 1680x1050 (2x MSAA instead of 4x, though). The lowest I've ever seen it dip was about 35 FPS, but it's very rare for that to happen.
The point is, the 7970 really shows its true potential against the 680 when its additional memory bandwidth becomes useful. For high resolution gaming, I would for sure choose a 7970 over the 680.
And keep in mind that the benchmarks didn't have anything about Eyefinity. My guess is that the 7970 would beat the 680 by even more when it comes to that, given its memory bandwidth advantage and the fact that it's rendering 6,220,800 px (5760x1080) rather than the 4,096,000 px you get with 2560x1600.
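Those pixel counts are easy to sanity-check:

```python
# Pixel counts for the two "super high res" setups being compared.
surround = 5760 * 1080   # triple 1080p Eyefinity/Surround
wqxga = 2560 * 1600      # single 30" 2560x1600 panel

print(surround, wqxga, round(surround / wqxga, 2))
```

Surround pushes about 52% more pixels than 2560x1600, which is why the memory bandwidth gap should matter even more there.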
How "high-res"? My 570 can easily play BF3 on ultra on a populated map at 1080p. I think it is around 45 FPS with V-sync. Also, if you're talking 2560x1600, then response time will kill your game, and your wallet:
2560x1600 isn't the only "super high res" there is. You're forgetting about Eyefinity/Surround. See above post.
I know Nvidia lags in multi-monitor. Not sure how many people do multi-monitor, but to be fair, Nvidia only just started to support multi-monitor on single-GPU configs. To be honest, I would never do it because of the bezels (the plastic/metal edges of the screens).
I'm talking about multi-monitor as if it is just a higher resolution, because that is essentially what it is for the most part. I would say Nvidia tends to fall behind at high resolutions, maybe due to the lower memory bandwidth available on their cards.
I would guess that there are a lot more Eyefinity/Surround users than 2560x1600 users, as three 1080p monitors are a couple hundred dollars cheaper than one 2560x1600 monitor on average. Personally, I would prefer an Eyefinity/Surround setup, as it would give much better immersion in FPS games.
Same reason people will create $1000+ builds and just use the power supply that came with the case; also, a $500 graphics card is cheaper than a $500 graphics card plus a monitor.
The GHz Edition scores more wins, as I posted about a month ago, benchmarks and all.
Good thing most gamers use that resolution.
Good thing most gamers don't use these cards.
Nvidia does have a harder time at high resolutions, but as I said above, the number of people affected is minor.