Same reason people will create $1,000+ builds and just use the power supply that came with the case; also, a $500 graphics card is cheaper than a $500 graphics card plus a monitor.
Well yes, it's cheaper, but then why would they even need the graphics card? You could get by with something in the $130-$300 range on low-resolution displays.
I would guess that there are a lot more Eyefinity/Surround users than 2560x1600 users, as three 1080p monitors are a couple hundred dollars cheaper on average than one 2560x1600 monitor. Personally, I would prefer an Eyefinity/Surround setup, as it would give much better immersion in FPS games.
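To put rough numbers on what a triple-screen setup asks of the GPU, here's a quick pixel-count sketch (a minimal illustration using the standard resolutions mentioned here, not benchmark data):

```python
# Pixel-count comparison: triple-1080p Eyefinity/Surround vs. a single
# 2560x1600 panel. The GPU renders every pixel each frame, so the
# ratio is a rough proxy for relative GPU load.

eyefinity_px = 3 * 1920 * 1080   # three 1080p panels: 6,220,800 px
single_30in_px = 2560 * 1600     # one 30" panel:      4,096,000 px

print(f"Eyefinity/Surround: {eyefinity_px:,} px")
print(f"2560x1600:          {single_30in_px:,} px")
print(f"Ratio: {eyefinity_px / single_30in_px:.2f}x")  # ~1.52x the pixels
```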
My 1.2% figure covers all resolutions above 1200p. And I disagree: Eyefinity bezels ruin all immersion.
A $300 card may handle current games, but what about Crysis 3 or Arma 3? I would also bet that BF3's Armored Kill, with its massive maps, is going to be a lot more demanding than the earlier maps.
You're right. But I don't see why we're arguing about whether we'll need something as powerful as the 680 or 7970 at a lower resolution. I was just saying that the 7970 really shines at high resolutions, and it should be given credit for it. To be honest, it would be incredibly stupid for the developers to make games unplayable without a $500 card.
Maybe we should just come to the conclusion that if you're gaming at high resolutions, go for the 7970. If you're not, go for the 680.
And by the way, I don't see any guarantee that Crysis 3 will be any more demanding than 2. It's probably just going to be another console port, unless it's coming out for the new consoles next year. But that's a discussion for another thread...
The dip will not be very noticeable, though. I highly doubt it would drop below ~50 FPS.
And let's be honest here: why the **** would you spend $500 on a GPU with a 1680x1050 monitor?
Future proofing? If a $500 GPU is going to be able to run next-generation games, guess who won't need to upgrade when the time comes... that's right, the people who bought the high-end cards.
And the people who bought the $300 cards will be able to get another $300 card and quite possibly do just as well in next-generation games, too.
This is why "future proofing" is a joke.
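For what it's worth, the two strategies come down to simple arithmetic (a sketch using this thread's own dollar figures; how each tier of card actually performs in future games is the real unknown):

```python
# Back-of-the-envelope comparison of the two upgrade strategies over
# one upgrade cycle. Dollar figures come from this thread; real-world
# performance in next-gen games is the genuine unknown.

high_end_once = 500            # one $500 card, kept through next gen
mid_range_twice = 300 + 300    # a $300 card now, another in 2-3 years

print(f"High-end once:   ${high_end_once}")
print(f"Mid-range twice: ${mid_range_twice}")
# The second $300 card would be a newer generation, so for $100 more
# in total you may end up faster in next-gen titles than with the
# aging $500 card.
```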
I'm really beginning to see this argument get more and more pointless. The goal of this thread was simply to point out that AMD does in fact perform better at high resolutions than Nvidia, not to discuss how many gamers have high-resolution setups, and definitely not future proofing.
A+ = Passed every test they ran. These panels are top-grade: zero dead pixels, minimal bleeding, no issues.
A = Lower grade. Some defects.
A- = Lowest grade. More defects.
A+ panels are sold at full price to Apple, Dell, etc. A and A- panels are sold for next to nothing to third-party builders, who turn them into these cheap monitors and sell them for very little. If there weren't demand for these Catleap/PCBank panels, the reject A/A- panels would most likely be thrown out or recycled.
Now, I still think they are a good value for the money. However, it is highly illogical to compare an A+ panel to an A/A- one. Think of it as comparing a brand-new Ferrari to a heavily used Ferrari with scratched paint, dinged doors, a worn-out engine, and damaged parts. It isn't a perfect comparison, but it's the best I can do. The second Ferrari works fine, but it doesn't look as good and won't function as well.
Note that that particular monitor uses an A- screen, allows for up to 5 dead pixels, has an AC adapter that is incompatible with North American power standards, and its response time is only 1ms lower.
$200 shipping... fail
The Samsung one has a 5ms response time.
The closest thing to future proofing I can think of is to buy mid-range gear now and more mid-range gear every 2-3 years.
Exactly what I'm saying.
7ms is very good for an IPS screen.
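For context on why single-digit response times are fine, compare them to how long one frame stays on screen (a quick sketch assuming the usual 60Hz refresh rate):

```python
# Frame time at 60 Hz vs. typical IPS pixel response times.

refresh_hz = 60
frame_time_ms = 1000 / refresh_hz   # ~16.7 ms per frame at 60 Hz

for response_ms in (5, 7):
    share = response_ms / frame_time_ms
    print(f"{response_ms}ms response = {share:.0%} of one frame")

# 5ms is ~30% and 7ms is ~42% of a frame: both pixel transitions
# finish well within a single refresh at 60 Hz.
```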
Time for me to repost what I wrote up on these...
Note that that particular monitor uses an A- screen, allows for up to 5 dead pixels, has an AC adapter that is incompatible with North American power standards, and its response time is only 1ms lower.
16:9 isn't as desirable for web browsing and other things for many people.
DAMN that's cheap(er)
It's probably a similar situation to the 2560x1600s that Battlekid was talking about.
Is the difference with 16:9 and 16:10 really that big?
Yes, both my laptop and my desktop are 16:10, and the difference is definitely noticeable.
Elaborate, please?
Noticeable how?
The extra vertical pixels.
Okay
It's much better for web browsing and some work stuff. It's also good for games. Best of both worlds, IMO.
Oh ok.
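To put numbers on the 16:9 vs. 16:10 difference, here's a quick sketch using the common 1920-wide resolutions (an illustration, not from anyone's post):

```python
# Extra vertical space from 16:10 over 16:9 at the same 1920 width.

height_16_9 = 1080    # 16:9  at 1920 wide (1920x1080)
height_16_10 = 1200   # 16:10 at 1920 wide (1920x1200)

extra_rows = height_16_10 - height_16_9
print(f"Extra rows: {extra_rows}")              # 120 more lines of pixels
print(f"Gain: {extra_rows / height_16_9:.1%}")  # ~11.1% more vertical space
```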
Too bad I ain't got $1,000.
They aren't the only 16:10 screens, you know.