I just ordered an Nvidia GTX 560 Ti that has GDDR5 memory, and I just now looked and found out that my computer's motherboard has DDR3. Will the graphics card run when I get it? I hope I didn't just waste $300 on something that won't even run on my computer.
Also, in case you're wondering, I do have a PCI Express 2.0 motherboard and the card is also PCIe 2.0, so it sounds like it should work on my PC. I'm just not sure yet.
MS Windows 7 Ultimate 64-bit | AMD Athlon II X4 635 | 4.00 GB Dual-Channel DDR3 | BIOSTAR Group A880GU3 | ASUS VH238 | 1024MB GeForce GTX 550 Ti (Gigabyte) | 313GB Western Digital WDC WD3200AAJB-00J3A0 | TSSTcorp CDW/DVD TS-H492A | Realtek High Definition Audio
New account: FrozenOblivion, Contact me there, not here
Desktop (not yet built): i7 2600k/3770k, Gtx 680 DCII/Twin Frozr III, 16gb ram, 2TB Seagate hard drive, 500R/650D. psu that I haven't decided on yet
Quick lesson here: VRAM (video RAM, the RAM your graphics card uses) is on the card itself. It doesn't rely on anything else and is used solely by the graphics card, mainly to hold the frame buffer and other data the GPU needs fast access to.
RAM, on the other hand, is used when you run a program, say a game that uses 2GB of RAM. When you exit the game, the OS frees that memory so it can be used over and over.
Now, thanks to the above poster, I have to actually properly describe the different functions and purposes that graphics memory and system memory serve.
I am not happy with this. But SOMEBODY IS WRONG ON THE INTERNET AND THAT CANNOT STAND.
The "proper" term for a video card would be "video adapter". Because the original IBM PC was designed for extensibility, all video cards include a Video BIOS.
What is a Video BIOS? When a PC boots up, one of the first things it does is look around for BIOS code to assimilate into its own. All video cards include a BIOS, which is then integrated into the System BIOS. It covers the extremely basic stuff, like switching between a few standard modes and displaying text. Earlier versions were more extensive, in that they provided BIOS-level primitives for graphics drawing and so forth. Nowadays the vendor usually just includes enough to get the system through the POST; after that, the advanced features go into their operating system drivers instead, which is sensible.
The Video BIOS is similar in construction to, but of course completely separate from, the System BIOS on the motherboard; once the system is booted, though, they form a cohesive whole. Other devices, such as SCSI adapters or separate USB, network, and other adapters, often have their own BIOS as well, usually to provide boot support for the devices that connect to them.
Some adapters give you evidence of this Video BIOS when they first boot. For example, my GeForce FX 5500 would display a banner on the monitor for a few moments before the system began booting; that was the Video BIOS initializing and essentially "waiting" to be integrated into the System BIOS, at which point the system would gain the ability to show text (and usually a linear frame buffer, allowing for things like the Energy Star logo) and would of course do so.
As noted, the Video BIOS is similar to the System BIOS; it takes the form of a ROM, PROM, EPROM, or, as a given today, EEPROM chip, and contains the basic instructions to provide an interface between the video adapter hardware and software running on the system. This is done by placing the code at well-established addresses. Earlier System BIOSes would often include code for specific graphics adapters but nothing else; so when you installed another graphics adapter that wasn't supported, that adapter's BIOS would replace the existing functions with its own and hook the appropriate interrupt vectors.
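As a rough sketch of what that hooking means: in real mode, the interrupt vector table sits at the very bottom of memory, with each vector occupying four bytes (a far pointer: offset, then segment). A Video BIOS installs itself by writing its handler's address into the slot for INT 10h, the video services interrupt. The layout below is the standard real-mode one; the helper name is my own.

```python
# Real-mode interrupt vector table (IVT): vector n lives at
# linear address n * 4 and holds a 4-byte far pointer
# (2-byte offset, then 2-byte segment) to the handler.
def ivt_entry_address(vector: int) -> int:
    """Linear address of the IVT slot for a given interrupt number."""
    return vector * 4

# INT 10h is the video services interrupt a Video BIOS hooks.
print(hex(ivt_entry_address(0x10)))  # 0x40
```

Replacing the four bytes at that slot is all it takes to redirect every INT 10h call to the new adapter's code.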
BIOS code, both video and system, is 16-bit and runs in real mode, which is why very little has truly been added to it. Its only purpose today is to start the system and get it into protected mode, at which point the operating system replaces all the 16-bit code with 32-bit (or, on a 64-bit OS, 64-bit) code loaded from drivers.
This of course is tangential, or even unrelated, to the purposes of their various Memories. Oh well.
Anyway, all graphics adapters, going all the way back to the very earliest text-only models such as the MDA, include their own memory.
For the early adapters like the MDA, this memory stored an 80x25 text screen. Each cell takes two bytes, a character code plus an attribute byte, so a full screen needs about 4,000 bytes, which is why these adapters shipped with 4K of video memory. Later adapters with more memory, such as the CGA with its 16K, could hold several text pages at once and supported "page flipping": a program could prepare a second screen off-screen and instantly switch to it.
Video memory at the time was typically mapped into the Upper Memory Area, so programs could read from or write to video memory directly at specific addresses. As the memory used by newer adapters grew, it became clear that this was not maintainable, and standard VGA established a "standard" for programming the adapter by way of registers and sending commands to specific I/O addresses.
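To make that memory-mapped layout concrete, here is the standard address arithmetic for an 80x25 text screen with two bytes per cell (character, then attribute); the function name is my own:

```python
def cell_offset(row: int, col: int, columns: int = 80) -> int:
    """Byte offset of a text cell within video memory.
    Each cell is two bytes: the character code, then its attribute."""
    return (row * columns + col) * 2

# Poking a byte at the video segment plus this offset puts a
# character straight on screen, with no BIOS call involved.
print(cell_offset(24, 79))  # 3998 -- the last cell of an 80x25 screen
```

Note that the last cell's two bytes end at offset 4,000, which is exactly why a full text screen fills nearly all of a 4K adapter.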
For quite a while, the amount of video memory an adapter had only really determined what bit depths and resolutions you could use with it. With 3D graphics accelerators, video memory took on an entirely separate purpose: storing texture data for easy access by the graphics adapter.

Because memory, especially the fast memory typically employed by video adapters at the time, was expensive, we got things like AGP. Not only was AGP faster than the existing connection technology, it also allowed AGP cards to use system memory as video memory by way of bus mastering. (This was of course separate from earlier integrated video adapters that shared system memory via a BIOS setting.)

Memory sizes grew, and the memory demands of the applications using them grew as well. Now graphics adapters usually have great gobs of memory; mine has 512MB, for example, and some have 1GB, 2GB, or more. Today this memory is used exclusively by the video adapter. Programs can of course put data there, but it has to "mean something" to the GPU: it can be geometry the GPU should draw (in the form of vertex buffers), texture data or other texture-mapping information, or shader code. It doesn't *have* to mean something, but if it doesn't, why put it in graphics memory, right?
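The claim that memory size determined the available modes is simple arithmetic: a mode is only usable if its framebuffer fits in video memory. A quick sketch (the function name is mine):

```python
def framebuffer_bytes(width: int, height: int, bits_per_pixel: int) -> int:
    """Memory needed to hold one full screen at the given mode."""
    return width * height * bits_per_pixel // 8

# 1024x768 at 32-bit color needs 3 MiB, so a 2 MB card
# simply could not offer that mode:
print(framebuffer_bytes(1024, 768, 32))  # 3145728 (3 MiB)
```

The same arithmetic explains why old cards could do high resolutions only at low bit depths: drop to 8-bit color and that mode needs just 768 KiB.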
System Memory of course is used by the System. It means something to the Applications running on the system.
I guess a good summary would be: "Video memory holds data that the GPU uses; system memory holds data that the CPU uses." Even if it's not fully accurate.
No? I asked about what type of card I should get. Oh, and just to let you all know, I don't need help anymore; the question of memory compatibility was answered in the second post. Thanks anyway, though.
It took me quite a while to understand what you were saying there.
GDDR5 will run too; the card's memory doesn't need to match your motherboard's DDR3.
Hope that helped!
Didn't you ask this question before?
Never mind, looks like that was someone else.