Well, you basically just answered my question in full, whereas everyone else seemed to dither around it, or ask another question rather than answering mine. Thank you.
It's very complex, and most of the people here aren't well versed in explaining complex things.
But one more thing since you seem to be the only one willing to help:
I have seen i7 processors that are clocked at only 2-point-something GHz, while mine is clocked at 3.2GHz.
This is why I made the assumption of Intel "downgrading" their processors.
Isn't the GHz rating the speed at which your processor takes action on something? Or am I wrong about that?
Clock speed is the rate of the processor's clock cycles. These days, though, it matters much less on its own, because newer CPU architectures get more work done per clock cycle, which reduces the need for higher clocks.
It gets a bit more complex when you add in multithreading and multiple CPU cores, which multiply that per-cycle work across the whole chip.
It's a common misconception, since the way a CPU handles these things now is far different from what it once was, especially with innovations like multiple cores and instruction-set extensions like SSE3.
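A rough back-of-the-envelope sketch of that idea, with made-up numbers (the clocks, IPC figures, and core counts below are hypothetical, just to show why GHz alone is misleading):

```python
# Rough model: useful work per second ~= clock speed x instructions
# per cycle (IPC) x number of cores. All figures below are invented
# for illustration, not real benchmark data.

def throughput(clock_ghz, ipc, cores):
    """Billions of instructions per second, very roughly."""
    return clock_ghz * ipc * cores

# A hypothetical older chip: high clock, low IPC, one core.
old_cpu = throughput(clock_ghz=3.8, ipc=1.0, cores=1)

# A hypothetical newer chip: lower clock, but better IPC and 4 cores.
new_cpu = throughput(clock_ghz=2.6, ipc=2.5, cores=4)

print(old_cpu, new_cpu)  # the "slower" 2.6GHz chip wins easily
```

Obviously real CPUs are messier than a single IPC number, but it shows why a 2.6GHz i7 can run circles around an older, higher-clocked chip.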
To be honest, I am basing this questioning on how I have grown up: seeing things that were once made by hand with superb quality become mass produced at much lower quality to cut costs and speed up production.
Join the club.
I've got a hand-me-down TV from the late 80s that still works, but the flatscreen I bought just 4 years ago is dead.
This is how I am looking at Intel. At one point they made amazing technology, and now they mass market their products at the cost of selling lower-grade retail parts.
For things like computer components this is not exactly true.
They've always been mass produced, and the tech is still amazing. Where you do get lower-end mass-produced stuff is through OEM vendors (Dell, HP, etc.) for things like mobos, PSUs, and so on.
AMD's and Intel's CPUs are all made in the same few fabs, and they have always been mass produced.
You can't really make a CPU without mass producing it; the wafers are made in large batches, and each square on a wafer is an individual CPU die.
I have seen i7 processors that are clocked at only 2-point-something GHz, while mine is clocked at 3.2GHz.
This is why I made the assumption of Intel "downgrading" their processors.
Isn't the GHz rating the speed at which your processor takes action on something? Or am I wrong about that?
Directly, yes: GHz is the clock speed of the processor. But the thing is, new generations of processors often take fewer of those cycles to complete operations. Take the early 8088 processors, for example: many instructions took 10, 13, or sometimes even 30 cycles. Later generations take a third or even a quarter of the cycles, in addition to adding new instructions that didn't exist before. This touches on fm87's point as well: GHz really only measures speed when you compare two otherwise identical CPUs, because different generations get different amounts of work out of each cycle.
Many Celeron processors, for example, have taken advantage of this misconception. Some Celerons are in the 3 to 4GHz range, but they have less cache, and sometimes parts of the processor are disabled or take extra cycles, and so on. Basically, the CPU itself is "cheaper," so it gets clocked up to a higher frequency so people think it's faster.
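To put numbers on the cycle-count point: the time one instruction takes is cycles needed divided by clock rate, so a chip that needs fewer cycles can finish an operation sooner even at a lower clock. A small sketch with illustrative figures (the cycle counts and clock speeds are examples, not datasheet values):

```python
# Time per instruction = cycles needed / clock rate.
# At 1 GHz the CPU completes one cycle per nanosecond, so dividing
# cycles by GHz gives nanoseconds. Figures below are illustrative.

def instruction_time_ns(cycles, clock_ghz):
    """Nanoseconds to complete one instruction."""
    return cycles / clock_ghz

# A high-clocked chip that needs several cycles per operation...
budget_chip = instruction_time_ns(cycles=4, clock_ghz=3.6)

# ...versus a lower-clocked chip that needs only one cycle.
efficient_chip = instruction_time_ns(cycles=1, clock_ghz=2.8)

print(budget_chip, efficient_chip)  # the 2.8GHz chip finishes first
```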
To be honest, I am basing this questioning on how I have grown up: seeing things that were once made by hand with superb quality become mass produced at much lower quality to cut costs and speed up production.
This is how I am looking at Intel. At one point they made amazing technology, and now they mass market their products at the cost of selling lower-grade retail parts.
This is definitely not the case. I'd push back on the idea that Intel only used to make amazing technology; they still do. Each new generation generally redesigns the chip, learning lessons from earlier designs or trying new approaches.
On the topic of threads: a single Minecraft process uses 50. The game itself uses at least three that I know of: one for the main game loop, one for chunk I/O, and of course a separate thread for the internal server in Single Player. More threads get spun up by the video driver, the Java garbage collector, etc.
On the topic of threads: a single Minecraft process uses 50. The game itself uses at least three that I know of: one for the main game loop, one for chunk I/O, and of course a separate thread for the internal server in Single Player. More threads get spun up by the video driver, the Java garbage collector, etc.
Ah, of course, I forgot about SSP having its own server now.
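A minimal sketch of how a game process might split work across threads like that. The job names and division of labor below are made up to mirror the description above, not Minecraft's actual internals:

```python
import threading

results = []
lock = threading.Lock()

def worker(job_name):
    # Each thread just records that its job ran; a real game would
    # loop here doing rendering, disk I/O, server ticks, etc.
    with lock:
        results.append(job_name)

# One thread per job, mirroring the game-loop / chunk-I/O /
# internal-server split described above.
jobs = ["game_loop", "chunk_io", "internal_server"]
threads = [threading.Thread(target=worker, args=(j,), name=j) for j in jobs]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(sorted(results))  # all three jobs ran on their own threads
```

On top of threads the program creates itself, libraries like the video driver and the JVM's garbage collector add their own, which is how the total climbs so high.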