On top of that, I've also looked into the processor power it needs too, and it seems that running Minecraft off multiple threads doesn't seem to work very well. Seems to run off 1 thread at a time, and depending on how many threads your processor has, it really restricts it quite a lot. For example, I've got an Intel 6-core processor running now at 4GHz (12 threads), so split that between 12 threads and only 33% of the processor is being used, which does cause issues with the server eventually. So yeah, I would be more worried about your hardware specs in the data centre before anything else, because that will limit your player base much more.
So it has six cores, and is clocked at 4GHz, but can only handle 12 threads...? o.O
My question is this:
How is it that my old single core Pentium 4 Processor from 2005 is capable of running 580 threads, and a brand new 6 core processor can only run 12?
Here is a screenshot of the threads my CPU is running:
Here are my CPU Specs:
**Edit**
As you can see above, my processor is only using 1% of its capabilities.
I am running Minecraft on max graphics, I have RuneScape open in Chrome, I have Pandora Radio running, I have these forums open of course, and I also have the RuneScape forums open.
And yet, it remains at 1% of the processor's capabilities, while that 6-core processor mentioned above has 33% of its capabilities being used by just running Minecraft.
What he means by threading in this case is that his 6-core processor has a technology called Hyper-Threading, which means each physical core presents two logical cores. For example, the Intel Core i3 has two physical cores, but each has two logical cores, so it is reported to the operating system as 4 cores; the same goes for the i7, which has 4 physical cores and therefore 8 logical ones.
Your Pentium 4 also has Hyper-Threading, but in reality it is only a single core.
And no, Minecraft would use waaaaay more resources than 1% on a P4; I don't think my i5 can achieve that.
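The Hyper-Threading point is easy to check from code: the operating system reports logical processors (hardware threads), not physical cores. A minimal Python sketch (the counts in the comment are illustrative examples, not guaranteed values):

```python
import os

# os.cpu_count() reports *logical* CPUs: physical cores multiplied by
# hardware threads per core (2 per core with Hyper-Threading enabled).
logical = os.cpu_count()
print(f"The OS sees {logical} logical CPUs")

# On a 6-core chip with Hyper-Threading this would print 12; on a
# single-core Pentium 4 with Hyper-Threading it would print 2.
```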
>_< None of you have really answered my question. You all skip about, and name what your systems have in them.
Does anyone know why it seems so backwards?
How can only 1% of my processor's power be in use?
This isn't making any sense to me...
Technology should get BETTER over time, not worse. So why is my processor doing this?
And my graphics card is a:
BFG Tech Nvidia GeForce 9600GT OverClocked 2
I didn't type the full name above.
Here is a stock image that I found on Google:
We're asking you questions so we can get to an answer; we don't have magical powers that automatically tell us what's wrong.
Doctapper, lol.
I don't have any problems. I do not have anything that I am asking to be resolved.
If you read my first post in its entirety, you will see that I asked a question out of curiosity.
I just want to know how my old processor is capable of handling things with much less impact on the system than a more advanced, newer processor.
My current PC:
• Ryzen 9 3900x @4.2GHz
• 64GB DDR4 @3200MHz
• MSI RTX 2080 Ti Gaming X Trio
• 2TB NVMe SSD
• 6TB in HDD space
• Four monitors at 4k
There are far more variables at hand here than just the CPU.
Minecraft itself is also a horrible benchmark of performance.
When Minecraft isn't the active program it freezes, so Task Manager doesn't show anything. Make Minecraft half of the screen and Task Manager the other half, then take a screenshot with Minecraft as the active program.
Threading is also not the reason you need a better CPU; it has to do with the performance of each cycle and how much the processor can do in each cycle.
An i5 with only one core enabled will still be 3x more powerful than a Pentium 4, even though both are one core/one processor thread at similar clock speeds. This has to do with architecture, which is where it gets a bit more complex.
I never said I wanted a new CPU, nor did I complain about my existing one.
If you re-read my first post, I asked why "new" processors seem to be slower than the processor I am using right now, which is from 2005.
It is almost as if Intel downgraded their processors over the years, rather than upgrading them.
They aren't going slower, though, is the thing. They are 3-4x faster; they just process more information per clock cycle.
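A rough way to see why the clock number alone is misleading: per-core throughput is approximately clock rate times instructions per cycle (IPC), and newer microarchitectures retire far more instructions per cycle. The IPC values below are illustrative assumptions, not measured figures:

```python
def relative_speed(clock_ghz, ipc):
    # Crude model: throughput ~ clock rate x instructions per cycle.
    # Ignores caches, pipelining, memory latency, turbo, etc.
    return clock_ghz * ipc

p4 = relative_speed(3.2, 0.7)   # assumed IPC for a NetBurst-era Pentium 4
i5 = relative_speed(2.6, 2.0)   # assumed IPC for a modern core

print(f"i5 / P4 speed ratio: {i5 / p4:.1f}x")  # ~2.3x despite the lower clock
```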
If you re-read my first post, I asked why "new" processors seem to be slower than the processor I am using right now, which is from 2005.
Then your sense of "seeming" is off. What are you basing the claim that new processors are slower than old ones on?
No. If you look at my screenshot again, you will see that I am running 49 processes over 580 threads.
No, you are not. You are confused.
Processor threads != program threads.
Asus M4A89GTD PRO/USB3 | AMD Phenom II X6 1090T | Sapphire Radeon HD 6870 | 1TB WD Caviar Black | Intel 520 240GB SSD | Corsair H70 push-pull | Antec P280 White Windowed
How is it that my old single core Pentium 4 Processor from 2005 is capable of running 580 threads, and a brand new 6 core processor can only run 12?
The question is the number of simultaneously running threads, that is, threads literally running at the same time. Single-core processors run a single thread at once (Hyper-Threading clouds this a bit; one can consider it running one thread and a bit of another at once). Other than that, the processor simply context-switches between active threads, giving each a bit of time to run and moving on to the next.
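That time-slicing is easy to demonstrate. The sketch below starts 100 program threads regardless of how many hardware threads the CPU has, and every one of them runs to completion, because the scheduler switches between them:

```python
import threading

# Spawn far more program threads than any CPU has hardware threads.
# The OS scheduler time-slices between them, so all of them still
# finish, even on a single-core machine.
NUM_THREADS = 100
EXPECTED = sum(range(10_000))
results = [0] * NUM_THREADS

def work(i):
    results[i] = sum(range(10_000))   # a little CPU work per thread

threads = [threading.Thread(target=work, args=(i,)) for i in range(NUM_THREADS)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(f"All {NUM_THREADS} threads finished:", all(r == EXPECTED for r in results))
```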
As you can see above, my processor is only using 1% of its capabilities.
Really? Where does it show that? I see the CPU having a high number of idle cycles... but I'm not sure how that is relevant.
I am running Minecraft on max graphics
And yet... your screenshot has smooth lighting off, particles off, and graphics set to fast. Not to mention it's a world created using the Superflat preset, likely on Peaceful, eliminating a lot of tick-sinks like lava (embers) and underground mob spawns. On Fancy, Far, maximum smooth lighting, and particles, sitting in my base uses 10% CPU when active, because of torches (particles), my mob farms, and automatic redstone harvesters. When paused it drops to 3%; if I use a Superflat world and reduce the various graphics settings, usage drops further as well.
I have RuneScape open in Chrome, I have Pandora Radio running, I have these forums open of course, I also have the RuneScape forums open.
CPU utilization is not an indicator of how much of its capabilities are used. For one thing, if an application (aside from perhaps a game) pegs a core, it had better be doing something useful. I don't want Firefox or Explorer to start using a full core (in my case, pegging at 25% usage) unless it's doing something. Even so, Firefox insists on pegging at 25% usage for no particular reason every so often.
And yet, it remains at 1% of the processor's capabilities, and that 6-core processor mentioned above has 33% of its capabilities being used by just running Minecraft.
Well, the quoted person has no idea what they're talking about. I won't say the same about you, since you are basically trying to diplomatically call out the fact that they don't know what they are talking about by posing it as a question about yourself. (I salute your tact :P)
The quoted info:
and it seems that running Minecraft off multiple threads doesn't seem to work very well.
That sentence makes no sense at all. You don't run "Minecraft off multiple threads". Minecraft runs ON multiple threads (up to three since 1.3, with two before that being more typical), but it doesn't run "off" multiple threads, and running it ON multiple threads DOES work better than running on a single thread, as long as as many operations as possible are done concurrently instead of one thread waiting for another to finish.
Seems to run off 1 thread at a time and depending on how many threads your processor has
Also nonsense. Processors don't have threads. Threading is a higher-level concept handled by the kernel.
For example, I've got an Intel 6-core processor running now at 4GHz (12 threads)
A 6-core processor with Hyper-Threading does not make "12 threads". It makes a 6-core processor with Hyper-Threading.
so split that between 12 threads and only 33% of the processor is being used
This is also nonsense. I'm trying to decide whether they even knew what they were trying to say at all, or whether they just wanted to make claims from a position of authority and hope nobody would call them on it.
Anyway, I've honestly never understood this mindset whereby idle cycles are "wasted" cycles. An application simply shouldn't use more system resources than it needs. I've used a few programs that, even when idle, sit at 25% usage; they basically peg a full CPU core just by existing. Is my CPU being "utilized"? Well... yes, but I could achieve that by running a completely vapid infinite loop if I wanted to. The point is that your system resources should be used for things that are useful: a program should not consume a full core if it's not doing anything, and a program should not strive to leave idle cycles if it is doing something.
A good example of the latter is a case I recall where a person asked how they could force their program to never use 100% of the processor. Their application was very computationally expensive, so while it did its number crunching it would consume as much CPU as possible. They didn't like this because people complained when they looked in Task Manager and saw it using 100% CPU; as if CPU utilization were a measure of how many puppies the program was kicking per second (to quote Raymond Chen on the subject). A few suggestions provided examples that would allow their program to top out at 50%... they were ecstatic, but then got complaints that their program took twice as long to run.
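A sketch of what that kind of throttling amounts to (hypothetical code, not the actual program in the story): do the same fixed amount of work, but sleep between chunks so average utilization stays near 50%; the work then takes roughly twice the wall-clock time.

```python
import time

def crunch(chunks=20, chunk_work=200_000, duty_cycle=1.0):
    """Do a fixed amount of work; if duty_cycle < 1, sleep between
    chunks so average CPU usage stays near duty_cycle * 100%."""
    start = time.perf_counter()
    for _ in range(chunks):
        t0 = time.perf_counter()
        sum(range(chunk_work))                         # the actual number crunching
        busy = time.perf_counter() - t0
        if duty_cycle < 1.0:
            time.sleep(busy * (1.0 / duty_cycle - 1.0))  # idle to hit the target
    return time.perf_counter() - start

full = crunch(duty_cycle=1.0)   # pegs a core, finishes as fast as possible
half = crunch(duty_cycle=0.5)   # ~50% average usage, noticeably longer wall time
print(f"100% duty: {full:.3f}s   50% duty: {half:.3f}s")
```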
Anyway, you didn't say anything of the sort; you were more or less confused by the vacuous statements made by another, and rightly so. For your specific case, you have DWM disabled and the graphical features of Minecraft reduced (and, in the case of the screenshot, the window size). So your usage is not likely to go very high, especially since there is very little the game is really doing: in the shown screenie it's not spawning and processing particles, it's not spawning and processing mobs, and it's not doing an extensive number of block updates, so usage will be low.
That sentence makes no sense at all. You don't run "Minecraft off multiple threads". Minecraft runs ON multiple threads (up to three since 1.3, with two before that being more typical), but it doesn't run "off" multiple threads, and running it ON multiple threads DOES work better than running on a single thread, as long as as many operations as possible are done concurrently instead of one thread waiting for another to finish.
Mildly OT: last I checked (admittedly, a while ago) MC used only two threads. What is the third one used for now?
IIRC chunk generation was (or was partially) offloaded in some way.
Well, you basically just answered my question in full, whereas everyone else seemed to dither around it, or ask another question rather than answering mine. Thank you.
But one more thing, since you seem to be the only one willing to help:
I have seen i7 processors that are clocked at only 2-point-something GHz, while mine is clocked at 3.2GHz.
This is why I made the assumption of Intel "downgrading" their processors.
Isn't the GHz figure the speed at which your processor acts on something? Or am I wrong about that?
To be honest, I am basing this question on how I have grown up: seeing things that were once made by hand with superb quality now being mass-produced at a much lower quality to decrease the cost and increase the speed at which they are made.
This is how I am looking at Intel. At one point they made amazing technology, and now they have decided to mass-market their products at the cost of selling lower-grade retail parts.
Most of you reading this will probably think that this is an incredibly ignorant and/or stupid question, but I am honestly curious.
I have seen people say things about quad-core processors running 12 threads or something.
Here is a quote I saw:
Yes.
I'm just wondering why an older processor is "supposedly" faster than a new one. I mean, by all logic, shouldn't it be the other way around? =/
(also your window colour scheme is hideous)
Again, could anyone tell me why it seems to do this...?
Do you think it might be because of my graphics card?
I have an Nvidia GeForce 9600GT graphics card.
Here are the specs:
Try running a program like Prime95 and look at your CPU usage, because that isn't normal.
I'll download that now, and I'll post the results in a moment.
Actually, no. Over 1 thread. I completely misread the processor's serial: it's single-core (i.e. single-thread), not quad-core.
I am getting 137FPS.
My processor is at 100%... but I assume that is Prime95 doing that?
These are the results so far:
Software processes != CPU threads
As you said, you could theoretically be running an unlimited number of processes on a single-threaded CPU.
https://en.wikipedia...ead_(computing)
(See also the multithreading section)
There is also Hyper-Threading on Intel CPUs, which provides more processor threads:
https://en.wikipedia.../Hyperthreading
6-core processors with 12 threads have 12 processor threads, but the number of program threads they can handle is not so strictly limited. There are far more variables at hand here than just the CPU.
Minecraft itself is also a horrible benchmark of performance.
An i5 with only one core enabled will still be 3x more powerful than a Pentium 4, even though both are one core/one processor thread at similar clock speeds. This has to do with architecture, which is where it gets a bit more complex:
https://en.wikipedia.org/wiki/Microarchitecture
http://www.minecraftforum.net/topic/1351169-computer-building-general-knowledge-guide/#entry16473220
Here you go:
No. If you look at my screenshot again, you will see that I am running 49 processes over 580 threads.
How do you figure they are slower?
Processor threads != program threads.