https://ark.intel.com/products/64595/Intel-Xeon-Processor-E5-2670-20M-Cache-2_60-GHz-8_00-GTs-Intel-QPI
So it says 115 W (watts).
Okay, so what is that? Per hour, per second, per minute? What? I'm trying to find the monthly cost of these things. A general answer would be best (whatever most companies measure power usage by).
It would be a constant rate of 115 W, like when you have a light bulb on.
I'm not sure how to turn that into a cost. Maybe it's 12 cents paid per kilowatt-hour?
Well, it wouldn't be constant; modern CPUs dial back their consumption when full power isn't needed.
In any case, in every context a watt is defined as 1 W = 1 J / 1 s, where J is joules and s is seconds. Whenever you're working out the cost of electricity for something, that's the measurement you'd use.
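To turn that into a monthly bill, you convert watts to kilowatt-hours and multiply by your utility rate. Here's a quick sketch in Python, assuming the 115 W figure from the spec page and the ~$0.12/kWh rate floated above (your rate will vary):

    # Monthly cost of a constant electrical load (a rough sketch).
    watts = 115            # rated TDP from the spec page
    rate_per_kwh = 0.12    # assumed utility rate in $/kWh
    hours_per_month = 24 * 30

    kwh_per_month = watts / 1000 * hours_per_month   # 82.8 kWh
    cost_per_month = kwh_per_month * rate_per_kwh    # about $9.94

    print(f"{kwh_per_month:.1f} kWh/month -> ${cost_per_month:.2f}/month")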
If you want an easier way to figure out the costs of running the PC, get a Kill-a-Watt; they're great for that.
I was looking for a way to measure costs per month for servers (which is why the hardware I linked is a server CPU).
There's a $2k server, or I could get 2-3 other server CPUs that combined are stronger but use more power. I wanted to measure that power cost (around $160 for two CPUs and 230 W of usage).
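Putting rough numbers on both options (a sketch assuming full rated draw 24/7 at $0.12/kWh, which overstates real usage since CPUs throttle down at idle):

    # Compare the two builds at full rated draw, 24/7.
    rate = 0.12  # assumed $/kWh
    for label, watts in [("single 115 W server", 115), ("two-CPU build", 230)]:
        monthly = watts / 1000 * 24 * 30 * rate
        print(f"{label}: ${monthly:.2f}/month")
    # ~$9.94 vs ~$19.87/month, so the extra 115 W costs roughly
    # $10/month, or about $835 over 7 years.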
I mean, if you're building a server with 2-3 CPUs, and assuming you have the Internet infrastructure to handle its demands adequately (I assume it's going to serve something over the Internet), then I don't think the cost of power should be a concern.
If I host these servers over the course of 7 years, I'm going to want to save money any way possible. The reason isn't something I can share right now, but it is an issue. I don't want an extra $50-a-month bill that after 4 years or so has eaten more cash than the $2k option. Part of it is also that I just like knowing this information, and I might end up buying 25 servers (115 W each).
Think of me as building a financial report for a business.
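For a report-style projection, the same math scales to the worst case mentioned above, 25 servers at 115 W each for 7 years (again assuming full rated draw at $0.12/kWh):

    # Fleet-wide energy cost projection at full rated draw.
    servers, watts_each, years, rate = 25, 115, 7, 0.12
    kwh_total = servers * watts_each / 1000 * 24 * 365 * years
    print(f"{kwh_total:,.0f} kWh -> ${kwh_total * rate:,.2f} over {years} years")
    # ~176,295 kWh -> ~$21,155; real draw will be lower than the rated TDP.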
That device measures and calculates the cost of operation for whatever you plug in; it doesn't care if it's a desktop, a server, or a toaster lol. But yeah, I guess it wouldn't help since you're in the planning stage and have no hardware, oops.
You could try http://outervision.com/power-supply-calculator. It's meant to spec a PSU, but it adds up power consumption for the machine so it can give you an idea.