I use Unraid with a 5950X, and it wouldn't stop crashing until I disabled C-states.
So that plus 18 HDDs and 2 SSDs sits at 200 watts 24/7.
kWh is a unit of energy, not power
Idles at around 24W. It's amazing that your server only needs 0.1 kWh once and keeps on working. You should get some physicists to take a look at it; you might just have found perpetual motion.
Running an old 7th gen Intel. It has a 2017 and a 1080 in it, six mechanical hard drives, and 3 SSDs. Then I have an 8th gen laptop with a 1070 Ti mobile, but the laptop's a camera server so it's always running balls to the wall. Running a UniFi Dream Machine Pro, a 24-port PoE, a 16-port PoE, and an 8-port PoE.
Because of the overall workload and the age of the CPU, it burns about 360 watts continuous.
I can save a few watts by putting the disks to sleep, but I'm in the camp where the spin-up and spin-down of the disks cause more wear than continuous running.
My server rack has
All together that draws.... 0.1 kWh.... in 0.327s.
In real terms, measured at the UPS, I have a steady-state load of 900-1100 W depending on what I have under load. I call it my computationally efficient space heater because it generates more heat than my apartment requires in winter, except on the coldest of days. It has a dedicated 120 V 15 A circuit.
50W-ish idle? Ryzen 1700, 2 HDDs, and a GTX 750ti. My next upgrade will hopefully cut this in half.
17W for an N100 system with 4 HDDs
Which HDDs? That’s really good.
Seagate Ironwolf "ST4000VN006"
I do have some issues with read speeds but that's probably networking related or due to using RAID5.
Around 18-20 watts at idle. It can go up to about 40 W at 100% load.
I have an Intel N100; I'm really happy with the performance per watt, to be honest.
My whole setup, including 2 Pis and one fully specced out AM4 system with 100 TB of drives, an Intel Arc, and 4x 32 GB ECC RAM, uses between 280 W and 420 W. I live in Germany and pay 25 ct per kWh, and my whole apartment uses 600 W at any given time and approximately 15 kWh per day 😭
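If it helps to sanity-check those numbers, here's the arithmetic as a quick Python sketch; the 600 W draw and 25 ct/kWh rate are taken from the comment above, nothing else is assumed:

```python
# Daily energy (kWh) = average power (W) * 24 h / 1000.
apartment_w = 600            # steady whole-apartment draw from the comment
rate_eur_per_kwh = 0.25      # German tariff from the comment

daily_kwh = apartment_w * 24 / 1000          # 14.4 kWh/day, close to the ~15 kWh observed
daily_cost = daily_kwh * rate_eur_per_kwh    # ~3.60 EUR/day
print(f"{daily_kwh:.1f} kWh/day, {daily_cost:.2f} EUR/day")
```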
0.1kWh per hour? Day? Month?
What's in your system?
Computer with GPU and 50TB of drives. I will measure the computer on its own in the next couple of days to see where the power consumption comes from.
Which GPU? How many drives?
Put a Kill A Watt meter on it and see what it says for consumption.
You are misunderstanding the confusion. kWh is an absolute measurement of an amount of energy, not a rate of energy usage. It's like being asked how fast your car can go and answering that it can go 500 miles. 500 miles per hour? Per day? Per tank? It doesn't make sense as an answer.
Does your computer use 100 watt-hours per hour, translating to an average of 100 watts of power usage? Or 100 watt-hours per day, meaning an average power use of about 4 watts? One of those is certainly more likely, but both are possible depending on your application and load.
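If it helps, here's that "watt-hours per hour" reasoning as a small Python sketch; the 100 Wh figures are just the examples from the paragraph above:

```python
# Average power (W) = energy (Wh) / the time it was measured over (h).
def average_watts(energy_wh: float, hours: float) -> float:
    return energy_wh / hours

print(average_watts(100, 1))   # 100 Wh in one hour -> 100 W average
print(average_watts(100, 24))  # 100 Wh in one day  -> ~4.2 W average
```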
You're adding to the confusion.
kWh (as in kW·h, not kW/h) is a unit of energy.
The watt is a unit of power.
Lol thank you, I knew that; I don't know why I wrote it that way. In my defense, it was like 4 in the morning.
You might have your units confused.
0.1kWh over how much time? Per day? Per hour? Per week?
Watt-hours refer to the total energy used to do something, from a starting point to an ending point. It makes no sense to say that a device needs a certain amount of Wh, unless you're talking about something like charging a battery to full.
Power being used by a device (like a computer) is just watts.
Think of the difference between speed and distance. Watts are how fast energy is being used; watt-hours are how much has been used, or will be used.
If you have a 500 watt PC, for example, it uses 500 Wh per hour, or 12 kWh in a day.
> If you have a 500 watt PC, for example, it uses 500 Wh per hour, or 12 kWh in a day.
A maximum of 500 watts. Fortunately your PC doesn't actually max out your PSU or your system would crash.
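To make the correction concrete, a small sketch; the 120 W measured-average figure is made up purely for illustration:

```python
# A PSU rating is the maximum the machine *could* draw, so daily
# energy should come from the measured average, not the nameplate.
PSU_RATING_W = 500       # the "500 watt PC" from the comment above
measured_avg_w = 120     # hypothetical average draw from a wattmeter

worst_case_kwh_per_day = PSU_RATING_W * 24 / 1000   # 12.0 kWh
actual_kwh_per_day = measured_avg_w * 24 / 1000     # 2.88 kWh
print(worst_case_kwh_per_day, actual_kwh_per_day)
```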
I forgive 'em cuz watt hours are a disgusting unit in general
idea | what | unit |
---|---|---|
speed | change in position over time | meters per second (m/s) |
acceleration | change in speed over time | meters per second, per second (m/s/s = m/s²) |
force | acceleration applied to each unit of mass | kg·m/s² |
work | force applied along a distance, which transfers energy | kg·m/s² · m = kg·m²/s² |
power | work over time | kg·m²/s³ |
energy expenditure | power level during units of time | (kg·m²/s³) · s = kg·m²/s² |
Work over time, × time, is just work! kWh are just joules (J) with extra steps! Screw kWh, I will die on this hill!!! Raaah
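For anyone who wants the "extra steps" spelled out, a tiny sketch multiplying the SI units through:

```python
# 1 kWh is 1000 J/s sustained for 3600 s: (J/s) * s = J.
joules_per_kwh = 1000 * 3600
print(joules_per_kwh)   # 3600000 J, i.e. 3.6 MJ per kWh
```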
Power over time could be interpreted as power/time. Power x time isn’t power, it’s energy (=== work). But otherwise I’m with you. Joules or gtfo.
Whoops, typo! Fixed c:
Could be worse, could be BTU. And some people still use tons (of heating/cooling).
kWh is the stupidest unit ever. 1 kWh = 1000 J/s × 60×60 s = 3.6×10⁶ J, so 0.1 kWh = 360 kJ.
Do you mean 0.1kWh per hour, so 0.1kW or 100W?
My N100 server needs about 11W.
The N100 is such a little powerhouse, and I'm sad they haven't managed to produce anything better. All of the "upgrades" are either not enough of an upgrade for the money or just more power hungry.
To my understanding 0.1kWh means 0.1 kW per hour.
It's the other way around. 0.1 kWh means 0.1 kW times 1 h. So if your device draws 0.1 kW (100 W) of power for an hour, it consumes 0.1 kWh of energy. If your ~~device~~ factory draws 360 000 W for a second, it consumes the same amount of 0.1 kWh of energy.
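A quick check that both cases really come out to the same energy; the numbers are taken straight from the comment above:

```python
# Energy (kWh) = power (kW) * time (h).
device_kwh = 0.1 * 1                    # 0.1 kW for one hour
factory_kwh = (360_000 / 1000) / 3600   # 360 kW for one second (1/3600 h)
print(device_kwh, factory_kwh)          # 0.1 0.1 -- identical energy
```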
Thank you for explaining it.
My computer uses 1 kWh per hour.
It does not yet make sense to me. It just feels wrong. I understand that you may normalize 4 Wh in 15 minutes to 16 Wh because it would use 16 Wh per hour if it ran that long.
Why can't you simply assume that I mean 1 kWh per hour when I say 1 kWh, and not 1 kWh per 15 minutes?
kWh is a unit of energy consumed. It doesn't say anything about time, and you can't assume any time period; that wouldn't make any sense. If you want to say how much power a device consumes, just state how many watts (W) it draws.
Thanks!
0.1kWh per hour can be written as 0.1kWh/h, which is the same as 0.1kW.
Thanks. Hence, in the future I can say that it uses 0.1kW?
Yes. Or 100W.
If this was over an hour, yes. Though you'd typically state it as 100W ;)
Last I checked with a Kill A Watt, I was drawing an average of 2.5 kWh after a week of monitoring my whole rack. That was about three years ago, and the following was running in my rack.
I also have two battery systems split between high priority and low priority infrastructure.
> I was drawing an average of 2.5 kWh after a week of monitoring my whole rack
That doesn't seem right; that's only ~15 W. Any one of those systems alone would exceed that at idle running 24/7. I'd expect 1-2 orders of magnitude more.
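For anyone who wants to rerun that back-of-the-envelope check with their own meter reading, a quick sketch; the 2.5 kWh and one-week figures are from the comment above:

```python
# Average draw (W) = accumulated energy (Wh) / monitoring time (h).
kwh_reading = 2.5        # cumulative Kill A Watt reading
hours = 7 * 24           # one week of monitoring

avg_watts = kwh_reading * 1000 / hours
print(round(avg_watts, 1))   # ~14.9 W -- implausibly low for a whole rack
```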
Ugh, I need to get off my ass and install a rack and some fiber drops to finalize my network buildout.
Idle: 30 Watts
Starting all docker containers after reboot: 140 Watts
It needs around 28 kWh per month.
9 spinning disks and a couple SSDs - right around 190 watts, but that also includes my router and 3 PoE WiFi APs. PoE consumption is reported as 20 watts, and the router should use about 10 watts, so I think the server is about 160 watts.
Electricity here is pretty expensive, about $0.33 per kWh, so by my math I'm spending $38/month on this stuff. If I didn't have lots of digital media, it'd probably be worth it to get a VPS. Still, $38/month is cheaper than Netflix, HBO, and all the other junk I'd have to subscribe to.
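For anyone redoing this math with their own rate, a quick sketch; the 160 W and $0.33/kWh figures are from the comment above, and a 30-day month is assumed:

```python
# Monthly cost = average draw (kW) * hours per month * rate ($/kWh).
server_w = 160
rate_usd_per_kwh = 0.33
hours_per_month = 24 * 30

monthly_cost = server_w / 1000 * hours_per_month * rate_usd_per_kwh
print(f"${monthly_cost:.2f}/month")   # ~$38/month, matching the estimate
```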
That's true. And the children in my family see no ads, which is priceless. Still, I am looking into ways to cut costs in half by using an additional lower-powered mini PC that's always on, with the main computer only running in the evening - maybe.
My 10-year-old ITX NAS build with 4 HDDs used 40W at idle. I just upgraded to an Aoostar WTR Pro with the same 4 HDDs; it uses 28W at idle. My power bill currently averages around US$0.13/kWh.
My server with 8 hard drives uses about 60 watts and goes up to around 80 under heavy load. The firewall, switch, access points and modem use another 50-60 watts.
I really need to upgrade my server and firewall to something about 10 years newer; it would reduce my power consumption quite a bit, and I would have a lot more runtime on UPS.