Power consumption is higher on HiveOS than on Windows 10

Rig:
mb: H110 Pro BTC+
cpu: Intel Celeron G3930
ram: 4 GB
gpu: RX 470 × 1, RX 480 × 1, RX 570 × 11
hiveOS: 0.6-150@200801
Windows 10: LTSC 1809
Claymore 15

The real power consumption measured with a wattmeter on Windows 10 is about 1670 W; Claymore reports 985 W total for the GPUs.
On HiveOS the real power consumption is about 1750 W, and HiveOS reports 965 W total for the GPUs.
All per-card values (core clock, memory clock, voltage) are the same as on Windows.
Hashrate is 390 MH/s, and it's the same as on Windows too.

So, according to the software monitoring, the cards consume about the same. Where could the power leak be?
LA 0.17 0.25 0.41

Yikes. BIOS settings? An updated motherboard BIOS? I don't work with AMD cards at all, but I would guess some cards aren't having the OC/undervolt applied, so I'd test things one card at a time; or it could be some strangeness with card link-speed negotiation (PEG).

Hi, thank you for replying.
If some cards weren't having the OC/undervolt applied, I would notice that from the temperatures. Besides, the software monitoring shows the right consumption for each card. My guess is too-high CPU utilization, or something's wrong with the AMD drivers.
I’m wondering if anyone else has this problem.
Or is this specifics of mining on Linux?

Yeah, it depends on the mining software, but for ETH your averages shouldn't be high. You should see 'LA' (load average) in the top right corner.
Ex:
‘LA 0.98 0.95 0.68’

The first number is the last minute, the second is the last 5 minutes, the third is the last 15 minutes, or something like that.

Generally anything under 2 is OK; if it's over 4 you've got problems and should probably look at your miner's output, and if it's running even higher than that, something is up. You also might want to try 8 GB of RAM; I know some miners need that much to function properly with more than 8-9 cards.
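If you want to check those numbers yourself outside the dashboard, they're just the standard Unix load averages; a minimal standard-library sketch:

```python
import os

# The three figures HiveOS shows as 'LA' are the ordinary Unix load
# averages over the last 1, 5, and 15 minutes.
one, five, fifteen = os.getloadavg()
print(f"LA {one:.2f} {five:.2f} {fifteen:.2f}")
```

Equivalently, `cat /proc/loadavg` or `uptime` in the shell shows the same three numbers.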

Also give another miner a shot (I personally use PhoenixMiner).

You should probably beef up on some basic Linux terminal skills (mostly just how to use 'screen', which is what the internal control software uses). It's not 100% necessary, but it will save you time in the long run to learn the two key commands to exit out of screen once you're in there; you can easily debug things from the SSH shell and by launching 'miner'.

Doing stuff from the web interface sometimes takes a heck of a lot longer than typing it in the terminal.

Also, there's a per-rig setting that lets you set a value to add to the shown figure (equipment power) and an efficiency field (a percentage).
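As a sketch of how those two fields could combine into a wall-side estimate (the function name, numbers, and the exact formula HiveOS uses are assumptions here, purely for illustration):

```python
def wall_estimate_w(gpu_reported_w, equipment_w=0.0, efficiency_pct=100.0):
    """Hypothetical wall-power estimate: add a fixed 'equipment power'
    (CPU, board, risers, fans) to the GPU-reported total, then scale
    for supply-side losses. Not necessarily HiveOS's exact formula."""
    return (gpu_reported_w + equipment_w) * 100.0 / efficiency_pct

# Illustrative only: ~965 W reported plus 120 W of equipment, at 62%
# overall (PSU plus cabling) efficiency chosen just to make the
# arithmetic match, lands on the ~1750 W seen at the wall.
print(wall_estimate_w(965, equipment_w=120, efficiency_pct=62))  # 1750.0
```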

I'm not sure about your question regarding a 'leak.' Is the real power draw from a power meter significantly more than what HiveOS reports? Or does HiveOS report significantly less than what your power meter shows?

I remember having this same problem when I first started using Linux for mining. I didn't care as much because I wasn't paying for power, but now that I am, I've taken a lot of time to ensure everything is balanced power-wise across risers/PSUs, with appropriate airflow, etc. It's better anyway, because I'm positive Windows only delays equipment failure, while Linux won't let you run if your equipment is near failure.

Yeah, I know about LA (I wrote about it in the topic), and I know some Linux basics; it's not about that.
The question is why mining on HiveOS consumes more power than on Windows with absolutely the same settings (the same miner, the same voltages, the same clock speeds, the same temperatures, the same fan speeds). I measure consumption with a power meter: with HiveOS it's 1750 W, with Windows it's 1670 W. At the same time, software monitoring shows approximately the same consumption for the video cards in both OSes.
That's why I'm talking about a "leak". I'm wondering where almost 100 W goes.

You're worried about a 100-watt difference? Maybe you should study integrated circuits and electricity, perhaps get some additional diagnostic tools (a multimeter) and develop tests to prove your hypothesis; maybe you can reach out to the kernel developers of Ubuntu and they might be able to point you in the right direction to squash this 100-watt difference. I believe in you…

Are you sure the VDDC settings are being applied correctly? I had this problem, and when I asked support they turned on 'aggressive power state', which dropped my power consumption a lot. I think the difference is that in Windows the reported voltage is not the same as what you actually set, due to droop and other factors; it also depends on how you set it, as MSI Afterburner uses offsets but doesn't always enforce them properly.

I have read a few posts on Reddit saying that Linux just uses more power for mining than Windows, but it also seems more stable, so I can only assume there is more voltage flowing through the system. For me, being able to use 4 GB cards with HiveOS at slightly higher power is better than battling Windows to keep mining. Also, any downtime spent trying to figure it out is time spent not mining.

bruh…

Seriously, 100 watts is a big loss over time; think of it as power that could be running a whole 'nother card.

It's possible to rewrite parts of the OS to address hardware differently, which may result in lower power draw. Personally, I have saved around 14-30 watts by fine-tuning BIOS settings alone (undervolting where possible, disabling all unused onboard systems) and haven't really needed to take it further, but these power savings aren't really Linux/Windows specific. It's just that Ubuntu is kind of unfriendly to some low-level changes, but there are plenty of other Linuxes out there, and I'm sure people do run custom Linux kernels for GPU-mining purposes; they just aren't telling everyone on mining forums about it.

I'll keep mining on HiveOS while I try to figure it out.

Yes, I'm sure.

I didn't use MSI Afterburner; I set the parameters in Claymore.

That sounds closer to the truth.

OK, OK, I suppose I should just live with it.

It's a competitive edge that you can choose to pursue. Linux was meant to be completely modifiable, but there's more than one way to screw up hard. If you're running several rigs and you can save 100 watts per rig, that's definitely worth it; but if the price doubles and you don't really care about electric costs, then it's time better spent doing other things.

I only own one rig with 13 4 GB cards, so I can't mine Ethereum on Windows anymore. 100 watts slightly reduces profitability, but for now there is no other choice. I just couldn't figure out whether the reason for the higher consumption is in HiveOS or in Linux in general.

Yeah, you've got bigger problems coming, like when the DAG goes over 3.99 GB in the coming months. Good luck.

I have the same issue. On Windows it was 300 watts at the wall; after switching to HiveOS it increased to 359 watts at the wall with the same OC settings.


Using proper OC settings you can achieve the same or even better power consumption than on Windows.
There are some factors which can influence the result:

  • BIOS mod (for Linux, timing-strap mods are enough)
  • use aggressive undervolting
  • set VDDCI equal to VDDC (core voltage); if your card stays stable, you can get up to 5 W lower power consumption from the wall (this change is not visible on Hive's dashboard)

It's double! Real consumption vs. what HiveOS reports.

RTFM!
Under Windows you get the same values in GPU-Z. AMD cards, and Polaris especially, don't have enough sensors and show only GPU core consumption.
You can set it up to show power consumption from the wall.
To achieve this you'll need to add some compensation in the farm or rig settings by playing with a wattmeter and the values below:
[image]

After that you will have something like this:
on Hive’s dashboard
[image]

and on your wattmeter or smart socket:
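The calibration described above can be sketched as follows (the helper name, the assumed efficiency, and the formula are all assumptions for illustration, not HiveOS internals; in practice you'd just adjust the fields until dashboard and wattmeter agree):

```python
def equipment_power_w(wattmeter_w, gpu_reported_w, efficiency_pct=90.0):
    """Back out a hypothetical 'equipment power' constant to enter,
    given a wattmeter reading, the dashboard's GPU total, and an
    assumed overall supply efficiency."""
    return wattmeter_w * efficiency_pct / 100.0 - gpu_reported_w

# Example with the thread's numbers: 1750 W at the wall and 965 W
# reported, assuming 90% efficiency -> about 610 W of equipment power.
print(equipment_power_w(1750, 965))  # 610.0
```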

I did it! Thanks.