
CMP 170HX not working

So I got a couple of CMP 170HX cards but no matter what I do, I can’t get HiveOS to detect them. Here are the details:

  • ASRock H110 Pro+ BTC motherboard
  • Core i3 CPU
  • 240G Kingston SSD
  • 8G AData RAM
  • Thermaltake 850W PSU
  • HiveOS 0.6-211@211111
  • t-rex miner
  • nVidia Driver 470.82.00 (latest stable nVidia driver for Linux) and I’ve also tried the latest driver 495

I have tried connecting the card directly to the motherboard and through a riser; I’ve tried t-rex miner, nbminer, and ethminer; I’ve applied BIOS settings I got from HiveOS support; and I’ve tried 2 different cards, 2 different risers, and 2 different PSUs. The result is always the same: the rig shows up on my HiveOS farm, but the CMP 170HX never gets detected. Switching between the latest driver and the latest stable nVidia driver made no difference.
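For anyone hitting the same wall, it helps to narrow down where detection fails before swapping hardware. A minimal sketch (run on the rig’s shell; the fallback `echo` messages are placeholders I added, and the output will obviously differ on a live rig):

```shell
# 1) Does the PCIe bus see the card at all? The 170HX should appear as an
#    NVIDIA "3D controller" even when no driver has claimed it yet.
lspci -nn 2>/dev/null | grep -i nvidia \
  || echo "no NVIDIA device visible on the PCIe bus (check riser/slot/power)"

# 2) Did the driver actually bind to the card?
nvidia-smi -L 2>/dev/null \
  || echo "driver did not claim any GPU (check driver version / power cabling)"
```

If step 1 shows the card but step 2 comes up empty, the problem is usually power delivery or the driver, not the riser or slot.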

Can someone help me get this figured out? Thanks.

If you send me one card, I think I can figure it out. :wink:

haha…actually I figured it out. And if you are interested in getting these, they do hash about 163MH/s…for about a minute, until they get stupid hot and shut down. My guess is that nVidia expects people to add some serious cooling to these things. They won’t hash at all with the passive cooling solution on them.

I saw that the unit has no fans. I was wondering if it gets hot.

I am having the same overheating problem. You said you found the solution… what did you do??


So I had a few Noctua fans left over from a rig I didn’t use them on, and I installed 3 in push and 3 in pull configuration, but since they are only case fans that run at 2000rpm, it’s not enough; the cards still get very hot. So I added a small 12" floor fan and that has helped a bit, but they still don’t hash at the full rate. Something else in my case: the damn electrician hasn’t come in to set up the wiring for the exhaust fan, so the fan is installed and ready to go, but it’s not getting the hot air out. I also have an old rack that used wooden shelves, and I have a wire rack on order that will help with airflow.

I ordered the fan in the link below from Amazon. I was originally going to use ASIC miner fans that spin at 5000 or 6000 rpm, but each of those uses 2A of power, so 6 fans would consume 12A, which to me was just nuts. Here is the fan I ordered, a picture of my setup, and a screenshot of what they are running at. Note that during the day, when I keep the shed door open, they hash much better, but still not at their full capability.
Fan: https://smile.amazon.com/gp/product/B07DVPWG6L/ref=ppx_yo_dt_b_asin_title_o01_s02?ie=UTF8&psc=1
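The fan-current math above works out like this (a quick sketch using the numbers from the post, assuming 12 V fans):

```shell
# Why six high-rpm ASIC fans were ruled out: 2 A per fan at 12 V.
fans=6
amps_per_fan=2
volts=12
total_amps=$((fans * amps_per_fan))   # 12 A total
watts=$((total_amps * volts))         # 144 W just for cooling fans
echo "${fans} fans draw ${total_amps} A at ${volts} V = ${watts} W"
```

Nearly 150 W of fan power is a real dent in mining efficiency, which is why a single high-CFM fan or proper ducting is usually the better trade.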

185 MH/s with the Team Black Miner…

Just updating in the interest of helping others. Running a 7800CFM floor fan didn’t help as much as I thought, so two things I’ll be doing now are installing some large vents on the side of the shed and adding some 5300rpm fans I ordered along with a fan hub that runs off a SATA power connection. The improvement from the floor fan was only about 100MH/s for the rig, and it still tops out at around 620MH/s, when running 5 170HX cards at 163MH/s each should yield 815MH/s. I’ll report back once I install the vents and the high-rpm fans.
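To put the throttling loss in numbers (a sketch using the figures from this post):

```shell
# Expected vs observed hashrate for 5 CMP 170HX cards.
cards=5
per_card_mh=163
observed_mh=620
expected_mh=$((cards * per_card_mh))     # 815 MH/s
lost_mh=$((expected_mh - observed_mh))   # 195 MH/s lost
pct=$((100 * lost_mh / expected_mh))     # ~23% lost to thermal throttling
echo "expected ${expected_mh} MH/s, observed ${observed_mh} MH/s, ~${pct}% lost"
```

Losing roughly a quarter of the rig’s hashrate to heat makes the cost of proper cooling easy to justify.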

Hey, I got the same problem. Can you help me out with the solution? My miner doesn’t detect the GPU at all.

In my case, I was using one PCIe cable directly from the power supply to the card, but the cards come with a PCIe splitter. Once I connected the single end of the splitter to the card and then connected two PCIe cables from the PSU to the splitter, the cards ran fine and were detected by HiveOS with no problem. If you have an ATX PSU and your PCIe cables have split ends, connect both to the splitter cable and then the splitter to the card. If you are using a server power supply, just run two PCIe cables from the breakout board to the splitter and then the splitter into the card.

Also, make sure you have some good cooling. The passive cooling they ship with is crap; they won’t run for more than a minute before shutting down because they get too hot.
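If you want to see the shutdown coming rather than guess, nvidia-smi can report temperature and power while the miner runs. A sketch (these are standard nvidia-smi query fields; on a rig you’d typically loop it by adding `-l 5`, and the fallback `echo` is just a placeholder I added for machines without the driver):

```shell
# Snapshot of per-GPU temperature, power draw, and SM clock.
# On a machine without an NVIDIA driver this falls through to the echo.
nvidia-smi --query-gpu=index,name,temperature.gpu,power.draw,clocks.sm \
  --format=csv 2>/dev/null \
  || echo "nvidia-smi not available on this machine"
```

Watching the temperature climb during the first minute of hashing tells you exactly how much headroom your cooling setup leaves.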

The cmp170hx needs DIY water cooling to work! And for that you’ll have to pay about $260!

Surely you did enough research before dropping a huge amount of money on these to know that they are designed to run INSIDE of a server, not open air. GPU servers have crazy volumes of air pushed through them, much like an ASIC. I am surprised you haven’t damaged them running them open air like that. It speaks volumes to the ability of Nvidia’s overheating throttling to save the cards from destruction.

Hi bro, this is the solution for the overheating issue…
Just implement it as in the pic


Sweet deal man, thanks for the suggestion. If I may ask, which case and fans are you running? Are the fans on the left pushing and the ones on the right (in the plastic) pulling? What are your temps running them like this? Thanks again.

Sure man, the right fans are pushing air into the VGA and the left ones are pulling out the hot air.
These fans are around 3000 rpm, to be honest.
The box is from China; if you need a link, let me know.

Before I ordered these cards, I researched them for about 2 weeks before I pulled the trigger. Unfortunately, them being so new, there is little information about them, hence my initial issue of them not running at all. I am glad nVidia can design safety mechanisms into their gear, but it would have been nicer if they actually listed the damn cards on their website with some guidance and specs so people can run them safely and effectively. They don’t even list them on their website; all I could find was info on their 30HX, 70HX, and 90HX cards, all of which have fans, but not a peep on the 170HX or the rumored 230HX cards. I asked on Discords about these cards, I searched Google and YouTube, and there was almost no information on them at all. I did find a video of a guy in Indonesia whose cards were running fine, but he only showed his HiveOS setup, not how he was running them. The fact that there are threads like these should prove that nVidia has done a piss-poor job of publishing guidance on these cards; hell, there wasn’t even a quick-start guide included with the cards.

In a sense, I appreciate the feedback about using a server case, but I don’t appreciate the condescending tone.

Definitely, please share the link. I am considering a server case, though I see you are also using a riserless mobo; can you share the details of that one as well?

I apologize for the tone. An article published in September on videocardz.com specifically states that the CMP 170HX is a repurposed Nvidia A100. Those cards are designed to run in servers, much like this : Asrock Rack 3U8G+/C621 3U Rackmount Server Barebone Dual Socket LGA3647 Intel C621 8 GPU - Newegg.com
Using a GPU server like this is by far the best way to ensure the safety of these very expensive cards.

Hi brother, the cmp170hx needs a DIY cooling system, and you need to modify the cooling backplate to install a Corsair 240 AIO cooler! Hope it is useful to everyone!