Status 84 to FC-attached drives

BrettRabe
Level 3

Hello. I'm posting this here because, while tackling my own VTL/FC/NB status 84 problem, I never saw anything that really pointed me toward the solution specific to my install. Probably because it's so elementary.

By posting this I'm making myself look like an utter idiot, but if I can save someone else a headache, I'm OK with that.

We saw occasional status 84 errors when our VTL was under moderate loads; it would run fine at light loads.

After a long period of wrestling with this problem on the software side - tweaking our HBA settings, trying different HBA driver revs, and so on - I finally decided to take a physical look at the hardware.

That's when I discovered that the installers had used single-mode fiber for the runs from our (NetApp) VTL to the FC switch. Mysteriously, they had correctly used multi-mode from the switch to the servers and from the switch to our physical tape silo. Multi-mode optics driven over single-mode fiber make a badly mismatched, lossy link, so the errors only showed up once there was real traffic on it.

Moral of the story: start with the easy stuff. Check the physical connections and components first, and don't just assume the install was done correctly. It can save you many nights of headaches.

2 REPLIES

Marianne
Level 6
Partner    VIP    Accredited Certified

Thanks for sharing your experience, Brett.

You are one of the handful of geniuses who understand that status 84 is, in 99% of cases, caused by a hardware problem and not by NBU...

Ed_Wilts
Level 6

We started with ISLs at 2Gbps on 62.5µm fibre that were simply too long. We'd regularly get status 84 failures. We dropped to 1Gbps and we were fine (although technically just out of spec). We eventually replaced the switches, went with single-mode fibre ISLs (and single-mode SFPs, of course!) and now run them at 4Gbps without issues.
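To put rough numbers on why those runs were marginal: the commonly cited short-wave reach limits on 62.5µm fibre are roughly 300 m at 1G, 150 m at 2G and 70 m at 4G, while single-mode with long-wave optics goes for kilometres. The little Python sketch below just encodes those approximate figures; the table values and the helper name are my own illustration, not anything from the switches or from NetBackup.

```python
# Rough multi-mode reach limits in metres for short-wave Fibre Channel links.
# Values are approximate (rounded from commonly cited FC physical-layer figures)
# and the helper name is made up for this example; treat it as an illustration.
FC_MM_REACH_M = {
    ("OM1 62.5um", "1G"): 300,
    ("OM1 62.5um", "2G"): 150,
    ("OM1 62.5um", "4G"): 70,
    ("OM2 50um",   "1G"): 500,
    ("OM2 50um",   "2G"): 300,
    ("OM2 50um",   "4G"): 150,
}

def run_within_spec(fibre: str, speed: str, length_m: float) -> bool:
    """True if a run of length_m metres is within the approximate reach limit."""
    return length_m <= FC_MM_REACH_M[(fibre, speed)]

# A 200 m ISL on 62.5um fibre is within spec at 1G but well out of spec at 2G,
# which lines up with the behaviour described above.
print(run_within_spec("OM1 62.5um", "1G", 200))  # True
print(run_within_spec("OM1 62.5um", "2G", 200))  # False
```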

You sometimes think physical connectivity is the "easy stuff", but when it comes to fibre runs, especially outside of the data center, you have to know what you're doing. I have my Brocade certifications now :)