What is Unraid and how to build an Unraid media server

Pauven
Posts: 2781
Joined: Tue Dec 26, 2017 10:28 pm
Location: Atlanta, GA, USA
Contact:

Re: Need advice on building an unraid media server

Post by Pauven » Thu Feb 21, 2019 1:45 pm

Jamie wrote: Thu Feb 21, 2019 12:09 pm I will look at xcase especially the RM424 Pro-V2 and the above as you suggested. What are the shipping costs from the UK to the east coast??

I bought mine so long ago, it could have changed dramatically since then. I think mine was between $100-200. Fill out the form on the website to get a free quote.
President, Chameleon Consulting LLC
Author, Chameleon MediaCenter

Pauven
Posts: 2781
Joined: Tue Dec 26, 2017 10:28 pm
Location: Atlanta, GA, USA
Contact:

Re: Need advice on building an unraid media server

Post by Pauven » Thu Feb 21, 2019 1:51 pm

Jamie wrote: Thu Feb 21, 2019 12:09 pm Now for the controller card and MB. Any suggestions there ?? I will do my research. Just need some experienced what to watch out for advice.

This will depend upon the case, and the backplanes in the case.

If you get a case with SAS backplanes, you will need different controllers than a case with the expanders. I'll look at options for you. I think you need to add the [Controler Cost] to the [Case Cost] to truly compare one case to another. Give me some time to look at options.


Jamie wrote: Thu Feb 21, 2019 12:09 pm For CPU, you mention the atom processors as being acceptable. So the Zeon is really not necessary for doing just what I need to do which is storage?

If you do storage only, Atoms will work. But read up on my post above about the Dockers. If you think there's a chance you want to do Dockers, let's give this thing some more horsepower. For example, if you want to do video transcoding in a Docker, you need a CPU that is good at video transcoding - Atoms would be a poor choice for this.

There's even a possibility I may come out with my own Dockers in the future for CMC... (but no promises).
President, Chameleon Consulting LLC
Author, Chameleon MediaCenter

Pauven
Posts: 2781
Joined: Tue Dec 26, 2017 10:28 pm
Location: Atlanta, GA, USA
Contact:

Re: Need advice on building an unraid media server

Post by Pauven » Thu Feb 21, 2019 1:58 pm

Also, the CPU and Controller Card choices will affect your motherboard choices too.

For example, you could get a low-end motherboard with an embedded Atom processor, but it may only have a single PCI-e slot. That means you need a controller card capable of communicating with all 24 drives: either a 6-channel SAS card (like my $650 card), or a simple 4/8-port HBA controller (but that only works with the EX case).

Alternatively, if you get a motherboard with at least three full-size PCI-e slots, you have a lot more options on controller cards. For example, you could get 3 controllers that each support 8 drives. Or if you get a motherboard with at least 8 SATA ports, you could use those plus 2 controllers that each support 8 drives.

So case choice + motherboard choice has a big impact on controller card choice.

I'll be working up some options for you, but this may take a while. If you can do the EX case, that greatly simplifies your motherboard/controller options, so I think you should go ahead and reach out to X-Case to see if the EX will ever come back. It may be EOL.

Also, you can buy your own expanders to go between your adapters/SATA ports and the case. I'll be looking at those options too.

There are even some motherboards that have some SAS connectors built-in. While these are more expensive, they may eliminate the need for purchasing more expensive controller cards.
President, Chameleon Consulting LLC
Author, Chameleon MediaCenter

Jamie
Posts: 945
Joined: Wed Dec 27, 2017 11:26 pm

Re: Need advice on building an unraid media server

Post by Jamie » Thu Feb 21, 2019 4:54 pm

Pauven wrote: Thu Feb 21, 2019 1:51 pm
If you do storage only, Atoms will work. But read up on my post above about the Dockers. If you think there's a chance you want to do Dockers, let's give this thing some more horsepower. For example, if you want to do video transcoding in a Docker, you need a CPU that is good at video transcoding - Atoms would be a poor choice for this.

There's even a possibility I may come out with my own Dockers in the future for CMC... (but no promises).

I don't think that I will use this device for VMs, but possibly Dockers. I want this machine to last a few years, so I don't want to have to upgrade the CPU and RAM after a year. RAM is a more doable upgrade if you plan ahead, but it would be nice to not have to open it up for a couple of years unless something fails.

Also, regarding a monitor. Does unraid have the ability to login from a different PC?

If others want to pursue an Unraid server, you are providing a lot of good info for them, and others should feel free to chime in at any time. I look at this forum as a community forum and not just a one-on-one interface with Paul.

Pauven
Posts: 2781
Joined: Tue Dec 26, 2017 10:28 pm
Location: Atlanta, GA, USA
Contact:

Re: Need advice on building an unraid media server

Post by Pauven » Thu Feb 21, 2019 5:48 pm

Okay, a concept is coming together. This concept uses the X-Case eXtra Value with two 8-port controller cards and a motherboard with lots of SATA ports to minimize the # of controller cards needed. Here's the high level price breakdown:
  • Motherboard: $160
  • Memory: 8GB @ $55 / 16GB @ $90 / 32GB @ $180
  • CPU: Low-end @ $71 / Mid-range @ $172 / High-end @ $269
  • Controller Cards: $112
  • SAS Drive Cables: $64
  • Power Supply: $90
  • Case: ~$500 shipped (you need to get a shipping quote)
  • Unraid Pro license: $130

Total Costs:
  • 8GB Low-end: $1182 (that's only $50 per drive bay!!!)
  • 16GB Midrange: $1318
  • 32GB High-end: $1505

Optional Cache Upgrades
  • 1x 1TB SSD (Samsung 860 EVO): $150
  • 2x 1TB SSD (Samsung 860 EVO): $300 (mirrored for protection)
  • 1x 1TB NVMe (Samsung 970 EVO): $300 (only do 1 or you lose SATA ports)
My recommended options are in green, as they represent the best value plus Docker capability without going overboard, in case you want to run Dockers or maybe a lightweight VM. That's less than $1500 for a nicely equipped Unraid server.
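As a quick sanity check, the 8GB low-end configuration sums like this (prices copied from the list above; the per-bay arithmetic is my own throwaway sketch, not from the post):

```python
# Sanity-check of the 8GB low-end build total
parts = {
    "motherboard": 160,
    "memory_8gb": 55,
    "cpu_low_end": 71,
    "controller_cards": 112,
    "sas_cables": 64,
    "power_supply": 90,
    "case_shipped": 500,   # ~$351 case plus estimated shipping
    "unraid_pro": 130,
}
bays = 24
total = sum(parts.values())
print(total, round(total / bays, 2))  # 1182, 49.25 -- "only $50 per drive bay"
```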


Motherboard
The ASRock Z270 TAICHI LGA 1151 motherboard ($160, https://www.newegg.com/Product/Product. ... 6813157754), which will provide up to 10 SATA ports on-board, and at least two PCI-Express 3.0 x16 ports running in x8 mode concurrently (important for the Controller Cards below). This board also has 3 Ultra M.2 sockets for NVMe SSD's.

With this board, you would connect 8 SATA ports to the X-Case backplanes. That leaves either 2 SATA ports for SSD cache drives (i.e. you could connect two 1TB drives and mirror them, so you get 1TB of cache that is mirrored for protection), or you can use 1 M.2 socket to connect an NVMe SSD (only one, though, otherwise you lose some of those 8 SATA ports going to the case). While the NVMe is definitely faster, you won't notice this on a gigabit network connection, and they are more expensive, but the choice is yours.

Do note that any data on the cache is not protected unless you mirror the cache with 2 drives. Typically your rips will hit your cache drive first, then get moved over to the array overnight, so having a mirrored cache is not important. But if you use Dockers/VM's, that data may also reside permanently on your cache, so having those drives mirrored will be nice insurance. You can also set up some shares to always be on the cache (i.e. Music) so you get super fast network access to that data without spinning up any drives. In that case, you could either mirror your cache drives or just keep an extra copy of your music in your protected array as a backup.

Also note that while the cache drive is technically optional, YOU WILL WANT THIS!!! A must have option in my mind. I was scared of setting up a cache drive for years, but it is so easy and works so well, I regret every day I neglected to do this.

This motherboard is compatible with both 6th and 7th gen Intel Core i7/i5/i3 / Pentium/Celeron Processors (Socket 1151), but not 8th or 9th gen.

This motherboard has dual gigabit NICs. Hopefully that is fast enough.


CPU
For basic Unraid file serving, you don't need much horsepower at all. I ran for years on a very low end Celeron G1610. These days, a Celeron G3950 is a good choice at $71 with heatsink and fan included. Note that this CPU only supports up to DDR4 2133, so you might be able to save a few more bucks by looking at slower memory than what I listed below.

If you think you will be running some Dockers and doing any transcoding, it's probably worth throwing a bit more horsepower at it. The Core i3-7300 is a nice upgrade at $172 (heatsink and fan included). This also includes Hyperthreading and VT-d Virtualization, so you can run VM's, though with it only being a dual core I'm not sure I'd recommend running a Windows VM on it.

If you'd like to run a Windows VM (desktop/server) I would definitely recommend stepping up to a quad-core CPU. The Core i5-7600 is a great choice at $269 including the heatsink and fan.



Graphics/Video Card
All three of these CPU options include integrated graphics, as does the motherboard. So you won't need an external graphics card unless you want to do something like a Win10 VM for gaming or movies. I run lots of VM's (up to four Windows Server 2016 at the same time) but I access them all remotely over RDP, so I don't need anything more than basic graphics. But I'm the exception - most Unraid users who are doing VM's are doing this for a Windows desktop, so they need high end graphics passed through.

Normally the only time you will ever use the server's graphics capabilities is when setting up the motherboard's BIOS options, or when using the Unraid console for some troubleshooting. 99.9% of the time, you will access the server remotely over the network, using the web interface or remote console/terminal.

Instead of hooking a monitor/keyboard up to it, you could use a KVM-over-IP device: you connect to it over your network and view the monitor's output, including the BIOS screen. If you hunt around, I think you can find these for under $100. I keep thinking about getting one, but I so rarely interact with the server hardware in the basement, and I had a spare monitor/keyboard handy.


Memory/RAM
For basic Unraid file serving, you don't need much memory at all. I ran one for years on just 2GB of RAM. The biggest consumer of RAM for file serving is the directory caching (which is awesome, as a bigger cache of your directories prevents drives from spinning up just to browse your folders and makes browsing much faster). Currently my server is using 3.2 GB of RAM with a handful of plugins and a couple Dockers, but no VM's.

So at a basic level I would recommend 8GB. You can easily find 8GB DDR4 2400 for $55 or less.

If you plan to run some Dockers, it's probably worth the upgrade to 16GB, and you can get 16GB DDR4 2400 for about $90.

And if you plan to run VM's, or Dockers and VM's, 32GB is a good upgrade. 32 GB of DDR4 2400 is about $180.

It's probably not worth chasing higher RAM speeds unless you will be doing memory intensive tasks like transcoding.

It's always smart to review the motherboard's Memory Support List (on ASRock's website) to make sure you are getting supported memory.



Controller Cards
One of the most popular controller cards is the LSI 9201-8i. These are great for Unraid. Each one supports 8 drives (2 SAS connectors). You will need 2 of these to control 16 drives, since the motherboard will handle the other 8. They cost $56 each, shipped, for a total of $112: https://www.ebay.com/itm/LSI-6Gbps-SAS- ... :rk:1:pf:0

These are older cards, using a PCI Express 2.0 x8 connection, so it is important to make sure these get installed in a slot with x8 connectivity. The motherboard I spec'd above will do this just fine, you just have to make sure you use the right slots. If you use the wrong slot, they will drop to x4 lanes, halving throughput. Also, if you use 3 slots, then at most you will be in x8/x4/x4 mode, so you do not want to have 3 PCI-e slots occupied on this motherboard, or you will halve throughput on at least one of these controller cards.
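The x8-vs-x4 throughput point can be sketched with rough numbers. PCI Express 2.0 moves roughly 500 MB/s per lane after encoding overhead; the per-drive split below is my own back-of-the-envelope math, not from the post:

```python
# Rough effect of slot width on an 8-drive HBA like the LSI 9201-8i
# (PCIe 2.0 ~= 500 MB/s per lane; real-world throughput runs somewhat lower)
per_lane_mb = 500
drives_per_card = 8

per_drive_mb = {}
for lanes in (8, 4):
    slot_bw = per_lane_mb * lanes                    # total slot bandwidth, MB/s
    per_drive_mb[lanes] = slot_bw / drives_per_card  # share per drive if all 8 stream
    print(f"x{lanes}: {slot_bw} MB/s slot, {per_drive_mb[lanes]:.0f} MB/s per drive")
```

In x8 mode every drive gets ample bandwidth; dropping to x4 halves the per-drive share, which starts to matter when all drives read at full speed during a parity check.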

There is a newer model, the 9207-8i, that uses a faster PCI Express 3.0 x8 connection, and would still be okay even in PCI-e 3.0 x4 mode, but this costs about $40-$50 more per card, so I really don't think it is worth it unless you think you'll have 3 PCI-e cards installed at some point.


SAS Data Cables
You will need four Mini-SAS SFF8087 to Mini-SAS SFF8087 cables (two per card, each cable supports 4 drives). These cables run from the LSI Controller Cards to the case's backplane. These are $9 each on Amazon (the 0.7m length should be good), for a total cost of $36:

https://www.amazon.com/CABLEDECONN-Inte ... B00S7KU3PC

You will also need 2 Mini SAS to SATA Reverse Breakout Cables (SFF-8087 to 4x SATA) which will connect the 8 SATA ports on the motherboard to the case backplanes. These are $14 on Amazon (1.6 feet length is good), for a total cost of $28:
https://www.amazon.com/Cable-Matters-In ... B018YHS9GM


Case
X-Case eXtra Value RM 424 for $351. The EX model is $538, $187 more than the eXtra Value, but saves you the expense of having to buy those 2 controller cards. Since the controller cards will only cost $112 for two, the eXtra Value case actually comes out $75 cheaper. True, you do have to splurge for a nicer motherboard with both more PCI-e lanes and more SATA ports, but you're probably still saving money over the EX model. On top of being cheaper, the controller cards provide dedicated bandwidth to every drive, while the expanders share bandwidth, possibly making certain data activities slower (i.e. monthly parity checks that use all drives simultaneously). So faster and cheaper = win in my book.

This case also has room to install a couple 2.5" SSD's inside, perfect for cache drives.

You'll need to get a quote on shipping, but I'm guessing it will be around $150, so let's call this case $500: https://www.xcase.co.uk/collections/4u- ... tswap-bays


Power Supply
The biggest power draw a server like this will see is when all 24 drives are spun up simultaneously (Unraid does not have a staggered spin-up, so even if your motherboard/controller cards offered staggered spin up, that would only be at system start-up). Whenever all your drives are idle, starting a Parity Check in Unraid will instantly spin up all 24 drives, maximizing the hit on the power supply.

This load is estimated at up to 50 Amps on the 12v line, and 9A on the 5V line. You'll obviously want a power supply that exceeds these specs, so you have capacity left over for the rest of the system (CPU/motherboard/controller cards/graphics/cooling).

Your typical 650W power supply will have plenty of 5V amperage, but only 50-55 Amps on the 12v line. A 750W power supply gets you into the low 60's on the 12V line, and should be good for most systems. An 850W takes the 12V line over 70A, but this should only be needed if you are using a very high-end power hungry CPU. If you were doing a dedicated high-end graphics card, you'd probably need to bump this up to 1000W or more, depending upon the card.
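The 12V headroom math above works out roughly like this (a sketch using the post's 50A spin-up estimate; the PSU amperages are typical ratings, not specific models):

```python
# 12V rail headroom during a worst-case all-drive spin-up
spinup_amps = 50                  # estimated draw when all 24 drives spin up at once
rail_volts = 12
spinup_watts = spinup_amps * rail_volts   # 600 W on the 12V rail alone

typical_12v_amps = {"650W": 52, "750W": 62, "850W": 70}
headroom = {name: amps - spinup_amps for name, amps in typical_12v_amps.items()}
print(spinup_watts, headroom)     # 650W leaves almost nothing for the rest of the system
```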

I'm recommending this Seasonic FOCUS SSR-750FM 750W 80+ Gold power supply. It is semi-modular, has 12V@62A, and an excellent value at $90: https://www.newegg.com/Product/Product. ... gnorebbr=1


Hopefully I didn't miss anything.

Paul
President, Chameleon Consulting LLC
Author, Chameleon MediaCenter

Jamie
Posts: 945
Joined: Wed Dec 27, 2017 11:26 pm

Re: Need advice on building an unraid media server

Post by Jamie » Fri Feb 22, 2019 9:35 am

This is a great value machine Paul! Much lower than I expected. I will start with getting the X-Case eXtra Value RM 424. I will get a quote first. I read this quickly so I'll look at it more thoroughly later.

I was wondering whether Unraid has a remote access ability like VNC?

Pauven
Posts: 2781
Joined: Tue Dec 26, 2017 10:28 pm
Location: Atlanta, GA, USA
Contact:

Re: Need advice on building an unraid media server

Post by Pauven » Fri Feb 22, 2019 10:04 am

A note on AMD Ryzen & Threadripper for Unraid

Anyone reading this thread, written in early 2019, would rightly question why I'm recommending Intel based systems and not AMD based systems. After all, the AMD Ryzen & Threadripper are more in-line with server roles, and their higher core counts and PCI Express lanes make them ideal candidates for an Unraid server that will see one or more Windows VM's passed through.

I agree 100%. That's why I built one in April of 2017, right after Ryzen was released. My Ryzen 7 1800X was one of the very first Ryzen Unraid systems, and I was at the forefront of discovering that there was an issue that was causing frequent system hangs, and discovering a workaround.

Long story short, when the Ryzen systems were completely idle and entered extremely low power states, the CPU consumed so little power that it caused a power imbalance across 5v and 12v rails that would exceed power supply design specs. This would cause the power supply's power output to become unstable, and in turn this would cause the CPU to hang or crash. (Note: there is a possibility that there is more than one issue at play, and another possible issue is that c-state values are being called incorrectly on early Ryzen revisions.)

The issue affects different systems to different degrees, depending upon the individual build and how it loads up the power supply's 5v and 12v rails, the individual power supply used, and how idle an Unraid server is allowed to be (i.e. running a Windows VM typically keeps the CPU active enough to greatly mitigate this issue). Even overclocking the CPU and driving up power usage ironically helps stabilize the system. My particular build and usage was extremely sensitive to this issue, and would crash within just a few hours of startup.

After an intensive investigation, what I discovered was that disabling Global C-state Control in the BIOS made this problem go away. Later, many motherboard manufacturers added a new BIOS setting to ensure power supplies are kept in their designed operating range (but this has never been added to my motherboard's options). There has also been a startup option identified for Unraid that disables C-states at the system level, replacing the BIOS Global C-state Control option.

It also seems that 2nd generation Ryzen 2000 series have a design improvement that minimizes or eliminates this issue.

Apparently this issue only affects Linux systems, and does not affect Windows. In my own testing of multiple Linux systems, I only observed the issue on Unraid's own Linux distro. That said, there is an open support ticket on this issue for Linux, and it certainly seems that many or all Linux flavors are susceptible, and unfortunately nearly two years later it is still not resolved: https://bugzilla.kernel.org/show_bug.cgi?id=196683

By employing the available solutions mentioned above, I currently enjoy my Ryzen 7 1800X based Unraid system with no stability issues at all. So, problem solved, right? Not exactly.

First, the latest beta versions of Unraid 6.7.0 seem to have brought the crash/hang issue back for Ryzen users. Worse, it seems that our normal mitigation solutions are no longer working: https://forums.unraid.net/bug-reports/p ... lock-r354/

I haven't tried the new 6.7 versions yet, so I don't have firsthand knowledge if the reported issues would affect my system, though I fear they might.

In general, it is the lack of confidence that this issue is well and truly solved that makes me hesitant to publicly recommend these systems. At the same time, if you're willing to deal with mitigating this issue, you might want to consider them. I love mine, and if I were to build another Unraid system for myself, I would likely use a Threadripper to build a 16-core 128GB beast of a machine, so I can better run multiple Windows server VM's.


What about Jamie's Build?
One other point I want to touch on is why I didn't recommend a Ryzen system for Jamie's build.

Based upon Jamie's use case (primarily a file server, possibly some Docker usage), goals (20 or 24 hot-swap drive bays), and budget (cheaper is better), the AMD systems just couldn't compete with the Intel systems.

Problem 1 - most AMD Ryzen CPU's do NOT include an integrated GPU (essentially the exact opposite of Intel, where most do), so you are limited in CPU selection OR you have to use a discrete GPU card at an additional cost. While most users run their Unraid systems headless, Unraid will NOT boot without some type of graphics capability in the system - it doesn't have to be hooked up, but it does have to be present.

Problem 2 - because most Ryzen CPU's do not include a GPU, a large percentage of Ryzen motherboards also do not include video outputs, so you are further limited in your motherboard selection. These motherboards are also more expensive.

Problem 3 - To control 26 drives (24 hot-swap bays plus 2 cache drives), you will need either 3 drive controller cards with 8 ports each (or their equivalent) plus a couple extra on the motherboard for the cache drives, or a motherboard that offers 10 SATA ports to go along with 2 controller cards delivering 16 ports.

If you combine the 3 problems, what you discover is this: If you want to do integrated graphics, then there are no motherboards available that offer 10 SATA ports, so you are forced to buy a 3rd controller card. Or if you forego integrated graphics to get a more capable motherboard, you are forced to buy a discrete graphics card.

While a Ryzen system would normally trounce an Intel system in bang for the buck, for this exact scenario of a 24 bay hot-swap server with at least two extra SATA ports for cache drives the Intel based systems come out cheaper for the same capability.

Had I recommended a 20 bay hot-swap server, an AMD system would have come back into consideration. But seeing as the 20-bay servers are the same physical size as a 24 bay server, I personally think it is silly to consider a 20-bay server. 24 is the way to go, and it greatly drives down your $/drive cost.


The Final Word
One last thing I would like to state is that I am a huge AMD fan, so much so that I own a ton of AMD stock (more than I care to admit), and I follow them very closely and I'm highly motivated for them to succeed.

So it is with great pains that I write this post that recommends an Intel system over AMD.

While I truly think AMD Ryzen and Threadrippers are fantastic processors (for example, I'm writing this post on my Threadripper 2950X), superior in many ways to Intel's offerings, and an amazing value, due to the lack of a formal resolution of the stability issue on Unraid/Linux (and the uncertainty as to whether current mitigation steps will continue to work in new versions of Unraid), combined with the unique build requirements of Jamie's system, I felt it unwise to recommend an AMD Ryzen based system for this build.

Had Jamie needed a more powerful server with more CPU cores, I think the recommendation may have gone in a different direction. But it is what it is.

Paul
President, Chameleon Consulting LLC
Author, Chameleon MediaCenter

Pauven
Posts: 2781
Joined: Tue Dec 26, 2017 10:28 pm
Location: Atlanta, GA, USA
Contact:

Re: Need advice on building an unraid media server

Post by Pauven » Fri Feb 22, 2019 10:19 am

Jamie wrote: Fri Feb 22, 2019 9:35 am This is a great value machine Paul! Much lower than I expected. I will start with getting the X-Case eXtra Value RM 424. I will get a quote first. I read this quickly so I'll look at it more thoroughly later.

Compared to Drobo, an Unraid system represents an amazing value. Even if they were the same price, I would choose Unraid over Drobo simply for the added functionality of Unraid, and the ability to maintain your own hardware. But a 24-drive Drobo system would cost 3x what this Unraid system costs (you'd have to buy 3 8-drive Drobo systems), and worse the Drobo system expends more of your storage on parity, so you have less storage available for data.

Another nice benefit of Unraid is that your data is not striped across drives, but rather exists whole and intact on a single drive. If the worst case were to happen and you lost 3 drives at the same time (with double parity, you can lose 2 and still rebuild), you would only lose the data on those 3 drives, and your other 21 drives would be intact. And if you're lucky and 2 of those drives lost are your parity drives, then you only lose the data on that one data drive.

Back before dual parity was an option, I lost two drives at the same time: my single parity drive and a data drive. But luckily the data drive would run and be readable for short periods of time (1-2 hours) before it would crash again. So I removed both bad drives from the server, installed replacements, rebuilt the parity, and got the system back to healthy but with an empty data drive that replaced the failed data drive. I then connected the failed data drive to my Windows PC, and over the course of several days I was able to copy data off of it in short bursts. Eventually I recovered all of my data from that drive, and I didn't even have to use any special RAID recovery type tools.

Jamie wrote: Fri Feb 22, 2019 9:35 am I was wondering whether Unraid has a remote access ability like VNC?

Yes, VNC type capability is built in, and allows you to view the VM desktops essentially from your browser. I find that the mouse movements don't translate well, especially on Windows VM's, so I typically use RDP instead of the built in VNC type solution. There may be other options too, I simply haven't explored them.

Paul
President, Chameleon Consulting LLC
Author, Chameleon MediaCenter

Jamie
Posts: 945
Joined: Wed Dec 27, 2017 11:26 pm

Re: Need advice on building an unraid media server

Post by Jamie » Fri Feb 22, 2019 7:00 pm

Pauven wrote: Fri Feb 22, 2019 10:19 am Compared to Drobo, an Unraid system represents an amazing value. Even if they were the same price, I would choose Unraid over Drobo simply for the added functionality of Unraid, and the ability to maintain your own hardware. But a 24-drive Drobo system would cost 3x what this Unraid system costs (you'd have to buy 3 8-drive Drobo systems), and worse the Drobo system expends more of your storage on parity, so you have less storage available for data.

I agree with you Paul. I just pissed my money away with these Drobos. Sorry for the wording, but it is the most accurate term I can use here.

I spent over $1000 on each of my two 8-drive Drobos and another $500 apiece for my three 5-drive Drobos. They use more space for single-drive parity than Unraid, and you can only use 80% of the free space left over after parity. If usage goes over 80%, it starts throwing obnoxious warnings.

Pauven wrote: Fri Feb 22, 2019 10:19 am
Another nice benefit of Unraid is that your data is not striped across drives, but rather exists whole and intact on a single drive. If the worst case were to happen and you lost 3 drives at the same time (with double parity, you can lose 2 and still rebuild), you would only lose the data on those 3 drives, and your other 21 drives would be intact. And if you're lucky and 2 of those drives lost are your parity drives, then you only lose the data on that one data drive.

That is essentially what happened to me. Due to some failure of the Drobo, it reported that the wrong drive had failed, which I ended up misplacing. When I got a new Drobo to replace the failed device, I couldn't fix the problem without losing all my data. I had one bad drive, and I was missing the other drive that was misreported and lost in the hassle of trying to fix the broken Drobo. Now I have to hunt down all my DVD/Blu-ray discs that I had on the original Drobo and rip them again.

I was always worried about replacing a failed drive in the Drobo, because during the time it was reallocating the space ("pairing?") for the replaced drive, if something went wrong like a power surge or failure, you could lose all your data. Sometimes it took a couple of days for the Drobo to reallocate itself after a drive failure, due to the size of the drive that was replaced.

Drobo also had a habit of changing how they configured the drives on a new drobo model from a previous model so that the drive packs from one to another were not compatible.

Hopefully I will have a better experience with Unraid.

Jamie

Pauven
Posts: 2781
Joined: Tue Dec 26, 2017 10:28 pm
Location: Atlanta, GA, USA
Contact:

Re: Need advice on building an unraid media server

Post by Pauven » Fri Feb 22, 2019 7:43 pm

Jamie wrote: Fri Feb 22, 2019 7:00 pm I was always worried about replacing a failed drive in the Drobo because during the time it was reallocating the space "pairing?" for the replaced drive, if something went wrong like a power surge, or failure, you could lose all your data.

That's a perfect segue into my next topic: UPS devices.

I have several UPS devices, but even more computers than UPS devices. I've NEVER had a PC fail that was always run on a UPS device. I have PC's going strong 5-10 years, until I upgrade and retire them, on my UPS devices.

But the PC's that I never plug into UPS devices have ALWAYS failed on me, even through multiple rebuilds, and often within 1-2 years. It's so frustrating. I've rebuilt my HTPC twice in the past two years, and it has been dead for the past several months. I need to buy another UPS before I rebuild it yet again.

I always buy UPS devices that have Pure Sine Wave line conditioning, and I really believe that is the secret to how they protect these computers. Sure, it helps that the power isn't turned off unexpectedly, but even better, these PCs are receiving clean AC power, immune to brownouts and surges.

Sure, my evidence is anecdotal at best, but I am a die-hard believer. I've been using UPS devices for over 15 years now, and I have 1 large UPS on my Unraid server, and a couple smaller UPS devices on my critical computers.

Now, with regards to Unraid, there is a second reason to have a UPS device. When you are writing to the array, if power is lost during the write, those files will become corrupted. Unlike many expensive RAID controllers or NAS boxes, there is no write buffer or on-board battery/capacitor to finish out the write queue when there's a power failure. It shouldn't corrupt your array, just the one file being written, but a power failure might invalidate your parity, and then you would have to do a parity check/rebuild, which isn't fast (but maybe faster than Drobos).

You don't need a big battery either. Even a small one that can keep the server up 5-10 minutes will make a huge difference. You can connect the UPS to the Unraid server via USB, and configure Unraid to shutdown automatically to protect itself when the battery gets below a certain level. This will be a graceful shutdown too, so you don't get unexpected shutdown warnings when you start it back up, or the dreaded forced parity check (though you can always cancel it if you want).
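Under the hood, Unraid's UPS settings page drives apcupsd, so the shutdown thresholds map to apcupsd.conf directives along these lines (the directive names are apcupsd's; the values here are just illustrative, not recommendations from the post):

```
# apcupsd.conf fragment -- roughly what the Unraid UPS settings translate to
UPSCABLE usb          # USB connection between UPS and server
UPSTYPE usb
DEVICE                # leave blank for USB auto-detection
BATTERYLEVEL 10       # shut down when battery charge drops below 10%
MINUTES 5             # ...or when estimated runtime drops below 5 minutes
```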

Even my big UPS will only keep my Unraid server up for 30 min to an hour, less if all the drives are spun up for a parity check or rebuild. You'd need thousands of dollars worth of UPS batteries to get through a parity check/rebuild, so I wouldn't even worry about that - if they get interrupted, you just have to start over, but no data should be lost. I think in the new Unraid v6.7, LimeTech is even working on a pause/resume function for the parity checks, but this capability doesn't yet survive a restart. Fingers crossed he gets this figured out.
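As a rough illustration of why riding out a full parity check on battery isn't practical (the drive size and average speed below are my own example numbers, not from the post):

```python
# Estimate how long a parity check runs: every drive is read end to end,
# so duration is governed by the largest drive's full-surface read time.
drive_bytes = 8e12          # assume an 8 TB parity drive
avg_mb_per_s = 150          # average sequential rate across the platters
hours = drive_bytes / (avg_mb_per_s * 1e6) / 3600
print(round(hours, 1))      # ~14.8 hours -- far beyond any reasonable UPS runtime
```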

If you're interested in what I use, I really like the CyberPower CP1500PFCLCD. These typically go for about $200. I think it's a good value for the money, since it includes 100% Active PFC (Power Factor Correction) and 900 Watts of Pure Sine Wave. These are small and light enough (25 lbs) that they are easy to hide near all your important PC's. I find that the CyberPower UPS's have similar features to APC but better prices. Regardless of brand, you do pay a premium for the Pure Sine Wave feature, but to me that is the most important feature.
The bigger UPS that I have on my Unraid server (overkill, really) is the CyberPower PFC Sinewave OR1500PFCRT2U. It's hard to tell from the specs, but this is double the size of the unit above, so it has twice the uptime. It's also more than double the price, and really heavy at 50 lbs. Actually, I've found that weight is one of the best ways to compare the size of one UPS to another. They all use the same battery technology, and 95% of the weight of a UPS is the battery, so a UPS that weighs twice as much has basically twice the energy storage. I got the bigger UPS for my Unraid server because for a while we were having frequent power outages that were lasting a few minutes longer than the smaller one could stay awake.

Something else that is also neat is that Unraid will log these power issues and display them to you in the GUI. Now it's super easy to know when I had a 12 minute power outage overnight. You also get UPS health stats on your Unraid dashboard:



Since you're coming in so far under budget, there's really no reason to not get a UPS if you don't already have one.

Paul
President, Chameleon Consulting LLC
Author, Chameleon MediaCenter
