Using a Cache drive sounds complex, which is what kept me away from using it for many years. In reality, it is so simple to use, and Unraid does all the hard work for you.
Essentially, a Cache drive should be an SSD (or NVMe if you prefer), so that you get the benefit of fast writes without spinning up a HDD.
Cache drives are used in relation to a Share. For each Share, you can specify one of the following options for "Use cache disk" (I've sketched these settings in code just after the list):
- NO - Never use the Cache
- YES - Temporarily use the Cache, space permitting
- ONLY - Only use the Cache (data stays on Cache, never copies to array)
- PREFER - Permanently use the Cache, space permitting
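To make those options concrete, here's a rough sketch (in Python, purely for illustration - this isn't anything Unraid exposes) of the per-Share settings I'll be referring to below; the Share names and numbers come from my own setup:

```python
# Illustrative per-Share "Use cache disk" settings, mirroring the options above.
# The DVDs/Blu-Rays min free space values are the ones I mention later in this post.
SHARES = {
    "DVDs":     {"use_cache": "YES", "min_free_gb": 100},
    "Blu-Rays": {"use_cache": "YES", "min_free_gb": 180},
    "Music":    {"use_cache": "ONLY"},
    "Backups":  {"use_cache": "NO"},
}
```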
If you set it to NO, then any writes to the Share (e.g. my DVDs Share) will always go straight to the HDD array.
If you set it to YES, then any writes to the Share will automatically be written to the Cache disk if there is enough room (it adheres to the same min free space parameter defined for that Share, e.g. the 100GB I set for my DVDs Share is validated against the Cache disk, and if there isn't sufficient room the writes go directly to the HDD array). The next time the MOVER runs (typically automatically each night), these files will be moved from the Cache disk to an array HDD.
If you set it to ONLY, then all data for that Share will be written to the Cache disk, even if there isn't sufficient space. A good example of how you could use this is for your Music Share. If it is small enough to fit on your Cache, you can set it to ONLY, and it will then reside on the SSD drive. That way you can play music from any PC in the house, getting really rapid data access from the SSD, and never spin up any HDD in your array.
If you set it to PREFER, then you get the same behavior as ONLY, except that if there is insufficient space on the Cache disk, new writes to that Share will go to the HDD array instead.
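If it helps to see that decision logic spelled out, here's how I picture it - an illustrative Python sketch only, not Unraid's actual code, and the function name is mine:

```python
def choose_write_target(mode, cache_free_gb, min_free_gb):
    """Where a new write to a Share lands, based on its 'Use cache disk'
    setting. Illustrative only - this is not Unraid's actual code."""
    if mode == "NO":
        return "array"        # always straight to the HDD array
    if mode == "ONLY":
        return "cache"        # always the Cache, even if space is tight
    # YES and PREFER use the Cache only if it has enough headroom,
    # judged against the Share's min free space setting.
    if cache_free_gb >= min_free_gb:
        return "cache"
    return "array"

# Example: my DVDs Share (YES, 100 GB min free) with only 80 GB free on the Cache
print(choose_write_target("YES", cache_free_gb=80, min_free_gb=100))   # -> array
```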
One of the nice things about the Cache disk and the MOVER is that you can change the setting above for a Share, and the MOVER will automatically move the data to where you want it the next time it runs. For example, let's say you had your Music Share set to NO for the Cache disk, and the data was sitting on a HDD in your array. You then change it from NO to ONLY, and the next time the MOVER runs (overnight, or on demand if you don't want to wait) your entire Music Share will be relocated from your HDD array to your Cache disk.
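A rough mental model of what the MOVER does for each Share on its nightly run (again, just a sketch of the behavior described above, not the real mover script):

```python
def mover_direction(mode):
    """Which way the MOVER shuttles a Share's files on its nightly run,
    going purely by the behavior described in this post."""
    if mode == "YES":
        return "cache -> array"   # flush new files off the Cache into the protected array
    if mode in ("ONLY", "PREFER"):
        return "array -> cache"   # pull the Share's files onto the SSD
    return "nothing to move"      # NO: new writes never touch the Cache anyway

# Flip the Music Share from NO to ONLY and wait for the overnight run...
print(mover_direction("ONLY"))    # -> array -> cache, so the whole Share relocates to the SSD
```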
Assigning a Cache disk is super simple; it works the same as assigning any other disk to your array. On the array setup screen where you assign your Parity and Data disks, you also assign your Cache disk.
Using a Cache disk is transparent - you really won't even know it is happening unless you look for it. For example, let's say I'm copying a new Blu-ray ISO to my array for the movie Batman Begins. I go to my Share named Blu-Rays and create a new folder named Batman Begins. Since my server name is Tower, the full UNC path looks like this: \\Tower\Blu-Rays\Batman Begins\. I then place my Batman.iso file in that folder.
Behind the scenes, Unraid created \\Tower\cache\Blu-Rays\Batman Begins\Batman.iso (so it's on the Cache disk). Right after copying the ISO out there, I can't wait to watch it, so I sync with CMC and play the ISO, which still appears at \\Tower\Blu-Rays\Batman Begins\Batman.iso.
Then overnight the MOVER kicks in, and since I have the Blu-Rays Share's cache setting at YES, it knows to move the new directory into my protected HDD array. It checks my disks, and finds that disk14 has enough free space (more than 180 GB per my Blu-Rays min free space setting), and moves the file from \\Tower\cache\Blu-Rays\Batman Begins\Batman.iso to \\Tower\disk14\Blu-Rays\Batman Begins\Batman.iso.
Since Batman Begins is so awesome, I decide to watch it again the next day. I launch it in CMC and play the ISO that is still at \\Tower\Blu-Rays\Batman Begins\Batman.iso.
That's how Shares work: they are a virtual directory that spans multiple disks (including the Cache disk) and presents them as a single directory. Behind the scenes, Unraid chooses which disks to store the data on (Cache, disk1, disk14, whatever), and makes those decisions based upon the parameters you configured for that Share. But when you use that Share, you don't have to know any of that to get at your data; you simply access your Share.
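If you're curious how that virtual view can work, here's a toy sketch of resolving a Share path across the Cache and data disks (my own illustrative Python, not Unraid's actual user-share filesystem, which does far more than this):

```python
import os

# Unraid mounts the Cache and each data disk under /mnt; a user Share is a
# merged view across all of them. These branch names follow that layout.
BRANCHES = ["/mnt/cache", "/mnt/disk1", "/mnt/disk14"]

def resolve(share_relative_path):
    """Find which physical disk currently holds a file seen through a Share.
    A toy lookup only, for illustration."""
    for branch in BRANCHES:
        candidate = os.path.join(branch, share_relative_path)
        if os.path.exists(candidate):
            return candidate
    return None

# \\Tower\Blu-Rays\Batman Begins\Batman.iso might resolve to the Cache today...
print(resolve("Blu-Rays/Batman Begins/Batman.iso"))
# ...and to /mnt/disk14 tomorrow after the MOVER runs; the Share path never changes.
```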
Now there is one problem with Cache disks - they are OUTSIDE your protected array. So for my DVDs and Blu-Rays Shares that I have set to YES, for the few hours they sit on the Cache disk they are unprotected. If the Cache disk fails, I lose that data. Then after the MOVER relocates them to the HDD array, they will be protected.
For my Music Share, if I set it to ONLY use the Cache disk, then that data is never protected in the array (at least not automatically). I can still make an extra copy of my Music and place it in another Share (for example, I keep an extra copy in my Share named Backups, which is set to NO for using the Cache).
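If you wanted to script that extra copy, a minimal sketch could look like this (assuming Python is available where you run it, and using Unraid's standard /mnt/user paths with my own Share names; adjust to suit):

```python
import shutil

# Hypothetical paths on my server; swap in your own Share names.
SRC = "/mnt/user/Music"            # lives on the Cache (set to ONLY)
DST = "/mnt/user/Backups/Music"    # Backups is set to NO, so this copy lands on the array

# Mirror the Music Share into the Backups Share for safekeeping.
shutil.copytree(SRC, DST, dirs_exist_ok=True)
```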
For years people complained that the Cache drive was too vulnerable, and they wanted it protected too. So recently LimeTech added a Cache Pool option, which allows you to use multiple Cache disks and pool them together for protection (or stripe them for more performance but even less protection). This is one of the main reasons I specified that particular motherboard in the build list, as with 10 SATA ports you have 2 left over for hooking up 2 SSDs and creating a RAID 1 mirrored Cache pool.
Spaceinvader One also has a nice video that covers most of what I discussed above, and also explains a few other nice details:
https://www.youtube.com/watch?v=ij8AOEF1pTU
Paul