Recovering Data from a Failed Synology NAS
by Ganesh T S on August 22, 2014 6:00 AM EST

Data Recovery - Evaluating Software & Hardware Options
Given the failure symptoms (and the low probability of all four hard drives in the DS414j failing at the same time), I was cautiously optimistic about recovering the data from the drives. One option would have been to put the four drives in another DS414j (or another 4-bay Synology NAS unit) and hope that disk migration would work. However, with no access to such a unit, this option was quickly ruled out.
In many of our NAS reviews, I had seen readers ask questions about data recovery from the units using standard PCs. In the review of the LG N2A2 NAS, I had covered data recovery from an mdadm-based RAID-1 volume disk member using UFS Explorer Standard Recovery v4.9.2. Since then, I have tended to prefer open source software while keeping ease of use in mind.
Recovering RAID-1 Data using UFS Explorer Standard Recovery
Searching online for data recovery options for a failed Synology NAS didn't yield any particularly promising results for Windows users. From an open source perspective, Christophe Grenier's TestDisk appeared to be able to perform the task. However, with no full-featured GUI and no instructions for recovery in this particular case (a 4-disk RAID-5 volume), I fell back upon UFS Explorer for a quicker turnaround. My only worry was that I hadn't used standard RAID-5 while creating the volume, but Synology Hybrid RAID (SHR) with 1-disk fault tolerance. Though it was effectively RAID-5 with the 4x 2TB drives in the NAS, I wasn't completely sure whether the software would recognize the RAID volume.
Synology does have a FAQ entry covering this type of unfortunate event for users willing to work with Ubuntu. This involves booting Ubuntu on a PC with the drives connected, installing mdadm and using that to recognize the RAID volume created by the Synology NAS.
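The FAQ's procedure boils down to a handful of commands. The sketch below is a dry run under assumptions (a stock Ubuntu system with all four drives attached): the run() helper only prints each command, so nothing is executed until it is replaced with a real root invocation.

```shell
# Dry-run sketch of the Synology FAQ procedure (assumptions: stock Ubuntu,
# all four NAS drives attached). run() just prints each command; replace
# its body with eval "$@" (as root) to actually execute the steps.
run() { echo "+ $*"; }

run apt-get install -y mdadm     # the RAID tool the FAQ relies on
run mdadm --assemble --scan      # scan all disks, assemble recognized arrays
run cat /proc/mdstat             # the recovered volume should show as /dev/mdX
```

The reason this works on foreign hardware at all is that mdadm reads the RAID superblocks Synology wrote onto each member disk, so the array can be reassembled on any Linux box that sees the drives.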
Data Recovery from Synology NAS Drives using Ubuntu
The pros and cons of the two data recovery software alternatives are summarized below:
Windows + UFS Explorer
- Pro - Intuitive and easy to use / minimal effort needed for users running Windows on the relevant PC
- Con - A licensed version of UFS Explorer costs around $200

Ubuntu + mdadm
- Pro - Free
- Con - Complicated for users without knowledge of Linux / users not comfortable with the command line
- Con - Synology's FAQ doesn't cover all possible scenarios
Evaluating the Hardware Options
UFS Explorer can take in disk images for RAID reconstruction. The hardware in my possession that immediately came to mind was our DAS testbed (the Asus Z97-PRO (Wi-Fi ac) in the Corsair Air 540 with two hot-swap bays configured) and the recently reviewed LaCie 2big Thunderbolt 2 / USB 3.0 12 TB DAS unit. My initial plan was to image the four drives one by one into the DAS and then load the images into UFS Explorer. I started imaging the first drive (using ODIN), and it indicated a run time of around 4.5 hours for the disk. After starting that process, I began to rummage through my parts closet and came upon the StarTech SATA duplicator / eSATA dock that we had reviewed back in 2011. Along with that, I also happened to get hold of an eSATA - SATA cable.
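Since UFS Explorer accepts raw disk images, the imaging step also has a Linux-side equivalent. I used ODIN on Windows, but a plain dd copy produces the same kind of image; in the sketch below, /dev/sdX and the destination path are placeholders, and the guard means nothing happens unless that block device actually exists.

```shell
# Sketch of imaging one NAS member disk to a file for later RAID
# reconstruction. /dev/sdX and the image path are placeholders.
SRC=/dev/sdX                 # the NAS member disk (hypothetical name)
IMG=/mnt/das/disk1.img       # destination on the DAS unit

if [ -b "$SRC" ]; then
    # conv=noerror,sync keeps going past read errors, padding bad blocks
    dd if="$SRC" of="$IMG" bs=4M conv=noerror,sync status=progress
    # For media that is visibly failing, GNU ddrescue is the safer tool,
    # since its map file lets it resume and retry bad regions:
    # ddrescue -f -n "$SRC" "$IMG" disk1.map
fi
```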
The Asus Z97-PRO (Wi-Fi ac) in our DAS testbed had two spare SATA ports (after using two for the hot-swap bays and one each for the boot SSD and the Blu-ray drive). It would have been possible for me to bring out the two SATA ports and the appropriate power cables from the other side of the Corsair Air 540 chassis to connect all four drives simultaneously, but I decided against it because of the difficulties arising from the positioning of the SATA ports on the board (I would have considered it had the ports been positioned vertically, but all six on the board are horizontal relative to the board surface). However, with the StarTech dock, I just had to connect the eSATA - SATA cable to one of the ports. There was no need to bring out the SATA power cables from the other side either (the dock had an external power supply).
Our DAS testbed runs Windows with a 400 GB Seagate boot SSD as the only SATA drive permanently connected to it. I wasn't about to install Ubuntu / dual boot this machine for this unexpected scenario, but a live CD (as suggested in Synology's FAQ) with a temporary mdadm installation was also not to my liking (in case I needed to reuse the setup / had to reboot in the process). Initially, I tried out a 'live CD with persistence' install on a USB drive. In the end, I decided to go with a portable installed system, which, unlike a persistent install, can be upgraded / updated without issues. I used a Corsair Voyager GT USB 3.0 128 GB thumb drive to create an 'Ubuntu-to-go' portable installation in which I installed mdadm and lvm2 manually.
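The reason lvm2 is needed alongside mdadm is that SHR layers an LVM logical volume on top of the md array, so after assembly the volume group still has to be activated before anything can be mounted. Another dry-run sketch; /dev/vg1000/lv is a commonly seen Synology volume group / LV name, an assumption rather than a guarantee (vgscan reports the real one).

```shell
# SHR = mdadm + LVM: after the array assembles, activate the logical
# volume before mounting. vg1000/lv is an assumed Synology default.
# run() prints each command instead of executing it.
run() { echo "+ $*"; }

run vgscan                                     # find the VG on the assembled md device
run vgchange -ay                               # activate all detected volume groups
run mkdir -p /mnt/synology
run mount -o ro /dev/vg1000/lv /mnt/synology   # read-only mount of the data volume
```

Mounting read-only is deliberate: nothing should write to the member disks until the data has been safely copied off.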
55 Comments
deeceefar2 - Friday, August 22, 2014 - link
If instead of using one QNAP and one Synology they had both been the same brand, you wouldn't have had an issue. You could have just popped the drives immediately into the other NAS, and sent the Synology back for refurbishing. The way we did it was 2 QNAPs, one at the office, and one at my house. When we had a failure of the main QNAP we sent it in for repairs, and brought the one from home in. You have them doing remote replication, and then using Dropbox sync we had one version in the cloud that was synced to individual workstations. So workstations doing video editing could do that much faster locally and then that would get synced to the main drive and then to the remote version at the same time.

ruidc - Friday, August 22, 2014 - link
we had a Thecus that died and were told that we could simply plug the drives into the shipped replacement unit. When we did so, it initialized the array. Now, I'd go UFS every time instead (having used it successfully on a single drive to get the contents of an XFS drive that would not mount on another occasion). But I did not have a spare machine capable of connecting all the drives. Luckily nothing of importance was lost.

imaheadcase - Friday, August 22, 2014 - link
Ganesh T S, any plans to do a custom NAS buying guide like the one done in 2011? Lots of custom options out now for that.

matt_v - Friday, August 22, 2014 - link
This article really takes me back to my own experience with NAS data recovery. After a firmware upgrade in 2012, my QNAP completely lost its encrypted RAID 6, and claimed it had 6 unassigned drives. After much Googling and careful experimenting with nothing but a CentOS VM on my notebook, I was able to extract all the files with all Unicode filenames intact (VERY important in a tri-lingual family).

- SSH into the NAS as root and create a new mountpoint that's not referenced in any possibly corrupted config (I used /share/md1_data, where the default is md0_data)
- Assemble and mount the mdadm volume using the same console commands Ganesh used
- Go into the web GUI and unlock the volume that "magically" appears in the Encrypted File System section
- Open WinSCP and log into the QNAP as root
- Copy out the contents of /share/md1_data to a backup volume of your choice (I used a NexStar HX4R with a 4x4TB RAID 5+1)
After successfully extracting all the files from the array, I completely nuked the QNAP configuration, built a new array from scratch, and copied the files back. These days, the Nexstar acts as a backup repository using TrueCrypt and a Windows batch script. Ugly, but functional, and the QNAP hasn't had a single config panic since.
Oyster - Friday, August 22, 2014 - link
"If QNAP's QSync had worked properly, I could have simply tried to reinitialize the NAS instead of going through the data recovery process."

It seems you're blaming QSync for the failure as well... didn't you say the Synology circuit board died? How do you expect any external applications to "talk" with the Synology unit? Can you share your thoughts on why/how you expected QSync to function in this scenario?
This is no different than having OneDrive on two machines, and then blaming OneDrive for not syncing when one of the machines die on you!?!?
ganeshts - Friday, August 22, 2014 - link
The fact that the circuit board died is orthogonal to QSync's working. The data was in the hard drives of the DS414j for more than 12 hours before the 414j died. The CIFS share on the unit was used to upload the data, so there was actually no problem accessing the CIFS share at that time and for some time thereafter too.
The CIFS share was mapped as the 'QSync' folder on a separate Windows 8 PC (actually a VM on the TS451). QSync was installed on that PC. QSync's design intent, or the way it presents itself to users, is that it does real-time bidirectional sync. It should have backed up the new data in the QSync PC folder (i.e., the DS414j share) to the TS451, but it didn't do it.
I had personally seen the backup taking place for the other data that I had uploaded earlier - to either the TS451 QSync folder or the DS414j share - so the concept does work. I actually noted this in my coverage of the VM applications earlier this week - the backup / sync apps don't quite deliver on the promise.
Oyster - Friday, August 22, 2014 - link
Thanks for the detailed clarification. Much appreciated.

Two things I want to point out:
1) I was under the impression that QSync is simply for syncing folders. I'm surprised you're using it for full blown backups. Was this something QNAP suggested? I'm asking because I own a QNAP and it would be good to know where QNAP is taking QSync.
2) I have backups set up on my QNAP NAS using the Backup Station app. I was always under the impression that Backup Station is the go-to app for maintaining proper backups on QNAP (it even provides rsync and remote NAS to NAS replication). This app has a notification feature which ties in with the notification settings in the Control Panel. I haven't had anything fail on me, but I tested the notification functionality using a test email, and it worked fine. I'd think that had you utilized Backup Station, you would have been notified the moment things stopped working.
Just to point out, I'm in no way being defensive about the QNAP. I'm in full agreement with you that some of these utilities could use more work. Especially, something that allows us to read raw drives in a PC environment in the face of a failure.
ganeshts - Friday, August 22, 2014 - link
I am not sure how QSync is being understood by the users, but my impression after reading the feature list was that it could be used as an alternative to Dropbox, except that it uses a 'private cloud'.

Do I use Dropbox for folder syncing or backup? I would say both. On my primary notebook, I work on some files and upload them to Dropbox. On my work PC, I could work on other files and upload them to the same Dropbox. In the end, I expect to be able to work with both files on both the notebook as well as the work PC. Extending this to QSync - I could put the files to 'sync/backup' through the QNAP QSync folder or upload them to some other path mapped as a QSync target along with the QSync program / application on a PC.
I believe backup and RTRR (real-time remote replication) are both uni-directional only. My intent was to achieve bidirectional sync / backup, which is possible only through 'Dropbox-style' implementations. If there are alternatives, I would love to hear about them.
Gigaplex - Saturday, August 23, 2014 - link
"I was under the impression that QSync is simply for syncing folders. I'm surprised you're using it for full blown backups."

What is a backup? It is a copy of the data. What does syncing do? It copies data.
hrrmph - Friday, August 22, 2014 - link
I noticed AT's steady increase in NAS coverage and I wondered how long it would take to get to this point. Well, not too long, it would seem.

It just proves that once again, complexity kills. RAID, NAS, etc. aren't backup solutions, but rather are high capacity, high availability, and high performance solutions.
It's good to see that most people writing articles, and commenting today, already understand that a NAS isn't a 'safety' device. It is a fool's errand to think of a NAS or RAID as providing any safety.
The value of a NAS as a high performance solution is questionable because a single SSD popped into a spare bay on a desktop system will outperform the NAS. Except when the NAS is populated with SSDs in a performance RAID configuration. Then you have the problem of getting a high enough bandwidth connection between the NAS and client. For performance, you are best sticking with a high performance desktop. If you insist on a laptop as your main machine, then connect it to the high performance desktop using Thunderbolt.
As for high availability, you are either a business and know how to do this yourself (and have a competent IT department to implement it), or you are a consumer. A consumer can just buy high availability as a service (such as from Amazon services). Or the consumer is a tinkerer, and doesn't care about efficiency, or cost effectiveness. Which brings us back to AT's series of articles on NAS devices.
If you are like me, and aren't ready to relinquish everything to the cloud, or a dodgy proprietary NAS scheme, or an even dodgier RAID setup, an alternative is to just build a low-power PC fitted with a 16-port HBA card and an appropriate chassis with racks. The hardest part these days is finding a case that is appropriate for a bunch of front loading racks to hold all of the quick swap drives. But, it is nonetheless one of the most viable ways to improve capacity and safety without going to the cloud.
As SSD prices slowly descend, this even becomes a viable performance option, with non-RAID drive setups capable of supplanting a bunch of spinning disks in a performance RAID setup.