I almost didn't make a demoreel because my Synology server corrupted 18 months' worth of footage

The last time I created an FPV demoreel was June 2020. It was my first one, and given that I had only been doing professional gigs for about a year at that point, I was amazed at how well the video came together.

Two years and some REALLY incredible projects later, I decided it was time to make another demoreel, one that would represent the level and quality of work I can deliver now. Before I get to the part where things went really bad, here is the result:

A lot of things had to go wrong for this misery to happen. The fact that I managed to put the above video together, and feel relatively proud of it, is nothing short of a miracle. Before I get into the details of what went wrong, let me walk you through how I deliver, archive and back up my footage for any shoot or flying session.

My backup protocol

  1. Copy SD contents to workstation computer

  2. Upload both RAW and stabilized shots to my Google Drive (to share with clients or friends)

  3. Backup to my local Synology Server

  4. Offsite backup from my Synology Server to Google Drive

Copy SD contents to my workstation computer

Obviously, the first thing I do is copy all the SD cards and SSD drives to my local machine. I make a folder on my desktop with the date and the client’s name as the folder name and put all the files in there. I create subfolders for each type of drone used: the Mavic, the cinelifter with the full-size Blackmagic camera, the Sub250 with the naked GoPro, etc.

Once all files are copied, I create stabilized versions with Gyroflow, GoPro ReelSteady GO or SteadXP, depending on which footage was shot and what the client is after.

Upload all footage to my Google Drive for client sharing

Since a typical shoot yields a minimum of 100GB of footage, I don’t want to start sending SSD drives or SD cards around by postal mail. And since I have a paid Google for Work subscription, I don’t need WeTransfer, Dropbox or Box: I can use Google Drive to share the files with the client. With unlimited storage, I can upload the files there and they stay there forever. I just forward a download link to the client and they’re good to go!

Backup to my local Synology Server

I have an 8-disk Synology server with around 60TB of storage space sitting in a hidden-away spot in my home. It’s hidden so that in case of a break-in, nobody walks off with the fancy, heavy, techy thing (tech is usually a prime target for thieves, even if they don’t know what it is).

Once I’m done with a project (stabilizing footage and perhaps creating a quick video for my own social media channels), I back up the entire folder to my Synology server.

Offsite backup of my Synology Server to Google Drive

As you can tell, when it comes to long-term archiving of my professional work, I do consider the worst-case scenario. However, a break-in is not the worst-case scenario. A fire that burns down my house and destroys everything in it does qualify.

To have one less headache in such a case, I set up a full backup of everything on my Synology server to my Google Drive. So strictly speaking, at any given point I should have 3 or even 4 copies of every client file I ever made.

Where did things go wrong?

When I set this system up and saw that everything worked exactly the way it should, I felt really good about it. I’m not exactly a data security engineer, but this solution is clearly pretty solid. Right? RIGHT?

You can see where this is going…

There was one minor (but very important) aspect that I had never considered to be a problem. I had heard about it, but I never thought it would happen to me: data corruption.

Sure, data corruption can happen, but my hardware is never exposed to heavy vibrations, ultra-intense loads or high levels of non-stop processing. Hard drive failures, bit flips, etc. are VERY unlikely in my use case. Sure, unlikely does not clear me of all risk, but statistically speaking, I was on the safe side. And of course I had my offsite backups, so what’s the worst that could happen?

Google retired my unlimited Google Drive plan

The first sign of trouble was that Google had been emailing me for at least a year that they would retire my Google for Work subscription and, in the process, kill my ‘unlimited Google Drive storage’ perk. I would now only have 5TB of storage. That might seem like plenty to you, but at the time they actively downgraded my account storage, I was using upwards of 25TB.

It made sense for them to put a stop to the insanity that was my workflow. So once the new rules kicked in, I was no longer able to upload any files to my Google Drive until I had brought my used storage down to less than 5TB.

So what I ended up doing was deleting a lot of client delivery folders and my offsite backups. It made sense at the time: I had nowhere else to upload it and I wasn’t going to pay 200€/month just to keep those files there. If a client needed something, I could just re-upload it and then delete it later. That was possible as I had everything archived on my Synology, right?

So I ended up deleting more than 20TB of backups and client footage from my Google Drive.

How to circumvent the new Google Drive storage limitations?

About 2 months later, a buddy of mine told me about Google Shared Drives. They also have limitations, but different ones: you are not limited to 5TB of storage. Instead, you are limited to 400,000 files, and there seems to be no limit to how many Shared Drives you can have. So I quickly whipped up a new Shared Drive for my offsite backups and set up my Synology to start backing up again. Then I set up another Shared Drive for new client deliveries. Awesome, because I was already having trouble dealing with the 5TB storage limitation. All was good again. Or was it?

Corrupt files on my Synology

About 2 months later, I started work on my new demoreel. Back then I still called it my 2022 demoreel. I started looking into the client projects that were on my Synology and previewing footage from those shoots. Annoyingly, it looked like I couldn’t stream the files over the network.

I felt kind of bad: the Synology was fairly expensive, and the configuration I had set up was meant for high data retention (in case of drive failures) but also for high read speeds, since dealing with large files over the 10Gbps network was going to happen on occasion. I felt like I had made a huge mistake. Anyway: I copied the files I felt good about to my local machine and started to check them out.

Some of them didn’t play correctly, but most of those had been processed with the old ReelSteady GO application back in the day. Those files were known to be badly encoded, and playback was often a problem.

Once I found the files I liked, I threw them into Adobe Premiere, where my Proxy Ingest setup would start processing the files… and then crash. Constantly.

It started to dawn on me that the files causing Adobe Media Encoder and Adobe Premiere to crash were corrupted. So I downloaded them again from the server. The same thing kept happening.

I started to think that the files on my Synology were getting corrupted. Not an inconceivable thing to consider, given that I had just upgraded from my previous 4-disk Synology to this new machine. In the same timespan I had also moved from Windows back to Apple MacBooks, this time with M1 chips instead of Intel.

At first I thought my configuration of the Synology was incorrect and that I had simply forgotten to enable data integrity checking on all Shared Folders. I enabled it, but of course the files were already damaged.

What is wrong with my Synology??

Luckily for me, I had quite a lot of footage from recent shoots still on my machine so I started uploading those new versions to the Synology. Only to find that they were instantly corrupted again.

At this point I no longer trusted my Synology server at all. I decided to grab a GoPro file from my machine and confirm that it was fully valid and playable. Then I uploaded it to the Synology server. When I checked the md5 hash of the file on the Synology server against the local file on my machine, it didn’t match. I uploaded the same file again. The hash was different again, not only from my local file but also from the previous upload.
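For reference, the kind of check I kept running looks roughly like this. A minimal shell sketch: the file size is arbitrary, and a plain `cp` stands in for the network transfer to the Synology so the example is self-contained.

```shell
# Create a reference file, "transfer" it, then compare md5 hashes.
# In the real test the copy step was an SMB upload to the Synology;
# here a local cp stands in as a placeholder.
src=$(mktemp)
dst=$(mktemp)
dd if=/dev/urandom of="$src" bs=1M count=10 2>/dev/null   # 10MB of random data
cp "$src" "$dst"                                          # replace with your network copy
a=$(md5sum "$src" | cut -d' ' -f1)
b=$(md5sum "$dst" | cut -d' ' -f1)
if [ "$a" = "$b" ]; then
    echo "OK: hashes match"
else
    echo "CORRUPT: hashes differ"
fi
rm -f "$src" "$dst"
```

If the two hashes differ, the file was altered somewhere along the transfer path, which is exactly what I kept seeing.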

It started to look like the Synology server was doing weird stuff. I started googling the issue and found many people complaining about it. Many of them mentioned the new MacBook M1 chip. It started to look like the M1 chip was the culprit.

I used a Windows machine on the network to connect to the server and upload the file over the SMB protocol. The same thing happened: the md5 hash no longer matched.

After many different approaches, I noticed that when I uploaded the file through the browser’s drag-and-drop upload feature, the file would arrive intact. The md5 hashes did match.

Holy shit! I lost all that footage!

I figured all of this out AFTER Google had forced me to delete my offsite backups and my client delivery archives on Google Drive. That means I had deleted all uncorrupted versions of that footage, and the 30-day trash recovery period had expired weeks earlier.

Long story short: I no longer had uncorrupted footage available. The files on the server were corrupted and no other files were available to me. My plan to create a killer demoreel started to look bleak.

That pissed me off to no end, but I decided not to be too sad about it and to be happy that a lot of cool projects were still in the pipeline, so I could just use that footage for another reel… next year (in 2023…).

However, I still had a real problem with my Synology, and it needed fixing soon!

Requesting help from Synology Support

At this point, it was clear to me that the files were not getting corrupted from hardware failures of the drives in the Synology. Nor was it related to my M1 Macbook or my local network.

Given that file transfers through the web browser worked while transfers over other protocols ended up in corrupted files, it was clear to me that Synology had done something wrong with the file transfer protocols. Time to get in touch with support and shine a light on their enormous shortcoming…

Just in case you were wondering: this is foreshadowing an awkward plot twist…

I opened a support ticket and provided them with ALL the information they could need. Serial number of my hardware, types of hard drives, firmware versions of my machine, software information of my laptops, information about my routers and network, etc. Everything!

They got back to me with more questions, requests for remote access, a dedicated share to simulate the problem, and so on.

We went back and forth for over 3 months while they tried to figure out the problem.

Unable to reproduce

As it turned out, they were unable to reproduce the problem, making it impossible for them to analyze and diagnose it. However, Synology knows how much I forked over for this machine, and their customer support is best in class, so they asked if I was fine with them escalating the issue to their HQ in Taiwan. I was pleased to see they didn’t want to leave me hanging, and agreed.

A couple of days later, a new member of their team asked me to run some more isolated tests in a different network configuration. As it turns out, that solved the problem.

It is not the Synology. It is other hardware on your network.

They figured out that the issue was caused not by the Synology server, its firmware, or anything else on Synology’s side. I fully expected them to say “we can’t help you with this, as it’s not our hardware causing it”, and I wouldn’t have been able to blame them.

However, since I had provided them with a mountain of information up front, the engineers on their end dug deep and found a bug report from a customer with the exact same router I have. The report described some network issues, and one of the proposed solutions was to move the cable from the blue ethernet port to a yellow one. The Synology engineers asked me if I had a similar setup and, if so, what happened when I followed that advice.

Eureka!

So I went to my router to check which ports I was using, and sure enough, I was plugged into the blue port. I felt excited but also anxious about this. I moved the cable from the blue port to the yellow one and went to my computer to run the traditional tests that had been failing systematically before.

It was solved. I ran the test several times. It never failed again. This was no fluke: I had run the test hundreds of times before, and it had never once passed by accident. I did several other tests, and they all produced correct hashes.

Everything is safe.

So how did I manage to make a reel?

No, I was not able to recover any damaged files. Files that were damaged are still damaged. However, those files are still playable to some degree, so I managed to use footage from between the corrupted parts of the videos.

In many cases that meant great shots were rendered unusable, but given that I had SO MANY good shots, I still had enough to create a good video. I’m not even sure I could have made a better reel with access to the lost shots. I genuinely feel that the shots I did use are, one for one, solid shots I might otherwise have overlooked.

At least that’s what I tell myself 😂

Conclusion

There are a few takeaways here:

  1. You absolutely do need a great backup system

  2. Make sure your network setup is solid and that no data can be corrupted due to incorrect cabling. Test your network setup by transferring huge files (10GB+) over the network and confirming that the md5 hashes always match.

  3. Unlimited cloud storage perks like mine can be taken away. Don’t just delete the data. Download it and keep it somewhere else, even if you think you don’t need it.

  4. Have several local backups in place

  5. Synology Servers are great! So is Synology support

  6. BONUS: if you really want to delete a backup, make sure you have a valid copy of the data you still want to use before deleting said backup
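Takeaway #2 can be turned into a small script you run whenever your network changes. This is a sketch under assumptions: `NAS_MOUNT` is a placeholder for your mounted network share (a local temp directory here, so the example is self-contained), and the file size and round count are arbitrary.

```shell
# Transfer the same file several times and verify the hash each round.
NAS_MOUNT=$(mktemp -d)   # placeholder: point this at your mounted NAS share
src=$(mktemp)
dd if=/dev/urandom of="$src" bs=1M count=10 2>/dev/null   # test payload
ref=$(md5sum "$src" | cut -d' ' -f1)
fails=0
for round in 1 2 3 4 5; do
    cp "$src" "$NAS_MOUNT/hashtest.bin"   # on a real NAS this goes over SMB/NFS
    got=$(md5sum "$NAS_MOUNT/hashtest.bin" | cut -d' ' -f1)
    if [ "$got" != "$ref" ]; then
        fails=$((fails + 1))
        echo "round $round: hash mismatch!"
    fi
done
echo "failed rounds: $fails/5"
rm -rf "$src" "$NAS_MOUNT"
```

A single mismatching round is enough reason to stop trusting the transfer path until you find the cause, whether that's a cable, a port, or something else entirely.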

Ronny Welter

Ronny has always been obsessed by technology, cameras and film-making. So when he finally discovered FPV drones, it was no wonder that he instantly fell in love with it. As a nerd, he quickly learned the ins and outs of how to build and fly these drones. Combined with his inexplicable obsession with camera technology, he had now found an answer to what he wanted to do with his career.

https://ronnyking.tv