
Data Management – Synology Cloud Sync to UW Google Drive

After a bit of a scare this weekend (our Synology DX513 expansion unit no longer detected two of its drives after a system update; resolved by powering the system down and rebooting it), we revisited our approach to backing up data.

We decided to use UW Google Drive, as it provides unlimited storage space!

Synology has an app, Cloud Sync, for syncing data to/from Google Drive, so I set up both Owl (Synology DS1812+) and Eagle (Synology DS413) to sync all of their data to a shared UW Google Drive account. This should provide a functional backup solution for the massive amounts of data we're storing, and it will simplify tracking what is backed up and where. Now, instead of wondering whether certain directories are backed up via CrashPlan, Backblaze, or Time Backup to another Synology server, we know that everything is backed up to Google Drive.
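
Cloud Sync itself is configured entirely through DSM's web interface, so there's nothing to script for the setup. As an extra sanity check on a sync-based backup, though, something along these lines could build a checksum manifest of a share before syncing, so the copy in Google Drive can be spot-checked later (a minimal sketch; the share path and manifest filename are made up for illustration):

    import hashlib
    import os

    def write_manifest(root, manifest_path):
        """Record an MD5 checksum for every file under root, one per line."""
        with open(manifest_path, 'w') as manifest:
            for dirpath, _, filenames in os.walk(root):
                for name in filenames:
                    path = os.path.join(dirpath, name)
                    h = hashlib.md5()
                    with open(path, 'rb') as f:
                        for chunk in iter(lambda: f.read(1 << 20), b''):
                            h.update(chunk)
                    manifest.write('%s  %s\n' % (h.hexdigest(), path))

    # Hypothetical share path on the Synology
    write_manifest('/volume1/web', 'web_manifest.txt')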


Data Analysis – Identification of duplicate files on Eagle

Recently, we’ve been bumping into our storage limit on Eagle (our Synology DS413):

[screenshot showing Eagle's storage usage]

Being fairly certain that a significant number of large datasets are duplicated throughout Eagle, I ran a Linux program called "fslint". It searches for duplicate files based on a few parameters and is smart enough to match files that have different filenames but share the same contents!
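
The general idea behind this kind of duplicate detection is to narrow down candidates cheaply (e.g., by file size) before doing the expensive content comparison, which is why files with different names can still be matched. A rough Python sketch of the same approach (an illustration of the technique, not fslint's actual implementation):

    import hashlib
    import os
    from collections import defaultdict

    def find_duplicates(root):
        """Group files under root by size, then confirm true duplicates by MD5."""
        by_size = defaultdict(list)
        for dirpath, _, filenames in os.walk(root):
            for name in filenames:
                path = os.path.join(dirpath, name)
                try:
                    by_size[os.path.getsize(path)].append(path)
                except OSError:
                    pass  # skip broken links/unreadable files

        duplicate_groups = []
        for paths in by_size.values():
            if len(paths) < 2:
                continue  # a unique file size can't have duplicates
            by_hash = defaultdict(list)
            for path in paths:
                h = hashlib.md5()
                with open(path, 'rb') as f:
                    for chunk in iter(lambda: f.read(1 << 20), b''):
                        h.update(chunk)
                by_hash[h.hexdigest()].append(path)
            duplicate_groups.extend(g for g in by_hash.values() if len(g) > 1)
        return duplicate_groups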

I decided to check for duplicate files in the Eagle/archive folder and the Eagle/web folder. Initially, I tried searching for duplicates across all of Eagle, but after a week of waiting for it to run, I gave up and analyzed those two directories independently. As such, there may be additional duplicates (consuming even more space) elsewhere on Eagle that haven't been identified. However, this is a good starting point.

Here are the two output files from the fslint analysis:


To get a summary of the fslint output, I tallied the total space consumed by duplicate files that were >100MB in size. This was performed in a Jupyter notebook (see below):
Notebook Viewer: 20160114_wasted_space_synologies.ipynb
Jupyter (IPython) Notebook File: 20160114_wasted_space_synologies.ipynb
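
The notebook linked above has the full details; the gist of the tally is along these lines (a minimal sketch, assuming a findup-style listing in which each group of duplicate paths is separated by a blank line, that the listed paths are still accessible, and a made-up results filename):

    import os

    THRESHOLD = 100 * 1024 ** 2  # only count duplicate files >100MB

    def wasted_bytes(group):
        """Space taken by every copy beyond the first in one duplicate group."""
        if len(group) < 2:
            return 0
        size = os.path.getsize(group[0])
        return size * (len(group) - 1) if size > THRESHOLD else 0

    total = 0
    group = []
    with open('fslint_output.txt') as results:  # hypothetical results filename
        for line in results:
            line = line.rstrip('\n')
            if line:
                group.append(line)  # another path in the current duplicate group
            else:
                total += wasted_bytes(group)  # blank line closes a group
                group = []
    total += wasted_bytes(group)  # flush the final group

    print('Duplicate files >100MB waste %.1f GB' % (total / 1024 ** 3))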


Here are the cleaned output files from the fslint analysis:


Summary

Duplicate files (>100MB in size) are consuming at least 730GB!

Since the majority of these files are in the Eagle/web folder, careful consideration will be needed when deciding which duplicates (if any) can be deleted, since it's highly possible that notebooks link to some of those files. Regardless, this analysis shows just how much space is being consumed by large, duplicate files; something to consider for future data handling/storage/analysis on Eagle.
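
Before deleting anything in Eagle/web, one low-effort safeguard would be to search the notebooks themselves for references to each candidate file. A minimal sketch (the filename and notebook directory are hypothetical):

    import os

    def notebooks_referencing(filename, notebook_root):
        """Return paths of .ipynb files under notebook_root that mention filename."""
        hits = []
        for dirpath, _, filenames in os.walk(notebook_root):
            for name in filenames:
                if not name.endswith('.ipynb'):
                    continue
                path = os.path.join(dirpath, name)
                with open(path, errors='ignore') as f:
                    if filename in f.read():  # notebook JSON stores links as plain text
                        hits.append(path)
        return hits

    # Hypothetical example: a duplicate is only a deletion candidate if this comes back empty
    print(notebooks_referencing('large_dataset.fastq', '/volume1/web'))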