A School of Troubleshooters publication

From the clouds down to the ground: keep your data safe and sound

Reading time: 10 min 15 sec
February 28, 2025. Views: 127

Programs | Personal Efficiency | Oleg Braginsky, Maksim Golub

While we are busy creating all our valuable documents, it is also important to make sure they are stored safe and sound. With School of Troubleshooters founder Oleg Braginsky and student Maksim Golub, we build the foundation of a system that automates backup creation and saves your time.

Ever since clouds were introduced, they seemed like an awesome idea. You put all your data up there and stop worrying about it. That was the first impression. But once you realise that you need to keep it up to date, take care of incremental changes, and trust third parties – it's time to look for options.

By the nature of our work, we don't create much in the cloud. Everything starts on a laptop and is uploaded to corporate portals once it is ready for review. This ensures that all the original data stays safe with its creators. The problem is that hardware is not reliable. That is how the quest for a solution began.

One thing was for sure: there was no intention of continuing to trust the clouds. For many reasons they might be good for hosting commercial apps, but when it comes to personal data and projects, there is hesitation. Redundancy could be successfully provided by portable flash drives or secure network storage.

After some research, the choice fell on a simple yet powerful model: do regular backups, archive them and, when the work is completed, send them to one global storage. The issue was that a lot of files were created during the workday – we never delete any file and create a copy of each one. Well, that habit is a lifesaver!

At first, all the backups were made manually through the interface. There was a calendar event to go over the data once a month and pack it nicely. Only one thing was annoying – it was manual labor. Hence, we picked WinRAR to automate the task and pack all the data from folder one into folder two.

How it worked in a nutshell: you create a configuration within WinRAR and decide what exactly you want to do – which folder to go over, how to pack the data. As we wouldn't need all the files, an additional criterion was added: only take files and folders that were created or modified within a given time range.

Still, the process was quite manual, as it required launching these scripts on a regular basis and then coming up with the naming. Side note: for the timeframe we chose three options – one day, as it would really help to get the most recent data back; seven days, as it is a regular work cycle; and one month.

Naming-wise, we decided to stick to a nomenclature: type of the object, parameters, followed by the date in Japanese format – year, month, day – all connected by underscores. For example, this is how a one-day backup would look: Backup_1d_20250227. This makes the required files easy to identify and search for.

After some research, we found that WinRAR has a well-documented console version that can be launched with various parameters. It was time to do a little coding and put together a plan of what we wanted to achieve, then turn it into a *.bat file to execute regularly with Windows Scheduler.

The plan for the script was quite straightforward. It should be small, easy to read, effortless to modify. As we needed something fast and used Windows, the built-in command language was a good choice:

  1. Pass the list of folders to archive.
  2. Set the path to the destination folder.
  3. Be able to set archive names dynamically.
  4. Specify parameters such as compression and redundancy.
  5. Encrypt the data within the archive and make sure third parties won’t see what’s inside.

To start, all you need is to create a *.bat file in a text editor. First, we set the destination folder for all the data. The data is stored on disk with each increment separated by timeframe, hence “01d” appears in the path. If you want to keep some test data around and hide it later, use “rem” or “::” for commenting.
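A minimal sketch of this first step might look as follows; the folder names here are placeholders, not the actual paths from the project:

```batch
rem Destination root for all backups; "01d" separates the daily increments
set DEST=D:\Backups\01d

rem Create the folder if it does not exist yet - Windows won't do it for us
if not exist "%DEST%" mkdir "%DEST%"

rem A test path can be kept around but disabled with "rem" or "::"
:: set DEST=D:\Backups\test
```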

The second step was to define the name. The same approach applies here: set a new variable, give it a name, and make sure the destination exists, as Windows won’t create it automatically. There is probably room for future improvement, but we took a lean approach: make sure the script does the job first and handle extra cases later.

As you can see, the name was left with “_” at the end, and that was intentional. This is where we get the system date and turn it into part of the title. To obtain it, we call “wmic” for the local time, but it comes back in the format “YYYYMMDDHHMMSS.SSSSSS” followed by the UTC offset.

To trim this value, we used a one-liner: set another variable, “YYYYMMDD”, and build it by concatenating substrings of “datetime”. The syntax is straightforward – the variable name, followed by a colon and tilde, then the starting position and the number of characters to take. E.g. “datetime:~4,2” would equal “02”.
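The wmic call and the substring trick described above can be sketched like this; the variable names are illustrative, not taken from the original script:

```batch
rem Get local time as YYYYMMDDHHMMSS.SSSSSS plus UTC offset;
rem the "find" filter keeps only the line that contains the value
for /f %%t in ('wmic os get localdatetime ^| find "."') do set datetime=%%t

rem Cut out year (offset 0, 4 chars), month (4, 2) and day (6, 2)
set YYYYMMDD=%datetime:~0,4%%datetime:~4,2%%datetime:~6,2%

rem Append the date stamp to the name that was left with a trailing "_"
set NAME=Backup_1d_%YYYYMMDD%
```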

Yet, the funny part was that once we figured out how to do this exercise, it turned out the console version of WinRAR supports such arguments when creating the archive, so this trick was commented out – but it will certainly come in handy for other use cases where we rely only on the bare operating system toolset.

The last major step was to call WinRAR itself. The console app is called “rar.exe” and can be found in the same folder as WinRAR. Please note that it is preferable to put all parameters on one line; they are split here solely for illustrative purposes. As a result, we ended up with a lengthy list of things to pass to the executable:

Here is the list of what it consists of:

  1. “a” – indicates an archiving operation. Otherwise, it could be “e” to extract.
  2. “m3” – defines the level of compression; in this case, normal.
  3. “t” – asks the application to test everything once packing is complete.
  4. “rr3%%” – adds a recovery record to restore the data in case a file gets corrupted.
  5. “ag” – generates the archive’s name using parameters such as Y, M, D.
  6. “tn1d” – asks to look only for files updated within the last day.
  7. “ap” – keeps a similar file structure within the archive.
  8. “@list_backup.lst” – points to a list of folders to process.
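Put together, the call might look roughly like this; the installation path and archive name prefix are placeholders, and in a real script all switches would sit on one line exactly as noted above:

```batch
rem "a" archives, -m3 normal compression, -t tests afterwards,
rem -rr3%% adds a 3% recovery record (% is doubled inside a .bat file),
rem -agYYYYMMDD appends the date to the name, -tn1d takes files newer
rem than one day, @list_backup.lst supplies the folders to process
"C:\Program Files\WinRAR\rar.exe" a -m3 -t -rr3%% -agYYYYMMDD -tn1d -ap "%DEST%\Backup_1d_" @list_backup.lst
```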

After a bit of testing, three different scripts were created. To launch them automatically, we used Windows Scheduler. It allows setting a specific time to run, has logging, and manages job queues, making sure that jobs which weren’t performed on time will run when the system is back online.
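Besides the graphical interface, such a task can also be registered from the command line with the built-in schtasks tool; the task name, script path and time below are made up for illustration:

```batch
rem Run the daily backup script every day at 21:00
schtasks /create /tn "Backup_1d" /tr "C:\Scripts\backup_1d.bat" /sc daily /st 21:00
```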

So far, it works like a charm, saving not only precious data but also invaluable time that can be spent elsewhere. As ideas are worth contemplating, we are thinking about adding a cleaner to get rid of stale data each quarter, automatically uploading backups to a personal SFTP server, and reporting over email.