Need Suggestion: Automatic backup of File Systems to NAS

mattlach
Level 4
Posts: 323
Joined: Mon May 30, 2011 8:34 pm

Need Suggestion: Automatic backup of File Systems to NAS

Post by mattlach »

Hey everyone,

I recently had a boot drive on my workstation containing a dual boot of Mint 20.3 and Windows 10 Pro fail on me.

I always save my files to my NAS directly, and it is both redundant and backed up, so I have not lost anything irreplaceable, but it sure takes a lot of time to reinstall everything and get it set up and configured the way you like it. Because of this, I am leaning towards getting more serious on backups even on things that I CAN replace. (well, technically, spending a weekend recreating everything the way I like it represents lost time that I can never replace, so there is that)

Are there any good options for LAN backups out there?

My ideal solution would be something that could dump my partitions to my NAS when I shut the PC down. I click shut down, dump starts, and I go to bed.

What would be even better is if the solution could do some sort of rsync magic on the block level in the partitions, and only dump what has changed since the last time.

Even cooler would be if this resulted in some sort of snapshots, so I can go back and look at previous backups.

So my question is, does anything like this exist?
1.) It would have to work with ext4 and NTFS file systems
2.) It would need to transfer data to a remote LAN backup via NFS or SMB
3.) It would need to be executed at system shutdown time, preferably with some rules preventing it initiating during a reboot
4.) Ideally it would be able to do so by only transferring blocks that have changed since the last incremental dump
5.) Ideally it would keep each incremental dump state as a snapshot that can be retroactively reviewed.

I do something similar with ZFS on the NAS server: it snapshots important data nightly and syncs the snapshots to a remote server over the WAN using ZFS send/recv. But for a number of reasons ZFS is not a viable option on my workstation, so some other tool is necessary.

Timeshift is great, except for two things:
1.) It only backs things up locally
2.) It can't handle my NTFS partitions.

I'd appreciate any suggestions anyone might have. If there is nothing out there that meets my requirements, I'd also appreciate hearing your recommendations for good alternatives.

Thanks,
Matt
Corsair 1000D, Threadripper 3960x, Asus ROG Zenith II, 64GB, Samsung 990 Pro, Geforce RTX 4090, 42" LG C3, 2x Dell U2412M, Schiit Bifrost Multibit DAC
Server: AMD EPYC 7543(32C/64T), SuperMicro H12SSL-NT, 512GB RAM, 192TB ZFS
Lady Fitzgerald
Level 15
Posts: 5812
Joined: Tue Jan 07, 2020 3:12 pm
Location: AZ, SSA (Squabbling States of America)

Re: Need Suggestion: Automatic backup of File Systems to NAS

Post by Lady Fitzgerald »

My suggestion is to use manual backups, not automatic ones. An automatic backup could automatically "back up" an infection, such as ransomware, or a mistake, such as an accidental deletion or formatting.

Backups need to have the drive, NAS, or whatever they are stored on powered down and disconnected from the computer; otherwise they are just redundancy, not a backup.
Jeannie

To ensure the safety of your data, you have to be proactive, not reactive, so, back it up!
mattlach
Level 4
Posts: 323
Joined: Mon May 30, 2011 8:34 pm

Re: Need Suggestion: Automatic backup of File Systems to NAS

Post by mattlach »

Lady Fitzgerald wrote: Thu Jun 23, 2022 5:17 pm My suggestion is to use manual backups, not automatic ones. An automatic backup could automatically "back up" an infection, such as ransomware, or a mistake, such as an accidental deletion or formatting.
The benefit of automatic backups is they happen even when you forget, and most people forget most of the time. With my recent drive failure, I could have sworn I had recently imaged my drives. Whoops, that was in 2018. We have busy lives with too much going on to keep on top of such things.

If it isn't automatic, it winds up being useless.

And backing up files encrypted with ransomware, or with corruption isn't a problem if the system maintains snapshots so you can still access previous versions. The worst that will happen is that when trying to backup a drive that has been exposed to a ransomware attack, it fills the backup drive because ALL the files being backed up are new/changed. A nuisance, but you just remove the latest snapshot and revert to a previous one and restore that.

And it doesn't need to be offline cold storage to be a backup. Certainly that is an extra step which adds additional safety, but at the same time, cold storage backups degrade over time as well.

You have to match your backup strategy to the expected threat and the level of seriousness of loss.

For my home system, I have set up all of my clients to store their files to the NAS, which is ZFS based. It takes nightly snapshots which it keeps locally and then proceeds to sync at the changed block level to my remote server out on the WAN using ZFS Send/Recv.

Daily snapshots are then kept locally and remotely for 7 days, weekly snapshots for 4 weeks, monthly snapshots for 12 months, and annual snapshots indefinitely.

All of this is scripted using cron, both backing stuff up and purging old snapshots. Both the local and remote servers have redundant drives: locally in a RAID60-equivalent configuration of two ZFS RAIDZ2 VDEVs, and remotely in one large RAIDZ3 VDEV (since it is limited by WAN speed anyway) with three redundant drives, which could be called RAID7, except that no such standard exists.
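
For the curious, the nightly job boils down to something like the sketch below. The dataset, pool, and host names are placeholders rather than my real layout, and tools like sanoid/syncoid will do the retention bookkeeping for you if you'd rather not script it yourself:

    #!/bin/sh
    # Rough sketch of a nightly ZFS snapshot + incremental replication job.
    # Assumes at least one previous @daily- snapshot already exists.
    set -eu

    DATASET=tank/data        # local dataset to protect (placeholder)
    TARGET=backup/data       # dataset on the remote box (placeholder)
    REMOTE=offsite-host      # reachable over SSH (placeholder)

    TODAY=$(date +%Y-%m-%d)
    # The newest existing daily snapshot becomes the incremental base.
    PREV=$(zfs list -H -t snapshot -o name -s creation "$DATASET" | grep '@daily-' | tail -n 1)

    zfs snapshot "${DATASET}@daily-${TODAY}"

    # Only the blocks changed since $PREV cross the WAN.
    zfs send -i "$PREV" "${DATASET}@daily-${TODAY}" | ssh "$REMOTE" zfs recv "$TARGET"

    # Retention: keep the 7 newest dailies locally, destroy the rest.
    zfs list -H -t snapshot -o name -s creation "$DATASET" \
        | grep '@daily-' | head -n -7 \
        | xargs -r -n 1 zfs destroy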

The only way I'm losing data is if I lose both the local and remote server, which is highly unlikely. Even if I sync ransomware encoded data, it would only impact my latest snapshot on both sides, which would be easily remedied.

For a hardware loss, I'd have to have more than two drives fail locally and more than three drives fail remotely at the same time without me noticing, or a catastrophic event such as flood or fire at both locations at once. A successful ransomware attack destroying all of my data would require it to compromise root on both the local and remote servers.

I guess nothing is impossible, but the chances here have to be vanishingly small.
Corsair 1000D, Threadripper 3960x, Asus ROG Zenith II, 64GB, Samsung 990 Pro, Geforce RTX 4090, 42" LG C3, 2x Dell U2412M, Schiit Bifrost Multibit DAC
Server: AMD EPYC 7543(32C/64T), SuperMicro H12SSL-NT, 512GB RAM, 192TB ZFS
Lady Fitzgerald
Level 15
Posts: 5812
Joined: Tue Jan 07, 2020 3:12 pm
Location: AZ, SSA (Squabbling States of America)

Re: Need Suggestion: Automatic backup of File Systems to NAS

Post by Lady Fitzgerald »

mattlach wrote: Thu Jun 23, 2022 5:42 pm
Lady Fitzgerald wrote: Thu Jun 23, 2022 5:17 pm My suggestion is to use manual backups, not automatic ones. An automatic backup could automatically "back up" an infection, such as ransomware, or a mistake, such as an accidental deletion or formatting.
The benefit of automatic backups is they happen even when you forget, and most people forget most of the time. With my recent drive failure, I could have sworn I had recently imaged my drives. Whoops, that was in 2018. We have busy lives with too much going on to keep on top of such things.

If it isn't automatic, it winds up being useless...
Oh boy. I hate threads like these because there are so many misunderstandings about backups, and things can quickly become viciously contentious, so let's all please keep things civil.

I will willingly concede that a backup that doesn't happen is useless. However, for one to happen, it doesn't need to be automatic. Remembering to do things is especially difficult for me because of my ADHD. To help counter that difficulty, I have recurring calendar entries to remind me to do certain things on certain days of the month--some recur weekly and others monthly--and some of those are to update backups. I also had to cultivate habits, such as looking at the calendar on my desktop first thing when I boot up in the morning, and updating data backups before going to bed at night. Doing it every night makes it a routine I don't even have to think about. Since all that gets backed up each night is only what was added, changed, or deleted during that day, the updates go pretty quickly.

Also, how one's system is set up, what software is used, and what hardware is used make a difference, since the correct setup and software can make creating and updating backups easier and faster, which makes it far more likely that backups actually get updated. The way I'm set up, I can update the backups for the roughly 11.4 TB of data I have in just a handful of minutes most nights, including connecting and disconnecting my backup drives. Most people will not have nearly as much data as I do, so everything, software and hardware, will be simpler for them.

One key to simplifying backups is to keep one's system files segregated from one's data files, which is contrary to how Mint normally gets installed. I do not keep any data in /home. Instead, I use separate data drives or partitions for my data. On the big laptop that is my daily driver, I have the luxury of being able to install Mint on a separate boot drive and keep my data on separate data drives. On my little Lenovo one-drive wonder, I installed Mint conventionally but carved out the bulk of the drive for a data-only partition.

Segregating system files from data files allows the use of different software that is best suited for the different kinds of files being backed up.

mattlach wrote: Thu Jun 23, 2022 5:42 pm ...And backing up files encrypted with ransomware, or with corruption isn't a problem if the system maintains snapshots so you can still access previous versions. The worst that will happen is that when trying to backup a drive that has been exposed to a ransomware attack, it fills the backup drive because ALL the files being backed up are new/changed. A nuisance, but you just remove the latest snapshot and revert to a previous one and restore that...
I'm thinking you don't fully understand what ransomware can do. It infects all files on your computer--system and data--and locks them down. If you have "backups" on the computer, they will also get locked down so you can't access them. The only way to keep backups from getting locked down like that is for them to be inaccessible to the ransomware, and that can only be done by keeping them on drives or other media that are kept powered down and disconnected from the computer except while updating the backups. Since ransomware can also attack while you are updating a backup, it is essential to have more than one so you will still have another backup to fall back on if one gets infected while being updated. There are other reasons for multiple backups, but this post has already become too long.

mattlach wrote: Thu Jun 23, 2022 5:42 pm ...And it doesn't need to be offline cold storage to be a backup. Certainly that is an extra step which adds additional safety, but at the same time, cold storage backups degrade over time as well...
Cold storage is more archival than backup. Backups need constant maintenance to keep them up to date and viable, and, for reasons I've already explained, they must be kept disconnected and powered down to be safe.

mattlach wrote: Thu Jun 23, 2022 5:42 pm ...You have to match your backup strategy to the expected threat and the level of seriousness of loss...
Actually, it's more important to match your backup strategy to the types of files being backed up. Backup software that is best suited for system files is too clunky and cumbersome for data files. Backup software that is best suited for data files usually will not work on system files (or not work well). That is the reason I recommend segregating system files from data files. There is no need to weigh the effectiveness of a backup strategy against one's loss tolerance when one can easily have full protection.

mattlach wrote: Thu Jun 23, 2022 5:42 pm ...For my home system, I have set up all of my clients to store their files to the NAS, which is ZFS based. It takes nightly snapshots which it keeps locally and then proceeds to sync at the changed block level to my remote server out on the WAN using ZFS Send/Recv.

Daily snapshots are then kept locally and remotely for 7 days, weekly snapshots for 4 weeks, monthly snapshots for 12 months, and annual snapshots indefinitely.

All of this is scripted using cron, both backing stuff up and purging old snapshots. Both the local and remote servers have redundant drives: locally in a RAID60-equivalent configuration of two ZFS RAIDZ2 VDEVs, and remotely in one large RAIDZ3 VDEV (since it is limited by WAN speed anyway) with three redundant drives, which could be called RAID7, except that no such standard exists...
Here is where you show you do not fully understand the difference between backups and redundancy. An overly simple way to differentiate between backups and redundancy is one can recover lost data from a backup, even if the entire computer fails, but not from redundancy. Redundancy will protect from drive failure, allowing a computer to continue working if a drive should fail.

What you are describing here is redundancy. There is nothing wrong with redundancy--in fact, it can be critical when continued operation is imperative even if a drive fails--but it isn't a backup.

mattlach wrote: Thu Jun 23, 2022 5:42 pm ...The only way I'm losing data is if I lose both the local and remote server, which is highly unlikely. Even if I sync ransomware encoded data, it would only impact my latest snapshot on both sides, which would be easily remedied.

For a hardware loss, I'd have to have more than two drives fail locally and more than three drives fail remotely at the same time without me noticing, or a catastrophic event such as flood or fire at both locations at once. A successful ransomware attack destroying all of my data would require it to compromise root on both the local and remote servers...
I disagree. You can easily lose data, even with all your redundancy. A lightning strike on a nearby power line can blow through any surge protection you have and fry your computer and servers. Unless your remote servers aren't mirroring your data (in which case, you aren't automatically updating your data), you will lose your data.

Ransomware will lock any data it can access. If you are able to access your remote servers all the time, ransomware (and other malware) will also access them and lock any files there. The only way to keep ransomware off a drive is to make it inaccessible and the easiest and best way to do that is keep the backup drives disconnected from the computer and powered down.

I manually make two Timeshift snapshots every Monday morning when I first boot up the computer (my calendar reminds me to do it); one is saved to a folder on one of my internal data drives and the other gets saved to an external backup drive. I'll also make a snapshot before making any major changes to settings, installing or deleting a program, or installing a potentially dodgy update. I only save those to the internal data drive since I can always fall back on the weekly one if I lose the internal drive. I also cull older snapshots manually. I do all this on a different workspace from the one I normally work in so it doesn't interfere with anything else I may be doing. Since I don't keep any data in /home, I include it in the snapshot to back up my settings. The entire operation takes less than five minutes of my time.
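
For anyone who would rather script the same thing than click through the GUI, Timeshift also has a command line interface; roughly (the comment text and snapshot name below are only examples):

    # Roughly what the Monday snapshot amounts to, from a terminal.
    sudo timeshift --create --comments "weekly snapshot" --tags W
    # Listing and culling older snapshots can be done the same way.
    sudo timeshift --list
    sudo timeshift --delete --snapshot '2022-06-20_09-00-01'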

On Saturday mornings (again, per my calendar) I make a Rescuezilla image of the boot drive. This normally takes around five or six minutes. The computer is tied up for this time, so I usually kill the time by heading to the bathroom to take care of other, more pressing business. I manually cull older images. The images get saved to an internal drive in the laptop but then get backed up to an external backup drive that evening. On the first Saturday of the month (again, after being prompted by my calendar), I "restore" the image I just made to a spare boot drive I connect to the computer with an external enclosure, then swap the freshly restored boot drive with the one in the computer (this takes only a couple of minutes, including the time spent blowing out any dust inside while it's still opened up) to test the image and make sure the ones I've been making are still valid (so far, all have been). I have a total of three boot drives (NVMe) that I keep in rotation.

Imaging is the best way to back up system files, but it is too slow and inefficient for backing up data since the images would be gargantuan. Instead, I use a folder/file syncing program called FreeFileSync (FFS) to essentially make a clone of each data drive in my laptop: one 4TB SATA SSD, one 8TB SATA SSD, and one 8TB NVMe SSD divided into two 4TB partitions (this allowed me to use retired 4TB SATA SSDs for backup drives). I have multiple backup drives, both onsite and offsite, but to keep this simple, I'll first describe the dedicated set for the laptop.

FFS works by comparing a source drive (the drive or partition in the computer) with a destination drive. It will then copy any new or changed files on the source drive that are not on the destination drive to the destination drive. Any files on the destination drive that are not on the source drive will get deleted from the destination drive (files on the source drive are never touched). Deleted files get sent to a Versioning folder in case one should not have been deleted. Since I update daily, it normally takes only a few minutes to update and a few more to examine the files in the versioning folder to make sure they are safe to be permanently zapped. Including set up and teardown, I rarely spend more than five or six minutes each night.
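
For anyone who prefers the terminal, roughly the same mirror-plus-versioning behaviour can be approximated with rsync; this is only an analogy to what FFS does, and the paths are made up:

    # Mirror the source drive to the backup drive: copy new/changed files,
    # remove files deleted from the source, but move those deletions into a
    # dated versioning folder instead of discarding them outright.
    rsync -aHv --delete \
          --backup --backup-dir="/media/backup1/versioning/$(date +%F)" \
          /media/data1/ /media/backup1/current/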

I have four backup drives, each one in its own enclosure. To keep them from occupying too much desk space while updating the backups, I made an aluminum bracket to hold four of them together vertically. It takes less than a minute to dig out the backup drives and plug them into the computer (fortunately, I have four USB ports on the left side of my laptop) and the same for unplugging them and putting them away (I leave the cables connected to the enclosures).

I'm between desktop drives right now but I'm still maintaining the four data drives for the next desktop computer (data on the desktop computer will be identical to data on the laptop) so I also have them in enclosures that are in a bracket. To keep them up to date, I treat them like backup drives and update them every night along with the laptop backup drives. Each desktop drive has a set of four backup drives. Two of each set are kept onsite and the other two are kept offsite. The onsite and offsite drives get swapped out every month to keep the offsite drives as up to date as possible. I update these only once a week.

Since the desktop computer drives and the laptop have the same data, the offsite backups for the desktop serve as the offsite backups for the laptop.

Again, these get done without fail because of the calendar reminders except for the nightly backups; those have become a habit.
Jeannie

To ensure the safety of your data, you have to be proactive, not reactive, so, back it up!
mattlach
Level 4
Posts: 323
Joined: Mon May 30, 2011 8:34 pm

Re: Need Suggestion: Automatic backup of File Systems to NAS

Post by mattlach »

Lady Fitzgerald wrote: Thu Jun 23, 2022 8:57 pm I'm thinking you don't fully understand what ransomware can do. It infects all files on your computer--system and data--and locks them down. If you have "backups" on the computer, they will also get locked down so you can't access them. The only way to keep backups from getting locked down like that is for them to be inaccessible to the ransomware, and that can only be done by keeping them on drives or other media that are kept powered down and disconnected from the computer except while updating the backups
Lady Fitzgerald wrote: Thu Jun 23, 2022 8:57 pm What you are describing here is redundancy. There is nothing wrong with redundancy--in fact, it can be critical when continued operation is imperative even if a drive fails--but it isn't a backup.
Sorry, I am going to have to disagree completely.

I am well aware that redundancy is not backup. You don't - for instance - get a backup by simply mirroring or "RAID:ing" your storage. That is redundancy, but that is not what I am doing.

It does not need to be either offline or disconnected to be a backup. You are just flat out wrong on this, and I don't know where you ever got this crazy notion. For something to be a backup, it simply needs to exist as a full duplicate in a place that is different from where the first copy exists. That is all that is required for something to be a backup.

So, in my example, I have both redundancy on my nearline NAS (this is provided through ZFS pools) AND a backup on a remote server, and as a cherry on top that remote server ALSO has redundancy through ZFS. This results in a completely separate copy residing on a completely separate server in a completely separate geographical location.

That's all there is to it.
Lady Fitzgerald wrote: Thu Jun 23, 2022 8:57 pm I'm thinking you don't fully understand what ransomware can do. It infects all files on your computer--system and data--and locks them down. If you have "backups" on the computer, they will also get locked down so you can't access them. The only way to keep backups from getting locked down like that is for them to be inaccessible to the ransomware, and that can only be done by keeping them on drives or other media that are kept powered down and disconnected from the computer except while updating the backups

Ransomware can only overwrite what the infected user has write access to.

In my case the NAS server shares folders over NFS and SMB to individual users. Even if an individual user is a fool and gets themselves ransomwared, the malware is only able to encrypt the files that user has write access to. The overall client system cannot be locked down unless the user is a fool and runs their day-to-day activities in an administrative account.

A compromised user can indeed encrypt and overwrite all the files in their own share folder on the NAS, but that is where it will stop. The NAS server will not itself execute said ransomware; it will just affect that user's share folder.
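
To make that concrete, a per-user share on the NAS can be as simple as this (user names, paths, and the client address are examples only); the client machine never gets write access outside its own tree:

    # /etc/exports on the NAS -- one export per user, nothing else visible.
    /tank/shares/matt   192.168.1.10(rw,sync,no_subtree_check)

    # Equivalent Samba share, writable only by that one account.
    [matt]
       path = /tank/shares/matt
       valid users = matt
       writable = yes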

In my case, on both the primary and the backup server, ZFS is used to create snapshots. These snapshots are read-only, even to root, by design of the ZFS file system. The only thing that can happen to them is that root can remove them by issuing the "zfs destroy" command.

Both the local NAS and the remote backup server are headless dedicated servers. No one is browsing or even using a GUI on them. Heck, except for periodic maintenance and updates, which take place via SSH and are highly unlikely to cause any infection, no one ever even logs in to them or uses the root account on them.

If a client executes something that starts an encryption and overwriting process on their local machine, and this also encrypts the files in their share folder on the NAS, the damage is limited to the locations that client can write to, not the rest of the system. It would then be fairly trivial to wipe and reimage the client machine, revert the snapshot on the NAS side to before the files were encrypted, and you are back in business, having lost only the data since the last snapshot.
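
The recovery on the NAS side is essentially a one-liner per dataset (the dataset and snapshot names here are illustrative):

    # As root on the NAS: list the snapshots of the affected share...
    zfs list -t snapshot -o name,creation tank/shares/matt
    # ...and roll back to the last clean one. -r destroys any snapshots
    # newer than the target, i.e. the ones taken after the encryption.
    zfs rollback -r tank/shares/matt@daily-2022-06-22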

In order for it to go any further than that, the malware would have to break out of the user account and somehow get into the root account, AND be programmed to know how to use zfs to delete snapshots.

So on a Linux/Unix system, unless the user is an idiot and running as root (the equivalent of which many Windows users do), the furthest a ransomware attack is going to get is encrypting the user's home folder. The user may have executed the malware, but it literally cannot write anywhere outside that folder, so that is where it would end. You could just log the user out, log in as a different user, wipe and recreate their home folder, and you are back in business.

In my case I also have the NAS shares which the user has write access to, so those could also be overwritten, but again, the snapshots on the server cannot be destroyed by the ransomware unless it somehow gains server root privileges.

You can over-complicate things by also having offline backups if you like doing lots of extra work, but that is mostly unnecessary and covers some rather extreme corner cases, like a novel malware that has completely compromised the operating system's user management and permissions, or broken the kernel somehow and achieved privilege escalation. This might happen if you are running outdated software, but it is not a realistic failure mode if you keep up to date. Privilege-escalation vulnerabilities are usually pretty rare, and patched VERY quickly. Unless you are some sort of high-value target of state-sponsored attacks, this is really not something worth worrying about. If - however - you are, you might expect novel zero-day exploits to be used against you, and you may need to take more extreme measures, like air-gapping your systems (a lot of good that did the Iranians at Natanz) and doing offline backups.
Corsair 1000D, Threadripper 3960x, Asus ROG Zenith II, 64GB, Samsung 990 Pro, Geforce RTX 4090, 42" LG C3, 2x Dell U2412M, Schiit Bifrost Multibit DAC
Server: AMD EPYC 7543(32C/64T), SuperMicro H12SSL-NT, 512GB RAM, 192TB ZFS
Lady Fitzgerald
Level 15
Posts: 5812
Joined: Tue Jan 07, 2020 3:12 pm
Location: AZ, SSA (Squabbling States of America)

Re: Need Suggestion: Automatic backup of File Systems to NAS

Post by Lady Fitzgerald »

So much for keeping this discussion civil. I see no point in continuing with it.
Jeannie

To ensure the safety of your data, you have to be proactive, not reactive, so, back it up!
cliffcoggin
Level 8
Posts: 2297
Joined: Sat Sep 17, 2016 6:40 pm
Location: England

Re: Need Suggestion: Automatic backup of File Systems to NAS

Post by cliffcoggin »

Lady Fitzgerald wrote: Fri Jun 24, 2022 2:51 am So much for keeping this discussion civil. I see no point in continuing with it.
I see nothing uncivil about the argument so far. Tedious and repetitive maybe, but not uncivil.
Cliff Coggin
Lady Fitzgerald
Level 15
Posts: 5812
Joined: Tue Jan 07, 2020 3:12 pm
Location: AZ, SSA (Squabbling States of America)

Re: Need Suggestion: Automatic backup of File Systems to NAS

Post by Lady Fitzgerald »

cliffcoggin wrote: Fri Jun 24, 2022 1:36 pm
Lady Fitzgerald wrote: Fri Jun 24, 2022 2:51 am So much for keeping this discussion civil. I see no point in continuing with it.
I see nothing uncivil about the argument so far. Tedious and repetitive maybe, but not uncivil.
Did you miss where he said, "...I don't know where you ever got this crazy notion..."? If not the comment itself, the tone of it is not what I would call civil.

I've been down this road on many other forums and the flame war always starts like this. I can link various experts' articles and videos and their arguments will not be accepted, assuming the articles or videos were even read or watched (and most of the responses I have received strongly suggest they were not). So, again, I see no point in continuing this discussion.
Jeannie

To ensure the safety of your data, you have to be proactive, not reactive, so, back it up!
mikeflan
Level 17
Posts: 7136
Joined: Sun Apr 26, 2020 9:28 am
Location: Houston, TX

Re: Need Suggestion: Automatic backup of File Systems to NAS

Post by mikeflan »

I agree with Lady Fitzgerald here. If you are like me - a single user hitting on dedicated drives - then manual backups are worth the time it takes to do them. You keep in touch with your data better. If you have four users hitting on the same drives, then maybe the answer is different. Or if you REALLY don't have the time to commit to proper backups, then maybe the answer is different. If you are in the latter case, then the logical answer is to go to a paid solution, or hire somebody else to do your backups properly.
AndyMH
Level 21
Posts: 13738
Joined: Fri Mar 04, 2016 5:23 pm
Location: Wiltshire

Re: Need Suggestion: Automatic backup of File Systems to NAS

Post by AndyMH »

Are there any good options for LAN backups out there?

My ideal solution would be something that could dump my partitions to my NAS when I shut the PC down. I click shut down, dump starts, and I go to bed.

What would be even better is if the solution could do some sort of rsync magic on the block level in the partitions, and only dump what has changed since the last time.
LAN backup, not that I'm aware of, but you might have a look at Veeam. I had a cursory look at it a few years back; unless things have changed, the Linux version is terminal-only.

rsync and block level - no, rsync works at the file level.

When I first started with Mint, I wrote my own backup scripts using rsync to a local drive (it was a 256GB SD card). Subsequently I discovered timeshift & backintime. There is no reason why this wouldn't work over a LAN, with the NAS mounted via fstab (what make is your NAS?). But... if you want to back up system files you would need to use an NFS mount (to preserve the file attributes); if just data files, then a CIFS mount would be okay.
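
For the fstab side of that, something along these lines (the NAS address, export path, and mount point are placeholders, and you would use one entry or the other, not both):

    # /etc/fstab -- NFS keeps ownership and permissions, so system backups survive:
    192.168.1.20:/export/backups  /mnt/nasbackup  nfs   defaults,noauto,x-systemd.automount  0  0
    # CIFS/SMB alternative, fine for plain data files:
    //192.168.1.20/backups        /mnt/nasbackup  cifs  credentials=/etc/nas-creds,noauto    0  0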

To take care of shutting down, you could define a systemd service for this. When I did my backup script this was all too difficult, so I had a launcher on the desktop to run the script; instead of shutting down as normal, the script shut the system down when it had finished.
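
If you do go the systemd route, the usual trick is a oneshot service whose ExecStop= runs the backup, so it fires at shutdown while the network and the NAS mount are still up. A minimal sketch, with a made-up script path:

    # /etc/systemd/system/nas-backup.service  (sketch only)
    [Unit]
    Description=Back up to NAS at shutdown
    Wants=network-online.target
    # Ordered after the network and remote mounts, so at shutdown this
    # service is stopped (and the backup runs) before they are torn down.
    After=network-online.target remote-fs.target

    [Service]
    Type=oneshot
    RemainAfterExit=yes
    ExecStart=/bin/true
    ExecStop=/usr/local/bin/backup-to-nas.sh
    TimeoutStopSec=infinity

    [Install]
    WantedBy=multi-user.target

Skipping the run on a reboot (as you want) is the fiddly bit; one approach is to have the script check the output of systemctl list-jobs for reboot.target and exit early, but treat that as untested.
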
Thinkcentre M720Q - LM21.3 cinnamon, 4 x T430 - LM21.3 cinnamon, Homebrew desktop i5-8400+GTX1080 Cinnamon 19.0
Petermint
Level 9
Posts: 2983
Joined: Tue Feb 16, 2016 3:12 am

Re: Need Suggestion: Automatic backup of File Systems to NAS

Post by Petermint »

You could create a small shutdown script. Backup software is often designed to also run from the command line. Your script would run the backup command then the shutdown command.

Your script could run several backups, to the LAN and to external disks. Each backup could be a different type. Rsync. Image copy. Backintime. There is no limit.
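
A minimal sketch of such a script (the paths and mount point are just examples):

    #!/bin/sh
    # Back up, then power off -- run this instead of the normal shutdown button.
    set -e

    # Home directory to the NAS (already mounted via fstab):
    rsync -a --delete /home/matt/ /mnt/nas/backups/matt/

    # Second copy to an external disk, if it happens to be plugged in:
    if [ -d /media/matt/usbdisk ]; then
        rsync -a /home/matt/ /media/matt/usbdisk/matt/
    fi

    # A partition image step (dd, partclone, ntfsclone, ...) could be added here too.

    systemctl poweroff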

For Ext4 to Ext4, Backintime is good for your home directory and can be run from the command line. Snapshotting is almost perfect on Ext4 to Ext4 because it can hard-link unchanged files between snapshots.