scorp123 wrote: I agree here - my solution doesn't provide that. On purpose. As you correctly said -- feeding 'tar' the right parameter would be easy. But experience has taught me that incremental backups --while being a time-saver when making backups-- can cost you twice that time again when you have to restore things, especially if things go seriously wrong, e.g. when for some stupid reason you don't remember anymore that important files are missing from the incremental backup, but some moron --maybe even yourself?-- deleted the last good full backup ... Murphy's Law: if things can go wrong they sure as hell will go wrong, and you'll be the one banging his head against the table.
The most common fault I have experienced is not someone else deleting my backup files; it's a corrupted backup file. There is nothing more terrifying than, in the midst of repairing a critical system, watching your tools reject your last hope.
While it violates the KISS principle, putting together a system that creates separate full and incremental backup archives, rotated over a period by a cron job (e.g., a full backup Sunday night and incrementals through the week, keeping two weeks' worth of backup files), gives you a lot more safety than a single archive. Frequently modified files (probably the ones you most care about recovering) tend to end up in more than one archive (one or more of the incrementals plus the full archive).
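For anyone who wants to try it, here is a rough sketch of what such a scheme might look like using GNU tar's `--listed-incremental` (`-g`) snapshot files. All paths and the rotation policy are just examples; for demonstration it works on throwaway temp directories rather than real data.

```shell
#!/bin/sh
# Sketch of a weekly full + daily incremental scheme with GNU tar.
# In real use, point SRC at your data, DEST at your backup disk, and
# run this nightly from cron; temp directories here are placeholders.
set -e

SRC=$(mktemp -d)              # stand-in for the data to protect
DEST=$(mktemp -d)             # stand-in for the backup location
SNAP="$DEST/snapshot.snar"    # tar's incremental state file

echo "important data" > "$SRC/notes.txt"

# Sunday night (or first run): fresh snapshot file = full (level 0) backup
rm -f "$SNAP"
tar -C "$SRC" -czg "$SNAP" -f "$DEST/full.tar.gz" .

# Weeknights: only files changed since the snapshot go into the archive
echo "monday's changes" > "$SRC/new.txt"
tar -C "$SRC" -czg "$SNAP" -f "$DEST/incr-mon.tar.gz" .

# A simple rotation would name $DEST by week number and keep only the
# last two weeks' directories, deleting anything older.
ls "$DEST"
```

Restoring then means unpacking the full archive first, followed by each incremental in order, which is exactly the extra work scorp123 warns about -- but you get back far more restore points for the disk space.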
Unfortunately, the typical self-supporting Linux user (especially a newbie) is unlikely to set something like this up. The best fallback, then, is to do a full backup regularly, verify the archive, and store the archive reliably.
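Verifying the archive doesn't have to be elaborate; a few commands catch most of the problems. A minimal sketch -- the temp files are stand-ins for a real backup, the four checks are the point:

```shell
#!/bin/sh
# Minimal verification pass over a gzip-compressed tar archive.
set -e
SRC=$(mktemp -d)
echo "data" > "$SRC/file.txt"
ARCHIVE="$SRC/backup.tar.gz"
tar -C "$SRC" -czf "$ARCHIVE" file.txt

gzip -t "$ARCHIVE"               # compression layer is intact
tar -tzf "$ARCHIVE" > /dev/null  # tar structure is readable end to end
tar -C "$SRC" -dzf "$ARCHIVE"    # archived contents match the live files
sha256sum "$ARCHIVE"             # record a checksum to re-check copies later
```

Note that `tar -d` (compare) only proves the archive matches the files as they are *now*; run it right after the backup, before anything changes.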
Regarding reliable storage, several things are worth pointing out to newbies. CDs are not inherently reliable. Tape is susceptible to magnets, disuse, and heat. Rotating disks fail. Those little flash widgets get lost.
The answer? Don't depend on a single copy of a single archive to save the day. If you use hard disk backup (which I do at home), use a basic RAID setup for reliability. Off-the-shelf terabyte RAID 1 (simple mirror) devices are cheap these days (that's 500GB usable, replicated), and if you are the least bit handy, you can even build one yourself. (I see no need to go beyond RAID 1 for replicated backup purposes.)
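If you go the build-it-yourself route, Linux's md software RAID makes a two-disk mirror straightforward. A hedged sketch with mdadm -- the device names and mount point are placeholders, and be warned that these commands wipe whatever is on the disks:

```shell
# Replace /dev/sdb1 and /dev/sdc1 with your own partitions.
mdadm --create /dev/md0 --level=1 --raid-devices=2 /dev/sdb1 /dev/sdc1
mkfs.ext4 /dev/md0            # put a filesystem on the mirror
mount /dev/md0 /mnt/backup    # mount it as the backup target
cat /proc/mdstat              # check mirror health any time
```

The mirror protects you from a single drive dying, not from deleting the wrong file -- that's what the rotated archives are for.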
CDs should be treated with utmost caution. I would recommend that if you depend on CDs for backup, you find the best available *archival* media, and that you verify the media can be read back, preferably not just on the device that created it. For backup, IMO, CDs are slow and scary.
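The read-back check can be as simple as comparing the original archive against the copy on the disc, ideally in a different drive. A sketch -- here a plain file copy stands in for the mounted disc (something like /media/cdrom/backup.tar.gz) so the commands can be tried anywhere:

```shell
#!/bin/sh
# Compare the original archive against the copy read back from the medium.
set -e
ORIG=$(mktemp)
echo "archive contents" > "$ORIG"
COPY=$(mktemp)
cp "$ORIG" "$COPY"    # stand-in for the copy on the burned disc

cmp "$ORIG" "$COPY" && echo "media verified"  # byte-for-byte comparison
sha256sum "$ORIG" "$COPY"                     # or compare checksums by eye
```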
Tape has all sorts of physical issues, but is quite reliable if handled well. Tape must be used regularly, or it will fail due to winding pressure on the reel; thus, tape does wear out faster than other media. Tape is usually the first choice for pro IT shops (cheap media, and easy to set up and leave unattended for a week or longer), and the last choice for consumers. For the home user, I recommend hard disk, RAID-based backup, perhaps with critical personal files being periodically placed on flash, CD or DVD for redundancy.
I guess the bottom line from this rambling note is that the choice of GUI/non-GUI front end is really the least of your concerns when thinking through a backup solution. Regardless of whether you are setting things up on Linux, the Mac, or Windoze, serious thought needs to be put into the possible failure modes. scorp123 rightly touches on an important one: "What if the system has failed down to the command prompt level?" There are others as well, perhaps even more serious, or at least compounding the crisis: "What if I can't read the backup media?", "What if the computer is completely destroyed (or lost or stolen) and I have to recover onto new hardware?", "What if I have to recover critical files via some other operating system?", etc.