I don’t mean system files, but your personal and work files. I have been using Mint for a few years; I use Timeshift for system backups, but I archive my personal files by hand. That got me curious about what other people use. When you daily-drive Linux, what are your preferred backup tools? I have thousands of pictures, family movies, documents, personal PDFs, etc. that I don’t want to lose. Some are backed up to the cloud, but rather haphazardly. I would like a more systematic approach, with a tool that is user-friendly and easy to set up and schedule.

    • Quazatron@lemmy.world · 1 year ago

      BorgBackup is backup done right. Compressed, deduplicated, encrypted. After the initial backup, it takes only a few minutes to do a new backup. Need a specific file you deleted last week? Just mount a previous backup and copy the file back. It is that simple. Love it.
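
      For anyone curious, that workflow sketched minimally (temp dirs stand in for a real backup disk; the passphrase is an example only):

      ```shell
      set -e
      # A throwaway repo and some data to protect.
      repo="$(mktemp -d)/borg-repo"
      src="$(mktemp -d)"
      echo "hello" > "$src/note.txt"

      export BORG_PASSPHRASE='example-only-passphrase'

      # One-time: create an encrypted, deduplicating repository.
      borg init --encryption=repokey "$repo"

      # Each run: make an archive; after the first one, dedup keeps this fast.
      borg create --compression zstd "$repo::first" "$src"

      # Restore: extract (or borg mount) an archive into a scratch directory.
      restore="$(mktemp -d)"
      cd "$restore"
      borg extract "$repo::first"
      ```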

    • Freeman@lemmy.pub · 1 year ago

      So, just today actually, I wiped Ubuntu and installed Pop!_OS with btrfs, basically using this walkthrough, and set up Timeshift to manage snapshots.

      https://mutschler.dev/linux/pop-os-btrfs-22-04/

      But that’s not really a backup.

      I have a backup box I use for files, with rsync and the like. I still need to figure out a full backup method to my backup location, though.

      Might just set up an Ansible deployment and call it a day.

      • Lemmyin@lemmy.nz · 1 year ago

        I have to say that I used to be a Timeshift fan, but I’ve started moving to snapper instead. Both are very similar, but with snapper you can have multiple configs, one per subvolume, each with different settings. I like having separate root and home schedules set up, which means I can restore one or the other independently. Works a treat.

        • Freeman@lemmy.pub · 1 year ago

          Nice. I’ll check it out for sure. The post I followed also had a link to the author’s scripts that run a btrfs snapshot before apt runs.

          Frankly I just moved some configs over before I did the wipe. My Linux desktops aren’t too customized.

          I had to work around his how-to a bit, since I use NVMe and a pre-partitioned disk that I had to pre-format with LVM (he used a default install run to pre-format the disks).

    • Jo Miran@lemmy.ml · 1 year ago

      This is the way. A few test runs with non-critical files are always highly recommended, to make sure you’ve got your syntax right.
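
      With rsync, for instance, you can rehearse against throwaway directories first; -n (--dry-run) prints what would be transferred without touching anything:

      ```shell
      set -e
      src="$(mktemp -d)"; dst="$(mktemp -d)"
      echo "scratch" > "$src/test.txt"

      # Dry run: nothing is copied, you just see the plan.
      rsync -avn "$src/" "$dst/"
      test -z "$(ls -A "$dst")"

      # Syntax looks right? Run it for real and verify.
      rsync -av "$src/" "$dst/"
      diff -r "$src" "$dst"
      ```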

  • denny@feddit.de · 1 year ago

    Syncthing. I don’t want to invest in a NAS and add load to my already greedy power bill, so I chose something decentralized. Syncthing really just works like torrents, but for your personal files: whatever happens on the computer also happens on the phone and on the laptop. Each has about 1 TB of space and three times redundancy? Hell yea buddy, dig in.

    • nis@feddit.dk · 1 year ago

      But that’s not really backup, is it? It just synchronizes folders.

      • denny@feddit.de · 1 year ago

        Yes, but it is an automated backup solution if you want it to be. I just put important stuff in the Syncthing folder and rest assured it’s also on the phone in case the computer’s SSD catches fire.

        • nis@feddit.dk · 1 year ago

          I think you are confusing synchronizing with backup. If you delete a file in your Syncthing folder and the deletion gets synchronized, that file is lost. If you do the same in a folder backed up by, say, Borg, you can roll back the deletion and restore the file.

          I may be wrong about Syncthing, though. I haven’t used it yet, but will probably use it in the future. Just not for backup :)

          • denny@feddit.de · 1 year ago

            This is true if you leave it at the defaults, but I make use of file versioning. When you flick that on, files that would otherwise be replaced or deleted are instead moved to an offline .stversions folder. That is vital, I must say, in case a host catches some file-encrypting malware, eheh.
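
            For reference, this is roughly the relevant fragment of Syncthing’s config.xml (normally set from the GUI under Folder → Edit → File Versioning; the folder id, label, and path here are made up):

            ```xml
            <folder id="docs" label="Documents" path="/home/user/Sync" type="sendreceive">
              <!-- "trashcan" versioning moves replaced or deleted files into
                   .stversions instead of discarding them; cleanoutDays prunes
                   old versions after the given number of days. -->
              <versioning type="trashcan">
                <param key="cleanoutDays" val="30"/>
              </versioning>
            </folder>
            ```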

          • nis@feddit.dk · 1 year ago

              I didn’t know that was a possibility. Still, it seems like that’s not really what Syncthing is intended for. I mean, they even state it in their FAQ:

              No. Syncthing is not a great backup application because all changes to your files (modifications, deletions, etc.) will be propagated to all your devices. You can enable versioning, but we encourage you to use other tools to keep your data safe from your (or our) mistakes.

    • Fantasy@lemmy.world · 1 year ago

      I just found out about Syncthing yesterday and it really is superb; it’s so easy to use, even cross-platform. Unison is another syncing tool that I like; I find it better for bidirectional syncing.

  • Fryboyter@discuss.tchncs.de · 1 year ago

    I have been using Borg for years. So far, the tool has not let me down. I store the backups on external hard drives that are only used for backups. In addition, I save really important data at rsync.net and at Hetzner in a storage box. Which is not a problem, because Borg automatically encrypts locally, and for decryption in my case you need a password and a key file.

    Generally speaking, you should always test whether you can restore data from a backup, no matter which tool you use. Only then do you have a real backup. And an up-to-date backup should always additionally be stored off-site (cloud, at a friend’s or relative’s house, etc.), because if the house burns down, the external hard drive with the backups next to the computer is not much use.
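
    That test doesn’t have to be elaborate. The pattern is always the same: restore into a scratch directory, then compare with the live data. A minimal sketch, with tar standing in for whatever backup tool you use:

    ```shell
    set -e
    src="$(mktemp -d)"
    echo "important" > "$src/doc.txt"

    # "Backup": archive the data.
    backup="$(mktemp -d)/backup.tar.gz"
    tar -C "$src" -czf "$backup" .

    # The actual test: restore elsewhere and diff against the original.
    restore="$(mktemp -d)"
    tar -C "$restore" -xzf "$backup"
    diff -r "$src" "$restore"
    ```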

    By the way, I would advise against using just rsync, because, as the name suggests, rsync only synchronizes, so you don’t have multiple versions of a file. Old versions can be useful if you only notice later that a file became corrupted at some point.
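
    If you do want to stay with rsync, the usual workaround is --link-dest: every run writes a new snapshot directory, and unchanged files are hard-linked to the previous snapshot, so old versions survive at almost no extra cost. A sketch with throwaway directories:

    ```shell
    set -e
    src="$(mktemp -d)"; store="$(mktemp -d)"
    echo "v1" > "$src/file.txt"

    # First snapshot: a plain copy.
    rsync -a "$src/" "$store/snap-1/"

    # The file changes; the next run hard-links unchanged files to snap-1.
    echo "v2" > "$src/file.txt"
    rsync -a --link-dest="$store/snap-1" "$src/" "$store/snap-2/"

    # Both versions are now browsable as plain files.
    cat "$store/snap-1/file.txt"   # v1
    cat "$store/snap-2/file.txt"   # v2
    ```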

  • sneakyninjapants@sh.itjust.works · 1 year ago

    A Kopia repo on a separate disk dedicated to backups. I have Kopia on my servers as well, sending to my local S3 gateway, with a second copy going to Wasabi.

  • SymbolicLink@lemmy.ca · 1 year ago

    Restic and borg are the best I’ve tried for remote, encrypted backups.

    I personally use Restic for my remote backups and rsync for my local.

    Restic beats out borg for me because there are a lot more compatible storage options.

    • ebits21@lemmy.ca · 1 year ago

      Switched to Restic because then I don’t need any extra software on the server (Synology NAS in my case).

  • stravanasu@lemmy.ca · 1 year ago

    +1 for rsync, to an external hard drive. Super fast. Also useful in case I need a backup copy of a single file that I changed or deleted by mistake. Work files are also backed up to the cloud on mega.nz, which is very useful for cross-computer sync too. But I don’t trust personal files to the cloud.

    • omeara4pheonix@lemmy.zip · 1 year ago

      Don’t forget that a local backup is as bad as no backup at all in the case of a fire or other disaster. Not trusting the cloud is fine (though strong encryption can make it very safe), but looking into some kind of off-site backup is important. It could be as simple as a second hard drive that you swap out weekly and store in a safe deposit box, or a NAS at a trusted friend’s house.

      • stravanasu@lemmy.ca · 1 year ago

        Completely agree! I didn’t mention it, but I keep the backup hard drive in another apartment.

        This reminds me of a story from some university in England: they had two backups of a server in two different locations. One day one backup drive failed, and the second failed the day after. Apparently they were the same brand and model. The moral: also use different backup hardware brands or media!

        • andruid@lemmy.ml · 1 year ago

          3-2-1: 3 different backups, 2 different mediums, 1 off-site.

          Haven’t seen that not be a good move yet.

  • tool@r.rosettast0ned.com · 1 year ago

    At work/for business, you can’t beat Veeam. It’s the gold standard and there is literally nothing better.

    At home, Duplicity. Set it up once and then just let it go, and it supports a million different backup targets you can ship your backups off to, including the local filesystem. Has auto-aging/removal rules, easy restores, incrementals, etc. Encrypts by default too.

  • philipstorry@lemmy.world · 1 year ago

    My local backups are handled by rdiff-backup to a mirror set of disks. That means my data is versioned but easily accessible for immediate restore, and now on three disks (my SSD, and two rotating rust drives). It also makes restores as simple as copying a file if I want the latest version, or an easy command if I want an older version. And testing backups is as easy as a diff command to compare the backup version with the live version.

    Having your files just be files in your backup solution is very handy. At work I don’t mind having to use an application like Veeam, because I’m being paid to do that. At home I want to see my backups quickly and easily, because I’d rather be working on my files than wrestling with backup software…

    Remote backups are handled by SpiderOak, who have been fine for me for almost a decade. I also use them to synchronise my desktop and laptop computer. On my desktop SpiderOak also backs up some files in an archive area on the rotating rust mirror set - stuff that’s large and I don’t access often, so don’t need to put on my laptop but do want backed up.

    I also have a USB thumbdrive that’s encrypted and used when I’m travelling to back up changes on my laptop via a simple rsync copy - just in case I have limited internet access and SpiderOak can’t do its thing…

    I did also have a NAS in the mix once, but I realised that it was a waste of energy - both mine and electricity. In normal circumstances my data is in five places (desktop SSD, laptop SSD, the two disks of the desktop mirror set, and SpiderOak’s storage), and in the very worst case it’s in two (laptop SSD, USB thumbdrive). Rdiff-backup to the NAS was simply overkill once I’d added the local mirror set to my desktop, so I retired it.

    I’d added the local mirror set because I was working with large files - data sets and VM images - and backups over the network to the NAS were taking an age. A local set of cheap disks in my desktop tower was faster and yet still fairly cheap.

    Here’s my advice for your consideration:

    • Simple is better than complicated.
    • How you restore is more important than how you back up; perform test restores regularly.
    • Performance matters; backups that take ages are backups you won’t run.
    • Look to meet the 3-2-1 criteria; 3 copies, on 2 different storage systems, with at least 1 in a different geographic location. Cloud storage helps with this.

    Good luck with your backup strategy!

    • Zucca@sopuli.xyz · 1 year ago

      ⬆️ for rdiff-backup since it keeps the last backup easily readable.

      I used to have (and I think I’ll implement it again) a snapshot-capable filesystem that I rsynced my stuff to, then once a day took a snapshot of the backups. This has the advantage that all the backups are easily readable, as long as your backup filesystem is intact and your kernel can mount it.

  • OptimisticPrime@lemmy.fmhy.ml · 1 year ago

    I almost never see rdiff-backup in such threads, so I am bringing it up now. I really like how it works: it provides incremental backups while keeping the folder structure and files directly accessible. Works well enough for me.

    • average650@lemmy.world · 1 year ago

      I love rdiff-backup.

      I use it to back up a 30 TB array, and it completes in about 20 minutes if there are no changes.

    • philipstorry@lemmy.world · 1 year ago

      Absolutely - rdiff-backup onto a local mirror set of disks. As you say, the big advantage is that the last “current” entry in the backup is available just by browsing, but I have a full history just a command away. Backups are no use if you can’t access them, and people really under-rate ease of access when evaluating their backup strategy.

  • ISOmorph@feddit.de · 1 year ago

    I almost never see FreeFileSync mentioned in these threads. It’s the only GUI-based app I know of that also gives you the option not to propagate file deletions, for example. It can also be automated with crontab. Backups are not fragmented or repackaged, so you can browse them just fine. Encryption can be done with VeraCrypt.
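
    For the crontab part: FreeFileSync can save a configured sync as a .ffs_batch job and run it from the command line, so a nightly entry can look roughly like this (binary and job paths are examples for illustration; note that FreeFileSync is a GUI app, so the job needs a graphical session available):

    ```
    # crontab -e: run the saved batch job every night at 02:00
    0 2 * * * /opt/FreeFileSync/FreeFileSync /home/user/nightly.ffs_batch
    ```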