
I recently set up a backup workflow for myself. I rely heavily on restic for desktop backups and for a full system backup of my local server. It works amazingly well: I always have a versioned backup without a lot of redundant data, and it is fast, encrypted, and compressed.
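For reference, the core of such a restic cycle might look like the following; the repository path, exclude, and retention counts here are illustrative, not my exact setup:

restic -r /mnt/backup/restic init
restic -r /mnt/backup/restic backup ~/ --exclude ~/.cache
restic -r /mnt/backup/restic forget --keep-daily 7 --keep-weekly 4 --keep-monthly 12 --prune
restic -r /mnt/backup/restic check

The forget/prune step is what keeps the versioned history from accumulating redundant data, and check verifies repository integrity.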

But I wondered: how do you guys do your backups? What software do you use? How often do you run them, and what does your workflow look like?

[–] tasankovasara@sopuli.xyz 1 points 3 months ago* (last edited 3 months ago) (1 children)
  • daily important stuff (job stuff, the Documents folder, Renoise mods) is kept synced between laptop, desktop, and home server via Syncthing. A vimwiki also syncs with the phone. Sync happens only on the home network.

  • the rest of the laptop and desktop gets rolled into a tar backup every now and then with a quick bash alias. The tar files also get synced onto the home server's big file system (2 TB SSD) via Syncthing. The home server backs itself up on its own once a week.

  • the clever thing is that the 2 TB SSD replaced an old 2 TB spinning disk. I kept the old disk and set up a systemd timer that keeps it spun down, but starts and mounts it once a week, rsyncs the changes from the SSD over, then unmounts it so that it sleeps again for a week (a sketch of the unit pair follows below). That old drive is likely to serve for years still with this frugal use.
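A sketch of that weekly spin-up job as a systemd service/timer pair; the mount point, source path, and unit names are illustrative:

# /etc/systemd/system/cold-mirror.service
[Unit]
Description=Weekly rsync onto the cold spare disk

[Service]
Type=oneshot
ExecStart=/usr/bin/mount /mnt/cold
ExecStart=/usr/bin/rsync -aAX --delete /srv/ /mnt/cold/
ExecStart=/usr/bin/umount /mnt/cold

# /etc/systemd/system/cold-mirror.timer
[Unit]
Description=Run cold-mirror weekly

[Timer]
OnCalendar=weekly
Persistent=true

[Install]
WantedBy=timers.target

Enabled with systemctl enable --now cold-mirror.timer, the drive then spins up only for the weekly sync.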

[–] ouch@lemmy.world 1 points 3 months ago (1 children)

How do you make sure the disk spins down? Is unmounting enough?

[–] tasankovasara@sopuli.xyz 1 points 3 months ago

Unmounting is enough if the disk has spindown configured. I've got this in /etc/udev/rules.d/:

ACTION=="add", SUBSYSTEM=="block", KERNEL=="sd[a-z]", ENV{ID_SERIAL_SHORT}=="S2H7J9FZB02854", RUN+="/usr/bin/hdparm -S 70 /dev/%k"
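For reference, hdparm -S values from 1 to 240 are multiples of five seconds, so -S 70 spins the drive down after roughly six minutes of idle. A rule like this can be picked up without rebooting:

udevadm control --reload-rules
udevadm trigger --subsystem-match=block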

[–] golden_zealot@lemmy.ml 1 points 3 months ago

Using Timeshift. Very, very easy, works great.
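For anyone curious, snapshots can also be taken outside the GUI; a minimal sketch (the comment text is illustrative):

sudo timeshift --create --comments "manual snapshot"
sudo timeshift --list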

[–] tankplanker@lemmy.world 1 points 3 months ago

Borg daily to the local drive, then copied across to a USB drive, then weekly to cloud storage. The script is triggered by daily runs of topgrade, before I do any updates.
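A minimal sketch of such a daily Borg cycle, assuming a repo at /backups/borg and illustrative retention counts:

borg create --stats --compression zstd /backups/borg::'{hostname}-{now}' ~/
borg prune --keep-daily 7 --keep-weekly 4 --keep-monthly 6 /backups/borg
rsync -a /backups/borg/ /mnt/usb/borg/   # offline copy on the USB drive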

[–] bitcrafter@programming.dev 1 points 3 months ago

I created a script that I dropped into /etc/cron.hourly, which does the following:

  1. Use rsync to mirror my root partition to a btrfs partition on another hard drive (which only updates modified files).
  2. Use btrfs subvolume snapshot to create a snapshot of that mirror (which only uses additional storage for modified files).
  3. Move "old" snapshots into a trash directory so I can delete them later if I want to save space.

It is as follows:

#!/usr/bin/env python
from datetime import datetime, timedelta
import os
import pathlib
import shutil
import subprocess
import sys

import portalocker  # third-party dependency: pip install portalocker

DATETIME_FORMAT = '%Y-%m-%d-%H%M'
BACKUP_DIRECTORY = pathlib.Path('/backups/internal')
MIRROR_DIRECTORY = BACKUP_DIRECTORY / 'mirror'
SNAPSHOT_DIRECTORY = BACKUP_DIRECTORY / 'snapshots'
TRASH_DIRECTORY = BACKUP_DIRECTORY / 'trash'

EXCLUDED = [
    '/backups',
    '/dev',
    '/media',
    '/lost+found',
    '/mnt',
    '/nix',
    '/proc',
    '/run',
    '/sys',
    '/tmp',
    '/var',

    '/home/*/.cache',
    '/home/*/.local/share/flatpak',
    '/home/*/.local/share/Trash',
    '/home/*/.steam',
    '/home/*/Downloads',
    '/home/*/Trash',
]

OPTIONS = [
    '-avAXH',
    '--delete',
    '--delete-excluded',
    '--numeric-ids',
    '--relative',
    '--progress',
]

def execute(command, *options):
    print('>', command, *options)
    subprocess.run((command,) + options).check_returncode()

execute(
    '/usr/bin/mount',
    '-o', 'rw,remount',
    BACKUP_DIRECTORY,
)

try:
    with portalocker.Lock(os.path.join(BACKUP_DIRECTORY,'lock')):
        execute(
            '/usr/bin/rsync',
            '/',
            MIRROR_DIRECTORY,
            *(
                OPTIONS
                +
                [f'--exclude={excluded_path}' for excluded_path in EXCLUDED]
            )
        )

        execute(
            '/usr/bin/btrfs',
            'subvolume',
            'snapshot',
            '-r',
            MIRROR_DIRECTORY,
            SNAPSHOT_DIRECTORY / datetime.now().strftime(DATETIME_FORMAT),
        )

        snapshot_datetimes = sorted(
            (
                datetime.strptime(filename, DATETIME_FORMAT)
                for filename in os.listdir(SNAPSHOT_DIRECTORY)
            ),
        )

        # Keep the last 24 hours of snapshot_datetimes
        one_day_ago = datetime.now() - timedelta(days=1)
        while snapshot_datetimes and snapshot_datetimes[-1] >= one_day_ago:
            snapshot_datetimes.pop()

        # Helper: keep the newest snapshot in the current day/week/month group and move the older ones to the trash
        def prune_all_with(get_metric):
            this = get_metric(snapshot_datetimes[-1])
            snapshot_datetimes.pop()
            while snapshot_datetimes and get_metric(snapshot_datetimes[-1]) == this:
                snapshot = SNAPSHOT_DIRECTORY / snapshot_datetimes[-1].strftime(DATETIME_FORMAT)
                snapshot_datetimes.pop()
                execute('/usr/bin/btrfs', 'property', 'set', '-ts', snapshot, 'ro', 'false')
                shutil.move(snapshot, TRASH_DIRECTORY)

        # Keep daily snapshot_datetimes for the last month
        last_daily_to_keep = datetime.now().date() - timedelta(days=30)
        while snapshot_datetimes and snapshot_datetimes[-1].date() >= last_daily_to_keep:
            prune_all_with(lambda x: x.date())

        # Keep weekly snapshot_datetimes for the last three months
        last_weekly_to_keep = datetime.now().date() - timedelta(days=90)
        while snapshot_datetimes and snapshot_datetimes[-1].date() >= last_weekly_to_keep:
            prune_all_with(lambda x: x.date().isocalendar().week)

        # Keep monthly snapshot_datetimes forever
        while snapshot_datetimes:
            prune_all_with(lambda x: x.date().month)
except portalocker.AlreadyLocked:
    sys.exit('Backup already in progress.')
finally:
    execute(
        '/usr/bin/mount',
        '-o', 'ro,remount',
        BACKUP_DIRECTORY,
    )
[–] heythatsprettygood@feddit.uk 1 points 3 months ago

I use Pika Backup (a GUI front end for Borg Backup) to back up my desktop to my home server daily; then overnight, that server runs its own daily Borg backup to a Hetzner Storage Box. It's easy to set and forget (other than maybe verifying the backups every once in a while), and having that off-site backup gives me peace of mind.
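On the verification point: a periodic integrity check can be scripted; a minimal sketch, assuming an SSH-reachable repo (the URL is illustrative):

borg check --verify-data ssh://user@host/./backups/borg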
