Backups on Linux
If the system is already up and running there are few options other than rsync-based ones.
For the ability to create image snapshots you'll need to use Linux Logical Volume Manager (LVM), which usually means a fresh install.
Once that's done you can create/destroy snapshots and use them to create backups.
I've only used LVM a couple of times. I don't see the need in Linux unless it's a 24/7 production server that needs as close as possible to 100% uptime.
Then it's a case of creating a snapshot, then either leaving it or dd'ing it (for a bit-based image) or tar'ing it (for a normal, smaller archive backup) to an external drive.
The only downside to LVM is the lack of user-friendly GUIs; since it's mainly for server-based work, it's generally done via the command line and scripts.
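Roughly, the snapshot-and-archive cycle looks something like this (the volume group / LV names vg0 and root, the sizes and the paths are just examples, adjust for your own setup):

```
# Create a snapshot of the root LV; it can hold up to 5G of changes while it exists
sudo lvcreate --snapshot --size 5G --name root_snap /dev/vg0/root

# Mount the snapshot read-only and tar it for a normal archive backup
sudo mkdir -p /mnt/root_snap
sudo mount -o ro /dev/vg0/root_snap /mnt/root_snap
sudo tar -czpf /media/backup/root-$(date +%F).tar.gz -C /mnt/root_snap .

# ...or dd the snapshot device for a bit-based image instead
# sudo dd if=/dev/vg0/root_snap of=/media/backup/root-$(date +%F).img bs=4M status=progress

# Tidy up: unmount and drop the snapshot so it stops tracking changes
sudo umount /mnt/root_snap
sudo lvremove -y /dev/vg0/root_snap
```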
https://wiki.ubuntu.com/Lvm
https://www.howtoforge.com/linux_lvm_snapshots
I must admit that apart from the initial image I made of the flash card after setting up everything I wanted on my Pis, I've done nothing, and they've just been running for several years apart from the odd power outage.
I'll have a crawl through the options. I've currently got a dd command running to copy sda1 to a USB drive. At the rate it's going I may just can it, because if I just grab all the config files I've touched I could probably reinstall and redo the setup quicker than it will restore - assuming it actually would.
I'll take a look at the GNOME disk utility next, which was hard to find as I'm using VNC only and it was set up with the minimalist XFCE desktop, and it's just called Disks on that for some reason. As with all things Linux, Google was my friend in locating it, but mumbled and slurred a bit before coming up with the correct answer.
I've no interest in running even more machines as backup servers etc., which is obviously what the pros do - I just want a basic webserver, but baulked at exposing Windows to the web. I had been running lighttpd on a Pi as that was faster than Apache, but on the recommendation of #1 son I've installed nginx instead. It's lightning fast compared to shared hosting (testing with https://tools.pingdom.com/, not across my own network of course).
You could try etckeeper to save the config files in the /etc folder.
Git to save /home config files.
Both are Git based, so can save to a local or remote Git repo.
Then just use rsync to save the www folders to another location.
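Very roughly, something like this (the repo locations, hostname and the /var/www path are just examples):

```
# etckeeper: puts /etc under git and auto-commits on package installs/upgrades
sudo apt install etckeeper
sudo etckeeper init
sudo etckeeper commit "initial import of /etc"

# plain git repo for config files in /home
cd ~
git init
git add .bashrc .profile .config
git commit -m "track home config files"
git remote add origin user@backupbox:/srv/git/home-config.git   # example remote
git push -u origin master   # or 'main', depending on your default branch

# rsync the web root to another location (local disk or over ssh)
rsync -aAX --delete /var/www/ user@backupbox:/srv/backup/www/
```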
**Side note**
AAARRGGGHHHH !!!!!!!!!!
Cloudflare currently hates my BT IPv6 address, flags it as attacking the site and blocks me!
This was the 6th time I've tried to post this!
I had to disable IPv6 on my router and PC as well as reboot the Smart Hub a few times to get a new IP address (not connected to my old IPv6 address) before the forums would let me post!!
kwikbreaks wrote: »Thanks. I was looking for suggestions from somebody who actually does such backups.
I just use tar. A full backup every so often, then weekly and daily incrementals from that. (In principle, my plan includes monthly incrementals, but I never seem to get round to doing those, and just keep on doing weeklies from the last full backup.)
Just use 'touch' to record when each level of backup happens, and then tar -N to back up files changed since the last larger backup. The full and weekly (and monthly if I ever do them) are written out to DVD. The daily just sits on the external disk. So I could lose up to a week's work if both internal and external disks fail, but I'm happy with that level of risk.
What I really ought to do is add a mechanism to record a snapshot of the state of the (Debian) packages before each backup, so that in the event of a total system failure I can just ask for all those packages to be reinstalled during a rebuild. But in that case, I'd probably just build a new system from scratch and gradually add the packages as I notice they're missing.
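Roughly like this, if it helps (the paths and stamp-file names are just examples; GNU tar's -N takes a date or a reference file, in which case that file's timestamp is the cut-off):

```
# Full backup of /home; touch the stamp first, so anything modified while
# tar is running still gets caught by the next incremental
touch /root/stamp-full
tar -cpzf /media/backup/full-$(date +%F).tar.gz /home

# Weekly incremental: everything changed since the last full
touch /root/stamp-weekly
tar -cpzf /media/backup/weekly-$(date +%F).tar.gz -N /root/stamp-full /home

# Daily incremental: everything changed since the last weekly
tar -cpzf /media/backup/daily-$(date +%F).tar.gz -N /root/stamp-weekly /home

# Debian package-state snapshot for reinstalling after a rebuild
dpkg --get-selections > /media/backup/packages-$(date +%F).list
# ...restore on the new system with:
#   dpkg --set-selections < packages-YYYY-MM-DD.list && apt-get dselect-upgrade
```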
kwikbreaks wrote: »I've currently got a dd command running to copy sda1 to a USB drive. At the rate it's going I may just can it, because if I just grab all the config files I've touched I could probably reinstall and redo the setup quicker than it will restore - assuming it actually would.
Making an image backup while it's mounted? Have you actually checked that you can recover from one of those?
I'm revisiting just using dd to image the drive. I probably need to get a USB 3 caddy to speed it up. When I tried the GNOME Disks utility it insists that the drive be unmounted, just as Clonezilla wants. That's going to give more certainty that the clone will be intact, I suppose, but it isn't practical if you only have a single-machine server and don't want it offline for backups.
I'll decide what my ultimate backup strategy will be later on. For now, as for many, it relies on the hardware not failing. I do wish somebody like Macrium would produce a Linux version, but I guess the demand is just too low.
psychic_teabag wrote: »Making an image backup while it's mounted? Have you actually checked that you can recover from one of those?
If dd is just dumping sector by sector I expect there's a fair chance that the restore will be corrupt. Once the dump has completed I can mount the cloned drive and check it over, I suppose.
I have quite successfully restored Macrium Reflect backups on Windows, which you can run while the machine is running. This is why I said in my OP that Linux doesn't seem to have this type of software available. Backing up is all based on scripted backups of directories.
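Something along these lines is what I have in mind (device names and paths are just examples, and ideally the partition is unmounted while dd runs):

```
# Image the partition to a file on the USB drive
sudo dd if=/dev/sda1 of=/mnt/usb/sda1.img bs=4M status=progress

# Sanity-check the copy: fsck the image, then loop-mount it read-only and look around
sudo fsck -n /mnt/usb/sda1.img
sudo mkdir -p /mnt/check
sudo mount -o loop,ro /mnt/usb/sda1.img /mnt/check
ls /mnt/check
sudo umount /mnt/check
```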
This webpage gives a number of supposed Macrium Reflect alternatives for Linux. Note the surprising number of products which have been abandoned!
Reflect on Windows uses Microsoft's Volume Shadow Copy Service (VSS), which takes a copy of the relevant data to enable an online image backup to work.
Thanks. I know for sure that some of those are offline-only backups - i.e. you boot from a CD and clone the drive. That's probably what I'll have to do as a one-off, and then just maintain cron-based /etc and SQL backups. I see I have Dropbox active on it - it may have come with the distro, or maybe I just installed it to grab something and forgot amongst the umpteen other tasks of getting the thing running - so I can back up to there.
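Something like this is roughly what I'm thinking for the cron side (the times, paths and database name are just examples, and I'm assuming mysqldump on the guess that it's a MySQL/MariaDB-backed site):

```
# /etc/cron.d/nightly-backups  (example file; note % must be escaped as \% in cron)

# 02:00 - archive /etc
0 2 * * * root tar -czpf /media/backup/etc-$(date +\%F).tar.gz /etc

# 02:30 - dump the database (credentials kept in /root/.my.cnf)
30 2 * * * root mysqldump --single-transaction mydb > /media/backup/mydb-$(date +\%F).sql

# 03:00 - rsync the web root to a second location
0 3 * * * root rsync -a --delete /var/www/ backupuser@otherbox:/srv/backup/www/
```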
Anyhow it's up and running and I've moved over to using it from the shared hosting. I can always switch back I guess if it fails before I've organised the backups.
It's actually a lot faster than the shared hosting, but of course bandwidth-limited to the 20Mbps upstream my FTTC connection supports - I guess all those spammers who seem to be the majority of visitors may just have to put up with that.