Backing up a Website properly?
Comments
Do you have shell (command prompt) access to your webhost? If so, and it's *nix based, use tar to bundle all your files and directories into one big file. Then just copy that to your backup location. This method also makes restoring the data much easier.
If you're in the root (top) folder of your webspace, you can use the following command:
tar zcvf backup_file.tar.gz *
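Restoring is just the reverse - assuming the archive was created as above and you want to unpack it in the current directory, something like this should do it:
tar zxvf backup_file.tar.gz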
I'd also second DrBenway's comment about backing up any data stored in databases etc. For MySQL, look at the mysqldump command.
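As a rough sketch (the username and database name below are placeholders - substitute your own), dumping a database to a single file looks something like:
mysqldump -u username -p database_name > database_backup.sql
You'd then download that .sql file along with the rest of your backup.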
Another possibility is that some web hosting control panels (e.g. cPanel, Plesk) have a backup utility.
No, I raised this question because I do not have command prompt access and the panel is very basic - no backup option. FTP programs can do a backup of sorts, but I need a way to verify that all files came across and, later, a way of backing up only newer files.
Wget as suggested by Benway looks very promising, when I can get my head around it!
I see. Looking at it from another perspective, are there any files on your site that change dynamically? (i.e. users upload them, or files are generated by scripts).
If not (i.e. your website is only updated when you FTP stuff across) then backing up the stuff on your webhost isn't worth it. Instead, just keep a backup of the files already located on your PC.
Help with wget.
I am assuming you are using Windows here.
Sit in an empty directory and do:
wget --mirror ftp://username:password@ftp.yoursite.com -o logfile.txt
Where username and password are the login details to the ftp site.
ftp.yoursite.com is the ftp site you login into.
logfile.txt will contain a listing of what wget tried to do.
You may need to give the full path to wget, like "c:\Program Files\wget\wget", instead of just wget, if you didn't add the location of wget.exe to your PATH variable.
You can fine-tune the options at your leisure, but that will get you a copy of the FTP site.
Whoops. Forgot to disable smilies. Command is:
wget --mirror ftp://username:password@ftp.yoursite.com -o logfile.txt
As for counting the files you have, why not write a PHP script to do it? You can probably find one on the web if you look. The PHP functions opendir, closedir, readdir, stat and lstat are your friends here.
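Something along these lines might do it - just a rough sketch using a couple of those functions; upload it to your web root, browse to it, and compare the total with a count of your local copy (the filename and starting directory are up to you):

<?php
// Rough sketch: recursively count all files under a starting directory.
function countFiles($dir) {
    $count = 0;
    $handle = opendir($dir);
    if ($handle === false) {
        return 0; // couldn't open this directory, skip it
    }
    while (($entry = readdir($handle)) !== false) {
        if ($entry === '.' || $entry === '..') {
            continue; // skip the current and parent directory entries
        }
        $path = $dir . '/' . $entry;
        if (is_dir($path)) {
            $count += countFiles($path); // recurse into subdirectories
        } else {
            $count++; // an ordinary file
        }
    }
    closedir($handle);
    return $count;
}
// '.' means "start from the directory this script lives in".
echo "Total files: " . countFiles('.');
?>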
DrBenway wrote: Whoops. Forgot to disable smilies. Command is:
wget --mirror ftp://username:password@ftp.yoursite.com -o logfile.txt
This is getting very close to what I need - I noticed the mirror option in the Wget documentation. There's an additional parameter to play with after the first mirror is created, which copies down only newer files and will save a lot of downloading (there is a bit of dynamic content, in that when new products get added to our online store, new images are imported into the site).
Update: a bit of research suggests that if I add -S as the first argument to the above command the first time I run it, then -N the second time, the second run will only pick up newer files. Will be installing Wget to try this very soon...
--mirror takes care of downloading only changed files.
From the wget docos:
--mirror
Turn on options suitable for mirroring. This option turns on recursion and time-stamping, sets infinite recursion depth and keeps ftp directory listings. It is currently equivalent to -r -N -l inf --no-remove-listing.
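So in practice (assuming I'm reading that right), you'd just re-run the exact same command in the same directory whenever you want to update the backup - the -N (time-stamping) part that --mirror implies means only files newer than your local copies get downloaded:
wget --mirror ftp://username:password@ftp.yoursite.com -o logfile.txt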
You might want to check what happens if you mirror the site after image files have been removed. Does it remove them from the local mirror? Or leave them in?
OK, I found a precompiled version of Wget and installed it, but I'm getting the error "packet driver not found". This will be working real soon now...