
Online Website Backup Options?

kdawson posted more than 5 years ago | from the backup-is-a-feature-but-recovery-is-a-benefit dept.


pdcull writes "I can't be the only person on the planet who has this problem: I have a couple of websites, with around 2 GB of space in use at my hosting provider, plus a few MySQL databases. I need to keep up-to-date backups, as my host provides only a minimal backup function. However, with a Net connection that only reaches 150 kbps on a good day, there is no way I can guarantee a decent backup on my home PC using FTP. So my question is: does anybody provide an online service where I can feed them a URL, an FTP password, and some money, and they will post me DVDs with my websites on them? If such services do exist (the closest I found was a site that promised to send CDs, with a special deal that had expired in June!), has anybody had experience with them that they could share? Any recommendations of services to use or to avoid?"



Why not use an online solution? (5, Informative)

MosesJones (55544) | more than 5 years ago | (#24463935)

Rather than "posting DVDs" I'd go for something like Amazon's S3 and just dump the backup to them. Here is a list of S3 Backup solutions [zawodny.com] that would do the job.

I've personally moved away from hard media as much as possible, because the issue on restore is normally the speed of getting the data back onto the server, and it's there that online solutions really win: they have the peering arrangements to get you the bandwidth.
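If you go the S3 route, the whole job can be driven from cron on the server. Here is a sketch using s3cmd (one of the many S3 tools around); the bucket name, paths, and schedule are all made up:

```
# crontab sketch: dump, stage locally, then push everything to S3 nightly
0 3 * * *  mysqldump --single-transaction sitedb | gzip > /home/me/backup/db.sql.gz
15 3 * * * rsync -a /var/www/ /home/me/backup/www/
30 3 * * * s3cmd sync /home/me/backup/ s3://my-site-backups/
```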

Re:Why not use an online solution? (0)

Anonymous Coward | more than 5 years ago | (#24464037)

Yep. And for an almost bare-metal solution, s3-curl [infinitesque.net] should do the trick.


Re:Why not use an online solution? (4, Interesting)

beh (4759) | more than 5 years ago | (#24464237)

Similarly, I'm not using DVDs etc. for my server backup. A few years back, after seeing how much my provider would charge for a decent amount of backup space, I opted to get an additional server instead. The second server now provides secondary DNS and secondary MX for my regular system, and also holds all the data for a cold standby (I would still need to change addresses in DNS manually in case of a disaster, and bring up services, but pretty much all the data is in place).

The data is synchronised between both servers several times a day - first backed up locally to a second disk on the same machine, then rsynced between the two...

The solution was cheaper than the provider's backup offering, and gives me extra flexibility in terms of what I can do. The only 'cost' is that both machines sacrifice disk space to act as backup for the other (since both machines have >400 GB of disk space, giving up even half the disk on each isn't a big limitation - at least, not for *my* needs. YMMV).
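That arrangement can be wired together with cron plus rsync over ssh; the hostnames, paths, and schedule below are invented:

```
# crontab on the primary (assumes ssh keys already exchanged with the peer)
0 */6 * * *  rsync -a --delete /var/www/ /disk2/backup/www/
20 */6 * * * rsync -az -e ssh /disk2/backup/ peer.example.net:/backup-of-primary/
```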

Re:Why not use an online solution? (0)

Anonymous Coward | more than 5 years ago | (#24464385)

What happens if a hacker or script kiddie gets at your data? Your backup server will also synchronise the damaged data, I guess?

Re:Why not use an online solution? (4, Insightful)

beh (4759) | more than 5 years ago | (#24464615)

Sure, it will - but you will have that problem with a provider-based backup as well. If your data gets corrupted without you noticing, your backup will 'save' corrupt data...

What you can do to at least partially protect yourself is make sure the rsync users are jailed and can only rsync to the target directory, without access to anything else.
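One way to jail the rsync user is the rrsync helper script that ships with rsync, used as a forced command in authorized_keys on the backup target. The install path of rrsync varies by distribution, and the key below is truncated:

```
# ~/.ssh/authorized_keys on the backup host: this key may only rsync
# into /backup-of-primary, and gets no shell, no pty, no forwarding
command="/usr/share/rsync/scripts/rrsync /backup-of-primary",no-pty,no-port-forwarding,no-agent-forwarding ssh-rsa AAAA... backup@primary
```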

Re:Why not use an online solution? (0)

Anonymous Coward | more than 5 years ago | (#24464979)

I hope your ISP's data center doesn't burn down, get flooded, or get broken into...

Re:Why not use an online solution? (2, Informative)

txoof (553270) | more than 5 years ago | (#24464473)

S3 is a pretty good option. I've been using the JungleDisk client along with rsync to manage offsite home backups. S3 is pretty cheap and the clients are fairly flexible.

I haven't played with any remote clients, but your hosting provider can probably hook up one of the many clients mentioned in the parent. The price of S3 is hard to beat: I spend about $6 per month on ~20 GB worth of backups.

Re:Why not use an online solution? (3, Informative)

ghoti (60903) | more than 5 years ago | (#24464651)

JungleDisk's built-in backup can also keep older versions of files, which is great in case a file gets corrupted and you only discover that after a few days. It's dirt cheap too, $20 for a lifetime license on an unlimited number of machines.

For this to work you need to be able to run the JungleDisk daemon, though, which is not an option with some shared hosting plans. Also, to mount the S3 bucket as a disk, you obviously need root access. But if you have that, JungleDisk is hard to beat IMHO.

Re:Why not use an online solution? (2, Informative)

alexgieg (948359) | more than 5 years ago | (#24464725)

. . . to mount the S3 bucket as a disk, you obviously need root access. But if you do, JungleDisk is hard to beat IMHO.

Not really. If the server kernel has FUSE [wikimedia.org] enabled, and the user-space tools are installed, any user who is a member of the relevant group can mount a "jungledisked" S3 bucket in his user space without the need for root access.

Re:Why not use an online solution? (1)

Gazzonyx (982402) | more than 5 years ago | (#24464477)

You could also use S3Backer [googlecode.com] with an rsync script (or rsnapshot) on the host. That lets you mount the S3 bucket as a drive on your server through FUSE and then copy to it as if it were local. *NIX/BSD only, though.

Re:Why not use an online solution? (0)

Anonymous Coward | more than 5 years ago | (#24464553)

Do you not have any friends with a faster connection?

I back up my website, which has a large forum attached to it, to encrypted zip files at 3am, then FTP them to a read-only FTP server hosted on an ADSL line that doesn't belong to me.
I don't bother with diffs, as TBH when the server goes down you want it back fast, and pratting about with diffs isn't worth it.
I have a separate IP with R/W access to the FTP server, with security based on me phoning my friend to ask him to turn it on. You may think that crude, but it's working fine, and it's pretty secure if it ain't plugged in!!

I suppose it helps that bandwidth isn't metered, though the full backup for the SQL and website is only about 300 MB per night.
If your site is bigger than that, then you are either making revenue from it or damn well should be!!

Why not use Suso? (3, Informative)

suso (153703) | more than 5 years ago | (#24464557)

Sorry for the self-plug, but this just seems silly. Your web host should be backing up your website and offering you restorations. I guess this isn't a standard feature any more - but it is at Suso [suso.com]. We back up your site and databases every day, and can restore them for you for free.

Re:Why not use Suso? (4, Insightful)

cdrudge (68377) | more than 5 years ago | (#24464663)

One thing that I've learned, though, is that you cannot rely on a hosting company's backup to be timely, reliable, or convenient. If you want to back up multiple times a day, keep multiple generations of backups, or be able to restore very quickly, the host's backup can be unattractive. I'm not saying yours is that way - just some of the hosting companies I've dealt with in the past.

This also doesn't take into consideration the best practice of keeping your backups off-site for disaster recovery. It doesn't help very much to have your backup server/drive/whatever 1U down in the rack when the building collapses, catches fire, floods, etc., destroying everything in it.

Re:Why not use Suso? (2, Interesting)

Lumpy (12016) | more than 5 years ago | (#24464743)

Because most sites have 2 failure points.

1 - they buy the super cheap service with no backup.

2 - the site is designed poorly with no backup capabilities.

If your site has a DB, it had better have an admin function to dump the DB to a tgz file you can download. Mine generates a password-protected RSS feed and an encrypted tgz file (in a password-protected area). I simply have an RSS reader/retriever configured to watch all my sites and retrieve the backups when they are generated.

I get the DB and any user/customer files, and all is well. The site itself I uploaded in the first place, so it's silly to back it up again. It works great, and I don't care that all I get is 150 kbps, because the weekly full backup can run for 3 days, and incrementals take less than 2 hours.
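A server-side cron script along these lines might look like the sketch below. Every path and the database name are assumptions, and the RSS/encryption parts are left out:

```shell
#!/bin/sh
# Nightly server-side dump, roughly as described above.
# SITEDIR, OUTDIR, and the "sitedb" database name are all assumptions.
SITEDIR="${SITEDIR:-/var/www/example}"          # vhost root
OUTDIR="${OUTDIR:-/var/www/example/protected}"  # password-protected area

nightly_dump() {
    stamp=$(date +%F)
    mkdir -p "$OUTDIR"
    # dump the database if mysqldump is present (credentials in ~/.my.cnf)
    if command -v mysqldump >/dev/null 2>&1; then
        mysqldump --single-transaction sitedb 2>/dev/null \
            | gzip > "$OUTDIR/db-$stamp.sql.gz" || true
    fi
    # bundle only user-uploaded files; the rest of the site was uploaded
    # by you in the first place, so there is little point re-backing it up
    tar czf "$OUTDIR/files-$stamp.tgz" -C "$SITEDIR" uploads
}
```

The retrieving side then just fetches the dated files from the protected area.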

S3 costs about $300 per TB (1)

giafly (926567) | more than 5 years ago | (#24465077)

If you keep your backups for one month, S3 costs about $300 per TB. That's not a bad price for offsite backup that's easily accessible from both your main and disaster recovery servers.
price list [amazon.com]

Re:Why not use an online solution? (1)

teknopurge (199509) | more than 5 years ago | (#24465107)

100% agreed. We started using Vaultwise [vaultwise.com] for our servers and laptops and couldn't be happier. We spoke with some of their engineers, and they have some kind of CDP product they are going to release that does block-level backups. Very cool stuff. I don't know why anyone nowadays wouldn't want their backups accessible online.

Re:Why not use an online solution? (-1, Offtopic)

Anonymous Coward | more than 5 years ago | (#24465199)

I use an uncapped cable modem... 20 Mb down / 10 Mb up...
you should look into that.. lol

Re:Why not use an online solution? (0)

Anonymous Coward | more than 5 years ago | (#24465321)

Just for the record: EVERY single comment in this thread refers to some online backup service, ignoring the OP's request for services that send him the backup data on DVDs. What part of a 150 kbps connection and a 2 GB website to back up don't you understand?

Re:Why not use an online solution? (0)

Anonymous Coward | more than 5 years ago | (#24465521)

I'm sure this isn't just coincidence... but Lifehacker has an article about this *today*.


Why FTP? Use rsync. (5, Informative)

NerveGas (168686) | more than 5 years ago | (#24463939)

It seems like the only problem with your home computer is FTP. Why not use rsync, which does things much more intelligently and, with checksumming, guarantees correct data?

The first time would be slow, but after that, things would go MUCH faster. Shoot, if you set up SSH keys, you can automate the entire process.

yeah, use rsync. (5, Insightful)

SethJohnson (112166) | more than 5 years ago | (#24463975)

I 100% agree with NerveGas on the rsync suggestion. I use it in reverse to backup my laptop to my hosting provider.

Here's the one thing to remember about rsync: it gives you the CURRENT snapshot of your data. Not a big deal, except if you're doing development and find out a week later that changes you made to your DB have had unintended consequences. If you've rsynced, you'll want to have made additional local backups on a regular basis so you can roll back to a snapshot prior to when you hosed your DB. Apologies if that was obvious, but rsync is the transfer mechanism; you'll still want to manage archives locally.


Re:yeah, use rsync. (4, Informative)

Bert64 (520050) | more than 5 years ago | (#24464013)

Then what you need is rdiff-backup; it works like rsync except it keeps older copies stored as diffs.

As for FTP, why the hell does anyone still use it? It's insecure, works badly with NAT (which is all too common), and really offers nothing you don't get from other protocols.
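A minimal rdiff-backup setup along these lines might look like the following; the host, paths, and retention period are all assumptions:

```
# nightly pull from the web host, keeping older versions as increments
30 3 * * * rdiff-backup user@host::/var/www /backups/www
# prune increments older than ~3 months, once a week
0 5 * * 0  rdiff-backup --remove-older-than 12W /backups/www
# restoring last week's version of a file:
#   rdiff-backup -r 7D /backups/www/index.html restored.html
```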

Re:yeah, use rsync. (5, Informative)

xaxa (988988) | more than 5 years ago | (#24464273)

Then what you need is rdiff-backup; it works like rsync except it keeps older copies stored as diffs.

Another option is rsync's --link-dest option. You give rsync a list of the older backups (with --link-dest), and the new backup is made using hard links to the old files where they're identical.
I haven't looked at rdiff-backup; it probably provides similar functionality.

Part of my backup script (written for zsh):

setopt nullglob
# ($older is built here: an array of the previous backup directories)
unsetopt nullglob

rsync --verbose -8 --archive --recursive --link-dest=${^older[1,20]} \
        user@server:/ $backups/$date/

Re:yeah, use rsync. (2, Informative)

xaxa (988988) | more than 5 years ago | (#24464291)

Also, rsync has a --bwlimit option to limit the bandwidth it uses.

Re:yeah, use rsync. (1)

spinkham (56603) | more than 5 years ago | (#24464579)

rdiff-backup makes incremental diffs of individual files, which saves a lot of space for large files with small changes, like database backups, virtual machine images, and large mail spools.
On the other hand, the rsync schemes are somewhat more straightforward to deal with if you don't have many such files.

Re:yeah, use rsync. (1)

fireboy1919 (257783) | more than 5 years ago | (#24464591)

Or... use Subversion to actually store your data. If you use the FSFS format (the filesystem version of SVN, which IMHO is better than the embedded-database format because it doesn't occasionally get corrupted), all data is actually *stored* as diffs anyway.

You can actually do an rsync of the live data, and it'll work perfectly, and never overwrite things you need.

If you're worried about past versions, you should be using source control, so IMHO, this is a better option than an almost-source control one like rdiff-backup.

Re:yeah, use rsync. (1)

NMerriam (15122) | more than 5 years ago | (#24464807)

Then what you need is rdiff-backup, works like rsync except it keeps older copies stored as diffs.

When it works, that is. The problem with rdiff-backup is that ultimately it's a massive script with no error checking, and if anything ever goes wrong (including just the network having difficulty), you have to "fix" your backup and then start over. The fixing process usually takes two or three times as long as actually performing the backup, so you can wind up with a backup that's impossible to keep current, since you spend more time working around the limitations of the script than actually transferring backup data.

This is a field that really needs a decent commercial piece of software with a full-time developer.

Re:yeah, use rsync. (0)

Anonymous Coward | more than 5 years ago | (#24464275)

See the rsync man page; rsync can keep older versions (as diffs or hard links). Using a frontend like dirvish might be preferable.

Re:yeah, use rsync. (5, Informative)

Lennie (16154) | more than 5 years ago | (#24464283)

There are also the --backup and --backup-dir options (you'll need both). They keep a copy of the files that have been deleted or changed; if you use a script to put each run in a separate directory, you'll have a pretty good history of all the changes.
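As a sketch, a run using those options might look like this; the host and paths are made up, and note that % has to be escaped in crontab entries:

```
# changed and deleted files are moved into a dated side directory
45 3 * * * rsync -a --delete --backup --backup-dir=/backups/changed-$(date +\%F) user@host:/var/www/ /backups/current/
```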

Re:yeah, use rsync. (0)

Anonymous Coward | more than 5 years ago | (#24464515)

Rsync to transfer, then local snapshots, is an easy and solid way to go. Remember to dump your database and put the dump somewhere in the filesystem being backed up. I use dirvish for the local backups. It uses hard links, so files that don't change don't require space (though it's not as small as pure diffs). But it gives you a snapshot of the full filesystem at the time of the backup, which is very useful. You can set retention for each "vault", letting you choose what to keep and for how long.

Re:yeah, use rsync. (1)

mlush (620447) | more than 5 years ago | (#24464977)

Here's the one thing to remember in terms of rsync. It's going to be the CURRENT snapshot of your data.

Rsnapshot [rsnapshot.org] may be an option. It creates a succession of snapshot directories but keeps only one copy of each file (hard-linking to make up the difference).

Re:Why FTP? Use rsync. (2, Informative)

Andrew Ford (664799) | more than 5 years ago | (#24463997)

Or simply use rsnapshot. Whatever backup solution you use, make sure to create dumps of your databases, as backing up the database files while they are in use will give you backup files you cannot restore from. If you back up your database dumps, you can exclude the database files themselves from the backup.

Re:Why FTP? Use rsync. (0)

Anonymous Coward | more than 5 years ago | (#24464003)

Either set up an rsync client on your home machine, or have someone back up your site for you (read: http://www.rsync.net/)

And if you want rsync on your Windows machine, follow the instructions here: http://www.gaztronics.net/rsync.php

With rsync you can stop and restart the backup process, and it will pick up where it left off. And once you have a full copy of your entire site on your local drive, subsequent runs will only transfer the incremental changes.

And for mysql, you can use mysqldump (http://dev.mysql.com/doc/refman/5.0/en/mysqldump.html)
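A typical mysqldump invocation is a one-liner from cron; the database name and output path below are assumptions, with credentials expected in ~/.my.cnf:

```
# nightly logical dump, compressed and dated; --single-transaction gives
# a consistent snapshot for InnoDB tables without locking the site
0 2 * * * mysqldump --single-transaction sitedb | gzip > $HOME/backup/db-$(date +\%F).sql.gz
```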

Re:Why FTP? Use rsync. (1)

noob749 (1285846) | more than 5 years ago | (#24464023)

Yeah, I'd say rsync and MySQL replication would work nicely. Of course you have to look into them and decide whether they meet your needs, but I think you'll find they're probably good enough.

Re:Why FTP? Use rsync. (2, Informative)

Sophia Ricci (1337419) | more than 5 years ago | (#24464133)

Yes, rsync is a lot better; I always use it. I personally use rsync with the -z option to compress and decompress the files on the fly, which improves speed a lot, most of the files being text files.

Re:Why FTP? Use rsync. (3, Insightful)

houghi (78078) | more than 5 years ago | (#24464141)

Many hosting providers don't offer this option, or even SFTP. :-/

So you're stuck with FTP, or need to change hosting provider, which is also not always an option.

Re:Why FTP? Use rsync. (3, Informative)

v1 (525388) | more than 5 years ago | (#24464293)

I use rsync on a few dozen systems here, some of which are over 1 TB in size. Rsync works very well for me. Keep in mind that if you are rsyncing an open file such as a database, the rsync'd copy may be in an inconsistent state if changes are not fully committed as rsync passes through the file. There are a few options here for your database. The first that comes to mind: close it, or commit and suspend/lock it, make a copy of it, and then unsuspend it. Then just let rsync back up the whole thing; if you need to restore, overwrite the DB with the copy that was made during the lock after restoring. The time the DB is offline for the local copy will be much less than the time it takes rsync to pass through the DB, and this will always leave you with a coherent DB backup.

If your connection is slow and you are backing up large files (both of which sound true for you?), be sure to use the --partial option.

One of my connections is particularly slow and unreliable (it's a user desktop over a slow connection). For that one I've made special arrangements to cron the job once an hour instead of once a day. It attempts the backup, which is often interrupted by the user sleeping or shutting down the machine, so it keeps trying every hour the machine is on until a backup completes successfully. Then it resets the 24-hour counter and won't attempt again for another day. That way I get backups as close to every 24 hours as possible, without more.

Another poster mentioned incrementals, which is not something I need here. Besides using a version of rsync that does incrementals, you could take something off-the-shelf like Retrospect that does incrementals but wouldn't normally work for your server, and instead of running it over the internet, run it against the local copy you are rsyncing to. That way you can still go back in time a bit, without having to jimmy rsync through your network limits.
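The hourly retry logic described above can be sketched in a few lines of shell. The stamp-file path is an assumption, and do_backup is a stub standing in for the real rsync command:

```shell
#!/bin/sh
# Sketch of the "retry every hour until one daily backup succeeds" cron job.
# STAMP and the rsync example inside do_backup are assumptions.
STAMP="${STAMP:-$HOME/.last-good-backup}"

do_backup() {
    # swap in the real transfer, e.g.:
    #   rsync -az --partial user@desktop:/home/user/ "$HOME/mirror/"
    true
}

backup_if_due() {
    now=$(date +%s)
    last=$(cat "$STAMP" 2>/dev/null || echo 0)
    if [ $((now - last)) -lt 86400 ]; then
        echo "not due"        # a backup already completed within 24 hours
        return 0
    fi
    if do_backup; then
        echo "$now" > "$STAMP"   # reset the 24-hour counter
        echo "backed up"
    else
        echo "failed; cron will retry in an hour"
    fi
}
```

Run it from cron every hour (`0 * * * * backup-if-due.sh`); the stamp file makes all but one run per day a no-op.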

Actually, his only problem is.... (4, Insightful)

raehl (609729) | more than 5 years ago | (#24464399)

... his slow internet connection; he wants to pay someone so that he doesn't have to move files over it himself.

How about:

- Pay for a hosting provider that DOES provide real backup solutions....
- Pay for a real broadband connection so you CAN download your site....

As with most things that are 'important'...

Right, Fast or Cheap - pick two.

Re:Actually, his only problem is.... (1)

s0litaire (1205168) | more than 5 years ago | (#24464683)

Two slight problems with your comment: 1) he may be in a location with a cruddy connection to the 'Net; 2) that might be the best speed his area provides... :D Apart from those two points, I agree with you. He should get a better web host :D

Re:Actually, his only problem is.... (2, Informative)

pdcull (469825) | more than 5 years ago | (#24465059)

Yep... I didn't mention in the question that I'm in the middle of a hillside favela (slum) in Rio de Janeiro, Brazil... My host is in the US...

Re:Why FTP? Use rsync. (0)

Anonymous Coward | more than 5 years ago | (#24464799)

If you are using rsync, you could also buy space from rsync.net. Very cheap drive space, and you can then rsync from your webserver to another location.

Sure you need to back the full 2 gig? (2, Interesting)

sleeponthemic (1253494) | more than 5 years ago | (#24463947)

Presumably, much of that 2 GB of data is static, so perhaps you could look into minimising exactly *what* you need to back up? It might then be within the reach of your net access.

bqinternet (2, Informative)

Anonymous Coward | more than 5 years ago | (#24463949)

We use http://www.bqinternet.com/
cheap, good, easy.

Gmail backup (3, Informative)

tangent3 (449222) | more than 5 years ago | (#24464033)

You may have to use extra tools to break your archive into separate chunks that fit Gmail's maximum attachment size, but I've used Gmail to back up a relatively small (~20 MB) website. The trick is to make one complete backup, then make incremental backups using rdiff-backup. I have a daily cron job send the bzip2'ed diff to a Gmail account. Every month, it makes a complete backup again.

And a separate Gmail account for the backup of the MySQL database.

This may be harder to do with a 2 GB website, I guess, since Gmail provides at the moment about 6 GB of space, which will probably last you about 2 months. Of course you could use multiple Gmail accounts, or automated deletion of older archives...

But seriously, 2 GB isn't too hard to do from your own PC if you only handle diffs. The first-time download would take a while, but incremental backups shouldn't take too long unless your site changes drastically all the time.
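The chunking mentioned at the top of this comment can be done with split; the 20 MB chunk size and the file name below are assumptions:

```shell
# Sketch: chop an archive into attachment-sized pieces before mailing.
archive="${archive:-site-backup.tar.bz2}"
if [ -f "$archive" ]; then
    split -b 20m "$archive" "$archive.part-"
fi
# reassemble later with:  cat site-backup.tar.bz2.part-* > site-backup.tar.bz2
```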

Re:Gmail backup (5, Insightful)

Anonymous Coward | more than 5 years ago | (#24464271)

This strikes me as a really dumb thing to do: a) using it for data storage rather than primarily email storage and b) signing up for multiple accounts are both violations of the Gmail TOS, so you are just asking for your backups not to be available when you most need them.

Re:Gmail backup (2, Interesting)

LiquidFire_HK (952632) | more than 5 years ago | (#24464643)

While I certainly don't claim using Gmail for backup is a smart thing to do, can you point out where in the ToS this is stated? I looked through it and see no mention of either restriction.

Wow (5, Insightful)

crf00 (1048098) | more than 5 years ago | (#24464041)

Wow! So you are asking somebody to download your website's home folder and database, look at your members' passwords and private information, and deliver you a DVD that is ready to be restored, with a rootkit thrown in?

Re:Wow (5, Funny)

teknikl (539522) | more than 5 years ago | (#24464071)

Yeah, I had noticed the complete lack of paranoia in the original post as well.

Re:Wow (2, Interesting)

pdcull (469825) | more than 5 years ago | (#24465129)

That's why I'd want somebody reliable. My hosting provider could steal my info too if they really wanted to, although I certainly trust them not to. Oh, I'm paranoid alright... it's just that, living in a Rio de Janeiro slum as I do, my paranoia is more about things like flying lead objects...

Re:Wow (1)

KingOfBLASH (620432) | more than 5 years ago | (#24465151)

It's not difficult to set up a user with read-only access to MySQL and read-only access to your entire web content. Since Apache basically just points to a directory (or directories, depending on how you set it up), to restore a backup you just copy the static content to /var/www/html/ (or wherever it's stored) and load up the MySQL data. That's it - no possibility of a rootkit, and who cares if they have your password; they only have read-only access. I would guess that if there were anything particularly sensitive on the OP's server, he wouldn't be looking for such a service.

Re:Wow (0)

Anonymous Coward | more than 5 years ago | (#24465445)

Most modern webapps store the user passwords hashed, so good luck with that :) As for the rootkit, that's up to the service provider ;)

rsync - it's in the tag (5, Informative)

DrogMan (708650) | more than 5 years ago | (#24464093)

rsync to get the data, cp -al to keep snapshots. I've been using this for years to manage TBs of data over relatively low-speed links. You'll take a hit the first time (so kick it off at night and kill it in the morning; the next night just execute the same command and it'll eventually catch up). Then cp -al it, and lather, rinse, repeat. This page: http://www.mikerubel.org/computers/rsync_snapshots/ [mikerubel.org] has been around for years. Use it!
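A minimal sketch of that page's scheme, with assumed paths (SRC is a local mirror you keep current with something like `rsync -az user@host:/var/www/ "$SRC"/`):

```shell
#!/bin/sh
# Sketch of the rsync + "cp -al" snapshot rotation; paths are assumptions.
SRC="${SRC:-$HOME/mirror}"
SNAPDIR="${SNAPDIR:-$HOME/snapshots}"

snapshot() {
    mkdir -p "$SNAPDIR"
    # hard-linked copy: unchanged files consume no extra disk space,
    # but each dated directory looks like a complete backup
    cp -al "$SRC" "$SNAPDIR/$(date +%F)"
}
```

Old snapshots can simply be rm -rf'ed; the hard links keep shared files alive as long as any snapshot references them.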

Re:rsync - it's in the tag (2, Interesting)

xehonk (930376) | more than 5 years ago | (#24464151)

And if you don't feel like writing scripts yourself, you can use rsnapshot, which will do all of the work for you.

Re:rsync - it's in the tag (1)

MMC Monster (602931) | more than 5 years ago | (#24464171)

rsync is definitely your friend. Check out the man pages and look up some examples on the net. (The command line options I use are rsync -avurtpogL --progress --delete, but YMMV.)

Re:rsync - it's in the tag (1)

bot24 (771104) | more than 5 years ago | (#24464219)

Or rdiff-backup which stores the latest version and diffs instead of using cp -al.

Dirvish (1)

leonbloy (812294) | more than 5 years ago | (#24465011)

... or have a look at Dirvish [dirvish.org] . It uses rsync and keeps full snapshots, using hardlinks for unchanged files. Works like a charm for me.

Backuppc.sourceforge.net (2, Informative)

star3am (1151041) | more than 5 years ago | (#24464155)

Unbelievable backup software: BackupPC. It uses rsync and will solve all your troubles; it's a truly amazing backup/restore solution. Check it out. All the best! Riaan

I sure hope you're no UK based... (4, Informative)

jonnyj (1011131) | more than 5 years ago | (#24464159)

...because if you are, and you're planning to send personal data (there can't be many 2 GB web sites that contain no personal data at all) on DVD through the mail, you might want to look at recent pronouncements from the Information Commissioner. A large fine could be waiting for you if you go down that route.

Re:I sure hope you're no UK based... (0)

Anonymous Coward | more than 5 years ago | (#24464675)

Better to transport it by train in the UK...

Sitecopy (2, Informative)

houghi (78078) | more than 5 years ago | (#24464165)

I would separate the content. First there is the MySQL part. Export it on a daily basis (or more often). You can export it as a whole or only those parts that you desire. Make a PHP page for each thing you want to download and protect it however you like.
Then point lynx at it to download the file.

The content is another matter. To update my sites I use sitecopy [manyfish.co.uk] . What I do is build the site locally and, when I am ready, run sitecopy and it uploads the site.
As I do incremental backups locally, I have the previous version there.
If this is not an option, it should not be too hard to use sitecopy to, uh, back up the site.

Put all this in a script and crontab should do the rest.
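A sketch of what that script might look like (every path, URL, and filename here is a placeholder; ~/.my.cnf is assumed to hold the database credentials):

```shell
# Write the nightly backup script to a placeholder location.
cat > /tmp/site-backup.sh <<'EOF'
#!/bin/sh
set -e
stamp=$(date +%F)
mkdir -p "$HOME/backups"
# Dump the database(s) to a dated, compressed file.
mysqldump --all-databases | gzip > "$HOME/backups/db-$stamp.sql.gz"
# Fetch the protected export page non-interactively with lynx.
lynx -source 'http://example.org/export.php?key=secret' \
    > "$HOME/backups/export-$stamp.html"
EOF
chmod +x /tmp/site-backup.sh
# Then a crontab entry such as:  0 2 * * * /tmp/site-backup.sh
```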

SquirrelSave (1)

Lord Grumbleduke (13642) | more than 5 years ago | (#24464197)

I use a product called SquirrelSave:

http://www.squirrelsave.com/ [squirrelsave.com]

which uses a combination of rsync and SSH to push data to the backup servers. The client is Windows-only at the moment, with promises of Linux and OS X versions coming soon.

It generally works quite well - WinSCP is included to pull data back off the servers.

Re:SquirrelSave (1)

Lord Grumbleduke (13642) | more than 5 years ago | (#24464223)

Unfortunately I only just read the post properly. SquirrelSave doesn't (yet) support server OSes, according to the web site. Sorry about that.

Use a host that does better backups? (0)

Anonymous Coward | more than 5 years ago | (#24464229)

Have you considered changing hosting providers to one that offers a better backup solution? There are hosts out there that take several hourly, daily and weekly httpdocs backups along with daily mysql dumps and binary logs to provide point-in-time database backups. I know this because I work for a hosting provider that does just this: http://www.ayudahosting.com.au/faqs/hosting/ [ayudahosting.com.au] (see the last FAQ).

As for posting DVDs, I'd suggest contacting your hosting provider and asking if they would be willing to do it for you every month or so. Getting your hosting provider to do this eliminates the need to transfer the data, etc. We often get requests like these and are always more than happy to assist.

How quickly do you need it back? (4, Insightful)

jimicus (737525) | more than 5 years ago | (#24464233)

One thing a lot of people forget when they propose backup systems is not just how quickly can you take the backup, but how quickly do you need it back?

A sync to your own PC with rsync will, once the first one's done, be very fast and efficient. If you're paranoid and want to ensure you can restore to a point in time in the last month, once the rsync is complete you can then copy the snapshot that's on your PC elsewhere.

But you said yourself that your internet link isn't particularly fast. If you can't live with your site being unavailable for some time, what are you going to do if/when the time comes that you have to restore the data?

Dropbox? (0)

Anonymous Coward | more than 5 years ago | (#24464235)

I use dropbox.. and even though it's in beta, you get 10GB of space for free :D


Switch Web Hosts -- Proper Backups are a MUST (1)

flithm (756019) | more than 5 years ago | (#24464257)

I'm in agreement that an rsync based offsite backup solution is always a great idea. rdiff-backup [nongnu.org] or duplicity [nongnu.org] is the way to go.

That being said, proper backups are a must that any web host should provide. I used to use Dreamhost; they did incrementals and gave you easy access to them. Some time ago I outgrew shared hosting and went to Slicehost, which offers absolutely awesome service. Although backups cost extra, they do full nightly snapshots, and it's all easy to manage (restores, taking your own snaps, etc.) via a nice web interface.

Seriously, put your money where your mouth is: find a better host -- AND do your own offsite rsync-based backups.

Re:Switch Web Hosts -- Proper Backups are a MUST (0)

Anonymous Coward | more than 5 years ago | (#24464435)

Example: Peak10 [peak10.com] works with Iron Mountain [ironmountain.com] for backups of hundreds of servers. You would need to supply hardware.

If you use wordpress, I have a solution for you (1)

vbanos (1338353) | more than 5 years ago | (#24464335)

If you use WordPress for your site, you can use blogreplica.com [blogreplica.com] , an online blog backup service which was created with this specific goal in mind. blogreplica.com [blogreplica.com] connects to your blog using XML-RPC and retrieves all the content to its servers, where you have full access to it at any time. Maybe this will work for you.

why not try ... (5, Funny)

Anonymous Coward | more than 5 years ago | (#24464347)

NSA: We backup your data so you won't have to!

How it works:
First, edit each page on your website and add the following meta tags: how-to, build, allah, infidels, bomb (or just any of the last three, if you're in a hurry).

On the plus side, you don't need to give them your money, nor your password.

On the minus side ... there is no minus side (I mean, who needs to travel anyway?)

Posting anonymously as I have moderated in this thread (that, and they already know where I live).

Try Manent (1)

gsasha (550394) | more than 5 years ago | (#24464387)

Consider Manent (http://trac.manent-backup.com , freshmeat entry: http://freshmeat.net/projects/manent [freshmeat.net] ). It can currently back up a local directory to a remote repository, so you can easily set it up to run on your server and back up to your home; in the future it will be able to back up an FTP directory.
It is extremely efficient at backing up a local repository. A 2 GB working set should be a non-issue for it. I'm doing hourly backups of my 40 GB home dir.
Disclaimer: I am the author :)

SVN/rdiff-backup/rsync (0)

Anonymous Coward | more than 5 years ago | (#24464391)

Other posters mentioned rsync (which I agree with). You might also look into rdiff-backup. Another backup option is SVN. The nice thing about these options is that once you have the initial (level-0) backup, they only upload the changes. So, assuming you are not updating things all the time, any of these options will work well even over a 150 Kbps connection.

What an AMAZING coincidence (0)

Anonymous Coward | more than 5 years ago | (#24464417)

Just as this story hits Slashdot, I see a bunch of stories on reddit/digg about 10/20/100 ways to back up your web site!!!111 What an AMAZING coincidence. There couldn't possibly be any business interests in this.

Shared hosting (4, Interesting)

DNS-and-BIND (461968) | more than 5 years ago | (#24464471)

OK, I keep hearing "use rsync" or other software. What about those of us who use shared web hosting, and don't get a unix shell, but only a control panel? Or who have a shell, but uncaring or incompetent admins who won't or can't install rsync? I know the standard slashdot response is "get a new host that does" but there are dozens of legitimate reasons that someone could be saddled with this kind of web host.

Re:Shared hosting (2, Insightful)

lukas84 (912874) | more than 5 years ago | (#24464523)

there are dozens of legitimate reasons that someone could be saddled with this kind of web host.

No, sorry. Not a single one.

Re:Shared hosting (1, Insightful)

pimpimpim (811140) | more than 5 years ago | (#24464633)

Either find a competent provider that already has the tools to do backups preinstalled, or catch up on your (or your technician's) system administration skills. If you have a serious business on your website, you should know what you are doing. The same goes for carpentry or owning a car shop. You just don't get your money for nothing, you know.

Re:Shared hosting (1)

fireboy1919 (257783) | more than 5 years ago | (#24464647)

If you're storing the website *only* on a hosting provider that won't give you a shell, and don't have a complete copy of the entire site in your hands at all times, you've got a much bigger problem.

That is a very good sign that you're at a fly-by-night hosting company that's going to lose all your data. If you're worried about backup, you should pony up and get a decent hosting provider.

But that is probably something worth addressing anyway. Fortunately, there are many rsync-like tools on freshmeat that will work over HTTP or FTP, though I've never used them.

Re:Shared hosting (3, Interesting)

Lumpy (12016) | more than 5 years ago | (#24464797)

Write PHP or ASP code to generate your backups as a tar or zip and get the files that way.

When you pay for the economy hosting, you gotta write your own solutions.

Re:Shared hosting (1)

azemon (820662) | more than 5 years ago | (#24464873)

there are dozens of legitimate reasons that someone could be saddled with this kind of web host.

Choice of hosting company tells me a lot about how the site owner views his own site. If he is unwilling to spend a few bucks on a quality host then he obviously places very little value on his web site. And if the site owner values his site so poorly, why are you worried about backups?

rsync.net (1, Informative)

Anonymous Coward | more than 5 years ago | (#24464511)

rsync.net --- online backup provider with sshfs/sftp/webdavs/scponly support

Check out Conduit or use rsync with ssh (1)

dtrick (955276) | more than 5 years ago | (#24464605)

If you are subscribed to, say, Box.net, using Conduit is one possibility. Making a compressed tar.gz backup shouldn't take that long -- it's the file transfer that will, at 150 Kbps. If you cron a daily/weekly unattended overnight backup with rsync over ssh that downloads (sftp -b batch) to your PC, that might be best. Your backup needs to take into consideration any database back-end. I just use mysqldump to output the entire database to a gzipped file. Lots of examples on the net. Google is your friend. Good luck. Dietrich T. Schmitz, Linux IT Consultant, www.dtschmitz.com
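That overnight schedule might look like the following crontab sketch (host, user, database name, and paths are all placeholders; key-based ssh login is assumed):

```shell
# m  h  dom mon dow  command
# 02:00 -- dump the database on the server into a compressed file.
0 2 * * * ssh me@example.org 'mysqldump mydb | gzip > ~/mydb.sql.gz'
# 03:00 -- pull the site files and the fresh dump down over ssh.
0 3 * * * rsync -az -e ssh me@example.org:/var/www/ /home/me/site-backup/
0 3 * * * rsync -az -e ssh me@example.org:mydb.sql.gz /home/me/site-backup/
```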

Get a shell (0)

Anonymous Coward | more than 5 years ago | (#24464635)

Get a shell and set up a cron job to FTP your data at reasonable intervals. Many ISPs that provide data center services also offer backup solutions. Did you do any investigation before posing this question?

finding net services (0)

Anonymous Coward | more than 5 years ago | (#24464861)

If you are willing to consider pay services, I have had a great experience using a free service (Colotraq) that helps get you exactly what you are looking for when it comes to internet/telecom/etc. services. They are like LendingTree or a real estate agent: they don't do the stuff themselves, they just run a giant B2B marketplace. And no, I do not work for them; I've just really done well with them.

Online backup is a better solution (1)

darkhalcon (678690) | more than 5 years ago | (#24464931)

As many have stated before, I think you have better options. Many backup services offer "delta" or "incremental" backups: for each backup, only data that has changed gets uploaded. The advantages are lower bandwidth use and the ability to have restore points. If you check around, there are many companies that offer this.

Ditto for rsync.net (1)

gregraven (574513) | more than 5 years ago | (#24465143)

I've been using rsync.net for more than a year, and it works great. I back up four websites from one server -- files and MySQL databases -- each night, each week, and each month, and only once did the backup not work as planned. Good tech support, too.

Thanks for your comments... (5, Informative)

pdcull (469825) | more than 5 years ago | (#24465367)

Hi everyone, I didn't mention in my question that where I'm living (a Rio de Janeiro slum) there aren't that many options for internet access. Also, as all my sites are very much not-for-profit, I'm limited as to how much I can spend on my hosting plan. I've been using Lunarpages for a number of years now and generally find them very good, although if I stupidly overwrite a file, or want to go back to a previous version of something, I'm out of luck. As I am a lazy (read: time-challenged) individual, I tend to use PmWiki and maintain my sites online, hence my need for regular, physical backups. Anyway, thanks everyone for your help. I still can't help thinking that somebody must be able to make a pile of cash offering such a service to non-techie site owners...

TarSnap (1)

ivoras (455934) | more than 5 years ago | (#24465599)

From the BSD camp (or at least one of its developers), there's TarSnap [daemonology.net] , which offers strong encryption and confidentiality, plus incremental backups (via snapshotting).