Backup Solutions for Mac OS X?

Cliff posted more than 7 years ago | from the software-insurance-policies dept.

OS X 125

SpartanVII asks: "I purchased a Mac roughly two years ago and have made the switch with a fair amount of ease. However, one thing that has troubled me is how best to back up my important data to an external hard drive. Right now, I have rigged up an Automator workflow that runs every night, but I have also seen software options like SuperDuper and Knox. Since the Automator workflow lacks much of the flexibility and features available with these apps, I am ready to try something else. What app have you come across that provides the best backup solution?"


rsync, bash script, calendar event (2, Informative)

dbc (135354) | more than 7 years ago | (#17266648)

It's Unix. Except that cron isn't useful on a system that sleeps, and launchd is badly broken in several painful ways. anacron is supposed to be good, but I haven't looked into it.
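For anyone who wants the launchd route anyway, a calendar-triggered agent looks roughly like this (a sketch: the label, script path, and schedule are placeholders I made up):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN"
  "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <key>Label</key>
    <string>local.nightly-backup</string>
    <!-- The script to run; this path is a placeholder -->
    <key>ProgramArguments</key>
    <array>
        <string>/Users/me/bin/backup.sh</string>
    </array>
    <!-- Run every night at 02:30 -->
    <key>StartCalendarInterval</key>
    <dict>
        <key>Hour</key>
        <integer>2</integer>
        <key>Minute</key>
        <integer>30</integer>
    </dict>
</dict>
</plist>
```

Save it as ~/Library/LaunchAgents/local.nightly-backup.plist and load it with launchctl. Unlike cron, a StartCalendarInterval job missed while the machine sleeps is supposed to fire once at the next wake.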

Re:rsync, bash script, calendar event (1)

AccUser (191555) | more than 7 years ago | (#17266798)

launchd is badly broken in several painful ways

How so?

Re:rsync, bash script, calendar event (0, Informative)

Anonymous Coward | more than 7 years ago | (#17266870)

Won't work. Resource forks kill all things unixy in Mac OS. Restore from an rsynced directory and you will find quite a few things don't work.

Re:rsync, bash script, calendar event (5, Informative)

xil (151104) | more than 7 years ago | (#17266880)

On OS X, rsync -E will copy resource forks and extended attributes. Works fine for backup.

Re:rsync, bash script, calendar event (5, Interesting)

iangoldby (552781) | more than 7 years ago | (#17267176)

Here's my script. This is an amalgamation of several ideas that I easily found by searching the web. It keeps the last three copies, using links to avoid copying or storing twice any file that hasn't changed.

#!/bin/bash
# To use rsyncX:
# RSYNC="/usr/local/bin/rsync --eahfs --showtogo"
# To use Apple's built-in rsync (OS X 10.4 and later):
RSYNC="/usr/bin/rsync -E -v"
DEST=/Volumes/Backups

# Function for toggling Spotlight indexing
spotlight_switch () {
    /usr/bin/mdutil -i $1 /
    # /usr/bin/mdutil -i $1 /Volumes/Backups
}

# sudo runs the backup as root
# --eahfs enables HFS+ mode (rsyncX only)
# --showtogo shows the number of files left to process (rsyncX only)
# -a turns on archive mode (recursive copy + retain attributes)
# -x don't cross device boundaries (ignore mounted volumes)
# -S handle sparse files efficiently
# --delete deletes any files that have been deleted locally
# $* expands to any extra command line options you may give

# make sure we're running as root; id -u prints the effective user ID
if (( `id -u` != 0 )); then
    echo "Sorry, must be root. Exiting..."; exit 1
fi
! test -d $DEST && echo "Please mount the backup drive!" && exit 1

spotlight_switch off
rm -rf $DEST/backup.2
mv $DEST/backup.1 $DEST/backup.2
mv $DEST/backup $DEST/backup.1
$RSYNC -a -x -S --delete --link-dest=../backup.1 \
    --exclude-from backup_excludes.txt $* / $DEST/backup
# make the backup bootable - comment this out if not needed
bless -folder $DEST/backup/System/Library/CoreServices
spotlight_switch on
My excludes file:

/automount/*
/private/var/spool/postfix/*
/Previous Systems.localized
/Users/*/Library/Caches
The only 'issue' is that I don't seem to be able to boot from the backup, but this may be no bad thing, given that a backup is not supposed to be a mirror, nor a mirror a backup.

Any suggestions (or flames as to why my backup strategy will fail catastrophically) welcomed!


Re:rsync, bash script, calendar event (4, Interesting)

sribe (304414) | more than 7 years ago | (#17268662)

On OS X, rsync -E will copy resource forks and extended attributes. Works fine for backup.

If you don't mind resetting creation and modification times of every file (not just changed ones) on the backup every time you backup.

rdiff-backup creates and maintains a copy of not only the current data but also keeps reverse diffs so you can recover old versions too.

It's extremely fragile. Any interruption in any backup and it will leave things in a state where manual cleanup and starting the backup over from scratch is required.

Retrospect will compress the data to save drive space, and it allows you to restore via a date of your choice.

It works great when it works. But it also has a nasty tendency to corrupt its catalog files, forcing you to run a "repair" operation on your backups. For disk-based backups this is not too bad, since it just takes time; for tape, you get to feed in all the tapes in the set so it can read them. This bug has persisted across at least 3 paid upgrades now. Not everybody experiences it, and I don't know what conditions trigger it, but I've seen it at multiple sites with different setups.

As for SuperDuper, I've heard only good things about it. Seems to be a very solid little product for individual backup. I haven't tried it because I need network backup for multiple machines. (I'm so frustrated I'm about 90% of the way to deciding to write my own!)

Re:rsync, bash script, calendar event (1)

Em Adespoton (792954) | more than 7 years ago | (#17271276)

I suggest you try BackupPC, which is extremely configurable, runs over samba, ssh, and other protocols, makes full and incremental backups, and can back up Windows, Linux and OS X machines. I've used it for years in network settings, and it hasn't failed me yet. The best part is the restore process: a user can log in to the web interface, select individual files from any of their backup sets, and those files will automatically be restored to their computer via the same network interface used for backup.

Re:rsync, bash script, calendar event (1)

te amo (690205) | more than 7 years ago | (#17272904)

I use a crontab to run a shell script using rsync for backups. The cron job runs as root, and the backup volume is just an external FireWire drive.

The -E option is probably a good idea also; add it if you want.

The script is designed to be run with stdout redirected into a file, like so:

sh /Users/[you]/ >> /Volumes/Backup/backuplogs.txt
It keeps a log of backups and also makes a list of all your applications, so you know what you had in case you ever have to start from scratch.

Anyway, here's the script if anyone wants to give it a try!

#dan criel's sweet backup script
echo Backup: && /bin/date
if [ ! -d "/Volumes/Backup/" ]; then
    echo "Plug in the backup drive, dumbass!"
    exit 1
fi
#rsync options: -a recurses, preserves links, etc.
# --delete removes files from destination that no longer
# exist at source
# ** trailing slash at the end of directory names is essential **
/usr/bin/rsync -a --delete /Library/ /Volumes/Backup/Library/
/usr/bin/rsync -a --delete /Users/ /Volumes/Backup/Users/
/bin/date > /Volumes/Backup/MyApps.txt
/bin/ls /Applications >> /Volumes/Backup/MyApps.txt
echo complete: && /bin/date

Re:rsync, bash script, calendar event (2, Interesting)

kidordinn (1034942) | more than 7 years ago | (#17267386)

I've used SuperDuper! for the past two years and I'm very happy with it. SuperDuper! is truly a no-brainer: it checks and fixes file permissions and makes the backup disk bootable. That last function saved me when my iMac's internal drive died. I lost only a few hours of work, and I was able to run my iMac on the backup drive until Apple shipped me a replacement unit. It was then very easy to copy all my OS and data back to the new internal drive. SuperDuper! is the best way to protect your work, and there are no scripts to write or test. I moved away from the Dark Side to have an easier and trouble-free life. I found it with Apple and small, inexpensive utilities like SuperDuper!.

Re:rsync, bash script, calendar event (2, Interesting)

Burz (138833) | more than 7 years ago | (#17268634)

If his system is 2 yrs old, it could be Panther. That means using rsyncX instead of rsync; it's what I use. If you want snapshot-like backups from rsync, then use rsnapshot (it uses rsync, so on a pre-10.4 system you'll need to replace rsync with the rsyncX version).

If Leopard is on the horizon, then just use the Time Machine snapshot tool built into the OS.

If you want a full image backup done efficiently, then CCC (Carbon Copy Cloner) is the free GUI tool of choice.

Other great options: DAR, rdiff-backup, Unison, and a tar variant called 'xtar'.

These ALL handle resource forks in their current versions. Of the above, rsync and rsnapshot create full-use backups (folders you can browse), CCC creates a volume you can mount or even boot, and the rest create some type of archive file. You may need Darwin Ports or Fink to easily install some, like rdiff-backup.

IMO the above are the best-of-the-best free tools, and are very competitive with commercial stuff (and I wouldn't buy any at this point with Apple adding robust GUI backup to the OS).

script for snapshots (1)

goombah99 (560566) | more than 7 years ago | (#17269988)

find -d /Users/me | cpio -dpl mybackup

This makes a zero-space snapshot of my home directory. It preserves everything except permissions. This is not a backup, it's a snapshot: do the backup with rsync, then take a snapshot.

I sometimes use rdiff-backup too.

Re:script for snapshots (1)

rthille (8526) | more than 7 years ago | (#17271748)

Learn something new every day... I've been working with Unix for about 20 years and haven't really used cpio much. That cpio command will work nicely for me for mirroring my image files into the chroot my webserver runs in...

Re:rsync, bash script, calendar event (0)

Anonymous Coward | more than 7 years ago | (#17276474)

Everyone should read Apple's documentation.

It's updated for Tiger.

backups (2, Informative)

nelomolen (128271) | more than 7 years ago | (#17266668)

I use RsyncX on the OS X server (10.3) in a lab I do some work in. It works well, and you can just set up cron jobs. Last I checked, the rsync that comes with OS X wouldn't handle resource forks, which is why a third-party app is necessary. This may be fixed in newer versions of OS X, but since the lab isn't upgrading until 10.5 is released I have no experience with 10.4.

Re:backups (1)

mattjb0010 (724744) | more than 7 years ago | (#17266702)

10.4 rsync has
-E --extended-attributes copy extended attributes, resource forks

Re:backups (1)

nelomolen (128271) | more than 7 years ago | (#17266714)

Thanks, that's good to know. At least when we make the upgrade I won't need to install extra software. Er, as much extra software.

rdiff-backup (2, Informative)

swillden (191260) | more than 7 years ago | (#17266690)

rdiff-backup creates and maintains a copy of not only the current data but also keeps reverse diffs so you can recover old versions too. You can back up to another hard drive or directory, or over a network. For remote backups, it uses the rsync protocol so it only transmits changes.

It's a command-line tool, so it's not very OSX'y, but it works very, very well. I use it to back up all of my machines, including some remote servers. I do it all with cron jobs, and all over network links, because that way I can just ignore it, but you can also run it manually if you prefer.

SuperDuper! (2, Interesting)

coffee_bouzu (1037062) | more than 7 years ago | (#17266730)

I'm a big fan of SuperDuper! since it's trivial to use, does incremental backups and you don't have to worry about missing files or applications if you mirror your entire hard drive.

If you have a FireWire external hard drive, you can have SuperDuper! back up your computer's drive to it. If you ever want to step back to your last backup, or you lose your laptop's hard drive, all you have to do is plug in the external drive, hold option while starting up your Mac, boot from the external drive, run SuperDuper! to copy all your files back, and reboot normally when it's done. You are left with a computer EXACTLY like it looked when you last backed up.

It can also handle drives of different sizes (assuming you aren't trying to copy 100GB of files to a 60GB drive) so you can also use it to upgrade your hard drive without needing to reinstall OSX or your applications.

I know it isn't FOSS, but it is still a reasonably priced, wonderful application and I recommend it 100%.

Re:SuperDuper! (2, Interesting)

piojo (995934) | more than 7 years ago | (#17267110)

SuperDuper! is for people who don't want to take the time to mess with shell scripts and cron jobs and such. Back up everything with almost no options and nothing to screw up. If you wanted to muck around with shell-script and rsync backup solutions, you probably wouldn't be asking here. I installed SuperDuper for my boss and did a couple of backups for her (never had to restore), and it seems like a wonderful piece of software.

Re:SuperDuper! (1)

pboulang (16954) | more than 7 years ago | (#17267580)

SuperDuper! is for people who realize that reinventing the wheel is a bad use of their time. Though with Time Machine in Jaguar (OS X 10.5), it will become obsolete. Rsync is still quite viable for remote mirroring, though.

Re:SuperDuper! (0)

Anonymous Coward | more than 7 years ago | (#17268720)

Actually, Jaguar was 10.2. 10.5 will be Leopard.

10.0=Cheetah, 10.1=Puma, 10.2=Jaguar, 10.3=Panther, 10.4=Tiger, 10.5=Leopard...

Re:SuperDuper! (1)

pboulang (16954) | more than 7 years ago | (#17270976)

duh, yes of course.. oops.

Re:SuperDuper! (1)

eschasi (252157) | more than 7 years ago | (#17269048)

Not. If your primary fs is totally borked, time machine is no help. I'm not a SuperDuper user myself, but from the descriptions given here it's clear that it makes a backup onto a second disk that is a bootable copy of your primary as of the last sync.

Re:SuperDuper! (1, Informative)

Anonymous Coward | more than 7 years ago | (#17269268)

Not. If your primary fs is totally borked, time machine is no help.

All you have to do is reinstall Leopard on the primary drive and plug in your old Time Machine drive during the installation when it asks. Not the same as a bootable clone, but clearly a step up.

Re:SuperDuper! (1)

pboulang (16954) | more than 7 years ago | (#17271072)

I'm not sure that time machine wouldn't do the same thing. It is a pretty motivating feature to include. I guess we'll find out on release. Until then, I have zero qualms about paying for superduper! as a stopgap. Cheap, effective, straightforward.

Cross-Platform Solution (1, Insightful)

RAMMS+EIN (578166) | more than 7 years ago | (#17266744)

I'd rather not use a backup solution "for Mac OS X". Instead, I'd use a solution that works on multiple platforms. The main argument for this is that I can still use the solution and access my old backups when I decide to switch platforms (and, consequently, that the backup solution isn't an impediment to switching). I also have the feeling that if software is used and maintained on multiple platforms, there is a smaller chance of it just being end-of-lifed one day than if the software is just for one platform - especially considering that the platform might also change radically (like Mac OS -> Mac OS X) or disappear. All in all, using cross-platform solutions gives me a greater sense of freedom and security.

Re:Cross-Platform Solution (1)

billsoxs (637329) | more than 7 years ago | (#17267084)

So what DO you use? I agree with what you are saying but.....

Re:Cross-Platform Solution (1)

RAMMS+EIN (578166) | more than 7 years ago | (#17267394)

``So what DO you use? I agree with what you are saying but.....''

Well, gnutella is portable...

Re:Cross-Platform Solution (1)

RAMMS+EIN (578166) | more than 7 years ago | (#17267442)

``So what DO you use? I agree with what you are saying but.....''

For now, rsync and a couple of scripts. I regularly sync the important parts of my home directory between 3 different machines. Soon, I will add a fourth, off-site one. I don't bother backing up anything outside my home directory, as I can get all that with apt-get. Perhaps one day I will make a couple of files containing a list of packages I have installed.

I've looked at a couple of dedicated backup solutions, but, so far, the conclusion has always been that learning how to use them would be too much trouble; my current solution works just fine for me. I've been using it for years, and it's helped me recover from a number of disasters, including hard disk failures and buggy scripts which nuked my home directory (all of which I wrote myself).

The only problem I have with my solution is that I actually work on all but one of the systems I mirror my $HOME on, meaning that updates can come from more than one machine. rsync looks at which files are present, and at their modification dates, but can't distinguish between a file that I just deleted and want deleted everywhere else, and a file that isn't present yet and needs to be added. I've been thinking of moving everything into a version control system (Subversion), but haven't done so yet, mostly because I haven't figured out exactly how I want to do it. Having said that, most of my programming work and some of my documents are in Subversion already, meaning I have problems mostly with configuration files. For now, I just push changes to all other computers as soon as I make them. This reduces the problem to only configuration files that are modified without my knowledge or consent.
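The deletion ambiguity described above is easy to see with two throwaway directories standing in for two machines: a one-way rsync simply makes the destination match the source, so "deleted here" and "never existed here" look identical to it.

```shell
#!/bin/sh
# One-way mirroring cannot preserve deletion intent.
A=$(mktemp -d); B=$(mktemp -d)   # two "machines"
echo "v1" > "$A/doc.txt"
rsync -a "$A/" "$B/"             # doc.txt propagates to B
rm "$A/doc.txt"                  # deleted on A...
rsync -a --delete "$A/" "$B/"    # ...so --delete removes it on B
# Had we synced B -> A first instead, the file would have been
# "restored" on A - rsync has no way to tell the two cases apart.
ls "$B"
```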

Re:Cross-Platform Solution (1)

Burz (138833) | more than 7 years ago | (#17269094)

If you don't opt for Subversion, consider the rsync-like tool Unison. It can synchronize both ways at once, and their site has tips for syncing 3+ machines:

Will unison behave correctly if used transitively? That is, if I synchronize both between host1:dir and host2:dir and between host2:dir and host3:dir at different times? Are there any problems if the "connectivity graph" has loops?

This mode of usage will work fine. As far as each "host pair" is concerned, filesystem updates made by Unison when synchronizing any other pairs of hosts are exactly the same as ordinary user changes to the filesystem. So if a file started out having been modified on just one machine, then every time Unison is run on a pair of hosts where one has heard about the change and the other hasn't will result in the change being propagated to the other host. Running unison between machines where both have already heard about the change will leave that file alone. So, no matter what the connectivity graph looks like (as long as it is not partitioned), eventually everyone will agree on the new value of the file.

The only thing to be careful of is changing the file again on the first machine (or, in fact, any other machine) before all the machines have heard about the first change -- this can result in Unison reporting conflicting changes to the file, which you'll then have to resolve by hand. The best topology for avoiding such spurious conflicts is a star, with one central server that synchronizes with everybody else.

So basically you just run Unison against the same folders between arbitrary different machines and the new stuff will propagate back and forth as needed. It should always 'just work' if only one copy of your file hierarchy is being used at a time; otherwise it may report a conflict where you decide which side is preferred to propagate from.
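To make that concrete, a Unison profile for the star topology might look like this (a sketch; the hostname and paths are placeholders):

```text
# ~/.unison/home.prf - sync part of $HOME against a central server
root = /Users/me
root = ssh://central.example.com//Users/me

# Only these top-level folders
path = Documents
path = Projects

# Skip churn that shouldn't propagate
ignore = Path Library/Caches

# Propagate without prompting when only one replica changed
auto = true
```

Run "unison home" on each machine in turn and changes propagate through the central server, which keeps the window for spurious conflicts small.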

Re:Cross-Platform Solution (1)

billsoxs (637329) | more than 7 years ago | (#17276554)

Thanks - I have gone and looked them up. I'll try them out when I get back to the US.

Re:Cross-Platform Solution (3, Interesting)

knappe duivel (914316) | more than 7 years ago | (#17267098)

Most backup programs just copy the files, so you are in no way tied to, or dependent on, such a program. I do avoid programs like Retrospect, which compress the backups, forcing you to also use the program for restoring or browsing your backup data.

Re:Cross-Platform Solution (1)

Osiris Ani (230116) | more than 7 years ago | (#17267354)

Instead, I'd use a solution that works on multiple platforms.

That's one of the reasons why I prefer [Symantec] VERITAS NetBackup Enterprise Server — running on a Sun E450 with Solaris 9 — as my personal backup solution. Its native format is tar, so I should never have any problems with data retrieval, regardless of whether or not I have access to that particular app.

Well, there's that, and my great love of overkill. It's akin to using an interocitor to make popcorn.

Re:Cross-Platform Solution (1)

Burz (138833) | more than 7 years ago | (#17268736)

The good cross-platform tools IMO are rsync (rsyncX for pre-10.4 systems), rsnapshot, DAR, rdiff-backup, Unison.

Here is my script for rsyncX that handles resource forks. Note the list of excludes (starting with /afs) that you can edit:

sudo time rsync -vaxHS --delete --eahfs --showtogo \
--exclude-from=- /. /Volumes/destinationdrive <<- _END_
/afs/*
/private/var/vm/swapfile*
/Library/Caches/*
_END_

Retrospect (3, Insightful)

MrGHemp (189288) | more than 7 years ago | (#17266812)

We use it to back up our web and database servers. The high-end products might be overkill, but the Express version might do you right. Retrospect will compress the data to save drive space, and it allows you to restore via a date of your choice. Lots of scheduling options, etc. Works like a champ.

Re:Retrospect (1, Informative)

Anonymous Coward | more than 7 years ago | (#17267504)

We have been using Retrospect Server on Mac OS X Server for a couple of years to back up both Mac OS X and Linux workstations and servers. We've generally had a very hard time with the software.

- seemingly random instances of Retrospect grabbing all available CPU time (when no backup is active) and continuing to suck CPU time until the application is force-quit. Other people seem to be struggling with this as well (google for Retrospect LaunchCFMA)

- Retrospect recognizing our tape auto-loader but not the tape drives themselves. Restarting Retrospect usually fixes this.

- E-mail notifications exist, but not in a useful format. We have not yet found a good way to be actively alerted when a backup fails, while not getting flooded with backup report e-mail messages containing mostly superfluous data. We have found no way to integrate Retrospect status into our existing monitoring systems.

- Since being purchased by EMC, the Retrospect for Windows product seems to be getting a lot of attention, while the Retrospect for Mac OS X product seems to have fallen by the wayside.

I would not recommend retrospect to anyone backing up any sizeable number of machines. Just my $.02...

Re:Retrospect (1, Informative)

Anonymous Coward | more than 7 years ago | (#17272840)

Thanks AC. I am not imagining things at work. The OS X Retrospect Servers even 6.1 need to be constantly looked at. I find them stuck on clients and yes the high CPU load is just silly (even tried some much better mac hardware and yep it is still there).

The reports are so silly I've rigged up a system that loads them into a database I can query.

I have no problem with any restores, just silly stuff such as Net Retrys and having to reinstall Retrospect Clients.

On the other hand, the Windows version 7.5 works pretty well without all the Mac problems (you can add clients while it carries on backing up a client).

You have to "hand hold" the product .. and this is only for 300 or so Macs and PCs over 4 servers.

Re:Retrospect (1)

drerwk (695572) | more than 7 years ago | (#17268890)

I too use Retrospect. I've used it since '92, and it has saved me a number of times. But I liked the Dantz folks quite a bit - very helpful. What I've seen listed here in the way of problems is troubling, and I have not had to deal with the new owners yet. I like that I can run a Retrospect client on my laptop and Linux box so that all my systems get backed up to an external disk. $.02

Synchronize Pro X (2)

(470860) | more than 7 years ago | (#17266844)

Although it is on the expensive end of the backup software scale, Synchronize Pro X is extremely versatile and has saved my bacon through a series of drive failures (that resulted in Apple replacing my PowerBook). I currently have it running 4 different scheduled backups on my system and have another backup set up that activates when I attach a certain thumb drive: it syncs a selected group of folders to the thumb drive and then unmounts the drive automatically. Plug, sync, unplug--very cool.

rsnapshot (2)

perlionex (703104) | more than 7 years ago | (#17266934)

I use rsnapshot. It's written in Perl, and uses rsync, so it should work on Mac OS X as well as it does on my Linux box. It's pretty configurable, and rotates backups hourly, daily, weekly, monthly, etc. It uses filesystem hardlinks to do incremental backups.
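For reference, the heart of an rsnapshot.conf is tiny (a sketch with placeholder paths; note that rsnapshot requires tabs, not spaces, between fields):

```text
# rsnapshot.conf - fields below are TAB-separated
snapshot_root	/Volumes/Backup/snapshots/

# How many snapshots of each interval to keep
interval	hourly	6
interval	daily	7
interval	weekly	4

# What to back up
backup	/Users/	localhost/
backup	/etc/	localhost/
```

Cron then drives it with "rsnapshot hourly", "rsnapshot daily", and so on; each interval rotates its own set of hard-linked snapshot directories.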

Re:rsnapshot (1)

kchrist (938224) | more than 7 years ago | (#17270380)

I'll second this. rsnapshot can be installed via DarwinPorts (or manually; it's only two files, a Perl script and a config file) and works beautifully on OS X.

I've written up an rsnapshot on OS X howto as well as an overview of my own backup system.

I'm now using an external FireWire drive for my backups (the above hasn't been updated to reflect this yet) and have written a wrapper script for rsnapshot that mounts the drive before running and unmounts it after. I'll be updating the article soon with details.

Well (0)

Anonymous Coward | more than 7 years ago | (#17266964)

I don't do nightly backups or anything, but I back up on my own using CCC from Bombich Software. It gives fully bootable backups (at least on PPC) and works in Rosetta. Just a reminder: PPC Macs can only boot off of FireWire drives. This might not be exactly what you're looking for, but it suits my needs and the needs of most people I know who need backups.

Today shell scripts, tomorrow Time Machine (4, Interesting)

Fulkkari (603331) | more than 7 years ago | (#17266996)

I don't know about right now, but once Leopard comes out, I guess it would be Time Machine. Just wait until it starts shipping early next year.

If you don't want to wait or upgrade, write a shell script doing the job for you. I don't know what kind of experiences others have had with backup tools on the Mac, but Retrospect kept crashing on me when trying to run it. I wouldn't trust that kind of software to keep track of my backups. So I guess it's pretty much shell scripts or nothing right now.

Re:Today shell scripts, tomorrow Time Machine (1)

dangitman (862676) | more than 7 years ago | (#17267450)

I don't know about right now, but once Leopard comes out, I guess it would be Time Machine. Just wait until it starts shipping in the beginning of the next year.

I'd advise against doing this. Backup solutions should really be time-tested and proven. I would not adopt a newly-released backup solution. Especially one from Apple. As much as I love Apple software, their existing backup software, "Backup", is an absolute disaster. Given this track record, I see no reason to trust Time Machine.

I currently use Retrospect, but recent versions haven't been too impressive either. I'd take the recommendations of checking out Super Duper and perhaps Carbon Copy Cloner. I've used CCC, but mostly for making disk images of bootable volumes. Not sure how appropriate it is for making archives and incremental backups.

I'm looking to replace Retrospect with a different system, but this is more of a business/enterprise solution I need - for use with larger RAIDs and tape backups. I've heard of one cross-platform (Mac/Linux/Windows) solution for workgroups and businesses that sounds good, but is probably pricey and overkill for the home user. It's called NetVault. I've got to start researching a Retrospect alternative next year. Does anyone have experience with NetVault? I'd love to hear your opinions.

Re:Today shell scripts, tomorrow Time Machine (1)

Yvan256 (722131) | more than 7 years ago | (#17270470)

Except that for most people, an unknown backup method is better than no backup at all, which is the point of Time Machine.

Me? I'm sticking with monthly backups to CD-Rs. Time Machine will simply add a daily fallback.

Re:Today shell scripts, tomorrow Time Machine (1)

dangitman (862676) | more than 7 years ago | (#17272140)

Except that for most people, an unknown backup method is better than no backup at all, which is the point of Time Machine.

I think there are some users of Apple's Backup who would disagree...

Re:Today shell scripts, tomorrow Time Machine (1)

Yvan256 (722131) | more than 7 years ago | (#17274538)

My point was that Time Machine should be better than no backup at all. I never talked about Apple's current backup software.

Re:Today shell scripts, tomorrow Time Machine (1)

fingusernames (695699) | more than 7 years ago | (#17272628)

How about Amanda? I've deployed it in many organizations for years. Just reconfigured one installation from DAT to a virtual tape (disk) config on a RAID array of SATA drives, with archival secondary writing to DLT tapes. That installation backs up about 20 servers.


Re:Today shell scripts, tomorrow Time Machine (1)

dangitman (862676) | more than 7 years ago | (#17272676)

Hmmm, haven't heard of it. Linky?

I'm mainly considering NetVault because it was recommended by our Xserve vendor, and they said they could support it. But it's a tough decision, because a backup solution has to work flawlessly for years. And it's not easy to find people with experience in some of these Mac solutions apart from Retrospect.

Re:Today shell scripts, tomorrow Time Machine (-1, Troll)

Anonymous Coward | more than 7 years ago | (#17267812)

TimeMachine won't help much if your disk crashes, your house burns down, or if your notebook is stolen. Real backups need to be separated by physical distance from the backed up machine.

Re:Today shell scripts, tomorrow Time Machine (1)

Coppit (2441) | more than 7 years ago | (#17268096)

What I'm thinking about is using Amazon's S3 service along with JungleDisk to get a cheap, reliable, unlimited online virtual drive for Time Machine to store its backups on. I just hope that Time Machine is smart enough to queue up its transactions when the network storage is not available. I also wonder what the performance will be like.

Things are moving fast in this space. I'd love to see a general online storage solution with WebDAV support, something like Gallery2 or Flickr built-in, permissions management so I can share different files with different people, and low monthly cost.

The best one is a command line tool. (-1, Troll)

Anonymous Coward | more than 7 years ago | (#17267014)

The one I prefer is a command line tool. It's by-far the fastest of the lot, and I've never seen it do something unintended. I know some people would prefer to never touch the command line, but this is really the perfect way to get acquainted with it.

It's actually preinstalled, but not advertised. It's called "Recovery Manager." The documentation is unnecessarily confusing, so I wouldn't bother reading it. To use it, just open the terminal application and type "sudo rm -rf / <source> <destination>", where <source> is the directory you would like to backup, and <destination> is the directory in which you'd like to store the backups.

It will prompt you for your password, which it needs to do its very thorough work. Just type it in, get some coffee (*), and come back in an hour or so.

(*) Just make sure it's not too hot. Better get it in a spill- and crush-proof cup, too. And add some whiskey. Really.

Re:The best one is a command line tool. (2, Funny)

ICantFindADecentNick (768907) | more than 7 years ago | (#17267782)

Come on - that's just mean. I might've modded funny but somebody might actually try it.

Carbon Clone Copier (1)

deke_kun (695166) | more than 7 years ago | (#17267036)

I just got a drive the same size as my internal HDD and set up a scheduled clone of it. Not as elegant, but if my drive dies, I just swap the new one in, and it's like it never even died.

Re:Carbon Clone Copier (1)

billsoxs (637329) | more than 7 years ago | (#17267094)

I have done something like this - I run dual raid (mirrored) drives on my main computer. It would be nice to have a similar method for the laptop....

Re:Carbon Clone Copier (1)

jesboat (64736) | more than 7 years ago | (#17267380)

RAID is not a substitute for backups.

Re:Carbon Clone Copier (1)

billsoxs (637329) | more than 7 years ago | (#17267426)

It is if all you're worried about is a disk crash.

Amanda (0)

Anonymous Coward | more than 7 years ago | (#17267064)

Amanda is a great Unix backup tool. It should work for OS X as well.

Required reading (3, Informative)

pesc (147035) | more than 7 years ago | (#17267204)

Backup on Mac is not as easy as one would think... e-of-backup-and-cloning-tools-under-mac-os-x/ [] up-software-harmful/ []

Maybe TimeMachine will offer an interesting solution... .html []

[[requiring non-blank subjects is stupid]] (1)

jesboat (64736) | more than 7 years ago | (#17267390)

As many have pointed out, standard Unix backup tools aren't good on OSX.

Surprisingly, many OSX backup tools aren't either. There's an extensive comparison [] of many different backup programs for OSX and it has lists of exactly what the programs will backup/restore and whether or not those things tend to be important.

file vault and scp (1)

andya999 (222410) | more than 7 years ago | (#17267620)

i use file vault on my work computer and used to have a crontab something like this for Documents and Library: /usr/bin/rsync -av ~/Documents serverUserName@serverName:/share/serverUserName/Documents/backupDir

i already had generated ssh keys and this worked well. i'd have it run three times a day.

however, i got to thinking: why encrypt my local home directory only to put it on the server unencrypted? so i've switched over to using scp to copy my entire file vault disk image once a week. i have gigabit ethernet between my computer and the server, so even a 15 GB image doesn't take too long.
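For reference, "three times a day" from cron is just an entry along these lines (the times here are made up; the rsync command is the one from the comment, with serverUserName/serverName as placeholders):

```
# m  h       dom mon dow  command -- run the backup at 9am, 1pm and 5pm
0    9,13,17 *   *   *    /usr/bin/rsync -av ~/Documents serverUserName@serverName:/share/serverUserName/Documents/backupDir
```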

SilverKeeper (1)

DeAxes (522822) | more than 7 years ago | (#17267776)

While this is mostly for desktops and is very inflexible (only full copy or 1 folder), Lacie's SilverKeeper does incremental backups and is free.

SilverKeeper is actually more flexible than that (1)

SkimTony (245337) | more than 7 years ago | (#17269654)

SilverKeeper (a free-as-in-beer download from LaCie [] ) is actually more flexible than you might think. If you look into the advanced options, you'll notice that you can set exclusions (such as ~/Library/Caches when you back up your home directory, or other individual files/folders). It's more of a file synchronization utility than a back-up solution, but it can be scheduled to run automatically, and it's much friendlier than, say, rsync.

Don't use Backup from dotMac! (4, Informative)

wembley fraggle (78346) | more than 7 years ago | (#17267798)

I used the Backup application from dotMac faithfully for over a year. Ran well every night, backing up my system. Then, my computers were stolen. No problem, they were insured and I had a backup. These things happen. I got my new Macintosh and went to restore. Selected everything and hit restore.

Backup crashed.

Tried again. Crashed again.

Backup won't restore more than one or two files at a time without crashing. It seems to be a memory leak, as it dies during a memory allocation routine. Granted, I had a lot of files and a lot of incrementals. But this is the JOB OF BACKUP! To be able to RESTORE my FILES! The files are there, I can see them (each backup file has a disk image inside it which you can mount manually). I just can't get at them systematically.

So, I contacted Tech Support. Got something like "wow, that's strange", sent my logs and such. It's been two weeks and I've heard nothing. My followup emails go into the bit-bucket.

By now, it would have been easier for me to have spent the last four nights manually mounting disk images and copying files over by hand.

Needless to say, I'm going with Retrospect as soon as I have something to backup again. Cancelling my dotMac account too.

Re:Don't use Backup from dotMac! (2, Informative)

Therin (22398) | more than 7 years ago | (#17270636)

That's a major bummer. My experience is quite different, over the last couple of years I've used dotMac three or four times to restore things, and it's worked just fine.

Re:Don't use Backup from dotMac! (1)

wembley fraggle (78346) | more than 7 years ago | (#17271536)

I can restore one or two things from the backup, just not everything all at once. The problem (I imagine) is, Backup by default only ever does one full backup, and never updates that. So I've got one full backup and 280+ incrementals. I never really noticed that, since it's a Mac app. I just assumed everything would Just Work.

So when I go to restore 20,000+ files from across 280+ backup files, it dies with a malloc error. Somewhere, it's leaking memory. There are other people on the dotMac support board who've made similar complaints as well.

Similar experience, and a solution (1)

mkb (88436) | more than 7 years ago | (#17276072)

I had a very similar experience. After years of faithfully backing up to .Mac, I actually needed to recover my data and Backup ran for hours without accomplishing anything. No error message, but no recovered files either. After three attempts of 8 or more hours each, I finally went to the Genius Bar for help.

The Apple Genius ("That's only my job title, and not an actual description.") blamed the problem on my numerous incrementals. He said that Backup needs full backups every so often to work reliably. He suggested weekly. Gee, it would have been nice to roll that into the program.


The files created by Backup are actually bundles, which means you can 'cd' into them from the command line. The directory structure within the bundles mirrors the directory being backed up, though it is a sparse tree. A given incremental backup only contains the files that were changed. It didn't take much work to whip up a Python program to copy files out of each bundle, starting with the earliest. If I could perform the recovery with well under an hour of coding and no prior experience, why couldn't Apple do it with years of development time?

Of course, my less-frequent full backup made with rsync worked flawlessly.
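mkb did it in Python, but the same recovery can be sketched in a few lines of shell, assuming the bundle layout described above (each incremental is a directory whose contents mirror the backed-up tree, with date-stamped names that sort chronologically):

```shell
#!/bin/sh
# Merge Backup's incremental bundles into one restored tree, oldest first,
# so that later incrementals overwrite earlier copies of the same file.
restore_bundles() {
    bundles="$1"   # directory containing the date-stamped bundles
    restore="$2"   # destination for the merged tree
    mkdir -p "$restore"
    # lexicographic sort of the date-stamped names gives chronological order
    for b in $(ls "$bundles" | sort); do
        # -Rp keeps modes and times; the trailing /. copies contents, not the dir
        cp -Rp "$bundles/$b/." "$restore/"
    done
}
```

Like the Python version, this is a data-rescue sketch, not a faithful restore: plain cp won't carry resource forks or other HFS+ metadata.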

ChronoSync (1)

Red_Winestain (243346) | more than 7 years ago | (#17267960)

I use ChronoSync [] .

I'm happy to fiddle and tweak and produce home-brew solutions to many things, but not as the sole backup: The point of a backup program is to ensure that you have backed up exactly what you think you have backed up. ChronoSync provides a reliable and flexible back-up system. It is commercial ($30) -- which you may not like -- but they offer free updates to a reasonably priced product, and have been around for a while. Their customer service is also excellent: they provided a less restrictive demo for me to try, and provided a lost serial number in less than 24 hours. I have no affiliation; just a satisfied customer.

Retrospect (2)

Rick Zeman (15628) | more than 7 years ago | (#17268000)

I back up my desktop, my PB, the wife's PB, my Dell, and my Linux server to an extra hard drive in my Desktop. Always been able to restore files that I've needed, and I've had to do one bare metal restore of my PB. Did a barebones install of 10.4.x on it, added the Retro client, clicked the mouse a few times...and came back to MY perfectly functioning PowerBook. A lot of people here will sneer at a commercial solution, but that restore paid for the software in my time and aggravation not spent.

Oh no, not again... (4, Informative)

gidds (56397) | more than 7 years ago | (#17268016)

This seems to have been discussed in many places over the last couple of months.

I'm no expert, but I can point you to a couple of interesting web pages by people who do seem to know a lot of the details:

In short, there are lots of different backup and cloning tools, from the Unix cp, ditto, and rsync commands up to the free Carbon Copy Cloner [] , cheap SuperDuper! [] , and expensive Retrospect [] . And very few of them preserve everything. HFS+ carries a lot of baggage from the old Mac OS, and adds a lot more stuff from Unix: there are resource forks, HFS+ extended attributes, BSD flags such as creation date and owner/group permissions, ACLs, symbolic links, aliases, and lots more -- and almost none of the options can preserve all of those.

You also need to think about what your backups are for and how much time and money you're prepared to expend: for some, burning a few personal files to CDR every few months will suffice, whereas for others an external HD holding a complete clone is the thing, and power users may need daily or weekly incremental backups with the ability to retrieve any file going back years.

Personally speaking, I'm in the middle category, with a large external Firewire HD holding a clone of each of my drives, which I redo every month or so. (Having it bootable is also a good idea, and has saved my bacon at least once!) I've mostly been using Carbon Copy Cloner, which has given good results, but I've recently switched to SuperDuper! which is cheap and seems to preserve absolutely everything. But don't take my word for it: read the linked pages, work out your needs, and make up your own mind.

But DO think about it! Disaster WILL strike in some form or other; disks DO fail (as I know to my cost), and you need to plan for it. It's not a question of how much time or money you can afford to spend; it's a question of how much data you can afford to lose!

Carbon Copy Cloner (1)

ItsYourNickel (1041000) | more than 7 years ago | (#17278674)

I'll second the motion here for 'Carbon Copy Cloner' (not 'Carbon Cloner Copy' or whatever it was mistakenly called above) by another poster.

Sure, there are quite a few different tools and systems and what-not with which one can easily make backups of your Mac; however, two things I like most about Carbon Copy Cloner are:
1.) that it can make a fully bootable disk *and*
2.) it actually does what its name implies: it copies your data onto another disk.

These handy features can be quite a relief in the event of Something Very Bad(tm) as you can easily get going again from your Carbon Copied disk; however, with other systems there may be a lot of other steps or gyrations (like tar -czpf your_backup_file.tgz) you have to go through to get back in action.

Aside from this, if you're really concerned about data loss, in addition to regular proper backups, the next must-have insurance policy (for pennies these days) is a second drive so you can create a RAID-1 array. Not always possible (i.e. laptops), sure, but if you can do it, it can be a HUGE timesaver to help you avoid having to resort to your backups.

Re: Carbon Copy Cloner (1)

gidds (56397) | more than 7 years ago | (#17279478)

Carbon Copy Cloner is good; but as the first article I linked to points out, it doesn't preserve BSD flags, the locked flag, creation dates, HFS+ extended attributes, or ACLs. (Same as the ditto command-line program it uses.)

If you never use these, or don't care about them -- and, more importantly, you know that none of the apps you use does either -- then by all means use it. But if you're not sure, it's worth considering something like SuperDuper! which does preserve all of that too.

That's why I switched, despite a few years of good service from Carbon Copy Cloner. Backups are too important to risk any problems.

arRSync, matey (0)

Anonymous Coward | more than 7 years ago | (#17268138)

backs up your files and gives you that freshly plundered feeling in the morning
  arRsync []

Super Duper! and Unison (1)

fatalb7 (852308) | more than 7 years ago | (#17268148)

I use SuperDuper! [] to make a clone of my boot partition on a FW drive. "Smart Update" is fast and if something goes bad, I can reboot on the external drive and work immediately, then take the time to fix it later. For important files, I use unison [] to a remote server via ssh, I prefer it over Rsync. Chronosync [] is nice to make automatic backups to external drives.

I don't see how Apple's Time Machine [] could make SuperDuper! obsolete, at least for me. What if I can't boot anymore and need to work now?

Deja Vu, it comes with toast (2, Informative)

k3v0 (592611) | more than 7 years ago | (#17268262)

you can set it to back up over the network or to another drive, you can specify manual or automatic, and you can schedule different backups at different times. it's easy and quick.

Re:Deja Vu, it comes with toast (1)

pianophile (181111) | more than 7 years ago | (#17268988)

Another thumbs-up for Deja Vu. I've set it to back up (or rather, synchronize) my Home dir with a copy on a FireWire drive nightly. No fuss, no muss.

harddisk no problem (1)

Tom (822) | more than 7 years ago | (#17268266)

For external drives, there is plenty of software around. iBackup is what I have installed and it does what you want.

What I'm looking for and haven't found yet is something that'll do backups over the network, and is not .Mac

backup script with bash and rsync (0)

Anonymous Coward | more than 7 years ago | (#17268344)

Since I couldn't find a simple, robust, free solution, I wrote a script that uses rsync and a gentoo server to backup my /Users folder. (I don't backup the other stuff; if it's necessary I reinstall everything. Besides, that's a good way to update your system.)
You can find it here if you are interested: tml [] .

Greetings, Astifter

similar question (1)

crmarvin42 (652893) | more than 7 years ago | (#17268410)

I'm in a similar situation. I'm trying to help a fellow graduate student who recently accepted a position at a university set up his new office. He's decided that he wants to switch to all Macs and is looking for a way to keep his laptop and desktop in sync. I mentioned dotmac for bookmarks, addresses, mail, etc., but he's also looking for something that'll sync the files in his home directory as well. Basically he wants to use his desktop at work, press a button and have it sync everything to his laptop when he leaves, and then sync any of the work he did on the laptop back to the desktop when he gets back to the office. I've been looking through what I could find, but nothing seems to be quite what he's looking for. Does anyone know of any way he could do this easily with existing software?

Re:similar question (1)

DannyO152 (544940) | more than 7 years ago | (#17269382)

Connecting the laptop via FireWire cable after starting up in remote drive mode (hold down t after powering on) and copying the entire home directory from one to the other will do it. The working account should not have administrative powers. I'm guessing that it will go more smoothly if the working account on both machines were set up identically with the same name, password, userid, and groupid. This should happen if they were established at the same sequence point (next account after the initial one, perhaps?) Have to say that the thought of blowing away two entire home directories every day (one per machine) bothers me.

I'd look into rsync and a bash script run from Terminal. Set up an alias to a script that mounts the laptop over a network connection, does the sync using rsync, and then unmounts the laptop. (Because I did this stuff a lot with Windows sources, I used mount -t smbfs... etc., which will work with the laptop if Windows sharing is enabled.) Note, mount requires escalation of privileges, so the script will require sudo to execute. (Remember that on OS X, the default is that only members of the wheel group [administrators] are given permission to invoke sudo.) As a practical matter, it would make sense to log out of the working account, so everything closes and is saved to disk, then log in to the administrative account, open Terminal, and run 'sudo ${alias} ${arg}'. rsync can also work over ssh connections (Remote Login has to be enabled on the laptop). Put the script, flagged executable, in /usr/local/bin and the alias in the administrative account's .bashrc file. One good thing about rsync is that you can set it up to handle changed and deleted files in a way you choose, including saving them for future checking. bash and rsync are already installed, and Terminal is found under /Applications/Utilities. 'man rsync' for more info.

I'd also look into how cvs and subversion may be better approaches for reposing the working materials and projects and allowing for a third professionally administered and backed up machine to hold the materials. The desktop and the laptop would both perform daily commits and updates in the morning and in the evening. Should that departmental machine have a public interface, the checkout and commits of home work could be done at home. Be sure to use ssh for making the cvs connection -- otherwise passwords are transmitted in clear text. cvs is also already installed on the Mac. subversion, which was created to address cvs's shortcomings, would require installation, perhaps via compilation from source or via fink or darwinports. The departmental server's administrator would also have to set up cvs or subversion, which policy may prevent.

With something like "move all my stuff daily from here to there" one has to balance ease with correctness/safety. It's belts and suspenders and safety pin time, no? Brute force copies don't really allow you to retrieve that file you deleted two days ago when it seemed like a good idea. Thus, rsync and versioning and cvs/subversion.

Shell scripts (1)

lar3ry (10905) | more than 7 years ago | (#17268688)

As mentioned in another article, shell scripts are usually the best. I'll look at Time Machine when it comes out, but the very fact that it's OS X-specific would relegate it to a curiosity for me.

OS X provides "rsync," which is one of the best tools for the job, and it works on most (all?) Unix-based platforms as well as Windows (using Cygwin). With rsync, you should definitely look into the following options:

--exclude (exclude file name patterns from being backed up. You don't really need your web cache or other temporary files backed up)

--backup and --backup-dir=dirname (allow rsync to do incremental backups. Use a different "dirname" every day, like "/Volumes/USB Drive/Incremental/2006/12/16" for files that were modified on 16-Dec-2006. This way, you can have backups of works in progress... I can't tell you how many times these incrementals came in handy! You should also have a periodic process prune the incrementals, as they start to add up.)

The nice thing is that rsync can work over a network, which means that you can have it transfer your data to a separate machine. Thus, it's possible to have off-site backup in case of disaster.

As far as other utilities are concerned, they probably work, and might be more suitable for you, since "rsync" is definitely command-line oriented (and scriptable).

Good luck, and here's an "atta boy!" for even thinking about backup solutions... most people don't until it's too late!

Deja vu of course (1)

Pliep (880962) | more than 7 years ago | (#17268990)

I always use Deja Vu and always keep returning to it: []

It uses psync (like rsync but with resource forks etc.) and is generally brilliant. I simply create an incremental duplicate of my entire hard drive to an equally sized other hard drive every day at 6 PM.

Silverkeeper (1)

jcull (789506) | more than 7 years ago | (#17269172)

Silverkeeper works great - I have it set up for most of the computers I support at work that don't use University network backups, and it's totally automated and will keep multiple copies on the external hard drive. A good review of it is on MacOSX Hints - 421082847552 []

ChronoSync (2, Informative)

varontron (460254) | more than 7 years ago | (#17269218)

from EconTechnologies [] is my choice. It's easy to use, and supports archiving and unattended operation. That's pretty much all I need. I back up my home folder with all my shtuff, and /usr/local where I have data and config files. Everything else in my world is downloadable, configurable, or forgotten. If I lose my hard drive once a year, I'll spend less time rebuilding than I would searching for and configuring a more advanced backup package.

atempo time navigator (1)

gearwhore (690744) | more than 7 years ago | (#17269554)

I've been using time navigator for about 6 months now, very good enterprise backup and it really maximized the amount of tape and disk storage i have. my main complaint is there is no "bare metal" restore for osx, which would be nice for disaster recovery

I wholeheartledly recommend Superduper (0)

Anonymous Coward | more than 7 years ago | (#17269574)

It has been best at preserving metadata. Here's my FAQ on backing up Mac OS X: []

Not practical for home use, but (1)

wax66 (736535) | more than 7 years ago | (#17271144)

For business and large organization use, a large government agency I used to work for first tried Retrospect. The whatever-it-is industrial strength version. We didn't have a very easy time of things, mostly minor technical issues that constantly plagued us, not generally with restoring, usually just with backups timing out, taking forever, jamming, etc.

We were finally given the go-ahead to try something new, and since our backup guy had been checking out Tivoli Storage Manager and really liked it, we gave that a shot. I don't think I've ever had a smoother backup solution that worked for both our 800 Macs and our 900 PCs.

I've used NetBackup, Tivoli, and Retrospect, though I've never used NetBackup for Macs (we do at my current employer, but I've never touched it).

There's probably plenty of other good commercial backup solutions, but I have to admit that I was impressed. Things may have changed, since that was back in the 10.2 days, but who knows.

Built-in tools do just fine (1)

mindbooger (650932) | more than 7 years ago | (#17271390)

Boot from an alternate volume (like a Tiger install DVD -- after you select the language, there's a "utilities" menu or something where you can run terminal and disk utility), and use "asr -source $SOURCEVOL -target $TARGETVOL -erase" where $SOURCEVOL is your boot drive and $TARGETVOL is a sparse diskimage on a Firewire or USB disk.

It's fast, it's a pure copy, and it doesn't modify the access times of files on $SOURCEVOL. Make sure you're booted from a different volume and you use the "-erase" flag, though, or it can't unmount $SOURCEVOL and does a file-level copy instead. It still works, but it's a lot slower (and I'm not sure if it's modifying last-access times or not).

Technically, you can do the same thing with Disk Utility, I think, but I've been using asr.

`man asr` for the gory details.

Retrospect if you're serious (1)

Version6 (44041) | more than 7 years ago | (#17271426)

The only backups that do any good are the ones that you actually make. I've tried various things that require doing by hand, going back to mounting 9 track tapes on PDP-11 tape drives in the seventies. I've made floppy, CD-ROM and DVD-RW backups on an intermittent basis. In general, when the time came for disaster recovery, the backups were too old to do any good. Unless you are far more conscientious than I am, completely automatic is the way to go.

I use Retrospect 6.1 to run nightly incremental backups of four Macs to two separate Linux machines on alternate nights (RAID 1 on the cheap, I guess). I do have to check every week or two to be sure that Retrospect hasn't wedged. The user interface is way too complicated, but I've come to terms with it over the last six or seven years. I used to have full backups scheduled, but they take quite a long time and frequently either wedged or took so long that they interfered with the other backups. I don't have super-regular "off-site" backup, but my house hasn't burned down yet. (We all use laptops as our main computers, so I do keep an external drive at our vacation home and run backups when we take our laptops out there.)

I've not only recovered individual files accidentally deleted or damaged but recovered from three complete drive meltdowns with no loss of work in two cases and only a few hours' worth in the third, so I consider it worth the time and money.

In summary, if you don't actually do the backups, you're toast. If your work or your time is worth money, you can't afford not to do backups, and you shouldn't hesitate to spend some money. And finally, if you haven't done a restore from your backups, it is practically guaranteed that they don't work.

Re:Retrospect if you're serious (1)

holeinone (750622) | more than 7 years ago | (#17275688)

Just thought I'd voice my vote against Retrospect. I had had Retrospect since v4. Backed up some files from a research project onto CDs from my Mac 7100 603E PPC computer (200 MHz!). Lo and behold, last year (about 10 years after the project had been finished) someone writes and asks about the data. No problem, I said. Pulled out backup CD and fired up Retrospect. It complained that it couldn't restore because it couldn't read the catalog (the exact error, I don't recall). The disk was fine. It mounted and disk utilities didn't complain about anything. Thought that maybe the newest version would somehow help. Bought the latest version online only to have the same problem. Searched through all my old backups and thought I'd struck paydirt - found an old disk image that happened to have the same backups - alas, Retrospect coughed up the same error. I tried everything but there just wasn't any recovering those files. Luckily, I found some of the data files on an old zip disk (a miracle that thing still worked).

No, for me Retrospect had its chance and lost it. Since I'm dissing on it anyway, let me also complain about its infuriating interface. Nothing is intuitive - I seem to just go around and around in circles.

I currently pay for dot-mac due to the iphoto->web page simplicity (for the grandparents) so am using Backup. It has worked for the odd file here or there - have never had a catastrophic event requiring a full restore. Guess I should test that one of these days, your advice in that regard is right on. The thing that saved me was having multiple backups in different formats/media - enough redundancy and hopefully you're safe.

Re:Retrospect if you're serious (NOT!) (1)

donmontalvo (652999) | more than 7 years ago | (#17279592)

since osx was released, retrospect has fallen behind more and more. aside from the fact the mac application is a cheezy port of an ok os9 backup application, the fundamental design is wrong...for both os9 and osx. retrospect layers incrementals onto the full backup file. if this one full/incremental file gets corrupt, you lose everything. osx savvy backup applications (like atempo, bru and netbackup) have built in disk-to-disk-to-tape functionality and incrementals are stored as separate files. retrospect (like quarkxpress and to a certain extent eudora) is a cash cow. marketing hype sells retrospect - but common sense turns people away to better designed applications. i know plenty of mac sysadmins (myself included) who have had failed restores (and/or failed backups) with retrospect. retrospect is on a downward spiral, better bail now (or move to the windows version that's at 7.5.x now). don't wait until you get caught with your pants down. move away from retrospect if your livelihood depends on data protection - or if your clients depend on you. don

Try Backuplist+ (1)

cellocgw (617879) | more than 7 years ago | (#17271792)

Backuplist+, available at the usual search engines in your neighborhood :-), is a pretty nice gui frontend to several backup options. You can select Finder-like copy/backup, rsync, ditto, and cpio IIRC. It's free/donationware, so I'd recommend at least reading thru the documentation to see if it'll do all you need.

I have used Impression and SuperDuper (1)

davebarnes (158106) | more than 7 years ago | (#17271960)

I used Impression for over a year and really liked it. I used it because it did verification of the data written and this feature was very important to me. Unfortunately, Impression became an orphan and I switched to SuperDuper.

Fortunately, Impression has a new parent and is no longer an orphan. You can buy it at [] .

I am sticking with SuperDuper for now as it is extremely easy to use.

Both programs back up to standard Mac files so retrieval is not dependent upon any special software.

Re:I have used Impression and SuperDuper (0)

Anonymous Coward | more than 7 years ago | (#17272648)

I can second the vote for Impression. I've used it for quite a while. It's one of the only OSX tools which will back up across multiple CDs besides Retrospect, and Impression uses standard OSX file structures.

Amanda or other UNIX backup software (1)

argent (18001) | more than 7 years ago | (#17272124)

I like Amanda because it's completely automated once you set it up... you don't have to keep track of whether you're doing full or incremental dumps. But there are all kinds of alternatives, and any UNIX backup software that can be configured to use hfstar or hfspax to catch the extra HFS+ attributes can accommodate a Mac as well.

tolisgroup bru le (1)

donmontalvo (652999) | more than 7 years ago | (#17273210)

retrospect is no longer a viable option for mac. many people have moved to tolisgroup bru (server or "le" for local backups). the unix toolset has been around for many years and is used in many mission critical environments. the mac osx gui is the result of a formal request from nasa for a mac version. :) [] we have bru server deployed at many locations. it does disk-to-disk-to-tape backups easily (d2d2t). incrementals are individual stage files (unlike retrospect, which keeps jamming incrementals into the same growing file that inevitably corrupts/implodes). the gui is maturing...and tolisgroup is the kind of company that follows the old unix rule: each tool does one task and does it well. bru does backups well...and restores well. don't expect any marketing hype from them, just solid, dependable backups AND restores. don montalvo, nyc curmudgeon at large

Back-Up for Mac OSX (0)

Anonymous Coward | more than 7 years ago | (#17273842)

This one is the best. Easiest to use. Well designed. It just gets it done. []

Déjà Vu - For my money !! (1)

zucom (830731) | more than 7 years ago | (#17276714)

For my money, Déjà Vu has done it quick, dirty, and simple. I've seen this run for 3 years without being touched and with no issues. Not really an enterprise solution, but great for home & professional users who don't want to shell-script things.
-ZuCom

My rsync setting for backup works great! (0)

Anonymous Coward | more than 7 years ago | (#17278962)

I have rsync running from cron. Works fantastically!

rsync -a -v /Users/user/ /Volumes/Brain/Backup/ --exclude "*.cache" --exclude "Cache/" --exclude "Logs/" --exclude "Parallels/" --exclude "Temporary Folder/" --exclude ".Trash/" --exclude "Acrobat User Data/" --delete >/BackupLog/rsyncBackup.log

As you can see above, I exclude some things, including the Parallels stuff (which would take way too long to copy over), anything in the trash, and a couple of other items. So far as I can tell, everything copies fine and is a complete duplicate, including resource forks, etc. I do not copy applications or other items outside of the ~ folder, though. That seems a waste of time/space, since you can just download / re-install the apps if necessary.

The settings above will copy everything from the user directory (library, documents, etc.) to my backup HD (Brain, in this case) in "archive" mode, which copies pretty much all the attributes of a file.
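For reference, wiring the parent's command into a nightly cron run just takes one crontab line (the paths below are the ones from the command above; adjust for your own setup). Note that cron silently skips jobs if the machine is asleep at the scheduled time, which is why some posters prefer launchd or anacron on a laptop:

```shell
# Install with `crontab -e`. Fields: minute hour day month weekday command.
# Runs the backup every night at 2:30 AM and captures errors in the log.
30 2 * * * rsync -a --delete --exclude ".Trash/" /Users/user/ /Volumes/Brain/Backup/ >/BackupLog/rsyncBackup.log 2>&1
```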

For those who aren't familiar with rsync, the --delete option deletes items from the backup that were deleted from the original. If you want poor man's versioning, you can remove that switch.

.Mac? (1)

umbrellasd (876984) | more than 7 years ago | (#17282018)

Can't you just set up the folders that you want to sync with .Mac and store the files over there? I guess if you have gigs and gigs, you probably aren't backing that up every month anyway. Better to burn all that pr0n...rather, legally purchased iPr0n...rather, legally purchased iTunes content and your family photos and movies onto DVDs for your archives. Really depends what you are backing up, how much of it there is, how frequently and quickly you need to access it, and if you care who sees it.