tyggerjai ([personal profile] tyggerjai) wrote 2011-08-10 11:58 pm

One for the nerds....

So I now have 2 or 3 machines I use for tinkering with my little projects, and I'd like to keep the code synched between them. There are many tools for this, of course, and I don't want to start a religious war about which tool I should use - I'm after advice on how to use them best :)

Assume, for the sake of argument, that I have access to git and/or subversion. Assume, for the sake of argument, that I have a ~/Code directory, which has, of course, a Python directory and a Perl directory, which then contain project directories and library directories.

Assuming, for the moment, that I care less about keeping "broken" code out of the repo than about always having the latest version available - I want to be able to half-finish something on the desktop, commit it to the repo, and check it out on the laptop to keep tinkering - is my best strategy to commit ~/Code en masse and then just update en masse?

Or should I commit each project individually to the repo, so I can branch later if need be?
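Concretely, here's roughly what I'm picturing for the two options with git (the remote host and project names are made up, and I'm assuming bare repos already exist wherever "somehost" is - svn would presumably look much the same with one repository or several):

    # Option A: one repo for the whole tree
    cd ~/Code
    git init
    git add .
    git commit -m "initial import"
    git remote add origin user@somehost:code.git
    git push -u origin master

    # ...then on the laptop (which doesn't have ~/Code yet):
    git clone user@somehost:code.git ~/Code
    # day to day: commit whatever half-finished state exists, push, pull.

    # Option B: one repo per project instead
    cd ~/Code/Python/someproject
    git init && git add . && git commit -m "initial import"
    git remote add origin user@somehost:someproject.git
    git push -u origin master
    # ...repeat per project, and clone each one separately on the laptop

As far as I can tell, option A is a single clone and pull per machine, while option B means a clone per project but keeps each project's history (and any future branches) separate.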

[personal profile] ideological_cuddle 2011-08-12 05:03 am (UTC)
Yeah, in that instance you'd either do your builds in a different directory (which is something the gcc people like to do, if memory serves) or use selective sync. I think the former is probably the cleaner solution; you're already going to have to make sure that your binary files go into a subdirectory rather than sitting alongside the source anyway...
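The out-of-tree pattern is roughly this for a gcc/autotools-style project - the directory and project names here are just illustrative:

    # build in a separate "objdir" so the source tree stays pristine
    mkdir -p ~/build/myproject
    cd ~/build/myproject
    ~/Code/myproject/configure --prefix=$HOME/local
    make

    # and/or tell the VCS to ignore a build subdirectory outright
    # (git shown; svn uses the svn:ignore property instead)
    echo "build/" >> ~/Code/myproject/.gitignore

Either way the point is the same: keep generated binaries out of what gets committed or synced.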

I like that Dropbox will also sync across the LAN if it can, so it takes care of both the remote backup and the rsync-like cases. At this point anything I care about winds up in TM (Time Machine), in Dropbox, and on a second machine via Dropbox-over-LAN. I feel *reasonably* safe.
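For anything that lives outside Dropbox, the plain rsync equivalent of that LAN sync is just something like this (the hostname is whatever your other machine answers to):

    # mirror ~/Code onto the laptop over the LAN; --delete removes files
    # that no longer exist on this side, so use it with care
    rsync -av --delete ~/Code/ laptop.local:Code/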