One for the nerds....
Aug. 10th, 2011 11:58 pm
So I now have 2 or 3 machines I use for tinkering with the little projects I have, and I'd like to keep the code synced. There are many tools for this, of course, and I don't want to start a religious war about which tool I should use; rather, I'm seeking advice on how to use them best :)
Assume, for the sake of argument, that I have access to git and/or subversion. Assume, for the sake of argument, that I have a ~/Code directory, which has, of course, a Python directory and a Perl directory, which then contain project directories and library directories.
Assuming for the moment that I don't care so much about synching "broken" code - in that I want to be able to half-finish something on the desktop, commit it to the repo, and check it out on the laptop to keep tinkering - as much as I care about the latest version, is my best strategy to commit ~/Code en masse and then just update en masse?
Or should I commit each project individually to the repo, so I can branch later if need be?
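For what it's worth, the per-project option might be sketched like this (all paths and the "myproject" name are made up for illustration; in real use the bare "hub" repo would sit on a server reachable from every machine, e.g. over ssh):

```shell
#!/bin/sh
# Rough sketch: one git repo per project, plus a shared bare "hub" repo
# that every machine pushes to and pulls from. PROJECT and HUB are
# example paths only.
PROJECT=${PROJECT:-"$HOME/Code/Python/myproject"}
HUB=${HUB:-"$HOME/repos/myproject.git"}

mkdir -p "$(dirname "$HUB")"
git init --bare "$HUB"                 # create the shared hub once
mkdir -p "$PROJECT"
cd "$PROJECT"
git init
git add -A
# Inline identity so the sketch runs on a fresh machine; --allow-empty
# so it works even before the project has any files.
git -c user.name="You" -c user.email="you@example.com" \
    commit -m "initial import" --allow-empty
git remote add origin "$HUB"
git push -u origin HEAD                # other machines then clone "$HUB"
```

On the laptop you'd then `git clone` the hub once and just `git pull`/`git push` as you move between machines.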
(no subject)
Date: 2011-08-11 02:54 am (UTC)

That said, to solve the synchronisation of *working* space problem, you *could* put the whole thing inside your Dropbox folder. That isn't so good for large projects involving binary builds, but for scripting language projects, it's great. And then you don't have to worry about forgetting to commit code (and you should *also* be using a code repository, but it then fulfils its correct function - keeping a history of your code, as opposed to being misused for a function it's not designed for - keeping directories in sync).
(no subject)
Date: 2011-08-11 03:29 am (UTC)

If you want to keep directories in sync across different machines, put them inside Dropbox. There are other solutions, but I guarantee you Dropbox is the simplest, quickest, and least prone to error, by a very very long way. Just the dev directories, if you like, since presumably the stable branches shouldn't contain anything except correct stable versions.
(no subject)
Date: 2011-08-12 04:52 am (UTC)

Dropbox treats that symlink as though it were a real directory so far as the other client machines go, so anywhere I don't explicitly exclude ~/Dropbox/Music from sync, it'll just pull all the music across as a directory rather than as a symlink.
This may be useful if for some reason you really want stuff in random directories in different places but still want Dropbox to sync it all.
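If it helps, the symlink arrangement described here can be sketched like this (paths are just examples, and "Music" stands in for whatever you keep outside the Dropbox folder):

```shell
#!/bin/sh
# Sketch of the symlink trick: the real data lives outside the Dropbox
# folder, and Dropbox syncs through a symlink to it. Example paths only.
DROPBOX=${DROPBOX:-"$HOME/Dropbox"}
mkdir -p "$DROPBOX" "$HOME/Music"
# -n so an existing symlink-to-directory is replaced, not descended into.
ln -sfn "$HOME/Music" "$DROPBOX/Music"  # Dropbox follows this as a real dir
# On machines that should NOT pull the music, exclude the Dropbox/Music
# folder via the client's Selective Sync preferences instead.
```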
(no subject)
Date: 2011-08-11 07:52 am (UTC)

Still not as good as Dropbox because you have to remember to push/pull/run-script-to-sync, but not everything fits in Dropbox. :-)
(no subject)
Date: 2011-08-12 04:53 am (UTC)

It does if you give them (enough) money. ;)
I mainly use Dropbox as an offsite backup of my music collection, so I've paid for a 100GB account. It works pretty well in that role, and I haven't had to think about it at all since it was set up; it just syncs automatically.
(no subject)
Date: 2011-08-12 05:00 am (UTC)

I'm more thinking that if you're compiling a project that involves regularly creating and destroying hundreds of MB of libraries, or editing multi-GB media files, Dropbox is not going to help you much... if only because sending all that crap up to .us even once (assuming the other machines are local, so they do LAN sync) is going to take forever. De-duplication may or may not help, but I wouldn't like to bet on it. :-)
Personally I'm very happy with http://crashplan.com/ as my emergency cloud backup provider and I really hope I never have to do a catastrophic restore (of all machines), but it's there just in case. I did do a full crashplan restore of my linux VM when that died horribly, but that was a mere 13GB. :-)
Dropbox I'm keeping just for "a small amount of stuff I like to sync", so the free GB is fine for me so far. I'd be happy to pony up for a bit more if I needed it though.
(no subject)
Date: 2011-08-12 05:03 am (UTC)

I like that Dropbox will also sync across the LAN if it can, so it takes care of both the remote backup and the rsync-like cases. At this point anything I care about winds up in TM, on Dropbox, and synced to a second machine via Dropbox-over-LAN. I feel *reasonably* safe.
(no subject)
Date: 2011-08-11 04:31 am (UTC)

I've tried both models, and have found that the one-big-blob approach quickly gets frustrating because you have leftover commits all over the place (even though I started out intending to be okay with broken commits, in practice it didn't work out!). It also makes it difficult to share just one project with a friend at a future date.
With the multiple-repos approach, things are nicely separated, and even when there's a broken commit it's easy to pick it back up the next time you're working on that section of code: just look at the log for head/tip/whatever. The downside is the chance that you'll forget to sync one repo at the end of a coding session before you move on to another. Perhaps you could write a script that goes over and syncs everything just before you log out?
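A minimal sketch of that logout-time sync script, assuming git and a ~/Code layout like the one in the post (the function name and the "WIP sync" commit message are made up):

```shell
#!/bin/sh
# Sketch: walk every git repo one or two levels under a code root,
# commit any local changes as a timestamped WIP commit, and push.
sync_all_repos() {
    root=${1:-"$HOME/Code"}
    for gitdir in "$root"/*/.git "$root"/*/*/.git; do
        [ -d "$gitdir" ] || continue        # skip non-matching globs
        repo=$(dirname "$gitdir")
        echo "syncing $repo"
        # Only commit if there is actually something to commit:
        if [ -n "$(git -C "$repo" status --porcelain)" ]; then
            git -C "$repo" add -A
            git -C "$repo" commit -m "WIP sync $(date +%F-%H%M)"
        fi
        git -C "$repo" push 2>/dev/null || echo "  (no remote, or push failed)"
    done
}
```

You could then call `sync_all_repos ~/Code` from a logout hook (e.g. ~/.bash_logout) so nothing gets left behind on one machine.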