Posts: 3,841 | Thanked: 1,079 times | Joined on Nov 2006
#9
When I develop code on more than one computer, I use 'git' (distributed version control) to sync them. I make each module a git repo.
Say I start with some Debian package and unpack the source. Then I cd into the top directory of the unpacked source and do
Code:
git init
git add .
git commit -m "Put version x.y of abcd under git"
Then I hack away, using 'git' (git add, git commit, etc.) to keep my work under version control. This even lets me import newer versions of what I started with and still keep my changes.
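As a rough sketch, the everyday cycle and one possible way of bringing in a newer upstream release look like this (the file name, the 'upstream' branch name and the placeholder for the first import commit are just examples, not exact commands from the steps above):
Code:
# everyday work
git add somefile.c
git commit -m "Describe the change"

# one way to import a newer upstream version while keeping local changes:
git checkout -b upstream <commit-of-the-original-import>
# ...unpack the new upstream tarball over the working tree here...
git add -A                # stage new, modified and deleted files
git commit -m "Import the new upstream version"
git checkout master
git merge upstream        # local commits on master are preserved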

Then, when I want to work on the other (or third) computer:
Code:
git clone user@machine:path-to-where-the.git-dir-is
The only extra requirement is an ssh server on the first machine (plus git installed on both).
Then I can hack away on the second computer. If I have instead done more work on the first computer, I can just run this on the second computer:
Code:
git pull
If I do work on the second computer, I have to tell git where to pull from when I pull the changes back to the first one:
Code:
git pull user@machine:path-to-where-the.git-dir-is
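A small convenience, not part of the steps above (the remote name 'otherbox' is just an example): the second machine can be registered as a named remote on the first computer, so the URL does not have to be typed every time:
Code:
git remote add otherbox user@machine:path-to-where-the.git-dir-is
git pull otherbox master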
And so on. Google for git documentation; there are good tutorials out there. I find it much faster, simpler and more flexible (and it takes less space) than messing about with SVN repositories and the like.
__________________
N800/OS2007|N900/Maemo5
-- Metalayer-crawler delenda est.
-- Current state: Fed up with everything MeeGo.