Hi all!
I'm publishing my projects to GitHub using git-gui. Everything works fine as long as I have only a single local folder synced to the GitHub master branch.
As easy and convenient as it looked at first glance, it became a nightmare once I needed to work on the same code from several different machines (at home and at work). I'd be very grateful for a useful Git how-to for everyday use: how do I download, upload, and check code versions when several local folders hold copies of the same repository?
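For concreteness, here is a sketch of the everyday round trip on the command line (assuming the remote is named origin and the branch is master, which are the defaults for a GitHub clone; the URL and project name are placeholders):

```shell
# Download a repository for the first time (creates a folder named my-project):
git clone https://github.com/<user>/my-project.git

# Check which versions exist and what state the working copy is in:
git log --oneline        # list commits, newest first
git status               # show staged and unstaged changes

# Upload local commits to GitHub:
git push origin master

# Download commits that were made elsewhere:
git pull origin master
```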
Specifically, what I need right now:
Is there any way to discard all local changes and fetch the current version from the online source? I'm really tired of re-cloning the repository a few times a week just to upload a minor change to GitHub.
For example: I've modified the code and copied it into Git's folder, but that clone was created from a previous version of the code, so it stages very strange changes (even in files I haven't modified since that version).
There is Remote > Fetch and Merge > Local Merge in git-gui, but when 'local' and 'remote' have diverged it raises a lot of strange errors and does nothing.
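If I understand the docs correctly, the "throw everything away and match GitHub" operation I'm looking for would be something like the following (a sketch assuming the remote is called origin and the branch is master; note that reset --hard destroys uncommitted work):

```shell
git fetch origin                 # download the latest history from GitHub
git reset --hard origin/master   # make the current branch and working tree match it
git clean -fd                    # also delete untracked files and folders
```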
Simply put, my current workflow is:
1. I write code at work.
2. I clone the Git repository into a separate folder.
3. I copy the code into this folder.
4. I upload the code to GitHub.
5. At home, I clone the repository again.
6. I make changes and upload them to GitHub.
7. Back at work the next day, I make changes to the code.
8. I copy the code into yesterday's folder (which now contains an outdated version of the code).
9. But I can't upload it to GitHub: it stages changes I didn't make, even in binary files.
10. I delete the folder's contents and try to fetch from the remote, but I only get errors.
11. So... I just delete the folder completely, clone the remote repository again, and copy the new code over it. Now all changes stage fine. But I'll have to repeat the same procedure at home in the evening...
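From what I've read, the whole copy-into-a-fresh-clone dance shouldn't be necessary: the intended workflow seems to be to edit the files inside the clone itself and let Git carry the changes back and forth. A sketch, again assuming a remote named origin and a branch named master:

```shell
# At work: edit files directly inside the cloned folder, then
git add -A                        # stage everything that changed
git commit -m "describe the change"
git push origin master            # upload to GitHub

# At home, in a clone made once and kept:
git pull origin master            # bring down today's commits
# ...edit, commit, and push exactly as above...
```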