Concurrency in a Git repository on a network share

I want to have a bare Git repository stored on a network share (Windows). I am using Linux and have that share mounted via CIFS. My colleague uses Windows XP and has the share (via Active Directory, somehow) mapped as a network drive.

I wonder if we can use the repo from both computers without concurrency problems.

I already tested, and at my end I can clone normally, but I'm afraid of what might happen if we both work on the same repo (push / pull).

The Git FAQ mentions using network file systems (and some problems with SMBFS), but I'm not sure whether any file locking is done by the network / server / Windows / Linux - I'm fairly sure there isn't.

So, has anyone used a Git repository on a network share, without a server, and had no problems?

Thanks,
Alex

PS: I want to avoid using an http server (or git-daemon) because I do not have access to the server hosting the share. Also, I know that we could just push / pull from each other, but we need a copy of the repo on the share for backup reasons.

Update:

My concern is not about the possibility of network failure. Either way, we would both have the branches we need locally, so we would be able to recover our sources.

But we usually commit quite often and frequently need rebasing / merging. From my point of view, the best option would be a central repo on the share (so backups are taken care of), which we would both clone from and use for rebasing.

But, since we would do this often, I am afraid of file / repo corruption if we both push / pull at the same time. Sure, we could yell at each other every time we access the remote repo :), but it would be better if the computers / network protected against it.

It is possible that Git has an internal mechanism for this (since somebody can push to one of your repos while you are working in it), but I have not found anything conclusive yet.

Update 2:

A repo on a shared drive would be a bare repository without a working copy.
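
For reference, this is roughly the setup I have in mind (the paths are just examples - my end of the share is mounted at /mnt/share, my colleague's is mapped as a drive letter):

    # create the bare central repo directly on the mounted share
    git init --bare /mnt/share/project.git

    # each of us clones it locally and works as usual
    git clone /mnt/share/project.git ~/work/project
    cd ~/work/project
    # ... edit, commit ...
    git push origin master

    # on the Windows side the clone would come from the mapped drive instead,
    # e.g.  git clone Z:/project.git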

+43
git linux windows concurrency
Apr 15 '09 at 8:21
4 answers

Git requires minimal file locking, and file locking is, I believe, the main cause of problems when using this kind of share over a network file system. The reason Git can get away with so little of it is that most of the files in a Git repository - everything that makes up the object database - are named after a digest of their contents and are immutable once created. So the problem of two clients trying to use the same file for different content does not come up.

The other part of the repository is trickier: the refs are stored in files under the "refs" directory (or in "packed-refs"), and these do change - although the refs/* files are small, they are always rewritten rather than edited in place. In that case, Git writes the new ref to a temporary ".lock" file and then renames it over the target file. If the file system respects O_EXCL semantics, that is safe. Even if it does not, the worst that could happen is a race overwriting a ref file. Although that would be annoying to run into, it should not lead to corruption as such: it might just be that you push to the shared repo and the push looks as if it succeeded, while in fact someone else's did. But that could be sorted out simply by pulling (merging in the other commits) and pushing again.
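
To make that concrete, here is a rough shell sketch of the lock-and-rename pattern described above - it only illustrates the idea, it is not Git's actual implementation, and the repo path and ref name are made up:

    cd /mnt/share/project.git            # example path for the shared bare repo
    set -o noclobber                     # make '>' fail if the target exists (O_EXCL-style create)

    new_sha1=$(git rev-parse HEAD)       # placeholder for the commit id a push wants to write
    if { echo "$new_sha1" > refs/heads/master.lock; } 2>/dev/null; then
        # we own the lock file; rename it over the real ref in one atomic step
        mv refs/heads/master.lock refs/heads/master
    else
        echo "refs/heads/master is locked by another process" >&2
    fi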

All in all, I don't think repo corruption is much of a problem here - it's true that things can get a bit confused due to locking issues, but the design of a Git repo minimizes the damage.

(Disclaimer: this all sounds good in theory, but I haven't done any concurrent hammering of a repo to test it, and I only share repos over NFS, not CIFS.)

+38
Apr 15 '09 at 10:10

Why bother? Git is distributed. Just create a repository on each machine and use push and pull to move your changes between them.

For backup purposes, run a nightly task to copy the repository to the share.

Or, create a repository each on the share and work from those, treating them as distributed repositories that you pull revisions between. If you use this method, the performance of builds etc. will suffer, since you will constantly be going over the network.

Or, keep distributed repositories on your own machines and run a periodic task to push your commits to the repositories on the share.
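
A minimal sketch of that last option, assuming (paths made up) the working repo is in ~/work/project and the shared bare repo is /mnt/share/project.git:

    # one-time setup: add the bare repo on the share as a backup remote
    cd ~/work/project
    git remote add backup /mnt/share/project.git

    # crontab entry: push all local branches to the share every night at 02:00
    0 2 * * * cd $HOME/work/project && git push backup --all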

+7
Apr 15 '09 at 8:25

Using a central Git repository does seem to be a supported setup. But most of the prescribed usages point to ssh or http access, neither of which allows concurrent access to the repo. Even with fully distributed usage, this question comes up as soon as more than two collaborators push to the same repo. So far the answers have not addressed it: does Git's design allow N people to push branches at the same time?
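
One way to probe this empirically (a rough test with made-up paths and branch names) would be to have two clones push different branches into the same shared bare repo at the same time and then check its integrity:

    # from two separate clones of the shared repo, push concurrently
    (cd ~/clone-a && git push origin branch-a) &
    (cd ~/clone-b && git push origin branch-b) &
    wait

    # then verify the shared repo is still consistent
    cd /mnt/share/project.git
    git fsck --full
    git for-each-ref                     # both branch-a and branch-b should show up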

+5
Nov 10 '09 at 17:20

It sounds like what you really want is a centralized version control system, which would take care of the backup requirement - perhaps with xxx2git in between so that you can still work locally.

-2
Apr 15 '09 at 8:29


