
List:       git
Subject:    Re: Git ~unusable on slow lines :,'C
From:       Marcel Partap <mpartap@gmx.net>
Date:       2012-10-09 14:06:11
Message-ID: 50742F53.3050205@gmx.net

>> Bam, the server kicked me off after taking too long to sync my copy.
> This is unrelated to git. The HTTP server's configuration is too
> impatient.
Yes. How does that mean it is unrelated to git?

>> - git fetch should show the total amount of data it is about to
>> transfer!
> It can't, because it doesn't know.
The server side doesn't know how much the objects *it just repacked
for transfer* weigh?
If that truly is the case, wouldn't it make sense to make git a little
more introspective? For example:
> # git info git://foo.org/bar.git
> .. [server generating figures] ..
> URL: git://foo.org/bar.git
> Created/Earliest commit: ...
> Last modified/Latest commit: ...
> Total object count: .... (..commits, ..files, .. directories)
> Total repository size (compressed): ... MiB
> Branches:
> [git branch -va] + branch size
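
(As far as I know nothing like "git info" exists today; the closest I have
found are the rough approximations below. The first one only lists refs and
transfers no objects; the other two obviously need shell access on the
server, and foo.org/bar.git is just the placeholder from above.)

  git ls-remote git://foo.org/bar.git    # client side: list refs without fetching objects

  # server side, inside the repository:
  git count-objects -v                   # object count and pack size (size-pack is in KiB)
  git rev-list --all --count             # total number of commits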

> The error message doesn't really know whether it is going to overwrite
> it (the CR comes from the server), though I suppose an extra LF wouldn't
> hurt there.
Definitely wouldn't hurt.

>> - would be nice to be able to tell git fetch to get the next chunk of
>> say 500 commits instead of trying to receive ALL commits, then b0rking
>> after umpteen percent on server timeout. Not?
> You asked for the current state of the repository, and that's what it's
> giving you.
Instead, I would like to be able to ask for just the next 500 commits.
There is currently no way to do that.
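
(The closest approximation I know of is a shallow clone that is deepened in
steps. It counts depth from the branch tips instead of handing me "the next
500 commits", and it only works if the server supports shallow fetches over
the transport in use, but roughly:)

  git clone --depth=1 git://foo.org/bar.git   # tip commits only
  cd bar
  git fetch --depth=500                       # deepen in chunks...
  git fetch --depth=1000                      # ...until the whole history is local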

> The timeout has nothing to do with git, if you can't
> convince the admins to increase it, you can try using another transport
> which doesn't suffer from HTTP, as it's most likely an anti-DoS measure.
See, I probably can't convince the admins to drop their anti-DoS measures.
And they (the drupal.org admins) probably will not change their allowed
protocol policies either.
Besides, I've had timeouts or simply stale connections die on me before
with other repositories and various transport modes.
The easiest fix would be an option to tell git to not fetch everything...

> If you want to download it bit by bit, you can tell fetch to download
> particular tags.
..and I would like to be able to do that without having to pick out
specific tags by hand.
Browsing gitweb sites to find a tag for which the fetch doesn't time out
is hugely inconvenient, especially on a slow line.
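
(The best I can do right now seems to be listing the tags from the command
line and fetching them one at a time; "sometag" below is a made-up name.)

  git ls-remote --tags git://foo.org/bar.git   # list tags without touching gitweb
  git fetch git://foo.org/bar.git refs/tags/sometag:refs/tags/sometag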

> Doing this automatically for this would be working
> around a configuration issue for a particular server, which is generally
> better fixed in other ways.
It is not only a configuration issue for one particular server. Git in
general is hardly usable on slow lines because
- it doesn't show the volume of data that is about to be downloaded!
- it doesn't let the user sync up in steps small enough to succeed under
the given circumstances (one possible workaround is sketched below).
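
(One workaround that at least survives dropped connections is a bundle file
that can be fetched with a resumable downloader and cloned from locally. It
assumes someone with repository access or a fast link is willing to produce
and host the file; the download URL below is made up.)

  # on a machine with a good connection:
  git bundle create bar.bundle --all

  # on the slow line:
  wget -c http://example.org/bar.bundle            # -c resumes interrupted downloads
  git clone -b master bar.bundle bar               # assuming the main branch is "master"
  cd bar
  git remote set-url origin git://foo.org/bar.git  # point origin back at the real repo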

#Regards!Marcel.
--
To unsubscribe from this list: send the line "unsubscribe git" in
the body of a message to majordomo@vger.kernel.org
More majordomo info at  http://vger.kernel.org/majordomo-info.html
