It turns out that not only did git-daemon do DWIM, but git-upload-pack
does as well.  This is bad; security checks have to be performed *after*
canonicalization, not before.

Additionally, the current git-daemon can be trivially DoSed by spewing
SYNs at the target port.

This patch adds a --strict option to git-upload-pack to disable all
DWIM, a --timeout option to git-daemon and git-upload-pack, and an
--init-timeout option to git-daemon (which is typically set to a much
lower value, since the initial request should come immediately from the
client).
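
As a rough sketch of how the new options fit together (the timeout
values here are made up for illustration, not taken from the patch),
a standalone daemon and a manual upload-pack invocation might look like:

	git-daemon --init-timeout=10 --timeout=600
	git-upload-pack --strict --timeout=600 /path/to/repo.git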
Signed-off-by: H. Peter Anvin <hpa@zytor.com>
Signed-off-by: Junio C Hamano <junkio@cox.net>

Cloning from a repository with more than 256 refs (heads and tags
included) will choke, because upload-pack has a built-in limit of
feeding at most MAX_NEEDS (currently 256) heads to the underlying
git-rev-list.  This is a problem when cloning a repository with many
tags, like http://www.linux-mips.org/pub/scm/linux.git, which has 290+
tags.

This commit introduces a new flag, --all, to git-rev-list, to include
all refs in the repository.  Updated upload-pack detects requests that
ask for more than MAX_NEEDS refs, and sends everything back instead.

We may want to tweak the definitions of MAX_NEEDS and MAX_HAS, but
that is a separate topic.
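
For illustration (the ref names below are hypothetical), the new flag
walks every ref instead of only the ones named on the command line:

	git-rev-list --all                  # every ref, heads and tags alike
	git-rev-list master v2.6.12-rc2     # only the explicitly named refs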
Signed-off-by: Junio C Hamano <junkio@cox.net>

Solaris 8 doesn't have the newer unsetenv() and setenv() functions, so
replace them with putenv().  The one use of unsetenv() in fsck-cache.c
now sets GIT_ALTERNATE_OBJECT_DIRECTORIES to the empty string.  Every
place that variable is used, NULLs are also replaced with empty
strings, so it's ok.
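
A minimal sketch of the idiom described above (the wrapper function
name is made up here; the actual patch edits the existing call sites
directly):

	#include <stdlib.h>

	static void clear_alternate_object_dirs(void)
	{
		/*
		 * Where unsetenv("GIT_ALTERNATE_OBJECT_DIRECTORIES") was
		 * used before, Solaris 8 gets by with putenv() and an empty
		 * value.  The string literal has static storage, so handing
		 * it to putenv(), which keeps a pointer to its argument,
		 * is safe.
		 */
		putenv("GIT_ALTERNATE_OBJECT_DIRECTORIES=");
	}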
Signed-off-by: Jason Riedy <ejr@cs.berkeley.edu>

Now that git-clone-pack exists, we actually have somebody requesting
more than just a single head in a pack.  So allow the Jeffs of this
world to clone things with tens of heads.
"git_path()" returns a static pathname pointer into the git directory
using a printf-like format specifier.
"head_ref()" works like "for_each_ref()", except for just the HEAD.

git-fetch-pack returns the resulting SHA1 on stdout, so you can do

	remote=$(git-fetch-pack host:dir branchname)

and it will unpack the objects, and "remote" will be the SHA1 name of
the branch on the other side.  You can then save that off, or merge it,
or whatever.
It's meant to be used by "git fetch" for the local and ssh case.
It doesn't actually do the fetching now, but it does discover the common
commit point.
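
As an illustrative follow-up to the example above (the FETCH_HEAD path
is chosen here for illustration, not mandated by this commit), "saving
it off" could look like:

	remote=$(git-fetch-pack host:dir branchname) &&
	echo "$remote" > .git/FETCH_HEAD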