Recursively finds and lists the svn:ignore property on
directories. The output is suitable for appending to the
$GIT_DIR/info/exclude file.
Signed-off-by: Eric Wong <normalperson@yhbt.net>
Signed-off-by: Junio C Hamano <junkio@cox.net>
Note: This needs someone to tell me what the value of $^O is on ActiveState.
Signed-off-by: Ryan Anderson <ryan@michonline.com>
Signed-off-by: Junio C Hamano <junkio@cox.net>
When we settle on a solution for ActiveState's forking issues, all
compatibility checks can be handled inside this one function.
Also, an abuse of global variables was fixed in the process of cleaning this up.
Signed-off-by: Ryan Anderson <ryan@michonline.com>
Signed-off-by: Junio C Hamano <junkio@cox.net>
Also, use Getopt::Long and only process each rev once.
(Thanks to Morten Welinder for spotting the performance problems.)
Signed-off-by: Ryan Anderson <ryan@michonline.com>
Signed-off-by: Junio C Hamano <junkio@cox.net>
When compiling on ia64 I get this warning (from gcc 3.4.3):
gcc -o pack-objects.o -c -g -O2 -Wall -DSHA1_HEADER='<openssl/sha.h>' pack-objects.c
pack-objects.c: In function `pack_revindex_ix':
pack-objects.c:94: warning: cast from pointer to integer of different size
A double cast (first to long, then to int) shuts gcc up, but is there
a better way?
[jc: Andreas Ericsson suggests using ulong instead.]
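For illustration only (this is not the actual pack_revindex_ix code,
and the table-size parameter is made up): on an LP64 target such as
ia64 a pointer is 64 bits while int is 32, so the direct cast warns;
going through unsigned long, which is pointer-sized there, and
truncating only at the end keeps gcc quiet without the double cast.

    /* illustration only, not the code in pack-objects.c */
    static int hash_ptr(const void *p, unsigned int table_size)
    {
        /* int ix = (int) p;          warns: pointer truncated to int  */
        /* int ix = (int)(long) p;    the double cast that silences it */
        unsigned long ul = (unsigned long) p;  /* ulong is pointer-sized on ia64 */
        return (int)(ul % table_size);
    }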
Signed-off-by: Tony Luck <tony.luck@intel.com>
Signed-off-by: Junio C Hamano <junkio@cox.net>
* jc/rev-list:
rev-list --objects: use full pathname to help hashing.
rev-list --objects-edge: remove duplicated edge commit output.
rev-list --objects-edge
* jc/pack-thin:
pack-objects: hash basename and dirname a bit differently.
pack-objects: allow "thin" packs to exceed depth limits
pack-objects: use full pathname to help hashing with "thin" pack.
pack-objects: thin pack micro-optimization.
Use thin pack transfer in "git fetch".
Add git-push --thin.
send-pack --thin: use "thin pack" delta transfer.
Thin pack - create packfile with missing delta base.
Conflicts:
pack-objects.c (taking "next")
send-pack.c (taking "next")
* master:
Merge branches 'jc/rev-list' and 'jc/pack-thin'
gitview: Code cleanup
Add missing programs to ignore list
git ls files recursively show ignored files
Build and install git-mailinfo.
gitview: Bump the rev
gitview: Fix DeprecationWarning
* jc/rev-list:
rev-list --objects: use full pathname to help hashing.
rev-list --objects-edge: remove duplicated edge commit output.
rev-list --objects-edge
* jc/pack-thin:
pack-objects: hash basename and dirname a bit differently.
pack-objects: allow "thin" packs to exceed depth limits
pack-objects: use full pathname to help hashing with "thin" pack.
pack-objects: thin pack micro-optimization.
Use thin pack transfer in "git fetch".
Add git-push --thin.
send-pack --thin: use "thin pack" delta transfer.
Thin pack - create packfile with missing delta base.
Conflicts:
pack-objects.c (manual adjustment for thin pack needed)
send-pack.c
This fixes all the known issues with the graph display.
The bug is best explained graphically:

                                 |
                                 a
This line need not be there ---->| \
                                 b |
                                 | /
                                 c

c is the parent of a, all of a, b and c are placed on the same
line, and b is a child of c.
With my last checkin I added a separate line to indicate that a is
connected to c.  But then we had the line connecting a and b, which
should not be there.  This change fixes that bug.
Signed-off-by: Aneesh Kumar K.V <aneesh.kumar@gmail.com>
Signed-off-by: Junio C Hamano <junkio@cox.net>
Rearrange the code a little bit so that it is easier to read
Signed-off-by: Aneesh Kumar K.V <aneesh.kumar@gmail.com>
Signed-off-by: Junio C Hamano <junkio@cox.net>
Make git-ls-files --others --ignored recurse into non-excluded
subdirectories.
Typically, when asking git-ls-files to display all files that are
ignored by one or more exclude patterns, one wants it to recurse
into subdirectories that are not themselves excluded, to see
whether any excluded files are contained within those subdirectories.
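A minimal sketch of that traversal rule (illustrative; not git's own
dir.c code, and the exclude patterns below are made up): descend into
a directory unless the directory itself is excluded, and report the
files that match an exclude pattern.

    #include <stdio.h>
    #include <string.h>
    #include <dirent.h>
    #include <fnmatch.h>
    #include <sys/stat.h>

    static const char *excludes[] = { "*.o", "*.tmp", NULL };  /* made-up patterns */

    static int excluded(const char *name)
    {
        int i;
        for (i = 0; excludes[i]; i++)
            if (!fnmatch(excludes[i], name, 0))
                return 1;
        return 0;
    }

    static void show_ignored(const char *dir)
    {
        DIR *d = opendir(dir);
        struct dirent *de;
        char path[4096];
        struct stat st;

        if (!d)
            return;
        while ((de = readdir(d)) != NULL) {
            if (!strcmp(de->d_name, ".") || !strcmp(de->d_name, ".."))
                continue;
            snprintf(path, sizeof(path), "%s/%s", dir, de->d_name);
            if (lstat(path, &st))
                continue;
            if (S_ISDIR(st.st_mode)) {
                if (excluded(de->d_name))
                    printf("%s/\n", path);   /* excluded dir: report, do not enter */
                else
                    show_ignored(path);      /* not excluded: recurse into it */
            } else if (excluded(de->d_name)) {
                printf("%s\n", path);        /* ignored file inside a kept dir */
            }
        }
        closedir(d);
    }

    int main(int argc, char **argv)
    {
        show_ignored(argc > 1 ? argv[1] : ".");
        return 0;
    }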
Signed-off-by: Junio C Hamano <junkio@cox.net>
The merge 712b1dd389 was done
incorrectly and lost this program from the Makefile.
Big thanks go to Tony Luck for noticing it, and Linus for
diagnosing it.
Signed-off-by: Junio C Hamano <junkio@cox.net>
* jc/rev-list:
rev-list --objects: use full pathname to help hashing.
* jc/pack-thin:
pack-objects: hash basename and dirname a bit differently.
pack-objects: allow "thin" packs to exceed depth limits
pack-objects: use full pathname to help hashing with "thin" pack.
* np/delta:
Revert "diff-delta: produce optimal pack data"
Tweak break/merge score to adjust to the new delta generation code.
count-delta: fix counting of copied source.
This reverts commit 6b7d25d97b.
It turns out that the new algorithm has a really bad corner
case, one that literally spends minutes on inputs that take
less than a quarter of a second to delta with the old
algorithm.  The resulting delta is 50% smaller, which is
admirable, but the performance degradation is simply
unacceptable for unconditional use.
Some example cases are these blobs in Linux 2.6 repository:
4917ec509720a42846d513addc11cbd25e0e3c4f
9af06ba723df75fed49f7ccae5b6c9c34bc5115f
dfc9cd58dc065d17030d875d3fea6e7862ede143
Signed-off-by: Junio C Hamano <junkio@cox.net>
...so that "Makefile"s from different revs are sorted together,
separate from "t/Makefile"s, but close enough.
Signed-off-by: Junio C Hamano <junkio@cox.net>
This helps to group the same files from different revs together,
while spreading files with the same basename in different
directories, to help pack-object.
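A sketch of the grouping idea (this is not the hash pack-objects.c
actually uses; the function name is made up): let the basename
determine the high bits of the hash and the leading directories only
the low bits, so identical full paths always get the same value,
paths that share a basename sort near each other, and different
directories still keep them apart.

    #include <string.h>

    /* illustration only; not the hashing code in pack-objects.c */
    static unsigned int name_hash_sketch(const char *path)
    {
        const char *base = strrchr(path, '/');
        const char *p;
        unsigned int base_hash = 0, dir_hash = 0;

        base = base ? base + 1 : path;
        for (p = base; *p; p++)          /* basename drives the high bits */
            base_hash = base_hash * 101 + (unsigned char)*p;
        for (p = path; p < base; p++)    /* directories only perturb the low bits */
            dir_hash = dir_hash * 11 + (unsigned char)*p;
        return (base_hash << 16) | (dir_hash & 0xffff);
    }

Sorting the objects on such a value then tends to place likely delta
candidates next to each other in the window.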
Signed-off-by: Junio C Hamano <junkio@cox.net>
When creating a new pack to be used in the .git/objects/pack/
directory, we carefully count the depth of deltified objects to
be reused, so that the generated pack does not exceed the
specified depth limit, for runtime efficiency.  However, when we
are generating a thin pack that does not contain base objects,
such a pack can only be used during network transfer and is
expanded on the other end upon reception, so being careful and
artificially cutting the delta chain does not buy us anything
except an increased bandwidth requirement.  This patch disables
the delta chain depth limit check when reusing an existing delta.
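A sketch of the shape of that check (the structure and helper names
below are made up, not those of pack-objects.c):

    #include <stddef.h>

    struct entry {
        struct entry *delta_base;   /* NULL when the object is stored whole */
    };

    static int chain_length(const struct entry *e)
    {
        int n = 0;
        while ((e = e->delta_base) != NULL)
            n++;
        return n;
    }

    /* For a pack kept in .git/objects/pack/ the existing delta is
     * reused only if its chain stays within max_depth; for a thin,
     * transfer-only pack the receiver re-expands everything, so the
     * limit is skipped. */
    static int may_reuse_delta(const struct entry *e, int thin, int max_depth)
    {
        if (thin)
            return 1;
        return chain_length(e) <= max_depth;
    }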
Signed-off-by: Junio C Hamano <junkio@cox.net>
Since I wanted to limit the graph box size, I was resetting
the window after an index of 5.  This resulted in lines
joining commit nodes passing over nodes which are not related.
This change fixes that.
Signed-off-by: Aneesh Kumar K.V <aneesh.kumar@gmail.com>
Signed-off-by: Junio C Hamano <junkio@cox.net>
Running "git-am --resolved" without doing anything can create an empty
commit. Prevent it.
Thanks to Eric W. Biederman for spotting this.
Signed-off-by: Junio C Hamano <junkio@cox.net>
This lowers the default merge threshold score to 75% from the
earlier 80%.  The break threshold stays the same at 50% for now,
but we might want to revisit it (and the rename detection limit
as well).
* break score: this much edit (both insertion of new material
and deletion of old material) needs to be there in the file
before we consider this _might_ be a rewrite and break the
filepair.
* merge score: after a filepair is broken by the above criteria
and goes through rename detection, if their pieces did not
match with other files as rename/copy, we merge them back
into one as if nothing happened. If the filepair had at
least this much deletion of old material, however, we say
this is completely rewritten with dissimilarity index X% when
we do so.
The updated delta code by Nico is so good that what we earlier
thought to be a complete rewrite now reuses a lot more from the
source material (reducing the counted "delete"), so this
adjustment is needed to keep the perceived behaviour similar to
what we had earlier.
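To put numbers on it (an illustrative example, not taken from the
patch): a filepair with well over half of its content edited and 78%
of the old material gone passes the 50% break score and is broken;
if its pieces then fail to pair up with any other path as a rename
or copy, they are merged back, and under the old 80% merge score the
result was shown as an ordinary in-place edit, while with the new
75% threshold it is reported as a complete rewrite with
dissimilarity index 78%.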
Signed-off-by: Junio C Hamano <junkio@cox.net>
The previous version wrongly coalesced a span with the next one
even when the span being added did not reach it.
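A sketch of the corrected rule (illustrative only; the actual
bookkeeping in count-delta differs): a new range may absorb an
existing span only when it actually reaches or overlaps it, never a
span that still lies beyond its end.

    #include <stddef.h>
    #include <stdlib.h>

    /* Spans of copied source, kept sorted and non-overlapping. */
    struct span {
        unsigned long begin, end;   /* [begin, end) */
        struct span *next;
    };

    static void add_span(struct span **list,
                         unsigned long new_begin, unsigned long new_end)
    {
        struct span *s;

        while ((s = *list) != NULL && s->end < new_begin)
            list = &s->next;        /* entirely before the new range: skip */

        /* absorb only spans that touch or overlap [new_begin, new_end);
         * the bug was merging with the next span even when the new
         * range did not reach it */
        while ((s = *list) != NULL && s->begin <= new_end) {
            if (s->begin < new_begin)
                new_begin = s->begin;
            if (new_end < s->end)
                new_end = s->end;
            *list = s->next;
            free(s);
        }
        s = malloc(sizeof(*s));
        s->begin = new_begin;
        s->end = new_end;
        s->next = *list;
        *list = s;
    }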
Signed-off-by: Junio C Hamano <junkio@cox.net>
On Windows you cannot remove the current or an opened directory,
an opened file, a running program, a loaded library, etc.
[jc: signoffs? With a minor quoting fix.]
Signed-off-by: Junio C Hamano <junkio@cox.net>
With the finer-grained delta algorithm, the count-delta
algorithm started overcounting copied source material, since the
new delta output tends to reuse the same source range more than
once and more aggressively.  This broke an earlier assumption
that the number of bytes copied out from the source buffer is a
good approximation of how much source material actually remains
in the result.
This uses a fairly inefficient algorithm to keep track of the
ranges of source material that are actually copied out to the
destination buffer.  With this tweak, the obvious rename/break
detection tests in the testsuite start to work again.
Signed-off-by: Junio C Hamano <junkio@cox.net>
This applies the same hashing algorithm to the "preferred base
tree" objects and the incoming pathnames, to group the same
files from different revs together, while spreading files with
the same basename in different directories.
Signed-off-by: Junio C Hamano <junkio@cox.net>
Since we sort objects by type, hash, preferredness and then
size, after we have a delta against a preferred base there is
no point trying a delta with a non-preferred base.  This seems
to save expensive calls to diff-delta, and it seems to save
output space as well.
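Roughly, the saving is an early skip in the delta search window; a
sketch with made-up names (the real loop in pack-objects.c is more
involved):

    struct candidate {
        int preferred_base;         /* object comes from the preferred base set */
    };

    struct target {
        struct candidate *delta_base;   /* best base found so far, or NULL */
    };

    static void try_delta(struct target *trg, struct candidate *src)
    {
        /* the real code runs diff-delta and keeps src as the base only
         * when the resulting delta is an improvement; details omitted */
        (void)trg;
        (void)src;
    }

    static void search_window(struct target *trg, struct candidate *win, int n)
    {
        int i;
        for (i = 0; i < n; i++) {
            /* once the best delta is against a preferred base, a
             * non-preferred candidate is not worth a diff-delta call */
            if (trg->delta_base && trg->delta_base->preferred_base
                && !win[i].preferred_base)
                continue;
            try_delta(trg, &win[i]);
        }
    }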
Signed-off-by: Junio C Hamano <junkio@cox.net>