completion: improve ls-files filter performance
From the output of ls-files, we remove all but the leftmost path
component and then we eliminate duplicates. We do this in a while loop,
which is a performance bottleneck when the number of iterations is
large (e.g. for 60000 files in linux.git).

    $ COMP_WORDS=(git status -- ar) COMP_CWORD=3; time _git

    real    0m11.876s
    user    0m4.685s
    sys     0m6.808s

Replacing the loop with the cut command improves performance
significantly:

    $ COMP_WORDS=(git status -- ar) COMP_CWORD=3; time _git

    real    0m1.372s
    user    0m0.263s
    sys     0m0.167s

The measurements were done with Msys2 bash, which is used by Git for
Windows.

When filtering the ls-files output we take care not to touch absolute
paths. This is redundant, because ls-files will never output absolute
paths. Remove the unnecessary operations.

The issue was reported here:
https://github.com/git-for-windows/git/issues/1533

Signed-off-by: Clemens Buchacher <drizzd@gmx.net>
Signed-off-by: Junio C Hamano <gitster@pobox.com>
commit 78a2d21231
parent 468165c1d8
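As a quick illustration of the new filter (the sample paths below are
made up, not taken from the commit), cut -f1 -d/ keeps only the
leftmost path component of each line and sort | uniq drops the
duplicates:

    $ printf '%s\n' arch/x86/Makefile arch/arm/Makefile block/bio.c |
    	cut -f1 -d/ | sort | uniq
    arch
    block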
@@ -388,12 +388,7 @@ __git_index_files ()
 	local root="${2-.}" file
 
 	__git_ls_files_helper "$root" "$1" |
-	while read -r file; do
-		case "$file" in
-		?*/*) echo "${file%%/*}" ;;
-		*) echo "$file" ;;
-		esac
-	done | sort | uniq
+	cut -f1 -d/ | sort | uniq
 }
 
 # Lists branches from the local repository.
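For comparison, the while loop removed by this commit can be run
standalone on the same made-up input; on the relative paths that
ls-files emits, it produces identical output:

    $ printf '%s\n' arch/x86/Makefile arch/arm/Makefile block/bio.c |
    	while read -r file; do
    		case "$file" in
    		?*/*) echo "${file%%/*}" ;;
    		*) echo "$file" ;;
    		esac
    	done | sort | uniq
    arch
    block

Reading and pattern-matching line by line in the shell is what made the
loop slow on large file lists; cut splits the whole stream in a single
pass, which is consistent with the measurements above.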