Merge branch 'ps/build-tweaks' into next

Tweak the build infrastructure by moving tools around.

* ps/build-tweaks:
  meson: precompile "git-compat-util.h"
  meson: compile compatibility sources separately
  git-compat-util.h: move warning infra to prepare for PCHs
  builds: move build scripts into "tools/"
  contrib: move "update-unicode.sh" script into "tools/"
  contrib: move "coverage-diff.sh" script into "tools/"
  contrib: move "coccinelle/" directory into "tools/"
  Introduce new "tools/" directory
This commit is contained in:
Junio C Hamano
2026-03-20 14:49:12 -07:00
49 changed files with 123 additions and 99 deletions

tools/README.md Normal file

@@ -0,0 +1,7 @@
Developer Tooling
-----------------
This directory is expected to contain all sorts of tooling that
relates to our build infrastructure. This includes scripts and
inputs required by our build systems, but also scripts that
developers are expected to run manually.

tools/check-builtins.sh Executable file

@@ -0,0 +1,34 @@
#!/bin/sh
{
	cat <<\EOF
sayIt:
	$(foreach b,$(BUILT_INS),echo XXX $(b:$X=) YYY;)
EOF
	cat Makefile
} |
make -f - sayIt 2>/dev/null |
sed -n -e 's/.*XXX \(.*\) YYY.*/\1/p' |
sort |
{
	bad=0
	while read builtin
	do
		base=$(expr "$builtin" : 'git-\(.*\)')
		x=$(sed -ne 's/.*{ "'$base'", \(cmd_[^, ]*\).*/'$base' \1/p' git.c)
		if test -z "$x"
		then
			echo "$base is builtin but not listed in git.c command list"
			bad=1
		fi
		for sfx in sh perl py
		do
			if test -f "$builtin.$sfx"
			then
				echo "$base is builtin but $builtin.$sfx still exists"
				bad=1
			fi
		done
	done
	exit $bad
}
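The `expr` invocation in the loop above is what strips the `git-` prefix from each builtin name; as a standalone sketch (the example name is arbitrary):

```shell
# expr prints the part captured by \(...\) when the pattern matches
builtin=git-cherry-pick
base=$(expr "$builtin" : 'git-\(.*\)')
echo "$base"   # prints "cherry-pick"
```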

tools/coccinelle/.gitignore vendored Normal file

@@ -0,0 +1 @@
*.patch

tools/coccinelle/README Normal file

@@ -0,0 +1,124 @@
= coccinelle
This directory provides Coccinelle (http://coccinelle.lip6.fr/) semantic patches
that might be useful to developers.
== Types of semantic patches
* Using the semantic transformation to check for bad patterns in the code;
The target 'make coccicheck' is designed to check for these patterns and
it is expected that any resulting patch indicates a regression.
The patches resulting from 'make coccicheck' are small and infrequent,
so once they are found, they can be sent to the mailing list as per usual.
Example for introducing new patterns:
67947c34ae (convert "hashcmp() != 0" to "!hasheq()", 2018-08-28)
b84c783882 (fsck: s/++i > 1/i++/, 2018-10-24)
Example of fixes using this approach:
248f66ed8e (run-command: use strbuf_addstr() for adding a string to
a strbuf, 2018-03-25)
f919ffebed (Use MOVE_ARRAY, 2018-01-22)
These types of semantic patches are usually part of testing, cf.
0860a7641b (travis-ci: fail if Coccinelle static analysis found something
to transform, 2018-07-23)
* Using semantic transformations in large scale refactorings throughout
the code base.
If such a semantic patch were applied as one big real patch and sent to
the mailing list in the usual way, the patch would be expected to have a
lot of textual and semantic conflicts, as such large scale refactorings
change function signatures that are used widely in the code base.
A textual conflict would arise if surrounding code near any call of such
a function changes. A semantic conflict arises when other patch series in
flight introduce calls to such functions.
So to aid these large scale refactorings, semantic patches can be used.
However, we do not want to store them in the same place as the checks for
bad patterns, as then automated builds would fail.
That is why semantic patches 'tools/coccinelle/*.pending.cocci'
are ignored for checks, and can be applied using 'make coccicheck-pending'.
This allows exposing plans for pending large scale refactorings without
impacting the bad pattern checks.
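The split between checked and pending rules is purely a file-name convention; a minimal sketch of the selection, using a throwaway directory (both rule file names appear in this series, the temp-dir dance is just for illustration):

```shell
# "make coccicheck" picks up *.cocci but skips *.pending.cocci;
# "make coccicheck-pending" applies only the pending rules.
tmpdir=$(mktemp -d)
touch "$tmpdir/array.cocci" "$tmpdir/config_fn_ctx.pending.cocci"

checks=$(cd "$tmpdir" && ls *.cocci | grep -v '\.pending\.cocci$')
pending=$(cd "$tmpdir" && ls *.pending.cocci)

echo "checked: $checks"
echo "pending: $pending"
rm -rf "$tmpdir"
```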
== Git-specific tips & things to know about how we run "spatch":
* The "make coccicheck" target will piggy-back on
"COMPUTE_HEADER_DEPENDENCIES". If you've built a given object file
the "coccicheck" target will consider its dependencies to decide if
it needs to re-run on the corresponding source file.
This means that a "make coccicheck" will re-compile object files
before running. This might be unexpected, but speeds up the run in
the common case, as e.g. a change to "column.h" won't require all
coccinelle rules to be re-run against "grep.c" (or another file
that happens not to use "column.h").
To disable this behavior use the "SPATCH_USE_O_DEPENDENCIES=NoThanks"
flag.
* To speed up our rules the "make coccicheck" target will by default
concatenate all of the *.cocci files here into an "ALL.cocci", and
apply it to each source file.
This makes the run faster, as we don't need to run each rule
against each source file. See the Makefile for further discussion,
this behavior can be disabled with "SPATCH_CONCAT_COCCI=".
But since they're concatenated any <id> in the <rulename> (e.g. "@
my_name", vs. anonymous "@@") needs to be unique across all our
*.cocci files. You should only need to name rules if other rules
depend on them (currently only one rule is named).
* To speed up incremental runs even more use the "spatchcache" tool
in this directory as your "SPATCH". It aims to be a "ccache" for
coccinelle, and piggy-backs on "COMPUTE_HEADER_DEPENDENCIES".
It caches in Redis by default; see its source for a how-to.
In one setup with a primed cache a "make coccicheck" followed by a
"make clean && make" takes around 10s to run, but 2m30s with the
default of "SPATCH_CONCAT_COCCI=Y".
With "SPATCH_CONCAT_COCCI=" the total runtime is around 6m, sped
up to around 1m with "spatchcache".
Most of the 10s (or ~1m) is spent re-running "spatch" on files we
couldn't cache, as we didn't compile them (mostly files in contrib/*
and compat/*).
The absolute times will differ for you, but the relative speedup
from caching should be on that order.
== Authoring and reviewing coccinelle changes
* When a .cocci is made, both the Git changes and .cocci file should be
reviewed. When reviewing such a change, do your best to understand the .cocci
changes (e.g. by asking the author to explain the change) and be explicit
about your understanding of the changes. This helps us decide whether input
from coccinelle experts is needed or not. If you aren't sure of the cocci
changes, indicate what changes you actively endorse and leave an Acked-by
(instead of Reviewed-by).
* Authors should consider that reviewers may not be coccinelle experts, thus
the .cocci changes may not be self-evident. A plain text description of the
changes is strongly encouraged, especially when using more esoteric features
of the language.
* A .cocci rule should target only the problem it is trying to solve; "collateral
damage" is not allowed. Reviewers should look out for and flag overly-broad rules.
* Consider the cost-benefit ratio of .cocci changes. In particular, consider the
effect on the runtime of "make coccicheck", and how often your .cocci check
will catch something valuable. As a rule of thumb, rules that can bail early
if a file doesn't have a particular token will have a small impact on runtime,
and vice-versa.
* .cocci files used for refactoring should be temporarily kept in-tree to aid
the refactoring of out-of-tree code (e.g. in-flight topics). Periodically
evaluate the cost-benefit ratio to determine when the file should be removed.
For example, consider how many out-of-tree users are left and how much this
slows down "make coccicheck".


@@ -0,0 +1,147 @@
@@
type T;
T *dst_ptr;
T *src_ptr;
expression n;
@@
- memcpy(dst_ptr, src_ptr, (n) * \( sizeof(T)
- \| sizeof(*(dst_ptr))
- \| sizeof(*(src_ptr))
- \| sizeof(dst_ptr[...])
- \| sizeof(src_ptr[...])
- \) )
+ COPY_ARRAY(dst_ptr, src_ptr, n)
@@
type T;
T *dst_ptr;
T[] src_arr;
expression n;
@@
- memcpy(dst_ptr, src_arr, (n) * \( sizeof(T)
- \| sizeof(*(dst_ptr))
- \| sizeof(*(src_arr))
- \| sizeof(dst_ptr[...])
- \| sizeof(src_arr[...])
- \) )
+ COPY_ARRAY(dst_ptr, src_arr, n)
@@
type T;
T[] dst_arr;
T *src_ptr;
expression n;
@@
- memcpy(dst_arr, src_ptr, (n) * \( sizeof(T)
- \| sizeof(*(dst_arr))
- \| sizeof(*(src_ptr))
- \| sizeof(dst_arr[...])
- \| sizeof(src_ptr[...])
- \) )
+ COPY_ARRAY(dst_arr, src_ptr, n)
@@
type T;
T[] dst_arr;
T[] src_arr;
expression n;
@@
- memcpy(dst_arr, src_arr, (n) * \( sizeof(T)
- \| sizeof(*(dst_arr))
- \| sizeof(*(src_arr))
- \| sizeof(dst_arr[...])
- \| sizeof(src_arr[...])
- \) )
+ COPY_ARRAY(dst_arr, src_arr, n)
@@
type T;
T *dst;
T *src;
expression n;
@@
(
- memmove(dst, src, (n) * sizeof(*dst));
+ MOVE_ARRAY(dst, src, n);
|
- memmove(dst, src, (n) * sizeof(*src));
+ MOVE_ARRAY(dst, src, n);
|
- memmove(dst, src, (n) * sizeof(T));
+ MOVE_ARRAY(dst, src, n);
)
@@
type T;
T *ptr;
expression n;
@@
- ptr = xmalloc((n) * sizeof(*ptr));
+ ALLOC_ARRAY(ptr, n);
@@
type T;
T *ptr;
expression n;
@@
- ptr = xmalloc((n) * sizeof(T));
+ ALLOC_ARRAY(ptr, n);
@@
type T;
T *ptr;
expression n != 1;
@@
- ptr = xcalloc(n, \( sizeof(*ptr) \| sizeof(T) \) )
+ CALLOC_ARRAY(ptr, n)
@@
expression dst, src, n;
@@
-ALLOC_ARRAY(dst, n);
-COPY_ARRAY(dst, src, n);
+DUP_ARRAY(dst, src, n);
@@
type T;
T *ptr;
expression n;
@@
- memset(ptr, \( 0 \| '\0' \), \( (n) \| n \) * \( sizeof(T)
- \| sizeof(ptr[...])
- \| sizeof(*ptr)
- \) )
+ MEMZERO_ARRAY(ptr, n)
@@
type T;
T *ptr;
expression n;
@@
- memset(ptr, \( 0 \| '\0' \), \( sizeof(T)
- \| sizeof(ptr[...])
- \| sizeof(*ptr)
- \) * \( (n) \| n \) )
+ MEMZERO_ARRAY(ptr, n)
@@
type T;
T[] ptr;
expression n;
@@
- memset(ptr, \( 0 \| '\0' \), \( (n) \| n \) * \( sizeof(T)
- \| sizeof(ptr[...])
- \| sizeof(*ptr)
- \) )
+ MEMZERO_ARRAY(ptr, n)
@@
type T;
T[] ptr;
expression n;
@@
- memset(ptr, \( 0 \| '\0' \), \( sizeof(T)
- \| sizeof(ptr[...])
- \| sizeof(*ptr)
- \) * \( (n) \| n \) )
+ MEMZERO_ARRAY(ptr, n)


@@ -0,0 +1,53 @@
@@
expression c;
@@
- &c->maybe_tree->object.oid
+ get_commit_tree_oid(c)
@@
expression c;
@@
- c->maybe_tree->object.oid.hash
+ get_commit_tree_oid(c)->hash
@@
identifier f !~ "^set_commit_tree$";
expression c;
expression s;
@@
f(...) {<...
- c->maybe_tree = s
+ set_commit_tree(c, s)
...>}
// These excluded functions must access c->maybe_tree directly.
// Note that if c->maybe_tree is written somewhere outside of these
// functions, then the recommended transformation will be bogus with
// repo_get_commit_tree() on the LHS.
@@
identifier f != { repo_get_commit_tree, get_commit_tree_in_graph_one,
load_tree_for_commit, set_commit_tree, repo_parse_commit_no_graph };
expression c;
@@
f(...) {<...
- c->maybe_tree
+ repo_get_commit_tree(specify_the_right_repo_here, c)
...>}
@@
struct commit *c;
expression E;
@@
(
- c->generation = E;
+ commit_graph_data_at(c)->generation = E;
|
- c->graph_pos = E;
+ commit_graph_data_at(c)->graph_pos = E;
|
- c->generation
+ commit_graph_generation(c)
|
- c->graph_pos
+ commit_graph_position(c)
)


@@ -0,0 +1,144 @@
@ get_fn @
identifier fn, R;
@@
(
(
git_config_from_file
|
git_config_from_file_with_options
|
git_config_from_mem
|
git_config_from_blob_oid
|
read_early_config
|
read_very_early_config
|
config_with_options
|
git_config
|
git_protected_config
|
config_from_gitmodules
)
(fn, ...)
|
repo_config(R, fn, ...)
)
@ extends get_fn @
identifier C1, C2, D;
@@
int fn(const char *C1, const char *C2,
+ const struct config_context *ctx,
void *D);
@ extends get_fn @
@@
int fn(const char *, const char *,
+ const struct config_context *,
void *);
@ extends get_fn @
// Don't change fns that look like callback fns but aren't
identifier fn2 != tar_filter_config && != git_diff_heuristic_config &&
!= git_default_submodule_config && != git_color_config &&
!= bundle_list_update && != parse_object_filter_config;
identifier C1, C2, D1, D2, S;
attribute name UNUSED;
@@
int fn(const char *C1, const char *C2,
+ const struct config_context *ctx,
void *D1) {
<+...
(
fn2(C1, C2
+ , ctx
, D2);
|
if(fn2(C1, C2
+ , ctx
, D2) < 0) { ... }
|
return fn2(C1, C2
+ , ctx
, D2);
|
S = fn2(C1, C2
+ , ctx
, D2);
)
...+>
}
@ extends get_fn@
identifier C1, C2, D;
attribute name UNUSED;
@@
int fn(const char *C1, const char *C2,
+ const struct config_context *ctx UNUSED,
void *D) {...}
// The previous rules don't catch all callbacks, especially if they're defined
// in a separate file from the repo_config() call. Fix these manually.
@@
identifier C1, C2, D;
attribute name UNUSED;
@@
int
(
git_ident_config
|
urlmatch_collect_fn
|
write_one_config
|
forbid_remote_url
|
credential_config_callback
)
(const char *C1, const char *C2,
+ const struct config_context *ctx UNUSED,
void *D) {...}
@@
identifier C1, C2, D, D2, S, fn2;
@@
int
(
http_options
|
git_status_config
|
git_commit_config
|
git_default_core_config
|
grep_config
)
(const char *C1, const char *C2,
+ const struct config_context *ctx,
void *D) {
<+...
(
fn2(C1, C2
+ , ctx
, D2);
|
if(fn2(C1, C2
+ , ctx
, D2) < 0) { ... }
|
return fn2(C1, C2
+ , ctx
, D2);
|
S = fn2(C1, C2
+ , ctx
, D2);
)
...+>
}


@@ -0,0 +1,30 @@
/* SPDX-License-Identifier: LGPL-2.1-or-later */
@@
expression e;
statement s;
@@
if (
(
!e
|
- e == NULL
+ !e
)
)
{...}
else s
@@
expression e;
statement s;
@@
if (
(
e
|
- e != NULL
+ e
)
)
{...}
else s


@@ -0,0 +1,13 @@
@@
expression str;
identifier x, flexname;
@@
- FLEX_ALLOC_MEM(x, flexname, str, strlen(str));
+ FLEX_ALLOC_STR(x, flexname, str);
@@
expression str;
identifier x, ptrname;
@@
- FLEXPTR_ALLOC_MEM(x, ptrname, str, strlen(str));
+ FLEXPTR_ALLOC_STR(x, ptrname, str);


@@ -0,0 +1,45 @@
@@
expression E;
@@
- if (E)
(
free(E);
|
commit_list_free(E);
)
@@
expression E;
@@
- if (!E)
(
free(E);
|
commit_list_free(E);
)
@@
expression E;
@@
- free(E);
+ FREE_AND_NULL(E);
- E = NULL;
@@
expression E;
@@
- if (E)
- {
commit_list_free(E);
E = NULL;
- }
@@
expression E;
statement S;
@@
- if (E) {
+ if (E)
S
commit_list_free(E);
- }


@@ -0,0 +1,27 @@
@@
identifier C1, C2, C3;
@@
(
(
git_config_int
|
git_config_int64
|
git_config_ulong
|
git_config_ssize_t
)
(C1, C2
+ , ctx->kvi
)
|
(
git_configset_get_value
|
git_config_bool_or_int
)
(C1, C2
+ , ctx->kvi
, C3
)
)


@@ -0,0 +1,16 @@
@@
expression E;
struct hashmap_entry HME;
@@
- HME.hash = E;
+ hashmap_entry_init(&HME, E);
@@
identifier f !~ "^hashmap_entry_init$";
expression E;
struct hashmap_entry *HMEP;
@@
f(...) {<...
- HMEP->hash = E;
+ hashmap_entry_init(HMEP, E);
...>}


@@ -0,0 +1,157 @@
// the_index.* variables
@@
identifier AC = active_cache;
identifier AN = active_nr;
identifier ACC = active_cache_changed;
identifier ACT = active_cache_tree;
@@
(
- AC
+ the_index.cache
|
- AN
+ the_index.cache_nr
|
- ACC
+ the_index.cache_changed
|
- ACT
+ the_index.cache_tree
)
// "the_repository" simple cases
@@
@@
(
- read_cache
+ repo_read_index
|
- read_cache_unmerged
+ repo_read_index_unmerged
|
- hold_locked_index
+ repo_hold_locked_index
)
(
+ the_repository,
...)
// "the_repository" special-cases
@@
@@
(
- read_cache_preload
+ repo_read_index_preload
)
(
+ the_repository,
...
+ , 0
)
// "the_index" simple cases
@@
@@
(
- is_cache_unborn
+ is_index_unborn
|
- unmerged_cache
+ unmerged_index
|
- rename_cache_entry_at
+ rename_index_entry_at
|
- chmod_cache_entry
+ chmod_index_entry
|
- cache_file_exists
+ index_file_exists
|
- cache_name_is_other
+ index_name_is_other
|
- unmerge_cache_entry_at
+ unmerge_index_entry_at
|
- add_to_cache
+ add_to_index
|
- add_file_to_cache
+ add_file_to_index
|
- add_cache_entry
+ add_index_entry
|
- remove_file_from_cache
+ remove_file_from_index
|
- ce_match_stat
+ ie_match_stat
|
- ce_modified
+ ie_modified
|
- resolve_undo_clear
+ resolve_undo_clear_index
|
- cache_name_pos
+ index_name_pos
|
- update_main_cache_tree
+ cache_tree_update
|
- discard_cache
+ discard_index
)
(
+ &the_index,
...)
@@
@@
(
- refresh_and_write_cache
+ repo_refresh_and_write_index
)
(
+ the_repository,
...
+ , NULL, NULL, NULL
)
// "the_index" special-cases
@@
@@
(
- read_cache_from
+ read_index_from
)
(
+ &the_index,
...
+ , get_git_dir()
)
@@
@@
(
- refresh_cache
+ refresh_index
)
(
+ &the_index,
...
+ , NULL, NULL, NULL
)
@@
expression O;
@@
- write_cache_as_tree
+ write_index_as_tree
(
- O,
+ O, &the_index, get_index_file(),
...
)


@@ -0,0 +1,85 @@
coccinelle_opt = get_option('coccinelle').require(
  fs.exists(meson.project_source_root() / '.git'),
  error_message: 'coccinelle can only be run from a git checkout',
)

spatch = find_program('spatch', required: coccinelle_opt)
if not spatch.found()
  subdir_done()
endif

rules = [
  'array.cocci',
  'commit.cocci',
  'config_fn_ctx.pending.cocci',
  'equals-null.cocci',
  'flex_alloc.cocci',
  'free.cocci',
  'git_config_number.cocci',
  'hashmap.cocci',
  'index-compatibility.cocci',
  'object_id.cocci',
  'preincr.cocci',
  'qsort.cocci',
  'refs.cocci',
  'strbuf.cocci',
  'swap.cocci',
  'the_repository.cocci',
  'xcalloc.cocci',
  'xopen.cocci',
  'xstrdup_or_null.cocci',
  'xstrncmpz.cocci',
]

concatenated_rules = custom_target(
  command: [
    'cat', '@INPUT@',
  ],
  input: rules,
  output: 'rules.cocci',
  capture: true,
)

coccinelle_sources = []
foreach source : run_command(git, '-C', meson.project_source_root(),
    'ls-files', '--deduplicate', '*.c', third_party_excludes, check: true).stdout().split()
  coccinelle_sources += source
endforeach

coccinelle_headers = []
foreach header : headers_to_check
  coccinelle_headers += meson.project_source_root() / header
endforeach

coccinelle_includes = []
foreach path : ['compat', 'ewah', 'refs', 'sha256', 'trace2', 'win32', 'xdiff']
  coccinelle_includes += ['-I', meson.project_source_root() / path]
endforeach

patches = []
foreach source : coccinelle_sources
  patches += custom_target(
    command: [
      spatch,
      '--all-includes',
      '--sp-file', concatenated_rules,
      '--patch', meson.project_source_root(),
      coccinelle_includes,
      '@INPUT@',
    ],
    input: meson.project_source_root() / source,
    output: source.underscorify() + '.patch',
    capture: true,
    depend_files: coccinelle_headers,
  )
endforeach

concatenated_patch = custom_target(
  command: [
    'cat', '@INPUT@',
  ],
  input: patches,
  output: 'cocci.patch',
  capture: true,
)

alias_target('coccicheck', concatenated_patch)


@@ -0,0 +1,75 @@
@@
struct object_id OID;
@@
- hashclr(OID.hash)
+ oidclr(&OID)
@@
identifier f != oidclr;
struct object_id *OIDPTR;
@@
f(...) {<...
- hashclr(OIDPTR->hash)
+ oidclr(OIDPTR)
...>}
@@
struct object_id OID1, OID2;
@@
- hashcmp(OID1.hash, OID2.hash)
+ oidcmp(&OID1, &OID2)
@@
identifier f != oidcmp;
struct object_id *OIDPTR1, *OIDPTR2;
@@
f(...) {<...
- hashcmp(OIDPTR1->hash, OIDPTR2->hash)
+ oidcmp(OIDPTR1, OIDPTR2)
...>}
@@
struct object_id *OIDPTR;
struct object_id OID;
@@
- hashcmp(OIDPTR->hash, OID.hash)
+ oidcmp(OIDPTR, &OID)
@@
struct object_id *OIDPTR;
struct object_id OID;
@@
- hashcmp(OID.hash, OIDPTR->hash)
+ oidcmp(&OID, OIDPTR)
@@
struct object_id *OIDPTR1;
struct object_id *OIDPTR2;
@@
- oidcmp(OIDPTR1, OIDPTR2) == 0
+ oideq(OIDPTR1, OIDPTR2)
@@
identifier f != hasheq;
expression E1, E2;
@@
f(...) {<...
- hashcmp(E1, E2) == 0
+ hasheq(E1, E2)
...>}
@@
struct object_id *OIDPTR1;
struct object_id *OIDPTR2;
@@
- oidcmp(OIDPTR1, OIDPTR2) != 0
+ !oideq(OIDPTR1, OIDPTR2)
@@
identifier f != hasheq;
expression E1, E2;
@@
f(...) {<...
- hashcmp(E1, E2) != 0
+ !hasheq(E1, E2)
...>}


@@ -0,0 +1,5 @@
@@
identifier i;
@@
- ++i > 1
+ i++


@@ -0,0 +1,37 @@
@@
expression base, nmemb, compar;
@@
- qsort(base, nmemb, sizeof(*base), compar);
+ QSORT(base, nmemb, compar);
@@
expression base, nmemb, compar;
@@
- qsort(base, nmemb, sizeof(base[0]), compar);
+ QSORT(base, nmemb, compar);
@@
type T;
T *base;
expression nmemb, compar;
@@
- qsort(base, nmemb, sizeof(T), compar);
+ QSORT(base, nmemb, compar);
@@
expression base, nmemb, compar;
@@
- if (nmemb)
QSORT(base, nmemb, compar);
@@
expression base, nmemb, compar;
@@
- if (nmemb > 0)
QSORT(base, nmemb, compar);
@@
expression base, nmemb, compar;
@@
- if (nmemb > 1)
QSORT(base, nmemb, compar);

tools/coccinelle/refs.cocci Normal file

@@ -0,0 +1,103 @@
// Migrate "refs.h" to not rely on `the_repository` implicitly anymore.
@@
@@
(
- resolve_ref_unsafe
+ refs_resolve_ref_unsafe
|
- resolve_refdup
+ refs_resolve_refdup
|
- read_ref_full
+ refs_read_ref_full
|
- read_ref
+ refs_read_ref
|
- ref_exists
+ refs_ref_exists
|
- head_ref
+ refs_head_ref
|
- for_each_ref
+ refs_for_each_ref
|
- for_each_ref_in
+ refs_for_each_ref_in
|
- for_each_fullref_in
+ refs_for_each_fullref_in
|
- for_each_tag_ref
+ refs_for_each_tag_ref
|
- for_each_branch_ref
+ refs_for_each_branch_ref
|
- for_each_remote_ref
+ refs_for_each_remote_ref
|
- for_each_glob_ref
+ refs_for_each_glob_ref
|
- for_each_glob_ref_in
+ refs_for_each_glob_ref_in
|
- head_ref_namespaced
+ refs_head_ref_namespaced
|
- for_each_namespaced_ref
+ refs_for_each_namespaced_ref
|
- for_each_rawref
+ refs_for_each_rawref
|
- safe_create_reflog
+ refs_create_reflog
|
- reflog_exists
+ refs_reflog_exists
|
- delete_ref
+ refs_delete_ref
|
- delete_refs
+ refs_delete_refs
|
- delete_reflog
+ refs_delete_reflog
|
- for_each_reflog_ent
+ refs_for_each_reflog_ent
|
- for_each_reflog_ent_reverse
+ refs_for_each_reflog_ent_reverse
|
- for_each_reflog
+ refs_for_each_reflog
|
- shorten_unambiguous_ref
+ refs_shorten_unambiguous_ref
|
- rename_ref
+ refs_rename_ref
|
- copy_existing_ref
+ refs_copy_existing_ref
|
- create_symref
+ refs_create_symref
|
- ref_transaction_begin
+ ref_store_transaction_begin
|
- update_ref
+ refs_update_ref
|
- reflog_expire
+ refs_reflog_expire
)
(
+ get_main_ref_store(the_repository),
...)

tools/coccinelle/spatchcache Executable file

@@ -0,0 +1,304 @@
#!/bin/sh
#
# spatchcache: a poor-man's "ccache"-alike for "spatch" in git.git
#
# This caching command relies on the peculiarities of the Makefile
# driving "spatch" in git.git, in particular if we invoke:
#
# make
# # See "spatchCache.cacheWhenStderr" for why "--very-quiet" is
# # used
# make coccicheck SPATCH_FLAGS=--very-quiet
#
# We can with COMPUTE_HEADER_DEPENDENCIES (auto-detected as true with
# "gcc" and "clang") write e.g. a .depend/grep.o.d for grep.c, when we
# compile grep.o.
#
# The .depend/grep.o.d will have the full header dependency tree of
# grep.c, and we can thus cache the output of "spatch" by:
#
# 1. Hashing all of those files
# 2. Hashing our source file, and the *.cocci rule we're
# applying
# 3. Running spatch; if it suggests no changes (by far the common
# case) we invoke "spatchCache.getCmd" and
# "spatchCache.setCmd" with a hash SHA-256 to ask "does this
# ID have no changes" or "say that ID had no changes"
# 4. If no "spatchCache.{set,get}Cmd" is specified we'll use
# "redis-cli" and maintain a SET called "spatch-cache". Set
# appropriate redis memory policies to keep it from growing
# out of control.
#
# This along with the general incremental "make" support for
# "tools/coccinelle" makes it viable to (re-)run coccicheck
# e.g. when merging integration branches.
#
# Note that the "--very-quiet" flag is currently critical. The cache
# will refuse to cache anything that has output on STDERR (which might
# be errors from spatch), but see spatchCache.cacheWhenStderr below.
#
# The STDERR (and exit code) could in principle be cached (as with
# ccache), but then the simple structure in the Redis cache would need
# to change, so just supply "--very-quiet" for now.
#
# To use this, simply set SPATCH to
# tools/coccinelle/spatchcache. Then optionally set:
#
# [spatchCache]
# # Optional: path to a custom spatch
# spatch = ~/g/coccicheck/spatch.opt
#
# As well as this trace config (debug implies trace):
#
# cacheWhenStderr = true
# trace = false
# debug = false
#
# The ".depend/grep.o.d" can also be customized, as a string that will
# be eval'd, it has access to a "$dirname" and "$basename":
#
# [spatchCache]
# dependFormat = "$dirname/.depend/${basename%.c}.o.d"
#
# Setting "trace" to "true" allows for seeing when we have a cache HIT
# or MISS. To debug whether the cache is working do that, and run e.g.:
#
# redis-cli FLUSHALL
# <make && make coccicheck, as above>
# grep -hore HIT -e MISS -e SET -e NOCACHE -e CANTCACHE .build/tools/coccinelle | sort | uniq -c
# 600 CANTCACHE
# 7365 MISS
# 7365 SET
#
# A subsequent "make cocciclean && make coccicheck" should then have
# all "HIT"'s and "CANTCACHE"'s.
#
# The "spatchCache.cacheWhenStderr" option is critical when using
# spatchCache.{trace,debug} to debug whether something is set in the
# cache, as we write to the spatch logs in .build/* and would
# otherwise always emit a NOCACHE.
#
# Reading the config can make the command much slower, to work around
# this the config can be set in the environment, with environment
# variable name corresponding to the config key. "default" can be used
# to use whatever's the script default, e.g. setting
# spatchCache.cacheWhenStderr=true and deferring to the defaults for
# the rest is:
#
# export GIT_CONTRIB_SPATCHCACHE_DEBUG=default
# export GIT_CONTRIB_SPATCHCACHE_TRACE=default
# export GIT_CONTRIB_SPATCHCACHE_CACHEWHENSTDERR=true
# export GIT_CONTRIB_SPATCHCACHE_SPATCH=default
# export GIT_CONTRIB_SPATCHCACHE_DEPENDFORMAT=default
# export GIT_CONTRIB_SPATCHCACHE_SETCMD=default
# export GIT_CONTRIB_SPATCHCACHE_GETCMD=default
set -e
env_or_config () {
	env="$1"
	shift
	if test "$env" = "default"
	then
		# Avoid expensive "git config" invocation
		return
	elif test -n "$env"
	then
		echo "$env"
	else
		git config "$@" || :
	fi
}
## Our own configuration & options
debug=$(env_or_config "$GIT_CONTRIB_SPATCHCACHE_DEBUG" --bool "spatchCache.debug")
if test "$debug" != "true"
then
	debug=
fi
if test -n "$debug"
then
	set -x
fi

trace=$(env_or_config "$GIT_CONTRIB_SPATCHCACHE_TRACE" --bool "spatchCache.trace")
if test "$trace" != "true"
then
	trace=
fi
if test -n "$debug"
then
	# debug implies trace
	trace=true
fi

cacheWhenStderr=$(env_or_config "$GIT_CONTRIB_SPATCHCACHE_CACHEWHENSTDERR" --bool "spatchCache.cacheWhenStderr")
if test "$cacheWhenStderr" != "true"
then
	cacheWhenStderr=
fi
trace_it () {
	if test -z "$trace"
	then
		return
	fi
	echo "$@" >&2
}

spatch=$(env_or_config "$GIT_CONTRIB_SPATCHCACHE_SPATCH" --path "spatchCache.spatch")
if test -n "$spatch"
then
	if test -n "$debug"
	then
		trace_it "custom spatchCache.spatch='$spatch'"
	fi
else
	spatch=spatch
fi

dependFormat='$dirname/.depend/${basename%.c}.o.d'
dependFormatCfg=$(env_or_config "$GIT_CONTRIB_SPATCHCACHE_DEPENDFORMAT" "spatchCache.dependFormat")
if test -n "$dependFormatCfg"
then
	dependFormat="$dependFormatCfg"
fi

set=$(env_or_config "$GIT_CONTRIB_SPATCHCACHE_SETCMD" "spatchCache.setCmd")
get=$(env_or_config "$GIT_CONTRIB_SPATCHCACHE_GETCMD" "spatchCache.getCmd")
## Parse spatch()-like command-line for caching info
arg_sp=
arg_file=
args="$@"
spatch_opts() {
	while test $# != 0
	do
		arg_file="$1"
		case "$1" in
		--sp-file)
			arg_sp="$2"
			;;
		esac
		shift
	done
}
spatch_opts "$@"
if ! test -f "$arg_file"
then
	arg_file=
fi

hash_for_cache() {
	# Parameters that should affect the cache
	echo "args=$args"
	echo "config spatchCache.spatch=$spatch"
	echo "config spatchCache.debug=$debug"
	echo "config spatchCache.trace=$trace"
	echo "config spatchCache.cacheWhenStderr=$cacheWhenStderr"
	echo

	# Our target file and its dependencies
	git hash-object "$1" "$2" $(grep -E -o '^[^:]+:$' "$3" | tr -d ':')
}

# Sanity checks
if ! test -f "$arg_sp" && ! test -f "$arg_file"
then
	echo $0: no idea how to cache "$@" >&2
	exit 128
fi
# Main logic
dirname=$(dirname "$arg_file")
basename=$(basename "$arg_file")
eval "dep=$dependFormat"
if ! test -f "$dep"
then
	trace_it "$0: CANTCACHE have no '$dep' for '$arg_file'!"
	exec "$spatch" "$@"
fi

if test -n "$debug"
then
	trace_it "$0: The full cache input for '$arg_sp' '$arg_file' '$dep'"
	hash_for_cache "$arg_sp" "$arg_file" "$dep" >&2
fi
sum=$(hash_for_cache "$arg_sp" "$arg_file" "$dep" | git hash-object --stdin)

trace_it "$0: processing '$arg_file' with '$arg_sp' rule, and got hash '$sum' for it + '$dep'"
getret=
if test -z "$get"
then
	if test $(redis-cli SISMEMBER spatch-cache "$sum") = 1
	then
		getret=0
	else
		getret=1
	fi
else
	$get "$sum"
	getret=$?
fi

if test "$getret" = 0
then
	trace_it "$0: HIT for '$arg_file' with '$arg_sp'"
	exit 0
else
	trace_it "$0: MISS: for '$arg_file' with '$arg_sp'"
fi
out="$(mktemp)"
err="$(mktemp)"

set +e
"$spatch" "$@" >"$out" 2>>"$err"
ret=$?
cat "$out"
cat "$err" >&2
set -e

nocache=
if test $ret != 0
then
	nocache="exited non-zero: $ret"
elif test -s "$out"
then
	nocache="had patch output"
elif test -z "$cacheWhenStderr" && test -s "$err"
then
	nocache="had stderr (use --very-quiet or spatchCache.cacheWhenStderr=true?)"
fi

if test -n "$nocache"
then
	trace_it "$0: NOCACHE ($nocache): for '$arg_file' with '$arg_sp'"
	exit "$ret"
fi

trace_it "$0: SET: for '$arg_file' with '$arg_sp'"
setret=
if test -z "$set"
then
	if test $(redis-cli SADD spatch-cache "$sum") = 1
	then
		setret=0
	else
		setret=1
	fi
else
	"$set" "$sum"
	setret=$?
fi

if test "$setret" != 0
then
	echo "FAILED to set '$sum' in cache!" >&2
	exit 128
fi
exit "$ret"
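The caching key hinges on `hash_for_cache` extracting every header from the compiler-generated `.o.d` file: `COMPUTE_HEADER_DEPENDENCIES` emits each header as an empty dummy target, which is what the `grep`/`tr` pipeline picks out. A sketch with a made-up dependency file:

```shell
dep=$(mktemp)
cat >"$dep" <<'EOF'
grep.o: grep.c git-compat-util.h strbuf.h
git-compat-util.h:
strbuf.h:
EOF

# Keep only the bare "header.h:" dummy targets and strip the colon,
# mirroring the pipeline inside hash_for_cache.
headers=$(grep -E -o '^[^:]+:$' "$dep" | tr -d ':')
echo "$headers"
rm -f "$dep"
```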


@@ -0,0 +1,73 @@
@@
expression E;
constant fmt !~ "%";
@@
- strbuf_addf
+ strbuf_addstr
(E,
(
fmt
|
_(fmt)
)
);
@@
expression E;
struct strbuf SB;
format F =~ "^s$";
@@
- strbuf_addf(E, "%@F@", SB.buf);
+ strbuf_addbuf(E, &SB);
@@
expression E;
struct strbuf *SBP;
format F =~ "^s$";
@@
- strbuf_addf(E, "%@F@", SBP->buf);
+ strbuf_addbuf(E, SBP);
@@
expression E;
struct strbuf SB;
@@
- strbuf_addstr(E, SB.buf);
+ strbuf_addbuf(E, &SB);
@@
expression E;
struct strbuf *SBP;
@@
- strbuf_addstr(E, SBP->buf);
+ strbuf_addbuf(E, SBP);
@@
expression E1, E2;
format F =~ "^s$";
@@
- strbuf_addf(E1, "%@F@", E2);
+ strbuf_addstr(E1, E2);
@@
expression E1, E2, E3;
@@
- strbuf_addstr(E1, find_unique_abbrev(E2, E3));
+ strbuf_add_unique_abbrev(E1, E2, E3);
@@
expression E1, E2;
@@
- strbuf_addstr(E1, real_path(E2));
+ strbuf_add_real_path(E1, E2);
@@
identifier fn, param;
@@
fn(...,
- struct strbuf param
+ struct strbuf *param
,...)
{
...
}


@@ -0,0 +1,28 @@
@@
type T;
identifier tmp;
T a, b;
@@
- T tmp = a;
+ T tmp;
+ tmp = a;
a = b;
b = tmp;
@ swap @
type T;
T tmp, a, b;
@@
- tmp = a;
- a = b;
- b = tmp;
+ SWAP(a, b);
@ extends swap @
identifier unused;
@@
{
...
- T unused;
... when != unused
}


@@ -0,0 +1,11 @@
int use_FREE_AND_NULL(int *v)
{
free(*v);
*v = NULL;
}
int need_no_if(int *v)
{
if (v)
free(v);
}


@@ -0,0 +1,9 @@
int use_FREE_AND_NULL(int *v)
{
FREE_AND_NULL(*v);
}
int need_no_if(int *v)
{
free(v);
}


@@ -0,0 +1,18 @@
// Fully migrated "the_repository" additions
@@
@@
(
// TODO: remove the rules below and the macros from tree.h after the
// next Git release.
- parse_tree
+ repo_parse_tree
|
- parse_tree_gently
+ repo_parse_tree_gently
|
- parse_tree_indirect
+ repo_parse_tree_indirect
)
(
+ the_repository,
...)


@@ -0,0 +1,10 @@
@@
type T;
T *ptr;
expression n;
@@
xcalloc(
+ n,
\( sizeof(T) \| sizeof(*ptr) \)
- , n
)


@@ -0,0 +1,19 @@
@@
identifier fd;
identifier die_fn =~ "^(die|die_errno)$";
@@
int fd =
- open
+ xopen
(...);
- if ( \( fd < 0 \| fd == -1 \) ) { die_fn(...); }
@@
expression fd;
identifier die_fn =~ "^(die|die_errno)$";
@@
fd =
- open
+ xopen
(...);
- if ( \( fd < 0 \| fd == -1 \) ) { die_fn(...); }


@@ -0,0 +1,5 @@
@@
expression E;
@@
- xstrdup(absolute_path(E))
+ absolute_pathdup(E)


@@ -0,0 +1,28 @@
@@
expression S, T, L;
@@
(
- strncmp(S, T, L) || S[L]
+ !!xstrncmpz(S, T, L)
|
- strncmp(S, T, L) || S[L] != '\0'
+ !!xstrncmpz(S, T, L)
|
- strncmp(S, T, L) || T[L]
+ !!xstrncmpz(T, S, L)
|
- strncmp(S, T, L) || T[L] != '\0'
+ !!xstrncmpz(T, S, L)
|
- !strncmp(S, T, L) && !S[L]
+ !xstrncmpz(S, T, L)
|
- !strncmp(S, T, L) && S[L] == '\0'
+ !xstrncmpz(S, T, L)
|
- !strncmp(S, T, L) && !T[L]
+ !xstrncmpz(T, S, L)
|
- !strncmp(S, T, L) && T[L] == '\0'
+ !xstrncmpz(T, S, L)
)

tools/coverage-diff.sh Executable file

@@ -0,0 +1,103 @@
#!/bin/sh
# Usage: Run 'tools/coverage-diff.sh <version1> <version2>' from source-root
# after running
#
# make coverage-test
# make coverage-report
#
# while checked out at <version2>. This script combines the *.gcov files
# generated by the 'make' commands above with 'git diff <version1> <version2>'
# to report new lines that are not covered by the test suite.
V1=$1
V2=$2
diff_lines () {
perl -e '
my $line_num;
while (<>) {
# Hunk header? Grab the beginning in postimage.
if (/^@@ -\d+(?:,\d+)? \+(\d+)(?:,\d+)? @@/) {
$line_num = $1;
next;
}
# Have we seen a hunk? Ignore "diff --git" etc.
next unless defined $line_num;
# Deleted line? Ignore.
if (/^-/) {
next;
}
# Show only the line number of added lines.
if (/^\+/) {
print "$line_num\n";
}
# Either common context or added line appear in
# the postimage. Count it.
$line_num++;
}
'
}
files=$(git diff --name-only "$V1" "$V2" -- \*.c)
# create empty file
>coverage-data.txt
for file in $files
do
git diff "$V1" "$V2" -- "$file" |
diff_lines |
sort >new_lines.txt
if ! test -s new_lines.txt
then
continue
fi
hash_file=$(echo "$file" | sed "s/\//\#/g")
if ! test -s "$hash_file.gcov"
then
continue
fi
sed -ne '/#####:/{
s/ #####://
s/:.*//
s/ //g
p
}' "$hash_file.gcov" |
sort >uncovered_lines.txt
comm -12 uncovered_lines.txt new_lines.txt |
sed -e 's/$/\)/' -e 's/^/ /' >uncovered_new_lines.txt
grep -q '[^[:space:]]' <uncovered_new_lines.txt &&
echo $file >>coverage-data.txt &&
git blame -s "$V2" -- "$file" |
sed 's/\t//g' |
grep -f uncovered_new_lines.txt >>coverage-data.txt &&
echo >>coverage-data.txt
rm -f new_lines.txt uncovered_lines.txt uncovered_new_lines.txt
done
cat coverage-data.txt
echo "Commits introducing uncovered code:"
commit_list=$(awk '/^[0-9a-f]{7,}/ { print $1 }' coverage-data.txt | sort -u)
(
for commit in $commit_list
do
git log --no-decorate --pretty=format:'%an %h: %s' -1 $commit
echo
done
) | sort
rm coverage-data.txt
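The script's core set operation is `comm -12`, which intersects the sorted list of newly added lines with the sorted list of uncovered lines. A standalone sketch of that intersection (the numbers and temporary file names are throwaway examples):

```shell
# comm requires both inputs to be sorted; -12 suppresses the columns of
# lines unique to either file, leaving only lines present in both.
printf '%s\n' 3 5 8 | sort >lines_a.txt
printf '%s\n' 5 8 9 | sort >lines_b.txt
common=$(comm -12 lines_a.txt lines_b.txt)
echo "$common"
rm -f lines_a.txt lines_b.txt
```

Here the intersection is the lines 5 and 8; in the script these would be the added-and-uncovered line numbers fed to `grep -f`.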

58
tools/detect-compiler Executable file
View File

@@ -0,0 +1,58 @@
#!/bin/sh
#
# Probe the compiler for vintage, version, etc. This is used for setting
# optional make knobs under the DEVELOPER knob.
CC="$*"
# we get something like (this is at least true for gcc and clang)
#
# FreeBSD clang version 3.4.1 (tags/RELEASE...)
get_version_line() {
LANG=C LC_ALL=C $CC -v 2>&1 | sed -n '/ version /{p;q;}'
}
get_family() {
get_version_line | sed 's/^\(.*\) version [0-9].*/\1/'
}
get_version() {
# A string that begins with a digit up to the next SP
ver=$(get_version_line | sed 's/^.* version \([0-9][^ ]*\).*/\1/')
# There are known -variant suffixes that do not affect the
# meaning of the main version number. Strip them.
ver=${ver%-win32}
ver=${ver%-posix}
echo "$ver"
}
print_flags() {
family=$1
version=$(get_version | cut -f 1 -d .)
# Print a feature flag not only for the current version, but also
# for any prior versions we encompass. This avoids needing to do
# numeric comparisons in make, which are awkward.
while test "$version" -gt 0
do
echo $family$version
version=$((version - 1))
done
}
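As the comment explains, `print_flags` emits one flag per major version counting down to 1, so a Makefile can check "version N or newer" with a simple substring/`filter` test instead of numeric comparison. The countdown logic can be exercised on its own, with the probed compiler version replaced by a literal (a stub for illustration only, since no compiler is queried here):

```shell
# Stand-alone countdown mirroring print_flags above, with the version
# passed in directly instead of derived from "$CC -v" output.
print_flags_demo () {
	family=$1
	version=$2
	while test "$version" -gt 0
	do
		echo "$family$version"
		version=$((version - 1))
	done
}
flags=$(print_flags_demo gcc 3)
echo "$flags"
```

A Makefile can then test e.g. `gcc2` and match any compiler of major version 2 or newer.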
case "$(get_family)" in
gcc)
print_flags gcc
;;
clang | *" clang")
print_flags clang
;;
"Apple LLVM")
print_flags clang
;;
*)
: unknown compiler family
;;
esac

120
tools/generate-cmdlist.sh Executable file
View File

@@ -0,0 +1,120 @@
#!/bin/sh
die () {
echo "$@" >&2
exit 1
}
command_list () {
while read cmd rest
do
case "$cmd" in
"#"* | '')
# Ignore comments and allow empty lines
continue
;;
*)
case "$exclude_programs" in
*":$cmd:"*)
;;
*)
echo "$cmd $rest"
;;
esac
esac
done <"$1"
}
category_list () {
echo "$1" |
cut -d' ' -f2- |
tr ' ' '\012' |
grep -v '^$' |
LC_ALL=C sort -u
}
define_categories () {
echo
echo "/* Command categories */"
bit=0
echo "$1" |
while read cat
do
echo "#define CAT_$cat (1UL << $bit)"
bit=$(($bit+1))
done
test "$bit" -gt 32 && die "Urgh.. too many categories?"
}
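Each category name becomes a distinct bit in a 32-bit mask, assigned in the order the (already sorted and deduplicated) list arrives. The bit assignment can be previewed on a toy list; the category names below are fabricated, not Git's real ones:

```shell
# Mirror the loop in define_categories above on a pre-sorted sample
# list; each name receives the next free bit position.
cats='foo
mainporcelain
plumbing'
defines=$(echo "$cats" |
	{
		bit=0
		while read cat
		do
			echo "#define CAT_$cat (1UL << $bit)"
			bit=$(($bit+1))
		done
	})
echo "$defines"
```

This is why the category count is capped at 32: the mask is stored in a `uint32_t` in the generated `struct cmdname_help`.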
define_category_names () {
echo
echo "/* Category names */"
echo "static const char *category_names[] = {"
bit=0
echo "$1" |
while read cat
do
echo " \"$cat\", /* (1UL << $bit) */"
bit=$(($bit+1))
done
echo " NULL"
echo "};"
}
print_command_list () {
echo "static struct cmdname_help command_list[] = {"
echo "$2" |
while read cmd rest
do
synopsis=
while read line
do
case "$line" in
"$cmd - "*)
synopsis=${line#$cmd - }
break
;;
esac
done <"$1/Documentation/$cmd.adoc"
printf '\t{ "%s", N_("%s"), 0' "$cmd" "$synopsis"
printf " | CAT_%s" $rest
echo " },"
done
echo "};"
}
exclude_programs=:
while test "--exclude-program" = "$1"
do
shift
exclude_programs="$exclude_programs$1:"
shift
done
if test "$#" -ne 2
then
die "USAGE: $0 <SOURCE_DIR> <OUTPUT>"
fi
SOURCE_DIR="$1"
OUTPUT="$2"
{
commands="$(command_list "$SOURCE_DIR"/command-list.txt)"
categories="$(category_list "$commands")"
echo "/* Automatically generated by generate-cmdlist.sh */
struct cmdname_help {
const char *name;
const char *help;
uint32_t category;
};
"
define_categories "$categories"
echo
define_category_names "$categories"
echo
print_command_list "$SOURCE_DIR" "$commands"
} >"$OUTPUT"
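`print_command_list` locates each command's one-line synopsis by scanning its documentation page for a line of the form `<cmd> - <synopsis>` and stripping the prefix with a parameter expansion. That extraction step, reduced to a single fabricated documentation line:

```shell
# The case pattern matches only the "NAME" line of the manual page;
# ${line#...} then removes the "git-add - " prefix.
cmd=git-add
line='git-add - Add file contents to the index'
synopsis=
case "$line" in
"$cmd - "*)
	synopsis=${line#$cmd - }
	;;
esac
echo "$synopsis"
```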

52
tools/generate-configlist.sh Executable file
View File

@@ -0,0 +1,52 @@
#!/bin/sh
SOURCE_DIR="$1"
OUTPUT="$2"
DEPFILE="$3"
if test -z "$SOURCE_DIR" || ! test -d "$SOURCE_DIR" || test -z "$OUTPUT"
then
echo >&2 "USAGE: $0 <SOURCE_DIR> <OUTPUT> [<DEPFILE>]"
exit 1
fi
print_config_list () {
cat <<EOF
static const char *config_name_list[] = {
EOF
sed -e '
/^`*[a-zA-Z].*\..*`*::$/ {
/deprecated/d;
s/::$//;
s/`//g;
s/^.*$/ "&",/;
p;};
d' \
"$SOURCE_DIR"/Documentation/*config.adoc \
"$SOURCE_DIR"/Documentation/config/*.adoc |
sort
cat <<EOF
NULL,
};
EOF
}
{
echo "/* Automatically generated by generate-configlist.sh */"
echo
echo
print_config_list
} >"$OUTPUT"
if test -n "$DEPFILE"
then
QUOTED_OUTPUT="$(printf '%s\n' "$OUTPUT" | sed 's,[&/\],\\&,g')"
{
printf '%s\n' "$SOURCE_DIR"/Documentation/*config.adoc \
"$SOURCE_DIR"/Documentation/config/*.adoc |
sed -e 's/[# ]/\\&/g' -e "s/^/$QUOTED_OUTPUT: /"
printf '%s:\n' "$SOURCE_DIR"/Documentation/*config.adoc \
"$SOURCE_DIR"/Documentation/config/*.adoc |
sed -e 's/[# ]/\\&/g'
} >"$DEPFILE"
fi
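The sed program in `print_config_list` keeps only heading lines of the form `section.key::` (optionally backquoted), drops entries marked deprecated, and rewrites each survivor into a quoted C string literal. Its effect on a few fabricated documentation lines, with the leading-whitespace insertion and final `sort` omitted for brevity:

```shell
# Same filtering as print_config_list above: only "a.b::" headings
# survive, deprecated ones are dropped, backquotes are stripped.
names=$(printf '%s\n' \
	'core.editor::' \
	'`user.name`::' \
	'some.deprecated.key (deprecated)::' \
	'Not a heading at all.' |
sed -e '
/^`*[a-zA-Z].*\..*`*::$/ {
	/deprecated/d;
	s/::$//;
	s/`//g;
	s/^.*$/"&",/;
	p;};
d')
echo "$names"
```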

33
tools/generate-hooklist.sh Executable file
View File

@@ -0,0 +1,33 @@
#!/bin/sh
#
# Usage: ./generate-hooklist.sh >hook-list.h
SOURCE_DIR="$1"
OUTPUT="$2"
if test -z "$SOURCE_DIR" || ! test -d "$SOURCE_DIR" || test -z "$OUTPUT"
then
echo >&2 "USAGE: $0 <SOURCE_DIR> <OUTPUT>"
exit 1
fi
{
cat <<EOF
/* Automatically generated by generate-hooklist.sh */
static const char *hook_name_list[] = {
EOF
sed -n \
-e '/^~~~~*$/ {x; s/^.*$/ "&",/; p;}' \
-e 'x' \
<"$SOURCE_DIR"/Documentation/githooks.adoc |
LC_ALL=C sort
cat <<EOF
NULL,
};
EOF
} >"$OUTPUT"
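The sed program uses the hold space to remember the previous line: every line is stashed with `x`, and when a `~~~~` underline is seen, the exchange retrieves the hook name that sat directly above it (githooks.adoc underlines each hook name this way). The mechanism in isolation, on fabricated input and without the indentation of the real output:

```shell
# Each hook name is followed by a ~~~~ underline; the hold-space
# exchange recovers the name line when the underline is reached.
hooks=$(printf '%s\n' \
	'pre-commit' \
	'~~~~~~~~~~' \
	'some unrelated prose' \
	'post-checkout' \
	'~~~~~~~~~~~~~' |
sed -n \
	-e '/^~~~~*$/ {x; s/^.*$/"&",/; p;}' \
	-e 'x')
echo "$hooks"
```

The unconditional second expression (`x`) runs on non-underline lines, so the hold space always holds the most recent line when an underline arrives.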

37
tools/generate-perl.sh Executable file
View File

@@ -0,0 +1,37 @@
#!/bin/sh
set -e
if test $# -ne 5
then
echo >&2 "USAGE: $0 <GIT_BUILD_OPTIONS> <GIT_VERSION_FILE> <PERL_HEADER> <INPUT> <OUTPUT>"
exit 1
fi
GIT_BUILD_OPTIONS="$1"
GIT_VERSION_FILE="$2"
PERL_HEADER="$3"
INPUT="$4"
OUTPUT="$5"
. "$GIT_BUILD_OPTIONS"
. "$GIT_VERSION_FILE"
sed -e '1{' \
-e " /^#!.*perl/!b" \
-e " s|#!.*perl|#!$PERL_PATH|" \
-e " r $PERL_HEADER" \
-e ' G' \
-e '}' \
-e "s|@GIT_VERSION@|$GIT_VERSION|g" \
-e "s|@LOCALEDIR@|$PERL_LOCALEDIR|g" \
-e "s|@NO_GETTEXT@|$NO_GETTEXT|g" \
-e "s|@NO_PERL_CPAN_FALLBACKS@|$NO_PERL_CPAN_FALLBACKS|g" \
"$INPUT" >"$OUTPUT"
case "$INPUT" in
*.perl|*git-contacts)
chmod a+x "$OUTPUT";;
*)
;;
esac

20
tools/generate-python.sh Executable file
View File

@@ -0,0 +1,20 @@
#!/bin/sh
set -e
if test $# -ne 3
then
echo >&2 "USAGE: $0 <GIT_BUILD_OPTIONS> <INPUT> <OUTPUT>"
exit 1
fi
GIT_BUILD_OPTIONS="$1"
INPUT="$2"
OUTPUT="$3"
. "$GIT_BUILD_OPTIONS"
sed -e "1s|#!.*python|#!$PYTHON_PATH|" \
"$INPUT" >"$OUTPUT+"
chmod a+x "$OUTPUT+"
mv "$OUTPUT+" "$OUTPUT"

34
tools/generate-script.sh Executable file
View File

@@ -0,0 +1,34 @@
#!/bin/sh
set -e
if test $# -ne 3
then
echo >&2 "USAGE: $0 <INPUT> <OUTPUT> <GIT-BUILD-OPTIONS>"
exit 1
fi
INPUT="$1"
OUTPUT="$2"
BUILD_OPTIONS="$3"
. "$BUILD_OPTIONS"
sed -e "1s|#!.*/sh|#!$SHELL_PATH|" \
-e "s|@SHELL_PATH@|$SHELL_PATH|" \
-e "s|@DIFF@|$DIFF|" \
-e "s|@LOCALEDIR@|$LOCALEDIR|g" \
-e "s/@USE_GETTEXT_SCHEME@/$USE_GETTEXT_SCHEME/g" \
-e "$BROKEN_PATH_FIX" \
-e "s|@GITWEBDIR@|$GITWEBDIR|g" \
-e "s|@PERL_PATH@|$PERL_PATH|g" \
-e "s|@PAGER_ENV@|$PAGER_ENV|g" \
"$INPUT" >"$OUTPUT"
case "$(basename "$INPUT")" in
git-mergetool--lib.sh|git-sh-i18n.sh|git-sh-setup.sh)
;;
*)
chmod a+x "$OUTPUT"
;;
esac
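The script's job is plain placeholder substitution: the shebang is rewritten to the configured shell, and each `@TOKEN@` is replaced with a value sourced from GIT-BUILD-OPTIONS. The substitution step reduced to two tokens, with illustrative paths standing in for the configured values:

```shell
# Stubbed build options (illustrative values, not read from a file).
SHELL_PATH=/usr/bin/sh
PERL_PATH=/usr/bin/perl

template='#!/bin/sh
echo "perl lives at @PERL_PATH@"'

# Same sed shape as generate-script.sh above: fix the shebang on line
# 1, then expand the placeholder everywhere.
rendered=$(printf '%s\n' "$template" |
sed -e "1s|#!.*/sh|#!$SHELL_PATH|" \
	-e "s|@PERL_PATH@|$PERL_PATH|g")
echo "$rendered"
```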

1
tools/meson.build Normal file
View File

@@ -0,0 +1 @@
subdir('coccinelle')

1
tools/precompiled.h Normal file
View File

@@ -0,0 +1 @@
#include "git-compat-util.h"

3
tools/update-unicode/.gitignore vendored Normal file
View File

@@ -0,0 +1,3 @@
uniset/
UnicodeData.txt
EastAsianWidth.txt

View File

@@ -0,0 +1,20 @@
TL;DR: Run update_unicode.sh after the publication of a new Unicode
standard and commit the resulting unicode-width.h file.

The long version
================

The Git source code ships the file unicode-width.h, which contains
tables of zero-width and double-width Unicode code points, respectively.
These tables are generated using update_unicode.sh in this directory.
update_unicode.sh itself uses a third-party tool, uniset, to query two
Unicode data files for the interesting code points.

On first run, update_unicode.sh clones uniset from GitHub and builds it.
This requires a current-ish version of autoconf (as of December 2016,
2.69 works).

On each run, update_unicode.sh checks whether more recent Unicode data
files are available from the Unicode Consortium, and rebuilds the header
unicode-width.h with the new data. The new header can then be
committed.

View File

@@ -0,0 +1,33 @@
#!/bin/sh
# See http://www.unicode.org/reports/tr44/
#
# Me Enclosing_Mark  an enclosing combining mark
# Mn Nonspacing_Mark a nonspacing combining mark (zero advance width)
# Cf Format          a format control character
#
cd "$(dirname "$0")"
UNICODEWIDTH_H=$(git rev-parse --show-toplevel)/unicode-width.h
wget -N http://www.unicode.org/Public/UCD/latest/ucd/UnicodeData.txt \
http://www.unicode.org/Public/UCD/latest/ucd/EastAsianWidth.txt &&
if ! test -d uniset; then
git clone https://github.com/depp/uniset.git &&
( cd uniset && git checkout 4b186196dd )
fi &&
(
cd uniset &&
if ! test -x uniset; then
autoreconf -i &&
./configure --enable-warnings=-Werror CFLAGS='-O0 -ggdb'
fi &&
make
) &&
UNICODE_DIR=. && export UNICODE_DIR &&
cat >$UNICODEWIDTH_H <<-EOF
static const struct interval zero_width[] = {
$(uniset/uniset --32 cat:Me,Mn,Cf + U+1160..U+11FF - U+00AD)
};
static const struct interval double_width[] = {
$(uniset/uniset --32 eaw:F,W)
};
EOF