mirror of
https://github.com/git-for-windows/git.git
synced 2026-03-15 09:08:42 -05:00
Introduce 'git backfill' to get missing blobs in a partial clone (#5172)
This change introduces the `git backfill` command, which uses the path walk API to download missing blobs in a blobless partial clone. By downloading blobs that correspond to the same file path at the same time, we hope to maximize the potential benefits of delta compression across multiple versions.

These downloads occur in a configurable batch size, presenting a mechanism to perform "resumable" clones: `git clone --filter=blob:none` gets the commits and trees, then `git backfill` downloads all missing blobs. If `git backfill` is interrupted partway through, it can be restarted and will redownload only the objects that are still missing.

When combining blobless partial clones with sparse-checkout, `git backfill` assumes its `--sparse` option and downloads only the blobs within the sparse-checkout. Users may want this because the repo size will still be smaller than the full repo size, while commands like `git blame` or `git log -L` will not suffer from many one-by-one blob downloads.

Future directions should consider adding a pathspec or file prefix to further focus which paths are downloaded in a batch.
.gitignore (vendored) | 1

@@ -20,6 +20,7 @@
 /git-apply
 /git-archimport
 /git-archive
+/git-backfill
 /git-bisect
 /git-blame
 /git-branch
Documentation/git-backfill.txt (new file) | 60

@@ -0,0 +1,60 @@
git-backfill(1)
===============

NAME
----
git-backfill - Download missing objects in a partial clone


SYNOPSIS
--------
[verse]
(EXPERIMENTAL) 'git backfill' [--batch-size=<n>] [--[no-]sparse]

DESCRIPTION
-----------

Blobless partial clones are created using `git clone --filter=blob:none`,
which then configures the local repository such that the Git client avoids
downloading blob objects unless they are required for a local operation.
This initially means that the clone and later fetches download reachable
commits and trees but no blobs. Later operations that change the `HEAD`
pointer, such as `git checkout` or `git merge`, may need to download
missing blobs in order to complete their operation.

In the worst cases, commands that compute blob diffs, such as `git blame`,
become very slow as they download the missing blobs in single-blob
requests to satisfy the missing object as the Git command needs it. This
leads to multiple download requests and no ability for the Git server to
provide delta compression across those objects.

The `git backfill` command provides a way for the user to request that
Git downloads the missing blobs (with optional filters) such that the
missing blobs representing historical versions of files can be downloaded
in batches. The `backfill` command attempts to optimize the request by
grouping blobs that appear at the same path, hopefully leading to good
delta compression in the packfile sent by the server.

By default, `git backfill` downloads all blobs reachable from the `HEAD`
commit. This set can be restricted or expanded using various options.

OPTIONS
-------

--batch-size=<n>::
	Specify a minimum size for a batch of missing objects to request
	from the server. This size may be exceeded by the last set of
	blobs seen at a given path. Default batch size is 16,000.

--[no-]sparse::
	Only download objects if they appear at a path that matches the
	current sparse-checkout. If the sparse-checkout feature is enabled,
	then `--sparse` is assumed and can be disabled with `--no-sparse`.

SEE ALSO
--------
linkgit:git-clone[1].

GIT
---
Part of the linkgit:git[1] suite
Documentation/technical/api-path-walk.txt

@@ -65,9 +65,18 @@ better off using the revision walk API instead.
 	the revision walk so that the walk emits commits marked with the
 	`UNINTERESTING` flag.
 
+`pl`::
+	This pattern list pointer allows focusing the path-walk search to
+	a set of patterns, only emitting paths that match the given
+	patterns. See linkgit:gitignore[5] or
+	linkgit:git-sparse-checkout[1] for details about pattern lists.
+	When the pattern list uses cone-mode patterns, then the path-walk
+	API can prune the set of paths it walks to improve performance.
+
 Examples
 --------
 
 See example usages in:
 	`t/helper/test-path-walk.c`,
+	`builtin/backfill.c`,
 	`builtin/pack-objects.c`
Makefile | 1

@@ -1213,6 +1213,7 @@ BUILTIN_OBJS += builtin/am.o
 BUILTIN_OBJS += builtin/annotate.o
 BUILTIN_OBJS += builtin/apply.o
 BUILTIN_OBJS += builtin/archive.o
+BUILTIN_OBJS += builtin/backfill.o
 BUILTIN_OBJS += builtin/bisect.o
 BUILTIN_OBJS += builtin/blame.o
 BUILTIN_OBJS += builtin/branch.o
builtin.h

@@ -120,6 +120,7 @@ int cmd_am(int argc, const char **argv, const char *prefix, struct repository *r
 int cmd_annotate(int argc, const char **argv, const char *prefix, struct repository *repo);
 int cmd_apply(int argc, const char **argv, const char *prefix, struct repository *repo);
 int cmd_archive(int argc, const char **argv, const char *prefix, struct repository *repo);
+int cmd_backfill(int argc, const char **argv, const char *prefix, struct repository *repo);
 int cmd_bisect(int argc, const char **argv, const char *prefix, struct repository *repo);
 int cmd_blame(int argc, const char **argv, const char *prefix, struct repository *repo);
 int cmd_branch(int argc, const char **argv, const char *prefix, struct repository *repo);
builtin/backfill.c (new file) | 146

@@ -0,0 +1,146 @@
#define USE_THE_REPOSITORY_VARIABLE /* for core_apply_sparse_checkout */

#include "builtin.h"
#include "git-compat-util.h"
#include "config.h"
#include "parse-options.h"
#include "repository.h"
#include "commit.h"
#include "dir.h"
#include "environment.h"
#include "hex.h"
#include "tree.h"
#include "tree-walk.h"
#include "object.h"
#include "object-store-ll.h"
#include "oid-array.h"
#include "oidset.h"
#include "promisor-remote.h"
#include "strmap.h"
#include "string-list.h"
#include "revision.h"
#include "trace2.h"
#include "progress.h"
#include "packfile.h"
#include "path-walk.h"

static const char * const builtin_backfill_usage[] = {
	N_("(EXPERIMENTAL) git backfill [--batch-size=<n>] [--[no-]sparse]"),
	NULL
};

struct backfill_context {
	struct repository *repo;
	struct oid_array current_batch;
	size_t batch_size;
	int sparse;
};

static void clear_backfill_context(struct backfill_context *ctx)
{
	oid_array_clear(&ctx->current_batch);
}

static void download_batch(struct backfill_context *ctx)
{
	promisor_remote_get_direct(ctx->repo,
				   ctx->current_batch.oid,
				   ctx->current_batch.nr);
	oid_array_clear(&ctx->current_batch);

	/*
	 * We likely have a new packfile. Add it to the packed list to
	 * avoid possible duplicate downloads of the same objects.
	 */
	reprepare_packed_git(ctx->repo);
}

static int fill_missing_blobs(const char *path UNUSED,
			      struct oid_array *list,
			      enum object_type type,
			      void *data)
{
	struct backfill_context *ctx = data;

	if (type != OBJ_BLOB)
		return 0;

	for (size_t i = 0; i < list->nr; i++) {
		off_t size = 0;
		struct object_info info = OBJECT_INFO_INIT;
		info.disk_sizep = &size;
		if (oid_object_info_extended(ctx->repo,
					     &list->oid[i],
					     &info,
					     OBJECT_INFO_FOR_PREFETCH) ||
		    !size)
			oid_array_append(&ctx->current_batch, &list->oid[i]);
	}

	if (ctx->current_batch.nr >= ctx->batch_size)
		download_batch(ctx);

	return 0;
}

static int do_backfill(struct backfill_context *ctx)
{
	struct rev_info revs;
	struct path_walk_info info = PATH_WALK_INFO_INIT;
	int ret;

	if (ctx->sparse) {
		CALLOC_ARRAY(info.pl, 1);
		if (get_sparse_checkout_patterns(info.pl))
			return error(_("problem loading sparse-checkout"));
	}

	repo_init_revisions(ctx->repo, &revs, "");
	handle_revision_arg("HEAD", &revs, 0, 0);

	info.blobs = 1;
	info.tags = info.commits = info.trees = 0;

	info.revs = &revs;
	info.path_fn = fill_missing_blobs;
	info.path_fn_data = ctx;

	ret = walk_objects_by_path(&info);

	/* Download the objects that did not fill a batch. */
	if (!ret)
		download_batch(ctx);

	clear_backfill_context(ctx);
	return ret;
}

int cmd_backfill(int argc, const char **argv, const char *prefix, struct repository *repo)
{
	struct backfill_context ctx = {
		.repo = repo,
		.current_batch = OID_ARRAY_INIT,
		.batch_size = 16000,
		.sparse = 0,
	};
	struct option options[] = {
		OPT_INTEGER(0, "batch-size", &ctx.batch_size,
			    N_("Minimum number of objects to request at a time")),
		OPT_BOOL(0, "sparse", &ctx.sparse,
			 N_("Restrict the missing objects to the current sparse-checkout")),
		OPT_END(),
	};

	if (argc == 2 && !strcmp(argv[1], "-h"))
		usage_with_options(builtin_backfill_usage, options);

	argc = parse_options(argc, argv, prefix, options, builtin_backfill_usage,
			     0);

	repo_config(repo, git_default_config, NULL);

	if (ctx.sparse < 0)
		ctx.sparse = core_apply_sparse_checkout;

	return do_backfill(&ctx);
}
command-list.txt

@@ -60,6 +60,7 @@ git-annotate                            ancillaryinterrogators
 git-apply                               plumbingmanipulators            complete
 git-archimport                          foreignscminterface
 git-archive                             mainporcelain
+git-backfill                            mainporcelain                   history
 git-bisect                              mainporcelain                   info
 git-blame                               ancillaryinterrogators          complete
 git-branch                              mainporcelain                   history
dir.c | 10

@@ -1088,10 +1088,6 @@ static void invalidate_directory(struct untracked_cache *uc,
 		dir->dirs[i]->recurse = 0;
 }
 
-static int add_patterns_from_buffer(char *buf, size_t size,
-				    const char *base, int baselen,
-				    struct pattern_list *pl);
-
 /* Flags for add_patterns() */
 #define PATTERN_NOFOLLOW (1<<0)
 
@@ -1181,9 +1177,9 @@ static int add_patterns(const char *fname, const char *base, int baselen,
 	return 0;
 }
 
-static int add_patterns_from_buffer(char *buf, size_t size,
-				    const char *base, int baselen,
-				    struct pattern_list *pl)
+int add_patterns_from_buffer(char *buf, size_t size,
+			     const char *base, int baselen,
+			     struct pattern_list *pl)
 {
 	char *orig = buf;
 	int i, lineno = 1;
dir.h | 3

@@ -467,6 +467,9 @@ void add_patterns_from_file(struct dir_struct *, const char *fname);
 int add_patterns_from_blob_to_list(struct object_id *oid,
				    const char *base, int baselen,
				    struct pattern_list *pl);
+int add_patterns_from_buffer(char *buf, size_t size,
+			     const char *base, int baselen,
+			     struct pattern_list *pl);
 void parse_path_pattern(const char **string, int *patternlen, unsigned *flags, int *nowildcardlen);
 void add_pattern(const char *string, const char *base,
		  int baselen, struct pattern_list *pl, int srcpos);
git.c | 1

@@ -509,6 +509,7 @@ static struct cmd_struct commands[] = {
 	{ "annotate", cmd_annotate, RUN_SETUP },
 	{ "apply", cmd_apply, RUN_SETUP_GENTLY },
 	{ "archive", cmd_archive, RUN_SETUP_GENTLY },
+	{ "backfill", cmd_backfill, RUN_SETUP },
 	{ "bisect", cmd_bisect, RUN_SETUP },
 	{ "blame", cmd_blame, RUN_SETUP },
 	{ "branch", cmd_branch, RUN_SETUP | DELAY_PAGER_CONFIG },
path-walk.c | 18

@@ -10,6 +10,7 @@
 #include "hex.h"
 #include "object.h"
 #include "oid-array.h"
+#include "repository.h"
 #include "revision.h"
 #include "string-list.h"
 #include "strmap.h"
@@ -119,6 +120,23 @@ static int add_children(struct path_walk_context *ctx,
 		if (type == OBJ_TREE)
 			strbuf_addch(&path, '/');
 
+		if (ctx->info->pl) {
+			int dtype;
+			enum pattern_match_result match;
+			match = path_matches_pattern_list(path.buf, path.len,
+							  path.buf + base_len, &dtype,
+							  ctx->info->pl,
+							  ctx->repo->index);
+
+			if (ctx->info->pl->use_cone_patterns &&
+			    match == NOT_MATCHED)
+				continue;
+			else if (!ctx->info->pl->use_cone_patterns &&
+				 type == OBJ_BLOB &&
+				 match != MATCHED)
+				continue;
+		}
+
 		if (!(list = strmap_get(&ctx->paths_to_lists, path.buf))) {
 			CALLOC_ARRAY(list, 1);
 			list->type = type;
path-walk.h | 11

@@ -6,6 +6,7 @@
 
 struct rev_info;
 struct oid_array;
+struct pattern_list;
 
 /**
  * The type of a function pointer for the method that is called on a list of
@@ -46,6 +47,16 @@ struct path_walk_info {
 	 * walk the children of such trees.
	 */
	int prune_all_uninteresting;
+
+	/**
+	 * Specify a sparse-checkout definition to match our paths to. Do not
+	 * walk outside of this sparse definition. If the patterns are in
+	 * cone mode, then the search may prune directories that are outside
+	 * of the cone. If not in cone mode, then all tree paths will be
+	 * explored but the path_fn will only be called when the path matches
+	 * the sparse-checkout patterns.
+	 */
+	struct pattern_list *pl;
 };
 
 #define PATH_WALK_INFO_INIT { \
t/helper/test-path-walk.c

@@ -1,6 +1,7 @@
 #define USE_THE_REPOSITORY_VARIABLE
 
 #include "test-tool.h"
+#include "dir.h"
 #include "environment.h"
 #include "hex.h"
 #include "object-name.h"
@@ -9,6 +10,7 @@
 #include "revision.h"
 #include "setup.h"
 #include "parse-options.h"
+#include "strbuf.h"
 #include "path-walk.h"
 #include "oid-array.h"
 
@@ -67,7 +69,7 @@ static int emit_block(const char *path, struct oid_array *oids,
 
 int cmd__path_walk(int argc, const char **argv)
 {
-	int res;
+	int res, stdin_pl = 0;
 	struct rev_info revs = REV_INFO_INIT;
 	struct path_walk_info info = PATH_WALK_INFO_INIT;
 	struct path_walk_test_data data = { 0 };
@@ -82,6 +84,8 @@ int cmd__path_walk(int argc, const char **argv)
 			N_("toggle inclusion of tree objects")),
 		OPT_BOOL(0, "prune", &info.prune_all_uninteresting,
 			N_("toggle pruning of uninteresting paths")),
+		OPT_BOOL(0, "stdin-pl", &stdin_pl,
+			N_("read a pattern list over stdin")),
 		OPT_END(),
 	};
 
@@ -102,6 +106,17 @@ int cmd__path_walk(int argc, const char **argv)
 	info.path_fn = emit_block;
 	info.path_fn_data = &data;
 
+	if (stdin_pl) {
+		struct strbuf in = STRBUF_INIT;
+		CALLOC_ARRAY(info.pl, 1);
+
+		info.pl->use_cone_patterns = 1;
+
+		strbuf_fread(&in, 2048, stdin);
+		add_patterns_from_buffer(in.buf, in.len, "", 0, info.pl);
+		strbuf_release(&in);
+	}
+
 	res = walk_objects_by_path(&info);
 
 	printf("commits:%" PRIuMAX "\n"
@@ -110,5 +125,9 @@ int cmd__path_walk(int argc, const char **argv)
 	       "tags:%" PRIuMAX "\n",
 	       data.commit_nr, data.tree_nr, data.blob_nr, data.tag_nr);
 
+	if (info.pl) {
+		clear_pattern_list(info.pl);
+		free(info.pl);
+	}
 	return res;
 }
t/t5620-backfill.sh (new executable file) | 178

@@ -0,0 +1,178 @@
#!/bin/sh

test_description='git backfill on partial clones'

GIT_TEST_DEFAULT_INITIAL_BRANCH_NAME=main
export GIT_TEST_DEFAULT_INITIAL_BRANCH_NAME

. ./test-lib.sh

# We create objects in the 'src' repo.
test_expect_success 'setup repo for object creation' '
	echo "{print \$1}" >print_1.awk &&
	echo "{print \$2}" >print_2.awk &&

	git init src &&

	mkdir -p src/a/b/c &&
	mkdir -p src/d/e &&

	for i in 1 2
	do
		for n in 1 2 3 4
		do
			echo "Version $i of file $n" > src/file.$n.txt &&
			echo "Version $i of file a/$n" > src/a/file.$n.txt &&
			echo "Version $i of file a/b/$n" > src/a/b/file.$n.txt &&
			echo "Version $i of file a/b/c/$n" > src/a/b/c/file.$n.txt &&
			echo "Version $i of file d/$n" > src/d/file.$n.txt &&
			echo "Version $i of file d/e/$n" > src/d/e/file.$n.txt &&
			git -C src add . &&
			git -C src commit -m "Iteration $n" || return 1
		done
	done
'

# Clone 'src' into 'srv.bare' so we have a bare repo to be our origin
# server for the partial clone.
test_expect_success 'setup bare clone for server' '
	git clone --bare "file://$(pwd)/src" srv.bare &&
	git -C srv.bare config --local uploadpack.allowfilter 1 &&
	git -C srv.bare config --local uploadpack.allowanysha1inwant 1
'

# do basic partial clone from "srv.bare"
test_expect_success 'do partial clone 1, backfill gets all objects' '
	git clone --no-checkout --filter=blob:none \
		--single-branch --branch=main \
		"file://$(pwd)/srv.bare" backfill1 &&

	# Backfill with no options gets everything reachable from HEAD.
	GIT_TRACE2_EVENT="$(pwd)/backfill-file-trace" git \
		-C backfill1 backfill &&

	# We should have engaged the partial clone machinery
	test_trace2_data promisor fetch_count 48 <backfill-file-trace &&

	# No more missing objects!
	git -C backfill1 rev-list --quiet --objects --missing=print HEAD >revs2 &&
	test_line_count = 0 revs2
'

test_expect_success 'do partial clone 2, backfill batch size' '
	git clone --no-checkout --filter=blob:none \
		--single-branch --branch=main \
		"file://$(pwd)/srv.bare" backfill2 &&

	GIT_TRACE2_EVENT="$(pwd)/batch-trace" git \
		-C backfill2 backfill --batch-size=20 &&

	# Batches were used
	test_trace2_data promisor fetch_count 20 <batch-trace >matches &&
	test_line_count = 2 matches &&
	test_trace2_data promisor fetch_count 8 <batch-trace &&

	# No more missing objects!
	git -C backfill2 rev-list --quiet --objects --missing=print HEAD >revs2 &&
	test_line_count = 0 revs2
'

test_expect_success 'backfill --sparse without sparse-checkout fails' '
	git init not-sparse &&
	test_must_fail git -C not-sparse backfill --sparse 2>err &&
	grep "problem loading sparse-checkout" err
'

test_expect_success 'backfill --sparse' '
	git clone --sparse --filter=blob:none \
		--single-branch --branch=main \
		"file://$(pwd)/srv.bare" backfill3 &&

	# Initial checkout includes four files at root.
	git -C backfill3 rev-list --quiet --objects --missing=print HEAD >missing &&
	test_line_count = 44 missing &&

	# Initial sparse-checkout is just the files at root, so we get the
	# older versions of the four files at tip.
	GIT_TRACE2_EVENT="$(pwd)/sparse-trace1" git \
		-C backfill3 backfill --sparse &&
	test_trace2_data promisor fetch_count 4 <sparse-trace1 &&
	test_trace2_data path-walk paths 5 <sparse-trace1 &&
	git -C backfill3 rev-list --quiet --objects --missing=print HEAD >missing &&
	test_line_count = 40 missing &&

	# Expand the sparse-checkout to include 'd' recursively. This
	# engages the algorithm to skip the trees for 'a'. Note that
	# the "sparse-checkout set" command downloads the objects at tip
	# to satisfy the current checkout.
	git -C backfill3 sparse-checkout set d &&
	GIT_TRACE2_EVENT="$(pwd)/sparse-trace2" git \
		-C backfill3 backfill --sparse &&
	test_trace2_data promisor fetch_count 8 <sparse-trace2 &&
	test_trace2_data path-walk paths 15 <sparse-trace2 &&
	git -C backfill3 rev-list --quiet --objects --missing=print HEAD >missing &&
	test_line_count = 24 missing &&

	# Disabling the --sparse option (on by default) will download everything
	git -C backfill3 backfill --no-sparse &&
	git -C backfill3 rev-list --quiet --objects --missing=print HEAD >missing &&
	test_line_count = 0 missing
'

test_expect_success 'backfill --sparse without cone mode' '
	git clone --no-checkout --filter=blob:none \
		--single-branch --branch=main \
		"file://$(pwd)/srv.bare" backfill4 &&

	# No blobs yet
	git -C backfill4 rev-list --quiet --objects --missing=print HEAD >missing &&
	test_line_count = 48 missing &&

	# Define sparse-checkout by filename regardless of parent directory.
	# This downloads 6 blobs to satisfy the checkout.
	git -C backfill4 sparse-checkout set --no-cone "**/file.1.txt" &&
	git -C backfill4 checkout main &&

	GIT_TRACE2_EVENT="$(pwd)/no-cone-trace1" git \
		-C backfill4 backfill --sparse &&
	test_trace2_data promisor fetch_count 6 <no-cone-trace1 &&

	# This walk needed to visit all directories to search for these paths.
	test_trace2_data path-walk paths 12 <no-cone-trace1 &&
	git -C backfill4 rev-list --quiet --objects --missing=print HEAD >missing &&
	test_line_count = 36 missing
'

. "$TEST_DIRECTORY"/lib-httpd.sh
start_httpd

test_expect_success 'create a partial clone over HTTP' '
	SERVER="$HTTPD_DOCUMENT_ROOT_PATH/server" &&
	rm -rf "$SERVER" repo &&
	git clone --bare "file://$(pwd)/src" "$SERVER" &&
	test_config -C "$SERVER" uploadpack.allowfilter 1 &&
	test_config -C "$SERVER" uploadpack.allowanysha1inwant 1 &&

	git clone --no-checkout --filter=blob:none \
		"$HTTPD_URL/smart/server" backfill-http
'

test_expect_success 'backfilling over HTTP succeeds' '
	GIT_TRACE2_EVENT="$(pwd)/backfill-http-trace" git \
		-C backfill-http backfill &&

	# We should have engaged the partial clone machinery
	test_trace2_data promisor fetch_count 48 <backfill-http-trace &&

	# Confirm all objects are present, none missing.
	git -C backfill-http rev-list --objects --all >rev-list-out &&
	awk "{print \$1;}" <rev-list-out >oids &&
	GIT_TRACE2_EVENT="$(pwd)/walk-trace" git -C backfill-http \
		cat-file --batch-check <oids >batch-out &&
	! grep missing batch-out
'

# DO NOT add non-httpd-specific tests here, because the last part of this
# test script is only executed when httpd is available and enabled.

test_done
t/t6601-path-walk.sh

@@ -108,6 +108,41 @@ test_expect_success 'all' '
 	test_cmp expect.sorted out.sorted
 '
 
+test_expect_success 'base & topic, sparse' '
+	cat >patterns <<-EOF &&
+	/*
+	!/*/
+	/left/
+	EOF
+
+	test-tool path-walk --stdin-pl -- base topic <patterns >out &&
+
+	cat >expect <<-EOF &&
+	COMMIT::$(git rev-parse topic)
+	COMMIT::$(git rev-parse base)
+	COMMIT::$(git rev-parse base~1)
+	COMMIT::$(git rev-parse base~2)
+	commits:4
+	TREE::$(git rev-parse topic^{tree})
+	TREE::$(git rev-parse base^{tree})
+	TREE::$(git rev-parse base~1^{tree})
+	TREE::$(git rev-parse base~2^{tree})
+	TREE:left/:$(git rev-parse base:left)
+	TREE:left/:$(git rev-parse base~2:left)
+	trees:6
+	BLOB:a:$(git rev-parse base~2:a)
+	BLOB:left/b:$(git rev-parse base~2:left/b)
+	BLOB:left/b:$(git rev-parse base:left/b)
+	blobs:3
+	tags:0
+	EOF
+
+	sort expect >expect.sorted &&
+	sort out >out.sorted &&
+
+	test_cmp expect.sorted out.sorted
+'
+
 test_expect_success 'topic only' '
 	test-tool path-walk -- topic >out &&