
mercurial - Copy an Hg repo with all largefiles

Reposted · Author: 行者123 · Updated: 2023-12-03 08:20:14

We have a large, old repository that contains largefiles. I want to mirror it to a backup server with a cron script that runs hg pull. However, this command does not retrieve the largefiles.

So far I have copied the 2 GB of history, but I am missing 6 GB of largefiles. How can I get Hg to pull down those big files?

Best Answer

By default, only the largefiles belonging to the revision you update to are downloaded.

'hg help largefiles' says:

When you pull a changeset that affects largefiles from a remote repository,
the largefiles for the changeset will by default not be pulled down. However,
when you update to such a revision, any largefiles needed by that revision are
downloaded and cached (if they have never been downloaded before). One way to
pull largefiles when pulling is thus to use --update, which will update your
working copy to the latest pulled revision (and thereby downloading any new
largefiles).

If you want to pull largefiles you don't need for update yet, then you can use
pull with the "--lfrev" option or the "hg lfpull" command.

For that, you should be able to use 'hg lfpull --rev "all()"'.

Regarding "mercurial - Copy an Hg repo with all largefiles", a similar question was found on Stack Overflow: https://stackoverflow.com/questions/25490455/
