Compare commits


No commits in common. "master" and "v2.0" have entirely different histories.
master ... v2.0

74 changed files with 1809 additions and 4203 deletions

.flake8

@@ -1,15 +1,3 @@
 [flake8]
-max-line-length=100
-ignore=
-# E111: Indentation is not a multiple of four
-E111,
-# E114: Indentation is not a multiple of four (comment)
-E114,
-# E402: Module level import not at top of file
-E402,
-# E731: do not assign a lambda expression, use a def
-E731,
-# W503: Line break before binary operator
-W503,
-# W504: Line break after binary operator
-W504
+max-line-length=80
+ignore=E111,E114,E402


@@ -1,31 +0,0 @@
# GitHub actions workflow.
# https://help.github.com/en/actions/automating-your-workflow-with-github-actions/workflow-syntax-for-github-actions
name: Test CI
on:
push:
branches: [master, repo-1, stable, maint]
tags: [v*]
jobs:
test:
strategy:
fail-fast: false
matrix:
os: [ubuntu-latest, macos-latest, windows-latest]
python-version: [3.6, 3.7, 3.8]
runs-on: ${{ matrix.os }}
steps:
- uses: actions/checkout@v2
- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@v1
with:
python-version: ${{ matrix.python-version }}
- name: Install dependencies
run: |
python -m pip install --upgrade pip
pip install tox tox-gh-actions
- name: Test with tox
run: tox

.gitignore

@@ -1,4 +1,3 @@
-*.asc
 *.egg-info/
 *.log
 *.pyc


@@ -4,7 +4,6 @@ Hu Xiuyun <xiuyun.hu@hisilicon.com> Hu xiuyun <xiuyun.hu@hisilicon.com>
 Hu Xiuyun <xiuyun.hu@hisilicon.com> Hu Xiuyun <clouds08@qq.com>
 Jelly Chen <chenguodong@huawei.com> chenguodong <chenguodong@huawei.com>
 Jia Bi <bijia@xiaomi.com> bijia <bijia@xiaomi.com>
-Jiri Tyr <jiri.tyr@gmail.com> Jiri tyr <jiri.tyr@gmail.com>
 JoonCheol Park <jooncheol@gmail.com> Jooncheol Park <jooncheol@gmail.com>
 Sergii Pylypenko <x.pelya.x@gmail.com> pelya <x.pelya.x@gmail.com>
 Shawn Pearce <sop@google.com> Shawn O. Pearce <sop@google.com>


@@ -6,29 +6,15 @@ development workflow. Repo is not meant to replace Git, only to make it
 easier to work with Git. The repo command is an executable Python script
 that you can put anywhere in your path.
-* Homepage: <https://gerrit.googlesource.com/git-repo/>
-* Mailing list: [repo-discuss on Google Groups][repo-discuss]
-* Bug reports: <https://bugs.chromium.org/p/gerrit/issues/list?q=component:repo>
-* Source: <https://gerrit.googlesource.com/git-repo/>
-* Overview: <https://source.android.com/source/developing.html>
-* Docs: <https://source.android.com/source/using-repo.html>
+* Homepage: https://gerrit.googlesource.com/git-repo/
+* Bug reports: https://bugs.chromium.org/p/gerrit/issues/list?q=component:repo
+* Source: https://gerrit.googlesource.com/git-repo/
+* Overview: https://source.android.com/source/developing.html
+* Docs: https://source.android.com/source/using-repo.html
 * [repo Manifest Format](./docs/manifest-format.md)
 * [repo Hooks](./docs/repo-hooks.md)
 * [Submitting patches](./SUBMITTING_PATCHES.md)
 * Running Repo in [Microsoft Windows](./docs/windows.md)
-* GitHub mirror: <https://github.com/GerritCodeReview/git-repo>
-* Postsubmit tests: <https://github.com/GerritCodeReview/git-repo/actions>
-## Contact
-Please use the [repo-discuss] mailing list or [issue tracker] for questions.
-You can [file a new bug report][new-bug] under the "repo" component.
-Please do not e-mail individual developers for support.
-They do not have the bandwidth for it, and often times questions have already
-been asked on [repo-discuss] or bugs posted to the [issue tracker].
-So please search those sites first.
 ## Install
@@ -48,8 +34,3 @@ $ PATH="${HOME}/.bin:${PATH}"
 $ curl https://storage.googleapis.com/git-repo-downloads/repo > ~/.bin/repo
 $ chmod a+rx ~/.bin/repo
 ```
-[new-bug]: https://bugs.chromium.org/p/gerrit/issues/entry?template=Repo+tool+issue
-[issue tracker]: https://bugs.chromium.org/p/gerrit/issues/list?q=component:repo
-[repo-discuss]: https://groups.google.com/forum/#!forum/repo-discuss


@@ -4,7 +4,7 @@
 - Make small logical changes.
 - Provide a meaningful commit message.
-- Check for coding errors and style nits with flake8.
+- Check for coding errors and style nits with pyflakes and flake8
 - Make sure all code is under the Apache License, 2.0.
 - Publish your changes for review.
 - Make corrections if requested.
@@ -38,30 +38,34 @@ If your description starts to get too long, that's a sign that you
 probably need to split up your commit to finer grained pieces.
-## Check for coding errors and style violations with flake8
-Run `flake8` on changed modules:
+## Check for coding errors and style nits with pyflakes and flake8
+### Coding errors
+Run `pyflakes` on changed modules:
+pyflakes file.py
+Ideally there should be no new errors or warnings introduced.
+### Style violations
+Run `flake8` on changes modules:
 flake8 file.py
-Note that repo generally follows [Google's Python Style Guide] rather than
-[PEP 8], with a couple of notable exceptions:
-* Indentation is at 2 columns rather than 4
-* The maximum line length is 100 columns rather than 80
-There should be no new errors or warnings introduced.
-Warnings that cannot be avoided without going against the Google Style Guide
-may be suppressed inline individally using a `# noqa` comment as described
-in the [flake8 documentation].
-If there are many occurrences of the same warning, these may be suppressed for
-the entire project in the included `.flake8` file.
-[Google's Python Style Guide]: https://google.github.io/styleguide/pyguide.html
+Note that repo generally follows [Google's python style guide] rather than
+[PEP 8], so it's possible that the output of `flake8` will be quite noisy.
+It's not mandatory to avoid all warnings, but at least the maximum line
+length should be followed.
+If there are many occurrences of the same warning that cannot be
+avoided without going against the Google style guide, these may be
+suppressed in the included `.flake8` file.
+[Google's python style guide]: https://google.github.io/styleguide/pyguide.html
 [PEP 8]: https://www.python.org/dev/peps/pep-0008/
-[flake8 documentation]: https://flake8.pycqa.org/en/3.1.1/user/ignoring-errors.html#in-line-ignoring-errors
 ## Running tests

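For readers unfamiliar with the inline suppression mentioned in the master-side text above, here is a minimal illustrative sketch; the constant and helper below are invented for illustration and are not part of repo:

```python
# Hypothetical example of the inline suppression mentioned above: the trailing
# `# noqa: E501` comment asks flake8 to skip the line-length check (E501) for
# this single line, rather than relaxing the rule project-wide in `.flake8`.
MANIFEST_DOC_URL = 'https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md'  # noqa: E501


def print_manifest_doc_link():
  """Print a pointer to the manifest format documentation."""
  print('See %s for details.' % MANIFEST_DOC_URL)
```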

@@ -84,7 +84,6 @@ def _Color(fg=None, bg=None, attr=None):
 code = ''
 return code
 DEFAULT = None


@@ -66,8 +66,7 @@ class Command(object):
 usage = self.helpUsage.strip().replace('%prog', me)
 except AttributeError:
 usage = 'repo %s' % self.NAME
-epilog = 'Run `repo help %s` to view the detailed manual.' % self.NAME
-self._optparse = optparse.OptionParser(usage=usage, epilog=epilog)
+self._optparse = optparse.OptionParser(usage=usage)
 self._Options(self._optparse)
 return self._optparse
@@ -124,9 +123,9 @@ class Command(object):
 project = None
 if os.path.exists(path):
 oldpath = None
-while (path and
-path != oldpath and
-path != manifest.topdir):
+while path and \
+path != oldpath and \
+path != manifest.topdir:
 try:
 project = self._by_path[path]
 break
@@ -237,7 +236,6 @@ class InteractiveCommand(Command):
 """Command which requires user interaction on the tty and
 must not run within a pager, even if the user asks to.
 """
 def WantPager(self, _opt):
 return False
@@ -246,7 +244,6 @@ class PagedCommand(Command):
 """Command which defaults to output in a pager, as its
 display tends to be larger than one screen full.
 """
 def WantPager(self, _opt):
 return True

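The while-loop rewritten in the last hunk walks up parent directories until it finds a tracked project or hits the client top. A standalone sketch of that lookup pattern (the `by_path` dict and `topdir` argument are stand-ins for repo's internals, not its actual API):

```python
import os


def find_project_for_path(path, topdir, by_path):
  """Walk up from `path` toward `topdir`, returning the first entry of
  `by_path` (a dict keyed by project checkout dirs) that contains it."""
  oldpath = None
  while path and path != oldpath and path != topdir:
    project = by_path.get(path)
    if project is not None:
      return project
    oldpath = path
    path = os.path.dirname(path)
  return None
```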

@@ -23,10 +23,6 @@ It is always safe to re-run `repo init` in existing repo client checkouts.
 For example, if you want to change the manifest branch, you can simply run
 `repo init --manifest-branch=<new name>` and repo will take care of the rest.
-* `config`: Per-repo client checkout settings using [git-config] file format.
-* `.repo_config.json`: JSON cache of the `config` file for repo to
-read/process quickly.
 ### repo/ state
 * `repo/`: A git checkout of the repo project. This is how `repo` re-execs
@@ -34,7 +30,7 @@ For example, if you want to change the manifest branch, you can simply run
 It tracks the git repository at `REPO_URL` using the `REPO_REV` branch.
 Those are specified at `repo init` time using the `--repo-url=<REPO_URL>`
-and `--repo-rev=<REPO_REV>` options.
+and `--repo-branch=<REPO_REV>` options.
 Any changes made to this directory will usually be automatically discarded
 by repo itself when it checks for updates. If you want to update to the
@@ -68,20 +64,13 @@ support, see the [manifest-format.md] file.
 If you want to switch the tracking settings, re-run `repo init` with the
 new settings.
-* `manifest.xml`: The manifest that repo uses. It is generated at `repo init`
-and uses the `--manifest-name` to determine what manifest file to load next
-out of `manifests/`.
-Do not try to modify this to load other manifests as it will confuse repo.
-If you want to switch manifest files, re-run `repo init` with the new
-setting.
-Older versions of repo managed this with symlinks.
 * `manifest.xml -> manifests/<manifest-name>.xml`: A symlink to the manifest
 that the user wishes to sync. It is specified at `repo init` time via
 `--manifest-name`.
+Do not try to repoint this symlink to other files as it will confuse repo.
+If you want to switch manifest files, re-run `repo init` with the new
+setting.
 * `manifests.git/.repo_config.json`: JSON cache of the `manifests.git/config`
 file for repo to read/process quickly.
@@ -103,27 +92,18 @@ support, see the [manifest-format.md] file.
 Some git state is further split out under `project-objects/`.
 * `project-objects/`: Git objects that are safe to share across multiple
 git checkouts. The filesystem layout matches the `<project name=...`
-setting in the manifest (i.e. the path on the remote server) with a `.git`
-suffix. This allows for multiple checkouts of the same remote git repo to
-share their objects. For example, you could have different branches of
-`foo/bar.git` checked out to `foo/bar-master`, `foo/bar-release`, etc...
-There will be multiple trees under `projects/` for each one, but only one
-under `project-objects/`.
-This layout is designed to allow people to sync against different remotes
-(e.g. a local mirror & a public review server) while avoiding duplicating
-the content. However, this can run into problems if different remotes use
-the same path on their respective servers. Best to avoid that.
+setting in the manifest (i.e. the path on the remote server). This allows
+for multiple checkouts of the same remote git repo to share their objects.
+For example, you could have different branches of `foo/bar.git` checked
+out to `foo/bar-master`, `foo/bar-release`, etc... There will be multiple
+trees under `projects/` for each one, but only one under `project-objects/`.
+This can run into problems if different remotes use the same path on their
+respective servers ...
 * `subprojects/`: Like `projects/`, but for git submodules.
 * `subproject-objects/`: Like `project-objects/`, but for git submodules.
-* `worktrees/`: Bare checkouts of every project synced by the manifest. The
-filesystem layout matches the `<project name=...` setting in the manifest
-(i.e. the path on the remote server) with a `.git` suffix. This has the
-same advantages as the `project-objects/` layout above.
-This is used when git worktrees are enabled.
-### Global settings
+### Settings
 The `.repo/manifests.git/config` file is used to track settings for the entire
 repo client checkout.
@@ -134,7 +114,6 @@ User controlled settings are initialized when running `repo init`.
 |-------------------|---------------------------|-------------|
 | manifest.groups | `--groups` & `--platform` | The manifest groups to sync |
 | repo.archive | `--archive` | Use `git archive` for checkouts |
-| repo.clonebundle | `--clone-bundle` | Whether the initial sync used clone.bundle explicitly |
 | repo.clonefilter | `--clone-filter` | Filter setting when using [partial git clones] |
 | repo.depth | `--depth` | Create shallow checkouts when cloning |
 | repo.dissociate | `--dissociate` | Dissociate from any reference/mirrors after initial clone |
@@ -142,91 +121,25 @@ User controlled settings are initialized when running `repo init`.
 | repo.partialclone | `--partial-clone` | Create [partial git clones] |
 | repo.reference | `--reference` | Reference repo client checkout |
 | repo.submodules | `--submodules` | Sync git submodules |
-| repo.worktree | `--worktree` | Use `git worktree` for checkouts |
 | user.email | `--config-name` | User's e-mail address; Copied into `.git/config` when checking out a new project |
 | user.name | `--config-name` | User's name; Copied into `.git/config` when checking out a new project |
 [partial git clones]: https://git-scm.com/docs/gitrepository-layout#_code_partialclone_code
### Repo hooks settings
For more details on this feature, see the [repo-hooks docs](./repo-hooks.md).
We'll just discuss the internal configuration settings.
These are stored in the registered `<repo-hooks>` project itself, so if the
manifest switches to a different project, the settings will not be copied.
| Setting | Use/Meaning |
|--------------------------------------|-------------|
| repo.hooks.\<hook\>.approvedmanifest | User approval for secure manifest sources (e.g. https://) |
| repo.hooks.\<hook\>.approvedhash | User approval for insecure manifest sources (e.g. http://) |
For example, if our manifest had the following entries, we would store settings
under `.repo/projects/src/repohooks.git/config` (which would be reachable via
`git --git-dir=src/repohooks/.git config`).
```xml
<project path="src/repohooks" name="chromiumos/repohooks" ... />
<repo-hooks in-project="chromiumos/repohooks" ... />
```
If `<hook>` is `pre-upload`, the `.git/config` setting might be:
```ini
[repo "hooks.pre-upload"]
approvedmanifest = https://chromium.googlesource.com/chromiumos/manifest
```
## Per-project settings
These settings are somewhat meant to be tweaked by the user on a per-project
basis (e.g. `git config` in a checked out source repo).
Where possible, we re-use standard git settings to avoid confusion, and we
refrain from documenting those, so see [git-config] documentation instead.
See `repo help upload` for documentation on `[review]` settings.
The `[remote]` settings are automatically populated/updated from the manifest.
The `[branch]` settings are updated by `repo start` and `git branch`.
| Setting | Subcommands | Use/Meaning |
|-------------------------------|---------------|-------------|
| review.\<url\>.autocopy | upload | Automatically add to `--cc=<value>` |
| review.\<url\>.autoreviewer | upload | Automatically add to `--reviewers=<value>` |
| review.\<url\>.autoupload | upload | Automatically answer "yes" or "no" to all prompts |
| review.\<url\>.uploadhashtags | upload | Automatically add to `--hashtag=<value>` |
| review.\<url\>.uploadlabels | upload | Automatically add to `--label=<value>` |
| review.\<url\>.uploadnotify | upload | [Notify setting][upload-notify] to use |
| review.\<url\>.uploadtopic | upload | Default [topic] to use |
| review.\<url\>.username | upload | Override username with `ssh://` review URIs |
| remote.\<remote\>.fetch | sync | Set of refs to fetch |
| remote.\<remote\>.projectname | \<network\> | The name of the project as it exists in Gerrit review |
| remote.\<remote\>.pushurl | upload | The base URI for pushing CLs |
| remote.\<remote\>.review | upload | The URI of the Gerrit review server |
| remote.\<remote\>.url | sync & upload | The URI of the git project to fetch |
| branch.\<branch\>.merge | sync & upload | The branch to merge & upload & track |
| branch.\<branch\>.remote | sync & upload | The remote to track |
 ## ~/ dotconfig layout
 Repo will create & maintain a few files in the user's home directory.
 * `.repoconfig/`: Repo's per-user directory for all random config files/state.
-* `.repoconfig/config`: Per-user settings using [git-config] file format.
 * `.repoconfig/keyring-version`: Cache file for checking if the gnupg subdir
 has all the same keys as the repo launcher. Used to avoid running gpg
 constantly as that can be quite slow.
 * `.repoconfig/gnupg/`: GnuPG's internal state directory used when repo needs
 to run `gpg`. This provides isolation from the user's normal `~/.gnupg/`.
-* `.repoconfig/.repo_config.json`: JSON cache of the `.repoconfig/config`
-file for repo to read/process quickly.
 * `.repo_.gitconfig.json`: JSON cache of the `.gitconfig` file for repo to
 read/process quickly.
-[git-config]: https://git-scm.com/docs/git-config
 [manifest-format.md]: ./manifest-format.md
 [local manifests]: ./manifest-format.md#Local-Manifests
-[topic]: https://gerrit-review.googlesource.com/Documentation/intro-user.html#topics
-[upload-notify]: https://gerrit-review.googlesource.com/Documentation/user-upload.html#notify

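Several entries above describe `.repo_config.json` files as JSON caches of git-config files. A rough sketch of the staleness check such a cache implies, with illustrative file paths rather than repo's real implementation:

```python
import json
import os


def read_cached_config(config_path, json_path):
  """Return the cached dict if the JSON cache is newer than the config
  file it mirrors, otherwise None so the caller re-parses the config."""
  try:
    if os.path.getmtime(json_path) <= os.path.getmtime(config_path):
      return None  # cache is stale; the caller should regenerate it
    with open(json_path) as fd:
      return json.load(fd)
  except (OSError, ValueError):
    return None
```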

@@ -396,4 +396,10 @@ these extra projects.
 Manifest files stored in `$TOP_DIR/.repo/local_manifests/*.xml` will
 be loaded in alphabetical order.
-The legacy `$TOP_DIR/.repo/local_manifest.xml` path is no longer supported.
+Additional remotes and projects may also be added through a local
+manifest, stored in `$TOP_DIR/.repo/local_manifest.xml`. This method
+is deprecated in favor of using multiple manifest files as mentioned
+above.
+If `$TOP_DIR/.repo/local_manifest.xml` exists, it will be loaded before
+any manifest files stored in `$TOP_DIR/.repo/local_manifests/*.xml`.

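Both sides of this hunk describe a load order for local manifests. A small illustrative sketch of gathering those paths (the helper itself is hypothetical; the paths come from the text above):

```python
import glob
import os


def local_manifest_files(top_dir, include_legacy=False):
  """Return local manifest paths in load order: the legacy single file
  first (older behavior only), then .repo/local_manifests/*.xml sorted
  alphabetically."""
  paths = []
  legacy = os.path.join(top_dir, '.repo', 'local_manifest.xml')
  if include_legacy and os.path.exists(legacy):
    paths.append(legacy)
  pattern = os.path.join(top_dir, '.repo', 'local_manifests', '*.xml')
  paths.extend(sorted(glob.glob(pattern)))
  return paths
```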

@@ -5,37 +5,6 @@ related topics and flows.
 [TOC]
## Schedule
There is no specific schedule for when releases are made.
Usually it's more along the lines of "enough minor changes have been merged",
or "there's a known issue the maintainers know should get fixed".
If you find a fix has been merged for an issue important to you, but hasn't been
released after a week or so, feel free to [contact] us to request a new release.
### Release Freezes {#freeze}
We try to observe a regular schedule for when **not** to release.
If something goes wrong, staff need to be active in order to respond quickly &
effectively.
We also don't want to disrupt non-Google organizations if possible.
We generally follow the rules:
* Release during Mon - Thu, 9:00 - 14:00 [US PT]
* Avoid holidays
* All regular [US holidays]
* Large international ones if possible
* All the various [New Years]
* Jan 1 in Gregorian calendar is the most obvious
* Check for large Lunar New Years too
* Follow the normal [Google production freeze schedule]
[US holidays]: https://en.wikipedia.org/wiki/Federal_holidays_in_the_United_States
[US PT]: https://en.wikipedia.org/wiki/Pacific_Time_Zone
[New Years]: https://en.wikipedia.org/wiki/New_Year
[Google production freeze schedule]: http://goto.google.com/prod-freeze
 ## Launcher script
 The main repo script serves as a standalone program and is often referred to as
@@ -80,11 +49,11 @@ control how repo finds updates:
 * `--repo-url`: This tells repo where to clone the full repo project itself.
 It defaults to the official project (`REPO_URL` in the launcher script).
-* `--repo-rev`: This tells repo which branch to use for the full project.
+* `--repo-branch`: This tells repo which branch to use for the full project.
 It defaults to the `stable` branch (`REPO_REV` in the launcher script).
 Whenever `repo sync` is run, repo will check to see if an update is available.
-It fetches the latest repo-rev from the repo-url.
+It fetches the latest repo-branch from the repo-url.
 Then it verifies that the latest commit in the branch has a valid signed tag
 using `git tag -v` (which uses gpg).
 If the tag is valid, then repo will update its internal checkout to it.
@@ -122,7 +91,7 @@ When you want to create a new release, you'll need to select a good version and
 create a signed tag using a key registered in repo itself.
 Typically we just tag the latest version of the `master` branch.
 The tag could be pushed now, but it won't be used by clients normally (since the
-default `repo-rev` setting is `stable`).
+default `repo-branch` setting is `stable`).
 This would allow some early testing on systems who explicitly select `master`.
 ### Creating a signed tag
@@ -192,92 +161,7 @@ You can create a short changelog using the command:
 $ git log --format="%h (%aN) %s" --no-merges origin/stable..$r
 ```
## Project References
Here's a table showing the relationship of major tools, their EOL dates, and
their status in Ubuntu & Debian.
Those distros tend to be good indicators of how long we need to support things.
Things in bold indicate stuff to take note of, but does not guarantee that we
still support them.
Things in italics are things we used to care about but probably don't anymore.
| Date | EOL | [Git][rel-g] | [Python][rel-p] | [Ubuntu][rel-u] / [Debian][rel-d] | Git | Python |
|:--------:|:------------:|--------------|-----------------|-----------------------------------|-----|--------|
| Oct 2008 | *Oct 2013* | | 2.6.0 | *10.04 Lucid* - 10.10 Maverick / *Squeeze* |
| Dec 2008 | *Feb 2009* | | 3.0.0 |
| Feb 2009 | *Mar 2012* | | | Debian 5 Lenny | 1.5.6.5 | 2.5.2 |
| Jun 2009 | *Jun 2016* | | 3.1.0 | *10.04 Lucid* - 10.10 Maverick / *Squeeze* |
| Feb 2010 | *Oct 2012* | 1.7.0 | | *10.04 Lucid* - *12.04 Precise* - 12.10 Quantal |
| Apr 2010 | *Apr 2015* | | | *10.04 Lucid* | 1.7.0.4 | 2.6.5 3.1.2 |
| Jul 2010 | *Dec 2019* | | **2.7.0** | 11.04 Natty - **<current>** |
| Oct 2010 | | | | 10.10 Maverick | 1.7.1 | 2.6.6 3.1.3 |
| Feb 2011 | *Feb 2016* | | | Debian 6 Squeeze | 1.7.2.5 | 2.6.6 3.1.3 |
| Apr 2011 | | | | 11.04 Natty | 1.7.4 | 2.7.1 3.2.0 |
| Oct 2011 | *Feb 2016* | | 3.2.0 | 11.04 Natty - 12.10 Quantal |
| Oct 2011 | | | | 11.10 Ocelot | 1.7.5.4 | 2.7.2 3.2.2 |
| Apr 2012 | *Apr 2019* | | | *12.04 Precise* | 1.7.9.5 | 2.7.3 3.2.3 |
| Sep 2012 | *Sep 2017* | | 3.3.0 | 13.04 Raring - 13.10 Saucy |
| Oct 2012 | *Dec 2014* | 1.8.0 | | 13.04 Raring - 13.10 Saucy |
| Oct 2012 | | | | 12.10 Quantal | 1.7.10.4 | 2.7.3 3.2.3 |
| Apr 2013 | | | | 13.04 Raring | 1.8.1.2 | 2.7.4 3.3.1 |
| May 2013 | *May 2018* | | | Debian 7 Wheezy | 1.7.10.4 | 2.7.3 3.2.3 |
| Oct 2013 | | | | 13.10 Saucy | 1.8.3.2 | 2.7.5 3.3.2 |
| Feb 2014 | *Dec 2014* | **1.9.0** | | **14.04 Trusty** |
| Mar 2014 | *Mar 2019* | | **3.4.0** | **14.04 Trusty** - 15.10 Wily / **Jessie** |
| Apr 2014 | **Apr 2022** | | | **14.04 Trusty** | 1.9.1 | 2.7.5 3.4.0 |
| May 2014 | *Dec 2014* | 2.0.0 |
| Aug 2014 | *Dec 2014* | **2.1.0** | | 14.10 Utopic - 15.04 Vivid / **Jessie** |
| Oct 2014 | | | | 14.10 Utopic | 2.1.0 | 2.7.8 3.4.2 |
| Nov 2014 | *Sep 2015* | 2.2.0 |
| Feb 2015 | *Sep 2015* | 2.3.0 |
| Apr 2015 | *May 2017* | 2.4.0 |
| Apr 2015 | **Jun 2020** | | | **Debian 8 Jessie** | 2.1.4 | 2.7.9 3.4.2 |
| Apr 2015 | | | | 15.04 Vivid | 2.1.4 | 2.7.9 3.4.3 |
| Jul 2015 | *May 2017* | 2.5.0 | | 15.10 Wily |
| Sep 2015 | *May 2017* | 2.6.0 |
| Sep 2015 | **Sep 2020** | | **3.5.0** | **16.04 Xenial** - 17.04 Zesty / **Stretch** |
| Oct 2015 | | | | 15.10 Wily | 2.5.0 | 2.7.9 3.4.3 |
| Jan 2016 | *Jul 2017* | **2.7.0** | | **16.04 Xenial** |
| Mar 2016 | *Jul 2017* | 2.8.0 |
| Apr 2016 | **Apr 2024** | | | **16.04 Xenial** | 2.7.4 | 2.7.11 3.5.1 |
| Jun 2016 | *Jul 2017* | 2.9.0 | | 16.10 Yakkety |
| Sep 2016 | *Sep 2017* | 2.10.0 |
| Oct 2016 | | | | 16.10 Yakkety | 2.9.3 | 2.7.11 3.5.1 |
| Nov 2016 | *Sep 2017* | **2.11.0** | | 17.04 Zesty / **Stretch** |
| Dec 2016 | **Dec 2021** | | **3.6.0** | 17.10 Artful - **18.04 Bionic** - 18.10 Cosmic |
| Feb 2017 | *Sep 2017* | 2.12.0 |
| Apr 2017 | | | | 17.04 Zesty | 2.11.0 | 2.7.13 3.5.3 |
| May 2017 | *May 2018* | 2.13.0 |
| Jun 2017 | **Jun 2022** | | | **Debian 9 Stretch** | 2.11.0 | 2.7.13 3.5.3 |
| Aug 2017 | *Dec 2019* | 2.14.0 | | 17.10 Artful |
| Oct 2017 | *Dec 2019* | 2.15.0 |
| Oct 2017 | | | | 17.10 Artful | 2.14.1 | 2.7.14 3.6.3 |
| Jan 2018 | *Dec 2019* | 2.16.0 |
| Apr 2018 | *Dec 2019* | 2.17.0 | | **18.04 Bionic** |
| Apr 2018 | **Apr 2028** | | | **18.04 Bionic** | 2.17.0 | 2.7.15 3.6.5 |
| Jun 2018 | *Dec 2019* | 2.18.0 |
| Jun 2018 | **Jun 2023** | | 3.7.0 | 19.04 Disco - **20.04 Focal** / **Buster** |
| Sep 2018 | *Dec 2019* | 2.19.0 | | 18.10 Cosmic |
| Oct 2018 | | | | 18.10 Cosmic | 2.19.1 | 2.7.15 3.6.6 |
| Dec 2018 | *Dec 2019* | **2.20.0** | | 19.04 Disco / **Buster** |
| Feb 2019 | *Dec 2019* | 2.21.0 |
| Apr 2019 | | | | 19.04 Disco | 2.20.1 | 2.7.16 3.7.3 |
| Jun 2019 | | 2.22.0 |
| Jul 2019 | **Jul 2024** | | | **Debian 10 Buster** | 2.20.1 | 2.7.16 3.7.3 |
| Aug 2019 | | 2.23.0 |
| Oct 2019 | **Oct 2024** | | 3.8.0 |
| Oct 2019 | | | | 19.10 Eoan | 2.20.1 | 2.7.17 3.7.5 |
| Nov 2019 | | 2.24.0 |
| Jan 2020 | | 2.25.0 | | **20.04 Focal** |
| Apr 2020 | **Apr 2030** | | | **20.04 Focal** | 2.25.0 | 2.7.17 3.7.5 |
[contact]: ../README.md#contact
[rel-d]: https://en.wikipedia.org/wiki/Debian_version_history
[rel-g]: https://en.wikipedia.org/wiki/Git#Releases
[rel-p]: https://en.wikipedia.org/wiki/History_of_Python#Table_of_versions
[rel-u]: https://en.wikipedia.org/wiki/Ubuntu_version_history#Table_of_versions
 [example announcement]: https://groups.google.com/d/topic/repo-discuss/UGBNismWo1M/discussion
 [repo-discuss@googlegroups.com]: https://groups.google.com/forum/#!forum/repo-discuss
 [go/repo-release]: https://goto.google.com/repo-release

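The update flow described in this file relies on `git tag -v`, which exits non-zero when the GPG signature does not verify. A hedged sketch of that check using plain subprocess calls (not repo's actual updater):

```python
import subprocess


def tag_signature_ok(tag, cwd):
  """Return True if `git tag -v <tag>` verifies the tag's GPG signature
  in the given checkout; the command exits non-zero otherwise."""
  result = subprocess.run(
      ['git', 'tag', '-v', tag],
      cwd=cwd,
      stdout=subprocess.PIPE,
      stderr=subprocess.PIPE)
  return result.returncode == 0
```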

@@ -19,33 +19,7 @@ also due to most developers not using Windows.
 We will never add code specific to older versions of Windows.
 It might work, but it most likely won't, so please don't bother asking.
-## Git worktrees
-*** note
-**Warning**: Repo's support for Git worktrees is new & experimental.
-Please report any bugs and be sure to maintain backups!
-***
-The Repo 2.4 release introduced support for [Git worktrees][git-worktree].
-You don't have to worry about or understand this particular feature, so don't
-worry if this section of the Git manual is particularly impenetrable.
-The salient point is that Git worktrees allow Repo to create repo client
-checkouts that do not require symlinks at all under Windows.
-This means users no longer need Administrator access to sync code.
-Simply use `--worktree` when running `repo init` to opt in.
-This does not effect specific Git repositories that use symlinks themselves.
-[git-worktree]: https://git-scm.com/docs/git-worktree
-## Symlinks by default
-*** note
-**NB**: This section applies to the default Repo behavior which does not use
-Git worktrees (see the previous section for more info).
-***
+## Symlinks
 Repo will use symlinks heavily internally.
 On *NIX platforms, this isn't an issue, but Windows makes it a bit difficult.
@@ -88,8 +62,9 @@ This also helps `tar` unpack symlinks, so that's nice.
 ## Python
-Python 3.6 or newer is required.
-Python 2 is known to be broken when running under Windows.
+You should make sure to be running Python 3.6 or newer under Windows.
+Python 2 might work, but due to already limited platform testing, you should
+only run newer Python versions.
 See our [Python Support](./python-support.md) document for more details.
 You can grab the latest Windows installer here:<br>


@@ -24,7 +24,6 @@ import tempfile
 from error import EditorError
 import platform_utils
 class Editor(object):
 """Manages the user's preferred text editor."""


@@ -14,26 +14,21 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.
 class ManifestParseError(Exception):
 """Failed to parse the manifest file.
 """
 class ManifestInvalidRevisionError(Exception):
 """The revision value in a project is incorrect.
 """
 class ManifestInvalidPathError(Exception):
 """A path used in <copyfile> or <linkfile> is incorrect.
 """
 class NoManifestException(Exception):
 """The required manifest does not exist.
 """
 def __init__(self, path, reason):
 super(NoManifestException, self).__init__()
 self.path = path
@@ -42,11 +37,9 @@ class NoManifestException(Exception):
 def __str__(self):
 return self.reason
 class EditorError(Exception):
 """Unspecified error from the user's text editor.
 """
 def __init__(self, reason):
 super(EditorError, self).__init__()
 self.reason = reason
@@ -54,11 +47,9 @@ class EditorError(Exception):
 def __str__(self):
 return self.reason
 class GitError(Exception):
 """Unspecified internal error from git.
 """
 def __init__(self, command):
 super(GitError, self).__init__()
 self.command = command
@@ -66,11 +57,9 @@ class GitError(Exception):
 def __str__(self):
 return self.command
 class UploadError(Exception):
 """A bundle upload to Gerrit did not succeed.
 """
 def __init__(self, reason):
 super(UploadError, self).__init__()
 self.reason = reason
@@ -78,11 +67,9 @@ class UploadError(Exception):
 def __str__(self):
 return self.reason
 class DownloadError(Exception):
 """Cannot download a repository.
 """
 def __init__(self, reason):
 super(DownloadError, self).__init__()
 self.reason = reason
@@ -90,11 +77,9 @@ class DownloadError(Exception):
 def __str__(self):
 return self.reason
 class NoSuchProjectError(Exception):
 """A specified project does not exist in the work tree.
 """
 def __init__(self, name=None):
 super(NoSuchProjectError, self).__init__()
 self.name = name
@@ -108,7 +93,6 @@ class NoSuchProjectError(Exception):
 class InvalidProjectGroupsError(Exception):
 """A specified project is not suitable for the specified groups
 """
 def __init__(self, name=None):
 super(InvalidProjectGroupsError, self).__init__()
 self.name = name
@@ -118,18 +102,15 @@ class InvalidProjectGroupsError(Exception):
 return 'in current directory'
 return self.name
 class RepoChangedException(Exception):
 """Thrown if 'repo sync' results in repo updating its internal
 repo or manifest repositories. In this special case we must
 use exec to re-execute repo with the new code and manifest.
 """
 def __init__(self, extra_args=None):
 super(RepoChangedException, self).__init__()
 self.extra_args = extra_args or []
 class HookError(Exception):
 """Thrown if a 'repo-hook' could not be run.


@@ -23,7 +23,6 @@ TASK_COMMAND = 'command'
 TASK_SYNC_NETWORK = 'sync-network'
 TASK_SYNC_LOCAL = 'sync-local'
 class EventLog(object):
 """Event log that records events that occurred during a repo invocation.
@@ -166,7 +165,6 @@ class EventLog(object):
 # An integer id that is unique across this invocation of the program.
 _EVENT_ID = multiprocessing.Value('i', 1)
 def _NextEventId():
 """Helper function for grabbing the next unique id.


@@ -16,7 +16,6 @@
 from __future__ import print_function
 import os
-import re
 import sys
 import subprocess
 import tempfile
@@ -29,17 +28,8 @@ from repo_trace import REPO_TRACE, IsTrace, Trace
 from wrapper import Wrapper
 GIT = 'git'
-# NB: These do not need to be kept in sync with the repo launcher script.
-# These may be much newer as it allows the repo launcher to roll between
-# different repo releases while source versions might require a newer git.
-#
-# The soft version is when we start warning users that the version is old and
-# we'll be dropping support for it. We'll refuse to work with versions older
-# than the hard version.
-#
-# git-1.7 is in (EOL) Ubuntu Precise. git-1.9 is in Ubuntu Trusty.
-MIN_GIT_VERSION_SOFT = (1, 9, 1)
-MIN_GIT_VERSION_HARD = (1, 7, 2)
+# Should keep in sync with the "repo" launcher file.
+MIN_GIT_VERSION = (2, 10, 2)
 GIT_DIR = 'GIT_DIR'
 LAST_GITDIR = None
@@ -48,36 +38,6 @@ LAST_CWD = None
 _ssh_proxy_path = None
 _ssh_sock_path = None
 _ssh_clients = []
_ssh_version = None
def _run_ssh_version():
"""run ssh -V to display the version number"""
return subprocess.check_output(['ssh', '-V'], stderr=subprocess.STDOUT).decode()
def _parse_ssh_version(ver_str=None):
"""parse a ssh version string into a tuple"""
if ver_str is None:
ver_str = _run_ssh_version()
m = re.match(r'^OpenSSH_([0-9.]+)(p[0-9]+)?\s', ver_str)
if m:
return tuple(int(x) for x in m.group(1).split('.'))
else:
return ()
def ssh_version():
"""return ssh version as a tuple"""
global _ssh_version
if _ssh_version is None:
try:
_ssh_version = _parse_ssh_version()
except subprocess.CalledProcessError:
print('fatal: unable to detect ssh version', file=sys.stderr)
sys.exit(1)
return _ssh_version
 def ssh_sock(create=True):
 global _ssh_sock_path
@@ -87,16 +47,11 @@ def ssh_sock(create=True):
 tmp_dir = '/tmp'
 if not os.path.exists(tmp_dir):
 tmp_dir = tempfile.gettempdir()
-if ssh_version() < (6, 7):
-tokens = '%r@%h:%p'
-else:
-tokens = '%C' # hash of %l%h%p%r
 _ssh_sock_path = os.path.join(
 tempfile.mkdtemp('', 'ssh-', tmp_dir),
-'master-' + tokens)
+'master-%r@%h:%p')
 return _ssh_sock_path
 def _ssh_proxy():
 global _ssh_proxy_path
 if _ssh_proxy_path is None:
@@ -105,18 +60,15 @@ def _ssh_proxy():
 'git_ssh')
 return _ssh_proxy_path
 def _add_ssh_client(p):
 _ssh_clients.append(p)
 def _remove_ssh_client(p):
 try:
 _ssh_clients.remove(p)
 except ValueError:
 pass
 def terminate_ssh_clients():
 global _ssh_clients
 for p in _ssh_clients:
@@ -127,10 +79,8 @@ def terminate_ssh_clients():
 pass
 _ssh_clients = []
 _git_version = None
 class _GitCall(object):
 def version_tuple(self):
 global _git_version
@@ -143,14 +93,11 @@ class _GitCall(object):
 def __getattr__(self, name):
 name = name.replace('_','-')
 def fun(*cmdv):
 command = [name]
 command.extend(cmdv)
 return GitCommand(None, command).Wait() == 0
 return fun
 git = _GitCall()
@@ -231,10 +178,8 @@ class UserAgent(object):
 return self._git_ua
 user_agent = UserAgent()
 def git_require(min_version, fail=False, msg=''):
 git_version = git.version_tuple()
 if min_version <= git_version:
@@ -247,6 +192,8 @@ def git_require(min_version, fail=False, msg=''):
 sys.exit(1)
 return False
+def _setenv(env, name, value):
+env[name] = value.encode()
 class GitCommand(object):
 def __init__(self,
@@ -256,7 +203,6 @@ class GitCommand(object):
 provide_stdin = False,
 capture_stdout = False,
 capture_stderr = False,
-merge_output=False,
 disable_editor = False,
 ssh_proxy = False,
 cwd = None,
@@ -267,21 +213,21 @@ class GitCommand(object):
 self.tee = {'stdout': not capture_stdout, 'stderr': not capture_stderr}
 if disable_editor:
-env['GIT_EDITOR'] = ':'
+_setenv(env, 'GIT_EDITOR', ':')
 if ssh_proxy:
-env['REPO_SSH_SOCK'] = ssh_sock()
-env['GIT_SSH'] = _ssh_proxy()
-env['GIT_SSH_VARIANT'] = 'ssh'
+_setenv(env, 'REPO_SSH_SOCK', ssh_sock())
+_setenv(env, 'GIT_SSH', _ssh_proxy())
+_setenv(env, 'GIT_SSH_VARIANT', 'ssh')
 if 'http_proxy' in env and 'darwin' == sys.platform:
 s = "'http.proxy=%s'" % (env['http_proxy'],)
 p = env.get('GIT_CONFIG_PARAMETERS')
 if p is not None:
 s = p + ' ' + s
-env['GIT_CONFIG_PARAMETERS'] = s
+_setenv(env, 'GIT_CONFIG_PARAMETERS', s)
 if 'GIT_ALLOW_PROTOCOL' not in env:
-env['GIT_ALLOW_PROTOCOL'] = (
-'file:git:http:https:ssh:persistent-http:persistent-https:sso:rpc')
-env['GIT_HTTP_USER_AGENT'] = user_agent.git
+_setenv(env, 'GIT_ALLOW_PROTOCOL',
+'file:git:http:https:ssh:persistent-http:persistent-https:sso:rpc')
+_setenv(env, 'GIT_HTTP_USER_AGENT', user_agent.git)
 if project:
 if not cwd:
@@ -292,7 +238,7 @@ class GitCommand(object):
 command = [GIT]
 if bare:
 if gitdir:
-env[GIT_DIR] = gitdir
+_setenv(env, GIT_DIR, gitdir)
 cwd = None
 command.append(cmdv[0])
 # Need to use the --progress flag for fetch/clone so output will be
@@ -308,7 +254,7 @@ class GitCommand(object):
 stdin = None
 stdout = subprocess.PIPE
-stderr = subprocess.STDOUT if merge_output else subprocess.PIPE
+stderr = subprocess.PIPE
 if IsTrace():
 global LAST_CWD
@@ -336,8 +282,6 @@ class GitCommand(object):
 dbg += ' 1>|'
 if stderr == subprocess.PIPE:
 dbg += ' 2>|'
-elif stderr == subprocess.STDOUT:
-dbg += ' 2>&1'
 Trace('%s', dbg)
 try:
@@ -385,7 +329,6 @@ class GitCommand(object):
 p = self.process
 s_in = platform_utils.FileDescriptorStreams.create()
 s_in.add(p.stdout, sys.stdout, 'stdout')
-if p.stderr is not None:
 s_in.add(p.stderr, sys.stderr, 'stderr')
 self.stdout = ''
 self.stderr = ''

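For reference, the ssh-version handling removed in this file boils down to: parse `ssh -V`, then pick the ControlMaster socket token (`%C` requires OpenSSH 6.7 or newer). A standalone sketch mirroring the code shown above, not a drop-in replacement:

```python
import re
import subprocess


def parse_openssh_version():
  """Return the local OpenSSH version as a tuple, e.g. (7, 6), or () if
  the `ssh -V` output does not look like OpenSSH."""
  out = subprocess.check_output(['ssh', '-V'], stderr=subprocess.STDOUT).decode()
  m = re.match(r'^OpenSSH_([0-9.]+)(p[0-9]+)?\s', out)
  return tuple(int(x) for x in m.group(1).split('.')) if m else ()


def control_path_token():
  """Pick the ssh ControlPath token: %C (a hash of %l%h%p%r) on OpenSSH
  6.7+, the older %r@%h:%p form otherwise."""
  return '%C' if parse_openssh_version() >= (6, 7) else '%r@%h:%p'
```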

@@ -21,7 +21,6 @@ import errno
 import json
 import os
 import re
-import signal
 import ssl
 import subprocess
 import sys
@@ -42,6 +41,7 @@ else:
 urllib.request = urllib2
 urllib.error = urllib2
+from signal import SIGTERM
 from error import GitError, UploadError
 import platform_utils
 from repo_trace import Trace
@@ -59,23 +59,18 @@ ID_RE = re.compile(r'^[0-9a-f]{40}$')
 REVIEW_CACHE = dict()
 def IsChange(rev):
 return rev.startswith(R_CHANGES)
 def IsId(rev):
 return ID_RE.match(rev)
 def IsTag(rev):
 return rev.startswith(R_TAGS)
 def IsImmutable(rev):
 return IsChange(rev) or IsId(rev) or IsTag(rev)
 def _key(name):
 parts = name.split('.')
 if len(parts) < 2:
@@ -84,16 +79,13 @@ def _key(name):
 parts[-1] = parts[-1].lower()
 return '.'.join(parts)
 class GitConfig(object):
 _ForUser = None
-_USER_CONFIG = '~/.gitconfig'
 @classmethod
 def ForUser(cls):
 if cls._ForUser is None:
-cls._ForUser = cls(configfile=os.path.expanduser(cls._USER_CONFIG))
+cls._ForUser = cls(configfile = os.path.expanduser('~/.gitconfig'))
 return cls._ForUser
 @classmethod
@@ -124,43 +116,6 @@ class GitConfig(object):
 return self.defaults.Has(name, include_defaults = True)
 return False
def GetInt(self, name):
"""Returns an integer from the configuration file.
This follows the git config syntax.
Args:
name: The key to lookup.
Returns:
None if the value was not defined, or is not a boolean.
Otherwise, the number itself.
"""
v = self.GetString(name)
if v is None:
return None
v = v.strip()
mult = 1
if v.endswith('k'):
v = v[:-1]
mult = 1024
elif v.endswith('m'):
v = v[:-1]
mult = 1024 * 1024
elif v.endswith('g'):
v = v[:-1]
mult = 1024 * 1024 * 1024
base = 10
if v.startswith('0x'):
base = 16
try:
return int(v, base=base) * mult
except ValueError:
return None
 def GetBoolean(self, name):
 """Returns a boolean from the configuration file.
 None : The value was not defined, or is not a boolean.
@@ -313,7 +268,8 @@ class GitConfig(object):
 def _ReadJson(self):
 try:
-if os.path.getmtime(self._json) <= os.path.getmtime(self.file):
+if os.path.getmtime(self._json) \
+<= os.path.getmtime(self.file):
 platform_utils.remove(self._json)
 return None
 except OSError:
@@ -362,7 +318,7 @@ class GitConfig(object):
 return c
 def _do(self, *args):
-command = ['config', '--file', self.file, '--includes']
+command = ['config', '--file', self.file]
 command.extend(args)
 p = GitCommand(None,
@@ -375,12 +331,6 @@ class GitConfig(object):
 GitError('git config %s: %s' % (str(args), p.stderr))
-class RepoConfig(GitConfig):
-"""User settings for repo itself."""
-_USER_CONFIG = '~/.repoconfig/config'
 class RefSpec(object):
 """A Git refspec line, split into its components:
@@ -442,7 +392,6 @@ _master_keys = set()
 _ssh_master = True
 _master_keys_lock = None
 def init_ssh():
 """Should be called once at the start of repo to init ssh master handling.
@@ -452,7 +401,6 @@ def init_ssh():
 assert _master_keys_lock is None, "Should only call init_ssh once"
 _master_keys_lock = _threading.Lock()
 def _open_ssh(host, port=None):
 global _ssh_master
@@ -473,9 +421,9 @@ def _open_ssh(host, port=None):
 if key in _master_keys:
 return True
-if (not _ssh_master
-or 'GIT_SSH' in os.environ
-or sys.platform in ('win32', 'cygwin')):
+if not _ssh_master \
+or 'GIT_SSH' in os.environ \
+or sys.platform in ('win32', 'cygwin'):
 # failed earlier, or cygwin ssh can't do this
 #
 return False
@@ -510,7 +458,9 @@ def _open_ssh(host, port=None):
 # to the log there.
 pass
-command = command_base[:1] + ['-M', '-N'] + command_base[1:]
+command = command_base[:1] + \
+['-M', '-N'] + \
+command_base[1:]
 try:
 Trace(': %s', ' '.join(command))
 p = subprocess.Popen(command)
@@ -531,7 +481,6 @@ def _open_ssh(host, port=None):
 finally:
 _master_keys_lock.release()
 def close_ssh():
 global _master_keys_lock
@@ -539,7 +488,7 @@ def close_ssh():
 for p in _master_processes:
 try:
-os.kill(p.pid, signal.SIGTERM)
+os.kill(p.pid, SIGTERM)
 p.wait()
 except OSError:
 pass
@@ -556,18 +505,15 @@ def close_ssh():
 # We're done with the lock, so we can delete it.
 _master_keys_lock = None
 URI_SCP = re.compile(r'^([^@:]*@?[^:/]{1,}):')
 URI_ALL = re.compile(r'^([a-z][a-z+-]*)://([^@/]*@?[^/]*)/')
 def GetSchemeFromUrl(url):
 m = URI_ALL.match(url)
 if m:
 return m.group(1)
 return None
 @contextlib.contextmanager
 def GetUrlCookieFile(url, quiet):
 if url.startswith('persistent-'):
@@ -608,7 +554,6 @@ def GetUrlCookieFile(url, quiet):
 cookiefile = os.path.expanduser(cookiefile)
 yield cookiefile, None
 def _preconnect(url):
 m = URI_ALL.match(url)
 if m:
@@ -629,11 +574,9 @@ def _preconnect(url):
 return False
 class Remote(object):
 """Configuration options related to a remote.
 """
 def __init__(self, config, name):
 self._config = config
 self.name = name
@@ -656,8 +599,8 @@ class Remote(object):
 insteadOfList = globCfg.GetString(key, all_keys=True)
 for insteadOf in insteadOfList:
-if (self.url.startswith(insteadOf)
-and len(insteadOf) > len(longest)):
+if self.url.startswith(insteadOf) \
+and len(insteadOf) > len(longest):
 longest = insteadOf
 longestUrl = url
@@ -794,7 +737,6 @@ class Remote(object):
 class Branch(object):
 """Configuration options related to a single branch.
 """
 def __init__(self, config, name):
 self._config = config
 self.name = name

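One change above adds `--includes` to the `git config --file` invocation so `include.path` directives are honored. A small sketch of what that flag does when reading a single key (illustrative wrapper, not repo's GitConfig):

```python
import subprocess


def get_config_value(config_file, key):
  """Read one key from a git-config file, following include.path
  directives the way the `--includes` flag shown above does."""
  try:
    out = subprocess.check_output(
        ['git', 'config', '--file', config_file, '--includes', '--get', key])
    return out.decode().strip()
  except subprocess.CalledProcessError:
    return None  # key not set
```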

@@ -23,8 +23,6 @@ R_CHANGES = 'refs/changes/'
 R_HEADS = 'refs/heads/'
 R_TAGS = 'refs/tags/'
 R_PUB = 'refs/published/'
-R_WORKTREE = 'refs/worktree/'
-R_WORKTREE_M = R_WORKTREE + 'm/'
 R_M = 'refs/remotes/m/'


@@ -29,15 +29,12 @@ from error import ManifestParseError
 NUM_BATCH_RETRIEVE_REVISIONID = 32
 def get_gitc_manifest_dir():
 return wrapper.Wrapper().get_gitc_manifest_dir()
 def parse_clientdir(gitc_fs_path):
 return wrapper.Wrapper().gitc_parse_clientdir(gitc_fs_path)
 def _set_project_revisions(projects):
 """Sets the revisionExpr for a list of projects.
@@ -66,7 +63,6 @@ def _set_project_revisions(projects):
 (proj.remote.url, proj.revisionExpr))
 proj.revisionExpr = revisionExpr
 def _manifest_groups(manifest):
 """Returns the manifest group string that should be synced
@@ -81,7 +77,6 @@ def _manifest_groups(manifest):
 groups = 'default,platform-' + platform.system().lower()
 return groups
 def generate_gitc_manifest(gitc_manifest, manifest, paths=None):
 """Generate a manifest for shafsd to use for this GITC client.
@@ -109,11 +104,11 @@ def generate_gitc_manifest(gitc_manifest, manifest, paths=None):
 if not proj.upstream and not git_config.IsId(proj.revisionExpr):
 proj.upstream = proj.revisionExpr
-if path not in gitc_manifest.paths:
+if not path in gitc_manifest.paths:
 # Any new projects need their first revision, even if we weren't asked
 # for them.
 projects.append(proj)
-elif path not in paths:
+elif not path in paths:
 # And copy revisions from the previous manifest if we're not updating
 # them now.
 gitc_proj = gitc_manifest.paths[path]
@@ -145,7 +140,6 @@ def generate_gitc_manifest(gitc_manifest, manifest, paths=None):
 # Save the manifest.
 save_manifest(manifest)
 def save_manifest(manifest, client_dir=None):
 """Save the manifest file in the client_dir.

431
hooks.py
View File

@ -1,431 +0,0 @@
# -*- coding:utf-8 -*-
#
# Copyright (C) 2008 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import errno
import json
import os
import re
import subprocess
import sys
import traceback
from error import HookError
from git_refs import HEAD
from pyversion import is_python3
if is_python3():
import urllib.parse
else:
import imp
import urlparse
urllib = imp.new_module('urllib')
urllib.parse = urlparse
input = raw_input # noqa: F821
class RepoHook(object):
"""A RepoHook contains information about a script to run as a hook.
Hooks are used to run a python script before running an upload (for instance,
to run presubmit checks). Eventually, we may have hooks for other actions.
This shouldn't be confused with files in the 'repo/hooks' directory. Those
files are copied into each '.git/hooks' folder for each project. Repo-level
hooks are associated instead with repo actions.
Hooks are always python. When a hook is run, we will load the hook into the
interpreter and execute its main() function.
"""
def __init__(self,
hook_type,
hooks_project,
topdir,
manifest_url,
abort_if_user_denies=False):
"""RepoHook constructor.
Params:
hook_type: A string representing the type of hook. This is also used
to figure out the name of the file containing the hook. For
example: 'pre-upload'.
hooks_project: The project containing the repo hooks. If you have a
manifest, this is manifest.repo_hooks_project. OK if this is None,
which will make the hook a no-op.
topdir: Repo's top directory (the one containing the .repo directory).
Scripts will run with CWD as this directory. If you have a manifest,
this is manifest.topdir
manifest_url: The URL to the manifest git repo.
abort_if_user_denies: If True, we'll throw a HookError() if the user
doesn't allow us to run the hook.
"""
self._hook_type = hook_type
self._hooks_project = hooks_project
self._manifest_url = manifest_url
self._topdir = topdir
self._abort_if_user_denies = abort_if_user_denies
# Store the full path to the script for convenience.
if self._hooks_project:
self._script_fullpath = os.path.join(self._hooks_project.worktree,
self._hook_type + '.py')
else:
self._script_fullpath = None
def _GetHash(self):
"""Return a hash of the contents of the hooks directory.
We'll just use git to do this. This hash has the property that if anything
changes in the directory we will return a different hash.
SECURITY CONSIDERATION:
This hash only represents the contents of files in the hook directory, not
any other files imported or called by hooks. Changes to imported files
can change the script behavior without affecting the hash.
Returns:
A string representing the hash. This will always be ASCII so that it can
be printed to the user easily.
"""
assert self._hooks_project, "Must have hooks to calculate their hash."
# We will use the work_git object rather than just calling GetRevisionId().
# That gives us a hash of the latest checked in version of the files that
# the user will actually be executing. Specifically, GetRevisionId()
# doesn't appear to change even if a user checks out a different version
of the hooks repo (via git checkout) or if a user commits their own revs.
#
# NOTE: Local (non-committed) changes will not be factored into this hash.
# I think this is OK, since we're really only worried about warning the user
# about upstream changes.
return self._hooks_project.work_git.rev_parse('HEAD')
def _GetMustVerb(self):
"""Return 'must' if the hook is required; 'should' if not."""
if self._abort_if_user_denies:
return 'must'
else:
return 'should'
def _CheckForHookApproval(self):
"""Check to see whether this hook has been approved.
We'll accept approval of manifest URLs if they're using secure transports.
This way the user can say they trust the manifest host. For insecure
hosts, we fall back to checking the hash of the hooks repo.
Note that we ask permission for each individual hook even though we use
the hash of all hooks when detecting changes. We'd like the user to be
able to approve / deny each hook individually. We only use the hash of all
hooks because there is no other easy way to detect changes to local imports.
Returns:
True if this hook is approved to run; False otherwise.
Raises:
HookError: Raised if the user doesn't approve and abort_if_user_denies
was passed to the constructor.
"""
if self._ManifestUrlHasSecureScheme():
return self._CheckForHookApprovalManifest()
else:
return self._CheckForHookApprovalHash()
def _CheckForHookApprovalHelper(self, subkey, new_val, main_prompt,
changed_prompt):
"""Check for approval for a particular attribute and hook.
Args:
subkey: The git config key under [repo.hooks.<hook_type>] to store the
last approved string.
new_val: The new value to compare against the last approved one.
main_prompt: Message to display to the user to ask for approval.
changed_prompt: Message explaining why we're re-asking for approval.
Returns:
True if this hook is approved to run; False otherwise.
Raises:
HookError: Raised if the user doesn't approve and abort_if_user_denies
was passed to the constructor.
"""
hooks_config = self._hooks_project.config
git_approval_key = 'repo.hooks.%s.%s' % (self._hook_type, subkey)
# Get the last value that the user approved for this hook; may be None.
old_val = hooks_config.GetString(git_approval_key)
if old_val is not None:
# User previously approved hook and asked not to be prompted again.
if new_val == old_val:
# Approval matched. We're done.
return True
else:
# Give the user a reason why we're prompting, since they last told
# us to "never ask again".
prompt = 'WARNING: %s\n\n' % (changed_prompt,)
else:
prompt = ''
# Prompt the user if we're on a tty; otherwise we'll assume "no".
if sys.stdout.isatty():
prompt += main_prompt + ' (yes/always/NO)? '
response = input(prompt).lower()
print()
# User is doing a one-time approval.
if response in ('y', 'yes'):
return True
elif response == 'always':
hooks_config.SetString(git_approval_key, new_val)
return True
# For anything else, we'll assume no approval.
if self._abort_if_user_denies:
raise HookError('You must allow the %s hook or use --no-verify.' %
self._hook_type)
return False
def _ManifestUrlHasSecureScheme(self):
"""Check if the URI for the manifest is a secure transport."""
secure_schemes = ('file', 'https', 'ssh', 'persistent-https', 'sso', 'rpc')
parse_results = urllib.parse.urlparse(self._manifest_url)
return parse_results.scheme in secure_schemes
def _CheckForHookApprovalManifest(self):
"""Check whether the user has approved this manifest host.
Returns:
True if this hook is approved to run; False otherwise.
"""
return self._CheckForHookApprovalHelper(
'approvedmanifest',
self._manifest_url,
'Run hook scripts from %s' % (self._manifest_url,),
'Manifest URL has changed since %s was allowed.' % (self._hook_type,))
def _CheckForHookApprovalHash(self):
"""Check whether the user has approved the hooks repo.
Returns:
True if this hook is approved to run; False otherwise.
"""
prompt = ('Repo %s run the script:\n'
' %s\n'
'\n'
'Do you want to allow this script to run')
return self._CheckForHookApprovalHelper(
'approvedhash',
self._GetHash(),
prompt % (self._GetMustVerb(), self._script_fullpath),
'Scripts have changed since %s was allowed.' % (self._hook_type,))
@staticmethod
def _ExtractInterpFromShebang(data):
"""Extract the interpreter used in the shebang.
Try to locate the interpreter the script is using (ignoring `env`).
Args:
data: The file content of the script.
Returns:
The basename of the main script interpreter, or None if a shebang is not
used or could not be parsed out.
"""
firstline = data.splitlines()[:1]
if not firstline:
return None
# The format here can be tricky.
shebang = firstline[0].strip()
m = re.match(r'^#!\s*([^\s]+)(?:\s+([^\s]+))?', shebang)
if not m:
return None
# If using `env`, find the target program.
interp = m.group(1)
if os.path.basename(interp) == 'env':
interp = m.group(2)
return interp
def _ExecuteHookViaReexec(self, interp, context, **kwargs):
"""Execute the hook script through |interp|.
Note: Support for this feature should be dropped ~Jun 2021.
Args:
interp: The Python program to run.
context: Basic Python context to execute the hook inside.
kwargs: Arbitrary arguments to pass to the hook script.
Raises:
HookError: When the hooks failed for any reason.
"""
# This logic needs to be kept in sync with _ExecuteHookViaImport below.
script = """
import json, os, sys
path = '''%(path)s'''
kwargs = json.loads('''%(kwargs)s''')
context = json.loads('''%(context)s''')
sys.path.insert(0, os.path.dirname(path))
data = open(path).read()
exec(compile(data, path, 'exec'), context)
context['main'](**kwargs)
""" % {
'path': self._script_fullpath,
'kwargs': json.dumps(kwargs),
'context': json.dumps(context),
}
# We pass the script via stdin to avoid OS argv limits. It also makes
# unhandled exception tracebacks less verbose/confusing for users.
cmd = [interp, '-c', 'import sys; exec(sys.stdin.read())']
proc = subprocess.Popen(cmd, stdin=subprocess.PIPE)
proc.communicate(input=script.encode('utf-8'))
if proc.returncode:
raise HookError('Failed to run %s hook.' % (self._hook_type,))
def _ExecuteHookViaImport(self, data, context, **kwargs):
"""Execute the hook code in |data| directly.
Args:
data: The code of the hook to execute.
context: Basic Python context to execute the hook inside.
kwargs: Arbitrary arguments to pass to the hook script.
Raises:
HookError: When the hooks failed for any reason.
"""
# Exec, storing global context in the context dict. We catch exceptions
# and convert to a HookError w/ just the failing traceback.
try:
exec(compile(data, self._script_fullpath, 'exec'), context)
except Exception:
raise HookError('%s\nFailed to import %s hook; see traceback above.' %
(traceback.format_exc(), self._hook_type))
# Running the script should have defined a main() function.
if 'main' not in context:
raise HookError('Missing main() in: "%s"' % self._script_fullpath)
# Call the main function in the hook. If the hook should cause the
# build to fail, it will raise an Exception. We'll catch that and convert
# to a HookError w/ just the failing traceback.
try:
context['main'](**kwargs)
except Exception:
raise HookError('%s\nFailed to run main() for %s hook; see traceback '
'above.' % (traceback.format_exc(), self._hook_type))
def _ExecuteHook(self, **kwargs):
"""Actually execute the given hook.
This will run the hook's 'main' function in our python interpreter.
Args:
kwargs: Keyword arguments to pass to the hook. These are often specific
to the hook type. For instance, pre-upload hooks will contain
a project_list.
"""
# Keep sys.path and CWD stashed away so that we can always restore them
# upon function exit.
orig_path = os.getcwd()
orig_syspath = sys.path
try:
# Always run hooks with CWD as topdir.
os.chdir(self._topdir)
# Put the hook dir as the first item of sys.path so hooks can do
# relative imports. We want to replace the repo dir as [0] so
# hooks can't import repo files.
sys.path = [os.path.dirname(self._script_fullpath)] + sys.path[1:]
# Initial global context for the hook to run within.
context = {'__file__': self._script_fullpath}
# Add 'hook_should_take_kwargs' to the arguments to be passed to main.
# We don't actually want hooks to define their main with this argument--
# it's there to remind them that their hook should always take **kwargs.
# For instance, a pre-upload hook should be defined like:
# def main(project_list, **kwargs):
#
# This allows us to later expand the API without breaking old hooks.
kwargs = kwargs.copy()
kwargs['hook_should_take_kwargs'] = True
# See what version of python the hook has been written against.
data = open(self._script_fullpath).read()
interp = self._ExtractInterpFromShebang(data)
reexec = False
if interp:
prog = os.path.basename(interp)
if prog.startswith('python2') and sys.version_info.major != 2:
reexec = True
elif prog.startswith('python3') and sys.version_info.major == 2:
reexec = True
# Attempt to execute the hooks through the requested version of Python.
if reexec:
try:
self._ExecuteHookViaReexec(interp, context, **kwargs)
except OSError as e:
if e.errno == errno.ENOENT:
# We couldn't find the interpreter, so fallback to importing.
reexec = False
else:
raise
# Run the hook by importing directly.
if not reexec:
self._ExecuteHookViaImport(data, context, **kwargs)
finally:
# Restore sys.path and CWD.
sys.path = orig_syspath
os.chdir(orig_path)
def Run(self, user_allows_all_hooks, **kwargs):
"""Run the hook.
If the hook doesn't exist (because there is no hooks project or because
this particular hook is not enabled), this is a no-op.
Args:
user_allows_all_hooks: If True, we will never prompt about running the
hook--we'll just assume it's OK to run it.
kwargs: Keyword arguments to pass to the hook. These are often specific
to the hook type. For instance, pre-upload hooks will contain
a project_list.
Raises:
HookError: If there was a problem finding the hook or the user declined
to run a required hook (from _CheckForHookApproval).
"""
# No-op if there is no hooks project or if hook is disabled.
if ((not self._hooks_project) or (self._hook_type not in
self._hooks_project.enabled_repo_hooks)):
return
# Bail with a nice error if we can't find the hook.
if not os.path.isfile(self._script_fullpath):
raise HookError('Couldn\'t find repo hook: "%s"' % self._script_fullpath)
# Make sure the user is OK with running the hook.
if (not user_allows_all_hooks) and (not self._CheckForHookApproval()):
return
# Run the hook with the same version of python we're using.
self._ExecuteHook(**kwargs)
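For illustration, the deleted hooks.py above spells out the contract a repo hook must follow: the script defines main() and always accepts **kwargs so the interface can grow. A minimal pre-upload hook written against that contract could look like the sketch below; the check it performs and the HookFailure class are purely illustrative.

# Sketch only: a hypothetical pre-upload.py following the contract above.
from __future__ import print_function


class HookFailure(Exception):
  """Raising any exception makes repo abort the upload with a HookError."""


def main(project_list, **kwargs):
  # repo passes project_list to pre-upload hooks; more arguments may be added
  # over time, which is why **kwargs must always be accepted.
  for project in project_list:
    print('pre-upload: checking %s' % (project,))
    # A real hook would run its presubmit checks here and raise on failure:
    # raise HookFailure('lint errors in %s' % (project,))


if __name__ == '__main__':
  # Hooks are normally loaded by repo itself; this stanza just makes the
  # sketch easy to run by hand.
  main(['example/project'])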

View File

@ -1,5 +1,5 @@
#!/bin/sh #!/bin/sh
# From Gerrit Code Review 3.1.3 # From Gerrit Code Review 2.14.6
# #
# Part of Gerrit Code Review (https://www.gerritcodereview.com/) # Part of Gerrit Code Review (https://www.gerritcodereview.com/)
# #
@ -16,48 +16,176 @@
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and # See the License for the specific language governing permissions and
# limitations under the License. # limitations under the License.
#
# avoid [[ which is not POSIX sh. unset GREP_OPTIONS
if test "$#" != 1 ; then
echo "$0 requires an argument." CHANGE_ID_AFTER="Bug|Depends-On|Issue|Test|Feature|Fixes|Fixed"
exit 1 MSG="$1"
# Check for, and add if missing, a unique Change-Id
#
add_ChangeId() {
clean_message=`sed -e '
/^diff --git .*/{
s///
q
}
/^Signed-off-by:/d
/^#/d
' "$MSG" | git stripspace`
if test -z "$clean_message"
then
return
fi fi
if test ! -f "$1" ; then # Do not add Change-Id to temp commits
echo "file does not exist: $1" if echo "$clean_message" | head -1 | grep -q '^\(fixup\|squash\)!'
exit 1 then
return
fi fi
# Do not create a change id if requested if test "false" = "`git config --bool --get gerrit.createChangeId`"
if test "false" = "`git config --bool --get gerrit.createChangeId`" ; then then
exit 0 return
fi fi
# $RANDOM will be undefined if not using bash, so don't use set -u # Does Change-Id: already exist? if so, exit (no change).
random=$( (whoami ; hostname ; date; cat $1 ; echo $RANDOM) | git hash-object --stdin) if grep -i '^Change-Id:' "$MSG" >/dev/null
dest="$1.tmp.${random}" then
return
trap 'rm -f "${dest}"' EXIT
if ! git stripspace --strip-comments < "$1" > "${dest}" ; then
echo "cannot strip comments from $1"
exit 1
fi fi
if test ! -s "${dest}" ; then id=`_gen_ChangeId`
echo "file is empty: $1" T="$MSG.tmp.$$"
exit 1 AWK=awk
if [ -x /usr/xpg4/bin/awk ]; then
# Solaris AWK is just too broken
AWK=/usr/xpg4/bin/awk
fi fi
# Avoid the --in-place option which only appeared in Git 2.8 # Get core.commentChar from git config or use default symbol
# Avoid the --if-exists option which only appeared in Git 2.15 commentChar=`git config --get core.commentChar`
if ! git -c trailer.ifexists=doNothing interpret-trailers \ commentChar=${commentChar:-#}
--trailer "Change-Id: I${random}" < "$1" > "${dest}" ; then
echo "cannot insert change-id line in $1"
exit 1
fi
if ! mv "${dest}" "$1" ; then # How this works:
echo "cannot mv ${dest} to $1" # - parse the commit message as (textLine+ blankLine*)*
exit 1 # - assume textLine+ to be a footer until proven otherwise
# - exception: the first block is not footer (as it is the title)
# - read textLine+ into a variable
# - then count blankLines
# - once the next textLine appears, print textLine+ blankLine* as these
# aren't footer
# - in END, the last textLine+ block is available for footer parsing
$AWK '
BEGIN {
# while we start with the assumption that textLine+
# is a footer, the first block is not.
isFooter = 0
footerComment = 0
blankLines = 0
}
# Skip lines starting with commentChar without any spaces before it.
/^'"$commentChar"'/ { next }
# Skip the line starting with the diff command and everything after it,
# up to the end of the file, assuming it is only patch data.
# If more than one line before the diff was empty, strip all but one.
/^diff --git / {
blankLines = 0
while (getline) { }
next
}
# Count blank lines outside footer comments
/^$/ && (footerComment == 0) {
blankLines++
next
}
# Catch footer comment
/^\[[a-zA-Z0-9-]+:/ && (isFooter == 1) {
footerComment = 1
}
/]$/ && (footerComment == 1) {
footerComment = 2
}
# We have a non-blank line after blank lines. Handle this.
(blankLines > 0) {
print lines
for (i = 0; i < blankLines; i++) {
print ""
}
lines = ""
blankLines = 0
isFooter = 1
footerComment = 0
}
# Detect that the current block is not the footer
(footerComment == 0) && (!/^\[?[a-zA-Z0-9-]+:/ || /^[a-zA-Z0-9-]+:\/\//) {
isFooter = 0
}
{
# We need this information about the current last comment line
if (footerComment == 2) {
footerComment = 0
}
if (lines != "") {
lines = lines "\n";
}
lines = lines $0
}
# Footer handling:
# If the last block is considered a footer, splice in the Change-Id at the
# right place.
# Look for the right place to inject Change-Id by considering
# CHANGE_ID_AFTER. Keys listed in it (case insensitive) come first,
# then Change-Id, then everything else (eg. Signed-off-by:).
#
# Otherwise just print the last block, a new line and the Change-Id as a
# block of its own.
END {
unprinted = 1
if (isFooter == 0) {
print lines "\n"
lines = ""
}
changeIdAfter = "^(" tolower("'"$CHANGE_ID_AFTER"'") "):"
numlines = split(lines, footer, "\n")
for (line = 1; line <= numlines; line++) {
if (unprinted && match(tolower(footer[line]), changeIdAfter) != 1) {
unprinted = 0
print "Change-Id: I'"$id"'"
}
print footer[line]
}
if (unprinted) {
print "Change-Id: I'"$id"'"
}
}' "$MSG" > "$T" && mv "$T" "$MSG" || rm -f "$T"
}
_gen_ChangeIdInput() {
echo "tree `git write-tree`"
if parent=`git rev-parse "HEAD^0" 2>/dev/null`
then
echo "parent $parent"
fi fi
echo "author `git var GIT_AUTHOR_IDENT`"
echo "committer `git var GIT_COMMITTER_IDENT`"
echo
printf '%s' "$clean_message"
}
_gen_ChangeId() {
_gen_ChangeIdInput |
git hash-object -t commit --stdin
}
add_ChangeId
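For illustration, the newer commit-msg hook (the Gerrit 3.1.3 version in this diff) replaces the awk footer parser with two git primitives: git hash-object --stdin to derive the Change-Id and git interpret-trailers with trailer.ifexists=doNothing to splice it into the message. A rough Python equivalent of that flow is sketched below; the message path and the seed fields are assumptions for the example.

# Sketch only: approximate the newer hook's Change-Id flow in Python.
import getpass
import os
import socket
import subprocess
import time


def add_change_id(msg_file):
  # Roughly what `(whoami; hostname; date; cat $1; echo $RANDOM)` feeds in.
  seed = '%s %s %s %s' % (getpass.getuser(), socket.gethostname(),
                          time.time(), os.urandom(16).hex())
  change_id = subprocess.run(
      ['git', 'hash-object', '--stdin'],
      input=seed.encode('utf-8'), stdout=subprocess.PIPE,
      check=True).stdout.decode().strip()
  with open(msg_file, 'rb') as fp:
    message = fp.read()
  # trailer.ifexists=doNothing leaves an existing Change-Id untouched, as in
  # the shell hook above.
  new_message = subprocess.run(
      ['git', '-c', 'trailer.ifexists=doNothing', 'interpret-trailers',
       '--trailer', 'Change-Id: I%s' % (change_id,)],
      input=message, stdout=subprocess.PIPE, check=True).stdout
  with open(msg_file, 'wb') as fp:
    fp.write(new_message)


# Hypothetical usage: add_change_id('.git/COMMIT_EDITMSG')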

139
main.py
View File

@ -26,7 +26,6 @@ import getpass
import netrc import netrc
import optparse import optparse
import os import os
import shlex
import sys import sys
import textwrap import textwrap
import time import time
@ -48,8 +47,8 @@ except ImportError:
from color import SetDefaultColoring from color import SetDefaultColoring
import event_log import event_log
from repo_trace import SetTrace from repo_trace import SetTrace
from git_command import user_agent from git_command import git, GitCommand, user_agent
from git_config import init_ssh, close_ssh, RepoConfig from git_config import init_ssh, close_ssh
from command import InteractiveCommand from command import InteractiveCommand
from command import MirrorSafeCommand from command import MirrorSafeCommand
from command import GitcAvailableCommand, GitcClientCommand from command import GitcAvailableCommand, GitcClientCommand
@ -70,35 +69,7 @@ from wrapper import WrapperPath, Wrapper
from subcmds import all_commands from subcmds import all_commands
if not is_python3(): if not is_python3():
input = raw_input # noqa: F821 input = raw_input
# NB: These do not need to be kept in sync with the repo launcher script.
# These may be much newer as it allows the repo launcher to roll between
# different repo releases while source versions might require a newer python.
#
# The soft version is when we start warning users that the version is old and
# we'll be dropping support for it. We'll refuse to work with versions older
# than the hard version.
#
# python-3.6 is in Ubuntu Bionic.
MIN_PYTHON_VERSION_SOFT = (3, 6)
MIN_PYTHON_VERSION_HARD = (3, 4)
if sys.version_info.major < 3:
print('repo: warning: Python 2 is no longer supported; '
'Please upgrade to Python {}.{}+.'.format(*MIN_PYTHON_VERSION_SOFT),
file=sys.stderr)
else:
if sys.version_info < MIN_PYTHON_VERSION_HARD:
print('repo: error: Python 3 version is too old; '
'Please upgrade to Python {}.{}+.'.format(*MIN_PYTHON_VERSION_SOFT),
file=sys.stderr)
sys.exit(1)
elif sys.version_info < MIN_PYTHON_VERSION_SOFT:
print('repo: warning: your Python 3 version is no longer supported; '
'Please upgrade to Python {}.{}+.'.format(*MIN_PYTHON_VERSION_SOFT),
file=sys.stderr)
global_options = optparse.OptionParser( global_options = optparse.OptionParser(
usage='repo [-p|--paginate|--no-pager] COMMAND [ARGS]', usage='repo [-p|--paginate|--no-pager] COMMAND [ARGS]',
@ -109,7 +80,7 @@ global_options.add_option('-p', '--paginate',
dest='pager', action='store_true', dest='pager', action='store_true',
help='display command output in the pager') help='display command output in the pager')
global_options.add_option('--no-pager', global_options.add_option('--no-pager',
dest='pager', action='store_false', dest='no_pager', action='store_true',
help='disable the pager') help='disable the pager')
global_options.add_option('--color', global_options.add_option('--color',
choices=('auto', 'always', 'never'), default=None, choices=('auto', 'always', 'never'), default=None,
@ -130,11 +101,12 @@ global_options.add_option('--event-log',
dest='event_log', action='store', dest='event_log', action='store',
help='filename of event log to append timeline to') help='filename of event log to append timeline to')
class _Repo(object): class _Repo(object):
def __init__(self, repodir): def __init__(self, repodir):
self.repodir = repodir self.repodir = repodir
self.commands = all_commands self.commands = all_commands
# add 'branch' as an alias for 'branches'
all_commands['branch'] = all_commands['branches']
def _ParseArgs(self, argv): def _ParseArgs(self, argv):
"""Parse the main `repo` command line options.""" """Parse the main `repo` command line options."""
@ -154,9 +126,6 @@ class _Repo(object):
argv = [] argv = []
gopts, _gargs = global_options.parse_args(glob) gopts, _gargs = global_options.parse_args(glob)
name, alias_args = self._ExpandAlias(name)
argv = alias_args + argv
if gopts.help: if gopts.help:
global_options.print_help() global_options.print_help()
commands = ' '.join(sorted(self.commands)) commands = ' '.join(sorted(self.commands))
@ -167,27 +136,6 @@ class _Repo(object):
return (name, gopts, argv) return (name, gopts, argv)
def _ExpandAlias(self, name):
"""Look up user registered aliases."""
# We don't resolve aliases for existing subcommands. This matches git.
if name in self.commands:
return name, []
key = 'alias.%s' % (name,)
alias = RepoConfig.ForRepository(self.repodir).GetString(key)
if alias is None:
alias = RepoConfig.ForUser().GetString(key)
if alias is None:
return name, []
args = alias.strip().split(' ', 1)
name = args[0]
if len(args) == 2:
args = shlex.split(args[1])
else:
args = []
return name, args
def _Run(self, name, gopts, argv): def _Run(self, name, gopts, argv):
"""Execute the requested subcommand.""" """Execute the requested subcommand."""
result = 0 result = 0
@ -204,7 +152,7 @@ class _Repo(object):
SetDefaultColoring(gopts.color) SetDefaultColoring(gopts.color)
try: try:
cmd = self.commands[name]() cmd = self.commands[name]
except KeyError: except KeyError:
print("repo: '%s' is not a repo command. See 'repo help'." % name, print("repo: '%s' is not a repo command. See 'repo help'." % name,
file=sys.stderr) file=sys.stderr)
@ -245,7 +193,7 @@ class _Repo(object):
file=sys.stderr) file=sys.stderr)
return 1 return 1
if gopts.pager is not False and not isinstance(cmd, InteractiveCommand): if not gopts.no_pager and not isinstance(cmd, InteractiveCommand):
config = cmd.manifest.globalConfig config = cmd.manifest.globalConfig
if gopts.pager: if gopts.pager:
use_pager = True use_pager = True
@ -280,8 +228,7 @@ class _Repo(object):
if e.name: if e.name:
print('error: project group must be enabled for project %s' % e.name, file=sys.stderr) print('error: project group must be enabled for project %s' % e.name, file=sys.stderr)
else: else:
print('error: project group must be enabled for the project in the current directory', print('error: project group must be enabled for the project in the current directory', file=sys.stderr)
file=sys.stderr)
result = 1 result = 1
except SystemExit as e: except SystemExit as e:
if e.code: if e.code:
@ -308,66 +255,42 @@ class _Repo(object):
return result return result
def _CheckWrapperVersion(ver_str, repo_path): def _CheckWrapperVersion(ver, repo_path):
"""Verify the repo launcher is new enough for this checkout.
Args:
ver_str: The version string passed from the repo launcher when it ran us.
repo_path: The path to the repo launcher that loaded us.
"""
# Refuse to work with really old wrapper versions. We don't test these,
# so might as well require a somewhat recent sane version.
# v1.15 of the repo launcher was released in ~Mar 2012.
MIN_REPO_VERSION = (1, 15)
min_str = '.'.join(str(x) for x in MIN_REPO_VERSION)
if not repo_path: if not repo_path:
repo_path = '~/bin/repo' repo_path = '~/bin/repo'
if not ver_str: if not ver:
print('no --wrapper-version argument', file=sys.stderr) print('no --wrapper-version argument', file=sys.stderr)
sys.exit(1) sys.exit(1)
# Pull out the version of the repo launcher we know about to compare.
exp = Wrapper().VERSION exp = Wrapper().VERSION
ver = tuple(map(int, ver_str.split('.'))) ver = tuple(map(int, ver.split('.')))
if len(ver) == 1:
ver = (0, ver[0])
exp_str = '.'.join(map(str, exp)) exp_str = '.'.join(map(str, exp))
if ver < MIN_REPO_VERSION: if exp[0] > ver[0] or ver < (0, 4):
print(""" print("""
repo: error: !!! A new repo command (%5s) is available. !!!
!!! Your version of repo %s is too old. !!! You must upgrade before you can continue: !!!
!!! We need at least version %s.
!!! A new version of repo (%s) is available.
!!! You must upgrade before you can continue:
cp %s %s cp %s %s
""" % (ver_str, min_str, exp_str, WrapperPath(), repo_path), file=sys.stderr) """ % (exp_str, WrapperPath(), repo_path), file=sys.stderr)
sys.exit(1) sys.exit(1)
if exp > ver: if exp > ver:
print('\n... A new version of repo (%s) is available.' % (exp_str,), print("""
file=sys.stderr) ... A new repo command (%5s) is available.
if os.access(repo_path, os.W_OK):
print("""\
... You should upgrade soon: ... You should upgrade soon:
cp %s %s
""" % (WrapperPath(), repo_path), file=sys.stderr)
else:
print("""\
... New version is available at: %s
... The launcher is run from: %s
!!! The launcher is not writable. Please talk to your sysadmin or distro
!!! to get an update installed.
""" % (WrapperPath(), repo_path), file=sys.stderr)
cp %s %s
""" % (exp_str, WrapperPath(), repo_path), file=sys.stderr)
def _CheckRepoDir(repo_dir): def _CheckRepoDir(repo_dir):
if not repo_dir: if not repo_dir:
print('no --repo-dir argument', file=sys.stderr) print('no --repo-dir argument', file=sys.stderr)
sys.exit(1) sys.exit(1)
def _PruneOptions(argv, opt): def _PruneOptions(argv, opt):
i = 0 i = 0
while i < len(argv): while i < len(argv):
@ -383,7 +306,6 @@ def _PruneOptions(argv, opt):
continue continue
i += 1 i += 1
class _UserAgentHandler(urllib.request.BaseHandler): class _UserAgentHandler(urllib.request.BaseHandler):
def http_request(self, req): def http_request(self, req):
req.add_header('User-Agent', user_agent.repo) req.add_header('User-Agent', user_agent.repo)
@ -393,7 +315,6 @@ class _UserAgentHandler(urllib.request.BaseHandler):
req.add_header('User-Agent', user_agent.repo) req.add_header('User-Agent', user_agent.repo)
return req return req
def _AddPasswordFromUserInput(handler, msg, req): def _AddPasswordFromUserInput(handler, msg, req):
# If repo could not find auth info from netrc, try to get it from user input # If repo could not find auth info from netrc, try to get it from user input
url = req.get_full_url() url = req.get_full_url()
@ -407,7 +328,6 @@ def _AddPasswordFromUserInput(handler, msg, req):
return return
handler.passwd.add_password(None, url, user, password) handler.passwd.add_password(None, url, user, password)
class _BasicAuthHandler(urllib.request.HTTPBasicAuthHandler): class _BasicAuthHandler(urllib.request.HTTPBasicAuthHandler):
def http_error_401(self, req, fp, code, msg, headers): def http_error_401(self, req, fp, code, msg, headers):
_AddPasswordFromUserInput(self, msg, req) _AddPasswordFromUserInput(self, msg, req)
@ -417,14 +337,13 @@ class _BasicAuthHandler(urllib.request.HTTPBasicAuthHandler):
def http_error_auth_reqed(self, authreq, host, req, headers): def http_error_auth_reqed(self, authreq, host, req, headers):
try: try:
old_add_header = req.add_header old_add_header = req.add_header
def _add_header(name, val): def _add_header(name, val):
val = val.replace('\n', '') val = val.replace('\n', '')
old_add_header(name, val) old_add_header(name, val)
req.add_header = _add_header req.add_header = _add_header
return urllib.request.AbstractBasicAuthHandler.http_error_auth_reqed( return urllib.request.AbstractBasicAuthHandler.http_error_auth_reqed(
self, authreq, host, req, headers) self, authreq, host, req, headers)
except Exception: except:
reset = getattr(self, 'reset_retry_count', None) reset = getattr(self, 'reset_retry_count', None)
if reset is not None: if reset is not None:
reset() reset()
@ -432,7 +351,6 @@ class _BasicAuthHandler(urllib.request.HTTPBasicAuthHandler):
self.retried = 0 self.retried = 0
raise raise
class _DigestAuthHandler(urllib.request.HTTPDigestAuthHandler): class _DigestAuthHandler(urllib.request.HTTPDigestAuthHandler):
def http_error_401(self, req, fp, code, msg, headers): def http_error_401(self, req, fp, code, msg, headers):
_AddPasswordFromUserInput(self, msg, req) _AddPasswordFromUserInput(self, msg, req)
@ -442,14 +360,13 @@ class _DigestAuthHandler(urllib.request.HTTPDigestAuthHandler):
def http_error_auth_reqed(self, auth_header, host, req, headers): def http_error_auth_reqed(self, auth_header, host, req, headers):
try: try:
old_add_header = req.add_header old_add_header = req.add_header
def _add_header(name, val): def _add_header(name, val):
val = val.replace('\n', '') val = val.replace('\n', '')
old_add_header(name, val) old_add_header(name, val)
req.add_header = _add_header req.add_header = _add_header
return urllib.request.AbstractDigestAuthHandler.http_error_auth_reqed( return urllib.request.AbstractDigestAuthHandler.http_error_auth_reqed(
self, auth_header, host, req, headers) self, auth_header, host, req, headers)
except Exception: except:
reset = getattr(self, 'reset_retry_count', None) reset = getattr(self, 'reset_retry_count', None)
if reset is not None: if reset is not None:
reset() reset()
@ -457,7 +374,6 @@ class _DigestAuthHandler(urllib.request.HTTPDigestAuthHandler):
self.retried = 0 self.retried = 0
raise raise
class _KerberosAuthHandler(urllib.request.BaseHandler): class _KerberosAuthHandler(urllib.request.BaseHandler):
def __init__(self): def __init__(self):
self.retried = 0 self.retried = 0
@ -492,7 +408,7 @@ class _KerberosAuthHandler(urllib.request.BaseHandler):
return response return response
except kerberos.GSSError: except kerberos.GSSError:
return None return None
except Exception: except:
self.reset_retry_count() self.reset_retry_count()
raise raise
finally: finally:
@ -538,7 +454,6 @@ class _KerberosAuthHandler(urllib.request.BaseHandler):
kerberos.authGSSClientClean(self.context) kerberos.authGSSClientClean(self.context)
self.context = None self.context = None
def init_http(): def init_http():
handlers = [_UserAgentHandler()] handlers = [_UserAgentHandler()]
@ -566,7 +481,6 @@ def init_http():
handlers.append(urllib.request.HTTPSHandler(debuglevel=1)) handlers.append(urllib.request.HTTPSHandler(debuglevel=1))
urllib.request.install_opener(urllib.request.build_opener(*handlers)) urllib.request.install_opener(urllib.request.build_opener(*handlers))
def _Main(argv): def _Main(argv):
result = 0 result = 0
@ -614,7 +528,7 @@ def _Main(argv):
argv = list(sys.argv) argv = list(sys.argv)
argv.extend(rce.extra_args) argv.extend(rce.extra_args)
try: try:
os.execv(sys.executable, [__file__] + argv) os.execv(__file__, argv)
except OSError as e: except OSError as e:
print('fatal: cannot restart repo after upgrade', file=sys.stderr) print('fatal: cannot restart repo after upgrade', file=sys.stderr)
print('fatal: %s' % e, file=sys.stderr) print('fatal: %s' % e, file=sys.stderr)
@ -623,6 +537,5 @@ def _Main(argv):
TerminatePager() TerminatePager()
sys.exit(result) sys.exit(result)
if __name__ == '__main__': if __name__ == '__main__':
_Main(sys.argv[1:]) _Main(sys.argv[1:])
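For illustration, the _ExpandAlias hunk above gives repo git-style command aliases: an alias.<name> value is looked up in the checkout's config and then the user's config, split with shlex, and prepended to the remaining arguments, but never for names that are already real subcommands. A standalone sketch of that expansion follows; the alias table and command names are hypothetical.

# Sketch only: expand an alias the way _ExpandAlias does, with a plain dict
# standing in for RepoConfig.
import shlex


def expand_alias(name, argv, aliases, builtin_commands):
  # Like git, never expand an alias that shadows a real subcommand.
  if name in builtin_commands:
    return name, argv
  alias = aliases.get('alias.%s' % (name,))
  if alias is None:
    return name, argv
  parts = alias.strip().split(' ', 1)
  extra = shlex.split(parts[1]) if len(parts) == 2 else []
  return parts[0], extra + argv


# Hypothetical example: `repo sj projectA` becomes `repo sync -j8 -c projectA`.
print(expand_alias('sj', ['projectA'],
                   {'alias.sj': 'sync -j8 -c'},
                   {'sync', 'upload'}))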

View File

@ -31,7 +31,7 @@ else:
urllib.parse = urlparse urllib.parse = urlparse
import gitc_utils import gitc_utils
from git_config import GitConfig, IsId from git_config import GitConfig
from git_refs import R_HEADS, HEAD from git_refs import R_HEADS, HEAD
import platform_utils import platform_utils
from project import RemoteSpec, Project, MetaProject from project import RemoteSpec, Project, MetaProject
@ -56,61 +56,6 @@ urllib.parse.uses_netloc.extend([
'sso', 'sso',
'rpc']) 'rpc'])
def XmlBool(node, attr, default=None):
"""Determine boolean value of |node|'s |attr|.
Invalid values will issue a non-fatal warning.
Args:
node: XML node whose attributes we access.
attr: The attribute to access.
default: If the attribute is not set (value is empty), then use this.
Returns:
True if the attribute is a valid string representing true.
False if the attribute is a valid string representing false.
|default| otherwise.
"""
value = node.getAttribute(attr)
s = value.lower()
if s == '':
return default
elif s in {'yes', 'true', '1'}:
return True
elif s in {'no', 'false', '0'}:
return False
else:
print('warning: manifest: %s="%s": ignoring invalid XML boolean' %
(attr, value), file=sys.stderr)
return default
def XmlInt(node, attr, default=None):
"""Determine integer value of |node|'s |attr|.
Args:
node: XML node whose attributes we access.
attr: The attribute to access.
default: If the attribute is not set (value is empty), then use this.
Returns:
The number if the attribute is a valid number.
Raises:
ManifestParseError: The number is invalid.
"""
value = node.getAttribute(attr)
if not value:
return default
try:
return int(value)
except ValueError:
raise ManifestParseError('manifest: invalid %s="%s" integer' %
(attr, value))
class _Default(object): class _Default(object):
"""Project defaults within the manifest.""" """Project defaults within the manifest."""
@ -129,7 +74,6 @@ class _Default(object):
def __ne__(self, other): def __ne__(self, other):
return self.__dict__ != other.__dict__ return self.__dict__ != other.__dict__
class _XmlRemote(object): class _XmlRemote(object):
def __init__(self, def __init__(self,
name, name,
@ -183,7 +127,6 @@ class _XmlRemote(object):
orig_name=self.name, orig_name=self.name,
fetchUrl=self.fetchUrl) fetchUrl=self.fetchUrl)
class XmlManifest(object): class XmlManifest(object):
"""manages the repo configuration file""" """manages the repo configuration file"""
@ -192,6 +135,7 @@ class XmlManifest(object):
self.topdir = os.path.dirname(self.repodir) self.topdir = os.path.dirname(self.repodir)
self.manifestFile = os.path.join(self.repodir, MANIFEST_FILE_NAME) self.manifestFile = os.path.join(self.repodir, MANIFEST_FILE_NAME)
self.globalConfig = GitConfig.ForUser() self.globalConfig = GitConfig.ForUser()
self.localManifestWarning = False
self.isGitcClient = False self.isGitcClient = False
self._load_local_manifests = True self._load_local_manifests = True
@ -199,17 +143,9 @@ class XmlManifest(object):
gitdir = os.path.join(repodir, 'repo/.git'), gitdir = os.path.join(repodir, 'repo/.git'),
worktree = os.path.join(repodir, 'repo')) worktree = os.path.join(repodir, 'repo'))
mp = MetaProject(self, 'manifests', self.manifestProject = MetaProject(self, 'manifests',
gitdir = os.path.join(repodir, 'manifests.git'), gitdir = os.path.join(repodir, 'manifests.git'),
worktree = os.path.join(repodir, 'manifests')) worktree = os.path.join(repodir, 'manifests'))
self.manifestProject = mp
# This is a bit hacky, but we're in a chicken & egg situation: all the
# normal repo settings live in the manifestProject which we just set up
# above, so we couldn't easily query before that. We assume Project()
# init doesn't care if this changes afterwards.
if os.path.exists(mp.gitdir) and mp.config.GetBoolean('repo.worktree'):
mp.use_git_worktrees = True
self._Unload() self._Unload()
@ -244,27 +180,12 @@ class XmlManifest(object):
""" """
self.Override(name) self.Override(name)
# Old versions of repo would generate symlinks we need to clean up. try:
if os.path.lexists(self.manifestFile): if os.path.lexists(self.manifestFile):
platform_utils.remove(self.manifestFile) platform_utils.remove(self.manifestFile)
# This file is interpreted as if it existed inside the manifest repo. platform_utils.symlink(os.path.join('manifests', name), self.manifestFile)
# That allows us to use <include> with the relative file name. except OSError as e:
with open(self.manifestFile, 'w') as fp: raise ManifestParseError('cannot link manifest %s: %s' % (name, str(e)))
fp.write("""<?xml version="1.0" encoding="UTF-8"?>
<!--
DO NOT EDIT THIS FILE! It is generated by repo and changes will be discarded.
If you want to use a different manifest, use `repo init -m <file>` instead.
If you want to customize your checkout by overriding manifest settings, use
the local_manifests/ directory instead.
For more information on repo manifests, check out:
https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
-->
<manifest>
<include name="%s" />
</manifest>
""" % (name,))
def _RemoteToXml(self, r, doc, root): def _RemoteToXml(self, r, doc, root):
e = doc.createElement('remote') e = doc.createElement('remote')
@ -283,7 +204,7 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
def _ParseGroups(self, groups): def _ParseGroups(self, groups):
return [x for x in re.split(r'[,\s]+', groups) if x] return [x for x in re.split(r'[,\s]+', groups) if x]
def Save(self, fd, peg_rev=False, peg_rev_upstream=True, peg_rev_dest_branch=True, groups=None): def Save(self, fd, peg_rev=False, peg_rev_upstream=True, groups=None):
"""Write the current manifest out to the given file descriptor. """Write the current manifest out to the given file descriptor.
""" """
mp = self.manifestProject mp = self.manifestProject
@ -388,13 +309,6 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
# Only save the origin if the origin is not a sha1, and the default # Only save the origin if the origin is not a sha1, and the default
# isn't our value # isn't our value
e.setAttribute('upstream', p.revisionExpr) e.setAttribute('upstream', p.revisionExpr)
if peg_rev_dest_branch:
if p.dest_branch:
e.setAttribute('dest-branch', p.dest_branch)
elif value != p.revisionExpr:
e.setAttribute('dest-branch', p.revisionExpr)
else: else:
revision = self.remotes[p.remote.orig_name].revision or d.revisionExpr revision = self.remotes[p.remote.orig_name].revision or d.revisionExpr
if not revision or revision != p.revisionExpr: if not revision or revision != p.revisionExpr:
@ -500,14 +414,6 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
self._Load() self._Load()
return self._manifest_server return self._manifest_server
@property
def CloneBundle(self):
clone_bundle = self.manifestProject.config.GetBoolean('repo.clonebundle')
if clone_bundle is None:
return False if self.manifestProject.config.GetBoolean('repo.partialclone') else True
else:
return clone_bundle
@property @property
def CloneFilter(self): def CloneFilter(self):
if self.manifestProject.config.GetBoolean('repo.partialclone'): if self.manifestProject.config.GetBoolean('repo.partialclone'):
@ -518,10 +424,6 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
def IsMirror(self): def IsMirror(self):
return self.manifestProject.config.GetBoolean('repo.mirror') return self.manifestProject.config.GetBoolean('repo.mirror')
@property
def UseGitWorktrees(self):
return self.manifestProject.config.GetBoolean('repo.worktree')
@property @property
def IsArchive(self): def IsArchive(self):
return self.manifestProject.config.GetBoolean('repo.archive') return self.manifestProject.config.GetBoolean('repo.archive')
@ -554,12 +456,15 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
self.manifestProject.worktree)) self.manifestProject.worktree))
if self._load_local_manifests: if self._load_local_manifests:
if os.path.exists(os.path.join(self.repodir, LOCAL_MANIFEST_NAME)): local = os.path.join(self.repodir, LOCAL_MANIFEST_NAME)
print('error: %s is not supported; put local manifests in `%s`' if os.path.exists(local):
'instead' % (LOCAL_MANIFEST_NAME, if not self.localManifestWarning:
self.localManifestWarning = True
print('warning: %s is deprecated; put local manifests '
'in `%s` instead' % (LOCAL_MANIFEST_NAME,
os.path.join(self.repodir, LOCAL_MANIFESTS_DIR_NAME)), os.path.join(self.repodir, LOCAL_MANIFESTS_DIR_NAME)),
file=sys.stderr) file=sys.stderr)
sys.exit(1) nodes.append(self._ParseManifestXml(local, self.repodir))
local_dir = os.path.abspath(os.path.join(self.repodir, local_dir = os.path.abspath(os.path.join(self.repodir,
LOCAL_MANIFESTS_DIR_NAME)) LOCAL_MANIFESTS_DIR_NAME))
@ -705,10 +610,6 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
p.groups.extend(groups) p.groups.extend(groups)
if revision: if revision:
p.revisionExpr = revision p.revisionExpr = revision
if IsId(revision):
p.revisionId = revision
else:
p.revisionId = None
if remote: if remote:
p.remote = remote.ToRemoteSpec(name) p.remote = remote.ToRemoteSpec(name)
if node.nodeName == 'repo-hooks': if node.nodeName == 'repo-hooks':
@ -754,6 +655,7 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
if self._repo_hooks_project and (self._repo_hooks_project.name == name): if self._repo_hooks_project and (self._repo_hooks_project.name == name):
self._repo_hooks_project = None self._repo_hooks_project = None
def _AddMetaProjectMirror(self, m): def _AddMetaProjectMirror(self, m):
name = None name = None
m_url = m.GetRemote(m.remote.name).url m_url = m.GetRemote(m.remote.name).url
@ -826,14 +728,29 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
d.destBranchExpr = node.getAttribute('dest-branch') or None d.destBranchExpr = node.getAttribute('dest-branch') or None
d.upstreamExpr = node.getAttribute('upstream') or None d.upstreamExpr = node.getAttribute('upstream') or None
d.sync_j = XmlInt(node, 'sync-j', 1) sync_j = node.getAttribute('sync-j')
if d.sync_j <= 0: if sync_j == '' or sync_j is None:
raise ManifestParseError('%s: sync-j must be greater than 0, not "%s"' % d.sync_j = 1
(self.manifestFile, d.sync_j)) else:
d.sync_j = int(sync_j)
d.sync_c = XmlBool(node, 'sync-c', False) sync_c = node.getAttribute('sync-c')
d.sync_s = XmlBool(node, 'sync-s', False) if not sync_c:
d.sync_tags = XmlBool(node, 'sync-tags', True) d.sync_c = False
else:
d.sync_c = sync_c.lower() in ("yes", "true", "1")
sync_s = node.getAttribute('sync-s')
if not sync_s:
d.sync_s = False
else:
d.sync_s = sync_s.lower() in ("yes", "true", "1")
sync_tags = node.getAttribute('sync-tags')
if not sync_tags:
d.sync_tags = True
else:
d.sync_tags = sync_tags.lower() in ("yes", "true", "1")
return d return d
def _ParseNotice(self, node): def _ParseNotice(self, node):
@ -910,15 +827,39 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
raise ManifestParseError("project %s path cannot be absolute in %s" % raise ManifestParseError("project %s path cannot be absolute in %s" %
(name, self.manifestFile)) (name, self.manifestFile))
rebase = XmlBool(node, 'rebase', True) rebase = node.getAttribute('rebase')
sync_c = XmlBool(node, 'sync-c', False) if not rebase:
sync_s = XmlBool(node, 'sync-s', self._default.sync_s) rebase = True
sync_tags = XmlBool(node, 'sync-tags', self._default.sync_tags) else:
rebase = rebase.lower() in ("yes", "true", "1")
clone_depth = XmlInt(node, 'clone-depth') sync_c = node.getAttribute('sync-c')
if clone_depth is not None and clone_depth <= 0: if not sync_c:
raise ManifestParseError('%s: clone-depth must be greater than 0, not "%s"' % sync_c = False
(self.manifestFile, clone_depth)) else:
sync_c = sync_c.lower() in ("yes", "true", "1")
sync_s = node.getAttribute('sync-s')
if not sync_s:
sync_s = self._default.sync_s
else:
sync_s = sync_s.lower() in ("yes", "true", "1")
sync_tags = node.getAttribute('sync-tags')
if not sync_tags:
sync_tags = self._default.sync_tags
else:
sync_tags = sync_tags.lower() in ("yes", "true", "1")
clone_depth = node.getAttribute('clone-depth')
if clone_depth:
try:
clone_depth = int(clone_depth)
if clone_depth <= 0:
raise ValueError()
except ValueError:
raise ManifestParseError('invalid clone-depth %s in %s' %
(clone_depth, self.manifestFile))
dest_branch = node.getAttribute('dest-branch') or self._default.destBranchExpr dest_branch = node.getAttribute('dest-branch') or self._default.destBranchExpr
@ -930,10 +871,8 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
groups = self._ParseGroups(groups) groups = self._ParseGroups(groups)
if parent is None: if parent is None:
relpath, worktree, gitdir, objdir, use_git_worktrees = \ relpath, worktree, gitdir, objdir = self.GetProjectPaths(name, path)
self.GetProjectPaths(name, path)
else: else:
use_git_worktrees = False
relpath, worktree, gitdir, objdir = \ relpath, worktree, gitdir, objdir = \
self.GetSubprojectPaths(parent, name, path) self.GetSubprojectPaths(parent, name, path)
@ -941,7 +880,7 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
groups.extend(set(default_groups).difference(groups)) groups.extend(set(default_groups).difference(groups))
if self.IsMirror and node.hasAttribute('force-path'): if self.IsMirror and node.hasAttribute('force-path'):
if XmlBool(node, 'force-path', False): if node.getAttribute('force-path').lower() in ("yes", "true", "1"):
gitdir = os.path.join(self.topdir, '%s.git' % path) gitdir = os.path.join(self.topdir, '%s.git' % path)
project = Project(manifest = self, project = Project(manifest = self,
@ -962,7 +901,6 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
upstream = upstream, upstream = upstream,
parent = parent, parent = parent,
dest_branch = dest_branch, dest_branch = dest_branch,
use_git_worktrees=use_git_worktrees,
**extra_proj_attrs) **extra_proj_attrs)
for n in node.childNodes: for n in node.childNodes:
@ -978,11 +916,6 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
return project return project
def GetProjectPaths(self, name, path): def GetProjectPaths(self, name, path):
# The manifest entries might have trailing slashes. Normalize them to avoid
# unexpected filesystem behavior since we do string concatenation below.
path = path.rstrip('/')
name = name.rstrip('/')
use_git_worktrees = False
relpath = path relpath = path
if self.IsMirror: if self.IsMirror:
worktree = None worktree = None
@ -991,15 +924,8 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
else: else:
worktree = os.path.join(self.topdir, path).replace('\\', '/') worktree = os.path.join(self.topdir, path).replace('\\', '/')
gitdir = os.path.join(self.repodir, 'projects', '%s.git' % path) gitdir = os.path.join(self.repodir, 'projects', '%s.git' % path)
# We allow people to mix git worktrees & non-git worktrees for now.
# This allows for in situ migration of repo clients.
if os.path.exists(gitdir) or not self.UseGitWorktrees:
objdir = os.path.join(self.repodir, 'project-objects', '%s.git' % name) objdir = os.path.join(self.repodir, 'project-objects', '%s.git' % name)
else: return relpath, worktree, gitdir, objdir
use_git_worktrees = True
gitdir = os.path.join(self.repodir, 'worktrees', '%s.git' % name)
objdir = gitdir
return relpath, worktree, gitdir, objdir, use_git_worktrees
def GetProjectsWithName(self, name): def GetProjectsWithName(self, name):
return self._projects.get(name, []) return self._projects.get(name, [])
@ -1014,10 +940,6 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
return os.path.relpath(relpath, parent_relpath) return os.path.relpath(relpath, parent_relpath)
def GetSubprojectPaths(self, parent, name, path): def GetSubprojectPaths(self, parent, name, path):
# The manifest entries might have trailing slashes. Normalize them to avoid
# unexpected filesystem behavior since we do string concatenation below.
path = path.rstrip('/')
name = name.rstrip('/')
relpath = self._JoinRelpath(parent.relpath, path) relpath = self._JoinRelpath(parent.relpath, path)
gitdir = os.path.join(parent.gitdir, 'subprojects', '%s.git' % path) gitdir = os.path.join(parent.gitdir, 'subprojects', '%s.git' % path)
objdir = os.path.join(parent.gitdir, 'subproject-objects', '%s.git' % name) objdir = os.path.join(parent.gitdir, 'subproject-objects', '%s.git' % name)
@ -1063,30 +985,17 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
# Assume paths might be used on case-insensitive filesystems. # Assume paths might be used on case-insensitive filesystems.
path = path.lower() path = path.lower()
# Split up the path by its components. We can't use os.path.sep exclusively # We don't really need to reject '.' here, but there shouldn't really be a
# as some platforms (like Windows) will convert / to \ and that bypasses all # need to ever use it, so no need to accept it either.
# our constructed logic here. Especially since manifest authors only use for part in set(path.split(os.path.sep)):
# / in their paths.
resep = re.compile(r'[/%s]' % re.escape(os.path.sep))
parts = resep.split(path)
# Some people use src="." to create stable links to projects. Lets allow
# that but reject all other uses of "." to keep things simple.
if parts != ['.']:
for part in set(parts):
if part in {'.', '..', '.git'} or part.startswith('.repo'): if part in {'.', '..', '.git'} or part.startswith('.repo'):
return 'bad component: %s' % (part,) return 'bad component: %s' % (part,)
if not symlink and resep.match(path[-1]): if not symlink and path.endswith(os.path.sep):
return 'dirs not allowed' return 'dirs not allowed'
# NB: The two abspath checks here are to handle platforms with multiple
# filesystem path styles (e.g. Windows).
norm = os.path.normpath(path) norm = os.path.normpath(path)
if (norm == '..' or if norm == '..' or norm.startswith('../') or norm.startswith(os.path.sep):
(len(norm) >= 3 and norm.startswith('..') and resep.match(norm[0])) or
os.path.isabs(norm) or
norm.startswith('/')):
return 'path cannot be outside' return 'path cannot be outside'
@classmethod @classmethod
@ -1182,7 +1091,7 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
diff = {'added': [], 'removed': [], 'changed': [], 'unreachable': []} diff = {'added': [], 'removed': [], 'changed': [], 'unreachable': []}
for proj in fromKeys: for proj in fromKeys:
if proj not in toKeys: if not proj in toKeys:
diff['removed'].append(fromProjects[proj]) diff['removed'].append(fromProjects[proj])
else: else:
fromProj = fromProjects[proj] fromProj = fromProjects[proj]
@ -1223,3 +1132,4 @@ class GitcManifest(XmlManifest):
"""Output GITC Specific Project attributes""" """Output GITC Specific Project attributes"""
if p.old_revision: if p.old_revision:
e.setAttribute('old-revision', str(p.old_revision)) e.setAttribute('old-revision', str(p.old_revision))
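For illustration, the copyfile/linkfile validation rewritten above splits candidate paths on both '/' and os.sep, rejects '.', '..', '.git', and any component starting with '.repo', but special-cases a bare '.' so projects can still be linked under a stable name. The self-contained sketch below mirrors just that component check; the sample paths are hypothetical.

# Sketch only: the component check applied to copyfile/linkfile paths.
import os
import re


def check_components(path):
  """Return an error string for a bad path, or None if it passes the check."""
  resep = re.compile(r'[/%s]' % re.escape(os.path.sep))
  parts = resep.split(path.lower())
  # A bare '.' is allowed so people can create stable links to projects.
  if parts == ['.']:
    return None
  for part in set(parts):
    if part in {'.', '..', '.git'} or part.startswith('.repo'):
      return 'bad component: %s' % (part,)
  return None


# Hypothetical samples.
for sample in ('docs/manifest.md', '.', 'a/../b', 'x/.git/config', '.repo/x'):
  print('%-20s -> %s' % (sample, check_components(sample)))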

View File

@ -27,7 +27,6 @@ pager_process = None
old_stdout = None old_stdout = None
old_stderr = None old_stderr = None
def RunPager(globalConfig): def RunPager(globalConfig):
if not os.isatty(0) or not os.isatty(1): if not os.isatty(0) or not os.isatty(1):
return return
@ -36,37 +35,33 @@ def RunPager(globalConfig):
return return
if platform_utils.isWindows(): if platform_utils.isWindows():
_PipePager(pager) _PipePager(pager);
else: else:
_ForkPager(pager) _ForkPager(pager)
def TerminatePager(): def TerminatePager():
global pager_process, old_stdout, old_stderr global pager_process, old_stdout, old_stderr
if pager_process: if pager_process:
sys.stdout.flush() sys.stdout.flush()
sys.stderr.flush() sys.stderr.flush()
pager_process.stdin.close() pager_process.stdin.close()
pager_process.wait() pager_process.wait();
pager_process = None pager_process = None
# Restore initial stdout/err in case there is more output in this process # Restore initial stdout/err in case there is more output in this process
# after shutting down the pager process # after shutting down the pager process
sys.stdout = old_stdout sys.stdout = old_stdout
sys.stderr = old_stderr sys.stderr = old_stderr
def _PipePager(pager): def _PipePager(pager):
global pager_process, old_stdout, old_stderr global pager_process, old_stdout, old_stderr
assert pager_process is None, "Only one active pager process at a time" assert pager_process is None, "Only one active pager process at a time"
# Create pager process, piping stdout/err into its stdin # Create pager process, piping stdout/err into its stdin
pager_process = subprocess.Popen([pager], stdin=subprocess.PIPE, stdout=sys.stdout, pager_process = subprocess.Popen([pager], stdin=subprocess.PIPE, stdout=sys.stdout, stderr=sys.stderr)
stderr=sys.stderr)
old_stdout = sys.stdout old_stdout = sys.stdout
old_stderr = sys.stderr old_stderr = sys.stderr
sys.stdout = pager_process.stdin sys.stdout = pager_process.stdin
sys.stderr = pager_process.stdin sys.stderr = pager_process.stdin
def _ForkPager(pager): def _ForkPager(pager):
global active global active
# This process turns into the pager; a child it forks will # This process turns into the pager; a child it forks will
@ -93,7 +88,6 @@ def _ForkPager(pager):
print("fatal: cannot start pager '%s'" % pager, file=sys.stderr) print("fatal: cannot start pager '%s'" % pager, file=sys.stderr)
sys.exit(255) sys.exit(255)
def _SelectPager(globalConfig): def _SelectPager(globalConfig):
try: try:
return os.environ['GIT_PAGER'] return os.environ['GIT_PAGER']
@ -111,7 +105,6 @@ def _SelectPager(globalConfig):
return 'less' return 'less'
def _BecomePager(pager): def _BecomePager(pager):
# Delaying execution of the pager until we have output # Delaying execution of the pager until we have output
# ready works around a long-standing bug in popularly # ready works around a long-standing bug in popularly

View File

@ -90,14 +90,8 @@ class _FileDescriptorStreamsNonBlocking(FileDescriptorStreams):
""" Implementation of FileDescriptorStreams for platforms that support """ Implementation of FileDescriptorStreams for platforms that support
non blocking I/O. non blocking I/O.
""" """
def __init__(self):
super(_FileDescriptorStreamsNonBlocking, self).__init__()
self._poll = select.poll()
self._fd_to_stream = {}
class Stream(object): class Stream(object):
""" Encapsulates a file descriptor """ """ Encapsulates a file descriptor """
def __init__(self, fd, dest, std_name): def __init__(self, fd, dest, std_name):
self.fd = fd self.fd = fd
self.dest = dest self.dest = dest
@ -119,18 +113,11 @@ class _FileDescriptorStreamsNonBlocking(FileDescriptorStreams):
self.fd.close() self.fd.close()
def _create_stream(self, fd, dest, std_name): def _create_stream(self, fd, dest, std_name):
stream = self.Stream(fd, dest, std_name) return self.Stream(fd, dest, std_name)
self._fd_to_stream[stream.fileno()] = stream
self._poll.register(stream, select.POLLIN)
return stream
def remove(self, stream):
self._poll.unregister(stream)
del self._fd_to_stream[stream.fileno()]
super(_FileDescriptorStreamsNonBlocking, self).remove(stream)
def select(self): def select(self):
return [self._fd_to_stream[fd] for fd, _ in self._poll.poll()] ready_streams, _, _ = select.select(self.streams, [], [])
return ready_streams
class _FileDescriptorStreamsThreads(FileDescriptorStreams): class _FileDescriptorStreamsThreads(FileDescriptorStreams):
@ -138,7 +125,6 @@ class _FileDescriptorStreamsThreads(FileDescriptorStreams):
non blocking I/O. This implementation requires creating threads issuing non blocking I/O. This implementation requires creating threads issuing
blocking read operations on file descriptors. blocking read operations on file descriptors.
""" """
def __init__(self): def __init__(self):
super(_FileDescriptorStreamsThreads, self).__init__() super(_FileDescriptorStreamsThreads, self).__init__()
# The queue is shared accross all threads so we can simulate the # The queue is shared accross all threads so we can simulate the
@ -158,14 +144,12 @@ class _FileDescriptorStreamsThreads(FileDescriptorStreams):
class QueueItem(object): class QueueItem(object):
""" Item put in the shared queue """ """ Item put in the shared queue """
def __init__(self, stream, data): def __init__(self, stream, data):
self.stream = stream self.stream = stream
self.data = data self.data = data
class Stream(object): class Stream(object):
""" Encapsulates a file descriptor """ """ Encapsulates a file descriptor """
def __init__(self, fd, dest, std_name, queue): def __init__(self, fd, dest, std_name, queue):
self.fd = fd self.fd = fd
self.dest = dest self.dest = dest
@ -191,7 +175,7 @@ class _FileDescriptorStreamsThreads(FileDescriptorStreams):
for line in iter(self.fd.readline, b''): for line in iter(self.fd.readline, b''):
self.queue.put(_FileDescriptorStreamsThreads.QueueItem(self, line)) self.queue.put(_FileDescriptorStreamsThreads.QueueItem(self, line))
self.fd.close() self.fd.close()
self.queue.put(_FileDescriptorStreamsThreads.QueueItem(self, b'')) self.queue.put(_FileDescriptorStreamsThreads.QueueItem(self, None))
def symlink(source, link_name): def symlink(source, link_name):
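
The left column above replaces `select.select()` with `select.poll()` plus a map from raw file descriptors back to stream objects. A small self-contained sketch of that multiplexing pattern follows; `select.poll` is POSIX-only, which is why the thread-based implementation remains for Windows.

```python
import os
import select

class PollStreams(object):
    """Minimal fd multiplexer in the style of the poll()-based code above."""

    def __init__(self):
        self._poll = select.poll()
        self._fd_to_file = {}

    def add(self, fileobj):
        # POLLIN means "has data to read"; poll() reports raw fds, so keep
        # a map from fd back to the wrapping file object.
        self._fd_to_file[fileobj.fileno()] = fileobj
        self._poll.register(fileobj, select.POLLIN)

    def remove(self, fileobj):
        self._poll.unregister(fileobj)
        del self._fd_to_file[fileobj.fileno()]

    def ready(self, timeout_ms=None):
        return [self._fd_to_file[fd] for fd, _ in self._poll.poll(timeout_ms)]

if __name__ == '__main__':
    r, w = os.pipe()
    streams = PollStreams()
    streams.add(os.fdopen(r, 'rb'))
    os.write(w, b'hello\n')
    for stream in streams.ready(1000):
        print(stream.readline())   # b'hello\n'
```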

View File

@ -152,8 +152,7 @@ def create_dirsymlink(source, link_name):
def _create_symlink(source, link_name, dwFlags): def _create_symlink(source, link_name, dwFlags):
if not CreateSymbolicLinkW(link_name, source, if not CreateSymbolicLinkW(link_name, source, dwFlags | SYMBOLIC_LINK_FLAG_ALLOW_UNPRIVILEGED_CREATE):
dwFlags | SYMBOLIC_LINK_FLAG_ALLOW_UNPRIVILEGED_CREATE):
# See https://github.com/golang/go/pull/24307/files#diff-b87bc12e4da2497308f9ef746086e4f0 # See https://github.com/golang/go/pull/24307/files#diff-b87bc12e4da2497308f9ef746086e4f0
# "the unprivileged create flag is unsupported below Windows 10 (1703, v10.0.14972). # "the unprivileged create flag is unsupported below Windows 10 (1703, v10.0.14972).
# retry without it." # retry without it."
@ -219,8 +218,8 @@ def _preserve_encoding(source, target):
if is_python3(): if is_python3():
return target return target
if isinstance(source, unicode): # noqa: F821 if isinstance(source, unicode):
return unicode(target) # noqa: F821 return unicode(target)
return str(target) return str(target)

View File

@ -26,7 +26,6 @@ _NOT_TTY = not os.isatty(2)
# column 0. # column 0.
CSI_ERASE_LINE = '\x1b[2K' CSI_ERASE_LINE = '\x1b[2K'
class Progress(object): class Progress(object):
def __init__(self, title, total=0, units='', print_newline=False, def __init__(self, title, total=0, units='', print_newline=False,
always_print_percentage=False): always_print_percentage=False):
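
As a side note, the `CSI_ERASE_LINE` escape referenced above is what lets a progress meter redraw itself on a single line. A minimal self-contained sketch (not repo's `Progress` class):

```python
import sys
import time

CSI_ERASE_LINE = '\x1b[2K'   # ANSI "erase entire line"

def demo_progress(title, total):
    for done in range(1, total + 1):
        # '\r' returns the cursor to column 0, the CSI sequence clears the
        # previously drawn text, then the updated meter is drawn.
        sys.stderr.write('\r%s%s: %3d%% (%d/%d)' %
                         (CSI_ERASE_LINE, title, done * 100 // total, done, total))
        sys.stderr.flush()
        time.sleep(0.02)
    sys.stderr.write('\n')

if __name__ == '__main__':
    demo_progress('Syncing work tree', 50)
```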

File diff suppressed because it is too large.

View File

@ -16,6 +16,5 @@
import sys import sys
def is_python3(): def is_python3():
return sys.version_info[0] == 3 return sys.version_info[0] == 3

View File

@ -1,2 +0,0 @@
These are helper tools for managing official releases.
See the [release process](../docs/release-process.md) document for more details.

View File

@ -1,114 +0,0 @@
#!/usr/bin/env python3
# Copyright (C) 2020 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""Helper tool for signing repo launcher scripts correctly.
This is intended to be run only by the official Repo release managers.
"""
import argparse
import os
import subprocess
import sys
import util
def sign(opts):
"""Sign the launcher!"""
output = ''
for key in opts.keys:
# We use ! at the end of the key so that gpg uses this specific key.
# Otherwise it uses the key as a lookup into the overall key and uses the
# default signing key. i.e. It will see that KEYID_RSA is a subkey of
# another key, and use the primary key to sign instead of the subkey.
cmd = ['gpg', '--homedir', opts.gpgdir, '-u', f'{key}!', '--batch', '--yes',
'--armor', '--detach-sign', '--output', '-', opts.launcher]
ret = util.run(opts, cmd, encoding='utf-8', stdout=subprocess.PIPE)
output += ret.stdout
# Save the combined signatures into one file.
with open(f'{opts.launcher}.asc', 'w', encoding='utf-8') as fp:
fp.write(output)
def check(opts):
"""Check the signature."""
util.run(opts, ['gpg', '--verify', f'{opts.launcher}.asc'])
def postmsg(opts):
"""Helpful info to show at the end for release manager."""
print(f"""
Repo launcher bucket:
gs://git-repo-downloads/
To upload this launcher directly:
gsutil cp -a public-read {opts.launcher} {opts.launcher}.asc gs://git-repo-downloads/
NB: You probably want to upload it with a specific version first, e.g.:
gsutil cp -a public-read {opts.launcher} gs://git-repo-downloads/repo-3.0
gsutil cp -a public-read {opts.launcher}.asc gs://git-repo-downloads/repo-3.0.asc
""")
def get_parser():
"""Get a CLI parser."""
parser = argparse.ArgumentParser(description=__doc__)
parser.add_argument('-n', '--dry-run',
dest='dryrun', action='store_true',
help='show everything that would be done')
parser.add_argument('--gpgdir',
default=os.path.join(util.HOMEDIR, '.gnupg', 'repo'),
help='path to dedicated gpg dir with release keys '
'(default: ~/.gnupg/repo/)')
parser.add_argument('--keyid', dest='keys', default=[], action='append',
help='alternative signing keys to use')
parser.add_argument('launcher',
default=os.path.join(util.TOPDIR, 'repo'), nargs='?',
help='the launcher script to sign')
return parser
def main(argv):
"""The main func!"""
parser = get_parser()
opts = parser.parse_args(argv)
if not os.path.exists(opts.gpgdir):
parser.error(f'--gpgdir does not exist: {opts.gpgdir}')
if not os.path.exists(opts.launcher):
parser.error(f'launcher does not exist: {opts.launcher}')
opts.launcher = os.path.relpath(opts.launcher)
print(f'Signing "{opts.launcher}" launcher script and saving to '
f'"{opts.launcher}.asc"')
if opts.keys:
print(f'Using custom keys to sign: {" ".join(opts.keys)}')
else:
print('Using official Repo release keys to sign')
opts.keys = [util.KEYID_DSA, util.KEYID_RSA, util.KEYID_ECC]
util.import_release_key(opts)
sign(opts)
check(opts)
postmsg(opts)
return 0
if __name__ == '__main__':
sys.exit(main(sys.argv[1:]))
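
The comment in `sign()` about the trailing `!` is the key detail: it forces gpg to sign with exactly that (sub)key instead of resolving back to the primary key. A hedged, stand-alone illustration of the detach-sign/verify round trip; the key id and file names are placeholders, and a working gpg install with that key in the keyring is assumed.

```python
import subprocess

KEYID = '0123456789ABCDEF'   # placeholder; use a real (sub)key fingerprint

# '--detach-sign --armor' writes an ASCII .asc signature alongside the data file.
subprocess.run(['gpg', '-u', '%s!' % KEYID, '--armor', '--detach-sign',
                '--output', 'repo.asc', 'repo'], check=True)

# With a detached signature named <file>.asc, gpg locates the data file itself.
subprocess.run(['gpg', '--verify', 'repo.asc'], check=True)
```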

View File

@ -1,140 +0,0 @@
#!/usr/bin/env python3
# Copyright (C) 2020 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""Helper tool for signing repo release tags correctly.
This is intended to be run only by the official Repo release managers, but it
could be run by people maintaining their own fork of the project.
NB: Check docs/release-process.md for production freeze information.
"""
import argparse
import os
import re
import subprocess
import sys
import util
# We currently sign with the old DSA key as it's been around the longest.
# We should transition to RSA by Jun 2020, and ECC by Jun 2021.
KEYID = util.KEYID_DSA
# Regular expression to validate tag names.
RE_VALID_TAG = r'^v([0-9]+[.])+[0-9]+$'
def sign(opts):
"""Tag the commit & sign it!"""
# We use ! at the end of the key so that gpg uses this specific key.
# Otherwise it uses the key as a lookup into the overall key and uses the
# default signing key. i.e. It will see that KEYID_RSA is a subkey of
# another key, and use the primary key to sign instead of the subkey.
cmd = ['git', 'tag', '-s', opts.tag, '-u', f'{opts.key}!',
'-m', f'repo {opts.tag}', opts.commit]
key = 'GNUPGHOME'
print('+', f'export {key}="{opts.gpgdir}"')
oldvalue = os.getenv(key)
os.putenv(key, opts.gpgdir)
util.run(opts, cmd)
if oldvalue is None:
os.unsetenv(key)
else:
os.putenv(key, oldvalue)
def check(opts):
"""Check the signature."""
util.run(opts, ['git', 'tag', '--verify', opts.tag])
def postmsg(opts):
"""Helpful info to show at the end for release manager."""
cmd = ['git', 'rev-parse', 'remotes/origin/stable']
ret = util.run(opts, cmd, encoding='utf-8', stdout=subprocess.PIPE)
current_release = ret.stdout.strip()
cmd = ['git', 'log', '--format=%h (%aN) %s', '--no-merges',
f'remotes/origin/stable..{opts.tag}']
ret = util.run(opts, cmd, encoding='utf-8', stdout=subprocess.PIPE)
shortlog = ret.stdout.strip()
print(f"""
Here's the short log since the last release.
{shortlog}
To push release to the public:
git push origin {opts.commit}:stable {opts.tag} -n
NB: People will start upgrading to this version immediately.
To roll back a release:
git push origin --force {current_release}:stable -n
""")
def get_parser():
"""Get a CLI parser."""
parser = argparse.ArgumentParser(
description=__doc__,
formatter_class=argparse.RawDescriptionHelpFormatter)
parser.add_argument('-n', '--dry-run',
dest='dryrun', action='store_true',
help='show everything that would be done')
parser.add_argument('--gpgdir',
default=os.path.join(util.HOMEDIR, '.gnupg', 'repo'),
help='path to dedicated gpg dir with release keys '
'(default: ~/.gnupg/repo/)')
parser.add_argument('-f', '--force', action='store_true',
help='force signing of any tag')
parser.add_argument('--keyid', dest='key',
help='alternative signing key to use')
parser.add_argument('tag',
help='the tag to create (e.g. "v2.0")')
parser.add_argument('commit', default='HEAD', nargs='?',
help='the commit to tag')
return parser
def main(argv):
"""The main func!"""
parser = get_parser()
opts = parser.parse_args(argv)
if not os.path.exists(opts.gpgdir):
parser.error(f'--gpgdir does not exist: {opts.gpgdir}')
if not opts.force and not re.match(RE_VALID_TAG, opts.tag):
parser.error(f'tag "{opts.tag}" does not match regex "{RE_VALID_TAG}"; '
'use --force to sign anyways')
if opts.key:
print(f'Using custom key to sign: {opts.key}')
else:
print('Using official Repo release key to sign')
opts.key = KEYID
util.import_release_key(opts)
sign(opts)
check(opts)
postmsg(opts)
return 0
if __name__ == '__main__':
sys.exit(main(sys.argv[1:]))
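
`sign()` above temporarily exports GNUPGHOME with `os.putenv()` and restores the previous value afterwards. As a design note, the same effect can be scoped to the child process alone by passing an `env` mapping to subprocess; a hedged sketch, where the tag name, key id and gpg directory are placeholders:

```python
import os
import subprocess

env = os.environ.copy()
env['GNUPGHOME'] = os.path.expanduser('~/.gnupg/repo')   # dedicated keyring dir

# Only this git invocation sees the overridden GNUPGHOME; the parent
# process environment is left untouched.
subprocess.run(['git', 'tag', '-s', 'v2.0', '-u', '0123456789ABCDEF!',
                '-m', 'repo v2.0', 'HEAD'], env=env, check=True)
```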

View File

@ -1,73 +0,0 @@
# Copyright (C) 2020 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""Random utility code for release tools."""
import os
import re
import subprocess
import sys
assert sys.version_info >= (3, 6), 'This module requires Python 3.6+'
TOPDIR = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))
HOMEDIR = os.path.expanduser('~')
# These are the release keys we sign with.
KEYID_DSA = '8BB9AD793E8E6153AF0F9A4416530D5E920F5C65'
KEYID_RSA = 'A34A13BE8E76BFF46A0C022DA2E75A824AAB9624'
KEYID_ECC = 'E1F9040D7A3F6DAFAC897CD3D3B95DA243E48A39'
def cmdstr(cmd):
"""Get a nicely quoted shell command."""
ret = []
for arg in cmd:
if not re.match(r'^[a-zA-Z0-9/_.=-]+$', arg):
arg = f'"{arg}"'
ret.append(arg)
return ' '.join(ret)
def run(opts, cmd, check=True, **kwargs):
"""Helper around subprocess.run to include logging."""
print('+', cmdstr(cmd))
if opts.dryrun:
cmd = ['true', '--'] + cmd
try:
return subprocess.run(cmd, check=check, **kwargs)
except subprocess.CalledProcessError as e:
print(f'aborting: {e}', file=sys.stderr)
sys.exit(1)
def import_release_key(opts):
"""Import the public key of the official release repo signing key."""
# Extract the key from our repo launcher.
launcher = getattr(opts, 'launcher', os.path.join(TOPDIR, 'repo'))
print(f'Importing keys from "{launcher}" launcher script')
with open(launcher, encoding='utf-8') as fp:
data = fp.read()
keys = re.findall(
r'\n-----BEGIN PGP PUBLIC KEY BLOCK-----\n[^-]*'
r'\n-----END PGP PUBLIC KEY BLOCK-----\n', data, flags=re.M)
run(opts, ['gpg', '--import'], input='\n'.join(keys).encode('utf-8'))
print('Marking keys as fully trusted')
run(opts, ['gpg', '--import-ownertrust'],
input=f'{KEYID_DSA}:6:\n'.encode('utf-8'))
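
The key-block regex in `import_release_key()` is easy to sanity-check in isolation; a tiny self-contained example with made-up launcher contents:

```python
import re

data = ('launcher preamble\n'
        '-----BEGIN PGP PUBLIC KEY BLOCK-----\n'
        'mQINBF...base64ish payload...\n'
        '-----END PGP PUBLIC KEY BLOCK-----\n'
        'rest of the script\n')

keys = re.findall(
    r'\n-----BEGIN PGP PUBLIC KEY BLOCK-----\n[^-]*'
    r'\n-----END PGP PUBLIC KEY BLOCK-----\n', data, flags=re.M)

# Exactly one armored block is found, delimiters included.
assert len(keys) == 1 and keys[0].lstrip().startswith('-----BEGIN')
```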

repo (804 changed lines)

File diff suppressed because it is too large.

View File

@ -28,16 +28,13 @@ REPO_TRACE = 'REPO_TRACE'
_TRACE = os.environ.get(REPO_TRACE) == '1' _TRACE = os.environ.get(REPO_TRACE) == '1'
def IsTrace(): def IsTrace():
return _TRACE return _TRACE
def SetTrace(): def SetTrace():
global _TRACE global _TRACE
_TRACE = True _TRACE = True
def Trace(fmt, *args): def Trace(fmt, *args):
if IsTrace(): if IsTrace():
print(fmt % args, file=sys.stderr) print(fmt % args, file=sys.stderr)

View File

@ -42,11 +42,9 @@ def main(argv):
"""The main entry.""" """The main entry."""
# Add the repo tree to PYTHONPATH as the tests expect to be able to import # Add the repo tree to PYTHONPATH as the tests expect to be able to import
# modules directly. # modules directly.
pythonpath = os.path.dirname(os.path.realpath(__file__)) topdir = os.path.dirname(os.path.realpath(__file__))
oldpythonpath = os.environ.get('PYTHONPATH', None) pythonpath = os.environ.get('PYTHONPATH', '')
if oldpythonpath is not None: os.environ['PYTHONPATH'] = '%s:%s' % (topdir, pythonpath)
pythonpath += os.pathsep + oldpythonpath
os.environ['PYTHONPATH'] = pythonpath
return run_pytest('pytest', argv) return run_pytest('pytest', argv)

View File

@ -16,7 +16,6 @@
import os import os
# A mapping of the subcommand name to the class that implements it.
all_commands = {} all_commands = {}
my_dir = os.path.dirname(__file__) my_dir = os.path.dirname(__file__)
@ -38,7 +37,7 @@ for py in os.listdir(my_dir):
['%s' % name]) ['%s' % name])
mod = getattr(mod, name) mod = getattr(mod, name)
try: try:
cmd = getattr(mod, clsn) cmd = getattr(mod, clsn)()
except AttributeError: except AttributeError:
raise SyntaxError('%s/%s does not define class %s' % ( raise SyntaxError('%s/%s does not define class %s' % (
__name__, py, clsn)) __name__, py, clsn))
@ -47,5 +46,5 @@ for py in os.listdir(my_dir):
cmd.NAME = name cmd.NAME = name
all_commands[name] = cmd all_commands[name] = cmd
# Add 'branch' as an alias for 'branches'. if 'help' in all_commands:
all_commands['branch'] = all_commands['branches'] all_commands['help'].commands = all_commands
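
The loop above (left column) now stores the command *class* rather than an instance, and registers `branch` as an alias for `branches`. A condensed sketch of that discovery pattern, written as it would sit inside a package's `__init__.py`; it swaps the original `__import__` call for `importlib` and otherwise follows the naming convention shown above.

```python
import importlib
import os

all_commands = {}
my_dir = os.path.dirname(__file__)

for py in os.listdir(my_dir):
    if py == '__init__.py' or not py.endswith('.py'):
        continue
    name = py[:-3]
    # cherry_pick.py -> class CherryPick, etc.
    clsn = name.replace('_', ' ').title().replace(' ', '')
    mod = importlib.import_module('%s.%s' % (__package__, name))
    try:
        cmd = getattr(mod, clsn)          # keep the class; instantiate on use
    except AttributeError:
        raise SyntaxError('%s/%s does not define class %s' % (__name__, py, clsn))
    cmd.NAME = name
    all_commands[name] = cmd

# Convenience alias, mirroring the change above.
if 'branches' in all_commands:
    all_commands['branch'] = all_commands['branches']
```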

View File

@ -15,15 +15,12 @@
# limitations under the License. # limitations under the License.
from __future__ import print_function from __future__ import print_function
from collections import defaultdict
import sys import sys
from command import Command from command import Command
from collections import defaultdict
from git_command import git from git_command import git
from progress import Progress from progress import Progress
class Abandon(Command): class Abandon(Command):
common = True common = True
helpSummary = "Permanently abandon a development branch" helpSummary = "Permanently abandon a development branch"
@ -35,11 +32,7 @@ deleting it (and all its history) from your local repository.
It is equivalent to "git branch -D <branchname>". It is equivalent to "git branch -D <branchname>".
""" """
def _Options(self, p): def _Options(self, p):
p.add_option('-q', '--quiet',
action='store_true', default=False,
help='be quiet')
p.add_option('--all', p.add_option('--all',
dest='all', action='store_true', dest='all', action='store_true',
help='delete all branches in all projects') help='delete all branches in all projects')
@ -96,14 +89,11 @@ It is equivalent to "git branch -D <branchname>".
file=sys.stderr) file=sys.stderr)
sys.exit(1) sys.exit(1)
else: else:
# Everything below here is displaying status. print('Abandoned branches:', file=sys.stderr)
if opt.quiet:
return
print('Abandoned branches:')
for br in success.keys(): for br in success.keys():
if len(all_projects) > 1 and len(all_projects) == len(success[br]): if len(all_projects) > 1 and len(all_projects) == len(success[br]):
result = "all project" result = "all project"
else: else:
result = "%s" % ( result = "%s" % (
('\n'+' '*width + '| ').join(p.relpath for p in success[br])) ('\n'+' '*width + '| ').join(p.relpath for p in success[br]))
print("%s%s| %s\n" % (br, ' ' * (width - len(br)), result)) print("%s%s| %s\n" % (br,' '*(width-len(br)), result),file=sys.stderr)

View File

@ -19,7 +19,6 @@ import sys
from color import Coloring from color import Coloring
from command import Command from command import Command
class BranchColoring(Coloring): class BranchColoring(Coloring):
def __init__(self, config): def __init__(self, config):
Coloring.__init__(self, config, 'branch') Coloring.__init__(self, config, 'branch')
@ -27,7 +26,6 @@ class BranchColoring(Coloring):
self.local = self.printer('local') self.local = self.printer('local')
self.notinproject = self.printer('notinproject', fg='red') self.notinproject = self.printer('notinproject', fg='red')
class BranchInfo(object): class BranchInfo(object):
def __init__(self, name): def __init__(self, name):
self.name = name self.name = name
@ -160,7 +158,7 @@ is shown, then the branch appears in all projects.
for b in i.projects: for b in i.projects:
have.add(b.project) have.add(b.project)
for p in projects: for p in projects:
if p not in have: if not p in have:
paths.append(p.relpath) paths.append(p.relpath)
s = ' %s %s' % (in_type, ', '.join(paths)) s = ' %s %s' % (in_type, ', '.join(paths))

View File

@ -19,7 +19,6 @@ import sys
from command import Command from command import Command
from progress import Progress from progress import Progress
class Checkout(Command): class Checkout(Command):
common = True common = True
helpSummary = "Checkout a branch for development" helpSummary = "Checkout a branch for development"

View File

@ -22,7 +22,6 @@ from git_command import GitCommand
CHANGE_ID_RE = re.compile(r'^\s*Change-Id: I([0-9a-f]{40})\s*$') CHANGE_ID_RE = re.compile(r'^\s*Change-Id: I([0-9a-f]{40})\s*$')
class CherryPick(Command): class CherryPick(Command):
common = True common = True
helpSummary = "Cherry-pick a change." helpSummary = "Cherry-pick a change."

View File

@ -16,7 +16,6 @@
from command import PagedCommand from command import PagedCommand
class Diff(PagedCommand): class Diff(PagedCommand):
common = True common = True
helpSummary = "Show changes between commit and working tree" helpSummary = "Show changes between commit and working tree"
@ -29,6 +28,10 @@ to the Unix 'patch' command.
""" """
def _Options(self, p): def _Options(self, p):
def cmd(option, opt_str, value, parser):
setattr(parser.values, option.dest, list(parser.rargs))
while parser.rargs:
del parser.rargs[0]
p.add_option('-u', '--absolute', p.add_option('-u', '--absolute',
dest='absolute', action='store_true', dest='absolute', action='store_true',
help='Paths are relative to the repository root') help='Paths are relative to the repository root')

View File

@ -18,12 +18,10 @@ from color import Coloring
from command import PagedCommand from command import PagedCommand
from manifest_xml import XmlManifest from manifest_xml import XmlManifest
class _Coloring(Coloring): class _Coloring(Coloring):
def __init__(self, config): def __init__(self, config):
Coloring.__init__(self, config, "status") Coloring.__init__(self, config, "status")
class Diffmanifests(PagedCommand): class Diffmanifests(PagedCommand):
""" A command to see logs in projects represented by manifests """ A command to see logs in projects represented by manifests
@ -79,7 +77,7 @@ synced and their revisions won't be found.
metavar='<FORMAT>', metavar='<FORMAT>',
help='print the log using a custom git pretty format string') help='print the log using a custom git pretty format string')
def _printRawDiff(self, diff, pretty_format=None): def _printRawDiff(self, diff):
for project in diff['added']: for project in diff['added']:
self.printText("A %s %s" % (project.relpath, project.revisionExpr)) self.printText("A %s %s" % (project.relpath, project.revisionExpr))
self.out.nl() self.out.nl()
@ -92,7 +90,7 @@ synced and their revisions won't be found.
self.printText("C %s %s %s" % (project.relpath, project.revisionExpr, self.printText("C %s %s %s" % (project.relpath, project.revisionExpr,
otherProject.revisionExpr)) otherProject.revisionExpr))
self.out.nl() self.out.nl()
self._printLogs(project, otherProject, raw=True, color=False, pretty_format=pretty_format) self._printLogs(project, otherProject, raw=True, color=False)
for project, otherProject in diff['unreachable']: for project, otherProject in diff['unreachable']:
self.printText("U %s %s %s" % (project.relpath, project.revisionExpr, self.printText("U %s %s %s" % (project.relpath, project.revisionExpr,
@ -203,6 +201,6 @@ synced and their revisions won't be found.
diff = manifest1.projectsDiff(manifest2) diff = manifest1.projectsDiff(manifest2)
if opt.raw: if opt.raw:
self._printRawDiff(diff, pretty_format=opt.pretty_format) self._printRawDiff(diff)
else: else:
self._printDiff(diff, color=opt.color, pretty_format=opt.pretty_format) self._printDiff(diff, color=opt.color, pretty_format=opt.pretty_format)

View File

@ -23,7 +23,6 @@ from error import GitError
CHANGE_RE = re.compile(r'^([1-9][0-9]*)(?:[/\.-]([1-9][0-9]*))?$') CHANGE_RE = re.compile(r'^([1-9][0-9]*)(?:[/\.-]([1-9][0-9]*))?$')
class Download(Command): class Download(Command):
common = True common = True
helpSummary = "Download and checkout a change" helpSummary = "Download and checkout a change"
@ -37,13 +36,9 @@ If no project is specified try to use current directory as a project.
""" """
def _Options(self, p): def _Options(self, p):
p.add_option('-b', '--branch',
help='create a new branch first')
p.add_option('-c', '--cherry-pick', p.add_option('-c', '--cherry-pick',
dest='cherrypick', action='store_true', dest='cherrypick', action='store_true',
help="cherry-pick instead of checkout") help="cherry-pick instead of checkout")
p.add_option('-x', '--record-origin', action='store_true',
help='pass -x when cherry-picking')
p.add_option('-r', '--revert', p.add_option('-r', '--revert',
dest='revert', action='store_true', dest='revert', action='store_true',
help="revert instead of checkout") help="revert instead of checkout")
@ -82,14 +77,6 @@ If no project is specified try to use current directory as a project.
project = self.GetProjects([a])[0] project = self.GetProjects([a])[0]
return to_get return to_get
def ValidateOptions(self, opt, args):
if opt.record_origin:
if not opt.cherrypick:
self.OptionParser.error('-x only makes sense with --cherry-pick')
if opt.ffonly:
self.OptionParser.error('-x and --ff are mutually exclusive options')
def Execute(self, opt, args): def Execute(self, opt, args):
for project, change_id, ps_id in self._ParseChangeIds(args): for project, change_id, ps_id in self._ParseChangeIds(args):
dl = project.DownloadPatchSet(change_id, ps_id) dl = project.DownloadPatchSet(change_id, ps_id)
@ -106,41 +93,22 @@ If no project is specified try to use current directory as a project.
continue continue
if len(dl.commits) > 1: if len(dl.commits) > 1:
print('[%s] %d/%d depends on %d unmerged changes:' print('[%s] %d/%d depends on %d unmerged changes:' \
% (project.name, change_id, ps_id, len(dl.commits)), % (project.name, change_id, ps_id, len(dl.commits)),
file=sys.stderr) file=sys.stderr)
for c in dl.commits: for c in dl.commits:
print(' %s' % (c), file=sys.stderr) print(' %s' % (c), file=sys.stderr)
if opt.cherrypick: if opt.cherrypick:
mode = 'cherry-pick'
elif opt.revert:
mode = 'revert'
elif opt.ffonly:
mode = 'fast-forward merge'
else:
mode = 'checkout'
# We'll combine the branch+checkout operation, but all the rest need a
# dedicated branch start.
if opt.branch and mode != 'checkout':
project.StartBranch(opt.branch)
try: try:
if opt.cherrypick: project._CherryPick(dl.commit)
project._CherryPick(dl.commit, ffonly=opt.ffonly, except GitError:
record_origin=opt.record_origin) print('[%s] Could not complete the cherry-pick of %s' \
% (project.name, dl.commit), file=sys.stderr)
sys.exit(1)
elif opt.revert: elif opt.revert:
project._Revert(dl.commit) project._Revert(dl.commit)
elif opt.ffonly: elif opt.ffonly:
project._FastForward(dl.commit, ffonly=True) project._FastForward(dl.commit, ffonly=True)
else:
if opt.branch:
project.StartBranch(opt.branch, revision=dl.commit)
else: else:
project._Checkout(dl.commit) project._Checkout(dl.commit)
except GitError:
print('[%s] Could not complete the %s of %s'
% (project.name, mode, dl.commit), file=sys.stderr)
sys.exit(1)
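
The new `ValidateOptions()` hook rejects bad option combinations before `Execute()` runs. optparse has no built-in notion of mutually exclusive flags, so the checks are written out explicitly; a small self-contained sketch of that pattern, with flag names taken from the diff above and an invented change id:

```python
import optparse

p = optparse.OptionParser()
p.add_option('-b', '--branch', help='create a new branch first')
p.add_option('-c', '--cherry-pick', dest='cherrypick', action='store_true')
p.add_option('-x', '--record-origin', action='store_true')
p.add_option('--ff', dest='ffonly', action='store_true')

opt, args = p.parse_args(['-c', '-x', '12345/6'])

# Explicit cross-option validation; p.error() prints usage and exits(2).
if opt.record_origin and not opt.cherrypick:
    p.error('-x only makes sense with --cherry-pick')
if opt.record_origin and opt.ffonly:
    p.error('-x and --ff are mutually exclusive options')

print('ok:', opt.cherrypick, opt.record_origin, args)   # ok: True True ['12345/6']
```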

View File

@ -127,8 +127,7 @@ without iterating through the remaining projects.
help="Execute the command only on projects matching regex or wildcard expression") help="Execute the command only on projects matching regex or wildcard expression")
p.add_option('-i', '--inverse-regex', p.add_option('-i', '--inverse-regex',
dest='inverse_regex', action='store_true', dest='inverse_regex', action='store_true',
help="Execute the command only on projects not matching regex or " help="Execute the command only on projects not matching regex or wildcard expression")
"wildcard expression")
p.add_option('-g', '--groups', p.add_option('-g', '--groups',
dest='groups', dest='groups',
help="Execute the command only on projects matching the specified groups") help="Execute the command only on projects matching the specified groups")
@ -179,8 +178,6 @@ without iterating through the remaining projects.
'annotations': dict((a.name, a.value) for a in project.annotations), 'annotations': dict((a.name, a.value) for a in project.annotations),
'gitdir': project.gitdir, 'gitdir': project.gitdir,
'worktree': project.worktree, 'worktree': project.worktree,
'upstream': project.upstream,
'dest_branch': project.dest_branch,
} }
def ValidateOptions(self, opt, args): def ValidateOptions(self, opt, args):
@ -280,7 +277,6 @@ without iterating through the remaining projects.
return return
yield [mirror, opt, cmd, shell, cnt, config, project] yield [mirror, opt, cmd, shell, cnt, config, project]
class WorkerKeyboardInterrupt(Exception): class WorkerKeyboardInterrupt(Exception):
""" Keyboard interrupt exception for worker processes. """ """ Keyboard interrupt exception for worker processes. """
pass pass
@ -289,7 +285,6 @@ class WorkerKeyboardInterrupt(Exception):
def InitWorker(): def InitWorker():
signal.signal(signal.SIGINT, signal.SIG_IGN) signal.signal(signal.SIGINT, signal.SIG_IGN)
def DoWorkWrapper(args): def DoWorkWrapper(args):
""" A wrapper around the DoWork() method. """ A wrapper around the DoWork() method.
@ -308,10 +303,11 @@ def DoWorkWrapper(args):
def DoWork(project, mirror, opt, cmd, shell, cnt, config): def DoWork(project, mirror, opt, cmd, shell, cnt, config):
env = os.environ.copy() env = os.environ.copy()
def setenv(name, val): def setenv(name, val):
if val is None: if val is None:
val = '' val = ''
if hasattr(val, 'encode'):
val = val.encode()
env[name] = val env[name] = val
setenv('REPO_PROJECT', project['name']) setenv('REPO_PROJECT', project['name'])
@ -319,8 +315,6 @@ def DoWork(project, mirror, opt, cmd, shell, cnt, config):
setenv('REPO_REMOTE', project['remote_name']) setenv('REPO_REMOTE', project['remote_name'])
setenv('REPO_LREV', project['lrev']) setenv('REPO_LREV', project['lrev'])
setenv('REPO_RREV', project['rrev']) setenv('REPO_RREV', project['rrev'])
setenv('REPO_UPSTREAM', project['upstream'])
setenv('REPO_DEST_BRANCH', project['dest_branch'])
setenv('REPO_I', str(cnt + 1)) setenv('REPO_I', str(cnt + 1))
for name in project['annotations']: for name in project['annotations']:
setenv("REPO__%s" % (name), project['annotations'][name]) setenv("REPO__%s" % (name), project['annotations'][name])
@ -374,8 +368,8 @@ def DoWork(project, mirror, opt, cmd, shell, cnt, config):
for s in in_ready: for s in in_ready:
buf = s.read().decode() buf = s.read().decode()
if not buf: if not buf:
s_in.remove(s)
s.close() s.close()
s_in.remove(s)
continue continue
if not opt.verbose: if not opt.verbose:
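
The worker above exports one `REPO_*` variable per project before running the user's command (the right column additionally byte-encoded values for Python 2). A hedged, self-contained sketch of that environment setup with invented project data, limited to the variables visible in the hunk; the `echo` at the end assumes a POSIX shell.

```python
import os
import subprocess

project = {                       # invented example data
    'name': 'platform/build',
    'remote_name': 'origin',
    'lrev': 'refs/remotes/origin/master',
    'rrev': 'refs/heads/master',
    'annotations': {'owner': 'build-team'},
}

env = os.environ.copy()

def setenv(name, val):
    # None becomes '' so the child always sees a defined (if empty) variable.
    env[name] = val if val is not None else ''

setenv('REPO_PROJECT', project['name'])
setenv('REPO_REMOTE', project['remote_name'])
setenv('REPO_LREV', project['lrev'])
setenv('REPO_RREV', project['rrev'])
setenv('REPO_I', str(1))
for name, value in project['annotations'].items():
    setenv('REPO__%s' % name, value)

subprocess.run('echo "$REPO_PROJECT ($REPO_REMOTE) -> $REPO_RREV"',
               shell=True, env=env, check=True)
```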

View File

@ -22,8 +22,7 @@ import platform_utils
from pyversion import is_python3 from pyversion import is_python3
if not is_python3(): if not is_python3():
input = raw_input # noqa: F821 input = raw_input
class GitcDelete(Command, GitcClientCommand): class GitcDelete(Command, GitcClientCommand):
common = True common = True

View File

@ -62,8 +62,7 @@ use for this GITC client.
def Execute(self, opt, args): def Execute(self, opt, args):
gitc_client = gitc_utils.parse_clientdir(os.getcwd()) gitc_client = gitc_utils.parse_clientdir(os.getcwd())
if not gitc_client or (opt.gitc_client and gitc_client != opt.gitc_client): if not gitc_client or (opt.gitc_client and gitc_client != opt.gitc_client):
print('fatal: Please update your repo command. See go/gitc for instructions.', print('fatal: Please update your repo command. See go/gitc for instructions.', file=sys.stderr)
file=sys.stderr)
sys.exit(1) sys.exit(1)
self.client_dir = os.path.join(gitc_utils.get_gitc_manifest_dir(), self.client_dir = os.path.join(gitc_utils.get_gitc_manifest_dir(),
gitc_client) gitc_client)

View File

@ -21,8 +21,7 @@ import sys
from color import Coloring from color import Coloring
from command import PagedCommand from command import PagedCommand
from error import GitError from error import GitError
from git_command import GitCommand from git_command import git_require, GitCommand
class GrepColoring(Coloring): class GrepColoring(Coloring):
def __init__(self, config): def __init__(self, config):
@ -30,7 +29,6 @@ class GrepColoring(Coloring):
self.project = self.printer('project', attr='bold') self.project = self.printer('project', attr='bold')
self.fail = self.printer('fail', fg='red') self.fail = self.printer('fail', fg='red')
class Grep(PagedCommand): class Grep(PagedCommand):
common = True common = True
helpSummary = "Print lines matching a pattern" helpSummary = "Print lines matching a pattern"
@ -158,11 +156,12 @@ contain a line that matches both expressions:
action='callback', callback=carry, action='callback', callback=carry,
help='Show only file names not containing matching lines') help='Show only file names not containing matching lines')
def Execute(self, opt, args): def Execute(self, opt, args):
out = GrepColoring(self.manifest.manifestProject.config) out = GrepColoring(self.manifest.manifestProject.config)
cmd_argv = ['grep'] cmd_argv = ['grep']
if out.is_on: if out.is_on and git_require((1, 6, 3)):
cmd_argv.append('--color') cmd_argv.append('--color')
cmd_argv.extend(getattr(opt, 'cmd_argv', [])) cmd_argv.extend(getattr(opt, 'cmd_argv', []))

View File

@ -19,12 +19,10 @@ import re
import sys import sys
from formatter import AbstractFormatter, DumbWriter from formatter import AbstractFormatter, DumbWriter
from subcmds import all_commands
from color import Coloring from color import Coloring
from command import PagedCommand, MirrorSafeCommand, GitcAvailableCommand, GitcClientCommand from command import PagedCommand, MirrorSafeCommand, GitcAvailableCommand, GitcClientCommand
import gitc_utils import gitc_utils
class Help(PagedCommand, MirrorSafeCommand): class Help(PagedCommand, MirrorSafeCommand):
common = False common = False
helpSummary = "Display detailed help on a command" helpSummary = "Display detailed help on a command"
@ -43,7 +41,7 @@ Displays detailed usage information about a command.
fmt = ' %%-%ds %%s' % maxlen fmt = ' %%-%ds %%s' % maxlen
for name in commandNames: for name in commandNames:
command = all_commands[name]() command = self.commands[name]
try: try:
summary = command.helpSummary.strip() summary = command.helpSummary.strip()
except AttributeError: except AttributeError:
@ -53,7 +51,7 @@ Displays detailed usage information about a command.
def _PrintAllCommands(self): def _PrintAllCommands(self):
print('usage: repo COMMAND [ARGS]') print('usage: repo COMMAND [ARGS]')
print('The complete list of recognized repo commands are:') print('The complete list of recognized repo commands are:')
commandNames = list(sorted(all_commands)) commandNames = list(sorted(self.commands))
self._PrintCommands(commandNames) self._PrintCommands(commandNames)
print("See 'repo help <command>' for more information on a " print("See 'repo help <command>' for more information on a "
'specific command.') 'specific command.')
@ -74,7 +72,7 @@ Displays detailed usage information about a command.
return False return False
commandNames = list(sorted([name commandNames = list(sorted([name
for name, command in all_commands.items() for name, command in self.commands.items()
if command.common and gitc_supported(command)])) if command.common and gitc_supported(command)]))
self._PrintCommands(commandNames) self._PrintCommands(commandNames)
@ -133,8 +131,8 @@ Displays detailed usage information about a command.
out._PrintSection('Description', 'helpDescription') out._PrintSection('Description', 'helpDescription')
def _PrintAllCommandHelp(self): def _PrintAllCommandHelp(self):
for name in sorted(all_commands): for name in sorted(self.commands):
cmd = all_commands[name]() cmd = self.commands[name]
cmd.manifest = self.manifest cmd.manifest = self.manifest
self._PrintCommandHelp(cmd, header_prefix='[%s] ' % (name,)) self._PrintCommandHelp(cmd, header_prefix='[%s] ' % (name,))
@ -159,7 +157,7 @@ Displays detailed usage information about a command.
name = args[0] name = args[0]
try: try:
cmd = all_commands[name]() cmd = self.commands[name]
except KeyError: except KeyError:
print("repo: '%s' is not a repo command." % name, file=sys.stderr) print("repo: '%s' is not a repo command." % name, file=sys.stderr)
sys.exit(1) sys.exit(1)

View File

@ -16,14 +16,12 @@
from command import PagedCommand from command import PagedCommand
from color import Coloring from color import Coloring
from git_refs import R_M, R_HEADS from git_refs import R_M
class _Coloring(Coloring): class _Coloring(Coloring):
def __init__(self, config): def __init__(self, config):
Coloring.__init__(self, config, "status") Coloring.__init__(self, config, "status")
class Info(PagedCommand): class Info(PagedCommand):
common = True common = True
helpSummary = "Get info on the manifest branch, current branch or unmerged branches" helpSummary = "Get info on the manifest branch, current branch or unmerged branches"
@ -43,6 +41,7 @@ class Info(PagedCommand):
dest="local", action="store_true", dest="local", action="store_true",
help="Disable all remote operations") help="Disable all remote operations")
def Execute(self, opt, args): def Execute(self, opt, args):
self.out = _Coloring(self.manifest.globalConfig) self.out = _Coloring(self.manifest.globalConfig)
self.heading = self.out.printer('heading', attr = 'bold') self.heading = self.out.printer('heading', attr = 'bold')
@ -123,14 +122,11 @@ class Info(PagedCommand):
self.printSeparator() self.printSeparator()
def findRemoteLocalDiff(self, project): def findRemoteLocalDiff(self, project):
# Fetch all the latest commits. #Fetch all the latest commits
if not self.opt.local: if not self.opt.local:
project.Sync_NetworkHalf(quiet=True, current_branch_only=True) project.Sync_NetworkHalf(quiet=True, current_branch_only=True)
branch = self.manifest.manifestProject.config.GetBranch('default').merge logTarget = R_M + self.manifest.manifestProject.config.GetBranch("default").merge
if branch.startswith(R_HEADS):
branch = branch[len(R_HEADS):]
logTarget = R_M + branch
bareTmp = project.bare_git._bare bareTmp = project.bare_git._bare
project.bare_git._bare = False project.bare_git._bare = False

View File

@ -15,8 +15,6 @@
# limitations under the License. # limitations under the License.
from __future__ import print_function from __future__ import print_function
import optparse
import os import os
import platform import platform
import re import re
@ -36,10 +34,8 @@ from command import InteractiveCommand, MirrorSafeCommand
from error import ManifestParseError from error import ManifestParseError
from project import SyncBuffer from project import SyncBuffer
from git_config import GitConfig from git_config import GitConfig
from git_command import git_require, MIN_GIT_VERSION_SOFT, MIN_GIT_VERSION_HARD from git_command import git_require, MIN_GIT_VERSION
import platform_utils import platform_utils
from wrapper import Wrapper
class Init(InteractiveCommand, MirrorSafeCommand): class Init(InteractiveCommand, MirrorSafeCommand):
common = True common = True
@ -54,8 +50,7 @@ from the server and is installed in the .repo/ directory in the
current working directory. current working directory.
The optional -b argument can be used to select the manifest branch The optional -b argument can be used to select the manifest branch
to checkout and use. If no branch is specified, the remote's default to checkout and use. If no branch is specified, master is assumed.
branch is used.
The optional -m argument can be used to specify an alternate manifest The optional -m argument can be used to specify an alternate manifest
to be used. If no manifest is specified, the manifest default.xml to be used. If no manifest is specified, the manifest default.xml
@ -89,12 +84,9 @@ to update the working directory files.
def _Options(self, p, gitc_init=False): def _Options(self, p, gitc_init=False):
# Logging # Logging
g = p.add_option_group('Logging options') g = p.add_option_group('Logging options')
g.add_option('-v', '--verbose',
dest='output_mode', action='store_true',
help='show all output')
g.add_option('-q', '--quiet', g.add_option('-q', '--quiet',
dest='output_mode', action='store_false', dest="quiet", action="store_true", default=False,
help='only show errors') help="be quiet")
# Manifest # Manifest
g = p.add_option_group('Manifest options') g = p.add_option_group('Manifest options')
@ -135,10 +127,6 @@ to update the working directory files.
g.add_option('--clone-filter', action='store', default='blob:none', g.add_option('--clone-filter', action='store', default='blob:none',
dest='clone_filter', dest='clone_filter',
help='filter for use with --partial-clone [default: %default]') help='filter for use with --partial-clone [default: %default]')
# TODO(vapier): Expose option with real help text once this has been in the
# wild for a while w/out significant bug reports. Goal is by ~Sep 2020.
g.add_option('--worktree', action='store_true',
help=optparse.SUPPRESS_HELP)
g.add_option('--archive', g.add_option('--archive',
dest='archive', action='store_true', dest='archive', action='store_true',
help='checkout an archive instead of a git repository for ' help='checkout an archive instead of a git repository for '
@ -156,13 +144,11 @@ to update the working directory files.
help='restrict manifest projects to ones with a specified ' help='restrict manifest projects to ones with a specified '
'platform group [auto|all|none|linux|darwin|...]', 'platform group [auto|all|none|linux|darwin|...]',
metavar='PLATFORM') metavar='PLATFORM')
g.add_option('--clone-bundle', action='store_true',
help='force use of /clone.bundle on HTTP/HTTPS (default if not --partial-clone)')
g.add_option('--no-clone-bundle', g.add_option('--no-clone-bundle',
dest='clone_bundle', action='store_false', dest='no_clone_bundle', action='store_true',
help='disable use of /clone.bundle on HTTP/HTTPS (default if --partial-clone)') help='disable use of /clone.bundle on HTTP/HTTPS')
g.add_option('--no-tags', g.add_option('--no-tags',
dest='tags', default=True, action='store_false', dest='no_tags', action='store_true',
help="don't fetch tags in the manifest") help="don't fetch tags in the manifest")
# Tool # Tool
@ -170,12 +156,11 @@ to update the working directory files.
g.add_option('--repo-url', g.add_option('--repo-url',
dest='repo_url', dest='repo_url',
help='repo repository location', metavar='URL') help='repo repository location', metavar='URL')
g.add_option('--repo-rev', metavar='REV', g.add_option('--repo-branch',
help='repo branch or revision') dest='repo_branch',
g.add_option('--repo-branch', dest='repo_rev', help='repo branch or revision', metavar='REVISION')
help=optparse.SUPPRESS_HELP)
g.add_option('--no-repo-verify', g.add_option('--no-repo-verify',
dest='repo_verify', default=True, action='store_false', dest='no_repo_verify', action='store_true',
help='do not verify repo source code') help='do not verify repo source code')
# Other # Other
@ -198,8 +183,7 @@ to update the working directory files.
sys.exit(1) sys.exit(1)
if not opt.quiet: if not opt.quiet:
print('Downloading manifest from %s' % print('Get %s' % GitConfig.ForUser().UrlInsteadOf(opt.manifest_url),
(GitConfig.ForUser().UrlInsteadOf(opt.manifest_url),),
file=sys.stderr) file=sys.stderr)
# The manifest project object doesn't keep track of the path on the # The manifest project object doesn't keep track of the path on the
@ -216,27 +200,24 @@ to update the working directory files.
m._InitGitDir(mirror_git=mirrored_manifest_git) m._InitGitDir(mirror_git=mirrored_manifest_git)
if opt.manifest_branch:
m.revisionExpr = opt.manifest_branch
else:
m.revisionExpr = 'refs/heads/master'
else:
if opt.manifest_branch:
m.revisionExpr = opt.manifest_branch
else:
m.PreSync()
self._ConfigureDepth(opt) self._ConfigureDepth(opt)
# Set the remote URL before the remote branch as we might need it below.
if opt.manifest_url: if opt.manifest_url:
r = m.GetRemote(m.remote.name) r = m.GetRemote(m.remote.name)
r.url = opt.manifest_url r.url = opt.manifest_url
r.ResetFetch() r.ResetFetch()
r.Save() r.Save()
if opt.manifest_branch:
m.revisionExpr = opt.manifest_branch
else:
if is_new:
default_branch = m.ResolveRemoteHead()
if default_branch is None:
# If the remote doesn't have HEAD configured, default to master.
default_branch = 'refs/heads/master'
m.revisionExpr = default_branch
else:
m.PreSync()
groups = re.split(r'[,\s]+', opt.groups) groups = re.split(r'[,\s]+', opt.groups)
all_platforms = ['linux', 'darwin', 'windows'] all_platforms = ['linux', 'darwin', 'windows']
platformize = lambda x: 'platform-' + x platformize = lambda x: 'platform-' + x
@ -264,20 +245,6 @@ to update the working directory files.
if opt.dissociate: if opt.dissociate:
m.config.SetString('repo.dissociate', 'true') m.config.SetString('repo.dissociate', 'true')
if opt.worktree:
if opt.mirror:
print('fatal: --mirror and --worktree are incompatible',
file=sys.stderr)
sys.exit(1)
if opt.submodules:
print('fatal: --submodules and --worktree are incompatible',
file=sys.stderr)
sys.exit(1)
m.config.SetString('repo.worktree', 'true')
if is_new:
m.use_git_worktrees = True
print('warning: --worktree is experimental!', file=sys.stderr)
if opt.archive: if opt.archive:
if is_new: if is_new:
m.config.SetString('repo.archive', 'true') m.config.SetString('repo.archive', 'true')
@ -309,18 +276,13 @@ to update the working directory files.
else: else:
opt.clone_filter = None opt.clone_filter = None
if opt.clone_bundle is None:
opt.clone_bundle = False if opt.partial_clone else True
else:
m.config.SetString('repo.clonebundle', 'true' if opt.clone_bundle else 'false')
if opt.submodules: if opt.submodules:
m.config.SetString('repo.submodules', 'true') m.config.SetString('repo.submodules', 'true')
if not m.Sync_NetworkHalf(is_new=is_new, quiet=opt.quiet, verbose=opt.verbose, if not m.Sync_NetworkHalf(is_new=is_new, quiet=opt.quiet,
clone_bundle=opt.clone_bundle, clone_bundle=not opt.no_clone_bundle,
current_branch_only=opt.current_branch_only, current_branch_only=opt.current_branch_only,
tags=opt.tags, submodules=opt.submodules, no_tags=opt.no_tags, submodules=opt.submodules,
clone_filter=opt.clone_filter): clone_filter=opt.clone_filter):
r = m.GetRemote(m.remote.name) r = m.GetRemote(m.remote.name)
print('fatal: cannot obtain manifest %s' % r.url, file=sys.stderr) print('fatal: cannot obtain manifest %s' % r.url, file=sys.stderr)
@ -364,7 +326,7 @@ to update the working directory files.
return value return value
return a return a
def _ShouldConfigureUser(self, opt): def _ShouldConfigureUser(self):
gc = self.manifest.globalConfig gc = self.manifest.globalConfig
mp = self.manifest.manifestProject mp = self.manifest.manifestProject
@ -376,23 +338,20 @@ to update the working directory files.
mp.config.SetString('user.name', gc.GetString('user.name')) mp.config.SetString('user.name', gc.GetString('user.name'))
mp.config.SetString('user.email', gc.GetString('user.email')) mp.config.SetString('user.email', gc.GetString('user.email'))
if not opt.quiet:
print() print()
print('Your identity is: %s <%s>' % (mp.config.GetString('user.name'), print('Your identity is: %s <%s>' % (mp.config.GetString('user.name'),
mp.config.GetString('user.email'))) mp.config.GetString('user.email')))
print("If you want to change this, please re-run 'repo init' with --config-name") print('If you want to change this, please re-run \'repo init\' with --config-name')
return False return False
def _ConfigureUser(self, opt): def _ConfigureUser(self):
mp = self.manifest.manifestProject mp = self.manifest.manifestProject
while True: while True:
if not opt.quiet:
print() print()
name = self._Prompt('Your Name', mp.UserName) name = self._Prompt('Your Name', mp.UserName)
email = self._Prompt('Your Email', mp.UserEmail) email = self._Prompt('Your Email', mp.UserEmail)
if not opt.quiet:
print() print()
print('Your identity is: %s <%s>' % (name, email)) print('Your identity is: %s <%s>' % (name, email))
print('is this correct [y/N]? ', end='') print('is this correct [y/N]? ', end='')
@ -465,16 +424,15 @@ to update the working directory files.
# We store the depth in the main manifest project. # We store the depth in the main manifest project.
self.manifest.manifestProject.config.SetString('repo.depth', depth) self.manifest.manifestProject.config.SetString('repo.depth', depth)
def _DisplayResult(self, opt): def _DisplayResult(self):
if self.manifest.IsMirror: if self.manifest.IsMirror:
init_type = 'mirror ' init_type = 'mirror '
else: else:
init_type = '' init_type = ''
if not opt.quiet:
print() print()
print('repo %shas been initialized in %s' % print('repo %shas been initialized in %s'
(init_type, self.manifest.topdir)) % (init_type, self.manifest.topdir))
current_dir = os.getcwd() current_dir = os.getcwd()
if current_dir != self.manifest.topdir: if current_dir != self.manifest.topdir:
@ -492,48 +450,15 @@ to update the working directory files.
if opt.archive and opt.mirror: if opt.archive and opt.mirror:
self.OptionParser.error('--mirror and --archive cannot be used together.') self.OptionParser.error('--mirror and --archive cannot be used together.')
if args:
self.OptionParser.error('init takes no arguments')
def Execute(self, opt, args): def Execute(self, opt, args):
git_require(MIN_GIT_VERSION_HARD, fail=True) git_require(MIN_GIT_VERSION, fail=True)
if not git_require(MIN_GIT_VERSION_SOFT):
print('repo: warning: git-%s+ will soon be required; please upgrade your '
'version of git to maintain support.'
% ('.'.join(str(x) for x in MIN_GIT_VERSION_SOFT),),
file=sys.stderr)
opt.quiet = opt.output_mode is False
opt.verbose = opt.output_mode is True
rp = self.manifest.repoProject
# Handle new --repo-url requests.
if opt.repo_url:
remote = rp.GetRemote('origin')
remote.url = opt.repo_url
remote.Save()
# Handle new --repo-rev requests.
if opt.repo_rev:
wrapper = Wrapper()
remote_ref, rev = wrapper.check_repo_rev(
rp.gitdir, opt.repo_rev, repo_verify=opt.repo_verify, quiet=opt.quiet)
branch = rp.GetBranch('default')
branch.merge = remote_ref
rp.work_git.update_ref('refs/heads/default', rev)
branch.Save()
if opt.worktree:
# Older versions of git supported worktree, but had dangerous gc bugs.
git_require((2, 15, 0), fail=True, msg='git gc worktree corruption')
self._SyncManifest(opt) self._SyncManifest(opt)
self._LinkManifest(opt.manifest_name) self._LinkManifest(opt.manifest_name)
if os.isatty(0) and os.isatty(1) and not self.manifest.IsMirror: if os.isatty(0) and os.isatty(1) and not self.manifest.IsMirror:
if opt.config_name or self._ShouldConfigureUser(opt): if opt.config_name or self._ShouldConfigureUser():
self._ConfigureUser(opt) self._ConfigureUser()
self._ConfigureColor() self._ConfigureColor()
self._DisplayResult(opt) self._DisplayResult()
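
The new `-v`/`-q` handling folds both flags into a single tri-state destination: `None` (default), `True` (`-v`) or `False` (`-q`), which `Execute()` then unpacks into `opt.quiet` and `opt.verbose`. A self-contained sketch of that optparse pattern:

```python
import optparse

p = optparse.OptionParser()
g = p.add_option_group('Logging options')
g.add_option('-v', '--verbose', dest='output_mode', action='store_true',
             help='show all output')
g.add_option('-q', '--quiet', dest='output_mode', action='store_false',
             help='only show errors')

for argv in ([], ['-v'], ['-q']):
    opt, _ = p.parse_args(argv)
    quiet = opt.output_mode is False
    verbose = opt.output_mode is True
    print(argv, 'quiet=%s verbose=%s' % (quiet, verbose))
# []     quiet=False verbose=False
# ['-v'] quiet=False verbose=True
# ['-q'] quiet=True  verbose=False
```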

View File

@ -15,10 +15,10 @@
# limitations under the License. # limitations under the License.
from __future__ import print_function from __future__ import print_function
import sys
from command import Command, MirrorSafeCommand from command import Command, MirrorSafeCommand
class List(Command, MirrorSafeCommand): class List(Command, MirrorSafeCommand):
common = True common = True
helpSummary = "List projects and their associated directories" helpSummary = "List projects and their associated directories"

View File

@ -20,26 +20,19 @@ import sys
from command import PagedCommand from command import PagedCommand
class Manifest(PagedCommand): class Manifest(PagedCommand):
common = False common = False
helpSummary = "Manifest inspection utility" helpSummary = "Manifest inspection utility"
helpUsage = """ helpUsage = """
%prog [-o {-|NAME.xml}] [-m MANIFEST.xml] [-r] %prog [-o {-|NAME.xml} [-r]]
""" """
_helpDescription = """ _helpDescription = """
With the -o option, exports the current manifest for inspection. With the -o option, exports the current manifest for inspection.
The manifest and (if present) local_manifests/ are combined The manifest and (if present) local_manifest.xml are combined
together to produce a single manifest file. This file can be stored together to produce a single manifest file. This file can be stored
in a Git repository for use during future 'repo init' invocations. in a Git repository for use during future 'repo init' invocations.
The -r option can be used to generate a manifest file with project
revisions set to the current commit hash. These are known as
"revision locked manifests", as they don't follow a particular branch.
In this case, the 'upstream' attribute is set to the ref we were on
when the manifest was generated. The 'dest-branch' attribute is set
to indicate the remote ref to push changes to via 'repo upload'.
""" """
@property @property
@ -56,18 +49,11 @@ to indicate the remote ref to push changes to via 'repo upload'.
p.add_option('-r', '--revision-as-HEAD', p.add_option('-r', '--revision-as-HEAD',
dest='peg_rev', action='store_true', dest='peg_rev', action='store_true',
help='Save revisions as current HEAD') help='Save revisions as current HEAD')
p.add_option('-m', '--manifest-name',
help='temporary manifest to use for this sync', metavar='NAME.xml')
p.add_option('--suppress-upstream-revision', dest='peg_rev_upstream', p.add_option('--suppress-upstream-revision', dest='peg_rev_upstream',
default=True, action='store_false', default=True, action='store_false',
help='If in -r mode, do not write the upstream field. ' help='If in -r mode, do not write the upstream field. '
'Only of use if the branch names for a sha1 manifest are ' 'Only of use if the branch names for a sha1 manifest are '
'sensitive.') 'sensitive.')
p.add_option('--suppress-dest-branch', dest='peg_rev_dest_branch',
default=True, action='store_false',
help='If in -r mode, do not write the dest-branch field. '
'Only of use if the branch names for a sha1 manifest are '
'sensitive.')
p.add_option('-o', '--output-file', p.add_option('-o', '--output-file',
dest='output_file', dest='output_file',
default='-', default='-',
@ -75,18 +61,13 @@ to indicate the remote ref to push changes to via 'repo upload'.
metavar='-|NAME.xml') metavar='-|NAME.xml')
def _Output(self, opt): def _Output(self, opt):
# If alternate manifest is specified, override the manifest file that we're using.
if opt.manifest_name:
self.manifest.Override(opt.manifest_name, False)
if opt.output_file == '-': if opt.output_file == '-':
fd = sys.stdout fd = sys.stdout
else: else:
fd = open(opt.output_file, 'w') fd = open(opt.output_file, 'w')
self.manifest.Save(fd, self.manifest.Save(fd,
peg_rev = opt.peg_rev, peg_rev = opt.peg_rev,
peg_rev_upstream=opt.peg_rev_upstream, peg_rev_upstream = opt.peg_rev_upstream)
peg_rev_dest_branch=opt.peg_rev_dest_branch)
fd.close() fd.close()
if opt.output_file != '-': if opt.output_file != '-':
print('Saved manifest to %s' % opt.output_file, file=sys.stderr) print('Saved manifest to %s' % opt.output_file, file=sys.stderr)

View File

@ -18,7 +18,6 @@ from __future__ import print_function
from color import Coloring from color import Coloring
from command import PagedCommand from command import PagedCommand
class Prune(PagedCommand): class Prune(PagedCommand):
common = True common = True
helpSummary = "Prune (delete) already merged topics" helpSummary = "Prune (delete) already merged topics"

View File

@ -53,7 +53,7 @@ branch but need to incorporate new upstream changes "underneath" them.
dest='force_rebase', action='store_true', dest='force_rebase', action='store_true',
help='Pass --force-rebase to git rebase') help='Pass --force-rebase to git rebase')
p.add_option('--no-ff', p.add_option('--no-ff',
dest='ff', default=True, action='store_false', dest='no_ff', action='store_true',
help='Pass --no-ff to git rebase') help='Pass --no-ff to git rebase')
p.add_option('-q', '--quiet', p.add_option('-q', '--quiet',
dest='quiet', action='store_true', dest='quiet', action='store_true',
@ -93,7 +93,7 @@ branch but need to incorporate new upstream changes "underneath" them.
common_args.append('--quiet') common_args.append('--quiet')
if opt.force_rebase: if opt.force_rebase:
common_args.append('--force-rebase') common_args.append('--force-rebase')
if not opt.ff: if opt.no_ff:
common_args.append('--no-ff') common_args.append('--no-ff')
if opt.autosquash: if opt.autosquash:
common_args.append('--autosquash') common_args.append('--autosquash')

View File

@ -22,7 +22,6 @@ from command import Command, MirrorSafeCommand
from subcmds.sync import _PostRepoUpgrade from subcmds.sync import _PostRepoUpgrade
from subcmds.sync import _PostRepoFetch from subcmds.sync import _PostRepoFetch
class Selfupdate(Command, MirrorSafeCommand): class Selfupdate(Command, MirrorSafeCommand):
common = False common = False
helpSummary = "Update repo to the latest version" helpSummary = "Update repo to the latest version"
@ -40,7 +39,7 @@ need to be performed by an end-user.
def _Options(self, p): def _Options(self, p):
g = p.add_option_group('repo Version options') g = p.add_option_group('repo Version options')
g.add_option('--no-repo-verify', g.add_option('--no-repo-verify',
dest='repo_verify', default=True, action='store_false', dest='no_repo_verify', action='store_true',
help='do not verify repo source code') help='do not verify repo source code')
g.add_option('--repo-upgraded', g.add_option('--repo-upgraded',
dest='repo_upgraded', action='store_true', dest='repo_upgraded', action='store_true',
@ -60,5 +59,5 @@ need to be performed by an end-user.
rp.bare_git.gc('--auto') rp.bare_git.gc('--auto')
_PostRepoFetch(rp, _PostRepoFetch(rp,
repo_verify=opt.repo_verify, no_repo_verify = opt.no_repo_verify,
verbose = True) verbose = True)

View File

@ -16,7 +16,6 @@
from subcmds.sync import Sync from subcmds.sync import Sync
class Smartsync(Sync): class Smartsync(Sync):
common = True common = True
helpSummary = "Update working tree to the latest known good revision" helpSummary = "Update working tree to the latest known good revision"


@ -21,7 +21,6 @@ from color import Coloring
from command import InteractiveCommand from command import InteractiveCommand
from git_command import GitCommand from git_command import GitCommand
class _ProjectList(Coloring): class _ProjectList(Coloring):
def __init__(self, gc): def __init__(self, gc):
Coloring.__init__(self, gc, 'interactive') Coloring.__init__(self, gc, 'interactive')
@ -29,7 +28,6 @@ class _ProjectList(Coloring):
self.header = self.printer('header', attr='bold') self.header = self.printer('header', attr='bold')
self.help = self.printer('help', fg='red', attr='bold') self.help = self.printer('help', fg='red', attr='bold')
class Stage(InteractiveCommand): class Stage(InteractiveCommand):
common = True common = True
helpSummary = "Stage file(s) for commit" helpSummary = "Stage file(s) for commit"
@ -107,7 +105,6 @@ The '%prog' command stages files to prepare the next commit.
continue continue
print('Bye.') print('Bye.')
def _AddI(project): def _AddI(project):
p = GitCommand(project, ['add', '--interactive'], bare=False) p = GitCommand(project, ['add', '--interactive'], bare=False)
p.Wait() p.Wait()


@ -25,7 +25,6 @@ import gitc_utils
from progress import Progress from progress import Progress
from project import SyncBuffer from project import SyncBuffer
class Start(Command): class Start(Command):
common = True common = True
helpSummary = "Start a new branch for development" helpSummary = "Start a new branch for development"
@ -61,7 +60,7 @@ revision specified in the manifest.
if not opt.all: if not opt.all:
projects = args[1:] projects = args[1:]
if len(projects) < 1: if len(projects) < 1:
projects = ['.'] # start it in the local project by default projects = ['.',] # start it in the local project by default
all_projects = self.GetProjects(projects, all_projects = self.GetProjects(projects,
missing_ok=bool(self.gitc_manifest)) missing_ok=bool(self.gitc_manifest))


@ -16,17 +16,21 @@
from __future__ import print_function from __future__ import print_function
import functools
import glob
import multiprocessing
import os
from command import PagedCommand from command import PagedCommand
try:
import threading as _threading
except ImportError:
import dummy_threading as _threading
import glob
import itertools
import os
from color import Coloring from color import Coloring
import platform_utils import platform_utils
class Status(PagedCommand): class Status(PagedCommand):
common = True common = True
helpSummary = "Show the working tree status" helpSummary = "Show the working tree status"
@ -91,20 +95,25 @@ the following meanings:
p.add_option('-q', '--quiet', action='store_true', p.add_option('-q', '--quiet', action='store_true',
help="only print the name of modified projects") help="only print the name of modified projects")
def _StatusHelper(self, quiet, project): def _StatusHelper(self, project, clean_counter, sem, quiet):
"""Obtains the status for a specific project. """Obtains the status for a specific project.
Obtains the status for a project, redirecting the output to Obtains the status for a project, redirecting the output to
the specified object. the specified object. It will release the semaphore
when done.
Args: Args:
quiet: Where to output the status.
project: Project to get status of. project: Project to get status of.
clean_counter: Counter for clean projects.
Returns: sem: Semaphore, will call release() when complete.
The status of the project. output: Where to output the status.
""" """
return project.PrintWorkTreeStatus(quiet=quiet) try:
state = project.PrintWorkTreeStatus(quiet=quiet)
if state == 'CLEAN':
next(clean_counter)
finally:
sem.release()
def _FindOrphans(self, dirs, proj_dirs, proj_dirs_parents, outstring): def _FindOrphans(self, dirs, proj_dirs, proj_dirs_parents, outstring):
"""find 'dirs' that are present in 'proj_dirs_parents' but not in 'proj_dirs'""" """find 'dirs' that are present in 'proj_dirs_parents' but not in 'proj_dirs'"""
@ -124,18 +133,27 @@ the following meanings:
def Execute(self, opt, args): def Execute(self, opt, args):
all_projects = self.GetProjects(args) all_projects = self.GetProjects(args)
counter = 0 counter = itertools.count()
if opt.jobs == 1: if opt.jobs == 1:
for project in all_projects: for project in all_projects:
state = project.PrintWorkTreeStatus(quiet=opt.quiet) state = project.PrintWorkTreeStatus(quiet=opt.quiet)
if state == 'CLEAN': if state == 'CLEAN':
counter += 1 next(counter)
else: else:
with multiprocessing.Pool(opt.jobs) as pool: sem = _threading.Semaphore(opt.jobs)
states = pool.map(functools.partial(self._StatusHelper, opt.quiet), all_projects) threads = []
counter += states.count('CLEAN') for project in all_projects:
if not opt.quiet and len(all_projects) == counter: sem.acquire()
t = _threading.Thread(target=self._StatusHelper,
args=(project, counter, sem, opt.quiet))
threads.append(t)
t.daemon = True
t.start()
for t in threads:
t.join()
if not opt.quiet and len(all_projects) == next(counter):
print('nothing to commit (working directory clean)') print('nothing to commit (working directory clean)')
if opt.orphans: if opt.orphans:

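The status hunk above drops the semaphore/thread bookkeeping in favour of multiprocessing.Pool plus functools.partial, counting 'CLEAN' results from the list the pool returns. A cut-down sketch of that map-and-count shape, with a stand-in print_status() in place of Project.PrintWorkTreeStatus:

    import functools
    import multiprocessing

    def print_status(quiet, project):
        """Stand-in worker: report one project's state ('CLEAN' or 'DIRTY')."""
        return 'CLEAN' if project % 2 == 0 else 'DIRTY'

    def count_clean(projects, jobs=4, quiet=False):
        if jobs == 1:
            states = [print_status(quiet, p) for p in projects]
        else:
            with multiprocessing.Pool(jobs) as pool:
                states = pool.map(functools.partial(print_status, quiet), projects)
        return states.count('CLEAN')

    if __name__ == '__main__':
        print(count_clean(range(10)))  # 5 of the 10 fake projects are 'CLEAN'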

@ -15,7 +15,6 @@
# limitations under the License. # limitations under the License.
from __future__ import print_function from __future__ import print_function
import json import json
import netrc import netrc
from optparse import SUPPRESS_HELP from optparse import SUPPRESS_HELP
@ -54,7 +53,6 @@ except ImportError:
try: try:
import resource import resource
def _rlimit_nofile(): def _rlimit_nofile():
return resource.getrlimit(resource.RLIMIT_NOFILE) return resource.getrlimit(resource.RLIMIT_NOFILE)
except ImportError: except ImportError:
@ -83,16 +81,13 @@ from manifest_xml import GitcManifest
_ONE_DAY_S = 24 * 60 * 60 _ONE_DAY_S = 24 * 60 * 60
class _FetchError(Exception): class _FetchError(Exception):
"""Internal error thrown in _FetchHelper() when we don't want stack trace.""" """Internal error thrown in _FetchHelper() when we don't want stack trace."""
pass pass
class _CheckoutError(Exception): class _CheckoutError(Exception):
"""Internal error thrown in _CheckoutOne() when we don't want stack trace.""" """Internal error thrown in _CheckoutOne() when we don't want stack trace."""
class Sync(Command, MirrorSafeCommand): class Sync(Command, MirrorSafeCommand):
jobs = 1 jobs = 1
common = True common = True
@ -138,11 +133,11 @@ if the manifest server specified in the manifest file already includes
credentials. credentials.
By default, all projects will be synced. The --fail-fast option can be used By default, all projects will be synced. The --fail-fast option can be used
to halt syncing as soon as possible when the first project fails to sync. to halt syncing as soon as possible when the the first project fails to sync.
The --force-sync option can be used to overwrite existing git The --force-sync option can be used to overwrite existing git
directories if they have previously been linked to a different directories if they have previously been linked to a different
object directory. WARNING: This may cause data to be lost since object direcotry. WARNING: This may cause data to be lost since
refs may be removed when overwriting. refs may be removed when overwriting.
The --force-remove-dirty option can be used to remove previously used The --force-remove-dirty option can be used to remove previously used
@ -235,21 +230,17 @@ later is required to fix a server side protocol bug.
p.add_option('-c', '--current-branch', p.add_option('-c', '--current-branch',
dest='current_branch_only', action='store_true', dest='current_branch_only', action='store_true',
help='fetch only current branch from server') help='fetch only current branch from server')
p.add_option('-v', '--verbose',
dest='output_mode', action='store_true',
help='show all sync output')
p.add_option('-q', '--quiet', p.add_option('-q', '--quiet',
dest='output_mode', action='store_false', dest='quiet', action='store_true',
help='only show errors') help='be more quiet')
p.add_option('-j', '--jobs', p.add_option('-j', '--jobs',
dest='jobs', action='store', type='int', dest='jobs', action='store', type='int',
help="projects to fetch simultaneously (default %d)" % self.jobs) help="projects to fetch simultaneously (default %d)" % self.jobs)
p.add_option('-m', '--manifest-name', p.add_option('-m', '--manifest-name',
dest='manifest_name', dest='manifest_name',
help='temporary manifest to use for this sync', metavar='NAME.xml') help='temporary manifest to use for this sync', metavar='NAME.xml')
p.add_option('--clone-bundle', action='store_true', p.add_option('--no-clone-bundle',
help='enable use of /clone.bundle on HTTP/HTTPS') dest='no_clone_bundle', action='store_true',
p.add_option('--no-clone-bundle', dest='clone_bundle', action='store_false',
help='disable use of /clone.bundle on HTTP/HTTPS') help='disable use of /clone.bundle on HTTP/HTTPS')
p.add_option('-u', '--manifest-server-username', action='store', p.add_option('-u', '--manifest-server-username', action='store',
dest='manifest_server_username', dest='manifest_server_username',
@ -261,14 +252,11 @@ later is required to fix a server side protocol bug.
dest='fetch_submodules', action='store_true', dest='fetch_submodules', action='store_true',
help='fetch submodules from server') help='fetch submodules from server')
p.add_option('--no-tags', p.add_option('--no-tags',
dest='tags', default=True, action='store_false', dest='no_tags', action='store_true',
help="don't fetch tags") help="don't fetch tags")
p.add_option('--optimized-fetch', p.add_option('--optimized-fetch',
dest='optimized_fetch', action='store_true', dest='optimized_fetch', action='store_true',
help='only fetch projects fixed to sha1 if revision does not exist locally') help='only fetch projects fixed to sha1 if revision does not exist locally')
p.add_option('--retry-fetches',
default=0, action='store', type='int',
help='number of times to retry fetches on transient errors')
p.add_option('--prune', dest='prune', action='store_true', p.add_option('--prune', dest='prune', action='store_true',
help='delete refs that no longer exist on the remote') help='delete refs that no longer exist on the remote')
if show_smart: if show_smart:
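
In the options hunk above, -v/--verbose and -q/--quiet write into a single output_mode dest (True, False, or the None default), and a later hunk in this file derives opt.quiet / opt.verbose from that tri-state; --clone-bundle / --no-clone-bundle share dest='clone_bundle' the same way. A throwaway optparse sketch of the idea:

    import optparse

    p = optparse.OptionParser()
    # One tri-state dest: None (default), True (-v), or False (-q).
    p.add_option('-v', '--verbose', dest='output_mode', action='store_true',
                 help='show all sync output')
    p.add_option('-q', '--quiet', dest='output_mode', action='store_false',
                 help='only show errors')

    for argv in ([], ['-v'], ['-q']):
        opt, _ = p.parse_args(argv)
        quiet = opt.output_mode is False
        verbose = opt.output_mode is True
        print(argv, 'quiet=%s verbose=%s' % (quiet, verbose))
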
@ -281,7 +269,7 @@ later is required to fix a server side protocol bug.
g = p.add_option_group('repo Version options') g = p.add_option_group('repo Version options')
g.add_option('--no-repo-verify', g.add_option('--no-repo-verify',
dest='repo_verify', default=True, action='store_false', dest='no_repo_verify', action='store_true',
help='do not verify repo source code') help='do not verify repo source code')
g.add_option('--repo-upgraded', g.add_option('--repo-upgraded',
dest='repo_upgraded', action='store_true', dest='repo_upgraded', action='store_true',
@ -340,13 +328,11 @@ later is required to fix a server side protocol bug.
try: try:
success = project.Sync_NetworkHalf( success = project.Sync_NetworkHalf(
quiet=opt.quiet, quiet=opt.quiet,
verbose=opt.verbose,
current_branch_only=opt.current_branch_only, current_branch_only=opt.current_branch_only,
force_sync=opt.force_sync, force_sync=opt.force_sync,
clone_bundle=opt.clone_bundle, clone_bundle=not opt.no_clone_bundle,
tags=opt.tags, archive=self.manifest.IsArchive, no_tags=opt.no_tags, archive=self.manifest.IsArchive,
optimized_fetch=opt.optimized_fetch, optimized_fetch=opt.optimized_fetch,
retry_fetches=opt.retry_fetches,
prune=opt.prune, prune=opt.prune,
clone_filter=clone_filter) clone_filter=clone_filter)
self._fetch_times.Set(project, time.time() - start) self._fetch_times.Set(project, time.time() - start)
@ -369,7 +355,7 @@ later is required to fix a server side protocol bug.
except _FetchError: except _FetchError:
pass pass
except Exception as e: except Exception as e:
print('error: Cannot fetch %s (%s: %s)' print('error: Cannot fetch %s (%s: %s)' \
% (project.name, type(e).__name__, str(e)), file=sys.stderr) % (project.name, type(e).__name__, str(e)), file=sys.stderr)
err_event.set() err_event.set()
raise raise
@ -574,23 +560,13 @@ later is required to fix a server side protocol bug.
def _GCProjects(self, projects, opt, err_event): def _GCProjects(self, projects, opt, err_event):
gc_gitdirs = {} gc_gitdirs = {}
for project in projects: for project in projects:
# Make sure pruning never kicks in with shared projects. if len(project.manifest.GetProjectsWithName(project.name)) > 1:
if (not project.use_git_worktrees and print('Shared project %s found, disabling pruning.' % project.name)
len(project.manifest.GetProjectsWithName(project.name)) > 1): project.bare_git.config('--replace-all', 'gc.pruneExpire', 'never')
print('%s: Shared project %s found, disabling pruning.' %
(project.relpath, project.name))
if git_require((2, 7, 0)):
project.EnableRepositoryExtension('preciousObjects')
else:
# This isn't perfect, but it's the best we can do with old git.
print('%s: WARNING: shared projects are unreliable when using old '
'versions of git; please upgrade to git-2.7.0+.'
% (project.relpath,),
file=sys.stderr)
project.config.SetString('gc.pruneExpire', 'never')
gc_gitdirs[project.gitdir] = project.bare_git gc_gitdirs[project.gitdir] = project.bare_git
if multiprocessing: has_dash_c = git_require((1, 7, 2))
if multiprocessing and has_dash_c:
cpu_count = multiprocessing.cpu_count() cpu_count = multiprocessing.cpu_count()
else: else:
cpu_count = 1 cpu_count = 1
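
The _GCProjects hunk above goes beyond gc.pruneExpire for shared projects: on git 2.7.0+ it enables the preciousObjects repository extension so git refuses to prune or repack away shared objects, and only falls back to the config knob on older git. A hedged sketch of the equivalent plain-git configuration, with git_version and gitdir supplied by the caller (both placeholders):

    import subprocess

    def protect_shared_objects(gitdir, git_version):
        """Keep git from pruning objects shared between multiple checkouts."""
        def config(key, value):
            subprocess.check_call(
                ['git', '--git-dir', gitdir, 'config', key, value])

        if git_version >= (2, 7, 0):
            # preciousObjects forbids prune/repack -d entirely; extensions
            # require repository format version 1, so set both keys.
            config('core.repositoryFormatVersion', '1')
            config('extensions.preciousObjects', 'true')
        else:
            # Best effort on old git: never expire unreachable objects.
            config('gc.pruneExpire', 'never')
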
@ -612,7 +588,7 @@ later is required to fix a server side protocol bug.
bare_git.gc('--auto', config=config) bare_git.gc('--auto', config=config)
except GitError: except GitError:
err_event.set() err_event.set()
except Exception: except:
err_event.set() err_event.set()
raise raise
finally: finally:
@ -637,6 +613,65 @@ later is required to fix a server side protocol bug.
else: else:
self.manifest._Unload() self.manifest._Unload()
def _DeleteProject(self, path):
print('Deleting obsolete path %s' % path, file=sys.stderr)
# Delete the .git directory first, so we're less likely to have a partially
# working git repository around. There shouldn't be any git projects here,
# so rmtree works.
try:
platform_utils.rmtree(os.path.join(path, '.git'))
except OSError as e:
print('Failed to remove %s (%s)' % (os.path.join(path, '.git'), str(e)), file=sys.stderr)
print('error: Failed to delete obsolete path %s' % path, file=sys.stderr)
print(' remove manually, then run sync again', file=sys.stderr)
return 1
# Delete everything under the worktree, except for directories that contain
# another git project
dirs_to_remove = []
failed = False
for root, dirs, files in platform_utils.walk(path):
for f in files:
try:
platform_utils.remove(os.path.join(root, f))
except OSError as e:
print('Failed to remove %s (%s)' % (os.path.join(root, f), str(e)), file=sys.stderr)
failed = True
dirs[:] = [d for d in dirs
if not os.path.lexists(os.path.join(root, d, '.git'))]
dirs_to_remove += [os.path.join(root, d) for d in dirs
if os.path.join(root, d) not in dirs_to_remove]
for d in reversed(dirs_to_remove):
if platform_utils.islink(d):
try:
platform_utils.remove(d)
except OSError as e:
print('Failed to remove %s (%s)' % (os.path.join(root, d), str(e)), file=sys.stderr)
failed = True
elif len(platform_utils.listdir(d)) == 0:
try:
platform_utils.rmdir(d)
except OSError as e:
print('Failed to remove %s (%s)' % (os.path.join(root, d), str(e)), file=sys.stderr)
failed = True
continue
if failed:
print('error: Failed to delete obsolete path %s' % path, file=sys.stderr)
print(' remove manually, then run sync again', file=sys.stderr)
return 1
# Try deleting parent dirs if they are empty
project_dir = path
while project_dir != self.manifest.topdir:
if len(platform_utils.listdir(project_dir)) == 0:
platform_utils.rmdir(project_dir)
else:
break
project_dir = os.path.dirname(project_dir)
return 0
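
_DeleteProject on the right-hand (v2.0) side above walks the obsolete worktree, deleting files while pruning any nested git checkout out of the walk via dirs[:] so other projects survive. The core of that filtering, stripped of the per-file error reporting:

    import os
    import shutil

    def remove_worktree(path):
        """Delete files under path, but leave nested git checkouts alone."""
        shutil.rmtree(os.path.join(path, '.git'), ignore_errors=True)
        for root, dirs, files in os.walk(path):
            for name in files:
                os.remove(os.path.join(root, name))
            # Drop subdirectories that are themselves git projects from the walk.
            dirs[:] = [d for d in dirs
                       if not os.path.lexists(os.path.join(root, d, '.git'))]
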
def UpdateProjectList(self, opt): def UpdateProjectList(self, opt):
new_project_paths = [] new_project_paths = []
for project in self.GetProjects(None, missing_ok=True): for project in self.GetProjects(None, missing_ok=True):
@ -663,15 +698,23 @@ later is required to fix a server side protocol bug.
remote = RemoteSpec('origin'), remote = RemoteSpec('origin'),
gitdir = gitdir, gitdir = gitdir,
objdir = gitdir, objdir = gitdir,
use_git_worktrees=os.path.isfile(gitdir),
worktree = os.path.join(self.manifest.topdir, path), worktree = os.path.join(self.manifest.topdir, path),
relpath = path, relpath = path,
revisionExpr = 'HEAD', revisionExpr = 'HEAD',
revisionId = None, revisionId = None,
groups = None) groups = None)
if not project.DeleteWorktree(
quiet=opt.quiet, if project.IsDirty() and opt.force_remove_dirty:
force=opt.force_remove_dirty): print('WARNING: Removing dirty project "%s": uncommitted changes '
'erased' % project.relpath, file=sys.stderr)
self._DeleteProject(project.worktree)
elif project.IsDirty():
print('error: Cannot remove project "%s": uncommitted changes '
'are present' % project.relpath, file=sys.stderr)
print(' commit changes, then run sync again',
file=sys.stderr)
return 1
elif self._DeleteProject(project.worktree):
return 1 return 1
new_project_paths.sort() new_project_paths.sort()
@ -690,7 +733,7 @@ later is required to fix a server side protocol bug.
if not opt.quiet: if not opt.quiet:
print('Using manifest server %s' % manifest_server) print('Using manifest server %s' % manifest_server)
if '@' not in manifest_server: if not '@' in manifest_server:
username = None username = None
password = None password = None
if opt.manifest_server_username and opt.manifest_server_password: if opt.manifest_server_username and opt.manifest_server_password:
@ -733,13 +776,13 @@ later is required to fix a server side protocol bug.
if branch.startswith(R_HEADS): if branch.startswith(R_HEADS):
branch = branch[len(R_HEADS):] branch = branch[len(R_HEADS):]
if 'SYNC_TARGET' in os.environ: env = os.environ.copy()
target = os.environ['SYNC_TARGET'] if 'SYNC_TARGET' in env:
target = env['SYNC_TARGET']
[success, manifest_str] = server.GetApprovedManifest(branch, target) [success, manifest_str] = server.GetApprovedManifest(branch, target)
elif ('TARGET_PRODUCT' in os.environ and elif 'TARGET_PRODUCT' in env and 'TARGET_BUILD_VARIANT' in env:
'TARGET_BUILD_VARIANT' in os.environ): target = '%s-%s' % (env['TARGET_PRODUCT'],
target = '%s-%s' % (os.environ['TARGET_PRODUCT'], env['TARGET_BUILD_VARIANT'])
os.environ['TARGET_BUILD_VARIANT'])
[success, manifest_str] = server.GetApprovedManifest(branch, target) [success, manifest_str] = server.GetApprovedManifest(branch, target)
else: else:
[success, manifest_str] = server.GetApprovedManifest(branch) [success, manifest_str] = server.GetApprovedManifest(branch)
@ -778,11 +821,10 @@ later is required to fix a server side protocol bug.
"""Fetch & update the local manifest project.""" """Fetch & update the local manifest project."""
if not opt.local_only: if not opt.local_only:
start = time.time() start = time.time()
success = mp.Sync_NetworkHalf(quiet=opt.quiet, verbose=opt.verbose, success = mp.Sync_NetworkHalf(quiet=opt.quiet,
current_branch_only=opt.current_branch_only, current_branch_only=opt.current_branch_only,
tags=opt.tags, no_tags=opt.no_tags,
optimized_fetch=opt.optimized_fetch, optimized_fetch=opt.optimized_fetch,
retry_fetches=opt.retry_fetches,
submodules=self.manifest.HasSubmodules, submodules=self.manifest.HasSubmodules,
clone_filter=self.manifest.CloneFilter) clone_filter=self.manifest.CloneFilter)
finish = time.time() finish = time.time()
@ -827,9 +869,6 @@ later is required to fix a server side protocol bug.
soft_limit, _ = _rlimit_nofile() soft_limit, _ = _rlimit_nofile()
self.jobs = min(self.jobs, (soft_limit - 5) // 3) self.jobs = min(self.jobs, (soft_limit - 5) // 3)
opt.quiet = opt.output_mode is False
opt.verbose = opt.output_mode is True
if opt.manifest_name: if opt.manifest_name:
self.manifest.Override(opt.manifest_name) self.manifest.Override(opt.manifest_name)
@ -837,9 +876,6 @@ later is required to fix a server side protocol bug.
smart_sync_manifest_path = os.path.join( smart_sync_manifest_path = os.path.join(
self.manifest.manifestProject.worktree, 'smart_sync_override.xml') self.manifest.manifestProject.worktree, 'smart_sync_override.xml')
if opt.clone_bundle is None:
opt.clone_bundle = self.manifest.CloneBundle
if opt.smart_sync or opt.smart_tag: if opt.smart_sync or opt.smart_tag:
manifest_name = self._SmartSyncSetup(opt, smart_sync_manifest_path) manifest_name = self._SmartSyncSetup(opt, smart_sync_manifest_path)
else: else:
@ -854,13 +890,6 @@ later is required to fix a server side protocol bug.
rp = self.manifest.repoProject rp = self.manifest.repoProject
rp.PreSync() rp.PreSync()
cb = rp.CurrentBranch
if cb:
base = rp.GetBranch(cb).merge
if not base or not base.startswith('refs/heads/'):
print('warning: repo is not tracking a remote branch, so it will not '
'receive updates; run `repo init --repo-rev=stable` to fix.',
file=sys.stderr)
mp = self.manifest.manifestProject mp = self.manifest.manifestProject
mp.PreSync() mp.PreSync()
@ -927,7 +956,7 @@ later is required to fix a server side protocol bug.
fetched = self._Fetch(to_fetch, opt, err_event) fetched = self._Fetch(to_fetch, opt, err_event)
_PostRepoFetch(rp, opt.repo_verify) _PostRepoFetch(rp, opt.no_repo_verify)
if opt.network_only: if opt.network_only:
# bail out now; the rest touches the working tree # bail out now; the rest touches the working tree
if err_event.isSet(): if err_event.isSet():
@ -1004,10 +1033,6 @@ later is required to fix a server side protocol bug.
file=sys.stderr) file=sys.stderr)
sys.exit(1) sys.exit(1)
if not opt.quiet:
print('repo sync has finished successfully.')
def _PostRepoUpgrade(manifest, quiet=False): def _PostRepoUpgrade(manifest, quiet=False):
wrapper = Wrapper() wrapper = Wrapper()
if wrapper.NeedSetupGnuPG(): if wrapper.NeedSetupGnuPG():
@ -1016,12 +1041,11 @@ def _PostRepoUpgrade(manifest, quiet=False):
if project.Exists: if project.Exists:
project.PostRepoUpgrade() project.PostRepoUpgrade()
def _PostRepoFetch(rp, no_repo_verify=False, verbose=False):
def _PostRepoFetch(rp, repo_verify=True, verbose=False):
if rp.HasChanges: if rp.HasChanges:
print('info: A new version of repo is available', file=sys.stderr) print('info: A new version of repo is available', file=sys.stderr)
print(file=sys.stderr) print(file=sys.stderr)
if not repo_verify or _VerifyTag(rp): if no_repo_verify or _VerifyTag(rp):
syncbuf = SyncBuffer(rp.config) syncbuf = SyncBuffer(rp.config)
rp.Sync_LocalHalf(syncbuf) rp.Sync_LocalHalf(syncbuf)
if not syncbuf.Finish(): if not syncbuf.Finish():
@ -1035,7 +1059,6 @@ def _PostRepoFetch(rp, repo_verify=True, verbose=False):
print('repo version %s is current' % rp.work_git.describe(HEAD), print('repo version %s is current' % rp.work_git.describe(HEAD),
file=sys.stderr) file=sys.stderr)
def _VerifyTag(project): def _VerifyTag(project):
gpg_dir = os.path.expanduser('~/.repoconfig/gnupg') gpg_dir = os.path.expanduser('~/.repoconfig/gnupg')
if not os.path.exists(gpg_dir): if not os.path.exists(gpg_dir):
@ -1061,8 +1084,8 @@ def _VerifyTag(project):
return False return False
env = os.environ.copy() env = os.environ.copy()
env['GIT_DIR'] = project.gitdir env['GIT_DIR'] = project.gitdir.encode()
env['GNUPGHOME'] = gpg_dir env['GNUPGHOME'] = gpg_dir.encode()
cmd = [GIT, 'tag', '-v', cur] cmd = [GIT, 'tag', '-v', cur]
proc = subprocess.Popen(cmd, proc = subprocess.Popen(cmd,
@ -1140,8 +1163,6 @@ class _FetchTimes(object):
# and supporting persistent-http[s]. It cannot change hosts from # and supporting persistent-http[s]. It cannot change hosts from
# request to request like the normal transport, the real url # request to request like the normal transport, the real url
# is passed during initialization. # is passed during initialization.
class PersistentTransport(xmlrpc.client.Transport): class PersistentTransport(xmlrpc.client.Transport):
def __init__(self, orig_host): def __init__(self, orig_host):
self.orig_host = orig_host self.orig_host = orig_host
@ -1152,7 +1173,7 @@ class PersistentTransport(xmlrpc.client.Transport):
# Since we're only using them for HTTP, copy the file temporarily, # Since we're only using them for HTTP, copy the file temporarily,
# stripping those prefixes away. # stripping those prefixes away.
if cookiefile: if cookiefile:
tmpcookiefile = tempfile.NamedTemporaryFile(mode='w') tmpcookiefile = tempfile.NamedTemporaryFile()
tmpcookiefile.write("# HTTP Cookie File") tmpcookiefile.write("# HTTP Cookie File")
try: try:
with open(cookiefile) as f: with open(cookiefile) as f:
@ -1233,3 +1254,4 @@ class PersistentTransport(xmlrpc.client.Transport):
def close(self): def close(self):
pass pass


@ -23,18 +23,16 @@ from command import InteractiveCommand
from editor import Editor from editor import Editor
from error import HookError, UploadError from error import HookError, UploadError
from git_command import GitCommand from git_command import GitCommand
from git_refs import R_HEADS from project import RepoHook
from hooks import RepoHook
from pyversion import is_python3 from pyversion import is_python3
if not is_python3(): if not is_python3():
input = raw_input # noqa: F821 input = raw_input
else: else:
unicode = str unicode = str
UNUSUAL_COMMIT_THRESHOLD = 5 UNUSUAL_COMMIT_THRESHOLD = 5
def _ConfirmManyUploads(multiple_branches=False): def _ConfirmManyUploads(multiple_branches=False):
if multiple_branches: if multiple_branches:
print('ATTENTION: One or more branches has an unusually high number ' print('ATTENTION: One or more branches has an unusually high number '
@ -46,20 +44,17 @@ def _ConfirmManyUploads(multiple_branches=False):
answer = input("If you are sure you intend to do this, type 'yes': ").strip() answer = input("If you are sure you intend to do this, type 'yes': ").strip()
return answer == "yes" return answer == "yes"
def _die(fmt, *args): def _die(fmt, *args):
msg = fmt % args msg = fmt % args
print('error: %s' % msg, file=sys.stderr) print('error: %s' % msg, file=sys.stderr)
sys.exit(1) sys.exit(1)
def _SplitEmails(values): def _SplitEmails(values):
result = [] result = []
for value in values: for value in values:
result.extend([s.strip() for s in value.split(',')]) result.extend([s.strip() for s in value.split(',')])
return result return result
class Upload(InteractiveCommand): class Upload(InteractiveCommand):
common = True common = True
helpSummary = "Upload changes for code review" helpSummary = "Upload changes for code review"
@ -131,23 +126,6 @@ is set to "true" then repo will assume you always want the equivalent
of the -t option to the repo command. If unset or set to "false" then of the -t option to the repo command. If unset or set to "false" then
repo will make use of only the command line option. repo will make use of only the command line option.
review.URL.uploadhashtags:
To add hashtags whenever uploading a commit, you can set a per-project
or global Git option to do so. The value of review.URL.uploadhashtags
will be used as comma delimited hashtags like the --hashtag option.
review.URL.uploadlabels:
To add labels whenever uploading a commit, you can set a per-project
or global Git option to do so. The value of review.URL.uploadlabels
will be used as comma delimited labels like the --label option.
review.URL.uploadnotify:
Control e-mail notifications when uploading.
https://gerrit-review.googlesource.com/Documentation/user-upload.html#notify
# References # References
Gerrit Code Review: https://www.gerritcodereview.com/ Gerrit Code Review: https://www.gerritcodereview.com/
@ -158,15 +136,6 @@ Gerrit Code Review: https://www.gerritcodereview.com/
p.add_option('-t', p.add_option('-t',
dest='auto_topic', action='store_true', dest='auto_topic', action='store_true',
help='Send local branch name to Gerrit Code Review') help='Send local branch name to Gerrit Code Review')
p.add_option('--hashtag', '--ht',
dest='hashtags', action='append', default=[],
help='Add hashtags (comma delimited) to the review.')
p.add_option('--hashtag-branch', '--htb',
action='store_true',
help='Add local branch name as a hashtag.')
p.add_option('-l', '--label',
dest='labels', action='append', default=[],
help='Add a label when uploading.')
p.add_option('--re', '--reviewers', p.add_option('--re', '--reviewers',
type='string', action='append', dest='reviewers', type='string', action='append', dest='reviewers',
help='Request reviews from these people.') help='Request reviews from these people.')
@ -179,6 +148,9 @@ Gerrit Code Review: https://www.gerritcodereview.com/
p.add_option('--cbr', '--current-branch', p.add_option('--cbr', '--current-branch',
dest='current_branch', action='store_true', dest='current_branch', action='store_true',
help='Upload current git branch.') help='Upload current git branch.')
p.add_option('-d', '--draft',
action='store_true', dest='draft', default=False,
help='If specified, upload as a draft.')
p.add_option('--ne', '--no-emails', p.add_option('--ne', '--no-emails',
action='store_false', dest='notify', default=True, action='store_false', dest='notify', default=True,
help='If specified, do not send emails on upload.') help='If specified, do not send emails on upload.')
@ -196,15 +168,6 @@ Gerrit Code Review: https://www.gerritcodereview.com/
type='string', action='store', dest='dest_branch', type='string', action='store', dest='dest_branch',
metavar='BRANCH', metavar='BRANCH',
help='Submit for review on this target branch.') help='Submit for review on this target branch.')
p.add_option('-n', '--dry-run',
dest='dryrun', default=False, action='store_true',
help='Do everything except actually upload the CL.')
p.add_option('-y', '--yes',
default=False, action='store_true',
help='Answer yes to all safe prompts.')
p.add_option('--no-cert-checks',
dest='validate_certs', action='store_false', default=True,
help='Disable verifying ssl certs (unsafe).')
# Options relating to upload hook. Note that verify and no-verify are NOT # Options relating to upload hook. Note that verify and no-verify are NOT
# opposites of each other, which is why they store to different locations. # opposites of each other, which is why they store to different locations.
@ -222,16 +185,15 @@ Gerrit Code Review: https://www.gerritcodereview.com/
# Never run upload hooks, but upload anyway (AKA bypass hooks). # Never run upload hooks, but upload anyway (AKA bypass hooks).
# - no-verify=True, verify=True: # - no-verify=True, verify=True:
# Invalid # Invalid
g = p.add_option_group('Upload hooks') p.add_option('--no-cert-checks',
g.add_option('--no-verify', dest='validate_certs', action='store_false', default=True,
help='Disable verifying ssl certs (unsafe).')
p.add_option('--no-verify',
dest='bypass_hooks', action='store_true', dest='bypass_hooks', action='store_true',
help='Do not run the upload hook.') help='Do not run the upload hook.')
g.add_option('--verify', p.add_option('--verify',
dest='allow_all_hooks', action='store_true', dest='allow_all_hooks', action='store_true',
help='Run the upload hook without prompting.') help='Run the upload hook without prompting.')
g.add_option('--ignore-hooks',
dest='ignore_hooks', action='store_true',
help='Do not abort uploading if upload hooks fail.')
def _SingleBranch(self, opt, branch, people): def _SingleBranch(self, opt, branch, people):
project = branch.project project = branch.project
@ -250,7 +212,7 @@ Gerrit Code Review: https://www.gerritcodereview.com/
destination = opt.dest_branch or project.dest_branch or project.revisionExpr destination = opt.dest_branch or project.dest_branch or project.revisionExpr
print('Upload project %s/ to remote branch %s%s:' % print('Upload project %s/ to remote branch %s%s:' %
(project.relpath, destination, ' (private)' if opt.private else '')) (project.relpath, destination, ' (draft)' if opt.draft else ''))
print(' branch %s (%2d commit%s, %s):' % ( print(' branch %s (%2d commit%s, %s):' % (
name, name,
len(commit_list), len(commit_list),
@ -262,10 +224,6 @@ Gerrit Code Review: https://www.gerritcodereview.com/
print('to %s (y/N)? ' % remote.review, end='') print('to %s (y/N)? ' % remote.review, end='')
# TODO: When we require Python 3, use flush=True w/print above. # TODO: When we require Python 3, use flush=True w/print above.
sys.stdout.flush() sys.stdout.flush()
if opt.yes:
print('<--yes>')
answer = True
else:
answer = sys.stdin.readline().strip().lower() answer = sys.stdin.readline().strip().lower()
answer = answer in ('y', 'yes', '1', 'true', 't') answer = answer in ('y', 'yes', '1', 'true', 't')
@ -364,12 +322,12 @@ Gerrit Code Review: https://www.gerritcodereview.com/
key = 'review.%s.autoreviewer' % project.GetBranch(name).remote.review key = 'review.%s.autoreviewer' % project.GetBranch(name).remote.review
raw_list = project.config.GetString(key) raw_list = project.config.GetString(key)
if raw_list is not None: if not raw_list is None:
people[0].extend([entry.strip() for entry in raw_list.split(',')]) people[0].extend([entry.strip() for entry in raw_list.split(',')])
key = 'review.%s.autocopy' % project.GetBranch(name).remote.review key = 'review.%s.autocopy' % project.GetBranch(name).remote.review
raw_list = project.config.GetString(key) raw_list = project.config.GetString(key)
if raw_list is not None and len(people[0]) > 0: if not raw_list is None and len(people[0]) > 0:
people[1].extend([entry.strip() for entry in raw_list.split(',')]) people[1].extend([entry.strip() for entry in raw_list.split(',')])
def _FindGerritChange(self, branch): def _FindGerritChange(self, branch):
@ -406,10 +364,6 @@ Gerrit Code Review: https://www.gerritcodereview.com/
print('Continue uploading? (y/N) ', end='') print('Continue uploading? (y/N) ', end='')
# TODO: When we require Python 3, use flush=True w/print above. # TODO: When we require Python 3, use flush=True w/print above.
sys.stdout.flush() sys.stdout.flush()
if opt.yes:
print('<--yes>')
a = 'yes'
else:
a = sys.stdin.readline().strip().lower() a = sys.stdin.readline().strip().lower()
if a not in ('y', 'yes', 't', 'true', 'on'): if a not in ('y', 'yes', 't', 'true', 'on'):
print("skipping upload", file=sys.stderr) print("skipping upload", file=sys.stderr)
@ -422,51 +376,12 @@ Gerrit Code Review: https://www.gerritcodereview.com/
key = 'review.%s.uploadtopic' % branch.project.remote.review key = 'review.%s.uploadtopic' % branch.project.remote.review
opt.auto_topic = branch.project.config.GetBoolean(key) opt.auto_topic = branch.project.config.GetBoolean(key)
def _ExpandCommaList(value):
"""Split |value| up into comma delimited entries."""
if not value:
return
for ret in value.split(','):
ret = ret.strip()
if ret:
yield ret
# Check if hashtags should be included.
key = 'review.%s.uploadhashtags' % branch.project.remote.review
hashtags = set(_ExpandCommaList(branch.project.config.GetString(key)))
for tag in opt.hashtags:
hashtags.update(_ExpandCommaList(tag))
if opt.hashtag_branch:
hashtags.add(branch.name)
# Check if labels should be included.
key = 'review.%s.uploadlabels' % branch.project.remote.review
labels = set(_ExpandCommaList(branch.project.config.GetString(key)))
for label in opt.labels:
labels.update(_ExpandCommaList(label))
# Basic sanity check on label syntax.
for label in labels:
if not re.match(r'^.+[+-][0-9]+$', label):
print('repo: error: invalid label syntax "%s": labels use forms '
'like CodeReview+1 or Verified-1' % (label,), file=sys.stderr)
sys.exit(1)
# Handle e-mail notifications.
if opt.notify is False:
notify = 'NONE'
else:
key = 'review.%s.uploadnotify' % branch.project.remote.review
notify = branch.project.config.GetString(key)
destination = opt.dest_branch or branch.project.dest_branch destination = opt.dest_branch or branch.project.dest_branch
# Make sure our local branch is not setup to track a different remote branch # Make sure our local branch is not setup to track a different remote branch
merge_branch = self._GetMergeBranch(branch.project) merge_branch = self._GetMergeBranch(branch.project)
if destination: if destination:
full_dest = destination full_dest = 'refs/heads/%s' % destination
if not full_dest.startswith(R_HEADS):
full_dest = R_HEADS + full_dest
if not opt.dest_branch and merge_branch and merge_branch != full_dest: if not opt.dest_branch and merge_branch and merge_branch != full_dest:
print('merge branch %s does not match destination branch %s' print('merge branch %s does not match destination branch %s'
% (merge_branch, full_dest)) % (merge_branch, full_dest))
@ -477,12 +392,10 @@ Gerrit Code Review: https://www.gerritcodereview.com/
continue continue
branch.UploadForReview(people, branch.UploadForReview(people,
dryrun=opt.dryrun,
auto_topic=opt.auto_topic, auto_topic=opt.auto_topic,
hashtags=hashtags, draft=opt.draft,
labels=labels,
private=opt.private, private=opt.private,
notify=notify, notify=None if opt.notify else 'NONE',
wip=opt.wip, wip=opt.wip,
dest_branch=destination, dest_branch=destination,
validate_certs=opt.validate_certs, validate_certs=opt.validate_certs,
@ -505,8 +418,8 @@ Gerrit Code Review: https://www.gerritcodereview.com/
else: else:
fmt = '\n (%s)' fmt = '\n (%s)'
print(('[FAILED] %-15s %-15s' + fmt) % ( print(('[FAILED] %-15s %-15s' + fmt) % (
branch.project.relpath + '/', branch.project.relpath + '/', \
branch.name, branch.name, \
str(branch.error)), str(branch.error)),
file=sys.stderr) file=sys.stderr)
print() print()
@ -565,12 +478,8 @@ Gerrit Code Review: https://www.gerritcodereview.com/
pending.append((project, avail)) pending.append((project, avail))
if not pending: if not pending:
if branch is None: print("no branches ready for upload", file=sys.stderr)
print('repo: error: no branches ready for upload', file=sys.stderr) return
else:
print('repo: error: no branches named "%s" ready for upload' %
(branch,), file=sys.stderr)
return 1
if not opt.bypass_hooks: if not opt.bypass_hooks:
hook = RepoHook('pre-upload', self.manifest.repo_hooks_project, hook = RepoHook('pre-upload', self.manifest.repo_hooks_project,
@ -579,24 +488,12 @@ Gerrit Code Review: https://www.gerritcodereview.com/
abort_if_user_denies=True) abort_if_user_denies=True)
pending_proj_names = [project.name for (project, available) in pending] pending_proj_names = [project.name for (project, available) in pending]
pending_worktrees = [project.worktree for (project, available) in pending] pending_worktrees = [project.worktree for (project, available) in pending]
passed = True
try: try:
hook.Run(opt.allow_all_hooks, project_list=pending_proj_names, hook.Run(opt.allow_all_hooks, project_list=pending_proj_names,
worktree_list=pending_worktrees) worktree_list=pending_worktrees)
except SystemExit:
passed = False
if not opt.ignore_hooks:
raise
except HookError as e: except HookError as e:
passed = False
print("ERROR: %s" % str(e), file=sys.stderr) print("ERROR: %s" % str(e), file=sys.stderr)
return
if not passed:
if opt.ignore_hooks:
print('\nWARNING: pre-upload hooks failed, but uploading anyways.',
file=sys.stderr)
else:
return 1
if opt.reviewers: if opt.reviewers:
reviewers = _SplitEmails(opt.reviewers) reviewers = _SplitEmails(opt.reviewers)

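The upload hunks above merge per-remote config values (review.URL.uploadhashtags / uploadlabels) with repeated --hashtag/--label flags through _ExpandCommaList and then sanity-check label syntax against ^.+[+-][0-9]+$. A standalone sketch of that merge-and-validate step, with the config lookup reduced to a plain string argument:

    import re

    def expand_comma_list(value):
        """Split a comma-delimited string, dropping empty entries."""
        for item in (value or '').split(','):
            item = item.strip()
            if item:
                yield item

    def collect_labels(config_value, cli_labels):
        labels = set(expand_comma_list(config_value))
        for value in cli_labels:
            labels.update(expand_comma_list(value))
        for label in labels:
            # Labels look like CodeReview+1 or Verified-1.
            if not re.match(r'^.+[+-][0-9]+$', label):
                raise ValueError('invalid label syntax "%s"' % label)
        return labels

    # e.g. collect_labels('Verified+1', ['CodeReview+2, Commit-Queue+1'])
    # -> {'Verified+1', 'CodeReview+2', 'Commit-Queue+1'} (set order varies)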

@ -15,15 +15,11 @@
# limitations under the License. # limitations under the License.
from __future__ import print_function from __future__ import print_function
import platform
import sys import sys
from command import Command, MirrorSafeCommand from command import Command, MirrorSafeCommand
from git_command import git, RepoSourceVersion, user_agent from git_command import git, RepoSourceVersion, user_agent
from git_refs import HEAD from git_refs import HEAD
class Version(Command, MirrorSafeCommand): class Version(Command, MirrorSafeCommand):
wrapper_version = None wrapper_version = None
wrapper_path = None wrapper_path = None
@ -43,11 +39,10 @@ class Version(Command, MirrorSafeCommand):
rp_ver = rp.bare_git.describe(HEAD) rp_ver = rp.bare_git.describe(HEAD)
print('repo version %s' % rp_ver) print('repo version %s' % rp_ver)
print(' (from %s)' % rem.url) print(' (from %s)' % rem.url)
print(' (%s)' % rp.bare_git.log('-1', '--format=%cD', HEAD))
if self.wrapper_path is not None: if Version.wrapper_path is not None:
print('repo launcher version %s' % self.wrapper_version) print('repo launcher version %s' % Version.wrapper_version)
print(' (from %s)' % self.wrapper_path) print(' (from %s)' % Version.wrapper_path)
if src_ver != rp_ver: if src_ver != rp_ver:
print(' (currently at %s)' % src_ver) print(' (currently at %s)' % src_ver)
@ -56,11 +51,3 @@ class Version(Command, MirrorSafeCommand):
print('git %s' % git.version_tuple().full) print('git %s' % git.version_tuple().full)
print('git User-Agent %s' % user_agent.git) print('git User-Agent %s' % user_agent.git)
print('Python %s' % sys.version) print('Python %s' % sys.version)
uname = platform.uname()
if sys.version_info.major < 3:
# Python 3 returns a named tuple, but Python 2 is simpler.
print(uname)
else:
print('OS %s %s (%s)' % (uname.system, uname.release, uname.version))
print('CPU %s (%s)' %
(uname.machine, uname.processor if uname.processor else 'unknown'))


@ -1,13 +1,3 @@
[section] [section]
empty empty
nonempty = true nonempty = true
boolinvalid = oops
booltrue = true
boolfalse = false
intinvalid = oops
inthex = 0x10
inthexk = 0x10k
int = 10
intk = 10k
intm = 10m
intg = 10g


@ -21,40 +21,7 @@ from __future__ import print_function
import re import re
import unittest import unittest
try:
from unittest import mock
except ImportError:
import mock
import git_command import git_command
import wrapper
class SSHUnitTest(unittest.TestCase):
"""Tests the ssh functions."""
def test_ssh_version(self):
"""Check ssh_version() handling."""
ver = git_command._parse_ssh_version('Unknown\n')
self.assertEqual(ver, ())
ver = git_command._parse_ssh_version('OpenSSH_1.0\n')
self.assertEqual(ver, (1, 0))
ver = git_command._parse_ssh_version('OpenSSH_6.6.1p1 Ubuntu-2ubuntu2.13, OpenSSL 1.0.1f 6 Jan 2014\n')
self.assertEqual(ver, (6, 6, 1))
ver = git_command._parse_ssh_version('OpenSSH_7.6p1 Ubuntu-4ubuntu0.3, OpenSSL 1.0.2n 7 Dec 2017\n')
self.assertEqual(ver, (7, 6))
def test_ssh_sock(self):
"""Check ssh_sock() function."""
with mock.patch('tempfile.mkdtemp', return_value='/tmp/foo'):
# old ssh version uses port
with mock.patch('git_command.ssh_version', return_value=(6, 6)):
self.assertTrue(git_command.ssh_sock().endswith('%p'))
git_command._ssh_sock_path = None
# new ssh version uses hash
with mock.patch('git_command.ssh_version', return_value=(6, 7)):
self.assertTrue(git_command.ssh_sock().endswith('%C'))
git_command._ssh_sock_path = None
class GitCallUnitTest(unittest.TestCase): class GitCallUnitTest(unittest.TestCase):
@ -109,45 +76,3 @@ class UserAgentUnitTest(unittest.TestCase):
# the general form. # the general form.
m = re.match(r'^git/[^ ]+ ([^ ]+) git-repo/[^ ]+', ua) m = re.match(r'^git/[^ ]+ ([^ ]+) git-repo/[^ ]+', ua)
self.assertIsNotNone(m) self.assertIsNotNone(m)
class GitRequireTests(unittest.TestCase):
"""Test the git_require helper."""
def setUp(self):
ver = wrapper.GitVersion(1, 2, 3, 4)
mock.patch.object(git_command.git, 'version_tuple', return_value=ver).start()
def tearDown(self):
mock.patch.stopall()
def test_older_nonfatal(self):
"""Test non-fatal require calls with old versions."""
self.assertFalse(git_command.git_require((2,)))
self.assertFalse(git_command.git_require((1, 3)))
self.assertFalse(git_command.git_require((1, 2, 4)))
self.assertFalse(git_command.git_require((1, 2, 3, 5)))
def test_newer_nonfatal(self):
"""Test non-fatal require calls with newer versions."""
self.assertTrue(git_command.git_require((0,)))
self.assertTrue(git_command.git_require((1, 0)))
self.assertTrue(git_command.git_require((1, 2, 0)))
self.assertTrue(git_command.git_require((1, 2, 3, 0)))
def test_equal_nonfatal(self):
"""Test require calls with equal values."""
self.assertTrue(git_command.git_require((1, 2, 3, 4), fail=False))
self.assertTrue(git_command.git_require((1, 2, 3, 4), fail=True))
def test_older_fatal(self):
"""Test fatal require calls with old versions."""
with self.assertRaises(SystemExit) as e:
git_command.git_require((2,), fail=True)
self.assertNotEqual(0, e.code)
def test_older_fatal_msg(self):
"""Test fatal require calls with old versions and message."""
with self.assertRaises(SystemExit) as e:
git_command.git_require((2,), fail=True, msg='so sad')
self.assertNotEqual(0, e.code)

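The deleted SSHUnitTest above pins down git_command._parse_ssh_version (turning `ssh -V` banners into version tuples) and the %p-vs-%C control-socket suffix switch at OpenSSH 6.7. The parser itself is not part of this diff; the following is only a plausible reconstruction that satisfies the quoted test cases:

    import re

    def parse_ssh_version(banner):
        """Map 'OpenSSH_6.6.1p1 ...' to (6, 6, 1); unknown banners to ()."""
        m = re.match(r'^OpenSSH_([0-9.]+)(p[0-9]+)?', banner)
        if not m:
            return ()
        return tuple(int(x) for x in m.group(1).split('.'))

    assert parse_ssh_version('Unknown\n') == ()
    assert parse_ssh_version('OpenSSH_1.0\n') == (1, 0)
    assert parse_ssh_version(
        'OpenSSH_6.6.1p1 Ubuntu-2ubuntu2.13, OpenSSL 1.0.1f 6 Jan 2014\n') == (6, 6, 1)
    assert parse_ssh_version(
        'OpenSSH_7.6p1 Ubuntu-4ubuntu0.3, OpenSSL 1.0.2n 7 Dec 2017\n') == (7, 6)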

@ -23,17 +23,14 @@ import unittest
import git_config import git_config
def fixture(*paths): def fixture(*paths):
"""Return a path relative to test/fixtures. """Return a path relative to test/fixtures.
""" """
return os.path.join(os.path.dirname(__file__), 'fixtures', *paths) return os.path.join(os.path.dirname(__file__), 'fixtures', *paths)
class GitConfigUnitTest(unittest.TestCase): class GitConfigUnitTest(unittest.TestCase):
"""Tests the GitConfig class. """Tests the GitConfig class.
""" """
def setUp(self): def setUp(self):
"""Create a GitConfig object using the test.gitconfig fixture. """Create a GitConfig object using the test.gitconfig fixture.
""" """
@ -71,43 +68,5 @@ class GitConfigUnitTest(unittest.TestCase):
val = config.GetString('empty') val = config.GetString('empty')
self.assertEqual(val, None) self.assertEqual(val, None)
def test_GetBoolean_undefined(self):
"""Test GetBoolean on key that doesn't exist."""
self.assertIsNone(self.config.GetBoolean('section.missing'))
def test_GetBoolean_invalid(self):
"""Test GetBoolean on invalid boolean value."""
self.assertIsNone(self.config.GetBoolean('section.boolinvalid'))
def test_GetBoolean_true(self):
"""Test GetBoolean on valid true boolean."""
self.assertTrue(self.config.GetBoolean('section.booltrue'))
def test_GetBoolean_false(self):
"""Test GetBoolean on valid false boolean."""
self.assertFalse(self.config.GetBoolean('section.boolfalse'))
def test_GetInt_undefined(self):
"""Test GetInt on key that doesn't exist."""
self.assertIsNone(self.config.GetInt('section.missing'))
def test_GetInt_invalid(self):
"""Test GetInt on invalid integer value."""
self.assertIsNone(self.config.GetBoolean('section.intinvalid'))
def test_GetInt_valid(self):
"""Test GetInt on valid integers."""
TESTS = (
('inthex', 16),
('inthexk', 16384),
('int', 10),
('intk', 10240),
('intm', 10485760),
('intg', 10737418240),
)
for key, value in TESTS:
self.assertEqual(value, self.config.GetInt('section.%s' % (key,)))
if __name__ == '__main__': if __name__ == '__main__':
unittest.main() unittest.main()

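The new test.gitconfig keys and the GetInt/GetBoolean tests above expect git-style integer parsing: hex input and k/m/g multipliers (10k -> 10240, 0x10k -> 16384), with None for unparsable values. An illustrative reimplementation consistent with those expectations, not the GitConfig code itself:

    def parse_git_int(value):
        """Parse git-config style integers: optional 0x prefix, k/m/g suffix."""
        if value is None:
            return None
        value = value.strip().lower()
        mult = 1
        if value and value[-1] in 'kmg':
            mult = 1024 ** ('kmg'.index(value[-1]) + 1)
            value = value[:-1]
        try:
            return int(value, 0) * mult   # base 0 accepts both 10 and 0x10
        except ValueError:
            return None

    assert parse_git_int('10k') == 10240
    assert parse_git_int('0x10k') == 16384
    assert parse_git_int('10g') == 10737418240
    assert parse_git_int('oops') is None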

@ -1,60 +0,0 @@
# -*- coding:utf-8 -*-
#
# Copyright (C) 2019 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""Unittests for the hooks.py module."""
from __future__ import print_function
import hooks
import unittest
class RepoHookShebang(unittest.TestCase):
"""Check shebang parsing in RepoHook."""
def test_no_shebang(self):
"""Lines w/out shebangs should be rejected."""
DATA = (
'',
'# -*- coding:utf-8 -*-\n',
'#\n# foo\n',
'# Bad shebang in script\n#!/foo\n'
)
for data in DATA:
self.assertIsNone(hooks.RepoHook._ExtractInterpFromShebang(data))
def test_direct_interp(self):
"""Lines whose shebang points directly to the interpreter."""
DATA = (
('#!/foo', '/foo'),
('#! /foo', '/foo'),
('#!/bin/foo ', '/bin/foo'),
('#! /usr/foo ', '/usr/foo'),
('#! /usr/foo -args', '/usr/foo'),
)
for shebang, interp in DATA:
self.assertEqual(hooks.RepoHook._ExtractInterpFromShebang(shebang),
interp)
def test_env_interp(self):
"""Lines whose shebang launches through `env`."""
DATA = (
('#!/usr/bin/env foo', 'foo'),
('#!/bin/env foo', 'foo'),
('#! /bin/env /bin/foo ', '/bin/foo'),
)
for shebang, interp in DATA:
self.assertEqual(hooks.RepoHook._ExtractInterpFromShebang(shebang),
interp)

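The removed test_hooks.py module above exercises RepoHook._ExtractInterpFromShebang: only a first-line shebang counts, and '#!/usr/bin/env foo' resolves to the argument after env. A hedged stand-in that passes the quoted cases (repo's real implementation may differ):

    import os

    def extract_interp_from_shebang(data):
        """Return the interpreter named by a first-line shebang, else None."""
        first_line = data.splitlines()[0] if data else ''
        if not first_line.startswith('#!'):
            return None
        parts = first_line[2:].strip().split()
        if not parts:
            return None
        if os.path.basename(parts[0]) == 'env' and len(parts) > 1:
            return parts[1]    # '#!/usr/bin/env foo' -> 'foo'
        return parts[0]        # '#! /usr/foo -args'  -> '/usr/foo'

    assert extract_interp_from_shebang('#!/bin/env foo') == 'foo'
    assert extract_interp_from_shebang('# Bad shebang in script\n#!/foo\n') is None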

@ -18,9 +18,7 @@
from __future__ import print_function from __future__ import print_function
import os
import unittest import unittest
import xml.dom.minidom
import error import error
import manifest_xml import manifest_xml
@ -51,8 +49,6 @@ class ManifestValidateFilePaths(unittest.TestCase):
# We allow symlinks to end in a slash since we allow them to point to dirs # We allow symlinks to end in a slash since we allow them to point to dirs
# in general. Technically the slash isn't necessary. # in general. Technically the slash isn't necessary.
check('foo/', 'bar') check('foo/', 'bar')
# We allow a single '.' to get a reference to the project itself.
check('.', 'bar')
def test_bad_paths(self): def test_bad_paths(self):
"""Make sure bad paths (src & dest) are rejected.""" """Make sure bad paths (src & dest) are rejected."""
@ -80,69 +76,8 @@ class ManifestValidateFilePaths(unittest.TestCase):
# Block Unicode characters that get normalized out by filesystems. # Block Unicode characters that get normalized out by filesystems.
u'foo\u200Cbar', u'foo\u200Cbar',
) )
# Make sure platforms that use path separators (e.g. Windows) are also
# rejected properly.
if os.path.sep != '/':
PATHS += tuple(x.replace('/', os.path.sep) for x in PATHS)
for path in PATHS: for path in PATHS:
self.assertRaises( self.assertRaises(
error.ManifestInvalidPathError, self.check_both, path, 'a') error.ManifestInvalidPathError, self.check_both, path, 'a')
self.assertRaises( self.assertRaises(
error.ManifestInvalidPathError, self.check_both, 'a', path) error.ManifestInvalidPathError, self.check_both, 'a', path)
class ValueTests(unittest.TestCase):
"""Check utility parsing code."""
def _get_node(self, text):
return xml.dom.minidom.parseString(text).firstChild
def test_bool_default(self):
"""Check XmlBool default handling."""
node = self._get_node('<node/>')
self.assertIsNone(manifest_xml.XmlBool(node, 'a'))
self.assertIsNone(manifest_xml.XmlBool(node, 'a', None))
self.assertEqual(123, manifest_xml.XmlBool(node, 'a', 123))
node = self._get_node('<node a=""/>')
self.assertIsNone(manifest_xml.XmlBool(node, 'a'))
def test_bool_invalid(self):
"""Check XmlBool invalid handling."""
node = self._get_node('<node a="moo"/>')
self.assertEqual(123, manifest_xml.XmlBool(node, 'a', 123))
def test_bool_true(self):
"""Check XmlBool true values."""
for value in ('yes', 'true', '1'):
node = self._get_node('<node a="%s"/>' % (value,))
self.assertTrue(manifest_xml.XmlBool(node, 'a'))
def test_bool_false(self):
"""Check XmlBool false values."""
for value in ('no', 'false', '0'):
node = self._get_node('<node a="%s"/>' % (value,))
self.assertFalse(manifest_xml.XmlBool(node, 'a'))
def test_int_default(self):
"""Check XmlInt default handling."""
node = self._get_node('<node/>')
self.assertIsNone(manifest_xml.XmlInt(node, 'a'))
self.assertIsNone(manifest_xml.XmlInt(node, 'a', None))
self.assertEqual(123, manifest_xml.XmlInt(node, 'a', 123))
node = self._get_node('<node a=""/>')
self.assertIsNone(manifest_xml.XmlInt(node, 'a'))
def test_int_good(self):
"""Check XmlInt numeric handling."""
for value in (-1, 0, 1, 50000):
node = self._get_node('<node a="%s"/>' % (value,))
self.assertEqual(value, manifest_xml.XmlInt(node, 'a'))
def test_int_invalid(self):
"""Check XmlInt invalid handling."""
with self.assertRaises(error.ManifestParseError):
node = self._get_node('<node a="xx"/>')
manifest_xml.XmlInt(node, 'a')

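The removed ValueTests above document manifest_xml.XmlBool and XmlInt: missing or empty attributes fall back to the default, yes/true/1 and no/false/0 map to booleans, unparsable booleans return the default, and unparsable ints raise. Illustrative stand-ins consistent with those tests (using ValueError where repo raises ManifestParseError):

    import xml.dom.minidom

    def xml_bool(node, attr, default=None):
        value = node.getAttribute(attr).lower()
        if not value:
            return default
        if value in ('yes', 'true', '1'):
            return True
        if value in ('no', 'false', '0'):
            return False
        return default   # unparsable -> default

    def xml_int(node, attr, default=None):
        value = node.getAttribute(attr)
        if not value:
            return default
        try:
            return int(value)
        except ValueError:
            raise ValueError('invalid numeric value for %s: %s' % (attr, value))

    node = xml.dom.minidom.parseString('<node a="yes" n="50000"/>').firstChild
    print(xml_bool(node, 'a'), xml_int(node, 'n'))   # True 50000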

@ -27,7 +27,6 @@ import unittest
import error import error
import git_config import git_config
import platform_utils
import project import project
@ -41,7 +40,46 @@ def TempGitTree():
subprocess.check_call(['git', 'init'], cwd=tempdir) subprocess.check_call(['git', 'init'], cwd=tempdir)
yield tempdir yield tempdir
finally: finally:
platform_utils.rmtree(tempdir) shutil.rmtree(tempdir)
class RepoHookShebang(unittest.TestCase):
"""Check shebang parsing in RepoHook."""
def test_no_shebang(self):
"""Lines w/out shebangs should be rejected."""
DATA = (
'',
'# -*- coding:utf-8 -*-\n',
'#\n# foo\n',
'# Bad shebang in script\n#!/foo\n'
)
for data in DATA:
self.assertIsNone(project.RepoHook._ExtractInterpFromShebang(data))
def test_direct_interp(self):
"""Lines whose shebang points directly to the interpreter."""
DATA = (
('#!/foo', '/foo'),
('#! /foo', '/foo'),
('#!/bin/foo ', '/bin/foo'),
('#! /usr/foo ', '/usr/foo'),
('#! /usr/foo -args', '/usr/foo'),
)
for shebang, interp in DATA:
self.assertEqual(project.RepoHook._ExtractInterpFromShebang(shebang),
interp)
def test_env_interp(self):
"""Lines whose shebang launches through `env`."""
DATA = (
('#!/usr/bin/env foo', 'foo'),
('#!/bin/env foo', 'foo'),
('#! /bin/env /bin/foo ', '/bin/foo'),
)
for shebang, interp in DATA:
self.assertEqual(project.RepoHook._ExtractInterpFromShebang(shebang),
interp)
class FakeProject(object): class FakeProject(object):
@ -125,7 +163,7 @@ class CopyLinkTestCase(unittest.TestCase):
@staticmethod @staticmethod
def touch(path): def touch(path):
with open(path, 'w'): with open(path, 'w') as f:
pass pass
     def assertExists(self, path, msg=None):
@@ -205,22 +243,20 @@ class CopyFile(CopyLinkTestCase):
     src = os.path.join(self.worktree, 'foo.txt')
     sym = os.path.join(self.worktree, 'sym')
     self.touch(src)
-    platform_utils.symlink('foo.txt', sym)
+    os.symlink('foo.txt', sym)
     self.assertExists(sym)
     cf = self.CopyFile('sym', 'foo')
     self.assertRaises(error.ManifestInvalidPathError, cf._Copy)

   def test_src_block_symlink_traversal(self):
     """Do not allow reading through a symlink dir."""
-    realfile = os.path.join(self.tempdir, 'file.txt')
-    self.touch(realfile)
-    src = os.path.join(self.worktree, 'bar', 'file.txt')
-    platform_utils.symlink(self.tempdir, os.path.join(self.worktree, 'bar'))
+    src = os.path.join(self.worktree, 'bar', 'passwd')
+    os.symlink('/etc', os.path.join(self.worktree, 'bar'))
     self.assertExists(src)
-    cf = self.CopyFile('bar/file.txt', 'foo')
+    cf = self.CopyFile('bar/foo.txt', 'foo')
     self.assertRaises(error.ManifestInvalidPathError, cf._Copy)

-  def test_src_block_copy_from_dir(self):
+  def test_src_block_dir(self):
     """Do not allow copying from a directory."""
     src = os.path.join(self.worktree, 'dir')
     os.makedirs(src)
@@ -231,7 +267,7 @@ class CopyFile(CopyLinkTestCase):
     """Do not allow writing to a symlink."""
     src = os.path.join(self.worktree, 'foo.txt')
     self.touch(src)
-    platform_utils.symlink('dest', os.path.join(self.topdir, 'sym'))
+    os.symlink('dest', os.path.join(self.topdir, 'sym'))
     cf = self.CopyFile('foo.txt', 'sym')
     self.assertRaises(error.ManifestInvalidPathError, cf._Copy)
@@ -239,12 +275,11 @@ class CopyFile(CopyLinkTestCase):
     """Do not allow writing through a symlink dir."""
     src = os.path.join(self.worktree, 'foo.txt')
     self.touch(src)
-    platform_utils.symlink(tempfile.gettempdir(),
-                           os.path.join(self.topdir, 'sym'))
+    os.symlink('/tmp', os.path.join(self.topdir, 'sym'))
     cf = self.CopyFile('foo.txt', 'sym/foo.txt')
     self.assertRaises(error.ManifestInvalidPathError, cf._Copy)

-  def test_src_block_copy_to_dir(self):
+  def test_src_block_dir(self):
     """Do not allow copying to a directory."""
     src = os.path.join(self.worktree, 'foo.txt')
     self.touch(src)
@@ -268,7 +303,7 @@ class LinkFile(CopyLinkTestCase):
     dest = os.path.join(self.topdir, 'foo')
     self.assertExists(dest)
     self.assertTrue(os.path.islink(dest))
-    self.assertEqual(os.path.join('git-project', 'foo.txt'), os.readlink(dest))
+    self.assertEqual('git-project/foo.txt', os.readlink(dest))

   def test_src_subdir(self):
     """Link to a file in a subdir of a project."""
@@ -279,14 +314,6 @@ class LinkFile(CopyLinkTestCase):
     lf._Link()
     self.assertExists(os.path.join(self.topdir, 'foo'))

-  def test_src_self(self):
-    """Link to the project itself."""
-    dest = os.path.join(self.topdir, 'foo', 'bar')
-    lf = self.LinkFile('.', 'foo/bar')
-    lf._Link()
-    self.assertExists(dest)
-    self.assertEqual(os.path.join('..', 'git-project'), os.readlink(dest))
-
   def test_dest_subdir(self):
     """Link a file to a subdir of a checkout."""
     src = os.path.join(self.worktree, 'foo.txt')
@@ -296,21 +323,6 @@ class LinkFile(CopyLinkTestCase):
     lf._Link()
     self.assertExists(os.path.join(self.topdir, 'sub', 'dir', 'foo', 'bar'))

-  def test_src_block_relative(self):
-    """Do not allow relative symlinks."""
-    BAD_SOURCES = (
-        './',
-        '..',
-        '../',
-        'foo/.',
-        'foo/./bar',
-        'foo/..',
-        'foo/../foo',
-    )
-    for src in BAD_SOURCES:
-      lf = self.LinkFile(src, 'foo')
-      self.assertRaises(error.ManifestInvalidPathError, lf._Link)
-
   def test_update(self):
     """Make sure changed targets get updated."""
     dest = os.path.join(self.topdir, 'sym')
@@ -319,10 +331,10 @@ class LinkFile(CopyLinkTestCase):
     self.touch(src)
     lf = self.LinkFile('foo.txt', 'sym')
     lf._Link()
-    self.assertEqual(os.path.join('git-project', 'foo.txt'), os.readlink(dest))
+    self.assertEqual('git-project/foo.txt', os.readlink(dest))
     # Point the symlink somewhere else.
     os.unlink(dest)
-    platform_utils.symlink(self.tempdir, dest)
+    os.symlink('/', dest)
     lf._Link()
-    self.assertEqual(os.path.join('git-project', 'foo.txt'), os.readlink(dest))
+    self.assertEqual('git-project/foo.txt', os.readlink(dest))
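Aside: every CopyFile/LinkFile case above probes the same contract, namely that a manifest-supplied path must stay inside the project worktree and end up at a regular file. A minimal sketch of that kind of check follows; the helper name and the use of ValueError are illustrative only, not repo's actual project.py validation.

# Hypothetical helper in the spirit of the traversal tests above.
import os

def check_copy_src(worktree, src):
  """Reject sources that escape the worktree via '..' or symlinks."""
  root = os.path.realpath(worktree)
  full = os.path.realpath(os.path.join(root, src))
  if not full.startswith(root + os.sep):
    raise ValueError('source escapes the project worktree: %r' % src)
  if not os.path.isfile(full):
    raise ValueError('source must be a regular file: %r' % src)
  return full

Because realpath() collapses symlinks before the prefix check, a source like 'bar/file.txt' reached through a symlinked 'bar' is rejected just as the tests above expect.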


@@ -1,43 +0,0 @@
# Copyright (C) 2020 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""Unittests for the subcmds module (mostly __init__.py than subcommands)."""
import unittest
import subcmds
class AllCommands(unittest.TestCase):
"""Check registered all_commands."""
def test_required_basic(self):
"""Basic checking of registered commands."""
# NB: We don't test all subcommands as we want to avoid "change detection"
# tests, so we just look for the most common/important ones here that are
# unlikely to ever change.
for cmd in {'cherry-pick', 'help', 'init', 'start', 'sync', 'upload'}:
self.assertIn(cmd, subcmds.all_commands)
def test_naming(self):
"""Verify we don't add things that we shouldn't."""
for cmd in subcmds.all_commands:
# Reject filename suffixes like "help.py".
self.assertNotIn('.', cmd)
# Make sure all '_' were converted to '-'.
self.assertNotIn('_', cmd)
# Reject internal python paths like "__init__".
self.assertFalse(cmd.startswith('__'))
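The naming checks above imply a registry keyed by dashed command names, with no filename suffixes and no dunder entries. One way such a registry could be assembled from a directory of command modules is sketched below; the function and its return value are assumptions for illustration, not the actual subcmds/__init__.py logic.

# Hypothetical registry builder consistent with the naming tests above.
import os

def build_all_commands(cmd_dir):
  all_commands = {}
  for entry in os.listdir(cmd_dir):
    name, ext = os.path.splitext(entry)
    if ext != '.py' or name.startswith('__'):
      continue
    # 'cherry_pick.py' registers as 'cherry-pick'; real code would store the
    # command class here, the module name is enough for the sketch.
    all_commands[name.replace('_', '-')] = name
  return all_commands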


@@ -1,49 +0,0 @@
# Copyright (C) 2020 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""Unittests for the subcmds/init.py module."""
import unittest
from subcmds import init
class InitCommand(unittest.TestCase):
"""Check registered all_commands."""
def setUp(self):
self.cmd = init.Init()
def test_cli_parser_good(self):
"""Check valid command line options."""
ARGV = (
[],
)
for argv in ARGV:
opts, args = self.cmd.OptionParser.parse_args(argv)
self.cmd.ValidateOptions(opts, args)
def test_cli_parser_bad(self):
"""Check invalid command line options."""
ARGV = (
# Too many arguments.
['asdf'],
# Conflicting options.
['--mirror', '--archive'],
)
for argv in ARGV:
opts, args = self.cmd.OptionParser.parse_args(argv)
with self.assertRaises(SystemExit):
self.cmd.ValidateOptions(opts, args)
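These tests lean on ValidateOptions() routing bad input through the option parser's error path, which raises SystemExit. A self-contained sketch of that pattern, using a stand-in class rather than the real subcmds/init.py, could look like this:

# Stand-in showing the parse-then-validate pattern the tests above assume.
import optparse

class FakeInit(object):
  def __init__(self):
    self.OptionParser = optparse.OptionParser()
    self.OptionParser.add_option('--mirror', action='store_true')
    self.OptionParser.add_option('--archive', action='store_true')

  def ValidateOptions(self, opt, args):
    if args:
      # optparse's error() prints usage and exits with status 2 (SystemExit).
      self.OptionParser.error('init takes no positional arguments')
    if opt.mirror and opt.archive:
      self.OptionParser.error('--mirror and --archive are mutually exclusive')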


@@ -18,83 +18,24 @@
 from __future__ import print_function

-import contextlib
 import os
-import re
-import shutil
-import tempfile
 import unittest

-import platform_utils
-from pyversion import is_python3
 import wrapper

-if is_python3():
-  from unittest import mock
-  from io import StringIO
-else:
-  import mock
-  from StringIO import StringIO
-
-
-@contextlib.contextmanager
-def TemporaryDirectory():
-  """Create a new empty git checkout for testing."""
-  # TODO(vapier): Convert this to tempfile.TemporaryDirectory once we drop
-  # Python 2 support entirely.
-  try:
-    tempdir = tempfile.mkdtemp(prefix='repo-tests')
-    yield tempdir
-  finally:
-    platform_utils.rmtree(tempdir)
-

 def fixture(*paths):
   """Return a path relative to tests/fixtures.
   """
   return os.path.join(os.path.dirname(__file__), 'fixtures', *paths)


-class RepoWrapperTestCase(unittest.TestCase):
-  """TestCase for the wrapper module."""
-
-  def setUp(self):
-    """Load the wrapper module every time."""
-    wrapper._wrapper_module = None
-    self.wrapper = wrapper.Wrapper()
-    if not is_python3():
-      self.assertRegex = self.assertRegexpMatches
-
-
-class RepoWrapperUnitTest(RepoWrapperTestCase):
+class RepoWrapperUnitTest(unittest.TestCase):
   """Tests helper functions in the repo wrapper
   """

-  def test_version(self):
-    """Make sure _Version works."""
-    with self.assertRaises(SystemExit) as e:
-      with mock.patch('sys.stdout', new_callable=StringIO) as stdout:
-        with mock.patch('sys.stderr', new_callable=StringIO) as stderr:
-          self.wrapper._Version()
-    self.assertEqual(0, e.exception.code)
-    self.assertEqual('', stderr.getvalue())
-    self.assertIn('repo launcher version', stdout.getvalue())
-
-  def test_init_parser(self):
-    """Make sure 'init' GetParser works."""
-    parser = self.wrapper.GetParser(gitc_init=False)
-    opts, args = parser.parse_args([])
-    self.assertEqual([], args)
-    self.assertIsNone(opts.manifest_url)
-
-  def test_gitc_init_parser(self):
-    """Make sure 'gitc-init' GetParser works."""
-    parser = self.wrapper.GetParser(gitc_init=True)
-    opts, args = parser.parse_args([])
-    self.assertEqual([], args)
-    self.assertIsNone(opts.manifest_file)
+  def setUp(self):
+    """Load the wrapper module every time
+    """
+    wrapper._wrapper_module = None
+    self.wrapper = wrapper.Wrapper()

   def test_get_gitc_manifest_dir_no_gitc(self):
     """
@@ -131,355 +72,9 @@ class RepoWrapperUnitTest(RepoWrapperTestCase):
     self.assertEqual(self.wrapper.gitc_parse_clientdir('/gitc/manifest-rw/test/extra'), 'test')
     self.assertEqual(self.wrapper.gitc_parse_clientdir('/test/usr/local/google/gitc/test'), 'test')
     self.assertEqual(self.wrapper.gitc_parse_clientdir('/test/usr/local/google/gitc/test/'), 'test')
-    self.assertEqual(self.wrapper.gitc_parse_clientdir('/test/usr/local/google/gitc/test/extra'),
-                     'test')
+    self.assertEqual(self.wrapper.gitc_parse_clientdir('/test/usr/local/google/gitc/test/extra'), 'test')
     self.assertEqual(self.wrapper.gitc_parse_clientdir('/gitc/manifest-rw/'), None)
     self.assertEqual(self.wrapper.gitc_parse_clientdir('/test/usr/local/google/gitc/'), None)
class SetGitTrace2ParentSid(RepoWrapperTestCase):
"""Check SetGitTrace2ParentSid behavior."""
KEY = 'GIT_TRACE2_PARENT_SID'
VALID_FORMAT = re.compile(r'^repo-[0-9]{8}T[0-9]{6}Z-P[0-9a-f]{8}$')
def test_first_set(self):
"""Test env var not yet set."""
env = {}
self.wrapper.SetGitTrace2ParentSid(env)
self.assertIn(self.KEY, env)
value = env[self.KEY]
self.assertRegex(value, self.VALID_FORMAT)
def test_append(self):
"""Test env var is appended."""
env = {self.KEY: 'pfx'}
self.wrapper.SetGitTrace2ParentSid(env)
self.assertIn(self.KEY, env)
value = env[self.KEY]
self.assertTrue(value.startswith('pfx/'))
self.assertRegex(value[4:], self.VALID_FORMAT)
def test_global_context(self):
"""Check os.environ gets updated by default."""
os.environ.pop(self.KEY, None)
self.wrapper.SetGitTrace2ParentSid()
self.assertIn(self.KEY, os.environ)
value = os.environ[self.KEY]
self.assertRegex(value, self.VALID_FORMAT)
class RunCommand(RepoWrapperTestCase):
"""Check run_command behavior."""
def test_capture(self):
"""Check capture_output handling."""
ret = self.wrapper.run_command(['echo', 'hi'], capture_output=True)
self.assertEqual(ret.stdout, 'hi\n')
def test_check(self):
"""Check check handling."""
self.wrapper.run_command(['true'], check=False)
self.wrapper.run_command(['true'], check=True)
self.wrapper.run_command(['false'], check=False)
with self.assertRaises(self.wrapper.RunError):
self.wrapper.run_command(['false'], check=True)
class RunGit(RepoWrapperTestCase):
"""Check run_git behavior."""
def test_capture(self):
"""Check capture_output handling."""
ret = self.wrapper.run_git('--version')
self.assertIn('git', ret.stdout)
def test_check(self):
"""Check check handling."""
with self.assertRaises(self.wrapper.CloneFailure):
self.wrapper.run_git('--version-asdfasdf')
self.wrapper.run_git('--version-asdfasdf', check=False)
class ParseGitVersion(RepoWrapperTestCase):
"""Check ParseGitVersion behavior."""
def test_autoload(self):
"""Check we can load the version from the live git."""
ret = self.wrapper.ParseGitVersion()
self.assertIsNotNone(ret)
def test_bad_ver(self):
"""Check handling of bad git versions."""
ret = self.wrapper.ParseGitVersion(ver_str='asdf')
self.assertIsNone(ret)
def test_normal_ver(self):
"""Check handling of normal git versions."""
ret = self.wrapper.ParseGitVersion(ver_str='git version 2.25.1')
self.assertEqual(2, ret.major)
self.assertEqual(25, ret.minor)
self.assertEqual(1, ret.micro)
self.assertEqual('2.25.1', ret.full)
def test_extended_ver(self):
"""Check handling of extended distro git versions."""
ret = self.wrapper.ParseGitVersion(
ver_str='git version 1.30.50.696.g5e7596f4ac-goog')
self.assertEqual(1, ret.major)
self.assertEqual(30, ret.minor)
self.assertEqual(50, ret.micro)
self.assertEqual('1.30.50.696.g5e7596f4ac-goog', ret.full)
class CheckGitVersion(RepoWrapperTestCase):
"""Check _CheckGitVersion behavior."""
def test_unknown(self):
"""Unknown versions should abort."""
with mock.patch.object(self.wrapper, 'ParseGitVersion', return_value=None):
with self.assertRaises(self.wrapper.CloneFailure):
self.wrapper._CheckGitVersion()
def test_old(self):
"""Old versions should abort."""
with mock.patch.object(
self.wrapper, 'ParseGitVersion',
return_value=self.wrapper.GitVersion(1, 0, 0, '1.0.0')):
with self.assertRaises(self.wrapper.CloneFailure):
self.wrapper._CheckGitVersion()
def test_new(self):
"""Newer versions should run fine."""
with mock.patch.object(
self.wrapper, 'ParseGitVersion',
return_value=self.wrapper.GitVersion(100, 0, 0, '100.0.0')):
self.wrapper._CheckGitVersion()
class NeedSetupGnuPG(RepoWrapperTestCase):
"""Check NeedSetupGnuPG behavior."""
def test_missing_dir(self):
"""The ~/.repoconfig tree doesn't exist yet."""
with TemporaryDirectory() as tempdir:
self.wrapper.home_dot_repo = os.path.join(tempdir, 'foo')
self.assertTrue(self.wrapper.NeedSetupGnuPG())
def test_missing_keyring(self):
"""The keyring-version file doesn't exist yet."""
with TemporaryDirectory() as tempdir:
self.wrapper.home_dot_repo = tempdir
self.assertTrue(self.wrapper.NeedSetupGnuPG())
def test_empty_keyring(self):
"""The keyring-version file exists, but is empty."""
with TemporaryDirectory() as tempdir:
self.wrapper.home_dot_repo = tempdir
with open(os.path.join(tempdir, 'keyring-version'), 'w'):
pass
self.assertTrue(self.wrapper.NeedSetupGnuPG())
def test_old_keyring(self):
"""The keyring-version file exists, but it's old."""
with TemporaryDirectory() as tempdir:
self.wrapper.home_dot_repo = tempdir
with open(os.path.join(tempdir, 'keyring-version'), 'w') as fp:
fp.write('1.0\n')
self.assertTrue(self.wrapper.NeedSetupGnuPG())
def test_new_keyring(self):
"""The keyring-version file exists, and is up-to-date."""
with TemporaryDirectory() as tempdir:
self.wrapper.home_dot_repo = tempdir
with open(os.path.join(tempdir, 'keyring-version'), 'w') as fp:
fp.write('1000.0\n')
self.assertFalse(self.wrapper.NeedSetupGnuPG())
class SetupGnuPG(RepoWrapperTestCase):
"""Check SetupGnuPG behavior."""
def test_full(self):
"""Make sure it works completely."""
with TemporaryDirectory() as tempdir:
self.wrapper.home_dot_repo = tempdir
self.wrapper.gpg_dir = os.path.join(self.wrapper.home_dot_repo, 'gnupg')
self.assertTrue(self.wrapper.SetupGnuPG(True))
with open(os.path.join(tempdir, 'keyring-version'), 'r') as fp:
data = fp.read()
self.assertEqual('.'.join(str(x) for x in self.wrapper.KEYRING_VERSION),
data.strip())
class VerifyRev(RepoWrapperTestCase):
"""Check verify_rev behavior."""
def test_verify_passes(self):
"""Check when we have a valid signed tag."""
desc_result = self.wrapper.RunResult(0, 'v1.0\n', '')
gpg_result = self.wrapper.RunResult(0, '', '')
with mock.patch.object(self.wrapper, 'run_git',
side_effect=(desc_result, gpg_result)):
ret = self.wrapper.verify_rev('/', 'refs/heads/stable', '1234', True)
self.assertEqual('v1.0^0', ret)
def test_unsigned_commit(self):
"""Check we fall back to signed tag when we have an unsigned commit."""
desc_result = self.wrapper.RunResult(0, 'v1.0-10-g1234\n', '')
gpg_result = self.wrapper.RunResult(0, '', '')
with mock.patch.object(self.wrapper, 'run_git',
side_effect=(desc_result, gpg_result)):
ret = self.wrapper.verify_rev('/', 'refs/heads/stable', '1234', True)
self.assertEqual('v1.0^0', ret)
def test_verify_fails(self):
"""Check we fall back to signed tag when we have an unsigned commit."""
desc_result = self.wrapper.RunResult(0, 'v1.0-10-g1234\n', '')
gpg_result = Exception
with mock.patch.object(self.wrapper, 'run_git',
side_effect=(desc_result, gpg_result)):
with self.assertRaises(Exception):
self.wrapper.verify_rev('/', 'refs/heads/stable', '1234', True)
class GitCheckoutTestCase(RepoWrapperTestCase):
"""Tests that use a real/small git checkout."""
GIT_DIR = None
REV_LIST = None
@classmethod
def setUpClass(cls):
# Create a repo to operate on, but do it once per-class.
cls.GIT_DIR = tempfile.mkdtemp(prefix='repo-rev-tests')
run_git = wrapper.Wrapper().run_git
remote = os.path.join(cls.GIT_DIR, 'remote')
os.mkdir(remote)
run_git('init', cwd=remote)
run_git('commit', '--allow-empty', '-minit', cwd=remote)
run_git('branch', 'stable', cwd=remote)
run_git('tag', 'v1.0', cwd=remote)
run_git('commit', '--allow-empty', '-m2nd commit', cwd=remote)
cls.REV_LIST = run_git('rev-list', 'HEAD', cwd=remote).stdout.splitlines()
run_git('init', cwd=cls.GIT_DIR)
run_git('fetch', remote, '+refs/heads/*:refs/remotes/origin/*', cwd=cls.GIT_DIR)
@classmethod
def tearDownClass(cls):
if not cls.GIT_DIR:
return
shutil.rmtree(cls.GIT_DIR)
class ResolveRepoRev(GitCheckoutTestCase):
"""Check resolve_repo_rev behavior."""
def test_explicit_branch(self):
"""Check refs/heads/branch argument."""
rrev, lrev = self.wrapper.resolve_repo_rev(self.GIT_DIR, 'refs/heads/stable')
self.assertEqual('refs/heads/stable', rrev)
self.assertEqual(self.REV_LIST[1], lrev)
with self.assertRaises(wrapper.CloneFailure):
self.wrapper.resolve_repo_rev(self.GIT_DIR, 'refs/heads/unknown')
def test_explicit_tag(self):
"""Check refs/tags/tag argument."""
rrev, lrev = self.wrapper.resolve_repo_rev(self.GIT_DIR, 'refs/tags/v1.0')
self.assertEqual('refs/tags/v1.0', rrev)
self.assertEqual(self.REV_LIST[1], lrev)
with self.assertRaises(wrapper.CloneFailure):
self.wrapper.resolve_repo_rev(self.GIT_DIR, 'refs/tags/unknown')
def test_branch_name(self):
"""Check branch argument."""
rrev, lrev = self.wrapper.resolve_repo_rev(self.GIT_DIR, 'stable')
self.assertEqual('refs/heads/stable', rrev)
self.assertEqual(self.REV_LIST[1], lrev)
rrev, lrev = self.wrapper.resolve_repo_rev(self.GIT_DIR, 'master')
self.assertEqual('refs/heads/master', rrev)
self.assertEqual(self.REV_LIST[0], lrev)
def test_tag_name(self):
"""Check tag argument."""
rrev, lrev = self.wrapper.resolve_repo_rev(self.GIT_DIR, 'v1.0')
self.assertEqual('refs/tags/v1.0', rrev)
self.assertEqual(self.REV_LIST[1], lrev)
def test_full_commit(self):
"""Check specific commit argument."""
commit = self.REV_LIST[0]
rrev, lrev = self.wrapper.resolve_repo_rev(self.GIT_DIR, commit)
self.assertEqual(commit, rrev)
self.assertEqual(commit, lrev)
def test_partial_commit(self):
"""Check specific (partial) commit argument."""
commit = self.REV_LIST[0][0:20]
rrev, lrev = self.wrapper.resolve_repo_rev(self.GIT_DIR, commit)
self.assertEqual(self.REV_LIST[0], rrev)
self.assertEqual(self.REV_LIST[0], lrev)
def test_unknown(self):
"""Check unknown ref/commit argument."""
with self.assertRaises(wrapper.CloneFailure):
self.wrapper.resolve_repo_rev(self.GIT_DIR, 'boooooooya')
class CheckRepoVerify(RepoWrapperTestCase):
"""Check check_repo_verify behavior."""
def test_no_verify(self):
"""Always fail with --no-repo-verify."""
self.assertFalse(self.wrapper.check_repo_verify(False))
def test_gpg_initialized(self):
"""Should pass if gpg is setup already."""
with mock.patch.object(self.wrapper, 'NeedSetupGnuPG', return_value=False):
self.assertTrue(self.wrapper.check_repo_verify(True))
def test_need_gpg_setup(self):
"""Should pass/fail based on gpg setup."""
with mock.patch.object(self.wrapper, 'NeedSetupGnuPG', return_value=True):
with mock.patch.object(self.wrapper, 'SetupGnuPG') as m:
m.return_value = True
self.assertTrue(self.wrapper.check_repo_verify(True))
m.return_value = False
self.assertFalse(self.wrapper.check_repo_verify(True))
class CheckRepoRev(GitCheckoutTestCase):
"""Check check_repo_rev behavior."""
def test_verify_works(self):
"""Should pass when verification passes."""
with mock.patch.object(self.wrapper, 'check_repo_verify', return_value=True):
with mock.patch.object(self.wrapper, 'verify_rev', return_value='12345'):
rrev, lrev = self.wrapper.check_repo_rev(self.GIT_DIR, 'stable')
self.assertEqual('refs/heads/stable', rrev)
self.assertEqual('12345', lrev)
def test_verify_fails(self):
"""Should fail when verification fails."""
with mock.patch.object(self.wrapper, 'check_repo_verify', return_value=True):
with mock.patch.object(self.wrapper, 'verify_rev', side_effect=Exception):
with self.assertRaises(Exception):
self.wrapper.check_repo_rev(self.GIT_DIR, 'stable')
def test_verify_ignore(self):
"""Should pass when verification is disabled."""
with mock.patch.object(self.wrapper, 'verify_rev', side_effect=Exception):
rrev, lrev = self.wrapper.check_repo_rev(self.GIT_DIR, 'stable', repo_verify=False)
self.assertEqual('refs/heads/stable', rrev)
self.assertEqual(self.REV_LIST[1], lrev)
if __name__ == '__main__':
  unittest.main()
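The ParseGitVersion cases removed above pin down a small contract: accept both plain and distro-extended `git version` strings, return None for anything else, and expose major/minor/micro plus the full version text. A sketch that satisfies those expectations follows; the namedtuple and regex are illustrative, not the launcher's actual parser.

# Illustrative parser matching the ParseGitVersion expectations above.
import collections
import re

GitVersion = collections.namedtuple('GitVersion', 'major minor micro full')

def parse_git_version(ver_str):
  match = re.match(r'^git version (\d+)\.(\d+)\.(\d+)(.*)$', ver_str.strip())
  if not match:
    return None
  major, minor, micro = (int(x) for x in match.groups()[:3])
  return GitVersion(major, minor, micro, ver_str.strip()[len('git version '):])

For example, parse_git_version('git version 2.25.1') yields major 2, minor 25, micro 1, and full '2.25.1', while 'asdf' yields None.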

14
tox.ini

@@ -15,18 +15,8 @@
 # https://tox.readthedocs.io/

 [tox]
-envlist = py36, py37, py38
-
-[gh-actions]
-python =
-    3.6: py36
-    3.7: py37
-    3.8: py38
+envlist = py27, py36, py37, py38

 [testenv]
 deps = pytest
-commands = {envpython} run_tests
-setenv =
-    GIT_AUTHOR_NAME = Repo test author
-    GIT_COMMITTER_NAME = Repo test committer
-    EMAIL = repo@gerrit.nodomain
+commands = {toxinidir}/run_tests


@@ -27,10 +27,7 @@ import os
 def WrapperPath():
   return os.path.join(os.path.dirname(__file__), 'repo')
 _wrapper_module = None
 def Wrapper():
   global _wrapper_module
   if not _wrapper_module: