Compare commits


No commits in common. "master" and "v1.13.7" have entirely different histories.

79 changed files with 2068 additions and 5597 deletions

.flake8

@@ -1,15 +1,3 @@
[flake8]
-max-line-length=100
-ignore=
-# E111: Indentation is not a multiple of four
-E111,
-# E114: Indentation is not a multiple of four (comment)
-E114,
-# E402: Module level import not at top of file
-E402,
-# E731: do not assign a lambda expression, use a def
-E731,
-# W503: Line break before binary operator
-W503,
-# W504: Line break after binary operator
-W504
+max-line-length=80
+ignore=E111,E114,E402


@@ -1,31 +0,0 @@
# GitHub actions workflow.
# https://help.github.com/en/actions/automating-your-workflow-with-github-actions/workflow-syntax-for-github-actions

name: Test CI

on:
  push:
    branches: [master, repo-1, stable, maint]
    tags: [v*]

jobs:
  test:
    strategy:
      fail-fast: false
      matrix:
        os: [ubuntu-latest, macos-latest, windows-latest]
        python-version: [3.6, 3.7, 3.8]
    runs-on: ${{ matrix.os }}
    steps:
    - uses: actions/checkout@v2
    - name: Set up Python ${{ matrix.python-version }}
      uses: actions/setup-python@v1
      with:
        python-version: ${{ matrix.python-version }}
    - name: Install dependencies
      run: |
        python -m pip install --upgrade pip
        pip install tox tox-gh-actions
    - name: Test with tox
      run: tox

.gitignore

@@ -1,12 +1,3 @@
-*.asc
-*.egg-info/
-*.log
*.pyc
-__pycache__
-/dist
.repopickle_*
/repoc
-/.tox
-# PyCharm related
-/.idea/


@@ -4,7 +4,6 @@ Hu Xiuyun <xiuyun.hu@hisilicon.com> Hu xiuyun <xiuyun.hu@hisilicon.com>
Hu Xiuyun <xiuyun.hu@hisilicon.com> Hu Xiuyun <clouds08@qq.com>
Jelly Chen <chenguodong@huawei.com> chenguodong <chenguodong@huawei.com>
Jia Bi <bijia@xiaomi.com> bijia <bijia@xiaomi.com>
-Jiri Tyr <jiri.tyr@gmail.com> Jiri tyr <jiri.tyr@gmail.com>
JoonCheol Park <jooncheol@gmail.com> Jooncheol Park <jooncheol@gmail.com>
Sergii Pylypenko <x.pelya.x@gmail.com> pelya <x.pelya.x@gmail.com>
Shawn Pearce <sop@google.com> Shawn O. Pearce <sop@google.com>



@@ -1,6 +0,0 @@
graft docs hooks tests
include *.py
include LICENSE
include git_ssh
include repo
include run_tests


@@ -6,50 +6,11 @@ development workflow. Repo is not meant to replace Git, only to make it
easier to work with Git. The repo command is an executable Python script
that you can put anywhere in your path.
-* Homepage: <https://gerrit.googlesource.com/git-repo/>
-* Mailing list: [repo-discuss on Google Groups][repo-discuss]
-* Bug reports: <https://bugs.chromium.org/p/gerrit/issues/list?q=component:repo>
-* Source: <https://gerrit.googlesource.com/git-repo/>
-* Overview: <https://source.android.com/source/developing.html>
-* Docs: <https://source.android.com/source/using-repo.html>
+* Homepage: https://gerrit.googlesource.com/git-repo/
+* Bug reports: https://bugs.chromium.org/p/gerrit/issues/list?q=component:repo
+* Source: https://gerrit.googlesource.com/git-repo/
+* Overview: https://source.android.com/source/developing.html
+* Docs: https://source.android.com/source/using-repo.html
* [repo Manifest Format](./docs/manifest-format.md)
* [repo Hooks](./docs/repo-hooks.md)
* [Submitting patches](./SUBMITTING_PATCHES.md)
-* Running Repo in [Microsoft Windows](./docs/windows.md)
-* GitHub mirror: <https://github.com/GerritCodeReview/git-repo>
-* Postsubmit tests: <https://github.com/GerritCodeReview/git-repo/actions>
-## Contact
-Please use the [repo-discuss] mailing list or [issue tracker] for questions.
-You can [file a new bug report][new-bug] under the "repo" component.
-Please do not e-mail individual developers for support.
-They do not have the bandwidth for it, and often times questions have already
-been asked on [repo-discuss] or bugs posted to the [issue tracker].
-So please search those sites first.
-## Install
-Many distros include repo, so you might be able to install from there.
-```sh
-# Debian/Ubuntu.
-$ sudo apt-get install repo
-# Gentoo.
-$ sudo emerge dev-vcs/repo
-```
-You can install it manually as well as it's a single script.
-```sh
-$ mkdir -p ~/.bin
-$ PATH="${HOME}/.bin:${PATH}"
-$ curl https://storage.googleapis.com/git-repo-downloads/repo > ~/.bin/repo
-$ chmod a+rx ~/.bin/repo
-```
-[new-bug]: https://bugs.chromium.org/p/gerrit/issues/entry?template=Repo+tool+issue
-[issue tracker]: https://bugs.chromium.org/p/gerrit/issues/list?q=component:repo
-[repo-discuss]: https://groups.google.com/forum/#!forum/repo-discuss


@@ -4,7 +4,7 @@
- Make small logical changes.
- Provide a meaningful commit message.
-- Check for coding errors and style nits with flake8.
+- Check for coding errors and style nits with pyflakes and flake8
- Make sure all code is under the Apache License, 2.0.
- Publish your changes for review.
- Make corrections if requested.
@@ -38,65 +38,41 @@ If your description starts to get too long, that's a sign that you
probably need to split up your commit to finer grained pieces.
-## Check for coding errors and style violations with flake8
-Run `flake8` on changed modules:
+## Check for coding errors and style nits with pyflakes and flake8
+### Coding errors
+Run `pyflakes` on changed modules:
+pyflakes file.py
+Ideally there should be no new errors or warnings introduced.
+### Style violations
+Run `flake8` on changes modules:
flake8 file.py
-Note that repo generally follows [Google's Python Style Guide] rather than
-[PEP 8], with a couple of notable exceptions:
-* Indentation is at 2 columns rather than 4
-* The maximum line length is 100 columns rather than 80
-There should be no new errors or warnings introduced.
-Warnings that cannot be avoided without going against the Google Style Guide
-may be suppressed inline individally using a `# noqa` comment as described
-in the [flake8 documentation].
-If there are many occurrences of the same warning, these may be suppressed for
-the entire project in the included `.flake8` file.
-[Google's Python Style Guide]: https://google.github.io/styleguide/pyguide.html
+Note that repo generally follows [Google's python style guide] rather than
+[PEP 8], so it's possible that the output of `flake8` will be quite noisy.
+It's not mandatory to avoid all warnings, but at least the maximum line
+length should be followed.
+If there are many occurrences of the same warning that cannot be
+avoided without going against the Google style guide, these may be
+suppressed in the included `.flake8` file.
+[Google's python style guide]: https://google.github.io/styleguide/pyguide.html
[PEP 8]: https://www.python.org/dev/peps/pep-0008/
-[flake8 documentation]: https://flake8.pycqa.org/en/3.1.1/user/ignoring-errors.html#in-line-ignoring-errors
## Running tests
-We use [pytest](https://pytest.org/) and [tox](https://tox.readthedocs.io/) for
-running tests. You should make sure to install those first.
-To run the full suite against all supported Python versions, simply execute:
-```sh
-$ tox -p auto
-```
-We have [`./run_tests`](./run_tests) which is a simple wrapper around `pytest`:
-```sh
-# Run the full suite against the default Python version.
-$ ./run_tests
-# List each test as it runs.
-$ ./run_tests -v
-# Run a specific unittest module (and all tests in it).
-$ ./run_tests tests/test_git_command.py
-# Run a specific testsuite in a specific unittest module.
-$ ./run_tests tests/test_editor.py::EditString
-# Run a single test.
-$ ./run_tests tests/test_editor.py::EditString::test_cat_editor
-# List all available tests.
-$ ./run_tests --collect-only
-# Run a single test using substring match.
-$ ./run_tests -k test_cat_editor
-```
-The coverage isn't great currently, but it should still be run for all commits.
+There is a [`./run_tests`](./run_tests) helper script for quickly invoking all
+of our unittests. The coverage isn't great currently, but it should still be
+run for all commits.
Adding more unittests for changes you make would be greatly appreciated :).
Check out the [tests/](./tests/) subdirectory for more details.


@@ -84,7 +84,6 @@ def _Color(fg=None, bg=None, attr=None):
code = ''
return code
DEFAULT = None


@@ -66,8 +66,7 @@ class Command(object):
usage = self.helpUsage.strip().replace('%prog', me)
except AttributeError:
usage = 'repo %s' % self.NAME
-epilog = 'Run `repo help %s` to view the detailed manual.' % self.NAME
-self._optparse = optparse.OptionParser(usage=usage, epilog=epilog)
+self._optparse = optparse.OptionParser(usage=usage)
self._Options(self._optparse)
return self._optparse
@@ -124,9 +123,9 @@ class Command(object):
project = None
if os.path.exists(path):
oldpath = None
-while (path and
-path != oldpath and
-path != manifest.topdir):
+while path and \
+path != oldpath and \
+path != manifest.topdir:
try:
project = self._by_path[path]
break
@@ -237,7 +236,6 @@ class InteractiveCommand(Command):
"""Command which requires user interaction on the tty and
must not run within a pager, even if the user asks to.
"""
def WantPager(self, _opt):
return False
@@ -246,7 +244,6 @@ class PagedCommand(Command):
"""Command which defaults to output in a pager, as its
display tends to be larger than one screen full.
"""
def WantPager(self, _opt):
return True


@@ -1,232 +0,0 @@
# Repo internal filesystem layout
A reference to the `.repo/` tree in repo client checkouts.
Hopefully it's complete & up-to-date, but who knows!
*** note
**Warning**:
This is meant for developers of the repo project itself as a quick reference.
**Nothing** in here must be construed as ABI, or that repo itself will never
change its internals in backwards incompatible ways.
***
[TOC]
## .repo/ layout
All content under `.repo/` is managed by `repo` itself with few exceptions.
In general, you should not make manual changes in here.
If a setting was initialized using an option to `repo init`, you should use that
command to change the setting later on.
It is always safe to re-run `repo init` in existing repo client checkouts.
For example, if you want to change the manifest branch, you can simply run
`repo init --manifest-branch=<new name>` and repo will take care of the rest.
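For example, an existing checkout can be pointed at a different manifest branch simply by re-running init (the branch name below is only illustrative):

```sh
# Re-run init inside the existing client to switch the manifest branch,
# then sync to pick up the new manifest.
$ repo init --manifest-branch=release-branch
$ repo sync
```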
* `config`: Per-repo client checkout settings using [git-config] file format.
* `.repo_config.json`: JSON cache of the `config` file for repo to
read/process quickly.
### repo/ state
* `repo/`: A git checkout of the repo project. This is how `repo` re-execs
itself to get the latest released version.
It tracks the git repository at `REPO_URL` using the `REPO_REV` branch.
Those are specified at `repo init` time using the `--repo-url=<REPO_URL>`
and `--repo-rev=<REPO_REV>` options.
Any changes made to this directory will usually be automatically discarded
by repo itself when it checks for updates. If you want to update to the
latest version of repo, use `repo selfupdate` instead. If you want to
change the git URL/branch that this tracks, re-run `repo init` with the new
settings.
* `.repo_fetchtimes.json`: Used by `repo sync` to record stats when syncing
the various projects.
### Manifests
For more documentation on the manifest format, including the local_manifests
support, see the [manifest-format.md] file.
* `manifests/`: A git checkout of the manifest project. Its `.git/` state
points to the `manifest.git` bare checkout (see below). It tracks the git
branch specified at `repo init` time via `--manifest-branch`.
The local branch name is always `default` regardless of the remote tracking
branch. Do not get confused if the remote branch is not `default`, or if
there is a remote `default` that is completely different!
No manual changes should be made in here as it will just confuse repo and
it won't automatically recover causing no new changes to be picked up.
* `manifests.git/`: A bare checkout of the manifest project. It tracks the
git repository specified at `repo init` time via `--manifest-url`.
No manual changes should be made in here as it will just confuse repo.
If you want to switch the tracking settings, re-run `repo init` with the
new settings.
* `manifest.xml`: The manifest that repo uses. It is generated at `repo init`
and uses the `--manifest-name` to determine what manifest file to load next
out of `manifests/`.
Do not try to modify this to load other manifests as it will confuse repo.
If you want to switch manifest files, re-run `repo init` with the new
setting.
Older versions of repo managed this with symlinks.
* `manifest.xml -> manifests/<manifest-name>.xml`: A symlink to the manifest
that the user wishes to sync. It is specified at `repo init` time via
`--manifest-name`.
* `manifests.git/.repo_config.json`: JSON cache of the `manifests.git/config`
file for repo to read/process quickly.
* `local_manifest.xml` (*Deprecated*): User-authored tweaks to the manifest
used to sync. See [local manifests] for more details.
* `local_manifests/`: Directory of user-authored manifest fragments to tweak
the manifest used to sync. See [local manifests] for more details.
### Project objects
* `project.list`: Tracking file used by `repo sync` to determine when projects
are added or removed and need corresponding updates in the checkout.
* `projects/`: Bare checkouts of every project synced by the manifest. The
filesystem layout matches the `<project path=...` setting in the manifest
(i.e. where it's checked out in the repo client source tree). Those
checkouts will symlink their `.git/` state to paths under here.
Some git state is further split out under `project-objects/`.
* `project-objects/`: Git objects that are safe to share across multiple
git checkouts. The filesystem layout matches the `<project name=...`
setting in the manifest (i.e. the path on the remote server) with a `.git`
suffix. This allows for multiple checkouts of the same remote git repo to
share their objects. For example, you could have different branches of
`foo/bar.git` checked out to `foo/bar-master`, `foo/bar-release`, etc...
There will be multiple trees under `projects/` for each one, but only one
under `project-objects/`.
This layout is designed to allow people to sync against different remotes
(e.g. a local mirror & a public review server) while avoiding duplicating
the content. However, this can run into problems if different remotes use
the same path on their respective servers. Best to avoid that.
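As a rough sketch of the sharing described above (all paths hypothetical), two checkouts of different branches of the same remote repository could end up laid out like this:

```
.repo/projects/foo/bar-master.git       # per-checkout git state
.repo/projects/foo/bar-release.git      # second checkout of the same repo
.repo/project-objects/foo/bar.git       # objects shared by both checkouts
```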
* `subprojects/`: Like `projects/`, but for git submodules.
* `subproject-objects/`: Like `project-objects/`, but for git submodules.
* `worktrees/`: Bare checkouts of every project synced by the manifest. The
filesystem layout matches the `<project name=...` setting in the manifest
(i.e. the path on the remote server) with a `.git` suffix. This has the
same advantages as the `project-objects/` layout above.
This is used when git worktrees are enabled.
### Global settings
The `.repo/manifests.git/config` file is used to track settings for the entire
repo client checkout.
Most settings use the `[repo]` section to avoid conflicts with git.
User controlled settings are initialized when running `repo init`.
| Setting | `repo init` Option | Use/Meaning |
|-------------------|---------------------------|-------------|
| manifest.groups | `--groups` & `--platform` | The manifest groups to sync |
| repo.archive | `--archive` | Use `git archive` for checkouts |
| repo.clonebundle | `--clone-bundle` | Whether the initial sync used clone.bundle explicitly |
| repo.clonefilter | `--clone-filter` | Filter setting when using [partial git clones] |
| repo.depth | `--depth` | Create shallow checkouts when cloning |
| repo.dissociate | `--dissociate` | Dissociate from any reference/mirrors after initial clone |
| repo.mirror | `--mirror` | Checkout is a repo mirror |
| repo.partialclone | `--partial-clone` | Create [partial git clones] |
| repo.reference | `--reference` | Reference repo client checkout |
| repo.submodules | `--submodules` | Sync git submodules |
| repo.worktree | `--worktree` | Use `git worktree` for checkouts |
| user.email | `--config-name` | User's e-mail address; Copied into `.git/config` when checking out a new project |
| user.name | `--config-name` | User's name; Copied into `.git/config` when checking out a new project |
[partial git clones]: https://git-scm.com/docs/gitrepository-layout#_code_partialclone_code
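Because these are plain [git-config] settings, they can be inspected directly when debugging; one way to dump what `repo init` recorded (a sketch, run from the top of the client checkout):

```sh
$ git --git-dir=.repo/manifests.git config --list | grep '^repo\.'
```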
### Repo hooks settings
For more details on this feature, see the [repo-hooks docs](./repo-hooks.md).
We'll just discuss the internal configuration settings.
These are stored in the registered `<repo-hooks>` project itself, so if the
manifest switches to a different project, the settings will not be copied.
| Setting | Use/Meaning |
|--------------------------------------|-------------|
| repo.hooks.\<hook\>.approvedmanifest | User approval for secure manifest sources (e.g. https://) |
| repo.hooks.\<hook\>.approvedhash | User approval for insecure manifest sources (e.g. http://) |
For example, if our manifest had the following entries, we would store settings
under `.repo/projects/src/repohooks.git/config` (which would be reachable via
`git --git-dir=src/repohooks/.git config`).
```xml
<project path="src/repohooks" name="chromiumos/repohooks" ... />
<repo-hooks in-project="chromiumos/repohooks" ... />
```
If `<hook>` is `pre-upload`, the `.git/config` setting might be:
```ini
[repo "hooks.pre-upload"]
approvedmanifest = https://chromium.googlesource.com/chromiumos/manifest
```
## Per-project settings
These settings are somewhat meant to be tweaked by the user on a per-project
basis (e.g. `git config` in a checked out source repo).
Where possible, we re-use standard git settings to avoid confusion, and we
refrain from documenting those, so see [git-config] documentation instead.
See `repo help upload` for documentation on `[review]` settings.
The `[remote]` settings are automatically populated/updated from the manifest.
The `[branch]` settings are updated by `repo start` and `git branch`.
| Setting | Subcommands | Use/Meaning |
|-------------------------------|---------------|-------------|
| review.\<url\>.autocopy | upload | Automatically add to `--cc=<value>` |
| review.\<url\>.autoreviewer | upload | Automatically add to `--reviewers=<value>` |
| review.\<url\>.autoupload | upload | Automatically answer "yes" or "no" to all prompts |
| review.\<url\>.uploadhashtags | upload | Automatically add to `--hashtag=<value>` |
| review.\<url\>.uploadlabels | upload | Automatically add to `--label=<value>` |
| review.\<url\>.uploadnotify | upload | [Notify setting][upload-notify] to use |
| review.\<url\>.uploadtopic | upload | Default [topic] to use |
| review.\<url\>.username | upload | Override username with `ssh://` review URIs |
| remote.\<remote\>.fetch | sync | Set of refs to fetch |
| remote.\<remote\>.projectname | \<network\> | The name of the project as it exists in Gerrit review |
| remote.\<remote\>.pushurl | upload | The base URI for pushing CLs |
| remote.\<remote\>.review | upload | The URI of the Gerrit review server |
| remote.\<remote\>.url | sync & upload | The URI of the git project to fetch |
| branch.\<branch\>.merge | sync & upload | The branch to merge & upload & track |
| branch.\<branch\>.remote | sync & upload | The remote to track |
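For example, a user who always wants `repo upload` to proceed without prompting in one project could set the corresponding key with plain `git config` (the review host below is just a placeholder; the real key uses the project's review URL from the manifest):

```sh
$ cd path/to/project
$ git config review.gerrit.example.com.autoupload true
```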
## ~/ dotconfig layout
Repo will create & maintain a few files in the user's home directory.
* `.repoconfig/`: Repo's per-user directory for all random config files/state.
* `.repoconfig/config`: Per-user settings using [git-config] file format.
* `.repoconfig/keyring-version`: Cache file for checking if the gnupg subdir
has all the same keys as the repo launcher. Used to avoid running gpg
constantly as that can be quite slow.
* `.repoconfig/gnupg/`: GnuPG's internal state directory used when repo needs
to run `gpg`. This provides isolation from the user's normal `~/.gnupg/`.
* `.repoconfig/.repo_config.json`: JSON cache of the `.repoconfig/config`
file for repo to read/process quickly.
* `.repo_.gitconfig.json`: JSON cache of the `.gitconfig` file for repo to
read/process quickly.
[git-config]: https://git-scm.com/docs/git-config
[manifest-format.md]: ./manifest-format.md
[local manifests]: ./manifest-format.md#Local-Manifests
[topic]: https://gerrit-review.googlesource.com/Documentation/intro-user.html#topics
[upload-notify]: https://gerrit-review.googlesource.com/Documentation/user-upload.html#notify


@@ -89,7 +89,6 @@ following DTD:
<!ATTLIST extend-project path CDATA #IMPLIED>
<!ATTLIST extend-project groups CDATA #IMPLIED>
<!ATTLIST extend-project revision CDATA #IMPLIED>
-<!ATTLIST extend-project remote CDATA #IMPLIED>
<!ELEMENT remove-project EMPTY>
<!ATTLIST remove-project name CDATA #REQUIRED>
@@ -307,9 +306,6 @@ belongs. Same syntax as the corresponding element of `project`.
Attribute `revision`: If specified, overrides the revision of the original
project. Same syntax as the corresponding element of `project`.
-Attribute `remote`: If specified, overrides the remote of the original
-project. Same syntax as the corresponding element of `project`.
### Element annotation
Zero or more annotation elements may be specified as children of a
@@ -342,7 +338,7 @@ It's just like copyfile and runs at the same time as copyfile but
instead of copying it creates a symlink.
The symlink is created at "dest" (relative to the top of the tree) and
-points to the path specified by "src" which is a path in the project.
+points to the path specified by "src".
Parent directories of "dest" will be automatically created if missing.
@@ -396,4 +392,10 @@ these extra projects.
Manifest files stored in `$TOP_DIR/.repo/local_manifests/*.xml` will
be loaded in alphabetical order.
-The legacy `$TOP_DIR/.repo/local_manifest.xml` path is no longer supported.
+Additional remotes and projects may also be added through a local
+manifest, stored in `$TOP_DIR/.repo/local_manifest.xml`. This method
+is deprecated in favor of using multiple manifest files as mentioned
+above.
+If `$TOP_DIR/.repo/local_manifest.xml` exists, it will be loaded before
+any manifest files stored in `$TOP_DIR/.repo/local_manifests/*.xml`.
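As a sketch of the `local_manifests/` mechanism described above (file name and project are made up for illustration), a fragment such as `$TOP_DIR/.repo/local_manifests/extra.xml` could layer one additional project onto the sync:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<manifest>
  <!-- Hypothetical extra project; it reuses the default remote from the main manifest. -->
  <project name="example/extra-tools" path="tools/extra" revision="master" />
</manifest>
```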


@@ -7,9 +7,9 @@ their old LTS/corp systems and have little power to change the system.
## Summary
-* Python 3.6 (released Dec 2016) is required by default starting with repo-2.x.
+* Python 3.6 (released Dec 2016) is required by default starting with repo-1.14.
* Older versions of Python (e.g. v2.7) may use the legacy feature-frozen branch
-based on repo-1.x.
+based on repo-1.13.
## Overview


@@ -5,37 +5,6 @@ related topics and flows.
[TOC]
## Schedule
There is no specific schedule for when releases are made.
Usually it's more along the lines of "enough minor changes have been merged",
or "there's a known issue the maintainers know should get fixed".
If you find a fix has been merged for an issue important to you, but hasn't been
released after a week or so, feel free to [contact] us to request a new release.
### Release Freezes {#freeze}
We try to observe a regular schedule for when **not** to release.
If something goes wrong, staff need to be active in order to respond quickly &
effectively.
We also don't want to disrupt non-Google organizations if possible.
We generally follow the rules:
* Release during Mon - Thu, 9:00 - 14:00 [US PT]
* Avoid holidays
* All regular [US holidays]
* Large international ones if possible
* All the various [New Years]
* Jan 1 in Gregorian calendar is the most obvious
* Check for large Lunar New Years too
* Follow the normal [Google production freeze schedule]
[US holidays]: https://en.wikipedia.org/wiki/Federal_holidays_in_the_United_States
[US PT]: https://en.wikipedia.org/wiki/Pacific_Time_Zone
[New Years]: https://en.wikipedia.org/wiki/New_Year
[Google production freeze schedule]: http://goto.google.com/prod-freeze
## Launcher script
The main repo script serves as a standalone program and is often referred to as
@@ -80,11 +49,11 @@ control how repo finds updates:
* `--repo-url`: This tells repo where to clone the full repo project itself.
It defaults to the official project (`REPO_URL` in the launcher script).
-* `--repo-rev`: This tells repo which branch to use for the full project.
+* `--repo-branch`: This tells repo which branch to use for the full project.
It defaults to the `stable` branch (`REPO_REV` in the launcher script).
Whenever `repo sync` is run, repo will check to see if an update is available.
-It fetches the latest repo-rev from the repo-url.
+It fetches the latest repo-branch from the repo-url.
Then it verifies that the latest commit in the branch has a valid signed tag
using `git tag -v` (which uses gpg).
If the tag is valid, then repo will update its internal checkout to it.
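A rough manual approximation of that check (repo performs this internally; the paths, remote, and tag below are illustrative only):

```sh
$ git -C .repo/repo fetch origin stable      # fetch the configured repo branch
$ git -C .repo/repo describe origin/stable   # find the most recent tag on it
$ git -C .repo/repo tag -v v1.13.7           # verify the signed tag via gpg
```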
@@ -122,7 +91,7 @@ When you want to create a new release, you'll need to select a good version and
create a signed tag using a key registered in repo itself.
Typically we just tag the latest version of the `master` branch.
The tag could be pushed now, but it won't be used by clients normally (since the
-default `repo-rev` setting is `stable`).
+default `repo-branch` setting is `stable`).
This would allow some early testing on systems who explicitly select `master`.
### Creating a signed tag
@@ -192,92 +161,7 @@ You can create a short changelog using the command:
$ git log --format="%h (%aN) %s" --no-merges origin/stable..$r
```
## Project References
Here's a table showing the relationship of major tools, their EOL dates, and
their status in Ubuntu & Debian.
Those distros tend to be good indicators of how long we need to support things.
Things in bold indicate stuff to take note of, but does not guarantee that we
still support them.
Things in italics are things we used to care about but probably don't anymore.
| Date | EOL | [Git][rel-g] | [Python][rel-p] | [Ubuntu][rel-u] / [Debian][rel-d] | Git | Python |
|:--------:|:------------:|--------------|-----------------|-----------------------------------|-----|--------|
| Oct 2008 | *Oct 2013* | | 2.6.0 | *10.04 Lucid* - 10.10 Maverick / *Squeeze* |
| Dec 2008 | *Feb 2009* | | 3.0.0 |
| Feb 2009 | *Mar 2012* | | | Debian 5 Lenny | 1.5.6.5 | 2.5.2 |
| Jun 2009 | *Jun 2016* | | 3.1.0 | *10.04 Lucid* - 10.10 Maverick / *Squeeze* |
| Feb 2010 | *Oct 2012* | 1.7.0 | | *10.04 Lucid* - *12.04 Precise* - 12.10 Quantal |
| Apr 2010 | *Apr 2015* | | | *10.04 Lucid* | 1.7.0.4 | 2.6.5 3.1.2 |
| Jul 2010 | *Dec 2019* | | **2.7.0** | 11.04 Natty - **<current>** |
| Oct 2010 | | | | 10.10 Maverick | 1.7.1 | 2.6.6 3.1.3 |
| Feb 2011 | *Feb 2016* | | | Debian 6 Squeeze | 1.7.2.5 | 2.6.6 3.1.3 |
| Apr 2011 | | | | 11.04 Natty | 1.7.4 | 2.7.1 3.2.0 |
| Oct 2011 | *Feb 2016* | | 3.2.0 | 11.04 Natty - 12.10 Quantal |
| Oct 2011 | | | | 11.10 Ocelot | 1.7.5.4 | 2.7.2 3.2.2 |
| Apr 2012 | *Apr 2019* | | | *12.04 Precise* | 1.7.9.5 | 2.7.3 3.2.3 |
| Sep 2012 | *Sep 2017* | | 3.3.0 | 13.04 Raring - 13.10 Saucy |
| Oct 2012 | *Dec 2014* | 1.8.0 | | 13.04 Raring - 13.10 Saucy |
| Oct 2012 | | | | 12.10 Quantal | 1.7.10.4 | 2.7.3 3.2.3 |
| Apr 2013 | | | | 13.04 Raring | 1.8.1.2 | 2.7.4 3.3.1 |
| May 2013 | *May 2018* | | | Debian 7 Wheezy | 1.7.10.4 | 2.7.3 3.2.3 |
| Oct 2013 | | | | 13.10 Saucy | 1.8.3.2 | 2.7.5 3.3.2 |
| Feb 2014 | *Dec 2014* | **1.9.0** | | **14.04 Trusty** |
| Mar 2014 | *Mar 2019* | | **3.4.0** | **14.04 Trusty** - 15.10 Wily / **Jessie** |
| Apr 2014 | **Apr 2022** | | | **14.04 Trusty** | 1.9.1 | 2.7.5 3.4.0 |
| May 2014 | *Dec 2014* | 2.0.0 |
| Aug 2014 | *Dec 2014* | **2.1.0** | | 14.10 Utopic - 15.04 Vivid / **Jessie** |
| Oct 2014 | | | | 14.10 Utopic | 2.1.0 | 2.7.8 3.4.2 |
| Nov 2014 | *Sep 2015* | 2.2.0 |
| Feb 2015 | *Sep 2015* | 2.3.0 |
| Apr 2015 | *May 2017* | 2.4.0 |
| Apr 2015 | **Jun 2020** | | | **Debian 8 Jessie** | 2.1.4 | 2.7.9 3.4.2 |
| Apr 2015 | | | | 15.04 Vivid | 2.1.4 | 2.7.9 3.4.3 |
| Jul 2015 | *May 2017* | 2.5.0 | | 15.10 Wily |
| Sep 2015 | *May 2017* | 2.6.0 |
| Sep 2015 | **Sep 2020** | | **3.5.0** | **16.04 Xenial** - 17.04 Zesty / **Stretch** |
| Oct 2015 | | | | 15.10 Wily | 2.5.0 | 2.7.9 3.4.3 |
| Jan 2016 | *Jul 2017* | **2.7.0** | | **16.04 Xenial** |
| Mar 2016 | *Jul 2017* | 2.8.0 |
| Apr 2016 | **Apr 2024** | | | **16.04 Xenial** | 2.7.4 | 2.7.11 3.5.1 |
| Jun 2016 | *Jul 2017* | 2.9.0 | | 16.10 Yakkety |
| Sep 2016 | *Sep 2017* | 2.10.0 |
| Oct 2016 | | | | 16.10 Yakkety | 2.9.3 | 2.7.11 3.5.1 |
| Nov 2016 | *Sep 2017* | **2.11.0** | | 17.04 Zesty / **Stretch** |
| Dec 2016 | **Dec 2021** | | **3.6.0** | 17.10 Artful - **18.04 Bionic** - 18.10 Cosmic |
| Feb 2017 | *Sep 2017* | 2.12.0 |
| Apr 2017 | | | | 17.04 Zesty | 2.11.0 | 2.7.13 3.5.3 |
| May 2017 | *May 2018* | 2.13.0 |
| Jun 2017 | **Jun 2022** | | | **Debian 9 Stretch** | 2.11.0 | 2.7.13 3.5.3 |
| Aug 2017 | *Dec 2019* | 2.14.0 | | 17.10 Artful |
| Oct 2017 | *Dec 2019* | 2.15.0 |
| Oct 2017 | | | | 17.10 Artful | 2.14.1 | 2.7.14 3.6.3 |
| Jan 2018 | *Dec 2019* | 2.16.0 |
| Apr 2018 | *Dec 2019* | 2.17.0 | | **18.04 Bionic** |
| Apr 2018 | **Apr 2028** | | | **18.04 Bionic** | 2.17.0 | 2.7.15 3.6.5 |
| Jun 2018 | *Dec 2019* | 2.18.0 |
| Jun 2018 | **Jun 2023** | | 3.7.0 | 19.04 Disco - **20.04 Focal** / **Buster** |
| Sep 2018 | *Dec 2019* | 2.19.0 | | 18.10 Cosmic |
| Oct 2018 | | | | 18.10 Cosmic | 2.19.1 | 2.7.15 3.6.6 |
| Dec 2018 | *Dec 2019* | **2.20.0** | | 19.04 Disco / **Buster** |
| Feb 2019 | *Dec 2019* | 2.21.0 |
| Apr 2019 | | | | 19.04 Disco | 2.20.1 | 2.7.16 3.7.3 |
| Jun 2019 | | 2.22.0 |
| Jul 2019 | **Jul 2024** | | | **Debian 10 Buster** | 2.20.1 | 2.7.16 3.7.3 |
| Aug 2019 | | 2.23.0 |
| Oct 2019 | **Oct 2024** | | 3.8.0 |
| Oct 2019 | | | | 19.10 Eoan | 2.20.1 | 2.7.17 3.7.5 |
| Nov 2019 | | 2.24.0 |
| Jan 2020 | | 2.25.0 | | **20.04 Focal** |
| Apr 2020 | **Apr 2030** | | | **20.04 Focal** | 2.25.0 | 2.7.17 3.7.5 |
[contact]: ../README.md#contact
[rel-d]: https://en.wikipedia.org/wiki/Debian_version_history
[rel-g]: https://en.wikipedia.org/wiki/Git#Releases
[rel-p]: https://en.wikipedia.org/wiki/History_of_Python#Table_of_versions
[rel-u]: https://en.wikipedia.org/wiki/Ubuntu_version_history#Table_of_versions
[example announcement]: https://groups.google.com/d/topic/repo-discuss/UGBNismWo1M/discussion
[repo-discuss@googlegroups.com]: https://groups.google.com/forum/#!forum/repo-discuss
[go/repo-release]: https://goto.google.com/repo-release


@@ -1,169 +0,0 @@
# Microsoft Windows Details
Repo is primarily developed on Linux with a lot of users on macOS.
Windows is, unfortunately, not a common platform.
There is support in repo for Windows, but there might be some rough edges.
Keep in mind that Windows in general is "best effort" and "community supported".
That means we don't actively test or verify behavior, but rely heavily on users
to report problems back to us, and to contribute fixes as needed.
[TOC]
## Windows
We only support Windows 10 or newer.
This is largely due to symlinks not being available in older versions, but it's
also due to most developers not using Windows.
We will never add code specific to older versions of Windows.
It might work, but it most likely won't, so please don't bother asking.
## Git worktrees
*** note
**Warning**: Repo's support for Git worktrees is new & experimental.
Please report any bugs and be sure to maintain backups!
***
The Repo 2.4 release introduced support for [Git worktrees][git-worktree].
You don't have to worry about or understand this particular feature, so don't
worry if this section of the Git manual is particularly impenetrable.
The salient point is that Git worktrees allow Repo to create repo client
checkouts that do not require symlinks at all under Windows.
This means users no longer need Administrator access to sync code.
Simply use `--worktree` when running `repo init` to opt in.
This does not affect specific Git repositories that use symlinks themselves.
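Opting in is a normal `repo init` flag; a minimal example (the manifest URL is a placeholder):

```sh
$ repo init --worktree -u https://example.googlesource.com/platform/manifest
$ repo sync
```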
[git-worktree]: https://git-scm.com/docs/git-worktree
## Symlinks by default
*** note
**NB**: This section applies to the default Repo behavior which does not use
Git worktrees (see the previous section for more info).
***
Repo will use symlinks heavily internally.
On *NIX platforms, this isn't an issue, but Windows makes it a bit difficult.
There are some documents out there for how to do this, but usually the easiest
answer is to run your shell as an Administrator and invoke repo/git in that.
This isn't a great solution, but Windows doesn't make this easy, so here we are.
### Launch Git Bash
If you install Git Bash (see below), you can launch that with appropriate
permissions so that all programs "just work".
* Open the Start Menu (i.e. press the ⊞ key).
* Find/search for "Git Bash".
* Right click it and select "Run as administrator".
*** note
**NB**: This environment is only needed when running `repo`, or any specific `git`
command that might involve symlinks (e.g. `pull` or `checkout`).
You do not need to run all your commands in here such as your editor.
***
### Symlinks with GNU tools
If you want to use `ln -s` inside of the default Git/bash shell, you might need
to export this environment variable:
```sh
$ export MSYS="winsymlinks:nativestrict"
```
Otherwise `ln -s` will copy files and not actually create a symlink.
This also helps `tar` unpack symlinks, so that's nice.
### References
* https://github.com/git-for-windows/git/wiki/Symbolic-Links
* https://blogs.windows.com/windowsdeveloper/2016/12/02/symlinks-windows-10/
## Python
Python 3.6 or newer is required.
Python 2 is known to be broken when running under Windows.
See our [Python Support](./python-support.md) document for more details.
You can grab the latest Windows installer here:<br>
https://www.python.org/downloads/release/python-3
## Git
You should install the most recent version of Git for Windows:<br>
https://git-scm.com/download/win
When installing, make sure to turn on "Enable symbolic links" when prompted.
If you've already installed Git for Windows, you can simply download the latest
installer from above and run it again.
It should safely upgrade things in situ for you.
This is useful if you want to switch the symbolic link option after the fact.
## Shell
We don't have a specific requirement for shell environments when running repo.
Most developers use MinTTY/bash that's included with the Git for Windows install
(so see above for installing Git).
Command & Powershell & the Windows Terminal probably work.
Who knows!
## FAQ
### repo upload always complains about allowing hooks or using --no-verify!
When using `repo upload` in projects that have custom repohooks, you might get
an error like the following:
```sh
$ repo upload
ERROR: You must allow the pre-upload hook or use --no-verify.
```
This can be confusing as you never get prompted.
[MinTTY has a bug][mintty] that breaks isatty checking inside of repo which
causes repo to never interactively prompt the user which means the upload check
always fails.
You can workaround this by manually granting consent when uploading.
Simply add the `--verify` option whenever uploading:
```sh
$ repo upload --verify
```
You will have to specify this flag every time you upload.
[mintty]: https://github.com/mintty/mintty/issues/56
### repohooks always fail with a close_fds error.
When using the [reference repohooks project][repohooks] included in AOSP,
you might see errors like this when running `repo upload`:
```sh
$ repo upload
ERROR: Traceback (most recent call last):
...
File "C:\...\lib\subprocess.py", line 351, in __init__
raise ValueError("close_fds is not supported on Windows "
ValueError: close_fds is not supported on Windows platforms if you redirect stdin/stderr/stdout
Failed to run main() for pre-upload hook; see traceback above.
```
This error shows up when using Python 2.
You should upgrade to Python 3 instead (see above).
If you already have Python 3 installed, make sure it's the default version.
Running `python --version` should say `Python 3`, not `Python 2`.
If you didn't install the Python versions, or don't have permission to change
the default version, you can probably workaround this by changing `$PATH` in
your shell so the Python 3 version is found first.
[repohooks]: https://android.googlesource.com/platform/tools/repohooks


@@ -24,7 +24,6 @@ import tempfile
from error import EditorError
import platform_utils
class Editor(object):
"""Manages the user's preferred text editor."""
@@ -58,7 +57,7 @@ class Editor(object):
if os.getenv('TERM') == 'dumb':
print(
"""No editor specified in GIT_EDITOR, core.editor, VISUAL or EDITOR.
Tried to fall back to vi but terminal is dumb. Please configure at
least one of these before using this command.""", file=sys.stderr)
sys.exit(1)
@@ -70,13 +69,10 @@ least one of these before using this command.""", file=sys.stderr)
"""Opens an editor to edit the given content.
Args:
-data: The text to edit.
+data : the text to edit
Returns:
-New value of edited text.
+new value of edited text; None if editing did not succeed
-Raises:
-EditorError: The editor failed to run.
"""
editor = cls._GetEditor()
if editor == ':':
@@ -84,7 +80,7 @@ least one of these before using this command.""", file=sys.stderr)
fd, path = tempfile.mkstemp()
try:
-os.write(fd, data.encode('utf-8'))
+os.write(fd, data)
os.close(fd)
fd = None
@@ -110,8 +106,11 @@ least one of these before using this command.""", file=sys.stderr)
raise EditorError('editor failed with exit status %d: %s %s'
% (rc, editor, path))
-with open(path, mode='rb') as fd2:
-return fd2.read().decode('utf-8')
+fd2 = open(path)
+try:
+return fd2.read()
+finally:
+fd2.close()
finally:
if fd:
os.close(fd)


@@ -14,26 +14,17 @@
# See the License for the specific language governing permissions and
# limitations under the License.
class ManifestParseError(Exception):
"""Failed to parse the manifest file.
"""
class ManifestInvalidRevisionError(Exception):
"""The revision value in a project is incorrect.
"""
-class ManifestInvalidPathError(Exception):
-"""A path used in <copyfile> or <linkfile> is incorrect.
-"""
class NoManifestException(Exception):
"""The required manifest does not exist.
"""
def __init__(self, path, reason):
super(NoManifestException, self).__init__()
self.path = path
@@ -42,11 +33,9 @@ class NoManifestException(Exception):
def __str__(self):
return self.reason
class EditorError(Exception):
"""Unspecified error from the user's text editor.
"""
def __init__(self, reason):
super(EditorError, self).__init__()
self.reason = reason
@@ -54,11 +43,9 @@ class EditorError(Exception):
def __str__(self):
return self.reason
class GitError(Exception):
"""Unspecified internal error from git.
"""
def __init__(self, command):
super(GitError, self).__init__()
self.command = command
@@ -66,11 +53,9 @@ class GitError(Exception):
def __str__(self):
return self.command
class UploadError(Exception):
"""A bundle upload to Gerrit did not succeed.
"""
def __init__(self, reason):
super(UploadError, self).__init__()
self.reason = reason
@@ -78,11 +63,9 @@ class UploadError(Exception):
def __str__(self):
return self.reason
class DownloadError(Exception):
"""Cannot download a repository.
"""
def __init__(self, reason):
super(DownloadError, self).__init__()
self.reason = reason
@@ -90,11 +73,9 @@ class DownloadError(Exception):
def __str__(self):
return self.reason
class NoSuchProjectError(Exception):
"""A specified project does not exist in the work tree.
"""
def __init__(self, name=None):
super(NoSuchProjectError, self).__init__()
self.name = name
@@ -108,7 +89,6 @@ class NoSuchProjectError(Exception):
class InvalidProjectGroupsError(Exception):
"""A specified project is not suitable for the specified groups
"""
def __init__(self, name=None):
super(InvalidProjectGroupsError, self).__init__()
self.name = name
@@ -118,18 +98,15 @@ class InvalidProjectGroupsError(Exception):
return 'in current directory'
return self.name
class RepoChangedException(Exception):
"""Thrown if 'repo sync' results in repo updating its internal
repo or manifest repositories. In this special case we must
use exec to re-execute repo with the new code and manifest.
"""
def __init__(self, extra_args=None):
super(RepoChangedException, self).__init__()
self.extra_args = extra_args or []
class HookError(Exception):
"""Thrown if a 'repo-hook' could not be run.


@@ -23,7 +23,6 @@ TASK_COMMAND = 'command'
TASK_SYNC_NETWORK = 'sync-network'
TASK_SYNC_LOCAL = 'sync-local'
class EventLog(object):
"""Event log that records events that occurred during a repo invocation.
@@ -166,7 +165,6 @@ class EventLog(object):
# An integer id that is unique across this invocation of the program.
_EVENT_ID = multiprocessing.Value('i', 1)
def _NextEventId():
"""Helper function for grabbing the next unique id.


@@ -16,7 +16,6 @@
from __future__ import print_function
import os
-import re
import sys
import subprocess
import tempfile
@@ -29,17 +28,7 @@ from repo_trace import REPO_TRACE, IsTrace, Trace
from wrapper import Wrapper
GIT = 'git'
-# NB: These do not need to be kept in sync with the repo launcher script.
-# These may be much newer as it allows the repo launcher to roll between
-# different repo releases while source versions might require a newer git.
-#
-# The soft version is when we start warning users that the version is old and
-# we'll be dropping support for it. We'll refuse to work with versions older
-# than the hard version.
-#
-# git-1.7 is in (EOL) Ubuntu Precise. git-1.9 is in Ubuntu Trusty.
-MIN_GIT_VERSION_SOFT = (1, 9, 1)
-MIN_GIT_VERSION_HARD = (1, 7, 2)
+MIN_GIT_VERSION = (1, 5, 4)
GIT_DIR = 'GIT_DIR'
LAST_GITDIR = None
@@ -48,36 +37,6 @@ LAST_CWD = None
_ssh_proxy_path = None
_ssh_sock_path = None
_ssh_clients = []
_ssh_version = None
def _run_ssh_version():
"""run ssh -V to display the version number"""
return subprocess.check_output(['ssh', '-V'], stderr=subprocess.STDOUT).decode()
def _parse_ssh_version(ver_str=None):
"""parse a ssh version string into a tuple"""
if ver_str is None:
ver_str = _run_ssh_version()
m = re.match(r'^OpenSSH_([0-9.]+)(p[0-9]+)?\s', ver_str)
if m:
return tuple(int(x) for x in m.group(1).split('.'))
else:
return ()
def ssh_version():
"""return ssh version as a tuple"""
global _ssh_version
if _ssh_version is None:
try:
_ssh_version = _parse_ssh_version()
except subprocess.CalledProcessError:
print('fatal: unable to detect ssh version', file=sys.stderr)
sys.exit(1)
return _ssh_version
def ssh_sock(create=True):
global _ssh_sock_path
@@ -87,16 +46,11 @@ def ssh_sock(create=True):
tmp_dir = '/tmp'
if not os.path.exists(tmp_dir):
tmp_dir = tempfile.gettempdir()
-if ssh_version() < (6, 7):
-tokens = '%r@%h:%p'
-else:
-tokens = '%C' # hash of %l%h%p%r
_ssh_sock_path = os.path.join(
tempfile.mkdtemp('', 'ssh-', tmp_dir),
-'master-' + tokens)
+'master-%r@%h:%p')
return _ssh_sock_path
def _ssh_proxy():
global _ssh_proxy_path
if _ssh_proxy_path is None:
@@ -105,18 +59,15 @@
'git_ssh')
return _ssh_proxy_path
def _add_ssh_client(p):
_ssh_clients.append(p)
def _remove_ssh_client(p):
try:
_ssh_clients.remove(p)
except ValueError:
pass
def terminate_ssh_clients():
global _ssh_clients
for p in _ssh_clients:
@@ -127,10 +78,8 @@
pass
_ssh_clients = []
_git_version = None
class _GitCall(object):
def version_tuple(self):
global _git_version
@@ -142,15 +91,12 @@ class _GitCall(object):
return _git_version
def __getattr__(self, name):
-name = name.replace('_', '-')
+name = name.replace('_','-')
def fun(*cmdv):
command = [name]
command.extend(cmdv)
return GitCommand(None, command).Wait() == 0
return fun
git = _GitCall()
@@ -231,10 +177,8 @@ class UserAgent(object):
return self._git_ua
user_agent = UserAgent()
def git_require(min_version, fail=False, msg=''):
git_version = git.version_tuple()
if min_version <= git_version:
@@ -247,41 +191,42 @@ def git_require(min_version, fail=False, msg=''):
sys.exit(1)
return False
+def _setenv(env, name, value):
+env[name] = value.encode()
class GitCommand(object):
def __init__(self,
project,
cmdv,
-bare=False,
-provide_stdin=False,
-capture_stdout=False,
-capture_stderr=False,
-merge_output=False,
-disable_editor=False,
-ssh_proxy=False,
-cwd=None,
-gitdir=None):
+bare = False,
+provide_stdin = False,
+capture_stdout = False,
+capture_stderr = False,
+disable_editor = False,
+ssh_proxy = False,
+cwd = None,
+gitdir = None):
env = self._GetBasicEnv()
# If we are not capturing std* then need to print it.
self.tee = {'stdout': not capture_stdout, 'stderr': not capture_stderr}
if disable_editor:
-env['GIT_EDITOR'] = ':'
+_setenv(env, 'GIT_EDITOR', ':')
if ssh_proxy:
-env['REPO_SSH_SOCK'] = ssh_sock()
-env['GIT_SSH'] = _ssh_proxy()
-env['GIT_SSH_VARIANT'] = 'ssh'
+_setenv(env, 'REPO_SSH_SOCK', ssh_sock())
+_setenv(env, 'GIT_SSH', _ssh_proxy())
+_setenv(env, 'GIT_SSH_VARIANT', 'ssh')
if 'http_proxy' in env and 'darwin' == sys.platform:
s = "'http.proxy=%s'" % (env['http_proxy'],)
p = env.get('GIT_CONFIG_PARAMETERS')
if p is not None:
s = p + ' ' + s
-env['GIT_CONFIG_PARAMETERS'] = s
+_setenv(env, 'GIT_CONFIG_PARAMETERS', s)
if 'GIT_ALLOW_PROTOCOL' not in env:
-env['GIT_ALLOW_PROTOCOL'] = (
-'file:git:http:https:ssh:persistent-http:persistent-https:sso:rpc')
-env['GIT_HTTP_USER_AGENT'] = user_agent.git
+_setenv(env, 'GIT_ALLOW_PROTOCOL',
+'file:git:http:https:ssh:persistent-http:persistent-https:sso:rpc')
+_setenv(env, 'GIT_HTTP_USER_AGENT', user_agent.git)
if project:
if not cwd:
@@ -292,7 +237,7 @@ class GitCommand(object):
command = [GIT]
if bare:
if gitdir:
-env[GIT_DIR] = gitdir
+_setenv(env, GIT_DIR, gitdir)
cwd = None
command.append(cmdv[0])
# Need to use the --progress flag for fetch/clone so output will be
@@ -308,7 +253,7 @@
stdin = None
stdout = subprocess.PIPE
-stderr = subprocess.STDOUT if merge_output else subprocess.PIPE
+stderr = subprocess.PIPE
if IsTrace():
global LAST_CWD
@@ -336,17 +281,15 @@
dbg += ' 1>|'
if stderr == subprocess.PIPE:
dbg += ' 2>|'
-elif stderr == subprocess.STDOUT:
-dbg += ' 2>&1'
Trace('%s', dbg)
try:
p = subprocess.Popen(command,
-cwd=cwd,
-env=env,
-stdin=stdin,
-stdout=stdout,
-stderr=stderr)
+cwd = cwd,
+env = env,
+stdin = stdin,
+stdout = stdout,
+stderr = stderr)
except Exception as e:
raise GitError('%s: %s' % (command[1], e))
@@ -385,7 +328,6 @@ class GitCommand(object):
p = self.process
s_in = platform_utils.FileDescriptorStreams.create()
s_in.add(p.stdout, sys.stdout, 'stdout')
-if p.stderr is not None:
s_in.add(p.stderr, sys.stderr, 'stderr')
self.stdout = ''
self.stderr = ''


@@ -21,7 +21,6 @@ import errno
import json
import os
import re
-import signal
import ssl
import subprocess
import sys
@@ -42,6 +41,7 @@ else:
urllib.request = urllib2
urllib.error = urllib2
+from signal import SIGTERM
from error import GitError, UploadError
import platform_utils
from repo_trace import Trace
@@ -59,47 +59,39 @@ ID_RE = re.compile(r'^[0-9a-f]{40}$')
REVIEW_CACHE = dict()
def IsChange(rev):
return rev.startswith(R_CHANGES)
def IsId(rev):
return ID_RE.match(rev)
def IsTag(rev):
return rev.startswith(R_TAGS)
def IsImmutable(rev):
return IsChange(rev) or IsId(rev) or IsTag(rev)
def _key(name):
parts = name.split('.')
if len(parts) < 2:
return name.lower()
-parts[0] = parts[0].lower()
+parts[ 0] = parts[ 0].lower()
parts[-1] = parts[-1].lower()
return '.'.join(parts)
class GitConfig(object):
_ForUser = None
-_USER_CONFIG = '~/.gitconfig'
@classmethod
def ForUser(cls):
if cls._ForUser is None:
-cls._ForUser = cls(configfile=os.path.expanduser(cls._USER_CONFIG))
+cls._ForUser = cls(configfile = os.path.expanduser('~/.gitconfig'))
return cls._ForUser
@classmethod
def ForRepository(cls, gitdir, defaults=None):
-return cls(configfile=os.path.join(gitdir, 'config'),
-defaults=defaults)
+return cls(configfile = os.path.join(gitdir, 'config'),
+defaults = defaults)
def __init__(self, configfile, defaults=None, jsonFile=None):
self.file = configfile
@@ -115,52 +107,15 @@ class GitConfig(object):
os.path.dirname(self.file),
'.repo_' + os.path.basename(self.file) + '.json')
-def Has(self, name, include_defaults=True):
+def Has(self, name, include_defaults = True):
"""Return true if this configuration file has the key.
"""
if _key(name) in self._cache:
return True
if include_defaults and self.defaults:
-return self.defaults.Has(name, include_defaults=True)
+return self.defaults.Has(name, include_defaults = True)
return False
def GetInt(self, name):
"""Returns an integer from the configuration file.
This follows the git config syntax.
Args:
name: The key to lookup.
Returns:
None if the value was not defined, or is not an integer.
Otherwise, the number itself.
"""
v = self.GetString(name)
if v is None:
return None
v = v.strip()
mult = 1
if v.endswith('k'):
v = v[:-1]
mult = 1024
elif v.endswith('m'):
v = v[:-1]
mult = 1024 * 1024
elif v.endswith('g'):
v = v[:-1]
mult = 1024 * 1024 * 1024
base = 10
if v.startswith('0x'):
base = 16
try:
return int(v, base=base) * mult
except ValueError:
return None
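
The new GetInt above follows git's integer syntax (k/m/g suffixes, optional 0x prefix for hex). A self-contained sketch of the same parsing with a few worked values (the helper name is made up):

def _git_int(v):
    # Mirror of the suffix/base handling in GetInt above.
    v = v.strip()
    mult = 1
    if v.endswith('k'):
        v, mult = v[:-1], 1024
    elif v.endswith('m'):
        v, mult = v[:-1], 1024 * 1024
    elif v.endswith('g'):
        v, mult = v[:-1], 1024 * 1024 * 1024
    base = 16 if v.startswith('0x') else 10
    try:
        return int(v, base=base) * mult
    except ValueError:
        return None

assert _git_int('512k') == 512 * 1024
assert _git_int('1g') == 1024 ** 3
assert _git_int('0x20') == 32
assert _git_int('bogus') is None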
def GetBoolean(self, name): def GetBoolean(self, name):
"""Returns a boolean from the configuration file. """Returns a boolean from the configuration file.
None : The value was not defined, or is not a boolean. None : The value was not defined, or is not a boolean.
@ -187,7 +142,7 @@ class GitConfig(object):
v = self._cache[_key(name)] v = self._cache[_key(name)]
except KeyError: except KeyError:
if self.defaults: if self.defaults:
return self.defaults.GetString(name, all_keys=all_keys) return self.defaults.GetString(name, all_keys = all_keys)
v = [] v = []
if not all_keys: if not all_keys:
@ -198,7 +153,7 @@ class GitConfig(object):
r = [] r = []
r.extend(v) r.extend(v)
if self.defaults: if self.defaults:
r.extend(self.defaults.GetString(name, all_keys=True)) r.extend(self.defaults.GetString(name, all_keys = True))
return r return r
def SetString(self, name, value): def SetString(self, name, value):
@ -262,7 +217,7 @@ class GitConfig(object):
""" """
return self._sections.get(section, set()) return self._sections.get(section, set())
def HasSection(self, section, subsection=''): def HasSection(self, section, subsection = ''):
"""Does at least one key in section.subsection exist? """Does at least one key in section.subsection exist?
""" """
try: try:
@ -313,23 +268,30 @@ class GitConfig(object):
def _ReadJson(self): def _ReadJson(self):
try: try:
if os.path.getmtime(self._json) <= os.path.getmtime(self.file): if os.path.getmtime(self._json) \
<= os.path.getmtime(self.file):
platform_utils.remove(self._json) platform_utils.remove(self._json)
return None return None
except OSError: except OSError:
return None return None
try: try:
Trace(': parsing %s', self.file) Trace(': parsing %s', self.file)
with open(self._json) as fd: fd = open(self._json)
try:
return json.load(fd) return json.load(fd)
finally:
fd.close()
except (IOError, ValueError): except (IOError, ValueError):
platform_utils.remove(self._json) platform_utils.remove(self._json)
return None return None
def _SaveJson(self, cache): def _SaveJson(self, cache):
try: try:
with open(self._json, 'w') as fd: fd = open(self._json, 'w')
try:
json.dump(cache, fd, indent=2) json.dump(cache, fd, indent=2)
finally:
fd.close()
except (IOError, TypeError): except (IOError, TypeError):
if os.path.exists(self._json): if os.path.exists(self._json):
platform_utils.remove(self._json) platform_utils.remove(self._json)
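
A standalone sketch of the cache rule used by _ReadJson/_SaveJson above: the JSON sidecar is only trusted while it is strictly newer than the config file it mirrors (file names here are illustrative):

import json
import os

def read_cache(json_path, config_path):
    # Drop the cache if it is not newer than the file it was built from.
    try:
        if os.path.getmtime(json_path) <= os.path.getmtime(config_path):
            os.remove(json_path)
            return None
    except OSError:
        return None
    try:
        with open(json_path) as fd:
            return json.load(fd)
    except (IOError, ValueError):
        os.remove(json_path)
        return None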
@ -362,25 +324,19 @@ class GitConfig(object):
return c return c
def _do(self, *args): def _do(self, *args):
command = ['config', '--file', self.file, '--includes'] command = ['config', '--file', self.file]
command.extend(args) command.extend(args)
p = GitCommand(None, p = GitCommand(None,
command, command,
capture_stdout=True, capture_stdout = True,
capture_stderr=True) capture_stderr = True)
if p.Wait() == 0: if p.Wait() == 0:
return p.stdout return p.stdout
else: else:
GitError('git config %s: %s' % (str(args), p.stderr)) GitError('git config %s: %s' % (str(args), p.stderr))
class RepoConfig(GitConfig):
"""User settings for repo itself."""
_USER_CONFIG = '~/.repoconfig/config'
class RefSpec(object): class RefSpec(object):
"""A Git refspec line, split into its components: """A Git refspec line, split into its components:
@ -442,7 +398,6 @@ _master_keys = set()
_ssh_master = True _ssh_master = True
_master_keys_lock = None _master_keys_lock = None
def init_ssh(): def init_ssh():
"""Should be called once at the start of repo to init ssh master handling. """Should be called once at the start of repo to init ssh master handling.
@ -452,7 +407,6 @@ def init_ssh():
assert _master_keys_lock is None, "Should only call init_ssh once" assert _master_keys_lock is None, "Should only call init_ssh once"
_master_keys_lock = _threading.Lock() _master_keys_lock = _threading.Lock()
def _open_ssh(host, port=None): def _open_ssh(host, port=None):
global _ssh_master global _ssh_master
@ -473,16 +427,16 @@ def _open_ssh(host, port=None):
if key in _master_keys: if key in _master_keys:
return True return True
if (not _ssh_master if not _ssh_master \
or 'GIT_SSH' in os.environ or 'GIT_SSH' in os.environ \
or sys.platform in ('win32', 'cygwin')): or sys.platform in ('win32', 'cygwin'):
# failed earlier, or cygwin ssh can't do this # failed earlier, or cygwin ssh can't do this
# #
return False return False
# We will make two calls to ssh; this is the common part of both calls. # We will make two calls to ssh; this is the common part of both calls.
command_base = ['ssh', command_base = ['ssh',
'-o', 'ControlPath %s' % ssh_sock(), '-o','ControlPath %s' % ssh_sock(),
host] host]
if port is not None: if port is not None:
command_base[1:1] = ['-p', str(port)] command_base[1:1] = ['-p', str(port)]
@ -491,7 +445,7 @@ def _open_ssh(host, port=None):
# ...but before actually starting a master, we'll double-check. This can # ...but before actually starting a master, we'll double-check. This can
# be important because we can't tell that that 'git@myhost.com' is the same # be important because we can't tell that that 'git@myhost.com' is the same
# as 'myhost.com' where "User git" is setup in the user's ~/.ssh/config file. # as 'myhost.com' where "User git" is setup in the user's ~/.ssh/config file.
check_command = command_base + ['-O', 'check'] check_command = command_base + ['-O','check']
try: try:
Trace(': %s', ' '.join(check_command)) Trace(': %s', ' '.join(check_command))
check_process = subprocess.Popen(check_command, check_process = subprocess.Popen(check_command,
@ -510,14 +464,16 @@ def _open_ssh(host, port=None):
# to the log there. # to the log there.
pass pass
command = command_base[:1] + ['-M', '-N'] + command_base[1:] command = command_base[:1] + \
['-M', '-N'] + \
command_base[1:]
try: try:
Trace(': %s', ' '.join(command)) Trace(': %s', ' '.join(command))
p = subprocess.Popen(command) p = subprocess.Popen(command)
except Exception as e: except Exception as e:
_ssh_master = False _ssh_master = False
print('\nwarn: cannot enable ssh control master for %s:%s\n%s' print('\nwarn: cannot enable ssh control master for %s:%s\n%s'
% (host, port, str(e)), file=sys.stderr) % (host,port, str(e)), file=sys.stderr)
return False return False
time.sleep(1) time.sleep(1)
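
The ControlMaster dance above is essentially "probe for an existing master, then start one in the background". A hedged sketch of the same two ssh calls (the socket path and host are placeholders, and the option ordering is simplified relative to the code above):

import subprocess

sock = '/tmp/repo-ssh-example'          # placeholder ControlPath
host = 'git@example.com'                # placeholder destination
base = ['ssh', '-o', 'ControlPath %s' % sock]

# 1) Ask whether a master is already listening on the control socket.
rc = subprocess.call(base + ['-O', 'check', host],
                     stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL)
if rc != 0:
    # 2) No master yet: start one in the background (-M) running no command (-N).
    subprocess.Popen(base + ['-M', '-N', host])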
@ -531,7 +487,6 @@ def _open_ssh(host, port=None):
finally: finally:
_master_keys_lock.release() _master_keys_lock.release()
def close_ssh(): def close_ssh():
global _master_keys_lock global _master_keys_lock
@ -539,7 +494,7 @@ def close_ssh():
for p in _master_processes: for p in _master_processes:
try: try:
os.kill(p.pid, signal.SIGTERM) os.kill(p.pid, SIGTERM)
p.wait() p.wait()
except OSError: except OSError:
pass pass
@ -556,18 +511,15 @@ def close_ssh():
# We're done with the lock, so we can delete it. # We're done with the lock, so we can delete it.
_master_keys_lock = None _master_keys_lock = None
URI_SCP = re.compile(r'^([^@:]*@?[^:/]{1,}):') URI_SCP = re.compile(r'^([^@:]*@?[^:/]{1,}):')
URI_ALL = re.compile(r'^([a-z][a-z+-]*)://([^@/]*@?[^/]*)/') URI_ALL = re.compile(r'^([a-z][a-z+-]*)://([^@/]*@?[^/]*)/')
def GetSchemeFromUrl(url): def GetSchemeFromUrl(url):
m = URI_ALL.match(url) m = URI_ALL.match(url)
if m: if m:
return m.group(1) return m.group(1)
return None return None
@contextlib.contextmanager @contextlib.contextmanager
def GetUrlCookieFile(url, quiet): def GetUrlCookieFile(url, quiet):
if url.startswith('persistent-'): if url.startswith('persistent-'):
@ -582,7 +534,7 @@ def GetUrlCookieFile(url, quiet):
cookiefile = None cookiefile = None
proxy = None proxy = None
for line in p.stdout: for line in p.stdout:
line = line.strip().decode('utf-8') line = line.strip()
if line.startswith(cookieprefix): if line.startswith(cookieprefix):
cookiefile = os.path.expanduser(line[len(cookieprefix):]) cookiefile = os.path.expanduser(line[len(cookieprefix):])
if line.startswith(proxyprefix): if line.startswith(proxyprefix):
@ -594,7 +546,7 @@ def GetUrlCookieFile(url, quiet):
finally: finally:
p.stdin.close() p.stdin.close()
if p.wait(): if p.wait():
err_msg = p.stderr.read().decode('utf-8') err_msg = p.stderr.read()
if ' -print_config' in err_msg: if ' -print_config' in err_msg:
pass # Persistent proxy doesn't support -print_config. pass # Persistent proxy doesn't support -print_config.
elif not quiet: elif not quiet:
@ -608,7 +560,6 @@ def GetUrlCookieFile(url, quiet):
cookiefile = os.path.expanduser(cookiefile) cookiefile = os.path.expanduser(cookiefile)
yield cookiefile, None yield cookiefile, None
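
GetUrlCookieFile above is a context manager: it yields the cookie file to hand to git and, for persistent-* transports, any proxy reported by the helper. A usage sketch with an illustrative URL:

with GetUrlCookieFile('https://example.googlesource.com/manifest', quiet=True) as (cookiefile, proxy):
    # cookiefile: path usable as http.cookiefile, or None if unset.
    # proxy: only set when a persistent-* helper reports one.
    print(cookiefile, proxy)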
def _preconnect(url): def _preconnect(url):
m = URI_ALL.match(url) m = URI_ALL.match(url)
if m: if m:
@ -629,11 +580,9 @@ def _preconnect(url):
return False return False
class Remote(object): class Remote(object):
"""Configuration options related to a remote. """Configuration options related to a remote.
""" """
def __init__(self, config, name): def __init__(self, config, name):
self._config = config self._config = config
self.name = name self.name = name
@ -656,8 +605,8 @@ class Remote(object):
insteadOfList = globCfg.GetString(key, all_keys=True) insteadOfList = globCfg.GetString(key, all_keys=True)
for insteadOf in insteadOfList: for insteadOf in insteadOfList:
if (self.url.startswith(insteadOf) if self.url.startswith(insteadOf) \
and len(insteadOf) > len(longest)): and len(insteadOf) > len(longest):
longest = insteadOf longest = insteadOf
longestUrl = url longestUrl = url
@ -788,13 +737,12 @@ class Remote(object):
def _Get(self, key, all_keys=False): def _Get(self, key, all_keys=False):
key = 'remote.%s.%s' % (self.name, key) key = 'remote.%s.%s' % (self.name, key)
return self._config.GetString(key, all_keys=all_keys) return self._config.GetString(key, all_keys = all_keys)
class Branch(object): class Branch(object):
"""Configuration options related to a single branch. """Configuration options related to a single branch.
""" """
def __init__(self, config, name): def __init__(self, config, name):
self._config = config self._config = config
self.name = name self.name = name
@ -825,12 +773,15 @@ class Branch(object):
self._Set('merge', self.merge) self._Set('merge', self.merge)
else: else:
with open(self._config.file, 'a') as fd: fd = open(self._config.file, 'a')
try:
fd.write('[branch "%s"]\n' % self.name) fd.write('[branch "%s"]\n' % self.name)
if self.remote: if self.remote:
fd.write('\tremote = %s\n' % self.remote.name) fd.write('\tremote = %s\n' % self.remote.name)
if self.merge: if self.merge:
fd.write('\tmerge = %s\n' % self.merge) fd.write('\tmerge = %s\n' % self.merge)
finally:
fd.close()
def _Set(self, key, value): def _Set(self, key, value):
key = 'branch.%s.%s' % (self.name, key) key = 'branch.%s.%s' % (self.name, key)
@ -838,4 +789,4 @@ class Branch(object):
def _Get(self, key, all_keys=False): def _Get(self, key, all_keys=False):
key = 'branch.%s.%s' % (self.name, key) key = 'branch.%s.%s' % (self.name, key)
return self._config.GetString(key, all_keys=all_keys) return self._config.GetString(key, all_keys = all_keys)

View File

@ -23,8 +23,6 @@ R_CHANGES = 'refs/changes/'
R_HEADS = 'refs/heads/' R_HEADS = 'refs/heads/'
R_TAGS = 'refs/tags/' R_TAGS = 'refs/tags/'
R_PUB = 'refs/published/' R_PUB = 'refs/published/'
R_WORKTREE = 'refs/worktree/'
R_WORKTREE_M = R_WORKTREE + 'm/'
R_M = 'refs/remotes/m/' R_M = 'refs/remotes/m/'
@ -143,11 +141,18 @@ class GitRefs(object):
def _ReadLoose1(self, path, name): def _ReadLoose1(self, path, name):
try: try:
with open(path) as fd: fd = open(path)
except IOError:
return
try:
try:
mtime = os.path.getmtime(path) mtime = os.path.getmtime(path)
ref_id = fd.readline() ref_id = fd.readline()
except (IOError, OSError): except (IOError, OSError):
return return
finally:
fd.close()
try: try:
ref_id = ref_id.decode() ref_id = ref_id.decode()

View File

@ -29,15 +29,12 @@ from error import ManifestParseError
NUM_BATCH_RETRIEVE_REVISIONID = 32 NUM_BATCH_RETRIEVE_REVISIONID = 32
def get_gitc_manifest_dir(): def get_gitc_manifest_dir():
return wrapper.Wrapper().get_gitc_manifest_dir() return wrapper.Wrapper().get_gitc_manifest_dir()
def parse_clientdir(gitc_fs_path): def parse_clientdir(gitc_fs_path):
return wrapper.Wrapper().gitc_parse_clientdir(gitc_fs_path) return wrapper.Wrapper().gitc_parse_clientdir(gitc_fs_path)
def _set_project_revisions(projects): def _set_project_revisions(projects):
"""Sets the revisionExpr for a list of projects. """Sets the revisionExpr for a list of projects.
@ -66,7 +63,6 @@ def _set_project_revisions(projects):
(proj.remote.url, proj.revisionExpr)) (proj.remote.url, proj.revisionExpr))
proj.revisionExpr = revisionExpr proj.revisionExpr = revisionExpr
def _manifest_groups(manifest): def _manifest_groups(manifest):
"""Returns the manifest group string that should be synced """Returns the manifest group string that should be synced
@ -81,7 +77,6 @@ def _manifest_groups(manifest):
groups = 'default,platform-' + platform.system().lower() groups = 'default,platform-' + platform.system().lower()
return groups return groups
def generate_gitc_manifest(gitc_manifest, manifest, paths=None): def generate_gitc_manifest(gitc_manifest, manifest, paths=None):
"""Generate a manifest for shafsd to use for this GITC client. """Generate a manifest for shafsd to use for this GITC client.
@ -109,11 +104,11 @@ def generate_gitc_manifest(gitc_manifest, manifest, paths=None):
if not proj.upstream and not git_config.IsId(proj.revisionExpr): if not proj.upstream and not git_config.IsId(proj.revisionExpr):
proj.upstream = proj.revisionExpr proj.upstream = proj.revisionExpr
if path not in gitc_manifest.paths: if not path in gitc_manifest.paths:
# Any new projects need their first revision, even if we weren't asked # Any new projects need their first revision, even if we weren't asked
# for them. # for them.
projects.append(proj) projects.append(proj)
elif path not in paths: elif not path in paths:
# And copy revisions from the previous manifest if we're not updating # And copy revisions from the previous manifest if we're not updating
# them now. # them now.
gitc_proj = gitc_manifest.paths[path] gitc_proj = gitc_manifest.paths[path]
@ -126,7 +121,7 @@ def generate_gitc_manifest(gitc_manifest, manifest, paths=None):
index = 0 index = 0
while index < len(projects): while index < len(projects):
_set_project_revisions( _set_project_revisions(
projects[index:(index + NUM_BATCH_RETRIEVE_REVISIONID)]) projects[index:(index+NUM_BATCH_RETRIEVE_REVISIONID)])
index += NUM_BATCH_RETRIEVE_REVISIONID index += NUM_BATCH_RETRIEVE_REVISIONID
if gitc_manifest is not None: if gitc_manifest is not None:
@ -145,7 +140,6 @@ def generate_gitc_manifest(gitc_manifest, manifest, paths=None):
# Save the manifest. # Save the manifest.
save_manifest(manifest) save_manifest(manifest)
def save_manifest(manifest, client_dir=None): def save_manifest(manifest, client_dir=None):
"""Save the manifest file in the client_dir. """Save the manifest file in the client_dir.

431
hooks.py
View File

@ -1,431 +0,0 @@
# -*- coding:utf-8 -*-
#
# Copyright (C) 2008 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import errno
import json
import os
import re
import subprocess
import sys
import traceback
from error import HookError
from git_refs import HEAD
from pyversion import is_python3
if is_python3():
import urllib.parse
else:
import imp
import urlparse
urllib = imp.new_module('urllib')
urllib.parse = urlparse
input = raw_input # noqa: F821
class RepoHook(object):
"""A RepoHook contains information about a script to run as a hook.
Hooks are used to run a python script before running an upload (for instance,
to run presubmit checks). Eventually, we may have hooks for other actions.
This shouldn't be confused with files in the 'repo/hooks' directory. Those
files are copied into each '.git/hooks' folder for each project. Repo-level
hooks are associated instead with repo actions.
Hooks are always python. When a hook is run, we will load the hook into the
interpreter and execute its main() function.
"""
def __init__(self,
hook_type,
hooks_project,
topdir,
manifest_url,
abort_if_user_denies=False):
"""RepoHook constructor.
Params:
hook_type: A string representing the type of hook. This is also used
to figure out the name of the file containing the hook. For
example: 'pre-upload'.
hooks_project: The project containing the repo hooks. If you have a
manifest, this is manifest.repo_hooks_project. OK if this is None,
which will make the hook a no-op.
topdir: Repo's top directory (the one containing the .repo directory).
Scripts will run with CWD as this directory. If you have a manifest,
this is manifest.topdir
manifest_url: The URL to the manifest git repo.
abort_if_user_denies: If True, we'll throw a HookError() if the user
doesn't allow us to run the hook.
"""
self._hook_type = hook_type
self._hooks_project = hooks_project
self._manifest_url = manifest_url
self._topdir = topdir
self._abort_if_user_denies = abort_if_user_denies
# Store the full path to the script for convenience.
if self._hooks_project:
self._script_fullpath = os.path.join(self._hooks_project.worktree,
self._hook_type + '.py')
else:
self._script_fullpath = None
def _GetHash(self):
"""Return a hash of the contents of the hooks directory.
We'll just use git to do this. This hash has the property that if anything
changes in the directory we will return a different hash.
SECURITY CONSIDERATION:
This hash only represents the contents of files in the hook directory, not
any other files imported or called by hooks. Changes to imported files
can change the script behavior without affecting the hash.
Returns:
A string representing the hash. This will always be ASCII so that it can
be printed to the user easily.
"""
assert self._hooks_project, "Must have hooks to calculate their hash."
# We will use the work_git object rather than just calling GetRevisionId().
# That gives us a hash of the latest checked in version of the files that
# the user will actually be executing. Specifically, GetRevisionId()
# doesn't appear to change even if a user checks out a different version
# of the hooks repo (via git checkout) nor if a user commits their own revs.
#
# NOTE: Local (non-committed) changes will not be factored into this hash.
# I think this is OK, since we're really only worried about warning the user
# about upstream changes.
return self._hooks_project.work_git.rev_parse('HEAD')
def _GetMustVerb(self):
"""Return 'must' if the hook is required; 'should' if not."""
if self._abort_if_user_denies:
return 'must'
else:
return 'should'
def _CheckForHookApproval(self):
"""Check to see whether this hook has been approved.
We'll accept approval of manifest URLs if they're using secure transports.
This way the user can say they trust the manifest host. For insecure
hosts, we fall back to checking the hash of the hooks repo.
Note that we ask permission for each individual hook even though we use
the hash of all hooks when detecting changes. We'd like the user to be
able to approve / deny each hook individually. We only use the hash of all
hooks because there is no other easy way to detect changes to local imports.
Returns:
True if this hook is approved to run; False otherwise.
Raises:
HookError: Raised if the user doesn't approve and abort_if_user_denies
was passed to the constructor.
"""
if self._ManifestUrlHasSecureScheme():
return self._CheckForHookApprovalManifest()
else:
return self._CheckForHookApprovalHash()
def _CheckForHookApprovalHelper(self, subkey, new_val, main_prompt,
changed_prompt):
"""Check for approval for a particular attribute and hook.
Args:
subkey: The git config key under [repo.hooks.<hook_type>] to store the
last approved string.
new_val: The new value to compare against the last approved one.
main_prompt: Message to display to the user to ask for approval.
changed_prompt: Message explaining why we're re-asking for approval.
Returns:
True if this hook is approved to run; False otherwise.
Raises:
HookError: Raised if the user doesn't approve and abort_if_user_denies
was passed to the constructor.
"""
hooks_config = self._hooks_project.config
git_approval_key = 'repo.hooks.%s.%s' % (self._hook_type, subkey)
# Get the last value that the user approved for this hook; may be None.
old_val = hooks_config.GetString(git_approval_key)
if old_val is not None:
# User previously approved hook and asked not to be prompted again.
if new_val == old_val:
# Approval matched. We're done.
return True
else:
# Give the user a reason why we're prompting, since they last told
# us to "never ask again".
prompt = 'WARNING: %s\n\n' % (changed_prompt,)
else:
prompt = ''
# Prompt the user only when we're attached to a tty; otherwise assume "no".
if sys.stdout.isatty():
prompt += main_prompt + ' (yes/always/NO)? '
response = input(prompt).lower()
print()
# User is doing a one-time approval.
if response in ('y', 'yes'):
return True
elif response == 'always':
hooks_config.SetString(git_approval_key, new_val)
return True
# For anything else, we'll assume no approval.
if self._abort_if_user_denies:
raise HookError('You must allow the %s hook or use --no-verify.' %
self._hook_type)
return False
def _ManifestUrlHasSecureScheme(self):
"""Check if the URI for the manifest is a secure transport."""
secure_schemes = ('file', 'https', 'ssh', 'persistent-https', 'sso', 'rpc')
parse_results = urllib.parse.urlparse(self._manifest_url)
return parse_results.scheme in secure_schemes
def _CheckForHookApprovalManifest(self):
"""Check whether the user has approved this manifest host.
Returns:
True if this hook is approved to run; False otherwise.
"""
return self._CheckForHookApprovalHelper(
'approvedmanifest',
self._manifest_url,
'Run hook scripts from %s' % (self._manifest_url,),
'Manifest URL has changed since %s was allowed.' % (self._hook_type,))
def _CheckForHookApprovalHash(self):
"""Check whether the user has approved the hooks repo.
Returns:
True if this hook is approved to run; False otherwise.
"""
prompt = ('Repo %s run the script:\n'
' %s\n'
'\n'
'Do you want to allow this script to run')
return self._CheckForHookApprovalHelper(
'approvedhash',
self._GetHash(),
prompt % (self._GetMustVerb(), self._script_fullpath),
'Scripts have changed since %s was allowed.' % (self._hook_type,))
@staticmethod
def _ExtractInterpFromShebang(data):
"""Extract the interpreter used in the shebang.
Try to locate the interpreter the script is using (ignoring `env`).
Args:
data: The file content of the script.
Returns:
The basename of the main script interpreter, or None if a shebang is not
used or could not be parsed out.
"""
firstline = data.splitlines()[:1]
if not firstline:
return None
# The format here can be tricky.
shebang = firstline[0].strip()
m = re.match(r'^#!\s*([^\s]+)(?:\s+([^\s]+))?', shebang)
if not m:
return None
# If the script is using `env`, find the target program.
interp = m.group(1)
if os.path.basename(interp) == 'env':
interp = m.group(2)
return interp
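
Illustrative results for the shebang parsing above (the inputs are made up):

assert RepoHook._ExtractInterpFromShebang('#!/usr/bin/python2.7\nmain()\n') == '/usr/bin/python2.7'
assert RepoHook._ExtractInterpFromShebang('#!/usr/bin/env python3\nmain()\n') == 'python3'
assert RepoHook._ExtractInterpFromShebang('main()\n') is None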
def _ExecuteHookViaReexec(self, interp, context, **kwargs):
"""Execute the hook script through |interp|.
Note: Support for this feature should be dropped ~Jun 2021.
Args:
interp: The Python program to run.
context: Basic Python context to execute the hook inside.
kwargs: Arbitrary arguments to pass to the hook script.
Raises:
HookError: When the hooks failed for any reason.
"""
# This logic needs to be kept in sync with _ExecuteHookViaImport below.
script = """
import json, os, sys
path = '''%(path)s'''
kwargs = json.loads('''%(kwargs)s''')
context = json.loads('''%(context)s''')
sys.path.insert(0, os.path.dirname(path))
data = open(path).read()
exec(compile(data, path, 'exec'), context)
context['main'](**kwargs)
""" % {
'path': self._script_fullpath,
'kwargs': json.dumps(kwargs),
'context': json.dumps(context),
}
# We pass the script via stdin to avoid OS argv limits. It also makes
# unhandled exception tracebacks less verbose/confusing for users.
cmd = [interp, '-c', 'import sys; exec(sys.stdin.read())']
proc = subprocess.Popen(cmd, stdin=subprocess.PIPE)
proc.communicate(input=script.encode('utf-8'))
if proc.returncode:
raise HookError('Failed to run %s hook.' % (self._hook_type,))
def _ExecuteHookViaImport(self, data, context, **kwargs):
"""Execute the hook code in |data| directly.
Args:
data: The code of the hook to execute.
context: Basic Python context to execute the hook inside.
kwargs: Arbitrary arguments to pass to the hook script.
Raises:
HookError: When the hooks failed for any reason.
"""
# Exec, storing global context in the context dict. We catch exceptions
# and convert to a HookError w/ just the failing traceback.
try:
exec(compile(data, self._script_fullpath, 'exec'), context)
except Exception:
raise HookError('%s\nFailed to import %s hook; see traceback above.' %
(traceback.format_exc(), self._hook_type))
# Running the script should have defined a main() function.
if 'main' not in context:
raise HookError('Missing main() in: "%s"' % self._script_fullpath)
# Call the main function in the hook. If the hook should cause the
# build to fail, it will raise an Exception. We'll catch that convert
# to a HookError w/ just the failing traceback.
try:
context['main'](**kwargs)
except Exception:
raise HookError('%s\nFailed to run main() for %s hook; see traceback '
'above.' % (traceback.format_exc(), self._hook_type))
def _ExecuteHook(self, **kwargs):
"""Actually execute the given hook.
This will run the hook's 'main' function in our python interpreter.
Args:
kwargs: Keyword arguments to pass to the hook. These are often specific
to the hook type. For instance, pre-upload hooks will contain
a project_list.
"""
# Keep sys.path and CWD stashed away so that we can always restore them
# upon function exit.
orig_path = os.getcwd()
orig_syspath = sys.path
try:
# Always run hooks with CWD as topdir.
os.chdir(self._topdir)
# Put the hook dir as the first item of sys.path so hooks can do
# relative imports. We want to replace the repo dir as [0] so
# hooks can't import repo files.
sys.path = [os.path.dirname(self._script_fullpath)] + sys.path[1:]
# Initial global context for the hook to run within.
context = {'__file__': self._script_fullpath}
# Add 'hook_should_take_kwargs' to the arguments to be passed to main.
# We don't actually want hooks to define their main with this argument--
# it's there to remind them that their hook should always take **kwargs.
# For instance, a pre-upload hook should be defined like:
# def main(project_list, **kwargs):
#
# This allows us to later expand the API without breaking old hooks.
kwargs = kwargs.copy()
kwargs['hook_should_take_kwargs'] = True
# See what version of python the hook has been written against.
data = open(self._script_fullpath).read()
interp = self._ExtractInterpFromShebang(data)
reexec = False
if interp:
prog = os.path.basename(interp)
if prog.startswith('python2') and sys.version_info.major != 2:
reexec = True
elif prog.startswith('python3') and sys.version_info.major == 2:
reexec = True
# Attempt to execute the hooks through the requested version of Python.
if reexec:
try:
self._ExecuteHookViaReexec(interp, context, **kwargs)
except OSError as e:
if e.errno == errno.ENOENT:
# We couldn't find the interpreter, so fallback to importing.
reexec = False
else:
raise
# Run the hook by importing directly.
if not reexec:
self._ExecuteHookViaImport(data, context, **kwargs)
finally:
# Restore sys.path and CWD.
sys.path = orig_syspath
os.chdir(orig_path)
def Run(self, user_allows_all_hooks, **kwargs):
"""Run the hook.
If the hook doesn't exist (because there is no hooks project or because
this particular hook is not enabled), this is a no-op.
Args:
user_allows_all_hooks: If True, we will never prompt about running the
hook--we'll just assume it's OK to run it.
kwargs: Keyword arguments to pass to the hook. These are often specific
to the hook type. For instance, pre-upload hooks will contain
a project_list.
Raises:
HookError: If there was a problem finding the hook or the user declined
to run a required hook (from _CheckForHookApproval).
"""
# No-op if there is no hooks project or if hook is disabled.
if ((not self._hooks_project) or (self._hook_type not in
self._hooks_project.enabled_repo_hooks)):
return
# Bail with a nice error if we can't find the hook.
if not os.path.isfile(self._script_fullpath):
raise HookError('Couldn\'t find repo hook: "%s"' % self._script_fullpath)
# Make sure the user is OK with running the hook.
if (not user_allows_all_hooks) and (not self._CheckForHookApproval()):
return
# Run the hook with the same version of python we're using.
self._ExecuteHook(**kwargs)
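
For reference, a minimal hook script that the RepoHook machinery above would load and run looks roughly like this (the file would live as pre-upload.py in the manifest's repo-hooks project; the check itself is made up):

def main(project_list, **kwargs):
    # Raising causes the hook (and the upload) to fail; returning normally means "ok".
    # project_list entries are treated as plain names here purely for illustration.
    for name in project_list:
        if name.endswith('/secrets'):
            raise Exception('refusing to upload %s' % name)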

View File

@ -1,5 +1,5 @@
#!/bin/sh #!/bin/sh
# From Gerrit Code Review 3.1.3 # From Gerrit Code Review 2.14.6
# #
# Part of Gerrit Code Review (https://www.gerritcodereview.com/) # Part of Gerrit Code Review (https://www.gerritcodereview.com/)
# #
@ -16,48 +16,176 @@
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and # See the License for the specific language governing permissions and
# limitations under the License. # limitations under the License.
#
# avoid [[ which is not POSIX sh. unset GREP_OPTIONS
if test "$#" != 1 ; then
echo "$0 requires an argument."
exit 1
fi
if test ! -f "$1" ; then CHANGE_ID_AFTER="Bug|Depends-On|Issue|Test|Feature|Fixes|Fixed"
echo "file does not exist: $1" MSG="$1"
exit 1
fi
# Do not create a change id if requested # Check for, and add if missing, a unique Change-Id
if test "false" = "`git config --bool --get gerrit.createChangeId`" ; then #
exit 0 add_ChangeId() {
fi clean_message=`sed -e '
/^diff --git .*/{
s///
q
}
/^Signed-off-by:/d
/^#/d
' "$MSG" | git stripspace`
if test -z "$clean_message"
then
return
fi
# $RANDOM will be undefined if not using bash, so don't use set -u # Do not add Change-Id to temp commits
random=$( (whoami ; hostname ; date; cat $1 ; echo $RANDOM) | git hash-object --stdin) if echo "$clean_message" | head -1 | grep -q '^\(fixup\|squash\)!'
dest="$1.tmp.${random}" then
return
fi
trap 'rm -f "${dest}"' EXIT if test "false" = "`git config --bool --get gerrit.createChangeId`"
then
return
fi
if ! git stripspace --strip-comments < "$1" > "${dest}" ; then # Does Change-Id: already exist? if so, exit (no change).
echo "cannot strip comments from $1" if grep -i '^Change-Id:' "$MSG" >/dev/null
exit 1 then
fi return
fi
if test ! -s "${dest}" ; then id=`_gen_ChangeId`
echo "file is empty: $1" T="$MSG.tmp.$$"
exit 1 AWK=awk
fi if [ -x /usr/xpg4/bin/awk ]; then
# Solaris AWK is just too broken
AWK=/usr/xpg4/bin/awk
fi
# Avoid the --in-place option which only appeared in Git 2.8 # Get core.commentChar from git config or use default symbol
# Avoid the --if-exists option which only appeared in Git 2.15 commentChar=`git config --get core.commentChar`
if ! git -c trailer.ifexists=doNothing interpret-trailers \ commentChar=${commentChar:-#}
--trailer "Change-Id: I${random}" < "$1" > "${dest}" ; then
echo "cannot insert change-id line in $1"
exit 1
fi
if ! mv "${dest}" "$1" ; then # How this works:
echo "cannot mv ${dest} to $1" # - parse the commit message as (textLine+ blankLine*)*
exit 1 # - assume textLine+ to be a footer until proven otherwise
fi # - exception: the first block is not footer (as it is the title)
# - read textLine+ into a variable
# - then count blankLines
# - once the next textLine appears, print textLine+ blankLine* as these
# aren't footer
# - in END, the last textLine+ block is available for footer parsing
$AWK '
BEGIN {
# while we start with the assumption that textLine+
# is a footer, the first block is not.
isFooter = 0
footerComment = 0
blankLines = 0
}
# Skip lines starting with commentChar without any spaces before it.
/^'"$commentChar"'/ { next }
# Skip the line starting with the diff command and everything after it,
# up to the end of the file, assuming it is only patch data.
# If more than one line before the diff was empty, strip all but one.
/^diff --git / {
blankLines = 0
while (getline) { }
next
}
# Count blank lines outside footer comments
/^$/ && (footerComment == 0) {
blankLines++
next
}
# Catch footer comment
/^\[[a-zA-Z0-9-]+:/ && (isFooter == 1) {
footerComment = 1
}
/]$/ && (footerComment == 1) {
footerComment = 2
}
# We have a non-blank line after blank lines. Handle this.
(blankLines > 0) {
print lines
for (i = 0; i < blankLines; i++) {
print ""
}
lines = ""
blankLines = 0
isFooter = 1
footerComment = 0
}
# Detect that the current block is not the footer
(footerComment == 0) && (!/^\[?[a-zA-Z0-9-]+:/ || /^[a-zA-Z0-9-]+:\/\//) {
isFooter = 0
}
{
# We need this information about the current last comment line
if (footerComment == 2) {
footerComment = 0
}
if (lines != "") {
lines = lines "\n";
}
lines = lines $0
}
# Footer handling:
# If the last block is considered a footer, splice in the Change-Id at the
# right place.
# Look for the right place to inject Change-Id by considering
# CHANGE_ID_AFTER. Keys listed in it (case insensitive) come first,
# then Change-Id, then everything else (eg. Signed-off-by:).
#
# Otherwise just print the last block, a new line and the Change-Id as a
# block of its own.
END {
unprinted = 1
if (isFooter == 0) {
print lines "\n"
lines = ""
}
changeIdAfter = "^(" tolower("'"$CHANGE_ID_AFTER"'") "):"
numlines = split(lines, footer, "\n")
for (line = 1; line <= numlines; line++) {
if (unprinted && match(tolower(footer[line]), changeIdAfter) != 1) {
unprinted = 0
print "Change-Id: I'"$id"'"
}
print footer[line]
}
if (unprinted) {
print "Change-Id: I'"$id"'"
}
}' "$MSG" > "$T" && mv "$T" "$MSG" || rm -f "$T"
}
_gen_ChangeIdInput() {
echo "tree `git write-tree`"
if parent=`git rev-parse "HEAD^0" 2>/dev/null`
then
echo "parent $parent"
fi
echo "author `git var GIT_AUTHOR_IDENT`"
echo "committer `git var GIT_COMMITTER_IDENT`"
echo
printf '%s' "$clean_message"
}
_gen_ChangeId() {
_gen_ChangeIdInput |
git hash-object -t commit --stdin
}
add_ChangeId
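
The rewritten hook in the left column above boils down to two steps: derive a random Change-Id seed with `git hash-object --stdin`, then let `git interpret-trailers` insert the trailer only when one is not already present. A rough Python rendering of those steps (it skips the extra entropy and comment stripping the real script does):

import subprocess
import sys

msg_file = sys.argv[1]

# Seed the Change-Id from the message contents (the real hook also mixes in
# whoami, hostname, date and $RANDOM).
with open(msg_file, 'rb') as fd:
    seed = subprocess.run(['git', 'hash-object', '--stdin'], stdin=fd,
                          stdout=subprocess.PIPE, check=True).stdout.decode().strip()

# Ask git to add the trailer, leaving any existing Change-Id untouched.
with open(msg_file, 'rb') as fd:
    new_msg = subprocess.run(
        ['git', '-c', 'trailer.ifexists=doNothing', 'interpret-trailers',
         '--trailer', 'Change-Id: I%s' % seed],
        stdin=fd, stdout=subprocess.PIPE, check=True).stdout

with open(msg_file, 'wb') as fd:
    fd.write(new_msg)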

154
main.py
View File

@ -26,9 +26,7 @@ import getpass
import netrc import netrc
import optparse import optparse
import os import os
import shlex
import sys import sys
import textwrap
import time import time
from pyversion import is_python3 from pyversion import is_python3
@ -48,8 +46,8 @@ except ImportError:
from color import SetDefaultColoring from color import SetDefaultColoring
import event_log import event_log
from repo_trace import SetTrace from repo_trace import SetTrace
from git_command import user_agent from git_command import git, GitCommand, user_agent
from git_config import init_ssh, close_ssh, RepoConfig from git_config import init_ssh, close_ssh
from command import InteractiveCommand from command import InteractiveCommand
from command import MirrorSafeCommand from command import MirrorSafeCommand
from command import GitcAvailableCommand, GitcClientCommand from command import GitcAvailableCommand, GitcClientCommand
@ -70,46 +68,16 @@ from wrapper import WrapperPath, Wrapper
from subcmds import all_commands from subcmds import all_commands
if not is_python3(): if not is_python3():
input = raw_input # noqa: F821 input = raw_input
# NB: These do not need to be kept in sync with the repo launcher script.
# These may be much newer as it allows the repo launcher to roll between
# different repo releases while source versions might require a newer python.
#
# The soft version is when we start warning users that the version is old and
# we'll be dropping support for it. We'll refuse to work with versions older
# than the hard version.
#
# python-3.6 is in Ubuntu Bionic.
MIN_PYTHON_VERSION_SOFT = (3, 6)
MIN_PYTHON_VERSION_HARD = (3, 4)
if sys.version_info.major < 3:
print('repo: warning: Python 2 is no longer supported; '
'Please upgrade to Python {}.{}+.'.format(*MIN_PYTHON_VERSION_SOFT),
file=sys.stderr)
else:
if sys.version_info < MIN_PYTHON_VERSION_HARD:
print('repo: error: Python 3 version is too old; '
'Please upgrade to Python {}.{}+.'.format(*MIN_PYTHON_VERSION_SOFT),
file=sys.stderr)
sys.exit(1)
elif sys.version_info < MIN_PYTHON_VERSION_SOFT:
print('repo: warning: your Python 3 version is no longer supported; '
'Please upgrade to Python {}.{}+.'.format(*MIN_PYTHON_VERSION_SOFT),
file=sys.stderr)
global_options = optparse.OptionParser( global_options = optparse.OptionParser(
usage='repo [-p|--paginate|--no-pager] COMMAND [ARGS]', usage="repo [-p|--paginate|--no-pager] COMMAND [ARGS]"
add_help_option=False) )
global_options.add_option('-h', '--help', action='store_true',
help='show this help message and exit')
global_options.add_option('-p', '--paginate', global_options.add_option('-p', '--paginate',
dest='pager', action='store_true', dest='pager', action='store_true',
help='display command output in the pager') help='display command output in the pager')
global_options.add_option('--no-pager', global_options.add_option('--no-pager',
dest='pager', action='store_false', dest='no_pager', action='store_true',
help='disable the pager') help='disable the pager')
global_options.add_option('--color', global_options.add_option('--color',
choices=('auto', 'always', 'never'), default=None, choices=('auto', 'always', 'never'), default=None,
@ -130,11 +98,12 @@ global_options.add_option('--event-log',
dest='event_log', action='store', dest='event_log', action='store',
help='filename of event log to append timeline to') help='filename of event log to append timeline to')
class _Repo(object): class _Repo(object):
def __init__(self, repodir): def __init__(self, repodir):
self.repodir = repodir self.repodir = repodir
self.commands = all_commands self.commands = all_commands
# add 'branch' as an alias for 'branches'
all_commands['branch'] = all_commands['branches']
def _ParseArgs(self, argv): def _ParseArgs(self, argv):
"""Parse the main `repo` command line options.""" """Parse the main `repo` command line options."""
@ -154,40 +123,8 @@ class _Repo(object):
argv = [] argv = []
gopts, _gargs = global_options.parse_args(glob) gopts, _gargs = global_options.parse_args(glob)
name, alias_args = self._ExpandAlias(name)
argv = alias_args + argv
if gopts.help:
global_options.print_help()
commands = ' '.join(sorted(self.commands))
wrapped_commands = textwrap.wrap(commands, width=77)
print('\nAvailable commands:\n %s' % ('\n '.join(wrapped_commands),))
print('\nRun `repo help <command>` for command-specific details.')
global_options.exit()
return (name, gopts, argv) return (name, gopts, argv)
def _ExpandAlias(self, name):
"""Look up user registered aliases."""
# We don't resolve aliases for existing subcommands. This matches git.
if name in self.commands:
return name, []
key = 'alias.%s' % (name,)
alias = RepoConfig.ForRepository(self.repodir).GetString(key)
if alias is None:
alias = RepoConfig.ForUser().GetString(key)
if alias is None:
return name, []
args = alias.strip().split(' ', 1)
name = args[0]
if len(args) == 2:
args = shlex.split(args[1])
else:
args = []
return name, args
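
A worked illustration of the alias expansion above (the alias name and options are made up; aliases never shadow built-in subcommands, and the per-checkout repo config is consulted before the user-level ~/.repoconfig/config):

# Hypothetical alias in ~/.repoconfig/config:
#   [alias]
#       sync-all = sync -j8 --no-tags
#
# `repo sync-all foo/bar` then resolves roughly as:
#   _ExpandAlias('sync-all')  ->  ('sync', ['-j8', '--no-tags'])
# and the subcommand argv becomes ['-j8', '--no-tags', 'foo/bar'].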
def _Run(self, name, gopts, argv): def _Run(self, name, gopts, argv):
"""Execute the requested subcommand.""" """Execute the requested subcommand."""
result = 0 result = 0
@ -204,7 +141,7 @@ class _Repo(object):
SetDefaultColoring(gopts.color) SetDefaultColoring(gopts.color)
try: try:
cmd = self.commands[name]() cmd = self.commands[name]
except KeyError: except KeyError:
print("repo: '%s' is not a repo command. See 'repo help'." % name, print("repo: '%s' is not a repo command. See 'repo help'." % name,
file=sys.stderr) file=sys.stderr)
@ -245,7 +182,7 @@ class _Repo(object):
file=sys.stderr) file=sys.stderr)
return 1 return 1
if gopts.pager is not False and not isinstance(cmd, InteractiveCommand): if not gopts.no_pager and not isinstance(cmd, InteractiveCommand):
config = cmd.manifest.globalConfig config = cmd.manifest.globalConfig
if gopts.pager: if gopts.pager:
use_pager = True use_pager = True
@ -280,8 +217,7 @@ class _Repo(object):
if e.name: if e.name:
print('error: project group must be enabled for project %s' % e.name, file=sys.stderr) print('error: project group must be enabled for project %s' % e.name, file=sys.stderr)
else: else:
print('error: project group must be enabled for the project in the current directory', print('error: project group must be enabled for the project in the current directory', file=sys.stderr)
file=sys.stderr)
result = 1 result = 1
except SystemExit as e: except SystemExit as e:
if e.code: if e.code:
@ -308,66 +244,42 @@ class _Repo(object):
return result return result
def _CheckWrapperVersion(ver_str, repo_path): def _CheckWrapperVersion(ver, repo_path):
"""Verify the repo launcher is new enough for this checkout.
Args:
ver_str: The version string passed from the repo launcher when it ran us.
repo_path: The path to the repo launcher that loaded us.
"""
# Refuse to work with really old wrapper versions. We don't test these,
# so might as well require a somewhat recent sane version.
# v1.15 of the repo launcher was released in ~Mar 2012.
MIN_REPO_VERSION = (1, 15)
min_str = '.'.join(str(x) for x in MIN_REPO_VERSION)
if not repo_path: if not repo_path:
repo_path = '~/bin/repo' repo_path = '~/bin/repo'
if not ver_str: if not ver:
print('no --wrapper-version argument', file=sys.stderr) print('no --wrapper-version argument', file=sys.stderr)
sys.exit(1) sys.exit(1)
# Pull out the version of the repo launcher we know about to compare.
exp = Wrapper().VERSION exp = Wrapper().VERSION
ver = tuple(map(int, ver_str.split('.'))) ver = tuple(map(int, ver.split('.')))
if len(ver) == 1:
ver = (0, ver[0])
exp_str = '.'.join(map(str, exp)) exp_str = '.'.join(map(str, exp))
if ver < MIN_REPO_VERSION: if exp[0] > ver[0] or ver < (0, 4):
print(""" print("""
repo: error: !!! A new repo command (%5s) is available. !!!
!!! Your version of repo %s is too old. !!! You must upgrade before you can continue: !!!
!!! We need at least version %s.
!!! A new version of repo (%s) is available.
!!! You must upgrade before you can continue:
cp %s %s cp %s %s
""" % (ver_str, min_str, exp_str, WrapperPath(), repo_path), file=sys.stderr) """ % (exp_str, WrapperPath(), repo_path), file=sys.stderr)
sys.exit(1) sys.exit(1)
if exp > ver: if exp > ver:
print('\n... A new version of repo (%s) is available.' % (exp_str,), print("""
file=sys.stderr) ... A new repo command (%5s) is available.
if os.access(repo_path, os.W_OK):
print("""\
... You should upgrade soon: ... You should upgrade soon:
cp %s %s
""" % (WrapperPath(), repo_path), file=sys.stderr)
else:
print("""\
... New version is available at: %s
... The launcher is run from: %s
!!! The launcher is not writable. Please talk to your sysadmin or distro
!!! to get an update installed.
""" % (WrapperPath(), repo_path), file=sys.stderr)
cp %s %s
""" % (exp_str, WrapperPath(), repo_path), file=sys.stderr)
def _CheckRepoDir(repo_dir): def _CheckRepoDir(repo_dir):
if not repo_dir: if not repo_dir:
print('no --repo-dir argument', file=sys.stderr) print('no --repo-dir argument', file=sys.stderr)
sys.exit(1) sys.exit(1)
def _PruneOptions(argv, opt): def _PruneOptions(argv, opt):
i = 0 i = 0
while i < len(argv): while i < len(argv):
@ -383,7 +295,6 @@ def _PruneOptions(argv, opt):
continue continue
i += 1 i += 1
class _UserAgentHandler(urllib.request.BaseHandler): class _UserAgentHandler(urllib.request.BaseHandler):
def http_request(self, req): def http_request(self, req):
req.add_header('User-Agent', user_agent.repo) req.add_header('User-Agent', user_agent.repo)
@ -393,7 +304,6 @@ class _UserAgentHandler(urllib.request.BaseHandler):
req.add_header('User-Agent', user_agent.repo) req.add_header('User-Agent', user_agent.repo)
return req return req
def _AddPasswordFromUserInput(handler, msg, req): def _AddPasswordFromUserInput(handler, msg, req):
# If repo could not find auth info from netrc, try to get it from user input # If repo could not find auth info from netrc, try to get it from user input
url = req.get_full_url() url = req.get_full_url()
@ -407,7 +317,6 @@ def _AddPasswordFromUserInput(handler, msg, req):
return return
handler.passwd.add_password(None, url, user, password) handler.passwd.add_password(None, url, user, password)
class _BasicAuthHandler(urllib.request.HTTPBasicAuthHandler): class _BasicAuthHandler(urllib.request.HTTPBasicAuthHandler):
def http_error_401(self, req, fp, code, msg, headers): def http_error_401(self, req, fp, code, msg, headers):
_AddPasswordFromUserInput(self, msg, req) _AddPasswordFromUserInput(self, msg, req)
@ -417,14 +326,13 @@ class _BasicAuthHandler(urllib.request.HTTPBasicAuthHandler):
def http_error_auth_reqed(self, authreq, host, req, headers): def http_error_auth_reqed(self, authreq, host, req, headers):
try: try:
old_add_header = req.add_header old_add_header = req.add_header
def _add_header(name, val): def _add_header(name, val):
val = val.replace('\n', '') val = val.replace('\n', '')
old_add_header(name, val) old_add_header(name, val)
req.add_header = _add_header req.add_header = _add_header
return urllib.request.AbstractBasicAuthHandler.http_error_auth_reqed( return urllib.request.AbstractBasicAuthHandler.http_error_auth_reqed(
self, authreq, host, req, headers) self, authreq, host, req, headers)
except Exception: except:
reset = getattr(self, 'reset_retry_count', None) reset = getattr(self, 'reset_retry_count', None)
if reset is not None: if reset is not None:
reset() reset()
@ -432,7 +340,6 @@ class _BasicAuthHandler(urllib.request.HTTPBasicAuthHandler):
self.retried = 0 self.retried = 0
raise raise
class _DigestAuthHandler(urllib.request.HTTPDigestAuthHandler): class _DigestAuthHandler(urllib.request.HTTPDigestAuthHandler):
def http_error_401(self, req, fp, code, msg, headers): def http_error_401(self, req, fp, code, msg, headers):
_AddPasswordFromUserInput(self, msg, req) _AddPasswordFromUserInput(self, msg, req)
@ -442,14 +349,13 @@ class _DigestAuthHandler(urllib.request.HTTPDigestAuthHandler):
def http_error_auth_reqed(self, auth_header, host, req, headers): def http_error_auth_reqed(self, auth_header, host, req, headers):
try: try:
old_add_header = req.add_header old_add_header = req.add_header
def _add_header(name, val): def _add_header(name, val):
val = val.replace('\n', '') val = val.replace('\n', '')
old_add_header(name, val) old_add_header(name, val)
req.add_header = _add_header req.add_header = _add_header
return urllib.request.AbstractDigestAuthHandler.http_error_auth_reqed( return urllib.request.AbstractDigestAuthHandler.http_error_auth_reqed(
self, auth_header, host, req, headers) self, auth_header, host, req, headers)
except Exception: except:
reset = getattr(self, 'reset_retry_count', None) reset = getattr(self, 'reset_retry_count', None)
if reset is not None: if reset is not None:
reset() reset()
@ -457,7 +363,6 @@ class _DigestAuthHandler(urllib.request.HTTPDigestAuthHandler):
self.retried = 0 self.retried = 0
raise raise
class _KerberosAuthHandler(urllib.request.BaseHandler): class _KerberosAuthHandler(urllib.request.BaseHandler):
def __init__(self): def __init__(self):
self.retried = 0 self.retried = 0
@ -492,7 +397,7 @@ class _KerberosAuthHandler(urllib.request.BaseHandler):
return response return response
except kerberos.GSSError: except kerberos.GSSError:
return None return None
except Exception: except:
self.reset_retry_count() self.reset_retry_count()
raise raise
finally: finally:
@ -538,7 +443,6 @@ class _KerberosAuthHandler(urllib.request.BaseHandler):
kerberos.authGSSClientClean(self.context) kerberos.authGSSClientClean(self.context)
self.context = None self.context = None
def init_http(): def init_http():
handlers = [_UserAgentHandler()] handlers = [_UserAgentHandler()]
@ -566,7 +470,6 @@ def init_http():
handlers.append(urllib.request.HTTPSHandler(debuglevel=1)) handlers.append(urllib.request.HTTPSHandler(debuglevel=1))
urllib.request.install_opener(urllib.request.build_opener(*handlers)) urllib.request.install_opener(urllib.request.build_opener(*handlers))
def _Main(argv): def _Main(argv):
result = 0 result = 0
@ -614,7 +517,7 @@ def _Main(argv):
argv = list(sys.argv) argv = list(sys.argv)
argv.extend(rce.extra_args) argv.extend(rce.extra_args)
try: try:
os.execv(sys.executable, [__file__] + argv) os.execv(__file__, argv)
except OSError as e: except OSError as e:
print('fatal: cannot restart repo after upgrade', file=sys.stderr) print('fatal: cannot restart repo after upgrade', file=sys.stderr)
print('fatal: %s' % e, file=sys.stderr) print('fatal: %s' % e, file=sys.stderr)
@ -623,6 +526,5 @@ def _Main(argv):
TerminatePager() TerminatePager()
sys.exit(result) sys.exit(result)
if __name__ == '__main__': if __name__ == '__main__':
_Main(sys.argv[1:]) _Main(sys.argv[1:])

View File

@ -31,12 +31,11 @@ else:
urllib.parse = urlparse urllib.parse = urlparse
import gitc_utils import gitc_utils
from git_config import GitConfig, IsId from git_config import GitConfig
from git_refs import R_HEADS, HEAD from git_refs import R_HEADS, HEAD
import platform_utils import platform_utils
from project import RemoteSpec, Project, MetaProject from project import RemoteSpec, Project, MetaProject
from error import (ManifestParseError, ManifestInvalidPathError, from error import ManifestParseError, ManifestInvalidRevisionError
ManifestInvalidRevisionError)
MANIFEST_FILE_NAME = 'manifest.xml' MANIFEST_FILE_NAME = 'manifest.xml'
LOCAL_MANIFEST_NAME = 'local_manifest.xml' LOCAL_MANIFEST_NAME = 'local_manifest.xml'
@ -56,61 +55,6 @@ urllib.parse.uses_netloc.extend([
'sso', 'sso',
'rpc']) 'rpc'])
def XmlBool(node, attr, default=None):
"""Determine boolean value of |node|'s |attr|.
Invalid values will issue a non-fatal warning.
Args:
node: XML node whose attributes we access.
attr: The attribute to access.
default: If the attribute is not set (value is empty), then use this.
Returns:
True if the attribute is a valid string representing true.
False if the attribute is a valid string representing false.
|default| otherwise.
"""
value = node.getAttribute(attr)
s = value.lower()
if s == '':
return default
elif s in {'yes', 'true', '1'}:
return True
elif s in {'no', 'false', '0'}:
return False
else:
print('warning: manifest: %s="%s": ignoring invalid XML boolean' %
(attr, value), file=sys.stderr)
return default
def XmlInt(node, attr, default=None):
"""Determine integer value of |node|'s |attr|.
Args:
node: XML node whose attributes we access.
attr: The attribute to access.
default: If the attribute is not set (value is empty), then use this.
Returns:
The number if the attribute is a valid number.
Raises:
ManifestParseError: The number is invalid.
"""
value = node.getAttribute(attr)
if not value:
return default
try:
return int(value)
except ValueError:
raise ManifestParseError('manifest: invalid %s="%s" integer' %
(attr, value))
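
Illustrative behaviour of the two helpers above when applied to manifest attributes (the attribute names are examples):

#   <default sync-j="4" sync-c="true" />
#   XmlInt(node, 'sync-j')   -> 4
#   XmlBool(node, 'sync-c')  -> True
#   XmlBool(node, 'sync-s')  -> None (attribute unset, so the default is returned)
#   sync-c="maybe"           -> warning printed, default returned
#   sync-j="fast"            -> ManifestParseError raised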
class _Default(object): class _Default(object):
"""Project defaults within the manifest.""" """Project defaults within the manifest."""
@ -129,7 +73,6 @@ class _Default(object):
def __ne__(self, other): def __ne__(self, other):
return self.__dict__ != other.__dict__ return self.__dict__ != other.__dict__
class _XmlRemote(object): class _XmlRemote(object):
def __init__(self, def __init__(self,
name, name,
@ -183,7 +126,6 @@ class _XmlRemote(object):
orig_name=self.name, orig_name=self.name,
fetchUrl=self.fetchUrl) fetchUrl=self.fetchUrl)
class XmlManifest(object): class XmlManifest(object):
"""manages the repo configuration file""" """manages the repo configuration file"""
@ -192,24 +134,17 @@ class XmlManifest(object):
self.topdir = os.path.dirname(self.repodir) self.topdir = os.path.dirname(self.repodir)
self.manifestFile = os.path.join(self.repodir, MANIFEST_FILE_NAME) self.manifestFile = os.path.join(self.repodir, MANIFEST_FILE_NAME)
self.globalConfig = GitConfig.ForUser() self.globalConfig = GitConfig.ForUser()
self.localManifestWarning = False
self.isGitcClient = False self.isGitcClient = False
self._load_local_manifests = True self._load_local_manifests = True
self.repoProject = MetaProject(self, 'repo', self.repoProject = MetaProject(self, 'repo',
gitdir=os.path.join(repodir, 'repo/.git'), gitdir = os.path.join(repodir, 'repo/.git'),
worktree=os.path.join(repodir, 'repo')) worktree = os.path.join(repodir, 'repo'))
mp = MetaProject(self, 'manifests', self.manifestProject = MetaProject(self, 'manifests',
gitdir=os.path.join(repodir, 'manifests.git'), gitdir = os.path.join(repodir, 'manifests.git'),
worktree=os.path.join(repodir, 'manifests')) worktree = os.path.join(repodir, 'manifests'))
self.manifestProject = mp
# This is a bit hacky, but we're in a chicken & egg situation: all the
# normal repo settings live in the manifestProject which we just setup
# above, so we couldn't easily query before that. We assume Project()
# init doesn't care if this changes afterwards.
if os.path.exists(mp.gitdir) and mp.config.GetBoolean('repo.worktree'):
mp.use_git_worktrees = True
self._Unload() self._Unload()
@ -244,27 +179,12 @@ class XmlManifest(object):
""" """
self.Override(name) self.Override(name)
# Old versions of repo would generate symlinks we need to clean up. try:
if os.path.lexists(self.manifestFile): if os.path.lexists(self.manifestFile):
platform_utils.remove(self.manifestFile) platform_utils.remove(self.manifestFile)
# This file is interpreted as if it existed inside the manifest repo. platform_utils.symlink(os.path.join('manifests', name), self.manifestFile)
# That allows us to use <include> with the relative file name. except OSError as e:
with open(self.manifestFile, 'w') as fp: raise ManifestParseError('cannot link manifest %s: %s' % (name, str(e)))
fp.write("""<?xml version="1.0" encoding="UTF-8"?>
<!--
DO NOT EDIT THIS FILE! It is generated by repo and changes will be discarded.
If you want to use a different manifest, use `repo init -m <file>` instead.
If you want to customize your checkout by overriding manifest settings, use
the local_manifests/ directory instead.
For more information on repo manifests, check out:
https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
-->
<manifest>
<include name="%s" />
</manifest>
""" % (name,))
def _RemoteToXml(self, r, doc, root): def _RemoteToXml(self, r, doc, root):
e = doc.createElement('remote') e = doc.createElement('remote')
@ -283,7 +203,7 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
def _ParseGroups(self, groups): def _ParseGroups(self, groups):
return [x for x in re.split(r'[,\s]+', groups) if x] return [x for x in re.split(r'[,\s]+', groups) if x]
def Save(self, fd, peg_rev=False, peg_rev_upstream=True, peg_rev_dest_branch=True, groups=None): def Save(self, fd, peg_rev=False, peg_rev_upstream=True, groups=None):
"""Write the current manifest out to the given file descriptor. """Write the current manifest out to the given file descriptor.
""" """
mp = self.manifestProject mp = self.manifestProject
@ -303,7 +223,7 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
if self.notice: if self.notice:
notice_element = root.appendChild(doc.createElement('notice')) notice_element = root.appendChild(doc.createElement('notice'))
notice_lines = self.notice.splitlines() notice_lines = self.notice.splitlines()
indented_notice = ('\n'.join(" " * 4 + line for line in notice_lines))[4:] indented_notice = ('\n'.join(" "*4 + line for line in notice_lines))[4:]
notice_element.appendChild(doc.createTextNode(indented_notice)) notice_element.appendChild(doc.createTextNode(indented_notice))
d = self.default d = self.default
@ -388,13 +308,6 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
# Only save the origin if the origin is not a sha1, and the default # Only save the origin if the origin is not a sha1, and the default
# isn't our value # isn't our value
e.setAttribute('upstream', p.revisionExpr) e.setAttribute('upstream', p.revisionExpr)
if peg_rev_dest_branch:
if p.dest_branch:
e.setAttribute('dest-branch', p.dest_branch)
elif value != p.revisionExpr:
e.setAttribute('dest-branch', p.revisionExpr)
else: else:
revision = self.remotes[p.remote.orig_name].revision or d.revisionExpr revision = self.remotes[p.remote.orig_name].revision or d.revisionExpr
if not revision or revision != p.revisionExpr: if not revision or revision != p.revisionExpr:
@ -500,14 +413,6 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
self._Load() self._Load()
return self._manifest_server return self._manifest_server
@property
def CloneBundle(self):
clone_bundle = self.manifestProject.config.GetBoolean('repo.clonebundle')
if clone_bundle is None:
return False if self.manifestProject.config.GetBoolean('repo.partialclone') else True
else:
return clone_bundle
@property @property
def CloneFilter(self): def CloneFilter(self):
if self.manifestProject.config.GetBoolean('repo.partialclone'): if self.manifestProject.config.GetBoolean('repo.partialclone'):
@ -518,10 +423,6 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
def IsMirror(self): def IsMirror(self):
return self.manifestProject.config.GetBoolean('repo.mirror') return self.manifestProject.config.GetBoolean('repo.mirror')
@property
def UseGitWorktrees(self):
return self.manifestProject.config.GetBoolean('repo.worktree')
@property @property
def IsArchive(self): def IsArchive(self):
return self.manifestProject.config.GetBoolean('repo.archive') return self.manifestProject.config.GetBoolean('repo.archive')
@ -554,12 +455,15 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
self.manifestProject.worktree)) self.manifestProject.worktree))
if self._load_local_manifests: if self._load_local_manifests:
if os.path.exists(os.path.join(self.repodir, LOCAL_MANIFEST_NAME)): local = os.path.join(self.repodir, LOCAL_MANIFEST_NAME)
print('error: %s is not supported; put local manifests in `%s`' if os.path.exists(local):
'instead' % (LOCAL_MANIFEST_NAME, if not self.localManifestWarning:
self.localManifestWarning = True
print('warning: %s is deprecated; put local manifests '
'in `%s` instead' % (LOCAL_MANIFEST_NAME,
os.path.join(self.repodir, LOCAL_MANIFESTS_DIR_NAME)), os.path.join(self.repodir, LOCAL_MANIFESTS_DIR_NAME)),
file=sys.stderr) file=sys.stderr)
sys.exit(1) nodes.append(self._ParseManifestXml(local, self.repodir))
local_dir = os.path.abspath(os.path.join(self.repodir, local_dir = os.path.abspath(os.path.join(self.repodir,
LOCAL_MANIFESTS_DIR_NAME)) LOCAL_MANIFESTS_DIR_NAME))
@ -694,9 +598,6 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
if groups: if groups:
groups = self._ParseGroups(groups) groups = self._ParseGroups(groups)
revision = node.getAttribute('revision') revision = node.getAttribute('revision')
remote = node.getAttribute('remote')
if remote:
remote = self._get_remote(node)
for p in self._projects[name]: for p in self._projects[name]:
if path and p.relpath != path: if path and p.relpath != path:
@ -705,12 +606,6 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
p.groups.extend(groups) p.groups.extend(groups)
if revision: if revision:
p.revisionExpr = revision p.revisionExpr = revision
if IsId(revision):
p.revisionId = revision
else:
p.revisionId = None
if remote:
p.remote = remote.ToRemoteSpec(name)
if node.nodeName == 'repo-hooks': if node.nodeName == 'repo-hooks':
# Get the name of the project and the (space-separated) list of enabled. # Get the name of the project and the (space-separated) list of enabled.
repo_hooks_project = self._reqatt(node, 'in-project') repo_hooks_project = self._reqatt(node, 'in-project')
@ -754,6 +649,7 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
if self._repo_hooks_project and (self._repo_hooks_project.name == name): if self._repo_hooks_project and (self._repo_hooks_project.name == name):
self._repo_hooks_project = None self._repo_hooks_project = None
def _AddMetaProjectMirror(self, m): def _AddMetaProjectMirror(self, m):
name = None name = None
m_url = m.GetRemote(m.remote.name).url m_url = m.GetRemote(m.remote.name).url
@ -780,15 +676,15 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
if name not in self._projects: if name not in self._projects:
m.PreSync() m.PreSync()
gitdir = os.path.join(self.topdir, '%s.git' % name) gitdir = os.path.join(self.topdir, '%s.git' % name)
project = Project(manifest=self, project = Project(manifest = self,
name=name, name = name,
remote=remote.ToRemoteSpec(name), remote = remote.ToRemoteSpec(name),
gitdir=gitdir, gitdir = gitdir,
objdir=gitdir, objdir = gitdir,
worktree=None, worktree = None,
relpath=name or None, relpath = name or None,
revisionExpr=m.revisionExpr, revisionExpr = m.revisionExpr,
revisionId=None) revisionId = None)
self._projects[project.name] = [project] self._projects[project.name] = [project]
self._paths[project.relpath] = project self._paths[project.relpath] = project
@ -826,14 +722,29 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
d.destBranchExpr = node.getAttribute('dest-branch') or None d.destBranchExpr = node.getAttribute('dest-branch') or None
d.upstreamExpr = node.getAttribute('upstream') or None d.upstreamExpr = node.getAttribute('upstream') or None
d.sync_j = XmlInt(node, 'sync-j', 1) sync_j = node.getAttribute('sync-j')
if d.sync_j <= 0: if sync_j == '' or sync_j is None:
raise ManifestParseError('%s: sync-j must be greater than 0, not "%s"' % d.sync_j = 1
(self.manifestFile, d.sync_j)) else:
d.sync_j = int(sync_j)
d.sync_c = XmlBool(node, 'sync-c', False) sync_c = node.getAttribute('sync-c')
d.sync_s = XmlBool(node, 'sync-s', False) if not sync_c:
d.sync_tags = XmlBool(node, 'sync-tags', True) d.sync_c = False
else:
d.sync_c = sync_c.lower() in ("yes", "true", "1")
sync_s = node.getAttribute('sync-s')
if not sync_s:
d.sync_s = False
else:
d.sync_s = sync_s.lower() in ("yes", "true", "1")
sync_tags = node.getAttribute('sync-tags')
if not sync_tags:
d.sync_tags = True
else:
d.sync_tags = sync_tags.lower() in ("yes", "true", "1")
return d return d
def _ParseNotice(self, node): def _ParseNotice(self, node):
@ -881,7 +792,7 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
def _UnjoinName(self, parent_name, name): def _UnjoinName(self, parent_name, name):
return os.path.relpath(name, parent_name) return os.path.relpath(name, parent_name)
def _ParseProject(self, node, parent=None, **extra_proj_attrs): def _ParseProject(self, node, parent = None, **extra_proj_attrs):
""" """
reads a <project> element from the manifest file reads a <project> element from the manifest file
""" """
@ -910,15 +821,39 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
raise ManifestParseError("project %s path cannot be absolute in %s" % raise ManifestParseError("project %s path cannot be absolute in %s" %
(name, self.manifestFile)) (name, self.manifestFile))
rebase = XmlBool(node, 'rebase', True) rebase = node.getAttribute('rebase')
sync_c = XmlBool(node, 'sync-c', False) if not rebase:
sync_s = XmlBool(node, 'sync-s', self._default.sync_s) rebase = True
sync_tags = XmlBool(node, 'sync-tags', self._default.sync_tags) else:
rebase = rebase.lower() in ("yes", "true", "1")
clone_depth = XmlInt(node, 'clone-depth') sync_c = node.getAttribute('sync-c')
if clone_depth is not None and clone_depth <= 0: if not sync_c:
raise ManifestParseError('%s: clone-depth must be greater than 0, not "%s"' % sync_c = False
(self.manifestFile, clone_depth)) else:
sync_c = sync_c.lower() in ("yes", "true", "1")
sync_s = node.getAttribute('sync-s')
if not sync_s:
sync_s = self._default.sync_s
else:
sync_s = sync_s.lower() in ("yes", "true", "1")
sync_tags = node.getAttribute('sync-tags')
if not sync_tags:
sync_tags = self._default.sync_tags
else:
sync_tags = sync_tags.lower() in ("yes", "true", "1")
clone_depth = node.getAttribute('clone-depth')
if clone_depth:
try:
clone_depth = int(clone_depth)
if clone_depth <= 0:
raise ValueError()
except ValueError:
raise ManifestParseError('invalid clone-depth %s in %s' %
(clone_depth, self.manifestFile))
dest_branch = node.getAttribute('dest-branch') or self._default.destBranchExpr dest_branch = node.getAttribute('dest-branch') or self._default.destBranchExpr
@ -930,10 +865,8 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
groups = self._ParseGroups(groups) groups = self._ParseGroups(groups)
if parent is None: if parent is None:
relpath, worktree, gitdir, objdir, use_git_worktrees = \ relpath, worktree, gitdir, objdir = self.GetProjectPaths(name, path)
self.GetProjectPaths(name, path)
else: else:
use_git_worktrees = False
relpath, worktree, gitdir, objdir = \ relpath, worktree, gitdir, objdir = \
self.GetSubprojectPaths(parent, name, path) self.GetSubprojectPaths(parent, name, path)
@ -941,28 +874,27 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
groups.extend(set(default_groups).difference(groups)) groups.extend(set(default_groups).difference(groups))
if self.IsMirror and node.hasAttribute('force-path'): if self.IsMirror and node.hasAttribute('force-path'):
if XmlBool(node, 'force-path', False): if node.getAttribute('force-path').lower() in ("yes", "true", "1"):
gitdir = os.path.join(self.topdir, '%s.git' % path) gitdir = os.path.join(self.topdir, '%s.git' % path)
project = Project(manifest=self, project = Project(manifest = self,
name=name, name = name,
remote=remote.ToRemoteSpec(name), remote = remote.ToRemoteSpec(name),
gitdir=gitdir, gitdir = gitdir,
objdir=objdir, objdir = objdir,
worktree=worktree, worktree = worktree,
relpath=relpath, relpath = relpath,
revisionExpr=revisionExpr, revisionExpr = revisionExpr,
revisionId=None, revisionId = None,
rebase=rebase, rebase = rebase,
groups=groups, groups = groups,
sync_c=sync_c, sync_c = sync_c,
sync_s=sync_s, sync_s = sync_s,
sync_tags=sync_tags, sync_tags = sync_tags,
clone_depth=clone_depth, clone_depth = clone_depth,
upstream=upstream, upstream = upstream,
parent=parent, parent = parent,
dest_branch=dest_branch, dest_branch = dest_branch,
use_git_worktrees=use_git_worktrees,
**extra_proj_attrs) **extra_proj_attrs)
for n in node.childNodes: for n in node.childNodes:
@ -973,16 +905,11 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
if n.nodeName == 'annotation': if n.nodeName == 'annotation':
self._ParseAnnotation(project, n) self._ParseAnnotation(project, n)
if n.nodeName == 'project': if n.nodeName == 'project':
project.subprojects.append(self._ParseProject(n, parent=project)) project.subprojects.append(self._ParseProject(n, parent = project))
return project return project
def GetProjectPaths(self, name, path): def GetProjectPaths(self, name, path):
# The manifest entries might have trailing slashes. Normalize them to avoid
# unexpected filesystem behavior since we do string concatenation below.
path = path.rstrip('/')
name = name.rstrip('/')
use_git_worktrees = False
relpath = path relpath = path
if self.IsMirror: if self.IsMirror:
worktree = None worktree = None
@ -991,15 +918,8 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
else: else:
worktree = os.path.join(self.topdir, path).replace('\\', '/') worktree = os.path.join(self.topdir, path).replace('\\', '/')
gitdir = os.path.join(self.repodir, 'projects', '%s.git' % path) gitdir = os.path.join(self.repodir, 'projects', '%s.git' % path)
# We allow people to mix git worktrees & non-git worktrees for now.
# This allows for in situ migration of repo clients.
if os.path.exists(gitdir) or not self.UseGitWorktrees:
objdir = os.path.join(self.repodir, 'project-objects', '%s.git' % name) objdir = os.path.join(self.repodir, 'project-objects', '%s.git' % name)
else: return relpath, worktree, gitdir, objdir
use_git_worktrees = True
gitdir = os.path.join(self.repodir, 'worktrees', '%s.git' % name)
objdir = gitdir
return relpath, worktree, gitdir, objdir, use_git_worktrees
def GetProjectsWithName(self, name): def GetProjectsWithName(self, name):
return self._projects.get(name, []) return self._projects.get(name, [])
@ -1014,10 +934,6 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
return os.path.relpath(relpath, parent_relpath) return os.path.relpath(relpath, parent_relpath)
def GetSubprojectPaths(self, parent, name, path): def GetSubprojectPaths(self, parent, name, path):
# The manifest entries might have trailing slashes. Normalize them to avoid
# unexpected filesystem behavior since we do string concatenation below.
path = path.rstrip('/')
name = name.rstrip('/')
relpath = self._JoinRelpath(parent.relpath, path) relpath = self._JoinRelpath(parent.relpath, path)
gitdir = os.path.join(parent.gitdir, 'subprojects', '%s.git' % path) gitdir = os.path.join(parent.gitdir, 'subprojects', '%s.git' % path)
objdir = os.path.join(parent.gitdir, 'subproject-objects', '%s.git' % name) objdir = os.path.join(parent.gitdir, 'subproject-objects', '%s.git' % name)
@ -1027,112 +943,21 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
worktree = os.path.join(parent.worktree, path).replace('\\', '/') worktree = os.path.join(parent.worktree, path).replace('\\', '/')
return relpath, worktree, gitdir, objdir return relpath, worktree, gitdir, objdir
@staticmethod
def _CheckLocalPath(path, symlink=False):
"""Verify |path| is reasonable for use in <copyfile> & <linkfile>."""
if '~' in path:
return '~ not allowed (due to 8.3 filenames on Windows filesystems)'
# Some filesystems (like Apple's HFS+) try to normalize Unicode codepoints
# which means there are alternative names for ".git". Reject paths with
# these in it as there shouldn't be any reasonable need for them here.
# The set of codepoints here was cribbed from jgit's implementation:
# https://eclipse.googlesource.com/jgit/jgit/+/9110037e3e9461ff4dac22fee84ef3694ed57648/org.eclipse.jgit/src/org/eclipse/jgit/lib/ObjectChecker.java#884
BAD_CODEPOINTS = {
u'\u200C', # ZERO WIDTH NON-JOINER
u'\u200D', # ZERO WIDTH JOINER
u'\u200E', # LEFT-TO-RIGHT MARK
u'\u200F', # RIGHT-TO-LEFT MARK
u'\u202A', # LEFT-TO-RIGHT EMBEDDING
u'\u202B', # RIGHT-TO-LEFT EMBEDDING
u'\u202C', # POP DIRECTIONAL FORMATTING
u'\u202D', # LEFT-TO-RIGHT OVERRIDE
u'\u202E', # RIGHT-TO-LEFT OVERRIDE
u'\u206A', # INHIBIT SYMMETRIC SWAPPING
u'\u206B', # ACTIVATE SYMMETRIC SWAPPING
u'\u206C', # INHIBIT ARABIC FORM SHAPING
u'\u206D', # ACTIVATE ARABIC FORM SHAPING
u'\u206E', # NATIONAL DIGIT SHAPES
u'\u206F', # NOMINAL DIGIT SHAPES
u'\uFEFF', # ZERO WIDTH NO-BREAK SPACE
}
if BAD_CODEPOINTS & set(path):
# This message is more expansive than reality, but should be fine.
return 'Unicode combining characters not allowed'
# Assume paths might be used on case-insensitive filesystems.
path = path.lower()
# Split up the path by its components. We can't use os.path.sep exclusively
# as some platforms (like Windows) will convert / to \ and that bypasses all
# our constructed logic here. Especially since manifest authors only use
# / in their paths.
resep = re.compile(r'[/%s]' % re.escape(os.path.sep))
parts = resep.split(path)
# Some people use src="." to create stable links to projects. Lets allow
# that but reject all other uses of "." to keep things simple.
if parts != ['.']:
for part in set(parts):
if part in {'.', '..', '.git'} or part.startswith('.repo'):
return 'bad component: %s' % (part,)
if not symlink and resep.match(path[-1]):
return 'dirs not allowed'
# NB: The two abspath checks here are to handle platforms with multiple
# filesystem path styles (e.g. Windows).
norm = os.path.normpath(path)
if (norm == '..' or
(len(norm) >= 3 and norm.startswith('..') and resep.match(norm[0])) or
os.path.isabs(norm) or
norm.startswith('/')):
return 'path cannot be outside'
@classmethod
def _ValidateFilePaths(cls, element, src, dest):
"""Verify |src| & |dest| are reasonable for <copyfile> & <linkfile>.
We verify the path independent of any filesystem state as we won't have a
checkout available to compare to. i.e. This is for parsing validation
purposes only.
We'll do full/live sanity checking before we do the actual filesystem
modifications in _CopyFile/_LinkFile/etc...
"""
# |dest| is the file we write to or symlink we create.
# It is relative to the top of the repo client checkout.
msg = cls._CheckLocalPath(dest)
if msg:
raise ManifestInvalidPathError(
'<%s> invalid "dest": %s: %s' % (element, dest, msg))
# |src| is the file we read from or path we point to for symlinks.
# It is relative to the top of the git project checkout.
msg = cls._CheckLocalPath(src, symlink=element == 'linkfile')
if msg:
raise ManifestInvalidPathError(
'<%s> invalid "src": %s: %s' % (element, src, msg))
def _ParseCopyFile(self, project, node): def _ParseCopyFile(self, project, node):
src = self._reqatt(node, 'src') src = self._reqatt(node, 'src')
dest = self._reqatt(node, 'dest') dest = self._reqatt(node, 'dest')
if not self.IsMirror: if not self.IsMirror:
# src is project relative; # src is project relative;
# dest is relative to the top of the tree. # dest is relative to the top of the tree
# We only validate paths if we actually plan to process them. project.AddCopyFile(src, dest, os.path.join(self.topdir, dest))
self._ValidateFilePaths('copyfile', src, dest)
project.AddCopyFile(src, dest, self.topdir)
def _ParseLinkFile(self, project, node): def _ParseLinkFile(self, project, node):
src = self._reqatt(node, 'src') src = self._reqatt(node, 'src')
dest = self._reqatt(node, 'dest') dest = self._reqatt(node, 'dest')
if not self.IsMirror: if not self.IsMirror:
# src is project relative; # src is project relative;
# dest is relative to the top of the tree. # dest is relative to the top of the tree
# We only validate paths if we actually plan to process them. project.AddLinkFile(src, dest, os.path.join(self.topdir, dest))
self._ValidateFilePaths('linkfile', src, dest)
project.AddLinkFile(src, dest, self.topdir)
def _ParseAnnotation(self, project, node): def _ParseAnnotation(self, project, node):
name = self._reqatt(node, 'name') name = self._reqatt(node, 'name')
@ -1182,7 +1007,7 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
diff = {'added': [], 'removed': [], 'changed': [], 'unreachable': []} diff = {'added': [], 'removed': [], 'changed': [], 'unreachable': []}
for proj in fromKeys: for proj in fromKeys:
if proj not in toKeys: if not proj in toKeys:
diff['removed'].append(fromProjects[proj]) diff['removed'].append(fromProjects[proj])
else: else:
fromProj = fromProjects[proj] fromProj = fromProjects[proj]
@ -1214,7 +1039,7 @@ class GitcManifest(XmlManifest):
gitc_client_name) gitc_client_name)
self.manifestFile = os.path.join(self.gitc_client_dir, '.manifest') self.manifestFile = os.path.join(self.gitc_client_dir, '.manifest')
def _ParseProject(self, node, parent=None): def _ParseProject(self, node, parent = None):
"""Override _ParseProject and add support for GITC specific attributes.""" """Override _ParseProject and add support for GITC specific attributes."""
return super(GitcManifest, self)._ParseProject( return super(GitcManifest, self)._ParseProject(
node, parent=parent, old_revision=node.getAttribute('old-revision')) node, parent=parent, old_revision=node.getAttribute('old-revision'))
@ -1223,3 +1048,4 @@ class GitcManifest(XmlManifest):
"""Output GITC Specific Project attributes""" """Output GITC Specific Project attributes"""
if p.old_revision: if p.old_revision:
e.setAttribute('old-revision', str(p.old_revision)) e.setAttribute('old-revision', str(p.old_revision))
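Note on the _ParseDefault and _ParseProject hunks above: the master side replaces the repeated inline attribute parsing with XmlBool() and XmlInt() helpers defined near the top of manifest_xml.py, outside the hunks shown here. A minimal sketch of those helpers, assuming the same "yes"/"true"/"1" semantics as the inline code they replace (not the verbatim upstream implementation):

from error import ManifestParseError

def XmlBool(node, attr, default=None):
  """Determine boolean value of |node|'s |attr|, falling back to |default|."""
  value = node.getAttribute(attr)
  if not value:
    return default
  return value.lower() in ('yes', 'true', '1')

def XmlInt(node, attr, default=None):
  """Determine integer value of |node|'s |attr|, falling back to |default|."""
  value = node.getAttribute(attr)
  if not value:
    return default
  try:
    return int(value)
  except ValueError:
    raise ManifestParseError('manifest: invalid %s="%s" integer' % (attr, value))

With helpers like these, the call sites reduce to one line each (e.g. d.sync_c = XmlBool(node, 'sync-c', False)), with the range checks for sync-j and clone-depth kept at the call site as shown in the diff.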

13
pager.py Normal file → Executable file

@ -27,7 +27,6 @@ pager_process = None
old_stdout = None old_stdout = None
old_stderr = None old_stderr = None
def RunPager(globalConfig): def RunPager(globalConfig):
if not os.isatty(0) or not os.isatty(1): if not os.isatty(0) or not os.isatty(1):
return return
@ -36,37 +35,33 @@ def RunPager(globalConfig):
return return
if platform_utils.isWindows(): if platform_utils.isWindows():
_PipePager(pager) _PipePager(pager);
else: else:
_ForkPager(pager) _ForkPager(pager)
def TerminatePager(): def TerminatePager():
global pager_process, old_stdout, old_stderr global pager_process, old_stdout, old_stderr
if pager_process: if pager_process:
sys.stdout.flush() sys.stdout.flush()
sys.stderr.flush() sys.stderr.flush()
pager_process.stdin.close() pager_process.stdin.close()
pager_process.wait() pager_process.wait();
pager_process = None pager_process = None
# Restore initial stdout/err in case there is more output in this process # Restore initial stdout/err in case there is more output in this process
# after shutting down the pager process # after shutting down the pager process
sys.stdout = old_stdout sys.stdout = old_stdout
sys.stderr = old_stderr sys.stderr = old_stderr
def _PipePager(pager): def _PipePager(pager):
global pager_process, old_stdout, old_stderr global pager_process, old_stdout, old_stderr
assert pager_process is None, "Only one active pager process at a time" assert pager_process is None, "Only one active pager process at a time"
# Create pager process, piping stdout/err into its stdin # Create pager process, piping stdout/err into its stdin
pager_process = subprocess.Popen([pager], stdin=subprocess.PIPE, stdout=sys.stdout, pager_process = subprocess.Popen([pager], stdin=subprocess.PIPE, stdout=sys.stdout, stderr=sys.stderr)
stderr=sys.stderr)
old_stdout = sys.stdout old_stdout = sys.stdout
old_stderr = sys.stderr old_stderr = sys.stderr
sys.stdout = pager_process.stdin sys.stdout = pager_process.stdin
sys.stderr = pager_process.stdin sys.stderr = pager_process.stdin
def _ForkPager(pager): def _ForkPager(pager):
global active global active
# This process turns into the pager; a child it forks will # This process turns into the pager; a child it forks will
@ -93,7 +88,6 @@ def _ForkPager(pager):
print("fatal: cannot start pager '%s'" % pager, file=sys.stderr) print("fatal: cannot start pager '%s'" % pager, file=sys.stderr)
sys.exit(255) sys.exit(255)
def _SelectPager(globalConfig): def _SelectPager(globalConfig):
try: try:
return os.environ['GIT_PAGER'] return os.environ['GIT_PAGER']
@ -111,7 +105,6 @@ def _SelectPager(globalConfig):
return 'less' return 'less'
def _BecomePager(pager): def _BecomePager(pager):
# Delaying execution of the pager until we have output # Delaying execution of the pager until we have output
# ready works around a long-standing bug in popularly # ready works around a long-standing bug in popularly
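For the _PipePager path above (used on Windows), the idea is to spawn the pager as a child process and point sys.stdout/sys.stderr at its stdin; TerminatePager then closes the pipe, waits for the pager to exit, and restores the original streams. A small standalone illustration of that pattern, not repo code (the pager command and the text-mode pipe are assumptions for the example):

import subprocess
import sys

pager = subprocess.Popen(['less'], stdin=subprocess.PIPE,
                         stdout=sys.stdout, stderr=sys.stderr,
                         universal_newlines=True)
old_stdout, old_stderr = sys.stdout, sys.stderr
sys.stdout = sys.stderr = pager.stdin
try:
  for i in range(100):
    print('line %d goes through the pager' % i)
finally:
  # Same shutdown order as TerminatePager: close the pipe, wait, restore.
  pager.stdin.close()
  pager.wait()
  sys.stdout, sys.stderr = old_stdout, old_stderr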


@ -80,7 +80,7 @@ class FileDescriptorStreams(object):
""" """
raise NotImplementedError raise NotImplementedError
def _create_stream(self, fd, dest, std_name): def _create_stream(fd, dest, std_name):
""" Creates a new stream wrapping an existing file descriptor. """ Creates a new stream wrapping an existing file descriptor.
""" """
raise NotImplementedError raise NotImplementedError
@ -90,14 +90,8 @@ class _FileDescriptorStreamsNonBlocking(FileDescriptorStreams):
""" Implementation of FileDescriptorStreams for platforms that support """ Implementation of FileDescriptorStreams for platforms that support
non blocking I/O. non blocking I/O.
""" """
def __init__(self):
super(_FileDescriptorStreamsNonBlocking, self).__init__()
self._poll = select.poll()
self._fd_to_stream = {}
class Stream(object): class Stream(object):
""" Encapsulates a file descriptor """ """ Encapsulates a file descriptor """
def __init__(self, fd, dest, std_name): def __init__(self, fd, dest, std_name):
self.fd = fd self.fd = fd
self.dest = dest self.dest = dest
@ -119,18 +113,11 @@ class _FileDescriptorStreamsNonBlocking(FileDescriptorStreams):
self.fd.close() self.fd.close()
def _create_stream(self, fd, dest, std_name): def _create_stream(self, fd, dest, std_name):
stream = self.Stream(fd, dest, std_name) return self.Stream(fd, dest, std_name)
self._fd_to_stream[stream.fileno()] = stream
self._poll.register(stream, select.POLLIN)
return stream
def remove(self, stream):
self._poll.unregister(stream)
del self._fd_to_stream[stream.fileno()]
super(_FileDescriptorStreamsNonBlocking, self).remove(stream)
def select(self): def select(self):
return [self._fd_to_stream[fd] for fd, _ in self._poll.poll()] ready_streams, _, _ = select.select(self.streams, [], [])
return ready_streams
class _FileDescriptorStreamsThreads(FileDescriptorStreams): class _FileDescriptorStreamsThreads(FileDescriptorStreams):
@ -138,7 +125,6 @@ class _FileDescriptorStreamsThreads(FileDescriptorStreams):
non blocking I/O. This implementation requires creating threads issuing non blocking I/O. This implementation requires creating threads issuing
blocking read operations on file descriptors. blocking read operations on file descriptors.
""" """
def __init__(self): def __init__(self):
super(_FileDescriptorStreamsThreads, self).__init__() super(_FileDescriptorStreamsThreads, self).__init__()
# The queue is shared accross all threads so we can simulate the # The queue is shared accross all threads so we can simulate the
@ -158,14 +144,12 @@ class _FileDescriptorStreamsThreads(FileDescriptorStreams):
class QueueItem(object): class QueueItem(object):
""" Item put in the shared queue """ """ Item put in the shared queue """
def __init__(self, stream, data): def __init__(self, stream, data):
self.stream = stream self.stream = stream
self.data = data self.data = data
class Stream(object): class Stream(object):
""" Encapsulates a file descriptor """ """ Encapsulates a file descriptor """
def __init__(self, fd, dest, std_name, queue): def __init__(self, fd, dest, std_name, queue):
self.fd = fd self.fd = fd
self.dest = dest self.dest = dest
@ -191,7 +175,7 @@ class _FileDescriptorStreamsThreads(FileDescriptorStreams):
for line in iter(self.fd.readline, b''): for line in iter(self.fd.readline, b''):
self.queue.put(_FileDescriptorStreamsThreads.QueueItem(self, line)) self.queue.put(_FileDescriptorStreamsThreads.QueueItem(self, line))
self.fd.close() self.fd.close()
self.queue.put(_FileDescriptorStreamsThreads.QueueItem(self, b'')) self.queue.put(_FileDescriptorStreamsThreads.QueueItem(self, None))
def symlink(source, link_name): def symlink(source, link_name):
@ -257,15 +241,14 @@ def _makelongpath(path):
return path return path
def rmtree(path, ignore_errors=False): def rmtree(path):
"""shutil.rmtree(path) wrapper with support for long paths on Windows. """shutil.rmtree(path) wrapper with support for long paths on Windows.
Availability: Unix, Windows.""" Availability: Unix, Windows."""
onerror = None
if isWindows(): if isWindows():
path = _makelongpath(path) shutil.rmtree(_makelongpath(path), onerror=handle_rmtree_error)
onerror = handle_rmtree_error else:
shutil.rmtree(path, ignore_errors=ignore_errors, onerror=onerror) shutil.rmtree(path)
def handle_rmtree_error(function, path, excinfo): def handle_rmtree_error(function, path, excinfo):
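The _FileDescriptorStreamsNonBlocking change above moves from calling select.select() on every iteration to keeping a persistent select.poll() object, registering each stream's file descriptor when the stream is created and unregistering it in remove(). A tiny Unix-only illustration of the poll() readiness pattern the new code relies on (the pipe and payload here are made up for the example):

import os
import select

read_fd, write_fd = os.pipe()
poller = select.poll()
poller.register(read_fd, select.POLLIN)

os.write(write_fd, b'hello')
# poll() returns (fd, event) pairs for descriptors that are ready.
for fd, _event in poller.poll(0):
  assert fd == read_fd
  print(os.read(fd, 5))  # b'hello'

poller.unregister(read_fd)
os.close(read_fd)
os.close(write_fd)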


@ -16,21 +16,15 @@
import errno import errno
from pyversion import is_python3
from ctypes import WinDLL, get_last_error, FormatError, WinError, addressof from ctypes import WinDLL, get_last_error, FormatError, WinError, addressof
from ctypes import c_buffer from ctypes import c_buffer
from ctypes.wintypes import BOOL, BOOLEAN, LPCWSTR, DWORD, HANDLE from ctypes.wintypes import BOOL, BOOLEAN, LPCWSTR, DWORD, HANDLE, POINTER, c_ubyte
from ctypes.wintypes import WCHAR, USHORT, LPVOID, ULONG from ctypes.wintypes import WCHAR, USHORT, LPVOID, Structure, Union, ULONG
if is_python3(): from ctypes.wintypes import byref
from ctypes import c_ubyte, Structure, Union, byref
from ctypes.wintypes import LPDWORD
else:
# For legacy Python2 different imports are needed.
from ctypes.wintypes import POINTER, c_ubyte, Structure, Union, byref
LPDWORD = POINTER(DWORD)
kernel32 = WinDLL('kernel32', use_last_error=True) kernel32 = WinDLL('kernel32', use_last_error=True)
LPDWORD = POINTER(DWORD)
UCHAR = c_ubyte UCHAR = c_ubyte
# Win32 error codes # Win32 error codes
@ -152,8 +146,7 @@ def create_dirsymlink(source, link_name):
def _create_symlink(source, link_name, dwFlags): def _create_symlink(source, link_name, dwFlags):
if not CreateSymbolicLinkW(link_name, source, if not CreateSymbolicLinkW(link_name, source, dwFlags | SYMBOLIC_LINK_FLAG_ALLOW_UNPRIVILEGED_CREATE):
dwFlags | SYMBOLIC_LINK_FLAG_ALLOW_UNPRIVILEGED_CREATE):
# See https://github.com/golang/go/pull/24307/files#diff-b87bc12e4da2497308f9ef746086e4f0 # See https://github.com/golang/go/pull/24307/files#diff-b87bc12e4da2497308f9ef746086e4f0
# "the unprivileged create flag is unsupported below Windows 10 (1703, v10.0.14972). # "the unprivileged create flag is unsupported below Windows 10 (1703, v10.0.14972).
# retry without it." # retry without it."
@ -186,7 +179,7 @@ def readlink(path):
if reparse_point_handle == INVALID_HANDLE_VALUE: if reparse_point_handle == INVALID_HANDLE_VALUE:
_raise_winerror( _raise_winerror(
get_last_error(), get_last_error(),
'Error opening symbolic link \"%s\"'.format(path)) 'Error opening symblic link \"%s\"'.format(path))
target_buffer = c_buffer(MAXIMUM_REPARSE_DATA_BUFFER_SIZE) target_buffer = c_buffer(MAXIMUM_REPARSE_DATA_BUFFER_SIZE)
n_bytes_returned = DWORD() n_bytes_returned = DWORD()
io_result = DeviceIoControl(reparse_point_handle, io_result = DeviceIoControl(reparse_point_handle,
@ -201,7 +194,7 @@ def readlink(path):
if not io_result: if not io_result:
_raise_winerror( _raise_winerror(
get_last_error(), get_last_error(),
'Error reading symbolic link \"%s\"'.format(path)) 'Error reading symblic link \"%s\"'.format(path))
rdb = REPARSE_DATA_BUFFER.from_buffer(target_buffer) rdb = REPARSE_DATA_BUFFER.from_buffer(target_buffer)
if rdb.ReparseTag == IO_REPARSE_TAG_SYMLINK: if rdb.ReparseTag == IO_REPARSE_TAG_SYMLINK:
return _preserve_encoding(path, rdb.SymbolicLinkReparseBuffer.PrintName) return _preserve_encoding(path, rdb.SymbolicLinkReparseBuffer.PrintName)
@ -210,17 +203,13 @@ def readlink(path):
# Unsupported reparse point type # Unsupported reparse point type
_raise_winerror( _raise_winerror(
ERROR_NOT_SUPPORTED, ERROR_NOT_SUPPORTED,
'Error reading symbolic link \"%s\"'.format(path)) 'Error reading symblic link \"%s\"'.format(path))
def _preserve_encoding(source, target): def _preserve_encoding(source, target):
"""Ensures target is the same string type (i.e. unicode or str) as source.""" """Ensures target is the same string type (i.e. unicode or str) as source."""
if isinstance(source, unicode):
if is_python3(): return unicode(target)
return target
if isinstance(source, unicode): # noqa: F821
return unicode(target) # noqa: F821
return str(target) return str(target)


@ -26,7 +26,6 @@ _NOT_TTY = not os.isatty(2)
# column 0. # column 0.
CSI_ERASE_LINE = '\x1b[2K' CSI_ERASE_LINE = '\x1b[2K'
class Progress(object): class Progress(object):
def __init__(self, title, total=0, units='', print_newline=False, def __init__(self, title, total=0, units='', print_newline=False,
always_print_percentage=False): always_print_percentage=False):
@ -40,7 +39,7 @@ class Progress(object):
self._print_newline = print_newline self._print_newline = print_newline
self._always_print_percentage = always_print_percentage self._always_print_percentage = always_print_percentage
def update(self, inc=1, msg=''): def update(self, inc=1):
self._done += inc self._done += inc
if _NOT_TTY or IsTrace(): if _NOT_TTY or IsTrace():
@ -63,13 +62,12 @@ class Progress(object):
if self._lastp != p or self._always_print_percentage: if self._lastp != p or self._always_print_percentage:
self._lastp = p self._lastp = p
sys.stderr.write('%s\r%s: %3d%% (%d%s/%d%s)%s%s%s' % ( sys.stderr.write('%s\r%s: %3d%% (%d%s/%d%s)%s' % (
CSI_ERASE_LINE, CSI_ERASE_LINE,
self._title, self._title,
p, p,
self._done, self._units, self._done, self._units,
self._total, self._units, self._total, self._units,
' ' if msg else '', msg,
"\n" if self._print_newline else "")) "\n" if self._print_newline else ""))
sys.stderr.flush() sys.stderr.flush()
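The Progress.update() signature above gains an optional msg argument on the master side, displayed after the counts on the same status line. A hypothetical caller (the title, the item list, and the end() call are illustrative assumptions, not part of this diff):

from progress import Progress

items = ['platform/build', 'platform/art']  # made-up example data
pm = Progress('Syncing work tree', len(items))
for name in items:
  # ... per-item work happens here ...
  pm.update(msg=name)  # msg= exists only on the master side of this diff
pm.end()  # assumed to print the final summary line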

1237
project.py Normal file → Executable file

File diff suppressed because it is too large.


@ -16,6 +16,5 @@
import sys import sys
def is_python3(): def is_python3():
return sys.version_info[0] == 3 return sys.version_info[0] == 3


@ -1,2 +0,0 @@
These are helper tools for managing official releases.
See the [release process](../docs/release-process.md) document for more details.


@ -1,114 +0,0 @@
#!/usr/bin/env python3
# Copyright (C) 2020 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""Helper tool for signing repo launcher scripts correctly.
This is intended to be run only by the official Repo release managers.
"""
import argparse
import os
import subprocess
import sys
import util
def sign(opts):
"""Sign the launcher!"""
output = ''
for key in opts.keys:
# We use ! at the end of the key so that gpg uses this specific key.
# Otherwise it uses the key as a lookup into the overall key and uses the
# default signing key. i.e. It will see that KEYID_RSA is a subkey of
# another key, and use the primary key to sign instead of the subkey.
cmd = ['gpg', '--homedir', opts.gpgdir, '-u', f'{key}!', '--batch', '--yes',
'--armor', '--detach-sign', '--output', '-', opts.launcher]
ret = util.run(opts, cmd, encoding='utf-8', stdout=subprocess.PIPE)
output += ret.stdout
# Save the combined signatures into one file.
with open(f'{opts.launcher}.asc', 'w', encoding='utf-8') as fp:
fp.write(output)
def check(opts):
"""Check the signature."""
util.run(opts, ['gpg', '--verify', f'{opts.launcher}.asc'])
def postmsg(opts):
"""Helpful info to show at the end for release manager."""
print(f"""
Repo launcher bucket:
gs://git-repo-downloads/
To upload this launcher directly:
gsutil cp -a public-read {opts.launcher} {opts.launcher}.asc gs://git-repo-downloads/
NB: You probably want to upload it with a specific version first, e.g.:
gsutil cp -a public-read {opts.launcher} gs://git-repo-downloads/repo-3.0
gsutil cp -a public-read {opts.launcher}.asc gs://git-repo-downloads/repo-3.0.asc
""")
def get_parser():
"""Get a CLI parser."""
parser = argparse.ArgumentParser(description=__doc__)
parser.add_argument('-n', '--dry-run',
dest='dryrun', action='store_true',
help='show everything that would be done')
parser.add_argument('--gpgdir',
default=os.path.join(util.HOMEDIR, '.gnupg', 'repo'),
help='path to dedicated gpg dir with release keys '
'(default: ~/.gnupg/repo/)')
parser.add_argument('--keyid', dest='keys', default=[], action='append',
help='alternative signing keys to use')
parser.add_argument('launcher',
default=os.path.join(util.TOPDIR, 'repo'), nargs='?',
help='the launcher script to sign')
return parser
def main(argv):
"""The main func!"""
parser = get_parser()
opts = parser.parse_args(argv)
if not os.path.exists(opts.gpgdir):
parser.error(f'--gpgdir does not exist: {opts.gpgdir}')
if not os.path.exists(opts.launcher):
parser.error(f'launcher does not exist: {opts.launcher}')
opts.launcher = os.path.relpath(opts.launcher)
print(f'Signing "{opts.launcher}" launcher script and saving to '
f'"{opts.launcher}.asc"')
if opts.keys:
print(f'Using custom keys to sign: {" ".join(opts.keys)}')
else:
print('Using official Repo release keys to sign')
opts.keys = [util.KEYID_DSA, util.KEYID_RSA, util.KEYID_ECC]
util.import_release_key(opts)
sign(opts)
check(opts)
postmsg(opts)
return 0
if __name__ == '__main__':
sys.exit(main(sys.argv[1:]))


@ -1,140 +0,0 @@
#!/usr/bin/env python3
# Copyright (C) 2020 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""Helper tool for signing repo release tags correctly.
This is intended to be run only by the official Repo release managers, but it
could be run by people maintaining their own fork of the project.
NB: Check docs/release-process.md for production freeze information.
"""
import argparse
import os
import re
import subprocess
import sys
import util
# We currently sign with the old DSA key as it's been around the longest.
# We should transition to RSA by Jun 2020, and ECC by Jun 2021.
KEYID = util.KEYID_DSA
# Regular expression to validate tag names.
RE_VALID_TAG = r'^v([0-9]+[.])+[0-9]+$'
def sign(opts):
"""Tag the commit & sign it!"""
# We use ! at the end of the key so that gpg uses this specific key.
# Otherwise it uses the key as a lookup into the overall key and uses the
# default signing key. i.e. It will see that KEYID_RSA is a subkey of
# another key, and use the primary key to sign instead of the subkey.
cmd = ['git', 'tag', '-s', opts.tag, '-u', f'{opts.key}!',
'-m', f'repo {opts.tag}', opts.commit]
key = 'GNUPGHOME'
print('+', f'export {key}="{opts.gpgdir}"')
oldvalue = os.getenv(key)
os.putenv(key, opts.gpgdir)
util.run(opts, cmd)
if oldvalue is None:
os.unsetenv(key)
else:
os.putenv(key, oldvalue)
def check(opts):
"""Check the signature."""
util.run(opts, ['git', 'tag', '--verify', opts.tag])
def postmsg(opts):
"""Helpful info to show at the end for release manager."""
cmd = ['git', 'rev-parse', 'remotes/origin/stable']
ret = util.run(opts, cmd, encoding='utf-8', stdout=subprocess.PIPE)
current_release = ret.stdout.strip()
cmd = ['git', 'log', '--format=%h (%aN) %s', '--no-merges',
f'remotes/origin/stable..{opts.tag}']
ret = util.run(opts, cmd, encoding='utf-8', stdout=subprocess.PIPE)
shortlog = ret.stdout.strip()
print(f"""
Here's the short log since the last release.
{shortlog}
To push release to the public:
git push origin {opts.commit}:stable {opts.tag} -n
NB: People will start upgrading to this version immediately.
To roll back a release:
git push origin --force {current_release}:stable -n
""")
def get_parser():
"""Get a CLI parser."""
parser = argparse.ArgumentParser(
description=__doc__,
formatter_class=argparse.RawDescriptionHelpFormatter)
parser.add_argument('-n', '--dry-run',
dest='dryrun', action='store_true',
help='show everything that would be done')
parser.add_argument('--gpgdir',
default=os.path.join(util.HOMEDIR, '.gnupg', 'repo'),
help='path to dedicated gpg dir with release keys '
'(default: ~/.gnupg/repo/)')
parser.add_argument('-f', '--force', action='store_true',
help='force signing of any tag')
parser.add_argument('--keyid', dest='key',
help='alternative signing key to use')
parser.add_argument('tag',
help='the tag to create (e.g. "v2.0")')
parser.add_argument('commit', default='HEAD', nargs='?',
help='the commit to tag')
return parser
def main(argv):
"""The main func!"""
parser = get_parser()
opts = parser.parse_args(argv)
if not os.path.exists(opts.gpgdir):
parser.error(f'--gpgdir does not exist: {opts.gpgdir}')
if not opts.force and not re.match(RE_VALID_TAG, opts.tag):
parser.error(f'tag "{opts.tag}" does not match regex "{RE_VALID_TAG}"; '
'use --force to sign anyways')
if opts.key:
print(f'Using custom key to sign: {opts.key}')
else:
print('Using official Repo release key to sign')
opts.key = KEYID
util.import_release_key(opts)
sign(opts)
check(opts)
postmsg(opts)
return 0
if __name__ == '__main__':
sys.exit(main(sys.argv[1:]))


@ -1,73 +0,0 @@
# Copyright (C) 2020 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""Random utility code for release tools."""
import os
import re
import subprocess
import sys
assert sys.version_info >= (3, 6), 'This module requires Python 3.6+'
TOPDIR = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))
HOMEDIR = os.path.expanduser('~')
# These are the release keys we sign with.
KEYID_DSA = '8BB9AD793E8E6153AF0F9A4416530D5E920F5C65'
KEYID_RSA = 'A34A13BE8E76BFF46A0C022DA2E75A824AAB9624'
KEYID_ECC = 'E1F9040D7A3F6DAFAC897CD3D3B95DA243E48A39'
def cmdstr(cmd):
"""Get a nicely quoted shell command."""
ret = []
for arg in cmd:
if not re.match(r'^[a-zA-Z0-9/_.=-]+$', arg):
arg = f'"{arg}"'
ret.append(arg)
return ' '.join(ret)
def run(opts, cmd, check=True, **kwargs):
"""Helper around subprocess.run to include logging."""
print('+', cmdstr(cmd))
if opts.dryrun:
cmd = ['true', '--'] + cmd
try:
return subprocess.run(cmd, check=check, **kwargs)
except subprocess.CalledProcessError as e:
print(f'aborting: {e}', file=sys.stderr)
sys.exit(1)
def import_release_key(opts):
"""Import the public key of the official release repo signing key."""
# Extract the key from our repo launcher.
launcher = getattr(opts, 'launcher', os.path.join(TOPDIR, 'repo'))
print(f'Importing keys from "{launcher}" launcher script')
with open(launcher, encoding='utf-8') as fp:
data = fp.read()
keys = re.findall(
r'\n-----BEGIN PGP PUBLIC KEY BLOCK-----\n[^-]*'
r'\n-----END PGP PUBLIC KEY BLOCK-----\n', data, flags=re.M)
run(opts, ['gpg', '--import'], input='\n'.join(keys).encode('utf-8'))
print('Marking keys as fully trusted')
run(opts, ['gpg', '--import-ownertrust'],
input=f'{KEYID_DSA}:6:\n'.encode('utf-8'))

961
repo

File diff suppressed because it is too large.


@ -28,16 +28,13 @@ REPO_TRACE = 'REPO_TRACE'
_TRACE = os.environ.get(REPO_TRACE) == '1' _TRACE = os.environ.get(REPO_TRACE) == '1'
def IsTrace(): def IsTrace():
return _TRACE return _TRACE
def SetTrace(): def SetTrace():
global _TRACE global _TRACE
_TRACE = True _TRACE = True
def Trace(fmt, *args): def Trace(fmt, *args):
if IsTrace(): if IsTrace():
print(fmt % args, file=sys.stderr) print(fmt % args, file=sys.stderr)


@ -1,4 +1,4 @@
#!/usr/bin/env python #!/usr/bin/python
# -*- coding:utf-8 -*- # -*- coding:utf-8 -*-
# Copyright 2019 The Android Open Source Project # Copyright 2019 The Android Open Source Project
# #
@ -27,13 +27,14 @@ import sys
def run_pytest(cmd, argv): def run_pytest(cmd, argv):
"""Run the unittests via |cmd|.""" """Run the unittests via |cmd|."""
try: try:
return subprocess.call([cmd] + argv) subprocess.check_call([cmd] + argv)
return 0
except OSError as e: except OSError as e:
if e.errno == errno.ENOENT: if e.errno == errno.ENOENT:
print('%s: unable to run `%s`: %s' % (__file__, cmd, e), file=sys.stderr) print('%s: unable to run `%s`: %s' % (__file__, cmd, e), file=sys.stderr)
print('%s: Try installing pytest: sudo apt-get install python-pytest' % print('%s: Try installing pytest: sudo apt-get install python-pytest' %
(__file__,), file=sys.stderr) (__file__,), file=sys.stderr)
return 127 return 1
else: else:
raise raise
@ -42,11 +43,9 @@ def main(argv):
"""The main entry.""" """The main entry."""
# Add the repo tree to PYTHONPATH as the tests expect to be able to import # Add the repo tree to PYTHONPATH as the tests expect to be able to import
# modules directly. # modules directly.
pythonpath = os.path.dirname(os.path.realpath(__file__)) topdir = os.path.dirname(os.path.realpath(__file__))
oldpythonpath = os.environ.get('PYTHONPATH', None) pythonpath = os.environ.get('PYTHONPATH', '')
if oldpythonpath is not None: os.environ['PYTHONPATH'] = '%s:%s' % (topdir, pythonpath)
pythonpath += os.pathsep + oldpythonpath
os.environ['PYTHONPATH'] = pythonpath
return run_pytest('pytest', argv) return run_pytest('pytest', argv)


@ -1,63 +0,0 @@
#!/usr/bin/env python
# -*- coding:utf-8 -*-
# Copyright 2019 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""Python packaging for repo."""
from __future__ import print_function
import os
import setuptools
TOPDIR = os.path.dirname(os.path.abspath(__file__))
# Rip out the first intro paragraph.
with open(os.path.join(TOPDIR, 'README.md')) as fp:
lines = fp.read().splitlines()[2:]
end = lines.index('')
long_description = ' '.join(lines[0:end])
# https://packaging.python.org/tutorials/packaging-projects/
setuptools.setup(
name='repo',
version='1.13.8',
maintainer='Various',
maintainer_email='repo-discuss@googlegroups.com',
description='Repo helps manage many Git repositories',
long_description=long_description,
long_description_content_type='text/plain',
url='https://gerrit.googlesource.com/git-repo/',
project_urls={
'Bug Tracker': 'https://bugs.chromium.org/p/gerrit/issues/list?q=component:repo',
},
# https://pypi.org/classifiers/
classifiers=[
'Development Status :: 6 - Mature',
'Environment :: Console',
'Intended Audience :: Developers',
'License :: OSI Approved :: Apache Software License',
'Natural Language :: English',
'Operating System :: MacOS :: MacOS X',
'Operating System :: Microsoft :: Windows :: Windows 10',
'Operating System :: POSIX :: Linux',
'Topic :: Software Development :: Version Control :: Git',
],
# We support Python 2.7 and Python 3.6+.
python_requires='>=2.7, ' + ', '.join('!=3.%i.*' % x for x in range(0, 6)),
packages=['subcmds'],
)


@ -16,7 +16,6 @@
import os import os
# A mapping of the subcommand name to the class that implements it.
all_commands = {} all_commands = {}
my_dir = os.path.dirname(__file__) my_dir = os.path.dirname(__file__)
@ -38,7 +37,7 @@ for py in os.listdir(my_dir):
['%s' % name]) ['%s' % name])
mod = getattr(mod, name) mod = getattr(mod, name)
try: try:
cmd = getattr(mod, clsn) cmd = getattr(mod, clsn)()
except AttributeError: except AttributeError:
raise SyntaxError('%s/%s does not define class %s' % ( raise SyntaxError('%s/%s does not define class %s' % (
__name__, py, clsn)) __name__, py, clsn))
@ -47,5 +46,5 @@ for py in os.listdir(my_dir):
cmd.NAME = name cmd.NAME = name
all_commands[name] = cmd all_commands[name] = cmd
# Add 'branch' as an alias for 'branches'. if 'help' in all_commands:
all_commands['branch'] = all_commands['branches'] all_commands['help'].commands = all_commands
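With the master-side version of this loader, all_commands maps each subcommand name to its class (instantiation happens later, when the command is actually dispatched), and 'branch' is registered as a plain alias for 'branches'. A hypothetical lookup along the lines of what the dispatcher does with this table (details assumed for illustration):

from subcmds import all_commands

name = 'branch'               # resolves via the alias added above
cls = all_commands[name]      # the class itself on the master side
cmd = cls()                   # instantiate on demand
print(cmd.NAME, '-', cmd.helpSummary)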


@ -15,15 +15,12 @@
# limitations under the License. # limitations under the License.
from __future__ import print_function from __future__ import print_function
from collections import defaultdict
import sys import sys
from command import Command from command import Command
from collections import defaultdict
from git_command import git from git_command import git
from progress import Progress from progress import Progress
class Abandon(Command): class Abandon(Command):
common = True common = True
helpSummary = "Permanently abandon a development branch" helpSummary = "Permanently abandon a development branch"
@ -35,11 +32,7 @@ deleting it (and all its history) from your local repository.
It is equivalent to "git branch -D <branchname>". It is equivalent to "git branch -D <branchname>".
""" """
def _Options(self, p): def _Options(self, p):
p.add_option('-q', '--quiet',
action='store_true', default=False,
help='be quiet')
p.add_option('--all', p.add_option('--all',
dest='all', action='store_true', dest='all', action='store_true',
help='delete all branches in all projects') help='delete all branches in all projects')
@ -86,24 +79,21 @@ It is equivalent to "git branch -D <branchname>".
if err: if err:
for br in err.keys(): for br in err.keys():
err_msg = "error: cannot abandon %s" % br err_msg = "error: cannot abandon %s" %br
print(err_msg, file=sys.stderr) print(err_msg, file=sys.stderr)
for proj in err[br]: for proj in err[br]:
print(' ' * len(err_msg) + " | %s" % proj.relpath, file=sys.stderr) print(' '*len(err_msg) + " | %s" % proj.relpath, file=sys.stderr)
sys.exit(1) sys.exit(1)
elif not success: elif not success:
print('error: no project has local branch(es) : %s' % nb, print('error: no project has local branch(es) : %s' % nb,
file=sys.stderr) file=sys.stderr)
sys.exit(1) sys.exit(1)
else: else:
# Everything below here is displaying status. print('Abandoned branches:', file=sys.stderr)
if opt.quiet:
return
print('Abandoned branches:')
for br in success.keys(): for br in success.keys():
if len(all_projects) > 1 and len(all_projects) == len(success[br]): if len(all_projects) > 1 and len(all_projects) == len(success[br]):
result = "all project" result = "all project"
else: else:
result = "%s" % ( result = "%s" % (
('\n' + ' ' * width + '| ').join(p.relpath for p in success[br])) ('\n'+' '*width + '| ').join(p.relpath for p in success[br]))
print("%s%s| %s\n" % (br, ' ' * (width - len(br)), result)) print("%s%s| %s\n" % (br,' '*(width-len(br)), result),file=sys.stderr)


@ -19,7 +19,6 @@ import sys
from color import Coloring from color import Coloring
from command import Command from command import Command
class BranchColoring(Coloring): class BranchColoring(Coloring):
def __init__(self, config): def __init__(self, config):
Coloring.__init__(self, config, 'branch') Coloring.__init__(self, config, 'branch')
@ -27,7 +26,6 @@ class BranchColoring(Coloring):
self.local = self.printer('local') self.local = self.printer('local')
self.notinproject = self.printer('notinproject', fg='red') self.notinproject = self.printer('notinproject', fg='red')
class BranchInfo(object): class BranchInfo(object):
def __init__(self, name): def __init__(self, name):
self.name = name self.name = name
@ -160,7 +158,7 @@ is shown, then the branch appears in all projects.
for b in i.projects: for b in i.projects:
have.add(b.project) have.add(b.project)
for p in projects: for p in projects:
if p not in have: if not p in have:
paths.append(p.relpath) paths.append(p.relpath)
s = ' %s %s' % (in_type, ', '.join(paths)) s = ' %s %s' % (in_type, ', '.join(paths))
@ -172,11 +170,11 @@ is shown, then the branch appears in all projects.
fmt = out.current if i.IsCurrent else out.write fmt = out.current if i.IsCurrent else out.write
for p in paths: for p in paths:
out.nl() out.nl()
fmt(width * ' ' + ' %s' % p) fmt(width*' ' + ' %s' % p)
fmt = out.write fmt = out.write
for p in non_cur_paths: for p in non_cur_paths:
out.nl() out.nl()
fmt(width * ' ' + ' %s' % p) fmt(width*' ' + ' %s' % p)
else: else:
out.write(' in all projects') out.write(' in all projects')
out.nl() out.nl()


@ -19,7 +19,6 @@ import sys
from command import Command from command import Command
from progress import Progress from progress import Progress
class Checkout(Command): class Checkout(Command):
common = True common = True
helpSummary = "Checkout a branch for development" helpSummary = "Checkout a branch for development"


@ -22,7 +22,6 @@ from git_command import GitCommand
CHANGE_ID_RE = re.compile(r'^\s*Change-Id: I([0-9a-f]{40})\s*$') CHANGE_ID_RE = re.compile(r'^\s*Change-Id: I([0-9a-f]{40})\s*$')
class CherryPick(Command): class CherryPick(Command):
common = True common = True
helpSummary = "Cherry-pick a change." helpSummary = "Cherry-pick a change."
@ -47,8 +46,8 @@ change id will be added.
p = GitCommand(None, p = GitCommand(None,
['rev-parse', '--verify', reference], ['rev-parse', '--verify', reference],
capture_stdout=True, capture_stdout = True,
capture_stderr=True) capture_stderr = True)
if p.Wait() != 0: if p.Wait() != 0:
print(p.stderr, file=sys.stderr) print(p.stderr, file=sys.stderr)
sys.exit(1) sys.exit(1)
@ -62,8 +61,8 @@ change id will be added.
p = GitCommand(None, p = GitCommand(None,
['cherry-pick', sha1], ['cherry-pick', sha1],
capture_stdout=True, capture_stdout = True,
capture_stderr=True) capture_stderr = True)
status = p.Wait() status = p.Wait()
print(p.stdout, file=sys.stdout) print(p.stdout, file=sys.stdout)
@ -75,9 +74,9 @@ change id will be added.
new_msg = self._Reformat(old_msg, sha1) new_msg = self._Reformat(old_msg, sha1)
p = GitCommand(None, ['commit', '--amend', '-F', '-'], p = GitCommand(None, ['commit', '--amend', '-F', '-'],
provide_stdin=True, provide_stdin = True,
capture_stdout=True, capture_stdout = True,
capture_stderr=True) capture_stderr = True)
p.stdin.write(new_msg) p.stdin.write(new_msg)
p.stdin.close() p.stdin.close()
if p.Wait() != 0: if p.Wait() != 0:
@ -98,7 +97,7 @@ change id will be added.
def _StripHeader(self, commit_msg): def _StripHeader(self, commit_msg):
lines = commit_msg.splitlines() lines = commit_msg.splitlines()
return "\n".join(lines[lines.index("") + 1:]) return "\n".join(lines[lines.index("")+1:])
def _Reformat(self, old_msg, sha1): def _Reformat(self, old_msg, sha1):
new_msg = [] new_msg = []


@ -16,7 +16,6 @@
from command import PagedCommand from command import PagedCommand
class Diff(PagedCommand): class Diff(PagedCommand):
common = True common = True
helpSummary = "Show changes between commit and working tree" helpSummary = "Show changes between commit and working tree"
@ -29,6 +28,10 @@ to the Unix 'patch' command.
""" """
def _Options(self, p): def _Options(self, p):
def cmd(option, opt_str, value, parser):
setattr(parser.values, option.dest, list(parser.rargs))
while parser.rargs:
del parser.rargs[0]
p.add_option('-u', '--absolute', p.add_option('-u', '--absolute',
dest='absolute', action='store_true', dest='absolute', action='store_true',
help='Paths are relative to the repository root') help='Paths are relative to the repository root')


@ -18,12 +18,10 @@ from color import Coloring
from command import PagedCommand from command import PagedCommand
from manifest_xml import XmlManifest from manifest_xml import XmlManifest
class _Coloring(Coloring): class _Coloring(Coloring):
def __init__(self, config): def __init__(self, config):
Coloring.__init__(self, config, "status") Coloring.__init__(self, config, "status")
class Diffmanifests(PagedCommand): class Diffmanifests(PagedCommand):
""" A command to see logs in projects represented by manifests """ A command to see logs in projects represented by manifests
@ -79,7 +77,7 @@ synced and their revisions won't be found.
metavar='<FORMAT>', metavar='<FORMAT>',
help='print the log using a custom git pretty format string') help='print the log using a custom git pretty format string')
def _printRawDiff(self, diff, pretty_format=None): def _printRawDiff(self, diff):
for project in diff['added']: for project in diff['added']:
self.printText("A %s %s" % (project.relpath, project.revisionExpr)) self.printText("A %s %s" % (project.relpath, project.revisionExpr))
self.out.nl() self.out.nl()
@ -92,7 +90,7 @@ synced and their revisions won't be found.
self.printText("C %s %s %s" % (project.relpath, project.revisionExpr, self.printText("C %s %s %s" % (project.relpath, project.revisionExpr,
otherProject.revisionExpr)) otherProject.revisionExpr))
self.out.nl() self.out.nl()
self._printLogs(project, otherProject, raw=True, color=False, pretty_format=pretty_format) self._printLogs(project, otherProject, raw=True, color=False)
for project, otherProject in diff['unreachable']: for project, otherProject in diff['unreachable']:
self.printText("U %s %s %s" % (project.relpath, project.revisionExpr, self.printText("U %s %s %s" % (project.relpath, project.revisionExpr,
@ -186,10 +184,10 @@ synced and their revisions won't be found.
self.out = _Coloring(self.manifest.globalConfig) self.out = _Coloring(self.manifest.globalConfig)
self.printText = self.out.nofmt_printer('text') self.printText = self.out.nofmt_printer('text')
if opt.color: if opt.color:
self.printProject = self.out.nofmt_printer('project', attr='bold') self.printProject = self.out.nofmt_printer('project', attr = 'bold')
self.printAdded = self.out.nofmt_printer('green', fg='green', attr='bold') self.printAdded = self.out.nofmt_printer('green', fg = 'green', attr = 'bold')
self.printRemoved = self.out.nofmt_printer('red', fg='red', attr='bold') self.printRemoved = self.out.nofmt_printer('red', fg = 'red', attr = 'bold')
self.printRevision = self.out.nofmt_printer('revision', fg='yellow') self.printRevision = self.out.nofmt_printer('revision', fg = 'yellow')
else: else:
self.printProject = self.printAdded = self.printRemoved = self.printRevision = self.printText self.printProject = self.printAdded = self.printRemoved = self.printRevision = self.printText
@ -203,6 +201,6 @@ synced and their revisions won't be found.
diff = manifest1.projectsDiff(manifest2) diff = manifest1.projectsDiff(manifest2)
if opt.raw: if opt.raw:
self._printRawDiff(diff, pretty_format=opt.pretty_format) self._printRawDiff(diff)
else: else:
self._printDiff(diff, color=opt.color, pretty_format=opt.pretty_format) self._printDiff(diff, color=opt.color, pretty_format=opt.pretty_format)

46
subcmds/download.py Normal file → Executable file

@ -23,7 +23,6 @@ from error import GitError
CHANGE_RE = re.compile(r'^([1-9][0-9]*)(?:[/\.-]([1-9][0-9]*))?$') CHANGE_RE = re.compile(r'^([1-9][0-9]*)(?:[/\.-]([1-9][0-9]*))?$')
class Download(Command): class Download(Command):
common = True common = True
helpSummary = "Download and checkout a change" helpSummary = "Download and checkout a change"
@ -37,13 +36,9 @@ If no project is specified try to use current directory as a project.
""" """
def _Options(self, p): def _Options(self, p):
p.add_option('-b', '--branch',
help='create a new branch first')
p.add_option('-c', '--cherry-pick', p.add_option('-c', '--cherry-pick',
dest='cherrypick', action='store_true', dest='cherrypick', action='store_true',
help="cherry-pick instead of checkout") help="cherry-pick instead of checkout")
p.add_option('-x', '--record-origin', action='store_true',
help='pass -x when cherry-picking')
p.add_option('-r', '--revert', p.add_option('-r', '--revert',
dest='revert', action='store_true', dest='revert', action='store_true',
help="revert instead of checkout") help="revert instead of checkout")
@ -82,14 +77,6 @@ If no project is specified try to use current directory as a project.
project = self.GetProjects([a])[0] project = self.GetProjects([a])[0]
return to_get return to_get
def ValidateOptions(self, opt, args):
if opt.record_origin:
if not opt.cherrypick:
self.OptionParser.error('-x only makes sense with --cherry-pick')
if opt.ffonly:
self.OptionParser.error('-x and --ff are mutually exclusive options')
def Execute(self, opt, args): def Execute(self, opt, args):
for project, change_id, ps_id in self._ParseChangeIds(args): for project, change_id, ps_id in self._ParseChangeIds(args):
dl = project.DownloadPatchSet(change_id, ps_id) dl = project.DownloadPatchSet(change_id, ps_id)
@ -106,41 +93,22 @@ If no project is specified try to use current directory as a project.
continue continue
if len(dl.commits) > 1: if len(dl.commits) > 1:
print('[%s] %d/%d depends on %d unmerged changes:' print('[%s] %d/%d depends on %d unmerged changes:' \
% (project.name, change_id, ps_id, len(dl.commits)), % (project.name, change_id, ps_id, len(dl.commits)),
file=sys.stderr) file=sys.stderr)
for c in dl.commits: for c in dl.commits:
print(' %s' % (c), file=sys.stderr) print(' %s' % (c), file=sys.stderr)
if opt.cherrypick: if opt.cherrypick:
mode = 'cherry-pick'
elif opt.revert:
mode = 'revert'
elif opt.ffonly:
mode = 'fast-forward merge'
else:
mode = 'checkout'
# We'll combine the branch+checkout operation, but all the rest need a
# dedicated branch start.
if opt.branch and mode != 'checkout':
project.StartBranch(opt.branch)
try: try:
if opt.cherrypick: project._CherryPick(dl.commit)
project._CherryPick(dl.commit, ffonly=opt.ffonly, except GitError:
record_origin=opt.record_origin) print('[%s] Could not complete the cherry-pick of %s' \
% (project.name, dl.commit), file=sys.stderr)
sys.exit(1)
elif opt.revert: elif opt.revert:
project._Revert(dl.commit) project._Revert(dl.commit)
elif opt.ffonly: elif opt.ffonly:
project._FastForward(dl.commit, ffonly=True) project._FastForward(dl.commit, ffonly=True)
else:
if opt.branch:
project.StartBranch(opt.branch, revision=dl.commit)
else: else:
project._Checkout(dl.commit) project._Checkout(dl.commit)
except GitError:
print('[%s] Could not complete the %s of %s'
% (project.name, mode, dl.commit), file=sys.stderr)
sys.exit(1)
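
Read as a whole, the master-side column of this interleaved hunk funnels cherry-pick, revert, fast-forward and checkout through a single mode string, starts the new -b branch up front for the non-checkout modes, and reports any GitError with that mode name. A minimal sketch of that flow, using only names that appear in the hunk (the Project helper signatures are taken from the diff, not verified against the full source):

from __future__ import print_function
import sys

from error import GitError  # same import the file already uses


def _apply_change(project, dl, opt):
  # Sketch of the master-side Execute() body for one downloaded change.
  if opt.cherrypick:
    mode = 'cherry-pick'
  elif opt.revert:
    mode = 'revert'
  elif opt.ffonly:
    mode = 'fast-forward merge'
  else:
    mode = 'checkout'

  # Checkout can combine branch creation with the checkout itself; the other
  # modes need the branch started first.
  if opt.branch and mode != 'checkout':
    project.StartBranch(opt.branch)

  try:
    if opt.cherrypick:
      project._CherryPick(dl.commit, ffonly=opt.ffonly,
                          record_origin=opt.record_origin)
    elif opt.revert:
      project._Revert(dl.commit)
    elif opt.ffonly:
      project._FastForward(dl.commit, ffonly=True)
    else:
      if opt.branch:
        project.StartBranch(opt.branch, revision=dl.commit)
      else:
        project._Checkout(dl.commit)
  except GitError:
    print('[%s] Could not complete the %s of %s'
          % (project.name, mode, dl.commit), file=sys.stderr)
    sys.exit(1)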


@ -127,8 +127,7 @@ without iterating through the remaining projects.
help="Execute the command only on projects matching regex or wildcard expression") help="Execute the command only on projects matching regex or wildcard expression")
p.add_option('-i', '--inverse-regex', p.add_option('-i', '--inverse-regex',
dest='inverse_regex', action='store_true', dest='inverse_regex', action='store_true',
help="Execute the command only on projects not matching regex or " help="Execute the command only on projects not matching regex or wildcard expression")
"wildcard expression")
p.add_option('-g', '--groups', p.add_option('-g', '--groups',
dest='groups', dest='groups',
help="Execute the command only on projects matching the specified groups") help="Execute the command only on projects matching the specified groups")
@ -140,9 +139,6 @@ without iterating through the remaining projects.
p.add_option('-e', '--abort-on-errors', p.add_option('-e', '--abort-on-errors',
dest='abort_on_errors', action='store_true', dest='abort_on_errors', action='store_true',
help='Abort if a command exits unsuccessfully') help='Abort if a command exits unsuccessfully')
p.add_option('--ignore-missing', action='store_true',
help='Silently skip & do not exit non-zero due missing '
'checkouts')
g = p.add_option_group('Output') g = p.add_option_group('Output')
g.add_option('-p', g.add_option('-p',
@ -179,8 +175,6 @@ without iterating through the remaining projects.
'annotations': dict((a.name, a.value) for a in project.annotations), 'annotations': dict((a.name, a.value) for a in project.annotations),
'gitdir': project.gitdir, 'gitdir': project.gitdir,
'worktree': project.worktree, 'worktree': project.worktree,
'upstream': project.upstream,
'dest_branch': project.dest_branch,
} }
def ValidateOptions(self, opt, args): def ValidateOptions(self, opt, args):
@ -280,7 +274,6 @@ without iterating through the remaining projects.
return return
yield [mirror, opt, cmd, shell, cnt, config, project] yield [mirror, opt, cmd, shell, cnt, config, project]
class WorkerKeyboardInterrupt(Exception): class WorkerKeyboardInterrupt(Exception):
""" Keyboard interrupt exception for worker processes. """ """ Keyboard interrupt exception for worker processes. """
pass pass
@ -289,7 +282,6 @@ class WorkerKeyboardInterrupt(Exception):
def InitWorker(): def InitWorker():
signal.signal(signal.SIGINT, signal.SIG_IGN) signal.signal(signal.SIGINT, signal.SIG_IGN)
def DoWorkWrapper(args): def DoWorkWrapper(args):
""" A wrapper around the DoWork() method. """ A wrapper around the DoWork() method.
@ -308,10 +300,11 @@ def DoWorkWrapper(args):
def DoWork(project, mirror, opt, cmd, shell, cnt, config): def DoWork(project, mirror, opt, cmd, shell, cnt, config):
env = os.environ.copy() env = os.environ.copy()
def setenv(name, val): def setenv(name, val):
if val is None: if val is None:
val = '' val = ''
if hasattr(val, 'encode'):
val = val.encode()
env[name] = val env[name] = val
setenv('REPO_PROJECT', project['name']) setenv('REPO_PROJECT', project['name'])
@ -319,8 +312,6 @@ def DoWork(project, mirror, opt, cmd, shell, cnt, config):
setenv('REPO_REMOTE', project['remote_name']) setenv('REPO_REMOTE', project['remote_name'])
setenv('REPO_LREV', project['lrev']) setenv('REPO_LREV', project['lrev'])
setenv('REPO_RREV', project['rrev']) setenv('REPO_RREV', project['rrev'])
setenv('REPO_UPSTREAM', project['upstream'])
setenv('REPO_DEST_BRANCH', project['dest_branch'])
setenv('REPO_I', str(cnt + 1)) setenv('REPO_I', str(cnt + 1))
for name in project['annotations']: for name in project['annotations']:
setenv("REPO__%s" % (name), project['annotations'][name]) setenv("REPO__%s" % (name), project['annotations'][name])
@ -332,10 +323,6 @@ def DoWork(project, mirror, opt, cmd, shell, cnt, config):
cwd = project['worktree'] cwd = project['worktree']
if not os.path.exists(cwd): if not os.path.exists(cwd):
# Allow the user to silently ignore missing checkouts so they can run on
# partial checkouts (good for infra recovery tools).
if opt.ignore_missing:
return 0
if ((opt.project_header and opt.verbose) if ((opt.project_header and opt.verbose)
or not opt.project_header): or not opt.project_header):
print('skipping %s/' % project['relpath'], file=sys.stderr) print('skipping %s/' % project['relpath'], file=sys.stderr)
@ -372,10 +359,10 @@ def DoWork(project, mirror, opt, cmd, shell, cnt, config):
while not s_in.is_done: while not s_in.is_done:
in_ready = s_in.select() in_ready = s_in.select()
for s in in_ready: for s in in_ready:
buf = s.read().decode() buf = s.read()
if not buf: if not buf:
s_in.remove(s)
s.close() s.close()
s_in.remove(s)
continue continue
if not opt.verbose: if not opt.verbose:
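
Besides the --ignore-missing flag for partial checkouts, the master side also exports REPO_UPSTREAM and REPO_DEST_BRANCH to the spawned command, alongside the existing REPO_* variables. A small, hypothetical script that could be run via repo forall -c to consume the new variables (the variable names come from the hunk; everything else is illustrative):

from __future__ import print_function
import os

# Hypothetical helper invoked as: repo forall -c 'python show_dest.py'
project = os.environ.get('REPO_PROJECT', '')
upstream = os.environ.get('REPO_UPSTREAM', '')        # added on the master side
dest_branch = os.environ.get('REPO_DEST_BRANCH', '')  # added on the master side

if dest_branch:
  print('%s uploads to %s (upstream: %s)' % (project, dest_branch, upstream))
else:
  print('%s has no dest-branch set' % project)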


@ -22,8 +22,7 @@ import platform_utils
from pyversion import is_python3 from pyversion import is_python3
if not is_python3(): if not is_python3():
input = raw_input # noqa: F821 input = raw_input
class GitcDelete(Command, GitcClientCommand): class GitcDelete(Command, GitcClientCommand):
common = True common = True


@ -50,7 +50,7 @@ use for this GITC client.
""" """
def _Options(self, p): def _Options(self, p):
super(GitcInit, self)._Options(p, gitc_init=True) super(GitcInit, self)._Options(p)
g = p.add_option_group('GITC options') g = p.add_option_group('GITC options')
g.add_option('-f', '--manifest-file', g.add_option('-f', '--manifest-file',
dest='manifest_file', dest='manifest_file',
@ -62,8 +62,7 @@ use for this GITC client.
def Execute(self, opt, args): def Execute(self, opt, args):
gitc_client = gitc_utils.parse_clientdir(os.getcwd()) gitc_client = gitc_utils.parse_clientdir(os.getcwd())
if not gitc_client or (opt.gitc_client and gitc_client != opt.gitc_client): if not gitc_client or (opt.gitc_client and gitc_client != opt.gitc_client):
print('fatal: Please update your repo command. See go/gitc for instructions.', print('fatal: Please update your repo command. See go/gitc for instructions.', file=sys.stderr)
file=sys.stderr)
sys.exit(1) sys.exit(1)
self.client_dir = os.path.join(gitc_utils.get_gitc_manifest_dir(), self.client_dir = os.path.join(gitc_utils.get_gitc_manifest_dir(),
gitc_client) gitc_client)


@ -21,8 +21,7 @@ import sys
from color import Coloring from color import Coloring
from command import PagedCommand from command import PagedCommand
from error import GitError from error import GitError
from git_command import GitCommand from git_command import git_require, GitCommand
class GrepColoring(Coloring): class GrepColoring(Coloring):
def __init__(self, config): def __init__(self, config):
@ -30,7 +29,6 @@ class GrepColoring(Coloring):
self.project = self.printer('project', attr='bold') self.project = self.printer('project', attr='bold')
self.fail = self.printer('fail', fg='red') self.fail = self.printer('fail', fg='red')
class Grep(PagedCommand): class Grep(PagedCommand):
common = True common = True
helpSummary = "Print lines matching a pattern" helpSummary = "Print lines matching a pattern"
@ -158,11 +156,12 @@ contain a line that matches both expressions:
action='callback', callback=carry, action='callback', callback=carry,
help='Show only file names not containing matching lines') help='Show only file names not containing matching lines')
def Execute(self, opt, args): def Execute(self, opt, args):
out = GrepColoring(self.manifest.manifestProject.config) out = GrepColoring(self.manifest.manifestProject.config)
cmd_argv = ['grep'] cmd_argv = ['grep']
if out.is_on: if out.is_on and git_require((1, 6, 3)):
cmd_argv.append('--color') cmd_argv.append('--color')
cmd_argv.extend(getattr(opt, 'cmd_argv', [])) cmd_argv.extend(getattr(opt, 'cmd_argv', []))


@ -19,12 +19,10 @@ import re
import sys import sys
from formatter import AbstractFormatter, DumbWriter from formatter import AbstractFormatter, DumbWriter
from subcmds import all_commands
from color import Coloring from color import Coloring
from command import PagedCommand, MirrorSafeCommand, GitcAvailableCommand, GitcClientCommand from command import PagedCommand, MirrorSafeCommand, GitcAvailableCommand, GitcClientCommand
import gitc_utils import gitc_utils
class Help(PagedCommand, MirrorSafeCommand): class Help(PagedCommand, MirrorSafeCommand):
common = False common = False
helpSummary = "Display detailed help on a command" helpSummary = "Display detailed help on a command"
@ -35,26 +33,23 @@ class Help(PagedCommand, MirrorSafeCommand):
Displays detailed usage information about a command. Displays detailed usage information about a command.
""" """
def _PrintCommands(self, commandNames): def _PrintAllCommands(self):
"""Helper to display |commandNames| summaries.""" print('usage: repo COMMAND [ARGS]')
print('The complete list of recognized repo commands are:')
commandNames = list(sorted(self.commands))
maxlen = 0 maxlen = 0
for name in commandNames: for name in commandNames:
maxlen = max(maxlen, len(name)) maxlen = max(maxlen, len(name))
fmt = ' %%-%ds %%s' % maxlen fmt = ' %%-%ds %%s' % maxlen
for name in commandNames: for name in commandNames:
command = all_commands[name]() command = self.commands[name]
try: try:
summary = command.helpSummary.strip() summary = command.helpSummary.strip()
except AttributeError: except AttributeError:
summary = '' summary = ''
print(fmt % (name, summary)) print(fmt % (name, summary))
def _PrintAllCommands(self):
print('usage: repo COMMAND [ARGS]')
print('The complete list of recognized repo commands are:')
commandNames = list(sorted(all_commands))
self._PrintCommands(commandNames)
print("See 'repo help <command>' for more information on a " print("See 'repo help <command>' for more information on a "
'specific command.') 'specific command.')
@ -74,13 +69,24 @@ Displays detailed usage information about a command.
return False return False
commandNames = list(sorted([name commandNames = list(sorted([name
for name, command in all_commands.items() for name, command in self.commands.items()
if command.common and gitc_supported(command)])) if command.common and gitc_supported(command)]))
self._PrintCommands(commandNames)
maxlen = 0
for name in commandNames:
maxlen = max(maxlen, len(name))
fmt = ' %%-%ds %%s' % maxlen
for name in commandNames:
command = self.commands[name]
try:
summary = command.helpSummary.strip()
except AttributeError:
summary = ''
print(fmt % (name, summary))
print( print(
"See 'repo help <command>' for more information on a specific command.\n" "See 'repo help <command>' for more information on a specific command.\n"
"See 'repo help --all' for a complete list of recognized commands.") "See 'repo help --all' for a complete list of recognized commands.")
def _PrintCommandHelp(self, cmd, header_prefix=''): def _PrintCommandHelp(self, cmd, header_prefix=''):
class _Out(Coloring): class _Out(Coloring):
@ -133,8 +139,8 @@ Displays detailed usage information about a command.
out._PrintSection('Description', 'helpDescription') out._PrintSection('Description', 'helpDescription')
def _PrintAllCommandHelp(self): def _PrintAllCommandHelp(self):
for name in sorted(all_commands): for name in sorted(self.commands):
cmd = all_commands[name]() cmd = self.commands[name]
cmd.manifest = self.manifest cmd.manifest = self.manifest
self._PrintCommandHelp(cmd, header_prefix='[%s] ' % (name,)) self._PrintCommandHelp(cmd, header_prefix='[%s] ' % (name,))
@ -159,7 +165,7 @@ Displays detailed usage information about a command.
name = args[0] name = args[0]
try: try:
cmd = all_commands[name]() cmd = self.commands[name]
except KeyError: except KeyError:
print("repo: '%s' is not a repo command." % name, file=sys.stderr) print("repo: '%s' is not a repo command." % name, file=sys.stderr)
sys.exit(1) sys.exit(1)
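
The net effect of the interleaved hunks above: master folds the duplicated summary-printing loops into one _PrintCommands() helper and looks commands up in the module-level all_commands mapping (instantiating the class) instead of self.commands. A sketch of the shared helper as it reads after the change, shown out of class for brevity; the exact column spacing in the format string is approximate:

from __future__ import print_function

from subcmds import all_commands  # name -> command class, per the new import above


def print_commands(commandNames):
  """Sketch of the master-side Help._PrintCommands() helper."""
  maxlen = 0
  for name in commandNames:
    maxlen = max(maxlen, len(name))
  fmt = '  %%-%ds  %%s' % maxlen

  for name in commandNames:
    command = all_commands[name]()
    try:
      summary = command.helpSummary.strip()
    except AttributeError:
      summary = ''
    print(fmt % (name, summary))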


@ -16,14 +16,12 @@
from command import PagedCommand from command import PagedCommand
from color import Coloring from color import Coloring
from git_refs import R_M, R_HEADS from git_refs import R_M
class _Coloring(Coloring): class _Coloring(Coloring):
def __init__(self, config): def __init__(self, config):
Coloring.__init__(self, config, "status") Coloring.__init__(self, config, "status")
class Info(PagedCommand): class Info(PagedCommand):
common = True common = True
helpSummary = "Get info on the manifest branch, current branch or unmerged branches" helpSummary = "Get info on the manifest branch, current branch or unmerged branches"
@ -43,14 +41,15 @@ class Info(PagedCommand):
dest="local", action="store_true", dest="local", action="store_true",
help="Disable all remote operations") help="Disable all remote operations")
def Execute(self, opt, args): def Execute(self, opt, args):
self.out = _Coloring(self.manifest.globalConfig) self.out = _Coloring(self.manifest.globalConfig)
self.heading = self.out.printer('heading', attr='bold') self.heading = self.out.printer('heading', attr = 'bold')
self.headtext = self.out.nofmt_printer('headtext', fg='yellow') self.headtext = self.out.nofmt_printer('headtext', fg = 'yellow')
self.redtext = self.out.printer('redtext', fg='red') self.redtext = self.out.printer('redtext', fg = 'red')
self.sha = self.out.printer("sha", fg='yellow') self.sha = self.out.printer("sha", fg = 'yellow')
self.text = self.out.nofmt_printer('text') self.text = self.out.nofmt_printer('text')
self.dimtext = self.out.printer('dimtext', attr='dim') self.dimtext = self.out.printer('dimtext', attr = 'dim')
self.opt = opt self.opt = opt
@ -104,10 +103,6 @@ class Info(PagedCommand):
self.headtext(currentBranch) self.headtext(currentBranch)
self.out.nl() self.out.nl()
self.heading("Manifest revision: ")
self.headtext(p.revisionExpr)
self.out.nl()
localBranches = list(p.GetBranches().keys()) localBranches = list(p.GetBranches().keys())
self.heading("Local Branches: ") self.heading("Local Branches: ")
self.redtext(str(len(localBranches))) self.redtext(str(len(localBranches)))
@ -123,14 +118,11 @@ class Info(PagedCommand):
self.printSeparator() self.printSeparator()
def findRemoteLocalDiff(self, project): def findRemoteLocalDiff(self, project):
# Fetch all the latest commits. #Fetch all the latest commits
if not self.opt.local: if not self.opt.local:
project.Sync_NetworkHalf(quiet=True, current_branch_only=True) project.Sync_NetworkHalf(quiet=True, current_branch_only=True)
branch = self.manifest.manifestProject.config.GetBranch('default').merge logTarget = R_M + self.manifest.manifestProject.config.GetBranch("default").merge
if branch.startswith(R_HEADS):
branch = branch[len(R_HEADS):]
logTarget = R_M + branch
bareTmp = project.bare_git._bare bareTmp = project.bare_git._bare
project.bare_git._bare = False project.bare_git._bare = False
@ -208,7 +200,7 @@ class Info(PagedCommand):
for commit in commits: for commit in commits:
split = commit.split() split = commit.split()
self.text('{0:38}{1} '.format('', '-')) self.text('{0:38}{1} '.format('','-'))
self.sha(split[0] + " ") self.sha(split[0] + " ")
self.text(" ".join(split[1:])) self.text(" ".join(split[1:]))
self.out.nl() self.out.nl()
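
One easy-to-miss change in the fused lines above: on the master side the merge ref read from the manifest config may carry a refs/heads/ prefix, so it is stripped before being appended to R_M (and a "Manifest revision:" line is now printed). The master-side column of that hunk, pulled out for readability; R_M and R_HEADS are the ref-prefix constants imported at the top of the file:

branch = self.manifest.manifestProject.config.GetBranch('default').merge
if branch.startswith(R_HEADS):
  branch = branch[len(R_HEADS):]
logTarget = R_M + branch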


@ -15,8 +15,6 @@
# limitations under the License. # limitations under the License.
from __future__ import print_function from __future__ import print_function
import optparse
import os import os
import platform import platform
import re import re
@ -36,10 +34,8 @@ from command import InteractiveCommand, MirrorSafeCommand
from error import ManifestParseError from error import ManifestParseError
from project import SyncBuffer from project import SyncBuffer
from git_config import GitConfig from git_config import GitConfig
from git_command import git_require, MIN_GIT_VERSION_SOFT, MIN_GIT_VERSION_HARD from git_command import git_require, MIN_GIT_VERSION
import platform_utils import platform_utils
from wrapper import Wrapper
class Init(InteractiveCommand, MirrorSafeCommand): class Init(InteractiveCommand, MirrorSafeCommand):
common = True common = True
@ -54,8 +50,7 @@ from the server and is installed in the .repo/ directory in the
current working directory. current working directory.
The optional -b argument can be used to select the manifest branch The optional -b argument can be used to select the manifest branch
to checkout and use. If no branch is specified, the remote's default to checkout and use. If no branch is specified, master is assumed.
branch is used.
The optional -m argument can be used to specify an alternate manifest The optional -m argument can be used to specify an alternate manifest
to be used. If no manifest is specified, the manifest default.xml to be used. If no manifest is specified, the manifest default.xml
@ -86,15 +81,12 @@ manifest, a subsequent `repo sync` (or `repo sync -d`) is necessary
to update the working directory files. to update the working directory files.
""" """
def _Options(self, p, gitc_init=False): def _Options(self, p):
# Logging # Logging
g = p.add_option_group('Logging options') g = p.add_option_group('Logging options')
g.add_option('-v', '--verbose',
dest='output_mode', action='store_true',
help='show all output')
g.add_option('-q', '--quiet', g.add_option('-q', '--quiet',
dest='output_mode', action='store_false', dest="quiet", action="store_true", default=False,
help='only show errors') help="be quiet")
# Manifest # Manifest
g = p.add_option_group('Manifest options') g = p.add_option_group('Manifest options')
@ -104,12 +96,7 @@ to update the working directory files.
g.add_option('-b', '--manifest-branch', g.add_option('-b', '--manifest-branch',
dest='manifest_branch', dest='manifest_branch',
help='manifest branch or revision', metavar='REVISION') help='manifest branch or revision', metavar='REVISION')
cbr_opts = ['--current-branch'] g.add_option('--current-branch',
# The gitc-init subcommand allocates -c itself, but a lot of init users
# want -c, so try to satisfy both as best we can.
if not gitc_init:
cbr_opts += ['-c']
g.add_option(*cbr_opts,
dest='current_branch_only', action='store_true', dest='current_branch_only', action='store_true',
help='fetch only current manifest branch from server') help='fetch only current manifest branch from server')
g.add_option('-m', '--manifest-name', g.add_option('-m', '--manifest-name',
@ -135,10 +122,6 @@ to update the working directory files.
g.add_option('--clone-filter', action='store', default='blob:none', g.add_option('--clone-filter', action='store', default='blob:none',
dest='clone_filter', dest='clone_filter',
help='filter for use with --partial-clone [default: %default]') help='filter for use with --partial-clone [default: %default]')
# TODO(vapier): Expose option with real help text once this has been in the
# wild for a while w/out significant bug reports. Goal is by ~Sep 2020.
g.add_option('--worktree', action='store_true',
help=optparse.SUPPRESS_HELP)
g.add_option('--archive', g.add_option('--archive',
dest='archive', action='store_true', dest='archive', action='store_true',
help='checkout an archive instead of a git repository for ' help='checkout an archive instead of a git repository for '
@ -156,13 +139,11 @@ to update the working directory files.
help='restrict manifest projects to ones with a specified ' help='restrict manifest projects to ones with a specified '
'platform group [auto|all|none|linux|darwin|...]', 'platform group [auto|all|none|linux|darwin|...]',
metavar='PLATFORM') metavar='PLATFORM')
g.add_option('--clone-bundle', action='store_true',
help='force use of /clone.bundle on HTTP/HTTPS (default if not --partial-clone)')
g.add_option('--no-clone-bundle', g.add_option('--no-clone-bundle',
dest='clone_bundle', action='store_false', dest='no_clone_bundle', action='store_true',
help='disable use of /clone.bundle on HTTP/HTTPS (default if --partial-clone)') help='disable use of /clone.bundle on HTTP/HTTPS')
g.add_option('--no-tags', g.add_option('--no-tags',
dest='tags', default=True, action='store_false', dest='no_tags', action='store_true',
help="don't fetch tags in the manifest") help="don't fetch tags in the manifest")
# Tool # Tool
@ -170,12 +151,11 @@ to update the working directory files.
g.add_option('--repo-url', g.add_option('--repo-url',
dest='repo_url', dest='repo_url',
help='repo repository location', metavar='URL') help='repo repository location', metavar='URL')
g.add_option('--repo-rev', metavar='REV', g.add_option('--repo-branch',
help='repo branch or revision') dest='repo_branch',
g.add_option('--repo-branch', dest='repo_rev', help='repo branch or revision', metavar='REVISION')
help=optparse.SUPPRESS_HELP)
g.add_option('--no-repo-verify', g.add_option('--no-repo-verify',
dest='repo_verify', default=True, action='store_false', dest='no_repo_verify', action='store_true',
help='do not verify repo source code') help='do not verify repo source code')
# Other # Other
@ -198,8 +178,7 @@ to update the working directory files.
sys.exit(1) sys.exit(1)
if not opt.quiet: if not opt.quiet:
print('Downloading manifest from %s' % print('Get %s' % GitConfig.ForUser().UrlInsteadOf(opt.manifest_url),
(GitConfig.ForUser().UrlInsteadOf(opt.manifest_url),),
file=sys.stderr) file=sys.stderr)
# The manifest project object doesn't keep track of the path on the # The manifest project object doesn't keep track of the path on the
@ -216,27 +195,24 @@ to update the working directory files.
m._InitGitDir(mirror_git=mirrored_manifest_git) m._InitGitDir(mirror_git=mirrored_manifest_git)
if opt.manifest_branch:
m.revisionExpr = opt.manifest_branch
else:
m.revisionExpr = 'refs/heads/master'
else:
if opt.manifest_branch:
m.revisionExpr = opt.manifest_branch
else:
m.PreSync()
self._ConfigureDepth(opt) self._ConfigureDepth(opt)
# Set the remote URL before the remote branch as we might need it below.
if opt.manifest_url: if opt.manifest_url:
r = m.GetRemote(m.remote.name) r = m.GetRemote(m.remote.name)
r.url = opt.manifest_url r.url = opt.manifest_url
r.ResetFetch() r.ResetFetch()
r.Save() r.Save()
if opt.manifest_branch:
m.revisionExpr = opt.manifest_branch
else:
if is_new:
default_branch = m.ResolveRemoteHead()
if default_branch is None:
# If the remote doesn't have HEAD configured, default to master.
default_branch = 'refs/heads/master'
m.revisionExpr = default_branch
else:
m.PreSync()
groups = re.split(r'[,\s]+', opt.groups) groups = re.split(r'[,\s]+', opt.groups)
all_platforms = ['linux', 'darwin', 'windows'] all_platforms = ['linux', 'darwin', 'windows']
platformize = lambda x: 'platform-' + x platformize = lambda x: 'platform-' + x
@ -264,20 +240,6 @@ to update the working directory files.
if opt.dissociate: if opt.dissociate:
m.config.SetString('repo.dissociate', 'true') m.config.SetString('repo.dissociate', 'true')
if opt.worktree:
if opt.mirror:
print('fatal: --mirror and --worktree are incompatible',
file=sys.stderr)
sys.exit(1)
if opt.submodules:
print('fatal: --submodules and --worktree are incompatible',
file=sys.stderr)
sys.exit(1)
m.config.SetString('repo.worktree', 'true')
if is_new:
m.use_git_worktrees = True
print('warning: --worktree is experimental!', file=sys.stderr)
if opt.archive: if opt.archive:
if is_new: if is_new:
m.config.SetString('repo.archive', 'true') m.config.SetString('repo.archive', 'true')
@ -309,18 +271,13 @@ to update the working directory files.
else: else:
opt.clone_filter = None opt.clone_filter = None
if opt.clone_bundle is None:
opt.clone_bundle = False if opt.partial_clone else True
else:
m.config.SetString('repo.clonebundle', 'true' if opt.clone_bundle else 'false')
if opt.submodules: if opt.submodules:
m.config.SetString('repo.submodules', 'true') m.config.SetString('repo.submodules', 'true')
if not m.Sync_NetworkHalf(is_new=is_new, quiet=opt.quiet, verbose=opt.verbose, if not m.Sync_NetworkHalf(is_new=is_new, quiet=opt.quiet,
clone_bundle=opt.clone_bundle, clone_bundle=not opt.no_clone_bundle,
current_branch_only=opt.current_branch_only, current_branch_only=opt.current_branch_only,
tags=opt.tags, submodules=opt.submodules, no_tags=opt.no_tags, submodules=opt.submodules,
clone_filter=opt.clone_filter): clone_filter=opt.clone_filter):
r = m.GetRemote(m.remote.name) r = m.GetRemote(m.remote.name)
print('fatal: cannot obtain manifest %s' % r.url, file=sys.stderr) print('fatal: cannot obtain manifest %s' % r.url, file=sys.stderr)
@ -364,7 +321,7 @@ to update the working directory files.
return value return value
return a return a
def _ShouldConfigureUser(self, opt): def _ShouldConfigureUser(self):
gc = self.manifest.globalConfig gc = self.manifest.globalConfig
mp = self.manifest.manifestProject mp = self.manifest.manifestProject
@ -376,23 +333,20 @@ to update the working directory files.
mp.config.SetString('user.name', gc.GetString('user.name')) mp.config.SetString('user.name', gc.GetString('user.name'))
mp.config.SetString('user.email', gc.GetString('user.email')) mp.config.SetString('user.email', gc.GetString('user.email'))
if not opt.quiet:
print() print()
print('Your identity is: %s <%s>' % (mp.config.GetString('user.name'), print('Your identity is: %s <%s>' % (mp.config.GetString('user.name'),
mp.config.GetString('user.email'))) mp.config.GetString('user.email')))
print("If you want to change this, please re-run 'repo init' with --config-name") print('If you want to change this, please re-run \'repo init\' with --config-name')
return False return False
def _ConfigureUser(self, opt): def _ConfigureUser(self):
mp = self.manifest.manifestProject mp = self.manifest.manifestProject
while True: while True:
if not opt.quiet:
print() print()
name = self._Prompt('Your Name', mp.UserName) name = self._Prompt('Your Name', mp.UserName)
email = self._Prompt('Your Email', mp.UserEmail) email = self._Prompt('Your Email', mp.UserEmail)
if not opt.quiet:
print() print()
print('Your identity is: %s <%s>' % (name, email)) print('Your identity is: %s <%s>' % (name, email))
print('is this correct [y/N]? ', end='') print('is this correct [y/N]? ', end='')
@ -465,16 +419,15 @@ to update the working directory files.
# We store the depth in the main manifest project. # We store the depth in the main manifest project.
self.manifest.manifestProject.config.SetString('repo.depth', depth) self.manifest.manifestProject.config.SetString('repo.depth', depth)
def _DisplayResult(self, opt): def _DisplayResult(self):
if self.manifest.IsMirror: if self.manifest.IsMirror:
init_type = 'mirror ' init_type = 'mirror '
else: else:
init_type = '' init_type = ''
if not opt.quiet:
print() print()
print('repo %shas been initialized in %s' % print('repo %shas been initialized in %s'
(init_type, self.manifest.topdir)) % (init_type, self.manifest.topdir))
current_dir = os.getcwd() current_dir = os.getcwd()
if current_dir != self.manifest.topdir: if current_dir != self.manifest.topdir:
@ -492,48 +445,15 @@ to update the working directory files.
if opt.archive and opt.mirror: if opt.archive and opt.mirror:
self.OptionParser.error('--mirror and --archive cannot be used together.') self.OptionParser.error('--mirror and --archive cannot be used together.')
if args:
self.OptionParser.error('init takes no arguments')
def Execute(self, opt, args): def Execute(self, opt, args):
git_require(MIN_GIT_VERSION_HARD, fail=True) git_require(MIN_GIT_VERSION, fail=True)
if not git_require(MIN_GIT_VERSION_SOFT):
print('repo: warning: git-%s+ will soon be required; please upgrade your '
'version of git to maintain support.'
% ('.'.join(str(x) for x in MIN_GIT_VERSION_SOFT),),
file=sys.stderr)
opt.quiet = opt.output_mode is False
opt.verbose = opt.output_mode is True
rp = self.manifest.repoProject
# Handle new --repo-url requests.
if opt.repo_url:
remote = rp.GetRemote('origin')
remote.url = opt.repo_url
remote.Save()
# Handle new --repo-rev requests.
if opt.repo_rev:
wrapper = Wrapper()
remote_ref, rev = wrapper.check_repo_rev(
rp.gitdir, opt.repo_rev, repo_verify=opt.repo_verify, quiet=opt.quiet)
branch = rp.GetBranch('default')
branch.merge = remote_ref
rp.work_git.update_ref('refs/heads/default', rev)
branch.Save()
if opt.worktree:
# Older versions of git supported worktree, but had dangerous gc bugs.
git_require((2, 15, 0), fail=True, msg='git gc worktree corruption')
self._SyncManifest(opt) self._SyncManifest(opt)
self._LinkManifest(opt.manifest_name) self._LinkManifest(opt.manifest_name)
if os.isatty(0) and os.isatty(1) and not self.manifest.IsMirror: if os.isatty(0) and os.isatty(1) and not self.manifest.IsMirror:
if opt.config_name or self._ShouldConfigureUser(opt): if opt.config_name or self._ShouldConfigureUser():
self._ConfigureUser(opt) self._ConfigureUser()
self._ConfigureColor() self._ConfigureColor()
self._DisplayResult(opt) self._DisplayResult()
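
The most visible behavioural change in this file is the one the updated help text above describes ("the remote's default branch is used"): on a fresh init, instead of hard-coding refs/heads/master, master asks the remote for its HEAD and only falls back to master when the remote does not advertise one. The master-side logic, as shown in the hunk:

if opt.manifest_branch:
  m.revisionExpr = opt.manifest_branch
else:
  if is_new:
    default_branch = m.ResolveRemoteHead()
    if default_branch is None:
      # If the remote doesn't have HEAD configured, default to master.
      default_branch = 'refs/heads/master'
    m.revisionExpr = default_branch
  else:
    m.PreSync()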


@ -15,10 +15,10 @@
# limitations under the License. # limitations under the License.
from __future__ import print_function from __future__ import print_function
import sys
from command import Command, MirrorSafeCommand from command import Command, MirrorSafeCommand
class List(Command, MirrorSafeCommand): class List(Command, MirrorSafeCommand):
common = True common = True
helpSummary = "List projects and their associated directories" helpSummary = "List projects and their associated directories"
@ -77,7 +77,7 @@ This is similar to running: repo forall -c 'echo "$REPO_PATH : $REPO_PROJECT"'.
lines = [] lines = []
for project in projects: for project in projects:
if opt.name_only and not opt.path_only: if opt.name_only and not opt.path_only:
lines.append("%s" % (project.name)) lines.append("%s" % ( project.name))
elif opt.path_only and not opt.name_only: elif opt.path_only and not opt.name_only:
lines.append("%s" % (_getpath(project))) lines.append("%s" % (_getpath(project)))
else: else:


@ -20,26 +20,19 @@ import sys
from command import PagedCommand from command import PagedCommand
class Manifest(PagedCommand): class Manifest(PagedCommand):
common = False common = False
helpSummary = "Manifest inspection utility" helpSummary = "Manifest inspection utility"
helpUsage = """ helpUsage = """
%prog [-o {-|NAME.xml}] [-m MANIFEST.xml] [-r] %prog [-o {-|NAME.xml} [-r]]
""" """
_helpDescription = """ _helpDescription = """
With the -o option, exports the current manifest for inspection. With the -o option, exports the current manifest for inspection.
The manifest and (if present) local_manifests/ are combined The manifest and (if present) local_manifest.xml are combined
together to produce a single manifest file. This file can be stored together to produce a single manifest file. This file can be stored
in a Git repository for use during future 'repo init' invocations. in a Git repository for use during future 'repo init' invocations.
The -r option can be used to generate a manifest file with project
revisions set to the current commit hash. These are known as
"revision locked manifests", as they don't follow a particular branch.
In this case, the 'upstream' attribute is set to the ref we were on
when the manifest was generated. The 'dest-branch' attribute is set
to indicate the remote ref to push changes to via 'repo upload'.
""" """
@property @property
@ -47,27 +40,21 @@ to indicate the remote ref to push changes to via 'repo upload'.
helptext = self._helpDescription + '\n' helptext = self._helpDescription + '\n'
r = os.path.dirname(__file__) r = os.path.dirname(__file__)
r = os.path.dirname(r) r = os.path.dirname(r)
with open(os.path.join(r, 'docs', 'manifest-format.md')) as fd: fd = open(os.path.join(r, 'docs', 'manifest-format.md'))
for line in fd: for line in fd:
helptext += line helptext += line
fd.close()
return helptext return helptext
def _Options(self, p): def _Options(self, p):
p.add_option('-r', '--revision-as-HEAD', p.add_option('-r', '--revision-as-HEAD',
dest='peg_rev', action='store_true', dest='peg_rev', action='store_true',
help='Save revisions as current HEAD') help='Save revisions as current HEAD')
p.add_option('-m', '--manifest-name',
help='temporary manifest to use for this sync', metavar='NAME.xml')
p.add_option('--suppress-upstream-revision', dest='peg_rev_upstream', p.add_option('--suppress-upstream-revision', dest='peg_rev_upstream',
default=True, action='store_false', default=True, action='store_false',
help='If in -r mode, do not write the upstream field. ' help='If in -r mode, do not write the upstream field. '
'Only of use if the branch names for a sha1 manifest are ' 'Only of use if the branch names for a sha1 manifest are '
'sensitive.') 'sensitive.')
p.add_option('--suppress-dest-branch', dest='peg_rev_dest_branch',
default=True, action='store_false',
help='If in -r mode, do not write the dest-branch field. '
'Only of use if the branch names for a sha1 manifest are '
'sensitive.')
p.add_option('-o', '--output-file', p.add_option('-o', '--output-file',
dest='output_file', dest='output_file',
default='-', default='-',
@ -75,18 +62,13 @@ to indicate the remote ref to push changes to via 'repo upload'.
metavar='-|NAME.xml') metavar='-|NAME.xml')
def _Output(self, opt): def _Output(self, opt):
# If alternate manifest is specified, override the manifest file that we're using.
if opt.manifest_name:
self.manifest.Override(opt.manifest_name, False)
if opt.output_file == '-': if opt.output_file == '-':
fd = sys.stdout fd = sys.stdout
else: else:
fd = open(opt.output_file, 'w') fd = open(opt.output_file, 'w')
self.manifest.Save(fd, self.manifest.Save(fd,
peg_rev=opt.peg_rev, peg_rev = opt.peg_rev,
peg_rev_upstream=opt.peg_rev_upstream, peg_rev_upstream = opt.peg_rev_upstream)
peg_rev_dest_branch=opt.peg_rev_dest_branch)
fd.close() fd.close()
if opt.output_file != '-': if opt.output_file != '-':
print('Saved manifest to %s' % opt.output_file, file=sys.stderr) print('Saved manifest to %s' % opt.output_file, file=sys.stderr)
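
Pulling the master-side column of _Output() together: an alternate manifest can be selected with the new -m flag before exporting, and dest-branch pegging is controllable alongside the upstream field. A condensed sketch of that method as it reads after the change (the conditional open of the output file is abbreviated into one line):

def _Output(self, opt):
  # If an alternate manifest is specified, override the one we're using.
  if opt.manifest_name:
    self.manifest.Override(opt.manifest_name, False)
  fd = sys.stdout if opt.output_file == '-' else open(opt.output_file, 'w')
  self.manifest.Save(fd,
                     peg_rev=opt.peg_rev,
                     peg_rev_upstream=opt.peg_rev_upstream,
                     peg_rev_dest_branch=opt.peg_rev_dest_branch)
  fd.close()
  if opt.output_file != '-':
    print('Saved manifest to %s' % opt.output_file, file=sys.stderr)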


@ -18,7 +18,6 @@ from __future__ import print_function
from color import Coloring from color import Coloring
from command import PagedCommand from command import PagedCommand
class Prune(PagedCommand): class Prune(PagedCommand):
common = True common = True
helpSummary = "Prune (delete) already merged topics" helpSummary = "Prune (delete) already merged topics"
@ -52,16 +51,11 @@ class Prune(PagedCommand):
out.project('project %s/' % project.relpath) out.project('project %s/' % project.relpath)
out.nl() out.nl()
print('%s %-33s ' % (
branch.name == project.CurrentBranch and '*' or ' ',
branch.name), end='')
if not branch.base_exists:
print('(ignoring: tracking branch is gone: %s)' % (branch.base,))
else:
commits = branch.commits commits = branch.commits
date = branch.date date = branch.date
print('(%2d commit%s, %s)' % ( print('%s %-33s (%2d commit%s, %s)' % (
branch.name == project.CurrentBranch and '*' or ' ',
branch.name,
len(commits), len(commits),
len(commits) != 1 and 's' or ' ', len(commits) != 1 and 's' or ' ',
date)) date))


@ -53,7 +53,7 @@ branch but need to incorporate new upstream changes "underneath" them.
dest='force_rebase', action='store_true', dest='force_rebase', action='store_true',
help='Pass --force-rebase to git rebase') help='Pass --force-rebase to git rebase')
p.add_option('--no-ff', p.add_option('--no-ff',
dest='ff', default=True, action='store_false', dest='no_ff', action='store_true',
help='Pass --no-ff to git rebase') help='Pass --no-ff to git rebase')
p.add_option('-q', '--quiet', p.add_option('-q', '--quiet',
dest='quiet', action='store_true', dest='quiet', action='store_true',
@ -93,7 +93,7 @@ branch but need to incorporate new upstream changes "underneath" them.
common_args.append('--quiet') common_args.append('--quiet')
if opt.force_rebase: if opt.force_rebase:
common_args.append('--force-rebase') common_args.append('--force-rebase')
if not opt.ff: if opt.no_ff:
common_args.append('--no-ff') common_args.append('--no-ff')
if opt.autosquash: if opt.autosquash:
common_args.append('--autosquash') common_args.append('--autosquash')


@ -22,7 +22,6 @@ from command import Command, MirrorSafeCommand
from subcmds.sync import _PostRepoUpgrade from subcmds.sync import _PostRepoUpgrade
from subcmds.sync import _PostRepoFetch from subcmds.sync import _PostRepoFetch
class Selfupdate(Command, MirrorSafeCommand): class Selfupdate(Command, MirrorSafeCommand):
common = False common = False
helpSummary = "Update repo to the latest version" helpSummary = "Update repo to the latest version"
@ -40,7 +39,7 @@ need to be performed by an end-user.
def _Options(self, p): def _Options(self, p):
g = p.add_option_group('repo Version options') g = p.add_option_group('repo Version options')
g.add_option('--no-repo-verify', g.add_option('--no-repo-verify',
dest='repo_verify', default=True, action='store_false', dest='no_repo_verify', action='store_true',
help='do not verify repo source code') help='do not verify repo source code')
g.add_option('--repo-upgraded', g.add_option('--repo-upgraded',
dest='repo_upgraded', action='store_true', dest='repo_upgraded', action='store_true',
@ -60,5 +59,5 @@ need to be performed by an end-user.
rp.bare_git.gc('--auto') rp.bare_git.gc('--auto')
_PostRepoFetch(rp, _PostRepoFetch(rp,
repo_verify=opt.repo_verify, no_repo_verify = opt.no_repo_verify,
verbose=True) verbose = True)
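
This is the same option flip applied in several files above (rebase, init, sync): the --no-* switch now clears a positive, default-True destination rather than setting a separate no_* boolean, so downstream code can test opt.repo_verify directly. A tiny, self-contained optparse illustration of the pattern (the parser and arguments here are only illustrative):

import optparse

p = optparse.OptionParser()
p.add_option('--no-repo-verify',
             dest='repo_verify', default=True, action='store_false',
             help='do not verify repo source code')

opts, _ = p.parse_args([])
assert opts.repo_verify            # verification is on by default
opts, _ = p.parse_args(['--no-repo-verify'])
assert not opts.repo_verify        # the flag turns it off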


@ -16,7 +16,6 @@
from subcmds.sync import Sync from subcmds.sync import Sync
class Smartsync(Sync): class Smartsync(Sync):
common = True common = True
helpSummary = "Update working tree to the latest known good revision" helpSummary = "Update working tree to the latest known good revision"


@ -21,7 +21,6 @@ from color import Coloring
from command import InteractiveCommand from command import InteractiveCommand
from git_command import GitCommand from git_command import GitCommand
class _ProjectList(Coloring): class _ProjectList(Coloring):
def __init__(self, gc): def __init__(self, gc):
Coloring.__init__(self, gc, 'interactive') Coloring.__init__(self, gc, 'interactive')
@ -29,7 +28,6 @@ class _ProjectList(Coloring):
self.header = self.printer('header', attr='bold') self.header = self.printer('header', attr='bold')
self.help = self.printer('help', fg='red', attr='bold') self.help = self.printer('help', fg='red', attr='bold')
class Stage(InteractiveCommand): class Stage(InteractiveCommand):
common = True common = True
helpSummary = "Stage file(s) for commit" helpSummary = "Stage file(s) for commit"
@ -107,7 +105,6 @@ The '%prog' command stages files to prepare the next commit.
continue continue
print('Bye.') print('Bye.')
def _AddI(project): def _AddI(project):
p = GitCommand(project, ['add', '--interactive'], bare=False) p = GitCommand(project, ['add', '--interactive'], bare=False)
p.Wait() p.Wait()


@ -25,7 +25,6 @@ import gitc_utils
from progress import Progress from progress import Progress
from project import SyncBuffer from project import SyncBuffer
class Start(Command): class Start(Command):
common = True common = True
helpSummary = "Start a new branch for development" helpSummary = "Start a new branch for development"
@ -61,7 +60,7 @@ revision specified in the manifest.
if not opt.all: if not opt.all:
projects = args[1:] projects = args[1:]
if len(projects) < 1: if len(projects) < 1:
projects = ['.'] # start it in the local project by default projects = ['.',] # start it in the local project by default
all_projects = self.GetProjects(projects, all_projects = self.GetProjects(projects,
missing_ok=bool(self.gitc_manifest)) missing_ok=bool(self.gitc_manifest))


@ -16,17 +16,21 @@
from __future__ import print_function from __future__ import print_function
import functools
import glob
import multiprocessing
import os
from command import PagedCommand from command import PagedCommand
try:
import threading as _threading
except ImportError:
import dummy_threading as _threading
import glob
import itertools
import os
from color import Coloring from color import Coloring
import platform_utils import platform_utils
class Status(PagedCommand): class Status(PagedCommand):
common = True common = True
helpSummary = "Show the working tree status" helpSummary = "Show the working tree status"
@ -91,20 +95,25 @@ the following meanings:
p.add_option('-q', '--quiet', action='store_true', p.add_option('-q', '--quiet', action='store_true',
help="only print the name of modified projects") help="only print the name of modified projects")
def _StatusHelper(self, quiet, project): def _StatusHelper(self, project, clean_counter, sem, quiet):
"""Obtains the status for a specific project. """Obtains the status for a specific project.
Obtains the status for a project, redirecting the output to Obtains the status for a project, redirecting the output to
the specified object. the specified object. It will release the semaphore
when done.
Args: Args:
quiet: Where to output the status.
project: Project to get status of. project: Project to get status of.
clean_counter: Counter for clean projects.
Returns: sem: Semaphore, will call release() when complete.
The status of the project. output: Where to output the status.
""" """
return project.PrintWorkTreeStatus(quiet=quiet) try:
state = project.PrintWorkTreeStatus(quiet=quiet)
if state == 'CLEAN':
next(clean_counter)
finally:
sem.release()
def _FindOrphans(self, dirs, proj_dirs, proj_dirs_parents, outstring): def _FindOrphans(self, dirs, proj_dirs, proj_dirs_parents, outstring):
"""find 'dirs' that are present in 'proj_dirs_parents' but not in 'proj_dirs'""" """find 'dirs' that are present in 'proj_dirs_parents' but not in 'proj_dirs'"""
@ -124,18 +133,27 @@ the following meanings:
def Execute(self, opt, args): def Execute(self, opt, args):
all_projects = self.GetProjects(args) all_projects = self.GetProjects(args)
counter = 0 counter = itertools.count()
if opt.jobs == 1: if opt.jobs == 1:
for project in all_projects: for project in all_projects:
state = project.PrintWorkTreeStatus(quiet=opt.quiet) state = project.PrintWorkTreeStatus(quiet=opt.quiet)
if state == 'CLEAN': if state == 'CLEAN':
counter += 1 next(counter)
else: else:
with multiprocessing.Pool(opt.jobs) as pool: sem = _threading.Semaphore(opt.jobs)
states = pool.map(functools.partial(self._StatusHelper, opt.quiet), all_projects) threads = []
counter += states.count('CLEAN') for project in all_projects:
if not opt.quiet and len(all_projects) == counter: sem.acquire()
t = _threading.Thread(target=self._StatusHelper,
args=(project, counter, sem, opt.quiet))
threads.append(t)
t.daemon = True
t.start()
for t in threads:
t.join()
if not opt.quiet and len(all_projects) == next(counter):
print('nothing to commit (working directory clean)') print('nothing to commit (working directory clean)')
if opt.orphans: if opt.orphans:
@ -152,8 +170,8 @@ the following meanings:
class StatusColoring(Coloring): class StatusColoring(Coloring):
def __init__(self, config): def __init__(self, config):
Coloring.__init__(self, config, 'status') Coloring.__init__(self, config, 'status')
self.project = self.printer('header', attr='bold') self.project = self.printer('header', attr = 'bold')
self.untracked = self.printer('untracked', fg='red') self.untracked = self.printer('untracked', fg = 'red')
orig_path = os.getcwd() orig_path = os.getcwd()
try: try:
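
Taken together, the master-side column of this file drops the semaphore and thread bookkeeping: _StatusHelper simply returns the project's state and Execute() fans out with a multiprocessing.Pool, counting the 'CLEAN' results. A compact sketch of that shape; the two inner calls come straight from the hunk, the surrounding glue (and the assumption that project objects are picklable) is mine:

import functools
import multiprocessing


def _status_one(quiet, project):
  # Mirrors the slimmed-down _StatusHelper: just return the project's state.
  return project.PrintWorkTreeStatus(quiet=quiet)


def count_clean(all_projects, jobs, quiet):
  # Pool.map + functools.partial, as in the master-side Execute() hunk.
  with multiprocessing.Pool(jobs) as pool:
    states = pool.map(functools.partial(_status_one, quiet), all_projects)
  return states.count('CLEAN')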


@ -15,7 +15,6 @@
# limitations under the License. # limitations under the License.
from __future__ import print_function from __future__ import print_function
import json import json
import netrc import netrc
from optparse import SUPPRESS_HELP from optparse import SUPPRESS_HELP
@ -54,7 +53,6 @@ except ImportError:
try: try:
import resource import resource
def _rlimit_nofile(): def _rlimit_nofile():
return resource.getrlimit(resource.RLIMIT_NOFILE) return resource.getrlimit(resource.RLIMIT_NOFILE)
except ImportError: except ImportError:
@ -83,16 +81,13 @@ from manifest_xml import GitcManifest
_ONE_DAY_S = 24 * 60 * 60 _ONE_DAY_S = 24 * 60 * 60
class _FetchError(Exception): class _FetchError(Exception):
"""Internal error thrown in _FetchHelper() when we don't want stack trace.""" """Internal error thrown in _FetchHelper() when we don't want stack trace."""
pass pass
class _CheckoutError(Exception): class _CheckoutError(Exception):
"""Internal error thrown in _CheckoutOne() when we don't want stack trace.""" """Internal error thrown in _CheckoutOne() when we don't want stack trace."""
class Sync(Command, MirrorSafeCommand): class Sync(Command, MirrorSafeCommand):
jobs = 1 jobs = 1
common = True common = True
@ -138,11 +133,11 @@ if the manifest server specified in the manifest file already includes
credentials. credentials.
By default, all projects will be synced. The --fail-fast option can be used By default, all projects will be synced. The --fail-fast option can be used
to halt syncing as soon as possible when the first project fails to sync. to halt syncing as soon as possible when the the first project fails to sync.
The --force-sync option can be used to overwrite existing git The --force-sync option can be used to overwrite existing git
directories if they have previously been linked to a different directories if they have previously been linked to a different
object directory. WARNING: This may cause data to be lost since object direcotry. WARNING: This may cause data to be lost since
refs may be removed when overwriting. refs may be removed when overwriting.
The --force-remove-dirty option can be used to remove previously used The --force-remove-dirty option can be used to remove previously used
@ -222,10 +217,6 @@ later is required to fix a server side protocol bug.
p.add_option('-l', '--local-only', p.add_option('-l', '--local-only',
dest='local_only', action='store_true', dest='local_only', action='store_true',
help="only update working tree, don't fetch") help="only update working tree, don't fetch")
p.add_option('--no-manifest-update', '--nmu',
dest='mp_update', action='store_false', default='true',
help='use the existing manifest checkout as-is. '
'(do not update to the latest revision)')
p.add_option('-n', '--network-only', p.add_option('-n', '--network-only',
dest='network_only', action='store_true', dest='network_only', action='store_true',
help="fetch only, don't update working tree") help="fetch only, don't update working tree")
@ -235,21 +226,17 @@ later is required to fix a server side protocol bug.
p.add_option('-c', '--current-branch', p.add_option('-c', '--current-branch',
dest='current_branch_only', action='store_true', dest='current_branch_only', action='store_true',
help='fetch only current branch from server') help='fetch only current branch from server')
p.add_option('-v', '--verbose',
dest='output_mode', action='store_true',
help='show all sync output')
p.add_option('-q', '--quiet', p.add_option('-q', '--quiet',
dest='output_mode', action='store_false', dest='quiet', action='store_true',
help='only show errors') help='be more quiet')
p.add_option('-j', '--jobs', p.add_option('-j', '--jobs',
dest='jobs', action='store', type='int', dest='jobs', action='store', type='int',
help="projects to fetch simultaneously (default %d)" % self.jobs) help="projects to fetch simultaneously (default %d)" % self.jobs)
p.add_option('-m', '--manifest-name', p.add_option('-m', '--manifest-name',
dest='manifest_name', dest='manifest_name',
help='temporary manifest to use for this sync', metavar='NAME.xml') help='temporary manifest to use for this sync', metavar='NAME.xml')
p.add_option('--clone-bundle', action='store_true', p.add_option('--no-clone-bundle',
help='enable use of /clone.bundle on HTTP/HTTPS') dest='no_clone_bundle', action='store_true',
p.add_option('--no-clone-bundle', dest='clone_bundle', action='store_false',
help='disable use of /clone.bundle on HTTP/HTTPS') help='disable use of /clone.bundle on HTTP/HTTPS')
p.add_option('-u', '--manifest-server-username', action='store', p.add_option('-u', '--manifest-server-username', action='store',
dest='manifest_server_username', dest='manifest_server_username',
@ -261,14 +248,11 @@ later is required to fix a server side protocol bug.
dest='fetch_submodules', action='store_true', dest='fetch_submodules', action='store_true',
help='fetch submodules from server') help='fetch submodules from server')
p.add_option('--no-tags', p.add_option('--no-tags',
dest='tags', default=True, action='store_false', dest='no_tags', action='store_true',
help="don't fetch tags") help="don't fetch tags")
p.add_option('--optimized-fetch', p.add_option('--optimized-fetch',
dest='optimized_fetch', action='store_true', dest='optimized_fetch', action='store_true',
help='only fetch projects fixed to sha1 if revision does not exist locally') help='only fetch projects fixed to sha1 if revision does not exist locally')
p.add_option('--retry-fetches',
default=0, action='store', type='int',
help='number of times to retry fetches on transient errors')
p.add_option('--prune', dest='prune', action='store_true', p.add_option('--prune', dest='prune', action='store_true',
help='delete refs that no longer exist on the remote') help='delete refs that no longer exist on the remote')
if show_smart: if show_smart:
@ -281,7 +265,7 @@ later is required to fix a server side protocol bug.
g = p.add_option_group('repo Version options') g = p.add_option_group('repo Version options')
g.add_option('--no-repo-verify', g.add_option('--no-repo-verify',
dest='repo_verify', default=True, action='store_false', dest='no_repo_verify', action='store_true',
help='do not verify repo source code') help='do not verify repo source code')
g.add_option('--repo-upgraded', g.add_option('--repo-upgraded',
dest='repo_upgraded', action='store_true', dest='repo_upgraded', action='store_true',
@ -331,6 +315,9 @@ later is required to fix a server side protocol bug.
# We'll set to true once we've locked the lock. # We'll set to true once we've locked the lock.
did_lock = False did_lock = False
if not opt.quiet:
print('Fetching project %s' % project.name)
# Encapsulate everything in a try/except/finally so that: # Encapsulate everything in a try/except/finally so that:
# - We always set err_event in the case of an exception. # - We always set err_event in the case of an exception.
# - We always make sure we unlock the lock if we locked it. # - We always make sure we unlock the lock if we locked it.
@ -340,13 +327,11 @@ later is required to fix a server side protocol bug.
try: try:
success = project.Sync_NetworkHalf( success = project.Sync_NetworkHalf(
quiet=opt.quiet, quiet=opt.quiet,
verbose=opt.verbose,
current_branch_only=opt.current_branch_only, current_branch_only=opt.current_branch_only,
force_sync=opt.force_sync, force_sync=opt.force_sync,
clone_bundle=opt.clone_bundle, clone_bundle=not opt.no_clone_bundle,
tags=opt.tags, archive=self.manifest.IsArchive, no_tags=opt.no_tags, archive=self.manifest.IsArchive,
optimized_fetch=opt.optimized_fetch, optimized_fetch=opt.optimized_fetch,
retry_fetches=opt.retry_fetches,
prune=opt.prune, prune=opt.prune,
clone_filter=clone_filter) clone_filter=clone_filter)
self._fetch_times.Set(project, time.time() - start) self._fetch_times.Set(project, time.time() - start)
@ -365,11 +350,11 @@ later is required to fix a server side protocol bug.
raise _FetchError() raise _FetchError()
fetched.add(project.gitdir) fetched.add(project.gitdir)
pm.update(msg=project.name) pm.update()
except _FetchError: except _FetchError:
pass pass
except Exception as e: except Exception as e:
print('error: Cannot fetch %s (%s: %s)' print('error: Cannot fetch %s (%s: %s)' \
% (project.name, type(e).__name__, str(e)), file=sys.stderr) % (project.name, type(e).__name__, str(e)), file=sys.stderr)
err_event.set() err_event.set()
raise raise
@ -382,10 +367,11 @@ later is required to fix a server side protocol bug.
return success return success
def _Fetch(self, projects, opt, err_event): def _Fetch(self, projects, opt):
fetched = set() fetched = set()
lock = _threading.Lock() lock = _threading.Lock()
pm = Progress('Fetching projects', len(projects), pm = Progress('Fetching projects', len(projects),
print_newline=not(opt.quiet),
always_print_percentage=opt.quiet) always_print_percentage=opt.quiet)
objdir_project_map = dict() objdir_project_map = dict()
@ -394,6 +380,7 @@ later is required to fix a server side protocol bug.
threads = set() threads = set()
sem = _threading.Semaphore(self.jobs) sem = _threading.Semaphore(self.jobs)
err_event = _threading.Event()
for project_list in objdir_project_map.values(): for project_list in objdir_project_map.values():
# Check for any errors before running any more tasks. # Check for any errors before running any more tasks.
# ...we'll let existing threads finish, though. # ...we'll let existing threads finish, though.
@ -410,8 +397,8 @@ later is required to fix a server side protocol bug.
err_event=err_event, err_event=err_event,
clone_filter=self.manifest.CloneFilter) clone_filter=self.manifest.CloneFilter)
if self.jobs > 1: if self.jobs > 1:
t = _threading.Thread(target=self._FetchProjectList, t = _threading.Thread(target = self._FetchProjectList,
kwargs=kwargs) kwargs = kwargs)
# Ensure that Ctrl-C will not freeze the repo process. # Ensure that Ctrl-C will not freeze the repo process.
t.daemon = True t.daemon = True
threads.add(t) threads.add(t)
@ -422,11 +409,16 @@ later is required to fix a server side protocol bug.
for t in threads: for t in threads:
t.join() t.join()
# If we saw an error, exit with code 1 so that other scripts can check.
if err_event.isSet() and opt.fail_fast:
print('\nerror: Exited sync due to fetch errors', file=sys.stderr)
sys.exit(1)
pm.end() pm.end()
self._fetch_times.Save() self._fetch_times.Save()
if not self.manifest.IsArchive: if not self.manifest.IsArchive:
self._GCProjects(projects, opt, err_event) self._GCProjects(projects)
return fetched return fetched
@ -448,7 +440,7 @@ later is required to fix a server side protocol bug.
finally: finally:
sem.release() sem.release()
def _CheckoutOne(self, opt, project, lock, pm, err_event, err_results): def _CheckoutOne(self, opt, project, lock, pm, err_event):
"""Checkout work tree for one project """Checkout work tree for one project
Args: Args:
@ -460,8 +452,6 @@ later is required to fix a server side protocol bug.
lock held). lock held).
err_event: We'll set this event in the case of an error (after printing err_event: We'll set this event in the case of an error (after printing
out info about the error). out info about the error).
err_results: A list of strings, paths to git repos where checkout
failed.
Returns: Returns:
Whether the fetch was successful. Whether the fetch was successful.
@ -469,6 +459,9 @@ later is required to fix a server side protocol bug.
# We'll set to true once we've locked the lock. # We'll set to true once we've locked the lock.
did_lock = False did_lock = False
if not opt.quiet:
print('Checking out project %s' % project.name)
# Encapsulate everything in a try/except/finally so that: # Encapsulate everything in a try/except/finally so that:
# - We always set err_event in the case of an exception. # - We always set err_event in the case of an exception.
# - We always make sure we unlock the lock if we locked it. # - We always make sure we unlock the lock if we locked it.
@ -479,11 +472,11 @@ later is required to fix a server side protocol bug.
try: try:
try: try:
project.Sync_LocalHalf(syncbuf, force_sync=opt.force_sync) project.Sync_LocalHalf(syncbuf, force_sync=opt.force_sync)
success = syncbuf.Finish()
# Lock around all the rest of the code, since printing, updating a set # Lock around all the rest of the code, since printing, updating a set
# and Progress.update() are not thread safe. # and Progress.update() are not thread safe.
lock.acquire() lock.acquire()
success = syncbuf.Finish()
did_lock = True did_lock = True
if not success: if not success:
@ -492,7 +485,7 @@ later is required to fix a server side protocol bug.
file=sys.stderr) file=sys.stderr)
raise _CheckoutError() raise _CheckoutError()
pm.update(msg=project.name) pm.update()
except _CheckoutError: except _CheckoutError:
pass pass
except Exception as e: except Exception as e:
@ -503,8 +496,6 @@ later is required to fix a server side protocol bug.
raise raise
finally: finally:
if did_lock: if did_lock:
if not success:
err_results.append(project.relpath)
lock.release() lock.release()
finish = time.time() finish = time.time()
self.event_log.AddSync(project, event_log.TASK_SYNC_LOCAL, self.event_log.AddSync(project, event_log.TASK_SYNC_LOCAL,
@ -512,16 +503,12 @@ later is required to fix a server side protocol bug.
return success return success
def _Checkout(self, all_projects, opt, err_event, err_results): def _Checkout(self, all_projects, opt):
"""Checkout projects listed in all_projects """Checkout projects listed in all_projects
Args: Args:
all_projects: List of all projects that should be checked out. all_projects: List of all projects that should be checked out.
opt: Program options returned from optparse. See _Options(). opt: Program options returned from optparse. See _Options().
err_event: We'll set this event in the case of an error (after printing
out info about the error).
err_results: A list of strings, paths to git repos where checkout
failed.
""" """
# Perform checkouts in multiple threads when we are using partial clone. # Perform checkouts in multiple threads when we are using partial clone.
@ -536,10 +523,11 @@ later is required to fix a server side protocol bug.
syncjobs = 1 syncjobs = 1
lock = _threading.Lock() lock = _threading.Lock()
pm = Progress('Checking out projects', len(all_projects)) pm = Progress('Syncing work tree', len(all_projects))
threads = set() threads = set()
sem = _threading.Semaphore(syncjobs) sem = _threading.Semaphore(syncjobs)
err_event = _threading.Event()
for project in all_projects: for project in all_projects:
# Check for any errors before running any more tasks. # Check for any errors before running any more tasks.
@ -554,8 +542,7 @@ later is required to fix a server side protocol bug.
project=project, project=project,
lock=lock, lock=lock,
pm=pm, pm=pm,
err_event=err_event, err_event=err_event)
err_results=err_results)
if syncjobs > 1: if syncjobs > 1:
t = _threading.Thread(target=self._CheckoutWorker, t = _threading.Thread(target=self._CheckoutWorker,
kwargs=kwargs) kwargs=kwargs)
@ -570,27 +557,21 @@ later is required to fix a server side protocol bug.
t.join() t.join()
pm.end() pm.end()
# If we saw an error, exit with code 1 so that other scripts can check.
if err_event.isSet():
print('\nerror: Exited sync due to checkout errors', file=sys.stderr)
sys.exit(1)
def _GCProjects(self, projects, opt, err_event): def _GCProjects(self, projects):
gc_gitdirs = {} gc_gitdirs = {}
for project in projects: for project in projects:
# Make sure pruning never kicks in with shared projects. if len(project.manifest.GetProjectsWithName(project.name)) > 1:
if (not project.use_git_worktrees and print('Shared project %s found, disabling pruning.' % project.name)
len(project.manifest.GetProjectsWithName(project.name)) > 1): project.bare_git.config('--replace-all', 'gc.pruneExpire', 'never')
print('%s: Shared project %s found, disabling pruning.' %
(project.relpath, project.name))
if git_require((2, 7, 0)):
project.EnableRepositoryExtension('preciousObjects')
else:
# This isn't perfect, but it's the best we can do with old git.
print('%s: WARNING: shared projects are unreliable when using old '
'versions of git; please upgrade to git-2.7.0+.'
% (project.relpath,),
file=sys.stderr)
project.config.SetString('gc.pruneExpire', 'never')
gc_gitdirs[project.gitdir] = project.bare_git gc_gitdirs[project.gitdir] = project.bare_git
if multiprocessing: has_dash_c = git_require((1, 7, 2))
if multiprocessing and has_dash_c:
cpu_count = multiprocessing.cpu_count() cpu_count = multiprocessing.cpu_count()
else: else:
cpu_count = 1 cpu_count = 1
@ -605,6 +586,7 @@ later is required to fix a server side protocol bug.
threads = set() threads = set()
sem = _threading.Semaphore(jobs) sem = _threading.Semaphore(jobs)
err_event = _threading.Event()
def GC(bare_git): def GC(bare_git):
try: try:
@ -612,14 +594,14 @@ later is required to fix a server side protocol bug.
bare_git.gc('--auto', config=config) bare_git.gc('--auto', config=config)
except GitError: except GitError:
err_event.set() err_event.set()
except Exception: except:
err_event.set() err_event.set()
raise raise
finally: finally:
sem.release() sem.release()
for bare_git in gc_gitdirs.values(): for bare_git in gc_gitdirs.values():
if err_event.isSet() and opt.fail_fast: if err_event.isSet():
break break
sem.acquire() sem.acquire()
t = _threading.Thread(target=GC, args=(bare_git,)) t = _threading.Thread(target=GC, args=(bare_git,))
@ -630,6 +612,10 @@ later is required to fix a server side protocol bug.
for t in threads: for t in threads:
t.join() t.join()
if err_event.isSet():
print('\nerror: Exited sync due to gc errors', file=sys.stderr)
sys.exit(1)
def _ReloadManifest(self, manifest_name=None): def _ReloadManifest(self, manifest_name=None):
if manifest_name: if manifest_name:
# Override calls _Unload already # Override calls _Unload already
@ -637,6 +623,65 @@ later is required to fix a server side protocol bug.
else: else:
self.manifest._Unload() self.manifest._Unload()
def _DeleteProject(self, path):
print('Deleting obsolete path %s' % path, file=sys.stderr)
# Delete the .git directory first, so we're less likely to have a partially
# working git repository around. There shouldn't be any git projects here,
# so rmtree works.
try:
platform_utils.rmtree(os.path.join(path, '.git'))
except OSError as e:
print('Failed to remove %s (%s)' % (os.path.join(path, '.git'), str(e)), file=sys.stderr)
print('error: Failed to delete obsolete path %s' % path, file=sys.stderr)
print(' remove manually, then run sync again', file=sys.stderr)
return 1
# Delete everything under the worktree, except for directories that contain
# another git project
dirs_to_remove = []
failed = False
for root, dirs, files in platform_utils.walk(path):
for f in files:
try:
platform_utils.remove(os.path.join(root, f))
except OSError as e:
print('Failed to remove %s (%s)' % (os.path.join(root, f), str(e)), file=sys.stderr)
failed = True
dirs[:] = [d for d in dirs
if not os.path.lexists(os.path.join(root, d, '.git'))]
dirs_to_remove += [os.path.join(root, d) for d in dirs
if os.path.join(root, d) not in dirs_to_remove]
for d in reversed(dirs_to_remove):
if platform_utils.islink(d):
try:
platform_utils.remove(d)
except OSError as e:
print('Failed to remove %s (%s)' % (os.path.join(root, d), str(e)), file=sys.stderr)
failed = True
elif len(platform_utils.listdir(d)) == 0:
try:
platform_utils.rmdir(d)
except OSError as e:
print('Failed to remove %s (%s)' % (os.path.join(root, d), str(e)), file=sys.stderr)
failed = True
continue
if failed:
print('error: Failed to delete obsolete path %s' % path, file=sys.stderr)
print(' remove manually, then run sync again', file=sys.stderr)
return 1
# Try deleting parent dirs if they are empty
project_dir = path
while project_dir != self.manifest.topdir:
if len(platform_utils.listdir(project_dir)) == 0:
platform_utils.rmdir(project_dir)
else:
break
project_dir = os.path.dirname(project_dir)
return 0
def UpdateProjectList(self, opt): def UpdateProjectList(self, opt):
new_project_paths = [] new_project_paths = []
for project in self.GetProjects(None, missing_ok=True): for project in self.GetProjects(None, missing_ok=True):
@ -647,8 +692,11 @@ later is required to fix a server side protocol bug.
old_project_paths = [] old_project_paths = []
if os.path.exists(file_path): if os.path.exists(file_path):
with open(file_path, 'r') as fd: fd = open(file_path, 'r')
try:
old_project_paths = fd.read().split('\n') old_project_paths = fd.read().split('\n')
finally:
fd.close()
# In reversed order, so subfolders are deleted before parent folder. # In reversed order, so subfolders are deleted before parent folder.
for path in sorted(old_project_paths, reverse=True): for path in sorted(old_project_paths, reverse=True):
if not path: if not path:
@ -658,26 +706,37 @@ later is required to fix a server side protocol bug.
gitdir = os.path.join(self.manifest.topdir, path, '.git') gitdir = os.path.join(self.manifest.topdir, path, '.git')
if os.path.exists(gitdir): if os.path.exists(gitdir):
project = Project( project = Project(
manifest=self.manifest, manifest = self.manifest,
name=path, name = path,
remote=RemoteSpec('origin'), remote = RemoteSpec('origin'),
gitdir=gitdir, gitdir = gitdir,
objdir=gitdir, objdir = gitdir,
use_git_worktrees=os.path.isfile(gitdir), worktree = os.path.join(self.manifest.topdir, path),
worktree=os.path.join(self.manifest.topdir, path), relpath = path,
relpath=path, revisionExpr = 'HEAD',
revisionExpr='HEAD', revisionId = None,
revisionId=None, groups = None)
groups=None)
if not project.DeleteWorktree( if project.IsDirty() and opt.force_remove_dirty:
quiet=opt.quiet, print('WARNING: Removing dirty project "%s": uncommitted changes '
force=opt.force_remove_dirty): 'erased' % project.relpath, file=sys.stderr)
self._DeleteProject(project.worktree)
elif project.IsDirty():
print('error: Cannot remove project "%s": uncommitted changes '
'are present' % project.relpath, file=sys.stderr)
print(' commit changes, then run sync again',
file=sys.stderr)
return 1
elif self._DeleteProject(project.worktree):
return 1 return 1
new_project_paths.sort() new_project_paths.sort()
with open(file_path, 'w') as fd: fd = open(file_path, 'w')
try:
fd.write('\n'.join(new_project_paths)) fd.write('\n'.join(new_project_paths))
fd.write('\n') fd.write('\n')
finally:
fd.close()
return 0 return 0
def _SmartSyncSetup(self, opt, smart_sync_manifest_path): def _SmartSyncSetup(self, opt, smart_sync_manifest_path):
@ -690,7 +749,7 @@ later is required to fix a server side protocol bug.
if not opt.quiet: if not opt.quiet:
print('Using manifest server %s' % manifest_server) print('Using manifest server %s' % manifest_server)
if '@' not in manifest_server: if not '@' in manifest_server:
username = None username = None
password = None password = None
if opt.manifest_server_username and opt.manifest_server_password: if opt.manifest_server_username and opt.manifest_server_password:
@ -733,13 +792,13 @@ later is required to fix a server side protocol bug.
if branch.startswith(R_HEADS): if branch.startswith(R_HEADS):
branch = branch[len(R_HEADS):] branch = branch[len(R_HEADS):]
if 'SYNC_TARGET' in os.environ: env = os.environ.copy()
target = os.environ['SYNC_TARGET'] if 'SYNC_TARGET' in env:
target = env['SYNC_TARGET']
[success, manifest_str] = server.GetApprovedManifest(branch, target) [success, manifest_str] = server.GetApprovedManifest(branch, target)
elif ('TARGET_PRODUCT' in os.environ and elif 'TARGET_PRODUCT' in env and 'TARGET_BUILD_VARIANT' in env:
'TARGET_BUILD_VARIANT' in os.environ): target = '%s-%s' % (env['TARGET_PRODUCT'],
target = '%s-%s' % (os.environ['TARGET_PRODUCT'], env['TARGET_BUILD_VARIANT'])
os.environ['TARGET_BUILD_VARIANT'])
[success, manifest_str] = server.GetApprovedManifest(branch, target) [success, manifest_str] = server.GetApprovedManifest(branch, target)
else: else:
[success, manifest_str] = server.GetApprovedManifest(branch) [success, manifest_str] = server.GetApprovedManifest(branch)
@ -750,8 +809,11 @@ later is required to fix a server side protocol bug.
if success: if success:
manifest_name = os.path.basename(smart_sync_manifest_path) manifest_name = os.path.basename(smart_sync_manifest_path)
try: try:
with open(smart_sync_manifest_path, 'w') as f: f = open(smart_sync_manifest_path, 'w')
try:
f.write(manifest_str) f.write(manifest_str)
finally:
f.close()
except IOError as e: except IOError as e:
print('error: cannot write manifest to %s:\n%s' print('error: cannot write manifest to %s:\n%s'
% (smart_sync_manifest_path, e), % (smart_sync_manifest_path, e),
@ -778,11 +840,10 @@ later is required to fix a server side protocol bug.
"""Fetch & update the local manifest project.""" """Fetch & update the local manifest project."""
if not opt.local_only: if not opt.local_only:
start = time.time() start = time.time()
success = mp.Sync_NetworkHalf(quiet=opt.quiet, verbose=opt.verbose, success = mp.Sync_NetworkHalf(quiet=opt.quiet,
current_branch_only=opt.current_branch_only, current_branch_only=opt.current_branch_only,
tags=opt.tags, no_tags=opt.no_tags,
optimized_fetch=opt.optimized_fetch, optimized_fetch=opt.optimized_fetch,
retry_fetches=opt.retry_fetches,
submodules=self.manifest.HasSubmodules, submodules=self.manifest.HasSubmodules,
clone_filter=self.manifest.CloneFilter) clone_filter=self.manifest.CloneFilter)
finish = time.time() finish = time.time()
@ -827,9 +888,6 @@ later is required to fix a server side protocol bug.
soft_limit, _ = _rlimit_nofile() soft_limit, _ = _rlimit_nofile()
self.jobs = min(self.jobs, (soft_limit - 5) // 3) self.jobs = min(self.jobs, (soft_limit - 5) // 3)
opt.quiet = opt.output_mode is False
opt.verbose = opt.output_mode is True
if opt.manifest_name: if opt.manifest_name:
self.manifest.Override(opt.manifest_name) self.manifest.Override(opt.manifest_name)
@ -837,9 +895,6 @@ later is required to fix a server side protocol bug.
smart_sync_manifest_path = os.path.join( smart_sync_manifest_path = os.path.join(
self.manifest.manifestProject.worktree, 'smart_sync_override.xml') self.manifest.manifestProject.worktree, 'smart_sync_override.xml')
if opt.clone_bundle is None:
opt.clone_bundle = self.manifest.CloneBundle
if opt.smart_sync or opt.smart_tag: if opt.smart_sync or opt.smart_tag:
manifest_name = self._SmartSyncSetup(opt, smart_sync_manifest_path) manifest_name = self._SmartSyncSetup(opt, smart_sync_manifest_path)
else: else:
@ -850,17 +905,8 @@ later is required to fix a server side protocol bug.
print('error: failed to remove existing smart sync override manifest: %s' % print('error: failed to remove existing smart sync override manifest: %s' %
e, file=sys.stderr) e, file=sys.stderr)
err_event = _threading.Event()
rp = self.manifest.repoProject rp = self.manifest.repoProject
rp.PreSync() rp.PreSync()
cb = rp.CurrentBranch
if cb:
base = rp.GetBranch(cb).merge
if not base or not base.startswith('refs/heads/'):
print('warning: repo is not tracking a remote branch, so it will not '
'receive updates; run `repo init --repo-rev=stable` to fix.',
file=sys.stderr)
mp = self.manifest.manifestProject mp = self.manifest.manifestProject
mp.PreSync() mp.PreSync()
@ -868,9 +914,6 @@ later is required to fix a server side protocol bug.
if opt.repo_upgraded: if opt.repo_upgraded:
_PostRepoUpgrade(self.manifest, quiet=opt.quiet) _PostRepoUpgrade(self.manifest, quiet=opt.quiet)
if not opt.mp_update:
print('Skipping update of local manifest project.')
else:
self._UpdateManifestProject(opt, mp, manifest_name) self._UpdateManifestProject(opt, mp, manifest_name)
if self.gitc_manifest: if self.gitc_manifest:
@ -912,10 +955,6 @@ later is required to fix a server side protocol bug.
missing_ok=True, missing_ok=True,
submodules_ok=opt.fetch_submodules) submodules_ok=opt.fetch_submodules)
err_network_sync = False
err_update_projects = False
err_checkout = False
self._fetch_times = _FetchTimes(self.manifest) self._fetch_times = _FetchTimes(self.manifest)
if not opt.local_only: if not opt.local_only:
to_fetch = [] to_fetch = []
@ -925,14 +964,10 @@ later is required to fix a server side protocol bug.
to_fetch.extend(all_projects) to_fetch.extend(all_projects)
to_fetch.sort(key=self._fetch_times.Get, reverse=True) to_fetch.sort(key=self._fetch_times.Get, reverse=True)
fetched = self._Fetch(to_fetch, opt, err_event) fetched = self._Fetch(to_fetch, opt)
_PostRepoFetch(rp, opt.no_repo_verify)
_PostRepoFetch(rp, opt.repo_verify)
if opt.network_only: if opt.network_only:
# bail out now; the rest touches the working tree # bail out now; the rest touches the working tree
if err_event.isSet():
print('\nerror: Exited sync due to fetch errors.\n', file=sys.stderr)
sys.exit(1)
return return
# Iteratively fetch missing and/or nested unregistered submodules # Iteratively fetch missing and/or nested unregistered submodules
@ -954,60 +989,22 @@ later is required to fix a server side protocol bug.
if previously_missing_set == missing_set: if previously_missing_set == missing_set:
break break
previously_missing_set = missing_set previously_missing_set = missing_set
fetched.update(self._Fetch(missing, opt, err_event)) fetched.update(self._Fetch(missing, opt))
# If we saw an error, exit with code 1 so that other scripts can check.
if err_event.isSet():
err_network_sync = True
if opt.fail_fast:
print('\nerror: Exited sync due to fetch errors.\n'
'Local checkouts *not* updated. Resolve network issues & '
'retry.\n'
'`repo sync -l` will update some local checkouts.',
file=sys.stderr)
sys.exit(1)
if self.manifest.IsMirror or self.manifest.IsArchive: if self.manifest.IsMirror or self.manifest.IsArchive:
# bail out now, we have no working tree # bail out now, we have no working tree
return return
if self.UpdateProjectList(opt): if self.UpdateProjectList(opt):
err_event.set()
err_update_projects = True
if opt.fail_fast:
print('\nerror: Local checkouts *not* updated.', file=sys.stderr)
sys.exit(1) sys.exit(1)
err_results = [] self._Checkout(all_projects, opt)
self._Checkout(all_projects, opt, err_event, err_results)
if err_event.isSet():
err_checkout = True
# NB: We don't exit here because this is the last step.
# If there's a notice that's supposed to print at the end of the sync, print # If there's a notice that's supposed to print at the end of the sync, print
# it now... # it now...
if self.manifest.notice: if self.manifest.notice:
print(self.manifest.notice) print(self.manifest.notice)
# If we saw an error, exit with code 1 so that other scripts can check.
if err_event.isSet():
print('\nerror: Unable to fully sync the tree.', file=sys.stderr)
if err_network_sync:
print('error: Downloading network changes failed.', file=sys.stderr)
if err_update_projects:
print('error: Updating local project lists failed.', file=sys.stderr)
if err_checkout:
print('error: Checking out local projects failed.', file=sys.stderr)
if err_results:
print('Failing repos:\n%s' % '\n'.join(err_results), file=sys.stderr)
print('Try re-running with "-j1 --fail-fast" to exit at the first error.',
file=sys.stderr)
sys.exit(1)
if not opt.quiet:
print('repo sync has finished successfully.')
def _PostRepoUpgrade(manifest, quiet=False): def _PostRepoUpgrade(manifest, quiet=False):
wrapper = Wrapper() wrapper = Wrapper()
if wrapper.NeedSetupGnuPG(): if wrapper.NeedSetupGnuPG():
@ -1016,12 +1013,11 @@ def _PostRepoUpgrade(manifest, quiet=False):
if project.Exists: if project.Exists:
project.PostRepoUpgrade() project.PostRepoUpgrade()
def _PostRepoFetch(rp, no_repo_verify=False, verbose=False):
def _PostRepoFetch(rp, repo_verify=True, verbose=False):
if rp.HasChanges: if rp.HasChanges:
print('info: A new version of repo is available', file=sys.stderr) print('info: A new version of repo is available', file=sys.stderr)
print(file=sys.stderr) print(file=sys.stderr)
if not repo_verify or _VerifyTag(rp): if no_repo_verify or _VerifyTag(rp):
syncbuf = SyncBuffer(rp.config) syncbuf = SyncBuffer(rp.config)
rp.Sync_LocalHalf(syncbuf) rp.Sync_LocalHalf(syncbuf)
if not syncbuf.Finish(): if not syncbuf.Finish():
@ -1035,7 +1031,6 @@ def _PostRepoFetch(rp, repo_verify=True, verbose=False):
print('repo version %s is current' % rp.work_git.describe(HEAD), print('repo version %s is current' % rp.work_git.describe(HEAD),
file=sys.stderr) file=sys.stderr)
def _VerifyTag(project): def _VerifyTag(project):
gpg_dir = os.path.expanduser('~/.repoconfig/gnupg') gpg_dir = os.path.expanduser('~/.repoconfig/gnupg')
if not os.path.exists(gpg_dir): if not os.path.exists(gpg_dir):
@ -1061,14 +1056,14 @@ def _VerifyTag(project):
return False return False
env = os.environ.copy() env = os.environ.copy()
env['GIT_DIR'] = project.gitdir env['GIT_DIR'] = project.gitdir.encode()
env['GNUPGHOME'] = gpg_dir env['GNUPGHOME'] = gpg_dir.encode()
cmd = [GIT, 'tag', '-v', cur] cmd = [GIT, 'tag', '-v', cur]
proc = subprocess.Popen(cmd, proc = subprocess.Popen(cmd,
stdout=subprocess.PIPE, stdout = subprocess.PIPE,
stderr=subprocess.PIPE, stderr = subprocess.PIPE,
env=env) env = env)
out = proc.stdout.read() out = proc.stdout.read()
proc.stdout.close() proc.stdout.close()
@ -1102,13 +1097,16 @@ class _FetchTimes(object):
old = self._times.get(name, t) old = self._times.get(name, t)
self._seen.add(name) self._seen.add(name)
a = self._ALPHA a = self._ALPHA
self._times[name] = (a * t) + ((1 - a) * old) self._times[name] = (a*t) + ((1-a) * old)
def _Load(self): def _Load(self):
if self._times is None: if self._times is None:
try: try:
with open(self._path) as f: f = open(self._path)
try:
self._times = json.load(f) self._times = json.load(f)
finally:
f.close()
except (IOError, ValueError): except (IOError, ValueError):
try: try:
platform_utils.remove(self._path) platform_utils.remove(self._path)
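(Aside: the `(a * t) + ((1 - a) * old)` update above is an exponential moving average of per-project fetch durations; the sync code sorts its fetch list by these times so the slowest projects start first. A small worked illustration follows; the real `_ALPHA` value is not shown in this diff, so 0.5 is only an assumed smoothing factor.)

```python
ALPHA = 0.5  # assumed smoothing factor; repo's actual _ALPHA may differ


def update(old_time, new_time, a=ALPHA):
    """Blend the newest fetch duration into the stored per-project average."""
    return (a * new_time) + ((1 - a) * old_time)


# If a project previously averaged 100s and the latest fetch took 60s, the
# stored value becomes 80s, so it still sorts near the front of the next
# fetch list (the list is sorted by these times, largest first).
print(update(100.0, 60.0))  # 80.0
```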
@ -1128,8 +1126,11 @@ class _FetchTimes(object):
del self._times[name] del self._times[name]
try: try:
with open(self._path, 'w') as f: f = open(self._path, 'w')
try:
json.dump(self._times, f, indent=2) json.dump(self._times, f, indent=2)
finally:
f.close()
except (IOError, TypeError): except (IOError, TypeError):
try: try:
platform_utils.remove(self._path) platform_utils.remove(self._path)
@ -1140,8 +1141,6 @@ class _FetchTimes(object):
# and supporting persistent-http[s]. It cannot change hosts from # and supporting persistent-http[s]. It cannot change hosts from
# request to request like the normal transport, the real url # request to request like the normal transport, the real url
# is passed during initialization. # is passed during initialization.
class PersistentTransport(xmlrpc.client.Transport): class PersistentTransport(xmlrpc.client.Transport):
def __init__(self, orig_host): def __init__(self, orig_host):
self.orig_host = orig_host self.orig_host = orig_host
@ -1152,7 +1151,7 @@ class PersistentTransport(xmlrpc.client.Transport):
# Since we're only using them for HTTP, copy the file temporarily, # Since we're only using them for HTTP, copy the file temporarily,
# stripping those prefixes away. # stripping those prefixes away.
if cookiefile: if cookiefile:
tmpcookiefile = tempfile.NamedTemporaryFile(mode='w') tmpcookiefile = tempfile.NamedTemporaryFile()
tmpcookiefile.write("# HTTP Cookie File") tmpcookiefile.write("# HTTP Cookie File")
try: try:
with open(cookiefile) as f: with open(cookiefile) as f:
@ -1176,7 +1175,7 @@ class PersistentTransport(xmlrpc.client.Transport):
if proxy: if proxy:
proxyhandler = urllib.request.ProxyHandler({ proxyhandler = urllib.request.ProxyHandler({
"http": proxy, "http": proxy,
"https": proxy}) "https": proxy })
opener = urllib.request.build_opener( opener = urllib.request.build_opener(
urllib.request.HTTPCookieProcessor(cookiejar), urllib.request.HTTPCookieProcessor(cookiejar),
@ -1233,3 +1232,4 @@ class PersistentTransport(xmlrpc.client.Transport):
def close(self): def close(self):
pass pass
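The sync changes above route every fetch, gc, and checkout worker through a shared `threading.Event` plus a `--fail-fast` option: workers set the event on any error, the dispatch loop stops launching new work once it is set, and the command exits non-zero at the end. The sketch below only illustrates that pattern with placeholder names (`do_fetch`, `projects`, `fail_fast`, `jobs`); it is not repo's actual implementation.

```python
import sys
import threading

# Placeholder inputs; in repo these come from the manifest and command line.
projects = ['project-a', 'project-b', 'project-c']
fail_fast = True
jobs = 2

err_event = threading.Event()
sem = threading.Semaphore(jobs)
threads = set()


def do_fetch(name):
    """Stand-in for Project.Sync_NetworkHalf(); returns False on failure."""
    return name != 'project-b'


def worker(name):
    try:
        if not do_fetch(name):
            err_event.set()
    except Exception:
        err_event.set()  # any unexpected error marks the whole run as failed
        raise
    finally:
        sem.release()


for name in projects:
    # Stop launching new work once an error is seen (the --fail-fast behaviour);
    # threads that are already running are allowed to finish.
    if err_event.is_set() and fail_fast:
        break
    sem.acquire()
    t = threading.Thread(target=worker, args=(name,))
    t.daemon = True  # so Ctrl-C does not hang waiting on worker threads
    threads.add(t)
    t.start()

for t in threads:
    t.join()

if err_event.is_set():
    print('\nerror: exited sync due to fetch errors', file=sys.stderr)
    sys.exit(1)
```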

View File

@ -23,18 +23,16 @@ from command import InteractiveCommand
from editor import Editor from editor import Editor
from error import HookError, UploadError from error import HookError, UploadError
from git_command import GitCommand from git_command import GitCommand
from git_refs import R_HEADS from project import RepoHook
from hooks import RepoHook
from pyversion import is_python3 from pyversion import is_python3
if not is_python3(): if not is_python3():
input = raw_input # noqa: F821 input = raw_input
else: else:
unicode = str unicode = str
UNUSUAL_COMMIT_THRESHOLD = 5 UNUSUAL_COMMIT_THRESHOLD = 5
def _ConfirmManyUploads(multiple_branches=False): def _ConfirmManyUploads(multiple_branches=False):
if multiple_branches: if multiple_branches:
print('ATTENTION: One or more branches has an unusually high number ' print('ATTENTION: One or more branches has an unusually high number '
@ -46,20 +44,17 @@ def _ConfirmManyUploads(multiple_branches=False):
answer = input("If you are sure you intend to do this, type 'yes': ").strip() answer = input("If you are sure you intend to do this, type 'yes': ").strip()
return answer == "yes" return answer == "yes"
def _die(fmt, *args): def _die(fmt, *args):
msg = fmt % args msg = fmt % args
print('error: %s' % msg, file=sys.stderr) print('error: %s' % msg, file=sys.stderr)
sys.exit(1) sys.exit(1)
def _SplitEmails(values): def _SplitEmails(values):
result = [] result = []
for value in values: for value in values:
result.extend([s.strip() for s in value.split(',')]) result.extend([s.strip() for s in value.split(',')])
return result return result
class Upload(InteractiveCommand): class Upload(InteractiveCommand):
common = True common = True
helpSummary = "Upload changes for code review" helpSummary = "Upload changes for code review"
@ -131,23 +126,6 @@ is set to "true" then repo will assume you always want the equivalent
of the -t option to the repo command. If unset or set to "false" then of the -t option to the repo command. If unset or set to "false" then
repo will make use of only the command line option. repo will make use of only the command line option.
review.URL.uploadhashtags:
To add hashtags whenever uploading a commit, you can set a per-project
or global Git option to do so. The value of review.URL.uploadhashtags
will be used as comma delimited hashtags like the --hashtag option.
review.URL.uploadlabels:
To add labels whenever uploading a commit, you can set a per-project
or global Git option to do so. The value of review.URL.uploadlabels
will be used as comma delimited labels like the --label option.
review.URL.uploadnotify:
Control e-mail notifications when uploading.
https://gerrit-review.googlesource.com/Documentation/user-upload.html#notify
# References # References
Gerrit Code Review: https://www.gerritcodereview.com/ Gerrit Code Review: https://www.gerritcodereview.com/
@ -158,15 +136,6 @@ Gerrit Code Review: https://www.gerritcodereview.com/
p.add_option('-t', p.add_option('-t',
dest='auto_topic', action='store_true', dest='auto_topic', action='store_true',
help='Send local branch name to Gerrit Code Review') help='Send local branch name to Gerrit Code Review')
p.add_option('--hashtag', '--ht',
dest='hashtags', action='append', default=[],
help='Add hashtags (comma delimited) to the review.')
p.add_option('--hashtag-branch', '--htb',
action='store_true',
help='Add local branch name as a hashtag.')
p.add_option('-l', '--label',
dest='labels', action='append', default=[],
help='Add a label when uploading.')
p.add_option('--re', '--reviewers', p.add_option('--re', '--reviewers',
type='string', action='append', dest='reviewers', type='string', action='append', dest='reviewers',
help='Request reviews from these people.') help='Request reviews from these people.')
@ -179,6 +148,9 @@ Gerrit Code Review: https://www.gerritcodereview.com/
p.add_option('--cbr', '--current-branch', p.add_option('--cbr', '--current-branch',
dest='current_branch', action='store_true', dest='current_branch', action='store_true',
help='Upload current git branch.') help='Upload current git branch.')
p.add_option('-d', '--draft',
action='store_true', dest='draft', default=False,
help='If specified, upload as a draft.')
p.add_option('--ne', '--no-emails', p.add_option('--ne', '--no-emails',
action='store_false', dest='notify', default=True, action='store_false', dest='notify', default=True,
help='If specified, do not send emails on upload.') help='If specified, do not send emails on upload.')
@ -196,15 +168,6 @@ Gerrit Code Review: https://www.gerritcodereview.com/
type='string', action='store', dest='dest_branch', type='string', action='store', dest='dest_branch',
metavar='BRANCH', metavar='BRANCH',
help='Submit for review on this target branch.') help='Submit for review on this target branch.')
p.add_option('-n', '--dry-run',
dest='dryrun', default=False, action='store_true',
help='Do everything except actually upload the CL.')
p.add_option('-y', '--yes',
default=False, action='store_true',
help='Answer yes to all safe prompts.')
p.add_option('--no-cert-checks',
dest='validate_certs', action='store_false', default=True,
help='Disable verifying ssl certs (unsafe).')
# Options relating to upload hook. Note that verify and no-verify are NOT # Options relating to upload hook. Note that verify and no-verify are NOT
# opposites of each other, which is why they store to different locations. # opposites of each other, which is why they store to different locations.
@ -222,16 +185,15 @@ Gerrit Code Review: https://www.gerritcodereview.com/
# Never run upload hooks, but upload anyway (AKA bypass hooks). # Never run upload hooks, but upload anyway (AKA bypass hooks).
# - no-verify=True, verify=True: # - no-verify=True, verify=True:
# Invalid # Invalid
g = p.add_option_group('Upload hooks') p.add_option('--no-cert-checks',
g.add_option('--no-verify', dest='validate_certs', action='store_false', default=True,
help='Disable verifying ssl certs (unsafe).')
p.add_option('--no-verify',
dest='bypass_hooks', action='store_true', dest='bypass_hooks', action='store_true',
help='Do not run the upload hook.') help='Do not run the upload hook.')
g.add_option('--verify', p.add_option('--verify',
dest='allow_all_hooks', action='store_true', dest='allow_all_hooks', action='store_true',
help='Run the upload hook without prompting.') help='Run the upload hook without prompting.')
g.add_option('--ignore-hooks',
dest='ignore_hooks', action='store_true',
help='Do not abort uploading if upload hooks fail.')
def _SingleBranch(self, opt, branch, people): def _SingleBranch(self, opt, branch, people):
project = branch.project project = branch.project
@ -250,7 +212,7 @@ Gerrit Code Review: https://www.gerritcodereview.com/
destination = opt.dest_branch or project.dest_branch or project.revisionExpr destination = opt.dest_branch or project.dest_branch or project.revisionExpr
print('Upload project %s/ to remote branch %s%s:' % print('Upload project %s/ to remote branch %s%s:' %
(project.relpath, destination, ' (private)' if opt.private else '')) (project.relpath, destination, ' (draft)' if opt.draft else ''))
print(' branch %s (%2d commit%s, %s):' % ( print(' branch %s (%2d commit%s, %s):' % (
name, name,
len(commit_list), len(commit_list),
@ -262,10 +224,6 @@ Gerrit Code Review: https://www.gerritcodereview.com/
print('to %s (y/N)? ' % remote.review, end='') print('to %s (y/N)? ' % remote.review, end='')
# TODO: When we require Python 3, use flush=True w/print above. # TODO: When we require Python 3, use flush=True w/print above.
sys.stdout.flush() sys.stdout.flush()
if opt.yes:
print('<--yes>')
answer = True
else:
answer = sys.stdin.readline().strip().lower() answer = sys.stdin.readline().strip().lower()
answer = answer in ('y', 'yes', '1', 'true', 't') answer = answer in ('y', 'yes', '1', 'true', 't')
@ -313,6 +271,11 @@ Gerrit Code Review: https://www.gerritcodereview.com/
branches[project.name] = b branches[project.name] = b
script.append('') script.append('')
script = [ x.encode('utf-8')
if issubclass(type(x), unicode)
else x
for x in script ]
script = Editor.EditString("\n".join(script)).split("\n") script = Editor.EditString("\n".join(script)).split("\n")
project_re = re.compile(r'^#?\s*project\s*([^\s]+)/:$') project_re = re.compile(r'^#?\s*project\s*([^\s]+)/:$')
@ -364,12 +327,12 @@ Gerrit Code Review: https://www.gerritcodereview.com/
key = 'review.%s.autoreviewer' % project.GetBranch(name).remote.review key = 'review.%s.autoreviewer' % project.GetBranch(name).remote.review
raw_list = project.config.GetString(key) raw_list = project.config.GetString(key)
if raw_list is not None: if not raw_list is None:
people[0].extend([entry.strip() for entry in raw_list.split(',')]) people[0].extend([entry.strip() for entry in raw_list.split(',')])
key = 'review.%s.autocopy' % project.GetBranch(name).remote.review key = 'review.%s.autocopy' % project.GetBranch(name).remote.review
raw_list = project.config.GetString(key) raw_list = project.config.GetString(key)
if raw_list is not None and len(people[0]) > 0: if not raw_list is None and len(people[0]) > 0:
people[1].extend([entry.strip() for entry in raw_list.split(',')]) people[1].extend([entry.strip() for entry in raw_list.split(',')])
def _FindGerritChange(self, branch): def _FindGerritChange(self, branch):
@ -406,10 +369,6 @@ Gerrit Code Review: https://www.gerritcodereview.com/
print('Continue uploading? (y/N) ', end='') print('Continue uploading? (y/N) ', end='')
# TODO: When we require Python 3, use flush=True w/print above. # TODO: When we require Python 3, use flush=True w/print above.
sys.stdout.flush() sys.stdout.flush()
if opt.yes:
print('<--yes>')
a = 'yes'
else:
a = sys.stdin.readline().strip().lower() a = sys.stdin.readline().strip().lower()
if a not in ('y', 'yes', 't', 'true', 'on'): if a not in ('y', 'yes', 't', 'true', 'on'):
print("skipping upload", file=sys.stderr) print("skipping upload", file=sys.stderr)
@ -422,51 +381,12 @@ Gerrit Code Review: https://www.gerritcodereview.com/
key = 'review.%s.uploadtopic' % branch.project.remote.review key = 'review.%s.uploadtopic' % branch.project.remote.review
opt.auto_topic = branch.project.config.GetBoolean(key) opt.auto_topic = branch.project.config.GetBoolean(key)
def _ExpandCommaList(value):
"""Split |value| up into comma delimited entries."""
if not value:
return
for ret in value.split(','):
ret = ret.strip()
if ret:
yield ret
# Check if hashtags should be included.
key = 'review.%s.uploadhashtags' % branch.project.remote.review
hashtags = set(_ExpandCommaList(branch.project.config.GetString(key)))
for tag in opt.hashtags:
hashtags.update(_ExpandCommaList(tag))
if opt.hashtag_branch:
hashtags.add(branch.name)
# Check if labels should be included.
key = 'review.%s.uploadlabels' % branch.project.remote.review
labels = set(_ExpandCommaList(branch.project.config.GetString(key)))
for label in opt.labels:
labels.update(_ExpandCommaList(label))
# Basic sanity check on label syntax.
for label in labels:
if not re.match(r'^.+[+-][0-9]+$', label):
print('repo: error: invalid label syntax "%s": labels use forms '
'like CodeReview+1 or Verified-1' % (label,), file=sys.stderr)
sys.exit(1)
# Handle e-mail notifications.
if opt.notify is False:
notify = 'NONE'
else:
key = 'review.%s.uploadnotify' % branch.project.remote.review
notify = branch.project.config.GetString(key)
destination = opt.dest_branch or branch.project.dest_branch destination = opt.dest_branch or branch.project.dest_branch
# Make sure our local branch is not setup to track a different remote branch # Make sure our local branch is not setup to track a different remote branch
merge_branch = self._GetMergeBranch(branch.project) merge_branch = self._GetMergeBranch(branch.project)
if destination: if destination:
full_dest = destination full_dest = 'refs/heads/%s' % destination
if not full_dest.startswith(R_HEADS):
full_dest = R_HEADS + full_dest
if not opt.dest_branch and merge_branch and merge_branch != full_dest: if not opt.dest_branch and merge_branch and merge_branch != full_dest:
print('merge branch %s does not match destination branch %s' print('merge branch %s does not match destination branch %s'
% (merge_branch, full_dest)) % (merge_branch, full_dest))
@ -477,12 +397,10 @@ Gerrit Code Review: https://www.gerritcodereview.com/
continue continue
branch.UploadForReview(people, branch.UploadForReview(people,
dryrun=opt.dryrun,
auto_topic=opt.auto_topic, auto_topic=opt.auto_topic,
hashtags=hashtags, draft=opt.draft,
labels=labels,
private=opt.private, private=opt.private,
notify=notify, notify=None if opt.notify else 'NONE',
wip=opt.wip, wip=opt.wip,
dest_branch=destination, dest_branch=destination,
validate_certs=opt.validate_certs, validate_certs=opt.validate_certs,
@ -505,8 +423,8 @@ Gerrit Code Review: https://www.gerritcodereview.com/
else: else:
fmt = '\n (%s)' fmt = '\n (%s)'
print(('[FAILED] %-15s %-15s' + fmt) % ( print(('[FAILED] %-15s %-15s' + fmt) % (
branch.project.relpath + '/', branch.project.relpath + '/', \
branch.name, branch.name, \
str(branch.error)), str(branch.error)),
file=sys.stderr) file=sys.stderr)
print() print()
@ -524,14 +442,14 @@ Gerrit Code Review: https://www.gerritcodereview.com/
def _GetMergeBranch(self, project): def _GetMergeBranch(self, project):
p = GitCommand(project, p = GitCommand(project,
['rev-parse', '--abbrev-ref', 'HEAD'], ['rev-parse', '--abbrev-ref', 'HEAD'],
capture_stdout=True, capture_stdout = True,
capture_stderr=True) capture_stderr = True)
p.Wait() p.Wait()
local_branch = p.stdout.strip() local_branch = p.stdout.strip()
p = GitCommand(project, p = GitCommand(project,
['config', '--get', 'branch.%s.merge' % local_branch], ['config', '--get', 'branch.%s.merge' % local_branch],
capture_stdout=True, capture_stdout = True,
capture_stderr=True) capture_stderr = True)
p.Wait() p.Wait()
merge_branch = p.stdout.strip() merge_branch = p.stdout.strip()
return merge_branch return merge_branch
@ -565,12 +483,8 @@ Gerrit Code Review: https://www.gerritcodereview.com/
pending.append((project, avail)) pending.append((project, avail))
if not pending: if not pending:
if branch is None: print("no branches ready for upload", file=sys.stderr)
print('repo: error: no branches ready for upload', file=sys.stderr) return
else:
print('repo: error: no branches named "%s" ready for upload' %
(branch,), file=sys.stderr)
return 1
if not opt.bypass_hooks: if not opt.bypass_hooks:
hook = RepoHook('pre-upload', self.manifest.repo_hooks_project, hook = RepoHook('pre-upload', self.manifest.repo_hooks_project,
@ -579,24 +493,12 @@ Gerrit Code Review: https://www.gerritcodereview.com/
abort_if_user_denies=True) abort_if_user_denies=True)
pending_proj_names = [project.name for (project, available) in pending] pending_proj_names = [project.name for (project, available) in pending]
pending_worktrees = [project.worktree for (project, available) in pending] pending_worktrees = [project.worktree for (project, available) in pending]
passed = True
try: try:
hook.Run(opt.allow_all_hooks, project_list=pending_proj_names, hook.Run(opt.allow_all_hooks, project_list=pending_proj_names,
worktree_list=pending_worktrees) worktree_list=pending_worktrees)
except SystemExit:
passed = False
if not opt.ignore_hooks:
raise
except HookError as e: except HookError as e:
passed = False
print("ERROR: %s" % str(e), file=sys.stderr) print("ERROR: %s" % str(e), file=sys.stderr)
return
if not passed:
if opt.ignore_hooks:
print('\nWARNING: pre-upload hooks failed, but uploading anyways.',
file=sys.stderr)
else:
return 1
if opt.reviewers: if opt.reviewers:
reviewers = _SplitEmails(opt.reviewers) reviewers = _SplitEmails(opt.reviewers)
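The new `--hashtag`/`--label` handling above merges comma-delimited values from per-remote git config (`review.URL.uploadhashtags`, `review.URL.uploadlabels`) with the command-line flags and then sanity-checks label syntax. A rough standalone illustration of that merging; the config lookups are faked with plain strings here, whereas repo reads them via `branch.project.config.GetString()`.

```python
import re
import sys


def expand_comma_list(value):
    """Split |value| up into comma delimited entries (as in _ExpandCommaList)."""
    if not value:
        return
    for ret in value.split(','):
        ret = ret.strip()
        if ret:
            yield ret


# Stand-ins for review.URL.uploadhashtags / review.URL.uploadlabels config
# values and for the --hashtag/--label command-line options.
config_hashtags = 'stable, cherry-pick'
cli_hashtags = ['needs-qa']
config_labels = 'Verified+1'
cli_labels = ['Code-Review+2']

hashtags = set(expand_comma_list(config_hashtags))
for tag in cli_hashtags:
    hashtags.update(expand_comma_list(tag))

labels = set(expand_comma_list(config_labels))
for label in cli_labels:
    labels.update(expand_comma_list(label))

# Basic sanity check on label syntax, mirroring the check in the diff above.
for label in labels:
    if not re.match(r'^.+[+-][0-9]+$', label):
        print('invalid label syntax "%s": labels use forms like '
              'CodeReview+1 or Verified-1' % (label,), file=sys.stderr)
        sys.exit(1)

print('hashtags:', sorted(hashtags))  # ['cherry-pick', 'needs-qa', 'stable']
print('labels:', sorted(labels))      # ['Code-Review+2', 'Verified+1']
```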

View File

@ -15,15 +15,11 @@
# limitations under the License. # limitations under the License.
from __future__ import print_function from __future__ import print_function
import platform
import sys import sys
from command import Command, MirrorSafeCommand from command import Command, MirrorSafeCommand
from git_command import git, RepoSourceVersion, user_agent from git_command import git, RepoSourceVersion, user_agent
from git_refs import HEAD from git_refs import HEAD
class Version(Command, MirrorSafeCommand): class Version(Command, MirrorSafeCommand):
wrapper_version = None wrapper_version = None
wrapper_path = None wrapper_path = None
@ -43,11 +39,10 @@ class Version(Command, MirrorSafeCommand):
rp_ver = rp.bare_git.describe(HEAD) rp_ver = rp.bare_git.describe(HEAD)
print('repo version %s' % rp_ver) print('repo version %s' % rp_ver)
print(' (from %s)' % rem.url) print(' (from %s)' % rem.url)
print(' (%s)' % rp.bare_git.log('-1', '--format=%cD', HEAD))
if self.wrapper_path is not None: if Version.wrapper_path is not None:
print('repo launcher version %s' % self.wrapper_version) print('repo launcher version %s' % Version.wrapper_version)
print(' (from %s)' % self.wrapper_path) print(' (from %s)' % Version.wrapper_path)
if src_ver != rp_ver: if src_ver != rp_ver:
print(' (currently at %s)' % src_ver) print(' (currently at %s)' % src_ver)
@ -56,11 +51,3 @@ class Version(Command, MirrorSafeCommand):
print('git %s' % git.version_tuple().full) print('git %s' % git.version_tuple().full)
print('git User-Agent %s' % user_agent.git) print('git User-Agent %s' % user_agent.git)
print('Python %s' % sys.version) print('Python %s' % sys.version)
uname = platform.uname()
if sys.version_info.major < 3:
# Python 3 returns a named tuple, but Python 2 is simpler.
print(uname)
else:
print('OS %s %s (%s)' % (uname.system, uname.release, uname.version))
print('CPU %s (%s)' %
(uname.machine, uname.processor if uname.processor else 'unknown'))

View File

@ -1,13 +1,3 @@
[section] [section]
empty empty
nonempty = true nonempty = true
boolinvalid = oops
booltrue = true
boolfalse = false
intinvalid = oops
inthex = 0x10
inthexk = 0x10k
int = 10
intk = 10k
intm = 10m
intg = 10g

View File

@ -1,60 +0,0 @@
# -*- coding:utf-8 -*-
#
# Copyright (C) 2019 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""Unittests for the editor.py module."""
from __future__ import print_function
import unittest
from editor import Editor
class EditorTestCase(unittest.TestCase):
"""Take care of resetting Editor state across tests."""
def setUp(self):
self.setEditor(None)
def tearDown(self):
self.setEditor(None)
@staticmethod
def setEditor(editor):
Editor._editor = editor
class GetEditor(EditorTestCase):
"""Check GetEditor behavior."""
def test_basic(self):
"""Basic checking of _GetEditor."""
self.setEditor(':')
self.assertEqual(':', Editor._GetEditor())
class EditString(EditorTestCase):
"""Check EditString behavior."""
def test_no_editor(self):
"""Check behavior when no editor is available."""
self.setEditor(':')
self.assertEqual('foo', Editor.EditString('foo'))
def test_cat_editor(self):
"""Check behavior when editor is `cat`."""
self.setEditor('cat')
self.assertEqual('foo', Editor.EditString('foo'))

View File

@ -21,40 +21,7 @@ from __future__ import print_function
import re import re
import unittest import unittest
try:
from unittest import mock
except ImportError:
import mock
import git_command import git_command
import wrapper
class SSHUnitTest(unittest.TestCase):
"""Tests the ssh functions."""
def test_ssh_version(self):
"""Check ssh_version() handling."""
ver = git_command._parse_ssh_version('Unknown\n')
self.assertEqual(ver, ())
ver = git_command._parse_ssh_version('OpenSSH_1.0\n')
self.assertEqual(ver, (1, 0))
ver = git_command._parse_ssh_version('OpenSSH_6.6.1p1 Ubuntu-2ubuntu2.13, OpenSSL 1.0.1f 6 Jan 2014\n')
self.assertEqual(ver, (6, 6, 1))
ver = git_command._parse_ssh_version('OpenSSH_7.6p1 Ubuntu-4ubuntu0.3, OpenSSL 1.0.2n 7 Dec 2017\n')
self.assertEqual(ver, (7, 6))
def test_ssh_sock(self):
"""Check ssh_sock() function."""
with mock.patch('tempfile.mkdtemp', return_value='/tmp/foo'):
# old ssh version uses port
with mock.patch('git_command.ssh_version', return_value=(6, 6)):
self.assertTrue(git_command.ssh_sock().endswith('%p'))
git_command._ssh_sock_path = None
# new ssh version uses hash
with mock.patch('git_command.ssh_version', return_value=(6, 7)):
self.assertTrue(git_command.ssh_sock().endswith('%C'))
git_command._ssh_sock_path = None
class GitCallUnitTest(unittest.TestCase): class GitCallUnitTest(unittest.TestCase):
@ -68,7 +35,7 @@ class GitCallUnitTest(unittest.TestCase):
# We don't dive too deep into the values here to avoid having to update # We don't dive too deep into the values here to avoid having to update
# whenever git versions change. We do check relative to this min version # whenever git versions change. We do check relative to this min version
# as this is what `repo` itself requires via MIN_GIT_VERSION. # as this is what `repo` itself requires via MIN_GIT_VERSION.
MIN_GIT_VERSION = (2, 10, 2) MIN_GIT_VERSION = (1, 7, 2)
self.assertTrue(isinstance(ver.major, int)) self.assertTrue(isinstance(ver.major, int))
self.assertTrue(isinstance(ver.minor, int)) self.assertTrue(isinstance(ver.minor, int))
self.assertTrue(isinstance(ver.micro, int)) self.assertTrue(isinstance(ver.micro, int))
@ -109,45 +76,3 @@ class UserAgentUnitTest(unittest.TestCase):
# the general form. # the general form.
m = re.match(r'^git/[^ ]+ ([^ ]+) git-repo/[^ ]+', ua) m = re.match(r'^git/[^ ]+ ([^ ]+) git-repo/[^ ]+', ua)
self.assertIsNotNone(m) self.assertIsNotNone(m)
class GitRequireTests(unittest.TestCase):
"""Test the git_require helper."""
def setUp(self):
ver = wrapper.GitVersion(1, 2, 3, 4)
mock.patch.object(git_command.git, 'version_tuple', return_value=ver).start()
def tearDown(self):
mock.patch.stopall()
def test_older_nonfatal(self):
"""Test non-fatal require calls with old versions."""
self.assertFalse(git_command.git_require((2,)))
self.assertFalse(git_command.git_require((1, 3)))
self.assertFalse(git_command.git_require((1, 2, 4)))
self.assertFalse(git_command.git_require((1, 2, 3, 5)))
def test_newer_nonfatal(self):
"""Test non-fatal require calls with newer versions."""
self.assertTrue(git_command.git_require((0,)))
self.assertTrue(git_command.git_require((1, 0)))
self.assertTrue(git_command.git_require((1, 2, 0)))
self.assertTrue(git_command.git_require((1, 2, 3, 0)))
def test_equal_nonfatal(self):
"""Test require calls with equal values."""
self.assertTrue(git_command.git_require((1, 2, 3, 4), fail=False))
self.assertTrue(git_command.git_require((1, 2, 3, 4), fail=True))
def test_older_fatal(self):
"""Test fatal require calls with old versions."""
with self.assertRaises(SystemExit) as e:
git_command.git_require((2,), fail=True)
self.assertNotEqual(0, e.code)
def test_older_fatal_msg(self):
"""Test fatal require calls with old versions and message."""
with self.assertRaises(SystemExit) as e:
git_command.git_require((2,), fail=True, msg='so sad')
self.assertNotEqual(0, e.code)

View File

@ -23,17 +23,14 @@ import unittest
import git_config import git_config
def fixture(*paths): def fixture(*paths):
"""Return a path relative to test/fixtures. """Return a path relative to test/fixtures.
""" """
return os.path.join(os.path.dirname(__file__), 'fixtures', *paths) return os.path.join(os.path.dirname(__file__), 'fixtures', *paths)
class GitConfigUnitTest(unittest.TestCase): class GitConfigUnitTest(unittest.TestCase):
"""Tests the GitConfig class. """Tests the GitConfig class.
""" """
def setUp(self): def setUp(self):
"""Create a GitConfig object using the test.gitconfig fixture. """Create a GitConfig object using the test.gitconfig fixture.
""" """
@ -71,43 +68,5 @@ class GitConfigUnitTest(unittest.TestCase):
val = config.GetString('empty') val = config.GetString('empty')
self.assertEqual(val, None) self.assertEqual(val, None)
def test_GetBoolean_undefined(self):
"""Test GetBoolean on key that doesn't exist."""
self.assertIsNone(self.config.GetBoolean('section.missing'))
def test_GetBoolean_invalid(self):
"""Test GetBoolean on invalid boolean value."""
self.assertIsNone(self.config.GetBoolean('section.boolinvalid'))
def test_GetBoolean_true(self):
"""Test GetBoolean on valid true boolean."""
self.assertTrue(self.config.GetBoolean('section.booltrue'))
def test_GetBoolean_false(self):
"""Test GetBoolean on valid false boolean."""
self.assertFalse(self.config.GetBoolean('section.boolfalse'))
def test_GetInt_undefined(self):
"""Test GetInt on key that doesn't exist."""
self.assertIsNone(self.config.GetInt('section.missing'))
def test_GetInt_invalid(self):
"""Test GetInt on invalid integer value."""
self.assertIsNone(self.config.GetBoolean('section.intinvalid'))
def test_GetInt_valid(self):
"""Test GetInt on valid integers."""
TESTS = (
('inthex', 16),
('inthexk', 16384),
('int', 10),
('intk', 10240),
('intm', 10485760),
('intg', 10737418240),
)
for key, value in TESTS:
self.assertEqual(value, self.config.GetInt('section.%s' % (key,)))
if __name__ == '__main__': if __name__ == '__main__':
unittest.main() unittest.main()
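The new fixture keys and `GetInt`/`GetBoolean` tests above imply git-config-style integer parsing: an optional `0x` hex prefix plus `k`/`m`/`g` suffixes scaling by powers of 1024, with `None` returned for unparseable values. A standalone approximation that satisfies those expectations (not repo's actual `GetInt`):

```python
def parse_git_int(value):
    """Parse a git-config style integer such as '10', '0x10', '10k', '10g'."""
    if value is None:
        return None
    value = value.strip().lower()
    mult = 1
    if value.endswith('k'):
        value, mult = value[:-1], 1024
    elif value.endswith('m'):
        value, mult = value[:-1], 1024 * 1024
    elif value.endswith('g'):
        value, mult = value[:-1], 1024 * 1024 * 1024
    base = 16 if value.startswith('0x') else 10
    try:
        return int(value, base) * mult
    except ValueError:
        return None  # mirrors GetInt returning None for 'oops'


assert parse_git_int('0x10') == 16
assert parse_git_int('0x10k') == 16384
assert parse_git_int('10k') == 10240
assert parse_git_int('10m') == 10485760
assert parse_git_int('10g') == 10737418240
assert parse_git_int('oops') is None
```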

View File

@ -1,60 +0,0 @@
# -*- coding:utf-8 -*-
#
# Copyright (C) 2019 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""Unittests for the hooks.py module."""
from __future__ import print_function
import hooks
import unittest
class RepoHookShebang(unittest.TestCase):
"""Check shebang parsing in RepoHook."""
def test_no_shebang(self):
"""Lines w/out shebangs should be rejected."""
DATA = (
'',
'# -*- coding:utf-8 -*-\n',
'#\n# foo\n',
'# Bad shebang in script\n#!/foo\n'
)
for data in DATA:
self.assertIsNone(hooks.RepoHook._ExtractInterpFromShebang(data))
def test_direct_interp(self):
"""Lines whose shebang points directly to the interpreter."""
DATA = (
('#!/foo', '/foo'),
('#! /foo', '/foo'),
('#!/bin/foo ', '/bin/foo'),
('#! /usr/foo ', '/usr/foo'),
('#! /usr/foo -args', '/usr/foo'),
)
for shebang, interp in DATA:
self.assertEqual(hooks.RepoHook._ExtractInterpFromShebang(shebang),
interp)
def test_env_interp(self):
"""Lines whose shebang launches through `env`."""
DATA = (
('#!/usr/bin/env foo', 'foo'),
('#!/bin/env foo', 'foo'),
('#! /bin/env /bin/foo ', '/bin/foo'),
)
for shebang, interp in DATA:
self.assertEqual(hooks.RepoHook._ExtractInterpFromShebang(shebang),
interp)

View File

@ -1,148 +0,0 @@
# -*- coding:utf-8 -*-
#
# Copyright (C) 2019 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""Unittests for the manifest_xml.py module."""
from __future__ import print_function
import os
import unittest
import xml.dom.minidom
import error
import manifest_xml
class ManifestValidateFilePaths(unittest.TestCase):
"""Check _ValidateFilePaths helper.
This doesn't access a real filesystem.
"""
def check_both(self, *args):
manifest_xml.XmlManifest._ValidateFilePaths('copyfile', *args)
manifest_xml.XmlManifest._ValidateFilePaths('linkfile', *args)
def test_normal_path(self):
"""Make sure good paths are accepted."""
self.check_both('foo', 'bar')
self.check_both('foo/bar', 'bar')
self.check_both('foo', 'bar/bar')
self.check_both('foo/bar', 'bar/bar')
def test_symlink_targets(self):
"""Some extra checks for symlinks."""
def check(*args):
manifest_xml.XmlManifest._ValidateFilePaths('linkfile', *args)
# We allow symlinks to end in a slash since we allow them to point to dirs
# in general. Technically the slash isn't necessary.
check('foo/', 'bar')
# We allow a single '.' to get a reference to the project itself.
check('.', 'bar')
def test_bad_paths(self):
"""Make sure bad paths (src & dest) are rejected."""
PATHS = (
'..',
'../',
'./',
'foo/',
'./foo',
'../foo',
'foo/./bar',
'foo/../../bar',
'/foo',
'./../foo',
'.git/foo',
# Check case folding.
'.GIT/foo',
'blah/.git/foo',
'.repo/foo',
'.repoconfig',
# Block ~ due to 8.3 filenames on Windows filesystems.
'~',
'foo~',
'blah/foo~',
# Block Unicode characters that get normalized out by filesystems.
u'foo\u200Cbar',
)
# Make sure platforms that use path separators (e.g. Windows) are also
# rejected properly.
if os.path.sep != '/':
PATHS += tuple(x.replace('/', os.path.sep) for x in PATHS)
for path in PATHS:
self.assertRaises(
error.ManifestInvalidPathError, self.check_both, path, 'a')
self.assertRaises(
error.ManifestInvalidPathError, self.check_both, 'a', path)
class ValueTests(unittest.TestCase):
"""Check utility parsing code."""
def _get_node(self, text):
return xml.dom.minidom.parseString(text).firstChild
def test_bool_default(self):
"""Check XmlBool default handling."""
node = self._get_node('<node/>')
self.assertIsNone(manifest_xml.XmlBool(node, 'a'))
self.assertIsNone(manifest_xml.XmlBool(node, 'a', None))
self.assertEqual(123, manifest_xml.XmlBool(node, 'a', 123))
node = self._get_node('<node a=""/>')
self.assertIsNone(manifest_xml.XmlBool(node, 'a'))
def test_bool_invalid(self):
"""Check XmlBool invalid handling."""
node = self._get_node('<node a="moo"/>')
self.assertEqual(123, manifest_xml.XmlBool(node, 'a', 123))
def test_bool_true(self):
"""Check XmlBool true values."""
for value in ('yes', 'true', '1'):
node = self._get_node('<node a="%s"/>' % (value,))
self.assertTrue(manifest_xml.XmlBool(node, 'a'))
def test_bool_false(self):
"""Check XmlBool false values."""
for value in ('no', 'false', '0'):
node = self._get_node('<node a="%s"/>' % (value,))
self.assertFalse(manifest_xml.XmlBool(node, 'a'))
def test_int_default(self):
"""Check XmlInt default handling."""
node = self._get_node('<node/>')
self.assertIsNone(manifest_xml.XmlInt(node, 'a'))
self.assertIsNone(manifest_xml.XmlInt(node, 'a', None))
self.assertEqual(123, manifest_xml.XmlInt(node, 'a', 123))
node = self._get_node('<node a=""/>')
self.assertIsNone(manifest_xml.XmlInt(node, 'a'))
def test_int_good(self):
"""Check XmlInt numeric handling."""
for value in (-1, 0, 1, 50000):
node = self._get_node('<node a="%s"/>' % (value,))
self.assertEqual(value, manifest_xml.XmlInt(node, 'a'))
def test_int_invalid(self):
"""Check XmlInt invalid handling."""
with self.assertRaises(error.ManifestParseError):
node = self._get_node('<node a="xx"/>')
manifest_xml.XmlInt(node, 'a')
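For reference, a minimal sketch of attribute helpers with the semantics the ValueTests above check; the names and the error class are placeholders, not the real manifest_xml code:
class ManifestParseError(Exception):
  """Stand-in for repo's manifest parse error."""
def xml_bool(node, attr, default=None):
  # Missing/empty attributes fall back to |default|; unrecognized values too.
  value = node.getAttribute(attr).lower()
  if not value:
    return default
  if value in ('yes', 'true', '1'):
    return True
  if value in ('no', 'false', '0'):
    return False
  return default
def xml_int(node, attr, default=None):
  # Missing/empty attributes fall back to |default|; non-numbers are an error.
  value = node.getAttribute(attr)
  if not value:
    return default
  try:
    return int(value)
  except ValueError:
    raise ManifestParseError('invalid numeric value: %s' % (value,))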

View File

@ -18,311 +18,45 @@
from __future__ import print_function
import contextlib
import os
import shutil
import subprocess
import tempfile
import unittest
import error
import git_config
import platform_utils
import project
@contextlib.contextmanager
def TempGitTree():
"""Create a new empty git checkout for testing."""
# TODO(vapier): Convert this to tempfile.TemporaryDirectory once we drop
# Python 2 support entirely.
try:
tempdir = tempfile.mkdtemp(prefix='repo-tests')
subprocess.check_call(['git', 'init'], cwd=tempdir)
yield tempdir
finally:
platform_utils.rmtree(tempdir)
class FakeProject(object):
"""A fake for Project for basic functionality."""
def __init__(self, worktree):
self.worktree = worktree
self.gitdir = os.path.join(worktree, '.git')
self.name = 'fakeproject'
self.work_git = project.Project._GitGetByExec(
self, bare=False, gitdir=self.gitdir)
self.bare_git = project.Project._GitGetByExec(
self, bare=True, gitdir=self.gitdir)
self.config = git_config.GitConfig.ForRepository(gitdir=self.gitdir)
class ReviewableBranchTests(unittest.TestCase):
"""Check ReviewableBranch behavior."""
def test_smoke(self):
"""A quick run through everything."""
with TempGitTree() as tempdir:
fakeproj = FakeProject(tempdir)
# Generate some commits.
with open(os.path.join(tempdir, 'readme'), 'w') as fp:
fp.write('txt')
fakeproj.work_git.add('readme')
fakeproj.work_git.commit('-mAdd file')
fakeproj.work_git.checkout('-b', 'work')
fakeproj.work_git.rm('-f', 'readme')
fakeproj.work_git.commit('-mDel file')
# Start off with the normal details.
rb = project.ReviewableBranch(
fakeproj, fakeproj.config.GetBranch('work'), 'master')
self.assertEqual('work', rb.name)
self.assertEqual(1, len(rb.commits))
self.assertIn('Del file', rb.commits[0])
d = rb.unabbrev_commits
self.assertEqual(1, len(d))
short, long = next(iter(d.items()))
self.assertTrue(long.startswith(short))
self.assertTrue(rb.base_exists)
# Hard to assert anything useful about this.
self.assertTrue(rb.date)
# Now delete the tracking branch!
fakeproj.work_git.branch('-D', 'master')
rb = project.ReviewableBranch(
fakeproj, fakeproj.config.GetBranch('work'), 'master')
self.assertEqual(0, len(rb.commits))
self.assertFalse(rb.base_exists)
# Hard to assert anything useful about this.
self.assertTrue(rb.date)
class CopyLinkTestCase(unittest.TestCase):
"""TestCase for stub repo client checkouts.
It'll have a layout like:
tempdir/ # self.tempdir
checkout/ # self.topdir
git-project/ # self.worktree
Attributes:
tempdir: A dedicated temporary directory.
worktree: The top of the repo client checkout.
topdir: The top of a project checkout.
"""
def setUp(self):
self.tempdir = tempfile.mkdtemp(prefix='repo_tests')
self.topdir = os.path.join(self.tempdir, 'checkout')
self.worktree = os.path.join(self.topdir, 'git-project')
os.makedirs(self.topdir)
os.makedirs(self.worktree)
def tearDown(self):
shutil.rmtree(self.tempdir, ignore_errors=True)
@staticmethod
def touch(path):
with open(path, 'w'):
pass
def assertExists(self, path, msg=None):
"""Make sure |path| exists."""
if os.path.exists(path):
return
if msg is None:
msg = ['path is missing: %s' % path]
while path != '/':
path = os.path.dirname(path)
if not path:
# If we're given something like "foo", abort once we get to "".
break
result = os.path.exists(path)
msg.append('\tos.path.exists(%s): %s' % (path, result))
if result:
msg.append('\tcontents: %r' % os.listdir(path))
break
msg = '\n'.join(msg)
raise self.failureException(msg)
class CopyFile(CopyLinkTestCase):
"""Check _CopyFile handling."""
def CopyFile(self, src, dest):
return project._CopyFile(self.worktree, src, self.topdir, dest)
def test_basic(self):
"""Basic test of copying a file from a project to the toplevel."""
src = os.path.join(self.worktree, 'foo.txt')
self.touch(src)
cf = self.CopyFile('foo.txt', 'foo')
cf._Copy()
self.assertExists(os.path.join(self.topdir, 'foo'))
def test_src_subdir(self):
"""Copy a file from a subdir of a project."""
src = os.path.join(self.worktree, 'bar', 'foo.txt')
os.makedirs(os.path.dirname(src))
self.touch(src)
cf = self.CopyFile('bar/foo.txt', 'new.txt')
cf._Copy()
self.assertExists(os.path.join(self.topdir, 'new.txt'))
def test_dest_subdir(self):
"""Copy a file to a subdir of a checkout."""
src = os.path.join(self.worktree, 'foo.txt')
self.touch(src)
cf = self.CopyFile('foo.txt', 'sub/dir/new.txt')
self.assertFalse(os.path.exists(os.path.join(self.topdir, 'sub')))
cf._Copy()
self.assertExists(os.path.join(self.topdir, 'sub', 'dir', 'new.txt'))
def test_update(self):
"""Make sure changed files get copied again."""
src = os.path.join(self.worktree, 'foo.txt')
dest = os.path.join(self.topdir, 'bar')
with open(src, 'w') as f:
f.write('1st')
cf = self.CopyFile('foo.txt', 'bar')
cf._Copy()
self.assertExists(dest)
with open(dest) as f:
self.assertEqual(f.read(), '1st')
with open(src, 'w') as f:
f.write('2nd!')
cf._Copy()
with open(dest) as f:
self.assertEqual(f.read(), '2nd!')
def test_src_block_symlink(self):
"""Do not allow reading from a symlinked path."""
src = os.path.join(self.worktree, 'foo.txt')
sym = os.path.join(self.worktree, 'sym')
self.touch(src)
platform_utils.symlink('foo.txt', sym)
self.assertExists(sym)
cf = self.CopyFile('sym', 'foo')
self.assertRaises(error.ManifestInvalidPathError, cf._Copy)
def test_src_block_symlink_traversal(self):
"""Do not allow reading through a symlink dir."""
realfile = os.path.join(self.tempdir, 'file.txt')
self.touch(realfile)
src = os.path.join(self.worktree, 'bar', 'file.txt')
platform_utils.symlink(self.tempdir, os.path.join(self.worktree, 'bar'))
self.assertExists(src)
cf = self.CopyFile('bar/file.txt', 'foo')
self.assertRaises(error.ManifestInvalidPathError, cf._Copy)
def test_src_block_copy_from_dir(self):
"""Do not allow copying from a directory."""
src = os.path.join(self.worktree, 'dir')
os.makedirs(src)
cf = self.CopyFile('dir', 'foo')
self.assertRaises(error.ManifestInvalidPathError, cf._Copy)
def test_dest_block_symlink(self):
"""Do not allow writing to a symlink."""
src = os.path.join(self.worktree, 'foo.txt')
self.touch(src)
platform_utils.symlink('dest', os.path.join(self.topdir, 'sym'))
cf = self.CopyFile('foo.txt', 'sym')
self.assertRaises(error.ManifestInvalidPathError, cf._Copy)
def test_dest_block_symlink_traversal(self):
"""Do not allow writing through a symlink dir."""
src = os.path.join(self.worktree, 'foo.txt')
self.touch(src)
platform_utils.symlink(tempfile.gettempdir(),
os.path.join(self.topdir, 'sym'))
cf = self.CopyFile('foo.txt', 'sym/foo.txt')
self.assertRaises(error.ManifestInvalidPathError, cf._Copy)
def test_src_block_copy_to_dir(self):
"""Do not allow copying to a directory."""
src = os.path.join(self.worktree, 'foo.txt')
self.touch(src)
os.makedirs(os.path.join(self.topdir, 'dir'))
cf = self.CopyFile('foo.txt', 'dir')
self.assertRaises(error.ManifestInvalidPathError, cf._Copy)
class LinkFile(CopyLinkTestCase):
"""Check _LinkFile handling."""
def LinkFile(self, src, dest):
return project._LinkFile(self.worktree, src, self.topdir, dest)
def test_basic(self):
"""Basic test of linking a file from a project into the toplevel."""
src = os.path.join(self.worktree, 'foo.txt')
self.touch(src)
lf = self.LinkFile('foo.txt', 'foo')
lf._Link()
dest = os.path.join(self.topdir, 'foo')
self.assertExists(dest)
self.assertTrue(os.path.islink(dest))
self.assertEqual(os.path.join('git-project', 'foo.txt'), os.readlink(dest))
def test_src_subdir(self):
"""Link to a file in a subdir of a project."""
src = os.path.join(self.worktree, 'bar', 'foo.txt')
os.makedirs(os.path.dirname(src))
self.touch(src)
lf = self.LinkFile('bar/foo.txt', 'foo')
lf._Link()
self.assertExists(os.path.join(self.topdir, 'foo'))
def test_src_self(self):
"""Link to the project itself."""
dest = os.path.join(self.topdir, 'foo', 'bar')
lf = self.LinkFile('.', 'foo/bar')
lf._Link()
self.assertExists(dest)
self.assertEqual(os.path.join('..', 'git-project'), os.readlink(dest))
def test_dest_subdir(self):
"""Link a file to a subdir of a checkout."""
src = os.path.join(self.worktree, 'foo.txt')
self.touch(src)
lf = self.LinkFile('foo.txt', 'sub/dir/foo/bar')
self.assertFalse(os.path.exists(os.path.join(self.topdir, 'sub')))
lf._Link()
self.assertExists(os.path.join(self.topdir, 'sub', 'dir', 'foo', 'bar'))
def test_src_block_relative(self):
"""Do not allow relative symlinks."""
BAD_SOURCES = (
'./',
'..',
'../',
'foo/.',
'foo/./bar',
'foo/..',
'foo/../foo',
)
for src in BAD_SOURCES:
lf = self.LinkFile(src, 'foo')
self.assertRaises(error.ManifestInvalidPathError, lf._Link)
def test_update(self):
"""Make sure changed targets get updated."""
dest = os.path.join(self.topdir, 'sym')
src = os.path.join(self.worktree, 'foo.txt')
self.touch(src)
lf = self.LinkFile('foo.txt', 'sym')
lf._Link()
self.assertEqual(os.path.join('git-project', 'foo.txt'), os.readlink(dest))
# Point the symlink somewhere else.
os.unlink(dest)
platform_utils.symlink(self.tempdir, dest)
lf._Link()
self.assertEqual(os.path.join('git-project', 'foo.txt'), os.readlink(dest))
class RepoHookShebang(unittest.TestCase):
"""Check shebang parsing in RepoHook."""
def test_no_shebang(self):
"""Lines w/out shebangs should be rejected."""
DATA = (
'',
'# -*- coding:utf-8 -*-\n',
'#\n# foo\n',
'# Bad shebang in script\n#!/foo\n'
)
for data in DATA:
self.assertIsNone(project.RepoHook._ExtractInterpFromShebang(data))
def test_direct_interp(self):
"""Lines whose shebang points directly to the interpreter."""
DATA = (
('#!/foo', '/foo'),
('#! /foo', '/foo'),
('#!/bin/foo ', '/bin/foo'),
('#! /usr/foo ', '/usr/foo'),
('#! /usr/foo -args', '/usr/foo'),
)
for shebang, interp in DATA:
self.assertEqual(project.RepoHook._ExtractInterpFromShebang(shebang),
interp)
def test_env_interp(self):
"""Lines whose shebang launches through `env`."""
DATA = (
('#!/usr/bin/env foo', 'foo'),
('#!/bin/env foo', 'foo'),
('#! /bin/env /bin/foo ', '/bin/foo'),
)
for shebang, interp in DATA:
self.assertEqual(project.RepoHook._ExtractInterpFromShebang(shebang),
interp)

View File

@ -1,43 +0,0 @@
# Copyright (C) 2020 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""Unittests for the subcmds module (mostly __init__.py than subcommands)."""
import unittest
import subcmds
class AllCommands(unittest.TestCase):
"""Check registered all_commands."""
def test_required_basic(self):
"""Basic checking of registered commands."""
# NB: We don't test all subcommands as we want to avoid "change detection"
# tests, so we just look for the most common/important ones here that are
# unlikely to ever change.
for cmd in {'cherry-pick', 'help', 'init', 'start', 'sync', 'upload'}:
self.assertIn(cmd, subcmds.all_commands)
def test_naming(self):
"""Verify we don't add things that we shouldn't."""
for cmd in subcmds.all_commands:
# Reject filename suffixes like "help.py".
self.assertNotIn('.', cmd)
# Make sure all '_' were converted to '-'.
self.assertNotIn('_', cmd)
# Reject internal python paths like "__init__".
self.assertFalse(cmd.startswith('__'))
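A minimal sketch of one plausible way to derive such a command map from module filenames, consistent with the naming rules the test above enforces; this is illustrative only, since the real subcmds/__init__.py imports and instantiates command classes:
import os
def discover_command_names(subcmds_dir):
  # Illustrative sketch: skip dunder modules, drop the '.py' suffix, and map
  # '_' to '-' so module names become user-facing command names.
  names = set()
  for py in os.listdir(subcmds_dir):
    if not py.endswith('.py') or py.startswith('__'):
      continue
    names.add(py[:-len('.py')].replace('_', '-'))
  return names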

View File

@ -1,49 +0,0 @@
# Copyright (C) 2020 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""Unittests for the subcmds/init.py module."""
import unittest
from subcmds import init
class InitCommand(unittest.TestCase):
"""Check registered all_commands."""
def setUp(self):
self.cmd = init.Init()
def test_cli_parser_good(self):
"""Check valid command line options."""
ARGV = (
[],
)
for argv in ARGV:
opts, args = self.cmd.OptionParser.parse_args(argv)
self.cmd.ValidateOptions(opts, args)
def test_cli_parser_bad(self):
"""Check invalid command line options."""
ARGV = (
# Too many arguments.
['asdf'],
# Conflicting options.
['--mirror', '--archive'],
)
for argv in ARGV:
opts, args = self.cmd.OptionParser.parse_args(argv)
with self.assertRaises(SystemExit):
self.cmd.ValidateOptions(opts, args)
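A minimal sketch of option validation matching what these cases expect; illustrative only, not the real subcmds/init.py (parser.error() raises SystemExit via optparse):
def validate_init_options(parser, opts, args):
  # Illustrative sketch: positional arguments and the --mirror/--archive
  # combination are both rejected, mirroring the test expectations above.
  if args:
    parser.error('init takes no arguments')
  if getattr(opts, 'mirror', False) and getattr(opts, 'archive', False):
    parser.error('--mirror and --archive cannot be used together')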

View File

@ -18,83 +18,24 @@
from __future__ import print_function
import contextlib
import os
import re
import shutil
import tempfile
import unittest
import platform_utils
from pyversion import is_python3
import wrapper
if is_python3():
from unittest import mock
from io import StringIO
else:
import mock
from StringIO import StringIO
@contextlib.contextmanager
def TemporaryDirectory():
"""Create a new empty git checkout for testing."""
# TODO(vapier): Convert this to tempfile.TemporaryDirectory once we drop
# Python 2 support entirely.
try:
tempdir = tempfile.mkdtemp(prefix='repo-tests')
yield tempdir
finally:
platform_utils.rmtree(tempdir)
def fixture(*paths):
"""Return a path relative to tests/fixtures.
"""
return os.path.join(os.path.dirname(__file__), 'fixtures', *paths)
class RepoWrapperTestCase(unittest.TestCase):
"""TestCase for the wrapper module."""
def setUp(self):
"""Load the wrapper module every time."""
wrapper._wrapper_module = None
self.wrapper = wrapper.Wrapper()
if not is_python3():
self.assertRegex = self.assertRegexpMatches
class RepoWrapperUnitTest(RepoWrapperTestCase):
"""Tests helper functions in the repo wrapper
"""
def test_version(self):
"""Make sure _Version works."""
with self.assertRaises(SystemExit) as e:
with mock.patch('sys.stdout', new_callable=StringIO) as stdout:
with mock.patch('sys.stderr', new_callable=StringIO) as stderr:
self.wrapper._Version()
self.assertEqual(0, e.exception.code)
self.assertEqual('', stderr.getvalue())
self.assertIn('repo launcher version', stdout.getvalue())
def test_init_parser(self):
"""Make sure 'init' GetParser works."""
parser = self.wrapper.GetParser(gitc_init=False)
opts, args = parser.parse_args([])
self.assertEqual([], args)
self.assertIsNone(opts.manifest_url)
def test_gitc_init_parser(self):
"""Make sure 'gitc-init' GetParser works."""
parser = self.wrapper.GetParser(gitc_init=True)
opts, args = parser.parse_args([])
self.assertEqual([], args)
self.assertIsNone(opts.manifest_file)
class RepoWrapperUnitTest(unittest.TestCase):
"""Tests helper functions in the repo wrapper
"""
def setUp(self):
"""Load the wrapper module every time
"""
wrapper._wrapper_module = None
self.wrapper = wrapper.Wrapper()
def test_get_gitc_manifest_dir_no_gitc(self):
"""
@ -131,355 +72,9 @@ class RepoWrapperUnitTest(RepoWrapperTestCase):
self.assertEqual(self.wrapper.gitc_parse_clientdir('/gitc/manifest-rw/test/extra'), 'test')
self.assertEqual(self.wrapper.gitc_parse_clientdir('/test/usr/local/google/gitc/test'), 'test')
self.assertEqual(self.wrapper.gitc_parse_clientdir('/test/usr/local/google/gitc/test/'), 'test')
self.assertEqual(self.wrapper.gitc_parse_clientdir('/test/usr/local/google/gitc/test/extra'),
'test')
self.assertEqual(self.wrapper.gitc_parse_clientdir('/gitc/manifest-rw/'), None)
self.assertEqual(self.wrapper.gitc_parse_clientdir('/test/usr/local/google/gitc/'), None)
class SetGitTrace2ParentSid(RepoWrapperTestCase):
"""Check SetGitTrace2ParentSid behavior."""
KEY = 'GIT_TRACE2_PARENT_SID'
VALID_FORMAT = re.compile(r'^repo-[0-9]{8}T[0-9]{6}Z-P[0-9a-f]{8}$')
def test_first_set(self):
"""Test env var not yet set."""
env = {}
self.wrapper.SetGitTrace2ParentSid(env)
self.assertIn(self.KEY, env)
value = env[self.KEY]
self.assertRegex(value, self.VALID_FORMAT)
def test_append(self):
"""Test env var is appended."""
env = {self.KEY: 'pfx'}
self.wrapper.SetGitTrace2ParentSid(env)
self.assertIn(self.KEY, env)
value = env[self.KEY]
self.assertTrue(value.startswith('pfx/'))
self.assertRegex(value[4:], self.VALID_FORMAT)
def test_global_context(self):
"""Check os.environ gets updated by default."""
os.environ.pop(self.KEY, None)
self.wrapper.SetGitTrace2ParentSid()
self.assertIn(self.KEY, os.environ)
value = os.environ[self.KEY]
self.assertRegex(value, self.VALID_FORMAT)
class RunCommand(RepoWrapperTestCase):
"""Check run_command behavior."""
def test_capture(self):
"""Check capture_output handling."""
ret = self.wrapper.run_command(['echo', 'hi'], capture_output=True)
self.assertEqual(ret.stdout, 'hi\n')
def test_check(self):
"""Check check handling."""
self.wrapper.run_command(['true'], check=False)
self.wrapper.run_command(['true'], check=True)
self.wrapper.run_command(['false'], check=False)
with self.assertRaises(self.wrapper.RunError):
self.wrapper.run_command(['false'], check=True)
class RunGit(RepoWrapperTestCase):
"""Check run_git behavior."""
def test_capture(self):
"""Check capture_output handling."""
ret = self.wrapper.run_git('--version')
self.assertIn('git', ret.stdout)
def test_check(self):
"""Check check handling."""
with self.assertRaises(self.wrapper.CloneFailure):
self.wrapper.run_git('--version-asdfasdf')
self.wrapper.run_git('--version-asdfasdf', check=False)
class ParseGitVersion(RepoWrapperTestCase):
"""Check ParseGitVersion behavior."""
def test_autoload(self):
"""Check we can load the version from the live git."""
ret = self.wrapper.ParseGitVersion()
self.assertIsNotNone(ret)
def test_bad_ver(self):
"""Check handling of bad git versions."""
ret = self.wrapper.ParseGitVersion(ver_str='asdf')
self.assertIsNone(ret)
def test_normal_ver(self):
"""Check handling of normal git versions."""
ret = self.wrapper.ParseGitVersion(ver_str='git version 2.25.1')
self.assertEqual(2, ret.major)
self.assertEqual(25, ret.minor)
self.assertEqual(1, ret.micro)
self.assertEqual('2.25.1', ret.full)
def test_extended_ver(self):
"""Check handling of extended distro git versions."""
ret = self.wrapper.ParseGitVersion(
ver_str='git version 1.30.50.696.g5e7596f4ac-goog')
self.assertEqual(1, ret.major)
self.assertEqual(30, ret.minor)
self.assertEqual(50, ret.micro)
self.assertEqual('1.30.50.696.g5e7596f4ac-goog', ret.full)
class CheckGitVersion(RepoWrapperTestCase):
"""Check _CheckGitVersion behavior."""
def test_unknown(self):
"""Unknown versions should abort."""
with mock.patch.object(self.wrapper, 'ParseGitVersion', return_value=None):
with self.assertRaises(self.wrapper.CloneFailure):
self.wrapper._CheckGitVersion()
def test_old(self):
"""Old versions should abort."""
with mock.patch.object(
self.wrapper, 'ParseGitVersion',
return_value=self.wrapper.GitVersion(1, 0, 0, '1.0.0')):
with self.assertRaises(self.wrapper.CloneFailure):
self.wrapper._CheckGitVersion()
def test_new(self):
"""Newer versions should run fine."""
with mock.patch.object(
self.wrapper, 'ParseGitVersion',
return_value=self.wrapper.GitVersion(100, 0, 0, '100.0.0')):
self.wrapper._CheckGitVersion()
class NeedSetupGnuPG(RepoWrapperTestCase):
"""Check NeedSetupGnuPG behavior."""
def test_missing_dir(self):
"""The ~/.repoconfig tree doesn't exist yet."""
with TemporaryDirectory() as tempdir:
self.wrapper.home_dot_repo = os.path.join(tempdir, 'foo')
self.assertTrue(self.wrapper.NeedSetupGnuPG())
def test_missing_keyring(self):
"""The keyring-version file doesn't exist yet."""
with TemporaryDirectory() as tempdir:
self.wrapper.home_dot_repo = tempdir
self.assertTrue(self.wrapper.NeedSetupGnuPG())
def test_empty_keyring(self):
"""The keyring-version file exists, but is empty."""
with TemporaryDirectory() as tempdir:
self.wrapper.home_dot_repo = tempdir
with open(os.path.join(tempdir, 'keyring-version'), 'w'):
pass
self.assertTrue(self.wrapper.NeedSetupGnuPG())
def test_old_keyring(self):
"""The keyring-version file exists, but it's old."""
with TemporaryDirectory() as tempdir:
self.wrapper.home_dot_repo = tempdir
with open(os.path.join(tempdir, 'keyring-version'), 'w') as fp:
fp.write('1.0\n')
self.assertTrue(self.wrapper.NeedSetupGnuPG())
def test_new_keyring(self):
"""The keyring-version file exists, and is up-to-date."""
with TemporaryDirectory() as tempdir:
self.wrapper.home_dot_repo = tempdir
with open(os.path.join(tempdir, 'keyring-version'), 'w') as fp:
fp.write('1000.0\n')
self.assertFalse(self.wrapper.NeedSetupGnuPG())
class SetupGnuPG(RepoWrapperTestCase):
"""Check SetupGnuPG behavior."""
def test_full(self):
"""Make sure it works completely."""
with TemporaryDirectory() as tempdir:
self.wrapper.home_dot_repo = tempdir
self.wrapper.gpg_dir = os.path.join(self.wrapper.home_dot_repo, 'gnupg')
self.assertTrue(self.wrapper.SetupGnuPG(True))
with open(os.path.join(tempdir, 'keyring-version'), 'r') as fp:
data = fp.read()
self.assertEqual('.'.join(str(x) for x in self.wrapper.KEYRING_VERSION),
data.strip())
class VerifyRev(RepoWrapperTestCase):
"""Check verify_rev behavior."""
def test_verify_passes(self):
"""Check when we have a valid signed tag."""
desc_result = self.wrapper.RunResult(0, 'v1.0\n', '')
gpg_result = self.wrapper.RunResult(0, '', '')
with mock.patch.object(self.wrapper, 'run_git',
side_effect=(desc_result, gpg_result)):
ret = self.wrapper.verify_rev('/', 'refs/heads/stable', '1234', True)
self.assertEqual('v1.0^0', ret)
def test_unsigned_commit(self):
"""Check we fall back to signed tag when we have an unsigned commit."""
desc_result = self.wrapper.RunResult(0, 'v1.0-10-g1234\n', '')
gpg_result = self.wrapper.RunResult(0, '', '')
with mock.patch.object(self.wrapper, 'run_git',
side_effect=(desc_result, gpg_result)):
ret = self.wrapper.verify_rev('/', 'refs/heads/stable', '1234', True)
self.assertEqual('v1.0^0', ret)
def test_verify_fails(self):
"""Check we fall back to signed tag when we have an unsigned commit."""
desc_result = self.wrapper.RunResult(0, 'v1.0-10-g1234\n', '')
gpg_result = Exception
with mock.patch.object(self.wrapper, 'run_git',
side_effect=(desc_result, gpg_result)):
with self.assertRaises(Exception):
self.wrapper.verify_rev('/', 'refs/heads/stable', '1234', True)
class GitCheckoutTestCase(RepoWrapperTestCase):
"""Tests that use a real/small git checkout."""
GIT_DIR = None
REV_LIST = None
@classmethod
def setUpClass(cls):
# Create a repo to operate on, but do it once per-class.
cls.GIT_DIR = tempfile.mkdtemp(prefix='repo-rev-tests')
run_git = wrapper.Wrapper().run_git
remote = os.path.join(cls.GIT_DIR, 'remote')
os.mkdir(remote)
run_git('init', cwd=remote)
run_git('commit', '--allow-empty', '-minit', cwd=remote)
run_git('branch', 'stable', cwd=remote)
run_git('tag', 'v1.0', cwd=remote)
run_git('commit', '--allow-empty', '-m2nd commit', cwd=remote)
cls.REV_LIST = run_git('rev-list', 'HEAD', cwd=remote).stdout.splitlines()
run_git('init', cwd=cls.GIT_DIR)
run_git('fetch', remote, '+refs/heads/*:refs/remotes/origin/*', cwd=cls.GIT_DIR)
@classmethod
def tearDownClass(cls):
if not cls.GIT_DIR:
return
shutil.rmtree(cls.GIT_DIR)
class ResolveRepoRev(GitCheckoutTestCase):
"""Check resolve_repo_rev behavior."""
def test_explicit_branch(self):
"""Check refs/heads/branch argument."""
rrev, lrev = self.wrapper.resolve_repo_rev(self.GIT_DIR, 'refs/heads/stable')
self.assertEqual('refs/heads/stable', rrev)
self.assertEqual(self.REV_LIST[1], lrev)
with self.assertRaises(wrapper.CloneFailure):
self.wrapper.resolve_repo_rev(self.GIT_DIR, 'refs/heads/unknown')
def test_explicit_tag(self):
"""Check refs/tags/tag argument."""
rrev, lrev = self.wrapper.resolve_repo_rev(self.GIT_DIR, 'refs/tags/v1.0')
self.assertEqual('refs/tags/v1.0', rrev)
self.assertEqual(self.REV_LIST[1], lrev)
with self.assertRaises(wrapper.CloneFailure):
self.wrapper.resolve_repo_rev(self.GIT_DIR, 'refs/tags/unknown')
def test_branch_name(self):
"""Check branch argument."""
rrev, lrev = self.wrapper.resolve_repo_rev(self.GIT_DIR, 'stable')
self.assertEqual('refs/heads/stable', rrev)
self.assertEqual(self.REV_LIST[1], lrev)
rrev, lrev = self.wrapper.resolve_repo_rev(self.GIT_DIR, 'master')
self.assertEqual('refs/heads/master', rrev)
self.assertEqual(self.REV_LIST[0], lrev)
def test_tag_name(self):
"""Check tag argument."""
rrev, lrev = self.wrapper.resolve_repo_rev(self.GIT_DIR, 'v1.0')
self.assertEqual('refs/tags/v1.0', rrev)
self.assertEqual(self.REV_LIST[1], lrev)
def test_full_commit(self):
"""Check specific commit argument."""
commit = self.REV_LIST[0]
rrev, lrev = self.wrapper.resolve_repo_rev(self.GIT_DIR, commit)
self.assertEqual(commit, rrev)
self.assertEqual(commit, lrev)
def test_partial_commit(self):
"""Check specific (partial) commit argument."""
commit = self.REV_LIST[0][0:20]
rrev, lrev = self.wrapper.resolve_repo_rev(self.GIT_DIR, commit)
self.assertEqual(self.REV_LIST[0], rrev)
self.assertEqual(self.REV_LIST[0], lrev)
def test_unknown(self):
"""Check unknown ref/commit argument."""
with self.assertRaises(wrapper.CloneFailure):
self.wrapper.resolve_repo_rev(self.GIT_DIR, 'boooooooya')
class CheckRepoVerify(RepoWrapperTestCase):
"""Check check_repo_verify behavior."""
def test_no_verify(self):
"""Always fail with --no-repo-verify."""
self.assertFalse(self.wrapper.check_repo_verify(False))
def test_gpg_initialized(self):
"""Should pass if gpg is setup already."""
with mock.patch.object(self.wrapper, 'NeedSetupGnuPG', return_value=False):
self.assertTrue(self.wrapper.check_repo_verify(True))
def test_need_gpg_setup(self):
"""Should pass/fail based on gpg setup."""
with mock.patch.object(self.wrapper, 'NeedSetupGnuPG', return_value=True):
with mock.patch.object(self.wrapper, 'SetupGnuPG') as m:
m.return_value = True
self.assertTrue(self.wrapper.check_repo_verify(True))
m.return_value = False
self.assertFalse(self.wrapper.check_repo_verify(True))
class CheckRepoRev(GitCheckoutTestCase):
"""Check check_repo_rev behavior."""
def test_verify_works(self):
"""Should pass when verification passes."""
with mock.patch.object(self.wrapper, 'check_repo_verify', return_value=True):
with mock.patch.object(self.wrapper, 'verify_rev', return_value='12345'):
rrev, lrev = self.wrapper.check_repo_rev(self.GIT_DIR, 'stable')
self.assertEqual('refs/heads/stable', rrev)
self.assertEqual('12345', lrev)
def test_verify_fails(self):
"""Should fail when verification fails."""
with mock.patch.object(self.wrapper, 'check_repo_verify', return_value=True):
with mock.patch.object(self.wrapper, 'verify_rev', side_effect=Exception):
with self.assertRaises(Exception):
self.wrapper.check_repo_rev(self.GIT_DIR, 'stable')
def test_verify_ignore(self):
"""Should pass when verification is disabled."""
with mock.patch.object(self.wrapper, 'verify_rev', side_effect=Exception):
rrev, lrev = self.wrapper.check_repo_rev(self.GIT_DIR, 'stable', repo_verify=False)
self.assertEqual('refs/heads/stable', rrev)
self.assertEqual(self.REV_LIST[1], lrev)
if __name__ == '__main__':
unittest.main()

32
tox.ini
View File

@ -1,32 +0,0 @@
# Copyright 2019 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# https://tox.readthedocs.io/
[tox]
envlist = py36, py37, py38
[gh-actions]
python =
3.6: py36
3.7: py37
3.8: py38
[testenv]
deps = pytest
commands = {envpython} run_tests
setenv =
GIT_AUTHOR_NAME = Repo test author
GIT_COMMITTER_NAME = Repo test committer
EMAIL = repo@gerrit.nodomain

View File

@ -27,10 +27,7 @@ import os
def WrapperPath():
return os.path.join(os.path.dirname(__file__), 'repo')
_wrapper_module = None
def Wrapper():
global _wrapper_module
if not _wrapper_module: