Compare commits


295 Commits
0.179...main

Author SHA1 Message Date
Simon Quigley
466e2784de Upload to Unstable 2025-03-04 13:43:32 -06:00
Simon Quigley
ba3f0511f9 syncpackage: Catch exceptions cleanly, simply skipping to the next package (erring on the side of caution) if there is an error doing the download (LP: #1943286). 2025-03-04 13:42:50 -06:00
Simon Quigley
2e550ceff2 syncpackage: Cache the sync blocklist in-memory, so it's not fetched multiple times when syncing more than one package. 2025-03-04 13:39:07 -06:00
Simon Quigley
6c8a5d74bd syncpackage: s/syncblacklist/syncblocklist/g 2025-03-04 13:29:02 -06:00
Simon Quigley
3d11516599 mk-sbuild: default to using UTC for schroots (LP: #2097159). 2025-03-04 13:22:40 -06:00
Simon Quigley
5a20308ab1 Read ~/.devscripts in a more robust way, to ideally pick up multi-line variables (Closes: #725418). 2025-03-04 13:17:30 -06:00
Simon Quigley
b551877651 Add a changelog entry 2025-03-04 13:10:04 -06:00
ferbraher
4a4c4e0a27 Parsing arch parameter to getBinaryPackage() 2025-03-04 13:08:59 -06:00
Simon Quigley
865c1c97bc Add a changelog entry 2025-03-04 13:07:42 -06:00
Shengjing Zhu
d09718e976 import-bug-from-debian: package option is overridden and not used 2025-03-04 13:07:11 -06:00
Simon Quigley
bff7baecc9 Add a changelog entry 2025-03-04 13:06:38 -06:00
Dan Bungert
45fbbb5bd1 mk-sbuild: enable pkgmaintainermangler
mk-sbuild installs pkgbinarymangler into the schroot. One of the
tools provided in pkgbinarymangler is pkgmaintainermangler.
pkgmaintainermangler is disabled by default, and enabled with
configuration.

A difference between launchpad builds of a synced package and an sbuild
is that the maintainer information will be different.

Enable pkgmaintainermangler to close this difference.
2025-03-04 13:05:57 -06:00
Simon Quigley
ca217c035e Add a new changelog entry 2025-03-04 13:04:49 -06:00
Simon Quigley
b5e117788b Upload to Unstable 2025-03-01 11:30:18 -06:00
Simon Quigley
ddba2d1e98 Update Standards-Version to 4.7.2, no changes needed. 2025-03-01 11:29:53 -06:00
Simon Quigley
02d65a5804 [syncpackage] Do not use exit(1) on an error or exception unless it applies to all packages, instead return None so we can continue to the next package. 2025-03-01 11:26:59 -06:00
Simon Quigley
bda85fa6a8 [syncpackage] Add support for -y or --yes, noted that it should be used with care. 2025-03-01 11:22:52 -06:00
Simon Quigley
86a83bf74d [syncpackage] Within fetch_source_pkg, do not exit(1) on an error or exception, simply return None so we can continue to the next package. 2025-03-01 11:17:02 -06:00
Simon Quigley
162e758671 [syncpackage] When syncing multiple packages, if one of the packages is in the sync blocklist, do not exit, simply continue. 2025-03-01 11:12:49 -06:00
Simon Quigley
049425adb7 Add debian/files to .gitignore 2025-03-01 11:11:34 -06:00
Simon Quigley
f6ca6cad92 Add a new changelog entry 2025-03-01 11:11:17 -06:00
Simon Quigley
3dc17934d6 Upload to Unstable 2025-02-24 19:55:03 -06:00
Simon Quigley
10a176567a Remove mail line from default ~/.sbuildrc, to resolve the undeclared dependency on sendmail (Closes: #1074632). 2025-02-24 19:52:59 -06:00
Simon Quigley
86b366c6c5 Add a large warning at the top of mk-sbuild encouraging the use of the unshare backend. This is to provide ample warning to users. 2025-02-24 19:15:55 -06:00
Simon Quigley
50b580b30e Add a manpage for running-autopkgtests. 2025-02-24 18:51:12 -06:00
Simon Quigley
6ba0641f63 Rename bitesize to lp-bitesize (Closes: #1076224). 2025-02-24 18:51:10 -06:00
Simon Quigley
1e815db9d2 Add my name to the copyright file. 2025-02-24 18:35:20 -06:00
Simon Quigley
e2f43318bd Add several Lintian overrides related to .pyc files. 2025-02-24 18:34:18 -06:00
Julien Plissonneau Duquène
cdd81232d9 Fix reverse-depends -b crash on packages that b-d on themselves (Closes: #1087760). 2025-02-24 18:31:33 -06:00
Simon Quigley
65044d84d9 Update Standards-Version to 4.7.1, no changes needed. 2025-02-24 18:26:59 -06:00
Mattia Rizzolo
19e40b49c2
Fix minor typo in pbuilder-dist(1)
LP: #2096956
Thanks: Rolf Leggewie for the patch
Signed-off-by: Mattia Rizzolo <mattia@debian.org>
2025-01-30 07:52:22 +01:00
Benjamin Drung
55eb521461 Release 0.203 2024-11-02 18:20:32 +01:00
Benjamin Drung
983bb3b70e Depend on python3-yaml for pm-helper 2024-11-02 18:09:16 +01:00
Benjamin Drung
85f2e46f7d conform to snake_case naming style 2024-11-02 18:07:23 +01:00
Benjamin Drung
649c3db767 ubuntu-build: fix used-before-assignment
```
ubuntu-build:244:40: E0601: Using variable 'necessary_privs' before assignment (used-before-assignment)
```
2024-11-02 17:56:47 +01:00
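The pattern pylint's E0601 flags can be reconstructed in miniature (the names here are made up for illustration, not the actual ubuntu-build code):

```python
# Buggy shape: the variable is only assigned inside one branch, so the
# later use can happen before any assignment exists.
def necessary_privs_buggy(pocket):
    if pocket == "Release":
        necessary_privs = "upload-any"
    return necessary_privs  # E0601: used-before-assignment

# Fixed shape: give the variable a value on every code path.
def necessary_privs_fixed(pocket):
    necessary_privs = None
    if pocket == "Release":
        necessary_privs = "upload-any"
    return necessary_privs
```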
Benjamin Drung
e7ba650414 Avoid unnecessary "elif" after "continue"
Address pylint's no-else-continue.
2024-11-02 17:55:33 +01:00
Benjamin Drung
3bc802a209 Use lazy % formatting in logging functions 2024-11-02 17:55:20 +01:00
Benjamin Drung
92c80d7bb7 ubuntu-build: remove unused code/imports 2024-11-02 17:54:06 +01:00
Benjamin Drung
d7362d9ed8 Use Python f-strings
```
flynt -ll 99 -tc -tj -a pbuilder-dist pm-helper running-autopkgtests ubuntu-build ubuntutools
```
2024-11-02 17:49:20 +01:00
Benjamin Drung
c7a855ff20 Format code with black and isort
```
isort pbuilder-dist pm-helper running-autopkgtests ubuntu-build ubuntutools
black -C pbuilder-dist pm-helper running-autopkgtests ubuntu-build ubuntutools
```
2024-11-02 17:21:30 +01:00
Benjamin Drung
017941ad70 setup.py: add pm-helper 2024-11-02 16:41:44 +01:00
Benjamin Drung
69914f861e add missing files to debian/copyright 2024-11-02 16:35:31 +01:00
Benjamin Drung
454f1e30c8 Bump year in copyright 2024-11-02 15:57:19 +01:00
Benjamin Drung
55bc403a95 Bump Standards-Version to 4.7.0 2024-11-02 15:56:01 +01:00
Benjamin Drung
c9339aeae4 import-bug-from-debian: add type hints 2024-11-02 15:34:59 +01:00
Benjamin Drung
c205ee0381 import-bug-from-debian: avoid type change of bug_num
The variable `bug_num` has the type `str`. Do not reuse the same name for
an `int` value; bind it to a new name to keep mypy happy.
2024-11-02 15:33:15 +01:00
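The mypy-friendly pattern described above, as a minimal sketch (`parse_bug` is an illustrative name, not the tool's own helper):

```python
def parse_bug(bug_num: str) -> int:
    """Keep `bug_num` a str throughout; bind the parsed integer to a new
    name instead of reassigning bug_num = int(bug_num), which would give
    one variable two types."""
    bug_id = int(bug_num.lstrip("#"))
    return bug_id
```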
Benjamin Drung
7577e10f13 import-bug-from-debian: reuse message variable
`log[0]["message"]` was already queried.
2024-11-02 15:32:19 +01:00
Florent 'Skia' Jacquet
e328dc05c2 import-bug-from-debian: split big main function into smaller ones
This allows better understanding of the various parts of the code, by
naming important parts and defining boundaries on the used variables.
2024-11-02 15:08:09 +01:00
Florent 'Skia' Jacquet
9a94c9dea1 import-bug-from-debian: handle multipart messages
With multipart messages, like #1073996, `import-bug-from-debian` would
produce bug description with this:
```
[<email.message.Message object at 0x7fbe14096fa0>, <email.message.Message object at 0x7fbe15143820>]
```
For that kind of bug, it now produces a correct description with the
plain text parts concatenated in the description, the attachments added
as attachments, and the inline images converted to attachments with an
inline message placeholder.

See #981577 for a particularly weird case now gracefully handled.
If something weirder happens, then the tool will now abort with a clear
message instead of producing garbage.

Closes: #969510
2024-11-02 14:57:01 +01:00
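The multipart handling above can be sketched with the stdlib `email` API (the function name is illustrative): stringifying a multipart payload yields `<email.message.Message object at ...>` reprs, so the plain-text parts have to be walked and concatenated explicitly.

```python
from email.message import EmailMessage

def extract_description(msg: EmailMessage) -> str:
    """Concatenate the inline text/plain parts of a (possibly multipart)
    message instead of stringifying the raw payload list."""
    if not msg.is_multipart():
        return msg.get_content()
    parts = [part.get_content() for part in msg.walk()
             if part.get_content_type() == "text/plain"
             and part.get_content_disposition() != "attachment"]
    return "\n".join(parts)
```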
Florent 'Skia' Jacquet
47ab7b608b Add gitignore 2024-10-30 17:31:54 +01:00
Steve Langasek
56044d8eac Recommend sbuild over pbuilder. sbuild is the tool recommended by Ubuntu developers, and its behavior most closely approximates Launchpad builds. 2024-05-26 13:04:55 -07:00
Steve Langasek
c523b4cfc4 open new version 2024-05-26 13:01:23 -07:00
Steve Langasek
3df40f6392 Handle exceptions on retry
The "can be retried" value from launchpad may have been cached.  Avoid an
exception when we race someone else retrying a build.
2024-05-26 12:57:14 -07:00
Simon Quigley
6ebffe3f4a Consolidate Ubuntu changelog entries, upload to Unstable 2024-04-12 23:35:08 -05:00
Chris Peterson
f01234e8a5 update debian/copyright
- Correctly add ISC licenses to new files in ubuntutools/tests/*
  as specified in debian/copyright
- Add GPL-3 licenses and correct attribution for:
    - running-autopkgtests
    - ubuntutools/running_autopkgtests.py
2024-03-13 09:21:30 -07:00
Chris Peterson
43891eda88 depends: python3-launchpadlib-desktop
Replace the dependency on python3-launchpadlib with
python3-launchpadlib-desktop. This package is the same as python3-launchpadlib
except that it also includes python3-keyring, which is a requirement for
some of the desktop-centric code paths. In this case, requestsync has a
path for logging in via a web browser, which also requires python3-keyring
to be installed. This caused a ModuleNotFoundError when
python3-launchpadlib dropped python3-keyring from Recommends to Suggests
(LP: #2049217).
2024-03-13 09:17:49 -07:00
Steve Langasek
132866e2ba releasing package ubuntu-dev-tools version 0.201ubuntu2 2024-03-12 17:03:58 -07:00
Steve Langasek
a0fcac7777 changelog update 2024-03-12 17:03:41 -07:00
Steve Langasek
490895075d Merge latest Ubuntu upload 2024-03-12 17:01:59 -07:00
Chris Peterson
5186e76d8d Import Debian version 0.201ubuntu1
ubuntu-dev-tools (0.201ubuntu1) noble; urgency=medium
.
  * Replace Depends on python3-launchpadlib with Depends on
    python3-launchpadlib-desktop (LP: #2049217)
2024-03-12 17:01:19 -07:00
Steve Langasek
bf46f7fbc1 Fix license statement in manpage 2024-03-12 12:09:19 -07:00
Steve Langasek
881602c4b9 Update ubuntu-build manpage to match current options 2024-03-12 12:08:58 -07:00
Steve Langasek
c869d07f75 ubuntu-build: don't retry builds Launchpad tells us can't be retried 2024-03-12 11:52:32 -07:00
Gianfranco Costamagna
59041af613 update changelog 2024-03-12 10:39:36 +01:00
Gianfranco Costamagna
0ec53180f2 Merge remote-tracking branch 'vorlon/ubuntu-build-revamp' 2024-03-12 10:36:13 +01:00
Steve Langasek
c92fa6502f ubuntu-build: Handling of proposed vs release pocket default for ppas 2024-03-10 21:43:06 -07:00
Steve Langasek
07d3158ade Don't do expensive check of group membership on rescore, just handle exceptions
This could do with some further refactoring, but that will probably be
postponed until a decision is made about dropping the non-batch mode
2024-03-10 16:51:15 -07:00
Steve Langasek
d5faa9b133 Proper handling of getDevelopmentSeries() 2024-03-10 15:48:16 -07:00
Steve Langasek
9e710a3d66 Always use exact match when looking for source packages by name 2024-03-10 15:46:22 -07:00
Steve Langasek
010af53d7c Add a -A archive option to act on ppas as well.
This results in a major refactor of the code to use launchpadlib directly
instead of the ubuntutools.lp.lpapicache module in ubuntu-dev-tools, which is
idiosyncratic and does not expose the full Launchpad API. It was easier to
rewrite this to use the standard library.
2024-03-10 14:35:47 -07:00
Steve Langasek
0bef4d7352 ubuntu-build: fix licensing.
Canonical licensing policy has never been GPLv3+, only GPLv3.
2024-03-10 13:36:30 -07:00
Steve Langasek
688202a7cf ubuntu-build: update copyright 2024-03-10 13:35:56 -07:00
Steve Langasek
691c1381db ubuntu-build: support retrying builds in states other than failed-to-build 2024-03-10 01:45:20 -08:00
Steve Langasek
f01502bda2 ubuntu-build: make the --arch option top-level
This gets rid of the fugly --arch2 option
2024-03-08 18:53:20 -08:00
Steve Langasek
42f8e5c0d2 ubuntu-build: in batch mode, print a count of packages retried 2024-03-08 18:38:59 -08:00
Steve Langasek
bb8a9f7394 ubuntu-build: support --batch with no package names to retry all 2024-03-08 16:43:30 -08:00
Gianfranco Costamagna
a058c716b9 Upload to sid 2024-02-29 22:49:37 +01:00
Chris Peterson
e64fe7e212 Update Changelog 2024-02-29 13:08:13 -08:00
Chris Peterson
f07d3df40c running-autopkgtests: make running-autopkgtests available
Previously running-autopkgtests was added to the source but
wasn't correctly added to the scripts in setup.py, so it wasn't
actually available in the installed package. This also adds the
script to the package description.
2024-02-29 13:06:12 -08:00
Gianfranco Costamagna
f73f2c1df1 Upload to sid 2024-02-15 18:09:28 +01:00
Gianfranco Costamagna
268d082226 Update changelog 2024-02-15 18:06:49 +01:00
Athos Ribeiro
6bc59d789e Log syncpackage LP auth errors before halting 2024-02-15 18:06:38 +01:00
Logan Rosen
9a4cc312f4 Don't rely on debootstrap for validating Ubuntu distro 2024-02-15 17:51:35 +01:00
Ying-Chun Liu (PaulLiu)
ffc787b454 Drop qemu-debootstrap
qemu-debootstrap has been deprecated for a while, and in newer qemu
releases the command has been removed entirely. We can use debootstrap directly.

Signed-off-by: Ying-Chun Liu (PaulLiu) <paulliu@debian.org>
2024-02-15 17:49:59 +01:00
Chris Peterson
bce1ef88c5 running-autopkgtests: use f-strings 2024-02-14 15:19:48 -08:00
Chris Peterson
a9eb902b83 running-autopkgtests: Changelog entry, ArgumentParser, refactor, tests
Created a new changelog entry to include addition of the running-autopkgtests
script. This includes a refactor of the original script resulting in a new
module in ubuntutools, test cases, and the addition an argument parser to
allow printing just the queued tests, just the running tests, or both
(default).
2024-02-14 15:19:43 -08:00
Chris Peterson
cb7464cf61 Add running-autopkgtests script
This script will print out all of the running and queued autopkgtests.
Originally this was a script titled lp-test-isrunning
from lp:~ubuntu-server/+git/ubuntu-helpers.
2024-02-14 14:55:33 -08:00
Simon Quigley
19f1df1054 Upload to Unstable 2024-01-29 10:03:47 -06:00
Simon Quigley
7f64dde12c Add a changelog entry for Steve 2024-01-29 10:03:19 -06:00
Simon Quigley
c2539c6787 Add a changelog entry for adding myself to Uploaders. 2024-01-29 09:59:30 -06:00
Simon Quigley
fd885ec239 Merge remote-tracking branch 'vorlon/pm-helper' 2024-01-29 09:57:52 -06:00
Simon Quigley
abbc56e185 Add my name to Uploaders.
To be fair, the last four uploads should have started with "Team upload." Whoops.
2024-01-10 20:21:06 -06:00
Simon Quigley
a2176110f0 Upload to Unstable. 2024-01-10 20:04:15 -06:00
Simon Quigley
a5185e4612 Add proper support for virtual packages in check-mir, basing the determination solely off of binary packages. This is not expected to be a typical case. 2024-01-10 20:03:44 -06:00
Simon Quigley
e90ceaf26b In check-mir, ignore debhelper-compat when checking the build dependencies. This is expected to be a build dependency of all packages, so warning about it in any way is surely a red herring. 2024-01-10 19:56:06 -06:00
Simon Quigley
47fd5d7cca Upload to Unstable. 2023-10-03 14:01:44 -05:00
Simon Quigley
2f396fe549 When using pull-*-source to grab a package which already has a defined Vcs- field, display the exact same warning message apt source does. 2023-10-03 14:01:19 -05:00
Gianfranco Costamagna
5bda35f6b4 Update also syncpackage help 2023-08-25 20:04:12 +02:00
Simon Quigley
db916653cd Update the manpage for syncpackage to reflect the ability to sync multiple packages at once. 2023-08-10 14:39:01 -05:00
Simon Quigley
784e7814e9 Allow the user to sync multiple packages at one time (LP: #1756748). 2023-08-04 14:38:46 -05:00
Simon Quigley
bed2dc470d Add support for the non-free-firmware components in all tools already referencing non-free. 2023-07-26 13:04:12 -05:00
Gianfranco Costamagna
414bc76b50 Upload to Debian 2023-07-08 08:43:09 +02:00
Gianfranco Costamagna
6f0caf1fc0 ubuntu-build: For some reason, you now need to be authenticated before using the "PersonTeam" class features.
Do it at the beginning instead of replicating the same code inside the tool itself.

This fixes e.g. this failure:

./ubuntu-build --batch --retry morsmall
Traceback (most recent call last):
  File "/tmp/ubuntu-dev-tools/ubuntu-build", line 317, in <module>
    main()
  File "/tmp/ubuntu-dev-tools/ubuntu-build", line 289, in main
    can_retry = args.retry and me.canUploadPackage(
AttributeError: 'NoneType' object has no attribute 'canUploadPackage'
2023-07-07 19:23:41 +02:00
Robie Basak
4bcc55372a Changelog for 0.193ubuntu5 2023-07-06 11:28:21 +01:00
Robie Basak
232a73de31 ubuntutools/misc: swap iter_content for raw stream
This is a partial revert of 1e20363.

When downloading a .diff.gz source package file, we do expect it to be
written to disk still compressed. If we were to uncompress it, then we
would get a size mismatch and even if we were to ignore that, we'd get a
hash mismatch.

On the other hand when downloading a changes file we need to make sure
that is written to disk uncompressed.

To make this work in both cases we can ask the HTTP server for no
special content encoding using "Accept-Encoding: identity". This is what
wget requests, for example. Then we can write the output to the file
without performing any decoding at our end by using the raw response
object again.

This fixes both cases.

LP: #2025748
2023-07-06 11:28:21 +01:00
Steve Langasek
9aab0135a2 Add an initial manpage for pm-helper 2023-06-14 17:01:36 -07:00
Steve Langasek
23539f28b1 Update license header 2023-06-14 16:52:56 -07:00
Steve Langasek
4a09d23db6 There is no dry-run mode 2023-06-14 16:29:43 -07:00
Steve Langasek
534cd254f4 typo update-excuses->update-excuse 2023-06-14 15:14:14 -07:00
Steve Langasek
29c3fa98bc Use a context manager for lzma 2023-06-14 15:13:46 -07:00
Steve Langasek
7c9c7f2890 Sensible behavior when called for a non-existent package name 2023-06-14 15:12:57 -07:00
Steve Langasek
739279da3f More pythonic function name (thanks, Bryce) 2023-06-14 14:51:15 -07:00
Steve Langasek
7c11832ee0 Sensible behavior when a requested package isn't in -proposed. 2023-06-14 14:01:53 -07:00
Steve Langasek
f5512846d6 Code refactor; thanks, Bryce 2023-06-14 13:59:25 -07:00
Steve Langasek
9e0dff4461 move from OptionParser to ArgumentParser 2023-06-14 13:57:14 -07:00
Steve Langasek
7129e6e27a Fix imports 2023-06-13 13:57:47 -07:00
Steve Langasek
79d30a9bfc Add dependency on dateutil 2023-06-13 13:52:18 -07:00
Steve Langasek
2c6a8b5451 Initial implementation of pm-helper
This is a tool for making it easier to identify the next thing to work on
for proposed-migration.
2023-06-13 13:48:28 -07:00
Steve Langasek
ad014685ea Import utils.py from ubuntu-archive-tools 2023-06-13 13:47:15 -07:00
Steve Langasek
ff1c95e2c0 Remove references to architectures not supported in any active Ubuntu release. 2023-05-30 21:05:56 -07:00
Steve Langasek
89e788bf48 Remove references to deprecated http://people.canonical.com/~ubuntu-archive. 2023-05-30 19:37:11 -07:00
Steve Langasek
a000e9db5e releasing package ubuntu-dev-tools version 0.193ubuntu4 2023-05-30 10:02:47 -07:00
Steve Langasek
83158d24d9 Merge staged changes 2023-05-30 10:00:57 -07:00
Steve Langasek
6e6e1f1e1a Excise all references to cdbs (including in test cases) 2023-05-30 10:00:17 -07:00
Steve Langasek
c7a7767339 Fix a typo introduced in the last upload that made mk-sbuild fail unconditionally. LP: #2017177. 2023-05-30 09:55:06 -07:00
Steve Langasek
ac2f980e0f Remove references to ftpmaster.internal. When this name is resolvable but firewalled, syncpackage hangs; and these are tools for developers, not for running in an automated context in the DCs where ftpmaster.internal is reachable. 2023-04-12 17:59:40 -07:00
Steve Langasek
ccab82e054 releasing package ubuntu-dev-tools version 0.193ubuntu1 2023-04-12 09:45:23 -07:00
Steve Langasek
2e4e8b35b2 Merge branch 'mk-sbuild-not-automatic' 2023-04-12 09:45:15 -07:00
Steve Langasek
53fcd577e8 We no longer need to run sed 2023-04-12 09:41:52 -07:00
Steve Langasek
8430d445d8 Align with the Launchpad buildd implementation, per review comments 2023-04-12 09:28:41 -07:00
Benjamin Drung
c8a757eb07 Format Python code with black 23.1
Signed-off-by: Benjamin Drung <benjamin.drung@canonical.com>
2023-04-04 12:11:36 +02:00
Nathan Rennie-Waldock
66a2773c1c backportpackage: Fix incorrectly reporting unknown distribution for Ubuntu
Fix incorrectly reporting unknown distribution for Ubuntu after commit
7fc6788b35d32aeb96c7cf81303853d4f31028d1 ("backportpackage: fix
automatic selection of the target release").

LP: #2013237
Signed-off-by: Nathan Rennie-Waldock <nathan.renniewaldock@gmail.com>
2023-04-04 11:50:41 +02:00
Stefano Rivera
17d2770451 Upload to unstable 2023-02-25 13:20:04 -04:00
Stefano Rivera
3136541ca6 Don't run linters at build time, or in autopkgtests. (Closes: #1031436). 2023-02-25 12:52:39 -04:00
Benjamin Drung
f3a0182e1a Release ubuntu-dev-tools 0.192
Signed-off-by: Benjamin Drung <benjamin.drung@canonical.com>
2023-02-01 12:45:31 +01:00
Benjamin Drung
6498a13f18 Drop unneeded X-Python3-Version from d/control
lintian says: "Your sources request a specific set of Python versions
via the control field X-Python3-Version but all declared autopkgtests
exercise all supported Python versions by using the command py3versions
--supported."

Signed-off-by: Benjamin Drung <benjamin.drung@canonical.com>
2023-02-01 12:43:00 +01:00
Benjamin Drung
d2debf9ed9 Update year in debian/copyright
Signed-off-by: Benjamin Drung <benjamin.drung@canonical.com>
2023-02-01 12:40:33 +01:00
Benjamin Drung
a11cb1f630 Bump Standards-Version to 4.6.2
Signed-off-by: Benjamin Drung <benjamin.drung@canonical.com>
2023-02-01 12:39:50 +01:00
Benjamin Drung
34578e6a1e Enable more pylint checks
Signed-off-by: Benjamin Drung <benjamin.drung@canonical.com>
2023-02-01 12:07:19 +01:00
Benjamin Drung
21784052ba test: Fix deprecated return value for test case
```
ubuntutools/test/test_archive.py::LocalSourcePackageTestCase::test_pull
  /usr/lib/python3.11/unittest/case.py:678: DeprecationWarning: It is deprecated to return a value that is not None from a test case (<bound method LocalSourcePackageTestCase.test_pull of <ubuntutools.test.test_archive.LocalSourcePackageTestCase testMethod=test_pull>>)
    return self.run(*args, **kwds)
```

`test_pull` does not need to be run directly. Make it private.

Signed-off-by: Benjamin Drung <benjamin.drung@canonical.com>
2023-01-31 17:39:12 +01:00
Benjamin Drung
aa556af89d Use f-strings
pylint complains about C0209: Formatting a regular string which could be
a f-string (consider-using-f-string)

Signed-off-by: Benjamin Drung <benjamin.drung@canonical.com>
2023-01-31 19:32:58 +01:00
Benjamin Drung
069a6926c0 Implement conventions found by pylint
Signed-off-by: Benjamin Drung <benjamin.drung@canonical.com>
2023-01-31 17:28:33 +01:00
Benjamin Drung
444b319c12 Implement refactorings found by pylint
Signed-off-by: Benjamin Drung <benjamin.drung@canonical.com>
2023-01-31 16:58:24 +01:00
Benjamin Drung
4449cf2437 Fix warnings found by pylint
Signed-off-by: Benjamin Drung <benjamin.drung@canonical.com>
2023-01-31 15:51:29 +01:00
Benjamin Drung
9fa29f6ad5 fix(reverse-depends): Restore field titles format
Commit 90e8fe81e1b2610e352c82c0301076ffc7da5ac0 renamed `print_field` to
`log_field`, but changed the `print_field` call with `Logger.info`.
Therefore the line with `=` was lost.

Restore the previous formatting.

Signed-off-by: Benjamin Drung <benjamin.drung@canonical.com>
2023-01-31 14:42:22 +01:00
Benjamin Drung
a160def2ab fix(requestbackport): Remove useless loop from locate_package
Commit 0f3d2fed2a4ed67b90b5d49aab25ca2bda5d9d37 removed the difference
between the two loop iterations in `locate_package`. So drop the useless
second iteration.

Signed-off-by: Benjamin Drung <benjamin.drung@canonical.com>
2023-01-31 14:35:12 +01:00
Benjamin Drung
909d945af4 Replace deprecated optparse with argparse
Signed-off-by: Benjamin Drung <benjamin.drung@canonical.com>
2023-01-31 13:33:18 +01:00
Benjamin Drung
f6fde2e217 fix: Use lazy % formatting in logging functions
pylint complains about W1201: Use lazy % formatting in logging functions
(logging-not-lazy) and W1203: Use lazy % formatting in logging functions
(logging-fstring-interpolation).

Signed-off-by: Benjamin Drung <benjamin.drung@canonical.com>
2023-01-31 11:13:07 +01:00
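The difference between eager and lazy formatting in logging calls, in a short sketch (logger name and messages are illustrative):

```python
import logging

logger = logging.getLogger("example")
logger.setLevel(logging.INFO)  # DEBUG records are discarded

package, version = "ubuntu-dev-tools", "0.203"

# W1203 pattern: the f-string is built eagerly, even though the record
# is thrown away.
logger.debug(f"uploading {package} {version}")

# Lazy form: the %-arguments are only interpolated if the record is emitted.
logger.debug("uploading %s %s", package, version)

class Expensive:
    def __str__(self):
        raise RuntimeError("should never be formatted at DEBUG level here")

# With lazy formatting this never raises: __str__ is never called,
# because the DEBUG record is dropped before formatting.
logger.debug("value: %s", Expensive())
```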
Benjamin Drung
17bed46ffb feat: Add some type hints
Add some type hints to satisfy mypy.

Signed-off-by: Benjamin Drung <benjamin.drung@canonical.com>
2023-01-31 10:35:22 +01:00
Benjamin Drung
72add78e9d Fix errors found by pylint
Signed-off-by: Benjamin Drung <benjamin.drung@canonical.com>
2023-01-31 10:19:24 +01:00
Benjamin Drung
ab64467f33 Run pylint during package build again
Commit ae74f71a1e9d4be043162b19d23f2d44c964c771 removed the pylint unit
test, on the grounds that unit tests are not needed just to run flake8 or pylint.

Since pylint is useful, add it back, but this time call it directly and
not embed it into a unit test.

Signed-off-by: Benjamin Drung <benjamin.drung@canonical.com>
2023-01-31 00:05:15 +01:00
Benjamin Drung
b1bc7e1cdc Address pylint complaints
Signed-off-by: Benjamin Drung <benjamin.drung@canonical.com>
2023-01-30 23:10:31 +01:00
Benjamin Drung
8692bc2b1c refactor(setup.py): Introduce get_debian_version
Move getting the Debian package version into a separate function and
fail in case it cannot find it or fails parsing it.

Signed-off-by: Benjamin Drung <benjamin.drung@canonical.com>
2023-01-30 21:56:37 +01:00
Benjamin Drung
a685368ae9 Run isort import sorter during package build
Signed-off-by: Benjamin Drung <benjamin.drung@canonical.com>
2023-01-30 21:34:24 +01:00
Benjamin Drung
4e27045f49 style: Sort Python imports with isort
```
isort -l 99 --profile=black .
```

Signed-off-by: Benjamin Drung <benjamin.drung@canonical.com>
2023-01-30 21:28:47 +01:00
Benjamin Drung
db0e091e44 Run black code formatter during package build
Signed-off-by: Benjamin Drung <benjamin.drung@canonical.com>
2023-01-30 19:48:49 +01:00
Benjamin Drung
3354b526b5 style: Format Python code with black
```
PYTHON_SCRIPTS=$(grep -l -r '^#! */usr/bin/python3$' .)
black -C -l 99 . $PYTHON_SCRIPTS
```

Signed-off-by: Benjamin Drung <benjamin.drung@canonical.com>
2023-01-30 19:45:36 +01:00
Benjamin Drung
79d24c9df1 fix: Check Python scripts with flake8 again
Commit ae74f71a1e9d4be043162b19d23f2d44c964c771 removed the flake8
unittest and commit 3428a65b1cd644445f55ad8ae65ece5f73d7acb5 added
running flake8 again, but only for files named `*.py`.

Check also all Python scripts with a Python shebang.

Signed-off-by: Benjamin Drung <benjamin.drung@canonical.com>
2023-01-30 19:29:51 +01:00
Benjamin Drung
932166484b Fix issues found by flake8 on the Python scripts
Signed-off-by: Benjamin Drung <benjamin.drung@canonical.com>
2023-01-30 19:29:30 +01:00
Benjamin Drung
bd770fa6b1 test: Do not run flake8 in verbose mode
The verbose output of flake8 is not interesting and just clutters the
output.

Signed-off-by: Benjamin Drung <benjamin.drung@canonical.com>
2023-01-30 14:11:09 +01:00
Benjamin Drung
3d54a17403 refactor: Move linter checks into run-linters script
Signed-off-by: Benjamin Drung <benjamin.drung@canonical.com>
2023-01-30 14:10:02 +01:00
Benjamin Drung
3bdb827516 fix: Use PEP440 compliant version in setup.py
Versions like `0.176ubuntu20.04.1` in Ubuntu are clearly not compliant
with https://peps.python.org/pep-0440/. With setuptools 66, the versions
of all packages visible in the Python environment *must* obey PEP440.

Bug: https://launchpad.net/bugs/1991606
Signed-off-by: Benjamin Drung <benjamin.drung@canonical.com>
2023-01-30 14:07:23 +01:00
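The incompatibility can be demonstrated with the third-party `packaging` library (which modern setuptools builds on); `is_pep440` is an illustrative helper, not part of the tool:

```python
from packaging.version import InvalidVersion, Version

def is_pep440(version: str) -> bool:
    """Return True if the version string parses as a PEP 440 version."""
    try:
        Version(version)
    except InvalidVersion:
        return False
    return True
```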
Mattia Rizzolo
0d94b5e747
document the last commit
Signed-off-by: Mattia Rizzolo <mattia@debian.org>
2023-01-15 18:16:07 +01:00
Krytarik Raido
0f3d2fed2a
requestbackport: Adapt to new backports policy (LP: #1959115)
As documented on <https://wiki.ubuntu.com/UbuntuBackports>

Template update done by Unit 193.

Signed-off-by: Mattia Rizzolo <mattia@debian.org>
2023-01-15 18:14:48 +01:00
Mattia Rizzolo
844d6d942c
Merge branch 'mk-sbuild' of git+ssh://git.launchpad.net/~myamada/ubuntu-dev-tools
Closes: #1001832
LP: #1955116
MR: https://code.launchpad.net/~myamada/ubuntu-dev-tools/+git/ubuntu-dev-tools/+merge/435734
Signed-off-by: Mattia Rizzolo <mattia@debian.org>
2023-01-14 18:49:29 +01:00
Mattia Rizzolo
ae43fd1929
document the previous changes
Signed-off-by: Mattia Rizzolo <mattia@debian.org>
2023-01-14 18:46:50 +01:00
Masahiro Yamada
69ac109cdb mk-sbuild: fix security update repository for Debian bullseye and later
If I run "apt-get update" in the bullseye chroot, I get the following error:

  Err:4 http://security.debian.org bullseye-updates Release
    404  Not Found [IP: 2a04:4e42:600::644 80]

It looks like the directory path was changed since bullseye.

buster:

    deb https://security.debian.org/debian-security buster/updates main

bullseye:

    deb https://security.debian.org/debian-security bullseye-security main

Signed-off-by: Masahiro Yamada <masahiro.yamada@canonical.com>
2023-01-13 18:53:17 +09:00
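The suite-naming change can be sketched in shell (the release names and variable names are illustrative, not mk-sbuild's exact code, which uses a version comparison rather than a case statement):

```shell
# Debian renamed the security suite starting with bullseye.
dist="bullseye"
case "$dist" in
    stretch|buster)
        suite="${dist}/updates" ;;    # pre-bullseye layout
    *)
        suite="${dist}-security" ;;   # bullseye and later
esac
echo "deb https://security.debian.org/debian-security ${suite} main"
```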
Masahiro Yamada
9f2a53c166 mk-sbuild: add debian_dist_ge()
Add debian_dist_ge(), which will be used by the next commit.

To avoid code duplication, move the common part to dist_ge().

Signed-off-by: Masahiro Yamada <masahiro.yamada@canonical.com>
2023-01-13 18:34:01 +09:00
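The idea behind a release-comparison helper like debian_dist_ge() can be sketched as mapping release names to ordered indices and comparing positions. This is an editor's reconstruction under that assumption; the release table and function bodies are illustrative, not mk-sbuild's actual code:

```shell
dist_index() {
    # $1: space-separated ordered release list, $2: release name
    i=0
    for name in $1; do
        i=$((i + 1))
        [ "$name" = "$2" ] && { echo "$i"; return; }
    done
    echo 0
}

debian_dist_ge() {
    # true if release $1 is the same as or newer than release $2
    releases="stretch buster bullseye bookworm trixie"
    a=$(dist_index "$releases" "$1")
    b=$(dist_index "$releases" "$2")
    [ "$a" -ge "$b" ] && [ "$b" -gt 0 ]
}
```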
Steve Langasek
a69c40d403 Set up preferences for -proposed with NotAutomatic: yes
As of lunar, Ubuntu sets NotAutomatic: yes for its -proposed pockets.  For
sbuild chroots, we want to continue to explicitly install from -proposed by
default; so override with apt preferences to get the correct behavior.
2022-11-16 17:49:13 -08:00
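The override described above amounts to an apt preferences entry along these lines (a sketch; the exact file and priority mk-sbuild writes may differ):

```
# /etc/apt/preferences.d/proposed (illustrative)
Package: *
Pin: release a=lunar-proposed
Pin-Priority: 500
```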
Benjamin Drung
c1e4b14a98 Demote bzr/brz from Recommends to Suggests
Nowadays git is used nearly everywhere, so demoting bzr/brz to
Suggests is the right thing to do.

Bug-Debian: https://bugs.debian.org/940531
Signed-off-by: Benjamin Drung <benjamin.drung@canonical.com>
2022-11-16 18:49:42 +01:00
Benjamin Drung
096d5612e7 sponsor-patch: Use --skip-patches when extracting source packages
Use `--skip-patches` when extracting source packages with `dpkg-source`.
`--no-preparation` is a source package build option and `--skip-patches`
is the correct extract option.

Signed-off-by: Benjamin Drung <benjamin.drung@canonical.com>
2022-11-16 18:37:56 +01:00
Benjamin Drung
b510dbd91e sponsor-patch: Ignore exit code 1 of debdiff call
sponsor-patch calls `debdiff` which exits with 1 if there are
differences. So accept exit codes 0 and 1 as expected.

Signed-off-by: Benjamin Drung <benjamin.drung@canonical.com>
2022-11-15 16:43:27 +01:00
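Tolerating debdiff's exit status can be sketched like this (`run_allowing_diff` is an illustrative helper name): 0 means "no differences", 1 means "differences found", and anything else is a real failure.

```python
import subprocess

def run_allowing_diff(cmd):
    """Run a command such as ["debdiff", old_dsc, new_dsc] and accept
    exit codes 0 and 1 as success; raise for anything else."""
    result = subprocess.run(cmd, capture_output=True, text=True)
    if result.returncode not in (0, 1):
        raise subprocess.CalledProcessError(
            result.returncode, result.args, result.stdout, result.stderr)
    return result.stdout
```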
Mattia Rizzolo
803949ed8b
also include a lp bug number there
Signed-off-by: Mattia Rizzolo <mattia@debian.org>
2022-10-11 14:42:08 +02:00
Mattia Rizzolo
e219eaa5fc
Open changelog for the next release.
Signed-off-by: Mattia Rizzolo <mattia@debian.org>
2022-10-11 13:58:00 +02:00
Mattia Rizzolo
60ee986014
Release 0.191
Signed-off-by: Mattia Rizzolo <mattia@debian.org>
2022-10-11 13:56:13 +02:00
Mattia Rizzolo
dabe475067
ubuntutools/archive.py: fix crash in SourcePackage()._source_urls()
Fix operation of SourcePackage._source_urls() (as used, for example, in
SourcePackage.pull() called by backportpackage) to also work when the
class is instantiated with a URL as .dsc.

This is a regression caused by 1b12d8b4e3315de3bf417b40a3c66279f309d72c
(first in v0.184) that moved from os.path.join() to Pathlib, but
os.path.join() was also used to join URLs.

Thanks: Unit 193 for the initial patch.
Signed-off-by: Mattia Rizzolo <mattia@debian.org>
2022-09-29 10:34:51 +02:00
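Why the move to Pathlib regressed URL handling can be shown in a few lines (the URL is illustrative): `os.path.join` happens to work for URLs, but a path class collapses the double slash in `https://`.

```python
import os
from pathlib import PurePosixPath
from urllib.parse import urljoin

base = "https://launchpad.net/ubuntu/+archive/primary/+files"
name = "foo_1.0.dsc"

os_join = os.path.join(base, name)           # worked for URLs by accident
path_join = str(PurePosixPath(base) / name)  # "https://" becomes "https:/"

# A URL-aware join avoids relying on os.path semantics at all:
url_join = urljoin(base + "/", name)
```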
Mattia Rizzolo
0a9e18ed91
document the previous change
Signed-off-by: Mattia Rizzolo <mattia@debian.org>
2022-09-29 10:32:07 +02:00
Stefano Rivera
7859889438 backportpackage: Add support for lsb-release-minimal, which doesn't have a Python module, thanks Gioele Barabucci. (Closes: #1020901) 2022-09-28 11:40:33 +02:00
Gioele Barabucci
a3c87e78aa backportpackage: Run lsb_release as command if the Python module is not available 2022-09-28 11:36:22 +02:00
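The fallback these two commits describe can be sketched as: prefer the `lsb_release` Python module, and shell out to the `lsb_release` command when only lsb-release-minimal is installed (`get_distro_codename` is an illustrative name, not backportpackage's own helper):

```python
import subprocess

def get_distro_codename():
    """Return the distribution codename, via the Python module when
    available, otherwise via the lsb_release command."""
    try:
        import lsb_release
        return lsb_release.get_distro_information().get("CODENAME", "")
    except ImportError:
        return subprocess.run(
            ["lsb_release", "--codename", "--short"],
            capture_output=True, text=True, check=True,
        ).stdout.strip()
```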
Mattia Rizzolo
05af489f64
Merge branch 'lp1984113' of git+ssh://git.launchpad.net/~ddstreet/ubuntu-dev-tools
MR: https://code.launchpad.net/~ddstreet/ubuntu-dev-tools/+git/ubuntu-dev-tools/+merge/428101
Signed-off-by: Mattia Rizzolo <mattia@debian.org>
2022-08-22 17:57:07 +02:00
Mattia Rizzolo
d5fdc00396
open changelog for the next release
Signed-off-by: Mattia Rizzolo <mattia@debian.org>
2022-08-22 17:56:02 +02:00
Dan Streetman
7d278cde21 ubuntu-build: use correct exception from LP login failure 2022-08-09 12:15:09 -04:00
Dan Streetman
ad402231db ubuntu-build: explicitly login to LP
LP: #1984113
2022-08-09 12:14:56 -04:00
Dan Streetman
562e6b13cd lpapicache: force lp access on login to workaround possibly invalid cached creds 2022-08-09 12:08:50 -04:00
Dan Streetman
9c1561ff26 lpapicache: remove try-except around login that only logs the error and then re-raises 2022-08-09 12:07:31 -04:00
Benjamin Drung
06a04f642f Release ubuntu-dev-tools 0.190
Signed-off-by: Benjamin Drung <bdrung@ubuntu.com>
2022-06-16 10:55:29 +02:00
Benjamin Drung
8f0005ce1d Bump Standards-Version to 4.6.1
Signed-off-by: Benjamin Drung <bdrung@ubuntu.com>
2022-06-16 10:54:41 +02:00
Benjamin Drung
51ebfb21d3 Add missing files to debian/copyright
Signed-off-by: Benjamin Drung <bdrung@ubuntu.com>
2022-06-16 10:52:51 +02:00
Benjamin Drung
f83161dcc2 Wrap long line in setup-packaging-environment.1
Signed-off-by: Benjamin Drung <bdrung@ubuntu.com>
2022-06-16 10:24:32 +02:00
Benjamin Drung
bf5796c69e mk-sbuild: Rename SCRIPT to DEBOOTSTRAP_SCRIPT
Signed-off-by: Benjamin Drung <bdrung@ubuntu.com>
2022-06-16 10:20:22 +02:00
Gianfranco Costamagna
214da052b2 pbuilder-dist: fix typo kernal/kernel 2022-06-13 10:09:24 +02:00
Mattia Rizzolo
b9c9a21696
Merge branch 'unknown-ubuntu-script' of git+ssh://git.launchpad.net/~xnox/ubuntu-dev-tools
MR: https://code.launchpad.net/~xnox/ubuntu-dev-tools/+git/ubuntu-dev-tools/+merge/420623
Signed-off-by: Mattia Rizzolo <mattia@debian.org>
2022-05-01 14:40:23 +02:00
Dimitri John Ledkov
1f3e4a5ad7
mk-sbuild: build Ubuntu chroots for unknown new releases
Signed-off-by: Dimitri John Ledkov <dimitri.ledkov@canonical.com>
2022-04-27 14:18:15 +01:00
Mattia Rizzolo
835fe258ab
open changelog for the next release
Signed-off-by: Mattia Rizzolo <mattia@debian.org>
2022-04-04 15:05:49 +02:00
Mattia Rizzolo
5618358870
Release 0.189
Signed-off-by: Mattia Rizzolo <mattia@debian.org>
2022-04-04 15:03:39 +02:00
Mattia Rizzolo
d1de55b320
Merge branch 'master' of git+ssh://git.launchpad.net/~tobhe/ubuntu-dev-tools
MR: https://code.launchpad.net/~tobhe/ubuntu-dev-tools/+git/ubuntu-dev-tools/+merge/416458
Signed-off-by: Mattia Rizzolo <mattia@debian.org>
2022-04-04 14:56:38 +02:00
Matthias Klose
87858a4387 mk-sbuild: don't require pkg-config-$target_tuple
The pkg-config-$target_tuple packages are no longer built by the source
package gcc-defaults-ports. Install pkg-config instead (LP: #1966881).
2022-03-30 15:37:08 +02:00
Tobias Heider
cb48d71056 mk-sbuild: document SCHROOT_TYPE zfs in the manpage 2022-03-07 14:08:23 +01:00
Mattia Rizzolo
b2d259b415
init changelog for the next release
Signed-off-by: Mattia Rizzolo <mattia@debian.org>
2022-02-07 16:32:07 +01:00
Mattia Rizzolo
533b9535aa
Changelog for 0.188
Signed-off-by: Mattia Rizzolo <mattia@debian.org>
2022-02-07 16:30:14 +01:00
Mattia Rizzolo
7d3ea739a2
close a bug in the changelog
Signed-off-by: Mattia Rizzolo <mattia@debian.org>
2022-02-07 16:19:47 +01:00
Mattia Rizzolo
c53750694f
document the last change
Signed-off-by: Mattia Rizzolo <mattia@debian.org>
2022-01-20 15:41:55 +01:00
Graham Inggs
0dde3262d1
ubuntutools/lp: Python 3.10 compatibility 2022-01-20 15:38:03 +01:00
Mattia Rizzolo
3a903ca628
archive.py: support python 3.6
this is needed for the backports to bionic

Signed-off-by: Mattia Rizzolo <mattia@debian.org>
2021-12-09 17:56:50 +01:00
Mattia Rizzolo
c7058559c5
Changelog for 0.187
Signed-off-by: Mattia Rizzolo <mattia@debian.org>
2021-12-05 15:58:23 +01:00
Mattia Rizzolo
13123c51c6
Merge branch 'cleanup' of git+ssh://git.launchpad.net/~ddstreet/ubuntu-dev-tools
MR: https://code.launchpad.net/~ddstreet/ubuntu-dev-tools/+git/ubuntu-dev-tools/+merge/412242
Signed-off-by: Mattia Rizzolo <mattia@debian.org>
2021-12-05 15:58:06 +01:00
Mattia Rizzolo
009b79224f
backportpackage: also close a debian bug
Signed-off-by: Mattia Rizzolo <mattia@debian.org>
2021-12-05 15:49:53 +01:00
Mattia Rizzolo
7fc6788b35
backportpackage: fix automatic selection of the target release.
Signed-off-by: Mattia Rizzolo <mattia@debian.org>
2021-12-05 15:49:31 +01:00
Mattia Rizzolo
3ef7c4a569
backportpackage: add a full stop after the changelog line :3
Signed-off-by: Mattia Rizzolo <mattia@debian.org>
2021-12-05 15:44:02 +01:00
Mattia Rizzolo
cd4d717551
backportpackage: change the ubuntu backports version following the new policy from the Backporters team
Thanks: Unit 193 for the patch
Signed-off-by: Mattia Rizzolo <mattia@debian.org>
2021-12-05 15:43:51 +01:00
Mattia Rizzolo
d903160215
backportpackage: slight refactor for the debian versioning handler
Signed-off-by: Mattia Rizzolo <mattia@debian.org>
2021-12-05 15:42:11 +01:00
Dan Streetman
8fe22fbbb6 Remove hugdaylist script
The last email announcing a 'hug day' appears to have been sent over
5 years ago, and the hugdaylist script no longer seems to work
correctly anyway.

This also removes ubuntutools/lp/libsupport, which has no functions
used by anything else.
2021-11-22 13:00:52 -05:00
Dan Streetman
06038060b0 update changelog 2021-11-19 08:22:07 -05:00
Dan Streetman
4d72d184db pullpkg: change pull_upload_queue params into specific keyword-only params 2021-11-19 08:11:31 -05:00
Dan Streetman
20261960f6 pullpkg: unpack downloaded src from upload queue 2021-11-19 08:11:31 -05:00
Dan Streetman
d05e023dfe ubuntutools: don't sys.exit if unpack fails, just log it 2021-11-19 08:11:31 -05:00
Dan Streetman
de295fe524 ubuntutools/misc: return dst Path object from download() 2021-11-19 08:11:31 -05:00
Dan Streetman
1e2036399e ubuntutools/misc: use iter_content instead of raw stream access
Reading the raw stream doesn't decode files that are content-encoded,
which is the case for the 'changes' file from Launchpad, so it was saved
to disk gzipped, which isn't what is expected.

Using the python requests iter_content method instead uses the
built-in stream decoding that the requests library provides, and
saves the file uncompressed/unencoded.

Note that since the encoded Content-Length won't match the resulting
unencoded actual length of data we save to file, this also turns off
the progress bar for any files that have Content-Encoding.
2021-11-19 08:11:31 -05:00
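A sketch of what such a download loop looks like (the function name and signature here are illustrative, not the exact ubuntutools/misc code):

```python
import requests  # third-party HTTP library used by the new download code


def download(url, dest, blocksize=4096):
    """Stream url to dest using iter_content(), which applies the stream
    decoding (e.g. gzip Content-Encoding) that reading resp.raw would skip."""
    with requests.get(url, stream=True, timeout=60) as resp:
        resp.raise_for_status()
        # The advertised Content-Length is the *encoded* size, so a progress
        # bar can only be trusted when no Content-Encoding was applied:
        encoded = "Content-Encoding" in resp.headers
        with open(dest, "wb") as f:
            for chunk in resp.iter_content(chunk_size=blocksize):
                f.write(chunk)
    return dest, encoded
```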
Dan Streetman
85125e3c90 ubuntutools/misc: allow specifying blocksize to download methods 2021-11-19 08:11:15 -05:00
Dan Streetman
cfa45994d0 ubuntutools/misc: create helper class to display download progress bar 2021-11-19 08:09:08 -05:00
Mattia Rizzolo
a74a49fb81
backportpackage: Support backporting to Debian releases. LP: #974132
Signed-off-by: Mattia Rizzolo <mattia@debian.org>
2021-10-31 16:47:43 +01:00
Peter Pentchev
74867c90f4
mk-sbuild: Fix a check for TARGET_ARCH in a message.
Closes: #968316
Signed-off-by: Mattia Rizzolo <mattia@debian.org>
2021-10-31 16:34:00 +01:00
Mattia Rizzolo
b904993e05
Merge branch 'mk-sbuild-zfs' of git+ssh://git.launchpad.net/~paride/ubuntu-dev-tools
MR: https://code.launchpad.net/~paride/ubuntu-dev-tools/+git/ubuntu-dev-tools/+merge/409346
Signed-off-by: Mattia Rizzolo <mattia@debian.org>
2021-10-31 16:31:39 +01:00
Mattia Rizzolo
000d3c1c2d
open changelog
Signed-off-by: Mattia Rizzolo <mattia@debian.org>
2021-10-31 16:25:12 +01:00
Stefano Rivera
9a44175a17 Upload to unstable 2021-10-24 16:13:01 -07:00
Stefano Rivera
a9e2a2689d Replace nose with pytest (see: #997758). 2021-10-24 16:05:08 -07:00
Paride Legovini
fec7a72ef7 mk-sbuild: add support for zfs-snapshot schroots
Creating datasets with mountpoint=legacy is necessary because the
schroot helper scripts expect it. This will make zfs-snapshot schroots
behave more like lvm-snapshot schroots and less like btrfs-snapshot
schroots (e.g. the source schroot is not permanently mounted/visible).

Switching mount "style" requires changes in at least the
/etc/schroot/setup.d/05zfs and 10mount scripts.

LP: #1945349
2021-09-28 19:59:42 +02:00
Stefano Rivera
9360b17bcb Upload to unstable 2021-09-17 15:53:13 -07:00
Stefano Rivera
5eeb707142 Bump Standards-Version to 4.6.0, no changes needed. 2021-09-17 15:52:51 -07:00
Stefano Rivera
c1b1c106dc Update changelog 2021-09-17 15:52:51 -07:00
Dan Streetman
266085d587 misc: fix flake8 complaints 2021-09-17 07:25:47 -04:00
Dan Streetman
6ca12331d6 archive: use proper component
source packages sometimes have different component than their
bpphs, so use the correct component when downloading binaries

LP: #1943819
2021-09-16 21:01:19 -04:00
Dan Streetman
5fcc4b5b46 misc: handle ConnectionError as NotFoundError 2021-09-16 20:29:14 -04:00
Dan Streetman
a3ff68be5a misc: download to tmp file, to avoid leftover 0-size file on error 2021-09-16 19:24:00 -04:00
Krytarik Raido
a1b56ac31f
merge-changelog: Fix setting of newlines.
Signed-off-by: Mattia Rizzolo <mattia@debian.org>
2021-09-08 19:44:15 +02:00
Mattia Rizzolo
ceb020d0fa
lpapicache: fix sorting in Archive.getUploadersForPackage().
LP: #1862372
Signed-off-by: Mattia Rizzolo <mattia@debian.org>
2021-08-17 15:55:14 +02:00
Mattia Rizzolo
fbbcee9cc1
reflow code for the next commit
Gbp-Dch: Ignore
Signed-off-by: Mattia Rizzolo <mattia@debian.org>
2021-08-17 15:51:23 +02:00
Mattia Rizzolo
511aa3d80c
Merge branch 'fix-ppa-pull-lp-1938659' of git+ssh://git.launchpad.net/~alexmurray/ubuntu-dev-tools
MR: https://code.launchpad.net/~alexmurray/ubuntu-dev-tools/+git/ubuntu-dev-tools/+merge/406518
Signed-off-by: Mattia Rizzolo <mattia@debian.org>
2021-08-04 12:02:46 +02:00
Mattia Rizzolo
6c8109b6ae
init changelog
Signed-off-by: Mattia Rizzolo <mattia@debian.org>
2021-08-04 11:59:24 +02:00
Alex Murray
296e498fe9
archive: Fix PersonalPackageArchiveSourcePackage to yield URLs
Yielding the result of super()._source_urls() / ._binary_urls() yields the
generator object itself, which generates the list of URLs - not the URLs
that the generator would produce. Instead, use yield from, which
forwards each yielded URL from the parent generator.

Fixes LP: #1938659

Signed-off-by: Alex Murray <alex.murray@canonical.com>
2021-08-03 10:39:55 +09:30
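The bug reduces to the generic yield vs. yield from distinction; a stripped-down illustration (the class names here are invented for the example):

```python
class Base:
    def _source_urls(self):
        yield "https://launchpad.net/a.dsc"
        yield "https://launchpad.net/b.dsc"


class Broken(Base):
    def _source_urls(self):
        # Yields the generator object itself: callers get one opaque item.
        yield super()._source_urls()


class Fixed(Base):
    def _source_urls(self):
        # Forwards every URL the parent generator produces.
        yield from super()._source_urls()


broken = list(Broken()._source_urls())  # a single generator object
fixed = list(Fixed()._source_urls())    # the two URL strings
```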
Mattia Rizzolo
3f0d63d5b6
Release 0.184
Signed-off-by: Mattia Rizzolo <mattia@debian.org>
2021-07-17 17:57:22 +02:00
Mattia Rizzolo
49527c36e1
d/control: Bump debhelper compat level to 13.
Signed-off-by: Mattia Rizzolo <mattia@debian.org>
2021-07-17 17:56:30 +02:00
Mattia Rizzolo
14a8005d45
d/control: remove redundant Recommends that are already in Depends.
Signed-off-by: Mattia Rizzolo <mattia@debian.org>
2021-07-17 17:56:08 +02:00
Mattia Rizzolo
52032eb081
Merge branch 'ccache-support' of git+ssh://git.launchpad.net/~3v1n0/ubuntu-dev-tools
MR: https://code.launchpad.net/~3v1n0/ubuntu-dev-tools/+git/ubuntu-dev-tools/+merge/401817
Signed-off-by: Mattia Rizzolo <mattia@debian.org>
2021-07-17 17:54:04 +02:00
Mattia Rizzolo
4fc36efcf2
pbuilder-dist: use shutil.which instead of distutils.spawn.find_executable() to save a dependency
LP: #1936697
Signed-off-by: Mattia Rizzolo <mattia@debian.org>
2021-07-17 17:47:51 +02:00
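The swap is one-for-one; a minimal sketch ("pbuilder" is just an illustrative program name):

```python
import shutil

# distutils.spawn.find_executable() pulled in python3-distutils, and distutils
# itself was removed in Python 3.12; shutil.which() is the stdlib replacement.
def find_builder(name="pbuilder"):
    path = shutil.which(name)
    if path is None:
        raise FileNotFoundError(f"{name} not found in PATH")
    return path
```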
Dan Streetman
7f227f6264 update changelog 2021-07-14 17:53:26 -04:00
Dan Streetman
d598900df7 d/control: remove no longer needed requirements
Nothing uses termcolor or yaml anymore in the code
2021-07-14 17:41:10 -04:00
Dan Streetman
b4ca04efaa test: fix archive tests
The mocks made assumptions about the implementation that are no
longer true, and the tests generally need to be fixed to be more
robust about what they test
2021-07-14 17:41:10 -04:00
Dan Streetman
0eaf71737d test: fix example package class
this class can be much simpler and more robust and doesn't require
checking in the example package files to git
2021-07-14 17:41:10 -04:00
Dan Streetman
1b12d8b4e3 archive: convert to use pathlib instead of os.path
Mostly convert to using pathlib
2021-07-14 17:41:10 -04:00
Dan Streetman
be6e09a02b archive: don't use existing file if no verification methods provided
we shouldn't use an existing file if we aren't checking its checksums
2021-07-14 17:41:10 -04:00
Dan Streetman
86276665cd archive: Update PersonalPackageArchiveSourcePackage to handle private PPA
Unfortunately, private PPAs require downloading files from the special server
private-ppa.launchpad.net, and all the usual URLs provided by the LP API fail.
So add code to handle using those custom private URLs, and use authentication
when downloading.
2021-07-14 17:41:10 -04:00
Dan Streetman
3f2983c157 misc: Change download() method to use python requests and optional authentication
Since private PPAs require authentication, use python requests library instead
of urlopen(), since requests handles authentication easily
2021-07-14 17:41:10 -04:00
Dan Streetman
4f6d6bf2d8 misc: add extract_authentication method
This pulls the username:password out of a URL
2021-07-14 17:41:10 -04:00
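The helper boils down to urllib.parse; a sketch of the idea (illustrative, not the exact ubuntutools implementation):

```python
from urllib.parse import urlparse, urlunparse


def extract_authentication(url):
    """Return (url_without_credentials, username, password)."""
    parts = urlparse(url)
    # Rebuild the netloc without the user:password@ prefix:
    netloc = parts.hostname or ""
    if parts.port:
        netloc += f":{parts.port}"
    return urlunparse(parts._replace(netloc=netloc)), parts.username, parts.password


clean, user, password = extract_authentication(
    "https://alice:s3cret@private-ppa.launchpad.net/alice/ppa/ubuntu/dists/"
)
# clean == "https://private-ppa.launchpad.net/alice/ppa/ubuntu/dists/"
# user == "alice", password == "s3cret"
```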
Dan Streetman
df93a225a8 misc: replace os.path with Pathlib
also change some strings to use f-strings
2021-07-14 17:41:10 -04:00
Dan Streetman
d8df8cc869 misc: add download_bytes() and deprecate mode param for download_text()
Passing 'mode' assumes use of open(), but callers don't care about
implementation, just if the returned object is text or bytes
2021-07-14 17:41:04 -04:00
Dan Streetman
243a728d6c Move DownloadError into ubuntutools/misc 2021-07-13 14:08:59 -04:00
Dan Streetman
6e18d60de4 lpapicache: add Archive.getMySubscriptionURL()
Private PPAs require a username/password to access their files,
so make this information available. This gets the currently logged in
user's "subscription URL", which includes the authentication data.
2021-07-13 07:55:02 -04:00
Dan Streetman
dbd453876e flake8 tests no longer need to specifically exclude ubuntu-archive-assistant
Now that ubuntu-archive-assistant is removed, we can simply test everything
with flake8
2021-07-12 12:45:27 -04:00
Dan Streetman
732ff00cac Completely remove ubuntu-archive-assistant code
This hasn't been updated even a single time since it was added without
review almost 3 years ago. It additionally has never been included in the
ubuntu-dev-tools package. It's unclear if anyone is using it for anything,
but in any case it certainly shouldn't be in the ubuntu-dev-tools repository.
2021-07-12 12:45:27 -04:00
Marco Trevisan (Treviño)
2ac69a89e3 mk-sbuild: Add support to configure ccache for each schroot
ccache can help greatly in speeding up recompilations in sbuild, but it
is a bit annoying to configure at each schroot creation.

So, add a --cache option (and a corresponding CCACHE config parameter) to
configure ccache for sbuild schroots.

By default we use a shared ccache directory, but each schroot can use
a customized one if needed (with local parameters) by using --cache-dir
(or CCACHE_DIR).

The default ccache max-size is 4G, but it can be configured with --ccache-size
(or CCACHE_SIZE); the size applies to each ccache path, which can
be shared by multiple schroots or used by a single one.
2021-06-15 02:24:16 +02:00
Marco Trevisan (Treviño)
4e5e6efdb1 mk-sbuild: Enable debugging in the finish.sh script if --debug is used 2021-06-15 02:23:05 +02:00
Marco Trevisan (Treviño)
e193c30695 mk-sbuild: Use a more maintainable OPTS list using a bash array 2021-06-15 02:23:05 +02:00
Stefano Rivera
6bf3640e8f Upload to unstable 2021-06-08 10:09:16 -04:00
Dan Streetman
5dcde81c58 pbuilder: include missing import
commit d784fea1cdf50ca8d80fe2a0074aeb96ab1b580f did not include the
import for 'suppress'
2021-06-06 21:16:44 -04:00
Stefano Rivera
30abe6eacd Upload to unstable 2021-06-06 19:52:25 -04:00
Stefano Rivera
c0546396bf Respect nocheck in DEB_BUILD_OPTIONS, again. 2021-06-06 19:52:14 -04:00
Dan Streetman
ef100f6166 update changelog 2021-06-04 12:26:32 -04:00
Dan Streetman
3428a65b1c d/rules: override build tests to use flake8 and nosetests3 2021-06-04 12:25:21 -04:00
Dan Streetman
91e0babd93 d/t/control: add minimum version requirement for flake8 test
the --extend-exclude parameter was added in flake8 3.8.0
2021-06-04 12:25:21 -04:00
Dan Streetman
952b331c22 archive: fix flake8 test failure 2021-06-04 12:14:10 -04:00
Dan Streetman
e44bc63209 update changelog 2021-06-02 14:57:27 -04:00
Dan Streetman
c2ea95c067 syncpackage: don't login to LP if using --simulate
No need to login to LP if no real action is being taken.
2021-06-02 14:55:17 -04:00
Dan Streetman
f25f815bef update changelog 2021-06-02 11:49:20 -04:00
Dan Streetman
a0315dac8c archive: only download dsc file to the workdir from pull() method
LP: #1928946
2021-06-02 11:24:53 -04:00
Dan Streetman
ec36c7c792 syncpackage: remove calls to no-op pull_dsc() 2021-05-28 16:24:53 -04:00
Dan Streetman
9be49e7b93 archive: deprecate poorly-named pull_dsc() method 2021-05-28 16:24:15 -04:00
Dan Streetman
90824e056c archive: move check for verify_signature into check_dsc_signature 2021-05-28 16:18:50 -04:00
Dan Streetman
1093c372eb archive: allow passing absolute path to _download_file 2021-05-28 16:17:29 -04:00
Dan Streetman
ff66707a4c Add mode param to download_text() to allow using custom modes like 'rb' 2021-05-28 16:14:51 -04:00
Mattia Rizzolo
128eca1a5b
Changelog for 0.181
Signed-off-by: Mattia Rizzolo <mattia@debian.org>
2021-05-02 20:04:32 +02:00
Mattia Rizzolo
4f10be3f13
mk-sbuild: document eatmydata in the manpage
Signed-off-by: Mattia Rizzolo <mattia@debian.org>
2021-05-02 20:02:47 +02:00
Mattia Rizzolo
b687e11813
Merge branch 'fix-backportpackage' of git+ssh://git.launchpad.net/~ddstreet/ubuntu-dev-tools/+git/ubuntu-dev-tools
Closes: https://bugs.debian.org/983854
MR: https://code.launchpad.net/~ddstreet/ubuntu-dev-tools/+git/ubuntu-dev-tools/+merge/400848
Signed-off-by: Mattia Rizzolo <mattia@debian.org>
2021-05-02 19:55:44 +02:00
Mattia Rizzolo
eca442bf35
Merge branch 'lp1916633' of git+ssh://git.launchpad.net/~ddstreet/ubuntu-dev-tools/+git/ubuntu-dev-tools
MR: https://code.launchpad.net/~ddstreet/ubuntu-dev-tools/+git/ubuntu-dev-tools/+merge/399686
Signed-off-by: Mattia Rizzolo <mattia@debian.org>
2021-05-02 19:49:03 +02:00
Marco Trevisan (Treviño)
ede8a77718 doc/mk-sbuild.1: Add documentation for --debootstrap-proxy and DEBOOTSTRAP_PROXY
LP: #1926166
2021-04-26 20:33:48 +02:00
Dan Streetman
85ed9ad1ce backportpackage: don't use SourcePackage() directly
As the warning from 2010 says, don't use this class directly.
2021-04-08 22:52:20 -04:00
Balint Reczey
f97b19554f Use eatmydata by default
Since only dpkg is wrapped in eatmydata, it should be a safe and fast
default. Eatmydata is widely used around apt, so it would be a serious bug
if a package couldn't be installed with eatmydata in use.
2021-03-26 15:41:10 +01:00
Balint Reczey
a5ee35c812 Use eatmydata only with the dpkg command
Eatmydata wrapping the build as well could break tests.

Thanks: Julian Andres Klode for suggesting this solution
2021-03-26 15:40:25 +01:00
Dan Streetman
d784fea1cd pbuilder: handle debian change from /updates to -security
starting in bullseye, the security suite is -security instead of /updates

LP: #1916633
2021-03-12 13:09:17 -05:00
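The rule can be captured in a small hypothetical helper (the real pbuilder-dist change wires this into the mirror/suite setup; this is only a sketch of the naming rule):

```python
def debian_security_suite(codename, version):
    """Security archive suite name: '<codename>/updates' before bullseye (11),
    '<codename>-security' from bullseye onward."""
    major = int(version.split(".")[0])
    return f"{codename}-security" if major >= 11 else f"{codename}/updates"


debian_security_suite("buster", "10")    # -> 'buster/updates'
debian_security_suite("bullseye", "11")  # -> 'bullseye-security'
```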
Krytarik Raido
728849964e
Logging: Fix oversight from the last logging refactor
Signed-off-by: Mattia Rizzolo <mattia@debian.org>
2021-02-24 14:26:18 +01:00
Mattia Rizzolo
09537bd44d
Merge branch 'fix-sponsor-patch' of git+ssh://git.launchpad.net/~logan/ubuntu-dev-tools
MR: https://code.launchpad.net/~logan/ubuntu-dev-tools/+git/ubuntu-dev-tools/+merge/398509
Signed-off-by: Mattia Rizzolo <mattia@debian.org>
2021-02-24 14:24:50 +01:00
Mattia Rizzolo
484a668c0a
init changelog
Signed-off-by: Mattia Rizzolo <mattia@debian.org>
2021-02-24 14:23:42 +01:00
Logan Rosen
bc24ef23de sponsor-patch: fix bugs from py3 migration 2021-02-22 22:24:52 -05:00
Mattia Rizzolo
ea549d6c19
Changelog for 0.180
Signed-off-by: Mattia Rizzolo <mattia@debian.org>
2021-02-19 12:36:01 +01:00
Mattia Rizzolo
7118411b89
Drop coverage in the autopkgtest, as python3-nose-cov is not in Debian
Signed-off-by: Mattia Rizzolo <mattia@debian.org>
2021-02-19 12:12:02 +01:00
119 changed files with 7076 additions and 6392 deletions

.gitignore

@@ -1,18 +1,2 @@
.coverage
.tox
/ubuntu_dev_tools.egg-info/
__pycache__/
*.pyc
/build/
/.pybuild/
/test-data/example_1.0-1.debian.tar.xz
/test-data/example_1.0-1.dsc
/test-data/example_1.0.orig.tar.gz
/debian/python-ubuntutools/
/debian/python3-ubuntutools/
/debian/ubuntu-dev-tools/
/debian/debhelper-build-stamp
/debian/files
/debian/*.debhelper
/debian/*.debhelper.log
/debian/*.substvars
__pycache__
*.egg-info


@@ -1,5 +1,10 @@
[MASTER]
# A comma-separated list of package or module names from where C extensions may
# be loaded. Extensions are loading into the active Python interpreter and may
# run arbitrary code.
extension-pkg-allow-list=apt_pkg
# Pickle collected data for later comparisons.
persistent=no
@@ -9,10 +14,6 @@ jobs=0
[MESSAGES CONTROL]
# Only show warnings with the listed confidence levels. Leave empty to show
# all. Valid levels: HIGH, INFERENCE, INFERENCE_FAILURE, UNDEFINED
confidence=HIGH
# Disable the message, report, category or checker with the given id(s). You
# can either give multiple identifiers separated by comma (,) or put this
# option multiple times (only on the command line, not in the configuration
@@ -22,7 +23,18 @@ confidence=HIGH
# --enable=similarities". If you want to run only the classes checker, but have
# no Warning level messages displayed, use"--disable=all --enable=classes
# --disable=W"
disable=locally-disabled
disable=fixme,locally-disabled,missing-docstring,useless-option-value,
# TODO: Fix all following disabled checks!
invalid-name,
consider-using-with,
too-many-arguments,
too-many-branches,
too-many-statements,
too-many-locals,
duplicate-code,
too-many-instance-attributes,
too-many-nested-blocks,
too-many-lines,
[REPORTS]
@@ -31,14 +43,6 @@ disable=locally-disabled
reports=no
[TYPECHECK]
# List of classes names for which member attributes should not be checked
# (useful for classes with attributes dynamically set).
# lpapicache classes, urlparse
ignored-classes=Launchpad,BaseWrapper,PersonTeam,Distribution,Consumer,Credentials,ParseResult,apt_pkg,apt_pkg.Dependency,apt_pkg.BaseDependency
[FORMAT]
# Maximum number of characters on a single line.
@@ -52,4 +56,10 @@ indent-string='    '
[BASIC]
# Allow variables called e, f, lp
good-names=i,j,k,ex,Run,_,e,f,lp
good-names=i,j,k,ex,Run,_,e,f,lp,me,to
[IMPORTS]
# Force import order to recognize a module as part of a third party library.
known-third-party=debian


@@ -18,8 +18,8 @@
#
# ##################################################################
import argparse
import glob
import optparse
import os
import shutil
import subprocess
@@ -27,192 +27,223 @@ import sys
import tempfile
from urllib.parse import quote
try:
import lsb_release
except ImportError:
lsb_release = None
from distro_info import DebianDistroInfo, UbuntuDistroInfo
from httplib2 import Http, HttpLib2Error
from ubuntutools.archive import (SourcePackage, DebianSourcePackage,
UbuntuSourcePackage, DownloadError)
from ubuntutools.config import UDTConfig, ubu_email
from ubuntutools import getLogger
from ubuntutools.archive import DebianSourcePackage, DownloadError, UbuntuSourcePackage
from ubuntutools.builder import get_builder
from ubuntutools.lp.lpapicache import (Launchpad, Distribution,
from ubuntutools.config import UDTConfig, ubu_email
from ubuntutools.lp.lpapicache import (
Distribution,
Launchpad,
PackageNotFoundException,
SeriesNotFoundException,
PackageNotFoundException)
from ubuntutools.misc import (system_distribution, vendor_to_distroinfo,
codename_to_distribution)
)
from ubuntutools.misc import codename_to_distribution, system_distribution, vendor_to_distroinfo
from ubuntutools.question import YesNoQuestion
from ubuntutools import getLogger
Logger = getLogger()
def error(msg):
Logger.error(msg)
def error(msg, *args):
Logger.error(msg, *args)
sys.exit(1)
def check_call(cmd, *args, **kwargs):
Logger.debug(' '.join(cmd))
Logger.debug(" ".join(cmd))
ret = subprocess.call(cmd, *args, **kwargs)
if ret != 0:
error('%s returned %d.' % (cmd[0], ret))
error("%s returned %d.", cmd[0], ret)
def parse(args):
usage = 'Usage: %prog [options] <source package name or .dsc URL/file>'
parser = optparse.OptionParser(usage)
parser.add_option('-d', '--destination',
metavar='DEST',
dest='dest_releases',
def parse(argv):
usage = "%(prog)s [options] <source package name or .dsc URL/file>"
parser = argparse.ArgumentParser(usage=usage)
parser.add_argument(
"-d",
"--destination",
metavar="DEST",
dest="dest_releases",
default=[],
action='append',
help='Backport to DEST release '
'(default: current release)')
parser.add_option('-s', '--source',
metavar='SOURCE',
dest='source_release',
help='Backport from SOURCE release '
'(default: devel release)')
parser.add_option('-S', '--suffix',
metavar='SUFFIX',
help='Suffix to append to version number '
'(default: ~ppa1 when uploading to a PPA)')
parser.add_option('-e', '--message',
metavar='MESSAGE',
action="append",
help="Backport to DEST release (default: current release)",
)
parser.add_argument(
"-s",
"--source",
metavar="SOURCE",
dest="source_release",
help="Backport from SOURCE release (default: devel release)",
)
parser.add_argument(
"-S",
"--suffix",
metavar="SUFFIX",
help="Suffix to append to version number (default: ~ppa1 when uploading to a PPA)",
)
parser.add_argument(
"-e",
"--message",
metavar="MESSAGE",
default="No-change",
help='Changelog message to use instead of "No-change" '
'(default: No-change backport to DEST)')
parser.add_option('-b', '--build',
"(default: No-change backport to DEST.)",
)
parser.add_argument(
"-b",
"--build",
default=False,
action='store_true',
help='Build the package before uploading '
'(default: %default)')
parser.add_option('-B', '--builder',
metavar='BUILDER',
help='Specify the package builder (default: pbuilder)')
parser.add_option('-U', '--update',
action="store_true",
help="Build the package before uploading (default: %(default)s)",
)
parser.add_argument(
"-B",
"--builder",
metavar="BUILDER",
help="Specify the package builder (default: pbuilder)",
)
parser.add_argument(
"-U",
"--update",
default=False,
action='store_true',
help='Update the build environment before '
'attempting to build')
parser.add_option('-u', '--upload',
metavar='UPLOAD',
help='Specify an upload destination')
parser.add_option("-k", "--key",
dest='keyid',
help="Specify the key ID to be used for signing.")
parser.add_option('--dont-sign',
dest='keyid', action='store_false',
help='Do not sign the upload.')
parser.add_option('-y', '--yes',
dest='prompt',
action="store_true",
help="Update the build environment before attempting to build",
)
parser.add_argument("-u", "--upload", metavar="UPLOAD", help="Specify an upload destination")
parser.add_argument(
"-k", "--key", dest="keyid", help="Specify the key ID to be used for signing."
)
parser.add_argument(
"--dont-sign", dest="keyid", action="store_false", help="Do not sign the upload."
)
parser.add_argument(
"-y",
"--yes",
dest="prompt",
default=True,
action='store_false',
help='Do not prompt before uploading to a PPA')
parser.add_option('-v', '--version',
metavar='VERSION',
help='Package version to backport (or verify)')
parser.add_option('-w', '--workdir',
metavar='WORKDIR',
help='Specify a working directory '
'(default: temporary dir)')
parser.add_option('-r', '--release-pocket',
action="store_false",
help="Do not prompt before uploading to a PPA",
)
parser.add_argument(
"-v", "--version", metavar="VERSION", help="Package version to backport (or verify)"
)
parser.add_argument(
"-w",
"--workdir",
metavar="WORKDIR",
help="Specify a working directory (default: temporary dir)",
)
parser.add_argument(
"-r",
"--release-pocket",
default=False,
action='store_true',
help='Target the release pocket in the .changes file. '
'Necessary (and default) for uploads to PPAs')
parser.add_option('-c', '--close',
metavar='BUG',
help='Bug to close in the changelog entry.')
parser.add_option('-m', '--mirror',
metavar='URL',
help='Preferred mirror (default: Launchpad)')
parser.add_option('-l', '--lpinstance',
metavar='INSTANCE',
help='Launchpad instance to connect to '
'(default: production)')
parser.add_option('--no-conf',
action="store_true",
help="Target the release pocket in the .changes file. "
"Necessary (and default) for uploads to PPAs",
)
parser.add_argument(
"-c", "--close", metavar="BUG", help="Bug to close in the changelog entry."
)
parser.add_argument(
"-m", "--mirror", metavar="URL", help="Preferred mirror (default: Launchpad)"
)
parser.add_argument(
"-l",
"--lpinstance",
metavar="INSTANCE",
help="Launchpad instance to connect to (default: production)",
)
parser.add_argument(
"--no-conf",
default=False,
action='store_true',
help="Don't read config files or environment variables")
action="store_true",
help="Don't read config files or environment variables",
)
parser.add_argument("package_or_dsc", help=argparse.SUPPRESS)
opts, args = parser.parse_args(args)
if len(args) != 1:
parser.error('You must specify a single source package or a .dsc '
'URL/path.')
config = UDTConfig(opts.no_conf)
if opts.builder is None:
opts.builder = config.get_value('BUILDER')
if not opts.update:
opts.update = config.get_value('UPDATE_BUILDER', boolean=True)
if opts.workdir is None:
opts.workdir = config.get_value('WORKDIR')
if opts.lpinstance is None:
opts.lpinstance = config.get_value('LPINSTANCE')
if opts.upload is None:
opts.upload = config.get_value('UPLOAD')
if opts.keyid is None:
opts.keyid = config.get_value('KEYID')
if not opts.upload and not opts.workdir:
parser.error('Please specify either a working dir or an upload target!')
if opts.upload and opts.upload.startswith('ppa:'):
opts.release_pocket = True
args = parser.parse_args(argv)
config = UDTConfig(args.no_conf)
if args.builder is None:
args.builder = config.get_value("BUILDER")
if not args.update:
args.update = config.get_value("UPDATE_BUILDER", boolean=True)
if args.workdir is None:
args.workdir = config.get_value("WORKDIR")
if args.lpinstance is None:
args.lpinstance = config.get_value("LPINSTANCE")
if args.upload is None:
args.upload = config.get_value("UPLOAD")
if args.keyid is None:
args.keyid = config.get_value("KEYID")
if not args.upload and not args.workdir:
parser.error("Please specify either a working dir or an upload target!")
if args.upload and args.upload.startswith("ppa:"):
args.release_pocket = True
return opts, args, config
return args, config
def find_release_package(mirror, workdir, package, version, source_release,
config):
def find_release_package(mirror, workdir, package, version, source_release, config):
srcpkg = None
if source_release:
distribution = codename_to_distribution(source_release)
if not distribution:
error('Unknown release codename %s' % source_release)
error("Unknown release codename %s", source_release)
info = vendor_to_distroinfo(distribution)()
source_release = info.codename(source_release, default=source_release)
else:
distribution = system_distribution()
mirrors = [mirror] if mirror else []
mirrors.append(config.get_value('%s_MIRROR' % distribution.upper()))
mirrors.append(config.get_value(f"{distribution.upper()}_MIRROR"))
if not version:
archive = Distribution(distribution.lower()).getArchive()
try:
spph = archive.getSourcePackage(package, source_release)
except (SeriesNotFoundException, PackageNotFoundException) as e:
error(str(e))
error("%s", str(e))
version = spph.getVersion()
if distribution == 'Debian':
srcpkg = DebianSourcePackage(package,
version,
workdir=workdir,
mirrors=mirrors)
elif distribution == 'Ubuntu':
srcpkg = UbuntuSourcePackage(package,
version,
workdir=workdir,
mirrors=mirrors)
if distribution == "Debian":
srcpkg = DebianSourcePackage(package, version, workdir=workdir, mirrors=mirrors)
elif distribution == "Ubuntu":
srcpkg = UbuntuSourcePackage(package, version, workdir=workdir, mirrors=mirrors)
return srcpkg
def find_package(mirror, workdir, package, version, source_release, config):
"Returns the SourcePackage"
if package.endswith('.dsc'):
return SourcePackage(version=version, dscfile=package,
workdir=workdir, mirrors=(mirror,))
if package.endswith(".dsc"):
# Here we are using UbuntuSourcePackage just because we don't have any
# "general" class that is safely instantiable (as SourcePackage is an
# abstract class). None of the distribution-specific details within
# UbuntuSourcePackage is relevant for this use case.
return UbuntuSourcePackage(
version=version, dscfile=package, workdir=workdir, mirrors=(mirror,)
)
if not source_release and not version:
info = vendor_to_distroinfo(system_distribution())
source_release = info().devel()
srcpkg = find_release_package(mirror, workdir, package, version, source_release, config)
if version and srcpkg.version != version:
error(
"Requested backport of version %s but version of %s in %s is %s",
version,
package,
source_release,
srcpkg.version,
)
return srcpkg
@@ -220,15 +251,27 @@ def find_package(mirror, workdir, package, version, source_release, config):
def get_backport_version(version, suffix, upload, release):
distribution = codename_to_distribution(release)
if not distribution:
error("Unknown release codename %s", release)
if distribution == "Debian":
debian_distro_info = DebianDistroInfo()
debian_codenames = debian_distro_info.supported()
if release in debian_codenames:
release_version = debian_distro_info.version(release)
if not release_version:
error("Can't find the release version for %s", release)
backport_version = f"{version}~bpo{release_version}+1"
else:
error("%s is not a supported release (%s)", release, debian_codenames)
elif distribution == "Ubuntu":
series = Distribution(distribution.lower()).getSeries(name_or_version=release)
backport_version = f"{version}~bpo{series.version}.1"
else:
error("Unknown distribution «%s» for release «%s»", distribution, release)
if suffix is not None:
backport_version += suffix
elif upload and upload.startswith("ppa:"):
backport_version += "~ppa1"
return backport_version
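The version-mangling rules above can be exercised standalone. The sketch below is illustrative only (the function name and the release versions "12" and "24.04" are example inputs, not part of the diff):

```python
# Minimal sketch of the backport version scheme implemented above:
# Debian backports get "~bpo<release>+1", Ubuntu backports "~bpo<release>.1",
# with an optional explicit suffix or an implicit "~ppa1" for PPA uploads.
def backport_version(version, distribution, release_version, upload=None, suffix=None):
    if distribution == "Debian":
        bp_version = f"{version}~bpo{release_version}+1"
    else:  # Ubuntu
        bp_version = f"{version}~bpo{release_version}.1"
    if suffix is not None:
        bp_version += suffix
    elif upload and upload.startswith("ppa:"):
        bp_version += "~ppa1"
    return bp_version

print(backport_version("1.0-1", "Debian", "12"))  # 1.0-1~bpo12+1
print(backport_version("1.0-1", "Ubuntu", "24.04", upload="ppa:user/test"))  # 1.0-1~bpo24.04.1~ppa1
```

Note that a backport version always sorts below the original `version` (because of `~`) but above any earlier backport of the same upstream release.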
@@ -236,26 +279,25 @@ def get_old_version(source, release):
try:
distribution = codename_to_distribution(release)
archive = Distribution(distribution.lower()).getArchive()
pkg = archive.getSourcePackage(source,
release,
('Release', 'Security', 'Updates',
'Proposed', 'Backports'))
pkg = archive.getSourcePackage(
source, release, ("Release", "Security", "Updates", "Proposed", "Backports")
)
return pkg.getVersion()
except (SeriesNotFoundException, PackageNotFoundException):
pass
return None
def get_backport_dist(release, release_pocket):
if release_pocket:
return release
return f"{release}-backports"
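For reference, the rewritten pocket helper maps a release to its upload target; a standalone copy (for illustration only) shows the two outcomes:

```python
# Standalone copy of the helper above: backports either target the release
# pocket directly or the conventional <release>-backports distribution.
def get_backport_dist(release, release_pocket):
    if release_pocket:
        return release
    return f"{release}-backports"

print(get_backport_dist("bookworm", False))  # bookworm-backports
print(get_backport_dist("bookworm", True))   # bookworm
```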
def do_build(workdir, dsc, release, builder, update):
builder = get_builder(builder)
if not builder:
return None
if update:
if 0 != builder.update(release):
@@ -263,41 +305,41 @@ def do_build(workdir, dsc, release, builder, update):
# builder.build is going to chdir to buildresult:
workdir = os.path.realpath(workdir)
return builder.build(os.path.join(workdir, dsc), release, os.path.join(workdir, "buildresult"))
def do_upload(workdir, package, bp_version, changes, upload, prompt):
print(f"Please check {package} {bp_version} in file://{workdir} carefully!")
if prompt or upload == "ubuntu":
question = f"Do you want to upload the package to {upload}"
answer = YesNoQuestion().ask(question, "yes")
if answer == "no":
return
check_call(["dput", upload, changes], cwd=workdir)
def orig_needed(upload, workdir, pkg):
"""Avoid a -sa if possible"""
if not upload or not upload.startswith("ppa:"):
return True
ppa = upload.split(":", 1)[1]
user, ppa = ppa.split("/", 1)
version = pkg.version.upstream_version
http = Http()
for filename in glob.glob(os.path.join(workdir, f"{pkg.source}_{version}.orig*")):
url = (
f"https://launchpad.net/~{quote(user)}/+archive/{quote(ppa)}/+sourcefiles"
f"/{quote(pkg.source)}/{quote(pkg.version.full_version)}"
f"/{quote(os.path.basename(filename))}"
)
try:
headers = http.request(url, "HEAD")[0]
if headers.status != 200 or not headers["content-location"].startswith(
"https://launchpadlibrarian.net"
):
return True
except HttpLib2Error as e:
Logger.debug(e)
@@ -305,61 +347,79 @@ def orig_needed(upload, workdir, pkg):
return False
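The sourcefiles URL built in `orig_needed` follows a fixed Launchpad pattern; a hypothetical sketch (helper name, user, PPA, and package values are made up for illustration):

```python
from urllib.parse import quote

# Mirrors the f-string above: the Launchpad "+sourcefiles" download path
# for a file published in a PPA. All path components are percent-quoted.
def ppa_sourcefile_url(user, ppa, source, full_version, filename):
    return (
        f"https://launchpad.net/~{quote(user)}/+archive/{quote(ppa)}/+sourcefiles"
        f"/{quote(source)}/{quote(full_version)}/{quote(filename)}"
    )

print(ppa_sourcefile_url("user", "testppa", "hello", "2.10-3", "hello_2.10.orig.tar.gz"))
```

A HEAD request against such a URL (as the code above does) answers whether the orig tarball is already published in the PPA, so `-sa` can be skipped.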
def do_backport(
workdir,
pkg,
suffix,
message,
close,
release,
release_pocket,
build,
builder,
update,
upload,
keyid,
prompt,
):
dirname = f"{pkg.source}-{release}"
srcdir = os.path.join(workdir, dirname)
if os.path.exists(srcdir):
question = f"Working directory {srcdir} already exists. Delete it?"
if YesNoQuestion().ask(question, "no") == "no":
sys.exit(1)
shutil.rmtree(srcdir)
pkg.unpack(dirname)
bp_version = get_backport_version(pkg.version.full_version, suffix, upload, release)
old_version = get_old_version(pkg.source, release)
bp_dist = get_backport_dist(release, release_pocket)
changelog = f"{message} backport to {release}."
if close:
changelog += f" (LP: #{close})"
check_call(
[
"dch",
"--force-bad-version",
"--force-distribution",
"--preserve",
"--newversion",
bp_version,
"--distribution",
bp_dist,
changelog,
],
cwd=srcdir,
)
cmd = ["debuild", "--no-lintian", "-S", "-nc", "-uc", "-us"]
if orig_needed(upload, workdir, pkg):
cmd.append("-sa")
else:
cmd.append("-sd")
if old_version:
cmd.append(f"-v{old_version}")
env = os.environ.copy()
# An ubuntu.com e-mail address would make dpkg-buildpackage fail if there
# wasn't an Ubuntu maintainer for an ubuntu-versioned package. LP: #1007042
env.pop("DEBEMAIL", None)
check_call(cmd, cwd=srcdir, env=env)
fn_base = pkg.source + "_" + bp_version.split(":", 1)[-1]
changes = fn_base + "_source.changes"
if build:
if 0 != do_build(workdir, fn_base + ".dsc", release, builder, update):
sys.exit(1)
# None: sign with the default signature. False: don't sign
if keyid is not False:
cmd = ["debsign"]
if keyid:
cmd.append("-k" + keyid)
cmd.append(changes)
check_call(cmd, cwd=workdir)
if upload:
@@ -368,57 +428,68 @@ def do_backport(workdir, pkg, suffix, message, close, release, release_pocket,
shutil.rmtree(srcdir)
def main(argv):
ubu_email()
args, config = parse(argv[1:])
Launchpad.login_anonymously(service=args.lpinstance)
if not args.dest_releases:
if lsb_release:
distinfo = lsb_release.get_distro_information()
try:
current_distro = distinfo["ID"]
except KeyError:
error("No destination release specified and unable to guess yours.")
else:
err, current_distro = subprocess.getstatusoutput("lsb_release --id --short")
if err:
error("Could not run lsb_release to retrieve distribution")
if current_distro == "Ubuntu":
args.dest_releases = [UbuntuDistroInfo().lts()]
elif current_distro == "Debian":
args.dest_releases = [DebianDistroInfo().stable()]
else:
error("Unknown distribution %s, can't guess target release", current_distro)
if args.workdir:
workdir = os.path.expanduser(args.workdir)
else:
workdir = tempfile.mkdtemp(prefix="backportpackage-")
if not os.path.exists(workdir):
os.makedirs(workdir)
try:
pkg = find_package(
args.mirror, workdir, args.package_or_dsc, args.version, args.source_release, config
)
pkg.pull()
for release in args.dest_releases:
do_backport(
workdir,
pkg,
args.suffix,
args.message,
args.close,
release,
args.release_pocket,
args.build,
args.builder,
args.update,
args.upload,
args.keyid,
args.prompt,
)
except DownloadError as e:
error("%s", str(e))
finally:
if not args.workdir:
shutil.rmtree(workdir)
if __name__ == "__main__":
sys.exit(main(sys.argv))


@@ -36,7 +36,7 @@ _pbuilder-dist()
for distro in $(ubuntu-distro-info --all; debian-distro-info --all) stable testing unstable; do
for builder in pbuilder cowbuilder; do
echo "$builder-$distro"
for arch in i386 amd64 armhf; do
echo "$builder-$distro-$arch"
done
done

check-mir

@@ -21,69 +21,116 @@
# this program; if not, write to the Free Software Foundation, Inc.,
# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
# pylint: disable=invalid-name
# pylint: enable=invalid-name
"""Check if any of a package's build or binary dependencies are in universe or multiverse.
Run this inside an unpacked source package
"""
import argparse
import os.path
import sys
import apt
def check_support(apt_cache, pkgname, alt=False):
"""Check if pkgname is in main or restricted.
This prints messages if a package is not in main/restricted, or only
partially (i. e. source in main, but binary in universe).
"""
if alt:
prefix = " ... alternative " + pkgname
else:
prefix = " * " + pkgname
prov_packages = apt_cache.get_providing_packages(pkgname)
if pkgname in apt_cache:
pkg = apt_cache[pkgname]
# If this is a virtual package, iterate through the binary packages that
# provide this, and ensure they are all in Main. Source packages in and of
# themselves cannot provide virtual packages, only binary packages can.
elif len(prov_packages) > 0:
supported, unsupported = [], []
for pkg in prov_packages:
candidate = pkg.candidate
if candidate:
section = candidate.section
if section.startswith("universe") or section.startswith("multiverse"):
unsupported.append(pkg.name)
else:
supported.append(pkg.name)
if len(supported) > 0:
msg = "is a virtual package, which is provided by the following "
msg += "candidates in Main: " + " ".join(supported)
print(prefix, msg)
elif len(unsupported) > 0:
msg = "is a virtual package, but is only provided by the "
msg += "following non-Main candidates: " + " ".join(unsupported)
print(prefix, msg, file=sys.stderr)
return False
else:
msg = "is a virtual package that exists but is not provided by "
msg += "package currently in the archive. Proceed with caution."
print(prefix, msg, file=sys.stderr)
return False
else:
print(prefix, "does not exist", file=sys.stderr)
return False
section = pkg.candidate.section
if section.startswith("universe") or section.startswith("multiverse"):
# check if the source package is in main and thus will only need binary
# promotion
source_records = apt.apt_pkg.SourceRecords()
if not source_records.lookup(pkg.candidate.source_name):
print("ERROR: Cannot lookup source package for", pkg.name, file=sys.stderr)
print(prefix, "package is in", section.split("/")[0])
return False
src = apt.apt_pkg.TagSection(source_records.record)
if src["Section"].startswith("universe") or src["Section"].startswith("multiverse"):
print(prefix, "binary and source package is in", section.split("/")[0])
return False
else:
print(
prefix,
"is in",
section.split("/")[0] + ", but its source",
pkg.candidate.source_name,
"is already in main; file an ubuntu-archive bug for "
"promoting the current preferred alternative",
)
return True
if alt:
print(prefix, "is already in main; consider preferring it")
return True
def check_build_dependencies(apt_cache, control):
print("Checking support status of build dependencies...")
any_unsupported = False
for field in ("Build-Depends", "Build-Depends-Indep"):
if field not in control.section:
continue
for or_group in apt.apt_pkg.parse_src_depends(control.section[field]):
pkgname = or_group[0][0]
# debhelper-compat is expected to be a build dependency of every
# package, so it is a red herring to display it in this report.
# (src:debhelper is in Ubuntu Main anyway)
if pkgname == "debhelper-compat":
continue
if not check_support(apt_cache, pkgname):
# check non-preferred alternatives
for altpkg in or_group[1:]:
@@ -98,20 +145,19 @@ def check_build_dependencies(apt_cache, control):
def check_binary_dependencies(apt_cache, control):
any_unsupported = False
print("\nChecking support status of binary dependencies...")
while True:
try:
next(control)
except StopIteration:
break
for field in ("Depends", "Pre-Depends", "Recommends"):
if field not in control.section:
continue
for or_group in apt.apt_pkg.parse_src_depends(control.section[field]):
pkgname = or_group[0][0]
if pkgname.startswith("$"):
continue
if not check_support(apt_cache, pkgname):
# check non-preferred alternatives
@@ -125,32 +171,33 @@ def check_binary_dependencies(apt_cache, control):
def main():
parser = argparse.ArgumentParser(description=__doc__)
parser.parse_args()
apt_cache = apt.Cache()
if not os.path.exists("debian/control"):
print(
"debian/control not found. You need to run this tool in a source package directory",
file=sys.stderr,
)
sys.exit(1)
# get build dependencies from debian/control
control = apt.apt_pkg.TagFile(open("debian/control", encoding="utf-8"))
next(control)
unsupported_build_deps = check_build_dependencies(apt_cache, control)
unsupported_binary_deps = check_binary_dependencies(apt_cache, control)
if unsupported_build_deps or unsupported_binary_deps:
print(
"\nPlease check https://wiki.ubuntu.com/MainInclusionProcess if "
"this source package needs to get into main/restricted, or "
"reconsider if the package really needs above dependencies."
)
else:
print("All dependencies are supported in main or restricted.")
if __name__ == "__main__":
main()

debian/.gitignore

@@ -0,0 +1 @@
files

debian/changelog

@@ -1,8 +1,464 @@
ubuntu-dev-tools (0.206) unstable; urgency=medium
[ Dan Bungert ]
* mk-sbuild: enable pkgmaintainermangler
[ Shengjing Zhu ]
* import-bug-from-debian: package option is overridden and not used
[ Fernando Bravo Hernández ]
* Parsing arch parameter to getBinaryPackage() (LP: #2081861)
[ Simon Quigley ]
* Read ~/.devscripts in a more robust way, to ideally pick up multi-line
variables (Closes: #725418).
* mk-sbuild: default to using UTC for schroots (LP: #2097159).
* syncpackage: s/syncblacklist/syncblocklist/g
* syncpackage: Cache the sync blocklist in-memory, so it's not fetched
multiple times when syncing more than one package.
* syncpackage: Catch exceptions cleanly, simply skipping to the next
package (erring on the side of caution) if there is an error doing the
download (LP: #1943286).
-- Simon Quigley <tsimonq2@debian.org> Tue, 04 Mar 2025 13:43:15 -0600
ubuntu-dev-tools (0.205) unstable; urgency=medium
* [syncpackage] When syncing multiple packages, if one of the packages is in
the sync blocklist, do not exit, simply continue.
* [syncpackage] Do not use exit(1) on an error or exception unless it
applies to all packages, instead return None so we can continue to the
next package.
* [syncpackage] Add support for -y or --yes, noted that it should be used
with care.
* Update Standards-Version to 4.7.2, no changes needed.
-- Simon Quigley <tsimonq2@debian.org> Sat, 01 Mar 2025 11:29:54 -0600
ubuntu-dev-tools (0.204) unstable; urgency=medium
[ Simon Quigley ]
* Update Standards-Version to 4.7.1, no changes needed.
* Add several Lintian overrides related to .pyc files.
* Add my name to the copyright file.
* Rename bitesize to lp-bitesize (Closes: #1076224).
* Add a manpage for running-autopkgtests.
* Add a large warning at the top of mk-sbuild encouraging the use of the
unshare backend. This is to provide ample warning to users.
* Remove mail line from default ~/.sbuildrc, to resolve the undeclared
dependency on sendmail (Closes: #1074632).
[ Julien Plissonneau Duquène ]
* Fix reverse-depends -b crash on packages that b-d on themselves
(Closes: #1087760).
-- Simon Quigley <tsimonq2@debian.org> Mon, 24 Feb 2025 19:54:39 -0600
ubuntu-dev-tools (0.203) unstable; urgency=medium
[ Steve Langasek ]
* ubuntu-build: handle TOCTOU issue with the "can be retried" value on
builds.
* Recommend sbuild over pbuilder. sbuild is the tool recommended by
Ubuntu developers whose behavior most closely approximates Launchpad
builds.
[ Florent 'Skia' Jacquet ]
* import-bug-from-debian: handle multipart message (Closes: #969510)
[ Benjamin Drung ]
* import-bug-from-debian: add type hints
* Bump Standards-Version to 4.7.0
* Bump year and add missing files to copyright
* setup.py: add pm-helper
* Format code with black and isort
* Address several issues pointed out by Pylint
* Depend on python3-yaml for pm-helper
-- Benjamin Drung <bdrung@debian.org> Sat, 02 Nov 2024 18:19:24 +0100
ubuntu-dev-tools (0.202) unstable; urgency=medium
[ Steve Langasek ]
* ubuntu-build: support --batch with no package names to retry all
* ubuntu-build: in batch mode, print a count of packages retried
* ubuntu-build: make the --arch option top-level.
This gets rid of the fugly --arch2 option
* ubuntu-build: support retrying builds in other states that failed-to-build
* ubuntu-build: Handling of proposed vs release pocket default for ppas
* ubuntu-build: update manpage
[ Chris Peterson ]
* Replace Depends on python3-launchpadlib with Depends on
python3-launchpadlib-desktop (LP: #2049217)
-- Simon Quigley <tsimonq2@ubuntu.com> Fri, 12 Apr 2024 23:33:14 -0500
ubuntu-dev-tools (0.201) unstable; urgency=medium
* running-autopkgtests: fix packaging to make the script available
(LP: #2055466)
-- Chris Peterson <chris.peterson@canonical.com> Thu, 29 Feb 2024 11:09:14 -0800
ubuntu-dev-tools (0.200) unstable; urgency=medium
[ Gianfranco Costamagna ]
* Team upload
[ Chris Peterson ]
* Add support to see currently running autopkgtests (running-autopkgtests)
* running-autopkgtests: use f-strings
[ Athos Ribeiro ]
* syncpackage: log LP authentication errors before halting.
[ Ying-Chun Liu (PaulLiu) ]
* Drop qemu-debootstrap
qemu-debootstrap is deprecated for a while. In newer qemu release
the command is totally removed. We can use debootstrap directly.
Signed-off-by: Ying-Chun Liu (PaulLiu) <paulliu@debian.org>
[ Logan Rosen ]
* Don't rely on debootstrap for validating Ubuntu distro
-- Gianfranco Costamagna <locutusofborg@debian.org> Thu, 15 Feb 2024 17:53:48 +0100
ubuntu-dev-tools (0.199) unstable; urgency=medium
[ Simon Quigley ]
* Add my name to Uploaders.
[ Steve Langasek ]
* Introduce a pm-helper tool.
-- Simon Quigley <tsimonq2@debian.org> Mon, 29 Jan 2024 10:03:22 -0600
ubuntu-dev-tools (0.198) unstable; urgency=medium
* In check-mir, ignore debhelper-compat when checking the build
dependencies. This is expected to be a build dependency of all packages,
so warning about it in any way is surely a red herring.
* Add proper support for virtual packages in check-mir, basing the
determination solely off of binary packages. This is not expected to be a
typical case.
-- Simon Quigley <tsimonq2@debian.org> Wed, 10 Jan 2024 20:04:02 -0600
ubuntu-dev-tools (0.197) unstable; urgency=medium
* Update the manpage for syncpackage to reflect the ability to sync
multiple packages at once.
* When using pull-*-source to grab a package which already has a defined
Vcs- field, display the exact same warning message `apt source` does.
-- Simon Quigley <tsimonq2@debian.org> Tue, 03 Oct 2023 14:01:25 -0500
ubuntu-dev-tools (0.196) unstable; urgency=medium
* Allow the user to sync multiple packages at one time (LP: #1756748).
-- Simon Quigley <tsimonq2@debian.org> Fri, 04 Aug 2023 14:37:59 -0500
ubuntu-dev-tools (0.195) unstable; urgency=medium
* Add support for the non-free-firmware components in all tools already
referencing non-free.
-- Simon Quigley <tsimonq2@debian.org> Wed, 26 Jul 2023 13:03:31 -0500
ubuntu-dev-tools (0.194) unstable; urgency=medium
[ Gianfranco Costamagna ]
* ubuntu-build: For some reasons, now you need to be authenticated before
trying to use the "PersonTeam" class features.
Do it at the begin instead of replicating the same code inside the
tool itself.
[ Steve Langasek ]
* Remove references to deprecated
http://people.canonical.com/~ubuntu-archive.
* Remove references to architectures not supported in any active
Ubuntu release.
* Remove references to ftpmaster.internal. When this name is resolvable
but firewalled, syncpackage hangs; and these are tools for developers,
not for running in an automated context in the DCs where
ftpmaster.internal is reachable.
* Excise all references to cdbs (including in test cases)
* Set apt preferences for the -proposed pocket in mk-sbuild so that
it works as expected for lunar and forward.
[ Robie Basak ]
* ubuntutools/misc: swap iter_content for raw stream with "Accept-Encoding:
identity" to fix .diff.gz downloads (LP: #2025748).
[ Vladimir Petko ]
* Fix a typo introduced in the last upload that made mk-sbuild fail
unconditionally. LP: #2017177.
-- Gianfranco Costamagna <locutusofborg@debian.org> Sat, 08 Jul 2023 08:42:05 +0200
ubuntu-dev-tools (0.193) unstable; urgency=medium
* Don't run linters at build time, or in autopkgtests. (Closes: #1031436).
-- Stefano Rivera <stefanor@debian.org> Sat, 25 Feb 2023 13:19:56 -0400
ubuntu-dev-tools (0.192) unstable; urgency=medium
[ Benjamin Drung ]
* sponsor-patch:
+ Ignore exit code 1 of debdiff call.
+ Use --skip-patches instead of --no-preparation with dpkg-source -x.
* Demote bzr/brz from Recommends to Suggests, as nowadays git is the way.
Closes: #940531
* Use PEP440 compliant version in setup.py (LP: #1991606)
* Fix issues found by flake8 on the Python scripts
* Check Python scripts with flake8 again
* Format Python code with black and run black during package build
* Sort Python imports with isort and run isort during package build
* Replace deprecated optparse with argparse
* requestbackport: Remove useless loop from locate_package
* reverse-depends: Restore field titles format
* test: Fix deprecated return value for test case
* Fix all errors and warnings found by pylint and implement most refactorings
and conventions. Run pylint during package build again.
* Bump Standards-Version to 4.6.2
* Drop unneeded X-Python3-Version from d/control
[ Masahiro Yamada ]
* mk-sbuild:
+ Handle the new location of the Debian bullseye security archive.
Closes: #1001832; LP: #1955116
[ Mattia Rizzolo ]
* requestbackport:
+ Apply patch from Krytarik Raido and Unit 193 to update the template and
workflow after the new Ubuntu Backport process has been established.
LP: #1959115
-- Benjamin Drung <bdrung@debian.org> Wed, 01 Feb 2023 12:45:15 +0100
ubuntu-dev-tools (0.191) unstable; urgency=medium
[ Dan Streetman ]
* lpapicache:
+ Make sure that login() actually logins and doesn't use cached credentials.
* ubuntu-build:
+ Fix crash caused by a change in lpapicache that changed the default
operation mode from authenticated to anonymous. LP: #1984113
[ Stefano Rivera ]
* backportpackage:
+ Add support for lsb-release-minimal, which doesn't have a Python module.
Thanks to Gioele Barabucci for the patch. Closes: #1020901; LP: #1991828
[ Mattia Rizzolo ]
* ubuntutools/archive.py:
+ Fix operation of SourcePackage._source_urls() (as used, for example, in
SourcePackage.pull() called by backportpackage) to also work when the
class is instantiated with a URL as .dsc. Fixes regression from v0.184.
Thanks to Unit 193 for the initial patch.
-- Mattia Rizzolo <mattia@debian.org> Tue, 11 Oct 2022 13:56:03 +0200
ubuntu-dev-tools (0.190) unstable; urgency=medium
[ Dimitri John Ledkov ]
* mk-sbuild:
+ For ubuntu, fix the debootstrap script to "gutsy", so to allow using
mk-sbuild for newer releases without requiring a newer debootstrap.
[ Gianfranco Costamagna ]
* pbuilder-dist: fix typo kernal/kernel
[ Benjamin Drung ]
* Add missing files to debian/copyright
* Bump Standards-Version to 4.6.1
-- Benjamin Drung <bdrung@debian.org> Thu, 16 Jun 2022 10:55:17 +0200
ubuntu-dev-tools (0.189) unstable; urgency=medium
[ Heinrich Schuchardt ]
* mk-sbuild: don't require pkg-config-<target>. LP: #1966881.
[ Tobias Heider ]
* mk-sbuild: document SCHROOT_TYPE zfs in the manpage.
-- Mattia Rizzolo <mattia@debian.org> Mon, 04 Apr 2022 15:03:31 +0200
ubuntu-dev-tools (0.188) unstable; urgency=medium
[ Mattia Rizzolo ]
* archive.py:
+ Support Python 3.6 by calling functools.lru_cache() as a function, and
avoid using @functools.cached_property (both new in Python 3.8).
[ Graham Inggs ]
* lpapicache.py:
+ Use collections.abc.Callable instead of the long deprecated
collections.Callable. LP: #1959541
-- Mattia Rizzolo <mattia@debian.org> Mon, 07 Feb 2022 16:30:07 +0100
ubuntu-dev-tools (0.187) unstable; urgency=medium
[ Paride Legovini ]
* mk-sbuild:
+ Add support for zfs-snapshot schroots. LP: #1945349
[ Mattia Rizzolo ]
* mk-sbuild:
+ Apply patch from Peter Pentchev to avoid a broken log message.
Closes: #968316
* backportpackage:
+ Support backporting to Debian releases. Closes: #776442; LP: #974132
+ Fix the guessing algorithm for the target release:
- for Debian: pick the current stable release.
- for Ubuntu: pick the current LTS release.
[ Unit 193 ]
* backportpackage:
+ Change the generated Ubuntu version following the new policy from the
Backporters team.
[ Dan Streetman ]
* misc:
+ Refactor download progress bar code.
+ Save files that have Content-Encoding correctly,
such as the changes file from upload queue packages.
* pullpkg:
+ Extract source packages pulled from upload queue.
* hugdaylist:
+ Remove long unused and non-working script.
-- Mattia Rizzolo <mattia@debian.org> Sun, 05 Dec 2021 15:58:15 +0100
ubuntu-dev-tools (0.186) unstable; urgency=medium
* Replace nose with pytest (see: #997758).
-- Stefano Rivera <stefanor@debian.org> Sun, 24 Oct 2021 16:10:44 -0700
ubuntu-dev-tools (0.185) unstable; urgency=medium
[ Alex Murray ]
* ubuntutools/archive.py:
+ Fix crash due to PersonalPackageArchiveSourcePackage() returning the
wrong object when requesting a download url. LP: #1938659
[ Krytarik Raido ]
* merge-changelog: Fix setting of newlines.
[ Dan Streetman ]
* misc: download to tmp file, to avoid leftover 0-size file on error
* misc: handle ConnectionError as NotFoundError
* archive: use proper component source packages sometimes have different
component than their bpphs, so use the correct component when downloading
binaries (LP: #1943819)
* misc: fix flake8 complaints
[ Stefano Rivera ]
* Bump Standards-Version to 4.6.0, no changes needed.
-- Stefano Rivera <stefanor@debian.org> Fri, 17 Sep 2021 15:53:02 -0700
ubuntu-dev-tools (0.184) experimental; urgency=medium
[ Dan Streetman ]
* Drop never packaged ubuntu-archive-assistant.
* Add support for downloading from private PPAs:
+ ubuntutools/misc:
- Refactor to use Pathlib and f-strings.
- Refactor to use requests instead of urllib (for the earier auth)
+ ubuntutools/archive:
- Refactor to use Pathlib.
- Add support for the special URLs of private PPAs.
* Don't use existing file without verifying their checksum.
* tests: recreate the test package files on demand.
* Remove no longer used dependencies on python3-termcolor and python3-yaml
[ Mattia Rizzolo ]
* pbuilder-dist: use shutil.which instead of
distutils.spawn.find_executable() to save a dependency. LP: #1936697
* d/control:
+ Drop redundant Recommends that are already in Depends.
+ Bump debhelper compat level to 13.
[ Marco Trevisan (Treviño) ]
* mk-sbuild:
+ Enable debugging in the finish.sh script if --debug is used.
+ Add support to configure ccache for each schroot.
-- Mattia Rizzolo <mattia@debian.org> Sat, 17 Jul 2021 17:31:19 +0200
ubuntu-dev-tools (0.183) unstable; urgency=medium
[ Dan Streetman ]
* pbuilder-dist: include missing import
-- Stefano Rivera <stefanor@debian.org> Tue, 08 Jun 2021 10:09:11 -0400
ubuntu-dev-tools (0.182) unstable; urgency=medium
[ Dan Streetman ]
* syncpackage, ubuntutools/archive.py:
Don't save dsc file to disk until requested with pull()
(LP: #1928946)
* syncpackage:
Don't login to LP if using --simulate
* d/t/control: Add minimum flake8 version
The --extend-exclude parameter is first available in flake8 3.8.0
* ubuntutools/archive.py: Fix flake8 test failure
* d/rules, d/control: Override build tests to use flake8 and nosetests3
[ Stefano Rivera ]
* Respect nocheck in DEB_BUILD_OPTIONS, again.
-- Stefano Rivera <stefanor@debian.org> Sun, 06 Jun 2021 19:52:18 -0400
ubuntu-dev-tools (0.181) unstable; urgency=medium
[ Logan Rosen ]
* Fix a couple of remaining issues from the py2→py3 move.
[ Krytarik Raido ]
* Fix typo in the logging configuration.
[ Dan Streetman ]
* pbuilder: Handle debian change from /updates to -security. LP: #1916633
Starting in bullseye, the security suite is -security instead of /updates.
* backportpackage: Don't use SourcePackage() directly. Closes: #983854
As the warning from 2010 says, don't use this class directly.
[ Balint Reczey ]
* mk-sbuild:
+ Use eatmydata only with the dpkg command.
Eatmydata wrapping the build as well could break tests.
Thanks to Julian Andres Klode for suggesting this solution
+ Use eatmydata by default.
Since only dpkg is wrapped in eatmydata, it should be a safe and
fast default. Eatmydata is widely used around apt, so it would be a
serious bug if a package couldn't be installed with eatmydata in use.
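The "wrap only dpkg" idea from the changelog entry above can be sketched as a conditional command prefix. This is an illustration of the design choice, not mk-sbuild's actual implementation; the helper name and paths are hypothetical.

```python
import shutil

def dpkg_command(args: list[str]) -> list[str]:
    """Build the dpkg invocation, prefixed with eatmydata when available.

    Only this one command gets the prefix; the rest of the build (including
    test runs) is left unwrapped, which is the point of the change.
    """
    cmd = ["/usr/bin/dpkg", *args]
    if shutil.which("eatmydata"):
        cmd = ["eatmydata", *cmd]
    return cmd

print(dpkg_command(["-i", "package.deb"]))
```

Whether the prefix appears depends on eatmydata being installed; the dpkg arguments are passed through unchanged either way.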
[ Marco Trevisan (Treviño) ]
* doc/mk-sbuild.1: Add documentation for --debootstrap-proxy and
DEBOOTSTRAP_PROXY. LP: #1926166
-- Mattia Rizzolo <mattia@debian.org> Sun, 02 May 2021 19:56:48 +0200
ubuntu-dev-tools (0.180) unstable; urgency=medium
* Drop coverage in the autopkgtest, as python3-nose-cov is not in Debian.
-- Mattia Rizzolo <mattia@debian.org> Fri, 19 Feb 2021 12:12:33 +0100
ubuntu-dev-tools (0.179) unstable; urgency=medium
[ Stefano Rivera ]
* archive.py: Evaluate the filter() fixing Debian source history queries
(LP: #1913330)
[ Dan Streetman ]
* allow running tests using tox
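The filter() fix in the 0.179 entry above is the standard Python 3 pitfall: filter() returns a one-shot iterator, so a result that is consumed twice silently comes back empty the second time. A minimal illustration:

```python
versions = ["1.0-1", "1.0-2", "2.0-1"]

# A lazy filter object is exhausted after the first pass...
lazy = filter(lambda v: v.startswith("1."), versions)
first_pass = list(lazy)
second_pass = list(lazy)
print(first_pass)   # ['1.0-1', '1.0-2']
print(second_pass)  # []

# ...so evaluating it into a list up front keeps the result reusable.
evaluated = list(filter(lambda v: v.startswith("1."), versions))
print(evaluated == first_pass)  # True
```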

debian/clean
@@ -1,2 +1 @@
*.egg-info/
test-data/example_*

debian/control
@@ -6,30 +6,36 @@ Uploaders:
Benjamin Drung <bdrung@debian.org>,
Stefano Rivera <stefanor@debian.org>,
Mattia Rizzolo <mattia@debian.org>,
Simon Quigley <tsimonq2@debian.org>,
Build-Depends:
black <!nocheck>,
dctrl-tools,
debhelper-compat (= 12),
debhelper-compat (= 13),
devscripts (>= 2.11.0~),
dh-make,
dh-python,
distro-info (>= 0.2~),
flake8,
isort <!nocheck>,
lsb-release,
pylint <!nocheck>,
python3-all,
python3-apt,
python3-dateutil,
python3-debian,
python3-debianbts,
python3-distro-info,
python3-httplib2,
python3-launchpadlib,
python3-launchpadlib-desktop,
python3-pytest,
python3-requests <!nocheck>,
python3-setuptools,
python3-termcolor <!nocheck>,
python3-yaml <!nocheck>,
Standards-Version: 4.5.1
Standards-Version: 4.7.2
Rules-Requires-Root: no
Vcs-Git: https://git.launchpad.net/ubuntu-dev-tools
Vcs-Browser: https://git.launchpad.net/ubuntu-dev-tools
Homepage: https://launchpad.net/ubuntu-dev-tools
X-Python3-Version: >= 3.6
Package: ubuntu-dev-tools
Architecture: all
@@ -48,33 +54,32 @@ Depends:
python3-debianbts,
python3-distro-info,
python3-httplib2,
python3-launchpadlib,
python3-launchpadlib-desktop,
python3-lazr.restfulclient,
python3-ubuntutools (= ${binary:Version}),
python3-yaml,
sensible-utils,
sudo,
tzdata,
${misc:Depends},
${perl:Depends},
Recommends:
bzr | brz,
bzr-builddeb | brz-debian,
arch-test,
ca-certificates,
debian-archive-keyring,
debian-keyring,
debootstrap,
dput,
genisoimage,
lintian,
patch,
pbuilder | cowbuilder | sbuild,
python3-debianbts,
sbuild | pbuilder | cowbuilder,
python3-dns,
quilt,
reportbug (>= 3.39ubuntu1),
ubuntu-keyring | ubuntu-archive-keyring,
arch-test
Suggests:
bzr | brz,
bzr-builddeb | brz-debian,
qemu-user-static,
Description: useful tools for Ubuntu developers
This is a collection of useful tools that Ubuntu developers use to make their
@@ -91,7 +96,6 @@ Description: useful tools for Ubuntu developers
- dch-repeat - used to repeat a change log into an older release.
- grab-merge - grabs a merge from merges.ubuntu.com easily.
- grep-merges - search for pending merges from Debian.
- hugdaylist - compile HugDay lists from bug list URLs.
- import-bug-from-debian - copy a bug from the Debian BTS to Launchpad
- merge-changelog - manually merges two Debian changelogs with the same base
version.
@@ -114,6 +118,8 @@ Description: useful tools for Ubuntu developers
- requestsync - files a sync request with Debian changelog and rationale.
- reverse-depends - find the reverse dependencies (or build dependencies) of
a package.
- running-autopkgtests - lists the currently running and/or queued
autopkgtests on the Ubuntu autopkgtest infrastructure
- seeded-in-ubuntu - query if a package is safe to upload during a freeze.
- setup-packaging-environment - assistant to get an Ubuntu installation
ready for packaging work.
@@ -132,11 +138,13 @@ Package: python3-ubuntutools
Architecture: all
Section: python
Depends:
python3-dateutil,
python3-debian,
python3-distro-info,
python3-httplib2,
python3-launchpadlib,
python3-launchpadlib-desktop,
python3-lazr.restfulclient,
python3-requests,
sensible-utils,
${misc:Depends},
${python3:Depends},

debian/copyright
@@ -3,26 +3,30 @@ Upstream-Name: Ubuntu Developer Tools
Upstream-Contact: Ubuntu Developers <ubuntu-devel-discuss@lists.ubuntu.com>
Source: https://launchpad.net/ubuntu-dev-tools
Files: *
backportpackage
Files: backportpackage
bash_completion/pbuilder-dist
check-symbols
debian/*
doc/backportpackage.1
doc/check-symbols.1
doc/requestsync.1
doc/ubuntu-iso.1
doc/running-autopkgtests.1
GPL-2
README.updates
requestsync
setup.py
TODO
ubuntu-iso
ubuntutools/requestsync/lp.py
ubuntutools/requestsync/mail.py
ubuntutools/requestsync/*.py
Copyright: 2007, Albert Damen <albrt@gmx.net>
2010, Benjamin Drung <bdrung@ubuntu.com>
2007-2010, Canonical Ltd.
2010-2024, Benjamin Drung <bdrung@ubuntu.com>
2007-2023, Canonical Ltd.
2006-2007, Daniel Holbach <daniel.holbach@ubuntu.com>
2010, Evan Broder <evan@ebroder.net>
2006-2007, Luke Yelavich <themuso@ubuntu.com>
2009-2010, Michael Bienia <geser@ubuntu.com>
2024-2025, Simon Quigley <tsimonq2@debian.org>
2010-2011, Stefano Rivera <stefanor@ubuntu.com>
2008, Stephan Hermann <sh@sourcecode.de>
2007, Steve Kowalik <stevenk@ubuntu.com>
@@ -70,20 +74,28 @@ License: GPL-2+
On Debian systems, the complete text of the GNU General Public License
version 2 can be found in the /usr/share/common-licenses/GPL-2 file.
Files: doc/bitesize.1
Files: doc/lp-bitesize.1
doc/check-mir.1
doc/grab-merge.1
doc/hugdaylist.1
doc/merge-changelog.1
doc/pm-helper.1
doc/setup-packaging-environment.1
doc/syncpackage.1
bitesize
lp-bitesize
check-mir
GPL-3
grab-merge
hugdaylist
merge-changelog
pm-helper
pyproject.toml
run-linters
running-autopkgtests
setup-packaging-environment
syncpackage
Copyright: 2010, Benjamin Drung <bdrung@ubuntu.com>
2007-2011, Canonical Ltd.
ubuntutools/running_autopkgtests.py
ubuntutools/utils.py
Copyright: 2010-2024, Benjamin Drung <bdrung@ubuntu.com>
2007-2024, Canonical Ltd.
2008, Jonathan Patrick Davies <jpds@ubuntu.com>
2008-2010, Martin Pitt <martin.pitt@canonical.com>
2009, Siegfried-Angel Gevatter Pujals <rainct@ubuntu.com>
@@ -105,13 +117,23 @@ Files: dch-repeat
doc/dch-repeat.1
doc/grep-merges.1
doc/mk-sbuild.1
doc/pull-pkg.1
doc/ubuntu-build.1
grep-merges
mk-sbuild
pull-pkg
pull-*debs
pull-*-source
requirements.txt
test-requirements.txt
tox.ini
ubuntu-build
ubuntutools/lp/libsupport.py
ubuntutools/__init__.py
ubuntutools/lp/__init__.py
ubuntutools/lp/lpapicache.py
ubuntutools/lp/udtexceptions.py
ubuntutools/misc.py
ubuntutools/pullpkg.py
Copyright: 2007-2010, Canonical Ltd.
2008-2009, Iain Lane <iain@orangesquash.org.uk>
2006, John Dong <jdong@ubuntu.com>
@@ -137,7 +159,6 @@ License: GPL-3+
version 3 can be found in the /usr/share/common-licenses/GPL-3 file.
Files: doc/pull-debian-debdiff.1
doc/pull-pkg.1
doc/requestbackport.1
doc/reverse-depends.1
doc/seeded-in-ubuntu.1
@@ -147,12 +168,10 @@ Files: doc/pull-debian-debdiff.1
doc/update-maintainer.1
enforced-editing-wrapper
pull-debian-debdiff
pull-pkg
requestbackport
reverse-depends
seeded-in-ubuntu
sponsor-patch
test-data/*
ubuntu-upload-permission
ubuntutools/archive.py
ubuntutools/builder.py
@@ -162,12 +181,15 @@ Files: doc/pull-debian-debdiff.1
ubuntutools/sponsor_patch/*
ubuntutools/test/*
ubuntutools/update_maintainer.py
ubuntutools/version.py
update-maintainer
Copyright: 2009-2011, Benjamin Drung <bdrung@ubuntu.com>
.pylintrc
Copyright: 2009-2024, Benjamin Drung <bdrung@ubuntu.com>
2010, Evan Broder <evan@ebroder.net>
2008, Siegfried-Angel Gevatter Pujals <rainct@ubuntu.com>
2010-2011, Stefano Rivera <stefanor@ubuntu.com>
2017-2019, Dan Streetman <ddstreet@canonical.com>
2017-2021, Dan Streetman <ddstreet@canonical.com>
2024, Canonical Ltd.
License: ISC
Permission to use, copy, modify, and/or distribute this software for any
purpose with or without fee is hereby granted, provided that the above

debian/rules
@@ -5,5 +5,10 @@ override_dh_auto_clean:
rm -f .coverage
rm -rf .tox
override_dh_auto_test:
ifeq (,$(filter nocheck,$(DEB_BUILD_OPTIONS)))
python3 -m pytest -v ubuntutools
endif
%:
dh $@ --with python3 --buildsystem=pybuild

debian/source/lintian-overrides (new file)
@@ -0,0 +1,3 @@
# pyc files are machine-generated; they're expected to have long lines and have unstated copyright
source: file-without-copyright-information *.pyc [debian/copyright]
source: very-long-line-length-in-source-file * > 512 [*.pyc:*]

debian/tests/control
@@ -1,7 +1,7 @@
Test-Command: flake8 -v --max-line-length=99 --extend-exclude=ubuntu-archive-assistant,ubuntu_archive_assistant
Depends: flake8
Restrictions: allow-stderr
Test-Command: nosetests3 -v ubuntutools --with-coverage --cover-package=ubuntutools ubuntutools
Depends: python3-nose, python3-nose-cov, python3-setuptools, @
Test-Command: python3 -m pytest -v ubuntutools
Depends:
dh-make,
python3-pytest,
python3-setuptools,
@,
Restrictions: allow-stderr

@@ -1,26 +0,0 @@
.TH HUGDAYLIST "1" "August 27, 2008" "ubuntu-dev-tools"
.SH NAME
hugdaylist \- produce MoinMoin wiki formatted tables based on a Launchpad bug list
.SH SYNOPSIS
.B hugdaylist [\fB\-n\fP|\fB\-\-number <NUMBER>\fP] \fBlaunchpad-buglist-url\fP
.SH DESCRIPTION
\fBhugdaylist\fP produces MoinMoin wiki formatted tables based on a
Launchpad bug list
.SH OPTIONS
.TP
\fB\-\-number=<NUMBER>\fP
This option allows you to specify the number of entries to output.
.TP
\fBlaunchpad-buglist-url\fP
Required, this option is a URL pointing to a launchpad bug list.
.SH AUTHOR
\fBhugdaylist\fP has been written by Canonical Ltd., Daniel Holbach
<daniel.holbach@canonical.com> and Jonathan Patrick Davies <jpds@ubuntu.com>.
This manual page was written by Ryan Kavanagh <ryanakca@kubuntu.org>.
.PP
Both are released under the GNU General Public License, version 3.

@@ -1,21 +1,21 @@
.TH bitesize "1" "May 9 2010" "ubuntu-dev-tools"
.TH lp-bitesize "1" "May 9 2010" "ubuntu-dev-tools"
.SH NAME
bitesize \- Add \fBbitesize\fR tag to bugs and add a comment.
lp-bitesize \- Add \fBbitesize\fR tag to bugs and add a comment.
.SH SYNOPSIS
.B bitesize \fR<\fIbug number\fR>
.B lp-bitesize \fR<\fIbug number\fR>
.br
.B bitesize \-\-help
.B lp-bitesize \-\-help
.SH DESCRIPTION
\fBbitesize\fR adds a bitesize tag to the bug, if it's not there yet. It
\fBlp-bitesize\fR adds a bitesize tag to the bug, if it's not there yet. It
also adds a comment to the bug indicating that you are willing to help with
fixing it.
It checks for permission to operate on a given bug first,
then performs the required tasks on Launchpad.
.SH OPTIONS
Listed below are the command line options for \fBbitesize\fR:
Listed below are the command line options for \fBlp-bitesize\fR:
.TP
.BR \-h ", " \-\-help
Display a help message and exit.
@@ -48,7 +48,7 @@ The default value for \fB--lpinstance\fR.
.BR ubuntu\-dev\-tools (5)
.SH AUTHORS
\fBbitesize\fR and this manual page were written by Daniel Holbach
\fBlp-bitesize\fR and this manual page were written by Daniel Holbach
<daniel.holbach@canonical.com>.
.PP
Both are released under the terms of the GNU General Public License, version 3.

@@ -64,6 +64,15 @@ Disable checking gpg signatures of downloaded Release files by using
debootstrap's \fB\-\-no\-check\-gpg\fR option. See \fBdebootstrap\fR (8)
for more details.
.TP
.B \-\-debootstrap\-proxy\fR=\fIPROXY
Use \fIPROXY\fR as the apt proxy.
.TP
.B \-\-eatmydata
Install and use eatmydata (default)
.TP
.B \-\-skip\-eatmydata
Don't install and use eatmydata
.TP
.B \-\-distro\fR=\fIDISTRO
Enable distro-specific logic.
When not provided, the distribution is determined from \fIrelease\fR.
@@ -74,10 +83,31 @@ Specify a volume group, and subsequently use a default \fBSCHROOT_TYPE\fR of
"\fBlvm-snapshot\fR" rather than "\fBdirectory\fR" (via overlayfs or
aufs) mounts.
.TP
.B \-\-zfs-dataset=\fIDATASET
Specify a zfs dataset, and subsequently use a default \fBSCHROOT_TYPE\fR of
"\fBzfs-snapshot\fR" rather than "\fBdirectory\fR" (via overlayfs or
aufs) mounts.
.TP
.B \-\-type\fR=\fISCHROOT_TYPE
Specify a \fBSCHROOT_TYPE\fR. Supported values are "\fBdirectory\fR"
(default if \fB\-\-vg\fR not specified), "\fBlvm-snapshot\fR" (default
if \fB\-\-vg\fR specified), "\fBbtrfs-snapshot\fR", and "\fBfile\fR".
if \fB\-\-vg\fR specified), "\fBbtrfs-snapshot\fR", "\fBzfs-snapshot\fR"
and "\fBfile\fR".
.TP
.B \-\-ccache
Enable usage of \fBccache\fR by default. See \fBccache\fR (1) for
more details.
.TP
.B \-\-ccache-dir=\fIPATH
Use \fBPATH\fR as schroot ccache directory. This directory can be
safely shared by multiple schroots, but they will all use the same
\fBCCACHE_MAXSIZE\fR.
Defaults to /var/cache/ccache-sbuild.
See \fBccache\fR (1) for more details.
.TP
.B \-\-ccache-size=\fISIZE
Sets \fBSIZE\fR as the schroot \fBCCACHE_DIR\fR max-size used by ccache.
See \fBccache\fR (1) for more details.
.SH ENVIRONMENT VARIABLES
.TP
@@ -120,6 +150,14 @@ Keyring file to use for checking gpg signatures of retrieved release files
Disable gpg verification of retrieved release files (same as
\fB\-\-debootstrap\-no\-check\-gpg\fR)
.TP
.B DEBOOTSTRAP_PROXY
Proxy to use for apt. (same as
\fB\-\-debootstrap\-proxy\fR)
.TP
.B EATMYDATA
Enable or disable eatmydata usage, see \fB\-\-eatmydata\fR
and \fB\-\-skip\-eatmydata\fR
.TP
.B SOURCE_CHROOTS_DIR
Use \fBSOURCE_CHROOTS_DIR\fR as home of schroot source directories.
(default \fB/var/lib/schroot/chroots\fR)
@@ -131,6 +169,18 @@ Use \fBSOURCE_CHROOTS_TGZ\fR as home of schroot source tarballs.
.B CHROOT_SNAPSHOT_DIR
Use \fBCHROOT_SNAPSHOT_DIR\fR as home of mounted btrfs snapshots.
(default \fB/var/lib/schroot/snapshots\fR)
.TP
.B CCACHE
Enable \fBccache\fR (1) by default.
(defaults to \fB0\fR)
.TP
.B CCACHE_DIR
Use \fBCCACHE_DIR\fR as the \fBccache\fR (1) directory.
(default \fB/var/cache/ccache-sbuild\fR)
.TP
.B CCACHE_SIZE
Use \fBCCACHE_SIZE\fR as the \fBccache\fR (1) max-size.
(defaults to \fB4G\fR)
.SH FILES

@@ -20,7 +20,7 @@ like for example \fBpbuilder\-feisty\fP, \fBpbuilder\-sid\fP, \fBpbuilder\-gutsy
.PP
The same applies to \fBcowbuilder\-dist\fP, which uses cowbuilder. The main
difference between both is that pbuilder compresses the created chroot as a
a tarball, thus using less disc space but needing to uncompress (and possibly
tarball, thus using less disc space but needing to uncompress (and possibly
compress) its contents again on each run, and cowbuilder doesn't do this.
.SH USAGE
@@ -38,7 +38,7 @@ This optional parameter will attempt to construct a chroot in a foreign
architecture.
For some architecture pairs (e.g. i386 on an amd64 install), the chroot
will be created natively.
For others (e.g. armel on an i386 install), qemu\-user\-static will be
For others (e.g. arm64 on an amd64 install), qemu\-user\-static will be
used.
Note that some combinations (e.g. amd64 on an i386 install) require
special separate kernel handling, and may break in unexpected ways.

doc/pm-helper.1 (new file)
@@ -0,0 +1,44 @@
.\" Copyright (C) 2023, Canonical Ltd.
.\"
.\" This program is free software; you can redistribute it and/or
.\" modify it under the terms of the GNU General Public License, version 3.
.\"
.\" This program is distributed in the hope that it will be useful,
.\" but WITHOUT ANY WARRANTY; without even the implied warranty of
.\" MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
.\" General Public License for more details.
.\"
.\" You should have received a copy of the GNU General Public License
.\" along with this program. If not, see <http://www.gnu.org/licenses/>.
.TH pm\-helper 1 "June 2023" ubuntu\-dev\-tools
.SH NAME
pm\-helper \- helper to guide a developer through proposed\-migration work
.SH SYNOPSIS
.B pm\-helper \fR[\fIoptions\fR] [\fIpackage\fR]
.SH DESCRIPTION
Claim a package from proposed\-migration to work on and get additional
information (such as the state of the package in Debian) that may be helpful
in unblocking it.
.PP
This tool is incomplete and under development.
.SH OPTIONS
.TP
.B \-l \fIINSTANCE\fR, \fB\-\-launchpad\fR=\fIINSTANCE\fR
Use the specified instance of Launchpad (e.g. "staging"), instead of
the default of "production".
.TP
.B \-v\fR, \fB--verbose\fR
Be more verbose.
.TP
\fB\-h\fR, \fB\-\-help\fR
Display a help message and exit
.SH AUTHORS
\fBpm\-helper\fR and this manpage were written by Steve Langasek
<steve.langasek@ubuntu.com>.
.PP
Both are released under the GPLv3 license.

@@ -0,0 +1,15 @@
.TH running\-autopkgtests "1" "18 January 2024" "ubuntu-dev-tools"
.SH NAME
running\-autopkgtests \- dumps a list of currently running autopkgtests
.SH SYNOPSIS
.B running\-autopkgtests
.SH DESCRIPTION
Dumps a list of currently running and queued tests in Autopkgtest.
Pass --running to only see running tests, or --queued to only see
queued tests. Passing both will print both, which is the default behavior.
.SH AUTHOR
.B running\-autopkgtests
was written by Chris Peterson <chris.peterson@canonical.com>.

@@ -6,7 +6,13 @@
\fBsetup-packaging-environment\fR
.SH DESCRIPTION
\fBsetup-packaging-environment\fR aims to make it more straightforward for new contributors to get their Ubuntu installation ready for packaging work. It ensures that all four components from Ubuntu's official repositories are enabled along with their corresponding source repositories. It also installs a minimal set of packages needed for Ubuntu packaging work (ubuntu-dev-tools, devscripts, debhelper, cdbs, patchutils, pbuilder, and build-essential). Finally, it assists in defining the DEBEMAIL and DEBFULLNAME environment variables.
\fBsetup-packaging-environment\fR aims to make it more straightforward for new
contributors to get their Ubuntu installation ready for packaging work. It
ensures that all four components from Ubuntu's official repositories are enabled
along with their corresponding source repositories. It also installs a minimal
set of packages needed for Ubuntu packaging work (ubuntu-dev-tools, devscripts,
debhelper, patchutils, pbuilder, and build-essential). Finally, it assists
in defining the DEBEMAIL and DEBFULLNAME environment variables.
.SH AUTHORS
\fBsetup-packaging-environment\fR was written by Siegfried-A. Gevatter <rainct@ubuntu.com>.

@@ -4,11 +4,11 @@ syncpackage \- copy source packages from Debian to Ubuntu
.\"
.SH SYNOPSIS
.B syncpackage
[\fIoptions\fR] \fI<.dsc URL/path or package name>\fR
[\fIoptions\fR] \fI<.dsc URL/path or package name(s)>\fR
.\"
.SH DESCRIPTION
\fBsyncpackage\fR causes a source package to be copied from Debian to
Ubuntu.
\fBsyncpackage\fR causes one or more source package(s) to be copied from Debian
to Ubuntu.
.PP
\fBsyncpackage\fR allows you to upload files with the same checksums of the
Debian ones, as the common script used by Ubuntu archive administrators does,
@@ -58,7 +58,7 @@ Display more progress information.
\fB\-F\fR, \fB\-\-fakesync\fR
Perform a fakesync, to work around a tarball mismatch between Debian and
Ubuntu.
This option ignores blacklisting, and performs a local sync.
This option ignores blocklisting, and performs a local sync.
It implies \fB\-\-no\-lp\fR, and will leave a signed \fB.changes\fR file
for you to upload.
.TP

@@ -1,9 +1,14 @@
.TH UBUNTU-BUILD "1" "June 2010" "ubuntu-dev-tools"
.TH UBUNTU-BUILD "1" "Mar 2024" "ubuntu-dev-tools"
.SH NAME
ubuntu-build \- command-line interface to Launchpad build operations
.SH SYNOPSIS
.B ubuntu-build <srcpackage> <release> <operation>
.nf
\fBubuntu-build\fR <srcpackage> <release> <operation>
\fBubuntu-build\fR --batch [--retry] [--rescore \fIPRIORITY\fR] [--arch \fIARCH\fR [...]]
[--series \fISERIES\fR] [--state \fIBUILD-STATE\fR]
[-A \fIARCHIVE\fR] [pkg]...
.fi
.SH DESCRIPTION
\fBubuntu-build\fR provides a command line interface to the Launchpad build
@@ -38,8 +43,7 @@ operations.
\fB\-a\fR ARCHITECTURE, \fB\-\-arch\fR=\fIARCHITECTURE\fR
Rebuild or rescore a specific architecture. Valid
architectures are:
armel, armhf, arm64, amd64, hppa, i386, ia64,
lpia, powerpc, ppc64el, riscv64, s390x, sparc.
armhf, arm64, amd64, i386, powerpc, ppc64el, riscv64, s390x.
.TP
Batch processing:
.IP
@@ -59,15 +63,16 @@ Retry builds (give\-back).
\fB\-\-rescore\fR=\fIPRIORITY\fR
Rescore builds to <priority>.
.IP
\fB\-\-arch2\fR=\fIARCHITECTURE\fR
\fB\-\-arch\fR=\fIARCHITECTURE\fR
Affect only 'architecture' (can be used several
times). Valid architectures are:
armel, armhf, arm64, amd64, hppa, i386, ia64,
lpia, powerpc, ppc64el, riscv64, s390x, sparc.
arm64, amd64, i386, powerpc, ppc64el, riscv64, s390x.
.IP
\fB\-A=\fIARCHIVE\fR
Act on the named archive (ppa) instead of on the main Ubuntu archive.
.SH AUTHORS
\fBubuntu-build\fR was written by Martin Pitt <martin.pitt@canonical.com>, and
this manual page was written by Jonathan Patrick Davies <jpds@ubuntu.com>.
.PP
Both are released under the terms of the GNU General Public License, version 3
or (at your option) any later version.
Both are released under the terms of the GNU General Public License, version 3.

@@ -22,7 +22,10 @@
# UDT_EDIT_WRAPPER_TEMPLATE_RE: An extra boilerplate-detecting regex.
# UDT_EDIT_WRAPPER_FILE_DESCRIPTION: The type of file being edited.
import optparse
# pylint: disable=invalid-name
# pylint: enable=invalid-name
import argparse
import os
import re
@@ -30,33 +33,30 @@ from ubuntutools.question import EditFile
def main():
parser = optparse.OptionParser('%prog [options] filename')
options, args = parser.parse_args()
parser = argparse.ArgumentParser(usage="%(prog)s [options] filename")
parser.add_argument("filename", help=argparse.SUPPRESS)
args = parser.parse_args()
if not os.path.isfile(args.filename):
parser.error(f"File {args.filename} does not exist")
if len(args) != 1:
parser.error('A filename must be specified')
body = args[0]
if not os.path.isfile(body):
parser.error('File %s does not exist' % body)
if 'UDT_EDIT_WRAPPER_EDITOR' in os.environ:
os.environ['EDITOR'] = os.environ['UDT_EDIT_WRAPPER_EDITOR']
if "UDT_EDIT_WRAPPER_EDITOR" in os.environ:
os.environ["EDITOR"] = os.environ["UDT_EDIT_WRAPPER_EDITOR"]
else:
del os.environ['EDITOR']
del os.environ["EDITOR"]
if 'UDT_EDIT_WRAPPER_VISUAL' in os.environ:
os.environ['VISUAL'] = os.environ['UDT_EDIT_WRAPPER_VISUAL']
if "UDT_EDIT_WRAPPER_VISUAL" in os.environ:
os.environ["VISUAL"] = os.environ["UDT_EDIT_WRAPPER_VISUAL"]
else:
del os.environ['VISUAL']
del os.environ["VISUAL"]
placeholders = []
if 'UDT_EDIT_WRAPPER_TEMPLATE_RE' in os.environ:
placeholders.append(re.compile(
os.environ['UDT_EDIT_WRAPPER_TEMPLATE_RE']))
if "UDT_EDIT_WRAPPER_TEMPLATE_RE" in os.environ:
placeholders.append(re.compile(os.environ["UDT_EDIT_WRAPPER_TEMPLATE_RE"]))
description = os.environ.get('UDT_EDIT_WRAPPER_FILE_DESCRIPTION', 'file')
description = os.environ.get("UDT_EDIT_WRAPPER_FILE_DESCRIPTION", "file")
EditFile(body, description, placeholders).edit()
EditFile(args.filename, description, placeholders).edit()
if __name__ == '__main__':
if __name__ == "__main__":
main()

@@ -19,63 +19,70 @@
# You should have received a copy of the GNU General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
import optparse
import sys
# pylint: disable=invalid-name
# pylint: enable=invalid-name
import argparse
import json
import sys
from httplib2 import Http, HttpLib2Error
import ubuntutools.misc
from ubuntutools import getLogger
Logger = getLogger()
def main():
parser = optparse.OptionParser(
usage='%prog [options] [string]',
description='List pending merges from Debian matching string')
args = parser.parse_args()[1]
if len(args) > 1:
parser.error('Too many arguments')
elif len(args) == 1:
match = args[0]
else:
match = None
parser = argparse.ArgumentParser(
usage="%(prog)s [options] [string]",
description="List pending merges from Debian matching string",
)
parser.add_argument("string", nargs="?", help=argparse.SUPPRESS)
args = parser.parse_args()
ubuntutools.misc.require_utf8()
for component in ('main', 'main-manual',
'restricted', 'restricted-manual',
'universe', 'universe-manual',
'multiverse', 'multiverse-manual'):
url = 'https://merges.ubuntu.com/%s.json' % component
for component in (
"main",
"main-manual",
"restricted",
"restricted-manual",
"universe",
"universe-manual",
"multiverse",
"multiverse-manual",
):
url = f"https://merges.ubuntu.com/{component}.json"
try:
headers, page = Http().request(url)
except HttpLib2Error as e:
Logger.exception(e)
sys.exit(1)
if headers.status != 200:
Logger.error("%s: %s %s" % (url, headers.status,
headers.reason))
Logger.error("%s: %s %s", url, headers.status, headers.reason)
sys.exit(1)
for merge in json.loads(page):
package = merge['source_package']
author, uploader = '', ''
if merge.get('user'):
author = merge['user']
if merge.get('uploader'):
uploader = '(%s)' % merge['uploader']
teams = merge.get('teams', [])
package = merge["source_package"]
author, uploader = "", ""
if merge.get("user"):
author = merge["user"]
if merge.get("uploader"):
uploader = f"({merge['uploader']})"
teams = merge.get("teams", [])
pretty_uploader = '{} {}'.format(author, uploader)
if (match is None or match in package or match in author
or match in uploader or match in teams):
Logger.info('%s\t%s' % (package, pretty_uploader))
pretty_uploader = f"{author} {uploader}"
if (
args.string is None
or args.string in package
or args.string in author
or args.string in uploader
or args.string in teams
):
Logger.info("%s\t%s", package, pretty_uploader)
if __name__ == '__main__':
if __name__ == "__main__":
main()
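The refactored grep-merges loop above fetches one JSON document per component from merges.ubuntu.com and filters records by substring. The filtering step can be exercised offline; the field names (source_package, user, uploader, teams) match what the script reads, while the sample records themselves are made up:

```python
import json

# Made-up sample of the merges.ubuntu.com payload shape.
page = json.dumps(
    [
        {"source_package": "hello", "user": "alice", "uploader": "Alice <alice@example.com>"},
        {"source_package": "curl", "user": "bob", "teams": ["foundations"]},
    ]
)

match = "alice"
matches = []
for merge in json.loads(page):
    author = merge.get("user") or ""
    uploader = f"({merge['uploader']})" if merge.get("uploader") else ""
    teams = merge.get("teams", [])
    if match in merge["source_package"] or match in author or match in uploader or match in teams:
        matches.append(f"{merge['source_package']}\t{author} {uploader}")

print("\n".join(matches))
```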

@@ -1,141 +0,0 @@
#!/usr/bin/python3
# -*- coding: utf-8 -*-
#
# Copyright (C) 2007 Canonical Ltd., Daniel Holbach
# Copyright (C) 2008 Jonathan Patrick Davies <jpds@ubuntu.com>
#
# ##################################################################
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; version 3.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# See file /usr/share/common-licenses/GPL-3 for more details.
#
# ##################################################################
#
#
# hugdaylist - produces MoinMoin wiki formatted tables based on a Launchpad bug
# list.
#
# hugdaylist <url>
# - produces lists like https://wiki.ubuntu.com/UbuntuBugDay/20070912?action=raw
#
# hugdaylist -n <howmany> <url>
# - will only list <howmany> URLs.
import sys
from optparse import OptionParser
from launchpadlib.launchpad import Launchpad
from ubuntutools.lp.libsupport import translate_web_api
from ubuntutools import getLogger
Logger = getLogger()
def check_args():
howmany = -1
url = ""
# Our usage options.
usage = "usage: %prog [-n <number>] launchpad-buglist-url"
opt_parser = OptionParser(usage)
# Options - namely just the number of bugs to output.
opt_parser.add_option("-n", "--number", type="int",
dest="number", help="Number of entries to output.")
# Parse arguments.
(options, args) = opt_parser.parse_args()
# Check if we want a number other than the default.
howmany = options.number
# Check that we have an URL.
if not args:
Logger.error("An URL pointing to a Launchpad bug list is required.")
opt_parser.print_help()
sys.exit(1)
else:
url = args[0]
return (howmany, url)
def filter_unsolved(task):
# TODO: don't use this filter here, only check status and assignee of
# the given task
# Filter out special types of bugs:
# - https://wiki.ubuntu.com/Bugs/HowToTriage#Special%20types%20of%20bugs
# this is expensive, parse name out of self_link instead?
subscriptions = set(s.person.name for s in task.bug.subscriptions)
if (task.status != "Fix Committed" and
(not task.assignee or task.assignee.name in ['motu', 'desktop-bugs']) and
'ubuntu-sponsors' not in subscriptions and
'ubuntu-archive' not in subscriptions):
return True
return False
def main():
(howmany, url) = check_args()
if len(url.split("?", 1)) == 2:
# search options not supported, because there is no mapping web ui
# options <-> API options
Logger.error("Options in url are not supported, url: %s" % url)
sys.exit(1)
launchpad = None
try:
launchpad = Launchpad.login_with("ubuntu-dev-tools", 'production')
except IOError as error:
Logger.exception(error)
sys.exit(1)
api_url = translate_web_api(url, launchpad)
try:
product = launchpad.load(api_url)
except Exception as error:
response = getattr(error, "response", {})
if response.get("status", None) == "404":
Logger.error("The URL at '%s' does not appear to be a "
"valid url to a product" % url)
sys.exit(1)
else:
raise
bug_list = [b for b in product.searchTasks() if filter_unsolved(b)]
if not bug_list:
Logger.info("Bug list of %s is empty." % url)
sys.exit(0)
if howmany == -1:
howmany = len(bug_list)
Logger.info("""
## ||<rowbgcolor="#CCFFCC"> This task is done || somebody || ||
## ||<rowbgcolor="#FFFFCC"> This task is assigned || somebody || <status> ||
## ||<rowbgcolor="#FFEBBB"> This task isn't || ... || ||
## ||<rowbgcolor="#FFCCCC"> This task is blocked on something || somebody || <explanation> ||
|| Bug || Subject || Triager ||""")
for i in list(bug_list)[:howmany]:
bug = i.bug
Logger.info('||<rowbgcolor="#FFEBBB"> [%s %s] || %s || ||' %
(bug.web_link, bug.id, bug.title))
if __name__ == '__main__':
try:
main()
except KeyboardInterrupt:
Logger.error("Aborted.")
sys.exit(1)

@@ -21,40 +21,213 @@
#
# ##################################################################
# pylint: disable=invalid-name
# pylint: enable=invalid-name
import argparse
import debianbts
import logging
import re
import sys
import webbrowser
from collections.abc import Iterable
from email.message import EmailMessage
import debianbts
from launchpadlib.launchpad import Launchpad
from ubuntutools import getLogger
from ubuntutools.config import UDTConfig
from ubuntutools import getLogger
Logger = getLogger()
ATTACHMENT_MAX_SIZE = 2000
def main():
def parse_args() -> argparse.Namespace:
parser = argparse.ArgumentParser()
parser.add_argument(
"-b",
"--browserless",
action="store_true",
help="Don't open the bug in the browser at the end",
)
parser.add_argument(
"-l",
"--lpinstance",
metavar="INSTANCE",
help="LP instance to connect to (default: production)",
)
parser.add_argument(
"-v", "--verbose", action="store_true", help="Print info about the bug being imported"
)
parser.add_argument(
"-n",
"--dry-run",
action="store_true",
help="Don't actually open a bug (also sets verbose)",
)
parser.add_argument(
"-p", "--package", help="Launchpad package to file bug against (default: Same as Debian)"
)
parser.add_argument(
"--no-conf", action="store_true", help="Don't read config files or environment variables."
)
parser.add_argument("bugs", nargs="+", help="Bug number(s) or URL(s)")
return parser.parse_args()
def get_bug_numbers(bug_list: Iterable[str]) -> list[int]:
bug_re = re.compile(r"bug=(\d+)")
parser = argparse.ArgumentParser()
parser.add_argument("-b", "--browserless", action="store_true",
help="Don't open the bug in the browser at the end")
parser.add_argument("-l", "--lpinstance", metavar="INSTANCE",
help="LP instance to connect to (default: production)")
parser.add_argument("-v", "--verbose", action="store_true",
help="Print info about the bug being imported")
parser.add_argument("-n", "--dry-run", action="store_true",
help="Don't actually open a bug (also sets verbose)")
parser.add_argument("-p", "--package",
help="Launchpad package to file bug against "
"(default: Same as Debian)")
parser.add_argument("--no-conf", action="store_true",
help="Don't read config files or environment variables.")
parser.add_argument("bugs", nargs="+", help="Bug number(s) or URL(s)")
options = parser.parse_args()
bug_nums = []
for bug_num in bug_list:
if bug_num.startswith("http"):
# bug URL
match = bug_re.search(bug_num)
if match is None:
Logger.error("Can't determine bug number from %s", bug_num)
sys.exit(1)
bug_num = match.groups()[0]
bug_num = bug_num.lstrip("#")
bug_nums.append(int(bug_num))
return bug_nums
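As a sanity check, here is a self-contained sketch of the URL/number parsing that `get_bug_numbers()` performs. The helper below is a simplified re-implementation for illustration only, not the script's own function:

```python
import re

# Simplified re-implementation of the parsing above, for illustration only.
bug_re = re.compile(r"bug=(\d+)")

def to_bug_number(token: str) -> int:
    """Accept '1234', '#1234', or a debbugs URL containing 'bug=1234'."""
    if token.startswith("http"):
        match = bug_re.search(token)
        if match is None:
            raise ValueError(f"Can't determine bug number from {token}")
        token = match.group(1)
    return int(token.lstrip("#"))

print([to_bug_number(t) for t in
       ["1234", "#5678", "https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=987654"]])
# [1234, 5678, 987654]
```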
def walk_multipart_message(message: EmailMessage) -> tuple[str, list[tuple[int, EmailMessage]]]:
summary = ""
attachments = []
i = 1
for part in message.walk():
content_type = part.get_content_type()
if content_type.startswith("multipart/"):
# we're already iterating on multipart items
# let's just skip the multipart extra metadata
continue
if content_type == "application/pgp-signature":
# we're not interested in importing pgp signatures
continue
if part.is_attachment():
attachments.append((i, part))
elif content_type.startswith("image/"):
# images here are not attachments, they are inline, but Launchpad can't handle that,
# so let's add them as attachments
summary += f"Message part #{i}\n"
summary += f"[inline image '{part.get_filename()}']\n\n"
attachments.append((i, part))
elif content_type.startswith("text/html"):
summary += f"Message part #{i}\n"
summary += "[inline html]\n\n"
attachments.append((i, part))
elif content_type == "text/plain":
summary += f"Message part #{i}\n"
summary += part.get_content() + "\n"
else:
raise RuntimeError(
f"""Unknown message part
Your Debian bug is too weird to be imported in Launchpad, sorry.
You can fix that by patching this script in ubuntu-dev-tools.
Faulty message part:
{part}"""
)
i += 1
return summary, attachments
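For reference, a minimal standard-library example of the kind of multipart message `walk_multipart_message()` consumes; the message content here is made up:

```python
from email.message import EmailMessage

# Build a demo message: a plain-text body plus one attached log file,
# similar in shape to what debianbts.get_bug_log() returns.
msg = EmailMessage()
msg["Subject"] = "demo bug report"
msg.set_content("Hello from a Debian bug")
msg.add_attachment("log data\n", filename="build.log")

# Skip the multipart/* container entries, as the walker above does.
parts = [p for p in msg.walk()
         if not p.get_content_type().startswith("multipart/")]
print([p.get_content_type() for p in parts])  # ['text/plain', 'text/plain']
print([p.is_attachment() for p in parts])     # [False, True]
```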
def process_bugs(
bugs: Iterable[debianbts.Bugreport],
launchpad: Launchpad,
package: str,
dry_run: bool = True,
browserless: bool = False,
) -> bool:
debian = launchpad.distributions["debian"]
ubuntu = launchpad.distributions["ubuntu"]
lp_debbugs = launchpad.bug_trackers.getByName(name="debbugs")
err = False
for bug in bugs:
ubupackage = bug.source
if package:
ubupackage = package
bug_num = bug.bug_num
subject = bug.subject
log = debianbts.get_bug_log(bug_num)
message = log[0]["message"]
assert isinstance(message, EmailMessage)
attachments: list[tuple[int, EmailMessage]] = []
if message.is_multipart():
summary, attachments = walk_multipart_message(message)
else:
summary = str(message.get_payload())
target = ubuntu.getSourcePackage(name=ubupackage)
if target is None:
Logger.error(
"Source package '%s' is not in Ubuntu. Please specify "
"the destination source package with --package",
ubupackage,
)
err = True
continue
description = f"Imported from Debian bug http://bugs.debian.org/{bug_num}:\n\n{summary}"
# LP limits descriptions to 50K chars
description = (description[:49994] + " [...]") if len(description) > 50000 else description
Logger.debug("Target: %s", target)
Logger.debug("Subject: %s", subject)
Logger.debug("Description: ")
Logger.debug(description)
for i, attachment in attachments:
Logger.debug("Attachment #%s (%s)", i, attachment.get_filename() or "inline")
Logger.debug("Content:")
if attachment.get_content_type() == "text/plain":
content = attachment.get_content()
if len(content) > ATTACHMENT_MAX_SIZE:
content = (
content[:ATTACHMENT_MAX_SIZE]
+ f" [attachment cropped after {ATTACHMENT_MAX_SIZE} characters...]"
)
Logger.debug(content)
else:
Logger.debug("[data]")
if dry_run:
Logger.info("Dry-Run: not creating Ubuntu bug.")
continue
u_bug = launchpad.bugs.createBug(target=target, title=subject, description=description)
for i, attachment in attachments:
name = f"#{i}-{attachment.get_filename() or 'inline'}"
content = attachment.get_content()
if isinstance(content, str):
# Launchpad only wants bytes
content = content.encode()
u_bug.addAttachment(
filename=name,
data=content,
comment=f"Imported from Debian bug http://bugs.debian.org/{bug_num}",
)
d_sp = debian.getSourcePackage(name=bug.source)
if d_sp is None and package:
d_sp = debian.getSourcePackage(name=package)
d_task = u_bug.addTask(target=d_sp)
d_watch = u_bug.addWatch(remote_bug=bug_num, bug_tracker=lp_debbugs)
d_task.bug_watch = d_watch
d_task.lp_save()
Logger.info("Opened %s", u_bug.web_link)
if not browserless:
webbrowser.open(u_bug.web_link)
return err
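The 50K-character description clamp used in `process_bugs()` can be sketched in isolation; `clamp_description` is a hypothetical name introduced here, not part of the script:

```python
# Hypothetical standalone sketch of the description clamping above:
# Launchpad rejects descriptions over 50,000 characters, so anything
# longer is cut at 49,994 characters and " [...]" (6 chars) is appended,
# keeping the result at exactly the 50,000-character limit.
def clamp_description(description: str, limit: int = 50000) -> str:
    marker = " [...]"
    if len(description) > limit:
        return description[: limit - len(marker)] + marker
    return description

print(len(clamp_description("x" * 60000)))  # 50000
print(clamp_description("short"))           # short
```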
def main() -> None:
options = parse_args()
config = UDTConfig(options.no_conf)
if options.lpinstance is None:
@ -69,77 +242,15 @@ def main():
if options.verbose:
Logger.setLevel(logging.DEBUG)
debian = launchpad.distributions['debian']
ubuntu = launchpad.distributions['ubuntu']
lp_debbugs = launchpad.bug_trackers.getByName(name='debbugs')
bug_nums = []
for bug_num in options.bugs:
if bug_num.startswith("http"):
# bug URL
match = bug_re.search(bug_num)
if match is None:
Logger.error("Can't determine bug number from %s", bug_num)
sys.exit(1)
bug_num = match.groups()[0]
bug_num = bug_num.lstrip("#")
bug_num = int(bug_num)
bug_nums.append(bug_num)
bugs = debianbts.get_status(*bug_nums)
bugs = debianbts.get_status(get_bug_numbers(options.bugs))
if not bugs:
Logger.error("Cannot find any of the listed bugs")
sys.exit(1)
err = False
for bug in bugs:
ubupackage = package = bug.source
if options.package:
ubupackage = options.package
bug_num = bug.bug_num
subject = bug.subject
log = debianbts.get_bug_log(bug_num)
summary = log[0]['message'].get_payload()
target = ubuntu.getSourcePackage(name=ubupackage)
if target is None:
Logger.error("Source package '%s' is not in Ubuntu. Please specify "
"the destination source package with --package",
ubupackage)
err = True
continue
description = ('Imported from Debian bug http://bugs.debian.org/%d:\n\n%s' %
(bug_num, summary))
# LP limits descriptions to 50K chars
description = (description[:49994] + ' [...]') if len(description) > 50000 else description
Logger.debug('Target: %s' % target)
Logger.debug('Subject: %s' % subject)
Logger.debug('Description: ')
Logger.debug(description)
if options.dry_run:
Logger.info('Dry-Run: not creating Ubuntu bug.')
continue
u_bug = launchpad.bugs.createBug(target=target, title=subject,
description=description)
d_sp = debian.getSourcePackage(name=package)
if d_sp is None and options.package:
d_sp = debian.getSourcePackage(name=options.package)
d_task = u_bug.addTask(target=d_sp)
d_watch = u_bug.addWatch(remote_bug=bug_num, bug_tracker=lp_debbugs)
d_task.bug_watch = d_watch
d_task.lp_save()
Logger.info("Opened %s", u_bug.web_link)
if not options.browserless:
webbrowser.open(u_bug.web_link)
if err:
if process_bugs(bugs, launchpad, options.package, options.dry_run, options.browserless):
sys.exit(1)
if __name__ == '__main__':
if __name__ == "__main__":
main()


@ -21,20 +21,20 @@
# Authors:
# Daniel Holbach <daniel.holbach@canonical.com>
import argparse
import sys
from optparse import OptionParser
from launchpadlib.launchpad import Launchpad
from launchpadlib.errors import HTTPError
from ubuntutools.config import UDTConfig
from launchpadlib.launchpad import Launchpad
from ubuntutools import getLogger
from ubuntutools.config import UDTConfig
Logger = getLogger()
def error_out(msg):
Logger.error(msg)
def error_out(msg, *args):
Logger.error(msg, *args)
sys.exit(1)
@ -42,54 +42,64 @@ def save_entry(entry):
try:
entry.lp_save()
except HTTPError as error:
error_out(error.content)
error_out("%s", error.content)
def tag_bug(bug):
bug.tags = bug.tags + ['bitesize'] # LP: #254901 workaround
bug.tags = bug.tags + ["bitesize"] # LP: #254901 workaround
save_entry(bug)
def main():
usage = "Usage: %prog <bug number>"
opt_parser = OptionParser(usage)
opt_parser.add_option("-l", "--lpinstance", metavar="INSTANCE",
help="Launchpad instance to connect to "
"(default: production)",
dest="lpinstance", default=None)
opt_parser.add_option("--no-conf",
help="Don't read config files or "
"environment variables.",
dest="no_conf", default=False, action="store_true")
(options, args) = opt_parser.parse_args()
config = UDTConfig(options.no_conf)
if options.lpinstance is None:
options.lpinstance = config.get_value("LPINSTANCE")
if len(args) < 1:
opt_parser.error("Need at least one bug number.")
parser = argparse.ArgumentParser(usage="%(prog)s [options] <bug number>")
parser.add_argument(
"-l",
"--lpinstance",
metavar="INSTANCE",
help="Launchpad instance to connect to (default: production)",
dest="lpinstance",
default=None,
)
parser.add_argument(
"--no-conf",
help="Don't read config files or environment variables.",
dest="no_conf",
default=False,
action="store_true",
)
parser.add_argument("bug_number", help=argparse.SUPPRESS)
args = parser.parse_args()
config = UDTConfig(args.no_conf)
if args.lpinstance is None:
args.lpinstance = config.get_value("LPINSTANCE")
launchpad = Launchpad.login_with("ubuntu-dev-tools", options.lpinstance)
launchpad = Launchpad.login_with("ubuntu-dev-tools", args.lpinstance)
if launchpad is None:
error_out("Couldn't authenticate to Launchpad.")
# check that the new main bug isn't a duplicate
try:
bug = launchpad.bugs[args[0]]
bug = launchpad.bugs[args.bug_number]
except HTTPError as error:
if error.response.status == 401:
error_out("Don't have enough permissions to access bug %s. %s" %
(args[0], error.content))
error_out(
"Don't have enough permissions to access bug %s. %s",
args.bug_number,
error.content,
)
else:
raise
if 'bitesize' in bug.tags:
if "bitesize" in bug.tags:
error_out("Bug is already marked as 'bitesize'.")
bug.newMessage(content="I'm marking this bug as 'bitesize' as it looks "
bug.newMessage(
content="I'm marking this bug as 'bitesize' as it looks "
"like an issue that is easy to fix and suitable "
"for newcomers in Ubuntu development. If you need "
"any help with fixing it, talk to me about it.")
"any help with fixing it, talk to me about it."
)
bug.subscribe(person=launchpad.me)
tag_bug(launchpad.bugs[bug.id]) # fresh bug object, LP: #336866 workaround
if __name__ == '__main__':
if __name__ == "__main__":
main()


@ -18,24 +18,31 @@
# You should have received a copy of the GNU General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
# pylint: disable=invalid-name
# pylint: enable=invalid-name
import sys
from debian.changelog import Changelog
from ubuntutools import getLogger
Logger = getLogger()
def usage(exit_code=1):
Logger.info('''Usage: merge-changelog <left changelog> <right changelog>
Logger.info(
"""Usage: merge-changelog <left changelog> <right changelog>
merge-changelog takes two changelogs that once shared a common source,
merges them back together, and prints the merged result to stdout. This
is useful if you need to manually merge an Ubuntu package with a new
Debian release of the package.
''')
"""
)
sys.exit(exit_code)
########################################################################
# Changelog Management
########################################################################
@ -44,9 +51,9 @@ Debian release of the package.
def merge_changelog(left_changelog, right_changelog):
"""Merge a changelog file."""
with open(left_changelog) as f:
with open(left_changelog, encoding="utf-8") as f:
left_cl = Changelog(f)
with open(right_changelog) as f:
with open(right_changelog, encoding="utf-8") as f:
right_cl = Changelog(f)
left_versions = set(left_cl.versions)
@ -54,7 +61,10 @@ def merge_changelog(left_changelog, right_changelog):
left_blocks = iter(left_cl)
right_blocks = iter(right_cl)
for version in sorted(left_versions | right_versions, reverse=True):
clist = sorted(left_versions | right_versions, reverse=True)
remaining = len(clist)
for version in clist:
remaining -= 1
if version in left_versions:
block = next(left_blocks)
if version in right_versions:
@ -64,11 +74,11 @@ def merge_changelog(left_changelog, right_changelog):
assert block.version == version
Logger.info(str(block).strip() + '\n\n')
Logger.info("%s%s", str(block).strip(), "\n" if remaining else "")
def main():
if len(sys.argv) > 1 and sys.argv[1] in ('-h', '--help'):
if len(sys.argv) > 1 and sys.argv[1] in ("-h", "--help"):
usage(0)
if len(sys.argv) != 3:
usage(1)
@ -80,5 +90,5 @@ def main():
sys.exit(0)
if __name__ == '__main__':
if __name__ == "__main__":
main()

mk-sbuild

@ -26,7 +26,7 @@
# ##################################################################
#
# This script creates chroots designed to be used in a snapshot mode
# (with LVM, btrfs, overlay, overlay or aufs) with schroot and sbuild.
# (with LVM, btrfs, zfs, overlay, or aufs) with schroot and sbuild.
# Much love to "man sbuild-setup", https://wiki.ubuntu.com/PbuilderHowto,
# and https://help.ubuntu.com/community/SbuildLVMHowto.
#
@ -40,6 +40,8 @@ SOURCE_CHROOTS_DIR="/var/lib/schroot/chroots"
SOURCE_CHROOTS_TGZ="/var/lib/schroot/tarballs"
CHROOT_SNAPSHOT_DIR="/var/lib/schroot/snapshots"
SCHROOT_PROFILE="sbuild"
CCACHE_DIR="/var/cache/ccache-sbuild"
CCACHE_SIZE="4G"
function usage()
{
@ -49,6 +51,7 @@ function usage()
echo " --name=NAME Base name for the schroot (arch is appended)"
echo " --personality=PERSONALITY What personality to use (defaults to match --arch)"
echo " --vg=VG use LVM snapshots, with group VG"
echo " --zfs-dataset=DATASET use ZFS snapshots, with parent dataset DATASET"
echo " --debug Turn on script debugging"
echo " --skip-updates Do not include -updates pocket in sources.list"
echo " --skip-security Do not include -security pocket in sources.list"
@ -62,14 +65,21 @@ function usage()
echo " --debootstrap-keyring=KEYRING"
echo " Use KEYRING to check signatures of retrieved Release files"
echo " --debootstrap-no-check-gpg Disables checking gpg signatures of retrieved Release files"
echo " --eatmydata Install and use eatmydata"
echo " --skip-eatmydata Don't install and use eatmydata"
echo " --eatmydata Install and use eatmydata (default)"
echo "   --ccache                  Install, configure, and use ccache by default"
echo " --ccache-dir=PATH Sets the CCACHE_DIR to PATH"
echo " (can be shared between all schroots, defaults to ${CCACHE_DIR})"
echo " --ccache-size=SIZE Sets the ccache max-size to SIZE"
echo " (shared by each CCACHE_DIR, defaults to ${CCACHE_SIZE})"
echo " --distro=DISTRO Install specific distro:"
echo " 'ubuntu' or 'debian' "
echo " (defaults to determining from release name)"
echo " --target=ARCH Target architecture for cross-building"
echo " --type=SCHROOT_TYPE Define the schroot type:"
echo " 'directory'(default), 'file', or 'btrfs-snapshot'"
echo " 'directory' (default), 'file', or 'btrfs-snapshot'."
echo " 'lvm-snapshot' is selected via --vg"
echo " 'zfs-snapshot' is selected via --zfs-dataset"
echo ""
echo "Configuration (via ~/.mk-sbuild.rc)"
echo " LV_SIZE Size of source LVs (default ${LV_SIZE})"
@ -89,7 +99,12 @@ function usage()
echo " DEBOOTSTRAP_PROXY Apt proxy (same as --debootstrap-proxy)"
echo " DEBOOTSTRAP_KEYRING GPG keyring (same as --debootstrap-keyring)"
echo " DEBOOTSTRAP_NO_CHECK_GPG Disable GPG verification (same as --debootstrap-no-check-gpg)"
echo " EATMYDATA Enable --eatmydata"
echo " EATMYDATA Enable or disable eatmydata usage, see --eatmydata and --skip-eatmydata"
echo " CCACHE Enable --ccache"
echo " CCACHE_DIR Path for ccache (can be shared between all schroots, "
echo " same as --ccache-dir, default ${CCACHE_DIR})"
echo " CCACHE_SIZE Sets the ccache max-size (shared by each CCACHE_DIR, "
echo " same as --ccache-size, default ${CCACHE_SIZE})"
echo " TEMPLATE_SOURCES A template for sources.list"
echo " TEMPLATE_SCHROOTCONF A template for schroot.conf stanza"
if [ -z "$1" ]; then
@ -102,26 +117,57 @@ function usage()
if [ -z "$1" ]; then
usage
fi
OPTS=`getopt -o 'h' --long "help,debug,skip-updates,skip-security,skip-proposed,eatmydata,arch:,name:,source-template:,debootstrap-mirror:,debootstrap-include:,debootstrap-exclude:,debootstrap-opts:,debootstrap-proxy:,debootstrap-no-check-gpg,debootstrap-keyring:,personality:,distro:,vg:,type:,target:" -- "$@"`
supported_options=(
help
debug
skip-updates
skip-security
skip-proposed
skip-eatmydata
ccache
arch:
name:
source-template:
debootstrap-mirror:
debootstrap-include:
debootstrap-exclude:
debootstrap-opts:
debootstrap-proxy:
debootstrap-no-check-gpg
debootstrap-keyring:
personality:
distro:
vg:
zfs-dataset:
type:
target:
ccache-dir:
ccache-size:
)
OPTS=$(getopt -o 'h' --long "$(IFS=, && echo "${supported_options[*]}")" -- "$@")
eval set -- "$OPTS"
VG=""
DISTRO=""
COMMAND_PREFIX=""
name=""
proxy="_unset_"
DEBOOTSTRAP_NO_CHECK_GPG=0
EATMYDATA=0
EATMYDATA=1
CCACHE=0
USE_PKGBINARYMANGLER=0
while :; do
case "$1" in
--debug)
DEBUG=1
set -x
shift
;;
--arch)
CHROOT_ARCH="$2"
case $2 in
armel|armhf|i386|lpia)
armhf|i386)
if [ -z "$personality" ]; then
personality="linux32"
fi
@ -186,8 +232,12 @@ while :; do
DEBOOTSTRAP_NO_CHECK_GPG=1
shift
;;
--eatmydata)
EATMYDATA=1
--skip-eatmydata)
EATMYDATA=0
shift
;;
--ccache)
CCACHE=1
shift
;;
--distro)
@ -198,6 +248,10 @@ while :; do
VG="$2"
shift 2
;;
--zfs-dataset)
ZFS_PARENT_DATASET="$2"
shift 2
;;
--type)
SCHROOT_TYPE="$2"
shift 2
@ -206,6 +260,14 @@ while :; do
TARGET_ARCH="$2"
shift 2
;;
--ccache-dir)
CCACHE_DIR="$2"
shift 2
;;
--ccache-size)
CCACHE_SIZE="$2"
shift 2
;;
--)
shift
break
@ -242,10 +304,26 @@ if [ ! -w /var/lib/sbuild ]; then
# Prepare a usable default .sbuildrc
if [ ! -e ~/.sbuildrc ]; then
cat > ~/.sbuildrc <<EOM
# *** VERIFY AND UPDATE \$mailto and \$maintainer_name BELOW ***
# *** THIS COMMAND IS DEPRECATED ***
#
# In sbuild 0.87.0 and later, the unshare backend is available. This is
# expected to become the default in a future release.
#
# This is the new preferred way of building Debian packages, making the manual
# creation of schroots no longer necessary. To retain the default behavior,
# you may remove this comment block and continue.
#
# To test the unshare backend while retaining the default settings, run sbuild
# with --chroot-mode=unshare like this:
# $ sbuild --chroot-mode=unshare --dist=unstable hello
#
# To switch to the unshare backend by default (recommended), uncomment the
# following lines and delete the rest of the file (with the exception of the
# last two lines):
#\$chroot_mode = 'unshare';
#\$unshare_mmdebstrap_keep_tarball = 1;
# Mail address where logs are sent to (mandatory, no default!)
\$mailto = '$USER';
# *** VERIFY AND UPDATE \$mailto and \$maintainer_name BELOW ***
# Name to use as override in .changes files for the Maintainer: field
#\$maintainer_name='$USER <$USER@localhost>';
@ -333,12 +411,15 @@ elif [ -z "$DISTRO" ]; then
exit 1
fi
if [ "$DISTRO" = "ubuntu" ]; then
ubuntu_dist_ge() {
local releases="$(ubuntu-distro-info --all)"
# By default DEBOOTSTRAP_SCRIPT must match RELEASE
DEBOOTSTRAP_SCRIPT="$RELEASE"
dist_ge() {
local releases="$($3-distro-info --all)"
local left=999
local right=0
local seq=1
for i in $releases; do
if [ $1 = $i ]; then
local left=$seq
@ -346,6 +427,7 @@ if [ "$DISTRO" = "ubuntu" ]; then
fi
seq=$((seq+1))
done
seq=1
for i in $releases; do
if [ $2 = $i ]; then
@ -354,8 +436,21 @@ if [ "$DISTRO" = "ubuntu" ]; then
fi
seq=$((seq+1))
done
[ $left -ge $right ] && return 0 || return 1
}
ubuntu_dist_ge () {
dist_ge $1 $2 ubuntu
}
debian_dist_ge () {
dist_ge $1 $2 debian
}
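The shell `dist_ge` helper above effectively compares positions in the chronological release list printed by distro-info. A hedged Python sketch of the same idea; the release list here is a hand-written sample, not real distro-info output:

```python
# Illustrative re-implementation of dist_ge: release A is ">=" release B
# when A appears at the same position or later in the chronological list
# that `ubuntu-distro-info --all` (or debian-distro-info) would print.
def dist_ge(releases, left, right):
    return releases.index(left) >= releases.index(right)

# Hand-written sample list; the real script queries distro-info.
sample = ["trusty", "xenial", "bionic", "focal", "jammy", "noble"]
print(dist_ge(sample, "jammy", "focal"))   # True
print(dist_ge(sample, "xenial", "focal"))  # False
```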
if [ "$DISTRO" = "ubuntu" ]; then
# On Ubuntu, set DEBOOTSTRAP_SCRIPT to gutsy to allow building new RELEASES without new debootstrap
DEBOOTSTRAP_SCRIPT=gutsy
fi
# By default, name the schroot the same as the release
@ -402,10 +497,58 @@ if [ $EATMYDATA -eq 1 ]; then
esac
fi
if [ $CCACHE -eq 1 ]; then
if [ -z "$CCACHE_DIR" ] || [[ "$(dirname "$CCACHE_DIR")" == '/' ]]; then
echo "Invalid ccache dir: ${CCACHE_DIR}" >&2
exit 1
fi
# We can safely use a global cache path, in which case changing the size
# applies to all the schroots
setup_script="$CCACHE_DIR"/mk-sbuild-setup
if [ -d "$CCACHE_DIR" ]; then
echo "Reusing $CCACHE_DIR as CCACHE_DIR, will be configured to use max-size=${CCACHE_SIZE}"
rm -f "$setup_script"
else
echo "Configuring $CCACHE_DIR as CCACHE_DIR with max-size=${CCACHE_SIZE}"
sudo install --group=sbuild --mode=2775 -d "$CCACHE_DIR"
fi
if [ ! -x "$setup_script" ]; then
cat <<END | sudo tee "$setup_script" 1>/dev/null
#!/bin/sh
export CCACHE_DIR="$CCACHE_DIR"
export CCACHE_MAXSIZE="${CCACHE_SIZE}"
export CCACHE_UMASK=002
export CCACHE_COMPRESS=1
unset CCACHE_HARDLINK
export CCACHE_NOHARDLINK=1
export PATH="/usr/lib/ccache:\$PATH"
exec "\$@"
END
sudo chmod a+rx "$setup_script"
fi
if ! sudo grep -qs "$CCACHE_DIR" /etc/schroot/sbuild/fstab; then
# This acts on host configuration, but there is no other way to handle
# this; however, it won't affect anything
cat <<END | sudo tee -a /etc/schroot/sbuild/fstab 1>/dev/null
${CCACHE_DIR} ${CCACHE_DIR} none rw,bind 0 0
END
fi
DEBOOTSTRAP_INCLUDE="${DEBOOTSTRAP_INCLUDE:+$DEBOOTSTRAP_INCLUDE,}ccache"
BUILD_PKGS="$BUILD_PKGS ccache"
COMMAND_PREFIX="${COMMAND_PREFIX:+$COMMAND_PREFIX,}$setup_script"
fi
if [ -z "$SCHROOT_TYPE" ]; then
# To build the LV, we need to know which volume group to use
if [ -n "$VG" ]; then
SCHROOT_TYPE="lvm-snapshot"
# To build the ZFS dataset, we need to know which parent to use
elif [ -n "$ZFS_PARENT_DATASET" ]; then
SCHROOT_TYPE="zfs-snapshot"
else
SCHROOT_TYPE="directory"
fi
@ -452,7 +595,7 @@ case "$SCHROOT_TYPE" in
# Set up some variables for use in the paths and names
CHROOT_PATH="${SOURCE_CHROOTS_TGZ}/${CHROOT_NAME}.tgz"
;;
"btrfs-snapshot")
"btrfs-snapshot" | "zfs-snapshot")
if [ ! -d "${SOURCE_CHROOTS_DIR}" ]; then
sudo mkdir -p "${SOURCE_CHROOTS_DIR}"
fi
@ -469,8 +612,8 @@ esac
# Is the specified release known to debootstrap?
variant_opt="--variant=buildd"
if [ ! -r "/usr/share/debootstrap/scripts/$RELEASE" ]; then
echo "Specified release ($RELEASE) not known to debootstrap" >&2
if [ ! -r "/usr/share/debootstrap/scripts/$DEBOOTSTRAP_SCRIPT" ]; then
echo "Specified release ($DEBOOTSTRAP_SCRIPT) not known to debootstrap" >&2
exit 1
fi
@ -525,6 +668,7 @@ ubuntu)
if ubuntu_dist_ge "$RELEASE" "edgy"; then
# Add pkgbinarymangler (edgy and later)
BUILD_PKGS="$BUILD_PKGS pkgbinarymangler"
USE_PKGBINARYMANGLER=1
# Disable recommends for a smaller chroot (gutsy and later only)
if ubuntu_dist_ge "$RELEASE" "gutsy"; then
BUILD_PKGS="--no-install-recommends $BUILD_PKGS"
@ -541,7 +685,7 @@ debian)
DEBOOTSTRAP_MIRROR="http://deb.debian.org/debian"
fi
if [ -z "$COMPONENTS" ]; then
COMPONENTS="main non-free contrib"
COMPONENTS="main non-free non-free-firmware contrib"
fi
if [ -z "$SOURCES_PROPOSED_SUITE" ]; then
SOURCES_PROPOSED_SUITE="RELEASE-proposed-updates"
@ -549,8 +693,12 @@ debian)
# Debian only performs security updates
SKIP_UPDATES=1
if [ -z "$SOURCES_SECURITY_SUITE" ]; then
if debian_dist_ge "$RELEASE" "bullseye"; then
SOURCES_SECURITY_SUITE="RELEASE-security"
else
SOURCES_SECURITY_SUITE="RELEASE/updates"
fi
fi
if [ -z "$SOURCES_SECURITY_URL" ]; then
SOURCES_SECURITY_URL="http://security.debian.org/"
fi
@ -582,7 +730,7 @@ if [ -n "$TARGET_ARCH" ]; then
echo "Unknown target architecture $TARGET_ARCH" >&2
exit 1
fi
BUILD_PKGS="$BUILD_PKGS g++-$target_tuple pkg-config-$target_tuple dpkg-cross libc-dev:$TARGET_ARCH"
BUILD_PKGS="$BUILD_PKGS g++-$target_tuple pkg-config dpkg-cross libc-dev:$TARGET_ARCH"
fi
debootstrap_opts="--components=$(echo $COMPONENTS | tr ' ' ,)"
@ -620,12 +768,12 @@ DEBOOTSTRAP_COMMAND=debootstrap
if [ "$CHROOT_ARCH" != "$HOST_ARCH" ] ; then
case "$CHROOT_ARCH-$HOST_ARCH" in
# Sometimes we don't need qemu
amd64-i386|amd64-lpia|armel-armhf|armhf-armel|arm64-armel|arm64-armhf|armel-arm64|armhf-arm64|i386-amd64|i386-lpia|lpia-i386|powerpc-ppc64|ppc64-powerpc|sparc-sparc64|sparc64-sparc)
amd64-i386|arm64-armhf|armhf-arm64|i386-amd64|powerpc-ppc64|ppc64-powerpc)
;;
# Sometimes we do
*)
DEBOOTSTRAP_COMMAND=qemu-debootstrap
if ! which "$DEBOOTSTRAP_COMMAND"; then
DEBOOTSTRAP_COMMAND=debootstrap
if ! which "qemu-x86_64-static"; then
sudo apt-get install qemu-user-static
fi
;;
@ -658,6 +806,19 @@ case "$SCHROOT_TYPE" in
fi
sudo btrfs subvolume create "${MNT}"
;;
"zfs-snapshot")
ZFS_DATASET="${ZFS_PARENT_DATASET}/${CHROOT_NAME}"
if sudo zfs list "${ZFS_DATASET}" >/dev/null 2>&1; then
echo "E: ZFS dataset ${ZFS_DATASET} already exists; aborting" >&2
exit 1
fi
sudo zfs create -p -o mountpoint=legacy "${ZFS_DATASET}"
# Mount
MNT=`mktemp -d -t schroot-XXXXXX`
sudo mount -t zfs "${ZFS_DATASET}" "${MNT}"
;;
"file")
MNT=`mktemp -d -t schroot-XXXXXX`
esac
@ -679,7 +840,14 @@ esac
sudo mkdir -p -m 0700 "$MNT"/root/.gnupg
# debootstrap the chroot
sudo ${proxy:+"http_proxy=${proxy}"} "$DEBOOTSTRAP_COMMAND" --arch="$CHROOT_ARCH" $variant_opt $debootstrap_opts "$RELEASE" "$MNT" "${DEBOOTSTRAP_MIRROR:-http://archive.ubuntu.com/ubuntu}"
sudo ${proxy:+"http_proxy=${proxy}"} "$DEBOOTSTRAP_COMMAND" --arch="$CHROOT_ARCH" $variant_opt $debootstrap_opts "$RELEASE" "$MNT" "${DEBOOTSTRAP_MIRROR:-http://archive.ubuntu.com/ubuntu}" "$DEBOOTSTRAP_SCRIPT"
if [ $EATMYDATA -eq 1 ]; then
sudo mkdir -p "${MNT}/usr/local/libexec/mk-sbuild"
sudo ln -s /usr/bin/eatmydata "${MNT}/usr/local/libexec/mk-sbuild/dpkg"
echo 'Dir::Bin::dpkg "/usr/local/libexec/mk-sbuild/dpkg";' \
| sudo tee "${MNT}/etc/apt/apt.conf.d/00mk-sbuild-eatmydata" > /dev/null
fi
# Update the package sources
TEMP_SOURCES=`mktemp -t sources-XXXXXX`
@ -724,6 +892,13 @@ EOM
fi
fi
if [ -z "$SKIP_PROPOSED" ]; then
TEMP_PREFERENCES=`mktemp -t preferences-XXXXXX`
cat >> "$TEMP_PREFERENCES" <<EOM
# override for NotAutomatic: yes
Package: *
Pin: release a=*-proposed
Pin-Priority: 500
EOM
cat >> "$TEMP_SOURCES" <<EOM
deb ${MIRROR_ARCHS}${DEBOOTSTRAP_MIRROR} $SOURCES_PROPOSED_SUITE ${COMPONENTS}
deb-src ${DEBOOTSTRAP_MIRROR} $SOURCES_PROPOSED_SUITE ${COMPONENTS}
@ -749,9 +924,12 @@ fi
cat "$TEMP_SOURCES" | sed -e "s|RELEASE|$RELEASE|g" | \
sudo bash -c "cat > $MNT/etc/apt/sources.list"
rm -f "$TEMP_SOURCES"
if [ -n "$TEMP_PREFERENCES" ]; then
sudo mv "$TEMP_PREFERENCES" $MNT/etc/apt/preferences.d/proposed.pref
fi
# Copy the timezone (comment this out if you want to leave the chroot at UTC)
sudo cp -P --remove-destination /etc/localtime /etc/timezone "$MNT"/etc/
# Copy the timezone (uncomment this if you want to use your local time zone)
#sudo cp -P --remove-destination /etc/localtime /etc/timezone "$MNT"/etc/
# Create a schroot entry for this chroot
TEMP_SCHROOTCONF=`mktemp -t schrootconf-XXXXXX`
TEMPLATE_SCHROOTCONF=~/.mk-sbuild.schroot.conf
@ -778,9 +956,9 @@ root-groups=$ADMIN_GROUPS
type=SCHROOT_TYPE
profile=$SCHROOT_PROFILE
EOM
if [ $EATMYDATA -eq 1 ]; then
if [ -n "$COMMAND_PREFIX" ]; then
cat >> "$TEMP_SCHROOTCONF" <<EOM
command-prefix=eatmydata
command-prefix=${COMMAND_PREFIX}
EOM
fi
case "$SCHROOT_TYPE" in
@ -803,6 +981,12 @@ btrfs-source-subvolume=CHROOT_PATH
btrfs-snapshot-directory=CHROOT_SNAPSHOT_DIR
EOM
;;
zfs-snapshot)
cat >> "${TEMP_SCHROOTCONF}" <<EOM
zfs-dataset=ZFS_DATASET
EOM
;;
esac
fi
if [ ! -z "$personality" ]; then
@ -819,6 +1003,7 @@ sed -e "s|CHROOT_NAME|$CHROOT_NAME|g" \
-e "s|SNAPSHOT_SIZE|$SNAPSHOT_SIZE|g" \
-e "s|SCHROOT_TYPE|$SCHROOT_TYPE|g" \
-e "s|CHROOT_SNAPSHOT_DIR|$CHROOT_SNAPSHOT_DIR|g" \
-e "s|ZFS_DATASET|$ZFS_DATASET|g" \
"$TEMP_SCHROOTCONF" \
| sudo tee "/etc/schroot/chroot.d/sbuild-$CHROOT_NAME" > /dev/null
rm -f "$TEMP_SCHROOTCONF"
@ -840,7 +1025,9 @@ sudo chmod a+x "$MNT"/usr/sbin/policy-rc.d
# Create image finalization script
sudo bash -c "cat >> $MNT/finish.sh" <<EOM
#!/bin/bash
#set -x
if [ "$DEBUG" = 1 ]; then
set -x
fi
set -e
if [ -n "$proxy" ]; then
mkdir -p /etc/apt/apt.conf.d/
@ -861,6 +1048,25 @@ EOF
EOM
fi
if [ "$USE_PKGBINARYMANGLER" = 1 ]; then
sudo bash -c "cat >> $MNT/finish.sh" <<EOM
mkdir -p /etc/pkgbinarymangler/
cat > /etc/pkgbinarymangler/maintainermangler.conf <<EOF
# pkgmaintainermangler configuration file
# pkgmaintainermangler will do nothing unless enable is set to "true"
enable: true
# Configure what happens if /CurrentlyBuilding is present, but invalid
# (i. e. it does not contain a Package: field). If "ignore" (default),
# the file is ignored (i. e. the Maintainer field is mangled) and a
# warning is printed. If "fail" (or any other value), pkgmaintainermangler
# exits with an error, which causes a package build to fail.
invalid_currentlybuilding: ignore
EOF
EOM
fi
if [ -n "$TARGET_ARCH" ]; then
sudo bash -c "cat >> $MNT/finish.sh" <<EOM
# Configure target architecture
@ -879,7 +1085,7 @@ apt-get update || true
echo set debconf/frontend Noninteractive | debconf-communicate
echo set debconf/priority critical | debconf-communicate
# Install basic build tool set, trying to match buildd
apt-get -y --force-yes install $BUILD_PKGS
apt-get -y --force-yes -o Dpkg::Options::="--force-confold" install $BUILD_PKGS
# Set up expected /dev entries
if [ ! -r /dev/stdin ]; then ln -s /proc/self/fd/0 /dev/stdin; fi
if [ ! -r /dev/stdout ]; then ln -s /proc/self/fd/1 /dev/stdout; fi
@ -891,7 +1097,7 @@ EOM
sudo chmod a+x "$MNT"/finish.sh
case "$SCHROOT_TYPE" in
"lvm-snapshot")
"lvm-snapshot"|"zfs-snapshot")
sudo umount "$MNT"
rmdir "$MNT"
;;
@ -915,7 +1121,7 @@ echo ""
echo " To CHANGE the golden image: sudo schroot -c source:${CHROOT_NAME} -u root"
echo " To ENTER an image snapshot: schroot -c ${CHROOT_NAME}"
echo " To BUILD within a snapshot: sbuild -A -d ${CHROOT_NAME} PACKAGE*.dsc"
if [ "$CHROOT_ARCH" != "$TARGET_ARCH" ] ; then
if [ -n "$TARGET_ARCH" ] && [ "$CHROOT_ARCH" != "$TARGET_ARCH" ] ; then
echo " To BUILD for ${TARGET_ARCH}: sbuild -A -d ${CHROOT_NAME} --host ${TARGET_ARCH} PACKAGE*.dsc"
fi
echo ""


@ -29,25 +29,29 @@
# configurations. For example, a symlink called pbuilder-hardy will assume
# that the target distribution is always meant to be Ubuntu Hardy.
import distutils.spawn
# pylint: disable=invalid-name
# pylint: enable=invalid-name
import os
import os.path
import shutil
import subprocess
import sys
from contextlib import suppress
import debian.deb822
from distro_info import DebianDistroInfo, UbuntuDistroInfo, DistroDataOutdated
from distro_info import DebianDistroInfo, DistroDataOutdated, UbuntuDistroInfo
import ubuntutools.misc
import ubuntutools.version
from ubuntutools import getLogger
from ubuntutools.config import UDTConfig
from ubuntutools.question import YesNoQuestion
from ubuntutools import getLogger
Logger = getLogger()
class PbuilderDist(object):
class PbuilderDist:
def __init__(self, builder):
# Base directory where pbuilder will put all the files it creates.
self.base = None
@ -86,32 +90,36 @@ class PbuilderDist(object):
self.chroot_string = None
# Authentication method
self.auth = 'sudo'
self.auth = "sudo"
# Builder
self.builder = builder
self._debian_distros = DebianDistroInfo().all + \
['stable', 'testing', 'unstable']
# Distro info
self.debian_distro_info = DebianDistroInfo()
self.ubuntu_distro_info = UbuntuDistroInfo()
self._debian_distros = self.debian_distro_info.all + ["stable", "testing", "unstable"]
# Ensure that the used builder is installed
paths = set(os.environ['PATH'].split(':'))
paths |= set(('/sbin', '/usr/sbin', '/usr/local/sbin'))
paths = set(os.environ["PATH"].split(":"))
paths |= set(("/sbin", "/usr/sbin", "/usr/local/sbin"))
if not any(os.path.exists(os.path.join(p, builder)) for p in paths):
Logger.error('Could not find "%s".', builder)
sys.exit(1)
##############################################################
self.base = os.path.expanduser(os.environ.get('PBUILDFOLDER',
'~/pbuilder/'))
self.base = os.path.expanduser(os.environ.get("PBUILDFOLDER", "~/pbuilder/"))
if 'SUDO_USER' in os.environ:
Logger.warning('Running under sudo. '
'This is probably not what you want. '
'pbuilder-dist will use sudo itself, '
'when necessary.')
if os.stat(os.environ['HOME']).st_uid != os.getuid():
if "SUDO_USER" in os.environ:
Logger.warning(
"Running under sudo. "
"This is probably not what you want. "
"pbuilder-dist will use sudo itself, "
"when necessary."
)
if os.stat(os.environ["HOME"]).st_uid != os.getuid():
Logger.error("You don't own $HOME")
sys.exit(1)
@ -122,8 +130,8 @@ class PbuilderDist(object):
Logger.error('Cannot create base directory "%s"', self.base)
sys.exit(1)
if 'PBUILDAUTH' in os.environ:
self.auth = os.environ['PBUILDAUTH']
if "PBUILDAUTH" in os.environ:
self.auth = os.environ["PBUILDAUTH"]
self.system_architecture = ubuntutools.misc.host_architecture()
self.system_distro = ubuntutools.misc.system_distribution()
@ -144,16 +152,17 @@ class PbuilderDist(object):
Logger.error('"%s" is an invalid distribution codename.', distro)
sys.exit(1)
if not os.path.isfile(os.path.join('/usr/share/debootstrap/scripts/',
distro)):
if os.path.isdir('/usr/share/debootstrap/scripts/'):
if not os.path.isfile(os.path.join("/usr/share/debootstrap/scripts/", distro)):
if os.path.isdir("/usr/share/debootstrap/scripts/"):
# Debian experimental doesn't have a debootstrap file but
# should work nevertheless.
if distro not in self._debian_distros:
question = ('Warning: Unknown distribution "%s". '
'Do you want to continue' % distro)
answer = YesNoQuestion().ask(question, 'no')
if answer == 'no':
# should work nevertheless. Ubuntu releases automatically use
# the gutsy script as of debootstrap 1.0.128+nmu2ubuntu1.1.
if distro not in (self._debian_distros + self.ubuntu_distro_info.all):
question = (
f'Warning: Unknown distribution "{distro}". ' "Do you want to continue"
)
answer = YesNoQuestion().ask(question, "no")
if answer == "no":
sys.exit(0)
else:
Logger.error('Please install package "debootstrap".')
@@ -168,22 +177,23 @@ class PbuilderDist(object):
depending on this either save it into the appropriate variable
or finalize pbuilder-dist's execution.
"""
arguments = ('create', 'update', 'build', 'clean', 'login', 'execute')
arguments = ("create", "update", "build", "clean", "login", "execute")
if operation not in arguments:
if operation.endswith('.dsc'):
if operation.endswith(".dsc"):
if os.path.isfile(operation):
self.operation = 'build'
self.operation = "build"
return [operation]
else:
Logger.error('Could not find file "%s".', operation)
sys.exit(1)
else:
Logger.error('"%s" is not a recognized argument.\n'
'Please use one of these: %s.',
operation, ', '.join(arguments))
Logger.error(
'"%s" is not a recognized argument.\nPlease use one of these: %s.',
operation,
", ".join(arguments),
)
sys.exit(1)
else:
self.operation = operation
return []
@@ -199,30 +209,34 @@ class PbuilderDist(object):
if self.build_architecture == self.system_architecture:
self.chroot_string = self.target_distro
else:
self.chroot_string = (self.target_distro + '-'
+ self.build_architecture)
self.chroot_string = self.target_distro + "-" + self.build_architecture
prefix = os.path.join(self.base, self.chroot_string)
if '--buildresult' not in remaining_arguments:
result = os.path.normpath('%s_result/' % prefix)
if "--buildresult" not in remaining_arguments:
result = os.path.normpath(f"{prefix}_result/")
else:
location_of_arg = remaining_arguments.index('--buildresult')
location_of_arg = remaining_arguments.index("--buildresult")
result = os.path.normpath(remaining_arguments[location_of_arg + 1])
remaining_arguments.pop(location_of_arg + 1)
remaining_arguments.pop(location_of_arg)
if not self.logfile and self.operation != 'login':
if self.operation == 'build':
dsc_files = [a for a in remaining_arguments
if a.strip().endswith('.dsc')]
if not self.logfile and self.operation != "login":
if self.operation == "build":
dsc_files = [a for a in remaining_arguments if a.strip().endswith(".dsc")]
assert len(dsc_files) == 1
dsc = debian.deb822.Dsc(open(dsc_files[0]))
version = ubuntutools.version.Version(dsc['Version'])
name = (dsc['Source'] + '_' + version.strip_epoch() + '_' +
self.build_architecture + '.build')
dsc = debian.deb822.Dsc(open(dsc_files[0], encoding="utf-8"))
version = ubuntutools.version.Version(dsc["Version"])
name = (
dsc["Source"]
+ "_"
+ version.strip_epoch()
+ "_"
+ self.build_architecture
+ ".build"
)
self.logfile = os.path.join(result, name)
else:
self.logfile = os.path.join(result, 'last_operation.log')
self.logfile = os.path.join(result, "last_operation.log")
if not os.path.isdir(result):
try:
@@ -232,85 +246,89 @@ class PbuilderDist(object):
sys.exit(1)
arguments = [
'--%s' % self.operation,
'--distribution', self.target_distro,
'--buildresult', result,
f"--{self.operation}",
"--distribution",
self.target_distro,
"--buildresult",
result,
]
if self.operation == 'update':
arguments += ['--override-config']
if self.operation == "update":
arguments += ["--override-config"]
if self.builder == 'pbuilder':
arguments += ['--basetgz', prefix + '-base.tgz']
elif self.builder == 'cowbuilder':
arguments += ['--basepath', prefix + '-base.cow']
if self.builder == "pbuilder":
arguments += ["--basetgz", prefix + "-base.tgz"]
elif self.builder == "cowbuilder":
arguments += ["--basepath", prefix + "-base.cow"]
else:
Logger.error('Unrecognized builder "%s".', self.builder)
sys.exit(1)
if self.logfile:
arguments += ['--logfile', self.logfile]
arguments += ["--logfile", self.logfile]
if os.path.exists('/var/cache/archive/'):
arguments += ['--bindmounts', '/var/cache/archive/']
if os.path.exists("/var/cache/archive/"):
arguments += ["--bindmounts", "/var/cache/archive/"]
config = UDTConfig()
if self.target_distro in self._debian_distros:
mirror = os.environ.get('MIRRORSITE',
config.get_value('DEBIAN_MIRROR'))
components = 'main'
mirror = os.environ.get("MIRRORSITE", config.get_value("DEBIAN_MIRROR"))
components = "main"
if self.extra_components:
components += ' contrib non-free'
components += " contrib non-free non-free-firmware"
else:
mirror = os.environ.get('MIRRORSITE',
config.get_value('UBUNTU_MIRROR'))
if self.build_architecture not in ('amd64', 'i386'):
mirror = os.environ.get(
'MIRRORSITE', config.get_value('UBUNTU_PORTS_MIRROR'))
components = 'main restricted'
mirror = os.environ.get("MIRRORSITE", config.get_value("UBUNTU_MIRROR"))
if self.build_architecture not in ("amd64", "i386"):
mirror = os.environ.get("MIRRORSITE", config.get_value("UBUNTU_PORTS_MIRROR"))
components = "main restricted"
if self.extra_components:
components += ' universe multiverse'
components += " universe multiverse"
arguments += ['--mirror', mirror]
arguments += ["--mirror", mirror]
othermirrors = []
localrepo = '/var/cache/archive/' + self.target_distro
localrepo = f"/var/cache/archive/{self.target_distro}"
if os.path.exists(localrepo):
repo = 'deb file:///var/cache/archive/ %s/' % self.target_distro
repo = f"deb file:///var/cache/archive/ {self.target_distro}/"
othermirrors.append(repo)
if self.target_distro in self._debian_distros:
debian_info = DebianDistroInfo()
try:
codename = debian_info.codename(self.target_distro,
default=self.target_distro)
codename = self.debian_distro_info.codename(
self.target_distro, default=self.target_distro
)
except DistroDataOutdated as error:
Logger.warning(error)
if codename in (debian_info.devel(), 'experimental'):
if codename in (self.debian_distro_info.devel(), "experimental"):
self.enable_security = False
self.enable_updates = False
self.enable_proposed = False
elif codename in (debian_info.testing(), 'testing'):
elif codename in (self.debian_distro_info.testing(), "testing"):
self.enable_updates = False
if self.enable_security:
othermirrors.append('deb %s %s/updates %s'
% (config.get_value('DEBSEC_MIRROR'),
self.target_distro, components))
pocket = "-security"
with suppress(ValueError):
# before bullseye (version 11) security suite is /updates
if float(self.debian_distro_info.version(codename)) < 11.0:
pocket = "/updates"
othermirrors.append(
f"deb {config.get_value('DEBSEC_MIRROR')}"
f" {self.target_distro}{pocket} {components}"
)
if self.enable_updates:
othermirrors.append('deb %s %s-updates %s'
% (mirror, self.target_distro, components))
othermirrors.append(f"deb {mirror} {self.target_distro}-updates {components}")
if self.enable_proposed:
othermirrors.append('deb %s %s-proposed-updates %s'
% (mirror, self.target_distro, components))
othermirrors.append(
f"deb {mirror} {self.target_distro}-proposed-updates {components}"
)
if self.enable_backports:
othermirrors.append('deb %s %s-backports %s'
% (mirror, self.target_distro, components))
othermirrors.append(f"deb {mirror} {self.target_distro}-backports {components}")
aptcache = os.path.join(self.base, 'aptcache', 'debian')
aptcache = os.path.join(self.base, "aptcache", "debian")
else:
try:
dev_release = self.target_distro == UbuntuDistroInfo().devel()
dev_release = self.target_distro == self.ubuntu_distro_info.devel()
except DistroDataOutdated as error:
Logger.warning(error)
dev_release = True
@@ -320,46 +338,45 @@ class PbuilderDist(object):
self.enable_updates = False
if self.enable_security:
othermirrors.append('deb %s %s-security %s'
% (mirror, self.target_distro, components))
othermirrors.append(f"deb {mirror} {self.target_distro}-security {components}")
if self.enable_updates:
othermirrors.append('deb %s %s-updates %s'
% (mirror, self.target_distro, components))
othermirrors.append(f"deb {mirror} {self.target_distro}-updates {components}")
if self.enable_proposed:
othermirrors.append('deb %s %s-proposed %s'
% (mirror, self.target_distro, components))
othermirrors.append(f"deb {mirror} {self.target_distro}-proposed {components}")
aptcache = os.path.join(self.base, 'aptcache', 'ubuntu')
aptcache = os.path.join(self.base, "aptcache", "ubuntu")
if 'OTHERMIRROR' in os.environ:
othermirrors += os.environ['OTHERMIRROR'].split('|')
if "OTHERMIRROR" in os.environ:
othermirrors += os.environ["OTHERMIRROR"].split("|")
if othermirrors:
arguments += ['--othermirror', '|'.join(othermirrors)]
arguments += ["--othermirror", "|".join(othermirrors)]
# Work around LP:#599695
if (ubuntutools.misc.system_distribution() == 'Debian'
and self.target_distro not in self._debian_distros):
if not os.path.exists(
'/usr/share/keyrings/ubuntu-archive-keyring.gpg'):
Logger.error('ubuntu-keyring not installed')
if (
ubuntutools.misc.system_distribution() == "Debian"
and self.target_distro not in self._debian_distros
):
if not os.path.exists("/usr/share/keyrings/ubuntu-archive-keyring.gpg"):
Logger.error("ubuntu-keyring not installed")
sys.exit(1)
arguments += [
'--debootstrapopts',
'--keyring=/usr/share/keyrings/ubuntu-archive-keyring.gpg',
"--debootstrapopts",
"--keyring=/usr/share/keyrings/ubuntu-archive-keyring.gpg",
]
elif (ubuntutools.misc.system_distribution() == 'Ubuntu'
and self.target_distro in self._debian_distros):
if not os.path.exists(
'/usr/share/keyrings/debian-archive-keyring.gpg'):
Logger.error('debian-archive-keyring not installed')
elif (
ubuntutools.misc.system_distribution() == "Ubuntu"
and self.target_distro in self._debian_distros
):
if not os.path.exists("/usr/share/keyrings/debian-archive-keyring.gpg"):
Logger.error("debian-archive-keyring not installed")
sys.exit(1)
arguments += [
'--debootstrapopts',
'--keyring=/usr/share/keyrings/debian-archive-keyring.gpg',
"--debootstrapopts",
"--keyring=/usr/share/keyrings/debian-archive-keyring.gpg",
]
arguments += ['--aptcache', aptcache, '--components', components]
arguments += ["--aptcache", aptcache, "--components", components]
if not os.path.isdir(aptcache):
try:
@@ -369,13 +386,11 @@ class PbuilderDist(object):
sys.exit(1)
if self.build_architecture != self.system_architecture:
arguments += ['--debootstrapopts',
'--arch=' + self.build_architecture]
arguments += ["--debootstrapopts", "--arch=" + self.build_architecture]
apt_conf_dir = os.path.join(self.base,
'etc/%s/apt.conf' % self.target_distro)
apt_conf_dir = os.path.join(self.base, f"etc/{self.target_distro}/apt.conf")
if os.path.exists(apt_conf_dir):
arguments += ['--aptconfdir', apt_conf_dir]
arguments += ["--aptconfdir", apt_conf_dir]
# Append remaining arguments
if remaining_arguments:
@@ -386,12 +401,12 @@ class PbuilderDist(object):
# With both common variable name schemes (BTS: #659060).
return [
self.auth,
'HOME=' + os.path.expanduser('~'),
'ARCHITECTURE=' + self.build_architecture,
'DISTRIBUTION=' + self.target_distro,
'ARCH=' + self.build_architecture,
'DIST=' + self.target_distro,
'DEB_BUILD_OPTIONS=' + os.environ.get('DEB_BUILD_OPTIONS', ''),
"HOME=" + os.path.expanduser("~"),
"ARCHITECTURE=" + self.build_architecture,
"DISTRIBUTION=" + self.target_distro,
"ARCH=" + self.build_architecture,
"DIST=" + self.target_distro,
"DEB_BUILD_OPTIONS=" + os.environ.get("DEB_BUILD_OPTIONS", ""),
self.builder,
] + arguments
@@ -401,7 +416,7 @@ def show_help(exit_code=0):
Print a help message for pbuilder-dist, and exit with the given code.
"""
Logger.info('See man pbuilder-dist for more information.')
Logger.info("See man pbuilder-dist for more information.")
sys.exit(exit_code)
@@ -415,27 +430,25 @@ def main():
the script and runs pbuilder itself or exits with an error message.
"""
script_name = os.path.basename(sys.argv[0])
parts = script_name.split('-')
parts = script_name.split("-")
# Copy arguments into another list for safe manipulation
args = sys.argv[1:]
if ('-' in script_name and parts[0] not in ('pbuilder', 'cowbuilder')
or len(parts) > 3):
Logger.error('"%s" is not a valid name for a "pbuilder-dist" '
'executable.', script_name)
if "-" in script_name and parts[0] not in ("pbuilder", "cowbuilder") or len(parts) > 3:
Logger.error('"%s" is not a valid name for a "pbuilder-dist" executable.', script_name)
sys.exit(1)
if len(args) < 1:
Logger.error('Insufficient number of arguments.')
Logger.error("Insufficient number of arguments.")
show_help(1)
if args[0] in ('-h', '--help', 'help'):
if args[0] in ("-h", "--help", "help"):
show_help(0)
app = PbuilderDist(parts[0])
if len(parts) > 1 and parts[1] != 'dist' and '.' not in parts[1]:
if len(parts) > 1 and parts[1] != "dist" and "." not in parts[1]:
app.set_target_distro(parts[1])
else:
app.set_target_distro(args.pop(0))
@@ -443,24 +456,31 @@ def main():
if len(parts) > 2:
requested_arch = parts[2]
elif len(args) > 0:
if distutils.spawn.find_executable('arch-test'):
if subprocess.run(
['arch-test', args[0]],
stdout=subprocess.DEVNULL).returncode == 0:
if shutil.which("arch-test") is not None:
arch_test = subprocess.run(
["arch-test", args[0]], check=False, stdout=subprocess.DEVNULL
)
if arch_test.returncode == 0:
requested_arch = args.pop(0)
elif (os.path.isdir('/usr/lib/arch-test')
and args[0] in os.listdir('/usr/lib/arch-test/')):
Logger.error('Architecture "%s" is not supported on your '
'currently running kernal. Consider installing '
'the qemu-user-static package to enable the use of '
'foreign architectures.', args[0])
elif os.path.isdir("/usr/lib/arch-test") and args[0] in os.listdir(
"/usr/lib/arch-test/"
):
Logger.error(
'Architecture "%s" is not supported on your '
"currently running kernel. Consider installing "
"the qemu-user-static package to enable the use of "
"foreign architectures.",
args[0],
)
sys.exit(1)
else:
requested_arch = None
else:
Logger.error('Cannot determine if "%s" is a valid architecture. '
'Please install the arch-test package and retry.',
args[0])
Logger.error(
'Cannot determine if "%s" is a valid architecture. '
"Please install the arch-test package and retry.",
args[0],
)
sys.exit(1)
else:
requested_arch = None
@@ -468,62 +488,64 @@ def main():
if requested_arch:
app.build_architecture = requested_arch
# For some foreign architectures we need to use qemu
if (requested_arch != app.system_architecture
and (app.system_architecture, requested_arch) not in [
('amd64', 'i386'), ('amd64', 'lpia'), ('arm', 'armel'),
('armel', 'arm'), ('armel', 'armhf'), ('armhf', 'armel'),
('arm64', 'arm'), ('arm64', 'armhf'), ('arm64', 'armel'),
('i386', 'lpia'), ('lpia', 'i386'), ('powerpc', 'ppc64'),
('ppc64', 'powerpc'), ('sparc', 'sparc64'),
('sparc64', 'sparc')]):
args += ['--debootstrap', 'qemu-debootstrap']
if requested_arch != app.system_architecture and (
app.system_architecture,
requested_arch,
) not in [
("amd64", "i386"),
("arm64", "arm"),
("arm64", "armhf"),
("powerpc", "ppc64"),
("ppc64", "powerpc"),
]:
args += ["--debootstrap", "debootstrap"]
if 'mainonly' in sys.argv or '--main-only' in sys.argv:
if "mainonly" in sys.argv or "--main-only" in sys.argv:
app.extra_components = False
if 'mainonly' in sys.argv:
args.remove('mainonly')
if "mainonly" in sys.argv:
args.remove("mainonly")
else:
args.remove('--main-only')
args.remove("--main-only")
if '--release-only' in sys.argv:
args.remove('--release-only')
if "--release-only" in sys.argv:
args.remove("--release-only")
app.enable_security = False
app.enable_updates = False
app.enable_proposed = False
elif '--security-only' in sys.argv:
args.remove('--security-only')
elif "--security-only" in sys.argv:
args.remove("--security-only")
app.enable_updates = False
app.enable_proposed = False
elif '--updates-only' in sys.argv:
args.remove('--updates-only')
elif "--updates-only" in sys.argv:
args.remove("--updates-only")
app.enable_proposed = False
elif '--backports' in sys.argv:
args.remove('--backports')
elif "--backports" in sys.argv:
args.remove("--backports")
app.enable_backports = True
if len(args) < 1:
Logger.error('Insufficient number of arguments.')
Logger.error("Insufficient number of arguments.")
show_help(1)
# Parse the operation
args = app.set_operation(args.pop(0)) + args
if app.operation == 'build':
if len([a for a in args if a.strip().endswith('.dsc')]) != 1:
msg = 'You have to specify one .dsc file if you want to build.'
if app.operation == "build":
if len([a for a in args if a.strip().endswith(".dsc")]) != 1:
msg = "You have to specify one .dsc file if you want to build."
Logger.error(msg)
sys.exit(1)
# Execute the pbuilder command
if '--debug-echo' not in args:
if "--debug-echo" not in args:
sys.exit(subprocess.call(app.get_command(args)))
else:
Logger.info(app.get_command([arg for arg in args if arg != '--debug-echo']))
Logger.info(app.get_command([arg for arg in args if arg != "--debug-echo"]))
if __name__ == '__main__':
if __name__ == "__main__":
try:
main()
except KeyboardInterrupt:
Logger.error('Manually aborted.')
Logger.error("Manually aborted.")
sys.exit(1)
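The security-mirror hunk in the diff above encodes a Debian naming change: releases before bullseye (version 11) published security updates under the `<suite>/updates` suite, while bullseye and later use `<suite>-security`. A minimal standalone sketch of that pocket selection (the helper name and example mirror line are illustrative, not taken from the diff):

```python
def security_pocket(release_version):
    """Return the security suite suffix for a Debian release.

    Hypothetical helper mirroring the pocket logic in pbuilder-dist:
    releases before bullseye (11) used "<suite>/updates"; newer
    releases use "<suite>-security". ``release_version`` may be None
    when distro-info has no version for the codename (e.g. sid).
    """
    if release_version is not None and float(release_version) < 11.0:
        return "/updates"
    return "-security"


# Example OTHERMIRROR-style lines for an old and a new release
print(f"deb http://security.debian.org buster{security_pocket('10')} main")
print(f"deb http://security.debian.org bookworm{security_pocket('12')} main")
```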

142
pm-helper Executable file
View File

@@ -0,0 +1,142 @@
#!/usr/bin/python3
# Find the next thing to work on for proposed-migration
# Copyright (C) 2023 Canonical Ltd.
# Author: Steve Langasek <steve.langasek@ubuntu.com>
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License, version 3.
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
# General Public License for more details.
# You should have received a copy of the GNU General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
import lzma
import sys
import webbrowser
from argparse import ArgumentParser
import yaml
from launchpadlib.launchpad import Launchpad
from ubuntutools.utils import get_url
# proposed-migration is only concerned with the devel series; unlike other
# tools, don't make this configurable
excuses_url = "https://ubuntu-archive-team.ubuntu.com/proposed-migration/update_excuses.yaml.xz"
def get_proposed_version(excuses, package):
for k in excuses["sources"]:
if k["source"] == package:
return k.get("new-version")
return None
def claim_excuses_bug(launchpad, bug, package):
print(f"LP: #{bug.id}: {bug.title}")
ubuntu = launchpad.distributions["ubuntu"]
series = ubuntu.current_series.fullseriesname
for task in bug.bug_tasks:
# targeting to a series doesn't make the default task disappear,
# it just makes it useless
if task.bug_target_name == f"{package} ({series})":
our_task = task
break
if task.bug_target_name == f"{package} (Ubuntu)":
our_task = task
if our_task.assignee == launchpad.me:
print("Bug already assigned to you.")
return True
if our_task.assignee:
print(f"Currently assigned to {our_task.assignee.name}")
print("""Do you want to claim this bug? [yN] """, end="")
sys.stdout.flush()
response = sys.stdin.readline()
if response.strip().lower().startswith("y"):
our_task.assignee = launchpad.me
our_task.lp_save()
return True
return False
def create_excuses_bug(launchpad, package, version):
print("Will open a new bug")
bug = launchpad.bugs.createBug(
title=f"proposed-migration for {package} {version}",
tags=("update-excuse"),
target=f"https://api.launchpad.net/devel/ubuntu/+source/{package}",
description=f"{package} {version} is stuck in -proposed.",
)
task = bug.bug_tasks[0]
task.assignee = launchpad.me
task.lp_save()
print(f"Opening {bug.web_link} in browser")
webbrowser.open(bug.web_link)
return bug
def has_excuses_bugs(launchpad, package):
ubuntu = launchpad.distributions["ubuntu"]
pkg = ubuntu.getSourcePackage(name=package)
if not pkg:
raise ValueError(f"No such source package: {package}")
tasks = pkg.searchTasks(tags=["update-excuse"], order_by=["id"])
bugs = [task.bug for task in tasks]
if not bugs:
return False
if len(bugs) == 1:
print(f"There is 1 open update-excuse bug against {package}")
else:
print(f"There are {len(bugs)} open update-excuse bugs against {package}")
for bug in bugs:
if claim_excuses_bug(launchpad, bug, package):
return True
return True
def main():
parser = ArgumentParser()
parser.add_argument("-l", "--launchpad", dest="launchpad_instance", default="production")
parser.add_argument(
"-v", "--verbose", default=False, action="store_true", help="be more verbose"
)
parser.add_argument("package", nargs="?", help="act on this package only")
args = parser.parse_args()
args.launchpad = Launchpad.login_with("pm-helper", args.launchpad_instance, version="devel")
f = get_url(excuses_url, False)
with lzma.open(f) as lzma_f:
excuses = yaml.load(lzma_f, Loader=yaml.CSafeLoader)
if args.package:
try:
if not has_excuses_bugs(args.launchpad, args.package):
proposed_version = get_proposed_version(excuses, args.package)
if not proposed_version:
print(f"Package {args.package} not found in -proposed.")
sys.exit(1)
create_excuses_bug(args.launchpad, args.package, proposed_version)
except ValueError as e:
sys.stderr.write(f"{e}\n")
else:
pass # for now
if __name__ == "__main__":
sys.exit(main())
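pm-helper's `get_proposed_version()` above is a linear scan over the `sources` list of the decompressed `update_excuses.yaml` data. A self-contained sketch of that lookup against an assumed, heavily simplified fragment of the excuses structure (the package names and versions are invented for illustration):

```python
# Assumed, simplified shape of the decompressed update_excuses.yaml data;
# the real file carries many more keys per source entry.
excuses = {
    "sources": [
        {"source": "glibc", "new-version": "2.38-1ubuntu1"},
        {"source": "dpkg", "new-version": "1.22.0ubuntu1"},
    ]
}


def get_proposed_version(excuses, package):
    # Same scan as pm-helper: first matching source entry wins;
    # None means the package is not currently stuck in -proposed.
    for entry in excuses["sources"]:
        if entry["source"] == package:
            return entry.get("new-version")
    return None


print(get_proposed_version(excuses, "glibc"))  # -> 2.38-1ubuntu1
print(get_proposed_version(excuses, "absent"))  # -> None
```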

View File

@@ -5,7 +5,10 @@
#
# See pull-pkg
# pylint: disable=invalid-name
# pylint: enable=invalid-name
from ubuntutools.pullpkg import PullPkg
if __name__ == '__main__':
PullPkg.main(distro='debian', pull='ddebs')
if __name__ == "__main__":
PullPkg.main(distro="debian", pull="ddebs")

View File

@@ -17,29 +17,32 @@
# OTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR
# PERFORMANCE OF THIS SOFTWARE.
import optparse
# pylint: disable=invalid-name
# pylint: enable=invalid-name
import argparse
import sys
import debian.changelog
from ubuntutools import getLogger
from ubuntutools.archive import DebianSourcePackage, DownloadError
from ubuntutools.config import UDTConfig
from ubuntutools.version import Version
from ubuntutools import getLogger
Logger = getLogger()
def previous_version(package, version, distance):
"Given an (extracted) package, determine the version distance versions ago"
upver = Version(version).upstream_version
filename = '%s-%s/debian/changelog' % (package, upver)
changelog_file = open(filename, 'r')
filename = f"{package}-{upver}/debian/changelog"
changelog_file = open(filename, "r", encoding="utf-8")
changelog = debian.changelog.Changelog(changelog_file.read())
changelog_file.close()
seen = 0
for entry in changelog:
if entry.distributions == 'UNRELEASED':
if entry.distributions == "UNRELEASED":
continue
if seen == distance:
return entry.version.full_version
@@ -48,69 +51,78 @@ def previous_version(package, version, distance):
def main():
parser = optparse.OptionParser('%prog [options] <package> <version> '
'[distance]')
parser.add_option('-f', '--fetch',
dest='fetch_only', default=False, action='store_true',
help="Only fetch the source packages, don't diff.")
parser.add_option('-d', '--debian-mirror', metavar='DEBIAN_MIRROR',
dest='debian_mirror',
help='Preferred Debian mirror '
'(default: http://deb.debian.org/debian)')
parser.add_option('-s', '--debsec-mirror', metavar='DEBSEC_MIRROR',
dest='debsec_mirror',
help='Preferred Debian Security mirror '
'(default: http://security.debian.org)')
parser.add_option('--no-conf',
dest='no_conf', default=False, action='store_true',
help="Don't read config files or environment variables")
parser = argparse.ArgumentParser(usage="%(prog)s [options] <package> <version> [distance]")
parser.add_argument(
"-f",
"--fetch",
dest="fetch_only",
default=False,
action="store_true",
help="Only fetch the source packages, don't diff.",
)
parser.add_argument(
"-d",
"--debian-mirror",
metavar="DEBIAN_MIRROR",
dest="debian_mirror",
help="Preferred Debian mirror (default: http://deb.debian.org/debian)",
)
parser.add_argument(
"-s",
"--debsec-mirror",
metavar="DEBSEC_MIRROR",
dest="debsec_mirror",
help="Preferred Debian Security mirror (default: http://security.debian.org)",
)
parser.add_argument(
"--no-conf",
dest="no_conf",
default=False,
action="store_true",
help="Don't read config files or environment variables",
)
parser.add_argument("package", help=argparse.SUPPRESS)
parser.add_argument("version", help=argparse.SUPPRESS)
parser.add_argument("distance", default=1, type=int, nargs="?", help=argparse.SUPPRESS)
args = parser.parse_args()
opts, args = parser.parse_args()
if len(args) < 2:
parser.error('Must specify package and version')
elif len(args) > 3:
parser.error('Too many arguments')
package = args[0]
version = args[1]
distance = int(args[2]) if len(args) > 2 else 1
config = UDTConfig(args.no_conf)
if args.debian_mirror is None:
args.debian_mirror = config.get_value("DEBIAN_MIRROR")
if args.debsec_mirror is None:
args.debsec_mirror = config.get_value("DEBSEC_MIRROR")
mirrors = [args.debsec_mirror, args.debian_mirror]
config = UDTConfig(opts.no_conf)
if opts.debian_mirror is None:
opts.debian_mirror = config.get_value('DEBIAN_MIRROR')
if opts.debsec_mirror is None:
opts.debsec_mirror = config.get_value('DEBSEC_MIRROR')
mirrors = [opts.debsec_mirror, opts.debian_mirror]
Logger.info("Downloading %s %s", args.package, args.version)
Logger.info('Downloading %s %s', package, version)
newpkg = DebianSourcePackage(package, version, mirrors=mirrors)
newpkg = DebianSourcePackage(args.package, args.version, mirrors=mirrors)
try:
newpkg.pull()
except DownloadError as e:
Logger.error('Failed to download: %s', str(e))
Logger.error("Failed to download: %s", str(e))
sys.exit(1)
newpkg.unpack()
if opts.fetch_only:
if args.fetch_only:
sys.exit(0)
oldversion = previous_version(package, version, distance)
oldversion = previous_version(args.package, args.version, args.distance)
if not oldversion:
Logger.error('No previous version could be found')
Logger.error("No previous version could be found")
sys.exit(1)
Logger.info('Downloading %s %s', package, oldversion)
Logger.info("Downloading %s %s", args.package, oldversion)
oldpkg = DebianSourcePackage(package, oldversion, mirrors=mirrors)
oldpkg = DebianSourcePackage(args.package, oldversion, mirrors=mirrors)
try:
oldpkg.pull()
except DownloadError as e:
Logger.error('Failed to download: %s', str(e))
Logger.error("Failed to download: %s", str(e))
sys.exit(1)
Logger.info('file://' + oldpkg.debdiff(newpkg, diffstat=True))
Logger.info("file://%s", oldpkg.debdiff(newpkg, diffstat=True))
if __name__ == '__main__':
if __name__ == "__main__":
try:
main()
except KeyboardInterrupt:
Logger.info('User abort.')
Logger.info("User abort.")

View File

@@ -5,7 +5,10 @@
#
# See pull-pkg
# pylint: disable=invalid-name
# pylint: enable=invalid-name
from ubuntutools.pullpkg import PullPkg
if __name__ == '__main__':
PullPkg.main(distro='debian', pull='debs')
if __name__ == "__main__":
PullPkg.main(distro="debian", pull="debs")

View File

@@ -5,7 +5,10 @@
#
# See pull-pkg
# pylint: disable=invalid-name
# pylint: enable=invalid-name
from ubuntutools.pullpkg import PullPkg
if __name__ == '__main__':
PullPkg.main(distro='debian', pull='source')
if __name__ == "__main__":
PullPkg.main(distro="debian", pull="source")

View File

@@ -5,7 +5,10 @@
#
# See pull-pkg
# pylint: disable=invalid-name
# pylint: enable=invalid-name
from ubuntutools.pullpkg import PullPkg
if __name__ == '__main__':
PullPkg.main(distro='debian', pull='udebs')
if __name__ == "__main__":
PullPkg.main(distro="debian", pull="udebs")

View File

@@ -5,7 +5,10 @@
#
# See pull-pkg
# pylint: disable=invalid-name
# pylint: enable=invalid-name
from ubuntutools.pullpkg import PullPkg
if __name__ == '__main__':
PullPkg.main(distro='ubuntu', pull='ddebs')
if __name__ == "__main__":
PullPkg.main(distro="ubuntu", pull="ddebs")

View File

@@ -5,7 +5,10 @@
#
# See pull-pkg
# pylint: disable=invalid-name
# pylint: enable=invalid-name
from ubuntutools.pullpkg import PullPkg
if __name__ == '__main__':
PullPkg.main(distro='ubuntu', pull='debs')
if __name__ == "__main__":
PullPkg.main(distro="ubuntu", pull="debs")

View File

@@ -5,7 +5,10 @@
#
# See pull-pkg
# pylint: disable=invalid-name
# pylint: enable=invalid-name
from ubuntutools.pullpkg import PullPkg
if __name__ == '__main__':
PullPkg.main(distro='ubuntu', pull='source')
if __name__ == "__main__":
PullPkg.main(distro="ubuntu", pull="source")

View File

@@ -5,7 +5,10 @@
#
# See pull-pkg
# pylint: disable=invalid-name
# pylint: enable=invalid-name
from ubuntutools.pullpkg import PullPkg
if __name__ == '__main__':
PullPkg.main(distro='ubuntu', pull='udebs')
if __name__ == "__main__":
PullPkg.main(distro="ubuntu", pull="udebs")

View File

@@ -23,7 +23,10 @@
#
# ##################################################################
# pylint: disable=invalid-name
# pylint: enable=invalid-name
from ubuntutools.pullpkg import PullPkg
if __name__ == '__main__':
if __name__ == "__main__":
PullPkg.main()

View File

@@ -6,7 +6,10 @@
#
# See pull-pkg
# pylint: disable=invalid-name
# pylint: enable=invalid-name
from ubuntutools.pullpkg import PullPkg
if __name__ == '__main__':
PullPkg.main(distro='ppa', pull='ddebs')
if __name__ == "__main__":
PullPkg.main(distro="ppa", pull="ddebs")

View File

@@ -6,7 +6,10 @@
#
# See pull-pkg
# pylint: disable=invalid-name
# pylint: enable=invalid-name
from ubuntutools.pullpkg import PullPkg
if __name__ == '__main__':
PullPkg.main(distro='ppa', pull='debs')
if __name__ == "__main__":
PullPkg.main(distro="ppa", pull="debs")

View File

@@ -6,7 +6,10 @@
#
# See pull-pkg
# pylint: disable=invalid-name
# pylint: enable=invalid-name
from ubuntutools.pullpkg import PullPkg
if __name__ == '__main__':
PullPkg.main(distro='ppa', pull='source')
if __name__ == "__main__":
PullPkg.main(distro="ppa", pull="source")

View File

@@ -6,7 +6,10 @@
#
# See pull-pkg
# pylint: disable=invalid-name
# pylint: enable=invalid-name
from ubuntutools.pullpkg import PullPkg
if __name__ == '__main__':
PullPkg.main(distro='ppa', pull='udebs')
if __name__ == "__main__":
PullPkg.main(distro="ppa", pull="udebs")

View File

@@ -5,7 +5,10 @@
#
# See pull-pkg
# pylint: disable=invalid-name
# pylint: enable=invalid-name
from ubuntutools.pullpkg import PullPkg
if __name__ == '__main__':
PullPkg.main(distro='uca', pull='ddebs')
if __name__ == "__main__":
PullPkg.main(distro="uca", pull="ddebs")

View File

@@ -5,7 +5,10 @@
#
# See pull-pkg
# pylint: disable=invalid-name
# pylint: enable=invalid-name
from ubuntutools.pullpkg import PullPkg
if __name__ == '__main__':
PullPkg.main(distro='uca', pull='debs')
if __name__ == "__main__":
PullPkg.main(distro="uca", pull="debs")

View File

@@ -5,7 +5,10 @@
#
# See pull-pkg
# pylint: disable=invalid-name
# pylint: enable=invalid-name
from ubuntutools.pullpkg import PullPkg
if __name__ == '__main__':
PullPkg.main(distro='uca', pull='source')
if __name__ == "__main__":
PullPkg.main(distro="uca", pull="source")

View File

@@ -5,7 +5,10 @@
#
# See pull-pkg
# pylint: disable=invalid-name
# pylint: enable=invalid-name
from ubuntutools.pullpkg import PullPkg
if __name__ == '__main__':
PullPkg.main(distro='uca', pull='udebs')
if __name__ == "__main__":
PullPkg.main(distro="uca", pull="udebs")

6
pyproject.toml Normal file
View File

@@ -0,0 +1,6 @@
[tool.black]
line-length = 99
[tool.isort]
line_length = 99
profile = "black"

View File

@@ -14,22 +14,20 @@
# ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF
# OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE.
from collections import defaultdict
import optparse
import re
import argparse
import sys
from collections import defaultdict
import apt
from distro_info import UbuntuDistroInfo
from ubuntutools.config import UDTConfig
from ubuntutools.lp.lpapicache import Launchpad, Distribution
from ubuntutools.lp.udtexceptions import PackageNotFoundException
from ubuntutools.question import (YesNoQuestion, EditBugReport,
confirmation_prompt)
from ubuntutools.rdepends import query_rdepends, RDependsException
from ubuntutools import getLogger
from ubuntutools.config import UDTConfig
from ubuntutools.lp.lpapicache import Distribution, Launchpad
from ubuntutools.lp.udtexceptions import PackageNotFoundException
from ubuntutools.question import EditBugReport, YesNoQuestion, confirmation_prompt
from ubuntutools.rdepends import RDependsException, query_rdepends
Logger = getLogger()
@@ -40,16 +38,14 @@ class DestinationException(Exception):
def determine_destinations(source, destination):
ubuntu_info = UbuntuDistroInfo()
if destination is None:
destination = ubuntu_info.stable()
destination = ubuntu_info.lts()
if source not in ubuntu_info.all:
raise DestinationException("Source release %s does not exist" % source)
raise DestinationException(f"Source release {source} does not exist")
if destination not in ubuntu_info.all:
raise DestinationException("Destination release %s does not exist"
% destination)
raise DestinationException(f"Destination release {destination} does not exist")
if destination not in ubuntu_info.supported():
raise DestinationException("Destination release %s is not supported"
% destination)
raise DestinationException(f"Destination release {destination} is not supported")
found = False
destinations = []
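The hunk above converts %-style formatting to f-strings. For simple substitutions the two forms produce identical output, which is what makes this a safe mechanical change; the f-string additionally gets its expression checked at parse time:

```python
source = "plucky"  # hypothetical release name for illustration

# Old style, as removed by the diff:
old = "Source release %s does not exist" % source

# New style, as introduced:
new = f"Source release {source} does not exist"

assert old == new  # same rendered message
print(new)
```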
@@ -77,41 +73,36 @@ def determine_destinations(source, destination):
def disclaimer():
print("Ubuntu's backports are not for fixing bugs in stable releases, "
print(
"Ubuntu's backports are not for fixing bugs in stable releases, "
"but for bringing new features to older, stable releases.\n"
"See https://wiki.ubuntu.com/UbuntuBackports for the Ubuntu "
"Backports policy and processes.\n"
"See https://wiki.ubuntu.com/StableReleaseUpdates for the process "
"for fixing bugs in stable releases.")
"for fixing bugs in stable releases."
)
confirmation_prompt()
def check_existing(package, destinations):
def check_existing(package):
"""Search for possible existing bug reports"""
# The LP bug search is indexed, not substring:
query = re.findall(r'[a-z]+', package)
bugs = []
for release in destinations:
project_name = '{}-backports'.format(release)
try:
project = Launchpad.projects[project_name]
except KeyError:
Logger.error("The backports tracking project '%s' doesn't seem to "
"exist. Please check the situation with the "
"backports team.", project_name)
sys.exit(1)
bugs += project.searchTasks(omit_duplicates=True,
search_text=query,
status=["Incomplete", "New", "Confirmed",
"Triaged", "In Progress",
"Fix Committed"])
distro = Distribution("ubuntu")
srcpkg = distro.getSourcePackage(name=package.getPackageName())
bugs = srcpkg.searchTasks(
omit_duplicates=True,
search_text="[BPO]",
status=["Incomplete", "New", "Confirmed", "Triaged", "In Progress", "Fix Committed"],
)
if not bugs:
return
Logger.info("There are existing bug reports that look similar to your "
"request. Please check before continuing:")
Logger.info(
"There are existing bug reports that look similar to your "
"request. Please check before continuing:"
)
for bug in sorted(set(bug_task.bug for bug_task in bugs)):
for bug in sorted([bug_task.bug for bug_task in bugs], key=lambda bug: bug.id):
Logger.info(" * LP: #%-7i: %s %s", bug.id, bug.title, bug.web_link)
confirmation_prompt()
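The rewritten sort is more than a style change: `sorted(set(...))` relies on the bug objects being orderable, while the new line sorts explicitly by `bug.id`. A minimal sketch of the difference (`FakeBug` is a stand-in for illustration, not a launchpadlib class):

```python
from dataclasses import dataclass

@dataclass
class FakeBug:  # stand-in for a launchpadlib bug entry
    id: int
    title: str

bugs = [FakeBug(30, "b"), FakeBug(10, "a"), FakeBug(20, "c")]

# sorted(bugs) would raise TypeError: a plain dataclass defines no ordering.
# Sorting by an explicit key sidesteps that entirely:
ordered = sorted(bugs, key=lambda bug: bug.id)
print([bug.id for bug in ordered])  # [10, 20, 30]
```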
@@ -122,9 +113,9 @@ def find_rdepends(releases, published_binaries):
# We want to display every published binary, even if it has no rdepends
for binpkg in published_binaries:
intermediate[binpkg]
intermediate[binpkg] # pylint: disable=pointless-statement
for arch in ('any', 'source'):
for arch in ("any", "source"):
for release in releases:
for binpkg in published_binaries:
try:
@@ -135,20 +126,20 @@ def find_rdepends(releases, published_binaries):
for relationship, rdeps in raw_rdeps.items():
for rdep in rdeps:
# Ignore circular deps:
if rdep['Package'] in published_binaries:
if rdep["Package"] in published_binaries:
continue
# arch==any queries return Reverse-Build-Deps:
if arch == 'any' and rdep.get('Architectures', []) == ['source']:
if arch == "any" and rdep.get("Architectures", []) == ["source"]:
continue
intermediate[binpkg][rdep['Package']].append((release, relationship))
intermediate[binpkg][rdep["Package"]].append((release, relationship))
output = []
for binpkg, rdeps in intermediate.items():
output += ['', binpkg, '-' * len(binpkg)]
output += ["", binpkg, "-" * len(binpkg)]
for pkg, appearences in rdeps.items():
output += ['* %s' % pkg]
output += [f"* {pkg}"]
for release, relationship in appearences:
output += [' [ ] %s (%s)' % (release, relationship)]
output += [f" [ ] {release} ({relationship})"]
found_any = sum(len(rdeps) for rdeps in intermediate.values())
if found_any:
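The bare `intermediate[binpkg]` expression looks pointless, which is exactly why the diff adds the pylint disable comment: on a `defaultdict`, merely indexing a missing key inserts it, and `find_rdepends` uses that side effect to make every published binary appear in the output even when it has no reverse dependencies. In isolation:

```python
from collections import defaultdict

# Same shape as in find_rdepends: package -> rdep -> list of (release, relationship)
intermediate = defaultdict(lambda: defaultdict(list))

# A bare subscript on a defaultdict creates the key as a side effect...
intermediate["libfoo1"]  # pylint: disable=pointless-statement

# ...so "libfoo1" shows up even though nothing was appended under it:
print(sorted(intermediate))  # ['libfoo1']
```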
@@ -163,7 +154,7 @@ def find_rdepends(releases, published_binaries):
"package currently in the release still works with the new "
"%(package)s installed. "
"Reverse- Recommends, Suggests, and Enhances don't need to be "
"tested, and are listed for completeness-sake."
"tested, and are listed for completeness-sake.",
] + output
else:
output = ["No reverse dependencies"]
@@ -172,146 +163,164 @@ def find_rdepends(releases, published_binaries):
def locate_package(package, distribution):
archive = Distribution('ubuntu').getArchive()
for pass_ in ('source', 'binary'):
archive = Distribution("ubuntu").getArchive()
try:
package_spph = archive.getSourcePackage(package, distribution)
return package_spph
except PackageNotFoundException as e:
if pass_ == 'binary':
Logger.error(str(e))
sys.exit(1)
try:
apt_pkg = apt.Cache()[package]
except KeyError:
continue
Logger.error(str(e))
sys.exit(1)
package = apt_pkg.candidate.source_name
Logger.info("Binary package specified, considering its source "
"package instead: %s", package)
Logger.info(
"Binary package specified, considering its source package instead: %s", package
)
return None
def request_backport(package_spph, source, destinations):
published_binaries = set()
for bpph in package_spph.getBinaries():
published_binaries.add(bpph.getPackageName())
if not published_binaries:
Logger.error("%s (%s) has no published binaries in %s. ",
package_spph.getPackageName(), package_spph.getVersion(),
source)
Logger.info("Is it stuck in bin-NEW? It can't be backported until "
"the binaries have been accepted.")
Logger.error(
"%s (%s) has no published binaries in %s. ",
package_spph.getPackageName(),
package_spph.getVersion(),
source,
)
Logger.info(
"Is it stuck in bin-NEW? It can't be backported until "
"the binaries have been accepted."
)
sys.exit(1)
testing = []
testing += ["You can test-build the backport in your PPA with "
"backportpackage:"]
testing += ["$ backportpackage -u ppa:<lp username>/<ppa name> "
"-s %s -d %s %s"
% (source, dest, package_spph.getPackageName())
for dest in destinations]
testing += [""]
testing = ["[Testing]", ""]
for dest in destinations:
testing += ['* %s:' % dest]
testing += [f" * {dest.capitalize()}:"]
testing += [" [ ] Package builds without modification"]
testing += ["[ ] %s installs cleanly and runs" % binary
for binary in published_binaries]
testing += [f" [ ] {binary} installs cleanly and runs" for binary in published_binaries]
subst = {
'package': package_spph.getPackageName(),
'version': package_spph.getVersion(),
'component': package_spph.getComponent(),
'source': package_spph.getSeriesAndPocket(),
'destinations': ', '.join(destinations),
"package": package_spph.getPackageName(),
"version": package_spph.getVersion(),
"component": package_spph.getComponent(),
"source": package_spph.getSeriesAndPocket(),
"destinations": ", ".join(destinations),
}
subject = ("Please backport %(package)s %(version)s (%(component)s) "
"from %(source)s" % subst)
body = ('\n'.join(
subject = "[BPO] %(package)s %(version)s to %(destinations)s" % subst
body = (
"\n".join(
[
"Please backport %(package)s %(version)s (%(component)s) "
"from %(source)s to %(destinations)s.",
"[Impact]",
"",
"Reason for the backport:",
"========================",
">>> Enter your reasoning here <<<",
" * Justification for backporting the new version to the stable release.",
"",
"[Scope]",
"",
" * List the Ubuntu release you will backport from,"
" and the specific package version.",
"",
" * List the Ubuntu release(s) you will backport to.",
"",
"[Other Info]",
"",
" * Anything else you think is useful to include",
"",
"Testing:",
"========",
"Mark off items in the checklist [X] as you test them, "
"but please leave the checklist so that backporters can quickly "
"evaluate the state of testing.",
""
]
+ testing
+ [""]
+ find_rdepends(destinations, published_binaries)
+ [""]) % subst)
+ [""]
)
% subst
)
editor = EditBugReport(subject, body)
editor.edit()
subject, body = editor.get_report()
Logger.info('The final report is:\nSummary: %s\nDescription:\n%s\n',
subject, body)
Logger.info("The final report is:\nSummary: %s\nDescription:\n%s\n", subject, body)
if YesNoQuestion().ask("Request this backport", "yes") == "no":
sys.exit(1)
targets = [Launchpad.projects['%s-backports' % destination]
for destination in destinations]
bug = Launchpad.bugs.createBug(title=subject, description=body,
target=targets[0])
for target in targets[1:]:
bug.addTask(target=target)
distro = Distribution("ubuntu")
pkgname = package_spph.getPackageName()
bug = Launchpad.bugs.createBug(
title=subject, description=body, target=distro.getSourcePackage(name=pkgname)
)
bug.subscribe(person=Launchpad.people["ubuntu-backporters"])
for dest in destinations:
series = distro.getSeries(dest)
try:
bug.addTask(target=series.getSourcePackage(name=pkgname))
except Exception: # pylint: disable=broad-except
break
Logger.info("Backport request filed as %s", bug.web_link)
def main():
parser = optparse.OptionParser('%prog [options] package')
parser.add_option('-d', '--destination', metavar='DEST',
help='Backport to DEST release and necessary '
'intermediate releases '
'(default: current stable release)')
parser.add_option('-s', '--source', metavar='SOURCE',
help='Backport from SOURCE release '
'(default: current devel release)')
parser.add_option('-l', '--lpinstance', metavar='INSTANCE', default=None,
help='Launchpad instance to connect to '
'(default: production).')
parser.add_option('--no-conf', action='store_true',
dest='no_conf', default=False,
help="Don't read config files or environment variables")
options, args = parser.parse_args()
parser = argparse.ArgumentParser(usage="%(prog)s [options] package")
parser.add_argument(
"-d",
"--destination",
metavar="DEST",
help="Backport to DEST release and necessary "
"intermediate releases "
"(default: current LTS release)",
)
parser.add_argument(
"-s",
"--source",
metavar="SOURCE",
help="Backport from SOURCE release (default: current devel release)",
)
parser.add_argument(
"-l",
"--lpinstance",
metavar="INSTANCE",
default=None,
help="Launchpad instance to connect to (default: production).",
)
parser.add_argument(
"--no-conf",
action="store_true",
dest="no_conf",
default=False,
help="Don't read config files or environment variables",
)
parser.add_argument("package", help=argparse.SUPPRESS)
args = parser.parse_args()
if len(args) != 1:
parser.error("One (and only one) package must be specified")
package = args[0]
config = UDTConfig(args.no_conf)
config = UDTConfig(options.no_conf)
if args.lpinstance is None:
args.lpinstance = config.get_value("LPINSTANCE")
Launchpad.login(args.lpinstance)
if options.lpinstance is None:
options.lpinstance = config.get_value('LPINSTANCE')
Launchpad.login(options.lpinstance)
if options.source is None:
options.source = Distribution('ubuntu').getDevelopmentSeries().name
if args.source is None:
args.source = Distribution("ubuntu").getDevelopmentSeries().name
try:
destinations = determine_destinations(options.source,
options.destination)
destinations = determine_destinations(args.source, args.destination)
except DestinationException as e:
Logger.error(str(e))
sys.exit(1)
disclaimer()
check_existing(package, destinations)
package_spph = locate_package(args.package, args.source)
package_spph = locate_package(package, options.source)
request_backport(package_spph, options.source, destinations)
check_existing(package_spph)
request_backport(package_spph, args.source, destinations)
if __name__ == '__main__':
if __name__ == "__main__":
main()
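A recurring pattern in this optparse-to-argparse migration is replacing manual `len(args)` validation with declared positionals: argparse itself rejects a missing or surplus argument, and `nargs="?"` covers the optional trailing values (defaulting to `None`). A condensed sketch of the pattern (the `type=Version` coercion from the real script is omitted here):

```python
import argparse

parser = argparse.ArgumentParser()
parser.add_argument("source_package")       # required positional
parser.add_argument("release", nargs="?")   # optional, None if absent
parser.add_argument("base_version", nargs="?")

args = parser.parse_args(["hello", "plucky"])
print(args.source_package, args.release, args.base_version)
# hello plucky None
```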


@@ -26,19 +26,19 @@
#
# ##################################################################
import optparse
import argparse
import os
import sys
from distro_info import UbuntuDistroInfo
from ubuntutools import getLogger
from ubuntutools.config import UDTConfig, ubu_email
from ubuntutools.lp import udtexceptions
from ubuntutools.misc import require_utf8
from ubuntutools.question import confirmation_prompt, EditBugReport
from ubuntutools.question import EditBugReport, confirmation_prompt
from ubuntutools.version import Version
from ubuntutools import getLogger
Logger = getLogger()
#
@@ -48,170 +48,190 @@ Logger = getLogger()
def main():
# Our usage options.
usage = ('Usage: %prog [options] '
'<source package> [<target release> [base version]]')
parser = optparse.OptionParser(usage)
usage = "%(prog)s [options] <source package> [<target release> [base version]]"
parser = argparse.ArgumentParser(usage=usage)
parser.add_option('-d', type='string',
dest='dist', default='unstable',
help='Debian distribution to sync from.')
parser.add_option('-k', type='string',
dest='keyid', default=None,
help='GnuPG key ID to use for signing report '
'(only used when emailing the sync request).')
parser.add_option('-n', action='store_true',
dest='newpkg', default=False,
help='Whether package to sync is a new package in '
'Ubuntu.')
parser.add_option('--email', action='store_true', default=False,
help='Use a PGP-signed email for filing the sync '
'request, rather than the LP API.')
parser.add_option('--lp', dest='deprecated_lp_flag',
action='store_true', default=False,
help=optparse.SUPPRESS_HELP)
parser.add_option('-l', '--lpinstance', metavar='INSTANCE',
dest='lpinstance', default=None,
help='Launchpad instance to connect to '
'(default: production).')
parser.add_option('-s', action='store_true',
dest='sponsorship', default=False,
help='Force sponsorship')
parser.add_option('-C', action='store_true',
dest='missing_changelog_ok', default=False,
help='Allow changelog to be manually filled in '
'when missing')
parser.add_option('-e', action='store_true',
dest='ffe', default=False,
help='Use this after FeatureFreeze for non-bug fix '
'syncs, changes default subscription to the '
'appropriate release team.')
parser.add_option('--no-conf', action='store_true',
dest='no_conf', default=False,
help="Don't read config files or environment variables")
(options, args) = parser.parse_args()
if not len(args):
parser.print_help()
sys.exit(1)
parser.add_argument(
"-d", dest="dist", default="unstable", help="Debian distribution to sync from."
)
parser.add_argument(
"-k",
dest="keyid",
default=None,
help="GnuPG key ID to use for signing report "
"(only used when emailing the sync request).",
)
parser.add_argument(
"-n",
action="store_true",
dest="newpkg",
default=False,
help="Whether package to sync is a new package in Ubuntu.",
)
parser.add_argument(
"--email",
action="store_true",
default=False,
help="Use a PGP-signed email for filing the sync request, rather than the LP API.",
)
parser.add_argument(
"--lp",
dest="deprecated_lp_flag",
action="store_true",
default=False,
help=argparse.SUPPRESS,
)
parser.add_argument(
"-l",
"--lpinstance",
metavar="INSTANCE",
dest="lpinstance",
default=None,
help="Launchpad instance to connect to (default: production).",
)
parser.add_argument(
"-s", action="store_true", dest="sponsorship", default=False, help="Force sponsorship"
)
parser.add_argument(
"-C",
action="store_true",
dest="missing_changelog_ok",
default=False,
help="Allow changelog to be manually filled in when missing",
)
parser.add_argument(
"-e",
action="store_true",
dest="ffe",
default=False,
help="Use this after FeatureFreeze for non-bug fix "
"syncs, changes default subscription to the "
"appropriate release team.",
)
parser.add_argument(
"--no-conf",
action="store_true",
dest="no_conf",
default=False,
help="Don't read config files or environment variables",
)
parser.add_argument("source_package", help=argparse.SUPPRESS)
parser.add_argument("release", nargs="?", help=argparse.SUPPRESS)
parser.add_argument("base_version", nargs="?", type=Version, help=argparse.SUPPRESS)
args = parser.parse_args()
require_utf8()
config = UDTConfig(options.no_conf)
config = UDTConfig(args.no_conf)
if options.deprecated_lp_flag:
if args.deprecated_lp_flag:
Logger.info("The --lp flag is now default, ignored.")
if options.email:
options.lpapi = False
if args.email:
args.lpapi = False
else:
options.lpapi = config.get_value('USE_LPAPI', default=True,
boolean=True)
if options.lpinstance is None:
options.lpinstance = config.get_value('LPINSTANCE')
args.lpapi = config.get_value("USE_LPAPI", default=True, boolean=True)
if args.lpinstance is None:
args.lpinstance = config.get_value("LPINSTANCE")
if options.keyid is None:
options.keyid = config.get_value('KEYID')
if args.keyid is None:
args.keyid = config.get_value("KEYID")
if not options.lpapi:
if options.lpinstance == 'production':
bug_mail_domain = 'bugs.launchpad.net'
elif options.lpinstance == 'staging':
bug_mail_domain = 'bugs.staging.launchpad.net'
if not args.lpapi:
if args.lpinstance == "production":
bug_mail_domain = "bugs.launchpad.net"
elif args.lpinstance == "staging":
bug_mail_domain = "bugs.staging.launchpad.net"
else:
Logger.error('Error: Unknown launchpad instance: %s'
% options.lpinstance)
Logger.error("Error: Unknown launchpad instance: %s", args.lpinstance)
sys.exit(1)
mailserver_host = config.get_value('SMTP_SERVER',
default=None,
compat_keys=['UBUSMTP', 'DEBSMTP'])
if not options.lpapi and not mailserver_host:
mailserver_host = config.get_value(
"SMTP_SERVER", default=None, compat_keys=["UBUSMTP", "DEBSMTP"]
)
if not args.lpapi and not mailserver_host:
try:
import DNS
import DNS # pylint: disable=import-outside-toplevel
DNS.DiscoverNameServers()
mxlist = DNS.mxlookup(bug_mail_domain)
firstmx = mxlist[0]
mailserver_host = firstmx[1]
except ImportError:
Logger.error('Please install python-dns to support '
'Launchpad mail server lookup.')
Logger.error("Please install python-dns to support Launchpad mail server lookup.")
sys.exit(1)
mailserver_port = config.get_value('SMTP_PORT', default=25,
compat_keys=['UBUSMTP_PORT',
'DEBSMTP_PORT'])
mailserver_user = config.get_value('SMTP_USER',
compat_keys=['UBUSMTP_USER',
'DEBSMTP_USER'])
mailserver_pass = config.get_value('SMTP_PASS',
compat_keys=['UBUSMTP_PASS',
'DEBSMTP_PASS'])
mailserver_port = config.get_value(
"SMTP_PORT", default=25, compat_keys=["UBUSMTP_PORT", "DEBSMTP_PORT"]
)
mailserver_user = config.get_value("SMTP_USER", compat_keys=["UBUSMTP_USER", "DEBSMTP_USER"])
mailserver_pass = config.get_value("SMTP_PASS", compat_keys=["UBUSMTP_PASS", "DEBSMTP_PASS"])
# import the needed requestsync module
if options.lpapi:
from ubuntutools.requestsync.lp import (check_existing_reports,
get_debian_srcpkg,
get_ubuntu_srcpkg,
get_ubuntu_delta_changelog,
need_sponsorship, post_bug)
# pylint: disable=import-outside-toplevel
if args.lpapi:
from ubuntutools.lp.lpapicache import Distribution, Launchpad
from ubuntutools.requestsync.lp import (
check_existing_reports,
get_debian_srcpkg,
get_ubuntu_delta_changelog,
get_ubuntu_srcpkg,
need_sponsorship,
post_bug,
)
# See if we have LP credentials and exit if we don't -
# cannot continue in this case
try:
# devel for changelogUrl()
Launchpad.login(service=options.lpinstance, api_version='devel')
Launchpad.login(service=args.lpinstance, api_version="devel")
except IOError:
sys.exit(1)
else:
from ubuntutools.requestsync.mail import (check_existing_reports,
from ubuntutools.requestsync.mail import (
check_existing_reports,
get_debian_srcpkg,
get_ubuntu_srcpkg,
get_ubuntu_delta_changelog,
mail_bug, need_sponsorship)
if not any(x in os.environ for x in ('UBUMAIL', 'DEBEMAIL', 'EMAIL')):
Logger.error('The environment variable UBUMAIL, DEBEMAIL or EMAIL needs '
'to be set to let this script mail the sync request.')
get_ubuntu_srcpkg,
mail_bug,
need_sponsorship,
)
if not any(x in os.environ for x in ("UBUMAIL", "DEBEMAIL", "EMAIL")):
Logger.error(
"The environment variable UBUMAIL, DEBEMAIL or EMAIL needs "
"to be set to let this script mail the sync request."
)
sys.exit(1)
newsource = options.newpkg
sponsorship = options.sponsorship
distro = options.dist
ffe = options.ffe
lpapi = options.lpapi
newsource = args.newpkg
sponsorship = args.sponsorship
distro = args.dist
ffe = args.ffe
lpapi = args.lpapi
need_interaction = False
force_base_version = None
srcpkg = args[0]
srcpkg = args.source_package
if len(args) == 1:
if not args.release:
if lpapi:
release = Distribution('ubuntu').getDevelopmentSeries().name
args.release = Distribution("ubuntu").getDevelopmentSeries().name
else:
ubu_info = UbuntuDistroInfo()
release = ubu_info.devel()
Logger.warning('Target release missing - assuming %s' % release)
elif len(args) == 2:
release = args[1]
elif len(args) == 3:
release = args[1]
force_base_version = Version(args[2])
else:
Logger.error('Too many arguments.')
parser.print_help()
sys.exit(1)
args.release = ubu_info.devel()
Logger.warning("Target release missing - assuming %s", args.release)
# Get the current Ubuntu source package
try:
ubuntu_srcpkg = get_ubuntu_srcpkg(srcpkg, release, 'Proposed')
ubuntu_srcpkg = get_ubuntu_srcpkg(srcpkg, args.release, "Proposed")
ubuntu_version = Version(ubuntu_srcpkg.getVersion())
ubuntu_component = ubuntu_srcpkg.getComponent()
newsource = False # override the -n flag
except udtexceptions.PackageNotFoundException:
ubuntu_srcpkg = None
ubuntu_version = Version('~')
ubuntu_version = Version("~")
ubuntu_component = None # Set after getting the Debian info
if not newsource:
Logger.info("'%s' doesn't exist in 'Ubuntu %s'." % (srcpkg, release))
Logger.info("'%s' doesn't exist in 'Ubuntu %s'.", srcpkg, args.release)
Logger.info("Do you want to sync a new package?")
confirmation_prompt()
newsource = True
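Another mechanical change running through these files: format arguments are now passed to the logger call instead of being pre-interpolated with `%`. The logging module only builds the final string if the record is actually handled, and linters can then verify the placeholders against the arguments:

```python
import logging

logging.basicConfig(level=logging.WARNING, format="%(message)s")
logger = logging.getLogger("requestsync-demo")  # demo logger name

release = "plucky"  # hypothetical value for illustration

# Eager: the message is built even if WARNING were filtered out.
logger.warning("Target release missing - assuming %s" % release)

# Lazy: logging interpolates only when the record is emitted.
logger.warning("Target release missing - assuming %s", release)
```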
@@ -232,15 +252,16 @@ def main():
sys.exit(1)
if ubuntu_component is None:
if debian_component == 'main':
ubuntu_component = 'universe'
if debian_component == "main":
ubuntu_component = "universe"
else:
ubuntu_component = 'multiverse'
ubuntu_component = "multiverse"
# Stop if Ubuntu has already the version from Debian or a newer version
if (ubuntu_version >= debian_version) and options.lpapi:
if (ubuntu_version >= debian_version) and args.lpapi:
# try rmadison
import ubuntutools.requestsync.mail
import ubuntutools.requestsync.mail # pylint: disable=import-outside-toplevel
try:
debian_srcpkg = ubuntutools.requestsync.mail.get_debian_srcpkg(srcpkg, distro)
debian_version = Version(debian_srcpkg.getVersion())
@@ -250,72 +271,80 @@ def main():
sys.exit(1)
if ubuntu_version == debian_version:
Logger.error('The versions in Debian and Ubuntu are the '
'same already (%s). Aborting.' % ubuntu_version)
Logger.error(
"The versions in Debian and Ubuntu are the same already (%s). Aborting.",
ubuntu_version,
)
sys.exit(1)
if ubuntu_version > debian_version:
Logger.error('The version in Ubuntu (%s) is newer than '
'the version in Debian (%s). Aborting.'
% (ubuntu_version, debian_version))
Logger.error(
"The version in Ubuntu (%s) is newer than the version in Debian (%s). Aborting.",
ubuntu_version,
debian_version,
)
sys.exit(1)
# -s flag not specified - check if we do need sponsorship
if not sponsorship:
sponsorship = need_sponsorship(srcpkg, ubuntu_component, release)
sponsorship = need_sponsorship(srcpkg, ubuntu_component, args.release)
if not sponsorship and not ffe:
Logger.error('Consider using syncpackage(1) for syncs that '
'do not require feature freeze exceptions.')
Logger.error(
"Consider using syncpackage(1) for syncs that "
"do not require feature freeze exceptions."
)
# Check for existing package reports
if not newsource:
check_existing_reports(srcpkg)
# Generate bug report
pkg_to_sync = ('%s %s (%s) from Debian %s (%s)'
% (srcpkg, debian_version, ubuntu_component,
distro, debian_component))
title = "Sync %s" % pkg_to_sync
pkg_to_sync = (
f"{srcpkg} {debian_version} ({ubuntu_component})"
f" from Debian {distro} ({debian_component})"
)
title = f"Sync {pkg_to_sync}"
if ffe:
title = "FFe: " + title
report = "Please sync %s\n\n" % pkg_to_sync
report = f"Please sync {pkg_to_sync}\n\n"
if 'ubuntu' in str(ubuntu_version):
if "ubuntu" in str(ubuntu_version):
need_interaction = True
Logger.info('Changes have been made to the package in Ubuntu.')
Logger.info('Please edit the report and give an explanation.')
Logger.info('Not saving the report file will abort the request.')
report += ('Explanation of the Ubuntu delta and why it can be '
'dropped:\n%s\n>>> ENTER_EXPLANATION_HERE <<<\n\n'
% get_ubuntu_delta_changelog(ubuntu_srcpkg))
Logger.info("Changes have been made to the package in Ubuntu.")
Logger.info("Please edit the report and give an explanation.")
Logger.info("Not saving the report file will abort the request.")
report += (
f"Explanation of the Ubuntu delta and why it can be dropped:\n"
f"{get_ubuntu_delta_changelog(ubuntu_srcpkg)}\n>>> ENTER_EXPLANATION_HERE <<<\n\n"
)
if ffe:
need_interaction = True
Logger.info('To approve FeatureFreeze exception, you need to state')
Logger.info('the reason why you feel it is necessary.')
Logger.info('Not saving the report file will abort the request.')
report += ('Explanation of FeatureFreeze exception:\n'
'>>> ENTER_EXPLANATION_HERE <<<\n\n')
Logger.info("To approve FeatureFreeze exception, you need to state")
Logger.info("the reason why you feel it is necessary.")
Logger.info("Not saving the report file will abort the request.")
report += "Explanation of FeatureFreeze exception:\n>>> ENTER_EXPLANATION_HERE <<<\n\n"
if need_interaction:
confirmation_prompt()
base_version = force_base_version or ubuntu_version
base_version = args.base_version or ubuntu_version
if newsource:
report += 'All changelog entries:\n\n'
report += "All changelog entries:\n\n"
else:
report += ('Changelog entries since current %s version %s:\n\n'
% (release, ubuntu_version))
report += f"Changelog entries since current {args.release} version {ubuntu_version}:\n\n"
changelog = debian_srcpkg.getChangelog(since_version=base_version)
if not changelog:
if not options.missing_changelog_ok:
Logger.error("Did not retrieve any changelog entries. "
if not args.missing_changelog_ok:
Logger.error(
"Did not retrieve any changelog entries. "
"Do you need to specify '-C'? "
"Was the package recently uploaded? (check "
"http://packages.debian.org/changelogs/)")
"http://packages.debian.org/changelogs/)"
)
sys.exit(1)
else:
need_interaction = True
@@ -326,36 +355,49 @@ def main():
editor.edit(optional=not need_interaction)
title, report = editor.get_report()
if 'XXX FIXME' in report:
Logger.error("changelog boilerplate found in report, "
"please manually add changelog when using '-C'")
if "XXX FIXME" in report:
Logger.error(
"changelog boilerplate found in report, "
"please manually add changelog when using '-C'"
)
sys.exit(1)
# bug status and bug subscriber
status = 'confirmed'
subscribe = 'ubuntu-archive'
status = "confirmed"
subscribe = "ubuntu-archive"
if sponsorship:
status = 'new'
subscribe = 'ubuntu-sponsors'
status = "new"
subscribe = "ubuntu-sponsors"
if ffe:
status = 'new'
subscribe = 'ubuntu-release'
status = "new"
subscribe = "ubuntu-release"
srcpkg = not newsource and srcpkg or None
srcpkg = None if newsource else srcpkg
if lpapi:
# Map status to the values expected by LP API
mapping = {'new': 'New', 'confirmed': 'Confirmed'}
mapping = {"new": "New", "confirmed": "Confirmed"}
# Post sync request using LP API
post_bug(srcpkg, subscribe, mapping[status], title, report)
else:
email_from = ubu_email(export=False)[1]
# Mail sync request
mail_bug(srcpkg, subscribe, status, title, report, bug_mail_domain,
options.keyid, email_from, mailserver_host, mailserver_port,
mailserver_user, mailserver_pass)
mail_bug(
srcpkg,
subscribe,
status,
title,
report,
bug_mail_domain,
args.keyid,
email_from,
mailserver_host,
mailserver_port,
mailserver_user,
mailserver_pass,
)
if __name__ == '__main__':
if __name__ == "__main__":
try:
main()
except KeyboardInterrupt:

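The `srcpkg = not newsource and srcpkg or None` line replaced above is the pre-Python-2.5 `and/or` ternary idiom, and swapping it for a real conditional expression is a correctness fix, not just style: the old form silently yields the wrong branch whenever the middle operand is falsy. A sketch of the divergence:

```python
def pick_old(newsource, srcpkg):
    # The and/or idiom removed by the diff
    return not newsource and srcpkg or None

def pick_new(newsource, srcpkg):
    # Equivalent intent, correct for falsy srcpkg values
    return None if newsource else srcpkg

# Agreement on the common cases:
assert pick_old(True, "hello") is None and pick_new(True, "hello") is None
assert pick_old(False, "hello") == "hello" == pick_new(False, "hello")

# Divergence: an empty package name is falsy, so the and/or form
# collapses it to None even though newsource is False.
assert pick_old(False, "") is None
assert pick_new(False, "") == ""
```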

@@ -1,9 +1,8 @@
python-debian
python-debianbts
dateutil
distro-info
httplib2
launchpadlib
requests
setuptools
termcolor
pyyaml


@@ -14,16 +14,18 @@
# ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF
# OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE.
# pylint: disable=invalid-name
# pylint: enable=invalid-name
import argparse
import sys
from distro_info import DistroDataOutdated
from ubuntutools.misc import (system_distribution, vendor_to_distroinfo,
codename_to_distribution)
from ubuntutools.rdepends import query_rdepends, RDependsException
from ubuntutools import getLogger
from ubuntutools.misc import codename_to_distribution, system_distribution, vendor_to_distroinfo
from ubuntutools.rdepends import RDependsException, query_rdepends
Logger = getLogger()
DEFAULT_MAX_DEPTH = 10  # We want to avoid any infinite loop...
@@ -35,77 +37,107 @@ def main():
default_release = system_distro_info.devel()
except DistroDataOutdated as e:
Logger.warning(e)
default_release = 'unstable'
default_release = "unstable"
description = ("List reverse-dependencies of package. "
description = (
"List reverse-dependencies of package. "
"If the package name is prefixed with src: then the "
"reverse-dependencies of all the binary packages that "
"the specified source package builds will be listed.")
"the specified source package builds will be listed."
)
parser = argparse.ArgumentParser(description=description)
parser.add_argument('-r', '--release', default=default_release,
help='Query dependencies in RELEASE. '
'Default: %s' % default_release)
parser.add_argument('-R', '--without-recommends', action='store_false',
dest='recommends',
help='Only consider Depends relationships, '
'not Recommends')
parser.add_argument('-s', '--with-suggests', action='store_true',
help='Also consider Suggests relationships')
parser.add_argument('-b', '--build-depends', action='store_true',
help='Query build dependencies (synonym for --arch=source)')
parser.add_argument('-a', '--arch', default='any',
help='Query dependencies in ARCH. Default: any')
parser.add_argument('-c', '--component', action='append',
help='Only consider reverse-dependencies in COMPONENT. '
'Can be specified multiple times. Default: all')
parser.add_argument('-l', '--list', action='store_true',
help='Display a simple, machine-readable list')
parser.add_argument('-u', '--service-url', metavar='URL',
dest='server', default=None,
help='Reverse Dependencies webservice URL. '
'Default: UbuntuWire')
parser.add_argument('-x', '--recursive', action='store_true',
help='Find reverse dependencies recursively.')
parser.add_argument('-d', '--recursive-depth', type=int,
parser.add_argument(
"-r",
"--release",
default=default_release,
help="Query dependencies in RELEASE. Default: %(default)s",
)
parser.add_argument(
"-R",
"--without-recommends",
action="store_false",
dest="recommends",
help="Only consider Depends relationships, not Recommends",
)
parser.add_argument(
"-s", "--with-suggests", action="store_true", help="Also consider Suggests relationships"
)
parser.add_argument(
"-b",
"--build-depends",
action="store_true",
help="Query build dependencies (synonym for --arch=source)",
)
parser.add_argument(
"-a", "--arch", default="any", help="Query dependencies in ARCH. Default: any"
)
parser.add_argument(
"-c",
"--component",
action="append",
help="Only consider reverse-dependencies in COMPONENT. "
"Can be specified multiple times. Default: all",
)
parser.add_argument(
"-l", "--list", action="store_true", help="Display a simple, machine-readable list"
)
parser.add_argument(
"-u",
"--service-url",
metavar="URL",
dest="server",
default=None,
help="Reverse Dependencies webservice URL. Default: UbuntuWire",
)
parser.add_argument(
"-x",
"--recursive",
action="store_true",
help="Find reverse dependencies recursively.",
)
parser.add_argument(
"-d",
"--recursive-depth",
type=int,
default=DEFAULT_MAX_DEPTH,
help="If recursive, you can specify the depth.",
)
parser.add_argument("package")
options = parser.parse_args()
opts = {}
if options.server is not None:
opts["server"] = options.server
# Convert unstable/testing aliases to codenames:
distribution = codename_to_distribution(options.release)
if not distribution:
parser.error(f"Unknown release codename {options.release}")
distro_info = vendor_to_distroinfo(distribution)()
try:
options.release = distro_info.codename(options.release, default=options.release)
except DistroDataOutdated:
# We already logged a warning
pass
if options.build_depends:
options.arch = "source"
if options.arch == "source":
fields = [
"Reverse-Build-Depends",
"Reverse-Build-Depends-Indep",
"Reverse-Build-Depends-Arch",
"Reverse-Testsuite-Triggers",
]
else:
fields = ["Reverse-Depends"]
if options.recommends:
fields.append("Reverse-Recommends")
if options.with_suggests:
fields.append("Reverse-Suggests")
def build_results(package, result, fields, component, recursive):
try:
@@ -119,9 +151,9 @@ def main():
if fields:
data = {k: v for k, v in data.items() if k in fields}
if component:
data = {
k: [rdep for rdep in v if rdep["Component"] in component] for k, v in data.items()
}
data = {k: v for k, v in data.items() if v}
result[package] = data
@@ -129,13 +161,16 @@ def main():
if recursive > 0:
for rdeps in result[package].values():
for rdep in rdeps:
build_results(rdep["Package"], result, fields, component, recursive - 1)
result = {}
build_results(
options.package,
result,
fields,
options.component,
options.recursive and options.recursive_depth or 0,
)
if options.list:
display_consise(result)
@@ -148,52 +183,59 @@ def display_verbose(package, values):
Logger.info("No reverse dependencies found")
return
def log_package(values, package, arch, dependency, visited, offset=0):
line = f"{' ' * offset}* {package}"
if all_archs and set(arch) != all_archs:
line += f" [{' '.join(sorted(arch))}]"
if dependency:
if len(line) < 30:
line += " " * (30 - len(line))
line += f" (for {dependency})"
Logger.info(line)
if package in visited:
return
visited = visited | {package}
data = values.get(package)
if data:
offset = offset + 1
for rdeps in data.values():
for rdep in rdeps:
log_package(
values,
rdep["Package"],
rdep.get("Architectures", all_archs),
rdep.get("Dependency"),
visited,
offset,
)
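The visited-set guard added to log_package above can be sketched in isolation. A minimal toy (the graph dict is hypothetical data, not the webservice format): each recursive call gets a copy of the set extended with the current package, so a cycle is printed once and then cut off:

```python
def walk(graph, package, visited, lines, offset=0):
    # Record the package at the current indentation level.
    lines.append(" " * offset + "* " + package)
    if package in visited:
        return  # already seen on this branch: stop before looping forever
    visited = visited | {package}  # copy-and-extend; never mutate the caller's set
    for rdep in graph.get(package, []):
        walk(graph, rdep, visited, lines, offset + 1)

graph = {"a": ["b"], "b": ["a"]}  # a <-> b dependency cycle
lines = []
walk(graph, "a", set(), lines)
```

Without the `package in visited` check, the a/b cycle would recurse forever; with it, the repeated package is still listed once so the cycle remains visible in the output.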
all_archs = set()
# This isn't accurate, but we make up for it by displaying what we found
for data in values.values():
for rdeps in data.values():
for rdep in rdeps:
if "Architectures" in rdep:
all_archs.update(rdep["Architectures"])
for field, rdeps in values[package].items():
Logger.info("%s", field)
Logger.info("%s", "=" * len(field))
rdeps.sort(key=lambda x: x["Package"])
for rdep in rdeps:
log_package(
values,
rdep["Package"],
rdep.get("Architectures", all_archs),
rdep.get("Dependency"),
{package},
)
Logger.info("")
if all_archs:
Logger.info(
"Packages without architectures listed are reverse-dependencies in: %s",
", ".join(sorted(list(all_archs))),
)
def display_consise(values):
@@ -201,10 +243,10 @@ def display_consise(values):
for data in values.values():
for rdeps in data.values():
for rdep in rdeps:
result.add(rdep["Package"])
Logger.info("\n".join(sorted(list(result))))
if __name__ == "__main__":
main()

run-linters Executable file

@@ -0,0 +1,19 @@
#!/bin/sh
set -eu
# Copyright 2023, Canonical Ltd.
# SPDX-License-Identifier: GPL-3.0
PYTHON_SCRIPTS=$(grep -l -r '^#! */usr/bin/python3$' .)
echo "Running black..."
black --check --diff . $PYTHON_SCRIPTS
echo "Running isort..."
isort --check-only --diff .
echo "Running flake8..."
flake8 --max-line-length=99 --ignore=E203,W503 . $PYTHON_SCRIPTS
echo "Running pylint..."
pylint $(find * -name '*.py') $PYTHON_SCRIPTS

running-autopkgtests Executable file

@@ -0,0 +1,81 @@
#!/usr/bin/python3
# -*- Mode: Python; coding: utf-8; indent-tabs-mode: nil; tab-width: 4 -*-
# Authors:
# Andy P. Whitcroft
# Christian Ehrhardt
# Chris Peterson <chris.peterson@canonical.com>
#
# Copyright (C) 2024 Canonical Ltd.
# This program is free software: you can redistribute it and/or modify it
# under the terms of the GNU General Public License version 3, as published
# by the Free Software Foundation.
#
# This program is distributed in the hope that it will be useful, but
# WITHOUT ANY WARRANTY; without even the implied warranties of
# MERCHANTABILITY, SATISFACTORY QUALITY, or FITNESS FOR A PARTICULAR PURPOSE.
# See the GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License along
# with this program. If not, see <http://www.gnu.org/licenses/>.
"""Dumps a list of currently running tests in Autopkgtest"""
__example__ = """
Display first listed test running on amd64 hardware:
$ running-autopkgtests | grep amd64 | head -n1
R 0:01:40 systemd-upstream - focal amd64\
upstream-systemd-ci/systemd-ci - ['CFLAGS=-O0', 'DEB_BUILD_PROFILES=noudeb',\
'TEST_UPSTREAM=1', 'CONFFLAGS_UPSTREAM=--werror -Dslow-tests=true',\
'UPSTREAM_PULL_REQUEST=23153',\
'GITHUB_STATUSES_URL=https://api.github.com/repos/\
systemd/systemd/statuses/cfb0935923dff8050315b5dd22ce8ab06461ff0e']
"""
import sys
from argparse import ArgumentParser, RawDescriptionHelpFormatter
from ubuntutools.running_autopkgtests import get_queued, get_running
def parse_args():
description = (
"Dumps a list of currently running and queued tests in Autopkgtest. "
"Pass --running to only see running tests, or --queued to only see "
"queued tests. Passing both will print both, which is the default behavior. "
)
parser = ArgumentParser(
prog="running-autopkgtests",
description=description,
epilog=f"example: {__example__}",
formatter_class=RawDescriptionHelpFormatter,
)
parser.add_argument(
"-r", "--running", action="store_true", help="Print running autopkgtests (default: true)"
)
parser.add_argument(
"-q", "--queued", action="store_true", help="Print queued autopkgtests (default: true)"
)
options = parser.parse_args()
# If neither flag was specified, default to both, not neither
if not options.running and not options.queued:
options.running = True
options.queued = True
return options
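The neither-flag-means-both behaviour above reduces to a small argparse pattern. A standalone sketch that mirrors the two flags (not the script's actual parser):

```python
from argparse import ArgumentParser

def parse(argv):
    # Two independent store_true flags; passing neither selects both.
    parser = ArgumentParser()
    parser.add_argument("-r", "--running", action="store_true")
    parser.add_argument("-q", "--queued", action="store_true")
    opts = parser.parse_args(argv)
    if not opts.running and not opts.queued:
        opts.running = opts.queued = True
    return opts
```

This keeps each flag purely additive: `-r` or `-q` narrows the output, and the common no-argument case still shows everything.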
def main() -> int:
args = parse_args()
if args.running:
print(get_running())
if args.queued:
print(get_queued())
return 0
if __name__ == "__main__":
sys.exit(main())


@@ -14,51 +14,53 @@
# ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF
# OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE.
# pylint: disable=invalid-name
# pylint: enable=invalid-name
import argparse
import collections
import gzip
import json
import os
import time
import urllib.request
from ubuntutools import getLogger
from ubuntutools.lp.lpapicache import Distribution, Launchpad, PackageNotFoundException
Logger = getLogger()
DATA_URL = "http://qa.ubuntuwire.org/ubuntu-seeded-packages/seeded.json.gz"
def load_index(url):
"""Download a new copy of the image contents index, if necessary,
and read it.
"""
cachedir = os.path.expanduser("~/.cache/ubuntu-dev-tools")
seeded = os.path.join(cachedir, "seeded.json.gz")
if not os.path.isfile(seeded) or time.time() - os.path.getmtime(seeded) > 60 * 60 * 2:
if not os.path.isdir(cachedir):
os.makedirs(cachedir)
urllib.request.urlretrieve(url, seeded)
try:
with gzip.open(seeded, "r") as f:
return json.load(f)
except Exception as e: # pylint: disable=broad-except
Logger.error(
"Unable to parse seed data: %s. Deleting cached data, please try again.", str(e)
)
os.unlink(seeded)
return None
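The two-hour cache check in load_index comes down to one predicate. A minimal sketch (`is_stale` is a hypothetical helper for illustration, not part of the script):

```python
import os
import time

CACHE_MAX_AGE = 60 * 60 * 2  # two hours, matching load_index above

def is_stale(path, max_age=CACHE_MAX_AGE):
    # A missing file and a file older than max_age both trigger a re-download.
    return not os.path.isfile(path) or time.time() - os.path.getmtime(path) > max_age
```

Checking `isfile` first matters: `os.path.getmtime` raises `OSError` on a missing path, and the short-circuiting `or` guarantees it is only called when the file exists.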
def resolve_binaries(sources):
"""Return a dict of source:binaries for all binary packages built by
sources
"""
archive = Distribution("ubuntu").getArchive()
binaries = {}
for source in sources:
try:
@@ -66,80 +68,84 @@ def resolve_binaries(sources):
except PackageNotFoundException as e:
Logger.error(str(e))
continue
binaries[source] = sorted(set(bpph.getPackageName() for bpph in spph.getBinaries()))
return binaries
def present_on(appearances):
"""Format a list of (flavor, type) tuples into a human-readable string"""
present = collections.defaultdict(set)
for flavor, type_ in appearances:
present[flavor].add(type_)
for flavor, types in present.items():
if len(types) > 1:
types.discard("supported")
output = [f" {flavor}: {', '.join(sorted(types))}" for flavor, types in present.items()]
output.sort()
return "\n".join(output)
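The grouping in present_on in miniature, with hypothetical flavor data and a plain dict returned instead of the formatted string:

```python
import collections

def group(appearances):
    # Collect (flavor, type) pairs into flavor -> set of types, then drop
    # the generic "supported" tag wherever a more specific type also exists.
    present = collections.defaultdict(set)
    for flavor, type_ in appearances:
        present[flavor].add(type_)
    for types in present.values():
        if len(types) > 1:
            types.discard("supported")
    return {flavor: sorted(types) for flavor, types in present.items()}
```

The discard-only-when-multiple rule keeps "supported" as the answer for flavors that have nothing more specific, while hiding it behind e.g. "daily" elsewhere.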
def output_binaries(index, binaries):
"""Print binaries found in index"""
for binary in binaries:
if binary in index:
Logger.info("%s is seeded in:", binary)
Logger.info(present_on(index[binary]))
else:
Logger.info("%s is not seeded (and may not exist).", binary)
def output_by_source(index, by_source):
"""Print binaries found in index, grouped by source"""
for source, binaries in by_source.items():
seen = False
if not binaries:
Logger.info(
"Status unknown: No binary packages built by the latest "
"%s.\nTry again using -b and the expected binary packages.",
source,
)
continue
for binary in binaries:
if binary in index:
seen = True
Logger.info("%s (from %s) is seeded in:", binary, source)
Logger.info(present_on(index[binary]))
if not seen:
Logger.info("%s's binaries are not seeded.", source)
def main():
"""Query which images the specified packages are on"""
parser = argparse.ArgumentParser(usage="%(prog)s [options] package...")
parser.add_argument(
"-b",
"--binary",
default=False,
action="store_true",
help="Binary packages are being specified, not source packages (fast)",
)
parser.add_argument(
"-u",
"--data-url",
metavar="URL",
default=DATA_URL,
help="URL for the seeded packages index. Default: UbuntuWire",
)
parser.add_argument("packages", metavar="package", nargs="+", help=argparse.SUPPRESS)
args = parser.parse_args()
# Login anonymously to LP
Launchpad.login_anonymously()
index = load_index(args.data_url)
if args.binary:
output_binaries(index, args.packages)
else:
binaries = resolve_binaries(args.packages)
output_by_source(index, binaries)
if __name__ == "__main__":
main()


@@ -104,7 +104,7 @@ echo "In order to do packaging work, you'll need a minimal set of packages."
echo "Those, together with other packages which, though optional, have proven"
echo "to be useful, will now be installed."
echo
sudo apt-get install ubuntu-dev-tools devscripts debhelper patchutils pbuilder build-essential
separator2
echo "Enabling the source repository"

setup.py

@@ -1,82 +1,100 @@
#!/usr/bin/python3
import glob
import os
import pathlib
import re
from setuptools import setup
def get_debian_version() -> str:
"""Look what Debian version we have."""
changelog = pathlib.Path(__file__).parent / "debian" / "changelog"
with changelog.open("r", encoding="utf-8") as changelog_f:
head = changelog_f.readline()
match = re.compile(r".*\((.*)\).*").match(head)
if not match:
raise ValueError(f"Failed to extract Debian version from '{head}'.")
return match.group(1)
def make_pep440_compliant(version: str) -> str:
"""Convert the version into a PEP440 compliant version."""
public_version_re = re.compile(r"^([0-9][0-9.]*(?:(?:a|b|rc|.post|.dev)[0-9]+)*)\+?")
_, public, local = public_version_re.split(version, maxsplit=1)
if not local:
return version
sanitized_local = re.sub("[+~]+", ".", local).strip(".")
pep440_version = f"{public}+{sanitized_local}"
assert re.match("^[a-zA-Z0-9.]+$", sanitized_local), f"'{pep440_version}' not PEP440 compliant"
return pep440_version
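A standalone copy of the regexes above shows how a Debian revision with "~"/"+" separators maps onto a PEP 440 local version segment (condensed for illustration; the function in setup.py is the authoritative version):

```python
import re

def pep440(version):
    # Split "public" (digits, dots, pre/post/dev tags) from the Debian-style
    # local part, then normalise "+" and "~" runs in the local part to dots.
    public_re = re.compile(r"^([0-9][0-9.]*(?:(?:a|b|rc|.post|.dev)[0-9]+)*)\+?")
    _, public, local = public_re.split(version, maxsplit=1)
    if not local:
        return version
    return f"{public}+{re.sub('[+~]+', '.', local).strip('.')}"
```

Because the pattern has a capturing group, `re.split` returns the text before the match, the captured public version, and the remainder, so an Ubuntu suffix like `ubuntu1~ppa1` ends up as the dotted local segment `ubuntu1.ppa1`.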
scripts = [
"backportpackage",
"check-mir",
"check-symbols",
"dch-repeat",
"grab-merge",
"grep-merges",
"import-bug-from-debian",
"lp-bitesize",
"merge-changelog",
"mk-sbuild",
"pbuilder-dist",
"pbuilder-dist-simple",
"pm-helper",
"pull-pkg",
"pull-debian-debdiff",
"pull-debian-source",
"pull-debian-debs",
"pull-debian-ddebs",
"pull-debian-udebs",
"pull-lp-source",
"pull-lp-debs",
"pull-lp-ddebs",
"pull-lp-udebs",
"pull-ppa-source",
"pull-ppa-debs",
"pull-ppa-ddebs",
"pull-ppa-udebs",
"pull-uca-source",
"pull-uca-debs",
"pull-uca-ddebs",
"pull-uca-udebs",
"requestbackport",
"requestsync",
"reverse-depends",
"running-autopkgtests",
"seeded-in-ubuntu",
"setup-packaging-environment",
"sponsor-patch",
"submittodebian",
"syncpackage",
"ubuntu-build",
"ubuntu-iso",
"ubuntu-upload-permission",
"update-maintainer",
]
data_files = [
("share/bash-completion/completions", glob.glob("bash_completion/*")),
("share/man/man1", glob.glob("doc/*.1")),
("share/man/man5", glob.glob("doc/*.5")),
("share/ubuntu-dev-tools", ["enforced-editing-wrapper"]),
]
if __name__ == "__main__":
setup(
name="ubuntu-dev-tools",
version=make_pep440_compliant(get_debian_version()),
scripts=scripts,
packages=[
"ubuntutools",
"ubuntutools/lp",
"ubuntutools/requestsync",
"ubuntutools/sponsor_patch",
"ubuntutools/test",
],
data_files=data_files,
test_suite="ubuntutools.test",
)


@@ -14,123 +14,153 @@
# ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF
# OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE.
# pylint: disable=invalid-name
# pylint: enable=invalid-name
import argparse
import logging
import os
import shutil
import sys
import tempfile
from ubuntutools import getLogger
from ubuntutools.builder import get_builder
from ubuntutools.config import UDTConfig
from ubuntutools.sponsor_patch.sponsor_patch import check_dependencies, sponsor_patch
Logger = getLogger()
def parse(script_name):
"""Parse the command line parameters."""
usage = (
"%(prog)s [options] <bug number>\n"
"One of --upload, --workdir, or --sponsor must be specified."
)
epilog = f"See {script_name}(1) for more info."
parser = argparse.ArgumentParser(usage=usage, epilog=epilog)
parser.add_argument(
"-b",
"--build",
dest="build",
help="Build the package with the specified builder.",
action="store_true",
)
parser.add_argument(
"-B", "--builder", dest="builder", help="Specify the package builder (default pbuilder)"
)
parser.add_argument(
"-e",
"--edit",
help="launch sub-shell to allow editing of the patch",
dest="edit",
action="store_true",
)
parser.add_argument(
"-k", "--key", dest="keyid", help="Specify the key ID to be used for signing."
)
parser.add_argument(
"-l",
"--lpinstance",
dest="lpinstance",
help="Launchpad instance to connect to (default: production)",
metavar="INSTANCE",
)
parser.add_argument(
"--no-conf",
dest="no_conf",
help="Don't read config files or environment variables.",
action="store_true",
)
parser.add_argument(
"-s",
"--sponsor",
help="sponsoring; equals -b -u ubuntu",
dest="sponsoring",
action="store_true",
)
parser.add_argument(
"-u", "--upload", dest="upload", help="Specify an upload destination (default none)."
)
parser.add_argument(
"-U",
"--update",
dest="update",
action="store_true",
help="Update the build environment before building.",
)
parser.add_argument(
"-v", "--verbose", help="print more information", dest="verbose", action="store_true"
)
parser.add_argument(
"-w",
"--workdir",
dest="workdir",
help="Specify a working directory (default is a "
"temporary directory, deleted afterwards).",
)
parser.add_argument("bug_number", type=int, help=argparse.SUPPRESS)
args = parser.parse_args()
if args.verbose:
Logger.setLevel(logging.DEBUG)
check_dependencies()
config = UDTConfig(args.no_conf)
if args.builder is None:
args.builder = config.get_value("BUILDER")
if args.lpinstance is None:
args.lpinstance = config.get_value("LPINSTANCE")
if not args.update:
args.update = config.get_value("UPDATE_BUILDER", boolean=True)
if args.workdir is None:
args.workdir = config.get_value("WORKDIR")
if args.keyid is None:
args.keyid = config.get_value("KEYID")
if args.sponsoring:
args.build = True
args.upload = "ubuntu"
return args
def main():
script_name = os.path.basename(sys.argv[0])
args = parse(script_name)
builder = get_builder(args.builder)
if not builder:
sys.exit(1)
if not args.upload and not args.workdir:
Logger.error("Please specify either a working directory or an upload target!")
sys.exit(1)
if args.workdir is None:
workdir = tempfile.mkdtemp(prefix=script_name + "-")
else:
workdir = args.workdir
try:
sponsor_patch(
args.bug_number,
args.build,
builder,
args.edit,
args.keyid,
args.lpinstance,
args.update,
args.upload,
workdir,
)
except KeyboardInterrupt:
Logger.error("User abort.")
sys.exit(2)
finally:
if args.workdir is None:
shutil.rmtree(workdir)


@@ -22,32 +22,36 @@
#
# ##################################################################
"""Submit the Ubuntu changes in a package to Debian.
Run inside an unpacked Ubuntu source package.
"""
import argparse
import os
import re
import shutil
import sys
from subprocess import DEVNULL, PIPE, Popen, call, check_call, run
from tempfile import mkdtemp
from debian.changelog import Changelog
from distro_info import DistroDataOutdated, UbuntuDistroInfo
from ubuntutools import getLogger
from ubuntutools.config import ubu_email
from ubuntutools.question import EditFile, YesNoQuestion
from ubuntutools.update_maintainer import restore_maintainer, update_maintainer
Logger = getLogger()
def get_most_recent_debian_version(changelog):
for block in changelog:
version = block.version.full_version
if not re.search("(ubuntu|build)", version):
return version
return None
def get_bug_body(changelog):
@@ -65,19 +69,20 @@ In Ubuntu, the attached patch was applied to achieve the following:
%s
Thanks for considering the patch.
""" % (
"\n".join(entry.changes())
)
return msg
def build_source_package():
if os.path.isdir(".bzr"):
cmd = ["bzr", "bd", "--builder=dpkg-buildpackage", "-S", "--", "-uc", "-us", "-nc"]
else:
cmd = ["dpkg-buildpackage", "-S", "-uc", "-us", "-nc"]
env = os.environ.copy()
# Unset DEBEMAIL in case there's an @ubuntu.com e-mail address
env.pop("DEBEMAIL", None)
check_call(cmd, env=env)
@@ -88,30 +93,35 @@ def gen_debdiff(tmpdir, changelog):
newver = next(changelog_it).version
oldver = next(changelog_it).version
debdiff = os.path.join(tmpdir, f"{pkg}_{newver}.debdiff")
diff_cmd = ["bzr", "diff", "-r", "tag:" + str(oldver)]
if call(diff_cmd, stdout=DEVNULL, stderr=DEVNULL) == 1:
Logger.info("Extracting bzr diff between %s and %s", oldver, newver)
else:
if oldver.epoch is not None:
oldver = str(oldver)[str(oldver).index(":") + 1 :]
if newver.epoch is not None:
newver = str(newver)[str(newver).index(":") + 1 :]
olddsc = f"../{pkg}_{oldver}.dsc"
newdsc = f"../{pkg}_{newver}.dsc"
check_file(olddsc)
check_file(newdsc)
Logger.info("Generating debdiff between %s and %s", oldver, newver)
diff_cmd = ["debdiff", olddsc, newdsc]
with Popen(diff_cmd, stdout=PIPE, encoding="utf-8") as diff:
with open(debdiff, "w", encoding="utf-8") as debdiff_f:
run(
["filterdiff", "-x", "*changelog*"],
check=False,
stdin=diff.stdout,
stdout=debdiff_f,
encoding="utf-8",
)
return debdiff
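The epoch handling above performs the same slice twice; the idea fits in one hypothetical helper (the script itself keeps the inline slicing):

```python
def strip_epoch(version):
    # .dsc filenames omit the Debian epoch, so "1:2.3-1" becomes "2.3-1"
    # before the ../pkg_version.dsc path is built.
    return version.split(":", 1)[1] if ":" in version else version
```

Splitting with `maxsplit=1` matters: only the first colon is the epoch separator, so any later colon in the version string is preserved.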
@@ -119,10 +129,9 @@ def gen_debdiff(tmpdir, changelog):
def check_file(fname, critical=True):
if os.path.exists(fname):
return fname
if not critical:
return False
Logger.info("Couldn't find «%s».\n", fname)
sys.exit(1)
@@ -131,76 +140,84 @@ def submit_bugreport(body, debdiff, deb_version, changelog):
devel = UbuntuDistroInfo().devel()
except DistroDataOutdated as e:
Logger.info(str(e))
devel = ""
if os.path.dirname(sys.argv[0]).startswith("/usr/bin"):
editor_path = "/usr/share/ubuntu-dev-tools"
else:
editor_path = os.path.dirname(sys.argv[0])
env = dict(os.environ.items())
if "EDITOR" in env:
env["UDT_EDIT_WRAPPER_EDITOR"] = env["EDITOR"]
if "VISUAL" in env:
env["UDT_EDIT_WRAPPER_VISUAL"] = env["VISUAL"]
env["EDITOR"] = os.path.join(editor_path, "enforced-editing-wrapper")
env["VISUAL"] = os.path.join(editor_path, "enforced-editing-wrapper")
env["UDT_EDIT_WRAPPER_TEMPLATE_RE"] = ".*REPLACE THIS WITH ACTUAL INFORMATION.*"
env["UDT_EDIT_WRAPPER_FILE_DESCRIPTION"] = "bug report"
# In external mua mode, attachments are lost (Reportbug bug: #679907)
internal_mua = True
for cfgfile in ("/etc/reportbug.conf", "~/.reportbugrc"):
cfgfile = os.path.expanduser(cfgfile)
if not os.path.exists(cfgfile):
continue
with open(cfgfile, "r", encoding="utf-8") as f:
for line in f:
line = line.strip()
if line in ("gnus", "mutt", "nmh") or line.startswith("mua "):
internal_mua = False
break
cmd = (
"reportbug",
"--no-check-available",
"--no-check-installed",
"--pseudo-header",
"User: ubuntu-devel@lists.ubuntu.com",
"--pseudo-header",
f"Usertags: origin-ubuntu {devel} ubuntu-patch",
"--tag",
"patch",
"--bts",
"debian",
"--include",
body,
"--attach" if internal_mua else "--include",
debdiff,
"--package-version",
deb_version,
changelog.package,
)
check_call(cmd, env=env)
def check_reportbug_config():
reportbugrc_filename = os.path.expanduser("~/.reportbugrc")
if os.path.exists(reportbugrc_filename):
return
email = ubu_email()[1]
reportbugrc = f"""# Reportbug configuration generated by submittodebian(1)
# See reportbug.conf(5) for the configuration file format.
# Use Debian's reportbug SMTP Server:
# Note: it's limited to 5 connections per hour, and cannot CC you at submission
# time. See /usr/share/doc/reportbug/README.Users.gz for more details.
smtphost reportbug.debian.org:587
header "X-Debbugs-CC: {email}"
no-cc
# Use GMail's SMTP Server:
#smtphost smtp.googlemail.com:587
#smtpuser "<your address>@gmail.com"
#smtptls
""" % email
"""
with open(fn, 'w') as f:
with open(reportbugrc_filename, "w", encoding="utf-8") as f:
f.write(reportbugrc)
Logger.info("""\
Logger.info(
"""\
You have not configured reportbug. Assuming this is the first time you have
used it. Writing a ~/.reportbugrc that will use Debian's mail server, and CC
the bug to you at <%s>
@@ -211,40 +228,43 @@ the bug to you at <%s>
If this is not correct, please exit now and edit ~/.reportbugrc or run
reportbug --configure for its configuration wizard.
""" % (email, reportbugrc.strip()))
""",
email,
reportbugrc.strip(),
)
if YesNoQuestion().ask("Continue submitting this bug", "yes") == "no":
sys.exit(1)
def main():
description = 'Submit the Ubuntu changes in a package to Debian. ' + \
'Run inside an unpacked Ubuntu source package.'
parser = optparse.OptionParser(description=description)
parser = argparse.ArgumentParser(description=__doc__)
parser.parse_args()
if not os.path.exists('/usr/bin/reportbug'):
Logger.error("This utility requires the «reportbug» package, which isn't "
"currently installed.")
if not os.path.exists("/usr/bin/reportbug"):
Logger.error(
"This utility requires the «reportbug» package, which isn't currently installed."
)
sys.exit(1)
check_reportbug_config()
changelog_file = (check_file('debian/changelog', critical=False) or
check_file('../debian/changelog'))
with open(changelog_file) as f:
changelog_file = check_file("debian/changelog", critical=False) or check_file(
"../debian/changelog"
)
with open(changelog_file, encoding="utf-8") as f:
changelog = Changelog(f.read())
deb_version = get_most_recent_debian_version(changelog)
bug_body = get_bug_body(changelog)
tmpdir = mkdtemp()
body = os.path.join(tmpdir, 'bug_body')
with open(body, 'wb') as f:
f.write(bug_body.encode('utf-8'))
body = os.path.join(tmpdir, "bug_body")
with open(body, "wb") as f:
f.write(bug_body.encode("utf-8"))
restore_maintainer('debian')
restore_maintainer("debian")
build_source_package()
update_maintainer('debian')
update_maintainer("debian")
debdiff = gen_debdiff(tmpdir, changelog)
@@ -252,7 +272,7 @@ def main():
# reverted in the most recent build
build_source_package()
EditFile(debdiff, 'debdiff').edit(optional=True)
EditFile(debdiff, "debdiff").edit(optional=True)
submit_bugreport(body, debdiff, deb_version, changelog)
os.unlink(body)
@@ -260,5 +280,5 @@ def main():
shutil.rmtree(tmpdir)
if __name__ == '__main__':
if __name__ == "__main__":
main()
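The submittodebian diff above scans /etc/reportbug.conf and ~/.reportbugrc line by line to detect whether an external mail agent is configured. A minimal, self-contained sketch of that scan, using a throwaway config file rather than the real paths:

```python
import os
import tempfile

def uses_internal_mua(cfg_paths):
    """Return False if any config file selects an external mail agent.

    Mirrors the reportbug.conf scan in the diff above: a bare MUA name
    ('gnus', 'mutt', 'nmh') or a 'mua ' directive disables the internal MUA.
    """
    for cfgfile in cfg_paths:
        cfgfile = os.path.expanduser(cfgfile)
        if not os.path.exists(cfgfile):
            continue
        with open(cfgfile, encoding="utf-8") as f:
            for line in f:
                line = line.strip()
                if line in ("gnus", "mutt", "nmh") or line.startswith("mua "):
                    return False
    return True

# Usage with a throwaway config file (hypothetical content):
with tempfile.NamedTemporaryFile("w", suffix=".conf", delete=False) as tmp:
    tmp.write("mua mutt\n")
print(uses_internal_mua([tmp.name]))  # False: an external MUA is configured
```

A missing config file is simply skipped, so the helper behaves the same whether or not the user has ever run reportbug.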


@@ -1 +0,0 @@
upstream


@@ -1 +0,0 @@
7


@@ -1,12 +0,0 @@
Source: example
Section: misc
Priority: extra
Maintainer: Ubuntu Developers <ubuntu-dev-team@lists.alioth.debian.org>
Build-Depends: debhelper (>= 7.0.50~)
Standards-Version: 3.9.1
Package: example
Architecture: all
Depends: ${misc:Depends}, ${shlibs:Depends}
Description: Example package for testing purposes
An example package used by the test suite. Useless.


@@ -1,17 +0,0 @@
Format: http://svn.debian.org/wsvn/dep/web/deps/dep5.mdwn?op=file&rev=152
Source: https://launchpad.net/ubuntu-dev-tools
Files: *
Copyright: 2010-2011, Stefano Rivera <stefanor@ubuntu.com>
License: ISC
Permission to use, copy, modify, and/or distribute this software for any
purpose with or without fee is hereby granted, provided that the above
copyright notice and this permission notice appear in all copies.
.
THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH
REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY
AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY SPECIAL, DIRECT,
INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM
LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR
OTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR
PERFORMANCE OF THIS SOFTWARE.


@@ -1,4 +0,0 @@
#!/usr/bin/make -f
%:
dh $@


@@ -1 +0,0 @@
3.0 (quilt)


@@ -1 +0,0 @@
abort-on-upstream-changes


@@ -1 +0,0 @@
compression=xz


@@ -1,3 +1,4 @@
coverage
flake8 >= 3.8.0
nose
pytest
pytest-cov


@@ -1,5 +1,5 @@
[tox]
envlist = flake8,nose
envlist = flake8,pytest
skipsdist = True
[testenv]
@@ -14,10 +14,9 @@ install_command = pip install {opts} {packages}
[testenv:flake8]
commands = flake8 {posargs}
[testenv:nose]
commands = nosetests -v --with-coverage --cover-package=ubuntutools {posargs:ubuntutools}
[testenv:pytest]
commands = pytest -v --cov=ubuntutools {posargs:ubuntutools}
[flake8]
verbose = 2
max-line-length = 99
extend-exclude = ubuntu-archive-assistant,ubuntu_archive_assistant


@ -1,28 +0,0 @@
#!/usr/bin/python3
# -*- coding: utf-8 -*-
# Copyright (C) 2018 Canonical Ltd.
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; version 3 of the License.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
import signal
import sys
from ubuntu_archive_assistant.core import Assistant
assistant = Assistant()
def signal_handler(signal, frame):
sys.exit(0)
signal.signal(signal.SIGINT, signal_handler)
assistant.main()


@@ -2,16 +2,16 @@
#
# ubuntu-build - command line interface for Launchpad buildd operations.
#
# Copyright (C) 2007 Canonical Ltd.
# Copyright (C) 2007-2024 Canonical Ltd.
# Authors:
# - Martin Pitt <martin.pitt@canonical.com>
# - Jonathan Davies <jpds@ubuntu.com>
# - Michael Bienia <geser@ubuntu.com>
# - Steve Langasek <steve.langasek@canonical.com>
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
# the Free Software Foundation, version 3 of the License.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
@@ -22,106 +22,181 @@
# along with this program. If not, see <http://www.gnu.org/licenses/>.
#
# Our modules to import.
# pylint: disable=invalid-name
# pylint: enable=invalid-name
import argparse
import sys
from optparse import OptionGroup
from optparse import OptionParser
from ubuntutools.lp.udtexceptions import (SeriesNotFoundException,
PackageNotFoundException,
PocketDoesNotExistError,)
from ubuntutools.lp.lpapicache import Distribution, PersonTeam
from ubuntutools.misc import split_release_pocket
import lazr.restfulclient.errors
from launchpadlib.launchpad import Launchpad
from ubuntutools import getLogger
from ubuntutools.lp.udtexceptions import PocketDoesNotExistError
from ubuntutools.misc import split_release_pocket
Logger = getLogger()
def get_build_states(pkg, archs):
res = []
for build in pkg.getBuilds():
if build.arch_tag in archs:
res.append(f" {build.arch_tag}: {build.buildstate}")
msg = "\n".join(res)
return f"Build state(s) for '{pkg.source_package_name}':\n{msg}"
def rescore_builds(pkg, archs, score):
res = []
for build in pkg.getBuilds():
arch = build.arch_tag
if arch in archs:
if not build.can_be_rescored:
continue
try:
build.rescore(score=score)
res.append(f" {arch}: done")
except lazr.restfulclient.errors.Unauthorized:
Logger.error(
"You don't have the permissions to rescore builds."
" Ignoring your rescore request."
)
return None
except lazr.restfulclient.errors.BadRequest:
Logger.info("Cannot rescore build of %s on %s.", build.source_package_name, arch)
res.append(f" {arch}: failed")
msg = "\n".join(res)
return f"Rescoring builds of '{pkg.source_package_name}' to {score}:\n{msg}"
def retry_builds(pkg, archs):
res = []
for build in pkg.getBuilds():
arch = build.arch_tag
if arch in archs:
try:
build.retry()
res.append(f" {arch}: done")
except lazr.restfulclient.errors.BadRequest:
res.append(f" {arch}: failed")
msg = "\n".join(res)
return f"Retrying builds of '{pkg.source_package_name}':\n{msg}"
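The new rescore_builds/retry_builds helpers above collect a per-architecture result line and join them into a single report. A stripped-down sketch of that pattern with a hypothetical build object standing in for a Launchpad build record (the real code catches lazr.restfulclient BadRequest; here a plain RuntimeError plays that role):

```python
class FakeBuild:
    # Hypothetical stand-in for a Launchpad build record.
    def __init__(self, arch_tag, can_be_retried=True):
        self.arch_tag = arch_tag
        self.can_be_retried = can_be_retried

    def retry(self):
        if not self.can_be_retried:
            raise RuntimeError("cannot retry")  # lazr raises BadRequest here

def retry_builds(builds, archs):
    res = []
    for build in builds:
        if build.arch_tag not in archs:
            continue
        try:
            build.retry()
            res.append(f"  {build.arch_tag}: done")
        except RuntimeError:
            res.append(f"  {build.arch_tag}: failed")
    return "\n".join(res)

report = retry_builds(
    [FakeBuild("amd64"), FakeBuild("s390x", can_be_retried=False)],
    {"amd64", "s390x"},
)
print(report)
```

Collecting lines and joining once keeps each helper's output a single log message, which is why the script can hand the whole report to Logger.info in one call.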
def main():
# Usage.
usage = "%prog <srcpackage> <release> <operation>\n\n"
usage = "%(prog)s <srcpackage> <release> <operation>\n\n"
usage += "Where operation may be one of: rescore, retry, or status.\n"
usage += "Only Launchpad Buildd Admins may rescore package builds."
# Valid architectures.
valid_archs = set([
"armel", "armhf", "arm64", "amd64", "hppa", "i386", "ia64",
"lpia", "powerpc", "ppc64el", "riscv64", "s390x", "sparc",
])
valid_archs = set(
["armhf", "arm64", "amd64", "i386", "powerpc", "ppc64el", "riscv64", "s390x"]
)
# Prepare our option parser.
opt_parser = OptionParser(usage)
parser = argparse.ArgumentParser(usage=usage)
# Retry options
retry_rescore_options = OptionGroup(opt_parser, "Retry and rescore options",
"These options may only be used with "
"the 'retry' and 'rescore' operations.")
retry_rescore_options.add_option("-a", "--arch", type="string",
action="append", dest="architecture",
help="Rebuild or rescore a specific "
"architecture. Valid architectures "
"include: %s." %
", ".join(valid_archs))
parser.add_argument(
"-a",
"--arch",
action="append",
dest="architecture",
help=f"Rebuild or rescore a specific architecture. Valid architectures "
f"include: {', '.join(valid_archs)}.",
)
parser.add_argument("-A", "--archive", help="operate on ARCHIVE", default="ubuntu")
# Batch processing options
batch_options = OptionGroup(opt_parser, "Batch processing",
batch_options = parser.add_argument_group(
"Batch processing",
"These options and parameter ordering is only "
"available in --batch mode.\nUsage: "
"ubuntu-build --batch [options] <package>...")
batch_options.add_option('--batch',
action='store_true', dest='batch', default=False,
help='Enable batch mode')
batch_options.add_option('--series',
action='store', dest='series', type='string',
help='Selects the Ubuntu series to operate on '
'(default: current development series)')
batch_options.add_option('--retry',
action='store_true', dest='retry', default=False,
help='Retry builds (give-back).')
batch_options.add_option('--rescore',
action='store', dest='priority', type='int',
help='Rescore builds to <priority>.')
batch_options.add_option('--arch2', action='append', dest='architecture',
type='string',
help="Affect only 'architecture' (can be used "
"several times). Valid architectures are: %s."
% ', '.join(valid_archs))
"ubuntu-build --batch [options] <package>...",
)
batch_options.add_argument(
"--batch", action="store_true", dest="batch", help="Enable batch mode"
)
batch_options.add_argument(
"--series",
action="store",
dest="series",
help="Selects the Ubuntu series to operate on (default: current development series)",
)
batch_options.add_argument(
"--retry", action="store_true", dest="retry", help="Retry builds (give-back)."
)
batch_options.add_argument(
"--rescore",
action="store",
dest="priority",
type=int,
help="Rescore builds to <priority>.",
)
batch_options.add_argument(
"--state",
action="store",
dest="state",
help="Act on builds that are in the specified state",
)
# Add the retry options to the main group.
opt_parser.add_option_group(retry_rescore_options)
# Add the batch mode to the main group.
opt_parser.add_option_group(batch_options)
parser.add_argument("packages", metavar="package", nargs="*", help=argparse.SUPPRESS)
# Parse our options.
(options, args) = opt_parser.parse_args()
args = parser.parse_args()
if not len(args):
opt_parser.print_help()
launchpad = Launchpad.login_with("ubuntu-dev-tools", "production", version="devel")
ubuntu = launchpad.distributions["ubuntu"]
if args.batch:
release = args.series
if not release:
# ppas don't have a proposed pocket so just use the release pocket;
# but for the main archive we default to -proposed
release = ubuntu.getDevelopmentSeries()[0].name
if args.archive == "ubuntu":
release = f"{release}-proposed"
try:
(release, pocket) = split_release_pocket(release)
except PocketDoesNotExistError as error:
Logger.error(error)
sys.exit(1)
if not options.batch:
else:
# Check we have the correct number of arguments.
if len(args) < 3:
opt_parser.error("Incorrect number of arguments.")
if len(args.packages) < 3:
parser.error("Incorrect number of arguments.")
try:
package = str(args[0]).lower()
release = str(args[1]).lower()
op = str(args[2]).lower()
package = str(args.packages[0]).lower()
release = str(args.packages[1]).lower()
operation = str(args.packages[2]).lower()
except IndexError:
opt_parser.print_help()
parser.print_help()
sys.exit(1)
archive = launchpad.archives.getByReference(reference=args.archive)
try:
distroseries = ubuntu.getSeries(name_or_version=release)
except lazr.restfulclient.errors.NotFound as error:
Logger.error(error)
sys.exit(1)
if not args.batch:
# Check our operation.
if op not in ("rescore", "retry", "status"):
Logger.error("Invalid operation: %s." % op)
if operation not in ("rescore", "retry", "status"):
Logger.error("Invalid operation: %s.", operation)
sys.exit(1)
# If the user has specified an architecture to build, we only wish to
# rebuild it and nothing else.
if options.architecture:
if options.architecture[0] not in valid_archs:
Logger.error("Invalid architecture specified: %s."
% options.architecture[0])
if args.architecture:
if args.architecture[0] not in valid_archs:
Logger.error("Invalid architecture specified: %s.", args.architecture[0])
sys.exit(1)
else:
one_arch = True
@@ -135,148 +210,239 @@ def main():
Logger.error(error)
sys.exit(1)
# Get the ubuntu archive
try:
ubuntu_archive = Distribution('ubuntu').getArchive()
# Will fail here if we have no credentials, bail out
except IOError:
sys.exit(1)
# Get list of published sources for package in question.
try:
sources = ubuntu_archive.getSourcePackage(package, release, pocket)
distroseries = Distribution('ubuntu').getSeries(release)
except (SeriesNotFoundException, PackageNotFoundException) as error:
Logger.error(error)
sources = archive.getPublishedSources(
distro_series=distroseries,
exact_match=True,
pocket=pocket,
source_name=package,
status="Published",
)[0]
except IndexError:
Logger.error("No publication found for package %s", package)
sys.exit(1)
# Get list of builds for that package.
builds = sources.getBuilds()
# Find out the version and component in given release.
version = sources.getVersion()
component = sources.getComponent()
version = sources.source_package_version
component = sources.component_name
# Operations that are remaining may only be done by Ubuntu developers
# (retry) or buildd admins (rescore). Check if the proper permissions
# are in place.
me = PersonTeam.me
if op == "rescore":
necessary_privs = me.isLpTeamMember('launchpad-buildd-admins')
if op == "retry":
necessary_privs = me.canUploadPackage(ubuntu_archive, distroseries,
sources.getPackageName(),
sources.getComponent(),
pocket=pocket)
if op in ('rescore', 'retry') and not necessary_privs:
Logger.error("You cannot perform the %s operation on a %s "
"package as you do not have the permissions "
"to do this action." % (op, component))
if operation == "retry":
necessary_privs = archive.checkUpload(
component=sources.getComponent(),
distroseries=distroseries,
person=launchpad.me,
pocket=pocket,
sourcepackagename=sources.getPackageName(),
)
if not necessary_privs:
Logger.error(
"You cannot perform the %s operation on a %s package as you"
" do not have the permissions to do this action.",
operation,
component,
)
sys.exit(1)
# Output details.
Logger.info("The source version for '%s' in %s (%s) is at %s." %
(package, release.capitalize(), component, version))
Logger.info(
"The source version for '%s' in %s (%s) is at %s.",
package,
release.capitalize(),
component,
version,
)
Logger.info("Current build status for this package:")
# Output list of arches for package and their status.
done = False
for build in builds:
if one_arch and build.arch_tag != options.architecture[0]:
if one_arch and build.arch_tag != args.architecture[0]:
# Skip this architecture.
continue
done = True
Logger.info("%s: %s." % (build.arch_tag, build.buildstate))
if op == 'rescore':
Logger.info("%s: %s.", build.arch_tag, build.buildstate)
if operation == "rescore":
if build.can_be_rescored:
# FIXME: make priority an option
priority = 5000
Logger.info('Rescoring build %s to %d...' % (build.arch_tag, priority))
Logger.info("Rescoring build %s to %d...", build.arch_tag, priority)
try:
build.rescore(score=priority)
except lazr.restfulclient.errors.Unauthorized:
Logger.error(
"You don't have the permissions to rescore builds."
" Ignoring your rescore request."
)
break
else:
Logger.info('Cannot rescore build on %s.' % build.arch_tag)
if op == 'retry':
Logger.info("Cannot rescore build on %s.", build.arch_tag)
if operation == "retry":
if build.can_be_retried:
Logger.info('Retrying build on %s...' % build.arch_tag)
Logger.info("Retrying build on %s...", build.arch_tag)
build.retry()
else:
Logger.info('Cannot retry build on %s.' % build.arch_tag)
Logger.info("Cannot retry build on %s.", build.arch_tag)
# We are done
if done:
sys.exit(0)
Logger.info("No builds for '%s' found in the %s release" % (package, release.capitalize()))
Logger.info("No builds for '%s' found in the %s release", package, release.capitalize())
Logger.info("It may have been built in a former release.")
sys.exit(0)
# Batch mode
if not options.architecture:
if not args.architecture:
# no specific architectures specified, assume all valid ones
archs = valid_archs
else:
archs = set(options.architecture)
archs = set(args.architecture)
# filter out duplicate and invalid architectures
archs.intersection_update(valid_archs)
release = options.series
if not release:
release = (Distribution('ubuntu').getDevelopmentSeries().name
+ '-proposed')
try:
(release, pocket) = split_release_pocket(release)
except PocketDoesNotExistError as error:
Logger.error(error)
sys.exit(1)
if not args.packages:
retry_count = 0
can_rescore = True
ubuntu_archive = Distribution('ubuntu').getArchive()
try:
distroseries = Distribution('ubuntu').getSeries(release)
except SeriesNotFoundException as error:
Logger.error(error)
sys.exit(1)
me = PersonTeam.me
if not args.state:
if args.retry:
args.state = "Failed to build"
elif args.priority:
args.state = "Needs building"
# there is no equivalent to series.getBuildRecords() for a ppa.
# however, we don't want to have to traverse all build records for
# all series when working on the main archive, so we use
# series.getBuildRecords() for ubuntu and handle ppas separately
series = ubuntu.getSeries(name_or_version=release)
if args.archive == "ubuntu":
builds = series.getBuildRecords(build_state=args.state, pocket=pocket)
else:
builds = []
for build in archive.getBuildRecords(build_state=args.state, pocket=pocket):
if not build.current_source_publication:
continue
if build.current_source_publication.distro_series == series:
builds.append(build)
for build in builds:
if build.arch_tag not in archs:
continue
if not build.current_source_publication:
continue
# fixme: refactor
# Check permissions (part 2): check upload permissions for the
# source package
can_retry = args.retry and archive.checkUpload(
component=build.current_source_publication.component_name,
distroseries=series,
person=launchpad.me,
pocket=pocket,
sourcepackagename=build.source_package_name,
)
if args.retry and not can_retry:
Logger.error(
"You don't have the permissions to retry the build of '%s', skipping.",
build.source_package_name,
)
continue
Logger.info(
"The source version for '%s' in '%s' (%s) is: %s",
build.source_package_name,
release,
pocket,
build.source_package_version,
)
# Check permissions (part 1): Rescoring can only be done by buildd admins
can_rescore = ((options.priority
and me.isLpTeamMember('launchpad-buildd-admins'))
or False)
if options.priority and not can_rescore:
Logger.error("You don't have the permissions to rescore "
"builds. Ignoring your rescore request.")
for pkg in args:
if args.retry and build.can_be_retried:
Logger.info(
"Retrying build of %s on %s...", build.source_package_name, build.arch_tag
)
try:
pkg = ubuntu_archive.getSourcePackage(pkg, release, pocket)
except PackageNotFoundException as error:
Logger.error(error)
build.retry()
retry_count += 1
except lazr.restfulclient.errors.BadRequest:
Logger.info(
"Failed to retry build of %s on %s",
build.source_package_name,
build.arch_tag,
)
if args.priority and can_rescore:
if build.can_be_rescored:
try:
build.rescore(score=args.priority)
except lazr.restfulclient.errors.Unauthorized:
Logger.error(
"You don't have the permissions to rescore builds."
" Ignoring your rescore request."
)
can_rescore = False
except lazr.restfulclient.errors.BadRequest:
Logger.info(
"Cannot rescore build of %s on %s.",
build.source_package_name,
build.arch_tag,
)
Logger.info("")
if args.retry:
Logger.info("%d package builds retried", retry_count)
sys.exit(0)
for pkg in args.packages:
try:
pkg = archive.getPublishedSources(
distro_series=distroseries,
exact_match=True,
pocket=pocket,
source_name=pkg,
status="Published",
)[0]
except IndexError:
Logger.error("No publication found for package %s", pkg)
continue
# Check permissions (part 2): check upload permissions for the source
# package
can_retry = options.retry and me.canUploadPackage(ubuntu_archive,
distroseries,
pkg.getPackageName(),
pkg.getComponent())
if options.retry and not can_retry:
Logger.error("You don't have the permissions to retry the "
"build of '%s'. Ignoring your request."
% pkg.getPackageName())
can_retry = args.retry and archive.checkUpload(
component=pkg.component_name,
distroseries=distroseries,
person=launchpad.me,
pocket=pocket,
sourcepackagename=pkg.source_package_name,
)
if args.retry and not can_retry:
Logger.error(
"You don't have the permissions to retry the "
"build of '%s'. Ignoring your request.",
pkg.source_package_name,
)
Logger.info("The source version for '%s' in '%s' (%s) is: %s" %
(pkg.getPackageName(), release, pocket, pkg.getVersion()))
Logger.info(
"The source version for '%s' in '%s' (%s) is: %s",
pkg.source_package_name,
release,
pocket,
pkg.source_package_version,
)
Logger.info(pkg.getBuildStates(archs))
Logger.info(get_build_states(pkg, archs))
if can_retry:
Logger.info(pkg.retryBuilds(archs))
if options.priority and can_rescore:
Logger.info(pkg.rescoreBuilds(archs, options.priority))
Logger.info(retry_builds(pkg, archs))
if args.priority:
Logger.info(rescore_builds(pkg, archs, args.priority))
Logger.info('')
Logger.info("")
if __name__ == '__main__':
if __name__ == "__main__":
main()
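Throughout the ubuntu-build rewrite, eager %-formatting such as `Logger.info("%s: %s." % (...))` becomes lazy `Logger.info("%s: %s.", ...)`. With lazy arguments the string is only built if a handler actually emits the record. A small self-contained sketch of the difference, with a hypothetical logger name:

```python
import io
import logging

stream = io.StringIO()
log = logging.getLogger("ubuntu-build-demo")  # hypothetical logger name
log.setLevel(logging.INFO)
log.propagate = False
log.addHandler(logging.StreamHandler(stream))

build_arch, build_state = "amd64", "Successfully built"

# Eager: "%s" % args builds the string even though DEBUG is discarded.
log.debug("%s: %s." % (build_arch, build_state))

# Lazy: arguments ride on the log record; formatting happens only on emit.
log.info("%s: %s.", build_arch, build_state)

print(stream.getvalue().strip())
```

Besides skipping wasted formatting at suppressed levels, the lazy form is what pylint's logging-format checks expect, which matches the `# pylint` pragmas added elsewhere in this diff.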


@@ -20,19 +20,23 @@
#
# ##################################################################
import optparse
# pylint: disable=invalid-name
# pylint: enable=invalid-name
import argparse
import subprocess
import sys
from ubuntutools import getLogger
Logger = getLogger()
def extract(iso, path):
command = ['isoinfo', '-R', '-i', iso, '-x', path]
pipe = subprocess.run(command, encoding='utf-8',
stdout=subprocess.PIPE,
stderr=subprocess.PIPE)
command = ["isoinfo", "-R", "-i", iso, "-x", path]
pipe = subprocess.run(
command, check=False, encoding="utf-8", stdout=subprocess.PIPE, stderr=subprocess.PIPE
)
if pipe.returncode != 0:
sys.stderr.write(pipe.stderr)
@@ -42,22 +46,22 @@ def extract(iso, path):
def main():
desc = 'Given an ISO, %prog will display the Ubuntu version information'
parser = optparse.OptionParser(usage='%prog [options] iso...',
description=desc)
isos = parser.parse_args()[1]
desc = "Given an ISO, %(prog)s will display the Ubuntu version information"
parser = argparse.ArgumentParser(usage="%(prog)s [options] iso...", description=desc)
parser.add_argument("isos", nargs="*", help=argparse.SUPPRESS)
args = parser.parse_args()
err = False
for iso in isos:
if len(isos) > 1:
prefix = '%s:' % iso
for iso in args.isos:
if len(args.isos) > 1:
prefix = f"{iso}:"
else:
prefix = ''
prefix = ""
version = extract(iso, '/.disk/info')
version = extract(iso, "/.disk/info")
if len(version) == 0:
Logger.error('%s does not appear to be an Ubuntu ISO' % iso)
Logger.error("%s does not appear to be an Ubuntu ISO", iso)
err = True
continue
@@ -67,6 +71,6 @@ def main():
sys.exit(1)
if __name__ == '__main__':
if __name__ == "__main__":
main()
sys.exit(0)
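The reworked extract() above calls subprocess.run with an explicit check=False, so a failing isoinfo invocation is reported by the caller instead of raising CalledProcessError. A hedged sketch of the same capture pattern, with `echo` standing in for the real `isoinfo -R -i <iso> -x /.disk/info` command so it is safe to run anywhere:

```python
import subprocess
import sys

def run_and_capture(command):
    # check=False: a non-zero exit is surfaced via returncode/stderr
    # rather than an exception, matching the diff's extract() helper.
    pipe = subprocess.run(
        command,
        check=False,
        encoding="utf-8",
        stdout=subprocess.PIPE,
        stderr=subprocess.PIPE,
    )
    if pipe.returncode != 0:
        sys.stderr.write(pipe.stderr)
    return pipe.stdout

# echo stands in for isoinfo here; output would normally be /.disk/info.
print(run_and_capture(["echo", "Ubuntu 24.04 LTS"]).strip())
```

Passing encoding="utf-8" makes stdout/stderr str rather than bytes, so the caller can write stderr straight through and strip() the result.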


@@ -14,131 +14,159 @@
# ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF
# OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE.
import optparse
# pylint: disable=invalid-name
# pylint: enable=invalid-name
import argparse
import sys
from ubuntutools.lp.lpapicache import (Launchpad, Distribution, PersonTeam,
Packageset, PackageNotFoundException,
SeriesNotFoundException)
from ubuntutools import getLogger
from ubuntutools.lp.lpapicache import (
Distribution,
Launchpad,
PackageNotFoundException,
Packageset,
PersonTeam,
SeriesNotFoundException,
)
from ubuntutools.misc import split_release_pocket
from ubuntutools import getLogger
Logger = getLogger()
def parse_arguments():
'''Parse arguments and return (options, package)'''
parser = optparse.OptionParser('%prog [options] package')
parser.add_option('-r', '--release', default=None, metavar='RELEASE',
help='Use RELEASE, rather than the current development '
'release')
parser.add_option('-a', '--list-uploaders',
default=False, action='store_true',
help='List all the people/teams with upload rights')
parser.add_option('-t', '--list-team-members',
default=False, action='store_true',
help='List all team members of teams with upload rights '
'(implies --list-uploaders)')
options, args = parser.parse_args()
"""Parse arguments and return (options, package)"""
parser = argparse.ArgumentParser(usage="%(prog)s [options] package")
parser.add_argument(
"-r",
"--release",
metavar="RELEASE",
help="Use RELEASE, rather than the current development release",
)
parser.add_argument(
"-a",
"--list-uploaders",
action="store_true",
help="List all the people/teams with upload rights",
)
parser.add_argument(
"-t",
"--list-team-members",
action="store_true",
help="List all team members of teams with upload rights (implies --list-uploaders)",
)
parser.add_argument("package", help=argparse.SUPPRESS)
args = parser.parse_args()
if len(args) != 1:
parser.error("One (and only one) package must be specified")
package = args[0]
if args.list_team_members:
args.list_uploaders = True
if options.list_team_members:
options.list_uploaders = True
return (options, package)
return args
def main():
'''Query upload permissions'''
options, package = parse_arguments()
"""Query upload permissions"""
args = parse_arguments()
# Need to be logged in to see uploaders:
Launchpad.login()
ubuntu = Distribution('ubuntu')
ubuntu = Distribution("ubuntu")
archive = ubuntu.getArchive()
if options.release is None:
options.release = ubuntu.getDevelopmentSeries().name
if args.release is None:
args.release = ubuntu.getDevelopmentSeries().name
try:
release, pocket = split_release_pocket(options.release)
release, pocket = split_release_pocket(args.release)
series = ubuntu.getSeries(release)
except SeriesNotFoundException as e:
Logger.error(str(e))
sys.exit(2)
try:
spph = archive.getSourcePackage(package)
spph = archive.getSourcePackage(args.package)
except PackageNotFoundException as e:
Logger.error(str(e))
sys.exit(2)
component = spph.getComponent()
if (options.list_uploaders and (pocket != 'Release' or series.status in
('Experimental', 'Active Development', 'Pre-release Freeze'))):
component_uploader = archive.getUploadersForComponent(
component_name=component)[0]
Logger.info("All upload permissions for %s:" % package)
if args.list_uploaders and (
pocket != "Release"
or series.status in ("Experimental", "Active Development", "Pre-release Freeze")
):
component_uploader = archive.getUploadersForComponent(component_name=component)[0]
Logger.info("All upload permissions for %s:", args.package)
Logger.info("")
Logger.info("Component (%s)" % component)
Logger.info("============" + ("=" * len(component)))
print_uploaders([component_uploader], options.list_team_members)
Logger.info("Component (%s)", component)
Logger.info("============%s", "=" * len(component))
print_uploaders([component_uploader], args.list_team_members)
packagesets = sorted(Packageset.setsIncludingSource(
distroseries=series,
sourcepackagename=package), key=lambda p: p.name)
packagesets = sorted(
Packageset.setsIncludingSource(distroseries=series, sourcepackagename=args.package),
key=lambda p: p.name,
)
if packagesets:
Logger.info("")
Logger.info("Packagesets")
Logger.info("===========")
for packageset in packagesets:
Logger.info("")
Logger.info("%s:" % packageset.name)
print_uploaders(archive.getUploadersForPackageset(
packageset=packageset), options.list_team_members)
Logger.info("%s:", packageset.name)
print_uploaders(
archive.getUploadersForPackageset(packageset=packageset),
args.list_team_members,
)
ppu_uploaders = archive.getUploadersForPackage(
source_package_name=package)
ppu_uploaders = archive.getUploadersForPackage(source_package_name=args.package)
if ppu_uploaders:
Logger.info("")
Logger.info("Per-Package-Uploaders")
Logger.info("=====================")
Logger.info("")
print_uploaders(ppu_uploaders, options.list_team_members)
print_uploaders(ppu_uploaders, args.list_team_members)
Logger.info("")
if PersonTeam.me.canUploadPackage(archive, series, package, component,
pocket):
Logger.info("You can upload %s to %s." % (package, options.release))
if PersonTeam.me.canUploadPackage(archive, series, args.package, component, pocket):
Logger.info("You can upload %s to %s.", args.package, args.release)
else:
Logger.info("You can not upload %s to %s, yourself." % (package, options.release))
if (series.status in ('Current Stable Release', 'Supported', 'Obsolete')
and pocket == 'Release'):
Logger.info("%s is in the '%s' state. You may want to query the %s-proposed pocket." %
(release, series.status, release))
Logger.info("You can not upload %s to %s, yourself.", args.package, args.release)
if (
series.status in ("Current Stable Release", "Supported", "Obsolete")
and pocket == "Release"
):
Logger.info(
"%s is in the '%s' state. You may want to query the %s-proposed pocket.",
release,
series.status,
release,
)
else:
Logger.info("But you can still contribute to it via the sponsorship "
"process: https://wiki.ubuntu.com/SponsorshipProcess")
if not options.list_uploaders:
Logger.info("To see who has the necessary upload rights, "
"use the --list-uploaders option.")
Logger.info(
"But you can still contribute to it via the sponsorship "
"process: https://wiki.ubuntu.com/SponsorshipProcess"
)
if not args.list_uploaders:
Logger.info(
"To see who has the necessary upload rights, "
"use the --list-uploaders option."
)
sys.exit(1)
def print_uploaders(uploaders, expand_teams=False, prefix=''):
def print_uploaders(uploaders, expand_teams=False, prefix=""):
"""Given a list of uploaders, pretty-print them all
Each line is prefixed with prefix.
If expand_teams is set, recurse, adding more spaces to prefix on each
recursion.
"""
for uploader in sorted(uploaders, key=lambda p: p.display_name):
Logger.info("%s* %s (%s)%s" %
(prefix, uploader.display_name, uploader.name,
' [team]' if uploader.is_team else ''))
Logger.info(
"%s* %s (%s)%s",
prefix,
uploader.display_name,
uploader.name,
" [team]" if uploader.is_team else "",
)
if expand_teams and uploader.is_team:
print_uploaders(uploader.participants, True, prefix=prefix + ' ')
print_uploaders(uploader.participants, True, prefix=prefix + " ")
if __name__ == '__main__':
if __name__ == "__main__":
main()
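print_uploaders above recurses with a widened prefix whenever expand_teams is set and the uploader is a team. A standalone sketch of that recursion with hypothetical dataclass uploaders (the real objects are Launchpad person/team entries), returning lines instead of logging so the output is easy to inspect:

```python
from dataclasses import dataclass, field

@dataclass
class Uploader:
    # Minimal stand-in for a Launchpad person/team object.
    display_name: str
    name: str
    is_team: bool = False
    participants: list = field(default_factory=list)

def format_uploaders(uploaders, expand_teams=False, prefix=""):
    lines = []
    for uploader in sorted(uploaders, key=lambda p: p.display_name):
        lines.append(
            f"{prefix}* {uploader.display_name} ({uploader.name})"
            f"{' [team]' if uploader.is_team else ''}"
        )
        if expand_teams and uploader.is_team:
            # Recurse with a wider prefix, as print_uploaders does above.
            lines.extend(format_uploaders(uploader.participants, True, prefix + "  "))
    return lines

team = Uploader("Ubuntu Developers", "ubuntu-dev", True,
                [Uploader("Alice Example", "alice")])
for line in format_uploaders([team], expand_teams=True):
    print(line)
```

Sorting by display_name at every level keeps the listing stable, and the growing prefix turns team membership into visible indentation.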


@@ -1,16 +0,0 @@
#!/usr/bin/python3
# -*- coding: utf-8 -*-
# Copyright (C) 2018 Canonical Ltd.
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; version 3 of the License.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.


@ -1,114 +0,0 @@
#!/usr/bin/python3
# -*- coding: utf-8 -*-
# Copyright (C) 2018 Canonical Ltd.
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; version 3 of the License.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
import sys
import os
import argparse
import subprocess
import logging
from ubuntu_archive_assistant.logging import AssistantLogger, AssistantTaskLogger
class AssistantCommand(argparse.Namespace):
def __init__(self, command_id, description, logger=None, leaf=True, testing=False):
self.command_id = command_id
self.description = description
self.leaf_command = leaf
self.testing = testing
self._args = None
self.debug = False
self.cache_path = None
self.commandclass = None
self.subcommands = {}
self.subcommand = None
self.func = None
self.logger = AssistantLogger(module=command_id)
self.log = self.logger.log
self.task_logger = self.logger
self.review = self.task_logger.review
self.parser = argparse.ArgumentParser(prog="%s %s" % (sys.argv[0], command_id),
description=description,
add_help=True)
self.parser.add_argument('--debug', action='store_true',
help='Enable debug messages')
self.parser.add_argument('--verbose', action='store_true',
help='Enable verbose messages')
if not leaf:
self.subparsers = self.parser.add_subparsers(title='Available commands',
metavar='', dest='subcommand')
p_help = self.subparsers.add_parser('help',
description='Show this help message',
help='Show this help message')
p_help.set_defaults(func=self.print_usage)
def update(self, args):
self._args = args
def parse_args(self):
ns, self._args = self.parser.parse_known_args(args=self._args, namespace=self)
if self.debug:
self.logger.setLevel(logging.DEBUG)
self.logger.setReviewLevel(logging.DEBUG)
if self.verbose:
self.logger.setReviewLevel(logging.INFO)
if not self.subcommand and not self.leaf_command:
print('You need to specify a command', file=sys.stderr)
self.print_usage()
def run_command(self):
if self.commandclass:
self.commandclass.update(self._args)
if self.leaf_command and 'help' in self._args:
self.print_usage()
self.func()
def print_usage(self):
self.parser.print_help(file=sys.stderr)
sys.exit(os.EX_USAGE)
def _add_subparser_from_class(self, name, commandclass):
instance = commandclass(self.logger)
self.subcommands[name] = {}
self.subcommands[name]['class'] = name
self.subcommands[name]['instance'] = instance
if instance.testing:
if not os.environ.get('ENABLE_TEST_COMMANDS', None):
return
p = self.subparsers.add_parser(instance.command_id,
description=instance.description,
help=instance.description,
add_help=False)
p.set_defaults(func=instance.run, commandclass=instance)
self.subcommands[name]['parser'] = p
def _import_subcommands(self, submodules):
import inspect
for name, obj in inspect.getmembers(submodules):
if inspect.isclass(obj) and issubclass(obj, AssistantCommand):
self._add_subparser_from_class(name, obj)


@ -1,24 +0,0 @@
#!/usr/bin/python3
# -*- coding: utf-8 -*-
# Copyright (C) 2018 Canonical Ltd.
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; version 3 of the License.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
from ubuntu_archive_assistant.commands.proposed_migration import ProposedMigration
from ubuntu_archive_assistant.commands.mir import MIRReview
__all__ = [
'ProposedMigration',
'MIRReview',
]


@ -1,201 +0,0 @@
#!/usr/bin/python3
# -*- coding: utf-8 -*-
# Copyright (C) 2018 Canonical Ltd.
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; version 3 of the License.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
import os
import sys
import time
import subprocess
import tempfile
import argparse
import requests
import logging
from ubuntu_archive_assistant.command import AssistantCommand
from ubuntu_archive_assistant.utils import urlhandling, launchpad, bugtools
from ubuntu_archive_assistant.logging import ReviewResult, AssistantTaskLogger
class MIRReview(AssistantCommand):
def __init__(self, logger):
super().__init__(command_id='mir',
description='Review Main Inclusion Requests',
logger=logger,
leaf=True)
def run(self):
self.parser.add_argument('-b', '--bug', dest='bug',
help='the MIR bug to evaluate')
self.parser.add_argument('-s', '--source', dest='source',
help='the source package to evaluate')
self.parser.add_argument('--skip-review', action="store_true",
help='skip dropping to a subshell for code review')
self.parser.add_argument('--unprocessed', action="store_true",
default=False,
help='show MIRs accepted but not yet processed')
self.func = self.mir_review
self.parse_args()
self.run_command()
def mir_review(self):
lp = launchpad.LaunchpadInstance()
self.mir_team = lp.lp.people["ubuntu-mir"]
if not self.source and not self.bug:
self.log.debug("showing MIR report. show unprocessed=%s" % self.unprocessed)
bugs = self.get_mir_bugs(show_unprocessed=self.unprocessed)
sys.exit(0)
else:
completed_statuses = ("Won't Fix", "Invalid", "Fix Committed", "Fix Released")
if self.bug:
self.log.debug("show MIR by bug")
bug_no = int(self.bug)
bug = lp.lp.bugs[bug_no]
for bug_task in bug.bug_tasks:
if self.source:
if self.source != bug_task.target.name:
continue
if bug_task.status in completed_statuses:
print("MIR for %s is %s\n" % (bug_task.target.name,
bug_task.status))
continue
self.process(bug_task.target, bug_task)
else:
self.log.debug("show MIR by source")
source_pkg = self.get_source_package(self.source)
mir_bug = source_pkg.searchTasks(omit_duplicates=True,
bug_subscriber=self.mir_team,
order_by="id")[0]
self.process(source_pkg, mir_bug)
def get_source_package(self, binary):
lp = launchpad.LaunchpadInstance()
cache_name = None
name = None
source_pkg = lp.ubuntu.getSourcePackage(name=binary)
if source_pkg:
return source_pkg
try:
cache_name = subprocess.check_output(
"apt-cache show %s | grep Source:" % binary,
shell=True, universal_newlines=True)
except subprocess.CalledProcessError as e:
cache_name = subprocess.check_output(
"apt-cache show %s | grep Package:" % binary,
shell=True, universal_newlines=True)
if cache_name is not None:
if cache_name.startswith("Source:") or cache_name.startswith("Package:"):
name = cache_name.split()[1]
if name:
source_pkg = lp.ubuntu.getSourcePackage(name=name)
return source_pkg
def lp_build_logs(self, source):
lp = launchpad.LaunchpadInstance()
archive = lp.ubuntu_archive()
spph = archive.getPublishedSources(exact_match=True,
source_name=source,
distro_series=lp.current_series(),
pocket="Release",
order_by_date=True)
builds = spph[0].getBuilds()
for build in builds:
if "Successfully" not in build.buildstate:
print("%s has failed to build" % build.arch_tag)
print(build.build_log_url)
def process(self, source_pkg, task=None):
lp = launchpad.LaunchpadInstance()
source_name = source_pkg.name
print("== MIR report for source package '%s' ==" % source_name)
print("\n=== Details ===")
print("LP: %s" % source_pkg.web_link)
if task and task.bug:
print("MIR bug: %s\n" % task.bug.web_link)
print(task.bug.description)
print("\n\n=== MIR assessment ===")
latest = lp.ubuntu_archive().getPublishedSources(exact_match=True,
source_name=source_name,
distro_series=lp.current_series())[0]
if not source_pkg:
print("\n%s does not exist in Ubuntu" % source_name)
sys.exit(1)
if latest.pocket == "Proposed":
print("\nThere is a version of %s in -proposed: %s" % (source_name, latest.source_package_version))
if task:
if task.assignee:
print("MIR for %s is assigned to %s (%s)" % (task.target.display_name,
task.assignee.display_name,
task.status))
else:
print("MIR for %s is %s" % (task.target.display_name,
task.status))
print("\nPackage bug subscribers:")
for sub in source_pkg.getSubscriptions():
sub_text = " - %s" % sub.subscriber.display_name
if sub.subscribed_by:
sub_text += ", subscribed by %s" % sub.subscribed_by.display_name
print(sub_text)
print("\nBuild logs:")
self.lp_build_logs(source_name)
if not self.skip_review:
self.open_source_tmpdir(source_name)
def get_mir_bugs(self, show_unprocessed=False):
bug_statuses = ("New", "Incomplete", "Confirmed", "Triaged",
"In Progress")
def only_ubuntu(task):
if 'ubuntu/+source' not in task.target_link:
return True
return False
if show_unprocessed:
unprocessed = self.mir_team.searchTasks(omit_duplicates=True, bug_subscriber=self.mir_team, status="Fix Committed")
if any(unprocessed):
print("== Open MIRs reviewed but not processed ==")
bugtools.list_bugs(print, unprocessed, filter=only_ubuntu, file=sys.stderr)
tasks = self.mir_team.searchTasks(omit_duplicates=True, bug_subscriber=self.mir_team, status=bug_statuses)
bugtools.list_bugs(print, tasks, filter=only_ubuntu, file=sys.stderr)
result = None
return result
def open_source_tmpdir(self, source_name):
print("\nDropping to a shell for code review:\n")
with tempfile.TemporaryDirectory() as temp_dir:
os.system('cd %s; pull-lp-source %s; bash -l' % (temp_dir, source_name))


@ -1,835 +0,0 @@
#!/usr/bin/python3
# -*- coding: utf-8 -*-
# Copyright (C) 2018 Canonical Ltd.
# Author: Mathieu Trudel-Lapierre <mathieu.trudel-lapierre@canonical.com>
# Author: Łukasz 'sil2100' Zemczak <lukasz.zemczak@canonical.com>
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; version 3 of the License.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
"""Analyze britney's excuses output and suggest a course of action for
proposed migration.
"""
# FIXME: Various parts of slangasek's pseudocode (in comments where relevant)
# are not well implemented.
import yaml
import os
import re
import sys
import time
import math
import subprocess
import argparse
import tempfile
import logging
from contextlib import ExitStack
from enum import Enum
from collections import defaultdict
from ubuntu_archive_assistant.command import AssistantCommand
from ubuntu_archive_assistant.utils import urlhandling, launchpad
from ubuntu_archive_assistant.logging import ReviewResult, ReviewResultAdapter, AssistantTaskLogger
HINTS_BRANCH = 'lp:~ubuntu-release/britney/hints-ubuntu'
DEBIAN_CURRENT_SERIES = 'sid'
ARCHIVE_PAGES = 'https://people.canonical.com/~ubuntu-archive/'
LAUNCHPAD_URL = 'https://launchpad.net'
AUTOPKGTEST_URL = 'http://autopkgtest.ubuntu.com'
MAX_CACHE_AGE = 14400 # excuses cache should not be older than 4 hours
class ProposedMigration(AssistantCommand):
def __init__(self, logger):
super().__init__(command_id='proposed',
description='Assess next work required for a package\'s proposed migration',
logger=logger,
leaf=True)
self.excuses = {}
self.seen = []
def run(self):
self.parser.add_argument('-s', '--source', dest='source_name',
help='the package to evaluate')
self.parser.add_argument('--no-cache', dest='do_not_cache', action='store_const',
const=True, default=False,
help='Do not cache excuses')
self.parser.add_argument('--refresh', action='store_const',
const=True, default=False,
help='Force refresh of cached excuses')
self.func = self.proposed_migration
self.parse_args()
self.run_command()
def proposed_migration(self):
refresh_due = False
with ExitStack() as resources:
if self.do_not_cache:
fp = resources.enter_context(tempfile.NamedTemporaryFile())
self.cache_path = resources.enter_context(
tempfile.TemporaryDirectory())
refresh_due = True
else:
xdg_cache = os.getenv('XDG_CACHE_HOME', '~/.cache')
self.cache_path = os.path.expanduser(
os.path.join(xdg_cache, 'ubuntu-archive-assistant', 'proposed-migration'))
excuses_path = os.path.join(self.cache_path, 'excuses.yaml')
if os.path.exists(self.cache_path):
if not os.path.isdir(self.cache_path):
print("The {} cache directory is not a directory, please "
"resolve manually and re-run.".format(self.cache_path))
exit(1)
else:
os.makedirs(self.cache_path)
try:
fp = open(excuses_path, 'r')
except FileNotFoundError:
refresh_due = True
pass
finally:
fp = open(excuses_path, 'a+')
file_state = os.stat(excuses_path)
mtime = file_state.st_mtime
now = time.time()
if (now - mtime) > MAX_CACHE_AGE:
refresh_due = True
if self.refresh or refresh_due:
excuses_url = ARCHIVE_PAGES + 'proposed-migration/update_excuses.yaml'
urlhandling.get_with_progress(url=excuses_url, filename=fp.name)
fp.seek(0)
# Use the C implementation of the SafeLoader, it's noticeably faster, and
# here we're dealing with large input files.
self.excuses = yaml.load(fp, Loader=yaml.CSafeLoader)
if self.source_name is None:
print("No source package name was provided. The following packages are "
"blocked in proposed:\n")
self.source_name = self.choose_blocked_source(self.excuses)
self.find_excuses(self.source_name, 0)
def get_debian_ci_results(self, source_name, arch):
try:
url = "https://ci.debian.net/data/packages/unstable/{}/{}/latest.json"
results_url = url.format(arch, self.get_pkg_archive_path(source_name))
resp = urlhandling.get(url=results_url)
return resp.json()
except Exception:
return None
def find_excuses(self, source_name, level):
if source_name in self.seen:
return
for excuses_item in self.excuses['sources']:
item_name = excuses_item.get('item-name')
if item_name == source_name:
self.selected = excuses_item
self.process(level)
def get_pkg_archive_path(self, package):
try:
# TODO: refactor to avoid shell=True
path = subprocess.check_output(
"apt-cache show %s | grep Filename:" % package,
shell=True, universal_newlines=True)
path = path.split(' ')[1].split('/')
path = "/".join(path[2:4])
return path
except Exception:
return None
def get_source_package(self, binary_name):
cache_output = None
# TODO: refactor to avoid shell=True
try:
cache_output = subprocess.check_output(
"apt-cache show %s | grep Source:" % binary_name,
shell=True, universal_newlines=True)
except subprocess.CalledProcessError:
cache_output = subprocess.check_output(
"apt-cache show %s | grep Package:" % binary_name,
shell=True, universal_newlines=True)
if cache_output is not None:
if cache_output.startswith("Source:") or cache_output.startswith("Package:"):
source_name = cache_output.split()[1]
return source_name
return None
def package_in_distro(self, package, distro='ubuntu', distroseries='bionic',
proposed=False):
# TODO: This operation is pretty costly, do caching?
if distro == 'debian':
distroseries = DEBIAN_CURRENT_SERIES
if proposed:
distroseries += "-proposed"
madison_url = "https://qa.debian.org/cgi-bin/madison.cgi"
params = "?package={}&table={}&a=&c=&s={}".format(package,
distro,
distroseries)
url = madison_url + params
resp = urlhandling.get(url=url)
package_found = {}
for line in resp.text.split('\n'):
if " {} ".format(package) not in line:
continue
package_line = line.split(' | ')
series_component = package_line[2].split('/')
component = 'main'
if len(series_component) > 1:
component = series_component[1]
if '{}'.format(distroseries) in series_component[0]:
if distro == 'ubuntu':
package_found = {
'version': package_line[1],
'component': component,
}
else:
package_found = {
'version': package_line[1],
}
return package_found
return {}
def process_lp_build_results(self, level, uploads, failed):
logger = AssistantTaskLogger("lp_builds", self.task_logger)
assistant = logger.newTask("lp_builds", level + 1)
lp = launchpad.LaunchpadInstance()
archive = lp.ubuntu_archive()
series = lp.current_series()
source_name = self.selected.get('source')
spph = archive.getPublishedSources(exact_match=True,
source_name=source_name,
distro_series=series,
pocket="Proposed",
order_by_date=True)
new_version = series.getPackageUploads(archive=archive,
name=source_name,
version=self.selected.get('new-version'),
pocket="Proposed",
exact_match=True)
for item in new_version:
arch = item.display_arches.split(',')[0]
if item.package_version not in uploads:
uploads[item.package_version] = {}
if arch == 'source':
continue
uploads[item.package_version][arch] = item.getBinaryProperties()
# Only get the builds for the latest publication, this is more likely to
# be new source in -proposed, or the most recent upload.
builds = spph[0].getBuilds()
for build in builds:
missing_arches = set()
if "Successfully" not in build.buildstate:
failed[build.arch_tag] = {
'state': build.buildstate,
}
if self.logger.getReviewLevel() < logging.ERROR:
assistant.error("{} is missing a build on {}:".format(
source_name, build.arch_tag),
status=ReviewResult.FAIL)
log_url = build.build_log_url
if not log_url:
log_url = "<No build log available>"
assistant.warning("[%s] %s" % (build.buildstate,
log_url),
status=ReviewResult.NONE, depth=1)
if any(failed) and self.logger.getReviewLevel() >= logging.ERROR:
assistant.critical("Fix missing builds: {}".format(
", ".join(failed.keys())),
status=ReviewResult.NONE)
assistant.error("{}/ubuntu/+source/{}/{}".format(
LAUNCHPAD_URL,
spph[0].source_package_name,
spph[0].source_package_version),
status=ReviewResult.INFO, depth=1)
def check_mir_status(self, logger, target_package, level):
logger = AssistantTaskLogger("mir", logger)
assistant = logger.newTask("mir", level + 2)
# TODO: Check for MIR bug state
# - has the MIR been rejected?
# - upload or submit to sponsorship queue to drop the dependency
lp = launchpad.LaunchpadInstance()
source_name = self.get_source_package(target_package)
source_pkg = lp.ubuntu.getSourcePackage(name=source_name)
mir_tasks = source_pkg.searchTasks(bug_subscriber=lp.lp.people['ubuntu-mir'],
omit_duplicates=True)
if not mir_tasks:
assistant.error("Please open a MIR bug:",
status=ReviewResult.INFO)
assistant.error("{}/ubuntu/+source/{}/+filebug?field.title=%5bMIR%5d%20{}".format(
LAUNCHPAD_URL, source_name, source_name),
status=ReviewResult.NONE, depth=1)
last_bug_id = 0
for task in mir_tasks:
assigned_to = "unassigned"
if task.assignee:
assigned_to = "assigned to %s" % task.assignee.display_name
if task.bug.id != last_bug_id:
assistant.error("(LP: #%s) %s" % (task.bug.id, task.bug.title),
status=ReviewResult.INFO)
last_bug_id = task.bug.id
assistant.warning("%s (%s) in %s (%s)" % (task.status,
task.importance,
task.target.name,
assigned_to),
status=ReviewResult.NONE, depth=1)
if task.status in ("Won't Fix", "Invalid"):
assistant.error("This MIR has been rejected; please look into "
"dropping the dependency on {} from {}".format(
target_package, source_name),
status=ReviewResult.INFO, depth=1)
def process_unsatisfiable_depends(self, level):
logger = AssistantTaskLogger("unsatisfiable", self.task_logger)
assistant = logger.newTask("unsatisfiable", level + 1)
distroseries = launchpad.LaunchpadInstance().current_series().name
affected_sources = set()
unsatisfiable = defaultdict(set)
depends = self.selected.get('dependencies').get('unsatisfiable-dependencies', {})
for arch, signatures in depends.items():
for signature in signatures:
binary_name = signature.split(' ')[0]
unsatisfiable[signature].add(arch)
try:
pkg = self.get_source_package(binary_name)
affected_sources.add(pkg)
except Exception:
# FIXME: we might be dealing with a new package in proposed
# here, but using the binary name instead of the source
# name.
if any(self.package_in_distro(binary_name, distro='ubuntu',
distroseries=distroseries)):
affected_sources.add(binary_name)
elif any(self.package_in_distro(binary_name,
distro='ubuntu',
distroseries=distroseries,
proposed=True)):
affected_sources.add(binary_name)
if not affected_sources and not unsatisfiable:
return
logger.critical("Fix unsatisfiable dependencies in {}:".format(
self.selected.get('source')),
status=ReviewResult.NONE)
# TODO: Check version comparisons for removal requests/fixes
# - is the unsatisfied dependency due to a package dropped in Ubuntu,
# but not in Debian, which may come back as a sync later
# (i.e. not blacklisted)?
# - leave in -proposed
# - is this package Ubuntu-specific?
# - is there an open bug in launchpad about this issue, with no action?
# - subscribe ubuntu-archive and request the package's removal
# - else
# - open a bug report and assign to the package's maintainer
# - is the package in Debian, but the dependency is part of Ubuntu delta?
# - fix
possible_mir = set()
for signature, arches in unsatisfiable.items():
assistant = logger.newTask("unsatisfiable", level + 2)
depends = signature.split(' ')[0]
assistant.warning("{} can not be satisfied "
"on {}".format(signature, ", ".join(arches)),
status=ReviewResult.FAIL)
in_archive = self.package_in_distro(depends, distro='ubuntu',
distroseries=distroseries)
in_proposed = self.package_in_distro(depends, distro='ubuntu',
distroseries=distroseries,
proposed=True)
if any(in_archive) and not any(in_proposed):
assistant.info("{}/{} exists "
"in the Ubuntu primary archive".format(
depends,
in_archive.get('version')),
status=ReviewResult.FAIL, depth=1)
if self.selected.get('component', 'main') != in_archive.get('component'):
possible_mir.add(depends)
elif not any(in_archive) and any(in_proposed):
assistant.info("{} is only in -proposed".format(depends),
status=ReviewResult.FAIL, depth=1)
assistant.debug("Has this package been dropped in Ubuntu, "
"but not in Debian?",
status=ReviewResult.INFO, depth=2)
elif not any(in_archive) and not any(in_proposed):
in_debian = self.package_in_distro(depends, distro='debian',
distroseries=distroseries)
if any(in_debian):
assistant.warning("{} only exists in Debian".format(depends),
status=ReviewResult.FAIL, depth=1)
assistant.debug("Is this package blacklisted? Should it be synced?",
status=ReviewResult.INFO, depth=2)
else:
assistant.warning("{} is not found".format(depends),
status=ReviewResult.FAIL, depth=1)
assistant.debug("Has this package been removed?",
status=ReviewResult.INFO, depth=2)
else:
if self.selected.get('component', 'main') != in_archive.get('component'):
possible_mir.add(depends)
for p_mir in possible_mir:
self.check_mir_status(logger, p_mir, level)
if affected_sources:
for src_name in affected_sources:
self.find_excuses(src_name, level+2)
def process_autopkgtest(self, level):
logger = AssistantTaskLogger("autopkgtest", self.task_logger)
assistant = logger.newTask("autopkgtest", level + 1)
autopkgtests = self.selected.get('policy_info').get('autopkgtest')
assistant.critical("Fix autopkgtests triggered by this package for:",
status=ReviewResult.NONE)
waiting = 0
failed_tests = defaultdict(set)
for key, test in autopkgtests.items():
logger = AssistantTaskLogger(key, logger)
assistant = logger.newTask(key, level + 2)
for arch, arch_test in test.items():
if 'RUNNING' in arch_test:
waiting += 1
if 'REGRESSION' in arch_test:
assistant.warning("{} {} {}".format(key, arch, arch_test[2]),
status=ReviewResult.FAIL)
failed_tests[key].add(arch)
if arch == "amd64":
if '/' in key:
pkgname = key.split('/')[0]
else:
pkgname = key
ci_results = self.get_debian_ci_results(pkgname, "amd64")
if ci_results is not None:
result = ci_results.get('status')
status_ci = ReviewResult.FAIL
if result == 'pass':
status_ci = ReviewResult.PASS
assistant.warning("CI tests {} in Debian".format(
result),
status=status_ci, depth=1)
if 'pass' in result:
assistant.info("Consider filing a bug "
"(usertag: autopkgtest) "
"in Debian if none exist",
status=ReviewResult.INFO, depth=2)
else:
# TODO: (cyphermox) detect this case?
# check versions?
assistant.info("If synced from Debian and "
"requires sourceful changes to "
"the package, file a bug for "
"removal from -proposed",
status=ReviewResult.INFO, depth=2)
if waiting > 0:
assistant.error("{} tests are currently running "
"or waiting to be run".format(waiting),
status=ReviewResult.INFO)
else:
if self.logger.getReviewLevel() >= logging.ERROR:
for test, arches in failed_tests.items():
assistant.error("{}: {}".format(test, ", ".join(arches)),
status=ReviewResult.FAIL)
assistant.error("{}/packages/p/{}".format(AUTOPKGTEST_URL, test.split('/')[0]),
status=ReviewResult.INFO, depth=1)
def process_blocking(self, level):
assistant = self.task_logger.newTask("blocking", level + 1)
lp = launchpad.LaunchpadInstance().lp
bugs = self.selected.get('policy_info').get('block-bugs')
source_name = self.selected.get('source')
if bugs:
assistant.critical("Resolve blocking bugs:", status=ReviewResult.NONE)
for bug in bugs.keys():
lp_bug = lp.bugs[bug]
assistant.error("[LP: #{}] {} {}".format(lp_bug.id,
lp_bug.title,
lp_bug.web_link),
status=ReviewResult.NONE)
tasks = lp_bug.bug_tasks
for task in tasks:
value = ReviewResult.FAIL
if task.status in ('Fix Committed', 'Fix Released'):
value = ReviewResult.PASS
elif task.status in ("Won't Fix", 'Invalid'):
continue
assistant.warning("{}({}) in {}".format(
task.status,
task.importance,
task.bug_target_display_name),
status=value)
# guesstimate whether this is a removal request
if 'emove {}'.format(source_name) in lp_bug.title:
assistant.info("This looks like a removal request",
status=ReviewResult.INFO)
assistant.info("Consider pinging #ubuntu-release for processing",
status=ReviewResult.INFO)
hints = self.selected.get('hints')
if hints is not None:
hints_path = os.path.join(self.cache_path, 'hints-ubuntu')
self.get_latest_hints(hints_path)
assistant.critical("Update manual hinting (contact #ubuntu-release):",
status=ReviewResult.NONE)
hint_from = hints[0]
if hint_from == 'freeze':
assistant.error("Package blocked by freeze.")
else:
version = None
unblock_re = re.compile(r'^unblock {}\/(.*)$'.format(source_name))
files = [f for f in os.listdir(hints_path) if (os.path.isfile(
os.path.join(hints_path, f)) and f != 'freeze')]
for hints_file in files:
with open(os.path.join(hints_path, hints_file)) as fp:
print("Checking {}".format(os.path.join(hints_path, hints_file)))
for line in fp:
match = unblock_re.match(line)
if match:
version = match.group(1)
break
if version:
break
if version:
reason = \
("Unblock request by {} ignored due to version mismatch: "
"{}".format(hints_file, version))
else:
reason = "Missing unblock sequence in the hints file"
assistant.error(reason, status=ReviewResult.INFO)
def process_dependencies(self, source, level):
assistant = self.task_logger.newTask("dependencies", level + 1)
dependencies = source.get('dependencies')
blocked_by = dependencies.get('blocked-by', None)
migrate_after = dependencies.get('migrate-after', None)
if blocked_by or migrate_after:
assistant.critical("Clear outstanding promotion interdependencies:",
status=ReviewResult.NONE)
assistant = self.task_logger.newTask("dependencies", level + 2)
if migrate_after is not None:
assistant.error("{} will migrate after {}".format(
source.get('source'), ", ".join(migrate_after)),
status=ReviewResult.FAIL)
assistant.warning("Investigate what packages are conflicting, "
"by looking at 'Trying easy on autohinter' lines in "
"update_output.txt for {}".format(
source.get('source')),
status=ReviewResult.INFO, depth=1)
assistant.warning("See {}proposed-migration/update_output.txt".format(
ARCHIVE_PAGES),
status=ReviewResult.INFO, depth=2)
if blocked_by is not None:
assistant.error("{} is blocked by the migration of {}".format(
source.get('source'), ", ".join(blocked_by)),
status=ReviewResult.FAIL)
for blocker in blocked_by:
self.find_excuses(blocker, level+2)
def process_missing_builds(self, level):
logger = AssistantTaskLogger("missing_builds", self.task_logger)
assistant = logger.newTask("missing_builds", level + 1)
new_version = self.selected.get('new-version')
old_version = self.selected.get('old-version')
# TODO: Process missing builds; suggest options
#
# - missing build on $arch / has no binaries on any arch
# - is this an architecture-specific build failure?
# - has Debian removed the binaries for this architecture?
# - ask AA to remove the binaries as ANAIS
# - else
# - try to fix
#
# - is this a build failure on all archs?
# - are there bugs filed about this failure in Debian?
# - is the package in sync with Debian and does the package require
# sourceful changes to fix?
# - remove from -proposed
#
# - does the package fail to build in Debian?
# - file a bug in Debian
# - is the package in sync with Debian and does the package require
# sourceful changes to fix?
# - remove from -proposed
#
# - is this a dep-wait?
# - does this package have this build-dependency in Debian?
# - is this an architecture-specific dep-wait?
# - has Debian removed the binaries for this architecture?
# - ask AA to remove the binaries as ANAIS
# - else
# - try to fix
# - does this binary package exist in Debian?
# - look what source package provides this binary package in Debian
# - is this source package ftbfs or dep-wait in -proposed?
# - recurse
# - else
# - is this source package on the sync blacklist?
# - file a bug with the Ubuntu package
# - else
# - fix by syncing or merging the source
# - else
# - make sure a bug is filed in Debian about the issue
# - was the depended-on package removed from Debian,
# and is this a sync?
# - ask AA to remove the package from -proposed
# - else
# - leave the package in -proposed
uploads = {}
failed = {}
new = []
new_binaries = set()
self.process_lp_build_results(level, uploads, failed)
if new_version in uploads:
for arch, item in uploads[new_version].items():
for binary in item:
binary_name = binary.get('name')
new_binaries.add(binary_name)
if binary.get('is_new'):
new.append(binary)
if not any(failed):
assistant = logger.newTask("old_binaries", level + 1)
assistant.warning("No failed builds found", status=ReviewResult.PASS)
try:
missing_builds = self.selected.get('missing-builds')
missing_arches = missing_builds.get('on-architectures')
arch_o = []
for arch in missing_arches:
if arch not in uploads[new_version]:
arch_o.append("-a {}".format(arch))
if any(arch_o):
old_binaries = self.selected.get('old-binaries').get(old_version)
assistant.warning("This package has dropped support for "
"architectures it previously supported. ",
status=ReviewResult.INFO)
assistant.warning("Ask in #ubuntu-release for an Archive "
"Admin to run:",
status=ReviewResult.INFO)
assistant.info("remove-package %(arches)s -b %(bins)s"
% ({'arches': " ".join(arch_o),
'bins': " ".join(old_binaries),
}), status=ReviewResult.NONE, depth=1)
except AttributeError:
# Ignore a failure here, it just means we don't have
# missing-builds to process after all.
pass
if any(new):
assistant = logger.newTask("new", level + 1)
assistant.warning("This package has NEW binaries to process:",
status=ReviewResult.INFO)
for binary in new:
assistant.error("NEW: [{}] {}/{}".format(
binary.get('architecture'),
binary.get('name'),
binary.get('version')),
status=ReviewResult.FAIL, depth=1)
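The `remove-package` hint assembled above can be isolated into a small helper; a sketch under the assumption that the inputs are plain architecture and binary-name strings (the package and architecture names below are illustrative only):

```python
def removal_command(missing_arches, built_arches, old_binaries):
    """Build the remove-package hint for architectures that lost binaries.

    Returns None when every previously-missing architecture was in fact built.
    """
    arch_opts = [f"-a {arch}" for arch in missing_arches if arch not in built_arches]
    if not arch_opts:
        return None
    return "remove-package {} -b {}".format(" ".join(arch_opts), " ".join(old_binaries))
```

For example, `removal_command(["armhf", "s390x"], {"s390x"}, ["libfoo1"])` yields the one-liner an Archive Admin would be asked to run.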
def process(self, level):
source_name = self.selected.get('source')
reasons = self.selected.get('reason')
self.seen.append(source_name)
self.task_logger = AssistantTaskLogger(source_name, self.task_logger)
assistant = self.task_logger.newTask(source_name, depth=level)
text_candidate = "not considered"
candidate = ReviewResult.FAIL
if self.selected.get('is-candidate'):
text_candidate = "a valid candidate"
candidate = ReviewResult.PASS
assistant.info("{} is {}".format(source_name, text_candidate),
status=candidate)
assistant.critical("Next steps for {} {}:".format(
source_name, self.selected.get('new-version')),
status=ReviewResult.NONE)
assistant.debug("reasons: {}".format(reasons), status=ReviewResult.NONE)
work_needed = False
missing_builds = self.selected.get('missing-builds')
if missing_builds is not None or 'no-binaries' in reasons:
work_needed = True
self.process_missing_builds(level)
if 'depends' in reasons:
work_needed = True
self.process_unsatisfiable_depends(level)
if 'block' in reasons:
work_needed = True
self.process_blocking(level)
if 'autopkgtest' in reasons:
work_needed = True
self.process_autopkgtest(level)
dependencies = self.selected.get('dependencies')
if dependencies is not None:
work_needed = True
self.process_dependencies(self.selected, level)
if work_needed is False:
assistant.error("Good job!", status=ReviewResult.PASS)
assistant.warning("Investigate if packages are conflicting, "
"by looking at 'Trying easy on autohinter' lines in "
"update_output.txt"
" for {}".format(source_name),
status=ReviewResult.INFO)
assistant.warning("See {}proposed-migration/update_output.txt".format(
ARCHIVE_PAGES),
status=ReviewResult.INFO)
def choose_blocked_source(self, excuses):
import pager
def pager_callback(pagenum):
prompt = "Page -%s-. Press any key for next page or Q to select a " \
"package." % pagenum
pager.echo(prompt)
if pager.getch() in [pager.ESC_, 'q', 'Q']:
return False
pager.echo('\r' + ' '*(len(prompt)) + '\r')
choice = 0
options = []
entry_list = []
sorted_excuses = sorted(
self.excuses['sources'],
key=lambda e: e.get('policy_info').get('age').get('current-age'),
reverse=True)
for src_num, item in enumerate(sorted_excuses, start=1):
item_name = item.get('item-name')
age = math.floor(
item.get('policy_info').get('age').get('current-age'))
options.append(item_name)
entry_list.append("({}) {} (Age: {} days)\n".format(
src_num, item_name, age))
while True:
pager.page(iter(entry_list), pager_callback)
num = input("\nWhich package do you want to look at? ")
try:
choice = int(num)
if choice > 0 and choice <= src_num:
break
except ValueError:
# num might be the package name.
if num in options:
return num
return options[choice - 1]
def get_latest_hints(self, path):
if os.path.exists(path):
try:
subprocess.check_call(
"bzr info %s" % path, shell=True, stdout=subprocess.PIPE,
stderr=subprocess.PIPE)
subprocess.check_call("bzr pull -d %s" % path, shell=True,
stdout=subprocess.PIPE, stderr=subprocess.PIPE)
except subprocess.CalledProcessError:
print("The path {} exists but does not appear to be a valid "
"bzr branch, or it could not be updated properly.".format(
path))
exit(1)
else:
try:
subprocess.check_call(
"bzr branch %s %s" % (HINTS_BRANCH, path), shell=True,
stdout=subprocess.PIPE, stderr=subprocess.PIPE)
except subprocess.CalledProcessError:
print("Could not access the hints-ubuntu bzr branch, exiting.")
exit(1)
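The `shell=True` string commands in `get_latest_hints` are fragile when `path` contains spaces or shell metacharacters; the same logic can be sketched with argument lists instead. The branch URL below is a placeholder, not necessarily the real `HINTS_BRANCH` value:

```python
import os
import subprocess

# Hypothetical branch URL, for illustration only.
HINTS_BRANCH = "lp:~ubuntu-release/britney/hints-ubuntu"

def hints_commands(path, branch=HINTS_BRANCH):
    """Return the bzr invocations needed to create or refresh `path`."""
    if os.path.exists(path):
        # Existing checkout: verify it is a branch, then pull updates.
        return [["bzr", "info", path], ["bzr", "pull", "-d", path]]
    # No checkout yet: branch from scratch.
    return [["bzr", "branch", branch, path]]

def get_latest_hints(path, runner=subprocess.check_call):
    for cmd in hints_commands(path):
        runner(cmd, stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL)
```

Passing a `runner` also makes the function testable without a network or bzr installation.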


@ -1,45 +0,0 @@
#!/usr/bin/python3
# -*- coding: utf-8 -*-
# Copyright (C) 2018 Canonical Ltd.
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; version 3 of the License.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
import os
import logging
from ubuntu_archive_assistant.command import AssistantCommand
import ubuntu_archive_assistant.logging as app_logging
logger = app_logging.AssistantLogger()
class Assistant(AssistantCommand):
def __init__(self):
super().__init__(command_id='',
description='archive assistant',
logger=logger,
leaf=False)
def parse_args(self):
import ubuntu_archive_assistant.commands
self._import_subcommands(ubuntu_archive_assistant.commands)
super().parse_args()
def main(self):
self.parse_args()
self.run_command()


@ -1,171 +0,0 @@
#!/usr/bin/python3
# -*- coding: utf-8 -*-
# Copyright (C) 2018 Canonical Ltd.
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; version 3 of the License.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
import os
import logging
from enum import Enum
class ReviewResult(Enum):
NONE = 1
PASS = 2
FAIL = 3
INFO = 4
class ReviewResultAdapter(logging.LoggerAdapter):
depth = 0
def process(self, msg, kwargs):
status = kwargs.pop('status')
depth = self.depth + kwargs.pop('depth', 0)
# FIXME: indentation may be ugly because of character width
if status is ReviewResult.PASS:
icon = "\033[92m✔\033[0m"
#icon = ""
elif status is ReviewResult.FAIL:
icon = "\033[91m✘\033[0m"
#icon = ""
elif status is ReviewResult.INFO:
icon = "\033[94m\033[0m"
#icon = ""
else:
icon = ""
if depth <= 0:
return '%s %s' % (msg, icon), kwargs
elif status is ReviewResult.INFO:
return '%s%s %s' % (" " * depth * 2, icon, msg), kwargs
else:
return '%s%s %s' % (" " * depth * 2, msg, icon), kwargs
def critical(self, msg, *args, **kwargs):
self.depth = self.extra['depth']
msg, kwargs = self.process(msg, kwargs)
self.logger.critical(msg, *args, **kwargs)
def error(self, msg, *args, **kwargs):
self.depth = self.extra['depth']
msg, kwargs = self.process(msg, kwargs)
self.logger.error(msg, *args, **kwargs)
def warning(self, msg, *args, **kwargs):
self.depth = self.extra['depth']
msg, kwargs = self.process(msg, kwargs)
self.logger.warning(msg, *args, **kwargs)
def info(self, msg, *args, **kwargs):
self.depth = self.extra['depth']
msg, kwargs = self.process(msg, kwargs)
self.logger.info(msg, *args, **kwargs)
def debug(self, msg, *args, **kwargs):
self.depth = self.extra['depth']
msg, kwargs = self.process("DEBUG<{}>: {}".format(self.name, msg), kwargs)
self.logger.debug(msg, *args, **kwargs)
class AssistantLogger(object):
class __AssistantLogger(object):
def __init__(self):
main_root_logger = logging.RootLogger(logging.INFO)
self.main_log_manager = logging.Manager(main_root_logger)
main_review_logger = logging.RootLogger(logging.ERROR)
self.review_log_manager = logging.Manager(main_review_logger)
fmt = logging.Formatter('%(asctime)s - %(name)s - %(levelname)s - %(message)s')
main_handler = logging.StreamHandler()
review_handler = logging.StreamHandler()
main_handler.setFormatter(fmt)
main_root_logger.addHandler(main_handler)
main_review_logger.addHandler(review_handler)
instance = None
def __init__(self, module=None, depth=0):
if not AssistantLogger.instance:
AssistantLogger.instance = AssistantLogger.__AssistantLogger()
if not module:
self.log = AssistantLogger.instance.main_log_manager.getLogger('assistant')
self.review_logger = AssistantLogger.instance.review_log_manager.getLogger('review')
else:
self.log = AssistantLogger.instance.main_log_manager.getLogger('assistant.%s' % module)
self.review_logger = AssistantLogger.instance.review_log_manager.getLogger('review.%s' % module)
self.depth = depth
self.review = ReviewResultAdapter(self.review_logger, {'depth': self.depth})
def newTask(self, task, depth):
review_logger = AssistantLogger.instance.review_log_manager.getLogger("%s.%s" % (self.review.name, task))
return ReviewResultAdapter(review_logger, {'depth': self.depth})
def setLevel(self, level):
self.log.setLevel(level)
def setReviewLevel(self, level):
self.review.setLevel(level)
def getReviewLevel(self):
return self.review.getEffectiveLevel()
def getReviewLogger(self, name):
return AssistantLogger.instance.review_log_manager.getLogger(name)
class AssistantTask(object):
def __init__(self, task, parent=None):
self.parent = parent
self.log = parent.log
if isinstance(parent, AssistantLogger):
self.depth = 0
else:
self.depth = parent.depth + 1
class AssistantTaskLogger(AssistantTask):
def __init__(self, task, logger):
super().__init__(task, parent=logger)
#self.review = self.parent.newTask(task, logger.depth + 1)
def newTask(self, task, depth):
review_logger = self.parent.getReviewLogger("%s.%s" % (self.parent.review.name, task))
self.review = ReviewResultAdapter(review_logger, {'depth': depth})
return self.review
def getReviewLogger(self, name):
return self.parent.getReviewLogger(name)
def critical(self, msg, *args, **kwargs):
self.review.critical(msg, *args, **kwargs)
def error(self, msg, *args, **kwargs):
self.review.error(msg, *args, **kwargs)
def warning(self, msg, *args, **kwargs):
self.review.warning(msg, *args, **kwargs)
def info(self, msg, *args, **kwargs):
self.review.info(msg, *args, **kwargs)
def debug(self, msg, *args, **kwargs):
self.review.debug(msg, *args, **kwargs)
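The depth-based layout that `ReviewResultAdapter.process` applies can be reduced to a small pure function; a minimal sketch, with the status handling simplified to a pre-rendered icon string:

```python
def render(msg, icon, depth, icon_first=False):
    """Mirror the adapter's layout: no indent at depth 0, two spaces per level.

    INFO-style messages place the icon before the text once indented.
    """
    if depth <= 0:
        return f"{msg} {icon}"
    pad = " " * depth * 2
    if icon_first:
        return f"{pad}{icon} {msg}"
    return f"{pad}{msg} {icon}"
```

At depth 0 the icon trails the message; at depth 2 the line gains four leading spaces, matching the `" " * depth * 2` expression in the adapter.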


@ -1,110 +0,0 @@
#!/usr/bin/python3
# -*- coding: utf-8 -*-
# Copyright (C) 2018 Canonical Ltd.
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; version 3 of the License.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
import sys
import unittest
from ubuntu_archive_assistant.command import AssistantCommand
# class AssistantCommand():
# def print_usage(self):
# def _add_subparser_from_class(self, name, commandclass):
# def _import_subcommands(self, submodules):
scratch = 0
class MyMainCommand(AssistantCommand):
def __init__(self):
super().__init__(command_id="test", description="test", leaf=False)
class MySubCommand(AssistantCommand):
def __init__(self):
super().__init__(command_id="subtest", description="subtest", leaf=False)
def update(self, args):
super().update(args)
self._args.append("extra")
def do_nothing():
return
def do_something():
global scratch
scratch = 1337
def do_crash():
raise Exception("unexpected")
class TestCommand(unittest.TestCase):
def test_update_args(self):
main = MyMainCommand()
sub = MySubCommand()
sub._args = ['toto', 'tata']
main.commandclass = sub
main.func = do_nothing
self.assertNotIn('titi', sub._args)
main.update(['titi', 'tutu'])
main.run_command()
self.assertIn('titi', main._args)
self.assertIn('titi', sub._args)
def test_parse_args(self):
main = MyMainCommand()
main._args = [ '--debug', 'help' ]
main.subcommand = do_nothing
main.parse_args()
self.assertNotIn('help', main._args)
self.assertNotIn('--debug', main._args)
self.assertTrue(main.debug)
def test_run_command_with_commandclass(self):
main = MyMainCommand()
sub = MySubCommand()
main._args = ['unknown_arg']
main.commandclass = sub
main.func = do_nothing
self.assertEqual(None, sub._args)
main.run_command()
self.assertIn('extra', sub._args)
def test_run_command(self):
main = MyMainCommand()
sub = MySubCommand()
main.func = do_something
self.assertEqual(None, sub._args)
main.run_command()
self.assertEqual(1337, scratch)
def test_run_command_crashing(self):
main = MyMainCommand()
sub = MySubCommand()
main.func = do_crash
try:
main.run_command()
self.fail("Did not crash as expected")
except Exception as e:
self.assertIn('unexpected', e.args)


@ -1,16 +0,0 @@
#!/usr/bin/python3
# -*- coding: utf-8 -*-
# Copyright (C) 2018 Canonical Ltd.
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; version 3 of the License.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.


@ -1,82 +0,0 @@
#!/usr/bin/python3
# -*- coding: utf-8 -*-
# Copyright (C) 2018 Canonical Ltd.
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; version 3 of the License.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
from termcolor import colored
from ubuntu_archive_assistant.utils import launchpad
from ubuntu_archive_assistant.logging import AssistantLogger
def _colorize_status(status):
color = 'grey'
if status in ('In Progress', 'Fix Committed'):
color = 'green'
elif status == 'Incomplete':
color = 'red'
else:
color = 'grey'
return colored(status, color)
def _colorize_priority(importance):
color = 'grey'
if importance == 'Critical':
color = 'red'
elif importance in ('High', 'Medium'):
color = 'yellow'
elif importance == 'Low':
color = 'green'
else:
color = 'grey'
return colored(importance, color)
def show_bug(print_func, bug, **kwargs):
print_func("(LP: #%s) %s" % (bug.id, bug.title),
**kwargs)
def show_task(print_func, task, show_bug_header=False, **kwargs):
assigned_to = "unassigned"
if task.assignee:
if task.assignee.name in ('ubuntu-security', 'canonical-security'):
a_color = 'red'
elif task.assignee.name == 'ubuntu-mir':
a_color = 'blue'
else:
a_color = 'grey'
assignee = colored(task.assignee.display_name, a_color)
assigned_to = "assigned to %s" % assignee
if show_bug_header:
show_bug(print_func, task.bug, **kwargs)
print_func("\t%s (%s) in %s (%s)" % (_colorize_status(task.status),
_colorize_priority(task.importance),
task.target.name, assigned_to),
**kwargs)
def list_bugs(print_func, tasks, filter=None, **kwargs):
last_bug_id = 0
for task in tasks:
if filter is not None and filter(task):
continue
if task.bug.id != last_bug_id:
show_bug(print_func, task.bug, **kwargs)
last_bug_id = task.bug.id
show_task(print_func, task, show_bug_header=False, **kwargs)
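`list_bugs` prints a bug header only when the bug id changes between consecutive tasks; that grouping logic, stripped of the Launchpad objects, is a simplification that can be sketched on plain `(bug_id, task)` pairs:

```python
def group_tasks_by_bug(tasks):
    """Yield (show_header, task) pairs, emitting a header once per bug id run."""
    last_bug_id = None
    for bug_id, task in tasks:
        yield (bug_id != last_bug_id, task)
        last_bug_id = bug_id
```

Note this only deduplicates consecutive runs, exactly like the `last_bug_id` check above; tasks must already be ordered by bug for the headers to come out once per bug.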


@ -1,60 +0,0 @@
#!/usr/bin/python3
# -*- coding: utf-8 -*-
# Copyright (C) 2018 Canonical Ltd.
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; version 3 of the License.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
import os
from launchpadlib.launchpad import Launchpad
from ubuntu_archive_assistant.logging import AssistantLogger
class LaunchpadInstance(object):
class __LaunchpadInstance(object):
def __init__(self):
self.logger = AssistantLogger()
self.lp_cachedir = os.path.expanduser(os.path.join("~", ".launchpadlib/cache"))
self.logger.log.debug("Using Launchpad cache dir: \"%s\"" % self.lp_cachedir)
self.lp = Launchpad.login_with('ubuntu-archive-assistant',
service_root='production',
launchpadlib_dir=self.lp_cachedir,
version='devel')
instance = None
def __init__(self, module=None, depth=0):
if not LaunchpadInstance.instance:
LaunchpadInstance.instance = LaunchpadInstance.__LaunchpadInstance()
self.lp = LaunchpadInstance.instance.lp
self.ubuntu = self.lp.distributions['ubuntu']
def lp(self):
return self.lp
def ubuntu(self):
return self.ubuntu
def ubuntu_archive(self):
return self.ubuntu.main_archive
def current_series(self):
return self.ubuntu.current_series
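Both `AssistantLogger` and `LaunchpadInstance` use the same nested-class singleton idiom: the expensive state lives in a private inner class constructed once, and every outer instance reuses it. A stripped-down sketch of the pattern (the `connection` attribute stands in for the Launchpad login):

```python
class Session:
    class _Inner:
        """Holds the expensive state; constructed only once per process."""
        def __init__(self):
            self.connection = object()  # placeholder for a real login

    _instance = None

    def __init__(self):
        if Session._instance is None:
            Session._instance = Session._Inner()
        # Every Session shares the single inner instance's state.
        self.connection = Session._instance.connection
```

Constructing `Session()` twice returns two outer objects backed by the same connection, which is why repeated `LaunchpadInstance()` calls above do not trigger repeated logins.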


@ -1,50 +0,0 @@
#!/usr/bin/python3
# -*- coding: utf-8 -*-
# Copyright (C) 2018 Canonical Ltd.
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; version 3 of the License.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
import os
import requests
from urllib.request import FancyURLopener
class URLRetrieverWithProgress(object):
def __init__(self, url, filename):
self.url = url
self.filename = filename
self.url_opener = FancyURLopener()
def get(self):
self.url_opener.retrieve(self.url, self.filename, self._report_download)
def _report_download(self, blocks_read, block_size, total_size):
size_read = blocks_read * block_size
if total_size <= 0:
# Total size unknown (reported as -1); skip the percentage display.
return
percent = size_read / total_size * 100
if percent <= 100:
print("Refreshing %s: %.0f %%" % (os.path.basename(self.filename), percent), end='\r')
else:
print(" " * 80, end='\r')
def get_with_progress(url=None, filename=None):
retriever = URLRetrieverWithProgress(url, filename)
response = retriever.get()
return response
def get(url=None):
response = requests.get(url=url)
return response
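`_report_download` follows the standard `urlretrieve` reporthook signature `(blocks_read, block_size, total_size)`. The percentage math is easiest to reason about in isolation, including the guard for the `-1` that is reported when the server sends no Content-Length; a sketch:

```python
def download_percent(blocks_read, block_size, total_size):
    """Return download progress in percent, or None when total size is unknown."""
    if total_size <= 0:  # reporthooks receive -1 for unknown sizes
        return None
    # The last block is usually partial, so clamp to 100.
    return min(blocks_read * block_size / total_size * 100, 100.0)
```

Clamping replaces the original's blank-line trick for the overshoot case where `blocks_read * block_size` exceeds the real file size.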


@ -7,8 +7,8 @@ import logging
import sys
def getLogger():
''' Get the logger instance for this module
def getLogger(): # pylint: disable=invalid-name
"""Get the logger instance for this module
Quick guide for using this or not: if you want to call ubuntutools
module code and have its output print to stdout/stderr ONLY, you can
@ -33,12 +33,12 @@ def getLogger():
This should only be used by runnable scripts provided by the
ubuntu-dev-tools package, or other runnable scripts that want the behavior
described above.
'''
"""
logger = logging.getLogger(__name__)
logger.setLevel(logging.INFO)
logger.propagate = False
fmt = logging.Formatter('%(message)s')
fmt = logging.Formatter("%(message)s")
stdout_handler = logging.StreamHandler(stream=sys.stdout)
stdout_handler.setFormatter(fmt)
@ -46,7 +46,7 @@ def getLogger():
logger.addHandler(stdout_handler)
stderr_handler = logging.StreamHandler(stream=sys.stderr)
stdout_handler.setFormatter(fmt)
stderr_handler.setFormatter(fmt)
stderr_handler.setLevel(logging.INFO + 1)
logger.addHandler(stderr_handler)

File diff suppressed because it is too large


@ -18,10 +18,10 @@
# CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE.
#
import logging
import os
import subprocess
import logging
Logger = logging.getLogger(__name__)
@ -31,20 +31,21 @@ def _build_preparation(result_directory):
os.makedirs(result_directory)
class Builder(object):
class Builder:
def __init__(self, name):
self.name = name
cmd = ["dpkg-architecture", "-qDEB_BUILD_ARCH_CPU"]
self.architecture = subprocess.check_output(cmd, encoding='utf-8').strip()
self.architecture = subprocess.check_output(cmd, encoding="utf-8").strip()
def _build_failure(self, returncode, dsc_file):
if returncode != 0:
Logger.error("Failed to build %s from source with %s." %
(os.path.basename(dsc_file), self.name))
Logger.error(
"Failed to build %s from source with %s.", os.path.basename(dsc_file), self.name
)
return returncode
def exists_in_path(self):
for path in os.environ.get('PATH', os.defpath).split(os.pathsep):
for path in os.environ.get("PATH", os.defpath).split(os.pathsep):
if os.path.isfile(os.path.join(path, self.name)):
return True
return False
@ -57,8 +58,7 @@ class Builder(object):
def _update_failure(self, returncode, dist):
if returncode != 0:
Logger.error("Failed to update %s chroot for %s." %
(dist, self.name))
Logger.error("Failed to update %s chroot for %s.", dist, self.name)
return returncode
@ -68,19 +68,39 @@ class Pbuilder(Builder):
def build(self, dsc_file, dist, result_directory):
_build_preparation(result_directory)
cmd = ["sudo", "-E", "ARCH=" + self.architecture, "DIST=" + dist,
self.name, "--build",
"--architecture", self.architecture, "--distribution", dist,
"--buildresult", result_directory, dsc_file]
Logger.debug(' '.join(cmd))
cmd = [
"sudo",
"-E",
f"ARCH={self.architecture}",
f"DIST={dist}",
self.name,
"--build",
"--architecture",
self.architecture,
"--distribution",
dist,
"--buildresult",
result_directory,
dsc_file,
]
Logger.debug(" ".join(cmd))
returncode = subprocess.call(cmd)
return self._build_failure(returncode, dsc_file)
def update(self, dist):
cmd = ["sudo", "-E", "ARCH=" + self.architecture, "DIST=" + dist,
self.name, "--update",
"--architecture", self.architecture, "--distribution", dist]
Logger.debug(' '.join(cmd))
cmd = [
"sudo",
"-E",
f"ARCH={self.architecture}",
f"DIST={dist}",
self.name,
"--update",
"--architecture",
self.architecture,
"--distribution",
dist,
]
Logger.debug(" ".join(cmd))
returncode = subprocess.call(cmd)
return self._update_failure(returncode, dist)
@ -91,15 +111,22 @@ class Pbuilderdist(Builder):
def build(self, dsc_file, dist, result_directory):
_build_preparation(result_directory)
cmd = [self.name, dist, self.architecture,
"build", dsc_file, "--buildresult", result_directory]
Logger.debug(' '.join(cmd))
cmd = [
self.name,
dist,
self.architecture,
"build",
dsc_file,
"--buildresult",
result_directory,
]
Logger.debug(" ".join(cmd))
returncode = subprocess.call(cmd)
return self._build_failure(returncode, dsc_file)
def update(self, dist):
cmd = [self.name, dist, self.architecture, "update"]
Logger.debug(' '.join(cmd))
Logger.debug(" ".join(cmd))
returncode = subprocess.call(cmd)
return self._update_failure(returncode, dist)
@ -111,41 +138,40 @@ class Sbuild(Builder):
def build(self, dsc_file, dist, result_directory):
_build_preparation(result_directory)
workdir = os.getcwd()
Logger.debug("cd " + result_directory)
Logger.debug("cd %s", result_directory)
os.chdir(result_directory)
cmd = ["sbuild", "--arch-all", "--dist=" + dist,
"--arch=" + self.architecture, dsc_file]
Logger.debug(' '.join(cmd))
cmd = ["sbuild", "--arch-all", f"--dist={dist}", f"--arch={self.architecture}", dsc_file]
Logger.debug(" ".join(cmd))
returncode = subprocess.call(cmd)
Logger.debug("cd " + workdir)
Logger.debug("cd %s", workdir)
os.chdir(workdir)
return self._build_failure(returncode, dsc_file)
def update(self, dist):
cmd = ["schroot", "--list"]
Logger.debug(' '.join(cmd))
process = subprocess.run(cmd, stdout=subprocess.PIPE, encoding='utf-8')
Logger.debug(" ".join(cmd))
process = subprocess.run(cmd, check=False, stdout=subprocess.PIPE, encoding="utf-8")
if process.returncode != 0:
return process.returncode
chroots = process.stdout.strip().split()
params = {"dist": dist, "arch": self.architecture}
for chroot in ("%(dist)s-%(arch)s-sbuild-source",
for chroot in (
"%(dist)s-%(arch)s-sbuild-source",
"%(dist)s-sbuild-source",
"%(dist)s-%(arch)s-source",
"%(dist)s-source"):
"%(dist)s-source",
):
chroot = chroot % params
if chroot in chroots:
break
else:
return 1
commands = [["sbuild-update"],
["sbuild-distupgrade"],
["sbuild-clean", "-a", "-c"]]
commands = [["sbuild-update"], ["sbuild-distupgrade"], ["sbuild-clean", "-a", "-c"]]
for cmd in commands:
# pylint: disable=W0631
Logger.debug(' '.join(cmd) + " " + chroot)
Logger.debug("%s %s", " ".join(cmd), chroot)
ret = subprocess.call(cmd + [chroot])
# pylint: enable=W0631
if ret != 0:
@ -156,9 +182,9 @@ class Sbuild(Builder):
_SUPPORTED_BUILDERS = {
"cowbuilder": lambda: Pbuilder("cowbuilder"),
"cowbuilder-dist": lambda: Pbuilderdist("cowbuilder-dist"),
"pbuilder": lambda: Pbuilder(),
"pbuilder-dist": lambda: Pbuilderdist(),
"sbuild": lambda: Sbuild(),
"pbuilder": Pbuilder,
"pbuilder-dist": Pbuilderdist,
"sbuild": Sbuild,
}
@ -170,5 +196,5 @@ def get_builder(name):
Logger.error("Builder doesn't appear to be installed: %s", name)
else:
Logger.error("Unsupported builder specified: %s.", name)
Logger.error("Supported builders: %s",
", ".join(sorted(_SUPPORTED_BUILDERS.keys())))
Logger.error("Supported builders: %s", ", ".join(sorted(_SUPPORTED_BUILDERS.keys())))
return None


@ -15,39 +15,39 @@
# OTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR
# PERFORMANCE OF THIS SOFTWARE.
import locale
import logging
import os
import pwd
import re
import shlex
import socket
import sys
import locale
import logging
Logger = logging.getLogger(__name__)
class UDTConfig(object):
class UDTConfig:
"""Ubuntu Dev Tools configuration file (devscripts config file) and
environment variable parsing.
"""
no_conf = False
# Package wide configuration variables.
# These are required to be used by at least two scripts.
defaults = {
'BUILDER': 'pbuilder',
'DEBIAN_MIRROR': 'http://deb.debian.org/debian',
'DEBSEC_MIRROR': 'http://security.debian.org',
'DEBIAN_DDEBS_MIRROR': 'http://debug.mirrors.debian.org/debian-debug',
'LPINSTANCE': 'production',
'MIRROR_FALLBACK': True,
'UBUNTU_MIRROR': 'http://archive.ubuntu.com/ubuntu',
'UBUNTU_PORTS_MIRROR': 'http://ports.ubuntu.com',
'UBUNTU_INTERNAL_MIRROR': 'http://ftpmaster.internal/ubuntu',
'UBUNTU_DDEBS_MIRROR': 'http://ddebs.ubuntu.com',
'UPDATE_BUILDER': False,
'WORKDIR': None,
'KEYID': None,
"BUILDER": "pbuilder",
"DEBIAN_MIRROR": "http://deb.debian.org/debian",
"DEBSEC_MIRROR": "http://security.debian.org",
"DEBIAN_DDEBS_MIRROR": "http://debug.mirrors.debian.org/debian-debug",
"LPINSTANCE": "production",
"MIRROR_FALLBACK": True,
"UBUNTU_MIRROR": "http://archive.ubuntu.com/ubuntu",
"UBUNTU_PORTS_MIRROR": "http://ports.ubuntu.com",
"UBUNTU_DDEBS_MIRROR": "http://ddebs.ubuntu.com",
"UPDATE_BUILDER": False,
"WORKDIR": None,
"KEYID": None,
}
# Populated from the configuration files:
config = {}
@ -55,30 +55,32 @@ class UDTConfig(object):
def __init__(self, no_conf=False, prefix=None):
self.no_conf = no_conf
if prefix is None:
prefix = os.path.basename(sys.argv[0]).upper().replace('-', '_')
prefix = os.path.basename(sys.argv[0]).upper().replace("-", "_")
self.prefix = prefix
if not no_conf:
self.config = self.parse_devscripts_config()
def parse_devscripts_config(self):
@staticmethod
def parse_devscripts_config():
"""Read the devscripts configuration files, and return the values as a
dictionary
"""
config = {}
for filename in ('/etc/devscripts.conf', '~/.devscripts'):
for filename in ("/etc/devscripts.conf", "~/.devscripts"):
try:
f = open(os.path.expanduser(filename), 'r')
with open(os.path.expanduser(filename), "r", encoding="utf-8") as f:
content = f.read()
except IOError:
continue
for line in f:
parsed = shlex.split(line, comments=True)
if len(parsed) > 1:
Logger.warning('Cannot parse variable assignment in %s: %s',
getattr(f, 'name', '<config>'), line)
if len(parsed) >= 1 and '=' in parsed[0]:
key, value = parsed[0].split('=', 1)
try:
tokens = shlex.split(content, comments=True)
except ValueError as e:
Logger.error("Error parsing %s: %s", filename, e)
continue
for token in tokens:
if "=" in token:
key, value = token.split("=", 1)
config[key] = value
f.close()
return config
def get_value(self, key, default=None, boolean=False, compat_keys=()):
@ -95,9 +97,9 @@ class UDTConfig(object):
if default is None and key in self.defaults:
default = self.defaults[key]
keys = [self.prefix + '_' + key]
keys = [f"{self.prefix}_{key}"]
if key in self.defaults:
keys.append('UBUNTUTOOLS_' + key)
keys.append(f"UBUNTUTOOLS_{key}")
keys += compat_keys
for k in keys:
@ -105,16 +107,19 @@ class UDTConfig(object):
if k in store:
value = store[k]
if boolean:
if value in ('yes', 'no'):
value = value == 'yes'
if value in ("yes", "no"):
value = value == "yes"
else:
continue
if k in compat_keys:
replacements = self.prefix + '_' + key
replacements = f"{self.prefix}_{key}"
if key in self.defaults:
replacements += 'or UBUNTUTOOLS_' + key
Logger.warning('Using deprecated configuration variable %s. '
'You should use %s.', k, replacements)
replacements += f"or UBUNTUTOOLS_{key}"
Logger.warning(
"Using deprecated configuration variable %s. You should use %s.",
k,
replacements,
)
return value
return default
@ -132,7 +137,7 @@ def ubu_email(name=None, email=None, export=True):
Return name, email.
"""
name_email_re = re.compile(r'^\s*(.+?)\s*<(.+@.+)>\s*$')
name_email_re = re.compile(r"^\s*(.+?)\s*<(.+@.+)>\s*$")
if email:
match = name_email_re.match(email)
@ -140,11 +145,16 @@ def ubu_email(name=None, email=None, export=True):
name = match.group(1)
email = match.group(2)
if export and not name and not email and 'UBUMAIL' not in os.environ:
if export and not name and not email and "UBUMAIL" not in os.environ:
export = False
for var, target in (('UBUMAIL', 'email'), ('DEBFULLNAME', 'name'), ('DEBEMAIL', 'email'),
('EMAIL', 'email'), ('NAME', 'name')):
for var, target in (
("UBUMAIL", "email"),
("DEBFULLNAME", "name"),
("DEBEMAIL", "email"),
("EMAIL", "email"),
("NAME", "name"),
):
if name and email:
break
if var in os.environ:
@ -154,30 +164,30 @@ def ubu_email(name=None, email=None, export=True):
name = match.group(1)
if not email:
email = match.group(2)
elif target == 'name' and not name:
elif target == "name" and not name:
name = os.environ[var].strip()
elif target == 'email' and not email:
elif target == "email" and not email:
email = os.environ[var].strip()
if not name:
gecos_name = pwd.getpwuid(os.getuid()).pw_gecos.split(',')[0].strip()
gecos_name = pwd.getpwuid(os.getuid()).pw_gecos.split(",")[0].strip()
if gecos_name:
name = gecos_name
if not email:
mailname = socket.getfqdn()
if os.path.isfile('/etc/mailname'):
mailname = open('/etc/mailname', 'r').read().strip()
email = pwd.getpwuid(os.getuid()).pw_name + '@' + mailname
if os.path.isfile("/etc/mailname"):
mailname = open("/etc/mailname", "r", encoding="utf-8").read().strip()
email = f"{pwd.getpwuid(os.getuid()).pw_name}@{mailname}"
if export:
os.environ['DEBFULLNAME'] = name
os.environ['DEBEMAIL'] = email
os.environ["DEBFULLNAME"] = name
os.environ["DEBEMAIL"] = email
# decode env var or gecos raw string with the current locale's encoding
encoding = locale.getdefaultlocale()[1]
encoding = locale.getlocale()[1]
if not encoding:
encoding = 'utf-8'
encoding = "utf-8"
if name and isinstance(name, bytes):
name = name.decode(encoding)
return name, email
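The rewritten `parse_devscripts_config` above reads each file whole and tokenizes it with `shlex.split(content, comments=True)`, which is what lets quoted multi-line variable assignments survive where the old line-by-line parse dropped them. The same parse on an in-memory string:

```python
import shlex

def parse_config(content):
    """Parse devscripts-style KEY=value assignments, allowing quoted newlines."""
    config = {}
    # shlex keeps a quoted value (newlines included) as a single token
    # and strips everything after '#' as a comment.
    for token in shlex.split(content, comments=True):
        if "=" in token:
            key, value = token.split("=", 1)
            config[key] = value
    return config
```

A value like `TWO="spans\ntwo lines"` comes back as one entry with the embedded newline intact.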


@ -2,5 +2,5 @@
# ubuntu-dev-tools Launchpad Python modules.
#
service = 'production'
api_version = 'devel'
SERVICE = "production"
API_VERSION = "devel"


@ -1,50 +0,0 @@
#
# libsupport.py - functions which add launchpadlib support to the Ubuntu
# Developer Tools package.
#
# Copyright (C) 2009 Markus Korn <thekorn@gmx.de>
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 3
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# Please see the /usr/share/common-licenses/GPL file for the full text of
# the GNU General Public License license.
#
from urllib.parse import urlsplit, urlencode, urlunsplit
def query_to_dict(query_string):
result = dict()
options = filter(None, query_string.split("&"))
for opt in options:
key, value = opt.split("=")
result.setdefault(key, set()).add(value)
return result
def translate_web_api(url, launchpad):
scheme, netloc, path, query, fragment = urlsplit(url)
query = query_to_dict(query)
differences = set(netloc.split('.')).symmetric_difference(
set(launchpad._root_uri.host.split('.')))
if ('staging' in differences or 'edge' in differences):
raise ValueError("url conflict (url: %s, root: %s)" %
(url, launchpad._root_uri))
if path.endswith("/+bugs"):
path = path[:-6]
if "ws.op" in query:
raise ValueError("Invalid web url, url: %s" % url)
query["ws.op"] = "searchTasks"
scheme, netloc, api_path, _, _ = urlsplit(str(launchpad._root_uri))
query = urlencode(query)
url = urlunsplit((scheme, netloc, api_path + path.lstrip("/"), query, fragment))
return url



@ -1,33 +1,26 @@
class PackageNotFoundException(BaseException):
"""Thrown when a package is not found"""
class SeriesNotFoundException(BaseException):
"""Thrown when a distroseries is not found"""
class PocketDoesNotExistError(Exception):
"""Raised when an invalid pocket is used."""
class ArchiveNotFoundException(BaseException):
"""Thrown when an archive for a distribution is not found"""
class AlreadyLoggedInError(Exception):
"""Raised when a second login is attempted."""
class ArchSeriesNotFoundException(BaseException):
"""Thrown when a distroarchseries is not found."""
class InvalidDistroValueError(ValueError):
"""Thrown when distro value is invalid"""


@ -22,34 +22,45 @@
#
# ##################################################################
import hashlib
import locale
import logging
import os
import shutil
import sys
import tempfile
from contextlib import suppress
from pathlib import Path
from subprocess import CalledProcessError, check_output
from urllib.parse import urlparse
import distro_info
import requests
from ubuntutools.lp.udtexceptions import PocketDoesNotExistError
Logger = logging.getLogger(__name__)
DEFAULT_POCKETS = ("Release", "Security", "Updates", "Proposed")
POCKETS = DEFAULT_POCKETS + ("Backports",)
DEFAULT_STATUSES = ("Pending", "Published")
STATUSES = DEFAULT_STATUSES + ("Superseded", "Deleted", "Obsolete")
UPLOAD_QUEUE_STATUSES = ("New", "Unapproved", "Accepted", "Done", "Rejected")
DOWNLOAD_BLOCKSIZE_DEFAULT = 8192
_SYSTEM_DISTRIBUTION_CHAIN: list[str] = []
class DownloadError(Exception):
"Unable to pull a source package"
class NotFoundError(DownloadError):
"Source package not found"
def system_distribution_chain():
@ -61,27 +72,32 @@ def system_distribution_chain():
the distribution chain can't be determined, print an error message
and return an empty list.
"""
if len(_SYSTEM_DISTRIBUTION_CHAIN) == 0:
try:
vendor = check_output(("dpkg-vendor", "--query", "Vendor"), encoding="utf-8").strip()
_SYSTEM_DISTRIBUTION_CHAIN.append(vendor)
except CalledProcessError:
Logger.error("Could not determine what distribution you are running.")
return []
while True:
try:
parent = check_output(
(
"dpkg-vendor",
"--vendor",
_SYSTEM_DISTRIBUTION_CHAIN[-1],
"--query",
"Parent",
),
encoding="utf-8",
).strip()
except CalledProcessError:
# Vendor has no parent
break
_SYSTEM_DISTRIBUTION_CHAIN.append(parent)
return _SYSTEM_DISTRIBUTION_CHAIN
def system_distribution():
@ -102,14 +118,12 @@ def host_architecture():
"""
try:
arch = check_output(("dpkg", "--print-architecture"), encoding="utf-8").strip()
except CalledProcessError:
arch = None
if not arch or "not found" in arch:
Logger.error("Not running on a Debian based system; could not detect its architecture.")
return None
return arch
@ -121,16 +135,16 @@ def readlist(filename, uniq=True):
Read a list of words from the indicated file. If 'uniq' is True, filter
out duplicated words.
"""
path = Path(filename)
if not path.is_file():
Logger.error("File %s does not exist.", path)
return False
content = path.read_text(encoding="utf-8").replace("\n", " ").replace(",", " ")
if not content.strip():
Logger.error("File %s is empty.", path)
return False
items = [item for item in content.split() if item]
@ -141,38 +155,40 @@ def readlist(filename, uniq=True):
return items
def split_release_pocket(release, default="Release"):
"""Splits the release and pocket name.
If the argument doesn't contain a pocket name then the 'Release' pocket
is assumed.
Returns the release and pocket name.
"""
pocket = default
if release is None:
raise ValueError("No release name specified")
if "-" in release:
(release, pocket) = release.rsplit("-", 1)
pocket = pocket.capitalize()
if pocket not in POCKETS:
raise PocketDoesNotExistError(f"Pocket '{pocket}' does not exist.")
return (release, pocket)
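For illustration, a self-contained copy of the split logic above, with a plain `ValueError` standing in for `PocketDoesNotExistError`:

```python
POCKETS = ("Release", "Security", "Updates", "Proposed", "Backports")


def split_release_pocket(release, default="Release"):
    # "noble-updates" -> ("noble", "Updates"); bare "noble" -> ("noble", "Release")
    pocket = default
    if release is None:
        raise ValueError("No release name specified")
    if "-" in release:
        release, pocket = release.rsplit("-", 1)
        pocket = pocket.capitalize()
    if pocket not in POCKETS:
        raise ValueError(f"Pocket '{pocket}' does not exist.")
    return (release, pocket)


print(split_release_pocket("noble-updates"))  # → ('noble', 'Updates')
```

Note that `rsplit("-", 1)` splits on the last hyphen, so a hyphenated series name like `ubuntu-rtm` style strings still parse from the right.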
def require_utf8():
"""Can be called by programs that only function in UTF-8 locales"""
if locale.getpreferredencoding() != "UTF-8":
Logger.error("This program only functions in a UTF-8 locale. Aborting.")
sys.exit(1)
_vendor_to_distroinfo = {
"Debian": distro_info.DebianDistroInfo,
"Ubuntu": distro_info.UbuntuDistroInfo,
}
def vendor_to_distroinfo(vendor):
@ -199,14 +215,15 @@ def codename_to_distribution(codename):
if info().valid(codename):
return distro
return None
def verify_file_checksums(pathname, checksums=None, size=0):
"""verify checksums of file
Any failure will log an error.
pathname: str or Path
full path to file
checksums: dict
keys are alg name, values are expected checksum
@ -215,30 +232,33 @@ def verify_file_checksums(pathname, checksums={}, size=0):
Returns True if all checks pass, False otherwise
"""
if checksums is None:
checksums = {}
path = Path(pathname)
if not path.is_file():
Logger.error("File %s not found", path)
return False
filesize = path.stat().st_size
if size and size != filesize:
Logger.error("File %s incorrect size, got %s expected %s", path, filesize, size)
return False
for alg, checksum in checksums.items():
hash_ = hashlib.new(alg)
with path.open("rb") as f:
while True:
block = f.read(hash_.block_size)
if len(block) == 0:
break
hash_.update(block)
digest = hash_.hexdigest()
if digest == checksum:
Logger.debug("File %s checksum (%s) verified: %s", path, alg, checksum)
else:
Logger.error(
"File %s checksum (%s) mismatch: got %s expected %s", path, alg, digest, checksum
)
return False
return True
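The block-wise hashing loop can be exercised standalone; this sketch (our own helper, not the ubuntutools API) verifies one algorithm against a file written on the fly:

```python
import hashlib
import os
import tempfile


def file_checksum_ok(path, alg, expected):
    # hash the file block by block, as verify_file_checksums() does
    hasher = hashlib.new(alg)
    with open(path, "rb") as f:
        while True:
            block = f.read(hasher.block_size)
            if not block:
                break
            hasher.update(block)
    return hasher.hexdigest() == expected


with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(b"hello world\n")
try:
    expected = hashlib.sha256(b"hello world\n").hexdigest()
    print(file_checksum_ok(f.name, "sha256", expected))  # → True
finally:
    os.unlink(f.name)
```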
@ -246,7 +266,7 @@ def verify_file_checksums(pathname, checksums={}, size=0):
def verify_file_checksum(pathname, alg, checksum, size=0):
"""verify checksum of file
pathname: str or Path
full path to file
alg: str
name of checksum alg
@ -260,92 +280,210 @@ def verify_file_checksum(pathname, alg, checksum, size=0):
return verify_file_checksums(pathname, {alg: checksum}, size)
def extract_authentication(url):
"""Remove plaintext authentication data from a URL
If the URL has a username:password in its netloc, this removes it
and returns the remaining URL, along with the username and password
separately. If no authentication data is in the netloc, this just
returns the URL unchanged with None for the username and password.
This returns a tuple in the form (url, username, password)
"""
components = urlparse(url)
if components.username or components.password:
return (
components._replace(netloc=components.hostname).geturl(),
components.username,
components.password,
)
return (url, None, None)
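A standalone sketch of this netloc-splitting behavior; note that, like the function above, rebuilding the netloc from `hostname` drops any port component:

```python
from urllib.parse import urlparse


def extract_authentication(url):
    # split user:password out of the netloc, if present
    components = urlparse(url)
    if components.username or components.password:
        return (
            components._replace(netloc=components.hostname).geturl(),
            components.username,
            components.password,
        )
    return (url, None, None)


print(extract_authentication("https://user:secret@example.com/dists/"))
# → ('https://example.com/dists/', 'user', 'secret')
```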
def download(src, dst, size=0, *, blocksize=DOWNLOAD_BLOCKSIZE_DEFAULT):
"""download/copy a file/url to local file
src: str or Path
Source to copy from (file path or url)
dst: str or Path
Destination dir or filename
size: int
Size of source, if known
blocksize: int or None
Blocksize to use when downloading
If the URL contains authentication data in the URL 'netloc',
it will be stripped from the URL and passed to the requests library.
This may throw a DownloadError.
On success, this will return the dst as a Path object.
"""
src = str(src)
parsedsrc = urlparse(src)
dst = Path(dst).expanduser().resolve()
if dst.is_dir():
dst = dst / Path(parsedsrc.path).name
# Copy if src is a local file
if parsedsrc.scheme in ["", "file"]:
src = Path(parsedsrc.path).expanduser().resolve()
if src != parsedsrc.path:
Logger.info("Parsed %s as %s", parsedsrc.path, src)
if not src.exists():
raise NotFoundError(f"Source file {src} not found")
if dst.exists():
if src.samefile(dst):
Logger.info("Using existing file %s", dst)
return dst
Logger.info("Replacing existing file %s", dst)
Logger.info("Copying file %s to %s", src, dst)
shutil.copyfile(src, dst)
return dst
(src, username, password) = extract_authentication(src)
auth = (username, password) if username or password else None
with tempfile.TemporaryDirectory() as tmpdir:
tmpdst = Path(tmpdir) / "dst"
try:
# We must use "Accept-Encoding: identity" so that Launchpad doesn't
# compress changes files. See LP: #2025748.
with requests.get(
src, stream=True, timeout=60, auth=auth, headers={"accept-encoding": "identity"}
) as fsrc:
with tmpdst.open("wb") as fdst:
fsrc.raise_for_status()
_download(fsrc, fdst, size, blocksize=blocksize)
except requests.exceptions.HTTPError as error:
if error.response is not None and error.response.status_code == 404:
raise NotFoundError(f"URL {src} not found: {error}") from error
raise DownloadError(error) from error
except requests.exceptions.ConnectionError as error:
# This is likely a archive hostname that doesn't resolve, like 'ftpmaster.internal'
raise NotFoundError(f"URL {src} not found: {error}") from error
except requests.exceptions.RequestException as error:
raise DownloadError(error) from error
shutil.move(tmpdst, dst)
return dst
class _StderrProgressBar:
BAR_WIDTH_MIN = 40
BAR_WIDTH_DEFAULT = 60
def __init__(self, max_width):
self.full_width = min(max_width, self.BAR_WIDTH_DEFAULT)
self.width = self.full_width - len("[] 99%")
self.show_progress = self.full_width >= self.BAR_WIDTH_MIN
def update(self, progress, total):
if not self.show_progress:
return
pct = progress * 100 // total
pctstr = f"{pct:>3}%"
barlen = self.width * pct // 100
barstr = "=" * barlen
barstr = f"{barstr[:-1]}>"
barstr = barstr.ljust(self.width)
fullstr = f"\r[{barstr}]{pctstr}"
sys.stderr.write(fullstr)
sys.stderr.flush()
def finish(self):
if not self.show_progress:
return
sys.stderr.write("\n")
sys.stderr.flush()
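The bar-drawing arithmetic is easiest to see in isolation; this sketch (a simplified stand-in for the class above, with a hypothetical `width` default) renders one frame of the same `[===>   ] 42%` layout:

```python
def render_bar(progress, total, width=54):
    # width is the bar interior, i.e. the full width minus the "[] 99%" decoration
    pct = progress * 100 // total
    barstr = "=" * (width * pct // 100)
    barstr = f"{barstr[:-1]}>".ljust(width)
    return f"[{barstr}]{pct:>3}%"


print(render_bar(42, 100))
```

Carriage returns (`\r`) in the real class redraw the same terminal line on every update; here we just return the string.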
def _download(fsrc, fdst, size, *, blocksize):
"""helper method to download src to dst using requests library."""
url = fsrc.url
Logger.debug("Using URL: %s", url)
if not size:
with suppress(AttributeError, TypeError, ValueError):
size = int(fsrc.headers.get("Content-Length"))
parsed = urlparse(url)
filename = Path(parsed.path).name
hostname = parsed.hostname
sizemb = f" ({size / 1024.0 / 1024:0.3f} MiB)" if size else ""
Logger.info("Downloading %s from %s%s", filename, hostname, sizemb)
# Don't show progress if:
# logging INFO is suppressed
# stderr isn't a tty
# we don't know the total file size
# the file is content-encoded (i.e. compressed)
show_progress = all(
(
Logger.isEnabledFor(logging.INFO),
sys.stderr.isatty(),
size > 0,
"Content-Encoding" not in fsrc.headers,
)
)
terminal_width = 0
if show_progress:
try:
terminal_width = os.get_terminal_size(sys.stderr.fileno()).columns
except Exception as e: # pylint: disable=broad-except
Logger.error("Error finding stderr width, suppressing progress bar: %s", e)
progress_bar = _StderrProgressBar(max_width=terminal_width)
downloaded = 0
try:
while True:
# We use fsrc.raw so that compressed files stay compressed as we
# write them to disk. For example, if this is a .diff.gz, then it
# needs to remain compressed and unmodified to remain valid as part
# of a source package later, even though Launchpad sends
# "Content-Encoding: gzip" and the requests library therefore would
# want to decompress it. See LP: #2025748.
block = fsrc.raw.read(blocksize)
if not block:
break
fdst.write(block)
downloaded += len(block)
progress_bar.update(downloaded, size)
finally:
progress_bar.finish()
if size and size > downloaded:
Logger.error(
"Partial download: %0.3f MiB of %0.3f MiB",
downloaded / 1024.0 / 1024,
size / 1024.0 / 1024,
)
def _download_text(src, binary, *, blocksize):
with tempfile.TemporaryDirectory() as tmpdir:
dst = Path(tmpdir) / "dst"
download(src, dst, blocksize=blocksize)
return dst.read_bytes() if binary else dst.read_text()
def download_text(src, mode=None, *, blocksize=DOWNLOAD_BLOCKSIZE_DEFAULT):
"""Return the text content of a downloaded file
src: str or Path
Source to copy from (file path or url)
mode: str
Deprecated, ignored unless a string that contains 'b'
blocksize: int or None
Blocksize to use when downloading
Raises the same exceptions as download()
Returns text content of downloaded file
"""
return _download_text(src, binary="b" in (mode or ""), blocksize=blocksize)
def download_bytes(src, *, blocksize=DOWNLOAD_BLOCKSIZE_DEFAULT):
"""Same as download_text() but returns bytes"""
return _download_text(src, binary=True, blocksize=blocksize)
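The download-into-a-throwaway-directory pattern used here can be sketched with a plain local copy standing in for `download()` (so the sketch needs no network; `read_via_tempdir` is our name, not the library's):

```python
import os
import shutil
import tempfile
from pathlib import Path


def read_via_tempdir(src, binary=False):
    # download/copy into a throwaway dir, then slurp the result;
    # the dir (and the temp copy) is removed when the block exits
    with tempfile.TemporaryDirectory() as tmpdir:
        dst = Path(tmpdir) / "dst"
        shutil.copyfile(src, dst)
        return dst.read_bytes() if binary else dst.read_text()


with tempfile.NamedTemporaryFile("w", suffix=".txt", delete=False) as f:
    f.write("pulled content\n")
try:
    print(read_via_tempdir(f.name))  # → pulled content
finally:
    os.unlink(f.name)
```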


@ -22,53 +22,58 @@
# ##################################################################
import errno
import logging
import os
import re
import subprocess
import sys
from argparse import ArgumentParser
from urllib.parse import urlparse
from distro_info import DebianDistroInfo
# by default we use standard logging.getLogger() and only use
# ubuntutools.getLogger() in PullPkg().main()
from ubuntutools import getLogger as ubuntutools_getLogger
from ubuntutools.archive import (
DebianSourcePackage,
PersonalPackageArchiveSourcePackage,
UbuntuCloudArchiveSourcePackage,
UbuntuSourcePackage,
)
from ubuntutools.config import UDTConfig
from ubuntutools.lp.lpapicache import Distribution, Launchpad
from ubuntutools.lp.udtexceptions import (
AlreadyLoggedInError,
InvalidDistroValueError,
PackageNotFoundException,
PocketDoesNotExistError,
SeriesNotFoundException,
)
from ubuntutools.misc import (
STATUSES,
UPLOAD_QUEUE_STATUSES,
download,
host_architecture,
split_release_pocket,
)
Logger = logging.getLogger(__name__)
PULL_SOURCE = "source"
PULL_DEBS = "debs"
PULL_DDEBS = "ddebs"
PULL_UDEBS = "udebs"
PULL_LIST = "list"
VALID_PULLS = [PULL_SOURCE, PULL_DEBS, PULL_DDEBS, PULL_UDEBS, PULL_LIST]
VALID_BINARY_PULLS = [PULL_DEBS, PULL_DDEBS, PULL_UDEBS]
DISTRO_DEBIAN = "debian"
DISTRO_UBUNTU = "ubuntu"
DISTRO_UCA = "uca"
DISTRO_PPA = "ppa"
DISTRO_PKG_CLASS = {
DISTRO_DEBIAN: DebianSourcePackage,
@ -81,11 +86,11 @@ VALID_DISTROS = DISTRO_PKG_CLASS.keys()
class InvalidPullValueError(ValueError):
"""Thrown when --pull value is invalid"""
class PullPkg:
"""Class used to pull file(s) associated with a specific package"""
@classmethod
def main(cls, *args, **kwargs):
"""For use by stand-alone cmdline scripts.
@ -100,59 +105,74 @@ class PullPkg(object):
unexpected errors will flow up to the caller.
On success, this simply returns.
"""
logger = ubuntutools_getLogger()
try:
cls(*args, **kwargs).pull()
return
except KeyboardInterrupt:
logger.info("User abort.")
except (
PackageNotFoundException,
SeriesNotFoundException,
PocketDoesNotExistError,
InvalidDistroValueError,
InvalidPullValueError,
) as error:
logger.error(str(error))
sys.exit(errno.ENOENT)
def __init__(self, *args, **kwargs):  # pylint: disable=unused-argument
self._default_pull = kwargs.get("pull")
self._default_distro = kwargs.get("distro")
self._default_arch = kwargs.get("arch", host_architecture())
def parse_args(self, args):
if args is None:
args = sys.argv[1:]
args = args[:]
help_default_pull = "What to pull: " + ", ".join(VALID_PULLS)
if self._default_pull:
help_default_pull += f" (default: {self._default_pull})"
help_default_distro = "Pull from: " + ", ".join(VALID_DISTROS)
if self._default_distro:
help_default_distro += f" (default: {self._default_distro})"
help_default_arch = "Get binary packages for arch"
help_default_arch += f" (default: {self._default_arch})"
# use add_help=False because we do parse_known_args() below, and if
# that sees --help then it exits immediately
parser = ArgumentParser(add_help=False)
parser.add_argument("-L", "--login", action="store_true", help="Login to Launchpad")
parser.add_argument(
"-v", "--verbose", action="count", default=0, help="Increase verbosity/debug"
)
parser.add_argument(
"-d", "--download-only", action="store_true", help="Do not extract the source package"
)
parser.add_argument("-m", "--mirror", action="append", help="Preferred mirror(s)")
parser.add_argument(
"--no-conf",
action="store_true",
help="Don't read config files or environment variables",
)
parser.add_argument(
"--no-verify-signature",
action="store_true",
help="Don't fail if dsc signature can't be verified",
)
parser.add_argument(
"-s",
"--status",
action="append",
default=[],
help="Search for packages with specific status(es)",
)
parser.add_argument("-a", "--arch", default=self._default_arch, help=help_default_arch)
parser.add_argument("-p", "--pull", default=self._default_pull, help=help_default_pull)
parser.add_argument(
"-D", "--distro", default=self._default_distro, help=help_default_distro
)
# add distro-specific params
try:
@ -162,75 +182,84 @@ class PullPkg(object):
distro = None
if distro == DISTRO_UBUNTU:
parser.add_argument(
"--security",
action="store_true",
help="Pull from the Ubuntu Security Team (proposed) PPA",
)
parser.add_argument(
"--upload-queue", action="store_true", help="Pull from the Ubuntu upload queue"
)
if distro == DISTRO_PPA:
parser.add_argument("--ppa", help="PPA to pull from")
if parser.parse_known_args(args)[0].ppa is None:
# check for any param starting with "ppa:"
# if found, move it to a --ppa param
for param in args:
if param.startswith("ppa:"):
args.remove(param)
args.insert(0, param)
args.insert(0, "--ppa")
break
# add the positional params
parser.add_argument("package", help="Package name to pull")
parser.add_argument("release", nargs="?", help="Release to pull from")
parser.add_argument("version", nargs="?", help="Package version to pull")
epilog = (
"Note on --status: if a version is provided, all status types "
"will be searched; if no version is provided, by default only "
"'Pending' and 'Published' status will be searched."
)
# since parser has no --help handler, create a new parser that does
newparser = ArgumentParser(parents=[parser], epilog=epilog)
return self.parse_options(vars(newparser.parse_args(args)))
@staticmethod
def parse_pull(pull):
if not pull:
raise InvalidPullValueError("Must specify --pull")
# allow 'dbgsym' as alias for 'ddebs'
if pull == "dbgsym":
Logger.debug("Pulling '%s' for '%s'", PULL_DDEBS, pull)
pull = PULL_DDEBS
# assume anything starting with 'bin' means 'debs'
if str(pull).startswith("bin"):
Logger.debug("Pulling '%s' for '%s'", PULL_DEBS, pull)
pull = PULL_DEBS
# verify pull action is valid
if pull not in VALID_PULLS:
raise InvalidPullValueError(f"Invalid pull action '{pull}'")
return pull
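The alias handling reads cleanly as a standalone function; a sketch with `ValueError` standing in for `InvalidPullValueError`:

```python
PULL_DEBS, PULL_DDEBS = "debs", "ddebs"
VALID_PULLS = ["source", PULL_DEBS, PULL_DDEBS, "udebs", "list"]


def parse_pull(pull):
    if not pull:
        raise ValueError("Must specify --pull")
    if pull == "dbgsym":  # 'dbgsym' is an alias for ddebs
        pull = PULL_DDEBS
    if str(pull).startswith("bin"):  # anything 'bin*' means debs
        pull = PULL_DEBS
    if pull not in VALID_PULLS:
        raise ValueError(f"Invalid pull action '{pull}'")
    return pull


print(parse_pull("dbgsym"), parse_pull("binaries"))  # → ddebs debs
```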
@staticmethod
def parse_distro(distro):
if not distro:
raise InvalidDistroValueError("Must specify --distro")
distro = distro.lower()
# allow 'lp' for 'ubuntu'
if distro == "lp":
Logger.debug("Using distro '%s' for '%s'", DISTRO_UBUNTU, distro)
distro = DISTRO_UBUNTU
# assume anything with 'cloud' is UCA
if re.match(r".*cloud.*", distro):
Logger.debug("Using distro '%s' for '%s'", DISTRO_UCA, distro)
distro = DISTRO_UCA
# verify distro is valid
if distro not in VALID_DISTROS:
raise InvalidDistroValueError(f"Invalid distro '{distro}'")
return distro
@staticmethod
def parse_release(distro, release):
if distro == DISTRO_UCA:
return UbuntuCloudArchiveSourcePackage.parseReleaseAndPocket(release)
@ -248,15 +277,14 @@ class PullPkg(object):
if distro == DISTRO_PPA:
# PPAs are part of Ubuntu distribution
distribution = Distribution(DISTRO_UBUNTU)
else:
distribution = Distribution(distro)
# let SeriesNotFoundException flow up
distribution.getSeries(release)
Logger.debug("Using distro '%s' release '%s' pocket '%s'", distro, release, pocket)
return (release, pocket)
def parse_release_and_version(self, distro, release, version, try_swap=True):
@ -280,151 +308,196 @@ class PullPkg(object):
# they should all be provided, though the optional ones may be None
# type bool
assert "verbose" in options
assert "download_only" in options
assert "no_conf" in options
assert "no_verify_signature" in options
assert "status" in options
# type string
assert "pull" in options
assert "distro" in options
assert "arch" in options
assert "package" in options
# type string, optional
assert "release" in options
assert "version" in options
# type list of strings, optional
assert "mirror" in options
options["pull"] = self.parse_pull(options["pull"])
options["distro"] = self.parse_distro(options["distro"])
# ensure these are always included so we can just check for None/False later
options["ppa"] = options.get("ppa", None)
options["security"] = options.get("security", False)
options["upload_queue"] = options.get("upload_queue", False)
return options
def _get_params(self, options):
distro = options['distro']
pull = options['pull']
distro = options["distro"]
pull = options["pull"]
params = {}
params['package'] = options['package']
params["package"] = options["package"]
params["arch"] = options["arch"]
if options['release']:
(r, v, p) = self.parse_release_and_version(distro, options['release'],
options['version'])
params['series'] = r
params['version'] = v
params['pocket'] = p
if options["release"]:
(release, version, pocket) = self.parse_release_and_version(
distro, options["release"], options["version"]
)
params["series"] = release
params["version"] = version
params["pocket"] = pocket
if (params['package'].endswith('.dsc') and not params['series'] and not params['version']):
params['dscfile'] = params['package']
params.pop('package')
if params["package"].endswith(".dsc") and not params["series"] and not params["version"]:
params["dscfile"] = params["package"]
params.pop("package")
if options['security']:
if options['ppa']:
Logger.warning('Both --security and --ppa specified, ignoring --ppa')
Logger.debug('Checking Ubuntu Security PPA')
if options["security"]:
if options["ppa"]:
Logger.warning("Both --security and --ppa specified, ignoring --ppa")
Logger.debug("Checking Ubuntu Security PPA")
# --security is just a shortcut for --ppa ppa:ubuntu-security-proposed/ppa
options['ppa'] = 'ubuntu-security-proposed/ppa'
options["ppa"] = "ubuntu-security-proposed/ppa"
if options['ppa']:
if options['ppa'].startswith('ppa:'):
params['ppa'] = options['ppa'][4:]
if options["ppa"]:
if options["ppa"].startswith("ppa:"):
params["ppa"] = options["ppa"][4:]
else:
params['ppa'] = options['ppa']
params["ppa"] = options["ppa"]
elif distro == DISTRO_PPA:
raise ValueError('Must specify PPA to pull from')
raise ValueError("Must specify PPA to pull from")
mirrors = []
if options['mirror']:
mirrors.extend(options['mirror'])
if options["mirror"]:
mirrors.extend(options["mirror"])
if pull == PULL_DDEBS:
config = UDTConfig(options['no_conf'])
ddebs_mirror = config.get_value(distro.upper() + '_DDEBS_MIRROR')
config = UDTConfig(options["no_conf"])
ddebs_mirror = config.get_value(distro.upper() + "_DDEBS_MIRROR")
if ddebs_mirror:
mirrors.append(ddebs_mirror)
if mirrors:
Logger.debug("using mirrors %s", ", ".join(mirrors))
params['mirrors'] = mirrors
params["mirrors"] = mirrors
params['verify_signature'] = not options['no_verify_signature']
params["verify_signature"] = not options["no_verify_signature"]
params["status"] = STATUSES if "all" in options["status"] else options["status"]
# special handling for upload queue
if options["upload_queue"]:
if len(options["status"]) > 1:
raise ValueError(
"Too many --status provided, can only search for a single status or 'all'"
)
if not options["status"]:
params["status"] = None
elif options["status"][0].lower() == "all":
params["status"] = "all"
elif options["status"][0].capitalize() in UPLOAD_QUEUE_STATUSES:
params["status"] = options["status"][0].capitalize()
else:
raise ValueError(
f"Invalid upload queue status '{options['status'][0]}':"
f" valid values are {', '.join(UPLOAD_QUEUE_STATUSES)}"
)
return params
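The `.dsc` handling at the top of `_get_params` can be sketched in isolation. `normalize_params` below is a hypothetical helper, not part of pullpkg, showing how the `package` argument is reinterpreted as an explicit `dscfile` when it names a `.dsc` and no series/version narrows the search:

```python
# Hypothetical helper (not in pullpkg) sketching the parameter shuffle:
# when "package" is really a path or URL to a .dsc file and neither a
# series nor a version was given, it becomes an explicit dscfile.
def normalize_params(params):
    params = dict(params)  # leave the caller's mapping untouched
    if (params.get("package", "").endswith(".dsc")
            and not params.get("series")
            and not params.get("version")):
        params["dscfile"] = params.pop("package")
    return params
```

The copy-then-pop keeps the helper side-effect free, unlike the in-place `params.pop("package")` in the method above.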
def pull(self, args=None):
"""Pull (download) specified package file(s)"""
options = self.parse_args(args)
if options["verbose"]:
Logger.setLevel(logging.DEBUG)
if options["verbose"] > 1:
logging.getLogger(__package__).setLevel(logging.DEBUG)
Logger.debug("pullpkg options: %s", options)
pull = options["pull"]
distro = options["distro"]
if options["login"]:
Logger.debug("Logging in to Launchpad:")
try:
Launchpad.login()
except AlreadyLoggedInError:
Logger.error(
"Launchpad singleton has already performed a login, "
"and its design prevents another login"
)
Logger.warning("Continuing anyway, with existing Launchpad instance")
params = self._get_params(options)
package = params["package"]
if options["upload_queue"]:
# upload queue API is different/simpler
self.pull_upload_queue( # pylint: disable=missing-kwoa
pull, arch=options["arch"], download_only=options["download_only"], **params
)
return
# call implementation, and allow exceptions to flow up to caller
srcpkg = DISTRO_PKG_CLASS[distro](**params)
spph = srcpkg.lp_spph
Logger.info("Found %s", spph.display_name)
# The VCS detection logic was modeled after `apt source`
for key in srcpkg.dsc.keys():
original_key = key
key = key.lower()
if key.startswith("vcs-"):
if key == "vcs-browser":
continue
if key == "vcs-git":
vcs = "Git"
elif key == "vcs-bzr":
vcs = "Bazaar"
else:
continue
uri = srcpkg.dsc[original_key]
Logger.warning(
"\nNOTICE: '%s' packaging is maintained in "
"the '%s' version control system at:\n %s\n",
package,
vcs,
uri,
)
if vcs == "Bazaar":
vcscmd = " $ bzr branch " + uri
elif vcs == "Git":
vcscmd = " $ git clone " + uri
if vcscmd:
Logger.info(
"Please use:\n%s\n"
"to retrieve the latest (possibly unreleased) updates to the package.\n",
vcscmd,
)
if pull == PULL_LIST:
Logger.info("Source files:")
for f in srcpkg.dsc["Files"]:
Logger.info(" %s", f["name"])
Logger.info("Binary files:")
for f in spph.getBinaries(options["arch"]):
archtext = ""
name = f.getFileName()
if name.rpartition(".")[0].endswith("all"):
archtext = f" ({f.arch})"
Logger.info(" %s%s", name, archtext)
elif pull == PULL_SOURCE:
# allow DownloadError to flow up to caller
srcpkg.pull()
if options["download_only"]:
Logger.debug("--download-only specified, not extracting")
else:
srcpkg.unpack()
@@ -432,106 +505,116 @@ class PullPkg(object):
name = None
if package != spph.getPackageName():
Logger.info("Pulling only binary package '%s'", package)
Logger.info(
"Use package name '%s' to pull all binary packages", spph.getPackageName()
)
name = package
# e.g. 'debs' -> 'deb'
ext = pull.rstrip("s")
if distro == DISTRO_DEBIAN:
# Debian ddebs don't use .ddeb extension, unfortunately :(
if pull in [PULL_DEBS, PULL_DDEBS]:
name = name or ".*"
ext = "deb"
if pull == PULL_DEBS:
name += r"(?<!-dbgsym)$"
if pull == PULL_DDEBS:
name += r"-dbgsym$"
# allow DownloadError to flow up to caller
total = srcpkg.pull_binaries(name=name, ext=ext, arch=options["arch"])
if total < 1:
Logger.error("No %s found for %s %s", pull, package, spph.getVersion())
else:
Logger.error("Internal error: invalid pull value after parse_pull()")
raise InvalidPullValueError(f"Invalid pull value '{pull}'")
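On Debian, debug symbol packages use the plain `.deb` extension, so the code above separates debs from ddebs purely by the `-dbgsym` name suffix, using a negative lookbehind for the deb case. A minimal sketch of that filtering (the package names here are invented examples):

```python
import re

# Plain debs: the name must NOT end in -dbgsym (negative lookbehind
# anchored at end-of-string, as built by the pull logic above).
DEBS = re.compile(r".*(?<!-dbgsym)$")
# Debug symbol packages: the name MUST end in -dbgsym.
DDEBS = re.compile(r".*-dbgsym$")

names = ["hello", "hello-dbgsym", "libfoo1", "libfoo1-dbgsym"]
debs = [n for n in names if DEBS.match(n)]
ddebs = [n for n in names if DDEBS.match(n)]
```

The lookbehind matters because backtracking alone cannot save a `-dbgsym` name: `$` pins the match to the end of the string, where the lookbehind then rejects the suffix.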
def pull_upload_queue(
self,
pull,
*,
package,
version=None,
arch=None,
series=None,
pocket=None,
status=None,
download_only=None,
**kwargs,
): # pylint: disable=no-self-use,unused-argument
if not series:
Logger.error("Using --upload-queue requires specifying series")
return
series = Distribution("ubuntu").getSeries(series)
queueparams = {"name": package}
if pocket:
queueparams["pocket"] = pocket
if status == "all":
queueparams["status"] = None
queuetype = "any"
elif status:
queueparams["status"] = status
queuetype = status
else:
queuetype = "Unapproved"
packages = [
p
for p in series.getPackageUploads(**queueparams)
if p.package_version == version or str(p.id) == version or not version
]
if pull == PULL_SOURCE:
packages = [p for p in packages if p.contains_source]
elif pull in VALID_BINARY_PULLS:
packages = [
p
for p in packages
if p.contains_build
and (arch in ["all", "any"] or arch in p.display_arches.replace(",", "").split())
]
if not packages:
msg = f"Package {package} not found in {queuetype} upload queue for {series.name}"
if version:
msg += f" with version/id {version}"
if pull in VALID_BINARY_PULLS:
msg += f" for arch {arch}"
raise PackageNotFoundException(msg)
if pull == PULL_LIST:
for pkg in packages:
msg = f"Found {pkg.package_name} {pkg.package_version} (ID {pkg.id})"
if pkg.display_arches:
msg += f" arch {pkg.display_arches}"
Logger.info(msg)
url = pkg.changesFileUrl()
if url:
Logger.info("Changes file:")
Logger.info(" %s", url)
else:
Logger.info("No changes file")
urls = pkg.sourceFileUrls()
if urls:
Logger.info("Source files:")
for url in urls:
Logger.info(" %s", url)
else:
Logger.info("No source files")
urls = pkg.binaryFileUrls()
if urls:
Logger.info("Binary files:")
for url in urls:
Logger.info(" %s", url)
Logger.info(" { %s }", pkg.binaryFileProperties(url))
else:
Logger.info("No binary files")
urls = pkg.customFileUrls()
if urls:
Logger.info("Custom files:")
for url in urls:
@@ -541,38 +624,58 @@ class PullPkg(object):
if len(packages) > 1:
msg = "Found multiple packages"
if version:
msg += f" with version {version}, please specify the ID instead"
else:
msg += ", please specify the version"
Logger.error("Available package versions/ids are:")
for pkg in packages:
Logger.error("%s %s (id %s)", pkg.package_name, pkg.package_version, pkg.id)
raise PackageNotFoundException(msg)
pkg = packages[0]
urls = set(pkg.customFileUrls())
if pkg.changesFileUrl():
urls.add(pkg.changesFileUrl())
if pull == PULL_SOURCE:
urls |= set(pkg.sourceFileUrls())
if not urls:
Logger.error("No source files to download")
dscfile = None
for url in urls:
dst = download(url, os.getcwd())
if dst.name.endswith(".dsc"):
dscfile = dst
if download_only:
Logger.debug("--download-only specified, not extracting")
elif not dscfile:
Logger.error("No source dsc file found, cannot extract")
else:
cmd = ["dpkg-source", "-x", dscfile.name]
Logger.debug(" ".join(cmd))
result = subprocess.run(
cmd,
check=False,
encoding="utf-8",
stdout=subprocess.PIPE,
stderr=subprocess.STDOUT,
)
if result.returncode != 0:
Logger.error("Source unpack failed.")
Logger.debug(result.stdout)
else:
name = ".*"
if pull == PULL_DEBS:
name = rf"{name}(?<!-di)(?<!-dbgsym)$"
elif pull == PULL_DDEBS:
name += "-dbgsym$"
elif pull == PULL_UDEBS:
name += "-di$"
else:
raise InvalidPullValueError(f"Invalid pull value {pull}")
urls |= set(pkg.binaryFileUrls())
if not urls:
Logger.error("No binary files to download")
for url in urls:


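The version/ID matching rule inside `pull_upload_queue()` above (`p.package_version == version or str(p.id) == version or not version`) can be sketched on its own. `Upload` and `matches` below are hypothetical stand-ins, not Launchpad API objects:

```python
from collections import namedtuple

# Hypothetical stand-in for a Launchpad queue item; only the two
# attributes the filter inspects are modeled here.
Upload = namedtuple("Upload", ["package_version", "id"])

def matches(upload, version):
    # The --version argument may hold a package version OR a queue
    # item ID, and an unset value matches every upload.
    return (not version
            or upload.package_version == version
            or str(upload.id) == version)

queue = [Upload("2.38-3", 101), Upload("2.39-1", 102)]
```

Accepting the ID as an alternative is what lets the "Found multiple packages … please specify the ID instead" path above offer an unambiguous retry.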
@@ -16,14 +16,14 @@
# ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF
# OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE.
import os
import re
import subprocess
import sys
import tempfile
class Question:
def __init__(self, options, show_help=True):
assert len(options) >= 2
self.options = [s.lower() for s in options]
@@ -31,9 +31,9 @@ class Question(object):
def get_options(self):
if len(self.options) == 2:
options = f"{self.options[0]} or {self.options[1]}"
else:
options = f"{', '.join(self.options[:-1])}, or {self.options[-1]}"
return options
def ask(self, question, default=None):
@@ -57,7 +57,7 @@ class Question(object):
try:
selected = input(question).strip().lower()
except (EOFError, KeyboardInterrupt):
print("\nAborting as requested.")
sys.exit(1)
if selected == "":
selected = default
@@ -67,7 +67,7 @@ class Question(object):
if selected == option[0]:
selected = option
if selected not in self.options:
print(f"Please answer the question with {self.get_options()}.")
return selected
@@ -78,15 +78,15 @@ class YesNoQuestion(Question):
def input_number(question, min_number, max_number, default=None):
if default:
question += f" [{default}]? "
else:
question += "? "
selected = None
while not selected or selected < min_number or selected > max_number:
try:
selected = input(question).strip()
except (EOFError, KeyboardInterrupt):
print("\nAborting as requested.")
sys.exit(1)
if default and selected == "":
selected = default
@@ -94,40 +94,40 @@ def input_number(question, min_number, max_number, default=None):
try:
selected = int(selected)
if selected < min_number or selected > max_number:
print(f"Please input a number between {min_number} and {max_number}.")
except ValueError:
print("Please input a number.")
assert isinstance(selected, int)
return selected
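The rewritten loop guard above (`while not selected or selected < min_number …`) matters because `selected` starts out as `None`, and Python 3 refuses to order-compare `None` with an `int`. A sketch of the guarded predicate (the function name is invented for illustration):

```python
# Sketch of the input_number loop predicate: short-circuit the
# None/empty sentinel first, because "None < min_number" would raise
# TypeError in Python 3 on the very first iteration.
def needs_reprompt(selected, min_number, max_number):
    return not selected or selected < min_number or selected > max_number
```

Under Python 2 the original `while selected < min_number …` happened to work, since `None` compared as less than any integer; the extra `not selected` clause restores that behavior portably.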
def confirmation_prompt(message=None, action=None):
"""Display message, or a stock message including action, and wait for the
user to press Enter
"""
if message is None:
if action is None:
action = "continue"
message = f"Press [Enter] to {action}. Press [Ctrl-C] to abort now."
try:
input(message)
except (EOFError, KeyboardInterrupt):
print("\nAborting as requested.")
sys.exit(1)
class EditFile:
def __init__(self, filename, description, placeholders=None):
self.filename = filename
self.description = description
if placeholders is None:
placeholders = (re.compile(r"^>>>.*<<<$", re.UNICODE),)
self.placeholders = placeholders
def edit(self, optional=False):
if optional:
print(f"\n\nCurrently the {self.description} looks like:")
with open(self.filename, "r", encoding="utf-8") as f:
print(f.read())
if YesNoQuestion().ask("Edit", "no") == "no":
return
@@ -135,68 +135,65 @@ class EditFile(object):
done = False
while not done:
old_mtime = os.stat(self.filename).st_mtime
subprocess.check_call(["sensible-editor", self.filename])
modified = old_mtime != os.stat(self.filename).st_mtime
placeholders_present = False
if self.placeholders:
with open(self.filename, "r", encoding="utf-8") as f:
for line in f:
for placeholder in self.placeholders:
if placeholder.search(line.strip()):
placeholders_present = True
if placeholders_present:
print(
f"Placeholders still present in the {self.description}. "
f"Please replace them with useful information."
)
confirmation_prompt(action="edit again")
elif not modified:
print(f"The {self.description} was not modified")
if YesNoQuestion().ask("Edit again", "yes") == "no":
done = True
elif self.check_edit():
done = True
def check_edit(self): # pylint: disable=no-self-use
"""Override this to implement extra checks on the edited report.
Should return False if another round of editing is needed,
and should prompt the user to confirm that, if necessary.
"""
return True
class EditBugReport(EditFile):
split_re = re.compile(r"^Summary.*?:\s+(.*?)\s+Description:\s+(.*)$", re.DOTALL | re.UNICODE)
def __init__(self, subject, body, placeholders=None):
prefix = f"{os.path.basename(sys.argv[0])}_"
tmpfile = tempfile.NamedTemporaryFile(prefix=prefix, suffix=".txt", delete=False)
tmpfile.write((f"Summary (one line):\n{subject}\n\nDescription:\n{body}").encode("utf-8"))
tmpfile.close()
super().__init__(tmpfile.name, "bug report", placeholders)
def check_edit(self):
with open(self.filename, "r", encoding="utf-8") as f:
report = f.read()
if self.split_re.match(report) is None:
print(
f"The {self.description} doesn't start with 'Summary:' and 'Description:' blocks"
)
confirmation_prompt("edit again")
return False
return True
def get_report(self):
with open(self.filename, "r", encoding="utf-8") as f:
report = f.read()
match = self.split_re.match(report)
title = match.group(1).replace("\n", " ")
report = (title, match.group(2))
os.unlink(self.filename)
return report


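`query_rdepends()` below builds its endpoint with `os.path.join`, which happens to produce a valid URL on POSIX hosts but would use `\` on Windows. A hypothetical, portable sketch of the same URL layout (the helper name is invented; the path scheme mirrors the call below):

```python
import posixpath

# Hypothetical helper mirroring the URL layout query_rdepends() uses:
# <server>/v1/<release>/<arch>/<package>.
def rdepends_url(package, release, arch, server="http://qa.ubuntuwire.org/rdepends"):
    # posixpath.join always joins with "/", regardless of host OS,
    # which is what a URL needs.
    return posixpath.join(server, "v1", release, arch, package)
```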
@@ -22,13 +22,12 @@ class RDependsException(Exception):
pass
def query_rdepends(package, release, arch, server="http://qa.ubuntuwire.org/rdepends"):
    """Look up a package's reverse-dependencies on the Ubuntuwire
Reverse- webservice
"""
url = os.path.join(server, "v1", release, arch, package)
response, data = httplib2.Http().request(url)
if response.status != 200:
