Compare commits

...

67 Commits
0.203 ... main

Author SHA1 Message Date
Florent 'Skia' Jacquet
7f5e9c8680 pm-helper: make use of YesNoQuestion 2025-12-03 16:46:51 +01:00
Benjamin Drung
d35268b797 Release ubuntu-dev-tools 0.208
Signed-off-by: Benjamin Drung <benjamin.drung@canonical.com>
2025-12-03 16:33:51 +01:00
Benjamin Drung
bf9ead2204 requestsync: support pocket parameter in get_ubuntu_srcpkg
The command `requestsync --email -d sid <package> <target>` fails with
the following stacktrace:

```
Traceback (most recent call last):
  File "/usr/bin/requestsync", line 402, in <module>
    main()
  File "/usr/bin/requestsync", line 225, in main
    ubuntu_srcpkg = get_ubuntu_srcpkg(srcpkg, args.release, "Proposed")
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
TypeError: get_ubuntu_srcpkg() takes 2 positional arguments but 3 were given
```

LP: #2115990
Fixes: 5eb960dd3fe57daa16d8cee8cefee035cebb8e5d
2025-12-03 16:25:28 +01:00
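A minimal, self-contained illustration of the failure and of the kind of signature change the commit title describes (the `_before`/`_after` names and the default value are placeholders; the real helper's body is not shown in this view):

```python
# Illustrative only: why the third positional argument raises TypeError,
# and how a defaulted ``pocket`` parameter accommodates the call site.
def get_ubuntu_srcpkg_before(package, release):
    return package, release


def get_ubuntu_srcpkg_after(package, release, pocket="Release"):
    return package, release, pocket


try:
    get_ubuntu_srcpkg_before("hello", "resolute", "Proposed")
except TypeError as exc:
    print(exc)  # ... takes 2 positional arguments but 3 were given

print(get_ubuntu_srcpkg_after("hello", "resolute", "Proposed"))
```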
Benjamin Drung
38988ed183 Modernize SourcePackage._run_lintian() 2025-12-03 15:43:23 +01:00
Benjamin Drung
5e2f94cdb4 SourcePackage: introduce package_and_version
Introduce `package_and_version` to avoid code duplication.
2025-12-03 15:34:17 +01:00
Benjamin Drung
29914382cf ubuntu-build: introduce parse_args helper function
Move the argument parsing code into a separate `parse_args` function to
make the `main` function a little bit smaller.

The `IndexError` on accessing `args.packages` cannot happen, because the
argument count is already checked beforehand.
2025-12-03 15:21:27 +01:00
Benjamin Drung
addeb4f7fb sponsor-patch: stop checking for bzr being present 2025-12-03 15:10:17 +01:00
Benjamin Drung
ee87f312bf run mypy during package build 2025-12-03 14:54:54 +01:00
Benjamin Drung
32530e356d run-linters: avoid searching scripts in build/
During package build the Python code is copied to the `build` directory.
Ignore this directory when searching for Python scripts.
2025-12-03 14:51:51 +01:00
Benjamin Drung
38ef3c506e Run wrap-and-sort -ast 2025-12-03 14:40:47 +01:00
Benjamin Drung
5fc7e15f96 Fix type annotation for resource_type
mypy complains:

```
ubuntutools/lp/lpapicache.py:143: error: Incompatible types in assignment (expression has type "None", variable has type "str")  [assignment]
ubuntutools/lp/lpapicache.py:1328: error: Incompatible types in assignment (expression has type "tuple[str, str]", base class "BaseWrapper" defined the type as "str")  [assignment]
```
2025-12-03 14:37:33 +01:00
Benjamin Drung
f0326592bd add type annotation for mypy
mypy complains:

```
ubuntutools/config.py:53: error: Need type annotation for "config" (hint: "config: dict[<type>, <type>] = ...")  [var-annotated]
ubuntutools/lp/lpapicache.py:1504: error: Need type annotation for "_source_sets" (hint: "_source_sets: dict[<type>, <type>] = ...")  [var-annotated]
```
2025-12-03 14:36:49 +01:00
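The hints in the mypy output spell out the fix: give the empty containers an explicit annotation. The value types below are assumptions, since the compare view does not show the real ones:

```python
# Illustrative annotations following mypy's "var-annotated" hints;
# the concrete key/value types in config.py and lpapicache.py may differ.
config: dict[str, str] = {}
_source_sets: dict[str, object] = {}
```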
Benjamin Drung
aa439fec02 add missing return type annotation 2025-12-03 14:35:04 +01:00
Benjamin Drung
4bcfa0dd5a .pylintrc: ignore too-many-positional-arguments for now 2025-12-03 14:17:06 +01:00
Benjamin Drung
63b3d54264 Drop obsolete Rules-Requires-Root: no 2025-12-03 14:15:06 +01:00
Benjamin Drung
654af1a613 Drop Lintian overrides related to .pyc files
.pyc files should not be included in the source tarball.
2025-12-03 14:14:20 +01:00
Benjamin Drung
524f590af2 Run linters that can detect real errors on package build 2025-12-03 14:11:36 +01:00
Benjamin Drung
4a2f194860 run-linters: add --errors-only mode
Add an `--errors-only` mode that runs only the linters that can detect real
errors, and runs all the other linters while ignoring their results.
2025-12-03 14:10:11 +01:00
Benjamin Drung
7b9aee4c0c run-linters: introduce helper functions 2025-12-03 14:07:56 +01:00
Benjamin Drung
45d317cc87 mark non-returning functions with typing.NoReturn
To help pylint, mark non-returning functions with `typing.NoReturn`.
2025-12-03 14:01:18 +01:00
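The resulting pattern, as also visible in the `error()` and `error_out()` hunks further down this page (the stdlib logger stands in for the project's own `getLogger`):

```python
import sys
from logging import getLogger
from typing import Any, NoReturn

Logger = getLogger()


def error(msg: str, *args: Any) -> NoReturn:
    # Declaring NoReturn tells pylint and mypy that code following a call
    # to error() is unreachable, because the function always exits.
    Logger.error(msg, *args)
    sys.exit(1)
```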
Benjamin Drung
2e041ac1ff syncpackage: replace global by LRU cache
pylint complains:

```
syncpackage:459:4: W0603: Using the global statement (global-statement)
```

Replace the `global` statement by `functools.lru_cache`.
2025-12-03 13:59:06 +01:00
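The replacement pattern, matching the `_fetch_sync_blocklist()` helper introduced in the syncpackage diff further down: the download happens at most once per process, and repeat calls return the cached result.

```python
import functools
import urllib.request


@functools.lru_cache(maxsize=1)
def _fetch_sync_blocklist() -> str:
    # Fetched once; later calls return the cached string instead of
    # mutating a module-level variable via ``global``.
    url = "https://ubuntu-archive-team.ubuntu.com/sync-blocklist.txt"
    with urllib.request.urlopen(url) as f:
        return f.read().decode("utf-8")
```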
Benjamin Drung
c8fe724560 Fix pylint in ubuntutools/test/test_requestsync.py
pylint complains:

```
ubuntutools/test/test_requestsync.py:31:12: C0415: Import outside toplevel (keyring) (import-outside-toplevel)
ubuntutools/test/test_requestsync.py:33:12: W0707: Consider explicitly re-raising using 'except ModuleNotFoundError as exc' and 'raise ModuleNotFoundError('package python3-keyring is not installed') from exc' (raise-missing-from)
ubuntutools/test/test_requestsync.py:31:12: W0611: Unused import keyring (unused-import)
```
2025-12-03 13:51:50 +01:00
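One way to satisfy all three messages at once, condensed from the pylint suggestions above (the actual change in test_requestsync.py may differ in detail):

```python
# Top-level import with an explicit exception chain; the import exists only
# to verify that python3-keyring is available.
try:
    import keyring  # pylint: disable=unused-import
except ModuleNotFoundError as exc:
    raise ModuleNotFoundError("package python3-keyring is not installed") from exc
```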
Benjamin Drung
768a517370 ubuntutools/archive.py: use 'yield from'
pylint complains:

```
ubuntutools/archive.py:343:8: R1737: Use 'yield from' directly instead of yielding each element one by one (use-yield-from)
ubuntutools/archive.py:346:8: R1737: Use 'yield from' directly instead of yielding each element one by one (use-yield-from)
ubuntutools/archive.py:638:8: R1737: Use 'yield from' directly instead of yielding each element one by one (use-yield-from)
```
2025-12-03 13:47:54 +01:00
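A self-contained before/after sketch of the R1737 refactor; the real change applies the same transformation to the generators in archive.py (one of them is visible at the bottom of this page):

```python
# Before: yields each element in a loop, which pylint flags as R1737.
def servers_old(mirrors, masters):
    for server in mirrors:
        yield server
    for server in masters:
        yield server


# After: delegate to the iterables directly.
def servers_new(mirrors, masters):
    yield from mirrors
    yield from masters


print(list(servers_new(["mirror1"], ["master1"])))
```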
Benjamin Drung
3d2ee5a1b7 ubuntutools/archive.py: do not raise general exception
pylint complains:

```
ubuntutools/archive.py:935:8: W0719: Raising too general exception: Exception (broad-exception-raised)
```
2025-12-03 13:45:46 +01:00
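The usual W0719 remedy is to raise a more specific exception class. Which class archive.py now raises is not shown in this view; the sketch below uses `DownloadError`, which appears elsewhere in ubuntutools, purely as an example:

```python
class DownloadError(Exception):
    """Stand-in for a specific error type; the real class used may differ."""


def fetch(url: str) -> bytes:
    if not url.startswith(("http://", "https://")):
        # Specific exception instead of ``raise Exception(...)``.
        raise DownloadError(f"unsupported URL: {url}")
    return b""
```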
Benjamin Drung
1c81f0872d Use lazy % formatting in logging functions
pylint complains:

```
W1201: Use lazy % formatting in logging functions
```
2025-12-03 13:36:48 +01:00
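A minimal runnable contrast between eager and lazy formatting:

```python
import logging

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger(__name__)
version = "0.208"

# Eager (W1201): the message string is built even though DEBUG records
# are discarded at INFO level.
logger.debug("releasing %s" % version)

# Lazy: interpolation only happens if the record is actually emitted.
logger.debug("releasing %s", version)
```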
Benjamin Drung
ef5e3d8066 requestsync: silence possibly-used-before-assignment
pylint complains:

```
requestsync:156:34: E0606: Possibly using variable 'bug_mail_domain' before assignment (possibly-used-before-assignment)
requestsync:217:27: E0606: Possibly using variable 'Distribution' before assignment (possibly-used-before-assignment)
requestsync:380:8: E0606: Possibly using variable 'post_bug' before assignment (possibly-used-before-assignment)
```

These errors are false positives, so silence them. The proper long-term fix
would be to restructure the code.
2025-12-03 13:32:33 +01:00
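The suppressions land as targeted `disable-next` comments at each use site, as the requestsync hunks further down show. A self-contained illustration of the false-positive shape (the `backend` example is hypothetical):

```python
import sys

if sys.argv:  # always non-empty at runtime, but pylint cannot rely on that
    backend = "lpapi"

# pylint: disable-next=possibly-used-before-assignment
print(backend)
```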
Benjamin Drung
816323ea5c sponsor_patch: make pylint happy
pylint complains:

```
ubuntutools/sponsor_patch/sponsor_patch.py:243:45: E0606: Possibly using variable 'task' before assignment (possibly-used-before-assignment)
```

Drop the `len(ubuntu_tasks) > 1` check and rely on being the else case.
2025-12-03 13:21:19 +01:00
Benjamin Drung
41e7d2d714 ubuntutools/pullpkg.py: initialize vcscmd
Fixes: 2f396fe54956
2025-12-03 13:11:47 +01:00
Sebastien Bacher
7e82344d57 ubuntu-build: fix non-batch mode errors
The current version does not work:

```
$ ubuntu-build librsync resolute retry

Traceback (most recent call last):
  File "/usr/lib/python3/dist-packages/lazr/restfulclient/resource.py", line 354, in __getattr__
    return self.lp_get_parameter(attr)
           ~~~~~~~~~~~~~~~~~~~~~^^^^^^
  File "/usr/lib/python3/dist-packages/lazr/restfulclient/resource.py", line 249, in lp_get_parameter
    raise KeyError("No such parameter: %s" % param_name)
KeyError: 'No such parameter: getComponent'
```

The way launchpadlib is used was changed in 010af53, but non-batch mode was
not updated accordingly.

Also tweak how the release is computed for the distroseries lookup, so that a
package in e.g. "resolute-proposed" can be retried.
2025-12-03 13:07:25 +01:00
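The series-lookup tweak shows up in the ubuntu-build hunk below as `release.split("-")[0]`; a tiny standalone illustration of stripping the pocket suffix:

```python
# Strip a pocket suffix such as "-proposed" before resolving the distroseries.
for release in ("resolute", "resolute-proposed"):
    series_name = release.split("-")[0]
    print(f"{release} -> series {series_name}")
```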
Benjamin Drung
cf88f4b92f syncpackage: do not use bare except for urlopen()
flake8 complains:

```
./syncpackage:465:9: E722 do not use bare 'except'
```

The function `urllib.request.urlopen` might throw
`urllib.error.URLError`, `urllib.error.HTTPError`, `socket.gaierror`,
`ssl.SSLError` which are all subclasses of `OSError`.
2025-12-03 12:24:47 +01:00
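The replacement clause therefore only needs to catch `OSError`:

```python
import urllib.request

URL = "https://ubuntu-archive-team.ubuntu.com/sync-blocklist.txt"
try:
    with urllib.request.urlopen(URL) as response:
        text = response.read().decode("utf-8")
except OSError as exc:
    # URLError, HTTPError, socket.gaierror and ssl.SSLError all derive from
    # OSError, so this single clause replaces the bare ``except``.
    print(f"download failed: {exc}")
```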
Benjamin Drung
c6a4c10da2 Format code with black and isort
```
isort .
black -C . $(grep -l -r '^#! */usr/bin/python3$' .)
```
2025-12-03 12:01:23 +01:00
Gianfranco Costamagna
dff0b269d2 ubuntu-build: consider amd64v3 as valid architecture 2025-10-19 09:29:46 +02:00
Colin Watson
3f880bea90 releasing package ubuntu-dev-tools version 0.207 2025-09-15 15:59:36 +01:00
Colin Watson
8bb85c6a94 Optimize Launchpad collection handling
Various methods in `ubuntutools.lp.lpapicache` iterated over collections
in a pessimal way: they fetched the collection and then fetched each
individual entry in it, when the same information was already available
in the collection response.  Use more idiomatic launchpadlib code for
this instead, which is also much faster.
2025-09-15 11:29:14 +01:00
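A hedged sketch of the difference, using an anonymous launchpadlib session and the `hello` package purely as an example; the methods actually touched in lpapicache are not spelled out in this commit message:

```python
from launchpadlib.launchpad import Launchpad

# Anonymous, read-only session; the real tooling logs in with credentials.
lp = Launchpad.login_anonymously("collection-demo", "production", version="devel")
archive = lp.distributions["ubuntu"].main_archive
publications = archive.getPublishedSources(source_name="hello", exact_match=True)

# Pessimal: re-fetch every entry via its URL, one extra round trip each.
slow = [lp.load(pub.self_link).source_package_version for pub in publications]

# Idiomatic: the same attribute is already in the collection response.
fast = [pub.source_package_version for pub in publications]
```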
Dan Streetman
7dd913fe16 Add a new changelog entry 2025-05-06 13:29:29 -04:00
Dan Streetman
bcf3e153f7 Fix pulling from upload queue
Commit 4a4c4e0a27cfd159ac0bbc135d4eff06be8bde1c completely broke
pull-lp-source --upload-queue, which now fails with:

Traceback (most recent call last):
  File "/usr/bin/pull-lp-source", line 14, in <module>
    PullPkg.main(distro="ubuntu", pull="source")
    ~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3/dist-packages/ubuntutools/pullpkg.py", line 111, in main
    cls(*args, **kwargs).pull()
    ~~~~~~~~~~~~~~~~~~~~~~~~~^^
  File "/usr/lib/python3/dist-packages/ubuntutools/pullpkg.py", line 438, in pull
    self.pull_upload_queue(  # pylint: disable=missing-kwoa
    ~~~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        pull, arch=options["arch"], download_only=options["download_only"], **params
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    )

Fixes LP: #2110061
2025-05-06 13:21:52 -04:00
Simon Quigley
466e2784de Upload to Unstable 2025-03-04 13:43:32 -06:00
Simon Quigley
ba3f0511f9 syncpackage: Catch exceptions cleanly, simply skipping to the next package (erring on the side of caution) if there is an error doing the download (LP: #1943286). 2025-03-04 13:42:50 -06:00
Simon Quigley
2e550ceff2 syncpackage: Cache the sync blocklist in-memory, so it's not fetched multiple times when syncing more than one package. 2025-03-04 13:39:07 -06:00
Simon Quigley
6c8a5d74bd syncpackage: s/syncblacklist/syncblocklist/g 2025-03-04 13:29:02 -06:00
Simon Quigley
3d11516599 mk-sbuild: default to using UTC for schroots (LP: #2097159). 2025-03-04 13:22:40 -06:00
Simon Quigley
5a20308ab1 Read ~/.devscripts in a more robust way, to ideally pick up multi-line variables (Closes: #725418). 2025-03-04 13:17:30 -06:00
Simon Quigley
b551877651 Add a changelog entry 2025-03-04 13:10:04 -06:00
ferbraher
4a4c4e0a27 Parsing arch parameter to getBinaryPackage() 2025-03-04 13:08:59 -06:00
Simon Quigley
865c1c97bc Add a changelog entry 2025-03-04 13:07:42 -06:00
Shengjing Zhu
d09718e976 import-bug-from-debian: package option is overridden and not used 2025-03-04 13:07:11 -06:00
Simon Quigley
bff7baecc9 Add a changelog entry 2025-03-04 13:06:38 -06:00
Dan Bungert
45fbbb5bd1 mk-sbuild: enable pkgmaintainermangler
mk-sbuild installs pkgbinarymangler into the schroot. One of the tools
provided by pkgbinarymangler is pkgmaintainermangler.
pkgmaintainermangler is disabled by default, and enabled with
configuration.

A difference between Launchpad builds of a synced package and a local sbuild
is that the maintainer information differs.

Enable pkgmaintainermangler to close this difference.
2025-03-04 13:05:57 -06:00
Simon Quigley
ca217c035e Add a new changelog entry 2025-03-04 13:04:49 -06:00
Simon Quigley
b5e117788b Upload to Unstable 2025-03-01 11:30:18 -06:00
Simon Quigley
ddba2d1e98 Update Standards-Version to 4.7.2, no changes needed. 2025-03-01 11:29:53 -06:00
Simon Quigley
02d65a5804 [syncpackage] Do not use exit(1) on an error or exception unless it applies to all packages, instead return None so we can continue to the next package. 2025-03-01 11:26:59 -06:00
Simon Quigley
bda85fa6a8 [syncpackage] Add support for -y or --yes, noted that it should be used with care. 2025-03-01 11:22:52 -06:00
Simon Quigley
86a83bf74d [syncpackage] Within fetch_source_pkg, do not exit(1) on an error or exception, simply return None so we can continue to the next package. 2025-03-01 11:17:02 -06:00
Simon Quigley
162e758671 [syncpackage] When syncing multiple packages, if one of the packages is in the sync blocklist, do not exit, simply continue. 2025-03-01 11:12:49 -06:00
Simon Quigley
049425adb7 Add debian/files to .gitignore 2025-03-01 11:11:34 -06:00
Simon Quigley
f6ca6cad92 Add a new changelog entry 2025-03-01 11:11:17 -06:00
Simon Quigley
3dc17934d6 Upload to Unstable 2025-02-24 19:55:03 -06:00
Simon Quigley
10a176567a Remove mail line from default ~/.sbuildrc, to resolve the undeclared dependency on sendmail (Closes: #1074632). 2025-02-24 19:52:59 -06:00
Simon Quigley
86b366c6c5 Add a large warning at the top of mk-sbuild encouraging the use of the unshare backend. This is to provide ample warning to users. 2025-02-24 19:15:55 -06:00
Simon Quigley
50b580b30e Add a manpage for running-autopkgtests. 2025-02-24 18:51:12 -06:00
Simon Quigley
6ba0641f63 Rename bitesize to lp-bitesize (Closes: #1076224). 2025-02-24 18:51:10 -06:00
Simon Quigley
1e815db9d2 Add my name to the copyright file. 2025-02-24 18:35:20 -06:00
Simon Quigley
e2f43318bd Add several Lintian overrides related to .pyc files. 2025-02-24 18:34:18 -06:00
Julien Plissonneau Duquène
cdd81232d9 Fix reverse-depends -b crash on packages that b-d on themselves (Closes: #1087760). 2025-02-24 18:31:33 -06:00
Simon Quigley
65044d84d9 Update Standards-Version to 4.7.1, no changes needed. 2025-02-24 18:26:59 -06:00
Mattia Rizzolo
19e40b49c2
Fix minor typo in pbuilder-dist(1)
LP: #2096956
Thanks: Rolf Leggewie for the patch
Signed-off-by: Mattia Rizzolo <mattia@debian.org>
2025-01-30 07:52:22 +01:00
38 changed files with 428 additions and 209 deletions

View File

@ -34,6 +34,7 @@ disable=fixme,locally-disabled,missing-docstring,useless-option-value,
duplicate-code, duplicate-code,
too-many-instance-attributes, too-many-instance-attributes,
too-many-nested-blocks, too-many-nested-blocks,
too-many-positional-arguments,
too-many-lines, too-many-lines,

View File

@ -25,6 +25,7 @@ import shutil
import subprocess import subprocess
import sys import sys
import tempfile import tempfile
from typing import Any, NoReturn
from urllib.parse import quote from urllib.parse import quote
try: try:
@ -50,7 +51,7 @@ from ubuntutools.question import YesNoQuestion
Logger = getLogger() Logger = getLogger()
def error(msg, *args): def error(msg: str, *args: Any) -> NoReturn:
Logger.error(msg, *args) Logger.error(msg, *args)
sys.exit(1) sys.exit(1)

1
debian/.gitignore vendored Normal file
View File

@ -0,0 +1 @@
files

91
debian/changelog vendored
View File

@ -1,3 +1,94 @@
ubuntu-dev-tools (0.208) unstable; urgency=medium
[ Gianfranco Costamagna ]
* ubuntu-build: consider amd64v3 as valid architecture
[ Sebastien Bacher ]
* ubuntu-build: fix non batch mode errors.
[ Benjamin Drung ]
* Format code with black and isort
* ubuntutools/pullpkg.py: initialize vcscmd
* make pylint and mypy happy
* mark non-returning functions with typing.NoReturn
* run-linters: add --errors-only mode and run this during package build
* Drop Lintian overrides related to .pyc files
* Drop obsolete Rules-Requires-Root: no
* run mypy during package build
* sponsor-patch: stop checking for bzr being present
* Modernize SourcePackage._run_lintian()
* requestsync: support pocket parameter in get_ubuntu_srcpkg (LP: #2115990)
-- Benjamin Drung <bdrung@debian.org> Wed, 03 Dec 2025 16:33:47 +0100
ubuntu-dev-tools (0.207) unstable; urgency=medium
* Team upload.
[ Dan Streetman ]
* Fix pull-lp-source --upload-queue (LP: #2110061)
[ Colin Watson ]
* Optimize Launchpad collection handling.
-- Colin Watson <cjwatson@debian.org> Mon, 15 Sep 2025 15:58:34 +0100
ubuntu-dev-tools (0.206) unstable; urgency=medium
[ Dan Bungert ]
* mk-sbuild: enable pkgmaintainermangler
[ Shengjing Zhu ]
* import-bug-from-debian: package option is overridden and not used
[ Fernando Bravo Hernández ]
* Parsing arch parameter to getBinaryPackage() (LP: #2081861)
[ Simon Quigley ]
* Read ~/.devscripts in a more robust way, to ideally pick up multi-line
variables (Closes: #725418).
* mk-sbuild: default to using UTC for schroots (LP: #2097159).
* syncpackage: s/syncblacklist/syncblocklist/g
* syncpackage: Cache the sync blocklist in-memory, so it's not fetched
multiple times when syncing more than one package.
* syncpackage: Catch exceptions cleanly, simply skipping to the next
package (erring on the side of caution) if there is an error doing the
download (LP: #1943286).
-- Simon Quigley <tsimonq2@debian.org> Tue, 04 Mar 2025 13:43:15 -0600
ubuntu-dev-tools (0.205) unstable; urgency=medium
* [syncpackage] When syncing multiple packages, if one of the packages is in
the sync blocklist, do not exit, simply continue.
* [syncpackage] Do not use exit(1) on an error or exception unless it
applies to all packages, instead return None so we can continue to the
next package.
* [syncpackage] Add support for -y or --yes, noted that it should be used
with care.
* Update Standards-Version to 4.7.2, no changes needed.
-- Simon Quigley <tsimonq2@debian.org> Sat, 01 Mar 2025 11:29:54 -0600
ubuntu-dev-tools (0.204) unstable; urgency=medium
[ Simon Quigley ]
* Update Standards-Version to 4.7.1, no changes needed.
* Add several Lintian overrides related to .pyc files.
* Add my name to the copyright file.
* Rename bitesize to lp-bitesize (Closes: #1076224).
* Add a manpage for running-autopkgtests.
* Add a large warning at the top of mk-sbuild encouraging the use of the
unshare backend. This is to provide ample warning to users.
* Remove mail line from default ~/.sbuildrc, to resolve the undeclared
dependency on sendmail (Closes: #1074632).
[ Julien Plissonneau Duquène ]
* Fix reverse-depends -b crash on packages that b-d on themselves
(Closes: #1087760).
-- Simon Quigley <tsimonq2@debian.org> Mon, 24 Feb 2025 19:54:39 -0600
ubuntu-dev-tools (0.203) unstable; urgency=medium ubuntu-dev-tools (0.203) unstable; urgency=medium
[ Steve Langasek ] [ Steve Langasek ]

15
debian/control vendored
View File

@ -8,16 +8,17 @@ Uploaders:
Mattia Rizzolo <mattia@debian.org>, Mattia Rizzolo <mattia@debian.org>,
Simon Quigley <tsimonq2@debian.org>, Simon Quigley <tsimonq2@debian.org>,
Build-Depends: Build-Depends:
black <!nocheck>,
dctrl-tools,
debhelper-compat (= 13), debhelper-compat (= 13),
devscripts (>= 2.11.0~),
dh-make, dh-make,
dh-python, dh-python,
black <!nocheck>,
dctrl-tools,
devscripts (>= 2.11.0~),
distro-info (>= 0.2~), distro-info (>= 0.2~),
flake8, flake8,
isort <!nocheck>, isort <!nocheck>,
lsb-release, lsb-release,
mypy <!nocheck>,
pylint <!nocheck>, pylint <!nocheck>,
python3-all, python3-all,
python3-apt, python3-apt,
@ -30,9 +31,9 @@ Build-Depends:
python3-pytest, python3-pytest,
python3-requests <!nocheck>, python3-requests <!nocheck>,
python3-setuptools, python3-setuptools,
python3-typeshed <!nocheck>,
python3-yaml <!nocheck>, python3-yaml <!nocheck>,
Standards-Version: 4.7.0 Standards-Version: 4.7.2
Rules-Requires-Root: no
Vcs-Git: https://git.launchpad.net/ubuntu-dev-tools Vcs-Git: https://git.launchpad.net/ubuntu-dev-tools
Vcs-Browser: https://git.launchpad.net/ubuntu-dev-tools Vcs-Browser: https://git.launchpad.net/ubuntu-dev-tools
Homepage: https://launchpad.net/ubuntu-dev-tools Homepage: https://launchpad.net/ubuntu-dev-tools
@ -40,12 +41,12 @@ Homepage: https://launchpad.net/ubuntu-dev-tools
Package: ubuntu-dev-tools Package: ubuntu-dev-tools
Architecture: all Architecture: all
Depends: Depends:
dpkg-dev,
binutils, binutils,
dctrl-tools, dctrl-tools,
devscripts (>= 2.11.0~), devscripts (>= 2.11.0~),
diffstat, diffstat,
distro-info (>= 0.2~), distro-info (>= 0.2~),
dpkg-dev,
dput, dput,
lsb-release, lsb-release,
python3, python3,
@ -72,10 +73,10 @@ Recommends:
genisoimage, genisoimage,
lintian, lintian,
patch, patch,
sbuild | pbuilder | cowbuilder,
python3-dns, python3-dns,
quilt, quilt,
reportbug (>= 3.39ubuntu1), reportbug (>= 3.39ubuntu1),
sbuild | pbuilder | cowbuilder,
ubuntu-keyring | ubuntu-archive-keyring, ubuntu-keyring | ubuntu-archive-keyring,
Suggests: Suggests:
bzr | brz, bzr | brz,

6
debian/copyright vendored
View File

@ -11,6 +11,7 @@ Files: backportpackage
doc/check-symbols.1 doc/check-symbols.1
doc/requestsync.1 doc/requestsync.1
doc/ubuntu-iso.1 doc/ubuntu-iso.1
doc/running-autopkgtests.1
GPL-2 GPL-2
README.updates README.updates
requestsync requestsync
@ -25,6 +26,7 @@ Copyright: 2007, Albert Damen <albrt@gmx.net>
2010, Evan Broder <evan@ebroder.net> 2010, Evan Broder <evan@ebroder.net>
2006-2007, Luke Yelavich <themuso@ubuntu.com> 2006-2007, Luke Yelavich <themuso@ubuntu.com>
2009-2010, Michael Bienia <geser@ubuntu.com> 2009-2010, Michael Bienia <geser@ubuntu.com>
2024-2025, Simon Quigley <tsimonq2@debian.org>
2010-2011, Stefano Rivera <stefanor@ubuntu.com> 2010-2011, Stefano Rivera <stefanor@ubuntu.com>
2008, Stephan Hermann <sh@sourcecode.de> 2008, Stephan Hermann <sh@sourcecode.de>
2007, Steve Kowalik <stevenk@ubuntu.com> 2007, Steve Kowalik <stevenk@ubuntu.com>
@ -72,14 +74,14 @@ License: GPL-2+
On Debian systems, the complete text of the GNU General Public License On Debian systems, the complete text of the GNU General Public License
version 2 can be found in the /usr/share/common-licenses/GPL-2 file. version 2 can be found in the /usr/share/common-licenses/GPL-2 file.
Files: doc/bitesize.1 Files: doc/lp-bitesize.1
doc/check-mir.1 doc/check-mir.1
doc/grab-merge.1 doc/grab-merge.1
doc/merge-changelog.1 doc/merge-changelog.1
doc/pm-helper.1 doc/pm-helper.1
doc/setup-packaging-environment.1 doc/setup-packaging-environment.1
doc/syncpackage.1 doc/syncpackage.1
bitesize lp-bitesize
check-mir check-mir
GPL-3 GPL-3
grab-merge grab-merge

3
debian/rules vendored
View File

@ -3,10 +3,11 @@
override_dh_auto_clean: override_dh_auto_clean:
dh_auto_clean dh_auto_clean
rm -f .coverage rm -f .coverage
rm -rf .tox rm -rf .mypy_cache .tox
override_dh_auto_test: override_dh_auto_test:
ifeq (,$(filter nocheck,$(DEB_BUILD_OPTIONS))) ifeq (,$(filter nocheck,$(DEB_BUILD_OPTIONS)))
./run-linters --errors-only
python3 -m pytest -v ubuntutools python3 -m pytest -v ubuntutools
endif endif

View File

@ -4,4 +4,5 @@ Depends:
python3-pytest, python3-pytest,
python3-setuptools, python3-setuptools,
@, @,
Restrictions: allow-stderr Restrictions:
allow-stderr,

View File

@ -1,21 +1,21 @@
.TH bitesize "1" "May 9 2010" "ubuntu-dev-tools" .TH lp-bitesize "1" "May 9 2010" "ubuntu-dev-tools"
.SH NAME .SH NAME
bitesize \- Add \fBbitesize\fR tag to bugs and add a comment. lp-bitesize \- Add \fBbitesize\fR tag to bugs and add a comment.
.SH SYNOPSIS .SH SYNOPSIS
.B bitesize \fR<\fIbug number\fR> .B lp-bitesize \fR<\fIbug number\fR>
.br .br
.B bitesize \-\-help .B lp-bitesize \-\-help
.SH DESCRIPTION .SH DESCRIPTION
\fBbitesize\fR adds a bitesize tag to the bug, if it's not there yet. It \fBlp-bitesize\fR adds a bitesize tag to the bug, if it's not there yet. It
also adds a comment to the bug indicating that you are willing to help with also adds a comment to the bug indicating that you are willing to help with
fixing it. fixing it.
It checks for permission to operate on a given bug first, It checks for permission to operate on a given bug first,
then perform required tasks on Launchpad. then perform required tasks on Launchpad.
.SH OPTIONS .SH OPTIONS
Listed below are the command line options for \fBbitesize\fR: Listed below are the command line options for \fBlp-bitesize\fR:
.TP .TP
.BR \-h ", " \-\-help .BR \-h ", " \-\-help
Display a help message and exit. Display a help message and exit.
@ -48,7 +48,7 @@ The default value for \fB--lpinstance\fR.
.BR ubuntu\-dev\-tools (5) .BR ubuntu\-dev\-tools (5)
.SH AUTHORS .SH AUTHORS
\fBbitesize\fR and this manual page were written by Daniel Holbach \fBlp-bitesize\fR and this manual page were written by Daniel Holbach
<daniel.holbach@canonical.com>. <daniel.holbach@canonical.com>.
.PP .PP
Both are released under the terms of the GNU General Public License, version 3. Both are released under the terms of the GNU General Public License, version 3.

View File

@ -20,7 +20,7 @@ like for example \fBpbuilder\-feisty\fP, \fBpbuilder\-sid\fP, \fBpbuilder\-gutsy
.PP .PP
The same applies to \fBcowbuilder\-dist\fP, which uses cowbuilder. The main The same applies to \fBcowbuilder\-dist\fP, which uses cowbuilder. The main
difference between both is that pbuilder compresses the created chroot as a difference between both is that pbuilder compresses the created chroot as a
a tarball, thus using less disc space but needing to uncompress (and possibly tarball, thus using less disc space but needing to uncompress (and possibly
compress) its contents again on each run, and cowbuilder doesn't do this. compress) its contents again on each run, and cowbuilder doesn't do this.
.SH USAGE .SH USAGE

View File

@ -0,0 +1,15 @@
.TH running\-autopkgtests "1" "18 January 2024" "ubuntu-dev-tools"
.SH NAME
running\-autopkgtests \- dumps a list of currently running autopkgtests
.SH SYNOPSIS
.B running\-autopkgtests
.SH DESCRIPTION
Dumps a list of currently running and queued tests in Autopkgtest.
Pass --running to only see running tests, or --queued to only see
queued tests. Passing both will print both, which is the default behavior.
.SH AUTHOR
.B running\-autopkgtests
was written by Chris Peterson <chris.peterson@canonical.com>.

View File

@ -58,7 +58,7 @@ Display more progress information.
\fB\-F\fR, \fB\-\-fakesync\fR \fB\-F\fR, \fB\-\-fakesync\fR
Perform a fakesync, to work around a tarball mismatch between Debian and Perform a fakesync, to work around a tarball mismatch between Debian and
Ubuntu. Ubuntu.
This option ignores blacklisting, and performs a local sync. This option ignores blocklisting, and performs a local sync.
It implies \fB\-\-no\-lp\fR, and will leave a signed \fB.changes\fR file It implies \fB\-\-no\-lp\fR, and will leave a signed \fB.changes\fR file
for you to upload. for you to upload.
.TP .TP

View File

@ -43,7 +43,7 @@ operations.
\fB\-a\fR ARCHITECTURE, \fB\-\-arch\fR=\fIARCHITECTURE\fR \fB\-a\fR ARCHITECTURE, \fB\-\-arch\fR=\fIARCHITECTURE\fR
Rebuild or rescore a specific architecture. Valid Rebuild or rescore a specific architecture. Valid
architectures are: architectures are:
armhf, arm64, amd64, i386, powerpc, ppc64el, riscv64, s390x. armhf, arm64, amd64, amd64v3, i386, powerpc, ppc64el, riscv64, s390x.
.TP .TP
Batch processing: Batch processing:
.IP .IP
@ -66,7 +66,7 @@ Rescore builds to <priority>.
\fB\-\-arch\fR=\fIARCHITECTURE\fR \fB\-\-arch\fR=\fIARCHITECTURE\fR
Affect only 'architecture' (can be used several Affect only 'architecture' (can be used several
times). Valid architectures are: times). Valid architectures are:
arm64, amd64, i386, powerpc, ppc64el, riscv64, s390x. armhf, arm64, amd64, amd64v3, i386, powerpc, ppc64el, riscv64, s390x.
.IP .IP
\fB\-A=\fIARCHIVE\fR \fB\-A=\fIARCHIVE\fR
Act on the named archive (ppa) instead of on the main Ubuntu archive. Act on the named archive (ppa) instead of on the main Ubuntu archive.

View File

@ -150,7 +150,7 @@ def process_bugs(
err = False err = False
for bug in bugs: for bug in bugs:
ubupackage = package = bug.source ubupackage = bug.source
if package: if package:
ubupackage = package ubupackage = package
bug_num = bug.bug_num bug_num = bug.bug_num

View File

@ -23,6 +23,7 @@
import argparse import argparse
import sys import sys
from typing import Any, NoReturn
from launchpadlib.errors import HTTPError from launchpadlib.errors import HTTPError
from launchpadlib.launchpad import Launchpad from launchpadlib.launchpad import Launchpad
@ -33,7 +34,7 @@ from ubuntutools.config import UDTConfig
Logger = getLogger() Logger = getLogger()
def error_out(msg, *args): def error_out(msg: str, *args: Any) -> NoReturn:
Logger.error(msg, *args) Logger.error(msg, *args)
sys.exit(1) sys.exit(1)

View File

@ -22,6 +22,7 @@
# pylint: enable=invalid-name # pylint: enable=invalid-name
import sys import sys
from typing import NoReturn
from debian.changelog import Changelog from debian.changelog import Changelog
@ -30,7 +31,7 @@ from ubuntutools import getLogger
Logger = getLogger() Logger = getLogger()
def usage(exit_code=1): def usage(exit_code: int = 1) -> NoReturn:
Logger.info( Logger.info(
"""Usage: merge-changelog <left changelog> <right changelog> """Usage: merge-changelog <left changelog> <right changelog>

View File

@ -155,6 +155,7 @@ proxy="_unset_"
DEBOOTSTRAP_NO_CHECK_GPG=0 DEBOOTSTRAP_NO_CHECK_GPG=0
EATMYDATA=1 EATMYDATA=1
CCACHE=0 CCACHE=0
USE_PKGBINARYMANGLER=0
while :; do while :; do
case "$1" in case "$1" in
@ -303,10 +304,26 @@ if [ ! -w /var/lib/sbuild ]; then
# Prepare a usable default .sbuildrc # Prepare a usable default .sbuildrc
if [ ! -e ~/.sbuildrc ]; then if [ ! -e ~/.sbuildrc ]; then
cat > ~/.sbuildrc <<EOM cat > ~/.sbuildrc <<EOM
# *** VERIFY AND UPDATE \$mailto and \$maintainer_name BELOW *** # *** THIS COMMAND IS DEPRECATED ***
#
# In sbuild 0.87.0 and later, the unshare backend is available. This is
# expected to become the default in a future release.
#
# This is the new preferred way of building Debian packages, making the manual
# creation of schroots no longer necessary. To retain the default behavior,
# you may remove this comment block and continue.
#
# To test the unshare backend while retaining the default settings, run sbuild
# with --chroot-mode=unshare like this:
# $ sbuild --chroot-mode=unshare --dist=unstable hello
#
# To switch to the unshare backend by default (recommended), uncomment the
# following lines and delete the rest of the file (with the exception of the
# last two lines):
#\$chroot_mode = 'unshare';
#\$unshare_mmdebstrap_keep_tarball = 1;
# Mail address where logs are sent to (mandatory, no default!) # *** VERIFY AND UPDATE \$mailto and \$maintainer_name BELOW ***
\$mailto = '$USER';
# Name to use as override in .changes files for the Maintainer: field # Name to use as override in .changes files for the Maintainer: field
#\$maintainer_name='$USER <$USER@localhost>'; #\$maintainer_name='$USER <$USER@localhost>';
@ -651,6 +668,7 @@ ubuntu)
if ubuntu_dist_ge "$RELEASE" "edgy"; then if ubuntu_dist_ge "$RELEASE" "edgy"; then
# Add pkgbinarymangler (edgy and later) # Add pkgbinarymangler (edgy and later)
BUILD_PKGS="$BUILD_PKGS pkgbinarymangler" BUILD_PKGS="$BUILD_PKGS pkgbinarymangler"
USE_PKGBINARYMANGLER=1
# Disable recommends for a smaller chroot (gutsy and later only) # Disable recommends for a smaller chroot (gutsy and later only)
if ubuntu_dist_ge "$RELEASE" "gutsy"; then if ubuntu_dist_ge "$RELEASE" "gutsy"; then
BUILD_PKGS="--no-install-recommends $BUILD_PKGS" BUILD_PKGS="--no-install-recommends $BUILD_PKGS"
@ -910,8 +928,8 @@ if [ -n "$TEMP_PREFERENCES" ]; then
sudo mv "$TEMP_PREFERENCES" $MNT/etc/apt/preferences.d/proposed.pref sudo mv "$TEMP_PREFERENCES" $MNT/etc/apt/preferences.d/proposed.pref
fi fi
# Copy the timezone (comment this out if you want to leave the chroot at UTC) # Copy the timezone (uncomment this if you want to use your local time zone)
sudo cp -P --remove-destination /etc/localtime /etc/timezone "$MNT"/etc/ #sudo cp -P --remove-destination /etc/localtime /etc/timezone "$MNT"/etc/
# Create a schroot entry for this chroot # Create a schroot entry for this chroot
TEMP_SCHROOTCONF=`mktemp -t schrootconf-XXXXXX` TEMP_SCHROOTCONF=`mktemp -t schrootconf-XXXXXX`
TEMPLATE_SCHROOTCONF=~/.mk-sbuild.schroot.conf TEMPLATE_SCHROOTCONF=~/.mk-sbuild.schroot.conf
@ -1030,6 +1048,25 @@ EOF
EOM EOM
fi fi
if [ "$USE_PKGBINARYMANGLER" = 1 ]; then
sudo bash -c "cat >> $MNT/finish.sh" <<EOM
mkdir -p /etc/pkgbinarymangler/
cat > /etc/pkgbinarymangler/maintainermangler.conf <<EOF
# pkgmaintainermangler configuration file
# pkgmaintainermangler will do nothing unless enable is set to "true"
enable: true
# Configure what happens if /CurrentlyBuilding is present, but invalid
# (i. e. it does not contain a Package: field). If "ignore" (default),
# the file is ignored (i. e. the Maintainer field is mangled) and a
# warning is printed. If "fail" (or any other value), pkgmaintainermangler
# exits with an error, which causes a package build to fail.
invalid_currentlybuilding: ignore
EOF
EOM
fi
if [ -n "$TARGET_ARCH" ]; then if [ -n "$TARGET_ARCH" ]; then
sudo bash -c "cat >> $MNT/finish.sh" <<EOM sudo bash -c "cat >> $MNT/finish.sh" <<EOM
# Configure target architecture # Configure target architecture
@ -1048,7 +1085,7 @@ apt-get update || true
echo set debconf/frontend Noninteractive | debconf-communicate echo set debconf/frontend Noninteractive | debconf-communicate
echo set debconf/priority critical | debconf-communicate echo set debconf/priority critical | debconf-communicate
# Install basic build tool set, trying to match buildd # Install basic build tool set, trying to match buildd
apt-get -y --force-yes install $BUILD_PKGS apt-get -y --force-yes -o Dpkg::Options::="--force-confold" install $BUILD_PKGS
# Set up expected /dev entries # Set up expected /dev entries
if [ ! -r /dev/stdin ]; then ln -s /proc/self/fd/0 /dev/stdin; fi if [ ! -r /dev/stdin ]; then ln -s /proc/self/fd/0 /dev/stdin; fi
if [ ! -r /dev/stdout ]; then ln -s /proc/self/fd/1 /dev/stdout; fi if [ ! -r /dev/stdout ]; then ln -s /proc/self/fd/1 /dev/stdout; fi

View File

@ -38,6 +38,7 @@ import shutil
import subprocess import subprocess
import sys import sys
from contextlib import suppress from contextlib import suppress
from typing import NoReturn
import debian.deb822 import debian.deb822
from distro_info import DebianDistroInfo, DistroDataOutdated, UbuntuDistroInfo from distro_info import DebianDistroInfo, DistroDataOutdated, UbuntuDistroInfo
@ -411,7 +412,7 @@ class PbuilderDist:
] + arguments ] + arguments
def show_help(exit_code=0): def show_help(exit_code: int = 0) -> NoReturn:
"""help() -> None """help() -> None
Print a help message for pbuilder-dist, and exit with the given code. Print a help message for pbuilder-dist, and exit with the given code.

View File

@ -22,6 +22,7 @@ from argparse import ArgumentParser
import yaml import yaml
from launchpadlib.launchpad import Launchpad from launchpadlib.launchpad import Launchpad
from ubuntutools.question import YesNoQuestion
from ubuntutools.utils import get_url from ubuntutools.utils import get_url
# proposed-migration is only concerned with the devel series; unlike other # proposed-migration is only concerned with the devel series; unlike other
@ -56,10 +57,8 @@ def claim_excuses_bug(launchpad, bug, package):
if our_task.assignee: if our_task.assignee:
print(f"Currently assigned to {our_task.assignee.name}") print(f"Currently assigned to {our_task.assignee.name}")
print("""Do you want to claim this bug? [yN] """, end="") answer = YesNoQuestion().ask("Do you want to claim this bug?", "no")
sys.stdout.flush() if answer == "yes":
response = sys.stdin.readline()
if response.strip().lower().startswith("y"):
our_task.assignee = launchpad.me our_task.assignee = launchpad.me
our_task.lp_save() our_task.lp_save()
return True return True
@ -131,6 +130,8 @@ def main():
if not proposed_version: if not proposed_version:
print(f"Package {args.package} not found in -proposed.") print(f"Package {args.package} not found in -proposed.")
sys.exit(1) sys.exit(1)
answer = YesNoQuestion().ask("Do you want to create a bug?", "no")
if answer == "yes":
create_excuses_bug(args.launchpad, args.package, proposed_version) create_excuses_bug(args.launchpad, args.package, proposed_version)
except ValueError as e: except ValueError as e:
sys.stderr.write(f"{e}\n") sys.stderr.write(f"{e}\n")

View File

@ -4,3 +4,7 @@ line-length = 99
[tool.isort] [tool.isort]
line_length = 99 line_length = 99
profile = "black" profile = "black"
[tool.mypy]
disallow_incomplete_defs = true
ignore_missing_imports = true

View File

@ -46,7 +46,7 @@ Logger = getLogger()
# #
def main(): def main() -> None:
# Our usage options. # Our usage options.
usage = "%(prog)s [options] <source package> [<target release> [base version]]" usage = "%(prog)s [options] <source package> [<target release> [base version]]"
parser = argparse.ArgumentParser(usage=usage) parser = argparse.ArgumentParser(usage=usage)
@ -153,6 +153,7 @@ def main():
import DNS # pylint: disable=import-outside-toplevel import DNS # pylint: disable=import-outside-toplevel
DNS.DiscoverNameServers() DNS.DiscoverNameServers()
# imported earlier, pylint: disable-next=possibly-used-before-assignment
mxlist = DNS.mxlookup(bug_mail_domain) mxlist = DNS.mxlookup(bug_mail_domain)
firstmx = mxlist[0] firstmx = mxlist[0]
mailserver_host = firstmx[1] mailserver_host = firstmx[1]
@ -214,6 +215,7 @@ def main():
if not args.release: if not args.release:
if lpapi: if lpapi:
# imported earlier, pylint: disable-next=possibly-used-before-assignment
args.release = Distribution("ubuntu").getDevelopmentSeries().name args.release = Distribution("ubuntu").getDevelopmentSeries().name
else: else:
ubu_info = UbuntuDistroInfo() ubu_info = UbuntuDistroInfo()
@ -377,6 +379,7 @@ def main():
# Map status to the values expected by LP API # Map status to the values expected by LP API
mapping = {"new": "New", "confirmed": "Confirmed"} mapping = {"new": "New", "confirmed": "Confirmed"}
# Post sync request using LP API # Post sync request using LP API
# imported earlier, pylint: disable-next=possibly-used-before-assignment
post_bug(srcpkg, subscribe, mapping[status], title, report) post_bug(srcpkg, subscribe, mapping[status], title, report)
else: else:
email_from = ubu_email(export=False)[1] email_from = ubu_email(export=False)[1]

View File

@ -183,7 +183,7 @@ def display_verbose(package, values):
Logger.info("No reverse dependencies found") Logger.info("No reverse dependencies found")
return return
def log_package(values, package, arch, dependency, offset=0): def log_package(values, package, arch, dependency, visited, offset=0):
line = f"{' ' * offset}* {package}" line = f"{' ' * offset}* {package}"
if all_archs and set(arch) != all_archs: if all_archs and set(arch) != all_archs:
line += f" [{' '.join(sorted(arch))}]" line += f" [{' '.join(sorted(arch))}]"
@ -192,6 +192,9 @@ def display_verbose(package, values):
line += " " * (30 - len(line)) line += " " * (30 - len(line))
line += f" (for {dependency})" line += f" (for {dependency})"
Logger.info(line) Logger.info(line)
if package in visited:
return
visited = visited.copy().add(package)
data = values.get(package) data = values.get(package)
if data: if data:
offset = offset + 1 offset = offset + 1
@ -202,6 +205,7 @@ def display_verbose(package, values):
rdep["Package"], rdep["Package"],
rdep.get("Architectures", all_archs), rdep.get("Architectures", all_archs),
rdep.get("Dependency"), rdep.get("Dependency"),
visited,
offset, offset,
) )
@ -223,6 +227,7 @@ def display_verbose(package, values):
rdep["Package"], rdep["Package"],
rdep.get("Architectures", all_archs), rdep.get("Architectures", all_archs),
rdep.get("Dependency"), rdep.get("Dependency"),
{package},
) )
Logger.info("") Logger.info("")

View File

@ -4,16 +4,45 @@ set -eu
# Copyright 2023, Canonical Ltd. # Copyright 2023, Canonical Ltd.
# SPDX-License-Identifier: GPL-3.0 # SPDX-License-Identifier: GPL-3.0
PYTHON_SCRIPTS=$(grep -l -r '^#! */usr/bin/python3$' .) PYTHON_SCRIPTS=$(find . -maxdepth 1 -type f -exec grep -l '^#! */usr/bin/python3$' {} +)
echo "Running black..." run_black() {
black --check --diff . $PYTHON_SCRIPTS echo "Running black..."
black -C --check --diff . ${PYTHON_SCRIPTS}
}
echo "Running isort..." run_isort() {
isort --check-only --diff . echo "Running isort..."
isort --check-only --diff .
}
echo "Running flake8..." run_flake8() {
flake8 --max-line-length=99 --ignore=E203,W503 . $PYTHON_SCRIPTS echo "Running flake8..."
flake8 --max-line-length=99 --ignore=E203,W503 . $PYTHON_SCRIPTS
}
echo "Running pylint..." run_mypy() {
pylint $(find * -name '*.py') $PYTHON_SCRIPTS echo "Running mypy..."
mypy .
mypy --scripts-are-modules $PYTHON_SCRIPTS
}
run_pylint() {
echo "Running pylint..."
pylint "$@" $(find * -name '*.py') $PYTHON_SCRIPTS
}
if test "${1-}" = "--errors-only"; then
# Run only linters that can detect real errors (ignore formatting)
run_black || true
run_isort || true
run_flake8 || true
run_mypy
run_pylint --errors-only
else
run_black
run_isort
run_flake8
run_mypy
run_pylint
fi

View File

@ -32,13 +32,13 @@ def make_pep440_compliant(version: str) -> str:
scripts = [ scripts = [
"backportpackage", "backportpackage",
"bitesize",
"check-mir", "check-mir",
"check-symbols", "check-symbols",
"dch-repeat", "dch-repeat",
"grab-merge", "grab-merge",
"grep-merges", "grep-merges",
"import-bug-from-debian", "import-bug-from-debian",
"lp-bitesize",
"merge-changelog", "merge-changelog",
"mk-sbuild", "mk-sbuild",
"pbuilder-dist", "pbuilder-dist",

View File

@ -22,6 +22,7 @@
import argparse import argparse
import fnmatch import fnmatch
import functools
import logging import logging
import os import os
import shutil import shutil
@ -143,7 +144,7 @@ def sync_dsc(
if ubuntu_ver.is_modified_in_ubuntu(): if ubuntu_ver.is_modified_in_ubuntu():
if not force: if not force:
Logger.error("--force is required to discard Ubuntu changes.") Logger.error("--force is required to discard Ubuntu changes.")
sys.exit(1) return None
Logger.warning( Logger.warning(
"Overwriting modified Ubuntu version %s, setting current version to %s", "Overwriting modified Ubuntu version %s, setting current version to %s",
@ -157,7 +158,7 @@ def sync_dsc(
src_pkg.pull() src_pkg.pull()
except DownloadError as e: except DownloadError as e:
Logger.error("Failed to download: %s", str(e)) Logger.error("Failed to download: %s", str(e))
sys.exit(1) return None
src_pkg.unpack() src_pkg.unpack()
needs_fakesync = not (need_orig or ubu_pkg.verify_orig()) needs_fakesync = not (need_orig or ubu_pkg.verify_orig())
@ -166,13 +167,13 @@ def sync_dsc(
Logger.warning("Performing a fakesync") Logger.warning("Performing a fakesync")
elif not needs_fakesync and fakesync: elif not needs_fakesync and fakesync:
Logger.error("Fakesync not required, aborting.") Logger.error("Fakesync not required, aborting.")
sys.exit(1) return None
elif needs_fakesync and not fakesync: elif needs_fakesync and not fakesync:
Logger.error( Logger.error(
"The checksums of the Debian and Ubuntu packages " "The checksums of the Debian and Ubuntu packages "
"mismatch. A fake sync using --fakesync is required." "mismatch. A fake sync using --fakesync is required."
) )
sys.exit(1) return None
if fakesync: if fakesync:
# Download Ubuntu files (override Debian source tarballs) # Download Ubuntu files (override Debian source tarballs)
@ -180,7 +181,7 @@ def sync_dsc(
ubu_pkg.pull() ubu_pkg.pull()
except DownloadError as e: except DownloadError as e:
Logger.error("Failed to download: %s", str(e)) Logger.error("Failed to download: %s", str(e))
sys.exit(1) return None
# change into package directory # change into package directory
directory = src_pkg.source + "-" + new_ver.upstream_version directory = src_pkg.source + "-" + new_ver.upstream_version
@ -265,7 +266,7 @@ def sync_dsc(
returncode = subprocess.call(cmd) returncode = subprocess.call(cmd)
if returncode != 0: if returncode != 0:
Logger.error("Source-only build with debuild failed. Please check build log above.") Logger.error("Source-only build with debuild failed. Please check build log above.")
sys.exit(1) return None
def fetch_source_pkg(package, dist, version, component, ubuntu_release, mirror): def fetch_source_pkg(package, dist, version, component, ubuntu_release, mirror):
@ -295,7 +296,7 @@ def fetch_source_pkg(package, dist, version, component, ubuntu_release, mirror):
udtexceptions.SeriesNotFoundException, udtexceptions.SeriesNotFoundException,
) as e: ) as e:
Logger.error(str(e)) Logger.error(str(e))
sys.exit(1) return None
if version is None: if version is None:
version = Version(debian_srcpkg.getVersion()) version = Version(debian_srcpkg.getVersion())
try: try:
@ -306,7 +307,7 @@ def fetch_source_pkg(package, dist, version, component, ubuntu_release, mirror):
ubuntu_version = Version("~") ubuntu_version = Version("~")
except udtexceptions.SeriesNotFoundException as e: except udtexceptions.SeriesNotFoundException as e:
Logger.error(str(e)) Logger.error(str(e))
sys.exit(1) return None
if ubuntu_version >= version: if ubuntu_version >= version:
# The LP importer is maybe out of date # The LP importer is maybe out of date
debian_srcpkg = requestsync_mail_get_debian_srcpkg(package, dist) debian_srcpkg = requestsync_mail_get_debian_srcpkg(package, dist)
@ -320,7 +321,7 @@ def fetch_source_pkg(package, dist, version, component, ubuntu_release, mirror):
ubuntu_version, ubuntu_version,
ubuntu_release, ubuntu_release,
) )
sys.exit(1) return None
if component is None: if component is None:
component = debian_srcpkg.getComponent() component = debian_srcpkg.getComponent()
@ -329,7 +330,7 @@ def fetch_source_pkg(package, dist, version, component, ubuntu_release, mirror):
return DebianSourcePackage(package, version.full_version, component, mirrors=mirrors) return DebianSourcePackage(package, version.full_version, component, mirrors=mirrors)
def copy(src_pkg, release, bugs, sponsoree=None, simulate=False, force=False): def copy(src_pkg, release, bugs, sponsoree=None, simulate=False, force=False, yes=False):
"""Copy a source package from Debian to Ubuntu using the Launchpad API.""" """Copy a source package from Debian to Ubuntu using the Launchpad API."""
ubuntu = Distribution("ubuntu") ubuntu = Distribution("ubuntu")
debian_archive = Distribution("debian").getArchive() debian_archive = Distribution("debian").getArchive()
@ -352,7 +353,7 @@ def copy(src_pkg, release, bugs, sponsoree=None, simulate=False, force=False):
"Debian version %s has not been picked up by LP yet. Please try again later.", "Debian version %s has not been picked up by LP yet. Please try again later.",
src_pkg.version, src_pkg.version,
) )
sys.exit(1) return None
try: try:
ubuntu_spph = get_ubuntu_srcpkg(src_pkg.source, ubuntu_series, ubuntu_pocket) ubuntu_spph = get_ubuntu_srcpkg(src_pkg.source, ubuntu_series, ubuntu_pocket)
@ -373,7 +374,7 @@ def copy(src_pkg, release, bugs, sponsoree=None, simulate=False, force=False):
base_version = ubuntu_version.get_related_debian_version() base_version = ubuntu_version.get_related_debian_version()
if not force and ubuntu_version.is_modified_in_ubuntu(): if not force and ubuntu_version.is_modified_in_ubuntu():
Logger.error("--force is required to discard Ubuntu changes.") Logger.error("--force is required to discard Ubuntu changes.")
sys.exit(1) return None
# Check whether a fakesync would be required. # Check whether a fakesync would be required.
if not src_pkg.dsc.compare_dsc(ubuntu_pkg.dsc): if not src_pkg.dsc.compare_dsc(ubuntu_pkg.dsc):
@ -381,7 +382,7 @@ def copy(src_pkg, release, bugs, sponsoree=None, simulate=False, force=False):
"The checksums of the Debian and Ubuntu packages " "The checksums of the Debian and Ubuntu packages "
"mismatch. A fake sync using --fakesync is required." "mismatch. A fake sync using --fakesync is required."
) )
sys.exit(1) return None
except udtexceptions.PackageNotFoundException: except udtexceptions.PackageNotFoundException:
base_version = Version("~") base_version = Version("~")
Logger.info( Logger.info(
@ -402,6 +403,7 @@ def copy(src_pkg, release, bugs, sponsoree=None, simulate=False, force=False):
if sponsoree: if sponsoree:
Logger.info("Sponsoring this sync for %s (%s)", sponsoree.display_name, sponsoree.name) Logger.info("Sponsoring this sync for %s (%s)", sponsoree.display_name, sponsoree.name)
if not yes:
answer = YesNoQuestion().ask("Sync this package", "no") answer = YesNoQuestion().ask("Sync this package", "no")
if answer != "yes": if answer != "yes":
return return
@ -419,26 +421,37 @@ def copy(src_pkg, release, bugs, sponsoree=None, simulate=False, force=False):
except HTTPError as error: except HTTPError as error:
Logger.error("HTTP Error %s: %s", error.response.status, error.response.reason) Logger.error("HTTP Error %s: %s", error.response.status, error.response.reason)
Logger.error(error.content) Logger.error(error.content)
sys.exit(1) return None
Logger.info("Request succeeded; you should get an e-mail once it is processed.") Logger.info("Request succeeded; you should get an e-mail once it is processed.")
bugs = sorted(set(bugs)) bugs = sorted(set(bugs))
if bugs: if bugs:
Logger.info("Launchpad bugs to be closed: %s", ", ".join(str(bug) for bug in bugs)) Logger.info("Launchpad bugs to be closed: %s", ", ".join(str(bug) for bug in bugs))
Logger.info("Please wait for the sync to be successful before closing bugs.") Logger.info("Please wait for the sync to be successful before closing bugs.")
if yes:
close_bugs(bugs, src_pkg.source, src_pkg.version.full_version, changes, sponsoree)
else:
answer = YesNoQuestion().ask("Close bugs", "yes") answer = YesNoQuestion().ask("Close bugs", "yes")
if answer == "yes": if answer == "yes":
close_bugs(bugs, src_pkg.source, src_pkg.version.full_version, changes, sponsoree) close_bugs(bugs, src_pkg.source, src_pkg.version.full_version, changes, sponsoree)
def is_blacklisted(query): @functools.lru_cache(maxsize=1)
"""Determine if package "query" is in the sync blacklist def _fetch_sync_blocklist() -> str:
Returns tuple of (blacklisted, comments) url = "https://ubuntu-archive-team.ubuntu.com/sync-blocklist.txt"
blacklisted is one of False, 'CURRENT', 'ALWAYS' with urllib.request.urlopen(url) as f:
sync_blocklist = f.read().decode("utf-8")
return sync_blocklist
def is_blocklisted(query):
"""Determine if package "query" is in the sync blocklist
Returns tuple of (blocklisted, comments)
blocklisted is one of False, 'CURRENT', 'ALWAYS'
""" """
series = Launchpad.distributions["ubuntu"].current_series series = Launchpad.distributions["ubuntu"].current_series
lp_comments = series.getDifferenceComments(source_package_name=query) lp_comments = series.getDifferenceComments(source_package_name=query)
blacklisted = False blocklisted = False
comments = [ comments = [
f"{c.body_text}\n -- {c.comment_author.name}" f"{c.body_text}\n -- {c.comment_author.name}"
f" {c.comment_date.strftime('%a, %d %b %Y %H:%M:%S +0000')}" f" {c.comment_date.strftime('%a, %d %b %Y %H:%M:%S +0000')}"
@ -446,17 +459,19 @@ def is_blacklisted(query):
] ]
for diff in series.getDifferencesTo(source_package_name_filter=query): for diff in series.getDifferencesTo(source_package_name_filter=query):
if diff.status == "Blacklisted current version" and blacklisted != "ALWAYS": if diff.status == "Blacklisted current version" and blocklisted != "ALWAYS":
blacklisted = "CURRENT" blocklisted = "CURRENT"
if diff.status == "Blacklisted always": if diff.status == "Blacklisted always":
blacklisted = "ALWAYS" blocklisted = "ALWAYS"
try:
sync_blocklist = _fetch_sync_blocklist()
except OSError:
print("WARNING: unable to download the sync blocklist. Erring on the side of caution.")
return ("ALWAYS", "INTERNAL ERROR: Unable to fetch sync blocklist")
# Old blacklist:
url = "https://ubuntu-archive-team.ubuntu.com/sync-blacklist.txt"
with urllib.request.urlopen(url) as f:
applicable_lines = [] applicable_lines = []
for line in f: for line in sync_blocklist.splitlines():
line = line.decode("utf-8")
if not line.strip(): if not line.strip():
applicable_lines = [] applicable_lines = []
continue continue
@ -467,11 +482,11 @@ def is_blacklisted(query):
pass pass
source = line.strip() source = line.strip()
if source and fnmatch.fnmatch(query, source): if source and fnmatch.fnmatch(query, source):
comments += ["From sync-blacklist.txt:"] + applicable_lines comments += ["From sync-blocklist.txt:"] + applicable_lines
blacklisted = "ALWAYS" blocklisted = "ALWAYS"
break break
return (blacklisted, comments) return (blocklisted, comments)
def close_bugs(bugs, package, version, changes, sponsoree): def close_bugs(bugs, package, version, changes, sponsoree):
@ -508,6 +523,12 @@ def parse():
epilog = f"See {os.path.basename(sys.argv[0])}(1) for more info." epilog = f"See {os.path.basename(sys.argv[0])}(1) for more info."
parser = argparse.ArgumentParser(usage=usage, epilog=epilog) parser = argparse.ArgumentParser(usage=usage, epilog=epilog)
parser.add_argument(
"-y",
"--yes",
action="store_true",
help="Automatically sync without prompting. Use with caution and care.",
)
parser.add_argument("-d", "--distribution", help="Debian distribution to sync from.") parser.add_argument("-d", "--distribution", help="Debian distribution to sync from.")
parser.add_argument("-r", "--release", help="Specify target Ubuntu release.") parser.add_argument("-r", "--release", help="Specify target Ubuntu release.")
parser.add_argument("-V", "--debian-version", help="Specify the version to sync from.") parser.add_argument("-V", "--debian-version", help="Specify the version to sync from.")
@ -712,36 +733,38 @@ def main():
args.release, args.release,
args.debian_mirror, args.debian_mirror,
) )
if not src_pkg:
continue
blacklisted, comments = is_blacklisted(src_pkg.source) blocklisted, comments = is_blocklisted(src_pkg.source)
blacklist_fail = False blocklist_fail = False
if blacklisted: if blocklisted:
messages = [] messages = []
if blacklisted == "CURRENT": if blocklisted == "CURRENT":
Logger.debug( Logger.debug(
"Source package %s is temporarily blacklisted " "Source package %s is temporarily blocklisted "
"(blacklisted_current). " "(blocklisted_current). "
"Ubuntu ignores these for now. " "Ubuntu ignores these for now. "
"See also LP: #841372", "See also LP: #841372",
src_pkg.source, src_pkg.source,
) )
else: else:
if args.fakesync: if args.fakesync:
messages += ["Doing a fakesync, overriding blacklist."] messages += ["Doing a fakesync, overriding blocklist."]
else: else:
blacklist_fail = True blocklist_fail = True
messages += [ messages += [
"If this package needs a fakesync, use --fakesync", "If this package needs a fakesync, use --fakesync",
"If you think this package shouldn't be " "If you think this package shouldn't be "
"blacklisted, please file a bug explaining your " "blocklisted, please file a bug explaining your "
"reasoning and subscribe ~ubuntu-archive.", "reasoning and subscribe ~ubuntu-archive.",
] ]
if blacklist_fail: if blocklist_fail:
Logger.error("Source package %s is blacklisted.", src_pkg.source) Logger.error("Source package %s is blocklisted.", src_pkg.source)
elif blacklisted == "ALWAYS": elif blocklisted == "ALWAYS":
Logger.info("Source package %s is blacklisted.", src_pkg.source) Logger.info("Source package %s is blocklisted.", src_pkg.source)
if messages: if messages:
for message in messages: for message in messages:
for line in textwrap.wrap(message): for line in textwrap.wrap(message):
@ -753,14 +776,17 @@ def main():
for line in textwrap.wrap(comment): for line in textwrap.wrap(comment):
Logger.info(" %s", line) Logger.info(" %s", line)
if blacklist_fail: if blocklist_fail:
sys.exit(1) continue
if args.lp: if args.lp:
copy(src_pkg, args.release, args.bugs, sponsoree, args.simulate, args.force) if not copy(
src_pkg, args.release, args.bugs, sponsoree, args.simulate, args.force, args.yes
):
continue
else: else:
os.environ["DEB_VENDOR"] = "Ubuntu" os.environ["DEB_VENDOR"] = "Ubuntu"
sync_dsc( if not sync_dsc(
src_pkg, src_pkg,
args.distribution, args.distribution,
args.release, args.release,
@ -772,7 +798,8 @@ def main():
args.simulate, args.simulate,
args.force, args.force,
args.fakesync, args.fakesync,
) ):
continue
if __name__ == "__main__": if __name__ == "__main__":

View File

@@ -87,17 +87,13 @@ def retry_builds(pkg, archs):
     return f"Retrying builds of '{pkg.source_package_name}':\n{msg}"


-def main():
+def parse_args(argv: list[str], valid_archs: set[str]) -> argparse.Namespace:
+    """Parse command line arguments and return namespace."""
     # Usage.
     usage = "%(prog)s <srcpackage> <release> <operation>\n\n"
     usage += "Where operation may be one of: rescore, retry, or status.\n"
     usage += "Only Launchpad Buildd Admins may rescore package builds."

-    # Valid architectures.
-    valid_archs = set(
-        ["armhf", "arm64", "amd64", "i386", "powerpc", "ppc64el", "riscv64", "s390x"]
-    )
-
     # Prepare our option parser.
     parser = argparse.ArgumentParser(usage=usage)
@@ -148,7 +144,23 @@ def main():
     parser.add_argument("packages", metavar="package", nargs="*", help=argparse.SUPPRESS)

     # Parse our options.
-    args = parser.parse_args()
+    args = parser.parse_args(argv)
+
+    if not args.batch:
+        # Check we have the correct number of arguments.
+        if len(args.packages) < 3:
+            parser.error("Incorrect number of arguments.")
+
+    return args
+
+
+def main():
+    # Valid architectures.
+    valid_archs = set(
+        ["armhf", "arm64", "amd64", "amd64v3", "i386", "powerpc", "ppc64el", "riscv64", "s390x"]
+    )
+
+    args = parse_args(sys.argv[1:], valid_archs)

     launchpad = Launchpad.login_with("ubuntu-dev-tools", "production", version="devel")
     ubuntu = launchpad.distributions["ubuntu"]
@@ -167,21 +179,13 @@ def main():
             Logger.error(error)
             sys.exit(1)
     else:
-        # Check we have the correct number of arguments.
-        if len(args.packages) < 3:
-            parser.error("Incorrect number of arguments.")
-
-        try:
-            package = str(args.packages[0]).lower()
-            release = str(args.packages[1]).lower()
-            operation = str(args.packages[2]).lower()
-        except IndexError:
-            parser.print_help()
-            sys.exit(1)
+        package = str(args.packages[0]).lower()
+        release = str(args.packages[1]).lower()
+        operation = str(args.packages[2]).lower()

         archive = launchpad.archives.getByReference(reference=args.archive)
         try:
-            distroseries = ubuntu.getSeries(name_or_version=release)
+            distroseries = ubuntu.getSeries(name_or_version=release.split("-")[0])
         except lazr.restfulclient.errors.NotFound as error:
             Logger.error(error)
             sys.exit(1)
@@ -234,11 +238,11 @@ def main():
         # are in place.
         if operation == "retry":
             necessary_privs = archive.checkUpload(
-                component=sources.getComponent(),
+                component=component,
                 distroseries=distroseries,
                 person=launchpad.me,
                 pocket=pocket,
-                sourcepackagename=sources.getPackageName(),
+                sourcepackagename=sources.source_package_name,
             )

            if not necessary_privs:
                Logger.error(
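
Splitting the argument parsing into a `parse_args(argv, valid_archs)` helper also makes the parser easy to exercise without touching `sys.argv`. A rough sketch of how such a helper can be driven directly (the options below are illustrative and not a copy of ubuntu-build's full parser):

```python
import argparse


def parse_args(argv: list[str]) -> argparse.Namespace:
    """Parse an explicit argv list instead of reading sys.argv implicitly."""
    parser = argparse.ArgumentParser(usage="%(prog)s <srcpackage> <release> <operation>")
    parser.add_argument("--batch", action="store_true", help="batch mode")
    parser.add_argument("packages", nargs="*", help=argparse.SUPPRESS)
    args = parser.parse_args(argv)
    if not args.batch and len(args.packages) < 3:
        parser.error("Incorrect number of arguments.")
    return args


# Passing argv explicitly keeps the function testable:
args = parse_args(["hello", "noble", "retry"])
assert args.packages == ["hello", "noble", "retry"]
```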

View File

@@ -65,7 +65,7 @@ def main():
             err = True
             continue

-        Logger.info(prefix + version)
+        Logger.info("%s%s", prefix, version)

     if err:
         sys.exit(1)

View File

@@ -165,6 +165,7 @@ class SourcePackage(ABC):
         series = kwargs.get("series")
         pocket = kwargs.get("pocket")
         status = kwargs.get("status")
+        arch = kwargs.get("arch")
         verify_signature = kwargs.get("verify_signature", False)
         try_binary = kwargs.get("try_binary", True)
@@ -184,6 +185,7 @@ class SourcePackage(ABC):
         self._series = series
         self._pocket = pocket
         self._status = status
+        self._arch = arch
         # dscfile can be either a path or an URL. misc.py's download() will
         # later fiture it out
         self._dsc_source = dscfile
@@ -252,6 +254,7 @@ class SourcePackage(ABC):
             )
             try:
+                params["archtag"] = self._arch
                 bpph = archive.getBinaryPackage(self.source, **params)
             except PackageNotFoundException as bpnfe:
                 # log binary lookup failure, in case it provides hints
@@ -337,11 +340,9 @@ class SourcePackage(ABC):
     def _archive_servers(self):
         "Generator for mirror and master servers"
         # Always provide the mirrors first
-        for server in self.mirrors:
-            yield server
+        yield from self.mirrors
         # Don't repeat servers that are in both mirrors and masters
-        for server in set(self.masters) - set(self.mirrors):
-            yield server
+        yield from set(self.masters) - set(self.mirrors)

     def _source_urls(self, name):
         "Generator of sources for name"
@@ -632,8 +633,7 @@ class DebianSourcePackage(SourcePackage):
     def _source_urls(self, name):
         "Generator of sources for name"
-        for url in super()._source_urls(name):
-            yield url
+        yield from super()._source_urls(name)
         if name in self.snapshot_files:
             yield self.snapshot_files[name]
@@ -731,6 +731,7 @@ class PersonalPackageArchiveSourcePackage(UbuntuSourcePackage):

 class UbuntuCloudArchiveSourcePackage(PersonalPackageArchiveSourcePackage):
     "Download / unpack an Ubuntu Cloud Archive source package"
     TEAM = "ubuntu-cloud-archive"
     PROJECT = "cloud-archive"
     VALID_POCKETS = ["updates", "proposed", "staging"]
@@ -927,8 +928,8 @@ class UbuntuCloudArchiveSourcePackage(PersonalPackageArchiveSourcePackage):

 class _WebJSON:
-    def getHostUrl(self):  # pylint: disable=no-self-use
-        raise Exception("Not implemented")
+    def getHostUrl(self):
+        raise NotImplementedError(f"{self.__class__.__name__}.getHostUrl() is not implemented")

     def load(self, path=""):
         reader = codecs.getreader("utf-8")
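
The `yield from` rewrites above keep the behaviour of the old loops: mirrors come first, then any masters not already listed as mirrors. A small standalone illustration of that generator pattern (the URLs are made up):

```python
def archive_servers(mirrors: list[str], masters: list[str]):
    """Yield mirrors first, then masters that are not already mirrors."""
    yield from mirrors
    yield from set(masters) - set(mirrors)


servers = list(archive_servers(
    ["http://mirror.example/ubuntu"],
    ["http://archive.ubuntu.com/ubuntu", "http://mirror.example/ubuntu"],
))
assert servers[0] == "http://mirror.example/ubuntu"
assert "http://archive.ubuntu.com/ubuntu" in servers[1:]
```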

View File

@@ -50,7 +50,7 @@ class UDTConfig:
         "KEYID": None,
     }
     # Populated from the configuration files:
-    config = {}
+    config: dict[str, str] = {}

     def __init__(self, no_conf=False, prefix=None):
         self.no_conf = no_conf
@@ -61,28 +61,26 @@ class UDTConfig:
             self.config = self.parse_devscripts_config()

     @staticmethod
-    def parse_devscripts_config():
+    def parse_devscripts_config() -> dict[str, str]:
         """Read the devscripts configuration files, and return the values as a
         dictionary
         """
         config = {}
         for filename in ("/etc/devscripts.conf", "~/.devscripts"):
             try:
-                f = open(os.path.expanduser(filename), "r", encoding="utf-8")
+                with open(os.path.expanduser(filename), "r", encoding="utf-8") as f:
+                    content = f.read()
             except IOError:
                 continue
-            for line in f:
-                parsed = shlex.split(line, comments=True)
-                if len(parsed) > 1:
-                    Logger.warning(
-                        "Cannot parse variable assignment in %s: %s",
-                        getattr(f, "name", "<config>"),
-                        line,
-                    )
-                if len(parsed) >= 1 and "=" in parsed[0]:
-                    key, value = parsed[0].split("=", 1)
+            try:
+                tokens = shlex.split(content, comments=True)
+            except ValueError as e:
+                Logger.error("Error parsing %s: %s", filename, e)
+                continue
+            for token in tokens:
+                if "=" in token:
+                    key, value = token.split("=", 1)
                     config[key] = value
-            f.close()
         return config

     def get_value(self, key, default=None, boolean=False, compat_keys=()):
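
The rewritten `parse_devscripts_config()` reads each file once and lets `shlex.split()` drop comments before extracting `KEY=value` pairs. A self-contained sketch of that parsing step on example content (the sample text below is made up):

```python
import shlex

# Example ~/.devscripts-style content; comments and blank lines are ignored.
content = """
# devscripts configuration
DEBCHANGE_RELEASE_HEURISTIC=changelog
DEBSIGN_KEYID=0x12345678  # trailing comment
"""

config: dict[str, str] = {}
for token in shlex.split(content, comments=True):
    if "=" in token:
        key, value = token.split("=", 1)
        config[key] = value

assert config["DEBSIGN_KEYID"] == "0x12345678"
```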

View File

@@ -26,6 +26,7 @@ import logging
 import os
 import re
 from copy import copy
+from typing import Any
 from urllib.error import URLError
 from urllib.parse import urlparse
@@ -139,7 +140,7 @@ class BaseWrapper(metaclass=MetaWrapper):
     A base class from which other wrapper classes are derived.
     """

-    resource_type: str = None  # it's a base class after all
+    resource_type: str | tuple[str, str] = ""  # it's a base class after all

     def __new__(cls, data):
         if isinstance(data, str) and data.startswith(str(Launchpad._root_uri)):
@@ -290,9 +291,8 @@ class Distribution(BaseWrapper):
         Returns a list of all DistroSeries objects.
         """
         if not self._have_all_series:
-            for series in Launchpad.load(self.series_collection_link).entries:
-                series_link = DistroSeries(series["self_link"])
-                self._cache_series(series_link)
+            for series in self.series:
+                self._cache_series(DistroSeries(series))
             self._have_all_series = True

         allseries = filter(lambda s: s.active, self._series.values())
@@ -668,20 +668,19 @@ class Archive(BaseWrapper):
                 rversion = getattr(record, "binary_package_version", None)
             else:
                 rversion = getattr(record, "source_package_version", None)
-            skipmsg = f"Skipping version {rversion}: "

             if record.pocket not in pockets:
                 err_msg = f"pocket {record.pocket} not in ({','.join(pockets)})"
-                Logger.debug(skipmsg + err_msg)
+                Logger.debug("Skipping version %s: %s", rversion, err_msg)
                 continue
             if record.status not in statuses:
                 err_msg = f"status {record.status} not in ({','.join(statuses)})"
-                Logger.debug(skipmsg + err_msg)
+                Logger.debug("Skipping version %s: %s", rversion, err_msg)
                 continue
             release = wrapper(record)
             if binary and archtag and archtag != release.arch:
                 err_msg = f"arch {release.arch} does not match requested arch {archtag}"
-                Logger.debug(skipmsg + err_msg)
+                Logger.debug("Skipping version %s: %s", rversion, err_msg)
                 continue
             # results are ordered so first is latest
             cache[index] = release
@@ -1406,10 +1405,7 @@ class PersonTeam(BaseWrapper, metaclass=MetaPersonTeam):
     def getPPAs(self):
         if self._ppas is None:
-            ppas = [
-                Archive(ppa["self_link"])
-                for ppa in Launchpad.load(self._lpobject.ppas_collection_link).entries
-            ]
+            ppas = [Archive(ppa) for ppa in self._lpobject.ppas]
             self._ppas = {ppa.name: ppa for ppa in ppas}
         return self._ppas
@@ -1434,10 +1430,7 @@ class Project(BaseWrapper):
         The list will be sorted by date_created, in descending order.
         """
         if not self._series:
-            series = [
-                ProjectSeries(s["self_link"])
-                for s in Launchpad.load(self._lpobject.series_collection_link).entries
-            ]
+            series = [ProjectSeries(s) for s in self._lpobject.series]
             self._series = sorted(series, key=lambda s: s.date_created, reverse=True)
         return self._series.copy()
@@ -1509,7 +1502,7 @@ class Packageset(BaseWrapper):  # pylint: disable=too-few-public-methods
     resource_type = "packageset"

     _lp_packagesets = None
-    _source_sets = {}
+    _source_sets: dict[tuple[str, str | None, bool], Any] = {}

     @classmethod
     def setsIncludingSource(cls, sourcepackagename, distroseries=None, direct_inclusion=False):
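
Several of the hunks above replace pre-built f-strings with %-style arguments so the message is only formatted when the debug level is actually enabled. A minimal illustration of that logging idiom:

```python
import logging

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger("lpapicache-example")

rversion = "1.0-1"
err_msg = "pocket Backports not in (Release,Proposed)"

# Formatting is deferred: with INFO as the level, the %-substitution below never runs.
logger.debug("Skipping version %s: %s", rversion, err_msg)
```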

View File

@@ -340,6 +340,7 @@ class PullPkg:
         params = {}
         params["package"] = options["package"]
+        params["arch"] = options["arch"]

         if options["release"]:
             (release, version, pocket) = self.parse_release_and_version(
@@ -435,7 +436,7 @@ class PullPkg:
         if options["upload_queue"]:
             # upload queue API is different/simpler
             self.pull_upload_queue(  # pylint: disable=missing-kwoa
-                pull, arch=options["arch"], download_only=options["download_only"], **params
+                pull, download_only=options["download_only"], **params
             )
             return
@@ -470,6 +471,7 @@ class PullPkg:
                 uri,
             )

+            vcscmd = ""
             if vcs == "Bazaar":
                 vcscmd = " $ bzr branch " + uri
             elif vcs == "Git":

View File

@@ -62,8 +62,17 @@ def get_debian_srcpkg(name, release):
     return DebianSourcePackage(package=name, series=release).lp_spph


-def get_ubuntu_srcpkg(name, release):
-    return UbuntuSourcePackage(package=name, series=release).lp_spph
+def get_ubuntu_srcpkg(name, release, pocket="Proposed"):
+    srcpkg = UbuntuSourcePackage(package=name, series=release, pocket=pocket)
+    try:
+        return srcpkg.lp_spph
+    except PackageNotFoundException:
+        if pocket != "Release":
+            parent_pocket = "Release"
+            if pocket == "Updates":
+                parent_pocket = "Proposed"
+            return get_ubuntu_srcpkg(name, release, parent_pocket)
+        raise


 def need_sponsorship(name, component, release):
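
The new `pocket` parameter above falls back through the pockets when a package is missing: Updates retries Proposed, and anything other than Release retries Release before re-raising. A standalone sketch of just that fallback order, where `lookup()` is a hypothetical stand-in for the Launchpad query:

```python
class PackageNotFoundError(Exception):
    """Stand-in for ubuntutools' PackageNotFoundException."""


def lookup(name: str, pocket: str) -> str:
    """Pretend only the Release pocket publishes this package."""
    if pocket != "Release":
        raise PackageNotFoundError(f"{name} not found in {pocket}")
    return f"{name} ({pocket})"


def get_srcpkg(name: str, pocket: str = "Proposed") -> str:
    try:
        return lookup(name, pocket)
    except PackageNotFoundError:
        if pocket != "Release":
            parent_pocket = "Release"
            if pocket == "Updates":
                parent_pocket = "Proposed"
            return get_srcpkg(name, parent_pocket)
        raise


# Updates -> Proposed -> Release, then success:
assert get_srcpkg("hello", "Updates") == "hello (Release)"
```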

View File

@@ -16,6 +16,7 @@
 # OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE.

 import sys
+from typing import NoReturn

 from ubuntutools.question import Question, YesNoQuestion
@@ -42,7 +43,7 @@ def ask_for_manual_fixing():
         user_abort()


-def user_abort():
+def user_abort() -> NoReturn:
     """Print abort and quit the program."""
     print("User abort.")

View File

@@ -17,6 +17,7 @@
 import logging
 import os
+import pathlib
 import re
 import subprocess
 import sys
@@ -407,22 +408,16 @@ class SourcePackage:
         return True

-    def _run_lintian(self):
+    def _run_lintian(self) -> str:
         """Runs lintian on either the source or binary changes file.

         Returns the filename of the created lintian output file.
         """

         # Determine whether to use the source or binary build for lintian
+        package_and_version = f"{self._package}_{strip_epoch(self._version)}"
         if self._build_log:
-            build_changes = (
-                self._package
-                + "_"
-                + strip_epoch(self._version)
-                + "_"
-                + self._builder.get_architecture()
-                + ".changes"
-            )
+            build_changes = f"{package_and_version}_{self._builder.get_architecture()}.changes"
             changes_for_lintian = os.path.join(self._buildresult, build_changes)
         else:
             changes_for_lintian = self._changes_file
@@ -430,18 +425,12 @@ class SourcePackage:
         # Check lintian
         assert os.path.isfile(changes_for_lintian), f"{changes_for_lintian} does not exist."
         cmd = ["lintian", "-IE", "--pedantic", "-q", "--profile", "ubuntu", changes_for_lintian]
-        lintian_filename = os.path.join(
-            self._workdir, self._package + "_" + strip_epoch(self._version) + ".lintian"
-        )
-        Logger.debug("%s > %s", " ".join(cmd), lintian_filename)
-        report = subprocess.check_output(cmd, encoding="utf-8")
-
-        # write lintian report file
-        lintian_file = open(lintian_filename, "w", encoding="utf-8")
-        lintian_file.writelines(report)
-        lintian_file.close()
-
-        return lintian_filename
+        lintian_file = pathlib.Path(self._workdir) / f"{package_and_version}.lintian"
+        Logger.debug("%s > %s", " ".join(cmd), lintian_file)
+        with lintian_file.open("wb") as outfile:
+            subprocess.run(cmd, stdout=outfile, check=True)
+
+        return str(lintian_file)

     def sync(self, upload, series, bug_number, requester):
         """Does a sync of the source package."""

View File

@@ -46,11 +46,9 @@ def is_command_available(command, check_sbin=False):

 def check_dependencies():
     "Do we have all the commands we need for full functionality?"
     missing = []
-    for cmd in ("patch", "bzr", "quilt", "dput", "lintian"):
+    for cmd in ("patch", "quilt", "dput", "lintian"):
         if not is_command_available(cmd):
             missing.append(cmd)
-    if not is_command_available("bzr-buildpackage"):
-        missing.append("bzr-builddeb")

     if not any(
         is_command_available(cmd, check_sbin=True) for cmd in ("pbuilder", "sbuild", "cowbuilder")
     ):
@@ -212,14 +210,14 @@ def get_open_ubuntu_bug_task(launchpad, bug, branch=None):
         sys.exit(1)
     elif len(ubuntu_tasks) == 1:
         task = ubuntu_tasks[0]
-    if len(ubuntu_tasks) > 1 and branch and branch[1] == "ubuntu":
+    elif branch and branch[1] == "ubuntu":
         tasks = [t for t in ubuntu_tasks if t.get_series() == branch[2] and t.package == branch[3]]
         if len(tasks) > 1:
             # A bug targeted to the development series?
             tasks = [t for t in tasks if t.series is not None]
         assert len(tasks) == 1
         task = tasks[0]
-    elif len(ubuntu_tasks) > 1:
+    else:
         task_list = [t.get_short_info() for t in ubuntu_tasks]
         Logger.debug(
             "%i Ubuntu tasks exist for bug #%i.\n%s",

View File

@@ -60,7 +60,7 @@ class ExamplePackage:
         with tempfile.TemporaryDirectory() as tmpdir:
             self._create(Path(tmpdir))

-    def _create(self, directory: Path):
+    def _create(self, directory: Path) -> None:
         pkgdir = directory / self.dirname
         pkgdir.mkdir()
         (pkgdir / self.content_filename).write_text(self.content_text)

View File

@@ -28,6 +28,7 @@ class BinaryTests(unittest.TestCase):
     def test_keyring_installed(self):
         """Smoke test for required lp api dependencies"""
         try:
+            # pylint: disable-next=import-outside-toplevel,unused-import
             import keyring  # noqa: F401
-        except ModuleNotFoundError:
-            raise ModuleNotFoundError("package python3-keyring is not installed")
+        except ModuleNotFoundError as error:
+            raise ModuleNotFoundError("package python3-keyring is not installed") from error

View File

@@ -12,7 +12,7 @@
 # LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR
 # OTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR
 # PERFORMANCE OF THIS SOFTWARE.
-""" Tests for running_autopkgtests
+"""Tests for running_autopkgtests

 Tests using cached data from autopkgtest servers.

 These tests only ensure code changes don't change parsing behavior