Compare commits

4 Commits

Author | SHA1 | Message | Date
Steve Langasek | 7192e99427 | changelog bump | 2024-05-08 11:33:37 +02:00
Steve Langasek | 475533f542 | Correct wrong backport of python3-launchpadlib-desktop change | 2024-05-08 11:33:23 +02:00
Steve Langasek | 823b790da3 | releasing package ubuntu-dev-tools version 0.201ubuntu2~23.10.1 | 2024-03-12 17:47:28 -07:00
Steve Langasek | 7cac9c9c3b | Backport to mantic | 2024-03-12 17:47:19 -07:00
50 changed files with 508 additions and 1010 deletions

.gitignore (vendored): 2 lines changed

@@ -1,2 +0,0 @@
__pycache__
*.egg-info

@@ -34,7 +34,6 @@ disable=fixme,locally-disabled,missing-docstring,useless-option-value,
duplicate-code,
too-many-instance-attributes,
too-many-nested-blocks,
too-many-positional-arguments,
too-many-lines,

@@ -25,7 +25,6 @@ import shutil
import subprocess
import sys
import tempfile
from typing import Any, NoReturn
from urllib.parse import quote
try:
@@ -51,7 +50,7 @@ from ubuntutools.question import YesNoQuestion
Logger = getLogger()
def error(msg: str, *args: Any) -> NoReturn:
def error(msg, *args):
Logger.error(msg, *args)
sys.exit(1)
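This hunk reverts the `str`/`Any`/`NoReturn` annotations that 0.208 added to the `error()` helper. A minimal self-contained sketch of the annotated form being reverted, with the standard-library `getLogger` standing in for the `ubuntutools` one:

```python
import sys
from logging import getLogger
from typing import Any, NoReturn

Logger = getLogger()


def error(msg: str, *args: Any) -> NoReturn:
    # Log the printf-style message, then terminate with a nonzero status;
    # NoReturn documents that this function never falls through.
    Logger.error(msg, *args)
    sys.exit(1)
```

Annotating with `NoReturn` lets mypy flag unreachable code after an `error()` call, which is why 0.208 paired these annotations with running mypy during the package build.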

@@ -23,7 +23,6 @@
import argparse
import sys
from typing import Any, NoReturn
from launchpadlib.errors import HTTPError
from launchpadlib.launchpad import Launchpad
@@ -34,7 +33,7 @@ from ubuntutools.config import UDTConfig
Logger = getLogger()
def error_out(msg: str, *args: Any) -> NoReturn:
def error_out(msg, *args):
Logger.error(msg, *args)
sys.exit(1)

debian/.gitignore (vendored): 1 line changed

@@ -1 +0,0 @@
files

debian/changelog (vendored): 157 lines changed

@@ -1,152 +1,10 @@
ubuntu-dev-tools (0.210) unstable; urgency=medium
ubuntu-dev-tools (0.201ubuntu2~23.10.2) mantic; urgency=medium
* Team upload
* Backport current ubuntu-dev-tools to mantic. LP: #2057716.
[ Nick Rosbrook ]
* d/control: Depends: dput-ng instead of dput (LP: #2147656)
-- Steve Langasek <steve.langasek@ubuntu.com> Tue, 12 Mar 2024 17:47:25 -0700
[ Nadzeya Hutsko ]
* reverse-depends: print message when results are filtered out
(LP: #2143989)
[ Benjamin Drung ]
* Format Python code with black 26
* Bump Standards-Version to 4.7.4
-- Nadzeya Hutsko <nadzeya.hutsko@canonical.com> Thu, 16 Apr 2026 12:36:55 +0200
ubuntu-dev-tools (0.209) unstable; urgency=medium
[ Colin Watson ]
* Demote sudo to Recommends, and indicate which tools need it in the
package description.
[ Florent 'Skia' Jacquet ]
* pm-helper: make use of YesNoQuestion
[ Benjamin Drung ]
* setup.py: specify type of data_files (Closes: #1127543, LP: #2143232)
* Remove redundant Priority: optional field
* Update year in copyright
* Bump Standards-Version to 4.7.3
-- Benjamin Drung <bdrung@debian.org> Wed, 04 Mar 2026 22:16:02 +0100
ubuntu-dev-tools (0.208) unstable; urgency=medium
[ Gianfranco Costamagna ]
* ubuntu-build: consider amd64v3 as valid architecture
[ Sebastien Bacher ]
* ubuntu-build: fix non batch mode errors.
[ Benjamin Drung ]
* Format code with black and isort
* ubuntutools/pullpkg.py: initialize vcscmd
* make pylint and mypy happy
* mark non-returning functions with typing.NoReturn
* run-linters: add --errors-only mode and run this during package build
* Drop Lintian overrides related to .pyc files
* Drop obsolete Rules-Requires-Root: no
* run mypy during package build
* sponsor-patch: stop checking for bzr being present
* Modernize SourcePackage._run_lintian()
* requestsync: support pocket parameter in get_ubuntu_srcpkg (LP: #2115990)
-- Benjamin Drung <bdrung@debian.org> Wed, 03 Dec 2025 16:33:47 +0100
ubuntu-dev-tools (0.207) unstable; urgency=medium
* Team upload.
[ Dan Streetman ]
* Fix pull-lp-source --upload-queue (LP: #2110061)
[ Colin Watson ]
* Optimize Launchpad collection handling.
-- Colin Watson <cjwatson@debian.org> Mon, 15 Sep 2025 15:58:34 +0100
ubuntu-dev-tools (0.206) unstable; urgency=medium
[ Dan Bungert ]
* mk-sbuild: enable pkgmaintainermangler
[ Shengjing Zhu ]
* import-bug-from-debian: package option is overridden and not used
[ Fernando Bravo Hernández ]
* Parsing arch parameter to getBinaryPackage() (LP: #2081861)
[ Simon Quigley ]
* Read ~/.devscripts in a more robust way, to ideally pick up multi-line
variables (Closes: #725418).
* mk-sbuild: default to using UTC for schroots (LP: #2097159).
* syncpackage: s/syncblacklist/syncblocklist/g
* syncpackage: Cache the sync blocklist in-memory, so it's not fetched
multiple times when syncing more than one package.
* syncpackage: Catch exceptions cleanly, simply skipping to the next
package (erring on the side of caution) if there is an error doing the
download (LP: #1943286).
-- Simon Quigley <tsimonq2@debian.org> Tue, 04 Mar 2025 13:43:15 -0600
ubuntu-dev-tools (0.205) unstable; urgency=medium
* [syncpackage] When syncing multiple packages, if one of the packages is in
the sync blocklist, do not exit, simply continue.
* [syncpackage] Do not use exit(1) on an error or exception unless it
applies to all packages, instead return None so we can continue to the
next package.
* [syncpackage] Add support for -y or --yes, noted that it should be used
with care.
* Update Standards-Version to 4.7.2, no changes needed.
-- Simon Quigley <tsimonq2@debian.org> Sat, 01 Mar 2025 11:29:54 -0600
ubuntu-dev-tools (0.204) unstable; urgency=medium
[ Simon Quigley ]
* Update Standards-Version to 4.7.1, no changes needed.
* Add several Lintian overrides related to .pyc files.
* Add my name to the copyright file.
* Rename bitesize to lp-bitesize (Closes: #1076224).
* Add a manpage for running-autopkgtests.
* Add a large warning at the top of mk-sbuild encouraging the use of the
unshare backend. This is to provide ample warning to users.
* Remove mail line from default ~/.sbuildrc, to resolve the undeclared
dependency on sendmail (Closes: #1074632).
[ Julien Plissonneau Duquène ]
* Fix reverse-depends -b crash on packages that b-d on themselves
(Closes: #1087760).
-- Simon Quigley <tsimonq2@debian.org> Mon, 24 Feb 2025 19:54:39 -0600
ubuntu-dev-tools (0.203) unstable; urgency=medium
[ Steve Langasek ]
* ubuntu-build: handle TOCTOU issue with the "can be retried" value on
builds.
* Recommend sbuild over pbuilder. sbuild is the tool recommended by
Ubuntu developers whose behavior most closely approximates Launchpad
builds.
[ Florent 'Skia' Jacquet ]
* import-bug-from-debian: handle multipart message (Closes: #969510)
[ Benjamin Drung ]
* import-bug-from-debian: add type hints
* Bump Standards-Version to 4.7.0
* Bump year and add missing files to copyright
* setup.py: add pm-helper
* Format code with black and isort
* Address several issues pointed out by Pylint
* Depend on python3-yaml for pm-helper
-- Benjamin Drung <bdrung@debian.org> Sat, 02 Nov 2024 18:19:24 +0100
ubuntu-dev-tools (0.202) unstable; urgency=medium
ubuntu-dev-tools (0.201ubuntu2) noble; urgency=medium
[ Steve Langasek ]
* ubuntu-build: support --batch with no package names to retry all
@@ -157,11 +15,14 @@ ubuntu-dev-tools (0.202) unstable; urgency=medium
* ubuntu-build: Handling of proposed vs release pocket default for ppas
* ubuntu-build: update manpage
[ Chris Peterson ]
-- Steve Langasek <steve.langasek@ubuntu.com> Tue, 12 Mar 2024 17:03:43 -0700
ubuntu-dev-tools (0.201ubuntu1) noble; urgency=medium
* Replace Depends on python3-launchpadlib with Depends on
python3-launchpadlib-desktop (LP: #2049217)
-- Simon Quigley <tsimonq2@ubuntu.com> Fri, 12 Apr 2024 23:33:14 -0500
-- Chris Peterson <chris.peterson@canonical.com> Fri, 01 Mar 2024 14:08:07 -0800
ubuntu-dev-tools (0.201) unstable; urgency=medium

debian/control (vendored): 34 lines changed

@@ -1,23 +1,24 @@
Source: ubuntu-dev-tools
Section: devel
Maintainer: Ubuntu Developers <ubuntu-dev-tools@packages.debian.org>
Priority: optional
Maintainer: Ubuntu Developers <ubuntu-devel-discuss@lists.ubuntu.com>
XSBC-Original-Maintainer: Ubuntu Developers <ubuntu-dev-tools@packages.debian.org>
Uploaders:
Benjamin Drung <bdrung@debian.org>,
Stefano Rivera <stefanor@debian.org>,
Mattia Rizzolo <mattia@debian.org>,
Simon Quigley <tsimonq2@debian.org>,
Build-Depends:
debhelper-compat (= 13),
dh-make,
dh-python,
black <!nocheck>,
dctrl-tools,
debhelper-compat (= 13),
devscripts (>= 2.11.0~),
dh-make,
dh-python,
distro-info (>= 0.2~),
flake8,
isort <!nocheck>,
lsb-release,
mypy <!nocheck>,
pylint <!nocheck>,
python3-all,
python3-apt,
@@ -26,13 +27,12 @@ Build-Depends:
python3-debianbts,
python3-distro-info,
python3-httplib2,
python3-launchpadlib-desktop,
python3-launchpadlib,
python3-pytest,
python3-requests <!nocheck>,
python3-setuptools,
python3-typeshed <!nocheck>,
python3-yaml <!nocheck>,
Standards-Version: 4.7.4
Standards-Version: 4.6.2
Rules-Requires-Root: no
Vcs-Git: https://git.launchpad.net/ubuntu-dev-tools
Vcs-Browser: https://git.launchpad.net/ubuntu-dev-tools
Homepage: https://launchpad.net/ubuntu-dev-tools
@@ -40,13 +40,13 @@ Homepage: https://launchpad.net/ubuntu-dev-tools
Package: ubuntu-dev-tools
Architecture: all
Depends:
dpkg-dev,
binutils,
dctrl-tools,
devscripts (>= 2.11.0~),
diffstat,
distro-info (>= 0.2~),
dput-ng,
dpkg-dev,
dput,
lsb-release,
python3,
python3-apt,
@@ -57,8 +57,8 @@ Depends:
python3-launchpadlib-desktop,
python3-lazr.restfulclient,
python3-ubuntutools (= ${binary:Version}),
python3-yaml,
sensible-utils,
sudo,
tzdata,
${misc:Depends},
${perl:Depends},
@@ -71,11 +71,10 @@ Recommends:
genisoimage,
lintian,
patch,
pbuilder | cowbuilder | sbuild,
python3-dns,
quilt,
reportbug (>= 3.39ubuntu1),
sbuild | pbuilder | cowbuilder,
sudo,
ubuntu-keyring | ubuntu-archive-keyring,
Suggests:
bzr | brz,
@@ -92,7 +91,7 @@ Description: useful tools for Ubuntu developers
willing to help fix it.
- check-mir - check support status of build/binary dependencies
- check-symbols - will compare and give you a diff of the exported symbols of
all .so files in a binary package. [sudo]
all .so files in a binary package.
- dch-repeat - used to repeat a change log into an older release.
- grab-merge - grabs a merge from merges.ubuntu.com easily.
- grep-merges - search for pending merges from Debian.
@@ -100,10 +99,9 @@ Description: useful tools for Ubuntu developers
- merge-changelog - manually merges two Debian changelogs with the same base
version.
- mk-sbuild - script to create LVM snapshot chroots via schroot and
sbuild. [sbuild, sudo]
sbuild.
- pbuilder-dist, cowbuilder-dist - wrapper script for managing several build
chroots (for different Ubuntu and Debian releases) on the same system.
[pbuilder | cowbuilder, sudo]
- pull-debian-debdiff - attempts to find and download a specific version of
a Debian package and its immediate parent to generate a debdiff.
- pull-debian-source - downloads the latest source package available in
@@ -123,7 +121,7 @@ Description: useful tools for Ubuntu developers
autopkgtests on the Ubuntu autopkgtest infrastructure
- seeded-in-ubuntu - query if a package is safe to upload during a freeze.
- setup-packaging-environment - assistant to get an Ubuntu installation
ready for packaging work. [sudo]
ready for packaging work.
- sponsor-patch - Downloads a patch from a Launchpad bug, patches the source
package, and uploads it (to Ubuntu or a PPA)
- submittodebian - automatically send your changes to Debian as a bug report.

debian/copyright (vendored): 22 lines changed

@@ -11,7 +11,6 @@ Files: backportpackage
doc/check-symbols.1
doc/requestsync.1
doc/ubuntu-iso.1
doc/running-autopkgtests.1
GPL-2
README.updates
requestsync
@@ -20,13 +19,12 @@ Files: backportpackage
ubuntu-iso
ubuntutools/requestsync/*.py
Copyright: 2007, Albert Damen <albrt@gmx.net>
2010-2026, Benjamin Drung <bdrung@ubuntu.com>
2007-2026, Canonical Ltd.
2010-2022, Benjamin Drung <bdrung@ubuntu.com>
2007-2023, Canonical Ltd.
2006-2007, Daniel Holbach <daniel.holbach@ubuntu.com>
2010, Evan Broder <evan@ebroder.net>
2006-2007, Luke Yelavich <themuso@ubuntu.com>
2009-2010, Michael Bienia <geser@ubuntu.com>
2024-2025, Simon Quigley <tsimonq2@debian.org>
2010-2011, Stefano Rivera <stefanor@ubuntu.com>
2008, Stephan Hermann <sh@sourcecode.de>
2007, Steve Kowalik <stevenk@ubuntu.com>
@@ -74,28 +72,23 @@ License: GPL-2+
On Debian systems, the complete text of the GNU General Public License
version 2 can be found in the /usr/share/common-licenses/GPL-2 file.
Files: doc/lp-bitesize.1
Files: doc/bitesize.1
doc/check-mir.1
doc/grab-merge.1
doc/merge-changelog.1
doc/pm-helper.1
doc/setup-packaging-environment.1
doc/syncpackage.1
lp-bitesize
bitesize
check-mir
GPL-3
grab-merge
merge-changelog
pm-helper
pyproject.toml
run-linters
running-autopkgtests
setup-packaging-environment
syncpackage
ubuntutools/running_autopkgtests.py
ubuntutools/utils.py
Copyright: 2010-2024, Benjamin Drung <bdrung@ubuntu.com>
2007-2024, Canonical Ltd.
Copyright: 2010, Benjamin Drung <bdrung@ubuntu.com>
2007-2023, Canonical Ltd.
2008, Jonathan Patrick Davies <jpds@ubuntu.com>
2008-2010, Martin Pitt <martin.pitt@canonical.com>
2009, Siegfried-Angel Gevatter Pujals <rainct@ubuntu.com>
@@ -184,12 +177,11 @@ Files: doc/pull-debian-debdiff.1
ubuntutools/version.py
update-maintainer
.pylintrc
Copyright: 2009-2024, Benjamin Drung <bdrung@ubuntu.com>
Copyright: 2009-2023, Benjamin Drung <bdrung@ubuntu.com>
2010, Evan Broder <evan@ebroder.net>
2008, Siegfried-Angel Gevatter Pujals <rainct@ubuntu.com>
2010-2011, Stefano Rivera <stefanor@ubuntu.com>
2017-2021, Dan Streetman <ddstreet@canonical.com>
2024, Canonical Ltd.
License: ISC
Permission to use, copy, modify, and/or distribute this software for any
purpose with or without fee is hereby granted, provided that the above

debian/rules (vendored): 3 lines changed

@@ -3,11 +3,10 @@
override_dh_auto_clean:
dh_auto_clean
rm -f .coverage
rm -rf .mypy_cache .tox
rm -rf .tox
override_dh_auto_test:
ifeq (,$(filter nocheck,$(DEB_BUILD_OPTIONS)))
./run-linters --errors-only
python3 -m pytest -v ubuntutools
endif

@@ -4,5 +4,4 @@ Depends:
python3-pytest,
python3-setuptools,
@,
Restrictions:
allow-stderr,
Restrictions: allow-stderr

@@ -1,21 +1,21 @@
.TH lp-bitesize "1" "May 9 2010" "ubuntu-dev-tools"
.TH bitesize "1" "May 9 2010" "ubuntu-dev-tools"
.SH NAME
lp-bitesize \- Add \fBbitesize\fR tag to bugs and add a comment.
bitesize \- Add \fBbitesize\fR tag to bugs and add a comment.
.SH SYNOPSIS
.B lp-bitesize \fR<\fIbug number\fR>
.B bitesize \fR<\fIbug number\fR>
.br
.B lp-bitesize \-\-help
.B bitesize \-\-help
.SH DESCRIPTION
\fBlp-bitesize\fR adds a bitesize tag to the bug, if it's not there yet. It
\fBbitesize\fR adds a bitesize tag to the bug, if it's not there yet. It
also adds a comment to the bug indicating that you are willing to help with
fixing it.
It checks for permission to operate on a given bug first,
then perform required tasks on Launchpad.
.SH OPTIONS
Listed below are the command line options for \fBlp-bitesize\fR:
Listed below are the command line options for \fBbitesize\fR:
.TP
.BR \-h ", " \-\-help
Display a help message and exit.
@@ -48,7 +48,7 @@ The default value for \fB--lpinstance\fR.
.BR ubuntu\-dev\-tools (5)
.SH AUTHORS
\fBlp-bitesize\fR and this manual page were written by Daniel Holbach
\fBbitesize\fR and this manual page were written by Daniel Holbach
<daniel.holbach@canonical.com>.
.PP
Both are released under the terms of the GNU General Public License, version 3.

@@ -20,7 +20,7 @@ like for example \fBpbuilder\-feisty\fP, \fBpbuilder\-sid\fP, \fBpbuilder\-gutsy
.PP
The same applies to \fBcowbuilder\-dist\fP, which uses cowbuilder. The main
difference between both is that pbuilder compresses the created chroot as a
tarball, thus using less disc space but needing to uncompress (and possibly
a tarball, thus using less disc space but needing to uncompress (and possibly
compress) its contents again on each run, and cowbuilder doesn't do this.
.SH USAGE

@@ -1,15 +0,0 @@
.TH running\-autopkgtests "1" "18 January 2024" "ubuntu-dev-tools"
.SH NAME
running\-autopkgtests \- dumps a list of currently running autopkgtests
.SH SYNOPSIS
.B running\-autopkgtests
.SH DESCRIPTION
Dumps a list of currently running and queued tests in Autopkgtest.
Pass --running to only see running tests, or --queued to only see
queued tests. Passing both will print both, which is the default behavior.
.SH AUTHOR
.B running\-autopkgtests
was written by Chris Peterson <chris.peterson@canonical.com>.

@@ -58,7 +58,7 @@ Display more progress information.
\fB\-F\fR, \fB\-\-fakesync\fR
Perform a fakesync, to work around a tarball mismatch between Debian and
Ubuntu.
This option ignores blocklisting, and performs a local sync.
This option ignores blacklisting, and performs a local sync.
It implies \fB\-\-no\-lp\fR, and will leave a signed \fB.changes\fR file
for you to upload.
.TP

@@ -43,7 +43,7 @@ operations.
\fB\-a\fR ARCHITECTURE, \fB\-\-arch\fR=\fIARCHITECTURE\fR
Rebuild or rescore a specific architecture. Valid
architectures are:
armhf, arm64, amd64, amd64v3, i386, powerpc, ppc64el, riscv64, s390x.
armhf, arm64, amd64, i386, powerpc, ppc64el, riscv64, s390x.
.TP
Batch processing:
.IP
@@ -66,7 +66,7 @@ Rescore builds to <priority>.
\fB\-\-arch\fR=\fIARCHITECTURE\fR
Affect only 'architecture' (can be used several
times). Valid architectures are:
armhf, arm64, amd64, amd64v3, i386, powerpc, ppc64el, riscv64, s390x.
arm64, amd64, i386, powerpc, ppc64el, riscv64, s390x.
.IP
\fB\-A=\fIARCHIVE\fR
Act on the named archive (ppa) instead of on the main Ubuntu archive.

@@ -29,8 +29,6 @@ import logging
import re
import sys
import webbrowser
from collections.abc import Iterable
from email.message import EmailMessage
import debianbts
from launchpadlib.launchpad import Launchpad
@@ -39,10 +37,11 @@ from ubuntutools import getLogger
from ubuntutools.config import UDTConfig
Logger = getLogger()
ATTACHMENT_MAX_SIZE = 2000
def parse_args() -> argparse.Namespace:
def main():
bug_re = re.compile(r"bug=(\d+)")
parser = argparse.ArgumentParser()
parser.add_argument(
"-b",
@@ -72,15 +71,28 @@
"--no-conf", action="store_true", help="Don't read config files or environment variables."
)
parser.add_argument("bugs", nargs="+", help="Bug number(s) or URL(s)")
return parser.parse_args()
options = parser.parse_args()
config = UDTConfig(options.no_conf)
if options.lpinstance is None:
options.lpinstance = config.get_value("LPINSTANCE")
def get_bug_numbers(bug_list: Iterable[str]) -> list[int]:
bug_re = re.compile(r"bug=(\d+)")
if options.dry_run:
launchpad = Launchpad.login_anonymously("ubuntu-dev-tools")
options.verbose = True
else:
launchpad = Launchpad.login_with("ubuntu-dev-tools", options.lpinstance)
if options.verbose:
Logger.setLevel(logging.DEBUG)
debian = launchpad.distributions["debian"]
ubuntu = launchpad.distributions["ubuntu"]
lp_debbugs = launchpad.bug_trackers.getByName(name="debbugs")
bug_nums = []
for bug_num in bug_list:
for bug_num in options.bugs:
if bug_num.startswith("http"):
# bug URL
match = bug_re.search(bug_num)
@@ -89,79 +101,24 @@ def get_bug_numbers(bug_list: Iterable[str]) -> list[int]:
sys.exit(1)
bug_num = match.groups()[0]
bug_num = bug_num.lstrip("#")
bug_nums.append(int(bug_num))
bug_num = int(bug_num)
bug_nums.append(bug_num)
return bug_nums
bugs = debianbts.get_status(bug_nums)
def walk_multipart_message(message: EmailMessage) -> tuple[str, list[tuple[int, EmailMessage]]]:
summary = ""
attachments = []
i = 1
for part in message.walk():
content_type = part.get_content_type()
if content_type.startswith("multipart/"):
# we're already iterating on multipart items
# let's just skip the multipart extra metadata
continue
if content_type == "application/pgp-signature":
# we're not interested in importing pgp signatures
continue
if part.is_attachment():
attachments.append((i, part))
elif content_type.startswith("image/"):
# images here are not attachment, they are inline, but Launchpad can't handle that,
# so let's add them as attachments
summary += f"Message part #{i}\n"
summary += f"[inline image '{part.get_filename()}']\n\n"
attachments.append((i, part))
elif content_type.startswith("text/html"):
summary += f"Message part #{i}\n"
summary += "[inline html]\n\n"
attachments.append((i, part))
elif content_type == "text/plain":
summary += f"Message part #{i}\n"
summary += part.get_content() + "\n"
else:
raise RuntimeError(f"""Unknown message part
Your Debian bug is too weird to be imported in Launchpad, sorry.
You can fix that by patching this script in ubuntu-dev-tools.
Faulty message part:
{part}""")
i += 1
return summary, attachments
def process_bugs(
bugs: Iterable[debianbts.Bugreport],
launchpad: Launchpad,
package: str,
dry_run: bool = True,
browserless: bool = False,
) -> bool:
debian = launchpad.distributions["debian"]
ubuntu = launchpad.distributions["ubuntu"]
lp_debbugs = launchpad.bug_trackers.getByName(name="debbugs")
if not bugs:
Logger.error("Cannot find any of the listed bugs")
sys.exit(1)
err = False
for bug in bugs:
ubupackage = bug.source
if package:
ubupackage = package
ubupackage = package = bug.source
if options.package:
ubupackage = options.package
bug_num = bug.bug_num
subject = bug.subject
log = debianbts.get_bug_log(bug_num)
message = log[0]["message"]
assert isinstance(message, EmailMessage)
attachments: list[tuple[int, EmailMessage]] = []
if message.is_multipart():
summary, attachments = walk_multipart_message(message)
else:
summary = str(message.get_payload())
summary = log[0]["message"].get_payload()
target = ubuntu.getSourcePackage(name=ubupackage)
if target is None:
Logger.error(
@@ -180,73 +137,24 @@ def process_bugs(
Logger.debug("Subject: %s", subject)
Logger.debug("Description: ")
Logger.debug(description)
for i, attachment in attachments:
Logger.debug("Attachment #%s (%s)", i, attachment.get_filename() or "inline")
Logger.debug("Content:")
if attachment.get_content_type() == "text/plain":
content = attachment.get_content()
if len(content) > ATTACHMENT_MAX_SIZE:
content = (
content[:ATTACHMENT_MAX_SIZE]
+ f" [attachment cropped after {ATTACHMENT_MAX_SIZE} characters...]"
)
Logger.debug(content)
else:
Logger.debug("[data]")
if dry_run:
if options.dry_run:
Logger.info("Dry-Run: not creating Ubuntu bug.")
continue
u_bug = launchpad.bugs.createBug(target=target, title=subject, description=description)
for i, attachment in attachments:
name = f"#{i}-{attachment.get_filename() or "inline"}"
content = attachment.get_content()
if isinstance(content, str):
# Launchpad only wants bytes
content = content.encode()
u_bug.addAttachment(
filename=name,
data=content,
comment=f"Imported from Debian bug http://bugs.debian.org/{bug_num}",
)
d_sp = debian.getSourcePackage(name=package)
if d_sp is None and package:
d_sp = debian.getSourcePackage(name=package)
if d_sp is None and options.package:
d_sp = debian.getSourcePackage(name=options.package)
d_task = u_bug.addTask(target=d_sp)
d_watch = u_bug.addWatch(remote_bug=bug_num, bug_tracker=lp_debbugs)
d_task.bug_watch = d_watch
d_task.lp_save()
Logger.info("Opened %s", u_bug.web_link)
if not browserless:
if not options.browserless:
webbrowser.open(u_bug.web_link)
return err
def main() -> None:
options = parse_args()
config = UDTConfig(options.no_conf)
if options.lpinstance is None:
options.lpinstance = config.get_value("LPINSTANCE")
if options.dry_run:
launchpad = Launchpad.login_anonymously("ubuntu-dev-tools")
options.verbose = True
else:
launchpad = Launchpad.login_with("ubuntu-dev-tools", options.lpinstance)
if options.verbose:
Logger.setLevel(logging.DEBUG)
bugs = debianbts.get_status(get_bug_numbers(options.bugs))
if not bugs:
Logger.error("Cannot find any of the listed bugs")
sys.exit(1)
if process_bugs(bugs, launchpad, options.package, options.dry_run, options.browserless):
if err:
sys.exit(1)
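The section above removes `walk_multipart_message()`, the 0.203 helper that flattens a multipart Debian bug message into a text summary plus a list of attachment parts (Closes: #969510), since the mantic backport predates it. A simplified, self-contained sketch of the core idea; the real helper also special-cases inline images, HTML parts, and PGP signatures:

```python
from email.message import EmailMessage


def walk_multipart_message(message: EmailMessage):
    """Collect a plain-text summary and the attachment parts of a message."""
    summary = ""
    attachments = []
    i = 1
    for part in message.walk():
        if part.get_content_type().startswith("multipart/"):
            # Container parts carry no payload of their own; skip them
            # without consuming a part number.
            continue
        if part.is_attachment():
            attachments.append((i, part))
        elif part.get_content_type() == "text/plain":
            summary += f"Message part #{i}\n{part.get_content()}\n"
        i += 1
    return summary, attachments
```

The summary becomes the Launchpad bug description, while each collected part is uploaded separately via `addAttachment()`, which only accepts bytes.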

@@ -22,7 +22,6 @@
# pylint: enable=invalid-name
import sys
from typing import NoReturn
from debian.changelog import Changelog
@@ -31,14 +30,16 @@ from ubuntutools import getLogger
Logger = getLogger()
def usage(exit_code: int = 1) -> NoReturn:
Logger.info("""Usage: merge-changelog <left changelog> <right changelog>
def usage(exit_code=1):
Logger.info(
"""Usage: merge-changelog <left changelog> <right changelog>
merge-changelog takes two changelogs that once shared a common source,
merges them back together, and prints the merged result to stdout. This
is useful if you need to manually merge a ubuntu package with a new
Debian release of the package.
""")
"""
)
sys.exit(exit_code)

@@ -155,7 +155,6 @@ proxy="_unset_"
DEBOOTSTRAP_NO_CHECK_GPG=0
EATMYDATA=1
CCACHE=0
USE_PKGBINARYMANGLER=0
while :; do
case "$1" in
@@ -304,27 +303,11 @@ if [ ! -w /var/lib/sbuild ]; then
# Prepare a usable default .sbuildrc
if [ ! -e ~/.sbuildrc ]; then
cat > ~/.sbuildrc <<EOM
# *** THIS COMMAND IS DEPRECATED ***
#
# In sbuild 0.87.0 and later, the unshare backend is available. This is
# expected to become the default in a future release.
#
# This is the new preferred way of building Debian packages, making the manual
# creation of schroots no longer necessary. To retain the default behavior,
# you may remove this comment block and continue.
#
# To test the unshare backend while retaining the default settings, run sbuild
# with --chroot-mode=unshare like this:
# $ sbuild --chroot-mode=unshare --dist=unstable hello
#
# To switch to the unshare backend by default (recommended), uncomment the
# following lines and delete the rest of the file (with the exception of the
# last two lines):
#\$chroot_mode = 'unshare';
#\$unshare_mmdebstrap_keep_tarball = 1;
# *** VERIFY AND UPDATE \$mailto and \$maintainer_name BELOW ***
# Mail address where logs are sent to (mandatory, no default!)
\$mailto = '$USER';
# Name to use as override in .changes files for the Maintainer: field
#\$maintainer_name='$USER <$USER@localhost>';
@@ -668,7 +651,6 @@ ubuntu)
if ubuntu_dist_ge "$RELEASE" "edgy"; then
# Add pkgbinarymangler (edgy and later)
BUILD_PKGS="$BUILD_PKGS pkgbinarymangler"
USE_PKGBINARYMANGLER=1
# Disable recommends for a smaller chroot (gutsy and later only)
if ubuntu_dist_ge "$RELEASE" "gutsy"; then
BUILD_PKGS="--no-install-recommends $BUILD_PKGS"
@@ -928,8 +910,8 @@ if [ -n "$TEMP_PREFERENCES" ]; then
sudo mv "$TEMP_PREFERENCES" $MNT/etc/apt/preferences.d/proposed.pref
fi
# Copy the timezone (uncomment this if you want to use your local time zone)
#sudo cp -P --remove-destination /etc/localtime /etc/timezone "$MNT"/etc/
# Copy the timezone (comment this out if you want to leave the chroot at UTC)
sudo cp -P --remove-destination /etc/localtime /etc/timezone "$MNT"/etc/
# Create a schroot entry for this chroot
TEMP_SCHROOTCONF=`mktemp -t schrootconf-XXXXXX`
TEMPLATE_SCHROOTCONF=~/.mk-sbuild.schroot.conf
@@ -1048,25 +1030,6 @@ EOF
EOM
fi
if [ "$USE_PKGBINARYMANGLER" = 1 ]; then
sudo bash -c "cat >> $MNT/finish.sh" <<EOM
mkdir -p /etc/pkgbinarymangler/
cat > /etc/pkgbinarymangler/maintainermangler.conf <<EOF
# pkgmaintainermangler configuration file
# pkgmaintainermangler will do nothing unless enable is set to "true"
enable: true
# Configure what happens if /CurrentlyBuilding is present, but invalid
# (i. e. it does not contain a Package: field). If "ignore" (default),
# the file is ignored (i. e. the Maintainer field is mangled) and a
# warning is printed. If "fail" (or any other value), pkgmaintainermangler
# exits with an error, which causes a package build to fail.
invalid_currentlybuilding: ignore
EOF
EOM
fi
if [ -n "$TARGET_ARCH" ]; then
sudo bash -c "cat >> $MNT/finish.sh" <<EOM
# Configure target architecture
@@ -1085,7 +1048,7 @@ apt-get update || true
echo set debconf/frontend Noninteractive | debconf-communicate
echo set debconf/priority critical | debconf-communicate
# Install basic build tool set, trying to match buildd
apt-get -y --force-yes -o Dpkg::Options::="--force-confold" install $BUILD_PKGS
apt-get -y --force-yes install $BUILD_PKGS
# Set up expected /dev entries
if [ ! -r /dev/stdin ]; then ln -s /proc/self/fd/0 /dev/stdin; fi
if [ ! -r /dev/stdout ]; then ln -s /proc/self/fd/1 /dev/stdout; fi

@@ -38,7 +38,6 @@ import shutil
import subprocess
import sys
from contextlib import suppress
from typing import NoReturn
import debian.deb822
from distro_info import DebianDistroInfo, DistroDataOutdated, UbuntuDistroInfo
@@ -295,9 +294,7 @@ class PbuilderDist:
if self.target_distro in self._debian_distros:
try:
codename = self.debian_distro_info.codename(
self.target_distro, default=self.target_distro
)
codename = self.debian_distro_info.codename(self.target_distro, default=self.target_distro)
except DistroDataOutdated as error:
Logger.warning(error)
if codename in (self.debian_distro_info.devel(), "experimental"):
@@ -412,7 +409,7 @@ class PbuilderDist:
] + arguments
def show_help(exit_code: int = 0) -> NoReturn:
def show_help(exit_code=0):
"""help() -> None
Print a help message for pbuilder-dist, and exit with the given code.

@@ -15,50 +15,53 @@
# along with this program. If not, see <http://www.gnu.org/licenses/>.
import lzma
from argparse import ArgumentParser
import sys
import webbrowser
from argparse import ArgumentParser
import yaml
from launchpadlib.launchpad import Launchpad
from ubuntutools.question import YesNoQuestion
from ubuntutools.utils import get_url
# proposed-migration is only concerned with the devel series; unlike other
# tools, don't make this configurable
excuses_url = "https://ubuntu-archive-team.ubuntu.com/proposed-migration/update_excuses.yaml.xz"
excuses_url = 'https://ubuntu-archive-team.ubuntu.com/proposed-migration/' \
+ 'update_excuses.yaml.xz'
def get_proposed_version(excuses, package):
for k in excuses["sources"]:
if k["source"] == package:
return k.get("new-version")
for k in excuses['sources']:
if k['source'] == package:
return k.get('new-version')
return None
def claim_excuses_bug(launchpad, bug, package):
print(f"LP: #{bug.id}: {bug.title}")
ubuntu = launchpad.distributions["ubuntu"]
print("LP: #%d: %s" % (bug.id, bug.title))
ubuntu = launchpad.distributions['ubuntu']
series = ubuntu.current_series.fullseriesname
for task in bug.bug_tasks:
# targeting to a series doesn't make the default task disappear,
# it just makes it useless
if task.bug_target_name == f"{package} ({series})":
if task.bug_target_name == "%s (%s)" % (package, series):
our_task = task
break
if task.bug_target_name == f"{package} (Ubuntu)":
elif task.bug_target_name == "%s (Ubuntu)" % package:
our_task = task
if our_task.assignee == launchpad.me:
print("Bug already assigned to you.")
return True
if our_task.assignee:
print(f"Currently assigned to {our_task.assignee.name}")
elif our_task.assignee:
print("Currently assigned to %s" % our_task.assignee.name)
answer = YesNoQuestion().ask("Do you want to claim this bug?", "no")
if answer == "yes":
print('''Do you want to claim this bug? [yN] ''', end="")
sys.stdout.flush()
response = sys.stdin.readline()
if response.strip().lower().startswith('y'):
our_task.assignee = launchpad.me
our_task.lp_save()
return True
@@ -69,37 +72,38 @@ def claim_excuses_bug(launchpad, bug, package):
def create_excuses_bug(launchpad, package, version):
print("Will open a new bug")
bug = launchpad.bugs.createBug(
title=f"proposed-migration for {package} {version}",
tags=("update-excuse"),
target=f"https://api.launchpad.net/devel/ubuntu/+source/{package}",
description=f"{package} {version} is stuck in -proposed.",
title = 'proposed-migration for %s %s' % (package, version),
tags = ('update-excuse'),
target = 'https://api.launchpad.net/devel/ubuntu/+source/%s' % package,
description = '%s %s is stuck in -proposed.' % (package, version)
)
task = bug.bug_tasks[0]
task.assignee = launchpad.me
task.lp_save()
print(f"Opening {bug.web_link} in browser")
print("Opening %s in browser" % bug.web_link)
webbrowser.open(bug.web_link)
return bug
def has_excuses_bugs(launchpad, package):
ubuntu = launchpad.distributions["ubuntu"]
ubuntu = launchpad.distributions['ubuntu']
pkg = ubuntu.getSourcePackage(name=package)
if not pkg:
raise ValueError(f"No such source package: {package}")
tasks = pkg.searchTasks(tags=["update-excuse"], order_by=["id"])
tasks = pkg.searchTasks(tags=['update-excuse'], order_by=['id'])
bugs = [task.bug for task in tasks]
if not bugs:
return False
if len(bugs) == 1:
print(f"There is 1 open update-excuse bug against {package}")
print("There is 1 open update-excuse bug against %s" % package)
else:
print(f"There are {len(bugs)} open update-excuse bugs against {package}")
print("There are %d open update-excuse bugs against %s" \
% (len(bugs), package))
for bug in bugs:
if claim_excuses_bug(launchpad, bug, package):
@@ -110,14 +114,17 @@ def has_excuses_bugs(launchpad, package):
def main():
parser = ArgumentParser()
parser.add_argument("-l", "--launchpad", dest="launchpad_instance", default="production")
parser.add_argument(
"-v", "--verbose", default=False, action="store_true", help="be more verbose"
)
parser.add_argument("package", nargs="?", help="act on this package only")
"-l", "--launchpad", dest="launchpad_instance", default="production")
parser.add_argument(
"-v", "--verbose", default=False, action="store_true",
help="be more verbose")
parser.add_argument(
'package', nargs='?', help="act on this package only")
args = parser.parse_args()
args.launchpad = Launchpad.login_with("pm-helper", args.launchpad_instance, version="devel")
args.launchpad = Launchpad.login_with(
"pm-helper", args.launchpad_instance, version="devel")
f = get_url(excuses_url, False)
with lzma.open(f) as lzma_f:
@@ -128,16 +135,15 @@ def main():
if not has_excuses_bugs(args.launchpad, args.package):
proposed_version = get_proposed_version(excuses, args.package)
if not proposed_version:
print(f"Package {args.package} not found in -proposed.")
print("Package %s not found in -proposed." % args.package)
sys.exit(1)
answer = YesNoQuestion().ask("Do you want to create a bug?", "no")
if answer == "yes":
create_excuses_bug(args.launchpad, args.package, proposed_version)
create_excuses_bug(args.launchpad, args.package,
proposed_version)
except ValueError as e:
sys.stderr.write(f"{e}\n")
else:
pass # for now
if __name__ == "__main__":
if __name__ == '__main__':
sys.exit(main())

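The hunk above reverts the 0.210 f-string formatting back to printf-style `%` formatting for the mantic backport. A minimal sketch, using placeholder values, showing the two forms produce identical strings:

```python
package, series = "hello", "mantic"

# f-string form used by ubuntu-dev-tools 0.210
new_style = f"{package} ({series})"

# printf-style form the backport reverts to (compatible with older Python)
old_style = "%s (%s)" % (package, series)

assert new_style == old_style == "hello (mantic)"
```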

@@ -4,7 +4,3 @@ line-length = 99
[tool.isort]
line_length = 99
profile = "black"
[tool.mypy]
disallow_incomplete_defs = true
ignore_missing_imports = true


@ -46,7 +46,7 @@ Logger = getLogger()
#
def main() -> None:
def main():
# Our usage options.
usage = "%(prog)s [options] <source package> [<target release> [base version]]"
parser = argparse.ArgumentParser(usage=usage)
@@ -153,7 +153,6 @@ def main() -> None:
import DNS # pylint: disable=import-outside-toplevel
DNS.DiscoverNameServers()
# imported earlier, pylint: disable-next=possibly-used-before-assignment
mxlist = DNS.mxlookup(bug_mail_domain)
firstmx = mxlist[0]
mailserver_host = firstmx[1]
@@ -215,7 +214,6 @@ def main() -> None:
if not args.release:
if lpapi:
# imported earlier, pylint: disable-next=possibly-used-before-assignment
args.release = Distribution("ubuntu").getDevelopmentSeries().name
else:
ubu_info = UbuntuDistroInfo()
@@ -379,7 +377,6 @@ def main() -> None:
# Map status to the values expected by LP API
mapping = {"new": "New", "confirmed": "Confirmed"}
# Post sync request using LP API
# imported earlier, pylint: disable-next=possibly-used-before-assignment
post_bug(srcpkg, subscribe, mapping[status], title, report)
else:
email_from = ubu_email(export=False)[1]


@@ -182,11 +182,8 @@ def display_verbose(package, values):
if not values:
Logger.info("No reverse dependencies found")
return
if not values.get(package):
Logger.info("No reverse dependencies found with the current filters")
return
def log_package(values, package, arch, dependency, visited, offset=0):
def log_package(values, package, arch, dependency, offset=0):
line = f"{' ' * offset}* {package}"
if all_archs and set(arch) != all_archs:
line += f" [{' '.join(sorted(arch))}]"
@@ -195,9 +192,6 @@ def display_verbose(package, values):
line += " " * (30 - len(line))
line += f" (for {dependency})"
Logger.info(line)
if package in visited:
return
visited = visited.copy().add(package)
data = values.get(package)
if data:
offset = offset + 1
@@ -208,7 +202,6 @@ def display_verbose(package, values):
rdep["Package"],
rdep.get("Architectures", all_archs),
rdep.get("Dependency"),
visited,
offset,
)
@@ -230,7 +223,6 @@ def display_verbose(package, values):
rdep["Package"],
rdep.get("Architectures", all_archs),
rdep.get("Dependency"),
{package},
)
Logger.info("")

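The `visited`-set lines dropped in the hunks above guard `display_verbose` against dependency cycles. A minimal sketch of the same pattern on a toy graph (names are hypothetical; this sketch uses a `frozenset` union rather than the diff's `copy().add(...)` idiom):

```python
def walk(graph, node, visited=frozenset()):
    # Guard against dependency cycles: stop if this node was already printed
    if node in visited:
        return []
    out = [node]
    for dep in graph.get(node, []):
        out += walk(graph, dep, visited | {node})
    return out

graph = {"a": ["b"], "b": ["a"]}  # "a" and "b" depend on each other
assert walk(graph, "a") == ["a", "b"]  # the cycle is visited only once
```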

@@ -4,45 +4,16 @@ set -eu
# Copyright 2023, Canonical Ltd.
# SPDX-License-Identifier: GPL-3.0
PYTHON_SCRIPTS=$(find . -maxdepth 1 -type f -exec grep -l '^#! */usr/bin/python3$' {} +)
PYTHON_SCRIPTS=$(grep -l -r '^#! */usr/bin/python3$' .)
run_black() {
echo "Running black..."
black -C --check --diff . ${PYTHON_SCRIPTS}
}
echo "Running black..."
black --check --diff . $PYTHON_SCRIPTS
run_isort() {
echo "Running isort..."
isort --check-only --diff .
}
echo "Running isort..."
isort --check-only --diff .
run_flake8() {
echo "Running flake8..."
flake8 --max-line-length=99 --ignore=E203,W503 . $PYTHON_SCRIPTS
}
echo "Running flake8..."
flake8 --max-line-length=99 --ignore=E203,W503 . $PYTHON_SCRIPTS
run_mypy() {
echo "Running mypy..."
mypy .
mypy --scripts-are-modules $PYTHON_SCRIPTS
}
run_pylint() {
echo "Running pylint..."
pylint "$@" $(find * -name '*.py') $PYTHON_SCRIPTS
}
if test "${1-}" = "--errors-only"; then
# Run only linters that can detect real errors (ignore formatting)
run_black || true
run_isort || true
run_flake8 || true
run_mypy
run_pylint --errors-only
else
run_black
run_isort
run_flake8
run_mypy
run_pylint
fi
echo "Running pylint..."
pylint $(find * -name '*.py') $PYTHON_SCRIPTS


@@ -4,31 +4,13 @@
# Authors:
# Andy P. Whitcroft
# Christian Ehrhardt
# Chris Peterson <chris.peterson@canonical.com>
#
# Copyright (C) 2024 Canonical Ltd.
# This program is free software: you can redistribute it and/or modify it
# under the terms of the GNU General Public License version 3, as published
# by the Free Software Foundation.
#
# This program is distributed in the hope that it will be useful, but
# WITHOUT ANY WARRANTY; without even the implied warranties of
# MERCHANTABILITY, SATISFACTORY QUALITY, or FITNESS FOR A PARTICULAR PURPOSE.
# See the GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License along
# with this program. If not, see <http://www.gnu.org/licenses/>.
"""Dumps a list of currently running tests in Autopkgtest"""
__example__ = """
Display first listed test running on amd64 hardware:
$ running-autopkgtests | grep amd64 | head -n1
R 0:01:40 systemd-upstream - focal amd64\
upstream-systemd-ci/systemd-ci - ['CFLAGS=-O0', 'DEB_BUILD_PROFILES=noudeb',\
'TEST_UPSTREAM=1', 'CONFFLAGS_UPSTREAM=--werror -Dslow-tests=true',\
'UPSTREAM_PULL_REQUEST=23153',\
'GITHUB_STATUSES_URL=https://api.github.com/repos/\
systemd/systemd/statuses/cfb0935923dff8050315b5dd22ce8ab06461ff0e']
R 0:01:40 systemd-upstream - focal amd64 upstream-systemd-ci/systemd-ci - ['CFLAGS=-O0', 'DEB_BUILD_PROFILES=noudeb', 'TEST_UPSTREAM=1', 'CONFFLAGS_UPSTREAM=--werror -Dslow-tests=true', 'UPSTREAM_PULL_REQUEST=23153', 'GITHUB_STATUSES_URL=https://api.github.com/repos/systemd/systemd/statuses/cfb0935923dff8050315b5dd22ce8ab06461ff0e']
"""
import sys
@@ -51,10 +33,16 @@ def parse_args():
formatter_class=RawDescriptionHelpFormatter,
)
parser.add_argument(
"-r", "--running", action="store_true", help="Print running autopkgtests (default: true)"
"-r",
"--running",
action="store_true",
help="Print running autopkgtests (default: true)",
)
parser.add_argument(
"-q", "--queued", action="store_true", help="Print queued autopkgtests (default: true)"
"-q",
"--queued",
action="store_true",
help="Print queued autopkgtests (default: true)",
)
options = parser.parse_args()


@@ -3,7 +3,6 @@
import glob
import pathlib
import re
from collections.abc import Sequence
from setuptools import setup
@@ -33,18 +32,17 @@ def make_pep440_compliant(version: str) -> str:
scripts = [
"backportpackage",
"bitesize",
"check-mir",
"check-symbols",
"dch-repeat",
"grab-merge",
"grep-merges",
"import-bug-from-debian",
"lp-bitesize",
"merge-changelog",
"mk-sbuild",
"pbuilder-dist",
"pbuilder-dist-simple",
"pm-helper",
"pull-pkg",
"pull-debian-debdiff",
"pull-debian-source",
@@ -77,7 +75,7 @@ scripts = [
"ubuntu-upload-permission",
"update-maintainer",
]
data_files: list[tuple[str, Sequence[str]]] = [
data_files = [
("share/bash-completion/completions", glob.glob("bash_completion/*")),
("share/man/man1", glob.glob("doc/*.1")),
("share/man/man5", glob.glob("doc/*.5")),


@@ -69,7 +69,9 @@ In Ubuntu, the attached patch was applied to achieve the following:
%s
Thanks for considering the patch.
""" % ("\n".join(entry.changes()))
""" % (
"\n".join(entry.changes())
)
return msg


@@ -22,7 +22,6 @@
import argparse
import fnmatch
import functools
import logging
import os
import shutil
@@ -144,7 +143,7 @@ def sync_dsc(
if ubuntu_ver.is_modified_in_ubuntu():
if not force:
Logger.error("--force is required to discard Ubuntu changes.")
return None
sys.exit(1)
Logger.warning(
"Overwriting modified Ubuntu version %s, setting current version to %s",
@@ -158,7 +157,7 @@ def sync_dsc(
src_pkg.pull()
except DownloadError as e:
Logger.error("Failed to download: %s", str(e))
return None
sys.exit(1)
src_pkg.unpack()
needs_fakesync = not (need_orig or ubu_pkg.verify_orig())
@@ -167,13 +166,13 @@ def sync_dsc(
Logger.warning("Performing a fakesync")
elif not needs_fakesync and fakesync:
Logger.error("Fakesync not required, aborting.")
return None
sys.exit(1)
elif needs_fakesync and not fakesync:
Logger.error(
"The checksums of the Debian and Ubuntu packages "
"mismatch. A fake sync using --fakesync is required."
)
return None
sys.exit(1)
if fakesync:
# Download Ubuntu files (override Debian source tarballs)
@@ -181,7 +180,7 @@ def sync_dsc(
ubu_pkg.pull()
except DownloadError as e:
Logger.error("Failed to download: %s", str(e))
return None
sys.exit(1)
# change into package directory
directory = src_pkg.source + "-" + new_ver.upstream_version
@@ -266,7 +265,7 @@ def sync_dsc(
returncode = subprocess.call(cmd)
if returncode != 0:
Logger.error("Source-only build with debuild failed. Please check build log above.")
return None
sys.exit(1)
def fetch_source_pkg(package, dist, version, component, ubuntu_release, mirror):
@@ -296,7 +295,7 @@ def fetch_source_pkg(package, dist, version, component, ubuntu_release, mirror):
udtexceptions.SeriesNotFoundException,
) as e:
Logger.error(str(e))
return None
sys.exit(1)
if version is None:
version = Version(debian_srcpkg.getVersion())
try:
@@ -307,7 +306,7 @@ def fetch_source_pkg(package, dist, version, component, ubuntu_release, mirror):
ubuntu_version = Version("~")
except udtexceptions.SeriesNotFoundException as e:
Logger.error(str(e))
return None
sys.exit(1)
if ubuntu_version >= version:
# The LP importer is maybe out of date
debian_srcpkg = requestsync_mail_get_debian_srcpkg(package, dist)
@@ -321,7 +320,7 @@ def fetch_source_pkg(package, dist, version, component, ubuntu_release, mirror):
ubuntu_version,
ubuntu_release,
)
return None
sys.exit(1)
if component is None:
component = debian_srcpkg.getComponent()
@@ -330,7 +329,7 @@ def fetch_source_pkg(package, dist, version, component, ubuntu_release, mirror):
return DebianSourcePackage(package, version.full_version, component, mirrors=mirrors)
def copy(src_pkg, release, bugs, sponsoree=None, simulate=False, force=False, yes=False):
def copy(src_pkg, release, bugs, sponsoree=None, simulate=False, force=False):
"""Copy a source package from Debian to Ubuntu using the Launchpad API."""
ubuntu = Distribution("ubuntu")
debian_archive = Distribution("debian").getArchive()
@@ -353,7 +352,7 @@ def copy(src_pkg, release, bugs, sponsoree=None, simulate=False, force=False, ye
"Debian version %s has not been picked up by LP yet. Please try again later.",
src_pkg.version,
)
return None
sys.exit(1)
try:
ubuntu_spph = get_ubuntu_srcpkg(src_pkg.source, ubuntu_series, ubuntu_pocket)
@@ -374,7 +373,7 @@ def copy(src_pkg, release, bugs, sponsoree=None, simulate=False, force=False, ye
base_version = ubuntu_version.get_related_debian_version()
if not force and ubuntu_version.is_modified_in_ubuntu():
Logger.error("--force is required to discard Ubuntu changes.")
return None
sys.exit(1)
# Check whether a fakesync would be required.
if not src_pkg.dsc.compare_dsc(ubuntu_pkg.dsc):
@@ -382,7 +381,7 @@ def copy(src_pkg, release, bugs, sponsoree=None, simulate=False, force=False, ye
"The checksums of the Debian and Ubuntu packages "
"mismatch. A fake sync using --fakesync is required."
)
return None
sys.exit(1)
except udtexceptions.PackageNotFoundException:
base_version = Version("~")
Logger.info(
@@ -403,7 +402,6 @@ def copy(src_pkg, release, bugs, sponsoree=None, simulate=False, force=False, ye
if sponsoree:
Logger.info("Sponsoring this sync for %s (%s)", sponsoree.display_name, sponsoree.name)
if not yes:
answer = YesNoQuestion().ask("Sync this package", "no")
if answer != "yes":
return
@@ -421,37 +419,26 @@ def copy(src_pkg, release, bugs, sponsoree=None, simulate=False, force=False, ye
except HTTPError as error:
Logger.error("HTTP Error %s: %s", error.response.status, error.response.reason)
Logger.error(error.content)
return None
sys.exit(1)
Logger.info("Request succeeded; you should get an e-mail once it is processed.")
bugs = sorted(set(bugs))
if bugs:
Logger.info("Launchpad bugs to be closed: %s", ", ".join(str(bug) for bug in bugs))
Logger.info("Please wait for the sync to be successful before closing bugs.")
if yes:
close_bugs(bugs, src_pkg.source, src_pkg.version.full_version, changes, sponsoree)
else:
answer = YesNoQuestion().ask("Close bugs", "yes")
if answer == "yes":
close_bugs(bugs, src_pkg.source, src_pkg.version.full_version, changes, sponsoree)
@functools.lru_cache(maxsize=1)
def _fetch_sync_blocklist() -> str:
url = "https://ubuntu-archive-team.ubuntu.com/sync-blocklist.txt"
with urllib.request.urlopen(url) as f:
sync_blocklist = f.read().decode("utf-8")
return sync_blocklist
def is_blocklisted(query):
"""Determine if package "query" is in the sync blocklist
Returns tuple of (blocklisted, comments)
blocklisted is one of False, 'CURRENT', 'ALWAYS'
def is_blacklisted(query):
"""Determine if package "query" is in the sync blacklist
Returns tuple of (blacklisted, comments)
blacklisted is one of False, 'CURRENT', 'ALWAYS'
"""
series = Launchpad.distributions["ubuntu"].current_series
lp_comments = series.getDifferenceComments(source_package_name=query)
blocklisted = False
blacklisted = False
comments = [
f"{c.body_text}\n -- {c.comment_author.name}"
f" {c.comment_date.strftime('%a, %d %b %Y %H:%M:%S +0000')}"
@@ -459,19 +446,17 @@ def is_blocklisted(query):
]
for diff in series.getDifferencesTo(source_package_name_filter=query):
if diff.status == "Blacklisted current version" and blocklisted != "ALWAYS":
blocklisted = "CURRENT"
if diff.status == "Blacklisted current version" and blacklisted != "ALWAYS":
blacklisted = "CURRENT"
if diff.status == "Blacklisted always":
blocklisted = "ALWAYS"
try:
sync_blocklist = _fetch_sync_blocklist()
except OSError:
print("WARNING: unable to download the sync blocklist. Erring on the side of caution.")
return ("ALWAYS", "INTERNAL ERROR: Unable to fetch sync blocklist")
blacklisted = "ALWAYS"
# Old blacklist:
url = "https://ubuntu-archive-team.ubuntu.com/sync-blacklist.txt"
with urllib.request.urlopen(url) as f:
applicable_lines = []
for line in sync_blocklist.splitlines():
for line in f:
line = line.decode("utf-8")
if not line.strip():
applicable_lines = []
continue
@@ -482,11 +467,11 @@ def is_blocklisted(query):
pass
source = line.strip()
if source and fnmatch.fnmatch(query, source):
comments += ["From sync-blocklist.txt:"] + applicable_lines
blocklisted = "ALWAYS"
comments += ["From sync-blacklist.txt:"] + applicable_lines
blacklisted = "ALWAYS"
break
return (blocklisted, comments)
return (blacklisted, comments)
def close_bugs(bugs, package, version, changes, sponsoree):
@@ -523,12 +508,6 @@ def parse():
epilog = f"See {os.path.basename(sys.argv[0])}(1) for more info."
parser = argparse.ArgumentParser(usage=usage, epilog=epilog)
parser.add_argument(
"-y",
"--yes",
action="store_true",
help="Automatically sync without prompting. Use with caution and care.",
)
parser.add_argument("-d", "--distribution", help="Debian distribution to sync from.")
parser.add_argument("-r", "--release", help="Specify target Ubuntu release.")
parser.add_argument("-V", "--debian-version", help="Specify the version to sync from.")
@@ -733,38 +712,36 @@ def main():
args.release,
args.debian_mirror,
)
if not src_pkg:
continue
blocklisted, comments = is_blocklisted(src_pkg.source)
blocklist_fail = False
if blocklisted:
blacklisted, comments = is_blacklisted(src_pkg.source)
blacklist_fail = False
if blacklisted:
messages = []
if blocklisted == "CURRENT":
if blacklisted == "CURRENT":
Logger.debug(
"Source package %s is temporarily blocklisted "
"(blocklisted_current). "
"Source package %s is temporarily blacklisted "
"(blacklisted_current). "
"Ubuntu ignores these for now. "
"See also LP: #841372",
src_pkg.source,
)
else:
if args.fakesync:
messages += ["Doing a fakesync, overriding blocklist."]
messages += ["Doing a fakesync, overriding blacklist."]
else:
blocklist_fail = True
blacklist_fail = True
messages += [
"If this package needs a fakesync, use --fakesync",
"If you think this package shouldn't be "
"blocklisted, please file a bug explaining your "
"blacklisted, please file a bug explaining your "
"reasoning and subscribe ~ubuntu-archive.",
]
if blocklist_fail:
Logger.error("Source package %s is blocklisted.", src_pkg.source)
elif blocklisted == "ALWAYS":
Logger.info("Source package %s is blocklisted.", src_pkg.source)
if blacklist_fail:
Logger.error("Source package %s is blacklisted.", src_pkg.source)
elif blacklisted == "ALWAYS":
Logger.info("Source package %s is blacklisted.", src_pkg.source)
if messages:
for message in messages:
for line in textwrap.wrap(message):
@@ -776,17 +753,14 @@ def main():
for line in textwrap.wrap(comment):
Logger.info(" %s", line)
if blocklist_fail:
continue
if blacklist_fail:
sys.exit(1)
if args.lp:
if not copy(
src_pkg, args.release, args.bugs, sponsoree, args.simulate, args.force, args.yes
):
continue
copy(src_pkg, args.release, args.bugs, sponsoree, args.simulate, args.force)
else:
os.environ["DEB_VENDOR"] = "Ubuntu"
if not sync_dsc(
sync_dsc(
src_pkg,
args.distribution,
args.release,
@ -798,8 +772,7 @@ def main():
args.simulate,
args.force,
args.fakesync,
):
continue
)
if __name__ == "__main__":

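The `_fetch_sync_blocklist` helper removed by the backport above memoizes the blocklist download with `functools.lru_cache(maxsize=1)`, so repeated `is_blocklisted` calls reuse one fetch. A toy sketch of that pattern (the function body is a stand-in for the real `urllib` download):

```python
import functools

calls = {"count": 0}

@functools.lru_cache(maxsize=1)
def fetch_blocklist() -> str:
    # Stand-in for downloading sync-blocklist.txt over HTTP
    calls["count"] += 1
    return "linux\nglibc\n"

fetch_blocklist()
fetch_blocklist()  # second call is served from the cache
assert calls["count"] == 1
```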

@@ -28,8 +28,9 @@
import argparse
import sys
import lazr.restfulclient.errors
from launchpadlib.credentials import TokenAuthorizationException
from launchpadlib.launchpad import Launchpad
import lazr.restfulclient.errors
from ubuntutools import getLogger
from ubuntutools.lp.udtexceptions import PocketDoesNotExistError
@@ -38,7 +39,7 @@ from ubuntutools.misc import split_release_pocket
Logger = getLogger()
def get_build_states(pkg, archs):
def getBuildStates(pkg, archs):
res = []
for build in pkg.getBuilds():
@@ -47,8 +48,7 @@ def get_build_states(pkg, archs):
msg = "\n".join(res)
return f"Build state(s) for '{pkg.source_package_name}':\n{msg}"
def rescore_builds(pkg, archs, score):
def rescoreBuilds(pkg, archs, score):
res = []
for build in pkg.getBuilds():
@@ -61,19 +61,18 @@ def rescore_builds(pkg, archs, score):
res.append(f" {arch}: done")
except lazr.restfulclient.errors.Unauthorized:
Logger.error(
"You don't have the permissions to rescore builds."
" Ignoring your rescore request."
"You don't have the permissions to rescore builds. Ignoring your rescore request."
)
return None
except lazr.restfulclient.errors.BadRequest:
Logger.info("Cannot rescore build of %s on %s.", build.source_package_name, arch)
Logger.info("Cannot rescore build of %s on %s.",
build.source_package_name, arch)
res.append(f" {arch}: failed")
msg = "\n".join(res)
return f"Rescoring builds of '{pkg.source_package_name}' to {score}:\n{msg}"
def retry_builds(pkg, archs):
def retryBuilds(pkg, archs):
res = []
for build in pkg.getBuilds():
arch = build.arch_tag
@@ -87,13 +86,26 @@ def retry_builds(pkg, archs):
return f"Retrying builds of '{pkg.source_package_name}':\n{msg}"
def parse_args(argv: list[str], valid_archs: set[str]) -> argparse.Namespace:
"""Parse command line arguments and return namespace."""
def main():
# Usage.
usage = "%(prog)s <srcpackage> <release> <operation>\n\n"
usage += "Where operation may be one of: rescore, retry, or status.\n"
usage += "Only Launchpad Buildd Admins may rescore package builds."
# Valid architectures.
valid_archs = set(
[
"armhf",
"arm64",
"amd64",
"i386",
"powerpc",
"ppc64el",
"riscv64",
"s390x",
]
)
# Prepare our option parser.
parser = argparse.ArgumentParser(usage=usage)
@@ -106,7 +118,8 @@ def parse_args(argv: list[str], valid_archs: set[str]) -> argparse.Namespace:
f"include: {', '.join(valid_archs)}.",
)
parser.add_argument("-A", "--archive", help="operate on ARCHIVE", default="ubuntu")
parser.add_argument("-A", "--archive", help="operate on ARCHIVE",
default="ubuntu")
# Batch processing options
batch_options = parser.add_argument_group(
@@ -135,35 +148,20 @@ def parse_args(argv: list[str], valid_archs: set[str]) -> argparse.Namespace:
help="Rescore builds to <priority>.",
)
batch_options.add_argument(
"--state",
action="store",
dest="state",
"--state", action="store", dest="state",
help="Act on builds that are in the specified state",
)
parser.add_argument("packages", metavar="package", nargs="*", help=argparse.SUPPRESS)
# Parse our options.
args = parser.parse_args(argv)
args = parser.parse_args()
if not args.batch:
# Check we have the correct number of arguments.
if len(args.packages) < 3:
parser.error("Incorrect number of arguments.")
launchpad = Launchpad.login_with("ubuntu-dev-tools", "production",
version="devel")
me = launchpad.me
return args
def main():
# Valid architectures.
valid_archs = set(
["armhf", "arm64", "amd64", "amd64v3", "i386", "powerpc", "ppc64el", "riscv64", "s390x"]
)
args = parse_args(sys.argv[1:], valid_archs)
launchpad = Launchpad.login_with("ubuntu-dev-tools", "production", version="devel")
ubuntu = launchpad.distributions["ubuntu"]
ubuntu = launchpad.distributions['ubuntu']
if args.batch:
release = args.series
@@ -171,21 +169,29 @@ def main():
# ppas don't have a proposed pocket so just use the release pocket;
# but for the main archive we default to -proposed
release = ubuntu.getDevelopmentSeries()[0].name
if args.archive == "ubuntu":
release = f"{release}-proposed"
if args.archive == 'ubuntu':
release = release + "-proposed"
try:
release, pocket = split_release_pocket(release)
(release, pocket) = split_release_pocket(release)
except PocketDoesNotExistError as error:
Logger.error(error)
sys.exit(1)
else:
# Check we have the correct number of arguments.
if len(args.packages) < 3:
parser.error("Incorrect number of arguments.")
try:
package = str(args.packages[0]).lower()
release = str(args.packages[1]).lower()
operation = str(args.packages[2]).lower()
except IndexError:
parser.print_help()
sys.exit(1)
archive = launchpad.archives.getByReference(reference=args.archive)
try:
distroseries = ubuntu.getSeries(name_or_version=release.split("-")[0])
distroseries = ubuntu.getSeries(name_or_version=release)
except lazr.restfulclient.errors.NotFound as error:
Logger.error(error)
sys.exit(1)
@@ -209,7 +215,7 @@ def main():
# split release and pocket
try:
release, pocket = split_release_pocket(release)
(release, pocket) = split_release_pocket(release)
except PocketDoesNotExistError as error:
Logger.error(error)
sys.exit(1)
@@ -221,9 +227,8 @@ def main():
exact_match=True,
pocket=pocket,
source_name=package,
status="Published",
)[0]
except IndexError:
status='Published')[0]
except IndexError as error:
Logger.error("No publication found for package %s", package)
sys.exit(1)
# Get list of builds for that package.
@@ -238,13 +243,14 @@ def main():
# are in place.
if operation == "retry":
necessary_privs = archive.checkUpload(
component=component,
component=sources.getComponent(),
distroseries=distroseries,
person=launchpad.me,
pocket=pocket,
sourcepackagename=sources.source_package_name,
sourcepackagename=sources.getPackageName(),
)
if not necessary_privs:
if operation == "retry" and not necessary_privs:
Logger.error(
"You cannot perform the %s operation on a %s package as you"
" do not have the permissions to do this action.",
@@ -282,8 +288,7 @@ def main():
build.rescore(score=priority)
except lazr.restfulclient.errors.Unauthorized:
Logger.error(
"You don't have the permissions to rescore builds."
" Ignoring your rescore request."
"You don't have the permissions to rescore builds. Ignoring your rescore request."
)
break
else:
@@ -320,22 +325,24 @@ def main():
if not args.state:
if args.retry:
args.state = "Failed to build"
args.state='Failed to build'
elif args.priority:
args.state = "Needs building"
args.state='Needs building'
# there is no equivalent to series.getBuildRecords() for a ppa.
# however, we don't want to have to traverse all build records for
# all series when working on the main archive, so we use
# series.getBuildRecords() for ubuntu and handle ppas separately
series = ubuntu.getSeries(name_or_version=release)
if args.archive == "ubuntu":
builds = series.getBuildRecords(build_state=args.state, pocket=pocket)
if args.archive == 'ubuntu':
builds = series.getBuildRecords(build_state=args.state,
pocket=pocket)
else:
builds = []
for build in archive.getBuildRecords(build_state=args.state, pocket=pocket):
for build in archive.getBuildRecords(build_state=args.state,
pocket=pocket):
if not build.current_source_publication:
continue
if build.current_source_publication.distro_series == series:
if build.current_source_publication.distro_series==series:
builds.append(build)
for build in builds:
if build.arch_tag not in archs:
@@ -354,8 +361,9 @@ def main():
)
if args.retry and not can_retry:
Logger.error(
"You don't have the permissions to retry the build of '%s', skipping.",
build.source_package_name,
"You don't have the permissions to retry the "
"build of '%s', skipping.",
build.source_package_name
)
continue
Logger.info(
@@ -363,22 +371,14 @@ def main():
build.source_package_name,
release,
pocket,
build.source_package_version,
build.source_package_version
)
if args.retry and build.can_be_retried:
Logger.info(
"Retrying build of %s on %s...", build.source_package_name, build.arch_tag
)
try:
Logger.info("Retrying build of %s on %s...",
build.source_package_name, build.arch_tag)
build.retry()
retry_count += 1
except lazr.restfulclient.errors.BadRequest:
Logger.info(
"Failed to retry build of %s on %s",
build.source_package_name,
build.arch_tag,
)
if args.priority and can_rescore:
if build.can_be_rescored:
@@ -386,22 +386,19 @@ def main():
build.rescore(score=args.priority)
except lazr.restfulclient.errors.Unauthorized:
Logger.error(
"You don't have the permissions to rescore builds."
" Ignoring your rescore request."
"You don't have the permissions to rescore builds. Ignoring your rescore request."
)
can_rescore = False
except lazr.restfulclient.errors.BadRequest:
Logger.info(
"Cannot rescore build of %s on %s.",
build.source_package_name,
build.arch_tag,
)
Logger.info("Cannot rescore build of %s on %s.",
build.source_package_name, build.arch_tag)
Logger.info("")
if args.retry:
Logger.info("%d package builds retried", retry_count)
sys.exit(0)
for pkg in args.packages:
try:
pkg = archive.getPublishedSources(
@@ -409,9 +406,8 @@ def main():
exact_match=True,
pocket=pocket,
source_name=pkg,
status="Published",
)[0]
except IndexError:
status='Published')[0]
except IndexError as error:
Logger.error("No publication found for package %s", pkg)
continue
@@ -439,14 +435,15 @@ def main():
pkg.source_package_version,
)
Logger.info(get_build_states(pkg, archs))
Logger.info(getBuildStates(pkg, archs))
if can_retry:
Logger.info(retry_builds(pkg, archs))
Logger.info(retryBuilds(pkg, archs))
if args.priority:
Logger.info(rescore_builds(pkg, archs, args.priority))
Logger.info(rescoreBuilds(pkg, archs, args.priority))
Logger.info("")
if __name__ == "__main__":
main()


@@ -65,7 +65,7 @@ def main():
err = True
continue
Logger.info("%s%s", prefix, version)
Logger.info(prefix + version)
if err:
sys.exit(1)


@@ -165,7 +165,6 @@ class SourcePackage(ABC):
series = kwargs.get("series")
pocket = kwargs.get("pocket")
status = kwargs.get("status")
arch = kwargs.get("arch")
verify_signature = kwargs.get("verify_signature", False)
try_binary = kwargs.get("try_binary", True)
@@ -185,7 +184,6 @@ class SourcePackage(ABC):
self._series = series
self._pocket = pocket
self._status = status
self._arch = arch
# dscfile can be either a path or an URL. misc.py's download() will
# later figure it out
self._dsc_source = dscfile
@@ -254,7 +252,6 @@ class SourcePackage(ABC):
)
try:
params["archtag"] = self._arch
bpph = archive.getBinaryPackage(self.source, **params)
except PackageNotFoundException as bpnfe:
# log binary lookup failure, in case it provides hints
@@ -340,9 +337,11 @@ class SourcePackage(ABC):
def _archive_servers(self):
"Generator for mirror and master servers"
# Always provide the mirrors first
yield from self.mirrors
for server in self.mirrors:
yield server
# Don't repeat servers that are in both mirrors and masters
yield from set(self.masters) - set(self.mirrors)
for server in set(self.masters) - set(self.mirrors):
yield server
def _source_urls(self, name):
"Generator of sources for name"
@@ -544,7 +543,7 @@ class SourcePackage(ABC):
Return the debdiff filename.
"""
cmd = ["debdiff", self.dsc_name, newpkg.dsc_name]
difffn = f"{newpkg.dsc_name[:-3]}debdiff"
difffn = newpkg.dsc_name[:-3] + "debdiff"
Logger.debug("%s > %s", " ".join(cmd), difffn)
with open(difffn, "w", encoding="utf-8") as f:
if subprocess.call(cmd, stdout=f, cwd=str(self.workdir)) > 2:
@@ -633,7 +632,8 @@ class DebianSourcePackage(SourcePackage):
def _source_urls(self, name):
"Generator of sources for name"
yield from super()._source_urls(name)
for url in super()._source_urls(name):
yield url
if name in self.snapshot_files:
yield self.snapshot_files[name]
@@ -731,14 +731,13 @@ class PersonalPackageArchiveSourcePackage(UbuntuSourcePackage):
class UbuntuCloudArchiveSourcePackage(PersonalPackageArchiveSourcePackage):
"Download / unpack an Ubuntu Cloud Archive source package"
TEAM = "ubuntu-cloud-archive"
PROJECT = "cloud-archive"
VALID_POCKETS = ["updates", "proposed", "staging"]
def __init__(self, *args, **kwargs):
# Need to determine actual series/pocket ppa now, as it affects getArchive()
series, pocket = self._findReleaseAndPocketForPackage(
(series, pocket) = self._findReleaseAndPocketForPackage(
kwargs.get("series"),
kwargs.get("pocket"),
kwargs.get("package"),
@@ -921,15 +920,15 @@ class UbuntuCloudArchiveSourcePackage(PersonalPackageArchiveSourcePackage):
if version:
params["version"] = version
if ppa.getPublishedSources(**params):
ppa_release, _, ppa_pocket = ppa.name.partition("-")
(ppa_release, _, ppa_pocket) = ppa.name.partition("-")
return (ppa_release, ppa_pocket)
# package/version not found in any ppa
return default
class _WebJSON:
def getHostUrl(self):
raise NotImplementedError(f"{self.__class__.__name__}.getHostUrl() is not implemented")
def getHostUrl(self): # pylint: disable=no-self-use
raise Exception("Not implemented")
def load(self, path=""):
reader = codecs.getreader("utf-8")
@@ -1343,7 +1342,7 @@ class SnapshotSPPH:
self.getComponent(),
subdir,
name,
f"{name}_{pkgversion}",
name + "_" + pkgversion,
"changelog.txt",
)
try:

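The `_archive_servers` hunk in the section above replaces `yield from` with explicit `for ... yield` loops; the two generator forms are equivalent, including the `set` subtraction that skips masters already listed as mirrors. A small sketch with hypothetical server names:

```python
def archive_servers(mirrors, masters):
    # Always provide the mirrors first
    yield from mirrors
    # Don't repeat servers that are in both mirrors and masters
    yield from set(masters) - set(mirrors)

servers = list(archive_servers(["mirror1"], ["mirror1", "master"]))
assert servers == ["mirror1", "master"]
```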

@@ -71,8 +71,8 @@ class Pbuilder(Builder):
cmd = [
"sudo",
"-E",
f"ARCH={self.architecture}",
f"DIST={dist}",
"ARCH=" + self.architecture,
"DIST=" + dist,
self.name,
"--build",
"--architecture",
@@ -91,8 +91,8 @@ class Pbuilder(Builder):
cmd = [
"sudo",
"-E",
f"ARCH={self.architecture}",
f"DIST={dist}",
"ARCH=" + self.architecture,
"DIST=" + dist,
self.name,
"--update",
"--architecture",
@@ -140,7 +140,7 @@ class Sbuild(Builder):
workdir = os.getcwd()
Logger.debug("cd %s", result_directory)
os.chdir(result_directory)
cmd = ["sbuild", "--arch-all", f"--dist={dist}", f"--arch={self.architecture}", dsc_file]
cmd = ["sbuild", "--arch-all", "--dist=" + dist, "--arch=" + self.architecture, dsc_file]
Logger.debug(" ".join(cmd))
returncode = subprocess.call(cmd)
Logger.debug("cd %s", workdir)


@@ -50,7 +50,7 @@ class UDTConfig:
"KEYID": None,
}
# Populated from the configuration files:
config: dict[str, str] = {}
config = {}
def __init__(self, no_conf=False, prefix=None):
self.no_conf = no_conf
@@ -61,26 +61,28 @@ class UDTConfig:
self.config = self.parse_devscripts_config()
@staticmethod
def parse_devscripts_config() -> dict[str, str]:
def parse_devscripts_config():
"""Read the devscripts configuration files, and return the values as a
dictionary
"""
config = {}
for filename in ("/etc/devscripts.conf", "~/.devscripts"):
try:
with open(os.path.expanduser(filename), "r", encoding="utf-8") as f:
content = f.read()
f = open(os.path.expanduser(filename), "r", encoding="utf-8")
except IOError:
continue
try:
tokens = shlex.split(content, comments=True)
except ValueError as e:
Logger.error("Error parsing %s: %s", filename, e)
continue
for token in tokens:
if "=" in token:
key, value = token.split("=", 1)
for line in f:
parsed = shlex.split(line, comments=True)
if len(parsed) > 1:
Logger.warning(
"Cannot parse variable assignment in %s: %s",
getattr(f, "name", "<config>"),
line,
)
if len(parsed) >= 1 and "=" in parsed[0]:
key, value = parsed[0].split("=", 1)
config[key] = value
f.close()
return config
def get_value(self, key, default=None, boolean=False, compat_keys=()):
@@ -97,9 +99,9 @@ class UDTConfig:
if default is None and key in self.defaults:
default = self.defaults[key]
keys = [f"{self.prefix}_{key}"]
keys = [self.prefix + "_" + key]
if key in self.defaults:
keys.append(f"UBUNTUTOOLS_{key}")
keys.append("UBUNTUTOOLS_" + key)
keys += compat_keys
for k in keys:
@@ -112,9 +114,9 @@ class UDTConfig:
else:
continue
if k in compat_keys:
replacements = f"{self.prefix}_{key}"
replacements = self.prefix + "_" + key
if key in self.defaults:
replacements += f"or UBUNTUTOOLS_{key}"
replacements += "or UBUNTUTOOLS_" + key
Logger.warning(
"Using deprecated configuration variable %s. You should use %s.",
k,
@@ -178,7 +180,7 @@ def ubu_email(name=None, email=None, export=True):
mailname = socket.getfqdn()
if os.path.isfile("/etc/mailname"):
mailname = open("/etc/mailname", "r", encoding="utf-8").read().strip()
email = f"{pwd.getpwuid(os.getuid()).pw_name}@{mailname}"
email = pwd.getpwuid(os.getuid()).pw_name + "@" + mailname
if export:
os.environ["DEBFULLNAME"] = name
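For reference, the whole-content shlex parsing shown in the `parse_devscripts_config` hunk above can be sketched standalone (a minimal sketch with a hypothetical function name; the real method reads `/etc/devscripts.conf` and `~/.devscripts` rather than a string):

```python
import shlex


def parse_config_text(content: str) -> dict[str, str]:
    """Parse KEY=value assignments from devscripts-style config text."""
    config = {}
    # shlex honours shell quoting and, with comments=True, skips '#' comments
    for token in shlex.split(content, comments=True):
        if "=" in token:
            key, value = token.split("=", 1)
            config[key] = value
    return config
```

Quoting is stripped exactly as a shell would, so `BUILDER="pbuilder"` yields the value `pbuilder`.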


@@ -26,7 +26,6 @@ import logging
import os
import re
from copy import copy
from typing import Any
from urllib.error import URLError
from urllib.parse import urlparse
@@ -140,7 +139,7 @@ class BaseWrapper(metaclass=MetaWrapper):
A base class from which other wrapper classes are derived.
"""
resource_type: str | tuple[str, str] = "" # it's a base class after all
resource_type: str = None # it's a base class after all
def __new__(cls, data):
if isinstance(data, str) and data.startswith(str(Launchpad._root_uri)):
@@ -158,7 +157,7 @@ class BaseWrapper(metaclass=MetaWrapper):
pass
if isinstance(data, Entry):
service_root, resource_type = data.resource_type_link.split("#")
(service_root, resource_type) = data.resource_type_link.split("#")
if service_root == str(Launchpad._root_uri) and resource_type in cls.resource_type:
# check if it's already cached
cached = cls._cache.get(data.self_link)
@@ -291,8 +290,9 @@ class Distribution(BaseWrapper):
Returns a list of all DistroSeries objects.
"""
if not self._have_all_series:
for series in self.series:
self._cache_series(DistroSeries(series))
for series in Launchpad.load(self.series_collection_link).entries:
series_link = DistroSeries(series["self_link"])
self._cache_series(series_link)
self._have_all_series = True
allseries = filter(lambda s: s.active, self._series.values())
@@ -668,19 +668,20 @@ class Archive(BaseWrapper):
rversion = getattr(record, "binary_package_version", None)
else:
rversion = getattr(record, "source_package_version", None)
skipmsg = f"Skipping version {rversion}: "
if record.pocket not in pockets:
err_msg = f"pocket {record.pocket} not in ({','.join(pockets)})"
Logger.debug("Skipping version %s: %s", rversion, err_msg)
Logger.debug(skipmsg + err_msg)
continue
if record.status not in statuses:
err_msg = f"status {record.status} not in ({','.join(statuses)})"
Logger.debug("Skipping version %s: %s", rversion, err_msg)
Logger.debug(skipmsg + err_msg)
continue
release = wrapper(record)
if binary and archtag and archtag != release.arch:
err_msg = f"arch {release.arch} does not match requested arch {archtag}"
Logger.debug("Skipping version %s: %s", rversion, err_msg)
Logger.debug(skipmsg + err_msg)
continue
# results are ordered so first is latest
cache[index] = release
@@ -882,7 +883,7 @@ class SourcePackagePublishingHistory(BaseWrapper):
"""
release = self.getSeriesName()
if self.pocket != "Release":
release += f"-{self.pocket.lower()}"
release += "-" + self.pocket.lower()
return release
def getArchive(self):
@@ -1043,9 +1044,9 @@ class SourcePackagePublishingHistory(BaseWrapper):
# strip out the URL leading text.
filename = os.path.basename(urlparse(url).path)
# strip the file suffix
pkgname, _, extension = filename.rpartition(".")
(pkgname, _, extension) = filename.rpartition(".")
# split into name, version, arch
name_, _, arch_ = pkgname.rsplit("_", 2)
(name_, _, arch_) = pkgname.rsplit("_", 2)
# arch 'all' has separate bpph for each real arch,
# but all point to the same binary url
if arch_ == "all":
@@ -1405,7 +1406,10 @@ class PersonTeam(BaseWrapper, metaclass=MetaPersonTeam):
def getPPAs(self):
if self._ppas is None:
ppas = [Archive(ppa) for ppa in self._lpobject.ppas]
ppas = [
Archive(ppa["self_link"])
for ppa in Launchpad.load(self._lpobject.ppas_collection_link).entries
]
self._ppas = {ppa.name: ppa for ppa in ppas}
return self._ppas
@@ -1430,7 +1434,10 @@ class Project(BaseWrapper):
The list will be sorted by date_created, in descending order.
"""
if not self._series:
series = [ProjectSeries(s) for s in self._lpobject.series]
series = [
ProjectSeries(s["self_link"])
for s in Launchpad.load(self._lpobject.series_collection_link).entries
]
self._series = sorted(series, key=lambda s: s.date_created, reverse=True)
return self._series.copy()
@@ -1502,7 +1509,7 @@ class Packageset(BaseWrapper): # pylint: disable=too-few-public-methods
resource_type = "packageset"
_lp_packagesets = None
_source_sets: dict[tuple[str, str | None, bool], Any] = {}
_source_sets = {}
@classmethod
def setsIncludingSource(cls, sourcepackagename, distroseries=None, direct_inclusion=False):


@@ -169,7 +169,7 @@ def split_release_pocket(release, default="Release"):
raise ValueError("No release name specified")
if "-" in release:
release, pocket = release.rsplit("-", 1)
(release, pocket) = release.rsplit("-", 1)
pocket = pocket.capitalize()
if pocket not in POCKETS:
@@ -342,7 +342,7 @@ def download(src, dst, size=0, *, blocksize=DOWNLOAD_BLOCKSIZE_DEFAULT):
shutil.copyfile(src, dst)
return dst
src, username, password = extract_authentication(src)
(src, username, password) = extract_authentication(src)
auth = (username, password) if username or password else None
with tempfile.TemporaryDirectory() as tmpdir:
@@ -385,7 +385,7 @@ class _StderrProgressBar:
pctstr = f"{pct:>3}%"
barlen = self.width * pct // 100
barstr = "=" * barlen
barstr = f"{barstr[:-1]}>"
barstr = barstr[:-1] + ">"
barstr = barstr.ljust(self.width)
fullstr = f"\r[{barstr}]{pctstr}"
sys.stderr.write(fullstr)
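The release/pocket splitting from the `split_release_pocket` hunk at the top of this file can be sketched as a standalone helper (simplified sketch; the real function's pocket list and error type may differ):

```python
POCKETS = ("Release", "Security", "Updates", "Proposed", "Backports")


def split_release_pocket(release: str, default: str = "Release") -> tuple[str, str]:
    """Split e.g. 'jammy-proposed' into ('jammy', 'Proposed')."""
    pocket = default
    if "-" in release:
        # rsplit keeps hyphens in the series name intact, e.g. 'foo-bar-updates'
        release, pocket = release.rsplit("-", 1)
        pocket = pocket.capitalize()
    if pocket not in POCKETS:
        raise ValueError(f"Unknown pocket: {pocket}")
    return release, pocket
```

A bare series name falls through to the default pocket, so `split_release_pocket("noble")` returns `("noble", "Release")`.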


@@ -264,7 +264,7 @@ class PullPkg:
return UbuntuCloudArchiveSourcePackage.parseReleaseAndPocket(release)
# Check if release[-pocket] is specified
release, pocket = split_release_pocket(release, default=None)
(release, pocket) = split_release_pocket(release, default=None)
Logger.debug("Parsed release '%s' pocket '%s'", release, pocket)
if distro == DISTRO_DEBIAN:
@@ -291,7 +291,7 @@ class PullPkg:
# Verify specified release is valid, and params in correct order
pocket = None
try:
release, pocket = self.parse_release(distro, release)
(release, pocket) = self.parse_release(distro, release)
except (SeriesNotFoundException, PocketDoesNotExistError):
if try_swap:
Logger.debug("Param '%s' not valid series, must be version", release)
@@ -340,10 +340,9 @@ class PullPkg:
params = {}
params["package"] = options["package"]
params["arch"] = options["arch"]
if options["release"]:
release, version, pocket = self.parse_release_and_version(
(release, version, pocket) = self.parse_release_and_version(
distro, options["release"], options["version"]
)
params["series"] = release
@@ -436,7 +435,7 @@ class PullPkg:
if options["upload_queue"]:
# upload queue API is different/simpler
self.pull_upload_queue( # pylint: disable=missing-kwoa
pull, download_only=options["download_only"], **params
pull, arch=options["arch"], download_only=options["download_only"], **params
)
return
@@ -454,7 +453,7 @@ class PullPkg:
if key.startswith("vcs-"):
if key == "vcs-browser":
continue
if key == "vcs-git":
elif key == "vcs-git":
vcs = "Git"
elif key == "vcs-bzr":
vcs = "Bazaar"
@@ -463,26 +462,19 @@ class PullPkg:
uri = srcpkg.dsc[original_key]
Logger.warning(
"\nNOTICE: '%s' packaging is maintained in "
"the '%s' version control system at:\n %s\n",
package,
vcs,
uri,
)
Logger.warning("\nNOTICE: '%s' packaging is maintained in "
"the '%s' version control system at:\n"
" %s\n" % (package, vcs, uri))
vcscmd = ""
if vcs == "Bazaar":
vcscmd = " $ bzr branch " + uri
elif vcs == "Git":
vcscmd = " $ git clone " + uri
if vcscmd:
Logger.info(
"Please use:\n%s\n"
"to retrieve the latest (possibly unreleased) updates to the package.\n",
vcscmd,
)
Logger.info(f"Please use:\n{vcscmd}\n"
"to retrieve the latest (possibly unreleased) "
"updates to the package.\n")
if pull == PULL_LIST:
Logger.info("Source files:")


@@ -31,9 +31,9 @@ class Question:
def get_options(self):
if len(self.options) == 2:
options = f"{self.options[0]} or {self.options[1]}"
options = self.options[0] + " or " + self.options[1]
else:
options = f"{', '.join(self.options[:-1])}, or {self.options[-1]}"
options = ", ".join(self.options[:-1]) + ", or " + self.options[-1]
return options
def ask(self, question, default=None):
@@ -67,7 +67,7 @@ class Question:
if selected == option[0]:
selected = option
if selected not in self.options:
print(f"Please answer the question with {self.get_options()}.")
print("Please answer the question with " + self.get_options() + ".")
return selected
@@ -170,7 +170,7 @@ class EditBugReport(EditFile):
split_re = re.compile(r"^Summary.*?:\s+(.*?)\s+Description:\s+(.*)$", re.DOTALL | re.UNICODE)
def __init__(self, subject, body, placeholders=None):
prefix = f"{os.path.basename(sys.argv[0])}_"
prefix = os.path.basename(sys.argv[0]) + "_"
tmpfile = tempfile.NamedTemporaryFile(prefix=prefix, suffix=".txt", delete=False)
tmpfile.write((f"Summary (one line):\n{subject}\n\nDescription:\n{body}").encode("utf-8"))
tmpfile.close()


@@ -76,11 +76,13 @@ def need_sponsorship(name, component, release):
need_sponsor = not PersonTeam.me.canUploadPackage(archive, distroseries, name, component)
if need_sponsor:
print("""You are not able to upload this package directly to Ubuntu.
print(
"""You are not able to upload this package directly to Ubuntu.
Your sync request shall require an approval by a member of the appropriate
sponsorship team, who shall be subscribed to this bug report.
This must be done before it can be processed by a member of the Ubuntu Archive
team.""")
team."""
)
confirmation_prompt()
return need_sponsor


@@ -62,17 +62,8 @@ def get_debian_srcpkg(name, release):
return DebianSourcePackage(package=name, series=release).lp_spph
def get_ubuntu_srcpkg(name, release, pocket="Proposed"):
srcpkg = UbuntuSourcePackage(package=name, series=release, pocket=pocket)
try:
return srcpkg.lp_spph
except PackageNotFoundException:
if pocket != "Release":
parent_pocket = "Release"
if pocket == "Updates":
parent_pocket = "Proposed"
return get_ubuntu_srcpkg(name, release, parent_pocket)
raise
def get_ubuntu_srcpkg(name, release):
return UbuntuSourcePackage(package=name, series=release).lp_spph
def need_sponsorship(name, component, release):
@@ -192,7 +183,7 @@ Content-Type: text/plain; charset=UTF-8
backup = tempfile.NamedTemporaryFile(
mode="w",
delete=False,
prefix=f"requestsync-{re.sub('[^a-zA-Z0-9_-]', '', bugtitle.replace(' ', '_'))}",
prefix="requestsync-" + re.sub(r"[^a-zA-Z0-9_-]", "", bugtitle.replace(" ", "_")),
)
with backup:
backup.write(mail)


@@ -1,19 +1,18 @@
# Copyright (C) 2024 Canonical Ltd.
# Author: Chris Peterson <chris.peterson@canonical.com>
# Author: Andy P. Whitcroft
# Author: Christian Ehrhardt
#
# This program is free software: you can redistribute it and/or modify it
# under the terms of the GNU General Public License version 3, as published
# by the Free Software Foundation.
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as
# published by the Free Software Foundation, either version 3 of the
# License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful, but
# WITHOUT ANY WARRANTY; without even the implied warranties of
# MERCHANTABILITY, SATISFACTORY QUALITY, or FITNESS FOR A PARTICULAR PURPOSE.
# See the GNU General Public License for more details.
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
#
# You should have received a copy of the GNU General Public License along
# with this program. If not, see <http://www.gnu.org/licenses/>.
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
import datetime
import json
@@ -26,7 +25,10 @@ URL_QUEUED = "http://autopkgtest.ubuntu.com/queues.json"
def _get_jobs(url: str) -> dict:
request = urllib.request.Request(url, headers={"Cache-Control": "max-age-0"})
request = urllib.request.Request(
url,
headers={"Cache-Control": "max-age-0"},
)
with urllib.request.urlopen(request) as response:
data = response.read()
jobs = json.loads(data.decode("utf-8"))
@@ -49,10 +51,7 @@ def get_running():
env = jobinfo[0].get("env", "-")
time = str(datetime.timedelta(seconds=jobinfo[1]))
try:
line = (
f"R {time:6} {pkg:30} {'-':10} {series:8} {arch:8}"
f" {ppas:31} {triggers} {env}\n"
)
line = f"R {time:6} {pkg:30} {'-':10} {series:8} {arch:8} {ppas:31} {triggers} {env}\n"
running.append((jobinfo[1], line))
except BrokenPipeError:
sys.exit(1)
@@ -75,7 +74,7 @@ def get_queued():
if key == "private job":
pkg = triggers = ppas = "private job"
else:
pkg, json_data = key.split(maxsplit=1)
(pkg, json_data) = key.split(maxsplit=1)
try:
jobinfo = json.loads(json_data)
triggers = ",".join(jobinfo.get("triggers", "-"))
@@ -86,10 +85,7 @@ def get_queued():
n = n + 1
try:
output += (
f"Q{n:04d} {'-:--':>6} {pkg:30} {origin:10} {series:8} {arch:8}"
f" {ppas:31} {triggers}\n"
)
output += f"Q{n:04d} {'-:--':>6} {pkg:30} {origin:10} {series:8} {arch:8} {ppas:31} {triggers}\n"
except BrokenPipeError:
sys.exit(1)
return output


@@ -16,7 +16,6 @@
# OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE.
import sys
from typing import NoReturn
from ubuntutools.question import Question, YesNoQuestion
@@ -43,7 +42,7 @@ def ask_for_manual_fixing():
user_abort()
def user_abort() -> NoReturn:
def user_abort():
"""Print abort and quit the program."""
print("User abort.")


@@ -17,7 +17,6 @@
import logging
import os
import pathlib
import re
import subprocess
import sys
@@ -256,7 +255,7 @@ class SourcePackage:
def _changes_file(self):
"""Returns the file name of the .changes file."""
return os.path.join(
self._workdir, f"{self._package}_{strip_epoch(self._version)}_source.changes"
self._workdir, f"{self._package}_{ strip_epoch(self._version)}_source.changes"
)
def check_target(self, upload, launchpad):
@@ -266,7 +265,7 @@ class SourcePackage:
wants to change something.
"""
devel_series, supported_series = _get_series(launchpad)
(devel_series, supported_series) = _get_series(launchpad)
if upload == "ubuntu":
allowed = (
@@ -408,16 +407,22 @@ class SourcePackage:
return True
def _run_lintian(self) -> str:
def _run_lintian(self):
"""Runs lintian on either the source or binary changes file.
Returns the filename of the created lintian output file.
"""
# Determine whether to use the source or binary build for lintian
package_and_version = f"{self._package}_{strip_epoch(self._version)}"
if self._build_log:
build_changes = f"{package_and_version}_{self._builder.get_architecture()}.changes"
build_changes = (
self._package
+ "_"
+ strip_epoch(self._version)
+ "_"
+ self._builder.get_architecture()
+ ".changes"
)
changes_for_lintian = os.path.join(self._buildresult, build_changes)
else:
changes_for_lintian = self._changes_file
@@ -425,12 +430,18 @@ class SourcePackage:
# Check lintian
assert os.path.isfile(changes_for_lintian), f"{changes_for_lintian} does not exist."
cmd = ["lintian", "-IE", "--pedantic", "-q", "--profile", "ubuntu", changes_for_lintian]
lintian_file = pathlib.Path(self._workdir) / f"{package_and_version}.lintian"
Logger.debug("%s > %s", " ".join(cmd), lintian_file)
with lintian_file.open("wb") as outfile:
subprocess.run(cmd, stdout=outfile, check=True)
lintian_filename = os.path.join(
self._workdir, self._package + "_" + strip_epoch(self._version) + ".lintian"
)
Logger.debug("%s > %s", " ".join(cmd), lintian_filename)
report = subprocess.check_output(cmd, encoding="utf-8")
return str(lintian_file)
# write lintian report file
lintian_file = open(lintian_filename, "w", encoding="utf-8")
lintian_file.writelines(report)
lintian_file.close()
return lintian_filename
def sync(self, upload, series, bug_number, requester):
"""Does a sync of the source package."""


@@ -39,16 +39,18 @@ def is_command_available(command, check_sbin=False):
"Is command in $PATH?"
path = os.environ.get("PATH", "/usr/bin:/bin").split(":")
if check_sbin:
path += [f"{directory[:-3]}sbin" for directory in path if directory.endswith("/bin")]
path += [directory[:-3] + "sbin" for directory in path if directory.endswith("/bin")]
return any(os.access(os.path.join(directory, command), os.X_OK) for directory in path)
def check_dependencies():
"Do we have all the commands we need for full functionality?"
missing = []
for cmd in ("patch", "quilt", "dput", "lintian"):
for cmd in ("patch", "bzr", "quilt", "dput", "lintian"):
if not is_command_available(cmd):
missing.append(cmd)
if not is_command_available("bzr-buildpackage"):
missing.append("bzr-builddeb")
if not any(
is_command_available(cmd, check_sbin=True) for cmd in ("pbuilder", "sbuild", "cowbuilder")
):
@@ -210,14 +212,14 @@ def get_open_ubuntu_bug_task(launchpad, bug, branch=None):
sys.exit(1)
elif len(ubuntu_tasks) == 1:
task = ubuntu_tasks[0]
elif branch and branch[1] == "ubuntu":
if len(ubuntu_tasks) > 1 and branch and branch[1] == "ubuntu":
tasks = [t for t in ubuntu_tasks if t.get_series() == branch[2] and t.package == branch[3]]
if len(tasks) > 1:
# A bug targeted to the development series?
tasks = [t for t in tasks if t.series is not None]
assert len(tasks) == 1
task = tasks[0]
else:
elif len(ubuntu_tasks) > 1:
task_list = [t.get_short_info() for t in ubuntu_tasks]
Logger.debug(
"%i Ubuntu tasks exist for bug #%i.\n%s",
@@ -301,7 +303,7 @@ def _download_and_change_into(task, dsc_file, patch, branch):
extract_source(dsc_file, Logger.isEnabledFor(logging.DEBUG))
# change directory
directory = f"{task.package}-{task.get_version().upstream_version}"
directory = task.package + "-" + task.get_version().upstream_version
Logger.debug("cd %s", directory)
os.chdir(directory)
@@ -313,7 +315,7 @@ def sponsor_patch(bug_number, build, builder, edit, keyid, lpinstance, update, u
launchpad = Launchpad.login_with("sponsor-patch", lpinstance)
bug = launchpad.bugs[bug_number]
patch, branch = get_patch_or_branch(bug)
(patch, branch) = get_patch_or_branch(bug)
task = get_open_ubuntu_bug_task(launchpad, bug, branch)
dsc_file = task.download_source()
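The sbin-path expansion in the `is_command_available` hunk above is compact enough to be worth seeing in isolation (a minimal sketch with a hypothetical helper name):

```python
def sbin_variants(path_dirs: list[str]) -> list[str]:
    """For every */bin entry on PATH, append the sibling */sbin directory."""
    # '/usr/bin'[:-3] drops the trailing 'bin', leaving '/usr/' + 'sbin'
    return path_dirs + [d[:-3] + "sbin" for d in path_dirs if d.endswith("/bin")]
```

This lets the command check also find tools like `pbuilder` that are installed under `/usr/sbin` even when sbin directories are not on the user's PATH.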


@@ -60,7 +60,7 @@ class ExamplePackage:
with tempfile.TemporaryDirectory() as tmpdir:
self._create(Path(tmpdir))
def _create(self, directory: Path) -> None:
def _create(self, directory: Path):
pkgdir = directory / self.dirname
pkgdir.mkdir()
(pkgdir / self.content_filename).write_text(self.content_text)


@@ -68,7 +68,9 @@ class ConfigTestCase(unittest.TestCase):
del os.environ[k]
def test_config_parsing(self):
self._config_files["user"] = """#COMMENT=yes
self._config_files[
"user"
] = """#COMMENT=yes
\tTAB_INDENTED=yes
SPACE_INDENTED=yes
SPACE_SUFFIX=yes
@@ -115,7 +117,10 @@ REPEAT=yes
self.assertEqual(self.get_value("BUILDER", default="foo"), "foo")
def test_scriptname_precedence(self):
self._config_files["user"] = "TEST_BUILDER=foo\nUBUNTUTOOLS_BUILDER=bar\n"
self._config_files[
"user"
] = """TEST_BUILDER=foo
UBUNTUTOOLS_BUILDER=bar"""
self.assertEqual(self.get_value("BUILDER"), "foo")
def test_configfile_precedence(self):


@@ -1,34 +0,0 @@
# Copyright (C) 2024 Canonical Ltd.
# Author: Chris Peterson <chris.peterson@canonical.com>
#
# Permission to use, copy, modify, and/or distribute this software for any
# purpose with or without fee is hereby granted, provided that the above
# copyright notice and this permission notice appear in all copies.
#
# THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH
# REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY
# AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY SPECIAL, DIRECT,
# INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM
# LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR
# OTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR
# PERFORMANCE OF THIS SOFTWARE.
import unittest
# Binary Tests
class BinaryTests(unittest.TestCase):
# The requestsync binary has the option of using the launchpad api
# to log in but requires python3-keyring in addition to
# python3-launchpadlib. Testing the integrated login functionality
# automatically isn't very feasbile, but we can at least write a smoke
# test to make sure the required packages are installed.
# See LP: #2049217
def test_keyring_installed(self):
"""Smoke test for required lp api dependencies"""
try:
# pylint: disable-next=import-outside-toplevel,unused-import
import keyring # noqa: F401
except ModuleNotFoundError as error:
raise ModuleNotFoundError("package python3-keyring is not installed") from error


@@ -1,75 +0,0 @@
# test_reverse_depends.py - Test suite for reverse-depends
#
# Copyright (C) 2026, Nadzeya Hutsko <nadzeya.hutsko@canonical.com>
#
# Permission to use, copy, modify, and/or distribute this software for any
# purpose with or without fee is hereby granted, provided that the above
# copyright notice and this permission notice appear in all copies.
#
# THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH
# REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY
# AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY SPECIAL, DIRECT,
# INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM
# LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR
# OTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR
# PERFORMANCE OF THIS SOFTWARE.
"""Test suite for the reverse-depends script"""
import importlib.machinery
import importlib.util
import os
import unittest
from unittest import mock
def _load_reverse_depends():
script_path = os.path.join(os.path.dirname(__file__), "..", "..", "reverse-depends")
script_path = os.path.abspath(script_path)
loader = importlib.machinery.SourceFileLoader("reverse_depends", script_path)
spec = importlib.util.spec_from_loader("reverse_depends", loader)
module = importlib.util.module_from_spec(spec)
loader.exec_module(module)
return module
reverse_depends = _load_reverse_depends()
class DisplayVerboseTestCase(unittest.TestCase):
"""Tests for display_verbose output when no reverse dependencies are found"""
def test_no_results_at_all(self):
"""Empty values dict prints 'No reverse dependencies found'"""
with mock.patch.object(reverse_depends.Logger, "info") as mock_info:
reverse_depends.display_verbose("some-package", {})
mock_info.assert_called_once_with("No reverse dependencies found")
def test_results_filtered_out(self):
"""Package present in values but with empty data prints filtered message"""
# Simulate the case where rdeps exist but are all filtered out (e.g. by
# -R), so build_results populates `result[package] = {}` when all fields
# are filtered
values = {"r-cran-bdgraph": {}}
with mock.patch.object(reverse_depends.Logger, "info") as mock_info:
reverse_depends.display_verbose("r-cran-bdgraph", values)
mock_info.assert_called_once_with("No reverse dependencies found with the current filters")
def test_results_found(self):
"""Non-empty results are displayed without the 'not found' messages"""
values = {
"r-cran-bdgraph": {
"Reverse-Depends": [{"Package": "r-cran-qgraph", "Architectures": ["amd64"]}]
}
}
with mock.patch.object(reverse_depends.Logger, "info") as mock_info:
reverse_depends.display_verbose("r-cran-bdgraph", values)
calls = [str(c) for c in mock_info.call_args_list]
self.assertFalse(
any("No reverse dependencies found" in c for c in calls),
"Should not print 'no rdeps' message when results exist",
)
if __name__ == "__main__":
unittest.main()


@@ -1,18 +1,19 @@
# Copyright (C) 2024 Canonical Ltd.
# Author: Chris Peterson <chris.peterson@canonical.com>
#
# Permission to use, copy, modify, and/or distribute this software for any
# purpose with or without fee is hereby granted, provided that the above
# copyright notice and this permission notice appear in all copies.
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as
# published by the Free Software Foundation, either version 3 of the
# License, or (at your option) any later version.
#
# THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH
# REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY
# AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY SPECIAL, DIRECT,
# INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM
# LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR
# OTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR
# PERFORMANCE OF THIS SOFTWARE.
"""Tests for running_autopkgtests
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
""" Tests for running_autopkgtests
Tests using cached data from autopkgtest servers.
These tests only ensure code changes don't change parsing behavior
@@ -32,17 +33,8 @@ from ubuntutools.running_autopkgtests import (
)
# Cached binary response data from autopkgtest server
RUN_DATA = (
b'{"pyatem": {'
b" \"submit-time_2024-01-19 19:37:36;triggers_['python3-defaults/3.12.1-0ubuntu1'];\":"
b' {"noble": {"arm64": [{"triggers": ["python3-defaults/3.12.1-0ubuntu1"],'
b' "submit-time": "2024-01-19 19:37:36"}, 380, "<omitted log>"]}}}}'
)
QUEUED_DATA = (
b'{"ubuntu": {"noble": {"arm64": ["libobject-accessor-perl {\\"requester\\": \\"someone\\",'
b' \\"submit-time\\": \\"2024-01-18 01:08:55\\",'
b' \\"triggers\\": [\\"perl/5.38.2-3\\", \\"liblocale-gettext-perl/1.07-6build1\\"]}"]}}}'
)
RUN_DATA = b'{"pyatem": { "submit-time_2024-01-19 19:37:36;triggers_[\'python3-defaults/3.12.1-0ubuntu1\'];": {"noble": {"arm64": [{"triggers": ["python3-defaults/3.12.1-0ubuntu1"], "submit-time": "2024-01-19 19:37:36"}, 380, "<omitted log>"]}}}}'
QUEUED_DATA = b'{"ubuntu": {"noble": {"arm64": ["libobject-accessor-perl {\\"requester\\": \\"someone\\", \\"submit-time\\": \\"2024-01-18 01:08:55\\", \\"triggers\\": [\\"perl/5.38.2-3\\", \\"liblocale-gettext-perl/1.07-6build1\\"]}"]}}}'
# Expected result(s) of parsing the above JSON data
RUNNING_JOB = {
@@ -66,9 +58,7 @@ QUEUED_JOB = {
"ubuntu": {
"noble": {
"arm64": [
'libobject-accessor-perl {"requester": "someone",'
' "submit-time": "2024-01-18 01:08:55",'
' "triggers": ["perl/5.38.2-3", "liblocale-gettext-perl/1.07-6build1"]}'
'libobject-accessor-perl {"requester": "someone", "submit-time": "2024-01-18 01:08:55", "triggers": ["perl/5.38.2-3", "liblocale-gettext-perl/1.07-6build1"]}',
]
}
}
@@ -79,18 +69,9 @@ PRIVATE_JOB = {"ppa": {"noble": {"arm64": ["private job"]}}}
# Expected textual output of the program based on the above data
RUNNING_OUTPUT = (
"R 0:06:20 pyatem - noble arm64"
" - python3-defaults/3.12.1-0ubuntu1 -\n"
)
QUEUED_OUTPUT = (
"Q0001 -:-- libobject-accessor-perl ubuntu noble arm64"
" - perl/5.38.2-3,liblocale-gettext-perl/1.07-6build1\n"
)
PRIVATE_OUTPUT = (
"Q0001 -:-- private job ppa noble arm64"
" private job private job\n"
)
RUNNING_OUTPUT = "R 0:06:20 pyatem - noble arm64 - python3-defaults/3.12.1-0ubuntu1 -\n"
QUEUED_OUTPUT = "Q0001 -:-- libobject-accessor-perl ubuntu noble arm64 - perl/5.38.2-3,liblocale-gettext-perl/1.07-6build1\n"
PRIVATE_OUTPUT = "Q0001 -:-- private job ppa noble arm64 private job private job\n"
class RunningAutopkgtestTestCase(unittest.TestCase):


@@ -72,17 +72,17 @@ class Control:
def set_maintainer(self, maintainer):
"""Sets the value of the Maintainer field."""
pattern = re.compile("^Maintainer: ?.*$", re.MULTILINE)
self._content = pattern.sub(f"Maintainer: {maintainer}", self._content)
self._content = pattern.sub("Maintainer: " + maintainer, self._content)
def set_original_maintainer(self, original_maintainer):
"""Sets the value of the XSBC-Original-Maintainer field."""
original_maintainer = f"XSBC-Original-Maintainer: {original_maintainer}"
original_maintainer = "XSBC-Original-Maintainer: " + original_maintainer
if self.get_original_maintainer():
pattern = re.compile("^(?:[XSBC]*-)?Original-Maintainer:.*$", re.MULTILINE)
self._content = pattern.sub(original_maintainer, self._content)
else:
pattern = re.compile("^(Maintainer:.*)$", re.MULTILINE)
self._content = pattern.sub(f"\\1\\n{original_maintainer}", self._content)
self._content = pattern.sub(r"\1\n" + original_maintainer, self._content)
def remove_original_maintainer(self):
"""Strip out the XSBC-Original-Maintainer line"""
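The field substitutions in this hunk are plain MULTILINE regex replaces; a minimal standalone sketch (hypothetical free-function names, not the tool's `Control` class):

```python
import re


def set_maintainer(control_text: str, maintainer: str) -> str:
    """Replace the Maintainer field in debian/control content."""
    pattern = re.compile(r"^Maintainer: ?.*$", re.MULTILINE)
    return pattern.sub("Maintainer: " + maintainer, control_text)


def set_original_maintainer(control_text: str, original: str) -> str:
    """Insert an XSBC-Original-Maintainer line after the Maintainer field."""
    field = "XSBC-Original-Maintainer: " + original
    # \1 keeps the matched Maintainer line; \n in the template becomes a newline
    pattern = re.compile(r"^(Maintainer:.*)$", re.MULTILINE)
    return pattern.sub(r"\1\n" + field, control_text)
```

With `re.MULTILINE`, `^` and `$` anchor on each line, so only the Maintainer line is touched regardless of where it sits in the file.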


@@ -15,44 +15,47 @@
"""Portions of archive related code that is re-used by various tools."""
from datetime import datetime
import os
import re
import urllib.request
from datetime import datetime
import dateutil.parser
from dateutil.tz import tzutc
def get_cache_dir():
cache_dir = os.environ.get("XDG_CACHE_HOME", os.path.expanduser(os.path.join("~", ".cache")))
uat_cache = os.path.join(cache_dir, "ubuntu-archive-tools")
cache_dir = os.environ.get('XDG_CACHE_HOME',
os.path.expanduser(os.path.join('~', '.cache')))
uat_cache = os.path.join(cache_dir, 'ubuntu-archive-tools')
os.makedirs(uat_cache, exist_ok=True)
return uat_cache
def get_url(url, force_cached):
"""Return file to the URL, possibly caching it"""
''' Return file to the URL, possibly caching it
'''
cache_file = None
# ignore bileto urls wrt caching, they're usually too small to matter
# and we don't do proper cache expiry
m = re.search("ubuntu-archive-team.ubuntu.com/proposed-migration/([^/]*)/([^/]*)", url)
m = re.search('ubuntu-archive-team.ubuntu.com/proposed-migration/'
'([^/]*)/([^/]*)',
url)
if m:
cache_dir = get_cache_dir()
cache_file = os.path.join(cache_dir, f"{m.group(1)}_{m.group(2)}")
cache_file = os.path.join(cache_dir, '%s_%s' % (m.group(1), m.group(2)))
else:
# test logs can be cached, too
m = re.search(
"https://autopkgtest.ubuntu.com/results/autopkgtest-[^/]*/([^/]*)/([^/]*)"
"/[a-z0-9]*/([^/]*)/([_a-f0-9]*)@/log.gz",
url,
)
'https://autopkgtest.ubuntu.com/results/autopkgtest-[^/]*/([^/]*)/([^/]*)'
'/[a-z0-9]*/([^/]*)/([_a-f0-9]*)@/log.gz',
url)
if m:
cache_dir = get_cache_dir()
cache_file = os.path.join(
cache_dir, f"{m.group(1)}_{m.group(2)}_{m.group(3)}_{m.group(4)}.gz"
)
cache_dir, '%s_%s_%s_%s.gz' % (
m.group(1), m.group(2), m.group(3), m.group(4)))
if cache_file:
try:
@@ -62,18 +65,18 @@ def get_url(url, force_cached):
prev_timestamp = datetime.fromtimestamp(prev_mtime, tz=tzutc())
new_timestamp = datetime.now(tz=tzutc()).timestamp()
if force_cached:
return open(cache_file, "rb")
return open(cache_file, 'rb')
f = urllib.request.urlopen(url)
if cache_file:
remote_ts = dateutil.parser.parse(f.headers["last-modified"])
remote_ts = dateutil.parser.parse(f.headers['last-modified'])
if remote_ts > prev_timestamp:
with open(f"{cache_file}.new", "wb") as new_cache:
with open('%s.new' % cache_file, 'wb') as new_cache:
for line in f:
new_cache.write(line)
os.rename(f"{cache_file}.new", cache_file)
os.rename('%s.new' % cache_file, cache_file)
os.utime(cache_file, times=(new_timestamp, new_timestamp))
f.close()
f = open(cache_file, "rb")
f = open(cache_file, 'rb')
return f