Compare commits


No commits in common. "main" and "0.193" have entirely different histories.
main ... 0.193

46 changed files with 447 additions and 1615 deletions

.gitignore

@@ -1,2 +1,16 @@
__pycache__
*.egg-info
.coverage
.tox
/ubuntu_dev_tools.egg-info/
__pycache__/
*.pyc
/build/
/.pybuild/
/test-data/
/debian/python-ubuntutools/
/debian/python3-ubuntutools/
/debian/ubuntu-dev-tools/
/debian/debhelper-build-stamp
/debian/files
/debian/*.debhelper
/debian/*.debhelper.log
/debian/*.substvars


@@ -449,7 +449,7 @@ def main(argv):
if current_distro == "Ubuntu":
args.dest_releases = [UbuntuDistroInfo().lts()]
elif current_distro == "Debian":
if current_distro == "Debian":
args.dest_releases = [DebianDistroInfo().stable()]
else:
error("Unknown distribution %s, can't guess target release", current_distro)


@@ -36,7 +36,7 @@ _pbuilder-dist()
for distro in $(ubuntu-distro-info --all; debian-distro-info --all) stable testing unstable; do
for builder in pbuilder cowbuilder; do
echo "$builder-$distro"
for arch in i386 amd64 armhf; do
for arch in i386 amd64 armel armhf; do
echo "$builder-$distro-$arch"
done
done
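The completion loop above emits one word per builder/distro pair plus one per builder/distro/arch triple. A minimal sketch of that expansion (hypothetical helper name; a fixed distro list stands in for the `ubuntu-distro-info`/`debian-distro-info` calls the real completion makes):

```python
# Sketch of the completion words _pbuilder-dist generates.
# The distro list is hard-coded here; the real script queries distro-info.
def completion_words(distros, builders=("pbuilder", "cowbuilder"),
                     arches=("i386", "amd64", "armhf")):
    words = []
    for distro in distros:
        for builder in builders:
            words.append(f"{builder}-{distro}")
            words.extend(f"{builder}-{distro}-{arch}" for arch in arches)
    return words
```

For a single distro this yields two builder words and six builder-arch words, matching the nested loops in the hunk.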


@@ -47,41 +47,10 @@ def check_support(apt_cache, pkgname, alt=False):
else:
prefix = " * " + pkgname
prov_packages = apt_cache.get_providing_packages(pkgname)
if pkgname in apt_cache:
try:
pkg = apt_cache[pkgname]
# If this is a virtual package, iterate through the binary packages that
# provide this, and ensure they are all in Main. Source packages in and of
# themselves cannot provide virtual packages, only binary packages can.
elif len(prov_packages) > 0:
supported, unsupported = [], []
for pkg in prov_packages:
candidate = pkg.candidate
if candidate:
section = candidate.section
if section.startswith("universe") or section.startswith("multiverse"):
unsupported.append(pkg.name)
else:
supported.append(pkg.name)
if len(supported) > 0:
msg = "is a virtual package, which is provided by the following "
msg += "candidates in Main: " + " ".join(supported)
print(prefix, msg)
elif len(unsupported) > 0:
msg = "is a virtual package, but is only provided by the "
msg += "following non-Main candidates: " + " ".join(unsupported)
print(prefix, msg, file=sys.stderr)
return False
else:
msg = "is a virtual package that exists but is not provided by "
msg += "any package currently in the archive. Proceed with caution."
print(prefix, msg, file=sys.stderr)
return False
else:
print(prefix, "does not exist", file=sys.stderr)
except KeyError:
print(prefix, "does not exist (pure virtual?)", file=sys.stderr)
return False
section = pkg.candidate.section
@@ -124,13 +93,6 @@ def check_build_dependencies(apt_cache, control):
continue
for or_group in apt.apt_pkg.parse_src_depends(control.section[field]):
pkgname = or_group[0][0]
# debhelper-compat is expected to be a build dependency of every
# package, so it is a red herring to display it in this report.
# (src:debhelper is in Ubuntu Main anyway)
if pkgname == "debhelper-compat":
continue
if not check_support(apt_cache, pkgname):
# check non-preferred alternatives
for altpkg in or_group[1:]:

debian/.gitignore

@@ -1 +0,0 @@
files

debian/changelog

@@ -1,209 +1,3 @@
ubuntu-dev-tools (0.207) UNRELEASED; urgency=medium
* Fix pull-lp-source --upload-queue (LP: #2110061)
-- Dan Streetman <ddstreet@ieee.org> Tue, 06 May 2025 13:25:22 -0400
ubuntu-dev-tools (0.206) unstable; urgency=medium
[ Dan Bungert ]
* mk-sbuild: enable pkgmaintainermangler
[ Shengjing Zhu ]
* import-bug-from-debian: package option is overridden and not used
[ Fernando Bravo Hernández ]
* Parsing arch parameter to getBinaryPackage() (LP: #2081861)
[ Simon Quigley ]
* Read ~/.devscripts in a more robust way, to ideally pick up multi-line
variables (Closes: #725418).
* mk-sbuild: default to using UTC for schroots (LP: #2097159).
* syncpackage: s/syncblacklist/syncblocklist/g
* syncpackage: Cache the sync blocklist in-memory, so it's not fetched
multiple times when syncing more than one package.
* syncpackage: Catch exceptions cleanly, simply skipping to the next
package (erring on the side of caution) if there is an error doing the
download (LP: #1943286).
-- Simon Quigley <tsimonq2@debian.org> Tue, 04 Mar 2025 13:43:15 -0600
ubuntu-dev-tools (0.205) unstable; urgency=medium
* [syncpackage] When syncing multiple packages, if one of the packages is in
the sync blocklist, do not exit, simply continue.
* [syncpackage] Do not use exit(1) on an error or exception unless it
applies to all packages, instead return None so we can continue to the
next package.
* [syncpackage] Add support for -y or --yes, noted that it should be used
with care.
* Update Standards-Version to 4.7.2, no changes needed.
-- Simon Quigley <tsimonq2@debian.org> Sat, 01 Mar 2025 11:29:54 -0600
ubuntu-dev-tools (0.204) unstable; urgency=medium
[ Simon Quigley ]
* Update Standards-Version to 4.7.1, no changes needed.
* Add several Lintian overrides related to .pyc files.
* Add my name to the copyright file.
* Rename bitesize to lp-bitesize (Closes: #1076224).
* Add a manpage for running-autopkgtests.
* Add a large warning at the top of mk-sbuild encouraging the use of the
unshare backend. This is to provide ample warning to users.
* Remove mail line from default ~/.sbuildrc, to resolve the undeclared
dependency on sendmail (Closes: #1074632).
[ Julien Plissonneau Duquène ]
* Fix reverse-depends -b crash on packages that b-d on themselves
(Closes: #1087760).
-- Simon Quigley <tsimonq2@debian.org> Mon, 24 Feb 2025 19:54:39 -0600
ubuntu-dev-tools (0.203) unstable; urgency=medium
[ Steve Langasek ]
* ubuntu-build: handle TOCTOU issue with the "can be retried" value on
builds.
* Recommend sbuild over pbuilder. sbuild is the tool recommended by
Ubuntu developers whose behavior most closely approximates Launchpad
builds.
[ Florent 'Skia' Jacquet ]
* import-bug-from-debian: handle multipart message (Closes: #969510)
[ Benjamin Drung ]
* import-bug-from-debian: add type hints
* Bump Standards-Version to 4.7.0
* Bump year and add missing files to copyright
* setup.py: add pm-helper
* Format code with black and isort
* Address several issues pointed out by Pylint
* Depend on python3-yaml for pm-helper
-- Benjamin Drung <bdrung@debian.org> Sat, 02 Nov 2024 18:19:24 +0100
ubuntu-dev-tools (0.202) unstable; urgency=medium
[ Steve Langasek ]
* ubuntu-build: support --batch with no package names to retry all
* ubuntu-build: in batch mode, print a count of packages retried
* ubuntu-build: make the --arch option top-level.
This gets rid of the fugly --arch2 option
* ubuntu-build: support retrying builds in other states that failed-to-build
* ubuntu-build: Handling of proposed vs release pocket default for ppas
* ubuntu-build: update manpage
[ Chris Peterson ]
* Replace Depends on python3-launchpadlib with Depends on
python3-launchpadlib-desktop (LP: #2049217)
-- Simon Quigley <tsimonq2@ubuntu.com> Fri, 12 Apr 2024 23:33:14 -0500
ubuntu-dev-tools (0.201) unstable; urgency=medium
* running-autopkgtests: fix packaging to make the script available
(LP: #2055466)
-- Chris Peterson <chris.peterson@canonical.com> Thu, 29 Feb 2024 11:09:14 -0800
ubuntu-dev-tools (0.200) unstable; urgency=medium
[ Gianfranco Costamagna ]
* Team upload
[ Chris Peterson ]
* Add support to see currently running autopkgtests (running-autopkgtests)
* running-autopkgtests: use f-strings
[ Athos Ribeiro ]
* syncpackage: log LP authentication errors before halting.
[ Ying-Chun Liu (PaulLiu) ]
* Drop qemu-debootstrap
qemu-debootstrap is deprecated for a while. In newer qemu release
the command is totally removed. We can use debootstrap directly.
Signed-off-by: Ying-Chun Liu (PaulLiu) <paulliu@debian.org>
[ Logan Rosen ]
* Don't rely on debootstrap for validating Ubuntu distro
-- Gianfranco Costamagna <locutusofborg@debian.org> Thu, 15 Feb 2024 17:53:48 +0100
ubuntu-dev-tools (0.199) unstable; urgency=medium
[ Simon Quigley ]
* Add my name to Uploaders.
[ Steve Langasek ]
* Introduce a pm-helper tool.
-- Simon Quigley <tsimonq2@debian.org> Mon, 29 Jan 2024 10:03:22 -0600
ubuntu-dev-tools (0.198) unstable; urgency=medium
* In check-mir, ignore debhelper-compat when checking the build
dependencies. This is expected to be a build dependency of all packages,
so warning about it in any way is surely a red herring.
* Add proper support for virtual packages in check-mir, basing the
determination solely off of binary packages. This is not expected to be a
typical case.
-- Simon Quigley <tsimonq2@debian.org> Wed, 10 Jan 2024 20:04:02 -0600
ubuntu-dev-tools (0.197) unstable; urgency=medium
* Update the manpage for syncpackage to reflect the ability to sync
multiple packages at once.
* When using pull-*-source to grab a package which already has a defined
Vcs- field, display the exact same warning message `apt source` does.
-- Simon Quigley <tsimonq2@debian.org> Tue, 03 Oct 2023 14:01:25 -0500
ubuntu-dev-tools (0.196) unstable; urgency=medium
* Allow the user to sync multiple packages at one time (LP: #1756748).
-- Simon Quigley <tsimonq2@debian.org> Fri, 04 Aug 2023 14:37:59 -0500
ubuntu-dev-tools (0.195) unstable; urgency=medium
* Add support for the non-free-firmware components in all tools already
referencing non-free.
-- Simon Quigley <tsimonq2@debian.org> Wed, 26 Jul 2023 13:03:31 -0500
ubuntu-dev-tools (0.194) unstable; urgency=medium
[ Gianfranco Costamagna ]
* ubuntu-build: For some reasons, now you need to be authenticated before
trying to use the "PersonTeam" class features.
Do it at the begin instead of replicating the same code inside the
tool itself.
[ Steve Langasek ]
* Remove references to deprecated
http://people.canonical.com/~ubuntu-archive.
* Remove references to architectures not supported in any active
Ubuntu release.
* Remove references to ftpmaster.internal. When this name is resolvable
but firewalled, syncpackage hangs; and these are tools for developers,
not for running in an automated context in the DCs where
ftpmaster.internal is reachable.
* Excise all references to cdbs (including in test cases)
* Set apt preferences for the -proposed pocket in mk-sbuild so that
it works as expected for lunar and forward.
[ Robie Basak ]
* ubuntutools/misc: swap iter_content for raw stream with "Accept-Encoding:
identity" to fix .diff.gz downloads (LP: #2025748).
[ Vladimir Petko ]
* Fix a typo introduced in the last upload that made mk-sbuild fail
unconditionally. LP: #2017177.
-- Gianfranco Costamagna <locutusofborg@debian.org> Sat, 08 Jul 2023 08:42:05 +0200
ubuntu-dev-tools (0.193) unstable; urgency=medium
* Don't run linters at build time, or in autopkgtests. (Closes: #1031436).

debian/control

@@ -6,7 +6,6 @@ Uploaders:
Benjamin Drung <bdrung@debian.org>,
Stefano Rivera <stefanor@debian.org>,
Mattia Rizzolo <mattia@debian.org>,
Simon Quigley <tsimonq2@debian.org>,
Build-Depends:
black <!nocheck>,
dctrl-tools,
@@ -21,17 +20,15 @@ Build-Depends:
pylint <!nocheck>,
python3-all,
python3-apt,
python3-dateutil,
python3-debian,
python3-debianbts,
python3-distro-info,
python3-httplib2,
python3-launchpadlib-desktop,
python3-launchpadlib,
python3-pytest,
python3-requests <!nocheck>,
python3-setuptools,
python3-yaml <!nocheck>,
Standards-Version: 4.7.2
Standards-Version: 4.6.2
Rules-Requires-Root: no
Vcs-Git: https://git.launchpad.net/ubuntu-dev-tools
Vcs-Browser: https://git.launchpad.net/ubuntu-dev-tools
@@ -54,10 +51,9 @@ Depends:
python3-debianbts,
python3-distro-info,
python3-httplib2,
python3-launchpadlib-desktop,
python3-launchpadlib,
python3-lazr.restfulclient,
python3-ubuntutools (= ${binary:Version}),
python3-yaml,
sensible-utils,
sudo,
tzdata,
@@ -72,7 +68,7 @@ Recommends:
genisoimage,
lintian,
patch,
sbuild | pbuilder | cowbuilder,
pbuilder | cowbuilder | sbuild,
python3-dns,
quilt,
reportbug (>= 3.39ubuntu1),
@@ -118,8 +114,6 @@ Description: useful tools for Ubuntu developers
- requestsync - files a sync request with Debian changelog and rationale.
- reverse-depends - find the reverse dependencies (or build dependencies) of
a package.
- running-autopkgtests - lists the currently running and/or queued
autopkgtests on the Ubuntu autopkgtest infrastructure
- seeded-in-ubuntu - query if a package is safe to upload during a freeze.
- setup-packaging-environment - assistant to get an Ubuntu installation
ready for packaging work.
@@ -138,11 +132,10 @@ Package: python3-ubuntutools
Architecture: all
Section: python
Depends:
python3-dateutil,
python3-debian,
python3-distro-info,
python3-httplib2,
python3-launchpadlib-desktop,
python3-launchpadlib,
python3-lazr.restfulclient,
python3-requests,
sensible-utils,

debian/copyright

@@ -11,7 +11,6 @@ Files: backportpackage
doc/check-symbols.1
doc/requestsync.1
doc/ubuntu-iso.1
doc/running-autopkgtests.1
GPL-2
README.updates
requestsync
@@ -20,13 +19,12 @@ Files: backportpackage
ubuntu-iso
ubuntutools/requestsync/*.py
Copyright: 2007, Albert Damen <albrt@gmx.net>
2010-2024, Benjamin Drung <bdrung@ubuntu.com>
2010-2022, Benjamin Drung <bdrung@ubuntu.com>
2007-2023, Canonical Ltd.
2006-2007, Daniel Holbach <daniel.holbach@ubuntu.com>
2010, Evan Broder <evan@ebroder.net>
2006-2007, Luke Yelavich <themuso@ubuntu.com>
2009-2010, Michael Bienia <geser@ubuntu.com>
2024-2025, Simon Quigley <tsimonq2@debian.org>
2010-2011, Stefano Rivera <stefanor@ubuntu.com>
2008, Stephan Hermann <sh@sourcecode.de>
2007, Steve Kowalik <stevenk@ubuntu.com>
@@ -74,28 +72,23 @@ License: GPL-2+
On Debian systems, the complete text of the GNU General Public License
version 2 can be found in the /usr/share/common-licenses/GPL-2 file.
Files: doc/lp-bitesize.1
Files: doc/bitesize.1
doc/check-mir.1
doc/grab-merge.1
doc/merge-changelog.1
doc/pm-helper.1
doc/setup-packaging-environment.1
doc/syncpackage.1
lp-bitesize
bitesize
check-mir
GPL-3
grab-merge
merge-changelog
pm-helper
pyproject.toml
run-linters
running-autopkgtests
setup-packaging-environment
syncpackage
ubuntutools/running_autopkgtests.py
ubuntutools/utils.py
Copyright: 2010-2024, Benjamin Drung <bdrung@ubuntu.com>
2007-2024, Canonical Ltd.
Copyright: 2010, Benjamin Drung <bdrung@ubuntu.com>
2007-2023, Canonical Ltd.
2008, Jonathan Patrick Davies <jpds@ubuntu.com>
2008-2010, Martin Pitt <martin.pitt@canonical.com>
2009, Siegfried-Angel Gevatter Pujals <rainct@ubuntu.com>
@@ -184,12 +177,11 @@ Files: doc/pull-debian-debdiff.1
ubuntutools/version.py
update-maintainer
.pylintrc
Copyright: 2009-2024, Benjamin Drung <bdrung@ubuntu.com>
Copyright: 2009-2023, Benjamin Drung <bdrung@ubuntu.com>
2010, Evan Broder <evan@ebroder.net>
2008, Siegfried-Angel Gevatter Pujals <rainct@ubuntu.com>
2010-2011, Stefano Rivera <stefanor@ubuntu.com>
2017-2021, Dan Streetman <ddstreet@canonical.com>
2024, Canonical Ltd.
License: ISC
Permission to use, copy, modify, and/or distribute this software for any
purpose with or without fee is hereby granted, provided that the above


@@ -1,3 +0,0 @@
# pyc files are machine-generated; they're expected to have long lines and have unstated copyright
source: file-without-copyright-information *.pyc [debian/copyright]
source: very-long-line-length-in-source-file * > 512 [*.pyc:*]


@@ -1,21 +1,21 @@
.TH lp-bitesize "1" "May 9 2010" "ubuntu-dev-tools"
.TH bitesize "1" "May 9 2010" "ubuntu-dev-tools"
.SH NAME
lp-bitesize \- Add \fBbitesize\fR tag to bugs and add a comment.
bitesize \- Add \fBbitesize\fR tag to bugs and add a comment.
.SH SYNOPSIS
.B lp-bitesize \fR<\fIbug number\fR>
.B bitesize \fR<\fIbug number\fR>
.br
.B lp-bitesize \-\-help
.B bitesize \-\-help
.SH DESCRIPTION
\fBlp-bitesize\fR adds a bitesize tag to the bug, if it's not there yet. It
\fBbitesize\fR adds a bitesize tag to the bug, if it's not there yet. It
also adds a comment to the bug indicating that you are willing to help with
fixing it.
It checks for permission to operate on a given bug first,
then performs the required tasks on Launchpad.
.SH OPTIONS
Listed below are the command line options for \fBlp-bitesize\fR:
Listed below are the command line options for \fBbitesize\fR:
.TP
.BR \-h ", " \-\-help
Display a help message and exit.
@@ -48,7 +48,7 @@ The default value for \fB--lpinstance\fR.
.BR ubuntu\-dev\-tools (5)
.SH AUTHORS
\fBlp-bitesize\fR and this manual page were written by Daniel Holbach
\fBbitesize\fR and this manual page were written by Daniel Holbach
<daniel.holbach@canonical.com>.
.PP
Both are released under the terms of the GNU General Public License, version 3.


@@ -20,7 +20,7 @@ like for example \fBpbuilder\-feisty\fP, \fBpbuilder\-sid\fP, \fBpbuilder\-gutsy
.PP
The same applies to \fBcowbuilder\-dist\fP, which uses cowbuilder. The main
difference between both is that pbuilder compresses the created chroot as a
tarball, thus using less disc space but needing to uncompress (and possibly
a tarball, thus using less disc space but needing to uncompress (and possibly
compress) its contents again on each run, and cowbuilder doesn't do this.
.SH USAGE
@@ -38,7 +38,7 @@ This optional parameter will attempt to construct a chroot in a foreign
architecture.
For some architecture pairs (e.g. i386 on an amd64 install), the chroot
will be created natively.
For others (e.g. arm64 on an amd64 install), qemu\-user\-static will be
For others (e.g. armel on an i386 install), qemu\-user\-static will be
used.
Note that some combinations (e.g. amd64 on an i386 install) require
special separate kernel handling, and may break in unexpected ways.


@@ -1,44 +0,0 @@
.\" Copyright (C) 2023, Canonical Ltd.
.\"
.\" This program is free software; you can redistribute it and/or
.\" modify it under the terms of the GNU General Public License, version 3.
.\"
.\" This program is distributed in the hope that it will be useful,
.\" but WITHOUT ANY WARRANTY; without even the implied warranty of
.\" MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
.\" General Public License for more details.
.\"
.\" You should have received a copy of the GNU General Public License
.\" along with this program. If not, see <http://www.gnu.org/licenses/>.
.TH pm\-helper 1 "June 2023" ubuntu\-dev\-tools
.SH NAME
pm\-helper \- helper to guide a developer through proposed\-migration work
.SH SYNOPSIS
.B pm\-helper \fR[\fIoptions\fR] [\fIpackage\fR]
.SH DESCRIPTION
Claim a package from proposed\-migration to work on and get additional
information (such as the state of the package in Debian) that may be helpful
in unblocking it.
.PP
This tool is incomplete and under development.
.SH OPTIONS
.TP
.B \-l \fIINSTANCE\fR, \fB\-\-launchpad\fR=\fIINSTANCE\fR
Use the specified instance of Launchpad (e.g. "staging"), instead of
the default of "production".
.TP
.B \-v\fR, \fB--verbose\fR
be more verbose
.TP
\fB\-h\fR, \fB\-\-help\fR
Display a help message and exit
.SH AUTHORS
\fBpm\-helper\fR and this manpage were written by Steve Langasek
<steve.langasek@ubuntu.com>.
.PP
Both are released under the GPLv3 license.


@@ -1,15 +0,0 @@
.TH running\-autopkgtests "1" "18 January 2024" "ubuntu-dev-tools"
.SH NAME
running\-autopkgtests \- dumps a list of currently running autopkgtests
.SH SYNOPSIS
.B running\-autopkgtests
.SH DESCRIPTION
Dumps a list of currently running and queued tests in Autopkgtest.
Pass --running to only see running tests, or --queued to only see
queued tests. Passing both will print both, which is the default behavior.
.SH AUTHOR
.B running\-autopkgtests
was written by Chris Peterson <chris.peterson@canonical.com>.


@@ -11,7 +11,7 @@ contributors to get their Ubuntu installation ready for packaging work. It
ensures that all four components from Ubuntu's official repositories are enabled
along with their corresponding source repositories. It also installs a minimal
set of packages needed for Ubuntu packaging work (ubuntu-dev-tools, devscripts,
debhelper, patchutils, pbuilder, and build-essential). Finally, it assists
debhelper, cdbs, patchutils, pbuilder, and build-essential). Finally, it assists
in defining the DEBEMAIL and DEBFULLNAME environment variables.
.SH AUTHORS


@@ -4,11 +4,11 @@ syncpackage \- copy source packages from Debian to Ubuntu
.\"
.SH SYNOPSIS
.B syncpackage
[\fIoptions\fR] \fI<.dsc URL/path or package name(s)>\fR
[\fIoptions\fR] \fI<.dsc URL/path or package name>\fR
.\"
.SH DESCRIPTION
\fBsyncpackage\fR causes one or more source package(s) to be copied from Debian
to Ubuntu.
\fBsyncpackage\fR causes a source package to be copied from Debian to
Ubuntu.
.PP
\fBsyncpackage\fR allows you to upload files with the same checksums of the
Debian ones, as the common script used by Ubuntu archive administrators does,
@@ -58,7 +58,7 @@ Display more progress information.
\fB\-F\fR, \fB\-\-fakesync\fR
Perform a fakesync, to work around a tarball mismatch between Debian and
Ubuntu.
This option ignores blocklisting, and performs a local sync.
This option ignores blacklisting, and performs a local sync.
It implies \fB\-\-no\-lp\fR, and will leave a signed \fB.changes\fR file
for you to upload.
.TP


@@ -1,14 +1,9 @@
.TH UBUNTU-BUILD "1" "Mar 2024" "ubuntu-dev-tools"
.TH UBUNTU-BUILD "1" "June 2010" "ubuntu-dev-tools"
.SH NAME
ubuntu-build \- command-line interface to Launchpad build operations
.SH SYNOPSIS
.nf
\fBubuntu-build\fR <srcpackage> <release> <operation>
\fBubuntu-build\fR --batch [--retry] [--rescore \fIPRIORITY\fR] [--arch \fIARCH\fR [...]]
[--series \fISERIES\fR] [--state \fIBUILD-STATE\fR]
[-A \fIARCHIVE\fR] [pkg]...
.fi
.B ubuntu-build <srcpackage> <release> <operation>
.SH DESCRIPTION
\fBubuntu-build\fR provides a command line interface to the Launchpad build
@@ -43,7 +38,8 @@ operations.
\fB\-a\fR ARCHITECTURE, \fB\-\-arch\fR=\fIARCHITECTURE\fR
Rebuild or rescore a specific architecture. Valid
architectures are:
armhf, arm64, amd64, i386, powerpc, ppc64el, riscv64, s390x.
armel, armhf, arm64, amd64, hppa, i386, ia64,
lpia, powerpc, ppc64el, riscv64, s390x, sparc.
.TP
Batch processing:
.IP
@@ -63,16 +59,15 @@ Retry builds (give\-back).
\fB\-\-rescore\fR=\fIPRIORITY\fR
Rescore builds to <priority>.
.IP
\fB\-\-arch\fR=\fIARCHITECTURE\fR
\fB\-\-arch2\fR=\fIARCHITECTURE\fR
Affect only 'architecture' (can be used several
times). Valid architectures are:
arm64, amd64, i386, powerpc, ppc64el, riscv64, s390x.
.IP
\fB\-A=\fIARCHIVE\fR
Act on the named archive (ppa) instead of on the main Ubuntu archive.
armel, armhf, arm64, amd64, hppa, i386, ia64,
lpia, powerpc, ppc64el, riscv64, s390x, sparc.
.SH AUTHORS
\fBubuntu-build\fR was written by Martin Pitt <martin.pitt@canonical.com>, and
this manual page was written by Jonathan Patrick Davies <jpds@ubuntu.com>.
.PP
Both are released under the terms of the GNU General Public License, version 3.
Both are released under the terms of the GNU General Public License, version 3
or (at your option) any later version.


@@ -54,6 +54,7 @@ def main():
"multiverse",
"multiverse-manual",
):
url = f"https://merges.ubuntu.com/{component}.json"
try:
headers, page = Http().request(url)


@@ -29,8 +29,6 @@ import logging
import re
import sys
import webbrowser
from collections.abc import Iterable
from email.message import EmailMessage
import debianbts
from launchpadlib.launchpad import Launchpad
@@ -39,10 +37,11 @@ from ubuntutools import getLogger
from ubuntutools.config import UDTConfig
Logger = getLogger()
ATTACHMENT_MAX_SIZE = 2000
def parse_args() -> argparse.Namespace:
def main():
bug_re = re.compile(r"bug=(\d+)")
parser = argparse.ArgumentParser()
parser.add_argument(
"-b",
@@ -72,15 +71,28 @@ def parse_args() -> argparse.Namespace:
"--no-conf", action="store_true", help="Don't read config files or environment variables."
)
parser.add_argument("bugs", nargs="+", help="Bug number(s) or URL(s)")
return parser.parse_args()
options = parser.parse_args()
config = UDTConfig(options.no_conf)
if options.lpinstance is None:
options.lpinstance = config.get_value("LPINSTANCE")
def get_bug_numbers(bug_list: Iterable[str]) -> list[int]:
bug_re = re.compile(r"bug=(\d+)")
if options.dry_run:
launchpad = Launchpad.login_anonymously("ubuntu-dev-tools")
options.verbose = True
else:
launchpad = Launchpad.login_with("ubuntu-dev-tools", options.lpinstance)
if options.verbose:
Logger.setLevel(logging.DEBUG)
debian = launchpad.distributions["debian"]
ubuntu = launchpad.distributions["ubuntu"]
lp_debbugs = launchpad.bug_trackers.getByName(name="debbugs")
bug_nums = []
for bug_num in bug_list:
for bug_num in options.bugs:
if bug_num.startswith("http"):
# bug URL
match = bug_re.search(bug_num)
@@ -89,81 +101,24 @@ def get_bug_numbers(bug_list: Iterable[str]) -> list[int]:
sys.exit(1)
bug_num = match.groups()[0]
bug_num = bug_num.lstrip("#")
bug_nums.append(int(bug_num))
bug_num = int(bug_num)
bug_nums.append(bug_num)
return bug_nums
bugs = debianbts.get_status(bug_nums)
def walk_multipart_message(message: EmailMessage) -> tuple[str, list[tuple[int, EmailMessage]]]:
summary = ""
attachments = []
i = 1
for part in message.walk():
content_type = part.get_content_type()
if content_type.startswith("multipart/"):
# we're already iterating on multipart items
# let's just skip the multipart extra metadata
continue
if content_type == "application/pgp-signature":
# we're not interested in importing pgp signatures
continue
if part.is_attachment():
attachments.append((i, part))
elif content_type.startswith("image/"):
# images here are not attachment, they are inline, but Launchpad can't handle that,
# so let's add them as attachments
summary += f"Message part #{i}\n"
summary += f"[inline image '{part.get_filename()}']\n\n"
attachments.append((i, part))
elif content_type.startswith("text/html"):
summary += f"Message part #{i}\n"
summary += "[inline html]\n\n"
attachments.append((i, part))
elif content_type == "text/plain":
summary += f"Message part #{i}\n"
summary += part.get_content() + "\n"
else:
raise RuntimeError(
f"""Unknown message part
Your Debian bug is too weird to be imported in Launchpad, sorry.
You can fix that by patching this script in ubuntu-dev-tools.
Faulty message part:
{part}"""
)
i += 1
return summary, attachments
def process_bugs(
bugs: Iterable[debianbts.Bugreport],
launchpad: Launchpad,
package: str,
dry_run: bool = True,
browserless: bool = False,
) -> bool:
debian = launchpad.distributions["debian"]
ubuntu = launchpad.distributions["ubuntu"]
lp_debbugs = launchpad.bug_trackers.getByName(name="debbugs")
if not bugs:
Logger.error("Cannot find any of the listed bugs")
sys.exit(1)
err = False
for bug in bugs:
ubupackage = bug.source
if package:
ubupackage = package
ubupackage = package = bug.source
if options.package:
ubupackage = options.package
bug_num = bug.bug_num
subject = bug.subject
log = debianbts.get_bug_log(bug_num)
message = log[0]["message"]
assert isinstance(message, EmailMessage)
attachments: list[tuple[int, EmailMessage]] = []
if message.is_multipart():
summary, attachments = walk_multipart_message(message)
else:
summary = str(message.get_payload())
summary = log[0]["message"].get_payload()
target = ubuntu.getSourcePackage(name=ubupackage)
if target is None:
Logger.error(
@@ -182,73 +137,24 @@ def process_bugs(
Logger.debug("Subject: %s", subject)
Logger.debug("Description: ")
Logger.debug(description)
for i, attachment in attachments:
Logger.debug("Attachment #%s (%s)", i, attachment.get_filename() or "inline")
Logger.debug("Content:")
if attachment.get_content_type() == "text/plain":
content = attachment.get_content()
if len(content) > ATTACHMENT_MAX_SIZE:
content = (
content[:ATTACHMENT_MAX_SIZE]
+ f" [attachment cropped after {ATTACHMENT_MAX_SIZE} characters...]"
)
Logger.debug(content)
else:
Logger.debug("[data]")
if dry_run:
if options.dry_run:
Logger.info("Dry-Run: not creating Ubuntu bug.")
continue
u_bug = launchpad.bugs.createBug(target=target, title=subject, description=description)
for i, attachment in attachments:
name = f"#{i}-{attachment.get_filename() or "inline"}"
content = attachment.get_content()
if isinstance(content, str):
# Launchpad only wants bytes
content = content.encode()
u_bug.addAttachment(
filename=name,
data=content,
comment=f"Imported from Debian bug http://bugs.debian.org/{bug_num}",
)
d_sp = debian.getSourcePackage(name=package)
if d_sp is None and package:
d_sp = debian.getSourcePackage(name=package)
if d_sp is None and options.package:
d_sp = debian.getSourcePackage(name=options.package)
d_task = u_bug.addTask(target=d_sp)
d_watch = u_bug.addWatch(remote_bug=bug_num, bug_tracker=lp_debbugs)
d_task.bug_watch = d_watch
d_task.lp_save()
Logger.info("Opened %s", u_bug.web_link)
if not browserless:
if not options.browserless:
webbrowser.open(u_bug.web_link)
return err
def main() -> None:
options = parse_args()
config = UDTConfig(options.no_conf)
if options.lpinstance is None:
options.lpinstance = config.get_value("LPINSTANCE")
if options.dry_run:
launchpad = Launchpad.login_anonymously("ubuntu-dev-tools")
options.verbose = True
else:
launchpad = Launchpad.login_with("ubuntu-dev-tools", options.lpinstance)
if options.verbose:
Logger.setLevel(logging.DEBUG)
bugs = debianbts.get_status(get_bug_numbers(options.bugs))
if not bugs:
Logger.error("Cannot find any of the listed bugs")
sys.exit(1)
if process_bugs(bugs, launchpad, options.package, options.dry_run, options.browserless):
if err:
sys.exit(1)
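The bug-argument normalization in the hunks above (accepting a bug URL, `#NNN`, or a bare number) can be sketched as a self-contained helper; `parse_bug` is a hypothetical name, but the regex and the `lstrip("#")` mirror the diffed code:

```python
import re

# Same pattern the script uses to pull a bug number out of a BTS URL.
BUG_RE = re.compile(r"bug=(\d+)")

def parse_bug(arg: str) -> int:
    """Normalize a Debian bug argument (URL, '#NNN', or 'NNN') to an int."""
    if arg.startswith("http"):
        match = BUG_RE.search(arg)
        if match is None:
            raise ValueError(f"unrecognized bug URL: {arg}")
        arg = match.groups()[0]
    return int(arg.lstrip("#"))
```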


@@ -155,7 +155,6 @@ proxy="_unset_"
DEBOOTSTRAP_NO_CHECK_GPG=0
EATMYDATA=1
CCACHE=0
USE_PKGBINARYMANGLER=0
while :; do
case "$1" in
@@ -167,7 +166,7 @@ while :; do
--arch)
CHROOT_ARCH="$2"
case $2 in
armhf|i386)
armel|armhf|i386|lpia)
if [ -z "$personality" ]; then
personality="linux32"
fi
@@ -304,27 +303,11 @@ if [ ! -w /var/lib/sbuild ]; then
# Prepare a usable default .sbuildrc
if [ ! -e ~/.sbuildrc ]; then
cat > ~/.sbuildrc <<EOM
# *** THIS COMMAND IS DEPRECATED ***
#
# In sbuild 0.87.0 and later, the unshare backend is available. This is
# expected to become the default in a future release.
#
# This is the new preferred way of building Debian packages, making the manual
# creation of schroots no longer necessary. To retain the default behavior,
# you may remove this comment block and continue.
#
# To test the unshare backend while retaining the default settings, run sbuild
# with --chroot-mode=unshare like this:
# $ sbuild --chroot-mode=unshare --dist=unstable hello
#
# To switch to the unshare backend by default (recommended), uncomment the
# following lines and delete the rest of the file (with the exception of the
# last two lines):
#\$chroot_mode = 'unshare';
#\$unshare_mmdebstrap_keep_tarball = 1;
# *** VERIFY AND UPDATE \$mailto and \$maintainer_name BELOW ***
# Mail address where logs are sent to (mandatory, no default!)
\$mailto = '$USER';
# Name to use as override in .changes files for the Maintainer: field
#\$maintainer_name='$USER <$USER@localhost>';
@@ -668,7 +651,6 @@ ubuntu)
if ubuntu_dist_ge "$RELEASE" "edgy"; then
# Add pkgbinarymangler (edgy and later)
BUILD_PKGS="$BUILD_PKGS pkgbinarymangler"
USE_PKGBINARYMANGLER=1
# Disable recommends for a smaller chroot (gutsy and later only)
if ubuntu_dist_ge "$RELEASE" "gutsy"; then
BUILD_PKGS="--no-install-recommends $BUILD_PKGS"
@@ -685,7 +667,7 @@ debian)
DEBOOTSTRAP_MIRROR="http://deb.debian.org/debian"
fi
if [ -z "$COMPONENTS" ]; then
COMPONENTS="main non-free non-free-firmware contrib"
COMPONENTS="main non-free contrib"
fi
if [ -z "$SOURCES_PROPOSED_SUITE" ]; then
SOURCES_PROPOSED_SUITE="RELEASE-proposed-updates"
@@ -768,12 +750,12 @@ DEBOOTSTRAP_COMMAND=debootstrap
if [ "$CHROOT_ARCH" != "$HOST_ARCH" ] ; then
case "$CHROOT_ARCH-$HOST_ARCH" in
# Sometimes we don't need qemu
amd64-i386|arm64-armhf|armhf-arm64|i386-amd64|powerpc-ppc64|ppc64-powerpc)
amd64-i386|amd64-lpia|armel-armhf|armhf-armel|arm64-armel|arm64-armhf|armel-arm64|armhf-arm64|i386-amd64|i386-lpia|lpia-i386|powerpc-ppc64|ppc64-powerpc|sparc-sparc64|sparc64-sparc)
;;
# Sometimes we do
*)
DEBOOTSTRAP_COMMAND=debootstrap
if ! which "qemu-x86_64-static"; then
DEBOOTSTRAP_COMMAND=qemu-debootstrap
if ! which "$DEBOOTSTRAP_COMMAND"; then
sudo apt-get install qemu-user-static
fi
;;
@ -892,13 +874,6 @@ EOM
fi
fi
if [ -z "$SKIP_PROPOSED" ]; then
TEMP_PREFERENCES=`mktemp -t preferences-XXXXXX`
cat >> "$TEMP_PREFERENCES" <<EOM
# override for NotAutomatic: yes
Package: *
Pin: release a=*-proposed
Pin-Priority: 500
EOM
cat >> "$TEMP_SOURCES" <<EOM
deb ${MIRROR_ARCHS}${DEBOOTSTRAP_MIRROR} $SOURCES_PROPOSED_SUITE ${COMPONENTS}
deb-src ${DEBOOTSTRAP_MIRROR} $SOURCES_PROPOSED_SUITE ${COMPONENTS}
@ -924,12 +899,9 @@ fi
cat "$TEMP_SOURCES" | sed -e "s|RELEASE|$RELEASE|g" | \
sudo bash -c "cat > $MNT/etc/apt/sources.list"
rm -f "$TEMP_SOURCES"
if [ -n "$TEMP_PREFERENCES" ]; then
sudo mv "$TEMP_PREFERENCES" $MNT/etc/apt/preferences.d/proposed.pref
fi
# Copy the timezone (uncomment this if you want to use your local time zone)
#sudo cp -P --remove-destination /etc/localtime /etc/timezone "$MNT"/etc/
# Copy the timezone (comment this out if you want to leave the chroot at UTC)
sudo cp -P --remove-destination /etc/localtime /etc/timezone "$MNT"/etc/
# Create a schroot entry for this chroot
TEMP_SCHROOTCONF=`mktemp -t schrootconf-XXXXXX`
TEMPLATE_SCHROOTCONF=~/.mk-sbuild.schroot.conf
@ -1048,25 +1020,6 @@ EOF
EOM
fi
if [ "$USE_PKGBINARYMANGLER" = 1 ]; then
sudo bash -c "cat >> $MNT/finish.sh" <<EOM
mkdir -p /etc/pkgbinarymangler/
cat > /etc/pkgbinarymangler/maintainermangler.conf <<EOF
# pkgmaintainermangler configuration file
# pkgmaintainermangler will do nothing unless enable is set to "true"
enable: true
# Configure what happens if /CurrentlyBuilding is present, but invalid
# (i. e. it does not contain a Package: field). If "ignore" (default),
# the file is ignored (i. e. the Maintainer field is mangled) and a
# warning is printed. If "fail" (or any other value), pkgmaintainermangler
# exits with an error, which causes a package build to fail.
invalid_currentlybuilding: ignore
EOF
EOM
fi
if [ -n "$TARGET_ARCH" ]; then
sudo bash -c "cat >> $MNT/finish.sh" <<EOM
# Configure target architecture
@ -1085,7 +1038,7 @@ apt-get update || true
echo set debconf/frontend Noninteractive | debconf-communicate
echo set debconf/priority critical | debconf-communicate
# Install basic build tool set, trying to match buildd
apt-get -y --force-yes -o Dpkg::Options::="--force-confold" install $BUILD_PKGS
apt-get -y --force-yes install $BUILD_PKGS
# Set up expected /dev entries
if [ ! -r /dev/stdin ]; then ln -s /proc/self/fd/0 /dev/stdin; fi
if [ ! -r /dev/stdout ]; then ln -s /proc/self/fd/1 /dev/stdout; fi
@ -95,11 +95,7 @@ class PbuilderDist:
# Builder
self.builder = builder
# Distro info
self.debian_distro_info = DebianDistroInfo()
self.ubuntu_distro_info = UbuntuDistroInfo()
self._debian_distros = self.debian_distro_info.all + ["stable", "testing", "unstable"]
self._debian_distros = DebianDistroInfo().all + ["stable", "testing", "unstable"]
# Ensure that the used builder is installed
paths = set(os.environ["PATH"].split(":"))
@ -155,9 +151,8 @@ class PbuilderDist:
if not os.path.isfile(os.path.join("/usr/share/debootstrap/scripts/", distro)):
if os.path.isdir("/usr/share/debootstrap/scripts/"):
# Debian experimental doesn't have a debootstrap file but
# should work nevertheless. Ubuntu releases automatically use
# the gutsy script as of debootstrap 1.0.128+nmu2ubuntu1.1.
if distro not in (self._debian_distros + self.ubuntu_distro_info.all):
# should work nevertheless.
if distro not in self._debian_distros:
question = (
f'Warning: Unknown distribution "{distro}". ' "Do you want to continue"
)
@ -275,7 +270,7 @@ class PbuilderDist:
mirror = os.environ.get("MIRRORSITE", config.get_value("DEBIAN_MIRROR"))
components = "main"
if self.extra_components:
components += " contrib non-free non-free-firmware"
components += " contrib non-free"
else:
mirror = os.environ.get("MIRRORSITE", config.get_value("UBUNTU_MIRROR"))
if self.build_architecture not in ("amd64", "i386"):
@ -293,24 +288,23 @@ class PbuilderDist:
othermirrors.append(repo)
if self.target_distro in self._debian_distros:
debian_info = DebianDistroInfo()
try:
codename = self.debian_distro_info.codename(
self.target_distro, default=self.target_distro
)
codename = debian_info.codename(self.target_distro, default=self.target_distro)
except DistroDataOutdated as error:
Logger.warning(error)
if codename in (self.debian_distro_info.devel(), "experimental"):
if codename in (debian_info.devel(), "experimental"):
self.enable_security = False
self.enable_updates = False
self.enable_proposed = False
elif codename in (self.debian_distro_info.testing(), "testing"):
elif codename in (debian_info.testing(), "testing"):
self.enable_updates = False
if self.enable_security:
pocket = "-security"
with suppress(ValueError):
# before bullseye (version 11) security suite is /updates
if float(self.debian_distro_info.version(codename)) < 11.0:
if float(debian_info.version(codename)) < 11.0:
pocket = "/updates"
othermirrors.append(
f"deb {config.get_value('DEBSEC_MIRROR')}"
@ -328,7 +322,7 @@ class PbuilderDist:
aptcache = os.path.join(self.base, "aptcache", "debian")
else:
try:
dev_release = self.target_distro == self.ubuntu_distro_info.devel()
dev_release = self.target_distro == UbuntuDistroInfo().devel()
except DistroDataOutdated as error:
Logger.warning(error)
dev_release = True
@ -493,12 +487,22 @@ def main():
requested_arch,
) not in [
("amd64", "i386"),
("amd64", "lpia"),
("arm", "armel"),
("armel", "arm"),
("armel", "armhf"),
("armhf", "armel"),
("arm64", "arm"),
("arm64", "armhf"),
("arm64", "armel"),
("i386", "lpia"),
("lpia", "i386"),
("powerpc", "ppc64"),
("ppc64", "powerpc"),
("sparc", "sparc64"),
("sparc64", "sparc"),
]:
args += ["--debootstrap", "debootstrap"]
args += ["--debootstrap", "qemu-debootstrap"]
if "mainonly" in sys.argv or "--main-only" in sys.argv:
app.extra_components = False

142
pm-helper

@ -1,142 +0,0 @@
#!/usr/bin/python3
# Find the next thing to work on for proposed-migration
# Copyright (C) 2023 Canonical Ltd.
# Author: Steve Langasek <steve.langasek@ubuntu.com>
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License, version 3.
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
# General Public License for more details.
# You should have received a copy of the GNU General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
import lzma
import sys
import webbrowser
from argparse import ArgumentParser
import yaml
from launchpadlib.launchpad import Launchpad
from ubuntutools.utils import get_url
# proposed-migration is only concerned with the devel series; unlike other
# tools, don't make this configurable
excuses_url = "https://ubuntu-archive-team.ubuntu.com/proposed-migration/update_excuses.yaml.xz"
def get_proposed_version(excuses, package):
for k in excuses["sources"]:
if k["source"] == package:
return k.get("new-version")
return None
def claim_excuses_bug(launchpad, bug, package):
print(f"LP: #{bug.id}: {bug.title}")
ubuntu = launchpad.distributions["ubuntu"]
series = ubuntu.current_series.fullseriesname
for task in bug.bug_tasks:
# targeting to a series doesn't make the default task disappear,
# it just makes it useless
if task.bug_target_name == f"{package} ({series})":
our_task = task
break
if task.bug_target_name == f"{package} (Ubuntu)":
our_task = task
if our_task.assignee == launchpad.me:
print("Bug already assigned to you.")
return True
if our_task.assignee:
print(f"Currently assigned to {our_task.assignee.name}")
print("""Do you want to claim this bug? [yN] """, end="")
sys.stdout.flush()
response = sys.stdin.readline()
if response.strip().lower().startswith("y"):
our_task.assignee = launchpad.me
our_task.lp_save()
return True
return False
def create_excuses_bug(launchpad, package, version):
print("Will open a new bug")
bug = launchpad.bugs.createBug(
title=f"proposed-migration for {package} {version}",
tags=("update-excuse"),
target=f"https://api.launchpad.net/devel/ubuntu/+source/{package}",
description=f"{package} {version} is stuck in -proposed.",
)
task = bug.bug_tasks[0]
task.assignee = launchpad.me
task.lp_save()
print(f"Opening {bug.web_link} in browser")
webbrowser.open(bug.web_link)
return bug
def has_excuses_bugs(launchpad, package):
ubuntu = launchpad.distributions["ubuntu"]
pkg = ubuntu.getSourcePackage(name=package)
if not pkg:
raise ValueError(f"No such source package: {package}")
tasks = pkg.searchTasks(tags=["update-excuse"], order_by=["id"])
bugs = [task.bug for task in tasks]
if not bugs:
return False
if len(bugs) == 1:
print(f"There is 1 open update-excuse bug against {package}")
else:
print(f"There are {len(bugs)} open update-excuse bugs against {package}")
for bug in bugs:
if claim_excuses_bug(launchpad, bug, package):
return True
return True
def main():
parser = ArgumentParser()
parser.add_argument("-l", "--launchpad", dest="launchpad_instance", default="production")
parser.add_argument(
"-v", "--verbose", default=False, action="store_true", help="be more verbose"
)
parser.add_argument("package", nargs="?", help="act on this package only")
args = parser.parse_args()
args.launchpad = Launchpad.login_with("pm-helper", args.launchpad_instance, version="devel")
f = get_url(excuses_url, False)
with lzma.open(f) as lzma_f:
excuses = yaml.load(lzma_f, Loader=yaml.CSafeLoader)
if args.package:
try:
if not has_excuses_bugs(args.launchpad, args.package):
proposed_version = get_proposed_version(excuses, args.package)
if not proposed_version:
print(f"Package {args.package} not found in -proposed.")
sys.exit(1)
create_excuses_bug(args.launchpad, args.package, proposed_version)
except ValueError as e:
sys.stderr.write(f"{e}\n")
else:
pass # for now
if __name__ == "__main__":
sys.exit(main())


@ -1,6 +1,5 @@
python-debian
python-debianbts
dateutil
distro-info
httplib2
launchpadlib


@ -183,7 +183,7 @@ def display_verbose(package, values):
Logger.info("No reverse dependencies found")
return
def log_package(values, package, arch, dependency, visited, offset=0):
def log_package(values, package, arch, dependency, offset=0):
line = f"{' ' * offset}* {package}"
if all_archs and set(arch) != all_archs:
line += f" [{' '.join(sorted(arch))}]"
@ -192,9 +192,6 @@ def display_verbose(package, values):
line += " " * (30 - len(line))
line += f" (for {dependency})"
Logger.info(line)
if package in visited:
return
visited = visited.copy().add(package)
data = values.get(package)
if data:
offset = offset + 1
@ -205,7 +202,6 @@ def display_verbose(package, values):
rdep["Package"],
rdep.get("Architectures", all_archs),
rdep.get("Dependency"),
visited,
offset,
)
@ -227,7 +223,6 @@ def display_verbose(package, values):
rdep["Package"],
rdep.get("Architectures", all_archs),
rdep.get("Dependency"),
{package},
)
Logger.info("")


@ -1,81 +0,0 @@
#!/usr/bin/python3
# -*- Mode: Python; coding: utf-8; indent-tabs-mode: nil; tab-width: 4 -*-
# Authors:
# Andy P. Whitcroft
# Christian Ehrhardt
# Chris Peterson <chris.peterson@canonical.com>
#
# Copyright (C) 2024 Canonical Ltd.
# This program is free software: you can redistribute it and/or modify it
# under the terms of the GNU General Public License version 3, as published
# by the Free Software Foundation.
#
# This program is distributed in the hope that it will be useful, but
# WITHOUT ANY WARRANTY; without even the implied warranties of
# MERCHANTABILITY, SATISFACTORY QUALITY, or FITNESS FOR A PARTICULAR PURPOSE.
# See the GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License along
# with this program. If not, see <http://www.gnu.org/licenses/>.
"""Dumps a list of currently running tests in Autopkgtest"""
__example__ = """
Display first listed test running on amd64 hardware:
$ running-autopkgtests | grep amd64 | head -n1
R 0:01:40 systemd-upstream - focal amd64\
upstream-systemd-ci/systemd-ci - ['CFLAGS=-O0', 'DEB_BUILD_PROFILES=noudeb',\
'TEST_UPSTREAM=1', 'CONFFLAGS_UPSTREAM=--werror -Dslow-tests=true',\
'UPSTREAM_PULL_REQUEST=23153',\
'GITHUB_STATUSES_URL=https://api.github.com/repos/\
systemd/systemd/statuses/cfb0935923dff8050315b5dd22ce8ab06461ff0e']
"""
import sys
from argparse import ArgumentParser, RawDescriptionHelpFormatter
from ubuntutools.running_autopkgtests import get_queued, get_running
def parse_args():
description = (
"Dumps a list of currently running and queued tests in Autopkgtest. "
"Pass --running to only see running tests, or --queued to only see "
"queued tests. Passing both will print both, which is the default behavior. "
)
parser = ArgumentParser(
prog="running-autopkgtests",
description=description,
epilog=f"example: {__example__}",
formatter_class=RawDescriptionHelpFormatter,
)
parser.add_argument(
"-r", "--running", action="store_true", help="Print running autopkgtests (default: true)"
)
parser.add_argument(
"-q", "--queued", action="store_true", help="Print queued autopkgtests (default: true)"
)
options = parser.parse_args()
# If neither flag was specified, default to both, not neither
if not options.running and not options.queued:
options.running = True
options.queued = True
return options
def main() -> int:
args = parse_args()
if args.running:
print(get_running())
if args.queued:
print(get_queued())
return 0
if __name__ == "__main__":
sys.exit(main())


@ -104,7 +104,7 @@ echo "In order to do packaging work, you'll need a minimal set of packages."
echo "Those, together with other packages which, though optional, have proven"
echo "to be useful, will now be installed."
echo
sudo apt-get install ubuntu-dev-tools devscripts debhelper patchutils pbuilder build-essential
sudo apt-get install ubuntu-dev-tools devscripts debhelper cdbs patchutils pbuilder build-essential
separator2
echo "Enabling the source repository"


@ -32,18 +32,17 @@ def make_pep440_compliant(version: str) -> str:
scripts = [
"backportpackage",
"bitesize",
"check-mir",
"check-symbols",
"dch-repeat",
"grab-merge",
"grep-merges",
"import-bug-from-debian",
"lp-bitesize",
"merge-changelog",
"mk-sbuild",
"pbuilder-dist",
"pbuilder-dist-simple",
"pm-helper",
"pull-pkg",
"pull-debian-debdiff",
"pull-debian-source",
@ -65,7 +64,6 @@ scripts = [
"requestbackport",
"requestsync",
"reverse-depends",
"running-autopkgtests",
"seeded-in-ubuntu",
"setup-packaging-environment",
"sponsor-patch",


@ -49,7 +49,6 @@ from ubuntutools.requestsync.mail import get_debian_srcpkg as requestsync_mail_g
from ubuntutools.version import Version
Logger = getLogger()
cached_sync_blocklist = None
def remove_signature(dscname):
@ -144,7 +143,7 @@ def sync_dsc(
if ubuntu_ver.is_modified_in_ubuntu():
if not force:
Logger.error("--force is required to discard Ubuntu changes.")
return None
sys.exit(1)
Logger.warning(
"Overwriting modified Ubuntu version %s, setting current version to %s",
@ -158,7 +157,7 @@ def sync_dsc(
src_pkg.pull()
except DownloadError as e:
Logger.error("Failed to download: %s", str(e))
return None
sys.exit(1)
src_pkg.unpack()
needs_fakesync = not (need_orig or ubu_pkg.verify_orig())
@ -167,13 +166,13 @@ def sync_dsc(
Logger.warning("Performing a fakesync")
elif not needs_fakesync and fakesync:
Logger.error("Fakesync not required, aborting.")
return None
sys.exit(1)
elif needs_fakesync and not fakesync:
Logger.error(
"The checksums of the Debian and Ubuntu packages "
"mismatch. A fake sync using --fakesync is required."
)
return None
sys.exit(1)
if fakesync:
# Download Ubuntu files (override Debian source tarballs)
@ -181,7 +180,7 @@ def sync_dsc(
ubu_pkg.pull()
except DownloadError as e:
Logger.error("Failed to download: %s", str(e))
return None
sys.exit(1)
# change into package directory
directory = src_pkg.source + "-" + new_ver.upstream_version
@ -266,7 +265,7 @@ def sync_dsc(
returncode = subprocess.call(cmd)
if returncode != 0:
Logger.error("Source-only build with debuild failed. Please check build log above.")
return None
sys.exit(1)
def fetch_source_pkg(package, dist, version, component, ubuntu_release, mirror):
@ -296,7 +295,7 @@ def fetch_source_pkg(package, dist, version, component, ubuntu_release, mirror):
udtexceptions.SeriesNotFoundException,
) as e:
Logger.error(str(e))
return None
sys.exit(1)
if version is None:
version = Version(debian_srcpkg.getVersion())
try:
@ -307,7 +306,7 @@ def fetch_source_pkg(package, dist, version, component, ubuntu_release, mirror):
ubuntu_version = Version("~")
except udtexceptions.SeriesNotFoundException as e:
Logger.error(str(e))
return None
sys.exit(1)
if ubuntu_version >= version:
# The LP importer is maybe out of date
debian_srcpkg = requestsync_mail_get_debian_srcpkg(package, dist)
@ -321,16 +320,16 @@ def fetch_source_pkg(package, dist, version, component, ubuntu_release, mirror):
ubuntu_version,
ubuntu_release,
)
return None
sys.exit(1)
if component is None:
component = debian_srcpkg.getComponent()
assert component in ("main", "contrib", "non-free", "non-free-firmware")
assert component in ("main", "contrib", "non-free")
return DebianSourcePackage(package, version.full_version, component, mirrors=mirrors)
def copy(src_pkg, release, bugs, sponsoree=None, simulate=False, force=False, yes=False):
def copy(src_pkg, release, bugs, sponsoree=None, simulate=False, force=False):
"""Copy a source package from Debian to Ubuntu using the Launchpad API."""
ubuntu = Distribution("ubuntu")
debian_archive = Distribution("debian").getArchive()
@ -353,7 +352,7 @@ def copy(src_pkg, release, bugs, sponsoree=None, simulate=False, force=False, ye
"Debian version %s has not been picked up by LP yet. Please try again later.",
src_pkg.version,
)
return None
sys.exit(1)
try:
ubuntu_spph = get_ubuntu_srcpkg(src_pkg.source, ubuntu_series, ubuntu_pocket)
@ -374,7 +373,7 @@ def copy(src_pkg, release, bugs, sponsoree=None, simulate=False, force=False, ye
base_version = ubuntu_version.get_related_debian_version()
if not force and ubuntu_version.is_modified_in_ubuntu():
Logger.error("--force is required to discard Ubuntu changes.")
return None
sys.exit(1)
# Check whether a fakesync would be required.
if not src_pkg.dsc.compare_dsc(ubuntu_pkg.dsc):
@ -382,7 +381,7 @@ def copy(src_pkg, release, bugs, sponsoree=None, simulate=False, force=False, ye
"The checksums of the Debian and Ubuntu packages "
"mismatch. A fake sync using --fakesync is required."
)
return None
sys.exit(1)
except udtexceptions.PackageNotFoundException:
base_version = Version("~")
Logger.info(
@ -403,10 +402,9 @@ def copy(src_pkg, release, bugs, sponsoree=None, simulate=False, force=False, ye
if sponsoree:
Logger.info("Sponsoring this sync for %s (%s)", sponsoree.display_name, sponsoree.name)
if not yes:
answer = YesNoQuestion().ask("Sync this package", "no")
if answer != "yes":
return
answer = YesNoQuestion().ask("Sync this package", "no")
if answer != "yes":
return
try:
ubuntu_archive.copyPackage(
@ -421,29 +419,26 @@ def copy(src_pkg, release, bugs, sponsoree=None, simulate=False, force=False, ye
except HTTPError as error:
Logger.error("HTTP Error %s: %s", error.response.status, error.response.reason)
Logger.error(error.content)
return None
sys.exit(1)
Logger.info("Request succeeded; you should get an e-mail once it is processed.")
bugs = sorted(set(bugs))
if bugs:
Logger.info("Launchpad bugs to be closed: %s", ", ".join(str(bug) for bug in bugs))
Logger.info("Please wait for the sync to be successful before closing bugs.")
if yes:
answer = YesNoQuestion().ask("Close bugs", "yes")
if answer == "yes":
close_bugs(bugs, src_pkg.source, src_pkg.version.full_version, changes, sponsoree)
else:
answer = YesNoQuestion().ask("Close bugs", "yes")
if answer == "yes":
close_bugs(bugs, src_pkg.source, src_pkg.version.full_version, changes, sponsoree)
def is_blocklisted(query):
"""Determine if package "query" is in the sync blocklist
Returns tuple of (blocklisted, comments)
blocklisted is one of False, 'CURRENT', 'ALWAYS'
def is_blacklisted(query):
"""Determine if package "query" is in the sync blacklist
Returns tuple of (blacklisted, comments)
blacklisted is one of False, 'CURRENT', 'ALWAYS'
"""
series = Launchpad.distributions["ubuntu"].current_series
lp_comments = series.getDifferenceComments(source_package_name=query)
blocklisted = False
blacklisted = False
comments = [
f"{c.body_text}\n -- {c.comment_author.name}"
f" {c.comment_date.strftime('%a, %d %b %Y %H:%M:%S +0000')}"
@ -451,38 +446,32 @@ def is_blocklisted(query):
]
for diff in series.getDifferencesTo(source_package_name_filter=query):
if diff.status == "Blacklisted current version" and blocklisted != "ALWAYS":
blocklisted = "CURRENT"
if diff.status == "Blacklisted current version" and blacklisted != "ALWAYS":
blacklisted = "CURRENT"
if diff.status == "Blacklisted always":
blocklisted = "ALWAYS"
blacklisted = "ALWAYS"
global cached_sync_blocklist
if not cached_sync_blocklist:
url = "https://ubuntu-archive-team.ubuntu.com/sync-blocklist.txt"
try:
with urllib.request.urlopen(url) as f:
cached_sync_blocklist = f.read().decode("utf-8")
except:
print("WARNING: unable to download the sync blocklist. Erring on the side of caution.")
return ("ALWAYS", "INTERNAL ERROR: Unable to fetch sync blocklist")
# Old blacklist:
url = "http://people.canonical.com/~ubuntu-archive/sync-blacklist.txt"
with urllib.request.urlopen(url) as f:
applicable_lines = []
for line in f:
line = line.decode("utf-8")
if not line.strip():
applicable_lines = []
continue
applicable_lines.append(line)
try:
line = line[: line.index("#")]
except ValueError:
pass
source = line.strip()
if source and fnmatch.fnmatch(query, source):
comments += ["From sync-blacklist.txt:"] + applicable_lines
blacklisted = "ALWAYS"
break
applicable_lines = []
for line in cached_sync_blocklist.splitlines():
if not line.strip():
applicable_lines = []
continue
applicable_lines.append(line)
try:
line = line[:line.index("#")]
except ValueError:
pass
source = line.strip()
if source and fnmatch.fnmatch(query, source):
comments += ["From sync-blocklist.txt:"] + applicable_lines
blocklisted = "ALWAYS"
break
return (blocklisted, comments)
return (blacklisted, comments)
def close_bugs(bugs, package, version, changes, sponsoree):
@ -515,16 +504,10 @@ def close_bugs(bugs, package, version, changes, sponsoree):
def parse():
"""Parse given command-line parameters."""
usage = "%(prog)s [options] <.dsc URL/path or package name(s)>"
usage = "%(prog)s [options] <.dsc URL/path or package name>"
epilog = f"See {os.path.basename(sys.argv[0])}(1) for more info."
parser = argparse.ArgumentParser(usage=usage, epilog=epilog)
parser.add_argument(
"-y",
"--yes",
action="store_true",
help="Automatically sync without prompting. Use with caution and care."
)
parser.add_argument("-d", "--distribution", help="Debian distribution to sync from.")
parser.add_argument("-r", "--release", help="Specify target Ubuntu release.")
parser.add_argument("-V", "--debian-version", help="Specify the version to sync from.")
@ -619,7 +602,7 @@ def parse():
metavar="UBUNTU_MIRROR",
help=f"Preferred Ubuntu mirror (default: {UDTConfig.defaults['UBUNTU_MIRROR']})",
)
parser.add_argument("package", nargs="*", help=argparse.SUPPRESS)
parser.add_argument("package", help=argparse.SUPPRESS)
args = parser.parse_args()
if args.fakesync:
@ -630,10 +613,10 @@ def parse():
except TypeError:
parser.error("Invalid bug number(s) specified.")
if args.component not in (None, "main", "contrib", "non-free", "non-free-firmware"):
if args.component not in (None, "main", "contrib", "non-free"):
parser.error(
f"{args.component} is not a valid Debian component. "
f"It should be one of main, contrib, non-free, or non-free-firmware."
f"It should be one of main, contrib, or non-free."
)
if args.lp and args.uploader_name:
@ -644,9 +627,8 @@ def parse():
# ignored with args.lp, and do not require warnings.
if args.lp:
for package in args.package:
if package.endswith(".dsc"):
parser.error(".dsc files can only be synced using --no-lp.")
if args.package.endswith(".dsc"):
parser.error(".dsc files can only be synced using --no-lp.")
return args
@ -669,15 +651,14 @@ def main():
if args.lpinstance is None:
args.lpinstance = config.get_value("LPINSTANCE")
# devel for copyPackage and changelogUrl
kwargs = {"service": args.lpinstance, "api_version": "devel"}
try:
# devel for copyPackage and changelogUrl
kwargs = {"service": args.lpinstance, "api_version": "devel"}
if args.lp and not args.simulate:
Launchpad.login(**kwargs)
else:
Launchpad.login_anonymously(**kwargs)
except IOError as e:
Logger.error("Could not authenticate to LP: %s", str(e))
except IOError:
sys.exit(1)
if args.release is None:
@ -720,80 +701,75 @@ def main():
elif args.uploader_email is None:
args.uploader_email = ubu_email(export=False)[1]
for package in args.package:
src_pkg = fetch_source_pkg(
package,
args.distribution,
args.debian_version,
args.component,
args.release,
args.debian_mirror,
)
if not src_pkg:
continue
src_pkg = fetch_source_pkg(
args.package,
args.distribution,
args.debian_version,
args.component,
args.release,
args.debian_mirror,
)
blocklisted, comments = is_blocklisted(src_pkg.source)
blocklist_fail = False
if blocklisted:
messages = []
blacklisted, comments = is_blacklisted(src_pkg.source)
blacklist_fail = False
if blacklisted:
messages = []
if blocklisted == "CURRENT":
Logger.debug(
"Source package %s is temporarily blocklisted "
"(blocklisted_current). "
"Ubuntu ignores these for now. "
"See also LP: #841372",
src_pkg.source,
)
else:
if args.fakesync:
messages += ["Doing a fakesync, overriding blocklist."]
else:
blocklist_fail = True
messages += [
"If this package needs a fakesync, use --fakesync",
"If you think this package shouldn't be "
"blocklisted, please file a bug explaining your "
"reasoning and subscribe ~ubuntu-archive.",
]
if blocklist_fail:
Logger.error("Source package %s is blocklisted.", src_pkg.source)
elif blocklisted == "ALWAYS":
Logger.info("Source package %s is blocklisted.", src_pkg.source)
if messages:
for message in messages:
for line in textwrap.wrap(message):
Logger.info(line)
if comments:
Logger.info("Blacklist Comments:")
for comment in comments:
for line in textwrap.wrap(comment):
Logger.info(" %s", line)
if blocklist_fail:
continue
if args.lp:
if not copy(src_pkg, args.release, args.bugs, sponsoree, args.simulate, args.force, args.yes):
continue
if blacklisted == "CURRENT":
Logger.debug(
"Source package %s is temporarily blacklisted "
"(blacklisted_current). "
"Ubuntu ignores these for now. "
"See also LP: #841372",
src_pkg.source,
)
else:
os.environ["DEB_VENDOR"] = "Ubuntu"
if not sync_dsc(
src_pkg,
args.distribution,
args.release,
args.uploader_name,
args.uploader_email,
args.bugs,
args.ubuntu_mirror,
args.keyid,
args.simulate,
args.force,
args.fakesync,
):
continue
if args.fakesync:
messages += ["Doing a fakesync, overriding blacklist."]
else:
blacklist_fail = True
messages += [
"If this package needs a fakesync, use --fakesync",
"If you think this package shouldn't be "
"blacklisted, please file a bug explaining your "
"reasoning and subscribe ~ubuntu-archive.",
]
if blacklist_fail:
Logger.error("Source package %s is blacklisted.", src_pkg.source)
elif blacklisted == "ALWAYS":
Logger.info("Source package %s is blacklisted.", src_pkg.source)
if messages:
for message in messages:
for line in textwrap.wrap(message):
Logger.info(line)
if comments:
Logger.info("Blacklist Comments:")
for comment in comments:
for line in textwrap.wrap(comment):
Logger.info(" %s", line)
if blacklist_fail:
sys.exit(1)
if args.lp:
copy(src_pkg, args.release, args.bugs, sponsoree, args.simulate, args.force)
else:
os.environ["DEB_VENDOR"] = "Ubuntu"
sync_dsc(
src_pkg,
args.distribution,
args.release,
args.uploader_name,
args.uploader_email,
args.bugs,
args.ubuntu_mirror,
args.keyid,
args.simulate,
args.force,
args.fakesync,
)
if __name__ == "__main__":


@ -2,16 +2,16 @@
#
# ubuntu-build - command line interface for Launchpad buildd operations.
#
# Copyright (C) 2007-2024 Canonical Ltd.
# Copyright (C) 2007 Canonical Ltd.
# Authors:
# - Martin Pitt <martin.pitt@canonical.com>
# - Jonathan Davies <jpds@ubuntu.com>
# - Michael Bienia <geser@ubuntu.com>
# - Steve Langasek <steve.langasek@canonical.com>
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, version 3 of the License.
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
@ -28,65 +28,20 @@
import argparse
import sys
import lazr.restfulclient.errors
from launchpadlib.launchpad import Launchpad
from launchpadlib.credentials import TokenAuthorizationException
from ubuntutools import getLogger
from ubuntutools.lp.udtexceptions import PocketDoesNotExistError
from ubuntutools.lp.lpapicache import Distribution, Launchpad, PersonTeam
from ubuntutools.lp.udtexceptions import (
PackageNotFoundException,
PocketDoesNotExistError,
SeriesNotFoundException,
)
from ubuntutools.misc import split_release_pocket
Logger = getLogger()
def get_build_states(pkg, archs):
res = []
for build in pkg.getBuilds():
if build.arch_tag in archs:
res.append(f" {build.arch_tag}: {build.buildstate}")
msg = "\n".join(res)
return f"Build state(s) for '{pkg.source_package_name}':\n{msg}"
def rescore_builds(pkg, archs, score):
res = []
for build in pkg.getBuilds():
arch = build.arch_tag
if arch in archs:
if not build.can_be_rescored:
continue
try:
build.rescore(score=score)
res.append(f" {arch}: done")
except lazr.restfulclient.errors.Unauthorized:
Logger.error(
"You don't have the permissions to rescore builds."
" Ignoring your rescore request."
)
return None
except lazr.restfulclient.errors.BadRequest:
Logger.info("Cannot rescore build of %s on %s.", build.source_package_name, arch)
res.append(f" {arch}: failed")
msg = "\n".join(res)
return f"Rescoring builds of '{pkg.source_package_name}' to {score}:\n{msg}"
def retry_builds(pkg, archs):
res = []
for build in pkg.getBuilds():
arch = build.arch_tag
if arch in archs:
try:
build.retry()
res.append(f" {arch}: done")
except lazr.restfulclient.errors.BadRequest:
res.append(f" {arch}: failed")
msg = "\n".join(res)
return f"Retrying builds of '{pkg.source_package_name}':\n{msg}"
def main():
# Usage.
usage = "%(prog)s <srcpackage> <release> <operation>\n\n"
@ -95,13 +50,32 @@ def main():
# Valid architectures.
valid_archs = set(
["armhf", "arm64", "amd64", "i386", "powerpc", "ppc64el", "riscv64", "s390x"]
[
"armel",
"armhf",
"arm64",
"amd64",
"hppa",
"i386",
"ia64",
"lpia",
"powerpc",
"ppc64el",
"riscv64",
"s390x",
"sparc",
]
)
# Prepare our option parser.
parser = argparse.ArgumentParser(usage=usage)
parser.add_argument(
# Retry options
retry_rescore_options = parser.add_argument_group(
"Retry and rescore options",
"These options may only be used with the 'retry' and 'rescore' operations.",
)
retry_rescore_options.add_argument(
"-a",
"--arch",
action="append",
@ -110,8 +84,6 @@ def main():
f"include: {', '.join(valid_archs)}.",
)
parser.add_argument("-A", "--archive", help="operate on ARCHIVE", default="ubuntu")
# Batch processing options
batch_options = parser.add_argument_group(
"Batch processing",
@ -139,34 +111,18 @@ def main():
help="Rescore builds to <priority>.",
)
batch_options.add_argument(
"--state",
action="store",
dest="state",
help="Act on builds that are in the specified state",
"--arch2",
action="append",
dest="architecture",
help=f"Affect only 'architecture' (can be used several times)."
f" Valid architectures are: {', '.join(valid_archs)}.",
)
parser.add_argument("packages", metavar="package", nargs="*", help=argparse.SUPPRESS)
parser.add_argument("packages", metavar="package", nargs="+", help=argparse.SUPPRESS)
# Parse our options.
args = parser.parse_args()
launchpad = Launchpad.login_with("ubuntu-dev-tools", "production", version="devel")
ubuntu = launchpad.distributions["ubuntu"]
if args.batch:
release = args.series
if not release:
# ppas don't have a proposed pocket so just use the release pocket;
# but for the main archive we default to -proposed
release = ubuntu.getDevelopmentSeries()[0].name
if args.archive == "ubuntu":
release = f"{release}-proposed"
try:
(release, pocket) = split_release_pocket(release)
except PocketDoesNotExistError as error:
Logger.error(error)
sys.exit(1)
else:
if not args.batch:
# Check we have the correct number of arguments.
if len(args.packages) < 3:
parser.error("Incorrect number of arguments.")
@@ -179,14 +135,6 @@ def main():
parser.print_help()
sys.exit(1)
archive = launchpad.archives.getByReference(reference=args.archive)
try:
distroseries = ubuntu.getSeries(name_or_version=release)
except lazr.restfulclient.errors.NotFound as error:
Logger.error(error)
sys.exit(1)
if not args.batch:
# Check our operation.
if operation not in ("rescore", "retry", "status"):
Logger.error("Invalid operation: %s.", operation)
@@ -210,44 +158,51 @@ def main():
Logger.error(error)
sys.exit(1)
try:
# Will fail here if we have no credentials, bail out
Launchpad.login()
except TokenAuthorizationException:
sys.exit(1)
# Get the ubuntu archive
ubuntu_archive = Distribution("ubuntu").getArchive()
# Get list of published sources for package in question.
try:
sources = archive.getPublishedSources(
distro_series=distroseries,
exact_match=True,
pocket=pocket,
source_name=package,
status="Published",
)[0]
except IndexError:
Logger.error("No publication found for package %s", package)
sources = ubuntu_archive.getSourcePackage(package, release, pocket)
distroseries = Distribution("ubuntu").getSeries(release)
except (SeriesNotFoundException, PackageNotFoundException) as error:
Logger.error(error)
sys.exit(1)
# Get list of builds for that package.
builds = sources.getBuilds()
# Find out the version and component in given release.
version = sources.source_package_version
component = sources.component_name
version = sources.getVersion()
component = sources.getComponent()
# Operations that are remaining may only be done by Ubuntu developers
# (retry) or buildd admins (rescore). Check if the proper permissions
# are in place.
me = PersonTeam.me
if operation == "rescore":
necessary_privs = me.isLpTeamMember("launchpad-buildd-admins")
if operation == "retry":
necessary_privs = archive.checkUpload(
component=sources.getComponent(),
distroseries=distroseries,
person=launchpad.me,
necessary_privs = me.canUploadPackage(
ubuntu_archive,
distroseries,
sources.getPackageName(),
sources.getComponent(),
pocket=pocket,
sourcepackagename=sources.getPackageName(),
)
if not necessary_privs:
Logger.error(
"You cannot perform the %s operation on a %s package as you"
" do not have the permissions to do this action.",
operation,
component,
)
sys.exit(1)
if operation in ("rescore", "retry") and not necessary_privs:
Logger.error(
"You cannot perform the %s operation on a %s package as you"
" do not have the permissions to do this action.",
operation,
component,
)
sys.exit(1)
# Output details.
Logger.info(
@@ -274,14 +229,7 @@ def main():
# FIXME: make priority an option
priority = 5000
Logger.info("Rescoring build %s to %d...", build.arch_tag, priority)
try:
build.rescore(score=priority)
except lazr.restfulclient.errors.Unauthorized:
Logger.error(
"You don't have the permissions to rescore builds."
" Ignoring your rescore request."
)
break
build.rescore(score=priority)
else:
Logger.info("Cannot rescore build on %s.", build.arch_tag)
if operation == "retry":
@@ -310,136 +258,62 @@ def main():
# filter out duplicate and invalid architectures
archs.intersection_update(valid_archs)
if not args.packages:
retry_count = 0
can_rescore = True
release = args.series
if not release:
release = Distribution("ubuntu").getDevelopmentSeries().name + "-proposed"
try:
(release, pocket) = split_release_pocket(release)
except PocketDoesNotExistError as error:
Logger.error(error)
sys.exit(1)
if not args.state:
if args.retry:
args.state = "Failed to build"
elif args.priority:
args.state = "Needs building"
# there is no equivalent to series.getBuildRecords() for a ppa.
# however, we don't want to have to traverse all build records for
# all series when working on the main archive, so we use
# series.getBuildRecords() for ubuntu and handle ppas separately
series = ubuntu.getSeries(name_or_version=release)
if args.archive == "ubuntu":
builds = series.getBuildRecords(build_state=args.state, pocket=pocket)
else:
builds = []
for build in archive.getBuildRecords(build_state=args.state, pocket=pocket):
if not build.current_source_publication:
continue
if build.current_source_publication.distro_series == series:
builds.append(build)
for build in builds:
if build.arch_tag not in archs:
continue
if not build.current_source_publication:
continue
# fixme: refactor
# Check permissions (part 2): check upload permissions for the
# source package
can_retry = args.retry and archive.checkUpload(
component=build.current_source_publication.component_name,
distroseries=series,
person=launchpad.me,
pocket=pocket,
sourcepackagename=build.source_package_name,
)
if args.retry and not can_retry:
Logger.error(
"You don't have the permissions to retry the build of '%s', skipping.",
build.source_package_name,
)
continue
Logger.info(
"The source version for '%s' in '%s' (%s) is: %s",
build.source_package_name,
release,
pocket,
build.source_package_version,
)
ubuntu_archive = Distribution("ubuntu").getArchive()
try:
distroseries = Distribution("ubuntu").getSeries(release)
except SeriesNotFoundException as error:
Logger.error(error)
sys.exit(1)
me = PersonTeam.me
if args.retry and build.can_be_retried:
Logger.info(
"Retrying build of %s on %s...", build.source_package_name, build.arch_tag
)
try:
build.retry()
retry_count += 1
except lazr.restfulclient.errors.BadRequest:
Logger.info(
"Failed to retry build of %s on %s",
build.source_package_name,
build.arch_tag,
)
if args.priority and can_rescore:
if build.can_be_rescored:
try:
build.rescore(score=args.priority)
except lazr.restfulclient.errors.Unauthorized:
Logger.error(
"You don't have the permissions to rescore builds."
" Ignoring your rescore request."
)
can_rescore = False
except lazr.restfulclient.errors.BadRequest:
Logger.info(
"Cannot rescore build of %s on %s.",
build.source_package_name,
build.arch_tag,
)
Logger.info("")
if args.retry:
Logger.info("%d package builds retried", retry_count)
sys.exit(0)
# Check permissions (part 1): Rescoring can only be done by buildd admins
can_rescore = args.priority and me.isLpTeamMember("launchpad-buildd-admins")
if args.priority and not can_rescore:
Logger.error(
"You don't have the permissions to rescore builds. Ignoring your rescore request."
)
for pkg in args.packages:
try:
pkg = archive.getPublishedSources(
distro_series=distroseries,
exact_match=True,
pocket=pocket,
source_name=pkg,
status="Published",
)[0]
except IndexError:
Logger.error("No publication found for package %s", pkg)
pkg = ubuntu_archive.getSourcePackage(pkg, release, pocket)
except PackageNotFoundException as error:
Logger.error(error)
continue
# Check permissions (part 2): check upload permissions for the source
# package
can_retry = args.retry and archive.checkUpload(
component=pkg.component_name,
distroseries=distroseries,
person=launchpad.me,
pocket=pocket,
sourcepackagename=pkg.source_package_name,
can_retry = args.retry and me.canUploadPackage(
ubuntu_archive, distroseries, pkg.getPackageName(), pkg.getComponent()
)
if args.retry and not can_retry:
Logger.error(
"You don't have the permissions to retry the "
"build of '%s'. Ignoring your request.",
pkg.source_package_name,
pkg.getPackageName(),
)
Logger.info(
"The source version for '%s' in '%s' (%s) is: %s",
pkg.source_package_name,
pkg.getPackageName(),
release,
pocket,
pkg.source_package_version,
pkg.getVersion(),
)
Logger.info(get_build_states(pkg, archs))
Logger.info(pkg.getBuildStates(archs))
if can_retry:
Logger.info(retry_builds(pkg, archs))
if args.priority:
Logger.info(rescore_builds(pkg, archs, args.priority))
Logger.info(pkg.retryBuilds(archs))
if args.priority and can_rescore:
Logger.info(pkg.rescoreBuilds(archs, args.priority))
Logger.info("")
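The batch path above resolves a target such as `noble-proposed` into a series and pocket via `split_release_pocket`. A minimal standalone sketch of that helper (behaviour assumed from `ubuntutools.misc`, simplified to a plain `ValueError`):

```python
VALID_POCKETS = ("Release", "Security", "Updates", "Proposed", "Backports")

def split_release_pocket(release):
    # Split e.g. "noble-proposed" into ("noble", "Proposed");
    # a bare series name defaults to the Release pocket.
    pocket = "Release"
    if "-" in release:
        release, suffix = release.rsplit("-", 1)
        pocket = suffix.capitalize()
    if pocket not in VALID_POCKETS:
        raise ValueError(f"unknown pocket: {pocket}")
    return release, pocket
```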

View File

@@ -91,6 +91,7 @@ def main():
pocket != "Release"
or series.status in ("Experimental", "Active Development", "Pre-release Freeze")
):
component_uploader = archive.getUploadersForComponent(component_name=component)[0]
Logger.info("All upload permissions for %s:", args.package)
Logger.info("")

View File

@@ -165,7 +165,6 @@ class SourcePackage(ABC):
series = kwargs.get("series")
pocket = kwargs.get("pocket")
status = kwargs.get("status")
arch = kwargs.get("arch")
verify_signature = kwargs.get("verify_signature", False)
try_binary = kwargs.get("try_binary", True)
@@ -185,7 +184,6 @@ class SourcePackage(ABC):
self._series = series
self._pocket = pocket
self._status = status
self._arch = arch
# dscfile can be either a path or a URL. misc.py's download() will
# later figure it out
self._dsc_source = dscfile
@@ -254,7 +252,6 @@ class SourcePackage(ABC):
)
try:
params["archtag"] = self._arch
bpph = archive.getBinaryPackage(self.source, **params)
except PackageNotFoundException as bpnfe:
# log binary lookup failure, in case it provides hints
@@ -546,7 +543,7 @@ class SourcePackage(ABC):
Return the debdiff filename.
"""
cmd = ["debdiff", self.dsc_name, newpkg.dsc_name]
difffn = f"{newpkg.dsc_name[:-3]}debdiff"
difffn = newpkg.dsc_name[:-3] + "debdiff"
Logger.debug("%s > %s", " ".join(cmd), difffn)
with open(difffn, "w", encoding="utf-8") as f:
if subprocess.call(cmd, stdout=f, cwd=str(self.workdir)) > 2:
@@ -949,7 +946,7 @@ class _WebJSON:
class Madison(_WebJSON):
urls = {
"debian": "https://api.ftp-master.debian.org/madison",
"ubuntu": "https://ubuntu-archive-team.ubuntu.com/madison.cgi",
"ubuntu": "http://people.canonical.com/~ubuntu-archive/madison.cgi",
}
def __init__(self, distro="debian"):
@@ -979,7 +976,7 @@ class Madison(_WebJSON):
# Snapshot API
# https://anonscm.debian.org/cgit/mirror/snapshot.debian.org.git/plain/API
class _Snapshot(_WebJSON):
DEBIAN_COMPONENTS = ["main", "contrib", "non-free", "non-free-firmware"]
DEBIAN_COMPONENTS = ["main", "contrib", "non-free"]
def getHostUrl(self):
return "http://snapshot.debian.org"
@@ -1345,7 +1342,7 @@ class SnapshotSPPH:
self.getComponent(),
subdir,
name,
f"{name}_{pkgversion}",
name + "_" + pkgversion,
"changelog.txt",
)
try:

View File

@@ -71,8 +71,8 @@ class Pbuilder(Builder):
cmd = [
"sudo",
"-E",
f"ARCH={self.architecture}",
f"DIST={dist}",
"ARCH=" + self.architecture,
"DIST=" + dist,
self.name,
"--build",
"--architecture",
@@ -91,8 +91,8 @@ class Pbuilder(Builder):
cmd = [
"sudo",
"-E",
f"ARCH={self.architecture}",
f"DIST={dist}",
"ARCH=" + self.architecture,
"DIST=" + dist,
self.name,
"--update",
"--architecture",
@@ -140,7 +140,7 @@ class Sbuild(Builder):
workdir = os.getcwd()
Logger.debug("cd %s", result_directory)
os.chdir(result_directory)
cmd = ["sbuild", "--arch-all", f"--dist={dist}", f"--arch={self.architecture}", dsc_file]
cmd = ["sbuild", "--arch-all", "--dist=" + dist, "--arch=" + self.architecture, dsc_file]
Logger.debug(" ".join(cmd))
returncode = subprocess.call(cmd)
Logger.debug("cd %s", workdir)

View File

@@ -44,6 +44,7 @@ class UDTConfig:
"MIRROR_FALLBACK": True,
"UBUNTU_MIRROR": "http://archive.ubuntu.com/ubuntu",
"UBUNTU_PORTS_MIRROR": "http://ports.ubuntu.com",
"UBUNTU_INTERNAL_MIRROR": "http://ftpmaster.internal/ubuntu",
"UBUNTU_DDEBS_MIRROR": "http://ddebs.ubuntu.com",
"UPDATE_BUILDER": False,
"WORKDIR": None,
@@ -68,19 +69,21 @@ class UDTConfig:
config = {}
for filename in ("/etc/devscripts.conf", "~/.devscripts"):
try:
with open(os.path.expanduser(filename), "r", encoding="utf-8") as f:
content = f.read()
f = open(os.path.expanduser(filename), "r", encoding="utf-8")
except IOError:
continue
try:
tokens = shlex.split(content, comments=True)
except ValueError as e:
Logger.error("Error parsing %s: %s", filename, e)
continue
for token in tokens:
if "=" in token:
key, value = token.split("=", 1)
for line in f:
parsed = shlex.split(line, comments=True)
if len(parsed) > 1:
Logger.warning(
"Cannot parse variable assignment in %s: %s",
getattr(f, "name", "<config>"),
line,
)
if len(parsed) >= 1 and "=" in parsed[0]:
key, value = parsed[0].split("=", 1)
config[key] = value
f.close()
return config
def get_value(self, key, default=None, boolean=False, compat_keys=()):
@@ -97,9 +100,9 @@ class UDTConfig:
if default is None and key in self.defaults:
default = self.defaults[key]
keys = [f"{self.prefix}_{key}"]
keys = [self.prefix + "_" + key]
if key in self.defaults:
keys.append(f"UBUNTUTOOLS_{key}")
keys.append("UBUNTUTOOLS_" + key)
keys += compat_keys
for k in keys:
@@ -112,9 +115,9 @@ class UDTConfig:
else:
continue
if k in compat_keys:
replacements = f"{self.prefix}_{key}"
replacements = self.prefix + "_" + key
if key in self.defaults:
replacements += f"or UBUNTUTOOLS_{key}"
replacements += "or UBUNTUTOOLS_" + key
Logger.warning(
"Using deprecated configuration variable %s. You should use %s.",
k,
@@ -178,7 +181,7 @@ def ubu_email(name=None, email=None, export=True):
mailname = socket.getfqdn()
if os.path.isfile("/etc/mailname"):
mailname = open("/etc/mailname", "r", encoding="utf-8").read().strip()
email = f"{pwd.getpwuid(os.getuid()).pw_name}@{mailname}"
email = pwd.getpwuid(os.getuid()).pw_name + "@" + mailname
if export:
os.environ["DEBFULLNAME"] = name
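The rewritten config parsing above tokenises the whole devscripts file with `shlex` instead of going line by line. A self-contained sketch of that approach (hypothetical helper name, operating on a string rather than a file):

```python
import shlex

def parse_devscripts_config(content):
    # Tokenise shell-style KEY=value assignments, honouring quotes
    # and stripping comments; unparsable input yields an empty dict.
    config = {}
    try:
        tokens = shlex.split(content, comments=True)
    except ValueError:
        return config
    for token in tokens:
        if "=" in token:
            key, value = token.split("=", 1)
            config[key] = value
    return config
```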

View File

@@ -412,7 +412,7 @@ class PackageUpload(BaseWrapper):
urls = self.binaryFileUrls()
props = self.getBinaryProperties()
self._binary_prop_dict = dict(zip(urls, props))
for key, value in copy(self._binary_prop_dict).items():
for (key, value) in copy(self._binary_prop_dict).items():
filename = os.path.basename(urlparse(key).path)
self._binary_prop_dict[filename] = value
return self._binary_prop_dict.get(filename_or_url, {})
@@ -883,7 +883,7 @@ class SourcePackagePublishingHistory(BaseWrapper):
"""
release = self.getSeriesName()
if self.pocket != "Release":
release += f"-{self.pocket.lower()}"
release += "-" + self.pocket.lower()
return release
def getArchive(self):
@@ -1097,6 +1097,51 @@ class SourcePackagePublishingHistory(BaseWrapper):
for build in builds:
self._builds[build.arch_tag] = Build(build)
def getBuildStates(self, archs):
res = []
if not self._builds:
self._fetch_builds()
for arch in archs:
build = self._builds.get(arch)
if build:
res.append(f" {build}")
msg = "\n".join(res)
return f"Build state(s) for '{self.getPackageName()}':\n{msg}"
def rescoreBuilds(self, archs, score):
res = []
if not self._builds:
self._fetch_builds()
for arch in archs:
build = self._builds.get(arch)
if build:
if build.rescore(score):
res.append(f" {arch}: done")
else:
res.append(f" {arch}: failed")
msg = "\n".join(res)
return f"Rescoring builds of '{self.getPackageName()}' to {score}:\n{msg}"
def retryBuilds(self, archs):
res = []
if not self._builds:
self._fetch_builds()
for arch in archs:
build = self._builds.get(arch)
if build:
if build.retry():
res.append(f" {arch}: done")
else:
res.append(f" {arch}: failed")
msg = "\n".join(res)
return f"Retrying builds of '{self.getPackageName()}':\n{msg}"
class BinaryPackagePublishingHistory(BaseWrapper):
"""

View File

@@ -244,7 +244,7 @@ def verify_file_checksums(pathname, checksums=None, size=0):
Logger.error("File %s incorrect size, got %s expected %s", path, filesize, size)
return False
for alg, checksum in checksums.items():
for (alg, checksum) in checksums.items():
hash_ = hashlib.new(alg)
with path.open("rb") as f:
while True:
@@ -348,11 +348,7 @@ def download(src, dst, size=0, *, blocksize=DOWNLOAD_BLOCKSIZE_DEFAULT):
with tempfile.TemporaryDirectory() as tmpdir:
tmpdst = Path(tmpdir) / "dst"
try:
# We must use "Accept-Encoding: identity" so that Launchpad doesn't
# compress changes files. See LP: #2025748.
with requests.get(
src, stream=True, timeout=60, auth=auth, headers={"accept-encoding": "identity"}
) as fsrc:
with requests.get(src, stream=True, timeout=60, auth=auth) as fsrc:
with tmpdst.open("wb") as fdst:
fsrc.raise_for_status()
_download(fsrc, fdst, size, blocksize=blocksize)
@@ -385,7 +381,7 @@ class _StderrProgressBar:
pctstr = f"{pct:>3}%"
barlen = self.width * pct // 100
barstr = "=" * barlen
barstr = f"{barstr[:-1]}>"
barstr = barstr[:-1] + ">"
barstr = barstr.ljust(self.width)
fullstr = f"\r[{barstr}]{pctstr}"
sys.stderr.write(fullstr)
@@ -437,16 +433,7 @@ def _download(fsrc, fdst, size, *, blocksize):
downloaded = 0
try:
while True:
# We use fsrc.raw so that compressed files stay compressed as we
# write them to disk. For example, if this is a .diff.gz, then it
# needs to remain compressed and unmodified to remain valid as part
# of a source package later, even though Launchpad sends
# "Content-Encoding: gzip" and the requests library therefore would
# want to decompress it. See LP: #2025748.
block = fsrc.raw.read(blocksize)
if not block:
break
for block in fsrc.iter_content(blocksize):
fdst.write(block)
downloaded += len(block)
progress_bar.update(downloaded, size)
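The `verify_file_checksums` hunk above iterates named digests via `hashlib.new`. The core check, sketched on bytes rather than a file path (hypothetical function; the real helper also validates file size):

```python
import hashlib

def verify_checksums(data, checksums):
    # Return True only if every algorithm/digest pair matches `data`.
    # Mirrors the loop in ubuntutools.misc.verify_file_checksums,
    # minus the file I/O and size check.
    for alg, expected in checksums.items():
        hash_ = hashlib.new(alg)
        hash_.update(data)
        if hash_.hexdigest() != expected:
            return False
    return True
```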

View File

@@ -340,7 +340,6 @@ class PullPkg:
params = {}
params["package"] = options["package"]
params["arch"] = options["arch"]
if options["release"]:
(release, version, pocket) = self.parse_release_and_version(
@@ -436,7 +435,7 @@ class PullPkg:
if options["upload_queue"]:
# upload queue API is different/simpler
self.pull_upload_queue( # pylint: disable=missing-kwoa
pull, download_only=options["download_only"], **params
pull, arch=options["arch"], download_only=options["download_only"], **params
)
return
@@ -446,43 +445,6 @@ class PullPkg:
Logger.info("Found %s", spph.display_name)
# The VCS detection logic was modeled after `apt source`
for key in srcpkg.dsc.keys():
original_key = key
key = key.lower()
if key.startswith("vcs-"):
if key == "vcs-browser":
continue
if key == "vcs-git":
vcs = "Git"
elif key == "vcs-bzr":
vcs = "Bazaar"
else:
continue
uri = srcpkg.dsc[original_key]
Logger.warning(
"\nNOTICE: '%s' packaging is maintained in "
"the '%s' version control system at:\n %s\n",
package,
vcs,
uri,
)
if vcs == "Bazaar":
vcscmd = " $ bzr branch " + uri
elif vcs == "Git":
vcscmd = " $ git clone " + uri
if vcscmd:
Logger.info(
"Please use:\n%s\n"
"to retrieve the latest (possibly unreleased) updates to the package.\n",
vcscmd,
)
if pull == PULL_LIST:
Logger.info("Source files:")
for f in srcpkg.dsc["Files"]:

View File

@@ -31,9 +31,9 @@ class Question:
def get_options(self):
if len(self.options) == 2:
options = f"{self.options[0]} or {self.options[1]}"
options = self.options[0] + " or " + self.options[1]
else:
options = f"{', '.join(self.options[:-1])}, or {self.options[-1]}"
options = ", ".join(self.options[:-1]) + ", or " + self.options[-1]
return options
def ask(self, question, default=None):
@@ -67,7 +67,7 @@ class Question:
if selected == option[0]:
selected = option
if selected not in self.options:
print(f"Please answer the question with {self.get_options()}.")
print("Please answer the question with " + self.get_options() + ".")
return selected
@@ -170,7 +170,7 @@ class EditBugReport(EditFile):
split_re = re.compile(r"^Summary.*?:\s+(.*?)\s+Description:\s+(.*)$", re.DOTALL | re.UNICODE)
def __init__(self, subject, body, placeholders=None):
prefix = f"{os.path.basename(sys.argv[0])}_"
prefix = os.path.basename(sys.argv[0]) + "_"
tmpfile = tempfile.NamedTemporaryFile(prefix=prefix, suffix=".txt", delete=False)
tmpfile.write((f"Summary (one line):\n{subject}\n\nDescription:\n{body}").encode("utf-8"))
tmpfile.close()

View File

@@ -183,7 +183,7 @@ Content-Type: text/plain; charset=UTF-8
backup = tempfile.NamedTemporaryFile(
mode="w",
delete=False,
prefix=f"requestsync-{re.sub('[^a-zA-Z0-9_-]', '', bugtitle.replace(' ', '_'))}",
prefix="requestsync-" + re.sub(r"[^a-zA-Z0-9_-]", "", bugtitle.replace(" ", "_")),
)
with backup:
backup.write(mail)

View File

@@ -1,95 +0,0 @@
# Copyright (C) 2024 Canonical Ltd.
# Author: Chris Peterson <chris.peterson@canonical.com>
# Author: Andy P. Whitcroft
# Author: Christian Ehrhardt
#
# This program is free software: you can redistribute it and/or modify it
# under the terms of the GNU General Public License version 3, as published
# by the Free Software Foundation.
#
# This program is distributed in the hope that it will be useful, but
# WITHOUT ANY WARRANTY; without even the implied warranties of
# MERCHANTABILITY, SATISFACTORY QUALITY, or FITNESS FOR A PARTICULAR PURPOSE.
# See the GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License along
# with this program. If not, see <http://www.gnu.org/licenses/>.
import datetime
import json
import sys
import urllib
import urllib.request
URL_RUNNING = "http://autopkgtest.ubuntu.com/static/running.json"
URL_QUEUED = "http://autopkgtest.ubuntu.com/queues.json"
def _get_jobs(url: str) -> dict:
request = urllib.request.Request(url, headers={"Cache-Control": "max-age-0"})
with urllib.request.urlopen(request) as response:
data = response.read()
jobs = json.loads(data.decode("utf-8"))
return jobs
def get_running():
jobs = _get_jobs(URL_RUNNING)
running = []
for pkg in jobs:
for handle in jobs[pkg]:
for series in jobs[pkg][handle]:
for arch in jobs[pkg][handle][series]:
jobinfo = jobs[pkg][handle][series][arch]
triggers = ",".join(jobinfo[0].get("triggers", "-"))
ppas = ",".join(jobinfo[0].get("ppas", "-"))
time = jobinfo[1]
env = jobinfo[0].get("env", "-")
time = str(datetime.timedelta(seconds=jobinfo[1]))
try:
line = (
f"R {time:6} {pkg:30} {'-':10} {series:8} {arch:8}"
f" {ppas:31} {triggers} {env}\n"
)
running.append((jobinfo[1], line))
except BrokenPipeError:
sys.exit(1)
output = ""
for time, row in sorted(running, reverse=True):
output += f"{row}"
return output
def get_queued():
queues = _get_jobs(URL_QUEUED)
output = ""
for origin in queues:
for series in queues[origin]:
for arch in queues[origin][series]:
n = 0
for key in queues[origin][series][arch]:
if key == "private job":
pkg = triggers = ppas = "private job"
else:
(pkg, json_data) = key.split(maxsplit=1)
try:
jobinfo = json.loads(json_data)
triggers = ",".join(jobinfo.get("triggers", "-"))
ppas = ",".join(jobinfo.get("ppas", "-"))
except json.decoder.JSONDecodeError:
pkg = triggers = ppas = "failed to parse"
continue
n = n + 1
try:
output += (
f"Q{n:04d} {'-:--':>6} {pkg:30} {origin:10} {series:8} {arch:8}"
f" {ppas:31} {triggers}\n"
)
except BrokenPipeError:
sys.exit(1)
return output
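`get_queued()` above unpacks each queue key of the form `"<package> <json-blob>"`; that parsing step can be isolated as a pure function (hypothetical helper, behaviour taken from the loop above):

```python
import json

def parse_queue_entry(key):
    # A queue key is either the literal "private job" or
    # "<package> <json-blob>" describing triggers and PPAs.
    if key == "private job":
        return ("private job", "private job", "private job")
    pkg, json_data = key.split(maxsplit=1)
    try:
        jobinfo = json.loads(json_data)
    except json.decoder.JSONDecodeError:
        return ("failed to parse", "failed to parse", "failed to parse")
    triggers = ",".join(jobinfo.get("triggers", "-"))
    ppas = ",".join(jobinfo.get("ppas", "-"))
    return (pkg, triggers, ppas)
```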

View File

@@ -255,7 +255,7 @@ class SourcePackage:
def _changes_file(self):
"""Returns the file name of the .changes file."""
return os.path.join(
self._workdir, f"{self._package}_{strip_epoch(self._version)}_source.changes"
self._workdir, f"{self._package}_{ strip_epoch(self._version)}_source.changes"
)
def check_target(self, upload, launchpad):

View File

@@ -39,7 +39,7 @@ def is_command_available(command, check_sbin=False):
"Is command in $PATH?"
path = os.environ.get("PATH", "/usr/bin:/bin").split(":")
if check_sbin:
path += [f"{directory[:-3]}sbin" for directory in path if directory.endswith("/bin")]
path += [directory[:-3] + "sbin" for directory in path if directory.endswith("/bin")]
return any(os.access(os.path.join(directory, command), os.X_OK) for directory in path)
@@ -303,7 +303,7 @@ def _download_and_change_into(task, dsc_file, patch, branch):
extract_source(dsc_file, Logger.isEnabledFor(logging.DEBUG))
# change directory
directory = f"{task.package}-{task.get_version().upstream_version}"
directory = task.package + "-" + task.get_version().upstream_version
Logger.debug("cd %s", directory)
os.chdir(directory)
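The `is_command_available` helper touched above is small enough to reproduce as a runnable sketch (sbin expansion kept intact, docstring added):

```python
import os

def is_command_available(command, check_sbin=False):
    """Scan $PATH for an executable named `command`; with check_sbin,
    also derive each ".../sbin" sibling of a ".../bin" entry."""
    path = os.environ.get("PATH", "/usr/bin:/bin").split(":")
    if check_sbin:
        path += [d[:-3] + "sbin" for d in path if d.endswith("/bin")]
    return any(os.access(os.path.join(d, command), os.X_OK) for d in path)
```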

View File

@@ -1,33 +0,0 @@
# Copyright (C) 2024 Canonical Ltd.
# Author: Chris Peterson <chris.peterson@canonical.com>
#
# Permission to use, copy, modify, and/or distribute this software for any
# purpose with or without fee is hereby granted, provided that the above
# copyright notice and this permission notice appear in all copies.
#
# THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH
# REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY
# AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY SPECIAL, DIRECT,
# INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM
# LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR
# OTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR
# PERFORMANCE OF THIS SOFTWARE.
import unittest
# Binary Tests
class BinaryTests(unittest.TestCase):
# The requestsync binary has the option of using the launchpad api
# to log in but requires python3-keyring in addition to
# python3-launchpadlib. Testing the integrated login functionality
# automatically isn't very feasible, but we can at least write a smoke
# test to make sure the required packages are installed.
# See LP: #2049217
def test_keyring_installed(self):
"""Smoke test for required lp api dependencies"""
try:
import keyring # noqa: F401
except ModuleNotFoundError:
raise ModuleNotFoundError("package python3-keyring is not installed")

View File

@@ -1,128 +0,0 @@
# Copyright (C) 2024 Canonical Ltd.
# Author: Chris Peterson <chris.peterson@canonical.com>
#
# Permission to use, copy, modify, and/or distribute this software for any
# purpose with or without fee is hereby granted, provided that the above
# copyright notice and this permission notice appear in all copies.
#
# THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH
# REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY
# AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY SPECIAL, DIRECT,
# INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM
# LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR
# OTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR
# PERFORMANCE OF THIS SOFTWARE.
""" Tests for running_autopkgtests
Tests using cached data from autopkgtest servers.
These tests only ensure code changes don't change parsing behavior
of the response data. If the response format changes, then the cached
responses will need to change as well.
"""
import unittest
from unittest.mock import patch
from ubuntutools.running_autopkgtests import (
URL_QUEUED,
URL_RUNNING,
_get_jobs,
get_queued,
get_running,
)
# Cached binary response data from autopkgtest server
RUN_DATA = (
b'{"pyatem": {'
b" \"submit-time_2024-01-19 19:37:36;triggers_['python3-defaults/3.12.1-0ubuntu1'];\":"
b' {"noble": {"arm64": [{"triggers": ["python3-defaults/3.12.1-0ubuntu1"],'
b' "submit-time": "2024-01-19 19:37:36"}, 380, "<omitted log>"]}}}}'
)
QUEUED_DATA = (
b'{"ubuntu": {"noble": {"arm64": ["libobject-accessor-perl {\\"requester\\": \\"someone\\",'
b' \\"submit-time\\": \\"2024-01-18 01:08:55\\",'
b' \\"triggers\\": [\\"perl/5.38.2-3\\", \\"liblocale-gettext-perl/1.07-6build1\\"]}"]}}}'
)
# Expected result(s) of parsing the above JSON data
RUNNING_JOB = {
"pyatem": {
"submit-time_2024-01-19 19:37:36;triggers_['python3-defaults/3.12.1-0ubuntu1'];": {
"noble": {
"arm64": [
{
"triggers": ["python3-defaults/3.12.1-0ubuntu1"],
"submit-time": "2024-01-19 19:37:36",
},
380,
"<omitted log>",
]
}
}
}
}
QUEUED_JOB = {
"ubuntu": {
"noble": {
"arm64": [
'libobject-accessor-perl {"requester": "someone",'
' "submit-time": "2024-01-18 01:08:55",'
' "triggers": ["perl/5.38.2-3", "liblocale-gettext-perl/1.07-6build1"]}'
]
}
}
}
PRIVATE_JOB = {"ppa": {"noble": {"arm64": ["private job"]}}}
# Expected textual output of the program based on the above data
RUNNING_OUTPUT = (
"R 0:06:20 pyatem - noble arm64"
" - python3-defaults/3.12.1-0ubuntu1 -\n"
)
QUEUED_OUTPUT = (
"Q0001 -:-- libobject-accessor-perl ubuntu noble arm64"
" - perl/5.38.2-3,liblocale-gettext-perl/1.07-6build1\n"
)
PRIVATE_OUTPUT = (
"Q0001 -:-- private job ppa noble arm64"
" private job private job\n"
)
class RunningAutopkgtestTestCase(unittest.TestCase):
"""Assert helper functions parse data correctly"""
maxDiff = None
@patch("urllib.request.urlopen")
def test_get_running_jobs(self, mock_response):
"""Test: Correctly parse autopkgtest json data for running tests"""
mock_response.return_value.__enter__.return_value.read.return_value = RUN_DATA
jobs = _get_jobs(URL_RUNNING)
self.assertEqual(RUNNING_JOB, jobs)
@patch("urllib.request.urlopen")
def test_get_queued_jobs(self, mock_response):
"""Test: Correctly parse autopkgtest json data for queued tests"""
mock_response.return_value.__enter__.return_value.read.return_value = QUEUED_DATA
jobs = _get_jobs(URL_QUEUED)
self.assertEqual(QUEUED_JOB, jobs)
def test_get_running_output(self):
"""Test: Correctly print running tests"""
with patch("ubuntutools.running_autopkgtests._get_jobs", return_value=RUNNING_JOB):
self.assertEqual(get_running(), RUNNING_OUTPUT)
def test_get_queued_output(self):
"""Test: Correctly print queued tests"""
with patch("ubuntutools.running_autopkgtests._get_jobs", return_value=QUEUED_JOB):
self.assertEqual(get_queued(), QUEUED_OUTPUT)
def test_private_queued_job(self):
"""Test: Correctly print queued private job"""
with patch("ubuntutools.running_autopkgtests._get_jobs", return_value=PRIVATE_JOB):
self.assertEqual(get_queued(), PRIVATE_OUTPUT)

View File

@@ -165,7 +165,8 @@ Source: seahorse-plugins
Section: gnome
Priority: optional
Maintainer: Emilio Pozuelo Monfort <pochu@debian.org>
Build-Depends: debhelper (>= 5)
Build-Depends: debhelper (>= 5),
cdbs (>= 0.4.41)
Standards-Version: 3.8.3
Homepage: http://live.gnome.org/Seahorse
@@ -183,7 +184,8 @@ Section: gnome
Priority: optional
Maintainer: Ubuntu Developers <ubuntu-devel-discuss@lists.ubuntu.com>
XSBC-Original-Maintainer: Emilio Pozuelo Monfort <pochu@debian.org>
Build-Depends: debhelper (>= 5)
Build-Depends: debhelper (>= 5),
cdbs (>= 0.4.41)
Standards-Version: 3.8.3
Homepage: http://live.gnome.org/Seahorse

View File

@@ -72,17 +72,17 @@ class Control:
def set_maintainer(self, maintainer):
"""Sets the value of the Maintainer field."""
pattern = re.compile("^Maintainer: ?.*$", re.MULTILINE)
self._content = pattern.sub(f"Maintainer: {maintainer}", self._content)
self._content = pattern.sub("Maintainer: " + maintainer, self._content)
def set_original_maintainer(self, original_maintainer):
"""Sets the value of the XSBC-Original-Maintainer field."""
original_maintainer = f"XSBC-Original-Maintainer: {original_maintainer}"
original_maintainer = "XSBC-Original-Maintainer: " + original_maintainer
if self.get_original_maintainer():
pattern = re.compile("^(?:[XSBC]*-)?Original-Maintainer:.*$", re.MULTILINE)
self._content = pattern.sub(original_maintainer, self._content)
else:
pattern = re.compile("^(Maintainer:.*)$", re.MULTILINE)
self._content = pattern.sub(f"\\1\\n{original_maintainer}", self._content)
self._content = pattern.sub(r"\1\n" + original_maintainer, self._content)
def remove_original_maintainer(self):
"""Strip out the XSBC-Original-Maintainer line"""

View File

@@ -1,79 +0,0 @@
# Copyright (C) 2019-2023 Canonical Ltd.
# Author: Brian Murray <brian.murray@canonical.com> et al.
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; version 3 of the License.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
"""Portions of archive related code that is re-used by various tools."""
import os
import re
import urllib.request
from datetime import datetime
import dateutil.parser
from dateutil.tz import tzutc
def get_cache_dir():
cache_dir = os.environ.get("XDG_CACHE_HOME", os.path.expanduser(os.path.join("~", ".cache")))
uat_cache = os.path.join(cache_dir, "ubuntu-archive-tools")
os.makedirs(uat_cache, exist_ok=True)
return uat_cache
def get_url(url, force_cached):
"""Return file to the URL, possibly caching it"""
cache_file = None
# ignore bileto urls wrt caching, they're usually too small to matter
# and we don't do proper cache expiry
m = re.search("ubuntu-archive-team.ubuntu.com/proposed-migration/([^/]*)/([^/]*)", url)
if m:
cache_dir = get_cache_dir()
cache_file = os.path.join(cache_dir, f"{m.group(1)}_{m.group(2)}")
else:
# test logs can be cached, too
m = re.search(
"https://autopkgtest.ubuntu.com/results/autopkgtest-[^/]*/([^/]*)/([^/]*)"
"/[a-z0-9]*/([^/]*)/([_a-f0-9]*)@/log.gz",
url,
)
if m:
cache_dir = get_cache_dir()
cache_file = os.path.join(
cache_dir, f"{m.group(1)}_{m.group(2)}_{m.group(3)}_{m.group(4)}.gz"
)
if cache_file:
try:
prev_mtime = os.stat(cache_file).st_mtime
except FileNotFoundError:
prev_mtime = 0
prev_timestamp = datetime.fromtimestamp(prev_mtime, tz=tzutc())
new_timestamp = datetime.now(tz=tzutc()).timestamp()
if force_cached:
return open(cache_file, "rb")
f = urllib.request.urlopen(url)
if cache_file:
remote_ts = dateutil.parser.parse(f.headers["last-modified"])
if remote_ts > prev_timestamp:
with open(f"{cache_file}.new", "wb") as new_cache:
for line in f:
new_cache.write(line)
os.rename(f"{cache_file}.new", cache_file)
os.utime(cache_file, times=(new_timestamp, new_timestamp))
f.close()
f = open(cache_file, "rb")
return f
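The caching block above keys cache files off two URL regexes; the proposed-migration case reduces to (hypothetical helper name):

```python
import re

def cache_name(url):
    # Map a proposed-migration URL to its flat cache file name,
    # or None when the URL is not cacheable this way.
    m = re.search(
        "ubuntu-archive-team.ubuntu.com/proposed-migration/([^/]*)/([^/]*)", url
    )
    if m:
        return f"{m.group(1)}_{m.group(2)}"
    return None
```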