Compare commits


6 Commits

Author SHA1 Message Date
Steve Langasek  33f50a19cd  changelog bump  2024-05-10 18:12:57 +02:00
Steve Langasek  fa74ec81c9  Also needs dropped as a runtime dep...  2024-05-10 18:12:48 +02:00
Steve Langasek  92e44f2f13  changelog bump  2024-05-08 11:36:00 +02:00
Steve Langasek  cd06772676  Correct wrong backport of python3-launchpadlib-desktop change  2024-05-08 11:35:52 +02:00
Steve Langasek  e4a5cb36f4  releasing package ubuntu-dev-tools version 0.201ubuntu2~22.04.1  2024-03-12 17:39:46 -07:00
Steve Langasek  288a3c3416  Backport to jammy  2024-03-12 17:39:38 -07:00
35 changed files with 367 additions and 692 deletions

.gitignore (2 changes)

@@ -1,2 +0,0 @@
__pycache__
*.egg-info

debian/.gitignore (1 change)

@@ -1 +0,0 @@
files

debian/changelog (88 changes)

@@ -1,83 +1,10 @@
ubuntu-dev-tools (0.206) unstable; urgency=medium
ubuntu-dev-tools (0.201ubuntu2~22.04.3) jammy; urgency=medium
[ Dan Bungert ]
* mk-sbuild: enable pkgmaintainermangler
* Backport current ubuntu-dev-tools to jammy. LP: #2057716.
[ Shengjing Zhu ]
* import-bug-from-debian: package option is overridden and not used
-- Steve Langasek <steve.langasek@ubuntu.com> Tue, 12 Mar 2024 17:39:42 -0700
[ Fernando Bravo Hernández ]
* Parsing arch parameter to getBinaryPackage() (LP: #2081861)
[ Simon Quigley ]
* Read ~/.devscripts in a more robust way, to ideally pick up multi-line
variables (Closes: #725418).
* mk-sbuild: default to using UTC for schroots (LP: #2097159).
* syncpackage: s/syncblacklist/syncblocklist/g
* syncpackage: Cache the sync blocklist in-memory, so it's not fetched
multiple times when syncing more than one package.
* syncpackage: Catch exceptions cleanly, simply skipping to the next
package (erring on the side of caution) if there is an error doing the
download (LP: #1943286).
-- Simon Quigley <tsimonq2@debian.org> Tue, 04 Mar 2025 13:43:15 -0600
ubuntu-dev-tools (0.205) unstable; urgency=medium
* [syncpackage] When syncing multiple packages, if one of the packages is in
the sync blocklist, do not exit, simply continue.
* [syncpackage] Do not use exit(1) on an error or exception unless it
applies to all packages, instead return None so we can continue to the
next package.
* [syncpackage] Add support for -y or --yes, noted that it should be used
with care.
* Update Standards-Version to 4.7.2, no changes needed.
-- Simon Quigley <tsimonq2@debian.org> Sat, 01 Mar 2025 11:29:54 -0600
ubuntu-dev-tools (0.204) unstable; urgency=medium
[ Simon Quigley ]
* Update Standards-Version to 4.7.1, no changes needed.
* Add several Lintian overrides related to .pyc files.
* Add my name to the copyright file.
* Rename bitesize to lp-bitesize (Closes: #1076224).
* Add a manpage for running-autopkgtests.
* Add a large warning at the top of mk-sbuild encouraging the use of the
unshare backend. This is to provide ample warning to users.
* Remove mail line from default ~/.sbuildrc, to resolve the undeclared
dependency on sendmail (Closes: #1074632).
[ Julien Plissonneau Duquène ]
* Fix reverse-depends -b crash on packages that b-d on themselves
(Closes: #1087760).
-- Simon Quigley <tsimonq2@debian.org> Mon, 24 Feb 2025 19:54:39 -0600
ubuntu-dev-tools (0.203) unstable; urgency=medium
[ Steve Langasek ]
* ubuntu-build: handle TOCTOU issue with the "can be retried" value on
builds.
* Recommend sbuild over pbuilder. sbuild is the tool recommended by
Ubuntu developers whose behavior most closely approximates Launchpad
builds.
[ Florent 'Skia' Jacquet ]
* import-bug-from-debian: handle multipart message (Closes: #969510)
[ Benjamin Drung ]
* import-bug-from-debian: add type hints
* Bump Standards-Version to 4.7.0
* Bump year and add missing files to copyright
* setup.py: add pm-helper
* Format code with black and isort
* Address several issues pointed out by Pylint
* Depend on python3-yaml for pm-helper
-- Benjamin Drung <bdrung@debian.org> Sat, 02 Nov 2024 18:19:24 +0100
ubuntu-dev-tools (0.202) unstable; urgency=medium
ubuntu-dev-tools (0.201ubuntu2) noble; urgency=medium
[ Steve Langasek ]
* ubuntu-build: support --batch with no package names to retry all
@@ -88,11 +15,14 @@ ubuntu-dev-tools (0.202) unstable; urgency=medium
* ubuntu-build: Handling of proposed vs release pocket default for ppas
* ubuntu-build: update manpage
[ Chris Peterson ]
-- Steve Langasek <steve.langasek@ubuntu.com> Tue, 12 Mar 2024 17:03:43 -0700
ubuntu-dev-tools (0.201ubuntu1) noble; urgency=medium
* Replace Depends on python3-launchpadlib with Depends on
python3-launchpadlib-desktop (LP: #2049217)
-- Simon Quigley <tsimonq2@ubuntu.com> Fri, 12 Apr 2024 23:33:14 -0500
-- Chris Peterson <chris.peterson@canonical.com> Fri, 01 Mar 2024 14:08:07 -0800
ubuntu-dev-tools (0.201) unstable; urgency=medium
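The 0.206 changelog entry above describes caching the sync blocklist in memory so it is not fetched again for every package in a multi-package sync. A minimal sketch of that module-level caching pattern (the fetch function and blocklist contents here are hypothetical stand-ins, not the real syncpackage code):

```python
# Module-level cache: the blocklist is fetched at most once per process.
_blocklist_cache = None

def fetch_blocklist():
    """Stand-in for the real network fetch (hypothetical contents)."""
    fetch_blocklist.calls += 1
    return {"package-a", "package-b"}
fetch_blocklist.calls = 0  # instrumented so the caching is observable

def get_blocklist():
    """Return the blocklist, fetching it only on first use."""
    global _blocklist_cache
    if _blocklist_cache is None:
        _blocklist_cache = fetch_blocklist()
    return _blocklist_cache

# Syncing several packages now triggers only one fetch.
for _ in range(3):
    blocklist = get_blocklist()
```

The same effect could be had with `functools.cache`; a plain module global keeps the sketch closest to the changelog's description.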

debian/control (13 changes)

@@ -1,7 +1,8 @@
Source: ubuntu-dev-tools
Section: devel
Priority: optional
Maintainer: Ubuntu Developers <ubuntu-dev-tools@packages.debian.org>
Maintainer: Ubuntu Developers <ubuntu-devel-discuss@lists.ubuntu.com>
XSBC-Original-Maintainer: Ubuntu Developers <ubuntu-dev-tools@packages.debian.org>
Uploaders:
Benjamin Drung <bdrung@debian.org>,
Stefano Rivera <stefanor@debian.org>,
@@ -26,12 +27,11 @@ Build-Depends:
python3-debianbts,
python3-distro-info,
python3-httplib2,
python3-launchpadlib-desktop,
python3-launchpadlib,
python3-pytest,
python3-requests <!nocheck>,
python3-setuptools,
python3-yaml <!nocheck>,
Standards-Version: 4.7.2
Standards-Version: 4.6.2
Rules-Requires-Root: no
Vcs-Git: https://git.launchpad.net/ubuntu-dev-tools
Vcs-Browser: https://git.launchpad.net/ubuntu-dev-tools
@@ -54,10 +54,9 @@ Depends:
python3-debianbts,
python3-distro-info,
python3-httplib2,
python3-launchpadlib-desktop,
python3-launchpadlib,
python3-lazr.restfulclient,
python3-ubuntutools (= ${binary:Version}),
python3-yaml,
sensible-utils,
sudo,
tzdata,
@@ -72,7 +71,7 @@ Recommends:
genisoimage,
lintian,
patch,
sbuild | pbuilder | cowbuilder,
pbuilder | cowbuilder | sbuild,
python3-dns,
quilt,
reportbug (>= 3.39ubuntu1),

debian/copyright (20 changes)

@@ -11,7 +11,6 @@ Files: backportpackage
doc/check-symbols.1
doc/requestsync.1
doc/ubuntu-iso.1
doc/running-autopkgtests.1
GPL-2
README.updates
requestsync
@@ -20,13 +19,12 @@ Files: backportpackage
ubuntu-iso
ubuntutools/requestsync/*.py
Copyright: 2007, Albert Damen <albrt@gmx.net>
2010-2024, Benjamin Drung <bdrung@ubuntu.com>
2010-2022, Benjamin Drung <bdrung@ubuntu.com>
2007-2023, Canonical Ltd.
2006-2007, Daniel Holbach <daniel.holbach@ubuntu.com>
2010, Evan Broder <evan@ebroder.net>
2006-2007, Luke Yelavich <themuso@ubuntu.com>
2009-2010, Michael Bienia <geser@ubuntu.com>
2024-2025, Simon Quigley <tsimonq2@debian.org>
2010-2011, Stefano Rivera <stefanor@ubuntu.com>
2008, Stephan Hermann <sh@sourcecode.de>
2007, Steve Kowalik <stevenk@ubuntu.com>
@@ -74,28 +72,23 @@ License: GPL-2+
On Debian systems, the complete text of the GNU General Public License
version 2 can be found in the /usr/share/common-licenses/GPL-2 file.
Files: doc/lp-bitesize.1
Files: doc/bitesize.1
doc/check-mir.1
doc/grab-merge.1
doc/merge-changelog.1
doc/pm-helper.1
doc/setup-packaging-environment.1
doc/syncpackage.1
lp-bitesize
bitesize
check-mir
GPL-3
grab-merge
merge-changelog
pm-helper
pyproject.toml
run-linters
running-autopkgtests
setup-packaging-environment
syncpackage
ubuntutools/running_autopkgtests.py
ubuntutools/utils.py
Copyright: 2010-2024, Benjamin Drung <bdrung@ubuntu.com>
2007-2024, Canonical Ltd.
Copyright: 2010, Benjamin Drung <bdrung@ubuntu.com>
2007-2023, Canonical Ltd.
2008, Jonathan Patrick Davies <jpds@ubuntu.com>
2008-2010, Martin Pitt <martin.pitt@canonical.com>
2009, Siegfried-Angel Gevatter Pujals <rainct@ubuntu.com>
@@ -184,12 +177,11 @@ Files: doc/pull-debian-debdiff.1
ubuntutools/version.py
update-maintainer
.pylintrc
Copyright: 2009-2024, Benjamin Drung <bdrung@ubuntu.com>
Copyright: 2009-2023, Benjamin Drung <bdrung@ubuntu.com>
2010, Evan Broder <evan@ebroder.net>
2008, Siegfried-Angel Gevatter Pujals <rainct@ubuntu.com>
2010-2011, Stefano Rivera <stefanor@ubuntu.com>
2017-2021, Dan Streetman <ddstreet@canonical.com>
2024, Canonical Ltd.
License: ISC
Permission to use, copy, modify, and/or distribute this software for any
purpose with or without fee is hereby granted, provided that the above


@@ -1,3 +0,0 @@
# pyc files are machine-generated; they're expected to have long lines and have unstated copyright
source: file-without-copyright-information *.pyc [debian/copyright]
source: very-long-line-length-in-source-file * > 512 [*.pyc:*]


@@ -1,21 +1,21 @@
.TH lp-bitesize "1" "May 9 2010" "ubuntu-dev-tools"
.TH bitesize "1" "May 9 2010" "ubuntu-dev-tools"
.SH NAME
lp-bitesize \- Add \fBbitesize\fR tag to bugs and add a comment.
bitesize \- Add \fBbitesize\fR tag to bugs and add a comment.
.SH SYNOPSIS
.B lp-bitesize \fR<\fIbug number\fR>
.B bitesize \fR<\fIbug number\fR>
.br
.B lp-bitesize \-\-help
.B bitesize \-\-help
.SH DESCRIPTION
\fBlp-bitesize\fR adds a bitesize tag to the bug, if it's not there yet. It
\fBbitesize\fR adds a bitesize tag to the bug, if it's not there yet. It
also adds a comment to the bug indicating that you are willing to help with
fixing it.
It checks for permission to operate on a given bug first,
then perform required tasks on Launchpad.
.SH OPTIONS
Listed below are the command line options for \fBlp-bitesize\fR:
Listed below are the command line options for \fBbitesize\fR:
.TP
.BR \-h ", " \-\-help
Display a help message and exit.
@@ -48,7 +48,7 @@ The default value for \fB--lpinstance\fR.
.BR ubuntu\-dev\-tools (5)
.SH AUTHORS
\fBlp-bitesize\fR and this manual page were written by Daniel Holbach
\fBbitesize\fR and this manual page were written by Daniel Holbach
<daniel.holbach@canonical.com>.
.PP
Both are released under the terms of the GNU General Public License, version 3.


@@ -20,7 +20,7 @@ like for example \fBpbuilder\-feisty\fP, \fBpbuilder\-sid\fP, \fBpbuilder\-gutsy
.PP
The same applies to \fBcowbuilder\-dist\fP, which uses cowbuilder. The main
difference between both is that pbuilder compresses the created chroot as a
tarball, thus using less disc space but needing to uncompress (and possibly
a tarball, thus using less disc space but needing to uncompress (and possibly
compress) its contents again on each run, and cowbuilder doesn't do this.
.SH USAGE


@@ -1,15 +0,0 @@
.TH running\-autopkgtests "1" "18 January 2024" "ubuntu-dev-tools"
.SH NAME
running\-autopkgtests \- dumps a list of currently running autopkgtests
.SH SYNOPSIS
.B running\-autopkgtests
.SH DESCRIPTION
Dumps a list of currently running and queued tests in Autopkgtest.
Pass --running to only see running tests, or --queued to only see
queued tests. Passing both will print both, which is the default behavior.
.SH AUTHOR
.B running\-autopkgtests
was written by Chris Peterson <chris.peterson@canonical.com>.
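The removed man page above documents the flag semantics precisely: `--running` and `--queued` each narrow the output, and passing neither (or both) prints both lists. A small sketch of that defaulting logic (the function name is illustrative, not the tool's actual code):

```python
def sections_to_print(running: bool, queued: bool) -> tuple:
    """Mirror the documented behavior: no flags means show both sections."""
    if not running and not queued:
        return ("running", "queued")  # default: both
    return tuple(
        name
        for name, wanted in (("running", running), ("queued", queued))
        if wanted
    )

# --running alone narrows output to running tests only.
print(sections_to_print(True, False))
```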


@@ -58,7 +58,7 @@ Display more progress information.
\fB\-F\fR, \fB\-\-fakesync\fR
Perform a fakesync, to work around a tarball mismatch between Debian and
Ubuntu.
This option ignores blocklisting, and performs a local sync.
This option ignores blacklisting, and performs a local sync.
It implies \fB\-\-no\-lp\fR, and will leave a signed \fB.changes\fR file
for you to upload.
.TP


@@ -29,8 +29,6 @@ import logging
import re
import sys
import webbrowser
from collections.abc import Iterable
from email.message import EmailMessage
import debianbts
from launchpadlib.launchpad import Launchpad
@@ -39,10 +37,11 @@ from ubuntutools import getLogger
from ubuntutools.config import UDTConfig
Logger = getLogger()
ATTACHMENT_MAX_SIZE = 2000
def parse_args() -> argparse.Namespace:
def main():
bug_re = re.compile(r"bug=(\d+)")
parser = argparse.ArgumentParser()
parser.add_argument(
"-b",
@@ -72,15 +71,28 @@ def parse_args() -> argparse.Namespace:
"--no-conf", action="store_true", help="Don't read config files or environment variables."
)
parser.add_argument("bugs", nargs="+", help="Bug number(s) or URL(s)")
return parser.parse_args()
options = parser.parse_args()
config = UDTConfig(options.no_conf)
if options.lpinstance is None:
options.lpinstance = config.get_value("LPINSTANCE")
def get_bug_numbers(bug_list: Iterable[str]) -> list[int]:
bug_re = re.compile(r"bug=(\d+)")
if options.dry_run:
launchpad = Launchpad.login_anonymously("ubuntu-dev-tools")
options.verbose = True
else:
launchpad = Launchpad.login_with("ubuntu-dev-tools", options.lpinstance)
if options.verbose:
Logger.setLevel(logging.DEBUG)
debian = launchpad.distributions["debian"]
ubuntu = launchpad.distributions["ubuntu"]
lp_debbugs = launchpad.bug_trackers.getByName(name="debbugs")
bug_nums = []
for bug_num in bug_list:
for bug_num in options.bugs:
if bug_num.startswith("http"):
# bug URL
match = bug_re.search(bug_num)
@@ -89,81 +101,24 @@ def get_bug_numbers(bug_list: Iterable[str]) -> list[int]:
sys.exit(1)
bug_num = match.groups()[0]
bug_num = bug_num.lstrip("#")
bug_nums.append(int(bug_num))
bug_num = int(bug_num)
bug_nums.append(bug_num)
return bug_nums
bugs = debianbts.get_status(bug_nums)
def walk_multipart_message(message: EmailMessage) -> tuple[str, list[tuple[int, EmailMessage]]]:
summary = ""
attachments = []
i = 1
for part in message.walk():
content_type = part.get_content_type()
if content_type.startswith("multipart/"):
# we're already iterating on multipart items
# let's just skip the multipart extra metadata
continue
if content_type == "application/pgp-signature":
# we're not interested in importing pgp signatures
continue
if part.is_attachment():
attachments.append((i, part))
elif content_type.startswith("image/"):
# images here are not attachment, they are inline, but Launchpad can't handle that,
# so let's add them as attachments
summary += f"Message part #{i}\n"
summary += f"[inline image '{part.get_filename()}']\n\n"
attachments.append((i, part))
elif content_type.startswith("text/html"):
summary += f"Message part #{i}\n"
summary += "[inline html]\n\n"
attachments.append((i, part))
elif content_type == "text/plain":
summary += f"Message part #{i}\n"
summary += part.get_content() + "\n"
else:
raise RuntimeError(
f"""Unknown message part
Your Debian bug is too weird to be imported in Launchpad, sorry.
You can fix that by patching this script in ubuntu-dev-tools.
Faulty message part:
{part}"""
)
i += 1
return summary, attachments
def process_bugs(
bugs: Iterable[debianbts.Bugreport],
launchpad: Launchpad,
package: str,
dry_run: bool = True,
browserless: bool = False,
) -> bool:
debian = launchpad.distributions["debian"]
ubuntu = launchpad.distributions["ubuntu"]
lp_debbugs = launchpad.bug_trackers.getByName(name="debbugs")
if not bugs:
Logger.error("Cannot find any of the listed bugs")
sys.exit(1)
err = False
for bug in bugs:
ubupackage = bug.source
if package:
ubupackage = package
ubupackage = package = bug.source
if options.package:
ubupackage = options.package
bug_num = bug.bug_num
subject = bug.subject
log = debianbts.get_bug_log(bug_num)
message = log[0]["message"]
assert isinstance(message, EmailMessage)
attachments: list[tuple[int, EmailMessage]] = []
if message.is_multipart():
summary, attachments = walk_multipart_message(message)
else:
summary = str(message.get_payload())
summary = log[0]["message"].get_payload()
target = ubuntu.getSourcePackage(name=ubupackage)
if target is None:
Logger.error(
@@ -182,73 +137,24 @@ def process_bugs(
Logger.debug("Subject: %s", subject)
Logger.debug("Description: ")
Logger.debug(description)
for i, attachment in attachments:
Logger.debug("Attachment #%s (%s)", i, attachment.get_filename() or "inline")
Logger.debug("Content:")
if attachment.get_content_type() == "text/plain":
content = attachment.get_content()
if len(content) > ATTACHMENT_MAX_SIZE:
content = (
content[:ATTACHMENT_MAX_SIZE]
+ f" [attachment cropped after {ATTACHMENT_MAX_SIZE} characters...]"
)
Logger.debug(content)
else:
Logger.debug("[data]")
if dry_run:
if options.dry_run:
Logger.info("Dry-Run: not creating Ubuntu bug.")
continue
u_bug = launchpad.bugs.createBug(target=target, title=subject, description=description)
for i, attachment in attachments:
name = f"#{i}-{attachment.get_filename() or "inline"}"
content = attachment.get_content()
if isinstance(content, str):
# Launchpad only wants bytes
content = content.encode()
u_bug.addAttachment(
filename=name,
data=content,
comment=f"Imported from Debian bug http://bugs.debian.org/{bug_num}",
)
d_sp = debian.getSourcePackage(name=package)
if d_sp is None and package:
d_sp = debian.getSourcePackage(name=package)
if d_sp is None and options.package:
d_sp = debian.getSourcePackage(name=options.package)
d_task = u_bug.addTask(target=d_sp)
d_watch = u_bug.addWatch(remote_bug=bug_num, bug_tracker=lp_debbugs)
d_task.bug_watch = d_watch
d_task.lp_save()
Logger.info("Opened %s", u_bug.web_link)
if not browserless:
if not options.browserless:
webbrowser.open(u_bug.web_link)
return err
def main() -> None:
options = parse_args()
config = UDTConfig(options.no_conf)
if options.lpinstance is None:
options.lpinstance = config.get_value("LPINSTANCE")
if options.dry_run:
launchpad = Launchpad.login_anonymously("ubuntu-dev-tools")
options.verbose = True
else:
launchpad = Launchpad.login_with("ubuntu-dev-tools", options.lpinstance)
if options.verbose:
Logger.setLevel(logging.DEBUG)
bugs = debianbts.get_status(get_bug_numbers(options.bugs))
if not bugs:
Logger.error("Cannot find any of the listed bugs")
sys.exit(1)
if process_bugs(bugs, launchpad, options.package, options.dry_run, options.browserless):
if err:
sys.exit(1)
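The `walk_multipart_message()` helper removed by this backport splits a multipart Debian bug email into a plain-text summary plus a list of attachments, skipping multipart container parts and PGP signatures, and demoting inline HTML and images to attachments because Launchpad cannot render them inline. A simplified, self-contained sketch of that idea using the standard `email` library (not the exact upstream function):

```python
from email.message import EmailMessage

def split_message(message: EmailMessage):
    """Return (summary_text, [(index, attachment_part), ...])."""
    summary, attachments, i = "", [], 1
    for part in message.walk():
        ctype = part.get_content_type()
        if ctype.startswith("multipart/"):
            continue  # container part, no payload of its own
        if ctype == "application/pgp-signature":
            continue  # signatures are not worth importing
        if part.is_attachment():
            attachments.append((i, part))
        elif ctype == "text/plain":
            summary += f"Message part #{i}\n{part.get_content()}\n"
        else:
            # Inline html/images: file them as attachments instead.
            summary += f"Message part #{i}\n[inline {ctype}]\n\n"
            attachments.append((i, part))
        i += 1
    return summary, attachments

# Build a small multipart message to exercise the walker.
msg = EmailMessage()
msg.set_content("initial bug report text")
msg.add_attachment("some build log", filename="build.log")

summary, attachments = split_message(msg)
```

`EmailMessage.walk()` yields the `multipart/mixed` container first, then each leaf part, which is why the container types are skipped rather than counted.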


@@ -155,7 +155,6 @@ proxy="_unset_"
DEBOOTSTRAP_NO_CHECK_GPG=0
EATMYDATA=1
CCACHE=0
USE_PKGBINARYMANGLER=0
while :; do
case "$1" in
@@ -304,27 +303,11 @@ if [ ! -w /var/lib/sbuild ]; then
# Prepare a usable default .sbuildrc
if [ ! -e ~/.sbuildrc ]; then
cat > ~/.sbuildrc <<EOM
# *** THIS COMMAND IS DEPRECATED ***
#
# In sbuild 0.87.0 and later, the unshare backend is available. This is
# expected to become the default in a future release.
#
# This is the new preferred way of building Debian packages, making the manual
# creation of schroots no longer necessary. To retain the default behavior,
# you may remove this comment block and continue.
#
# To test the unshare backend while retaining the default settings, run sbuild
# with --chroot-mode=unshare like this:
# $ sbuild --chroot-mode=unshare --dist=unstable hello
#
# To switch to the unshare backend by default (recommended), uncomment the
# following lines and delete the rest of the file (with the exception of the
# last two lines):
#\$chroot_mode = 'unshare';
#\$unshare_mmdebstrap_keep_tarball = 1;
# *** VERIFY AND UPDATE \$mailto and \$maintainer_name BELOW ***
# Mail address where logs are sent to (mandatory, no default!)
\$mailto = '$USER';
# Name to use as override in .changes files for the Maintainer: field
#\$maintainer_name='$USER <$USER@localhost>';
@@ -668,7 +651,6 @@ ubuntu)
if ubuntu_dist_ge "$RELEASE" "edgy"; then
# Add pkgbinarymangler (edgy and later)
BUILD_PKGS="$BUILD_PKGS pkgbinarymangler"
USE_PKGBINARYMANGLER=1
# Disable recommends for a smaller chroot (gutsy and later only)
if ubuntu_dist_ge "$RELEASE" "gutsy"; then
BUILD_PKGS="--no-install-recommends $BUILD_PKGS"
@@ -928,8 +910,8 @@ if [ -n "$TEMP_PREFERENCES" ]; then
sudo mv "$TEMP_PREFERENCES" $MNT/etc/apt/preferences.d/proposed.pref
fi
# Copy the timezone (uncomment this if you want to use your local time zone)
#sudo cp -P --remove-destination /etc/localtime /etc/timezone "$MNT"/etc/
# Copy the timezone (comment this out if you want to leave the chroot at UTC)
sudo cp -P --remove-destination /etc/localtime /etc/timezone "$MNT"/etc/
# Create a schroot entry for this chroot
TEMP_SCHROOTCONF=`mktemp -t schrootconf-XXXXXX`
TEMPLATE_SCHROOTCONF=~/.mk-sbuild.schroot.conf
@@ -1048,25 +1030,6 @@ EOF
EOM
fi
if [ "$USE_PKGBINARYMANGLER" = 1 ]; then
sudo bash -c "cat >> $MNT/finish.sh" <<EOM
mkdir -p /etc/pkgbinarymangler/
cat > /etc/pkgbinarymangler/maintainermangler.conf <<EOF
# pkgmaintainermangler configuration file
# pkgmaintainermangler will do nothing unless enable is set to "true"
enable: true
# Configure what happens if /CurrentlyBuilding is present, but invalid
# (i. e. it does not contain a Package: field). If "ignore" (default),
# the file is ignored (i. e. the Maintainer field is mangled) and a
# warning is printed. If "fail" (or any other value), pkgmaintainermangler
# exits with an error, which causes a package build to fail.
invalid_currentlybuilding: ignore
EOF
EOM
fi
if [ -n "$TARGET_ARCH" ]; then
sudo bash -c "cat >> $MNT/finish.sh" <<EOM
# Configure target architecture
@@ -1085,7 +1048,7 @@ apt-get update || true
echo set debconf/frontend Noninteractive | debconf-communicate
echo set debconf/priority critical | debconf-communicate
# Install basic build tool set, trying to match buildd
apt-get -y --force-yes -o Dpkg::Options::="--force-confold" install $BUILD_PKGS
apt-get -y --force-yes install $BUILD_PKGS
# Set up expected /dev entries
if [ ! -r /dev/stdin ]; then ln -s /proc/self/fd/0 /dev/stdin; fi
if [ ! -r /dev/stdout ]; then ln -s /proc/self/fd/1 /dev/stdout; fi


@@ -294,9 +294,7 @@ class PbuilderDist:
if self.target_distro in self._debian_distros:
try:
codename = self.debian_distro_info.codename(
self.target_distro, default=self.target_distro
)
codename = self.debian_distro_info.codename(self.target_distro, default=self.target_distro)
except DistroDataOutdated as error:
Logger.warning(error)
if codename in (self.debian_distro_info.devel(), "experimental"):


@@ -15,51 +15,53 @@
# along with this program. If not, see <http://www.gnu.org/licenses/>.
import lzma
from argparse import ArgumentParser
import sys
import webbrowser
from argparse import ArgumentParser
import yaml
from launchpadlib.launchpad import Launchpad
from ubuntutools.utils import get_url
# proposed-migration is only concerned with the devel series; unlike other
# tools, don't make this configurable
excuses_url = "https://ubuntu-archive-team.ubuntu.com/proposed-migration/update_excuses.yaml.xz"
excuses_url = 'https://ubuntu-archive-team.ubuntu.com/proposed-migration/' \
+ 'update_excuses.yaml.xz'
def get_proposed_version(excuses, package):
for k in excuses["sources"]:
if k["source"] == package:
return k.get("new-version")
for k in excuses['sources']:
if k['source'] == package:
return k.get('new-version')
return None
def claim_excuses_bug(launchpad, bug, package):
print(f"LP: #{bug.id}: {bug.title}")
ubuntu = launchpad.distributions["ubuntu"]
print("LP: #%d: %s" % (bug.id, bug.title))
ubuntu = launchpad.distributions['ubuntu']
series = ubuntu.current_series.fullseriesname
for task in bug.bug_tasks:
# targeting to a series doesn't make the default task disappear,
# it just makes it useless
if task.bug_target_name == f"{package} ({series})":
if task.bug_target_name == "%s (%s)" % (package, series):
our_task = task
break
if task.bug_target_name == f"{package} (Ubuntu)":
elif task.bug_target_name == "%s (Ubuntu)" % package:
our_task = task
if our_task.assignee == launchpad.me:
print("Bug already assigned to you.")
return True
if our_task.assignee:
print(f"Currently assigned to {our_task.assignee.name}")
elif our_task.assignee:
print("Currently assigned to %s" % our_task.assignee.name)
print("""Do you want to claim this bug? [yN] """, end="")
print('''Do you want to claim this bug? [yN] ''', end="")
sys.stdout.flush()
response = sys.stdin.readline()
if response.strip().lower().startswith("y"):
if response.strip().lower().startswith('y'):
our_task.assignee = launchpad.me
our_task.lp_save()
return True
@@ -70,37 +72,38 @@ def claim_excuses_bug(launchpad, bug, package):
def create_excuses_bug(launchpad, package, version):
print("Will open a new bug")
bug = launchpad.bugs.createBug(
title=f"proposed-migration for {package} {version}",
tags=("update-excuse"),
target=f"https://api.launchpad.net/devel/ubuntu/+source/{package}",
description=f"{package} {version} is stuck in -proposed.",
title = 'proposed-migration for %s %s' % (package, version),
tags = ('update-excuse'),
target = 'https://api.launchpad.net/devel/ubuntu/+source/%s' % package,
description = '%s %s is stuck in -proposed.' % (package, version)
)
task = bug.bug_tasks[0]
task.assignee = launchpad.me
task.lp_save()
print(f"Opening {bug.web_link} in browser")
print("Opening %s in browser" % bug.web_link)
webbrowser.open(bug.web_link)
return bug
def has_excuses_bugs(launchpad, package):
ubuntu = launchpad.distributions["ubuntu"]
ubuntu = launchpad.distributions['ubuntu']
pkg = ubuntu.getSourcePackage(name=package)
if not pkg:
raise ValueError(f"No such source package: {package}")
tasks = pkg.searchTasks(tags=["update-excuse"], order_by=["id"])
tasks = pkg.searchTasks(tags=['update-excuse'], order_by=['id'])
bugs = [task.bug for task in tasks]
if not bugs:
return False
if len(bugs) == 1:
print(f"There is 1 open update-excuse bug against {package}")
print("There is 1 open update-excuse bug against %s" % package)
else:
print(f"There are {len(bugs)} open update-excuse bugs against {package}")
print("There are %d open update-excuse bugs against %s" \
% (len(bugs), package))
for bug in bugs:
if claim_excuses_bug(launchpad, bug, package):
@@ -111,14 +114,17 @@ def has_excuses_bugs(launchpad, package):
def main():
parser = ArgumentParser()
parser.add_argument("-l", "--launchpad", dest="launchpad_instance", default="production")
parser.add_argument(
"-v", "--verbose", default=False, action="store_true", help="be more verbose"
)
parser.add_argument("package", nargs="?", help="act on this package only")
"-l", "--launchpad", dest="launchpad_instance", default="production")
parser.add_argument(
"-v", "--verbose", default=False, action="store_true",
help="be more verbose")
parser.add_argument(
'package', nargs='?', help="act on this package only")
args = parser.parse_args()
args.launchpad = Launchpad.login_with("pm-helper", args.launchpad_instance, version="devel")
args.launchpad = Launchpad.login_with(
"pm-helper", args.launchpad_instance, version="devel")
f = get_url(excuses_url, False)
with lzma.open(f) as lzma_f:
@@ -129,14 +135,15 @@ def main():
if not has_excuses_bugs(args.launchpad, args.package):
proposed_version = get_proposed_version(excuses, args.package)
if not proposed_version:
print(f"Package {args.package} not found in -proposed.")
print("Package %s not found in -proposed." % args.package)
sys.exit(1)
create_excuses_bug(args.launchpad, args.package, proposed_version)
create_excuses_bug(args.launchpad, args.package,
proposed_version)
except ValueError as e:
sys.stderr.write(f"{e}\n")
else:
pass # for now
pass # for now
if __name__ == "__main__":
if __name__ == '__main__':
sys.exit(main())


@@ -183,7 +183,7 @@ def display_verbose(package, values):
Logger.info("No reverse dependencies found")
return
def log_package(values, package, arch, dependency, visited, offset=0):
def log_package(values, package, arch, dependency, offset=0):
line = f"{' ' * offset}* {package}"
if all_archs and set(arch) != all_archs:
line += f" [{' '.join(sorted(arch))}]"
@@ -192,9 +192,6 @@ def display_verbose(package, values):
line += " " * (30 - len(line))
line += f" (for {dependency})"
Logger.info(line)
if package in visited:
return
visited = visited.copy().add(package)
data = values.get(package)
if data:
offset = offset + 1
@@ -205,7 +202,6 @@ def display_verbose(package, values):
rdep["Package"],
rdep.get("Architectures", all_archs),
rdep.get("Dependency"),
visited,
offset,
)
@@ -227,7 +223,6 @@ def display_verbose(package, values):
rdep["Package"],
rdep.get("Architectures", all_archs),
rdep.get("Dependency"),
{package},
)
Logger.info("")
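The `visited` parameter removed by this backport exists upstream to stop infinite recursion when packages (build-)depend on each other (Closes: #1087760): each recursive call carries the set of packages already printed on the current branch and stops on a repeat. A minimal cycle-safe sketch of the same idea (illustrative names, not the actual reverse-depends code):

```python
def render_rdeps(graph, package, visited=frozenset(), depth=0):
    """Render a reverse-dependency tree as indented lines, cycle-safe."""
    lines = [f"{'  ' * depth}* {package}"]
    if package in visited:
        return lines  # already shown higher up this branch: stop recursing
    for dep in sorted(graph.get(package, ())):
        # Pass an extended copy down; each branch keeps its own history.
        lines += render_rdeps(graph, dep, visited | {package}, depth + 1)
    return lines

# Two packages that depend on each other no longer recurse forever.
cyclic = {"liba": ["libb"], "libb": ["liba"]}
for line in render_rdeps(cyclic, "liba"):
    print(line)
```

Building the extended set with `visited | {package}` keeps sibling branches independent, whereas mutating one shared set would wrongly suppress repeats across branches.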


@@ -4,31 +4,13 @@
# Authors:
# Andy P. Whitcroft
# Christian Ehrhardt
# Chris Peterson <chris.peterson@canonical.com>
#
# Copyright (C) 2024 Canonical Ltd.
# This program is free software: you can redistribute it and/or modify it
# under the terms of the GNU General Public License version 3, as published
# by the Free Software Foundation.
#
# This program is distributed in the hope that it will be useful, but
# WITHOUT ANY WARRANTY; without even the implied warranties of
# MERCHANTABILITY, SATISFACTORY QUALITY, or FITNESS FOR A PARTICULAR PURPOSE.
# See the GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License along
# with this program. If not, see <http://www.gnu.org/licenses/>.
"""Dumps a list of currently running tests in Autopkgtest"""
__example__ = """
Display first listed test running on amd64 hardware:
$ running-autopkgtests | grep amd64 | head -n1
R 0:01:40 systemd-upstream - focal amd64\
upstream-systemd-ci/systemd-ci - ['CFLAGS=-O0', 'DEB_BUILD_PROFILES=noudeb',\
'TEST_UPSTREAM=1', 'CONFFLAGS_UPSTREAM=--werror -Dslow-tests=true',\
'UPSTREAM_PULL_REQUEST=23153',\
'GITHUB_STATUSES_URL=https://api.github.com/repos/\
systemd/systemd/statuses/cfb0935923dff8050315b5dd22ce8ab06461ff0e']
R 0:01:40 systemd-upstream - focal amd64 upstream-systemd-ci/systemd-ci - ['CFLAGS=-O0', 'DEB_BUILD_PROFILES=noudeb', 'TEST_UPSTREAM=1', 'CONFFLAGS_UPSTREAM=--werror -Dslow-tests=true', 'UPSTREAM_PULL_REQUEST=23153', 'GITHUB_STATUSES_URL=https://api.github.com/repos/systemd/systemd/statuses/cfb0935923dff8050315b5dd22ce8ab06461ff0e']
"""
import sys
@@ -51,10 +33,16 @@ def parse_args():
formatter_class=RawDescriptionHelpFormatter,
)
parser.add_argument(
"-r", "--running", action="store_true", help="Print runnning autopkgtests (default: true)"
"-r",
"--running",
action="store_true",
help="Print runnning autopkgtests (default: true)",
)
parser.add_argument(
"-q", "--queued", action="store_true", help="Print queued autopkgtests (default: true)"
"-q",
"--queued",
action="store_true",
help="Print queued autopkgtests (default: true)",
)
options = parser.parse_args()


@@ -32,18 +32,17 @@ def make_pep440_compliant(version: str) -> str:
scripts = [
"backportpackage",
"bitesize",
"check-mir",
"check-symbols",
"dch-repeat",
"grab-merge",
"grep-merges",
"import-bug-from-debian",
"lp-bitesize",
"merge-changelog",
"mk-sbuild",
"pbuilder-dist",
"pbuilder-dist-simple",
"pm-helper",
"pull-pkg",
"pull-debian-debdiff",
"pull-debian-source",


@@ -49,7 +49,6 @@ from ubuntutools.requestsync.mail import get_debian_srcpkg as requestsync_mail_g
from ubuntutools.version import Version
Logger = getLogger()
cached_sync_blocklist = None
def remove_signature(dscname):
@@ -144,7 +143,7 @@ def sync_dsc(
if ubuntu_ver.is_modified_in_ubuntu():
if not force:
Logger.error("--force is required to discard Ubuntu changes.")
return None
sys.exit(1)
Logger.warning(
"Overwriting modified Ubuntu version %s, setting current version to %s",
@@ -158,7 +157,7 @@
src_pkg.pull()
except DownloadError as e:
Logger.error("Failed to download: %s", str(e))
return None
sys.exit(1)
src_pkg.unpack()
needs_fakesync = not (need_orig or ubu_pkg.verify_orig())
@@ -167,13 +166,13 @@
Logger.warning("Performing a fakesync")
elif not needs_fakesync and fakesync:
Logger.error("Fakesync not required, aborting.")
return None
sys.exit(1)
elif needs_fakesync and not fakesync:
Logger.error(
"The checksums of the Debian and Ubuntu packages "
"mismatch. A fake sync using --fakesync is required."
)
return None
sys.exit(1)
if fakesync:
# Download Ubuntu files (override Debian source tarballs)
@@ -181,7 +180,7 @@
ubu_pkg.pull()
except DownloadError as e:
Logger.error("Failed to download: %s", str(e))
return None
sys.exit(1)
# change into package directory
directory = src_pkg.source + "-" + new_ver.upstream_version
@@ -266,7 +265,7 @@
returncode = subprocess.call(cmd)
if returncode != 0:
Logger.error("Source-only build with debuild failed. Please check build log above.")
return None
sys.exit(1)
def fetch_source_pkg(package, dist, version, component, ubuntu_release, mirror):
@@ -296,7 +295,7 @@ def fetch_source_pkg(package, dist, version, component, ubuntu_release, mirror):
udtexceptions.SeriesNotFoundException,
) as e:
Logger.error(str(e))
return None
sys.exit(1)
if version is None:
version = Version(debian_srcpkg.getVersion())
try:
@ -307,7 +306,7 @@ def fetch_source_pkg(package, dist, version, component, ubuntu_release, mirror):
ubuntu_version = Version("~")
except udtexceptions.SeriesNotFoundException as e:
Logger.error(str(e))
return None
sys.exit(1)
if ubuntu_version >= version:
# The LP importer is maybe out of date
debian_srcpkg = requestsync_mail_get_debian_srcpkg(package, dist)
@ -321,7 +320,7 @@ def fetch_source_pkg(package, dist, version, component, ubuntu_release, mirror):
ubuntu_version,
ubuntu_release,
)
return None
sys.exit(1)
if component is None:
component = debian_srcpkg.getComponent()
@ -330,7 +329,7 @@ def fetch_source_pkg(package, dist, version, component, ubuntu_release, mirror):
return DebianSourcePackage(package, version.full_version, component, mirrors=mirrors)
def copy(src_pkg, release, bugs, sponsoree=None, simulate=False, force=False, yes=False):
def copy(src_pkg, release, bugs, sponsoree=None, simulate=False, force=False):
"""Copy a source package from Debian to Ubuntu using the Launchpad API."""
ubuntu = Distribution("ubuntu")
debian_archive = Distribution("debian").getArchive()
@ -353,7 +352,7 @@ def copy(src_pkg, release, bugs, sponsoree=None, simulate=False, force=False, ye
"Debian version %s has not been picked up by LP yet. Please try again later.",
src_pkg.version,
)
return None
sys.exit(1)
try:
ubuntu_spph = get_ubuntu_srcpkg(src_pkg.source, ubuntu_series, ubuntu_pocket)
@ -374,7 +373,7 @@ def copy(src_pkg, release, bugs, sponsoree=None, simulate=False, force=False, ye
base_version = ubuntu_version.get_related_debian_version()
if not force and ubuntu_version.is_modified_in_ubuntu():
Logger.error("--force is required to discard Ubuntu changes.")
return None
sys.exit(1)
# Check whether a fakesync would be required.
if not src_pkg.dsc.compare_dsc(ubuntu_pkg.dsc):
@ -382,7 +381,7 @@ def copy(src_pkg, release, bugs, sponsoree=None, simulate=False, force=False, ye
"The checksums of the Debian and Ubuntu packages "
"mismatch. A fake sync using --fakesync is required."
)
return None
sys.exit(1)
except udtexceptions.PackageNotFoundException:
base_version = Version("~")
Logger.info(
@ -403,10 +402,9 @@ def copy(src_pkg, release, bugs, sponsoree=None, simulate=False, force=False, ye
if sponsoree:
Logger.info("Sponsoring this sync for %s (%s)", sponsoree.display_name, sponsoree.name)
if not yes:
answer = YesNoQuestion().ask("Sync this package", "no")
if answer != "yes":
return
answer = YesNoQuestion().ask("Sync this package", "no")
if answer != "yes":
return
try:
ubuntu_archive.copyPackage(
@ -421,29 +419,26 @@ def copy(src_pkg, release, bugs, sponsoree=None, simulate=False, force=False, ye
except HTTPError as error:
Logger.error("HTTP Error %s: %s", error.response.status, error.response.reason)
Logger.error(error.content)
return None
sys.exit(1)
Logger.info("Request succeeded; you should get an e-mail once it is processed.")
bugs = sorted(set(bugs))
if bugs:
Logger.info("Launchpad bugs to be closed: %s", ", ".join(str(bug) for bug in bugs))
Logger.info("Please wait for the sync to be successful before closing bugs.")
if yes:
answer = YesNoQuestion().ask("Close bugs", "yes")
if answer == "yes":
close_bugs(bugs, src_pkg.source, src_pkg.version.full_version, changes, sponsoree)
else:
answer = YesNoQuestion().ask("Close bugs", "yes")
if answer == "yes":
close_bugs(bugs, src_pkg.source, src_pkg.version.full_version, changes, sponsoree)
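The `-y`/`--yes` handling touched in this hunk is the usual pattern of gating interactive YesNoQuestion prompts behind a flag; a standalone sketch (the `confirm` and `ask` names are invented for illustration, not the tool's API):

```python
# Minimal sketch of flag-gated confirmation prompts. `assume_yes` skips the
# prompt entirely and takes the default; `ask` is injectable for testing.
def confirm(question, default, assume_yes=False, ask=input):
    """Return True for a "yes" answer, without prompting when assume_yes is set."""
    if assume_yes:
        return default == "yes"
    answer = ask(f"{question} [yes/no] ({default}): ").strip() or default
    return answer == "yes"
```

An empty reply falls back to the default, mirroring how YesNoQuestion treats its default answer.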
def is_blocklisted(query):
"""Determine if package "query" is in the sync blocklist
Returns tuple of (blocklisted, comments)
blocklisted is one of False, 'CURRENT', 'ALWAYS'
def is_blacklisted(query):
"""Determine if package "query" is in the sync blacklist
Returns tuple of (blacklisted, comments)
blacklisted is one of False, 'CURRENT', 'ALWAYS'
"""
series = Launchpad.distributions["ubuntu"].current_series
lp_comments = series.getDifferenceComments(source_package_name=query)
blocklisted = False
blacklisted = False
comments = [
f"{c.body_text}\n -- {c.comment_author.name}"
f" {c.comment_date.strftime('%a, %d %b %Y %H:%M:%S +0000')}"
@ -451,38 +446,32 @@ def is_blocklisted(query):
]
for diff in series.getDifferencesTo(source_package_name_filter=query):
if diff.status == "Blacklisted current version" and blocklisted != "ALWAYS":
blocklisted = "CURRENT"
if diff.status == "Blacklisted current version" and blacklisted != "ALWAYS":
blacklisted = "CURRENT"
if diff.status == "Blacklisted always":
blocklisted = "ALWAYS"
blacklisted = "ALWAYS"
global cached_sync_blocklist
if not cached_sync_blocklist:
url = "https://ubuntu-archive-team.ubuntu.com/sync-blocklist.txt"
try:
with urllib.request.urlopen(url) as f:
cached_sync_blocklist = f.read().decode("utf-8")
except:
print("WARNING: unable to download the sync blocklist. Erring on the side of caution.")
return ("ALWAYS", "INTERNAL ERROR: Unable to fetch sync blocklist")
# Old blacklist:
url = "https://ubuntu-archive-team.ubuntu.com/sync-blacklist.txt"
with urllib.request.urlopen(url) as f:
applicable_lines = []
for line in f:
line = line.decode("utf-8")
if not line.strip():
applicable_lines = []
continue
applicable_lines.append(line)
try:
line = line[: line.index("#")]
except ValueError:
pass
source = line.strip()
if source and fnmatch.fnmatch(query, source):
comments += ["From sync-blacklist.txt:"] + applicable_lines
blacklisted = "ALWAYS"
break
applicable_lines = []
for line in cached_sync_blocklist.splitlines():
if not line.strip():
applicable_lines = []
continue
applicable_lines.append(line)
try:
line = line[:line.index("#")]
except ValueError:
pass
source = line.strip()
if source and fnmatch.fnmatch(query, source):
comments += ["From sync-blocklist.txt:"] + applicable_lines
blocklisted = "ALWAYS"
break
return (blocklisted, comments)
return (blacklisted, comments)
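The text-blocklist scan above follows a simple scheme: blank lines reset the accumulated comment block, `#` starts a trailing comment, and the remaining token is a shell-style glob matched with fnmatch. A self-contained sketch (the sample blocklist entries are invented):

```python
import fnmatch

# Hypothetical file in the sync-blocklist.txt format: comment lines explain
# the entries that follow them, blank lines separate unrelated sections.
BLOCKLIST = """\
# Linux kernel packages are never synced
linux*      # owned by the kernel team

grub2       # Ubuntu carries its own patches
"""

def match_blocklist(text, query):
    """Return (blocklisted, comment_lines) for a package name."""
    applicable_lines = []
    for line in text.splitlines():
        if not line.strip():
            applicable_lines = []  # a blank line ends the current section
            continue
        applicable_lines.append(line)
        line = line.split("#", 1)[0]  # strip trailing comments
        source = line.strip()
        if source and fnmatch.fnmatch(query, source):
            return ("ALWAYS", applicable_lines)
    return (False, [])

print(match_blocklist(BLOCKLIST, "linux-meta")[0])  # ALWAYS
```

Keeping the whole preceding comment run in `applicable_lines` is what lets the tool show *why* a package is blocked, not just that it is.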
def close_bugs(bugs, package, version, changes, sponsoree):
@ -519,12 +508,6 @@ def parse():
epilog = f"See {os.path.basename(sys.argv[0])}(1) for more info."
parser = argparse.ArgumentParser(usage=usage, epilog=epilog)
parser.add_argument(
"-y",
"--yes",
action="store_true",
help="Automatically sync without prompting. Use with caution and care."
)
parser.add_argument("-d", "--distribution", help="Debian distribution to sync from.")
parser.add_argument("-r", "--release", help="Specify target Ubuntu release.")
parser.add_argument("-V", "--debian-version", help="Specify the version to sync from.")
@ -729,38 +712,36 @@ def main():
args.release,
args.debian_mirror,
)
if not src_pkg:
continue
blocklisted, comments = is_blocklisted(src_pkg.source)
blocklist_fail = False
if blocklisted:
blacklisted, comments = is_blacklisted(src_pkg.source)
blacklist_fail = False
if blacklisted:
messages = []
if blocklisted == "CURRENT":
if blacklisted == "CURRENT":
Logger.debug(
"Source package %s is temporarily blocklisted "
"(blocklisted_current). "
"Source package %s is temporarily blacklisted "
"(blacklisted_current). "
"Ubuntu ignores these for now. "
"See also LP: #841372",
src_pkg.source,
)
else:
if args.fakesync:
messages += ["Doing a fakesync, overriding blocklist."]
messages += ["Doing a fakesync, overriding blacklist."]
else:
blocklist_fail = True
blacklist_fail = True
messages += [
"If this package needs a fakesync, use --fakesync",
"If you think this package shouldn't be "
"blocklisted, please file a bug explaining your "
"blacklisted, please file a bug explaining your "
"reasoning and subscribe ~ubuntu-archive.",
]
if blocklist_fail:
Logger.error("Source package %s is blocklisted.", src_pkg.source)
elif blocklisted == "ALWAYS":
Logger.info("Source package %s is blocklisted.", src_pkg.source)
if blacklist_fail:
Logger.error("Source package %s is blacklisted.", src_pkg.source)
elif blacklisted == "ALWAYS":
Logger.info("Source package %s is blacklisted.", src_pkg.source)
if messages:
for message in messages:
for line in textwrap.wrap(message):
@ -772,15 +753,14 @@ def main():
for line in textwrap.wrap(comment):
Logger.info(" %s", line)
if blocklist_fail:
continue
if blacklist_fail:
sys.exit(1)
if args.lp:
if not copy(src_pkg, args.release, args.bugs, sponsoree, args.simulate, args.force, args.yes):
continue
copy(src_pkg, args.release, args.bugs, sponsoree, args.simulate, args.force)
else:
os.environ["DEB_VENDOR"] = "Ubuntu"
if not sync_dsc(
sync_dsc(
src_pkg,
args.distribution,
args.release,
@ -792,8 +772,7 @@ def main():
args.simulate,
args.force,
args.fakesync,
):
continue
)
if __name__ == "__main__":


@ -28,8 +28,9 @@
import argparse
import sys
import lazr.restfulclient.errors
from launchpadlib.credentials import TokenAuthorizationException
from launchpadlib.launchpad import Launchpad
import lazr.restfulclient.errors
from ubuntutools import getLogger
from ubuntutools.lp.udtexceptions import PocketDoesNotExistError
@ -38,7 +39,7 @@ from ubuntutools.misc import split_release_pocket
Logger = getLogger()
def get_build_states(pkg, archs):
def getBuildStates(pkg, archs):
res = []
for build in pkg.getBuilds():
@ -47,8 +48,7 @@ def get_build_states(pkg, archs):
msg = "\n".join(res)
return f"Build state(s) for '{pkg.source_package_name}':\n{msg}"
def rescore_builds(pkg, archs, score):
def rescoreBuilds(pkg, archs, score):
res = []
for build in pkg.getBuilds():
@ -61,19 +61,18 @@ def rescore_builds(pkg, archs, score):
res.append(f" {arch}: done")
except lazr.restfulclient.errors.Unauthorized:
Logger.error(
"You don't have the permissions to rescore builds."
" Ignoring your rescore request."
"You don't have the permissions to rescore builds. Ignoring your rescore request."
)
return None
except lazr.restfulclient.errors.BadRequest:
Logger.info("Cannot rescore build of %s on %s.", build.source_package_name, arch)
Logger.info("Cannot rescore build of %s on %s.",
build.source_package_name, arch)
res.append(f" {arch}: failed")
msg = "\n".join(res)
return f"Rescoring builds of '{pkg.source_package_name}' to {score}:\n{msg}"
def retry_builds(pkg, archs):
def retryBuilds(pkg, archs):
res = []
for build in pkg.getBuilds():
arch = build.arch_tag
@ -95,7 +94,16 @@ def main():
# Valid architectures.
valid_archs = set(
["armhf", "arm64", "amd64", "i386", "powerpc", "ppc64el", "riscv64", "s390x"]
[
"armhf",
"arm64",
"amd64",
"i386",
"powerpc",
"ppc64el",
"riscv64",
"s390x",
]
)
# Prepare our option parser.
@ -110,7 +118,8 @@ def main():
f"include: {', '.join(valid_archs)}.",
)
parser.add_argument("-A", "--archive", help="operate on ARCHIVE", default="ubuntu")
parser.add_argument("-A", "--archive", help="operate on ARCHIVE",
default="ubuntu")
# Batch processing options
batch_options = parser.add_argument_group(
@ -139,9 +148,7 @@ def main():
help="Rescore builds to <priority>.",
)
batch_options.add_argument(
"--state",
action="store",
dest="state",
"--state", action="store", dest="state",
help="Act on builds that are in the specified state",
)
@ -150,8 +157,11 @@ def main():
# Parse our options.
args = parser.parse_args()
launchpad = Launchpad.login_with("ubuntu-dev-tools", "production", version="devel")
ubuntu = launchpad.distributions["ubuntu"]
launchpad = Launchpad.login_with("ubuntu-dev-tools", "production",
version="devel")
me = launchpad.me
ubuntu = launchpad.distributions['ubuntu']
if args.batch:
release = args.series
@ -159,8 +169,8 @@ def main():
# ppas don't have a proposed pocket so just use the release pocket;
# but for the main archive we default to -proposed
release = ubuntu.getDevelopmentSeries()[0].name
if args.archive == "ubuntu":
release = f"{release}-proposed"
if args.archive == 'ubuntu':
release = release + "-proposed"
try:
(release, pocket) = split_release_pocket(release)
except PocketDoesNotExistError as error:
@ -213,13 +223,12 @@ def main():
# Get list of published sources for package in question.
try:
sources = archive.getPublishedSources(
distro_series=distroseries,
exact_match=True,
pocket=pocket,
source_name=package,
status="Published",
)[0]
except IndexError:
distro_series=distroseries,
exact_match=True,
pocket=pocket,
source_name=package,
status='Published')[0]
except IndexError as error:
Logger.error("No publication found for package %s", package)
sys.exit(1)
# Get list of builds for that package.
@ -234,20 +243,21 @@ def main():
# are in place.
if operation == "retry":
necessary_privs = archive.checkUpload(
component=sources.getComponent(),
distroseries=distroseries,
person=launchpad.me,
pocket=pocket,
sourcepackagename=sources.getPackageName(),
component=sources.getComponent(),
distroseries=distroseries,
person=launchpad.me,
pocket=pocket,
sourcepackagename=sources.getPackageName(),
)
if not necessary_privs:
Logger.error(
"You cannot perform the %s operation on a %s package as you"
" do not have the permissions to do this action.",
operation,
component,
)
sys.exit(1)
if operation == "retry" and not necessary_privs:
Logger.error(
"You cannot perform the %s operation on a %s package as you"
" do not have the permissions to do this action.",
operation,
component,
)
sys.exit(1)
# Output details.
Logger.info(
@ -278,8 +288,7 @@ def main():
build.rescore(score=priority)
except lazr.restfulclient.errors.Unauthorized:
Logger.error(
"You don't have the permissions to rescore builds."
" Ignoring your rescore request."
"You don't have the permissions to rescore builds. Ignoring your rescore request."
)
break
else:
@ -316,22 +325,24 @@ def main():
if not args.state:
if args.retry:
args.state = "Failed to build"
args.state='Failed to build'
elif args.priority:
args.state = "Needs building"
args.state='Needs building'
# there is no equivalent to series.getBuildRecords() for a ppa.
# however, we don't want to have to traverse all build records for
# all series when working on the main archive, so we use
# series.getBuildRecords() for ubuntu and handle ppas separately
series = ubuntu.getSeries(name_or_version=release)
if args.archive == "ubuntu":
builds = series.getBuildRecords(build_state=args.state, pocket=pocket)
if args.archive == 'ubuntu':
builds = series.getBuildRecords(build_state=args.state,
pocket=pocket)
else:
builds = []
for build in archive.getBuildRecords(build_state=args.state, pocket=pocket):
for build in archive.getBuildRecords(build_state=args.state,
pocket=pocket):
if not build.current_source_publication:
continue
if build.current_source_publication.distro_series == series:
if build.current_source_publication.distro_series==series:
builds.append(build)
for build in builds:
if build.arch_tag not in archs:
@ -350,31 +361,24 @@ def main():
)
if args.retry and not can_retry:
Logger.error(
"You don't have the permissions to retry the build of '%s', skipping.",
build.source_package_name,
"You don't have the permissions to retry the "
"build of '%s', skipping.",
build.source_package_name
)
continue
Logger.info(
"The source version for '%s' in '%s' (%s) is: %s",
build.source_package_name,
release,
pocket,
build.source_package_version,
build.source_package_name,
release,
pocket,
build.source_package_version
)
if args.retry and build.can_be_retried:
Logger.info(
"Retrying build of %s on %s...", build.source_package_name, build.arch_tag
)
try:
build.retry()
retry_count += 1
except lazr.restfulclient.errors.BadRequest:
Logger.info(
"Failed to retry build of %s on %s",
build.source_package_name,
build.arch_tag,
)
Logger.info("Retrying build of %s on %s...",
build.source_package_name, build.arch_tag)
build.retry()
retry_count += 1
if args.priority and can_rescore:
if build.can_be_rescored:
@ -382,32 +386,28 @@ def main():
build.rescore(score=args.priority)
except lazr.restfulclient.errors.Unauthorized:
Logger.error(
"You don't have the permissions to rescore builds."
" Ignoring your rescore request."
"You don't have the permissions to rescore builds. Ignoring your rescore request."
)
can_rescore = False
except lazr.restfulclient.errors.BadRequest:
Logger.info(
"Cannot rescore build of %s on %s.",
build.source_package_name,
build.arch_tag,
)
Logger.info("Cannot rescore build of %s on %s.",
build.source_package_name, build.arch_tag)
Logger.info("")
if args.retry:
Logger.info("%d package builds retried", retry_count)
sys.exit(0)
for pkg in args.packages:
try:
pkg = archive.getPublishedSources(
distro_series=distroseries,
exact_match=True,
pocket=pocket,
source_name=pkg,
status="Published",
)[0]
except IndexError:
distro_series=distroseries,
exact_match=True,
pocket=pocket,
source_name=pkg,
status='Published')[0]
except IndexError as error:
Logger.error("No publication found for package %s", pkg)
continue
@ -435,14 +435,15 @@ def main():
pkg.source_package_version,
)
Logger.info(get_build_states(pkg, archs))
Logger.info(getBuildStates(pkg, archs))
if can_retry:
Logger.info(retry_builds(pkg, archs))
Logger.info(retryBuilds(pkg, archs))
if args.priority:
Logger.info(rescore_builds(pkg, archs, args.priority))
Logger.info(rescoreBuilds(pkg, archs, args.priority))
Logger.info("")
if __name__ == "__main__":
main()


@ -165,7 +165,6 @@ class SourcePackage(ABC):
series = kwargs.get("series")
pocket = kwargs.get("pocket")
status = kwargs.get("status")
arch = kwargs.get("arch")
verify_signature = kwargs.get("verify_signature", False)
try_binary = kwargs.get("try_binary", True)
@ -185,7 +184,6 @@ class SourcePackage(ABC):
self._series = series
self._pocket = pocket
self._status = status
self._arch = arch
# dscfile can be either a path or a URL. misc.py's download() will
# later figure it out
self._dsc_source = dscfile
@ -254,7 +252,6 @@ class SourcePackage(ABC):
)
try:
params["archtag"] = self._arch
bpph = archive.getBinaryPackage(self.source, **params)
except PackageNotFoundException as bpnfe:
# log binary lookup failure, in case it provides hints
@ -546,7 +543,7 @@ class SourcePackage(ABC):
Return the debdiff filename.
"""
cmd = ["debdiff", self.dsc_name, newpkg.dsc_name]
difffn = f"{newpkg.dsc_name[:-3]}debdiff"
difffn = newpkg.dsc_name[:-3] + "debdiff"
Logger.debug("%s > %s", " ".join(cmd), difffn)
with open(difffn, "w", encoding="utf-8") as f:
if subprocess.call(cmd, stdout=f, cwd=str(self.workdir)) > 2:
@ -1345,7 +1342,7 @@ class SnapshotSPPH:
self.getComponent(),
subdir,
name,
f"{name}_{pkgversion}",
name + "_" + pkgversion,
"changelog.txt",
)
try:


@ -71,8 +71,8 @@ class Pbuilder(Builder):
cmd = [
"sudo",
"-E",
f"ARCH={self.architecture}",
f"DIST={dist}",
"ARCH=" + self.architecture,
"DIST=" + dist,
self.name,
"--build",
"--architecture",
@ -91,8 +91,8 @@ class Pbuilder(Builder):
cmd = [
"sudo",
"-E",
f"ARCH={self.architecture}",
f"DIST={dist}",
"ARCH=" + self.architecture,
"DIST=" + dist,
self.name,
"--update",
"--architecture",
@ -140,7 +140,7 @@ class Sbuild(Builder):
workdir = os.getcwd()
Logger.debug("cd %s", result_directory)
os.chdir(result_directory)
cmd = ["sbuild", "--arch-all", f"--dist={dist}", f"--arch={self.architecture}", dsc_file]
cmd = ["sbuild", "--arch-all", "--dist=" + dist, "--arch=" + self.architecture, dsc_file]
Logger.debug(" ".join(cmd))
returncode = subprocess.call(cmd)
Logger.debug("cd %s", workdir)


@ -68,19 +68,21 @@ class UDTConfig:
config = {}
for filename in ("/etc/devscripts.conf", "~/.devscripts"):
try:
with open(os.path.expanduser(filename), "r", encoding="utf-8") as f:
content = f.read()
f = open(os.path.expanduser(filename), "r", encoding="utf-8")
except IOError:
continue
try:
tokens = shlex.split(content, comments=True)
except ValueError as e:
Logger.error("Error parsing %s: %s", filename, e)
continue
for token in tokens:
if "=" in token:
key, value = token.split("=", 1)
for line in f:
parsed = shlex.split(line, comments=True)
if len(parsed) > 1:
Logger.warning(
"Cannot parse variable assignment in %s: %s",
getattr(f, "name", "<config>"),
line,
)
if len(parsed) >= 1 and "=" in parsed[0]:
key, value = parsed[0].split("=", 1)
config[key] = value
f.close()
return config
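The rewritten config reader hands the whole file to `shlex.split(..., comments=True)` rather than parsing line by line, which is what lets multi-line variables survive (the Closes: #725418 change in the changelog). A standalone sketch with an invented config snippet:

```python
import shlex

# Hypothetical ~/.devscripts content; note the quoted multi-line value,
# which a line-by-line parser would mangle but shlex handles correctly.
CONTENT = """
# comment
DEBUILD_DPKG_BUILDPACKAGE_OPTS="-us -uc
 -I -i"
UBUNTUTOOLS_BUILDER=sbuild
"""

def parse_config(content):
    config = {}
    # comments=True strips "#" comments; quoting keeps newlines in one token
    for token in shlex.split(content, comments=True):
        if "=" in token:
            key, value = token.split("=", 1)
            config[key] = value
    return config

cfg = parse_config(CONTENT)
print(cfg["UBUNTUTOOLS_BUILDER"])  # sbuild
```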
def get_value(self, key, default=None, boolean=False, compat_keys=()):
@ -97,9 +99,9 @@ class UDTConfig:
if default is None and key in self.defaults:
default = self.defaults[key]
keys = [f"{self.prefix}_{key}"]
keys = [self.prefix + "_" + key]
if key in self.defaults:
keys.append(f"UBUNTUTOOLS_{key}")
keys.append("UBUNTUTOOLS_" + key)
keys += compat_keys
for k in keys:
@ -112,9 +114,9 @@ class UDTConfig:
else:
continue
if k in compat_keys:
replacements = f"{self.prefix}_{key}"
replacements = self.prefix + "_" + key
if key in self.defaults:
replacements += f"or UBUNTUTOOLS_{key}"
replacements += "or UBUNTUTOOLS_" + key
Logger.warning(
"Using deprecated configuration variable %s. You should use %s.",
k,
@ -178,7 +180,7 @@ def ubu_email(name=None, email=None, export=True):
mailname = socket.getfqdn()
if os.path.isfile("/etc/mailname"):
mailname = open("/etc/mailname", "r", encoding="utf-8").read().strip()
email = f"{pwd.getpwuid(os.getuid()).pw_name}@{mailname}"
email = pwd.getpwuid(os.getuid()).pw_name + "@" + mailname
if export:
os.environ["DEBFULLNAME"] = name


@ -883,7 +883,7 @@ class SourcePackagePublishingHistory(BaseWrapper):
"""
release = self.getSeriesName()
if self.pocket != "Release":
release += f"-{self.pocket.lower()}"
release += "-" + self.pocket.lower()
return release
def getArchive(self):


@ -385,7 +385,7 @@ class _StderrProgressBar:
pctstr = f"{pct:>3}%"
barlen = self.width * pct // 100
barstr = "=" * barlen
barstr = f"{barstr[:-1]}>"
barstr = barstr[:-1] + ">"
barstr = barstr.ljust(self.width)
fullstr = f"\r[{barstr}]{pctstr}"
sys.stderr.write(fullstr)


@ -340,7 +340,6 @@ class PullPkg:
params = {}
params["package"] = options["package"]
params["arch"] = options["arch"]
if options["release"]:
(release, version, pocket) = self.parse_release_and_version(
@ -454,7 +453,7 @@ class PullPkg:
if key.startswith("vcs-"):
if key == "vcs-browser":
continue
if key == "vcs-git":
elif key == "vcs-git":
vcs = "Git"
elif key == "vcs-bzr":
vcs = "Bazaar"
@ -463,13 +462,9 @@ class PullPkg:
uri = srcpkg.dsc[original_key]
Logger.warning(
"\nNOTICE: '%s' packaging is maintained in "
"the '%s' version control system at:\n %s\n",
package,
vcs,
uri,
)
Logger.warning("\nNOTICE: '%s' packaging is maintained in "
"the '%s' version control system at:\n"
" %s\n" % (package, vcs, uri))
if vcs == "Bazaar":
vcscmd = " $ bzr branch " + uri
@ -477,11 +472,9 @@ class PullPkg:
vcscmd = " $ git clone " + uri
if vcscmd:
Logger.info(
"Please use:\n%s\n"
"to retrieve the latest (possibly unreleased) updates to the package.\n",
vcscmd,
)
Logger.info(f"Please use:\n{vcscmd}\n"
"to retrieve the latest (possibly unreleased) "
"updates to the package.\n")
if pull == PULL_LIST:
Logger.info("Source files:")


@ -31,9 +31,9 @@ class Question:
def get_options(self):
if len(self.options) == 2:
options = f"{self.options[0]} or {self.options[1]}"
options = self.options[0] + " or " + self.options[1]
else:
options = f"{', '.join(self.options[:-1])}, or {self.options[-1]}"
options = ", ".join(self.options[:-1]) + ", or " + self.options[-1]
return options
def ask(self, question, default=None):
@ -67,7 +67,7 @@ class Question:
if selected == option[0]:
selected = option
if selected not in self.options:
print(f"Please answer the question with {self.get_options()}.")
print("Please answer the question with " + self.get_options() + ".")
return selected
@ -170,7 +170,7 @@ class EditBugReport(EditFile):
split_re = re.compile(r"^Summary.*?:\s+(.*?)\s+Description:\s+(.*)$", re.DOTALL | re.UNICODE)
def __init__(self, subject, body, placeholders=None):
prefix = f"{os.path.basename(sys.argv[0])}_"
prefix = os.path.basename(sys.argv[0]) + "_"
tmpfile = tempfile.NamedTemporaryFile(prefix=prefix, suffix=".txt", delete=False)
tmpfile.write((f"Summary (one line):\n{subject}\n\nDescription:\n{body}").encode("utf-8"))
tmpfile.close()


@ -183,7 +183,7 @@ Content-Type: text/plain; charset=UTF-8
backup = tempfile.NamedTemporaryFile(
mode="w",
delete=False,
prefix=f"requestsync-{re.sub('[^a-zA-Z0-9_-]', '', bugtitle.replace(' ', '_'))}",
prefix="requestsync-" + re.sub(r"[^a-zA-Z0-9_-]", "", bugtitle.replace(" ", "_")),
)
with backup:
backup.write(mail)


@ -1,19 +1,18 @@
# Copyright (C) 2024 Canonical Ltd.
# Author: Chris Peterson <chris.peterson@canonical.com>
# Author: Andy P. Whitcroft
# Author: Christian Ehrhardt
#
# This program is free software: you can redistribute it and/or modify it
# under the terms of the GNU General Public License version 3, as published
# by the Free Software Foundation.
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as
# published by the Free Software Foundation, either version 3 of the
# License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful, but
# WITHOUT ANY WARRANTY; without even the implied warranties of
# MERCHANTABILITY, SATISFACTORY QUALITY, or FITNESS FOR A PARTICULAR PURPOSE.
# See the GNU General Public License for more details.
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
#
# You should have received a copy of the GNU General Public License along
# with this program. If not, see <http://www.gnu.org/licenses/>.
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
import datetime
import json
@ -26,7 +25,10 @@ URL_QUEUED = "http://autopkgtest.ubuntu.com/queues.json"
def _get_jobs(url: str) -> dict:
request = urllib.request.Request(url, headers={"Cache-Control": "max-age-0"})
request = urllib.request.Request(
url,
headers={"Cache-Control": "max-age-0"},
)
with urllib.request.urlopen(request) as response:
data = response.read()
jobs = json.loads(data.decode("utf-8"))
@ -49,10 +51,7 @@ def get_running():
env = jobinfo[0].get("env", "-")
time = str(datetime.timedelta(seconds=jobinfo[1]))
try:
line = (
f"R {time:6} {pkg:30} {'-':10} {series:8} {arch:8}"
f" {ppas:31} {triggers} {env}\n"
)
line = f"R {time:6} {pkg:30} {'-':10} {series:8} {arch:8} {ppas:31} {triggers} {env}\n"
running.append((jobinfo[1], line))
except BrokenPipeError:
sys.exit(1)
@ -86,10 +85,7 @@ def get_queued():
n = n + 1
try:
output += (
f"Q{n:04d} {'-:--':>6} {pkg:30} {origin:10} {series:8} {arch:8}"
f" {ppas:31} {triggers}\n"
)
output += f"Q{n:04d} {'-:--':>6} {pkg:30} {origin:10} {series:8} {arch:8} {ppas:31} {triggers}\n"
except BrokenPipeError:
sys.exit(1)
return output
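The queue listing built above relies on f-string width specifiers to produce aligned columns; a sketch reproducing one row, with values taken from the cached test data later in this diff:

```python
# Fixed-width row formatting: :04d zero-pads the counter, :>6 right-aligns
# the time column, bare :N left-pads each remaining field to N characters.
n, pkg, origin = 1, "libobject-accessor-perl", "ubuntu"
series, arch, ppas, triggers = "noble", "arm64", "-", "perl/5.38.2-3"
line = f"Q{n:04d} {'-:--':>6} {pkg:30} {origin:10} {series:8} {arch:8} {ppas:31} {triggers}\n"
print(line, end="")
```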


@ -255,7 +255,7 @@ class SourcePackage:
def _changes_file(self):
"""Returns the file name of the .changes file."""
return os.path.join(
self._workdir, f"{self._package}_{strip_epoch(self._version)}_source.changes"
self._workdir, f"{self._package}_{ strip_epoch(self._version)}_source.changes"
)
def check_target(self, upload, launchpad):


@ -39,7 +39,7 @@ def is_command_available(command, check_sbin=False):
"Is command in $PATH?"
path = os.environ.get("PATH", "/usr/bin:/bin").split(":")
if check_sbin:
path += [f"{directory[:-3]}sbin" for directory in path if directory.endswith("/bin")]
path += [directory[:-3] + "sbin" for directory in path if directory.endswith("/bin")]
return any(os.access(os.path.join(directory, command), os.X_OK) for directory in path)
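The helper above derives sbin directories from any `$PATH` entry ending in `/bin`; a self-contained version, with an injectable environment added purely for demonstration:

```python
import os
import tempfile

def is_command_available(command, check_sbin=False, environ=None):
    """Is `command` executable somewhere on $PATH (optionally also sbin dirs)?"""
    env = environ if environ is not None else os.environ
    path = env.get("PATH", "/usr/bin:/bin").split(":")
    if check_sbin:
        # derive /usr/sbin from /usr/bin, /sbin from /bin, and so on
        path += [d[:-3] + "sbin" for d in path if d.endswith("/bin")]
    return any(os.access(os.path.join(d, command), os.X_OK) for d in path)

# Demonstrate the sbin fallback with a throwaway directory layout.
base = tempfile.mkdtemp()
bindir = os.path.join(base, "bin")
sbindir = os.path.join(base, "sbin")
os.mkdir(bindir)
os.mkdir(sbindir)
tool = os.path.join(sbindir, "demo-tool")
with open(tool, "w", encoding="utf-8") as fh:
    fh.write("#!/bin/sh\n")
os.chmod(tool, 0o755)

found_plain = is_command_available("demo-tool", environ={"PATH": bindir})
found_sbin = is_command_available("demo-tool", check_sbin=True, environ={"PATH": bindir})
```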
@ -303,7 +303,7 @@ def _download_and_change_into(task, dsc_file, patch, branch):
extract_source(dsc_file, Logger.isEnabledFor(logging.DEBUG))
# change directory
directory = f"{task.package}-{task.get_version().upstream_version}"
directory = task.package + "-" + task.get_version().upstream_version
Logger.debug("cd %s", directory)
os.chdir(directory)


@ -1,33 +0,0 @@
# Copyright (C) 2024 Canonical Ltd.
# Author: Chris Peterson <chris.peterson@canonical.com>
#
# Permission to use, copy, modify, and/or distribute this software for any
# purpose with or without fee is hereby granted, provided that the above
# copyright notice and this permission notice appear in all copies.
#
# THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH
# REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY
# AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY SPECIAL, DIRECT,
# INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM
# LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR
# OTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR
# PERFORMANCE OF THIS SOFTWARE.
import unittest
# Binary Tests
class BinaryTests(unittest.TestCase):
# The requestsync binary has the option of using the launchpad api
# to log in but requires python3-keyring in addition to
# python3-launchpadlib. Testing the integrated login functionality
# automatically isn't very feasible, but we can at least write a smoke
# test to make sure the required packages are installed.
# See LP: #2049217
def test_keyring_installed(self):
"""Smoke test for required lp api dependencies"""
try:
import keyring # noqa: F401
except ModuleNotFoundError:
raise ModuleNotFoundError("package python3-keyring is not installed")


@ -1,17 +1,18 @@
# Copyright (C) 2024 Canonical Ltd.
# Author: Chris Peterson <chris.peterson@canonical.com>
#
# Permission to use, copy, modify, and/or distribute this software for any
# purpose with or without fee is hereby granted, provided that the above
# copyright notice and this permission notice appear in all copies.
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as
# published by the Free Software Foundation, either version 3 of the
# License, or (at your option) any later version.
#
# THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH
# REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY
# AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY SPECIAL, DIRECT,
# INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM
# LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR
# OTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR
# PERFORMANCE OF THIS SOFTWARE.
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
""" Tests for running_autopkgtests
Tests using cached data from autopkgtest servers.
@ -32,17 +33,8 @@ from ubuntutools.running_autopkgtests import (
)
# Cached binary response data from autopkgtest server
RUN_DATA = (
b'{"pyatem": {'
b" \"submit-time_2024-01-19 19:37:36;triggers_['python3-defaults/3.12.1-0ubuntu1'];\":"
b' {"noble": {"arm64": [{"triggers": ["python3-defaults/3.12.1-0ubuntu1"],'
b' "submit-time": "2024-01-19 19:37:36"}, 380, "<omitted log>"]}}}}'
)
QUEUED_DATA = (
b'{"ubuntu": {"noble": {"arm64": ["libobject-accessor-perl {\\"requester\\": \\"someone\\",'
b' \\"submit-time\\": \\"2024-01-18 01:08:55\\",'
b' \\"triggers\\": [\\"perl/5.38.2-3\\", \\"liblocale-gettext-perl/1.07-6build1\\"]}"]}}}'
)
RUN_DATA = b'{"pyatem": { "submit-time_2024-01-19 19:37:36;triggers_[\'python3-defaults/3.12.1-0ubuntu1\'];": {"noble": {"arm64": [{"triggers": ["python3-defaults/3.12.1-0ubuntu1"], "submit-time": "2024-01-19 19:37:36"}, 380, "<omitted log>"]}}}}'
QUEUED_DATA = b'{"ubuntu": {"noble": {"arm64": ["libobject-accessor-perl {\\"requester\\": \\"someone\\", \\"submit-time\\": \\"2024-01-18 01:08:55\\", \\"triggers\\": [\\"perl/5.38.2-3\\", \\"liblocale-gettext-perl/1.07-6build1\\"]}"]}}}'
# Expected result(s) of parsing the above JSON data
RUNNING_JOB = {
@ -66,9 +58,7 @@ QUEUED_JOB = {
"ubuntu": {
"noble": {
"arm64": [
'libobject-accessor-perl {"requester": "someone",'
' "submit-time": "2024-01-18 01:08:55",'
' "triggers": ["perl/5.38.2-3", "liblocale-gettext-perl/1.07-6build1"]}'
'libobject-accessor-perl {"requester": "someone", "submit-time": "2024-01-18 01:08:55", "triggers": ["perl/5.38.2-3", "liblocale-gettext-perl/1.07-6build1"]}',
]
}
}
@@ -79,18 +69,9 @@ PRIVATE_JOB = {"ppa": {"noble": {"arm64": ["private job"]}}}
# Expected textual output of the program based on the above data
RUNNING_OUTPUT = (
"R 0:06:20 pyatem - noble arm64"
" - python3-defaults/3.12.1-0ubuntu1 -\n"
)
QUEUED_OUTPUT = (
"Q0001 -:-- libobject-accessor-perl ubuntu noble arm64"
" - perl/5.38.2-3,liblocale-gettext-perl/1.07-6build1\n"
)
PRIVATE_OUTPUT = (
"Q0001 -:-- private job ppa noble arm64"
" private job private job\n"
)
RUNNING_OUTPUT = "R 0:06:20 pyatem - noble arm64 - python3-defaults/3.12.1-0ubuntu1 -\n"
QUEUED_OUTPUT = "Q0001 -:-- libobject-accessor-perl ubuntu noble arm64 - perl/5.38.2-3,liblocale-gettext-perl/1.07-6build1\n"
PRIVATE_OUTPUT = "Q0001 -:-- private job ppa noble arm64 private job private job\n"
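For reference, the doubly-encoded queue format exercised by this test data can be unpacked as follows. This is a sketch, not code from the diff: each queue entry is a string of the form `<package> <json-blob>`, so the blob has to be split off and decoded a second time.

```python
import json

# Same shape as the QUEUED_DATA fixture above: JSON whose innermost
# array holds "<package> <escaped-json>" strings.
QUEUED_DATA = (
    b'{"ubuntu": {"noble": {"arm64": ["libobject-accessor-perl {\\"requester\\": \\"someone\\",'
    b' \\"submit-time\\": \\"2024-01-18 01:08:55\\",'
    b' \\"triggers\\": [\\"perl/5.38.2-3\\", \\"liblocale-gettext-perl/1.07-6build1\\"]}"]}}}'
)

queues = json.loads(QUEUED_DATA)
entry = queues["ubuntu"]["noble"]["arm64"][0]

# Split once on the first space: package name, then a nested JSON document.
package, _, blob = entry.partition(" ")
details = json.loads(blob)
```

After this, `details["triggers"]` holds the trigger list that the expected `QUEUED_OUTPUT` string joins with commas.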
class RunningAutopkgtestTestCase(unittest.TestCase):


@@ -72,17 +72,17 @@ class Control:
def set_maintainer(self, maintainer):
"""Sets the value of the Maintainer field."""
pattern = re.compile("^Maintainer: ?.*$", re.MULTILINE)
self._content = pattern.sub(f"Maintainer: {maintainer}", self._content)
self._content = pattern.sub("Maintainer: " + maintainer, self._content)
def set_original_maintainer(self, original_maintainer):
"""Sets the value of the XSBC-Original-Maintainer field."""
original_maintainer = f"XSBC-Original-Maintainer: {original_maintainer}"
original_maintainer = "XSBC-Original-Maintainer: " + original_maintainer
if self.get_original_maintainer():
pattern = re.compile("^(?:[XSBC]*-)?Original-Maintainer:.*$", re.MULTILINE)
self._content = pattern.sub(original_maintainer, self._content)
else:
pattern = re.compile("^(Maintainer:.*)$", re.MULTILINE)
self._content = pattern.sub(f"\\1\\n{original_maintainer}", self._content)
self._content = pattern.sub(r"\1\n" + original_maintainer, self._content)
def remove_original_maintainer(self):
"""Strip out the XSBC-Original-Maintainer line"""
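Both sides of this hunk perform the same two substitutions; only the string formatting differs. A minimal standalone sketch of the effect (the maintainer values here are illustrative, not from the diff):

```python
import re

content = "Source: hello\nMaintainer: Jane Doe <jane@example.com>\n"

# set_maintainer: replace the Maintainer field in place.
content = re.sub(
    "^Maintainer: ?.*$",
    "Maintainer: Ubuntu Developers <ubuntu-devel-discuss@lists.ubuntu.com>",
    content,
    flags=re.MULTILINE,
)

# set_original_maintainer (no prior field present): insert an
# XSBC-Original-Maintainer line directly after the Maintainer line,
# using a backreference to keep the Maintainer line intact.
content = re.sub(
    "^(Maintainer:.*)$",
    r"\1\nXSBC-Original-Maintainer: Jane Doe <jane@example.com>",
    content,
    flags=re.MULTILINE,
)
```

The f-string and `"..." + maintainer` variants shown in the hunk build identical replacement strings, so the regex behaviour is unchanged by the backport.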


@@ -15,44 +15,47 @@
"""Portions of archive related code that is re-used by various tools."""
from datetime import datetime
import os
import re
import urllib.request
from datetime import datetime
import dateutil.parser
from dateutil.tz import tzutc
def get_cache_dir():
cache_dir = os.environ.get("XDG_CACHE_HOME", os.path.expanduser(os.path.join("~", ".cache")))
uat_cache = os.path.join(cache_dir, "ubuntu-archive-tools")
cache_dir = os.environ.get('XDG_CACHE_HOME',
os.path.expanduser(os.path.join('~', '.cache')))
uat_cache = os.path.join(cache_dir, 'ubuntu-archive-tools')
os.makedirs(uat_cache, exist_ok=True)
return uat_cache
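Both variants of `get_cache_dir` resolve the same path; only the quoting style changes. A sketch of the lookup with the environment made injectable for testing (and without the `makedirs` side effect — that part is assumed away here):

```python
import os

def get_cache_dir(environ=os.environ):
    """Resolve the ubuntu-archive-tools cache dir per the XDG spec."""
    # Prefer $XDG_CACHE_HOME, falling back to ~/.cache as the spec requires.
    cache_dir = environ.get(
        "XDG_CACHE_HOME", os.path.expanduser(os.path.join("~", ".cache"))
    )
    return os.path.join(cache_dir, "ubuntu-archive-tools")
```

The real function additionally calls `os.makedirs(..., exist_ok=True)` before returning, as shown in the hunk.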
def get_url(url, force_cached):
"""Return file to the URL, possibly caching it"""
''' Return file to the URL, possibly caching it
'''
cache_file = None
# ignore bileto urls wrt caching, they're usually too small to matter
# and we don't do proper cache expiry
m = re.search("ubuntu-archive-team.ubuntu.com/proposed-migration/([^/]*)/([^/]*)", url)
m = re.search('ubuntu-archive-team.ubuntu.com/proposed-migration/'
'([^/]*)/([^/]*)',
url)
if m:
cache_dir = get_cache_dir()
cache_file = os.path.join(cache_dir, f"{m.group(1)}_{m.group(2)}")
cache_file = os.path.join(cache_dir, '%s_%s' % (m.group(1), m.group(2)))
else:
# test logs can be cached, too
m = re.search(
"https://autopkgtest.ubuntu.com/results/autopkgtest-[^/]*/([^/]*)/([^/]*)"
"/[a-z0-9]*/([^/]*)/([_a-f0-9]*)@/log.gz",
url,
)
'https://autopkgtest.ubuntu.com/results/autopkgtest-[^/]*/([^/]*)/([^/]*)'
'/[a-z0-9]*/([^/]*)/([_a-f0-9]*)@/log.gz',
url)
if m:
cache_dir = get_cache_dir()
cache_file = os.path.join(
cache_dir, f"{m.group(1)}_{m.group(2)}_{m.group(3)}_{m.group(4)}.gz"
)
cache_dir, '%s_%s_%s_%s.gz' % (
m.group(1), m.group(2), m.group(3), m.group(4)))
if cache_file:
try:
@@ -62,18 +65,18 @@ def get_url(url, force_cached):
prev_timestamp = datetime.fromtimestamp(prev_mtime, tz=tzutc())
new_timestamp = datetime.now(tz=tzutc()).timestamp()
if force_cached:
return open(cache_file, "rb")
return open(cache_file, 'rb')
f = urllib.request.urlopen(url)
if cache_file:
remote_ts = dateutil.parser.parse(f.headers["last-modified"])
remote_ts = dateutil.parser.parse(f.headers['last-modified'])
if remote_ts > prev_timestamp:
with open(f"{cache_file}.new", "wb") as new_cache:
with open('%s.new' % cache_file, 'wb') as new_cache:
for line in f:
new_cache.write(line)
os.rename(f"{cache_file}.new", cache_file)
os.rename('%s.new' % cache_file, cache_file)
os.utime(cache_file, times=(new_timestamp, new_timestamp))
f.close()
f = open(cache_file, "rb")
f = open(cache_file, 'rb')
return f
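The freshness test at the heart of `get_url` — re-download only when the server's `Last-Modified` is newer than the cache file's mtime — can be sketched as a pure function. This sketch uses only the standard library (`email.utils` in place of the `python-dateutil` parser the diff uses); the function name is illustrative:

```python
import email.utils
from datetime import datetime, timezone

def cache_is_fresh(cache_mtime: float, last_modified: str) -> bool:
    """Return True if the cached copy is at least as new as the remote file.

    cache_mtime: the cache file's st_mtime (seconds since the epoch).
    last_modified: the HTTP Last-Modified header value.
    """
    remote_ts = email.utils.parsedate_to_datetime(last_modified)
    local_ts = datetime.fromtimestamp(cache_mtime, tz=timezone.utc)
    return local_ts >= remote_ts
```

When this returns False, the code above writes the response to `<cache_file>.new` and renames it over the old copy, so a partially written download can never clobber a valid cache entry.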