2015-04-25 17:20:49 +02:00
#!/usr/bin/python3 -u
2006-08-18 22:50:52 +00:00
# -*- coding: utf-8 -*-
2006-06-17 13:45:56 +00:00
2008-04-29 05:08:58 +00:00
# Copyright (C) 2001-2008 Anthony Towns <ajt@debian.org>
2006-06-17 13:45:56 +00:00
# Andreas Barth <aba@debian.org>
# Fabio Tranchitella <kobold@debian.org>
2013-09-07 18:21:00 +00:00
# Copyright (C) 2010-2013 Adam D. Barratt <adsb@debian.org>
2006-06-17 13:45:56 +00:00
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
2006-06-24 17:49:43 +00:00
"""
2015-04-25 15:59:02 +02:00
= Introduction =
2006-06-24 17:49:43 +00:00
This is the Debian testing updater script , also known as " Britney " .
Packages are usually installed into the ` testing ' distribution after
they have undergone some degree of testing in unstable . The goal of
this software is to do this task in a smart way , allowing testing
2013-09-18 20:40:00 +00:00
to always be fully installable and close to being a release candidate .
2006-06-25 10:05:35 +00:00
2010-02-28 12:58:03 +00:00
Britney ' s source code is split between two different but related tasks:
2006-06-25 10:05:35 +00:00
the first one is the generation of the update excuses , while the
second tries to update testing with the valid candidates ; first
each package alone , then larger and even larger sets of packages
together . Each try is accepted if testing is not more uninstallable
after the update than before .
= Data Loading =
In order to analyze the entire Debian distribution , Britney needs to
load in memory the whole archive : this means more than 10.000 packages
2010-02-28 12:58:03 +00:00
for twelve architectures , as well as the dependency interconnections
between them . For this reason , the memory requirements for running this
2006-06-25 10:05:35 +00:00
software are quite high and at least 1 gigabyte of RAM should be available .
Britney loads the source packages from the ` Sources ' file and the binary
packages from the ` Packages_ $ { arch } ' files, where $ {arch} is substituted
with the supported architectures . While loading the data , the software
2010-02-28 12:58:03 +00:00
analyzes the dependencies and builds a directed weighted graph in memory
2006-06-25 10:05:35 +00:00
with all the interconnections between the packages ( see Britney . read_sources
and Britney . read_binaries ) .
Other than source and binary packages , Britney loads the following data :
2010-02-23 19:27:32 +00:00
* BugsV , which contains the list of release - critical bugs for a given
2016-03-26 08:16:39 +00:00
version of a source or binary package ( see RCBugPolicy . read_bugs ) .
2006-06-25 10:05:35 +00:00
* Dates , which contains the date of the upload of a given version
of a source package ( see Britney . read_dates ) .
* Urgencies , which contains the urgency of the upload of a given
2016-03-25 15:23:34 +00:00
version of a source package ( see AgePolicy . _read_urgencies ) .
2006-06-25 10:05:35 +00:00
* Hints , which contains lists of commands which modify the standard behaviour
of Britney ( see Britney . read_hints ) .
For a more detailed explanation about the format of these files , please read
the documentation of the related methods . The exact meaning of them will be
instead explained in the chapter " Excuses Generation " .
= Excuses =
An excuse is a detailed explanation of why a package can or cannot
2016-07-02 20:28:07 +00:00
be updated in the testing distribution from a newer package in
2006-06-25 10:05:35 +00:00
another distribution ( like for example unstable ) . The main purpose
of the excuses is to be written in an HTML file which will be
published over HTTP . The maintainers will be able to parse it manually
or automatically to find the explanation of why their packages have
been updated or not .
== Excuses generation ==
These are the steps ( with references to method names ) that Britney
does for the generation of the update excuses .
* If a source package is available in testing but it is not
present in unstable and no binary packages in unstable are
built from it , then it is marked for removal .
* Every source package in unstable and testing - proposed - updates ,
if already present in testing , is checked for binary - NMUs , new
or dropped binary packages in all the supported architectures
( see Britney . should_upgrade_srcarch ) . The steps to detect if an
upgrade is needed are :
1. If there is a ` remove ' hint for the source package, the package
is ignored : it will be removed and not updated .
2010-02-28 12:58:03 +00:00
2. For every binary package built from the new source , it checks
2013-09-18 20:40:00 +00:00
for unsatisfied dependencies , new binary packages and updated
binary packages ( binNMU ) , excluding the architecture - independent
ones , and packages not built from the same source .
2006-06-25 10:05:35 +00:00
2010-02-28 12:58:03 +00:00
3. For every binary package built from the old source , it checks
2006-06-25 10:05:35 +00:00
if it is still built from the new source ; if this is not true
and the package is not architecture - independent , the script
removes it from testing .
4. Finally , if there is something worth doing ( eg . a new or updated
binary package ) and nothing wrong it marks the source package
as " Valid candidate " , or " Not considered " if there is something
wrong which prevented the update .
* Every source package in unstable and testing - proposed - updates is
checked for upgrade ( see Britney . should_upgrade_src ) . The steps
to detect if an upgrade is needed are :
1. If the source package in testing is more recent the new one
is ignored .
2. If the source package doesn ' t exist (is fake), which means that
a binary package refers to it but it is not present in the
` Sources ' file, the new one is ignored.
3. If the package doesn ' t exist in testing, the urgency of the
upload is ignored and set to the default ( actually ` low ' ).
4. If there is a ` remove ' hint for the source package, the package
is ignored : it will be removed and not updated .
5. If there is a ` block ' hint for the source package without an
` unblock ` hint or a ` block - all source ` , the package is ignored .
2009-08-15 16:39:00 +01:00
6. If there is a ` block - udeb ' hint for the source package, it will
have the same effect as ` block ' , but may only be cancelled by
a subsequent ` unblock - udeb ' hint.
2006-06-25 10:05:35 +00:00
7. If the suite is unstable , the update can go ahead only if the
2010-02-28 12:58:03 +00:00
upload happened more than the minimum days specified by the
2006-06-25 10:05:35 +00:00
urgency of the upload ; if this is not true , the package is
ignored as ` too - young ' . Note that the urgency is sticky, meaning
that the highest urgency uploaded since the previous testing
transition is taken into account .
2010-09-20 22:03:43 +00:00
8. If the suite is unstable , all the architecture - dependent binary
packages and the architecture - independent ones for the ` nobreakall '
architectures have to be built from the source we are considering .
If this is not true , then these are called ` out - of - date '
architectures and the package is ignored .
2006-06-25 10:05:35 +00:00
2013-09-18 20:40:00 +00:00
9. The source package must have at least one binary package , otherwise
2006-06-25 10:05:35 +00:00
it is ignored .
2010-02-23 19:27:32 +00:00
10. If the suite is unstable , the new source package must have no
release critical bugs which do not also apply to the testing
2006-06-25 10:05:35 +00:00
one . If this is not true , the package is ignored as ` buggy ' .
11. If there is a ` force ' hint for the source package, then it is
updated even if it is marked as ignored from the previous steps .
2011-08-28 21:25:38 +00:00
12. If the suite is { testing - , } proposed - updates , the source package can
2010-09-20 22:03:43 +00:00
be updated only if there is an explicit approval for it . Unless
a ` force ' hint exists, the new package must also be available
on all of the architectures for which it has binary packages in
testing .
2006-06-25 10:05:35 +00:00
13. If the package will be ignored , mark it as " Valid candidate " ,
otherwise mark it as " Not considered " .
* The list of ` remove ' hints is processed: if the requested source
package is not already being updated or removed and the version
actually in testing is the same specified with the ` remove ' hint,
it is marked for removal .
* The excuses are sorted by the number of days from the last upload
( days - old ) and by name .
* A list of unconsidered excuses ( for which the package is not upgraded )
2011-08-02 17:36:03 +00:00
is built . Using this list , all of the excuses depending on them are
marked as invalid " impossible dependencies " .
2006-06-25 10:05:35 +00:00
* The excuses are written in an HTML file .
2006-06-24 17:49:43 +00:00
"""
2016-10-23 10:30:01 +00:00
import optparse
2006-06-17 13:45:56 +00:00
import os
import sys
import time
2016-11-13 16:32:11 +00:00
from collections import defaultdict
2015-11-22 11:29:06 +01:00
from functools import reduce
2011-12-27 11:36:50 +01:00
from operator import attrgetter
2015-04-25 17:22:43 +02:00
from urllib . parse import quote
2011-12-27 11:36:50 +01:00
2016-10-23 10:30:01 +00:00
import apt_pkg
2016-09-25 05:45:36 +00:00
# Check the "check_field_name" reflection before removing an import here.
2016-11-13 16:32:11 +00:00
from britney2 import SuiteInfo , SourcePackage , BinaryPackageId , BinaryPackage
2016-10-23 10:30:01 +00:00
from britney2 . consts import ( SOURCE , SOURCEVER , ARCHITECTURE , CONFLICTS , DEPENDS , PROVIDES , MULTIARCH )
from britney2 . excuse import Excuse
from britney2 . hints import HintParser
2016-11-16 07:17:50 +00:00
from britney2 . installability . builder import build_installability_tester
2016-10-23 10:30:01 +00:00
from britney2 . migrationitem import MigrationItem
2017-11-01 21:09:23 +00:00
from britney2 . policies import PolicyVerdict
2017-11-01 21:32:03 +00:00
from britney2 . policies . policy import AgePolicy , RCBugPolicy , PiupartsPolicy , BuildDependsPolicy
2016-10-23 10:30:01 +00:00
from britney2 . utils import ( old_libraries_format , undo_changes ,
compute_reverse_tree , possibly_compressed ,
read_nuninst , write_nuninst , write_heidi ,
eval_uninst , newly_uninst , make_migrationitem ,
write_excuses , write_heidi_delta , write_controlfiles ,
old_libraries , is_nuninst_asgood_generous ,
clone_nuninst , check_installability ,
2016-09-24 07:38:07 +00:00
create_provides_map , read_release_file ,
2016-11-15 23:09:26 +00:00
read_sources_file , get_dependency_solvers ,
2016-11-16 07:38:29 +00:00
invalidate_excuses , compile_nuninst ,
2016-10-23 10:30:01 +00:00
)
2006-06-17 13:45:56 +00:00
2011-08-28 21:28:47 +00:00
__author__ = ' Fabio Tranchitella and the Debian Release Team '
__version__ = ' 2.0 '
2006-06-17 13:45:56 +00:00
2016-01-17 10:43:26 +00:00
# NB: ESSENTIAL deliberately skipped as the 2011 and 2012
# parts of the live-data tests require it (britney merges
# this field correctly from the unstable version where
# available)
check_field_name = dict ( ( globals ( ) [ fn ] , fn ) for fn in
(
" SOURCE SOURCEVER ARCHITECTURE MULTIARCH " +
" DEPENDS CONFLICTS PROVIDES "
) . split ( )
)
check_fields = sorted ( check_field_name )
2016-09-24 09:42:39 +00:00
2012-01-06 19:39:57 +01:00
class Britney ( object ) :
2010-02-28 12:58:03 +00:00
""" Britney, the Debian testing updater script
2006-06-24 17:49:43 +00:00
2010-02-28 12:58:03 +00:00
This is the script that updates the testing distribution . It is executed
2006-06-24 17:49:43 +00:00
each day after the installation of the updated packages . It generates the
` Packages ' files for the testing distribution, but it does so in an
2010-02-28 12:58:03 +00:00
intelligent manner ; it tries to avoid any inconsistency and to use only
2006-06-24 17:49:43 +00:00
non - buggy packages .
2006-06-17 13:45:56 +00:00
2006-06-24 17:49:43 +00:00
For more documentation on this script , please read the Developers Reference .
"""
2006-06-17 13:45:56 +00:00
2017-02-03 16:47:24 +00:00
HINTS_HELPERS = ( " easy " , " hint " , " remove " , " block " , " block-udeb " , " unblock " , " unblock-udeb " , " approve " ,
" remark " , " ignore-piuparts " , " ignore-rc-bugs " )
2008-01-19 17:24:06 +00:00
HINTS_STANDARD = ( " urgent " , " age-days " ) + HINTS_HELPERS
2016-07-02 20:28:07 +00:00
# ALL = {"force", "force-hint", "block-all"} | HINTS_STANDARD | registered policy hints (not covered above)
HINTS_ALL = ( ' ALL ' )
2006-06-17 13:45:56 +00:00
def __init__ ( self ) :
2006-06-24 17:49:43 +00:00
""" Class constructor
This method initializes and populates the data lists , which contain all
the information needed by the other methods of the class .
"""
# parse the command line arguments
2016-03-25 15:23:34 +00:00
self . policies = [ ]
2016-07-02 20:28:07 +00:00
self . _hint_parser = HintParser ( self )
2016-10-24 19:36:44 +00:00
self . suite_info = { }
2006-06-17 13:45:56 +00:00
self . __parse_arguments ( )
2011-11-11 13:30:53 +00:00
MigrationItem . set_architectures ( self . options . architectures )
2006-06-24 17:49:43 +00:00
# initialize the apt_pkg back-end
2006-06-17 13:45:56 +00:00
apt_pkg . init ( )
2012-01-04 21:36:25 +01:00
self . sources = { }
self . binaries = { }
2014-01-11 08:31:36 +01:00
self . all_selected = [ ]
2016-03-25 09:18:35 +00:00
self . excuses = { }
2014-01-11 08:31:36 +01:00
2013-10-07 19:52:43 +00:00
try :
2016-07-02 20:00:16 +00:00
self . read_hints ( self . options . hintsdir )
2013-10-07 19:52:43 +00:00
except AttributeError :
2016-10-24 19:36:44 +00:00
self . read_hints ( os . path . join ( self . suite_info [ ' unstable ' ] . path , ' Hints ' ) )
2006-06-24 17:49:43 +00:00
2012-02-18 18:51:05 +01:00
if self . options . nuninst_cache :
2016-03-25 07:43:46 +00:00
self . log ( " Not building the list of non-installable packages, as requested " , type = " I " )
2012-02-18 18:51:05 +01:00
if self . options . print_uninst :
2016-11-22 20:55:48 +00:00
nuninst = self . get_nuninst ( build = False )
2015-04-25 16:03:54 +02:00
print ( ' * summary ' )
print ( ' \n ' . join ( ' %4d %s ' % ( len ( nuninst [ x ] ) , x ) for x in self . options . architectures ) )
2012-02-18 18:51:05 +01:00
return
2006-08-18 22:50:52 +00:00
2016-01-17 10:43:26 +00:00
self . all_binaries = { }
2006-06-24 17:49:43 +00:00
# read the source and binary packages for the involved distributions
2016-10-24 19:36:44 +00:00
self . sources [ ' testing ' ] = self . read_sources ( self . suite_info [ ' testing ' ] . path )
self . sources [ ' unstable ' ] = self . read_sources ( self . suite_info [ ' unstable ' ] . path )
2016-06-01 16:38:27 +00:00
for suite in ( ' tpu ' , ' pu ' ) :
if hasattr ( self . options , suite ) :
self . sources [ suite ] = self . read_sources ( getattr ( self . options , suite ) )
else :
self . sources [ suite ] = { }
2012-01-04 21:36:25 +01:00
2016-01-17 18:49:35 +00:00
self . binaries [ ' testing ' ] = { }
2012-01-04 21:36:25 +01:00
self . binaries [ ' unstable ' ] = { }
self . binaries [ ' tpu ' ] = { }
self . binaries [ ' pu ' ] = { }
2016-10-24 19:36:44 +00:00
self . binaries [ ' unstable ' ] = self . read_binaries ( self . suite_info [ ' unstable ' ] . path , " unstable " , self . options . architectures )
2016-09-24 07:00:36 +00:00
for suite in ( ' tpu ' , ' pu ' ) :
2016-10-24 19:36:44 +00:00
if suite in self . suite_info :
self . binaries [ suite ] = self . read_binaries ( self . suite_info [ suite ] . path , suite , self . options . architectures )
2016-09-24 07:00:36 +00:00
else :
# _build_installability_tester relies on this being
# properly initialised, so insert two empty dicts
# here.
for arch in self . options . architectures :
2016-06-01 16:38:27 +00:00
self . binaries [ suite ] [ arch ] = ( { } , { } )
2016-09-24 07:00:36 +00:00
# Load testing last as some live-data tests have more complete information in
# unstable
2016-10-24 19:36:44 +00:00
self . binaries [ ' testing ' ] = self . read_binaries ( self . suite_info [ ' testing ' ] . path , " testing " , self . options . architectures )
2015-04-25 23:53:57 +10:00
2016-05-07 08:56:41 +00:00
try :
constraints_file = os . path . join ( self . options . static_input_dir , ' constraints ' )
2016-05-07 12:36:49 +00:00
faux_packages = os . path . join ( self . options . static_input_dir , ' faux-packages ' )
2016-05-07 08:56:41 +00:00
except AttributeError :
self . log ( " The static_input_dir option is not set " , type = ' I ' )
constraints_file = None
2016-05-07 12:36:49 +00:00
faux_packages = None
if faux_packages is not None and os . path . exists ( faux_packages ) :
self . log ( " Loading faux packages from %s " % faux_packages , type = ' I ' )
self . _load_faux_packages ( faux_packages )
2016-05-26 17:48:19 +00:00
elif faux_packages is not None :
2016-05-07 12:36:49 +00:00
self . log ( " No Faux packages as %s does not exist " % faux_packages , type = ' I ' )
2016-05-07 08:56:41 +00:00
if constraints_file is not None and os . path . exists ( constraints_file ) :
self . log ( " Loading constraints from %s " % constraints_file , type = ' I ' )
self . constraints = self . _load_constraints ( constraints_file )
else :
if constraints_file is not None :
self . log ( " No constraints as %s does not exist " % constraints_file , type = ' I ' )
self . constraints = {
' keep-installable ' : [ ] ,
}
2016-03-25 07:43:46 +00:00
self . log ( " Compiling Installability tester " , type = " I " )
2016-11-16 07:17:50 +00:00
self . _inst_tester = build_installability_tester ( self . binaries , self . options . architectures )
2006-06-24 17:49:43 +00:00
2012-02-18 18:51:05 +01:00
if not self . options . nuninst_cache :
2016-03-25 07:43:46 +00:00
self . log ( " Building the list of non-installable packages for the full archive " , type = " I " )
2011-12-13 18:52:01 +01:00
self . _inst_tester . compute_testing_installability ( )
2016-11-16 07:38:29 +00:00
nuninst = self . get_nuninst ( build = True )
2012-02-18 18:51:05 +01:00
for arch in self . options . architectures :
2016-03-25 07:43:46 +00:00
self . log ( " > Found %d non-installable packages " % len ( nuninst [ arch ] ) , type = " I " )
2012-02-18 18:51:05 +01:00
if self . options . print_uninst :
self . nuninst_arch_report ( nuninst , arch )
if self . options . print_uninst :
2015-04-25 16:03:54 +02:00
print ( ' * summary ' )
print ( ' \n ' . join ( map ( lambda x : ' %4d %s ' % ( len ( nuninst [ x ] ) , x ) , self . options . architectures ) ) )
2012-02-18 18:51:05 +01:00
return
else :
write_nuninst ( self . options . noninst_status , nuninst )
2015-04-05 12:21:49 +02:00
stats = self . _inst_tester . compute_stats ( )
2016-03-25 07:43:46 +00:00
self . log ( " > Installability tester statistics (per architecture) " , type = " I " )
2015-04-05 12:21:49 +02:00
for arch in self . options . architectures :
arch_stat = stats [ arch ]
2016-03-25 07:43:46 +00:00
self . log ( " > %s " % arch , type = " I " )
2015-04-05 12:21:49 +02:00
for stat in arch_stat . stat_summary ( ) :
2016-03-25 07:43:46 +00:00
self . log ( " > - %s " % stat , type = " I " )
2015-04-05 12:21:49 +02:00
2016-03-25 15:23:34 +00:00
for policy in self . policies :
policy . hints = self . hints
policy . initialise ( self )
2006-06-17 13:45:56 +00:00
2016-01-17 10:43:26 +00:00
def merge_pkg_entries ( self , package , parch , pkg_entry1 , pkg_entry2 ,
check_fields = check_fields , check_field_name = check_field_name ) :
bad = [ ]
for f in check_fields :
2016-11-13 08:49:46 +00:00
if pkg_entry1 [ f ] != pkg_entry2 [ f ] : # pragma: no cover
2016-01-17 10:43:26 +00:00
bad . append ( ( f , pkg_entry1 [ f ] , pkg_entry2 [ f ] ) )
2016-11-13 08:49:46 +00:00
if bad : # pragma: no cover
2016-03-25 07:43:46 +00:00
self . log ( " Mismatch found %s %s %s differs " % (
2016-04-06 20:49:40 +00:00
package , pkg_entry1 . version , parch ) , type = " E " )
2016-01-17 10:43:26 +00:00
for f , v1 , v2 in bad :
2016-03-25 07:43:46 +00:00
self . log ( " ... %s %s != %s " % ( check_field_name [ f ] , v1 , v2 ) )
2016-01-17 10:43:26 +00:00
raise ValueError ( " Invalid data set " )
# Merge ESSENTIAL if necessary
2016-04-06 20:49:40 +00:00
assert pkg_entry1 . is_essential or not pkg_entry2 . is_essential
2015-04-25 23:53:57 +10:00
2006-06-17 13:45:56 +00:00
def __parse_arguments ( self ) :
2006-06-24 17:49:43 +00:00
""" Parse the command line arguments
This method parses and initializes the command line arguments .
While doing so , it preprocesses some of the options to be converted
in a suitable form for the other methods of the class .
"""
# initialize the parser
2011-10-19 12:14:02 +02:00
parser = optparse . OptionParser ( version = " % prog " )
parser . add_option ( " -v " , " " , action = " count " , dest = " verbose " , help = " enable verbose output " )
parser . add_option ( " -c " , " --config " , action = " store " , dest = " config " , default = " /etc/britney.conf " ,
2006-07-28 13:21:44 +00:00
help = " path for the configuration file " )
2011-10-19 12:14:02 +02:00
parser . add_option ( " " , " --architectures " , action = " store " , dest = " architectures " , default = None ,
2006-07-28 13:21:44 +00:00
help = " override architectures from configuration file " )
2011-10-19 12:14:02 +02:00
parser . add_option ( " " , " --actions " , action = " store " , dest = " actions " , default = None ,
2006-07-28 13:21:44 +00:00
help = " override the list of actions to be performed " )
2011-10-19 12:14:02 +02:00
parser . add_option ( " " , " --hints " , action = " store " , dest = " hints " , default = None ,
2008-01-17 14:27:36 +00:00
help = " additional hints, separated by semicolons " )
2011-10-19 12:14:02 +02:00
parser . add_option ( " " , " --hint-tester " , action = " store_true " , dest = " hint_tester " , default = None ,
2008-01-15 16:09:51 +00:00
help = " provide a command line interface to test hints " )
2011-10-19 12:14:02 +02:00
parser . add_option ( " " , " --dry-run " , action = " store_true " , dest = " dry_run " , default = False ,
2006-08-06 09:29:32 +00:00
help = " disable all outputs to the testing directory " )
2011-10-19 12:14:02 +02:00
parser . add_option ( " " , " --control-files " , action = " store_true " , dest = " control_files " , default = False ,
2006-08-03 11:41:39 +00:00
help = " enable control files generation " )
2011-10-19 12:14:02 +02:00
parser . add_option ( " " , " --nuninst-cache " , action = " store_true " , dest = " nuninst_cache " , default = False ,
2006-08-07 14:38:13 +00:00
help = " do not build the non-installability status, use the cache from file " )
2011-10-19 12:14:02 +02:00
parser . add_option ( " " , " --print-uninst " , action = " store_true " , dest = " print_uninst " , default = False ,
2008-05-02 13:57:37 +00:00
help = " just print a summary of uninstallable packages " )
2015-04-27 02:13:27 +10:00
parser . add_option ( " " , " --components " , action = " store " , dest = " components " ,
help = " Sources/Packages are laid out by components listed (, sep) " )
2017-04-06 11:45:59 +00:00
parser . add_option ( " " , " --compute-migrations " , action = " store_true " , dest = " compute_migrations " , default = True ,
help = " Compute which packages can migrate (the default) " )
parser . add_option ( " " , " --no-compute-migrations " , action = " store_false " , dest = " compute_migrations " ,
help = " Do not compute which packages can migrate. " )
2011-10-19 12:14:02 +02:00
( self . options , self . args ) = parser . parse_args ( )
2008-04-29 05:08:58 +00:00
# integrity checks
2016-11-13 08:49:46 +00:00
if self . options . nuninst_cache and self . options . print_uninst : # pragma: no cover
2016-03-25 07:43:46 +00:00
self . log ( " nuninst_cache and print_uninst are mutually exclusive! " , type = " E " )
2008-04-29 05:08:58 +00:00
sys . exit ( 1 )
2015-04-27 19:14:08 +00:00
# if the configuration file exists, then read it and set the additional options
2016-11-13 08:49:46 +00:00
elif not os . path . isfile ( self . options . config ) : # pragma: no cover
2016-03-25 07:43:46 +00:00
self . log ( " Unable to read the configuration file ( %s ), exiting! " % self . options . config , type = " E " )
2006-06-17 13:45:56 +00:00
sys . exit ( 1 )
2006-06-24 17:49:43 +00:00
# minimum days for unstable-testing transition and the list of hints
# are handled as an ad-hoc case
2016-03-25 15:23:34 +00:00
MINDAYS = { }
2016-07-02 20:28:07 +00:00
2008-01-17 14:27:36 +00:00
self . HINTS = { ' command-line ' : self . HINTS_ALL }
2015-04-26 00:04:50 +02:00
with open ( self . options . config , encoding = ' utf-8 ' ) as config :
2015-04-25 16:30:04 +02:00
for line in config :
if ' = ' in line and not line . strip ( ) . startswith ( ' # ' ) :
k , v = line . split ( ' = ' , 1 )
k = k . strip ( )
v = v . strip ( )
if k . startswith ( " MINDAYS_ " ) :
2016-03-25 15:23:34 +00:00
MINDAYS [ k . split ( " _ " ) [ 1 ] . lower ( ) ] = int ( v )
2015-04-25 16:30:04 +02:00
elif k . startswith ( " HINTS_ " ) :
self . HINTS [ k . split ( " _ " ) [ 1 ] . lower ( ) ] = \
reduce ( lambda x , y : x + y , [ hasattr ( self , " HINTS_ " + i ) and getattr ( self , " HINTS_ " + i ) or ( i , ) for i in v . split ( ) ] )
elif not hasattr ( self . options , k . lower ( ) ) or \
not getattr ( self . options , k . lower ( ) ) :
setattr ( self . options , k . lower ( ) , v )
2016-10-24 19:36:44 +00:00
for suite in ( ' testing ' , ' unstable ' , ' pu ' , ' tpu ' ) :
suffix = suite if suite in { ' pu ' , ' tpu ' } else ' '
if hasattr ( self . options , suite ) :
suite_path = getattr ( self . options , suite )
self . suite_info [ suite ] = SuiteInfo ( name = suite , path = suite_path , excuses_suffix = suffix )
else :
2016-11-13 08:49:46 +00:00
if suite in { ' testing ' , ' unstable ' } : # pragma: no cover
2016-10-24 19:36:44 +00:00
self . log ( " Mandatory configuration %s is not set in the config " % suite . upper ( ) , type = ' E ' )
sys . exit ( 1 )
self . log ( " Optional suite %s is not defined (config option: %s ) " % ( suite , suite . upper ( ) ) )
2016-10-23 12:28:12 +00:00
try :
2016-10-24 19:36:44 +00:00
release_file = read_release_file ( self . suite_info [ ' testing ' ] . path )
2016-10-23 12:28:12 +00:00
self . log ( " Found a Release file in testing - using that for defaults " )
except FileNotFoundError :
self . log ( " Testing does not have a Release file. " )
release_file = None
2006-06-24 17:49:43 +00:00
2015-04-27 02:13:27 +10:00
if getattr ( self . options , " components " , None ) :
self . options . components = [ s . strip ( ) for s in self . options . components . split ( " , " ) ]
2016-10-23 12:28:12 +00:00
elif release_file and not self . options . control_files :
self . options . components = release_file [ ' Components ' ] . split ( )
self . log ( " Using components listed in Release file: %s " % ' ' . join ( self . options . components ) )
2015-04-27 02:13:27 +10:00
else :
self . options . components = None
2016-11-13 08:49:46 +00:00
if self . options . control_files and self . options . components : # pragma: no cover
2016-03-24 15:08:18 +00:00
# We cannot regenerate the control files correctly when reading from an
# actual mirror (we don't which package goes in what component etc.).
2016-03-25 07:43:46 +00:00
self . log ( " Cannot use --control-files with mirror-layout (components)! " , type = " E " )
2016-03-24 15:08:18 +00:00
sys . exit ( 1 )
2014-01-11 08:31:36 +01:00
if not hasattr ( self . options , " heidi_delta_output " ) :
self . options . heidi_delta_output = self . options . heidi_output + " Delta "
2015-05-27 22:34:42 +02:00
self . options . nobreakall_arches = self . options . nobreakall_arches . split ( )
2013-09-09 12:37:52 +01:00
self . options . outofsync_arches = self . options . outofsync_arches . split ( )
2015-05-27 22:34:42 +02:00
self . options . break_arches = self . options . break_arches . split ( )
self . options . new_arches = self . options . new_arches . split ( )
2016-10-23 12:28:12 +00:00
if getattr ( self . options , " architectures " , None ) :
# Sort the architecture list
allarches = sorted ( self . options . architectures . split ( ) )
else :
2016-11-13 08:49:46 +00:00
if not release_file : # pragma: no cover
2016-10-23 12:28:12 +00:00
self . log ( " No configured architectures and there is no release file for testing " , type = " E " )
2016-10-24 19:36:44 +00:00
self . log ( " Please check if there is a \" Release \" file in %s " % self . suite_info [ ' testing ' ] . path , type = " E " )
2016-10-23 12:28:12 +00:00
self . log ( " or if the config file contains a non-empty \" ARCHITECTURES \" field " , type = " E " )
sys . exit ( 1 )
allarches = sorted ( release_file [ ' Architectures ' ] . split ( ) )
self . log ( " Using architectures listed in Release file: %s " % ' ' . join ( allarches ) )
2015-05-27 22:34:42 +02:00
arches = [ x for x in allarches if x in self . options . nobreakall_arches ]
2013-09-09 12:37:52 +01:00
arches + = [ x for x in allarches if x not in arches and x not in self . options . outofsync_arches ]
2015-05-27 22:34:42 +02:00
arches + = [ x for x in allarches if x not in arches and x not in self . options . break_arches ]
arches + = [ x for x in allarches if x not in arches and x not in self . options . new_arches ]
2006-06-19 19:54:21 +00:00
arches + = [ x for x in allarches if x not in arches ]
2015-04-25 17:22:43 +02:00
self . options . architectures = [ sys . intern ( arch ) for arch in arches ]
2006-08-06 09:29:32 +00:00
self . options . smooth_updates = self . options . smooth_updates . split ( )
2006-06-17 13:45:56 +00:00
2015-09-06 15:21:30 +00:00
if not hasattr ( self . options , ' ignore_cruft ' ) or \
self . options . ignore_cruft == " 0 " :
self . options . ignore_cruft = False
2015-05-27 22:34:42 +02:00
2016-10-24 19:36:44 +00:00
self . policies . append ( AgePolicy ( self . options , self . suite_info , MINDAYS ) )
self . policies . append ( RCBugPolicy ( self . options , self . suite_info ) )
2016-11-26 17:11:14 +00:00
self . policies . append ( PiupartsPolicy ( self . options , self . suite_info ) )
2017-11-01 21:32:03 +00:00
self . policies . append ( BuildDependsPolicy ( self . options , self . suite_info ) )
2016-03-25 15:23:34 +00:00
2016-07-02 21:38:52 +00:00
for policy in self . policies :
policy . register_hints ( self . _hint_parser )
2016-07-02 20:00:16 +00:00
@property
def hints ( self ) :
return self . _hint_parser . hints
2016-03-25 07:43:46 +00:00
def log ( self , msg , type = " I " ) :
2006-06-24 17:49:43 +00:00
""" Print info messages according to verbosity level
An easy - and - simple log method which prints messages to the standard
output . The type parameter controls the urgency of the message , and
can be equal to ` I ' for `Information ' , ` W ' for `Warning ' and ` E ' for
2010-02-28 12:58:03 +00:00
` Error ' . Warnings and errors are always printed, and information is
2013-09-18 20:40:00 +00:00
printed only if verbose logging is enabled .
2006-06-24 17:49:43 +00:00
"""
2006-06-17 13:45:56 +00:00
if self . options . verbose or type in ( " E " , " W " ) :
2015-04-25 16:03:54 +02:00
print ( " %s : [ %s ] - %s " % ( type , time . asctime ( ) , msg ) )
2006-06-17 13:45:56 +00:00
2016-05-07 12:36:49 +00:00
def _load_faux_packages ( self , faux_packages_file ) :
""" Loads fake packages
In rare cases , it is useful to create a " fake " package that can be used to satisfy
dependencies . This is usually needed for packages that are not shipped directly
on this mirror but is a prerequisite for using this mirror ( e . g . some vendors provide
non - distributable " setup " packages and contrib / non - free packages depend on these ) .
: param faux_packages_file : Path to the file containing the fake package definitions
"""
tag_file = apt_pkg . TagFile ( faux_packages_file )
get_field = tag_file . section . get
step = tag_file . step
no = 0
while step ( ) :
no + = 1
pkg_name = get_field ( ' Package ' , None )
2016-11-13 08:49:46 +00:00
if pkg_name is None : # pragma: no cover
2016-05-21 13:38:52 +00:00
raise ValueError ( " Missing Package field in paragraph %d (file %s ) " % ( no , faux_packages_file ) )
2016-05-07 12:36:49 +00:00
pkg_name = sys . intern ( pkg_name )
version = sys . intern ( get_field ( ' Version ' , ' 1.0-1 ' ) )
provides_raw = get_field ( ' Provides ' )
archs_raw = get_field ( ' Architecture ' , None )
2016-05-21 13:39:09 +00:00
component = get_field ( ' Component ' , ' non-free ' )
2016-05-07 12:36:49 +00:00
if archs_raw :
archs = archs_raw . split ( )
else :
archs = self . options . architectures
faux_section = ' faux '
if component != ' main ' :
faux_section = " %s /faux " % component
2016-09-24 09:42:39 +00:00
src_data = SourcePackage ( version ,
2016-05-07 12:36:49 +00:00
sys . intern ( faux_section ) ,
[ ] ,
None ,
True ,
2017-11-01 21:32:03 +00:00
None
2016-09-24 09:42:39 +00:00
)
2016-05-07 12:36:49 +00:00
self . sources [ ' testing ' ] [ pkg_name ] = src_data
self . sources [ ' unstable ' ] [ pkg_name ] = src_data
for arch in archs :
2016-05-16 06:42:26 +00:00
pkg_id = BinaryPackageId ( pkg_name , version , arch )
2016-05-07 12:36:49 +00:00
if provides_raw :
provides = self . _parse_provides ( pkg_id , provides_raw )
else :
provides = [ ]
bin_data = BinaryPackage ( version ,
faux_section ,
pkg_name ,
version ,
arch ,
get_field ( ' Multi-Arch ' ) ,
None ,
None ,
provides ,
False ,
2016-05-16 06:24:36 +00:00
pkg_id ,
2016-05-07 12:36:49 +00:00
)
2016-09-25 05:45:36 +00:00
src_data . binaries . append ( pkg_id )
2016-05-07 12:36:49 +00:00
self . binaries [ ' testing ' ] [ arch ] [ 0 ] [ pkg_name ] = bin_data
self . binaries [ ' unstable ' ] [ arch ] [ 0 ] [ pkg_name ] = bin_data
self . all_binaries [ pkg_id ] = bin_data
2016-05-07 08:56:41 +00:00
def _load_constraints ( self , constraints_file ) :
""" Loads configurable constraints
The constraints file can contain extra rules that Britney should attempt
to satisfy . Examples can be " keep package X in testing and ensure it is
installable " .
: param constraints_file : Path to the file containing the constraints
"""
tag_file = apt_pkg . TagFile ( constraints_file )
get_field = tag_file . section . get
step = tag_file . step
no = 0
faux_version = sys . intern ( ' 1 ' )
faux_section = sys . intern ( ' faux ' )
2016-05-07 10:44:32 +00:00
keep_installable = [ ]
constraints = {
' keep-installable ' : keep_installable
}
2016-05-07 08:56:41 +00:00
while step ( ) :
no + = 1
pkg_name = get_field ( ' Fake-Package-Name ' , None )
2016-11-13 08:49:46 +00:00
if pkg_name is None : # pragma: no cover
2016-05-07 08:56:41 +00:00
raise ValueError ( " Missing Fake-Package-Name field in paragraph %d (file %s ) " % ( no , constraints_file ) )
pkg_name = sys . intern ( pkg_name )
def mandatory_field ( x ) :
v = get_field ( x , None )
2016-11-13 08:49:46 +00:00
if v is None : # pragma: no cover
2016-05-07 08:56:41 +00:00
raise ValueError ( " Missing %s field for %s (file %s ) " % ( x , pkg_name , constraints_file ) )
return v
constraint = mandatory_field ( ' Constraint ' )
2016-11-13 08:49:46 +00:00
if constraint not in { ' present-and-installable ' } : # pragma: no cover
2016-05-07 08:56:41 +00:00
raise ValueError ( " Unsupported constraint %s for %s (file %s ) " % ( constraint , pkg_name , constraints_file ) )
self . log ( " - constraint %s " % pkg_name , type = ' I ' )
2016-05-21 17:13:20 +00:00
pkg_list = [ x . strip ( ) for x in mandatory_field ( ' Package-List ' ) . split ( " \n " ) if x . strip ( ) != ' ' and not x . strip ( ) . startswith ( " # " ) ]
2016-09-24 09:42:39 +00:00
src_data = SourcePackage ( faux_version ,
2016-05-07 08:56:41 +00:00
faux_section ,
[ ] ,
None ,
True ,
2017-11-01 21:32:03 +00:00
None ,
2016-09-24 09:42:39 +00:00
)
2016-05-07 08:56:41 +00:00
self . sources [ ' testing ' ] [ pkg_name ] = src_data
self . sources [ ' unstable ' ] [ pkg_name ] = src_data
2016-05-07 10:44:32 +00:00
keep_installable . append ( pkg_name )
2016-05-07 08:56:41 +00:00
for arch in self . options . architectures :
deps = [ ]
for pkg_spec in pkg_list :
2016-05-21 16:44:20 +00:00
s = pkg_spec . split ( None , 1 )
2016-05-07 08:56:41 +00:00
if len ( s ) == 1 :
deps . append ( s [ 0 ] )
else :
pkg , arch_res = s
2016-11-13 08:49:46 +00:00
if not ( arch_res . startswith ( ' [ ' ) and arch_res . endswith ( ' ] ' ) ) : # pragma: no cover
2016-05-07 08:56:41 +00:00
raise ValueError ( " Invalid arch-restriction on %s - should be [arch1 arch2] (for %s file %s ) "
% ( pkg , pkg_name , constraints_file ) )
arch_res = arch_res [ 1 : - 1 ] . split ( )
2016-11-13 08:49:46 +00:00
if not arch_res : # pragma: no cover
2016-05-07 08:56:41 +00:00
msg = " Empty arch-restriction for %s : Uses comma or negation (for %s file %s ) "
raise ValueError ( msg % ( pkg , pkg_name , constraints_file ) )
for a in arch_res :
if a == arch :
deps . append ( pkg )
2016-11-13 08:49:46 +00:00
elif ' , ' in a or ' ! ' in a : # pragma: no cover
2016-05-07 08:56:41 +00:00
msg = " Invalid arch-restriction for %s : Uses comma or negation (for %s file %s ) "
raise ValueError ( msg % ( pkg , pkg_name , constraints_file ) )
2016-05-16 06:42:26 +00:00
pkg_id = BinaryPackageId ( pkg_name , faux_version , arch )
2016-05-07 08:56:41 +00:00
bin_data = BinaryPackage ( faux_version ,
faux_section ,
pkg_name ,
faux_version ,
arch ,
' no ' ,
' , ' . join ( deps ) ,
None ,
[ ] ,
False ,
2016-05-16 06:24:36 +00:00
pkg_id ,
2016-05-07 08:56:41 +00:00
)
2016-09-25 05:45:36 +00:00
src_data . binaries . append ( pkg_id )
2016-05-07 08:56:41 +00:00
self . binaries [ ' testing ' ] [ arch ] [ 0 ] [ pkg_name ] = bin_data
self . binaries [ ' unstable ' ] [ arch ] [ 0 ] [ pkg_name ] = bin_data
self . all_binaries [ pkg_id ] = bin_data
2016-05-07 10:44:32 +00:00
return constraints
2006-06-24 17:49:43 +00:00
# Data reading/writing methods
# ----------------------------
2006-06-17 13:45:56 +00:00
2015-04-27 02:13:27 +10:00
def read_sources ( self , basedir ) :
""" Read the list of source packages from the specified directory
2006-06-24 17:49:43 +00:00
2015-04-27 02:13:27 +10:00
The source packages are read from the ` Sources ' file within the
directory specified as ` basedir ' parameter. Considering the
large amount of memory needed , not all the fields are loaded
in memory . The available fields are Version , Maintainer and Section .
2006-06-24 17:49:43 +00:00
2015-04-27 02:13:27 +10:00
The method returns a list where every item represents a source
package as a dictionary .
2006-06-24 17:49:43 +00:00
"""
2015-04-27 02:13:27 +10:00
if self . options . components :
sources = { }
for component in self . options . components :
filename = os . path . join ( basedir , component , " source " , " Sources " )
2016-05-08 12:12:39 +00:00
filename = possibly_compressed ( filename )
2016-11-15 22:22:24 +00:00
self . log ( " Loading source packages from %s " % filename )
read_sources_file ( filename , sources )
2015-04-27 02:13:27 +10:00
else :
filename = os . path . join ( basedir , " Sources " )
2016-11-15 22:22:24 +00:00
self . log ( " Loading source packages from %s " % filename )
sources = read_sources_file ( filename )
2006-07-26 20:07:59 +00:00
2015-04-27 02:13:27 +10:00
return sources
2016-05-07 12:30:59 +00:00
def _parse_provides ( self , pkg_id , provides_raw ) :
parts = apt_pkg . parse_depends ( provides_raw , False )
nprov = [ ]
for or_clause in parts :
2016-11-13 08:49:46 +00:00
if len ( or_clause ) != 1 : # pragma: no cover
2016-05-07 12:30:59 +00:00
msg = " Ignoring invalid provides in %s : Alternatives [ %s ] " % ( str ( pkg_id ) , str ( or_clause ) )
self . log ( msg , type = ' W ' )
continue
for part in or_clause :
provided , provided_version , op = part
2016-11-13 08:49:46 +00:00
if op != ' ' and op != ' = ' : # pragma: no cover
2016-05-07 12:30:59 +00:00
msg = " Ignoring invalid provides in %s : %s ( %s %s ) " % ( str ( pkg_id ) , provided , op , provided_version )
self . log ( msg , type = ' W ' )
continue
provided = sys . intern ( provided )
provided_version = sys . intern ( provided_version )
part = ( provided , provided_version , sys . intern ( op ) )
nprov . append ( part )
return nprov
2015-04-27 02:13:27 +10:00
def _read_packages_file ( self , filename , arch , srcdist , packages = None , intern = sys . intern ) :
2016-03-25 07:43:46 +00:00
self . log ( " Loading binary packages from %s " % filename )
2011-12-18 21:19:22 +01:00
2015-04-27 02:13:27 +10:00
if packages is None :
packages = { }
all_binaries = self . all_binaries
2016-03-24 15:59:41 +00:00
Packages = apt_pkg . TagFile ( filename )
2011-12-18 21:19:22 +01:00
get_field = Packages . section . get
step = Packages . step
2011-03-06 20:52:05 +00:00
while step ( ) :
2006-07-26 19:26:38 +00:00
pkg = get_field ( ' Package ' )
version = get_field ( ' Version ' )
2010-02-25 20:23:02 +00:00
# There may be multiple versions of any arch:all packages
# (in unstable) if some architectures have out-of-date
# binaries. We only ever consider the package with the
# largest version for migration.
2011-12-13 18:52:01 +01:00
pkg = intern ( pkg )
version = intern ( version )
2016-05-16 06:42:26 +00:00
pkg_id = BinaryPackageId ( pkg , version , arch )
2010-02-25 20:23:02 +00:00
2016-04-12 18:18:16 +00:00
if pkg in packages :
old_pkg_data = packages [ pkg ]
2016-04-06 20:49:40 +00:00
if apt_pkg . version_compare ( old_pkg_data . version , version ) > 0 :
2016-04-12 18:18:16 +00:00
continue
2016-05-16 06:28:36 +00:00
old_pkg_id = old_pkg_data . pkg_id
2016-11-22 21:08:33 +00:00
old_src_binaries = srcdist [ old_pkg_data . source ] . binaries
2016-04-12 18:18:16 +00:00
old_src_binaries . remove ( old_pkg_id )
# This may seem weird at first glance, but the current code rely
# on this behaviour to avoid issues like #709460. Admittedly it
# is a special case, but Britney will attempt to remove the
# arch:all packages without this. Even then, this particular
# stop-gap relies on the packages files being sorted by name
# and the version, so it is not particularly resilient.
if pkg_id not in old_src_binaries :
old_src_binaries . append ( pkg_id )
2012-04-11 23:36:51 +02:00
# Merge Pre-Depends with Depends and Conflicts with
# Breaks. Britney is not interested in the "finer
2015-04-27 19:14:08 +00:00
# semantic differences" of these fields anyway.
2012-04-11 23:36:51 +02:00
pdeps = get_field ( ' Pre-Depends ' )
deps = get_field ( ' Depends ' )
if deps and pdeps :
deps = pdeps + ' , ' + deps
elif pdeps :
deps = pdeps
2011-12-13 18:52:01 +01:00
ess = False
if get_field ( ' Essential ' , ' no ' ) == ' yes ' :
ess = True
2009-09-06 18:00:56 +01:00
final_conflicts_list = [ ]
conflicts = get_field ( ' Conflicts ' )
if conflicts :
final_conflicts_list . append ( conflicts )
breaks = get_field ( ' Breaks ' )
if breaks :
final_conflicts_list . append ( breaks )
2016-04-06 20:17:27 +00:00
source = pkg
source_version = version
# retrieve the name and the version of the source package
source_raw = get_field ( ' Source ' )
if source_raw :
source = intern ( source_raw . split ( " " ) [ 0 ] )
if " ( " in source_raw :
source_version = intern ( source_raw [ source_raw . find ( " ( " ) + 1 : source_raw . find ( " ) " ) ] )
2016-04-06 20:20:38 +00:00
provides_raw = get_field ( ' Provides ' )
if provides_raw :
2016-05-07 12:30:59 +00:00
provides = self . _parse_provides ( pkg_id , provides_raw )
2016-04-06 20:20:38 +00:00
else :
provides = [ ]
2016-11-13 08:59:49 +00:00
raw_arch = intern ( get_field ( ' Architecture ' ) )
if raw_arch not in { ' all ' , arch } : # pragma: no cover
raise AssertionError ( " %s has wrong architecture ( %s ) - should be either %s or all " % (
str ( pkg_id ) , raw_arch , arch ) )
2016-04-06 20:30:03 +00:00
dpkg = BinaryPackage ( version ,
2011-12-13 18:52:01 +01:00
intern ( get_field ( ' Section ' ) ) ,
2016-04-06 20:17:27 +00:00
source ,
source_version ,
2016-11-13 08:59:49 +00:00
raw_arch ,
2013-09-16 15:02:55 +01:00
get_field ( ' Multi-Arch ' ) ,
2012-04-11 23:36:51 +02:00
deps ,
2009-09-06 18:00:56 +01:00
' , ' . join ( final_conflicts_list ) or None ,
2016-04-06 20:20:38 +00:00
provides ,
2011-12-13 18:52:01 +01:00
ess ,
2016-05-16 06:24:36 +00:00
pkg_id ,
2016-04-06 20:30:03 +00:00
)
2006-06-24 17:49:43 +00:00
# if the source package is available in the distribution, then register this binary package
2016-04-06 20:17:27 +00:00
if source in srcdist :
2013-07-01 18:13:51 +00:00
# There may be multiple versions of any arch:all packages
# (in unstable) if some architectures have out-of-date
# binaries. We only want to include the package in the
# source -> binary mapping once. It doesn't matter which
# of the versions we include as only the package name and
# architecture are recorded.
2016-09-25 05:45:36 +00:00
if pkg_id not in srcdist [ source ] . binaries :
srcdist [ source ] . binaries . append ( pkg_id )
2006-06-24 17:49:43 +00:00
# if the source package doesn't exist, create a fake one
else :
2017-11-01 21:32:03 +00:00
srcdist [ source ] = SourcePackage ( source_version , ' faux ' , [ pkg_id ] , None , True , None )
2006-06-24 17:49:43 +00:00
2006-07-28 13:21:44 +00:00
# add the resulting dictionary to the package list
2006-06-24 17:49:43 +00:00
packages [ pkg ] = dpkg
2016-01-17 10:43:26 +00:00
if pkg_id in all_binaries :
self . merge_pkg_entries ( pkg , arch , all_binaries [ pkg_id ] , dpkg )
else :
all_binaries [ pkg_id ] = dpkg
2006-06-24 17:49:43 +00:00
2015-04-27 02:13:27 +10:00
# add the resulting dictionary to the package list
packages [ pkg ] = dpkg
return packages
2016-09-24 07:00:36 +00:00
def read_binaries ( self , basedir , distribution , architectures ) :
2015-04-27 02:13:27 +10:00
""" Read the list of binary packages from the specified directory
2016-09-24 07:00:36 +00:00
This method reads all the binary packages for a given distribution ,
which is expected to be in the directory denoted by the " base_dir "
parameter .
2015-04-27 02:13:27 +10:00
2016-09-24 07:00:36 +00:00
If the " components " config parameter is set , the directory should
be the " suite " directory of a local mirror ( i . e . the one containing
2016-11-12 09:54:23 +00:00
the " Release " file ) . Otherwise , Britney will read the packages
2016-09-24 07:00:36 +00:00
information from all the " Packages_$ {arch} " files referenced by
the " architectures " parameter .
2015-04-27 02:13:27 +10:00
Considering the
large amount of memory needed , not all the fields are loaded
in memory . The available fields are Version , Source , Multi - Arch ,
Depends , Conflicts , Provides and Architecture .
2016-09-24 07:00:36 +00:00
The ` Provides ' field is used to populate the virtual packages list.
2015-04-27 02:13:27 +10:00
2016-09-24 07:00:36 +00:00
The method returns a dict mapping an architecture name to a 2 - element
tuple . The first element in the tuple is a map from binary package
names to " BinaryPackage " objects ; the second element is a dictionary
which maps virtual packages to real packages that provide them .
2015-04-27 02:13:27 +10:00
"""
2016-09-24 07:00:36 +00:00
arch2packages = { }
2015-04-27 02:13:27 +10:00
if self . options . components :
2016-09-24 07:38:07 +00:00
release_file = read_release_file ( basedir )
listed_archs = set ( release_file [ ' Architectures ' ] . split ( ) )
2016-09-24 07:00:36 +00:00
for arch in architectures :
packages = { }
2016-09-24 07:38:07 +00:00
if arch not in listed_archs :
self . log ( " Skipping arch %s for %s : It is not listed in the Release file " % ( arch , distribution ) )
arch2packages [ arch ] = ( { } , { } )
continue
2016-09-24 07:00:36 +00:00
for component in self . options . components :
binary_dir = " binary- %s " % arch
filename = os . path . join ( basedir ,
component ,
binary_dir ,
' Packages ' )
2016-09-24 07:38:07 +00:00
filename = possibly_compressed ( filename )
2016-09-24 07:00:36 +00:00
udeb_filename = os . path . join ( basedir ,
component ,
" debian-installer " ,
binary_dir ,
" Packages " )
# We assume the udeb Packages file is present if the
# regular one is present
udeb_filename = possibly_compressed ( udeb_filename )
self . _read_packages_file ( filename ,
arch ,
self . sources [ distribution ] ,
packages )
self . _read_packages_file ( udeb_filename ,
arch ,
self . sources [ distribution ] ,
packages )
# create provides
provides = create_provides_map ( packages )
arch2packages [ arch ] = ( packages , provides )
2015-04-27 02:13:27 +10:00
else :
2016-09-24 07:00:36 +00:00
for arch in architectures :
filename = os . path . join ( basedir , " Packages_ %s " % arch )
packages = self . _read_packages_file ( filename ,
arch ,
self . sources [ distribution ] )
provides = create_provides_map ( packages )
arch2packages [ arch ] = ( packages , provides )
return arch2packages
2006-06-24 17:49:43 +00:00
2016-05-26 16:49:09 +00:00
def read_hints ( self , hintsdir ) :
2006-06-24 17:49:43 +00:00
""" Read the hint commands from the specified directory
2016-05-26 16:49:09 +00:00
The hint commands are read from the files contained in the directory
specified by the ` hintsdir ' parameter.
2010-02-28 12:58:03 +00:00
The names of the files have to be the same as the authorized users
2006-06-24 17:49:43 +00:00
for the hints .
The file contains rows with the format :
< command > < package - name > [ / < version > ]
The method returns a dictionary where the key is the command , and
the value is the list of affected packages .
"""
2006-06-17 13:45:56 +00:00
for who in self . HINTS . keys ( ) :
2008-01-17 14:27:36 +00:00
if who == ' command-line ' :
lines = self . options . hints and self . options . hints . split ( ' ; ' ) or ( )
2015-02-01 10:59:57 +01:00
filename = ' <cmd-line> '
2016-07-02 20:00:16 +00:00
self . _hint_parser . parse_hints ( who , self . HINTS [ who ] , filename , lines )
2008-01-17 14:27:36 +00:00
else :
2016-05-26 16:49:09 +00:00
filename = os . path . join ( hintsdir , who )
2008-01-17 14:27:36 +00:00
if not os . path . isfile ( filename ) :
2016-03-25 07:43:46 +00:00
self . log ( " Cannot read hints list from %s , no such file! " % filename , type = " E " )
2008-01-17 14:27:36 +00:00
continue
2016-03-25 07:43:46 +00:00
self . log ( " Loading hints list from %s " % filename )
2015-04-26 00:04:50 +02:00
with open ( filename , encoding = ' utf-8 ' ) as f :
2016-07-02 20:00:16 +00:00
self . _hint_parser . parse_hints ( who , self . HINTS [ who ] , filename , f )
2016-03-25 07:49:31 +00:00
2016-07-02 20:00:16 +00:00
hints = self . _hint_parser . hints
2006-06-17 13:45:56 +00:00
2013-01-18 18:47:06 +00:00
for x in [ " block " , " block-all " , " block-udeb " , " unblock " , " unblock-udeb " , " force " , " urgent " , " remove " , " age-days " ] :
2006-06-17 13:45:56 +00:00
z = { }
2011-09-04 16:41:33 +00:00
for hint in hints [ x ] :
2011-09-04 19:46:38 +00:00
package = hint . package
2011-11-14 19:52:42 +00:00
key = ( hint , hint . user )
2011-12-18 14:53:04 +01:00
if package in z and z [ package ] != key :
2011-11-14 19:52:42 +00:00
hint2 = z [ package ] [ 0 ]
2010-02-28 13:11:22 +00:00
if x in [ ' unblock ' , ' unblock-udeb ' ] :
2011-12-18 21:19:22 +01:00
if apt_pkg . version_compare ( hint2 . version , hint . version ) < 0 :
2010-02-28 13:11:22 +00:00
# This hint is for a newer version, so discard the old one
2016-03-25 07:43:46 +00:00
self . log ( " Overriding %s [ %s ] = ( ' %s ' , ' %s ' ) with ( ' %s ' , ' %s ' ) " %
2011-11-14 19:52:42 +00:00
( x , package , hint2 . version , hint2 . user , hint . version , hint . user ) , type = " W " )
hint2 . set_active ( False )
2010-02-28 13:11:22 +00:00
else :
2011-08-28 21:45:14 +00:00
# This hint is for an older version, so ignore it in favour of the new one
2016-03-25 07:43:46 +00:00
self . log ( " Ignoring %s [ %s ] = ( ' %s ' , ' %s ' ), ( ' %s ' , ' %s ' ) is higher or equal " %
2011-11-14 19:52:42 +00:00
( x , package , hint . version , hint . user , hint2 . version , hint2 . user ) , type = " W " )
2011-09-04 16:41:33 +00:00
hint . set_active ( False )
2010-02-28 13:11:22 +00:00
else :
2017-01-29 11:14:55 +01:00
self . log ( " Overriding %s [ %s ] = ( ' %s ' , ' %s ' ) with ( ' %s ' , ' %s ' ) " %
( x , package , hint2 . user , hint2 , hint . user , hint ) ,
type = " W " )
2011-11-14 19:52:42 +00:00
hint2 . set_active ( False )
2011-11-10 10:30:13 +00:00
2011-11-14 19:52:42 +00:00
z [ package ] = key
2006-06-17 13:45:56 +00:00
2010-02-28 14:41:59 +00:00
# Sanity check the hints hash
if len ( hints [ " block " ] ) == 0 and len ( hints [ " block-udeb " ] ) == 0 :
2016-03-25 07:43:46 +00:00
self . log ( " WARNING: No block hints at all, not even udeb ones! " , type = " W " )
2010-02-28 14:41:59 +00:00
2011-10-19 20:02:29 +00:00
# Utility methods for package analysis
2006-06-24 17:49:43 +00:00
# ------------------------------------
2006-06-17 13:45:56 +00:00
2016-11-15 23:09:26 +00:00
def excuse_unsat_deps ( self , pkg , src , arch , suite , excuse , get_dependency_solvers = get_dependency_solvers ) :
2006-06-24 17:49:43 +00:00
""" Find unsatisfied dependencies for a binary package
This method analyzes the dependencies of the binary package specified
by the parameter ` pkg ' , built from the source package `src ' , for the
architecture ` arch ' within the suite `suite ' . If the dependency can ' t
be satisfied in testing and / or unstable , it updates the excuse passed
as parameter .
"""
# retrieve the binary package from the specified suite and arch
2016-11-15 23:09:26 +00:00
binaries_s_a , provides_s_a = self . binaries [ suite ] [ arch ]
binaries_t_a , provides_t_a = self . binaries [ ' testing ' ] [ arch ]
2016-09-24 08:42:52 +00:00
binary_u = binaries_s_a [ pkg ]
2006-06-24 17:49:43 +00:00
2013-09-18 20:40:00 +00:00
# local copies for better performance
2011-12-18 21:19:22 +01:00
parse_depends = apt_pkg . parse_depends
2006-07-28 07:59:05 +00:00
2006-06-24 17:49:43 +00:00
# analyze the dependency fields (if present)
2016-04-06 20:49:40 +00:00
deps = binary_u . depends
if not deps :
2016-04-07 19:49:13 +00:00
return True
is_all_ok = True
2006-06-17 13:45:56 +00:00
2016-09-24 08:42:52 +00:00
2013-09-18 20:40:00 +00:00
# for every dependency block (formed as conjunction of disjunction)
2012-04-11 23:36:51 +02:00
for block , block_txt in zip ( parse_depends ( deps , False ) , deps . split ( ' , ' ) ) :
# if the block is satisfied in testing, then skip the block
2016-11-15 23:09:26 +00:00
packages = get_dependency_solvers ( block , binaries_t_a , provides_t_a )
2014-08-09 20:49:11 +02:00
if packages :
2012-04-11 23:36:51 +02:00
for p in packages :
2016-09-24 08:42:52 +00:00
if p not in binaries_s_a :
2015-09-08 21:45:53 +02:00
continue
2016-09-24 08:42:52 +00:00
excuse . add_sane_dep ( binaries_s_a [ p ] . source )
2012-04-11 23:36:51 +02:00
continue
2006-06-17 13:45:56 +00:00
2015-09-08 21:45:53 +02:00
# check if the block can be satisfied in the source suite, and list the solving packages
2016-11-15 23:09:26 +00:00
packages = get_dependency_solvers ( block , binaries_s_a , provides_s_a )
packages = [ binaries_s_a [ p ] . source for p in packages ]
2006-06-24 17:49:43 +00:00
2012-04-11 23:36:51 +02:00
# if the dependency can be satisfied by the same source package, skip the block:
# obviously both binary packages will enter testing together
if src in packages : continue
2006-06-17 13:45:56 +00:00
2012-04-11 23:36:51 +02:00
# if no package can satisfy the dependency, add this information to the excuse
2014-08-09 20:49:11 +02:00
if not packages :
2012-04-11 23:36:51 +02:00
excuse . addhtml ( " %s / %s unsatisfiable Depends: %s " % ( pkg , arch , block_txt . strip ( ) ) )
2015-02-01 11:00:24 +01:00
excuse . addreason ( " depends " )
2016-04-07 19:49:13 +00:00
if arch not in self . options . break_arches :
is_all_ok = False
2012-04-11 23:36:51 +02:00
continue
2006-06-17 13:45:56 +00:00
2012-04-11 23:36:51 +02:00
# for the solving packages, update the excuse to add the dependencies
for p in packages :
2015-05-27 22:34:42 +02:00
if arch not in self . options . break_arches :
2016-09-25 05:45:36 +00:00
if p in self . sources [ ' testing ' ] and self . sources [ ' testing ' ] [ p ] . version == self . sources [ suite ] [ p ] . version :
2012-12-25 18:28:14 +00:00
excuse . add_dep ( " %s / %s " % ( p , arch ) , arch )
else :
excuse . add_dep ( p , arch )
2012-04-11 23:36:51 +02:00
else :
excuse . add_break_dep ( p , arch )
2016-04-07 19:49:13 +00:00
return is_all_ok
2006-06-17 13:45:56 +00:00
2010-02-28 12:58:03 +00:00
# Package analysis methods
2006-06-24 17:49:43 +00:00
# ------------------------
def should_remove_source ( self , pkg ) :
""" Check if a source package should be removed from testing
This method checks if a source package should be removed from the
2010-02-28 12:58:03 +00:00
testing distribution ; this happens if the source package is not
2006-06-24 17:49:43 +00:00
present in the unstable distribution anymore .
It returns True if the package can be removed , False otherwise .
2013-07-06 12:42:52 +00:00
In the former case , a new excuse is appended to the object
2006-06-24 17:49:43 +00:00
attribute excuses .
"""
2010-02-28 12:58:03 +00:00
# if the source package is available in unstable, then do nothing
2006-07-26 19:26:38 +00:00
if pkg in self . sources [ ' unstable ' ] :
2006-06-24 17:49:43 +00:00
return False
2015-04-27 19:14:08 +00:00
# otherwise, add a new excuse for its removal
2006-06-24 17:49:43 +00:00
src = self . sources [ ' testing ' ] [ pkg ]
excuse = Excuse ( " - " + pkg )
2013-12-11 21:56:51 +01:00
excuse . addhtml ( " Package not in unstable, will try to remove " )
2016-09-25 05:45:36 +00:00
excuse . set_vers ( src . version , None )
src . maintainer and excuse . set_maint ( src . maintainer )
src . section and excuse . set_section ( src . section )
2006-08-07 11:06:04 +00:00
# if the package is blocked, skip it
2011-09-04 20:26:36 +00:00
for hint in self . hints . search ( ' block ' , package = pkg , removal = True ) :
2015-08-11 22:24:39 +00:00
excuse . addhtml ( " Not touching package, as requested by %s "
2017-09-08 07:43:11 +00:00
" (contact debian-release if update is needed) " % hint . user )
2013-11-15 15:01:20 +00:00
excuse . addreason ( " block " )
2016-03-25 09:18:35 +00:00
self . excuses [ excuse . name ] = excuse
2006-08-07 11:06:04 +00:00
return False
2017-02-08 19:01:41 +00:00
excuse . policy_verdict = PolicyVerdict . PASS
2016-03-25 09:18:35 +00:00
self . excuses [ excuse . name ] = excuse
2006-06-24 17:49:43 +00:00
return True
2016-03-23 09:55:07 +00:00
def should_upgrade_srcarch ( self , src , arch , suite ) :
2013-07-06 11:21:53 +00:00
""" Check if a set of binary packages should be upgraded
2006-06-24 17:49:43 +00:00
2013-07-06 11:21:53 +00:00
This method checks if the binary packages produced by the source
package on the given architecture should be upgraded ; this can
happen also if the migration is a binary - NMU for the given arch .
2006-06-24 17:49:43 +00:00
2013-07-06 11:21:53 +00:00
It returns False if the given packages don ' t need to be upgraded,
2006-06-24 17:49:43 +00:00
True otherwise . In the former case , a new excuse is appended to
2013-07-06 12:42:52 +00:00
the object attribute excuses .
2006-06-24 17:49:43 +00:00
"""
# retrieve the source packages for testing and suite
2006-06-17 13:45:56 +00:00
source_t = self . sources [ ' testing ' ] [ src ]
source_u = self . sources [ suite ] [ src ]
2016-10-24 19:42:05 +00:00
suite_info = self . suite_info [ suite ]
suffix = ' '
if suite_info . excuses_suffix :
suffix = " _ %s " % suite_info . excuses_suffix
2006-06-17 13:45:56 +00:00
2006-06-24 17:49:43 +00:00
# build the common part of the excuse, which will be filled by the code below
2016-10-24 19:42:05 +00:00
ref = " %s / %s %s " % ( src , arch , suffix )
2006-06-17 13:45:56 +00:00
excuse = Excuse ( ref )
2016-09-25 05:45:36 +00:00
excuse . set_vers ( source_t . version , source_t . version )
source_u . maintainer and excuse . set_maint ( source_u . maintainer )
source_u . section and excuse . set_section ( source_u . section )
2006-06-17 13:45:56 +00:00
2010-02-28 12:58:03 +00:00
# if there is a `remove' hint and the requested version is the same as the
2006-06-24 17:49:43 +00:00
# version in testing, then stop here and return False
2013-07-06 11:26:11 +00:00
# (as a side effect, a removal may generate such excuses for both the source
2013-07-06 11:21:53 +00:00
# package and its binary packages on each architecture)
2016-09-25 05:45:36 +00:00
for hint in self . hints . search ( ' remove ' , package = src , version = source_t . version ) :
2016-07-05 22:44:56 +00:00
excuse . add_hint ( hint )
2011-09-04 16:41:33 +00:00
excuse . addhtml ( " Removal request by %s " % ( hint . user ) )
2006-06-17 13:45:56 +00:00
excuse . addhtml ( " Trying to remove package, not update it " )
2016-03-25 09:18:35 +00:00
self . excuses [ excuse . name ] = excuse
2006-06-17 13:45:56 +00:00
return False
2006-06-24 17:49:43 +00:00
# the starting point is that there is nothing wrong and nothing worth doing
anywrongver = False
anyworthdoing = False
2016-03-23 15:21:27 +00:00
packages_t_a = self . binaries [ ' testing ' ] [ arch ] [ 0 ]
packages_s_a = self . binaries [ suite ] [ arch ] [ 0 ]
2006-06-24 17:49:43 +00:00
# for every binary package produced by this source in unstable for this architecture
2016-09-25 05:45:36 +00:00
for pkg_id in sorted ( x for x in source_u . binaries if x . architecture == arch ) :
2016-05-16 08:19:26 +00:00
pkg_name = pkg_id . package_name
2006-06-17 13:45:56 +00:00
2006-06-24 17:49:43 +00:00
# retrieve the testing (if present) and unstable corresponding binary packages
2017-09-02 11:10:56 +00:00
binary_t = packages_t_a [ pkg_name ] if pkg_name in packages_t_a else None
2016-03-23 15:21:27 +00:00
binary_u = packages_s_a [ pkg_name ]
2006-06-24 17:49:43 +00:00
# this is the source version for the new binary package
2016-04-06 20:49:40 +00:00
pkgsv = binary_u . source_version
2006-06-17 13:45:56 +00:00
2006-06-24 17:49:43 +00:00
# if the new binary package is architecture-independent, then skip it
2016-04-06 20:49:40 +00:00
if binary_u . architecture == ' all ' :
2016-09-25 05:45:36 +00:00
if pkg_id not in source_t . binaries :
2016-03-28 11:17:25 +00:00
# only add a note if the arch:all does not match the expected version
2016-04-06 20:49:40 +00:00
excuse . addhtml ( " Ignoring %s %s (from %s ) as it is arch: all " % ( pkg_name , binary_u . version , pkgsv ) )
2006-06-17 13:45:56 +00:00
continue
2006-06-24 17:49:43 +00:00
# if the new binary package is not from the same source as the testing one, then skip it
2013-07-06 11:21:53 +00:00
# this implies that this binary migration is part of a source migration
2016-09-25 05:45:36 +00:00
if source_u . version == pkgsv and source_t . version != pkgsv :
2006-06-17 13:45:56 +00:00
anywrongver = True
2016-09-25 05:45:36 +00:00
excuse . addhtml ( " From wrong source: %s %s ( %s not %s ) " % ( pkg_name , binary_u . version , pkgsv , source_t . version ) )
2015-10-20 18:39:28 +00:00
continue
2006-06-17 13:45:56 +00:00
2015-10-31 21:53:57 +01:00
# cruft in unstable
2016-09-25 05:45:36 +00:00
if source_u . version != pkgsv and source_t . version != pkgsv :
2015-10-31 21:53:57 +01:00
if self . options . ignore_cruft :
excuse . addhtml ( " Old cruft: %s %s (but ignoring cruft, so nevermind) " % ( pkg_name , pkgsv ) )
else :
anywrongver = True
excuse . addhtml ( " Old cruft: %s %s " % ( pkg_name , pkgsv ) )
continue
2013-06-08 15:40:07 +00:00
# if the source package has been updated in unstable and this is a binary migration, skip it
2013-07-06 11:21:53 +00:00
# (the binaries are now out-of-date)
2016-09-25 05:45:36 +00:00
if source_t . version == pkgsv and source_t . version != source_u . version :
2013-06-08 15:40:07 +00:00
anywrongver = True
2016-09-25 05:45:36 +00:00
excuse . addhtml ( " From wrong source: %s %s ( %s not %s ) " % ( pkg_name , binary_u . version , pkgsv , source_u . version ) )
2015-10-20 18:39:28 +00:00
continue
2013-06-08 15:40:07 +00:00
2006-06-24 17:49:43 +00:00
# find unsatisfied dependencies for the new binary package
2006-06-17 13:45:56 +00:00
self . excuse_unsat_deps ( pkg_name , src , arch , suite , excuse )
2006-06-24 17:49:43 +00:00
# if the binary is not present in testing, then it is a new binary;
# in this case, there is something worth doing
2006-06-17 13:45:56 +00:00
if not binary_t :
2016-04-06 20:49:40 +00:00
excuse . addhtml ( " New binary: %s ( %s ) " % ( pkg_name , binary_u . version ) )
2006-06-17 13:45:56 +00:00
anyworthdoing = True
continue
2006-06-24 17:49:43 +00:00
# at this point, the binary package is present in testing, so we can compare
# the versions of the packages ...
2016-04-06 20:49:40 +00:00
vcompare = apt_pkg . version_compare ( binary_t . version , binary_u . version )
2006-06-24 17:49:43 +00:00
# ... if updating would mean downgrading, then stop here: there is something wrong
2006-06-17 13:45:56 +00:00
if vcompare > 0 :
anywrongver = True
2016-04-06 20:49:40 +00:00
excuse . addhtml ( " Not downgrading: %s ( %s to %s ) " % ( pkg_name , binary_t . version , binary_u . version ) )
2006-06-17 13:45:56 +00:00
break
2006-06-24 17:49:43 +00:00
# ... if updating would mean upgrading, then there is something worth doing
2006-06-17 13:45:56 +00:00
elif vcompare < 0 :
2016-04-06 20:49:40 +00:00
excuse . addhtml ( " Updated binary: %s ( %s to %s ) " % ( pkg_name , binary_t . version , binary_u . version ) )
2006-06-17 13:45:56 +00:00
anyworthdoing = True
2006-06-24 17:49:43 +00:00
# if there is nothing wrong and there is something worth doing or the source
2010-02-28 12:58:03 +00:00
# package is not fake, then check what packages should be removed
2016-09-25 05:45:36 +00:00
if not anywrongver and ( anyworthdoing or not source_u . is_fakesrc ) :
srcv = source_u . version
ssrc = source_t . version == srcv
2013-02-02 22:47:56 +00:00
# if this is a binary-only migration via *pu, we never want to try
# removing binary packages
if not ( ssrc and suite != ' unstable ' ) :
# for every binary package produced by this source in testing for this architecture
2014-06-08 16:07:09 +02:00
_ , _ , smoothbins = self . _compute_groups ( src ,
" unstable " ,
arch ,
False )
2013-07-09 17:33:49 +02:00
2016-09-25 05:45:36 +00:00
for pkg_id in sorted ( x for x in source_t . binaries if x . architecture == arch ) :
2016-05-16 08:19:26 +00:00
pkg = pkg_id . package_name
2013-02-02 22:47:56 +00:00
# if the package is architecture-independent, then ignore it
2016-03-23 15:21:27 +00:00
tpkg_data = packages_t_a [ pkg ]
2016-04-06 20:49:40 +00:00
if tpkg_data . version == ' all ' :
2016-09-25 05:45:36 +00:00
if pkg_id not in source_u . binaries :
2016-03-28 11:17:25 +00:00
# only add a note if the arch:all does not match the expected version
excuse . addhtml ( " Ignoring removal of %s as it is arch: all " % ( pkg ) )
2013-02-02 22:47:56 +00:00
continue
# if the package is not produced by the new source package, then remove it from testing
2016-03-23 15:21:27 +00:00
if pkg not in packages_s_a :
2016-04-06 20:49:40 +00:00
excuse . addhtml ( " Removed binary: %s %s " % ( pkg , tpkg_data . version ) )
2013-07-06 11:21:53 +00:00
# the removed binary is only interesting if this is a binary-only migration,
# as otherwise the updated source will already cause the binary packages
# to be updated
2013-07-09 17:33:49 +02:00
if ssrc :
2013-09-18 20:40:00 +00:00
# Special-case, if the binary is a candidate for a smooth update, we do not consider
2013-07-09 17:33:49 +02:00
# it "interesting" on its own. This case happens quite often with smooth updatable
# packages, where the old binary "survives" a full run because it still has
# reverse dependencies.
2016-03-23 15:10:08 +00:00
if pkg_id not in smoothbins :
2013-07-09 17:33:49 +02:00
anyworthdoing = True
2006-06-17 13:45:56 +00:00
2010-02-28 12:58:03 +00:00
# if there is nothing wrong and there is something worth doing, this is a valid candidate
2006-06-17 13:45:56 +00:00
if not anywrongver and anyworthdoing :
2017-02-08 19:01:41 +00:00
excuse . policy_verdict = PolicyVerdict . PASS
2016-03-25 09:18:35 +00:00
self . excuses [ excuse . name ] = excuse
2006-07-22 07:31:22 +00:00
return True
2006-06-24 17:49:43 +00:00
# else if there is something worth doing (but something wrong, too) this package won't be considered
2006-06-17 13:45:56 +00:00
elif anyworthdoing :
2016-03-25 09:18:35 +00:00
self . excuses [ excuse . name ] = excuse
2006-06-17 13:45:56 +00:00
2006-07-20 19:11:48 +00:00
# otherwise, return False
return False
2006-06-17 13:45:56 +00:00
2016-03-23 09:55:07 +00:00
def should_upgrade_src ( self , src , suite ) :
2006-06-24 17:49:43 +00:00
""" Check if source package should be upgraded
2010-02-28 12:58:03 +00:00
This method checks if a source package should be upgraded . The analysis
2006-06-24 17:49:43 +00:00
is performed for the source package specified by the ` src ' parameter,
2011-12-15 21:31:44 +00:00
for the distribution ` suite ' .
2006-06-24 17:49:43 +00:00
It returns False if the given package doesn ' t need to be upgraded,
True otherwise . In the former case , a new excuse is appended to
2011-05-31 20:57:59 +00:00
the object attribute excuses .
2006-06-24 17:49:43 +00:00
"""
2006-06-17 13:45:56 +00:00
source_u = self . sources [ suite ] [ src ]
2017-02-08 17:38:49 +00:00
if source_u . is_fakesrc :
# it is a fake package created to satisfy Britney implementation details; silently ignore it
return False
# retrieve the source packages for testing (if available) and suite
2006-06-17 13:45:56 +00:00
if src in self . sources [ ' testing ' ] :
source_t = self . sources [ ' testing ' ] [ src ]
2006-06-24 17:49:43 +00:00
# if testing and unstable have the same version, then this is a candidate for binary-NMUs only
2016-09-25 05:45:36 +00:00
if apt_pkg . version_compare ( source_t . version , source_u . version ) == 0 :
2006-06-17 13:45:56 +00:00
return False
else :
source_t = None
2016-10-24 19:42:05 +00:00
suite_info = self . suite_info [ suite ]
suffix = ' '
if suite_info . excuses_suffix :
suffix = " _ %s " % suite_info . excuses_suffix
2006-06-24 17:49:43 +00:00
# build the common part of the excuse, which will be filled by the code below
2016-10-24 19:42:05 +00:00
ref = " %s %s " % ( src , suffix )
2006-06-17 13:45:56 +00:00
excuse = Excuse ( ref )
2016-09-25 05:45:36 +00:00
excuse . set_vers ( source_t and source_t . version or None , source_u . version )
source_u . maintainer and excuse . set_maint ( source_u . maintainer )
source_u . section and excuse . set_section ( source_u . section )
2006-06-24 17:49:43 +00:00
# if the version in unstable is older, then stop here with a warning in the excuse and return False
2016-09-25 05:45:36 +00:00
if source_t and apt_pkg . version_compare ( source_u . version , source_t . version ) < 0 :
excuse . addhtml ( " ALERT: %s is newer in testing ( %s %s ) " % ( src , source_t . version , source_u . version ) )
2016-03-25 09:18:35 +00:00
self . excuses [ excuse . name ] = excuse
2015-02-01 11:00:24 +01:00
excuse . addreason ( " newerintesting " )
2006-06-17 13:45:56 +00:00
return False
2016-11-01 11:59:11 +02:00
# the starting point is that we will update the candidate
2017-02-08 19:01:41 +00:00
excuse . policy_verdict = PolicyVerdict . PASS
2016-11-01 11:59:11 +02:00
2010-02-28 12:58:03 +00:00
# if there is a `remove' hint and the requested version is the same as the
2006-06-24 17:49:43 +00:00
# version in testing, then stop here and return False
2016-07-05 22:44:56 +00:00
for hint in self . hints . search ( ' remove ' , package = src ) :
2016-09-25 05:45:36 +00:00
if source_t and source_t . version == hint . version or \
source_u . version == hint . version :
2016-07-05 22:44:56 +00:00
excuse . add_hint ( hint )
excuse . addhtml ( " Removal request by %s " % ( hint . user ) )
2006-06-17 13:45:56 +00:00
excuse . addhtml ( " Trying to remove package, not update it " )
2017-02-08 19:01:41 +00:00
excuse . policy_verdict = PolicyVerdict . REJECTED_PERMANENTLY
2017-02-08 17:49:30 +00:00
break
2006-06-17 13:45:56 +00:00
2009-08-15 16:39:00 +01:00
# check if there is a `block' or `block-udeb' hint for this package, or a `block-all source' hint
blocked = { }
2011-09-04 20:26:36 +00:00
for hint in self . hints . search ( package = src ) :
2012-04-03 21:44:32 +00:00
if hint . type == ' block ' :
2011-09-04 16:41:33 +00:00
blocked [ ' block ' ] = hint
2016-03-28 10:09:58 +00:00
excuse . add_hint ( hint )
2011-09-04 16:41:33 +00:00
if hint . type == ' block-udeb ' :
blocked [ ' block-udeb ' ] = hint
2016-03-28 10:09:58 +00:00
excuse . add_hint ( hint )
if ' block ' not in blocked :
2016-05-15 14:06:12 +00:00
for hint in self . hints . search ( type = ' block-all ' ) :
if hint . package == ' source ' or ( not source_t and hint . package == ' new-source ' ) :
blocked [ ' block ' ] = hint
excuse . add_hint ( hint )
break
2016-03-28 10:09:58 +00:00
if suite in ( ' pu ' , ' tpu ' ) :
2013-01-18 18:52:07 +00:00
blocked [ ' block ' ] = ' %s -block ' % ( suite )
2016-03-28 10:09:58 +00:00
excuse . needs_approval = True
2006-06-17 13:45:56 +00:00
2006-06-24 17:49:43 +00:00
# if the source is blocked, then look for an `unblock' hint; the unblock request
2009-08-15 16:39:00 +01:00
# is processed only if the specified version is correct. If a package is blocked
# by `block-udeb', then `unblock-udeb' must be present to cancel it.
for block_cmd in blocked :
unblock_cmd = " un " + block_cmd
2011-09-04 20:26:36 +00:00
unblocks = self . hints . search ( unblock_cmd , package = src )
2011-11-10 10:30:13 +00:00
2016-09-25 05:45:36 +00:00
if unblocks and unblocks [ 0 ] . version is not None and unblocks [ 0 ] . version == source_u . version :
2016-03-28 10:09:58 +00:00
excuse . add_hint ( unblocks [ 0 ] )
if block_cmd == ' block-udeb ' or not excuse . needs_approval :
2013-01-18 18:52:07 +00:00
excuse . addhtml ( " Ignoring %s request by %s , due to %s request by %s " %
( block_cmd , blocked [ block_cmd ] . user , unblock_cmd , unblocks [ 0 ] . user ) )
else :
excuse . addhtml ( " Approved by %s " % ( unblocks [ 0 ] . user ) )
2006-06-24 17:49:43 +00:00
else :
2011-09-04 16:41:33 +00:00
if unblocks :
2013-06-25 10:54:53 +00:00
if unblocks [ 0 ] . version is None :
excuse . addhtml ( " %s request by %s ignored due to missing version " %
( unblock_cmd . capitalize ( ) , unblocks [ 0 ] . user ) )
else :
excuse . addhtml ( " %s request by %s ignored due to version mismatch: %s " %
( unblock_cmd . capitalize ( ) , unblocks [ 0 ] . user , unblocks [ 0 ] . version ) )
2013-01-18 18:52:07 +00:00
if suite == ' unstable ' or block_cmd == ' block-udeb ' :
2017-09-08 07:43:11 +00:00
tooltip = " please contact debian-release if update is needed "
2015-08-11 22:26:57 +00:00
# redirect people to d-i RM for udeb things:
if block_cmd == ' block-udeb ' :
tooltip = " please contact the d-i release manager if an update is needed "
excuse . addhtml ( " Not touching package due to %s request by %s ( %s ) " %
( block_cmd , blocked [ block_cmd ] . user , tooltip ) )
2013-11-15 15:01:20 +00:00
excuse . addreason ( " block " )
2013-01-18 18:52:07 +00:00
else :
excuse . addhtml ( " NEEDS APPROVAL BY RM " )
2013-11-15 15:01:20 +00:00
excuse . addreason ( " block " )
2017-02-08 20:02:16 +00:00
excuse . policy_verdict = PolicyVerdict . REJECTED_NEEDS_APPROVAL
2006-06-17 13:45:56 +00:00
2016-10-20 09:00:14 +02:00
# at this point, we check the status of the builds on all the supported architectures
# to catch the out-of-date ones
pkgs = { src : [ " source " ] }
all_binaries = self . all_binaries
for arch in self . options . architectures :
oodbins = { }
uptodatebins = False
# for every binary package produced by this source in the suite for this architecture
for pkg_id in sorted ( x for x in source_u . binaries if x . architecture == arch ) :
pkg = pkg_id . package_name
if pkg not in pkgs : pkgs [ pkg ] = [ ]
pkgs [ pkg ] . append ( arch )
# retrieve the binary package and its source version
binary_u = all_binaries [ pkg_id ]
pkgsv = binary_u . source_version
# if it wasn't built by the same source, it is out-of-date
# if there is at least one binary on this arch which is
# up-to-date, there is a build on this arch
if source_u . version != pkgsv :
if pkgsv not in oodbins :
oodbins [ pkgsv ] = [ ]
oodbins [ pkgsv ] . append ( pkg )
excuse . add_old_binary ( pkg , pkgsv )
continue
else :
# if the binary is arch all, it doesn't count as
# up-to-date for this arch
if binary_u . architecture == arch :
uptodatebins = True
# if the package is architecture-dependent or the current arch is `nobreakall'
# find unsatisfied dependencies for the binary package
if binary_u . architecture != ' all ' or arch in self . options . nobreakall_arches :
is_valid = self . excuse_unsat_deps ( pkg , src , arch , suite , excuse )
2016-10-23 09:59:36 +00:00
inst_tester = self . _inst_tester
if not is_valid and inst_tester . any_of_these_are_in_testing ( { binary_u . pkg_id } ) \
and not inst_tester . is_installable ( binary_u . pkg_id ) :
# Forgive uninstallable packages only when
# they are already broken in testing ideally
# we would not need to be forgiving at
# all. However, due to how arch:all packages
# are handled, we do run into occasionally.
2017-02-08 19:01:41 +00:00
excuse . policy_verdict = PolicyVerdict . REJECTED_PERMANENTLY
2016-10-20 09:00:14 +02:00
2016-11-01 11:59:11 +02:00
# if there are out-of-date packages, warn about them in the excuse and set excuse.is_valid
2016-10-20 09:00:14 +02:00
# to False to block the update; if the architecture where the package is out-of-date is
# in the `outofsync_arches' list, then do not block the update
if oodbins :
oodtxt = " "
for v in oodbins . keys ( ) :
if oodtxt : oodtxt = oodtxt + " ; "
oodtxt = oodtxt + " %s (from <a href= \" https://buildd.debian.org/status/logs.php? " \
" arch= %s &pkg= %s &ver= %s \" target= \" _blank \" > %s </a>) " % \
( " , " . join ( sorted ( oodbins [ v ] ) ) , quote ( arch ) , quote ( src ) , quote ( v ) , v )
if uptodatebins :
text = " old binaries left on <a href= \" https://buildd.debian.org/status/logs.php? " \
" arch= %s &pkg= %s &ver= %s \" target= \" _blank \" > %s </a>: %s " % \
( quote ( arch ) , quote ( src ) , quote ( source_u . version ) , arch , oodtxt )
else :
text = " missing build on <a href= \" https://buildd.debian.org/status/logs.php? " \
" arch= %s &pkg= %s &ver= %s \" target= \" _blank \" > %s </a>: %s " % \
( quote ( arch ) , quote ( src ) , quote ( source_u . version ) , arch , oodtxt )
if arch in self . options . outofsync_arches :
text = text + " (but %s isn ' t keeping up, so nevermind) " % ( arch )
if not uptodatebins :
excuse . missing_build_on_ood_arch ( arch )
else :
if uptodatebins :
if self . options . ignore_cruft :
text = text + " (but ignoring cruft, so nevermind) "
else :
2017-02-08 19:01:41 +00:00
excuse . policy_verdict = PolicyVerdict . REJECTED_PERMANENTLY
2016-10-20 09:00:14 +02:00
else :
2017-02-08 20:02:16 +00:00
excuse . policy_verdict = PolicyVerdict . REJECTED_CANNOT_DETERMINE_IF_PERMANENT
2016-10-20 09:00:14 +02:00
excuse . missing_build_on_arch ( arch )
excuse . addhtml ( text )
2016-11-01 11:59:11 +02:00
# if the source package has no binaries, set is_valid to False to block the update
2016-10-20 09:00:14 +02:00
if not source_u . binaries :
excuse . addhtml ( " %s has no binaries on any arch " % src )
excuse . addreason ( " no-binaries " )
2017-02-08 19:01:41 +00:00
excuse . policy_verdict = PolicyVerdict . REJECTED_PERMANENTLY
2016-10-20 09:00:14 +02:00
2006-06-24 17:49:43 +00:00
# if the suite is unstable, then we have to check the urgency and the minimum days of
# permanence in unstable before updating testing; if the source package is too young,
2016-11-01 11:59:11 +02:00
# the check fails and we set is_valid to False to block the update; consider
2008-01-12 18:41:03 +00:00
# the age-days hint, if specified for the package
2017-02-08 19:01:41 +00:00
policy_verdict = excuse . policy_verdict
2016-03-25 15:23:34 +00:00
policy_info = excuse . policy_info
for policy in self . policies :
if suite in policy . applicable_suites :
2016-10-20 09:08:49 +02:00
v = policy . apply_policy ( policy_info , suite , src , source_t , source_u , excuse )
2016-03-25 15:23:34 +00:00
if v . value > policy_verdict . value :
policy_verdict = v
2017-02-08 19:01:41 +00:00
excuse . policy_verdict = policy_verdict
2016-03-25 15:23:34 +00:00
2016-03-23 15:21:27 +00:00
if suite in ( ' pu ' , ' tpu ' ) and source_t :
2011-07-27 09:30:00 +00:00
# o-o-d(ish) checks for (t-)p-u
2016-03-23 15:21:27 +00:00
# This only makes sense if the package is actually in testing.
2010-09-20 22:03:43 +00:00
for arch in self . options . architectures :
2013-02-06 19:05:16 +00:00
# if the package in testing has no binaries on this
# architecture, it can't be out-of-date
2016-09-25 05:45:36 +00:00
if not any ( x for x in source_t . binaries
2016-05-16 08:19:26 +00:00
if x . architecture == arch and all_binaries [ x ] . architecture != ' all ' ) :
2013-02-06 19:05:16 +00:00
continue
2016-03-26 08:16:39 +00:00
2013-02-06 19:05:16 +00:00
# if the (t-)p-u package has produced any binaries on
# this architecture then we assume it's ok. this allows for
# uploads to (t-)p-u which intentionally drop binary
# packages
2013-02-06 19:09:04 +00:00
if any ( x for x in self . binaries [ suite ] [ arch ] [ 0 ] . values ( ) \
2016-10-23 10:30:01 +00:00
if x . source == src and x . source_version == source_u . version and \
2016-04-06 20:49:40 +00:00
x . architecture != ' all ' ) :
2010-09-20 22:03:43 +00:00
continue
2011-07-27 09:30:00 +00:00
if suite == ' tpu ' :
base = ' testing '
else :
base = ' stable '
2016-09-25 05:45:36 +00:00
text = " Not yet built on <a href= \" https://buildd.debian.org/status/logs.php?arch= %s &pkg= %s &ver= %s &suite= %s \" target= \" _blank \" > %s </a> (relative to testing) " % ( quote ( arch ) , quote ( src ) , quote ( source_u . version ) , base , arch )
2010-09-20 22:03:43 +00:00
2013-09-09 12:37:52 +01:00
if arch in self . options . outofsync_arches :
2010-09-20 22:03:43 +00:00
text = text + " (but %s isn ' t keeping up, so never mind) " % ( arch )
2016-03-28 09:37:02 +00:00
excuse . missing_build_on_ood_arch ( arch )
2010-09-20 22:03:43 +00:00
else :
2017-02-08 20:02:16 +00:00
excuse . policy_verdict = PolicyVerdict . REJECTED_CANNOT_DETERMINE_IF_PERMANENT
2016-03-28 09:37:02 +00:00
excuse . missing_build_on_arch ( arch )
2010-09-20 22:03:43 +00:00
excuse . addhtml ( text )
2006-06-24 17:49:43 +00:00
# check if there is a `force' hint for this package, which allows it to go in even if it is not updateable
2016-09-25 05:45:36 +00:00
forces = self . hints . search ( ' force ' , package = src , version = source_u . version )
2011-09-04 16:41:33 +00:00
if forces :
2017-02-08 18:14:29 +00:00
# force() updates the final verdict for us
changed_state = excuse . force ( )
if changed_state :
2016-11-16 21:08:14 +00:00
excuse . addhtml ( " Should ignore, but forced by %s " % ( forces [ 0 ] . user ) )
2006-06-17 13:45:56 +00:00
2016-03-25 09:18:35 +00:00
self . excuses [ excuse . name ] = excuse
2016-11-01 11:59:11 +02:00
return excuse . is_valid
2006-06-17 13:45:56 +00:00
2016-03-23 09:55:07 +00:00
def write_excuses ( self ) :
2006-06-24 17:49:43 +00:00
""" Produce and write the update excuses
This method handles the update excuses generation : the packages are
2010-02-28 12:58:03 +00:00
looked at to determine whether they are valid candidates . For the details
2006-06-24 17:49:43 +00:00
of this procedure , please refer to the module docstring .
"""
2006-06-17 13:45:56 +00:00
2016-03-25 07:43:46 +00:00
self . log ( " Update Excuses generation started " , type = " I " )
2006-06-25 13:51:53 +00:00
2006-07-26 19:48:54 +00:00
# list of local methods and variables (for better performance)
sources = self . sources
architectures = self . options . architectures
should_remove_source = self . should_remove_source
should_upgrade_srcarch = self . should_upgrade_srcarch
should_upgrade_src = self . should_upgrade_src
2016-09-20 14:03:02 -07:00
unstable = sources [ ' unstable ' ]
testing = sources [ ' testing ' ]
2006-06-24 17:49:43 +00:00
# this list will contain the packages which are valid candidates;
# if a package is going to be removed, it will have a "-" prefix
2017-10-30 20:58:12 +00:00
upgrade_me = set ( )
upgrade_me_add = upgrade_me . add # Every . in a loop slows it down
2006-06-17 13:45:56 +00:00
2016-03-25 09:18:35 +00:00
excuses = self . excuses = { }
2013-03-16 16:31:04 +00:00
2006-06-24 17:49:43 +00:00
# for every source package in testing, check if it should be removed
2016-09-20 14:03:02 -07:00
for pkg in testing :
2006-07-26 19:48:54 +00:00
if should_remove_source ( pkg ) :
2017-10-30 20:58:12 +00:00
upgrade_me_add ( " - " + pkg )
2006-06-17 13:45:56 +00:00
2006-06-24 17:49:43 +00:00
# for every source package in unstable check if it should be upgraded
2016-09-20 14:03:02 -07:00
for pkg in unstable :
2016-09-25 05:45:36 +00:00
if unstable [ pkg ] . is_fakesrc : continue
2006-06-24 17:49:43 +00:00
# if the source package is already present in testing,
# check if it should be upgraded for every binary package
2016-09-25 05:45:36 +00:00
if pkg in testing and not testing [ pkg ] . is_fakesrc :
2006-07-26 19:48:54 +00:00
for arch in architectures :
if should_upgrade_srcarch ( pkg , arch , ' unstable ' ) :
2017-10-30 20:58:12 +00:00
upgrade_me_add ( " %s / %s " % ( pkg , arch ) )
2006-06-17 13:45:56 +00:00
2006-06-24 17:49:43 +00:00
# check if the source package should be upgraded
2006-07-26 19:48:54 +00:00
if should_upgrade_src ( pkg , ' unstable ' ) :
2017-10-30 20:58:12 +00:00
upgrade_me_add ( pkg )
2006-06-17 13:45:56 +00:00
2011-07-27 09:30:00 +00:00
# for every source package in *-proposed-updates, check if it should be upgraded
for suite in [ ' pu ' , ' tpu ' ] :
for pkg in sources [ suite ] :
# if the source package is already present in testing,
# check if it should be upgraded for every binary package
2016-09-20 14:03:02 -07:00
if pkg in testing :
2011-07-27 09:30:00 +00:00
for arch in architectures :
if should_upgrade_srcarch ( pkg , arch , suite ) :
2017-10-30 20:58:12 +00:00
upgrade_me_add ( " %s / %s _ %s " % ( pkg , arch , suite ) )
2006-06-17 13:45:56 +00:00
2011-07-27 09:30:00 +00:00
# check if the source package should be upgraded
if should_upgrade_src ( pkg , suite ) :
2017-10-30 20:58:12 +00:00
upgrade_me_add ( " %s _ %s " % ( pkg , suite ) )
2006-06-17 13:45:56 +00:00
2006-06-24 17:49:43 +00:00
# process the `remove' hints, if the given package is not yet in upgrade_me
2016-07-05 22:44:56 +00:00
for hint in self . hints [ ' remove ' ] :
src = hint . package
2006-06-17 13:45:56 +00:00
if src in upgrade_me : continue
if ( " - " + src ) in upgrade_me : continue
2016-09-20 14:03:02 -07:00
if src not in testing : continue
2006-06-17 13:45:56 +00:00
2010-02-28 12:58:03 +00:00
# check if the version specified in the hint is the same as the considered package
2016-09-25 05:45:36 +00:00
tsrcv = testing [ src ] . version
2016-07-05 22:44:56 +00:00
if tsrcv != hint . version :
2016-03-23 09:55:07 +00:00
continue
2006-06-17 13:45:56 +00:00
2006-06-24 17:49:43 +00:00
# add the removal of the package to upgrade_me and build a new excuse
2017-10-30 20:58:12 +00:00
upgrade_me_add ( " - %s " % ( src ) )
2006-06-17 13:45:56 +00:00
excuse = Excuse ( " - %s " % ( src ) )
excuse . set_vers ( tsrcv , None )
2016-07-05 22:44:56 +00:00
excuse . addhtml ( " Removal request by %s " % ( hint . user ) )
2006-06-17 13:45:56 +00:00
excuse . addhtml ( " Package is broken, will try to remove " )
2016-07-05 22:44:56 +00:00
excuse . add_hint ( hint )
2017-02-08 19:01:41 +00:00
# Using "PASS" here as "Created by a hint" != "accepted due to hint". In a future
# where there might be policy checks on removals, it would make sense to distinguish
# those two states. Not sure that future will ever be.
excuse . policy_verdict = PolicyVerdict . PASS
2016-03-25 09:18:35 +00:00
excuses [ excuse . name ] = excuse
2006-06-17 13:45:56 +00:00
2006-06-24 17:49:43 +00:00
# extract the not considered packages, which are in the excuses but not in upgrade_me
2017-10-30 20:58:12 +00:00
unconsidered = { ename for ename in excuses if ename not in upgrade_me }
2006-06-17 13:45:56 +00:00
2006-06-24 17:49:43 +00:00
# invalidate impossible excuses
2016-03-25 09:18:35 +00:00
for e in excuses . values ( ) :
2011-08-02 18:46:17 +00:00
# parts[0] == package name
# parts[1] == optional architecture
parts = e . name . split ( ' / ' )
2006-06-17 13:45:56 +00:00
for d in e . deps :
2011-08-02 18:46:17 +00:00
ok = False
2011-08-03 20:53:38 +00:00
# source -> source dependency; both packages must have
# valid excuses
2011-08-02 18:46:17 +00:00
if d in upgrade_me or d in unconsidered :
ok = True
# if the excuse is for a binNMU, also consider d/$arch as a
# valid excuse
2011-08-03 20:53:38 +00:00
elif len ( parts ) == 2 :
2011-08-02 18:46:17 +00:00
bd = ' %s / %s ' % ( d , parts [ 1 ] )
if bd in upgrade_me or bd in unconsidered :
ok = True
2011-08-03 20:53:38 +00:00
# if the excuse is for a source package, check each of the
# architectures on which the excuse lists a dependency on d,
# and consider the excuse valid if it is possible on each
# architecture
else :
arch_ok = True
for arch in e . deps [ d ] :
bd = ' %s / %s ' % ( d , arch )
if bd not in upgrade_me and bd not in unconsidered :
arch_ok = False
break
if arch_ok :
ok = True
2011-08-02 18:46:17 +00:00
if not ok :
2011-08-02 17:36:03 +00:00
e . addhtml ( " Impossible dependency: %s -> %s " % ( e . name , d ) )
2013-11-15 15:01:20 +00:00
e . addreason ( " depends " )
2016-11-16 07:23:58 +00:00
invalidate_excuses ( excuses , upgrade_me , unconsidered )
2006-06-17 13:45:56 +00:00
2006-07-26 22:12:09 +00:00
# sort the list of candidates
2013-09-07 18:19:17 +00:00
self . upgrade_me = sorted ( make_migrationitem ( x , self . sources ) for x in upgrade_me )
2006-06-25 13:51:53 +00:00
2006-06-24 17:49:43 +00:00
# write excuses to the output file
2008-01-15 16:09:51 +00:00
if not self . options . dry_run :
2016-03-25 07:43:46 +00:00
self . log ( " > Writing Excuses to %s " % self . options . excuses_output , type = " I " )
2016-03-25 09:18:35 +00:00
sorted_excuses = sorted ( excuses . values ( ) , key = lambda x : x . sortkey ( ) )
write_excuses ( sorted_excuses , self . options . excuses_output ,
2013-12-11 20:55:17 +01:00
output_format = " legacy-html " )
2013-08-26 20:12:46 +00:00
if hasattr ( self . options , ' excuses_yaml_output ' ) :
2016-03-25 07:43:46 +00:00
self . log ( " > Writing YAML Excuses to %s " % self . options . excuses_yaml_output , type = " I " )
2016-03-25 09:18:35 +00:00
write_excuses ( sorted_excuses , self . options . excuses_yaml_output ,
output_format = " yaml " )
2013-08-26 20:12:46 +00:00
2016-03-25 07:43:46 +00:00
self . log ( " Update Excuses generation completed " , type = " I " )
2006-06-25 13:51:53 +00:00
2006-07-20 19:00:47 +00:00
# Upgrade run
# -----------
2016-11-16 07:38:29 +00:00
def get_nuninst ( self , build = False ) :
2006-07-30 11:15:27 +00:00
""" Return the uninstallability statistic for all the architectures
To calculate the uninstallability counters , the method checks the
installability of all the packages for all the architectures , and
2010-02-28 12:58:03 +00:00
tracks dependencies in a recursive way . The architecture
independent packages are checked only for the ` nobreakall `
2006-07-30 11:15:27 +00:00
architectures .
It returns a dictionary with the architectures as keys and the list
of uninstallable packages as values .
"""
2006-08-07 14:38:13 +00:00
# if we are not asked to build the nuninst, read it from the cache
if not build :
2013-01-12 17:17:41 +01:00
return read_nuninst ( self . options . noninst_status ,
self . options . architectures )
2006-08-07 14:38:13 +00:00
2016-11-16 07:38:29 +00:00
return compile_nuninst ( self . binaries [ ' testing ' ] ,
self . _inst_tester ,
self . options . architectures ,
self . options . nobreakall_arches )
2006-07-20 19:00:47 +00:00
2006-07-22 09:54:27 +00:00
def eval_nuninst ( self , nuninst , original = None ) :
2006-07-30 11:15:27 +00:00
""" Return a string which represents the uninstallability counters
This method returns a string which represents the uninstallability
counters reading the uninstallability statistics ` nuninst ` and , if
present , merging the results with the ` original ` one .
An example of the output string is :
1 + 2 : i - 0 : a - 0 : a - 0 : h - 0 : i - 1 : m - 0 : m - 0 : p - 0 : a - 0 : m - 0 : s - 2 : s - 0
where the first part is the number of broken packages in non - break
architectures + the total number of broken packages for all the
architectures .
"""
2006-07-20 19:00:47 +00:00
res = [ ]
total = 0
totalbreak = 0
for arch in self . options . architectures :
2006-07-26 19:26:38 +00:00
if arch in nuninst :
2006-07-25 20:32:23 +00:00
n = len ( nuninst [ arch ] )
2006-07-26 19:26:38 +00:00
elif original and arch in original :
2006-07-25 20:32:23 +00:00
n = len ( original [ arch ] )
else : continue
2015-05-27 22:34:42 +02:00
if arch in self . options . break_arches :
2006-07-25 20:32:23 +00:00
totalbreak = totalbreak + n
else :
total = total + n
res . append ( " %s - %d " % ( arch [ 0 ] , n ) )
2006-07-20 19:00:47 +00:00
return " %d + %d : %s " % ( total , totalbreak , " : " . join ( res ) )
2014-06-08 16:07:09 +02:00
def _compute_groups ( self , source_name , suite , migration_architecture ,
2016-03-23 16:26:04 +00:00
is_removal ,
2014-01-11 07:51:19 +01:00
allow_smooth_updates = True ,
removals = frozenset ( ) ) :
2014-06-08 16:07:09 +02:00
""" Compute the groups of binaries being migrated by item
2013-07-09 17:18:46 +02:00
2014-06-08 16:07:09 +02:00
This method will compute the binaries that will be added ,
replaced in testing and which of them are smooth updatable .
2013-07-09 17:18:46 +02:00
Parameters :
* " source_name " is the name of the source package , whose
binaries are migrating .
* " suite " is the suite from which the binaries are migrating .
2014-06-08 16:07:09 +02:00
[ Same as item . suite , where available ]
* " migration_architecture " is the architecture determines
architecture of the migrating binaries ( can be " source " for
a " source " - migration , meaning all binaries regardless of
architecture ) . [ Same as item . architecture , where available ]
* " is_removal " is a boolean determining if this is a removal
or not [ Same as item . is_removal , where available ]
2014-01-11 07:51:19 +01:00
* " allow_smooth_updates " is a boolean determing whether smooth -
updates are permitted in this migration . When set to False ,
the " smoothbins " return value will always be the empty set .
Any value that would have been there will now be in " rms "
instead . ( defaults : True )
* " removals " is a set of binaries that is assumed to be
removed at the same time as this migration ( e . g . in the same
" easy " - hint ) . This may affect what if some binaries are
smooth updated or not . ( defaults : empty - set )
- Binaries must be given as ( " package-name " , " version " ,
" architecture " ) tuples .
2014-06-08 16:07:09 +02:00
Returns a tuple ( adds , rms , smoothbins ) . " adds " is a set of
binaries that will updated in or appear after the migration .
" rms " is a set of binaries that are not smooth - updatable ( or
binaries that could be , but there is no reason to let them be
smooth updated ) . " smoothbins " is set of binaries that are to
be smooth - updated .
Each " binary " in " adds " , " rms " and " smoothbins " will be a
tuple of ( " package-name " , " version " , " architecture " ) and are
thus tuples suitable for passing on to the
InstallabilityTester .
2013-07-09 17:18:46 +02:00
2014-06-08 16:07:09 +02:00
Unlike doop_source , this will not modify any data structure .
"""
# local copies for better performances
sources = self . sources
2016-11-12 14:08:40 +00:00
binaries_s = self . binaries [ suite ]
2013-07-09 17:14:40 +02:00
binaries_t = self . binaries [ ' testing ' ]
2015-09-09 20:30:09 +02:00
inst_tester = self . _inst_tester
2014-06-08 16:07:09 +02:00
adds = set ( )
rms = set ( )
2015-09-09 20:30:09 +02:00
smoothbins = set ( )
2014-06-08 16:07:09 +02:00
# remove all binary packages (if the source already exists)
if migration_architecture == ' source ' or not is_removal :
if source_name in sources [ ' testing ' ] :
source_data = sources [ ' testing ' ] [ source_name ]
bins = [ ]
2015-09-09 20:30:09 +02:00
check = set ( )
2014-06-08 16:07:09 +02:00
# remove all the binaries
# first, build a list of eligible binaries
2016-09-25 05:45:36 +00:00
for pkg_id in source_data . binaries :
2016-03-23 15:10:08 +00:00
binary , _ , parch = pkg_id
2014-06-08 16:07:09 +02:00
if ( migration_architecture != ' source '
and parch != migration_architecture ) :
continue
2016-03-22 20:37:48 +00:00
# Work around #815995
if migration_architecture == ' source ' and is_removal and binary not in binaries_t [ parch ] [ 0 ] :
continue
2016-03-23 16:26:04 +00:00
# Do not include hijacked binaries
2016-04-06 20:49:40 +00:00
if binaries_t [ parch ] [ 0 ] [ binary ] . source != source_name :
2014-06-08 16:07:09 +02:00
continue
2016-03-23 15:10:08 +00:00
bins . append ( pkg_id )
2014-06-08 16:07:09 +02:00
2015-09-09 20:30:09 +02:00
for pkg_id in bins :
binary , _ , parch = pkg_id
2014-06-08 16:07:09 +02:00
# if a smooth update is possible for the package, skip it
2014-01-11 07:51:19 +01:00
if allow_smooth_updates and suite == ' unstable ' and \
2016-11-12 14:08:40 +00:00
binary not in binaries_s [ parch ] [ 0 ] and \
2014-06-08 16:07:09 +02:00
( ' ALL ' in self . options . smooth_updates or \
2016-04-06 20:49:40 +00:00
binaries_t [ parch ] [ 0 ] [ binary ] . section in self . options . smooth_updates ) :
2014-06-08 16:07:09 +02:00
# if the package has reverse-dependencies which are
# built from other sources, it's a valid candidate for
# a smooth update. if not, it may still be a valid
# candidate if one if its r-deps is itself a candidate,
# so note it for checking later
2015-09-09 20:30:09 +02:00
rdeps = set ( inst_tester . reverse_dependencies_of ( pkg_id ) )
# We ignore all binaries listed in "removals" as we
# assume they will leave at the same time as the
# given package.
rdeps . difference_update ( removals , bins )
smooth_update_it = False
if inst_tester . any_of_these_are_in_testing ( rdeps ) :
combined = set ( smoothbins )
combined . add ( pkg_id )
for rdep in rdeps :
for dep_clause in inst_tester . dependencies_of ( rdep ) :
if dep_clause < = combined :
smooth_update_it = True
2014-06-08 16:07:09 +02:00
break
2015-09-09 20:30:09 +02:00
if smooth_update_it :
smoothbins = combined
2014-06-08 16:07:09 +02:00
else :
2015-09-09 20:30:09 +02:00
check . add ( pkg_id )
2014-06-08 16:07:09 +02:00
# check whether we should perform a smooth update for
# packages which are candidates but do not have r-deps
# outside of the current source
2015-09-09 20:30:09 +02:00
while 1 :
found_any = False
for pkg_id in check :
rdeps = inst_tester . reverse_dependencies_of ( pkg_id )
if not rdeps . isdisjoint ( smoothbins ) :
smoothbins . add ( pkg_id )
found_any = True
if not found_any :
break
check = [ x for x in check if x not in smoothbins ]
2014-06-08 16:07:09 +02:00
# remove all the binaries which aren't being smooth updated
2015-09-09 20:30:09 +02:00
for pkg_id in ( pkg_id for pkg_id in bins if pkg_id not in smoothbins ) :
binary , version , parch = pkg_id
2014-09-21 15:50:50 +00:00
# if this is a binary migration from *pu, only the arch:any
# packages will be present. ideally dak would also populate
# the arch-indep packages, but as that's not the case we
# must keep them around; they will not be re-added by the
# migration so will end up missing from testing
if migration_architecture != ' source ' and \
suite != ' unstable ' and \
2016-04-06 20:49:40 +00:00
binaries_t [ parch ] [ 0 ] [ binary ] . architecture == ' all ' :
2014-09-21 15:50:50 +00:00
continue
2015-04-26 08:13:20 +10:00
else :
2015-09-09 20:30:09 +02:00
rms . add ( pkg_id )
2014-06-08 16:07:09 +02:00
# single binary removal; used for clearing up after smooth
# updates but not supported as a manual hint
2016-11-13 22:40:06 +00:00
else :
assert source_name in binaries_t [ migration_architecture ] [ 0 ]
2016-11-15 22:50:01 +00:00
pkg_id = binaries_t [ migration_architecture ] [ 0 ] [ source_name ] . pkg_id
rms . add ( pkg_id )
2014-06-08 16:07:09 +02:00
# add the new binary packages (if we are not removing)
if not is_removal :
source_data = sources [ suite ] [ source_name ]
2016-09-25 05:45:36 +00:00
for pkg_id in source_data . binaries :
2016-03-23 15:10:08 +00:00
binary , _ , parch = pkg_id
2014-06-08 16:07:09 +02:00
if migration_architecture not in [ ' source ' , parch ] :
2013-07-09 17:14:40 +02:00
continue
2015-04-26 08:13:20 +10:00
2016-11-12 14:08:40 +00:00
if binaries_s [ parch ] [ 0 ] [ binary ] . source != source_name :
2015-04-26 08:13:20 +10:00
# This binary package has been hijacked by some other source.
# So don't add it as part of this update.
#
# Also, if this isn't a source update, don't remove
# the package that's been hijacked if it's present.
if migration_architecture != ' source ' :
for rm_b , rm_v , rm_p in list ( rms ) :
if ( rm_b , rm_p ) == ( binary , parch ) :
rms . remove ( ( rm_b , rm_v , rm_p ) )
continue
2016-03-23 20:07:25 +00:00
# Don't add the binary if it is old cruft that is no longer in testing
2013-09-09 12:37:52 +01:00
if ( parch not in self . options . outofsync_arches and
2016-11-12 14:08:40 +00:00
source_data . version != binaries_s [ parch ] [ 0 ] [ binary ] . source_version and
2016-03-23 20:07:25 +00:00
binary not in binaries_t [ parch ] [ 0 ] ) :
continue
2016-03-23 15:10:08 +00:00
adds . add ( pkg_id )
2013-07-09 17:14:40 +02:00
2015-09-09 20:30:09 +02:00
return ( adds , rms , smoothbins )
2013-07-09 17:14:40 +02:00
2014-08-04 22:32:15 +02:00
def doop_source ( self , item , hint_undo = None , removals = frozenset ( ) ) :
2006-07-30 11:15:27 +00:00
""" Apply a change to the testing distribution as requested by `pkg`
2011-05-30 19:15:54 +00:00
An optional list of undo actions related to packages processed earlier
in a hint may be passed in ` hint_undo ` .
2014-01-11 07:51:19 +01:00
An optional set of binaries may be passed in " removals " . Binaries listed
2016-01-17 19:37:02 +00:00
in this set will be assumed to be removed at the same time as the " item "
2014-01-11 07:51:19 +01:00
will migrate . This may change what binaries will be smooth - updated .
- Binaries in this set must be ( " package-name " , " version " , " architecture " )
tuples .
2011-08-31 19:55:22 +00:00
This method applies the changes required by the action ` item ` tracking
2006-07-30 11:15:27 +00:00
them so it will be possible to revert them .
2006-07-20 21:43:24 +00:00
2014-07-24 23:00:56 +02:00
The method returns a tuple containing a set of packages
affected by the change ( as ( name , arch ) - tuples ) and the
dictionary undo which can be used to rollback the changes .
2006-07-30 11:15:27 +00:00
"""
2006-07-29 19:13:23 +00:00
undo = { ' binaries ' : { } , ' sources ' : { } , ' virtual ' : { } , ' nvirtual ' : [ ] }
2006-07-20 21:43:24 +00:00
2016-04-08 20:35:02 +00:00
affected_pos = set ( )
affected_remain = set ( )
2006-07-20 21:43:24 +00:00
2015-04-27 19:14:08 +00:00
# local copies for better performance
2006-07-29 20:29:19 +00:00
sources = self . sources
2014-07-24 23:52:50 +02:00
packages_t = self . binaries [ ' testing ' ]
2014-07-25 22:02:17 +02:00
inst_tester = self . _inst_tester
eqv_set = set ( )
2015-04-26 08:13:20 +10:00
updates , rms , _ = self . _compute_groups ( item . package ,
item . suite ,
item . architecture ,
item . is_removal ,
removals = removals )
2016-11-15 22:50:27 +00:00
# Handle the source package
if item . architecture == ' source ' :
2011-08-31 19:55:22 +00:00
if item . package in sources [ ' testing ' ] :
source = sources [ ' testing ' ] [ item . package ]
2016-11-15 22:50:27 +00:00
undo [ ' sources ' ] [ item . package ] = source
del sources [ ' testing ' ] [ item . package ]
2006-07-22 17:06:55 +00:00
else :
2006-07-30 11:15:27 +00:00
# the package didn't exist, so we mark it as to-be-removed in case of undo
2011-08-31 19:55:22 +00:00
undo [ ' sources ' ] [ ' - ' + item . package ] = True
2006-07-22 17:06:55 +00:00
2016-11-15 22:50:27 +00:00
# add/update the source package
if not item . is_removal :
sources [ ' testing ' ] [ item . package ] = sources [ item . suite ] [ item . package ]
2011-12-13 18:52:01 +01:00
2016-11-15 22:50:27 +00:00
# If we are removing *and* updating packages, then check for eqv. packages
if rms and updates :
eqv_table = { }
for rm_pkg_id in rms :
binary , _ , parch = rm_pkg_id
key = ( binary , parch )
eqv_table [ key ] = rm_pkg_id
for new_pkg_id in updates :
binary , _ , parch = new_pkg_id
key = ( binary , parch )
old_pkg_id = eqv_table . get ( key )
if old_pkg_id is not None :
if inst_tester . are_equivalent ( new_pkg_id , old_pkg_id ) :
eqv_set . add ( key )
# remove all the binaries which aren't being smooth updated
for rm_pkg_id in rms :
binary , version , parch = rm_pkg_id
p = ( binary , parch )
binaries_t_a , provides_t_a = packages_t [ parch ]
pkey = ( binary , parch )
pkg_data = binaries_t_a [ binary ]
# save the old binary for undo
undo [ ' binaries ' ] [ p ] = rm_pkg_id
if pkey not in eqv_set :
# all the reverse dependencies are affected by
# the change
affected_pos . update ( inst_tester . reverse_dependencies_of ( rm_pkg_id ) )
affected_remain . update ( inst_tester . negative_dependencies_of ( rm_pkg_id ) )
# remove the provided virtual packages
for provided_pkg , prov_version , _ in pkg_data . provides :
key = ( provided_pkg , parch )
if key not in undo [ ' virtual ' ] :
undo [ ' virtual ' ] [ key ] = provides_t_a [ provided_pkg ] . copy ( )
provides_t_a [ provided_pkg ] . remove ( ( binary , prov_version ) )
if not provides_t_a [ provided_pkg ] :
del provides_t_a [ provided_pkg ]
# finally, remove the binary package
del binaries_t_a [ binary ]
inst_tester . remove_testing_binary ( rm_pkg_id )
# Add/Update binary packages in testing
if updates :
2014-07-24 23:52:50 +02:00
packages_s = self . binaries [ item . suite ]
2015-04-26 08:13:20 +10:00
2015-09-09 17:39:09 +02:00
for updated_pkg_id in updates :
binary , new_version , parch = updated_pkg_id
2006-08-06 18:42:58 +00:00
key = ( binary , parch )
2014-07-24 23:52:50 +02:00
binaries_t_a , provides_t_a = packages_t [ parch ]
2014-07-25 22:02:17 +02:00
equivalent_replacement = key in eqv_set
2006-07-30 11:15:27 +00:00
# obviously, added/modified packages are affected
2016-04-08 20:35:02 +00:00
if not equivalent_replacement :
affected_pos . add ( updated_pkg_id )
2012-12-25 22:16:05 +00:00
# if the binary already exists in testing, it is currently
# built by another source package. we therefore remove the
# version built by the other source package, after marking
# all of its reverse dependencies as affected
2014-07-24 23:52:50 +02:00
if binary in binaries_t_a :
old_pkg_data = binaries_t_a [ binary ]
2016-05-16 06:28:36 +00:00
old_pkg_id = old_pkg_data . pkg_id
2016-01-17 10:56:47 +00:00
# save the old binary package
2016-03-26 11:37:55 +00:00
undo [ ' binaries ' ] [ key ] = old_pkg_id
2014-07-25 22:02:17 +02:00
if not equivalent_replacement :
2015-09-09 17:39:09 +02:00
# all the reverse conflicts
2016-04-08 20:35:02 +00:00
affected_pos . update ( inst_tester . reverse_dependencies_of ( old_pkg_id ) )
affected_remain . update ( inst_tester . negative_dependencies_of ( old_pkg_id ) )
2015-09-09 20:44:13 +02:00
inst_tester . remove_testing_binary ( old_pkg_id )
2014-08-04 22:32:15 +02:00
elif hint_undo :
2012-12-25 22:16:05 +00:00
# the binary isn't in testing, but it may have been at
# the start of the current hint and have been removed
# by an earlier migration. if that's the case then we
# will have a record of the older instance of the binary
# in the undo information. we can use that to ensure
# that the reverse dependencies of the older binary
# package are also checked.
2011-05-30 19:15:54 +00:00
# reverse dependencies built from this source can be
# ignored as their reverse trees are already handled
# by this function
2011-08-31 19:55:22 +00:00
for ( tundo , tpkg ) in hint_undo :
2016-03-26 11:37:55 +00:00
if key in tundo [ ' binaries ' ] :
tpkg_id = tundo [ ' binaries ' ] [ key ]
2016-04-08 20:35:02 +00:00
affected_pos . update ( inst_tester . reverse_dependencies_of ( tpkg_id ) )
2014-07-25 22:02:17 +02:00
2014-07-24 23:52:50 +02:00
# add/update the binary package from the source suite
new_pkg_data = packages_s [ parch ] [ 0 ] [ binary ]
binaries_t_a [ binary ] = new_pkg_data
2015-09-09 20:44:13 +02:00
inst_tester . add_testing_binary ( updated_pkg_id )
2006-07-30 11:15:27 +00:00
# register new provided packages
2016-11-13 22:39:57 +00:00
for provided_pkg , prov_version , _ in new_pkg_data . provides :
key = ( provided_pkg , parch )
if provided_pkg not in provides_t_a :
2006-07-30 11:15:27 +00:00
undo [ ' nvirtual ' ] . append ( key )
2016-11-13 22:39:57 +00:00
provides_t_a [ provided_pkg ] = set ( )
2006-07-30 11:15:27 +00:00
elif key not in undo [ ' virtual ' ] :
2016-11-13 22:39:57 +00:00
undo [ ' virtual ' ] [ key ] = provides_t_a [ provided_pkg ] . copy ( )
provides_t_a [ provided_pkg ] . add ( ( binary , prov_version ) )
2014-07-25 22:02:17 +02:00
if not equivalent_replacement :
# all the reverse dependencies are affected by the change
2016-04-08 20:35:02 +00:00
affected_pos . add ( updated_pkg_id )
affected_remain . update ( inst_tester . negative_dependencies_of ( updated_pkg_id ) )
2006-07-30 11:15:27 +00:00
2015-09-09 17:39:09 +02:00
# Also include the transitive rdeps of the packages found so far
2016-04-08 20:35:02 +00:00
compute_reverse_tree ( inst_tester , affected_pos )
compute_reverse_tree ( inst_tester , affected_remain )
2006-07-30 11:15:27 +00:00
# return the package name, the suite, the list of affected packages and the undo dictionary
2016-04-08 20:35:02 +00:00
return ( affected_pos , affected_remain , undo )
2006-07-20 21:43:24 +00:00
2015-05-27 22:23:03 +02:00
def try_migration ( self , actions , nuninst_now , lundo = None , automatic_revert = True ) :
is_accepted = True
affected_architectures = set ( )
item = actions
packages_t = self . binaries [ ' testing ' ]
nobreakall_arches = self . options . nobreakall_arches
new_arches = self . options . new_arches
break_arches = self . options . break_arches
arch = None
2015-05-28 08:29:50 +02:00
if len ( actions ) == 1 :
item = actions [ 0 ]
# apply the changes
2016-04-08 20:35:02 +00:00
affected_pos , affected_remain , undo = self . doop_source ( item , hint_undo = lundo )
2015-05-28 08:29:50 +02:00
undo_list = [ ( undo , item ) ]
if item . architecture == ' source ' :
affected_architectures = set ( self . options . architectures )
else :
affected_architectures . add ( item . architecture )
else :
undo_list = [ ]
removals = set ( )
2016-04-08 20:35:02 +00:00
affected_pos = set ( )
affected_remain = set ( )
2015-05-28 08:29:50 +02:00
for item in actions :
_ , rms , _ = self . _compute_groups ( item . package , item . suite ,
item . architecture ,
item . is_removal ,
allow_smooth_updates = False )
removals . update ( rms )
affected_architectures . add ( item . architecture )
if ' source ' in affected_architectures :
affected_architectures = set ( self . options . architectures )
for item in actions :
2016-04-08 20:35:02 +00:00
item_affected_pos , item_affected_remain , undo = self . doop_source ( item ,
hint_undo = lundo ,
removals = removals )
affected_pos . update ( item_affected_pos )
affected_remain . update ( item_affected_remain )
2015-05-28 08:29:50 +02:00
undo_list . append ( ( undo , item ) )
2015-05-27 22:23:03 +02:00
2016-04-08 20:35:02 +00:00
# Optimise the test if we may revert directly.
# - The automatic-revert is needed since some callers (notably via hints) may
# accept the outcome of this migration and expect nuninst to be updated.
# (e.g. "force-hint" or "hint")
if automatic_revert :
affected_remain - = affected_pos
else :
affected_remain | = affected_pos
affected_pos = set ( )
2015-05-27 22:23:03 +02:00
# Copy nuninst_comp - we have to deep clone affected
# architectures.
# NB: We do this *after* updating testing as we have to filter out
# removed binaries. Otherwise, uninstallable binaries that were
# removed by the item would still be counted.
2015-05-28 08:29:50 +02:00
2015-05-27 22:23:03 +02:00
nuninst_after = clone_nuninst ( nuninst_now , packages_t , affected_architectures )
2016-05-07 10:44:32 +00:00
must_be_installable = self . constraints [ ' keep-installable ' ]
2015-05-27 22:23:03 +02:00
# check the affected packages on all the architectures
for arch in affected_architectures :
check_archall = arch in nobreakall_arches
2016-04-08 20:35:02 +00:00
check_installability ( self . _inst_tester , packages_t , arch , affected_pos , affected_remain ,
check_archall , nuninst_after )
2015-05-27 22:23:03 +02:00
# if the uninstallability counter is worse than before, break the loop
2016-05-07 10:44:32 +00:00
if automatic_revert :
worse = False
if len ( nuninst_after [ arch ] ) > len ( nuninst_now [ arch ] ) :
worse = True
else :
regression = nuninst_after [ arch ] - nuninst_now [ arch ]
if not regression . isdisjoint ( must_be_installable ) :
worse = True
2015-05-27 22:23:03 +02:00
# ... except for a few special cases
2016-05-07 10:44:32 +00:00
if worse and ( ( item . architecture != ' source ' and arch not in new_arches ) or
( arch not in break_arches ) ) :
2015-05-27 22:23:03 +02:00
is_accepted = False
break
# check if the action improved the uninstallability counters
if not is_accepted and automatic_revert :
2015-05-28 08:29:50 +02:00
undo_copy = list ( reversed ( undo_list ) )
2016-01-17 10:56:47 +00:00
undo_changes ( undo_copy , self . _inst_tester , self . sources , self . binaries , self . all_binaries )
2015-05-27 22:23:03 +02:00
2015-05-28 08:29:50 +02:00
return ( is_accepted , nuninst_after , undo_list , arch )
2015-05-27 22:23:03 +02:00
2014-07-24 22:38:44 +02:00
def iter_packages ( self , packages , selected , nuninst = None , lundo = None ) :
2006-07-30 11:15:27 +00:00
""" Iter on the list of actions and apply them one-by-one
2010-02-28 12:58:03 +00:00
This method applies the changes from ` packages ` to testing , checking the uninstallability
counters for every action performed . If the action does not improve them , it is reverted .
2006-07-30 11:15:27 +00:00
The method returns the new uninstallability counters and the remaining actions if the
2015-09-07 18:07:37 +02:00
final result is successful , otherwise ( None , [ ] ) .
2006-07-30 11:15:27 +00:00
"""
2015-09-07 18:07:37 +02:00
group_info = { }
2016-04-08 18:02:19 +00:00
rescheduled_packages = packages
maybe_rescheduled_packages = [ ]
2015-09-07 18:07:37 +02:00
2012-05-11 16:31:50 +02:00
for y in sorted ( ( y for y in packages ) , key = attrgetter ( ' uvname ' ) ) :
updates , rms , _ = self . _compute_groups ( y . package , y . suite , y . architecture , y . is_removal )
2015-09-07 18:07:37 +02:00
result = ( y , frozenset ( updates ) , frozenset ( rms ) )
group_info [ y ] = result
2006-08-01 13:33:06 +00:00
2006-08-15 21:23:36 +00:00
if nuninst :
2012-05-11 16:31:50 +02:00
nuninst_orig = nuninst
2006-08-15 21:23:36 +00:00
else :
2012-05-11 16:31:50 +02:00
nuninst_orig = self . nuninst_orig
2015-09-07 18:07:37 +02:00
2012-05-11 16:31:50 +02:00
nuninst_last_accepted = nuninst_orig
2015-09-07 18:07:37 +02:00
self . output_write ( " recur: [] %s %d /0 \n " % ( " , " . join ( x . uvname for x in selected ) , len ( packages ) ) )
2016-04-08 18:02:19 +00:00
while rescheduled_packages :
groups = { group_info [ x ] for x in rescheduled_packages }
2015-09-07 18:07:37 +02:00
worklist = self . _inst_tester . solve_groups ( groups )
2016-04-08 18:02:19 +00:00
rescheduled_packages = [ ]
2015-09-07 18:07:37 +02:00
worklist . reverse ( )
while worklist :
comp = worklist . pop ( )
comp_name = ' ' . join ( item . uvname for item in comp )
self . output_write ( " trying: %s \n " % comp_name )
accepted , nuninst_after , comp_undo , failed_arch = self . try_migration ( comp , nuninst_last_accepted , lundo )
if accepted :
selected . extend ( comp )
if lundo is not None :
lundo . extend ( comp_undo )
self . output_write ( " accepted: %s \n " % comp_name )
self . output_write ( " ori: %s \n " % ( self . eval_nuninst ( nuninst_orig ) ) )
self . output_write ( " pre: %s \n " % ( self . eval_nuninst ( nuninst_last_accepted ) ) )
self . output_write ( " now: %s \n " % ( self . eval_nuninst ( nuninst_after ) ) )
if len ( selected ) < = 20 :
self . output_write ( " all: %s \n " % ( " " . join ( x . uvname for x in selected ) ) )
else :
self . output_write ( " most: ( %d ) .. %s \n " % ( len ( selected ) , " " . join ( x . uvname for x in selected [ - 20 : ] ) ) )
2015-09-20 14:56:22 +02:00
nuninst_last_accepted = nuninst_after
2016-04-08 18:02:19 +00:00
rescheduled_packages . extend ( maybe_rescheduled_packages )
maybe_rescheduled_packages . clear ( )
2012-05-11 16:31:50 +02:00
else :
2015-09-07 18:07:37 +02:00
broken = sorted ( b for b in nuninst_after [ failed_arch ]
if b not in nuninst_last_accepted [ failed_arch ] )
compare_nuninst = None
if any ( item for item in comp if item . architecture != ' source ' ) :
compare_nuninst = nuninst_last_accepted
# NB: try_migration already reverted this for us, so just print the results and move on
2016-04-08 18:02:19 +00:00
self . output_write ( " skipped: %s ( %d , %d , %d ) \n " % ( comp_name , len ( rescheduled_packages ) ,
len ( maybe_rescheduled_packages ) , len ( worklist ) ) )
2015-09-07 18:07:37 +02:00
self . output_write ( " got: %s \n " % ( self . eval_nuninst ( nuninst_after , compare_nuninst ) ) )
self . output_write ( " * %s : %s \n " % ( failed_arch , " , " . join ( broken ) ) )
if len ( comp ) > 1 :
self . output_write ( " - splitting the component into single items and retrying them \n " )
worklist . extend ( [ item ] for item in comp )
else :
2016-04-08 18:02:19 +00:00
maybe_rescheduled_packages . append ( comp [ 0 ] )
2006-08-01 20:27:48 +00:00
2012-07-05 17:35:09 +02:00
self . output_write ( " finish: [ %s ] \n " % " , " . join ( x . uvname for x in selected ) )
2006-08-01 18:59:29 +00:00
self . output_write ( " endloop: %s \n " % ( self . eval_nuninst ( self . nuninst_orig ) ) )
2012-05-11 16:31:50 +02:00
self . output_write ( " now: %s \n " % ( self . eval_nuninst ( nuninst_last_accepted ) ) )
2013-01-12 19:58:15 +01:00
self . output_write ( eval_uninst ( self . options . architectures ,
2012-05-11 16:31:50 +02:00
newly_uninst ( self . nuninst_orig , nuninst_last_accepted ) ) )
2006-08-01 18:59:29 +00:00
self . output_write ( " \n " )
2006-07-22 19:14:44 +00:00
2016-04-08 18:02:19 +00:00
return ( nuninst_last_accepted , maybe_rescheduled_packages )
2006-07-22 19:14:44 +00:00
2014-07-24 22:38:44 +02:00
2012-01-04 14:20:33 +01:00
def do_all ( self , hinttype = None , init = None , actions = None ) :
2006-07-30 11:15:27 +00:00
""" Testing update runner
This method tries to update testing checking the uninstallability
counters before and after the actions to decide if the update was
successful or not .
"""
2006-08-15 15:32:14 +00:00
selected = [ ]
2006-08-06 09:29:32 +00:00
if actions :
upgrade_me = actions [ : ]
else :
upgrade_me = self . upgrade_me [ : ]
2006-08-01 20:27:48 +00:00
nuninst_start = self . nuninst_orig
2006-07-25 20:32:23 +00:00
2006-08-01 18:59:29 +00:00
# these are special parameters for hints processing
force = False
2012-05-22 19:56:11 +00:00
recurse = True
2012-05-14 19:28:03 +02:00
lundo = None
nuninst_end = None
2014-08-02 22:15:54 +02:00
better = True
2014-08-05 22:44:14 +02:00
extra = [ ]
2012-05-14 19:28:03 +02:00
2012-01-04 14:20:33 +01:00
if hinttype == " easy " or hinttype == " force-hint " :
force = hinttype == " force-hint "
2012-05-22 19:56:11 +00:00
recurse = False
2006-08-01 18:59:29 +00:00
# if we have a list of initial packages, check them
if init :
2012-05-14 19:28:03 +02:00
if not force :
lundo = [ ]
2006-08-01 18:59:29 +00:00
for x in init :
2006-08-01 20:27:48 +00:00
if x not in upgrade_me :
2015-05-11 21:57:23 +02:00
self . output_write ( " failed: %s is not a valid candidate (or it already migrated) \n " % ( x . uvname ) )
2006-08-01 18:59:29 +00:00
return None
2006-08-06 09:29:32 +00:00
selected . append ( x )
2006-08-01 20:27:48 +00:00
upgrade_me . remove ( x )
2006-08-01 18:59:29 +00:00
2006-08-01 20:27:48 +00:00
self . output_write ( " start: %s \n " % self . eval_nuninst ( nuninst_start ) )
2006-09-05 08:15:04 +00:00
if not force :
self . output_write ( " orig: %s \n " % self . eval_nuninst ( nuninst_start ) )
2006-08-01 18:59:29 +00:00
2012-05-14 19:28:03 +02:00
2012-05-14 19:11:08 +02:00
if init :
# init => a hint (e.g. "easy") - so do the hint run
2015-05-28 08:29:50 +02:00
( better , nuninst_end , undo_list , _ ) = self . try_migration ( selected ,
self . nuninst_orig ,
lundo = lundo ,
automatic_revert = False )
if force :
# Force implies "unconditionally better"
better = True
if lundo is not None :
lundo . extend ( undo_list )
2014-07-24 22:38:44 +02:00
if recurse :
# Ensure upgrade_me and selected do not overlap, if we
# follow-up with a recurse ("hint"-hint).
upgrade_me = [ x for x in upgrade_me if x not in set ( selected ) ]
2012-05-14 19:11:08 +02:00
2012-05-22 19:56:11 +00:00
if recurse :
# Either the main run or the recursive run of a "hint"-hint.
2012-05-14 19:11:08 +02:00
( nuninst_end , extra ) = self . iter_packages ( upgrade_me , selected , nuninst = nuninst_end , lundo = lundo )
nuninst_end_str = self . eval_nuninst ( nuninst_end )
2012-05-22 19:56:11 +00:00
if not recurse :
2012-05-14 19:11:08 +02:00
# easy or force-hint
2006-09-05 08:15:04 +00:00
if force :
2012-05-14 19:11:08 +02:00
self . output_write ( " orig: %s \n " % nuninst_end_str )
self . output_write ( " easy: %s \n " % nuninst_end_str )
2006-09-05 08:15:04 +00:00
if not force :
2013-01-12 19:58:15 +01:00
self . output_write ( eval_uninst ( self . options . architectures ,
2013-10-30 17:29:11 -07:00
newly_uninst ( nuninst_start , nuninst_end ) ) )
2012-05-14 19:11:08 +02:00
2014-08-02 22:15:54 +02:00
if not force :
2015-05-27 22:34:42 +02:00
break_arches = set ( self . options . break_arches )
2014-08-02 22:31:15 +02:00
if all ( x . architecture in break_arches for x in selected ) :
# If we only migrated items from break-arches, then we
# do not allow any regressions on these architectures.
# This usually only happens with hints
break_arches = set ( )
2016-05-07 10:44:32 +00:00
better = is_nuninst_asgood_generous ( self . constraints ,
self . options . architectures ,
2014-08-02 22:15:54 +02:00
self . nuninst_orig ,
nuninst_end ,
break_arches )
if better :
2012-05-14 19:11:08 +02:00
# Result accepted either by force or by being better than the original result.
2012-05-22 21:16:36 +00:00
if recurse :
2008-04-23 17:22:24 +00:00
self . output_write ( " Apparently successful \n " )
2012-07-05 17:35:09 +02:00
self . output_write ( " final: %s \n " % " , " . join ( sorted ( x . uvname for x in selected ) ) )
2006-08-01 18:59:29 +00:00
self . output_write ( " start: %s \n " % self . eval_nuninst ( nuninst_start ) )
2006-09-05 08:15:04 +00:00
if not force :
self . output_write ( " orig: %s \n " % self . eval_nuninst ( self . nuninst_orig ) )
else :
2012-05-14 19:11:08 +02:00
self . output_write ( " orig: %s \n " % nuninst_end_str )
self . output_write ( " end: %s \n " % nuninst_end_str )
2006-08-01 20:27:48 +00:00
if force :
self . output_write ( " force breaks: \n " )
2013-01-12 19:58:15 +01:00
self . output_write ( eval_uninst ( self . options . architectures ,
2013-10-30 17:29:11 -07:00
newly_uninst ( nuninst_start , nuninst_end ) ) )
2006-08-06 09:29:32 +00:00
self . output_write ( " SUCCESS ( %d / %d ) \n " % ( len ( actions or self . upgrade_me ) , len ( extra ) ) )
2006-08-11 21:12:27 +00:00
self . nuninst_orig = nuninst_end
2014-01-11 08:31:36 +01:00
self . all_selected + = selected
2006-09-05 08:15:04 +00:00
if not actions :
2012-05-22 19:56:11 +00:00
if recurse :
2014-08-05 22:44:14 +02:00
self . upgrade_me = extra
2006-09-05 08:15:04 +00:00
else :
2014-07-24 22:38:44 +02:00
self . upgrade_me = [ x for x in self . upgrade_me if x not in set ( selected ) ]
2006-08-01 20:27:48 +00:00
else :
self . output_write ( " FAILED \n " )
2012-05-14 19:28:03 +02:00
if not lundo : return
2013-06-12 14:55:04 +00:00
lundo . reverse ( )
2006-08-03 11:10:40 +00:00
2016-01-17 10:56:47 +00:00
undo_changes ( lundo , self . _inst_tester , self . sources , self . binaries , self . all_binaries )
2011-12-13 09:58:45 +01:00
2013-10-30 17:29:11 -07:00
self . output_write ( " \n " )
2011-12-13 09:58:45 +01:00
2015-09-07 21:04:02 +02:00
def assert_nuninst_is_correct ( self ) :
2016-03-25 07:43:46 +00:00
self . log ( " > Update complete - Verifying non-installability counters " , type = " I " )
2015-09-07 21:04:02 +02:00
cached_nuninst = self . nuninst_orig
self . _inst_tester . compute_testing_installability ( )
computed_nuninst = self . get_nuninst ( build = True )
2016-11-13 08:49:46 +00:00
if cached_nuninst != computed_nuninst : # pragma: no cover
2016-07-04 11:15:14 +00:00
only_on_break_archs = True
2016-03-25 07:43:46 +00:00
self . log ( " ==================== NUNINST OUT OF SYNC ========================= " , type = " E " )
2015-09-07 21:04:02 +02:00
for arch in self . options . architectures :
expected_nuninst = set ( cached_nuninst [ arch ] )
actual_nuninst = set ( computed_nuninst [ arch ] )
false_negatives = actual_nuninst - expected_nuninst
false_positives = expected_nuninst - actual_nuninst
2016-07-04 11:15:14 +00:00
# Britney does not quite work correctly with
# break/fucked arches, so ignore issues there for now.
2016-07-11 17:55:46 +00:00
if ( false_negatives or false_positives ) and arch not in self . options . break_arches :
2016-07-04 11:15:14 +00:00
only_on_break_archs = False
2015-09-07 21:04:02 +02:00
if false_negatives :
2016-03-25 07:43:46 +00:00
self . log ( " %s - unnoticed nuninst: %s " % ( arch , str ( false_negatives ) ) , type = " E " )
2015-09-07 21:04:02 +02:00
if false_positives :
2016-03-25 07:43:46 +00:00
self . log ( " %s - invalid nuninst: %s " % ( arch , str ( false_positives ) ) , type = " E " )
self . log ( " %s - actual nuninst: %s " % ( arch , str ( actual_nuninst ) ) , type = " I " )
self . log ( " ==================== NUNINST OUT OF SYNC ========================= " , type = " E " )
2016-07-05 23:24:08 +00:00
if not only_on_break_archs :
2016-07-04 11:15:14 +00:00
raise AssertionError ( " NUNINST OUT OF SYNC " )
else :
self . log ( " Nuninst is out of sync on some break arches " ,
type = " W " )
2015-09-07 21:04:02 +02:00
2016-03-25 07:43:46 +00:00
self . log ( " > All non-installability counters are ok " , type = " I " )
2015-09-07 21:04:02 +02:00
2006-07-20 19:00:47 +00:00
def upgrade_testing ( self ) :
""" Upgrade testing using the unstable packages
This method tries to upgrade testing using the packages from unstable .
2006-08-01 18:59:29 +00:00
Before running the do_all method , it tries the easy and force - hint
commands .
2006-07-20 19:00:47 +00:00
"""
2016-03-25 07:43:46 +00:00
self . log ( " Starting the upgrade test " , type = " I " )
2006-08-01 18:59:29 +00:00
self . output_write ( " Generated on: %s \n " % ( time . strftime ( " % Y. % m. %d % H: % M: % S % z " , time . gmtime ( time . time ( ) ) ) ) )
self . output_write ( " Arch order is: %s \n " % " , " . join ( self . options . architectures ) )
2016-03-25 07:43:46 +00:00
self . log ( " > Calculating current uninstallability counters " , type = " I " )
2006-08-01 20:27:48 +00:00
self . nuninst_orig = self . get_nuninst ( )
2011-06-28 19:10:13 +00:00
# nuninst_orig may get updated during the upgrade process
self . nuninst_orig_save = self . get_nuninst ( )
2006-08-01 20:27:48 +00:00
2006-08-16 17:56:38 +00:00
if not self . options . actions :
# process `easy' hints
for x in self . hints [ ' easy ' ] :
2011-09-04 16:41:33 +00:00
self . do_hint ( " easy " , x . user , x . packages )
2006-08-01 18:59:29 +00:00
2006-08-16 17:56:38 +00:00
# process `force-hint' hints
for x in self . hints [ " force-hint " ] :
2011-09-04 16:41:33 +00:00
self . do_hint ( " force-hint " , x . user , x . packages )
2006-07-20 19:00:47 +00:00
2006-08-01 18:59:29 +00:00
# run the first round of the upgrade
2012-01-04 14:20:33 +01:00
# - do separate runs for break arches
2006-08-07 11:06:04 +00:00
allpackages = [ ]
normpackages = self . upgrade_me [ : ]
archpackages = { }
2015-05-27 22:34:42 +02:00
for a in self . options . break_arches :
2011-11-09 22:44:31 +00:00
archpackages [ a ] = [ p for p in normpackages if p . architecture == a ]
2010-11-03 19:05:46 +00:00
normpackages = [ p for p in normpackages if p not in archpackages [ a ] ]
2006-08-07 11:06:04 +00:00
self . upgrade_me = normpackages
self . output_write ( " info: main run \n " )
2006-08-01 20:27:48 +00:00
self . do_all ( )
2006-08-07 11:06:04 +00:00
allpackages + = self . upgrade_me
2015-05-27 22:34:42 +02:00
for a in self . options . break_arches :
2006-08-15 15:32:14 +00:00
backup = self . options . break_arches
2015-05-27 22:34:42 +02:00
self . options . break_arches = " " . join ( x for x in self . options . break_arches if x != a )
2006-08-07 11:06:04 +00:00
self . upgrade_me = archpackages [ a ]
self . output_write ( " info: broken arch run for %s \n " % ( a ) )
self . do_all ( )
allpackages + = self . upgrade_me
2006-08-15 15:32:14 +00:00
self . options . break_arches = backup
2006-08-07 11:06:04 +00:00
self . upgrade_me = allpackages
2006-07-20 19:00:47 +00:00
2006-08-16 17:56:38 +00:00
if self . options . actions :
2011-06-30 17:45:44 +00:00
self . printuninstchange ( )
2006-08-16 17:56:38 +00:00
return
2006-08-15 15:32:14 +00:00
# process `hint' hints
hintcnt = 0
for x in self . hints [ " hint " ] [ : 50 ] :
if hintcnt > 50 :
self . output_write ( " Skipping remaining hints... " )
break
2011-09-04 16:41:33 +00:00
if self . do_hint ( " hint " , x . user , x . packages ) :
2006-08-15 15:32:14 +00:00
hintcnt + = 1
2006-08-04 18:38:52 +00:00
# run the auto hinter
2011-12-24 11:56:36 +01:00
self . auto_hinter ( )
2006-08-04 18:38:52 +00:00
2012-10-22 13:55:11 +01:00
if getattr ( self . options , " remove_obsolete " , " yes " ) == " yes " :
# obsolete source packages
# a package is obsolete if none of the binary packages in testing
# are built by it
self . log ( " > Removing obsolete source packages from testing " , type = " I " )
# local copies for performance
sources = self . sources [ ' testing ' ]
binaries = self . binaries [ ' testing ' ]
used = set ( binaries [ arch ] [ 0 ] [ binary ] . source
for arch in binaries
for binary in binaries [ arch ] [ 0 ]
)
removals = [ MigrationItem ( " - %s / %s " % ( source , sources [ source ] . version ) )
for source in sources if source not in used
]
if removals :
self . output_write ( " Removing obsolete source packages from testing ( %d ): \n " % ( len ( removals ) ) )
self . do_all ( actions = removals )
2015-09-07 21:04:02 +02:00
2006-08-06 09:29:32 +00:00
# smooth updates
2013-09-09 12:37:52 +01:00
removals = old_libraries ( self . sources , self . binaries , self . options . outofsync_arches )
2014-03-09 11:01:12 +01:00
if self . options . smooth_updates :
2016-03-25 07:43:46 +00:00
self . log ( " > Removing old packages left in testing from smooth updates " , type = " I " )
2014-03-09 11:01:12 +01:00
if removals :
2006-08-06 09:29:32 +00:00
self . output_write ( " Removing packages left in testing for smooth updates ( %d ): \n %s " % \
2012-12-28 11:37:53 +01:00
( len ( removals ) , old_libraries_format ( removals ) ) )
2013-09-07 16:59:06 +00:00
self . do_all ( actions = removals )
2013-09-09 12:37:52 +01:00
removals = old_libraries ( self . sources , self . binaries , self . options . outofsync_arches )
2008-04-23 17:25:56 +00:00
else :
2016-03-25 12:33:33 +00:00
self . log ( " > Not removing old packages left in testing from smooth updates (smooth-updates disabled) " ,
type = " I " )
2006-08-06 09:29:32 +00:00
2011-12-24 11:56:36 +01:00
self . output_write ( " List of old libraries in testing ( %d ): \n %s " % \
2012-12-28 11:37:53 +01:00
( len ( removals ) , old_libraries_format ( removals ) ) )
2006-08-06 09:29:32 +00:00
2015-09-07 21:04:02 +02:00
self . assert_nuninst_is_correct ( )
2006-08-06 09:29:32 +00:00
# output files
if not self . options . dry_run :
# re-write control files
if self . options . control_files :
2016-03-25 07:43:46 +00:00
self . log ( " Writing new testing control files to %s " %
2016-10-24 19:36:44 +00:00
self . suite_info [ ' testing ' ] . path )
2014-03-09 10:50:52 +01:00
write_controlfiles ( self . sources , self . binaries ,
2016-10-24 19:36:44 +00:00
' testing ' , self . suite_info [ ' testing ' ] . path )
2006-08-03 11:41:39 +00:00
2016-03-25 15:23:34 +00:00
for policy in self . policies :
policy . save_state ( self )
2006-08-01 18:59:29 +00:00
2006-08-06 09:29:32 +00:00
# write HeidiResult
2016-03-25 07:43:46 +00:00
self . log ( " Writing Heidi results to %s " % self . options . heidi_output )
2013-01-12 17:47:24 +01:00
write_heidi ( self . options . heidi_output , self . sources [ " testing " ] ,
self . binaries [ " testing " ] )
2006-08-01 18:59:29 +00:00
2016-03-25 07:43:46 +00:00
self . log ( " Writing delta to %s " % self . options . heidi_delta_output )
2014-01-11 08:31:36 +01:00
write_heidi_delta ( self . options . heidi_delta_output ,
self . all_selected )
2011-06-28 19:10:13 +00:00
self . printuninstchange ( )
2016-03-25 07:43:46 +00:00
self . log ( " Test completed! " , type = " I " )
2006-07-20 19:00:47 +00:00
2011-06-28 19:10:13 +00:00
def printuninstchange ( self ) :
2016-03-25 07:43:46 +00:00
self . log ( " Checking for newly uninstallable packages " , type = " I " )
2013-01-12 20:13:43 +01:00
text = eval_uninst ( self . options . architectures , newly_uninst (
2013-01-12 19:58:15 +01:00
self . nuninst_orig_save , self . nuninst_orig ) )
2011-06-28 19:10:13 +00:00
if text != ' ' :
self . output_write ( " \n Newly uninstallable packages in testing: \n %s " % \
( text ) )
2008-01-15 16:09:51 +00:00
def hint_tester ( self ) :
""" Run a command line interface to test hints
2010-02-28 12:58:03 +00:00
This method provides a command line interface for the release team to
2011-09-17 11:48:49 +02:00
try hints and evaluate the results .
2008-01-15 16:09:51 +00:00
"""
2016-03-25 07:43:46 +00:00
self . log ( " > Calculating current uninstallability counters " , type = " I " )
2008-01-15 16:09:51 +00:00
self . nuninst_orig = self . get_nuninst ( )
2011-06-28 19:10:13 +00:00
self . nuninst_orig_save = self . get_nuninst ( )
2008-01-15 16:09:51 +00:00
2008-04-23 17:22:24 +00:00
import readline
2016-10-23 10:30:01 +00:00
from britney2 . completer import Completer
2011-10-22 13:42:14 +02:00
2008-04-23 17:22:24 +00:00
histfile = os . path . expanduser ( ' ~/.britney2_history ' )
if os . path . exists ( histfile ) :
readline . read_history_file ( histfile )
2011-10-22 13:42:14 +02:00
readline . parse_and_bind ( ' tab: complete ' )
readline . set_completer ( Completer ( self ) . completer )
2011-10-22 14:09:35 +02:00
# Package names can contain "-" and we use "/" in our presentation of them as well,
# so ensure readline does not split on these characters.
readline . set_completer_delims ( readline . get_completer_delims ( ) . replace ( ' - ' , ' ' ) . replace ( ' / ' , ' ' ) )
2011-10-22 13:42:14 +02:00
2016-07-02 20:28:07 +00:00
known_hints = self . _hint_parser . registered_hints
2008-01-15 16:09:51 +00:00
while True :
# read the command from the command line
try :
2016-11-28 16:10:00 +00:00
user_input = input ( ' britney> ' ) . split ( )
2008-01-15 16:09:51 +00:00
except EOFError :
2015-04-25 16:03:54 +02:00
print ( " " )
2008-01-15 16:09:51 +00:00
break
2008-04-23 17:22:24 +00:00
except KeyboardInterrupt :
2015-04-25 16:03:54 +02:00
print ( " " )
2008-04-23 17:22:24 +00:00
continue
2008-01-15 16:09:51 +00:00
# quit the hint tester
2015-09-20 17:17:57 +02:00
if user_input and user_input [ 0 ] in ( ' quit ' , ' exit ' ) :
2008-01-15 16:09:51 +00:00
break
2016-07-02 20:28:07 +00:00
# run a hint
2015-09-20 17:17:57 +02:00
elif user_input and user_input [ 0 ] in ( ' easy ' , ' hint ' , ' force-hint ' ) :
2008-04-23 17:22:24 +00:00
try :
2015-09-20 17:17:57 +02:00
self . do_hint ( user_input [ 0 ] , ' hint-tester ' ,
2016-07-02 20:28:07 +00:00
[ k . rsplit ( " / " , 1 ) for k in user_input [ 1 : ] if " / " in k ] )
2011-06-28 19:10:13 +00:00
self . printuninstchange ( )
2008-04-23 17:22:24 +00:00
except KeyboardInterrupt :
continue
2016-07-02 20:28:07 +00:00
elif user_input and user_input [ 0 ] in known_hints :
self . _hint_parser . parse_hints ( ' hint-tester ' , self . HINTS_ALL , ' <stdin> ' , [ ' ' . join ( user_input ) ] )
self . write_excuses ( )
2011-10-22 10:26:10 +02:00
try :
readline . write_history_file ( histfile )
2015-04-25 16:35:09 +02:00
except IOError as e :
2016-03-25 07:43:46 +00:00
self . log ( " Could not write %s : %s " % ( histfile , e ) , type = " W " )
2008-01-15 16:09:51 +00:00
2012-01-04 14:20:33 +01:00
def do_hint ( self , hinttype , who , pkgvers ) :
2006-08-01 18:59:29 +00:00
""" Process hints
This method process ` easy ` , ` hint ` and ` force - hint ` hints . If the
2010-02-28 12:58:03 +00:00
requested version is not in unstable , then the hint is skipped .
2006-08-01 18:59:29 +00:00
"""
2011-10-22 16:58:02 +02:00
if isinstance ( pkgvers [ 0 ] , tuple ) or isinstance ( pkgvers [ 0 ] , list ) :
2013-09-07 17:01:58 +00:00
_pkgvers = [ MigrationItem ( ' %s / %s ' % ( p , v ) ) for ( p , v ) in pkgvers ]
2011-09-04 16:41:33 +00:00
else :
_pkgvers = pkgvers
2016-03-25 07:43:46 +00:00
self . log ( " > Processing ' %s ' hint from %s " % ( hinttype , who ) , type = " I " )
2016-01-17 19:36:44 +00:00
self . output_write ( " Trying %s from %s : %s \n " % ( hinttype , who , " " . join ( " %s / %s " % ( x . uvname , x . version ) for x in _pkgvers ) ) )
2006-08-01 18:59:29 +00:00
ok = True
# loop on the requested packages and versions
2011-11-15 20:50:09 +00:00
for idx in range ( len ( _pkgvers ) ) :
pkg = _pkgvers [ idx ]
2006-08-01 18:59:29 +00:00
# skip removal requests
2011-09-04 16:41:33 +00:00
if pkg . is_removal :
2006-08-01 18:59:29 +00:00
continue
2011-11-15 20:50:09 +00:00
inunstable = pkg . package in self . sources [ ' unstable ' ]
2016-09-25 05:45:36 +00:00
rightversion = inunstable and ( apt_pkg . version_compare ( self . sources [ ' unstable ' ] [ pkg . package ] . version , pkg . version ) == 0 )
2011-11-15 20:50:09 +00:00
if pkg . suite == ' unstable ' and not rightversion :
for suite in [ ' pu ' , ' tpu ' ] :
2016-09-25 05:45:36 +00:00
if pkg . package in self . sources [ suite ] and apt_pkg . version_compare ( self . sources [ suite ] [ pkg . package ] . version , pkg . version ) == 0 :
2011-11-15 20:50:09 +00:00
pkg . suite = suite
_pkgvers [ idx ] = pkg
break
2011-07-27 09:38:27 +00:00
# handle *-proposed-updates
2011-11-15 20:50:09 +00:00
if pkg . suite in [ ' pu ' , ' tpu ' ] :
2011-09-04 16:41:33 +00:00
if pkg . package not in self . sources [ pkg . suite ] : continue
2016-09-25 05:45:36 +00:00
if apt_pkg . version_compare ( self . sources [ pkg . suite ] [ pkg . package ] . version , pkg . version ) != 0 :
self . output_write ( " Version mismatch, %s %s != %s \n " % ( pkg . package , pkg . version , self . sources [ pkg . suite ] [ pkg . package ] . version ) )
2011-07-27 09:30:00 +00:00
ok = False
2006-08-01 18:59:29 +00:00
# does the package exist in unstable?
2011-11-15 20:50:09 +00:00
elif not inunstable :
2011-09-04 16:41:33 +00:00
self . output_write ( " Source %s has no version in unstable \n " % pkg . package )
2006-08-01 18:59:29 +00:00
ok = False
2011-11-15 20:50:09 +00:00
elif not rightversion :
2016-09-25 05:45:36 +00:00
self . output_write ( " Version mismatch, %s %s != %s \n " % ( pkg . package , pkg . version , self . sources [ ' unstable ' ] [ pkg . package ] . version ) )
2006-08-01 18:59:29 +00:00
ok = False
if not ok :
self . output_write ( " Not using hint \n " )
return False
2012-01-04 14:20:33 +01:00
self . do_all ( hinttype , _pkgvers )
2006-08-01 18:59:29 +00:00
return True
2006-08-04 18:38:52 +00:00
def auto_hinter ( self ) :
2011-08-28 21:19:05 +00:00
""" Auto-generate " easy " hints.
2006-08-04 18:38:52 +00:00
2012-05-11 16:31:50 +02:00
This method attempts to generate " easy " hints for sets of packages which
2011-08-28 21:19:05 +00:00
must migrate together . Beginning with a package which does not depend on
any other package ( in terms of excuses ) , a list of dependencies and
reverse dependencies is recursively created .
Once all such lists have been generated , any which are subsets of other
lists are ignored in favour of the larger lists . The remaining lists are
then attempted in turn as " easy " hints .
We also try to auto hint circular dependencies analyzing the update
2006-08-06 09:29:32 +00:00
excuses relationships . If they build a circular dependency , which we already
know as not - working with the standard do_all algorithm , try to ` easy ` them .
2006-08-04 18:38:52 +00:00
"""
2016-05-27 19:52:55 +00:00
self . log ( " > Processing hints from the auto hinter " , type = " I " )
2006-08-04 18:38:52 +00:00
2012-05-11 13:19:55 +02:00
sources_t = self . sources [ ' testing ' ]
2016-05-27 19:52:55 +00:00
excuses = self . excuses
# consider only excuses which are valid candidates and still relevant.
valid_excuses = frozenset ( y . uvname for y in self . upgrade_me
2016-09-25 05:45:36 +00:00
if y not in sources_t or sources_t [ y ] . version != excuses [ y ] . ver [ 1 ] )
2016-05-27 19:52:55 +00:00
excuses_deps = { name : valid_excuses . intersection ( excuse . deps )
for name , excuse in excuses . items ( ) if name in valid_excuses }
2016-05-28 07:37:06 +00:00
excuses_rdeps = defaultdict ( set )
for name , deps in excuses_deps . items ( ) :
for dep in deps :
excuses_rdeps [ dep ] . add ( name )
2012-05-11 13:19:55 +02:00
2011-07-27 08:43:02 +00:00
def find_related ( e , hint , circular_first = False ) :
2006-08-04 18:38:52 +00:00
excuse = excuses [ e ]
2011-07-27 08:43:02 +00:00
if not circular_first :
2006-08-04 18:38:52 +00:00
hint [ e ] = excuse . ver [ 1 ]
2014-12-31 23:30:04 +01:00
if not excuse . deps :
2006-08-04 18:38:52 +00:00
return hint
2016-05-27 20:40:11 +00:00
for p in excuses_deps [ e ] :
if p in hint or p not in valid_excuses :
continue
2006-08-04 18:38:52 +00:00
if not find_related ( p , hint ) :
return False
return hint
# loop on them
2011-08-27 08:50:14 +00:00
candidates = [ ]
2011-10-19 18:23:52 +00:00
mincands = [ ]
2015-05-31 22:13:03 +02:00
seen_hints = set ( )
2016-03-25 09:18:35 +00:00
for e in valid_excuses :
2006-08-04 18:38:52 +00:00
excuse = excuses [ e ]
2014-12-31 23:30:04 +01:00
if excuse . deps :
2011-08-27 08:50:14 +00:00
hint = find_related ( e , { } , True )
2015-05-31 22:13:03 +02:00
if isinstance ( hint , dict ) and e in hint :
h = frozenset ( hint . items ( ) )
if h not in seen_hints :
candidates . append ( h )
seen_hints . add ( h )
2011-08-27 08:50:14 +00:00
else :
2015-05-31 22:26:16 +02:00
items = [ ( e , excuse . ver [ 1 ] ) ]
2015-05-31 22:13:03 +02:00
orig_size = 1
2011-10-19 18:23:52 +00:00
looped = False
2015-05-31 22:26:16 +02:00
seen_items = set ( )
seen_items . update ( items )
2011-08-27 08:50:14 +00:00
for item , ver in items :
# excuses which depend on "item" or are depended on by it
2016-05-28 07:37:06 +00:00
new_items = set ( ( x , excuses [ x ] . ver [ 1 ] ) for x in excuses_deps [ item ] )
new_items . update ( ( x , excuses [ x ] . ver [ 1 ] ) for x in excuses_rdeps [ item ] )
new_items - = seen_items
2015-05-31 22:26:16 +02:00
items . extend ( new_items )
seen_items . update ( new_items )
2011-10-19 18:23:52 +00:00
if not looped and len ( items ) > 1 :
2015-05-31 22:13:03 +02:00
orig_size = len ( items )
2015-05-31 22:26:16 +02:00
h = frozenset ( seen_items )
2015-05-31 22:13:03 +02:00
if h not in seen_hints :
mincands . append ( h )
seen_hints . add ( h )
2011-10-19 18:23:52 +00:00
looped = True
2015-05-31 22:13:03 +02:00
if len ( items ) != orig_size :
2015-05-31 22:26:16 +02:00
h = frozenset ( seen_items )
2015-05-31 22:13:03 +02:00
if h != mincands [ - 1 ] and h not in seen_hints :
candidates . append ( h )
seen_hints . add ( h )
2011-08-27 08:50:14 +00:00
2011-10-19 18:23:52 +00:00
for l in [ candidates , mincands ] :
2015-05-31 22:13:03 +02:00
for hint in l :
self . do_hint ( " easy " , " autohinter " , [ MigrationItem ( " %s / %s " % ( x [ 0 ] , x [ 1 ] ) ) for x in sorted ( hint ) ] )
2006-08-06 09:29:32 +00:00
2008-05-02 13:57:37 +00:00
def nuninst_arch_report ( self , nuninst , arch ) :
""" Print a report of uninstallable packages for one architecture. """
2014-07-25 08:44:13 +02:00
all = defaultdict ( set )
2008-05-02 13:57:37 +00:00
for p in nuninst [ arch ] :
pkg = self . binaries [ ' testing ' ] [ arch ] [ 0 ] [ p ]
2016-04-06 20:49:40 +00:00
all [ ( pkg . source , pkg . source_version ) ] . add ( p )
2014-08-05 07:54:53 +02:00
2016-01-17 19:36:44 +00:00
print ( ' * %s ' % arch )
2008-05-02 13:57:37 +00:00
2015-04-25 16:36:39 +02:00
for ( src , ver ) , pkgs in sorted ( all . items ( ) ) :
2015-04-25 16:03:54 +02:00
print ( ' %s ( %s ): %s ' % ( src , ver , ' ' . join ( sorted ( pkgs ) ) ) )
2008-05-02 13:57:37 +00:00
2015-09-06 22:39:39 +02:00
print ( )
2008-05-02 13:57:37 +00:00
2006-08-01 18:59:29 +00:00
def output_write ( self , msg ) :
""" Simple wrapper for output writing """
2015-04-25 16:03:54 +02:00
print ( msg , end = ' ' )
2008-05-31 16:32:13 +02:00
self . __output . write ( msg )
2006-08-01 18:59:29 +00:00
2006-06-24 17:49:43 +00:00
def main ( self ) :
""" Main method
This is the entry point for the class : it includes the list of calls
for the member methods which will produce the output files .
"""
2008-05-02 13:57:37 +00:00
# if running in --print-uninst mode, quit
if self . options . print_uninst :
2008-04-29 05:08:58 +00:00
return
2006-08-07 14:38:13 +00:00
# if no actions are provided, build the excuses and sort them
2008-04-29 05:08:58 +00:00
elif not self . options . actions :
2006-07-28 13:21:44 +00:00
self . write_excuses ( )
2006-08-07 14:38:13 +00:00
# otherwise, use the actions provided by the command line
2008-01-15 16:09:51 +00:00
else :
self . upgrade_me = self . options . actions . split ( )
2006-07-28 13:21:44 +00:00
2017-04-06 11:45:59 +00:00
if self . options . compute_migrations or self . options . hint_tester :
with open ( self . options . upgrade_output , ' w ' , encoding = ' utf-8 ' ) as f :
self . __output = f
2008-05-31 16:32:13 +02:00
2017-04-06 11:45:59 +00:00
# run the hint tester
if self . options . hint_tester :
self . hint_tester ( )
# run the upgrade test
else :
self . upgrade_testing ( )
2008-05-31 16:32:13 +02:00
2017-04-06 11:45:59 +00:00
self . log ( ' > Stats from the installability tester ' , type = " I " )
for stat in self . _inst_tester . stats . stats ( ) :
self . log ( ' > %s ' % stat , type = " I " )
else :
self . log ( ' Migration computation skipped as requested. ' , type = ' I ' )
2015-04-05 14:30:36 +02:00
2011-12-24 17:00:48 +01:00
2006-06-17 13:45:56 +00:00
if __name__ == ' __main__ ' :
Britney ( ) . main ( )