britney.py

#!/usr/bin/env python2.4
# -*- coding: utf-8 -*-

# Copyright (C) 2001-2004 Anthony Towns <ajt@debian.org>
#                         Andreas Barth <aba@debian.org>
#                         Fabio Tranchitella <kobold@debian.org>

# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.

# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.

00017 """
00018 = Introdution =
00019 
00020 This is the Debian testing updater script, also known as "Britney".
00021 
00022 Packages are usually installed into the `testing' distribution after
00023 they have undergone some degree of testing in unstable. The goal of
00024 this software is to do this task in a smart way, allowing testing
00025 to be always fully installable and close to being a release candidate.
00026 
00027 Britney source code is splitted in two different but related tasks:
00028 the first one is the generation of the update excuses, while the
00029 second tries to update testing with the valid candidates; first 
00030 each package alone, then larger and even larger sets of packages
00031 together. Each try is accepted if testing is not more uninstallable
00032 after the update than before.
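
The try-and-accept loop above can be sketched in a few lines (a
minimal illustration: the flat {package: version} state and the
`uninst_count' scoring callback are stand-ins for Britney's real
data structures):

```python
def migrate(testing, candidates, uninst_count):
    # Try each candidate alone, then all of them together; a trial
    # is kept only if testing is not more uninstallable afterwards.
    # `candidates' is a list of (package, version) pairs.
    trials = [[c] for c in candidates] + [list(candidates)]
    for group in trials:
        trial = dict(testing)
        trial.update(group)
        if uninst_count(trial) <= uninst_count(testing):
            testing = trial  # accept the new testing state
    return testing
```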

= Data Loading =

In order to analyze the entire Debian distribution, Britney needs to
load the whole archive in memory: this means more than 10,000 packages
for twelve architectures, as well as the dependency interconnections
between them. For this reason, the memory requirements for running this
software are quite high, and at least 1 gigabyte of RAM should be available.

Britney loads the source packages from the `Sources' file and the binary
packages from the `Packages_${arch}' files, where ${arch} is substituted
with the supported architectures. While loading the data, the software
analyzes the dependencies and builds a directed weighted graph in memory
with all the interconnections between the packages (see Britney.read_sources
and Britney.read_binaries).
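
Both files use the deb822 stanza format: blank-line-separated
paragraphs of `Field: value' lines. A minimal pure-Python reader,
standing in for the apt_pkg parser the script actually uses:

```python
def parse_stanzas(text):
    # Yield one dict per blank-line-separated paragraph; continuation
    # lines (leading whitespace) extend the previous field.
    stanza, field = {}, None
    for line in text.splitlines() + [""]:
        if not line.strip():
            if stanza:
                yield stanza
            stanza, field = {}, None
        elif line[0] in " \t" and field:
            stanza[field] += "\n" + line.strip()
        else:
            field, _, value = line.partition(":")
            stanza[field] = value.strip()
```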

Other than source and binary packages, Britney loads the following data:

  * Bugs, which contains the count of release-critical bugs for a given
    version of a source package (see Britney.read_bugs).

  * Dates, which contains the date of the upload of a given version
    of a source package (see Britney.read_dates).

  * Urgencies, which contains the urgency of the upload of a given
    version of a source package (see Britney.read_urgencies).

  * Approvals, which contains the list of approved testing-proposed-updates
    packages (see Britney.read_approvals).

  * Hints, which contains lists of commands which modify the standard behaviour
    of Britney (see Britney.read_hints).

For a more detailed explanation of the format of these files, please read
the documentation of the related methods. Their exact meaning is instead
explained in the chapter "Excuses Generation".
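
As one example of the simplest of these inputs, a hints file is just
lines of `command argument...'; a rough sketch of reading such lines
while skipping comments and disallowed commands (the exact format and
the per-file command lists are defined by Britney.read_hints):

```python
def parse_hints(lines, allowed):
    # Turn lines like "unblock foo/1.2-3" into (command, args) pairs,
    # keeping only the commands this hint file is allowed to use.
    hints = []
    for line in lines:
        words = line.split()
        if words and not words[0].startswith("#") and words[0] in allowed:
            hints.append((words[0], words[1:]))
    return hints
```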

= Excuses =

An excuse is a detailed explanation of why a package can or cannot
be updated in the testing distribution from a newer package in
another distribution (for example unstable). The excuses are mainly
meant to be written to an HTML file which is published over HTTP,
so that maintainers can read it, manually or with automated tools,
to find out why their packages have or have not been updated.

== Excuses generation ==

These are the steps (with references to method names) that Britney
follows to generate the update excuses.

 * If a source package is available in testing but it is not
   present in unstable and no binary packages in unstable are
   built from it, then it is marked for removal.

 * Every source package in unstable and testing-proposed-updates,
   if already present in testing, is checked for binary-NMUs, new
   or dropped binary packages in all the supported architectures
   (see Britney.should_upgrade_srcarch). The steps to detect if an
   upgrade is needed are:

    1. If there is a `remove' hint for the source package, the package
       is ignored: it will be removed and not updated.

    2. For every binary package built from the new source, it checks
       for unsatisfied dependencies, new binary packages and updated
       binary packages (binNMUs), excluding the architecture-independent
       ones and the packages not built from the same source.

    3. For every binary package built from the old source, it checks
       if it is still built from the new source; if this is not true
       and the package is not architecture-independent, the script
       removes it from testing.

    4. Finally, if there is something worth doing (e.g. a new or updated
       binary package) and nothing wrong, it marks the source package
       as "Valid candidate", or as "Not considered" if there is something
       wrong which prevented the update.

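The per-architecture comparison in steps 2 and 3 boils down to a set
difference between the binaries built from the two source versions; a
schematic sketch (plain {name: version} maps instead of Britney's
package records):

```python
def binaries_to_update(old, new):
    # Return (changed, removed) for one architecture, where `old' and
    # `new' map binary names to versions built from the testing and
    # the unstable source respectively.
    changed = sorted(p for p, v in new.items() if old.get(p) != v)
    removed = sorted(p for p in old if p not in new)
    return changed, removed
```
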
 * Every source package in unstable and testing-proposed-updates is
   checked for upgrade (see Britney.should_upgrade_src). The steps
   to detect if an upgrade is needed are:

    1. If the source package in testing is more recent, the new one
       is ignored.

    2. If the source package doesn't exist (is fake), which means that
       a binary package refers to it but it is not present in the
       `Sources' file, the new one is ignored.

    3. If the package doesn't exist in testing, the urgency of the
       upload is ignored and set to the default (currently `low').

    4. If there is a `remove' hint for the source package, the package
       is ignored: it will be removed and not updated.

    5. If there is a `block' hint for the source package without an
       `unblock' hint, or there is a `block-all source' hint, the
       package is ignored.

    6. If the suite is unstable, the update can go ahead only if more
       days than the minimum specified by the urgency of the upload
       have passed since it happened; if this is not true, the package
       is ignored as `too-young'. Note that the urgency is sticky,
       meaning that the highest urgency uploaded since the previous
       testing transition is taken into account.

    7. All the architecture-dependent binary packages and the
       architecture-independent ones for the `nobreakall' architectures
       have to be built from the source we are considering. If this is
       not true, then these are called `out-of-date' architectures and
       the package is ignored.

    8. The source package must have at least one binary package,
       otherwise it is ignored.

    9. If the suite is unstable, the count of release-critical bugs for
       the new source package must be less than the count for the testing
       one. If this is not true, the package is ignored as `buggy'.

   10. If there is a `force' hint for the source package, then it is
       updated even if it was marked as ignored by the previous steps.

   11. If the suite is testing-proposed-updates, the source package can
       be updated only if there is an explicit approval for it.

   12. If the package has not been ignored by the previous steps, mark
       it as "Valid candidate"; otherwise mark it as "Not considered".

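The `too-young' check can be sketched as follows (the MINDAYS values
here are illustrative, not Britney's actual configuration, which is
read from MINDAYS_* keys in the configuration file):

```python
MINDAYS = {"low": 10, "medium": 5, "high": 2}  # illustrative values

def is_too_young(days_old, urgencies_seen, default="low"):
    # The urgency is sticky: the highest urgency seen since the last
    # testing transition (i.e. the lowest day count) wins.
    needed = min(MINDAYS.get(u, MINDAYS[default]) for u in urgencies_seen)
    return days_old < needed
```
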
 * The list of `remove' hints is processed: if the requested source
   package is not already being updated or removed and the version
   currently in testing is the same as the one specified in the
   `remove' hint, it is marked for removal.

 * The excuses are sorted by the number of days from the last upload
   (days-old) and by name.

 * A list of unconsidered excuses (for which the package is not upgraded)
   is built. Using this list, all the excuses depending on them are marked
   as invalid for an "impossible dependency".
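
This invalidation is transitive: anything that depends on an excuse
which is not considered must itself become invalid. A sketch of the
propagation:

```python
def invalidate(excuses, deps, invalid):
    # `deps' maps an excuse name to the names it depends on; anything
    # depending on an invalid excuse becomes invalid too, recursively.
    invalid = set(invalid)
    changed = True
    while changed:
        changed = False
        for name in excuses:
            if name not in invalid and any(d in invalid for d in deps.get(name, ())):
                invalid.add(name)
                changed = True
    return invalid
```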

 * The excuses are written to an HTML file.
"""

import os
import re
import sys
import string
import time
import copy
import optparse
import operator

import apt_pkg

from excuse import Excuse

__author__ = 'Fabio Tranchitella'
__version__ = '2.0.alpha1'

# source package
VERSION = 0
SECTION = 1
BINARIES = 2
MAINTAINER = 3
FAKESRC = 4

# binary package
SOURCE = 2
SOURCEVER = 3
ARCHITECTURE = 4
PREDEPENDS = 5
DEPENDS = 6
CONFLICTS = 7
PROVIDES = 8
RDEPENDS = 9
RCONFLICTS = 10


class Britney:
    """Britney, the Debian testing updater script

    This is the script that updates the `testing' distribution. It is
    executed each day after the installation of the updated packages. It
    generates the `Packages' files for the testing distribution, but it
    does so in an intelligent manner: it tries to avoid any inconsistency
    and to use only non-buggy packages.

    For more documentation on this script, please read the Developers Reference.
    """

    HINTS_STANDARD = ("easy", "hint", "remove", "block", "unblock", "urgent", "approve")
    HINTS_ALL = ("force", "force-hint", "block-all") + HINTS_STANDARD

    def __init__(self):
        """Class constructor

        This method initializes and populates the data lists, which contain all
        the information needed by the other methods of the class.
        """
        self.date_now = int(((time.time() / (60*60)) - 15) / 24)

        # parse the command line arguments
        self.__parse_arguments()

        # initialize the apt_pkg back-end
        apt_pkg.init()

        # if requested, build the non-installable status and save it
        if not self.options.nuninst_cache:
            self.__log("Building the list of not installable packages for the full archive", type="I")
            self.sources = {'testing': self.read_sources(self.options.testing)}
            nuninst = {}
            for arch in self.options.architectures:
                self.binaries = {'testing': {arch: self.read_binaries(self.options.testing, "testing", arch)}}
                self.__log("> Checking for non-installable packages for architecture %s" % arch, type="I")
                result = self.get_nuninst(arch, build=True)
                nuninst.update(result)
                self.__log("> Found %d non-installable packages" % len(nuninst[arch]), type="I")
            self.write_nuninst(nuninst)
        else:
            self.__log("Not building the list of not installable packages, as requested", type="I")

        # read the source and binary packages for the involved distributions
        self.sources = {'testing': self.read_sources(self.options.testing),
                        'unstable': self.read_sources(self.options.unstable),
                        'tpu': self.read_sources(self.options.tpu),}
        self.binaries = {'testing': {}, 'unstable': {}, 'tpu': {}}
        for arch in self.options.architectures:
            self.binaries['testing'][arch] = self.read_binaries(self.options.testing, "testing", arch)
            self.binaries['unstable'][arch] = self.read_binaries(self.options.unstable, "unstable", arch)
            self.binaries['tpu'][arch] = self.read_binaries(self.options.tpu, "tpu", arch)

        # read the release-critical bug summaries for testing and unstable
        self.bugs = {'unstable': self.read_bugs(self.options.unstable),
                     'testing': self.read_bugs(self.options.testing),}
        self.normalize_bugs()

        # read additional data
        self.dates = self.read_dates(self.options.testing)
        self.urgencies = self.read_urgencies(self.options.testing)
        self.approvals = self.read_approvals(self.options.tpu)
        self.hints = self.read_hints(self.options.unstable)
        self.excuses = []
        self.dependencies = {}

    def __parse_arguments(self):
        """Parse the command line arguments

        This method parses and initializes the command line arguments.
        While doing so, it preprocesses some of the options to convert
        them into a form suitable for the other methods of the class.
        """
        # initialize the parser
        self.parser = optparse.OptionParser(version="%prog")
        self.parser.add_option("-v", "", action="count", dest="verbose", help="enable verbose output")
        self.parser.add_option("-c", "--config", action="store", dest="config", default="/etc/britney.conf",
                               help="path for the configuration file")
        self.parser.add_option("", "--architectures", action="store", dest="architectures", default=None,
                               help="override architectures from configuration file")
        self.parser.add_option("", "--actions", action="store", dest="actions", default=None,
                               help="override the list of actions to be performed")
        self.parser.add_option("", "--dry-run", action="store_true", dest="dry_run", default=False,
                               help="disable all outputs to the testing directory")
        self.parser.add_option("", "--compatible", action="store_true", dest="compatible", default=False,
                               help="enable full compatibility with old britney's output")
        self.parser.add_option("", "--control-files", action="store_true", dest="control_files", default=False,
                               help="enable control files generation")
        self.parser.add_option("", "--nuninst-cache", action="store_true", dest="nuninst_cache", default=False,
                               help="do not build the non-installability status, use the cache from file")
        (self.options, self.args) = self.parser.parse_args()

        # if the configuration file does not exist, exit with an error;
        # otherwise read it and set the additional options
        if not os.path.isfile(self.options.config):
            self.__log("Unable to read the configuration file (%s), exiting!" % self.options.config, type="E")
            sys.exit(1)

        # minimum days for unstable-testing transition and the list of hints
        # are handled as an ad-hoc case
        self.MINDAYS = {}
        self.HINTS = {}
        for k, v in [map(string.strip, r.split('=', 1)) for r in file(self.options.config) if '=' in r and not r.strip().startswith('#')]:
            if k.startswith("MINDAYS_"):
                self.MINDAYS[k.split("_")[1].lower()] = int(v)
            elif k.startswith("HINTS_"):
                self.HINTS[k.split("_")[1].lower()] = \
                    reduce(lambda x,y: x+y, [hasattr(self, "HINTS_" + i) and getattr(self, "HINTS_" + i) or (i,) for i in v.split()])
            elif not hasattr(self.options, k.lower()) or \
                 not getattr(self.options, k.lower()):
                setattr(self.options, k.lower(), v)

        # sort the architecture list
        allarches = sorted(self.options.architectures.split())
        arches = [x for x in allarches if x in self.options.nobreakall_arches.split()]
        arches += [x for x in allarches if x not in arches and x not in self.options.fucked_arches.split()]
        arches += [x for x in allarches if x not in arches and x not in self.options.break_arches.split()]
        arches += [x for x in allarches if x not in arches and x not in self.options.new_arches.split()]
        arches += [x for x in allarches if x not in arches]
        self.options.architectures = arches
        self.options.smooth_updates = self.options.smooth_updates.split()

    def __log(self, msg, type="I"):
        """Print info messages according to verbosity level

        An easy-and-simple log method which prints messages to standard
        output. The type parameter controls the urgency of the message,
        and can be `I' for `Information', `W' for `Warning' or `E' for
        `Error'. Warnings and errors are always printed; informational
        messages are printed only if verbose logging is enabled.
        """
        if self.options.verbose or type in ("E", "W"):
            print "%s: [%s] - %s" % (type, time.asctime(), msg)

    # Data reading/writing methods
    # ----------------------------

    def read_sources(self, basedir):
        """Read the list of source packages from the specified directory

        The source packages are read from the `Sources' file within the
        directory specified by the `basedir' parameter. Considering the
        large amount of memory needed, not all the fields are loaded
        in memory. The available fields are Version, Section and Maintainer.

        The method returns a dictionary mapping each source package name
        to the list of its loaded fields.
        """
        sources = {}
        package = None
        filename = os.path.join(basedir, "Sources")
        self.__log("Loading source packages from %s" % filename)
        Packages = apt_pkg.ParseTagFile(open(filename))
        get_field = Packages.Section.get
        while Packages.Step():
            pkg = get_field('Package')
            sources[pkg] = [get_field('Version'),
                            get_field('Section'),
                            [],
                            get_field('Maintainer'),
                            False,
                           ]
        return sources

    def read_binaries(self, basedir, distribution, arch):
        """Read the list of binary packages from the specified directory

        The binary packages are read from the `Packages_${arch}' files
        within the directory specified by the `basedir' parameter,
        replacing ${arch} with the value of the arch parameter.
        Considering the large amount of memory needed, not all the fields
        are loaded in memory. The available fields are Version, Source,
        Pre-Depends, Depends, Conflicts, Provides and Architecture.

        After reading the packages, reverse dependencies are computed
        and saved in the RDEPENDS field, and the Provides field is
        used to populate the virtual packages list.

        The dependencies are parsed with the apt_pkg.ParseDepends method,
        and they are stored both in its parsed format and as text.

        The method returns a tuple. The first element is a dictionary
        mapping each binary package name to the list of its fields; the
        second element is a dictionary which maps each virtual package
        to the list of real packages that provide it.
        """

        packages = {}
        provides = {}
        sources = self.sources
        package = None

        filename = os.path.join(basedir, "Packages_%s" % arch)
        self.__log("Loading binary packages from %s" % filename)
        Packages = apt_pkg.ParseTagFile(open(filename))
        get_field = Packages.Section.get
        while Packages.Step():
            pkg = get_field('Package')
            version = get_field('Version')
            dpkg = [version,
                    get_field('Section'),
                    pkg,
                    version,
                    get_field('Architecture'),
                    get_field('Pre-Depends'),
                    get_field('Depends'),
                    get_field('Conflicts'),
                    get_field('Provides'),
                    [],
                    [],
                   ]

            # retrieve the name and the version of the source package
            source = get_field('Source')
            if source:
                dpkg[SOURCE] = source.split(" ")[0]
                if "(" in source:
                    dpkg[SOURCEVER] = source[source.find("(")+1:source.find(")")]

            # if the source package is available in the distribution, then register this binary package
            if dpkg[SOURCE] in sources[distribution]:
                sources[distribution][dpkg[SOURCE]][BINARIES].append(pkg + "/" + arch)
            # if the source package doesn't exist, create a fake one
            else:
                sources[distribution][dpkg[SOURCE]] = [dpkg[SOURCEVER], None, [pkg + "/" + arch], None, True]

            # register virtual packages and the real packages that provide them
            if dpkg[PROVIDES]:
                parts = map(string.strip, dpkg[PROVIDES].split(","))
                for p in parts:
                    if p not in provides:
                        provides[p] = []
                    provides[p].append(pkg)
                dpkg[PROVIDES] = parts
            else: dpkg[PROVIDES] = []

            # add the resulting list to the package list
            packages[pkg] = dpkg

        # loop again over the packages to register reverse dependencies and conflicts
        register_reverses = self.register_reverses
        for pkg in packages:
            register_reverses(pkg, packages, provides, check_doubles=False)

        # return a tuple with the list of real and virtual packages
        return (packages, provides)

    def register_reverses(self, pkg, packages, provides, check_doubles=True, parse_depends=apt_pkg.ParseDepends):
        """Register reverse dependencies and conflicts for the specified package

        This method registers the reverse dependencies and conflicts for
        a given package, using `packages` as the list of packages and
        `provides` as the list of virtual packages.

        The method has an optional parameter parse_depends which exists
        just for performance reasons and is not meant to be overridden.
        """
        # register the list of the dependencies for the depending packages
        dependencies = []
        if packages[pkg][DEPENDS]:
            dependencies.extend(parse_depends(packages[pkg][DEPENDS]))
        if packages[pkg][PREDEPENDS]:
            dependencies.extend(parse_depends(packages[pkg][PREDEPENDS]))
        # go through the list
        for p in dependencies:
            for a in p:
                # register real packages
                if a[0] in packages and (not check_doubles or pkg not in packages[a[0]][RDEPENDS]):
                    packages[a[0]][RDEPENDS].append(pkg)
                # register packages which provide a virtual package
                elif a[0] in provides:
                    for i in provides.get(a[0]):
                        if i not in packages: continue
                        if not check_doubles or pkg not in packages[i][RDEPENDS]:
                            packages[i][RDEPENDS].append(pkg)
        # register the list of the conflicts for the conflicting packages
        if packages[pkg][CONFLICTS]:
            for p in parse_depends(packages[pkg][CONFLICTS]):
                for a in p:
                    # register real packages
                    if a[0] in packages and (not check_doubles or pkg not in packages[a[0]][RCONFLICTS]):
                        packages[a[0]][RCONFLICTS].append(pkg)
                    # register packages which provide a virtual package
                    elif a[0] in provides:
                        for i in provides[a[0]]:
                            if i not in packages: continue
                            if not check_doubles or pkg not in packages[i][RCONFLICTS]:
                                packages[i][RCONFLICTS].append(pkg)

    def read_bugs(self, basedir):
        """Read the release-critical bug summary from the specified directory

        The RC bug summaries are read from the `Bugs' file within the
        directory specified by the `basedir' parameter. The file contains
        rows with the format:

        <package-name> <count-of-rc-bugs>

        The method returns a dictionary where the key is the binary package
        name and the value is the number of open RC bugs for it.
        """
        bugs = {}
        filename = os.path.join(basedir, "Bugs")
        self.__log("Loading RC bugs count from %s" % filename)
        for line in open(filename):
            l = line.split()
            if len(l) != 2: continue
            try:
                bugs[l[0]] = int(l[1])
            except ValueError:
                self.__log("Bugs, unable to parse \"%s\"" % line, type="E")
        return bugs

    def write_bugs(self, basedir, bugs):
        """Write the release-critical bug summary to the specified directory

        For a more detailed explanation of the format, please check the method
        read_bugs.
        """
        filename = os.path.join(basedir, "Bugs")
        self.__log("Writing RC bugs count to %s" % filename)
        f = open(filename, 'w')
        for pkg in sorted(bugs.keys()):
            if bugs[pkg] == 0: continue
            f.write("%s %d\n" % (pkg, bugs[pkg]))
        f.close()

    def __maxver(self, pkg, dist):
        """Return the maximum version for a given package name

        This method returns None if the specified source package
        is not available in the `dist' distribution. If the package
        exists, then it returns the maximum version between the
        source package and its binary packages.
        """
        maxver = None
        if pkg in self.sources[dist]:
            maxver = self.sources[dist][pkg][VERSION]
        for arch in self.options.architectures:
            if pkg not in self.binaries[dist][arch][0]: continue
            pkgv = self.binaries[dist][arch][0][pkg][VERSION]
            if maxver is None or apt_pkg.VersionCompare(pkgv, maxver) > 0:
                maxver = pkgv
        return maxver

    def normalize_bugs(self):
        """Normalize the release-critical bug summaries for testing and unstable

        The method doesn't return any value: it directly modifies the
        object attribute `bugs'.
        """
        # loop over all the package names from the testing and unstable bug summaries
        for pkg in set(self.bugs['testing'].keys() + self.bugs['unstable'].keys()):

            # make sure that the key is present in both dictionaries
            if pkg not in self.bugs['testing']:
                self.bugs['testing'][pkg] = 0
            elif pkg not in self.bugs['unstable']:
                self.bugs['unstable'][pkg] = 0

            # retrieve the maximum version of the package in testing
            maxvert = self.__maxver(pkg, 'testing')

            # if the package is not available in testing or has the
            # same RC bug count in both suites, do nothing
            if maxvert is None or \
               self.bugs['testing'][pkg] == self.bugs['unstable'][pkg]:
                continue

            # retrieve the maximum version of the package in unstable
            maxveru = self.__maxver(pkg, 'unstable')

            # if the package is not available in unstable, do nothing
            if maxveru is None:
                continue
            # otherwise, if the testing package is at least as recent,
            # use the unstable RC bug count for testing, too
            elif apt_pkg.VersionCompare(maxvert, maxveru) >= 0:
                self.bugs['testing'][pkg] = self.bugs['unstable'][pkg]

    def read_dates(self, basedir):
        """Read the upload dates for the packages from the specified directory

        The upload dates are read from the `Dates' file within the directory
        specified by the `basedir' parameter. The file contains rows with
        the format:

        <package-name> <version> <date-of-upload>

        The dates are expressed as days since 1970-01-01.

        The method returns a dictionary where the key is the source package
        name and the value is a tuple of two items, the version and the date.
        """
        dates = {}
        filename = os.path.join(basedir, "Dates")
        self.__log("Loading upload data from %s" % filename)
        for line in open(filename):
            l = line.split()
            if len(l) != 3: continue
            try:
                dates[l[0]] = (l[1], int(l[2]))
            except ValueError:
                self.__log("Dates, unable to parse \"%s\"" % line, type="E")
        return dates

    def write_dates(self, basedir, dates):
        """Write the upload dates for the packages to the specified directory

        For a more detailed explanation of the format, please check the method
        read_dates.
        """
        filename = os.path.join(basedir, "Dates")
        self.__log("Writing upload data to %s" % filename)
        f = open(filename, 'w')
        for pkg in sorted(dates.keys()):
            f.write("%s %s %d\n" % ((pkg,) + dates[pkg]))
        f.close()

00633     def read_urgencies(self, basedir):
00634         """Read the upload urgency of the packages from the specified directory
00635         
00636         The upload urgencies are read from the `Urgency' file within the
00637         directory specified as `basedir' parameter. The file contains rows
00638         with the format:
00639 
00640         <package-name> <version> <urgency>
00641 
00642         The method returns a dictionary where the key is the binary package
00643         name and the value is the greatest urgency from the versions of the
00644         package that are higher than the testing one.
00645         """
00646 
00647         urgencies = {}
00648         filename = os.path.join(basedir, "Urgency")
00649         self.__log("Loading upload urgencies from %s" % filename)
00650         for line in open(filename):
00651             l = line.split()
00652             if len(l) != 3: continue
00653 
00654             # read the minimum days associated to the urgencies
00655             urgency_old = urgencies.get(l[0], self.options.default_urgency)
00656             mindays_old = self.MINDAYS.get(urgency_old, self.MINDAYS[self.options.default_urgency])
00657             mindays_new = self.MINDAYS.get(l[2], self.MINDAYS[self.options.default_urgency])
00658 
00659             # if the new urgency is lower (so the min days are higher), do nothing
00660             if mindays_old <= mindays_new:
00661                 continue
00662 
00663             # if the package exists in testing and it is more recent, do nothing
00664             tsrcv = self.sources['testing'].get(l[0], None)
00665             if tsrcv and apt_pkg.VersionCompare(tsrcv[VERSION], l[1]) >= 0:
00666                 continue
00667 
00668             # if the package doesn't exist in unstable or it is older, do nothing
00669             usrcv = self.sources['unstable'].get(l[0], None)
00670             if not usrcv or apt_pkg.VersionCompare(usrcv[VERSION], l[1]) < 0:
00671                 continue
00672 
00673             # update the urgency for the package
00674             urgencies[l[0]] = l[2]
00675 
00676         return urgencies
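The urgency-merging rule used above (keep the most urgent value, i.e. the one with the lowest minimum age) can be sketched in isolation; the MINDAYS table and package data below are illustrative, not Britney's configured values:

```python
# Illustrative urgency-to-minimum-days table (not Britney's real config).
MINDAYS = {"low": 10, "medium": 5, "high": 2}
DEFAULT_URGENCY = "low"

def merge_urgency(urgencies, pkg, urgency):
    # Lower minimum days == more urgent; keep the most urgent value seen.
    old = urgencies.get(pkg, DEFAULT_URGENCY)
    mindays_old = MINDAYS.get(old, MINDAYS[DEFAULT_URGENCY])
    mindays_new = MINDAYS.get(urgency, MINDAYS[DEFAULT_URGENCY])
    if mindays_new < mindays_old:
        urgencies[pkg] = urgency

urgencies = {}
merge_urgency(urgencies, "glibc", "medium")
merge_urgency(urgencies, "glibc", "low")    # less urgent: ignored
merge_urgency(urgencies, "glibc", "high")   # more urgent: kept
print(urgencies)
```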
00677 
00678     def read_approvals(self, basedir):
00679         """Read the approval commands from the specified directory
00680         
00681         The approval commands are read from the files contained in the
00682         `Approved' directory within the directory specified by the `basedir'
00683         parameter. Each file must be named after the user authorized to
00684         submit the approvals it contains.
00685         
00686         The file contains rows with the format:
00687 
00688         <package-name> <version>
00689 
00690         The method returns a dictionary where the key is the binary package
00691         name followed by an underscore and the version number, and the value
00692         is the user who submitted the command.
00693         """
00694         approvals = {}
00695         for approver in self.options.approvers.split():
00696             filename = os.path.join(basedir, "Approved", approver)
00697             self.__log("Loading approvals list from %s" % filename)
00698             for line in open(filename):
00699                 l = line.split()
00700                 if len(l) != 2: continue
00701                 approvals["%s_%s" % (l[0], l[1])] = approver
00702         return approvals
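The approvals dictionary keys combine package and version with an underscore; a minimal sketch with invented package and approver names:

```python
def add_approval(approvals, approver, line):
    # One "<package-name> <version>" row per line, keyed "<package>_<version>";
    # rows with the wrong number of fields are skipped, as in read_approvals.
    l = line.split()
    if len(l) != 2:
        return
    approvals["%s_%s" % (l[0], l[1])] = approver

approvals = {}
add_approval(approvals, "aba", "glibc 2.3.6-7")
add_approval(approvals, "ajt", "not a valid row")
print(approvals)
```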
00703 
00704     def read_hints(self, basedir):
00705         """Read the hint commands from the specified directory
00706         
00707         The hint commands are read from the files contained in the `Hints'
00708         directory within the directory specified by the `basedir' parameter.
00709         Each file must be named after the user authorized to submit the
00710         hints it contains.
00711         
00712         The file contains rows with the format:
00713 
00714         <command> <package-name>[/<version>]
00715 
00716         The method returns a dictionary where the key is the command, and
00717         the value is the list of affected packages.
00718         """
00719         hints = dict([(k,[]) for k in self.HINTS_ALL])
00720 
00721         for who in self.HINTS.keys():
00722             filename = os.path.join(basedir, "Hints", who)
00723             self.__log("Loading hints list from %s" % filename)
00724             for line in open(filename):
00725                 line = line.strip()
00726                 if line == "": continue
00727                 l = line.split()
00728                 if l[0] == 'finished':
00729                     break
00730                 elif l[0] not in self.HINTS[who]:
00731                     continue
00732                 elif l[0] in ["easy", "hint", "force-hint"]:
00733                     hints[l[0]].append((who, [k.split("/") for k in l if "/" in k]))
00734                 elif l[0] in ["block-all"]:
00735                     hints[l[0]].extend([(y, who) for y in l[1:]])
00736                 elif l[0] in ["block"]:
00737                     hints[l[0]].extend([(y, who) for y in l[1:]])
00738                 elif l[0] in ["remove", "approve", "unblock", "force", "urgent"]:
00739                     hints[l[0]].extend([(k.split("/")[0], (k.split("/")[1],who) ) for k in l if "/" in k])
00740 
00741         for x in ["block", "block-all", "unblock", "force", "urgent", "remove"]:
00742             z = {}
00743             for a, b in hints[x]:
00744                 if a in z:
00745                     self.__log("Overriding %s[%s] = %s with %s" % (x, a, z[a], b), type="W")
00746                 z[a] = b
00747             hints[x] = z
00748 
00749         return hints
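The "/"-splitting done for versioned hints like `unblock` or `remove` can be shown on its own; the hint line and the submitter name "aba" are invented examples:

```python
def parse_versioned_hint(line):
    # For hints like "unblock <pkg>/<version>", map each package to a
    # (version, who) pair, as read_hints does; "who" is hard-coded here.
    l = line.split()
    cmd = l[0]
    entries = dict([(k.split("/")[0], (k.split("/")[1], "aba"))
                    for k in l[1:] if "/" in k])
    return cmd, entries

print(parse_versioned_hint("unblock glibc/2.3.6-7 dash/0.5.3-5"))
```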
00750 
00751     def write_heidi(self, basedir, filename):
00752         """Write the output HeidiResult
00753 
00754         This method writes the output for Heidi, which contains all the
00755         binary packages and the source packages in the form:
00756         
00757         <pkg-name> <pkg-version> <pkg-architecture> <pkg-section>
00758         <src-name> <src-version> <src-section>
00759         """
00760         filename = os.path.join(basedir, filename)
00761         self.__log("Writing Heidi results to %s" % filename)
00762         f = open(filename, 'w')
00763 
00764         # local copies
00765         sources = self.sources['testing']
00766 
00767         # write binary packages
00768         for arch in sorted(self.options.architectures):
00769             binaries = self.binaries['testing'][arch][0]
00770             for pkg_name in sorted(binaries):
00771                 pkg = binaries[pkg_name]
00772                 pkgv = pkg[VERSION]
00773                 pkgarch = pkg[ARCHITECTURE]
00774                 pkgsec = pkg[SECTION] or 'unknown'
00775                 f.write('%s %s %s %s\n' % (pkg_name, pkgv, pkgarch, pkgsec))
00776 
00777         # write sources
00778         for src_name in sorted(sources):
00779             src = sources[src_name]
00780             srcv = src[VERSION]
00781             srcsec = src[FAKESRC] and 'faux' or src[SECTION] or 'unknown'
00782             f.write('%s %s source %s\n' % (src_name, srcv, srcsec))
00783 
00784         f.close()
00785 
00786     def write_controlfiles(self, basedir, suite):
00787         """Write the control files
00788 
00789         This method writes the control files for the binary packages of all
00790         the architectures and for the source packages.
00791         """
00792         sources = self.sources[suite]
00793 
00794         self.__log("Writing new %s control files to %s" % (suite, basedir))
00795         for arch in self.options.architectures:
00796             filename = os.path.join(basedir, 'Packages_%s' % arch)
00797             f = open(filename, 'w')
00798             binaries = self.binaries[suite][arch][0]
00799             for pkg in binaries:
00800                 output = "Package: %s\n" % pkg
00801                 for key, k in ((SECTION, 'Section'), (ARCHITECTURE, 'Architecture'), (SOURCE, 'Source'), (VERSION, 'Version'), 
00802                           (PREDEPENDS, 'Pre-Depends'), (DEPENDS, 'Depends'), (PROVIDES, 'Provides'), (CONFLICTS, 'Conflicts')):
00803                     if not binaries[pkg][key]: continue
00804                     if key == SOURCE:
00805                         if binaries[pkg][SOURCE] == pkg:
00806                             if binaries[pkg][SOURCEVER] != binaries[pkg][VERSION]:
00807                                 source = binaries[pkg][SOURCE] + " (" + binaries[pkg][SOURCEVER] + ")"
00808                             else: continue
00809                         else:
00810                             if binaries[pkg][SOURCEVER] != binaries[pkg][VERSION]:
00811                                 source = binaries[pkg][SOURCE] + " (" + binaries[pkg][SOURCEVER] + ")"
00812                             else:
00813                                 source = binaries[pkg][SOURCE]
00814                         output += (k + ": " + source + "\n")
00815                         if sources[binaries[pkg][SOURCE]][MAINTAINER]:
00816                             output += ("Maintainer: " + sources[binaries[pkg][SOURCE]][MAINTAINER] + "\n")
00817                     elif key == PROVIDES:
00818                         if len(binaries[pkg][key]) > 0:
00819                             output += (k + ": " + ", ".join(binaries[pkg][key]) + "\n")
00820                     else:
00821                         output += (k + ": " + binaries[pkg][key] + "\n")
00822                 f.write(output + "\n")
00823             f.close()
00824 
00825         filename = os.path.join(basedir, 'Sources')
00826         f = open(filename, 'w')
00827         for src in sources:
00828             output = "Package: %s\n" % src
00829             for k in ('Version', 'Section', 'Maintainer'):
00830                 key = k.lower()
00831                 if key not in sources[src] or not sources[src][key]: continue
00832                 output += (k + ": " + sources[src][key] + "\n")
00833             f.write(output + "\n")
00834         f.close()
00835 
00836     def write_nuninst(self, nuninst):
00837         """Write the non-installable report"""
00838         f = open(self.options.noninst_status, 'w')
00839         f.write("Built on: " + time.strftime("%Y.%m.%d %H:%M:%S %z", time.gmtime(time.time())) + "\n")
00840         f.write("Last update: " + time.strftime("%Y.%m.%d %H:%M:%S %z", time.gmtime(time.time())) + "\n\n")
00841         f.write("".join([k + ": " + " ".join(nuninst[k]) + "\n" for k in nuninst]))
00842         f.close()
00843 
00844     def read_nuninst(self):
00845         """Read the non-installable report"""
00846         f = open(self.options.noninst_status)
00847         nuninst = {}
00848         for r in f:
00849             if ":" not in r: continue
00850             arch, packages = r.strip().split(":", 1)
00851             if arch.split("+", 1)[0] in self.options.architectures:
00852                 nuninst[arch] = packages.split()
00853         return nuninst
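The non-installable report format ("<arch>: <pkg> <pkg> ...", with "+all"-suffixed per-architecture variants) can be parsed standalone; the sample lines are invented:

```python
def parse_nuninst(lines, architectures):
    # "<arch>: pkg1 pkg2 ..." rows; "+all" suffixed keys (e.g. "i386+all")
    # are kept whole but matched against the base architecture name.
    nuninst = {}
    for r in lines:
        if ":" not in r:
            continue
        arch, packages = r.strip().split(":", 1)
        if arch.split("+", 1)[0] in architectures:
            nuninst[arch] = packages.split()
    return nuninst

lines = ["i386: foo bar", "i386+all: baz", "m68k: quux"]
print(parse_nuninst(lines, ["i386"]))
```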
00854 
00855 
00856     # Utility methods for package analysis
00857     # ------------------------------------
00858 
00859     def same_source(self, sv1, sv2):
00860         """Check if two version numbers are built from the same source
00861 
00862         This method returns a boolean value which is true if the two
00863         version numbers specified as parameters are built from the same
00864         source. The main use of this code is to detect binary-NMUs.
00865         """
00866         if sv1 == sv2:
00867             return 1
00868 
00869         m = re.match(r'^(.*)\+b\d+$', sv1)
00870         if m: sv1 = m.group(1)
00871         m = re.match(r'^(.*)\+b\d+$', sv2)
00872         if m: sv2 = m.group(1)
00873 
00874         if sv1 == sv2:
00875             return 1
00876 
00877         if re.search("-", sv1) or re.search("-", sv2):
00878             m = re.match(r'^(.*-[^.]+)\.0\.\d+$', sv1)
00879             if m: sv1 = m.group(1)
00880             m = re.match(r'^(.*-[^.]+\.[^.]+)\.\d+$', sv1)
00881             if m: sv1 = m.group(1)
00882 
00883             m = re.match(r'^(.*-[^.]+)\.0\.\d+$', sv2)
00884             if m: sv2 = m.group(1)
00885             m = re.match(r'^(.*-[^.]+\.[^.]+)\.\d+$', sv2)
00886             if m: sv2 = m.group(1)
00887 
00888             return (sv1 == sv2)
00889         else:
00890             m = re.match(r'^([^-]+)\.0\.\d+$', sv1)
00891             if m and sv2 == m.group(1): return 1
00892 
00893             m = re.match(r'^([^-]+)\.0\.\d+$', sv2)
00894             if m and sv1 == m.group(1): return 1
00895 
00896             return 0
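The binNMU-suffix stripping done in the first step above can be exercised on its own; the version strings are invented examples:

```python
import re

def strip_binnmu(v):
    # Drop a trailing "+bN" binary-NMU marker, as same_source does first.
    m = re.match(r'^(.*)\+b\d+$', v)
    if m:
        return m.group(1)
    return v

print(strip_binnmu("2.3.6-7+b1"))
print(strip_binnmu("2.3.6-7"))
```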
00897 
00898     def get_dependency_solvers(self, block, arch, distribution, excluded=[], strict=False):
00899         """Find the packages which satisfy a dependency block
00900 
00901         This method returns the list of packages which satisfy a dependency
00902         block (as returned by apt_pkg.ParseDepends) for the given architecture
00903         and distribution.
00904 
00905         It returns a tuple with two items: the first is a boolean which is
00906         True if the dependency is satisfied, the second is the list of the
00907         solving packages.
00908         """
00909 
00910         packages = []
00911 
00912         # local copies for better performance
00913         binaries = self.binaries[distribution][arch]
00914 
00915         # for every package, version and operation in the block
00916         for name, version, op in block:
00917             # look for the package in unstable
00918             if name not in excluded and name in binaries[0]:
00919                 package = binaries[0][name]
00920                 # check the versioned dependency (if present)
00921                 if op == '' and version == '' or apt_pkg.CheckDep(package[VERSION], op, version):
00922                     packages.append(name)
00923 
00924             # look for the package in the virtual packages list and loop on them
00925             for prov in binaries[1].get(name, []):
00926                 if prov in excluded or \
00927                    prov not in binaries[0]: continue
00928                 package = binaries[0][prov]
00929                 # check the versioned dependency (if present)
00930                 # TODO: this is forbidden by the debian policy, which says that versioned
00931                 #       dependencies on virtual packages are never satisfied. The old britney
00932                 #       does it and we have to go with it, but at least a warning should be raised.
00933                 if op == '' and version == '' or not strict and apt_pkg.CheckDep(package[VERSION], op, version):
00934                     packages.append(prov)
00935                     break
00936 
00937         return (len(packages) > 0, packages)
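A simplified solver over invented data shows the control flow for both real and virtual packages; only unversioned and exact `=` dependencies are handled here, whereas the real code defers to apt_pkg.CheckDep:

```python
def simple_solvers(block, binaries, provides, excluded=()):
    # block: list of (name, version, op) triples, one OR-alternative each,
    # as apt_pkg.ParseDepends would yield for a single dependency block.
    packages = []
    for name, version, op in block:
        # real package: check the (simplified) versioned dependency
        if name in binaries and name not in excluded:
            if op == '' and version == '' or \
               (op == '=' and binaries[name] == version):
                packages.append(name)
        # virtual package: only unversioned dependencies can match
        for prov in provides.get(name, []):
            if prov in excluded or prov not in binaries:
                continue
            if op == '' and version == '':
                packages.append(prov)
                break
    return (len(packages) > 0, packages)

binaries = {"libfoo1": "1.0", "postfix": "2.2"}
provides = {"mail-transport-agent": ["postfix"]}
print(simple_solvers([("mail-transport-agent", "", "")], binaries, provides))
```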
00938 
00939     def excuse_unsat_deps(self, pkg, src, arch, suite, excuse, excluded=[], conflicts=False):
00940         """Find unsatisfied dependencies for a binary package
00941 
00942         This method analyzes the dependencies of the binary package specified
00943         by the parameter `pkg', built from the source package `src', for the
00944         architecture `arch' within the suite `suite'. If the dependency can't
00945         be satisfied in testing and/or unstable, it updates the excuse passed
00946         as parameter.
00947 
00948         The dependency fields checked are Pre-Depends and Depends.
00949         """
00950         # retrieve the binary package from the specified suite and arch
00951         binary_u = self.binaries[suite][arch][0][pkg]
00952 
00953         # local copies for better performance
00954         parse_depends = apt_pkg.ParseDepends
00955         get_dependency_solvers = self.get_dependency_solvers
00956         strict = True # not self.options.compatible
00957 
00958         # analyze the dependency fields (if present)
00959         for type_key, type in ((PREDEPENDS, 'Pre-Depends'), (DEPENDS, 'Depends')):
00960             if not binary_u[type_key]:
00961                 continue
00962 
00963             # for every dependency block (a conjunction of disjunctions)
00964             for block, block_txt in zip(parse_depends(binary_u[type_key]), binary_u[type_key].split(',')):
00965                 # if the block is satisfied in testing, then skip the block
00966                 solved, packages = get_dependency_solvers(block, arch, 'testing', excluded, strict=strict)
00967                 if solved:
00968                     for p in packages:
00969                         if p not in self.binaries[suite][arch][0]: continue
00970                         excuse.add_sane_dep(self.binaries[suite][arch][0][p][SOURCE])
00971                     continue
00972 
00973                 # check if the block can be satisfied in unstable, and list the solving packages
00974                 solved, packages = get_dependency_solvers(block, arch, suite, [], strict=strict)
00975                 packages = [self.binaries[suite][arch][0][p][SOURCE] for p in packages]
00976 
00977                 # if the dependency can be satisfied by the same source package, skip the block:
00978                 # obviously both binary packages will enter testing together
00979                 if src in packages: continue
00980 
00981                 # if no package can satisfy the dependency, add this information to the excuse
00982                 if len(packages) == 0:
00983                     excuse.addhtml("%s/%s unsatisfiable %s: %s" % (pkg, arch, type, block_txt.strip()))
00984                     if arch not in self.options.break_arches.split(): excuse.add_unsat_dep(arch)
00985                     continue
00986 
00987                 # for the solving packages, update the excuse to add the dependencies
00988                 for p in packages:
00989                     if arch not in self.options.break_arches.split():
00990                         excuse.add_dep(p)
00991                     else:
00992                         excuse.add_break_dep(p, arch)
00993 
00994         return True
00995 
00996     # Package analysis methods
00997     # ------------------------
00998 
00999     def should_remove_source(self, pkg):
01000         """Check if a source package should be removed from testing
01001         
01002         This method checks if a source package should be removed from the
01003         testing distribution; this happens if the source package is not
01004         present in the unstable distribution anymore.
01005 
01006         It returns True if the package can be removed, False otherwise.
01007         In the former case, a new excuse is appended to the object
01008         attribute excuses.
01009         """
01010         # if the source package is available in unstable, then do nothing
01011         if pkg in self.sources['unstable']:
01012             return False
01013         # otherwise, add a new excuse for its removal and return True
01014         src = self.sources['testing'][pkg]
01015         excuse = Excuse("-" + pkg)
01016         excuse.set_vers(src[VERSION], None)
01017         src[MAINTAINER] and excuse.set_maint(src[MAINTAINER].strip())
01018         src[SECTION] and excuse.set_section(src[SECTION].strip())
01019 
01020         # if the package is blocked, skip it
01021         if self.hints['block'].has_key('-' + pkg):
01022             excuse.addhtml("Not touching package, as requested by %s (contact debian-release if update is needed)" % self.hints['block']['-' + pkg])
01023             return False
01024 
01025         excuse.addhtml("Valid candidate")
01026         self.excuses.append(excuse)
01027         return True
01028 
01029     def should_upgrade_srcarch(self, src, arch, suite):
01030         """Check if binary package should be upgraded
01031 
01032         This method checks if a binary package should be upgraded; this can
01033         also happen if the binary package is a binary-NMU for the given arch.
01034         The analysis is performed for the source package specified by the
01035         `src' parameter, checking the architecture `arch' for the distribution
01036         `suite'.
01037        
01038         It returns False if the given package doesn't need to be upgraded,
01039         True otherwise. In the former case, a new excuse is appended to
01040         the object attribute excuses.
01041         """
01042         # retrieve the source packages for testing and suite
01043         source_t = self.sources['testing'][src]
01044         source_u = self.sources[suite][src]
01045 
01046         # build the common part of the excuse, which will be filled by the code below
01047         ref = "%s/%s%s" % (src, arch, suite != 'unstable' and "_" + suite or "")
01048         excuse = Excuse(ref)
01049         excuse.set_vers(source_t[VERSION], source_t[VERSION])
01050         source_u[MAINTAINER] and excuse.set_maint(source_u[MAINTAINER].strip())
01051         source_u[SECTION] and excuse.set_section(source_u[SECTION].strip())
01052         
01053         # if there is a `remove' hint and the requested version is the same of the
01054         # version in testing, then stop here and return False
01055         if src in self.hints["remove"] and \
01056            self.same_source(source_t[VERSION], self.hints["remove"][src][0]):
01057             excuse.addhtml("Removal request by %s" % (self.hints["remove"][src][1]))
01058             excuse.addhtml("Trying to remove package, not update it")
01059             excuse.addhtml("Not considered")
01060             self.excuses.append(excuse)
01061             return False
01062 
01063         # the starting point is that there is nothing wrong and nothing worth doing
01064         anywrongver = False
01065         anyworthdoing = False
01066 
01067         # for every binary package produced by this source in unstable for this architecture
01068         for pkg in sorted(filter(lambda x: x.endswith("/" + arch), source_u[BINARIES]), key=lambda x: x.split("/")[0]):
01069             pkg_name = pkg.split("/")[0]
01070 
01071             # retrieve the testing (if present) and unstable corresponding binary packages
01072             binary_t = pkg in source_t[BINARIES] and self.binaries['testing'][arch][0][pkg_name] or None
01073             binary_u = self.binaries[suite][arch][0][pkg_name]
01074 
01075             # this is the source version for the new binary package
01076             pkgsv = self.binaries[suite][arch][0][pkg_name][SOURCEVER]
01077 
01078             # if the new binary package is architecture-independent, then skip it
01079             if binary_u[ARCHITECTURE] == 'all':
01080                 excuse.addhtml("Ignoring %s %s (from %s) as it is arch: all" % (pkg_name, binary_u[VERSION], pkgsv))
01081                 continue
01082 
01083             # if the new binary package is not from the same source as the testing one, then skip it
01084             if not self.same_source(source_t[VERSION], pkgsv):
01085                 anywrongver = True
01086                 excuse.addhtml("From wrong source: %s %s (%s not %s)" % (pkg_name, binary_u[VERSION], pkgsv, source_t[VERSION]))
01087                 break
01088 
01089             # find unsatisfied dependencies for the new binary package
01090             self.excuse_unsat_deps(pkg_name, src, arch, suite, excuse)
01091 
01092             # if the binary is not present in testing, then it is a new binary;
01093             # in this case, there is something worth doing
01094             if not binary_t:
01095                 excuse.addhtml("New binary: %s (%s)" % (pkg_name, binary_u[VERSION]))
01096                 anyworthdoing = True
01097                 continue
01098 
01099             # at this point, the binary package is present in testing, so we can compare
01100             # the versions of the packages ...
01101             vcompare = apt_pkg.VersionCompare(binary_t[VERSION], binary_u[VERSION])
01102 
01103             # ... if updating would mean downgrading, then stop here: there is something wrong
01104             if vcompare > 0:
01105                 anywrongver = True
01106                 excuse.addhtml("Not downgrading: %s (%s to %s)" % (pkg_name, binary_t[VERSION], binary_u[VERSION]))
01107                 break
01108             # ... if updating would mean upgrading, then there is something worth doing
01109             elif vcompare < 0:
01110                 excuse.addhtml("Updated binary: %s (%s to %s)" % (pkg_name, binary_t[VERSION], binary_u[VERSION]))
01111                 anyworthdoing = True
01112 
01113         # if there is nothing wrong and there is something worth doing or the source
01114         # package is not fake, then check what packages should be removed
01115         if not anywrongver and (anyworthdoing or not self.sources[suite][src][FAKESRC]):
01116             srcv = self.sources[suite][src][VERSION]
01117             ssrc = self.same_source(source_t[VERSION], srcv)
01118             # for every binary package produced by this source in testing for this architecture
01119             for pkg in sorted([x.split("/")[0] for x in self.sources['testing'][src][BINARIES] if x.endswith("/"+arch)]):
01120                 # if the package is architecture-independent, then ignore it
01121                 if self.binaries['testing'][arch][0][pkg][ARCHITECTURE] == 'all':
01122                     excuse.addhtml("Ignoring removal of %s as it is arch: all" % (pkg))
01123                     continue
01124                 # if the package is not produced by the new source package, then remove it from testing
01125                 if pkg not in self.binaries[suite][arch][0]:
01126                     tpkgv = self.binaries['testing'][arch][0][pkg][VERSION]
01127                     excuse.addhtml("Removed binary: %s %s" % (pkg, tpkgv))
01128                     if ssrc: anyworthdoing = True
01129 
01130         # if there is nothing wrong and there is something worth doing, this is valid candidate
01131         if not anywrongver and anyworthdoing:
01132             excuse.addhtml("Valid candidate")
01133             self.excuses.append(excuse)
01134             return True
01135         # else if there is something worth doing (but something wrong, too) this package won't be considered
01136         elif anyworthdoing:
01137             excuse.addhtml("Not considered")
01138             self.excuses.append(excuse)
01139 
01140         # otherwise, return False
01141         return False
01142 
01143     def should_upgrade_src(self, src, suite):
01144         """Check if source package should be upgraded
01145 
01146         This method checks if a source package should be upgraded. The analysis
01147         is performed for the source package specified by the `src' parameter,
01148         within the distribution `suite'.
01149        
01150         It returns False if the given package doesn't need to be upgraded,
01151         True otherwise. In the former case, a new excuse is appended to
01152         the object attribute excuses.
01153         """
01154 
01155         # retrieve the source packages for testing (if available) and suite
01156         source_u = self.sources[suite][src]
01157         if src in self.sources['testing']:
01158             source_t = self.sources['testing'][src]
01159             # if testing and unstable have the same version, then this is a candidate for binary-NMUs only
01160             if apt_pkg.VersionCompare(source_t[VERSION], source_u[VERSION]) == 0:
01161                 return False
01162         else:
01163             source_t = None
01164 
01165         # build the common part of the excuse, which will be filled by the code below
01166         ref = "%s%s" % (src, suite != 'unstable' and "_" + suite or "")
01167         excuse = Excuse(ref)
01168         excuse.set_vers(source_t and source_t[VERSION] or None, source_u[VERSION])
01169         source_u[MAINTAINER] and excuse.set_maint(source_u[MAINTAINER].strip())
01170         source_u[SECTION] and excuse.set_section(source_u[SECTION].strip())
01171 
01172         # the starting point is that we will update the candidate
01173         update_candidate = True
01174         
01175         # if the version in unstable is older, then stop here with a warning in the excuse and return False
01176         if source_t and apt_pkg.VersionCompare(source_u[VERSION], source_t[VERSION]) < 0:
01177             excuse.addhtml("ALERT: %s is newer in testing (%s %s)" % (src, source_t[VERSION], source_u[VERSION]))
01178             self.excuses.append(excuse)
01179             return False
01180 
01181         # check if the source package really exists or if it is a fake one
01182         if source_u[FAKESRC]:
01183             excuse.addhtml("%s source package doesn't exist" % (src))
01184             update_candidate = False
01185 
01186         # retrieve the urgency for the upload, ignoring it if this is a NEW package (not present in testing)
01187         urgency = self.urgencies.get(src, self.options.default_urgency)
01188         if not source_t and urgency != self.options.default_urgency:
01189             excuse.addhtml("Ignoring %s urgency setting for NEW package" % (urgency))
01190             urgency = self.options.default_urgency
01191 
01192         # if there is a `remove' hint and the requested version is the same of the
01193         # version in testing, then stop here and return False
01194         if src in self.hints["remove"]:
01195             if source_t and self.same_source(source_t[VERSION], self.hints['remove'][src][0]) or \
01196                self.same_source(source_u[VERSION], self.hints['remove'][src][0]):
01197                 excuse.addhtml("Removal request by %s" % (self.hints["remove"][src][1]))
01198                 excuse.addhtml("Trying to remove package, not update it")
01199                 update_candidate = False
01200 
01201         # check if there is a `block' hint for this package or a `block-all source' hint
01202         blocked = None
01203         if src in self.hints["block"]:
01204             blocked = self.hints["block"][src]
01205         elif 'source' in self.hints["block-all"]:
01206             blocked = self.hints["block-all"]["source"]
01207 
01208         # if the source is blocked, then look for an `unblock' hint; the unblock request
01209         # is processed only if the specified version is correct
01210         if blocked:
01211             unblock = self.hints["unblock"].get(src,(None,None))
01212             if unblock[0] != None:
01213                 if self.same_source(unblock[0], source_u[VERSION]):
01214                     excuse.addhtml("Ignoring request to block package by %s, due to unblock request by %s" % (blocked, unblock[1]))
01215                 else:
01216                     excuse.addhtml("Unblock request by %s ignored due to version mismatch: %s" % (unblock[1], unblock[0]))
01217             else:
01218                 excuse.addhtml("Not touching package, as requested by %s (contact debian-release if update is needed)" % (blocked))
01219                 update_candidate = False
01220 
01221         # if the suite is unstable, then we have to check the urgency and the minimum days of
01222         # permanence in unstable before updating testing; if the source package is too young,
01223         # the check fails and we set update_candidate to False to block the update
01224         if suite == 'unstable':
01225             if src not in self.dates:
01226                 self.dates[src] = (source_u[VERSION], self.date_now)
01227             elif not self.same_source(self.dates[src][0], source_u[VERSION]):
01228                 self.dates[src] = (source_u[VERSION], self.date_now)
01229 
01230             days_old = self.date_now - self.dates[src][1]
01231             min_days = self.MINDAYS[urgency]
01232             excuse.setdaysold(days_old, min_days)
01233             if days_old < min_days:
01234                 if src in self.hints["urgent"] and self.same_source(source_u[VERSION], self.hints["urgent"][src][0]):
01235                     excuse.addhtml("Too young, but urgency pushed by %s" % (self.hints["urgent"][src][1]))
01236                 else:
01237                     update_candidate = False
01238 
01239         # at this point, we check the status of the builds on all the supported
01240         # architectures to catch the out-of-date ones
01241         pkgs = {src: ["source"]}
01242         for arch in self.options.architectures:
01243             oodbins = {}
01244             # for every binary package produced by this source in the suite for this architecture
01245             for pkg in sorted([x.split("/")[0] for x in self.sources[suite][src][BINARIES] if x.endswith("/"+arch)]):
01246                 if pkg not in pkgs: pkgs[pkg] = []
01247                 pkgs[pkg].append(arch)
01248 
01249                 # retrieve the binary package and its source version
01250                 binary_u = self.binaries[suite][arch][0][pkg]
01251                 pkgsv = binary_u[SOURCEVER]
01252 
01253                 # if it wasn't built by the same source version, it is out-of-date
01254                 if not self.same_source(source_u[VERSION], pkgsv):
01255                     if pkgsv not in oodbins:
01256                         oodbins[pkgsv] = []
01257                     oodbins[pkgsv].append(pkg)
01258                     continue
01259 
01260                 # if the package is architecture-dependent or the current arch is `nobreakall'
01261                 # find unsatisfied dependencies for the binary package
01262                 if binary_u[ARCHITECTURE] != 'all' or arch in self.options.nobreakall_arches:
01263                     self.excuse_unsat_deps(pkg, src, arch, suite, excuse)
01264 
01265             # if there are out-of-date packages, warn about them in the excuse and set update_candidate
01266             # to False to block the update; if the architecture where the package is out-of-date is
01267             # in the `fucked_arches' list, then do not block the update
01268             if oodbins:
01269                 oodtxt = ""
01270                 for v in oodbins.keys():
01271                     if oodtxt: oodtxt = oodtxt + "; "
01272                     oodtxt = oodtxt + "%s (from <a href=\"http://buildd.debian.org/build.php?" \
01273                         "arch=%s&pkg=%s&ver=%s\" target=\"_blank\">%s</a>)" % \
01274                         (", ".join(sorted(oodbins[v])), arch, src, v, v)
01275                 text = "out of date on <a href=\"http://buildd.debian.org/build.php?" \
01276                     "arch=%s&pkg=%s&ver=%s\" target=\"_blank\">%s</a>: %s" % \
01277                     (arch, src, source_u[VERSION], arch, oodtxt)
01278 
01279                 if arch in self.options.fucked_arches:
01280                     text = text + " (but %s isn't keeping up, so nevermind)" % (arch)
01281                 else:
01282                     update_candidate = False
01283 
01284                 if self.date_now != self.dates[src][1]:
01285                     excuse.addhtml(text)
01286 
01287         # if the source package has no binaries, set update_candidate to False to block the update
01288         if len(self.sources[suite][src][BINARIES]) == 0:
01289             excuse.addhtml("%s has no binaries on any arch" % src)
01290             update_candidate = False
01291 
01292         # if the suite is unstable, then we have to check the release-critical bug counts before
01293         # updating testing; if the unstable package has an RC bug count greater than the testing
01294         # one, the check fails and we set update_candidate to False to block the update
01295         if suite == 'unstable':
01296             for pkg in pkgs.keys():
01297                 if pkg not in self.bugs['testing']:
01298                     self.bugs['testing'][pkg] = 0
01299                 if pkg not in self.bugs['unstable']:
01300                     self.bugs['unstable'][pkg] = 0
01301 
01302                 if self.bugs['unstable'][pkg] > self.bugs['testing'][pkg]:
01303                     excuse.addhtml("%s (%s) is <a href=\"http://bugs.debian.org/cgi-bin/pkgreport.cgi?" \
01304                                    "which=pkg&data=%s&sev-inc=critical&sev-inc=grave&sev-inc=serious\" " \
01305                                    "target=\"_blank\">buggy</a>! (%d > %d)" % \
01306                                    (pkg, ", ".join(pkgs[pkg]), pkg, self.bugs['unstable'][pkg], self.bugs['testing'][pkg]))
01307                     update_candidate = False
01308                 elif self.bugs['unstable'][pkg] > 0:
01309                     excuse.addhtml("%s (%s) is (less) <a href=\"http://bugs.debian.org/cgi-bin/pkgreport.cgi?" \
01310                                    "which=pkg&data=%s&sev-inc=critical&sev-inc=grave&sev-inc=serious\" " \
01311                                    "target=\"_blank\">buggy</a>! (%d <= %d)" % \
01312                                    (pkg, ", ".join(pkgs[pkg]), pkg, self.bugs['unstable'][pkg], self.bugs['testing'][pkg]))
01313 
01314         # check if there is a `force' hint for this package, which allows it to go in even if it is not updateable
01315         if not update_candidate and src in self.hints["force"] and \
01316            self.same_source(source_u[VERSION], self.hints["force"][src][0]):
01317             excuse.dontinvalidate = 1
01318             excuse.addhtml("Should ignore, but forced by %s" % (self.hints["force"][src][1]))
01319             update_candidate = True
01320 
01321         # if the suite is testing-proposed-updates, the package needs an explicit approval in order to go in
01322         if suite == "tpu":
01323             key = "%s_%s" % (src, source_u[VERSION])
01324             if key in self.approvals:
01325                 excuse.addhtml("Approved by %s" % self.approvals[key])
01326             else:
01327                 excuse.addhtml("NEEDS APPROVAL BY RM")
01328                 update_candidate = False
01329 
01330         # if the package can be updated, it is a valid candidate
01331         if update_candidate:
01332             excuse.addhtml("Valid candidate")
01333         # else it won't be considered
01334         else:
01335             excuse.addhtml("Not considered")
01336 
01337         self.excuses.append(excuse)
01338         return update_candidate
01339 
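The age gate applied in should_upgrade_src() can be sketched in isolation. This is a hypothetical standalone helper, not part of Britney's API, and the MINDAYS values below are illustrative assumptions (the real values come from Britney's configuration):

```python
# Illustrative urgency-to-days mapping (assumed values, not Britney's config).
MINDAYS = {'low': 10, 'medium': 5, 'high': 2, 'critical': 0, 'emergency': 0}

def old_enough(date_now, date_uploaded, urgency, urgent_hint=False):
    """Return True if the source has aged enough in unstable to migrate,
    or if an `urgent' hint waives the requirement."""
    days_old = date_now - date_uploaded
    min_days = MINDAYS.get(urgency, MINDAYS['low'])
    return days_old >= min_days or urgent_hint
```

As in the method above, a matching `urgent` hint lets a too-young package through while everything else keeps aging.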
01340     def reversed_exc_deps(self):
01341         """Reverse the excuses dependencies
01342 
01343         This method returns a dictionary where the keys are the excuse names
01344         and the values are lists of the excuse names which depend on them.
01345         """
01346         res = {}
01347         for exc in self.excuses:
01348             for d in exc.deps:
01349                 if d not in res: res[d] = []
01350                 res[d].append(exc.name)
01351         return res
01352 
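The inversion performed by reversed_exc_deps() can be sketched with plain (name, deps) pairs standing in for Excuse objects:

```python
def reverse_deps(excuses):
    """Invert a list of (name, deps) pairs into {dep: [names depending on it]}."""
    res = {}
    for name, deps in excuses:
        for d in deps:
            res.setdefault(d, []).append(name)
    return res
```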
01353     def invalidate_excuses(self, valid, invalid):
01354         """Invalidate impossible excuses
01355 
01356         This method invalidates the impossible excuses, which depend
01357         on invalid excuses. The two parameters contain the lists of
01358         `valid' and `invalid' excuses.
01359         """
01360         # build a lookup-by-name map
01361         exclookup = {}
01362         for e in self.excuses:
01363             exclookup[e.name] = e
01364 
01365         # build the reverse dependencies
01366         revdeps = self.reversed_exc_deps()
01367 
01368         # loop on the invalid excuses
01369         i = 0
01370         while i < len(invalid):
01371             # if there is no reverse dependency, skip the item
01372             if invalid[i] not in revdeps:
01373                 i += 1
01374                 continue
01375             # if the dependency can be satisfied by a testing-proposed-updates excuse, skip the item
01376             if (invalid[i] + "_tpu") in valid:
01377                 i += 1
01378                 continue
01379             # loop on the reverse dependencies
01380             for x in revdeps[invalid[i]]:
01381                 # if the item is valid and it is marked as `dontinvalidate', skip the item
01382                 if x in valid and exclookup[x].dontinvalidate:
01383                     continue
01384 
01385                 # otherwise, invalidate the dependency and mark as invalidated and
01386                 # remove the depending excuses
01387                 exclookup[x].invalidate_dep(invalid[i])
01388                 if x in valid:
01389                     p = valid.index(x)
01390                     invalid.append(valid.pop(p))
01391                     exclookup[x].addhtml("Invalidated by dependency")
01392                     exclookup[x].addhtml("Not considered")
01393             i = i + 1
01394  
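invalidate_excuses() is essentially a worklist algorithm: every excuse depending on an invalid one becomes invalid itself, and the wave propagates until a fixpoint is reached. A simplified sketch with plain name lists (the `dontinvalidate` and tpu special cases are omitted):

```python
def invalidate(valid, invalid, revdeps):
    """Move every excuse transitively depending on an invalid one into
    `invalid`; both lists are modified in place.  `revdeps` maps a name
    to the names which depend on it."""
    i = 0
    while i < len(invalid):
        for x in revdeps.get(invalid[i], []):
            if x in valid:
                invalid.append(valid.pop(valid.index(x)))
        i += 1
```

Newly invalidated entries are appended to `invalid`, so their own reverse dependencies are processed in later iterations of the same loop.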
01395     def write_excuses(self):
01396         """Produce and write the update excuses
01397 
01398         This method handles the update excuses generation: the packages are
01399         examined to determine whether they are valid candidates. For the details
01400         of this procedure, please refer to the module docstring.
01401         """
01402 
01403         self.__log("Update Excuses generation started", type="I")
01404 
01405         # list of local methods and variables (for better performance)
01406         sources = self.sources
01407         architectures = self.options.architectures
01408         should_remove_source = self.should_remove_source
01409         should_upgrade_srcarch = self.should_upgrade_srcarch
01410         should_upgrade_src = self.should_upgrade_src
01411 
01412         # this list will contain the packages which are valid candidates;
01413         # if a package is going to be removed, it will have a "-" prefix
01414         upgrade_me = []
01415 
01416         # for every source package in testing, check if it should be removed
01417         for pkg in sources['testing']:
01418             if should_remove_source(pkg):
01419                 upgrade_me.append("-" + pkg)
01420 
01421         # for every source package in unstable check if it should be upgraded
01422         for pkg in sources['unstable']:
01423             if sources['unstable'][pkg][FAKESRC]: continue
01424             # if the source package is already present in testing,
01425             # check if it should be upgraded for every binary package
01426             if pkg in sources['testing'] and not sources['testing'][pkg][FAKESRC]:
01427                 for arch in architectures:
01428                     if should_upgrade_srcarch(pkg, arch, 'unstable'):
01429                         upgrade_me.append("%s/%s" % (pkg, arch))
01430 
01431             # check if the source package should be upgraded
01432             if should_upgrade_src(pkg, 'unstable'):
01433                 upgrade_me.append(pkg)
01434 
01435         # for every source package in testing-proposed-updates, check if it should be upgraded
01436         for pkg in sources['tpu']:
01437             if sources['tpu'][pkg][FAKESRC]: continue
01438             # if the source package is already present in testing,
01439             # check if it should be upgraded for every binary package
01440             if pkg in sources['testing']:
01441                 for arch in architectures:
01442                     if should_upgrade_srcarch(pkg, arch, 'tpu'):
01443                         upgrade_me.append("%s/%s_tpu" % (pkg, arch))
01444 
01445             # check if the source package should be upgraded
01446             if should_upgrade_src(pkg, 'tpu'):
01447                 upgrade_me.append("%s_tpu" % pkg)
01448 
01449         # process the `remove' hints, if the given package is not yet in upgrade_me
01450         for src in self.hints["remove"].keys():
01451             if src in upgrade_me: continue
01452             if ("-"+src) in upgrade_me: continue
01453             if src not in sources['testing']: continue
01454 
01455             # check if the version specified in the hint is the same as the considered package's
01456             tsrcv = sources['testing'][src][VERSION]
01457             if not self.same_source(tsrcv, self.hints["remove"][src][0]): continue
01458 
01459             # add the removal of the package to upgrade_me and build a new excuse
01460             upgrade_me.append("-%s" % (src))
01461             excuse = Excuse("-%s" % (src))
01462             excuse.set_vers(tsrcv, None)
01463             excuse.addhtml("Removal request by %s" % (self.hints["remove"][src][1]))
01464             excuse.addhtml("Package is broken, will try to remove")
01465             self.excuses.append(excuse)
01466 
01467         # sort the excuses by daysold and name
01468         self.excuses.sort(lambda x, y: cmp(x.daysold, y.daysold) or cmp(x.name, y.name))
01469 
01470         # extract the not considered packages, which are in the excuses but not in upgrade_me
01471         unconsidered = [e.name for e in self.excuses if e.name not in upgrade_me]
01472 
01473         # invalidate impossible excuses
01474         for e in self.excuses:
01475             for d in e.deps:
01476                 if d not in upgrade_me and d not in unconsidered:
01477                     e.addhtml("Impossible dep: %s -> %s" % (e.name, d))
01478         self.invalidate_excuses(upgrade_me, unconsidered)
01479 
01480         # sort the list of candidates
01481         self.upgrade_me = sorted(upgrade_me)
01482 
01483         # write excuses to the output file
01484         self.__log("> Writing Excuses to %s" % self.options.excuses_output, type="I")
01485 
01486         f = open(self.options.excuses_output, 'w')
01487         f.write("<!DOCTYPE HTML PUBLIC \"-//W3C//DTD HTML 4.01//EN\" \"http://www.w3.org/TR/REC-html40/strict.dtd\">\n")
01488         f.write("<html><head><title>excuses...</title>")
01489         f.write("<meta http-equiv=\"Content-Type\" content=\"text/html;charset=utf-8\"></head><body>\n")
01490         f.write("<p>Generated: " + time.strftime("%Y.%m.%d %H:%M:%S %z", time.gmtime(time.time())) + "</p>\n")
01491         f.write("<ul>\n")
01492         for e in self.excuses:
01493             f.write("<li>%s" % e.html())
01494         f.write("</ul></body></html>\n")
01495         f.close()
01496 
01497         self.__log("Update Excuses generation completed", type="I")
01498 
01499     # Upgrade run
01500     # -----------
01501 
01502     def newlyuninst(self, nuold, nunew):
01503         """Return a nuninst statistic with only the new uninstallable packages
01504 
01505         This method subtracts the uninstallable packages listed in `nuold`
01506         from those in `nunew`, keeping only the newly-broken packages.
01507 
01508         It returns a dictionary with the architectures as keys and the list
01509         of uninstallable packages as values.
01510         """
01511         res = {}
01512         for arch in nuold:
01513             if arch not in nunew: continue
01514             res[arch] = [x for x in nunew[arch] if x not in nuold[arch]]
01515         return res
01516 
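The subtraction performed by newlyuninst() is a per-architecture set difference; a minimal sketch:

```python
def newly_uninstallable(nuold, nunew):
    """For each architecture present in both statistics, list the packages
    uninstallable in `nunew` but not in `nuold` -- i.e. the regressions."""
    return dict((arch, [p for p in nunew[arch] if p not in nuold[arch]])
                for arch in nuold if arch in nunew)
```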
01517     def get_nuninst(self, requested_arch=None, build=False):
01518         """Return the uninstallability statistic for all the architectures
01519 
01520         To calculate the uninstallability counters, the method checks the
01521         installability of all the packages for all the architectures,
01522         tracking dependencies in a recursive way. Architecture-independent
01523         packages are checked only for the `nobreakall`
01524         architectures.
01525 
01526         It returns a dictionary with the architectures as keys and the list
01527         of uninstallable packages as values.
01528         """
01529         # if we are not asked to build the nuninst, read it from the cache
01530         if not build:
01531             return self.read_nuninst()
01532 
01533         nuninst = {}
01534 
01535         # local copies for better performances
01536         binaries = self.binaries['testing']
01537         check_installable = self.check_installable
01538 
01539         # when a new uninstallable package is discovered, check again all the
01540         # reverse dependencies and if they are uninstallable, too, call itself
01541         # recursively
01542         def add_nuninst(pkg, arch):
01543             if pkg not in nuninst[arch]:
01544                 nuninst[arch].append(pkg)
01545                 for p in binaries[arch][0][pkg][RDEPENDS]:
01546                     r = check_installable(p, arch, 'testing', excluded=nuninst[arch], conflicts=True)
01547                     if not r:
01548                         add_nuninst(p, arch)
01549 
01550         # for all the architectures
01551         for arch in self.options.architectures:
01552             if requested_arch and arch != requested_arch: continue
01553             # if it is in the nobreakall ones, check architecture-independent packages too
01554             if arch not in self.options.nobreakall_arches:
01555                 skip_archall = True
01556             else: skip_archall = False
01557 
01558             # check all the packages for this architecture, calling add_nuninst if a new
01559             # uninstallable package is found
01560             nuninst[arch] = []
01561             for pkg_name in binaries[arch][0]:
01562                 r = check_installable(pkg_name, arch, 'testing', excluded=nuninst[arch], conflicts=True)
01563                 if not r:
01564                     add_nuninst(pkg_name, arch)
01565 
01566             # if they are not required, remove architecture-independent packages
01567             nuninst[arch + "+all"] = nuninst[arch][:]
01568             if skip_archall:
01569                 for pkg in nuninst[arch + "+all"]:
01570                     bpkg = binaries[arch][0][pkg]
01571                     if bpkg[ARCHITECTURE] == 'all':
01572                         nuninst[arch].remove(pkg)
01573 
01574         # return the dictionary with the results
01575         return nuninst
01576 
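The add_nuninst closure above computes a transitive closure over reverse dependencies: once a package is found uninstallable, every reverse dependency that thereby becomes uninstallable is added too. A sketch with an explicit rdepends map and a predicate standing in for check_installable() (names here are illustrative, not Britney's API):

```python
def collect_uninstallable(pkgs, rdepends, installable):
    """Return the packages which are (transitively) uninstallable.

    `installable(pkg, excluded)` stands in for check_installable(): it
    reports whether `pkg` installs when the `excluded` packages are
    unavailable."""
    nuninst = []
    def add(pkg):
        if pkg not in nuninst:
            nuninst.append(pkg)
            for r in rdepends.get(pkg, []):
                if not installable(r, nuninst):
                    add(r)
    for pkg in pkgs:
        if not installable(pkg, nuninst):
            add(pkg)
    return nuninst
```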
01577     def eval_nuninst(self, nuninst, original=None):
01578         """Return a string which represents the uninstallability counters
01579 
01580         This method returns a string which represents the uninstallability
01581         counters reading the uninstallability statistics `nuninst` and, if
01582         present, merging the results with the `original` one.
01583 
01584         An example of the output string is:
01585         1+2: i-0:a-0:a-0:h-0:i-1:m-0:m-0:p-0:a-0:m-0:s-2:s-0
01586 
01587         where the first part is the number of broken packages in non-break
01588         architectures + the total number of broken packages for all the
01589         architectures.
01590         """
01591         res = []
01592         total = 0
01593         totalbreak = 0
01594         for arch in self.options.architectures:
01595             if arch in nuninst:
01596                 n = len(nuninst[arch])
01597             elif original and arch in original:
01598                 n = len(original[arch])
01599             else: continue
01600             if arch in self.options.break_arches:
01601                 totalbreak = totalbreak + n
01602             else:
01603                 total = total + n
01604             res.append("%s-%d" % (arch[0], n))
01605         return "%d+%d: %s" % (total, totalbreak, ":".join(res))
01606 
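The counter string built by eval_nuninst() can be reproduced with plain dictionaries. This sketch is slightly simplified: architectures missing from the statistic are counted as zero rather than skipped, and there is no `original` fallback:

```python
def format_counters(nuninst, architectures, break_arches=()):
    """Render '<total>+<break-total>: <initial>-<count>:...' in the style
    of eval_nuninst()."""
    total = totalbreak = 0
    parts = []
    for arch in architectures:
        n = len(nuninst.get(arch, []))
        if arch in break_arches:
            totalbreak += n
        else:
            total += n
        parts.append("%s-%d" % (arch[0], n))
    return "%d+%d: %s" % (total, totalbreak, ":".join(parts))
```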
01607     def eval_uninst(self, nuninst):
01608         """Return a string which represents the uninstallable packages
01609 
01610         This method returns a string which represents the uninstallable
01611         packages reading the uninstallability statistics `nuninst`.
01612 
01613         An example of the output string is:
01614             * i386: broken-pkg1, broken-pkg2
01615         """
01616         parts = []
01617         for arch in self.options.architectures:
01618             if arch in nuninst and len(nuninst[arch]) > 0:
01619                 parts.append("    * %s: %s\n" % (arch,", ".join(sorted(nuninst[arch]))))
01620         return "".join(parts)
01621 
01622     def is_nuninst_asgood_generous(self, old, new):
01623         diff = 0
01624         for arch in self.options.architectures:
01625             if arch in self.options.break_arches: continue
01626             diff = diff + (len(new[arch]) - len(old[arch]))
01627         return diff <= 0
01628 
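The acceptance test above is "generous": a change is acceptable when the total uninstallability count over the non-break architectures does not grow, even if a single architecture gets worse while others improve. A self-contained sketch:

```python
def asgood(old, new, architectures, break_arches=()):
    """True if `new` has no more uninstallable packages overall than `old`,
    ignoring the break architectures."""
    diff = sum(len(new[a]) - len(old[a])
               for a in architectures if a not in break_arches)
    return diff <= 0
```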
01629     def check_installable(self, pkg, arch, suite, excluded=[], conflicts=False):
01630         """Check if a package is installable
01631 
01632         This method analyzes the dependencies of the binary package specified
01633         by the parameter `pkg' for the architecture `arch' within the suite
01634         `suite'. If the dependencies can be satisfied in the given `suite` and
01635         the `conflicts` parameter is True, then the co-installability with
01636         conflicts handling is checked as well.
01637 
01638         The dependency fields checked are Pre-Depends and Depends.
01639 
01640         The method returns a boolean which is True if the given package is
01641         installable.
01642         """
01643         # retrieve the binary package from the specified suite and arch
01644         binary_u = self.binaries[suite][arch][0][pkg]
01645 
01646         # local copies for better performances
01647         parse_depends = apt_pkg.ParseDepends
01648         get_dependency_solvers = self.get_dependency_solvers
01649 
01650         # analyze the dependency fields (if present)
01651         for type in (PREDEPENDS, DEPENDS):
01652             if not binary_u[type]:
01653                 continue
01654 
01655             # for every block of dependencies (the field is a conjunction of disjunctions)
01656             for block in parse_depends(binary_u[type]):
01657                 # if the block is not satisfied, return False
01658                 solved, packages = get_dependency_solvers(block, arch, 'testing', excluded, strict=True)
01659                 if not solved:
01660                     return False
01661 
01662         # otherwise, the package is installable (not considering conflicts)
01663         # if the conflicts handling is enabled, then check conflicts before
01664         # saying that the package is really installable
01665         if conflicts:
01666             return self.check_conflicts(pkg, arch, excluded, {}, {})
01667 
01668         return True
01669 
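In essence, check_installable() ANDs the dependency blocks and ORs the alternatives within each block. A sketch over pre-parsed dependencies (each block is a list of alternative package names, with a set standing in for the archive):

```python
def installable(blocks, available, excluded=()):
    """A package installs if every dependency block (AND) has at least one
    alternative (OR) among the available, non-excluded packages."""
    return all(any(alt in available and alt not in excluded for alt in block)
               for block in blocks)
```

The real method additionally hands the result to check_conflicts() when conflicts handling is requested.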
01670     def check_conflicts(self, pkg, arch, broken, system, conflicts):
01671         """Check if a package can be installed satisfying the conflicts
01672 
01673         This method checks if the `pkg` package from the `arch` architecture
01674         can be installed (excluding `broken` packages) within the system
01675         `system` along with all its dependencies. This means that all the
01676         conflicts relationships are checked in order to test the
01677         co-installability of the package.
01678 
01679         The method returns a boolean which is True if the given package is
01680         co-installable in the given system.
01681         """
01682 
01683         # local copies for better performances
01684         binaries = self.binaries['testing'][arch]
01685         parse_depends = apt_pkg.ParseDepends
01686         check_depends = apt_pkg.CheckDep
01687 
01688         # unregister conflicts, local method to remove conflicts
01689         # registered from a given package.
01690         def unregister_conflicts(pkg, conflicts):
01691             for c in conflicts.keys():
01692                 i = 0
01693                 while i < len(conflicts[c]):
01694                     if conflicts[c][i][3] == pkg:
01695                         del conflicts[c][i]
01696                     else: i = i + 1
01697                 if len(conflicts[c]) == 0:
01698                     del conflicts[c]
01699 
01700         def remove_package(pkg, system, conflicts):
01701             for k in system:
01702                 if pkg in system[k][1]:
01703                     system[k][1].remove(pkg)
01704             unregister_conflicts(pkg, conflicts)
01705 
01706         # handle a conflict, local method to solve a conflict which happened
01707         # in the system; the behaviour of the conflict-solver is:
01708         #   1. If there are alternatives for the package which must be removed,
01709         #      try them, and if one of them resolves the system return True;
01710         #   2. If none of the alternatives can solve the conflict, then call
01711         #      itself for the package which depends on the conflicting package.
01712         #   3. If the top of the dependency tree is reached, then the conflict
01713         #      can't be solved, so return False.
01714         def handle_conflict(pkg, source, system, conflicts):
01715             # skip packages which don't have reverse dependencies
01716             if source not in system or system[source][1] == []:
01717                 remove_package(source, system, conflicts)
01718                 return (system, conflicts)
01719             # reached the top of the tree
01720             if not system[source][1][0]:
01721                 return False
01722             # remove its conflicts
01723             unregister_conflicts(source, conflicts)
01724             # if there are alternatives, try them
01725             alternatives = system[source][0]
01726             for alt in alternatives:
01727                 if satisfy(alt, [x for x in alternatives if x != alt], pkg_from=system[source][1],
01728                         system=system, conflicts=conflicts, excluded=[source]):
01729                     remove_package(source, system, conflicts)
01730                     return (system, conflicts)
01731             # there are no good alternatives, so remove the package which depends on it
01732             for p in system[source][1]:
01733                 # the package does not exist, we reached the top of the tree
01734                 if not p: return False
01735                 # we are providing the package we conflict on (eg. exim4 and mail-transfer-agent), skip it
01736                 if p == pkg: continue
01737                 output = handle_conflict(pkg, p, system, conflicts)
01738                 if output:
01739                     system, conflicts = output
01740                 else: return False
01741             remove_package(source, system, conflicts)
01742             return (system, conflicts)
01743 
01744         # dependency tree satisfier, local method which tries to satisfy the dependency
01745         # tree for a given package. It calls itself recursively in order to check the
01746         # co-installability of the full tree of dependency of the starting package.
01747         # If a conflict is detected, it tries to handle it calling the handle_conflict
01748         # method; if it can't be resolved, then it returns False.
01749         def satisfy(pkg, pkg_alt=None, pkg_from=None, system=system, conflicts=conflicts, excluded=[]):
01750             # if it is a real package and it is already installed, skip it and return True
01751             if pkg in binaries[0]:
01752                 if pkg in system:
01753                     if type(pkg_from) == list:
01754                         system[pkg][1].extend(pkg_from)
01755                     else:
01756                         system[pkg][1].append(pkg_from)
01757                     system[pkg] = (system[pkg][1], filter(lambda x: x in pkg_alt, system[pkg][0]))
01758                     return True
01759                 binary_u = binaries[0][pkg]
01760             else: binary_u = None
01761 
01762             # if it is a virtual package
01763             providers = []
01764             if pkg_from and pkg in binaries[1]:
01765                 providers = binaries[1][pkg]
01766                 # it is both real and virtual, so the providers are alternatives
01767                 if binary_u:
01768                     providers = filter(lambda x: (not pkg_alt or x not in pkg_alt) and x != pkg, providers)
01769                     if not pkg_alt:
01770                         pkg_alt = []
01771                     pkg_alt.extend(providers)
01772                 # try all the alternatives and if none of them suits, give up and return False
01773                 else:
01774                     # if we already have a provider in the system, everything is ok and return True
01775                     if len(filter(lambda x: x in providers and x not in excluded, system)) > 0:
01776                         return True
01777                     for p in providers:
01778                         # try to install the providers skipping excluded packages,
01779                         # which we already tried but do not work
01780                         if p in excluded: continue
01781                         elif satisfy(p, [a for a in providers if a != p], pkg_from):
01782                             return True
01783                     # if none of them suits, return False
01784                     return False
01785 
01786             # if the package doesn't exist, return False
01787             if not binary_u: return False
01788 
01789             # it is broken, but we have providers
01790             if pkg in broken and pkg_from:
01791                 for p in providers:
01792                     # try to install the providers skipping excluded packages,
01793                     # which we already tried but do not work
01794                     if p in excluded: continue
01795                     elif satisfy(p, [a for a in providers if a != p], pkg_from):
01796                         return True
01797                 return False
01798 
01799             # install the package into the system, recording which package required it
01800             if type(pkg_from) != list:
01801                 pkg_from = [pkg_from]
01802             system[pkg] = (pkg_alt or [], pkg_from)
01803 
01804             # register provided packages
01805             if binary_u[PROVIDES]:
01806                 for p in binary_u[PROVIDES]:
01807                     if p in system:
01808                         # do not consider packages providing the one which we are checking
01809                         if len(system[p][1]) == 1 and system[p][1][0] == None: continue
01810                         system[p][1].append(pkg)
01811                     else:
01812                         system[p] = ([], [pkg])
01813 
01814             # check the conflicts
01815             if pkg in conflicts:
01816                 for name, version, op, conflicting in conflicts[pkg]:
01817                     if conflicting in binary_u[PROVIDES] and system[conflicting][1] == [pkg]: continue
01818                     if op == '' and version == '' or check_depends(binary_u[VERSION], op, version):
01819                         # if conflict is found, check if it can be solved removing
01820                         # already-installed packages without breaking the system; if
01821                         # this is not possible, give up and return False
01822                         output = handle_conflict(pkg, conflicting, system.copy(), conflicts.copy())
01823                         if output:
01824                             system, conflicts = output
01825                         else:
01826                             del system[pkg]
01827                             return False
01828 
01829             # register conflicts from the just-installed package
01830             if binary_u[CONFLICTS]:
01831                 for block in map(operator.itemgetter(0), parse_depends(binary_u[CONFLICTS] or [])):
01832                     name, version, op = block
01833                     # skip conflicts for packages provided by itself
01834                     # if the conflicting package is in the system (and it is not a self-conflict)
01835                     if not (name in binary_u[PROVIDES] and system[name][1] == [pkg]) and \
01836                        block[0] != pkg and block[0] in system:
01837                         if block[0] in binaries[0]:
01838                             binary_c = binaries[0][block[0]]
01839                         else: binary_c = None
01840                         if op == '' and version == '' or binary_c and check_depends(binary_c[VERSION], op, version):
01841                             # if conflict is found, check if it can be solved removing
01842                             # already-installed packages without breaking the system; if
01843                             # this is not possible, give up and return False
01844                             output = handle_conflict(pkg, name, system.copy(), conflicts.copy())
01845                             if output:
01846                                 system, conflicts = output
01847                             else:
01848                                 del system[pkg]
01849                                 unregister_conflicts(pkg, conflicts)
01850                                 return False
01851                     # register the conflict
01852                     if block[0] not in conflicts:
01853                         conflicts[block[0]] = []
01854                     conflicts[block[0]].append((name, version, op, pkg))
01855 
01856             # list all its dependencies ...
01857             dependencies = []
01858             for key in (PREDEPENDS, DEPENDS):
01859                 if not binary_u[key]: continue
01860                 dependencies.extend(parse_depends(binary_u[key]))
01861 
01862             # ... and go through them
01863             for block in dependencies:
01864                 # list the possible alternatives, in case of a conflict
01865                 alternatives = map(operator.itemgetter(0), block)
01866                 valid = False
01867                 for name, version, op in block:
01868                     # if the package is already installed or installable, the block is satisfied
01869                     if name in system or satisfy(name, [a for a in alternatives if a != name], pkg):
01870                         valid = True
01871                         break
01872                 # if the block can't be satisfied, the package is not installable so
01873                 # we need to remove it, its conflicts and its provided packages and
01874                 # return False
01875                 if not valid:
01876                     del system[pkg]
01877                     unregister_conflicts(pkg, conflicts)
01878                     for p in providers:
01879                         if satisfy(p, [a for a in providers if a != p], pkg_from):
01880                             return True
01881                     return False
01882 
01883             # if all the blocks have been satisfied, the package is installable
01884             return True
01885 
01886         # check the package at the top of the tree
01887         return satisfy(pkg)
01888 
01889     def doop_source(self, pkg):
01890         """Apply a change to the testing distribution as requested by `pkg`
01891 
01892         This method applies the changes required by the action `pkg`, tracking
01893         them so that they can be reverted later.
01894 
01895         The method returns a tuple of the package name, the suite where the
01896         package comes from, the list of packages affected by the change and
01897         the undo dictionary which can be used to roll back the changes.
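
        The action-string encodings handled in the body below can be
        sketched as a standalone helper (illustrative only; the
        hypothetical `classify_action` is not part of Britney):

```python
def classify_action(pkg):
    # decode an action string the same way doop_source does;
    # returns (package name, architecture or None, source suite)
    arch = None
    if pkg.startswith("-") and "/" in pkg:
        # removal of a binary package on one architecture
        name, arch = pkg[1:].split("/")
        if arch.endswith("_tpu"):
            arch, suite = arch.split("_")
        else:
            suite = "testing"
    elif "/" in pkg:
        # update restricted to a single architecture
        name, arch = pkg.split("/")
        suite = "unstable"
    elif pkg.startswith("-"):
        # removal of a source package from testing
        name = pkg[1:]
        suite = "testing"
    elif pkg.endswith("_tpu"):
        # migration from testing-proposed-updates
        name = pkg[:-4]
        suite = "tpu"
    else:
        # normal migration from unstable
        name = pkg
        suite = "unstable"
    return (name, arch, suite)
```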
01898         """
01899         undo = {'binaries': {}, 'sources': {}, 'virtual': {}, 'nvirtual': []}
01900 
01901         affected = []
01902         arch = None
01903 
01904         # local copies for better performance
01905         sources = self.sources
01906         binaries = self.binaries['testing']
01907 
01908         # removal of single-arch binary package = "-<package>/<arch>"
01909         if pkg[0] == "-" and "/" in pkg:
01910             pkg_name, arch = pkg.split("/")
01911             pkg_name = pkg_name[1:]
01912             if arch.endswith("_tpu"):
01913                 arch, suite = arch.split("_")
01914             else: suite = "testing"
01915         # update of a binary package on a single architecture = "<source>/<arch>",
01916         elif "/" in pkg:
01917             pkg_name, arch = pkg.split("/")
01918             suite = "unstable"
01919         # removal of source packages = "-<source>",
01920         elif pkg[0] == "-":
01921             pkg_name = pkg[1:]
01922             suite = "testing"
01923         # testing-proposed-updates = "<source>_tpu"
01924         elif pkg.endswith("_tpu"):
01925             pkg_name = pkg[:-4]
01926             suite = "tpu"
01927         # normal update of source packages = "<source>"
01928         else:
01929             pkg_name = pkg
01930             suite = "unstable"
01931 
01932         # remove all binary packages (if the source already exists)
01933         if not (arch and pkg[0] == '-'):
01934             if pkg_name in sources['testing']:
01935                 source = sources['testing'][pkg_name]
01936                 # remove all the binaries
01937                 for p in source[BINARIES]:
01938                     binary, parch = p.split("/")
01939                     if arch and parch != arch: continue
01940                     # if a smooth update is possible for the package, skip it
01941                     if not self.options.compatible and suite == 'unstable' and \
01942                        binary not in self.binaries[suite][parch][0] and \
01943                        ('ALL' in self.options.smooth_updates or \
01944                         binaries[parch][0][binary][SECTION] in self.options.smooth_updates):
01945                         continue
01946                     # save the old binary for undo
01947                     undo['binaries'][p] = binaries[parch][0][binary]
01948                     # all the reverse dependencies are affected by the change
01949                     for j in binaries[parch][0][binary][RDEPENDS]:
01950                         key = (j, parch)
01951                         if key not in affected: affected.append(key)
01952                     # remove the provided virtual packages
01953                     for j in binaries[parch][0][binary][PROVIDES]:
01954                         key = j + "/" + parch
01955                         if key not in undo['virtual']:
01956                             undo['virtual'][key] = binaries[parch][1][j][:]
01957                         binaries[parch][1][j].remove(binary)
01958                         if len(binaries[parch][1][j]) == 0:
01959                             del binaries[parch][1][j]
01960                     # finally, remove the binary package
01961                     del binaries[parch][0][binary]
01962                 # remove the source package
01963                 if not arch:
01964                     undo['sources'][pkg_name] = source
01965                     del sources['testing'][pkg_name]
01966             else:
01967                 # the package didn't exist, so we mark it as to-be-removed in case of undo
01968                 undo['sources']['-' + pkg_name] = True
01969 
01970         # single binary removal
01971         elif pkg_name in binaries[arch][0]:
01972             undo['binaries'][pkg_name + "/" + arch] = binaries[arch][0][pkg_name]
01973             for j in binaries[arch][0][pkg_name][RDEPENDS]:
01974                 key = (j, arch)
01975                 if key not in affected: affected.append(key)
01976             del binaries[arch][0][pkg_name]
01977 
01978         # add the new binary packages (if we are not removing)
01979         if pkg[0] != "-":
01980             source = sources[suite][pkg_name]
01981             for p in source[BINARIES]:
01982                 binary, parch = p.split("/")
01983                 if arch and parch != arch: continue
01984                 key = (binary, parch)
01985                 # obviously, added/modified packages are affected
01986                 if key not in affected: affected.append(key)
01987                 # if the binary already exists (built from another source)
01988                 if binary in binaries[parch][0]:
01989                     # save the old binary package
01990                     undo['binaries'][p] = binaries[parch][0][binary]
01991                     # all the reverse dependencies are affected by the change
01992                     for j in binaries[parch][0][binary][RDEPENDS]:
01993                         key = (j, parch)
01994                         if key not in affected: affected.append(key)
01995                     # all the reverse conflicts and their dependency tree are affected by the change
01996                     for j in binaries[parch][0][binary][RCONFLICTS]:
01997                         key = (j, parch)
01998                         if key not in affected: affected.append(key)
01999                         for p in self.get_full_tree(j, parch, 'testing'):
02000                             key = (p, parch)
02001                             if key not in affected: affected.append(key)
02002                 # add/update the binary package
02003                 binaries[parch][0][binary] = self.binaries[suite][parch][0][binary]
02004                 # register new provided packages
02005                 for j in binaries[parch][0][binary][PROVIDES]:
02006                     key = j + "/" + parch
02007                     if j not in binaries[parch][1]:
02008                         undo['nvirtual'].append(key)
02009                         binaries[parch][1][j] = []
02010                     elif key not in undo['virtual']:
02011                         undo['virtual'][key] = binaries[parch][1][j][:]
02012                     binaries[parch][1][j].append(binary)
02013                 # all the reverse dependencies are affected by the change
02014                 for j in binaries[parch][0][binary][RDEPENDS]:
02015                     key = (j, parch)
02016                     if key not in affected: affected.append(key)
02017 
02018             # register reverse dependencies and conflicts for the new binary packages
02019             for p in source[BINARIES]:
02020                 binary, parch = p.split("/")
02021                 if arch and parch != arch: continue
02022                 self.register_reverses(binary, binaries[parch][0], binaries[parch][1])
02023 
02024             # add/update the source package
02025             if not arch:
02026                 sources['testing'][pkg_name] = sources[suite][pkg_name]
02027 
02028         # return the package name, the suite, the list of affected packages and the undo dictionary
02029         return (pkg_name, suite, affected, undo)
02030 
02031     def get_full_tree(self, pkg, arch, suite):
02032         """Calculate the full dependency tree for the given package
02033 
02034         This method returns the full reverse dependency tree for the package
02035         `pkg`, within the architecture `arch` of the suite `suite`.
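
        The traversal is a breadth-first expansion over reverse
        dependencies; a toy sketch with a hypothetical `rdepends`
        mapping (not Britney's internal data layout):

```python
def full_tree(pkg, rdepends):
    # rdepends maps a package name to the list of packages that
    # depend on it; expand until a fixed point is reached
    packages = [pkg]
    last_len = 0
    while len(packages) > last_len:
        last_len = len(packages)
        for p in packages[:]:
            for r in rdepends.get(p, []):
                if r not in packages:
                    packages.append(r)
    return packages
```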
02036         """
02037         packages = [pkg]
02038         binaries = self.binaries[suite][arch][0]
02039         l = n = 0
02040         while len(packages) > l:
02041             l = len(packages)
02042             for p in packages[n:]:
02043                 packages.extend([x for x in binaries[p][RDEPENDS] if x not in packages and x in binaries])
02044             n = l
02045         return packages
02046 
02047     def iter_packages(self, packages, selected, hint=False, nuninst=None):
02048         """Iterate over the list of actions and apply them one by one
02049 
02050         This method applies the changes from `packages` to testing, checking the uninstallability
02051         counters after every action. If an action does not improve them, it is reverted.
02052         The method returns the new uninstallability counters and the remaining actions if the
02053         final result is successful, otherwise (None, None).
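
        The accept-or-revert policy can be sketched with a toy broken
        counter; `apply_one`, `undo_one` and `count_broken` are
        hypothetical stand-ins for doop_source, the undo code and the
        installability checks:

```python
def greedy_apply(actions, apply_one, undo_one, count_broken):
    # keep an action only if it does not increase the number of
    # broken packages; otherwise roll it back immediately
    selected = []
    best = count_broken()
    for action in actions:
        state = apply_one(action)
        now = count_broken()
        if now <= best:
            selected.append(action)
            best = now
        else:
            undo_one(state)
    return selected
```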
02054         """
02055         extra = []
02056         deferred = []
02057         skipped = []
02058         mark_passed = False
02059         position = len(packages)
02060 
02061         if nuninst:
02062             nuninst_comp = nuninst.copy()
02063         else:
02064             nuninst_comp = self.nuninst_orig.copy()
02065 
02066         # local copies for better performance
02067         check_installable = self.check_installable
02068         binaries = self.binaries['testing']
02069         sources = self.sources
02070         architectures = self.options.architectures
02071         nobreakall_arches = self.options.nobreakall_arches
02072         new_arches = self.options.new_arches
02073         break_arches = self.options.break_arches
02074         dependencies = self.dependencies
02075         compatible = self.options.compatible
02076 
02077         # pre-process a hint batch
02078         pre_process = {}
02079         if selected and hint:
02080             for pkg in selected:
02081                 pkg_name, suite, affected, undo = self.doop_source(pkg)
02082                 pre_process[pkg] = (pkg_name, suite, affected, undo)
02083 
02084         lundo = []
02085         if not hint:
02086             self.output_write("recur: [%s] %s %d/%d\n" % ("", ",".join(selected), len(packages), len(extra)))
02087 
02088         # loop on the packages (or better, actions)
02089         while packages:
02090             pkg = packages.pop(0)
02091 
02092             # end of the first pass over the original list: queue the deferred packages
02093             if not compatible and not mark_passed and position < 0:
02094                 mark_passed = True
02095                 packages.extend(deferred)
02096                 del deferred
02097             else: position -= 1
02098 
02099             # defer packages if one of their dependencies has already been skipped
02100             if not compatible and not mark_passed:
02101                 defer = False
02102                 for p in dependencies.get(pkg, []):
02103                     if p in skipped:
02104                         deferred.append(pkg)
02105                         skipped.append(pkg)
02106                         defer = True
02107                         break
02108                 if defer: continue
02109 
02110             if not hint:
02111                 self.output_write("trying: %s\n" % (pkg))
02112 
02113             better = True
02114             nuninst = {}
02115 
02116             # apply the changes
02117             if pkg in pre_process:
02118                 pkg_name, suite, affected, undo = pre_process[pkg]
02119             else:
02120                 pkg_name, suite, affected, undo = self.doop_source(pkg)
02121             if hint:
02122                 lundo.append((undo, pkg, pkg_name, suite))
02123 
02124             # check the affected packages on all the architectures
02125             for arch in ("/" in pkg and (pkg.split("/")[1].split("_")[0],) or architectures):
02126                 if arch not in nobreakall_arches:
02127                     skip_archall = True
02128                 else: skip_archall = False
02129 
02130                 nuninst[arch] = [x for x in nuninst_comp[arch] if x in binaries[arch][0]]
02131                 nuninst[arch + "+all"] = [x for x in nuninst_comp[arch + "+all"] if x in binaries[arch][0]]
02132                 broken = nuninst[arch + "+all"]
02133                 to_check = [x[0] for x in affected if x[1] == arch]
02134                 # broken packages (first round)
02135                 repaired = []
02136                 broken_changed = True
02137                 last_broken = None
02138                 while broken_changed:
02139                     broken_changed = False
02140                     for p in to_check:
02141                         if p == last_broken: break
02142                         if p not in binaries[arch][0]: continue
02143                         r = check_installable(p, arch, 'testing', excluded=broken, conflicts=True)
02144                         if not r and p not in broken:
02145                             last_broken = p
02146                             broken.append(p)
02147                             broken_changed = True
02148                             if not (skip_archall and binaries[arch][0][p][ARCHITECTURE] == 'all'):
02149                                 nuninst[arch].append(p)
02150                         elif r and p in broken:
02151                             last_broken = p
02152                             repaired.append(p)
02153                             broken.remove(p)
02154                             broken_changed = True
02155                             if not (skip_archall and binaries[arch][0][p][ARCHITECTURE] == 'all'):
02156                                 nuninst[arch].remove(p)
02157 
02158                 # broken packages (second round, reverse dependencies of the first round)
02159                 l = 0
02160                 broken_changed = True
02161                 last_broken = None
02162                 while broken_changed:
02163                     broken_changed = False
02164                     for j in broken + repaired:
02165                         if j not in binaries[arch][0]: continue
02166                         for p in binaries[arch][0][j][RDEPENDS]:
02167                             if p in broken or p not in binaries[arch][0]: continue
02168                             r = check_installable(p, arch, 'testing', excluded=broken, conflicts=True)
02169                             if not r and p not in broken:
02170                                 l = -1
02171                                 last_broken = j
02172                                 broken.append(p)
02173                                 broken_changed = True
02174                                 if not (skip_archall and binaries[arch][0][p][ARCHITECTURE] == 'all'):
02175                                     nuninst[arch].append(p)
02176                             elif r and p in nuninst[arch + "+all"]:
02177                                 last_broken = p
02178                                 repaired.append(p)
02179                                 broken.remove(p)
02180                                 broken_changed = True
02181                                 if not (skip_archall and binaries[arch][0][p][ARCHITECTURE] == 'all'):
02182                                     nuninst[arch].remove(p)
02183                     if l != -1 and last_broken == j: break
02184 
02185                 # if we are processing hints, go ahead
02186                 if hint:
02187                     nuninst_comp[arch] = nuninst[arch]
02188                     nuninst_comp[arch + "+all"] = nuninst[arch + "+all"]
02189                     continue
02190 
02191                 # if the uninstallability counter is worse than before, break the loop
02192                 if (("/" in pkg and arch not in new_arches) or \
02193                     (arch not in break_arches)) and len(nuninst[arch]) > len(nuninst_comp[arch]):
02194                     better = False
02195                     break
02196 
02197             # if we are processing hints or the package is already accepted, go ahead
02198             if hint or pkg in selected: continue
02199 
02200             # check if the action improved the uninstallability counters
02201             if better:
02202                 lundo.append((undo, pkg, pkg_name, suite))
02203                 selected.append(pkg)
02204                 packages.extend(extra)
02205                 extra = []
02206                 self.output_write("accepted: %s\n" % (pkg))
02207                 self.output_write("   ori: %s\n" % (self.eval_nuninst(self.nuninst_orig)))
02208                 self.output_write("   pre: %s\n" % (self.eval_nuninst(nuninst_comp)))
02209                 self.output_write("   now: %s\n" % (self.eval_nuninst(nuninst, nuninst_comp)))
02210                 if len(selected) <= 20:
02211                     self.output_write("   all: %s\n" % (" ".join(selected)))
02212                 else:
02213                     self.output_write("  most: (%d) .. %s\n" % (len(selected), " ".join(selected[-20:])))
02214                 for k in nuninst:
02215                     nuninst_comp[k] = nuninst[k]
02216             else:
02217                 self.output_write("skipped: %s (%d <- %d)\n" % (pkg, len(extra), len(packages)))
02218                 self.output_write("    got: %s\n" % (self.eval_nuninst(nuninst, "/" in pkg and nuninst_comp or None)))
02219                 self.output_write("    * %s: %s\n" % (arch, ", ".join(sorted([b for b in nuninst[arch] if b not in nuninst_comp[arch]]))))
02220 
02221                 extra.append(pkg)
02222                 if not mark_passed:
02223                     skipped.append(pkg)
02224 
02225                 # undo the changes (source)
02226                 for k in undo['sources'].keys():
02227                     if k[0] == '-':
02228                         del sources['testing'][k[1:]]
02229                     else: sources['testing'][k] = undo['sources'][k]
02230 
02231                 # undo the changes (new binaries)
02232                 if pkg[0] != '-' and pkg_name in sources[suite]:
02233                     for p in sources[suite][pkg_name][BINARIES]:
02234                         binary, arch = p.split("/")
02235                         if "/" in pkg and arch != pkg[pkg.find("/")+1:]: continue
02236                         del binaries[arch][0][binary]
02237 
02238                 # undo the changes (binaries)
02239                 for p in undo['binaries'].keys():
02240                     binary, arch = p.split("/")
02241                     if binary[0] == "-":
02242                         del binaries[arch][0][binary[1:]]
02243                     else: binaries[arch][0][binary] = undo['binaries'][p]
02244 
02245                 # undo the changes (virtual packages)
02246                 for p in undo['nvirtual']:
02247                     j, arch = p.split("/")
02248                     del binaries[arch][1][j]
02249                 for p in undo['virtual']:
02250                     j, arch = p.split("/")
02251                     if j[0] == '-':
02252                         del binaries[arch][1][j[1:]]
02253                     else: binaries[arch][1][j] = undo['virtual'][p]
02254 
02255         # if we are processing hints, return now
02256         if hint:
02257             return (nuninst_comp, [], lundo)
02258 
02259         self.output_write(" finish: [%s]\n" % ",".join(selected))
02260         self.output_write("endloop: %s\n" % (self.eval_nuninst(self.nuninst_orig)))
02261         self.output_write("    now: %s\n" % (self.eval_nuninst(nuninst_comp)))
02262         self.output_write(self.eval_uninst(self.newlyuninst(self.nuninst_orig, nuninst_comp)))
02263         self.output_write("\n")
02264 
02265         return (nuninst_comp, extra, lundo)
02266 
02267     def do_all(self, maxdepth=0, init=None, actions=None):
02268         """Testing update runner
02269 
02270         This method tries to update testing, checking the uninstallability
02271         counters before and after the actions to decide whether the update
02272         was successful.
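
        The success criterion is a per-architecture comparison of the
        uninstallability counts; an illustrative stand-in for the real
        is_nuninst_asgood_generous check:

```python
def nuninst_is_worse(before, after):
    # before/after map an architecture to the list of packages
    # that are uninstallable on it; the update fails when any
    # architecture ends up with more uninstallable packages
    for arch in before:
        if len(after.get(arch, [])) > len(before[arch]):
            return True
    return False
```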
02273         """
02274         selected = []
02275         if actions:
02276             upgrade_me = actions[:]
02277         else:
02278             upgrade_me = self.upgrade_me[:]
02279         nuninst_start = self.nuninst_orig
02280 
02281         # these are special parameters for hints processing
02282         undo = False
02283         force = False
02284         earlyabort = False
02285         if maxdepth == "easy" or maxdepth < 0:
02286             force = maxdepth < 0
02287             earlyabort = True
02288             maxdepth = 0
02289 
02290         # if we have a list of initial packages, check them
02291         if init:
02292             self.output_write("leading: %s\n" % (",".join(init)))
02293             for x in init:
02294                 if x not in upgrade_me:
02295                     self.output_write("failed: %s\n" % (x))
02296                     return None
02297                 selected.append(x)
02298                 upgrade_me.remove(x)
02299         
02300         self.output_write("start: %s\n" % self.eval_nuninst(nuninst_start))
02301         self.output_write("orig: %s\n" % self.eval_nuninst(nuninst_start))
02302 
02303         if earlyabort:
02304             extra = upgrade_me[:]
02305             (nuninst_end, extra, lundo) = self.iter_packages(init, selected, hint=True)
02306             undo = True
02307             self.output_write("easy: %s\n" % (self.eval_nuninst(nuninst_end)))
02308             self.output_write(self.eval_uninst(self.newlyuninst(nuninst_start, nuninst_end)) + "\n")
02309             if not force and not self.is_nuninst_asgood_generous(self.nuninst_orig, nuninst_end):
02310                 nuninst_end, extra = None, None
02311         else:
02312             lundo = []
02313             if init:
02314                 (nuninst_end, extra, tundo) = self.iter_packages(init, selected, hint=True)
02315                 lundo.extend(tundo)
02316                 undo = True
02317             else: nuninst_end = None
02318             (nuninst_end, extra, tundo) = self.iter_packages(upgrade_me, selected, nuninst=nuninst_end)
02319             lundo.extend(tundo)
02320             if not self.is_nuninst_asgood_generous(self.nuninst_orig, nuninst_end):
02321                 nuninst_end, extra = None, None
02322 
02323         if nuninst_end:
02324             self.output_write("Apparently successful\n")
02325             self.output_write("final: %s\n" % ",".join(sorted(selected)))
02326             self.output_write("start: %s\n" % self.eval_nuninst(nuninst_start))
02327             self.output_write(" orig: %s\n" % self.eval_nuninst(self.nuninst_orig))
02328             self.output_write("  end: %s\n" % self.eval_nuninst(nuninst_end))
02329             if force:
02330                 self.output_write("force breaks:\n")
02331                 self.output_write(self.eval_uninst(self.newlyuninst(nuninst_start, nuninst_end)) + "\n")
02332             self.output_write("SUCCESS (%d/%d)\n" % (len(actions or self.upgrade_me), len(extra)))
02333             self.nuninst_orig = nuninst_end
02334             if not actions:
02335                 self.upgrade_me = sorted(extra)
02336                 if not self.options.compatible:
02337                     self.sort_actions()
02338         else:
02339             self.output_write("FAILED\n")
02340             if not undo: return
02341 
02342             # undo all the changes
02343             for (undo, pkg, pkg_name, suite) in lundo:
02344                 # undo the changes (source)
02345                 for k in undo['sources'].keys():
02346                     if k[0] == '-':
02347                         del self.sources['testing'][k[1:]]
02348                     else: self.sources['testing'][k] = undo['sources'][k]
02349 
02350                 # undo the changes (new binaries)
02351                 if pkg[0] != '-' and pkg_name in self.sources[suite]:
02352                     for p in self.sources[suite][pkg_name][BINARIES]:
02353                         binary, arch = p.split("/")
02354                         if "/" in pkg and arch != pkg[pkg.find("/")+1:]: continue
02355                         del self.binaries['testing'][arch][0][binary]
02356 
02357                 # undo the changes (binaries)
02358                 for p in undo['binaries'].keys():
02359                     binary, arch = p.split("/")
02360                     if binary[0] == "-":
02361                         del self.binaries['testing'][arch][0][binary[1:]]
02362                     else: self.binaries['testing'][arch][0][binary] = undo['binaries'][p]
02363 
02364                 # undo the changes (virtual packages)
02365                 for p in undo['nvirtual']:
02366                     j, arch = p.split("/")
02367                     del self.binaries['testing'][arch][1][j]
02368                 for p in undo['virtual']:
02369                     j, arch = p.split("/")
02370                     if j[0] == '-':
02371                         del self.binaries['testing'][arch][1][j[1:]]
02372                     else: self.binaries['testing'][arch][1][j] = undo['virtual'][p]
02373 
02374     def upgrade_testing(self):
02375         """Upgrade testing using the unstable packages
02376 
02377         This method tries to upgrade testing using the packages from unstable.
02378         Before running the do_all method, it tries the easy and force-hint
02379         commands.
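
        The separate runs for break arches amount to partitioning the
        action list by architecture suffix; a sketch with a
        hypothetical helper:

```python
def partition_break_arches(actions, break_arches):
    # split the action list into the normal run plus one list per
    # break architecture, matching on the "/<arch>" suffix
    normal = actions[:]
    per_arch = {}
    for a in break_arches:
        per_arch[a] = [p for p in normal if p.endswith("/" + a)]
        normal = [p for p in normal if not p.endswith("/" + a)]
    return (normal, per_arch)
```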
02380         """
02381 
02382         self.__log("Starting the upgrade test", type="I")
02383         self.__output = open(self.options.upgrade_output, 'w')
02384         self.output_write("Generated on: %s\n" % (time.strftime("%Y.%m.%d %H:%M:%S %z", time.gmtime(time.time()))))
02385         self.output_write("Arch order is: %s\n" % ", ".join(self.options.architectures))
02386 
02387         self.__log("> Calculating current uninstallability counters", type="I")
02388         self.nuninst_orig = self.get_nuninst()
02389 
02390         if not self.options.actions:
02391             # process `easy' hints
02392             for x in self.hints['easy']:
02393                 self.do_hint("easy", x[0], x[1])
02394 
02395             # process `force-hint' hints
02396             for x in self.hints["force-hint"]:
02397                 self.do_hint("force-hint", x[0], x[1])
02398 
02399         # run the first round of the upgrade
02400         self.__log("> First loop on the packages with depth = 0", type="I")
02401 
02402         # separate runs for break arches
02403         allpackages = []
02404         normpackages = self.upgrade_me[:]
02405         archpackages = {}
02406         for a in self.options.break_arches.split():
02407             archpackages[a] = [p for p in normpackages if p.endswith("/" + a)]
02408             normpackages = [p for p in normpackages if not p.endswith("/" + a)]
02409         self.upgrade_me = normpackages
02410         self.output_write("info: main run\n")
02411         self.do_all()
02412         allpackages += self.upgrade_me
02413         for a in self.options.break_arches.split():
02414             backup = self.options.break_arches
02415             self.options.break_arches = " ".join([x for x in self.options.break_arches.split() if x != a])
02416             self.upgrade_me = archpackages[a]
02417             self.output_write("info: broken arch run for %s\n" % (a))
02418             self.do_all()
02419             allpackages += self.upgrade_me
02420             self.options.break_arches = backup
02421         self.upgrade_me = allpackages
02422 
02423         if self.options.actions:
02424             return
02425 
02426         # process `hint' hints
02427         hintcnt = 0
02428         for x in self.hints["hint"]:
02429             if hintcnt > 50:
02430                 self.output_write("Skipping remaining hints...\n")
02431                 break
02432             if self.do_hint("hint", x[0], x[1]):
02433                 hintcnt += 1
02434 
02435         # run the auto hinter
02436         if not self.options.compatible:
02437             self.auto_hinter()
02438 
02439         # smooth updates
02440         if not self.options.compatible and len(self.options.smooth_updates) > 0:
02441             self.__log("> Removing old packages left in testing from smooth updates", type="I")
02442             removals = self.old_libraries()
02443             if len(removals) > 0:
02444                 self.output_write("Removing packages left in testing for smooth updates (%d):\n%s" % \
02445                     (len(removals), self.old_libraries_format(removals)))
02446                 self.do_all(actions=removals)
02447                 removals = self.old_libraries()
02448 
02449         if not self.options.compatible:
02450             removals = self.old_libraries()
02451             self.output_write("List of old libraries in testing (%d):\n%s" % (len(removals), self.old_libraries_format(removals)))
02452 
02453         # output files
02454         if not self.options.dry_run:
02455             # re-write control files
02456             if self.options.control_files:
02457                 self.write_controlfiles(self.options.testing, 'testing')
02458 
02459             # write bugs and dates
02460             self.write_bugs(self.options.testing, self.bugs['testing'])
02461             self.write_dates(self.options.testing, self.dates)
02462 
02463             # write HeidiResult
02464             self.write_heidi(self.options.testing, 'HeidiResult')
02465 
02466         self.__output.close()
02467         self.__log("Test completed!", type="I")
02468 
02469     def do_hint(self, type, who, pkgvers):
02470         """Process hints
02471 
02472         This method processes `easy`, `hint` and `force-hint` hints. If the
02473         requested version is not in unstable, then the hint is skipped.
02474         """
02475         hintinfo = {"easy": "easy",
02476                     "hint": 0,
02477                     "force-hint": -1,}
02478 
02479         self.__log("> Processing '%s' hint from %s" % (type, who), type="I")
02480         self.output_write("Trying %s from %s: %s\n" % (type, who, " ".join( ["%s/%s" % (p,v) for (p,v) in pkgvers])))
02481 
02482         ok = True
02483         # loop on the requested packages and versions
02484         for pkg, v in pkgvers:
02485             # remove architecture
02486             if "/" in pkg:
02487                 pkg = pkg[:pkg.find("/")]
02488 
02489             # skip removal requests
02490             if pkg[0] == "-":
02491                 continue
02492             # handle testing-proposed-updates
02493             elif pkg.endswith("_tpu"):
02494                 pkg = pkg[:-4]
02495                 if pkg not in self.sources['tpu']: continue
02496                 if apt_pkg.VersionCompare(self.sources['tpu'][pkg][VERSION], v) != 0:
02497                     self.output_write(" Version mismatch, %s %s != %s\n" % (pkg, v, self.sources['tpu'][pkg][VERSION]))
02498                     ok = False
02499             # does the package exist in unstable?
02500             elif pkg not in self.sources['unstable']:
02501                 self.output_write(" Source %s has no version in unstable\n" % pkg)
02502                 ok = False
02503             elif apt_pkg.VersionCompare(self.sources['unstable'][pkg][VERSION], v) != 0:
02504                 self.output_write(" Version mismatch, %s %s != %s\n" % (pkg, v, self.sources['unstable'][pkg][VERSION]))
02505                 ok = False
02506         if not ok:
02507             self.output_write("Not using hint\n")
02508             return False
02509 
02510         self.do_all(hintinfo[type], map(operator.itemgetter(0), pkgvers))
02511         return True
02512 
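The hint entries handled by do_hint above may carry an explicit architecture ("pkg/arch") or a testing-proposed-updates marker ("pkg_tpu"). As a hypothetical standalone sketch (the helper name `normalise_hint_token` is invented for illustration, not part of Britney), the normalisation performed at the top of the loop could look like:

```python
def normalise_hint_token(pkg):
    """Strip the optional "/arch" suffix and "_tpu" marker from a hint entry,
    returning the bare package name and the suite to check the version in."""
    suite = "unstable"
    # remove an explicit architecture, e.g. "glibc/i386" -> "glibc"
    if "/" in pkg:
        pkg = pkg[:pkg.find("/")]
    # a "_tpu" suffix means the version is checked against
    # testing-proposed-updates instead of unstable
    if pkg.endswith("_tpu"):
        pkg = pkg[:-4]
        suite = "tpu"
    return pkg, suite
```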
02513     def sort_actions(self):
02514         """Sort actions in a smart way
02515 
02516         This method sorts the list of actions in a smart way. It keeps the
02517         base order given by the age of the excuses, then reorders packages so
02518         that the ones with the most reverse dependencies end up at the end of
02519         the loop, and any action that depends on another one is placed after it.
02520         """
02521         upgrade_me = [x.name for x in self.excuses if x.name in self.upgrade_me]
02522         for e in self.excuses:
02523             if e.name not in upgrade_me: continue
02524             # try removes at the end of the loop
02525             elif e.name[0] == '-':
02526                 upgrade_me.remove(e.name)
02527                 upgrade_me.append(e.name)
02528             # otherwise, put it in a good position checking its dependencies
02529             else:
02530                 pos = []
02531                 udeps = [upgrade_me.index(x) for x in e.deps if x in upgrade_me and x != e.name]
02532                 if len(udeps) > 0:
02533                     pos.append(max(udeps))
02534                 sdeps = [upgrade_me.index(x) for x in e.sane_deps if x in upgrade_me and x != e.name]
02535                 if len(sdeps) > 0:
02536                     pos.append(min(sdeps))
02537                 if len(pos) == 0: continue
02538                 upgrade_me.remove(e.name)
02539                 upgrade_me.insert(max(pos)+1, e.name)
02540                 self.dependencies[e.name] = e.deps
02541 
02542         # replace the list of actions with the new one
02543         self.upgrade_me = upgrade_me
02544 
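The reordering rule in sort_actions can be illustrated in isolation: each action is moved to just after the last of its dependencies already present in the list. This is a toy sketch (the function `reorder` and its signature are illustrative, not Britney's API):

```python
def reorder(actions, deps):
    """Move each action after the last of its dependencies in the list.

    actions -- ordered list of action names
    deps    -- mapping from action name to the names it depends on
    """
    actions = actions[:]
    for name in list(actions):
        # indices of this action's dependencies that are also in the list
        positions = [actions.index(d) for d in deps.get(name, [])
                     if d in actions and d != name]
        if not positions:
            continue
        # re-insert the action right after its furthest dependency
        actions.remove(name)
        actions.insert(max(positions) + 1, name)
    return actions
```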
02545     def auto_hinter(self):
02546         """Auto hint circular dependencies
02547 
02548         This method tries to generate `easy` hints for circular dependencies,
02549         analyzing the relationships between the update excuses. Circular sets
02550         of packages are already known not to migrate with the standard do_all
02551         algorithm, so they are tried together as a single `easy` hint.
02551         """
02552         self.__log("> Processing hints from the auto hinter", type="I")
02553 
02554         # consider only excuses which are valid candidates
02555         excuses = dict([(x.name, x) for x in self.excuses if x.name in self.upgrade_me])
02556 
02557         def find_related(e, hint, first=False):
02558             if e not in excuses:
02559                 return False
02560             excuse = excuses[e]
02561             if e in self.sources['testing'] and self.sources['testing'][e][VERSION] == excuse.ver[1]:
02562                 return True
02563             if not first:
02564                 hint[e] = excuse.ver[1]
02565             if len(excuse.deps) == 0:
02566                 return hint
02567             for p in excuse.deps:
02568                 if p in hint: continue
02569                 if not find_related(p, hint):
02570                     return False
02571             return hint
02572 
02573         # loop on them
02574         cache = []
02575         for e in excuses:
02576             excuse = excuses[e]
02577             if (e in self.sources['testing'] and self.sources['testing'][e][VERSION] == excuse.ver[1]) or \
02578                len(excuse.deps) == 0:
02579                 continue
02580             hint = find_related(e, {}, True)
02581             if hint and e in hint and hint not in cache:
02582                 self.do_hint("easy", "autohinter", hint.items())
02583                 cache.append(hint)
02584 
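At its core, find_related above collects the closure of an excuse under its dependency relation; a circular group is then hinted as one `easy` set. A minimal iterative sketch of that closure, under the assumption that dependencies are given as a plain dict (the name `dependency_closure` is hypothetical):

```python
def dependency_closure(start, deps):
    """Return the sorted set of packages reachable from `start` by
    repeatedly following the dependency mapping `deps`."""
    seen, todo = set(), [start]
    while todo:
        e = todo.pop()
        if e in seen:
            continue  # already visited; this is what terminates cycles
        seen.add(e)
        todo.extend(deps.get(e, []))
    return sorted(seen)
```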
02585     def old_libraries(self):
02586         """Detect old libraries left in testing for smooth transitions
02587 
02588         This method detects old libraries which are in testing but no longer
02589         built from the source package: they are still there because other
02590         packages still depend on them, but they should be removed as soon
02591         as possible.
02592         """
02593         sources = self.sources['testing']
02594         testing = self.binaries['testing']
02595         unstable = self.binaries['unstable']
02596         removals = []
02597         for arch in self.options.architectures:
02598             for pkg_name in testing[arch][0]:
02599                 pkg = testing[arch][0][pkg_name]
02600                 if pkg_name not in unstable[arch][0] and \
02601                    not self.same_source(sources[pkg[SOURCE]][VERSION], pkg[SOURCEVER]):
02602                     removals.append("-" + pkg_name + "/" + arch)
02603         return removals
02604 
02605     def old_libraries_format(self, libs):
02606         """Format old libraries in a smart table"""
02607         libraries = {}
02608         for i in libs:
02609             pkg, arch = i.split("/")
02610             pkg = pkg[1:]
02611             if pkg in libraries:
02612                 libraries[pkg].append(arch)
02613             else:
02614                 libraries[pkg] = [arch]
02615         return "\n".join(["  " + k + ": " + " ".join(libraries[k]) for k in libraries]) + "\n"
02616 
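old_libraries_format groups the "-pkg/arch" removal entries by package and prints one indented line per package. The same behaviour can be sketched compactly with `dict.setdefault` (this is an illustrative rewrite, not the function Britney ships):

```python
def format_old_libraries(libs):
    """Group "-pkg/arch" removal entries by package and render the
    two-column table produced by old_libraries_format."""
    libraries = {}
    for entry in libs:
        pkg, arch = entry.split("/")
        # drop the leading "-" removal marker from the package name
        libraries.setdefault(pkg[1:], []).append(arch)
    return "\n".join("  %s: %s" % (k, " ".join(v))
                     for k, v in sorted(libraries.items())) + "\n"
```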
02617     def output_write(self, msg):
02618         """Simple wrapper for output writing"""
02619         self.__output.write(msg)
02620 
02621     def main(self):
02622         """Main method
02623         
02624         This is the entry point for the class: it includes the list of calls
02625         for the member methods which will produce the output files.
02626         """
02627         # if no actions are provided, build the excuses and sort them
02628         if not self.options.actions:
02629             self.write_excuses()
02630             if not self.options.compatible:
02631                 self.sort_actions()
02632         # otherwise, use the actions provided by the command line
02633         else: self.upgrade_me = self.options.actions.split()
02634 
02635         # run the upgrade test
02636         self.upgrade_testing()
02637 
02638 if __name__ == '__main__':
02639     Britney().main()

Generated on Fri Aug 18 23:23:25 2006 for britney by doxygen 1.4.7