britney.py

00001 #!/usr/bin/env python2.4
00002 # -*- coding: utf-8 -*-
00003 
00004 # Copyright (C) 2001-2004 Anthony Towns <ajt@debian.org>
00005 #                         Andreas Barth <aba@debian.org>
00006 #                         Fabio Tranchitella <kobold@debian.org>
00007 
00008 # This program is free software; you can redistribute it and/or modify
00009 # it under the terms of the GNU General Public License as published by
00010 # the Free Software Foundation; either version 2 of the License, or
00011 # (at your option) any later version.
00012 
00013 # This program is distributed in the hope that it will be useful,
00014 # but WITHOUT ANY WARRANTY; without even the implied warranty of
00015 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
00016 # GNU General Public License for more details.
00017 
00018 """
00019 = Introdution =
00020 
00021 This is the Debian testing updater script, also known as "Britney".
00022 
00023 Packages are usually installed into the `testing' distribution after
00024 they have undergone some degree of testing in unstable. The goal of
00025 this software is to do this task in a smart way, allowing testing
00026 to be always fully installable and close to being a release candidate.
00027 
00028 Britney source code is splitted in two different but related tasks:
00029 the first one is the generation of the update excuses, while the
00030 second tries to update testing with the valid candidates; first 
00031 each package alone, then larger and even larger sets of packages
00032 together. Each try is accepted if testing is not more uninstallable
00033 after the update than before.
00034 
00035 = Data Loading =
00036 
00037 In order to analyze the entire Debian distribution, Britney needs to
00038 load in memory the whole archive: this means more than 10.000 packages
00039 for twelve architectures, as well as the dependency interconnection
00040 between them. For this reason, the memory requirement for running this
00041 software are quite high and at least 1 gigabyte of RAM should be available.
00042 
00043 Britney loads the source packages from the `Sources' file and the binary
00044 packages from the `Packages_${arch}' files, where ${arch} is substituted
00045 with the supported architectures. While loading the data, the software
00046 analyze the dependencies and build a directed weighted graph in memory
00047 with all the interconnections between the packages (see Britney.read_sources
00048 and Britney.read_binaries).
00049 
00050 Other than source and binary packages, Britney loads the following data:
00051 
00052   * Bugs, which contains the count of release-critical bugs for a given
00053     version of a source package (see Britney.read_bugs).
00054 
00055   * Dates, which contains the date of the upload of a given version 
00056     of a source package (see Britney.read_dates).
00057 
00058   * Urgencies, which contains the urgency of the upload of a given
00059     version of a source package (see Britney.read_urgencies).
00060 
00061   * Approvals, which contains the list of approved testing-proposed-updates
00062     packages (see Britney.read_approvals).
00063 
00064   * Hints, which contains lists of commands which modify the standard behaviour
00065     of Britney (see Britney.read_hints).
00066 
00067 For a more detailed explanation about the format of these files, please read
00068 the documentation of the related methods. The exact meaning of them will be
00069 instead explained in the chapter "Excuses Generation".
00070 
00071 = Excuses =
00072 
00073 An excuse is a detailed explanation of why a package can or cannot
00074 be updated in the testing distribution from a newer package in 
00075 another distribution (like for example unstable). The main purpose
00076 of the excuses is to be written in an HTML file which will be 
00077 published over HTTP. The maintainers will be able to parse it manually
00078 or automatically to find the explanation of why their packages have
00079 been updated or not.
00080 
00081 == Excuses generation ==
00082 
00083 These are the steps (with references to method names) that Britney
00084 does for the generation of the update excuses.
00085 
00086  * If a source package is available in testing but it is not
00087    present in unstable and no binary packages in unstable are
00088    built from it, then it is marked for removal.
00089 
00090  * Every source package in unstable and testing-proposed-updates,
00091    if already present in testing, is checked for binary-NMUs, new
00092    or dropped binary packages in all the supported architectures
00093    (see Britney.should_upgrade_srcarch). The steps to detect if an
00094    upgrade is needed are:
00095 
00096     1. If there is a `remove' hint for the source package, the package
00097        is ignored: it will be removed and not updated.
00098 
00099     2. For every binary package build from the new source, it checks
00100        for unsatisfied dependencies, new binary package and updated
00101        binary package (binNMU) excluding the architecture-independent
00102        ones and the packages not built from the same source.
00103 
00104     3. For every binary package build from the old source, it checks
00105        if it is still built from the new source; if this is not true
00106        and the package is not architecture-independent, the script
00107        removes it from testing.
00108 
00109     4. Finally, if there is something worth doing (eg. a new or updated
00110        binary package) and nothing wrong it marks the source package
00111        as "Valid candidate", or "Not considered" if there is something
00112        wrong which prevented the update.
00113 
00114  * Every source package in unstable and testing-proposed-updates is
00115    checked for upgrade (see Britney.should_upgrade_src). The steps
00116    to detect if an upgrade is needed are:
00117 
00118     1. If the source package in testing is more recent the new one
00119        is ignored.
00120 
00121     2. If the source package doesn't exist (is fake), which means that
00122        a binary package refers to it but it is not present in the
00123        `Sources' file, the new one is ignored.
00124 
00125     3. If the package doesn't exist in testing, the urgency of the
00126        upload is ignored and set to the default (actually `low').
00127 
00128     4. If there is a `remove' hint for the source package, the package
00129        is ignored: it will be removed and not updated.
00130 
00131     5. If there is a `block' hint for the source package without an
00132        `unblock` hint or a `block-all source`, the package is ignored.
00133 
00134     7. If the suite is unstable, the update can go ahead only if the
00135        upload happend more then the minimum days specified by the
00136        urgency of the upload; if this is not true, the package is
00137        ignored as `too-young'. Note that the urgency is sticky, meaning
00138        that the highest urgency uploaded since the previous testing
00139        transition is taken into account.
00140 
00141     8. All the architecture-dependent binary packages and the
00142        architecture-independent ones for the `nobreakall' architectures
00143        have to be built from the source we are considering. If this is
00144        not true, then these are called `out-of-date' architectures and
00145        the package is ignored.
00146 
00147     9. The source package must have at least a binary package, otherwise
00148        it is ignored.
00149 
00150    10. If the suite is unstable, the count of release critical bugs for
00151        the new source package must be less then the count for the testing
00152        one. If this is not true, the package is ignored as `buggy'.
00153 
00154    11. If there is a `force' hint for the source package, then it is
00155        updated even if it is marked as ignored from the previous steps.
00156 
00157    12. If the suite is testing-proposed-updates, the source package can
00158        be updated only if there is an explicit approval for it.
00159 
00160    13. If the package will be ignored, mark it as "Valid candidate",
00161        otherwise mark it as "Not considered".
00162 
00163  * The list of `remove' hints is processed: if the requested source
00164    package is not already being updated or removed and the version
00165    actually in testing is the same specified with the `remove' hint,
00166    it is marked for removal.
00167 
00168  * The excuses are sorted by the number of days from the last upload
00169    (days-old) and by name.
00170 
00171  * A list of unconsidered excuses (for which the package is not upgraded)
00172    is built. Using this list, all the excuses depending on them is marked
00173    as invalid for "unpossible dependency".
00174 
00175  * The excuses are written in an HTML file.
00176 """
00177 
00178 import os
00179 import re
00180 import sys
00181 import string
00182 import time
00183 import optparse
00184 import operator
00185 
00186 import apt_pkg
00187 
00188 from excuse import Excuse
00189 from upgrade import UpgradeRun
00190 
00191 __author__ = 'Fabio Tranchitella'
00192 __version__ = '2.0.alpha1'
00193 
00194 
00195 class Britney:
00196     """Britney, the debian testing updater script
00197     
00198     This is the script that updates the testing_ distribution. It is executed
00199     each day after the installation of the updated packages. It generates the 
00200     `Packages' files for the testing distribution, but it does so in an
00201     intelligent manner; it try to avoid any inconsistency and to use only
00202     non-buggy packages.
00203 
00204     For more documentation on this script, please read the Developers Reference.
00205     """
00206 
00207     HINTS_STANDARD = ("easy", "hint", "remove", "block", "unblock", "urgent", "approve")
00208     HINTS_ALL = ("force", "force-hint", "block-all") + HINTS_STANDARD
00209 
00210     def __init__(self):
00211         """Class constructor
00212 
00213         This method initializes and populates the data lists, which contain all
00214         the information needed by the other methods of the class.
00215         """
00216         self.date_now = int(((time.time() / (60*60)) - 15) / 24)
00217 
00218         # parse the command line arguments
00219         self.__parse_arguments()
00220 
00221         # initialize the apt_pkg back-end
00222         apt_pkg.init()
00223 
00224         # read the source and binary packages for the involved distributions
00225         self.sources = {'testing': self.read_sources(self.options.testing),
00226                         'unstable': self.read_sources(self.options.unstable),
00227                         'tpu': self.read_sources(self.options.tpu),}
00228         self.binaries = {'testing': {}, 'unstable': {}, 'tpu': {}}
00229         for arch in self.options.architectures:
00230             self.binaries['testing'][arch] = self.read_binaries(self.options.testing, "testing", arch)
00231             self.binaries['unstable'][arch] = self.read_binaries(self.options.unstable, "unstable", arch)
00232             self.binaries['tpu'][arch] = self.read_binaries(self.options.tpu, "tpu", arch)
00233 
00234         # read the release-critical bug summaries for testing and unstable
00235         self.bugs = {'unstable': self.read_bugs(self.options.unstable),
00236                      'testing': self.read_bugs(self.options.testing),}
00237         self.normalize_bugs()
00238 
00239         # read additional data
00240         self.dates = self.read_dates(self.options.testing)
00241         self.urgencies = self.read_urgencies(self.options.testing)
00242         self.approvals = self.read_approvals(self.options.tpu)
00243         self.hints = self.read_hints(self.options.unstable)
00244         self.excuses = []
00245 
00246     def __parse_arguments(self):
00247         """Parse the command line arguments
00248 
00249         This method parses and initializes the command line arguments.
00250         While doing so, it preprocesses some of the options to be converted
00251         in a suitable form for the other methods of the class.
00252         """
00253         # initialize the parser
00254         self.parser = optparse.OptionParser(version="%prog")
00255         self.parser.add_option("-v", "", action="count", dest="verbose", help="enable verbose output")
00256         self.parser.add_option("-c", "--config", action="store", dest="config",
00257                           default="/etc/britney.conf", help="path for the configuration file")
00258         (self.options, self.args) = self.parser.parse_args()
00259 
00260         # if the configuration file exists, than read it and set the additional options
00261         if not os.path.isfile(self.options.config):
00262             self.__log("Unable to read the configuration file (%s), exiting!" % self.options.config, type="E")
00263             sys.exit(1)
00264 
00265         # minimum days for unstable-testing transition and the list of hints
00266         # are handled as an ad-hoc case
00267         self.MINDAYS = {}
00268         self.HINTS = {}
00269         for k, v in [map(string.strip,r.split('=', 1)) for r in file(self.options.config) if '=' in r and not r.strip().startswith('#')]:
00270             if k.startswith("MINDAYS_"):
00271                 self.MINDAYS[k.split("_")[1].lower()] = int(v)
00272             elif k.startswith("HINTS_"):
00273                 self.HINTS[k.split("_")[1].lower()] = \
00274                     reduce(lambda x,y: x+y, [hasattr(self, "HINTS_" + i) and getattr(self, "HINTS_" + i) or (i,) for i in v.split()])
00275             else:
00276                 setattr(self.options, k.lower(), v)
00277 
00278         # Sort the architecture list
00279         allarches = sorted(self.options.architectures.split())
00280         arches = [x for x in allarches if x in self.options.nobreakall_arches]
00281         arches += [x for x in allarches if x not in arches and x not in self.options.fucked_arches]
00282         arches += [x for x in allarches if x not in arches and x not in self.options.break_arches]
00283         arches += [x for x in allarches if x not in arches]
00284         self.options.architectures = arches
00285 
00286     def __log(self, msg, type="I"):
00287         """Print info messages according to verbosity level
00288         
00289         An easy-and-simple log method which prints messages to the standard
00290         output. The type parameter controls the urgency of the message, and
00291         can be equal to `I' for `Information', `W' for `Warning' and `E' for
00292         `Error'. Warnings and errors are always printed, and information are
00293         printed only if the verbose logging is enabled.
00294         """
00295         if self.options.verbose or type in ("E", "W"):
00296             print "%s: [%s] - %s" % (type, time.asctime(), msg)
00297 
00298     # Data reading/writing methods
00299     # ----------------------------
00300 
00301     def read_sources(self, basedir):
00302         """Read the list of source packages from the specified directory
00303         
00304         The source packages are read from the `Sources' file within the
00305         directory specified as `basedir' parameter. Considering the
00306         large amount of memory needed, not all the fields are loaded
00307         in memory. The available fields are Version, Maintainer and Section.
00308 
00309         The method returns a list where every item represents a source
00310         package as a dictionary.
00311         """
00312         sources = {}
00313         package = None
00314         filename = os.path.join(basedir, "Sources")
00315         self.__log("Loading source packages from %s" % filename)
00316         packages = apt_pkg.ParseTagFile(open(filename))
00317         while packages.Step():
00318             pkg = packages.Section.get('Package')
00319             sources[pkg] = {'binaries': [],
00320                             'version': packages.Section.get('Version'),
00321                             'maintainer': packages.Section.get('Maintainer'),
00322                             'section': packages.Section.get('Section'),
00323                             }
00324         return sources
00325 
00326     def read_binaries(self, basedir, distribution, arch):
00327         """Read the list of binary packages from the specified directory
00328         
00329         The binary packages are read from the `Packages_${arch}' files
00330         within the directory specified as `basedir' parameter, replacing
00331         ${arch} with the value of the arch parameter. Considering the
00332         large amount of memory needed, not all the fields are loaded
00333         in memory. The available fields are Version, Source, Pre-Depends,
00334         Depends, Conflicts, Provides and Architecture.
00335         
00336         After reading the packages, reverse dependencies are computed
00337         and saved in the `rdepends' keys, and the `Provides' field is
00338         used to populate the virtual packages list.
00339 
00340         The dependencies are parsed with the apt.pkg.ParseDepends method,
00341         and they are stored both as the format of its return value and
00342         text.
00343 
00344         The method returns a tuple. The first element is a list where
00345         every item represents a binary package as a dictionary; the second
00346         element is a dictionary which maps virtual packages to real
00347         packages that provide it.
00348         """
00349 
00350         packages = {}
00351         provides = {}
00352         package = None
00353         filename = os.path.join(basedir, "Packages_%s" % arch)
00354         self.__log("Loading binary packages from %s" % filename)
00355         Packages = apt_pkg.ParseTagFile(open(filename))
00356         while Packages.Step():
00357             pkg = Packages.Section.get('Package')
00358             version = Packages.Section.get('Version')
00359             dpkg = {'version': version,
00360                     'source': pkg, 
00361                     'source-ver': version,
00362                     'architecture': Packages.Section.get('Architecture'),
00363                     'rdepends': [],
00364                     }
00365             for k in ('Pre-Depends', 'Depends', 'Provides', 'Conflicts'):
00366                 v = Packages.Section.get(k)
00367                 if v: dpkg[k.lower()] = v
00368 
00369             # retrieve the name and the version of the source package
00370             source = Packages.Section.get('Source')
00371             if source:
00372                 dpkg['source'] = source.split(" ")[0]
00373                 if "(" in source:
00374                     dpkg['source-ver'] = source.split("(")[1].split(")")[0]
00375 
00376             # if the source package is available in the distribution, then register this binary package
00377             if dpkg['source'] in self.sources[distribution]:
00378                 self.sources[distribution][dpkg['source']]['binaries'].append(pkg + "/" + arch)
00379             # if the source package doesn't exist, create a fake one
00380             else:
00381                 self.sources[distribution][dpkg['source']] = {'binaries': [pkg + "/" + arch],
00382                     'version': dpkg['source-ver'], 'maintainer': None, 'section': None, 'fake': True}
00383 
00384             # register virtual packages and real packages that provide them
00385             if dpkg.has_key('provides'):
00386                 parts = map(string.strip, dpkg['provides'].split(","))
00387                 for p in parts:
00388                     try:
00389                         provides[p].append(pkg)
00390                     except KeyError:
00391                         provides[p] = [pkg]
00392                 del dpkg['provides']
00393 
00394             # append the resulting dictionary to the package list
00395             packages[pkg] = dpkg
00396 
00397         # loop again on the list of packages to register reverse dependencies
00398         # this is not needed for the moment, so it is disabled
00399         for pkg in packages:
00400             dependencies = []
00401             if packages[pkg].has_key('depends'):
00402                 dependencies.extend(apt_pkg.ParseDepends(packages[pkg]['depends']))
00403             if packages[pkg].has_key('pre-depends'):
00404                 dependencies.extend(apt_pkg.ParseDepends(packages[pkg]['pre-depends']))
00405             # register the list of the dependencies for the depending packages
00406             for p in dependencies:
00407                 for a in p:
00408                     if a[0] not in packages: continue
00409                     packages[a[0]]['rdepends'].append((pkg, a[1], a[2]))
00410             del dependencies
00411 
00412         # return a tuple with the list of real and virtual packages
00413         return (packages, provides)
00414 
00415     def read_bugs(self, basedir):
00416         """Read the release critial bug summary from the specified directory
00417         
00418         The RC bug summaries are read from the `Bugs' file within the
00419         directory specified as `basedir' parameter. The file contains
00420         rows with the format:
00421 
00422         <package-name> <count-of-rc-bugs>
00423 
00424         The method returns a dictionary where the key is the binary package
00425         name and the value is the number of open RC bugs for it.
00426         """
00427         bugs = {}
00428         filename = os.path.join(basedir, "Bugs")
00429         self.__log("Loading RC bugs count from %s" % filename)
00430         for line in open(filename):
00431             l = line.strip().split()
00432             if len(l) != 2: continue
00433             try:
00434                 bugs[l[0]] = int(l[1])
00435             except ValueError:
00436                 self.__log("Bugs, unable to parse \"%s\"" % line, type="E")
00437         return bugs
00438 
00439     def __maxver(self, pkg, dist):
00440         """Return the maximum version for a given package name
00441         
00442         This method returns None if the specified source package
00443         is not available in the `dist' distribution. If the package
00444         exists, then it returns the maximum version between the
00445         source package and its binary packages.
00446         """
00447         maxver = None
00448         if self.sources[dist].has_key(pkg):
00449             maxver = self.sources[dist][pkg]['version']
00450         for arch in self.options.architectures:
00451             if not self.binaries[dist][arch][0].has_key(pkg): continue
00452             pkgv = self.binaries[dist][arch][0][pkg]['version']
00453             if maxver == None or apt_pkg.VersionCompare(pkgv, maxver) > 0:
00454                 maxver = pkgv
00455         return maxver
00456 
00457     def normalize_bugs(self):
00458         """Normalize the release critical bug summaries for testing and unstable
00459         
00460         The method doesn't return any value: it directly modifies the
00461         object attribute `bugs'.
00462         """
00463         # loop on all the package names from testing and unstable bug summaries
00464         for pkg in set(self.bugs['testing'].keys() + self.bugs['unstable'].keys()):
00465 
00466             # make sure that the key is present in both dictionaries
00467             if not self.bugs['testing'].has_key(pkg):
00468                 self.bugs['testing'][pkg] = 0
00469             elif not self.bugs['unstable'].has_key(pkg):
00470                 self.bugs['unstable'][pkg] = 0
00471 
00472             # retrieve the maximum version of the package in testing:
00473             maxvert = self.__maxver(pkg, 'testing')
00474 
00475             # if the package is not available in testing or it has the
00476             # same RC bug count, then do nothing
00477             if maxvert == None or \
00478                self.bugs['testing'][pkg] == self.bugs['unstable'][pkg]:
00479                 continue
00480 
00481             # retrieve the maximum version of the package in testing:
00482             maxveru = self.__maxver(pkg, 'unstable')
00483 
00484             # if the package is not available in unstable, then do nothing
00485             if maxveru == None:
00486                 continue
00487             # else if the testing package is more recent, then use the
00488             # unstable RC bug count for testing, too
00489             elif apt_pkg.VersionCompare(maxvert, maxveru) >= 0:
00490                 self.bugs['testing'][pkg] = self.bugs['unstable'][pkg]
00491 
00492     def read_dates(self, basedir):
00493         """Read the upload date for the packages from the specified directory
00494         
00495         The upload dates are read from the `Date' file within the directory
00496         specified as `basedir' parameter. The file contains rows with the
00497         format:
00498 
00499         <package-name> <version> <date-of-upload>
00500 
00501         The dates are expressed as days starting from the 1970-01-01.
00502 
00503         The method returns a dictionary where the key is the binary package
00504         name and the value is tuple with two items, the version and the date.
00505         """
00506         dates = {}
00507         filename = os.path.join(basedir, "Dates")
00508         self.__log("Loading upload data from %s" % filename)
00509         for line in open(filename):
00510             l = line.strip().split()
00511             if len(l) != 3: continue
00512             try:
00513                 dates[l[0]] = (l[1], int(l[2]))
00514             except ValueError:
00515                 self.__log("Dates, unable to parse \"%s\"" % line, type="E")
00516         return dates
00517 
00518     def read_urgencies(self, basedir):
00519         """Read the upload urgency of the packages from the specified directory
00520         
00521         The upload urgencies are read from the `Urgency' file within the
00522         directory specified as `basedir' parameter. The file contains rows
00523         with the format:
00524 
00525         <package-name> <version> <urgency>
00526 
00527         The method returns a dictionary where the key is the binary package
00528         name and the value is the greatest urgency from the versions of the
00529         package that are higher then the testing one.
00530         """
00531 
00532         urgencies = {}
00533         filename = os.path.join(basedir, "Urgency")
00534         self.__log("Loading upload urgencies from %s" % filename)
00535         for line in open(filename):
00536             l = line.strip().split()
00537             if len(l) != 3: continue
00538 
00539             # read the minimum days associated to the urgencies
00540             urgency_old = urgencies.get(l[0], self.options.default_urgency)
00541             mindays_old = self.MINDAYS.get(urgency_old, self.MINDAYS[self.options.default_urgency])
00542             mindays_new = self.MINDAYS.get(l[2], self.MINDAYS[self.options.default_urgency])
00543 
00544             # if the new urgency is lower (so the min days are higher), do nothing
00545             if mindays_old <= mindays_new:
00546                 continue
00547 
00548             # if the package exists in testing and it is more recent, do nothing
00549             tsrcv = self.sources['testing'].get(l[0], None)
00550             if tsrcv and apt_pkg.VersionCompare(tsrcv['version'], l[1]) >= 0:
00551                 continue
00552 
00553             # if the package doesn't exist in unstable or it is older, do nothing
00554             usrcv = self.sources['unstable'].get(l[0], None)
00555             if not usrcv or apt_pkg.VersionCompare(usrcv['version'], l[1]) < 0:
00556                 continue
00557 
00558             # update the urgency for the package
00559             urgencies[l[0]] = l[2]
00560 
00561         return urgencies
00562 
00563     def read_approvals(self, basedir):
00564         """Read the approval commands from the specified directory
00565         
00566         The approval commands are read from the files contained by the 
00567         `Approved' directory within the directory specified as `basedir'
00568         parameter. The name of the files has to be the same of the
00569         authorized users for the approvals.
00570         
00571         The file contains rows with the format:
00572 
00573         <package-name> <version>
00574 
00575         The method returns a dictionary where the key is the binary package
00576         name followed by an underscore and the version number, and the value
00577         is the user who submitted the command.
00578         """
00579         approvals = {}
00580         for approver in self.options.approvers.split():
00581             filename = os.path.join(basedir, "Approved", approver)
00582             self.__log("Loading approvals list from %s" % filename)
00583             for line in open(filename):
00584                 l = line.strip().split()
00585                 if len(l) != 2: continue
00586                 approvals["%s_%s" % (l[0], l[1])] = approver
00587         return approvals
00588 
00589     def read_hints(self, basedir):
00590         """Read the hint commands from the specified directory
00591         
00592         The hint commands are read from the files contained by the `Hints'
00593         directory within the directory specified as `basedir' parameter. 
00594         The name of the files has to be the same of the authorized users
00595         for the hints.
00596         
00597         The file contains rows with the format:
00598 
00599         <command> <package-name>[/<version>]
00600 
00601         The method returns a dictionary where the key is the command, and
00602         the value is the list of affected packages.
00603         """
00604         hints = dict([(k,[]) for k in self.HINTS_ALL])
00605 
00606         for who in self.HINTS.keys():
00607             filename = os.path.join(basedir, "Hints", who)
00608             self.__log("Loading hints list from %s" % filename)
00609             for line in open(filename):
00610                 line = line.strip()
00611                 if line == "": continue
00612                 l = line.split()
00613                 if l[0] == 'finished':
00614                     break
00615                 elif l[0] not in self.HINTS[who]:
00616                     continue
00617                 elif l[0] in ["easy", "hint", "force-hint"]:
00618                     hints[l[0]].append((who, [k.split("/") for k in l if "/" in k]))
00619                 elif l[0] in ["block-all"]:
00620                     hints[l[0]].extend([(y, who) for y in l[1:]])
00621                 elif l[0] in ["block"]:
00622                     hints[l[0]].extend([(y, who) for y in l[1:]])
00623                 elif l[0] in ["remove", "approve", "unblock", "force", "urgent"]:
00624                     hints[l[0]].extend([(k.split("/")[0], (k.split("/")[1],who) ) for k in l if "/" in k])
00625 
00626         for x in ["block", "block-all", "unblock", "force", "urgent", "remove"]:
00627             z = {}
00628             for a, b in hints[x]:
00629                 if z.has_key(a):
00630                     self.__log("Overriding %s[%s] = %s with %s" % (x, a, z[a], b), type="W")
00631                 z[a] = b
00632             hints[x] = z
00633 
00634         return hints
00635 
00636     # Utility methods for package analisys
00637     # ------------------------------------
00638 
00639     def same_source(self, sv1, sv2):
00640         """Check if two version numbers are built from the same source
00641 
00642         This method returns a boolean value which is true if the two
00643         version numbers specified as parameters are built from the same
00644         source. The main use of this code is to detect binary-NMU.
00645         """
00646         if sv1 == sv2:
00647             return 1
00648 
00649         m = re.match(r'^(.*)\+b\d+$', sv1)
00650         if m: sv1 = m.group(1)
00651         m = re.match(r'^(.*)\+b\d+$', sv2)
00652         if m: sv2 = m.group(1)
00653 
00654         if sv1 == sv2:
00655             return 1
00656 
00657         if re.search("-", sv1) or re.search("-", sv2):
00658             m = re.match(r'^(.*-[^.]+)\.0\.\d+$', sv1)
00659             if m: sv1 = m.group(1)
00660             m = re.match(r'^(.*-[^.]+\.[^.]+)\.\d+$', sv1)
00661             if m: sv1 = m.group(1)
00662 
00663             m = re.match(r'^(.*-[^.]+)\.0\.\d+$', sv2)
00664             if m: sv2 = m.group(1)
00665             m = re.match(r'^(.*-[^.]+\.[^.]+)\.\d+$', sv2)
00666             if m: sv2 = m.group(1)
00667 
00668             return (sv1 == sv2)
00669         else:
00670             m = re.match(r'^([^-]+)\.0\.\d+$', sv1)
00671             if m and sv2 == m.group(1): return 1
00672 
00673             m = re.match(r'^([^-]+)\.0\.\d+$', sv2)
00674             if m and sv1 == m.group(1): return 1
00675 
00676             return 0
00677 
00678     def get_dependency_solvers(self, block, arch, distribution, excluded=[]):
00679         """Find the packages which satisfy a dependency block
00680 
00681         This method returns the list of packages which satisfy a dependency
00682         block (as returned by apt_pkg.ParseDepends) for the given architecture
00683         and distribution.
00684 
00685         It returns a tuple with two items: the first is a boolean which is
00686         True if the dependency is satisfied, the second is the list of the
00687         solving packages.
00688         """
00689 
00690         packages = []
00691 
00692         # for every package, version and operation in the block
00693         for name, version, op in block:
00694             # look for the package in unstable
00695             if name in self.binaries[distribution][arch][0] and name not in excluded:
00696                 package = self.binaries[distribution][arch][0][name]
00697                 # check the versioned dependency (if present)
00698                 if op == '' and version == '' or apt_pkg.CheckDep(package['version'], op, version):
00699                     packages.append(name)
00700 
00701             # look for the package in the virtual packages list
00702             if name in self.binaries[distribution][arch][1]:
00703                 # loop on the list of packages which provides it
00704                 for prov in self.binaries[distribution][arch][1][name]:
00705                     if prov in excluded or \
00706                        not self.binaries[distribution][arch][0].has_key(prov): continue
00707                     package = self.binaries[distribution][arch][0][prov]
00708                     # check the versioned dependency (if present)
00709                     # TODO: this is forbidden by the debian policy, which says that versioned
00710                     #       dependencies on virtual packages are never satisfied. The old britney
00711                     #       does it and we have to go with it, but at least a warning should be raised.
00712                     if op == '' and version == '' or apt_pkg.CheckDep(package['version'], op, version):
00713                         packages.append(prov)
00714                         break
00715 
00716         return (len(packages) > 0, packages)
00717 
00718     def excuse_unsat_deps(self, pkg, src, arch, suite, excuse=None, excluded=[]):
00719         """Find unsatisfied dependencies for a binary package
00720 
00721         This method analyzes the dependencies of the binary package specified
00722         by the parameter `pkg', built from the source package `src', for the
00723         architecture `arch' within the suite `suite'. If the dependency can't
00724         be satisfied in testing and/or unstable, it updates the excuse passed
00725         as parameter.
00726 
00727         The dependency fields checked are Pre-Depends and Depends.
00728         """
00729         # retrieve the binary package from the specified suite and arch
00730         binary_u = self.binaries[suite][arch][0][pkg]
00731 
00732         # analyze the dependency fields (if present)
00733         for type in ('Pre-Depends', 'Depends'):
00734             type_key = type.lower()
00735             if not binary_u.has_key(type_key):
00736                 continue
00737 
00738             # this list will contain the packages that satisfy the dependency
00739             packages = []
00740 
00741             # for every block of dependency (which is formed as conjunction of disconjunction)
00742             for block, block_txt in zip(apt_pkg.ParseDepends(binary_u[type_key]), binary_u[type_key].split(',')):
00743                 # if the block is satisfied in testing, then skip the block
00744                 solved, packages = self.get_dependency_solvers(block, arch, 'testing', excluded)
00745                 if solved: continue
00746                 elif excuse == None:
00747                     return False
00748 
00749                 # check if the block can be satisfied in unstable, and list the solving packages
00750                 solved, packages = self.get_dependency_solvers(block, arch, suite)
00751                 packages = [self.binaries[suite][arch][0][p]['source'] for p in packages]
00752 
00753                 # if the dependency can be satisfied by the same source package, skip the block:
00754                 # obviously both binary packages will enter testing togheter
00755                 if src in packages: continue
00756 
00757                 # if no package can satisfy the dependency, add this information to the excuse
00758                 if len(packages) == 0:
00759                     excuse.addhtml("%s/%s unsatisfiable %s: %s" % (pkg, arch, type, block_txt.strip()))
00760 
00761                 # for the solving packages, update the excuse to add the dependencies
00762                 for p in packages:
00763                     if arch not in self.options.break_arches.split():
00764                         excuse.add_dep(p)
00765                     else:
00766                         excuse.add_break_dep(p, arch)
00767 
00768         # otherwise, the package is installable
00769         return True
00770 
00771     # Package analisys methods
00772     # ------------------------
00773 
00774     def should_remove_source(self, pkg):
00775         """Check if a source package should be removed from testing
00776         
00777         This method checks if a source package should be removed from the
00778         testing distribution; this happen if the source package is not
00779         present in the unstable distribution anymore.
00780 
00781         It returns True if the package can be removed, False otherwise.
00782         In the former case, a new excuse is appended to the the object
00783         attribute excuses.
00784         """
00785         # if the soruce package is available in unstable, then do nothing
00786         if self.sources['unstable'].has_key(pkg):
00787             return False
00788         # otherwise, add a new excuse for its removal and return True
00789         src = self.sources['testing'][pkg]
00790         excuse = Excuse("-" + pkg)
00791         excuse.set_vers(src['version'], None)
00792         src['maintainer'] and excuse.set_maint(src['maintainer'].strip())
00793         src['section'] and excuse.set_section(src['section'].strip())
00794         excuse.addhtml("Valid candidate")
00795         self.excuses.append(excuse)
00796         return True
00797 
00798     def should_upgrade_srcarch(self, src, arch, suite):
00799         """Check if binary package should be upgraded
00800 
00801         This method checks if a binary package should be upgraded; this can
00802         happen also if the binary package is a binary-NMU for the given arch.
00803         The analisys is performed for the source package specified by the
00804         `src' parameter, checking the architecture `arch' for the distribution
00805         `suite'.
00806        
00807         It returns False if the given package doesn't need to be upgraded,
00808         True otherwise. In the former case, a new excuse is appended to
00809         the the object attribute excuses.
00810         """
00811         # retrieve the source packages for testing and suite
00812         source_t = self.sources['testing'][src]
00813         source_u = self.sources[suite][src]
00814 
00815         # build the common part of the excuse, which will be filled by the code below
00816         ref = "%s/%s%s" % (src, arch, suite != 'unstable' and "_" + suite or "")
00817         excuse = Excuse(ref)
00818         excuse.set_vers(source_t['version'], source_t['version'])
00819         source_u['maintainer'] and excuse.set_maint(source_u['maintainer'].strip())
00820         source_u['section'] and excuse.set_section(source_u['section'].strip())
00821         
00822         # if there is a `remove' hint and the requested version is the same of the
00823         # version in testing, then stop here and return False
00824         if self.hints["remove"].has_key(src) and \
00825            self.same_source(source_t['version'], self.hints["remove"][src][0]):
00826             excuse.addhtml("Removal request by %s" % (self.hints["remove"][src][1]))
00827             excuse.addhtml("Trying to remove package, not update it")
00828             excuse.addhtml("Not considered")
00829             self.excuses.append(excuse)
00830             return False
00831 
00832         # the starting point is that there is nothing wrong and nothing worth doing
00833         anywrongver = False
00834         anyworthdoing = False
00835 
00836         # for every binary package produced by this source in unstable for this architecture
00837         for pkg in sorted(filter(lambda x: x.endswith("/" + arch), source_u['binaries'])):
00838             pkg_name = pkg.split("/")[0]
00839 
00840             # retrieve the testing (if present) and unstable corresponding binary packages
00841             binary_t = pkg in source_t['binaries'] and self.binaries['testing'][arch][0][pkg_name] or None
00842             binary_u = self.binaries[suite][arch][0][pkg_name]
00843 
00844             # this is the source version for the new binary package
00845             pkgsv = self.binaries[suite][arch][0][pkg_name]['source-ver']
00846 
00847             # if the new binary package is architecture-independent, then skip it
00848             if binary_u['architecture'] == 'all':
00849                 excuse.addhtml("Ignoring %s %s (from %s) as it is arch: all" % (pkg_name, binary_u['version'], pkgsv))
00850                 continue
00851 
00852             # if the new binary package is not from the same source as the testing one, then skip it
00853             if not self.same_source(source_t['version'], pkgsv):
00854                 anywrongver = True
00855                 excuse.addhtml("From wrong source: %s %s (%s not %s)" % (pkg_name, binary_u['version'], pkgsv, source_t['version']))
00856                 break
00857 
00858             # find unsatisfied dependencies for the new binary package
00859             self.excuse_unsat_deps(pkg_name, src, arch, suite, excuse)
00860 
00861             # if the binary is not present in testing, then it is a new binary;
00862             # in this case, there is something worth doing
00863             if not binary_t:
00864                 excuse.addhtml("New binary: %s (%s)" % (pkg_name, binary_u['version']))
00865                 anyworthdoing = True
00866                 continue
00867 
00868             # at this point, the binary package is present in testing, so we can compare
00869             # the versions of the packages ...
00870             vcompare = apt_pkg.VersionCompare(binary_t['version'], binary_u['version'])
00871 
00872             # ... if updating would mean downgrading, then stop here: there is something wrong
00873             if vcompare > 0:
00874                 anywrongver = True
00875                 excuse.addhtml("Not downgrading: %s (%s to %s)" % (pkg_name, binary_t['version'], binary_u['version']))
00876                 break
00877             # ... if updating would mean upgrading, then there is something worth doing
00878             elif vcompare < 0:
00879                 excuse.addhtml("Updated binary: %s (%s to %s)" % (pkg_name, binary_t['version'], binary_u['version']))
00880                 anyworthdoing = True
00881 
00882         # if there is nothing wrong and there is something worth doing or the source
00883         # package is not fake, then check what packages shuold be removed
00884         if not anywrongver and (anyworthdoing or self.sources[suite][src].has_key('fake')):
00885             srcv = self.sources[suite][src]['version']
00886             ssrc = self.same_source(source_t['version'], srcv)
00887             # for every binary package produced by this source in testing for this architecture
00888             for pkg in sorted([x.split("/")[0] for x in self.sources['testing'][src]['binaries'] if x.endswith("/"+arch)]):
00889                 # if the package is architecture-independent, then ignore it
00890                 if self.binaries['testing'][arch][0][pkg]['architecture'] == 'all':
00891                     excuse.addhtml("Ignoring removal of %s as it is arch: all" % (pkg))
00892                     continue
00893                 # if the package is not produced by the new source package, then remove it from testing
00894                 if not self.binaries[suite][arch][0].has_key(pkg):
00895                     tpkgv = self.binaries['testing'][arch][0][pkg]['version']
00896                     excuse.addhtml("Removed binary: %s %s" % (pkg, tpkgv))
00897                     if ssrc: anyworthdoing = True
00898 
00899         # if there is nothing wrong and there is something worth doing, this is valid candidate
00900         if not anywrongver and anyworthdoing:
00901             excuse.addhtml("Valid candidate")
00902             self.excuses.append(excuse)
00903             return True
00904         # else if there is something worth doing (but something wrong, too) this package won't be considered
00905         elif anyworthdoing:
00906             excuse.addhtml("Not considered")
00907             self.excuses.append(excuse)
00908 
00909         # otherwise, return False
00910         return False
00911 
00912     def should_upgrade_src(self, src, suite):
00913         """Check if source package should be upgraded
00914 
00915         This method checks if a source package should be upgraded. The analisys
00916         is performed for the source package specified by the `src' parameter, 
00917         checking the architecture `arch' for the distribution `suite'.
00918        
00919         It returns False if the given package doesn't need to be upgraded,
00920         True otherwise. In the former case, a new excuse is appended to
00921         the the object attribute excuses.
00922         """
00923 
00924         # retrieve the source packages for testing (if available) and suite
00925         source_u = self.sources[suite][src]
00926         if src in self.sources['testing']:
00927             source_t = self.sources['testing'][src]
00928             # if testing and unstable have the same version, then this is a candidate for binary-NMUs only
00929             if apt_pkg.VersionCompare(source_t['version'], source_u['version']) == 0:
00930                 return False
00931         else:
00932             source_t = None
00933 
00934         # build the common part of the excuse, which will be filled by the code below
00935         ref = "%s%s" % (src, suite != 'unstable' and "_" + suite or "")
00936         excuse = Excuse(ref)
00937         excuse.set_vers(source_t and source_t['version'] or None, source_u['version'])
00938         source_u['maintainer'] and excuse.set_maint(source_u['maintainer'].strip())
00939         source_u['section'] and excuse.set_section(source_u['section'].strip())
00940 
00941         # the starting point is that we will update the candidate
00942         update_candidate = True
00943         
00944         # if the version in unstable is older, then stop here with a warning in the excuse and return False
00945         if source_t and apt_pkg.VersionCompare(source_u['version'], source_t['version']) < 0:
00946             excuse.addhtml("ALERT: %s is newer in testing (%s %s)" % (src, source_t['version'], source_u['version']))
00947             self.excuses.append(excuse)
00948             return False
00949 
00950         # check if the source package really exists or if it is a fake one
00951         if source_u.has_key('fake'):
00952             excuse.addhtml("%s source package doesn't exist" % (src))
00953             update_candidate = False
00954 
00955         # retrieve the urgency for the upload, ignoring it if this is a NEW package (not present in testing)
00956         urgency = self.urgencies.get(src, self.options.default_urgency)
00957         if not source_t and urgency != self.options.default_urgency:
00958             excuse.addhtml("Ignoring %s urgency setting for NEW package" % (urgency))
00959             urgency = self.options.default_urgency
00960 
00961         # if there is a `remove' hint and the requested version is the same of the
00962         # version in testing, then stop here and return False
00963         if self.hints["remove"].has_key(src):
00964             if source_t and self.same_source(source_t['version'], self.hints['remove'][src][0]) or \
00965                self.same_source(source_u['version'], self.hints['remove'][src][0]):
00966                 excuse.addhtml("Removal request by %s" % (self.hints["remove"][src][1]))
00967                 excuse.addhtml("Trying to remove package, not update it")
00968                 update_candidate = False
00969 
00970         # check if there is a `block' hint for this package or a `block-all source' hint
00971         blocked = None
00972         if self.hints["block"].has_key(src):
00973             blocked = self.hints["block"][src]
00974         elif self.hints["block-all"].has_key("source"):
00975             blocked = self.hints["block-all"]["source"]
00976 
00977         # if the source is blocked, then look for an `unblock' hint; the unblock request
00978         # is processed only if the specified version is correct
00979         if blocked:
00980             unblock = self.hints["unblock"].get(src,(None,None))
00981             if unblock[0] != None:
00982                 if self.same_source(unblock[0], source_u['version']):
00983                     excuse.addhtml("Ignoring request to block package by %s, due to unblock request by %s" % (blocked, unblock[1]))
00984                 else:
00985                     excuse.addhtml("Unblock request by %s ignored due to version mismatch: %s" % (unblock[1], unblock[0]))
00986             else:
00987                 excuse.addhtml("Not touching package, as requested by %s (contact debian-release if update is needed)" % (blocked))
00988                 update_candidate = False
00989 
00990         # if the suite is unstable, then we have to check the urgency and the minimum days of
00991         # permanence in unstable before updating testing; if the source package is too young,
00992         # the check fails and we set update_candidate to False to block the update
00993         if suite == 'unstable':
00994             if not self.dates.has_key(src):
00995                 self.dates[src] = (source_u['version'], self.date_now)
00996             elif not self.same_source(self.dates[src][0], source_u['version']):
00997                 self.dates[src] = (source_u['version'], self.date_now)
00998 
00999             days_old = self.date_now - self.dates[src][1]
01000             min_days = self.MINDAYS[urgency]
01001             excuse.setdaysold(days_old, min_days)
01002             if days_old < min_days:
01003                 if self.hints["urgent"].has_key(src) and self.same_source(source_u['version'], self.hints["urgent"][src][0]):
01004                     excuse.addhtml("Too young, but urgency pushed by %s" % (self.hints["urgent"][src][1]))
01005                 else:
01006                     update_candidate = False
01007 
01008         # at this point, we check what is the status of the builds on all the supported architectures
01009         # to catch the out-of-date ones
01010         pkgs = {src: ["source"]}
01011         for arch in self.options.architectures:
01012             oodbins = {}
01013             # for every binary package produced by this source in the suite for this architecture
01014             for pkg in sorted([x.split("/")[0] for x in self.sources[suite][src]['binaries'] if x.endswith("/"+arch)]):
01015                 if not pkgs.has_key(pkg): pkgs[pkg] = []
01016                 pkgs[pkg].append(arch)
01017 
01018                 # retrieve the binary package and its source version
01019                 binary_u = self.binaries[suite][arch][0][pkg]
01020                 pkgsv = binary_u['source-ver']
01021 
01022                 # if it wasn't builded by the same source, it is out-of-date
01023                 if not self.same_source(source_u['version'], pkgsv):
01024                     if not oodbins.has_key(pkgsv):
01025                         oodbins[pkgsv] = []
01026                     oodbins[pkgsv].append(pkg)
01027                     continue
01028 
01029                 # if the package is architecture-dependent or the current arch is `nobreakall'
01030                 # find unsatisfied dependencies for the binary package
01031                 if binary_u['architecture'] != 'all' or arch in self.options.nobreakall_arches:
01032                     self.excuse_unsat_deps(pkg, src, arch, suite, excuse)
01033 
01034             # if there are out-of-date packages, warn about them in the excuse and set update_candidate
01035             # to False to block the update; if the architecture where the package is out-of-date is
01036             # in the `fucked_arches' list, then do not block the update
01037             if oodbins:
01038                 oodtxt = ""
01039                 for v in oodbins.keys():
01040                     if oodtxt: oodtxt = oodtxt + "; "
01041                     oodtxt = oodtxt + "%s (from <a href=\"http://buildd.debian.org/build.php?" \
01042                         "arch=%s&pkg=%s&ver=%s\" target=\"_blank\">%s</a>)" % \
01043                         (", ".join(sorted(oodbins[v])), arch, src, v, v)
01044                 text = "out of date on <a href=\"http://buildd.debian.org/build.php?" \
01045                     "arch=%s&pkg=%s&ver=%s\" target=\"_blank\">%s</a>: %s" % \
01046                     (arch, src, source_u['version'], arch, oodtxt)
01047 
01048                 if arch in self.options.fucked_arches:
01049                     text = text + " (but %s isn't keeping up, so nevermind)" % (arch)
01050                 else:
01051                     update_candidate = False
01052 
01053                 if self.date_now != self.dates[src][1]:
01054                     excuse.addhtml(text)
01055 
01056         # if the source package has no binaries, set update_candidate to False to block the update
01057         if len(self.sources[suite][src]['binaries']) == 0:
01058             excuse.addhtml("%s has no binaries on any arch" % src)
01059             update_candidate = False
01060 
01061         # if the suite is unstable, then we have to check the release-critical bug counts before
01062         # updating testing; if the unstable package have a RC bug count greater than the testing
01063         # one,  the check fails and we set update_candidate to False to block the update
01064         if suite == 'unstable':
01065             for pkg in pkgs.keys():
01066                 if not self.bugs['testing'].has_key(pkg):
01067                     self.bugs['testing'][pkg] = 0
01068                 if not self.bugs['unstable'].has_key(pkg):
01069                     self.bugs['unstable'][pkg] = 0
01070 
01071                 if self.bugs['unstable'][pkg] > self.bugs['testing'][pkg]:
01072                     excuse.addhtml("%s (%s) is <a href=\"http://bugs.debian.org/cgi-bin/pkgreport.cgi?" \
01073                                    "which=pkg&data=%s&sev-inc=critical&sev-inc=grave&sev-inc=serious\" " \
01074                                    "target=\"_blank\">buggy</a>! (%d > %d)" % \
01075                                    (pkg, ", ".join(pkgs[pkg]), pkg, self.bugs['unstable'][pkg], self.bugs['testing'][pkg]))
01076                     update_candidate = False
01077                 elif self.bugs['unstable'][pkg] > 0:
01078                     excuse.addhtml("%s (%s) is (less) <a href=\"http://bugs.debian.org/cgi-bin/pkgreport.cgi?" \
01079                                    "which=pkg&data=%s&sev-inc=critical&sev-inc=grave&sev-inc=serious\" " \
01080                                    "target=\"_blank\">buggy</a>! (%d <= %d)" % \
01081                                    (pkg, ", ".join(pkgs[pkg]), pkg, self.bugs['unstable'][pkg], self.bugs['testing'][pkg]))
01082 
01083         # check if there is a `force' hint for this package, which allows it to go in even if it is not updateable
01084         if not update_candidate and self.hints["force"].has_key(src) and \
01085            self.same_source(source_u['version'], self.hints["force"][src][0]):
01086             excuse.dontinvalidate = 1
01087             excuse.addhtml("Should ignore, but forced by %s" % (self.hints["force"][src][1]))
01088             update_candidate = True
01089 
01090         # if the suite is testing-proposed-updates, the package needs an explicit approval in order to go in
01091         if suite == "tpu":
01092             if self.approvals.has_key("%s_%s" % (src, source_u['version'])):
01093                 excuse.addhtml("Approved by %s" % approvals["%s_%s" % (src, source_u['version'])])
01094             else:
01095                 excuse.addhtml("NEEDS APPROVAL BY RM")
01096                 update_candidate = False
01097 
01098         # if the package can be updated, it is a valid candidate
01099         if update_candidate:
01100             excuse.addhtml("Valid candidate")
01101         # else it won't be considered
01102         else:
01103             excuse.addhtml("Not considered")
01104 
01105         self.excuses.append(excuse)
01106         return update_candidate
01107 
01108     def reversed_exc_deps(self):
01109         """Reverse the excuses dependencies
01110 
01111         This method returns a dictionary where the keys are the package names
01112         and the values are the excuse names which depend on it.
01113         """
01114         res = {}
01115         for exc in self.excuses:
01116             for d in exc.deps:
01117                 if not res.has_key(d): res[d] = []
01118                 res[d].append(exc.name)
01119         return res
01120 
01121     def invalidate_excuses(self, valid, invalid):
01122         """Invalidate impossible excuses
01123 
01124         This method invalidates the impossible excuses, which depend
01125         on invalid excuses. The two parameters contains the list of
01126         `valid' and `invalid' excuses.
01127         """
01128         # build a lookup-by-name map
01129         exclookup = {}
01130         for e in self.excuses:
01131             exclookup[e.name] = e
01132 
01133         # build the reverse dependencies
01134         revdeps = self.reversed_exc_deps()
01135 
01136         # loop on the invalid excuses
01137         i = 0
01138         while i < len(invalid):
01139             # if there is no reverse dependency, skip the item
01140             if not revdeps.has_key(invalid[i]):
01141                 i += 1
01142                 continue
01143             # if there dependency can be satisfied by a testing-proposed-updates excuse, skip the item
01144             if (invalid[i] + "_tpu") in valid:
01145                 i += 1
01146                 continue
01147             # loop on the reverse dependencies
01148             for x in revdeps[invalid[i]]:
01149                 # if the item is valid and it is marked as `dontinvalidate', skip the item
01150                 if x in valid and exclookup[x].dontinvalidate:
01151                     continue
01152 
01153                 # otherwise, invalidate the dependency and mark as invalidated and
01154                 # remove the depending excuses
01155                 exclookup[x].invalidate_dep(invalid[i])
01156                 if x in valid:
01157                     p = valid.index(x)
01158                     invalid.append(valid.pop(p))
01159                     exclookup[x].addhtml("Invalidated by dependency")
01160                     exclookup[x].addhtml("Not considered")
01161             i = i + 1
01162  
01163     def write_excuses(self):
01164         """Produce and write the update excuses
01165 
01166         This method handles the update excuses generation: the packages are
01167         looked to determine whether they are valid candidates. For the details
01168         of this procedure, please refer to the module docstring.
01169         """
01170 
01171         self.__log("Update Excuses generation started", type="I")
01172 
01173         # this list will contain the packages which are valid candidates;
01174         # if a package is going to be removed, it will have a "-" prefix
01175         upgrade_me = []
01176 
01177         # for every source package in testing, check if it should be removed
01178         for pkg in self.sources['testing']:
01179             if self.should_remove_source(pkg):
01180                 upgrade_me.append("-" + pkg)
01181 
01182         # for every source package in unstable check if it should be upgraded
01183         for pkg in self.sources['unstable']:
01184             # if the source package is already present in testing,
01185             # check if it should be upgraded for every binary package
01186             if self.sources['testing'].has_key(pkg):
01187                 for arch in self.options.architectures:
01188                     if self.should_upgrade_srcarch(pkg, arch, 'unstable'):
01189                         upgrade_me.append("%s/%s" % (pkg, arch))
01190 
01191             # check if the source package should be upgraded
01192             if self.should_upgrade_src(pkg, 'unstable'):
01193                 upgrade_me.append(pkg)
01194 
01195         # for every source package in testing-proposed-updates, check if it should be upgraded
01196         for pkg in self.sources['tpu']:
01197             # if the source package is already present in testing,
01198             # check if it should be upgraded for every binary package
01199             if self.sources['testing'].has_key(pkg):
01200                 for arch in self.options.architectures:
01201                     if self.should_upgrade_srcarch(pkg, arch, 'tpu'):
01202                         upgrade_me.append("%s/%s_tpu" % (pkg, arch))
01203 
01204             # check if the source package should be upgraded
01205             if self.should_upgrade_src(pkg, 'tpu'):
01206                 upgrade_me.append("%s_tpu" % pkg)
01207 
01208         # process the `remove' hints, if the given package is not yet in upgrade_me
01209         for src in self.hints["remove"].keys():
01210             if src in upgrade_me: continue
01211             if ("-"+src) in upgrade_me: continue
01212             if not self.sources['testing'].has_key(src): continue
01213 
01214             # check if the version specified in the hint is the same of the considered package
01215             tsrcv = self.sources['testing'][src]['version']
01216             if not self.same_source(tsrcv, self.hints["remove"][src][0]): continue
01217 
01218             # add the removal of the package to upgrade_me and build a new excuse
01219             upgrade_me.append("-%s" % (src))
01220             excuse = Excuse("-%s" % (src))
01221             excuse.set_vers(tsrcv, None)
01222             excuse.addhtml("Removal request by %s" % (self.hints["remove"][src][1]))
01223             excuse.addhtml("Package is broken, will try to remove")
01224             self.excuses.append(excuse)
01225 
01226         # sort the excuses by daysold and name
01227         self.excuses.sort(lambda x, y: cmp(x.daysold, y.daysold) or cmp(x.name, y.name))
01228 
01229         # extract the not considered packages, which are in the excuses but not in upgrade_me
01230         unconsidered = [e.name for e in self.excuses if e.name not in upgrade_me]
01231 
01232         # invalidate impossible excuses
01233         for e in self.excuses:
01234             for d in e.deps:
01235                 if d not in upgrade_me and d not in unconsidered:
01236                     e.addhtml("Unpossible dep: %s -> %s" % (e.name, d))
01237         self.invalidate_excuses(upgrade_me, unconsidered)
01238 
01239         self.upgrade_me = sorted(upgrade_me)
01240 
01241         # write excuses to the output file
01242         self.__log("Writing Excuses to %s" % self.options.excuses_output, type="I")
01243 
01244         f = open(self.options.excuses_output, 'w')
01245         f.write("<!DOCTYPE HTML PUBLIC \"-//W3C//DTD HTML 4.01//EN\" \"http://www.w3.org/TR/REC-html40/strict.dtd\">\n")
01246         f.write("<html><head><title>excuses...</title>")
01247         f.write("<meta http-equiv=\"Content-Type\" content=\"text/html;charset=utf-8\"></head><body>\n")
01248         f.write("<p>Generated: " + time.strftime("%Y.%m.%d %H:%M:%S %z", time.gmtime(time.time())) + "</p>\n")
01249         f.write("<ul>\n")
01250         for e in self.excuses:
01251             f.write("<li>%s" % e.html())
01252         f.write("</ul></body></html>\n")
01253         f.close()
01254 
01255         self.__log("Update Excuses generation completed", type="I")
01256 
01257     # Upgrade run
01258     # -----------
01259 
01260     def get_nuninst(self):
01261         nuninst = {}
01262 
01263         def add_nuninst(pkg, arch):
01264             if pkg not in nuninst[arch]:
01265                 nuninst[arch].append(pkg)
01266                 for p in self.binaries['testing'][arch][0][pkg]['rdepends']:
01267                     tpkg = self.binaries['testing'][arch][0][p[0]]
01268                     if skip_archall and tpkg['architecture'] == 'all':
01269                         continue
01270                     r = self.excuse_unsat_deps(p[0], tpkg['source'], arch, 'testing', None, excluded=nuninst[arch])
01271                     if not r:
01272                         add_nuninst(p[0], arch)
01273 
01274         for arch in self.options.architectures:
01275             if arch not in self.options.nobreakall_arches:
01276                 skip_archall = True
01277             else: skip_archall = False
01278 
01279             nuninst[arch] = []
01280             for pkg_name in self.binaries['testing'][arch][0]:
01281                 pkg = self.binaries['testing'][arch][0][pkg_name]
01282                 if skip_archall and pkg['architecture'] == 'all':
01283                     continue
01284                 r = self.excuse_unsat_deps(pkg_name, pkg['source'], arch, 'testing', None, excluded=nuninst[arch])
01285                 if not r:
01286                     add_nuninst(pkg_name, arch)
01287 
01288         return nuninst
01289 
01290     def eval_nuninst(self, nuninst):
01291         res = []
01292         total = 0
01293         totalbreak = 0
01294         for arch in self.options.architectures:
01295             if nuninst.has_key(arch):
01296                 n = len(nuninst[arch])
01297                 if arch in self.options.break_arches:
01298                     totalbreak = totalbreak + n
01299                 else:
01300                     total = total + n
01301                 res.append("%s-%d" % (arch[0], n))
01302         return "%d+%d: %s" % (total, totalbreak, ":".join(res))
01303 
01304     def eval_uninst(self, nuninst):
01305         res = ""
01306         for arch in self.arches:
01307             if nuninst.has_key(arch) and nuninst[arch] != []:
01308                 res = res + "    * %s: %s\n" % (arch,
01309                     ", ".join(nuninst[arch]))
01310         return res
01311 
01312     def doop_source(self, pkg):
01313 
01314         undo = {'binaries': {}, 'sources': {}}
01315 
01316         affected = []
01317 
01318         # arch = "<source>/<arch>",
01319         if "/" in pkg:
01320             print "NOT HANDLED!"
01321             sys.exit(1)
01322 
01323         # removals = "-<source>",
01324         # normal = "<source>"
01325         else:
01326             if pkg[0] == "-":
01327                 pkg_name = pkg[1:]
01328                 suite = "testing"
01329             elif pkg[0].endswith("_tpu"):
01330                 pkg_name = pkg[:-4]
01331                 suite = "tpu"
01332             else:
01333                 pkg_name = pkg
01334                 suite = "unstable"
01335 
01336             # remove all binary packages (if the source already exists)
01337             if pkg_name in self.sources['testing']:
01338                 source = self.sources['testing'][pkg_name]
01339                 for p in source['binaries']:
01340                     binary, arch = p.split("/")
01341                     undo['binaries'][p] = self.binaries['testing'][arch][0][binary]
01342                     for j in self.binaries['testing'][arch][0][binary]['rdepends']:
01343                         if j not in affected: affected.append((j[0], j[1], j[2], arch))
01344                     del self.binaries['testing'][arch][0][binary]
01345                 undo['sources'][pkg_name] = source
01346                 del self.sources['testing'][pkg_name]
01347 
01348             # add the new binary packages (if we are not removing)
01349             if pkg[0] != "-":
01350                 source = self.sources[suite][pkg_name]
01351                 for p in source['binaries']:
01352                     binary, arch = p.split("/")
01353                     if p not in affected:
01354                         affected.append((binary, None, None, arch))
01355                     if binary in self.binaries['testing'][arch][0]:
01356                         undo['binaries'][p] = self.binaries['testing'][arch][0][binary]
01357                     self.binaries['testing'][arch][0][binary] = self.binaries[suite][arch][0][binary]
01358                     for j in self.binaries['testing'][arch][0][binary]['rdepends']:
01359                         if j not in affected: affected.append((j[0], j[1], j[2], arch))
01360                 self.sources['testing'][pkg_name] = self.sources[suite][pkg_name]
01361 
01362         return (pkg_name, suite, affected, undo)
01363 
01364     def iter_packages(self, packages, output):
01365         extra = []
01366         nuninst_comp = self.get_nuninst()
01367 
01368         while packages:
01369             pkg = packages.pop(0)
01370             output.write("trying: %s\n" % (pkg))
01371 
01372             better = True
01373             nuninst = {}
01374 
01375             pkg_name, suite, affected, undo = self.doop_source(pkg)
01376             broken = []
01377 
01378             for arch in self.options.architectures:
01379                 if arch not in self.options.nobreakall_arches:
01380                     skip_archall = True
01381                 else: skip_archall = False
01382 
01383                 for p in filter(lambda x: x[3] == arch, affected):
01384                     if not self.binaries['testing'][arch][0].has_key(p[0]) or \
01385                        skip_archall and self.binaries['testing'][arch][0][p[0]]['architecture'] == 'all': continue
01386                     r = self.excuse_unsat_deps(p[0], None, arch, 'testing', None, excluded=[])
01387                     if not r and p[0] not in broken: broken.append(p[0])
01388 
01389                 l = 0
01390                 while l < len(broken):
01391                     l = len(broken)
01392                     for j in broken:
01393                         for p in self.binaries['testing'][arch][0][j]['rdepends']:
01394                             if not self.binaries['testing'][arch][0].has_key(p[0]) or \
01395                                skip_archall and self.binaries['testing'][arch][0][p[0]]['architecture'] == 'all': continue
01396                             r = self.excuse_unsat_deps(p[0], None, arch, 'testing', None, excluded=broken)
01397                             if not r and p[0] not in broken: broken.append(p[0])
01398                     
01399                 nuninst[arch] = sorted(broken)
01400                 if len(nuninst[arch]) > 0:
01401                     better = False
01402                     break
01403 
01404             if better:
01405                 self.selected.append(pkg)
01406                 packages.extend(extra)
01407                 extra = []
01408                 nuninst_new = nuninst_comp # FIXME!
01409                 output.write("accepted: %s\n" % (pkg))
01410                 output.write("   ori: %s\n" % (self.eval_nuninst(self.nuninst_orig)))
01411                 output.write("   pre: %s\n" % (self.eval_nuninst(nuninst_comp)))
01412                 output.write("   now: %s\n" % (self.eval_nuninst(nuninst_new)))
01413                 if len(self.selected) <= 20:
01414                     output.write("   all: %s\n" % (" ".join(self.selected)))
01415                 else:
01416                     output.write("  most: (%d) .. %s\n" % (len(self.selected), " ".join(self.selected[-20:])))
01417                 nuninst_comp = nuninst_new
01418             else:
01419                 output.write("skipped: %s (%d <- %d)\n" % (pkg, len(extra), len(packages)))
01420                 output.write("    got: %s\n" % self.eval_nuninst(nuninst))
01421                 output.write("    * %s: %s\n" % (arch, ", ".join(nuninst[arch])))
01422                 extra.append(pkg)
01423 
01424                 # undo the changes (source and new binaries)
01425                 for k in undo['sources'].keys():
01426                     if k in self.sources[suite]:
01427                         for p in self.sources[suite][k]['binaries']:
01428                             binary, arch = p.split("/")
01429                             del self.binaries['testing'][arch][0][binary]
01430                         del self.sources['testing'][k]
01431                     self.sources['testing'][k] = undo['sources'][k]
01432 
01433                 # undo the changes (binaries)
01434                 for p in undo['binaries'].keys():
01435                     binary, arch = p.split("/")
01436                     self.binaries['testing'][arch][0][binary] = undo['binaries'][p]
01437  
01438 
01439     def do_all(self, output):
01440         nuninst_start = self.get_nuninst()
01441         output.write("start: %s\n" % self.eval_nuninst(nuninst_start))
01442         output.write("orig: %s\n" % self.eval_nuninst(nuninst_start))
01443         self.selected = []
01444         self.nuninst_orig = nuninst_start
01445         self.iter_packages(self.upgrade_me, output)
01446 
01447     def upgrade_testing(self):
01448         """Upgrade testing using the unstable packages
01449 
01450         This method tries to upgrade testing using the packages from unstable.
01451         """
01452 
01453         self.__log("Starting the upgrade test", type="I")
01454         output = open(self.options.upgrade_output, 'w')
01455         output.write("Generated on: %s\n" % (time.strftime("%Y.%m.%d %H:%M:%S %z", time.gmtime(time.time()))))
01456         output.write("Arch order is: %s\n" % ", ".join(self.options.architectures))
01457 
01458         # TODO: process hints!
01459         self.do_all(output)
01460 
01461         output.close()
01462         self.__log("Test completed!", type="I")
01463 
01464     def main(self):
01465         """Main method
01466         
01467         This is the entry point for the class: it includes the list of calls
01468         for the member methods which will produce the output files.
01469         """
01470         self.write_excuses()
01471         self.upgrade_testing()
01472 
01473 if __name__ == '__main__':
01474     Britney().main()

Generated on Sat Jul 22 09:29:59 2006 for briteny by  doxygen 1.4.7