britney.py

#!/usr/bin/env python2.4
# -*- coding: utf-8 -*-

# Copyright (C) 2001-2004 Anthony Towns <ajt@debian.org>
#                         Andreas Barth <aba@debian.org>
#                         Fabio Tranchitella <kobold@debian.org>

# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.

# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.

"""
= Introduction =

This is the Debian testing updater script, also known as "Britney".

Packages are usually installed into the `testing' distribution after
they have undergone some degree of testing in unstable. The goal of
this software is to do this task in a smart way, keeping testing
always fully installable and close to being a release candidate.

The Britney source code is split into two different but related tasks:
the first one is the generation of the update excuses, while the
second tries to update testing with the valid candidates; first
each package alone, then progressively larger sets of packages
together. Each attempt is accepted if testing is not more uninstallable
after the update than before.

= Data Loading =

In order to analyze the entire Debian distribution, Britney needs to
load the whole archive into memory: this means more than 10,000 packages
for twelve architectures, as well as the dependency interconnections
between them. For this reason, the memory requirements for running this
software are quite high, and at least 1 gigabyte of RAM should be available.

Britney loads the source packages from the `Sources' file and the binary
packages from the `Packages_${arch}' files, where ${arch} is substituted
with the supported architectures. While loading the data, the software
analyzes the dependencies and builds a directed weighted graph in memory
with all the interconnections between the packages (see Britney.read_sources
and Britney.read_binaries).

Besides the source and binary packages, Britney loads the following data:

  * Bugs, which contains the count of release-critical bugs for a given
    version of a source package (see Britney.read_bugs).

  * Dates, which contains the date of the upload of a given version
    of a source package (see Britney.read_dates).

  * Urgencies, which contains the urgency of the upload of a given
    version of a source package (see Britney.read_urgencies).

  * Approvals, which contains the list of approved testing-proposed-updates
    packages (see Britney.read_approvals).

  * Hints, which contains lists of commands which modify the standard behaviour
    of Britney (see Britney.read_hints).

For a more detailed explanation of the format of these files, please read
the documentation of the related methods. Their exact meaning will instead
be explained in the chapter "Excuses Generation".

= Excuses =

An excuse is a detailed explanation of why a package can or cannot
be updated in the testing distribution from a newer package in
another distribution (such as unstable). The main purpose of the
excuses is to be written to an HTML file which will be published
over HTTP. Maintainers can then inspect it, manually or automatically,
to find out why their packages have or have not been updated.

== Excuses generation ==

These are the steps (with references to method names) that Britney
follows to generate the update excuses.

 * If a source package is available in testing but it is not
   present in unstable and no binary packages in unstable are
   built from it, then it is marked for removal.

 * Every source package in unstable and testing-proposed-updates,
   if already present in testing, is checked for binary-NMUs, new
   or dropped binary packages in all the supported architectures
   (see Britney.should_upgrade_srcarch). The steps to detect if an
   upgrade is needed are:

    1. If there is a `remove' hint for the source package, the package
       is ignored: it will be removed and not updated.

    2. For every binary package built from the new source, it checks
       for unsatisfied dependencies, new binary packages and updated
       binary packages (binNMUs), excluding the architecture-independent
       ones and the packages not built from the same source.

    3. For every binary package built from the old source, it checks
       if it is still built from the new source; if this is not true
       and the package is not architecture-independent, the script
       removes it from testing.

    4. Finally, if there is something worth doing (e.g. a new or updated
       binary package) and nothing wrong, it marks the source package
       as "Valid candidate", or "Not considered" if there is something
       wrong which prevented the update.

 * Every source package in unstable and testing-proposed-updates is
   checked for upgrade (see Britney.should_upgrade_src). The steps
   to detect if an upgrade is needed are:

    1. If the source package in testing is more recent, the new one
       is ignored.

    2. If the source package doesn't exist (is fake), which means that
       a binary package refers to it but it is not present in the
       `Sources' file, the new one is ignored.

    3. If the package doesn't exist in testing, the urgency of the
       upload is ignored and set to the default (currently `low').

    4. If there is a `remove' hint for the source package, the package
       is ignored: it will be removed and not updated.

    5. If there is a `block' hint for the source package, or a
       `block-all source' hint, and no matching `unblock' hint, the
       package is ignored.

    6. If the suite is unstable, the update can go ahead only if the
       upload happened more than the minimum number of days ago that
       is specified by the urgency of the upload; if this is not true,
       the package is ignored as `too-young'. Note that the urgency is
       sticky, meaning that the highest urgency uploaded since the
       previous testing transition is taken into account.

    7. All the architecture-dependent binary packages and the
       architecture-independent ones for the `nobreakall' architectures
       have to be built from the source we are considering. If this is
       not true, then these are called `out-of-date' architectures and
       the package is ignored.

    8. The source package must have at least one binary package,
       otherwise it is ignored.

    9. If the suite is unstable, the count of release-critical bugs for
       the new source package must be less than the count for the testing
       one. If this is not true, the package is ignored as `buggy'.

   10. If there is a `force' hint for the source package, then it is
       updated even if it was marked as ignored by the previous steps.

   11. If the suite is testing-proposed-updates, the source package can
       be updated only if there is an explicit approval for it.

   12. Unless the package has been marked as ignored by the previous
       steps, mark it as "Valid candidate"; otherwise mark it as
       "Not considered".

 * The list of `remove' hints is processed: if the requested source
   package is not already being updated or removed and the version
   actually in testing is the same specified with the `remove' hint,
   it is marked for removal.

 * The excuses are sorted by the number of days from the last upload
   (days-old) and by name.

 * A list of unconsidered excuses (for which the package is not upgraded)
   is built. Using this list, all the excuses depending on them are marked
   as invalid for an "impossible dependency".

 * The excuses are written to an HTML file.
"""
00177 
import os
import re
import sys
import string
import time
import optparse

import apt_pkg

from excuse import Excuse

__author__ = 'Fabio Tranchitella'
__version__ = '2.0.alpha1'


class Britney:
    """Britney, the Debian testing updater script

    This is the script that updates the testing distribution. It is executed
    each day after the installation of the updated packages. It generates the
    `Packages' files for the testing distribution, but it does so in an
    intelligent manner; it tries to avoid any inconsistency and to use only
    non-buggy packages.

    For more documentation on this script, please read the Developers Reference.
    """

    HINTS_STANDARD = ("easy", "hint", "remove", "block", "unblock", "urgent", "approve")
    HINTS_ALL = ("force", "force-hint", "block-all") + HINTS_STANDARD

    def __init__(self):
        """Class constructor

        This method initializes and populates the data lists, which contain all
        the information needed by the other methods of the class.
        """
        # the number of days since the epoch, shifted back by 15 hours
        self.date_now = int(((time.time() / (60*60)) - 15) / 24)

        # parse the command line arguments
        self.__parse_arguments()

        # initialize the apt_pkg back-end
        apt_pkg.init()

        # read the source and binary packages for the involved distributions
        self.sources = {'testing': self.read_sources(self.options.testing),
                        'unstable': self.read_sources(self.options.unstable),
                        'tpu': self.read_sources(self.options.tpu),}
        self.binaries = {'testing': {}, 'unstable': {}, 'tpu': {}}
        for arch in self.options.architectures:
            self.binaries['testing'][arch] = self.read_binaries(self.options.testing, "testing", arch)
            self.binaries['unstable'][arch] = self.read_binaries(self.options.unstable, "unstable", arch)
            self.binaries['tpu'][arch] = self.read_binaries(self.options.tpu, "tpu", arch)

        # read the release-critical bug summaries for testing and unstable
        self.bugs = {'unstable': self.read_bugs(self.options.unstable),
                     'testing': self.read_bugs(self.options.testing),}
        self.normalize_bugs()

        # read additional data
        self.dates = self.read_dates(self.options.testing)
        self.urgencies = self.read_urgencies(self.options.testing)
        self.approvals = self.read_approvals(self.options.tpu)
        self.hints = self.read_hints(self.options.unstable)
        self.excuses = []

    def __parse_arguments(self):
        """Parse the command line arguments

        This method parses and initializes the command line arguments.
        While doing so, it preprocesses some of the options, converting
        them into a form suitable for the other methods of the class.
        """
        # initialize the parser
        self.parser = optparse.OptionParser(version="%prog")
        self.parser.add_option("-v", "", action="count", dest="verbose", help="enable verbose output")
        self.parser.add_option("-c", "--config", action="store", dest="config",
                          default="/etc/britney.conf", help="path for the configuration file")
        (self.options, self.args) = self.parser.parse_args()

        # exit with an error if the configuration file does not exist
        if not os.path.isfile(self.options.config):
            self.__log("Unable to read the configuration file (%s), exiting!" % self.options.config, type="E")
            sys.exit(1)

        # minimum days for unstable-testing transition and the list of hints
        # are handled as an ad-hoc case
        self.MINDAYS = {}
        self.HINTS = {}
        for k, v in [map(string.strip,r.split('=', 1)) for r in file(self.options.config) if '=' in r and not r.strip().startswith('#')]:
            if k.startswith("MINDAYS_"):
                self.MINDAYS[k.split("_")[1].lower()] = int(v)
            elif k.startswith("HINTS_"):
                self.HINTS[k.split("_")[1].lower()] = \
                    reduce(lambda x,y: x+y, [hasattr(self, "HINTS_" + i) and getattr(self, "HINTS_" + i) or (i,) for i in v.split()])
            else:
                setattr(self.options, k.lower(), v)

        # sort the architecture list: `nobreakall' architectures first,
        # then the regular ones, then `fucked' and `break' architectures
        allarches = sorted(self.options.architectures.split())
        arches = [x for x in allarches if x in self.options.nobreakall_arches]
        arches += [x for x in allarches if x not in arches and x not in self.options.fucked_arches]
        arches += [x for x in allarches if x not in arches and x not in self.options.break_arches]
        arches += [x for x in allarches if x not in arches]
        self.options.architectures = arches

    def __log(self, msg, type="I"):
        """Print info messages according to verbosity level

        An easy-and-simple log method which prints messages to the standard
        output. The type parameter controls the urgency of the message, and
        can be equal to `I' for `Information', `W' for `Warning' and `E' for
        `Error'. Warnings and errors are always printed; informational
        messages are printed only if verbose logging is enabled.
        """
        if self.options.verbose or type in ("E", "W"):
            print "%s: [%s] - %s" % (type, time.asctime(), msg)

    # Data reading/writing methods
    # ----------------------------

    def read_sources(self, basedir):
        """Read the list of source packages from the specified directory

        The source packages are read from the `Sources' file within the
        directory specified as the `basedir' parameter. Considering the
        large amount of memory needed, not all the fields are loaded
        in memory. The available fields are Version, Maintainer and Section.

        The method returns a dictionary which maps every source package
        name to a dictionary of its attributes.
        """
        sources = {}
        package = None
        filename = os.path.join(basedir, "Sources")
        self.__log("Loading source packages from %s" % filename)
        packages = apt_pkg.ParseTagFile(open(filename))
        while packages.Step():
            pkg = packages.Section.get('Package')
            sources[pkg] = {'binaries': [],
                            'version': packages.Section.get('Version'),
                            'maintainer': packages.Section.get('Maintainer'),
                            'section': packages.Section.get('Section'),
                            }
        return sources

    def read_binaries(self, basedir, distribution, arch):
        """Read the list of binary packages from the specified directory

        The binary packages are read from the `Packages_${arch}' files
        within the directory specified as the `basedir' parameter, replacing
        ${arch} with the value of the arch parameter. Considering the
        large amount of memory needed, not all the fields are loaded
        in memory. The available fields are Version, Source, Pre-Depends,
        Depends, Conflicts, Provides and Architecture.

        After reading the packages, reverse dependencies are computed
        and saved in the `rdepends' keys, and the `Provides' field is
        used to populate the virtual packages list.

        The dependencies are parsed with the apt_pkg.ParseDepends method,
        and they are stored both in parsed format and as text.

        The method returns a tuple. The first element is a dictionary which
        maps every binary package name to a dictionary of its attributes;
        the second element is a dictionary which maps every virtual package
        to the list of real packages that provide it.
        """

        packages = {}
        provides = {}
        package = None
        filename = os.path.join(basedir, "Packages_%s" % arch)
        self.__log("Loading binary packages from %s" % filename)
        Packages = apt_pkg.ParseTagFile(open(filename))
        while Packages.Step():
            pkg = Packages.Section.get('Package')
            version = Packages.Section.get('Version')
            dpkg = {'rdepends': [],
                    'version': version,
                    'source': pkg,
                    'source-ver': version,
                    'pre-depends': Packages.Section.get('Pre-Depends'),
                    'depends': Packages.Section.get('Depends'),
                    'conflicts': Packages.Section.get('Conflicts'),
                    'provides': Packages.Section.get('Provides'),
                    'architecture': Packages.Section.get('Architecture'),
                    }

            # retrieve the name and the version of the source package
            source = Packages.Section.get('Source')
            if source:
                dpkg['source'] = source.split(" ")[0]
                if "(" in source:
                    dpkg['source-ver'] = source.split("(")[1].split(")")[0]

            # if the source package is available in the distribution, then register this binary package
            if dpkg['source'] in self.sources[distribution]:
                self.sources[distribution][dpkg['source']]['binaries'].append(pkg + "/" + arch)
            # if the source package doesn't exist, create a fake one
            else:
                self.sources[distribution][dpkg['source']] = {'binaries': [pkg + "/" + arch],
                    'version': dpkg['source-ver'], 'maintainer': None, 'section': None, 'fake': True}

            # register virtual packages and real packages that provide them
            if dpkg['provides']:
                parts = map(string.strip, dpkg['provides'].split(","))
                for p in parts:
                    try:
                        provides[p].append(pkg)
                    except KeyError:
                        provides[p] = [pkg]
            del dpkg['provides']

            # append the resulting dictionary to the package list
            packages[pkg] = dpkg

        # loop again on the list of packages to register reverse dependencies
        for pkg in packages:
            dependencies = []

            # analyze dependencies
            if packages[pkg]['depends']:
                packages[pkg]['depends-txt'] = packages[pkg]['depends']
                packages[pkg]['depends'] = apt_pkg.ParseDepends(packages[pkg]['depends'])
                dependencies.extend(packages[pkg]['depends'])

            # analyze pre-dependencies
            if packages[pkg]['pre-depends']:
                packages[pkg]['pre-depends-txt'] = packages[pkg]['pre-depends']
                packages[pkg]['pre-depends'] = apt_pkg.ParseDepends(packages[pkg]['pre-depends'])
                dependencies.extend(packages[pkg]['pre-depends'])

            # register this package in the rdepends list of each of its dependencies
            for p in dependencies:
                for a in p:
                    if a[0] not in packages: continue
                    packages[a[0]]['rdepends'].append((pkg, a[1], a[2]))

        # return a tuple with the list of real and virtual packages
        return (packages, provides)

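The reverse-dependency loop above indexes the parsed dependencies as a[0], a[1], a[2]. As a rough sketch of the shape apt_pkg.ParseDepends returns (a list of or-groups, each a list of (name, version, relation) tuples), here is a simplified stand-in parser; it is illustrative only, and the real apt_pkg function is far more robust:

```python
def parse_depends_sketch(depends):
    """Hypothetical stand-in for apt_pkg.ParseDepends: parse a Depends
    string into a list of or-groups of (name, version, relation) tuples."""
    groups = []
    for group in depends.split(","):
        alternatives = []
        for alt in group.split("|"):
            alt = alt.strip()
            if "(" in alt:
                # e.g. "libc6 (>= 2.3.5)" -> ("libc6", "2.3.5", ">=")
                name, rest = alt.split("(", 1)
                op, version = rest.rstrip(")").split()
                alternatives.append((name.strip(), version, op))
            else:
                # an unversioned dependency has empty version and relation
                alternatives.append((alt, "", ""))
        groups.append(alternatives)
    return groups
```

Each inner list is a set of alternatives ("|"), any one of which satisfies the outer and-group (",") — which is why get_dependency_solvers below works on one block (inner list) at a time.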
    def read_bugs(self, basedir):
        """Read the release-critical bug summary from the specified directory

        The RC bug summaries are read from the `Bugs' file within the
        directory specified as the `basedir' parameter. The file contains
        rows with the format:

        <package-name> <count-of-rc-bugs>

        The method returns a dictionary where the key is the binary package
        name and the value is the number of open RC bugs for it.
        """
        bugs = {}
        filename = os.path.join(basedir, "Bugs")
        self.__log("Loading RC bugs count from %s" % filename)
        for line in open(filename):
            l = line.strip().split()
            if len(l) != 2: continue
            try:
                bugs[l[0]] = int(l[1])
            except ValueError:
                self.__log("Bugs, unable to parse \"%s\"" % line, type="E")
        return bugs

    def __maxver(self, pkg, dist):
        """Return the maximum version for a given package name

        This method returns None if the specified source package
        is not available in the `dist' distribution. If the package
        exists, then it returns the maximum version between the
        source package and its binary packages.
        """
        maxver = None
        if self.sources[dist].has_key(pkg):
            maxver = self.sources[dist][pkg]['version']
        for arch in self.options.architectures:
            if not self.binaries[dist][arch][0].has_key(pkg): continue
            pkgv = self.binaries[dist][arch][0][pkg]['version']
            if maxver == None or apt_pkg.VersionCompare(pkgv, maxver) > 0:
                maxver = pkgv
        return maxver

    def normalize_bugs(self):
        """Normalize the release-critical bug summaries for testing and unstable

        The method doesn't return any value: it directly modifies the
        object attribute `bugs'.
        """
        # loop on all the package names from testing and unstable bug summaries
        for pkg in set(self.bugs['testing'].keys() + self.bugs['unstable'].keys()):

            # make sure that the key is present in both dictionaries
            if not self.bugs['testing'].has_key(pkg):
                self.bugs['testing'][pkg] = 0
            elif not self.bugs['unstable'].has_key(pkg):
                self.bugs['unstable'][pkg] = 0

            # retrieve the maximum version of the package in testing
            maxvert = self.__maxver(pkg, 'testing')

            # if the package is not available in testing or it has the
            # same RC bug count, then do nothing
            if maxvert == None or \
               self.bugs['testing'][pkg] == self.bugs['unstable'][pkg]:
                continue

            # retrieve the maximum version of the package in unstable
            maxveru = self.__maxver(pkg, 'unstable')

            # if the package is not available in unstable, then do nothing
            if maxveru == None:
                continue
            # else if the testing version is at least as recent as the
            # unstable one, then use the unstable RC bug count for
            # testing, too
            elif apt_pkg.VersionCompare(maxvert, maxveru) >= 0:
                self.bugs['testing'][pkg] = self.bugs['unstable'][pkg]

    def read_dates(self, basedir):
        """Read the upload date for the packages from the specified directory

        The upload dates are read from the `Dates' file within the directory
        specified as the `basedir' parameter. The file contains rows with the
        format:

        <package-name> <version> <date-of-upload>

        The dates are expressed as the number of days since 1970-01-01.

        The method returns a dictionary where the key is the source package
        name and the value is a tuple with two items: the version and the date.
        """
        dates = {}
        filename = os.path.join(basedir, "Dates")
        self.__log("Loading upload data from %s" % filename)
        for line in open(filename):
            l = line.strip().split()
            if len(l) != 3: continue
            try:
                dates[l[0]] = (l[1], int(l[2]))
            except ValueError:
                self.__log("Dates, unable to parse \"%s\"" % line, type="E")
        return dates

    def read_urgencies(self, basedir):
        """Read the upload urgency of the packages from the specified directory

        The upload urgencies are read from the `Urgency' file within the
        directory specified as the `basedir' parameter. The file contains rows
        with the format:

        <package-name> <version> <urgency>

        The method returns a dictionary where the key is the source package
        name and the value is the greatest urgency among the versions of the
        package that are higher than the testing one.
        """

        urgencies = {}
        filename = os.path.join(basedir, "Urgency")
        self.__log("Loading upload urgencies from %s" % filename)
        for line in open(filename):
            l = line.strip().split()
            if len(l) != 3: continue

            # read the minimum days associated with the urgencies
            urgency_old = urgencies.get(l[0], self.options.default_urgency)
            mindays_old = self.MINDAYS.get(urgency_old, self.MINDAYS[self.options.default_urgency])
            mindays_new = self.MINDAYS.get(l[2], self.MINDAYS[self.options.default_urgency])

            # if the new urgency is lower (so the min days are higher), do nothing
            if mindays_old <= mindays_new:
                continue

            # if the package exists in testing and it is more recent, do nothing
            tsrcv = self.sources['testing'].get(l[0], None)
            if tsrcv and apt_pkg.VersionCompare(tsrcv['version'], l[1]) >= 0:
                continue

            # if the package doesn't exist in unstable or it is older, do nothing
            usrcv = self.sources['unstable'].get(l[0], None)
            if not usrcv or apt_pkg.VersionCompare(usrcv['version'], l[1]) < 0:
                continue

            # update the urgency for the package
            urgencies[l[0]] = l[2]

        return urgencies

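The "sticky urgency" rule implemented above (a new urgency only wins if it shortens the waiting period, so the highest urgency since the last testing transition is kept) can be sketched on its own. The MINDAYS values below are illustrative only; the real values come from the britney.conf configuration:

```python
# Illustrative minimum ages per urgency level; the real mapping is read
# from the MINDAYS_* keys of britney.conf.
MINDAYS = {'low': 10, 'medium': 5, 'high': 2}

def sticky_urgency(uploads, default='low'):
    """Return the urgency with the lowest minimum age among `uploads',
    falling back to the default urgency for unknown levels."""
    urgency = default
    for new in uploads:
        # a new urgency only replaces the old one if its waiting period
        # is strictly shorter (mirrors the mindays_old <= mindays_new test)
        if MINDAYS.get(new, MINDAYS[default]) < MINDAYS.get(urgency, MINDAYS[default]):
            urgency = new
    return urgency
```

So a package uploaded first with urgency low and later with urgency high keeps the 2-day minimum age even if a subsequent upload is again low.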
    def read_approvals(self, basedir):
        """Read the approval commands from the specified directory

        The approval commands are read from the files within the `Approved'
        directory inside the directory specified as the `basedir' parameter.
        Each file must be named after the authorized approver it belongs to.

        The files contain rows with the format:

        <package-name> <version>

        The method returns a dictionary where the key is the source package
        name followed by an underscore and the version number, and the value
        is the user who submitted the command.
        """
        approvals = {}
        for approver in self.options.approvers.split():
            filename = os.path.join(basedir, "Approved", approver)
            self.__log("Loading approvals list from %s" % filename)
            for line in open(filename):
                l = line.strip().split()
                if len(l) != 2: continue
                approvals["%s_%s" % (l[0], l[1])] = approver
        return approvals

    def read_hints(self, basedir):
        """Read the hint commands from the specified directory

        The hint commands are read from the files within the `Hints'
        directory inside the directory specified as the `basedir' parameter.
        Each file must be named after the authorized user it belongs to.

        The files contain rows with the format:

        <command> <package-name>[/<version>]

        The method returns a dictionary where the key is the command, and
        the value is the list of affected packages.
        """
        hints = dict([(k,[]) for k in self.HINTS_ALL])

        for who in self.HINTS.keys():
            filename = os.path.join(basedir, "Hints", who)
            self.__log("Loading hints list from %s" % filename)
            for line in open(filename):
                line = line.strip()
                if line == "": continue
                l = line.split()
                if l[0] == 'finished':
                    break
                elif l[0] not in self.HINTS[who]:
                    continue
                elif l[0] in ["easy", "hint", "force-hint"]:
                    hints[l[0]].append((who, [k.split("/") for k in l if "/" in k]))
                elif l[0] in ["block-all"]:
                    hints[l[0]].extend([(y, who) for y in l[1:]])
                elif l[0] in ["block"]:
                    hints[l[0]].extend([(y, who) for y in l[1:]])
                elif l[0] in ["remove", "approve", "unblock", "force", "urgent"]:
                    hints[l[0]].extend([(k.split("/")[0], (k.split("/")[1], who)) for k in l if "/" in k])

        # for the single-package commands, keep only the last hint for
        # every package, warning about the overridden ones
        for x in ["block", "block-all", "unblock", "force", "urgent", "remove"]:
            z = {}
            for a, b in hints[x]:
                if z.has_key(a):
                    self.__log("Overriding %s[%s] = %s with %s" % (x, a, z[a], b), type="W")
                z[a] = b
            hints[x] = z

        return hints

00642     # Utility methods for package analysis
00643     # ------------------------------------
00644 
00645     def same_source(self, sv1, sv2):
00646         """Check if two version numbers are built from the same source
00647 
00648         This method returns a boolean value which is true if the two
00649         version numbers specified as parameters are built from the same
00650         source. The main use of this code is to detect binary-NMUs.
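        As an illustrative sketch (not this method itself), the first
        normalization step strips a binary-NMU suffix such as `+b1' before
        comparing:

```python
import re

def strip_binnmu(version):
    # drop a binary-NMU suffix like "+b1", if present (helper name is
    # hypothetical, for illustration only)
    m = re.match(r'^(.*)\+b\d+$', version)
    if m:
        return m.group(1)
    return version

# "1.2-3" and "1.2-3+b2" normalize to the same string, so they are
# treated as built from the same source
```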
00651         """
00652         if sv1 == sv2:
00653             return 1
00654 
00655         m = re.match(r'^(.*)\+b\d+$', sv1)
00656         if m: sv1 = m.group(1)
00657         m = re.match(r'^(.*)\+b\d+$', sv2)
00658         if m: sv2 = m.group(1)
00659 
00660         if sv1 == sv2:
00661             return 1
00662 
00663         if re.search("-", sv1) or re.search("-", sv2):
00664             m = re.match(r'^(.*-[^.]+)\.0\.\d+$', sv1)
00665             if m: sv1 = m.group(1)
00666             m = re.match(r'^(.*-[^.]+\.[^.]+)\.\d+$', sv1)
00667             if m: sv1 = m.group(1)
00668 
00669             m = re.match(r'^(.*-[^.]+)\.0\.\d+$', sv2)
00670             if m: sv2 = m.group(1)
00671             m = re.match(r'^(.*-[^.]+\.[^.]+)\.\d+$', sv2)
00672             if m: sv2 = m.group(1)
00673 
00674             return (sv1 == sv2)
00675         else:
00676             m = re.match(r'^([^-]+)\.0\.\d+$', sv1)
00677             if m and sv2 == m.group(1): return 1
00678 
00679             m = re.match(r'^([^-]+)\.0\.\d+$', sv2)
00680             if m and sv1 == m.group(1): return 1
00681 
00682             return 0
00683 
00684     def get_dependency_solvers(self, block, arch, distribution):
00685         """Find the packages which satisfy a dependency block
00686 
00687         This method returns the list of packages which satisfy a dependency
00688         block (as returned by apt_pkg.ParseDepends) for the given architecture
00689         and distribution.
00690 
00691         It returns a tuple with two items: the first is a boolean which is
00692         True if the dependency is satisfied, the second is the list of the
00693         solving packages.
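        A minimal sketch of the matching logic, using a toy string-equality
        check in place of apt_pkg.CheckDep (which implements real Debian
        version comparison) and a hypothetical {name: version} table:

```python
def solve_block(block, versions):
    # block: list of (name, version, op) tuples, the shape produced by
    # apt_pkg.ParseDepends for one OR-group of a dependency field
    # versions: toy {package-name: version} table for one architecture
    solvers = []
    for name, version, op in block:
        if name not in versions:
            continue
        # an unversioned dependency matches any version; otherwise compare
        # (the real code uses apt_pkg.CheckDep for Debian version semantics)
        if (op == '' and version == '') or versions[name] == version:
            solvers.append(name)
    return (len(solvers) > 0, solvers)
```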
00694         """
00695 
00696         packages = []
00697 
00698         # for every package, version and operation in the block
00699         for name, version, op in block:
00700             # look for the package in unstable
00701             if name in self.binaries[distribution][arch][0]:
00702                 package = self.binaries[distribution][arch][0][name]
00703                 # check the versioned dependency (if present)
00704                 if op == '' and version == '' or apt_pkg.CheckDep(package['version'], op, version):
00705                     packages.append(name)
00706 
00707             # look for the package in the virtual packages list
00708             if name in self.binaries[distribution][arch][1]:
00709                 # loop on the list of packages which provides it
00710                 for prov in self.binaries[distribution][arch][1][name]:
00711                     package = self.binaries[distribution][arch][0][prov]
00712                     # check the versioned dependency (if present)
00713                     # TODO: this is forbidden by the debian policy, which says that versioned
00714                     #       dependencies on virtual packages are never satisfied. The old britney
00715                     #       does it and we have to go with it, but at least a warning should be raised.
00716                     if op == '' and version == '' or apt_pkg.CheckDep(package['version'], op, version):
00717                         packages.append(prov)
00718                         break
00719 
00720         return (len(packages) > 0, packages)
00721 
00722     def excuse_unsat_deps(self, pkg, src, arch, suite, excuse):
00723         """Find unsatisfied dependencies for a binary package
00724 
00725         This method analyzes the dependencies of the binary package specified
00726         by the parameter `pkg', built from the source package `src', for the
00727         architecture `arch' within the suite `suite'. If the dependency can't
00728         be satisfied in testing and/or unstable, it updates the excuse passed
00729         as parameter.
00730 
00731         The dependency fields checked are Pre-Depends and Depends.
00732         """
00733         # retrieve the binary package from the specified suite and arch
00734         binary_u = self.binaries[suite][arch][0][pkg]
00735 
00736         # analyze the dependency fields (if present)
00737         for type in ('Pre-Depends', 'Depends'):
00738             type_key = type.lower()
00739             if not binary_u[type_key]:
00740                 continue
00741 
00742             # this list will contain the packages that satisfy the dependency
00743             packages = []
00744 
00745             # for every dependency block (each block is a conjunction of disjunctions)
00746             for block, block_txt in map(None, binary_u[type_key], binary_u[type_key + '-txt'].split(',')):
00747                 # if the block is satisfied in testing, then skip the block
00748                 solved, packages = self.get_dependency_solvers(block, arch, 'testing')
00749                 if solved: continue
00750 
00751                 # check if the block can be satisfied in unstable, and list the solving packages
00752                 solved, packages = self.get_dependency_solvers(block, arch, suite)
00753                 packages = [self.binaries[suite][arch][0][p]['source'] for p in packages]
00754 
00755                 # if the dependency can be satisfied by the same source package, skip the block:
00756                 # obviously both binary packages will enter testing together
00757                 if src in packages: continue
00758 
00759                 # if no package can satisfy the dependency, add this information to the excuse
00760                 if len(packages) == 0:
00761                     excuse.addhtml("%s/%s unsatisfiable %s: %s" % (pkg, arch, type, block_txt.strip()))
00762 
00763                 # for the solving packages, update the excuse to add the dependencies
00764                 for p in packages:
00765                     if arch not in self.options.break_arches.split():
00766                         excuse.add_dep(p)
00767                     else:
00768                         excuse.add_break_dep(p, arch)
00769 
00770     # Package analysis methods
00771     # ------------------------
00772 
00773     def should_remove_source(self, pkg):
00774         """Check if a source package should be removed from testing
00775         
00776         This method checks if a source package should be removed from the
00777         testing distribution; this happens if the source package is not
00778         present in the unstable distribution anymore.
00779 
00780         It returns True if the package can be removed, False otherwise.
00781         In the former case, a new excuse is appended to the object
00782         attribute excuses.
00783         """
00784         # if the source package is available in unstable, then do nothing
00785         if self.sources['unstable'].has_key(pkg):
00786             return False
00787         # otherwise, add a new excuse for its removal and return True
00788         src = self.sources['testing'][pkg]
00789         excuse = Excuse("-" + pkg)
00790         excuse.set_vers(src['version'], None)
00791         src['maintainer'] and excuse.set_maint(src['maintainer'].strip())
00792         src['section'] and excuse.set_section(src['section'].strip())
00793         excuse.addhtml("Valid candidate")
00794         self.excuses.append(excuse)
00795         return True
00796 
00797     def should_upgrade_srcarch(self, src, arch, suite):
00798         """Check if binary package should be upgraded
00799 
00800         This method checks if a binary package should be upgraded; this can
00801         also happen if the binary package is a binary-NMU for the given arch.
00802         The analysis is performed for the source package specified by the
00803         `src' parameter, checking the architecture `arch' for the distribution
00804         `suite'.
00805        
00806         It returns False if the given package doesn't need to be upgraded,
00807         True otherwise. In the former case, a new excuse is appended to
00808         the object attribute excuses.
00809         """
00810         # retrieve the source packages for testing and suite
00811         source_t = self.sources['testing'][src]
00812         source_u = self.sources[suite][src]
00813 
00814         # build the common part of the excuse, which will be filled by the code below
00815         ref = "%s/%s%s" % (src, arch, suite != 'unstable' and "_" + suite or "")
00816         excuse = Excuse(ref)
00817         excuse.set_vers(source_t['version'], source_t['version'])
00818         source_u['maintainer'] and excuse.set_maint(source_u['maintainer'].strip())
00819         source_u['section'] and excuse.set_section(source_u['section'].strip())
00820         
00821         # if there is a `remove' hint and the requested version is the same as the
00822         # version in testing, then stop here and return False
00823         if self.hints["remove"].has_key(src) and \
00824            self.same_source(source_t['version'], self.hints["remove"][src][0]):
00825             excuse.addhtml("Removal request by %s" % (self.hints["remove"][src][1]))
00826             excuse.addhtml("Trying to remove package, not update it")
00827             excuse.addhtml("Not considered")
00828             self.excuses.append(excuse)
00829             return False
00830 
00831         # the starting point is that there is nothing wrong and nothing worth doing
00832         anywrongver = False
00833         anyworthdoing = False
00834 
00835         # for every binary package produced by this source in unstable for this architecture
00836         for pkg in sorted(filter(lambda x: x.endswith("/" + arch), source_u['binaries'])):
00837             pkg_name = pkg.split("/")[0]
00838 
00839             # retrieve the testing (if present) and unstable corresponding binary packages
00840             binary_t = pkg in source_t['binaries'] and self.binaries['testing'][arch][0][pkg_name] or None
00841             binary_u = self.binaries[suite][arch][0][pkg_name]
00842 
00843             # this is the source version for the new binary package
00844             pkgsv = self.binaries[suite][arch][0][pkg_name]['source-ver']
00845 
00846             # if the new binary package is architecture-independent, then skip it
00847             if binary_u['architecture'] == 'all':
00848                 excuse.addhtml("Ignoring %s %s (from %s) as it is arch: all" % (pkg_name, binary_u['version'], pkgsv))
00849                 continue
00850 
00851             # if the new binary package is not from the same source as the testing one, then skip it
00852             if not self.same_source(source_t['version'], pkgsv):
00853                 anywrongver = True
00854                 excuse.addhtml("From wrong source: %s %s (%s not %s)" % (pkg_name, binary_u['version'], pkgsv, source_t['version']))
00855                 break
00856 
00857             # find unsatisfied dependencies for the new binary package
00858             self.excuse_unsat_deps(pkg_name, src, arch, suite, excuse)
00859 
00860             # if the binary is not present in testing, then it is a new binary;
00861             # in this case, there is something worth doing
00862             if not binary_t:
00863                 excuse.addhtml("New binary: %s (%s)" % (pkg_name, binary_u['version']))
00864                 anyworthdoing = True
00865                 continue
00866 
00867             # at this point, the binary package is present in testing, so we can compare
00868             # the versions of the packages ...
00869             vcompare = apt_pkg.VersionCompare(binary_t['version'], binary_u['version'])
00870 
00871             # ... if updating would mean downgrading, then stop here: there is something wrong
00872             if vcompare > 0:
00873                 anywrongver = True
00874                 excuse.addhtml("Not downgrading: %s (%s to %s)" % (pkg_name, binary_t['version'], binary_u['version']))
00875                 break
00876             # ... if updating would mean upgrading, then there is something worth doing
00877             elif vcompare < 0:
00878                 excuse.addhtml("Updated binary: %s (%s to %s)" % (pkg_name, binary_t['version'], binary_u['version']))
00879                 anyworthdoing = True
00880 
00881         # if there is nothing wrong and (there is something worth doing or the
00882         # source package is fake), then check which binary packages should be removed
00883         if not anywrongver and (anyworthdoing or self.sources[suite][src].has_key('fake')):
00884             srcv = self.sources[suite][src]['version']
00885             ssrc = self.same_source(source_t['version'], srcv)
00886             # for every binary package produced by this source in testing for this architecture
00887             for pkg in sorted([x.split("/")[0] for x in self.sources['testing'][src]['binaries'] if x.endswith("/"+arch)]):
00888                 # if the package is architecture-independent, then ignore it
00889                 if self.binaries['testing'][arch][0][pkg]['architecture'] == 'all':
00890                     excuse.addhtml("Ignoring removal of %s as it is arch: all" % (pkg))
00891                     continue
00892                 # if the package is not produced by the new source package, then remove it from testing
00893                 if not self.binaries[suite][arch][0].has_key(pkg):
00894                     tpkgv = self.binaries['testing'][arch][0][pkg]['version']
00895                     excuse.addhtml("Removed binary: %s %s" % (pkg, tpkgv))
00896                     if ssrc: anyworthdoing = True
00897 
00898         # if there is nothing wrong and there is something worth doing, this is a valid candidate
00899         if not anywrongver and anyworthdoing:
00900             excuse.addhtml("Valid candidate")
00901             self.excuses.append(excuse)
00902         # else if there is something worth doing (but something wrong, too) this package won't be considered
00903         elif anyworthdoing:
00904             excuse.addhtml("Not considered")
00905             self.excuses.append(excuse)
00906             return False
00907 
00908         # otherwise, return True
00909         return True
00910 
00911     def should_upgrade_src(self, src, suite):
00912         """Check if source package should be upgraded
00913 
00914         This method checks if a source package should be upgraded. The analysis
00915         is performed for the source package specified by the `src' parameter,
00916         for the distribution `suite'.
00917        
00918         It returns False if the given package doesn't need to be upgraded,
00919         True otherwise. In the former case, a new excuse is appended to
00920         the object attribute excuses.
00921         """
00922 
00923         # retrieve the source packages for testing (if available) and suite
00924         source_u = self.sources[suite][src]
00925         if src in self.sources['testing']:
00926             source_t = self.sources['testing'][src]
00927             # if testing and unstable have the same version, then this is a candidate for binary-NMUs only
00928             if apt_pkg.VersionCompare(source_t['version'], source_u['version']) == 0:
00929                 return False
00930         else:
00931             source_t = None
00932 
00933         # build the common part of the excuse, which will be filled by the code below
00934         ref = "%s%s" % (src, suite != 'unstable' and "_" + suite or "")
00935         excuse = Excuse(ref)
00936         excuse.set_vers(source_t and source_t['version'] or None, source_u['version'])
00937         source_u['maintainer'] and excuse.set_maint(source_u['maintainer'].strip())
00938         source_u['section'] and excuse.set_section(source_u['section'].strip())
00939 
00940         # the starting point is that we will update the candidate
00941         update_candidate = True
00942         
00943         # if the version in unstable is older, then stop here with a warning in the excuse and return False
00944         if source_t and apt_pkg.VersionCompare(source_u['version'], source_t['version']) < 0:
00945             excuse.addhtml("ALERT: %s is newer in testing (%s %s)" % (src, source_t['version'], source_u['version']))
00946             self.excuses.append(excuse)
00947             return False
00948 
00949         # check if the source package really exists or if it is a fake one
00950         if source_u.has_key('fake'):
00951             excuse.addhtml("%s source package doesn't exist" % (src))
00952             update_candidate = False
00953 
00954         # retrieve the urgency for the upload, ignoring it if this is a NEW package (not present in testing)
00955         urgency = self.urgencies.get(src, self.options.default_urgency)
00956         if not source_t and urgency != self.options.default_urgency:
00957             excuse.addhtml("Ignoring %s urgency setting for NEW package" % (urgency))
00958             urgency = self.options.default_urgency
00959 
00960         # if there is a `remove' hint and the requested version is the same as the
00961         # version in testing, then stop here and return False
00962         if self.hints["remove"].has_key(src):
00963             if source_t and self.same_source(source_t['version'], self.hints['remove'][src][0]) or \
00964                self.same_source(source_u['version'], self.hints['remove'][src][0]):
00965                 excuse.addhtml("Removal request by %s" % (self.hints["remove"][src][1]))
00966                 excuse.addhtml("Trying to remove package, not update it")
00967                 update_candidate = False
00968 
00969         # check if there is a `block' hint for this package or a `block-all source' hint
00970         blocked = None
00971         if self.hints["block"].has_key(src):
00972             blocked = self.hints["block"][src]
00973         elif self.hints["block-all"].has_key("source"):
00974             blocked = self.hints["block-all"]["source"]
00975 
00976         # if the source is blocked, then look for an `unblock' hint; the unblock request
00977         # is processed only if the specified version is correct
00978         if blocked:
00979             unblock = self.hints["unblock"].get(src,(None,None))
00980             if unblock[0] != None:
00981                 if self.same_source(unblock[0], source_u['version']):
00982                     excuse.addhtml("Ignoring request to block package by %s, due to unblock request by %s" % (blocked, unblock[1]))
00983                 else:
00984                     excuse.addhtml("Unblock request by %s ignored due to version mismatch: %s" % (unblock[1], unblock[0]))
00985             else:
00986                 excuse.addhtml("Not touching package, as requested by %s (contact debian-release if update is needed)" % (blocked))
00987                 update_candidate = False
00988 
00989         # if the suite is unstable, then we have to check the urgency and the minimum days of
00990         # permanence in unstable before updating testing; if the source package is too young,
00991         # the check fails and we set update_candidate to False to block the update
00992         if suite == 'unstable':
00993             if not self.dates.has_key(src):
00994                 self.dates[src] = (source_u['version'], self.date_now)
00995             elif not self.same_source(self.dates[src][0], source_u['version']):
00996                 self.dates[src] = (source_u['version'], self.date_now)
00997 
00998             days_old = self.date_now - self.dates[src][1]
00999             min_days = self.MINDAYS[urgency]
01000             excuse.setdaysold(days_old, min_days)
01001             if days_old < min_days:
01002                 if self.hints["urgent"].has_key(src) and self.same_source(source_u['version'], self.hints["urgent"][src][0]):
01003                     excuse.addhtml("Too young, but urgency pushed by %s" % (self.hints["urgent"][src][1]))
01004                 else:
01005                     update_candidate = False
01006 
01007         # at this point, we check the status of the builds on all the supported architectures
01008         # to catch the out-of-date ones
01009         pkgs = {src: ["source"]}
01010         for arch in self.options.architectures:
01011             oodbins = {}
01012             # for every binary package produced by this source in the suite for this architecture
01013             for pkg in sorted([x.split("/")[0] for x in self.sources[suite][src]['binaries'] if x.endswith("/"+arch)]):
01014                 if not pkgs.has_key(pkg): pkgs[pkg] = []
01015                 pkgs[pkg].append(arch)
01016 
01017                 # retrieve the binary package and its source version
01018                 binary_u = self.binaries[suite][arch][0][pkg]
01019                 pkgsv = binary_u['source-ver']
01020 
01021                 # if it wasn't built from the same source, then it is out-of-date
01022                 if not self.same_source(source_u['version'], pkgsv):
01023                     if not oodbins.has_key(pkgsv):
01024                         oodbins[pkgsv] = []
01025                     oodbins[pkgsv].append(pkg)
01026                     continue
01027 
01028                 # if the package is architecture-dependent or the current arch is `nobreakall'
01029                 # find unsatisfied dependencies for the binary package
01030                 if binary_u['architecture'] != 'all' or arch in self.options.nobreakall_arches:
01031                     self.excuse_unsat_deps(pkg, src, arch, suite, excuse)
01032 
01033             # if there are out-of-date packages, warn about them in the excuse and set update_candidate
01034             # to False to block the update; if the architecture where the package is out-of-date is
01035             # in the `fucked_arches' list, then do not block the update
01036             if oodbins:
01037                 oodtxt = ""
01038                 for v in oodbins.keys():
01039                     if oodtxt: oodtxt = oodtxt + "; "
01040                     oodtxt = oodtxt + "%s (from <a href=\"http://buildd.debian.org/build.php?" \
01041                         "arch=%s&pkg=%s&ver=%s\" target=\"_blank\">%s</a>)" % \
01042                         (", ".join(sorted(oodbins[v])), arch, src, v, v)
01043                 text = "out of date on <a href=\"http://buildd.debian.org/build.php?" \
01044                     "arch=%s&pkg=%s&ver=%s\" target=\"_blank\">%s</a>: %s" % \
01045                     (arch, src, source_u['version'], arch, oodtxt)
01046 
01047                 if arch in self.options.fucked_arches:
01048                     text = text + " (but %s isn't keeping up, so nevermind)" % (arch)
01049                 else:
01050                     update_candidate = False
01051 
01052                 if self.date_now != self.dates[src][1]:
01053                     excuse.addhtml(text)
01054 
01055         # if the source package has no binaries, set update_candidate to False to block the update
01056         if len(self.sources[suite][src]['binaries']) == 0:
01057             excuse.addhtml("%s has no binaries on any arch" % src)
01058             update_candidate = False
01059 
01060         # if the suite is unstable, then we have to check the release-critical bug counts before
01061         # updating testing; if the unstable package has an RC bug count greater than the testing
01062         # one, the check fails and we set update_candidate to False to block the update
01063         if suite == 'unstable':
01064             for pkg in pkgs.keys():
01065                 if not self.bugs['testing'].has_key(pkg):
01066                     self.bugs['testing'][pkg] = 0
01067                 if not self.bugs['unstable'].has_key(pkg):
01068                     self.bugs['unstable'][pkg] = 0
01069 
01070                 if self.bugs['unstable'][pkg] > self.bugs['testing'][pkg]:
01071                     excuse.addhtml("%s (%s) is <a href=\"http://bugs.debian.org/cgi-bin/pkgreport.cgi?" \
01072                                    "which=pkg&data=%s&sev-inc=critical&sev-inc=grave&sev-inc=serious\" " \
01073                                    "target=\"_blank\">buggy</a>! (%d > %d)" % \
01074                                    (pkg, ", ".join(pkgs[pkg]), pkg, self.bugs['unstable'][pkg], self.bugs['testing'][pkg]))
01075                     update_candidate = False
01076                 elif self.bugs['unstable'][pkg] > 0:
01077                     excuse.addhtml("%s (%s) is (less) <a href=\"http://bugs.debian.org/cgi-bin/pkgreport.cgi?" \
01078                                    "which=pkg&data=%s&sev-inc=critical&sev-inc=grave&sev-inc=serious\" " \
01079                                    "target=\"_blank\">buggy</a>! (%d <= %d)" % \
01080                                    (pkg, ", ".join(pkgs[pkg]), pkg, self.bugs['unstable'][pkg], self.bugs['testing'][pkg]))
01081 
01082         # check if there is a `force' hint for this package, which allows it to go in even if it is not updateable
01083         if not update_candidate and self.hints["force"].has_key(src) and \
01084            self.same_source(source_u['version'], self.hints["force"][src][0]):
01085             excuse.dontinvalidate = 1
01086             excuse.addhtml("Should ignore, but forced by %s" % (self.hints["force"][src][1]))
01087             update_candidate = True
01088 
01089         # if the suite is testing-proposed-updates, the package needs an explicit approval in order to go in
01090         if suite == "tpu":
01091             if self.approvals.has_key("%s_%s" % (src, source_u['version'])):
01092                 excuse.addhtml("Approved by %s" % self.approvals["%s_%s" % (src, source_u['version'])])
01093             else:
01094                 excuse.addhtml("NEEDS APPROVAL BY RM")
01095                 update_candidate = False
01096 
01097         # if the package can be updated, it is a valid candidate
01098         if update_candidate:
01099             excuse.addhtml("Valid candidate")
01100         # else it won't be considered
01101         else:
01102             excuse.addhtml("Not considered")
01103 
01104         self.excuses.append(excuse)
01105         return update_candidate
01106 
01107     def reversed_exc_deps(self):
01108         """Reverse the excuses dependencies
01109 
01110         This method returns a dictionary where the keys are the package names
01111         and the values are the excuse names which depend on it.
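        The inversion can be sketched in isolation (with (name, deps) pairs
        standing in for Excuse objects, purely for illustration):

```python
def reverse_deps(excuses):
    # excuses: iterable of (name, deps) pairs standing in for Excuse objects
    # returns {dependency-name: [names of excuses that depend on it]}
    res = {}
    for name, deps in excuses:
        for d in deps:
            res.setdefault(d, []).append(name)
    return res
```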
01112         """
01113         res = {}
01114         for exc in self.excuses:
01115             for d in exc.deps:
01116                 if not res.has_key(d): res[d] = []
01117                 res[d].append(exc.name)
01118         return res
01119 
01120     def invalidate_excuses(self, valid, invalid):
01121         """Invalidate impossible excuses
01122 
01123         This method invalidates the impossible excuses, which depend
01124         on invalid excuses. The two parameters contain the lists of
01125         `valid' and `invalid' excuses.
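        The worklist propagation below can be sketched in a simplified form
        (plain name lists instead of Excuse objects, ignoring the
        `dontinvalidate' and testing-proposed-updates special cases):

```python
def propagate_invalid(valid, invalid, revdeps):
    # revdeps: {excuse-name: [names of excuses depending on it]}
    # moves, in place, every excuse reachable from `invalid' out of `valid'
    i = 0
    while i < len(invalid):
        for x in revdeps.get(invalid[i], []):
            if x in valid:
                # newly invalidated excuses are appended, so their own
                # reverse dependencies get processed by later iterations
                invalid.append(valid.pop(valid.index(x)))
        i += 1
```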
01126         """
01127         # build a lookup-by-name map
01128         exclookup = {}
01129         for e in self.excuses:
01130             exclookup[e.name] = e
01131 
01132         # build the reverse dependencies
01133         revdeps = self.reversed_exc_deps()
01134 
01135         # loop on the invalid excuses
01136         i = 0
01137         while i < len(invalid):
01138             # if there is no reverse dependency, skip the item
01139             if not revdeps.has_key(invalid[i]):
01140                 i += 1
01141                 continue
01142             # if the dependency can be satisfied by a testing-proposed-updates excuse, skip the item
01143             if (invalid[i] + "_tpu") in valid:
01144                 i += 1
01145                 continue
01146             # loop on the reverse dependencies
01147             for x in revdeps[invalid[i]]:
01148                 # if the item is valid and it is marked as `dontinvalidate', skip the item
01149                 if x in valid and exclookup[x].dontinvalidate:
01150                     continue
01151 
01152                 # otherwise, invalidate the dependency and, if the excuse was
01153                 # considered valid, move it to the invalid list
01154                 exclookup[x].invalidate_dep(invalid[i])
01155                 if x in valid:
01156                     p = valid.index(x)
01157                     invalid.append(valid.pop(p))
01158                     exclookup[x].addhtml("Invalidated by dependency")
01159                     exclookup[x].addhtml("Not considered")
01160             i = i + 1
01161  
01162     def write_excuses(self):
01163         """Produce and write the update excuses
01164 
01165         This method handles the generation of the update excuses: the packages
01166         are examined to determine whether they are valid candidates. For the details
01167         of this procedure, please refer to the module docstring.
01168         """
01169 
01170         # this list will contain the packages which are valid candidates;
01171         # if a package is going to be removed, it will have a "-" prefix
01172         upgrade_me = []
01173 
01174         # for every source package in testing, check if it should be removed
01175         for pkg in self.sources['testing']:
01176             if self.should_remove_source(pkg):
01177                 upgrade_me.append("-" + pkg)
01178 
01179         # for every source package in unstable check if it should be upgraded
01180         for pkg in self.sources['unstable']:
01181             # if the source package is already present in testing,
01182             # check if it should be upgraded for every binary package
01183             if self.sources['testing'].has_key(pkg):
01184                 for arch in self.options.architectures:
01185                     if self.should_upgrade_srcarch(pkg, arch, 'unstable'):
01186                         upgrade_me.append("%s/%s" % (pkg, arch))
01187 
01188             # check if the source package should be upgraded
01189             if self.should_upgrade_src(pkg, 'unstable'):
01190                 upgrade_me.append(pkg)
01191 
01192         # for every source package in testing-proposed-updates, check if it should be upgraded
01193         for pkg in self.sources['tpu']:
01194             # if the source package is already present in testing,
01195             # check if it should be upgraded for every binary package
01196             if self.sources['testing'].has_key(pkg):
01197                 for arch in self.options.architectures:
01198                     if self.should_upgrade_srcarch(pkg, arch, 'tpu'):
01199                         upgrade_me.append("%s/%s_tpu" % (pkg, arch))
01200 
01201             # check if the source package should be upgraded
01202             if self.should_upgrade_src(pkg, 'tpu'):
01203                 upgrade_me.append("%s_tpu" % pkg)
01204 
01205         # process the `remove' hints, if the given package is not yet in upgrade_me
01206         for src in self.hints["remove"].keys():
01207             if src in upgrade_me: continue
01208             if ("-"+src) in upgrade_me: continue
01209             if not self.sources['testing'].has_key(src): continue
01210 
01211             # check if the version specified in the hint is the same as that of the considered package
01212             tsrcv = self.sources['testing'][src]['version']
01213             if not self.same_source(tsrcv, self.hints["remove"][src][0]): continue
01214 
01215             # add the removal of the package to upgrade_me and build a new excuse
01216             upgrade_me.append("-%s" % (src))
01217             excuse = Excuse("-%s" % (src))
01218             excuse.set_vers(tsrcv, None)
01219             excuse.addhtml("Removal request by %s" % (self.hints["remove"][src][1]))
01220             excuse.addhtml("Package is broken, will try to remove")
01221             self.excuses.append(excuse)
01222 
01223         # sort the excuses by daysold and name
01224         self.excuses.sort(lambda x, y: cmp(x.daysold, y.daysold) or cmp(x.name, y.name))
01225 
01226         # extract the packages not considered, i.e. those which have an excuse but are not in upgrade_me
01227         unconsidered = [e.name for e in self.excuses if e.name not in upgrade_me]
01228 
01229         # invalidate impossible excuses
01230         for e in self.excuses:
01231             for d in e.deps:
01232                 if d not in upgrade_me and d not in unconsidered:
01233                     e.addhtml("Impossible dep: %s -> %s" % (e.name, d))
01234         self.invalidate_excuses(upgrade_me, unconsidered)
01235 
01236         # write excuses to the output file
01237         f = open(self.options.excuses_output, 'w')
01238         f.write("<!DOCTYPE HTML PUBLIC \"-//W3C//DTD HTML 4.01//EN\" \"http://www.w3.org/TR/REC-html40/strict.dtd\">\n")
01239         f.write("<html><head><title>excuses...</title>")
01240         f.write("<meta http-equiv=\"Content-Type\" content=\"text/html;charset=utf-8\"></head><body>\n")
01241         f.write("<p>Generated: " + time.strftime("%Y.%m.%d %H:%M:%S %z", time.gmtime(time.time())) + "</p>\n")
01242         f.write("<ul>\n")
01243         for e in self.excuses:
01244             f.write("<li>%s" % e.html())
01245         f.write("</ul></body></html>\n")
01246         f.close()
01247 
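The cmp-based sort above (`self.excuses.sort(lambda x, y: cmp(...) or cmp(...))`) relies on Python 2 semantics. The same ordering, primarily by daysold with ties broken by name, can be sketched with a Python 3 key function; the `Excuse` stub here is a minimal stand-in for illustration, not Britney's real class:

```python
# Python 3 equivalent of the Python 2 cmp-based excuse sort:
# order primarily by daysold, breaking ties alphabetically by name.
class Excuse:
    """Minimal stand-in for Britney's Excuse class (illustration only)."""
    def __init__(self, name, daysold):
        self.name = name
        self.daysold = daysold

excuses = [Excuse("b", 5), Excuse("a", 5), Excuse("c", 2)]
# Tuples compare element by element, so this reproduces
# "cmp(x.daysold, y.daysold) or cmp(x.name, y.name)".
excuses.sort(key=lambda e: (e.daysold, e.name))
print([e.name for e in excuses])  # ['c', 'a', 'b']
```

The tuple key is the idiomatic replacement for chained `cmp()` calls, which were removed along with the `cmp` builtin in Python 3.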
01248     def main(self):
01249         """Main method
01250         
01251         This is the entry point for the class: it sequences the calls to
01252         the member methods which produce the output files.
01253         """
01254         self.write_excuses()
01255 
01256 if __name__ == '__main__':
01257     Britney().main()
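The candidate identifiers appended to upgrade_me throughout write_excuses() follow a small naming convention. The helper below is purely illustrative (it is not part of Britney) and simply restates the string formats built above:

```python
def candidate_name(pkg, arch=None, suite='unstable', removal=False):
    """Illustrative helper (not in Britney): build an upgrade_me entry.

    Removals are prefixed with '-', per-architecture binary upgrades
    use 'pkg/arch', and testing-proposed-updates candidates carry a
    '_tpu' suffix, matching the strings built in write_excuses().
    """
    if removal:
        return "-%s" % pkg
    name = pkg if arch is None else "%s/%s" % (pkg, arch)
    if suite == 'tpu':
        name = "%s_tpu" % name
    return name

print(candidate_name("glibc", removal=True))        # -glibc
print(candidate_name("gcc", arch="i386"))           # gcc/i386
print(candidate_name("perl", suite='tpu'))          # perl_tpu
print(candidate_name("perl", arch="i386", suite='tpu'))  # perl/i386_tpu
```

Keeping all candidates in one flat list of strings lets the later upgrade phase treat removals, per-architecture updates, and tpu candidates uniformly, decoding the kind of action from the name itself.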

Generated on Sun Jun 25 12:04:03 2006 for britney by doxygen 1.4.6