"""
= Introduction =

This is the Debian testing updater script, also known as "Britney".

Packages are usually installed into the `testing' distribution after
they have undergone some degree of testing in unstable. The goal of
this software is to do this task in a smart way, keeping testing
fully installable at all times and close to being a release candidate.

Britney's source code is split into two different but related tasks:
the first one is the generation of the update excuses, while the
second tries to update testing with the valid candidates; first
each package alone, then larger and larger sets of packages
together. Each try is accepted if testing is not more uninstallable
after the update than before.

= Data Loading =

In order to analyze the entire Debian distribution, Britney needs to
load the whole archive in memory: this means more than 10,000 packages
for twelve architectures, as well as the dependency interconnections
between them. For this reason, the memory requirements for running this
software are quite high, and at least 1 gigabyte of RAM should be available.

Britney loads the source packages from the `Sources' file and the binary
packages from the `Packages_${arch}' files, where ${arch} is substituted
with the supported architectures. While loading the data, the software
analyzes the dependencies and builds a directed weighted graph in memory
with all the interconnections between the packages (see Britney.read_sources
and Britney.read_binaries).
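The stanza-based archive format that Britney reads can be illustrated with
a short stand-alone sketch (a simplified parser for illustration only; the
real loader below uses apt_pkg.ParseTagFile, and the sample stanza here is
invented):

```python
def parse_stanzas(text):
    # Split RFC822-style stanzas (separated by blank lines) into dicts.
    # Continuation lines (starting with whitespace) are skipped, since
    # Britney only needs the single-line fields.
    stanzas, fields = [], {}
    for line in text.splitlines():
        if not line.strip():
            if fields:
                stanzas.append(fields)
                fields = {}
        elif not line[0].isspace() and ":" in line:
            key, _, value = line.partition(":")
            fields[key.strip()] = value.strip()
    if fields:
        stanzas.append(fields)
    return stanzas

sample = "Package: foo\nVersion: 1.0-2\nDepends: libc6 (>= 2.3), bar\n\nPackage: bar\nVersion: 0.9-1\n"
pkgs = parse_stanzas(sample)
```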

Other than source and binary packages, Britney loads the following data:

 * Bugs, which contains the count of release-critical bugs for a given
   version of a source package (see Britney.read_bugs).

 * Dates, which contains the date of the upload of a given version
   of a source package (see Britney.read_dates).

 * Urgencies, which contains the urgency of the upload of a given
   version of a source package (see Britney.read_urgencies).

 * Approvals, which contains the list of approved testing-proposed-updates
   packages (see Britney.read_approvals).

 * Hints, which contains lists of commands which modify the standard
   behaviour of Britney (see Britney.read_hints).

For a more detailed explanation of the format of these files, please read
the documentation of the related methods. Their exact meaning will be
explained in the chapter "Excuses Generation".
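The Bugs, Dates and Urgency files all use simple whitespace-separated
records, one per line, which can be split generically (a sketch; the
sample records are invented, see the read_* methods below for the real
parsers):

```python
def parse_record(line, types):
    # Split a record and convert each field with the matching constructor.
    return tuple(f(v) for f, v in zip(types, line.split()))

pkg, bug_count = parse_record("libfoo 3", (str, int))             # Bugs
src, ver, day = parse_record("foo 1.0-2 13165", (str, str, int))  # Dates
```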

= Excuses =

An excuse is a detailed explanation of why a package can or cannot
be updated in the testing distribution from a newer package in
another distribution (like, for example, unstable). The main purpose
of the excuses is to be written to an HTML file which will be
published over HTTP. The maintainers will be able to parse it, manually
or automatically, to find the explanation of why their packages have
or have not been updated.

== Excuses generation ==

These are the steps (with references to method names) that Britney
takes to generate the update excuses.

 * If a source package is available in testing but it is not
   present in unstable and no binary packages in unstable are
   built from it, then it is marked for removal.

 * Every source package in unstable and testing-proposed-updates,
   if already present in testing, is checked for binary-NMUs, new
   or dropped binary packages in all the supported architectures
   (see Britney.should_upgrade_srcarch). The steps to detect if an
   upgrade is needed are:

    1. If there is a `remove' hint for the source package, the package
       is ignored: it will be removed and not updated.

    2. For every binary package built from the new source, it checks
       for unsatisfied dependencies, new binary packages and updated
       binary packages (binNMUs), excluding the architecture-independent
       ones and the packages not built from the same source.

    3. For every binary package built from the old source, it checks
       if it is still built from the new source; if this is not true
       and the package is not architecture-independent, the script
       removes it from testing.

    4. Finally, if there is something worth doing (e.g. a new or updated
       binary package) and nothing wrong, it marks the source package
       as "Valid candidate", or as "Not considered" if there is something
       wrong which prevented the update.
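A binary-NMU is recognised purely from the version number: a binNMU
rebuild appends +bN to the version of the original source upload. A
minimal sketch of that check (cf. the same_source method below, which
uses the same regular expression):

```python
import re

def strip_binnmu(version):
    # Drop a trailing +bN binary-NMU suffix, if present, so that a
    # rebuilt binary still matches its source version.
    m = re.match(r'^(.*)\+b\d+$', version)
    return m.group(1) if m else version
```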

 * Every source package in unstable and testing-proposed-updates is
   checked for upgrade (see Britney.should_upgrade_src). The steps
   to detect if an upgrade is needed are:

    1. If the source package in testing is more recent, the new one
       is ignored.

    2. If the source package doesn't exist (is fake), which means that
       a binary package refers to it but it is not present in the
       `Sources' file, the new one is ignored.

    3. If the package doesn't exist in testing, the urgency of the
       upload is ignored and set to the default (actually `low').

    4. If there is a `remove' hint for the source package, the package
       is ignored: it will be removed and not updated.

    5. If there is a `block' hint for the source package, or a
       `block-all source' hint, and no `unblock' hint, the package
       is ignored.

    6. If the suite is unstable, the update can go ahead only if the
       upload happened more than the minimum number of days specified
       by the urgency of the upload; if this is not true, the package
       is ignored as `too-young'. Note that the urgency is sticky,
       meaning that the highest urgency uploaded since the previous
       testing transition is taken into account.

    7. All the architecture-dependent binary packages and the
       architecture-independent ones for the `nobreakall' architectures
       have to be built from the source we are considering. If this is
       not true, then these are called `out-of-date' architectures and
       the package is ignored.

    8. The source package must have at least one binary package,
       otherwise it is ignored.

    9. If the suite is unstable, the count of release-critical bugs for
       the new source package must be less than the count for the
       testing one. If this is not true, the package is ignored as
       `buggy'.

   10. If there is a `force' hint for the source package, then it is
       updated even if it was marked as ignored by the previous steps.

   11. If the suite is testing-proposed-updates, the source package can
       be updated only if there is an explicit approval for it.

   12. If the package will not be ignored, mark it as "Valid candidate",
       otherwise mark it as "Not considered".
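The `too-young' check in the steps above can be sketched as follows
(the MINDAYS table and dates are invented sample data; the real values
come from the MINDAYS_* configuration entries read in __parse_arguments):

```python
# Minimum ageing days per urgency (sample values for illustration).
MINDAYS = {"low": 10, "medium": 5, "high": 2}

def is_too_young(upload_day, today, urgency, default="low"):
    # An upload must age at least MINDAYS[urgency] days before it may
    # migrate; unknown urgencies fall back to the default urgency.
    required = MINDAYS.get(urgency, MINDAYS[default])
    return today - upload_day < required
```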

 * The list of `remove' hints is processed: if the requested source
   package is not already being updated or removed and the version
   actually in testing is the same as the one specified in the `remove'
   hint, it is marked for removal.

 * The excuses are sorted by the number of days from the last upload
   (days-old) and by name.

 * A list of unconsidered excuses (for which the package is not upgraded)
   is built. Using this list, all the excuses depending on them are marked
   as invalid because of an "impossible dependency".
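The invalidation pass above is transitive: an excuse that depends on an
unconsidered excuse becomes invalid itself, which may in turn invalidate
further excuses. A sketch of that fixed-point computation (the data
structure and names here are invented for the example):

```python
def invalidate(depends, invalid):
    # depends maps a package to the packages its excuse depends on;
    # invalid is the initial set of unconsidered packages. Iterate
    # until no more excuses can be invalidated.
    invalid = set(invalid)
    changed = True
    while changed:
        changed = False
        for pkg, deps in depends.items():
            if pkg not in invalid and any(d in invalid for d in deps):
                invalid.add(pkg)
                changed = True
    return invalid
```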

 * The excuses are written to an HTML file.
"""

import os
import re
import sys
import string
import time
import copy
import optparse
import operator

import apt_pkg

from excuse import Excuse

__author__ = 'Fabio Tranchitella'
__version__ = '2.0.alpha1'

# source package fields
VERSION = 0
SECTION = 1
BINARIES = 2
MAINTAINER = 3
FAKESRC = 4

# binary package fields (VERSION and SECTION indices are shared with the
# source package fields above)
SOURCE = 2
SOURCEVER = 3
ARCHITECTURE = 4
PREDEPENDS = 5
DEPENDS = 6
CONFLICTS = 7
PROVIDES = 8
RDEPENDS = 9
RCONFLICTS = 10

class Britney:
    """Britney, the Debian testing updater script

    This is the script that updates the testing distribution. It is executed
    each day after the installation of the updated packages. It generates the
    `Packages' files for the testing distribution, but it does so in an
    intelligent manner; it tries to avoid any inconsistency and to use only
    non-buggy packages.

    For more documentation on this script, please read the Developers Reference.
    """

    HINTS_STANDARD = ("easy", "hint", "remove", "block", "unblock", "urgent", "approve")
    HINTS_ALL = ("force", "force-hint", "block-all") + HINTS_STANDARD

    def __init__(self):
        """Class constructor

        This method initializes and populates the data lists, which contain all
        the information needed by the other methods of the class.
        """
        # the day as used by Britney (offset so that a "day" starts at 3pm)
        self.date_now = int(((time.time() / (60*60)) - 15) / 24)

        # parse the command line arguments
        self.__parse_arguments()

        # initialize the apt_pkg back-end
        apt_pkg.init()

        # if requested, build the list of non-installable packages for the
        # full archive; otherwise rely on the cached report
        if not self.options.nuninst_cache:
            self.__log("Building the list of not installable packages for the full archive", type="I")
            self.sources = {'testing': self.read_sources(self.options.testing)}
            nuninst = {}
            for arch in self.options.architectures:
                self.binaries = {'testing': {arch: self.read_binaries(self.options.testing, "testing", arch)}}
                self.__log("> Checking for non-installable packages for architecture %s" % arch, type="I")
                result = self.get_nuninst(arch, build=True)
                nuninst.update(result)
                self.__log("> Found %d non-installable packages" % len(nuninst[arch]), type="I")
            self.write_nuninst(nuninst)
        else:
            self.__log("Not building the list of not installable packages, as requested", type="I")

        # read the source and binary packages for all the distributions
        self.sources = {'testing': self.read_sources(self.options.testing),
                        'unstable': self.read_sources(self.options.unstable),
                        'tpu': self.read_sources(self.options.tpu),}
        self.binaries = {'testing': {}, 'unstable': {}, 'tpu': {}}
        for arch in self.options.architectures:
            self.binaries['testing'][arch] = self.read_binaries(self.options.testing, "testing", arch)
            self.binaries['unstable'][arch] = self.read_binaries(self.options.unstable, "unstable", arch)
            self.binaries['tpu'][arch] = self.read_binaries(self.options.tpu, "tpu", arch)

        # read the release-critical bug summaries for testing and unstable
        self.bugs = {'unstable': self.read_bugs(self.options.unstable),
                     'testing': self.read_bugs(self.options.testing),}
        self.normalize_bugs()

        # read additional data
        self.dates = self.read_dates(self.options.testing)
        self.urgencies = self.read_urgencies(self.options.testing)
        self.approvals = self.read_approvals(self.options.tpu)
        self.hints = self.read_hints(self.options.unstable)
        self.excuses = []
        self.dependencies = {}

    def __parse_arguments(self):
        """Parse the command line arguments

        This method parses and initializes the command line arguments.
        While doing so, it preprocesses some of the options to convert them
        into a form suitable for the other methods of the class.
        """
        # initialize the parser
        self.parser = optparse.OptionParser(version="%prog")
        self.parser.add_option("-v", "", action="count", dest="verbose", help="enable verbose output")
        self.parser.add_option("-c", "--config", action="store", dest="config", default="/etc/britney.conf",
                               help="path for the configuration file")
        self.parser.add_option("", "--architectures", action="store", dest="architectures", default=None,
                               help="override architectures from configuration file")
        self.parser.add_option("", "--actions", action="store", dest="actions", default=None,
                               help="override the list of actions to be performed")
        self.parser.add_option("", "--dry-run", action="store_true", dest="dry_run", default=False,
                               help="disable all outputs to the testing directory")
        self.parser.add_option("", "--compatible", action="store_true", dest="compatible", default=False,
                               help="enable full compatibility with old britney's output")
        self.parser.add_option("", "--control-files", action="store_true", dest="control_files", default=False,
                               help="enable control files generation")
        self.parser.add_option("", "--nuninst-cache", action="store_true", dest="nuninst_cache", default=False,
                               help="do not build the non-installability status, use the cache from file")
        (self.options, self.args) = self.parser.parse_args()

        # integrity checks
        if not os.path.isfile(self.options.config):
            self.__log("Unable to read the configuration file (%s), exiting!" % self.options.config, type="E")
            sys.exit(1)

        # minimum days for unstable-testing transition and the list of hints
        # are handled as ad-hoc cases; every other key/value pair from the
        # configuration file is copied verbatim into self.options
        self.MINDAYS = {}
        self.HINTS = {}
        for k, v in [map(string.strip, r.split('=', 1)) for r in file(self.options.config) if '=' in r and not r.strip().startswith('#')]:
            if k.startswith("MINDAYS_"):
                self.MINDAYS[k.split("_")[1].lower()] = int(v)
            elif k.startswith("HINTS_"):
                self.HINTS[k.split("_")[1].lower()] = \
                    reduce(lambda x, y: x+y, [hasattr(self, "HINTS_" + i) and getattr(self, "HINTS_" + i) or (i,) for i in v.split()])
            elif not hasattr(self.options, k.lower()) or \
                 not getattr(self.options, k.lower()):
                setattr(self.options, k.lower(), v)

        # sort the architecture list: nobreakall first, then the full ones,
        # then fucked, break and new architectures
        allarches = sorted(self.options.architectures.split())
        arches = [x for x in allarches if x in self.options.nobreakall_arches.split()]
        arches += [x for x in allarches if x not in arches and x not in self.options.fucked_arches.split()]
        arches += [x for x in allarches if x not in arches and x not in self.options.break_arches.split()]
        arches += [x for x in allarches if x not in arches and x not in self.options.new_arches.split()]
        arches += [x for x in allarches if x not in arches]
        self.options.architectures = arches
        self.options.smooth_updates = self.options.smooth_updates.split()

    def __log(self, msg, type="I"):
        """Print info messages according to verbosity level

        An easy-and-simple log method which prints messages to the standard
        output. The type parameter controls the urgency of the message, and
        can be equal to `I' for `Information', `W' for `Warning' and `E' for
        `Error'. Warnings and errors are always printed; informational
        messages are printed only if verbose logging is enabled.
        """
        if self.options.verbose or type in ("E", "W"):
            print "%s: [%s] - %s" % (type, time.asctime(), msg)

    def read_sources(self, basedir):
        """Read the list of source packages from the specified directory

        The source packages are read from the `Sources' file within the
        directory specified by the `basedir' parameter. Considering the
        large amount of memory needed, not all the fields are loaded
        in memory. The available fields are Version, Maintainer and Section.

        The method returns a dictionary mapping source package names to
        their fields.
        """
        sources = {}
        filename = os.path.join(basedir, "Sources")
        self.__log("Loading source packages from %s" % filename)
        Packages = apt_pkg.ParseTagFile(open(filename))
        get_field = Packages.Section.get
        while Packages.Step():
            pkg = get_field('Package')
            sources[pkg] = [get_field('Version'),
                            get_field('Section'),
                            [],
                            get_field('Maintainer'),
                            False,
                           ]
        return sources

    def read_binaries(self, basedir, distribution, arch):
        """Read the list of binary packages from the specified directory

        The binary packages are read from the `Packages_${arch}' files
        within the directory specified by the `basedir' parameter, replacing
        ${arch} with the value of the arch parameter. Considering the
        large amount of memory needed, not all the fields are loaded
        in memory. The available fields are Version, Source, Pre-Depends,
        Depends, Conflicts, Provides and Architecture.

        After reading the packages, reverse dependencies are computed
        and saved in the `rdepends' keys, and the `Provides' field is
        used to populate the virtual packages list.

        The dependencies are parsed with the apt_pkg.ParseDepends method,
        and they are stored both in the format of its return value and
        as text.

        The method returns a tuple. The first element is a dictionary
        mapping binary package names to their fields; the second element
        is a dictionary which maps virtual packages to the real packages
        that provide them.
        """
        packages = {}
        provides = {}
        sources = self.sources

        filename = os.path.join(basedir, "Packages_%s" % arch)
        self.__log("Loading binary packages from %s" % filename)
        Packages = apt_pkg.ParseTagFile(open(filename))
        get_field = Packages.Section.get
        while Packages.Step():
            pkg = get_field('Package')
            version = get_field('Version')
            dpkg = [version,
                    get_field('Section'),
                    pkg,
                    version,
                    get_field('Architecture'),
                    get_field('Pre-Depends'),
                    get_field('Depends'),
                    get_field('Conflicts'),
                    get_field('Provides'),
                    [],
                    [],
                   ]

            # retrieve the name and the version of the source package
            source = get_field('Source')
            if source:
                dpkg[SOURCE] = source.split(" ")[0]
                if "(" in source:
                    dpkg[SOURCEVER] = source[source.find("(")+1:source.find(")")]

            # if the source package is available in the distribution,
            # register this binary package with it
            if dpkg[SOURCE] in sources[distribution]:
                sources[distribution][dpkg[SOURCE]][BINARIES].append(pkg + "/" + arch)
            # if the source package doesn't exist, create a fake one
            else:
                sources[distribution][dpkg[SOURCE]] = [dpkg[SOURCEVER], None, [pkg + "/" + arch], None, True]

            # register virtual packages and the real packages that provide them
            if dpkg[PROVIDES]:
                parts = map(string.strip, dpkg[PROVIDES].split(","))
                for p in parts:
                    if p not in provides:
                        provides[p] = []
                    provides[p].append(pkg)
                dpkg[PROVIDES] = parts
            else: dpkg[PROVIDES] = []

            # add the resulting dictionary to the package list
            packages[pkg] = dpkg

        # loop again on the list of packages to register reverse
        # dependencies and conflicts
        register_reverses = self.register_reverses
        for pkg in packages:
            register_reverses(pkg, packages, provides, check_doubles=False)

        # return a tuple with the list of real and virtual packages
        return (packages, provides)

    def register_reverses(self, pkg, packages, provides, check_doubles=True, parse_depends=apt_pkg.ParseDepends):
        """Register reverse dependencies and conflicts for the specified package

        This method registers the reverse dependencies and conflicts for
        a given package, using `packages` as the list of packages and
        `provides` as the list of virtual packages.

        The method has an optional parameter parse_depends which is there
        just for performance reasons and is not meant to be overwritten.
        """
        # gather the dependencies of the package (both Depends and Pre-Depends)
        dependencies = []
        if packages[pkg][DEPENDS]:
            dependencies.extend(parse_depends(packages[pkg][DEPENDS]))
        if packages[pkg][PREDEPENDS]:
            dependencies.extend(parse_depends(packages[pkg][PREDEPENDS]))
        # go through the list
        for p in dependencies:
            for a in p:
                # register real packages
                if a[0] in packages and (not check_doubles or pkg not in packages[a[0]][RDEPENDS]):
                    packages[a[0]][RDEPENDS].append(pkg)
                # register packages which provide the virtual package
                elif a[0] in provides:
                    for i in provides.get(a[0]):
                        if i not in packages: continue
                        if not check_doubles or pkg not in packages[i][RDEPENDS]:
                            packages[i][RDEPENDS].append(pkg)
        # register the conflicts of the package
        if packages[pkg][CONFLICTS]:
            for p in parse_depends(packages[pkg][CONFLICTS]):
                for a in p:
                    # register real packages
                    if a[0] in packages and (not check_doubles or pkg not in packages[a[0]][RCONFLICTS]):
                        packages[a[0]][RCONFLICTS].append(pkg)
                    # register packages which provide the virtual package
                    elif a[0] in provides:
                        for i in provides[a[0]]:
                            if i not in packages: continue
                            if not check_doubles or pkg not in packages[i][RCONFLICTS]:
                                packages[i][RCONFLICTS].append(pkg)

    def read_bugs(self, basedir):
        """Read the release-critical bug summary from the specified directory

        The RC bug summaries are read from the `Bugs' file within the
        directory specified by the `basedir' parameter. The file contains
        rows with the format:

        <package-name> <count-of-rc-bugs>

        The method returns a dictionary where the key is the binary package
        name and the value is the number of open RC bugs for it.
        """
        bugs = {}
        filename = os.path.join(basedir, "Bugs")
        self.__log("Loading RC bugs count from %s" % filename)
        for line in open(filename):
            l = line.split()
            if len(l) != 2: continue
            try:
                bugs[l[0]] = int(l[1])
            except ValueError:
                self.__log("Bugs, unable to parse \"%s\"" % line, type="E")
        return bugs

    def write_bugs(self, basedir, bugs):
        """Write the release-critical bug summary to the specified directory

        For a more detailed explanation of the format, please check the method
        read_bugs.
        """
        filename = os.path.join(basedir, "Bugs")
        self.__log("Writing RC bugs count to %s" % filename)
        f = open(filename, 'w')
        for pkg in sorted(bugs.keys()):
            if bugs[pkg] == 0: continue
            f.write("%s %d\n" % (pkg, bugs[pkg]))
        f.close()

    def __maxver(self, pkg, dist):
        """Return the maximum version for a given package name

        This method returns None if the specified source package
        is not available in the `dist' distribution. If the package
        exists, then it returns the maximum version between the
        source package and its binary packages.
        """
        maxver = None
        if pkg in self.sources[dist]:
            maxver = self.sources[dist][pkg][VERSION]
        for arch in self.options.architectures:
            if pkg not in self.binaries[dist][arch][0]: continue
            pkgv = self.binaries[dist][arch][0][pkg][VERSION]
            if maxver == None or apt_pkg.VersionCompare(pkgv, maxver) > 0:
                maxver = pkgv
        return maxver

    def normalize_bugs(self):
        """Normalize the release-critical bug summaries for testing and unstable

        The method doesn't return any value: it directly modifies the
        object attribute `bugs'.
        """
        # loop on all the package names from the testing and unstable summaries
        for pkg in set(self.bugs['testing'].keys() + self.bugs['unstable'].keys()):

            # make sure that the key is present in both dictionaries
            if pkg not in self.bugs['testing']:
                self.bugs['testing'][pkg] = 0
            elif pkg not in self.bugs['unstable']:
                self.bugs['unstable'][pkg] = 0

            # retrieve the maximum version of the package in testing
            maxvert = self.__maxver(pkg, 'testing')

            # if the package is not available in testing or the bug counts
            # are already the same, there is nothing to do
            if maxvert == None or \
               self.bugs['testing'][pkg] == self.bugs['unstable'][pkg]:
                continue

            # retrieve the maximum version of the package in unstable
            maxveru = self.__maxver(pkg, 'unstable')

            # if the package is not available in unstable, there is nothing to do
            if maxveru == None:
                continue
            # otherwise, if the testing version is greater or equal to the
            # unstable one, align the testing bug count with the unstable one
            elif apt_pkg.VersionCompare(maxvert, maxveru) >= 0:
                self.bugs['testing'][pkg] = self.bugs['unstable'][pkg]

    def read_dates(self, basedir):
        """Read the upload dates for the packages from the specified directory

        The upload dates are read from the `Dates' file within the directory
        specified by the `basedir' parameter. The file contains rows with the
        format:

        <package-name> <version> <date-of-upload>

        The dates are expressed as the number of days since 1970-01-01.

        The method returns a dictionary where the key is the source package
        name and the value is a tuple with two items: the version and the date.
        """
        dates = {}
        filename = os.path.join(basedir, "Dates")
        self.__log("Loading upload data from %s" % filename)
        for line in open(filename):
            l = line.split()
            if len(l) != 3: continue
            try:
                dates[l[0]] = (l[1], int(l[2]))
            except ValueError:
                self.__log("Dates, unable to parse \"%s\"" % line, type="E")
        return dates

    def write_dates(self, basedir, dates):
        """Write the upload dates for the packages to the specified directory

        For a more detailed explanation of the format, please check the method
        read_dates.
        """
        filename = os.path.join(basedir, "Dates")
        self.__log("Writing upload data to %s" % filename)
        f = open(filename, 'w')
        for pkg in sorted(dates.keys()):
            f.write("%s %s %d\n" % ((pkg,) + dates[pkg]))
        f.close()

    def read_urgencies(self, basedir):
        """Read the upload urgencies of the packages from the specified directory

        The upload urgencies are read from the `Urgency' file within the
        directory specified by the `basedir' parameter. The file contains
        rows with the format:

        <package-name> <version> <urgency>

        The method returns a dictionary where the key is the source package
        name and the value is the greatest urgency among the versions of the
        package that are higher than the testing one.
        """
        urgencies = {}
        filename = os.path.join(basedir, "Urgency")
        self.__log("Loading upload urgencies from %s" % filename)
        for line in open(filename):
            l = line.split()
            if len(l) != 3: continue

            # read the minimum days associated with the urgencies
            urgency_old = urgencies.get(l[0], self.options.default_urgency)
            mindays_old = self.MINDAYS.get(urgency_old, self.MINDAYS[self.options.default_urgency])
            mindays_new = self.MINDAYS.get(l[2], self.MINDAYS[self.options.default_urgency])

            # if the new urgency is not higher (i.e. its minimum days are not
            # lower), skip the record
            if mindays_old <= mindays_new:
                continue

            # if the version in the record is not higher than the one in
            # testing, skip the record
            tsrcv = self.sources['testing'].get(l[0], None)
            if tsrcv and apt_pkg.VersionCompare(tsrcv[VERSION], l[1]) >= 0:
                continue

            # if the package is not in unstable or the unstable version is
            # older than the one in the record, skip the record
            usrcv = self.sources['unstable'].get(l[0], None)
            if not usrcv or apt_pkg.VersionCompare(usrcv[VERSION], l[1]) < 0:
                continue

            # update the urgency for the package
            urgencies[l[0]] = l[2]

        return urgencies

    def read_approvals(self, basedir):
        """Read the approval commands from the specified directory

        The approval commands are read from the files contained in the
        `Approved' directory within the directory specified by the `basedir'
        parameter. The names of the files have to be the same as those of
        the users authorized to give approvals.

        The file contains rows with the format:

        <package-name> <version>

        The method returns a dictionary where the key is the source package
        name followed by an underscore and the version number, and the value
        is the user who submitted the command.
        """
        approvals = {}
        for approver in self.options.approvers.split():
            filename = os.path.join(basedir, "Approved", approver)
            self.__log("Loading approvals list from %s" % filename)
            for line in open(filename):
                l = line.split()
                if len(l) != 2: continue
                approvals["%s_%s" % (l[0], l[1])] = approver
        return approvals

    def read_hints(self, basedir):
        """Read the hint commands from the specified directory

        The hint commands are read from the files contained in the `Hints'
        directory within the directory specified by the `basedir' parameter.
        The names of the files have to be the same as those of the users
        authorized to provide hints.

        The file contains rows with the format:

        <command> <package-name>[/<version>]

        The method returns a dictionary where the key is the command, and
        the value is the list of affected packages.
        """
        hints = dict([(k, []) for k in self.HINTS_ALL])

        for who in self.HINTS.keys():
            filename = os.path.join(basedir, "Hints", who)
            self.__log("Loading hints list from %s" % filename)
            for line in open(filename):
                line = line.strip()
                if line == "": continue
                l = line.split()
                if l[0] == 'finished':
                    break
                elif l[0] not in self.HINTS[who]:
                    continue
                elif l[0] in ["easy", "hint", "force-hint"]:
                    hints[l[0]].append((who, [k.split("/") for k in l if "/" in k]))
                elif l[0] in ["block-all"]:
                    hints[l[0]].extend([(y, who) for y in l[1:]])
                elif l[0] in ["block"]:
                    hints[l[0]].extend([(y, who) for y in l[1:]])
                elif l[0] in ["remove", "approve", "unblock", "force", "urgent"]:
                    hints[l[0]].extend([(k.split("/")[0], (k.split("/")[1], who)) for k in l if "/" in k])

        # for the single-package commands, collapse the lists into
        # dictionaries, overriding (and reporting) duplicate entries
        for x in ["block", "block-all", "unblock", "force", "urgent", "remove"]:
            z = {}
            for a, b in hints[x]:
                if a in z:
                    self.__log("Overriding %s[%s] = %s with %s" % (x, a, z[a], b), type="W")
                z[a] = b
            hints[x] = z

        return hints

    def write_heidi(self, basedir, filename):
        """Write the output HeidiResult

        This method writes the output for Heidi, which contains all the
        binary packages and the source packages in the form:

        <pkg-name> <pkg-version> <pkg-architecture> <pkg-section>
        <src-name> <src-version> <src-section>
        """
        filename = os.path.join(basedir, filename)
        self.__log("Writing Heidi results to %s" % filename)
        f = open(filename, 'w')

        # local copy for performance
        sources = self.sources['testing']

        # write the binary packages for every architecture
        for arch in sorted(self.options.architectures):
            binaries = self.binaries['testing'][arch][0]
            for pkg_name in sorted(binaries):
                pkg = binaries[pkg_name]
                pkgv = pkg[VERSION]
                pkgarch = pkg[ARCHITECTURE]
                pkgsec = pkg[SECTION] or 'unknown'
                f.write('%s %s %s %s\n' % (pkg_name, pkgv, pkgarch, pkgsec))

        # write the source packages
        for src_name in sorted(sources):
            src = sources[src_name]
            srcv = src[VERSION]
            srcsec = src[FAKESRC] and 'faux' or src[SECTION] or 'unknown'
            f.write('%s %s source %s\n' % (src_name, srcv, srcsec))

        f.close()

    def write_controlfiles(self, basedir, suite):
        """Write the control files

        This method writes the control files for the binary packages of all
        the architectures and for the source packages.
        """
        sources = self.sources[suite]

        self.__log("Writing new %s control files to %s" % (suite, basedir))
        for arch in self.options.architectures:
            filename = os.path.join(basedir, 'Packages_%s' % arch)
            f = open(filename, 'w')
            binaries = self.binaries[suite][arch][0]
            for pkg in binaries:
                output = "Package: %s\n" % pkg
                for key, k in ((SECTION, 'Section'), (ARCHITECTURE, 'Architecture'), (SOURCE, 'Source'), (VERSION, 'Version'),
                               (PREDEPENDS, 'Pre-Depends'), (DEPENDS, 'Depends'), (PROVIDES, 'Provides'), (CONFLICTS, 'Conflicts')):
                    if not binaries[pkg][key]: continue
                    if key == SOURCE:
                        if binaries[pkg][SOURCE] == pkg:
                            if binaries[pkg][SOURCEVER] != binaries[pkg][VERSION]:
                                source = binaries[pkg][SOURCE] + " (" + binaries[pkg][SOURCEVER] + ")"
                            else: continue
                        else:
                            if binaries[pkg][SOURCEVER] != binaries[pkg][VERSION]:
                                source = binaries[pkg][SOURCE] + " (" + binaries[pkg][SOURCEVER] + ")"
                            else:
                                source = binaries[pkg][SOURCE]
                        output += (k + ": " + source + "\n")
                        if sources[binaries[pkg][SOURCE]][MAINTAINER]:
                            output += ("Maintainer: " + sources[binaries[pkg][SOURCE]][MAINTAINER] + "\n")
                    elif key == PROVIDES:
                        if len(binaries[pkg][key]) > 0:
                            output += (k + ": " + ", ".join(binaries[pkg][key]) + "\n")
                    else:
                        output += (k + ": " + binaries[pkg][key] + "\n")
                f.write(output + "\n")
            f.close()

        filename = os.path.join(basedir, 'Sources')
        f = open(filename, 'w')
        for src in sources:
            output = "Package: %s\n" % src
            for key, k in ((VERSION, 'Version'), (SECTION, 'Section'), (MAINTAINER, 'Maintainer')):
                if not sources[src][key]: continue
                output += (k + ": " + sources[src][key] + "\n")
            f.write(output + "\n")
        f.close()

00836 def write_nuninst(self, nuninst):
00837 """Write the non-installable report"""
00838 f = open(self.options.noninst_status, 'w')
00839 f.write("Built on: " + time.strftime("%Y.%m.%d %H:%M:%S %z", time.gmtime(time.time())) + "\n")
00840 f.write("Last update: " + time.strftime("%Y.%m.%d %H:%M:%S %z", time.gmtime(time.time())) + "\n\n")
00841 f.write("".join([k + ": " + " ".join(nuninst[k]) + "\n" for k in nuninst]))
00842 f.close()
00843
00844 def read_nuninst(self):
00845 """Read the non-installable report"""
00846 f = open(self.options.noninst_status)
00847 nuninst = {}
00848 for r in f:
00849 if ":" not in r: continue
00850 arch, packages = r.strip().split(":", 1)
00851 if arch.split("+", 1)[0] in self.options.architectures:
00852 nuninst[arch] = packages.split()
00853 return nuninst
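The report format read back here is the same one written by write_nuninst: one `arch: pkg1 pkg2 ...` line per architecture, with `arch+all` variants included. A minimal standalone sketch of the parsing step (the helper name is illustrative, independent of the Britney class):

```python
def parse_nuninst(lines, architectures):
    # Parse "arch: pkg1 pkg2" report lines into {arch: [packages]},
    # keeping "arch+all" style keys when the base architecture is known.
    nuninst = {}
    for line in lines:
        if ":" not in line:
            continue
        arch, packages = line.strip().split(":", 1)
        if arch.split("+", 1)[0] in architectures:
            nuninst[arch] = packages.split()
    return nuninst
```

Header lines such as "Built on: ..." are skipped automatically because their key is not a known architecture.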
00854
00855
00856
00857
00858
00859 def same_source(self, sv1, sv2):
00860 """Check if two version numbers are built from the same source
00861
00862 This method returns a boolean value which is true if the two
00863 version numbers specified as parameters are built from the same
00864 source. The main use of this code is to detect binary-NMUs.
00865 """
00866 if sv1 == sv2:
00867 return 1
00868
00869 m = re.match(r'^(.*)\+b\d+$', sv1)
00870 if m: sv1 = m.group(1)
00871 m = re.match(r'^(.*)\+b\d+$', sv2)
00872 if m: sv2 = m.group(1)
00873
00874 if sv1 == sv2:
00875 return 1
00876
00877 if re.search("-", sv1) or re.search("-", sv2):
00878 m = re.match(r'^(.*-[^.]+)\.0\.\d+$', sv1)
00879 if m: sv1 = m.group(1)
00880 m = re.match(r'^(.*-[^.]+\.[^.]+)\.\d+$', sv1)
00881 if m: sv1 = m.group(1)
00882
00883 m = re.match(r'^(.*-[^.]+)\.0\.\d+$', sv2)
00884 if m: sv2 = m.group(1)
00885 m = re.match(r'^(.*-[^.]+\.[^.]+)\.\d+$', sv2)
00886 if m: sv2 = m.group(1)
00887
00888 return (sv1 == sv2)
00889 else:
00890 m = re.match(r'^([^-]+)\.0\.\d+$', sv1)
00891 if m and sv2 == m.group(1): return 1
00892
00893 m = re.match(r'^([^-]+)\.0\.\d+$', sv2)
00894 if m and sv1 == m.group(1): return 1
00895
00896 return 0
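For the common modern case, the check above reduces to stripping a trailing `+bN` binNMU suffix before comparing; the legacy `.0.N`-style encodings handled above are omitted here. A hedged standalone sketch (the helper name is illustrative):

```python
import re

def same_source_simple(sv1, sv2):
    # Treat versions differing only by a "+bN" binNMU suffix as built
    # from the same source; does not cover the legacy ".0.N" encodings.
    strip = lambda v: re.sub(r'\+b\d+$', '', v)
    return strip(sv1) == strip(sv2)
```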
00897
00898 def get_dependency_solvers(self, block, arch, distribution, excluded=[], strict=False):
00899 """Find the packages which satisfy a dependency block
00900
00901 This method returns the list of packages which satisfy a dependency
00902 block (as returned by apt_pkg.ParseDepends) for the given architecture
00903 and distribution.
00904
00905 It returns a tuple with two items: the first is a boolean which is
00906 True if the dependency is satisfied, the second is the list of the
00907 solving packages.
00908 """
00909
00910 packages = []
00911
00912
00913 binaries = self.binaries[distribution][arch]
00914
00915
00916 for name, version, op in block:
00917
00918 if name not in excluded and name in binaries[0]:
00919 package = binaries[0][name]
00920
00921 if op == '' and version == '' or apt_pkg.CheckDep(package[VERSION], op, version):
00922 packages.append(name)
00923
00924
00925 for prov in binaries[1].get(name, []):
00926 if prov in excluded or \
00927 prov not in binaries[0]: continue
00928 package = binaries[0][prov]
00929
00930
00931
00932
00933 if op == '' and version == '' or not strict and apt_pkg.CheckDep(package[VERSION], op, version):
00934 packages.append(prov)
00935 break
00936
00937 return (len(packages) > 0, packages)
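Ignoring version constraints for brevity, the solver loop above can be sketched standalone: a block is an OR-group of alternatives, and both real packages and virtual ones (via a provides table) can satisfy it. All names below are illustrative:

```python
def solve_block(block, packages, provides, excluded=()):
    # block: list of (name, version, op) tuples, as produced by
    # apt_pkg.ParseDepends; version/op are ignored in this sketch.
    solvers = []
    for name, version, op in block:
        if name in packages and name not in excluded:
            solvers.append(name)                # real package
        for prov in provides.get(name, []):
            if prov in packages and prov not in excluded:
                solvers.append(prov)            # provider of a virtual package
    return (len(solvers) > 0, solvers)
```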
00938
00939 def excuse_unsat_deps(self, pkg, src, arch, suite, excuse, excluded=[], conflicts=False):
00940 """Find unsatisfied dependencies for a binary package
00941
00942 This method analyzes the dependencies of the binary package specified
00943 by the parameter `pkg', built from the source package `src', for the
00944 architecture `arch' within the suite `suite'. If the dependency can't
00945 be satisfied in testing and/or unstable, it updates the excuse passed
00946 as parameter.
00947
00948 The dependency fields checked are Pre-Depends and Depends.
00949 """
00950
00951 binary_u = self.binaries[suite][arch][0][pkg]
00952
00953
00954 parse_depends = apt_pkg.ParseDepends
00955 get_dependency_solvers = self.get_dependency_solvers
00956 strict = True
00957
00958
00959 for type_key, type in ((PREDEPENDS, 'Pre-Depends'), (DEPENDS, 'Depends')):
00960 if not binary_u[type_key]:
00961 continue
00962
00963
00964 for block, block_txt in zip(parse_depends(binary_u[type_key]), binary_u[type_key].split(',')):
00965
00966 solved, packages = get_dependency_solvers(block, arch, 'testing', excluded, strict=strict)
00967 if solved:
00968 for p in packages:
00969 if p not in self.binaries[suite][arch][0]: continue
00970 excuse.add_sane_dep(self.binaries[suite][arch][0][p][SOURCE])
00971 continue
00972
00973
00974 solved, packages = get_dependency_solvers(block, arch, suite, [], strict=strict)
00975 packages = [self.binaries[suite][arch][0][p][SOURCE] for p in packages]
00976
00977
00978
00979 if src in packages: continue
00980
00981
00982 if len(packages) == 0:
00983 excuse.addhtml("%s/%s unsatisfiable %s: %s" % (pkg, arch, type, block_txt.strip()))
00984 if arch not in self.options.break_arches.split(): excuse.add_unsat_dep(arch)
00985 continue
00986
00987
00988 for p in packages:
00989 if arch not in self.options.break_arches.split():
00990 excuse.add_dep(p)
00991 else:
00992 excuse.add_break_dep(p, arch)
00993
00994 return True
00995
00996
00997
00998
00999 def should_remove_source(self, pkg):
01000 """Check if a source package should be removed from testing
01001
01002 This method checks if a source package should be removed from the
01003 testing distribution; this happens if the source package is not
01004 present in the unstable distribution anymore.
01005
01006 It returns True if the package can be removed, False otherwise.
01007 In the former case, a new excuse is appended to the object
01008 attribute excuses.
01009 """
01010
01011 if pkg in self.sources['unstable']:
01012 return False
01013
01014 src = self.sources['testing'][pkg]
01015 excuse = Excuse("-" + pkg)
01016 excuse.set_vers(src[VERSION], None)
01017 src[MAINTAINER] and excuse.set_maint(src[MAINTAINER].strip())
01018 src[SECTION] and excuse.set_section(src[SECTION].strip())
01019
01020
01021 if '-' + pkg in self.hints['block']:
01022 excuse.addhtml("Not touching package, as requested by %s (contact debian-release if update is needed)" % self.hints['block']['-' + pkg])
01023 return False
01024
01025 excuse.addhtml("Valid candidate")
01026 self.excuses.append(excuse)
01027 return True
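Set the excuse bookkeeping aside and the rule implemented above is simple: a source leaves testing exactly when it is no longer in unstable and no block hint protects it. A minimal sketch (names are illustrative):

```python
def removal_candidate(pkg, testing, unstable, block_hints=()):
    # A testing source is a removal candidate when it disappeared from
    # unstable and no "-pkg" block hint protects it.
    return pkg in testing and pkg not in unstable \
        and ("-" + pkg) not in block_hints
```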
01028
01029 def should_upgrade_srcarch(self, src, arch, suite):
01030 """Check if binary package should be upgraded
01031
01032 This method checks if a binary package should be upgraded; this can
01033 also happen if the binary package is a binary-NMU for the given arch.
01034 The analysis is performed for the source package specified by the
01035 `src' parameter, checking the architecture `arch' for the distribution
01036 `suite'.
01037
01038 It returns False if the given package doesn't need to be upgraded,
01039 True otherwise. In the former case, a new excuse is appended to
01040 the object attribute excuses.
01041 """
01042
01043 source_t = self.sources['testing'][src]
01044 source_u = self.sources[suite][src]
01045
01046
01047 ref = "%s/%s%s" % (src, arch, suite != 'unstable' and "_" + suite or "")
01048 excuse = Excuse(ref)
01049 excuse.set_vers(source_t[VERSION], source_t[VERSION])
01050 source_u[MAINTAINER] and excuse.set_maint(source_u[MAINTAINER].strip())
01051 source_u[SECTION] and excuse.set_section(source_u[SECTION].strip())
01052
01053
01054
01055 if src in self.hints["remove"] and \
01056 self.same_source(source_t[VERSION], self.hints["remove"][src][0]):
01057 excuse.addhtml("Removal request by %s" % (self.hints["remove"][src][1]))
01058 excuse.addhtml("Trying to remove package, not update it")
01059 excuse.addhtml("Not considered")
01060 self.excuses.append(excuse)
01061 return False
01062
01063
01064 anywrongver = False
01065 anyworthdoing = False
01066
01067
01068 for pkg in sorted(filter(lambda x: x.endswith("/" + arch), source_u[BINARIES]), key=lambda x: x.split("/")[0]):
01069 pkg_name = pkg.split("/")[0]
01070
01071
01072 binary_t = pkg in source_t[BINARIES] and self.binaries['testing'][arch][0][pkg_name] or None
01073 binary_u = self.binaries[suite][arch][0][pkg_name]
01074
01075
01076 pkgsv = self.binaries[suite][arch][0][pkg_name][SOURCEVER]
01077
01078
01079 if binary_u[ARCHITECTURE] == 'all':
01080 excuse.addhtml("Ignoring %s %s (from %s) as it is arch: all" % (pkg_name, binary_u[VERSION], pkgsv))
01081 continue
01082
01083
01084 if not self.same_source(source_t[VERSION], pkgsv):
01085 anywrongver = True
01086 excuse.addhtml("From wrong source: %s %s (%s not %s)" % (pkg_name, binary_u[VERSION], pkgsv, source_t[VERSION]))
01087 break
01088
01089
01090 self.excuse_unsat_deps(pkg_name, src, arch, suite, excuse)
01091
01092
01093
01094 if not binary_t:
01095 excuse.addhtml("New binary: %s (%s)" % (pkg_name, binary_u[VERSION]))
01096 anyworthdoing = True
01097 continue
01098
01099
01100
01101 vcompare = apt_pkg.VersionCompare(binary_t[VERSION], binary_u[VERSION])
01102
01103
01104 if vcompare > 0:
01105 anywrongver = True
01106 excuse.addhtml("Not downgrading: %s (%s to %s)" % (pkg_name, binary_t[VERSION], binary_u[VERSION]))
01107 break
01108
01109 elif vcompare < 0:
01110 excuse.addhtml("Updated binary: %s (%s to %s)" % (pkg_name, binary_t[VERSION], binary_u[VERSION]))
01111 anyworthdoing = True
01112
01113
01114
01115 if not anywrongver and (anyworthdoing or self.sources[suite][src][FAKESRC]):
01116 srcv = self.sources[suite][src][VERSION]
01117 ssrc = self.same_source(source_t[VERSION], srcv)
01118
01119 for pkg in sorted([x.split("/")[0] for x in self.sources['testing'][src][BINARIES] if x.endswith("/"+arch)]):
01120
01121 if self.binaries['testing'][arch][0][pkg][ARCHITECTURE] == 'all':
01122 excuse.addhtml("Ignoring removal of %s as it is arch: all" % (pkg))
01123 continue
01124
01125 if pkg not in self.binaries[suite][arch][0]:
01126 tpkgv = self.binaries['testing'][arch][0][pkg][VERSION]
01127 excuse.addhtml("Removed binary: %s %s" % (pkg, tpkgv))
01128 if ssrc: anyworthdoing = True
01129
01130
01131 if not anywrongver and anyworthdoing:
01132 excuse.addhtml("Valid candidate")
01133 self.excuses.append(excuse)
01134 return True
01135
01136 elif anyworthdoing:
01137 excuse.addhtml("Not considered")
01138 self.excuses.append(excuse)
01139
01140
01141 return False
01142
01143 def should_upgrade_src(self, src, suite):
01144 """Check if source package should be upgraded
01145
01146 This method checks if a source package should be upgraded. The analysis
01147 is performed for the source package specified by the `src' parameter,
01148 for the distribution `suite'.
01149
01150 It returns False if the given package doesn't need to be upgraded,
01151 True otherwise. In the former case, a new excuse is appended to
01152 the object attribute excuses.
01153 """
01154
01155
01156 source_u = self.sources[suite][src]
01157 if src in self.sources['testing']:
01158 source_t = self.sources['testing'][src]
01159
01160 if apt_pkg.VersionCompare(source_t[VERSION], source_u[VERSION]) == 0:
01161 return False
01162 else:
01163 source_t = None
01164
01165
01166 ref = "%s%s" % (src, suite != 'unstable' and "_" + suite or "")
01167 excuse = Excuse(ref)
01168 excuse.set_vers(source_t and source_t[VERSION] or None, source_u[VERSION])
01169 source_u[MAINTAINER] and excuse.set_maint(source_u[MAINTAINER].strip())
01170 source_u[SECTION] and excuse.set_section(source_u[SECTION].strip())
01171
01172
01173 update_candidate = True
01174
01175
01176 if source_t and apt_pkg.VersionCompare(source_u[VERSION], source_t[VERSION]) < 0:
01177 excuse.addhtml("ALERT: %s is newer in testing (%s %s)" % (src, source_t[VERSION], source_u[VERSION]))
01178 self.excuses.append(excuse)
01179 return False
01180
01181
01182 if source_u[FAKESRC]:
01183 excuse.addhtml("%s source package doesn't exist" % (src))
01184 update_candidate = False
01185
01186
01187 urgency = self.urgencies.get(src, self.options.default_urgency)
01188 if not source_t and urgency != self.options.default_urgency:
01189 excuse.addhtml("Ignoring %s urgency setting for NEW package" % (urgency))
01190 urgency = self.options.default_urgency
01191
01192
01193
01194 if src in self.hints["remove"]:
01195 if source_t and self.same_source(source_t[VERSION], self.hints['remove'][src][0]) or \
01196 self.same_source(source_u[VERSION], self.hints['remove'][src][0]):
01197 excuse.addhtml("Removal request by %s" % (self.hints["remove"][src][1]))
01198 excuse.addhtml("Trying to remove package, not update it")
01199 update_candidate = False
01200
01201
01202 blocked = None
01203 if src in self.hints["block"]:
01204 blocked = self.hints["block"][src]
01205 elif 'source' in self.hints["block-all"]:
01206 blocked = self.hints["block-all"]["source"]
01207
01208
01209
01210 if blocked:
01211 unblock = self.hints["unblock"].get(src,(None,None))
01212 if unblock[0] is not None:
01213 if self.same_source(unblock[0], source_u[VERSION]):
01214 excuse.addhtml("Ignoring request to block package by %s, due to unblock request by %s" % (blocked, unblock[1]))
01215 else:
01216 excuse.addhtml("Unblock request by %s ignored due to version mismatch: %s" % (unblock[1], unblock[0]))
01217 else:
01218 excuse.addhtml("Not touching package, as requested by %s (contact debian-release if update is needed)" % (blocked))
01219 update_candidate = False
01220
01221
01222
01223
01224 if suite == 'unstable':
01225 if src not in self.dates:
01226 self.dates[src] = (source_u[VERSION], self.date_now)
01227 elif not self.same_source(self.dates[src][0], source_u[VERSION]):
01228 self.dates[src] = (source_u[VERSION], self.date_now)
01229
01230 days_old = self.date_now - self.dates[src][1]
01231 min_days = self.MINDAYS[urgency]
01232 excuse.setdaysold(days_old, min_days)
01233 if days_old < min_days:
01234 if src in self.hints["urgent"] and self.same_source(source_u[VERSION], self.hints["urgent"][src][0]):
01235 excuse.addhtml("Too young, but urgency pushed by %s" % (self.hints["urgent"][src][1]))
01236 else:
01237 update_candidate = False
01238
01239
01240
01241 pkgs = {src: ["source"]}
01242 for arch in self.options.architectures:
01243 oodbins = {}
01244
01245 for pkg in sorted([x.split("/")[0] for x in self.sources[suite][src][BINARIES] if x.endswith("/"+arch)]):
01246 if pkg not in pkgs: pkgs[pkg] = []
01247 pkgs[pkg].append(arch)
01248
01249
01250 binary_u = self.binaries[suite][arch][0][pkg]
01251 pkgsv = binary_u[SOURCEVER]
01252
01253
01254 if not self.same_source(source_u[VERSION], pkgsv):
01255 if pkgsv not in oodbins:
01256 oodbins[pkgsv] = []
01257 oodbins[pkgsv].append(pkg)
01258 continue
01259
01260
01261
01262 if binary_u[ARCHITECTURE] != 'all' or arch in self.options.nobreakall_arches:
01263 self.excuse_unsat_deps(pkg, src, arch, suite, excuse)
01264
01265
01266
01267
01268 if oodbins:
01269 oodtxt = ""
01270 for v in oodbins.keys():
01271 if oodtxt: oodtxt = oodtxt + "; "
01272 oodtxt = oodtxt + "%s (from <a href=\"http://buildd.debian.org/build.php?" \
01273 "arch=%s&pkg=%s&ver=%s\" target=\"_blank\">%s</a>)" % \
01274 (", ".join(sorted(oodbins[v])), arch, src, v, v)
01275 text = "out of date on <a href=\"http://buildd.debian.org/build.php?" \
01276 "arch=%s&pkg=%s&ver=%s\" target=\"_blank\">%s</a>: %s" % \
01277 (arch, src, source_u[VERSION], arch, oodtxt)
01278
01279 if arch in self.options.fucked_arches:
01280 text = text + " (but %s isn't keeping up, so nevermind)" % (arch)
01281 else:
01282 update_candidate = False
01283
01284 if self.date_now != self.dates[src][1]:
01285 excuse.addhtml(text)
01286
01287
01288 if len(self.sources[suite][src][BINARIES]) == 0:
01289 excuse.addhtml("%s has no binaries on any arch" % src)
01290 update_candidate = False
01291
01292
01293
01294
01295 if suite == 'unstable':
01296 for pkg in pkgs.keys():
01297 if pkg not in self.bugs['testing']:
01298 self.bugs['testing'][pkg] = 0
01299 if pkg not in self.bugs['unstable']:
01300 self.bugs['unstable'][pkg] = 0
01301
01302 if self.bugs['unstable'][pkg] > self.bugs['testing'][pkg]:
01303 excuse.addhtml("%s (%s) is <a href=\"http://bugs.debian.org/cgi-bin/pkgreport.cgi?" \
01304 "which=pkg&data=%s&sev-inc=critical&sev-inc=grave&sev-inc=serious\" " \
01305 "target=\"_blank\">buggy</a>! (%d > %d)" % \
01306 (pkg, ", ".join(pkgs[pkg]), pkg, self.bugs['unstable'][pkg], self.bugs['testing'][pkg]))
01307 update_candidate = False
01308 elif self.bugs['unstable'][pkg] > 0:
01309 excuse.addhtml("%s (%s) is (less) <a href=\"http://bugs.debian.org/cgi-bin/pkgreport.cgi?" \
01310 "which=pkg&data=%s&sev-inc=critical&sev-inc=grave&sev-inc=serious\" " \
01311 "target=\"_blank\">buggy</a>! (%d <= %d)" % \
01312 (pkg, ", ".join(pkgs[pkg]), pkg, self.bugs['unstable'][pkg], self.bugs['testing'][pkg]))
01313
01314
01315 if not update_candidate and src in self.hints["force"] and \
01316 self.same_source(source_u[VERSION], self.hints["force"][src][0]):
01317 excuse.dontinvalidate = 1
01318 excuse.addhtml("Should ignore, but forced by %s" % (self.hints["force"][src][1]))
01319 update_candidate = True
01320
01321
01322 if suite == "tpu":
01323 key = "%s_%s" % (src, source_u[VERSION])
01324 if key in self.approvals:
01325 excuse.addhtml("Approved by %s" % self.approvals[key])
01326 else:
01327 excuse.addhtml("NEEDS APPROVAL BY RM")
01328 update_candidate = False
01329
01330
01331 if update_candidate:
01332 excuse.addhtml("Valid candidate")
01333
01334 else:
01335 excuse.addhtml("Not considered")
01336
01337 self.excuses.append(excuse)
01338 return update_candidate
01339
01340 def reversed_exc_deps(self):
01341 """Reverse the excuses dependencies
01342
01343 This method returns a dictionary where the keys are the package names
01344 and the values are the excuse names which depend on it.
01345 """
01346 res = {}
01347 for exc in self.excuses:
01348 for d in exc.deps:
01349 if d not in res: res[d] = []
01350 res[d].append(exc.name)
01351 return res
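The inversion above is a standard reverse-dependency map; given (name, deps) pairs standing in for the Excuse objects, it can be sketched standalone:

```python
def reverse_deps(excuses):
    # excuses: iterable of (name, deps) pairs; returns dep -> [names].
    res = {}
    for name, deps in excuses:
        for d in deps:
            res.setdefault(d, []).append(name)
    return res
```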
01352
01353 def invalidate_excuses(self, valid, invalid):
01354 """Invalidate impossible excuses
01355
01356 This method invalidates the impossible excuses, which depend
01357 on invalid excuses. The two parameters contains the list of
01358 `valid' and `invalid' excuses.
01359 """
01360
01361 exclookup = {}
01362 for e in self.excuses:
01363 exclookup[e.name] = e
01364
01365
01366 revdeps = self.reversed_exc_deps()
01367
01368
01369 i = 0
01370 while i < len(invalid):
01371
01372 if invalid[i] not in revdeps:
01373 i += 1
01374 continue
01375
01376 if (invalid[i] + "_tpu") in valid:
01377 i += 1
01378 continue
01379
01380 for x in revdeps[invalid[i]]:
01381
01382 if x in valid and exclookup[x].dontinvalidate:
01383 continue
01384
01385
01386
01387 exclookup[x].invalidate_dep(invalid[i])
01388 if x in valid:
01389 p = valid.index(x)
01390 invalid.append(valid.pop(p))
01391 exclookup[x].addhtml("Invalidated by dependency")
01392 exclookup[x].addhtml("Not considered")
01393 i = i + 1
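The worklist loop above propagates invalidation transitively: every excuse that depends on an invalid one becomes invalid itself, which may in turn invalidate others. A simplified standalone sketch, without the _tpu special case (names are illustrative):

```python
def invalidate(valid, invalid, revdeps, dontinvalidate=()):
    # Transitively move every excuse that depends on an invalid one
    # from `valid` to `invalid`.
    i = 0
    while i < len(invalid):
        for x in revdeps.get(invalid[i], []):
            if x in valid and x not in dontinvalidate:
                invalid.append(valid.pop(valid.index(x)))
        i += 1
    return valid, invalid
```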
01394
01395 def write_excuses(self):
01396 """Produce and write the update excuses
01397
01398 This method handles the update excuses generation: the packages are
01399 examined to determine whether they are valid candidates. For the details
01400 of this procedure, please refer to the module docstring.
01401 """
01402
01403 self.__log("Update Excuses generation started", type="I")
01404
01405
01406 sources = self.sources
01407 architectures = self.options.architectures
01408 should_remove_source = self.should_remove_source
01409 should_upgrade_srcarch = self.should_upgrade_srcarch
01410 should_upgrade_src = self.should_upgrade_src
01411
01412
01413
01414 upgrade_me = []
01415
01416
01417 for pkg in sources['testing']:
01418 if should_remove_source(pkg):
01419 upgrade_me.append("-" + pkg)
01420
01421
01422 for pkg in sources['unstable']:
01423 if sources['unstable'][pkg][FAKESRC]: continue
01424
01425
01426 if pkg in sources['testing'] and not sources['testing'][pkg][FAKESRC]:
01427 for arch in architectures:
01428 if should_upgrade_srcarch(pkg, arch, 'unstable'):
01429 upgrade_me.append("%s/%s" % (pkg, arch))
01430
01431
01432 if should_upgrade_src(pkg, 'unstable'):
01433 upgrade_me.append(pkg)
01434
01435
01436 for pkg in sources['tpu']:
01437 if sources['tpu'][pkg][FAKESRC]: continue
01438
01439
01440 if pkg in sources['testing']:
01441 for arch in architectures:
01442 if should_upgrade_srcarch(pkg, arch, 'tpu'):
01443 upgrade_me.append("%s/%s_tpu" % (pkg, arch))
01444
01445
01446 if should_upgrade_src(pkg, 'tpu'):
01447 upgrade_me.append("%s_tpu" % pkg)
01448
01449
01450 for src in self.hints["remove"].keys():
01451 if src in upgrade_me: continue
01452 if ("-"+src) in upgrade_me: continue
01453 if src not in sources['testing']: continue
01454
01455
01456 tsrcv = sources['testing'][src][VERSION]
01457 if not self.same_source(tsrcv, self.hints["remove"][src][0]): continue
01458
01459
01460 upgrade_me.append("-%s" % (src))
01461 excuse = Excuse("-%s" % (src))
01462 excuse.set_vers(tsrcv, None)
01463 excuse.addhtml("Removal request by %s" % (self.hints["remove"][src][1]))
01464 excuse.addhtml("Package is broken, will try to remove")
01465 self.excuses.append(excuse)
01466
01467
01468 self.excuses.sort(lambda x, y: cmp(x.daysold, y.daysold) or cmp(x.name, y.name))
01469
01470
01471 unconsidered = [e.name for e in self.excuses if e.name not in upgrade_me]
01472
01473
01474 for e in self.excuses:
01475 for d in e.deps:
01476 if d not in upgrade_me and d not in unconsidered:
01477 e.addhtml("Impossible dep: %s -> %s" % (e.name, d))
01478 self.invalidate_excuses(upgrade_me, unconsidered)
01479
01480
01481 self.upgrade_me = sorted(upgrade_me)
01482
01483
01484 self.__log("> Writing Excuses to %s" % self.options.excuses_output, type="I")
01485
01486 f = open(self.options.excuses_output, 'w')
01487 f.write("<!DOCTYPE HTML PUBLIC \"-//W3C//DTD HTML 4.01//EN\" \"http://www.w3.org/TR/REC-html40/strict.dtd\">\n")
01488 f.write("<html><head><title>excuses...</title>")
01489 f.write("<meta http-equiv=\"Content-Type\" content=\"text/html;charset=utf-8\"></head><body>\n")
01490 f.write("<p>Generated: " + time.strftime("%Y.%m.%d %H:%M:%S %z", time.gmtime(time.time())) + "</p>\n")
01491 f.write("<ul>\n")
01492 for e in self.excuses:
01493 f.write("<li>%s" % e.html())
01494 f.write("</ul></body></html>\n")
01495 f.close()
01496
01497 self.__log("Update Excuses generation completed", type="I")
01498
01499
01500
01501
01502 def newlyuninst(self, nuold, nunew):
01503 """Return a nuninst statistic with only the new uninstallable packages
01504
01505 This method subtracts the uninstallable packages of the statistic
01506 `nuold` from the statistic `nunew`.
01507
01508 It returns a dictionary with the architectures as keys and the list
01509 of uninstallable packages as values.
01510 """
01511 res = {}
01512 for arch in nuold:
01513 if arch not in nunew: continue
01514 res[arch] = [x for x in nunew[arch] if x not in nuold[arch]]
01515 return res
01516
01517 def get_nuninst(self, requested_arch=None, build=False):
01518 """Return the uninstallability statistic for all the architectures
01519
01520 To calculate the uninstallability counters, the method checks the
01521 installability of all the packages for all the architectures,
01522 tracking dependencies recursively. The architecture-independent
01523 packages are checked only for the `nobreakall`
01524 architectures.
01525
01526 It returns a dictionary with the architectures as keys and the list
01527 of uninstallable packages as values.
01528 """
01529
01530 if not build:
01531 return self.read_nuninst()
01532
01533 nuninst = {}
01534
01535
01536 binaries = self.binaries['testing']
01537 check_installable = self.check_installable
01538
01539
01540
01541
01542 def add_nuninst(pkg, arch):
01543 if pkg not in nuninst[arch]:
01544 nuninst[arch].append(pkg)
01545 for p in binaries[arch][0][pkg][RDEPENDS]:
01546 r = check_installable(p, arch, 'testing', excluded=nuninst[arch], conflicts=True)
01547 if not r:
01548 add_nuninst(p, arch)
01549
01550
01551 for arch in self.options.architectures:
01552 if requested_arch and arch != requested_arch: continue
01553
01554 if arch not in self.options.nobreakall_arches:
01555 skip_archall = True
01556 else: skip_archall = False
01557
01558
01559
01560 nuninst[arch] = []
01561 for pkg_name in binaries[arch][0]:
01562 r = check_installable(pkg_name, arch, 'testing', excluded=nuninst[arch], conflicts=True)
01563 if not r:
01564 add_nuninst(pkg_name, arch)
01565
01566
01567 nuninst[arch + "+all"] = nuninst[arch][:]
01568 if skip_archall:
01569 for pkg in nuninst[arch + "+all"]:
01570 bpkg = binaries[arch][0][pkg]
01571 if bpkg[ARCHITECTURE] == 'all':
01572 nuninst[arch].remove(pkg)
01573
01574
01575 return nuninst
01576
01577 def eval_nuninst(self, nuninst, original=None):
01578 """Return a string which represents the uninstallability counters
01579
01580 This method returns a string which represents the uninstallability
01581 counters reading the uninstallability statistics `nuninst` and, if
01582 present, merging the results with the `original` one.
01583
01584 An example of the output string is:
01585 1+2: i-0:a-0:a-0:h-0:i-1:m-0:m-0:p-0:a-0:m-0:s-2:s-0
01586
01587 where the first part is the number of broken packages in non-break
01588 architectures + the total number of broken packages for all the
01589 architectures.
01590 """
01591 res = []
01592 total = 0
01593 totalbreak = 0
01594 for arch in self.options.architectures:
01595 if arch in nuninst:
01596 n = len(nuninst[arch])
01597 elif original and arch in original:
01598 n = len(original[arch])
01599 else: continue
01600 if arch in self.options.break_arches:
01601 totalbreak = totalbreak + n
01602 else:
01603 total = total + n
01604 res.append("%s-%d" % (arch[0], n))
01605 return "%d+%d: %s" % (total, totalbreak, ":".join(res))
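The counter string described in the docstring above can be built standalone. This sketch differs slightly from eval_nuninst: it counts missing architectures as zero instead of skipping them, and drops the `original` fallback (names are illustrative):

```python
def eval_counters(nuninst, architectures, break_arches=()):
    # Build the "total+break: i-N:a-M" summary, one letter per architecture.
    parts, total, totalbreak = [], 0, 0
    for arch in architectures:
        n = len(nuninst.get(arch, []))
        if arch in break_arches:
            totalbreak += n
        else:
            total += n
        parts.append("%s-%d" % (arch[0], n))
    return "%d+%d: %s" % (total, totalbreak, ":".join(parts))
```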
01606
01607 def eval_uninst(self, nuninst):
01608 """Return a string which represents the uninstallable packages
01609
01610 This method returns a string which represents the uninstallable
01611 packages reading the uninstallability statistics `nuninst`.
01612
01613 An example of the output string is:
01614 * i386: broken-pkg1, broken-pkg2
01615 """
01616 parts = []
01617 for arch in self.options.architectures:
01618 if arch in nuninst and len(nuninst[arch]) > 0:
01619 parts.append(" * %s: %s\n" % (arch,", ".join(sorted(nuninst[arch]))))
01620 return "".join(parts)
01621
01622 def is_nuninst_asgood_generous(self, old, new):
01623 diff = 0
01624 for arch in self.options.architectures:
01625 if arch in self.options.break_arches: continue
01626 diff = diff + (len(new[arch]) - len(old[arch]))
01627 return diff <= 0
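The "generous" comparison above trades regressions on one architecture against improvements on another: only the aggregate count over non-break architectures matters. A standalone sketch (names are illustrative):

```python
def not_worse(old, new, architectures, break_arches=()):
    # "Generous" comparison: the total number of uninstallable packages
    # across non-break architectures must not grow.
    diff = sum(len(new[a]) - len(old[a])
               for a in architectures if a not in break_arches)
    return diff <= 0
```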
01628
01629 def check_installable(self, pkg, arch, suite, excluded=[], conflicts=False):
01630 """Check if a package is installable
01631
01632 This method analyzes the dependencies of the binary package specified
01633 by the parameter `pkg' for the architecture `arch' within the suite
01634 `suite'. If the dependencies can be satisfied in the given `suite` and
01635 the `conflicts` parameter is True, then the co-installability of the
01636 package is also checked, taking conflicts into account.
01637
01638 The dependency fields checked are Pre-Depends and Depends.
01639
01640 The method returns a boolean which is True if the given package is
01641 installable.
01642 """
01643
01644 binary_u = self.binaries[suite][arch][0][pkg]
01645
01646
01647 parse_depends = apt_pkg.ParseDepends
01648 get_dependency_solvers = self.get_dependency_solvers
01649
01650
01651 for type in (PREDEPENDS, DEPENDS):
01652 if not binary_u[type]:
01653 continue
01654
01655
01656 for block in parse_depends(binary_u[type]):
01657
01658 solved, packages = get_dependency_solvers(block, arch, 'testing', excluded, strict=True)
01659 if not solved:
01660 return False
01661
01662
01663
01664
01665 if conflicts:
01666 return self.check_conflicts(pkg, arch, excluded, {}, {})
01667
01668 return True
01669
01670 def check_conflicts(self, pkg, arch, broken, system, conflicts):
01671 """Check if a package can be installed satisfying the conflicts
01672
01673 This method checks if the `pkg` package from the `arch` architecture
01674 can be installed (excluding `broken` packages) within the system
01675 `system` along with all its dependencies. This means that all the
01676 conflict relationships are checked in order to test the
01677 co-installability of the package.
01678
01679 The method returns a boolean which is True if the given package is
01680 co-installable in the given system.
01681 """
01682
01683
01684 binaries = self.binaries['testing'][arch]
01685 parse_depends = apt_pkg.ParseDepends
01686 check_depends = apt_pkg.CheckDep
01687
01688
01689
01690 def unregister_conflicts(pkg, conflicts):
01691 for c in conflicts.keys():
01692 i = 0
01693 while i < len(conflicts[c]):
01694 if conflicts[c][i][3] == pkg:
01695 del conflicts[c][i]
01696 else: i = i + 1
01697 if len(conflicts[c]) == 0:
01698 del conflicts[c]
01699
01700 def remove_package(pkg, system, conflicts):
01701 for k in system:
01702 if pkg in system[k][1]:
01703 system[k][1].remove(pkg)
01704 unregister_conflicts(pkg, conflicts)
01705
01706
01707
01708
01709
01710
01711
01712
01713
01714 def handle_conflict(pkg, source, system, conflicts):
01715
01716 if source not in system or system[source][1] == []:
01717 remove_package(source, system, conflicts)
01718 return (system, conflicts)
01719
01720 if not system[source][1][0]:
01721 return False
01722
01723 unregister_conflicts(source, conflicts)
01724
01725 alternatives = system[source][0]
01726 for alt in alternatives:
01727 if satisfy(alt, [x for x in alternatives if x != alt], pkg_from=system[source][1],
01728 system=system, conflicts=conflicts, excluded=[source]):
01729 remove_package(source, system, conflicts)
01730 return (system, conflicts)
01731
01732 for p in system[source][1]:
01733
01734 if not p: return False
01735
01736 if p == pkg: continue
01737 output = handle_conflict(pkg, p, system, conflicts)
01738 if output:
01739 system, conflicts = output
01740 else: return False
01741 remove_package(source, system, conflicts)
01742 return (system, conflicts)
01743
01744
01745
01746
01747

        def satisfy(pkg, pkg_alt=None, pkg_from=None, system=system, conflicts=conflicts, excluded=[]):
            # if the package is real and already installed, register the new
            # dependant and filter its alternatives, then stop here
            if pkg in binaries[0]:
                if pkg in system:
                    if type(pkg_from) == list:
                        system[pkg][1].extend(pkg_from)
                    else:
                        system[pkg][1].append(pkg_from)
                    system[pkg] = (system[pkg][1], filter(lambda x: x in pkg_alt, system[pkg][0]))
                    return True
                binary_u = binaries[0][pkg]
            else: binary_u = None

            # look for packages providing `pkg`
            providers = []
            if pkg_from and pkg in binaries[1]:
                providers = binaries[1][pkg]
                # the package is real, too: treat its providers as alternatives
                if binary_u:
                    providers = filter(lambda x: (not pkg_alt or x not in pkg_alt) and x != pkg, providers)
                    if not pkg_alt:
                        pkg_alt = []
                    pkg_alt.extend(providers)
                # the package is pure virtual: satisfy it through a provider
                else:
                    # a provider is already installed, the dependency is satisfied
                    if len(filter(lambda x: x in providers and x not in excluded, system)) > 0:
                        return True
                    for p in providers:
                        # skip providers which were already excluded
                        if p in excluded: continue
                        elif satisfy(p, [a for a in providers if a != p], pkg_from):
                            return True
                    # no provider is installable
                    return False

            # the package does not exist at all, fail
            if not binary_u: return False

            # the package is known to be broken: fall back to its providers
            if pkg in broken and pkg_from:
                for p in providers:
                    if p in excluded: continue
                    elif satisfy(p, [a for a in providers if a != p], pkg_from):
                        return True
                return False

            # install the package, recording its alternatives and dependants
            if type(pkg_from) != list:
                pkg_from = [pkg_from]
            system[pkg] = (pkg_alt or [], pkg_from)

            # register the virtual packages it provides
            if binary_u[PROVIDES]:
                for p in binary_u[PROVIDES]:
                    if p in system:
                        # do not register the package as a provider of a
                        # virtual package required by the root package itself
                        if len(system[p][1]) == 1 and system[p][1][0] == None: continue
                        system[p][1].append(pkg)
                    else:
                        system[p] = ([], [pkg])

            # check the conflicts already registered against this package
            if pkg in conflicts:
                for name, version, op, conflicting in conflicts[pkg]:
                    if conflicting in binary_u[PROVIDES] and system[conflicting][1] == [pkg]: continue
                    if op == '' and version == '' or check_depends(binary_u[VERSION], op, version):
                        # the conflict applies: try to resolve it on a copy of
                        # the system, reverting the installation on failure
                        output = handle_conflict(pkg, conflicting, system.copy(), conflicts.copy())
                        if output:
                            system, conflicts = output
                        else:
                            del system[pkg]
                            return False

            # register and check the conflicts of the package itself
            if binary_u[CONFLICTS]:
                for block in map(operator.itemgetter(0), parse_depends(binary_u[CONFLICTS] or [])):
                    name, version, op = block
                    # a conflicting package which is already installed (and is
                    # not just a self-conflict through a provided name)
                    if not (name in binary_u[PROVIDES] and system[name][1] == [pkg]) and \
                       block[0] != pkg and block[0] in system:
                        if block[0] in binaries[0]:
                            binary_c = binaries[0][block[0]]
                        else: binary_c = None
                        if op == '' and version == '' or binary_c and check_depends(binary_c[VERSION], op, version):
                            # the conflict applies: try to resolve it on a copy
                            # of the system, reverting the installation on failure
                            output = handle_conflict(pkg, name, system.copy(), conflicts.copy())
                            if output:
                                system, conflicts = output
                            else:
                                del system[pkg]
                                unregister_conflicts(pkg, conflicts)
                                return False
                    # register the conflict for future installations
                    if block[0] not in conflicts:
                        conflicts[block[0]] = []
                    conflicts[block[0]].append((name, version, op, pkg))

            # collect all the dependencies of the package
            dependencies = []
            for key in (PREDEPENDS, DEPENDS):
                if not binary_u[key]: continue
                dependencies.extend(parse_depends(binary_u[key]))

            # try to satisfy each dependency block through one of its alternatives
            for block in dependencies:
                alternatives = map(operator.itemgetter(0), block)
                valid = False
                for name, version, op in block:
                    # the package is already installed or can be installed
                    if name in system or satisfy(name, [a for a in alternatives if a != name], pkg):
                        valid = True
                        break
                # the dependency cannot be satisfied: remove the package and
                # fall back to its providers, if any
                if not valid:
                    del system[pkg]
                    unregister_conflicts(pkg, conflicts)
                    for p in providers:
                        if satisfy(p, [a for a in providers if a != p], pkg_from):
                            return True
                    return False

            # the package and all its dependencies are installable
            return True

        # check the installability of the requested package
        return satisfy(pkg)

    def doop_source(self, pkg):
        """Apply a change to the testing distribution as requested by `pkg`

        This method applies the changes required by the action `pkg`,
        tracking them so it is possible to revert them.

        The method returns a tuple of the package name, the suite where the
        package comes from, the list of packages affected by the change and
        the dictionary `undo` which can be used to roll back the change.
        """
        undo = {'binaries': {}, 'sources': {}, 'virtual': {}, 'nvirtual': []}

        affected = []
        arch = None

        # local copies for better performance
        sources = self.sources
        binaries = self.binaries['testing']

        # removal of a single-architecture binary package = "-<pkg>/<arch>"
        if pkg[0] == "-" and "/" in pkg:
            pkg_name, arch = pkg.split("/")
            pkg_name = pkg_name[1:]
            if arch.endswith("_tpu"):
                arch, suite = arch.split("_")
            else: suite = "testing"
        # single-architecture update (eg. binNMU) = "<pkg>/<arch>"
        elif "/" in pkg:
            pkg_name, arch = pkg.split("/")
            suite = "unstable"
        # removal of a source package = "-<pkg>"
        elif pkg[0] == "-":
            pkg_name = pkg[1:]
            suite = "testing"
        # update from testing-proposed-updates = "<pkg>_tpu"
        elif pkg.endswith("_tpu"):
            pkg_name = pkg[:-4]
            suite = "tpu"
        # normal update of a source package = "<pkg>"
        else:
            pkg_name = pkg
            suite = "unstable"

        # remove the binaries currently built by the source (unless this is a
        # single-binary removal)
        if not (arch and pkg[0] == '-'):
            if pkg_name in sources['testing']:
                source = sources['testing'][pkg_name]
                for p in source[BINARIES]:
                    binary, parch = p.split("/")
                    if arch and parch != arch: continue
                    # keep binaries no longer built by the new source if they
                    # are candidates for smooth updates
                    if not self.options.compatible and suite == 'unstable' and \
                       binary not in self.binaries[suite][parch][0] and \
                       ('ALL' in self.options.smooth_updates or \
                        binaries[parch][0][binary][SECTION] in self.options.smooth_updates):
                        continue
                    # save the old binary for undo
                    undo['binaries'][p] = binaries[parch][0][binary]
                    # all the reverse dependencies are affected by the change
                    for j in binaries[parch][0][binary][RDEPENDS]:
                        key = (j, parch)
                        if key not in affected: affected.append(key)
                    # remove the provided virtual packages
                    for j in binaries[parch][0][binary][PROVIDES]:
                        key = j + "/" + parch
                        if key not in undo['virtual']:
                            undo['virtual'][key] = binaries[parch][1][j][:]
                        binaries[parch][1][j].remove(binary)
                        if len(binaries[parch][1][j]) == 0:
                            del binaries[parch][1][j]
                    # finally, remove the binary package itself
                    del binaries[parch][0][binary]
                # remove the source package
                if not arch:
                    undo['sources'][pkg_name] = source
                    del sources['testing'][pkg_name]
            else:
                # the source package does not exist in testing: mark it as a
                # fake addition so the undo step knows to remove it again
                undo['sources']['-' + pkg_name] = True

        # single binary removal
        elif pkg_name in binaries[arch][0]:
            undo['binaries'][pkg_name + "/" + arch] = binaries[arch][0][pkg_name]
            for j in binaries[arch][0][pkg_name][RDEPENDS]:
                key = (j, arch)
                if key not in affected: affected.append(key)
            del binaries[arch][0][pkg_name]

        # add the new binary packages (unless we are doing a removal)
        if pkg[0] != "-":
            source = sources[suite][pkg_name]
            for p in source[BINARIES]:
                binary, parch = p.split("/")
                if arch and parch != arch: continue
                key = (binary, parch)
                # obviously, added and modified packages are affected
                if key not in affected: affected.append(key)
                # the binary already exists in testing: save it for undo
                if binary in binaries[parch][0]:
                    undo['binaries'][p] = binaries[parch][0][binary]
                    # all the reverse dependencies are affected by the change
                    for j in binaries[parch][0][binary][RDEPENDS]:
                        key = (j, parch)
                        if key not in affected: affected.append(key)
                    # the reverse conflicts and their full dependency trees
                    # are affected by the change, too
                    for j in binaries[parch][0][binary][RCONFLICTS]:
                        key = (j, parch)
                        if key not in affected: affected.append(key)
                        for p in self.get_full_tree(j, parch, 'testing'):
                            key = (p, parch)
                            if key not in affected: affected.append(key)
                # add or update the binary package
                binaries[parch][0][binary] = self.binaries[suite][parch][0][binary]
                # register the new provided virtual packages
                for j in binaries[parch][0][binary][PROVIDES]:
                    key = j + "/" + parch
                    if j not in binaries[parch][1]:
                        undo['nvirtual'].append(key)
                        binaries[parch][1][j] = []
                    elif key not in undo['virtual']:
                        undo['virtual'][key] = binaries[parch][1][j][:]
                    binaries[parch][1][j].append(binary)
                # all the reverse dependencies are affected by the change
                for j in binaries[parch][0][binary][RDEPENDS]:
                    key = (j, parch)
                    if key not in affected: affected.append(key)

            # register the reverse dependencies of the new binary packages
            for p in source[BINARIES]:
                binary, parch = p.split("/")
                if arch and parch != arch: continue
                self.register_reverses(binary, binaries[parch][0], binaries[parch][1])

            # add or update the source package
            if not arch:
                sources['testing'][pkg_name] = sources[suite][pkg_name]

        # return the package name, the suite, the list of affected packages
        # and the undo dictionary
        return (pkg_name, suite, affected, undo)

    def get_full_tree(self, pkg, arch, suite):
        """Calculate the full dependency tree for the given package

        This method returns the full dependency tree for the package `pkg`,
        inside the `arch` architecture for the suite `suite`.
        """
        packages = [pkg]
        binaries = self.binaries[suite][arch][0]
        l = n = 0
        while len(packages) > l:
            l = len(packages)
            for p in packages[n:]:
                packages.extend([x for x in binaries[p][RDEPENDS] if x not in packages and x in binaries])
            n = l
        return packages
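    # Illustrative sketch (hypothetical data): with a reverse-dependency map
    #   {'a': ['b'], 'b': ['c'], 'c': []}
    # get_full_tree('a', ...) grows the list pass by pass, appending unseen
    # reverse dependencies until a fixed point is reached, and returns
    # ['a', 'b', 'c'].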

    def iter_packages(self, packages, selected, hint=False, nuninst=None):
        """Iterate over the list of actions and apply them one-by-one

        This method applies the changes from `packages` to testing, checking
        the uninstallability counters for every action performed. If an action
        does not improve them, it is reverted. The method returns the new
        uninstallability counters, the remaining (skipped) actions and the
        list of undo information.
        """
        extra = []
        deferred = []
        skipped = []
        mark_passed = False
        position = len(packages)

        if nuninst:
            nuninst_comp = nuninst.copy()
        else:
            nuninst_comp = self.nuninst_orig.copy()

        # local copies for better performance
        check_installable = self.check_installable
        binaries = self.binaries['testing']
        sources = self.sources
        architectures = self.options.architectures
        nobreakall_arches = self.options.nobreakall_arches
        new_arches = self.options.new_arches
        break_arches = self.options.break_arches
        dependencies = self.dependencies
        compatible = self.options.compatible

        # when processing a hint, apply all its actions up front
        pre_process = {}
        if selected and hint:
            for pkg in selected:
                pkg_name, suite, affected, undo = self.doop_source(pkg)
                pre_process[pkg] = (pkg_name, suite, affected, undo)

        lundo = []
        if not hint:
            self.output_write("recur: [%s] %s %d/%d\n" % ("", ",".join(selected), len(packages), len(extra)))

        # loop on the packages (or better, actions)
        while packages:
            pkg = packages.pop(0)

            # the first pass through the list is over: retry the deferred
            # actions, which depended on actions which had been skipped
            if not compatible and not mark_passed and position < 0:
                mark_passed = True
                packages.extend(deferred)
                del deferred
            else: position -= 1

            # defer the action if it depends on a previously skipped one
            if not compatible and not mark_passed:
                defer = False
                for p in dependencies.get(pkg, []):
                    if p in skipped:
                        deferred.append(pkg)
                        skipped.append(pkg)
                        defer = True
                        break
                if defer: continue

            if not hint:
                self.output_write("trying: %s\n" % (pkg))

            better = True
            nuninst = {}

            # apply the changes
            if pkg in pre_process:
                pkg_name, suite, affected, undo = pre_process[pkg]
            else:
                pkg_name, suite, affected, undo = self.doop_source(pkg)
            if hint:
                lundo.append((undo, pkg, pkg_name, suite))

            # check the affected packages, on the single architecture for
            # arch-specific actions or on all architectures otherwise
            for arch in ("/" in pkg and (pkg.split("/")[1].split("_")[0],) or architectures):
                if arch not in nobreakall_arches:
                    skip_archall = True
                else: skip_archall = False

                nuninst[arch] = [x for x in nuninst_comp[arch] if x in binaries[arch][0]]
                nuninst[arch + "+all"] = [x for x in nuninst_comp[arch + "+all"] if x in binaries[arch][0]]
                broken = nuninst[arch + "+all"]
                to_check = [x[0] for x in affected if x[1] == arch]

                # first round: check the installability of the affected
                # packages until the result is stable
                repaired = []
                broken_changed = True
                last_broken = None
                while broken_changed:
                    broken_changed = False
                    for p in to_check:
                        if p == last_broken: break
                        if p not in binaries[arch][0]: continue
                        r = check_installable(p, arch, 'testing', excluded=broken, conflicts=True)
                        if not r and p not in broken:
                            last_broken = p
                            broken.append(p)
                            broken_changed = True
                            if not (skip_archall and binaries[arch][0][p][ARCHITECTURE] == 'all'):
                                nuninst[arch].append(p)
                        elif r and p in broken:
                            last_broken = p
                            repaired.append(p)
                            broken.remove(p)
                            broken_changed = True
                            if not (skip_archall and binaries[arch][0][p][ARCHITECTURE] == 'all'):
                                nuninst[arch].remove(p)

                # second round: propagate the changes to the reverse
                # dependencies of the packages checked in the first round
                l = 0
                broken_changed = True
                last_broken = None
                while broken_changed:
                    broken_changed = False
                    for j in broken + repaired:
                        if j not in binaries[arch][0]: continue
                        for p in binaries[arch][0][j][RDEPENDS]:
                            if p in broken or p not in binaries[arch][0]: continue
                            r = check_installable(p, arch, 'testing', excluded=broken, conflicts=True)
                            if not r and p not in broken:
                                l = -1
                                last_broken = j
                                broken.append(p)
                                broken_changed = True
                                if not (skip_archall and binaries[arch][0][p][ARCHITECTURE] == 'all'):
                                    nuninst[arch].append(p)
                            elif r and p in nuninst[arch + "+all"]:
                                last_broken = p
                                repaired.append(p)
                                broken.remove(p)
                                broken_changed = True
                                if not (skip_archall and binaries[arch][0][p][ARCHITECTURE] == 'all'):
                                    nuninst[arch].remove(p)
                        if l != -1 and last_broken == j: break

                # if we are processing hints, accept the new counters as-is
                if hint:
                    nuninst_comp[arch] = nuninst[arch]
                    nuninst_comp[arch + "+all"] = nuninst[arch + "+all"]
                    continue

                # if the uninstallability counter is worse than before, give up
                if (("/" in pkg and arch not in new_arches) or \
                    (arch not in break_arches)) and len(nuninst[arch]) > len(nuninst_comp[arch]):
                    better = False
                    break

            # if we are processing hints or the package was already accepted, go ahead
            if hint or pkg in selected: continue

            # accept the package if the counters did not worsen
            if better:
                lundo.append((undo, pkg, pkg_name, suite))
                selected.append(pkg)
                packages.extend(extra)
                extra = []
                self.output_write("accepted: %s\n" % (pkg))
                self.output_write(" ori: %s\n" % (self.eval_nuninst(self.nuninst_orig)))
                self.output_write(" pre: %s\n" % (self.eval_nuninst(nuninst_comp)))
                self.output_write(" now: %s\n" % (self.eval_nuninst(nuninst, nuninst_comp)))
                if len(selected) <= 20:
                    self.output_write(" all: %s\n" % (" ".join(selected)))
                else:
                    self.output_write(" most: (%d) .. %s\n" % (len(selected), " ".join(selected[-20:])))
                for k in nuninst:
                    nuninst_comp[k] = nuninst[k]
            # otherwise, revert the action
            else:
                self.output_write("skipped: %s (%d <- %d)\n" % (pkg, len(extra), len(packages)))
                self.output_write(" got: %s\n" % (self.eval_nuninst(nuninst, "/" in pkg and nuninst_comp or None)))
                self.output_write(" * %s: %s\n" % (arch, ", ".join(sorted([b for b in nuninst[arch] if b not in nuninst_comp[arch]]))))

                extra.append(pkg)
                if not mark_passed:
                    skipped.append(pkg)

                # undo the changes (source package)
                for k in undo['sources'].keys():
                    if k[0] == '-':
                        del sources['testing'][k[1:]]
                    else: sources['testing'][k] = undo['sources'][k]

                # undo the changes (new binary packages)
                if pkg[0] != '-' and pkg_name in sources[suite]:
                    for p in sources[suite][pkg_name][BINARIES]:
                        binary, arch = p.split("/")
                        if "/" in pkg and arch != pkg[pkg.find("/")+1:]: continue
                        del binaries[arch][0][binary]

                # undo the changes (binary packages)
                for p in undo['binaries'].keys():
                    binary, arch = p.split("/")
                    if binary[0] == "-":
                        del binaries[arch][0][binary[1:]]
                    else: binaries[arch][0][binary] = undo['binaries'][p]

                # undo the changes (virtual packages)
                for p in undo['nvirtual']:
                    j, arch = p.split("/")
                    del binaries[arch][1][j]
                for p in undo['virtual']:
                    j, arch = p.split("/")
                    if j[0] == '-':
                        del binaries[arch][1][j[1:]]
                    else: binaries[arch][1][j] = undo['virtual'][p]

        # when processing hints, return the new counters and the undo list now
        if hint:
            return (nuninst_comp, [], lundo)

        self.output_write(" finish: [%s]\n" % ",".join(selected))
        self.output_write("endloop: %s\n" % (self.eval_nuninst(self.nuninst_orig)))
        self.output_write("    now: %s\n" % (self.eval_nuninst(nuninst_comp)))
        self.output_write(self.eval_uninst(self.newlyuninst(self.nuninst_orig, nuninst_comp)))
        self.output_write("\n")

        return (nuninst_comp, extra, lundo)

    def do_all(self, maxdepth=0, init=None, actions=None):
        """Testing update runner

        This method tries to update testing, checking the uninstallability
        counters before and after the actions to decide whether the update
        was successful or not.
        """
        selected = []
        if actions:
            upgrade_me = actions[:]
        else:
            upgrade_me = self.upgrade_me[:]
        nuninst_start = self.nuninst_orig

        # these are special parameters for hints processing
        undo = False
        force = False
        earlyabort = False
        if maxdepth == "easy" or maxdepth < 0:
            force = maxdepth < 0
            earlyabort = True
            maxdepth = 0

        # if there is a list of initial packages, check and select them first
        if init:
            self.output_write("leading: %s\n" % (",".join(init)))
            for x in init:
                if x not in upgrade_me:
                    self.output_write("failed: %s\n" % (x))
                    return None
                selected.append(x)
                upgrade_me.remove(x)

        self.output_write("start: %s\n" % self.eval_nuninst(nuninst_start))
        self.output_write("orig: %s\n" % self.eval_nuninst(nuninst_start))

        if earlyabort:
            extra = upgrade_me[:]
            (nuninst_end, extra, lundo) = self.iter_packages(init, selected, hint=True)
            undo = True
            self.output_write("easy: %s\n" % (self.eval_nuninst(nuninst_end)))
            self.output_write(self.eval_uninst(self.newlyuninst(nuninst_start, nuninst_end)) + "\n")
            if not force and not self.is_nuninst_asgood_generous(self.nuninst_orig, nuninst_end):
                nuninst_end, extra = None, None
        else:
            lundo = []
            if init:
                (nuninst_end, extra, tundo) = self.iter_packages(init, selected, hint=True)
                lundo.extend(tundo)
                undo = True
            else: nuninst_end = None
            (nuninst_end, extra, tundo) = self.iter_packages(upgrade_me, selected, nuninst=nuninst_end)
            lundo.extend(tundo)
            if not self.is_nuninst_asgood_generous(self.nuninst_orig, nuninst_end):
                nuninst_end, extra = None, None

        if nuninst_end:
            self.output_write("Apparently successful\n")
            self.output_write("final: %s\n" % ",".join(sorted(selected)))
            self.output_write("start: %s\n" % self.eval_nuninst(nuninst_start))
            self.output_write(" orig: %s\n" % self.eval_nuninst(self.nuninst_orig))
            self.output_write(" end: %s\n" % self.eval_nuninst(nuninst_end))
            if force:
                self.output_write("force breaks:\n")
                self.output_write(self.eval_uninst(self.newlyuninst(nuninst_start, nuninst_end)) + "\n")
            self.output_write("SUCCESS (%d/%d)\n" % (len(actions or self.upgrade_me), len(extra)))
            self.nuninst_orig = nuninst_end
            if not actions:
                self.upgrade_me = sorted(extra)
                if not self.options.compatible:
                    self.sort_actions()
        else:
            self.output_write("FAILED\n")
            if not undo: return

            # undo all the changes
            for (undo, pkg, pkg_name, suite) in lundo:
                # undo the changes (source package)
                for k in undo['sources'].keys():
                    if k[0] == '-':
                        del self.sources['testing'][k[1:]]
                    else: self.sources['testing'][k] = undo['sources'][k]

                # undo the changes (new binary packages)
                if pkg[0] != '-' and pkg_name in self.sources[suite]:
                    for p in self.sources[suite][pkg_name][BINARIES]:
                        binary, arch = p.split("/")
                        if "/" in pkg and arch != pkg[pkg.find("/")+1:]: continue
                        del self.binaries['testing'][arch][0][binary]

                # undo the changes (binary packages)
                for p in undo['binaries'].keys():
                    binary, arch = p.split("/")
                    if binary[0] == "-":
                        del self.binaries['testing'][arch][0][binary[1:]]
                    else: self.binaries['testing'][arch][0][binary] = undo['binaries'][p]

                # undo the changes (virtual packages)
                for p in undo['nvirtual']:
                    j, arch = p.split("/")
                    del self.binaries['testing'][arch][1][j]
                for p in undo['virtual']:
                    j, arch = p.split("/")
                    if j[0] == '-':
                        del self.binaries['testing'][arch][1][j[1:]]
                    else: self.binaries['testing'][arch][1][j] = undo['virtual'][p]

    def upgrade_testing(self):
        """Upgrade testing using the unstable packages

        This method tries to upgrade testing using the packages from unstable.
        Before running the do_all method, it tries the easy and force-hint
        commands.
        """

        self.__log("Starting the upgrade test", type="I")
        self.__output = open(self.options.upgrade_output, 'w')
        self.output_write("Generated on: %s\n" % (time.strftime("%Y.%m.%d %H:%M:%S %z", time.gmtime(time.time()))))
        self.output_write("Arch order is: %s\n" % ", ".join(self.options.architectures))

        self.__log("> Calculating current uninstallability counters", type="I")
        self.nuninst_orig = self.get_nuninst()

        if not self.options.actions:
            # process `easy' hints
            for x in self.hints['easy']:
                self.do_hint("easy", x[0], x[1])

            # process `force-hint' hints
            for x in self.hints["force-hint"]:
                self.do_hint("force-hint", x[0], x[1])

        # run the first round of the upgrade
        self.__log("> First loop on the packages with depth = 0", type="I")

        # separate the packages restricted to the break architectures from
        # the others, running the main loop on each set
        allpackages = []
        normpackages = self.upgrade_me[:]
        archpackages = {}
        for a in self.options.break_arches.split():
            archpackages[a] = [p for p in normpackages if p.endswith("/" + a)]
            normpackages = [p for p in normpackages if not p.endswith("/" + a)]
        self.upgrade_me = normpackages
        self.output_write("info: main run\n")
        self.do_all()
        allpackages += self.upgrade_me
        for a in self.options.break_arches.split():
            backup = self.options.break_arches
            self.options.break_arches = " ".join([x for x in self.options.break_arches.split() if x != a])
            self.upgrade_me = archpackages[a]
            self.output_write("info: broken arch run for %s\n" % (a))
            self.do_all()
            allpackages += self.upgrade_me
            self.options.break_arches = backup
        self.upgrade_me = allpackages

        if self.options.actions:
            return

        # process `hint' hints (50 at most)
        hintcnt = 0
        for x in self.hints["hint"][:50]:
            if hintcnt > 50:
                self.output_write("Skipping remaining hints...")
                break
            if self.do_hint("hint", x[0], x[1]):
                hintcnt += 1

        # run the auto hinter
        if not self.options.compatible:
            self.auto_hinter()

        # obsolete packages left in testing by smooth updates (computed here
        # so the report below works even when smooth updates are disabled)
        removals = self.old_libraries()
        if not self.options.compatible and len(self.options.smooth_updates) > 0:
            self.__log("> Removing old packages left in testing from smooth updates", type="I")
            if len(removals) > 0:
                self.output_write("Removing packages left in testing for smooth updates (%d):\n%s" % \
                    (len(removals), self.old_libraries_format(removals)))
                self.do_all(actions=removals)
                removals = self.old_libraries()

        if not self.options.compatible:
            self.output_write("List of old libraries in testing (%d):\n%s" % \
                (len(removals), self.old_libraries_format(removals)))

        # output files
        if not self.options.dry_run:
            # re-write the control files
            if self.options.control_files:
                self.write_controlfiles(self.options.testing, 'testing')

            # write bugs and dates
            self.write_bugs(self.options.testing, self.bugs['testing'])
            self.write_dates(self.options.testing, self.dates)

            # write the HeidiResult
            self.write_heidi(self.options.testing, 'HeidiResult')

        self.__output.close()
        self.__log("Test completed!", type="I")

    def do_hint(self, type, who, pkgvers):
        """Process hints

        This method processes `easy`, `hint` and `force-hint` hints. If the
        requested version is not in unstable, then the hint is skipped.
        """
        hintinfo = {"easy": "easy",
                    "hint": 0,
                    "force-hint": -1,}

        self.__log("> Processing '%s' hint from %s" % (type, who), type="I")
        self.output_write("Trying %s from %s: %s\n" % (type, who, " ".join(["%s/%s" % (p,v) for (p,v) in pkgvers])))

        ok = True
        # loop on the requested packages and versions
        for pkg, v in pkgvers:
            # remove the architecture
            if "/" in pkg:
                pkg = pkg[:pkg.find("/")]

            # skip removal requests
            if pkg[0] == "-":
                continue
            # handle testing-proposed-updates
            elif pkg.endswith("_tpu"):
                pkg = pkg[:-4]
                if pkg not in self.sources['tpu']: continue
                if apt_pkg.VersionCompare(self.sources['tpu'][pkg][VERSION], v) != 0:
                    self.output_write(" Version mismatch, %s %s != %s\n" % (pkg, v, self.sources['tpu'][pkg][VERSION]))
                    ok = False
            # check that the version in unstable is the requested one
            elif pkg not in self.sources['unstable']:
                self.output_write(" Source %s has no version in unstable\n" % pkg)
                ok = False
            elif apt_pkg.VersionCompare(self.sources['unstable'][pkg][VERSION], v) != 0:
                self.output_write(" Version mismatch, %s %s != %s\n" % (pkg, v, self.sources['unstable'][pkg][VERSION]))
                ok = False
        if not ok:
            self.output_write("Not using hint\n")
            return False

        self.do_all(hintinfo[type], map(operator.itemgetter(0), pkgvers))
        return True
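    # For example, a hint line such as "easy glibc/2.3.5-3 gcc-4.0/4.0.2-2"
    # (versions made up for illustration) reaches this method as
    # pkgvers = [("glibc", "2.3.5-3"), ("gcc-4.0", "4.0.2-2")], and the
    # hintinfo table above maps the hint type to the maxdepth argument of
    # do_all: "easy" for easy hints, 0 for plain hints, -1 for force-hints.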

    def sort_actions(self):
        """Sort actions in a smart way

        This method sorts the list of actions in a smart way. In detail, it
        uses as its base sort the number of days the excuse is old, then
        reorders packages so the ones with the most reverse dependencies are
        at the end of the loop. If an action depends on another one, it is
        put after it.
        """
        upgrade_me = [x.name for x in self.excuses if x.name in self.upgrade_me]
        for e in self.excuses:
            if e.name not in upgrade_me: continue
            # move the removal actions to the end of the list
            elif e.name[0] == '-':
                upgrade_me.remove(e.name)
                upgrade_me.append(e.name)
            # otherwise, put the action right after the excuses it depends on
            else:
                pos = []
                udeps = [upgrade_me.index(x) for x in e.deps if x in upgrade_me and x != e.name]
                if len(udeps) > 0:
                    pos.append(max(udeps))
                sdeps = [upgrade_me.index(x) for x in e.sane_deps if x in upgrade_me and x != e.name]
                if len(sdeps) > 0:
                    pos.append(min(sdeps))
                if len(pos) == 0: continue
                upgrade_me.remove(e.name)
                upgrade_me.insert(max(pos)+1, e.name)
                self.dependencies[e.name] = e.deps

        # replace the list of actions with the new one
        self.upgrade_me = upgrade_me
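    # Illustrative example (hypothetical names): with upgrade_me == ["b", "a"]
    # and excuse "b" listing "a" among its deps, "b" is removed and
    # re-inserted right after "a", yielding ["a", "b"]; a removal action such
    # as "-old" is simply moved to the end of the list.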

    def auto_hinter(self):
        """Auto-hint circular dependencies

        This method tries to auto-hint circular dependencies by analyzing the
        relationships between update excuses. If the excuses form a circular
        dependency, which we already know does not work with the standard
        do_all algorithm, it tries to `easy` them.
        """
        self.__log("> Processing hints from the auto hinter", type="I")

        # consider only excuses which are candidates
        excuses = dict([(x.name, x) for x in self.excuses if x.name in self.upgrade_me])

        def find_related(e, hint, first=False):
            if e not in excuses:
                return False
            excuse = excuses[e]
            if e in self.sources['testing'] and self.sources['testing'][e][VERSION] == excuse.ver[1]:
                return True
            if not first:
                hint[e] = excuse.ver[1]
            if len(excuse.deps) == 0:
                return hint
            for p in excuse.deps:
                if p in hint: continue
                if not find_related(p, hint):
                    return False
            return hint

        # loop on the excuses, grouping together the related ones
        cache = []
        for e in excuses:
            excuse = excuses[e]
            if e in self.sources['testing'] and self.sources['testing'][e][VERSION] == excuse.ver[1] or \
               len(excuse.deps) == 0:
                continue
            hint = find_related(e, {}, True)
            if hint and e in hint and hint not in cache:
                self.do_hint("easy", "autohinter", hint.items())
                cache.append(hint)

    def old_libraries(self):
        """Detect old libraries left in testing for smooth transitions

        This method detects old libraries which are in testing but are no
        longer built from the source package: they are still there because
        other packages still depend on them, but they should be removed as
        soon as possible.
        """
        sources = self.sources['testing']
        testing = self.binaries['testing']
        unstable = self.binaries['unstable']
        removals = []
        for arch in self.options.architectures:
            for pkg_name in testing[arch][0]:
                pkg = testing[arch][0][pkg_name]
                if pkg_name not in unstable[arch][0] and \
                   not self.same_source(sources[pkg[SOURCE]][VERSION], pkg[SOURCEVER]):
                    removals.append("-" + pkg_name + "/" + arch)
        return removals

    def old_libraries_format(self, libs):
        """Format old libraries in a smart table"""
        libraries = {}
        for i in libs:
            pkg, arch = i.split("/")
            pkg = pkg[1:]
            if pkg in libraries:
                libraries[pkg].append(arch)
            else:
                libraries[pkg] = [arch]
        return "\n".join([" " + k + ": " + " ".join(libraries[k]) for k in libraries]) + "\n"
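    # For example, old_libraries_format(["-libfoo1/amd64", "-libfoo1/i386"])
    # (a made-up library name) returns " libfoo1: amd64 i386\n": one line per
    # library, listing the architectures on which it is still present.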

    def output_write(self, msg):
        """Simple wrapper for output writing"""
        self.__output.write(msg)

    def main(self):
        """Main method

        This is the entry point for the class: it includes the list of calls
        for the member methods which will produce the output files.
        """
        # if no actions were requested, build the excuses and sort the actions
        if not self.options.actions:
            self.write_excuses()
            if not self.options.compatible:
                self.sort_actions()
        # otherwise, use the actions provided by the user
        else: self.upgrade_me = self.options.actions.split()

        # run the upgrade test
        self.upgrade_testing()

if __name__ == '__main__':
    Britney().main()