A sorted list of non-dominated unique objective-pairs.

Non-domination here means that each element is smaller in at least one objective than every other element. The list is sorted (naturally) by the first objective. No equal entries in either objective exist in the list (assuming it is in a consistent state).

The operation

>>> from moarchiving import BiobjectiveNondominatedSortedList
>>> any_list = BiobjectiveNondominatedSortedList(any_list)  # doctest:+SKIP

sorts and prunes the pair list any_list to become a consistent nondominated sorted archive.

Afterwards, the methods add and add_list keep the list always in a consistent state. If a reference point was given on initialization, the hypervolume of the archive is also computed and kept up-to-date.

The contributing_hypervolume and hypervolume_improvement methods give the uncrowded hypervolume improvement, respectively with or without removing the input from the archive before the computation; see https://arxiv.org/abs/1904.08823

Removing elements with pop or del keeps the archive sorted and non-dominated but does not update the hypervolume, which hence becomes inconsistent.

>>> a = BiobjectiveNondominatedSortedList([[1,0.9], [0,1], [0,2]])
>>> a
[[0, 1], [1, 0.9]]
>>> a.add([0, 1])  # doesn't change anything, [0, 1] is not duplicated
>>> BiobjectiveNondominatedSortedList(
...     [[-0.749, -1.188], [-0.557, 1.1076],
...     [0.2454, 0.4724], [-1.146, -0.110]])
[[-1.146, -0.11], [-0.749, -1.188]]
>>> a._asserts()  # consistency assertions
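
As an illustration of the pop/del caveat above (a minimal sketch, not part of the original docstring; the expected output assumes that the hypervolume attribute is only cached and not recomputed on access):

>>> a = BiobjectiveNondominatedSortedList([[1, 0.9], [0, 1]], reference_point=[2, 2])
>>> hv0 = a.hypervolume
>>> del a[0]              # the archive stays sorted and non-dominated...
>>> a.hypervolume == hv0  # ...but the stored hypervolume was not updated
True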

Details: This list doesn't prevent the user from inserting a new element anywhere and hence getting into an inconsistent state. Inheriting from sortedcontainers.SortedList would ensure that the list remains at least sorted.

See also:
https://pypi.org/project/sortedcontainers
https://code.activestate.com/recipes/577197-sortedcollection/
https://pythontips.com/2016/04/24/python-sorted-collections/

Method __init__: list_of_f_pairs does not need to be sorted.
Method add: insert f_pair in self if it is not (weakly) dominated.
Method remove: remove element f_pair.
Method add_list: insert a list of f-pairs which doesn't need to be sorted.
Method merge: merge in a sorted list of f-pairs.
Method copy: return a "deep" copy of self.
Method bisect_left: return index where f_pair may need to be inserted.
Method dominates: return True if any element of self dominates or is equal to f_pair.
Method dominates_with: return True if self[idx] dominates or is equal to f_pair.
Method dominators: return the list of all f_pair-dominating elements in self, including an equal element.
Method in_domain: return True if f_pair is dominating the reference point.
Method hypervolume: hypervolume of the entire list w.r.t. the "initial" reference point.
Method contributing_hypervolumes: list of contributing hypervolumes.
Method contributing_hypervolume: return contributing hypervolume of element idx.
Method distance_to_pareto_front: distance of a dominated f_pair, also considering the reference domain.
Method distance_to_hypervolume_area: undocumented.
Method hypervolume_improvement: return how much f_pair would improve the hypervolume.
Method compute_hypervolume: return hypervolume w.r.t. reference_point.
Method compute_hypervolumes: deprecated, subject to removal, see compute_hypervolume and contributing_hypervolumes.
Method prune: remove dominated or equal entries assuming that the list is sorted.
Method discarded: list of f-pairs discarded in the last relevant method call.
Method _add_at: add f_pair at position idx and remove dominated elements.
Method _set_HV: set current hypervolume value using self.reference_point.
Method _subtract_HV: remove contributing hypervolumes of elements self[idx0] to self[idx1 - 1].
Method _add_HV: add contributing hypervolume of self[idx] to hypervolume.
Method _state: undocumented.
Static Method _random_archive: undocumented.
Method _asserts: make all kinds of consistency assertions.
def __init__(self, list_of_f_pairs=None, reference_point=None, sort=sorted):

list_of_f_pairs does not need to be sorted.

f-pairs beyond the reference_point are pruned away. The reference_point is also used to compute the hypervolume.

sort is a sorting function and sort=None will prevent a sort, which can be useful if the list_of_f_pairs is already sorted.

CAVEAT: the interface, in particular the positional interface, may change in future versions.
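
A minimal usage sketch (added here for illustration; the hypervolume value 1.25 is computed by hand for this data):

>>> from moarchiving import BiobjectiveNondominatedSortedList
>>> nda = BiobjectiveNondominatedSortedList(
...     [[2, 3], [1, 1], [0.5, 1.5]], reference_point=[2, 2])
>>> nda  # [2, 3] does not dominate the reference point and is pruned
[[0.5, 1.5], [1, 1]]
>>> float(nda.hypervolume)
1.25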

def add(self, f_pair):

insert f_pair in self if it is not (weakly) dominated.

Return index at which the insertion took place or None. The list remains sorted in the process.

The list remains non-dominated with unique elements, which means that some or many or even all of its present elements may be removed.

Implementation detail: For performance reasons, insert is avoided in favor of __setitem__, if possible.
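
A short sketch of the return value (for illustration, not from the original docstring):

>>> from moarchiving import BiobjectiveNondominatedSortedList as NDA
>>> a = NDA([[1, 1]])
>>> a.add([2, 2]) is None  # dominated, hence not inserted
True
>>> a.add([0.5, 2])  # inserted in front of [1, 1], at index 0
0
>>> a
[[0.5, 2], [1, 1]]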

def _add_at(self, idx, f_pair):

add f_pair at position idx and remove dominated elements.

This method assumes that f_pair is not weakly dominated by self and that idx is the correct insertion place e.g. acquired by bisect_left.

def remove(self, f_pair):

remove element f_pair.

Raises a ValueError (like list) if f_pair is not in self. To avoid the error, first check whether f_pair is in self, for example:

>>> from moarchiving import BiobjectiveNondominatedSortedList
>>> nda = BiobjectiveNondominatedSortedList([[2, 3]])
>>> f_pair = [1, 2]
>>> assert [2, 3] in nda and f_pair not in nda
>>> if f_pair in nda:
...     nda.remove(f_pair)
>>> nda = BiobjectiveNondominatedSortedList._random_archive(p_ref_point=1)
>>> for t in [None, float]:
...     if t:
...         nda.hypervolume_final_float_type = t
...         nda.hypervolume_computation_float_type = t
...     for pair in list(nda):
...         len_ = len(nda)
...         state = nda._state()
...         nda.remove(pair)
...         assert len(nda) == len_ - 1
...         if 100 * pair[0] - int(100 * pair[0]) < 0.7:
...             res = nda.add(pair)
...             assert all(state[i] == nda._state()[i] for i in (
...                [0, 3] if nda.hypervolume_final_float_type is float else [0, 2, 3]))

Return None (like list.remove).

def add_list(self, list_of_f_pairs):

insert a list of f-pairs which doesn't need to be sorted.

This is just a shortcut for looping over add, but discarded now contains the discarded elements from all add operations.

>>> from moarchiving import BiobjectiveNondominatedSortedList
>>> arch = BiobjectiveNondominatedSortedList()
>>> list_of_f_pairs = [[1, 2], [0, 3]]
>>> for f_pair in list_of_f_pairs:
...     arch.add(f_pair)  # return insert index or None
0
0
>>> arch == sorted(list_of_f_pairs)  # both entries are nondominated
True
>>> arch.compute_hypervolume([3, 4]) == 5.0
True

Return None.

Details: discarded does not contain elements of list_of_f_pairs. When list_of_f_pairs is already sorted, merge may have a small performance benefit.

def merge(self, list_of_f_pairs):

merge in a sorted list of f-pairs.

The list can contain dominated pairs, which are discarded during the merge.

Return None.

Details: merging 200 into 100_000 takes 3e-4s vs 4e-4s with add_list. The discarded property is not consistent with the overall merge.
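
A small sketch for illustration (the input list is already sorted by the first objective, as required):

>>> from moarchiving import BiobjectiveNondominatedSortedList as NDA
>>> arch = NDA([[0, 2], [2, 0]])
>>> arch.merge([[1, 1], [3, 3]])  # [3, 3] is dominated and hence discarded
>>> arch
[[0, 2], [1, 1], [2, 0]]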

def copy(self):
return a "deep" copy of self
def bisect_left(self, f_pair, lowest_index=0):

return index where f_pair may need to be inserted.

Smaller indices have a strictly better f1 value, or they have an equal f1 and a better f2 value.

lowest_index restricts the search from below.

Details: This method does a binary search in self using bisect.bisect_left.
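
For illustration (a sketch; the indices follow from the lexicographic comparison described above):

>>> from moarchiving import BiobjectiveNondominatedSortedList as NDA
>>> a = NDA([[0, 2], [1, 1], [2, 0]])
>>> a.bisect_left([1.5, 5])  # would go between [1, 1] and [2, 0]
2
>>> a.bisect_left([1, 1])  # index of the equal element
1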

def dominates(self, f_pair):

return True if any element of self dominates or is equal to f_pair.

Otherwise return False.

>>> from moarchiving import BiobjectiveNondominatedSortedList as NDA
>>> a = NDA([[0.39, 0.075], [0.0087, 0.14]])
>>> a.dominates(a[0])  # is always True if `a` is not empty
True
>>> a.dominates([-1, 33]) or a.dominates([33, -1])
False
>>> a._asserts()

See also bisect_left to find the closest index.

def dominates_with(self, idx, f_pair):

return True if self[idx] dominates or is equal to f_pair.

Otherwise return False or None if idx is out-of-range.

>>> from moarchiving import BiobjectiveNondominatedSortedList as NDA
>>> NDA().dominates_with(0, [1, 2]) is None  # empty NDA
True
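
A further sketch for illustration (self[0] = [0, 2] dominates [0.5, 2.5], while self[1] = [1, 1] does not):

>>> a = NDA([[0, 2], [1, 1]])
>>> a.dominates_with(0, [0.5, 2.5]), a.dominates_with(1, [0.5, 2.5])
(True, False)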
def dominators(self, f_pair, number_only=False):

return the list of all f_pair-dominating elements in self,

including an equal element. len(....dominators(...)) is hence the number of dominating elements, which can also be obtained with number_only=True without creating the list.

>>> from moarchiving import BiobjectiveNondominatedSortedList as NDA
>>> a = NDA([[1.2, 0.1], [0.5, 1]])
>>> len(a)
2
>>> a.dominators([2, 3]) == a
True
>>> a.dominators([0.5, 1])
[[0.5, 1]]
>>> len(a.dominators([0.6, 3])), a.dominators([0.6, 3], number_only=True)
(1, 1)
>>> a.dominators([0.5, 0.9])
[]
def in_domain(self, f_pair, reference_point=None):

return True if f_pair is dominating the reference point,

False otherwise. True means that f_pair contributes to the hypervolume if not dominated by other elements.

f_pair may also be an index in self in which case self[f_pair] is tested to be in-domain.

>>> from moarchiving import BiobjectiveNondominatedSortedList as NDA
>>> a = NDA([[2.2, 0.1], [0.5, 1]], reference_point=[2, 2])
>>> assert len(a) == 1
>>> a.in_domain([0, 0])
True
>>> a.in_domain([2, 1])
False
>>> all(a.in_domain(ai) for ai in a)
True
>>> a.in_domain(0)
True

TODO: improve name?

@property
def hypervolume(self):

hypervolume of the entire list w.r.t. the "initial" reference point.

Raise ValueError when no reference point was given initially.

>>> from moarchiving import BiobjectiveNondominatedSortedList as NDA
>>> a = NDA([[0.5, 0.4], [0.3, 0.7]], [2, 2.1])
>>> a._asserts()
>>> a.reference_point == [2, 2.1]
True
>>> abs(a.hypervolume - a.compute_hypervolume(a.reference_point)) < 1e-11
True
>>> a.add([0.2, 0.8])
0
>>> a._asserts()
>>> abs(a.hypervolume - a.compute_hypervolume(a.reference_point)) < 1e-11
True
>>> a.add([0.3, 0.6])
1
>>> a._asserts()
>>> abs(a.hypervolume - a.compute_hypervolume(a.reference_point)) < 1e-11
True
@property
def contributing_hypervolumes(self):

list of contributing hypervolumes.

Elements in the list are of type self.hypervolume_computation_float_type. Conversion to float in a list comprehension should always be possible.

Changing this list will have unexpected consequences if self.maintain_contributing_hypervolumes.

Details: The "initial" reference point is used for the outer points. If none is given, inf is used as reference. For the time being, the contributing hypervolumes are computed each time from scratch.

See also: contributing_hypervolume.
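
For illustration (a sketch; the values 1.0 and 4.0 are the by-hand computed exclusive contributions w.r.t. the reference point [4, 4]):

>>> from moarchiving import BiobjectiveNondominatedSortedList as NDA
>>> a = NDA([[1, 3], [2, 1]], reference_point=[4, 4])
>>> [float(hv) for hv in a.contributing_hypervolumes]
[1.0, 4.0]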
def contributing_hypervolume(self, idx):

return contributing hypervolume of element idx.

If idx is an f_pair, return contributing hypervolume of element with value f_pair. If f_pair is not in self, return hypervolume_improvement(f_pair).

The return type is self.hypervolume_computation_float_type, by default fractions.Fraction, which can be converted to float like float(....contributing_hypervolume(idx)).
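
For illustration, passing the index or the f-pair itself should be equivalent (a sketch, assuming the behavior described above and the same data as in the previous example):

>>> from moarchiving import BiobjectiveNondominatedSortedList as NDA
>>> a = NDA([[1, 3], [2, 1]], reference_point=[4, 4])
>>> float(a.contributing_hypervolume(0)) == float(a.contributing_hypervolume([1, 3])) == 1.0
True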

def distance_to_pareto_front(self, f_pair, ref_factor=1):

Return the distance to the Pareto front of a dominated f_pair, also considering the reference domain.

Non-dominated points have (by definition) a distance of zero, unless the archive is empty and the point does not dominate the reference point.

Assumes that extreme points in the archive are in the reference domain.

Details: the distance for dominated points is computed by iterating over the relevant kink points (self[i+1][0], self[i][1]). In case of minimization, the boundary with two non-dominated points can be depicted like:

...______.      . <- reference point
         |
         x__. <- kink point
            |
            x___. <- kink point
                |
                |
                :
                :

The three kink points which are possibly used for the computations are denoted by a dot. The outer kink points use one coordinate of the reference point.
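
A sketch for illustration (assuming the distance is the Euclidean distance to the closest of the kink points described above):

>>> from moarchiving import BiobjectiveNondominatedSortedList as NDA
>>> a = NDA([[0, 1], [1, 0]], reference_point=[2, 2])
>>> a.distance_to_pareto_front([0.5, 0.5]) == 0  # non-dominated, hence distance zero
True
>>> d = a.distance_to_pareto_front([1.5, 1.5])  # dominated; closest kink point is (1, 1)
>>> abs(d - 0.5**0.5) < 1e-11
True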

def distance_to_hypervolume_area(self, f_pair):
Undocumented
def hypervolume_improvement(self, f_pair):

return how much f_pair would improve the hypervolume.

If f_pair is dominated, return the distance to the empirical Pareto front multiplied by -1. Else, if f_pair is not in the reference domain, return the distance to the area dominating the reference point, times -1.

Overall this amounts to the uncrowded hypervolume improvement, see https://arxiv.org/abs/1904.08823
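
A sketch for illustration (1.25 is the by-hand computed hypervolume gain for this data):

>>> from moarchiving import BiobjectiveNondominatedSortedList as NDA
>>> a = NDA([[1, 1]], reference_point=[2, 2])
>>> float(a.hypervolume_improvement([0.5, 0.5]))  # would raise the hypervolume from 1.0 to 2.25
1.25
>>> a.hypervolume_improvement([1.5, 1.5]) < 0  # dominated: minus the distance to the front
True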

def _set_HV(self):

set current hypervolume value using self.reference_point.

Raise ValueError if self.reference_point is None.

TODO: we may need to store the list of _contributing_ hypervolumes to handle numerical rounding errors later.

def compute_hypervolume(self, reference_point):
return hypervolume w.r.t. reference_point
def compute_hypervolumes(self, reference_point):

deprecated, subject to removal; see compute_hypervolume and contributing_hypervolumes.

Never implemented: return list of contributing hypervolumes w.r.t. reference_point

def _subtract_HV(self, idx0, idx1=None):

remove contributing hypervolumes of elements self[idx0] to self[idx1 - 1].

TODO: also update the list of contributing hypervolumes where applicable.

def _add_HV(self, idx):

add contributing hypervolume of self[idx] to hypervolume.

TODO: also update the list of contributing hypervolumes where applicable.

def prune(self):

remove dominated or equal entries assuming that the list is sorted.

Return number of dropped elements.

Implementation details: pruning from right to left may be preferable, because list.insert(0) is O(n) while list.append is O(1); however, it is not possible with the given sorting: in principle, the first element may dominate all others, which can only be discovered in the last step when traversing from right to left. This suggests that a reverse sort may be better for pruning, or that we should inherit from collections.deque instead of from list, but deque seems not to support deletion of slices.
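
A sketch for illustration (list.append is used deliberately to bypass add and create an inconsistent, but still sorted, state):

>>> from moarchiving import BiobjectiveNondominatedSortedList as NDA
>>> a = NDA([[0, 2], [1, 1]])
>>> list.append(a, [2, 3])  # [2, 3] is dominated by [1, 1]
>>> a.prune()  # number of removed entries
1
>>> a
[[0, 2], [1, 1]]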

@property
def discarded(self):

list of f-pairs discarded in the last relevant method call.

Methods covered are __init__, prune, add, and add_list. Removed duplicates are not elements of the discarded list. When the input argument(s) are not inserted and not already in self, they also show up in discarded.

Example to create a list of rank-k-non-dominated fronts:

>>> from moarchiving import BiobjectiveNondominatedSortedList as NDA
>>> all_ = [[0.1, 1], [-2, 3], [-4, 5], [-4, 5], [-4, 4.9]]
>>> nda_list = [NDA(all_)]  # rank-0-non-dominated
>>> while nda_list[-1].discarded:
...     nda_list += [NDA(nda_list[-1].discarded)]
>>> assert [len(p) for p in nda_list] == [3, 1]
def _state(self):
Undocumented
@staticmethod
def _random_archive(max_size=500, p_ref_point=0.5):
Undocumented
def _asserts(self):

make all kinds of consistency assertions.

>>> import moarchiving
>>> a = moarchiving.BiobjectiveNondominatedSortedList(
...    [[-0.749, -1.188], [-0.557, 1.1076],
...    [0.2454, 0.4724], [-1.146, -0.110]], [10, 10])
>>> a._asserts()
>>> for i in range(len(a)):
...    assert a.contributing_hypervolume(i) == a.contributing_hypervolumes[i]
>>> assert all(map(lambda x, y: x - 1e-9 < y < x + 1e-9,
...               a.contributing_hypervolumes,
...               [4.01367, 11.587422]))
>>> for p in list(a):
...     a.remove(p)
>>> assert len(a) == 0
>>> try: a.remove([0, 0])
... except ValueError: pass
... else: raise AssertionError("remove did not raise ValueError")
>>> from numpy.random import rand
>>> for _ in range(120):
...     a = moarchiving.BiobjectiveNondominatedSortedList._random_archive()
...     a.make_expensive_asserts = True
...     if a.reference_point:
...         for f_pair in rand(10, 2):
...             h0 = a.hypervolume
...             hi = a.hypervolume_improvement(list(f_pair))
...             assert a.hypervolume == h0  # works OK with Fraction