How To Boost Any Loss Function
Abstract
1 Introduction
In ML, zeroth-order optimization has been devised as an alternative to techniques that would otherwise require access to first-order (or higher-order) information about the loss to minimize, such as gradient descent (stochastic or not, constrained or not, etc.; see Section 2). Such approaches replace the access to a so-called oracle providing derivatives of the loss at hand, operations that can be costly or unavailable in exact form in the ML world, by access to a cheaper function value oracle, providing loss values at queried points.
Zeroth-order optimization has seen a considerable boost in ML over the past years, across many settings and algorithms, yet there is one foundational ML setting and related family of algorithms that, to our knowledge, has not yet been the subject of such investigations: boosting [32, 31]. The question is very relevant: boosting has quickly evolved into a technique requiring first-order information about the loss optimized [6, Section 10.3], [41, Section 7.2.2], [53]. It is also not uncommon to find boosting reduced to this first-order setting [9]. However, originally, the boosting model did not mandate access to any first-order information about the loss; it only required access to a weak learner providing classifiers at least slightly different from random guessing [31]. In the context of zeroth-order optimization gaining traction in ML, it becomes crucial to understand not just whether differentiability is necessary for boosting, but more generally which loss functions can be boosted with a weak learner and, ultimately, where boosting stands with respect to recent formal progress on lifting gradient descent to zeroth-order optimization.
2 Related work
Over the past years, ML has seen a substantial push towards the cheapest optimization routines, in general batch [14], online [27], distributed [3], adversarial [20, 18] or bandit settings [2], or more specific settings like projection-free [26, 28, 51] or saddle-point optimization [25, 38]. We summarize several dozen recent references in Table 1, provided in Appendix, Section A, in terms of the assumptions made on the optimized loss for the analysis. Zeroth-order optimization reduces the information available to the learner to the "cheapest" one, which consists of (loss) function values, usually via a so-called function value oracle. However, as Table 1 shows, the loss itself is always assumed to have some form of "niceness" to study the algorithms' convergence, such as differentiability, Lipschitzness, convexity, etc. Another quite remarkable phenomenon is that throughout all their diverse settings and frameworks, not a single one of these works addresses boosting. Boosting is, however, a natural candidate for such investigations, for two reasons. First, the most widely used boosting algorithms are first-order information hungry [6, 41, 53]: they require access to derivatives to compute examples' weights and classifiers' leveraging coefficients. Second, and perhaps most importantly, unlike other optimization techniques like gradient descent, the original boosting model does not mandate access to a first-order information oracle to learn, but rather to a weak learning oracle which supplies classifiers performing slightly differently from random guessing [32, 31]. Only few approaches exist to get "cheaper" algorithms relying on fewer assumptions about the loss at hand, and to our knowledge they do not come with boosting-compliant convergence proofs, for example when alleviating convexity [16, 46] or access to gradients of the loss [54]. Such questions are however important given the early negative results on boosting convex potentials with first-order information [37] and the role of the classifiers in those negative results [39].
Finally, we note that a rich literature on derivative-free optimization has also developed in mathematics [34], yet these methods also often rely on assumptions among the three above (e.g. [42]). It must be noted, however, that derivative-free optimization has been implemented in computers for more than seven decades [24].
Figure 1: Left: value of $S_{F|v}(z'\|z)$ for convex $F$, $v = z_4 - z$ and various $z'$ (colors), for which the Bregman Secant distortion is positive ($z' = z_1$, green), negative ($z' = z_2$, red), minimal ($z' = z_3$) or null ($z' = z_4$, $z$). Right: depiction of $Q_F(z, z+v, z')$ for non-convex $F$ (Definition 4.6).
learn a classifier, i.e. a function $h : \mathcal{X} \to \mathbb{R}$ which belongs to a given set $\mathcal{H}$. The goodness of fit of some $h$ on $S$ is evaluated from a given function $F : \mathbb{R} \to \mathbb{R}$ called a loss function, whose expectation on training is sought to be minimized:
$$F(S, h) \doteq \mathbb{E}_{i \sim [m]}\left[F(y_i h(x_i))\right].$$
The set of most popular losses comprises convex functions: the exponential loss ($F_{\mathrm{EXP}}(z) \doteq \exp(-z)$), the logistic loss ($F_{\mathrm{LOG}}(z) \doteq \log(1 + \exp(-z))$), the square loss ($F_{\mathrm{SQ}}(z) \doteq (1-z)^2$), the Hinge loss ($F_{\mathrm{H}}(z) \doteq \max\{0, 1-z\}$). These are surrogate losses because they all define upperbounds of the 0/1-loss ($F_{0/1}(z) \doteq 1_{z \le 0}$, "1" being the indicator variable).
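These definitions are straightforward to instantiate; the minimal sketch below (Python/NumPy, with hypothetical names such as `empirical_loss`) simply restates the surrogate losses above and the empirical loss $F(S, h)$:

```python
import numpy as np

# The surrogate losses listed above, plus the 0/1-loss they all upper-bound.
LOSSES = {
    "exp":      lambda z: np.exp(-z),
    "logistic": lambda z: np.log(1.0 + np.exp(-z)),
    "square":   lambda z: (1.0 - z) ** 2,
    "hinge":    lambda z: np.maximum(0.0, 1.0 - z),
    "zero-one": lambda z: (z <= 0).astype(float),
}

def empirical_loss(F, X, y, h):
    """F(S, h) = E_{i ~ [m]}[F(y_i h(x_i))] on the training sample S."""
    margins = y * np.array([h(x) for x in X])
    return float(np.mean(F(margins)))
```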
Our ML setting is that of boosting [31]. It consists in having primary access to a weak learner WL that, when called, provides so-called weak hypotheses, weak because barely anything is assumed about their classification performance relative to the sample over which they were trained. Our goal is to devise a so-called "boosting" algorithm that can take any loss $F$, a training sample $S$ and a target loss value $F_*$ as input and, after some $T$ calls to the weak learner, crafts a classifier $H_T$ satisfying $F(S, H_T) \le F_*$, where $T$ depends on various parameters of the ML problem. Our boosting architecture is a linear model: $H_T \doteq \sum_t \alpha_t h_t$, where each $h_t$ is an output of the weak learner and the leveraging coefficients $\alpha_t$ have to be computed during boosting. Notice that this is substantially more general than the classical boosting formulation, where the loss would be fixed or belong to a restricted subset of functions.
Definition 4.3. For any $z, z', v \in \mathbb{R}$, the Bregman Secant distortion $S_{F|v}(z'\|z)$ with generator $F$ and offset $v$ is:
$$S_{F|v}(z'\|z) \doteq F(z') - F(z) - (z' - z)\,\delta_v F(z).$$
Even if F is convex, the distortion is not necessarily positive, though it is lowerbounded (Figure 1).
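For concreteness, here is a small sketch of the distortion. Since the formal definition of the v-derivative $\delta_v F$ is given earlier in the paper and not reproduced in this excerpt, the sketch assumes it is the secant slope $(F(z+v) - F(z))/v$, an assumption consistent with the "Secant" terminology and with the offsets-$\to 0$ limit discussed in Section 5:

```python
def v_derivative(F, z, v):
    # Assumption: delta_v F(z) is the secant slope through (z, F(z)) and
    # (z + v, F(z + v)); only loss values are queried (zeroth-order access).
    return (F(z + v) - F(z)) / v

def bregman_secant_distortion(F, z_prime, z, v):
    # S_{F|v}(z' || z) = F(z') - F(z) - (z' - z) * delta_v F(z)   (Definition 4.3)
    return F(z_prime) - F(z) - (z_prime - z) * v_derivative(F, z, v)
```

Geometrically, the distortion is the gap at $z'$ between $F$ and the chord of $F$ supported by $z$ and $z+v$, as depicted in Figure 1 (left).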
There is an intimate relationship between the Bregman Secant distortions and Bregman divergences.
We shall use a definition slightly more general than the original one when F is differentiable [11, eq.
(1.4)], introduced in information geometry [5, Section 3.4] and recently reintroduced in ML [10].
Definition 4.4. The Bregman divergence with generator $F$ (scalar, convex) between $z'$ and $z$ is $D_F(z'\|z) \doteq F(z') + F^\star(z) - z'z$, where $F^\star(z) \doteq \sup_t \{tz - F(t)\}$ is the convex conjugate of $F$.
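As a sanity check of this definition (a worked example of ours, not taken from the paper), take the convex generator $F(t) = t^2$:
$$F^\star(z) = \sup_t\,\{tz - t^2\} = \frac{z^2}{4}, \qquad D_F(z'\|z) = z'^2 + \frac{z^2}{4} - z'z = \Big(z' - \frac{z}{2}\Big)^2 \;\ge\; 0,$$
with equality iff $z = 2z' = \nabla F(z')$, i.e. when $z$ is the dual coordinate of $z'$; this recovers the usual squared-distance Bregman divergence of the differentiable case.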
Relaxed forms of Bregman divergences have been introduced in information geometry [43].
Definition 4.6. For any $a, b, \alpha \in \mathbb{R}$, denote for short $I_{a,b} \doteq [\min\{a,b\}, \max\{a,b\}]$ and $(uv)_\alpha \doteq \alpha u + (1-\alpha)v$. The Optimal Bregman Information (OBI) of $F$ defined by triple $(a, b, c) \in \mathbb{R}^3$ is:
$$Q_F(a, b, c) \doteq \max_{\alpha \,:\, (ab)_\alpha \in I_{a,c}} \left\{(F(a)F(b))_\alpha - F((ab)_\alpha)\right\}. \quad (1)$$
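A direct (if brute-force) way to evaluate (1) is to scan the admissible points $(ab)_\alpha$ in $I_{a,c}$. The sketch below (hypothetical helper `obi`) does exactly that on a grid and is only exact in the grid limit:

```python
import numpy as np

def obi(F, a, b, c, grid=100001):
    """Q_F(a, b, c) of Definition 4.6: the maximal gap, over points
    p = (ab)_alpha restricted to I_{a,c}, between the chord of F through
    (a, F(a)) and (b, F(b)) and F itself (grid-search approximation)."""
    if a == b:
        return 0.0                                   # the chord degenerates to F(a)
    p = np.linspace(min(a, c), max(a, c), grid)      # candidate points in I_{a,c}
    alpha = (p - b) / (a - b)                        # solves p = alpha*a + (1-alpha)*b
    chord = alpha * F(a) + (1.0 - alpha) * F(b)      # (F(a) F(b))_alpha
    return float(np.max(chord - F(p)))
```

Since $p = a$ (i.e. $\alpha = 1$) is always admissible and contributes a zero gap, the returned value is non-negative, consistently with the remark that follows.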
As represented in Figure 1 (right), the OBI is obtained by drawing the line passing through $(a, F(a))$ and $(b, F(b))$ and then, in the interval $I_{a,c}$, looking for the maximal difference between the line and $F$. We note that $Q_F$ is non-negative because $a \in I_{a,c}$ and, for the choice $\alpha = 1$, the RHS in (1) is 0. We also note that when $F$ is convex, the RHS is indeed the maximal Bregman information of two points in [7, Definition 2], where maximality is obtained over the probability measure. The following Lemma follows from the definitions of the Bregman Secant distortion and the OBI. An inspection of the functions in Figure 1 provides a graphical proof.
Lemma 4.7. For any $F$,
$$\forall z, v, z' \in \mathbb{R}, \quad S_{F|v}(z'\|z) \ge -Q_F(z, z+v, z'), \quad (2)$$
and if $F$ is convex,
$$\forall z, v \in \mathbb{R},\ \forall z' \notin I_{z, z+v}, \quad S_{F|v}(z'\|z) \ge 0,$$
$$\forall z, v, z' \in \mathbb{R}, \quad S_{F|v}(z'\|z) \ge -Q_F(z, z+v, z+v). \quad (3)$$
We shall abbreviate the two possible forms of OBI in the RHS of (2), (3) as:
$$Q^*_F(z, z', v) \doteq \begin{cases} Q_F(z, z+v, z+v) & \text{if } F \text{ convex} \\ Q_F(z, z+v, z') & \text{otherwise} \end{cases}. \quad (4)$$
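Lemma 4.7 is easy to spot-check numerically. The following self-contained sketch re-implements the two quantities for a non-convex test loss (an arbitrary choice of ours) and verifies inequality (2) on random triples $(z, v, z')$; the check holds up to the grid approximation of the OBI:

```python
import numpy as np

rng = np.random.default_rng(0)
F = lambda z: np.sin(3.0 * z) + 0.1 * z ** 2       # a non-convex loss, for illustration

def S(zp, z, v):                                   # Bregman Secant distortion (Def. 4.3)
    return F(zp) - F(z) - (zp - z) * (F(z + v) - F(z)) / v

def Q(a, b, c, grid=20001):                        # OBI (Def. 4.6), grid approximation
    if a == b:
        return 0.0
    p = np.linspace(min(a, c), max(a, c), grid)
    alpha = (p - b) / (a - b)
    return float(np.max(alpha * F(a) + (1.0 - alpha) * F(b) - F(p)))

# Inequality (2): S_{F|v}(z' || z) >= -Q_F(z, z + v, z') for any F.
for _ in range(1000):
    z, v, zp = rng.uniform(-2.0, 2.0, size=3)
    if abs(v) > 1e-3:
        assert S(zp, z, v) >= -Q(z, z + v, zp) - 1e-9
```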
For short, we define two edge quantities for $i \in [m]$ and $t = 1, 2, \ldots$,
$$e_{ti} \doteq \alpha_t \cdot y_i h_t(x_i), \qquad \tilde{e}_{ti} \doteq y_i H_t(x_i), \quad (5)$$
where $\alpha_t$ is a leveraging coefficient for the weak classifiers in an ensemble $H_T(\cdot) \doteq \sum_{t \in [T]} \alpha_t h_t(\cdot)$. We observe $\tilde{e}_{ti} = \tilde{e}_{(t-1)i} + e_{ti}$.
Algorithm 1 SECBOOST(S, T) // red boxes pinpoint substantial differences with "classical" boosting
Input: sample $S = \{(x_i, y_i), i = 1, 2, \ldots, m\}$, number of iterations $T$, initial $(h_0, v_0)$ (constant classification and offset).
Step 1: let $H_0 \leftarrow 1 \cdot h_0$ and $w_1 = -\delta_{v_0} F(h_0) \cdot \mathbf{1}$; // $h_0, v_0 \ne 0$ chosen s.t. $\delta_{v_0} F(h_0) \ne 0$
Step 2: for $t = 1, 2, \ldots, T$
  Step 2.1: let $h_t \leftarrow \mathrm{WL}(S_t, |w_t|)$ // weak learner call, $S_t \doteq \{(x_i, y_i \cdot \mathrm{sign}(w_{ti}))\}$
  Step 2.2: let $\eta_t \leftarrow (1/m) \cdot \sum_i w_{ti} y_i h_t(x_i)$ // unnormalized edge
  Step 2.3: [if a bound on $W_{2,t}$ is available (Section 5.3)] pick $\varepsilon_t > 0$, $\pi_t \in (0, 1)$ and
  $$\alpha_t \in \frac{\eta_t}{2(1 + \varepsilon_t) M_t^2 W_{2,t}} \cdot [1 - \pi_t, 1 + \pi_t]; \quad (6)$$
  [otherwise, general procedure] $\alpha_t \leftarrow \mathrm{SOLVE}_\alpha(S, w_t, h_t)$ // $W_{2,t} > 0$, $\varepsilon_t > 0$, $\pi_t \in (0, 1)$, Theorem 5.8
  Step 2.4: let $H_t \leftarrow H_{t-1} + \alpha_t \cdot h_t$ // classifier update
  Step 2.5: if $I_{ti}(\varepsilon_t \cdot \alpha_t^2 M_t^2 W_{2,t}) \ne \emptyset$, $\forall i \in [m]$ then // new offsets
    for $i = 1, 2, \ldots, m$, let $v_{ti} \leftarrow \mathrm{OO}(t, i, \varepsilon_t \cdot \alpha_t^2 M_t^2 W_{2,t})$;
  else return $H_t$;
  Step 2.6: for $i = 1, 2, \ldots, m$, let $w_{(t+1)i} \leftarrow -\delta_{v_{ti}} F(\tilde{e}_{ti})$ // weight update
  Step 2.7: if $w_{t+1} = \mathbf{0}$ then return $H_t$; // early stopping
Return $H_T$.
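To make the control flow concrete, here is a heavily simplified, self-contained sketch of the SECBOOST loop (Python/NumPy, 1-D inputs, decision stumps as weak learner). Everything the excerpt does not fully specify is replaced by a hedged stand-in: the offset oracle OO is replaced by a fixed small offset $v$ (the "small offsets" regime discussed around Figure 2), and (6)/SOLVE$_\alpha$ is replaced by a crude grid search on $\alpha_t$. The only access to the loss is through its values, i.e. through v-derivatives.

```python
import numpy as np

def v_derivative(F, z, v):
    return (F(z + v) - F(z)) / v                    # secant slope ("v-derivative")

def stump_weak_learner(X, y_signed, w_abs):
    """Best 1-D threshold stump under absolute weights |w_t| and the
    sign-corrected labels of S_t (Step 2.1); a stand-in for WL."""
    best, best_gain = None, -np.inf
    for thr in np.unique(X):
        for s in (-1.0, 1.0):
            pred = s * np.sign(X - thr + 1e-12)
            gain = np.sum(w_abs * y_signed * pred)
            if gain > best_gain:
                best_gain, best = gain, (s, thr)
    s, thr = best
    return lambda x: s * np.sign(x - thr + 1e-12)

def secboost(X, y, F, T=20, v=1e-2, alpha_grid=np.linspace(-2.0, 2.0, 81)):
    m = len(X)
    H = np.zeros(m)                                 # simplified init (Step 1 uses a nonzero h_0)
    w = -v_derivative(F, y * H, v)                  # w_1 = -delta_{v_0} F(.) * 1
    hs, alphas = [], []
    for t in range(T):
        h = stump_weak_learner(X, y * np.sign(w), np.abs(w))        # Step 2.1
        preds = np.array([h(x) for x in X])
        # Steps 2.2-2.3, stand-in for (6)/SOLVE_alpha: minimize F(S, H_{t-1} + a h_t).
        scores = [np.mean(F(y * (H + a * preds))) for a in alpha_grid]
        alpha = float(alpha_grid[int(np.argmin(scores))])
        H = H + alpha * preds                       # Step 2.4
        w = -v_derivative(F, y * H, v)              # Steps 2.5-2.6, fixed offset v_ti = v
        hs.append(h); alphas.append(alpha)
        if np.all(w == 0):                          # Step 2.7: early stopping
            break
    return hs, alphas
```

For instance, `secboost(np.array([-1., -.3, .4, 1.]), np.array([-1., -1., 1., 1.]), lambda z: np.log(1 + np.exp(-z)))` runs the loop on a toy sample; none of the guarantees below apply to this simplified variant, which merely mirrors the steps of Algorithm 1.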
For the boosting rate's sake, we should find $W_{2,t}$ as small as possible. We refer to (5) for the $e_\cdot, \tilde{e}_\cdot$ notations; $v_\cdot$ is the current (set of) offset(s) (see Section 4 for their definition). The second-order v-derivative in the LHS plays the same role as the second-order derivative in classical boosting rates, see for example [45, Appendix, eq. 29]. As offsets $\to 0$, it converges to a second-order derivative; otherwise, they still share some properties, such as the sign for convex functions.
Lemma 5.2. Suppose $F$ convex. For any $a \in \mathbb{R}$, $b, c \in \mathbb{R}^*$, $\delta_{\{b,c\}} F(a) \ge 0$.
(Proof in Appendix, Section B.3) We can also see a link with the variation of the weights since, modulo a slight abuse of notation, we have $\delta_{\{e_{ti}, v_{(t-1)i}\}} F(\tilde{e}_{(t-1)i}) = \delta_{e_{ti}} w_{ti}$. A substantial difference with traditional boosting algorithms is that we have two ways to pick the leveraging coefficient $\alpha_t$; the first one can be used when a convenient $W_{2,t}$ is directly accessible from the loss. Otherwise, there is a simple algorithm that provides parameters (including $W_{2,t}$) such that (8) is satisfied. Section 5.3
details those two possibilities and their implementation. In the more favorable case (the former one), $\alpha_t$ can be chosen in an interval, furthermore defined by flexible parameters $\varepsilon_t > 0$, $\pi_t \in (0, 1)$. Note that fixing these parameters beforehand is not mandatory: we can also pick any
$$\alpha_t \in \eta_t \cdot \left(0, \frac{1}{M_t^2 W_{2,t}}\right), \quad (9)$$
and then compute choices for the corresponding $\varepsilon_t$ and $\pi_t$. $\varepsilon_t$ is important for the algorithm and both parameters are important for the analysis of the boosting rate. From the boosting standpoint, a smaller $\varepsilon_t$ yields a larger $\alpha_t$ and a smaller $\pi_t$ reduces the interval of values in which we can pick $\alpha_t$; both cases tend to favor better convergence rates, as seen in Theorem 5.3.
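For instance (one concrete conversion; this is our illustration, the paper does not spell it out here), write $\alpha_t = c \cdot \eta_t / (M_t^2 W_{2,t})$ with $c \in (0, 1)$, which is exactly membership in (9) when $\eta_t > 0$. Then
$$\pi_t \in \left(|2c - 1|,\ 1\right) \qquad \text{and} \qquad \varepsilon_t \in \left(0,\ \frac{1 + \pi_t}{2c} - 1\right)$$
are admissible, since they give $1 - \pi_t < 2c < 2c(1 + \varepsilon_t) < 1 + \pi_t$, i.e. $\alpha_t$ lies in the interval (6) defined by $(\varepsilon_t, \pi_t)$.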
Step 2.4 is just the crafting of the final model.
Step 2.5 is new to boosting: the use of a so-called offset oracle, detailed in Section 5.1.2.
Step 2.6 The weight update does not rely on a first-order oracle as in traditional boosting, but uses only loss values through v-derivatives. The finiteness of $F$ implies the finiteness of the weights.
Step 2.7 Early stopping happens if all weights are null. While this would never happen with traditional (e.g. strictly convex) losses, some losses that are unusual in the context of boosting can lead to early stopping. A discussion of early stopping and how to avoid it is in Section 6.
The offset oracle has a technical importance for boosting: $I_{ti}(z)$ is the set of offsets that limit an OBI for a training example (Definition 4.6). Its importance for boosting comes from Lemma 4.7: upperbounding an OBI implies lowerbounding a Bregman Secant distortion, which in turn guarantees a sufficient slack between two successive boosting iterations. This is embedded in a blueprint of a proof technique to show boosting-compliant convergence which is not new, see e.g. [45]. We now detail this convergence.
Remark that the expected edge $\eta_t$ in Step 2.2 of SECBOOST is not normalized. We define a normalized version of this edge as:
$$[-1, 1] \ni \tilde{\eta}_t \doteq \sum_i \frac{|w_{ti}|}{W_t} \cdot \tilde{y}_{ti} \cdot \frac{h_t(x_i)}{M_t}, \quad (12)$$
with $\tilde{y}_{ti} \doteq y_i \cdot \mathrm{sign}(w_{ti})$ and $W_t \doteq \sum_i |w_{ti}| = \sum_i |\delta_{v_{(t-1)i}} F(\tilde{e}_{(t-1)i})|$. Remark that the labels are corrected by the weight sign and thus may switch between iterations. In the particular case where the loss is non-increasing (such as with traditional convex surrogates), the labels do not switch. We also need a quantity which is, in absolute value, the expected weight:
$$W_{1,t} \doteq \left|\mathbb{E}_{i \sim [m]}\left[\delta_{v_{(t-1)i}} F(\tilde{e}_{(t-1)i})\right]\right| \quad \text{(we indeed observe } W_{1,t} = |\mathbb{E}_{i \sim [m]}[w_{ti}]|\text{)}. \quad (13)$$
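These two statistics are cheap to compute from the current weights; a small sketch (hypothetical helper `edge_statistics`, assuming $M_t$ is the scale that keeps $h_t(x_i)/M_t$ in $[-1, 1]$):

```python
import numpy as np

def edge_statistics(w, y, preds, M):
    """Quantities (12)-(13) from the weights w_t, labels y, weak predictions
    h_t(x_i) and the normalization M_t."""
    W = np.sum(np.abs(w))                       # W_t = sum_i |w_ti|
    y_tilde = y * np.sign(w)                    # sign-corrected labels
    eta_tilde = np.sum((np.abs(w) / W) * y_tilde * (preds / M))   # (12)
    W1 = np.abs(np.mean(w))                     # (13): |E_{i~[m]}[w_ti]|
    return float(eta_tilde), float(W1)
```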
In classical boosting for convex decreasing losses†, weights are non-negative and converge to a minimum (typically 0) as examples get the right class with increasing confidence. Thus, $W_{1,t}$ can be an indicator of when classification becomes "good enough" to stop boosting. In our more general setting, it will be used as a similar indicator. We are now in a position to show a first result about SECBOOST.
Theorem 5.3. Suppose Assumption 5.1 holds. Let $F_0 \doteq F(S, h_0)$ in SECBOOST and let $z^*$ be any real such that $F(z^*) \le F_0$. Then we are guaranteed that the classifier $H_T$ output by SECBOOST satisfies $F(S, H_T) \le F(z^*)$ when the number of boosting iterations $T$ yields:
$$\sum_{t=1}^{T} \frac{W_{1,t}^2 (1 - \pi_t^2)}{W_{2,t} (1 + \varepsilon_t)} \cdot \tilde{\eta}_t^2 \;\ge\; 4(F_0 - F(z^*)), \quad (14)$$
(proof in Appendix, Section B.4) We observe the tradeoff between freedom in picking parameters and the convergence guarantee exposed by (14): to get more freedom in picking the leveraging coefficient $\alpha_t$, we typically need $\pi_t$ large (Step 2.3), and to get more freedom in picking the offset $v_t \ne 0$, we typically need $\varepsilon_t$ large (Step 2.5). However, allowing more freedom in such ways reduces the LHS and thus weakens the guarantee in (14). There is therefore a subtle balance between "freedom" of choice and convergence. This balance becomes clearer as boosting compliance formally enters the convergence requirement.
As is usually the case in boosting, the weights are normalized in the weak learning assumption (12), so the minimization "potential" of the loss does not depend on the absolute scale of the weights. This is not surprising because the loss is "nice" in classical boosting: a large $\gamma$ guarantees that most examples' edges move to the right of the x-axis after the classifier update, which, because the loss is strictly decreasing (exponential loss, logistic loss, etc.), is sufficient to yield a smaller expected loss. In our case this is no longer true: there could, for example, be a local bump in the loss that would make it increase after the update. This is not even a pathological example: one may imagine that instead of a single bump the loss jiggles a lot locally. How can we keep boosting operating in such cases? A sufficient condition takes the form of a second assumption that also integrates weights, ensuring that the variation of the weights is locally not too large compared to the (unnormalized) weights, which is akin to comparing local first- and second-order variations of the loss in the differentiable case. We encapsulate this notion in what we call a weight regularity assumption.
Assumption 5.5. ($\rho$-Weight Regularity Assumption, $\rho$-WRA) Let $\rho_t \doteq W_{1,t}^2 / W_{2,t}$. We assume there exists $\rho > 0$ such that $\forall t \ge 1$, $\rho_t > \rho$.
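To see what the assumption buys (an illustrative back-of-the-envelope computation of ours, not the paper's formal corollary), suppose in addition that every normalized edge satisfies $\tilde{\eta}_t \ge \gamma > 0$ and that $\varepsilon_t = \varepsilon$, $\pi_t = \pi$ are kept fixed. Then each summand of (14) is at least $\rho\gamma^2(1-\pi^2)/(1+\varepsilon)$, so the guarantee of Theorem 5.3 is met as soon as
$$T \;\ge\; \frac{4(1+\varepsilon)\,(F_0 - F(z^*))}{\rho\,\gamma^2\,(1-\pi^2)},$$
a rate with the familiar $1/\gamma^2$ dependence of classical boosting.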
In Figure 2 we present a(n overly) simplified depiction of the cases where $W_{2,t}$ is large for "not nice" losses, and two workarounds to keep it small enough for the WRA to hold. Keep in mind that $W_{1,t}$ is an expected local variation of the loss ((13), (5)), so as it goes to zero, boosting converges to a local minimum and it is reasonable to expect that the WRA breaks. Otherwise, there are two strategies that keep $W_{2,t}$ small enough for the WRA to hold: either we pick small enough offsets, which essentially works for most losses but in general makes us converge to a local minimum (this is in essence our experimental choice), or we optimize the offset oracle so that it sometimes "passes" local jiggling (Figure 2 (d)). While this eventually requires tuning the weak learner jointly with the offset oracle and fine-tuning the latter on a loss-dependent basis, such a strategy can be used to eventually pass local minima of the loss. To do so, "larger" offsets directly translate into corresponding requests for larger-magnitude classification by the next weak classifier on the related examples. We are now in a position to state a simple corollary to Theorem 5.3.
† This is an important class of losses since it encompasses the convex surrogates of symmetric proper losses [44, 49].
"nice" loss "not nice" loss
F F F F
<latexit sha1_base64="2QY65FduMdx6jg5XAue/uMCrq1Q=">AABfXHic3VzrcxtHcuddLsmFl0t8SVUqVfmyCsmypIA0QZuS4ivdmeBDUhVIwSQl0eZSqNndAbDGvjS7AAGv9/6afE3+n3zJ35LumdnXzIBiquQPOd1Zwk7/uufd093zcJLAT7Pd3f/+xS//4ld/+Vd//eu/Wf/N3/727/7+s9/9w9s0njGXvnHjIGZXDklp4Ef0TeZnAb1KGCWhE9B3zvQQ6e/mlKV+HF1my4TehGQc+SPfJRkkDT/759zmQvKTmNE0e8EojQrrpLCGn23s7uzyP5b+oyt/bKzJP4Ph7953bC92ZyGNMjcgaXrd3U2ym5ywzHcDWqzbs5QmxJ2SMb2GnxEJaXqT89wLawtSPGsUM/gvyiyeur5uR/TWjcOQRJ49gX8W+W7RSnSqxIZwb+4nqRS/EPKbdCgRI8tifX3LKH7ro/LTZejEwU0+iuMs9FMXRNmM1ly5nU0oEqM4o0VujyLBkVdpyNEsUJiGJJsUaiI2RaqlojAQ0EpGduIsFLnQ8Q70c6gkL0ZJ0Wrb3E4Ig8/iunuT21HsRx50orVpZ8wn0TigzB9Psk3Lzugic0b5Rtfa3raKSkgwCyMcXPkfi/xBbs/ZLKDWre9lE2svyQy4Q57TH2DsQT4gPRrbAc0QhOPYtm3eRQ6UN4WRNLEnKZSd5tv7KC2E/IuiXX6XBkFa5Ls7T9zQROK0PaS1+je3oXJzGhTXe1DxUULnJADpXzzc6P7bxt4jNRcDmMWzyHv4sOZ59Li7u4ucSkYuVF9wYoH4qLweQyVvSrFYq3xjrygstQIVpwJU+jCaJ8yfF/l8+EGREArC6TDnPZi6uZcUGn86LnJope7+vlpv0fxjaMSqJ3Z39qE1dRFCxp7WQaWItownTw1CAjrKXBYnRQ7DL3z+bNdJrH347yn+6MKvjhv4SYePr+eYpY2Dhn+m2riepRNo5WTiu2mfktERTWCe8WHuR24w82hJvf4/Z3ZjGIfN/A6CZELuldf+ffOCQfXa+YG6mT+n1iEJfIdxPf61NZgxah35oxEFTZT5JAiW1gB6nWTU6sVxmsEcs+II0uKEMqsfpylNV0qrOG59mMOabCHZXeoCVa0CJL4oFfk5pMShdYgrQ7X+WGexn1LLi2lqgWK0PDqiJLMOgsA6jKM5XVgDUJciT14mWMmsVwxWqkQWOx5Zp7FHAT+JfRdWma3LCbUGfjaCJkhV8te2zfO6szBaGaqc/7i+Zb2bAHHqBwEsV6DmFQwA+tBnhIls01IaIqGlQswAyiQaLaLp6hZLJzHLilYbroIesAVO79XQdJYkAcWFucgPBoOV1H4cwfy9qNHq9KbQrFgoXG5c0H4vClApsLQtQLH40bJSL7cBKfCPxp4GsBp9nD8gPy4NAhK6SIA9MTMBUfAoahHWtRYeEzSUxxWDxAmgJdI0KBsFBJRTme35yXZfLSgbYWGakGNNDAnGMRopNczxElpsDoECimOzUOGQ6OAoKxq8ikyHZiQDE64QP4ft+nCK1qgILOUamATJyAWGpmdm4pRCNXNmyRhKMO3QGfR9hy+BmRO07RupIhdgEVohmKTwlwczBRbTBiidOSN/DFqpzRvOgsxn8W071QlRFphfnoXDLrVaVIN5GIGiGDHiFpb4swXWaAjEzBJmXMrN1O4Xex2LZu5Ou44pVA3UZNBJM0ih0Ri0QdtkBesclDNodepOqDvF9WFr3U7JnHIWaMEwgaaEhk2hwzJ3ksUAsCybi5Yf5Sfk9Pja5ktEPMptGqXQKlhNvl7AyjOFxtveeZJktuOPb6k3pnVSISTBcAV77XqbD8MJRVPvC7A2unQhh6ZIK4qtd6+OLl9u91+dvro8PrJ6r15Y746PXhyDlDZyy1pHe2sfJBRYN6xXHF130R1A40WUHcfLNn7JCgNUGTCZP/1R9A3+4usTW6L5Ht+mHWi/WKxX6U4CNYbRAiMnGnfQbBxzuyztJHHqIwTTR37WSSgsvbpIKI6rzqWJH/hgqmP/lD4SjuraBtY1w0F2GxfXX95cH9yILnMo9D+vRuK7GR+wvO9EeurCKiBSIC2CgX7tRxEsptACz8F36sSzrP66sR6SR9BZoAaiLBZG8O9LZjQcgL4DfiEs2LcwlB5ZDJZIbr1bDzeBFMFsmzzafgj9YifjUWVadHYfbT76/X1L0cHqP9/Yu4FxAyUQfDDuvGZtlOJQ8nMX58tGcXhhWk2u6XphDjdtYXWhg1XmIElAkr9Qaf64ZoXfypBNsxAGFPMUdQmmChOmYBDAkAPNkFngRdmMyS9VvfrjmqXnj8eBwohprMmurS4wToKY29vgb7zPH4okzadxyJKmnMT8dArWuw0ThzLsj7xfNPnuZAQHkoRyshgFDLF/dCmNnPurC5lgEIEEdaV4u1SfQ4MlfstVUaMZbleLFyg/4tVPUv+OWldQh7AGOp7LGgvAqoqCdgamk4/JB1jWwg0zHRn6oAeZyA4K4vkkjCOvLXpF2Zud1azuXX0EZWqwnRh4VFsExwdlzDysWuYCoAzWHhcAa9k9BJySzCCAr6+qFJ7YFrZSmqoVPgTEAf/faH2mHwwG74fmDKz4TfVMP6yoZsW0snI164q6NUQYp5UsJp9T4rdpQrkkiqGSfMQMm5VRvPd4fEcrAVVvJmRptlMlwtRQQFzRUjXbyqZqMK9oq5VCkMA1hJgxKwvpR/MKqmPfo8WzohvKZhDLhPhY1RHYjmVPrCoJpFf6plWvNuz713f01+4XXb2/gKPZXaUAU299/3pFZ1VMK/uqZl3RVQ0RxvaUxeTNKX6vas3vX1eNWVfGpDynTNY88wOP8gYo7rO0Tg1aUMrgdbl7OSi5K477LZSysPVSKRI+slgqVby9I7MSL8e6bBNox3vUSIxMyXOiMChND25FRiY88IAW9vWmPacytdi8yZ1gBg4v2uab75vBxe6XLrh+jwv0pD06gjYXVjyJAT120EPZ7Vhgf+JfuxiwtXo0Q+PSizNwYWNvVvl863ZIppRkAQc0SvcYTTUXGERQIyEcUSZ+k+8owdQK/w2P6CKPA66APReRcCg/n4SVg7SxV+SbdvhNNtno2s4sAPGbcqkTJYqhHZjaYGTMowOq72KVXosaIRnzRssW0MSFmRb63kqaRoAqNmmK3eKyOCuuv8Lao8MutgXKoHazA59B/9ksIAl4qHFGMtEgXwLuq+JPUA+r3a8eYdMx37/Kz1/0oHs7e7v7nV2lbJDh9DuaFlXD1GzgXAWEgYfMFQ7Y8VPezg3rPvFxS8Yg8Swu2i1dVgN83qIU64EXmmN0XdNYYxpBgb9Vd3XE5lUjUtKu8AvcxsKIyRLH8rN9ZZz3cU4fLklUDfdnzzrdTlfNPMMC42hsVQDmEnTJZuXu7mkzOROt1uRstOXH+UNX7Gqoo4DJ3Y1PWtdq6wmM5T/w4kJCQ3ThFirS0ZC3E4wIGKDeCqFMcywT6oJ+xi2gorETxHXYg719EeLfsg4Gg+Ozo1dX623V4frMDaj
HNTp6uNflzu/zhy54UTv4+egml85z04VOJyShzwV/x2PkttNwqne+5NEFFPFIuNGa0o7cSQz9Yp8cDQ6Kra28LIo0F4jB9p3MRiMM/Nv9o0HPwJOaAsRjEaRWsYlhzqSzhEeCEN2wXDZtmb5pkO/Ft5GZqaIINowjNHZ8R+D40JBmk9h7fulPv7/JQ4+noZ8PP8VQTLNlQPPT5QnjYV0RXQvKMfocB2+Hp/EQRhXxeL6XZCKdB63cmEHPPMfN847Fk3lXZXECGmnsRzzsUac7cZZh/TjJLgdEOvWTBojv30pMt82Pu20apQ6h1UV/0N19ICbA+qo697nl8GdUb17fB092H7ioWlbWm+/1/HlWHNcCZVXCbRy+tK+3ovLXGEX2OnMsjXeT4xYF87NJuMeBkGFIFl+EfrRuH1EXFkQKbvbkdUIZ2DDsMegYQPigv+W/d8HIQsCIFpyLYgyGnHUPIhIsYVVJC3shC6vikjbQGgQkG8UsXMGBriPYNGOvjv7147G9fU7hr4sXR5p+QrOgaIXkm3ZFGzvyF3wjSPQL6C9I4tpJ/C3MB7mSnry6MsQ6zl5fHgtTH2F+lpfwq6ur643uja4/D+OYeUoB+QbJKD/UxIPeN0MPNGh/FbSnQQ/ukGoBwlopRqnKfatxcN9K9O5dhRXIe9SgfWSIgt8UxDD+7TkMevmhgpKJLwD4Q3HKv5f7qo6Tf6+W8i11MzE6OGJkcAEx6PURBLYewLBuCqUnKT2NcioppxrlSlKOiuH9+uT0u7s59M6JII9o2FVTv8PUPSX1iIA3KeTD/AJfEBMGLJ774LDnJz5Ls06fpKrxzw2i+8+Me4+rt1gY7Da0rRSSU5IcjYQaxwlL51ilhiVjqJFYSWIaaVaSZhppXpLmGum2JN1qpEVJWmikZUlaaqQfS9KPOomyuKTuatQ4oiVRG9NvQ6wdbg7LI3rgjRSFcnTrbaSCIq0pwKPHiIHwhja6721Yvi3hxiiwKVZfwmWPvdUaguN8A3Dom6C8Fgo0XFXGXllIS+zMMgo1OhqcFHkZmyjMJe+1itS7u0wIOTTn1P9oToetnA7vyInHWqdUBCPV43qM4AIXEScg6nE06EwwysCUaU/Ilytq8tJUk5aQOyqUMeLjDrSW28VK5MzRwZ9rdaeERWAX8dpnahWx9iJMfEQDjfq2JjcHtoCqjbVMMGIDDZA83Oj+3sKTkQbEKOILCGIeqVOoFjEZoncvMIoGHpwc9PvomaHqHUrdWuDpjavhgYrtt7Fct0psTxdcuq6alNI/VcorWJqaGheGk0LTDUKECuzrNgpIfFm0qmY1B5CspeVnD18+0krZ4MTSGjh7klPL9ELNtDkaG7le6LleqLnqrD3JqmV7uTJbMbQbOff0nC9X5tzk7klures4++BkqLLpfSeQfR3ZLtJMnOOcDatQiBjAbZBTgcrghwE1F6Lmd4qaOxWoLUrJET6EOGIqD3wIOeRzUx4Vs7OCKpidz41Zj6rKnhhrMKpqUNIxsHV8dmSdvzp8eXB+pAg8PcW+OD3Vu0KxEBD2dvmxHsMzmMKeBcUc2MEoAHeHn5yAb2Yzl/qBdnrOYWQqt9/ASb62H4h/eUzMsrlPfQOJ4oca80ZmoQGB8RpZAGtEOXXBMA/839fiS+aERbyBNEHjXyY5vep4iJTTa8jocRkyFfnbAo4jV5aCl9tg7gMi+jjEaUAcM6TXgPTkOeYWZsB+EMvrlnVLLdzW4Odlb0HJ4S88h0wiC7xVPyJsWe7IWBbvnR1xng1MsiDwaOCHPh7VRQcedASZZTFoZd/l56EZTf0fhWyHMHG2WXQmCIOhscVJo1nk4pExzGBeHrbCTvgJtU7CaEbBr/Ozz1OLWIGfZVhIkE+ZRTIrillIAgszQgFc+k98yePi/dSC/2M2VVnXeXsYgh3YMDHY+gMtioyHij/wA4KlKYJLDwvznaLIn6tdQBd4WLr2Do9VAPiTPugiWkPeavuKrCYONPuAUycka0LU0GuSsjpyAl/afnGSujUAP3QJXkuCZzih4fwAkAt/HMLSEeImMR5O7Vg/DB9r4pywDQ1NmMgkTxnAj6HaS74vlNv4z4iEfrDMk8m8sFMaQNuLTSP9xg5LR6mS5YuicgpfqMV5U9PeqLSjmnakOTk1TevYdzXtnTZ3a5o2Zq5q2pVmFB0UTS9YIR4WzQCNQuwVTZ+4neV3Ne07lfZtTftWC43VtDOVdl7TzjXaXUQ0Pkuidvz7pKadqLTB8flpTdbm0kWDqLkHA1oeCazYh80N0128k1Nt8m4pOgMsexFfLORX6xi9FWpzloNEJKHmqLfejXDQXWEDzWtrwgn3RsIutCWhUVo2Qd++PmDHSYVa9g7oWgBqGiEgcoVqHPCS09US7aTCU9DfQYuHp7SZ9GySIu+TRMvfS0Z44LtWXEeD7ROKB8xUHK4cLdgZJmgwsFwWLdjx4jI+J16c6udg/BleQpgliRosDGeSEs602w0gqpC8ChMlEZI4s85U3iEo7zxAUk/cHdBzEBcOzuI+Hm54HdIx2dyuGHcOPGLmLBnP62xK7OawWnxgrm5qG5iS8+JjnODMrGY2cK/GmrJ6GHHpSrsSpelON+UVkG3H2Ay3lExr9K2+khKWxTWA0FhDtEU427qQGHulDau66lY/9gcDkzSWZy/ZZsYxOb4N+MAbkxDX1GYltA3eD0FRWTb4oQ+6mo4fulnTtmi4B/IarES2jTEjvJul3b9idF5vo6AhggnSf1GmfNKwV/BDyd5tdIEba7WTTUCDIPHrFnkv4ljtC04TwvA2jaWKmFNpWZtuxIJ75NJ6sYAmyljBLWaojbBLVZ+Nn5KtmET+ovxCBYqFReumMvYmFhxYG2JGITfxb5Ffyh/XKeV29U0b58YsDsD0XRb5YfUTIJIu9ln5niu/yCGKXDI30vOj+veKrAIKRQY1zf+pMVwijeY+iyM8fpCHy4TF8QijSw6/qxePvobWz3PLdj+MPEs2tYElneKFnSajJZKa/MgOBee/YfUe+bC48FNfm/LwV/P+9O7OsyRbt+dAsLL4SQI29oST+eU0QZbQ/Z09DhUCVVChi0U1abU3catuW7+zL9vtShckTAIYNzLhJj+WKdb6yvZvUUiazsJE9ONB/dtq5wNNXN7naeQ1aKS288OIHAP9dS7+XTcPu1pSa/yZR1kNbgw3dUBHeKcWb540RVeJbTRZ+HHYAB7wb9RUn+zP+pY1AvduyV9WSD+p5ObI4bf6hJL9dBlsWX4Iwn++Ulf3DT91uavzDz9f2assCnOy737yWvHmsvAAmR+Nf66KXcNac5MvYBrah3GY+AGGRXBlSj91dXARw5cusjgOPnE/NZdIP+KRer6uug7Luzs7kXYEJKALDtqTIFxth13wawIvztIOd+uG8hzl/9N2kNm8Ojzty0c05IMouF6in0KZC54h1PRry3BgVLKfn8mDzcBTH1g9P8M3SVpC9PMAkNVd3LoVBQsTMKjoxhIlZ5jmEHK1VZunr0KJ1KxISsDQAP8xdDx1jw5pwk/uS7JaHQ+ziG
aFNtwoi0gg6X6s7f4hhUuur+KfGa9wBRgX9MTDMNii71uGqWMwB8VlMXdC+YEmsMHRLVcwJ0eXRX7SNnIv9TNNJ/1DDdbXL2bx+Bs/Bm8IwpGxEoRrMPLz3g4QF3JINMTUY5PHjMvTVPIolRYd5Q9tKIdK+S1TvYEoYxme58KSLvzhY37x4FQEvX1xtxz0H79Cn6o+Fo6Ixq17/Ak2JV5XsOsjQy0vAO9XtPjKmysSvvriftv1dydpIzqRaoEJRNAGArpf84gQEzQwgSaFwqxagv08Geb6NoEzc6fYGQWMVe0KQxzJmVQHingCD9Wg32d+oCT2+SUdeb+mdcNIRSZmZOlg9g2vVbDymZIkbjJzJ8yuXgDIt9vlqJwzvbBSQMM3C0wZx8ndxRR/2gOFLEG7pTwslBFVoyT8vZL61pfm/Zc0ef9IfOcY3t2yjo77x5fH1tnx1SX8PjFf1vIal5Xq3Nrqpgou8NuiQ9NWUnWVTpdluvdEw7Kl6kN3bX1jvADXuN2nB1ph7daGklbKbCKVc6l+tUAwqGeENSB6lDThb874/GT+LNGPgABXEyKnvkBqsiRIixV6NEoZv3w1S7guV5piSgN/EsdeQ9urhWiBZCkkVrs+WkH52w7lgBeDt6cHHzJARTjv+cmV9jRVVkXPnzdOODr5K/MQKkdyPeQVNYS7kBmpygWru7ZV4cbJ0iWZEdoe064pooL7TWDFU4dh5J0H3fTbD3LKnQxtQyHHNBLkF6vI7ZZ9UZQ47RQz9FYNl51XcRVD2WCaRvDTdg5HdQ5qRdrAkxXAMPYaPW1VjGpslL9AUrX37Qppbuqn6KnXlk15g17Ld4kzaFlXvqWs9ZWSpK2om650PA3jFZzPNBp7AfhEPf4cDIZvwHXAV1XkPwU+ToO70fHI4vEnoYqagSkZyULFWD3sV0enbKsoyhBUmZNtX++F4U2xQlYZ4lohUca77hSszCewVeMZ6IhyYTPMcn8cNULB+KVJwR5J+M5FqfHeo3Zg2syMIDXKEn5hr8IOpf5Tg+AJB1bSTZm6Adgrxrw42WNFpUNXo3jta91XAhUXUero0XtZWrWZMOgGfrlcWqorVKnL/CRD0ynftC9ON3WDAXcTgWPg6/P/oB742rnvrFcTNR3Idw0FUTtRmDkVzTEsjwzMpdpM5KcgtMMAgV+ujxJ2gUkDTNJD9TQac6jBdJnHjV2Gt7G2y+DFYU3HD5OzNS9faIQfDze6P/ETkpoc4kf1tMcV/cp4qguQYMPi7kV1RVhmzderh1c8CqCxueBDNt4cO+Sf2tZRWiPeUTLd5q0GNdCQkL/YaS13XNVGnTY2Nadas004GW1cfi5kqu8YMY6YBugBtVmZxsu9JHV/gz9z9qZ0e5TGYGjuRxks4stSr2hne/HhO/HIXm1QZ1leJutrmkeiMeX31tFFpJEb8yu9b4qmyzmaZWnT57TxXiW/ZPXkic1oQBbFn1TJE+KRqhit6RqDc7dpcCLTzGDZ1UmqHnPSJpw3axOsdD0oyTm/oP+trj3n8uY+3kDR/Q5OtAd4O0XRXWnS2B7kX9qUbh3v8XQVD1nCmMDN9M3ekO9rT3z17RYBKAr1+TxkpoQFyyZz51J7iYZDOLvmWPCbqeV0lN8abOILu0vmMDRbR4kTNCZiEuiPoIK101CBp/il6tA4wUurY8qXRgCUYPPeOdj8Eb5vWfCf5hcUTW8u8m1JfM+xuWtrfPJxFTO0F75mGXgfE4FvQxrMYRfbarM+bXOh967rZU3I0aUBQtqYAyMoipqYszMDJHCSVnF6A8NwcXiha6O/fVyFV0ltK4dXYjWLlxlYyN08xMgURXfxAFVn4ZW+ozqOfE94xXsM5XMb/45PbXzFH9zYe1q04eFyzMcIx3Y7uztPxeMNbQwPYklxCDBAUN+6MaoJ/gzE/rPO/n5nv6vhxLshArPX2X/a6T59ovbjOT6N0HhFQxbA+IyG549GQSwfNZgl8mwTQDfUu2Zz2sLOaYUvzAwOo+ASCuvhyBTIkICUCpEXiPlJvBGthhVcsH6VAycX1DUeZoHW8uQDYxU0DuatR0qNbRCD7cDwJcS8qtn7nNs63HbZ+BIcx882uupr8vqPt3s73Sc7u99+tfFNT740/+u1f1n717WHa921p2vfrL1cG6y9WXPX/rT2H2v/ufZf7/9n+Kvhb4a/FdBf/kLy/ONa68/wn/4XiKnzmQ==</latexit>
A A B A A B
B B
<latexit sha1_base64="mwIShgNPMRbNxfoVdXjn17Y9zBY=">AABkW3ic3V1ZcxxHcsbu+ljDa3vXDj/5pWkAIZA7gDCQQNLa4K4wOEhGAOAIAA8JDU70UTPTQndXs7pnMKNW7z/wq/3b/OD/4syq6quqGoQjKB/LXYnTlV9m3VmZWYfcJAzSbGfnP37281/82Z//xV/+8q9W//pXf/O3f/fr3/z9m5TOmEdeezSk7J3rpCQMYvI6C7KQvEsYcSI3JG/dmwOkv50TlgY0vsyWCbmOnEkcjAPPySDpYjDKRr9e29ne4X8s/Udf/lhbkX+Go994A9un3iwiceaFTppe9XeS7Dp3WBZ4ISlW7VlKEse7cSbkCn7GTkTS65yXtbA2IMW3xpTBP3Fm8dTVVTsmtx6NIif27Sn8tch3ilaiWyU2hPvzIEml+IWQ36RDiZizLFZXN4ziNz4qP11GLg2v8zGlWRSkHoiyGam5cjubEiTGNCNFbo9jwZFXacjRLFCURk42LdREbIpUS0VhIKCVjOyOu1DkQg+70KGRkrwYJ0WrbXM7cRh8Flf969yOaRD70InWup2xwIknIWHBZJqtW3ZGFpk7ztf61taWVVRCwlkU4yjK/1DkD3J7zmYhsW4DP5tau0lmwB3wnH6f2x7kA9LjiR2SDEE4YG3b5l3kQnlTGElTe5pC2Um+tYfSIsi/KNrl90gYpkW+s/3Yi0wkTttFWqt/cxsqNydhcbULFR8nZO6EIP3zzbX+b9d2H6q5GMCMzmJ/c7Pmefiov7ODnEpGHlRfcGKB+Ki8mkAlr0uxWKt8bbcoLLUCFacCVPownicsmBf5fPRBkRAJwuko5z2YermfFBp/OilyaKX+3p5ab9H8E2jEqid2tvegNXURQsau1kGliLaMx08MQkIyzjxGkyKH4Rc9e7rjJtYe/PMEf/ThV88Lg6THx9czzNLGQcM/U21cz9IptHIyDbz0hDjjQ5LAPOPDPIi9cOaTknr1387s2jAOm/nth8nUuVdee/fNCwbVK/d74mXBnFgHThi4jCvsr6zhjBHrMBiPCWiiLHDCcGkNodedjFgDStMM5phFY0ijCWHWCU1TknZKqzhuA5jDmmwh2VvqAlWtAiS++hT5OaTQyDrAlaFaaKwzGqTE8ilJLVCMlk/GxMms/TC0Dmg8JwtrCOpS5MnLBEuW9ZIxkiay2HRsnVKfAH5KAw9WmY3LKbGGQTaGJkhV8le2zfO6szBaGaqc/7C6Yb2dAvEmCENYrkDNKxgAnECfOUxkm5bSEAktFWEGUCbRaDFJu1ssnVKWFa027ILuswVO725oOkuSkODCXOT7w2En9YTGMH8varQ6vQk0KxYKlxsPtN/zAlQKLG0LUCxBvKzUy23oFPhHY09DWI0+zh86PywNAhKySIA9MTMBUfAoahHWtRYeEzSUzxWDxAmgJdI0KBuHDiinMtvz460TtaBsjIVpQo40MU44oWik1DDXT0ixPgIKKI71QoVDooujrGjwKjJdkjkZIygTf47a9eEUrVERWMo1MAmSkQssSt/MxCmFaubMkgmU4KZHZtD3Pb4EZm7Ytm+kilyARWhFFPROBPOIxbCYNkDpzB0HE9BKbd5oFmYBo7ftVDdCWWB++RYOu9RqUQ3mYQyKYswcr7DEnw2wRiMgZpYw41JupvY/3+1ZJPO223VMoWqgJsNemkEKiSegDdomK5jhoJxBqxNvSrwbXB82Vu3UmRPOAi0YJdCU0LApdFjmTTMKAMuyuWj5UX5CTo+ubL5E0HFukziFVsFq8vUCVp4baLyt7cdJZrvB5Jb4E1InFUISDFew1662+DCcEjT1Pgdro08WcmiKtKLYePvy8PLF1snL05eXR4fW4OVz6+3R4fMjkNJGbliraG/tgYQC64b1ovFVH90BNF5E2XG8bOGXrDBAlQGTBTc/iL7BX3x9Yks03+lt2oP2o2K9SrcTqDGMFhg58aSHZuOE22VpL6FpgBBMHwdZLyGw9OoioTieOpemQRiAqY79AzYYDpMcR3VtA+uaYT+7pcXVF9dX+9eiy
vt´1
<latexit sha1_base64="EocFLurBj3IGQBKlWbyegK+eOSQ=">AABkX3ic3V1ZcxxHcsauj13Dl9Z+cvilaQAhkB5AGEggaW1wVxgcJCMAEAJAEhIanOijZqY13V3N6p7BjFq9/8Gv9i/zo/+JM6uqr6pqkI6gfCx3JU5Xfpl1Z2VmHXKTMEiznZ3/+MUv/+RP/+zPf/Xrv1j9y7/667/5289+83dvUjpjHnnt0ZCya9dJSRjE5HUWZCG5ThhxIjckb93pAdLfzglLAxpfZcuE3EbOOA5GgedkkPR2PsyzrX4x/GxtZ3uH/7H0H335Y21F/jkf/sYb2D71ZhGJMy900vSmv5Nkt7nDssALSbFqz1KSON7UGZMb+Bk7EUlvc17ewtqAFN8aUQb/xJnFU1dX7ZjceTSKnNi3J/DXIt8pWoluldgQ7s+DJJXiF0J+kw4lYs6yWF3dMIrf+KD8dBm5NLzNR5RmUZB6IMpmpObK7WxCkBjTjBS5PYoFR16lIUezQFEaOdmkUBOxKVItFYWBgFYysjvuQpELvexCp0ZK8mKUFK22ze3EYfBZ3PRvczumQexDJ1rrdsYCJx6HhAXjSbZu2RlZZO4oX+tbW1tWUQkJZ1GMIyn/fZE/yO05m4XEugv8bGLtJpkBd8Bz+l1ue5APSI/HdkgyBOGgtW2bd5EL5U1hJE3sSQplJ/nWHkqLIP+iaJffI2GYFvnO9mMvMpE4bRdprf7NbajcnITFzS5UfJSQuROC9C821/r/vLb7UM3FAGZ0FvubmzXPw0f9nR3kVDLyoPqCEwvER+XNGCp5W4rFWuVru0VhqRWoOBWg0ofxPGHBvMjnw/eKhEgQToc578HUy/2k0PjTcZFDK/X39tR6i+YfQyNWPbGzvQetqYsQMna1DipFtGU8fmIQEpJR5jGaFDkMv+jZ0x03sfbgnyf4ow+/el4YJD0+vp5hljYOGv6ZauN6lk6glZNJ4KUnxBkdkgTmGR/mQeyFM5+U1Jv/dma3hnHYzG8/TCbOR+W197F5waB65f5AvCyYE+vACQOXcaX9tXU+Y8Q6DEYjApooC5wwXFrn0OtORqwBpWkGc8yiMaTRhDDrhKYpSTulVRx3AcxhTbaQ7C11gapWARJfgYr8AlJoZB3gylAtNtYZDVJi+ZSkFihGyycj4mTWfhhaBzSek4V1DupS5MnLBMuW9ZIxkiay2HRknVKfAH5CAw9WmY2rCbHOg2wETZCq5K9tm+d1b2G0MlQ5/351w3o7AeI0CENYrkDNKxgAnECfOUxkm5bSEAktFWEGUCbRaDFJu1ssnVCWFa027ILuswVO725oOkuSkODCXOT75+ed1BMaw/y9rNHq9CbQrFgoXG480H7PC1ApsLQtQLEE8bJSL3ehU+AfjT0NYTX6MH/o/Lg0CEjIIgH2xMwERMGjqEVY11p4TNBQPlcMEieAlkjToGwUOqCcymwvjrdO1IKyERamCTnSxDjhmKKRUsNcPyHF+hAooDjWCxUOiS6OsqLBq8h0SeZkjKBM/Dls14dTtEZFYCnXwCRIRi6wKn0zE6cUqpkzS8ZQgmmPzKDve3wJzNywbd9IFbkAi9CKKOidCOYRi2ExbYDSmTsKxqCV2rzRLMwCRu/aqW6EssD88i0cdqnVohrMwxgUxYg5XmGJPxtgjUZAzCxhxqXcTO1/sduzSOZtt+uYQtVATYa9NIMUEo9BG7RNVjDFQTmDVifehHhTXB82Vu3UmRPOAi0YJdCU0LApdFjmTTIKAMuyuWj5UX5CTo9ubL5E0FFukziFVsFq8vUCVp4pNN7W9uMks91gfEf8MamTCiEJhivYazdbfBhOCJp6X4C10ScLOTRFWlFsvH15ePVi6+Tl6curo0Nr8PK59fbo8PkRSGkjN6xVtLf2QEKBdcN60fimj+4AGi+i7DhetvBLVhigyoDJgumPom/wF1+f2BLNd3qX9qD9qFiv0u0EagyjBUZOPO6h2TjmdlnaS2gaIATTR0HWSwgsvbpIKI6nzqVJEAZgqmP/gA2GwyTHUV3bwLpm2M/uaHHz5e3N/q3oMpdA//NqJIGX8QHL+06kpx6sAiIF0mIY6DdBHMNiCi3wDHynHp1l9dettek8hM4CNRBnVBjBvy2Z0XAA+jY4gbBg38FQemgxWCK59W5trgMphtk2ebi1Cf1iJ+NRZVr0dh6uP/ztx5aih9V/trZ7C+MGSiD4YNz5zdooxSHOz12cLxvF4YVpNbmm64U53LSF1YUOVpn9JAFJwUKlBeOaFX4rQzbNIhhQzFfUJZgqTJiCYQhDDjRDZoEXZTMmv1T1GoxrlkEwHocKI6axJru2usA4CSm3t8HfeJdviiTNp3GdJUk5iQXpFKx3GyYOYdgf+UnR5LuXERxIJ5KTxShgiP2jS2nkfNJdyASDCE5YV4q3S/U5NFjid1wVNZrhrlu8QAUxr36SBvfUuoK6Dmug6VzWWAC6KgraGZiOPyQfYFkLN8x0ZBSAHmQiOyiIHzgRjf226I6yNzurWd37+gjK1GA7NvCotgiOD8KYeVi1zAVAGaw9LgDWso8QcOpkBgF8fVWl8MS2sE5pqlZ4Hzou+P9G6zN9bzB43zdnYMVvqmf6vqOaFVNn5WrWjro1RBinlSwmn1Pit2lCeU5MoZJ8xAyblVG8dzq+p5WAqjcTsjTbqRJhaiggdrRUzdbZVA3mjrbqFIIEriHEjOksZBDPK6iOfYcWT0c3lM0glgnx0dUR2I5lT3SVBNIrfdOqVxv2/at7+mvni77eX8DR7K5SgKm3vn/V0VkVU2df1awdXdUQYWxPWUzenOJ3V2t+/6pqzLoyJuU5ZbLmWRD6hDdA8TFL69SgBaUMXpf7l4OSu+L4uIVSFrZeKkXCBxZLpYp392RW4uVYl20C7fgRNRIjU/IcKwxK04NbkTkTHnhAC/tm3Z4TmVqs3+ZuOAOHF23z9XfN4GL/Sw9cv0cFetI+GUGbCyveoYAeu+ih7PQssD/xXzsYsLUGJEPj0qcZuLDUn1U+36odOVPiZCEHNEr3CE01DxhEUCNxOKJM/CbfVoKpFf4bHtFFHhdcAXsuIuFQfj4JKwdpbbfI1+3om2yy1rfdWQji1+VSJ0pEoR2Y2mDOmEcHVN/FKr0WNUIy5o2WLaCJCzMtCvxOmkaAKjZpit3iMZoVN19h7dFhF9sCZVC72YFPof9sFjoJeKg0czLRIF8C7qviD1APq92vvsOmGGCIi/zi+QC6t7e7s9fbUcoGGU6/I2lRNUzNBs5V6DDwkLnCATt+ytu5Yd0nAW7JGCSe0aLd0mU1wOctSrE+eKE5Rtc1jTUmMRT4W3VXR2xeNSIl7Qo/x20sjJgscSw/3VPG+QnO6YOlE1fD/enTXr/XVzPPsMA4GlsVgLkEXbJeubu72kzORKs1ORtt+WH+yBO7GuooYHJ345PWtdp6AmP5d7y4kNAQXXiFinQ15N0EIwIGqN8hlGmOZUI80M+4BVQ0doK4DnuwuydC/BvW/vn50dnhy+vVturwAuaFxOcaHT3cm3Kb99mmB17
UNn4+vM2l89x0odOJk5Bngr/nM+eu13Cqt7/k0QUU8VC40ZrSjr0JhX6xjw/P94uNjbwsijQXHIPtO5mNRhj4t08OzwcGntQUIB6LILWKTQxzJp0lPBKE6Iblsm7L9HWDfJ/exWamiiLYMI7Q2PEdgeNDIpJNqP/sKph+f5tHPk9DPx9+iqGYZsuQ5KfLY8bDuiK6FpZj9BkO3h5P4yGMKuLxbDfJRDoPWnmUQc88w83znsWTeVdlNAGNNA5iHvao012aZVg/TrLLAZFOg6QB4vu3EtNv8+Num0apQ2h10R/0dx6ICbDaVecTbjn8EdWb1/fB450HHqqWznrzvZ4/zorjWqCsSriNw5f21VZU/gajyH5vjqXxb3PcomBBNol2ORAyjJzFF1EQr9qHxIMFkYCbPXmVEAY2DHsEOgYQAehv+fd9MGchYI4WnIspBkPO+vuxEy5hVUkLeyELq+KSNtA6D51sRFnUwYGuI9g0Y7+O/p3Qsb11QeBfl88PNf2EZkHRCsk37Yo2dhQs+EaQ6BfQX5DEtZP4tzAf5Ep6/PLaEOs4e3V1JEx9hAVZXsKvr69v1vq3uv48oJT5SgH5BskoP9DEg943Q/c16EkXdKBB9++RagHC6hSjVOVjq7H/sZUYfHQVOpAfUYP2kSECflNIYfzbcxj08kMFJZNAAPCH4pR/L/dVXTf/Xi3lG+JlYnRwxMjgAmLQ6wMIbD2AYd0UykBSBhrlVFJONcq1pBwWw4/rk9Pv7ufQOyeGPOJhX039DlN3ldRDB7xJIR/mF/iCmHDO6DwAhz0/Dlia9U6cVDX+uUH08TPjo8fVGywMdhvaVgrJLUmuRkKN40alc6xSo5Ix0kisJDGNNCtJM400L0lzjXRXku400qIkLTTSsiQtNdKPJelHnUQYLak7GpXGpCRqY/pNhLXDzWF5RA+8kaJQjm69iVVQrDUFePQYMRDe0Fr/nQ3LtyXcGAU2xepLuOyxN1pDcFxgAA4DE5TXQoFGXWUclIW0xM4sI1Cjw/PjIi9jE4W55INWkQb3lwkhB+acTj6Y00Erp4N7cuKx1ikRwUj1uB5zcIGLHTd01ONo0JlglIEp056QLzpq8sJUk5aQeyqUMSfAHWgtt8tO5MzVwZ9rdScOi8Eu4rXP1Cpi7UWY+JCEGvVNTW4ObAFVG2uZYMQGGiDZXOv/1sKTkQbEKOYLCGIeqlOoFjEZoncvMIoGPj/ePzlBzwxV71Dq1gJPb1wP91XsSRvLdavEDnTBpeuqSSn9U6W8gqWpqXFhOC403SBEqMAT3UYBiS+KVtWs5gCStbSCbPPFQ62UDU4srYFzIDm1TC/VTJujsZHrpZ7rpZqrzjqQrFq2V53ZiqHdyHmg53zVmXOTeyC5ta7j7OfHQ5VN7zuBPNGR7SLNxDnO2bAKhYgB3Aa5FagMfhhQcyFqfq+ouVuB2qKUHOFDiHNM5YEPIcf53JRHxex2UAWz+7kx61FV2WNjDUZVDUo6BraOzg6ti5cHL/YvDhWBp6fYF6enelcoFgLC3iw/1GN4BlPYs6CYQzscheDu8JMT8M1s5pEg1E7PucyZyu03cJJv7Afibx4Ts2zuU99CovihxryRWWhAYLxBFsAaUW5dMMwD//e1+JI5YRFvIU3Q+JdJzqA6HiLlDBoyBlyGTEX+toCj2JOl4OU2mPuAiD8McRsQ1wwZNCADeY65hTlnP4jldcO6IxZua/Dzsneg5PAXnkN2Ygu81SB22LLckbEs3jvb4jwbmGRh6JMwiAI8qosOPOgIZ5ZR0MqBx89DM5IGPwrZrsPE2WbRmSAMhsYGJ41msYdHxjCDeXnYCjvhJ9Q6CSMZAb8uyD5PLccKgyzDQoJ8wiwns2LKIie0MCMUwKX/xJc8Lj5ILfg/ZlOVdZW3hyHYgQ1DwdY/16LIeKj4PT8gWJoiuPSwKN8uivyZ2gVkgYela+/wSAWAPxmALiI15I22r8hq4rlmH3DqxMmaEDX0mqSsjpzAl7ZfnKReDcAPXYLfkuAbTmi4PwDkMhhHsHREuEmMh1N71g/DR5o4N2pDIxMmNslTBvAjqPaS7wvlNv41cqIgXObJZF7YKQmh7cWmkX5jh6WjVMnyeVE5hc/V4ryuaa9V2mFNO9ScnJqmdezbmvZWm7s1TRsz1zXtWjOK9oumF6wQD4pmgEYhDoqmT9zO8rua9p1K+7amfauFxmramUq7qGkXGu0+IhqfJVE7/n1c045V2vnRxWlN1ubSZYOouQfnpDwSWLEPmxumO3gnp9rk3VB0Blj2Ir5YyK/WMXor0uYsB4lIQs1Rb70b4aC7ogaa19aEE+6NhF1qS0KjtGyCvn19wI6TCrXsPdC1ANQ0QujIFapxwEtOV0u0kwpPQX+HLR6e0mbSs0mK/MRJtPz9ZIQHvmvFdXi+dUzwgJmKw5WjBTvDBA0GlsuiBTtaXNELx6epfg4mmOElhFmSqMHCaCYp0Uy73QCiCsmrMBEnRhJn1pnKOwTlnQdIGoi7A3oO4sLBGT3Bww2vIjJ21rcqxu193zFzlowXdTYldn1YLT4wV9e1DUzJefkhTnBmupkN3N1YU1abMZeutKujNN3purwCsuUam+GOONMafaevpA7LaA1wCNUQbRHuli6EYq+0YVVX3enH/mBgOo3l2U+2mHFMju9CPvDGToRrarMS2gbv+7CoLBv80AddTccP3axpWzTcA3kFViLbwpgR3s3S7l8xMq+3UdAQwQTpvyhTPmnYK/ihZO81usCjWu1kE5AwTIK6Rd6JOFb7gtPEYXibxlJFzIm0rE03YsE98ki9WEATZazgFjPURtilqs/GT8lWTCJ/UX6hAsXConVTGXsTCw6sDZQRyE38XeRX8sdNSrhdfdvGeZTREEzfZZEfVD8BIulin5XvufKLHKLIJXMjPT+sf3dkFRIoMqhp/leN4RJJPA8YjfH4QR4tE0bpCKNLLr+rR0dfQ+vnuWV770e+JZvawJJO8cJOk9ESSU1+ZIeC89+weo8CWFz4qa91efireX96Z/tpkq3acyBYGX2cgI094WR+OU2QJXRve5dDhUAVVOhiUU1a7U3cqttW7+3LdruShRMlIYwbmXCbH8kUa7Wz/VsUJ01nUSL6cb/+bbXzgSYu7/M08jpvpLbzw4gcA/11If5eNQ+7WlJr/JlHWQ1uDDd1QMd4pxZvnjRFV4lttLMIaNQA7vNv1FSf7M/qhjUC927JX1ZIP6nk5sjht/qEkv10GWxYQQTCf75SV/cNP3W5q/MPP1/ZqywKc3LgffJa8eay8ABZEI9/rordwFpzmy9gGtoHNEqCEMMiuDKln7o6uIjhSxcZpeEn7qfmEhnEPFLP11XPZXl/ezvWjoCEZMFBuxKEq+2wD35N6NMs7XG3bijPUf4/bQeZzcuD0xP5iIZ8EAXXS/RTCPPAM4Safm0ZDoxK9oszebAZeOoDqxdn+CZJS4h+HgCyuo9bt6JgYQIGFd1YouQM0xxCrrZq8/RlJJGaFUkcMDTAf4xcX92jQ5rwk08kWa2Oj1nEs0IbboTFTijpAd
V2/5DCJddX8c+MV7hCjAv64mEYbNF3LcPUNZiD4rKYNyH8QBPY4OiWK5jjw6siP24buVf6mabjkwMNdqJfzOLxN34M3hCEc8ZKEK7ByM97u0BcyCHREFOPTR4zLk9TyaNUWnSUP7ShHCrlt0z1BiKMZXieC0u6CIaP+MWDUxH0DsTdctB//Ap9qvpYOCIat+7xJ9iUeF3Bro8MtbwAvF/R4itvrkh498X9tuvvTdJGdCLVAhOIIA0EdL/mESEmbGBCTQqBWbUE+3kyzPVtAnfmTbEzChir2hUGGsuZVAeKeAIP1aDfZ36ghAb8ko68X9O6YaQiEzOydDBPDK9VsPKZkoQ2mbkTZlcvAORb7XJUzpleWCmg4ZuFpoxpcn8xxZ/2QHGWoN1SHhbKHFWjJPy9kvrWl+b9lzR5/0h85xje3bAOj06Oro6ss6PrK/h9bL6s5TcuK9W5tdVNFVzgt0WHpq2k6iqdLst074lEZUvVh+7a+sZ4Aa5xu08PtMLarQ0lrZTZRCrnUv1qgWBQzwhrQPQoacLfnAn4yfxZoh8BAa4mRE59gdRkSZAWK/RJnDJ++WqWcF2uNMWUhMGEUr+h7dVCtECyFBKrXR+toPxth3LAi8GrH7JNM0DFOO/5yZX2NFVWRT+YN044uvlL8xAqR3I95BU1hLuQmVOVC1Z3bavCo8nSczIjtD2mPVNEBfebwIonLsPIOw+66bcf5JQ7HtqGQo5JLMjPu8jtln1elDjtFDP0Vg2XnVdxFUPZYJpGCFK176oc1Iq0gccdwIj6jZ62KkY1NspfIKna+65DmpcGKXrqtWVT3qDX8l3iDFrWlW8pa32ldNJW1E1XOr6G8QvOZxqNgxB8ogF/DgbDN+A64Ksq8q8CH6fB3Wg6snj8SaiiZmBKRrJQMVYP+9XRKdsqijIEVeZk2ze7UXRbdMgqQ1wdEmW8617BynwCW5XOQEeUC9vANCHiRigYvzQp2CMJ37koNd471A5Mm5kxpMZZwi/sVdih1H9qEDzhwEq6KVMvBHvFmBcn+6yodGg3ite+1n0lUHERpY4evZOlVZsJg27gl8ulpbpClXosSDI0nfJ1+/J0XTcYcDcROM4Dff7v1wNfO/edDWqipgP5rqEgaicKM7eiuYblkYG5VJuJ/BSEdhggDMr1UcIuMekck/RQPYnHHGowXea0scvwhmq7DD6Najp+mJyteflCI/zYXOv/xE9IanKcIK6nPa7o18ZTXYAEGxZ3L6orwjJrvl5tXvMogMbmgQ/ZeHPsgH9qW0dpjXhLnOkWbzWogYaE/MVOa7njqjbqtLGpOdWabcLJaOPycyFTfceIccQ0RA+ozco0Xu4lqfsb/Jmz16XbozQGQ3M/zmARX5Z6RTvbiw/fiUf2aoM6y/IyWV/TfCceE35vHV1EEnuUX+l9XTRdztEsS5s+p433Kvklq8ePbUZCZ1H8QZU8cXynKkZrulJw7tYNTmSaGSy7OknVY27ahPNmbYKVrgclOecX9L/Vtedc3tzHGyi638GJ9jneTlF0V5o0tgf5lzalW8d7fF3FQ5YwJnAzfX0w5Pvak0B9u0UAikJ9Pg+ZicPCZZO5d6W9RMMhnF1zLPjN1HI6ym8NNgmE3SVzGJqto8QNGxMxCfVHUMHaaajAU/xSdShN8NLqmPClEQAl2Lx3DjZ/jO9bFvyn+QVF05uLfFsS33Ns7toan3zsYob2wtcsQ/9DIvBtSIM57GFbrdenbS713vX8rAk5vDJAnDZm3wiK4ybm7MwACd2kVZzBuWG4uLzQtdHfPq7Cq6S2lcsr0c3iZwYW534ex8gUx/fxAFVn4ZW+pzqufE+44z2G8rmNf8GnNr7iD27sPina8Gg55mOEY/u9ne0n4vGGNoYHsaQ4BBggqG89imqCPwOx97S3t9fb62s48W6IwOz29p70+k8eq/14gU8jNF7RkAUwPqPhB6NRSOWjBrNEnm0C6Jp612xOWtg5qfCFmcFlBFxCYT0cmgIZEpASIfISMT+JN6LVsIIH1q9y4OSSeMbDLNBavnxgrILScN56pNTYBhRsB4YvIeZVzd7l3Nbhtsval5oyaz1Vl2lPzYDn14Zs8aSl+c2djifDNg2vqNX4e16ww4Lr5yuqR+v04sowVFbHobKuEJSEVhWU70WPuhlokHXEAY1lyepH5zLDo3AZiRICg5oskir/Y1O+P9Iy4JDb5ROp+U6R69sseG5PYg3n8ZKQPwtTXaJ2dCu1OqaIWxuqBHwCCBURPiHzP/LQpmjmta+s9YZPjzbZ/+7rm13F+r/0JOd9ZTS+06ku/ah0Z/XTOjQM5jxqitGS14YX/xA1rvFCxQv48w44a7/cI9EXHWihIapBeHN6K7hkyfRpw51y0nA1pO93oBzyEs+hDD9b66v/jQ39x5vd7f7j7Z1vv1r7ZiD/+xu/XvnHlX9a2VzprzxZ+Wblxcr5yusVb2W68q8r/7by795/+r/y/8b/TEB/+QvJ8/crrT/+P/wXeJfVZw==</latexit>
vt´1
<latexit sha1_base64="EocFLurBj3IGQBKlWbyegK+eOSQ=">AABkX3ic3V1ZcxxHcsauj13Dl9Z+cvilaQAhkB5AGEggaW1wVxgcJCMAEAJAEhIanOijZqY13V3N6p7BjFq9/8Gv9i/zo/+JM6uqr6pqkI6gfCx3JU5Xfpl1Z2VmHXKTMEiznZ3/+MUv/+RP/+zPf/Xrv1j9y7/667/5289+83dvUjpjHnnt0ZCya9dJSRjE5HUWZCG5ThhxIjckb93pAdLfzglLAxpfZcuE3EbOOA5GgedkkPR2PsyzrX4x/GxtZ3uH/7H0H335Y21F/jkf/sYb2D71ZhGJMy900vSmv5Nkt7nDssALSbFqz1KSON7UGZMb+Bk7EUlvc17ewtqAFN8aUQb/xJnFU1dX7ZjceTSKnNi3J/DXIt8pWoluldgQ7s+DJJXiF0J+kw4lYs6yWF3dMIrf+KD8dBm5NLzNR5RmUZB6IMpmpObK7WxCkBjTjBS5PYoFR16lIUezQFEaOdmkUBOxKVItFYWBgFYysjvuQpELvexCp0ZK8mKUFK22ze3EYfBZ3PRvczumQexDJ1rrdsYCJx6HhAXjSbZu2RlZZO4oX+tbW1tWUQkJZ1GMIyn/fZE/yO05m4XEugv8bGLtJpkBd8Bz+l1ue5APSI/HdkgyBOGgtW2bd5EL5U1hJE3sSQplJ/nWHkqLIP+iaJffI2GYFvnO9mMvMpE4bRdprf7NbajcnITFzS5UfJSQuROC9C821/r/vLb7UM3FAGZ0FvubmzXPw0f9nR3kVDLyoPqCEwvER+XNGCp5W4rFWuVru0VhqRWoOBWg0ofxPGHBvMjnw/eKhEgQToc578HUy/2k0PjTcZFDK/X39tR6i+YfQyNWPbGzvQetqYsQMna1DipFtGU8fmIQEpJR5jGaFDkMv+jZ0x03sfbgnyf4ow+/el4YJD0+vp5hljYOGv6ZauN6lk6glZNJ4KUnxBkdkgTmGR/mQeyFM5+U1Jv/dma3hnHYzG8/TCbOR+W197F5waB65f5AvCyYE+vACQOXcaX9tXU+Y8Q6DEYjApooC5wwXFrn0OtORqwBpWkGc8yiMaTRhDDrhKYpSTulVRx3AcxhTbaQ7C11gapWARJfgYr8AlJoZB3gylAtNtYZDVJi+ZSkFihGyycj4mTWfhhaBzSek4V1DupS5MnLBMuW9ZIxkiay2HRknVKfAH5CAw9WmY2rCbHOg2wETZCq5K9tm+d1b2G0MlQ5/351w3o7AeI0CENYrkDNKxgAnECfOUxkm5bSEAktFWEGUCbRaDFJu1ssnVCWFa027ILuswVO725oOkuSkODCXOT75+ed1BMaw/y9rNHq9CbQrFgoXG480H7PC1ApsLQtQLEE8bJSL3ehU+AfjT0NYTX6MH/o/Lg0CEjIIgH2xMwERMGjqEVY11p4TNBQPlcMEieAlkjToGwUOqCcymwvjrdO1IKyERamCTnSxDjhmKKRUsNcPyHF+hAooDjWCxUOiS6OsqLBq8h0SeZkjKBM/Dls14dTtEZFYCnXwCRIRi6wKn0zE6cUqpkzS8ZQgmmPzKDve3wJzNywbd9IFbkAi9CKKOidCOYRi2ExbYDSmTsKxqCV2rzRLMwCRu/aqW6EssD88i0cdqnVohrMwxgUxYg5XmGJPxtgjUZAzCxhxqXcTO1/sduzSOZtt+uYQtVATYa9NIMUEo9BG7RNVjDFQTmDVifehHhTXB82Vu3UmRPOAi0YJdCU0LApdFjmTTIKAMuyuWj5UX5CTo9ubL5E0FFukziFVsFq8vUCVp4pNN7W9uMks91gfEf8MamTCiEJhivYazdbfBhOCJp6X4C10ScLOTRFWlFsvH15ePVi6+Tl6curo0Nr8PK59fbo8PkRSGkjN6xVtLf2QEKBdcN60fimj+4AGi+i7DhetvBLVhigyoDJgumPom/wF1+f2BLNd3qX9qD9qFiv0u0EagyjBUZOPO6h2TjmdlnaS2gaIATTR0HWSwgsvbpIKI6nzqVJEAZgqmP/gA2GwyTHUV3bwLpm2M/uaHHz5e3N/q3oMpdA//NqJIGX8QHL+06kpx6sAiIF0mIY6DdBHMNiCi3wDHynHp1l9dettek8hM4CNRBnVBjBvy2Z0XAA+jY4gbBg38FQemgxWCK59W5trgMphtk2ebi1Cf1iJ+NRZVr0dh6uP/ztx5aih9V/trZ7C+MGSiD4YNz5zdooxSHOz12cLxvF4YVpNbmm64U53LSF1YUOVpn9JAFJwUKlBeOaFX4rQzbNIhhQzFfUJZgqTJiCYQhDDjRDZoEXZTMmv1T1GoxrlkEwHocKI6axJru2usA4CSm3t8HfeJdviiTNp3GdJUk5iQXpFKx3GyYOYdgf+UnR5LuXERxIJ5KTxShgiP2jS2nkfNJdyASDCE5YV4q3S/U5NFjid1wVNZrhrlu8QAUxr36SBvfUuoK6Dmug6VzWWAC6KgraGZiOPyQfYFkLN8x0ZBSAHmQiOyiIHzgRjf226I6yNzurWd37+gjK1GA7NvCotgiOD8KYeVi1zAVAGaw9LgDWso8QcOpkBgF8fVWl8MS2sE5pqlZ4Hzou+P9G6zN9bzB43zdnYMVvqmf6vqOaFVNn5WrWjro1RBinlSwmn1Pit2lCeU5MoZJ8xAyblVG8dzq+p5WAqjcTsjTbqRJhaiggdrRUzdbZVA3mjrbqFIIEriHEjOksZBDPK6iOfYcWT0c3lM0glgnx0dUR2I5lT3SVBNIrfdOqVxv2/at7+mvni77eX8DR7K5SgKm3vn/V0VkVU2df1awdXdUQYWxPWUzenOJ3V2t+/6pqzLoyJuU5ZbLmWRD6hDdA8TFL69SgBaUMXpf7l4OSu+L4uIVSFrZeKkXCBxZLpYp392RW4uVYl20C7fgRNRIjU/IcKwxK04NbkTkTHnhAC/tm3Z4TmVqs3+ZuOAOHF23z9XfN4GL/Sw9cv0cFetI+GUGbCyveoYAeu+ih7PQssD/xXzsYsLUGJEPj0qcZuLDUn1U+36odOVPiZCEHNEr3CE01DxhEUCNxOKJM/CbfVoKpFf4bHtFFHhdcAXsuIuFQfj4JKwdpbbfI1+3om2yy1rfdWQji1+VSJ0pEoR2Y2mDOmEcHVN/FKr0WNUIy5o2WLaCJCzMtCvxOmkaAKjZpit3iMZoVN19h7dFhF9sCZVC72YFPof9sFjoJeKg0czLRIF8C7qviD1APq92vvsOmGGCIi/zi+QC6t7e7s9fbUcoGGU6/I2lRNUzNBs5V6DDwkLnCATt+ytu5Yd0nAW7JGCSe0aLd0mU1wOctSrE+eKE5Rtc1jTUmMRT4W3VXR2xeNSIl7Qo/x20sjJgscSw/3VPG+QnO6YOlE1fD/enTXr/XVzPPsMA4GlsVgLkEXbJeubu72kzORKs1ORtt+WH+yBO7GuooYHJ345PWtdp6AmP5d7y4kNAQXXiFinQ15N0EIwIGqN8hlGmOZUI80M+4BVQ0doK4DnuwuydC/BvW/vn50dnhy+vVturwAuaFxOcaHT3cm3Kb99mmB17
UNn4+vM2l89x0odOJk5Bngr/nM+eu13Cqt7/k0QUU8VC40ZrSjr0JhX6xjw/P94uNjbwsijQXHIPtO5mNRhj4t08OzwcGntQUIB6LILWKTQxzJp0lPBKE6Iblsm7L9HWDfJ/exWamiiLYMI7Q2PEdgeNDIpJNqP/sKph+f5tHPk9DPx9+iqGYZsuQ5KfLY8bDuiK6FpZj9BkO3h5P4yGMKuLxbDfJRDoPWnmUQc88w83znsWTeVdlNAGNNA5iHvao012aZVg/TrLLAZFOg6QB4vu3EtNv8+Num0apQ2h10R/0dx6ICbDaVecTbjn8EdWb1/fB450HHqqWznrzvZ4/zorjWqCsSriNw5f21VZU/gajyH5vjqXxb3PcomBBNol2ORAyjJzFF1EQr9qHxIMFkYCbPXmVEAY2DHsEOgYQAehv+fd9MGchYI4WnIspBkPO+vuxEy5hVUkLeyELq+KSNtA6D51sRFnUwYGuI9g0Y7+O/p3Qsb11QeBfl88PNf2EZkHRCsk37Yo2dhQs+EaQ6BfQX5DEtZP4tzAf5Ep6/PLaEOs4e3V1JEx9hAVZXsKvr69v1vq3uv48oJT5SgH5BskoP9DEg943Q/c16EkXdKBB9++RagHC6hSjVOVjq7H/sZUYfHQVOpAfUYP2kSECflNIYfzbcxj08kMFJZNAAPCH4pR/L/dVXTf/Xi3lG+JlYnRwxMjgAmLQ6wMIbD2AYd0UykBSBhrlVFJONcq1pBwWw4/rk9Pv7ufQOyeGPOJhX039DlN3ldRDB7xJIR/mF/iCmHDO6DwAhz0/Dlia9U6cVDX+uUH08TPjo8fVGywMdhvaVgrJLUmuRkKN40alc6xSo5Ix0kisJDGNNCtJM400L0lzjXRXku400qIkLTTSsiQtNdKPJelHnUQYLak7GpXGpCRqY/pNhLXDzWF5RA+8kaJQjm69iVVQrDUFePQYMRDe0Fr/nQ3LtyXcGAU2xepLuOyxN1pDcFxgAA4DE5TXQoFGXWUclIW0xM4sI1Cjw/PjIi9jE4W55INWkQb3lwkhB+acTj6Y00Erp4N7cuKx1ikRwUj1uB5zcIGLHTd01ONo0JlglIEp056QLzpq8sJUk5aQeyqUMSfAHWgtt8tO5MzVwZ9rdScOi8Eu4rXP1Cpi7UWY+JCEGvVNTW4ObAFVG2uZYMQGGiDZXOv/1sKTkQbEKOYLCGIeqlOoFjEZoncvMIoGPj/ePzlBzwxV71Dq1gJPb1wP91XsSRvLdavEDnTBpeuqSSn9U6W8gqWpqXFhOC403SBEqMAT3UYBiS+KVtWs5gCStbSCbPPFQ62UDU4srYFzIDm1TC/VTJujsZHrpZ7rpZqrzjqQrFq2V53ZiqHdyHmg53zVmXOTeyC5ta7j7OfHQ5VN7zuBPNGR7SLNxDnO2bAKhYgB3Aa5FagMfhhQcyFqfq+ouVuB2qKUHOFDiHNM5YEPIcf53JRHxex2UAWz+7kx61FV2WNjDUZVDUo6BraOzg6ti5cHL/YvDhWBp6fYF6enelcoFgLC3iw/1GN4BlPYs6CYQzscheDu8JMT8M1s5pEg1E7PucyZyu03cJJv7Afibx4Ts2zuU99CovihxryRWWhAYLxBFsAaUW5dMMwD//e1+JI5YRFvIU3Q+JdJzqA6HiLlDBoyBlyGTEX+toCj2JOl4OU2mPuAiD8McRsQ1wwZNCADeY65hTlnP4jldcO6IxZua/Dzsneg5PAXnkN2Ygu81SB22LLckbEs3jvb4jwbmGRh6JMwiAI8qosOPOgIZ5ZR0MqBx89DM5IGPwrZrsPE2WbRmSAMhsYGJ41msYdHxjCDeXnYCjvhJ9Q6CSMZAb8uyD5PLccKgyzDQoJ8wiwns2LKIie0MCMUwKX/xJc8Lj5ILfg/ZlOVdZW3hyHYgQ1DwdY/16LIeKj4PT8gWJoiuPSwKN8uivyZ2gVkgYela+/wSAWAPxmALiI15I22r8hq4rlmH3DqxMmaEDX0mqSsjpzAl7ZfnKReDcAPXYLfkuAbTmi4PwDkMhhHsHREuEmMh1N71g/DR5o4N2pDIxMmNslTBvAjqPaS7wvlNv41cqIgXObJZF7YKQmh7cWmkX5jh6WjVMnyeVE5hc/V4ryuaa9V2mFNO9ScnJqmdezbmvZWm7s1TRsz1zXtWjOK9oumF6wQD4pmgEYhDoqmT9zO8rua9p1K+7amfauFxmramUq7qGkXGu0+IhqfJVE7/n1c045V2vnRxWlN1ubSZYOouQfnpDwSWLEPmxumO3gnp9rk3VB0Blj2Ir5YyK/WMXor0uYsB4lIQs1Rb70b4aC7ogaa19aEE+6NhF1qS0KjtGyCvn19wI6TCrXsPdC1ANQ0QujIFapxwEtOV0u0kwpPQX+HLR6e0mbSs0mK/MRJtPz9ZIQHvmvFdXi+dUzwgJmKw5WjBTvDBA0GlsuiBTtaXNELx6epfg4mmOElhFmSqMHCaCYp0Uy73QCiCsmrMBEnRhJn1pnKOwTlnQdIGoi7A3oO4sLBGT3Bww2vIjJ21rcqxu193zFzlowXdTYldn1YLT4wV9e1DUzJefkhTnBmupkN3N1YU1abMZeutKujNN3purwCsuUam+GOONMafaevpA7LaA1wCNUQbRHuli6EYq+0YVVX3enH/mBgOo3l2U+2mHFMju9CPvDGToRrarMS2gbv+7CoLBv80AddTccP3axpWzTcA3kFViLbwpgR3s3S7l8xMq+3UdAQwQTpvyhTPmnYK/ihZO81usCjWu1kE5AwTIK6Rd6JOFb7gtPEYXibxlJFzIm0rE03YsE98ki9WEATZazgFjPURtilqs/GT8lWTCJ/UX6hAsXConVTGXsTCw6sDZQRyE38XeRX8sdNSrhdfdvGeZTREEzfZZEfVD8BIulin5XvufKLHKLIJXMjPT+sf3dkFRIoMqhp/leN4RJJPA8YjfH4QR4tE0bpCKNLLr+rR0dfQ+vnuWV770e+JZvawJJO8cJOk9ESSU1+ZIeC89+weo8CWFz4qa91efireX96Z/tpkq3acyBYGX2cgI094WR+OU2QJXRve5dDhUAVVOhiUU1a7U3cqttW7+3LdruShRMlIYwbmXCbH8kUa7Wz/VsUJ01nUSL6cb/+bbXzgSYu7/M08jpvpLbzw4gcA/11If5eNQ+7WlJr/JlHWQ1uDDd1QMd4pxZvnjRFV4lttLMIaNQA7vNv1FSf7M/qhjUC927JX1ZIP6nk5sjht/qEkv10GWxYQQTCf75SV/cNP3W5q/MPP1/ZqywKc3LgffJa8eay8ABZEI9/rordwFpzmy9gGtoHNEqCEMMiuDKln7o6uIjhSxcZpeEn7qfmEhnEPFLP11XPZXl/ezvWjoCEZMFBuxKEq+2wD35N6NMs7XG3bijPUf4/bQeZzcuD0xP5iIZ8EAXXS/RTCPPAM4Safm0ZDoxK9oszebAZeOoDqxdn+CZJS4h+HgCyuo9bt6JgYQIGFd1YouQM0xxCrrZq8/RlJJGaFUkcMDTAf4xcX92jQ5rwk08kWa2Oj1nEs0IbboTFTijpAd
V2/5DCJddX8c+MV7hCjAv64mEYbNF3LcPUNZiD4rKYNyH8QBPY4OiWK5jjw6siP24buVf6mabjkwMNdqJfzOLxN34M3hCEc8ZKEK7ByM97u0BcyCHREFOPTR4zLk9TyaNUWnSUP7ShHCrlt0z1BiKMZXieC0u6CIaP+MWDUxH0DsTdctB//Ap9qvpYOCIat+7xJ9iUeF3Bro8MtbwAvF/R4itvrkh498X9tuvvTdJGdCLVAhOIIA0EdL/mESEmbGBCTQqBWbUE+3kyzPVtAnfmTbEzChir2hUGGsuZVAeKeAIP1aDfZ36ghAb8ko68X9O6YaQiEzOydDBPDK9VsPKZkoQ2mbkTZlcvAORb7XJUzpleWCmg4ZuFpoxpcn8xxZ/2QHGWoN1SHhbKHFWjJPy9kvrWl+b9lzR5/0h85xje3bAOj06Oro6ss6PrK/h9bL6s5TcuK9W5tdVNFVzgt0WHpq2k6iqdLst074lEZUvVh+7a+sZ4Aa5xu08PtMLarQ0lrZTZRCrnUv1qgWBQzwhrQPQoacLfnAn4yfxZoh8BAa4mRE59gdRkSZAWK/RJnDJ++WqWcF2uNMWUhMGEUr+h7dVCtECyFBKrXR+toPxth3LAi8GrH7JNM0DFOO/5yZX2NFVWRT+YN044uvlL8xAqR3I95BU1hLuQmVOVC1Z3bavCo8nSczIjtD2mPVNEBfebwIonLsPIOw+66bcf5JQ7HtqGQo5JLMjPu8jtln1elDjtFDP0Vg2XnVdxFUPZYJpGCFK176oc1Iq0gccdwIj6jZ62KkY1NspfIKna+65DmpcGKXrqtWVT3qDX8l3iDFrWlW8pa32ldNJW1E1XOr6G8QvOZxqNgxB8ogF/DgbDN+A64Ksq8q8CH6fB3Wg6snj8SaiiZmBKRrJQMVYP+9XRKdsqijIEVeZk2ze7UXRbdMgqQ1wdEmW8617BynwCW5XOQEeUC9vANCHiRigYvzQp2CMJ37koNd471A5Mm5kxpMZZwi/sVdih1H9qEDzhwEq6KVMvBHvFmBcn+6yodGg3ite+1n0lUHERpY4evZOlVZsJg27gl8ulpbpClXosSDI0nfJ1+/J0XTcYcDcROM4Dff7v1wNfO/edDWqipgP5rqEgaicKM7eiuYblkYG5VJuJ/BSEdhggDMr1UcIuMekck/RQPYnHHGowXea0scvwhmq7DD6Najp+mJyteflCI/zYXOv/xE9IanKcIK6nPa7o18ZTXYAEGxZ3L6orwjJrvl5tXvMogMbmgQ/ZeHPsgH9qW0dpjXhLnOkWbzWogYaE/MVOa7njqjbqtLGpOdWabcLJaOPycyFTfceIccQ0RA+ozco0Xu4lqfsb/Jmz16XbozQGQ3M/zmARX5Z6RTvbiw/fiUf2aoM6y/IyWV/TfCceE35vHV1EEnuUX+l9XTRdztEsS5s+p433Kvklq8ePbUZCZ1H8QZU8cXynKkZrulJw7tYNTmSaGSy7OknVY27ahPNmbYKVrgclOecX9L/Vtedc3tzHGyi638GJ9jneTlF0V5o0tgf5lzalW8d7fF3FQ5YwJnAzfX0w5Pvak0B9u0UAikJ9Pg+ZicPCZZO5d6W9RMMhnF1zLPjN1HI6ym8NNgmE3SVzGJqto8QNGxMxCfVHUMHaaajAU/xSdShN8NLqmPClEQAl2Lx3DjZ/jO9bFvyn+QVF05uLfFsS33Ns7toan3zsYob2wtcsQ/9DIvBtSIM57GFbrdenbS713vX8rAk5vDJAnDZm3wiK4ybm7MwACd2kVZzBuWG4uLzQtdHfPq7Cq6S2lcsr0c3iZwYW534ex8gUx/fxAFVn4ZW+pzqufE+44z2G8rmNf8GnNr7iD27sPina8Gg55mOEY/u9ne0n4vGGNoYHsaQ4BBggqG89imqCPwOx97S3t9fb62s48W6IwOz29p70+k8eq/14gU8jNF7RkAUwPqPhB6NRSOWjBrNEnm0C6Jp612xOWtg5qfCFmcFlBFxCYT0cmgIZEpASIfISMT+JN6LVsIIH1q9y4OSSeMbDLNBavnxgrILScN56pNTYBhRsB4YvIeZVzd7l3Nbhtsval5oyaz1Vl2lPzYDn14Zs8aSl+c2djifDNg2vqNX4e16ww4Lr5yuqR+v04sowVFbHobKuEJSEVhWU70WPuhlokHXEAY1lyepH5zLDo3AZiRICg5oskir/Y1O+P9Iy4JDb5ROp+U6R69sseG5PYg3n8ZKQPwtTXaJ2dCu1OqaIWxuqBHwCCBURPiHzP/LQpmjmta+s9YZPjzbZ/+7rm13F+r/0JOd9ZTS+06ku/ah0Z/XTOjQM5jxqitGS14YX/xA1rvFCxQv48w44a7/cI9EXHWihIapBeHN6K7hkyfRpw51y0nA1pO93oBzyEs+hDD9b66v/jQ39x5vd7f7j7Z1vv1r7ZiD/+xu/XvnHlX9a2VzprzxZ+Wblxcr5yusVb2W68q8r/7by795/+r/y/8b/TEB/+QvJ8/crrT/+P/wXeJfVZw==</latexit>
H H H H H H
5.3 Finding $W_{2,t}$
Figure 3: A simple way to build $I_{ti}(z)$ for a discontinuous loss $F$ ($\tilde{e}_{ti} < \tilde{e}_{(t-1)i}$ and $z$ are represented), $O$ being the set of solutions as it is built. We rotate two half-lines, one passing through $(\tilde{e}_{ti}, F(\tilde{e}_{ti}))$ (thick line, $(\Delta)$) and a parallel one translated by $-z$ (dashed line) (a). As soon as $(\Delta)$ crosses $F$ at any point $(z', F(z'))$ with $z' \ne \tilde{e}_{ti}$ while the dashed line stays below $F$, we obtain a candidate offset $v$ for OO, namely $v = z' - \tilde{e}_{ti}$. In (b), we obtain an interval of values. We keep on rotating $(\Delta)$, eventually making several intervals appear for the choice of $v$ if $F$ is not convex (c). Finally, when we reach an angle such that the maximal difference between $(\Delta)$ and $F$ in $[\tilde{e}_{ti}, \tilde{e}_{(t-1)i}]$ is $z$ ($z$ can be located at an intersection between $F$ and the dashed line), we stop and obtain the full $I_{ti}(z)$ (d).
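A brute-force stand-in for this construction is to test candidate offsets directly against the OBI budget. The sketch below reflects our reading of $I_{ti}(z)$ through Definition 4.6 (it ignores the extra dashed-line condition of the geometric construction): it scans a grid of offsets $v$ and keeps those for which $Q_F(\tilde{e}_{ti}, \tilde{e}_{ti}+v, \tilde{e}_{(t-1)i}) \le z$.

```python
import numpy as np

def naive_offset_oracle(F, e_new, e_old, z, v_grid):
    """Grid stand-in for OO: offsets v whose OBI stays below the budget z.
    e_new, e_old play the roles of e~_{ti} and e~_{(t-1)i}; the paper's oracle
    is the rotating half-line construction of Figure 3, not this scan."""
    def obi(a, b, c, grid=5001):
        if a == b:
            return 0.0
        p = np.linspace(min(a, c), max(a, c), grid)
        alpha = (p - b) / (a - b)
        return float(np.max(alpha * F(a) + (1.0 - alpha) * F(b) - F(p)))
    return [float(v) for v in v_grid if v != 0.0 and obi(e_new, e_new + v, e_old) <= z]
```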
<latexit sha1_base64="J1s0OWff6fqNuSDh8yiOyfFBN64=">AAAV23icjVhLb9w2EN70mbqvpK1PvQg1jKaB63iDvi4FEjvZpICbOmn8CLz2gpJGWtaUKJOU7Y2gU29FrwV6bf9Ef0v/TYeUvJJIrdMFEnP5fTPkzHCGw/UzRqXa2Pj32muvv/HmW29ff2fp3ffe/+DDGzc/2pM8FwHsBpxxceATCYymsKuoYnCQCSCJz2DfP9nS+P4ZCEl5+lzNMjhKSJzSiAZE4dTxWFEWQgHlpFC0nNxY2VjfMB/PHQzrwcqg/uxMbn7yzzjkQZ5AqgJGpDwcbmTqqCBC0YBBuTTOJWQkOCExHOIwJQnIo8Jsu/RWcSb0Ii7wX6o8M9uWKEgi5SzxkZkQNZU2pif7sMNcRd8dFTTNcgVpUC0U5cxT3NM+8EIqIFBshgMSCIp79YIpESRQ6Kmlpc46eRYLgJM1yAPC1swelc86lhWxINmUBhe4kpdwAfhfCCJdWm2TZO5HNM5F1ysFugnjA9FakjNFBT/vwn6ilfqchZ6x0+sKG0d2plIaQISmlF71WUW/JggqT7uSM2kcPrxzd80DFaxb1uo1hIwk6kzhHCUTkobF+FFZjM3yUfGotLDdBtu1sb0G27Ox/Qbbt7GHDfbQxg4a7ACx1Q54vwHvO+BWA2454GYDbtpLvmiwFzb2tMGe2tiTBntiY88a7JmNjRpshFgXDCGCUyRIhTETwCqqSIr1siy+t1XBRYZHvdbn+647z4igJA2goTiRykQD7tjg1KBTotoUixNmgp4hS8GFkkERZjZBJg1KHJTGDYrjsntkpUqImInQFsKsKg+HR8WYMR/z4QSUtzL0xkLU3yy+T+NGZJPGMbME9Zxoi1txISybEsalxM0CY8fFrWrqC9scn8xAGkhQeVIWxThPsVzoCl5sl225KwUzLFiJ2e4CBZNiZVi6Wlorby/eZKZLOGGNUcYv869GtyVyDjSeqpYbzherr1g0NeZnkl5h9ZzqE9Fi87Pa4oqwyFCZa6HRq/QjTXV4E+UyEyoEF9VyuBG8PBKehl3VC/beDlbb3KtihHtqiY16ZJZ6zgcI0X+sxlj9L3BW0XTmIass+w8YZvH/UPAjUT0KgikEJ7YWM9lVtlCbZZE8ZcQHVtZsQ56XAnnqbgAFWhk4l++zU54uMHMutNC4RnSBbS0VvWlVb9PkVDXuS6iApFzqUo8nZtI2pktjPL7CS4i6btIibT/NVfQ5CsEFnmrEFrqqJbzAVwuVaMBUiCpjFm6Spmdzqss9Lr4cugeLAYlQMcNGOST2uohJ0Hfmdg138RSbOwTTvHTnjVx1E2K7qG99W9rHoorXbXhepTVG/bgTN79cUEYk+i/RC+ONLabc5owePC+L0aSj67kb+dH2lkPbdlPvNhJmXE1B6NYcF9V/IpJQNisyEpdjfHFgX2HAjqCnnYqOh4u62rXUaC1cpVyBpC/BMyfUj4rDleGRu1HdMhAFxpvNVk0j4ToIy5kiIjZBu6CT24ivYlE5AU9NadXxEsY8fBmF0m5NdLg1gKI+KKKH3jgIufLGgIeJ8dTamXkydeSqR9ScPukWN0Ny95xAMG33PTi27xvQIW8YGP6gj8NaHOZogSRTs2lZTCfF0D2NeXCig1HiUbVzJOVpnSbNXWsmbCWKzPAZIvNMb4PYaZEJnqH4vNw469Q1samISOorhxmnpqRU9aRbRxyTaxLmSh2V7pHfuDPsqYpNCdE2dt8HgS+uXjTjWf/2Lvvi7Z4LLkOorieXNcPpsLGmaFqL8rN7r3CJb1fKzR4zPMZuNhkso7ZyRdAhOsgPgCnStcKqsyHmnglQbdAPZR0mu26bULfibnfaKUacGC/VTxfnxSVpnBDCYsBuG2k/JRA7dRj7o2qp0WTcs0wMaQU/WgTXkaqteVT28nCVLm8051mZyMOWI725oEULJJUBT39pyv9lI+konOmYzpr60kkAt5wQ2UQHfeomUOhwwtLI9UVxk+EzZ5Nf4PIiZ1AM17/Gml7/KfXvCoDVnkdY3jiPjCykZ1TwVP8YVJjZ0iQ3FhKKyaDrvx95OxoYe6V+NEaUsflC4/Hh3SQ5cnog1j7btSeqI25Ro/qQR8f1cbc08WwWENV7+rpXcNDXiuL+8UDPGqnHvU27EqCTebf35oAwhqZLw4KtVAHuzcdlnkaEznO5W7ry3naOKsOmXPX0KyLBR8hLCC95tRerr+7q86u0Z3XVe5MRGc6FzIF6UPaJdaQE+L2Cn79SUvEs5Odpc+k959kDPeH2WpgtGS9b9fFYVzw8GpMbK0P7h013sHd3ffjN+sbTr1bubdY/el4ffDr4bHBrMBx8O7g3eDzYGewOgoEY/Dn4a/D38tHyr8u/Lf9eUV+7Vst8POh8lv/4D5QZjLg=</latexit>
<latexit sha1_base64="Gdi9/xEw8KSure7KkokRL9GkKBg=">AAAWFXicjVhPb9zGFd+kaZuoTeu01amXQQShtqEou27UxgEKJLKzTgE1sVPLdiBKiyH5yJ1qyKFmhpLWBD9Hv0TuOfVW9Fqgt6L9MH0zpJbkDFfpAra48/u9N/P+zuOGBWdKT6f/fuPNH7z1wx/9+O13tn7y03d/9vM77/3ihRKljOA4ElzIVyFVwFkOx5ppDq8KCTQLObwMzx8Z/OUlSMVE/lyvCjjNaJqzhEVU49LizjzgkOigIpeEfEKend1fzO8GmvEYKqgXlWb1Xu/rXf3B7B4uXd4jKHdBXgeSpUsd1Is7O9P96cHDg9mUTPcPprOHvzUPDx9+/NHBAZntT+1nZ9J+ni7e+9V3QSyiMoNcR5wqdTKbFvq0olKziEO9FZQKChqd0xRO8DGnGajTyhpck11ciUkiJP7LNbGrfYmKZkqtshCZGdVL5WJmcQw7KXXy8WnF8qLUkEfNRknJiRbEeI/ETEKk+QofaCQZnpVESypppNHHW1uDfcoilQDne1BGlO/ZM+qQDyyrUkmLJYuucSeSCQn4Xwwy39rtk1QZJiwt5dArFboJIwvJXlZyzaS4GsJhZpSGgsfE2kmGwtaRg6WcRZCgKTVpPrvo1wxBTYwrBVfW4bMPH+wR0NG+Y63ZQ6pEoc4crlAyo3lcBU/qKrDbJ9WT2sGOO+zYxV502AsXe9lhL13s8w773MVeddgrxHYH4Gcd+JkHPurARx542IGH7pbfdNg3Lvasw5652Jcd9qWLfd1hX7vYvMPmiA3BGBK4QILSGDMJvKHKrNqv6+oPriq4LjDVW31h6LvzkkpG8wg6ihepQnbgUxdcWnRJdZ/icOJCsktkabjWKqriwiWorEOph7K0Q/G5Hqas0hmVKxm7QlhV9cnstAo4D7EezkGTnRkJpGy/OfyQpZ3IIUtT7giaNdkXd+JCebGkXCiFhwXOz6q7zdI915yQrkBZSDJ1XldVUObYLkzvr47qvtytggU2rMwed4OCRbUzq30tvZ2PNh+yMC2c8s4o65f1V6vbEbkCc4v03HC1WX3DYrk1v1DsFqvX1JDKHltcthY3hE2GqtIIzb9PP9L0gLfQPjNjUgrZbIcHwcsjE3k8VL3h7P1g9c29LUZ4pp7YfERmayQ/QMrxtAqw+1/jqmb5iiCrrscTDKv4/1DwJ6pHFERLiM5dLXZxqGyjNscidcFpCLxu2Za8bgXqwj8ACvQqcC0/Zqe62GDmWmijcZ3oBtt6KkbLqj2mranmeaygIpoLZVo9Zsyib8yQxkV6i5cQ9d1kRPp+WqsYcxSCGzzViW10VU94g682KjGA7RBNxWw8JMsv11Sfe1Z9MPMTiwNNUDHHETum7r6IKTB35lELD/EchzsE87L2161ccxPiuGhufVc6xKaK12181ZQ1Rv1sELew3tBGFPovMxvjjS2XwuXMHz+vq/lioOu5H/n50SOPduSX3n0krIRegjSjOW5q/iQ0Y3xVFTStA3xXwbnCggNBYpyKjofrttv11BgtQudCg2KvgdgMDZPqZGd26h/UjAxUg/Vmd1Q7SPgOwnamqUxt0K7Z4j7iu9hUzoHoJWsmXso5wXeqWLmjiQm3AVA0BE3NIwmiWGgSACYTF7lzMvsiNZBrXq3W9MWwuVmSf+YMomV/7sFn974BE/KOgeGPxji8x+GeFsgKvVrW1XJRzfxsLKNzE4waU9WtkVzkbZl0d61dcJVousLXEFUW5hjULYtCigLF1+3G26ftiV1HRNJYOywEsy2l6SfDPuKZ3JKwVtqoDFN++uFspCt2LcTYOHw/iEJ5+6aFKMaPdzMXH41ccAVCbT+56RnehI09xdB6lD/794pQ+O7KhD1jgWnsV5PFCuYq1xQdYoL8GLimQyucPhtj7dkAtQb9sW7D5PZtG+pe3N1JO8eIU+ul9tXFe+NSLM0o5SngtI20rzJIvT6M81Gz1XwRjGyTQt7ATzbBbaRaa57UozzcZcibr3lOJYq450iyFnRokWIqEvlfuvZ/M0h6ClcmpquuvwwKwG8nVHXRQZ/6BRR7nLi2cmNRPOT4mnMornF7WXKoZvsH2NPbP7X5XQGw24sE25sQiZWF/JJJkZsfgyq7WtvixkbCsBhM/w8T8tQAAanNS2PCOF9vFAQnD7Ls1JuBeD+3W080Ke5QkzbJk7M23R1NolhFVI9m3/AKjsZGUTw/JvSqk/pidGjXEkwxH4/eHBCn0E1p2LC1rsC/+YQq84SydS0PW1c5Os4xbdlM6JF5RWb4EvIa4hte68Xmq7/7+iod2V2P3mRUxWshm1CP6zGxgZSEcFTwN98rqUURi6u8u/Sei+KxWfBnLayWQtS9/nhmOh6mxuLOzs3vmWTzw4sH+7Pf7U+ffbTz6WH7o+fbk19P3p/cncwmv598Ovli8nRyPIkm307+NfnP5L/bf93+2/bft//RUN98o5X55WTw2f7n/wDtB6Jr</latexit>
1)i , v) z
<latexit sha1_base64="6RxmbiyokWop4hl5Ap0lGuJIxMU=">AAAVznicjVjdcuQ0Fm5YlmWzCzvA5oob16ZSC9QQusPsDFxQBZmhh60KkIHJ/FQ66ZLtY7eILHkkOUmPy7W3W8Ut+yz7LLwNR7LTtiU70FUzcev7zpHOr447zBlVejr95ZVX//DaH1//0xt/3vrLX99862+33n7niRKFjOA4EkzIZyFRwCiHY001g2e5BJKFDJ6G5/cN/vQCpKKCP9brHE4zknKa0IhoXHr0cnlrZ7r32ad39+/cDaZ70+m92f7MPOzfu/PJnWCGK+azM2k+R8u3//7/RSyiIgOuI0aUOplNc31aEqlpxKDaWhQKchKdkxRO8JGTDNRpaU9aBbu4EgeJkPiP68CudiVKkim1zkJkZkSvlIuZxSHspNDJp6cl5XmhgUf1RknBAi0CY3YQUwmRZmt8IJGkeNYgWhFJIo3O2drq7VPkqQQ4vw1FRNhte0Ydsp5lZSpJvqLRFe4UZEIC/heD5Fu7XZIqwoSmhex7pUQ3YUgguZ0VTFMpLvtwmBmloWBxYO0M+sLWkb0lTiNI0JQqqD+76NcMQR0YVwqmrMNnH+/fDkBHe461Zg+pEoU6OVyiZEZ4XC4eVuXCbp+UDysHO26xYxd70mJPXOxpiz11sa9a7CsXe9ZizxDb7YFftuCXHni/Be974EELHrhbPm+x5y72qMUeudi3Lfati33fYt+72LzF5oj1wRgSeIEEpTFmElhNlVm5V1Xl564quMox1Rt9Yei784JISngELcWLVC5b8MgFVxZdEd2lOJw4l/QCWRqutIrKOHcJKmtR4qE0bVF8rvopq3RG5FrGrhBWVXUyOy0XjIVYD+egg51ZsJCy+ebwQ5q2Igc0TZkjaNZkV9yJC2H5ijChFB4WGDsr36+XPnDNCckalIUkVedVWS4Kju3CNO3ysOrK3SiYY8PK7HFHFCzLnVnla+nsfDh+yNy0cMJao6xfNl+tbkfkEmi60h03XI6rr1mUW/NzRW+wekMNieywxUVjcU0YM1QVRmj+W/qRpnu8pfaZGZVSyHo7PAheHpngcV/1yNm7weqae1OM8EwdsfmAzNZAfoCUw2m1wO5/haua8nWArKoaTjCs4t+h4BuiBxREK4jOXS12sa9sVJtjkXrBSAisatiWvGkF6oV/ABToVOBGfshO9WLEzI3QqHGt6IhtHRWDZdUc09ZU/TxUUBHhQplWjxmz7BrTpzGR3uAlRH03GZGunzYqhhyF4IinWrFRV3WER3w1qsQAtkPUFTN6SMovNlSfe1Z+NPMTiwFJUDHD2Tgm7r6IKTB35mED93GOwx2CvKj8dStX34Q4Lppb35UOsanidRtf1mWNUT/rxS2sRtqIQv9lZmO8seVKuJz5g8dVOV/2dD32Iz8/vO/RDv3S+xAJa6FXIM1ojpuaPwnJKFuXOUmrBb5k4FxhwZ5gYJyKjoerptt11BgtQnOhQdGXENgMDZPyZGd26h/UjAxEg/Vme1Q7SPgOwnamiUxt0K7o8kPEd7GpnEOgV7SeeAljAb4MxcodTUy4DYCiIWhiHoNFFAsdLACTiQnunExTFkNPzq6UG/qy39wsyT9zBtGqO/fgs3vfgAl5y8DwR0Mc1uEwTwtkuV6vqnK1LGd+NhbRuQlGhanq1ggXvCmT9q61C64STdb4GqKK3ByDuGWRS5Gj+KbdePs0PbHtiEgaaoe5oLal1P2k30c8kxsS1koTlX7KTz+eDXTFtoUYG/vvB1Eob940F/nw8a7n4sOBCy5HqOkn1z3Dm7Cxpxhah/KDf68Ihe+uVNgz5pjGfjVZLKeuco3v/twE+QEwTfpWOH02xtqzAWoM+nfVhMnt2zbUnbi7kzbHiBPrpebVxXvjUjTNCGEp4LSNtO8ySL0+jPNRvdV8uRjYJgVeww/H4CZSjTUPq0Ee7tLnzTc8pxJF3HFksBF0aJGiKhL8x7b9Xw+SnsK1iem67S+9AvDbCVFtdNCnfgHFHieurNxQFA8YvuYciCvcXhYMytnev7CnN38q87sCYLcXCbY3IRIrC/yCSsHNj0GlXa1scWMjoVgMpv+HSXBkgEVQmZfGhDK22WixONnPslNvBmLd3G48Uae4Q02aJE/OmnR3NIl8HRE9mH39KzgaGkXx/JjQ61bq68GhXUswxXw8eHNAnEI7pWHD1roE/+YTquAJoZta7reuYnCco9qyqdAD84rM8CXkJcTXvMaL9Vd/981VOrC7HrzJiIo3QjahHlRDYj0pCeGg4D9/U1KLPBaXvL30Hov8gVnwZy2sllxUnf54Zjoepsby1s7175nB+MOT/b3Z3b3pozs7Xxw0P3q+MXlv8o/J+5PZ5N7ki8nXk6PJ8SSawOSnyc+T/20fbV9sV9v/qamvvtLIvDvpfbb/+yvTy4bz</latexit>
<latexit sha1_base64="6RxmbiyokWop4hl5Ap0lGuJIxMU=">AAAVznicjVjdcuQ0Fm5YlmWzCzvA5oob16ZSC9QQusPsDFxQBZmhh60KkIHJ/FQ66ZLtY7eILHkkOUmPy7W3W8Ut+yz7LLwNR7LTtiU70FUzcev7zpHOr447zBlVejr95ZVX//DaH1//0xt/3vrLX99862+33n7niRKFjOA4EkzIZyFRwCiHY001g2e5BJKFDJ6G5/cN/vQCpKKCP9brHE4zknKa0IhoXHr0cnlrZ7r32ad39+/cDaZ70+m92f7MPOzfu/PJnWCGK+azM2k+R8u3//7/RSyiIgOuI0aUOplNc31aEqlpxKDaWhQKchKdkxRO8JGTDNRpaU9aBbu4EgeJkPiP68CudiVKkim1zkJkZkSvlIuZxSHspNDJp6cl5XmhgUf1RknBAi0CY3YQUwmRZmt8IJGkeNYgWhFJIo3O2drq7VPkqQQ4vw1FRNhte0Ydsp5lZSpJvqLRFe4UZEIC/heD5Fu7XZIqwoSmhex7pUQ3YUgguZ0VTFMpLvtwmBmloWBxYO0M+sLWkb0lTiNI0JQqqD+76NcMQR0YVwqmrMNnH+/fDkBHe461Zg+pEoU6OVyiZEZ4XC4eVuXCbp+UDysHO26xYxd70mJPXOxpiz11sa9a7CsXe9ZizxDb7YFftuCXHni/Be974EELHrhbPm+x5y72qMUeudi3Lfati33fYt+72LzF5oj1wRgSeIEEpTFmElhNlVm5V1Xl564quMox1Rt9Yei784JISngELcWLVC5b8MgFVxZdEd2lOJw4l/QCWRqutIrKOHcJKmtR4qE0bVF8rvopq3RG5FrGrhBWVXUyOy0XjIVYD+egg51ZsJCy+ebwQ5q2Igc0TZkjaNZkV9yJC2H5ijChFB4WGDsr36+XPnDNCckalIUkVedVWS4Kju3CNO3ysOrK3SiYY8PK7HFHFCzLnVnla+nsfDh+yNy0cMJao6xfNl+tbkfkEmi60h03XI6rr1mUW/NzRW+wekMNieywxUVjcU0YM1QVRmj+W/qRpnu8pfaZGZVSyHo7PAheHpngcV/1yNm7weqae1OM8EwdsfmAzNZAfoCUw2m1wO5/haua8nWArKoaTjCs4t+h4BuiBxREK4jOXS12sa9sVJtjkXrBSAisatiWvGkF6oV/ABToVOBGfshO9WLEzI3QqHGt6IhtHRWDZdUc09ZU/TxUUBHhQplWjxmz7BrTpzGR3uAlRH03GZGunzYqhhyF4IinWrFRV3WER3w1qsQAtkPUFTN6SMovNlSfe1Z+NPMTiwFJUDHD2Tgm7r6IKTB35mED93GOwx2CvKj8dStX34Q4Lppb35UOsanidRtf1mWNUT/rxS2sRtqIQv9lZmO8seVKuJz5g8dVOV/2dD32Iz8/vO/RDv3S+xAJa6FXIM1ojpuaPwnJKFuXOUmrBb5k4FxhwZ5gYJyKjoerptt11BgtQnOhQdGXENgMDZPyZGd26h/UjAxEg/Vme1Q7SPgOwnamiUxt0K7o8kPEd7GpnEOgV7SeeAljAb4MxcodTUy4DYCiIWhiHoNFFAsdLACTiQnunExTFkNPzq6UG/qy39wsyT9zBtGqO/fgs3vfgAl5y8DwR0Mc1uEwTwtkuV6vqnK1LGd+NhbRuQlGhanq1ggXvCmT9q61C64STdb4GqKK3ByDuGWRS5Gj+KbdePs0PbHtiEgaaoe5oLal1P2k30c8kxsS1koTlX7KTz+eDXTFtoUYG/vvB1Eob940F/nw8a7n4sOBCy5HqOkn1z3Dm7Cxpxhah/KDf68Ihe+uVNgz5pjGfjVZLKeuco3v/twE+QEwTfpWOH02xtqzAWoM+nfVhMnt2zbUnbi7kzbHiBPrpebVxXvjUjTNCGEp4LSNtO8ySL0+jPNRvdV8uRjYJgVeww/H4CZSjTUPq0Ee7tLnzTc8pxJF3HFksBF0aJGiKhL8x7b9Xw+SnsK1iem67S+9AvDbCVFtdNCnfgHFHieurNxQFA8YvuYciCvcXhYMytnev7CnN38q87sCYLcXCbY3IRIrC/yCSsHNj0GlXa1scWMjoVgMpv+HSXBkgEVQmZfGhDK22WixONnPslNvBmLd3G48Uae4Q02aJE/OmnR3NIl8HRE9mH39KzgaGkXx/JjQ61bq68GhXUswxXw8eHNAnEI7pWHD1roE/+YTquAJoZta7reuYnCco9qyqdAD84rM8CXkJcTXvMaL9Vd/981VOrC7HrzJiIo3QjahHlRDYj0pCeGg4D9/U1KLPBaXvL30Hov8gVnwZy2sllxUnf54Zjoepsby1s7175nB+MOT/b3Z3b3pozs7Xxw0P3q+MXlv8o/J+5PZ5N7ki8nXk6PJ8SSawOSnyc+T/20fbV9sV9v/qamvvtLIvDvpfbb/+yvTy4bz</latexit>
<latexit sha1_base64="6RxmbiyokWop4hl5Ap0lGuJIxMU=">AAAVznicjVjdcuQ0Fm5YlmWzCzvA5oob16ZSC9QQusPsDFxQBZmhh60KkIHJ/FQ66ZLtY7eILHkkOUmPy7W3W8Ut+yz7LLwNR7LTtiU70FUzcev7zpHOr447zBlVejr95ZVX//DaH1//0xt/3vrLX99862+33n7niRKFjOA4EkzIZyFRwCiHY001g2e5BJKFDJ6G5/cN/vQCpKKCP9brHE4zknKa0IhoXHr0cnlrZ7r32ad39+/cDaZ70+m92f7MPOzfu/PJnWCGK+azM2k+R8u3//7/RSyiIgOuI0aUOplNc31aEqlpxKDaWhQKchKdkxRO8JGTDNRpaU9aBbu4EgeJkPiP68CudiVKkim1zkJkZkSvlIuZxSHspNDJp6cl5XmhgUf1RknBAi0CY3YQUwmRZmt8IJGkeNYgWhFJIo3O2drq7VPkqQQ4vw1FRNhte0Ydsp5lZSpJvqLRFe4UZEIC/heD5Fu7XZIqwoSmhex7pUQ3YUgguZ0VTFMpLvtwmBmloWBxYO0M+sLWkb0lTiNI0JQqqD+76NcMQR0YVwqmrMNnH+/fDkBHe461Zg+pEoU6OVyiZEZ4XC4eVuXCbp+UDysHO26xYxd70mJPXOxpiz11sa9a7CsXe9ZizxDb7YFftuCXHni/Be974EELHrhbPm+x5y72qMUeudi3Lfati33fYt+72LzF5oj1wRgSeIEEpTFmElhNlVm5V1Xl564quMox1Rt9Yei784JISngELcWLVC5b8MgFVxZdEd2lOJw4l/QCWRqutIrKOHcJKmtR4qE0bVF8rvopq3RG5FrGrhBWVXUyOy0XjIVYD+egg51ZsJCy+ebwQ5q2Igc0TZkjaNZkV9yJC2H5ijChFB4WGDsr36+XPnDNCckalIUkVedVWS4Kju3CNO3ysOrK3SiYY8PK7HFHFCzLnVnla+nsfDh+yNy0cMJao6xfNl+tbkfkEmi60h03XI6rr1mUW/NzRW+wekMNieywxUVjcU0YM1QVRmj+W/qRpnu8pfaZGZVSyHo7PAheHpngcV/1yNm7weqae1OM8EwdsfmAzNZAfoCUw2m1wO5/haua8nWArKoaTjCs4t+h4BuiBxREK4jOXS12sa9sVJtjkXrBSAisatiWvGkF6oV/ABToVOBGfshO9WLEzI3QqHGt6IhtHRWDZdUc09ZU/TxUUBHhQplWjxmz7BrTpzGR3uAlRH03GZGunzYqhhyF4IinWrFRV3WER3w1qsQAtkPUFTN6SMovNlSfe1Z+NPMTiwFJUDHD2Tgm7r6IKTB35mED93GOwx2CvKj8dStX34Q4Lppb35UOsanidRtf1mWNUT/rxS2sRtqIQv9lZmO8seVKuJz5g8dVOV/2dD32Iz8/vO/RDv3S+xAJa6FXIM1ojpuaPwnJKFuXOUmrBb5k4FxhwZ5gYJyKjoerptt11BgtQnOhQdGXENgMDZPyZGd26h/UjAxEg/Vme1Q7SPgOwnamiUxt0K7o8kPEd7GpnEOgV7SeeAljAb4MxcodTUy4DYCiIWhiHoNFFAsdLACTiQnunExTFkNPzq6UG/qy39wsyT9zBtGqO/fgs3vfgAl5y8DwR0Mc1uEwTwtkuV6vqnK1LGd+NhbRuQlGhanq1ggXvCmT9q61C64STdb4GqKK3ByDuGWRS5Gj+KbdePs0PbHtiEgaaoe5oLal1P2k30c8kxsS1koTlX7KTz+eDXTFtoUYG/vvB1Eob940F/nw8a7n4sOBCy5HqOkn1z3Dm7Cxpxhah/KDf68Ihe+uVNgz5pjGfjVZLKeuco3v/twE+QEwTfpWOH02xtqzAWoM+nfVhMnt2zbUnbi7kzbHiBPrpebVxXvjUjTNCGEp4LSNtO8ySL0+jPNRvdV8uRjYJgVeww/H4CZSjTUPq0Ee7tLnzTc8pxJF3HFksBF0aJGiKhL8x7b9Xw+SnsK1iem67S+9AvDbCVFtdNCnfgHFHieurNxQFA8YvuYciCvcXhYMytnev7CnN38q87sCYLcXCbY3IRIrC/yCSsHNj0GlXa1scWMjoVgMpv+HSXBkgEVQmZfGhDK22WixONnPslNvBmLd3G48Uae4Q02aJE/OmnR3NIl8HRE9mH39KzgaGkXx/JjQ61bq68GhXUswxXw8eHNAnEI7pWHD1roE/+YTquAJoZta7reuYnCco9qyqdAD84rM8CXkJcTXvMaL9Vd/981VOrC7HrzJiIo3QjahHlRDYj0pCeGg4D9/U1KLPBaXvL30Hov8gVnwZy2sllxUnf54Zjoepsby1s7175nB+MOT/b3Z3b3pozs7Xxw0P3q+MXlv8o/J+5PZ5N7ki8nXk6PJ8SSawOSnyc+T/20fbV9sV9v/qamvvtLIvDvpfbb/+yvTy4bz</latexit>
Figure 4: More examples of ensembles $I_{ti}(z)$ (in blue) for the $F$ in Figure 3. (a): $I_{ti}(z)$ is the union of two intervals with all candidate offsets non-negative. (b): it is a single interval with non-positive offsets. (c): at a discontinuity, if $z$ is smaller than the discontinuity, we have no direct solution for $I_{ti}(z)$ for at least one positioning of the edges, but a simple trick bypasses the difficulty (see text).
so that $\eta_t = \eta(w_t, h_t)$. Recall the weight update, $w_{ti} = -\delta_{v_{(t-1)i}} F(y_i H_{t-1}(x_i))$. We define a "partial" weight update,
$$\tilde{w}_{ti}(\alpha) \doteq -\delta_{v_{(t-1)i}} F(\alpha y_i h_t(x_i) + y_i H_{t-1}(x_i)) \qquad (17)$$
(if we were to replace $v_{(t-1)i}$ by $v_{ti}$ and let $\alpha = \alpha_t$, then $\tilde{w}_{ti}(\alpha)$ would be $w_{(t+1)i}$, hence the partial weight update). Algorithm 2 presents the simple procedure to find $\alpha_t$. Notice that we use $\tilde{w}$ with sole dependency on the prospective leveraging coefficient; for clarity, we omit the dependences on the current ensemble ($H_.$), weak classifier ($h_.$) and offsets ($v_{.i}$) needed to compute (17).
Theorem 5.8. Suppose Assumptions 5.1 and 5.4 hold and $F$ is continuous at all abscissae $\{\tilde{e}_{(t-1)i} \doteq y_i H_{t-1}(x_i), i \in [m]\}$. Then there are always solutions to Step 1 of Solve$_\alpha$ and if we let $\alpha_t \leftarrow$ Solve$_\alpha$($S$, $w_t$, $h_t$) and then compute
$$\overline{W}_{2,t} \doteq \left| \mathbb{E}_{i\sim[m]}\!\left[ \frac{h_t(x_i)^2}{M_t^2}\cdot \delta_{\{\alpha_t y_i h_t(x_i),\, v_{(t-1)i}\}} F(\tilde{e}_{(t-1)i}) \right] \right|,$$
then $\overline{W}_{2,t}$ satisfies (8) and $\alpha_t$ satisfies (6) for some $\varepsilon_t > 0$, $\pi_t \in (0,1)$.
The proof, in Section B.6, proceeds by reducing condition (9) to (16). The Weak Learning Assumption (5.4) is important for the denominator in the LHS of (16) to be non-zero. The continuity assumption at all abscissae is important to have $\lim_{a\to 0} \eta(\tilde{w}_t(a), h_t) = \eta_t$, which ensures the existence of solutions to (16), also easy to find, e.g. by a simple dichotomic search starting from an initial guess for $a$. Note the necessity of being continuous only at abscissae defined by the training sample, which is finite in size. Hence, if this condition is not satisfied but the discontinuities of $F$ are of Lebesgue measure 0, it is easy to add an infinitesimal constant to the current weak classifier, ensuring the conditions of Theorem 5.8 and keeping the boosting rates.
Figure 3 explains how to build $I_{ti}(z)$ graphically for a general $F$. While it is not hard to implement a general procedure following this blueprint (i.e. accepting the loss function as input), it would be far from achieving computational optimality: a much better choice consists in specializing it to the (set of) loss(es) at hand by hardcoding specific optimization features of the desired loss(es). This does not prevent "loss oddities" from admitting absolutely trivial oracles (see Appendix, Section B.7).
6 Discussion
For an efficient implementation, boosting requires specific design choices to make sure the weak learning assumption stands for as long as necessary; experimentally, it is thus a good idea to adapt the weak learner to build more complex models as iterations increase (e.g. learning deeper trees), keeping Assumption 5.4 valid with its advantage over random guessing parameter $\gamma > 0$. In our more general setting, our algorithm SecBoost pinpoints two more locations that can make use of specific design choices to keep the assumptions standing for a larger number of iterations.
The first is related to handling local minima. When Assumption 5.5 breaks, it means we are close to a local optimum of the loss. One possible way of escaping those local minima is to adapt the offset oracle to output larger offsets (Step 2.5) that get weights computed outside the domain of the local minimum. Such offsets can be used to inform the weak learner of the specific examples that then need to receive larger magnitude in classification, something we have already discussed in Section 5. There is also more: the sign of the weight indicates the polarity of the next edge ($e_{t.}$, (5)) needed to decrease the loss in the interval spanned by the last offset. To simplify, suppose a substantial fraction of examples have an edge $\tilde{e}_{t.}$ in the vicinity of the blue dotted line in Figure 2 (d), so that the loss value is indicated by the big arrow, and suppose their current offset is $v_{t-1}$, so that their (positive) weight signals that, to minimize the loss further, the weak learner's next weak classifier has to have a positive edge over these examples. Such is the polarity constraint, which essentially comes down to satisfying the WLA; but there is also a magnitude constraint that comes from the WRA: indeed, if the positive edge is too small, so that the loss ends up in the "bump" region, then there is a risk that the WRA breaks, because the loss around the bump is quite flat and the numerator of $\rho_t$ in Assumption 5.5 can be small. Passing the bump implies escaping the local minimum at which the loss would otherwise be trapped. Section 5.4 has presented a general blueprint for the offset oracle but more specific implementation designs can be used; some are discussed in the Appendix, Section B.7.
The second is related to handling losses that take on constant values over parts of their domain. To prevent early stopping in Step 2.7 of SecBoost, one needs $w_{t+1} \neq 0$. The update rule of $w_t$ imposes that the loss must then have non-zero variation for some examples between two successive edges (5). If the loss $F$ is constant, then the algorithm obviously stops without learning anything. If $F$ is piecewise constant, this constrains the design of the weak learner to make sure that some examples receive a different loss with the new model update $H_.$. As explained in Appendix, Section B.11, this can be efficiently addressed by specific designs of Solve$_\alpha$.
In the same way as there is no "one size fits all" weak learner for all domains in traditional boosting, we expect specific design choices to be instrumental in better handling specific losses in our more general setting. Our theory pinpoints two locations on which further work can focus.
7 Conclusion
Boosting has rapidly moved to an optimization setting involving first-order information about the
loss optimized, rejoining, in terms of information needed, that of the hugely popular (stochastic)
gradient descent. But this was not a formal requirement of the initial setting and in this paper, we
show that essentially any loss function can be boosted without this requirement. From this standpoint,
our results put boosting in a slightly more favorable light than recent developments on zeroth-order optimization since, to get boosting-compliant convergence, we do not need the loss to meet any of the assumptions that those analyses usually rely on. Of course, recent advances in zeroth-order optimization have also achieved substantial design tricks for the implementation of such algorithms, something that undoubtedly needs to be addressed in our case as well, such as for the efficient optimization of the offset oracle. We leave this as an open problem, but provide in the Appendix some toy experiments achieved by a straightforward implementation, hinting that SecBoost can indeed optimize very "exotic" losses.
References
[1] A. Akhavan, E. Chzhen, M. Pontil, and A.-B. Tsybakov. A gradient estimator via l1-
randomization for online zero-order optimization with two point feedback. In NeurIPS*35,
2022.
[2] A. Akhavan, M. Pontil, and A.-B. Tsybakov. Exploiting higher order smoothness in derivative-
free optimization and continuous bandits. In NeurIPS*33, 2020.
[3] A. Akhavan, M. Pontil, and A.-B. Tsybakov. Distributed zero-order optimisation under adver-
sarial noise. In NeurIPS*34, 2021.
[4] N. Alon, A. Gonen, E. Hazan, and S. Moran. Boosting simple learners. In STOC’21, 2021.
[5] S.-I. Amari and H. Nagaoka. Methods of Information Geometry. Oxford University Press, 2000.
[6] F. Bach. Learning Theory from First Principles. Course notes, MIT press (to appear), 2023.
[7] A. Banerjee, S. Merugu, I. Dhillon, and J. Ghosh. Clustering with bregman divergences. In
Proc. of the 4th SIAM International Conference on Data Mining, pages 234–245, 2004.
[8] P.-L. Bartlett and S. Mendelson. Rademacher and gaussian complexities: Risk bounds and
structural results. JMLR, 3:463–482, 2002.
[9] G. Biau, B. Cadre, and L. Rouvière. Accelerated gradient boosting. Mach. Learn., 108(6):971–
992, 2019.
[10] M. Blondel, A.-F. T. Martins, and V. Niculae. Learning with Fenchel-Young losses. J. Mach.
Learn. Res., 21:35:1–35:69, 2020.
[11] L. M. Bregman. The relaxation method of finding the common point of convex sets and its
application to the solution of problems in convex programming. USSR Comp. Math. and Math.
Phys., 7:200–217, 1967.
[12] S. Bubeck. Convex optimization: Algorithms and complexity. Found. Trends Mach. Learn.,
8(3-4):231–357, 2015.
[13] P.-S. Bullen. Handbook of means and their inequalities. Kluwer Academic Publishers, 2003.
[14] H. Cai, Y. Lou, D. McKenzie, and W. Yin. A zeroth-order block coordinate descent algorithm
for huge-scale black-box optimization. In 38th ICML, pages 1193–1203, 2021.
[15] C. Cartis and L. Roberts. Scalable subspace methods for derivative-free nonlinear least-squares
optimization. Math. Prog., 199:461–524, 2023.
[16] S. Cheamanunkul, E. Ettinger, and Y. Freund. Non-convex boosting overcomes random label
noise. CoRR, abs/1409.2905, 2014.
[17] L. Chen, J. Xu, and L. Luo. Faster gradient-free algorithms for nonsmooth nonconvex stochastic
optimization. In International Conference on Machine Learning, ICML 2023, 23-29 July 2023,
Honolulu, Hawaii, USA, volume 202 of Proceedings of Machine Learning Research, pages
5219–5233. PMLR, 2023.
[18] X. Chen, S. Liu, K. Xu, X. Li, X. Lin, M. Hong, and D. Cox. ZO-AdaMM: Zeroth-order
adaptive momentum method for black-box optimization. In NeurIPS*32, 2019.
[19] X. Chen, Y. Tang, and N. Li. Improve single-point zeroth-order optimization using high-pass
and low-pass filters. In 39th ICML, volume 162 of Proceedings of Machine Learning Research,
pages 3603–3620. PMLR, 2022.
[20] S. Cheng, G. Wu, and J. Zhu. On the convergence of prior-guided zeroth-order optimisation
algorithms. In NeurIPS*34, 2021.
[21] Z. Cranko and R. Nock. Boosted density estimation remastered. In 36th ICML, pages 1416–
1425, 2019.
[22] W. de Vazelhes, H. Zhang, H. Wu, X. Yuan, and B. Gu. Zeroth-order hard-thresholding:
Gradient error vs. expansivity. In NeurIPS*35, 2022.
[23] D. Dua and C. Graff. UCI machine learning repository, 2021.
[24] E. Fermi and N. Metropolis. Numerical solutions of a minimum problem. Technical Report TR
LA-1492, Los Alamos Scientific Laboratory of the University of California, 1952.
[25] L. Flokas, E.-V. Vlatakis-Gkaragkounis, and G. Piliouras. Efficiently avoiding saddle points
with zero order methods: No gradients required. In NeurIPS*32, 2019.
[26] H. Gao and H. Huang. Can stochastic zeroth-order frank-wolfe method converge faster for
non-convex problems? In 37th ICML, pages 3377–3386, 2020.
[27] A. Héliou, M. Martin, P. Mertikopoulos, and T. Rahier. Zeroth-order non-convex learning via
hierarchical dual averaging. In 38th ICML, pages 4192–4202, 2021.
[28] F. Huang, L. Tao, and S. Chen. Accelerated stochastic gradient-free and projection-free methods.
In 37th ICML, pages 4519–4530, 2020.
[29] B. Irwin, E. Haber, R. Gal, and A. Ziv. Neural network accelerated implicit filtering: Integrating
neural network surrogates with provably convergent derivative free optimization methods. In
40th ICML, volume 202 of Proceedings of Machine Learning Research, pages 14376–14389.
PMLR, 2023.
[30] V. Kac and P. Cheung. Quantum calculus. Springer, 2002.
[31] M. J. Kearns and U. V. Vazirani. An Introduction to Computational Learning Theory. M.I.T.
Press, 1994.
[32] M.J. Kearns. Thoughts on hypothesis boosting, 1988. ML class project.
[33] M.J. Kearns and Y. Mansour. On the boosting ability of top-down decision tree learning
algorithms. J. Comp. Syst. Sc., 58:109–128, 1999.
[34] J. Larson, M. Menickelly, and S.-M. Wild. Derivative-free optimization methods. Acta Numerica,
pages 287–404, 2019.
[35] Z. Li, P.-Y. Chen, S. Liu, S. Lu, and Y. Xu. Zeroth-order optimization for composite problems
with functional constraints. In AAAI’22, pages 7453–7461. AAAI Press, 2022.
[36] T. Lin, Z. Zheng, and M.-I. Jordan. Gradient-free methods for deterministic and stochastic
nonsmooth nonconvex optimization. In NeurIPS*35, 2022.
[37] P.-M. Long and R.-A. Servedio. Random classification noise defeats all convex potential
boosters. MLJ, 78(3):287–304, 2010.
[38] C. Maheshwari, C.-Y. Chiu, E. Mazumdar, S. Shankar Sastry, and L.-J. Ratliff. Zeroth-
order methods for convex-concave minmax problems: applications to decision-dependent risk
minimization. In 25th AISTATS, 2022.
[39] Y. Mansour, R. Nock, and R.-C. Williamson. Random classification noise does not defeat all
convex potential boosters irrespective of model choice. In 40th ICML, 2023.
[40] E. Mhanna and M. Assaad. Single point-based distributed zeroth-order optimization with a
non-convex stochastic objective function. In 40th ICML, volume 202 of Proceedings of Machine
Learning Research, pages 24701–24719. PMLR, 2023.
[41] M. Mohri, A. Rostamizadeh, and A. Talwalkar. Foundations of Machine Learning. MIT Press,
2018.
[42] Y. Nesterov and V. Spokoiny. Random gradient-free optimization of convex functions. Founda-
tions of Computational Mathematics, 17:527–566, 2017.
[43] F. Nielsen and R. Nock. The Bregman chord divergence. In Geometric Science of Information -
4th International Conference, 2019, pages 299–308, 2019.
[44] R. Nock and A. K. Menon. Supervised learning: No loss no cry. In 37th ICML, 2020.
[45] R. Nock and R.-C. Williamson. Lossless or quantized boosting with integer arithmetic. In 36th
ICML, pages 4829–4838, 2019.
[46] M.-E. Pfetsch and S. Pokutta. IPBoost - non-convex boosting via integer programming.
In 37th ICML, volume 119, pages 7663–7672, 2020.
[47] Y. Qiu, U.-V. Shanbhag, and F. Yousefian. Zeroth-order methods for nondifferentiable, noncon-
vex and hierarchical federated optimization. In NeurIPS*36, 2023.
[48] M. Rando, C. Molinari, L. Rosasco, and S. Villa. Structured zeroth-order for non-smooth
optimization. In NeurIPS*36, 2023.
[49] M.-D. Reid and R.-C. Williamson. Information, divergence and risk for binary experiments.
JMLR, 12:731–817, 2011.
[50] Z. Ren, Y. Tang, and N. Li. Escaping saddle points in zeroth-order optimization: the power of
two-point estimators. In 40th ICML, volume 202 of Proceedings of Machine Learning Research,
pages 28914–28975. PMLR, 2023.
[51] A.-K. Sahu, M. Zaheer, and S. Kar. Towards gradient free and projection free stochastic
optimization. In 22nd AISTATS, pages 3468–3477, 2019.
[52] W. Shi, H. Gao, and B. Gu. Gradient-free method for heavily constrained nonconvex opti-
mization. In 39th ICML, volume 162 of Proceedings of Machine Learning Research, pages
19935–19955. PMLR, 2022.
[53] M.-K. Warmuth and S. V. N. Vishwanathan. Tutorial: Survey of boosting from an optimization
perspective. In 26th ICML, 2009.
[54] T. Werner and P. Ruckdeschel. The column measure and gradient-free gradient boosting, 2019.
[55] H. Zhang and B. Gu. Faster gradient-free methods for escaping saddle points. In ICLR’23.
OpenReview.net, 2023.
[56] H. Zhang, H. Xiong, and B. Gu. Zeroth-order negative curvature finding: Escaping saddle
points without gradients. In NeurIPS*35, 2022.
Appendix
To differentiate from the numbering in the main file, the numbering of Theorems, etc. is letter-based
(A, B, ...).
Table of contents
A quick summary of recent zeroth-order optimization approaches Pg 15
↪ Helper results Pg 15
↪ Removing the ≠ 0 part in Assumption 5.1 Pg 16
↪ Proof of Lemma 5.2 Pg 16
↪ Proof of Theorem 5.3 Pg 17
↪ Proof of Lemma 5.7 Pg 20
↪ Proof of Theorem 5.8 Pg 20
↪ Implementation of the offset oracle Pg 21
↪ Proof of Lemma B.5 Pg 22
↪ Handling discontinuities in the offset oracle to prevent stopping in Step 2.5 of SecBoost Pg 24
↪ A boosting pattern that can "survive" above differentiability Pg 24
↪ The case of piecewise constant losses for Solve$_\alpha$ Pg 26
                 F                                        ∇F            main
reference    conv.   diff.   Lip.   smooth   Lb       diff.        ML topic
[2] ✓ ✓ ✓ ✓ online ML
[3] ✓ ✓ distributed ML
[1] ✓ ✓ online ML
[14] ✓ ✓ ✓ ✓ alt. GD
[15] ✓ ✓ ✓ alt. GD
[18] ✓ ✓ alt. GD
[17] ✓ ✓ alt. GD
[19] ✓ ✓ ✓ ✓ alt. GD
[20] ✓ ✓ ✓ alt. GD
[25] ✓ ✓ ✓ saddle pt opt
[26] ✓ ✓ alt. FW
[28] ✓ ✓ alt. FW
[22] ✓ ✓ alt. GD
[27] ✓ online ML
[29] ✓ ✓ deep ML
[35] ✓ ✓ ✓ alt. GD
[36] ✓ saddle pt opt
[38] ✓ ✓ ✓ ✓ saddle pt opt
[40] ✓ ✓ ✓ distributed ML
[48] ✓ ✓ alt. GD
[47] ✓ ✓ ✓ federated ML
[50] ✓ ✓ ✓ ✓ saddle pt opt
[51] ✓ ✓ alt. FW
[52] ✓ ✓ ✓ alt. GD
[55] ✓ ✓ ✓ ✓ saddle pt opt
[56] ✓ ✓ ✓ ✓ saddle pt opt
Table 1: Summary of formal assumptions about loss F used to prove algorithms’ convergence in
recent papers on zeroth order optimization, in different ML settings (see text for details). We use
"smoothness" as a portmanteau for various conditions on the ě 1 order differentiability condition of
F . "conv." = convex, "diff." = differentiable, "Lip." = Lipschitz, "Lb" = lower-bounded, "alt. GD" =
general alternative to gradient descent (stochastic or not), "alt. FW" = idem for Frank-Wolfe. Our
paper relies on no such assumptions.
Table 1 summarizes a few dozen recent approaches that can be related to zeroth-order optimization
in various topics of ML. Note that none of these approaches focuses on boosting.
We now show that the order of the elements of $V$ does not matter to compute the V-derivative as in Definition 4.2. For any $\sigma \in \{0,1\}^n$, we let $1_\sigma \doteq \sum_i \sigma_i$.

Lemma B.1. For any $z \in \mathbb{R}$, any $n \in \mathbb{N}^*$ and any $V \doteq \{v_1, v_2, ..., v_n\} \subset \mathbb{R}$,
$$\delta_{V} F(z) = \frac{\sum_{\sigma \in \{0,1\}^n} (-1)^{n - 1_\sigma}\, F\!\left(z + \sum_{i=1}^n \sigma_i v_i\right)}{\prod_{i=1}^n v_i}. \qquad (18)$$
Proof. We show the result by induction on the size of $V$, first noting that
$$\delta_{\{v_1\}} F(z) \doteq \delta_{v_1} F(z) = \frac{F(z+v_1) - F(z)}{v_1} = \frac{1}{\prod_{i=1}^{1} v_i}\cdot \sum_{\sigma \in \{0,1\}} (-1)^{1 - 1_\sigma}\, F(z + \sigma v_1). \qquad (19)$$
We then assume that (18) holds for $V_n \doteq \{v_1, v_2, ..., v_n\}$ and show the result for $V_{n+1} \doteq V_n \cup \{v_{n+1}\}$, writing (induction hypothesis used in the second identity):
$$\begin{aligned}
\delta_{V_{n+1}} F(z) &\doteq \frac{\delta_{V_n} F(z + v_{n+1}) - \delta_{V_n} F(z)}{v_{n+1}}\\
&= \frac{\sum_{\sigma \in \{0,1\}^n} (-1)^{n-1_\sigma} F\!\left(z + \sum_{i=1}^n \sigma_i v_i + v_{n+1}\right) - \sum_{\sigma \in \{0,1\}^n} (-1)^{n-1_\sigma} F\!\left(z + \sum_{i=1}^n \sigma_i v_i\right)}{\prod_{i=1}^{n+1} v_i}\\
&= \frac{\sum_{\sigma \in \{0,1\}^n} (-1)^{n-1_\sigma} F\!\left(z + \sum_{i=1}^n \sigma_i v_i + v_{n+1}\right) + \sum_{\sigma \in \{0,1\}^n} (-1)^{n-1_\sigma+1} F\!\left(z + \sum_{i=1}^n \sigma_i v_i\right)}{\prod_{i=1}^{n+1} v_i}\\
&= \frac{\sum_{\sigma' \in \{0,1\}^{n+1}: \sigma'_{n+1}=1} (-1)^{n-(1_{\sigma'}-1)} F\!\left(z + \sum_{i=1}^{n+1}\sigma'_i v_i\right) + \sum_{\sigma' \in \{0,1\}^{n+1}: \sigma'_{n+1}=0} (-1)^{n+1-1_{\sigma'}} F\!\left(z + \sum_{i=1}^{n+1}\sigma'_i v_i\right)}{\prod_{i=1}^{n+1} v_i}\\
&= \frac{\sum_{\sigma' \in \{0,1\}^{n+1}} (-1)^{n+1-1_{\sigma'}}\, F\!\left(z + \sum_{i=1}^{n+1}\sigma'_i v_i\right)}{\prod_{i=1}^{n+1} v_i}, \qquad (20)
\end{aligned}$$
as claimed.
We also have the following simple Lemma, which is a direct consequence of Lemma B.1.

Lemma B.2. For all $z \in \mathbb{R}$, $v, z' \in \mathbb{R}^*$, we have
$$\delta_v F(z + z') = \delta_v F(z) + z' \cdot \delta_{\{z', v\}} F(z). \qquad (21)$$
Proof. It comes from Lemma B.1 that $\delta_{\{z',v\}} F(z) = \delta_{\{v,z'\}} F(z) = (\delta_v F(z+z') - \delta_v F(z))/z'$ (and we reorder terms).
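To make the V-derivative concrete, here is a minimal sketch (in Python; the function names are ours and the snippet is not part of the paper) that computes $\delta_V F(z)$ both through the recursive definition (Definition 4.2) and through the closed form (18), and checks that the two coincide and that the order of the elements of $V$ does not matter:

```python
# Sketch (assumed interfaces): V-derivative of F at z for a set of nonzero offsets V.
from itertools import product
from math import prod, log, exp

def v_derivative_recursive(F, z, V):
    """delta_V F(z) via the recursive definition: with V = V' + [v],
    delta_V F(z) = (delta_{V'} F(z + v) - delta_{V'} F(z)) / v."""
    if not V:
        return F(z)
    *rest, v = V
    return (v_derivative_recursive(F, z + v, rest)
            - v_derivative_recursive(F, z, rest)) / v

def v_derivative_closed_form(F, z, V):
    """delta_V F(z) via (18): alternating sum over all {0,1}^n selections of offsets."""
    n = len(V)
    num = sum(
        (-1) ** (n - sum(sigma)) * F(z + sum(s * v for s, v in zip(sigma, V)))
        for sigma in product((0, 1), repeat=n)
    )
    return num / prod(V)

if __name__ == "__main__":
    F = lambda t: log(1.0 + exp(-t))     # logistic loss, as a toy example
    z, V = 0.3, [0.5, -0.2, 1.1]          # nonzero offsets, arbitrary order
    a = v_derivative_recursive(F, z, V)
    b = v_derivative_closed_form(F, z, list(reversed(V)))
    assert abs(a - b) < 1e-10             # order of the elements of V does not matter
    print(a, b)
```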
Because everything needs to be encoded, finiteness is not really an assumption. However, the non-zero assumption may be seen as limiting (unless we are happy to use first-order information about the loss, Section 5). There is a simple trick to remove it. Suppose $h_t$ zeroes on some training examples. The training sample being finite, there exists an open neighborhood $I$ of 0 such that $h'_t \doteq h_t + \delta$ does not zero anymore on training examples, for any $\delta \in I$. This changes the advantage $\gamma$ in the WLA (Definition 5.4) to some $\gamma'$ satisfying (we assume $\delta > 0$ wlog)
$$\gamma' \geq \frac{\gamma M_t}{M_t + \delta} - \frac{\delta}{M_t + \delta} \geq \gamma - \frac{\delta}{M_t}\cdot(1+\gamma),$$
from which it is enough to pick $\delta \leq \varepsilon \gamma M_t/(1+\gamma)$ to guarantee advantage $\gamma' \geq (1-\varepsilon)\gamma$. If $\varepsilon$ is a constant, this translates into a number of boosting iterations in Corollary 5.6 affected by a constant factor that we can choose as close to 1 as desired.
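As a quick numeric sanity check of the bound above (a sketch with toy values of our own choosing, not from the paper), one can verify that picking $\delta = \varepsilon\gamma M_t/(1+\gamma)$ indeed preserves an advantage of at least $(1-\varepsilon)\gamma$:

```python
# Sketch: the perturbed-advantage lower bound stays above (1 - eps) * gamma
# when delta = eps * gamma * M_t / (1 + gamma).
def perturbed_advantage_lower_bound(gamma, M_t, delta):
    return gamma * M_t / (M_t + delta) - delta / (M_t + delta)

gamma, M_t, eps = 0.1, 2.0, 0.05          # toy values (our assumption)
delta = eps * gamma * M_t / (1.0 + gamma)
assert perturbed_advantage_lower_bound(gamma, M_t, delta) >= (1.0 - eps) * gamma
```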
We reformulate
$$\delta_{\{b,c\}} F(a) = \frac{2}{b}\cdot\frac{1}{c}\cdot\Bigg( \underbrace{\frac{F(a+b+c) + F(a)}{2}}_{\doteq \mu_2} - \underbrace{\frac{F(a+b) + F(a+c)}{2}}_{\doteq \mu_1} \Bigg). \qquad (22)$$
Figure 5: Left: representation of the difference of averages in (22). Each of the secants $(\Delta_1)$ and $(\Delta_2)$ can take either the red or black segment. Which one is which depends on the signs of $c$ and $b$, but the general configuration is always the same. Note that if $F$ is convex, one necessarily sits above the other, which is the crux of the proof of Lemma 5.2. For the sake of illustration, suppose we can analytically have $b, c \to 0$. As $c$ converges to 0 but $b$ remains $> 0$, $\delta_{\{b,c\}} F(a)$ becomes proportional to the variation of the average secant midpoint; the then-convergence of $b$ to 0 makes $\delta_{\{b,c\}} F(a)$ converge to the second-order derivative of $F$ at $a$. Right: in the special case where $F$ is convex, one of the secants always sits above the other.
Both $\mu_1$ and $\mu_2$ are averages that can be computed from the midpoints of two secants (respectively):
$$(\Delta_1) \doteq [(a+c, F(a+c)),\, (a+b, F(a+b))], \qquad (\Delta_2) \doteq [(a, F(a)),\, (a+b+c, F(a+b+c))].$$
Also, the midpoints of both secants have the same abscissa (and the ordinates are $\mu_1$ and $\mu_2$), so to study the sign of $\delta_{\{b,c\}} F(a)$, we can study the position of both secants with respect to each other. $F$ being convex, we show that the abscissae of one secant are included in the abscissae of the other, this being sufficient to give the position of both secants with respect to each other. We distinguish four cases.
and then using $\alpha_t \leftarrow a_t\eta_t$. We now use Lemma 4.7 (main file) and get
$$\mathbb{E}_{i\sim[m]}\!\left[S_{F|v_{ti}}(\tilde e_{ti}\,\|\,\tilde e_{(t+1)i})\right] \geq -\mathbb{E}_{i\sim D}\!\left[Q^*_{(t+1)i}\right], \quad \forall t\geq 0. \qquad (28)$$
$$\begin{aligned}
\mathbb{E}_{i\sim[m]}\!\left[F(\tilde e_{(t+1)i})\right]
&\leq \mathbb{E}_{i\sim[m]}\!\left[F(\tilde e_{ti})\right] - \mathbb{E}_{i\sim[m]}\!\left[(\tilde e_{ti} - \tilde e_{(t+1)i})\cdot\delta_{v_{ti}}F(\tilde e_{(t+1)i})\right] + \mathbb{E}_{i\sim[m]}\!\left[Q^*_{(t+1)i}\right]\\
&= \mathbb{E}_{i\sim[m]}\!\left[F(\tilde e_{ti})\right] - \mathbb{E}_{i\sim[m]}\!\left[-e_{(t+1)i}\cdot\delta_{v_{ti}}F(\tilde e_{(t+1)i})\right] + \mathbb{E}_{i\sim[m]}\!\left[Q^*_{(t+1)i}\right] \qquad (29)\\
&= \mathbb{E}_{i\sim[m]}\!\left[F(\tilde e_{ti})\right] + \alpha_{t+1}\cdot\mathbb{E}_{i\sim[m]}\!\left[y_i h_{t+1}(x_i)\cdot\delta_{v_{ti}}F(\tilde e_{(t+1)i})\right] + \mathbb{E}_{i\sim[m]}\!\left[Q^*_{(t+1)i}\right] \qquad (30)\\
&= \mathbb{E}_{i\sim[m]}\!\left[F(\tilde e_{ti})\right] + a_{t+1}\eta_{t+1}\cdot\mathbb{E}_{i\sim[m]}\!\left[y_i h_{t+1}(x_i)\cdot\delta_{v_{ti}}F(\tilde e_{ti})\right]\\
&\quad + a_{t+1}\eta_{t+1}\cdot\mathbb{E}_{i\sim[m]}\!\left[y_i h_{t+1}(x_i)\cdot\Delta_{(t+1)i}\right] + \mathbb{E}_{i\sim[m]}\!\left[Q^*_{(t+1)i}\right] \qquad (31)\\
&= \mathbb{E}_{i\sim[m]}\!\left[F(\tilde e_{ti})\right] - a_{t+1}\eta_{t+1}\cdot\underbrace{\mathbb{E}_{i\sim[m]}\!\left[w_{(t+1)i}\, y_i h_{t+1}(x_i)\right]}_{=\eta_{t+1}} + a_{t+1}\eta_{t+1}\cdot\mathbb{E}_{i\sim[m]}\!\left[y_i h_{t+1}(x_i)\cdot\Delta_{(t+1)i}\right]\\
&\quad + \mathbb{E}_{i\sim[m]}\!\left[Q^*_{(t+1)i}\right]\\
&= \mathbb{E}_{i\sim[m]}\!\left[F(\tilde e_{ti})\right] - a_{t+1}\eta^2_{t+1} + a_{t+1}\eta_{t+1}\cdot\mathbb{E}_{i\sim[m]}\!\left[y_i h_{t+1}(x_i)\cdot\Delta_{(t+1)i}\right] + \mathbb{E}_{i\sim[m]}\!\left[Q^*_{(t+1)i}\right]. \qquad (32)
\end{aligned}$$
(29) – (31) make use of definitions (24) (twice) and (26), as well as the decomposition of the leveraging coefficient in (27).
Looking at (32), we see that we can have a boosting-compliant decrease of the loss if the two quantities depending on $\Delta_{(t+1).}$ and $Q^*_{(t+1).}$ can be made small enough compared to $a_{t+1}\eta^2_{t+1}$. This is what we investigate.
Bounding the term depending on $\Delta_{(t+1).}$ – We use Lemma B.2 with $z \doteq \tilde e_{ti}$, $z' \doteq e_{(t+1)i}$, $v \doteq v_{ti}$, which yields (also using (24) and the assumption that $h_{t+1}(x_i) \neq 0$):
$$\begin{aligned}
\Delta_{(t+1)i} &\doteq \delta_{v_{ti}}F(\tilde e_{(t+1)i}) - \delta_{v_{ti}}F(\tilde e_{ti})\\
&= \delta_{v_{ti}}F(\tilde e_{ti} + e_{(t+1)i}) - \delta_{v_{ti}}F(\tilde e_{ti})\\
&= e_{(t+1)i}\cdot\delta_{\{e_{(t+1)i},\, v_{ti}\}}F(\tilde e_{ti})\\
&= y_i\cdot\alpha_{t+1}h_{t+1}(x_i)\cdot\delta_{\{e_{(t+1)i},\, v_{ti}\}}F(\tilde e_{ti}), \qquad (33)
\end{aligned}$$
and so we get:
$$a_{t+1}\eta_{t+1}\cdot\mathbb{E}_{i\sim[m]}\!\left[y_i h_{t+1}(x_i)\cdot\Delta_{(t+1)i}\right] \leq a^2_{t+1}\eta^2_{t+1}M^2_{t+1}\cdot\overline W_{2,t+1}. \qquad (34)$$
Bounding the term depending on $Q^*_{.(t+1)}$ – We immediately get from the value picked in argument of $I_{t+1}$ in Step 2.5 of SecBoost, the definition of $I_{ti}(.)$ in (10) and our decomposition $\alpha_t \leftarrow a_t\eta_t$ that $Q^*_{(t+1)i} \leq \varepsilon_{t+1}\cdot a^2_{t+1}\eta^2_{t+1}M^2_{t+1}\cdot\overline W_{2,t+1}$, $\forall i \in [m]$, so that:
$$\mathbb{E}_{i\sim[m]}\!\left[Q^*_{(t+1)i}\right] \leq \varepsilon_{t+1}\cdot a^2_{t+1}\eta^2_{t+1}M^2_{t+1}\cdot\overline W_{2,t+1}. \qquad (35)$$
Finishing up with the proof – Suppose that we choose $\varepsilon_{t+1} > 0$, $\pi_{t+1}\in(0,1)$ and $a_{t+1}$ as in (27). We then get from (32), (34), (35) that for any choice of $v_{ti}$ in Step 2.5 of SecBoost,
$$\begin{aligned}
\mathbb{E}_{i\sim[m]}\!\left[F(\tilde e_{(t+1)i})\right] &\leq \mathbb{E}_{i\sim[m]}\!\left[F(\tilde e_{ti})\right] - a_{t+1}\eta^2_{t+1} + a^2_{t+1}\eta^2_{t+1}M^2_{t+1}\cdot\overline W_{2,t+1} + \varepsilon_{t+1}\cdot a^2_{t+1}\eta^2_{t+1}M^2_{t+1}\cdot\overline W_{2,t+1}\\
&= \mathbb{E}_{i\sim[m]}\!\left[F(\tilde e_{ti})\right] - a_{t+1}\eta^2_{t+1}\cdot\left(1 - a_{t+1}(1+\varepsilon_{t+1})M^2_{t+1}\cdot\overline W_{2,t+1}\right)\\
&\leq \mathbb{E}_{i\sim[m]}\!\left[F(\tilde e_{ti})\right] - \frac{\eta^2_{t+1}(1-\pi^2_{t+1})}{4(1+\varepsilon_{t+1})M^2_{t+1}\cdot\overline W_{2,t+1}}, \qquad (36)
\end{aligned}$$
where the last inequality is a consequence of (27). Suppose we pick $H_0 \doteq h_0 \in \mathbb{R}$ a constant and $v_0 > 0$ such that
$$\delta_{v_0}F(h_0) \neq 0. \qquad (37)$$
The final classifier $H_T$ of SecBoost satisfies:
$$\mathbb{E}_{i\sim[m]}\!\left[F(y_i H_T(x_i))\right] \leq F_0 - \frac{1}{4}\cdot\sum_{t=1}^T \frac{\eta_t^2(1-\pi_t^2)}{(1+\varepsilon_t)M_t^2\,\overline W_{2,t}}, \qquad (38)$$
with $F_0 \doteq \mathbb{E}_{i\sim[m]}[F(\tilde e_{i0})] = \mathbb{E}_{i\sim[m]}[F(y_i H_0)] = \mathbb{E}_{i\sim[m]}[F(y_i h_0)]$. If we want $\mathbb{E}_{i\sim[m]}[F(y_i H_T(x_i))] \leq F(z^*)$, assuming wlog $F(z^*) \leq F_0$, then it suffices to iterate until:
$$\sum_{t=1}^T \frac{1-\pi_t^2}{\overline W_{2,t}(1+\varepsilon_t)}\cdot\frac{\eta_t^2}{M_t^2} \geq 4(F_0 - F(z^*)). \qquad (39)$$
Recall that the edge $\eta_t$ is not normalized. We have defined a normalized edge,
$$[-1,1] \ni \tilde\eta_t \doteq \sum_i \frac{|w_{ti}|}{W_t}\cdot\tilde y_{ti}\cdot\frac{h_t(x_i)}{M_t}, \qquad (40)$$
with $\tilde y_{ti} \doteq y_i\cdot\mathrm{sign}(w_{ti})$ and $W_t \doteq \sum_i |w_{ti}| = \sum_i |\delta_{v_{(t-1)i}}F(\tilde e_{(t-1)i})|$. We have the simple relationship between $\eta_t$ and $\tilde\eta_t$:
$$\tilde\eta_t = \sum_i \frac{|w_{ti}|}{W_t}\cdot(y_i\cdot\mathrm{sign}(w_{ti}))\cdot\frac{h_t(x_i)}{M_t} = \frac{1}{W_t M_t}\cdot\sum_i w_{ti}y_i h_t(x_i) = \frac{m}{W_t M_t}\cdot\eta_t, \qquad (41)$$
resulting in ($\forall t\geq 1$),
$$\frac{\eta_t^2}{M_t^2} = \tilde\eta_t^2\cdot\left(\frac{W_t}{m}\right)^2 = \tilde\eta_t^2\cdot\left(\mathbb{E}_{i\sim[m]}\!\left[|\delta_{v_{(t-1)i}}F(\tilde e_{(t-1)i})|\right]\right)^2 \geq \tilde\eta_t^2\cdot\left(\left|\mathbb{E}_{i\sim[m]}\!\left[\delta_{v_{(t-1)i}}F(\tilde e_{(t-1)i})\right]\right|\right)^2 = \tilde\eta_t^2\cdot\overline W_{1,t}^2, \qquad (42)$$
recalling $\overline W_{1,t} \doteq \left|\mathbb{E}_{i\sim D}\!\left[\delta_{v_{(t-1)i}}F(\tilde e_{(t-1)i})\right]\right|$. It comes from (42) that a sufficient condition for (39) to hold is:
$$\sum_{t=1}^T \frac{\overline W_{1,t}^2(1-\pi_t^2)}{\overline W_{2,t}(1+\varepsilon_t)}\cdot\tilde\eta_t^2 \geq 4(F_0 - F(z^*)), \qquad (43)$$
B.5 Proof of Lemma 5.7
which allows us to fix $\overline W_{2,t} = 2\beta$ and completes the proof of Lemma 5.7.

Remark B.3. Our result is optimal in the sense that if we make one offset (say $b$) go to zero, then the ratio in (46) goes to zero and we recover the condition on the v-derivative of the derivative, $|\delta_c F'(z)| \leq \beta$.
(The last identity uses the fact that $y_i \in \{-1,1\}$.) Remark that we have extracted $\alpha_t$ from the denominator but it is still present in the arguments $\tilde e_{ti}$. For any classifier $h$, we introduce the notation
$$\eta(w, h) \doteq \mathbb{E}_{i\sim[m]}\!\left[w_i y_i h(x_i)\right],$$
and so $\eta_t$ (Step 2.2 in SecBoost) is also $\eta(w_t, h_t)$, which is guaranteed to be non-zero by the Weak Learning Assumption (5.4). We want, for some $\varepsilon_t > 0$, $\pi_t \in [0,1)$,
$$\alpha_t \in \frac{\eta_t}{2(1+\varepsilon_t)M_t^2\overline W_{2,t}}\cdot[1-\pi_t, 1+\pi_t]. \qquad (49)$$
This says that the sign of $\alpha_t$ is the same as the sign of $\eta(w_t, h_t) = \eta_t$. Since we know its sign, let us look for its absolute value:
$$|\alpha_t| \in \frac{|\eta_t|}{2(1+\varepsilon_t)M_t^2\overline W_{2,t}}\cdot[1-\pi_t, 1+\pi_t]. \qquad (50)$$
From (9) (main file), we can in fact search $\alpha_t$ in the union of all such intervals for $\varepsilon_t > 0$, $\pi_t \in [0,1)$, which amounts to finding first:
$$|\alpha_t| \in \left(0, \frac{|\eta_t|}{M_t^2\overline W_{2,t}}\right),$$
and then finding any $\varepsilon_t > 0$, $\pi_t \in [0,1)$ such that (50) holds. Using (48) and simplifying the external dependency on $\alpha_t$, we then need
$$1 \in \left(0, \frac{|\eta_t|}{\underbrace{\left|\mathbb{E}_{i\sim[m]}\!\left[y_i h_t(x_i)\cdot\left(\delta_{v_{(t-1)i}}F(\alpha_t y_i h_t(x_i) + \tilde e_{(t-1)i}) - \delta_{v_{(t-1)i}}F(\tilde e_{(t-1)i})\right)\right]\right|}_{\doteq B(\alpha_t)}}\right), \qquad (51)$$
under the constraint that the sign of $\alpha_t$ be the same as that of $\eta_t$. But, using notation (17) (main file), we have
$$B(\alpha_t) = |\eta(w_t, h_t) - \eta(\tilde w_t(\alpha_t), h_t)|,$$
and so to get (51) satisfied, it is sufficient that
$$\frac{|\eta_t - \eta(\tilde w_t(\alpha_t), h_t)|}{|\eta_t|} < 1, \qquad (52)$$
which is Step 1 in Solve$_\alpha$. The Weak Learning Assumption (5.4) guarantees that the denominator is $\neq 0$, so this can always be evaluated. The continuity of $F$ at all $\tilde e_{(t-1)i}$ guarantees $\lim_{\alpha_t\to 0}\eta(\tilde w_t(\alpha_t), h_t) = \eta_t$, and thus guarantees the existence of solutions to (52) for some $|\alpha_t| > 0$. To summarize, finding $\alpha_t$ can be done in two steps: (i) solve
$$\frac{|\eta_t - \eta(\tilde w_t(\mathrm{sign}(\eta_t)\cdot a), h_t)|}{|\eta_t|} < 1$$
for some $a > 0$ and (ii) let $\alpha_t \doteq \mathrm{sign}(\eta_t)\cdot a$. This is the output of Solve$_\alpha$($S$, $w_t$, $h_t$), which ends the proof of Theorem 5.8.
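For illustration, here is a minimal sketch (in Python; the interfaces and the halving schedule are our own assumptions, not the paper's implementation) of the two-step procedure above, using only zeroth-order access to $F$:

```python
# Sketch of Solve_alpha's Step 1: shrink a candidate magnitude a until the
# relative edge variation in (52) drops below 1, then attach the sign of eta_t.
import numpy as np

def eta(w, y, h_x):
    """eta(w, h) = E_{i~[m]}[w_i y_i h(x_i)], with h precomputed on the sample."""
    return np.mean(w * y * h_x)

def solve_alpha(F, v_prev, e_prev, y, h_x, a_init=1.0, max_halvings=100):
    """Return alpha_t with sign(alpha_t) = sign(eta_t), satisfying (52).

    F      : loss, called pointwise on arrays (zeroth-order access only)
    v_prev : offsets v_{(t-1)i} (nonzero);  e_prev : edges y_i H_{t-1}(x_i)
    y, h_x : labels and weak-classifier outputs h_t(x_i) (assumed nonzero)
    """
    delta_v = lambda e: (F(e + v_prev) - F(e)) / v_prev   # delta_{v} F(e), elementwise
    w_t = -delta_v(e_prev)                                 # current weights
    eta_t = eta(w_t, y, h_x)                               # nonzero under the WLA
    a = a_init
    for _ in range(max_halvings):
        w_tilde = -delta_v(np.sign(eta_t) * a * y * h_x + e_prev)  # partial update (17)
        if abs(eta_t - eta(w_tilde, y, h_x)) < abs(eta_t):         # condition (52)
            return np.sign(eta_t) * a
        a /= 2.0                                           # dichotomic shrinking
    raise RuntimeError("no feasible a found; check continuity of F at the edges")

# usage sketch, with a vectorized logistic loss:
# alpha_t = solve_alpha(lambda t: np.log1p(np.exp(-t)), v_prev, e_prev, y, h_x)
```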
Consider the "spring loss" that we define, for r.s denoting the nearest integer, as:
b
. 2
FSL pzq “ logp1 ` expp´zqq ` 1 ´ 1 ´ 4 pz ´ rzsq . (53)
Figure 6 plots this loss, which composes the logistic loss with a "U"-shaped term. This loss would
escape all optimization algorithms of Table 1 (Appendix), yet there is a trivial implementation of our
offset oracle, as explained in Figure 6:
21
1. if the interval $I$ defined by $\tilde e_{(t-1)i}$ and $\tilde e_{ti}$ contains at least one peak, compute the tangency point ($z_t$) at the closest local "U" that passes through $(\tilde e_{(t-1)i}, F(\tilde e_{(t-1)i}))$; then if $z_t \in I$, let $v_{ti} \leftarrow z_t - \tilde e_{(t-1)i}$, else $v_{ti} \leftarrow \tilde e_{ti} - \tilde e_{(t-1)i}$;
2. otherwise $F$ on $I$ is strictly convex and differentiable: a simple dichotomic search can retrieve a feasible $v_{ti}$ (see convex losses below).
Notice that one can alleviate the repetitive dichotomic search by pre-tabulating a feasible $v$ for a set of differences $|a-b|$ ($a, b$ belonging to the abscissae of the same "U") decreasing by a fixed factor, choosing $v_{ti} \leftarrow v$ of the largest tabulated $|a-b|$ no larger than $|\tilde e_{ti} - \tilde e_{(t-1)i}|$.
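For concreteness, a minimal sketch (ours, not the paper's code) evaluating the spring loss (53); the peaks mentioned in item 1 sit at the half-integers, where the square-root term vanishes:

```python
# Sketch: the spring loss (53), logistic loss plus a "U"-shaped term built from
# the distance to the nearest integer.
import numpy as np

def spring_loss(z):
    z = np.asarray(z, dtype=float)
    frac = z - np.rint(z)                      # z - [z], in [-1/2, 1/2]
    return np.log1p(np.exp(-z)) + 1.0 - np.sqrt(1.0 - 4.0 * frac ** 2)

# between two consecutive peaks (half-integers) the loss is strictly convex and
# differentiable, which is what item 2 above exploits.
print(spring_loss([0.0, 0.25, 0.5, 1.0]))
```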
Discontinuities Discontinuities do not represent an issue if the argument $z$ of $I_{ti}(z)$ is large enough, as shown by the following simple Lemma.

Lemma B.4. Define the discontinuity of $F$ as:
$$\mathrm{disc}(F) \doteq \max\left\{ \sup_z |F(z) - \lim_{z^-} F(z)|,\; \sup_z |F(z) - \lim_{z^+} F(z)| \right\}. \qquad (54)$$
Figure 4 (c) shows a case where the discontinuity is larger than $z$. In this case, an issue for computing the next weight eventually happens only when the current edge is at the discontinuity. We note that as iterations increase and the weak learner eventually finds it more difficult to return weak hypotheses with $\eta_.$ large enough, discontinuities may become an issue for SecBoost not to stop at Step 2.5. However, one can always use a simple trick to avoid stopping, which relies on the leveraging coefficient $\alpha_t$: this is described in Section B.9.
The case of convex losses If $F$ is convex (not necessarily differentiable nor strictly convex), there is a simple way to find a valid output for the offset oracle, which relies on the following Lemma.

Lemma B.5. Suppose $F$ convex. Then for any $z, z' \in \mathbb{R}$, $v \neq 0$,
$$\{v > 0 : Q^*_F(z, z', v) = r\} = \left\{v > 0 : D_F\!\left(z \,\Big\|\, \frac{F(z+v) - F(z)}{v}\right) = r\right\}. \qquad (55)$$
(proof in Appendix, Section B.8) By definition, $I_{ti}(z') \subseteq I_{ti}(z)$ for any $z' \leq z$, so a simple way to implement the offset oracle's output $\mathrm{OO}(t, i, z)$ is, for some $0 < r < z$, to solve the Bregman identity in the RHS of (55) and then return any relevant $v$. If $F$ is strictly convex, there is just one choice.
If solving the Bregman identity is tedious but $F$ is strictly convex, there is a simple dichotomic search that is guaranteed to find a feasible $v$. It exploits the fact that the abscissa maximizing the difference between any secant of $F$ and $F$ has a simple closed form (see [21, Supplement, Figure 13]), and so the OBI in (1) (Definition 4.6) has a closed form as well. In this case, it is enough, after taking a first non-zero guess for $v$ (either positive or negative), to divide it by a constant $> 1$ until the corresponding OBI is no larger than the $z$ in the query $\mathrm{OO}(t, i, z)$.
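The following minimal sketch (ours; it replaces the closed form of [21] by a numerical argmax, and all names are hypothetical) illustrates the halving procedure just described for a strictly convex, vectorized $F$:

```python
# Sketch: dichotomic search for the convex case -- halve a first guess for v
# until the optimal Bregman information (OBI) of the secant from z to z+v does
# not exceed the budget passed to the offset oracle.
import numpy as np

def obi(F, z, v, grid_size=10001):
    """Largest gap between the secant [(z,F(z)), (z+v,F(z+v))] and F, computed
    on a grid (the paper uses a closed form instead, see [21])."""
    t = np.linspace(min(z, z + v), max(z, z + v), grid_size)
    slope = (F(z + v) - F(z)) / v
    secant = F(z) + slope * (t - z)
    return float(np.max(secant - F(t)))        # >= 0 when F is convex

def offset_oracle_convex(F, z_budget, z_edge, v_init, shrink=2.0, max_iter=100):
    """Return a feasible offset v (same sign as v_init) with OBI <= z_budget."""
    v = v_init
    for _ in range(max_iter):
        if v != 0.0 and obi(F, z_edge, v) <= z_budget:
            return v
        v /= shrink                             # divide by a constant > 1
    raise RuntimeError("no feasible offset found")

# usage sketch, with the logistic loss at edge z_edge = -1.0:
F = lambda t: np.log1p(np.exp(-t))
print(offset_oracle_convex(F, z_budget=0.01, z_edge=-1.0, v_init=4.0))
```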
where $r$ is supposed small enough for $I_{z,r}$ to be non-empty. There is a simple graphical solution to this which, as Figure 7 explains, consists in finding $v$ solution of
$$\sup_t\left( F(z+v) - \left(F(t) + \frac{F(z+v) - F(z)}{v}\cdot(z+v-t)\right)\right) = r. \qquad (57)$$
Figure 6: The spring loss in (53) is neither convex, nor Lipschitz or differentiable, and has an infinite number of local minima. Yet, an implementation of the offset oracle is trivial, as an output for OO can be obtained from the computation of a single tangent point (here, the orange v, see text; best viewed in color).
[Figure 7 graphic omitted; panel title: "Optimal Bregman Information – F convex".]
Figure 7: Computing the OBI Q_F(z, z + v, z + v) for F convex, (z, v) being given and v > 0. We compute the line (Δ_t) crossing F at any point t, with slope equal to that of the secant [(z, F(z)), (z + v, F(z + v))], and then the difference between F at z + v and this line at z + v. We move t so as to maximize this difference. The optimal t (in green) gives the corresponding OBI. In (56) and (58), we are interested in finding v given this difference, r. We also need to replicate this computation for v < 0.
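To make this construction concrete, the following is a minimal numerical sketch (a simple grid search over the crossing point t in [z, z + v], which is our own illustrative choice) of the maximization described above, for a convex F given as a plain Python function:

import numpy as np

def obi_convex(F, z, v, num_points=1001):
    """Grid-search sketch of the OBI for convex F, given z and v > 0.

    For each candidate crossing point t in [z, z + v], build the line through
    (t, F(t)) whose slope equals that of the secant [(z, F(z)), (z + v, F(z + v))],
    evaluate it at z + v, and keep the largest gap between F(z + v) and that value.
    """
    s = (F(z + v) - F(z)) / v                      # slope of the secant over [z, z + v]
    ts = np.linspace(z, z + v, num_points)         # candidate crossing points t
    gaps = np.array([F(z + v) - (F(t) + s * (z + v - t)) for t in ts])
    best = int(np.argmax(gaps))
    return ts[best], gaps[best]                    # optimal t (in green in Figure 7) and the OBI

For instance, with F(z) = log(1 + exp(-z)), z = -1 and v = 2, the maximizing t is close to 0, where the tangent slope of F matches the secant slope.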
B.9 Handling discontinuities in the offset oracle to prevent stopping in Step 2.5
Theorem 5.3 and Lemma 5.6 require running SecBoost for as many iterations as needed, which implies no early stopping in Step 2.5. Lemma B.4 shows that early stopping can only be triggered by too large local discontinuities at the edges. This is a weak requirement on running SecBoost, but there also exists a weak assumption on the discontinuities of the loss itself that simply prevents any early stopping and does not degrade the boosting rates. The result exploits the freedom in choosing α_t in Step 2.3.
Lemma B.6. Suppose F is any function defined over R with discontinuities of zero Lebesgue measure. Then Corollary 5.6 holds for boosting F with its inequality strict, while never triggering early stopping in Step 2.5 of SecBoost.
Proof. To show that we never trigger stopping in Step 2.5, it is sufficient to show that we can run SecBoost while ensuring F is continuous in an open neighborhood around all edges y_i H_t(x_i), ∀i ∈ [m], ∀t ≥ 0 (by letting H_0 ≐ h_0). Recall that ẽ_{ti} = ẽ_{(t-1)i} + α_t · y_i h_t(x_i), so changing α_t changes all edges. We just have to show that either computing α_t ensures such continuity, or that α_t can be slightly modified to do so. We have two ways to compute α_t:
1. using a value for W_{2,t} that represents an "absolute" upper bound in the sense of (8) (e.g. Lemma 5.7), and then computing α_t as in Step 2.3 of SecBoost;
2. using algorithm Solveα.
Because of the assumption on F , we can always ensure that F is continuous in an open neighborhood
of all edges (the basis of the induction amounts to a straightforward choice for h0 ). This proves the
Lemma for [2.].
If we rely on [1.] and the α_t computed leads to some discontinuities, then we have complete control to change α_t: any continuous change of ε_t induces a continuous change in α_t, and thus a continuous change of all edges as well. So, starting from the initial ε_t chosen in Step 2.3, we increase it to a value ε*_t > ε_t, which we want to keep as small as possible. We can define, for each i ∈ [m], an open set (a_i, b_i) which is the interval spanned by the new ẽ_{ti}(ε'_t) as ε'_t ranges over (ε_t, ε*_t). Since there are only finitely many discontinuities of F, there exists a small ε*_t > ε_t such that
∀i ∈ [m], ∀z ∈ (a_i, b_i), F is continuous at z.
This means that for any ε'_t ∈ (ε_t, ε*_t), we end up with a loss without any discontinuities at the new edges.
Now comes the reason why we want ε*_t - ε_t small: one can check that there always exists a small enough ε*_t > ε_t such that, for any ε'_t we choose, the boosting rate in Corollary 5.6 is affected by at most one additional iteration. Indeed, while we slightly change parameter ε_t to land all new edges outside of the discontinuities of F, we also increase the contribution of the boosting iteration in the RHS of (15) by a quantity δ > 0 which can be made as small as required; hence we can just replace the inequality in (15) by a strict inequality. This proves the statement of the Lemma if we rely on [1.] above.
This completes the proof of Lemma B.6.
Suppose F is strictly convex and strictly decreasing, as for classical convex surrogates (e.g. the logistic loss). Assuming w.l.o.g. all α_· > 0 and that example i has both y_i h_t(x_i) > 0 and y_i h_{t-1}(x_i) > 0, then as long as z is small enough, we are guaranteed that any choice v_{t-1} ∈ I_{(t-1)i}(z) and v_t ∈ I_{ti}(z) results in 0 < w_{(t+1)i} < w_{ti}, which follows the classical boosting pattern that examples receiving the right class from weak hypotheses have their weight decreased (see Figure 8). If z = z' is large enough, then this does not hold anymore, as seen from Figure 8.
Figure 8: Case F strictly convex, with two cases of limit OBI z and z' in I_{·i}(·). Example i has e_{ti} > 0 and e_{(t-1)i} > 0 large enough (hence, edges with respect to weak classifiers h_t and h_{t-1} large enough) so that I_{ti}(z) ∩ I_{(t-1)i}(z) = I_{(t-1)i}(z) ∩ I_{(t-2)i}(z) = I_{ti}(z) ∩ I_{(t-2)i}(z) = ∅. In this case, regardless of the offsets chosen by OO, we are guaranteed that the weights satisfy w_{(t+1)i} < w_{ti} < w_{(t-1)i}, which follows the boosting pattern that examples receiving the right classification by weak classifiers have their weights decreasing. If however the limit OBI changes from z to a larger z', this is not guaranteed anymore: in that case, it may happen that w_{(t+1)i} > w_{ti}.
B.11 The case of piecewise constant losses for Solveα
Figure 9: How our algorithm works with the 0/1 loss (in red): at the initialization stage, assuming we pick h_0 = 0 for simplicity and some v_0 < 0, all training examples get the same weight, given by minus the slope of the thick blue dashed line. All weights are thus > 0. At iteration t, when we update the weights (Step 2.6), one of two cases can happen for a given training example (x, y). In (A), the sign of the edge of the strong model remains the same: either both edges are positive (blue) or both negative (olive green) (the ordering of edges is not important). In this case, regardless of the offset, the new weight will be 0. In (B), the two edges have different signs (again, the ordering of edges is not important). In this case, the example keeps a non-zero weight over the next iteration. See text below for details.
Figure 9 schematizes a run of our algorithm when the training loss is the 0/1 loss. At initialization, it is easy to get all examples to have non-zero weight. The weight update for example (x, y) of our algorithm in Step 2.3 is (minus) the slope of a secant that crosses the loss in two points, both lying between yH_{t-1}(x) and yH_t(x). Hence, if the predicted label does not change (sign(H_t(x)) = sign(H_{t-1}(x))), then the next weight (w_{t+1}) of the example will be zero (Figure 9, case (A)). However, if the predicted label does change (sign(H_t(x)) ≠ sign(H_{t-1}(x))), then the example may get a non-zero weight depending on the offset chosen.
Hence, our generic implementation of Algorithms 3 and 4 may completely fail at providing non-zero weights for the next iteration, which makes the algorithm stop in Step 2.7. And even when not all weights are zero, the subset of non-zero weights may be too small, which would break the Weak Learning Assumption needed for boosting compliance at the next iteration (Assumption 5.5).
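As a toy numeric illustration of these two cases (a sketch only, which ignores the precise offset choices within the interval), the weight obtained as minus the slope of a secant of the 0/1 loss is zero when both edges sit on the same constant piece, and strictly positive when they straddle the jump at 0:

def zero_one_loss(z):
    """0/1 loss as a function of the edge z = y * H(x) (convention: loss 1 iff z <= 0)."""
    return 1.0 if z <= 0 else 0.0

def secant_weight(F, e_old, e_new):
    """Minus the slope of the secant of F between two edge values (toy sketch)."""
    return -(F(e_new) - F(e_old)) / (e_new - e_old)

# Case (A) of Figure 9: the predicted sign does not change -> both edges on a flat piece.
print(secant_weight(zero_one_loss, 0.4, 1.2))    # 0.0 : the example drops out
# Case (B) of Figure 9: the predicted sign changes -> the secant crosses the jump.
print(secant_weight(zero_one_loss, -0.5, 0.5))   # 1.0 : the example keeps a non-zero weight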
C.1 Algorithm and implementation of Solveα and how to find parameters from Theorem 5.8
As Theorem 5.8 explains, Solveα can easily get to not just the leveraging coefficient α_t, but also the other parameters that are necessary to implement SecBoost: W_{2,t} and ε_t (both used in Step 2.5). We now provide simple pseudo-code showing how to implement Solveα and get, on top of it, the two other parameters. We do not seek π_t since it is useful only in the convergence analysis. Also, our proposed implementation is optimized for complexity (because of the geometric updating of δ and W in their respective loops) but much less so for accuracy. Algorithm SOLVE_extended explains the overall procedure.
Algorithm 3 SOLVE_extended(S, w, h, M)
Input sample S = {(x_i, y_i), i = 1, 2, ..., m}, w ∈ R^m, h : X → R, M ≠ 0.
// in our case, w ← w_t; h ← h_t; M ← M_t (current weights, weak hypothesis and max
// confidence, see Step 2.3 in SecBoost and Assumption 5.1)
Step 1: // all initializations
  η_init ← η(w, h);   (59)
  δ ← 1.0;   (60)
  W_init ← 1.0;   (61)
Step 2: do // Step 2 computes the leveraging coefficient α_t
  α ← δ · sign(η_init);
  η_new ← η(w̃(α), h);
  if |η_new - η_init| < |η_init| then found_alpha ← true else δ ← δ/2;
  while found_alpha = false;
Step 3: W ← Left-Hand Side of (8) (main file) // Step 3 computes W_{2,t}
  // we can use (8) (main file) because we know α
  if W =_machine 0 then
    // the LHS of (8) is (machine) 0: we just need to find W such that (9) holds!
    W ← W_init;
    while |α| > |η_init|/(W · M²) do W ← W/2;
  endif
Step 4: b_sup ← |η_init|/(W · M²); // Step 4 computes ε_t
  ε ← (b_sup/α) - 1;
Return (α, W, ε);
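The following Python sketch mirrors the pseudo-code above. It is only illustrative: the edge functional η(·, ·), the weight update w̃(α) and the left-hand side of (8) are passed in as callables (their definitions are in the main file and are not restated here), and the halving loops are capped for safety.

import numpy as np

def solve_extended(eta, w_tilde, lhs_of_8, w, h, M, max_halvings=100):
    """Sketch of SOLVE_extended; returns (alpha, W, eps), cf. Algorithm 3.

    eta(w, h)   -- edge functional, assumed given
    w_tilde(a)  -- weights updated with leveraging coefficient a, assumed given
    lhs_of_8(a) -- left-hand side of inequality (8) (main file), assumed given
    """
    # Step 1: initializations, mirroring (59)-(61)
    eta_init = eta(w, h)
    delta = 1.0
    W_init = 1.0

    # Step 2: geometric search for the leveraging coefficient alpha
    for _ in range(max_halvings):
        alpha = delta * np.sign(eta_init)
        if abs(eta(w_tilde(alpha), h) - eta_init) < abs(eta_init):
            break
        delta /= 2.0

    # Step 3: W_{2,t} from the LHS of (8); geometric fallback if it is machine zero
    W = lhs_of_8(alpha)
    if W == 0.0:
        W = W_init
        for _ in range(max_halvings):
            if abs(alpha) <= abs(eta_init) / (W * M ** 2):
                break
            W /= 2.0

    # Step 4: epsilon_t
    b_sup = abs(eta_init) / (W * M ** 2)
    eps = b_sup / alpha - 1.0
    return alpha, W, eps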
Figure 10: How to find some v ∈ I_{ti}(z): parse the interval [ẽ_{ti}, ẽ_{(t-1)i}] with a regular step δ, seek the secant with minimal slope (because ẽ_{ti} < ẽ_{(t-1)i}; otherwise, we would seek the secant with maximal slope). It is necessarily the one minimizing the OBI among all regularly spaced choices. If the OBI is still too large, decrease the step δ and start the search again.
There exists a very simple trick to get some adequate offset v satisfying (11) (main file), explained in Figure 10. In short, we seek the optimally bent secant and check that the OBI is no more than a required z. This can be done by parsing the interval [ẽ_{ti}, ẽ_{(t-1)i}] using regularly spaced values. If the OBI is too large, we can start again with a smaller step size. Algorithm OO_simple details the key part of the search.
Algorithm 4 OO_simple(F, ẽ_t, ẽ_{t-1}, z, Z)
Input loss F, the two last edges ẽ_t, ẽ_{t-1}, maximal OBI z, precision Z.
// in our case, ẽ_t ← ẽ_{ti}; ẽ_{t-1} ← ẽ_{(t-1)i}; (for training example index i ∈ [m])
Step 1: // all initializations
  δ ← (ẽ_{t-1} - ẽ_t)/Z;   (62)
  z_c ← ẽ_t + δ;   (63)
  i ← 0;   (64)
Step 2: do
  s_c ← SLOPE(F, ẽ_t, z_c);
  // returns the slope of the secant passing through (ẽ_t, F(ẽ_t)) and (z_c, F(z_c))
  if (i = 0) ∨ ((δ > 0) ∧ (s_c < s*)) ∨ ((δ < 0) ∧ (s_c > s*)) then s* ← s_c; z* ← z_c
  endif
  z_c ← z_c + δ;
  i ← i + 1;
  while (z_c - ẽ_t) · (z_c - ẽ_{t-1}) < 0; // checks that z_c is still in the interval
Return z* - ẽ_t; // this is the offset v
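A direct Python transcription of OO_simple reads as follows (a sketch only; SLOPE is simply the slope of the secant through (ẽ_t, F(ẽ_t)) and (z_c, F(z_c)), and we assume Z > 1 so that at least one interior point is probed):

def oo_simple(F, e_t, e_tm1, Z):
    """Sketch of OO_simple; returns the offset v = z* - e_t, cf. Algorithm 4.

    F      -- the loss, a function from R to R
    e_t    -- current edge  (e~_t)
    e_tm1  -- previous edge (e~_{t-1})
    Z      -- precision (number of regularly spaced probes in the interval)
    """
    # Step 1: initializations, mirroring (62)-(64)
    delta = (e_tm1 - e_t) / Z
    z_c = e_t + delta
    i = 0
    s_star, z_star = None, None

    # Step 2: scan the interval and keep the secant with extreme slope
    while (z_c - e_t) * (z_c - e_tm1) < 0:          # z_c still strictly inside the interval
        s_c = (F(z_c) - F(e_t)) / (z_c - e_t)       # SLOPE(F, e~_t, z_c)
        if i == 0 or (delta > 0 and s_c < s_star) or (delta < 0 and s_c > s_star):
            s_star, z_star = s_c, z_c
        z_c += delta
        i += 1
    return z_star - e_t                             # the offset v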
Figure 11: Crops of the two losses whose optimization has been experimentally tested with SecBoost, in addition to the logistic loss. See text for details.
We provide here a few toy experiments using SecBoost. These are just meant to show that a simple implementation of the algorithm, following the blueprints given above, can indeed manage to optimize various losses. They are not meant to explain how to pick the best hyperparameters (e.g. (60)), nor how to choose the best loss given a domain, a problem that is far beyond the scope of our paper.
In this implementation, the weak learner induces decision trees, and we minimize Matusita's loss at the leaves to learn fixed-size trees; see [33] for the criterion and induction scheme, which is standard for decision trees. SecBoost is implemented as given in the paper, and so are the implementations of Solveα and the offset oracle provided above. We have made no optimization whatsoever, with one exception: when numerical approximation errors lead to an offset that is machine 0, we replace it by a small random value to prevent the use of derivatives in SecBoost.
We have investigated three losses. The first is the well-known logistic loss:
F_{LOG}(z) ≐ log(1 + exp(-z)). (65)
The other two are tweaks of the logistic loss. We have investigated a clipped version of the logistic loss,
F_{CL,q}(z) ≐ min{log(1 + exp(-z)), log(1 + exp(-q))}, (66)
with q ∈ R, which clips the logistic loss above a certain value. This loss is non-convex and non-differentiable, but it is Lipschitz. We have also investigated a generalization of the spring loss (main file):
F_{SL,Q}(z) ≐ log(1 + exp(-z)) + (1 - √(1 - 4(z_Q - [z_Q])²))/Q, (67)
with z_Q ≐ Qz - 1/2 ([·] denotes the closest integer), which adds to the logistic loss regularly spaced peaks
of variable width. This loss is non-convex, non-differentiable, non-Lipschitz. Figure 11 provides a
crop of the clipped logistic loss and spring loss we have used in our test. Notice the “hardness” that
the spring loss intuitively represents for ML.
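For reference, the three losses (65)-(67) can be transcribed directly as follows (a sketch; np.round stands in for the closest-integer operator [·]):

import numpy as np

def f_log(z):
    """Logistic loss, eq. (65)."""
    return np.log(1.0 + np.exp(-z))

def f_clipped(z, q):
    """Clipped logistic loss, eq. (66): the logistic loss capped at log(1 + exp(-q))."""
    return np.minimum(np.log(1.0 + np.exp(-z)), np.log(1.0 + np.exp(-q)))

def f_spring(z, Q):
    """Spring loss, eq. (67): the logistic loss plus regularly spaced peaks."""
    z_q = Q * z - 0.5
    frac = z_q - np.round(z_q)          # z_Q minus its closest integer, in [-1/2, 1/2]
    return np.log(1.0 + np.exp(-z)) + (1.0 - np.sqrt(1.0 - 4.0 * frac ** 2)) / Q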
We provide an experiment on the public domain UCI tictactoe dataset [23] (using a 10-fold stratified cross-validation to estimate test errors). In addition to the three losses, we have crossed them with several other variables: the size of the trees (either a single internal node, i.e. stumps, or at most 20 nodes) and, to give one example of how changing a (key) hyperparameter can change the result, a range of initial values of δ in (60). Finally, we have crossed all these variables with the presence of symmetric label noise in the training data, following the setup of [37, 39]: we flip each label in the training sample with probability η. Figure 12 summarizes the results obtained. One can see that SecBoost manages to optimize all losses in pretty much all settings, with early stopping occasionally required for the spring loss if δ is too large. Note that the best initial value for δ depends on the loss optimized in these experiments: for δ = 0.1, the test error of the spring loss decreases much faster than for the other losses, even though the spring loss is just the logistic loss plus regularly spaced peaks. This could signal interesting avenues for the best possible implementation of SecBoost, or a further understanding of the best formal ways to fix those parameters, both of which are out of the scope of this paper.
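For completeness, the symmetric label-noise injection described above admits a one-line implementation (our own sketch, with labels assumed in {-1, +1}):

import numpy as np

def flip_labels(y, eta, seed=0):
    """Symmetric label noise: independently flip each training label with probability eta."""
    rng = np.random.default_rng(seed)
    y = np.asarray(y)
    flips = rng.random(y.shape[0]) < eta
    return np.where(flips, -y, y)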
[Figure 12 panels omitted: test-error plots arranged in a grid with columns δ = 0.1 and δ = 1.0 (each split into Stumps and Max size = 20) and rows η ∈ {0%, 5%, 10%, 20%}.]
Figure 12: Experiments on UCI tictactoe showing estimated test errors after minimizing each of the
three losses we consider, with varying training noise level η, max tree size and initial hyperparameter
δ value in (60). See text.
NeurIPS Paper Checklist
1. Claims
Question: Do the main claims made in the abstract and introduction accurately reflect the
paper’s contributions and scope?
Answer: [Yes]
Justification: Our paper is a theory paper: all claims are properly formalized and used.
Guidelines:
• The answer NA means that the abstract and introduction do not include the claims
made in the paper.
• The abstract and/or introduction should clearly state the claims made, including the
contributions made in the paper and important assumptions and limitations. A No or
NA answer to this question will not be perceived well by the reviewers.
• The claims made should match theoretical and experimental results, and reflect how
much the results can be expected to generalize to other settings.
• It is fine to include aspirational goals as motivation as long as it is clear that these goals
are not attained by the paper.
2. Limitations
Question: Does the paper discuss the limitations of the work performed by the authors?
Answer: [Yes]
Justification: The discussion section is devoted to limitations and improvements of our results.
Guidelines:
• The answer NA means that the paper has no limitation while the answer No means that
the paper has limitations, but those are not discussed in the paper.
• The authors are encouraged to create a separate "Limitations" section in their paper.
• The paper should point out any strong assumptions and how robust the results are to
violations of these assumptions (e.g., independence assumptions, noiseless settings,
model well-specification, asymptotic approximations only holding locally). The authors
should reflect on how these assumptions might be violated in practice and what the
implications would be.
• The authors should reflect on the scope of the claims made, e.g., if the approach was
only tested on a few datasets or with a few runs. In general, empirical results often
depend on implicit assumptions, which should be articulated.
• The authors should reflect on the factors that influence the performance of the approach.
For example, a facial recognition algorithm may perform poorly when image resolution
is low or images are taken in low lighting. Or a speech-to-text system might not be
used reliably to provide closed captions for online lectures because it fails to handle
technical jargon.
• The authors should discuss the computational efficiency of the proposed algorithms
and how they scale with dataset size.
• If applicable, the authors should discuss possible limitations of their approach to
address problems of privacy and fairness.
• While the authors might fear that complete honesty about limitations might be used by
reviewers as grounds for rejection, a worse outcome might be that reviewers discover
limitations that aren’t acknowledged in the paper. The authors should use their best
judgment and recognize that individual actions in favor of transparency play an impor-
tant role in developing norms that preserve the integrity of the community. Reviewers
will be specifically instructed to not penalize honesty concerning limitations.
3. Theory Assumptions and Proofs
Question: For each theoretical result, does the paper provide the full set of assumptions and
a complete (and correct) proof?
Answer: [Yes]
Justification: Our paper is a theory paper: all assumptions, statements and proofs provided.
Guidelines:
• The answer NA means that the paper does not include theoretical results.
• All the theorems, formulas, and proofs in the paper should be numbered and cross-
referenced.
• All assumptions should be clearly stated or referenced in the statement of any theorems.
• The proofs can either appear in the main paper or the supplemental material, but if
they appear in the supplemental material, the authors are encouraged to provide a short
proof sketch to provide intuition.
• Inversely, any informal proof provided in the core of the paper should be complemented
by formal proofs provided in appendix or supplemental material.
• Theorems and Lemmas that the proof relies upon should be properly referenced.
4. Experimental Result Reproducibility
Question: Does the paper fully disclose all the information needed to reproduce the main ex-
perimental results of the paper to the extent that it affects the main claims and/or conclusions
of the paper (regardless of whether the code and data are provided or not)?
Answer: [Yes]
Justification: Though our paper is a theory paper, we have included in the supplement a
detailed statement of all related algorithms and a toy experiment of a simple implementation
of these algorithms showcasing a simple run on a public UCI domain.
Guidelines:
• The answer NA means that the paper does not include experiments.
• If the paper includes experiments, a No answer to this question will not be perceived
well by the reviewers: Making the paper reproducible is important, regardless of
whether the code and data are provided or not.
• If the contribution is a dataset and/or model, the authors should describe the steps taken
to make their results reproducible or verifiable.
• Depending on the contribution, reproducibility can be accomplished in various ways.
For example, if the contribution is a novel architecture, describing the architecture fully
might suffice, or if the contribution is a specific model and empirical evaluation, it may
be necessary to either make it possible for others to replicate the model with the same
dataset, or provide access to the model. In general, releasing code and data is often
one good way to accomplish this, but reproducibility can also be provided via detailed
instructions for how to replicate the results, access to a hosted model (e.g., in the case
of a large language model), releasing of a model checkpoint, or other means that are
appropriate to the research performed.
• While NeurIPS does not require releasing code, the conference does require all submis-
sions to provide some reasonable avenue for reproducibility, which may depend on the
nature of the contribution. For example
(a) If the contribution is primarily a new algorithm, the paper should make it clear how
to reproduce that algorithm.
(b) If the contribution is primarily a new model architecture, the paper should describe
the architecture clearly and fully.
(c) If the contribution is a new model (e.g., a large language model), then there should
either be a way to access this model for reproducing the results or a way to reproduce
the model (e.g., with an open-source dataset or instructions for how to construct
the dataset).
(d) We recognize that reproducibility may be tricky in some cases, in which case
authors are welcome to describe the particular way they provide for reproducibility.
In the case of closed-source models, it may be that access to the model is limited in
some way (e.g., to registered users), but it should be possible for other researchers
to have some path to reproducing or verifying the results.
5. Open access to data and code
Question: Does the paper provide open access to the data and code, with sufficient instruc-
tions to faithfully reproduce the main experimental results, as described in supplemental
material?
Answer: [No]
Justification: Our paper is a theory paper. All algorithms we introduce are either in the main
file or the appendix.
Guidelines:
• The answer NA means that the paper does not include experiments requiring code.
• Please see the NeurIPS code and data submission guidelines (https://nips.cc/
public/guides/CodeSubmissionPolicy) for more details.
• While we encourage the release of code and data, we understand that this might not be
possible, so “No” is an acceptable answer. Papers cannot be rejected simply for not
including code, unless this is central to the contribution (e.g., for a new open-source
benchmark).
• The instructions should contain the exact command and environment needed to run to
reproduce the results. See the NeurIPS code and data submission guidelines (https:
//nips.cc/public/guides/CodeSubmissionPolicy) for more details.
• The authors should provide instructions on data access and preparation, including how
to access the raw data, preprocessed data, intermediate data, and generated data, etc.
• The authors should provide scripts to reproduce all experimental results for the new
proposed method and baselines. If only a subset of experiments are reproducible, they
should state which ones are omitted from the script and why.
• At submission time, to preserve anonymity, the authors should release anonymized
versions (if applicable).
• Providing as much information as possible in supplemental material (appended to the
paper) is recommended, but including URLs to data and code is permitted.
6. Experimental Setting/Details
Question: Does the paper specify all the training and test details (e.g., data splits, hyper-
parameters, how they were chosen, type of optimizer, etc.) necessary to understand the
results?
Answer: [NA]
Justification: Our paper is a theory paper.
Guidelines:
• The answer NA means that the paper does not include experiments.
• The experimental setting should be presented in the core of the paper to a level of detail
that is necessary to appreciate the results and make sense of them.
• The full details can be provided either with the code, in appendix, or as supplemental
material.
7. Experiment Statistical Significance
Question: Does the paper report error bars suitably and correctly defined or other appropriate
information about the statistical significance of the experiments?
Answer: [NA]
Justification: Our paper is a theory paper.
Guidelines:
• The answer NA means that the paper does not include experiments.
• The authors should answer "Yes" if the results are accompanied by error bars, confi-
dence intervals, or statistical significance tests, at least for the experiments that support
the main claims of the paper.
• The factors of variability that the error bars are capturing should be clearly stated (for
example, train/test split, initialization, random drawing of some parameter, or overall
run with given experimental conditions).
• The method for calculating the error bars should be explained (closed form formula,
call to a library function, bootstrap, etc.)
• The assumptions made should be given (e.g., Normally distributed errors).
• It should be clear whether the error bar is the standard deviation or the standard error
of the mean.
• It is OK to report 1-sigma error bars, but one should state it. The authors should
preferably report a 2-sigma error bar than state that they have a 96% CI, if the hypothesis
of Normality of errors is not verified.
• For asymmetric distributions, the authors should be careful not to show in tables or
figures symmetric error bars that would yield results that are out of range (e.g. negative
error rates).
• If error bars are reported in tables or plots, the authors should explain in the text how
they were calculated and reference the corresponding figures or tables in the text.
8. Experiments Compute Resources
Question: For each experiment, does the paper provide sufficient information on the com-
puter resources (type of compute workers, memory, time of execution) needed to reproduce
the experiments?
Answer: [NA]
Justification: Our paper is a theory paper.
Guidelines:
• The answer NA means that the paper does not include experiments.
• The paper should indicate the type of compute workers CPU or GPU, internal cluster,
or cloud provider, including relevant memory and storage.
• The paper should provide the amount of compute required for each of the individual
experimental runs as well as estimate the total compute.
• The paper should disclose whether the full research project required more compute
than the experiments reported in the paper (e.g., preliminary or failed experiments that
didn’t make it into the paper).
9. Code Of Ethics
Question: Does the research conducted in the paper conform, in every respect, with the
NeurIPS Code of Ethics https://neurips.cc/public/EthicsGuidelines?
Answer: [Yes]
Justification: The research of the paper follows the code of ethics.
Guidelines:
• The answer NA means that the authors have not reviewed the NeurIPS Code of Ethics.
• If the authors answer No, they should explain the special circumstances that require a
deviation from the Code of Ethics.
• The authors should make sure to preserve anonymity (e.g., if there is a special consid-
eration due to laws or regulations in their jurisdiction).
10. Broader Impacts
Question: Does the paper discuss both potential positive societal impacts and negative
societal impacts of the work performed?
Answer: [NA]
Justification: Our paper is a theory paper.
Guidelines:
• The answer NA means that there is no societal impact of the work performed.
• If the authors answer NA or No, they should explain why their work has no societal
impact or why the paper does not address societal impact.
• Examples of negative societal impacts include potential malicious or unintended uses
(e.g., disinformation, generating fake profiles, surveillance), fairness considerations
(e.g., deployment of technologies that could make decisions that unfairly impact specific
groups), privacy considerations, and security considerations.
• The conference expects that many papers will be foundational research and not tied
to particular applications, let alone deployments. However, if there is a direct path to
any negative applications, the authors should point it out. For example, it is legitimate
to point out that an improvement in the quality of generative models could be used to
generate deepfakes for disinformation. On the other hand, it is not needed to point out
that a generic algorithm for optimizing neural networks could enable people to train
models that generate Deepfakes faster.
• The authors should consider possible harms that could arise when the technology is
being used as intended and functioning correctly, harms that could arise when the
technology is being used as intended but gives incorrect results, and harms following
from (intentional or unintentional) misuse of the technology.
• If there are negative societal impacts, the authors could also discuss possible mitigation
strategies (e.g., gated release of models, providing defenses in addition to attacks,
mechanisms for monitoring misuse, mechanisms to monitor how a system learns from
feedback over time, improving the efficiency and accessibility of ML).
11. Safeguards
Question: Does the paper describe safeguards that have been put in place for responsible
release of data or models that have a high risk for misuse (e.g., pretrained language models,
image generators, or scraped datasets)?
Answer: [NA]
Justification: No release of data or models.
Guidelines:
• The answer NA means that the paper poses no such risks.
• Released models that have a high risk for misuse or dual-use should be released with
necessary safeguards to allow for controlled use of the model, for example by requiring
that users adhere to usage guidelines or restrictions to access the model or implementing
safety filters.
• Datasets that have been scraped from the Internet could pose safety risks. The authors
should describe how they avoided releasing unsafe images.
• We recognize that providing effective safeguards is challenging, and many papers do
not require this, but we encourage authors to take this into account and make a best
faith effort.
12. Licenses for existing assets
Question: Are the creators or original owners of assets (e.g., code, data, models), used in
the paper, properly credited and are the license and terms of use explicitly mentioned and
properly respected?
Answer: [NA]
Justification: no outside code, data or models used requiring licensing.
Guidelines:
• The answer NA means that the paper does not use existing assets.
• The authors should cite the original paper that produced the code package or dataset.
• The authors should state which version of the asset is used and, if possible, include a
URL.
• The name of the license (e.g., CC-BY 4.0) should be included for each asset.
• For scraped data from a particular source (e.g., website), the copyright and terms of
service of that source should be provided.
• If assets are released, the license, copyright information, and terms of use in the
package should be provided. For popular datasets, paperswithcode.com/datasets
has curated licenses for some datasets. Their licensing guide can help determine the
license of a dataset.
• For existing datasets that are re-packaged, both the original license and the license of
the derived asset (if it has changed) should be provided.
• If this information is not available online, the authors are encouraged to reach out to
the asset’s creators.
13. New Assets
Question: Are new assets introduced in the paper well documented and is the documentation
provided alongside the assets?
Answer: [NA]
Justification: No new assets provided.
Guidelines:
• The answer NA means that the paper does not release new assets.
• Researchers should communicate the details of the dataset/code/model as part of their
submissions via structured templates. This includes details about training, license,
limitations, etc.
• The paper should discuss whether and how consent was obtained from people whose
asset is used.
• At submission time, remember to anonymize your assets (if applicable). You can either
create an anonymized URL or include an anonymized zip file.
14. Crowdsourcing and Research with Human Subjects
Question: For crowdsourcing experiments and research with human subjects, does the paper
include the full text of instructions given to participants and screenshots, if applicable, as
well as details about compensation (if any)?
Answer: [NA]
Justification: No crowdsourcing or research with human subjects.
Guidelines:
• The answer NA means that the paper does not involve crowdsourcing nor research with
human subjects.
• Including this information in the supplemental material is fine, but if the main contribu-
tion of the paper involves human subjects, then as much detail as possible should be
included in the main paper.
• According to the NeurIPS Code of Ethics, workers involved in data collection, curation,
or other labor should be paid at least the minimum wage in the country of the data
collector.
15. Institutional Review Board (IRB) Approvals or Equivalent for Research with Human
Subjects
Question: Does the paper describe potential risks incurred by study participants, whether
such risks were disclosed to the subjects, and whether Institutional Review Board (IRB)
approvals (or an equivalent approval/review based on the requirements of your country or
institution) were obtained?
Answer: [NA]
Justification: No research with human subjects.
Guidelines:
• The answer NA means that the paper does not involve crowdsourcing nor research with
human subjects.
• Depending on the country in which research is conducted, IRB approval (or equivalent)
may be required for any human subjects research. If you obtained IRB approval, you
should clearly state this in the paper.
• We recognize that the procedures for this may vary significantly between institutions
and locations, and we expect authors to adhere to the NeurIPS Code of Ethics and the
guidelines for their institution.
• For initial submissions, do not include any information that would break anonymity (if
applicable), such as the institution conducting the review.