Find a mapping
- $\exists \, DS \in M \; \text{such that} \; Increase(x) = Increase(DS(x)) \; \text{for all} \; x \in X$ - The increase observed in the input and output must be equal
- $\forall x \in X, \; y = DS(x) \Rightarrow y_1 = x_1 \; \text{and} \; y_m = x_n$ - The first and last values of the input must be preserved as-is in the output. This preserves reset points between blocks.
- Preserving, in this case, means that all properties of the value are preserved, including whether the datapoint is a reset (either via value or via ST).
- author's note: this point can be defined more rigorously, but for the sake of this exercise we can take it to mean that if $x_N$ is a reset point in $x$, it is also a reset point in $y$
func increase(x []datapoint) number {
    // Sum of the values observed just before each reset; adding them back
    // yields the total growth of the counter as if no resets had happened.
    var adjustment number
    for i := 1; i < len(x); i++ {
        if isReset(x[i-1], x[i]) {
            adjustment += x[i-1].Value()
        }
    }
    // Raw delta between the last and first samples, plus all reset adjustments.
    return x[len(x)-1].Value() - x[0].Value() + adjustment
}
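The types datapoint and number and the reset check isReset are left abstract above. Below is a minimal, hypothetical concretization that makes the snippet compile, assuming a reset is simply a drop in value between consecutive samples, together with an illustrative helper (satisfiesConstraints, not part of the original) that checks the two constraints listed at the top for a candidate downsampler; exact float comparison is a simplification.

type datapoint struct{ v float64 }
type number = float64

func (d datapoint) Value() number { return d.v }

// Assumption: a reset is any drop in value between consecutive samples.
func isReset(prev, cur datapoint) bool { return cur.Value() < prev.Value() }

// satisfiesConstraints checks, for a single input x, that the candidate
// downsampler ds preserves the increase and keeps the first and last raw
// values as-is.
func satisfiesConstraints(ds func([]datapoint) []datapoint, x []datapoint) bool {
    y := ds(x)
    return increase(y) == increase(x) &&
        y[0].Value() == x[0].Value() &&
        y[len(y)-1].Value() == x[len(x)-1].Value()
}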
Generally, if we rebuild $x$ cumulatively as $x'$:

(1) $x'_i = x_i + \sum_{k=1}^{i} adjust(x, k)$, such that

- i.e. a cumulative rebuild where all resets in $x$ have been adjusted for in $x'$, e.g. $x = [1, 2, 1, 3, 1] \Rightarrow x' = [1, 2, 1+2, 3+2, 1+2+3] = [1, 2, 3, 5, 6]$

(2) $adjust(x, k) = x_{k-1}$ if $isReset(x_{k-1}, x_k)$, and $adjust(x, k) = 0$ otherwise

- note: we leave the definition of what a reset is abstract here so that the analysis does not rely on the absolute values of any point
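As a sketch, (1) and (2) could look like the following, reusing the hypothetical datapoint, number and isReset definitions above (note the formulas are 1-indexed while the code is 0-indexed):

// adjust returns the value observed just before x[k] when x[k] is a reset
// point, and 0 otherwise; this is the per-point adjustment in (2).
func adjust(x []datapoint, k int) number {
    if k > 0 && isReset(x[k-1], x[k]) {
        return x[k-1].Value()
    }
    return 0
}

// cumulativeRebuild implements (1): out[i] = x[i] + sum of adjust(x, k) for
// all k <= i, folding every reset back in so the rebuilt series has none.
// e.g. [1, 2, 1, 3, 1] -> [1, 2, 3, 5, 6]
func cumulativeRebuild(x []datapoint) []datapoint {
    out := make([]datapoint, len(x))
    var sum number
    for i := range x {
        sum += adjust(x, i)
        out[i] = datapoint{x[i].Value() + sum}
    }
    return out
}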
Then for any $x$, the rebuilt series $x'$ satisfies, by construction:

(3) $x'_1 = x_1$, $\; x'_N = x_N + \sum_{k=1}^{N} adjust(x, k)$, and $x'$ contains no reset points, so $\sum_{k=1}^{N} adjust(x', k) = 0$

(4) Thus, comparing $increase(x)$ (LHS) against $increase(x')$ (RHS):

LHS: $x_N - x_1 + \sum_{k=1}^{N} adjust(x, k)$
RHS: $x'_N - x'_1 + \cancel{\sum_{k=1}^{N} adjust(x', k)} = x_N + \sum_{k=1}^{N} adjust(x, k) - x_1$
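As a concrete check using the example above, $x = [1, 2, 1, 3, 1]$: LHS $= 1 - 1 + (2 + 3) = 5$, and on $x' = [1, 2, 3, 5, 6]$ the RHS is $6 - 1 + 0 = 5$, so the increase is preserved.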
Let the downsampler simply emit the cumulative rebuild (option 1): $DS(x) = x'$. Obviously this preserves the increase, as shown above; however, the last value of the output is $x'_N$ rather than the raw $x_N$, so it violates the second constraint whenever $x$ contains a reset.
A variation of option 1, but the last element of the output is replaced with the raw last value $x_N$ (option 2, downsampleKeepLast in the examples below): $DS(x) = [x'_1, \ldots, x'_{N-1}, x_N]$.
More generally, for any downsampled subset of the rebuilt points that keeps $x'_1 = x_1$ and ends with the raw $x_N$, the same analysis applies, since dropping points from the reset-free prefix introduces no new resets; a sketch follows below.
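A minimal sketch of option 2 under the same assumptions, reusing the hypothetical cumulativeRebuild above; the name downsampleKeepLast matches the examples below, and the dropping of intermediate points is omitted since it does not affect the argument:

// Option 2: the cumulative rebuild with the raw last value put back in place.
func downsampleKeepLast(x []datapoint) []datapoint {
    y := cumulativeRebuild(x)
    y[len(y)-1] = x[len(x)-1]
    return y
}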
However, this solution does not work for all inputs $x$.
A simple proof by contradiction:
case 1: $x_N$ is a reset point in $x$, i.e. $adjust(x, N) = x_{N-1}$; the transition from $x'_{N-1}$ to $x_N$ is then also a reset in the downsampled series, so
$adjust([x'_{N-1}, x_N]) = x'_{N-1} = x_{N-1} + \sum_{k=1}^{N-1} adjust(x, k) = \sum_{k=1}^{N} adjust(x, k)$
thus LHS = RHS
case 2: $x_N$ is not a reset point in $x$, i.e.
$adjust([x_{N-1}, x_N]) = 0$
but $adjust([x'_{N-1}, x_N])$ is not guaranteed to be $0$: since $x'_{N-1} = x_{N-1} + \sum_{k=1}^{N-1} adjust(x, k) \geq x_{N-1}$, the transition from $x'_{N-1}$ to $x_N$ can still be detected as a reset, in which case LHS $\neq$ RHS.
A simple real example of case 2:
x = [4, 1, 2]
increase(x) = 2 - 4 + 4 = 2
downsampleKeepLast(x) = [4, 5, 2]
increase(downsampleKeepLast(x)) = 2 - 4 + 5 = 3
We can observe that in option 2, if the last datapoint $x_N$ is a reset point in $x$ (case 1), the increase is preserved:
x = [4, 3, 2]
increase(x) = 2 - 4 + 4 + 3 = 5
downsampleKeepLast(x) = [4, 7, 2]
increase(downsampleKeepLast(x)) = 2 - 4 + 7 = 5
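Assuming the sketches above sit in a single main package, a quick illustrative check reproduces both examples:

package main

import "fmt"

func main() {
    for _, x := range [][]datapoint{
        {{4}, {1}, {2}}, // case 2: x_N is not a reset point in x
        {{4}, {3}, {2}}, // case 1: x_N is a reset point in x
    } {
        fmt.Println(increase(x), increase(downsampleKeepLast(x)))
    }
    // Prints "2 3" (increase not preserved), then "5 5" (preserved).
}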
We can show that a generalized downsampler algorithm that rebuilds the input as a cumulative series up to the last reset will satisfy all constraints.
Assume the last reset point in $x$ is at index $r$, so that $adjust(x, r) = x_{r-1}$ and $adjust(x, k) = 0$ for $k > r$, and let $y = [x'_1, \ldots, x'_{r-1}, x_r, \ldots, x_N]$. The only reset in $y$ is the transition from $x'_{r-1}$ to $x_r$, so
$increase(y) = x_N - x_1 + x'_{r-1} = x_N - x_1 + x_{r-1} + \sum_{k=1}^{r-1} adjust(x, k) = x_N - x_1 + \sum_{k=1}^{N} adjust(x, k) = increase(x)$
The special case where the last reset is the last datapoint has been proven above.
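A minimal sketch of this generalized downsampler under the same assumptions as the earlier sketches; the function name is illustrative, and the dropping of intermediate points is again omitted:

// Emit the cumulative rebuild up to (but not including) the last reset point,
// then the raw values from the last reset onward. With no resets, x is
// returned unchanged.
func downsampleRebuildToLastReset(x []datapoint) []datapoint {
    lastReset := 0
    for i := 1; i < len(x); i++ {
        if isReset(x[i-1], x[i]) {
            lastReset = i
        }
    }
    out := make([]datapoint, len(x))
    copy(out, cumulativeRebuild(x)[:lastReset]) // x'_1 .. x'_{r-1}
    copy(out[lastReset:], x[lastReset:])        // x_r .. x_N, kept raw
    return out
}

For $x = [4, 1, 2]$ this returns $[4, 1, 2]$ itself (the rebuilt prefix is just $[4]$), and for $x = [1, 2, 1, 3, 1]$ it returns $[1, 2, 3, 5, 1]$; both preserve the increase as well as the first and last raw values.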