Completely monotone sequences
Wildon's Weblog 2020-12-20
A sequence $a_0, a_1, a_2, \ldots$ is monotone decreasing if $a_0 \ge a_1 \ge a_2 \ge \cdots$, or equivalently, if $a_n - a_{n+1} \ge 0$ for all $n$. We will decide once and for all to deal with decreasing sequences only, and say that such a sequence is completely monotone if all its iterated differences, including the zeroth differences, are non-negative. That is, $a_n \ge 0$, $a_n - a_{n+1} \ge 0$, $a_n - 2a_{n+1} + a_{n+2} \ge 0$ for all $n$, and so on. Equivalently,

$$\sum_{j=0}^{k} (-1)^j \binom{k}{j} a_{n+j} \ge 0$$
for all $n$ and $k$. One family of examples I stumbled across, in what seemed like a completely unrelated context of random walks defined on posets with involution, is the two-parameter family
For instance, for each . For a direct proof that this sequence is completely monotone, we use the -integral to write
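The key point in arguments of this kind is that if $a_m = \int_0^1 x^m w(x)\,\mathrm{d}x$ for some non-negative weight $w$ on $[0,1]$ (for instance a weight supplied by the Beta integral), then every iterated difference is visibly non-negative:

$$\sum_{j=0}^{k} (-1)^j \binom{k}{j} a_{n+j} = \int_0^1 x^n \Bigl( \sum_{j=0}^{k} \binom{k}{j} (-x)^j \Bigr) w(x)\,\mathrm{d}x = \int_0^1 x^n (1-x)^k w(x)\,\mathrm{d}x \ge 0.$$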
In the same context, I came across some multivariable polynomials that I conjecture are always positive when evaluated at a completely monotone sequence. These polynomials include
Their positivity is obvious for , because is equal to a difference. For positivity follows from
Despite much struggling, I was unable to find a similar expression for as a sum of products of positive differences. The main purpose of this post is to show that such expressions exist for linear functions, but not in general. I therefore may well have been on a wild-goose chase, hardly for the first time.
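As a computational aside, the defining inequalities are easy to test on a finite initial segment of a sequence. The short Python sketch below checks every iterated difference that fits inside the window; the example sequence $a_n = 1/(n+1)$, the moment sequence of Lebesgue measure on $[0,1]$, is just an illustrative choice, and a finite prefix can of course only give a necessary condition.

```python
from math import comb

def iterated_difference(a, n, k):
    # k-th iterated difference at position n, with the sign convention above:
    # sum_{j=0}^{k} (-1)^j C(k, j) a_{n+j}; taking k = 0 gives a_n itself.
    return sum((-1) ** j * comb(k, j) * a[n + j] for j in range(k + 1))

def looks_completely_monotone(a, tol=1e-12):
    # Check every iterated difference that fits inside the finite window a[0..N-1].
    N = len(a)
    return all(iterated_difference(a, n, k) >= -tol
               for k in range(N) for n in range(N - k))

# The moment sequence a_n = 1/(n+1) = \int_0^1 x^n dx passes the check.
print(looks_completely_monotone([1 / (n + 1) for n in range(12)]))  # True
```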
Linear polynomials
Fix . Let be the cone of all completely monotone sequences, as defined by the inequalities at the top of this post. Let be the cone spanned by the coefficients in these inequalities.
To make these cones more explicit, the following notation is helpful. Let . Then , and so . It follows inductively that is the set of non-negative linear combinations of the vectors defined by
For instance, if then are the rows of the matrix below
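In code, the coefficient vectors of the defining inequalities are easy to write down. The sketch below (Python; the row ordering and the convention that the window is $(a_0, \ldots, a_{N-1})$ are mine and may differ from the displayed matrix) lists them for a window of length $N$.

```python
from math import comb

def difference_vectors(N):
    # Coefficient vectors (length N) of all the defining inequalities for a
    # window (a_0, ..., a_{N-1}): one vector for each order k and shift n with
    # n + k <= N - 1, carrying (-1)^j C(k, j) in position n + j.
    vecs = []
    for k in range(N):
        for n in range(N - k):
            v = [0] * N
            for j in range(k + 1):
                v[n + j] = (-1) ** j * comb(k, j)
            vecs.append(v)
    return vecs

for row in difference_vectors(3):
    print(row)
# [1, 0, 0], [0, 1, 0], [0, 0, 1], [1, -1, 0], [0, 1, -1], [1, -2, 1]
```

One can check directly that the rows $(0,0,1)$, $(0,1,-1)$ and $(1,-2,1)$ already span the same cone as all six: for instance $a_0 = (a_0 - 2a_1 + a_2) + 2(a_1 - a_2) + a_2$ and $a_0 - a_1 = (a_0 - 2a_1 + a_2) + (a_1 - a_2)$.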
Suppose that for all completely monotone sequences. That is, for all ; we write this as . By Farkas’ Lemma applied to the cone , either
- , or
- there exists such that and .
In the ‘either’ case, since , the sequence is completely monotone. But then contradicts the hypothesis on . Therefore . So we have a simple necessary and sufficient condition for a linear polynomial to be non-negative on all completely monotone sequences: this is the case if and only if its coefficient vector is a non-negative linear combination of the vectors .
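This criterion can be checked mechanically: membership of a coefficient vector in the cone is a linear-programming feasibility problem. Here is a sketch using NumPy and SciPy; the generators are the inequality vectors from the earlier sketch (so possibly a redundant generating set), and the two vectors tested at the end are illustrative choices.

```python
import numpy as np
from math import comb
from scipy.optimize import linprog

def difference_vectors(N):
    # Coefficient vectors of the defining inequalities, as in the sketch above.
    vecs = []
    for k in range(N):
        for n in range(N - k):
            v = [0] * N
            for j in range(k + 1):
                v[n + j] = (-1) ** j * comb(k, j)
            vecs.append(v)
    return np.array(vecs, dtype=float)

def nonneg_on_completely_monotone(c):
    # Is c_0 a_0 + ... + c_{N-1} a_{N-1} >= 0 for every completely monotone
    # sequence?  Equivalently (by the argument above), is c a non-negative
    # combination of the difference vectors?  Solve the feasibility LP:
    # find x >= 0 with V^T x = c.
    c = np.asarray(c, dtype=float)
    V = difference_vectors(len(c))
    res = linprog(np.zeros(len(V)), A_eq=V.T, b_eq=c, bounds=(0, None))
    return res.success

print(nonneg_on_completely_monotone([1, -2, 1]))   # True: a second difference
print(nonneg_on_completely_monotone([-1, 0, 0]))   # False: -a_0 < 0 whenever a_0 > 0
```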
Quadratic example
I had hoped that something similar might work for quadratic and higher degree polynomials, based on analogously defined cones coming from the coefficients of products of the differences. Here is a counterexample to this hope.
Take and order monomials lexicographically . The homogeneous quadratics in the difference polynomials span the same subspace as the rows of the matrix below.
For instance, the penultimate row has the coefficients for . Define .
Claim. for all completely monotone sequences .
Proof. Since the polynomial is homogeneous, we may assume that . Since is linear in , when evaluated at , it is negative if and only if
This inequality implies that
but the quadratic is always at least . Hence , contradicting complete monotonicity.
Claim. is not a positive linear combination of homogeneous quadratics in the differences , , .
Proof. It is equivalent to show that the coefficients of , namely are not in the cone of positive linear combinations of the rows of the matrix above. Using the computer algebra system Magma, one can compute its dual cone, which turns out to be the set of positive linear combinations of the rows of the matrix below.
In particular, the dual cone contains , and since , is not in the dual of the dual cone. Since any cone is contained in the dual of its dual, this proves the claim.
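For anyone who wants to repeat the computation without access to Magma, the same question can be phrased as a linear-programming feasibility problem, which by duality is equivalent to the dual-cone calculation. The Python sketch below spells out one possible setup: the window is $(a_0, a_1, a_2)$, the three differences are the zeroth, first and second differences taken at the end of the window, and the monomials are in lexicographic order; adjust these if they do not match the conventions above, and substitute the coefficient vector of the quadratic from the claim in the final line.

```python
import numpy as np
import sympy as sp
from scipy.optimize import linprog

a0, a1, a2 = sp.symbols('a0 a1 a2')

# Monomials in lexicographic order: a0^2, a0*a1, a0*a2, a1^2, a1*a2, a2^2,
# recorded as exponent tuples in (a0, a1, a2).
MONOMIALS = [(2, 0, 0), (1, 1, 0), (1, 0, 1), (0, 2, 0), (0, 1, 1), (0, 0, 2)]

# The three differences for a window (a0, a1, a2): zeroth, first and second.
diffs = [a2, a1 - a2, a0 - 2*a1 + a2]

def coeff_vector(quadratic):
    # Coefficients of a homogeneous quadratic with respect to MONOMIALS.
    d = sp.Poly(sp.expand(quadratic), a0, a1, a2).as_dict()
    return [float(d.get(m, 0)) for m in MONOMIALS]

# Rows of the matrix: the six pairwise products of the differences.
M = np.array([coeff_vector(diffs[i] * diffs[j])
              for i in range(3) for j in range(i, 3)])

def in_cone(target):
    # Is `target` a non-negative combination of the rows of M?  This LP
    # feasibility test is dual to the dual-cone computation in the proof.
    res = linprog(np.zeros(len(M)), A_eq=M.T,
                  b_eq=np.array(target, dtype=float), bounds=(0, None))
    return res.success

# Sanity check on a product that certainly lies in the cone; substitute the
# coefficient vector of the quadratic from the claim to get a negative answer.
print(in_cone(coeff_vector((a1 - a2) * (a0 - 2*a1 + a2))))   # True
```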
I suspect that there is no easy way (or maybe even no algorithmic way?) to decide whether a homogeneous polynomial is non-negative when evaluated at every completely monotone sequence, but I would be very happy to be proved wrong on this.