Minimize squared relative error

The Endeavour 2025-06-21

Suppose you have a list of positive data points y1, y2, …, yn and you want to find a value α that minimizes the sum of squared distances to the y's.

\sum_{i=1}^n (y_i - \alpha)^2

Then the solution is to take α to be the mean of the y's:

\alpha = \frac{1}{n} \sum_{i=1}^n y_i
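
Here is a quick numerical check in Python, a minimal sketch using SciPy's general-purpose scalar minimizer and some arbitrary example data:

    import numpy as np
    from scipy.optimize import minimize_scalar

    # Arbitrary positive example data, chosen only for illustration
    y = np.array([1.0, 2.0, 3.0, 5.0, 8.0])

    # Sum of squared errors as a function of alpha
    def sse(alpha):
        return np.sum((y - alpha)**2)

    result = minimize_scalar(sse)
    print(result.x)   # numerical minimizer of the sum of squared errors
    print(y.mean())   # sample mean; the two agree to numerical precision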

This result is well known [1]. The following variation is not well known.

Suppose now that you want to choose α to minimize the sum of squared relative distances to the y's. That is, you want to minimize the following.

\sum_{i=1}^n \left( \frac{y_i - \alpha}{\alpha} \right)^2

The value of α that minimizes this expression is the contraharmonic mean of the y's [2].

\alpha = \frac{\sum_{i=1}^n y_i^2}{\sum_{i=1}^n y_i}
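
To see why, write the objective as a function of α and differentiate:

\frac{d}{d\alpha} \sum_{i=1}^n \left( \frac{y_i - \alpha}{\alpha} \right)^2 = \sum_{i=1}^n 2\left( \frac{y_i}{\alpha} - 1 \right) \left( -\frac{y_i}{\alpha^2} \right) = -\frac{2}{\alpha^2} \sum_{i=1}^n \left( \frac{y_i^2}{\alpha} - y_i \right)

Since α is positive, the derivative vanishes exactly when

\frac{1}{\alpha} \sum_{i=1}^n y_i^2 = \sum_{i=1}^n y_i

and solving for α gives the contraharmonic mean above.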


[1] Aristotle says in the Nicomachean Ethics, “The mean is in a sense an extreme.” This is literally true: the mean minimizes the sum of the squared errors.

[2] E. F. Beckenbach. A Class of Mean Value Functions. The American Mathematical Monthly. Vol. 57, No. 1 (Jan., 1950), pp. 1–6
