Chebyshev Inequality For Standard Deviation

The Chebyshev inequality can be stated as follows. Suppose that $$x$$ is an $$n$$-vector, and that $$k$$ of its entries satisfy $$|x_i| \geq a$$, where $$a > 0$$. Then $$k$$ of its entries satisfy $$x_i^2 \geq a^2$$. It follows that $$||x||^2 = x_1^2 + \cdots + x_n^2 \geq ka^2$$, since $$k$$ of the numbers in the sum are at least $$a^2$$, and the other $$n - k$$ numbers are nonnegative. We conclude that $$k \leq ||x||^2 / a^2$$, which is the Chebyshev inequality. …
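A small numerical check of the inequality, using NumPy and a made-up vector (the values of `x` and the threshold `a` are hypothetical, chosen only for illustration):

```python
import numpy as np

# Hypothetical example vector and threshold a > 0.
x = np.array([3.0, -0.5, 2.0, 0.1, -4.0, 1.5])
a = 2.0

k = np.sum(np.abs(x) >= a)   # number of entries with |x_i| >= a
bound = np.sum(x**2) / a**2  # Chebyshev bound ||x||^2 / a^2

# The count k never exceeds the bound.
assert k <= bound
```

Here $$k = 3$$ (the entries $$3$$, $$2$$, and $$-4$$), while the bound is $$31.51/4 \approx 7.88$$, so the inequality holds with room to spare.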


Average, RMS, And Standard Deviation

The average, RMS value, and standard deviation of a vector are related by the formula $$\mathbf{rms}(x)^2 = \mathbf{avg}(x)^2 + \mathbf{std}(x)^2$$. In words: $$\mathbf{rms}(x)^2$$, the mean square value of the entries of $$x$$, can be expressed as the square of the mean value plus the mean square fluctuation of the entries of $$x$$ around their mean value. Examples: Mean return and risk. Suppose that an $$n$$-vector represents a time series of return on an investment, expressed as a percentage, in $$n$$ time periods over some interval of time. …
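The identity is easy to verify numerically. A minimal sketch in NumPy, with a hypothetical return series (note that `np.std` with its default `ddof=0` matches the $$\mathbf{std}$$ used here, i.e. dividing by $$n$$ rather than $$n-1$$):

```python
import numpy as np

# Hypothetical return series, in percent.
x = np.array([0.6, -1.2, 3.1, 0.2])

rms = np.sqrt(np.mean(x**2))  # root-mean-square value
avg = np.mean(x)              # mean return
std = np.std(x)               # population std: mean square fluctuation, rooted

# rms(x)^2 = avg(x)^2 + std(x)^2, up to floating-point error
assert abs(rms**2 - (avg**2 + std**2)) < 1e-12
```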


Standard Deviation

For any vector $$x$$, the vector $$\tilde{x} = x - \mathbf{avg}(x)\mathbf{1}$$ is called the associated de-meaned vector, obtained by subtracting from each entry of $$x$$ the mean value of the entries. The mean value of the entries of $$\tilde{x}$$ is zero. The de-meaned vector is useful for understanding how the entries of a vector deviate from their mean value. It is zero if all the entries in the original vector $$x$$ are the same. …
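De-meaning is a one-liner with NumPy broadcasting (the subtraction of the scalar mean plays the role of $$\mathbf{avg}(x)\mathbf{1}$$); the vector below is a hypothetical example:

```python
import numpy as np

x = np.array([1.0, 2.0, 6.0])   # hypothetical vector, mean 3
x_tilde = x - np.mean(x)        # de-meaned vector x - avg(x) * 1

# x_tilde = [-2, -1, 3], and its entries sum (and average) to zero
assert abs(np.mean(x_tilde)) < 1e-12
```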


Units For Heterogeneous Vector Entries

The square of the distance between two $$n$$-vectors $$x$$ and $$y$$ is given by $$||x - y||^2 = (x_1 - y_1)^2 + \cdots + (x_n - y_n)^2$$, the sum of the squares of the differences between their respective entries. The differences are squared so that each term is nonnegative and positive and negative differences do not cancel. When entries are of the same type, interpreting this distance is straightforward. A large distance means that the vectors are farther apart, and a small distance means that they are closer to each other. …
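A quick sketch, with hypothetical vectors, showing that summing the squared entry differences agrees with `np.linalg.norm` of the difference:

```python
import numpy as np

# Hypothetical vectors of the same length.
x = np.array([1.0, 2.0, 3.0])
y = np.array([2.0, 0.0, 3.0])

dist_sq = np.sum((x - y)**2)    # (x1-y1)^2 + ... + (xn-yn)^2
dist = np.linalg.norm(x - y)    # Euclidean distance ||x - y||

assert abs(dist**2 - dist_sq) < 1e-12
```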


Examples Of Euclidean Norms & Distances

In this post, we go over a few applications of Euclidean norms and distances. Feature Distance. If $$x$$ and $$y$$ represent vectors of $$n$$ features of two objects, the quantity $$||x - y||$$ is called the feature distance and gives a measure of how different the objects are (in terms of their feature values). Suppose the feature vectors are associated with patients in a hospital, with entries such as weight, age, presence of chest pain, difficulty breathing, and the results of tests. …
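A minimal sketch of a feature distance, assuming an entirely made-up pair of patient feature vectors (weight in kg, age in years, two 0/1 symptom indicators, and a test result); in practice features are usually standardized first so that no one unit dominates:

```python
import numpy as np

# Hypothetical patient feature vectors: [weight, age, chest pain,
# difficulty breathing, test result]. Values are invented for illustration.
x = np.array([72.0, 45.0, 0.0, 1.0, 1.2])
y = np.array([68.0, 47.0, 0.0, 0.0, 1.5])

feature_dist = np.linalg.norm(x - y)  # ||x - y||, a measure of dissimilarity
```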
