Markov inequality

[′mar‚kȯf ‚in·i′kwäl·əd·ē] (statistics) If x is a random variable with probability P and expectation E, then, for any positive number a and any positive number n, P(|x| ≥ a) ≤ E(|x|ⁿ)/aⁿ.
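
As a minimal sketch, a quick numerical check in Python can illustrate the bound; the exponential(1) distribution here is an arbitrary choice of example, not part of the definition:

```python
import random

# Empirically check the Markov inequality:
# for positive a and n, P(|x| >= a) <= E(|x|^n) / a^n.
random.seed(0)
samples = [random.expovariate(1.0) for _ in range(100_000)]
a = 2.0

for n in (1, 2):
    prob = sum(abs(x) >= a for x in samples) / len(samples)
    moment = sum(abs(x) ** n for x in samples) / len(samples)  # E(|x|^n)
    bound = moment / a ** n
    print(f"n={n}: P(|x| >= {a}) ~ {prob:.4f} <= bound {bound:.4f}")
    assert prob <= bound  # the Markov bound holds for each n
```

For the exponential(1) distribution, P(|x| ≥ 2) = e⁻² ≈ 0.135, while the n = 1 bound is E(|x|)/2 = 0.5, so the inequality holds with room to spare.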