Thayer Watkins

Conditional Expectations and Random Walks

Probably everyone is a bit puzzled when they first encounter the notion that stock prices follow a random walk. It goes sharply against intuition. The random walk property is that price changes over one time period are uncorrelated with price changes over any previous time period. One argument justifying the random walk in stock prices is that profit-seeking tends to eliminate elements of predictability such as a trend or cycle. This argument salvages our intuition and hardly anyone seeks any further explanation. Yet, there is a deeper explanation for the random walk property. The deeper explanation that is presented here is that changes in conditional expectations formed on the basis of probability are necessarily uncorrelated.

Before getting into the analysis of the preceding proposition let us consider an analogous but simpler notion; i.e., that the deviations from the average average out to zero. For example, take any three numbers, say 4, 5, and 9. The average of these three numbers is 6. The deviations of 4, 5, and 9 from 6 are -2, -1, and 3, respectively. The sum of these deviations is exactly zero. This result does not depend upon the numbers chosen. The proof for any set of numbers x1, x2, ..., xn is that

Σi (xi - x̄) = Σi xi - Σi x̄ = Σi xi - n·x̄ = 0

because x̄ = (Σi xi)/n.
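The numerical example above can be checked in a few lines of Python (a sketch using the numbers 4, 5, and 9 from the text):

```python
# Deviations from the average sum to zero.
xs = [4, 5, 9]
mean = sum(xs) / len(xs)                 # 6.0
deviations = [x - mean for x in xs]      # [-2.0, -1.0, 3.0]
# The deviations always cancel out exactly, whatever numbers we start with.
print(sum(deviations))                   # 0.0
```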

For the case where each xi has a probability pi, the mean is x̄ = Σi xi pi.

Usually the x̄ in this case is called the expected value of x. Now consider

Σi (xi - x̄) pi = Σi xi pi - x̄ Σi pi = x̄ - x̄ = 0

since Σi pi = 1. Thus the expected value of the deviations from the expected value is necessarily zero.
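The probability-weighted version can be checked the same way. The probabilities below are hypothetical, chosen only because they sum to 1:

```python
# The expected value of the deviations from the expected value is zero.
xs = [4, 5, 9]
ps = [0.3, 0.5, 0.2]                     # hypothetical probabilities, sum to 1
ex = sum(x * p for x, p in zip(xs, ps))  # expected value of x: 5.5
dev = sum((x - ex) * p for x, p in zip(xs, ps))
print(dev)                               # zero, up to floating-point rounding
```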

Now let us consider conditional expectations. For an illustration, let us consider growing grapes for wine when the rainfall is uncertain. To keep things simple let us suppose there are two phases to the grape growing season. Let x denote the amount of rain in the first phase, and y the amount in the final phase. The profit from the grape crop is a function of x and y, V(x,y). It may be a complicated function because high values of x may be good but high values of y, rain in the harvest season, may be bad.

To keep the mathematics simple let us assume that there are only a finite number of different values for x and y; say xi and yj for i,j = 1,2,...,n. The crucial thing is the probabilities. The probability that it will rain today is affected by whether or not it rained yesterday. We can talk about the probability of xi, P(xi), but we will also want to be able to consider the conditional probabilities for the yj, P(yj | xi).

To be specific let us suppose the rainfall can be only Low, Medium and High. For the first phase let us say the probabilities are 0.3, 0.5, 0.2, respectively. Furthermore, let us say that if the rain is Low in the first phase then the probability of Low rain in the second phase is 0.5, for Medium rain 0.4, and for High rain 0.1. If the rainfall in the first phase is Medium then the probabilities of L, M, and H are 0.2, 0.5, 0.3. If the rainfall is High in the first phase then the probabilities for the second phase are 0.2, 0.3, 0.5.
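These probability tables can be written down directly; a sketch in Python (the numbers are exactly those in the text):

```python
# First-phase rainfall probabilities P(x).
p_x = {"Low": 0.3, "Medium": 0.5, "High": 0.2}

# Conditional second-phase probabilities P(y | x); each row sums to 1.
p_y_given_x = {
    "Low":    {"Low": 0.5, "Medium": 0.4, "High": 0.1},
    "Medium": {"Low": 0.2, "Medium": 0.5, "High": 0.3},
    "High":   {"Low": 0.2, "Medium": 0.3, "High": 0.5},
}
```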

We can form the conditional expected values of the profit from the wine grapes:

E{V|xi} = E{V(xi,yj)|xi} = Σj V(xi,yj) P(yj | xi)

We want to look at V(xi,yj) - E{V|xi} and see how it is correlated with E{V|xi} - E{V}, where E{V} is the unconditional expected value of the profit. It will be found that this correlation has to be zero.
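This claim can be checked numerically. The sketch below uses the rainfall probabilities from the example together with a hypothetical profit function V (the linear form of V is an assumption for illustration only; any V gives the same zero covariance):

```python
# Rainfall levels indexed 0=Low, 1=Medium, 2=High (probabilities from the text).
p_x = [0.3, 0.5, 0.2]
p_y_x = [[0.5, 0.4, 0.1],
         [0.2, 0.5, 0.3],
         [0.2, 0.3, 0.5]]

def V(i, j):
    # Hypothetical profit: first-phase rain helps, harvest rain hurts.
    return 10 + 4 * i - 3 * j

# Conditional expectations E{V | x_i} and the overall expectation E{V}.
EV_x = [sum(V(i, j) * p_y_x[i][j] for j in range(3)) for i in range(3)]
EV = sum(EV_x[i] * p_x[i] for i in range(3))

# Covariance of (V - E{V|x}) with (E{V|x} - E{V}) over the joint distribution.
cov = sum(p_x[i] * p_y_x[i][j] * (V(i, j) - EV_x[i]) * (EV_x[i] - EV)
          for i in range(3) for j in range(3))
print(cov)   # zero, up to floating-point rounding
```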

Before proving this proposition we must first consider the formal definition of correlation. The coefficient of correlation may be defined in terms of the concept of covariance. If two random variables, x and y, can only take on discrete values, say {x1, x2, ..., xn} and {y1, y2, ..., ym}, and P(xi,yj) is the probability that the pair (xi,yj) will occur, then the covariance of x and y is given by

Cov(x,y) = Σi Σj P(xi,yj) (xi - x̄)(yj - ȳ),

where x̄ and ȳ are the mean or expected values of x and y, respectively. The correlation coefficient of x and y is the covariance of x and y divided by the product of the standard deviations of x and y. A correlation coefficient of +1 or -1 indicates an exact linear relationship between x and y. A correlation coefficient of 0 indicates no linear relationship between x and y. A zero correlation occurs precisely when the covariance is zero.
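The definitions above translate directly into code; a sketch assuming discrete values xs, ys and a joint probability table P:

```python
import math

def covariance(xs, ys, P):
    # Cov(x,y) = sum over i,j of P[i][j] * (x_i - mean_x) * (y_j - mean_y).
    px = [sum(row) for row in P]                       # marginal probabilities of x
    py = [sum(P[i][j] for i in range(len(xs))) for j in range(len(ys))]
    mx = sum(x * p for x, p in zip(xs, px))
    my = sum(y * p for y, p in zip(ys, py))
    return sum(P[i][j] * (xs[i] - mx) * (ys[j] - my)
               for i in range(len(xs)) for j in range(len(ys)))

def correlation(xs, ys, P):
    # The covariance divided by the product of the standard deviations.
    px = [sum(row) for row in P]
    py = [sum(P[i][j] for i in range(len(xs))) for j in range(len(ys))]
    mx = sum(x * p for x, p in zip(xs, px))
    my = sum(y * p for y, p in zip(ys, py))
    sx = math.sqrt(sum(p * (x - mx) ** 2 for x, p in zip(xs, px)))
    sy = math.sqrt(sum(p * (y - my) ** 2 for y, p in zip(ys, py)))
    return covariance(xs, ys, P) / (sx * sy)
```

For example, the joint table P = [[0.5, 0.0], [0.0, 0.5]] over the values {0, 1} makes y an exact copy of x, and the correlation comes out as +1.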

The covariance may be expressed as

Cov(x,y) = Σi Σj P(xi) P(yj | xi) (xi - x̄)(yj - ȳ),

where P(xi) is the unconditional probability of xi, and P(yj | xi) is the conditional probability of yj given that xi has occurred. The above expression may be rearranged to give

Cov(x,y) = Σi P(xi)(xi - x̄) Σj P(yj | xi)(yj - ȳ) = Σi P(xi)(xi - x̄)[E{y|xi} - ȳ],

since Σj P(yj | xi) yj = E{y|xi} and Σj P(yj | xi) = 1.
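The rearrangement can be verified numerically using the rainfall probabilities from the example; the numeric amounts 1, 2, 3 assigned to Low, Medium, and High are hypothetical:

```python
p_x = [0.3, 0.5, 0.2]
p_y_x = [[0.5, 0.4, 0.1],
         [0.2, 0.5, 0.3],
         [0.2, 0.3, 0.5]]
xs = [1.0, 2.0, 3.0]     # hypothetical numeric rainfall amounts
ys = [1.0, 2.0, 3.0]

mx = sum(p * x for p, x in zip(p_x, xs))
p_y = [sum(p_x[i] * p_y_x[i][j] for i in range(3)) for j in range(3)]
my = sum(p * y for p, y in zip(p_y, ys))

# Direct double sum over the joint distribution.
direct = sum(p_x[i] * p_y_x[i][j] * (xs[i] - mx) * (ys[j] - my)
             for i in range(3) for j in range(3))

# Rearranged form using the conditional expectations E{y | x_i}.
E_y_x = [sum(p_y_x[i][j] * ys[j] for j in range(3)) for i in range(3)]
rearranged = sum(p_x[i] * (xs[i] - mx) * (E_y_x[i] - my) for i in range(3))
```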