Here I am, struggling to make some ideas that looked interesting pan out. I’m starting into week three of writing this post when John D. Cook tweets a link to “Random Is as Random Does,” where he reminds us that we are modeling deterministic processes with random variables. I had hinted at this in what I’d already written, but in some sense I had it inside out. The point of statistical processing is to recover the determinism of the system that we modelled via randomness. I suppose I’ll eat a stochastic sandwich.
Browsing David Hand’s “The Improbability Principle” has me trying to find a reason why events beyond the distribution’s convergence with the x-axis happen. The author probably proposed one. I haven’t read that far yet.
Instead, I’m proposing my own.
The distribution’s points of convergence delineate the extent of a world. But black swans demonstrate why this isn’t so. A black swan moves the x-axis up the y-axis and pulls the rightmost point of convergence closer to the mean, or into the present from some point in the future. If you projected some future payoff near the former convergence, well, that’s toast now. It’s toast not because the underlying asset price just fell, but because the future was just pulled into the present.
When the x-axis moves up the y-axis, the information, the bits, below the x-axis are lost. The bits could disappear from the evaluative function and yet remain in place as a historical artifact. In the real world, the bits under the x-axis are still present. The stack remains. Replacing those bits with bits having an improved valuation is a key to the future. But the key to getting out beyond the convergence is understanding that there is some determinism that we have not yet modelled with randomness.
While I labeled the area below the x-axis as lost, let’s just say it’s outside consideration. It never just vanishes into smoke. Newtonian physics is still with us.
A few weeks ago, somebody tweeted a link to “A Random Walks Perspective on Maximizing Satisfaction and Profit” by M. Brand. Brand startled me when he described a graph, in the graph-theory sense, as a collection of distributions. He goes on to say that an undirected graph amounts to a correlation, and a directed graph amounts to a causation. The problem is that the distributions overlap, but graph theory doesn’t hint at that. Actually, the author didn’t say correlation or causation. He used the language of symmetric and asymmetric distributions.
So that left me wondering what he meant by asymmetric. Well, he said Markov chains. Why was that so hard? The vector on the directed graph is a Poisson distribution from the departure node to the arrival node, a link in a Markov chain. The cumulative distribution would be centered near the mean of the arrival node, but the tails of the cumulative distribution would be at the outward tails of the underlying distributions. The tail over the departure node would be long, and the tail over the arrival node would be more normal, hence the asymmetry.
In the symmetric, or correlation, case, the cumulative distribution is centered between the underlying distributions, with its tails at the outward tails of the underlying distributions.
The following figure shows roughly what both cumulative distributions would look like.
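The two cases can also be sketched numerically. This is a minimal simulation of my own, not anything from Brand’s paper: the node distributions, means, and mixture weights are illustrative assumptions. An even mixture of the two node distributions plays the symmetric, correlation-like role; a mixture with most of its mass at the arrival node plays the Markov-link role; and the sample skewness separates them.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two hypothetical node distributions: departure centered at 0,
# arrival centered at 10 (illustrative choices).
n = 100_000
departure = rng.normal(loc=0.0, scale=1.0, size=n)
arrival = rng.normal(loc=10.0, scale=1.0, size=n)

def mixture(weight_departure):
    """Sample a cumulative distribution: each draw comes from the
    departure node with the given weight, otherwise from arrival."""
    pick = rng.random(n) < weight_departure
    return np.where(pick, departure, arrival)

def skewness(x):
    """Sample skewness: third central moment over variance**1.5."""
    x = x - x.mean()
    return (x**3).mean() / (x**2).mean() ** 1.5

symmetric = mixture(0.5)   # correlation-like: mass split evenly
asymmetric = mixture(0.1)  # Markov-link-like: mass piles up at arrival

print(symmetric.mean(), skewness(symmetric))    # centered between, skew near 0
print(asymmetric.mean(), skewness(asymmetric))  # near arrival, long tail back
```

The asymmetric mixture’s mean sits near the arrival node while its skewness goes strongly negative, which is the long tail stretching back over the departure node described above.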
The link in the Markov chain is conditional. The cumulative distribution would occur only when the Markov transition happens, so the valuation would oscillate from the blue distribution on the right to the gray cumulative distribution below it. Those oscillations would be black swans or inverse black swans. The swans appear as arrows in the following figures. Different portions of the cumulative distribution with their particular swans or inverse swans are separated by vertical purple lines.
The conditional nature of the arrival of an event means that the cumulative distribution is short-lived. A flood happens, causing losses. Insurance companies cover some of those losses. Other losses linger. The arrival event separates into several different signals or distributions.
Brand also asserts that the cumulative distribution is static. For that summative distribution to be static, the graph would have to be static. Surprise! A graph of any real-world system is anything but static.
A single conditional probability could drag a large subgraph into the cumulative distribution, reducing the height of the cumulative distribution and widening it greatly.
In the figure, two subgraphs are combined by a short-lived Markovian transition giving rise to a cumulative distribution represented by the brown surface. Most of the mass accumulates under the arrival subgraph.
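A rough numerical sketch of that merge, with node distributions and weights of my own choosing rather than anything from the figure: mixing a departure subgraph into the arrival subgraph widens the cumulative distribution and lowers its peak, with most of the mass staying under the arrival subgraph.

```python
import numpy as np

rng = np.random.default_rng(2)

n = 200_000
subgraph_a = rng.normal(0.0, 1.0, n)  # departure subgraph (assumed)
subgraph_b = rng.normal(6.0, 1.0, n)  # arrival subgraph (assumed)

# A short-lived transition merges the subgraphs; weight 0.8 is an
# illustrative choice putting most of the mass under the arrival side.
pick_b = rng.random(n) < 0.8
merged = np.where(pick_b, subgraph_b, subgraph_a)

def peak_density(x, bins=200):
    """Height of the tallest bar in a normalized histogram."""
    hist, _ = np.histogram(x, bins=bins, density=True)
    return hist.max()

print(subgraph_b.std(), merged.std())                  # merged is much wider
print(peak_density(subgraph_b), peak_density(merged))  # and shorter
```

The merged distribution’s standard deviation more than doubles while its peak drops, which is the brown surface in the figure: lower, wider, centered under the arrival subgraph.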
Our takeaways here: as product managers, we need to consider larger graphs when looking for improbable events and effects. Graphs are not static. Look for Markov transitions that give rise to temporary cumulative distributions. Look for those black swans and inverse black swans. And, last, bits don’t just disappear. Bits, in information physics, hold a position. Information replaces potential energy, so the mousetrap sits waiting for the arrival of some cheese-moving event while other things happen. A distribution envelops the mouse and then vanishes one Fourier component at a time.
But, forgetting the mouse, the commoditization of product features is one of those black swans. This graph/distribution stuff really happens to products and product managers.