Science as Ito Process

October 11, 2011

In a cryptic tweet, “Ito stochastic process n>=0, science. Knowledge=explicated +forgotten,” I was replying to Trevor Rotzien’s tweet, “Science isn’t static statements of universal laws nor a set of arbitrary rules. It’s an evolving body of knowledge.” In that response, I was defining science as a random, statistical process, or more simply as a process that exhibits certain characteristics. Then I tied that definition to a definition of knowledge. Being a tweet, it left something out of my definition of knowledge. I’ll put it back in here.
Knowledge is a cyclical process built around something artificial intelligence people call explicating knowledge: turning implicit, or tacit, knowledge into explicit form, or explicit knowledge. The moves in Argentine Tango can be described explicitly, but practice puts that explicit description into your muscle memory, where it is no longer explicit. It is implicit. We practice to re-implicate, or make implicit again, that explicit knowledge. Science discovers through wide, vast, history-spanning explication. Discovery is explication.

Knowledge has its highest value not in its explicit forms, but in its implicit forms. Craft production is implicit production. Even explicit production has us using tools that embed the explicit in an implicit medium. A hammer embodies all the decisions made about the stuff comprising the hammer and about the hammer-producing process.

Ore is dug up. Ore is transported. Ore is fired. Ore is oxygenated to make steel. Steel is poured, … The ore truck has an accident, so this whole hammer-production process engages randomness. So the hammer-production process is a random, or stochastic, process.

Stochastic processes come in two flavors these days. A few years ago, before one flavor was generalized, they came in two different flavors. The flavors have to do with how much memory is involved in computing the probabilities of the transitions from one state to the next in the stochastic process. We had Gaussian/Bayesian and Markovian stochastic processes. Gaussian/Bayesian stochastic processes take the entire scope of the history of the known, small world into consideration to determine which state transition to make. Gaussian/Bayesian stochastic processes use all of the memory, the complete memory, n = infinity. Gaussian/Bayesian stochastic processes live under normal distributions. Markovian stochastic processes make state-transition decisions without any memory, n = 0. Markovian processes live under Poisson distributions.
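The memory parameter n can be sketched in code. This is a toy illustration of my own, not anything from the tweet: `next_state` and `toy_rule` are hypothetical names, and the rule is invented just to show that n = 0 decides without history while a large n conditions on all of it.

```python
import random

def next_state(history, n, transition_rule):
    """Choose the next state using only the last n states of history.

    n = 0 gives a memoryless (Markovian) decision; n = len(history)
    uses the complete memory, as in the Gaussian/Bayesian case above.
    transition_rule is a hypothetical function mapping a context
    tuple to a new state.
    """
    context = tuple(history[-n:]) if n > 0 else ()
    return transition_rule(context)

# Toy rule: with no memory, pick at random; with memory, repeat the
# most recent remembered state (an "orthodoxy" enforcing itself).
def toy_rule(context):
    return random.choice(["A", "B"]) if not context else context[-1]

history = ["A", "B", "B", "A"]
print(next_state(history, n=0, transition_rule=toy_rule))  # random: A or B
print(next_state(history, n=4, transition_rule=toy_rule))  # "A" -- full memory
```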

Lately, Markovian stochastic processes have been generalized as an instance of a class that we call Ito processes, aka stochastic processes with less than complete memory, or 0 <= n.

Machine learning shows us that Markovian/Ito processes are processes that discover new rules. Gaussian/Bayesian processes are processes that enforce rules, but do not discover new rules for yet-to-be-explicated phenomena. So science is a process of discovering new rules, aka Markovian/Ito. Science education, aka the generalist culture of science, expresses itself as orthodoxies in the general form of Gaussian/Bayesian, all-knowledge propositions.

The normal (Gaussian/Bayesian) distribution is the limiting distribution, or shape, for Poisson distributions. This implies that as a collection of Poisson distributions attempts to cover the same data as a normal distribution, the shapes, or distributions, converge. Poisson distributions converge faster than normal distributions. Discoveries become orthodoxy. Poisson distributions are tall and narrow. Normal distributions are lower and wider. Correlation and statistical significance require normal distributions of sufficient height and separation. Poisson distributions lead to Poisson games, which I presented in a session two years ago at PcampSEA 09, “Game Theory for Product Managers.”
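The normal as the Poisson's limiting shape can be checked numerically. A minimal sketch of my own, using only the standard library: compare the Poisson pmf at its mean with the matching normal density (mean = variance = lambda), and watch the gap close as lambda grows.

```python
import math

def poisson_pmf(k, lam):
    """Probability of exactly k events under a Poisson(lam)."""
    return math.exp(-lam) * lam**k / math.factorial(k)

def normal_pdf(x, mean, var):
    """Normal density with the given mean and variance."""
    return math.exp(-(x - mean) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

# For small lambda the Poisson is tall and narrow; as lambda grows it
# approaches a normal with mean = variance = lambda.
for lam in (2, 50):
    k = lam  # compare the two at the mean
    p, n = poisson_pmf(k, lam), normal_pdf(k, lam, lam)
    print(f"lambda={lam}: Poisson={p:.4f}  normal approx={n:.4f}")
```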

Functional cultures, like expert-based science, transition from the generalist culture under the normal distribution to an expert culture under the Poisson distribution. The process of technology adoption moves from expert to generalist, from the Poisson of the newly discovered to the normal and the disruptive fight with the incumbent orthodoxy. This is the process of learning. There are mirroring processes of de-adoption and forgetting.

Forgetting is a process of moving from the infinite state-transition history of the normal distribution toward the no-memory (zero-memory) state transitions of the Poisson distribution. We forget by successively omitting the most distant state from the decision about the next transition. We will have forgotten once the zeroth state transition is eliminated.
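That forgetting step, dropping the most distant state each time, can be written as a one-liner; `forget` is a hypothetical helper of mine, not the author's code.

```python
def forget(history):
    """One forgetting step: drop the most distant remembered state.

    Repeated application moves from full memory (Gaussian/Bayesian)
    toward the zero-memory (Markovian/Poisson) end of the Ito range.
    """
    return history[1:]

memory = ["s0", "s1", "s2", "s3"]
while memory:
    memory = forget(memory)
print(memory)  # -> [] : fully forgotten once the last state is dropped
```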

Discovery is a queued process as well. We distill random variation down to a steady state. That steady state is Gaussian/Bayesian. Forgetting is likewise a queued process, one that starts with the steady state and admits random variation until only randomness remains. Poisson distributions describe queues, so Poisson > Gaussian/Bayesian > Poisson is the way of knowledge. It is, likewise, science.
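The Poisson-queue connection runs through arrival counts: when interarrival times are exponential, the number of arrivals per unit time is Poisson-distributed. A sketch under that standard queueing assumption (the simulation and its names are mine):

```python
import random

def arrivals_per_unit(rate, rng):
    """Count arrivals in one unit of time when interarrival times are
    exponential with the given rate; the count is Poisson(rate)."""
    t, count = 0.0, 0
    while True:
        t += rng.expovariate(rate)
        if t > 1.0:
            return count
        count += 1

rng = random.Random(0)  # fixed seed so the sketch is repeatable
counts = [arrivals_per_unit(5.0, rng) for _ in range(2000)]
print(sum(counts) / len(counts))  # the mean count hovers near the rate, 5.0
```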

So that’s what that tweet meant.

Since product managers move product to move a technology across the technology adoption lifecycle, we deal with these distributions and others as we get the job done.

Sorry about not having graphics for this post.

Comments? Thanks!