The Art of Streetplay

Thursday, July 14, 2005

A Few Thoughts on the Art of Forecasting, with Reference to 'Fooled By Randomness'

To sum up the relevant facts from Nassim Taleb's 'Fooled By Randomness'--
-Traditional statistical inference is severely marred by the presence, the prevalence, and the staggering effect of rare events, aka black swans. The example I used was an individual picking balls out of a container one by one to infer the underlying distribution of the balls. When one of the colors is rare, knowledge accumulates so slowly as to be almost useless.
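The urn example can be sketched with a quick simulation (the 1-in-1,000 black-ball frequency is my own illustrative number, not Taleb's): our estimate of the rare ball's frequency sits at zero for a long stretch of draws, then jumps around once the ball finally appears.

```python
import random

# Hypothetical urn: black balls make up 1 in 1,000 draws (the rare event).
# We draw with replacement and record our running estimate of the black
# ball's frequency at several sample sizes.
random.seed(7)
true_p_black = 1 / 1000

draws = 0
black_seen = 0
estimates = {}
for n in (100, 1000, 10000, 100000):
    while draws < n:
        draws += 1
        if random.random() < true_p_black:
            black_seen += 1
    estimates[n] = black_seen / n  # draws == n here, so this is the estimate at n

print(estimates)
```

At small sample sizes the estimate is typically exactly zero, i.e. the rare event is invisible; only very large samples begin to reveal it.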
-Try to avoid biases. Trick yourself if necessary. If we get too caught up in "noise" we diminish our ability to see what really matters-- the bigger picture. Try not to take action until you feel you have done your utmost to gather all the relevant information you deem necessary. Once you are finished, take note of what you know and, most importantly, what you don't know. Try not to get 'married to a position'-- if information presents itself later that runs contrary to your thesis, be prepared to toss the position entirely. Avoid the temptation, when in a bad trade, to wait until you break even-- admit your mistake and move on. Try to avoid getting sucked into herd-like behavior, and acknowledge the role of randomness in trades that ultimately work out.
-Watch out for deceiving statistics. The example I gave was that risk taking is not in and of itself a good thing.
-When analyzing historical information, look to the generator, not the result. The two are completely distinct. Try to get a feel for what the possible outcomes were, and the probabilities and payoffs associated with those outcomes. If there is a terminal probability-- a probability that you will lose everything, or die, or something equally bad-- stop. Given enough repetitions, it will inevitably happen.
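The generator-versus-result point can be made concrete with a toy bet (the numbers are invented for illustration): a trade that wins $1 with probability 0.99 and hits a terminal $50 loss with probability 0.01 looks attractive per round, yet if the bad outcome ends the game, the chance of surviving many rounds collapses geometrically.

```python
# Hypothetical trade: +$1 with probability 0.99, terminal -$50 with
# probability 0.01. Judged by the generator, each round has positive
# expected value...
p_terminal = 0.01
ev_per_round = 0.99 * 1 + 0.01 * (-50)   # 0.99 - 0.50 = +0.49 per round

# ...but if the 1% outcome is terminal, the probability of still being
# in the game after n rounds is (1 - p_terminal) ** n:
survival = {n: (1 - p_terminal) ** n for n in (10, 100, 1000)}

print(ev_per_round, survival)
```

A string of winning results tells you almost nothing here: the generator guarantees that, played long enough, the terminal outcome arrives.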
So in this context I believe we are in a reasonable position to make a rational forecast of the future. I believe forecasting is a two-part game.
The first part of the game is gathering as much historical information as possible, from as diverse a library of knowledge as you can find, then distilling that information into a usable form and forming an opinion based on it. It is a large-scale inference, basically, which can be decomposed into qualitative and quantitative elements. Essentially what one has done is look backwards to project forwards. This is powerful in its own right, and can probably do well over most short time horizons, but it is still very vulnerable to possibly lethal rare events.
The second part of my general methodology attempts to correct for the vulnerability of the first. I would try, to the best of my ability, to flush all historical information out of my head. I would look very hard at the company's business model, industry, and upper management, and simply lay out every scenario I can think of affecting all three. I would not constrain myself to the realm of possibility or rationality-- the whole point is that black swans cannot be thought of rationally beforehand. Once all ideas have been thrown on the table, I would analyze the repercussions of the underlying scenarios: what are the payoffs associated with them? I would then sort the list of scenarios by increasing payoff and go through it one by one, looking for weaknesses-- how could the bad scenarios actually happen? Then I would find ways to plug the dam-- to counter those weaknesses.
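The second step above can be sketched mechanically (the scenario names and payoffs are invented placeholders, not a real analysis): enumerate scenarios, sort by payoff, and work through the worst ones first looking for a counter.

```python
# Hypothetical scenario list for some company: (description, payoff).
# Payoffs are arbitrary units for illustration only.
scenarios = [
    ("key customer doubles orders", 5.0),
    ("new competitor undercuts pricing", -2.0),
    ("CEO departs abruptly", -4.0),
    ("accounting fraud uncovered", -10.0),
]

# Sort by increasing payoff so the most damaging scenarios come first,
# then flag each negative-payoff scenario for a "plug the dam" review.
for name, payoff in sorted(scenarios, key=lambda s: s[1]):
    if payoff < 0:
        print(f"review: {name} (payoff {payoff}) -> how could it happen, how to counter it")
```

The value is in the discipline of the ordering, not the machinery: the worst payoffs get scrutinized before the pleasant ones.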
As an example, consider 9-11. It was a most tragic black swan. Because nothing like it had ever happened to us before, it was pretty much outside the scope of normal possibility at the time-- I doubt almost anyone thought such a thing was possible. The whole point is that these ideas are almost by definition deemed crazy ex ante. The second step may be vague, but something along those lines seems to be the only way we can expect the unexpected.
If anyone has any thoughts I'd love to hear them.

