Sunday, February 6, 2011

"Technological Forecasting in Perspective" - critical analysis

The original article “Technological Forecasting in Perspective”, by Raymond S. Isenson, was published in the journal “Management Science” in October 1966. According to the author, countries build models to predict future development trends, which gives them the power to steer technological development: providing sufficient support, or applying restrictive policies to slow development when necessary. Such a development model is thus a long-range planning tool for governments: forecasters assist decision makers and politicians in choosing policies that keep a government viable in our unstable world.

Historical evidence tells us that centuries ago even pharaohs trusted forecasters in order to stock up on food against famine years. Those predictions were based on statistics from earlier years, which gave forecasters a chance to reason about possible developments. However, because wealth in the past was measured mostly in agricultural goods and natural resources, the predictions were not very precise. Over time, many more variables were added to the predictions, as statisticians kept finding new relationships between their dependent variable and new factors. The method of verifying those variables was simple, though time-consuming: forecasters added variables, made economic estimates for future years, and then, once those years had passed, checked whether the estimates had been accurate. Through these experiments economists arrived at the concept of economic projection, which explores the theoretical capabilities of an economy. Since the future is almost impossible to predict, an economic forecast can only tell what might be achieved under an assumed set of circumstances. Because of this limitation on the innovation factor in economic forecasts, forecasting suffered badly after the Industrial Revolution in the 18th century and the technological boom of the 20th century. Statisticians therefore tried to create a mathematical model for innovation, or more exactly, a mathematical model for the “state of the art”:

K(t) = \int_0^t I(\tau) \, d\tau

where:

K(t) is the state of knowledge at time t;

I(t) is the number of new information bits added during an increment of time, itself a function of time.

I_i(t) = p(t) \cdot Q \cdot N_i(t) \cdot [1 + b \cdot N_i(t)]

where:

p(t) is the probability that a scientist will make a contribution during a given time increment;

Q is the average productivity factor of a single scientist;

b is the coefficient of connection between scientists (in terms of information exchange), which speeds up the production of new information;

Ni(t) is the number of scientists actively engaged at time t.
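To make the model concrete, below is a minimal numerical sketch in Python. All parameter values (the constant contribution probability, the linearly growing population of scientists, Q, and b) are invented for illustration and do not come from Isenson's article; the sketch simply accumulates K(t) by summing I(t) over small time steps.

```python
# Minimal numerical sketch of the knowledge-accumulation model above.
# All parameter values are illustrative assumptions, not taken from
# Isenson's article.

def information_rate(t, p, Q, N, b):
    """I(t) = p(t) * Q * N(t) * [1 + b * N(t)]: new information per unit time."""
    return p(t) * Q * N(t) * (1 + b * N(t))

def state_of_knowledge(T, p, Q, N, b, dt=0.1):
    """K(T) = integral of I(t) dt from 0 to T, approximated by a Riemann sum."""
    K, t = 0.0, 0.0
    while t < T:
        K += information_rate(t, p, Q, N, b) * dt
        t += dt
    return K

# Illustrative inputs: a constant contribution probability and a scientist
# population that grows linearly with time.
p = lambda t: 0.05           # chance that a scientist contributes per unit time
N = lambda t: 100 + 10 * t   # number of scientists actively engaged at time t
Q = 1.0                      # average productivity factor of a single scientist
b = 0.001                    # connection coefficient between scientists

print(state_of_knowledge(T=50, p=p, Q=Q, N=N, b=b))
```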

Since the topic of the article is hard to pin down in quantitative terms, scientists have essentially been groping toward mathematical models: building time series from historical data to find a trend for the future, and also expressing knowledge through cross-sectional data. The dependent variable (K(t)) is related to the average probability of a contribution to innovation (p(t)), the average productivity coefficient of a scientist (Q, estimated from samples of scientists across the centuries), the effect of globalization on the state of the art, defined as a network effect between scientists at one point in time (b), and the overall number of scientists active at that same point in time (N_i(t)).
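As a sketch of how such a regression could be run in practice, note that I(t) = pQ·N(t) + pQb·N(t)^2 is linear in the unknowns pQ and pQb, so ordinary least squares can recover Q and b when p is assumed known and constant. The observations below are hypothetical, not real data:

```python
import numpy as np

# Hypothetical observations: number of active scientists N and the measured
# information output I in each period. None of these values are real data.
N = np.array([100., 150., 200., 300., 500.])
I = np.array([ 11.,  17.,  24.,  39.,  75.])
p = 0.05  # assumed known, constant contribution probability

# I = (p*Q)*N + (p*Q*b)*N**2 is linear in a1 = p*Q and a2 = p*Q*b.
X = np.column_stack([N, N**2])
(a1, a2), *_ = np.linalg.lstsq(X, I, rcond=None)

Q_hat = a1 / p    # recovered productivity factor
b_hat = a2 / a1   # recovered connection coefficient
print(Q_hat, b_hat)
```

With these inputs the fit recovers approximately Q = 2.0 and b = 0.001, the values used to generate the hypothetical observations.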

This model is a very good example of the fact that the world does not consist only of quantitative problems, but also of qualitative ones. Statisticians should be able to think outside the box and express qualitative factors in a way that both simplifies understanding of a phenomenon and supports forecasting its future development. Our “knowledge” model is a mix of cross-sectional regression and time-series estimation that cannot be categorized as a classical regression model, since its dependent variable is essentially qualitative. With the model, we can always estimate the average level of information available to humanity. The independent variables were collected and added to the model over many years. The most interesting part of this concept, however, is that an independent variable may not exist at the present moment, yet from some point in time onward it must be added to the function. For example, the initial model did not include the “b” coefficient, since there was no globalization factor; as the state of knowledge improved and the Internet was invented, the model became increasingly ineffective, and “b” was added. The main shortcoming of the model is therefore the need to complicate it continually: new information is being added at an increasing rate, which creates new factors influencing the production of new information, and consequently new independent variables to be included in the model.
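The point about the “b” coefficient can be illustrated directly: omitting the connection term understates knowledge growth more and more as the number of connected scientists grows. Again, all values here are invented for illustration:

```python
def accumulate(b, T=50, dt=0.1, p=0.05, Q=1.0):
    """Riemann-sum approximation of K(T) for a linearly growing N(t)."""
    K, t = 0.0, 0.0
    while t < T:
        N = 100 + 10 * t
        K += p * Q * N * (1 + b * N) * dt
        t += dt
    return K

K_isolated  = accumulate(b=0.0)    # scientists working in isolation
K_connected = accumulate(b=0.001)  # with information exchange between scientists
print(K_isolated, K_connected)     # the connected case grows markedly faster
```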

As a conclusion to the analysis, it should be noted that the original regression for the state of knowledge does not try to answer questions like “when will humanity be capable of finding a new galaxy”, but only serves to indicate the possibility of achieving new goals. In fact, newer and apparently better models can be built for the same estimations: biologists can approach the same topic through single-cell organisms and their transformations over time, physicists may discuss it in terms of atoms and kinetic and potential energies, and so on. The same ideas can be described in many ways, but the main concept will always stay the same, and at the end of the day all the scientists will arrive at one point – the concept of estimating information.

_____________________________________________________________________________

by Mary D. Haroutyounyan,
2011
