Statistical Method



An excellent example of the use of weights is afforded by the index numbers for the changes in wages or prices of commodities. In comparing prices of different years we find that, though some have increased and others have decreased, there has been a general tendency in one direction. To measure this tendency, that is, to determine the increase or decrease in the purchasing power of money, is one of the most important tasks of the economist. It is at once obvious that we cannot solve the problem by taking the simple arithmetic mean of the prices of all commodities, for that would unduly accentuate the lesser things; it would, for instance, give matches as great weight as coal. Instead, we take the most important commodities, whose prices we can estimate rather accurately, and tabulate them for a number of years. Taking any year as a base year, we can then express the prices of each other year as percentages of those of the base year. The properly weighted mean of these percentages will then measure the movement of prices during this interval.
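The weighted index number described above can be sketched in a few lines of Python. The commodities, prices, and weights here are purely illustrative, not figures from the text:

```python
# Sketch of a weighted index number: a weighted arithmetic mean of
# price relatives, with the base year set to 100.
# All prices and weights below are hypothetical.

base_prices = {"coal": 20.0, "wheat": 1.0, "iron": 15.0}
later_prices = {"coal": 25.0, "wheat": 1.2, "iron": 15.0}

# Weights reflect each commodity's relative importance, so that a
# trifling article such as matches cannot count as heavily as coal.
weights = {"coal": 5, "wheat": 3, "iron": 2}

def weighted_index(base, later, weights):
    """Weighted mean of price relatives (percentages of base-year prices)."""
    relatives = {k: 100.0 * later[k] / base[k] for k in base}
    total_weight = sum(weights.values())
    return sum(relatives[k] * weights[k] for k in weights) / total_weight

print(weighted_index(base_prices, later_prices, weights))  # 118.5
```

With these figures the relatives are 125, 120, and 100, and their weighted mean of 118.5 indicates a general rise of prices of about 18.5 per cent over the base year.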

In the use of the mean as a type it must be remembered that it will not fairly represent the group if there is too great a divergence from it within the group. For instance, statistics may show the same per capita wealth, say $1,000, in two cities without giving us any idea as to the real prosperity of the inhabitants. The one city may have the wealth well distributed, with all of its inhabitants prosperous and happy, while the other has its slums and its palaces with only a small middle class. If, however, to this per capita average is appended another figure showing limits within which half the people come, we can form a better estimate of the economic condition of the city. Suppose in the one case we find that this limit is $100 in excess or defect of the mean, while in the other it is $400. We would then give the per capita wealth of the two cities as $1,000 ± $100 and $1,000 ± $400, respectively. In the one case one-half the people fall within the narrow class of more than $900 and less than $1,100; in the other the limits would be $600 and $1,400. This additive quantity is called the probable error or probable deviation. We are, accordingly, in the development of the statistical method, interested not merely in the average as a type, but also in the divergence of the group from the type. The best measure of an average deviation from the mean is the so-called standard deviation, the square root of the mean of the squares of the deviations. Before we can compare the variabilities of two groups we must first reduce their measures of dispersion to relatives by expressing them as percentages of the means. For instance, a difference of 1,000 pounds between the weights of two elephants is less, proportionately, than a difference of one ounce between the weights of two mice. Failure to recognize this has led some very eminent biologists into espousing theories absolutely at variance with the facts. But the statistician does not limit
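The standard deviation and its reduction to a percentage of the mean (the coefficient of variation) can be sketched as follows; the two lists of wealth figures are hypothetical, chosen to mimic the two cities of the example:

```python
import statistics

# Hypothetical per capita wealth figures (dollars) for two cities
# with the same mean but very different dispersion.
city_a = [900, 950, 1000, 1050, 1100]    # wealth clustered about the mean
city_b = [200, 600, 1000, 1400, 1800]    # same mean, far wider spread

for name, data in (("A", city_a), ("B", city_b)):
    mean = statistics.mean(data)
    # Standard deviation: square root of the mean of the squared deviations.
    sd = statistics.pstdev(data)
    # Coefficient of variation: dispersion as a percentage of the mean,
    # allowing a fair comparison between elephants and mice alike.
    cv = 100.0 * sd / mean
    print(name, mean, round(sd, 1), round(cv, 1))
```

Both cities show a mean of $1,000, yet the standard deviations (about 71 and 566) and the relative dispersions (about 7 per cent against 57 per cent) reveal how differently the wealth is distributed.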

his study to the consideration of a single variable. One of his most important tasks is to analyze the relationship between one variable or phenomenon and another. He seeks, for instance, to find the correlation between poverty and crime, between the heights of fathers and of their sons, between the ages of husbands and of their wives, between the price of wheat and the marriage rate, between the size of the wheat kernel and its protein content, between vaccination and immunity from disease. We cannot, however, directly and definitely measure the effect of one variable on another because of the many other variables which enter into the problem. But by dealing with a large number of observations we can, to a large extent, neutralize these disturbing causes, and so make it possible to secure a very fair determination of the correlation between the two variables under consideration. We may, for instance, seek to determine the correlation between rainfall and the potato crop though recognizing that the size of the crop is a function, not merely of the amount of rainfall, but also of the temperature, culture, seed and a multitude of other causes. By dealing with a sufficiently large number of years, however, we can eliminate these disturbing elements and so secure an average value of the correlation sought. For the methods and formulae used in this computation the reader is directed to Yule's work on the 'Theory of Statistics.'

Bibliography.— Bertillon, Jacques, 'Cours élémentaire de statistique' (Paris 1896); Blaschke, E., 'Vorlesungen über mathematische Statistik' (Berlin 1910); Block, Maurice, 'Traité théorique et pratique de statistique' (Paris 1878); Bowley, Arthur L., 'Elements of Statistics' (London 1901); Edgeworth, F. Y., 'Index Numbers'; Palgrave, 'Dictionary of Political Economy'; Elderton, W.
Palin, and Ethel M., 'Primer of Statistics' (London 1910); Faure, F., 'Éléments de statistique' (Paris 1906); Gabaglio, A., 'Storia e teoria della statistica' (Milan 1880); Galton, Francis, all publications, but especially 'Natural Inheritance' (London 1889); King, W. I., 'Elements of Statistical Method' (New York 1912); Meitzen, August, 'History, Theory and Technique of Statistics' (trans. by R. P. Falkner, American Academy of Political and Social Science, Philadelphia 1891); Pearson, Karl, 'Grammar of Science' (London 1910), and articles on 'Mathematical Theory of Evolution' (in 'Philosophical Transactions of the Royal Society'); 'Biometrika' and the Journal of the Society of Drapers; Mayo-Smith, Richmond, 'Statistics and Economics' (New York 1896); Yule, G. Udny, 'An Introduction to the Theory of Statistics' (London 1912); Webb, A. D., 'New Dictionary of Statistics' (New York 1911).
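The correlation between rainfall and the potato crop discussed above is ordinarily measured by the product-moment coefficient treated in Yule's 'Theory of Statistics'; the following sketch computes it from purely hypothetical figures:

```python
import math

# Hypothetical observations over six seasons.
rainfall = [20, 25, 30, 35, 40, 45]           # inches of rain
potato_crop = [110, 130, 150, 155, 180, 190]  # bushels per acre

def pearson_r(xs, ys):
    """Product-moment correlation: mean product of deviations
    divided by the product of the two standard deviations."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n
    sdx = math.sqrt(sum((x - mx) ** 2 for x in xs) / n)
    sdy = math.sqrt(sum((y - my) ** 2 for y in ys) / n)
    return cov / (sdx * sdy)

print(round(pearson_r(rainfall, potato_crop), 3))
```

A value near +1 indicates that wet seasons and large crops go together in these data; with enough years the disturbing influences of temperature, culture, and seed tend to cancel out, leaving an average measure of the relation sought.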
