Journal of Economic and Social Measurement - Volume 32, issue 1
ISSN 0747-9662 (print)
ISSN 1875-8932 (electronic)
The Journal of Economic and Social Measurement (JESM) is a quarterly journal concerned with all aspects of the production, distribution and use of economic and other societal statistical data, and with the use of computers in that context. JESM publishes articles on the statistical methodology of economic and social science measurements. It covers the methods and problems of data distribution, including the design and implementation of database systems and, more generally, computer software and hardware for distributing and accessing statistical data files. Its focus on computer software also includes the evaluation of algorithms and their implementation, assessing the degree to which particular algorithms may yield more or less accurate computed results. It addresses the technical and even legal problems of the collection and use of data, legislation and administrative actions affecting government-produced or government-distributed data files, and similar topics.
The journal serves as a forum for the exchange of information and views between data producers and users. In addition, it considers the various uses to which statistical data may be put, particularly to the degree that these uses illustrate or affect the properties of the data. The data considered in JESM are usually economic or social, as mentioned, but this is not a requirement; the editorial policies of JESM do not place a priori restrictions upon the data that might be considered within individual articles. Furthermore, there are no limitations concerning the source of the data.
Abstract: "The utility of the wage when a given volume of labour is employed is equal to the marginal disutility of that amount of employment. That is to say, the real wage of an employed person is that which is just sufficient (in the estimation…of the employed persons themselves) to induce the volume of labour actually employed to be forthcoming;" (John Maynard Keynes [10, p.5]. It is current practice in many statistical offices around the world to adjust price indices for quality changes over time in sampled items. This paper reviews briefly how these adjustments are done showing that they amount to changing quantity measurement units. It argues that quantity units must be chosen so that they leave households indifferent between old and new units of outputs (based on their utility) and old and new units of non-produced inputs (based on their disutility). Produced inputs prices, such as computer prices, are adjusted based on measured changes in their utility/productivity rather than based on their costs/disutility. Is that the correct solution?
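As a minimal worked illustration of the unit-change point (the numbers here are hypothetical, not taken from the paper): suppose a new model replaces an old one at price $p_1 = 120$ against $p_0 = 100$, and the statistical office judges the new model to deliver $\lambda = 1.5$ times the services of the old. The quality-adjusted price relative is

$$\frac{p_1}{\lambda\, p_0} = \frac{120}{1.5 \times 100} = 0.8,$$

a recorded 20% price fall, which is arithmetically the same as re-expressing the new quantity in old-model-equivalent units, $q^{*} = \lambda q$, while leaving the observed price unadjusted.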
Abstract: This paper proposes an extension of the Oaxaca decomposition equation using generalized residuals. The proposed extension is general enough to let researchers study wage differentials whatever complicating econometric issues exist, e.g., selection, simultaneity and endogeneity, and whatever econometric techniques are used to obtain consistent estimates of wage equations. The key insight is that the residuals effect in the decomposition equation is identical to the difference in average generalized residuals, which represents differences in the average effects of selection, simultaneity and endogeneity on wages. Decomposing the racial wage gap with a selection bias correction model, the empirical illustration shows how to implement this extension.
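For orientation, the classical Oaxaca decomposition of the mean wage gap between groups $A$ and $B$ (a standard textbook form; the paper's exact notation may differ) is

$$\bar{w}_A - \bar{w}_B = (\bar{X}_A - \bar{X}_B)'\hat{\beta}_A + \bar{X}_B'(\hat{\beta}_A - \hat{\beta}_B),$$

where the first term is the endowments (characteristics) effect and the second the coefficients effect. Under the extension described in the abstract, a third, residuals term appears and equals the difference in average generalized residuals, $\bar{e}_A - \bar{e}_B$, absorbing the average effects of selection, simultaneity and endogeneity on wages.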
Abstract: This short note describes the context and computational aspects of the research behind the paper "The Dynamic Properties of the Klein-Goldberger Model". This work constituted the first attempt to solve a macroeconomic model on an electronic computer.
Abstract: This short article describes, from a personal perspective, the state of the art of econometric computing during the transitional period of the early 1960s, when economists used desktop calculating machines and the electronic computer almost interchangeably. Essentially, the choice at the time was between spending time learning to program and then employing a computer, versus spending roughly the same amount of time making the calculations on a calculating machine, as had been done before the computer.
Abstract: The empirical three-step, cross-sectional regression method of Fama and MacBeth [9] (FM) used classical ordinary least squares regression, averages, and t-statistics to reach its conclusions. Unfortunately, averages and t-statistics taken over different volatility regimes and fractions of outlying data can be severely biased. This paper replicates and extends FM's results to recent time periods, analyzes the choice of time period, and replaces the classical estimators with a theoretically well-justified robust estimator in various parts of the three-step approach. While FM's conclusions on non-linearity and non-beta-related risk could be confirmed, the conclusion of a positive average risk-return trade-off could not.
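For readers unfamiliar with the mechanics, the cross-sectional step the abstract refers to can be sketched in a few lines of Python. This is a minimal illustration with synthetic data, not the paper's code: each period, returns are regressed cross-sectionally on previously estimated betas, and the period-by-period slopes are then averaged and t-tested. These are exactly the classical averages and t-statistics that the paper replaces with robust counterparts.

import numpy as np

rng = np.random.default_rng(0)
T, N = 120, 25                      # months, portfolios
betas = rng.uniform(0.5, 1.5, N)    # stand-in for first-pass beta estimates
# Synthetic monthly returns with a built-in premium of 0.5% per unit of beta
returns = 0.002 + 0.005 * betas + rng.normal(0.0, 0.02, (T, N))

gammas = np.empty(T)                # cross-sectional slope on beta, one per month
X = np.column_stack([np.ones(N), betas])
for t in range(T):
    coef, *_ = np.linalg.lstsq(X, returns[t], rcond=None)
    gammas[t] = coef[1]

gamma_bar = gammas.mean()           # classical average risk-premium estimate
t_stat = gamma_bar / (gammas.std(ddof=1) / np.sqrt(T))
print(f"average premium {gamma_bar:.4f}, t-stat {t_stat:.2f}")

As the abstract notes, such averages and t-statistics are sensitive to volatility regimes and outliers; a robust variant would replace the least-squares fit and the plain mean with robust regression and location estimates.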