Abstract: This paper identifies deficiencies in agencies' electronic dissemination practices, noting that while new technologies can help increase the quantity of statistics and the level of service provided, old, persistent gaps in the quality of our statistics remain. It suggests that statistical data and metadata must possess new attributes if agencies are to satisfy their duty of care “electronically”, and identifies data and metadata management practices and systems that aim to achieve this.
Abstract: Our paper takes a long-term perspective, noting that regular Swedish demographic statistics date back to 1749. The question now is: Are we going to leave a documentation of our society that is of as high a quality as our predecessors did 250 years ago? Will it be possible to read all our electronic documents in another 250 years? Is our documentation good enough? Are our methods of storage safe enough? A broad picture is given of the development of the technical infrastructure within Statistics Sweden, from mainframes to PC networks. Given the complicated nature of the new system, it is easy to understand that good-quality documentation may be difficult to maintain, compared with the rather simple forms and tables that were produced 250 years ago, or even with the mainframe production system of 35 years ago. The new technical platform will form the foundation for our new databases of official statistics. However, not all official statistics will be placed in them, only those of wider interest, primarily statistics in general demand by users in different sectors of society. These statistics, at least, will definitely be well documented and saved for future generations. A number of registers (e.g. population, enterprises) and some “observation registers” from important surveys will also be part of the databases. An observation register is the final edited register containing the observations from a survey. An important aspect discussed at some length in the paper is the documentation needed for these registers. Some background is presented on the UN/ECE “Guidelines for the Modelling of Statistical Data and Metadata”. These guidelines, edited by Prof. Bo Sundgren of Statistics Sweden, are a result of the UN/ECE METIS project. Prof. Sundgren is also the project leader for the development of the new Swedish databases. It is therefore natural for our database development to follow these guidelines.
We try to show how systematic handling of metadata can help provide present and future users of data with the necessary background information. Research-type users trying to re-use data from past surveys have special needs that must be catered for. A key issue is having links between the different types of metadata.
Abstract: Data editing, the tracing and correcting of errors in records, takes up a great deal of the human resources available for the production of statistics. As budget restraints become more severe, the pressure to make data editing procedures more efficient increases. There is a need for procedures that restrict the attention of human judgment to errors that are crucial for the quality of the statistical outcomes, leaving the remaining decisions to automatic procedures. The present contribution first deals with the types of errors that occur in survey research and then summarizes the capabilities of state-of-the-art micro-editing software systems. Finally, an overview is given of more efficient data editing methods: fully automatic editing, and methods that edit only records with influential errors, namely selective editing, aggregate editing and graphical macro editing.
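The idea behind selective editing, routing human attention only to records whose errors matter for the published estimates, can be illustrated with a minimal sketch. The scoring rule below (absolute deviation from an anticipated value, multiplied by the survey weight) is an assumed, illustrative scheme, not the specific procedure of the paper; the field names and threshold are likewise hypothetical.

```python
def selective_edit(records, threshold):
    """Split records into (manual_review, auto_accept) by an impact score.

    Each record is a dict with (assumed, illustrative fields):
      raw      -- the reported value
      expected -- an anticipated value (e.g. last period or a model)
      weight   -- the survey weight of the reporting unit

    The score estimates how much a suspicious value could shift a
    weighted total; only high-scoring records go to a human editor.
    """
    manual, auto = [], []
    for r in records:
        score = abs(r["raw"] - r["expected"]) * r["weight"]
        (manual if score > threshold else auto).append(r)
    return manual, auto

records = [
    {"id": 1, "raw": 100, "expected": 98,  "weight": 10},  # score 20
    {"id": 2, "raw": 500, "expected": 120, "weight": 50},  # score 19000
    {"id": 3, "raw": 40,  "expected": 41,  "weight": 5},   # score 5
]
manual, auto = selective_edit(records, threshold=1000)
# Only the influential record (id 2) is sent to a human editor;
# the other two are left to automatic treatment.
```

The point of the design is that the threshold lets an agency trade editing cost against output quality explicitly, rather than editing every flagged record.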
Abstract: Statistics Canada has been using electronic publishing technologies for internal communications since 1994. In that year, the Agency introduced its Internal Communications Network (ICN) program for employees. While the term intranet had not yet been coined in 1994, the ICN has been, from its inception, an intranet. This paper examines the origins, objectives and benefits of the program, technological issues, content development, implementation issues and future directions. It is offered as a case study for those contemplating development of similar systems.
Abstract: This paper discusses different aspects of selecting material for Statistics Finland's Internet pages. The basic idea is that the Internet does not reduce the workload; rather, each opportunity it offers requires extra work in order to be fully exploited. The Internet places heavy demands on the whole production process of statistical services, but if these demands can be met the result is surely more than pleasing.
Abstract: Data from the UNDP-sponsored national human development reports are used to assess the extent of poverty in the countries in transition of eastern Europe and the CIS. In order to do so, however, it has been necessary to clarify some conceptual issues that have tended to become confused in recent years, in particular: (i) income-related poverty vs. welfare more broadly; (ii) poverty measurement at the macro (national) and micro (household) levels, respectively. The paper concludes that poverty, especially in the countries of the CIS, is very different in kind from poverty elsewhere, requiring different forms of analysis and eventually different forms of intervention.
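To make the income-related side of poverty measurement concrete, a common family of household-level measures is the Foster-Greer-Thorbecke (FGT) index, which nests the headcount ratio and the poverty gap. This is a standard illustration, not the measure used in the paper; the incomes and poverty line below are hypothetical.

```python
def fgt(incomes, z, alpha):
    """Foster-Greer-Thorbecke poverty index P_alpha.

    incomes -- household incomes (hypothetical values here)
    z       -- the poverty line
    alpha   -- 0 gives the headcount ratio (share of poor households);
               1 gives the poverty gap index (average shortfall as a
               share of the poverty line, over all households)
    """
    n = len(incomes)
    return sum(((z - y) / z) ** alpha for y in incomes if y < z) / n

incomes = [50, 80, 120, 200]     # hypothetical household incomes
z = 100                          # hypothetical poverty line
headcount = fgt(incomes, z, 0)   # 0.5: two of four households are poor
gap = fgt(incomes, z, 1)         # 0.175: average relative shortfall
```

The macro/micro distinction in the abstract maps onto this directly: national aggregates summarize poverty with one number, while the household-level data behind `incomes` are what the index is actually computed from.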