Statistical Confidentiality

Author: George T. Duncan

Publisher: Springer Science & Business Media

Published: 2011-03-22

Total Pages: 205

ISBN-10: 144197802X

Because statistical confidentiality embraces the responsibility for both protecting data and ensuring its beneficial use for statistical purposes, those working with personal and proprietary data can benefit from the principles and practices this book presents. Researchers can understand why an agency holding statistical data does not respond well to the demand, “Just give me the data; I’m only going to do good things with it.” Statisticians can incorporate the requirements of statistical confidentiality into their methodologies for data collection and analysis. Data stewards, caught between those eager for data and those who worry about confidentiality, can use the tools of statistical confidentiality toward satisfying both groups. The eight chapters lay out the dilemma of data stewardship organizations (such as statistical agencies) in resolving the tension between protecting data from snoopers and providing data to legitimate users; explain disclosure risk and explore the types of attack that a data snooper might mount; present the methods of disclosure risk assessment; give techniques for statistical disclosure limitation of both tabular data and microdata; identify measures of the impact of disclosure limitation on data utility; describe restricted access methods as administrative procedures for disclosure control; and finally explore the future of statistical confidentiality.


Statistical Disclosure Control

Author: Anco Hundepool

Publisher: John Wiley & Sons

Published: 2012-07-05

Total Pages: 308

ISBN-10: 1118348214

A reference to answer all your statistical confidentiality questions. This handbook provides technical guidance on statistical disclosure control and on how to approach the problem of balancing the need to provide users with statistical outputs against the need to protect the confidentiality of respondents. Statistical disclosure control is combined with other administrative, legal and IT tools to define a proper data dissemination strategy based on a risk management approach. The key concepts of statistical disclosure control are presented, along with the methodology and software that can be used to apply various methods of statistical disclosure control. Numerous examples and guidelines are also featured to illustrate the topics covered. Statistical Disclosure Control:

- Presents a combination of both theoretical and practical solutions.
- Introduces all the key concepts and definitions involved with statistical disclosure control.
- Provides a high-level overview of how to approach problems associated with confidentiality.
- Provides a broad-ranging review of the methods available to control disclosure.
- Explains the subtleties of group disclosure control.
- Features examples throughout the book along with case studies demonstrating how particular methods are used.
- Discusses microdata, magnitude and frequency tabular data, and remote access issues.
- Is written by experts within leading National Statistical Institutes.

Official statisticians, academics and market researchers who need to be informed and make decisions on disclosure limitation will benefit from this book.


Managing Statistical Confidentiality & Microdata Access

Author: Conference of European Statisticians

Publisher:

Published: 2007

Total Pages: 114

ISBN-13:

These guidelines have been prepared by a Task Force set up by the Conference of European Statisticians, with two main objectives. The first is to foster greater uniformity of approach across countries so as to allow better access to microdata for the research community. The second is to produce guidelines, and supporting case studies, that will help countries improve their arrangements for providing access to microdata.


Elements of Statistical Disclosure Control

Author: Leon Willenborg

Publisher: Springer Science & Business Media

Published: 2012-12-06

Total Pages: 273

ISBN-10: 1461301211

Statistical disclosure control is the discipline that deals with producing statistical data that are safe enough to be released to external researchers. This book concentrates on the methodology of the area. It deals with both microdata (individual data) and tabular (aggregated) data. The book attempts to develop the theory from what can be called the paradigm of statistical confidentiality: to modify unsafe data in such a way that safe (enough) data emerge, with minimum information loss. This book discusses what safe data are, how information loss can be measured, and how to modify the data in a (near) optimal way. Once it has been decided how to measure safety and information loss, the production of safe data from unsafe data is often a matter of solving an optimization problem. Several such problems are discussed in the book, and most of them turn out to be hard problems that can be solved only approximately. The authors present new results that have not been published before. The book is not a description of a closed area but, on the contrary, of one that still has many areas awaiting fuller exploration. Some of these are indicated in the book. The book will be useful for official, social and medical statisticians and others who are involved in releasing personal or business data for statistical use. Operations researchers may be interested in the optimization problems involved, particularly for the challenges they present. Leon Willenborg has worked at the Department of Statistical Methods at Statistics Netherlands since 1983, first as a researcher and since 1989 as a senior researcher. Since 1989 his main field of research and consultancy has been statistical disclosure control. From 1996 to 1998 he was the project coordinator of the EU co-funded SDC project.
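
To make that optimization framing concrete, here is a toy sketch (not taken from the book) of primary suppression in a small frequency table: cells with too few contributors are hidden, and the share of the total count that has to be hidden serves as a crude information-loss measure. The table, the threshold and the loss measure are all assumptions made for illustration.

    # Toy frequency table: respondent counts by (region, industry).
    table = {
        ("North", "mining"): 1,    # small count: publishing it would be risky
        ("North", "retail"): 54,
        ("South", "mining"): 23,
        ("South", "retail"): 2,    # small count: publishing it would be risky
    }

    THRESHOLD = 3  # cells with fewer than 3 contributors are treated as unsafe

    def primary_suppression(tab, threshold=THRESHOLD):
        """Hide unsafe cells and report a crude information-loss measure:
        the share of the total count that had to be suppressed."""
        safe, hidden = {}, 0
        total = sum(tab.values())
        for cell, count in tab.items():
            if count < threshold:
                safe[cell] = None          # suppressed cell
                hidden += count
            else:
                safe[cell] = count
        return safe, hidden / total

    safe_table, loss = primary_suppression(table)
    print(safe_table)
    print(f"information loss (suppressed share of the total count): {loss:.3f}")

Real cell-suppression methods must also add complementary suppressions so that hidden cells cannot be recovered from row and column totals; choosing those extra cells with minimum information loss is one of the hard optimization problems referred to above.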


Innovations in Federal Statistics

Author: National Academies of Sciences, Engineering, and Medicine

Publisher: National Academies Press

Published: 2017-04-21

Total Pages: 151

ISBN-10: 030945428X

Federal government statistics provide critical information to the country and serve a key role in a democracy. For decades, sample surveys with instruments carefully designed for particular data needs have been one of the primary methods for collecting data for federal statistics. However, the costs of conducting such surveys have been increasing while response rates have been declining, and many surveys are not able to fulfill growing demands for more timely information and for more detailed information at state and local levels. Innovations in Federal Statistics examines the opportunities and risks of using government administrative and private sector data sources to foster a paradigm shift in federal statistical programs that would combine diverse data sources in a secure manner to enhance federal statistics. This first publication of a two-part series discusses the challenges faced by the federal statistical system and the foundational elements needed for a new paradigm.


Federal Statistics, Multiple Data Sources, and Privacy Protection

Author: National Academies of Sciences, Engineering, and Medicine

Publisher: National Academies Press

Published: 2018-01-27

Total Pages: 195

ISBN-10: 0309465370

The environment for obtaining information and providing statistical data for policy makers and the public has changed significantly in the past decade, raising questions about the fundamental survey paradigm that underlies federal statistics. New data sources provide opportunities to develop a new paradigm that can improve timeliness, geographic or subpopulation detail, and statistical efficiency. It also has the potential to reduce the costs of producing federal statistics. The panel's first report described federal statistical agencies' current paradigm, which relies heavily on sample surveys for producing national statistics, and the challenges agencies are facing; the legal frameworks and mechanisms for protecting the privacy and confidentiality of statistical data and for providing researchers access to data, and challenges to those frameworks and mechanisms; and statistical agencies' access to alternative sources of data. The panel recommended a new approach for federal statistical programs that would combine diverse data from government and private-sector sources, and the creation of a new entity that would provide the foundational elements needed for this new approach, including legal authority to access data and protect privacy. This second of the panel's two reports builds on the analysis, conclusions, and recommendations of the first. It assesses alternative methods for implementing a new approach that would combine diverse data from government and private-sector sources, including describing statistical models for combining data from multiple sources; examining statistical and computer science approaches that foster privacy protections; evaluating frameworks for assessing the quality and utility of alternative data sources; and examining various models for implementing the recommended new entity. Together, the two reports offer ideas and recommendations to help federal statistical agencies examine and evaluate data from alternative sources and then combine them as appropriate to provide the country with more timely, actionable, and useful information for policy makers, businesses, and individuals.


Synthetic Datasets for Statistical Disclosure Control

Author: Jörg Drechsler

Publisher: Springer Science & Business Media

Published: 2011-06-24

Total Pages: 148

ISBN-10: 146140326X

The aim of this book is to give the reader a detailed introduction to the different approaches to generating multiply imputed synthetic datasets. It describes all approaches that have been developed so far, provides a brief history of synthetic datasets, and gives useful hints on how to deal with real data problems like nonresponse, skip patterns, or logical constraints. Each chapter is dedicated to one approach, first describing the general concept followed by a detailed application to a real dataset providing useful guidelines on how to implement the theory in practice. The discussed multiple imputation approaches include imputation for nonresponse, generating fully synthetic datasets, generating partially synthetic datasets, generating synthetic datasets when the original data is subject to nonresponse, and a two-stage imputation approach that helps to better address the omnipresent trade-off between analytical validity and the risk of disclosure. The book concludes with a glimpse into the future of synthetic datasets, discussing the potential benefits and possible obstacles of the approach and ways to address the concerns of data users and their understandable discomfort with using data that doesn’t consist only of the originally collected values. The book is intended for researchers and practitioners alike. It helps the researcher to find the state of the art in synthetic data summarized in one book with full reference to all relevant papers on the topic. But it is also useful for the practitioner at the statistical agency who is considering the synthetic data approach for data dissemination in the future and wants to get familiar with the topic.
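
As a minimal illustration of the partially synthetic approach mentioned above (a sketch under assumed variable names and a deliberately simple model, not code from the book), the snippet below keeps a non-sensitive variable as observed, replaces a sensitive variable with draws from a fitted regression, and repeats the draw m times to obtain multiple synthetic copies. A proper multiple-imputation synthesizer would also propagate the uncertainty in the model parameters, which this sketch omits.

    import numpy as np

    rng = np.random.default_rng(0)

    # Toy microdata: 'age' is released as observed, 'income' is treated as
    # sensitive and is replaced by synthetic values.
    n = 200
    age = rng.integers(20, 65, size=n)
    income = 1_000 * age + rng.normal(0, 8_000, size=n)   # fabricated "true" values

    # Fit a simple synthesis model: income ~ age.
    slope, intercept = np.polyfit(age, income, 1)
    resid_sd = np.std(income - (intercept + slope * age), ddof=2)

    # Draw m synthetic copies of the sensitive variable from the fitted model.
    m = 5
    synthetic = [
        {"age": age, "income": intercept + slope * age + rng.normal(0, resid_sd, size=n)}
        for _ in range(m)
    ]

    # An analyst estimates the same quantity on every copy and then combines
    # the estimates with the combining rules appropriate for synthetic data.
    for i, d in enumerate(synthetic):
        print(i, round(float(np.polyfit(d["age"], d["income"], 1)[0]), 1))  # slope estimate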


Guide to Protecting the Confidentiality of Personally Identifiable Information

Author: Erika McCallister

Publisher: DIANE Publishing

Published: 2010-09

Total Pages: 59

ISBN-10: 1437934889

The escalation of security breaches involving personally identifiable information (PII) has contributed to the loss of millions of records over the past few years. Breaches involving PII are hazardous to both individuals and organizations. Individual harms may include identity theft, embarrassment, or blackmail. Organizational harms may include a loss of public trust, legal liability, or remediation costs. To protect the confidentiality of PII, organizations should use a risk-based approach. This report provides guidelines for a risk-based approach to protecting the confidentiality of PII. The recommendations here are intended primarily for U.S. Federal government agencies and those who conduct business on behalf of the agencies, but other organizations may find portions of the publication useful.


Registries for Evaluating Patient Outcomes

Author: Agency for Healthcare Research and Quality (AHRQ)

Publisher: Government Printing Office

Published: 2014-04-01

Total Pages: 385

ISBN-10: 1587634333

This User’s Guide is intended to support the design, implementation, analysis, interpretation, and quality evaluation of registries created to increase understanding of patient outcomes. For the purposes of this guide, a patient registry is an organized system that uses observational study methods to collect uniform data (clinical and other) to evaluate specified outcomes for a population defined by a particular disease, condition, or exposure, and that serves one or more predetermined scientific, clinical, or policy purposes. A registry database is a file (or files) derived from the registry. Although registries can serve many purposes, this guide focuses on registries created for one or more of the following purposes: to describe the natural history of disease, to determine clinical effectiveness or cost-effectiveness of health care products and services, to measure or monitor safety and harm, and/or to measure quality of care. Registries are classified according to how their populations are defined. For example, product registries include patients who have been exposed to biopharmaceutical products or medical devices. Health services registries consist of patients who have had a common procedure, clinical encounter, or hospitalization. Disease or condition registries are defined by patients having the same diagnosis, such as cystic fibrosis or heart failure. The User’s Guide was created by researchers affiliated with AHRQ’s Effective Health Care Program, particularly those who participated in AHRQ’s DEcIDE (Developing Evidence to Inform Decisions About Effectiveness) program. Chapters were subject to multiple internal and external independent reviews.


Statistical Disclosure Control for Microdata

Author: Matthias Templ

Publisher: Springer

Published: 2017-05-05

Total Pages: 299

ISBN-10: 3319502727

This book on statistical disclosure control presents the theory, applications and software implementation of the traditional approach to (micro)data anonymization, including data perturbation methods, disclosure risk, data utility, information loss and methods for simulating synthetic data. Introducing readers to the R packages sdcMicro and simPop, the book also features numerous examples and exercises with solutions, as well as case studies with real-world data, accompanied by the underlying R code to allow readers to reproduce all results. The demand for and volume of data from surveys, registers or other sources containing sensitive information on persons or enterprises have increased significantly over the last several years. At the same time, privacy protection principles and regulations have imposed restrictions on the access and use of individual data. Proper and secure microdata dissemination calls for the application of statistical disclosure control methods to the data before release. This book is intended for practitioners at statistical agencies and other national and international organizations that deal with confidential data. It will also be of interest to researchers working in statistical disclosure control and the health sciences.
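
The book's worked examples use the R packages sdcMicro and simPop mentioned above; the Python sketch below is only meant to convey the frequency-count idea behind such disclosure-risk measures, using made-up records and an assumed set of quasi-identifiers: a record whose key combination is rare in the sample is flagged as risky. (sdcMicro additionally uses sampling weights to estimate how rare a key is in the population, which this sketch ignores.)

    from collections import Counter

    # Made-up microdata; the quasi-identifier key is (age_group, sex, region).
    data = [
        {"age_group": "30-39", "sex": "f", "region": "North", "income": 41_000},
        {"age_group": "30-39", "sex": "f", "region": "North", "income": 38_500},
        {"age_group": "30-39", "sex": "m", "region": "North", "income": 52_000},
        {"age_group": "50-59", "sex": "m", "region": "South", "income": 77_000},
    ]
    key_vars = ("age_group", "sex", "region")

    # Sample frequency of every key combination.
    freq = Counter(tuple(row[v] for v in key_vars) for row in data)

    # Flag records whose key occurs fewer than K times in the sample.
    K = 2
    for row in data:
        row["at_risk"] = freq[tuple(row[v] for v in key_vars)] < K

    print([row["at_risk"] for row in data])   # [False, False, True, True]

In practice such flagged records would then be treated with perturbation methods of the kind the book covers, such as recoding, local suppression or replacement by synthetic values.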