Differential Privacy

Differential Privacy refers to a process for sharing and computing information across different data sets in a manner that minimizes what the shared results disclose about any individual record in either data set.
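
For reference, the guarantee is usually stated formally as follows (the privacy parameter ε is standard in the literature but is not named in this article): a randomized mechanism M satisfies ε-differential privacy if, for every pair of data sets D and D′ that differ in a single record, and for every set of possible outputs S,

\[
\Pr[M(D) \in S] \;\le\; e^{\varepsilon} \cdot \Pr[M(D') \in S].
\]

Smaller values of ε correspond to less individual-level disclosure.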

Typical differential privacy methods suppress or reduce the accuracy of information (by adding “noise”) at the individual record level before the information is shared. Such systems aim to ensure that aggregate metrics computed on the altered data remain close to those computed on the original, unaltered data, which in turn requires that only sufficiently large data sets be analyzed.
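
A minimal sketch of one such method, assuming Python and the widely used Laplace mechanism (the article itself does not name a specific mechanism; the function names and the min_size threshold below are hypothetical):

```python
import math
import random

def laplace_noise(scale: float) -> float:
    """Draw one sample from a Laplace(0, scale) distribution via inverse-CDF sampling."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def noisy_count(records, predicate, epsilon: float, min_size: int = 100) -> float:
    """Answer a counting query with Laplace noise calibrated to epsilon.

    A count query has sensitivity 1 (adding or removing one record
    changes the true count by at most 1), so noise with scale
    1/epsilon yields an epsilon-differentially-private answer.
    The min_size check reflects the requirement that only large
    enough data sets be analyzed.
    """
    if len(records) < min_size:
        raise ValueError("data set too small to query safely")
    true_count = sum(1 for record in records if predicate(record))
    return true_count + laplace_noise(1.0 / epsilon)

# Example: a noisy count of people over 65 in a synthetic data set.
ages = [random.randint(18, 90) for _ in range(1000)]
print(noisy_count(ages, lambda age: age > 65, epsilon=0.5))
```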

One of the assumptions of differential privacy is that some “trusted” authority still collects individuals’ raw data before applying differential privacy operations to the data set. Differential privacy therefore depends on trust: the analysis shifts from whether any organization can ask questions of a data set containing personal information to which organizations can ask questions of that same data set.
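
A minimal sketch of this trusted-curator architecture, reusing the hypothetical noisy_count helper from the previous example (the class name is likewise hypothetical): raw records stay inside the curator, and analysts receive only noise-protected aggregates.

```python
class TrustedCurator:
    """Holds raw individual records and exposes only noisy aggregates.

    Illustrates the centralized trust model: the authority collects
    raw data, and differential privacy is applied at query time, so
    analysts never see individual records.
    """

    def __init__(self, records, epsilon: float):
        self._records = list(records)  # raw data never leaves this object
        self._epsilon = epsilon        # privacy parameter (simplified)

    def count(self, predicate) -> float:
        # Uses the Laplace-mechanism helper sketched above.
        return noisy_count(self._records, predicate, self._epsilon)
```

In practice the curator would also track how much of the privacy budget each analyst has consumed across repeated queries; that bookkeeping is omitted here.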

See Also

Fully Homomorphic Encryption

Multi-party Computation