Differential Privacy refers to a process for sharing and computing information across different data sets in a manner that minimizes what is disclosed about any individual record in either data set.

Typical differential privacy methods suppress or reduce the accuracy of information (by adding “noise”) at the individual record level when the information is shared. Such systems aim to ensure that aggregate metrics computed on the altered data remain close to the same metrics computed on the original, unaltered data, which in turn requires that only sufficiently large data sets be analyzed.
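As an illustrative sketch of this noise-adding idea, the Python snippet below applies the Laplace mechanism to a simple count query, whose sensitivity is 1 because adding or removing one record changes the count by at most 1. The function names, the epsilon value, and the toy data are assumptions made for illustration and are not taken from this entry or from any particular library.

import random

def laplace_noise(scale):
    # Laplace(0, scale) noise, sampled as the difference of two exponential draws.
    return scale * (random.expovariate(1.0) - random.expovariate(1.0))

def noisy_count(records, predicate, epsilon):
    # A count changes by at most 1 when one record is added or removed,
    # so its sensitivity is 1 and the noise scale is sensitivity / epsilon.
    true_count = sum(1 for record in records if predicate(record))
    sensitivity = 1.0
    return true_count + laplace_noise(sensitivity / epsilon)

# Example: a noisy count of records with age over 40, using epsilon = 0.5.
ages = [23, 45, 31, 67, 52, 38, 29, 71]
print(noisy_count(ages, lambda age: age > 40, epsilon=0.5))

Smaller epsilon values add more noise and give stronger privacy; larger values give more accurate answers at the cost of weaker guarantees.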

One of the assumptions of differential privacy is that some “trusted” authority still collects individuals’ raw data before applying differential privacy operations to the data set. Differential privacy therefore depends on trust, which shifts the question from whether any organization can ask questions of a data set containing personal information to which organizations can ask questions of that same data set.
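As a follow-on sketch of that trust assumption, the hypothetical curator below holds the raw records itself and releases only noise-protected aggregates, reusing the noisy_count helper from the sketch above; the class name and the simple budget handling are illustrative assumptions, not a prescribed design.

class TrustedCurator:
    # Holds the raw, individual-level data and answers only noisy
    # aggregate queries, spending a finite privacy budget as it goes.
    def __init__(self, records, epsilon_budget):
        self._records = records
        self._budget = epsilon_budget

    def count(self, predicate, epsilon):
        if epsilon > self._budget:
            raise ValueError("privacy budget exhausted")
        self._budget -= epsilon
        return noisy_count(self._records, predicate, epsilon)

# The analyst never sees the raw records, only noisy answers.
curator = TrustedCurator(ages, epsilon_budget=1.0)
print(curator.count(lambda age: age > 40, epsilon=0.5))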

See Also

Fully Homomorphic Encryption

Multi-party Computation