Differential privacy (DP) is a mathematical framework for quantifying and limiting the privacy loss of computations over sensitive data. In multi-step machine learning algorithms, the privacy losses of the individual differentially private steps accumulate, a phenomenon known as the "cost of composition." However, recent research by Edith Cohen and Uri Stemmer of Google Research introduces a new paradigm that sidesteps this composition cost, improving utility.
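To make the composition cost concrete, here is a small illustrative sketch (not part of the paper) of two standard accounting bounds: basic composition, where k steps that are each ε-DP compose to (kε)-DP, and the advanced composition theorem, which gives a roughly √k-scaled bound at the price of a small additional δ. The function names are hypothetical.

```python
import math

def basic_composition(eps, k):
    """Basic composition: k steps, each eps-DP, compose to (k*eps)-DP."""
    return k * eps

def advanced_composition(eps, k, delta_prime):
    """Advanced composition bound: k (eps, delta)-DP steps are
    (eps_total, k*delta + delta_prime)-DP, where
    eps_total = eps*sqrt(2*k*ln(1/delta_prime)) + k*eps*(exp(eps) - 1).
    For small eps and large k this grows roughly like sqrt(k), not k."""
    return (eps * math.sqrt(2 * k * math.log(1 / delta_prime))
            + k * eps * (math.exp(eps) - 1))

# Example: 10,000 steps, each 0.01-DP.
print(basic_composition(0.01, 10_000))            # linear in k
print(advanced_composition(0.01, 10_000, 1e-5))   # ~sqrt(k) scaling, much smaller
```

Either way, the total privacy budget grows with the number of adaptive steps; the RSC paradigm discussed below is aimed at avoiding exactly this accumulation for certain slice-based computations.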
This method, known as the Reorder-Slice-Compute (RSC) paradigm, allows data slices to be selected adaptively without incurring composition costs. The RSC paradigm has been shown to offer substantial utility improvements across a range of aggregation and learning tasks. Beyond resolving what was previously a largely theoretical obstacle, the approach holds the potential to improve data efficiency in real-world applications.