Nonlinear Inference Learning for Differentially Private Massive Data

24 Sept 2023 (modified: 11 Feb 2024) · Submitted to ICLR 2024
Primary Area: societal considerations including fairness, safety, privacy
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Keywords: Differential privacy, Nonlinear Inference, Massive Data, Bag of Little Bootstraps
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
Abstract: The Bag of Little Bootstraps (BLB) is widely used as a robust and computationally efficient approach to statistical inference on large-scale data. However, this subsampling technique does not protect the privacy of the original data. To address this limitation, we enhance an existing differential privacy algorithm and integrate it with the BLB method, yielding a novel differential privacy mechanism that enables comprehensive statistical analysis of aggregated parameters while safeguarding individual private data. Additionally, to account for both the variability of the noise variance under the differential privacy mechanism and the uncertainty about the distribution of the estimates, we apply the central limit theorem from nonlinear expectation theory, derive the corresponding test statistic, and introduce a hypothesis testing procedure. Simulation studies demonstrate the strong performance of the proposed inference procedure. The proposed differential privacy-preserving mechanism for big data satisfies privacy-preservation requirements without compromising subsequent statistical inference, providing a useful reference for data sharing and related statistical analysis.
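For orientation only, below is a minimal sketch of how BLB subsampling might be combined with a differential-privacy step on each subset-level estimate. The function name dp_blb_mean, the subset-size exponent gamma, the choice of the Laplace mechanism, and the empirical-range sensitivity bound are illustrative assumptions, not the paper's actual algorithm.

```python
# Minimal sketch: Bag of Little Bootstraps (BLB) with a Laplace-noise
# privatization of each subset-level estimate. Illustration only; the
# paper's mechanism, parameters, and aggregation may differ.
import numpy as np

def dp_blb_mean(x, n_subsets=20, n_boot=100, gamma=0.7, epsilon=1.0, rng=None):
    """Estimate the mean of a large sample `x` and its bootstrap standard
    error, adding Laplace noise to each subset-level point estimate."""
    rng = np.random.default_rng(rng)
    n = len(x)
    b = int(n ** gamma)                      # little-bootstrap subset size
    # L1 sensitivity of a mean over b points; the empirical range is used
    # here only for illustration (a true DP guarantee needs a fixed,
    # data-independent bound on the data range).
    sensitivity = (x.max() - x.min()) / b
    subset_means, subset_ses = [], []
    for _ in range(n_subsets):
        subset = rng.choice(x, size=b, replace=False)
        boot_means = []
        for _ in range(n_boot):
            # Multinomial weights emulate resampling n points from the b-point subset
            w = rng.multinomial(n, np.ones(b) / b)
            boot_means.append(np.dot(w, subset) / n)
        # Privatize the subset-level point estimate with the Laplace mechanism
        noisy_mean = np.mean(boot_means) + rng.laplace(scale=sensitivity / epsilon)
        subset_means.append(noisy_mean)
        subset_ses.append(np.std(boot_means, ddof=1))
    # Aggregate across subsets: averaged point estimate and averaged bootstrap SE
    return np.mean(subset_means), np.mean(subset_ses)

if __name__ == "__main__":
    data = np.random.default_rng(0).normal(loc=2.0, scale=1.0, size=100_000)
    est, se = dp_blb_mean(data, epsilon=1.0)
    print(f"DP-BLB mean estimate: {est:.3f} (bootstrap SE ~ {se:.3f})")
```

In the paper's setting, the privatized subset-level estimates would further feed into the test statistic derived from the central limit theorem under nonlinear expectation theory; this sketch only shows the subsampling-and-noise skeleton.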
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 9218