Article: Adaptive False Discovery Rate Control with Privacy Guarantee
Title | Adaptive False Discovery Rate Control with Privacy Guarantee |
---|---|
Authors | Xia, Xintao; Cai, Zhanrui |
Issue Date | 1-Jul-2023 |
Publisher | Journal of Machine Learning Research |
Citation | Journal of Machine Learning Research, 2023, v. 24, p. 1-35 |
Abstract | Differentially private multiple testing procedures can protect the information of individuals used in hypothesis tests while guaranteeing a small fraction of false discoveries. In this paper, we propose a differentially private adaptive FDR control method that can control the classic FDR metric exactly at a user-specified level α with a privacy guarantee, which is a non-trivial improvement over the differentially private Benjamini-Hochberg method proposed in Dwork et al. (2021). Our analysis is based on two key insights: 1) a novel p-value transformation that preserves both privacy and the mirror conservative property, and 2) a mirror peeling algorithm that allows the construction of the filtration and application of the optimal stopping technique. Numerical studies demonstrate that the proposed DP-AdaPT outperforms existing differentially private FDR control methods. Compared to the non-private AdaPT, it incurs a small accuracy loss but significantly reduces the computation cost. |
Persistent Identifier | http://hdl.handle.net/10722/337713 |
ISSN | 1532-4435 (2023 Impact Factor: 4.3; 2023 SCImago Journal Rankings: 2.796) |
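As background for the abstract above: the Benjamini-Hochberg (BH) step-up procedure is the classic baseline that the paper's DP-AdaPT improves on, and the Laplace mechanism is the standard way to add differential privacy to a released statistic. The sketch below shows both; the function names `benjamini_hochberg` and `laplace_privatize` are illustrative, and the noising helper does *not* reproduce the paper's privacy- and mirror-conservativity-preserving p-value transformation, which is more refined.

```python
import numpy as np

def benjamini_hochberg(pvals, alpha=0.1):
    """Classic (non-private) BH step-up procedure.

    Returns a boolean mask of rejected hypotheses, controlling FDR
    at level alpha under independence.
    """
    p = np.asarray(pvals, dtype=float)
    m = len(p)
    order = np.argsort(p)
    ranked = p[order]
    # Find the largest k with p_(k) <= (k/m) * alpha.
    below = ranked <= (np.arange(1, m + 1) / m) * alpha
    reject = np.zeros(m, dtype=bool)
    if below.any():
        k = np.max(np.where(below)[0])   # index of largest qualifying rank
        reject[order[: k + 1]] = True    # reject the k+1 smallest p-values
    return reject

def laplace_privatize(values, sensitivity, epsilon, rng=None):
    """Illustrative Laplace mechanism: add Laplace(sensitivity/epsilon)
    noise to each value. A generic DP primitive, not the paper's method."""
    rng = np.random.default_rng(rng)
    v = np.asarray(values, dtype=float)
    return v + rng.laplace(scale=sensitivity / epsilon, size=v.shape)
```

For example, `benjamini_hochberg([0.01, 0.02, 0.03, 0.5], alpha=0.1)` rejects the three smallest p-values, since 0.03 ≤ (3/4)·0.1. Naively applying `laplace_privatize` to p-values before BH breaks the guarantees that make BH valid, which is precisely the difficulty the paper's transformation addresses.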
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Xia, Xintao | - |
dc.contributor.author | Cai, Zhanrui | - |
dc.date.accessioned | 2024-03-11T10:23:18Z | - |
dc.date.available | 2024-03-11T10:23:18Z | - |
dc.date.issued | 2023-07-01 | - |
dc.identifier.citation | Journal of Machine Learning Research, 2023, v. 24, p. 1-35 | - |
dc.identifier.issn | 1532-4435 | - |
dc.identifier.uri | http://hdl.handle.net/10722/337713 | - |
dc.description.abstract | Differentially private multiple testing procedures can protect the information of individuals used in hypothesis tests while guaranteeing a small fraction of false discoveries. In this paper, we propose a differentially private adaptive FDR control method that can control the classic FDR metric exactly at a user-specified level α with a privacy guarantee, which is a non-trivial improvement over the differentially private Benjamini-Hochberg method proposed in Dwork et al. (2021). Our analysis is based on two key insights: 1) a novel p-value transformation that preserves both privacy and the mirror conservative property, and 2) a mirror peeling algorithm that allows the construction of the filtration and application of the optimal stopping technique. Numerical studies demonstrate that the proposed DP-AdaPT outperforms existing differentially private FDR control methods. Compared to the non-private AdaPT, it incurs a small accuracy loss but significantly reduces the computation cost. | -
dc.language | eng | - |
dc.publisher | Journal of Machine Learning Research | - |
dc.relation.ispartof | Journal of Machine Learning Research | - |
dc.title | Adaptive False Discovery Rate Control with Privacy Guarantee | - |
dc.type | Article | - |
dc.identifier.volume | 24 | - |
dc.identifier.spage | 1 | - |
dc.identifier.epage | 35 | - |
dc.identifier.eissn | 1533-7928 | - |
dc.identifier.issnl | 1532-4435 | - |