Book Chapter: SFPDML: Securer and Faster Privacy-Preserving Distributed Machine Learning Based on MKTFHE
| Title | SFPDML: Securer and Faster Privacy-Preserving Distributed Machine Learning Based on MKTFHE |
|---|---|
| Authors | Wang, Hongxiao; Jiang, Zoe L; Zhao, Yanmin; Yiu, Siu-Ming; Yang, Peng; Chen, Man; Tan, Zejiu; Jin, Bohan |
| Keywords | Distributed machine learning; Multi-key decryption; Multi-key fully homomorphic encryption; Privacy-preserving machine learning |
| Issue Date | 12-Jul-2024 |
| Publisher | Springer |
| Abstract | In recent years, distributed machine learning has garnered significant attention. However, privacy continues to be an unresolved issue within this field. Multi-key homomorphic encryption over torus (MKTFHE) is one of the promising candidates for addressing this concern. Nevertheless, there may be security risks in the decryption of MKTFHE. Moreover, to the best of our knowledge, the latest works on MKTFHE support only Boolean and linear operations and cannot directly compute non-linear functions such as Sigmoid. Therefore, it is still hard to perform common machine learning tasks, such as logistic regression and neural networks, with high performance. In this paper, we first discover a possible attack on the existing distributed decryption protocol for MKTFHE and then introduce secret sharing to propose a more secure one. Next, we design tools to implement logistic regression and neural network training in MKTFHE. Comparing the efficiency and accuracy of Taylor polynomials of Sigmoid against our proposed function as the activation function, the experiments show that our function is 5-10× more efficient than using Taylor polynomials directly while maintaining similar accuracy. |
| Persistent Identifier | http://hdl.handle.net/10722/352096 |
| ISBN | 9789819744640 |
| ISSN | 1865-0929 (2023 SCImago Journal Rankings: 0.203) |
| ISI Accession Number ID | WOS:001310950300007 |
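
The abstract compares low-degree Taylor polynomials of Sigmoid against the authors' proposed activation function, which is not described in this record. As an illustration only, the following Python sketch shows why a plain Taylor expansion is a weak substitute for Sigmoid: its error blows up away from 0, whereas even a generic least-squares polynomial fit over the whole input interval (used here purely as a hypothetical stand-in, not the paper's construction) stays close to Sigmoid everywhere. The interval [-8, 8] and the degrees tried are assumptions for the demo.

```python
# Illustrative sketch (not from the chapter): Taylor vs. interval-wide
# polynomial approximation of Sigmoid. The paper's actual activation
# function is not specified in this record.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def taylor_sigmoid(x, degree):
    # Taylor expansion of Sigmoid around 0: 1/2 + x/4 - x^3/48 + x^5/480 - ...
    coeffs = {3: [0.5, 0.25, 0.0, -1.0 / 48],
              5: [0.5, 0.25, 0.0, -1.0 / 48, 0.0, 1.0 / 480]}
    return sum(c * x**i for i, c in enumerate(coeffs[degree]))

def lsq_sigmoid(x, degree, interval=(-8.0, 8.0)):
    # Generic least-squares polynomial fit over a fixed interval --
    # a stand-in for "a better polynomial", NOT the authors' function.
    grid = np.linspace(*interval, 1000)
    return np.polyval(np.polyfit(grid, sigmoid(grid), degree), x)

xs = np.linspace(-8.0, 8.0, 1000)
for deg in (3, 5):
    err_taylor = np.max(np.abs(taylor_sigmoid(xs, deg) - sigmoid(xs)))
    err_lsq = np.max(np.abs(lsq_sigmoid(xs, deg) - sigmoid(xs)))
    print(f"degree {deg}: max error Taylor = {err_taylor:.3f}, "
          f"least-squares fit = {err_lsq:.3f}")
```

Whatever polynomial replaces Sigmoid, its degree drives the number of homomorphic operations each activation costs, which is presumably where the abstract's 5-10× efficiency gap comes from; the exact construction and measurements are in the chapter itself.
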
| DC Field | Value | Language |
|---|---|---|
| dc.contributor.author | Wang, Hongxiao | - |
| dc.contributor.author | Jiang, Zoe L | - |
| dc.contributor.author | Zhao, Yanmin | - |
| dc.contributor.author | Yiu, Siu-Ming | - |
| dc.contributor.author | Yang, Peng | - |
| dc.contributor.author | Chen, Man | - |
| dc.contributor.author | Tan, Zejiu | - |
| dc.contributor.author | Jin, Bohan | - |
| dc.date.accessioned | 2024-12-14T00:35:13Z | - |
| dc.date.available | 2024-12-14T00:35:13Z | - |
| dc.date.issued | 2024-07-12 | - |
| dc.identifier.isbn | 9789819744640 | - |
| dc.identifier.issn | 1865-0929 | - |
| dc.identifier.uri | http://hdl.handle.net/10722/352096 | - |
| dc.description.abstract | In recent years, distributed machine learning has garnered significant attention. However, privacy continues to be an unresolved issue within this field. Multi-key homomorphic encryption over torus (MKTFHE) is one of the promising candidates for addressing this concern. Nevertheless, there may be security risks in the decryption of MKTFHE. Moreover, to the best of our knowledge, the latest works on MKTFHE support only Boolean and linear operations and cannot directly compute non-linear functions such as Sigmoid. Therefore, it is still hard to perform common machine learning tasks, such as logistic regression and neural networks, with high performance. In this paper, we first discover a possible attack on the existing distributed decryption protocol for MKTFHE and then introduce secret sharing to propose a more secure one. Next, we design tools to implement logistic regression and neural network training in MKTFHE. Comparing the efficiency and accuracy of Taylor polynomials of Sigmoid against our proposed function as the activation function, the experiments show that our function is 5-10× more efficient than using Taylor polynomials directly while maintaining similar accuracy. | - |
| dc.language | eng | - |
| dc.publisher | Springer | - |
| dc.relation.ispartof | Communications in Computer and Information Science | - |
| dc.subject | Distributed machine learning | - |
| dc.subject | Multi-key decryption | - |
| dc.subject | Multi-key fully homomorphic encryption | - |
| dc.subject | Privacy-preserving machine learning | - |
| dc.title | SFPDML: Securer and Faster Privacy-Preserving Distributed Machine Learning Based on MKTFHE | - |
| dc.type | Book_Chapter | - |
| dc.identifier.doi | 10.1007/978-981-97-4465-7_7 | - |
| dc.identifier.scopus | eid_2-s2.0-85200723972 | - |
| dc.identifier.volume | 2095 CCIS | - |
| dc.identifier.spage | 94 | - |
| dc.identifier.epage | 108 | - |
| dc.identifier.isi | WOS:001310950300007 | - |
| dc.identifier.eisbn | 9789819744657 | - |
| dc.identifier.issnl | 1865-0929 | - |
