Conference Paper: A distributed Frank-Wolfe algorithm for communication-efficient sparse learning

Title: A distributed Frank-Wolfe algorithm for communication-efficient sparse learning
Authors: Bellet, Aurélien; Liang, Yingyu; Garakani, Alireza Bagheri; Balcan, Maria Florina; Sha, Fei
Issue Date: 2015
Citation: SIAM International Conference on Data Mining 2015, SDM 2015, 2015, p. 478-486
Abstract: Learning sparse combinations is a frequent theme in machine learning. In this paper, we study its associated optimization problem in the distributed setting where the elements to be combined are not centrally located but spread over a network. We address the key challenges of balancing communication costs and optimization errors. To this end, we propose a distributed Frank-Wolfe (dFW) algorithm. We obtain theoretical guarantees on the optimization error ε and communication cost that do not depend on the total number of combining elements. We further show that the communication cost of dFW is optimal by deriving a lower bound on the communication cost required to construct an ε-approximate solution. We validate our theoretical analysis with empirical studies on synthetic and real-world data, which demonstrate that dFW outperforms both baselines and competing methods. We also study the performance of dFW when the conditions of our analysis are relaxed, and show that dFW is fairly robust.
Persistent Identifier: http://hdl.handle.net/10722/341182
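
The abstract describes dFW only at a high level. As a minimal illustrative sketch (not the paper's implementation), the core mechanism can be shown on an l1-constrained least-squares problem in which the atoms (columns of the data matrix) are partitioned across simulated nodes: each node reports only its locally best atom per iteration, so the per-iteration communication does not grow with the total number of atoms. The problem instance, the partitioning, and all names below are assumptions made for illustration.

    import numpy as np

    # Hypothetical sketch: minimize f(x) = 0.5 * ||Ax - b||^2 over the
    # l1-ball of radius r, with the atoms (columns of A) partitioned
    # across simulated "nodes". Illustrative only, not the paper's code.
    rng = np.random.default_rng(0)
    n, d, r = 50, 200, 1.0
    A = rng.standard_normal((n, d))
    b = rng.standard_normal(n)
    nodes = np.array_split(np.arange(d), 4)  # 4 simulated nodes

    x = np.zeros(d)
    for t in range(100):
        grad = A.T @ (A @ x - b)  # gradient of f at x

        # Each node scans only its own atoms and reports its best index;
        # a coordinator picks the global winner. Only O(1) numbers per
        # node cross the "network" each iteration, independent of d.
        local_best = [node[np.argmax(np.abs(grad[node]))] for node in nodes]
        j = max(local_best, key=lambda i: abs(grad[i]))

        # Frank-Wolfe step toward the l1-ball vertex s = -r*sign(grad_j)*e_j.
        s = np.zeros(d)
        s[j] = -r * np.sign(grad[j])
        gamma = 2.0 / (t + 2.0)
        x = (1 - gamma) * x + gamma * s

    print("nonzeros:", np.count_nonzero(x),
          "objective:", 0.5 * np.linalg.norm(A @ x - b) ** 2)

After T iterations the iterate has at most T nonzero coordinates, which is what makes Frank-Wolfe a natural fit for sparse learning: each step adds a single atom, and in the distributed setting only the winning atom's index and coefficient need to be communicated.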
DC Field | Value | Language
dc.contributor.author | Bellet, Aurélien | -
dc.contributor.author | Liang, Yingyu | -
dc.contributor.author | Garakani, Alireza Bagheri | -
dc.contributor.author | Balcan, Maria Florina | -
dc.contributor.author | Sha, Fei | -
dc.date.accessioned | 2024-03-13T08:40:49Z | -
dc.date.available | 2024-03-13T08:40:49Z | -
dc.date.issued | 2015 | -
dc.identifier.citation | SIAM International Conference on Data Mining 2015, SDM 2015, 2015, p. 478-486 | -
dc.identifier.uri | http://hdl.handle.net/10722/341182 | -
dc.description.abstract | Learning sparse combinations is a frequent theme in machine learning. In this paper, we study its associated optimization problem in the distributed setting where the elements to be combined are not centrally located but spread over a network. We address the key challenges of balancing communication costs and optimization errors. To this end, we propose a distributed Frank-Wolfe (dFW) algorithm. We obtain theoretical guarantees on the optimization error ε and communication cost that do not depend on the total number of combining elements. We further show that the communication cost of dFW is optimal by deriving a lower bound on the communication cost required to construct an ε-approximate solution. We validate our theoretical analysis with empirical studies on synthetic and real-world data, which demonstrate that dFW outperforms both baselines and competing methods. We also study the performance of dFW when the conditions of our analysis are relaxed, and show that dFW is fairly robust. | -
dc.language | eng | -
dc.relation.ispartof | SIAM International Conference on Data Mining 2015, SDM 2015 | -
dc.title | A distributed Frank-Wolfe algorithm for communication-efficient sparse learning | -
dc.type | Conference_Paper | -
dc.description.nature | link_to_subscribed_fulltext | -
dc.identifier.scopus | eid_2-s2.0-84961901443 | -
dc.identifier.spage | 478 | -
dc.identifier.epage | 486 | -
