Article
J-GLOBAL ID: 202002216740992989   Reference number: 20A0661607

Distributed Optimization for Serverless Federated Learning (サーバレスFederated Learningのための分散最適化)

Author (5):
Material:
Volume: 82nd  Issue:  Page: 3.19-3.20  Publication year: Feb. 20, 2020
JST Material Number: S0731A  Document type: Proceedings
Article type: Short paper (短報)  Country of issue: Japan (JPN)  Language: Japanese (JA)

JST classification (2):
Artificial intelligence, Computer networks
Reference (3):
  • McMahan, H. B., et al.: Communication-Efficient Learning of Deep Networks from Decentralized Data, Proc. of AISTATS, 2017.
  • Anil, R., et al.: Large Scale Distributed Neural Network Training through Online Distillation, Proc. of ICLR, Apr. 2018.
  • Nedic, A., et al.: Constrained Consensus and Optimization in Multi-Agent Networks, IEEE Trans. Autom. Control, Vol.55, No.4, pp.922-938 (2010).
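
The record gives only the title and the works cited, but together they point to decentralized federated learning built on consensus-style distributed optimization. The following is a minimal illustrative sketch of that general technique, not the paper's own algorithm: each client takes FedAvg-style local gradient steps (cf. McMahan et al., 2017) and then averages parameters with its graph neighbors through a doubly stochastic mixing matrix (cf. Nedic et al., 2010), so no central server is involved. The ring topology, the toy least-squares objective, and all variable names are assumptions made for illustration.

```python
# Illustrative sketch only: serverless (decentralized) federated averaging via
# neighbor-to-neighbor consensus. The objective, topology, and names are
# hypothetical; this is not the algorithm described in the indexed article.
import numpy as np

rng = np.random.default_rng(0)
n_clients, dim, rounds, lr = 4, 5, 200, 0.05

# Hypothetical local data: each client holds a private least-squares problem.
A = [rng.normal(size=(20, dim)) for _ in range(n_clients)]
b = [a @ rng.normal(size=dim) + 0.1 * rng.normal(size=20) for a in A]

# Ring communication graph with a doubly stochastic mixing matrix W.
W = np.zeros((n_clients, n_clients))
for i in range(n_clients):
    W[i, i] = 0.5
    W[i, (i - 1) % n_clients] = 0.25
    W[i, (i + 1) % n_clients] = 0.25

x = np.zeros((n_clients, dim))  # one model copy per client
for _ in range(rounds):
    # Local step: each client descends its own loss (no central server).
    grads = np.stack([a.T @ (a @ xi - bi) / len(bi)
                      for a, bi, xi in zip(A, b, x)])
    x = x - lr * grads
    # Consensus step: mix parameters with graph neighbors only.
    x = W @ x

print("max disagreement between clients:", np.max(np.abs(x - x.mean(axis=0))))
```

Roughly speaking, with a connected graph and a doubly stochastic W, the mixing step pulls the per-client models toward a common consensus value, while the local steps pull that value toward a minimizer of the sum of the local losses.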