Research theme for competitive and other funds (3):
2021 - 2025 Establishment of first-order methods with automatic adaptivity for large-scale optimization problems
2017 - 2020 Construction of efficient subgradient algorithms exploiting problem structure for convex optimization problems
2014 - 2018 Accelerated (sub)gradient methods for large-scale convex optimization problems - with emphasis on the theoretical aspects of implementation and its applications -
Papers (12):
Masaru Ito, Bruno F. Lourenço. Automorphisms of rank-one generated hyperbolicity cones and their derivative relaxations. SIAM Journal on Applied Algebra and Geometry. 2023. 7. 1. 236-263
Masaru Ito, Zhaosong Lu, Chuan He. A parameter-free conditional gradient method for composite minimization under Hölder condition. Journal of Machine Learning Research. 2023. 24. 166. 1-34
Masaru Ito, Mituhiro Fukuda. Nearly optimal first-order methods for convex optimization under gradient norm measure: An adaptive regularization approach. Journal of Optimization Theory and Applications. 2021. 188. 770-804
Koichiro Akiyama, Shuhei Nakamura, Masaru Ito, Noriko Hirata-Kohno. A key exchange protocol relying on polynomial maps. International Journal of Mathematics for Industry. 2019
Presentations:
Adaptive gradient-based method for convex optimization problems under error bounds
(10th International Congress on Industrial and Applied Mathematics (ICIAM2023) 2023)
Optimization problems with eigenvalue constraints
(SIAM Conference on Optimization (OP23) 2023)