==== Conference Activities & Talks ====
  - Keisuke Kamo, [[..:iiduka:|Hideaki Iiduka]]: **Increasing Batch Size Improves Convergence of Stochastic Gradient Descent with Momentum**, [[https://www.acml-conf.org/2025/|The 17th Asian Conference on Machine Learning (ACML2025)]], Taipei, Taiwan (Dec. 9--12, 2025)
  - [[https://scholar.google.com/citations?user=3U-XTE0AAAAJ&hl=ja|Kanata Oowada]], [[..:iiduka:|Hideaki Iiduka]]: **Faster Convergence of Riemannian Stochastic Gradient Descent with Increasing Batch Size**, [[https://www.acml-conf.org/2025/|The 17th Asian Conference on Machine Learning (ACML2025)]], Taipei, Taiwan (Dec. 9--12, 2025)
  - [[https://scholar.google.com/citations?user=hdDU4Z4AAAAJ&hl=ja|Kento Imaizumi]], [[..:iiduka:|Hideaki Iiduka]]: **Both Asymptotic and Non-Asymptotic Convergence of Quasi-Hyperbolic Momentum using Increasing Batch Size**, [[https://www.acml-conf.org/2025/|The 17th Asian Conference on Machine Learning (ACML2025)]], Taipei, Taiwan (Dec. 9--12, 2025)
  - [[https://scholar.google.com/citations?user=3U-XTE0AAAAJ&hl=ja|Kanata Oowada]], [[..:iiduka:|Hideaki Iiduka]]: **Improvements of Computational Efficiency and Convergence Rate of Riemannian Stochastic Gradient Descent with Increasing Batch Size**, RIMS Workshop on Nonlinear Analysis and Convex Analysis, Research Institute for Mathematical Sciences, Kyoto University (Sep. 26, 2025)
  - [[https://scholar.google.co.jp/citations?user=rNbGTIgAAAAJ&hl=ja|Naoki Sato]], [[https://scholar.google.co.jp/citations?user=xx7O2voAAAAJ&hl=ja|Hiroki Naganuma]], [[en:iiduka:|Hideaki Iiduka]]: **Analysis of Muon's convergence and critical batch size**, The 2025 Fall National Conference of Operations Research Society of Japan, Higashi-Hiroshima campus, Hiroshima University (Sep. 12, 2025)
  - [[https://scholar.google.co.jp/citations?user=RXrwOgoAAAAJ&hl=ja|Hiroyuki Sakai]], [[en:iiduka:|Hideaki Iiduka]]: **A General Framework of Riemannian Adaptive Optimization Methods with a Convergence Analysis**, The 2025 Fall National Conference of Operations Research Society of Japan, Higashi-Hiroshima campus, Hiroshima University (Sep. 12, 2025)
  - [[https://scholar.google.com/citations?user=3U-XTE0AAAAJ&hl=ja|Kanata Oowada]], [[..:iiduka:|Hideaki Iiduka]]: **Improving Computational Efficiency and Convergence Rate of Stochastic Gradient Descent on Riemannian Manifolds by Focusing on Batch Size**, Summer School on Continuous Optimization and Related Fields, The Institute of Statistical Mathematics (Aug. 28, 2025)
  - [[https://scholar.google.com/citations?hl=ja&user=s9l7NM8AAAAJ|Hikaru Umeda]], [[..:iiduka:|Hideaki Iiduka]]: **Acceleration of Stochastic Gradient Descent by Increasing Batch Size and Learning Rate**, The 2025 Spring National Conference of Operations Research Society of Japan, Seikei University (Mar. 6, 2025)
  - [[https://scholar.google.com/citations?user=3U-XTE0AAAAJ&hl=ja|Kanata Oowada]], [[..:iiduka:|Hideaki Iiduka]]: **Batch Size and Learning Rate of Stochastic Gradient Descent on Riemannian manifold**, The 2025 Spring National Conference of Operations Research Society of Japan, Seikei University (Mar. 6, 2025)
  - [[https://scholar.google.com/citations?user=hdDU4Z4AAAAJ&hl=ja|Kento Imaizumi]], [[..:iiduka:|Hideaki Iiduka]]: **Convergence Analysis of SGD with Momentum with Increasing or Decaying Momentum Factor in Nonconvex Optimization**, The 2025 Spring National Conference of Operations Research Society of Japan, Seikei University (Mar. 6, 2025)
  - [[https://scholar.google.co.jp/citations?user=rNbGTIgAAAAJ&hl=ja|Naoki Sato]], [[en:iiduka:|Hideaki Iiduka]]: **Stochastic Frank Wolfe method for Constrained Nonconvex Optimization and its Application for Adversarial Attack**, The 2025 Spring National Conference of Operations Research Society of Japan, Seikei University (Mar. 6, 2025)