- [[https://arxiv.org/search/?searchtype=author&query=Iiduka%2C+H|preprints]] (arXiv.org Search Results)

===== 2026 =====

==== Proceedings ====
- [[https://scholar.google.co.jp/citations?user=rNbGTIgAAAAJ&hl=ja|Naoki Sato]], [[en:iiduka:|Hideaki Iiduka]]: **Lipschitz Multiscale Deep Equilibrium Models: A Theoretically Guaranteed and Accelerated Approach**, Proceedings of the 29th International Conference on Artificial Intelligence and Statistics, PMLR 300: ????--???? (2026)

==== Doctoral Thesis ====
- Hiroyuki Sakai: **Riemannian Adaptive Optimization Algorithms and Their Applications**, Meiji University, 2026 {{??|PDF}}

==== Conference Activities & Talks ====
- [[https://scholar.google.co.jp/citations?user=rNbGTIgAAAAJ&hl=ja|Naoki Sato]], [[en:iiduka:|Hideaki Iiduka]]: **Acceleration of deep equilibrium models based on Banach fixed point theorem**, The 59th Information-Based Induction Sciences and Machine Learning (IBISML), Awagin Hall (Mar. 25, 2026)
- [[https://scholar.google.co.jp/citations?user=rNbGTIgAAAAJ&hl=ja|Naoki Sato]], [[en:iiduka:|Hideaki Iiduka]]: **Lipschitz Multiscale Deep Equilibrium Models: A Theoretically Guaranteed and Accelerated Approach**, [[https://virtual.aistats.org/|The 29th International Conference on Artificial Intelligence and Statistics (AISTATS)]], Tangier, Morocco (May 2--5, 2026)
| ===== 2025 ===== | ===== 2025 ===== |
| ==== Publications in Refereed Journals ==== | ==== Publications in Refereed Journals ==== |
| ==== Conference Activities & Talks ==== | ==== Conference Activities & Talks ==== |
| - Keisuke Kamo, [[..:iiduka:|Hideaki Iiduka]]: **Increasing Batch Size Improves Convergence of Stochastic Gradient Descent with Momentum**, [[https://www.acml-conf.org/2025/|The 17th Asian Conference on Machine Learning (ACML2025)]], Taipei, Taiwan (Dec. 9--12, 2025) | - Keisuke Kamo, [[..:iiduka:|Hideaki Iiduka]]: **Increasing Batch Size Improves Convergence of Stochastic Gradient Descent with Momentum**, [[https://www.acml-conf.org/2025/|The 17th Asian Conference on Machine Learning (ACML2025)]], Taipei, Taiwan (Dec. 9--12, 2025) |
| - [[https://scholar.google.co.jp/citations?hl=ja&user=dejA0qcAAAAJ|Kanata Oowada]], [[..:iiduka:|Hideaki Iiduka]]: **Faster Convergence of Riemannian Stochastic Gradient Descent with Increasing Batch Size**, [[https://www.acml-conf.org/2025/|The 17th Asian Conference on Machine Learning (ACML2025)]], Taipei, Taiwan (Dec. 9--12, 2025) | - [[https://scholar.google.com/citations?user=3U-XTE0AAAAJ&hl=ja|Kanata Oowada]], [[..:iiduka:|Hideaki Iiduka]]: **Faster Convergence of Riemannian Stochastic Gradient Descent with Increasing Batch Size**, [[https://www.acml-conf.org/2025/|The 17th Asian Conference on Machine Learning (ACML2025)]], Taipei, Taiwan (Dec. 9--12, 2025) |
| - [[https://scholar.google.com/citations?user=hdDU4Z4AAAAJ&hl=ja|Kento Imaizumi]], [[..:iiduka:|Hideaki Iiduka]]: **Both Asymptotic and Non-Asymptotic Convergence of Quasi-Hyperbolic Momentum using Increasing Batch Size**, [[https://www.acml-conf.org/2025/|The 17th Asian Conference on Machine Learning (ACML2025)]], Taipei, Taiwan (Dec. 9--12, 2025) | - [[https://scholar.google.com/citations?user=hdDU4Z4AAAAJ&hl=ja|Kento Imaizumi]], [[..:iiduka:|Hideaki Iiduka]]: **Both Asymptotic and Non-Asymptotic Convergence of Quasi-Hyperbolic Momentum using Increasing Batch Size**, [[https://www.acml-conf.org/2025/|The 17th Asian Conference on Machine Learning (ACML2025)]], Taipei, Taiwan (Dec. 9--12, 2025) |
| - [[https://scholar.google.co.jp/citations?hl=ja&user=dejA0qcAAAAJ|Kanata Oowada]], [[..:iiduka:|Hideaki Iiduka]]: **Improvements of Computational Efficiency and Convergence Rate of Riemannian Stochastic Gradient Descent with Increasing Batch Size**, RIMS Workshop on Nonlinear Analysis and Convex Analysis, Research Institute for Mathematical Sciences, Kyoto University (Sep. 26, 2025) | - [[https://scholar.google.com/citations?user=3U-XTE0AAAAJ&hl=ja|Kanata Oowada]], [[..:iiduka:|Hideaki Iiduka]]: **Improvements of Computational Efficiency and Convergence Rate of Riemannian Stochastic Gradient Descent with Increasing Batch Size**, RIMS Workshop on Nonlinear Analysis and Convex Analysis, Research Institute for Mathematical Sciences, Kyoto University (Sep. 26, 2025) |
| - [[https://scholar.google.co.jp/citations?user=rNbGTIgAAAAJ&hl=ja|Naoki Sato]], [[https://scholar.google.co.jp/citations?user=xx7O2voAAAAJ&hl=ja|Hiroki Naganuma]], [[en:iiduka:|Hideaki Iiduka]]: **Analysis of Muon's convergence and critical batch size**, The 2025 Fall National Conference of Operations Research Society of Japan, Higashi-Hiroshima campus, Hiroshima University (Sep. 12, 2025) | - [[https://scholar.google.co.jp/citations?user=rNbGTIgAAAAJ&hl=ja|Naoki Sato]], [[https://scholar.google.co.jp/citations?user=xx7O2voAAAAJ&hl=ja|Hiroki Naganuma]], [[en:iiduka:|Hideaki Iiduka]]: **Analysis of Muon's convergence and critical batch size**, The 2025 Fall National Conference of Operations Research Society of Japan, Higashi-Hiroshima campus, Hiroshima University (Sep. 12, 2025) |
| - [[https://scholar.google.co.jp/citations?user=RXrwOgoAAAAJ&hl=ja|Hiroyuki Sakai]], [[en:iiduka:|Hideaki Iiduka]]: **A General Framework of Riemannian Adaptive Optimization Methods with a Convergence Analysis**, The 2025 Fall National Conference of Operations Research Society of Japan, Higashi-Hiroshima campus, Hiroshima University (Sep. 12, 2025) | - [[https://scholar.google.co.jp/citations?user=RXrwOgoAAAAJ&hl=ja|Hiroyuki Sakai]], [[en:iiduka:|Hideaki Iiduka]]: **A General Framework of Riemannian Adaptive Optimization Methods with a Convergence Analysis**, The 2025 Fall National Conference of Operations Research Society of Japan, Higashi-Hiroshima campus, Hiroshima University (Sep. 12, 2025) |
| - [[https://scholar.google.co.jp/citations?hl=ja&user=dejA0qcAAAAJ|Kanata Owada]], [[..:iiduka:|Hideaki Iiduka]]: **Improving Computational Efficiency and Convergence Rate of Stochastic Gradient Descent on Riemannian Manifolds by Focusing on Batch Size**, Summer School on Continuous Optimization and Related Fields, The Institute of Statistical Mathematics (Aug. 28, 2025) | - [[https://scholar.google.com/citations?user=3U-XTE0AAAAJ&hl=ja|Kanata Owada]], [[..:iiduka:|Hideaki Iiduka]]: **Improving Computational Efficiency and Convergence Rate of Stochastic Gradient Descent on Riemannian Manifolds by Focusing on Batch Size**, Summer School on Continuous Optimization and Related Fields, The Institute of Statistical Mathematics (Aug. 28, 2025) |
| - [[https://scholar.google.com/citations?hl=ja&user=s9l7NM8AAAAJ|Hikaru Umeda]], [[..:iiduka:|Hideaki Iiduka]]: **Acceleration of Stochastic Gradient Descent by Increasing Batch Size and Learning Rate**, The 2025 Spring National Conference of Operations Research Society of Japan, Seikei University (Mar. 6, 2025) | - [[https://scholar.google.com/citations?hl=ja&user=s9l7NM8AAAAJ|Hikaru Umeda]], [[..:iiduka:|Hideaki Iiduka]]: **Acceleration of Stochastic Gradient Descent by Increasing Batch Size and Learning Rate**, The 2025 Spring National Conference of Operations Research Society of Japan, Seikei University (Mar. 6, 2025) |
| - [[https://scholar.google.co.jp/citations?hl=ja&user=dejA0qcAAAAJ|Kanata Owada]], [[..:iiduka:|Hideaki Iiduka]]: **Batch Size and Learning Rate of Stochastic Gradient Descent on Riemannian manifold**, The 2025 Spring National Conference of Operations Research Society of Japan, Seikei University (Mar. 6, 2025) | - [[https://scholar.google.com/citations?user=3U-XTE0AAAAJ&hl=ja|Kanata Owada]], [[..:iiduka:|Hideaki Iiduka]]: **Batch Size and Learning Rate of Stochastic Gradient Descent on Riemannian manifold**, The 2025 Spring National Conference of Operations Research Society of Japan, Seikei University (Mar. 6, 2025) |
| - [[https://scholar.google.com/citations?user=hdDU4Z4AAAAJ&hl=ja|Kento Imaizumi]], [[..:iiduka:|Hideaki Iiduka]]: **Convergence Analysis of SGD with Momentum with Increasing or Decaying Momentum Factor in Nonconvex Optimization**, The 2025 Spring National Conference of Operations Research Society of Japan, Seikei University (Mar. 6, 2025) | - [[https://scholar.google.com/citations?user=hdDU4Z4AAAAJ&hl=ja|Kento Imaizumi]], [[..:iiduka:|Hideaki Iiduka]]: **Convergence Analysis of SGD with Momentum with Increasing or Decaying Momentum Factor in Nonconvex Optimization**, The 2025 Spring National Conference of Operations Research Society of Japan, Seikei University (Mar. 6, 2025) |
| - [[https://scholar.google.co.jp/citations?user=rNbGTIgAAAAJ&hl=ja|Naoki Sato]], [[en:iiduka:|Hideaki Iiduka]]: **Stochastic Frank Wolfe method for Constrained Nonconvex Optimization and its Application for Adversarial Attack**, The 2025 Spring National Conference of Operations Research Society of Japan, Seikei University (Mar. 6, 2025) | - [[https://scholar.google.co.jp/citations?user=rNbGTIgAAAAJ&hl=ja|Naoki Sato]], [[en:iiduka:|Hideaki Iiduka]]: **Stochastic Frank Wolfe method for Constrained Nonconvex Optimization and its Application for Adversarial Attack**, The 2025 Spring National Conference of Operations Research Society of Japan, Seikei University (Mar. 6, 2025) |