==== Refereed Original Papers ====
- [[https://scholar.google.com/citations?user=hdDU4Z4AAAAJ&hl=ja|Kento Imaizumi]], [[..:iiduka:|Hideaki Iiduka]]: **[[https://doi.org/10.1080/02331934.2024.2367635|Iteration and Stochastic First-order Oracle Complexities of Stochastic Gradient Descent using Constant and Decaying Learning Rates]]**, [[https://www.tandfonline.com/journals/gopt20|Optimization]]: Special Issue dedicated to Dr. Alexander J. Zaslavski on the occasion of his 65th birthday ?? (?): ??--?? (2025) [[https://www.tandfonline.com/doi/epdf/10.1080/02331934.2024.2367635?needAccess=true|Open Access]]
- [[https://scholar.google.com/citations?hl=ja&user=s9l7NM8AAAAJ|Hikaru Umeda]], [[..:iiduka:|Hideaki Iiduka]]: **[[https://openreview.net/forum?id=sbmp55k6iE|Increasing Both Batch Size and Learning Rate Accelerates Stochastic Gradient Descent]]**, [[https://jmlr.org/tmlr/|Transactions on Machine Learning Research]] (2025) [[https://openreview.net/pdf?id=sbmp55k6iE|Open Access]]
- [[https://scholar.google.co.jp/citations?user=RXrwOgoAAAAJ&hl=ja|Hiroyuki Sakai]], [[..:iiduka:|Hideaki Iiduka]]: **[[https://openreview.net/forum?id=knv4lQFVoE|A General Framework of Riemannian Adaptive Optimization Methods with a Convergence Analysis]]**, [[https://jmlr.org/tmlr/|Transactions on Machine Learning Research]] (2025) [[https://openreview.net/pdf?id=knv4lQFVoE|Open Access]]

==== Bachelor Theses ====
- 丸山 英希: **Convergence of Stochastic Gradient Descent using Constant and Decaying Step Sizes and a Comparison with Gradient Descent** (in Japanese)

==== Lectures and Oral Presentations (Domestic) ====
- [[https://scholar.google.com/citations?hl=ja&user=s9l7NM8AAAAJ|Hikaru Umeda]], [[..:iiduka:|Hideaki Iiduka]]: **Acceleration of Stochastic Gradient Descent by Simultaneously Increasing Batch Size and Learning Rate**, The 2025 Spring Meeting of the Operations Research Society of Japan, Seikei University (March 6, 2025)
- 大和田 佳生, [[..:iiduka:|Hideaki Iiduka]]: **On the Batch Size and Learning Rate of Stochastic Gradient Descent on Riemannian Manifolds**, The 2025 Spring Meeting of the Operations Research Society of Japan, Seikei University (March 6, 2025)
- [[https://scholar.google.com/citations?user=hdDU4Z4AAAAJ&hl=ja|Kento Imaizumi]], [[..:iiduka:|Hideaki Iiduka]]: **Convergence Analysis of Momentum Methods using Increasing Batch Sizes and Increasing or Decreasing Momentum Coefficients in Nonconvex Optimization**, The 2025 Spring Meeting of the Operations Research Society of Japan, Seikei University (March 6, 2025)
- [[https://scholar.google.co.jp/citations?user=rNbGTIgAAAAJ&hl=ja|Naoki Sato]], [[..:iiduka:|Hideaki Iiduka]]: **Stochastic Frank-Wolfe Method for Constrained Nonconvex Optimization Problems and its Application to Adversarial Attacks**, The 2025 Spring Meeting of the Operations Research Society of Japan, Seikei University (March 6, 2025)

==== Lectures and Oral Presentations (International) ====
- [[https://scholar.google.co.jp/citations?user=rNbGTIgAAAAJ&hl=ja|Naoki Sato]], [[..:iiduka:|Hideaki Iiduka]]: **Explicit and Implicit Graduated Optimization in Deep Neural Networks**, [[https://aaai.org/conference/aaai/aaai-25/|The 39th Annual AAAI Conference on Artificial Intelligence (AAAI-25)]], Pennsylvania Convention Center, Philadelphia, Pennsylvania, USA (Feb. 27 -- Mar. 4, 2025)

==== Books ====
- [[..:iiduka:|Hideaki Iiduka]]: **[[https://www.coronasha.co.jp/np/isbn/9784339061321/|機械学習のための数学]]** (Mathematics for Machine Learning; in Japanese), [[https://www.coronasha.co.jp/|コロナ社]] (2024)

==== Refereed Original Papers ====
- [[https://scholar.google.co.jp/citations?user=rNbGTIgAAAAJ&hl=ja|Naoki Sato]], Koshiro Izumi, [[..:iiduka:|Hideaki Iiduka]]: **[[https://jmlr.org/papers/v25/22-0815.html|Scaled Conjugate Gradient Method for Nonconvex Optimization in Deep Neural Networks]]**, [[https://www.jmlr.org/|Journal of Machine Learning Research]] 25 (395): 1--37 (2024) [[https://jmlr.org/papers/volume25/22-0815/22-0815.pdf|Open Access]]
- [[https://scholar.google.co.jp/citations?user=RXrwOgoAAAAJ&hl=ja|Hiroyuki Sakai]], [[..:iiduka:|Hideaki Iiduka]]: **[[http://manu71.magtech.com.cn/Jwk3_pjo/EN/10.61208/pjo-2024-005|Convergence of Riemannian Stochastic Gradient Descent on Hadamard Manifold]]**, [[http://manu71.magtech.com.cn/Jwk3_pjo/EN/home|Pacific Journal of Optimization]]: Special Issue dedicated to Prof. Masao Fukushima on the occasion of his 75th birthday 20 (4): 743--767 (2024) [[http://www.yokohamapublishers.jp/online-p/PJO/vol20/pjov20n4p743.pdf|Open Access]]
- [[https://scholar.google.co.jp/citations?user=RXrwOgoAAAAJ&hl=ja|Hiroyuki Sakai]], [[..:iiduka:|Hideaki Iiduka]]: **[[https://link.springer.com/article/10.1007/s10957-024-02449-8|Modified Memoryless Spectral-scaling Broyden Family on Riemannian Manifolds]]**, [[https://link.springer.com/journal/10957|Journal of Optimization Theory and Applications]] 202: 834--853 (2024) [[https://link.springer.com/article/10.1007/s10957-024-02449-8|Open Access]]
- [[..:iiduka:|Hideaki Iiduka]]: **[[https://link.springer.com/article/10.1007/s11075-023-01575-0|Theoretical Analysis of Adam using Hyperparameters Close to One without Lipschitz Smoothness]]**, [[https://www.springer.com/journal/11075|Numerical Algorithms]] 95: 383--421 (2024) {{:iiduka:iiduka2023.pdf|PDF}} [[https://rdcu.be/df4ce|Springer Nature SharedIt]]