Differences

This shows the differences between two versions of the page.

en:intro:publications [2025/01/19 15:49] – [Publications in Refereed Journals] Hideaki IIDUKA
en:intro:publications [2025/04/19 21:35] (current) – [Proceedings] Hideaki IIDUKA
Line 5:
 ===== 2025 =====
 ==== Publications in Refereed Journals ====
-  - [[https://scholar.google.co.jp/citations?user=rNbGTIgAAAAJ&hl=ja|Naoki Sato]], Koshiro Izumi, [[..:iiduka:|Hideaki Iiduka]]: **Scaled Conjugate Gradient Method for Nonconvex Optimization in Deep Neural Networks**, [[https://www.jmlr.org/|Journal of Machine Learning Research]] 25: ???-??? (2024)
-  - [[https://scholar.google.com/citations?user=hdDU4Z4AAAAJ&hl=ja|Kento Imaizumi]], [[en:iiduka:|Hideaki Iiduka]]: **[[https://doi.org/10.1080/02331934.2024.2367635|Iteration and Stochastic First-order Oracle Complexities of Stochastic Gradient Descent using Constant and Decaying Learning Rates]]**, [[https://www.tandfonline.com/journals/gopt20|Optimization]]: Special Issue dedicated to Dr. Alexander J. Zaslavski on the occasion of his 65th birthday ?? (?):?? --?? (2024) [[https://www.tandfonline.com/doi/epdf/10.1080/02331934.2024.2367635?needAccess=true|Open Access]]
-  - [[https://scholar.google.co.jp/citations?user=RXrwOgoAAAAJ&hl=ja|Hiroyuki Sakai]], [[..:iiduka:|Hideaki Iiduka]]: **[[https://openreview.net/forum?id=knv4lQFVoE|A general framework of Riemannian adaptive optimization methods with a convergence analysis]]**, [[https://jmlr.org/tmlr/|Transactions on Machine Learning Research]] (2025) [[https://openreview.net/pdf?id=knv4lQFVoE|Open Access]]
+  - [[https://scholar.google.com/citations?user=hdDU4Z4AAAAJ&hl=ja|Kento Imaizumi]], [[en:iiduka:|Hideaki Iiduka]]: **[[https://doi.org/10.1080/02331934.2024.2367635|Iteration and Stochastic First-order Oracle Complexities of Stochastic Gradient Descent using Constant and Decaying Learning Rates]]**, [[https://www.tandfonline.com/journals/gopt20|Optimization]]: Special Issue dedicated to Dr. Alexander J. Zaslavski on the occasion of his 65th birthday ?? (?):?? --?? (2025) [[https://www.tandfonline.com/doi/epdf/10.1080/02331934.2024.2367635?needAccess=true|Open Access]]
+  - [[https://github.com/tsukaoyuki|Yuki Tsukada]], [[en:iiduka:|Hideaki Iiduka]]: **[[https://openreview.net/forum?id=pqZ6nOm3WF|Relationship between Batch Size and Number of Steps Needed for Nonconvex Optimization of Stochastic Gradient Descent using Armijo-Line-Search Learning Rate]]**, [[https://jmlr.org/tmlr/|Transactions on Machine Learning Research]] (2025) [[https://openreview.net/pdf?id=pqZ6nOm3WF|Open Access]]
+  - [[https://scholar.google.com/citations?hl=ja&user=s9l7NM8AAAAJ|Hikaru Umeda]], [[en:iiduka:|Hideaki Iiduka]]: **[[https://openreview.net/forum?id=sbmp55k6iE|Increasing Both Batch Size and Learning Rate Accelerates Stochastic Gradient Descent]]**, [[https://jmlr.org/tmlr/|Transactions on Machine Learning Research]] (2025) [[https://openreview.net/pdf?id=sbmp55k6iE|Open Access]]
+  - [[https://scholar.google.co.jp/citations?user=RXrwOgoAAAAJ&hl=ja|Hiroyuki Sakai]], [[..:iiduka:|Hideaki Iiduka]]: **[[https://openreview.net/forum?id=knv4lQFVoE|A General Framework of Riemannian Adaptive Optimization Methods with a Convergence Analysis]]**, [[https://jmlr.org/tmlr/|Transactions on Machine Learning Research]] (2025) [[https://openreview.net/pdf?id=knv4lQFVoE|Open Access]]
  
 ==== Proceedings ====
 +  - [[https://scholar.google.co.jp/citations?user=rNbGTIgAAAAJ&hl=ja|Naoki Sato]], [[..:iiduka:|Hideaki Iiduka]]: **[[https://ojs.aaai.org/index.php/AAAI/article/view/34234|Explicit and Implicit Graduated Optimization in Deep Neural Networks]]**, [[https://aaai.org/proceeding/aaai-39-2025/|Proceedings of the AAAI Conference on Artificial Intelligence]], 39 (19), 20283--20291 (2025)
  
 ==== Conference Activities & Talks ====
Line 27:
  
 ==== Publications in Refereed Journals ====
-  - [[https://scholar.google.co.jp/citations?user=RXrwOgoAAAAJ&hl=ja|Hiroyuki Sakai]], [[en:iiduka:|Hideaki Iiduka]]: **[[http://manu71.magtech.com.cn/Jwk3_pjo/EN/10.61208/pjo-2024-005|Convergence of Riemannian Stochastic Gradient Descent on Hadamard Manifold]]**, [[http://manu71.magtech.com.cn/Jwk3_pjo/EN/home|Pacific Journal of Optimization]]: Special issue: Dedicated to Prof. Masao Fukushima on the occasion of his 75th birthday 20 (4): 743--767 (2024)
+  - [[https://scholar.google.co.jp/citations?user=rNbGTIgAAAAJ&hl=ja|Naoki Sato]], Koshiro Izumi, [[..:iiduka:|Hideaki Iiduka]]: **[[https://jmlr.org/papers/v25/22-0815.html|Scaled Conjugate Gradient Method for Nonconvex Optimization in Deep Neural Networks]]**, [[https://www.jmlr.org/|Journal of Machine Learning Research]] 25 (395): 1-37 (2024) [[https://jmlr.org/papers/volume25/22-0815/22-0815.pdf|Open Access]]
+  - [[https://scholar.google.co.jp/citations?user=RXrwOgoAAAAJ&hl=ja|Hiroyuki Sakai]], [[en:iiduka:|Hideaki Iiduka]]: **[[http://manu71.magtech.com.cn/Jwk3_pjo/EN/10.61208/pjo-2024-005|Convergence of Riemannian Stochastic Gradient Descent on Hadamard Manifold]]**, [[http://manu71.magtech.com.cn/Jwk3_pjo/EN/home|Pacific Journal of Optimization]]: Special issue: Dedicated to Prof. Masao Fukushima on the occasion of his 75th birthday 20 (4): 743--767 (2024) [[http://www.yokohamapublishers.jp/online-p/PJO/vol20/pjov20n4p743.pdf|Open Access]]
   - [[https://scholar.google.co.jp/citations?user=RXrwOgoAAAAJ&hl=ja|Hiroyuki Sakai]], [[en:iiduka:|Hideaki Iiduka]]: **[[https://link.springer.com/article/10.1007/s10957-024-02449-8?utm_source=rct_congratemailt&utm_medium=email&utm_campaign=oa_20240529&utm_content=10.1007%2Fs10957-024-02449-8|Modified Memoryless Spectral-scaling Broyden Family on Riemannian Manifolds]]**, [[https://link.springer.com/journal/10957|Journal of Optimization Theory and Applications]] 202: 834--853 (2024) [[https://link.springer.com/article/10.1007/s10957-024-02449-8?utm_source=rct_congratemailt&utm_medium=email&utm_campaign=oa_20240529&utm_content=10.1007%2Fs10957-024-02449-8|Open Access]]
   - [[en:iiduka:|Hideaki Iiduka]]: **[[https://link.springer.com/article/10.1007/s11075-023-01575-0|Theoretical Analysis of Adam using Hyperparameters Close to One without Lipschitz Smoothness]]**, [[https://www.springer.com/journal/11075|Numerical Algorithms]] 95: 383--421 (2024) {{:iiduka:iiduka2023.pdf|PDF}} [[https://rdcu.be/df4ce|Springer Nature SharedIt]]