
Differences

This shows the differences between two versions of the page.

intro:publications [2025/01/19 15:48] – [Refereed Original Papers] Hideaki IIDUKA
intro:publications [2025/03/11 16:36] (current) – [Refereed Original Papers] Hideaki IIDUKA
Line 8: Line 8:
  
 ==== Refereed Original Papers ====
-  - [[https://scholar.google.co.jp/citations?user=rNbGTIgAAAAJ&hl=ja|Naoki Sato]], Koshiro Izumi, [[..:iiduka:|Hideaki Iiduka]]: **Scaled Conjugate Gradient Method for Nonconvex Optimization in Deep Neural Networks**, [[https://www.jmlr.org/|Journal of Machine Learning Research]] 25: ???-??? (2025)
  - [[https://scholar.google.com/citations?user=hdDU4Z4AAAAJ&hl=ja|Kento Imaizumi]], [[..:iiduka:|Hideaki Iiduka]]: **[[https://doi.org/10.1080/02331934.2024.2367635|Iteration and Stochastic First-order Oracle Complexities of Stochastic Gradient Descent using Constant and Decaying Learning Rates]]**, [[https://www.tandfonline.com/journals/gopt20|Optimization]]: Special Issue dedicated to Dr. Alexander J. Zaslavski on the occasion of his 65th birthday ?? (?):?? --?? (2025) [[https://www.tandfonline.com/doi/epdf/10.1080/02331934.2024.2367635?needAccess=true|Open Access]]
-  - [[https://scholar.google.co.jp/citations?user=RXrwOgoAAAAJ&hl=ja|Hiroyuki Sakai]], [[..:iiduka:|Hideaki Iiduka]]: **[[https://openreview.net/forum?id=knv4lQFVoE|A general framework of Riemannian adaptive optimization methods with a convergence analysis]]**, [[https://jmlr.org/tmlr/|Transactions on Machine Learning Research]] (2025) [[https://openreview.net/pdf?id=knv4lQFVoE|Open Access]]
+  - [[https://scholar.google.com/citations?hl=ja&user=s9l7NM8AAAAJ|Hikaru Umeda]], [[..:iiduka:|Hideaki Iiduka]]: **[[https://openreview.net/forum?id=sbmp55k6iE|Increasing Both Batch Size and Learning Rate Accelerates Stochastic Gradient Descent]]**, [[https://jmlr.org/tmlr/|Transactions on Machine Learning Research]] (2025) [[https://openreview.net/pdf?id=sbmp55k6iE|Open Access]]
+  - [[https://scholar.google.co.jp/citations?user=RXrwOgoAAAAJ&hl=ja|Hiroyuki Sakai]], [[..:iiduka:|Hideaki Iiduka]]: **[[https://openreview.net/forum?id=knv4lQFVoE|A General Framework of Riemannian Adaptive Optimization Methods with a Convergence Analysis]]**, [[https://jmlr.org/tmlr/|Transactions on Machine Learning Research]] (2025) [[https://openreview.net/pdf?id=knv4lQFVoE|Open Access]]
  
  
Line 48: Line 47:
  -  [[..:iiduka:|飯塚 秀明]]: **[[https://www.coronasha.co.jp/np/isbn/9784339061321/|機械学習のための数学]]**, [[https://www.coronasha.co.jp/|コロナ社]] (2024)
 ==== Refereed Original Papers ====
-  - [[https://scholar.google.co.jp/citations?user=RXrwOgoAAAAJ&hl=ja|Hiroyuki Sakai]], [[..:iiduka:|Hideaki Iiduka]]: **[[http://manu71.magtech.com.cn/Jwk3_pjo/EN/10.61208/pjo-2024-005|Convergence of Riemannian Stochastic Gradient Descent on Hadamard Manifold]]**, [[http://manu71.magtech.com.cn/Jwk3_pjo/EN/home|Pacific Journal of Optimization]]: Special issue: Dedicated to Prof. Masao Fukushima on the occasion of his 75th birthday 20 (4): 743--767 (2024)
+  - [[https://scholar.google.co.jp/citations?user=rNbGTIgAAAAJ&hl=ja|Naoki Sato]], Koshiro Izumi, [[..:iiduka:|Hideaki Iiduka]]: **[[https://jmlr.org/papers/v25/22-0815.html|Scaled Conjugate Gradient Method for Nonconvex Optimization in Deep Neural Networks]]**, [[https://www.jmlr.org/|Journal of Machine Learning Research]] 25 (395): 1-37 (2024) [[https://jmlr.org/papers/volume25/22-0815/22-0815.pdf|Open Access]]
+  - [[https://scholar.google.co.jp/citations?user=RXrwOgoAAAAJ&hl=ja|Hiroyuki Sakai]], [[..:iiduka:|Hideaki Iiduka]]: **[[http://manu71.magtech.com.cn/Jwk3_pjo/EN/10.61208/pjo-2024-005|Convergence of Riemannian Stochastic Gradient Descent on Hadamard Manifold]]**, [[http://manu71.magtech.com.cn/Jwk3_pjo/EN/home|Pacific Journal of Optimization]]: Special issue: Dedicated to Prof. Masao Fukushima on the occasion of his 75th birthday 20 (4): 743--767 (2024) [[http://www.yokohamapublishers.jp/online-p/PJO/vol20/pjov20n4p743.pdf|Open Access]]
  - [[https://scholar.google.co.jp/citations?user=RXrwOgoAAAAJ&hl=ja|Hiroyuki Sakai]], [[..:iiduka:|Hideaki Iiduka]]: **[[https://link.springer.com/article/10.1007/s10957-024-02449-8?utm_source=rct_congratemailt&utm_medium=email&utm_campaign=oa_20240529&utm_content=10.1007%2Fs10957-024-02449-8|Modified Memoryless Spectral-scaling Broyden Family on Riemannian Manifolds]]**, [[https://link.springer.com/journal/10957|Journal of Optimization Theory and Applications]] 202: 834--853 (2024) [[https://link.springer.com/article/10.1007/s10957-024-02449-8?utm_source=rct_congratemailt&utm_medium=email&utm_campaign=oa_20240529&utm_content=10.1007%2Fs10957-024-02449-8|Open Access]]
  - [[..:iiduka:|Hideaki Iiduka]]: **[[https://link.springer.com/article/10.1007/s11075-023-01575-0|Theoretical Analysis of Adam using Hyperparameters Close to One without Lipschitz Smoothness]]**, [[https://www.springer.com/journal/11075|Numerical Algorithms]] 95: 383--421 (2024) {{:iiduka:iiduka2023.pdf|PDF}} [[https://rdcu.be/df4ce|Springer Nature SharedIt]]