===== 2021 =====
==== Publications in Refereed Journals ====
  - [[en:iiduka:|Hideaki Iiduka]]: **[[https://ieeexplore.ieee.org/document/9531335|Appropriate Learning Rates of Adaptive Learning Rate Optimization Algorithms for Training Deep Neural Networks]]**, [[https://ieeexplore.ieee.org/xpl/RecentIssue.jsp?punumber=6221036|IEEE Transactions on Cybernetics]] (Accepted) (2021) {{:iiduka:CYB-E-2021-05-1174.pdf|PDF}}
  - Hiroyuki Sakai, [[en:iiduka:|Hideaki Iiduka]]: **[[https://ieeexplore.ieee.org/document/9339934|Riemannian Adaptive Optimization Algorithm and Its Application to Natural Language Processing]]**, [[https://ieeexplore.ieee.org/xpl/RecentIssue.jsp?punumber=6221036|IEEE Transactions on Cybernetics]] (Accepted) (2021) {{:iiduka:CYB-E-2020-04-0756R2.pdf|PDF}}
  - Kazuhiro Hishinuma, [[en:iiduka:|Hideaki Iiduka]]: **Evaluation of Fixed Point Quasiconvex Subgradient Method with Computational Inexactness**, [[http://www.ybook.co.jp/pafa.html|Pure and Applied Functional Analysis]]: Special Issue on Optimization Theory dedicated to Terry Rockafellar on the occasion of his 85th birthday (2021) (Accepted) {{:iiduka:pafa2020.pdf|PDF}}
  - Yini Zhu, [[en:iiduka:|Hideaki Iiduka]]: **[[https://ieeexplore.ieee.org/document/9576705|Unified Algorithm Framework for Nonconvex Stochastic Optimization in Deep Neural Networks]]**, [[https://ieeexplore.ieee.org/xpl/RecentIssue.jsp?punumber=6287639|IEEE Access]] 9: 143807--143823 (2021) [[https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=9576705|Open Access]]
  - Hiroyuki Sakai, [[en:iiduka:|Hideaki Iiduka]]: **[[https://link.springer.com/article/10.1007/s10957-021-01874-3|Sufficient Descent Riemannian Conjugate Gradient Methods]]**, [[https://www.springer.com/journal/10957|Journal of Optimization Theory and Applications]] 190: 130--150 (2021) {{:iiduka:JOTA-D-20-00641R3.pdf|PDF}} [[https://rdcu.be/clPy8|Springer Nature SharedIt]]
  - [[en:iiduka:|Hideaki Iiduka]]: **[[https://link.springer.com/article/10.1007/s10898-020-00980-2|Inexact Stochastic Subgradient Projection Method for Stochastic Equilibrium Problems with Nonmonotone Bifunctions: Application to Expected Risk Minimization in Machine Learning]]**, [[https://www.springer.com/journal/10898|Journal of Global Optimization]] 80 (2): 479--505 (2021) {{:iiduka:JOGO-D-20-00116R2.pdf|PDF}} [[https://rdcu.be/cc0QX|Springer Nature SharedIt]]
==== Proceedings ====
  - Kanako Shimoyama, [[en:iiduka:|Hideaki Iiduka]]: **Appropriate Gradients Used for Adaptive Learning Rate Optimization Algorithms for Training Deep Neural Networks**, [[https://www.kurims.kyoto-u.ac.jp/~kyodo/kokyuroku/kokyuroku.html|RIMS Kôkyûroku]] [[https://www.kurims.kyoto-u.ac.jp/~kyodo/kokyuroku/contents/2194.html|No.2194]], pp. 1--5, 2021 [[https://www.kurims.kyoto-u.ac.jp/~kyodo/kokyuroku/contents/pdf/2194-01.pdf|Open Access]]
  - Yini Zhu, Hiroyuki Sakai, [[en:iiduka:|Hideaki Iiduka]]: **Training Neural Networks Using Adaptive Gradient Methods**, [[https://www.kurims.kyoto-u.ac.jp/~kyodo/kokyuroku/kokyuroku.html|RIMS Kôkyûroku]] [[https://www.kurims.kyoto-u.ac.jp/~kyodo/kokyuroku/contents/2194.html|No.2194]], pp. 6--12, 2021 [[https://www.kurims.kyoto-u.ac.jp/~kyodo/kokyuroku/contents/pdf/2194-02.pdf|Open Access]]
  - [[en:iiduka:|Hideaki Iiduka]]: **Halpern-type Subgradient Methods for Convex Optimization over Fixed Point Sets of Nonexpansive Mappings**, Proceedings of International Conference on Nonlinear Analysis and Convex Analysis and International Conference on Optimization: Techniques and Applications -I-, pp. 119--125 {{:iiduka:iiduka-naca-icota2019.R1.pdf|PDF}}