===== 2021 =====
==== Publications in Refereed Journals ====
  - Yini Zhu, [[en:iiduka:|Hideaki Iiduka]]: **Unified Algorithm Framework for Nonconvex Stochastic Optimization in Deep Neural Networks**, [[https://ieeeaccess.ieee.org/|IEEE Access]] (Accepted) (2021)
  - [[en:iiduka:|Hideaki Iiduka]]: **[[https://ieeexplore.ieee.org/document/9531335|Appropriate Learning Rates of Adaptive Learning Rate Optimization Algorithms for Training Deep Neural Networks]]**, [[https://ieeexplore.ieee.org/xpl/RecentIssue.jsp?punumber=6221036|IEEE Transactions on Cybernetics]] (Accepted) (2021) {{:iiduka:CYB-E-2021-05-1174.pdf|PDF}}
  - Hiroyuki Sakai, [[en:iiduka:|Hideaki Iiduka]]: **[[https://ieeexplore.ieee.org/document/9339934|Riemannian Adaptive Optimization Algorithm and Its Application to Natural Language Processing]]**, [[https://ieeexplore.ieee.org/xpl/RecentIssue.jsp?punumber=6221036|IEEE Transactions on Cybernetics]] (Accepted) (2021) {{:iiduka:CYB-E-2020-04-0756R2.pdf|PDF}}
  - Kazuhiro Hishinuma, [[en:iiduka:|Hideaki Iiduka]]: **Evaluation of Fixed Point Quasiconvex Subgradient Method with Computational Inexactness**, [[http://www.ybook.co.jp/pafa.html|Pure and Applied Functional Analysis]]: Special Issue on Optimization Theory dedicated to Terry Rockafellar on the occasion of his 85th birthday (Accepted) (2021) {{:iiduka:pafa2020.pdf|PDF}}
  
==== Proceedings ====
  - Kanako Shimoyama, [[en:iiduka:|Hideaki Iiduka]]: **Appropriate Gradients Used for Adaptive Learning Rate Optimization Algorithms for Training Deep Neural Networks**, [[https://www.kurims.kyoto-u.ac.jp/~kyodo/kokyuroku/kokyuroku.html|RIMS Kôkyûroku]] [[https://www.kurims.kyoto-u.ac.jp/~kyodo/kokyuroku/contents/2194.html|No.2194]], pp. 1--5, 2021 [[https://www.kurims.kyoto-u.ac.jp/~kyodo/kokyuroku/contents/pdf/2194-01.pdf|Open Access]]
  - Yini Zhu, Hiroyuki Sakai, [[en:iiduka:|Hideaki Iiduka]]: **Training Neural Networks Using Adaptive Gradient Methods**, [[https://www.kurims.kyoto-u.ac.jp/~kyodo/kokyuroku/kokyuroku.html|RIMS Kôkyûroku]] [[https://www.kurims.kyoto-u.ac.jp/~kyodo/kokyuroku/contents/2194.html|No.2194]], pp. 6--12, 2021 [[https://www.kurims.kyoto-u.ac.jp/~kyodo/kokyuroku/contents/pdf/2194-02.pdf|Open Access]]
  - [[en:iiduka:|Hideaki Iiduka]]: **Halpern-type Subgradient Methods for Convex Optimization over Fixed Point Sets of Nonexpansive Mappings**, Proceedings of International Conference on Nonlinear Analysis and Convex Analysis and International Conference on Optimization: Techniques and Applications -I-, pp. 119--125 {{:iiduka:iiduka-naca-icota2019.R1.pdf|PDF}}
  
  
==== Conference Activities & Talks ====
  - Hiroyuki Sakai, [[:en:iiduka:|Hideaki Iiduka]]: **Riemannian conjugate gradient methods with sufficient descent search directions**, The 2021 Fall National Conference of the Operations Research Society of Japan, Kyushu University, Online meeting (Sept. 16, 2021)
  - Koshiro Izumi, [[:en:iiduka:|Hideaki Iiduka]]: **Adaptive scaling conjugate gradient method for neural networks**, RIMS Workshop on Advances in the Theory and Application of Mathematical Optimization, Research Institute for Mathematical Sciences, Kyoto University, Online meeting (Aug. 19, 2021)
  - Yu Kobayashi, [[:en:iiduka:|Hideaki Iiduka]]: **Adaptive conjugate gradient method for deep learning**, The 2021 Spring National Conference of the Operations Research Society of Japan, Tokyo Institute of Technology, Online meeting (Mar. 2, 2021)
  - Hiroyuki Sakai, [[:en:iiduka:|Hideaki Iiduka]]: **Extension of adaptive learning rate optimization algorithms to Riemannian manifolds and its application to natural language processing**, The 2021 Spring National Conference of the Operations Research Society of Japan, Tokyo Institute of Technology, Online meeting (Mar. 2, 2021)
  - Yini Zhu, Hiroyuki Sakai, [[:en:iiduka:|Hideaki Iiduka]]: **Training neural networks using adaptive gradient methods**, RIMS Workshop on Nonlinear Analysis and Convex Analysis, Research Institute for Mathematical Sciences, Kyoto University, Online meeting (Mar. 1, 2021)
  - Kanako Shimoyama, Yu Kobayashi, [[:en:iiduka:|Hideaki Iiduka]]: **Appropriate stochastic gradients used in adaptive learning rate optimization algorithms for training deep neural networks**, RIMS Workshop on Nonlinear Analysis and Convex Analysis, Research Institute for Mathematical Sciences, Kyoto University, Online meeting (Mar. 1, 2021)