Differences

This shows the differences between two versions of the page.

Previous revision: en:intro:publications [2021/06/05 21:30] Hideaki IIDUKA
Current revision: en:intro:publications [2024/03/19 18:46] (current) – [Reward and Punishment] Naoki SATO
Line 1: Line 1:
 ====== Publications ======
 ===== Preprints =====
-  - [[http://arxiv.org/find/math/1/au:+Iiduka_H/0/1/0/all/0/1|preprints]] (arXiv.org Search Results)
+  - [[https://arxiv.org/search/?searchtype=author&query=Iiduka%2C+H|preprints]] (arXiv.org Search Results)
  
 +===== 2024 =====
 +==== Reward and Punishment ====
 +  - [[https://scholar.google.co.jp/citations?user=rNbGTIgAAAAJ&hl=ja|Naoki Sato]]: [[https://orsj.org/nc2024s/student_award|The Student Excellent Presentation Award of The 2024 Spring National Conference of Operations Research Society of Japan]] (Mar. 19, 2024)
 +
 +==== Publications in Refereed Journals ====
 +  - [[https://scholar.google.co.jp/citations?user=RXrwOgoAAAAJ&hl=ja|Hiroyuki Sakai]], [[en:iiduka:|Hideaki Iiduka]]: **[[http://manu71.magtech.com.cn/Jwk3_pjo/EN/10.61208/pjo-2024-005|Convergence of Riemannian Stochastic Gradient Descent on Hadamard Manifold]]**, [[http://manu71.magtech.com.cn/Jwk3_pjo/EN/home|Pacific Journal of Optimization]]: Special issue: Dedicated to Prof. Masao Fukushima on the occasion of his 75th birthday ?? (?): ??--?? (2024)
 +  - [[en:iiduka:|Hideaki Iiduka]]: **[[https://link.springer.com/article/10.1007/s11075-023-01575-0|Theoretical Analysis of Adam using Hyperparameters Close to One without Lipschitz Smoothness]]**, [[https://www.springer.com/journal/11075|Numerical Algorithms]] 95: 383--421 (2024) {{:iiduka:iiduka2023.pdf|PDF}} [[https://rdcu.be/df4ce|Springer Nature SharedIt]]
 +
 +==== Conference Activities & Talks ====
 +  - [[https://scholar.google.co.jp/citations?user=RXrwOgoAAAAJ&hl=ja|Hiroyuki Sakai]], [[en:iiduka:|Hideaki Iiduka]]: **Modification and extension of memoryless spectral-scaling Broyden family on Riemannian manifolds**, The 2024 Spring National Conference of Operations Research Society of Japan, Kasuga Area, Tsukuba Campus, University of Tsukuba (Mar. 7, 2024) 
 +  - [[https://scholar.google.co.jp/citations?user=rNbGTIgAAAAJ&hl=ja|Naoki Sato]], [[en:iiduka:|Hideaki Iiduka]]: **Global optimization of deep neural networks for graduated optimization method using smoothness of stochastic gradient descent**, The 2024 Spring National Conference of Operations Research Society of Japan, Kasuga Area, Tsukuba Campus, University of Tsukuba (Mar. 7, 2024)
 +===== 2023 =====
 +==== Books ====
 +  -  [[en:iiduka:|Hideaki Iiduka]]: **[[https://www.ohmsha.co.jp/book/9784274230066/|Algorithms for Continuous Optimization]]** (Japanese), [[https://www.ohmsha.co.jp/english/|Ohmsha]] (2023)
 +
 +
 +==== Publications in Refereed Journals ====
 +  - [[en:iiduka:|Hideaki Iiduka]]: **[[https://ieeexplore.ieee.org/document/9695982|$\epsilon$-Approximation of Adaptive Leaning Rate Optimization Algorithms for Constrained Nonconvex Stochastic Optimization]]**, [[https://ieeexplore.ieee.org/xpl/RecentIssue.jsp?punumber=5962385|IEEE Transactions on Neural Networks and Learning Systems]] 34 (10): 8108--8115 (2023) {{:iiduka:TNNLS-2021-B-17781R1.pdf|PDF}}
 +  - Hiroyuki Sakai, [[https://sites.google.com/site/hiroyukisatojpn/|Hiroyuki Sato]], [[en:iiduka:|Hideaki Iiduka]]: **[[https://www.sciencedirect.com/science/article/pii/S0096300322007536?via%3Dihub|Global Convergence of Hager-Zhang type Riemannian Conjugate Gradient Method]]**, [[https://www.sciencedirect.com/journal/applied-mathematics-and-computation|Applied Mathematics and Computation]] 441, 127685 (2023) {{:iiduka:AMC-D-22-04242.pdf|PDF}}
 +
 +==== Proceedings ====
 +  - [[https://scholar.google.co.jp/citations?user=rNbGTIgAAAAJ&hl=ja|Naoki Sato]], [[..:iiduka:|Hideaki Iiduka]]: **[[https://proceedings.mlr.press/v202/sato23b.html|Existence and Estimation of Critical Batch Size for Training Generative Adversarial Networks with Two Time-Scale Update Rule]]**, [[https://proceedings.mlr.press/v202/|Proceedings of the 40th International Conference on Machine Learning]], PMLR 202: 30080--30104 (2023) [[https://proceedings.mlr.press/v202/sato23b/sato23b.pdf|PDF]]
 +  - [[https://hiroki11x.github.io/|Hiroki Naganuma]], [[en:iiduka:|Hideaki Iiduka]]: **[[https://proceedings.mlr.press/v206/naganuma23a.html|Conjugate Gradient Method for Generative Adversarial Networks]]**, [[https://proceedings.mlr.press/v206/|Proceedings of the 26th International Conference on Artificial Intelligence and Statistics]], PMLR 206: 4381--4408 (2023) [[https://proceedings.mlr.press/v206/naganuma23a/naganuma23a.pdf|PDF]]
 +
 +==== Conference Activities & Talks ====
 +  - [[https://scholar.google.co.jp/citations?user=rNbGTIgAAAAJ&hl=ja|Naoki Sato]], [[en:iiduka:|Hideaki Iiduka]]: **Existence and Estimation of Critical Batch Size for Training GANs with Two Time-Scale Update Rule**, RIMS Workshop on Mathematical Optimization: Theory and Practice, Research Institute for Mathematical Sciences, Kyoto University, Hybrid meeting (Aug. 28, 2023)
 +  - Hiroyuki Sakai, [[en:iiduka:|Hideaki Iiduka]]: **Adaptive Learning Rate Optimization Algorithms for Riemannian Optimization**, [[https://iciam2023.org/|The 10th International Congress on Industrial and Applied Mathematics (ICIAM)]], Waseda University, Tokyo, Japan (Aug. 20--25, 2023)
 +  - Yuki Tsukada, [[en:iiduka:|Hideaki Iiduka]]: **Line Search Methods for Nonconvex Optimization in Deep Learning**, [[https://iciam2023.org/|The 10th International Congress on Industrial and Applied Mathematics (ICIAM)]], Waseda University, Tokyo, Japan (Aug. 20--25, 2023)
 +  - [[https://scholar.google.co.jp/citations?user=rNbGTIgAAAAJ&hl=ja|Naoki Sato]], [[en:iiduka:|Hideaki Iiduka]]: **Theoretical Analysis of Two Time-Scale Update Rule for Training GANs**, [[https://iciam2023.org/|The 10th International Congress on Industrial and Applied Mathematics (ICIAM)]], Waseda University, Tokyo, Japan (Aug. 20--25, 2023) 
 +  - [[https://scholar.google.co.jp/citations?user=rNbGTIgAAAAJ&hl=ja|Naoki Sato]], [[en:iiduka:|Hideaki Iiduka]]: **Existence and Estimation of Critical Batch Size for Training Generative Adversarial Networks with Two Time-Scale Update Rule**, [[https://icml.cc/Conferences/2023|The 40th International Conference on Machine Learning (ICML)]], Hawaii Convention Center, Honolulu, Hawaii, USA (Jul. 23--29, 2023)
 +  - [[https://hiroki11x.github.io/|Hiroki Naganuma]], [[en:iiduka:|Hideaki Iiduka]]: **Conjugate Gradient Method for Generative Adversarial Networks**, [[http://aistats.org/aistats2023/|The 26th International Conference on Artificial Intelligence and Statistics (AISTATS)]], Palau de Congressos, Valencia, Spain (Apr. 25--27, 2023)
 +
 +
 +===== 2022 =====
 +==== Publications in Refereed Journals ====
 +  - [[en:iiduka:|Hideaki Iiduka]]: **[[https://ieeexplore.ieee.org/document/9531335|Appropriate Learning Rates of Adaptive Learning Rate Optimization Algorithms for Training Deep Neural Networks]]**, [[https://ieeexplore.ieee.org/xpl/RecentIssue.jsp?punumber=6221036|IEEE Transactions on Cybernetics]] 52 (12): 13250--13261 (2022) {{:iiduka:CYB-E-2021-05-1174.pdf|PDF}}
 +  - Hiroyuki Sakai, [[en:iiduka:|Hideaki Iiduka]]: **[[https://ieeexplore.ieee.org/document/9339934|Riemannian Adaptive Optimization Algorithm and Its Application to Natural Language Processing]]**, [[https://ieeexplore.ieee.org/xpl/RecentIssue.jsp?punumber=6221036|IEEE Transactions on Cybernetics]] 52 (8): 7328--7339 (2022) {{:iiduka:CYB-E-2020-04-0756R2.pdf|PDF}}
 +  - [[en:iiduka:|Hideaki Iiduka]], Hiroyuki Sakai: **[[https://link.springer.com/article/10.1007/s11075-021-01238-y|Riemannian Stochastic Fixed Point Optimization Algorithm]]**, [[https://www.springer.com/journal/11075|Numerical Algorithms]] 90: 1493--1517 (2022) {{:iiduka:iiduka_sakai_2020.pdf|PDF}} [[https://rdcu.be/cGDSS|Springer Nature SharedIt]]
 +  - Yu Kobayashi, [[en:iiduka:|Hideaki Iiduka]]: **[[http://www.yokohamapublishers.jp/online2/jncav23-2.html|Conjugate-gradient-based Adam for Nonconvex Stochastic Optimization and Its Application to Deep Learning]]**, [[http://yokohamapublishers.jp/jnca.html|Journal of Nonlinear and Convex Analysis]]: Special issue: Memory of Wataru Takahashi 23 (2): 337--356 (2022) [[http://yokohamapublishers.jp/online-p/JNCA/Open/vol23/jncav23n2p337-oa/HTML5/index.html|Open Access]]
 +
 +==== Conference Activities & Talks ====
 +  - [[https://scholar.google.co.jp/citations?user=RXrwOgoAAAAJ&hl=ja|Hiroyuki Sakai]], [[https://sites.google.com/site/hiroyukisatoeng/|Hiroyuki Sato]], [[en:iiduka:|Hideaki Iiduka]]: **HZ-type Conjugate Gradient Method on Riemann Manifold and Its Application to Eigenvalue Problems**, The 2022 National Conference of The Japan Society for Industrial and Applied Mathematics: Algorithms for Matrix / Eigenvalue Problems and their Applications, Hokkaido University Institute for the Advancement of Higher Education (Sept. 8, 2022)
  
 ===== 2021 =====
 ==== Publications in Refereed Journals ====
-  - Hiroyuki Sakai, [[en:iiduka:|Hideaki Iiduka]]: **[[https://link.springer.com/article/10.1007/s10957-021-01874-3|Sufficient Descent Riemannian Conjugate Gradient Methods]]**, [[https://www.springer.com/journal/10957|Journal of Optimization Theory and Applications]] (Accepted) (2021) {{:iiduka:JOTA-D-20-00641R3.pdf|PDF}}
-  - Hiroyuki Sakai, [[en:iiduka:|Hideaki Iiduka]]: **[[https://ieeexplore.ieee.org/document/9339934|Riemannian Adaptive Optimization Algorithm and Its Application to Natural Language Processing]]**, [[https://ieeexplore.ieee.org/xpl/RecentIssue.jsp?punumber=6221036|IEEE Transactions on Cybernetics]] (Accepted) (2021) {{:iiduka:CYB-E-2020-04-0756R2.pdf|PDF}}
-  - [[en:iiduka:|Hideaki Iiduka]]: **[[https://link.springer.com/article/10.1007/s10898-020-00980-2|Inexact Stochastic Subgradient Projection Method for Stochastic Equilibrium Problems with Nonmonotone Bifunctions: Application to Expected Risk Minimization in Machine Learning]]**, [[https://www.springer.com/journal/10898|Journal of Global Optimization]] (Accepted) (2021) {{:iiduka:JOGO-D-20-00116R2.pdf|PDF}} [[https://rdcu.be/cc0QX|Springer Nature SharedIt]]
-  - Kazuhiro Hishinuma, [[en:iiduka:|Hideaki Iiduka]]: **Evaluation of Fixed Point Quasiconvex Subgradient Method with Computational Inexactness**, [[http://www.ybook.co.jp/pafa.html|Pure and Applied Functional Analysis]]: Special Issue on Optimization Theory dedicated to Terry Rockafellar on the occasion of his 85th birthday, (2021) (Accepted) {{:iiduka:pafa2020.pdf|PDF}}
+  - Kanako Shimoyama, [[..:iiduka:|Hideaki Iiduka]]: **[[http://www.ybook.co.jp/online2/oplna/vol7/p317.html|Adaptive Methods Using Element-wise $P$-th Power of Stochastic Gradient for Nonconvex Optimization in Deep Neural Networks]]**, [[http://yokohamapublishers.jp/lna.html|Linear and Nonlinear Analysis]]: Special issue: Memory of Wataru Takahashi and Naoki Shioji 7 (3): 317--336 (2021) [[http://www.ybook.co.jp/online-p/LNA/Open/vol7/lnav7n3p317-oa/HTML5/index.html|Open Access]]
+  - Kazuhiro Hishinuma, [[en:iiduka:|Hideaki Iiduka]]: **[[http://yokohamapublishers.jp/online2/oppafa/vol6/p1303.html|Evaluation of Fixed Point Quasiconvex Subgradient Method with Computational Inexactness]]**, [[http://www.ybook.co.jp/pafa.html|Pure and Applied Functional Analysis]]: Special Issue on Optimization Theory dedicated to Terry Rockafellar on the occasion of his 85th birthday 6 (6): 1303--1316 (2021) [[http://yokohamapublishers.jp/online-p/pafa/Open/vol6/pafav6n6p1303-oa//HTML5/index.html|Open Access]]
+  - Yini Zhu, [[en:iiduka:|Hideaki Iiduka]]: **[[https://ieeexplore.ieee.org/document/9576705|Unified Algorithm Framework for Nonconvex Stochastic Optimization in Deep Neural Networks]]**, [[https://ieeexplore.ieee.org/xpl/RecentIssue.jsp?punumber=6287639|IEEE Access]] 9: 143807--143823 (2021) [[https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=9576705|Open Access]]
+  - Hiroyuki Sakai, [[en:iiduka:|Hideaki Iiduka]]: **[[https://link.springer.com/article/10.1007/s10957-021-01874-3|Sufficient Descent Riemannian Conjugate Gradient Methods]]**, [[https://www.springer.com/journal/10957|Journal of Optimization Theory and Applications]] 190: 130--150 (2021) {{:iiduka:JOTA-D-20-00641R3.pdf|PDF}} [[https://rdcu.be/clPy8|Springer Nature SharedIt]]
+  - [[en:iiduka:|Hideaki Iiduka]]: **[[https://link.springer.com/article/10.1007/s10898-020-00980-2|Inexact Stochastic Subgradient Projection Method for Stochastic Equilibrium Problems with Nonmonotone Bifunctions: Application to Expected Risk Minimization in Machine Learning]]**, [[https://www.springer.com/journal/10898|Journal of Global Optimization]] 80 (2): 479--505 (2021) {{:iiduka:JOGO-D-20-00116R2.pdf|PDF}} [[https://rdcu.be/cc0QX|Springer Nature SharedIt]]
   - [[en:iiduka:|Hideaki Iiduka]]: **[[https://fixedpointtheoryandapplications.springeropen.com/articles/10.1186/s13663-021-00695-3|Stochastic Approximation Method Using Diagonal Positive-Definite Matrices for Convex Optimization with Fixed Point Constraints]]**, [[https://fixedpointtheoryandapplications.springeropen.com/|Fixed Point Theory and Algorithms for Sciences and Engineering]]: Topical Collection on [[https://www.springeropen.com/collections/optimization|Optimization and Real World Applications]] 2021: 10 (2021) [[https://fixedpointtheoryandapplications.springeropen.com/track/pdf/10.1186/s13663-021-00695-3.pdf|Open Access]] [[https://rdcu.be/civI8|Springer Nature SharedIt]]
  
 ==== Proceedings ====
 +  - Kanako Shimoyama, [[en:iiduka:|Hideaki Iiduka]]: **Appropriate Gradients Used for Adaptive Learning Rate Optimization Algorithms for Training Deep Neural Networks**, [[https://www.kurims.kyoto-u.ac.jp/~kyodo/kokyuroku/kokyuroku.html|RIMS Kôkyûroku]] [[http://www.kurims.kyoto-u.ac.jp/~kyodo/kokyuroku/contents/2194.html|No.2194]], pp. 1--5, 2021 [[https://www.kurims.kyoto-u.ac.jp/~kyodo/kokyuroku/contents/pdf/2194-01.pdf|Open Access]]
 +  - Yini Zhu, Hiroyuki Sakai, [[en:iiduka:|Hideaki Iiduka]]: **Training Neural Networks Using Adaptive Gradient Methods**, [[https://www.kurims.kyoto-u.ac.jp/~kyodo/kokyuroku/kokyuroku.html|RIMS Kôkyûroku]] [[https://www.kurims.kyoto-u.ac.jp/~kyodo/kokyuroku/contents/2194.html|No.2194]], pp. 6--12, 2021 [[https://www.kurims.kyoto-u.ac.jp/~kyodo/kokyuroku/contents/pdf/2194-02.pdf|Open Access]]
   - [[en:iiduka:|Hideaki Iiduka]]: **Halpern-type Subgradient Methods for Convex Optimization over Fixed Point Sets of Nonexpansive Mappings**, Proceedings of International Conference on Nonlinear Analysis and Convex Analysis and International Conference on Optimization: Techniques and Applications -I- pp. 119--125 {{:iiduka:iiduka-naca-icota2019.R1.pdf|PDF}}
  
  
 ==== Conference Activities & Talks ====
-  - Yu Kobayashi, [[:en:iiduka:|Hideaki Iiduka]]: **Adaptive conjugate gradient method for deep learning**, The 2021 Spring National Conference of Operations Research Society of Japan, Tokyo Institute of Technology, Online meeting (Mar. 2, 2021).
-  - Hiroyuki Sakai, [[:en:iiduka:|Hideaki Iiduka]]: **Extension of adaptive learning rate optimization algorithms to Riemannian manifolds and its application to natural language processing**, The 2021 Spring National Conference of Operations Research Society of Japan, Tokyo Institute of Technology, Online meeting (Mar. 2, 2021).
+  - Hiroyuki Sakai, [[:en:iiduka:|Hideaki Iiduka]]: **Riemannian conjugate gradient methods with sufficient descent search directions**, The 2021 Fall National Conference of Operations Research Society of Japan, Kyushu University, Online meeting (Sept. 16, 2021)
+  - Koshiro Izumi, [[:en:iiduka:|Hideaki Iiduka]]: **Adaptive scaling conjugate gradient method for neural networks**, RIMS Workshop on Advances in the Theory and Application of Mathematical Optimization, Research Institute for Mathematical Sciences, Kyoto University, Online meeting (Aug. 19, 2021)
 +  - Yu Kobayashi, [[:en:iiduka:|Hideaki Iiduka]]: **Adaptive conjugate gradient method for deep learning**, The 2021 Spring National Conference of Operations Research Society of Japan, Tokyo Institute of Technology, Online meeting (Mar. 2, 2021) 
 +  - Hiroyuki Sakai, [[:en:iiduka:|Hideaki Iiduka]]: **Extension of adaptive learning rate optimization algorithms to Riemannian manifolds and its application to natural language processing**, The 2021 Spring National Conference of Operations Research Society of Japan, Tokyo Institute of Technology, Online meeting (Mar. 2, 2021)
   - Yini Zhu, Hiroyuki Sakai, [[:en:iiduka:|Hideaki Iiduka]]: **Training neural networks using adaptive gradient methods**, RIMS Workshop on Nonlinear Analysis and Convex Analysis, Research Institute for Mathematical Sciences, Kyoto University, Online meeting (Mar. 1, 2021)
   - Kanako Shimoyama, Yu Kobayashi, [[:en:iiduka:|Hideaki Iiduka]]: **Appropriate stochastic gradients used in adaptive learning rate optimization algorithms for training deep neural networks**, RIMS Workshop on Nonlinear Analysis and Convex Analysis, Research Institute for Mathematical Sciences, Kyoto University, Online meeting (Mar. 1, 2021)
Line 54: Line 101:
   - Haruhi Oishi, Yu Kobayashi, [[en:iiduka:|Hideaki Iiduka]]: **[[http://www.ybook.co.jp/online-p/LNA/Open/vol5/lnav5n3p477-oa/index.html|Incremental Proximal Method for Nonsmooth Convex Optimization With Fixed Point Constraints of Quasi-nonexpansive Mappings]]**, [[http://www.ybook.co.jp/lna.html|Linear and Nonlinear Analysis]] 5 (3): 477-493 (2019) {{:iiduka:oishi_lna.pdf|PDF}}
   - [[en:iiduka:|Hideaki Iiduka]]: **[[https://ieeexplore.ieee.org/document/8584116|Distributed Optimization for Network Resource Allocation With Nonsmooth Utility Functions]]**, [[https://ieeexplore.ieee.org/xpl/RecentIssue.jsp?punumber=6509490|IEEE Transactions on Control of Network Systems]] 6 (4): 1354-1365 (2019) {{:iiduka:18-0317.pdf|PDF}}
-  - Kazuhiro Hishinuma, [[..:iiduka:|Hideaki Iiduka]]: **[[http://www.ybook.co.jp/online2/opjnca/vol20/p1937.html|Convergence Analysis of Incremental and Parallel Line Search Subgradient Methods in Hilbert Space]]**, [[http://www.ybook.co.jp/jnca.html|Journal of Nonlinear and Convex Analysis]] 20 (9): Special Issue-Dedicated to Wataru Takahashi on the occasion of his 75th birth day1937-1947 (2019) {{:iiduka:jnca_kaz_hide.pdf|PDF}}
+  - Kazuhiro Hishinuma, [[..:iiduka:|Hideaki Iiduka]]: **[[http://www.ybook.co.jp/online2/opjnca/vol20/p1937.html|Convergence Analysis of Incremental and Parallel Line Search Subgradient Methods in Hilbert Space]]**, [[http://www.ybook.co.jp/jnca.html|Journal of Nonlinear and Convex Analysis]]: Special Issue-Dedicated to Wataru Takahashi on the occasion of his 75th birthday 20 (9): 1937-1947 (2019) {{:iiduka:jnca_kaz_hide.pdf|PDF}}
   - Kazuhiro Hishinuma, [[en:iiduka:|Hideaki Iiduka]]: **[[https://www.frontiersin.org/articles/10.3389/frobt.2019.00077|Incremental and Parallel Machine Learning Algorithms With Automated Learning Rate Adjustments]]**, [[https://www.frontiersin.org/journals/robotics-and-ai|Frontiers in Robotics and AI]]: [[https://www.frontiersin.org/research-topics/7964|Resolution of Limitations of Deep Learning to Develop New AI Paradigms]] 6, Article 77 {{ :preprint:201908-kaz-iiduka.pdf |PDF}}
   - [[en:iiduka:|Hideaki Iiduka]]: **[[http://www.tandfonline.com/doi/full/10.1080/10556788.2018.1425860|Two Stochastic Optimization Algorithms for Convex Optimization With Fixed Point Constraints]]**, [[http://www.tandfonline.com/loi/goms20|Optimization Methods and Software]] 34 (4): 731-757 (2019) {{:iiduka:GOMS-2017-0013R1.pdf|PDF}}