====== Publications ======
===== Preprints =====
  - [[https://arxiv.org/search/?searchtype=author&query=Iiduka%2C+H|preprints]] (arXiv.org Search Results)

===== 2024 =====
==== Reward and Punishment ====
  - [[https://scholar.google.co.jp/citations?user=rNbGTIgAAAAJ&hl=ja|Naoki Sato]]: [[https://orsj.org/nc2024s/student_award|The Student Excellent Presentation Award of The 2024 Spring National Conference of Operations Research Society of Japan]] (Mar. 19, 2024)

==== Publications in Refereed Journals ====
  - [[https://scholar.google.co.jp/citations?user=RXrwOgoAAAAJ&hl=ja|Hiroyuki Sakai]], [[en:iiduka:|Hideaki Iiduka]]: **[[http://manu71.magtech.com.cn/Jwk3_pjo/EN/10.61208/pjo-2024-005|Convergence of Riemannian Stochastic Gradient Descent on Hadamard Manifold]]**, [[http://manu71.magtech.com.cn/Jwk3_pjo/EN/home|Pacific Journal of Optimization]]: Special issue: Dedicated to Prof. Masao Fukushima on the occasion of his 75th birthday ?? (?): ??--?? (2024)
  - [[en:iiduka:|Hideaki Iiduka]]: **[[https://link.springer.com/article/10.1007/s11075-023-01575-0|Theoretical Analysis of Adam using Hyperparameters Close to One without Lipschitz Smoothness]]**, [[https://www.springer.com/journal/11075|Numerical Algorithms]] 95: 383--421 (2024) {{:iiduka:iiduka2023.pdf|PDF}} [[https://rdcu.be/df4ce|Springer Nature SharedIt]]

==== Conference Activities & Talks ====
  - [[https://scholar.google.co.jp/citations?user=RXrwOgoAAAAJ&hl=ja|Hiroyuki Sakai]], [[en:iiduka:|Hideaki Iiduka]]: **Modification and extension of memoryless spectral-scaling Broyden family on Riemannian manifolds**, The 2024 Spring National Conference of Operations Research Society of Japan, Kasuga Area, Tsukuba Campus, University of Tsukuba (Mar. 7, 2024)
  - [[https://scholar.google.co.jp/citations?user=rNbGTIgAAAAJ&hl=ja|Naoki Sato]], [[en:iiduka:|Hideaki Iiduka]]: **Global optimization of deep neural networks for graduated optimization method using smoothness of stochastic gradient descent**, The 2024 Spring National Conference of Operations Research Society of Japan, Kasuga Area, Tsukuba Campus, University of Tsukuba (Mar. 7, 2024)
===== 2023 =====
==== Books ====
  - [[en:iiduka:|Hideaki Iiduka]]: **[[https://www.ohmsha.co.jp/book/9784274230066/|Algorithms for Continuous Optimization]]** (Japanese), [[https://www.ohmsha.co.jp/english/|Ohmsha]] (2023)

==== Publications in Refereed Journals ====
  - [[en:iiduka:|Hideaki Iiduka]]: **[[https://ieeexplore.ieee.org/document/9695982|$\epsilon$-Approximation of Adaptive Leaning Rate Optimization Algorithms for Constrained Nonconvex Stochastic Optimization]]**, [[https://ieeexplore.ieee.org/xpl/RecentIssue.jsp?punumber=5962385|IEEE Transactions on Neural Networks and Learning Systems]] 34 (10): 8108--8115 (2023) {{:iiduka:TNNLS-2021-B-17781R1.pdf|PDF}}
  - Hiroyuki Sakai, [[https://sites.google.com/site/hiroyukisatojpn/|Hiroyuki Sato]], [[en:iiduka:|Hideaki Iiduka]]: **[[https://www.sciencedirect.com/science/article/pii/S0096300322007536?via%3Dihub|Global Convergence of Hager-Zhang type Riemannian Conjugate Gradient Method]]**, [[https://www.sciencedirect.com/journal/applied-mathematics-and-computation|Applied Mathematics and Computation]] 441, 127685 (2023) {{:iiduka:AMC-D-22-04242.pdf|PDF}}

==== Proceedings ====
  - [[https://scholar.google.co.jp/citations?user=rNbGTIgAAAAJ&hl=ja|Naoki Sato]], [[..:iiduka:|Hideaki Iiduka]]: **[[https://proceedings.mlr.press/v202/sato23b.html|Existence and Estimation of Critical Batch Size for Training Generative Adversarial Networks with Two Time-Scale Update Rule]]**, [[https://proceedings.mlr.press/v202/|Proceedings of the 40th International Conference on Machine Learning]], PMLR 202: 30080--30104 (2023) [[https://proceedings.mlr.press/v202/sato23b/sato23b.pdf|PDF]]
  - [[https://hiroki11x.github.io/|Hiroki Naganuma]], [[en:iiduka:|Hideaki Iiduka]]: **[[https://proceedings.mlr.press/v206/naganuma23a.html|Conjugate Gradient Method for Generative Adversarial Networks]]**, [[https://proceedings.mlr.press/v206/|Proceedings of the 26th International Conference on Artificial Intelligence and Statistics]], PMLR 206: 4381--4408 (2023) [[https://proceedings.mlr.press/v206/naganuma23a/naganuma23a.pdf|PDF]]

==== Conference Activities & Talks ====
  - [[https://scholar.google.co.jp/citations?user=rNbGTIgAAAAJ&hl=ja|Naoki Sato]], [[en:iiduka:|Hideaki Iiduka]]: **Existence and Estimation of Critical Batch Size for Training GANs with Two Time-Scale Update Rule**, RIMS Workshop on Mathematical Optimization: Theory and Practice, Research Institute for Mathematical Sciences, Kyoto University, Hybrid meeting (Aug. 28, 2023)
  - Hiroyuki Sakai, [[en:iiduka:|Hideaki Iiduka]]: **Adaptive Learning Rate Optimization Algorithms for Riemannian Optimization**, [[https://iciam2023.org/|The 10th International Congress on Industrial and Applied Mathematics (ICIAM)]], Waseda University, Tokyo, Japan (Aug. 20--25, 2023)
  - Yuki Tsukada, [[en:iiduka:|Hideaki Iiduka]]: **Line Search Methods for Nonconvex Optimization in Deep Learning**, [[https://iciam2023.org/|The 10th International Congress on Industrial and Applied Mathematics (ICIAM)]], Waseda University, Tokyo, Japan (Aug. 20--25, 2023)
  - [[https://scholar.google.co.jp/citations?user=rNbGTIgAAAAJ&hl=ja|Naoki Sato]], [[en:iiduka:|Hideaki Iiduka]]: **Theoretical Analysis of Two Time-Scale Update Rule for Training GANs**, [[https://iciam2023.org/|The 10th International Congress on Industrial and Applied Mathematics (ICIAM)]], Waseda University, Tokyo, Japan (Aug. 20--25, 2023)
  - [[https://scholar.google.co.jp/citations?user=rNbGTIgAAAAJ&hl=ja|Naoki Sato]], [[en:iiduka:|Hideaki Iiduka]]: **Existence and Estimation of Critical Batch Size for Training Generative Adversarial Networks with Two Time-Scale Update Rule**, [[https://icml.cc/Conferences/2023|The 40th International Conference on Machine Learning (ICML)]], Hawaii Convention Center, Honolulu, Hawaii, USA (Jul. 23--29, 2023)
  - [[https://hiroki11x.github.io/|Hiroki Naganuma]], [[en:iiduka:|Hideaki Iiduka]]: **Conjugate Gradient Method for Generative Adversarial Networks**, [[http://aistats.org/aistats2023/|The 26th International Conference on Artificial Intelligence and Statistics (AISTATS)]], Palau de Congressos, Valencia, Spain (Apr. 25--27, 2023)

===== 2022 =====
==== Publications in Refereed Journals ====
  - [[en:iiduka:|Hideaki Iiduka]]: **[[https://ieeexplore.ieee.org/document/9531335|Appropriate Learning Rates of Adaptive Learning Rate Optimization Algorithms for Training Deep Neural Networks]]**, [[https://ieeexplore.ieee.org/xpl/RecentIssue.jsp?punumber=6221036|IEEE Transactions on Cybernetics]] 52 (12): 13250--13261 (2022) {{:iiduka:CYB-E-2021-05-1174.pdf|PDF}}
  - Hiroyuki Sakai, [[en:iiduka:|Hideaki Iiduka]]: **[[https://ieeexplore.ieee.org/document/9339934|Riemannian Adaptive Optimization Algorithm and Its Application to Natural Language Processing]]**, [[https://ieeexplore.ieee.org/xpl/RecentIssue.jsp?punumber=6221036|IEEE Transactions on Cybernetics]] 52 (8): 7328--7339 (2022) {{:iiduka:CYB-E-2020-04-0756R2.pdf|PDF}}
  - [[en:iiduka:|Hideaki Iiduka]], Hiroyuki Sakai: **[[https://link.springer.com/article/10.1007/s11075-021-01238-y|Riemannian Stochastic Fixed Point Optimization Algorithm]]**, [[https://www.springer.com/journal/11075|Numerical Algorithms]] 90: 1493--1517 (2022) {{:iiduka:iiduka_sakai_2020.pdf|PDF}} [[https://rdcu.be/cGDSS|Springer Nature SharedIt]]
  - Yu Kobayashi, [[en:iiduka:|Hideaki Iiduka]]: **[[http://www.yokohamapublishers.jp/online2/jncav23-2.html|Conjugate-gradient-based Adam for Nonconvex Stochastic Optimization and Its Application to Deep Learning]]**, [[http://yokohamapublishers.jp/jnca.html|Journal of Nonlinear and Convex Analysis]]: Special issue: Memory of Wataru Takahashi 23 (2): 337--356 (2022) [[http://yokohamapublishers.jp/online-p/JNCA/Open/vol23/jncav23n2p337-oa/HTML5/index.html|Open Access]]

==== Conference Activities & Talks ====
  - [[https://scholar.google.co.jp/citations?user=RXrwOgoAAAAJ&hl=ja|Hiroyuki Sakai]], [[https://sites.google.com/site/hiroyukisatoeng/|Hiroyuki Sato]], [[en:iiduka:|Hideaki Iiduka]]: **HZ-type Conjugate Gradient Method on Riemann Manifold and Its Application to Eigenvalue Problems**, The 2022 National Conference of The Japan Society for Industrial and Applied Mathematics: Algorithms for Matrix / Eigenvalue Problems and their Applications, Hokkaido University Institute for the Advancement of Higher Education (Sept. 8, 2022)

===== 2021 =====
==== Publications in Refereed Journals ====
  - Kanako Shimoyama, [[..:iiduka:|Hideaki Iiduka]]: **[[http://www.ybook.co.jp/online2/oplna/vol7/p317.html|Adaptive Methods Using Element-wise $P$-th Power of Stochastic Gradient for Nonconvex Optimization in Deep Neural Networks]]**, [[http://yokohamapublishers.jp/lna.html|Linear and Nonlinear Analysis]]: Special issue: Memory of Wataru Takahashi and Naoki Shioji 7 (3): 317--336 (2021) [[http://www.ybook.co.jp/online-p/LNA/Open/vol7/lnav7n3p317-oa/HTML5/index.html|Open Access]]
  - Kazuhiro Hishinuma, [[en:iiduka:|Hideaki Iiduka]]: **[[http://yokohamapublishers.jp/online2/oppafa/vol6/p1303.html|Evaluation of Fixed Point Quasiconvex Subgradient Method with Computational Inexactness]]**, [[http://www.ybook.co.jp/pafa.html|Pure and Applied Functional Analysis]]: Special Issue on Optimization Theory dedicated to Terry Rockafellar on the occasion of his 85th birthday 6 (6): 1303--1316 (2021) [[http://yokohamapublishers.jp/online-p/pafa/Open/vol6/pafav6n6p1303-oa//HTML5/index.html|Open Access]]
  - Yini Zhu, [[en:iiduka:|Hideaki Iiduka]]: **[[https://ieeexplore.ieee.org/document/9576705|Unified Algorithm Framework for Nonconvex Stochastic Optimization in Deep Neural Networks]]**, [[https://ieeexplore.ieee.org/xpl/RecentIssue.jsp?punumber=6287639|IEEE Access]] 9: 143807--143823 (2021) [[https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=9576705|Open Access]]
  - Hiroyuki Sakai, [[en:iiduka:|Hideaki Iiduka]]: **[[https://link.springer.com/article/10.1007/s10957-021-01874-3|Sufficient Descent Riemannian Conjugate Gradient Methods]]**, [[https://www.springer.com/journal/10957|Journal of Optimization Theory and Applications]] 190: 130--150 (2021) {{:iiduka:JOTA-D-20-00641R3.pdf|PDF}} [[https://rdcu.be/clPy8|Springer Nature SharedIt]]
  - [[en:iiduka:|Hideaki Iiduka]]: **[[https://link.springer.com/article/10.1007/s10898-020-00980-2|Inexact Stochastic Subgradient Projection Method for Stochastic Equilibrium Problems with Nonmonotone Bifunctions: Application to Expected Risk Minimization in Machine Learning]]**, [[https://www.springer.com/journal/10898|Journal of Global Optimization]] 80 (2): 479--505 (2021) {{:iiduka:JOGO-D-20-00116R2.pdf|PDF}} [[https://rdcu.be/cc0QX|Springer Nature SharedIt]]
  - [[en:iiduka:|Hideaki Iiduka]]: **[[https://fixedpointtheoryandapplications.springeropen.com/articles/10.1186/s13663-021-00695-3|Stochastic Approximation Method Using Diagonal Positive-Definite Matrices for Convex Optimization with Fixed Point Constraints]]**, [[https://fixedpointtheoryandapplications.springeropen.com/|Fixed Point Theory and Algorithms for Sciences and Engineering]]: Topical Collection on [[https://www.springeropen.com/collections/optimization|Optimization and Real World Applications]] 2021: 10 (2021) [[https://fixedpointtheoryandapplications.springeropen.com/track/pdf/10.1186/s13663-021-00695-3.pdf|Open Access]] [[https://rdcu.be/civI8|Springer Nature SharedIt]]

==== Proceedings ====
  - Kanako Shimoyama, [[en:iiduka:|Hideaki Iiduka]]: **Appropriate Gradients Used for Adaptive Learning Rate Optimization Algorithms for Training Deep Neural Networks**, [[https://www.kurims.kyoto-u.ac.jp/~kyodo/kokyuroku/kokyuroku.html|RIMS Kôkyûroku]] [[http://www.kurims.kyoto-u.ac.jp/~kyodo/kokyuroku/contents/2114.html|No.2194]], pp. 1--5, 2021 [[https://www.kurims.kyoto-u.ac.jp/~kyodo/kokyuroku/contents/pdf/2194-01.pdf|Open Access]]
  - Yini Zhu, Hiroyuki Sakai, [[en:iiduka:|Hideaki Iiduka]]: **Training Neural Networks Using Adaptive Gradient Methods**, [[https://www.kurims.kyoto-u.ac.jp/~kyodo/kokyuroku/kokyuroku.html|RIMS Kôkyûroku]] [[https://www.kurims.kyoto-u.ac.jp/~kyodo/kokyuroku/contents/2194.html|No.2194]], pp. 6--12, 2021 [[https://www.kurims.kyoto-u.ac.jp/~kyodo/kokyuroku/contents/pdf/2194-02.pdf|Open Access]]
  - [[en:iiduka:|Hideaki Iiduka]]: **Halpern-type Subgradient Methods for Convex Optimization over Fixed Point Sets of Nonexpansive Mappings**, Proceedings of International Conference on Nonlinear Analysis and Convex Analysis and International Conference on Optimization: Techniques and Applications -I-, pp. 119--125 {{:iiduka:iiduka-naca-icota2019.R1.pdf|PDF}}

==== Conference Activities & Talks ====
  - Hiroyuki Sakai, [[:en:iiduka:|Hideaki Iiduka]]: **Riemannian conjugate gradient methods with sufficient descent search directions**, The 2021 Fall National Conference of Operations Research Society of Japan, Kyushu University, Online meeting (Sept. 16, 2021)
  - Koshiro Izumi, [[:en:iiduka:|Hideaki Iiduka]]: **Adaptive scaling conjugate gradient method for neural networks**, RIMS Workshop on Advances in the Theory and Application of Mathematical Optimization, Research Institute for Mathematical Sciences, Kyoto University, Online meeting (Aug. 19, 2021)
  - Yu Kobayashi, [[:en:iiduka:|Hideaki Iiduka]]: **Adaptive conjugate gradient method for deep learning**, The 2021 Spring National Conference of Operations Research Society of Japan, Tokyo Institute of Technology, Online meeting (Mar. 2, 2021)
  - Hiroyuki Sakai, [[:en:iiduka:|Hideaki Iiduka]]: **Extension of adaptive learning rate optimization algorithms to Riemannian manifolds and its application to natural language processing**, The 2021 Spring National Conference of Operations Research Society of Japan, Tokyo Institute of Technology, Online meeting (Mar. 2, 2021)
  - Yini Zhu, Hiroyuki Sakai, [[:en:iiduka:|Hideaki Iiduka]]: **Training neural networks using adaptive gradient methods**, RIMS Workshop on Nonlinear Analysis and Convex Analysis, Research Institute for Mathematical Sciences, Kyoto University, Online meeting (Mar. 1, 2021)
  - Kanako Shimoyama, Yu Kobayashi, [[:en:iiduka:|Hideaki Iiduka]]: **Appropriate stochastic gradients used in adaptive learning rate optimization algorithms for training deep neural networks**, RIMS Workshop on Nonlinear Analysis and Convex Analysis, Research Institute for Mathematical Sciences, Kyoto University, Online meeting (Mar. 1, 2021)
  
  
===== 2020 =====
==== Publications in Refereed Journals ====
  - Hiroyuki Sakai, [[en:iiduka:|Hideaki Iiduka]]: **[[https://link.springer.com/article/10.1007/s10589-020-00224-9|Hybrid Riemannian Conjugate Gradient Methods with Global Convergence Properties]]**, [[https://www.springer.com/journal/10589|Computational Optimization and Applications]] 77: 811--830 (2020) {{:iiduka:coap-d-20-00058.pdf|PDF}}
  - [[en:iiduka:|Hideaki Iiduka]], Yu Kobayashi: **[[https://www.mdpi.com/2079-9292/9/11/1809|Training Deep Neural Networks Using Conjugate Gradient-like Methods]]**, [[https://www.mdpi.com/journal/electronics/sections/Artificial_Intell|Electronics]] 9 (11): 1809 (2020) [[https://www.mdpi.com/2079-9292/9/11/1809/pdf|Open Access]] {{:iiduka:iiduka_electronics_correction.pdf|Correction}}
  - Kengo Shimizu, [[en:iiduka:|Hideaki Iiduka]]: **[[http://www.ybook.co.jp/online2/oplna/vol6/p281.html|Computation Time of Iterative Methods for Nonsmooth Convex Optimization With Fixed Point Constraints of Quasi-Nonexpansive Mappings]]**, [[http://www.ybook.co.jp/lna.html|Linear and Nonlinear Analysis]] 6 (2): 281--286 (2020) {{:iiduka:lna-shimizu2020.pdf|PDF}}
  - [[en:iiduka:|Hideaki Iiduka]]: **[[https://ieeexplore.ieee.org/document/8744480|Stochastic Fixed Point Optimization Algorithm for Classifier Ensemble]]**, [[https://ieeexplore.ieee.org/xpl/RecentIssue.jsp?punumber=6221036|IEEE Transactions on Cybernetics]] 50 (10): 4370--4380 (2020) {{:iiduka:CYB-E-2018-12-2420.R1.pdf|PDF}}
  - Kazuhiro Hishinuma, [[en:iiduka:|Hideaki Iiduka]]: **[[http://www.ybook.co.jp/online2/oplna/vol6/p35.html|Efficiency of Inexact Fixed Point Quasiconvex Subgradient Method]]**, [[http://www.ybook.co.jp/lna.html|Linear and Nonlinear Analysis]] 6 (1): 35--48 (2020) {{:iiduka:lna_kaz2020.pdf|PDF}}
  - Kengo Shimizu, Kazuhiro Hishinuma, [[en:iiduka:|Hideaki Iiduka]]: **[[http://asvao.biemdas.com/issues/ASVAO2020-1-1.pdf|Parallel Computing Proximal Method for Nonsmooth Convex Optimization With Fixed Point Constraints of Quasi-nonexpansive Mappings]]**, [[http://asvao.biemdas.com/|Applied Set-Valued Analysis and Optimization]] 2 (1): 1--17 (2020) {{:iiduka:simizu-iiduka-asvao.pdf|PDF}}
  - [[en:iiduka:|Hideaki Iiduka]]: **[[https://link.springer.com/article/10.1007/s11081-019-09440-7|Decentralized Hierarchical Constrained Convex Optimization]]**, [[https://link.springer.com/journal/11081|Optimization and Engineering]] 21 (1): 181--213 (2020) {{:iiduka:OPTE-2019-213R1.pdf|PDF}} [[https://rdcu.be/bFpTS|Springer Nature SharedIt]]
  - Kazuhiro Hishinuma, [[..:iiduka:|Hideaki Iiduka]]: **[[https://doi.org/10.1016/j.ejor.2019.09.037|Fixed Point Quasiconvex Subgradient Method]]**, [[https://www.sciencedirect.com/journal/european-journal-of-operational-research|European Journal of Operational Research]] 282 (2): 428--437 (2020) {{ :kaz:201909-kaz-iiduka-ejor.pdf |PDF}}
  
==== Doctoral Thesis ====
  - Kazuhiro Hishinuma: **Fixed Point Subgradient Methods for Constrained Nonsmooth Optimization**, Meiji University, 2020 {{ ::kaz_dt_2020.pdf |PDF}}

==== Conference Activities & Talks ====
  - Yu Kobayashi, [[en:iiduka:|Hideaki Iiduka]]: **Adaptive optimization method with stochastic conjugate gradient direction and its application to image classification in deep learning**, The 2020 Spring National Conference of Operations Research Society of Japan, Nara Kasugano International Forum (Mar. 12, 2020).
===== 2019 =====
==== Publications in Refereed Journals ====
  - Haruhi Oishi, Yu Kobayashi, [[en:iiduka:|Hideaki Iiduka]]: **[[http://www.ybook.co.jp/online-p/LNA/Open/vol5/lnav5n3p477-oa/index.html|Incremental Proximal Method for Nonsmooth Convex Optimization With Fixed Point Constraints of Quasi-nonexpansive Mappings]]**, [[http://www.ybook.co.jp/lna.html|Linear and Nonlinear Analysis]] 5 (3): 477-493 (2019) {{:iiduka:oishi_lna.pdf|PDF}}
  - [[en:iiduka:|Hideaki Iiduka]]: **[[https://ieeexplore.ieee.org/document/8584116|Distributed Optimization for Network Resource Allocation With Nonsmooth Utility Functions]]**, [[https://ieeexplore.ieee.org/xpl/RecentIssue.jsp?punumber=6509490|IEEE Transactions on Control of Network Systems]] 6 (4): 1354-1365 (2019) {{:iiduka:18-0317.pdf|PDF}}
  - Kazuhiro Hishinuma, [[..:iiduka:|Hideaki Iiduka]]: **[[http://www.ybook.co.jp/online2/opjnca/vol20/p1937.html|Convergence Analysis of Incremental and Parallel Line Search Subgradient Methods in Hilbert Space]]**, [[http://www.ybook.co.jp/jnca.html|Journal of Nonlinear and Convex Analysis]]: Special Issue-Dedicated to Wataru Takahashi on the occasion of his 75th birthday 20 (9): 1937-1947 (2019) {{:iiduka:jnca_kaz_hide.pdf|PDF}}
  - Kazuhiro Hishinuma, [[en:iiduka:|Hideaki Iiduka]]: **[[https://www.frontiersin.org/articles/10.3389/frobt.2019.00077|Incremental and Parallel Machine Learning Algorithms With Automated Learning Rate Adjustments]]**, [[https://www.frontiersin.org/journals/robotics-and-ai|Frontiers in Robotics and AI]]: [[https://www.frontiersin.org/research-topics/7964|Resolution of Limitations of Deep Learning to Develop New AI Paradigms]] 6, Article 77 (2019) {{ :preprint:201908-kaz-iiduka.pdf |PDF}}
  - [[en:iiduka:|Hideaki Iiduka]]: **[[http://www.tandfonline.com/doi/full/10.1080/10556788.2018.1425860|Two Stochastic Optimization Algorithms for Convex Optimization With Fixed Point Constraints]]**, [[http://www.tandfonline.com/loi/goms20|Optimization Methods and Software]] 34 (4): 731-757 (2019) {{:iiduka:GOMS-2017-0013R1.pdf|PDF}}
  - Kaito Sakurai, Takayuki Jimba, [[en:iiduka:|Hideaki Iiduka]]: **[[http://jnva.biemdas.com/archives/843|Iterative Methods for Parallel Convex Optimization With Fixed Point Constraints]]**, [[http://jnva.biemdas.com/|Journal of Nonlinear and Variational Analysis]] 3 (2): 115-126 (2019) [[http://jnva.biemdas.com/issues/JNVA2019-2-1.pdf|Open Access]]
  
  
==== Honor Lecture ====
  - [[..:iiduka:|Hideaki Iiduka]]: **Convex optimization with complicated constraint and its application**, The 2019 Fall National Conference of Operations Research Society of Japan, Higashi Hiroshima Arts & Culture Hall Kurara (Sep. 12, 2019).
  - [[en:iiduka:|Hideaki Iiduka]]: **Fixed point algorithms and their applications**, The International Conference on Nonlinear Analysis and Convex Analysis--International Conference on Optimization: Techniques and Applications (NACA-ICOTA2019), Future University Hakodate (Aug. 27, 2019)
  
==== Conference Activities & Talks ====
===== 2018 =====
==== Publications in Refereed Journals ====
  - Keigo Fujiwara, Kazuhiro Hishinuma, [[en:iiduka:|Hideaki Iiduka]]: **[[http://www.ybook.co.jp/online2/oplna/vol4/p29.html|Evaluation of Stochastic Approximation Algorithm and Variants for Learning Support Vector Machines]]**, [[http://www.ybook.co.jp/lna.html|Linear and Nonlinear Analysis]], 4 (1): 29-61 (2018) [[http://www.ybook.co.jp/online-p/LNA/Open/vol4/lnav4n1p29-oa/FLASH/index.html|Open Access]]
  - [[http://gyoseki1.mind.meiji.ac.jp/mjuhp/KgApp?kyoinId=ymkdgygyggy&Language=2|Yoichi Hayashi]], [[:en:iiduka:|Hideaki Iiduka]]: **[[http://www.sciencedirect.com/science/article/pii/S0925231217313486|Optimality and Convergence for Convex Ensemble Learning With Sparsity and Diversity Based on Fixed Point Optimization]]**, [[https://www.journals.elsevier.com/neurocomputing/|Neurocomputing]], 273: 367-372 (2018) {{:iiduka:h_i_neucom2017.pdf|PDF}}
  
===== 2017 =====
==== Publications in Refereed Journals ====
  - Yuta Sekine, [[:en:iiduka:|Hideaki Iiduka]]: **[[http://www.ybook.co.jp/online2/oplna/vol3/p203.html|Convergence Rate Analysis of Projected Stochastic Subgradient Method Using Conjugate Gradient-like Direction]]**, [[http://www.ybook.co.jp/lna.html|Linear and Nonlinear Analysis]], 3 (2): 203-211 (2017) [[http://www.ybook.co.jp/online-p/LNA/Open/1/lnav3n2p203-oa/FLASH/index.html|Open Access]]
  - Keigo Fujiwara, [[:en:iiduka:|Hideaki Iiduka]]: **[[http://www.ybook.co.jp/online2/oplna/vol3/p189.html|Modification of the Krasnosel'skii-Mann Fixed Point Algorithm by Using Three-term Conjugate Gradients]]**, [[http://www.ybook.co.jp/lna.html|Linear and Nonlinear Analysis]], 3 (2): 189-202 (2017) [[http://www.ybook.co.jp/online-p/LNA/Open/1/lnav3n2p189-oa/FLASH/index.html|Open Access]]
  - [[:en:iiduka:|Hideaki Iiduka]]: **[[http://www.tandfonline.com/doi/full/10.1080/02331934.2016.1252914|Almost Sure Convergence of Random Projected Proximal and Subgradient Algorithms for Distributed Nonsmooth Convex Optimization]]**, [[http://www.tandfonline.com/toc/gopt20/current|Optimization]] 66 (1): 35-59 (2017) {{:iiduka:GOPT_iiduka.pdf|PDF}}
  
==== Proceedings ====
===== 2016 =====
==== Publications in Refereed Journals ====
  - [[:en:iiduka:|Hideaki Iiduka]]: **[[http://link.springer.com/article/10.1007/s10107-015-0967-1|Convergence Analysis of Iterative Methods for Nonsmooth Convex Optimization over Fixed Point Sets of Quasi-Nonexpansive Mappings]],** [[http://www.springer.com/mathematics/journal/10107|Mathematical Programming]] 159 (1): 509-538 (2016) {{:iiduka:mp_iiduka2015.pdf|PDF}} [[http://arxiv.org/pdf/1510.06148.pdf|extended version]] [[https://rdcu.be/7uNw|Springer Nature SharedIt]]
  - [[:en:iiduka:|Hideaki Iiduka]]: **[[http://www.tandfonline.com/doi/full/10.1080/10556788.2016.1175002|Incremental Subgradient Method for Nonsmooth Convex Optimization With Fixed Point Constraints]],** [[http://www.tandfonline.com/action/journalInformation?show=aimsScope&journalCode=goms20#.VwB8ohOLT3A|Optimization Methods and Software]] 31 (5): 931-951 (2016) {{:iiduka:OMS2016.pdf|PDF}}
  - [[:en:iiduka:|Hideaki Iiduka]]: **[[http://www.fixedpointtheoryandapplications.com/content/2016/1/77|Line Search Fixed Point Algorithms Based on Nonlinear Conjugate Gradient Directions: Application to Constrained Smooth Convex Optimization]],** [[http://www.fixedpointtheoryandapplications.com/|Fixed Point Theory and Applications]] 2016: 77 (2016) [[http://download.springer.com/static/pdf/569/art%253A10.1186%252Fs13663-016-0567-7.pdf?originUrl=http%3A%2F%2Ffixedpointtheoryandapplications.springeropen.com%2Farticle%2F10.1186%2Fs13663-016-0567-7&token2=exp=1468029750~acl=%2Fstatic%2Fpdf%2F569%2Fart%25253A10.1186%25252Fs13663-016-0567-7.pdf*~hmac=c724b9129b32b7290cb6fd768ed09e09af38f0b498f645e497cdd990cca839af|PDF]]
  - [[:en:iiduka:|Hideaki Iiduka]]: **[[http://www.sciencedirect.com/science/article/pii/S0377221716301102|Proximal Point Algorithms for Nonsmooth Convex Optimization With Fixed Point Constraints]],** [[http://www.journals.elsevier.com/european-journal-of-operational-research/|European Journal of Operational Research]] 253 (2): 503-513 (2016) {{:iiduka:iiduka_EJOR.pdf|PDF}}
  - [[:en:iiduka:|Hideaki Iiduka]]: **[[http://www.tandfonline.com/doi/full/10.1080/01630563.2015.1080270|Optimization for Inconsistent Split Feasibility Problems]],** [[http://www.tandfonline.com/toc/lnfa20/current#.Vb7Z-ZPtlBc|Numerical Functional Analysis and Optimization]] 37 (2): 186-205 (2016) {{:iiduka:LNFA-2013-0155.pdf|PDF}}
  
==== Conference Activities & Talks ====
===== 2015 =====
==== Publications in Refereed Journals ====
  - Kazuhiro Hishinuma, [[:en:iiduka:|Hideaki Iiduka]]: **[[http://www.ybook.co.jp/online-p/JNCA/Open/16/jncav16n11p2243-oa/FLASH/index.html|On Acceleration of the Krasnosel’skii-Mann Fixed Point Algorithm Based on Conjugate Gradient Method for Smooth Optimization]],** [[http://www.ybook.co.jp/jnca.html|Journal of Nonlinear and Convex Analysis]]: Special Issue-Dedicated to Wataru Takahashi on the occasion of his 70th birthday 16 (11): 2243-2254 (2015) [[http://www.ybook.co.jp/online-p/JNCA/Open/16/jncav16n11p2243-oa/FLASH/index.html|Open Access]] {{:reserved:ap-2015-acm.pdf|PDF}}
  - [[:en:iiduka:|Hideaki Iiduka]]: **[[http://www.ybook.co.jp/online-p/JNCA/Open/16/jncav16n11p2159-oa/FLASH/index.html|Distributed Convex Optimization Algorithms and Their Application to Distributed Control in Peer-to-Peer Data Storage System]],** [[http://www.ybook.co.jp/jnca.html|Journal of Nonlinear and Convex Analysis]]: Special Issue-Dedicated to Wataru Takahashi on the occasion of his 70th birthday 16 (11): 2159-2179 (2015) [[http://www.ybook.co.jp/online-p/JNCA/Open/16/jncav16n11p2159-oa/FLASH/index.html|Open Access]] {{:iiduka:jncav16n11.pdf|PDF}}
  - [[:en:iiduka:|Hideaki Iiduka]]: **[[https://www.jstage.jst.go.jp/article/jorsj/58/4/58_330/_article|Parallel Optimization Algorithm for Smooth Convex Optimization over Fixed Point Sets of Quasi-Nonexpansive Mappings]],** [[https://www.jstage.jst.go.jp/browse/jorsj/58/4/_contents|Journal of the Operations Research Society of Japan]] 58 (4): 330-352 (2015) {{:iiduka:jorsj14-021.pdf|PDF}}
  - Kazuhiro Hishinuma, [[:en:iiduka:|Hideaki Iiduka]]: **[[http://www.ybook.co.jp/online2/oplna/vol1/p67.html|Parallel Subgradient Method for Nonsmooth Convex Optimization With a Simple Constraint]],** [[http://www.ybook.co.jp/lna.html|Linear and Nonlinear Analysis]] 1 (1): 67-77 (2015) {{:kaz:lnav1n1hi-ii-12.pdf|PDF}} [[http://www.ybook.co.jp/online-p/LNA/Open/1/lna1n1p67-oa/index.html|Open Access]]
  - [[:en:iiduka:|Hideaki Iiduka]]: **[[http://www.fixedpointtheoryandapplications.com/content/2015/1/72|Parallel Computing Subgradient Method for Nonsmooth Convex Optimization over the Intersection of Fixed Point Sets of Nonexpansive Mappings]],** [[http://www.fixedpointtheoryandapplications.com/|Fixed Point Theory and Applications]] 2015: 72 (2015) [[http://www.fixedpointtheoryandapplications.com/content/pdf/s13663-015-0319-0.pdf|PDF]]
  - [[:en:iiduka:|Hideaki Iiduka]]: **[[http://www.sciencedirect.com/science/article/pii/S0377042715000175|Convex Optimization over Fixed Point Sets of Quasi-Nonexpansive and Nonexpansive Mappings in Utility-Based Bandwidth Allocation Problems With Operational Constraints]],** [[http://www.journals.elsevier.com/journal-of-computational-and-applied-mathematics/|Journal of Computational and Applied Mathematics]] 282: 225-236 (2015) {{:iiduka:iiduka_cam2014.pdf|PDF}}
  - [[:en:iiduka:|Hideaki Iiduka]]: **[[http://link.springer.com/article/10.1007/s10107-013-0741-1|Acceleration Method for Convex Optimization over the Fixed Point Set of a Nonexpansive Mapping]],** [[http://www.springer.com/mathematics/journal/10107|Mathematical Programming]] 149 (1): 131-165 (2015) {{:iiduka:mp_iiduka.pdf|PDF}}
  - Masato Uchida, [[:en:iiduka:|Hideaki Iiduka]], Isao Sugino: **[[http://search.ieice.org/bin/pdf.php?lang=E&year=2015&fname=e98-b_1_33&abst=|Modeling User Behavior in P2P Data Storage System]],** [[http://search.ieice.org/bin/index.php?category=B&lang=E&curr=1|IEICE Transactions on Communications]]: Special Section on Quality of Diversifying Communication Networks and Services E98-B (1): 33-41 (2015) [[http://search.ieice.org/bin/pdf.php?lang=E&year=2015&fname=e98-b_1_33&abst=|PDF]]
  
==== Proceedings ====
===== 2014 =====
==== Publications in Refereed Journals ====
  - [[:en:iiduka:|Hideaki Iiduka]], Kazuhiro Hishinuma: **[[http://epubs.siam.org/doi/abs/10.1137/130939560|Acceleration Method Combining Broadcast and Incremental Distributed Optimization Algorithms]],** [[http://www.siam.org/journals/siopt.php|SIAM Journal on Optimization]] 24 (4): 1840–1863 (2014) {{:iiduka:iiduka_hishinuma2014.pdf|PDF}}
  - [[:en:iiduka:|Hideaki Iiduka]]: **[[http://www.ybook.co.jp/online2/pjov10.html|Distributed Iterative Methods for Solving Nonmonotone Variational Inequality over the Intersection of Fixed Point Sets of Nonexpansive Mappings]],** [[http://www.ybook.co.jp/pjo.html|Pacific Journal of Optimization]] 10 (4): 691-713 (2014) {{:iiduka:iiduka_PJO.pdf|PDF}}
  - [[:en:kaito:|Kaito Sakurai]], [[:en:iiduka:|Hideaki Iiduka]]: **[[http://www.fixedpointtheoryandapplications.com/content/2014/1/202|Acceleration of the Halpern Algorithm to Search for a Fixed Point of a Nonexpansive Mapping]],** [[http://www.fixedpointtheoryandapplications.com/|Fixed Point Theory and Applications]] 2014: 202 (2014) [[http://www.fixedpointtheoryandapplications.com/content/pdf/1687-1812-2014-202.pdf|PDF]]
  - Shigeru Iemoto, Kazuhiro Hishinuma, [[:en:iiduka:|Hideaki Iiduka]]: **[[http://www.fixedpointtheoryandapplications.com/content/2014/1/51|Approximate Solutions to Variational Inequality over the Fixed Point Set of a Strongly Nonexpansive Mapping]],** [[http://www.fixedpointtheoryandapplications.com/|Fixed Point Theory and Applications]] 2014: 51 (2014) [[http://www.fixedpointtheoryandapplications.com/content/pdf/1687-1812-2014-51.pdf|PDF]]
  
  
===== 2013 =====
==== Publications in Refereed Journals ====
  - [[:en:iiduka:|Hideaki Iiduka]]: **[[http://www.ybook.co.jp/online2/jncav14.html|Multicast Decentralized Optimization Algorithm for Network Resource Allocation Problems]],** [[http://www.ybook.co.jp/jnca.html|Journal of Nonlinear and Convex Analysis]] 14 (4): 817-839 (2013) [[http://www.ybook.co.jp/online2/jncav14.html|Open Access]]
  - [[:en:iiduka:|Hideaki Iiduka]]: **[[http://epubs.siam.org/doi/abs/10.1137/120866877|Fixed Point Optimization Algorithms for Distributed Optimization in Networked Systems]],** [[http://www.siam.org/journals/siopt.php|SIAM Journal on Optimization]] 23 (1): 1-26 (2013) {{:iiduka:SIOPT2013.pdf|PDF}}
  
  
===== 2004~2012 =====
  * Please refer to [[:en:iiduka|Iiduka's profile page]] for his publications published before 2012.