Publications

Preprints
preprints (arXiv.org Search Results)

2023

Books
Hideaki Iiduka: Algorithms for Continuous Optimization (Japanese), Ohmsha (2023)

Publications in Refereed Journals
Hideaki Iiduka: Theoretical Analysis of Adam using Hyperparameters Close to One without Lipschitz Smoothness, Numerical Algorithms ??: ??–?? (2023) PDF
Hideaki Iiduka: $\epsilon$-Approximation of Adaptive Learning Rate Optimization Algorithms for Constrained Nonconvex Stochastic Optimization, IEEE Transactions on Neural Networks and Learning Systems (Accepted) (2023) PDF
Hiroyuki Sakai, Hiroyuki Sato, Hideaki Iiduka: Global Convergence of Hager-Zhang type Riemannian Conjugate Gradient Method, Applied Mathematics and Computation 441: 127685 (2023) PDF

Proceedings
Naoki Sato, Hideaki Iiduka: Existence and Estimation of Critical Batch Size for Training Generative Adversarial Networks with Two Time-Scale Update Rule, Proceedings of the 40th International Conference on Machine Learning, PMLR 202: ??–?? (2023) PDF
Hiroki Naganuma, Hideaki Iiduka: Conjugate Gradient Method for Generative Adversarial Networks, Proceedings of the 26th International Conference on Artificial Intelligence and Statistics, PMLR 206: 4381–4408 (2023) PDF

Conference Activities & Talks
Yuki Tsukada, Hideaki Iiduka: Line Search Methods for Nonconvex Optimization in Deep Learning, The 10th International Congress on Industrial and Applied Mathematics (ICIAM), Waseda University, Tokyo, Japan (Aug. 20–25, 2023)
Naoki Sato, Hideaki Iiduka: Theoretical Analysis of Two Time-Scale Update Rule for Training GANs, The 10th International Congress on Industrial and Applied Mathematics (ICIAM), Waseda University, Tokyo, Japan (Aug. 20–25, 2023)
Naoki Sato, Hideaki Iiduka: Existence and Estimation of Critical Batch Size for Training Generative Adversarial Networks with Two Time-Scale Update Rule, The 40th International Conference on Machine Learning (ICML), Hawaii Convention Center, Honolulu, Hawaii, USA (Jul. 23–29, 2023)
Hiroki Naganuma, Hideaki Iiduka: Conjugate Gradient Method for Generative Adversarial Networks, The 26th International Conference on Artificial Intelligence and Statistics (AISTATS), Palau de Congressos, Valencia, Spain (Apr. 25–27, 2023)

2022

Publications in Refereed Journals
Hideaki Iiduka: Appropriate Learning Rates of Adaptive Learning Rate Optimization Algorithms for Training Deep Neural Networks, IEEE Transactions on Cybernetics 52 (12): 13250–13261 (2022) PDF
Hiroyuki Sakai, Hideaki Iiduka: Riemannian Adaptive Optimization Algorithm and Its Application to Natural Language Processing, IEEE Transactions on Cybernetics 52 (8): 7328–7339 (2022) PDF
Hideaki Iiduka, Hiroyuki Sakai: Riemannian Stochastic Fixed Point Optimization Algorithm, Numerical Algorithms 90: 1493–1517 (2022) PDF Springer Nature SharedIt
Yu Kobayashi, Hideaki Iiduka: Conjugate-gradient-based Adam for Nonconvex Stochastic Optimization and Its Application to Deep Learning, Journal of Nonlinear and Convex Analysis: Special issue: Memory of Wataru Takahashi 23 (2): 337–356 (2022) Open Access

2021

Publications in Refereed Journals
Kanako Shimoyama, Hideaki Iiduka: Adaptive Methods Using Element-wise $P$-th Power of Stochastic Gradient for Nonconvex Optimization in Deep Neural Networks, Linear and Nonlinear Analysis: Special issue: Memory of Wataru Takahashi and Naoki Shioji 7 (3): 317–336 (2021) Open Access
Kazuhiro Hishinuma, Hideaki Iiduka: Evaluation of Fixed Point Quasiconvex Subgradient Method with Computational Inexactness, Pure and Applied Functional Analysis: Special Issue on Optimization Theory dedicated to Terry Rockafellar on the occasion of his 85th birthday 6 (6): 1303–1316 (2021) Open Access
Yini Zhu, Hideaki Iiduka: Unified Algorithm Framework for Nonconvex Stochastic Optimization in Deep Neural Networks, IEEE Access 9: 143807–143823 (2021) Open Access
Hiroyuki Sakai, Hideaki Iiduka: Sufficient Descent Riemannian Conjugate Gradient Methods, Journal of Optimization Theory and Applications 190: 130–150 (2021) PDF Springer Nature SharedIt
Hideaki Iiduka: Inexact Stochastic Subgradient Projection Method for Stochastic Equilibrium Problems with Nonmonotone Bifunctions: Application to Expected Risk Minimization in Machine Learning, Journal of Global Optimization 80 (2): 479–505 (2021) PDF Springer Nature SharedIt
Hideaki Iiduka: Stochastic Approximation Method Using Diagonal Positive-Definite Matrices for Convex Optimization with Fixed Point Constraints, Fixed Point Theory and Algorithms for Sciences and Engineering: Topical Collection on Optimization and Real World Applications 2021: 10 (2021) Open Access Springer Nature SharedIt

Proceedings
Kanako Shimoyama, Hideaki Iiduka: Appropriate Gradients Used for Adaptive Learning Rate Optimization Algorithms for Training Deep Neural Networks, RIMS Kôkyûroku No. 2194, pp. 1–5 (2021) Open Access
Yini Zhu, Hiroyuki Sakai, Hideaki Iiduka: Training Neural Networks Using Adaptive Gradient Methods, RIMS Kôkyûroku No. 2194, pp. 6–12 (2021) Open Access
Hideaki Iiduka: Halpern-type Subgradient Methods for Convex Optimization over Fixed Point Sets of Nonexpansive Mappings, Proceedings of International Conference on Nonlinear Analysis and Convex Analysis and International Conference on Optimization: Techniques and Applications -I-, pp. 119–125 PDF

Conference Activities & Talks
Hiroyuki Sakai, Hideaki Iiduka: Riemannian conjugate gradient methods with sufficient descent search directions, The 2021 Fall National Conference of Operations Research Society of Japan, Kyushu University, online meeting (Sept. 16, 2021)
Koshiro Izumi, Hideaki Iiduka: Adaptive scaling conjugate gradient method for neural networks, RIMS Workshop on Advances in the Theory and Application of Mathematical Optimization, Research Institute for Mathematical Sciences, Kyoto University, online meeting (Aug. 19, 2021)
Yu Kobayashi, Hideaki Iiduka: Adaptive conjugate gradient method for deep learning, The 2021 Spring National Conference of Operations Research Society of Japan, Tokyo Institute of Technology, online meeting (Mar. 2, 2021)
Hiroyuki Sakai, Hideaki Iiduka: Extension of adaptive learning rate optimization algorithms to Riemannian manifolds and its application to natural language processing, The 2021 Spring National Conference of Operations Research Society of Japan, Tokyo Institute of Technology, online meeting (Mar. 2, 2021)
Yini Zhu, Hiroyuki Sakai, Hideaki Iiduka: Training neural networks using adaptive gradient methods, RIMS Workshop on Nonlinear Analysis and Convex Analysis, Research Institute for Mathematical Sciences, Kyoto University, online meeting (Mar. 1, 2021)
Kanako Shimoyama, Yu Kobayashi, Hideaki Iiduka: Appropriate stochastic gradients used in adaptive learning rate optimization algorithms for training deep neural networks, RIMS Workshop on Nonlinear Analysis and Convex Analysis, Research Institute for Mathematical Sciences, Kyoto University, online meeting (Mar. 1, 2021)

2020

Publications in Refereed Journals
Hiroyuki Sakai, Hideaki Iiduka: Hybrid Riemannian Conjugate Gradient Methods with Global Convergence Properties, Computational Optimization and Applications 77: 811–830 (2020) PDF
Hideaki Iiduka, Yu Kobayashi: Training Deep Neural Networks Using Conjugate Gradient-like Methods, Electronics 9 (11): 1809 (2020) Open Access Correction
Kengo Shimizu, Hideaki Iiduka: Computation Time of Iterative Methods for Nonsmooth Convex Optimization With Fixed Point Constraints of Quasi-Nonexpansive Mappings, Linear and Nonlinear Analysis 6 (2): 281–286 (2020) PDF
Hideaki Iiduka: Stochastic Fixed Point Optimization Algorithm for Classifier Ensemble, IEEE Transactions on Cybernetics 50 (10): 4370–4380 (2020) PDF
Kazuhiro Hishinuma, Hideaki Iiduka: Efficiency of Inexact Fixed Point Quasiconvex Subgradient Method, Linear and Nonlinear Analysis 6 (1): 35–48 (2020) PDF
Kengo Shimizu, Kazuhiro Hishinuma, Hideaki Iiduka: Parallel Computing Proximal Method for Nonsmooth Convex Optimization With Fixed Point Constraints of Quasi-nonexpansive Mappings, Applied Set-Valued Analysis and Optimization 2 (1): 1–17 (2020) PDF
Hideaki Iiduka: Decentralized Hierarchical Constrained Convex Optimization, Optimization and Engineering 21 (1): 181–213 (2020) PDF Springer Nature SharedIt
Kazuhiro Hishinuma, Hideaki Iiduka: Fixed Point Quasiconvex Subgradient Method, European Journal of Operational Research 282 (2): 428–437 (2020) PDF

Doctoral Thesis
Kazuhiro Hishinuma: Fixed Point Subgradient Methods for Constrained Nonsmooth Optimization, Meiji University, 2020 PDF

Conference Activities & Talks
Yu Kobayashi, Hideaki Iiduka: Adaptive optimization method with stochastic conjugate gradient direction and its application to image classification in deep learning, The 2020 Spring National Conference of Operations Research Society of Japan, Nara Kasugano International Forum (Mar. 12, 2020)
Kazuhiro Hishinuma, Hideaki Iiduka: Error evaluation of fixed point quasiconvex subgradient method, The 2020 Spring National Conference of Operations Research Society of Japan, Nara Kasugano International Forum (Mar. 12, 2020)
Haruhi Oishi, Hideaki Iiduka: Resource allocation using fixed point approximation method for 5G network, The 2020 Spring National Conference of Operations Research Society of Japan, Nara Kasugano International Forum (Mar. 12, 2020)
Kengo Shimizu, Hideaki Iiduka: Computational time comparisons for parallel proximal point and subgradient methods for nonsmooth convex optimization over fixed point set of quasi-nonexpansive mapping, The 2020 Spring National Conference of Operations Research Society of Japan, Nara Kasugano International Forum (Mar. 12, 2020)
Hiroyuki Sakai, Hideaki Iiduka: A new conjugate gradient method on Riemannian manifolds, The 2020 Spring National Conference of Operations Research Society of Japan, Nara Kasugano International Forum (Mar. 12, 2020)

2019

Publications in Refereed Journals
Haruhi Oishi, Yu Kobayashi, Hideaki Iiduka: Incremental Proximal Method for Nonsmooth Convex Optimization With Fixed Point Constraints of Quasi-nonexpansive Mappings, Linear and Nonlinear Analysis 5 (3): 477–493 (2019) PDF
Hideaki Iiduka: Distributed Optimization for Network Resource Allocation With Nonsmooth Utility Functions, IEEE Transactions on Control of Network Systems 6 (4): 1354–1365 (2019) PDF
Kazuhiro Hishinuma, Hideaki Iiduka: Convergence Analysis of Incremental and Parallel Line Search Subgradient Methods in Hilbert Space, Journal of Nonlinear and Convex Analysis: Special Issue Dedicated to Wataru Takahashi on the occasion of his 75th birthday 20 (9): 1937–1947 (2019) PDF
Kazuhiro Hishinuma, Hideaki Iiduka: Incremental and Parallel Machine Learning Algorithms With Automated Learning Rate Adjustments, Frontiers in Robotics and AI: Resolution of Limitations of Deep Learning to Develop New AI Paradigms 6: Article 77 (2019) PDF
Hideaki Iiduka: Two Stochastic Optimization Algorithms for Convex Optimization With Fixed Point Constraints, Optimization Methods and Software 34 (4): 731–757 (2019) PDF
Kaito Sakurai, Takayuki Jimba, Hideaki Iiduka: Iterative Methods for Parallel Convex Optimization With Fixed Point Constraints, Journal of Nonlinear and Variational Analysis 3 (2): 115–126 (2019) Open Access

Proceedings
Kazuhiro Hishinuma, Hideaki Iiduka: Applying Conditional Subgradient-like Directions to the Modified Krasnosel’skiĭ-Mann Fixed Point Algorithm Based on the Three-term Conjugate Gradient Method, Proceedings of the 10th International Conference on Nonlinear Analysis and Convex Analysis, pp. 59–67 Open Access

Honor Lecture
Hideaki Iiduka: Convex optimization with complicated constraint and its application, The 2019 Fall National Conference of Operations Research Society of Japan, Higashi Hiroshima Arts & Culture Hall Kurara (Sep. 12, 2019)
Hideaki Iiduka: Fixed point algorithms and their applications, The International Conference on Nonlinear Analysis and Convex Analysis–International Conference on Optimization: Techniques and Applications (NACA-ICOTA2019), Future University Hakodate (Aug. 27, 2019)

Conference Activities & Talks
Yu Kobayashi, Hideaki Iiduka: Stochastic optimization algorithm using conjugate gradient direction and its application to deep learning, The 2019 Fall National Conference of Operations Research Society of Japan, Higashi Hiroshima Arts & Culture Hall Kurara (Sep. 12, 2019)
Kazuhiro Hishinuma, Hideaki Iiduka: On rate of convergence of fixed point subgradient method, The 2019 Fall National Conference of Operations Research Society of Japan, Higashi Hiroshima Arts & Culture Hall Kurara (Sep. 12, 2019)
Kengo Shimizu, Hideaki Iiduka: Nonsmooth convex optimization over fixed point sets of quasi-nonexpansive mappings by using parallel proximal point algorithm, The 2019 Fall National Conference of Operations Research Society of Japan, Higashi Hiroshima Arts & Culture Hall Kurara (Sep. 12, 2019)
Kazuhiro Hishinuma, Hideaki Iiduka: Convergence rate analyses of fixed point quasiconvex subgradient method, The International Conference on Nonlinear Analysis and Convex Analysis–International Conference on Optimization: Techniques and Applications (NACA-ICOTA2019), Future University Hakodate (Aug. 27, 2019)

2018

Publications in Refereed Journals
Keigo Fujiwara, Kazuhiro Hishinuma, Hideaki Iiduka: Evaluation of Stochastic Approximation Algorithm and Variants for Learning Support Vector Machines, Linear and Nonlinear Analysis 4 (1): 29–61 (2018) Open Access
Yoichi Hayashi, Hideaki Iiduka: Optimality and Convergence for Convex Ensemble Learning With Sparsity and Diversity Based on Fixed Point Optimization, Neurocomputing 273: 367–372 (2018) PDF

Invited Talks
Hideaki Iiduka: Decentralized Optimization and Its Applications, The 6th Asian Conference on Nonlinear Analysis and Optimization, ANA InterContinental Manza Beach Resort (Nov. 6, 2018)

Conference Activities & Talks
Kazuhiro Hishinuma, Hideaki Iiduka: Convergence property, computational performance, and usability of fixed point quasiconvex subgradient method, The 6th Asian Conference on Nonlinear Analysis and Optimization, Okinawa Institute of Science and Technology Graduate University (Nov. 7, 2018)
Yu Kobayashi, Hideaki Iiduka: Stochastic subgradient projection method for nonmonotone equilibrium problems and its application to multiclass classification, The 6th Asian Conference on Nonlinear Analysis and Optimization, Okinawa Institute of Science and Technology Graduate University (Nov. 5, 2018)
Hideo Yoshizato, Hideaki Iiduka: Stochastic fixed point optimization algorithm for classifier ensemble with sparsity and diversity learning and its application, The 6th Asian Conference on Nonlinear Analysis and Optimization, Okinawa Institute of Science and Technology Graduate University (Nov. 5, 2018)
Yu Kobayashi, Hideaki Iiduka: Stochastic subgradient method for stochastic equilibrium problems with nonmonotone bifunctions and its application to multiclass classification, RIMS Workshop on Nonlinear Analysis and Convex Analysis, Research Institute for Mathematical Sciences, Kyoto University (Aug. 29, 2018)
Kazuhiro Hishinuma, Hideaki Iiduka: Application of incremental and parallel subgradient methods to learning a support vector machine and its advantages and disadvantages, The 2018 Spring National Conference of Operations Research Society of Japan, Tokai University (Mar. 15, 2018)
Hideo Yoshizato, Hideaki Iiduka: Stochastic fixed point optimization algorithm for ensemble learning with sparsity and diversity, The 2018 Spring National Conference of Operations Research Society of Japan, Tokai University (Mar. 15, 2018)
2017

Publications in Refereed Journals
Yuta Sekine, Hideaki Iiduka: Convergence Rate Analysis of Projected Stochastic Subgradient Method Using Conjugate Gradient-like Direction, Linear and Nonlinear Analysis 3 (2): 203–211 (2017) Open Access
Keigo Fujiwara, Hideaki Iiduka: Modification of the Krasnosel'skii-Mann Fixed Point Algorithm by Using Three-term Conjugate Gradients, Linear and Nonlinear Analysis 3 (2): 189–202 (2017) Open Access
Hideaki Iiduka: Almost Sure Convergence of Random Projected Proximal and Subgradient Algorithms for Distributed Nonsmooth Convex Optimization, Optimization 66 (1): 35–59 (2017) PDF

Proceedings
Hideaki Iiduka: Nonsmooth Convex Optimization With Fixed Point Constraints and Its Applications, Proceedings of the Twenty-Ninth RAMP Symposium, pp. 125–142 (2017)

Invited Talks
Hideaki Iiduka: Nonsmooth Convex Optimization With Fixed Point Constraints and Its Applications, The 29th RAMP (Research Association of Mathematical Programming) Symposium, Tsukuba University, Tsukuba, Japan (Oct. 12–13, 2017)

Conference Activities & Talks
Kazuhiro Hishinuma, Hideaki Iiduka: Iterative method for solving constrained quasiconvex optimization problems based on the Krasnosel'skiĭ-Mann fixed point approximation method, RIMS Workshop on Nonlinear Analysis and Convex Analysis, Research Institute for Mathematical Sciences, Kyoto University (Sep. 1, 2017)
Kazuhiro Hishinuma, Hideaki Iiduka: Quasi-subgradient method for quasiconvex minimization problem with fixed point constraints, RIMS Workshop on Development of Mathematical Optimization: Modeling and Algorithms, Research Institute for Mathematical Sciences, Kyoto University (Aug. 25, 2017)
Kazuhiro Hishinuma, Hideaki Iiduka: Flexible stepsize selection of subgradient methods for constrained convex optimization, The 10th Anniversary Conference on Nonlinear Analysis and Convex Analysis, Chitose City Cultural Center (Jul. 7, 2017)

2016

Publications in Refereed Journals
Hideaki Iiduka: Convergence Analysis of Iterative Methods for Nonsmooth Convex Optimization over Fixed Point Sets of Quasi-Nonexpansive Mappings, Mathematical Programming 159 (1): 509–538 (2016) PDF extended version Springer Nature SharedIt
Hideaki Iiduka: Incremental Subgradient Method for Nonsmooth Convex Optimization With Fixed Point Constraints, Optimization Methods and Software 31 (5): 931–951 (2016) PDF
Hideaki Iiduka: Line Search Fixed Point Algorithms Based on Nonlinear Conjugate Gradient Directions: Application to Constrained Smooth Convex Optimization, Fixed Point Theory and Applications 2016: 77 (2016) PDF
Hideaki Iiduka: Proximal Point Algorithms for Nonsmooth Convex Optimization With Fixed Point Constraints, European Journal of Operational Research 253 (2): 503–513 (2016) PDF
Hideaki Iiduka: Optimization for Inconsistent Split Feasibility Problems, Numerical Functional Analysis and Optimization 37 (2): 186–205 (2016) PDF

Conference Activities & Talks
Kazuhiro Hishinuma, Hideaki Iiduka: Acceleration approach for parallel subgradient method based on line search, The 2016 Fall National Conference of Operations Research Society of Japan, Yamagata University (Sept. 15–16, 2016)
Kaito Sakurai, Hideaki Iiduka: Parallel computing method for nonsmooth convex optimization with fixed point constraints, The 2016 Fall National Conference of Operations Research Society of Japan, Yamagata University (Sept. 15–16, 2016)
Yoshiharu Nohara, Hideaki Iiduka: Line search subgradient methods for a convex optimization problem and its dual problem, The 2016 Fall National Conference of Operations Research Society of Japan, Yamagata University (Sept. 15–16, 2016)
Takayuki Jimba, Kaito Sakurai, Hideaki Iiduka: Halpern-type proximal point algorithm for nonsmooth convex optimization with fixed point constraints, The 2016 Fall National Conference of Operations Research Society of Japan, Yamagata University (Sept. 15–16, 2016)
Shizuka Nishino, Hideaki Iiduka: Numerical methods for nonnegative matrix factorization based on fixed point theory, The 2016 Fall National Conference of Operations Research Society of Japan, Yamagata University (Sept. 15–16, 2016)

2015

Publications in Refereed Journals
Kazuhiro Hishinuma, Hideaki Iiduka: On Acceleration of the Krasnosel’skii-Mann Fixed Point Algorithm Based on Conjugate Gradient Method for Smooth Optimization, Journal of Nonlinear and Convex Analysis: Special Issue Dedicated to Wataru Takahashi on the occasion of his 70th birthday 16 (11): 2243–2254 (2015) Open Access PDF
Hideaki Iiduka: Distributed Convex Optimization Algorithms and Their Application to Distributed Control in Peer-to-Peer Data Storage System, Journal of Nonlinear and Convex Analysis: Special Issue Dedicated to Wataru Takahashi on the occasion of his 70th birthday 16 (11): 2159–2179 (2015) Open Access PDF
Hideaki Iiduka: Parallel Optimization Algorithm for Smooth Convex Optimization over Fixed Point Sets of Quasi-Nonexpansive Mappings, Journal of the Operations Research Society of Japan 58 (4): 330–352 (2015) PDF
Kazuhiro Hishinuma, Hideaki Iiduka: Parallel Subgradient Method for Nonsmooth Convex Optimization With a Simple Constraint, Linear and Nonlinear Analysis 1 (1): 67–77 (2015) PDF Open Access
Hideaki Iiduka: Parallel Computing Subgradient Method for Nonsmooth Convex Optimization over the Intersection of Fixed Point Sets of Nonexpansive Mappings, Fixed Point Theory and Applications 2015: 72 (2015) PDF
Hideaki Iiduka: Convex Optimization over Fixed Point Sets of Quasi-Nonexpansive and Nonexpansive Mappings in Utility-Based Bandwidth Allocation Problems With Operational Constraints, Journal of Computational and Applied Mathematics 282: 225–236 (2015) PDF
Hideaki Iiduka: Acceleration Method for Convex Optimization over the Fixed Point Set of a Nonexpansive Mapping, Mathematical Programming 149 (1): 131–165 (2015) PDF
Masato Uchida, Hideaki Iiduka, Isao Sugino: Modeling User Behavior in P2P Data Storage System, IEICE Transactions on Communications: Special Section on Quality of Diversifying Communication Networks and Services E98-B (1): 33–41 (2015) PDF

Proceedings
Kazuhiro Hishinuma, Hideaki Iiduka: Parallel Computing Method for Nonsmooth Convex Optimization, RIMS Kôkyûroku No. 1963, pp. 71–77 Open Access

2014

Publications in Refereed Journals
Hideaki Iiduka, Kazuhiro Hishinuma: Acceleration Method Combining Broadcast and Incremental Distributed Optimization Algorithms, SIAM Journal on Optimization 24 (4): 1840–1863 (2014) PDF
Hideaki Iiduka: Distributed Iterative Methods for Solving Nonmonotone Variational Inequality over the Intersection of Fixed Point Sets of Nonexpansive Mappings, Pacific Journal of Optimization 10 (4): 691–713 (2014) PDF
Kaito Sakurai, Hideaki Iiduka: Acceleration of the Halpern Algorithm to Search for a Fixed Point of a Nonexpansive Mapping, Fixed Point Theory and Applications 2014: 202 (2014) PDF
Shigeru Iemoto, Kazuhiro Hishinuma, Hideaki Iiduka: Approximate Solutions to Variational Inequality over the Fixed Point Set of a Strongly Nonexpansive Mapping, Fixed Point Theory and Applications 2014: 51 (2014) PDF

Conference Activities & Talks
Kazuhiro Hishinuma, Hideaki Iiduka: On Parallel Computing Method for Nonsmooth Convex Optimization, The 2014 Fall National Conference of Operations Research Society of Japan, Hokkaido University of Science (Aug. 28–29, 2014)
Kazuhiro Hishinuma, Hideaki Iiduka: Parallel Algorithm for Nonsmooth Convex Optimization, The International Workshop on Nonlinear Analysis and Convex Analysis, Research Institute for Mathematical Sciences, Kyoto University (Aug. 19–21, 2014)

2013

Publications in Refereed Journals
Hideaki Iiduka: Multicast Decentralized Optimization Algorithm for Network Resource Allocation Problems, Journal of Nonlinear and Convex Analysis 14 (4): 817–839 (2013) Open Access
Hideaki Iiduka: Fixed Point Optimization Algorithms for Distributed Optimization in Networked Systems, SIAM Journal on Optimization 23 (1): 1–26 (2013) PDF

2004~2012

Please refer to Iiduka's profile page for his publications from 2004 to 2012.