Publications
Preprints
- For current preprints, see the arXiv.org search results.
2025
Publications in Refereed Journals
- Naoki Sato, Koshiro Izumi, Hideaki Iiduka: Scaled Conjugate Gradient Method for Nonconvex Optimization in Deep Neural Networks, Journal of Machine Learning Research 25: ???–??? (2024)
- Kento Imaizumi, Hideaki Iiduka: Iteration and Stochastic First-order Oracle Complexities of Stochastic Gradient Descent using Constant and Decaying Learning Rates, Optimization: Special Issue dedicated to Dr. Alexander J. Zaslavski on the occasion of his 65th birthday ?? (?): ??–?? (2024) Open Access
Conference Activities & Talks
- Naoki Sato, Hideaki Iiduka: Explicit and Implicit Graduated Optimization in Deep Neural Networks, The 39th Annual AAAI Conference on Artificial Intelligence (AAAI-25), Pennsylvania Convention Center, Philadelphia, Pennsylvania, USA (Feb. 27 – Mar. 4, 2025)
2024
Publications in Refereed Journals
- Hiroyuki Sakai, Hideaki Iiduka: Convergence of Riemannian Stochastic Gradient Descent on Hadamard Manifold, Pacific Journal of Optimization: Special issue: Dedicated to Prof. Masao Fukushima on the occasion of his 75th birthday 20 (4): 743–767 (2024)
Awards
- Naoki Sato: The 5th TOME Research Institute Young Researchers' Paper Competition, Excellence Award (100,000 yen prize) (Nov. 1, 2024)
Conference Activities & Talks
- Naoki Sato, Hideaki Iiduka: Global Optimization for Empirical Risk Function by Graduated Optimization with Stochastic Noise in SGD, The 55th Information-Based Induction Sciences and Machine Learning (IBISML), Lecture room 1, Graduate School of Environmental Science, Hokkaido University (Dec. 21, 2024)
- Naoki Sato, Hideaki Iiduka: Global optimization for empirical risk minimization problems using a graduated optimization algorithm with the smoothing effect of stochastic gradient descent, The 27th Information-Based Induction Sciences Workshop, Sonic City Hall (Nov. 5, 2024)
- Hiroyuki Sakai, Hideaki Iiduka: Application of Riemannian Optimization Algorithms to Eigenvalue Problems, The 2024 National Conference of The Japan Society for Industrial and Applied Mathematics: Algorithms for Matrix / Eigenvalue Problems and their Applications, Kyoto University (Sept. 14, 2024)
- Naoki Sato, Hideaki Iiduka: Role of Momentum in Smoothing Objective Function and Generalizability of Deep Neural Networks, The 2024 Fall National Conference of Operations Research Society of Japan, Nanzan University (Sept. 11, 2024)
- Hiroyuki Sakai, Hideaki Iiduka: Modification and extension of memoryless spectral-scaling Broyden family on Riemannian manifolds, The 2024 Spring National Conference of Operations Research Society of Japan, Kasuga Area, Tsukuba Campus, University of Tsukuba (Mar. 7, 2024)
- Naoki Sato, Hideaki Iiduka: Global optimization of deep neural networks for graduated optimization method using smoothness of stochastic gradient descent, The 2024 Spring National Conference of Operations Research Society of Japan, Kasuga Area, Tsukuba Campus, University of Tsukuba (Mar. 7, 2024)
2023
Publications in Refereed Journals
- Hiroyuki Sakai, Hiroyuki Sato, Hideaki Iiduka: Global Convergence of Hager-Zhang type Riemannian Conjugate Gradient Method, Applied Mathematics and Computation 441, 127685 (2023) PDF
Conference Activities & Talks
- Naoki Sato, Hideaki Iiduka: Existence and Estimation of Critical Batch Size for Training GANs with Two Time-Scale Update Rule, RIMS Workshop on Mathematical Optimization: Theory and Practice, Research Institute for Mathematical Sciences, Kyoto University, Hybrid meeting (Aug. 28, 2023)
- Hiroyuki Sakai, Hideaki Iiduka: Adaptive Learning Rate Optimization Algorithms for Riemannian Optimization, The 10th International Congress on Industrial and Applied Mathematics (ICIAM), Waseda University, Tokyo, Japan (Aug. 20–25, 2023)
- Yuki Tsukada, Hideaki Iiduka: Line Search Methods for Nonconvex Optimization in Deep Learning, The 10th International Congress on Industrial and Applied Mathematics (ICIAM), Waseda University, Tokyo, Japan (Aug. 20–25, 2023)
- Naoki Sato, Hideaki Iiduka: Theoretical Analysis of Two Time-Scale Update Rule for Training GANs, The 10th International Congress on Industrial and Applied Mathematics (ICIAM), Waseda University, Tokyo, Japan (Aug. 20–25, 2023)
- Naoki Sato, Hideaki Iiduka: Existence and Estimation of Critical Batch Size for Training Generative Adversarial Networks with Two Time-Scale Update Rule, The 40th International Conference on Machine Learning (ICML), Hawaii Convention Center, Honolulu, Hawaii, USA (Jul. 23–29, 2023)
- Hiroki Naganuma, Hideaki Iiduka: Conjugate Gradient Method for Generative Adversarial Networks, The 26th International Conference on Artificial Intelligence and Statistics (AISTATS), Palau de Congressos, Valencia, Spain (Apr. 25–27, 2023)
2022
Publications in Refereed Journals
- Hiroyuki Sakai, Hideaki Iiduka: Riemannian Adaptive Optimization Algorithm and Its Application to Natural Language Processing, IEEE Transactions on Cybernetics 52 (8): 7328–7339 (2022) PDF
- Hideaki Iiduka, Hiroyuki Sakai: Riemannian Stochastic Fixed Point Optimization Algorithm, Numerical Algorithms 90: 1493–1517 (2022) PDF Springer Nature SharedIt
- Yu Kobayashi, Hideaki Iiduka: Conjugate-gradient-based Adam for Nonconvex Stochastic Optimization and Its Application to Deep Learning, Journal of Nonlinear and Convex Analysis: Special issue: Memory of Wataru Takahashi 23 (2): 337–356 (2022) Open Access
Conference Activities & Talks
- Hiroyuki Sakai, Hiroyuki Sato, Hideaki Iiduka: HZ-type Conjugate Gradient Method on Riemannian Manifolds and Its Application to Eigenvalue Problems, The 2022 National Conference of The Japan Society for Industrial and Applied Mathematics: Algorithms for Matrix / Eigenvalue Problems and their Applications, Hokkaido University Institute for the Advancement of Higher Education (Sept. 8, 2022)
2021
Publications in Refereed Journals
- Kanako Shimoyama, Hideaki Iiduka: Adaptive Methods Using Element-wise $P$-th Power of Stochastic Gradient for Nonconvex Optimization in Deep Neural Networks, Linear and Nonlinear Analysis: Special issue: Memory of Wataru Takahashi and Naoki Shioji 7 (3): 317–336 (2021) Open Access
- Kazuhiro Hishinuma, Hideaki Iiduka: Evaluation of Fixed Point Quasiconvex Subgradient Method with Computational Inexactness, Pure and Applied Functional Analysis: Special Issue on Optimization Theory dedicated to Terry Rockafellar on the occasion of his 85th birthday 6 (6): 1303–1316 (2021) Open Access
- Yini Zhu, Hideaki Iiduka: Unified Algorithm Framework for Nonconvex Stochastic Optimization in Deep Neural Networks, IEEE Access 9: 143807–143823 (2021) Open Access
- Hiroyuki Sakai, Hideaki Iiduka: Sufficient Descent Riemannian Conjugate Gradient Methods, Journal of Optimization Theory and Applications 190: 130–150 (2021) PDF Springer Nature SharedIt
- Hideaki Iiduka: Stochastic Approximation Method Using Diagonal Positive-Definite Matrices for Convex Optimization with Fixed Point Constraints, Fixed Point Theory and Algorithms for Sciences and Engineering: Topical Collection on Optimization and Real World Applications 2021: 10 (2021) Open Access Springer Nature SharedIt
Proceedings
- Kanako Shimoyama, Hideaki Iiduka: Appropriate Gradients Used for Adaptive Learning Rate Optimization Algorithms for Training Deep Neural Networks, RIMS Kôkyûroku No.2194, pp. 1–5, 2021 Open Access
- Yini Zhu, Hiroyuki Sakai, Hideaki Iiduka: Training Neural Networks Using Adaptive Gradient Methods, RIMS Kôkyûroku No.2194, pp. 6–12, 2021 Open Access
- Hideaki Iiduka: Halpern-type Subgradient Methods for Convex Optimization over Fixed Point Sets of Nonexpansive Mappings, Proceedings of International Conference on Nonlinear Analysis and Convex Analysis and International Conference on Optimization: Techniques and Applications -I-, pp. 119–125 PDF
Conference Activities & Talks
- Hiroyuki Sakai, Hideaki Iiduka: Riemannian conjugate gradient methods with sufficient descent search directions, The 2021 Fall National Conference of Operations Research Society of Japan, Kyushu University, Online meeting (Sept. 16, 2021)
- Koshiro Izumi, Hideaki Iiduka: Adaptive scaling conjugate gradient method for neural networks, RIMS Workshop on Advances in the Theory and Application of Mathematical Optimization, Research Institute for Mathematical Sciences, Kyoto University, Online meeting (Aug. 19, 2021)
- Yu Kobayashi, Hideaki Iiduka: Adaptive conjugate gradient method for deep learning, The 2021 Spring National Conference of Operations Research Society of Japan, Tokyo Institute of Technology, Online meeting (Mar. 2, 2021)
- Hiroyuki Sakai, Hideaki Iiduka: Extension of adaptive learning rate optimization algorithms to Riemannian manifolds and its application to natural language processing, The 2021 Spring National Conference of Operations Research Society of Japan, Tokyo Institute of Technology, Online meeting (Mar. 2, 2021)
- Yini Zhu, Hiroyuki Sakai, Hideaki Iiduka: Training neural networks using adaptive gradient methods, RIMS Workshop on Nonlinear Analysis and Convex Analysis, Research Institute for Mathematical Sciences, Kyoto University, Online meeting (Mar. 1, 2021)
- Kanako Shimoyama, Yu Kobayashi, Hideaki Iiduka: Appropriate stochastic gradients used in adaptive learning rate optimization algorithms for training deep neural networks, RIMS Workshop on Nonlinear Analysis and Convex Analysis, Research Institute for Mathematical Sciences, Kyoto University, Online meeting (Mar. 1, 2021)
2020
Publications in Refereed Journals
- Hiroyuki Sakai, Hideaki Iiduka: Hybrid Riemannian Conjugate Gradient Methods with Global Convergence Properties, Computational Optimization and Applications 77: 811–830 (2020) PDF
- Hideaki Iiduka, Yu Kobayashi: Training Deep Neural Networks Using Conjugate Gradient-like Methods, Electronics 9 (11): 1809 (2020) Open Access Correction
- Hideaki Iiduka: Stochastic Fixed Point Optimization Algorithm for Classifier Ensemble, IEEE Transactions on Cybernetics 50 (10): 4370–4380 (2020) PDF
- Kazuhiro Hishinuma, Hideaki Iiduka: Efficiency of Inexact Fixed Point Quasiconvex Subgradient Method, Linear and Nonlinear Analysis 6 (1): 35–48 (2020) PDF
- Kengo Shimizu, Kazuhiro Hishinuma, Hideaki Iiduka: Parallel Computing Proximal Method for Nonsmooth Convex Optimization With Fixed Point Constraints of Quasi-nonexpansive Mappings, Applied Set-Valued Analysis and Optimization 2 (1): 1–17 (2020) PDF
- Kazuhiro Hishinuma, Hideaki Iiduka: Fixed Point Quasiconvex Subgradient Method, European Journal of Operational Research 282 (2): 428–437 (2020) PDF
Doctoral Thesis
- Kazuhiro Hishinuma: Fixed Point Subgradient Methods for Constrained Nonsmooth Optimization, Meiji University, 2020 PDF
Conference Activities & Talks
- Yu Kobayashi, Hideaki Iiduka: Adaptive optimization method with stochastic conjugate gradient direction and its application to image classification in deep learning, The 2020 Spring National Conference of Operations Research Society of Japan, Nara Kasugano International Forum (Mar. 12, 2020).
- Kazuhiro Hishinuma, Hideaki Iiduka: Error evaluation of fixed point quasiconvex subgradient method, The 2020 Spring National Conference of Operations Research Society of Japan, Nara Kasugano International Forum (Mar. 12, 2020).
- Haruhi Oishi, Hideaki Iiduka: Resource allocation using fixed point approximation method for 5G network, The 2020 Spring National Conference of Operations Research Society of Japan, Nara Kasugano International Forum (Mar. 12, 2020).
- Kengo Shimizu, Hideaki Iiduka: Computational time comparisons for parallel proximal point and subgradient methods for nonsmooth convex optimization over fixed point set of quasi-nonexpansive mapping, The 2020 Spring National Conference of Operations Research Society of Japan, Nara Kasugano International Forum (Mar. 12, 2020).
- Hiroyuki Sakai, Hideaki Iiduka: A new conjugate gradient method on Riemannian manifolds, The 2020 Spring National Conference of Operations Research Society of Japan, Nara Kasugano International Forum (Mar. 12, 2020).
2019
Publications in Refereed Journals
- Haruhi Oishi, Yu Kobayashi, Hideaki Iiduka: Incremental Proximal Method for Nonsmooth Convex Optimization With Fixed Point Constraints of Quasi-nonexpansive Mappings, Linear and Nonlinear Analysis 5 (3): 477–493 (2019) PDF
- Kazuhiro Hishinuma, Hideaki Iiduka: Convergence Analysis of Incremental and Parallel Line Search Subgradient Methods in Hilbert Space, Journal of Nonlinear and Convex Analysis: Special Issue-Dedicated to Wataru Takahashi on the occasion of his 75th birthday 20 (9): 1937–1947 (2019) PDF
- Kaito Sakurai, Takayuki Jimba, Hideaki Iiduka: Iterative Methods for Parallel Convex Optimization With Fixed Point Constraints, Journal of Nonlinear and Variational Analysis 3 (2): 115–126 (2019) Open Access
Proceedings
- Kazuhiro Hishinuma, Hideaki Iiduka: Applying Conditional Subgradient-like Directions to the Modified Krasnosel’skiĭ-Mann Fixed Point Algorithm Based on the Three-term Conjugate Gradient Method, Proceedings of the 10th International Conference on Nonlinear Analysis and Convex Analysis, pp. 59–67 Open Access
Honorary Lectures
- Hideaki Iiduka: Convex optimization with complicated constraint and its application, The 2019 Fall National Conference of Operations Research Society of Japan, Higashi Hiroshima Arts & Culture Hall Kurara (Sep. 12, 2019).
- Hideaki Iiduka: Fixed point algorithms and their applications, The International Conference on Nonlinear Analysis and Convex Analysis–International Conference on Optimization: Techniques and Applications (NACA-ICOTA2019), Future University Hakodate (Aug. 27, 2019)
Conference Activities & Talks
- Yu Kobayashi, Hideaki Iiduka: Stochastic optimization algorithm using conjugate gradient direction and its application to deep learning, The 2019 Fall National Conference of Operations Research Society of Japan, Higashi Hiroshima Arts & Culture Hall Kurara (Sep. 12, 2019).
- Kazuhiro Hishinuma, Hideaki Iiduka: On rate of convergence of fixed point subgradient method, The 2019 Fall National Conference of Operations Research Society of Japan, Higashi Hiroshima Arts & Culture Hall Kurara (Sep. 12, 2019).
- Kengo Shimizu, Hideaki Iiduka: Nonsmooth convex optimization over fixed point sets of quasi-nonexpansive mappings by using parallel proximal point algorithm, The 2019 Fall National Conference of Operations Research Society of Japan, Higashi Hiroshima Arts & Culture Hall Kurara (Sep. 12, 2019).
- Kazuhiro Hishinuma, Hideaki Iiduka: Convergence rate analyses of fixed point quasiconvex subgradient method, The International Conference on Nonlinear Analysis and Convex Analysis–International Conference on Optimization: Techniques and Applications (NACA-ICOTA2019), Future University Hakodate (Aug. 27, 2019)
2018
Publications in Refereed Journals
- Keigo Fujiwara, Kazuhiro Hishinuma, Hideaki Iiduka: Evaluation of Stochastic Approximation Algorithm and Variants for Learning Support Vector Machines, Linear and Nonlinear Analysis 4 (1): 29–61 (2018) Open Access
Invited Talks
- Hideaki Iiduka: Decentralized Optimization and Its Applications, the 6th Asian Conference on Nonlinear Analysis and Optimization, ANA Intercontinental Manza Beach Resort (Nov. 6, 2018)
Conference Activities & Talks
- Kazuhiro Hishinuma, Hideaki Iiduka: Convergence property, computational performance, and usability of fixed point quasiconvex subgradient method, the 6th Asian Conference on Nonlinear Analysis and Optimization, Okinawa Institute of Science and Technology Graduate University (Nov. 7, 2018)
- Yu Kobayashi, Hideaki Iiduka: Stochastic subgradient projection method for nonmonotone equilibrium problems and its application to multiclass classification, the 6th Asian Conference on Nonlinear Analysis and Optimization, Okinawa Institute of Science and Technology Graduate University (Nov. 5, 2018)
- Hideo Yoshizato, Hideaki Iiduka: Stochastic fixed point optimization algorithm for classifier ensemble with sparsity and diversity learning and its application, the 6th Asian Conference on Nonlinear Analysis and Optimization, Okinawa Institute of Science and Technology Graduate University (Nov. 5, 2018)
- Yu Kobayashi, Hideaki Iiduka: Stochastic subgradient method for stochastic equilibrium problems with nonmonotone bifunctions and its application to multiclass classification, RIMS Workshop on Nonlinear Analysis and Convex Analysis, Research Institute for Mathematical Sciences, Kyoto University (Aug. 29, 2018)
- Kazuhiro Hishinuma, Hideaki Iiduka: Application of incremental and parallel subgradient methods to learning a support vector machine and its advantages and disadvantages, The 2018 Spring National Conference of Operations Research Society of Japan, Tokai University (Mar. 15, 2018).
- Hideo Yoshizato, Hideaki Iiduka: Stochastic fixed point optimization algorithm for ensemble learning with sparsity and diversity, The 2018 Spring National Conference of Operations Research Society of Japan, Tokai University (Mar. 15, 2018).
2017
Publications in Refereed Journals
- Keigo Fujiwara, Hideaki Iiduka: Modification of the Krasnosel'skii-Mann Fixed Point Algorithm by Using Three-term Conjugate Gradients, Linear and Nonlinear Analysis 3 (2): 189–202 (2017) Open Access
Proceedings
- Hideaki Iiduka: Nonsmooth Convex Optimization With Fixed Point Constraints and Its Applications, Proceedings of the Twenty-Ninth RAMP Symposium, pp. 125–142, 2017.
Invited Talks
- Hideaki Iiduka: Nonsmooth Convex Optimization With Fixed Point Constraints and Its Applications, The 29th RAMP (Research Association of Mathematical Programming) Symposium, University of Tsukuba, Tsukuba, Japan (Oct. 12–13, 2017)
Conference Activities & Talks
- Kazuhiro Hishinuma, Hideaki Iiduka: Iterative method for solving constrained quasiconvex optimization problems based on the Krasnosel'skiĭ-Mann fixed point approximation method, RIMS Workshop on Nonlinear Analysis and Convex Analysis, Research Institute for Mathematical Sciences, Kyoto University (Sep. 1, 2017)
- Kazuhiro Hishinuma, Hideaki Iiduka: Quasi-subgradient method for quasiconvex minimization problem with fixed point constraints, the RIMS Workshop on Development of Mathematical Optimization: Modeling and Algorithms, Research Institute for Mathematical Sciences, Kyoto University (Aug. 25, 2017)
- Kazuhiro Hishinuma, Hideaki Iiduka: Flexible stepsize selection of subgradient methods for constrained convex optimization, the 10th Anniversary Conference on Nonlinear Analysis and Convex Analysis, Chitose City Cultural Center (Jul. 7, 2017)
2016
Conference Activities & Talks
- Kazuhiro Hishinuma, Hideaki Iiduka: Acceleration approach for parallel subgradient method based on line search, The 2016 Fall National Conference of Operations Research Society of Japan, Yamagata University (Sept. 15–16, 2016)
- Kaito Sakurai, Hideaki Iiduka: Parallel computing method for nonsmooth convex optimization with fixed point constraints, The 2016 Fall National Conference of Operations Research Society of Japan, Yamagata University (Sept. 15–16, 2016)
- Yoshiharu Nohara, Hideaki Iiduka: Line search subgradient methods for convex optimization problem and its dual problem, The 2016 Fall National Conference of Operations Research Society of Japan, Yamagata University (Sept. 15–16, 2016)
- Takayuki Jimba, Kaito Sakurai, Hideaki Iiduka: Halpern-type proximal point algorithm for nonsmooth convex optimization with fixed point constraints, The 2016 Fall National Conference of Operations Research Society of Japan, Yamagata University (Sept. 15–16, 2016)
- Shizuka Nishino, Hideaki Iiduka: Numerical methods for nonnegative matrix factorization based on fixed point theory, The 2016 Fall National Conference of Operations Research Society of Japan, Yamagata University (Sept. 15–16, 2016)
2015
Publications in Refereed Journals
- Kazuhiro Hishinuma, Hideaki Iiduka: On Acceleration of the Krasnosel’skii-Mann Fixed Point Algorithm Based on Conjugate Gradient Method for Smooth Optimization, Journal of Nonlinear and Convex Analysis: Special Issue-Dedicated to Wataru Takahashi on the occasion of his 70th birthday 16 (11): 2243–2254 (2015) Open Access PDF
- Hideaki Iiduka: Distributed Convex Optimization Algorithms and Their Application to Distributed Control in Peer-to-Peer Data Storage System, Journal of Nonlinear and Convex Analysis: Special Issue-Dedicated to Wataru Takahashi on the occasion of his 70th birthday 16 (11): 2159–2179 (2015) Open Access PDF
- Kazuhiro Hishinuma, Hideaki Iiduka: Parallel Subgradient Method for Nonsmooth Convex Optimization With a Simple Constraint, Linear and Nonlinear Analysis 1 (1): 67–77 (2015) PDF Open Access
- Masato Uchida, Hideaki Iiduka, Isao Sugino: Modeling User Behavior in P2P Data Storage System, IEICE Transactions on Communications: Special Section on Quality of Diversifying Communication Networks and Services E98-B (1): 33–41 (2015) PDF
Proceedings
- Kazuhiro Hishinuma, Hideaki Iiduka: Parallel Computing Method for Nonsmooth Convex Optimization, RIMS Kôkyûroku No.1963, pp. 71–77 Open Access
2014
Publications in Refereed Journals
- Hideaki Iiduka, Kazuhiro Hishinuma: Acceleration Method Combining Broadcast and Incremental Distributed Optimization Algorithms, SIAM Journal on Optimization 24 (4): 1840–1863 (2014) PDF
- Shigeru Iemoto, Kazuhiro Hishinuma, Hideaki Iiduka: Approximate Solutions to Variational Inequality over the Fixed Point Set of a Strongly Nonexpansive Mapping, Fixed Point Theory and Applications 2014: 51 (2014) PDF
Conference Activities & Talks
- Kazuhiro Hishinuma, Hideaki Iiduka: On Parallel Computing Method for Nonsmooth Convex Optimization, The 2014 Fall National Conference of Operations Research Society of Japan, Hokkaido University of Science (Aug. 28–29, 2014)
- Kazuhiro Hishinuma, Hideaki Iiduka: Parallel Algorithm for Nonsmooth Convex Optimization, The International Workshop on Nonlinear Analysis and Convex Analysis, Research Institute for Mathematical Sciences, Kyoto University (Aug. 19–21, 2014)
2004–2012
- For publications up to 2012, please refer to Iiduka's profile page.