Optimization Algorithms
Optimization Problem and Its Applications
Let us consider the following problem:
\begin{align*}
\text{Minimize } f(x):= \sum_{i\in \mathcal{I}} f^{(i)} (x) \text{ subject to } x\in C := \bigcap_{i\in \mathcal{I}} C^{(i)},
\end{align*}
where $H$ is a real Hilbert space, $f^{(i)} \colon H \to \mathbb{R}$ $(i\in \mathcal{I} := \{1,2,\ldots,I\})$, and
$C^{(i)}$ $(\subset H)$ $(i\in \mathcal{I})$ is closed and convex with $C \neq \emptyset$.
In particular, we assume that each $C^{(i)}$ can be expressed as the fixed point set of a nonexpansive mapping. This covers the case in which the metric projection onto $C^{(i)}$ cannot be efficiently computed (e.g., $C^{(i)}$ is the intersection of many closed convex sets or the set of minimizers of a convex function). Please see the Fixed Point Algorithms page for the details of fixed point sets.
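As a minimal illustration of this setting, the following Python sketch uses a hypothetical example in $\mathbb{R}^2$ (not taken from our papers): the individual projections onto a half-space and a closed ball are easy to compute, and their average is a nonexpansive mapping whose fixed point set is the intersection, even though the projection onto the intersection itself is not assumed to be available.

```python
import numpy as np

# Hypothetical example in R^2 (not from our papers): C^(1) is a half-space and
# C^(2) is a closed ball. Each projection is easy to compute, but the metric
# projection onto C = C^(1) ∩ C^(2) is not assumed to be available.

def proj_halfspace(x, a, b):
    """Metric projection onto {x : <a, x> <= b}."""
    viol = a @ x - b
    return x if viol <= 0 else x - (viol / (a @ a)) * a

def proj_ball(x, center, radius):
    """Metric projection onto the closed ball B(center, radius)."""
    d = np.linalg.norm(x - center)
    return x if d <= radius else center + (radius / d) * (x - center)

a, b = np.array([1.0, 1.0]), 1.0          # C^(1) = {x : x_1 + x_2 <= 1}
c, r = np.array([0.0, 0.0]), 2.0          # C^(2) = {x : ||x|| <= 2}

# T := (1/2)(P_{C^(1)} + P_{C^(2)}) is nonexpansive and Fix(T) = C, so C is
# accessible through T even though P_C itself is not.
def T(x):
    return 0.5 * (proj_halfspace(x, a, b) + proj_ball(x, c, r))

x = np.array([3.0, 3.0])
for _ in range(200):                       # Krasnosel'skii-Mann iteration
    x = 0.5 * x + 0.5 * T(x)
print(x)                                   # converges to a point of Fix(T) = C
```

The Krasnosel'skii-Mann iteration at the end is included only to confirm numerically that points of $C$ can be reached through $T$ by a fixed point iteration.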
Here, we divide the problem into the following four categories; hypothetical example objectives illustrating each category are sketched after the list.
- Smooth Convex Optimization Problem:
It is assumed that $f^{(i)}$ $(i\in \mathcal{I})$ is smooth and convex. Applications include the signal recovery problem, the beamforming problem, the storage allocation problem, the optimal control problem, and the bandwidth allocation problem.
- Nonsmooth Convex Optimization Problem:
It is assumed that $f^{(i)}$ $(i\in \mathcal{I})$ is nonsmooth and convex. Applications include the signal recovery problem, ensemble learning, the minimal antenna-subset selection problem, and the bandwidth allocation problem.
- Smooth Nonconvex Optimization Problem:
It is assumed that $f^{(i)}$ $(i\in \mathcal{I})$ is smooth and nonconvex. Applications include practical problems such as power control and bandwidth allocation.
- Nonsmooth Nonconvex Optimization Problem:
It is assumed that $f^{(i)}$ $(i\in \mathcal{I})$ is nonsmooth and nonconvex. Applications include practical problems such as fractional programming.
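For illustration only (these particular objectives are hypothetical examples and are not taken from the papers on this page), typical instances of $f^{(i)}$ in the four categories are:
\begin{align*}
&\text{smooth convex: } f^{(i)}(x) = \frac{1}{2}\|A^{(i)} x - b^{(i)}\|^2,\\
&\text{nonsmooth convex: } f^{(i)}(x) = \|A^{(i)} x - b^{(i)}\|_1,\\
&\text{smooth nonconvex: } f^{(i)}(x) = \frac{1}{2}\langle x, Q^{(i)} x\rangle \ (Q^{(i)} \text{ indefinite}),\\
&\text{nonsmooth nonconvex: } f^{(i)}(x) = \frac{\|A^{(i)} x - b^{(i)}\|_1}{\langle c^{(i)}, x\rangle + d^{(i)}} \ (\text{fractional objective, } \langle c^{(i)}, x\rangle + d^{(i)} > 0).
\end{align*}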
We focus on the following algorithms for solving the above problems.
- Centralized Optimization Algorithms:
These can be implemented when the explicit forms of all $f^{(i)}$s and $C^{(i)}$s are available, i.e., when $f:= \sum_{i\in \mathcal{I}} f^{(i)}$ and $C:= \bigcap_{i\in \mathcal{I}} C^{(i)}$ are known.
- Decentralized Optimization Algorithms:
These can be implemented through the cooperation of all users when each user $i$ $(i\in \mathcal{I})$ has its own private $f^{(i)}$ and $C^{(i)}$ and cannot use the private information of the other users.
Please see the Publications page, where you can download the PDF files of our papers.
Optimization Algorithms for Smooth Convex Optimization
Centralized Optimization Algorithms
Many algorithms have been proposed for solving convex optimization problems (e.g., interior-point methods and projection methods). In 2001, Yamada proposed the hybrid steepest descent method for smooth convex optimization over the fixed point sets of nonexpansive mappings. We have presented algorithms based on the hybrid steepest descent method together with their convergence analyses. The following are our previously reported results (a minimal sketch of the basic iteration is given after the list):
- H. Iiduka: Acceleration Method for Convex Optimization over the Fixed Point Set of a Nonexpansive Mapping, Mathematical Programming, Vol. 149, No. 1, pp. 131-165, 2015.
- S. Iemoto, K. Hishinuma, and H. Iiduka: Approximate Solutions to Variational Inequality over the Fixed Point Set of a Strongly Nonexpansive Mapping, Fixed Point Theory and Applications, Vol. 2014, Article 51, 2014.
- H. Iiduka and I. Yamada: Computational Method for Solving a Stochastic Linear-Quadratic Control Problem Given an Unsolvable Stochastic Algebraic Riccati Equation, SIAM Journal on Control and Optimization, Vol. 50, No. 4, pp. 2173-2192, 2012.
- H. Iiduka: Fixed Point Optimization Algorithm and Its Application to Network Bandwidth Allocation, Journal of Computational and Applied Mathematics, Vol. 236, No. 7, pp. 1733-1742, 2012.
- H. Iiduka: Iterative Algorithm for Solving Triple-hierarchical Constrained Optimization Problem, Journal of Optimization Theory and Applications, Vol. 148, No. 3, pp. 580-592, 2011.
- H. Iiduka: Three-term Conjugate Gradient Method for the Convex Optimization Problem over the Fixed Point Set of a Nonexpansive Mapping, Applied Mathematics and Computation, Vol. 217, No. 13, pp. 6315-6327, 2011.
- H. Iiduka and I. Yamada: A Use of Conjugate Gradient Direction for the Convex Optimization Problem over the Fixed Point Set of a Nonexpansive Mapping, SIAM Journal on Optimization, Vol. 19, No. 4, pp. 1881-1893, 2009.
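Below is a minimal sketch of the basic hybrid steepest descent iteration on a hypothetical toy problem (a strongly convex quadratic minimized over the fixed point set of an averaged projection mapping; none of the data come from the papers above). The algorithms in the papers above use refined search directions (e.g., conjugate gradient-like directions) and step-size rules; this sketch shows only the underlying iteration.

```python
import numpy as np

# Toy instance (not from the papers above): minimize the strongly convex
# quadratic f(x) = 0.5 * ||x - p||^2 over Fix(T), where T is the nonexpansive
# mapping built from two easily projectable sets,
#   C^(1) = {x : x_1 + x_2 <= 1},  C^(2) = {x : ||x|| <= 2},
# and Fix(T) = C^(1) ∩ C^(2).

def proj_halfspace(x, a, b):
    viol = a @ x - b
    return x if viol <= 0 else x - (viol / (a @ a)) * a

def proj_ball(x, center, radius):
    d = np.linalg.norm(x - center)
    return x if d <= radius else center + (radius / d) * (x - center)

a, b = np.array([1.0, 1.0]), 1.0
c, r = np.array([0.0, 0.0]), 2.0
p = np.array([2.0, -1.0])                  # grad f(x) = x - p

def T(x):
    return 0.5 * (proj_halfspace(x, a, b) + proj_ball(x, c, r))

# Hybrid steepest descent iteration:
#   x_{n+1} = T(x_n) - lambda_n * grad f(T(x_n)),
# with a diminishing step size (lambda_n -> 0, sum_n lambda_n = infinity).
x = np.array([3.0, 3.0])
for n in range(1, 10001):
    y = T(x)
    x = y - (1.0 / n) * (y - p)
print(x)   # approaches the minimizer of f over C^(1) ∩ C^(2)
```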
Decentralized Optimization Algorithms
The incremental subgradient method and the broadcast optimization method are useful for decentralized convex optimization. However, they can be applied only when $C^{(i)} = C$ $(i\in \mathcal{I})$ and $C$ is simple in the sense that the projection onto $C$ can be easily computed (e.g., $C$ is a half-space). We have proposed decentralized optimization algorithms for convex optimization problems with more complicated $C^{(i)}$ (e.g., $C^{(i)}$ is the intersection of simple closed convex sets), with applications including the bandwidth allocation problem, the storage allocation problem, and ensemble learning. A schematic sketch of an incremental-type iteration in this setting is given below.
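In this sketch (a hypothetical two-user instance; the quadratic objectives and half-space constraint sets are illustrative assumptions, not data from the papers below), the iterate is passed from user to user, and each user applies only its own gradient step and its own nonexpansive mapping, so no private information is exchanged. The precise algorithms and their convergence conditions are given in the papers listed below.

```python
import numpy as np

# Schematic toy setting (not from the cited papers): I = 2 users.
# User i privately holds f_i(x) = 0.5 * ||x - p_i||^2 and a half-space C_i,
# handled through the nonexpansive mapping T_i = P_{C_i}.

P = [np.array([2.0, 0.0]), np.array([0.0, 2.0])]   # private p_i
A = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]   # C_i = {x : <a_i, x> <= 0.5}
B = [0.5, 0.5]

def T(i, x):                      # user i's nonexpansive mapping (here: P_{C_i})
    viol = A[i] @ x - B[i]
    return x if viol <= 0 else x - (viol / (A[i] @ A[i])) * A[i]

def grad_f(i, x):                 # user i's private gradient
    return x - P[i]

# Incremental-type iteration: in round n, the iterate visits the users in turn
# and each applies x <- T_i(x - lambda_n * grad f_i(x)), with lambda_n -> 0.
x = np.array([3.0, 3.0])
for n in range(1, 3001):
    for i in range(2):
        x = T(i, x - (1.0 / n) * grad_f(i, x))
print(x)   # approaches (0.5, 0.5), the minimizer of f_1 + f_2 over C_1 ∩ C_2
```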
- H. Iiduka: Stochastic Fixed Point Optimization Algorithm for Classifier Ensemble, IEEE Transactions on Cybernetics (accepted)
- H. Iiduka: Decentralized Hierarchical Constrained Convex Optimization, Optimization and Engineering, Vol. 21, No. 1, pp. 181–213, 2020.
- H. Iiduka: Parallel Optimization Algorithm for Smooth Convex Optimization over Fixed Point Sets of Quasi-nonexpansive Mappings, Journal of the Operations Research Society of Japan, Vol. 58, No. 4, pp. 330-352, 2015.
- H. Iiduka: Distributed Convex Optimization Algorithms and Their Application to Distributed Control in Peer-to-Peer Data Storage System, Journal of Nonlinear and Convex Analysis: Special Issue-Dedicated to Wataru Takahashi on the occasion of his 70th birthday, Vol. 16, No. 11, pp. 2159-2179, 2015.
- H. Iiduka and K. Hishinuma: Acceleration Method Combining Broadcast and Incremental Distributed Optimization Algorithms, SIAM Journal on Optimization, Vol. 24, No. 4, pp. 1840-1863, 2014.
- H. Iiduka: Fixed Point Optimization Algorithms for Distributed Optimization in Networked Systems, SIAM Journal on Optimization, Vol. 23, No. 1, pp. 1-26, 2013.
Optimization Algorithms for Nonsmooth Convex Optimization
Proximal point algorithms and subgradient methods are useful for solving nonsmooth convex optimization problems. A minimal sketch of these two kinds of steps is given below, followed by our results on algorithms based on these methods.
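This sketch is a hypothetical toy instance (none of the sets, objectives, or parameters come from the papers below): a subgradient-type step and a proximal-type step are each combined with a nonexpansive mapping whose fixed point set is the constraint set. The precise schemes and their convergence conditions are given in the papers listed below.

```python
import numpy as np

# Toy instance (not from the cited papers): minimize the nonsmooth convex
# function f(x) = ||x - p||_1 over C = C^(1) ∩ C^(2), where
# C^(1) = {x : x_1 <= 0}, C^(2) = {x : x_2 <= 0}, and C is handled through the
# nonexpansive mapping T = 0.5 * (P_{C^(1)} + P_{C^(2)}) with Fix(T) = C.

p = np.array([1.0, 1.0])

def T(x):
    return 0.5 * (np.array([min(x[0], 0.0), x[1]]) +
                  np.array([x[0], min(x[1], 0.0)]))

def subgrad_f(x):                 # a subgradient of ||x - p||_1 at x
    return np.sign(x - p)

def prox_f(x, lam):               # prox of lam * ||. - p||_1 (soft thresholding)
    return p + np.sign(x - p) * np.maximum(np.abs(x - p) - lam, 0.0)

# Subgradient-type iteration: x_{n+1} = T(x_n - lambda_n g_n), g_n in ∂f(x_n).
x = np.array([2.0, 2.0])
for n in range(1, 10001):
    x = T(x - (1.0 / n) * subgrad_f(x))

# Proximal-type iteration: y_{n+1} = T(prox_{lambda_n f}(y_n)).
y = np.array([2.0, 2.0])
for n in range(1, 10001):
    y = T(prox_f(y, 1.0 / n))

print(x, y)   # both approach (0, 0), the minimizer of f over C
```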
Decentralized Optimization Algorithms
- K. Shimizu, K. Hishinuma, and H. Iiduka: Parallel Computing Proximal Method for Nonsmooth Convex Optimization with Fixed Point Constraints of Quasi-nonexpansive Mappings, Applied Set-Valued Analysis and Optimization (accepted)
- H. Oishi, Y. Kobayashi, and H. Iiduka: Incremental Proximal Method for Nonsmooth Convex Optimization with Fixed Point Constraints of Quasi-nonexpansive Mappings, Linear and Nonlinear Analysis, Vol. 5, No. 3, pp. 477-493, 2019.
- H. Iiduka: Distributed Optimization for Network Resource Allocation with Nonsmooth Utility Functions, IEEE Transactions on Control of Network Systems, Vol. 6, No. 4, pp. 1354-1365, 2019.
- K. Hishinuma and H. Iiduka: Convergence Analysis of Incremental and Parallel Line Search Subgradient Methods in Hilbert Space, Journal of Nonlinear and Convex Analysis: Special Issue-Dedicated to Wataru Takahashi on the occasion of his 75th birthday, Vol. 20, No. 9, pp. 1937-1947, 2019.
- K. Hishinuma and H. Iiduka: Incremental and Parallel Machine Learning Algorithms with Automated Learning Rate Adjustments, Frontiers in Robotics and AI: Resolution of Limitations of Deep Learning to Develop New AI Paradigms, Vol. 6, Article 77, 2019.
- H. Iiduka: Two Stochastic Optimization Algorithms for Convex Optimization with Fixed Point Constraints, Optimization Methods and Software, Vol. 34, No. 4, pp. 731-757, 2019.
- K. Sakurai, T. Jimba, and H. Iiduka: Iterative Methods for Parallel Convex Optimization with Fixed Point Constraints, Journal of Nonlinear and Variational Analysis, Vol. 3, No. 2, pp. 115-126, 2019.
- Y. Hayashi and H. Iiduka: Optimality and Convergence for Convex Ensemble Learning with Sparsity and Diversity Based on Fixed Point Optimization, Neurocomputing, Vol. 273, pp. 367-372, 2018.
- H. Iiduka: Almost Sure Convergence of Random Projected Proximal and Subgradient Algorithms for Distributed Nonsmooth Convex Optimization, Optimization, Vol. 66, No. 1, pp. 35-59, 2017.
- H. Iiduka: Incremental Subgradient Method for Nonsmooth Convex Optimization with Fixed Point Constraints, Optimization Methods and Software, Vol. 31, No. 5, pp. 931-951, 2016.
- H. Iiduka: Convergence Analysis of Iterative Methods for Nonsmooth Convex Optimization over Fixed Point Sets of Quasi-Nonexpansive Mappings, Mathematical Programming, Vol. 159, No. 1, pp. 509-538, 2016.
- H. Iiduka: Proximal Point Algorithms for Nonsmooth Convex Optimization with Fixed Point Constraints, European Journal of Operational Research, Vol. 253, No. 2, pp. 503-513, 2016.
- H. Iiduka: Parallel Computing Subgradient Method for Nonsmooth Convex Optimization over the Intersection of Fixed Point Sets of Nonexpansive Mappings, Fixed Point Theory and Applications, Vol. 2015, 72, 2015.
Optimization Algorithms for Smooth Nonconvex Optimization
Centralized Optimization Algorithms
The following are our previously reported results.
- H. Iiduka: Iterative Algorithm for Triple-Hierarchical Constrained Nonconvex Optimization Problem and Its Application to Network Bandwidth Allocation, SIAM Journal on Optimization, Vol. 22, No. 3, pp. 862-878, 2012.
- H. Iiduka: Fixed Point Optimization Algorithm and Its Application to Power Control in CDMA Data Networks, Mathematical Programming, Vol. 133, No.1, pp. 227-242, 2012.
- H. Iiduka and M. Uchida: Fixed Point Optimization Algorithms for Network Bandwidth Allocation Problems with Compoundable Constraints, IEEE Communications Letters, Vol. 15, No. 6, pp. 596-598, 2011.
- H. Iiduka and I. Yamada: A Subgradient-type Method for the Equilibrium Problem over the Fixed Point Set and its Applications, Optimization, Vol. 58, No. 2, pp. 251-261, 2009.
Decentralized Optimization Algorithms
- H. Iiduka: Distributed Iterative Methods for Solving Nonmonotone Variational Inequality over the Intersection of Fixed Point Sets of Nonexpansive Mappings, Pacific Journal of Optimization, Vol. 10, No. 4, pp. 691-713, 2014.
Optimization Algorithms for Nonsmooth Nonconvex Optimization
Centralized Optimization Algorithms
- K. Hishinuma and H. Iiduka: Fixed Point Quasiconvex Subgradient Method, European Journal of Operational Research, Vol. 282, No. 2, pp. 428-437, 2020.