====== Optimization Algorithms ======

===== Optimization Problem and Its Applications =====

Let us consider the following problem:
\begin{align*}
\text{Minimize } f(x):= \sum_{i\in \mathcal{I}} f^{(i)} (x) \text{ subject to } x\in C := \bigcap_{i\in \mathcal{I}} C^{(i)},
\end{align*}
where $H$ is a real Hilbert space, $f^{(i)} \colon H \to \mathbb{R}$ $(i\in \mathcal{I} := \{1,2,\ldots,I\})$, and $C^{(i)}$ $(\subset H)$ $(i\in \mathcal{I})$ is closed and convex with $C \neq \emptyset$. In particular, it is assumed that $C^{(i)}$ can be expressed as the **fixed point set** of a **nonexpansive mapping**. This setting covers the case in which the metric projection onto $C^{(i)}$ cannot be efficiently computed (e.g., $C^{(i)}$ is the intersection of many closed convex sets or the set of minimizers of a convex function).((The metric projection onto $C^{(i)}$, denoted by $P_{C^{(i)}}$, is defined for all $x\in H$ by $P_{C^{(i)}}(x) \in C^{(i)}$ and $\|x - P_{C^{(i)}}(x)\| = \inf_{y\in C^{(i)}} \|x-y\|$.)) Please see the [[en:intro:researches:fixedpoint|Fixed Point Algorithms]] page for the details of fixed point sets.\\
Here, we divide the problem into four categories.
  - **Smooth Convex Optimization Problem**:\\ Each $f^{(i)}$ $(i\in \mathcal{I})$ is assumed to be smooth and convex. This category includes the [[http://ieeexplore.ieee.org/xpl/login.jsp?tp=&arnumber=1206687&url=http%3A%2F%2Fieeexplore.ieee.org%2Fiel5%2F78%2F27152%2F01206687.pdf%3Farnumber%3D1206687|signal recovery problem]], the [[http://ieeexplore.ieee.org/xpl/login.jsp?tp=&arnumber=4291862&url=http%3A%2F%2Fieeexplore.ieee.org%2Fiel5%2F78%2F4291841%2F04291862.pdf%3Farnumber%3D4291862|beamforming problem]], the [[http://ieeexplore.ieee.org/xpl/login.jsp?tp=&arnumber=4604754&url=http%3A%2F%2Fieeexplore.ieee.org%2Fiel5%2F49%2F4604726%2F04604754.pdf%3Farnumber%3D4604754|storage allocation problem]], the [[http://epubs.siam.org/doi/abs/10.1137/110850542|optimal control problem]], and the [[http://epubs.siam.org/doi/abs/10.1137/120866877|bandwidth allocation problem]].
  - **Nonsmooth Convex Optimization Problem**:\\ Each $f^{(i)}$ $(i\in \mathcal{I})$ is assumed to be nonsmooth and convex. This category includes the [[http://ieeexplore.ieee.org/xpl/articleDetails.jsp?arnumber=4407760&filter%3DAND%28p_IS_Number%3A4407756%29|signal recovery problem]], [[https://ieeexplore.ieee.org/document/8744480|ensemble learning]], the [[http://link.springer.com/chapter/10.1007/978-1-4419-9569-8_17|minimal antenna-subset selection problem]], and the [[https://ieeexplore.ieee.org/document/8584116|bandwidth allocation problem]].
  - **Smooth Nonconvex Optimization Problem**:\\ Each $f^{(i)}$ $(i\in \mathcal{I})$ is assumed to be smooth and nonconvex. Practical instances include [[http://link.springer.com/article/10.1007%2Fs10107-010-0427-x#|power control]] and [[http://epubs.siam.org/doi/abs/10.1137/110849456|bandwidth allocation]].
  - **Nonsmooth Nonconvex Optimization Problem**:\\ Each $f^{(i)}$ $(i\in \mathcal{I})$ is assumed to be nonsmooth and nonconvex. Practical instances include [[https://www.sciencedirect.com/science/article/abs/pii/S037722171400424X|fractional programming]].
We focus on the following algorithms for solving the above problems.
  - **Centralized Optimization Algorithms**:\\ These algorithms can be implemented under the assumption that the explicit forms of $f:= \sum_{i\in \mathcal{I}} f^{(i)}$ and $C:= \bigcap_{i\in \mathcal{I}} C^{(i)}$ (i.e., of all $f^{(i)}$ and $C^{(i)}$) are available.
  - **Decentralized Optimization Algorithms**:\\ These algorithms can be implemented through the cooperation of all users when each user $i$ $(i\in \mathcal{I})$ has its own private $f^{(i)}$ and $C^{(i)}$ and cannot use the private information of the other users.
Please see the [[en:intro:publications|Publications]] page, where PDF files of our papers are available.
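As a concrete illustration of this setting, the following minimal sketch (in Python) encodes a toy two-user instance with $H=\mathbb{R}^2$, a closed ball $C^{(1)}$, and a half-space $C^{(2)}$; all data ($a$, $b$, $w$) and function names here are hypothetical and chosen only so that each $C^{(i)}$ is the fixed point set of a computable nonexpansive mapping.

<code python>
# Toy instance: f = f^(1) + f^(2), C = C^(1) ∩ C^(2) in H = R^2 (illustrative only).
import numpy as np

a, b = np.array([2.0, 0.0]), np.array([0.0, 2.0])
w = np.array([1.0, 1.0])

def f(x):
    # f(x) = f^(1)(x) + f^(2)(x) with f^(i) smooth and convex
    return np.sum((x - a) ** 2) + np.sum((x - b) ** 2)

def proj_ball(x):
    # Metric projection P_{C^(1)} onto the closed unit ball; Fix(P_{C^(1)}) = C^(1)
    n = np.linalg.norm(x)
    return x if n <= 1.0 else x / n

def proj_halfspace(x):
    # Metric projection P_{C^(2)} onto the half-space {x : <w, x> <= 1}
    s = w @ x - 1.0
    return x if s <= 0.0 else x - s * w / (w @ w)

def T(x):
    # T = P_{C^(1)} o P_{C^(2)} is nonexpansive; since C is nonempty, a standard
    # result for compositions of projections gives Fix(T) = C^(1) ∩ C^(2) = C,
    # so the constraint x ∈ C can be written as x ∈ Fix(T)
    # (see the Fixed Point Algorithms page).
    return proj_ball(proj_halfspace(x))
</code>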
===== Optimization Algorithms for Smooth Convex Optimization =====

==== Centralized Optimization Algorithms ====

Many algorithms have been presented to solve convex optimization problems (e.g., [[http://en.wikipedia.org/wiki/Interior_point_method|interior-point methods]] and [[http://www.ams.org/journals/bull/1964-70-05/S0002-9904-1964-11178-2/|projection methods]]). In 2001, [[http://www.sp.ss.titech.ac.jp/index.php?Isao%20Yamada|Yamada]] proposed [[http://www.sciencedirect.com/science/article/pii/S1570579X01800288|the hybrid steepest descent method]] for //smooth convex optimization over the fixed point sets of nonexpansive mappings//. We have presented algorithms based on the hybrid steepest descent method together with their convergence analyses (a sketch of the basic iteration is given after the list below). The following are our previously reported results:
  * [[:en:iiduka:|H. Iiduka]]: [[http://link.springer.com/article/10.1007/s10107-013-0741-1|Acceleration Method for Convex Optimization over the Fixed Point Set of a Nonexpansive Mapping]], Mathematical Programming, Vol. 149, No. 1, pp. 131-165, 2015.
  * S. Iemoto, K. Hishinuma, and [[:en:iiduka:|H. Iiduka]]: [[http://www.fixedpointtheoryandapplications.com/content/2014/1/51|Approximate Solutions to Variational Inequality over the Fixed Point Set of a Strongly Nonexpansive Mapping]], Fixed Point Theory and Applications, Vol. 2014, Article 51, 2014.
  * [[:en:iiduka:|H. Iiduka]] and I. Yamada: [[http://epubs.siam.org/doi/abs/10.1137/110850542|Computational Method for Solving a Stochastic Linear-Quadratic Control Problem Given an Unsolvable Stochastic Algebraic Riccati Equation]], SIAM Journal on Control and Optimization, Vol. 50, No. 4, pp. 2173-2192, 2012.
  * [[:en:iiduka:|H. Iiduka]]: [[http://www.sciencedirect.com/science/article/pii/S0377042711005358|Fixed Point Optimization Algorithm and Its Application to Network Bandwidth Allocation]], Journal of Computational and Applied Mathematics, Vol. 236, No. 7, pp. 1733-1742, 2012.
  * [[:en:iiduka:|H. Iiduka]]: [[http://link.springer.com/article/10.1007%2Fs10957-010-9769-z#|Iterative Algorithm for Solving Triple-hierarchical Constrained Optimization Problem]], Journal of Optimization Theory and Applications, Vol. 148, No. 3, pp. 580-592, 2011.
  * [[:en:iiduka:|H. Iiduka]]: [[http://www.sciencedirect.com/science/article/pii/S0096300311000099|Three-term Conjugate Gradient Method for the Convex Optimization Problem over the Fixed Point Set of a Nonexpansive Mapping]], Applied Mathematics and Computation, Vol. 217, No. 13, pp. 6315-6327, 2011.
  * [[:en:iiduka:|H. Iiduka]] and [[http://www.sp.ss.titech.ac.jp/index.php?Isao%20Yamada|I. Yamada]]: [[http://epubs.siam.org/action/showAbstract?page=1881&volume=19&issue=4&journalCode=sjope8|A Use of Conjugate Gradient Direction for the Convex Optimization Problem over the Fixed Point Set of a Nonexpansive Mapping]], SIAM Journal on Optimization, Vol. 19, No. 4, pp. 1881-1893, 2009.
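The following is a minimal sketch of one common form of the hybrid steepest descent iteration, $x_{n+1} = T(x_n) - \lambda_{n+1}\mu\nabla f(T(x_n))$, applied to the toy instance from the earlier sketch. It is not the exact method of any paper listed above, and the conditions on $\mu$ and the diminishing step sizes $\lambda_n$ required for convergence are omitted here.

<code python>
# Sketch of a hybrid-steepest-descent-type iteration on the toy instance above
# (illustrative only; convergence conditions on mu and lam are not checked).
import numpy as np

a, b, w = np.array([2.0, 0.0]), np.array([0.0, 2.0]), np.array([1.0, 1.0])

def grad_f(x):
    # Gradient of f(x) = ||x - a||^2 + ||x - b||^2
    return 2.0 * (x - a) + 2.0 * (x - b)

def proj_ball(x):
    n = np.linalg.norm(x)
    return x if n <= 1.0 else x / n

def proj_halfspace(x):
    s = w @ x - 1.0
    return x if s <= 0.0 else x - s * w / (w @ w)

def T(x):
    # Nonexpansive mapping with Fix(T) = C (composition of the two projections)
    return proj_ball(proj_halfspace(x))

mu = 0.1
x = np.zeros(2)
for n in range(1, 1001):
    lam = 1.0 / n                      # diminishing step size
    y = T(x)
    x = y - lam * mu * grad_f(y)       # x_{n+1} = T(x_n) - lam_n * mu * grad_f(T(x_n))
print(x)  # approaches the minimizer of f over C = C^(1) ∩ C^(2)
</code>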
==== Decentralized Optimization Algorithms ====

[[http://epubs.siam.org/doi/abs/10.1137/S1052623499362111|The incremental subgradient method]] and [[http://iopscience.iop.org/0266-5611/24/6/065014|the broadcast optimization method]] are useful for decentralized convex optimization. However, they can be applied only to the case where $C^{(i)} = C$ $(i\in \mathcal{I})$ and $C$ is simple in the sense that the projection onto $C$ can be easily computed (e.g., $C$ is a half-space). We have proposed decentralized optimization algorithms for convex optimization problems with more complicated $C^{(i)}$ (e.g., $C^{(i)}$ is the intersection of simple, closed convex sets); such problems include the [[http://epubs.siam.org/doi/abs/10.1137/120866877|bandwidth allocation problem]], the [[http://ieeexplore.ieee.org/xpl/login.jsp?tp=&arnumber=4604754&url=http%3A%2F%2Fieeexplore.ieee.org%2Fiel5%2F49%2F4604726%2F04604754.pdf%3Farnumber%3D4604754|storage allocation problem]], and [[https://ieeexplore.ieee.org/document/8744480|ensemble learning]].
  * [[:en:iiduka:|H. Iiduka]]: [[https://ieeexplore.ieee.org/document/8744480|Stochastic Fixed Point Optimization Algorithm for Classifier Ensemble]], IEEE Transactions on Cybernetics (accepted).
  * [[:en:iiduka:|H. Iiduka]]: [[https://link.springer.com/article/10.1007/s11081-019-09440-7|Decentralized Hierarchical Constrained Convex Optimization]], Optimization and Engineering, Vol. 21, No. 1, pp. 181-213, 2020.
  * [[:en:iiduka:|H. Iiduka]]: [[https://www.jstage.jst.go.jp/article/jorsj/58/4/58_330/_pdf|Parallel Optimization Algorithm for Smooth Convex Optimization over Fixed Point Sets of Quasi-nonexpansive Mappings]], Journal of the Operations Research Society of Japan, Vol. 58, No. 4, pp. 330-352, 2015.
  * [[:en:iiduka:|H. Iiduka]]: [[http://www.ybook.co.jp/online-p/JNCA/Open/16/jncav16n11p2159-oa/FLASH/index.html|Distributed Convex Optimization Algorithms and Their Application to Distributed Control in Peer-to-Peer Data Storage System]], Journal of Nonlinear and Convex Analysis: Special Issue Dedicated to Wataru Takahashi on the occasion of his 70th birthday, Vol. 16, No. 11, pp. 2159-2179, 2015.
  * [[:en:iiduka:|H. Iiduka]] and K. Hishinuma: [[http://epubs.siam.org/doi/abs/10.1137/130939560|Acceleration Method Combining Broadcast and Incremental Distributed Optimization Algorithms]], SIAM Journal on Optimization, Vol. 24, No. 4, pp. 1840-1863, 2014.
  * [[:en:iiduka:|H. Iiduka]]: [[http://epubs.siam.org/doi/abs/10.1137/120866877|Fixed Point Optimization Algorithms for Distributed Optimization in Networked Systems]], SIAM Journal on Optimization, Vol. 23, No. 1, pp. 1-26, 2013.

===== Optimization Algorithms for Nonsmooth Convex Optimization =====

The [[http://epubs.siam.org/doi/abs/10.1137/0314056|proximal point algorithms]] and the [[https://en.wikipedia.org/wiki/Subgradient_method|subgradient methods]] are useful for solving nonsmooth convex optimization; a minimal sketch of these two building blocks is given after the list below. The following are our results on algorithms based on these methods.

==== Decentralized Optimization Algorithms ====

  * K. Shimizu, K. Hishinuma, and [[:en:iiduka:|H. Iiduka]]: Parallel Computing Proximal Method for Nonsmooth Convex Optimization With Fixed Point Constraints of Quasi-nonexpansive Mappings, Applied Set-Valued Analysis and Optimization (accepted).
  * H. Oishi, Y. Kobayashi, and [[:en:iiduka:|H. Iiduka]]: [[http://www.ybook.co.jp/online-p/LNA/Open/vol5/lnav5n3p477-oa/index.html|Incremental Proximal Method for Nonsmooth Convex Optimization With Fixed Point Constraints of Quasi-nonexpansive Mappings]], Linear and Nonlinear Analysis, Vol. 5, No. 3, pp. 477-493, 2019.
  * [[:en:iiduka:|H. Iiduka]]: [[https://ieeexplore.ieee.org/document/8584116|Distributed Optimization for Network Resource Allocation with Nonsmooth Utility Functions]], IEEE Transactions on Control of Network Systems, Vol. 6, No. 4, pp. 1354-1365, 2019.
  * K. Hishinuma and [[:en:iiduka:|H. Iiduka]]: [[http://www.ybook.co.jp/online2/opjnca/vol20/p1937.html|Convergence Analysis of Incremental and Parallel Line Search Subgradient Methods in Hilbert Space]], Journal of Nonlinear and Convex Analysis: Special Issue Dedicated to Wataru Takahashi on the occasion of his 75th birthday, Vol. 20, No. 9, pp. 1937-1947, 2019.
  * K. Hishinuma and [[:en:iiduka:|H. Iiduka]]: [[https://www.frontiersin.org/articles/10.3389/frobt.2019.00077/full|Incremental and Parallel Machine Learning Algorithms with Automated Learning Rate Adjustments]], Frontiers in Robotics and AI: Resolution of Limitations of Deep Learning to Develop New AI Paradigms, Vol. 6, Article 77, 2019.
  * [[:en:iiduka:|H. Iiduka]]: [[http://www.tandfonline.com/doi/full/10.1080/10556788.2018.1425860|Two Stochastic Optimization Algorithms for Convex Optimization with Fixed Point Constraints]], Optimization Methods and Software, Vol. 34, No. 4, pp. 731-757, 2019.
  * K. Sakurai, T. Jimba, and [[:en:iiduka:|H. Iiduka]]: [[http://jnva.biemdas.com/archives/843|Iterative Methods for Parallel Convex Optimization with Fixed Point Constraints]], Journal of Nonlinear and Variational Analysis, Vol. 3, No. 2, pp. 115-126, 2019.
  * [[http://gyoseki1.mind.meiji.ac.jp/mjuhp/KgApp?kyoinId=ymkdgygyggy&Language=2|Y. Hayashi]] and [[:en:iiduka:|H. Iiduka]]: [[http://www.sciencedirect.com/science/article/pii/S0925231217313486|Optimality and Convergence for Convex Ensemble Learning with Sparsity and Diversity Based on Fixed Point Optimization]], Neurocomputing, Vol. 273, pp. 367-372, 2018.
  * [[:en:iiduka:|H. Iiduka]]: [[http://www.tandfonline.com/doi/full/10.1080/02331934.2016.1252914|Almost Sure Convergence of Random Projected Proximal and Subgradient Algorithms for Distributed Nonsmooth Convex Optimization]], Optimization, Vol. 66, No. 1, pp. 35-59, 2017.
  * [[:en:iiduka:|H. Iiduka]]: [[http://www.tandfonline.com/doi/abs/10.1080/10556788.2016.1175002?journalCode=goms20|Incremental Subgradient Method for Nonsmooth Convex Optimization with Fixed Point Constraints]], Optimization Methods and Software, Vol. 31, No. 5, pp. 931-951, 2016.
  * [[:en:iiduka:|H. Iiduka]]: [[http://link.springer.com/article/10.1007/s10107-015-0967-1|Convergence Analysis of Iterative Methods for Nonsmooth Convex Optimization over Fixed Point Sets of Quasi-Nonexpansive Mappings]], Mathematical Programming, Vol. 159, No. 1, pp. 509-538, 2016.
  * [[:en:iiduka:|H. Iiduka]]: [[http://www.sciencedirect.com/science/article/pii/S0377221716301102|Proximal Point Algorithms for Nonsmooth Convex Optimization with Fixed Point Constraints]], European Journal of Operational Research, Vol. 253, No. 2, pp. 503-513, 2016.
  * [[:en:iiduka:|H. Iiduka]]: [[http://www.fixedpointtheoryandapplications.com/content/2015/1/72|Parallel Computing Subgradient Method for Nonsmooth Convex Optimization over the Intersection of Fixed Point Sets of Nonexpansive Mappings]], Fixed Point Theory and Applications, Vol. 2015, Article 72, 2015.
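As promised above, here is a minimal sketch of the two building blocks for nonsmooth convex optimization, a proximal step and a subgradient step, for the illustrative objective $f(x)=\|x\|_1$ on $\mathbb{R}^2$. It is not the exact algorithm of any paper listed above; all names are hypothetical.

<code python>
# Proximal and subgradient steps for the nonsmooth convex function f(x) = ||x||_1.
import numpy as np

def prox_l1(x, gamma):
    # Proximal operator of gamma * ||.||_1 (soft-thresholding):
    # prox(x) = argmin_y { ||y||_1 + ||y - x||^2 / (2 * gamma) }
    return np.sign(x) * np.maximum(np.abs(x) - gamma, 0.0)

def subgrad_l1(x):
    # One subgradient of ||.||_1 at x (any value in [-1, 1] is valid at 0)
    return np.sign(x)

# Proximal point iteration x_{k+1} = prox_{gamma f}(x_k) drives x to a minimizer of f.
x = np.array([1.5, -0.8])
for k in range(20):
    x = prox_l1(x, gamma=0.2)

# Subgradient iteration y_{k+1} = y_k - alpha_k * g_k with diminishing step sizes.
y = np.array([1.5, -0.8])
for k in range(1, 201):
    y = y - (1.0 / k) * subgrad_l1(y)
</code>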
===== Optimization Algorithms for Smooth Nonconvex Optimization =====

==== Centralized Optimization Algorithms ====

The following are our previously reported results.
  * [[:en:iiduka:|H. Iiduka]]: [[http://epubs.siam.org/doi/abs/10.1137/110849456|Iterative Algorithm for Triple-Hierarchical Constrained Nonconvex Optimization Problem and Its Application to Network Bandwidth Allocation]], SIAM Journal on Optimization, Vol. 22, No. 3, pp. 862-878, 2012.
  * [[:en:iiduka:|H. Iiduka]]: [[http://link.springer.com/article/10.1007%2Fs10107-010-0427-x#|Fixed Point Optimization Algorithm and Its Application to Power Control in CDMA Data Networks]], Mathematical Programming, Vol. 133, No. 1, pp. 227-242, 2012.
  * [[:en:iiduka:|H. Iiduka]] and M. Uchida: [[http://ieeexplore.ieee.org/xpl/login.jsp?tp=&arnumber=5752800&url=http%3A%2F%2Fieeexplore.ieee.org%2Fiel5%2F4234%2F5895119%2F05752800.pdf%3Farnumber%3D5752800|Fixed Point Optimization Algorithms for Network Bandwidth Allocation Problems with Compoundable Constraints]], IEEE Communications Letters, Vol. 15, No. 6, pp. 596-598, 2011.
  * [[:en:iiduka:|H. Iiduka]] and [[http://www.sp.ss.titech.ac.jp/index.php?Isao%20Yamada|I. Yamada]]: [[http://www.tandfonline.com/doi/abs/10.1080/02331930701762829|A Subgradient-type Method for the Equilibrium Problem over the Fixed Point Set and its Applications]], Optimization, Vol. 58, No. 2, pp. 251-261, 2009.

==== Decentralized Optimization Algorithms ====

  * [[:en:iiduka:|H. Iiduka]]: [[http://www.ybook.co.jp/|Distributed Iterative Methods for Solving Nonmonotone Variational Inequality over the Intersection of Fixed Point Sets of Nonexpansive Mappings]], Pacific Journal of Optimization, Vol. 10, No. 4, pp. 691-713, 2014.

===== Optimization Algorithms for Nonsmooth Nonconvex Optimization =====

==== Centralized Optimization Algorithms ====

  * K. Hishinuma and [[:en:iiduka:|H. Iiduka]]: [[https://doi.org/10.1016/j.ejor.2019.09.037|Fixed Point Quasiconvex Subgradient Method]], European Journal of Operational Research, Vol. 282, No. 2, pp. 428-437, 2020.