en:intro:researches:fixedpoint [2016/03/24 18:50] – [Related Acceleration Methods] Hideaki IIDUKA
en:intro:researches:fixedpoint [2018/07/02 10:29] (current) – [Acceleration of the Krasnosel'skii-Mann Algorithm] Hideaki IIDUKA
[[http://www.sci.keio.ac.jp/en/member/detail.php?eid=00119&katagaki=8&status=1|Takahashi]], and so on.
We give the following two examples of the fixed point problem.
  - Convex Feasibility Problem:\\ [[http://dx.doi.org/10.1137/S0036144593251710|The problem]] is to find \begin{align*}x^\star \in C:= \bigcap_{i\in \mathcal{I}} C_i,\end{align*} where $C_i$ $(\subset H)$ $(i\in \mathcal{I} := \{1,2,\ldots,I \})$ is nonempty, closed, and convex. It includes practical problems, such as the [[http://dx.doi.org/10.1109/78.782189|signal recovery problem]].\\ Let us define $T := P_1 P_2 \cdots P_I$((We may define $T$ by $T := P_I P_{I-1} \cdots P_1$ or $T:= \sum_{i\in \mathcal{I}} w_i P_i$, where $(w_i)_{i\in \mathcal{I}} \subset (0,1)$ satisfies $\sum_{i\in\mathcal{I}} w_i =1$.)), where $P_i$ $(i\in \mathcal{I})$ stands for the metric projection onto $C_i$.(($P_i$ is defined by $P_i(x) \in C_i$ and $\|P_i(x) - x \| = \inf_{y\in C_i}\| y-x \|$ $(x\in H)$, and it is nonexpansive.)) Accordingly, $T$ is nonexpansive and $\mathrm{Fix}(T) = C = \bigcap_{i\in \mathcal{I}} C_i$.
  - Constrained Convex Optimization Problem:\\ Suppose that $C$ $(\subset H)$ is nonempty, closed, and convex, $f\colon H \to \mathbb{R}$ is Fréchet differentiable and convex, and its gradient, denoted by $\nabla f$, is Lipschitz continuous with a constant $L$ $(>0)$. The constrained convex optimization problem is to find \begin{align*}x^\star \in C \text{ such that } f(x^\star) \leq f(x) \text{ for all } x\in C.\end{align*} See the [[en:intro:researches:optimization|Optimization Algorithms]] page for examples of the convex optimization problem. \\ Here, we define $T := P_C (\mathrm{Id} - \alpha \nabla f)$, where $\mathrm{Id}$ is the identity mapping on $H$,((It is defined by $\mathrm{Id}(x) := x$ $(x\in H)$.)) $P_C$ is the metric projection onto $C$, and $\alpha \in (0,2/L]$. Then, $T$ is nonexpansive and a fixed point of $T$ coincides with a minimizer of $f$ over $C$.
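As a concrete illustration of the first example, the mapping $T := P_1 P_2$ can be iterated directly. In the sketch below, the two sets (a closed half-space and a closed ball in $\mathbb{R}^2$) are hypothetical choices of $C_1$ and $C_2$, not taken from the text:

```python
import numpy as np

# Hypothetical instance of the convex feasibility problem in R^2:
# C_1 = {x : x[0] >= 1} (closed half-space), C_2 = {x : ||x|| <= 2} (closed ball).

def P1(x):
    # Metric projection onto the half-space C_1.
    y = x.copy()
    y[0] = max(y[0], 1.0)
    return y

def P2(x):
    # Metric projection onto the ball C_2.
    r = np.linalg.norm(x)
    return x if r <= 2.0 else 2.0 * x / r

def T(x):
    # T := P_1 P_2 is nonexpansive with Fix(T) = C_1 ∩ C_2.
    return P1(P2(x))

x = np.array([5.0, 5.0])
for _ in range(100):
    x = T(x)  # Picard iteration on the composition of projections

print(x)  # a point of C_1 ∩ C_2
```

The limit is a fixed point of $T$ and hence lies in $C_1 \cap C_2$, in line with $\mathrm{Fix}(T) = C$.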
  
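The second example can be sketched in the same way. The objective $f(x) = \frac{1}{2}\|x-b\|^2$ and the unit-ball constraint below are hypothetical choices; for this $f$, $\nabla f(x) = x - b$ is Lipschitz continuous with $L = 1$, so any $\alpha \in (0, 2/L] = (0, 2]$ is admissible:

```python
import numpy as np

# Hypothetical instance: minimize f(x) = (1/2)||x - b||^2 over the closed
# unit ball C. Here grad f(x) = x - b, so L = 1 and alpha = 1 is admissible.

b = np.array([3.0, 4.0])
alpha = 1.0

def grad_f(x):
    return x - b

def P_C(x):
    # Metric projection onto the closed unit ball.
    r = np.linalg.norm(x)
    return x if r <= 1.0 else x / r

def T(x):
    # T := P_C (Id - alpha grad f) is nonexpansive for alpha in (0, 2/L].
    return P_C(x - alpha * grad_f(x))

x = np.zeros(2)
for _ in range(50):
    x = T(x)

# The fixed point of T is the minimizer of f over C, here b/||b|| = (0.6, 0.8).
print(x)
```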
  
  
===== Acceleration Algorithms for Solving Fixed Point Problems =====
==== Acceleration of the Krasnosel'skii-Mann Algorithm ====
We numerically and theoretically compared our algorithm with the Krasnosel'skii-Mann algorithm and showed that it reduces the running time and the number of iterations needed to find a fixed point. This work is summarized in the following papers (our papers are available from the [[en:intro:publications|Publications]] page):
  * K. Fujiwara and [[:en:iiduka:|H. Iiduka]]: [[http://www.ybook.co.jp/online2/oplna/vol3/p189.html|Modification of the Krasnosel'skii-Mann Fixed Point Algorithm by Using Three-term Conjugate Gradients]], Linear and Nonlinear Analysis, Vol. 3, No. 2, pp. 189-202, 2017.
  * [[:en:iiduka:|H. Iiduka]]: [[http://fixedpointtheoryandapplications.springeropen.com/articles/10.1186/s13663-016-0567-7|Line Search Fixed Point Algorithms Based on Nonlinear Conjugate Gradient Directions: Application to Constrained Smooth Convex Optimization]], Fixed Point Theory and Applications, Vol. 2016, No. 77, 2016.
  * K. Hishinuma and [[:en:iiduka:|H. Iiduka]]: [[http://www.ybook.co.jp/online-p/JNCA/Open/16/jncav16n11p2243-oa/FLASH/index.html|On Acceleration of the Krasnosel'skii-Mann Fixed Point Algorithm Based on Conjugate Gradient Method for Smooth Optimization]], Journal of Nonlinear and Convex Analysis: Special Issue Dedicated to Wataru Takahashi on the occasion of his 70th birthday, Vol. 16, No. 11, pp. 2243-2254, 2015.
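For reference, the baseline Krasnosel'skii-Mann iteration $x_{k+1} = x_k + \lambda_k (T(x_k) - x_k)$ that these papers accelerate can be sketched as follows. The rotation mapping is a hypothetical nonexpansive example chosen because plain Picard iteration $x_{k+1} = T(x_k)$ would circle forever, while the averaged iteration converges to the unique fixed point $0$:

```python
import numpy as np

# Krasnosel'skii-Mann iteration: x_{k+1} = x_k + lambda_k (T(x_k) - x_k).
# Hypothetical T: rotation by 90 degrees, nonexpansive with Fix(T) = {0}.

R = np.array([[0.0, -1.0],
              [1.0,  0.0]])   # rotation matrix

def T(x):
    return R @ x

x = np.array([1.0, 0.0])
for k in range(500):
    lam = 0.5  # any (lambda_k) with sum lambda_k (1 - lambda_k) = infinity works
    x = x + lam * (T(x) - x)

print(np.linalg.norm(x))  # tends to 0
```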
  
  
==== Acceleration of the Halpern Algorithm ====
We presented an algorithm to accelerate the Halpern algorithm and proved that, under certain assumptions, it converges strongly to a fixed point of $T$. Numerical comparisons with the Halpern algorithm showed that our algorithm reduces the running time and the number of iterations needed to find a fixed point. This work is summarized in the following paper (available from the [[en:intro:publications|Publications]] page):
  * [[en:kaito:|K. Sakurai]] and [[en:iiduka:|H. Iiduka]]: [[http://www.fixedpointtheoryandapplications.com/content/2014/1/202|Acceleration of the Halpern Algorithm to Search for a Fixed Point of a Nonexpansive Mapping]], Fixed Point Theory and Applications, Vol. 2014, 202, 2014.
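For reference, the baseline Halpern iteration $x_{k+1} = \alpha_k u + (1 - \alpha_k) T(x_k)$, with $\alpha_k \to 0$ and $\sum_k \alpha_k = \infty$, can be sketched as follows. The half-space projection is a hypothetical choice of $T$; the iterates converge strongly to the fixed point of $T$ nearest the anchor $u$:

```python
import numpy as np

# Halpern iteration: x_{k+1} = alpha_k u + (1 - alpha_k) T(x_k),
# here with alpha_k = 1/(k+1).
# Hypothetical T: the metric projection onto the half-space
# C = {x : x[0] <= 0}, so Fix(T) = C and the limit is P_C(u).

def T(x):
    return np.array([min(x[0], 0.0), x[1]])

u = np.array([2.0, 3.0])
x = u.copy()
for k in range(20000):
    alpha = 1.0 / (k + 1)
    x = alpha * u + (1 - alpha) * T(x)

print(x)  # approaches P_C(u) = (0, 3)
```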
  
==== Related Acceleration Methods ====
We have presented the following acceleration methods for convex optimization over the fixed point sets of nonexpansive mappings. If you are interested, please see the following papers (available from the [[en:intro:publications|Publications]] page):
   * [[en:iiduka:|H. Iiduka]]: [[http://link.springer.com/article/10.1007/s10107-013-0741-1|Acceleration Method for Convex Optimization over the Fixed Point Set of a Nonexpansive Mapping]], Mathematical Programming, Vol. 149, No. 1, pp. 131-165, 2015.   * [[en:iiduka:|H. Iiduka]]: [[http://link.springer.com/article/10.1007/s10107-013-0741-1|Acceleration Method for Convex Optimization over the Fixed Point Set of a Nonexpansive Mapping]], Mathematical Programming, Vol. 149, No. 1, pp. 131-165, 2015.
   * [[en:iiduka:|H. Iiduka]]: [[http://www.sciencedirect.com/science/article/pii/S0096300311000099|Three-term Conjugate Gradient Method for the Convex Optimization Problem over the Fixed Point Set of a Nonexpansive Mapping]], Applied Mathematics and Computation, Vol. 217, No. 13, pp. 6315-6327, 2011.   * [[en:iiduka:|H. Iiduka]]: [[http://www.sciencedirect.com/science/article/pii/S0096300311000099|Three-term Conjugate Gradient Method for the Convex Optimization Problem over the Fixed Point Set of a Nonexpansive Mapping]], Applied Mathematics and Computation, Vol. 217, No. 13, pp. 6315-6327, 2011.
  * [[en:iiduka:|H. Iiduka]] and [[http://www.sp.ss.titech.ac.jp/index.php?Isao%20Yamada|I. Yamada]]: [[http://epubs.siam.org/action/showAbstract?page=1881&volume=19&issue=4&journalCode=sjope8|A Use of Conjugate Gradient Direction for the Convex Optimization Problem over the Fixed Point Set of a Nonexpansive Mapping]], SIAM Journal on Optimization, Vol. 19, No. 4, pp. 1881-1893, 2009.
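These methods build on the hybrid steepest descent framework for minimizing a smooth convex $f$ over $\mathrm{Fix}(T)$, whose basic iteration is $x_{n+1} = T(x_n) - \lambda_n \nabla f(T(x_n))$ with $\lambda_n \to 0$, $\sum_n \lambda_n = \infty$. The sketch below shows only that classical baseline on a hypothetical instance (quadratic $f$, half-space projection), not the accelerated conjugate-gradient-based methods of the papers above:

```python
import numpy as np

# Hybrid steepest descent baseline:
#   x_{n+1} = T(x_n) - lambda_n * grad f(T(x_n)),  lambda_n = 1/(n+1).
# Hypothetical instance: minimize f(x) = (1/2)||x - b||^2 over
# Fix(T) = C = {x : x[0] <= 0}, where T = P_C; the solution is P_C(b).

b = np.array([2.0, 3.0])

def T(x):
    # Metric projection onto the half-space C.
    return np.array([min(x[0], 0.0), x[1]])

def grad_f(x):
    return x - b

x = np.array([5.0, 0.0])
for n in range(20000):
    lam = 1.0 / (n + 1)
    y = T(x)
    x = y - lam * grad_f(y)

print(x)  # approaches P_C(b) = (0, 3)
```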
  
 • en/intro/researches/fixedpoint.1458813037.txt.gz
 • Last modified: 2018/06/01 16:41
 • (external edit)