Method of Multipliers

On the Primal Dual Method of Multipliers and its Applications

We propose a new processing architecture for distributed image fusion (which will work in both the WSN and antenna array examples discussed above) based on elementwise general form consensus [15] using the Primal-Dual Method of Multipliers (PDMM) [155, 158]. Our system will optimize the image held by each sensor node by exploiting the additional information held by neighbouring nodes, effectively performing fusion on the mutual area viewed by each pair of nodes. Our algorithm will, in fact, exploit information from all nodes sharing mutual image areas, provided the nodes that make up the communication path between any two nodes also share the same mutual image area. This process will operate using asynchronous and independent updates at each node, eliminating the need for a synchronous update iteration clock. The result of our algorithm is a network of nodes whose image observations may be improved via image information fused throughout the entire network using only local updates, without each node being required to hold a full image of the entire network imaging area.
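For context, the elementwise general form consensus underlying this architecture can be sketched (our notation, not necessarily the exact model of the paper) as

\[
\min_{\{x_i\}} \; \sum_{i \in \mathcal{V}} f_i(x_i)
\quad \text{s.t.} \quad S_{ij}\, x_i = S_{ji}\, x_j \;\; \forall (i,j) \in \mathcal{E},
\]

where x_i is the image block held by node i, f_i its local data-fidelity/regularization term, and S_{ij} selects the pixels of x_i that overlap with neighbour j. PDMM attaches a dual variable to each edge constraint and updates each (x_i, dual) pair using only messages exchanged with neighbours, which is what makes the asynchronous, clock-free operation described above possible.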

Convergence analysis on a modified generalized alternating direction method of multipliers

The alternating direction method of multipliers (ADMM) is one of the most powerful and successful methods for solving convex composite minimization problems. The generalized ADMM relaxes both the variables and the multipliers with a common relaxation factor in (0, 2), which has the potential to enhance the performance of the classic ADMM. Very recently, two different variants of the semi-proximal generalized ADMM have been proposed. They allow the weighting matrix in the proximal terms to be positive semidefinite, which makes the subproblems relatively easy to evaluate. One of the variants of the semi-proximal generalized ADMM has been analyzed
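As a reminder of the setting (the standard relaxed form, not the specific semi-proximal variants analyzed here), the generalized ADMM for min f(x) + g(z) s.t. Ax + Bz = c applies a relaxation factor ρ ∈ (0, 2) as follows:

\[
\begin{aligned}
x^{k+1} &= \arg\min_x \; f(x) + \tfrac{\beta}{2}\,\big\|Ax + Bz^k - c + \lambda^k/\beta\big\|^2,\\
r^{k+1} &= \rho\,A x^{k+1} - (1-\rho)\,(Bz^k - c),\\
z^{k+1} &= \arg\min_z \; g(z) + \tfrac{\beta}{2}\,\big\|r^{k+1} + Bz - c + \lambda^k/\beta\big\|^2,\\
\lambda^{k+1} &= \lambda^k + \beta\,\big(r^{k+1} + Bz^{k+1} - c\big).
\end{aligned}
\]

The semi-proximal variants mentioned above add terms \tfrac{1}{2}\|x - x^k\|_S^2 and \tfrac{1}{2}\|z - z^k\|_T^2 with positive semidefinite S and T to the two subproblems.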

Compressed sensing MRI via fast linearized preconditioned alternating direction method of multipliers

Methods: In this paper, we propose an efficient algorithm, referred to as the fast linearized preconditioned alternating direction method of multipliers (FLPADMM), to solve an augmented TV-regularized model that adds a quadratic term to enforce image smoothness. Because of the separable structure of this model, FLPADMM decomposes the convex problem into two subproblems. Each subproblem can be minimized alternately via an augmented Lagrangian function. Furthermore, a linearized strategy and a multistep weighted scheme can easily be combined for more effective image recovery. Results: The method of the present study showed improved accuracy and efficiency in comparison to other methods. Furthermore, experiments conducted on in vivo data showed that our algorithm achieves a higher signal-to-noise ratio (SNR), lower relative error (Rel.Err), and better structural similarity (SSIM) index than other state-of-the-art algorithms.
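One plausible shape of such an augmented TV-regularized model (a schematic sketch in our own notation, not necessarily the authors' exact formulation) is

\[
\min_{x} \; \mathrm{TV}(x) \;+\; \tfrac{\alpha}{2}\,\|\nabla x\|_2^2 \;+\; \tfrac{\lambda}{2}\,\|F_u x - b\|_2^2,
\]

where F_u is the undersampled Fourier (k-space sampling) operator, b the acquired data, and the quadratic gradient term is the added smoothness penalty. Splitting the nonsmooth TV term from the smooth terms gives the two subproblems that FLPADMM alternates between, with the linearization applied to the quadratic parts.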

The convergence rate of the proximal alternating direction method of multipliers with indefinite proximal regularization

The proximal alternating direction method of multipliers (P-ADMM) is an efficient first-order method for solving separable convex minimization problems. Recently, He et al. have further studied the P-ADMM and relaxed the proximal regularization matrix of its second subproblem to be indefinite. This is especially significant in practical applications, since an indefinite proximal matrix can allow a larger step size for the corresponding subproblem and thus often accelerates the overall convergence of the P-ADMM. In this paper, without assuming that the feasible set of the studied problem is bounded or that the objective component θ_i(·) is strongly convex, we prove the worst-case O(1/t) convergence rate.
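For orientation, the proximal term attached to the second subproblem of the P-ADMM has the form \tfrac{1}{2}\|y - y^k\|_T^2, and the indefinite choice studied by He et al. can be sketched (our notation; the exact admissible range of τ is what the analysis has to establish) as

\[
T \;=\; \tau\, r\, I \;-\; \beta\, B^{\top} B, \qquad 0 < \tau < 1, \quad r \ge \beta\,\|B^{\top} B\|,
\]

where β is the penalty parameter. Taking τ < 1 makes T possibly indefinite but enlarges the effective step size of the y-subproblem, while τ ≥ 1 recovers the usual positive semidefinite proximal regularization.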

Short-term Sparse Portfolio Optimization Based on Alternating Direction Method of Multipliers

We propose a short-term sparse portfolio optimization (SSPO) system based on the alternating direction method of multipliers (ADMM). Although some existing strategies have also exploited sparsity, they either constrain the quantity of the portfolio change or aim at long-term portfolio optimization. Very few of them are dedicated to constructing sparse portfolios for short-term portfolio optimization, a gap the proposed SSPO fills. SSPO concentrates wealth on a small proportion of assets that have good increasing potential according to some empirical financial principles, so as to maximize the cumulative wealth over the whole investment. We also propose a solving algorithm based on ADMM to handle the ℓ1-regularization term and the self-financing constraint simultaneously. As a significant improvement in the proposed ADMM, we have proven that its augmented Lagrangian has a saddle point, which is the foundation of the iterative formulae of ADMM but is seldom addressed by other sparsity strategies. Extensive experiments on 5 benchmark data sets from real-world stock markets show that SSPO outperforms other state-of-the-art systems in thorough evaluations, withstands reasonable transaction costs and runs fast. Thus it is suitable for real-world financial environments.
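To illustrate how ADMM can handle an ℓ1 term and a self-financing (budget) constraint at the same time, here is a minimal Python sketch for the toy problem min_b ½‖b − c‖² + λ‖b‖₁ s.t. 1ᵀb = s. The quadratic loss, the budget value s and all names are illustrative assumptions, not the SSPO objective itself.

import numpy as np

def soft_threshold(v, kappa):
    """Elementwise soft-thresholding: the proximal operator of kappa * ||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - kappa, 0.0)

def admm_l1_budget(c, lam, s=1.0, rho=1.0, n_iter=200):
    """Toy ADMM for: min_b 0.5*||b - c||^2 + lam*||z||_1  s.t.  b = z, 1^T b = s.

    The b-update is an equality-constrained quadratic solved in closed form via
    its KKT conditions; the z-update is soft-thresholding; the scaled dual u
    accumulates the consensus residual b - z.
    """
    n = len(c)
    z = np.zeros(n)
    u = np.zeros(n)
    for _ in range(n_iter):
        # b-update: minimize 0.5*||b - c||^2 + (rho/2)*||b - z + u||^2 s.t. 1^T b = s
        w = (c + rho * (z - u)) / (1.0 + rho)
        b = w - (w.sum() - s) / n          # shift onto the budget hyperplane
        # z-update: proximal step for the l1 term
        z = soft_threshold(b + u, lam / rho)
        # dual update
        u = u + b - z
    return b

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    c = rng.normal(size=20)                # stand-in for an asset "potential" score
    b = admm_l1_budget(c, lam=0.3)
    print("budget:", b.sum())              # approximately s = 1
    print("near-zero weights:", int(np.sum(np.abs(b) < 1e-6)))

The closed-form b-update is the main point: the budget (self-financing) constraint is absorbed into one subproblem while the ℓ1 term is handled by the other, which is the same division of labour the SSPO solver exploits.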

Distributed resource allocation for MISO downlink systems via the alternating direction method of multipliers

We have provided distributed algorithms for the radio resource allocation problems in multicell downlink multi-input single-output systems. Specifically, we have considered two optimization problems: P1 - minimization of the total transmission power subject to signal-to-interference-plus-noise ratio (SINR) constraints for each user, and P2 - SINR balancing subject to the total transmit power constraint of the BSs. We have proposed consensus-based distributed algorithms and a fast solution method via the alternating direction method of multipliers. First, we have derived a distributed algorithm for problem P1. Then, in conjunction with the bracketing method, the algorithm is extended to problem P2. Numerical results show that the proposed distributed algorithms converge very fast to the optimal centralized solution.
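Written out (our schematic restatement, with w_k the beamforming vector of user k, γ_k its SINR target, and P_max the power budget), the two problems are

\[
\text{P1:}\quad \min_{\{w_k\}} \; \sum_{k} \|w_k\|_2^2
\quad \text{s.t.} \quad \mathrm{SINR}_k(\{w_j\}) \ge \gamma_k \;\; \forall k,
\]
\[
\text{P2:}\quad \max_{\{w_k\}} \; \min_{k} \; \frac{\mathrm{SINR}_k(\{w_j\})}{\gamma_k}
\quad \text{s.t.} \quad \sum_{k} \|w_k\|_2^2 \le P_{\max}.
\]

The bracketing step solves P2 by bisection: at each candidate SINR level it checks feasibility (essentially an instance of P1) and halves the search interval accordingly, so the distributed P1 solver is reused as the inner routine.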

Double regularization medical CT image blind restoration reconstruction based on proximal alternating direction method of multipliers

A double-regularization medical CT image blind restoration reconstruction method was proposed. The objective function, which includes both the clear image and the point spread function, was established. To avoid over-smoothing and to protect detail, the objective function includes two regularization terms: total variation (TV) and wavelet sparsity. The objective function was solved by the alternating direction method of multipliers (ADMM), and the optimal solution was obtained. First, the CT image blind restoration reconstruction was decomposed into two sub-problems: reconstructed image estimation and point spread function estimation. Furthermore, each sub-problem can be solved by the proximal alternating direction method of multipliers. Finally, the CT image blind restoration reconstruction was realized. The experimental results show that the proposed algorithm takes into account the degradation effect of the projection data and is superior to other existing algorithms in subjective visual quality. In terms of objective evaluation, the proposed algorithm improves image quality metrics such as peak signal-to-noise ratio (PSNR), structural similarity index metric (SSIM), and universal image quality index (UIQI).
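A schematic form of the double-regularized blind restoration objective described above (our notation; λ1 and λ2 weight the two priors and h is the unknown point spread function) is

\[
\min_{x,\,h}\; \tfrac{1}{2}\,\|y - h * x\|_2^2 \;+\; \lambda_1\,\mathrm{TV}(x) \;+\; \lambda_2\,\|W x\|_1,
\]

where y is the degraded image, * denotes convolution, and W is a wavelet transform. Alternating between the x-subproblem (image estimation) and the h-subproblem (point spread function estimation), each handled by the proximal ADMM, yields the two-stage scheme described above.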

A symmetric version of the generalized alternating direction method of multipliers for two block separable convex programming

This paper introduces a symmetric version of the generalized alternating direction method of multipliers for two-block separable convex programming with linear equality constraints, which inherits the strengths of the classical alternating direction method of multipliers (ADMM) and extends the feasible set of the relaxation factor α of the generalized ADMM to the infinite interval [1, +∞). Under the conditions that the objective function is convex and the solution set is nonempty, we establish the convergence results of the proposed method, including global convergence and the worst-case O(1/k) convergence rate in both the ergodic and the non-ergodic senses, where k denotes the iteration counter. Numerical experiments on decoding a sparse signal arising in compressed sensing are included to illustrate the efficiency of the new method.
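For reference, a generic symmetric ADMM template for min f(x) + g(y) s.t. Ax + By = b updates the multiplier twice per sweep; one common form (with step factors r and s, not the specific relaxation scheme analyzed in this paper) reads

\[
\begin{aligned}
x^{k+1} &= \arg\min_x\; f(x) + \tfrac{\beta}{2}\big\|Ax + By^k - b - \lambda^k/\beta\big\|^2,\\
\lambda^{k+1/2} &= \lambda^k - r\,\beta\,\big(Ax^{k+1} + By^k - b\big),\\
y^{k+1} &= \arg\min_y\; g(y) + \tfrac{\beta}{2}\big\|Ax^{k+1} + By - b - \lambda^{k+1/2}/\beta\big\|^2,\\
\lambda^{k+1} &= \lambda^{k+1/2} - s\,\beta\,\big(Ax^{k+1} + By^{k+1} - b\big).
\end{aligned}
\]

The method above can be read as a symmetric counterpart of the generalized (relaxed) ADMM, with the relaxation factor α allowed to range over [1, +∞).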

Alternating direction method of multipliers for the extended trust region subproblem

The extended trust region subproblem has recently been the focus of considerable research. Under various assumptions, strong duality and certain SOCP/SDP relaxations have been established for several of its classes. Given its importance, in this paper we apply the widely used alternating direction method of multipliers (ADMM) to solve it without any assumption on the problem. The convergence of the ADMM iterations to first-order stationarity conditions is established. On several classes of test problems, the quality of the solution obtained by the ADMM for medium-scale problems is compared with that of the SOCP/SDP relaxation. Moreover, the applicability of the method to large-scale problems is shown by solving several large instances.
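Concretely, the extended trust region subproblem and a natural ADMM splitting for it can be sketched (our notation) as

\[
\min_{x}\; x^{\top} Q x + 2\,q^{\top} x
\quad \text{s.t.}\quad \|x\|_2 \le \Delta, \quad A x \le b,
\]

which ADMM can attack by introducing a copy z of x, assigning the ball constraint to one block and the linear inequalities (together with the quadratic objective) to the other, and alternating the two minimizations while updating the multiplier for the coupling constraint x = z. Since Q may be indefinite, the iterates are only guaranteed to satisfy first-order stationarity conditions, which matches the convergence statement above.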

A regularized alternating direction method of multipliers for a class of nonconvex problems

Hong, M.Y., Luo, Z.Q., Razaviyayn, M.: Convergence analysis of alternating direction method of multipliers for a family of nonconvex problems. Wang, F.H., Xu, Z.B., Xu, H.K.: Convergenc[r]

A fundamental proof of convergence of alternating direction method of multipliers for weakly convex optimization

Recently, work similar to this manuscript [4, 19] has investigated the use of the primal-dual hybrid gradient (PDHG) method to solve (1) and has also established convergence. Although the PDHG (a.k.a. the relaxed alternating minimization algorithm (AMA) [20]) is superficially similar to the ADMM, the two are quite different. In fact, the PDHG has a deep relationship with the inexact Uzawa method [19, 20].
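For comparison, the PDHG iteration for min_x f(x) + g(Kx), stated here in the standard Chambolle-Pock form only to make the contrast with ADMM concrete, is

\[
\begin{aligned}
y^{k+1} &= \mathrm{prox}_{\sigma g^{*}}\big(y^{k} + \sigma K \bar{x}^{k}\big),\\
x^{k+1} &= \mathrm{prox}_{\tau f}\big(x^{k} - \tau K^{\top} y^{k+1}\big),\\
\bar{x}^{k+1} &= x^{k+1} + \theta\,\big(x^{k+1} - x^{k}\big),
\end{aligned}
\]

where g^{*} is the convex conjugate of g. Unlike ADMM, no subproblem involving K^{\top}K has to be solved, which is one concrete way to see that the two methods, although both derivable from a Lagrangian saddle-point formulation, are genuinely different.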

Alternating Direction Method of Multipliers (ADMM) Based Deconvolving Images with Unknown Boundaries

The simulation studies show the output of the proposed ADMM-based method in Fig. 3, which includes the original image, the degraded image, and the estimated image produced by running the ADMM algorithm. The cost function indicates the number of iterations needed to obtain the estimated image from the degraded image. The ISNR values of the estimated image, obtained using the frame-based analysis within ADMM, are tabulated in Table 2.

A customized proximal point algorithm for stable principal component pursuit with nonnegative constraint

The new customized PPA described above is known as the alternating direction method of multipliers (ADMM) with two blocks. Its global convergence has been proven in many works in the literature. However, there are three variables in the problem. If we apply the customized PPA to the SPCP problem directly, the convergence of the algorithm cannot be guaranteed. Moreover, the proximal mapping of ‖L‖_* + I(L ≥ 0) is difficult to compute. By introducing a new auxiliary variable K, we can group L and S as one big block [L; S], and group Z and K as another big block [Z; K]. Then the problem can be rewritten in an equivalent two-block form.
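Schematically, the grouping step turns the multi-variable problem into the standard two-block template min F(u) + G(v) s.t. Mu + Nv = d with u = [L; S] and v = [Z; K]. One hedged sketch of such a reformulation for the nonnegativity-constrained SPCP model (the precise constraints on Z and the role of K are our illustrative assumptions) is

\[
\min_{L,S,Z,K}\; \|L\|_{*} + \lambda\,\|S\|_{1} + \mathcal{I}\big(\|Z - M\|_F \le \delta\big) + \mathcal{I}\big(K \ge 0\big)
\quad \text{s.t.}\quad L + S = Z, \;\; L = K,
\]

where the two linear constraints couple the block [L; S] only with the block [Z; K], so the customized PPA / two-block ADMM applies with its usual convergence guarantee, and the difficult joint proximal mapping of \|L\|_{*} + \mathcal{I}(L \ge 0) is never needed.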

Latent Constrained Correlation Filter

The framework of the proposed Latent Constrained Correlation Filters (LCCF) is shown in Fig. 1. To find the solution sampling in the training process, unlike the ad-hoc approach that directly feeds all samples in to train correlation filters, we train sub-filters step by step based on iteratively selected subsets. Instead of estimating a true distribution for the unsolved variable, which is generally very difficult, we exploit sampled solutions to form a subspace, in which a bad solution obtained from a noisy sample set can theoretically be recovered after being projected onto this subspace within an Alternating Direction Method of Multipliers (ADMM) scheme. Eventually, we can find a better result from the subspace (subset) that contains different kinds of solutions to our problem. From the optimization perspective, the subspace is actually used to constrain the solution, as shown in Fig. 1. In fact, the above constrained learning method is a balanced learning model across different training sets. Applying constraints derived from the data structure in the learning model is capable of achieving good solutions [6], [5], [36]. This is also confirmed by [5], in which topological constraints are learned by using data structure information. In [36], Zhang et al. put forward a new ADMM method that can include manifold constraints during the optimization process of sparse representation classification (SRC). These methods all achieve promising results by adding constraints.

Estimation of Graphical Models through Structured Norm Minimization

The alternating direction method of multipliers (ADMM) is widely used for solving structured convex optimization problems due to its superior performance in practice; see Scheinberg et al. (2010); Boyd et al. (2011); Hong and Luo (2017); Lin et al. (2015, 2016); Sun et al. (2015); Davis and Yin (2015); Hajinezhad and Hong (2015); Hajinezhad et al. (2016). On the theoretical side, Chen et al. (2016) provided a counterexample showing that the ADMM may fail to converge when the number of blocks exceeds two. Hence, many authors reformulate the problem of estimating a Markov Random Field model as a two-block ADMM algorithm by grouping the variables and introducing auxiliary variables (Ma et al., 2013; Mohan et al., 2012; Tan et al., 2014). However, in the context of large-scale optimization problems, the grouping ADMM method becomes expensive due to its high memory requirements. Moreover, despite the lack of convergence guarantees under standard convexity assumptions, it has been observed by many researchers that the unmodified multi-block ADMM with Gauss-Seidel updates often outperforms all of its modified versions in practice (Wang et al., 2013; Sun et al., 2015; Davis and Yin, 2015).

On the global and linear convergence of direct extension of ADMM for 3 block separable convex minimization models

In this paper, we show that when the alternating direction method of multipliers (ADMM) is extended directly to 3-block separable convex minimization problems, it is convergent if one block of the objective possesses sub-strong monotonicity, which is weaker than strong convexity. In particular, we estimate the globally linear convergence rate of the direct extension of ADMM, measured by the iteration complexity, under some additional conditions.
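The direct extension in question cycles through the three blocks with a single multiplier update per sweep; for min θ1(x1) + θ2(x2) + θ3(x3) s.t. A1 x1 + A2 x2 + A3 x3 = b it reads

\[
\begin{aligned}
x_1^{k+1} &= \arg\min_{x_1}\; \mathcal{L}_\beta\big(x_1, x_2^{k}, x_3^{k}, \lambda^{k}\big),\\
x_2^{k+1} &= \arg\min_{x_2}\; \mathcal{L}_\beta\big(x_1^{k+1}, x_2, x_3^{k}, \lambda^{k}\big),\\
x_3^{k+1} &= \arg\min_{x_3}\; \mathcal{L}_\beta\big(x_1^{k+1}, x_2^{k+1}, x_3, \lambda^{k}\big),\\
\lambda^{k+1} &= \lambda^{k} - \beta\,\big(A_1 x_1^{k+1} + A_2 x_2^{k+1} + A_3 x_3^{k+1} - b\big),
\end{aligned}
\]

where \mathcal{L}_\beta is the augmented Lagrangian. Without extra assumptions this scheme need not converge; the sub-strong monotonicity of one block is the extra condition that secures convergence here.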

Preconditioned ADMM with nonlinear operator constraint

Abstract. We present a modification of the well-known Alternating Direction Method of Multipliers (ADMM) algorithm with additional preconditioning that aims at solving convex optimisation problems with nonlinear operator constraints. Connections to the recently developed Nonlinear Primal-Dual Hybrid Gradient Method (NL-PDHGM) are presented, and the algorithm is demonstrated to handle the nonlinear inverse problem of parallel Magnetic Resonance Imaging (MRI).
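The setting can be summarized as min_x f(x) + g(A(x)) with a nonlinear operator A; a schematic restatement of the splitting (our notation) is

\[
\min_{x,\,z}\; f(x) + g(z) \quad \text{s.t.}\quad A(x) = z,
\]

where, within each ADMM sweep, the nonlinear x-subproblem is made tractable by linearizing A around the current iterate, A(x) \approx A(x^{k}) + [\nabla A(x^{k})]\,(x - x^{k}), together with a preconditioning (proximal) term that keeps the linearized subproblem cheap; this is the same local-linearization device that underlies NL-PDHGM.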

Ptychographic Algorithm Using Dual Tree Complex Wavelet Transform

Abstract. Reconstructing the complex-valued image of interest from multiple diffraction patterns is the goal of ptychography. Previous ptychographic algorithms often suffer from low reconstruction quality at low overlap ratios. To address this issue, we propose a novel ptychographic phase retrieval (PR) algorithm that exploits the sparsity of the image in the dual-tree complex wavelet domain. The Fourier magnitude measurements are used to construct a data-fidelity term, and the sparse representation of the image over the dual-tree complex wavelet transform provides a sparsity-inducing regularization term. The data-fidelity term and the proposed regularization term are combined to formulate a ptychographic PR optimization problem. The alternating direction method of multipliers (ADMM) and a gradient descent algorithm are used to solve the corresponding optimization problem. The experimental results indicate that, compared with previous algorithms, the proposed algorithm can obtain reconstructed images of high quality even at low overlap ratios.
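A schematic version of the resulting optimization problem (our notation: x the complex image, P_j the j-th probe/illumination window, y_j the measured Fourier magnitudes, W the dual-tree complex wavelet transform) is

\[
\min_{x}\; \sum_{j} \Big\| \,\big|\mathcal{F}(P_j\, x)\big| - y_j \Big\|_2^2 \;+\; \lambda\,\|W x\|_1,
\]

where the first term is the data-fidelity term built from the Fourier magnitude measurements and the second is the sparsity-inducing regularizer. ADMM splits the two terms, and gradient descent handles the nonconvex magnitude-fit subproblem.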

Solving total variation image super resolution problems via proximal symmetric alternating direction methods

The single image super-resolution (SISR) problem represents a class of efficient models that are appealing in many computer vision applications. In this paper, we focus on designing a proximal symmetric alternating direction method of multipliers (SADMM) for the SISR problem. By fully exploiting the special structure, the method enjoys the advantage of being easily implementable by linearizing the quadratic term of the subproblems in the SISR problem. With this linearization, the resulting

Computational methods of Gaussian Particle Swarm Optimization (GPSO) and Lagrange Multiplier on economic dispatch issues (case study on electrical system of Java-Bali IV area)

The PSO method used in this study is a PSO combined with a Gaussian probability distribution function (GPSO), which is used to generate random numbers. The Gaussian distribution can provide faster convergence in local search. The Gaussian distribution is used to generate random numbers in the interval [-1,1] for the cognitive part of the acceleration coefficient, the social part of the acceleration coefficient, and the
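As a rough illustration of the velocity update being described, here is a minimal Python sketch under the stated assumption that the random coefficients are Gaussian draws clipped to [-1, 1]; all names and parameter values are illustrative, not taken from the paper.

import numpy as np

def gpso_velocity_update(v, x, pbest, gbest, w=0.7, c1=1.5, c2=1.5, rng=None):
    """One GPSO-style velocity update for a swarm of particles.

    v, x, pbest have shape (n_particles, n_dims); gbest has shape (n_dims,).
    The cognitive and social random factors are Gaussian draws clipped to
    [-1, 1], replacing the uniform [0, 1] draws of standard PSO.
    """
    rng = rng or np.random.default_rng()
    r1 = np.clip(rng.normal(size=x.shape), -1.0, 1.0)   # cognitive randomness
    r2 = np.clip(rng.normal(size=x.shape), -1.0, 1.0)   # social randomness
    return w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)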
