The steepest descent method


An efficient algorithm for steepest descent method for unconstrained optimization

The steepest descent method with exact line searches along each steepest descent direction converges very slowly. Barzilai and Borwein suggested two stepsizes that yield superlinear convergence and perform quite well in practice. The Barzilai-Borwein method is not monotone, however, and is therefore not easily generalized to general nonlinear functions. Yuan proposed a new stepsize that enables fast convergence while retaining the monotone property, and modified it to obtain a modified new steepest descent method for convex quadratic problems only. The new steepest descent method applies the new stepsize after every m exact line search iterations. This paper proposes an algorithm for m = 2.
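As a rough illustration of the stepsizes discussed above (not the paper's own m = 2 algorithm), steepest descent with the first Barzilai-Borwein stepsize on a convex quadratic might be sketched as follows; the matrix `A`, vector `b`, initial stepsize, and iteration count are illustrative assumptions:

```python
import numpy as np

def bb_gradient_descent(grad, x0, n_iter=50):
    """Gradient descent with the (first) Barzilai-Borwein stepsize.

    grad : callable returning the gradient at x
    x0   : starting point
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    alpha = 1e-4                      # small initial stepsize
    for _ in range(n_iter):
        x_new = x - alpha * g
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g   # differences used by the BB formula
        denom = s @ y
        if denom > 0:                 # BB1 stepsize: (s^T s) / (s^T y)
            alpha = (s @ s) / denom
        x, g = x_new, g_new
    return x

# Convex quadratic f(x) = 0.5 x^T A x - b^T x; the minimizer solves A x = b
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x_star = bb_gradient_descent(lambda x: A @ x - b, np.zeros(2))
```

Note that the iterates are not monotone in f, which is exactly the property that motivates Yuan's monotone stepsize.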

Mann type hybrid steepest descent method for three nonlinear problems

We work in the setting of an infinite-dimensional Hilbert space. The iterative algorithm is based on Korpelevich's extragradient method, the viscosity approximation method, Mann's iteration method, and the hybrid steepest-descent method. Our aim is to prove that the iterative algorithm converges strongly to a common element of these sets, which also solves some hierarchical variational inequality. We observe that related results have been derived in the literature.


A viscosity hybrid steepest descent method for a system of equilibrium and fixed point problems for an infinite family of strictly pseudo contractive mappings

In this paper, we introduce a new iterative scheme in a Hilbert space H that mixes the iterative schemes (.) and (.). Using a viscosity hybrid steepest-descent method, we prove that the sequence converges strongly to a common element of the set of solutions of the system of equilibrium problems and the set of common fixed points of an infinite family of strictly pseudo-contractive mappings. The results obtained in this paper improve and extend the above-mentioned results and many others. Finally, we give a simple numerical example to support and illustrate our main theorem.

A relaxed hybrid steepest descent method for common solutions of generalized mixed equilibrium problems and fixed point problems

In the setting of Hilbert spaces, we introduce a relaxed hybrid steepest descent method for finding a common element of the set of fixed points of a nonexpansive mapping, the set of solutions of a variational inequality for an inverse strongly monotone mapping and the set of solutions of generalized mixed equilibrium problems. We prove the strong convergence of the method to the unique solution of a suitable variational inequality. The results obtained in this article improve and extend the corresponding results.


The hybrid steepest descent method for solutions of equilibrium problems and other problems in fixed point theory

In this paper, we combine the gradient projection algorithm and the hybrid steepest descent method and prove strong convergence to a common element of the set of solutions of the equilibrium problem, the null space of an inverse strongly monotone operator, the set of fixed points of a continuous pseudocontractive mapping, and the set of minimizers of a convex function. This common element is proved to be the unique solution of a variational inequality problem.


Efficient implementation of a modified and relaxed hybrid steepest descent method for a type of variational inequality

Yamada developed a hybrid steepest-descent method for solving VI(F, K) [7,8], but choosing an efficient and implementable nonexpansive mapping is still a difficult problem. Subsequently, Xu and Kim [9] and Zeng et al. [10] proved the convergence of the hybrid steepest-descent method. Noor introduced iterations after analysis of several three-step iterative methods [18]. Ding et al. provided a three-step relaxed hybrid steepest-descent method for variational inequalities [11], and Yao et al. [19] provided a simple proof of the method under different conditions. Our group has described a modified and relaxed hybrid steepest descent (MRHSD) method that makes greater use of historical information and minimizes information loss [20].
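For orientation only, here is a minimal sketch of the basic hybrid steepest-descent iteration x_{n+1} = T(x_n) - λ_n μ F(T(x_n)) for VI(F, Fix(T)); the toy choices of T, F, μ, and the stepsize sequence below are assumptions for illustration, not the MRHSD method itself:

```python
import numpy as np

def hybrid_steepest_descent(T, F, x0, mu=0.5, n_iter=2000):
    """Sketch of a hybrid steepest-descent iteration for VI(F, Fix(T)):
    x_{n+1} = T(x_n) - lam_n * mu * F(T(x_n)),
    with lam_n -> 0 and sum(lam_n) = infinity."""
    x = np.asarray(x0, dtype=float)
    for n in range(1, n_iter + 1):
        lam = 1.0 / n                 # diminishing, non-summable stepsizes
        y = T(x)                      # nonexpansive step toward Fix(T)
        x = y - lam * mu * F(y)       # steepest-descent correction
    return x

# Toy example: T projects onto the unit ball (so Fix(T) = unit ball),
# F(x) = x - a is strongly monotone; the VI solution is the projection of a.
a = np.array([2.0, 0.0])
T = lambda x: x / max(1.0, np.linalg.norm(x))
F = lambda x: x - a
x = hybrid_steepest_descent(T, F, np.zeros(2))
```

With these choices the iterates approach [1, 0], the point of the unit ball closest to a.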

A Generalized Hybrid Steepest-Descent Method for Variational Inequalities in Banach Spaces

[5] J. T. Oden, Qualitative Methods in Nonlinear Mechanics, Prentice-Hall, Englewood Cliffs, NJ, USA, 1986.
[6] I. Yamada, "The hybrid steepest descent method for the variational inequality problem over the intersection of fixed point sets of nonexpansive mappings," in Inherently Parallel Algorithms in Feasibility and Optimization and Their Applications (Haifa, 2000), D. Butnariu, Y. Censor, and S. Reich, Eds., vol. 8 of Studies in Computational Mathematics, pp. 473–504, North-Holland, Amsterdam, The Netherlands, 2001.


Hybrid Steepest Descent Method with Variable Parameters for General Variational Inequalities

[11] I. Yamada, "The hybrid steepest-descent method for variational inequality problems over the intersection of the fixed point sets of nonexpansive mappings," in Inherently Parallel Algorithms in Feasibility and Optimization and Their Applications (Haifa, 2000), D. Butnariu, Y. Censor, and S. Reich, Eds., vol. 8, pp. 473–504, North-Holland, Amsterdam, The Netherlands, 2001.
[12] H. K. Xu and T. H. Kim, "Convergence of hybrid steepest-descent methods for variational inequalities."


Modelling of LFM Spectrum as Rectangle using Steepest Descent Method

All four chirps have the same k and τ, and the time-domain amplitude is taken to be 1, as in the steepest descent method. With this, the individual spectra have amplitudes proportional to the square root of their individual BT products; therefore, to model the spectrum as a rectangle, each spectrum must be normalized with respect to the square root of its BT product, which is also convenient for plotting.
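A tiny numerical sketch of this normalization step; the BT values below are made-up placeholders, not the paper's data:

```python
import numpy as np

# Hypothetical BT (bandwidth x duration) products for four chirps
BT = np.array([10.0, 20.0, 40.0, 80.0])

# Spectrum amplitudes scale like sqrt(BT), so the four rectangles
# would otherwise have different heights
amplitudes = np.sqrt(BT)

# Normalizing each spectrum by the square root of its own BT product
# brings all four rectangles to the same unit height for plotting
normalized = amplitudes / np.sqrt(BT)
```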


The hybrid steepest descent method for solving variational inequality over triple hierarchical problems

3. Yao, JC, Chadli, O: Pseudomonotone complementarity problems and variational inequalities. In: Crouzeix, JP, Haddjissas, N, Schaible, S (eds.) Handbook of Generalized Convexity and Monotonicity, pp. 501-558 (2005)
4. Kirk, WA: A fixed point theorem for mappings which do not increase distances. Am. Math. Mon. 72, 1004-1006 (1965)
5. Combettes, PL: A block-iterative surrogate constraint splitting method for quadratic signal recovery. IEEE Trans. Signal Process.


Finite Step Relaxed Hybrid Steepest Descent Methods for Variational Inequalities

The classical variational inequality problem with a Lipschitzian and strongly monotone operator on a nonempty closed convex subset in a real Hilbert space was studied. A new finite-step relaxed hybrid steepest-descent method for this class of variational inequalities was introduced. Strong convergence of this method was established under suitable assumptions imposed on the algorithm parameters.


Optimal Control of Microgrid Networks Using Gradient Descent and Differential Evolution Methods

Classical gradient methods and evolutionary algorithms represent two very different classes of optimization techniques. In optimization, a problem is typically specified by a set of parameters and an objective function, which is also called a fitness function in the context of evolutionary algorithms. The goal of the optimization process is to find a set of variables such that the objective function is optimal.

In the special case of continuous parameter optimization, in which all parameters are real valued, the gradient method, also known as the method of steepest descent and usually credited to Cauchy, applies. For unimodal functions, the optimum can be found by moving along the local negative-gradient direction, which leads directly to the steepest-descent update rule. Evidently, steepest-descent algorithms can be applied only to continuously differentiable objective functions. If the objective function is not continuously differentiable, or if the function is not (completely) given due to limited knowledge, as often occurs in real-world applications, the designer has to resort to other methods, such as evolutionary algorithms.

Evolutionary algorithms are a class of stochastic optimization and adaptation techniques inspired by natural evolution. Each evolutionary algorithm is designed along a different methodology; despite their differences, all evolutionary algorithms are heuristic population-based search procedures that incorporate random variation and selection.
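The update rule in question is x_{k+1} = x_k - η ∇f(x_k). A minimal sketch on an assumed unimodal quadratic (the function, stepsize, and iteration count are illustrative, not taken from the microgrid study):

```python
import numpy as np

def steepest_descent(grad, x0, eta=0.1, n_iter=200):
    """Fixed-stepsize steepest descent: x_{k+1} = x_k - eta * grad(x_k)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iter):
        x = x - eta * grad(x)        # step along the negative gradient
    return x

# Unimodal example: f(x, y) = (x - 1)^2 + 2*(y + 3)^2, minimized at (1, -3)
grad = lambda x: np.array([2.0 * (x[0] - 1.0), 4.0 * (x[1] + 3.0)])
x = steepest_descent(grad, [0.0, 0.0])
```

This also makes the limitation explicit: the method needs `grad` to exist and be evaluable everywhere, which is precisely what evolutionary algorithms do not require.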

Strong convergence of relaxed hybrid steepest-descent methods for triple hierarchical constrained optimization

2. Yamada, I: The hybrid steepest-descent method for the variational inequality problem over the intersection of fixed-point sets of nonexpansive mappings. In: Butnariu, D, Censor, Y, Reich, S (eds.) Inherently Parallel Algorithms in Feasibility and Optimization and Their Applications, pp. 473–504. Kluwer Academic Publishers, Dordrecht, Netherlands (2001)


An Iterative Algorithm for Generalized Mixed Equilibria with Variational Inequalities

Abstract—In this paper, we shall introduce an iterative algorithm by multi-step implicit hybrid steepest-descent method for finding a common element of the set of solutions of a finite family of generalized mixed equilibrium problems, the set of solutions of a finite family of variational inequalities for inverse strongly monotone mappings and the set of fixed points of a countable family of nonexpansive mappings in a real Hilbert space. We also prove strong and weak convergence theorems for the proposed iterative algorithm under appropriate conditions. Our results improve and extend the earlier and recent results in the literature.

Synchronal algorithm and cyclic algorithm for fixed point problems and variational inequality problems in Hilbert spaces

Yamada, I: The hybrid steepest-descent method for variational inequality problems over the intersection of the fixed point sets of nonexpansive mappings. In: Butnariu, D, Censor, Y, Reich, S (eds.) Inherently Parallel Algorithms in Feasibility and Optimization and Their Applications, pp. 473–504. Kluwer Academic Publishers, Dordrecht, Netherlands (2001)


Adaptive Filtering using Steepest Descent and LMS Algorithm

In many practical scenarios, we are required to filter a signal whose exact frequency response is not known. A solution to this problem is an adaptive filter: it can automatically adjust to changing system requirements and can be modelled to perform specific filtering and decision-making tasks. This paper primarily focuses on the implementation of the two most widely used algorithms for noise cancelling, which form the crux of adaptive filtering. The empirical explanation of the steepest descent method is elucidated, along with its simulation in MATLAB, by taking a noise-corrupted signal and applying the algorithm to recover the desired noise-free response. Furthermore, this paper also sheds light on a more sophisticated algorithm based on the minimum mean square error criterion, called the Least Mean Square (LMS) algorithm. Additionally, system identification, one of the various applications of adaptive filtering, is briefly explained to emphasize the instances where it can be used.
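A hedged sketch of the LMS weight update described above, applied to the system-identification use case the abstract mentions; the 4-tap system `h`, the stepsize, and the signal lengths are illustrative assumptions, and the paper's own MATLAB implementation may differ:

```python
import numpy as np

def lms_filter(x, d, n_taps=4, mu=0.01):
    """Least Mean Square adaptive filter: w <- w + mu * e[n] * u[n].

    x : input signal, d : desired signal, mu : adaptation stepsize.
    Returns the filter output y and the final tap weights w.
    """
    w = np.zeros(n_taps)
    y = np.zeros(len(x))
    for n in range(n_taps - 1, len(x)):
        u = x[n - n_taps + 1:n + 1][::-1]   # x[n], x[n-1], ..., newest first
        y[n] = w @ u                        # filter output
        e = d[n] - y[n]                     # estimation error
        w = w + mu * e * u                  # LMS weight update
    return y, w

# System identification: learn an unknown 4-tap FIR system from noisy data
rng = np.random.default_rng(0)
x = rng.standard_normal(5000)
h = np.array([0.5, -0.3, 0.2, 0.1])         # hypothetical unknown system
d = np.convolve(x, h)[:len(x)] + 0.01 * rng.standard_normal(len(x))
y, w = lms_filter(x, d)                     # w should approach h
```

The stepsize mu trades convergence speed against steady-state weight noise, which is the minimum-mean-square-error trade-off the abstract alludes to.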

Accelerated Mann and CQ algorithms for finding a fixed point of a nonexpansive mapping

When Fix(T) is the set of all minimizers of a convex, continuously Fréchet differentiable functional f over H, that algorithm is the steepest descent method to minimize f over H. Suppose that the gradient of f, denoted by ∇f, is Lipschitz continuous with a constant L > 0, and define T_f : H → H by


A New Conjugacy Coefficient of Conjugate Gradient Method

for all x ∈ S. The convergence of the steepest descent method with an Armijo-type line search is proved under very general conditions in (Andrei, N., 2009). On the other hand, in (Dai, Y. and Y. Yuan, 2000) it is proved that, for any conjugate gradient method with a strong Wolfe line search, the following general result holds. Lemma 4.2.1:


A new explicit iteration method for variational inequalities on the set of common fixed points for a finite family of nonexpansive mappings

In this paper, we introduce a new explicit iteration method based on the steepest descent method and Krasnoselskii-Mann type method for finding a solution of a variational inequality involving a Lipschitz continuous and strongly monotone mapping on the set of common fixed points for a finite family of nonexpansive mappings in a real Hilbert space.
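For intuition, here is a bare sketch of the Krasnoselskii-Mann averaging step that such methods build on; the rotation map below is an assumed toy nonexpansive mapping with fixed point 0, not the paper's setting:

```python
import numpy as np

def krasnoselskii_mann(T, x0, alpha=0.5, n_iter=200):
    """Krasnoselskii-Mann iteration: x_{n+1} = (1-alpha)*x_n + alpha*T(x_n)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iter):
        x = (1 - alpha) * x + alpha * T(x)   # averaged step toward Fix(T)
    return x

# Toy nonexpansive map: rotation by 90 degrees, whose only fixed point is 0.
# Plain iteration x <- T(x) would orbit forever; the averaging makes it converge.
R = np.array([[0.0, -1.0], [1.0, 0.0]])
x = krasnoselskii_mann(lambda v: R @ v, [1.0, 1.0])
```

The averaging parameter alpha is what a hybrid method combines with a steepest-descent correction for the strongly monotone mapping.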


A new iteration method for variational inequalities on the set of common fixed points for a finite family of quasi pseudocontractions in Hilbert spaces

In this paper, we propose a new iteration method based on the hybrid steepest descent method and an Ishikawa-type method for seeking a solution of a variational inequality involving a Lipschitz continuous and strongly monotone mapping on the set of common fixed points for a finite family of Lipschitz continuous quasi-pseudocontractions in a real Hilbert space.

