Research Article | Open Access

Shamshad Husain, Nisha Singh, "An Iterative Method for Finding Common Solution of the Fixed Point Problem of a Finite Family of Nonexpansive Mappings and a Finite Family of Variational Inequality Problems in Hilbert Space", *Journal of Applied Mathematics*, vol. 2019, Article ID 6875789, 11 pages, 2019. https://doi.org/10.1155/2019/6875789

# An Iterative Method for Finding Common Solution of the Fixed Point Problem of a Finite Family of Nonexpansive Mappings and a Finite Family of Variational Inequality Problems in Hilbert Space

**Academic Editor:** Lucas Jodar

#### Abstract

In this paper, a hybrid iterative algorithm is proposed for finding a common element of the set of common fixed points of a finite family of nonexpansive mappings and the set of common solutions of variational inequalities for inverse strongly monotone mappings in a real Hilbert space. We establish the strong convergence of the proposed method for approximating a common element of the above defined sets under suitable conditions. The results presented in this paper extend and improve some well-known corresponding results in the earlier and recent literature.

#### 1. Introduction

Throughout, let $H$ be a real Hilbert space whose inner product and norm are denoted by $\langle\cdot,\cdot\rangle$ and $\|\cdot\|$, respectively. Let $C$ be a nonempty closed convex subset of $H$ and let $P_C$ be the metric projection of $H$ onto $C$. A mapping $T: C \to C$ is called nonexpansive if $\|Tx - Ty\| \le \|x - y\|$ for all $x, y \in C$. The set of fixed points of $T$ is denoted by $\mathrm{Fix}(T)$.

Let $A: C \to H$ be a nonlinear mapping. The classical variational inequality problem associated with the set $C$, denoted by $\mathrm{VI}(C, A)$, is to find $x^* \in C$ such that
$$\langle Ax^*, x - x^*\rangle \ge 0 \quad \text{for all } x \in C. \tag{2}$$
The solution set of (2) is denoted by $\mathrm{VI}(C, A)$. The variational inequality problem is a fundamental problem in variational analysis and has been extensively studied by many researchers in the past decades; see Yao and Chadli [1], Zeng, Schaible, and Yao [2], and the references therein.

Problem (2) was first discussed by Lions [3]. Since then, many different approaches have been developed to solve (2) in finite- and infinite-dimensional spaces, and research in this direction is still ongoing.

Korpelevich [4] proposed the following modification of the gradient projection method, called the extragradient algorithm, for solving (2) in the Euclidean space $\mathbb{R}^n$:
$$y_n = P_C(x_n - \lambda Ax_n), \qquad x_{n+1} = P_C(x_n - \lambda Ay_n),$$
with $\lambda \in (0, 1/k)$, where $k$ is the Lipschitz constant of $A$. This method, which has since received great attention, was shown to generate sequences $\{x_n\}$ and $\{y_n\}$ that converge to a solution of (2) under certain assumptions.
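For intuition, the extragradient step can be run numerically on a small example. The following sketch (illustrative data, not from the paper) solves the VI for the affine monotone operator $A(x) = Mx + q$ over the box $C = [0,1]^2$, where the projection $P_C$ is a coordinatewise clip; the matrix, step size, and iteration count are illustrative choices:

```python
import numpy as np

# Monotone affine operator A(x) = M x + q (illustrative choice);
# the symmetric part of M is the identity, so A is strongly monotone,
# with Lipschitz constant k = ||M|| = sqrt(2).
M = np.array([[1.0, 1.0], [-1.0, 1.0]])
q = np.array([-1.0, -1.0])
A = lambda x: M @ x + q

# Metric projection onto the box C = [0,1]^2 is a coordinatewise clip.
P_C = lambda x: np.clip(x, 0.0, 1.0)

lam = 0.5                  # step size lambda in (0, 1/k)
x = np.array([0.5, 0.5])
for _ in range(500):
    y = P_C(x - lam * A(x))    # predictor (extragradient) step
    x = P_C(x - lam * A(y))    # corrector step

print(x)  # approaches the solution (0, 1), where A(x*) = 0
```

Since $A(0,1) = 0$ and $(0,1) \in C$, the point $(0,1)$ solves the VI, and the iterates approach it.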

Takahashi and Toyoda [5] introduced an iterative algorithm for finding an element of $\mathrm{Fix}(S) \cap \mathrm{VI}(C, A)$, under the assumption that $S$ is nonexpansive and $A$ is $\alpha$-inverse strongly monotone, as
$$x_{n+1} = \alpha_n x_n + (1 - \alpha_n) S P_C(x_n - \lambda_n Ax_n),$$
where $\{\alpha_n\}$ is a sequence in $(0, 1)$ and $\{\lambda_n\}$ is a sequence in $(0, 2\alpha)$.

Iiduka and Takahashi [6] proposed another iterative algorithm for the same problem as
$$x_{n+1} = \alpha_n x + (1 - \alpha_n) S P_C(x_n - \lambda_n Ax_n).$$

Marino and Xu [7] introduced the following iterative algorithm based on the viscosity approximation method:
$$x_{n+1} = \alpha_n \gamma f(x_n) + (I - \alpha_n A) T x_n,$$
where $A$ is a strongly positive bounded linear operator on $H$, $T$ is nonexpansive, and $f$ is a contraction. They proved that if the sequence $\{\alpha_n\}$ satisfies appropriate conditions, then the sequence $\{x_n\}$ generated by (4) converges strongly to the unique solution $x^*$ of the variational inequality
$$\langle (A - \gamma f)x^*, x - x^*\rangle \ge 0 \quad \text{for all } x \in \mathrm{Fix}(T).$$
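A minimal numerical sketch of the viscosity scheme, assuming the simplifying illustrative choices $A = I$ and $\gamma = 1$ (so the step reduces to $x_{n+1} = \alpha_n f(x_n) + (1 - \alpha_n)Tx_n$), with a toy nonexpansive mapping $T$ and contraction $f$; the point $c$ and step counts are illustrative:

```python
import numpy as np

# Viscosity iteration with A = I, gamma = 1 (illustrative choices):
#   x_{n+1} = a_n f(x_n) + (1 - a_n) T x_n
c = np.array([2.0, -1.0])
T = lambda x: 0.5 * (x + c)   # nonexpansive (in fact a contraction), Fix(T) = {c}
f = lambda x: 0.5 * x         # contraction with coefficient 1/2

x = np.zeros(2)
for n in range(1, 5000):
    a_n = 1.0 / (n + 1)       # a_n -> 0 and sum a_n = infinity
    x = a_n * f(x) + (1.0 - a_n) * T(x)

print(x)  # converges to c, the unique point of Fix(T)
```

Since $\mathrm{Fix}(T)$ is the singleton $\{c\}$ here, the limiting variational inequality over $\mathrm{Fix}(T)$ is trivially solved by $c$ itself.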

Qin and Cho [8] introduced an extended composite iterative algorithm in which $f$ is a contraction, $T$ is a nonexpansive mapping, and $A$ is a strongly positive linear bounded self-adjoint operator. Under certain appropriate assumptions on the parameters, the sequence $\{x_n\}$ defined by (8) converges strongly to a fixed point of $T$, which solves the variational inequality (7).

Yamada [9] considered the hybrid steepest descent method for solving the VIP over the set of fixed points of a nonexpansive mapping. Moudafi [10] proposed the viscosity approximation method of selecting a particular fixed point of a given nonexpansive mapping which is also a solution of a variational inequality problem.

In this paper, we introduce and analyze an iterative algorithm obtained by combining Korpelevich's extragradient method, the viscosity approximation method, and the hybrid steepest descent method. We prove that, under certain conditions, the proposed algorithm converges strongly to a common element of the set of common fixed points of a finite family of nonexpansive mappings and the set of common solutions of a finite family of variational inequality problems, which is the unique solution of a variational inequality problem.

#### 2. Preliminaries

Let $H$ be a real Hilbert space and $C$ a nonempty closed convex subset of $H$. For every point $x \in H$, there exists a unique nearest point in $C$, denoted by $P_C x$, such that $\|x - P_C x\| \le \|x - y\|$ for all $y \in C$; $P_C$ is called the metric projection of $H$ onto $C$. It is well known that $P_C$ is a nonexpansive mapping of $H$ onto $C$ and satisfies $\langle x - y, P_C x - P_C y\rangle \ge \|P_C x - P_C y\|^2$ for every $x, y \in H$. It is easy to see that $z = P_C x$ if and only if $\langle x - z, y - z\rangle \le 0$ for all $y \in C$.
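For a box in $\mathbb{R}^n$ the metric projection is a coordinatewise clip, so the characterization above is easy to check numerically. A minimal sketch (illustrative, not from the paper), assuming $C = [0,1]^3$:

```python
import numpy as np

rng = np.random.default_rng(0)

# Projection onto the box C = [0,1]^3 and the characterization
#   z = P_C(x)  <=>  <x - z, y - z> <= 0 for all y in C.
P_C = lambda x: np.clip(x, 0.0, 1.0)

x = rng.normal(size=3) * 3.0   # arbitrary point of H = R^3
z = P_C(x)
for _ in range(1000):
    y = rng.uniform(0.0, 1.0, size=3)          # random point of C
    assert np.dot(x - z, y - z) <= 1e-12       # variational characterization

# Nonexpansiveness: ||P_C a - P_C b|| <= ||a - b||
a, b = rng.normal(size=3), rng.normal(size=3)
assert np.linalg.norm(P_C(a) - P_C(b)) <= np.linalg.norm(a - b) + 1e-12
print("projection characterization verified")
```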

*Definition 1.* A mapping $A: C \to H$ is said to be

(1) monotone, if $\langle Ax - Ay, x - y\rangle \ge 0$ for all $x, y \in C$;

(2) $k$-Lipschitz, if there exists a constant $k > 0$ such that $\|Ax - Ay\| \le k\|x - y\|$ for all $x, y \in C$;

(3) $\alpha$-inverse strongly monotone, if there exists a positive real number $\alpha$ such that $\langle Ax - Ay, x - y\rangle \ge \alpha\|Ax - Ay\|^2$ for all $x, y \in C$.

*Remark 2.* Any $\alpha$-inverse strongly monotone mapping $A$ is monotone and $\frac{1}{\alpha}$-Lipschitz continuous. Indeed, monotonicity is immediate since $\alpha\|Ax - Ay\|^2 \ge 0$, and by the Cauchy–Schwarz inequality $\alpha\|Ax - Ay\|^2 \le \langle Ax - Ay, x - y\rangle \le \|Ax - Ay\|\,\|x - y\|$, which gives $\|Ax - Ay\| \le \frac{1}{\alpha}\|x - y\|$.
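The remark can be illustrated numerically. A minimal sketch (illustrative data, not from the paper), assuming $A(x) = Mx$ with $M$ symmetric positive definite, in which case $A$ is $\alpha$-inverse strongly monotone with $\alpha = 1/\lambda_{\max}(M)$:

```python
import numpy as np

rng = np.random.default_rng(1)

# A(x) = M x with M symmetric positive definite is alpha-ism with
# alpha = 1/lambda_max(M); hence monotone and (1/alpha)-Lipschitz.
M = np.array([[2.0, 0.5], [0.5, 1.0]])
alpha = 1.0 / np.linalg.eigvalsh(M).max()
A = lambda x: M @ x

for _ in range(1000):
    x, y = rng.normal(size=2), rng.normal(size=2)
    ism = np.dot(A(x) - A(y), x - y)
    # inverse strong monotonicity
    assert ism >= alpha * np.linalg.norm(A(x) - A(y))**2 - 1e-9
    # monotonicity
    assert ism >= -1e-12
    # (1/alpha)-Lipschitz continuity
    assert np.linalg.norm(A(x) - A(y)) <= (1.0 / alpha) * np.linalg.norm(x - y) + 1e-9
print("Remark 2 verified for this M")
```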

*Definition 3.* A mapping $f: C \to C$ is a contraction if there exists a constant $\rho \in (0, 1)$ such that $\|f(x) - f(y)\| \le \rho\|x - y\|$ for all $x, y \in C$.

Lemma 4 (see [7]). *Assume that $A$ is a strongly positive linear bounded self-adjoint operator on a Hilbert space $H$ with coefficient $\bar{\gamma} > 0$ and $0 < \rho \le \|A\|^{-1}$. Then $\|I - \rho A\| \le 1 - \rho\bar{\gamma}$.*

*Definition 5.* A mapping $T: C \to C$ is said to be a $k$-strictly pseudocontractive mapping if there exists $k \in [0, 1)$ such that $\|Tx - Ty\|^2 \le \|x - y\|^2 + k\|(I - T)x - (I - T)y\|^2$ for all $x, y \in C$.

Lemma 6 (see [11]). *Let $H$ be a Hilbert space and let $C$ be a closed convex subset of $H$. For any integer $N \ge 1$, assume that, for each $1 \le i \le N$, $T_i: C \to C$ is a $k_i$-strictly pseudocontractive mapping for some $0 \le k_i < 1$. Assume that $\{\eta_i\}_{i=1}^{N}$ is a positive sequence such that $\sum_{i=1}^{N}\eta_i = 1$. Then $\sum_{i=1}^{N}\eta_i T_i$ is a $k$-strictly pseudocontractive mapping with $k = \max\{k_i : 1 \le i \le N\}$.*

Lemma 7 (see [11]). *Let $\{T_i\}_{i=1}^{N}$ and $\{\eta_i\}_{i=1}^{N}$ be as in Lemma 6. Suppose that $\{T_i\}_{i=1}^{N}$ has a common fixed point in $C$. Then $\mathrm{Fix}\big(\sum_{i=1}^{N}\eta_i T_i\big) = \bigcap_{i=1}^{N}\mathrm{Fix}(T_i)$.*

Lemma 8 (see [12]). *Assume $\{a_n\}$ is a sequence of nonnegative real numbers such that $a_{n+1} \le (1 - \gamma_n)a_n + \delta_n$, $n \ge 0$, where $\{\gamma_n\}$ is a sequence in $(0, 1)$ and $\{\delta_n\}$ is a sequence such that *(i)* $\sum_{n=1}^{\infty}\gamma_n = \infty$;* (ii)* $\limsup_{n\to\infty}\delta_n/\gamma_n \le 0$ or $\sum_{n=1}^{\infty}|\delta_n| < \infty$.** Then $\lim_{n\to\infty} a_n = 0$.*
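A quick numerical illustration of Lemma 8; the sequences $\gamma_n = 1/(n+1)$ and $\delta_n = \gamma_n/(n+1)$ below are illustrative choices satisfying (i) and (ii):

```python
# Illustrative recursion a_{n+1} = (1 - gamma_n) a_n + delta_n with
#   gamma_n = 1/(n+1)        (sum gamma_n diverges)
#   delta_n = gamma_n/(n+1)  (delta_n / gamma_n -> 0)
# Lemma 8 then guarantees a_n -> 0; here a_n is O(log(n)/n).
a = 1.0
for n in range(1, 100000):
    gamma = 1.0 / (n + 1)
    delta = gamma / (n + 1)
    a = (1.0 - gamma) * a + delta

print(a)  # tends to 0 as n grows
```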

Lemma 9 (see [13], demiclosedness principle). *Let $C$ be a nonempty closed convex subset of a real Hilbert space $H$. Let $T$ be a nonexpansive self-mapping on $C$ with $\mathrm{Fix}(T) \ne \emptyset$. Then $I - T$ is demiclosed. That is, whenever $\{x_n\}$ is a sequence in $C$ weakly converging to some $x \in C$ and the sequence $\{(I - T)x_n\}$ strongly converges to some $y$, it follows that $(I - T)x = y$. Here $I$ is the identity operator of $H$.*

Rockafellar [14] defined a set-valued mapping $T: H \to 2^H$ to be monotone if, for all $x, y \in H$, $f \in Tx$ and $g \in Ty$ imply $\langle x - y, f - g\rangle \ge 0$. A set-valued mapping $T$ is maximal monotone if the graph $G(T)$ of $T$ is not properly contained in the graph of any other monotone mapping. It is known that a monotone mapping $T$ is maximal if and only if, for $(x, f) \in H \times H$, $\langle x - y, f - g\rangle \ge 0$ for every $(y, g) \in G(T)$ implies $f \in Tx$.

Let $A$ be a monotone mapping of $C$ into $H$ and let $N_C v$ be the normal cone to $C$ at $v \in C$, that is, $N_C v = \{w \in H : \langle v - u, w\rangle \ge 0 \text{ for all } u \in C\}$. Define
$$Tv = \begin{cases} Av + N_C v, & v \in C, \\ \emptyset, & v \notin C. \end{cases}$$
Then $T$ is maximal monotone and $0 \in Tv$ if and only if $v \in \mathrm{VI}(C, A)$.
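The conclusion $0 \in Tv \Leftrightarrow v \in \mathrm{VI}(C, A)$ can be checked on a small example via the equivalent fixed-point characterization $v \in \mathrm{VI}(C, A) \Leftrightarrow v = P_C(v - \lambda Av)$ for any $\lambda > 0$ (a standard consequence of the projection characterization above). The operator, set, and candidate point below are illustrative choices, not from the paper:

```python
import numpy as np

# Affine monotone operator A(x) = M x + q over the box C = [0,1]^2
# (illustrative data).
M = np.array([[1.0, 1.0], [-1.0, 1.0]])
q = np.array([-1.0, -1.0])
A = lambda x: M @ x + q
P_C = lambda x: np.clip(x, 0.0, 1.0)

v = np.array([0.0, 1.0])     # candidate solution: A(v) = 0
lam = 0.7
# Fixed-point characterization of VI(C, A)
assert np.allclose(v, P_C(v - lam * A(v)))

# Direct check of the VI inequality on a grid of points of C
for s in np.linspace(0.0, 1.0, 11):
    for t in np.linspace(0.0, 1.0, 11):
        x = np.array([s, t])
        assert np.dot(A(v), x - v) >= -1e-12
print("v solves VI(C, A)")
```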

#### 3. Main Results

*Algorithm 10.* Let $W$ be a closed convex subset of a real Hilbert space $H$, let $f: W \to W$ be a $\rho$-contraction with coefficient $\rho \in (0, 1)$, let $A$ be a strongly positive linear bounded self-adjoint operator, let $B_l: W \to H$ be a $\beta_l$-inverse strongly monotone mapping for each $1 \le l \le M$, where $M$ is some positive integer, and let $T_i: W \to W$, $1 \le i \le N$, be a finite family of nonexpansive mappings. Let $\lambda_l \in (0, 2\beta_l)$ for all $l$ and let $\{\alpha_n\}$ and $\{\beta_n\}$ be sequences in $(0, 1)$. For an arbitrarily given $x_0 \in W$, we propose the hybrid iterative algorithm (22). The sequence $\{x_n\}$ defined by (22) converges strongly to a common element of the set of common fixed points of the finite family of nonexpansive mappings and the set of common solutions of the variational inequalities for the inverse strongly monotone mappings, which solves a variational inequality problem.

Theorem 11. *Let $W$ be a closed convex subset of a real Hilbert space $H$, let $f: W \to W$ be a $\rho$-contraction, let $B_l: W \to H$ be a $\beta_l$-inverse strongly monotone mapping for each $1 \le l \le M$, let $\{T_i\}_{i=1}^{N}$ be a finite family of nonexpansive mappings with $\Omega = \bigcap_{i=1}^{N}\mathrm{Fix}(T_i) \cap \bigcap_{l=1}^{M}\mathrm{VI}(W, B_l) \ne \emptyset$, and let $A$ be a strongly positive linear bounded self-adjoint operator with the coefficient $\bar{\gamma} > 0$. Assume that $0 < \gamma < \bar{\gamma}/\rho$ and let $\lambda_l$ be real numbers in $(0, 2\beta_l)$ for each $l$. Let $\{\alpha_n\}$ and $\{\beta_n\}$ be sequences in $(0, 1)$ satisfying assumptions (i)–(iv). Then the sequence $\{x_n\}$ defined by (22) converges strongly to $x^* \in \Omega$, where $x^*$ is also the unique solution of the following variational inequality:
$$\langle (A - \gamma f)x^*, x - x^*\rangle \ge 0 \quad \text{for all } x \in \Omega.$$*

*Proof.* For any $x, y \in W$ and $\lambda_l \in (0, 2\beta_l)$, we have
$$\|(I - \lambda_l B_l)x - (I - \lambda_l B_l)y\|^2 = \|x - y\|^2 - 2\lambda_l\langle x - y, B_l x - B_l y\rangle + \lambda_l^2\|B_l x - B_l y\|^2 \le \|x - y\|^2 + \lambda_l(\lambda_l - 2\beta_l)\|B_l x - B_l y\|^2 \le \|x - y\|^2. \tag{24}$$
From (24), $P_W(I - \lambda_l B_l)$ is nonexpansive. Since $A$ is a linear bounded self-adjoint operator, $\|A\| = \sup\{|\langle Ax, x\rangle| : x \in H, \|x\| = 1\}$. Since $\alpha_n \to 0$, without loss of generality we may assume $\alpha_n \le \|A\|^{-1}$, so that $\langle (I - \alpha_n A)x, x\rangle = \|x\|^2 - \alpha_n\langle Ax, x\rangle \ge (1 - \alpha_n\|A\|)\|x\|^2 \ge 0$; hence $I - \alpha_n A$ is positive. Now, by Lemma 4, it follows that $\|I - \alpha_n A\| \le 1 - \alpha_n\bar{\gamma}$. Next, we prove that the sequence $\{x_n\}$ is bounded. Fixing $p \in \Omega$ and arguing by induction from $l = 1$ to $M$, we obtain, for every $n$, the bound (31). From (22) and (31), and by induction on $n$, the sequence $\{x_n\}$ is bounded, and so are the auxiliary sequences defined in (22).

Next, we show that (34) holds. From the definition of the sequence for each $l$ in Algorithm 10, we have (35). When $l = 1$, we have (36). From (35) and (36), for $l = 2, \dots, M$, we get the corresponding estimate. From conditions (i), (ii), and (iii), the error terms above tend to zero. From (42) and condition (i), the remaining terms also vanish, and from (34), (44), and (45), we get