Optimal robust smoothing extragradient algorithms for stochastic variational inequality problems

Farzad Yousefian, Angelia Nedich, Uday V. Shanbhag

Research output: Contribution to journal › Article

9 Citations (Scopus)

Abstract

We consider stochastic variational inequality problems where the mapping is monotone over a compact convex set. We present two robust variants of stochastic extragradient algorithms for solving such problems. The first scheme employs an iterative averaging technique with a generalized choice of the weights in the averaged sequence. Our first contribution is to show that, with an appropriate choice of these weights, a suitably defined gap function attains the optimal convergence rate of O(1/√k). In the second part of the paper, under an additional weak-sharpness assumption, we update the stepsize sequence using a recursive rule that leverages problem parameters. The second contribution lies in showing that, with such a sequence, the extragradient algorithm converges to the solution both almost surely and in a mean-squared sense, the latter at the rate O(1/k). Motivated by the possible absence of a Lipschitz constant, in both schemes we utilize a locally randomized smoothing scheme. Importantly, by producing a smooth approximation of the mapping, this scheme enables us to estimate a Lipschitz constant. The smoothing parameter is updated at each iteration, and we show that both algorithms converge to the solution of the original problem.
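As a rough illustration of the first scheme described in the abstract, the sketch below runs a stochastic extragradient method with weighted iterate averaging and a locally randomized smoothing of the mapping on a toy monotone problem. This is not the paper's exact algorithm: the mapping F, the box constraint set, the noise model, and the stepsize/weight/smoothing choices are all assumptions made for the example.

```python
import numpy as np

# Toy monotone VI: find x* in X with F(x*)^T (x - x*) >= 0 for all x in X,
# where F(x) = A x + b with A positive semidefinite and X = [-1, 1]^n.
rng = np.random.default_rng(0)
n = 5
M = rng.standard_normal((n, n))
A = M @ M.T / n                      # positive semidefinite => F is monotone
b = rng.standard_normal(n)

def F_noisy(x, sigma=0.05):
    """Stochastic oracle: F(x) plus zero-mean noise."""
    return A @ x + b + sigma * rng.standard_normal(n)

def F_smoothed(x, eps):
    """Locally randomized smoothing: single-sample estimate of the mapping
    averaged over a ball of radius eps around x (illustrative choice)."""
    u = rng.standard_normal(n)
    u *= eps * rng.random() ** (1.0 / n) / np.linalg.norm(u)  # uniform in ball
    return F_noisy(x + u)

def project(x):
    return np.clip(x, -1.0, 1.0)     # Euclidean projection onto the box

x = np.zeros(n)
x_avg = np.zeros(n)
wsum = 0.0
for k in range(1, 5001):
    gamma = 0.5 / np.sqrt(k)         # diminishing stepsize (assumed schedule)
    eps = 1.0 / k                    # smoothing parameter shrinks per iteration
    y = project(x - gamma * F_smoothed(x, eps))   # extrapolation step
    x = project(x - gamma * F_smoothed(y, eps))   # update uses mapping at y
    w = gamma                        # averaging weights tied to stepsizes
    wsum += w
    x_avg += (w / wsum) * (x - x_avg)             # running weighted average

# A small natural-map residual indicates an approximate VI solution.
resid = np.linalg.norm(x_avg - project(x_avg - (A @ x_avg + b)))
```

The averaged iterate `x_avg`, rather than the last iterate `x`, is the point for which the gap-function rate is established in this type of analysis.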

Original language: English (US)
Article number: 7040302
Pages (from-to): 5831-5836
Number of pages: 6
Journal: Proceedings of the IEEE Conference on Decision and Control
Volume: 2015-February
Issue number: February
DOIs: 10.1109/CDC.2014.7040302
State: Published - 2014
Externally published: Yes

ASJC Scopus subject areas

  • Control and Systems Engineering
  • Modeling and Simulation
  • Control and Optimization

Cite this

Optimal robust smoothing extragradient algorithms for stochastic variational inequality problems. / Yousefian, Farzad; Nedich, Angelia; Shanbhag, Uday V.

In: Proceedings of the IEEE Conference on Decision and Control, Vol. 2015-February, No. February, 7040302, 2014, p. 5831-5836.

@article{8aefc0058a9a4fc4b46fed1a677d1562,
title = "Optimal robust smoothing extragradient algorithms for stochastic variational inequality problems",
abstract = "We consider stochastic variational inequality problems where the mapping is monotone over a compact convex set. We present two robust variants of stochastic extragradient algorithms for solving such problems. Of these, the first scheme employs an iterative averaging technique where we consider a generalized choice for the weights in the averaged sequence. Our first contribution is to show that using an appropriate choice for these weights, a suitably defined gap function attains the optimal rate of convergence of O(1/√k). In the second part of the paper, under an additional assumption of weak-sharpness, we update the stepsize sequence using a recursive rule that leverages problem parameters. The second contribution lies in showing that employing such a sequence, the extragradient algorithm possesses almost-sure convergence to the solution as well as convergence in a mean-squared sense to the solution of the problem at the rate O(1/k). Motivated by the absence of a Lipschitzian parameter, in both schemes we utilize a locally randomized smoothing scheme. Importantly, by approximating a smooth mapping, this scheme enables us to estimate the Lipschitzian parameter. The smoothing parameter is updated per iteration and we show convergence to the solution of the original problem in both algorithms.",
author = "Yousefian, Farzad and Nedich, Angelia and Shanbhag, {Uday V.}",
year = "2014",
doi = "10.1109/CDC.2014.7040302",
language = "English (US)",
volume = "2015-February",
pages = "5831--5836",
journal = "Proceedings of the IEEE Conference on Decision and Control",
publisher = "IEEE",
number = "February",

}

TY - JOUR

T1 - Optimal robust smoothing extragradient algorithms for stochastic variational inequality problems

AU - Yousefian, Farzad

AU - Nedich, Angelia

AU - Shanbhag, Uday V.

PY - 2014

Y1 - 2014

AB - We consider stochastic variational inequality problems where the mapping is monotone over a compact convex set. We present two robust variants of stochastic extragradient algorithms for solving such problems. Of these, the first scheme employs an iterative averaging technique where we consider a generalized choice for the weights in the averaged sequence. Our first contribution is to show that using an appropriate choice for these weights, a suitably defined gap function attains the optimal rate of convergence of O(1/√k). In the second part of the paper, under an additional assumption of weak-sharpness, we update the stepsize sequence using a recursive rule that leverages problem parameters. The second contribution lies in showing that employing such a sequence, the extragradient algorithm possesses almost-sure convergence to the solution as well as convergence in a mean-squared sense to the solution of the problem at the rate O(1/k). Motivated by the absence of a Lipschitzian parameter, in both schemes we utilize a locally randomized smoothing scheme. Importantly, by approximating a smooth mapping, this scheme enables us to estimate the Lipschitzian parameter. The smoothing parameter is updated per iteration and we show convergence to the solution of the original problem in both algorithms.

UR - http://www.scopus.com/inward/record.url?scp=84988227345&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=84988227345&partnerID=8YFLogxK

U2 - 10.1109/CDC.2014.7040302

DO - 10.1109/CDC.2014.7040302

M3 - Article

VL - 2015-February

SP - 5831

EP - 5836

JO - Proceedings of the IEEE Conference on Decision and Control

JF - Proceedings of the IEEE Conference on Decision and Control

IS - February

M1 - 7040302

ER -