**Wednesday, June 8, 2016**

Padelford C-401

*11:00 – 11:45 am*
*Speaker:* Ting Kei Pong (Hong Kong Polytechnic)

*Title:* Explicit estimation of KL exponent and linear convergence of 1st-order methods

*Abstract:* In this talk, we study the Kurdyka-Łojasiewicz (KL) exponent, an important quantity for analyzing the convergence rate of first-order methods. Specifically, we show that many convex or nonconvex optimization models that arise in applications such as sparse recovery have objectives whose KL exponent is 1/2: this indicates that various first-order methods are locally linearly convergent when applied to these models. Our results cover the sparse logistic regression problem and the least squares problem with SCAD or MCP regularization. We achieve this by relating the KL inequality to an error bound concept studied extensively by Luo and Tseng, and by developing calculus rules for the KL exponent.

This is a joint work with Guoyin Li.
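For context, one common formulation of the KL property with exponent α at a stationary point can be sketched as follows; the notation and constants here are a standard textbook statement, not taken from the abstract itself:

```latex
% KL property of f with exponent \alpha \in [0,1) at \bar{x}:
% there exist c, \epsilon, \nu > 0 such that
\operatorname{dist}\bigl(0, \partial f(x)\bigr) \;\ge\; c\,\bigl(f(x) - f(\bar{x})\bigr)^{\alpha}
\quad \text{whenever } \|x - \bar{x}\| \le \epsilon \ \text{ and } \ f(\bar{x}) < f(x) < f(\bar{x}) + \nu.
```

The case α = 1/2 is the one highlighted in the abstract: it is the exponent that typically yields local linear convergence of first-order methods.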

*1:15 – 2:00 pm*
*Speaker:* Asen L. Dontchev (Mathematical Reviews)

*Title:* The Four Theorems of Lawrence Graves

*Abstract:* The classical inverse/implicit function theorems revolve around solving an equation in terms of a parameter and tell us when the solution mapping associated with this equation is a (differentiable) function of the parameter. Already in 1927, Hildebrandt and Graves observed that one can set differentiability aside and still obtain that the solution mapping is Lipschitz continuous. The idea has evolved through subsequent extensions, the best known of which are various reincarnations of the Lyusternik-Graves theorem. In the last several decades it has become widely accepted that, in order to derive estimates for the solution mapping, e.g., to put them to use in proving convergence of algorithms, it is sufficient to differentiate what you can and leave the rest as is, hoping that the resulting problem is easier to handle. More sophisticated results may be obtained by employing various forms of metric regularity, starting from abstract results on mappings acting in metric spaces and ending with applications to numerical analysis. I will focus in particular on strong metric subregularity, a property which, placed next to [strong] metric regularity, turns out to be equally instrumental in applications.
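For readers unfamiliar with the terminology, the following standard definitions contrast metric regularity with the strong metric subregularity emphasized in the talk; the notation is a sketch of the usual conventions, not the speaker's own statement:

```latex
% For a set-valued mapping F: X \rightrightarrows Y between metric spaces
% and a point (\bar{x}, \bar{y}) in its graph:
%
% Metric regularity at \bar{x} for \bar{y}: there exist \kappa \ge 0 and
% neighborhoods U of \bar{x} and V of \bar{y} such that
\operatorname{dist}\bigl(x, F^{-1}(y)\bigr) \;\le\; \kappa\,\operatorname{dist}\bigl(y, F(x)\bigr)
\quad \text{for all } x \in U,\ y \in V.
%
% Strong metric subregularity at \bar{x} for \bar{y}: there exist \kappa \ge 0
% and a neighborhood U of \bar{x} such that
\|x - \bar{x}\| \;\le\; \kappa\,\operatorname{dist}\bigl(\bar{y}, F(x)\bigr)
\quad \text{for all } x \in U.
```

Note that strong metric subregularity fixes the target value at ȳ and pins the estimate to the single point x̄, which is what makes it well suited to convergence estimates for algorithms.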

*2:00 – 2:45 pm*
*Speaker:* Alexander Ioffe (Technion)

*Title:* On variational inequalities over polyhedral sets

*Abstract:* The results on the regular behavior of solutions to variational inequalities over polyhedral sets, proved in a series of papers by Robinson, Ralph, and Dontchev-Rockafellar in the 1990s, have long since become classics of variational analysis. But the available proofs, focused on the study of piecewise affine mappings and Lipschitz homeomorphisms and based essentially on matrix algebra and/or topology, are rather complicated and make practically no use of the techniques of variational analysis. The only exception is the proof by Dontchev and Rockafellar of their “critical face” regularity criterion. I shall discuss a different approach based entirely on elementary polyhedral geometry and a few basic principles of metric regularity theory. It leads to new proofs, simpler and shorter, and in addition yields some clarifying geometric information.
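For reference, the variational inequality the abstract refers to can be stated in the following standard form; the notation here is assumed for illustration rather than taken from the talk:

```latex
% Variational inequality over a polyhedral convex set C \subseteq \mathbb{R}^n
% (i.e., C is the intersection of finitely many closed half-spaces),
% for a given mapping f: \mathbb{R}^n \to \mathbb{R}^n:
\text{find } \bar{x} \in C \ \text{ such that } \
\langle f(\bar{x}),\, y - \bar{x} \rangle \;\ge\; 0 \quad \text{for all } y \in C.
%
% Equivalently, 0 \in f(\bar{x}) + N_C(\bar{x}), where N_C(\bar{x}) denotes
% the normal cone to C at \bar{x}.
```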