Category Archives: Winter 2014

Marina Meila; Optimization for Ordered Data

February 25, 2014, 4:00pm
Guggenheim 218
Marina Meila, Department of Statistics, UW.
Optimization for Ordered Data

This talk is concerned with summarizing, by means of statistical models, data that expresses preferences. This data is typically a set of rankings of n items by a panel of experts; the simplest summary is the “consensus ranking”, or the “centroid”, of the set of rankings.

I will describe how combinatorics, algorithms and statistics come together in this endeavor. At the center of the results is the _code_ of a permutation. This natural mathematical way to represent a permutation is key both to fast computation and to a better understanding of the models we create. The models have permutations _as parameters_, and fitting them to data leads to combinatorial optimization problems which are often NP-hard. The talk will introduce a class of algorithms for attacking these problems that are exact but intractable in the worst case, and will demonstrate that they are effective on real-world examples.

Joint work with Alnur Ali, Raman Arora, Le Bao, Jeff Bilmes, Harr Chen, Bhushan Mandhani, Chris Meek, Brendan Murphy, Kapil Phadnis, Arthur Patterson.
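
For readers unfamiliar with the code of a permutation mentioned in the abstract, here is a minimal Python sketch (not the speaker's software; all names are made up for illustration). It computes the code as an inversion table and checks the standard fact that the Kendall-tau distance to the identity ranking equals the sum of the code's entries.

```python
# Minimal sketch: the "code" (inversion table) of a permutation, and the fact
# that the Kendall-tau distance to the identity equals the sum of its entries.

def permutation_code(pi):
    """code[i] counts entries after position i that are smaller than pi[i]."""
    n = len(pi)
    return [sum(1 for j in range(i + 1, n) if pi[j] < pi[i]) for i in range(n)]

def kendall_tau_to_identity(pi):
    """Number of pairwise disagreements with the identity ranking 0, 1, ..., n-1."""
    n = len(pi)
    return sum(1 for i in range(n) for j in range(i + 1, n) if pi[i] > pi[j])

if __name__ == "__main__":
    pi = [2, 0, 3, 1]                      # a ranking of 4 items
    code = permutation_code(pi)            # [2, 0, 1, 0]
    print(code)
    print(sum(code), kendall_tau_to_identity(pi))   # both equal 3
```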

Mario Micheli; An Optimization Approach for Image Recovery Through Optical Turbulence

February 11, 2014, 4:00pm
Guggenheim 218
Mario Micheli, Department of Mathematics, UW.
An Optimization Approach for Image Recovery Through Optical Turbulence

Abstract: The phenomenon commonly referred to as optical “turbulence” in imaging is caused by the time- and space-varying refractive index of the air, which is due, among other factors, to temperature, air pressure, humidity, and wind conditions between the acquired scene and the image-capturing device. The resulting distortion may be modeled, at least to a first approximation, as the combination of a blurring effect and a time-dependent deformation of the image domain. After briefly introducing the mathematical modeling of images and video, in this talk I shall describe an algorithm that corrects for the geometric deformation and the blur that affect the images. Both the estimation of the image deformations and the deblurring are formulated as optimization problems, to be solved with variational techniques. Results from several experiments will be shown.
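
As a rough illustration of the variational viewpoint (not the speaker's algorithm, which also estimates the domain deformation), the sketch below deblurs a synthetic image by gradient descent on a Tikhonov-regularized deconvolution energy; the blur level, step size, and regularization weight are arbitrary choices.

```python
# Minimal sketch: minimize  ||K u - f||^2 + lam * ||grad u||^2  by gradient
# descent, where K is a symmetric Gaussian blur (self-adjoint with periodic
# boundaries).  The deformation-estimation step of the talk is omitted.
import numpy as np
from scipy.ndimage import gaussian_filter

def laplacian(u):
    """5-point finite-difference Laplacian with periodic boundaries."""
    return (np.roll(u, 1, 0) + np.roll(u, -1, 0) +
            np.roll(u, 1, 1) + np.roll(u, -1, 1) - 4.0 * u)

def deblur(f, sigma=2.0, lam=0.05, step=0.2, iters=200):
    """Gradient descent on the regularized deconvolution energy."""
    K = lambda v: gaussian_filter(v, sigma, mode="wrap")   # blur operator
    u = f.copy()
    for _ in range(iters):
        grad = 2.0 * K(K(u) - f) - 2.0 * lam * laplacian(u)
        u -= step * grad
    return u

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    clean = np.zeros((64, 64)); clean[24:40, 24:40] = 1.0   # a synthetic "scene"
    observed = gaussian_filter(clean, 2.0, mode="wrap") \
        + 0.01 * rng.standard_normal(clean.shape)
    restored = deblur(observed)
    # errors before and after restoration
    print(np.linalg.norm(observed - clean), np.linalg.norm(restored - clean))
```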

Mary Beth Hribar; Adventures in the Unknown, My Career in Industry after an Optimization Dissertation

January 21, 2014, 4:00pm
Guggenheim 218
Mary Beth Hribar, Microsoft.
Adventures in the Unknown, My Career in Industry after an Optimization Dissertation

Abstract: Like most everyone else I knew in graduate school at Northwestern University, I had planned to have a career in academia after receiving a Ph.D. My advisor had prepared me for a career in research and in teaching. After graduation, I completed a postdoc position at Rice University. I then had an offer to join a computer science department at a college in Michigan. Life was going as I had planned, but I found that I no longer wanted that plan. So I rejected the offer from the small college and began an unknown career path in industry, which has suited me well. Through the years, I have worked on a varied set of challenging projects spanning parallel linear algebra libraries, a new parallel programming language, user interfaces, and data science. In this informal talk and discussion, I will share my experiences and what I have learned about succeeding in industry with a computational mathematics background.

Biosketch: Mary Beth Hribar received her Ph.D. from Northwestern University, completing a dissertation on an interior point algorithm for nonlinear programming. The KNITRO optimization software is based on this algorithm. She has spent most of her career developing numerical software and currently works as a data scientist at Microsoft.

Dvijotham Krishnamurthy; Convex Formulations of Controller Synthesis

March 18, 2014, 4:00pm
Guggenheim 218
Dvijotham Krishnamurthy, Department of Computer Science and Engineering, UW.
Convex Formulations of Controller Synthesis

Abstract: Control of dynamical systems is a hard and nonconvex problem in general. I will discuss alternative but reasonable formulations of general control problems that lead to convex optimization problems. The formulations we work with are fairly general and cover problems ranging from training neural networks to decentralized control. I’ll also present numerical examples validating our approach and comparing it to alternative (nonconvex) approaches for controller synthesis.
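
To give a flavor of posing a control problem directly as a convex program, here is a minimal sketch of a standard finite-horizon linear-quadratic trajectory problem with input bounds (not the formulations discussed in the talk); it assumes the cvxpy package, and the system matrices A and B are random placeholders.

```python
# Minimal sketch: a finite-horizon linear-quadratic control problem with
# bounded inputs, written directly as a convex program.
import cvxpy as cp
import numpy as np

n, m, T = 4, 2, 30
rng = np.random.default_rng(0)
A = np.eye(n) + 0.1 * rng.standard_normal((n, n))   # hypothetical dynamics
B = rng.standard_normal((n, m))
x0 = rng.standard_normal(n)

x = cp.Variable((n, T + 1))     # state trajectory
u = cp.Variable((m, T))         # control inputs

cost = 0
constraints = [x[:, 0] == x0]
for t in range(T):
    cost += cp.sum_squares(x[:, t + 1]) + cp.sum_squares(u[:, t])
    constraints += [x[:, t + 1] == A @ x[:, t] + B @ u[:, t],   # linear dynamics
                    cp.norm(u[:, t], "inf") <= 1.0]             # input limits

prob = cp.Problem(cp.Minimize(cost), constraints)
prob.solve()
print(prob.status, prob.value)
```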

Samet Oymak; A general theory of noisy linear inverse problems

January 28, 2014, 4:00pm
Guggenheim 218
Samet Oymak, Department of Electrical Engineering, Caltech.
A general theory of noisy linear inverse problems

Abstract: We consider the problem of estimating an unknown signal x_0 from noisy linear observations y = Ax_0 + z. In many practical instances of this problem, x_0 has a low-dimensional representation that can be captured by a structure-inducing function f(). For example, the L1 norm can be used to encourage a sparse solution. To estimate x_0 with the aid of a convex f(), we consider variations of the widely used Lasso estimator and provide sharp characterizations of their performance. Our study falls under a generic framework, where the entries of the measurement (design) matrix A are i.i.d. Gaussian. For the Lasso estimator x*, we ask: “What is the precise estimation error as a function of the noise level, the number of available observations, and the structure of the signal?” We show that the structure of the signal x_0 and the choice of the function f() enter the error formulae through parameters arising from the convex geometry of the problem, in particular the subdifferential of f() at x_0. We also find explicit formulae for the optimal estimation performance and the optimal penalty parameters. Our results can be seen as a generalization of the classically known results for the least-squares method.

Joint work with Christos Thrampoulidis and Babak Hassibi.
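
As a small numerical companion to the setup y = Ax_0 + z (a generic sketch, not the paper's analysis), the following Python snippet draws an i.i.d. Gaussian design, solves the Lasso by proximal gradient descent, and prints the resulting estimation error; the penalty level and problem sizes are arbitrary choices.

```python
# Minimal sketch: estimate a sparse x0 from y = A x0 + z by solving the Lasso
#   min_x  0.5 * ||A x - y||^2 + lam * ||x||_1
# with proximal gradient descent (ISTA), then report ||x* - x0||.
import numpy as np

def soft_threshold(v, tau):
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def lasso_ista(A, y, lam, iters=500):
    step = 1.0 / np.linalg.norm(A, 2) ** 2      # 1 / Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        x = soft_threshold(x - step * A.T @ (A @ x - y), step * lam)
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n, p, k, sigma = 100, 300, 10, 0.1
    A = rng.standard_normal((n, p)) / np.sqrt(n)        # i.i.d. Gaussian design
    x0 = np.zeros(p)
    x0[rng.choice(p, k, replace=False)] = rng.standard_normal(k)   # k-sparse signal
    y = A @ x0 + sigma * rng.standard_normal(n)
    x_hat = lasso_ista(A, y, lam=0.05)
    print(np.linalg.norm(x_hat - x0))                   # estimation error
```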

Winter and Spring 2014 Calendar

January 14
Thomas Rothvoss, Department of Mathematics, MIT.
The matching polytope has exponential extension complexity

January 21
Mary Beth Hribar, Microsoft.
Adventures in the Unknown, My Career in Industry after an Optimization Dissertation

January 28
Samet Oymak, Department of Electrical Engineering, Caltech.
A General Theory of Noisy Linear Inverse Problems

February 11
Mario Micheli, Department of Mathematics, UW.
An Optimization Approach for Image Recovery Through Optical Turbulence

February 21 [MAC]
Maryam Fazel, Department of Electrical Engineering, UW.
Filling In the Gaps: Recovery from Incomplete Information

February 25
Marina Meila, Department of Statistics, UW.
Optimization for Ordered Data

March 18
Dvijotham Krishnamurthy, Department of Computer Science and Engineering, UW.
Convex Formulations of Controller Synthesis

March 31 [CORE]
David Blei, Department of Computer Science, Princeton University.
Probabilistic Topic Models of Text and Users

April 8
Rishabh Iyer, Department of Electrical Engineering, UW.
Submodular Combinatorial Problems in Machine Learning: Algorithms and Applications

April 15 [CORE]
Sanjeev Arora, Department of Computer Science, Princeton University.
Overcoming Intractability in Unsupervised Learning

April 22
Hongbo Dong, Department of Mathematics, Washington State University.
A New Approach to Relax Nonconvex Quadratics

April 29
Rina Foygel, Department of Statistics, University of Chicago.
Demixing signals: the geometry of corrupted sensing

May 2-3
West Coast Optimization Meeting

May 8 [CORE]
Dan Spielman, Department of Computer Science, Yale University.
Spectral Sparsification of Graphs

May 9 [MAC]
Dan Spielman, Department of Computer Science, Yale University.
Physical Metaphors for Graphs and Networks

May 13
Noah Simon, Department of Biostatistics, University of Washington.
Flexible Sparse Modeling

May 23
Michael Joswig, Institute of Mathematics, TU Berlin.
Long and Winding Central Paths

May 27
Elina Robeva, Department of Mathematics, UC Berkeley.
Fixed Points of the EM Algorithm and Nonnegative Rank Boundaries

Click on the title or see below for the abstracts.