## Announcements

PhD and Postdoc positions on Optimization and Machine Learning are available in my research group at Umeå University in Sweden. Contact me via email if you are interested. For more information and instructions on how to apply, click here.

## News

**[May 8, 2021]** Our paper with Varun Mangalick and Suvrit Sra on Three Operator Splitting with a Nonconvex Loss Function is accepted to ICML.

**[May 7, 2021]** Our paper with Lijun Ding, Volkan Cevher, Joel Tropp, and Madeleine Udell entitled An Optimal-Storage Approach to Semidefinite Programming using Approximate Complementarity is accepted for publication in the SIAM Journal on Optimization.

**[Apr 10, 2021]** I am joining the Department of Mathematics and Mathematical Statistics at Umeå University and the WASP-AI school as a tenure-track assistant professor in Fall 2021.

**[Mar 8, 2021]** New preprint on Three Operator Splitting with a Nonconvex Loss Function is out. This is a joint work with Varun Mangalick and Suvrit Sra.

**[Jan 19, 2021]** I gave a seminar talk (virtual) at UC Louvain on "Scalable convex optimization for semidefinite programming".

**[Dec 8, 2020]** I gave a seminar talk (virtual) at Bilkent University on "Scalable convex optimization with applications to semidefinite programming".

**[Nov 18, 2020]** I gave a seminar talk (virtual) at Umeå University on "Optimization for machine learning".

**[Nov 16, 2020]** Our paper Scalable Semidefinite Programming is accepted for publication in the SIAM Journal on Mathematics of Data Science.

**[Nov 10, 2020]** I gave a seminar talk (virtual) at KU Leuven on "Large-scale optimization for machine learning".

**[Nov 10, 2020]** I presented our work (virtual) on Scalable Semidefinite Programming at the INFORMS Annual Meeting.

**[Sep 26, 2020]** With my colleagues Horia Mania and Xiang Cheng and our advisor Suvrit Sra, we are organizing a new seminar series: OPTML++. More information is available at the seminar webpage.

**[Jul 2, 2020]** I presented our recent work on Scalable Semidefinite Programming at the multi-disciplinary SNSF Fellows Conference.

**[Feb 26, 2020]** I gave a seminar talk at IST Austria on "Scalable convex optimization with applications to semidefinite programming".

**[Feb 12, 2020]** I am presenting the highlights from my PhD research at the LIDS & Stats Tea Talks at 4:00 pm.

**[Jan 1, 2020]** I joined LIDS at MIT as a postdoctoral fellow, hosted by Prof. Suvrit Sra.

**[Oct 14, 2019]** The doctoral school program committee at EPFL has chosen to honor my doctoral dissertation entitled Scalable Convex Optimization Methods for Semidefinite Programming with a Thesis Distinction.

**[Sep 5, 2019]** Our paper 'Stochastic Frank-Wolfe for Composite Convex Minimization' is accepted to NeurIPS 2019.

**[Aug 27, 2019]** I defended my PhD Thesis entitled Scalable Convex Optimization Methods for Semidefinite Programming at EPFL.

**[May 6, 2019]** I gave a seminar talk at KAUST on "A storage-optimal convex optimization framework with applications to semidefinite programming".

**[May 2, 2019]** I gave a seminar talk at Telecom Paris on "Scalable convex optimization with applications to semidefinite programming".

**[May 1, 2019]** Our paper 'Streaming Low-Rank Matrix Approximation with an Application to Scientific Simulation' is accepted for publication in SIAM Journal on Scientific Computing.

**[Apr 4, 2019]** Our papers 'Conditional Gradient Methods via Stochastic Path-Integrated Differential Estimator' and 'A Conditional Gradient-Based Augmented Lagrangian Framework' are accepted to ICML 2019.

**[Mar 28, 2019]** I gave a seminar talk at Koç University on "Scalable convex methods for semidefinite programming".

**[Feb 28, 2019]** New software on matrix sketching is out: SKETCH v1.1

**[Feb 22, 2019]** New paper out on streaming low-rank matrix approximation, with an application to scientific simulation.

**[Feb 9, 2019]** New paper out: We develop a new storage-optimal algorithm based on approximate complementarity for solving semidefinite programs.

**[Jan 29, 2019]** New paper out on a stochastic conditional gradient method for composite convex minimization.

**[Jan 15, 2019]** New paper out on a conditional gradient-based augmented Lagrangian framework.

**[Jan 13, 2019]** I will present my research at ITA 2019 - Graduation Day in San Diego. Slides of my talk are available.

**[Jan 3, 2019]** I am visiting Caltech until Jan 10, hosted by Joel A. Tropp.

**[Dec 10, 2018]** My MIT visit is finished. It was a great pleasure to be there. Now I am back at EPFL, and it is time to focus on my thesis.

**[Dec 8, 2018]** I presented our poster for "Online Adaptive Methods, Universality and Acceleration" (joint work with Kfir Levy and Volkan Cevher) at NeurIPS 2018.

**[Nov 12, 2018]** The Springer book "Large-scale and Distributed Optimization" (Eds. Pontus Giselsson and Anders Rantzer) is published today. I am proud to contribute to this book with a chapter on "Stochastic Forward Douglas-Rachford Splitting Method for Monotone Inclusions", joint work with Bang Cong Vu and Volkan Cevher.

**[Nov 4, 2018]** I am visiting Caltech until Nov 15, hosted by Joel A. Tropp.

**[Sep 5, 2018]** Our paper "Online Adaptive Methods, Universality and Acceleration" with Kfir Levy and Volkan Cevher is accepted to NeurIPS 2018.

**[Sep 1, 2018]** I joined LIDS at MIT as a visiting student for the fall semester, hosted by Prof. Suvrit Sra.