Release date: April 2020
This authoritative book draws on the latest research to explore the interplay of high-dimensional statistics with optimization. Through an accessible analysis of fundamental problems of hypothesis testing and signal recovery, Anatoli Juditsky and Arkadi Nemirovski show how convex optimization theory can be …
Sélim Chraibi started his PhD in our team this week. He presented part of his internship work, carried out in the group of Peter Richtarik at KAUST.
Nadia is a professor at G-SCOP and a former head of the French Operations Research Society. She is joining us for the semester.
Nice talk on accelerating second-order methods!
Acceleration is the way to go!
The title of his plenary summarizes one of our research axes: “Nonsmoothness can help: sensitivity analysis and acceleration of proximal algorithms”
Photo courtesy of C. Bouveyron
Elnur is starting his PhD at KAUST with Peter Richtarik; he visited us from June 3rd to July 19th to work on distributed optimization.
Franck was awarded a “Jeune Chercheur” (Young Researcher) grant from the ANR (French National Research Agency) for his project STROLL: Harnessing Structure in Optimization for Large-scale Learning. This competitive grant (success rate below 20%) will notably fund Gilles's Ph.D.
The Grenoble AI Institute MIAI funded the chair “Optimization & Learning” led by Jérôme and Yurii Nesterov (CORE, Louvain-la-Neuve).
Jérôme presented the activities of the team with his trademark jokes! The title and abstract of the talk are available at http://www-ljk.imag.fr/Seminars/1557909753900.html (for non-French speakers: the abstract is a pun on the name of the team and the name of a French pop singer).