[Statlist] Fwd: First talk of Foundations of Data Science Seminar with Amin Karbasi, Yale University, Thursday, January 24, 2019

Maurer Letizia letiziamaurer at ethz.ch
Tue Jan 22 09:24:49 CET 2019


Sorry, I made a mistake!

The reception will be at HG G 14.1.

Best regards

Letizia

Begin forwarded message:

From: "Maurer Letizia" <cletizia using ethz.ch>
Subject: First talk of Foundations of Data Science Seminar with Amin Karbasi, Yale University, Thursday, January 24, 2019
Date: 22 January 2019, 09:19:27 CET
To: <statlist using stat.ch>, ZüKoSt Liste <zukost-list using lists.math.ethz.ch>

REMINDER:


ETH Foundations of Data Science



launches a new seminar series called "Foundations of Data Science Seminar".

More information about our new initiative and seminar series can be found at https://www.math.ethz.ch/sfs/research/fds.html.


We are pleased to announce the first talk:


Organisers:

Profs. Helmut Bölcskei, Peter Bühlmann, Joachim M. Buhmann, Sara van de Geer, Thomas Hofmann, Andreas Krause, Amos Lapidoth, Hans-Andrea Loeliger, Marloes H. Maathuis, Nicolai Meinshausen, Gunnar Rätsch
______________________________________________________________________________________________________________________________________________________________________________________

with Amin Karbasi, Yale Institute for Network Science, Yale University

Thursday, January 24, 2019
ETH Zurich, HG E 22
at 16:15, followed by a small reception in HG G 14.1
******************************************************************


Title:

It Is Good to Relax (Maybe Best)


Abstract:

The difficulty of searching through a massive amount of data in order to quickly make an informed decision is one of today's most ubiquitous challenges. Many scientific and engineering models feature inherently discrete decision variables, from phrases in a corpus to objects in an image. The study of how to make near-optimal decisions from a massive pool of possibilities is at the heart of combinatorial optimization. Many of these problems are notoriously hard, and even those that are theoretically tractable may not scale to large instances. However, the problems of practical interest are often much better behaved and possess extra structure that makes them amenable to exact or approximate optimization techniques. Just as convexity has been a celebrated and well-studied condition under which continuous optimization is tractable, submodularity is a condition under which discrete objectives may be optimized.
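As a concrete illustration of the submodularity mentioned in the abstract (this sketch is not taken from the talk): the classic greedy algorithm of Nemhauser, Wolsey, and Fisher maximizes a monotone submodular set function under a cardinality constraint with a (1 - 1/e) approximation guarantee. Below, the objective is a simple coverage function, the canonical submodular example; the function and data are purely illustrative.

```python
def greedy_max_cover(sets, k):
    """Greedily pick at most k sets to maximize the number of covered elements.

    Coverage f(S) = |union of chosen sets| is monotone and submodular, so
    greedy selection by marginal gain achieves a (1 - 1/e) approximation.
    """
    chosen, covered = [], set()
    for _ in range(k):
        # Marginal gain of set i given current coverage: f(S + i) - f(S).
        best = max(range(len(sets)), key=lambda i: len(sets[i] - covered))
        if not (sets[best] - covered):
            break  # no set adds anything new
        chosen.append(best)
        covered |= sets[best]
    return chosen, covered


# Toy instance over the universe {1, ..., 6}.
sets = [{1, 2, 3}, {3, 4}, {4, 5, 6}, {1, 6}]
chosen, covered = greedy_max_cover(sets, 2)
```

With a budget of k = 2, the greedy rule first takes the set of size three and then the set with the largest remaining gain, covering the whole universe here.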

In order to provide the tightest approximation guarantees for submodular optimization problems, we usually need to leave the space of discrete domains and consider their continuous relaxations. To this end, we will explore the notion of submodularity in continuous domains and introduce a broad class of non-convex objective functions. Despite the apparent lack of convexity, we will see that first-order optimization methods can provide strong approximation guarantees. We then show that such continuous relaxations can be used as an interface for providing tight approximation guarantees when maximizing stochastic submodular set functions.
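One standard continuous relaxation of the kind the abstract alludes to is the multilinear extension F(x) = E[f(S)], where each element i enters the random set S independently with probability x_i. A minimal Monte Carlo sketch (function names and the coverage objective are illustrative assumptions, not the talk's code):

```python
import random


def f_coverage(S, sets):
    """Submodular objective: number of elements covered by the chosen sets."""
    covered = set()
    for i in S:
        covered |= sets[i]
    return len(covered)


def multilinear_extension(x, sets, samples=2000, seed=0):
    """Estimate F(x) = E[f(S)] by sampling S with P(i in S) = x[i]."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(samples):
        S = [i for i, xi in enumerate(x) if rng.random() < xi]
        total += f_coverage(S, sets)
    return total / samples
```

At the vertices of the hypercube the relaxation agrees with f exactly (e.g. x = (1, 1) recovers f of the full ground set), which is what makes rounding a fractional maximizer of F back to a set meaningful.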

I will not assume any particular background on submodularity or optimization and will try to motivate and define all the necessary concepts during the talk.

Bio: Amin Karbasi is currently an assistant professor of Electrical Engineering, Computer Science, and Statistics at Yale University. He has received the ONR 2019 Young Investigator Award, the AFOSR 2018 Young Investigator Award, the 2017 Grainger Award from the National Academy of Engineering, a Microsoft Azure research award 2017, the DARPA 2016 Young Faculty Award, a Simons-Berkeley fellowship 2016, a Google Faculty Award 2015, and an ETH fellowship 2013. His work has been recognized with a variety of paper awards, including at the Medical Image Computing and Computer Assisted Intervention Conference (MICCAI) 2017, the International Conference on Artificial Intelligence and Statistics (AISTATS) 2015, IEEE ComSoc Data Storage 2013, the International Conference on Acoustics, Speech, and Signal Processing (ICASSP) 2011, ACM SIGMETRICS 2010, and the IEEE International Symposium on Information Theory (ISIT) 2010 (runner-up). His Ph.D. work received the 2013 Patrick Denantes Memorial Prize from the School of Computer and Communication Sciences at EPFL, Switzerland.




