Fabian Latorre

Swiss Federal Institute of Technology Lausanne

Names

How do you usually write your name as an author of a paper? Also add any other names under which you have authored papers.

Fabian Latorre (Preferred)

Emails

Enter the email addresses associated with all of your current and historical institutional affiliations, all of your previous publications, and the Toronto Paper Matching System. This information is crucial for deduplicating users and for ensuring you see your reviewing assignments.

****@epfl.ch, ****@epfl.ch

Education & Career History

Enter your education and career history. The institution domain is used for conflict of interest detection and institution ranking. For ongoing positions, leave the end field blank.

PhD student
Swiss Federal Institute of Technology Lausanne (epfl.ch)
2017 – Present
 
Intern
Amazon Development Center Germany (amazon.de)
2022 – 2022
 
Intern
Salesforce.com (salesforce.com)
2022 – 2022
 
MS student
Universidad de los Andes (uniandes.edu.co)
2012 – 2015
 
Undergrad student
Universidad de los Andes (uniandes.edu.co)
2008 – 2012
 

Advisors, Relations & Conflicts

Enter all advisors, co-workers, and other people who should be included when detecting conflicts of interest.

Coworker
Matthaeus Kleindessner
****@amazon.de
2022 – Present
 
Coworker
Chris Russell
****@amazon.de
2022 – Present
 
Coauthor
Nadav Hallak
****@live.com
2020 – Present
 
Coauthor
Paul Rolland
****@epfl.ch
2020 – Present
 
PhD Advisor
Volkan Cevher
****@epfl.ch
2018 – Present
 
Coauthor
Armin Eftekhari
****@umu.se
2018 – Present
 
Coworker
Chenghao Liu
****@salesforce.com
2022 – 2022
 
Coworker
Doyen Sahoo
****@salesforce.com
2022 – 2022
 
Coworker
Steven Hoi
****@salesforce.com
2022 – 2022
 

Expertise

For each line, enter comma-separated keyphrases representing an intersection of your interests. Think of each line as a query for papers in which you would have expertise and interest. For example: deep learning, RNNs, dependency parsing

robust deep learning, robustness certification, machine learning
2018 – Present
 
generative models
2018 – Present
 
optimization
2018 – Present