Tatiana Likhomanenko

Apple

Names

How do you usually write your name as author of a paper? Also add any other names you have authored papers under.

Tatiana Likhomanenko (Preferred)

Emails

Enter email addresses associated with all of your current and historical institutional affiliations, as well as all your previous publications and the Toronto Paper Matching System. This information is crucial for deduplicating users and ensuring you see your reviewing assignments.

****@fb.com, ****@gmail.com, ****@apple.com

Education & Career History

Enter your education and career history. The institution domain is used for conflict of interest detection and institution ranking. For ongoing positions, leave the end field blank.

Research Scientist
Apple (apple.com)
2021–Present
 
Postdoc
Facebook AI Research (fb.com)
2019–2021
 
AI Resident
Facebook AI Research (fb.com)
2018–2019
 
Researcher
CERN (cern.ch)
2013–2018
 
Researcher
Higher School of Economics (hse.ru)
2014–2017
 
PhD student
Lomonosov Moscow State University (msu.ru)
2013–2017
 
Researcher-developer
Yandex (yandex-team.ru)
2013–2017
 

Advisors, Relations & Conflicts

Enter all advisors, co-workers, and other people that should be included when detecting conflicts of interest.

Postdoc Advisor
Ronan Collobert
****@fb.com
2019–2021
 
Postdoc Advisor
Gabriel Synnaeve
****@fb.com
2019–2021
 
Coauthor
Fedor Ratnikov
****@hse.ru
2016–2017
 
Coauthor
Denis Derkach
****@hse.ru
2015–2017
 
Coauthor
Andrey Ustyuzhanin
****@hse.ru
2013–2017
 
PhD Advisor
Eugene Moiseev
****@cs.msu.ru
2013–2017
 
Coauthor
Aleksei Rogozhnikov
****@yandex.ru
2013–2017
 

Expertise

For each line, enter comma-separated keyphrases representing an intersection of your interests. Think of each line as a query for papers in which you would have expertise and interest. For example: deep learning, RNNs, dependency parsing

deep learning, speech recognition, nlp, image recognition
2018–Present
 
deep learning, face recognition, image recognition, video
2017–2018
 
high energy physics, gradient boosting, optimization, model compression, loss functions
2013–2017