Boris Knyazev

Samsung

Names

How do you usually write your name as the author of a paper? Also add any other names under which you have authored papers.

Boris Knyazev

Emails

Enter the email addresses associated with all of your current and past institutional affiliations, with all of your previous publications, and with the Toronto Paper Matching System. This information is crucial for deduplicating users and ensuring that you see your reviewing assignments.

****@uoguelph.ca, ****@gmail.com, ****@samsung.com

Education & Career History

Enter your education and career history. The institution domain is used for conflict-of-interest detection and institution ranking. For ongoing positions, leave the end field blank.

Researcher
Samsung (samsung.com)
2021 – Present

PhD student
University of Guelph (uoguelph.ca)
2017 – 2022

Intern
Facebook (fb.com)
2020 – 2020

Intern
Mila - Quebec AI Institute (mila.quebec)
2019 – 2020

Intern
SRI International (sri.com)
2018 – 2019

Advisors, Relations & Conflicts

Enter all advisors, co-workers, and other people who should be included when detecting conflicts of interest.

Coauthor
Simon Lacoste-Julien
****@iro.umontreal.ca
2022 – Present

Coauthor
Eugene Belilovsky
****@concordia.ca
2019 – Present

Coauthor
Doha Hwang
****@samsung.com
2022 – 2023

PhD Advisor
Graham Taylor
****@uoguelph.ca
2017 – 2022

Coauthor
Jungtaek Kim
****@postech.ac.kr
2021 – 2021

Coauthor
Maksims Volkovs
****@layer6.ai
2021 – 2021

Coauthor
Adriana Romero Soriano
****@fb.com
2020 – 2021

Coauthor
Michal Drozdzal
****@fb.com
2020 – 2021

Coauthor
Aaron Courville
****@umontreal.ca
2019 – 2021

Coauthor
Harm de Vries
****@elementai.com
2019 – 2021

Coauthor
Cătălina Cangea
****@cst.cam.ac.uk
2019 – 2021

Coauthor
Xiao Lin
****@sri.com
2018 – 2019

Coauthor
Mohamed Amer
****@robust.ai
2018 – 2019

Expertise

For each line, enter comma-separated keyphrases representing an intersection of your interests. Think of each line as a query for papers in which you would have both expertise and interest. For example: deep learning, RNNs, dependency parsing.

Hypernetworks
2021 – Present

Learning to optimize
2021 – Present

Efficient large-scale training
2021 – Present

Graph neural networks
2018 – Present

Few-shot learning
2016 – Present

Computer vision
2012 – Present