Nicklas Hansen

University of California, San Diego

Names

How do you usually write your name as the author of a paper? Also add any other names under which you have authored papers.

Nicklas Hansen (Preferred)

Emails

Enter email addresses associated with all of your current and historical institutional affiliations, all of your previous publications, and the Toronto Paper Matching System. This information is crucial for deduplicating users and ensuring that you see your reviewing assignments.

****@nicklashansen.com

Education & Career History

Enter your education and career history. The institution domain is used for conflict of interest detection and institution ranking. For ongoing positions, leave the end field blank.

Researcher
University of California, San Diego (ucsd.edu)
2020–2021

Visiting Researcher
University of California, Berkeley (berkeley.edu)
2020–2020

MS Student
Technical University of Denmark (dtu.dk)
2019–2020

Undergraduate Student
Technical University of Denmark (dtu.dk)
2015–2019

Exchange Student
Nanyang Technological University (ntu.edu.sg)
2017–2017
 

Advisors, Relations & Conflicts

Enter all advisors, co-workers, and other people that should be included when detecting conflicts of interest.

PhD Advisor
Hao Su
****@eng.ucsd.edu
2021–Present

PhD Advisor
Xiaolong Wang
****@ucsd.edu
2020–Present

Coauthor
Rishabh Jangir
****@gmail.com
2020–2021

Coauthor
Alexei A. Efros
****@eecs.berkeley.edu
2020–2020

Coauthor
Lerrel Pinto
****@gmail.com
2020–2020

Coauthor
Yu Sun
****@berkeley.edu
2020–2020

Coauthor
Pieter Abbeel
****@berkeley.edu
2020–2020

Coauthor
Guillem Alenyà
****@iri.upc.edu
2020–2020

Coauthor
Alexander Rosenberg Johansen
****@herhjemme.dk
2019–2020

Coauthor
Morten Mørup
****@dtu.dk
2019–2020

Coworker
Ole Winther
****@dtu.dk
2018–2020
 

Expertise

For each line, enter comma-separated keyphrases representing an intersection of your interests. Think of each line as a query for papers in which you would have expertise and interest. For example: deep learning, RNNs, dependency parsing.

self-supervised learning, representation learning
2020–Present

reinforcement learning, generalization, adaptation
2019–Present

deep learning, natural language processing
2019–2020