Louis Sharrock

Lancaster University

Names

How do you usually write your name as an author of a paper? Also add any other names under which you have authored papers.

Louis Sharrock (Preferred)

Emails

Enter the email addresses associated with all of your current and historical institutional affiliations, your previous publications, and the Toronto Paper Matching System. This information is crucial for deduplicating users and ensuring you see your reviewing assignments.

****@lancaster.ac.uk, ****@bristol.ac.uk, ****@imperial.ac.uk

Education & Career History

Enter your education and career history. The institution domain is used for conflict of interest detection and institution ranking. For ongoing positions, leave the end field blank.

Postdoc
Lancaster University (lancaster.ac.uk)
2022–2025
 
Postdoc
University of Bristol (bristol.ac.uk)
2022–2022
 
PhD student
Imperial College London (ic.ac.uk)
2018–2021
 
MS student
Imperial College London (ic.ac.uk)
2017–2018
 
Undergrad student
University of Cambridge (cam.ac.uk)
2013–2016
 

Advisors, Relations & Conflicts

Enter all advisors, co-workers, and other people that should be included when detecting conflicts of interest.

Coauthor
Lester Mackey
****@microsoft.com
2023–2023
 
Coauthor
Daniel Dodd
****@lancaster.ac.uk
2023–2023
 
Postdoc Advisor
Christopher Nemeth
****@lancaster.ac.uk
2022–2023
 
Coauthor
Song Liu
****@bristol.ac.uk
2022–2023
 
Coauthor
Mark Beaumont
****@bristol.ac.uk
2022–2023
 
Coauthor
Jack Simons
****@bristol.ac.uk
2022–2023
 
Coauthor
Panos Parpas
****@imperial.ac.uk
2021–2023
 
Coauthor
Grigorios Pavliotis
****@imperial.ac.uk
2021–2023
 
PhD Advisor
Nikolas Kantas
****@imperial.ac.uk
2017–2023
 

Expertise

For each line, enter comma-separated keyphrases representing an intersection of your interests. Think of each line as a query for papers in which you would have expertise and interest. For example: deep learning, RNNs, dependency parsing

diffusion models, generative modelling
2021–2022
 
simulation-based inference, likelihood-free inference
2021–2022
 
particle-based variational inference, Stein variational gradient descent
2021–2022
 
MCMC, Bayesian inference
2018–2022
 
stochastic differential equations, parameter estimation, recursive maximum likelihood
2018–2022