Abstract: I am currently forbidden from using my name as an author due to my NDA. However, I have been a reviewer before and am eager to review for ICLR'21.
My areas of expertise are, in descending order:
* Efficient training and inference, DL systems
* Representation Learning applications for Natural Language Processing (and specifically Machine Translation)
* Structured learning, Structured prediction
* Reinforcement learning & Imitation learning
* Visualization or interpretation of learned representations
Below I provide evidence to back up my claims.
A) I have previously reviewed for ICML'20 (see "paper" PDF for certificate of appreciation) and have been an apprentice reviewer for NeurIPS'19 and ICCV'19 (update: also ICLR'21, ICML'21, and NeurIPS'21).
B) Below is a list of papers to which I contributed significantly but for which I had to decline co-authorship:
(1) Beyond Vector Spaces: Compact Data Representation as Differentiable Weighted Graphs - https://arxiv.org/abs/1910.03524 - NeurIPS'19
(2) Editable Neural Networks - https://arxiv.org/abs/2004.00345 - ICLR'20
(3) Neural Oblivious Decision Ensembles - https://arxiv.org/abs/1909.06312 - ICLR'20
(4) Sequence Generation with Unconstrained Generation Order - https://arxiv.org/abs/1911.00176 - NeurIPS'19
(5) Learning to Route in Similarity Graphs - https://arxiv.org/abs/1905.10987 - ICML'19
In papers (1, 2, 3) I was offered an equal-contribution role alongside the other first authors but had to decline; in (4, 5) I played an auxiliary role. To verify my claims, you can either ask the co-authors (e.g. Artem Babenko) or download the source files from arXiv: papers (1, 2, 3, 5) contain hedgehog-themed ASCII art in method.tex, which is my informal signature.
C) I am an experienced open-source developer with contributions to many prominent deep learning frameworks (e.g. PyTorch); see github.com/justheuristic.
If that evidence is insufficient, feel free to ask for any additional confirmations.
Thank you for your attention.