Looking at the Performer from a Hopfield Point of View

The recent paper "Rethinking Attention with Performers" constructs a new efficient attention mechanism in an elegant way. It substantially reduces the computational cost for long input sequences, while retaining the intriguing properties of the original attention mechanism. As a result, Performers have a complexity that is only linear in the input length, in contrast to the quadratic complexity of standard Transformers. This is a major breakthrough in the effort to improve Transformer models.
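
To make the complexity claim concrete, below is a minimal NumPy sketch contrasting standard softmax attention with a FAVOR+-style linear-attention estimator built from positive random features, the core idea of the Performer paper. It is a simplified illustration, not the paper's reference implementation: it omits orthogonal random features, causal masking, and feature redrawing, and the function names and the feature count `m` are chosen for exposition.

```python
import numpy as np

def softmax_attention(Q, K, V):
    """Standard attention: O(L^2) time and memory in sequence length L."""
    scores = Q @ K.T / np.sqrt(Q.shape[-1])           # (L, L) pairwise scores
    weights = np.exp(scores - scores.max(-1, keepdims=True))
    return (weights / weights.sum(-1, keepdims=True)) @ V

def performer_attention(Q, K, V, m=256, seed=0):
    """FAVOR+-style approximation: softmax kernel ~ phi(q)^T phi(k),
    with positive random features; O(L * m * d) time, linear in L."""
    d = Q.shape[-1]
    W = np.random.default_rng(seed).standard_normal((m, d))
    def phi(X):                                       # positive random features
        Xs = X / d ** 0.25                            # so phi(q)^T phi(k) ~ exp(q.k / sqrt(d))
        return np.exp(Xs @ W.T - (Xs ** 2).sum(-1, keepdims=True) / 2) / np.sqrt(m)
    Qf, Kf = phi(Q), phi(K)                           # (L, m) feature maps
    numer = Qf @ (Kf.T @ V)                           # never materializes the L x L matrix
    denom = Qf @ Kf.sum(axis=0)                       # per-query normalizer (positive)
    return numer / denom[:, None]

# Usage: the two outputs agree up to Monte Carlo error that shrinks as m grows.
L, d = 1024, 64
rng = np.random.default_rng(1)
Q, K, V = (rng.standard_normal((L, d)) for _ in range(3))
out_exact = softmax_attention(Q, K, V)
out_fast = performer_attention(Q, K, V, m=512)
```

The estimator is unbiased because for w ~ N(0, I), E[exp(w·q − ‖q‖²/2) · exp(w·k − ‖k‖²/2)] = exp(q·k), so the feature dot product estimates the softmax kernel, with larger `m` reducing the variance. This sketch covers bidirectional attention only; the paper additionally derives a prefix-sum formulation for the causal (unidirectional) case.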
