zalandoresearch/gpa


Grid Partitioned Attention: an efficient attention approximation with an inductive bias for the image domain.

Code for the paper "Grid Partitioned Attention: Efficient Transformer Approximation with Inductive Bias for High Resolution Detail Generation" by Nikolay Jetchev, Gökhan Yildirim, Christian Bracher, and Roland Vollgraf.

The file GPAmodule.py contains the GPA layer definition and an example of how to apply it to an image tensor.
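For intuition, here is a minimal, illustrative sketch of the core idea behind grid-partitioned attention — splitting the feature map into a grid of non-overlapping cells and computing self-attention only within each cell, which cuts the quadratic cost of full attention. This is not the API of `GPAmodule.py`; the function name, signature, and the plain dot-product attention used here are assumptions for illustration only.

```python
import numpy as np

def grid_partitioned_attention(x, grid=4):
    """Illustrative grid-partitioned self-attention (NOT the GPAmodule.py API).

    The (B, C, H, W) feature map is split into a grid x grid array of
    non-overlapping cells; softmax attention is computed only among the
    pixels inside each cell, so cost drops from O((HW)^2) per image to
    O((HW)^2 / grid^2). Assumes H and W are divisible by grid.
    """
    B, C, H, W = x.shape
    gh, gw = H // grid, W // grid  # cell height and width
    out = np.empty_like(x)
    for i in range(grid):
        for j in range(grid):
            cell = x[:, :, i * gh:(i + 1) * gh, j * gw:(j + 1) * gw]
            flat = cell.reshape(B, C, gh * gw)  # tokens within one cell
            # scaled dot-product similarity between cell pixels
            scores = np.einsum('bci,bcj->bij', flat, flat) / np.sqrt(C)
            # numerically stable softmax over the cell's tokens
            attn = np.exp(scores - scores.max(axis=-1, keepdims=True))
            attn /= attn.sum(axis=-1, keepdims=True)
            mixed = np.einsum('bij,bcj->bci', attn, flat)
            out[:, :, i * gh:(i + 1) * gh, j * gw:(j + 1) * gw] = \
                mixed.reshape(B, C, gh, gw)
    return out

x = np.random.randn(2, 8, 16, 16).astype(np.float32)
y = grid_partitioned_attention(x, grid=4)
print(y.shape)  # → (2, 8, 16, 16)
```

Because each pixel only attends inside its own cell, the memory footprint stays manageable even at high resolutions — the inductive bias the paper exploits for high-resolution detail generation.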

TODO: add the full generator architecture for pose morphing with attention copying, as described in the paper.
