Releases: lucidrains/x-transformers

2.0.2

06 Feb 01:26
use cautious lion for parity task
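
A minimal sketch of how such a parity-task setup might look, pairing a small x-transformers decoder with the Lion optimizer from lion-pytorch; the `cautious_factor` keyword for the cautious variant is an assumption and may be named differently:

```python
# sketch only: small decoder on a binary (parity) sequence, trained with Lion
import torch
from lion_pytorch import Lion
from x_transformers import TransformerWrapper, Decoder

model = TransformerWrapper(
    num_tokens = 2,              # parity task works over binary tokens
    max_seq_len = 256,
    attn_layers = Decoder(
        dim = 64,
        depth = 3,
        heads = 4
    )
)

# cautious Lion - the `cautious_factor` kwarg name is an assumption
optim = Lion(model.parameters(), lr = 1e-4, weight_decay = 1e-2, cautious_factor = 0.1)

seq = torch.randint(0, 2, (1, 256))
logits = model(seq)              # (1, 256, 2)
loss = logits.sum()              # placeholder loss for illustration
loss.backward()
optim.step()
optim.zero_grad()
```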

2.0.1

05 Feb 23:54
demonstrate hybridization with a gru that only acts every 4 tokens ca…
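
A conceptual sketch of the idea (not the library's exact API): blend a token-wise attention branch with a GRU branch that only steps once per 4 tokens, by folding consecutive tokens into a single recurrent step:

```python
# sketch only: attention output summed with a GRU that acts every 4 tokens
import torch
from torch import nn

class HybridBlock(nn.Module):
    def __init__(self, dim, heads = 4, every = 4):
        super().__init__()
        self.every = every
        self.attn = nn.MultiheadAttention(dim, heads, batch_first = True)
        # one GRU step consumes `every` folded tokens at once
        self.gru = nn.GRU(dim * every, dim * every, batch_first = True)

    def forward(self, x):
        b, n, d = x.shape
        assert n % self.every == 0

        # attention branch (causal masking omitted for brevity)
        attn_out, _ = self.attn(x, x, x)

        # recurrent branch: fold every 4 tokens into one GRU step, then unfold
        folded = x.reshape(b, n // self.every, d * self.every)
        gru_out, _ = self.gru(folded)
        gru_out = gru_out.reshape(b, n, d)

        return attn_out + gru_out

block = HybridBlock(dim = 64)
out = block(torch.randn(2, 32, 64))   # (2, 32, 64)
```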

2.0.0

04 Feb 15:54
release multi-latent attention
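
For context, a conceptual sketch of multi-latent attention (not the library's exact interface): keys and values are first compressed into a small shared latent, then expanded per head, so only the latent needs to be cached at inference:

```python
# sketch only: multi-latent attention with a low-rank KV latent
import torch
from torch import nn
import torch.nn.functional as F

class MultiLatentAttention(nn.Module):
    def __init__(self, dim, heads = 8, dim_head = 64, dim_latent = 128):
        super().__init__()
        inner = heads * dim_head
        self.heads, self.dim_head = heads, dim_head
        self.to_q = nn.Linear(dim, inner, bias = False)
        self.to_latent = nn.Linear(dim, dim_latent, bias = False)          # down-project; this latent is what gets cached
        self.latent_to_kv = nn.Linear(dim_latent, inner * 2, bias = False) # up-project latent to per-head keys / values
        self.to_out = nn.Linear(inner, dim, bias = False)

    def forward(self, x):
        b, n, _ = x.shape
        h, d = self.heads, self.dim_head

        q = self.to_q(x).view(b, n, h, d).transpose(1, 2)

        latent = self.to_latent(x)                       # (b, n, dim_latent)
        k, v = self.latent_to_kv(latent).chunk(2, dim = -1)
        k = k.view(b, n, h, d).transpose(1, 2)
        v = v.view(b, n, h, d).transpose(1, 2)

        out = F.scaled_dot_product_attention(q, k, v, is_causal = True)
        out = out.transpose(1, 2).reshape(b, n, h * d)
        return self.to_out(out)

attn = MultiLatentAttention(dim = 512)
out = attn(torch.randn(1, 16, 512))                      # (1, 16, 512)
```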

1.44.8

31 Jan 03:01

Full Changelog: 1.44.6...1.44.8

1.44.6

23 Jan 01:22

Full Changelog: 1.44.5...1.44.6

1.44.5

23 Jan 01:13

What's Changed

  • Fix inp_inject not being used when using in_attn_cond by @MaxWolf-01 in #307

New Contributors

  • @MaxWolf-01 made their first contribution in #307

Full Changelog: 1.44.4...1.44.5

1.44.4

05 Jan 17:35
if the hybrid module is an RNN, allow for folding it across the seque…

1.44.2

05 Jan 17:13
flexibly handle hybrid module outputs

1.44.0

03 Jan 19:46
add ability to hybridize attention with external module, for aiming t…

1.43.5

27 Dec 18:29

Full Changelog: 1.43.4...1.43.5