Releases · luozhouyang/transformers-keras
Release v0.2.0
Updates:
- Remove the Transformer model
- Refactor BERT & ALBERT modeling
- Refactor loading of pretrained BERT & ALBERT models (see the sketch below)
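Loading a Google-format BERT/ALBERT checkpoint into a Keras model generally comes down to mapping checkpoint variable names onto the model's weights. The snippet below is a minimal sketch of that pattern only; the function name and the name mapping are illustrative assumptions, not the library's actual loader.

```python
import tensorflow as tf

def load_pretrained_weights(model: tf.keras.Model, ckpt_path: str) -> None:
    """Copy checkpoint tensors into Keras weights whose names match (illustrative)."""
    reader = tf.train.load_checkpoint(ckpt_path)
    ckpt_names = {name for name, _ in tf.train.list_variables(ckpt_path)}
    for w in model.weights:
        # Hypothetical mapping: drop the ':0' suffix from the Keras weight name
        # to recover a checkpoint name such as 'bert/embeddings/word_embeddings'.
        ckpt_name = w.name.split(':')[0]
        if ckpt_name in ckpt_names:
            w.assign(reader.get_tensor(ckpt_name))
```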
Release v0.1.4
Updates:
- Refactor `tokenizers` using luozhouyang/naivenlp
- Refactor loading of pretrained models; remove the `adapters` module
Release v0.1.3
Updates:
- Add `adapters` to load pretrained models!
Release v0.1.2
Updates:
- Fix `mask` dtype mismatch (see the sketch below)
- Add `runners_test`
- Fix examples
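For context, the usual Keras/TensorFlow pitfall behind a mask dtype mismatch is adding an integer attention mask to float attention scores. The snippet below is a generic illustration of that fix, not the project's actual patch; the function name is hypothetical.

```python
import tensorflow as tf

def apply_attention_mask(scores: tf.Tensor, mask: tf.Tensor) -> tf.Tensor:
    """Bias masked positions to a large negative value before softmax (illustrative)."""
    mask = tf.cast(mask, scores.dtype)      # e.g. int32 mask -> float32 scores
    return scores + (1.0 - mask) * -1e9     # keep positions where mask == 1
```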
Release v0.1.1
Updates:
- Add default value `-1` to `vocab_size`, `src_vocab_size` and `tgt_vocab_size`
- Add an assertion when `vocab_size <= 0` (see the sketch below)
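A brief sketch of the default-and-assert pattern described above; the class and argument names are hypothetical and only mirror the release note, not the library's actual API.

```python
class TransformerConfig:
    """Hypothetical config object used to illustrate the -1 sentinel defaults."""

    def __init__(self, vocab_size: int = -1, src_vocab_size: int = -1, tgt_vocab_size: int = -1):
        # -1 is a sentinel: callers must supply a real, positive vocabulary size.
        assert vocab_size > 0, f"vocab_size must be a positive integer, got {vocab_size}"
        self.vocab_size = vocab_size
        self.src_vocab_size = src_vocab_size
        self.tgt_vocab_size = tgt_vocab_size
```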
Release v0.1.0
Updates:
- Refactor the Transformer model implementation
- Add a BERT model implementation
- Add an ALBERT model implementation
- Refactor dataset builders (see the sketch below)
- Refactor tokenizers
- Add runners to run models
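As a generic illustration of what a dataset builder for this kind of model typically produces, the sketch below turns token-id sequences into a padded, batched `tf.data.Dataset`. The function name and signature are assumptions for illustration, not the project's actual builders.

```python
import tensorflow as tf

def build_classification_dataset(examples, batch_size: int = 32, max_len: int = 128) -> tf.data.Dataset:
    """examples: iterable of (input_ids, label) pairs, where input_ids is a list of ints."""
    def gen():
        for input_ids, label in examples:
            yield input_ids[:max_len], label

    ds = tf.data.Dataset.from_generator(
        gen,
        output_signature=(
            tf.TensorSpec(shape=(None,), dtype=tf.int32),
            tf.TensorSpec(shape=(), dtype=tf.int32),
        ),
    )
    # Pad every sequence to max_len so batches have a fixed shape.
    return ds.padded_batch(batch_size, padded_shapes=([max_len], [])).prefetch(tf.data.AUTOTUNE)
```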