Upload all pretrained hf_ehr models and tokenizers to the Hugging Face Hub via:

```bash
python hf_upload.py
```
This script will:
- Create a new Hugging Face config
- Create a new Hugging Face model
- Create a new Hugging Face tokenizer
- Save the model and tokenizer to a local directory
- Upload the model and tokenizer to Hugging Face Hub
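The upload steps above can be sketched roughly as follows. This is a minimal illustration, not the actual `hf_upload.py`: the org name, model name, and checkpoint path are hypothetical placeholders, and the real script may construct its config, model, and tokenizer differently.

```python
# Sketch of the upload workflow described above.
# ORG / MODEL_NAME / checkpoint paths are hypothetical placeholders.
from pathlib import Path


def repo_id(org: str, model_name: str) -> str:
    """Build the Hugging Face Hub repo id for a model (pure string helper)."""
    return f"{org}/{model_name}"


def upload_one(org: str, model_name: str, local_dir: str) -> None:
    # Heavy imports are kept local so the pure helper above stays importable.
    from huggingface_hub import HfApi
    from transformers import AutoConfig, AutoModel, AutoTokenizer

    target = repo_id(org, model_name)
    # 1. Create Hugging Face config, model, and tokenizer from a local checkpoint.
    config = AutoConfig.from_pretrained(local_dir)
    model = AutoModel.from_pretrained(local_dir, config=config)
    tokenizer = AutoTokenizer.from_pretrained(local_dir)
    # 2. Save both to a local staging directory.
    staging = Path(local_dir) / "hf_staging"
    model.save_pretrained(staging)
    tokenizer.save_pretrained(staging)
    # 3. Upload to the Hugging Face Hub.
    HfApi().create_repo(target, exist_ok=True)
    model.push_to_hub(target)
    tokenizer.push_to_hub(target)


if __name__ == "__main__":
    upload_one("my-org", "my-ehr-model", "./checkpoints/my-ehr-model")
```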
An example of how to use a pretrained hf_ehr model+tokenizer from Hugging Face to run inference on a patient can be found in hf_test.py. Simply run:

```bash
python hf_test.py
```
For every model that we upload to Hugging Face, this script will:
- Download the model+tokenizer from Hugging Face Hub
- Tokenize a patient
- Run inference on the patient
- Print the patient representation
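The per-model steps above might look like the following sketch. The repo id, the patient's code string, and the mean-pooling choice are all assumptions for illustration; the actual `hf_test.py` may tokenize patients and build representations differently.

```python
# Sketch of the per-model inference loop described above.
# The repo id and the patient's event codes are hypothetical examples.

def mean_pool(vectors: list[list[float]]) -> list[float]:
    """Average token-level embeddings into one patient representation (pure helper)."""
    n = len(vectors)
    return [sum(col) / n for col in zip(*vectors)]


def run_inference(repo: str, patient_codes: str) -> list[float]:
    import torch
    from transformers import AutoModel, AutoTokenizer

    # 1. Download the model+tokenizer from the Hugging Face Hub.
    tokenizer = AutoTokenizer.from_pretrained(repo)
    model = AutoModel.from_pretrained(repo)
    # 2. Tokenize the patient's event sequence.
    batch = tokenizer(patient_codes, return_tensors="pt")
    # 3. Run inference on the patient.
    with torch.no_grad():
        out = model(**batch)
    # 4. Pool the hidden states into a single patient representation.
    hidden = out.last_hidden_state[0].tolist()
    return mean_pool(hidden)


if __name__ == "__main__":
    rep = run_inference("my-org/my-ehr-model", "ICD10/E11.9 LOINC/4548-4")
    print(rep)  # the patient representation
```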
If a model is not working, its name will be printed as part of the `bad_models`
list.