From 06e979b27c931365bc2177989a1259b4f588bbba Mon Sep 17 00:00:00 2001
From: Dion Pinto <55398806+dpirad007@users.noreply.github.com>
Date: Sun, 26 Nov 2023 09:26:54 +1300
Subject: [PATCH] Update petals homepage URL (#599)

---
 README.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/README.md b/README.md
index a633a7aee..476c2eeeb 100644
--- a/README.md
+++ b/README.md
@@ -32,7 +32,7 @@ This section lists projects that leverage hivemind for decentralized training.
 If you have successfully trained a model or created a downstream repository with the help of our library,
 feel free to submit a pull request that adds your project to this list.
 
-* **Petals** ([webpage](https://petals.ml), [code](https://github.com/bigscience-workshop/petals)) — a decentralized platform for inference and fine-tuning of 100B+ language models.
+* **Petals** ([webpage](https://petals.dev), [code](https://github.com/bigscience-workshop/petals)) — a decentralized platform for inference and fine-tuning of 100B+ language models.
 * **Training Transformers Together** ([webpage](https://training-transformers-together.github.io/), [code](https://github.com/learning-at-home/dalle-hivemind)) — a NeurIPS 2021 demonstration that trained a collaborative text-to-image Transformer model.
 * **CALM** ([webpage](https://huggingface.co/CALM), [code](https://github.com/NCAI-Research/CALM)) — a masked language model trained on a combination of Arabic datasets.
 * **sahajBERT** ([blog post](https://huggingface.co/blog/collaborative-training), [code](https://github.com/tanmoyio/sahajbert)) — a collaboratively pretrained ALBERT-xlarge for the Bengali language.