diff --git a/cookbooks/Anyscale/README.md b/cookbooks/Anyscale/README.md
index 989270c73..c14187cc2 100644
--- a/cookbooks/Anyscale/README.md
+++ b/cookbooks/Anyscale/README.md
@@ -1,7 +1,11 @@
 # Anyscale Endpoints with AIConfig
 
+Getting Started: [![colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/drive/1JgGjJ2YglyaT5GHQNswkPOyB5oHGbOcv?usp=sharing)
+
+Function Calling with Mixtral:
+[![colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/drive/1bS8hiSoNXtloQg9nEhxwgd2ZCmgUMZzl?usp=sharing)
+
 [Anyscale Endpoints](https://www.anyscale.com/endpoints) support optimized inference for many open source models, including the LLaMA2 family of models (7B, 13B, 70B, CodeLLaMA, LLaMA Guard) and Mistral (7B, Mixtral 8x7B).
 
 This cookbook shows how to use any [Anyscale Endpoints](https://www.anyscale.com/endpoints) model with AIConfig using the same simple API.
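To illustrate the "same simple API" claim the README makes, here is a minimal sketch of what an AIConfig prompt targeting an Anyscale-served model might look like. This is an assumption-laden illustration, not taken from the cookbook: the model parser name (`AnyscaleEndpoint`), the model ID, and the settings shown are placeholders chosen for the example — consult the linked Colab notebooks for the actual configuration.

```json
{
  "name": "anyscale_example",
  "schema_version": "latest",
  "metadata": {},
  "prompts": [
    {
      "name": "greeting",
      "input": "Say hello in one sentence.",
      "metadata": {
        "model": {
          "name": "AnyscaleEndpoint",
          "settings": {
            "model": "meta-llama/Llama-2-7b-chat-hf",
            "temperature": 0.7
          }
        }
      }
    }
  ]
}
```

Because Anyscale Endpoints expose an OpenAI-compatible interface, swapping to a different hosted model (e.g. a Mixtral variant) should only require changing the `model` setting, leaving the rest of the config unchanged.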