diff --git a/README.md b/README.md
index d27a73a..2829a67 100644
--- a/README.md
+++ b/README.md
@@ -50,7 +50,8 @@ You can find the Gemma models on GitHub, Hugging Face models, Kaggle, Google Clo
 | [Gemma2_on_Groq.ipynb](Gemma/Gemma2_on_Groq.ipynb) | Leverage the free Gemma 2 9B IT model hosted on [Groq](https://groq.com/) (super fast speed). |
 | [Run_with_Ollama.ipynb](Gemma/Run_with_Ollama.ipynb) | Run Gemma models using [Ollama](https://www.ollama.com/). |
 | [Using_Gemma_with_Llamafile.ipynb](Gemma/Using_Gemma_with_Llamafile.ipynb) | Run Gemma models using [Llamafile](https://github.com/Mozilla-Ocho/llamafile/). |
 | [Using_Gemma_with_LlamaCpp.ipynb](Gemma/Using_Gemma_with_LlamaCpp.ipynb) | Run Gemma models using [LlamaCpp](https://github.com/abetlen/llama-cpp-python/). |
+| [Using_Gemma_with_LocalGemma.ipynb](Gemma/Using_Gemma_with_LocalGemma.ipynb) | Run Gemma models using [Local Gemma](https://github.com/huggingface/local-gemma/). |
 | [Integrate_with_Mesop.ipynb](Gemma/Integrate_with_Mesop.ipynb) | Integrate Gemma with [Google Mesop](https://google.github.io/mesop/). |
 | [Integrate_with_OneTwo.ipynb](Gemma/Integrate_with_OneTwo.ipynb) | Integrate Gemma with [Google OneTwo](https://github.com/google-deepmind/onetwo). |
 | [Deploy_with_vLLM.ipynb](Gemma/Deploy_with_vLLM.ipynb) | Deploy a Gemma model using [vLLM](https://github.com/vllm-project/vllm). |
diff --git a/WISHLIST.md b/WISHLIST.md
index 2f1b776..ea337d3 100644
--- a/WISHLIST.md
+++ b/WISHLIST.md
@@ -2,7 +2,6 @@
 A wish list of cookbooks showcasing:

 * Inference
   * Integration with [Google GenKit](https://firebase.google.com/products/genkit)
-  * HF local-gemma demo
   * Gemma+Gemini with [routerLLM](https://github.com/lm-sys/RouteLLM)
   * [SGLang](https://github.com/sgl-project/sglang) integration