Commit

README and wishlist updated
nidhinpd-YML committed Oct 10, 2024
1 parent d5c9f22 commit 931d14c
Showing 2 changed files with 2 additions and 2 deletions.
3 changes: 2 additions & 1 deletion README.md
@@ -50,7 +50,8 @@ You can find the Gemma models on GitHub, Hugging Face models, Kaggle, Google Clo
| [Gemma2_on_Groq.ipynb](Gemma/Gemma2_on_Groq.ipynb) | Leverage the free Gemma 2 9B IT model hosted on [Groq](https://groq.com/) (super fast speed). |
| [Run_with_Ollama.ipynb](Gemma/Run_with_Ollama.ipynb) | Run Gemma models using [Ollama](https://www.ollama.com/). |
| [Using_Gemma_with_Llamafile.ipynb](Gemma/Using_Gemma_with_Llamafile.ipynb) | Run Gemma models using [Llamafile](https://github.com/Mozilla-Ocho/llamafile/). |
-| [Using_Gemma_with_LlamaCpp.ipynb](Gemma/Using_Gemma_with_LlamaCpp.ipynb) | Run Gemma models using [LlamaCpp](https://github.com/abetlen/llama-cpp-python/). |
+| [Using_Gemma_with_LlamaCpp.ipynb](Gemma/Using_Gemma_with_LlamaCpp.ipynb) | Run Gemma models using [LlamaCpp](https://github.com/abetlen/llama-cpp-python/).
+| [Using_Gemma_with_LocalGemma.ipynb](Gemma/Using_Gemma_with_LocalGemma.ipynb) | Run Gemma models using [Local Gemma](https://github.com/huggingface/local-gemma/). |
| [Integrate_with_Mesop.ipynb](Gemma/Integrate_with_Mesop.ipynb) | Integrate Gemma with [Google Mesop](https://google.github.io/mesop/). |
| [Integrate_with_OneTwo.ipynb](Gemma/Integrate_with_OneTwo.ipynb) | Integrate Gemma with [Google OneTwo](https://github.com/google-deepmind/onetwo). |
| [Deploy_with_vLLM.ipynb](Gemma/Deploy_with_vLLM.ipynb) | Deploy a Gemma model using [vLLM](https://github.com/vllm-project/vllm). |
1 change: 0 additions & 1 deletion WISHLIST.md
@@ -2,7 +2,6 @@ A wish list of cookbooks showcasing:

* Inference
* Integration with [Google GenKit](https://firebase.google.com/products/genkit)
-* HF local-gemma demo
* Gemma+Gemini with [routerLLM](https://github.com/lm-sys/RouteLLM)
* [SGLang](https://github.com/sgl-project/sglang) integration
