From f8648910c30fe8ad9ac1977bd30f6db21ecd660a Mon Sep 17 00:00:00 2001 From: Shilpa Kancharla Date: Tue, 21 May 2024 14:24:23 -0700 Subject: [PATCH 1/9] Add example for adding context information for prompting --- .../Adding_context_information.ipynb | 152 ++++++++++++++++++ 1 file changed, 152 insertions(+) create mode 100644 examples/prompting/Adding_context_information.ipynb diff --git a/examples/prompting/Adding_context_information.ipynb b/examples/prompting/Adding_context_information.ipynb new file mode 100644 index 000000000..eb5b62ac6 --- /dev/null +++ b/examples/prompting/Adding_context_information.ipynb @@ -0,0 +1,152 @@ +{ + "nbformat": 4, + "nbformat_minor": 0, + "metadata": { + "colab": { + "provenance": [] + }, + "kernelspec": { + "name": "python3", + "display_name": "Python 3" + }, + "language_info": { + "name": "python" + } + }, + "cells": [ + { + "cell_type": "markdown", + "source": [ + "# Gemini API: Adding context information" + ], + "metadata": { + "id": "sP8PQnz1QrcF" + } + }, + { + "cell_type": "markdown", + "source": [ + "\n", + " \n", + "
\n", + " Run in Google Colab\n", + "
" + ], + "metadata": { + "id": "bxGr_x3MRA0z" + } + }, + { + "cell_type": "markdown", + "source": [ + "While LLMs are trained extensively on various documents and data, the LLM does not know everything. New information or information that is not easily accessible cannot be known by the LLM, unless it was specifically added to its corpus of knowledge somehow. For this reason, it is sometimes necessary to provide the LLM, with information and context necessary to answer our queries by providing additional context." + ], + "metadata": { + "id": "ysy--KfNRrCq" + } + }, + { + "cell_type": "code", + "source": [ + "!pip install -U -q google-generativeai" + ], + "metadata": { + "id": "Ne-3gnXqR0hI" + }, + "execution_count": 1, + "outputs": [] + }, + { + "cell_type": "code", + "execution_count": 2, + "metadata": { + "id": "EconMHePQHGw" + }, + "outputs": [], + "source": [ + "import google.generativeai as genai\n", + "\n", + "from IPython.display import Markdown" + ] + }, + { + "cell_type": "markdown", + "source": [ + "## Configure your API key\n", + "\n", + "To run the following cell, your API key must be stored it in a Colab Secret named `GOOGLE_API_KEY`. If you don't already have an API key, or you're not sure how to create a Colab Secret, see [Authentication](https://github.com/google-gemini/cookbook/blob/main/quickstarts/Authentication.ipynb) for an example." 
+ ], + "metadata": { + "id": "eomJzCa6lb90" + } + }, + { + "cell_type": "code", + "source": [ + "from google.colab import userdata\n", + "GOOGLE_API_KEY=userdata.get('GOOGLE_API_KEY')\n", + "\n", + "genai.configure(api_key=GOOGLE_API_KEY)" + ], + "metadata": { + "id": "v-JZzORUpVR2" + }, + "execution_count": 3, + "outputs": [] + }, + { + "cell_type": "markdown", + "source": [ + "## Example" + ], + "metadata": { + "id": "JljcHgI2ltTY" + } + }, + { + "cell_type": "code", + "source": [ + "# the list as of April 2024\n", + "prompt = \"\"\"\n", + "QUERY: provide a list of atheletes that competed in olympics exactly 9 times.\n", + "CONTEXT:\n", + "Ian Millar, 10\n", + "Hubert Raudaschl, 9\n", + "Afanasijs Kuzmins, 9\n", + "Nino Salukvadze, 9\n", + "Piero d'Inzeo, 8\n", + "Raimondo d'Inzeo, 8\n", + "Claudia Pechstein, 8\n", + "Jaqueline Mourão, 8\n", + "Ivan Osiier, 7\n", + "François Lafortune, Jr, 7\n", + "\n", + "ANSWER:\"\"\"\n", + "model = genai.GenerativeModel(model_name='gemini-1.5-flash-latest', generation_config={\"temperature\": 0})\n", + "Markdown(model.generate_content(prompt).text)" + ], + "metadata": { + "colab": { + "base_uri": "https://localhost:8080/", + "height": 104 + }, + "id": "uFcm6Dd7ls_F", + "outputId": "37628141-885c-4cc4-dcd4-3c340af3a574" + }, + "execution_count": 5, + "outputs": [ + { + "output_type": "execute_result", + "data": { + "text/plain": [ + "" + ], + "text/markdown": "The list you provided already includes all the athletes who competed in the Olympics exactly 9 times:\n\n* **Hubert Raudaschl**\n* **Afanasijs Kuzmins**\n* **Nino Salukvadze** \n" + }, + "metadata": {}, + "execution_count": 5 + } + ] + } + ] +} \ No newline at end of file From a1f4a4bc68840c35c1e83fe1f389394f385f16be Mon Sep 17 00:00:00 2001 From: Shilpa Kancharla Date: Tue, 21 May 2024 14:38:48 -0700 Subject: [PATCH 2/9] delete --- .../Adding_context_information.ipynb | 152 ------------------ 1 file changed, 152 deletions(-) delete mode 100644 
examples/prompting/Adding_context_information.ipynb diff --git a/examples/prompting/Adding_context_information.ipynb b/examples/prompting/Adding_context_information.ipynb deleted file mode 100644 index eb5b62ac6..000000000 --- a/examples/prompting/Adding_context_information.ipynb +++ /dev/null @@ -1,152 +0,0 @@ -{ - "nbformat": 4, - "nbformat_minor": 0, - "metadata": { - "colab": { - "provenance": [] - }, - "kernelspec": { - "name": "python3", - "display_name": "Python 3" - }, - "language_info": { - "name": "python" - } - }, - "cells": [ - { - "cell_type": "markdown", - "source": [ - "# Gemini API: Adding context information" - ], - "metadata": { - "id": "sP8PQnz1QrcF" - } - }, - { - "cell_type": "markdown", - "source": [ - "\n", - " \n", - "
\n", - " Run in Google Colab\n", - "
" - ], - "metadata": { - "id": "bxGr_x3MRA0z" - } - }, - { - "cell_type": "markdown", - "source": [ - "While LLMs are trained extensively on various documents and data, the LLM does not know everything. New information or information that is not easily accessible cannot be known by the LLM, unless it was specifically added to its corpus of knowledge somehow. For this reason, it is sometimes necessary to provide the LLM, with information and context necessary to answer our queries by providing additional context." - ], - "metadata": { - "id": "ysy--KfNRrCq" - } - }, - { - "cell_type": "code", - "source": [ - "!pip install -U -q google-generativeai" - ], - "metadata": { - "id": "Ne-3gnXqR0hI" - }, - "execution_count": 1, - "outputs": [] - }, - { - "cell_type": "code", - "execution_count": 2, - "metadata": { - "id": "EconMHePQHGw" - }, - "outputs": [], - "source": [ - "import google.generativeai as genai\n", - "\n", - "from IPython.display import Markdown" - ] - }, - { - "cell_type": "markdown", - "source": [ - "## Configure your API key\n", - "\n", - "To run the following cell, your API key must be stored it in a Colab Secret named `GOOGLE_API_KEY`. If you don't already have an API key, or you're not sure how to create a Colab Secret, see [Authentication](https://github.com/google-gemini/cookbook/blob/main/quickstarts/Authentication.ipynb) for an example." 
- ], - "metadata": { - "id": "eomJzCa6lb90" - } - }, - { - "cell_type": "code", - "source": [ - "from google.colab import userdata\n", - "GOOGLE_API_KEY=userdata.get('GOOGLE_API_KEY')\n", - "\n", - "genai.configure(api_key=GOOGLE_API_KEY)" - ], - "metadata": { - "id": "v-JZzORUpVR2" - }, - "execution_count": 3, - "outputs": [] - }, - { - "cell_type": "markdown", - "source": [ - "## Example" - ], - "metadata": { - "id": "JljcHgI2ltTY" - } - }, - { - "cell_type": "code", - "source": [ - "# the list as of April 2024\n", - "prompt = \"\"\"\n", - "QUERY: provide a list of atheletes that competed in olympics exactly 9 times.\n", - "CONTEXT:\n", - "Ian Millar, 10\n", - "Hubert Raudaschl, 9\n", - "Afanasijs Kuzmins, 9\n", - "Nino Salukvadze, 9\n", - "Piero d'Inzeo, 8\n", - "Raimondo d'Inzeo, 8\n", - "Claudia Pechstein, 8\n", - "Jaqueline Mourão, 8\n", - "Ivan Osiier, 7\n", - "François Lafortune, Jr, 7\n", - "\n", - "ANSWER:\"\"\"\n", - "model = genai.GenerativeModel(model_name='gemini-1.5-flash-latest', generation_config={\"temperature\": 0})\n", - "Markdown(model.generate_content(prompt).text)" - ], - "metadata": { - "colab": { - "base_uri": "https://localhost:8080/", - "height": 104 - }, - "id": "uFcm6Dd7ls_F", - "outputId": "37628141-885c-4cc4-dcd4-3c340af3a574" - }, - "execution_count": 5, - "outputs": [ - { - "output_type": "execute_result", - "data": { - "text/plain": [ - "" - ], - "text/markdown": "The list you provided already includes all the athletes who competed in the Olympics exactly 9 times:\n\n* **Hubert Raudaschl**\n* **Afanasijs Kuzmins**\n* **Nino Salukvadze** \n" - }, - "metadata": {}, - "execution_count": 5 - } - ] - } - ] -} \ No newline at end of file From e57e264bef22e82f3edf590e7fade3b2241fe97f Mon Sep 17 00:00:00 2001 From: Shilpa Kancharla Date: Wed, 22 May 2024 13:28:26 -0700 Subject: [PATCH 3/9] Add JSON schema quickstart --- quickstarts/JSON_Schema.ipynb | 191 ++++++++++++++++++++++++++++++++++ 1 file changed, 191 insertions(+) create mode 
100644 quickstarts/JSON_Schema.ipynb diff --git a/quickstarts/JSON_Schema.ipynb b/quickstarts/JSON_Schema.ipynb new file mode 100644 index 000000000..cafc5573f --- /dev/null +++ b/quickstarts/JSON_Schema.ipynb @@ -0,0 +1,191 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "source": [ + "# Gemini API: JSON Schema\n", + "\n", + "The Gemini API can be used to generate a JSON output if we set the schema that we would like to use." + ], + "metadata": { + "id": "GAsiP4mohC2_" + } + }, + { + "cell_type": "code", + "execution_count": 2, + "metadata": { + "colab": { + "base_uri": "https://localhost:8080/" + }, + "id": "qLuL9m7KhvxR", + "outputId": "86c668b8-f3a0-40b0-e832-76bb4f82368f" + }, + "outputs": [ + { + "output_type": "stream", + "name": "stdout", + "text": [ + " Installing build dependencies ... \u001b[?25l\u001b[?25hdone\n", + " Getting requirements to build wheel ... \u001b[?25l\u001b[?25hdone\n", + " Preparing metadata (pyproject.toml) ... \u001b[?25l\u001b[?25hdone\n", + "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m679.1/679.1 kB\u001b[0m \u001b[31m7.5 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n", + "\u001b[?25h Building wheel for google-generativeai (pyproject.toml) ... \u001b[?25l\u001b[?25hdone\n" + ] + } + ], + "source": [ + "!pip install -U -q google-generativeai" + ] + }, + { + "cell_type": "code", + "execution_count": 3, + "metadata": { + "id": "ATIbQM0NHhkj" + }, + "outputs": [], + "source": [ + "import google.generativeai as genai\n", + "\n", + "import dataclasses\n", + "import typing_extensions as typing" + ] + }, + { + "cell_type": "markdown", + "source": [ + "## Configure your API key\n", + "\n", + "To run the following cell, your API key must be stored in a Colab Secret named `GOOGLE_API_KEY`. If you don't already have an API key, or you're not sure how to create a Colab Secret, see [Authentication](https://github.com/google-gemini/cookbook/blob/main/quickstarts/Authentication.ipynb) for an example." 
+ ], + "metadata": { + "id": "B-axqBTM8Lbd" + } + }, + { + "cell_type": "code", + "execution_count": 4, + "metadata": { + "id": "d6lYXRcjthKV" + }, + "outputs": [], + "source": [ + "from google.colab import userdata\n", + "GOOGLE_API_KEY=userdata.get('GOOGLE_API_KEY')\n", + "\n", + "genai.configure(api_key=GOOGLE_API_KEY)" + ] + }, + { + "cell_type": "markdown", + "source": [ + "## Generate JSON\n", + "\n", + "We can take a Python class, for instance, and use it as our schema for generating JSON." + ], + "metadata": { + "id": "K9nIks0R-tIa" + } + }, + { + "cell_type": "code", + "source": [ + "class Person(typing.TypedDict):\n", + " family_name: str\n", + " favorite_food: str\n" + ], + "metadata": { + "id": "vP6teXff_H_L" + }, + "execution_count": 5, + "outputs": [] + }, + { + "cell_type": "code", + "source": [ + "prompt = \"Hello, describe a person, all fields are required.\"" + ], + "metadata": { + "id": "p0gxz0NNr8si" + }, + "execution_count": 6, + "outputs": [] + }, + { + "cell_type": "markdown", + "source": [ + "Using `generate_content`, we pass in the Python class `Person` defined above into the `generation_config`'s `response_schema` field." 
+ ], + "metadata": { + "id": "vBlWzt6M-2oM" + } + }, + { + "cell_type": "code", + "execution_count": 7, + "metadata": { + "id": "8oe-tL8MDGtx" + }, + "outputs": [], + "source": [ + "model = genai.GenerativeModel(model_name=\"models/gemini-1.5-pro-latest\")\n", + "\n", + "result = model.generate_content(\n", + " prompt,\n", + " generation_config={\"response_mime_type\": \"application/json\",\n", + " \"response_schema\": Person\n", + " },\n", + " request_options={\"timeout\": 600},\n", + ")" + ] + }, + { + "cell_type": "code", + "source": [ + "print(result.text)" + ], + "metadata": { + "id": "slYcVAcqaDQY", + "colab": { + "base_uri": "https://localhost:8080/" + }, + "outputId": "468d90b2-b7a9-4eca-a09e-4d9b46bcc741" + }, + "execution_count": 8, + "outputs": [ + { + "output_type": "stream", + "name": "stdout", + "text": [ + "{\"family_name\": \"Smith\", \"favorite_food\": \"Pizza\" } \n" + ] + } + ] + } + ], + "metadata": { + "colab": { + "provenance": [] + }, + "kernelspec": { + "display_name": "Python 3 (ipykernel)", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.11.3" + } + }, + "nbformat": 4, + "nbformat_minor": 0 +} \ No newline at end of file From a9fbc9a425eb1e1d58ed05acceff59daf7140148 Mon Sep 17 00:00:00 2001 From: Shilpa Kancharla Date: Wed, 22 May 2024 13:30:30 -0700 Subject: [PATCH 4/9] Add info about Gemini 1.5 pro --- quickstarts/JSON_Schema.ipynb | 70 +++++++++++++++++------------------ 1 file changed, 35 insertions(+), 35 deletions(-) diff --git a/quickstarts/JSON_Schema.ipynb b/quickstarts/JSON_Schema.ipynb index cafc5573f..7faeb03ad 100644 --- a/quickstarts/JSON_Schema.ipynb +++ b/quickstarts/JSON_Schema.ipynb @@ -2,14 +2,14 @@ "cells": [ { "cell_type": "markdown", + "metadata": { + "id": "GAsiP4mohC2_" + }, 
"source": [ "# Gemini API: JSON Schema\n", "\n", "The Gemini API can be used to generate a JSON output if we set the schema that we would like to use." - ], - "metadata": { - "id": "GAsiP4mohC2_" - } + ] }, { "cell_type": "code", @@ -23,8 +23,8 @@ }, "outputs": [ { - "output_type": "stream", "name": "stdout", + "output_type": "stream", "text": [ " Installing build dependencies ... \u001b[?25l\u001b[?25hdone\n", " Getting requirements to build wheel ... \u001b[?25l\u001b[?25hdone\n", @@ -54,14 +54,14 @@ }, { "cell_type": "markdown", + "metadata": { + "id": "B-axqBTM8Lbd" + }, "source": [ "## Configure your API key\n", "\n", "To run the following cell, your API key must be stored in a Colab Secret named `GOOGLE_API_KEY`. If you don't already have an API key, or you're not sure how to create a Colab Secret, see [Authentication](https://github.com/google-gemini/cookbook/blob/main/quickstarts/Authentication.ipynb) for an example." - ], - "metadata": { - "id": "B-axqBTM8Lbd" - } + ] }, { "cell_type": "code", @@ -79,47 +79,47 @@ }, { "cell_type": "markdown", + "metadata": { + "id": "K9nIks0R-tIa" + }, "source": [ "## Generate JSON\n", "\n", "We can take a Python class, for instance, and use it as our schema for generating JSON." 
- ], - "metadata": { - "id": "K9nIks0R-tIa" - } + ] }, { "cell_type": "code", + "execution_count": 5, + "metadata": { + "id": "vP6teXff_H_L" + }, + "outputs": [], "source": [ "class Person(typing.TypedDict):\n", " family_name: str\n", " favorite_food: str\n" - ], - "metadata": { - "id": "vP6teXff_H_L" - }, - "execution_count": 5, - "outputs": [] + ] }, { "cell_type": "code", - "source": [ - "prompt = \"Hello, describe a person, all fields are required.\"" - ], + "execution_count": 6, "metadata": { "id": "p0gxz0NNr8si" }, - "execution_count": 6, - "outputs": [] + "outputs": [], + "source": [ + "prompt = \"Hello, describe a person, all fields are required.\"" + ] }, { "cell_type": "markdown", - "source": [ - "Using `generate_content`, we pass in the Python class `Person` defined above into the `generation_config`'s `response_schema` field." - ], "metadata": { "id": "vBlWzt6M-2oM" - } + }, + "source": [ + "Using `generate_content`, we pass in the Python class `Person` defined above into the `generation_config`'s `response_schema` field. Let's use the latest Gemini 1.5 Pro model to generate our JSON." 
+ ] }, { "cell_type": "code", @@ -142,25 +142,25 @@ }, { "cell_type": "code", - "source": [ - "print(result.text)" - ], + "execution_count": 8, "metadata": { - "id": "slYcVAcqaDQY", "colab": { "base_uri": "https://localhost:8080/" }, + "id": "slYcVAcqaDQY", "outputId": "468d90b2-b7a9-4eca-a09e-4d9b46bcc741" }, - "execution_count": 8, "outputs": [ { - "output_type": "stream", "name": "stdout", + "output_type": "stream", "text": [ "{\"family_name\": \"Smith\", \"favorite_food\": \"Pizza\" } \n" ] } + ], + "source": [ + "print(result.text)" ] } ], @@ -188,4 +188,4 @@ }, "nbformat": 4, "nbformat_minor": 0 -} \ No newline at end of file +} From 7944d87b0e7cd0e9b6ccff8d41d797ab6e6dc14f Mon Sep 17 00:00:00 2001 From: Shilpa Kancharla Date: Fri, 24 May 2024 13:49:56 -0700 Subject: [PATCH 5/9] Combining JSON mode and schema notebooks --- quickstarts/JSON_Schema.ipynb | 191 ------------------ quickstarts/JSON_mode.ipynb | 354 +++++++++++++++++++++++----------- 2 files changed, 243 insertions(+), 302 deletions(-) delete mode 100644 quickstarts/JSON_Schema.ipynb diff --git a/quickstarts/JSON_Schema.ipynb b/quickstarts/JSON_Schema.ipynb deleted file mode 100644 index 7faeb03ad..000000000 --- a/quickstarts/JSON_Schema.ipynb +++ /dev/null @@ -1,191 +0,0 @@ -{ - "cells": [ - { - "cell_type": "markdown", - "metadata": { - "id": "GAsiP4mohC2_" - }, - "source": [ - "# Gemini API: JSON Schema\n", - "\n", - "The Gemini API can be used to generate a JSON output if we set the schema that we would like to use." - ] - }, - { - "cell_type": "code", - "execution_count": 2, - "metadata": { - "colab": { - "base_uri": "https://localhost:8080/" - }, - "id": "qLuL9m7KhvxR", - "outputId": "86c668b8-f3a0-40b0-e832-76bb4f82368f" - }, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - " Installing build dependencies ... \u001b[?25l\u001b[?25hdone\n", - " Getting requirements to build wheel ... \u001b[?25l\u001b[?25hdone\n", - " Preparing metadata (pyproject.toml) ... 
\u001b[?25l\u001b[?25hdone\n", - "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m679.1/679.1 kB\u001b[0m \u001b[31m7.5 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n", - "\u001b[?25h Building wheel for google-generativeai (pyproject.toml) ... \u001b[?25l\u001b[?25hdone\n" - ] - } - ], - "source": [ - "!pip install -U -q google-generativeai" - ] - }, - { - "cell_type": "code", - "execution_count": 3, - "metadata": { - "id": "ATIbQM0NHhkj" - }, - "outputs": [], - "source": [ - "import google.generativeai as genai\n", - "\n", - "import dataclasses\n", - "import typing_extensions as typing" - ] - }, - { - "cell_type": "markdown", - "metadata": { - "id": "B-axqBTM8Lbd" - }, - "source": [ - "## Configure your API key\n", - "\n", - "To run the following cell, your API key must be stored in a Colab Secret named `GOOGLE_API_KEY`. If you don't already have an API key, or you're not sure how to create a Colab Secret, see [Authentication](https://github.com/google-gemini/cookbook/blob/main/quickstarts/Authentication.ipynb) for an example." - ] - }, - { - "cell_type": "code", - "execution_count": 4, - "metadata": { - "id": "d6lYXRcjthKV" - }, - "outputs": [], - "source": [ - "from google.colab import userdata\n", - "GOOGLE_API_KEY=userdata.get('GOOGLE_API_KEY')\n", - "\n", - "genai.configure(api_key=GOOGLE_API_KEY)" - ] - }, - { - "cell_type": "markdown", - "metadata": { - "id": "K9nIks0R-tIa" - }, - "source": [ - "## Generate JSON\n", - "\n", - "We can take a Python class, for instance, and use it as our schema for generating JSON." 
- ] - }, - { - "cell_type": "code", - "execution_count": 5, - "metadata": { - "id": "vP6teXff_H_L" - }, - "outputs": [], - "source": [ - "class Person(typing.TypedDict):\n", - " family_name: str\n", - " favorite_food: str\n" - ] - }, - { - "cell_type": "code", - "execution_count": 6, - "metadata": { - "id": "p0gxz0NNr8si" - }, - "outputs": [], - "source": [ - "prompt = \"Hello, describe a person, all fields are required.\"" - ] - }, - { - "cell_type": "markdown", - "metadata": { - "id": "vBlWzt6M-2oM" - }, - "source": [ - "Using `generate_content`, we pass in the Python class `Person` defined above into the `generation_config`'s `response_schema` field. Let's use the latest Gemini 1.5 Pro model to generate our JSON." - ] - }, - { - "cell_type": "code", - "execution_count": 7, - "metadata": { - "id": "8oe-tL8MDGtx" - }, - "outputs": [], - "source": [ - "model = genai.GenerativeModel(model_name=\"models/gemini-1.5-pro-latest\")\n", - "\n", - "result = model.generate_content(\n", - " prompt,\n", - " generation_config={\"response_mime_type\": \"application/json\",\n", - " \"response_schema\": Person\n", - " },\n", - " request_options={\"timeout\": 600},\n", - ")" - ] - }, - { - "cell_type": "code", - "execution_count": 8, - "metadata": { - "colab": { - "base_uri": "https://localhost:8080/" - }, - "id": "slYcVAcqaDQY", - "outputId": "468d90b2-b7a9-4eca-a09e-4d9b46bcc741" - }, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "{\"family_name\": \"Smith\", \"favorite_food\": \"Pizza\" } \n" - ] - } - ], - "source": [ - "print(result.text)" - ] - } - ], - "metadata": { - "colab": { - "provenance": [] - }, - "kernelspec": { - "display_name": "Python 3 (ipykernel)", - "language": "python", - "name": "python3" - }, - "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 3 - }, - "file_extension": ".py", - "mimetype": "text/x-python", - "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython3", - 
"version": "3.11.3" - } - }, - "nbformat": 4, - "nbformat_minor": 0 -} diff --git a/quickstarts/JSON_mode.ipynb b/quickstarts/JSON_mode.ipynb index 6a102ce05..f666a2d5e 100644 --- a/quickstarts/JSON_mode.ipynb +++ b/quickstarts/JSON_mode.ipynb @@ -2,40 +2,6 @@ "cells": [ { "cell_type": "markdown", - "metadata": { - "id": "Tce3stUlHN0L" - }, - "source": [ - "##### Copyright 2024 Google LLC." - ] - }, - { - "cell_type": "code", - "execution_count": null, - "metadata": { - "cellView": "form", - "id": "tuOe1ymfHZPu" - }, - "outputs": [], - "source": [ - "# @title Licensed under the Apache License, Version 2.0 (the \"License\");\n", - "# you may not use this file except in compliance with the License.\n", - "# You may obtain a copy of the License at\n", - "#\n", - "# https://www.apache.org/licenses/LICENSE-2.0\n", - "#\n", - "# Unless required by applicable law or agreed to in writing, software\n", - "# distributed under the License is distributed on an \"AS IS\" BASIS,\n", - "# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n", - "# See the License for the specific language governing permissions and\n", - "# limitations under the License." - ] - }, - { - "cell_type": "markdown", - "metadata": { - "id": "893sOzyhJDma" - }, "source": [ "# Gemini API: JSON Mode Quickstart\n", "\n", @@ -44,161 +10,202 @@ " Run in Google Colab\n", " \n", "" - ] + ], + "metadata": { + "id": "GAsiP4mohC2_" + } }, { "cell_type": "markdown", - "metadata": { - "id": "h4LQoYRTJIP9" - }, "source": [ - "This notebook demonstrates how to use JSON mode." - ] + "The Gemini API can be used to generate a JSON output if we set the schema that we would like to use.\n", + "\n", + "**Note**: Use Gemini 1.5 Pro when generating JSON. JSON schemas are only supported by Gemini 1.5 Pro right now." 
+ ], + "metadata": { + "id": "lF6sWVRGQ_bi" + } }, { "cell_type": "code", "execution_count": 1, "metadata": { - "id": "_PBH7eR9He0I" + "id": "qLuL9m7KhvxR" }, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m146.8/146.8 kB\u001b[0m \u001b[31m1.4 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n", - "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m664.5/664.5 kB\u001b[0m \u001b[31m6.0 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n", - "\u001b[?25h" - ] - } - ], + "outputs": [], "source": [ - "!pip install -qU google-generativeai" + "!pip install -U -q google-generativeai" ] }, { "cell_type": "code", "execution_count": 2, "metadata": { - "id": "2zwIBNLWJvRf" + "id": "ATIbQM0NHhkj" }, "outputs": [], "source": [ "import google.generativeai as genai\n", - "import json" + "from google.generativeai.types import content_types\n", + "\n", + "import json\n", + "import dataclasses\n", + "import typing_extensions as typing" ] }, { "cell_type": "markdown", - "metadata": { - "id": "F6gHNgcUypVN" - }, "source": [ - "To run the following cell, your API key must be stored it in a Colab Secret named `GOOGLE_API_KEY`. If you don't already have an API key, or you're not sure how to create a Colab Secret, see the [Authentication](https://github.com/google-gemini/cookbook/blob/main/quickstarts/Authentication.ipynb) quickstart for an example." - ] + "## Configure your API key\n", + "\n", + "To run the following cell, your API key must be stored in a Colab Secret named `GOOGLE_API_KEY`. If you don't already have an API key, or you're not sure how to create a Colab Secret, see [Authentication](https://github.com/google-gemini/cookbook/blob/main/quickstarts/Authentication.ipynb) for an example." 
+ ], + "metadata": { + "id": "B-axqBTM8Lbd" + } }, { "cell_type": "code", "execution_count": 3, "metadata": { - "id": "t0jy9XWjJwv7" + "id": "d6lYXRcjthKV" }, "outputs": [], "source": [ "from google.colab import userdata\n", "GOOGLE_API_KEY=userdata.get('GOOGLE_API_KEY')\n", + "\n", "genai.configure(api_key=GOOGLE_API_KEY)" ] }, { "cell_type": "markdown", - "metadata": { - "id": "vf42XN1KLcfV" - }, "source": [ - "## Activate JSON mode" - ] + "## Activate JSON Mode\n", + "\n", + " Activate JSON mode by specifying `respose_mime_type` in the `generation_config` parameter. We'll use Gemini 1.5 Pro for this example, but note that you can use Gemini 1.5 Flash while describing your schema in the prompt." + ], + "metadata": { + "id": "hD3qXcOTRD3z" + } }, { - "cell_type": "markdown", + "cell_type": "code", + "source": [ + "model = genai.GenerativeModel(\"gemini-1.5-pro-latest\",\n", + " generation_config={\"response_mime_type\": \"application/json\"})" + ], "metadata": { - "id": "dC5-79CDMJ3R" + "id": "i5Rod-lXRIhf" }, - "source": [ - "Activate JSON mode by specifying `respose_mime_type` in the `generation_config` parameter." 
- ] + "execution_count": 18, + "outputs": [] }, { "cell_type": "code", - "execution_count": 4, + "source": [ + "class Recipe(typing.TypedDict):\n", + " recipe_name: str" + ], "metadata": { - "id": "WWq64FXSLXgr" + "id": "JiIxKaLl4R0f" }, - "outputs": [], - "source": [ - "model = genai.GenerativeModel(\"gemini-1.5-pro-latest\",\n", - " generation_config={\"response_mime_type\": \"application/json\"})" - ] + "execution_count": 12, + "outputs": [] }, { "cell_type": "code", - "execution_count": 5, + "source": [ + "response_schema = content_types._schema_for_class(list[Recipe])\n", + "response_schema" + ], "metadata": { - "id": "Y_djQzyyaCLg" + "colab": { + "base_uri": "https://localhost:8080/" + }, + "id": "VV_bRJb15uqv", + "outputId": "c1cd99ce-5268-407e-ac61-f093bfa852b7" }, - "outputs": [], + "execution_count": 13, + "outputs": [ + { + "output_type": "execute_result", + "data": { + "text/plain": [ + "{'items': {'properties': {'recipe_name': {'type': 'string'}},\n", + " 'type': 'object'},\n", + " 'type': 'array'}" + ] + }, + "metadata": {}, + "execution_count": 13 + } + ] + }, + { + "cell_type": "code", "source": [ - "prompt = \"\"\"List a few popular cookie recipes using this JSON schema:\n", + "prompt = \"\"\"List a few popular cookie recipes using this JSON schema, be sure to return an array:\n", "{'type': 'object', 'properties': { 'recipe_name': {'type': 'string'}}}\"\"\"" - ] + ], + "metadata": { + "id": "K8ezjNb0RJ6Y" + }, + "execution_count": 29, + "outputs": [] }, { "cell_type": "code", - "execution_count": 6, + "source": [ + "response = model.generate_content(prompt)\n", + "print(response.text)" + ], "metadata": { - "id": "aENeySrWMJN6" + "colab": { + "base_uri": "https://localhost:8080/", + "height": 72 + }, + "id": "ggudoxK8RMlb", + "outputId": "a6b881a8-d5b7-4517-9a8d-070f1ae2c59c" }, + "execution_count": 30, "outputs": [ { - "name": "stdout", "output_type": "stream", + "name": "stdout", "text": [ - "[\n", - " {\"recipe_name\": \"Chocolate Chip 
Cookies\"},\n", - " {\"recipe_name\": \"Peanut Butter Cookies\"},\n", - " {\"recipe_name\": \"Oatmeal Raisin Cookies\"},\n", - " {\"recipe_name\": \"Sugar Cookies\"},\n", - " {\"recipe_name\": \"Shortbread Cookies\"}\n", - "]\n", - "\n", + "[{\"recipe_name\": \"Chocolate Chip Cookies\"}, {\"recipe_name\": \"Peanut Butter Cookies\"}, {\"recipe_name\": \"Oatmeal Raisin Cookies\"}, {\"recipe_name\": \"Sugar Cookies\"}, {\"recipe_name\": \"Snickerdoodles\"}]\n", "\n" ] } - ], - "source": [ - "response = model.generate_content(prompt)\n", - "print(response.text)" ] }, { "cell_type": "markdown", - "metadata": { - "id": "pqNsOE1YysLc" - }, "source": [ "Just for fun, parse the string to JSON, and then serialize it." - ] + ], + "metadata": { + "id": "9TqoNg3VSMYB" + } }, { "cell_type": "code", - "execution_count": 9, + "source": [ + "print(json.dumps(json.loads(response.text), indent=4))" + ], "metadata": { - "id": "nb9Z9TdHRzTu" + "colab": { + "base_uri": "https://localhost:8080/" + }, + "id": "WLDPREpmSMu5", + "outputId": "4043817f-4885-4f91-a7c6-7da525350425" }, + "execution_count": 31, "outputs": [ { - "name": "stdout", "output_type": "stream", + "name": "stdout", "text": [ "[\n", " {\n", @@ -214,27 +221,152 @@ " \"recipe_name\": \"Sugar Cookies\"\n", " },\n", " {\n", - " \"recipe_name\": \"Shortbread Cookies\"\n", + " \"recipe_name\": \"Snickerdoodles\"\n", " }\n", "]\n" ] } + ] + }, + { + "cell_type": "markdown", + "source": [ + "## Generate JSON from schema\n", + "\n", + "We can take a Python class, for instance, and use it as our schema for generating JSON. When passing in the `response_schema` parameter, use the Gemini 1.5 Pro model. Gemini 1.5 Flash does not support this." 
], + "metadata": { + "id": "K9nIks0R-tIa" + } + }, + { + "cell_type": "code", "source": [ - "print(json.dumps(json.loads(response.text), indent=4))" + "class Person(typing.TypedDict):\n", + " family_name: str\n", + " favorite_food: str" + ], + "metadata": { + "id": "vP6teXff_H_L" + }, + "execution_count": 5, + "outputs": [] + }, + { + "cell_type": "code", + "source": [ + "prompt = \"Hello, describe some people, all fields are required.\"" + ], + "metadata": { + "id": "p0gxz0NNr8si" + }, + "execution_count": 6, + "outputs": [] + }, + { + "cell_type": "code", + "source": [ + "response_schema = content_types._schema_for_class(list[Person])\n", + "response_schema" + ], + "metadata": { + "colab": { + "base_uri": "https://localhost:8080/" + }, + "id": "SuZQTQZ8438K", + "outputId": "81f15a94-bbb2-43f6-f24d-e2c267bd9d52" + }, + "execution_count": 8, + "outputs": [ + { + "output_type": "execute_result", + "data": { + "text/plain": [ + "{'items': {'properties': {'family_name': {'type': 'string'},\n", + " 'favorite_food': {'type': 'string'}},\n", + " 'type': 'object'},\n", + " 'type': 'array'}" + ] + }, + "metadata": {}, + "execution_count": 8 + } + ] + }, + { + "cell_type": "markdown", + "source": [ + "Using `generate_content`, we pass in the Python class `Person` defined above into the `generation_config`'s `response_schema` field." 
+ ], + "metadata": { + "id": "vBlWzt6M-2oM" + } + }, + { + "cell_type": "code", + "execution_count": 9, + "metadata": { + "id": "8oe-tL8MDGtx" + }, + "outputs": [], + "source": [ + "model = genai.GenerativeModel(model_name=\"models/gemini-1.5-pro-latest\")\n", + "\n", + "result = model.generate_content(\n", + " prompt,\n", + " generation_config={\"response_mime_type\": \"application/json\",\n", + " \"response_schema\": response_schema,\n", + " },\n", + " request_options={\"timeout\": 600},\n", + ")" + ] + }, + { + "cell_type": "code", + "source": [ + "print(result.text)" + ], + "metadata": { + "id": "slYcVAcqaDQY", + "colab": { + "base_uri": "https://localhost:8080/" + }, + "outputId": "bb25d4ba-2642-4ffc-c9c8-1490a9fa0393" + }, + "execution_count": 10, + "outputs": [ + { + "output_type": "stream", + "name": "stdout", + "text": [ + "[{\"favorite_food\": \"pizza\"}] \n" + ] + } ] } ], "metadata": { "colab": { - "name": "JSON_mode.ipynb", - "toc_visible": true + "provenance": [] }, "kernelspec": { - "display_name": "Python 3", + "display_name": "Python 3 (ipykernel)", + "language": "python", "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.11.3" } }, "nbformat": 4, "nbformat_minor": 0 -} +} \ No newline at end of file From de9396ca92030d1632aaea6354141385015c93e8 Mon Sep 17 00:00:00 2001 From: Shilpa Kancharla Date: Fri, 24 May 2024 14:44:00 -0700 Subject: [PATCH 6/9] Reformatted notebook --- quickstarts/JSON_mode.ipynb | 225 +++++++++++++++--------------------- 1 file changed, 96 insertions(+), 129 deletions(-) diff --git a/quickstarts/JSON_mode.ipynb b/quickstarts/JSON_mode.ipynb index f666a2d5e..369ddc593 100644 --- a/quickstarts/JSON_mode.ipynb +++ b/quickstarts/JSON_mode.ipynb @@ -2,6 +2,9 @@ "cells": [ { "cell_type": "markdown", + "metadata": { + 
"id": "GAsiP4mohC2_" + }, "source": [ "# Gemini API: JSON Mode Quickstart\n", "\n", @@ -10,21 +13,18 @@ " Run in Google Colab\n", " \n", "" - ], - "metadata": { - "id": "GAsiP4mohC2_" - } + ] }, { "cell_type": "markdown", + "metadata": { + "id": "lF6sWVRGQ_bi" + }, "source": [ "The Gemini API can be used to generate a JSON output if we set the schema that we would like to use.\n", "\n", "**Note**: Use Gemini 1.5 Pro when generating JSON. JSON schemas are only supported by Gemini 1.5 Pro right now." - ], - "metadata": { - "id": "lF6sWVRGQ_bi" - } + ] }, { "cell_type": "code", @@ -55,14 +55,14 @@ }, { "cell_type": "markdown", + "metadata": { + "id": "B-axqBTM8Lbd" + }, "source": [ "## Configure your API key\n", "\n", "To run the following cell, your API key must be stored in a Colab Secret named `GOOGLE_API_KEY`. If you don't already have an API key, or you're not sure how to create a Colab Secret, see [Authentication](https://github.com/google-gemini/cookbook/blob/main/quickstarts/Authentication.ipynb) for an example." - ], - "metadata": { - "id": "B-axqBTM8Lbd" - } + ] }, { "cell_type": "code", @@ -80,56 +80,47 @@ }, { "cell_type": "markdown", + "metadata": { + "id": "hD3qXcOTRD3z" + }, "source": [ "## Activate JSON Mode\n", "\n", " Activate JSON mode by specifying `respose_mime_type` in the `generation_config` parameter. We'll use Gemini 1.5 Pro for this example, but note that you can use Gemini 1.5 Flash while describing your schema in the prompt." 
- ], - "metadata": { - "id": "hD3qXcOTRD3z" - } + ] }, { "cell_type": "code", - "source": [ - "model = genai.GenerativeModel(\"gemini-1.5-pro-latest\",\n", - " generation_config={\"response_mime_type\": \"application/json\"})" - ], + "execution_count": 18, "metadata": { "id": "i5Rod-lXRIhf" }, - "execution_count": 18, - "outputs": [] + "outputs": [], + "source": [ + "model = genai.GenerativeModel(\"gemini-1.5-pro-latest\",\n", + " generation_config={\"response_mime_type\": \"application/json\"})" + ] }, { "cell_type": "code", - "source": [ - "class Recipe(typing.TypedDict):\n", - " recipe_name: str" - ], + "execution_count": 12, "metadata": { "id": "JiIxKaLl4R0f" }, - "execution_count": 12, - "outputs": [] + "outputs": [], + "source": [ + "class Recipe(typing.TypedDict):\n", + " recipe_name: str" + ] }, { "cell_type": "code", - "source": [ - "response_schema = content_types._schema_for_class(list[Recipe])\n", - "response_schema" - ], + "execution_count": 13, "metadata": { - "colab": { - "base_uri": "https://localhost:8080/" - }, - "id": "VV_bRJb15uqv", - "outputId": "c1cd99ce-5268-407e-ac61-f093bfa852b7" + "id": "VV_bRJb15uqv" }, - "execution_count": 13, "outputs": [ { - "output_type": "execute_result", "data": { "text/plain": [ "{'items': {'properties': {'recipe_name': {'type': 'string'}},\n", @@ -137,75 +128,68 @@ " 'type': 'array'}" ] }, + "execution_count": 13, "metadata": {}, - "execution_count": 13 + "output_type": "execute_result" } + ], + "source": [ + "response_schema = content_types._schema_for_class(list[Recipe])\n", + "response_schema" ] }, { "cell_type": "code", - "source": [ - "prompt = \"\"\"List a few popular cookie recipes using this JSON schema, be sure to return an array:\n", - "{'type': 'object', 'properties': { 'recipe_name': {'type': 'string'}}}\"\"\"" - ], + "execution_count": 29, "metadata": { "id": "K8ezjNb0RJ6Y" }, - "execution_count": 29, - "outputs": [] + "outputs": [], + "source": [ + "prompt = \"\"\"List a few popular cookie recipes 
using this JSON schema, be sure to return an array:\n", + "{'type': 'object', 'properties': { 'recipe_name': {'type': 'string'}}}\"\"\"" + ] }, { "cell_type": "code", - "source": [ - "response = model.generate_content(prompt)\n", - "print(response.text)" - ], + "execution_count": 30, "metadata": { - "colab": { - "base_uri": "https://localhost:8080/", - "height": 72 - }, - "id": "ggudoxK8RMlb", - "outputId": "a6b881a8-d5b7-4517-9a8d-070f1ae2c59c" + "id": "ggudoxK8RMlb" }, - "execution_count": 30, "outputs": [ { - "output_type": "stream", "name": "stdout", + "output_type": "stream", "text": [ "[{\"recipe_name\": \"Chocolate Chip Cookies\"}, {\"recipe_name\": \"Peanut Butter Cookies\"}, {\"recipe_name\": \"Oatmeal Raisin Cookies\"}, {\"recipe_name\": \"Sugar Cookies\"}, {\"recipe_name\": \"Snickerdoodles\"}]\n", "\n" ] } + ], + "source": [ + "response = model.generate_content(prompt)\n", + "print(response.text)" ] }, { "cell_type": "markdown", - "source": [ - "Just for fun, parse the string to JSON, and then serialize it." - ], "metadata": { "id": "9TqoNg3VSMYB" - } + }, + "source": [ + "Just for fun, parse the string to JSON, and then serialize it." + ] }, { "cell_type": "code", - "source": [ - "print(json.dumps(json.loads(response.text), indent=4))" - ], + "execution_count": 31, "metadata": { - "colab": { - "base_uri": "https://localhost:8080/" - }, - "id": "WLDPREpmSMu5", - "outputId": "4043817f-4885-4f91-a7c6-7da525350425" + "id": "WLDPREpmSMu5" }, - "execution_count": 31, "outputs": [ { - "output_type": "stream", "name": "stdout", + "output_type": "stream", "text": [ "[\n", " {\n", @@ -226,60 +210,54 @@ "]\n" ] } + ], + "source": [ + "print(json.dumps(json.loads(response.text), indent=4))" ] }, { "cell_type": "markdown", + "metadata": { + "id": "K9nIks0R-tIa" + }, "source": [ "## Generate JSON from schema\n", "\n", "We can take a Python class, for instance, and use it as our schema for generating JSON. 
When passing in the `response_schema` parameter, use the Gemini 1.5 Pro model. Gemini 1.5 Flash does not support this." - ], - "metadata": { - "id": "K9nIks0R-tIa" - } + ] }, { "cell_type": "code", + "execution_count": 5, + "metadata": { + "id": "vP6teXff_H_L" + }, + "outputs": [], "source": [ "class Person(typing.TypedDict):\n", " family_name: str\n", " favorite_food: str" - ], - "metadata": { - "id": "vP6teXff_H_L" - }, - "execution_count": 5, - "outputs": [] + ] }, { "cell_type": "code", - "source": [ - "prompt = \"Hello, describe some people, all fields are required.\"" - ], + "execution_count": 6, "metadata": { "id": "p0gxz0NNr8si" }, - "execution_count": 6, - "outputs": [] + "outputs": [], + "source": [ + "prompt = \"Hello, describe some people, all fields are required.\"" + ] }, { "cell_type": "code", - "source": [ - "response_schema = content_types._schema_for_class(list[Person])\n", - "response_schema" - ], + "execution_count": 8, "metadata": { - "colab": { - "base_uri": "https://localhost:8080/" - }, - "id": "SuZQTQZ8438K", - "outputId": "81f15a94-bbb2-43f6-f24d-e2c267bd9d52" + "id": "SuZQTQZ8438K" }, - "execution_count": 8, "outputs": [ { - "output_type": "execute_result", "data": { "text/plain": [ "{'items': {'properties': {'family_name': {'type': 'string'},\n", @@ -288,19 +266,24 @@ " 'type': 'array'}" ] }, + "execution_count": 8, "metadata": {}, - "execution_count": 8 + "output_type": "execute_result" } + ], + "source": [ + "response_schema = content_types._schema_for_class(list[Person])\n", + "response_schema" ] }, { "cell_type": "markdown", - "source": [ - "Using `generate_content`, we pass in the Python class `Person` defined above into the `generation_config`'s `response_schema` field." - ], "metadata": { "id": "vBlWzt6M-2oM" - } + }, + "source": [ + "Using `generate_content`, we pass in the Python class `Person` defined above into the `generation_config`'s `response_schema` field." 
+ ] }, { "cell_type": "code", @@ -323,50 +306,34 @@ }, { "cell_type": "code", - "source": [ - "print(result.text)" - ], + "execution_count": 10, "metadata": { - "id": "slYcVAcqaDQY", - "colab": { - "base_uri": "https://localhost:8080/" - }, - "outputId": "bb25d4ba-2642-4ffc-c9c8-1490a9fa0393" + "id": "slYcVAcqaDQY" }, - "execution_count": 10, "outputs": [ { - "output_type": "stream", "name": "stdout", + "output_type": "stream", "text": [ "[{\"favorite_food\": \"pizza\"}] \n" ] } + ], + "source": [ + "print(result.text)" ] } ], "metadata": { "colab": { - "provenance": [] + "name": "JSON_Mode.ipynb", + "toc_visible": true }, "kernelspec": { - "display_name": "Python 3 (ipykernel)", - "language": "python", + "display_name": "Python 3", "name": "python3" - }, - "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 3 - }, - "file_extension": ".py", - "mimetype": "text/x-python", - "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython3", - "version": "3.11.3" } }, "nbformat": 4, "nbformat_minor": 0 -} \ No newline at end of file +} From 95128dd82c38072040b601dfbc3073f73367dc04 Mon Sep 17 00:00:00 2001 From: Shilpa Kancharla Date: Fri, 24 May 2024 14:45:33 -0700 Subject: [PATCH 7/9] replace we with you --- quickstarts/JSON_mode.ipynb | 6 +++--- 1 file changed, 3 insertions(+), 3 deletions(-) diff --git a/quickstarts/JSON_mode.ipynb b/quickstarts/JSON_mode.ipynb index 369ddc593..8ca12a65e 100644 --- a/quickstarts/JSON_mode.ipynb +++ b/quickstarts/JSON_mode.ipynb @@ -21,7 +21,7 @@ "id": "lF6sWVRGQ_bi" }, "source": [ - "The Gemini API can be used to generate a JSON output if we set the schema that we would like to use.\n", + "The Gemini API can be used to generate a JSON output if you set the schema that you would like to use.\n", "\n", "**Note**: Use Gemini 1.5 Pro when generating JSON. JSON schemas are only supported by Gemini 1.5 Pro right now." 
] @@ -86,7 +86,7 @@ "source": [ "## Activate JSON Mode\n", "\n", - " Activate JSON mode by specifying `respose_mime_type` in the `generation_config` parameter. We'll use Gemini 1.5 Pro for this example, but note that you can use Gemini 1.5 Flash while describing your schema in the prompt." + " Activate JSON mode by specifying `respose_mime_type` in the `generation_config` parameter. You'll use Gemini 1.5 Pro for this example, but note that you can use Gemini 1.5 Flash while describing your schema in the prompt." ] }, { @@ -223,7 +223,7 @@ "source": [ "## Generate JSON from schema\n", "\n", - "We can take a Python class, for instance, and use it as our schema for generating JSON. When passing in the `response_schema` parameter, use the Gemini 1.5 Pro model. Gemini 1.5 Flash does not support this." + "You can take a Python class, for instance, and use it as our schema for generating JSON. When passing in the `response_schema` parameter, use the Gemini 1.5 Pro model. Gemini 1.5 Flash does not support this." ] }, { From 97909e01832a82a43173de114e3b9e64d6e64d39 Mon Sep 17 00:00:00 2001 From: Mark Daoust Date: Tue, 28 May 2024 10:15:42 -0700 Subject: [PATCH 8/9] format --- quickstarts/JSON_mode.ipynb | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/quickstarts/JSON_mode.ipynb b/quickstarts/JSON_mode.ipynb index 8ca12a65e..693d6fa5b 100644 --- a/quickstarts/JSON_mode.ipynb +++ b/quickstarts/JSON_mode.ipynb @@ -326,7 +326,7 @@ ], "metadata": { "colab": { - "name": "JSON_Mode.ipynb", + "name": "JSON_mode.ipynb", "toc_visible": true }, "kernelspec": { From 84323cf47cb267dcda9f356a55948a1e6bce73ba Mon Sep 17 00:00:00 2001 From: Mark Daoust Date: Tue, 28 May 2024 10:59:51 -0700 Subject: [PATCH 9/9] Use the same example in both halves. 
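The patch below converges both halves of the notebook on a single `Recipe` example. Its first half only *describes* the schema in the prompt text and then parses the reply with `json.loads`. A minimal stdlib-only sketch of that flow, with no API call made — the prompt wording and the sample reply are copied from cells in this patch series, so treat them as illustrative rather than guaranteed model output:

```python
import json

# First half of the notebook: describe the schema in the prompt text itself.
prompt = """List a few popular cookie recipes using this JSON schema:

Recipe = {'recipe_name': str}
Return: list[Recipe]"""

# Stand-in for raw_response.text -- copied from the notebook's output cell.
raw_text = ('[{"recipe_name": "Chocolate Chip Cookies"}, '
            '{"recipe_name": "Oatmeal Raisin Cookies"}]')

# With response_mime_type="application/json", the reply should parse directly.
recipes = json.loads(raw_text)
assert all("recipe_name" in r for r in recipes)
print(recipes)
```

Because a prompt-described schema constrains only the output *format*, not the field names, this parse-then-validate step is still worthwhile in the first half.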
--- quickstarts/JSON_mode.ipynb | 181 +++++++++++++++++------------------- 1 file changed, 84 insertions(+), 97 deletions(-) diff --git a/quickstarts/JSON_mode.ipynb b/quickstarts/JSON_mode.ipynb index 693d6fa5b..7661c9436 100644 --- a/quickstarts/JSON_mode.ipynb +++ b/quickstarts/JSON_mode.ipynb @@ -1,5 +1,36 @@ { "cells": [ + { + "cell_type": "markdown", + "metadata": { + "id": "Tce3stUlHN0L" + }, + "source": [ + "##### Copyright 2024 Google LLC." + ] + }, + { + "cell_type": "code", + "execution_count": 1, + "metadata": { + "cellView": "form", + "id": "tuOe1ymfHZPu" + }, + "outputs": [], + "source": [ + "# @title Licensed under the Apache License, Version 2.0 (the \"License\");\n", + "# you may not use this file except in compliance with the License.\n", + "# You may obtain a copy of the License at\n", + "#\n", + "# https://www.apache.org/licenses/LICENSE-2.0\n", + "#\n", + "# Unless required by applicable law or agreed to in writing, software\n", + "# distributed under the License is distributed on an \"AS IS\" BASIS,\n", + "# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n", + "# See the License for the specific language governing permissions and\n", + "# limitations under the License." + ] + }, { "cell_type": "markdown", "metadata": { @@ -23,12 +54,12 @@ "source": [ "The Gemini API can be used to generate a JSON output if you set the schema that you would like to use.\n", "\n", - "**Note**: Use Gemini 1.5 Pro when generating JSON. JSON schemas are only supported by Gemini 1.5 Pro right now." + "**Note**: JSON schemas are only supported by Gemini 1.5 Pro right now." 
] }, { "cell_type": "code", - "execution_count": 1, + "execution_count": null, "metadata": { "id": "qLuL9m7KhvxR" }, @@ -46,7 +77,6 @@ "outputs": [], "source": [ "import google.generativeai as genai\n", - "from google.generativeai.types import content_types\n", "\n", "import json\n", "import dataclasses\n", @@ -66,7 +96,7 @@ }, { "cell_type": "code", - "execution_count": 3, + "execution_count": null, "metadata": { "id": "d6lYXRcjthKV" }, @@ -86,103 +116,96 @@ "source": [ "## Activate JSON Mode\n", "\n", - " Activate JSON mode by specifying `respose_mime_type` in the `generation_config` parameter. You'll use Gemini 1.5 Pro for this example, but note that you can use Gemini 1.5 Flash while describing your schema in the prompt." + "Activate JSON mode by specifying `response_mime_type` in the `generation_config` parameter:" ] }, { "cell_type": "code", - "execution_count": 18, + "execution_count": 3, "metadata": { "id": "i5Rod-lXRIhf" }, "outputs": [], "source": [ - "model = genai.GenerativeModel(\"gemini-1.5-pro-latest\",\n", + "model = genai.GenerativeModel(\"gemini-1.5-flash-latest\",\n", " generation_config={\"response_mime_type\": \"application/json\"})" ] }, { - "cell_type": "code", - "execution_count": 12, + "cell_type": "markdown", "metadata": { - "id": "JiIxKaLl4R0f" + "id": "4071a6143d31" }, - "outputs": [], "source": [ - "class Recipe(typing.TypedDict):\n", - " recipe_name: str" + "For this first example, just describe the schema you want back:" ] }, { "cell_type": "code", - "execution_count": 13, + "execution_count": 4, "metadata": { - "id": "VV_bRJb15uqv" + "id": "K8ezjNb0RJ6Y" }, - "outputs": [ - { - "data": { - "text/plain": [ - "{'items': {'properties': {'recipe_name': {'type': 'string'}},\n", - " 'type': 'object'},\n", - " 'type': 'array'}" - ] - }, - "execution_count": 13, - "metadata": {}, - "output_type": "execute_result" - } - ], + "outputs": [], "source": [ - "response_schema = content_types._schema_for_class(list[Recipe])\n", - "response_schema" + 
"prompt = \"\"\"List a few popular cookie recipes using this JSON schema:\n", + "\n", + "Recipe = {'recipe_name': str}\n", + "Return: list[Recipe]\"\"\"" ] }, { "cell_type": "code", - "execution_count": 29, + "execution_count": 5, "metadata": { - "id": "K8ezjNb0RJ6Y" + "id": "ggudoxK8RMlb" }, "outputs": [], "source": [ - "prompt = \"\"\"List a few popular cookie recipes using this JSON schema, be sure to return an array:\n", - "{'type': 'object', 'properties': { 'recipe_name': {'type': 'string'}}}\"\"\"" + "raw_response = model.generate_content(prompt)" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "id": "9TqoNg3VSMYB" + }, + "source": [ + "Parse the string to JSON:" ] }, { "cell_type": "code", - "execution_count": 30, + "execution_count": 6, "metadata": { - "id": "ggudoxK8RMlb" + "id": "b99ee66972f5" }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ - "[{\"recipe_name\": \"Chocolate Chip Cookies\"}, {\"recipe_name\": \"Peanut Butter Cookies\"}, {\"recipe_name\": \"Oatmeal Raisin Cookies\"}, {\"recipe_name\": \"Sugar Cookies\"}, {\"recipe_name\": \"Snickerdoodles\"}]\n", - "\n" + "[{'recipe_name': 'Chocolate Chip Cookies'}, {'recipe_name': 'Oatmeal Raisin Cookies'}, {'recipe_name': 'Snickerdoodles'}, {'recipe_name': 'Sugar Cookies'}, {'recipe_name': 'Peanut Butter Cookies'}]\n" ] } ], "source": [ - "response = model.generate_content(prompt)\n", - "print(response.text)" + "response = json.loads(raw_response.text)\n", + "print(response)" ] }, { "cell_type": "markdown", "metadata": { - "id": "9TqoNg3VSMYB" + "id": "1092c669169a" }, "source": [ - "Just for fun, parse the string to JSON, and then serialize it." 
+ "For readability, serialize and print it:" ] }, { "cell_type": "code", - "execution_count": 31, + "execution_count": 7, "metadata": { "id": "WLDPREpmSMu5" }, @@ -196,23 +219,23 @@ " \"recipe_name\": \"Chocolate Chip Cookies\"\n", " },\n", " {\n", - " \"recipe_name\": \"Peanut Butter Cookies\"\n", + " \"recipe_name\": \"Oatmeal Raisin Cookies\"\n", " },\n", " {\n", - " \"recipe_name\": \"Oatmeal Raisin Cookies\"\n", + " \"recipe_name\": \"Snickerdoodles\"\n", " },\n", " {\n", " \"recipe_name\": \"Sugar Cookies\"\n", " },\n", " {\n", - " \"recipe_name\": \"Snickerdoodles\"\n", + " \"recipe_name\": \"Peanut Butter Cookies\"\n", " }\n", "]\n" ] } ], "source": [ - "print(json.dumps(json.loads(response.text), indent=4))" + "print(json.dumps(response, indent=4))" ] }, { @@ -223,57 +246,21 @@ "source": [ "## Generate JSON from schema\n", "\n", - "You can take a Python class, for instance, and use it as our schema for generating JSON. When passing in the `response_schema` parameter, use the Gemini 1.5 Pro model. Gemini 1.5 Flash does not support this." - ] - }, - { - "cell_type": "code", - "execution_count": 5, - "metadata": { - "id": "vP6teXff_H_L" - }, - "outputs": [], - "source": [ - "class Person(typing.TypedDict):\n", - " family_name: str\n", - " favorite_food: str" - ] - }, - { - "cell_type": "code", - "execution_count": 6, - "metadata": { - "id": "p0gxz0NNr8si" - }, - "outputs": [], - "source": [ - "prompt = \"Hello, describe some people, all fields are required.\"" + "While `gemini-1.5-flash` models only accept a text description of the JSON you want back, `gemini-1.5-pro` models support \"controlled generation\" (aka \"constrained decoding\"). 
This allows you to pass a schema object (or a Python type equivalent) and the output will strictly follow that schema.\n", + "\n", + "Following the same example as the previous section, here's that recipe type:" ] }, { "cell_type": "code", "execution_count": 8, "metadata": { - "id": "SuZQTQZ8438K" + "id": "JiIxKaLl4R0f" }, - "outputs": [ - { - "data": { - "text/plain": [ - "{'items': {'properties': {'family_name': {'type': 'string'},\n", - " 'favorite_food': {'type': 'string'}},\n", - " 'type': 'object'},\n", - " 'type': 'array'}" - ] - }, - "execution_count": 8, - "metadata": {}, - "output_type": "execute_result" - } - ], + "outputs": [], "source": [ - "response_schema = content_types._schema_for_class(list[Person])\n", - "response_schema" + "class Recipe(typing.TypedDict):\n", + " recipe_name: str" ] }, { @@ -282,7 +269,7 @@ "id": "vBlWzt6M-2oM" }, "source": [ - "Using `generate_content`, we pass in the Python class `Person` defined above into the `generation_config`'s `response_schema` field." + "For this example, you want a list of `Recipe` objects, so pass `list[Recipe]` to the `response_schema` field of the `generation_config`." 
] }, { @@ -296,10 +283,10 @@ "model = genai.GenerativeModel(model_name=\"models/gemini-1.5-pro-latest\")\n", "\n", "result = model.generate_content(\n", - " prompt,\n", - " generation_config={\"response_mime_type\": \"application/json\",\n", - " \"response_schema\": response_schema,\n", - " },\n", + " \"List a few popular cookie recipes\",\n", + " generation_config=genai.GenerationConfig(\n", + " response_mime_type=\"application/json\",\n", + " response_schema = list[Recipe]),\n", " request_options={\"timeout\": 600},\n", ")" ] @@ -315,7 +302,7 @@ "name": "stdout", "output_type": "stream", "text": [ - "[{\"favorite_food\": \"pizza\"}] \n" + "[{\"recipe_name\": \"Chocolate Chip Cookies\"}, {\"recipe_name\": \"Peanut Butter Cookies\"}, {\"recipe_name\": \"Oatmeal Raisin Cookies\"}, {\"recipe_name\": \"Sugar Cookies\"}, {\"recipe_name\": \"Snickerdoodles\"}] \n" ] } ],
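After the final patch, the notebook's second half ends with `result.text` holding JSON that conforms to `list[Recipe]`. A hedged, SDK-free sketch of the post-processing the notebook applies — parse, sanity-check the items against the `TypedDict` keys, and pretty-print. The sample string is an abridged copy of the final output cell, not live model output:

```python
import json
import typing

class Recipe(typing.TypedDict):
    recipe_name: str

# Abridged copy of result.text from the notebook's final output cell.
raw_text = ('[{"recipe_name": "Chocolate Chip Cookies"}, '
            '{"recipe_name": "Peanut Butter Cookies"}, '
            '{"recipe_name": "Snickerdoodles"}]')

recipes: list[Recipe] = json.loads(raw_text)

# Each item should carry exactly the keys the schema declares.
assert all(set(r) == {"recipe_name"} for r in recipes)

print(json.dumps(recipes, indent=4))
```

When `response_schema` is enforced by the model, this check is mostly a guard against version drift; with the prompt-only approach of the first half it catches real deviations.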