
Point to gemini-1.5-flash-latest as the default model for all notebooks in the examples folder #181

Merged · 3 commits · Jun 4, 2024
Changes from 1 commit
Addressing review feedback for examples notebooks
lucianommartins committed May 31, 2024
commit 5bf97cb4e9ab60fc9f0869de807584f3a38b14a0
152 changes: 144 additions & 8 deletions examples/Agents_Function_Calling_Barista_Bot.ipynb
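The recurring edit in this diff is a model rename: the `-latest` aliases are replaced with the pinned model names. A hypothetical helper (not part of the PR) that captures the mapping, assuming Python 3.9+ for `str.removesuffix`:

```python
def update_default_model(name: str) -> str:
    """Drop the '-latest' suffix, mirroring the rename applied in this diff."""
    return name.removesuffix("-latest")

# Names without the suffix pass through unchanged.
print(update_default_model("gemini-1.5-flash-latest"))  # -> gemini-1.5-flash
print(update_default_model("gemini-1.0-pro-latest"))    # -> gemini-1.0-pro
```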
@@ -112,7 +112,7 @@
},
{
"cell_type": "code",
-"execution_count": null,
+"execution_count": 2,
"metadata": {
"id": "wMltPyUpTu3h"
},
@@ -191,11 +191,30 @@
},
{
"cell_type": "code",
-"execution_count": null,
+"execution_count": 3,
"metadata": {
"id": "jg1LjYNUWnsC"
},
-"outputs": [],
+"outputs": [
+{
+"name": "stdout",
+"output_type": "stream",
+"text": [
+"Your order:\n",
+"  Latte\n",
+"  - Extra shot\n",
+"  Tea\n",
+"  - Earl Grey, hot\n"
+]
+},
+{
+"name": "stdin",
+"output_type": "stream",
+"text": [
+"Is this correct? yes\n"
+]
+}
+],
"source": [
"# Test it out!\n",
"\n",
@@ -224,7 +243,7 @@
},
{
"cell_type": "code",
-"execution_count": null,
+"execution_count": 4,
"metadata": {
"id": "IoBvZ1JYXgn5"
},
@@ -304,7 +323,7 @@
},
{
"cell_type": "code",
-"execution_count": null,
+"execution_count": 5,
"metadata": {
"id": "8vmtzAlPaQH-"
},
@@ -315,7 +334,7 @@
"# Toggle this to switch between Gemini 1.5 with a system instruction, or Gemini 1.0 Pro.\n",
"use_sys_inst = False\n",
"\n",
-"model_name = 'gemini-1.5-flash-latest' if use_sys_inst else 'gemini-1.0-pro-latest'\n",
+"model_name = 'gemini-1.5-flash' if use_sys_inst else 'gemini-1.0-pro'\n",
"\n",
"if use_sys_inst:\n",
" model = genai.GenerativeModel(\n",
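The substantive edit in the hunk above is the model toggle. A minimal, self-contained sketch of the selection logic as it reads after this commit (the `genai.GenerativeModel` construction is omitted so the snippet runs standalone):

```python
# Toggle between Gemini 1.5 Flash (used with a system instruction) and
# Gemini 1.0 Pro, matching the line changed in this diff.
use_sys_inst = False

model_name = "gemini-1.5-flash" if use_sys_inst else "gemini-1.0-pro"
print(model_name)  # -> gemini-1.0-pro
```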
@@ -358,11 +377,128 @@
},
{
"cell_type": "code",
-"execution_count": null,
+"execution_count": 6,
"metadata": {
"id": "38SAyrNNVhvE"
},
-"outputs": [],
+"outputs": [
+{
+"name": "stdout",
+"output_type": "stream",
+"text": [
+"Welcome to Barista bot!\n",
+"\n",
+"\n"
+]
+},
+{
+"name": "stdin",
+"output_type": "stream",
+"text": [
+"> I would like a capuccino with almond milk\n"
+]
+},
+{
+"data": {
+"text/markdown": [
+"I have added a Cappuccino with Almond Milk to your order."
+],
+"text/plain": [
+"<IPython.core.display.Markdown object>"
+]
+},
+"metadata": {},
+"output_type": "display_data"
+},
+{
+"name": "stdin",
+"output_type": "stream",
+"text": [
+"> do you have stone milk?\n"
+]
+},
+{
+"data": {
+"text/markdown": [
+"I'm sorry, we do not have Stone Milk on the menu. Would you like any other type of milk with your cappuccino?"
+],
+"text/plain": [
+"<IPython.core.display.Markdown object>"
+]
+},
+"metadata": {},
+"output_type": "display_data"
+},
+{
+"name": "stdin",
+"output_type": "stream",
+"text": [
+"> no, that's all\n"
+]
+},
+{
+"name": "stdout",
+"output_type": "stream",
+"text": [
+"Your order:\n",
+"  Cappuccino\n",
+"  - Almond Milk\n"
+]
+},
+{
+"name": "stdin",
+"output_type": "stream",
+"text": [
+"Is this correct? yes\n"
+]
+},
+{
+"data": {
+"text/markdown": [
+"Ok, I will place the order now."
+],
+"text/plain": [
+"<IPython.core.display.Markdown object>"
+]
+},
+"metadata": {},
+"output_type": "display_data"
+},
+{
+"name": "stdin",
+"output_type": "stream",
+"text": [
+"> thanks\n"
+]
+},
+{
+"data": {
+"text/markdown": [
+"Your order will be ready in about 10 minutes."
+],
+"text/plain": [
+"<IPython.core.display.Markdown object>"
+]
+},
+"metadata": {},
+"output_type": "display_data"
+},
+{
+"name": "stdout",
+"output_type": "stream",
+"text": [
+"\n",
+"\n",
+"\n",
+"[barista bot session over]\n",
+"\n",
+"Your order:\n",
+"  [('Cappuccino', ['Almond Milk'])]\n",
+"\n",
+"- Thanks for using Barista Bot!\n"
+]
+}
+],
"source": [
"from IPython.display import display, Markdown\n",
"\n",