diff --git a/docs/cookbook/chain_of_density.md b/docs/cookbook/chain_of_density.md
index 2a6b4eb39..10bda9771 100644
--- a/docs/cookbook/chain_of_density.md
+++ b/docs/cookbook/chain_of_density.md
@@ -29,10 +29,10 @@ The prompt also asks the model to return a list of JSON objects that contain the
 We can now implement the prompt provided in the paper:

 ```python
-import outlines
+from outlines import Template

-@outlines.prompt
-def chain_of_density(article):
+
+chain_of_density = Template.from_string(
     """Article: {{ article }}

 You will generate increasingly concise, entity-dense summaries of the above Article.
@@ -61,6 +61,7 @@ def chain_of_density(article):
-Answer in JSON. The JSON should be a a dictionary with key "summaries" that contains a list (length 5) of dictionaries whose keys are "Missing_Entities" and "Denser_Summary".
+Answer in JSON. The JSON should be a dictionary with key "summaries" that contains a list (length 5) of dictionaries whose keys are "Missing_Entities" and "Denser_Summary".
 """
+)
 ```

 ??? Note
diff --git a/docs/cookbook/classification.md b/docs/cookbook/classification.md
index c56096318..023e4543a 100644
--- a/docs/cookbook/classification.md
+++ b/docs/cookbook/classification.md
@@ -16,8 +16,10 @@ model = outlines.models.transformers("TheBloke/Mistral-7B-OpenOrca-AWQ", device=
 We will use the following prompt template:

 ```python
-@outlines.prompt
-def customer_support(request):
+from outlines import Template
+
+
+customer_support = Template.from_string(
     """You are an experienced customer success manager.

 Given a request from a client, you need to determine when the
@@ -36,6 +38,7 @@ def customer_support(request):
 Request: {{ request }}
 Label: """
+)
 ```

 ## Choosing between multiple choices
diff --git a/docs/cookbook/dating_profiles.md b/docs/cookbook/dating_profiles.md
index d0fb9b576..1cbea556c 100644
--- a/docs/cookbook/dating_profiles.md
+++ b/docs/cookbook/dating_profiles.md
@@ -57,9 +57,9 @@ class Example:
-We will use Outlines' prompt templating abilities to generate the prompt for us. This help clearly separate the general prompting logic from what is specific to an example.
+We will use Outlines' prompt templating abilities to generate the prompt for us. This helps to clearly separate the general prompting logic from what is specific to an example.

 ```python
+from outlines import Template

-@outlines.prompt
-def dating_profile_prompt(description: str, examples: list[Example]):
+dating_profile_prompt = Template.from_string(
     """
     You are a world-renowned matchmaker who understands the modern dating
     market. Your job is to generate dating app profiles for male clients
@@ -79,6 +79,7 @@ def dating_profile_prompt(description: str, examples: list[Example]):
     Description: {{ description }}
     Profile:
     """
+)
 ```

 We will provide the model with several few-shot examples:
diff --git a/docs/cookbook/extraction.md b/docs/cookbook/extraction.md
index 28317b6b0..dfcca0cfd 100644
--- a/docs/cookbook/extraction.md
+++ b/docs/cookbook/extraction.md
@@ -15,8 +15,10 @@ model = outlines.models.transformers("TheBloke/Mistral-7B-OpenOrca-AWQ", device=
 And we will be using the following prompt template:

 ```python
-@outlines.prompt
-def take_order(order):
+from outlines import Template
+
+
+take_order = Template.from_string(
     """You are the owner of a pizza parlor. Customers \
 send you orders from which you need to extract:

@@ -42,6 +44,7 @@ def take_order(order):
 ORDER: {{ order }}
 RESULT: """
+)
 ```

 We now define our data model using Pydantic:
diff --git a/docs/cookbook/simtom.md b/docs/cookbook/simtom.md
index 4ad78846b..a80ebc24f 100644
--- a/docs/cookbook/simtom.md
+++ b/docs/cookbook/simtom.md
@@ -23,18 +23,15 @@ To implement SimToM with Outlines, we will need to:

 Let's dive into it!

-### Using Prompt Functions
-
-With Outlines, you can write your prompts as Python functions by adding the `@outlines.prompt` decorator.
-The prompt template is contained in their docstring, and their arguments correspond to variables used in the prompt.
+### Using Prompt Templates

 The authors have shared their code, prompts and data in [this GitHub repository](https://github.com/shawnsihyunlee/simulatedtom). Below, we define in Outlines the prompts they used for the ToMI dataset:

 ```python
-import outlines
+from outlines import Template

-@outlines.prompt
-def perspective_taking(story: str, character: str) -> None:
+perspective_taking = Template.from_string(
     """[INST] The following is a sequence of events about some characters, that takes place in multiple locations.
     Your job is to output only the events that the specified character, {{character}}, knows about.
@@ -45,9 +42,9 @@ def perspective_taking(story: str, character: str) -> None:
     Story: {{story}}
     What events does {{character}} know about? Only output the events according to the above rules, do not provide an explanation. [/INST]""" # noqa
+)

-@outlines.prompt
-def simulation(events: list, name: str, question: str) -> None:
+simulation = Template.from_string(
     """[INST] {% for event in events %}
     {{event}}
     {% endfor %}
@@ -55,6 +52,7 @@ def simulation(events: list, name: str, question: str) -> None:
     Based on the above information, answer the following question: {{question}}
     You must choose one of the above choices, do not say there is not enough information. Answer with a single word, do not output anything else. [/INST]""" # noqa
+)
 ```

 ### JSON Structured Generation
diff --git a/docs/quickstart.md b/docs/quickstart.md
index 81a067ad6..ec445dcba 100644
--- a/docs/quickstart.md
+++ b/docs/quickstart.md
@@ -115,10 +115,9 @@ Or use the [requests][requests]{:target="_blank"} library from another python pr
-Prompting can lead to messy code. Outlines' prompt functions are python functions that contain a template for the prompt in their docstring. We use a powerful templating language to allow you to loop over lists, dictionaries, add conditionals, etc. directly from the prompt. When called, a prompt function returns the rendered template:
+Prompting can lead to messy code. Outlines' prompt templates are Python objects built from template strings. We use a powerful templating language to allow you to loop over lists, dictionaries, add conditionals, etc. directly from the prompt. When called, a template returns the rendered prompt:

 ```python
-import outlines
+from outlines import Template

-@outlines.prompt
-def few_shots(instructions, examples, question):
+few_shots = Template.from_string(
     """{{ instructions }}

     Examples
@@ -135,6 +134,7 @@ def few_shots(instructions, examples, question):
     Q: {{ question }}
     A:
     """
+)

 instructions = "Please answer the following question following the examples"
 examples = [
@@ -175,9 +175,9 @@ Once you are done experimenting with a prompt and an output structure, it is use
     import outlines

-    @outlines.prompt
-    def tell_a_joke(topic):
+    tell_a_joke = outlines.Template.from_string(
         """Tell me a joke about {{ topic }}."""
+    )

     class Joke(BaseModel):
         setup: str
diff --git a/docs/reference/prompting.md b/docs/reference/prompting.md
index 8ea9c0243..986ca82fc 100644
--- a/docs/reference/prompting.md
+++ b/docs/reference/prompting.md
@@ -33,15 +33,13 @@ will pass to the prompt function.

 === "Code"

-    ```python
-    import outlines
+    ```python title="greetings.py"
+    from outlines import Template

-    @outlines.prompt
-    def greetings(name, question):
-        """Hello, {{ name }}!
-        {{ question }}
-        """
+    template = """Hello, {{ name }}!
+    {{ question }}"""
+    greetings = Template.from_string(template)

-    prompt = greetings("user", "How are you?")
+    prompt = greetings(name="user", question="How are you?")
     print(prompt)
     ```
@@ -58,12 +56,10 @@ If a variable is missing in the function's arguments, Jinja2 will throw an `Unde
 === "Code"

     ```python
-    import outlines
-
-    @outlines.prompt
-    def greetings(name):
-        """Hello, {{ surname }}!"""
+    from outlines import Template

+    template = """Hello, {{ surname }}!"""
+    greetings = Template.from_string(template)
-    prompt = greetings("user")
+    prompt = greetings(name="user")
     ```
@@ -72,9 +68,9 @@ If a variable is missing in the function's arguments, Jinja2 will throw an `Unde
     ```text
     Traceback (most recent call last):
       File "", line 9, in
-      File "/home/remi/projects/normal/outlines/outlines/prompts.py", line 38, in __call__
+      File "/home/remi/projects/normal/outlines/outlines/templates.py", line 38, in __call__
         return render(self.template, **bound_arguments.arguments)
-      File "/home/remi/projects/normal/outlines/outlines/prompts.py", line 213, in render
+      File "/home/remi/projects/normal/outlines/outlines/templates.py", line 213, in render
         return jinja_template.render(**values)
       File "/home/remi/micromamba/envs/outlines/lib/python3.9/site-packages/jinja2/environment.py", line 1301, in render
         self.environment.handle_exception()
@@ -84,26 +80,23 @@ If a variable is missing in the function's arguments, Jinja2 will throw an `Unde
     jinja2.exceptions.UndefinedError: 'surname' is undefined
     ```

-## Importing prompt functions
+## Importing prompts from files

-Prompt functions are functions, and thus can be imported from other modules:
+Outlines allows you to read a prompt template from a text file. This way you can build "whitespace-perfect" prompts and version them independently from your code. We have found ourselves gravitating toward this pattern since Outlines came out:

-=== "prompts.py"
-    ```python
-    import outlines
-
-    @outlines.prompt
-    def greetings(name, question):
-        """Hello, {{ name }}!
-        {{ question }}
-        """
+=== "prompt.txt"
+    ```text
+    Hello, {{ name }}!
+    {{ question }}
     ```

 === "generate.py"

     ```python
-    from .prompts import greetings
+    from outlines import Template

+    greetings = Template.from_file("prompt.txt")
-    prompt = greetings("John Doe", "How are you today?")
+    prompt = greetings(name="John Doe", question="How are you today?")
     ```
@@ -123,27 +116,26 @@ keys `question` and `answer` to the prompt function:

 === "Code"

-    ```python
-    import outlines
+    ```text title="prompt.txt"
+    {{ instructions }}

-    @outlines.prompt
-    def few_shots(instructions, examples, question):
-        """{{ instructions }}
+    Examples
+    --------

-        Examples
-        --------
+    {% for example in examples %}
+    Q: {{ example.question }}
+    A: {{ example.answer }}

-        {% for example in examples %}
-        Q: {{ example.question }}
-        A: {{ example.answer }}
+    {% endfor %}
+    Question
+    --------

-        {% endfor %}
-        Question
-        --------
+    Q: {{ question }}
+    A:
+    ```

-        Q: {{ question }}
-        A:
-        """
+    ```python title="render.py"
+    from outlines import Template

     instructions = "Please answer the following question following the examples"
     examples = [
@@ -152,6 +144,7 @@ keys `question` and `answer` to the prompt function:
     ]
     question = "4+4 = ?"

+    few_shots = Template.from_file("prompt.txt")
-    prompt = few_shots(instructions, examples, question)
+    prompt = few_shots(instructions=instructions, examples=examples, question=question)
     print(prompt)
     ```
@@ -194,7 +187,7 @@ Several projects (e.g.[Toolformer](https://arxiv.org/abs/2302.04761), [ViperGPT]
 === "Code"

     ```python
-    import outlines
+    from outlines import Template

     def my_tool(arg1: str, arg2: int):
         """Tool description.
@@ -203,16 +196,15 @@ Several projects (e.g.[Toolformer](https://arxiv.org/abs/2302.04761), [ViperGPT]
         """
         pass

-    @outlines.prompt
-    def tool_prompt(question, tool):
-        """{{ question }}
+    template = """{{ question }}

-        COMMANDS
-        1. {{ tool | name }}: {{ tool | description }}, args: {{ tool | args }}
+    COMMANDS
+    1. {{ tool | name }}: {{ tool | description }}, args: {{ tool | args }}

-        {{ tool | source }}
-        """
+    {{ tool | source }}
+    """

+    tool_prompt = Template.from_string(template)
-    prompt = tool_prompt("Can you do something?", my_tool)
+    prompt = tool_prompt(question="Can you do something?", tool=my_tool)
     print(prompt)
     ```
@@ -250,16 +242,14 @@ pretty print a dictionary from within an Outlines prompt function

     ```python
     from pydantic import BaseModel, Field
-    import outlines
+    from outlines import Template

     class MyResponse(BaseModel):
         field1: int = Field(description="an int")
         field2: str

-    @outlines.prompt
-    def my_prompt(response_model):
-        """{{ response_model | schema }}"""
+    my_prompt = Template.from_string("""{{ response_model | schema }}""")

-    prompt = my_prompt(MyResponse)
+    prompt = my_prompt(response_model=MyResponse)
     print(prompt)
     # {
@@ -285,8 +275,9 @@ pretty print a dictionary from within an Outlines prompt function

 ## Formatting conventions

-Prompt functions are opinionated when it comes to rendering, and these opinions
-are meant to avoid prompting mistakes and help with formatting.
+Prompt templates are opinionated when it comes to rendering a template read from
+a string, and these opinions are meant to avoid prompting mistakes and help with
+formatting.

 ### Whitespaces
@@ -300,18 +291,19 @@ below does not matter for formatting:

 === "Code"

     ```python
-    import outlines
+    from outlines import Template

-    @outlines.prompt
-    def prompt1():
-        """My prompt
-        """

-    @outlines.prompt
-    def prompt2():
-        """
-        My prompt
-        """
+    prompt1 = Template.from_string(
+        """My prompt
+        """
+    )
+
+    prompt2 = Template.from_string(
+        """
+        My prompt
+        """
+    )

     print(prompt1())
     print(prompt2())
@@ -329,27 +321,27 @@ Indentation is relative to the second line of the docstring, and leading spaces

 === "Code"

     ```python
-    import outlines
+    from outlines import Template
+
+    example1 = Template.from_string(
+        """First line
+        Second line
+        """
+    )

-    @outlines.prompt
-    def example1():
-        """First line
+    example2 = Template.from_string(
+        """
         Second line
-        """
+        Third line
+        """
+    )

-    @outlines.prompt
-    def example2():
-        """
-        Second line
+    example3 = Template.from_string(
+        """
+        Second line
         Third line
-        """
-
-    @outlines.prompt
-    def example3():
-        """
-        Second line
-            Third line
-        """
+        """
+    )

     print(example1())
     print(example2())
@@ -378,18 +370,18 @@ You can use the backslash `\` to break a long line of text. It will render as a

 === "Code"

     ```python
-    import outlines
-
-    @outlines.prompt
-    def example():
-        """
-        Break in \
-        several lines \
-        But respect the indentation
-            on line breaks.
-        And after everything \
-        Goes back to normal
-        """
+    from outlines import Template
+
+    example = Template.from_string(
+        """
+        Break in \
+        several lines \
+        But respect the indentation
+            on line breaks.
+        And after everything \
+        Goes back to normal
+        """
+    )

     print(example())
     ```
diff --git a/examples/babyagi.py b/examples/babyagi.py
index 0a7a0b13b..314f9dbca 100644
--- a/examples/babyagi.py
+++ b/examples/babyagi.py
@@ -9,48 +9,16 @@

 import outlines
 import outlines.models as models
+from outlines import Template
+

 model = models.openai("gpt-4o-mini")
 complete = outlines.generate.text(model)

-
-#################
-# Perform tasks #
-#################
-
-
-@outlines.prompt
-def perform_task_ppt(objective: str, task: str):
-    """You are an AI who performs one task based on the following objective: {{objective}}.
-
-    Your task: {{task.task_name}}
-
-    Response:
-    """
-
-
-#####################
-# Create a new task #
-#####################
-
-
-@outlines.prompt
-def create_tasks_ppt(
-    objective: str, previous_task: str, result: str, task_list: List[str]
-):
-    """You are an task creation AI that uses the result of an execution agent to \
-    create new tasks with the following objective: {{objective}}.
-
-    The last completed task has the result: {{result}}.
-
-    This result was based on this task description: {{previous_task}}. These are \
-    incomplete tasks: {{task_list | join(task_list)}}.
-
-    Based on the result, create new tasks to be completed by the AI system that \
-    do not overlap with incomplete tasks.
-
-    Return the tasks as an array.
-    """
+## Load the prompts
+perform_task_ppt = Template.from_file("prompts/babyagi_perform_task.txt")
+create_tasks_ppt = Template.from_file("prompts/babyagi_create_task.txt")
+prioritize_tasks_ppt = Template.from_file("prompts/babyagi_prioritize_task.txt")


 def create_tasks_fmt(result: str) -> List[str]:
@@ -65,26 +33,6 @@ def create_tasks_fmt(result: str) -> List[str]:
     return task_list


-########################
-# Prioritize new tasks #
-########################
-
-
-@outlines.prompt
-def prioritize_tasks_ppt(objective: str, task_names: List[str], next_task_id: int):
-    """You are a task prioritization AI tasked with cleaning the formatting of \
-    and reprioritizing the following tasks: {{task_names}}.
-
-    Consider the ultimate objective of your team: {{objective}}.
-
-    Do not remove any tasks. Return the result as a numbered list, like:
-    #. First task
-    #. Second task
-
-    Start the tasks list with the number {{next_task_id}}.
-    """
-
-
 def prioritize_tasks_fmt(result: str):
     new_tasks = result.split("\n")
diff --git a/examples/dating_profile.py b/examples/dating_profile.py
index 504ec943d..d26984223 100644
--- a/examples/dating_profile.py
+++ b/examples/dating_profile.py
@@ -6,7 +6,7 @@
 from pydantic import BaseModel, conlist

 import outlines
-from outlines import models
+from outlines import models, Template


 class QuestionChoice(str, Enum):
@@ -41,23 +41,6 @@ class Example:
     profile: DatingProfile


-@outlines.prompt
-def dating_profile_prompt(description: str, examples: list[Example]):
-    """
-    You are a world-renowned matchmaker who understands the modern dating market. Your job is to generate dating app profiles for male clients interested in women based on a provided description. The profiles should be authentic, show off their strengths, and maximize their likelihood of getting matches on dating apps.
-    Here are some examples of past clients that you have successfully created profiles for:
-    {% for example in examples %}
-    Description:
-    {{ example.description }}
-    Profile:
-    {{ example.profile }}
-    {% endfor %}
-    Here is the new client who you need to create a profile for:
-    Description: {{ description }}
-    Profile:
-    """
-
-
 samples: list[Example] = [
     Example(
         description="I'm an author and former professional soccer player living in Seattle who publishes popular fiction books. A typical day for me starts by hanging out with my cat, drinking a coffee, and reading as much as I can in a few hours. Then, I'll prepare a quick smoothie before starting to write for a few hours, take a break with soccer or running a few miles, and finally meet friends for dinner at a new, hip restaurant in the evening. Sometimes we go axe-throwing afterwards, or play poker, or watch a comedy show, or visit a dive bar. On my vacations, I travel extensively to countries South America, Europe, and Asia, with the goal of visiting them all!",
@@ -120,6 +103,7 @@ def dating_profile_prompt(description: str, examples: list[Example]):

 new_description = "I'm a laid-back lawyer who spends a lot of his free-time gaming. I work in a corporate office, but ended up here after the start-up I cofounded got acquired, so still play ping pong with my cool coworkers every day. I have a bar at home where I make cocktails, which is great for entertaining friends. I secretly like to wear suits and get a new one tailored every few months. I also like weddings because I get to wear those suits, and it's a good excuse for a date. I watch the latest series because I'm paying, with my hard-earned money, for every streaming service."

+dating_profile_prompt = Template.from_file("prompts/dating_profile.txt")
 prompt = dating_profile_prompt(description=new_description, examples=samples)
 profile = outlines.generate.json(model, DatingProfile)(prompt)  # type: ignore
 print(profile)
diff --git a/examples/math_generate_code.py b/examples/math_generate_code.py
index 7eb1651a7..6d2ba8ae6 100644
--- a/examples/math_generate_code.py
+++ b/examples/math_generate_code.py
@@ -1,6 +1,7 @@
 """Example from https://dust.tt/spolu/a/d12ac33169"""
 import outlines
 import outlines.models as models
+from outlines import Template

 examples = [
     {"question": "What is 37593 * 67?", "code": "37593 * 67"},
@@ -16,9 +17,7 @@
-question = "Carla is downloading a 200 GB file. She can download 2 GB/minute, but 40% of the way through the download, the download fails. Then Carla has to restart the download from the beginning. How load did it take her to download the file in minutes?"
+question = "Carla is downloading a 200 GB file. She can download 2 GB/minute, but 40% of the way through the download, the download fails. Then Carla has to restart the download from the beginning. How long did it take her to download the file in minutes?"

-
-@outlines.prompt
-def answer_with_code_prompt(question, examples):
+answer_with_code_prompt = Template.from_string(
     """
 {% for example in examples %}
 QUESTION: {{example.question}}
 CODE: {{example.code}}

@@ -27,6 +26,7 @@ def answer_with_code_prompt(question, examples):
 {% endfor %}
 QUESTION: {{question}}
 CODE:"""
+)


 def execute_code(code):
diff --git a/examples/meta_prompting.py b/examples/meta_prompting.py
index cba18b5fe..85f512bf6 100644
--- a/examples/meta_prompting.py
+++ b/examples/meta_prompting.py
@@ -13,14 +13,16 @@

 import outlines
 import outlines.models as models
+from outlines import Template


 def split_into_steps(question, model_name: str):
-    @outlines.prompt
-    def solve(question):
+
+    solve = Template.from_string(
         """{{question}}
         Rephrase : : as a true or false statement, identify an Object, relationship and subject
         """
+    )

     model = models.openai(model_name)
     generator = outlines.generate.text(model)
@@ -38,16 +40,17 @@ def split_into_steps(question, model_name: str):


 def fill_in_the_blanks(question, model_name: str):
-    @outlines.prompt
-    def determine_goal(question):
+
+    determine_goal = Template.from_string(
         """{{question}}

         In order to solve this problem, we will analyze each of the options and determine
         """
+    )

-    @outlines.prompt
-    def solve(memory):
+    solve = Template.from_string(
         """{{memory}}. Let's begin.
         """
+    )

     model = models.openai(model_name)
     generator = outlines.generate.text(model)
@@ -62,8 +65,8 @@ def solve(memory):


 def ask_an_expert(question, model_name: str):
-    @outlines.prompt
-    def find_expert(question):
+
+    find_expert = Template.from_string(
         """
         {{question}}
         I entered my question into the Expert Generator \
@@ -79,15 +82,16 @@ def find_expert(question):
         found the most qualified expert. The name displayed \
         on the screen: "
         """
+    )

-    @outlines.prompt
-    def get_answer(question, expert, memory):
+    get_answer = Template.from_string(
         """
         {{memory}}".
         I am ready to ask my question.
         "{{expert}}" I say,
         {{question}}
         """
+    )

     model = models.openai(model_name)
     generator = outlines.generate.text(model)
@@ -102,20 +106,20 @@ def get_answer(question, expert, memory):


 def ask_an_expert_simple(question, model_name: str):
-    @outlines.prompt
-    def find_expert(question):
+    find_expert = Template.from_string(
         """
         Q: {{question}}
         A: A good person to answer this question would be
         """
+    )

-    @outlines.prompt
-    def get_answer(expert, memory):
+    get_answer = Template.from_string(
         """
         {{memory}}.
         For instance, {{expert}} would answer
         """
+    )

     model = models.openai(model_name)
     generator = outlines.generate.text(model)
diff --git a/examples/pick_odd_one_out.py b/examples/pick_odd_one_out.py
index 6cd9f1daf..834c40824 100644
--- a/examples/pick_odd_one_out.py
+++ b/examples/pick_odd_one_out.py
@@ -13,8 +13,7 @@
 import outlines.models as models


-@outlines.prompt
-def build_ooo_prompt(options):
+build_ooo_prompt = outlines.Template.from_string(
     """
 Pick the odd word out: skirt, dress, pen, jacket.
 skirt is clothing, dress is clothing, pen is an object, jacket is clothing.
 So the odd one is pen.
@@ -27,7 +26,7 @@
 Pick the odd word out: {{ options | join(", ") }}.
""" - +) options = ["sea", "mountains", "plains", "sock"] diff --git a/examples/react.py b/examples/react.py index 34b3c6eb2..9b876eb49 100644 --- a/examples/react.py +++ b/examples/react.py @@ -13,29 +13,30 @@ import requests # type: ignore import outlines +from outlines import Template import outlines.generate as generate import outlines.models as models -@outlines.prompt -def build_reAct_prompt(question): - """What is the elevation range for the area that the eastern sector of the Colorado orogeny extends into? - Tho 1: I need to search Colorado orogeny, find the area that the eastern sector of the Colorado ... - Act 2: Search 'Colorado orogeny' - Obs 2: The Colorado orogeny was an episode of mountain building (an orogeny) ... - Tho 3: It does not mention the eastern sector. So I need to look up eastern sector. - ... - Tho 4: High Plains rise in elevation from around 1,800 to 7,000 ft, so the answer is 1,800 to 7,000 ft. - Act 5: Finish '1,800 to 7,000 ft' - {{ question }} - """ - - -@outlines.prompt -def add_mode(i, mode, result, prompt): - """{{ prompt }} - {{ mode }} {{ i }}: {{ result }} - """ +build_reAct_prompt = Template.from_string( +"""What is the elevation range for the area that the eastern sector of the Colorado orogeny extends into? +Tho 1: I need to search Colorado orogeny, find the area that the eastern sector of the Colorado ... +Act 2: Search 'Colorado orogeny' +Obs 2: The Colorado orogeny was an episode of mountain building (an orogeny) ... +Tho 3: It does not mention the eastern sector. So I need to look up eastern sector. +... +Tho 4: High Plains rise in elevation from around 1,800 to 7,000 ft, so the answer is 1,800 to 7,000 ft. +Act 5: Finish '1,800 to 7,000 ft' +{{ question }} +""" +) + + +add_mode = Template.from_string( +"""{{ prompt }} +{{ mode }} {{ i }}: {{ result }} +""" +) def search_wikipedia(query: str): diff --git a/examples/self_consistency.py b/examples/self_consistency.py index f1bbe2a18..50f12acf1 100644 --- a/examples/self_consistency.py +++ b/examples/self_consistency.py @@ -4,6 +4,7 @@ import outlines import outlines.models as models +from outlines import Template examples = [ { @@ -43,8 +44,7 @@ question = "When I was 6 my sister was half my age. Now I’m 70 how old is my sister?" -@outlines.prompt -def few_shots(question, examples): +few_shots = Template.from_string( """ {% for example in examples %} Q: {{ example.question }} @@ -53,7 +53,7 @@ def few_shots(question, examples): Q: {{ question }} A: """ - +) model = models.openai("gpt-4o-mini") generator = outlines.generate.text(model)