[NOID] Add OpenAI (LLM) procedures in 4.4 (#3701)
* [NOID] Add OpenAI (LLM) procedures (#3575) (#3582)

* Add OpenAI (LLM) procedures (#3575)

* WIP

* Add completion API

* add chatCompletion

* prettification

* WIP

* prettify

* Refactoring, todo docs & tests

* Added Tests & Docs for OpenAI procs (WIP)

* Update openai.adoc

From @tomasonjo

---------

Co-authored-by: Tomaz Bratanic <[email protected]>

* Removed unused deps after OpenAI procedures addition (#3585)

* Updated extended.txt to fix checkCoreWithExtraDependenciesJars failure after Open AI procedures

---------

Co-authored-by: Michael Hunger <[email protected]>
Co-authored-by: Tomaz Bratanic <[email protected]>

* [NOID] First LLM prompt for cypher, query and schema in APOC (#3649)

* First LLM prompt for cypher, query and schema in APOC

* Added integration tests

* enable open.ai key management globally

* Added tests

* added docs

* added configuration map to procs

---------

Co-authored-by: Andrea Santurbano <[email protected]>

---------

Co-authored-by: Michael Hunger <[email protected]>
Co-authored-by: Tomaz Bratanic <[email protected]>
Co-authored-by: Andrea Santurbano <[email protected]>
4 people authored Aug 3, 2023
1 parent c9cca36 commit 31e7183
Showing 13 changed files with 1,056 additions and 0 deletions.
1 change: 1 addition & 0 deletions core/src/main/java/apoc/ApocConfig.java
@@ -87,6 +87,7 @@ public class ApocConfig extends LifecycleAdapter {
public static final String APOC_UUID_ENABLED = "apoc.uuid.enabled";
public static final String APOC_UUID_ENABLED_DB = "apoc.uuid.enabled.%s";
public static final String APOC_UUID_FORMAT = "apoc.uuid.format";
public static final String APOC_OPENAI_KEY = "apoc.openai.key";
public enum UuidFormatType { hex, base64 }
public static final String APOC_JSON_ZIP_URL = "apoc.json.zip.url"; // TODO: check if really needed
public static final String APOC_JSON_SIMPLE_JSON_URL = "apoc.json.simpleJson.url"; // TODO: check if really needed
3 changes: 3 additions & 0 deletions docs/asciidoc/modules/ROOT/nav.adoc
@@ -130,6 +130,9 @@ include::partial$generated-documentation/nav.adoc[]
** xref:nlp/aws.adoc[]
** xref:nlp/azure.adoc[]
* xref:ml/index.adoc[]
** xref:ml/openai.adoc[]
* xref:background-operations/index.adoc[]
** xref::background-operations/periodic-background.adoc[]
** xref::background-operations/triggers.adoc[]
10 changes: 10 additions & 0 deletions docs/asciidoc/modules/ROOT/pages/ml/index.adoc
@@ -0,0 +1,10 @@
[[ml]]
= Machine Learning (ML)
:description: This chapter describes procedures that can be used for adding Machine Learning (ML) functionality to graph applications.

The procedures described in this chapter act as wrappers around cloud-based Machine Learning APIs.
These procedures generate embeddings, analyze text, complete text, complete chat conversations, and more.

This section includes:

* xref::ml/openai.adoc[]
333 changes: 333 additions & 0 deletions docs/asciidoc/modules/ROOT/pages/ml/openai.adoc
@@ -0,0 +1,333 @@
[[openai-api]]
= OpenAI API Access
:description: This section describes procedures that can be used to access the OpenAI API.

NOTE: You need to acquire an https://platform.openai.com/account/api-keys[OpenAI API key^] to use these procedures. Using them will incur costs on your OpenAI account. You can set the API key globally by defining the `apoc.openai.key` configuration setting in `apoc.conf`.
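
A minimal sketch of setting the key globally in `apoc.conf` (the key value below is a placeholder, not a real key):

.apoc.conf (sketch)
[source,properties]
----
apoc.openai.key=sk-...
----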

== Generate Embeddings API

The `apoc.ml.openai.embedding` procedure takes a list of text strings and returns one row per string, with the embedding data as a 1536-element vector.
It uses the `/embeddings/create` API which is https://platform.openai.com/docs/api-reference/embeddings/create[documented here^].

Additional configuration is passed to the API; the default model used is `text-embedding-ada-002`.

.Generate Embeddings Call
[source,cypher]
----
CALL apoc.ml.openai.embedding(['Some Text'], $apiKey, {}) yield index, text, embedding;
----

.Generate Embeddings Response
[%autowidth, opts=header]
|===
|index | text | embedding
|0 | "Some Text" | [-0.0065358975, -7.9563365E-4, .... -0.010693862, -0.005087272]
|===

.Parameters
[%autowidth, opts=header]
|===
|name | description
| texts | List of text strings
| apiKey | OpenAI API key
| configuration | optional map for entries like model and other request parameters
|===


.Results
[%autowidth, opts=header]
|===
|name | description
| index | index entry in original list
| text | line of text from original list
| embedding | 1536-element floating-point embedding vector for the ada-002 model
|===
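
As a further sketch (not part of the original documentation), the `index` column can be used to map each embedding back to its source text; the `Movie` label, `tagline` property, and `embedding` property below are assumptions for illustration only:

.Store Embeddings on Nodes (sketch)
[source,cypher]
----
// Hedged sketch: Movie, tagline, and embedding are assumed names, not from the original docs.
MATCH (m:Movie) WHERE m.tagline IS NOT NULL
WITH collect(m) AS movies
CALL apoc.ml.openai.embedding([m IN movies | m.tagline], $apiKey, {}) YIELD index, embedding
WITH movies[index] AS movie, embedding
SET movie.embedding = embedding;
----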

== Text Completion API

The `apoc.ml.openai.completion` procedure continues/completes a given text prompt.

It uses the `/completions/create` API which is https://platform.openai.com/docs/api-reference/completions/create[documented here^].

Additional configuration is passed to the API; the default model used is `text-davinci-003`.

.Text Completion Call
[source,cypher]
----
CALL apoc.ml.openai.completion('What color is the sky? Answer in one word: ', $apiKey, {config}) yield value;
----

.Text Completion Response
----
{ created=1684248202, model="text-davinci-003", id="cmpl-7GqBWwX49yMJljdmnLkWxYettZoOy",
usage={completion_tokens=2, prompt_tokens=12, total_tokens=14},
choices=[{finish_reason="stop", index=0, text="Blue", logprobs=null}], object="text_completion"}
----

.Parameters
[%autowidth, opts=header]
|===
|name | description
| prompt | Text to complete
| apiKey | OpenAI API key
| configuration | optional map for entries like model, temperature, and other request parameters
|===

.Results
[%autowidth, opts=header]
|===
|name | description
| value | result entry from OpenAI (containing created, id, model, object, usage(tokens), choices(text, index, finish_reason, logprobs))
|===
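
As a further sketch (not part of the original documentation), the optional configuration map can set request parameters such as `temperature`, and the completed text can be extracted from the returned map:

.Text Completion with Configuration (sketch)
[source,cypher]
----
// Hedged sketch: model and temperature are passed via the configuration map documented above.
CALL apoc.ml.openai.completion('What color is the sky? Answer in one word: ', $apiKey,
  {model: 'text-davinci-003', temperature: 0}) YIELD value
RETURN value.choices[0].text AS completion;
----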

== Chat Completion API

The `apoc.ml.openai.chat` procedure takes a list of maps representing a chat exchange between assistant and user (with an optional system message) and returns the next message in the flow.

It uses the `/chat/completions` API which is https://platform.openai.com/docs/api-reference/chat/create[documented here^].

Additional configuration is passed to the API; the default model used is `gpt-3.5-turbo`.

.Chat Completion Call
[source,cypher]
----
CALL apoc.ml.openai.chat([
{role:"system", content:"Only answer with a single word"},
{role:"user", content:"What planet do humans live on?"}
], $apiKey) yield value
----

.Chat Completion Response
----
{created=1684248203, id="chatcmpl-7GqBXZr94avd4fluYDi2fWEz7DIHL",
object="chat.completion", model="gpt-3.5-turbo-0301",
usage={completion_tokens=2, prompt_tokens=26, total_tokens=28},
choices=[{finish_reason="stop", index=0, message={role="assistant", content="Earth."}}]}
----

.Parameters
[%autowidth, opts=header]
|===
|name | description
| messages | List of maps of instructions with `{role:"assistant|user|system", content:"text"}`
| apiKey | OpenAI API key
| configuration | optional map for entries like model, temperature, and other request parameters
|===

.Results
[%autowidth, opts=header]
|===
|name | description
| value | result entry from OpenAI (containing created, id, model, object, usage(tokens), choices(message, index, finish_reason))
|===
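
As a further sketch (not part of the original documentation), the assistant's reply can be extracted directly from the returned map:

.Extract the Assistant Message (sketch)
[source,cypher]
----
// Hedged sketch: reuses the documented call and drills into the response map shown above.
CALL apoc.ml.openai.chat([
{role:"system", content:"Only answer with a single word"},
{role:"user", content:"What planet do humans live on?"}
], $apiKey) YIELD value
RETURN value.choices[0].message.content AS answer;
----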


== Query with natural language

The `apoc.ml.query` procedure takes a question in natural language, translates it into a Cypher query, and returns the results of running that query.

It uses the `chat/completions` API which is https://platform.openai.com/docs/api-reference/chat/create[documented here^].

.Query call
[source,cypher]
----
CALL apoc.ml.query("What movies did Tom Hanks play in?") yield value, query
RETURN *
----

.Example response
[source, bash]
----
+------------------------------------------------------------------------------------------------------------------------------+
| value | query |
+------------------------------------------------------------------------------------------------------------------------------+
| {m.title -> "You've Got Mail"} | "cypher
MATCH (m:Movie)<-[:ACTED_IN]-(p:Person {name: 'Tom Hanks'})
RETURN m.title
" |
| {m.title -> "Apollo 13"} | "cypher
MATCH (m:Movie)<-[:ACTED_IN]-(p:Person {name: 'Tom Hanks'})
RETURN m.title
" |
| {m.title -> "Joe Versus the Volcano"} | "cypher
MATCH (m:Movie)<-[:ACTED_IN]-(p:Person {name: 'Tom Hanks'})
RETURN m.title
" |
| {m.title -> "That Thing You Do"} | "cypher
MATCH (m:Movie)<-[:ACTED_IN]-(p:Person {name: 'Tom Hanks'})
RETURN m.title
" |
| {m.title -> "Cloud Atlas"} | "cypher
MATCH (m:Movie)<-[:ACTED_IN]-(p:Person {name: 'Tom Hanks'})
RETURN m.title
" |
| {m.title -> "The Da Vinci Code"} | "cypher
MATCH (m:Movie)<-[:ACTED_IN]-(p:Person {name: 'Tom Hanks'})
RETURN m.title
" |
| {m.title -> "Sleepless in Seattle"} | "cypher
MATCH (m:Movie)<-[:ACTED_IN]-(p:Person {name: 'Tom Hanks'})
RETURN m.title
" |
| {m.title -> "A League of Their Own"} | "cypher
MATCH (m:Movie)<-[:ACTED_IN]-(p:Person {name: 'Tom Hanks'})
RETURN m.title
" |
| {m.title -> "The Green Mile"} | "cypher
MATCH (m:Movie)<-[:ACTED_IN]-(p:Person {name: 'Tom Hanks'})
RETURN m.title
" |
| {m.title -> "Charlie Wilson's War"} | "cypher
MATCH (m:Movie)<-[:ACTED_IN]-(p:Person {name: 'Tom Hanks'})
RETURN m.title
" |
| {m.title -> "Cast Away"} | "cypher
MATCH (m:Movie)<-[:ACTED_IN]-(p:Person {name: 'Tom Hanks'})
RETURN m.title
" |
| {m.title -> "The Polar Express"} | "cypher
MATCH (m:Movie)<-[:ACTED_IN]-(p:Person {name: 'Tom Hanks'})
RETURN m.title
" |
+------------------------------------------------------------------------------------------------------------------------------+
12 rows
----

.Input Parameters
[%autowidth, opts=header]
|===
| name | description
| question | The question in natural language
| conf | An optional configuration map; see the configuration map below
|===

.Configuration map
[%autowidth, opts=header]
|===
| name | description | mandatory
| retries | The number of retries in case of API call failures | no, default `3`
| apiKey | OpenAI API key | only if `apoc.openai.key` is not defined
| model | The OpenAI model | no, default `gpt-3.5-turbo`
| sample | The number of nodes to skip, e.g. a sample of 1000 will read every 1000th node. It is passed as a parameter to the `apoc.meta.data` procedure that computes the schema | no, default is a random number
|===

.Results
[%autowidth, opts=header]
|===
| name | description
| value | the result of the query
| query | the Cypher query used to compute the result
|===
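
As a further sketch (not part of the original documentation), the configuration map can carry the API key and model when `apoc.openai.key` is not set in `apoc.conf`:

.Query Call with Configuration (sketch)
[source,cypher]
----
// Hedged sketch: apiKey, model, and retries are the configuration entries documented above.
CALL apoc.ml.query("What movies did Tom Hanks play in?",
  {apiKey: $apiKey, model: "gpt-3.5-turbo", retries: 3}) YIELD value, query
RETURN query, value;
----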


== Describe the graph model with natural language

The `apoc.ml.schema` procedure returns a description, in natural language, of the underlying dataset.

It uses the `chat/completions` API which is https://platform.openai.com/docs/api-reference/chat/create[documented here^].

.Schema call
[source,cypher]
----
CALL apoc.ml.schema() yield value
RETURN *
----

.Example response
[source, bash]
----
+---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| value |
+---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| "The graph database schema represents a system where users can follow other users and review movies. Users (:Person) can either follow other users (:Person) or review movies (:Movie). The relationships allow users to express their preferences and opinions about movies. This schema can be compared to social media platforms where users can follow each other and leave reviews or ratings for movies they have watched. It can also be related to movie recommendation systems where user preferences and reviews play a crucial role in generating personalized recommendations." |
+---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
1 row
----

.Input Parameters
[%autowidth, opts=header]
|===
| name | description
| conf | An optional configuration map; see the configuration map below
|===

.Configuration map
[%autowidth, opts=header]
|===
| name | description | mandatory
| apiKey | OpenAI API key | only if `apoc.openai.key` is not defined
| model | The OpenAI model | no, default `gpt-3.5-turbo`
| sample | The number of nodes to skip, e.g. a sample of 1000 will read every 1000th node. It is passed as a parameter to the `apoc.meta.data` procedure that computes the schema | no, default is a random number
|===

.Results
[%autowidth, opts=header]
|===
| name | description
| value | the description of the dataset
|===
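
As a further sketch (not part of the original documentation), the configuration map can carry the API key and a fixed sample size:

.Schema Call with Configuration (sketch)
[source,cypher]
----
// Hedged sketch: apiKey and sample are the configuration entries documented above.
CALL apoc.ml.schema({apiKey: $apiKey, sample: 1000}) YIELD value
RETURN value;
----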


== Create Cypher queries from a natural language question

The `apoc.ml.cypher` procedure takes a natural language question and transforms it into the requested number of Cypher queries.

It uses the `chat/completions` API which is https://platform.openai.com/docs/api-reference/chat/create[documented here^].

.Query call
[source,cypher]
----
CALL apoc.ml.cypher("Who are the actors which also directed a movie?", 4) yield cypher
RETURN *
----

.Example response
[source, bash]
----
+----------------------------------------------------------------------------------------------------------------+
| query |
+----------------------------------------------------------------------------------------------------------------+
| "
MATCH (a:Person)-[:ACTED_IN]->(m:Movie)<-[:DIRECTED]-(d:Person)
RETURN a.name as actor, d.name as director
" |
| "cypher
MATCH (a:Person)-[:ACTED_IN]->(m:Movie)<-[:DIRECTED]-(a)
RETURN a.name
" |
| "
MATCH (a:Person)-[:ACTED_IN]->(m:Movie)<-[:DIRECTED]-(d:Person)
RETURN a.name
" |
| "cypher
MATCH (a:Person)-[:ACTED_IN]->(:Movie)<-[:DIRECTED]-(a)
RETURN DISTINCT a.name
" |
+----------------------------------------------------------------------------------------------------------------+
4 rows
----

.Input Parameters
[%autowidth, opts=header]
|===
| name | description | mandatory
| question | The question in natural language | yes
| conf | An optional configuration map; see the configuration map below | no
|===

.Configuration map
[%autowidth, opts=header]
|===
| name | description | mandatory
| count | The number of queries to retrieve | no, default `1`
| apiKey | OpenAI API key | only if `apoc.openai.key` is not defined
| model | The OpenAI model | no, default `gpt-3.5-turbo`
| sample | The number of nodes to skip, e.g. a sample of 1000 will read every 1000th node. It is passed as a parameter to the `apoc.meta.data` procedure that computes the schema | no, default is a random number
|===

.Results
[%autowidth, opts=header]
|===
| name | description
| cypher | the generated Cypher query
|===
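
As a further sketch (not part of the original documentation), the documented two-argument form can be used to request a single candidate query:

.Single Query Generation (sketch)
[source,cypher]
----
// Hedged sketch: reuses the documented (question, count) call with count = 1.
CALL apoc.ml.cypher("Who are the actors which also directed a movie?", 1) YIELD cypher
RETURN cypher;
----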