context-extractor

Extract relevant context from a codebase using a type-directed approach.

Installation

npm

Recommended.

npm i @jpoly1219/context-extractor

Manual

Not recommended. If the steps above do not work, please open a GitHub issue.

Install the following dependencies:

npm install -g typescript-language-server typescript tsc

Clone the ts-lsp-client repo:

git clone https://github.com/jpoly1219/ts-lsp-client

... and run these commands:

cd ts-lsp-client
npm install
npm run build

Clone this repo.

git clone https://github.com/jpoly1219/context-extractor.git
cd context-extractor
npm install

For OCaml support, you first need to go through the standard OCaml setup.

Once that is done, you should be able to create a local switch in this repo.

opam switch create ./
eval $(opam env)

After you activate the local switch, install the following dependencies:

opam install dune ocaml-lsp-server ounit2

We provide five OCaml examples in the targets/ocaml directory. cd into each of them and run the following:

dune build

Ignore the wildcard build errors; the command is only meant to set up the modules and imports.

Almost there! Create a config.json file by following the config.json section below.

Finally, build and run.

npm run build
node dist/runner.js

How it works

context-extractor takes several steps to extract relevant types and headers:

  1. Determine the type of the hole.
  2. Extract relevant types.
  3. Extract relevant headers.
  4. Optionally complete the hole with an LLM.
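
For intuition, here is a hypothetical TypeScript sketch file containing a typed hole; the Model and Action types are invented for illustration and are not part of this repo.

// sketch.ts: a sketch file containing a typed hole.
type Model = { count: number };
type Action = { kind: "increment" } | { kind: "decrement" };

// Step 1 determines that the hole must have type (m: Model, a: Action) => Model.
// Step 2 extracts the relevant type declarations (here, Model and Action).
// Step 3 extracts relevant headers, i.e. declarations elsewhere in the repo whose
// types could help fill the hole.
const update: (m: Model, a: Action) => Model = _();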

This library exposes two methods, extractContext and completeWithLLM, with the following definitions:

const extractContext = async (
  language: Language,
  sketchPath: string,
  repoPath: string,
): Promise<Context | null>;

const completeWithLLM = async (
  ctx: Context,
  language: Language,
  sketchPath: string,
  configPath: string
): Promise<string>;

enum Language {
  TypeScript,
  OCaml
}

interface Context {
  holeType: string,
  relevantTypes: Map<Filepath, RelevantType[]>,
  relevantHeaders: Map<Filepath, RelevantHeader[]>
}

type Filepath = string;
type RelevantType = string;
type RelevantHeader = string;
  • sketchPath is the full path to your sketch file with the typed hole construct (_() for TypeScript, _ for OCaml). This is NOT prefixed with file://.
  • repoPath is the full path to your repository root.
  • configPath is the full path to your config.json.
  • A null return value from extractContext indicates that something went wrong internally.
  • ctx is the result from extractContext.
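
A minimal usage sketch is shown below, assuming the package exports the two methods and the Language enum declared above; all file paths are placeholders.

import { extractContext, completeWithLLM, Language } from "@jpoly1219/context-extractor";

const run = async () => {
  // Extract the hole type, relevant types, and relevant headers from the repo.
  const ctx = await extractContext(
    Language.TypeScript,
    "/path/to/repo/sketch.ts", // sketch file containing the typed hole _()
    "/path/to/repo"            // repository root
  );
  if (ctx === null) {
    throw new Error("context extraction failed");
  }

  // Optionally ask the LLM to fill the hole using the extracted context.
  const completion = await completeWithLLM(
    ctx,
    Language.TypeScript,
    "/path/to/repo/sketch.ts",
    "/path/to/repo/config.json"
  );
  console.log(completion);
};

run();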

config.json

The extractor calls OpenAI for code completion. For this, you need a config.json file that holds your OpenAI parameters.

The JSON has the following format:

{
  "apiBase": "<your-api-base-here>",
  "deployment": "<your-deployment-here>",
  "gptModel": "<your-gpt-model-here>",
  "apiVersion": "<your-api-version-here>",
  "apiKey": "<your-api-key-here>",
  "temperature": 0.6
}
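
For example, a config for an Azure OpenAI-style deployment (the client construction shown below follows Azure conventions) might look like the following; every value is a placeholder and should be replaced with your own.

{
  "apiBase": "https://my-resource.openai.azure.com",
  "deployment": "my-gpt-4o-deployment",
  "gptModel": "gpt-4o",
  "apiVersion": "2024-02-01",
  "apiKey": "<replace-with-your-key>",
  "temperature": 0.6
}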

Internally, this is how the credentials are used to create a new OpenAI client:

const openai = new OpenAI({
  apiKey,
  baseURL: `${apiBase}/openai/deployments/${deployment}`,
  defaultQuery: { "api-version": apiVersion },
  defaultHeaders: { "api-key": apiKey }
})

Trying out the VSCode extension

We have a Visual Studio Code extension that provides a frontend to this project.

The extension is not publicly available; contact me at [email protected] to request a .vsix package.