vector-embeddings-english-dictionary

Vector embeddings for each word in the word list.

The embeddings are generated with OpenAI's text-embedding-ada-002 model.
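
As a minimal sketch of how such an embedding can be generated (assuming the official openai Node.js SDK, v4+, and an OPENAI_API_KEY in the environment):

import OpenAI from 'openai';

const openai = new OpenAI(); // picks up OPENAI_API_KEY from the environment

// Request an ada-002 embedding for a single word.
const response = await openai.embeddings.create({
  model: 'text-embedding-ada-002',
  input: 'jungle',
});

const embedding = response.data[0].embedding; // array of 1536 numbers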

lookup.json

lookup.json is a simple mapping from word to filename and index, which is useful if you want to compare an embedding against specific words.
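
As a rough sketch of how it can be used (the { file, index } entry shape and the file paths below are assumptions, not guaranteed by the repo):

import { readFile } from 'fs/promises';

// Assumed entry shape: lookup[word] = { file: '<embeddings file name>', index: <position in that file> }.
const lookup = JSON.parse(await readFile('lookup.json', 'utf8'));

const entry = lookup['jungle'];
// May need an 'embeddings/' prefix depending on how the file name is stored.
const records = JSON.parse(await readFile(entry.file, 'utf8'));
const { embedding } = records[entry.index];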

Comparing embeddings

You can use a vector database such as Chroma to handle search.

I would not recommend vectra for an embeddings list this large, since it keeps everything in memory at once.

To compare two embeddings you can calculate their cosine similarity, e.g. with the compute-cosine-similarity library.
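
As a minimal sketch in plain JavaScript (no library), cosine similarity is just the dot product of the two vectors divided by the product of their magnitudes:

// Cosine similarity: dot(a, b) / (|a| * |b|); values near 1 mean the embeddings are similar.
function cosineSimilarity(a, b) {
  let dot = 0;
  let normA = 0;
  let normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}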

For more efficient comparisons you can use approximate nearest neighbour libraries such as these (see the sketch after this list):

hnswlib-node

hnswlib-wasm
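
For instance, with hnswlib-node (a sketch only; the words array and query embedding are assumed to have already been loaded from the /embeddings files, and the parameters are illustrative):

import { HierarchicalNSW } from 'hnswlib-node';

const dimensions = 1536; // text-embedding-ada-002 embeddings have 1536 dimensions
const index = new HierarchicalNSW('cosine', dimensions);
index.initIndex(words.length); // words: [{ word, embedding }, ...]

words.forEach(({ embedding }, i) => index.addPoint(embedding, i));

// Find the 10 nearest neighbours of a query embedding.
const { neighbors } = index.searchKnn(queryEmbedding, 10);
const nearestWords = neighbors.map((i) => words[i].word);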

Visualisation

For visualising parts of the space you can use t-SNE or related dimensionality-reduction algorithms, e.g. t-sne js.

Interpolation

You can interpolate between embeddings to find words that sit between other words, and you can average groups of embeddings to try to isolate shared features.
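
A minimal sketch of both ideas in plain JavaScript (finding the nearest word to a result would then use lookup.json and the cosine similarity function above):

// Linear interpolation between two embeddings: t = 0 gives a, t = 1 gives b.
function lerp(a, b, t) {
  return a.map((value, i) => value + (b[i] - value) * t);
}

// Element-wise mean of several embeddings, e.g. to average out a shared feature.
function mean(vectors) {
  const sum = new Array(vectors[0].length).fill(0);
  for (const v of vectors) {
    for (let i = 0; i < v.length; i++) sum[i] += v[i];
  }
  return sum.map((total) => total / vectors.length);
}

// e.g. a point halfway between 'hot' and 'cold', then search for its nearest word:
// const midpoint = lerp(hotEmbedding, coldEmbedding, 0.5);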

Reverse embeddings

There are also Python implementations for reversing embeddings back into text:

vec2text

Contra

Data structure

There is a series of embeddings files in /embeddings; they are in order and each contains an array of objects in the following format:

[
  { word: '', embedding: [] },
]

Additional metadata

wordpos can be used to attach metadata such as part of speech (noun, verb, adjective, adverb), synonyms, definitions, etc.

The /metadata folder contains the noun, verb, adjective and adverb lists plus the word lookup. Not all words are known to wordpos.

I think some interesting things can be done by filtering words based on their properties.
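For example (a sketch assuming the wordpos npm package; the word list is illustrative), filtering a set of words down to nouns could look like this:

import WordPOS from 'wordpos';

const wordpos = new WordPOS();
const words = ['jungle', 'run', 'quickly', 'green'];

// Keep only the words wordpos recognises as nouns.
const nouns = [];
for (const word of words) {
  if (await wordpos.isNoun(word)) nouns.push(word);
}

// Definitions and synonyms are also available, e.g.:
const definitions = await wordpos.lookup('jungle');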

Classification

k-nearest neighbours (kNN) can be used for classification.
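
A minimal sketch (plain JavaScript, reusing the cosineSimilarity function above; the labelled examples of { embedding, label } are assumed to come from your own data):

// Classify a query embedding by majority vote over its k most similar labelled examples.
function knnClassify(queryEmbedding, labelledExamples, k = 5) {
  const neighbours = labelledExamples
    .map(({ embedding, label }) => ({
      label,
      score: cosineSimilarity(queryEmbedding, embedding),
    }))
    .sort((a, b) => b.score - a.score)
    .slice(0, k);

  const votes = {};
  for (const { label } of neighbours) votes[label] = (votes[label] || 0) + 1;

  return Object.entries(votes).sort((a, b) => b[1] - a[1])[0][0];
}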
