Commit eb52558
Fix issue where Hugging Face response was not the correct type

This turned out to be a typing issue that Python's type hints did not actually catch.

I also had to adjust the way we load the Hugging Face key because I got rate limited. We NEED a clearer way to set API keys in the `aiconfig/.env` file, or else people won't know how to use it! cc @tanya-rai let's add an FAQ/README similar to what we have in the cookbooks: https://github.com/lastmile-ai/aiconfig/blob/3dcf70bcddda7f7041c3bfb6e5d4b6c9a7abd4ab/cookbooks/Getting-Started/getting_started.ipynb?short_path=dcb1340#L23-L25

We need to do this for the following keys:
- HUGGING_FACE_API_TOKEN
- ANYSCALE_ENDPOINT_API_KEY
- OPENAI_API_KEY
- GOOGLE_API_KEY
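
For reference, a minimal sketch of what an `aiconfig/.env` file could look like (placeholder values shown, not real keys):

```shell
# aiconfig/.env — API keys loaded into the environment (placeholder values)
HUGGING_FACE_API_TOKEN=hf_xxxxxxxxxxxxxxxxxxxx
ANYSCALE_ENDPOINT_API_KEY=esecret_xxxxxxxxxxxx
OPENAI_API_KEY=sk-xxxxxxxxxxxxxxxxxxxxxxxx
GOOGLE_API_KEY=AIzaxxxxxxxxxxxxxxxxxxxx
```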


## Testing
Connected to the local editor (you'll have to modify the `max_tokens` setting) and it runs now. We should also add automated tests for this Hugging Face model parser; created a task for it in #768.

<img width="1512" alt="Screenshot 2024-01-05 at 00 47 51" src="https://github.com/lastmile-ai/aiconfig/assets/151060367/5ecae48a-c0cd-43cf-a7ed-ccc74929c6b3">
Rossdan Craig [email protected] committed Jan 5, 2024
1 parent fe39080 commit eb52558
Showing 3 changed files with 7 additions and 5 deletions.
2 changes: 1 addition & 1 deletion cookbooks/HuggingFace/hf.py
```diff
@@ -128,7 +128,7 @@ class HuggingFaceTextParser(ParameterizedModelParser):
     A model parser for HuggingFace text generation models.
     """

-    def __init__(self, model_id: str = None, use_api_token=False):
+    def __init__(self, model_id: str = None, use_api_token=True):
         """
         Args:
             model_id (str): The model ID of the model to use.
```
```diff
@@ -154,7 +154,9 @@ def __init__(self, model_id: str = None, use_api_token=False):
         token = None

         if use_api_token:
-            token = get_api_key_from_environment("HUGGING_FACE_API_TOKEN")
+            # You are allowed to use Hugging Face for a bit before you get
+            # rate limited, in which case you will receive a clear error
+            token = get_api_key_from_environment("HUGGING_FACE_API_TOKEN", required=False).unwrap()

         self.client = InferenceClient(model_id, token=token)
```
6 changes: 3 additions & 3 deletions python/src/aiconfig/default_parsers/hf.py
```diff
@@ -99,15 +99,15 @@ def construct_stream_output(
     return output


-def construct_regular_output(response: TextGenerationResponse, response_includes_details: bool) -> Output:
+def construct_regular_output(response: str, response_includes_details: bool) -> Output:
     metadata = {"raw_response": response}
     if response_includes_details:
         metadata["details"] = response.details

     output = ExecuteResult(
         **{
             "output_type": "execute_result",
-            "data": response.generated_text or "",
+            "data": response,
             "execution_count": 0,
             "metadata": metadata,
         }
@@ -120,7 +120,7 @@ class HuggingFaceTextGenerationParser(ParameterizedModelParser):
     A model parser for HuggingFace text generation models.
     """

-    def __init__(self, model_id: str = None, use_api_token=False):
+    def __init__(self, model_id: str = None, use_api_token=True):
         """
         Args:
             model_id (str): The model ID of the model to use.
```
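
To illustrate the typing issue being fixed: when `details=False`, `huggingface_hub`'s `InferenceClient.text_generation` returns a plain `str` rather than a `TextGenerationResponse` object, so the output constructor should take a string and use it directly as the output data. Below is a minimal, self-contained sketch of the corrected behavior — `ExecuteResult` here is a simplified stand-in for aiconfig's real class, and details handling is elided:

```python
from dataclasses import dataclass, field
from typing import Any, Dict


@dataclass
class ExecuteResult:
    # Simplified stand-in for aiconfig's ExecuteResult output type.
    output_type: str
    data: Any
    execution_count: int
    metadata: Dict[str, Any] = field(default_factory=dict)


def construct_regular_output(response: str, response_includes_details: bool) -> ExecuteResult:
    # With details=False, text_generation returns the generated text as a str,
    # so `response` is already the string payload — no .generated_text attribute.
    metadata = {"raw_response": response}
    return ExecuteResult(
        output_type="execute_result",
        data=response,
        execution_count=0,
        metadata=metadata,
    )


out = construct_regular_output("Hello, world!", response_includes_details=False)
print(out.data)  # Hello, world!
```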
