
Add bfloat to client #4521

Closed
wants to merge 2 commits into from

Conversation

@jbkyang-nvi (Contributor) commented Jun 17, 2022

Adding bfloat16 to the client and QA containers. Not sure if I'm missing something, but I think bfloat16 should be supported in the Python client with the current change. Reusing Ryan's test for bfloat16 testing.
Related PR: triton-inference-server/client#118
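
For reference, the expected client-side behavior would look roughly like the sketch below. This is a minimal sketch only: it assumes client#118 extends the dtype mappings in tritonclient.utils with BF16, and it uses the GreenWaves bfloat16 package for the numpy extension type, since numpy has no native bfloat16.

import numpy as np
from bfloat16 import bfloat16  # numpy extension type; numpy has no native bfloat16
from tritonclient.utils import np_to_triton_dtype, triton_to_np_dtype

data = np.array([0.5, 1.5, 2.5], dtype=bfloat16)

# numpy extension dtype -> Triton datatype string (expected to be "BF16")
print(np_to_triton_dtype(data.dtype))

# Triton datatype string -> numpy dtype (expected to be the bfloat16 extension type)
print(triton_to_np_dtype("BF16"))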


FLAGS = None

def test_bf16_raw_http(shape):
def test_bf16_http(shape):
Contributor

# FIXME: Use identity_bf16 model in test above once proper python client
# support is added, and remove this raw HTTP test. See DLIS-3720.

This helper was only meant to be a temporary workaround until Python client bf16 support was added.

Can we instead add to the existing test below, now that there is bf16 support in the client? I believe there's an identity_bf16 model that will work now; if not, we could make a very simple Python identity model.

This would:

  1. test both HTTP/GRPC
  2. test the bf16 support in the client library workflow: triton_to_np_dtype, np_to_triton_dtype, etc. (see the sketch after this list)
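
Something like the following would cover both points against an identity_bf16 model. Rough sketch only: it assumes client#118 lets set_data_from_numpy()/as_numpy() handle the bfloat16 extension dtype, and the INPUT0/OUTPUT0 tensor names and default ports are illustrative.

import numpy as np
from bfloat16 import bfloat16
import tritonclient.http as httpclient
import tritonclient.grpc as grpcclient
from tritonclient.utils import np_to_triton_dtype

def identity_bf16_roundtrip(client_module, url, shape):
    # Send a bf16 tensor through the identity model and check it comes back unchanged.
    client = client_module.InferenceServerClient(url=url)
    data = np.ones(shape, dtype=bfloat16)
    inp = client_module.InferInput("INPUT0", list(shape), np_to_triton_dtype(data.dtype))
    inp.set_data_from_numpy(data)
    result = client.infer("identity_bf16", [inp])
    np.testing.assert_array_equal(result.as_numpy("OUTPUT0"), data)

# Same helper covers both protocols.
identity_bf16_roundtrip(httpclient, "localhost:8000", (1, 16))
identity_bf16_roundtrip(grpcclient, "localhost:8001", (1, 16))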

# https://github.com/GreenWaves-Technologies/bfloat16/issues/2
# Can solve when numpy officially supports bfloat16
# https://github.com/numpy/numpy/issues/19808
print("error: tensorflow is included in module. This module cannot " \
Contributor

Is this an error/check that should be done by the client as well? The comment is helpful, but users may run into issues if this results in incompatibility.

@rmccorm4 (Contributor) commented Jun 21, 2022

I think it makes more sense to try/except on the specific error we're expecting here. Maybe it won't always happen when tensorflow is installed.

try:
  from bfloat16 import bfloat16
except <specific error we expect with tensorflow>:
  print("known issue tensorflow pypi, link to github issue")
except Exception as e:
  print("unexpected error importing bf16:", e)

@jbkyang-nvi deleted the kyang-python-bf16 branch July 19, 2022 22:01