[WIP] FIX for Ascii decode error when reading file using python inside docker #1070
Conversation
AWS CodeBuild CI Report
Powered by github-codebuild-logs, available on the AWS Serverless Application Repository
@msaroufim @lxning @dhanainme I have made changes fixing #821 and possibly #943. Please review.
Very cool! A nice, simple, and very useful improvement. Before we can merge this, we need to make sure it runs as expected and doesn't break existing functionality, so let's run the tests where you build the image from scratch and then execute them. It would also be useful to have a screenshot showing an inference working fine in English vs. a UTF-8 encoded language; that way the improvement can be seen and no regressions detected. Also, just double-checking: was the issue around UTF-8 encoded languages only present in Docker? It would be good to have support regardless of environment.
Docker build and dependency-installation logs
@msaroufim I tried multiple times, but the regression test on local is failing due to a 507 error. I debugged, and the response was:

(Pdb) response
<Response [507]>
(Pdb) response.json
<bound method Response.json of <Response [507]>>
(Pdb) response.json()
{'code': 507, 'type': 'InternalServerException', 'message': 'Worker died.'}

Logs:
ts_log.log
access_log.log
I guess this problem happens generally in Docker because the locale is sometimes not set; this Python version then falls back to the locale's default encoding inside Docker, and we get unexpected results.
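The locale dependence described above can be reproduced even outside Docker by forcing the ascii codec when reading a UTF-8 file. A minimal sketch (the file name `sample.txt` is invented for illustration):

```python
import locale
import os
import tempfile

# Python's open() decodes text with the locale's preferred encoding by
# default; in a container where no locale is set this is often
# ANSI_X3.4-1968, i.e. plain ASCII.
print(locale.getpreferredencoding())

# Write a file containing non-ASCII (Japanese) text as UTF-8 bytes.
path = os.path.join(tempfile.mkdtemp(), "sample.txt")
with open(path, "wb") as f:
    f.write("これは例です。".encode("utf-8"))

# Simulate the broken container locale by forcing the ascii codec.
decode_failed = False
try:
    with open(path, encoding="ascii") as f:
        f.read()
except UnicodeDecodeError:
    decode_failed = True
print("decode failed:", decode_failed)

# Passing the encoding explicitly works regardless of the locale.
with open(path, encoding="utf-8") as f:
    print(f.read())
```

The explicit `encoding="utf-8"` argument is the per-call workaround; setting the locale in the image fixes the default for all reads.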
Interesting, regarding the MMF error: are you starting Docker locally or from an Ubuntu machine? Might it be a 20.04 machine? One last thing: do you mind sharing an inference in English vs. a UTF-8 language, just so we can eyeball that everything seems to work?
@msaroufim I am running Docker on a Mac, and I am starting Docker locally. Inside Docker on this branch, English inference works fine:

git clone https://github.com/pytorch/serve.git
mkdir model_store
wget https://download.pytorch.org/models/densenet161-8d451a50.pth
torch-model-archiver --model-name densenet161 --version 1.0 --model-file ./serve/examples/image_classifier/densenet_161/model.py --serialized-file densenet161-8d451a50.pth --export-path model_store --extra-files ./serve/examples/image_classifier/index_to_name.json --handler image_classifier
torchserve --start --ncs --model-store model_store --models densenet161.mar
curl -O https://s3.amazonaws.com/model-server/inputs/kitten.jpg
curl http://127.0.0.1:8080/predictions/densenet161 -T kitten.jpg

{
  "tiger_cat": 0.4693357050418854,
  "tabby": 0.4633876085281372,
  "Egyptian_cat": 0.06456158310174942,
  "lynx": 0.001282821991480887,
  "plastic_bag": 0.00023323067580349743
}

For Japanese inference:

torchserve --stop
mkdir -p model_store
rm -rf model_store/*
rm -rf logs
torch-model-archiver --model-name servemodel --version 1.0 --handler model_handler.py --export-path model_store --extra-files vec.txt
torchserve --start --ncs --ts-config config.properties --model-store model_store --models servemodel.mar

Inside Docker on the master branch, I get the ASCII decode error for Japanese inference. The attachment below contains the test and logs; look at the model_log.log file to verify. Inside Docker on the present branch, Japanese inference runs perfectly fine: test-pass-on-present-branch.zip

curl -X POST -H 'Content-Type: application/json' -d '{"hoge": "this is an example. これは例です。"}' http://127.0.0.1:8080/predictions/servemodel/1.0/
{
  "hoge": "this is an example. \u3053\u308c\u306f\u4f8b\u3067\u3059\u3002",
  "\u3042\u3044\u3046\u3048\u304a": "\u304b\u304d\u304f\u3051\u3053"
}
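The `model_handler.py` and `vec.txt` used in the Japanese test are not shown in the thread, so the following is only a sketch of what a minimal TorchServe-style handler for it might look like; the vocabulary contents, the echo behavior, and the helper name `load_vocab` are assumptions for illustration. The relevant point is that both the file read and the request-body decode name UTF-8 explicitly instead of relying on the container locale:

```python
import json
import os
import tempfile

def load_vocab(path):
    # Explicit encoding avoids UnicodeDecodeError when the container
    # locale defaults to ASCII (ANSI_X3.4-1968).
    with open(path, encoding="utf-8") as f:
        return [line.rstrip("\n") for line in f]

def handle(data, context):
    # TorchServe module-level entry point: each element of `data` carries
    # the raw request bytes under "body" (or "data").
    if data is None:
        return None
    responses = []
    for row in data:
        body = row.get("body") or row.get("data")
        if isinstance(body, (bytes, bytearray)):
            body = body.decode("utf-8")  # decode explicitly, not via locale
        payload = json.loads(body)
        responses.append(json.dumps(payload, ensure_ascii=True))
    return responses

# Self-contained demo: a stand-in for the vec.txt packaged via --extra-files.
vec_path = os.path.join(tempfile.mkdtemp(), "vec.txt")
with open(vec_path, "w", encoding="utf-8") as f:
    f.write("あいうえお\nかきくけこ\n")
vocab = load_vocab(vec_path)
print(vocab)

request = [{"body": json.dumps({"hoge": "this is an example. これは例です。"}).encode("utf-8")}]
print(handle(request, None))
```

Returning the response through `json.dumps(..., ensure_ascii=True)` reproduces the `\uXXXX`-escaped output shown in the curl response above.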
Seems like the right way to solve this is a bit more involved. Can you check this out and let me know what you think is the best way to proceed? https://stackoverflow.com/questions/28405902/how-to-set-the-locale-inside-a-debian-ubuntu-docker-container
Description
Following a similar fix: https://github.com/aws/deep-learning-containers/blob/master/pytorch/inference/docker/1.8/py3/Dockerfile.cpu#L14
When executing a Python script that opens a file containing non-ASCII characters in read mode, it gives an ASCII decode error.
Fixes #821
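The linked deep-learning-containers Dockerfile sets locale environment variables in the image (something like `LANG=C.UTF-8`; the exact line is an assumption here). The effect of `LANG` on Python's default file encoding can be illustrated by starting fresh interpreters under different values. A sketch assuming a Linux-like host where the `C.UTF-8` locale is available; `PYTHONCOERCECLOCALE=0` disables Python 3.7+'s automatic coercion of the C locale so the difference stays visible:

```python
import os
import subprocess
import sys

def preferred_encoding(lang):
    # Start a fresh interpreter: the locale is read once at startup.
    env = dict(
        os.environ,
        LANG=lang,
        LC_ALL=lang,
        PYTHONCOERCECLOCALE="0",  # keep Python from coercing C to C.UTF-8
        PYTHONUTF8="0",           # keep UTF-8 mode from masking the locale
    )
    result = subprocess.run(
        [sys.executable, "-c",
         "import locale; print(locale.getpreferredencoding())"],
        env=env, capture_output=True, text=True,
    )
    return result.stdout.strip()

# Under the bare C locale (common in minimal containers) open() defaults
# to an ASCII codec; under C.UTF-8 it defaults to UTF-8.
print(preferred_encoding("C"))        # e.g. ANSI_X3.4-1968 on Linux
print(preferred_encoding("C.UTF-8"))
```

This is why setting the locale in the Dockerfile fixes the read error for every script in the image, while `open(..., encoding="utf-8")` fixes only one call site.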
Type of change
Feature/Issue validation/testing
DOCKER_BUILDKIT=1 docker build --file Dockerfile --build-arg BASE_IMAGE=nvidia/cuda:10.2-cudnn7-runtime-ubuntu18.04 -t torchserve:gpu .
test.txt
Checklist: