docs: fix broken links
Borda committed Dec 19, 2024
1 parent 4ff6b91 commit a5702f1
Showing 2 changed files with 3 additions and 3 deletions.
2 changes: 1 addition & 1 deletion docs/source/basic.rst
@@ -111,4 +111,4 @@ AWS
^^^
You can either use `Gaudi-based AWS EC2 DL1 instances <https://aws.amazon.com/ec2/instance-types/dl1/>`__ or `Supermicro X12 Gaudi server <https://www.supermicro.com/en/solutions/habana-gaudi>`__ to get access to HPUs.

-Check out the `PyTorch Model on AWS DL1 Instance Quick Start <https://docs.habana.ai/en/latest/AWS_EC2_DL1_and_PyTorch_Quick_Start/AWS_EC2_DL1_and_PyTorch_Quick_Start.html>`__.
+Check out the `PyTorch Model on AWS DL1 Instance Quick Start <https://docs.habana.ai/en/v1.19.0/Quick_Start_Guides/AWS_EC2_DL1_and_PyTorch_Quick_Start.html>`__.
4 changes: 2 additions & 2 deletions docs/source/intermediate.rst
@@ -284,15 +284,15 @@ INC uses configuration jsons for selecting between quant and measurement modes.
This can be toggled via the `quant` param in `HPUPrecisionPlugin.convert_modules()`.
`quant` also accepts a user-defined config dictionary or a JSON path.

-Refer to `Supported JSON Config File Options <https://docs.habana.ai/en/latest/PyTorch/Inference_on_PyTorch/Inference_Using_FP8.html#supported-json-config-file-options>`__ for more information on supported json configs.
+Refer to `Supported JSON Config File Options <https://docs.habana.ai/en/v1.19.0/PyTorch/Inference_on_PyTorch/Quantization/Inference_Using_FP8.html#supported-json-config-file-options>`__ for more information on supported json configs.


.. note::

    To enable fp8 inference with HPUDeepSpeedStrategy, use HPUDeepSpeedPrecisionPlugin instead of HPUPrecisionPlugin, keeping all other steps the same.


-For more details, refer to `Inference Using FP8 <https://docs.habana.ai/en/latest/PyTorch/Inference_on_PyTorch/Inference_Using_FP8.html>`__.
+For more details, refer to `Inference Using FP8 <https://docs.habana.ai/en/v1.19.0/PyTorch/Inference_on_PyTorch/Quantization/Inference_Using_FP8.html>`__.
For a list of data types supported with HPU, refer to `PyTorch Support Matrix <https://docs.habana.ai/en/latest/PyTorch/Reference/PyTorch_Support_Matrix.html>`__.

----
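
The `quant` parameter mentioned in the hunk above accepts a boolean, a user-defined config dictionary, or a JSON path. A minimal sketch of what such config dictionaries might look like is below; the key names are illustrative assumptions based on Habana's documented INC options and should be checked against the linked "Supported JSON Config File Options" page for your release.

```python
# Illustrative FP8 config dictionaries for HPUPrecisionPlugin.convert_modules().
# Key names are assumptions drawn from Habana's documented options; verify them
# against the "Supported JSON Config File Options" page.

# Measurement mode: run a few batches to collect calibration statistics.
measure_config = {
    "mode": "MEASURE",
    "observer": "maxabs",
    "dump_stats_path": "./hqt_output/measure",
}

# Quantization mode: apply fp8 quantization using the collected statistics.
quant_config = {
    "mode": "QUANTIZE",
    "observer": "maxabs",
    "scale_method": "maxabs_hw",
    "dump_stats_path": "./hqt_output/measure",
}

# Hypothetical usage (requires lightning-habana and Gaudi hardware):
# plugin = HPUPrecisionPlugin(precision="fp8")
# model = plugin.convert_modules(model, quant=quant_config)
```

A measurement pass with `measure_config` populates the stats directory that the subsequent quantization pass consumes, which is why `dump_stats_path` matches across the two dictionaries.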
