Releases: graphcore-research/jax-experimental
JAX IPU v0.3.16 beta3 - Poplar SDK 3.1/3.2/3.3
JAX on Graphcore IPU, version 0.3.16 beta3.
🔴 JAX on IPUs is an experimental package, not part of the official Graphcore Poplar SDK.
Release notes:
Improvements & bugfixes:
- `jax.debug.callback` supported on IPUs, making debugging easier (see the sketch after this list) #18;
- Fix IPU donated SRAM buffer re-use bug that produced wrong results instead of raising an exception #21;
- Improve IPU PjRt client non-standard XLA layout handling #22;
- Add Popvision System Analyser integration;
- Improve performance of JAX IPU asynchronous dispatch;
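As an illustration, a minimal sketch of using `jax.debug.callback` from a jitted function to print an intermediate value on the host (standard JAX API; the function and values are illustrative, not from this repository):

```python
import jax
import jax.numpy as jnp

def log_value(x):
    # Host-side callback: receives concrete values streamed back from the device.
    print("intermediate value:", x)

@jax.jit
def fn(x):
    y = jnp.sin(x) * 2.0
    # Inspect `y` without leaving the jitted program.
    jax.debug.callback(log_value, y)
    return y + 1.0

fn(jnp.arange(4.0))
```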
Requirements: Ubuntu 20.04, Graphcore Poplar SDK 3.1/3.2/3.3 and Python 3.8
TensorFlow XLA commit hash used for jaxlib compilation: xxx
JAX IPU v0.3.16 beta2 - Poplar SDK 3.1/3.2
JAX on Graphcore IPU, version 0.3.16 beta2.
🔴 JAX on IPUs is an experimental package, not part of the official Graphcore Poplar SDK.
Release notes:
The C++ IPU PjRt client has been re-written to support:
- Asynchronous dispatch on IPU backend;
- Multiple IPUs using `pmap` and `pjit`, with basic collectives (see the sketch below);
- Improved single IPU performance (as shown by MNIST training);
- Host/CPU fallback for low-FLOP operations (e.g. slicing, ...);
Note: Infeed and outfeed are not yet supported on the new backend (and not a major priority as the performance gap to Python loop training has been reduced).
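A minimal sketch of multi-IPU data parallelism with `pmap` and a basic collective (`psum`); the function and data are illustrative and assume several IPU devices are visible to JAX:

```python
import jax
import jax.numpy as jnp

num_devices = jax.local_device_count()  # number of IPUs visible to JAX

def normalized_sum(x):
    # Basic collective: sum `x` across all replicas along the named axis.
    total = jax.lax.psum(x, axis_name="i")
    return x / total

# Replicate across devices; the leading axis of the input must equal the device count.
pmapped = jax.pmap(normalized_sum, axis_name="i")
data = jnp.arange(1.0, num_devices * 4 + 1).reshape(num_devices, 4)
print(pmapped(data))
```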
Requirements: Ubuntu 20.04, Graphcore Poplar SDK 3.1 or 3.2 and Python 3.8
TensorFlow XLA commit hash used for jaxlib compilation: 655f9416a0ba7b7e8e3a10920d456ef26bc134db
JAX IPU v0.3.16 beta1 - Poplar SDK 3.1
JAX on Graphcore IPU, version 0.3.16 beta1.
JAX on IPUs is an experimental package, not part of the official Graphcore Poplar SDK.
Release notes:
- Single IPU support;
- Coverage of most JAX LAX operators;
- JAX buffer donation support (see the sketch after this list);
- MNIST examples;
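A minimal sketch of buffer donation with `jit`, letting the compiler reuse the donated input's on-device buffer for the output (the update rule and names are illustrative, not from this repository):

```python
import functools
import jax
import jax.numpy as jnp

# Donating `params` allows the output to reuse its on-device buffer,
# avoiding an extra allocation; the donated input must not be used again afterwards.
@functools.partial(jax.jit, donate_argnums=(0,))
def sgd_update(params, grads):
    return params - 0.1 * grads

params = jnp.ones(1024)
grads = jnp.full(1024, 0.01)
params = sgd_update(params, grads)  # rebind `params`; the old buffer was donated
```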
Requirements: Ubuntu 20.04, Graphcore Poplar SDK 3.1 and Python 3.8