clean CP implementation for flash attention and cuDNN 9.6 #6540
Triggered via pull request: January 7, 2025 21:56
Status: Success
Total duration: 2m 5s
Artifacts: none

Workflow: lint.yml (on: pull_request)
Jobs:
PyTorch C++: 20s
PyTorch Python: 1m 55s
JAX C++: 21s
JAX Python: 22s
PaddlePaddle C++: 20s
PaddlePaddle Python: 54s
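
The repository's actual lint.yml is not shown on this page. As a minimal sketch only, assuming the jobs above map one-to-one to workflow jobs, the workflow might be laid out as below; the job IDs and lint commands are hypothetical placeholders, not the real configuration. The remaining jobs (JAX and PaddlePaddle) would follow the same pattern.

```yaml
# Hypothetical sketch of a lint workflow with one job per framework/language pair.
# Job IDs, steps, and commands are assumptions, not the repository's real lint.yml.
name: Lint

on: pull_request

jobs:
  pytorch-cpp:
    name: PyTorch C++
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Run C++ lint (placeholder command)
        run: echo "C++ style checks would run here"

  pytorch-python:
    name: PyTorch Python
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Run Python lint (placeholder command)
        run: echo "Python style checks would run here"
```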

Annotations: 6 warnings
All six jobs (PyTorch C++, PyTorch Python, JAX C++, JAX Python, PaddlePaddle C++, PaddlePaddle Python) reported the same warning: "ubuntu-latest pipelines will use ubuntu-24.04 soon. For more details, see https://github.com/actions/runner-images/issues/10636"
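
The warning is about the ubuntu-latest runner label being migrated to the ubuntu-24.04 image. A minimal sketch of one way to make the image explicit, assuming the newer image is acceptable for these lint jobs; the job ID is a hypothetical placeholder:

```yaml
# Hypothetical sketch: pin the runner image instead of tracking ubuntu-latest.
jobs:
  pytorch-cpp:             # placeholder job ID, not from the real lint.yml
    name: PyTorch C++
    runs-on: ubuntu-24.04  # explicit image label; avoids the ubuntu-latest migration warning
    steps:
      - uses: actions/checkout@v4
```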