Actions: microsoft/DeepSpeed

Formatting

4,749 workflow run results

Formatting #15946: Merge group checks requested
January 6, 2025 23:25 · 1m 20s

Formatting #15945: Merge group checks requested
January 6, 2025 23:11 · 1m 17s

Variable batch size and LR scheduler
Formatting #15944: Pull request #5237 synchronize by loadams
January 6, 2025 22:50 · 1m 24s · bm-synth:variable_batch_size_and_lr

Autotp training
Formatting #15943: Pull request #6922 synchronize by loadams
January 6, 2025 22:33 · 1m 22s · inkcherry:autotp_training

Variable batch size and LR scheduler
Formatting #15942: Pull request #5237 synchronize by loadams
January 6, 2025 22:32 · 1m 24s · bm-synth:variable_batch_size_and_lr

Check transformers version in BLOOM for inference v1
Formatting #15941: Pull request #6766 synchronize by loadams
January 6, 2025 21:41 · 1m 24s · lekurile/bloom_v_check

Formatting #15939: Merge group checks requested
January 6, 2025 20:06 · 1m 22s

Formatting #15938: Merge group checks requested
January 6, 2025 18:55 · 1m 24s

Formatting #15937: Merge group checks requested
January 6, 2025 17:38 · 1m 16s

Autotp training
Formatting #15936: Pull request #6922 synchronize by tjruwase
January 6, 2025 17:26 · 1m 24s · inkcherry:autotp_training

Use ds-specific module id to avoid conflicts
Formatting #15933: Pull request #6847 synchronize by loadams
January 6, 2025 16:12 · 1m 26s · olruwase/pr_6772

Formatting #15932: Scheduled
January 6, 2025 00:21 · 1m 22s · master

[BUG FIX]:fix get torch.version.cuda error when cuda is None in rocm
Formatting #15931: Pull request #6909 synchronize by hj-wei
January 5, 2025 10:52 · 1m 24s · hj-wei:dev_hjwei

Formatting #15930: Scheduled
January 5, 2025 00:22 · 1m 19s · master

Formatting #15929: Merge group checks requested
January 4, 2025 05:58 · 1m 17s

Formatting #15927: Scheduled
January 4, 2025 00:20 · 1m 21s · master

Use ds-specific module id to avoid conflicts
Formatting #15926: Pull request #6847 synchronize by loadams
January 3, 2025 22:04 · 1m 18s · olruwase/pr_6772

Add the missing view operations from sequence parallel(async).
Formatting #15925: Pull request #6750 synchronize by loadams
January 3, 2025 19:32 · 1m 23s · inkcherry:ds_overlap_fix

Add fp8_gemm fallback for non-triton systems
Formatting #15923: Pull request #6916 synchronize by loadams
January 3, 2025 16:54 · 1m 41s · oelayan7:fp8_gemm_no_triton

[BUG FIX]:fix get torch.version.cuda error when cuda is None in rocm
Formatting #15922: Pull request #6909 synchronize by loadams
January 3, 2025 16:28 · 1m 26s · hj-wei:dev_hjwei

Fix checkpointable_layers Logic
Formatting #15921: Pull request #6881 synchronize by loadams
January 3, 2025 16:28 · 1m 22s · Quentin-Anthony:qanthony/fix-act-recomp