PyTorch 2.11 has been released with 2723 commits from 432 contributors. Key highlights include differentiable collectives for distributed training (enabling backpropagation through collective operations), FlexAttention with a FlashAttention-4 backend on Hopper and Blackwell GPUs (1.2×–3.2× speedups over Triton), expanded MPS
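To make the first highlight concrete: "differentiable collectives" means a collective op such as all-reduce participates in autograd, so gradients flow back through it. A minimal sketch of the idea, using the autograd-aware wrappers in `torch.distributed.nn` (shown here in a single-process `gloo` group purely for illustration; this demonstrates the concept, not necessarily the new 2.11 API surface):

```python
import os
import torch
import torch.distributed as dist
import torch.distributed.nn  # autograd-aware collective wrappers

# Single-process process group with the gloo backend, for illustration only.
os.environ.setdefault("MASTER_ADDR", "127.0.0.1")
os.environ.setdefault("MASTER_PORT", "29500")
dist.init_process_group("gloo", rank=0, world_size=1)

x = torch.ones(4, requires_grad=True)

# Unlike dist.all_reduce, torch.distributed.nn.all_reduce is differentiable:
# the backward pass runs a matching collective on the gradients.
y = torch.distributed.nn.all_reduce(x, op=dist.ReduceOp.SUM)
y.sum().backward()

print(x.grad)  # gradients propagated back through the collective

dist.destroy_process_group()
```

With `world_size=1` the all-reduce is an identity, so `x.grad` is all ones; across real ranks, the backward pass would itself perform a collective so each rank receives the correctly reduced gradient.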
Table of contents
API-Unstable Features