Diffusion transformers (DiTs) are poised to reshape generative AI by letting diffusion models scale beyond what was previously feasible. They replace the U-Net backbone in diffusion models, delivering gains in both efficiency and sample quality. The transformer's attention mechanism makes the architecture simpler to scale and highly parallelizable. OpenAI's Sora team showcased the transformative potential of this approach at large scale.
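The parallelism claim comes down to attention itself: every token (e.g. an image patch) attends to every other token in a single matrix product, rather than through a sequential scan. A minimal NumPy sketch of scaled dot-product self-attention (the weight matrices `Wq`, `Wk`, `Wv` and patch dimensions here are illustrative, not taken from any actual DiT implementation):

```python
import numpy as np

def attention(x, Wq, Wk, Wv):
    # Scaled dot-product self-attention: all pairwise token
    # interactions are computed in one matmul, so the whole
    # sequence is processed in parallel.
    q, k, v = x @ Wq, x @ Wk, x @ Wv
    d = q.shape[-1]
    scores = q @ k.T / np.sqrt(d)
    # Row-wise softmax over attention scores.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v

rng = np.random.default_rng(0)
tokens = rng.normal(size=(16, 32))  # 16 hypothetical image-patch tokens, dim 32
Wq = Wk = Wv = np.eye(32)           # identity projections, for illustration only
out = attention(tokens, Wq, Wk, Wv)
print(out.shape)  # (16, 32): one updated vector per patch token
```

A real diffusion transformer stacks many such attention blocks (with learned projections, layer norm, and MLPs) over noised image patches, but the core operation scales the same way.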

3-minute read, from techcrunch.com
