Best of Vercel · March 2026

  1.
    Article
    Bytes by ui.dev · 11w

    Vercel is gonna buy Tailwind (probably)

    Tailwind CSS v4.2 was released with a new webpack plugin for improved Next.js performance, but the bigger story is Tailwind's financial struggles. Despite npm installs growing 5x due to LLM adoption, AI has caused a 40% drop in human doc visits, cutting into revenue from paid products and forcing layoffs of 75% of the engineering team. The newsletter speculates that Vercel, which has cash, a vested interest in Tailwind, and a track record of acquiring OSS projects like Svelte and Turborepo, is a likely acquirer. The issue also includes a JavaScript tip on deduplicating arrays of objects using Map, filter/findIndex, or reduce.
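    The deduplication tip mentioned above can be sketched as follows (the `users` array and the `id` key are illustrative, not from the newsletter; note the Map version keeps the last occurrence per key, while the other two keep the first):

    ```javascript
    // Three ways to deduplicate an array of objects by a key.
    // Sample data is illustrative; dedupe here is by `id`.
    const users = [
      { id: 1, name: "Ada" },
      { id: 2, name: "Grace" },
      { id: 1, name: "Ada Lovelace" }, // duplicate id
    ];

    // 1. Map: keys collapse duplicates; the LAST entry per id wins.
    const dedupedByMap = [...new Map(users.map((u) => [u.id, u])).values()];

    // 2. filter + findIndex: keep only the FIRST entry per id.
    const dedupedByFilter = users.filter(
      (u, i) => users.findIndex((x) => x.id === u.id) === i
    );

    // 3. reduce: accumulate entries whose id hasn't been seen yet (first wins).
    const dedupedByReduce = users.reduce(
      (acc, u) => (acc.some((x) => x.id === u.id) ? acc : [...acc, u]),
      []
    );
    ```

    The Map version is O(n) and usually the idiomatic choice; filter/findIndex and reduce are O(n²) but read well for small arrays.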

  2.
    Article
    Vercel · 8w

    new.website joins forces with v0

    new.website, a startup focused on making website creation effortless with built-in forms, SEO, and content management tools, is joining the v0 team at Vercel. The acquisition aims to bring native, agent-aware primitives to v0, reducing the prompting needed to implement baseline website features and accelerating the path from prototype to production.

  3.
    Article
    Vercel · 9w

    LiteLLM Gateway now supported on Vercel

    LiteLLM Gateway can now be deployed on Vercel, giving developers an OpenAI-compatible interface that routes LLM requests to any supported provider, including Vercel AI Gateway. A basic setup involves a Python entry point plus a litellm_config.yaml file that defines the model routing, including how to route a model through Vercel AI Gateway.
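    A minimal config along the lines the article describes might look like this (the model alias, provider string, gateway base URL, and env-var name are illustrative assumptions, not the article's exact snippet):

    ```yaml
    # litellm_config.yaml — illustrative sketch, not the article's exact config.
    model_list:
      - model_name: my-model                        # alias clients send in OpenAI-style requests
        litellm_params:
          model: openai/gpt-4o                      # provider/model string (assumed)
          api_base: https://ai-gateway.vercel.sh/v1 # assumed Vercel AI Gateway endpoint
          api_key: os.environ/AI_GATEWAY_API_KEY    # LiteLLM reads the key from the environment
    ```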

  4.
    Article
    Vercel · 8w

    360 billion tokens, 3 million customers, 6 engineers

    Durable, an AI business builder serving 3 million customers, migrated its multi-tenant platform from self-hosted infrastructure to Vercel with a team of just 6 engineers. The move reduced infrastructure costs 3-4x, eliminated the need for a dedicated DevOps team, and enabled shipping new production AI agents in a single day. The platform now processes ~1.1 billion tokens daily (360 billion per year). Key challenges solved include custom domain SSL at scale, multi-region cluster maintenance, tenant isolation for AI context, and per-customer cost attribution. The team used coding agents to accelerate a full product rewrite rather than incremental refactoring.

  5.
    Video
    Fireship · 11w

    Cloudflare just slop forked Next.js…

    Cloudflare released VNext, a from-scratch reimplementation of the Next.js API built on Vite, enabling Next.js apps to be deployed anywhere without relying on Vercel's proprietary runtime. Built in roughly a week using AI assistance at a cost of ~$1,100 in tokens, it achieves 94% Next.js API coverage. Benchmarks show up to 4.4x faster production builds and 57% smaller client bundles compared to standard Next.js, largely due to Vite and the Rust-based Rolldown bundler. Vercel's leadership publicly criticized the project as a 'slop fork' and highlighted security vulnerabilities. A practical migration demo is shown using Cursor and a Cloudflare-provided agent skill, though the project is considered too early for production use.

  6.
    Article
    Vercel · 7w

    Agent responsibly

    Coding agents generate convincing code fast, but passing CI doesn't mean code is safe for production. There's a critical difference between leveraging AI (maintaining full ownership and understanding of the output) and relying on it (shipping whatever the agent produces). The scarce resource is no longer writing code; it's judgment about what's safe to ship. A responsible framework includes self-driving deployments with automatic rollback, continuous validation through load tests and chaos experiments, and executable guardrails that encode operational knowledge as runnable tools rather than documentation. The engineers who thrive will be those who maintain rigorous judgment over what they ship, not those who generate the most code.