A team built a custom DSL parser in Rust compiled to WASM, then discovered the bottleneck wasn't computation but WASM-JS boundary overhead. Attempting to skip JSON serialization via serde-wasm-bindgen made things 30% slower, because it traded one coarse crossing for many fine-grained boundary crossings. Rewriting the parser in TypeScript eliminated the boundary entirely, yielding 2.2-4.6x faster per-call performance. They also fixed an O(N²) streaming problem by caching the ASTs of completed statements, so each update only re-parses the trailing incomplete statement, reducing total streaming cost by 2.6-3.3x. Key lesson: WASM shines for compute-bound tasks with minimal interop, not for frequently-called parsers on small inputs where boundary overhead dominates.
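The statement-caching idea can be sketched as follows. This is a hypothetical illustration, not the post's actual code: `parseStatement`, the `Ast` shape, and ";"-terminated statements are all assumptions. On each streaming update, completed statements are looked up in a cache and only the still-growing tail is re-parsed, so total parse work stays linear in the number of statements instead of quadratic.

```typescript
// Illustrative AST type and parser; counts calls to show work saved.
type Ast = { kind: "stmt"; text: string };

let parseCalls = 0;
function parseStatement(src: string): Ast {
  parseCalls++; // each call represents real parsing work
  return { kind: "stmt", text: src };
}

class StreamingParser {
  private cache = new Map<string, Ast>();

  // Parse the full buffer seen so far; statements end with ";".
  parse(buffer: string): Ast[] {
    const parts = buffer.split(";");
    const complete = parts.slice(0, -1).map((s) => s + ";");
    const tail = parts[parts.length - 1];

    const asts = complete.map((stmt) => {
      let ast = this.cache.get(stmt);
      if (!ast) {
        ast = parseStatement(stmt); // parsed at most once per statement
        this.cache.set(stmt, ast);
      }
      return ast;
    });
    if (tail.length > 0) asts.push(parseStatement(tail)); // open tail always re-parsed
    return asts;
  }
}

// Simulate a stream arriving in chunks. Without the cache, every update
// would re-parse all earlier statements, giving O(N²) total work.
const p = new StreamingParser();
p.parse("a;");
p.parse("a;b;");
p.parse("a;b;c");
p.parse("a;b;c;");
```

After the four updates above, `parseStatement` has run only four times (once per new statement or open tail), whereas re-parsing the whole buffer each time would have cost nine calls.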
