In-Browser Transpilation

I’ve been experimenting with in-browser transpilation for JSX, TypeScript, Vue, and Svelte. The idea is to enable a zero-install development experience where you can write code in the browser and see the results immediately. The use case is the new VizHub runtime environment, which serves the VizHub platform but could also be used in other contexts.

What follows is a verbatim copy-paste of a ChatGPT “Deep Research” session in which I asked it to research and explore the current state of these ideas. Here’s the prompt I gave:

When I’m using Rollup in the browser, how can I transpile JSX? I’m looking for a lightweight solution that works in the browser environment. I’m also interested in discovering similar solutions for transpiling TypeScript, Vue, and Svelte in the browser. Ideally VizHub could support all of these with a minimal bundle. Please also consider that the thing that does the transpilation will need to run inside a Web Worker.

  • I’m looking for a custom solution that can be encapsulated as a Rollup plugin.
  • The solution can use a CDN
  • WASM and Pure JS are both considered. The WASM build needs to be reasonably small, though.
  • HMR is not required for the first version, but would be great to add in the future

Got it! I’ll research lightweight Rollup plugins or custom solutions for transpiling JSX, TypeScript, Vue, and Svelte in the browser. I’ll focus on options that can work inside a Web Worker, using either WASM or JS, while ensuring a minimal bundle size. The solution should leverage CDNs where possible and be extensible for potential HMR support in the future.

I’ll get back to you soon with the findings!

In-Browser Transpilation for JSX, TypeScript, Vue, and Svelte

Requirements and Challenges

Building an in-browser transpiler for JSX/TS and single-file component frameworks (Vue, Svelte) comes with unique challenges. The solution must run entirely in a Web Worker (to avoid blocking the UI) and should leverage CDNs for heavy resources when possible. Both WebAssembly (WASM) and pure JavaScript approaches are on the table – with a preference for smaller WASM builds where feasible. A key goal is to minimize bundle size for the hosting application (the in-browser IDE or playground). Ideally, the transpilation logic is encapsulated in a Rollup plugin interface, so that it plugs into Rollup’s workflow even in the browser. Looking forward, the design should not preclude adding hot-module reloading (HMR) capabilities in the future for a faster feedback loop during development.

JavaScript-Based Transpilers (Babel, Sucrase, etc.)

JavaScript implementations of transpilers can run directly in the browser without special binaries. The classic solution is Babel (via @babel/standalone), which supports JSX, TypeScript (stripping types), and more. However, Babel’s standalone build is very large (~2.8 MB) (Browser Support for @rollup/plugin-sucrase · Issue #1620 · rollup/plugins · GitHub) and relatively slow, making it less ideal for a lightweight in-browser IDE. A much lighter alternative is Sucrase, a fast transpiler focused on removing TypeScript/Flow types and transforming JSX/ESNext syntax. Sucrase can run in the browser (no Node dependencies) and its bundled size is only around 275 KB (Browser Support for @rollup/plugin-sucrase · Issue #1620 · rollup/plugins · GitHub), which is significantly smaller than Babel’s. In fact, Sucrase is about 20× faster than Babel for development builds, since it does only the essential transforms without full re-analysis (Comparing Babel, Sucrase, and Similar Libraries - Honeybadger.io) (The Journey To a Faster Sandpack Transpiler - CodeSandbox). CodeSandbox reports that Sucrase is “considerably smaller than both esbuild and SWC” while being extremely fast (The Journey To a Faster Sandpack Transpiler - CodeSandbox). This makes Sucrase a top candidate for in-browser JSX/TS transpilation where bundle size and speed are critical. Another option is the TypeScript compiler itself (using ts.transpileModule in the browser), but the TypeScript compiler is large (on the order of a few MB) and was not designed for speed in a browser context – benchmarks show it is more than an order of magnitude slower than Sucrase or WASM-based tools for pure transpilation (e.g. ~143s vs ~8s for a large input) (The Journey To a Faster Sandpack Transpiler - CodeSandbox). Similarly, older mini-transpilers like Bublé exist (small ES2015-to-ES5 converter), but they lack support for TypeScript and modern JSX, so they’re less applicable here. Overall, a JavaScript-based solution like Sucrase offers an excellent trade-off: small footprint and very fast “good-enough” transforms (no type checking, just syntax transpile), suitable for a browser IDE (The Journey To a Faster Sandpack Transpiler - CodeSandbox).
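
To make the Sucrase option concrete, here is roughly what a single transform call looks like (a minimal sketch; the sample component and console output are illustrative):

```js
import { transform } from "sucrase";

// A small TSX source string: a typed React component.
const source = `
const Hello = ({ name }: { name: string }) => <h1>Hello, {name}!</h1>;
export default Hello;
`;

// Strip TypeScript types and compile JSX in one pass.
// No type checking happens; this is syntax-only transpilation,
// and by default JSX compiles to React.createElement calls.
const { code } = transform(source, {
  transforms: ["typescript", "jsx"],
});

console.log(code);
```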

Rollup plugin integration: Rollup has official plugins for these transpilers, though not all are browser-ready. @rollup/plugin-babel could theoretically be used with Babel Standalone, but as noted, that adds ~2–3 MB to the bundle (Browser Support for @rollup/plugin-sucrase · Issue #1620 · rollup/plugins · GitHub). @rollup/plugin-sucrase is much more lightweight; it normally runs in Node, but Sucrase can run in-browser, and an issue was opened to add browser support for it (Browser Support for @rollup/plugin-sucrase · Issue #1620 · rollup/plugins · GitHub). In practice, one could create a custom Rollup plugin that calls Sucrase’s API in the worker thread to transform .tsx/.jsx and .ts files. This avoids a large dependency and keeps things encapsulated. The Rollup team even noted that running Sucrase in the browser is feasible and “most promising” for minimizing overhead (Browser Support for @rollup/plugin-sucrase · Issue #1620 · rollup/plugins · GitHub). If type-checking or full Babel transpilation is needed (for example, to use experimental proposals not handled by Sucrase), Babel Standalone is an option but with a heavy cost to size and performance.
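
A sketch of such a plugin is below. It assumes Sucrase is fetched as an ES module from esm.sh inside the worker; the CDN URL and plugin name are illustrative, and error handling is omitted:

```js
// Hypothetical Rollup plugin: lazily loads Sucrase from a CDN the first time
// a .ts/.tsx/.jsx module is transformed, keeping it out of the initial bundle.
export function sucraseTranspile() {
  let sucrasePromise = null;

  return {
    name: "sucrase-transpile",
    async transform(code, id) {
      if (!/\.(ts|tsx|jsx)$/.test(id)) return null;

      sucrasePromise ??= import("https://esm.sh/sucrase");
      const { transform } = await sucrasePromise;

      const result = transform(code, {
        // Plain .ts files skip the JSX transform to avoid clashing with
        // TypeScript's angle-bracket cast syntax.
        transforms: id.endsWith(".tsx") || id.endsWith(".jsx")
          ? ["typescript", "jsx"]
          : ["typescript"],
        filePath: id,
        sourceMapOptions: { compiledFilename: id },
      });
      return { code: result.code, map: result.sourceMap };
    },
  };
}
```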

WebAssembly-Based Transpilers (SWC and Esbuild)

WASM-powered compilers promise native-like performance for transpilation in the browser, at the cost of loading a WebAssembly binary. Two notable tools are SWC (a Rust-based compiler) and esbuild (a Go-based bundler/transpiler), both of which offer WASM builds for browser use.

  • SWC: The Rust-based SWC is used in many modern build tools and has a WASM version (@swc/wasm-web). It can perform fast JSX and TS transformations. However, the WASM bundle for SWC is very large (~19 MB) in its full form (Browser Support for @rollup/plugin-sucrase · Issue #1620 · rollup/plugins · GitHub). Including a 19 MB binary just for transpilation is likely a non-starter for our lightweight goals. It’s possible to lazy-load it from a CDN when needed, but even then the download and instantiation time would be significant. Rollup’s own browser build actually embeds a small part of SWC internally (for parsing), but not the full transformer (Leverage SWC parser to transpile JSX and TypeScript · Issue #5210 · rollup/rollup · GitHub). One idea is to use that same SWC parser in a Rollup plugin to handle TS/JSX – indeed, Rollup maintainers suggested replacing @swc/core with the WASM version to make a browser-friendly SWC plugin (Leverage SWC parser to transpile JSX and TypeScript · Issue #5210 · rollup/rollup · GitHub). Yet, the size hit (“blowing up Rollup by a factor of five” in bundle size) was a noted concern (Leverage SWC parser to transpile JSX and TypeScript · Issue #5210 · rollup/rollup · GitHub). Performance-wise, SWC (WASM) is fast at transforming code, but in single-threaded scenarios it may not outperform lighter JS solutions. For example, in one benchmark transpiling large codebases, SWC’s WASM was slower than Sucrase and even slower than esbuild for certain inputs (The Journey To a Faster Sandpack Transpiler - CodeSandbox), likely due to WASM overhead and lack of optimization for huge files in one go.

  • Esbuild: Esbuild is another extremely fast compiler/bundler that can handle JSX and TypeScript. It provides a WASM package (esbuild-wasm) specifically for browser usage. The esbuild approach is attractive because it’s an all-in-one solution (it can bundle modules, transpile TS/JSX, and even minify). The downside is, again, the binary size and init time: the esbuild WASM is about 8.4 MB in size, and it takes roughly 2.2 seconds to initialize in the browser (wasm esbuild checkups - kt3k cosense). After that, transforms are very fast. If we compare performance, esbuild (WASM) is only slightly behind Sucrase’s pure JS on smaller tasks, but on very large codebases it can be faster than SWC in some cases (The Journey To a Faster Sandpack Transpiler - CodeSandbox). Esbuild’s advantage is that it’s a proven bundler as well, meaning our web worker could use esbuild to handle both transpilation and module bundling logic. The trade-off is loading a multi-megabyte .wasm file upfront (which can be mitigated by CDN hosting and caching). The Rollup integration for esbuild in Node (e.g., rollup-plugin-esbuild) cannot be directly used in-browser, but one could write a Rollup plugin that calls the esbuild WASM API. Esbuild’s API supports running in a web worker and even has an official usage example: you must call esbuild.initialize({ wasmURL: '.../esbuild.wasm' }) in the worker before using it (esbuild - API). This means we can keep the main JS bundle small and load the esbuild WASM binary from a CDN or local URL when needed. For instance, a plugin might detect TS/JSX files and defer to esbuild’s transform function in the worker.
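
For reference, here is a sketch of how a worker might initialize and use esbuild-wasm (the unpkg URL is illustrative – in practice you would pin a version; `worker: false` keeps esbuild from spawning a nested worker, since this code already runs inside one):

```js
import * as esbuild from "esbuild-wasm";

let ready = null;

// One-time initialization: fetch and instantiate the multi-megabyte WASM binary.
function ensureEsbuild() {
  ready ??= esbuild.initialize({
    wasmURL: "https://unpkg.com/esbuild-wasm/esbuild.wasm", // pin a version in practice
    worker: false, // we are already inside a Web Worker
  });
  return ready;
}

// Transpile a single TS/TSX source string to plain JavaScript.
export async function transpile(code, filename) {
  await ensureEsbuild();
  const result = await esbuild.transform(code, {
    loader: filename.endsWith(".tsx") ? "tsx" : "ts",
    sourcemap: "inline",
  });
  return result.code;
}
```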

Performance vs size trade-off: In summary, WASM transpilers give near-native speed, which is beneficial if users will compile large amounts of code or do many transpilations. But they add to initial download size significantly (e.g., 8–19 MB for these tools). In a scenario focused on lightweight use (small to medium code snippets, interactive editing), a tuned JS transpiler like Sucrase can actually be faster overall – it’s so optimized for single-thread performance that it beat SWC and esbuild in some real-world tests (transpiling Lodash and even the TypeScript compiler source) (The Journey To a Faster Sandpack Transpiler - CodeSandbox) (The Journey To a Faster Sandpack Transpiler - CodeSandbox). Thus, a possible approach is to use Sucrase by default and consider a WASM fallback for edge cases or future expansion. If maximum performance is desired and a few MB of payload is acceptable, esbuild-wasm is a strong option since it also covers bundling. SWC-wasm, given its size (~19 MB), is harder to justify for in-browser use unless its footprint is reduced or features are selectively compiled out.

Transpiling Vue Single-File Components

Vue SFCs (.vue files) need a specialized compiler. Vue’s build tools normally use @vue/compiler-sfc (which in turn uses template compilers and preprocessors) to transform an SFC into JavaScript. In an in-browser setting, we can use the same package – it’s pure JavaScript and can run in a browser environment. The official Vue SFC Playground (https://sfc.vuejs.org) demonstrates this by compiling SFCs client-side. The SFC compiler will parse the <template>, <script>, and <style> sections. The output is typically an AST or code for the template (render function code) plus the script content. A Rollup plugin for Vue (like rollup-plugin-vue) can be adapted to this environment: it would use @vue/compiler-sfc to parse the component, then possibly invoke a transpiler for the script section (if the script is TypeScript or JSX). In Node, rollup-plugin-vue handles style injection, template compilation, etc., and similar logic can run in the browser. In fact, an experimental project by the Vue i18n author integrates rollup-plugin-vue with an in-browser bundler called Uniroll to compile SFCs on-demand in the browser (GitHub - kazupon/vue-sfc-playground: SFC building on the browser with uniroll). That solution fetches npm packages from esm.sh (a CDN for ESM modules) and uses Rollup in the browser to bundle them, proving that full SFC support is feasible. We could take a similar approach: use CDN-hosted Vue compiler utilities to avoid bundling them. For example, dynamically import https://unpkg.com/@vue/compiler-sfc/dist/compiler-sfc.esm-browser.js inside the worker when a .vue file needs processing. This way, the heavy Vue compiler (~tens of KB minzipped, roughly, for the compiler code) is loaded only if needed. Once loaded, the plugin can compile templates to render functions and combine with the script. For styles, one can inject them into the document (for a preview) or simply include them in the output bundle as text to be applied.
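
A rough sketch of that flow is shown below, assuming the browser build of the compiler is imported from a CDN (URL illustrative) and the component has a <template> plus a plain <script> block (no <script setup>, scoped styles, or preprocessors):

```js
// Lazily load the browser build of the SFC compiler in the worker.
const compilerSfc = await import(
  "https://unpkg.com/@vue/compiler-sfc/dist/compiler-sfc.esm-browser.js"
);

export function compileVueSFC(source, filename) {
  const { descriptor, errors } = compilerSfc.parse(source, { filename });
  if (errors.length) throw errors[0];

  // The script block, taken as-is (it could be transpiled with Sucrase first
  // when lang="ts", as discussed below).
  const script = descriptor.script?.content ?? "export default {}";

  // Compile the template into a module that exports a render function.
  const template = compilerSfc.compileTemplate({
    source: descriptor.template.content,
    filename,
    id: filename, // normally a content hash; also used for scoped CSS
  });

  // Stitch the script and render function into a single ES module.
  return [
    script.replace(/export\s+default/, "const __sfc__ ="),
    template.code, // exports `render`
    "__sfc__.render = render;",
    "export default __sfc__;",
  ].join("\n");
}
```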

One caveat is that the full Vue SFC compiler supports many features (Pug templates, <script setup>, preprocessing languages like TypeScript in script or SCSS in style). Each of those adds complexity. If minimizing bundle size is paramount, you might restrict features. There is even a minimal SFC compiler project that strips out everything except plain HTML/JS/CSS support for Vue components (GitHub - dojyorin/vue_sfc_portable: Minimal Vue SFC compiler for browser.) (GitHub - dojyorin/vue_sfc_portable: Minimal Vue SFC compiler for browser.). This drastically reduces size but at the cost of not supporting <script lang="ts"> or <style scoped> beyond simple cases. In most cases, it’s desirable to at least support TypeScript in Vue components, which means integrating the TS transpiler (e.g., Sucrase or esbuild) on the <script> contents. The Vue compiler can be instructed to leave the script section as is, so the Rollup plugin could first transpile the script with the TS/JSX plugin, then pass the result to the Vue compiler for assembling with the template. This multi-step pipeline can still run entirely in the Web Worker.
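
A small sketch of that ordering, building on the compileVueSFC sketch above (the `descriptor` comes from @vue/compiler-sfc’s parse, and Sucrase is assumed to be available in the worker):

```js
import { transform } from "sucrase";

// Strip TypeScript from a <script lang="ts"> block before the descriptor is
// handed to template compilation and module assembly.
function transpileVueScript(descriptor) {
  const block = descriptor.script;
  if (block && block.lang === "ts") {
    block.content = transform(block.content, {
      transforms: ["typescript"],
    }).code;
    block.lang = undefined; // now plain JavaScript
  }
  return descriptor;
}
```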

In summary, for Vue SFC in-browser: use Vue’s official compiler (loaded from a CDN to save initial size), possibly pair it with a TS transpiler, and encapsulate in a Rollup plugin. This approach is proven by projects like the Vue SFC Playground and uniroll’s Vue integration (GitHub - kazupon/vue-sfc-playground: SFC building on the browser with uniroll). The size overhead is moderate – Vue’s compiler + core runtime is on the order of a few hundred KB (Vue’s runtime with compiler is ~14 KB gzip, and the SFC compiler might add a bit more) (Performance - Vue.js) – which is acceptable if loaded on demand. Performance is also reasonable: compiling a Vue template is fast (tens of milliseconds for typical components), and running this in a worker means the UI stays responsive. The output of the plugin will be plain JavaScript modules that create Vue components, which can then be executed in a preview frame or same thread.

Transpiling Svelte Components

Svelte components (.svelte files) are also compiled away to JavaScript. Svelte provides its compiler as a pure JS module (svelte/compiler). That compiler can run in the browser – the official Svelte REPL on the Svelte website does exactly that. In Svelte’s REPL (which is a Sapper/Svelte app itself), as you type code, a Web Worker compiles the Svelte component to JS and sends it back for preview. Under the hood this can be done with Rollup as well: Svelte’s team demonstrated a custom Rollup plugin in the worker to handle Svelte files (REPLicant, a Super Simple Svelte REPL - Peter Allen at Svelte Summit 2020 - InfoQ) (REPLicant, a Super Simple Svelte REPL - Peter Allen at Svelte Summit 2020 - InfoQ). The plugin’s transform hook invokes Svelte’s compiler on the source to produce a JavaScript module (typically containing a class or set of setup/teardown functions representing the component). This is then bundled or evaluated. Using Rollup’s JS API in the worker allows for resolving imports between Svelte files and bundling them together. Peter Allen’s talk on a minimal Svelte REPL showed that you only need a few plugin hooks (resolveId, load, transform) to make Rollup handle a set of in-memory files and compile Svelte components on the fly (REPLicant, a Super Simple Svelte REPL - Peter Allen at Svelte Summit 2020 - InfoQ) (REPLicant, a Super Simple Svelte REPL - Peter Allen at Svelte Summit 2020 - InfoQ). This approach yields an ES module output which can then be loaded into an iframe or eval’d in the main thread for preview (REPLicant, a Super Simple Svelte REPL - Peter Allen at Svelte Summit 2020 - InfoQ).
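
A condensed version of that setup might look like the following sketch, assuming an in-memory `files` map of path → source and that @rollup/browser and svelte/compiler are available in the worker (e.g. via CDN imports):

```js
import { rollup } from "@rollup/browser";
import { compile } from "svelte/compiler";

// Bundle a set of in-memory files, compiling .svelte files along the way.
export async function bundle(files, entry = "./App.svelte") {
  const build = await rollup({
    input: entry,
    plugins: [
      {
        name: "in-memory-svelte",
        resolveId(source) {
          // Only resolve files we know about; a real plugin would also handle
          // relative paths and npm packages fetched from a CDN.
          return source in files ? source : null;
        },
        load(id) {
          return files[id];
        },
        transform(code, id) {
          if (!id.endsWith(".svelte")) return null;
          const { js } = compile(code, { filename: id });
          return { code: js.code, map: js.map };
        },
      },
    ],
  });

  const { output } = await build.generate({ format: "es" });
  return output[0].code; // an ES module, ready for an iframe or dynamic import
}
```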

To support Svelte in our scenario, we would include (or dynamically load) the Svelte compiler. The Svelte compiler’s size is not trivial – the NPM package is a few MB (including source maps and extras), but the core compiler JS can be a few hundred KB minified. Since this is for development only, size is a concern but not as severe as for production; still, we can load it from a CDN when needed (for example, fetch svelte/compiler from unpkg.com when a .svelte file needs compilation). There are libraries like svelte-browser-import that package the Svelte compiler usage into convenient functions (GitHub - repalash/svelte-browser-import: Import and render svelte files and components directly in the browser without a build step. [Development only]) (GitHub - repalash/svelte-browser-import: Import and render svelte files and components directly in the browser without a build step. [Development only]). That particular library lets you call importSvelte('./Component.svelte') in the browser, internally compiling the file – essentially a pre-made solution extracted from Svelte’s REPL. We could draw inspiration from it or use it directly inside a Rollup plugin’s transform hook.

Performance: Svelte compilation is more intensive than just stripping types – it does code generation for reactive updates – but it’s still quick (typically under 100ms for even medium components). Running it in a worker means the main thread won’t hitch. Memory usage might increase due to the compiler, but modern browsers handle this size of script fine. The key is that using Rollup to orchestrate it gives us flexibility to also bundle any JavaScript imports those Svelte components have (e.g. a component importing another component or a library). Rollup’s tree-shaking isn’t critical in dev, but module resolution is useful. Notably, HMR for Svelte would require integrating the HMR runtime that Svelte provides (svelte-hmr), which could be a future enhancement; for now, a full recompilation on each change (as is done in the REPL) is simpler.

Using CDN and Web Worker Strategies

To minimize bundle size, the heavy lifting libraries (Babel, Sucrase, SWC, esbuild, Vue/Svelte compilers) should be loaded dynamically from a CDN when needed instead of being in the initial bundle. This aligns with the requirement to “use a CDN where possible”. For example:

  • We ship our app with a Rollup browser build and a custom plugin that knows how to load transpilers. When the first .ts or .jsx file is encountered, the plugin can dynamically import the sucrase library (from a CDN like unpkg or jsDelivr) into the worker. Since Sucrase is small (~275 KB) (Browser Support for @rollup/plugin-sucrase · Issue #1620 · rollup/plugins · GitHub), one might even bundle it from the start, but dynamic import keeps the initial load smaller. For a Vue file, the plugin can fetch @vue/compiler-sfc from a CDN, and for a Svelte file, load the svelte/compiler. The web worker can have network access to do this (via ESM imports of CDN URLs or by using fetch + WebAssembly.instantiate for WASM binaries, as in esbuild’s case). Modern browsers support module workers, so we can write the worker script as an ES module and use import() there.

  • Rollup’s browser build (@rollup/browser) helps here. It provides the core bundler logic compiled for the browser. Rollup will not have access to Node’s FS, so our plugin must supply module resolution (e.g., using rollup-plugin-http-resolve or a custom fetcher to get dependencies from a CDN). Projects like Uniroll implement this: they create a virtual file system and use an HTTP resolve plugin to fetch modules from esm.sh CDN (GitHub - mizchi/uniroll: Opinionated universal frontend bundler in browser). In uniroll’s case, when a module is imported, it fetches the module from esm.sh on the fly and feeds it to Rollup. This means our in-browser bundler can pull in library code as needed without pre-bundling it. It’s a good way to keep the app lean, essentially outsourcing dependency hosting to a CDN. We just need to be mindful of latency – possibly prefetching certain common libraries (like React, Vue runtime) if we know they’ll be needed. A minimal sketch of this resolve-and-fetch pattern appears after this list.

  • WASM vs JS loading: For WASM tools like esbuild, one typically provides a URL to the .wasm file. We can host that file on a CDN or our own server. Esbuild’s init allows specifying a URL (esbuild - API), so Rollup’s plugin could call initialize({ wasmURL: '<cdn>/esbuild.wasm' }). The WASM file (8 MB) can be compressed over the network (WASM compresses well, perhaps to ~2 MB gzipped). It will still take a couple seconds to instantiate in the worker (wasm esbuild checkups - kt3k cosense). If this is acceptable for the use case (maybe on first transpile), then esbuild-wasm is viable. Otherwise, sticking to JS (Sucrase) avoids that overhead entirely – Sucrase can transpile immediately with no heavy initialization.
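
Here is a rough sketch of such an HTTP-resolve plugin in the spirit of uniroll’s approach; the esm.sh URL scheme is an assumption, and caching and error handling are omitted:

```js
// Hypothetical plugin: resolve bare imports (e.g. "d3") to esm.sh URLs and
// fetch their source, so Rollup in the worker can bundle npm dependencies
// without a file system.
export function cdnResolve({ cdn = "https://esm.sh" } = {}) {
  return {
    name: "cdn-resolve",
    resolveId(source, importer) {
      if (source.startsWith("http")) return source;
      // Bare module specifier -> CDN URL.
      if (!source.startsWith(".") && !source.startsWith("/")) {
        return `${cdn}/${source}`;
      }
      // Relative or absolute import inside a CDN-hosted module.
      if (importer && importer.startsWith("http")) {
        return new URL(source, importer).href;
      }
      return null; // let other plugins handle local project files
    },
    async load(id) {
      if (!id.startsWith("http")) return null;
      const response = await fetch(id);
      if (!response.ok) throw new Error(`Failed to fetch ${id}`);
      return await response.text();
    },
  };
}
```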

In all cases, focusing on encapsulation as a Rollup plugin means our solution is modular. Each file type (.ts/.jsx, .vue, .svelte) can be handled by a dedicated plugin or one unified plugin with conditional logic. These plugins hide the complexity of fetching compilers or WASM behind their transform hooks. This design also means we can easily swap implementations: for instance, use Sucrase now, but later if a smaller WASM TS compiler emerges, we could integrate that without changing the high-level app structure.

Size and Performance Trade-offs Summary

To make the options clearer, here’s a summary of the tools with their approximate sizes and performance considerations:

  • Babel (@babel/standalone): ~2.8 MB of JS; the most full-featured option, but heavy and relatively slow for an in-browser IDE.
  • Sucrase: ~275 KB of JS; roughly 20× faster than Babel for development transforms; syntax-only, no type checking.
  • TypeScript compiler (ts.transpileModule): several MB of JS; more than an order of magnitude slower than Sucrase for pure transpilation.
  • esbuild (esbuild-wasm): ~8.4 MB WASM (perhaps ~2 MB gzipped over the network); ~2 seconds to initialize, very fast afterwards; also handles bundling and minification.
  • SWC (@swc/wasm-web): ~19 MB WASM; fast transforms, but the download cost is hard to justify in the browser.
  • @vue/compiler-sfc: a few hundred KB of JS; can be loaded from a CDN only when a .vue file is encountered.
  • svelte/compiler: a few hundred KB of minified JS; can likewise be loaded from a CDN on demand.

In practice, a hybrid approach could work well: Use Rollup in the Web Worker (via @rollup/browser) to manage module resolution and bundling, and attach custom plugins for each file type. The plugins can use Sucrase (JS) for JSX/TS, and the official Vue/Svelte compilers (JS) for .vue and .svelte files. This keeps things relatively small and simple: no big WASM payloads upfront, just one small transpiler (Sucrase). All heavy logic is loaded on-demand: e.g. if a user never uses Svelte, they never pay the cost for that compiler. Such a system would meet the criteria of being lightweight while covering all the languages.

Future HMR Support Considerations

Designing with HMR (Hot Module Replacement) in mind means we should preserve module boundaries and enable re-injecting updated code without a full refresh. Both Rollup and esbuild produce output in terms of modules, which is good. We might consider outputting an ES module for each source file and leveraging the browser’s module system or an iframe to load them. In the simple scenario (like the Svelte REPL), when a file changes, the whole bundle is recompiled and reloaded in an iframe (REPLicant, a Super Simple Svelte REPL - Peter Allen at Svelte Summit 2020 - InfoQ) – essentially simulating a full refresh. That’s easier but not true HMR. In the future, we could enhance this by having the worker send module updates to the preview frame. For example, when a single file is edited, we re-transpile just that file (and maybe its dependents) and use import() or a custom HMR runtime to swap the module in the app. Ideas could be borrowed from Vite’s HMR API, svelte-hmr, and Vue’s HMR API. The transpilers we choose should not hinder this – indeed, both Vue’s and Svelte’s compilers can output HMR-friendly code (with extra metadata for state preservation) when configured, and a Rollup plugin could be adjusted to enable that. Ensuring our Rollup plugins mark the boundaries (by file) and don’t concatenate everything helps. Rollup’s plugin system will give us the module graph, so we can track import relationships and reload only the necessary parts.
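
As a very rough illustration of that future direction, the preview frame could apply single-module updates pushed from the worker along these lines; the message shape, `hotCallbacks` registry, and worker script path are all hypothetical conventions, not an existing HMR API:

```js
// Preview-frame side of a minimal hot-update loop (sketch only).
const worker = new Worker("./bundler-worker.js", { type: "module" }); // hypothetical path

// Modules that know how to patch themselves register a callback here.
const hotCallbacks = new Map(); // module id -> (newModule) => void

worker.onmessage = async ({ data }) => {
  if (data.type !== "update") return; // expects { type: "update", id, code }

  // Load the recompiled module via a throwaway Blob URL.
  const url = URL.createObjectURL(
    new Blob([data.code], { type: "text/javascript" })
  );
  const newModule = await import(url);
  URL.revokeObjectURL(url);

  const accept = hotCallbacks.get(data.id);
  if (accept) {
    accept(newModule); // swap the module in place, preserving app state
  } else {
    location.reload(); // no handler registered: fall back to a full refresh
  }
};
```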

In summary, the recommended solution is to use a Rollup-in-Worker setup with custom plugins that delegate to lightweight transpilers: Sucrase for JSX/TS (small and fast (Browser Support for @rollup/plugin-sucrase · Issue #1620 · rollup/plugins · GitHub) (The Journey To a Faster Sandpack Transpiler - CodeSandbox)), and official compilers for Vue and Svelte loaded from a CDN. This keeps the core bundle small and uses the right tool for each job, balancing size and speed. WebAssembly tools like esbuild or SWC can be considered if we need more performance, but they come with large download costs (esbuild ~8 MB (wasm esbuild checkups - kt3k cosense), SWC ~19 MB (Browser Support for @rollup/plugin-sucrase · Issue #1620 · rollup/plugins · GitHub)) and in practice may not be necessary given the efficiency of Sucrase and the relatively limited scope of an in-browser playground. By encapsulating each approach in a Rollup plugin, we maintain clear separation of concerns and can swap out implementations (or add HMR capabilities) without a complete rewrite. This approach has precedent in real projects (CodeSandbox’s Sandpack, StackBlitz, Svelte and Vue’s official REPLs) and meets the criteria of working in a Web Worker, using CDNs for resources, supporting both JS and WASM paths, and minimizing footprint while being future-proof for HMR.

Sources:

  • Browser Support for @rollup/plugin-sucrase · Issue #1620 · rollup/plugins · GitHub
  • Leverage SWC parser to transpile JSX and TypeScript · Issue #5210 · rollup/rollup · GitHub
  • The Journey To a Faster Sandpack Transpiler - CodeSandbox
  • Comparing Babel, Sucrase, and Similar Libraries - Honeybadger.io
  • wasm esbuild checkups - kt3k cosense
  • esbuild - API
  • Performance - Vue.js
  • GitHub - kazupon/vue-sfc-playground: SFC building on the browser with uniroll
  • GitHub - dojyorin/vue_sfc_portable: Minimal Vue SFC compiler for browser
  • GitHub - repalash/svelte-browser-import: Import and render svelte files and components directly in the browser without a build step
  • GitHub - mizchi/uniroll: Opinionated universal frontend bundler in browser
  • REPLicant, a Super Simple Svelte REPL - Peter Allen at Svelte Summit 2020 - InfoQ