Nuxt on the Edge
Learn how we made Nuxt 3 capable of running on edge runtimes, enabling server-side rendering close to your users.
Introduction
In September 2017, Cloudflare introduced Cloudflare Workers, making it possible to run JavaScript on their edge network. Your code deploys across the entire edge network, in over a hundred locations worldwide, in about 30 seconds. This technology lets you focus on writing your application while it runs close to your users, wherever they are in the world (~50ms latency).
The Workers runtime is not the same as Node.js or the browser: it executes code using V8, the JavaScript engine developed by Google for Chrome. Until now, what you could run on their platform were small scripts running on the edge before hitting your server, to improve performance or add some logic based on request headers, for example.
In November 2020, while working on Nuxt 3, we made the bet to run Nuxt in production on edge runtimes / V8 isolates.
It unlocks the ability to server-render pages in ~50ms from all over the world when using a platform like Cloudflare Workers, without having to deal with servers, load balancers and caching, for about $0.50 per million requests. As of today, new platforms such as Deno Deploy are emerging to let you run apps on V8 isolates.
The Challenge
In order to make Nuxt run in workers, we had to rewrite some parts of Nuxt to be environment-agnostic (running in Node.js, the browser or V8).
We started with our server and created unjs/h3: a minimal HTTP framework built for high performance and portability. It replaces Connect, which we used in Nuxt 2, while staying compatible with it, so you can keep using Connect/Express middleware. In the worker, for each incoming request, it starts Nuxt in production, passes the request to it and sends back the response.
In Nuxt 2, the time to start the server in production in memory (also known as the cold start) was about ~300ms, because we had to load all the dependencies of your server and application in order to handle the request.
While working on h3, we decided to code-split each handler attached to the server and lazy-load it only when requested. When you start Nuxt 3, we only load h3 in memory; when a request comes in, we load the handler corresponding to the route and execute it.
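The lazy-loading idea can be sketched in a few lines. This is a minimal illustration of the pattern, not h3's actual implementation: each route handler is wrapped in a loader that is awaited only on the first matching request, so startup pays only for the router itself.

```typescript
// Illustrative sketch of lazy-loaded route handlers (the names `lazyHandler`
// and `routes` are hypothetical, not h3/Nitro APIs).
type Handler = (req: { url: string }) => Promise<string> | string

function lazyHandler(load: () => Promise<Handler>): Handler {
  let cached: Promise<Handler> | undefined
  return async (req) => {
    // Load (and cache) the real handler on first use only.
    cached ??= load()
    const handler = await cached
    return handler(req)
  }
}

// Nothing in this table is loaded until a request actually arrives.
const routes: Record<string, Handler> = {
  '/api/hello': lazyHandler(async () => {
    // In a real build this would be a dynamic `import()` of the route's chunk.
    return () => JSON.stringify({ hello: 'world' })
  }),
}
```

Because the loader result is cached, only the very first request to a route pays the import cost; subsequent requests hit the already-loaded handler.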
By adopting this approach, we reduced the cold start from ~300ms to ~2ms.
We had another challenge in order to run Nuxt on the edge: the production bundle size. This includes the server, the Vue app and the Node.js dependencies combined. Cloudflare Workers currently have a size limit of 1MB (free plan) or 5MB ($5 per month plan) per worker.
In order to achieve this, we created unjs/nitro, our server engine. When running the `nuxt build` command, it bundles your whole project and includes all dependencies in the final output. It uses Rollup and vercel/nft to trace the code actually used from `node_modules` and remove unnecessary code. The total size of the generated output for a basic Nuxt 3 application is about 700kB gzipped.
Lastly, to provide the same developer experience between development (Node.js) and production on Cloudflare (Edge runtime), we created unjs/unenv: a library to convert JavaScript code to run everywhere (platform agnostic) by mocking or adding polyfills for known dependencies.
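The core idea behind this kind of environment shimming can be sketched as follows. This is a hedged illustration of the concept, not unenv's actual API: Node-only built-ins are aliased at build time to lightweight mocks so the same bundle can run in a V8 isolate that has no Node built-ins.

```typescript
// Illustrative sketch of build-time module mocking (the names `nodeMocks` and
// `resolveModule` are hypothetical, not part of unenv).
const nodeMocks: Record<string, unknown> = {
  // A lightweight mock for a Node built-in the edge runtime lacks.
  'node:fs': {
    existsSync: () => false,
    readFileSync: () => {
      throw new Error('[mock] fs is not available in this runtime')
    },
  },
}

// A bundler plugin would rewrite `import fs from "node:fs"` to point at the
// mock above; here we simply resolve from the map to show the substitution.
function resolveModule(id: string): unknown {
  return nodeMocks[id] ?? null
}
```

In practice this aliasing happens in the bundler configuration, so application code importing a Node built-in keeps compiling unchanged while the runtime behavior degrades gracefully on the edge.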
At Nuxt, we believe that you should have the freedom to choose the hosting provider that fits you best.
This is why you can deploy a Nuxt application with edge-side rendering on:
- Cloudflare Pages
- Deno Deploy
- Vercel Edge Functions (using Cloudflare Workers under the hood)
- Netlify Edge Functions (using Deno under the hood)
We also support many other deployment providers, including static hosting or traditional Node.js serverless and server hosts.
Pushing Full-stack Capabilities
Now that we have Nuxt running on edge runtimes, we can do more than render a Vue application. Thanks to the server directory, creating an API route is a TypeScript file away.
To add the `/api/hello` route, create a `server/api/hello.ts` file:
```ts
export default defineEventHandler((event) => {
  return {
    hello: 'world'
  }
})
```
You can now universally call this API in your pages and components:
```vue
<script setup>
const { data } = await useFetch('/api/hello')
</script>

<template>
  <pre>{{ data }}</pre>
</template>
```
One important thing to note about useFetch and $fetch: during server-side rendering, if you call your own API routes, Nuxt emulates the request and calls the handler function directly, avoiding an HTTP round trip and reducing the page's rendering time.
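The shortcut can be illustrated with a small sketch. This is a hedged approximation of the idea, not Nuxt's actual implementation; `apiRoutes` and `appFetch` are hypothetical names introduced for illustration.

```typescript
// Sketch: on the server, a call to a local API route becomes a direct
// function call instead of a real network request.
type ApiHandler = () => Promise<unknown> | unknown

// Hypothetical in-memory route registry, as the server would hold one.
const apiRoutes = new Map<string, ApiHandler>([
  ['/api/hello', () => ({ hello: 'world' })],
])

const isServer = typeof window === 'undefined'

async function appFetch(url: string): Promise<unknown> {
  const handler = apiRoutes.get(url)
  if (isServer && handler) {
    // Direct function call: no sockets, no serialization round trip.
    return await handler()
  }
  // In the browser (or for external URLs), fall back to a network fetch.
  const res = await fetch(url)
  return res.json()
}
```

The same `useFetch('/api/hello')` call in a component therefore behaves identically in the browser and during SSR, but the server-side path skips the HTTP layer entirely.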
In terms of developer experience, you will notice that when creating server files, the Nuxt server keeps running without rebuilding the Vue app. This is because Nuxt 3 supports Hot Module Replacement (HMR) when creating API and server routes.
Furthermore, by leveraging an Object-Relational Mapper (ORM) like drizzle-orm, developers can connect to edge and serverless databases such as D1, Turso, Neon, PlanetScale and more.
I created Nuxt Todos Edge, an open source demo to showcase the same code running on different edge platforms and databases. The source code is available on GitHub under the MIT license at atinux/nuxt-todos-edge.
Conclusion
We are excited about edge-side rendering and what it unlocks. Our team at Nuxt can’t wait to see what you will build on top of this!
Feel free to join our Discord server or mention @nuxt_js on Twitter to share your work.