julian • Jan 13, 2025

Implementing Streaming SSR with React Relay and Vite


Introduction

Implementing server-side rendering (SSR) with React Relay and Vite is a complex but crucial process, especially when integrating it with React Router. At Aqora, our quantum computing competition platform (yep, you read that right!), SSR was essential for improving SEO, link previews, and rendering speed. It was the secret sauce that made all the difference.
This guide outlines the challenges we encountered, how we overcame them, and how you can apply similar solutions to your projects. We’ll focus on how the Relay store is handled and the server-to-client data flow, while addressing key architectural decisions, optimizations, and common pitfalls.
The diagram below shows the flow of our React Relay server-side rendering app, with particular attention to the Relay store and the way we populate it from the server to the client.
Flow diagram: server-to-client data flow (flow.png)

Challenges and Solutions

1. Creating a Server Context and Passing Data

To get SSR working smoothly with Relay, we had to set up a server context to pass the right data to the frontend - such as the Relay store, meta tags, and routing information. We achieved this by using createStaticHandler to manage the routes, along with some additional steps like setting up a fetch request and querying the static handler. This process ensures that all the necessary data is ready and in place before the app reaches the browser.
As for createStaticHandler, it's responsible for performing data fetching and handling form submissions on the server (whether in Node or another JavaScript runtime) before the application is rendered with <StaticRouterProvider>. This approach ensures that all route-based data is fetched on the server side, streamlining the rendering process for the client.
import {
  type RenderToPipeableStreamOptions,
  renderToPipeableStream,
} from "react-dom/server";
import { RouteObject } from "react-router-dom";
import {
  createStaticHandler,
  StaticHandlerContext,
} from "react-router-dom/server";
import { routes } from "./routes";
import { RecordSource } from "relay-runtime";
import { Environment } from "react-relay";
import { createEnvironment } from "./environment";
// Helper that converts the incoming Express request into a Fetch API Request
// for the static handler (see the sketch below)
import { createFetchRequest } from "./request";
import express from "express";

interface Context {
  routes: RouteObject[];
  environment: Environment;
  staticHandlerContext: StaticHandlerContext;
  helmetContext: object;
  recordSource: RecordSource;
}

export const createContext = async (
  graphqlUrl: string,
  req: express.Request,
  res: express.Response,
): Promise<Context> => {
  const recordSource = new RecordSource();
  const environment = createEnvironment(graphqlUrl, recordSource);

  const { query, dataRoutes } = createStaticHandler(routes(environment));
  const fetchRequest = createFetchRequest(req, res);
  const staticHandlerContext = await query(fetchRequest);

  if (staticHandlerContext instanceof Response) {
    throw staticHandlerContext;
  }

  return {
    routes: dataRoutes,
    staticHandlerContext,
    environment,
    helmetContext: {},
    // Reuse the RecordSource the environment was created with, so the data
    // fetched during routing ends up in the serialized payload
    recordSource,
  };
};
Alright, here’s the scoop! This code is your backstage pass to server-side rendering (SSR) with React, React Router, and Relay. It sets up a nice little context, grabs data from your GraphQL API, and gets all the routes in order before your app even reaches the browser. Think of it like prepping a meal - we gather all the ingredients (data and routes) on the server, cook it up just right, and then serve it to the client, nice and fresh. All so your React app loads faster than you can say "SSR to the rescue!"
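One helper the snippet above references without showing is createFetchRequest, which turns the incoming Express request into a Fetch API Request that the static handler's query() can consume. Below is a minimal sketch of that helper, closely following the adapter suggested in React Router's SSR guide; the ./request module name and the exact header and abort handling are assumptions rather than code prescribed by React Router:
// request.ts - convert an Express request into a Fetch API Request
// so it can be passed to the static handler's query() function
import type express from "express";

export function createFetchRequest(
  req: express.Request,
  res: express.Response,
): Request {
  const origin = `${req.protocol}://${req.get("host")}`;
  const url = new URL(req.originalUrl || req.url, origin);

  // Abort the in-flight query if the client disconnects
  const controller = new AbortController();
  res.on("close", () => controller.abort());

  // Copy the incoming headers into a Fetch API Headers object
  const headers = new Headers();
  for (const [key, values] of Object.entries(req.headers)) {
    if (values === undefined) continue;
    if (Array.isArray(values)) {
      for (const value of values) headers.append(key, value);
    } else {
      headers.set(key, values);
    }
  }

  const init: RequestInit = {
    method: req.method,
    headers,
    signal: controller.signal,
  };

  if (req.method !== "GET" && req.method !== "HEAD") {
    init.body = req.body;
  }

  return new Request(url.href, init);
}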

2. React Router with SSR

When integrating React Router and Relay into our SSR setup, we encountered issues with React's Suspense interacting with server-side rendering. Traditional SSR methods like renderToString couldn’t handle asynchronous rendering properly, resulting in incomplete or incorrect renders.
Solution: We used renderToPipeableStream along with the onAllReady callback to ensure that Suspense boundaries were fully resolved before sending the final HTML to the client. Here is the relevant code:

export function render(
  { environment, routes, staticHandlerContext, helmetContext }: Context,
  options: RenderToPipeableStreamOptions,
) {
  return renderToPipeableStream(
    <React.StrictMode>
      <HelmetProvider context={helmetContext}>
        <ErrorBoundary fallback={<div>Something went wrong!</div>}>
          <RelayEnvironmentProvider environment={environment}>
            <ServerRouter routes={routes} context={staticHandlerContext} />
          </RelayEnvironmentProvider>
        </ErrorBoundary>
      </HelmetProvider>
    </React.StrictMode>,
    options,
  );
}
This setup ensures the proper handling of asynchronous elements, like meta tags and data fetching, before the HTML is sent to the client.
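The <ServerRouter> used above is not a React Router export; think of it as a thin wrapper around createStaticRouter and <StaticRouterProvider>. Here's a minimal sketch of what such a wrapper can look like (the props mirror the render call above, while the file itself is an assumption):
// ServerRouter.tsx - a thin wrapper around React Router's static router
import React from "react";
import { RouteObject } from "react-router-dom";
import {
  createStaticRouter,
  StaticRouterProvider,
  StaticHandlerContext,
} from "react-router-dom/server";

interface ServerRouterProps {
  routes: RouteObject[];
  context: StaticHandlerContext;
}

export function ServerRouter({ routes, context }: ServerRouterProps) {
  // Pair the data routes with the context produced by the static handler's query()
  const router = createStaticRouter(routes, context);
  return <StaticRouterProvider router={router} context={context} />;
}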

3. Preloaded Queries and Relay Environment

One of the big challenges we faced was managing preloaded queries on the server and making sure the Relay store data got passed cleanly to the client. If this wasn’t handled right, the client would go rogue, refetching data unnecessarily, which could lead to mismatches and caching nightmares.
Solution: We used preloadQuery to fetch data on the server and then serialized the Relay store into the HTML. When the client-side app hydrates, the Relay environment reuses this data, keeping everything in sync and preventing unnecessary refetches.
Let’s break down the pieces that make this work:
  1. Caching the RecordSource on the Server (for the client): On the server, we build the RecordSource in the context and then serialize it into the HTML response. Check out this bit in server.js where it's added to the response:
const { recordSource } = context;
res.write(`<script>window.__RECORD_SOURCE=${JSON.stringify(recordSource.toJSON())}</script>`);
  2. Loading the RecordSource on the Client: On the client, we load that serialized data into the app to avoid fetching it all over again. In main.tsx, we pull the data into the environment like so:
// main.tsx
import { RecordSource } from "relay-runtime";
import { createEnvironment } from "./environment";

interface InjectedWindow extends Window {
  // eslint-disable-next-line @typescript-eslint/no-explicit-any
  __RECORD_SOURCE: any;
}

// Rehydrate the Relay store from the data the server serialized into the page
const recordSource = new RecordSource(
  (window as unknown as InjectedWindow).__RECORD_SOURCE,
);
const environment = createEnvironment(
  "http://localhost:8080/graphql",
  recordSource,
);
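Both the server-side createContext and this client snippet lean on createEnvironment, which we haven't shown. Here's a minimal sketch of what such a factory can look like, wiring a fetch-based network layer and a Store around the RecordSource that is passed in; the fetch options and error handling are deliberately simplified assumptions, not the exact code from our repo:
// environment.ts - a sketch of a Relay environment factory shared by server and client
import {
  Environment,
  Network,
  RecordSource,
  Store,
  type FetchFunction,
} from "relay-runtime";

export function createEnvironment(
  graphqlUrl: string,
  recordSource: RecordSource,
): Environment {
  // The fetch function receives the compiled query text and its variables
  const fetchQuery: FetchFunction = async (operation, variables) => {
    const response = await fetch(graphqlUrl, {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ query: operation.text, variables }),
    });
    return response.json();
  };

  return new Environment({
    network: Network.create(fetchQuery),
    // The Store wraps the RecordSource we serialize on the server
    // and rehydrate on the client
    store: new Store(recordSource),
  });
}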
Here's a simplified example of how a preloaded query can be defined (not specific to the SSR setup):
const preloadedData = preloadQuery(environment, MyQuery, { id: '123' });

function App() {
  return (
    <MyComponent preloadedQuery={preloadedData} />
  );
}
Hehe, this is cool, right? Now we can use createStaticHandler to load GraphQL queries based on the route, grab the data from the server, and send it right over to the client. Once it gets there, the Relay store is already hydrated and ready to go! This setup, with preloadQuery fetching data during SSR and rehydrating it on the client, ensures everything stays fast and efficient - keeping your app snappy and your cache happy. Easy peasy!
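To make the route-based loading concrete, here's a sketch of how a route loader can kick off a Relay query, so that the static handler's query() call fills the store during SSR. The routes(environment) shape matches the createContext snippet above, but CompetitionQuery, CompetitionPage, and the loadQuery-in-a-loader pattern are illustrative assumptions rather than the exact code from our repo:
// routes.tsx - a sketch of route-based query preloading
import { RouteObject } from "react-router-dom";
import { loadQuery } from "react-relay";
import { Environment } from "relay-runtime";
import CompetitionQuery from "./__generated__/CompetitionQuery.graphql";
import CompetitionPage from "./CompetitionPage";

export const routes = (environment: Environment): RouteObject[] => [
  {
    path: "/competitions/:slug",
    // The loader runs on the server during query() and on the client during
    // navigation, so the Relay store gets populated either way
    loader: ({ params }) =>
      loadQuery(environment, CompetitionQuery, { slug: params.slug ?? "" }),
    element: <CompetitionPage />,
  },
];
The page component can then pick the preloaded reference up with React Router's useLoaderData and render it with usePreloadedQuery, which suspends until the data lands in the store.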

4. Handling Meta Tags with React Helmet

All this effort, just for SEO? You bet! But it’s crucial, so let’s dive into how we handle meta tags!
Solution: We used Helmet along with HelmetProvider to manage meta tags seamlessly on both the server and client. By rendering Helmet tags on the server and embedding them directly into the HTML, we made sure all the important meta info (like titles and descriptions) was already in place when the client received the page.
On the client side, it's super simple! Since we've already defined a HelmetProvider at the top of the app, we can just use react-helmet-async smoothly, like this:
import { Helmet } from "react-helmet-async";

const preloadedData = preloadQuery(environment, MyQuery, { id: '123' });

function App() {
  return (
    <>
      <Helmet>
        <title>My App</title>
        <meta name="description" content="This is my app description" />
      </Helmet>
      <MyComponent preloadedQuery={preloadedData} />
    </>
  );
}
On the server, it’s not too complicated either! We just grab the meta data from the context we set up earlier and write it into the response:
const { helmet } = context.helmetContext;
res.write(helmet.title.toString());
res.write(helmet.meta.toString());
res.write(helmet.link.toString());
Remember that context we defined in the first step? We read from it when calling the render function in server.js. After reading, we simply pull the meta data out of the helmet context and write it into the response that the server sends to the client.
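For completeness, here's a sketch of how main.tsx can finish the job: taking the environment we rehydrated in step 3 and mounting the app with the same provider stack the server used. The createBrowserRouter call and the #root mount point are assumptions based on a standard Vite + React entry file:
// main.tsx (continued) - hydrate the app with the providers used on the server
import React from "react";
import ReactDOM from "react-dom/client";
import { HelmetProvider } from "react-helmet-async";
import { RelayEnvironmentProvider } from "react-relay";
import { createBrowserRouter, RouterProvider } from "react-router-dom";
import { routes } from "./routes";

// `environment` is the Relay environment rehydrated from window.__RECORD_SOURCE earlier in this file
const router = createBrowserRouter(routes(environment));

ReactDOM.hydrateRoot(
  document.getElementById("root") as HTMLElement,
  <React.StrictMode>
    <HelmetProvider>
      <RelayEnvironmentProvider environment={environment}>
        <RouterProvider router={router} />
      </RelayEnvironmentProvider>
    </HelmetProvider>
  </React.StrictMode>,
);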

5. Final Integration in server.js

The last step was to tie it all up in server.js (finally, right? \o/). This is where we connected the GraphQL proxy, the server context, and the rendering logic all in one place. Here’s how it looks:
import express from "express";
import proxy from "express-http-proxy";
import { Transform } from "node:stream";
import { createContext, render } from "./entry-server";

const app = express();
const GRAPHQL_URL = "https://swapi-graphql.netlify.app/.netlify/functions/index";

// Simplified HTML shell split around the streamed app markup; in a real Vite
// setup these pieces typically come from the built index.html template
const htmlStart = "<!DOCTYPE html><html><head>";
const bodyStart = '</head><body><div id="root">';
const htmlEnd = "</div></body></html>";

// Set up the GraphQL proxy
const graphqlProxy = proxy(GRAPHQL_URL, {
  proxyReqPathResolver: () => "/graphql",
});

app.use("/graphql", graphqlProxy);

// Handle all other requests
app.get("*", async (req, res) => {
  let didError = false;
  try {
    const context = await createContext(GRAPHQL_URL, req, res);
    const { pipe, abort } = render(context, {
      onShellError() {
        res.status(500);
        res.set({ "Content-Type": "text/html" });
        res.send("<h1>Something went wrong</h1>");
      },
      onAllReady() {
        res.status(didError ? 500 : 200);
        res.set({ "Content-Type": "text/html" });

        const transformStream = new Transform({
          transform(chunk, encoding, callback) {
            res.write(chunk, encoding);
            callback();
          },
        });

        res.write(htmlStart);

        // Inject the Helmet meta tags into the HTML
        const { helmet } = context.helmetContext;
        if (helmet) {
          res.write(helmet.title.toString());
          res.write(helmet.priority.toString());
          res.write(helmet.meta.toString());
          res.write(helmet.link.toString());
          res.write(helmet.script.toString());
        }

        // Send the Relay store data to the client
        const { recordSource } = context;
        res.write(
          `<script>window.__RECORD_SOURCE = ${JSON.stringify(recordSource.toJSON())}</script>`
        );

        res.write(bodyStart);

        // End the response when the stream is done
        transformStream.on("finish", () => {
          res.end(htmlEnd);
        });

        pipe(transformStream);
      },
      onError(error) {
        didError = true;
        console.error(error);
      },
    });
  } catch (error) {
    res.status(500).send("<h1>Server Error</h1>");
  }
});

app.listen(3000, () => {
  console.log("Server is running on http://localhost:3000");
});
And voilà! We brought everything together in server.js. We set up a GraphQL proxy to forward API requests, created the server context using createContext to gather all the necessary data, and used the render function to stream the HTML back to the client. During rendering, we injected meta tags (like titles and descriptions) using Helmet, and we also sent the preloaded Relay data to the client, so it’s ready to use without refetching. This setup ensures the app is streamed efficiently, with all the data and meta info in place for both SEO and performance!

Conclusion

Our SSR implementation using React Router, Relay, and Vite tackled several challenges: managing Suspense with SSR, handling data serialization, and ensuring meta tags were generated correctly. With these solutions, you can implement SSR in a way that boosts your app’s SEO, performance, and user experience.
For the full code and further details, visit our GitHub repository. Contributions and questions are welcome!