Simple cache layer for NextJS
Hi there! Welcome back after a short break. It has been a busy time for me recently and I did not have time to write anything meaningful. Sorry for that! Today I want to share with you a real case I ran into recently while developing a Next.js based app (Next 13+ with the app router).
Next.js 13 was a significant release. Some people say it's like the transition from AngularJS to Angular 2 (which was practically a completely new framework). To be honest, I don't have much experience with earlier versions of Next.js. I know some basics, and I had the opportunity to work with it on one commercial project, but I didn't delve into its details at that time because the project itself was in bad shape, and I spent most of my time trying to understand the dependencies within the entire system. Back then, I mostly knew that Next.js was considered a front-end framework.
Last year, I finally found some time to get acquainted with Next.js. I completed two projects (following along with YouTube tutorials) and also enrolled in a course (which has just ended). At the same time, at my current job, I decided to implement one of our services using Next.js 13. Now I have quite substantial knowledge on this topic, and I can share my thoughts with you.
OK, let's start. There are two important things you have to know about Next 13+:
- it renders Server Components by default (unless you mark a specific component with the 'use client' directive - see the snippet below)
- it applies a strong caching policy.
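For reference, opting a component into client-side rendering is just a matter of putting the directive at the top of the file. Below is a minimal, hypothetical example (this component is not part of my app):

// app/components/counter.tsx - hypothetical client component
'use client'

import { useState } from 'react'

export default function Counter() {
  const [count, setCount] = useState(0)
  // State and event handlers only work in Client Components
  return <button onClick={() => setCount(count + 1)}>Clicked {count} times</button>
}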
People say that in programming there are two hard things: cache invalidation and naming things. Since Next.js takes care of half of those problems for us, it's already a big relief. I thought so too, until I started implementing the API in my application 🤣 (in my concrete scenario the Next-based app is used as a standalone web application, but it also exposes an API).
The way you define your API is rather simple. You build a folder structure under app/api. Each nested folder corresponds to a fragment of the endpoint path. The final piece is a route.ts (or route.js) file. You can also use dynamic segments by wrapping a folder name in square brackets, or catch-all segments by adding three dots (e.g. [...param]). Inside the file itself you define a function named after an HTTP verb - you can use one of: GET, POST, PUT, PATCH, DELETE, HEAD, and OPTIONS.
// app/api/data/route.ts
import { NextRequest, NextResponse } from 'next/server'

export async function GET(_: NextRequest) {
  return NextResponse.json({ data: [] })
}
Now you can call your endpoint e.g. with cURL:
curl -X GET http://localhost:3000/api/data
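For completeness, a handler under a catch-all segment receives the dynamic parts via its second argument. Here is a minimal sketch, assuming a hypothetical app/api/items/[...slug]/route.ts folder (not part of my actual app):

// app/api/items/[...slug]/route.ts - hypothetical catch-all route
import { NextRequest, NextResponse } from 'next/server'

export async function GET(
  _: NextRequest,
  { params }: { params: { slug: string[] } }
) {
  // For GET /api/items/a/b the slug array is ['a', 'b']
  return NextResponse.json({ slug: params.slug })
}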
So far, everything is good. Now we want to populate our data with something meaningful. In my scenario, I've built an abstraction over the ClickHouse client, which retrieves the required data for me.
But in general, at this point, you might want to fetch data from any source (such as another endpoint, Redis, PostgreSQL, etc.). Everything went well until I pushed my change to the remote, which triggered the CI process, where a Docker image is built (for such a scenario you have to set output to 'standalone' inside next.config.js). The build failed (on my local machine it passed). A bit of digging, and I stumbled upon this thread. Next tries to pre-render API routes, but in CI the environment variables are not defined yet... (they are passed later, when the image is launched).
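For reference, this is how that standalone option looks in next.config.js (a minimal sketch; a real config will likely contain more than this):

// next.config.js
/** @type {import('next').NextConfig} */
const nextConfig = {
  // Produces a self-contained build under .next/standalone, suitable for a Docker image
  output: 'standalone',
}

module.exports = nextConfig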
Strange. You could try building an abstraction over your database client, which during the build phase becomes a mock, but that requires some serious gymnastics and strong nerves.
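Just to illustrate the idea, such an abstraction could look roughly like this (my own sketch, not something I actually shipped; the BUILD_PHASE variable is hypothetical and would have to be set explicitly in the Docker build stage):

// data-client.ts - a sketch of a client that degrades to a mock at build time
export interface DataClient {
  getData(): Promise<unknown[]>
}

// Used only while `next build` runs, so no real connection is opened
class MockClient implements DataClient {
  async getData() {
    return []
  }
}

// Stand-in for the real ClickHouse wrapper; in practice it would run the actual query
class ClickHouseClient implements DataClient {
  async getData() {
    return [] // real implementation omitted
  }
}

export function createClient(): DataClient {
  // BUILD_PHASE is a hypothetical env variable set in the Dockerfile build stage
  return process.env.BUILD_PHASE === 'true' ? new MockClient() : new ClickHouseClient()
}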
Another solution to this problem was suggested in the thread I mentioned. You can override the build step in package.json:
"build": "next build --build-mode=experimental-compile"
This will cause a complete cache bypass. So, we're back to square one. Let's take a break and think about what we want to achieve. Our requirements:
- the cache should be valid for a predefined amount of time
- if the cache is not valid, data is returned from the DB
Consider the function below.
// Returns a function that caches the result of `dataFetcher` in memory
// for `cacheMaxAgeMillis` milliseconds.
function simpleCache<T>(
  cacheMaxAgeMillis: number,
  dataFetcher: () => Promise<T>
) {
  let cache: T
  let lastReadTime: number

  return async () => {
    const now = Date.now()
    // Refresh on the first call or when the cached value is older than the max age
    if (!lastReadTime || now - lastReadTime > cacheMaxAgeMillis) {
      lastReadTime = now
      cache = await dataFetcher()
    }
    return cache
  }
}
Just a dozen lines of code, and the logic is super simple. You pass the maximum cache age and a function that reads data from any source. If the cached data is stale, it's overridden by data from the source. Otherwise, it's returned from memory. Now we have to update our route handler.
import { NextRequest, NextResponse } from 'next/server'

// `client` is the abstraction over the ClickHouse client mentioned earlier,
// and `simpleCache` is the helper defined above.
const QUARTER_IN_MS = 15 * 60 * 1000
const getData = simpleCache(QUARTER_IN_MS, client.getData.bind(client))

export async function GET(_: NextRequest) {
  const cachedData = await getData()
  return NextResponse.json({ cachedData })
}
Voilà! The first user who hits this endpoint will actually read the DB and prime the cache. For the next 15 minutes, each subsequent user will receive data from the cache. After that, the next user will refill the cache, and so on in a cycle. The block diagram for such a solution looks as follows:
Of course, this won't be a good solution for every case (for example, I'm thinking about passing dynamic arguments to the DB query here). But in cases where data remains unchanged within a certain time interval and is displayed in the same form for all users, it's worth considering this solution to speed up the response from your endpoint. In the case of analytical databases (such as ClickHouse) where a query can scan millions of records at once (thus consuming significant resources), and the data is inherently static, this approach seems to be a good solution.
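If you do need dynamic arguments, one possible extension (my own sketch, not part of the solution above) is to keep one cache entry per serialized argument list:

// A keyed variant of simpleCache: one entry per distinct argument list.
// Note: entries are never evicted, so this assumes a small, bounded set of keys.
function keyedCache<A extends unknown[], T>(
  cacheMaxAgeMillis: number,
  dataFetcher: (...args: A) => Promise<T>
) {
  const entries = new Map<string, { value: T; lastReadTime: number }>()

  return async (...args: A): Promise<T> => {
    const key = JSON.stringify(args)
    const now = Date.now()
    const entry = entries.get(key)
    if (!entry || now - entry.lastReadTime > cacheMaxAgeMillis) {
      const value = await dataFetcher(...args)
      entries.set(key, { value, lastReadTime: now })
      return value
    }
    return entry.value
  }
}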
That's all I wanted to share with you today. To be honest, I don't fully understand this architectural decision of Next.js - but you have to live with it. Watching a certain YouTube video yesterday, I heard a very wise sentence: if you have something to do, don't look for excuses not to do it, look for a way (the wording may have been different, but the meaning remains). That's exactly the path I took for the problem I had. Remember this in your daily tasks!