Mark L
Following the trpc 11 guide on the app router fails with dynamicIO on Next.js
When using trpc with nextjs@15.0.3-canary.4 and dynamicIO enabled, I get the following error when trying to create a query client.
Error: Route "/" used Date.now() instead of using performance or without explicitly calling await connection() beforehand. See more info here: https://nextjs.org/docs/messages/next-prerender-current-time
at io (node_modules/next/src/server/node-environment-extensions/utils.tsx:33:31)
at now (node_modules/next/src/server/node-environment-extensions/date.tsx:17:9)
at createQueryClient (src/trpc/query-client.ts:5:2)
at createQueryClient (src/trpc/react.tsx:17:28)
at getQueryClient (src/trpc/react.tsx:41:22)
  3 |
  4 | export const createQueryClient = () =>
  5 |   new QueryClient({
    |   ^
  6 |     defaultOptions: {
  7 |       queries: {

Is there any way to solve this on my side? Or is this something that needs to be fixed inside of react-query or trpc?
5 replies
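One direction to try, taken from the error message itself, is to call `await connection()` (from `next/server`) before anything that ends up creating the query client, so that dynamicIO treats the render as request-time work instead of flagging the `Date.now()` call inside `new QueryClient()` during prerendering. Below is a minimal sketch of a page doing that; the `~/trpc/server` import, `HydrateClient`, and the `post.hello` procedure are assumptions based on the tRPC 11 App Router guide. Since the stack trace shows the client also being created from `src/trpc/react.tsx` (the client provider), this may not cover every case, and it opts the route out of prerendering, so it may still be something react-query or tRPC needs to address upstream.

```ts
// app/page.tsx — a sketch only; HydrateClient, the ~/trpc/server import, and
// the post.hello procedure are assumed from the tRPC 11 App Router guide.
import { connection } from "next/server";
import { HydrateClient, trpc } from "~/trpc/server";

export default async function Home() {
  // Explicitly opt this route into request-time rendering so dynamicIO no
  // longer complains about the Date.now() call inside new QueryClient().
  await connection();

  // Prefetch on the server as usual; the query client is only created after
  // connection() has resolved.
  void trpc.post.hello.prefetch({ text: "from tRPC" });

  return (
    <HydrateClient>
      {/* ...client components that consume the prefetched data... */}
    </HydrateClient>
  );
}
```

If the provider in the layout still trips the same check, wrapping that part of the tree in a Suspense boundary is the other pattern dynamicIO expects; failing that, an upstream issue with a reproduction is probably the fastest route.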
How can I cache a single procedure?
I've seen the docs on caching here: https://trpc.io/docs/caching
But that caches the entire app router.
My app router has many "subrouters" (not sure what you call them), and I need different cache times for different procedures in those routers.
For example I have some data that I would only want a 5 minute cache time for, but other data that can be cached for a week with no issues.
How can I apply a specific cache time to a procedure in trpc?
I tried using the paths array in the app-level caching, but it sometimes seems to include both paths that I want long caching for and paths that need short caching, so I don't know how I would be able to use that to change the cache time properly.
For example, I have a component that does multiple trpc queries, where some of those can be cached for a week without problems and others need to be updated at least once a minute.
5 replies
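One way to get per-procedure cache times with the responseMeta approach from the caching docs linked above is to map procedure paths to TTLs and take the smallest TTL of whatever ended up in the batch, so a mixed batch falls back to the shortest (or no) caching. The sketch below assumes the fetch adapter route handler from the App Router setup; the procedure names, TTL values, and import paths are made up.

```ts
// app/api/trpc/[trpc]/route.ts — a sketch building on the responseMeta option
// from the tRPC caching docs. Procedure names, TTLs, and import paths are
// invented for illustration.
import { fetchRequestHandler } from "@trpc/server/adapters/fetch";

import { appRouter } from "~/server/api/root"; // assumed path
import { createTRPCContext } from "~/server/api/trpc"; // assumed context helper

// Hypothetical per-procedure cache times, in seconds.
const CACHE_TTL: Record<string, number> = {
  "catalog.list": 60 * 60 * 24 * 7, // fine to cache for a week
  "stats.recent": 60, // must be fresh within a minute
};

const handler = (req: Request) =>
  fetchRequestHandler({
    endpoint: "/api/trpc",
    req,
    router: appRouter,
    createContext: () => createTRPCContext({ headers: req.headers }),
    responseMeta({ paths, type, errors }) {
      // Only cache successful query batches.
      if (errors.length > 0 || type !== "query" || !paths?.length) return {};

      // A batched request can mix procedures, so use the smallest TTL of
      // everything in the batch; anything without an entry disables caching.
      const ttl = Math.min(...paths.map((path) => CACHE_TTL[path] ?? 0));
      if (ttl <= 0) return {};

      return {
        headers: new Headers([
          [
            "cache-control",
            `public, s-maxage=${ttl}, stale-while-revalidate=${ttl}`,
          ],
        ]),
      };
    },
  });

export { handler as GET, handler as POST };
```

Because batching is what mixes long- and short-lived queries into one response, the other lever is to keep them in separate requests (for example, separate client links or endpoints for the long-cacheable procedures), so each response can carry a single, honest Cache-Control header.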
Can I cache a response server-side?
For my project, one of the trpc requests in my app works as an API gateway to another API.
I make a request to my TRPC server, which makes a request with a secret API key to a 3rd-party API and returns the response to my user.
In my case this 3rd party API has a rate limit that I'm going to be hitting soon.
However, in 90%+ of cases a user is going to make the same API request with the same params, and that data is valid for 24 hours... So ideally I would love to cache that response and return the data to my user if there's a recent match from someone else who made the same query.
I have seen the docs on the HTTP cache headers, but I believe that only works on the client side, right? User A will get a cached response the second time he makes the same request with a proper cache policy, but User B would still end up going through to the API even if he is making the exact same request as User A.
I'm not even sure this is possible on serverless Vercel 🤔 I imagine not without adding an additional system like Redis into the mix?
How would you handle this situation using trpc with Vercel?
3 replies
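HTTP cache headers can do more than per-client caching when there is a shared cache in front of you (an edge cache honouring s-maxage serves everyone hitting the same URL), but if you want full control over the 24-hour window and guaranteed reuse across users, caching inside the procedure with a shared store is the usual approach on serverless, since instance memory isn't shared between invocations. Below is a sketch using Upstash Redis (any HTTP-reachable key-value store, such as Vercel KV, works the same way); the router and procedure names, the upstream URL, and the env variable are invented for illustration.

```ts
// A sketch of caching the 3rd-party response inside the procedure itself,
// keyed by the input params, so User B reuses what User A fetched.
// Names, the upstream URL, and the env variable are hypothetical.
import { Redis } from "@upstash/redis";
import { z } from "zod";

import { createTRPCRouter, publicProcedure } from "~/server/api/trpc"; // assumed path

const redis = Redis.fromEnv(); // reads UPSTASH_REDIS_REST_URL / UPSTASH_REDIS_REST_TOKEN

const ONE_DAY_IN_SECONDS = 60 * 60 * 24;

export const gatewayRouter = createTRPCRouter({
  lookup: publicProcedure
    .input(z.object({ id: z.string() }))
    .query(async ({ input }) => {
      // Build a cache key from the params that make the request unique.
      const cacheKey = `gateway:lookup:${input.id}`;

      // 1. Return a cached response if another user already asked for this.
      const cached = await redis.get<unknown>(cacheKey);
      if (cached !== null) return cached;

      // 2. Otherwise hit the 3rd-party API with the secret key...
      const res = await fetch(
        `https://third-party.example.com/items/${input.id}`,
        { headers: { Authorization: `Bearer ${process.env.THIRD_PARTY_API_KEY}` } },
      );
      if (!res.ok) throw new Error(`Upstream responded with ${res.status}`);
      const data = await res.json();

      // 3. ...and store it for 24 hours, since the data is valid that long.
      await redis.set(cacheKey, data, { ex: ONE_DAY_IN_SECONDS });

      return data;
    }),
});
```

Next's unstable_cache (or an explicitly cached fetch) can achieve something similar without running an extra service; a key-value store just makes the 24-hour TTL and invalidation explicit.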