Can I cache a response server-side?
In my project, one of the tRPC procedures in my app acts as an API gateway to another API.
I make a request to my tRPC server, which makes a request with a secret API key to a 3rd-party API and returns the response to my user.
In my case, this 3rd-party API has a rate limit that I'm going to be hitting soon.
However, in 90%+ of cases a user is going to make the same API request with the same params, and that data is valid for 24 hours. So ideally I would love to cache that response and return the data to my user if there's a recent match from someone else who made the same query.
I have seen the docs on the HTTP response headers, but I believe that only works client-side, right? User A will get a cached response the second time he makes the same request with a proper cache policy, but User B would still end up going through to the API even if he is making the exact same request as User A.
I'm not even sure this is possible on serverless Vercel 🤔 I imagine not without adding an additional system like Redis into the mix?
How would you handle this situation using tRPC with Vercel?
2 Replies
Response Caching | tRPC
The below examples use Vercel's edge caching to serve data to your users as fast as possible.
HTTP headers work on Vercel — the edge cache is a shared cache, so a response cached for User A is also served to User B.
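To make that concrete: the key is that `s-maxage` targets shared caches (like Vercel's edge), not the browser, so every user hits the same cached copy. Here's a minimal sketch of building that header value — `edgeCacheControl` is a hypothetical helper name for illustration, not a tRPC or Vercel API:

```typescript
// Sketch: build a Cache-Control value for Vercel's edge cache (a shared
// cache), so User B gets User A's cached response server-side.
// `edgeCacheControl` is a made-up helper name, not part of any library.
function edgeCacheControl(sMaxAgeSeconds: number, swrSeconds: number): string {
  // s-maxage applies to shared caches only; stale-while-revalidate lets
  // the edge keep serving stale data while it refetches in the background.
  return `public, s-maxage=${sMaxAgeSeconds}, stale-while-revalidate=${swrSeconds}`;
}

// The 3rd-party data is valid for 24 hours, so cache for a day.
const ONE_DAY_SECONDS = 60 * 60 * 24;
console.log(edgeCacheControl(ONE_DAY_SECONDS, 60));
// → public, s-maxage=86400, stale-while-revalidate=60
```

In the tRPC docs' Response Caching example, a value like this is returned from the adapter's `responseMeta` option (as a `cache-control` entry under `headers`), which attaches it to the HTTP response the edge cache sees — no Redis needed as long as the edge-cache semantics (cache key = URL, public data only) fit your use case.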