tRPC

TRPC ratelimiting endpoints

rustclan 5/2/2023
I am currently having some problems with a race condition in my tRPC Next.js API. Essentially, I have an enforceGuildPermissions middleware, which checks whether the user making the request has permission to get the data for that guild. The data is stored in my Redis cache for 3 seconds. This works okay sometimes, but there are 3-4 different tRPC requests running for a single page (guild, role and channel), and because they all run concurrently, the last request (channel) gets rate limited by the Discord API; the caching code doesn't get a chance to update the cache before the next request runs.

Middleware route procedure:
const enforceGuildPermissions = enforceUserLoggedIn.unstable_pipe(
  async ({ ctx, next, rawInput }) => {
    const guildId: unknown = (rawInput as { guildId?: unknown })?.guildId;
    if (!guildId) throw new TRPCError({ code: 'BAD_REQUEST' });

    const webUser = await cache.webUsers.get(ctx.session.user.id);
    let guilds = webUser?.guilds;
    if (!guilds) {
      guilds = await getUserGuilds(ctx.session);
    }

    if (!guilds) throw new TRPCError({ code: 'UNAUTHORIZED' });

    // if the user is not in the guild return unauth
    const foundGuild = guilds.find((guild) => guild.id === guildId);
    if (!foundGuild) throw new TRPCError({ code: 'UNAUTHORIZED' });

    return next({
      ctx: {
        session: { ...ctx.session, user: ctx.session.user },
      },
    });
  }
);

export const guildProcedure = t.procedure.use(enforceGuildPermissions);
The function which needs to be rate limited (using the Redis cache):
export const getUserGuilds = async (
  session: Session
): Promise<CachedUserGuild[] | null> => {
  if (!session.user.accessToken || !session.user.id) return null;

  const webUser = await cache.webUsers.get(session.user.id);
  if (webUser) return webUser.guilds;

  const response = await fetch(discord...);
  const guilds = await response.json();
  if (!response.ok || guilds.length <= 0) return null;

  // add guilds to cache
  await cache.webUsers.create(session.user.id, guilds);

  return guilds;
};
A big band-aid fix would be to just add an artificial wait:
if (!guilds) {
  await new Promise((resolve) => setTimeout(resolve, 1000));
  webUser = await cache.webUsers.get(ctx.session.user.id);
  guilds = webUser?.guilds;
  if (!guilds) {
    throw new TRPCError({ code: 'UNAUTHORIZED' });
  }
}
But obviously this is not very elegant.

bump
nlucas 5/3/2023
This isn't really a tRPC problem; caching and rate limiting are hard. Probably simplest to recognise rate limits and retry the task, though.
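
(A minimal sketch of the "recognise rate limits and retry" idea: if Discord answers 429, wait out the advertised Retry-After and try again. The helper name and retry budget are illustrative, not from the thread.)

// Sketch only: retry a fetch when the response is 429, backing off for the
// duration the API advertises in the Retry-After header (falling back to 1s).
const fetchWithRetry = async (
  url: string,
  init: RequestInit,
  retries = 3
): Promise<Response> => {
  for (let attempt = 0; attempt < retries; attempt++) {
    const response = await fetch(url, init);
    if (response.status !== 429) return response;

    const retryAfterSeconds = Number(response.headers.get('retry-after') ?? 1);
    await new Promise((resolve) => setTimeout(resolve, retryAfterSeconds * 1000));
  }
  // Final attempt; let the caller handle a remaining 429.
  return fetch(url, init);
};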
rustclan 5/3/2023
Hi, thank you for the response, but I'm not sure I follow. Since this is hosted on serverless infrastructure (Vercel), I don't think it's possible to use an in-memory cache to stop this from happening? I may be misunderstanding your solution though.
rustclan 5/6/2023
Thank you very much, when I get some time I shall look at this. @alex / KATT hi, I had time to do this now. Thank you SO much! This issue was slowing my site down a lot, and now it is fast once again! But just so I understand, why does this solution not scale? You say I may have to deal with the promise in Redis? Why is using memo not okay?
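
(The fix being referenced here isn't quoted in the thread. A rough sketch of what in-flight promise memoisation for getUserGuilds could look like, assuming a module-level Map; the Map and the wrapper name are illustrative:)

// Sketch only: dedupe concurrent callers by memoising the in-flight promise per user.
// `inFlight` lives in module scope, so this only helps within a single server instance.
const inFlight = new Map<string, Promise<CachedUserGuild[] | null>>();

export const getUserGuildsMemo = (
  session: Session
): Promise<CachedUserGuild[] | null> => {
  const userId = session.user.id;
  const existing = inFlight.get(userId);
  if (existing) return existing;

  const promise = getUserGuilds(session).finally(() => {
    // Drop the entry once settled so later requests re-check the Redis cache.
    inFlight.delete(userId);
  });
  inFlight.set(userId, promise);
  return promise;
};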
nlucas 5/6/2023
I haven’t looked at the solution, but (as a thought experiment) what happens if you have 2, 3, 15 parallel instances of the API running? (Horizontal scaling)
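
(A rough sketch of the cross-instance variant hinted at above, i.e. moving the deduplication into Redis: a short-lived lock so only one instance calls Discord while the others poll the shared cache. This assumes an ioredis client named redis; the key name, TTL and cache helpers are illustrative:)

// Sketch only: take a short-lived lock in Redis before calling Discord. The
// instance that wins the lock fetches; everyone else polls the shared cache.
const getUserGuildsDistributed = async (
  session: Session
): Promise<CachedUserGuild[] | null> => {
  const cached = await cache.webUsers.get(session.user.id);
  if (cached) return cached.guilds;

  // SET NX EX: succeeds only if no other instance already holds the lock.
  const lock = await redis.set(`guilds-lock:${session.user.id}`, '1', 'EX', 3, 'NX');
  if (lock === 'OK') return getUserGuilds(session);

  // Another instance is fetching: poll the cache briefly instead of hitting Discord.
  for (let i = 0; i < 10; i++) {
    await new Promise((resolve) => setTimeout(resolve, 200));
    const refreshed = await cache.webUsers.get(session.user.id);
    if (refreshed) return refreshed.guilds;
  }
  return null;
};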
rustclan 5/6/2023
Hmm, yes okay, I see what you are saying. This depends on how Vercel runs Next.js API endpoints; I'm not entirely sure. For now, this is working just fine, but I should look into this just in case. Thank you very much for the help 🙂
