How to improve TypeScript performance?
The inferred type of this node exceeds the maximum length the compiler will serialize. An explicit type annotation is needed.ts(7056)
tRPC has long been my favorite tool to use. However, as time has passed, my application has grown slightly bigger, and now I cannot see how it will be possible for me to continue using it... I need a way for tRPC to be scalable, to allow for the creation of hundreds of endpoints without constantly worrying about my editor performance. But... it's getting increasingly hard to do this. So much so that I am considering moving off it to something more scalable. I heard that there are ways to break up tRPC so that it stays performant for TypeScript, but how? I saw that KATT had a comment here: https://github.com/trpc/trpc/discussions/3574 saying that it is possible to break up tRPC into multiple packages, but it's not clear to me how. I am using Turborepo, so I believe this would be possible.
you can do a hack and split up the inference by router https://gist.github.com/KATT/af6079e6fcc31f292776266dd76dadbb
a better solution is to do a monorepo where ts is precompiled on save
any articles on this?
Hi @Alex / KATT 🐱, I hadn't seen your reply. Thanks for the link. I want to actually get this working on my repo, and if it works, I even want to create a video/article about it.
I use a monorepo setup from julius's create t3 turbo
Trying to figure out what this is doing exactly. If you could share a repo with this implemented, it would help a lot, because my file structure is different from what the template/generator expects
Keen to watch/read (and reshare) that video/article
Yeah, but I think that… if I am already using ct3turbo
And in there they already statically generate the dist folder with the types, then is there a way to make it even better?
Because you said "a better solution is to do a monorepo where ts is precompiled on save"
Not sure if it's possible to make it even faster? @Alex / KATT 🐱
i don't have time to look into this right now and i'll be out next week, but do some experimentation with a smaller project
something that'd be better than a dist folder i think would be to set up project references across the packages
(edit: doesn't seem true based on what Andarist said in the X thread linked below)
moonrepo has a way of automatically syncing project references
should be possible to make a script for that in turbo/nx
https://moonrepo.dev/docs/config/toolchain#syncprojectreferencestopaths
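For reference, synced project references in a package's tsconfig would look roughly like this (package names follow the illustrative layout discussed below; the exact options depend on the repo):

```jsonc
// packages/trpc-api/tsconfig.json — illustrative sketch
{
  "compilerOptions": {
    "composite": true,        // required for a package others reference
    "declaration": true,
    "declarationMap": true,   // keeps jump-to-definition working across packages
    "outDir": "dist"
  },
  "references": [
    { "path": "../trpc-posts" },
    { "path": "../trpc-users" }
  ]
}
```

`tsc --build` then walks the reference graph and only rebuilds packages whose inputs changed.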
have a look at matt's response in this thread https://x.com/alexdotjs/status/1814213219419881876
the main difference to t3 turbo is that it doesn't use `tsc --watch` at all
but t3 turbo should already be fast, not tried that setup on a huge project to be confident that the TS stays fast and that you don't get a lot of issues like "The inferred type of "X" cannot be named without a reference to "Y". This is likely not portable. A type annotation is necessary."
Yeah, right now my concern is that it is causing this error:
The inferred type of this node exceeds the maximum length the compiler will serialize. An explicit type annotation is needed.ts(7056)
Other than that… sure, right now the TypeScript server is somewhat slow, but it is doable with a more powerful machine.
Just concerns about scalability in general. If it gets even bigger, then what
could you split up your trpc router across multiple packages?
And then I will have multiple routers that I can call?
and just have like
Multiple server side callers as well and multiple react query instances?
nono
I have satisfies instead of createTRPCRouter as well
like
- packages/trpc-base
- packages/trpc-posts
- packages/trpc-users
- packages/trpc-api
trpc-api would just use posts and users
users and posts use the base
Hum, and then base can be the root for all routers? I can make base have posts, users, etc?
base in this example would have the `initTRPC()` call and the context
i hope there's a good way to get around this
And so then it would always have the same server caller and one tRPC api for client react/react query still?
yeah
since trpc-api's `.d.ts` should mainly be referencing stuff in other packages it shouldn't crash on that "exceeds the maximum length" error, but we'll see
Ok, and in this structure you suggested, `packages/trpc-api` would have which purpose? Not sure I get it
that'd be the thing that imports every individual router and creates the actual app router
oh, ok.
but unsure if you'd still get that error when you then create the trpc client etc
Have either of you guys had a look into xtrpc? I mentioned it on a similar issue a few hours ago, and I've found it quite beneficial so far. I'm not sure how well it'll translate to larger scale, but in a pretty non-scientific test it worked much better, especially with the language server. It's not live; I have it running in a separate process via nodemon to regen the types
https://github.com/algora-io/xtrpc
yeah xtrpc works well but it kills jump to definition etc
Haven't seen it before! This looks interesting
Yeah 100% agree, but they are interchangeable so I guess if you absolutely needed to jump to a definition perhaps it could work? I'd say though I'm probably happy to manually find the definition versus having to wait for the language server to handle the compiling.
They do mention this at the bottom of the readme, but i haven't figured out how to achieve it
So, to recap your proposal:
- packages/trpc-base - has `initTRPC`, `createTRPCContext`, exports procedures and maybe middlewares
- packages/trpc-posts - trpcRouter. depends on packages/trpc-base
- packages/trpc-users - trpcRouter. depends on packages/trpc-base
- packages/trpc-api - Main app router, where it will construct the main appRouter. Exports what is needed for the server and client api objects (server caller, etc.). Depends on all other packages

try it if you are able to spend a few hours on it
i'll be away next week but excited to see what you come up with
btw, did you upgrade to typescript 5.5 already? it might've improved and fixed that sort of issue
Hey Katt, thanks a lot. I'll do some investigation and try it. If it works and it improves this error/performance, maybe I'll try to create some sort of guide for it
Yep
I'm on 5.5.3
do you nest routers or is it kinda flat?
I have some nested routers. The most nested is rootRouter > teamRouter > invitationRouter (3 levels of nesting) or rootRouter > appRouter > todoRouter (also 3 levels). It doesn't go further than this.
Great news:
My "The inferred type of this node exceeds the maximum length the compiler will serialize. An explicit type annotation is needed.ts(7056)" problem was more related to drizzle than anything else! Not much related to tRPC.
I was using object destructuring to define my schema, and this was causing many issues.
This new diff shows how I restructured my schema definition, so I can import it in createTRPCContext
No more issues! I am not sure how much this will impact performance. We'll see
i'm very excited to follow your progress