Streaming responses (e.g. for streaming AI chat completion text)

Hello! Has anyone used Vercel's ai package, or any similar library that streams its responses, with tRPC? I'm wondering what the best way is to achieve a streamed response like that while maintaining type safety. The trouble with useSubscription() is that it behaves more like a query than a mutation: it gets called immediately when the component mounts rather than being triggered by some user action.
seb · 336d ago
Glad to see that others are thinking about this feature as well:
- https://github.com/trpc/trpc/pull/4530
- https://github.com/trpc/trpc/pull/4489
- https://github.com/trpc/trpc/issues/4477

It seems the best option for now is to stream such responses outside of tRPC while waiting for one of those PRs to land. But if there are any better workarounds in the meantime, do let me know!
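For anyone going the "stream outside tRPC" route: a minimal sketch of the client side, using only the standard fetch/Streams APIs (no tRPC involved). The endpoint path, chunk format, and the consumeTextStream helper name are all assumptions for illustration, not part of any library's API.

```typescript
// Hypothetical helper: read a streaming text response chunk by chunk.
// Works with any endpoint that writes plain UTF-8 text incrementally
// (e.g. a route handler proxying an AI completion stream).
async function consumeTextStream(
  res: Response,
  onChunk: (text: string) => void,
): Promise<string> {
  if (!res.body) throw new Error("Response has no body to stream");
  const reader = res.body.getReader();
  const decoder = new TextDecoder();
  let full = "";
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    // stream: true keeps multi-byte characters intact across chunk boundaries
    const text = decoder.decode(value, { stream: true });
    full += text;
    onChunk(text); // e.g. append to React state to render incrementally
  }
  return full;
}
```

Usage would look something like `const res = await fetch("/api/completion", { method: "POST", body: ... })` followed by `await consumeTextStream(res, appendToMessage)`, with type safety for the request/response shapes maintained manually (or via a shared zod schema) since this path bypasses tRPC.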