e1g 1 day ago [-]
In JS land, this problem (streaming, resuming, recovering, multi-client, etc) has been fully solved by https://durablestreams.com - and it can be self-hosted, or managed via Cloudflare DO.
jhancock 1 day ago [-]
I built this Clojure lib for robust, high-scale LLM calls, wherein the consumer is usually an HTTP request waiting on an SSE stream. https://github.com/jhancock/aimee
The article states: "Most applications are built on an architecture like the one above, where there are a number of stateless horizontally scaleable server replicas that can handle client requests."
Using the library I built, I have yet to worry about this: Clojure's core.async, its HTTP libraries, and the JVM are so rock solid that I don't have a fragile set of stateless servers. Sure, at some point there are rare edge cases, but it's nice to get very far along without worrying about them.
dgellow 1 day ago [-]
I'm not sure what you mean by fragile stateless servers. If they are stateless, what is fragile about them?
the_gipsy 1 day ago [-]
> Stop reading here if you just wanted the how-to. Because I’m going to talk about what I think is better, and that is probably too ‘commercial’ for some folks.
> I work for Ably, and I’m building a dedicated transport for AI applications that...
vintagedave 1 day ago [-]
It's honest. If they genuinely think it's better, it's fair to say so. The article up to that point seems well written on the domain (I've solved much of the same set of problems).
the_gipsy 1 day ago [-]
Should be at the top of the post, not at the end.
dgellow 1 day ago [-]
Agreed. I found it informative and appreciated the read, even with the assumption that it was AI-generated.
_pdp_ 1 day ago [-]
This is way too complex!
We have developed a simple API that can produce tokens and events in various formats like JSONL, SSE, even CSV. Cancellation can happen when the socket is closed during streaming (fully automated), or when you push an event from another endpoint so that we stop the stream midway.
Background tasks are also subscribable and cancelable. See https://cbk.ai
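The "push an event from another endpoint to stop the stream midway" pattern can be sketched generically: each live stream registers an AbortController under its request id, and a separate cancel endpoint aborts it. This is an illustrative sketch, not cbk.ai's actual API; the names `startStream` and `cancelStream` are made up.

```typescript
// Registry of in-flight streams, keyed by a request id. In a real server
// this would live per-process (or be routed to the right replica).
const liveStreams = new Map<string, AbortController>();

// Called by the streaming endpoint: register the stream and hand back
// a signal to pass into fetch / the token generator loop.
function startStream(id: string): AbortSignal {
  const controller = new AbortController();
  liveStreams.set(id, controller);
  return controller.signal;
}

// Called by a second endpoint (e.g. POST /cancel/:id): abort the stream
// midway. Returns false if the id is unknown or already finished.
function cancelStream(id: string): boolean {
  const controller = liveStreams.get(id);
  if (!controller) return false;
  controller.abort();
  liveStreams.delete(id);
  return true;
}
```

The streaming loop just checks `signal.aborted` (or listens for the `abort` event) between token writes and closes the response when it flips.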
Cancellation was the painful one for me. I have SSE streaming in a React Native app and there's no proper AbortController, so I ended up with a ref + interval hack to detect when the user navigates away mid-stream. I still don't have a good answer for "connection dropped, show partial response and let them retry from where it left off." Would've loved something like this.
ekojs 1 day ago [-]
> HTTP is just not a good transport for streaming LLM tokens and for building async agentic applications
I don't know that I agree this is a problem with SSE or HTTP. Something like a Redis Streams-backed SSE would solve most of the 'challenges' presented in the post.
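The resume logic such a Redis-Streams-backed SSE endpoint would rely on: tokens are appended with XADD, and on reconnect the client's Last-Event-ID maps to an XREAD starting position, so only unseen entries are replayed. The in-memory version below mirrors that behaviour so the idea is testable without a Redis server; entry ids follow Redis's "<ms>-<seq>" shape, and `entriesAfter` is an illustrative name, not a Redis command.

```typescript
type Entry = { id: string; token: string };

// Compare two Redis-style stream ids, e.g. "1700000000000-0".
function idLessOrEqual(a: string, b: string): boolean {
  const [aMs, aSeq] = a.split("-").map(Number);
  const [bMs, bSeq] = b.split("-").map(Number);
  return aMs < bMs || (aMs === bMs && aSeq <= bSeq);
}

// Analogue of `XREAD STREAMS tokens <lastId>`: return everything strictly
// after the id the client last saw ("0-0" replays the whole stream).
function entriesAfter(entries: Entry[], lastId: string): Entry[] {
  return entries.filter((e) => !idLessOrEqual(e.id, lastId));
}
```

On the wire, the server would set each SSE `id:` field to the stream entry id, so the browser's automatic Last-Event-ID header on reconnect feeds straight into this lookup.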