Thanks a lot for your kind words, that’s very encouraging!
Thank you @Guillaume Rongier !
I'll publish a French version soon.
Hi @Jochen Deubner,
Great question! I ran into the same behavior 🙂
You're right that the CSP Gateway tends to buffer responses, which prevents intermediate flushes from reaching the browser, even when using SSE-friendly headers like `text/event-stream`. That said, it is still possible to stream the response.
In addition to setting the `Content-Type` to `text/event-stream`, you also need to explicitly allow output flushing in your REST service. Then, whenever you want to push data to the client, you flush the output buffer explicitly.
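The original code blocks did not survive here, so this is a minimal sketch of the idea, assuming a `%CSP.REST` subclass (class, route, and method names are illustrative; `Write *-3` is the CSP device control that flushes the output buffer to the gateway):

```objectscript
/// Hypothetical sketch: a REST service that streams Server-Sent Events.
Class demo.SSERest Extends %CSP.REST
{

/// Allow direct WRITEs to reach the client (the %CSP.REST default ignores them)
Parameter IgnoreWrites = 0;

XData UrlMap [ XMLNamespace = "http://www.intersystems.com/urlmap" ]
{
<Routes>
  <Route Url="/stream" Method="GET" Call="Stream"/>
</Routes>
}

ClassMethod Stream() As %Status
{
    Set %response.ContentType = "text/event-stream"
    For i=1:1:5 {
        // SSE events are separated by a blank line
        Write "data: chunk "_i,!,!
        // Flush the CSP output buffer so this event reaches the browser now
        Write *-3
        Hang 1
    }
    Quit $$$OK
}

}
```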
Example with fast-http
You can implement a simple passthrough adapter like this:
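The adapter code was lost in extraction. As a stand-in sketch, the same passthrough idea can be expressed with the built-in `%Net.HttpRequest` instead of the fast-http API (method and configuration names below are assumptions, not the library's actual interface):

```objectscript
/// Hypothetical passthrough sketch: fetch an upstream HTTP response and
/// relay its body to the CSP client chunk by chunk.
ClassMethod Passthrough(server As %String, path As %String) As %Status
{
    Set req = ##class(%Net.HttpRequest).%New()
    Set req.Server = server
    Set req.Https = 1
    Set req.SSLConfiguration = "default"   // assumed SSL configuration name
    Set sc = req.Get(path)
    If $$$ISERR(sc) Quit sc
    // Mirror the upstream content type to the CSP response
    Set %response.ContentType = req.HttpResponse.ContentType
    Set body = req.HttpResponse.Data
    While 'body.AtEnd {
        Write body.Read(4096)
        Write *-3   // flush each chunk through the CSP Gateway
    }
    Quit $$$OK
}
```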
I think this kind of passthrough adapter could be useful for other developers as well, so I’ll probably add it directly into fast-http.
And use it like this:
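The usage snippet also did not survive. A hypothetical call site, assuming a `Passthrough(server, path)` classmethod on an illustrative `demo.SSERest` class wired into a REST route:

```objectscript
/// Hypothetical usage: route an incoming REST call through the adapter.
ClassMethod Chat() As %Status
{
    // Endpoint path is illustrative only
    Quit ##class(demo.SSERest).Passthrough("api.openai.com", "/v1/chat/completions")
}
```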
Note: the REST class parameter `IgnoreWrites` should be set to `0` (i.e. `Parameter IgnoreWrites = 0;`). If you're interested, I can provide a complete working example on my GitHub (on a demo branch).
Hope this helps.
Edit
I added the branch `csp-test-1` to illustrate how to set up a passthrough for the OpenAI API with streaming support over CSP.

Clone the repository, switch to the branch `csp-test-1`, and start the environment using Docker. To use the OpenAI API, you must configure your API key in the IRIS instance from a terminal.
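The setup commands were stripped from the post; a sketch of the likely steps, assuming a placeholder repository URL and a containerized IRIS instance (the repository location, service name, and instance name are assumptions):

```shell
# Placeholder URL -- the post does not spell out the repository address
git clone https://github.com/<author>/fast-http.git
cd fast-http
git checkout csp-test-1
docker compose up -d --build

# Open an IRIS terminal inside the container to configure the OpenAI
# API key (the exact configuration command depends on how the demo
# reads the key; "iris" is an assumed service/instance name)
docker compose exec iris iris session IRIS
```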
Access the built-in chat interface at: http://localhost:42600/csp/ui/demo/chat.html
The backend logic is implemented in `dc.http.DemoRest`. You can adapt this class to match your specific testing requirements.