Hi @Jochen Deubner,

Great question! I ran into the same behavior 🙂

You're right that the CSP Gateway tends to buffer responses, which prevents intermediate flushes from reaching the browser, even when you use SSE-friendly headers such as text/event-stream.

That said, it is still possible to stream the response.

In addition to setting the Content-Type to text/event-stream, you also need to explicitly allow output flushing in your REST service:

Set %response.AllowOutputFlush = 1

Then, whenever you want to push data to the client, you can call:

Do %response.Flush()
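Putting the two settings together, a streaming REST method could look like this. This is only a minimal sketch; the method name and the loop are mine, purely for illustration:

```objectscript
ClassMethod StreamEvents() As %Status
{
    // Declare an SSE response and allow intermediate flushes
    Set %response.ContentType = "text/event-stream"
    Set %response.AllowOutputFlush = 1

    For i=1:1:5 {
        // Each SSE event is a "data: ..." line terminated by a blank line
        Write "data: event number ",i,$Char(13,10,13,10)
        // Push the buffered output to the client immediately
        Do %response.Flush()
        Hang 1
    }
    Return $$$OK
}
```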

Example with fast-http

You can implement a simple passthrough adapter like this:

Class dc.http.SSEPassthroughAdapter Extends dc.http.SSEAdapter
{

Method OnMessage(item As dc.http.SSEMessage) As %Status
{
    // Forward the raw SSE payload to the current device
    Write item.Raw
    // Flush right away so the event reaches the browser without buffering
    If $IsObject($Get(%response)) {
        Do %response.Flush()
    }
    Return $$$OK
}

}

I think this kind of passthrough adapter could be useful for other developers as well, so I’ll probably add it directly into fast-http.

And use it like this:

Set %response.AllowOutputFlush = 1
Set %response.ContentType = "text/event-stream"

Set stream = ##class(dc.http.SSEPassthroughAdapter).GetStream()
Set handler = stream.SSEHandler
Set handler.SwitchIOOnMessage = 1
Set handler.IO = $IO

Set body = {"model": "gpt-4","messages": [{"role": "user", "content": "Tell me a short story."}],"stream": true}
Set response = ##class(dc.http.FastHTTP).DirectPost("url=https://api.openai.com/v1/chat/completions,Header_Authorization=Bearer {MyToken}", body, .client, stream)

Note: the REST class parameter IgnoreWrites must be set to 0 (Parameter IgnoreWrites = 0;).
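For completeness, that parameter goes on the REST dispatch class itself. A minimal skeleton (class name and route are hypothetical, just to show where the pieces live):

```objectscript
Class demo.StreamRest Extends %CSP.REST
{

/// Keep direct Write output instead of discarding it
Parameter IgnoreWrites = 0;

XData UrlMap
{
<Routes>
  <Route Url="/stream" Method="GET" Call="StreamEvents"/>
</Routes>
}

}
```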

If you're interested, I can provide a complete working example on my GitHub (on a demo branch).

Hope this helps.

Edit

I added the branch csp-test-1 to illustrate how to set up a passthrough for the OpenAI API with streaming support over CSP.

Clone the repository, switch to the specific branch csp-test-1, and start the environment using Docker:

git clone -b csp-test-1 https://github.com/lscalese/iris-fast-http.git
cd iris-fast-http
docker compose build --no-cache
docker compose up -d

To use the OpenAI API, you must configure your API key in the IRIS instance. Open a terminal and run:

Set ^APIKey = "sk-..." ; your OpenAI API key

Access the built-in chat interface at: http://localhost:42600/csp/ui/demo/chat.html
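On the browser side, the streamed text/event-stream body arrives as chunks of "data: ..." lines separated by blank lines. A small sketch in JavaScript of the parsing step the UI needs (the function name is mine, not part of the demo page):

```javascript
// Split a raw text/event-stream buffer into the "data:" payloads of
// complete events. Events are separated by a blank line; an event may
// carry several "data:" lines, which concatenate with "\n".
function parseSSE(buffer) {
  const events = [];
  for (const block of buffer.split(/\r?\n\r?\n/)) {
    const dataLines = block
      .split(/\r?\n/)
      .filter((line) => line.startsWith("data:"))
      // Drop the field name and the single optional leading space
      .map((line) => line.slice(5).replace(/^ /, ""));
    if (dataLines.length > 0) events.push(dataLines.join("\n"));
  }
  return events;
}
```

In a real client you would feed this from a fetch() response reader, keeping any trailing incomplete block in the buffer until the next chunk arrives.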

The backend logic is implemented in dc.http.DemoRest. You can adapt this class to match your specific testing requirements.
