Streaming server-sent events
This page applies to Apigee and Apigee hybrid.
View Apigee Edge documentation.
Apigee supports continuous response streaming from server-sent event (SSE) endpoints to clients in real time. The Apigee SSE feature is useful for handling large language model (LLM) APIs that operate most effectively by streaming their responses back to the client. SSE streaming reduces latency, and clients can receive response data as soon as it is generated by an LLM. This feature supports the use of AI agents that operate in real-time environments, such as customer service bots or workflow orchestrators.
Note: Streaming from SSE endpoints is supported in Apigee and in Apigee hybrid v1.15.0 and newer. The feature is also supported for use with the Apigee Extension Processor.

To use SSE with Apigee, simply point an API proxy to an SSE-enabled target or proxy endpoint. To achieve finer-grained control over the SSE response, Apigee provides a special endpoint flow called EventFlow. Within the context of an EventFlow, you can add a limited set of policies to perform operations on the SSE response, such as filtering, modifying, or handling errors. To learn more about proxy flows, see Controlling API proxies with flows.
Create an API proxy for SSE
The Apigee UI provides a template for creating a new proxy that includes an EventFlow.

Note: Use of the SSE proxy template is optional. It gives you access to SSE-specific flow variables that can be useful for performing certain policy operations. However, you do not have to use an EventFlow to use SSE with Apigee.

Important: Use separate target endpoint definitions for SSE targets. Mixing SSE and non-SSE target endpoints together might result in inconsistent behavior, such as empty response.content flow variables. See Known issues.
Follow these steps to create an API proxy with the EventFlow template using the Apigee UI:
- In the Google Cloud console, go to the Proxy Development > API Proxies page.
- In the API Proxies pane, click + Create.
- In the Create a proxy pane, under Proxy template, select Proxy with Server-Sent Events (SSE).
- Under Proxy details, enter the following:
- Proxy name: Enter a name for the proxy, such as myproxy.
- Base Path: Automatically set to the value you enter for Proxy name. The Base Path is part of the URL used to make requests to your API. Apigee uses the URL to match and route incoming requests to the appropriate API proxy.
- Description (Optional): Enter a description for your new API proxy, such as "Testing Apigee with a simple proxy."
- Target (Existing API): Enter the SSE target URL for the API proxy. For example: https://mocktarget.apigee.net/sse-events/5
- Click Next.
- Deploy (optional):
- Deployment environments: Optional. Use the checkboxes to select one or more environments to which to deploy your proxy. If you prefer not to deploy the proxy at this point, leave the Deployment environments field empty. You can always deploy the proxy later.
- Service Account: Optional. A service account for the proxy. The service account represents the identity of the deployed proxy, and determines what permissions it has. This is an advanced feature, and for the purpose of this tutorial, you can ignore it.
API proxies deployed with an EventFlow configuration will be billed as Extensible.
- Click Create.
See also Building a simple API proxy.
Configure an EventFlow
To achieve finer-grained control over the SSE response, Apigee provides a special endpoint flow called EventFlow. Within the context of an EventFlow, you can add a limited set of policies, listed below, to modify the SSE response before it is streamed back to the client. To learn more about proxy flows, see Controlling API proxies with flows.
Placement of an EventFlow
An EventFlow has two attributes:
- name: A name to identify the flow.
- content-type: The value of this attribute must be text/event-stream.
See also Flow configuration reference.
An EventFlow can be placed inside a TargetEndpoint or a ProxyEndpoint definition, as shown in the following code samples:
<ProxyEndpoint>
<ProxyEndpoint name="default">
  <Description/>
  <FaultRules/>
  <PreFlow name="PreFlow">
    <Request/>
    <Response/>
  </PreFlow>
  <PostFlow name="PostFlow">
    <Request/>
    <Response/>
  </PostFlow>
  <Flows/>
  <EventFlow name="EventFlow" content-type="text/event-stream">
    <Response/>
  </EventFlow>
  <HTTPProxyConnection>
    <Properties/>
    <URL>https://httpbin.org/sse</URL>
  </HTTPProxyConnection>
</ProxyEndpoint>
<TargetEndpoint>
<TargetEndpoint name="default">
  <Description/>
  <FaultRules/>
  <PreFlow name="PreFlow">
    <Request/>
    <Response/>
  </PreFlow>
  <PostFlow name="PostFlow">
    <Request/>
    <Response/>
  </PostFlow>
  <Flows/>
  <EventFlow name="EventFlow" content-type="text/event-stream">
    <Response/>
  </EventFlow>
  <HTTPTargetConnection>
    <Properties/>
    <URL>https://httpbin.org/sse</URL>
  </HTTPTargetConnection>
</TargetEndpoint>
If you add multiple EventFlow stanzas to an endpoint, only the last one in the endpoint definition is executed. Any EventFlow stanzas that appear before the last one are ignored. It is also important to note that although you can add an EventFlow to a TargetEndpoint, a ProxyEndpoint, or both, only one EventFlow is executed.
The following table shows which EventFlow is executed based on endpoint placement:
| ProxyEndpoint | TargetEndpoint | EventFlow used |
|---|---|---|
| EventFlow in ProxyEndpoint | EventFlow in TargetEndpoint | EventFlow in TargetEndpoint |
| No EventFlow | EventFlow in TargetEndpoint | EventFlow in TargetEndpoint |
| EventFlow in ProxyEndpoint | No EventFlow | EventFlow in ProxyEndpoint |
Add policies to an EventFlow
You can add up to a total of four policies to the Response element of the EventFlow. As with all flows, policies are executed in the order they are added, and you can add conditional steps to control their execution. It's important to note that the types of policies you can add to an EventFlow are restricted to a limited set; no other policy types are allowed in an EventFlow.
When using LLMTokenQuota with SSE streams, quota enforcement skips events that lack token usage metadata. Quotas are calculated based only on events that contain explicit token counts.
See also Attaching and configuring policies in the UI and Attaching and configuring policies in XML files.
The following examples show an EventFlow with a conditional RaiseFault policy step added:
<ProxyEndpoint>
<ProxyEndpoint name="default">
  <EventFlow content-type="text/event-stream">
    <Response>
      <Step>
        <Name>Raise-Fault-Cred-Invalid</Name>
        <Condition>fault.name equals "invalid_access_token"</Condition>
      </Step>
    </Response>
  </EventFlow>
  <HTTPProxyConnection>
    ...
  </HTTPProxyConnection>
</ProxyEndpoint>
<TargetEndpoint>
<TargetEndpoint name="default">
  <EventFlow content-type="text/event-stream">
    <Response>
      <Step>
        <Name>Raise-Fault-Cred-Invalid</Name>
        <Condition>fault.name equals "invalid_access_token"</Condition>
      </Step>
    </Response>
  </EventFlow>
  <HTTPTargetConnection>
    ...
  </HTTPTargetConnection>
</TargetEndpoint>
For more EventFlow code examples, see the EventFlow use cases and examples section.
Flow variables
An EventFlow populates three response flow variables. These variables are only usable within the scope of the current event being processed within the EventFlow; accessing or setting them outside of the EventFlow scope has no effect.
- response.event.current.content: A string containing the current event's entire response. Apigee does not parse the string in any way. It contains the entire response unchanged, including all of the data fields. Note: Apigee limits each response event to a maximum of 10 MB of data. Note: Setting the value of this variable to a blank string ("") prevents the event from being sent to the client. For an example, see Filter an SSE response.
- response.event.current.data: A string containing the data payload of the current event. You can modify this variable in the EventFlow to change the data payload sent to the client.
- response.event.current.count: Incrementally counts the number of response events sent. This value is updated for each received event. The count will be 1 for the first event, and it increments for subsequent events. Note: If you access this value in a JavaScript policy, for example, the very first event will have a value of 1; however, the number of events sent to the client at this point will be 0.
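To make these variables concrete, here is a minimal JavaScript policy sketch that reads the current event and its running count, and drops every event after the tenth. The policy name js-inspect-event and the drop-after-ten logic are illustrative assumptions, not part of the Apigee samples.

<Javascript continueOnError="false" enabled="true" timeLimit="200" name="js-inspect-event">
  <DisplayName>js-inspect-event</DisplayName>
  <Properties/>
  <Source>
    // Hypothetical example: read the current event and its running count.
    var content = context.getVariable("response.event.current.content");
    var count = parseInt(context.getVariable("response.event.current.count"), 10);

    if (count > 10) {
      // Setting the content to an empty string prevents this event from
      // being sent to the client.
      context.setVariable("response.event.current.content", "");
    } else {
      // Otherwise, pass the event through unchanged.
      context.setVariable("response.event.current.content", content);
    }
  </Source>
</Javascript>

A policy like this would be attached to the EventFlow Response element in the same way as the policies shown in the examples that follow.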
See also Flow variable reference.
EventFlow use cases and examples
The following examples show how to implement common use cases for SSE proxies:
- Modify an SSE response
- Filter an SSE response
- Send an SSE event to an external system
- Use an Apigee Model Armor policy in an EventFlow
- Error handling in the EventFlow
- Propagate fault errors in an EventFlow
Modify an SSE response
This example shows how to remove data from an SSE EventFlow response before returning it to the client. The content of the SSE response is stored in a flow variable called response.event.current.content. In this case, we use a JavaScript policy to retrieve the value of the flow variable, parse it, and modify it. See also Flow variables.
- Create a new proxy with the SSE proxy template. See Create an API proxy for SSE.
- Open the proxy in the Apigee proxy editor and click the Develop tab.
- Create a new JavaScript policy with the following definition. In this example, the JavaScript code is included directly in the policy. Putting the JavaScript code in a resource file is another option for configuring the policy.
<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<Javascript continueOnError="false" enabled="true" timeLimit="200" name="js-update-resp">
  <DisplayName>js-update-resp</DisplayName>
  <Properties/>
  <Source>
    // Parse the current event, clear the modelVersion field, and write the event back.
    var event = JSON.parse(context.getVariable("response.event.current.content"));
    event.modelVersion = null;
    context.setVariable("response.event.current.content", JSON.stringify(event));
  </Source>
</Javascript>
- Add the JavaScript policy to the EventFlow of the proxy. The EventFlow is attached to the default TargetEndpoint or ProxyEndpoint. This example uses the Gemini API in Vertex AI to generate content.
<ProxyEndpoint>
<ProxyEndpoint name="default">
  <EventFlow content-type="text/event-stream">
    <Response>
      <Step>
        <Name>js-update-resp</Name>
      </Step>
    </Response>
  </EventFlow>
  <HTTPProxyConnection>
    <URL>https://generativelanguage.googleapis.com/v1beta/models/gemini-2.5-flash:streamGenerateContent?key=GEMINI_API_KEY&alt=sse</URL>
  </HTTPProxyConnection>
</ProxyEndpoint>
<TargetEndpoint>
<TargetEndpoint name="default">
  <EventFlow content-type="text/event-stream">
    <Response>
      <Step>
        <Name>js-update-resp</Name>
      </Step>
    </Response>
  </EventFlow>
  <HTTPTargetConnection>
    <URL>https://generativelanguage.googleapis.com/v1beta/models/gemini-2.5-flash:streamGenerateContent?key=GEMINI_API_KEY&alt=sse</URL>
  </HTTPTargetConnection>
</TargetEndpoint>
- Save the proxy and deploy it.
- Call the deployed proxy:
curl -X POST -H 'Content-Type: application/json' \
  "https://YOUR_APIGEE_ENVIRONMENT_GROUP_HOSTNAME/YOUR_API_PATH" \
  -d '{ "contents":[{"parts":[{"text": "Write a story about a magic pen."}]}]}'
This is a sample response without any filtering applied. Note that the response includes a "modelVersion": "gemini-2.5-flash" attribute.

data: { "candidates": [ { "content": { "parts": [ { "text": "ara found the pen tucked away in a dusty antique shop, nestled amongst chipped tea" } ], "role": "model" } } ], "usageMetadata": { "promptTokenCount": 8, "totalTokenCount": 8 }, "modelVersion": "gemini-2.5-flash" }

This is another sample response with the JavaScript policy applied. The modelVersion attribute is removed.

data: {"candidates":[{"content":{"parts":[{"text":" the fantastical creatures of her imagination. The quiet beauty of a simple life was a magic all its own.\n"}],"role":"model"},"finishReason":"STOP"}],"usageMetadata":{"promptTokenCount":8,"candidatesTokenCount":601,"totalTokenCount":609,"promptTokensDetails":[{"modality":"TEXT","tokenCount":8}],"candidatesTokensDetails":[{"modality":"TEXT","tokenCount":601}]}}
Filter an SSE response
This example shows how to filter data from an SSE response before returning it to the client. In this case, we filter event data from the response using a JavaScript policy. The policy parses the event response into JSON, modifies the JSON to remove the event data, and then sends the modified response data back to the client.
As in the previous example, this example retrieves the value of the response.event.current.content flow variable and parses it into JSON, then applies logic to implement the intended filtering.
- Create a new proxy with the SSE proxy template. See Create an API proxy for SSE.
- Open the proxy in the Apigee proxy editor and click the Develop tab.
- Create a new JavaScript policy with the following definition. In this example, the JavaScript code is included directly in the policy. Putting the JavaScript code in a resource file is another option for configuring the policy.
<Javascript continueOnError="false" enabled="true" timeLimit="200" name="js-filter-resp">
  <DisplayName>js-filter-resp</DisplayName>
  <Properties/>
  <Source>
    var event = JSON.parse(context.getVariable("response.event.current.content"));
    if ("error" in event) {
      // Do not send the event to the customer.
      context.setVariable("response.event.current.content", "");
    }
  </Source>
</Javascript>
- Add the JavaScript policy to the EventFlow of the proxy. The EventFlow is attached to the default TargetEndpoint or ProxyEndpoint. This example uses the Gemini API in Vertex AI to generate content.
<ProxyEndpoint>
<ProxyEndpoint name="default">
  <EventFlow content-type="text/event-stream">
    <Response>
      <Step>
        <Name>js-filter-resp</Name>
      </Step>
    </Response>
  </EventFlow>
  <HTTPProxyConnection>
    <URL>https://generativelanguage.googleapis.com/v1beta/models/gemini-2.5-flash:streamGenerateContent?key=GEMINI_API_KEY&alt=sse</URL>
  </HTTPProxyConnection>
</ProxyEndpoint>
<TargetEndpoint>
<TargetEndpoint name="default">
  <EventFlow content-type="text/event-stream">
    <Response>
      <Step>
        <Name>js-filter-resp</Name>
      </Step>
    </Response>
  </EventFlow>
  <HTTPTargetConnection>
    <URL>https://generativelanguage.googleapis.com/v1beta/models/gemini-2.5-flash:streamGenerateContent?key=GEMINI_API_KEY&alt=sse</URL>
  </HTTPTargetConnection>
</TargetEndpoint>
- Save the proxy and deploy it.
- Call the deployed proxy:
curl -X POST -H 'Content-Type: application/json' \
  "https://YOUR_APIGEE_ENVIRONMENT_GROUP_HOSTNAME/YOUR_API_PATH" \
  -d '{ "contents":[{"parts":[{"text": "Write a story about a magic pen."}]}]}'
Here's a sample of how the response might look without applying any filtering. Notice it includes error data:
data: { "candidates": [ { "content": { "parts": [ { "text": "El" } ], "role": "model" } } ], "usageMetadata": { "promptTokenCount": 8, "totalTokenCount": 8 }, "modelVersion": "gemini-2.5-flash" }

data: {"error": "Service temporarily unavailable. We are experiencing high traffic.", "modelVersion": "gemini-2.5-flash" }

Here's another sample response after filtering is applied, with the error message scrubbed:

data: { "candidates": [ { "content": { "parts": [ { "text": "El" } ], "role": "model" } } ], "usageMetadata": { "promptTokenCount": 8, "totalTokenCount": 8 }, "modelVersion": "gemini-2.5-flash" }

data: { "candidates": [ { "content": { "parts": [ { "text": "ara found the pen tucked away in a dusty antique shop, nestled amongst chipped tea" } ], "role": "model" } } ], "usageMetadata": { "promptTokenCount": 8, "totalTokenCount": 8 }, "modelVersion": "gemini-2.5-flash" }
Send an SSE event to an external system
In this example, we attach the Apigee PublishMessage policy to the EventFlow to send an SSE event to a Pub/Sub topic.
- Create a new proxy with the SSE proxy template. See Create an API proxy for SSE.
- Open the proxy in the Apigee proxy editor and click the Develop tab.
- Create a new PublishMessage policy with the following definition:
<PublishMessage continueOnError="false" enabled="true" name="PM-record-event">
  <DisplayName>PM-record-event</DisplayName>
  <Source>{response.event.current.content}</Source>
  <CloudPubSub>
    <Topic>projects/<customer_project>/topics/<topic_name></Topic>
  </CloudPubSub>
</PublishMessage>

- Add the PublishMessage policy as a step in the EventFlow of the API proxy.
<ProxyEndpoint>
<ProxyEndpoint name="default">
  <EventFlow content-type="text/event-stream">
    <Response>
      <Step>
        <Name>PM-record-event</Name>
      </Step>
    </Response>
  </EventFlow>
  <HTTPProxyConnection>
    ...
  </HTTPProxyConnection>
</ProxyEndpoint>
<TargetEndpoint>
<TargetEndpoint name="default">
  <EventFlow content-type="text/event-stream">
    <Response>
      <Step>
        <Name>PM-record-event</Name>
      </Step>
    </Response>
  </EventFlow>
  <HTTPTargetConnection>
    ...
  </HTTPTargetConnection>
</TargetEndpoint>
- Deploy and test the API proxy.
- With your generated content added to the Pub/Sub topic, you can, for example, create a Cloud Run function to process messages from the topic, as sketched below.
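A minimal sketch of such a Cloud Run function, written in Node.js with the Functions Framework, might look like the following; the function name processSseEvent and the Pub/Sub trigger wiring are assumptions for illustration, not part of the Apigee configuration.

// Hypothetical Cloud Run function that receives SSE events published to the
// Pub/Sub topic by the PM-record-event policy.
const functions = require('@google-cloud/functions-framework');

functions.cloudEvent('processSseEvent', (cloudEvent) => {
  // Pub/Sub message data arrives base64-encoded inside the CloudEvent payload.
  const message = cloudEvent.data.message;
  const eventContent = Buffer.from(message.data || '', 'base64').toString('utf8');

  // Example processing: log the SSE event content.
  console.log('Received SSE event:', eventContent);
});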
Use an Apigee Model Armor policy in an EventFlow
Note: Model Armor policies are not supported for Apigee hybrid.

You can use the SanitizeModelResponse policy to sanitize incoming server-sent events in an EventFlow. This policy protects your AI applications by sanitizing responses from large language models (LLMs). For information about Model Armor, see Model Armor overview. For information about the Apigee Model Armor policies, see Get started with Apigee Model Armor policies.
- Create a new proxy with the SSE proxy template. See Create an API proxy for SSE.
- Open the proxy in the Apigee proxy editor and click the Develop tab.
- Create a new SanitizeModelResponse policy with the following definition:
<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<SanitizeModelResponse async="false" continueOnError="false" enabled="true" name="SMR-modelresponse">
  <IgnoreUnresolvedVariables>true</IgnoreUnresolvedVariables>
  <DisplayName>SMR-modelresponse</DisplayName>
  <ModelArmor>
    <TemplateName>projects/{project}/locations/{location}/templates/{template-name}</TemplateName>
  </ModelArmor>
  <LLMResponseSource>{response_partial}</LLMResponseSource>
  <!-- Use the below setting if you want to call a Model Armor policy on every event -->
  <LLMResponseSource>{response.event.current.content}</LLMResponseSource>
</SanitizeModelResponse>

- (Optional) Add a JavaScript policy to group events before sending them to the Apigee Model Armor policy.
<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<Javascript continueOnError="false" enabled="true" timeLimit="200" name="JS-combine-resp">
  <DisplayName>JS-combine-events</DisplayName>
  <Properties/>
  <Source>
    var eventText = JSON.parse(context.getVariable("response.event.current.content").substring(5)).candidates[0].content.parts[0].text;
    var finishReason = JSON.parse(context.getVariable("response.event.current.content").substring(5)).candidates[0].finishReason;
    var idx = context.getVariable("response.event.current.count");
    if (idx % 5 == 0 || finishReason == "STOP") {
      // Flush the buffered text every five events, or when the stream finishes.
      context.setVariable("response_partial", context.getVariable("tmp_buffer_pre"));
      context.setVariable("buff_ready", true);
      context.setVariable("tmp_buffer_pre", "");
    } else {
      context.setVariable("buff_ready", false);
      context.setVariable("response_partial", "");
      var previousBufferVal = context.getVariable("tmp_buffer_pre");
      if (previousBufferVal) {
        context.setVariable("tmp_buffer_pre", previousBufferVal + eventText);
      } else {
        context.setVariable("tmp_buffer_pre", eventText);
      }
    }
  </Source>
</Javascript>
- Add the JavaScript and ModelArmor policies to steps in the EventFlow of the proxy:

<EventFlow name="EventFlow" content-type="text/event-stream">
  <Request/>
  <Response>
    <Step>
      <Name>JS-combine-resp</Name>
    </Step>
    <Step>
      <!-- Remove the below Condition if you want to call the Model Armor policy on every event -->
      <Condition>buff_ready = true</Condition>
      <Name>SMR-modelresponse</Name>
    </Step>
  </Response>
</EventFlow>
- Deploy and test the API proxy.
Error handling in the EventFlow
By default, the event stream ends when a fault occurs. However, if you want to do extra debugging, you can send fault information to Cloud Logging, as shown in this example.
- Create a new proxy with the SSE proxy template. See Create an API proxy for SSE.
- Open the proxy in the Apigee proxy editor and click the Develop tab.
- Create a new RaiseFault policy with the following definition:
<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<RaiseFault continueOnError="false" enabled="true" name="RF-Empty-Event">
  <DisplayName>RF-Empty-Event</DisplayName>
  <Properties/>
  <FaultResponse>
    <AssignVariable>
      <Name>faultReason</Name>
      <Value>empty-event</Value>
    </AssignVariable>
  </FaultResponse>
  <IgnoreUnresolvedVariables>true</IgnoreUnresolvedVariables>
</RaiseFault>
- Attach the RaiseFault policy to the EventFlow of the SSE proxy:

<EventFlow content-type="text/event-stream">
  <Response>
    <Step>
      <Name>RF-Empty-Event</Name>
      <Condition>response.event.current.content ~ "data: "</Condition>
    </Step>
  </Response>
</EventFlow>
- Create a MessageLogging policy to log errors. For example:
<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<MessageLogging continueOnError="false" enabled="true" name="ML-log-error">
  <DisplayName>ML-log-error</DisplayName>
  <CloudLogging>
    <LogName>projects/{organization.name}/logs/apigee_errors</LogName>
    <Message contentType="text/plain">Request failed due to {faultReason}.</Message>
    <ResourceType>api</ResourceType>
  </CloudLogging>
  <logLevel>ALERT</logLevel>
</MessageLogging>

- Add the MessageLogging policy to the FaultRules of the target or proxy endpoint:
<TargetEndpoint>
<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<TargetEndpoint name="TargetEndpoint-1">
  <Description/>
  <FaultRules>
    <FaultRule name="default-fault">
      <Step>
        <Name>ML-log-error</Name>
      </Step>
    </FaultRule>
  </FaultRules>
  ...
</TargetEndpoint>
<ProxyEndpoint>
<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<ProxyEndpoint name="ProxyEndpoint-1">
  <Description/>
  <FaultRules>
    <FaultRule name="default-fault">
      <Step>
        <Name>ML-log-error</Name>
      </Step>
    </FaultRule>
  </FaultRules>
  ...
</ProxyEndpoint>
- Deploy and test the API proxy.
Propagate fault errors in an EventFlow
In this example, we show you how to use an EventFlow to propagate fault errors to the client. This notifies the client of errors immediately during policy execution.
- Create a new proxy with the SSE proxy template. See Create an API proxy for SSE.
- Open the proxy in the Apigee proxy editor and click the Develop tab.
- Create a new JavaScript policy with the following definition:
<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<Javascript continueOnError="false" enabled="true" timeLimit="200" name="js-error">
  <DisplayName>js-error</DisplayName>
  <Properties/>
  <Source>
    // Throw an error when processing the second event in the stream.
    if (context.getVariable("response.event.current.count") == "2") {
      throw new Error("Internal Error");
    }
    context.setVariable("response.event.current.content", context.getVariable("response.event.current.content"));
  </Source>
</Javascript>

This policy is designed to throw an error when a specific condition is met.
- Attach the JavaScript policy to the EventFlow of the SSE proxy within the TargetEndpoint or ProxyEndpoint configuration. This step ensures that the EventFlow processes the policy during response handling:
<TargetEndpoint>
<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<TargetEndpoint name="default">
  <EventFlow content-type="text/event-stream">
    <Response>
      <Step>
        <Name>js-error</Name>
      </Step>
    </Response>
  </EventFlow>
  <HTTPTargetConnection>
    <URL>https://generativelanguage.googleapis.com/v1beta/models/gemini-2.5-flash:streamGenerateContent</URL>
  </HTTPTargetConnection>
</TargetEndpoint>
<ProxyEndpoint>
<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<ProxyEndpoint name="default">
  <EventFlow content-type="text/event-stream">
    <Response>
      <Step>
        <Name>js-error</Name>
      </Step>
    </Response>
  </EventFlow>
  <HTTPProxyConnection>
    <URL>https://generativelanguage.googleapis.com/v1beta/models/gemini-2.5-flash:streamGenerateContent</URL>
  </HTTPProxyConnection>
</ProxyEndpoint>
- Deploy the API proxy.
- Test the proxy behavior using the following curl command:

curl -X POST -H 'Content-Type: application/json' \
  "https://ENVIRONMENT_GROUP_NAME/llm-api" \
  -d '{ "contents":[{"parts":[{"text": "Write a story about a magic pen."}]}]}'

Replace ENVIRONMENT_GROUP_NAME with the name of your environment group.
The output should look similar to the following example:
data: {"candidates": [{"content": {"parts": [{"text": "El"}],"role": "model"}}],"usageMetadata": {"promptTokenCount": 8,"totalTokenCount": 8},"modelVersion": "gemini-2.5-flash"}

data: {"fault":{"faultstring":"Execution of JS-error failed with error: Exception thrown from JavaScript : Error: Internal Error (Resource_1_js#2)","detail":{"errorcode":"steps.javascript.ScriptExecutionFailed"}}}

The output shows the initial data stream followed by a fault message. In the event of an error, Apigee captures the fault information and sends it to the client as an event.
For more information about fault handling in Apigee, see Handling faults.
Viewing SSE data in Apigee analytics
Data for SSE proxies shows up in Apigee analytics as expected for any API proxy. In the Cloud console, go to Analytics > API metrics.
Debugging SSE proxies
Use the Apigee debug tool to debug SSE proxies. Debug data is captured for EventFlow just as it is for the other flow types.
Troubleshooting
For real-time traffic issues, check the Apigee access logs to determine the cause.
Limitations
The following limitations apply to SSE proxies:
- While an SSE connection can be kept open indefinitely, its duration is ultimately constrained by the timeout setting of the Load Balancer downstream of Apigee. By default, the Apigee Load Balancer timeout is set to 30 seconds. For longer-running connections, we recommend increasing this timeout or creating a separate backend service with a higher timeout value to handle the SSE traffic.
- Because analytics data is recorded after the SSE session closes, you may notice some delay in the reporting of analytics data.
- Faults inside an EventFlow cause the stream to exit immediately and throw an error. For information on manually logging these kinds of errors, or sending them to the client, see EventFlow use cases and examples.
- A client receiving streamed SSE responses will receive the HTTP headers, including any status codes, at the beginning of the event stream. As a result, if the event stream gets into an error state, the status code initially received will not reflect the error state. This limitation can be seen when viewing a debug session. In the session, you may notice that the HTTP status code for streams that enter the error state differs from the status codes sent to the client. This can occur because the debug session entry is generated after the entire request has been processed, rather than at the beginning of the event stream. The debug session may reflect the fault code generated by the error, while the client only sees the 2xx status initially received in the headers.