
Conversation

@oesteve oesteve commented Oct 7, 2025

Q                A
Bug fix?         no
New feature?     yes
Docs?            no
Issues           Fix #753
License          MIT
$result = $agent->call($messages, [
    'stream' => true, // enable streaming of response text
    'stream_options' => [
        'include_usage' => true, // include usage in the response
    ],
]);

/** @var TextChunk $textChunk */
foreach ($result->getContent() as $textChunk) {
    // $textChunk implements \Stringable
    echo $textChunk->getContent();

    // Chunks also contain metadata
    // $textChunk->getMetadata()->get('id'); // Stream id
}

// Output token usage statistics for each call
foreach ($result->getMetadata()->get('calls', []) as $call) {
    echo \PHP_EOL.sprintf(
        '%s: %d tokens - Finish reason: [%s]',
        $call['id'],
        $call['usage']['total_tokens'],
        $call['finish_reason']
    );
}

@oesteve oesteve force-pushed the main branch 2 times, most recently from 12669c7 to e0ad357 on October 7, 2025 at 20:56
@OskarStark OskarStark added Platform Issues & PRs about the AI Platform component Agent Issues & PRs about the AI Agent component labels Oct 8, 2025
@carsonbot carsonbot changed the title Add stream usage support for OpenAI GPT [Agent][Platform] Add stream usage support for OpenAI GPT Oct 8, 2025
@OskarStark OskarStark changed the title [Agent][Platform] Add stream usage support for OpenAI GPT [Agent][Platform][OpenAI] Add stream usage support Oct 8, 2025
@oesteve oesteve marked this pull request as ready for review November 20, 2025 01:16
Comment on lines 51 to 52
// Chunk also contain metadata
// $textChunk->getMetadata()->get('id'); // Stream id
remove?

/** @var TextChunk $textChunk */
foreach ($result->getContent() as $textChunk) {

// $textChunk implement \Stringable
remove

@oesteve oesteve force-pushed the main branch 2 times, most recently from bcbc538 to 1535511 on November 29, 2025 at 17:46
],
]);

// Output text chunks
Suggested change
// Output text chunks

@chr-hertel chr-hertel left a comment
Thanks for tackling this - one thing that gets me thinking here is the different behavior from a dev point of view, with an extra option and metadata key.

Usually I'd activate the TokenOutputProcessor and expect to have the key token_usage on the result metadata. That expectation doesn't hold with your current approach, which leads to an inconsistent API for developers in my opinion.

Is it possible from your point of view to get rid of the extra option and still use the token_usage key on the result metadata?
I get the issue with multiple calls - so maybe #1051 might help?

Let me know what you think - thanks already!

oesteve commented Dec 7, 2025

@chr-hertel, thanks for the feedback. Your approach with TokenUsageAggregation looks much cleaner and allows the token_usage key to be reused.

On the other hand, the TokenOutputProcessor works well with regular results, but I can't see how to make an OutputProcessorInterface work with streams: the usage is received at the end of the stream, after processOutput has already been called.
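The timing problem can be sketched with a small generator example (a hypothetical illustration, not the actual Symfony AI internals; the event shapes and the streamText helper are assumed):

```php
<?php
// Sketch: a streamed result behaves like a generator, and OpenAI sends the
// usage payload only in the final chunk (when stream_options.include_usage
// is enabled), so usage is known only after iteration has finished -
// i.e. after an output processor hook would already have run.
function streamText(iterable $chunks): \Generator
{
    $usage = null;
    foreach ($chunks as $chunk) {
        if (isset($chunk['usage'])) {
            $usage = $chunk['usage']; // arrives last, after all text deltas
            continue;
        }
        yield $chunk['delta'];
    }

    return $usage; // readable via Generator::getReturn() after iteration
}

$stream = streamText([
    ['delta' => 'Hello'],
    ['delta' => ' world'],
    ['usage' => ['total_tokens' => 7]],
]);

foreach ($stream as $delta) {
    echo $delta; // prints the text chunk by chunk
}

// Only now is usage available - too late for processOutput():
echo $stream->getReturn()['total_tokens']; // 7
```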

@chr-hertel

Yeah, makes sense - that looks tricky - you'd need to do the metadata handling in various places.
Can we go for some kind of middle way though - maybe skipping the processor, dropping that extra option (is it interpreted anyway?), and providing an aggregated object as token_usage at a high level for the devs?
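One possible shape for such an aggregated object (a sketch under assumptions: the class name and array keys are invented for illustration and are not the PR's actual API):

```php
<?php
// Hypothetical aggregator: collapse the per-call usage entries into a single
// total that could be exposed as `token_usage` on the result metadata.
final class AggregatedTokenUsage
{
    /**
     * @param list<array{id: string, total_tokens: int}> $calls
     */
    public function __construct(
        public readonly int $totalTokens,
        public readonly array $calls,
    ) {
    }

    public static function fromCalls(array $calls): self
    {
        $total = 0;
        foreach ($calls as $call) {
            $total += $call['total_tokens'];
        }

        return new self($total, $calls);
    }
}

$usage = AggregatedTokenUsage::fromCalls([
    ['id' => 'call_1', 'total_tokens' => 12],
    ['id' => 'call_2', 'total_tokens' => 30],
]);
echo $usage->totalTokens; // 42
```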


Labels

Agent: Issues & PRs about the AI Agent component
Feature: New feature
Platform: Issues & PRs about the AI Platform component
Status: Needs Work


Development

Successfully merging this pull request may close these issues.

[Agent][Platform][OpenAI] Add support to include usage on streams

4 participants