
Conversation

@gojkoc54 (Member)

AgentSpec -> CrewAI

  • supported: StartNode, EndNode, ToolNode, LlmNode, InputMessageNode, OutputMessageNode
  • not supported: branching (most notably)

CrewAI -> AgentSpec

  • all nodes are mapped to ToolNode
  • not supported: branching (routers, listeners with multiple triggers)

@gojkoc54 self-assigned this on Dec 17, 2025
@gojkoc54 requested a review from a team on December 17, 2025 15:56
@oracle-contributor-agreement bot added the "OCA Verified: All contributors have signed the Oracle Contributor Agreement" label on Dec 17, 2025
@gojkoc54 force-pushed the add-flows-in-crewai-adapter branch 2 times, most recently from 7ddb16b to 8bf8697 on December 17, 2025 19:10
@gojkoc54 force-pushed the add-flows-in-crewai-adapter branch from 8bf8697 to 36c6948 on December 17, 2025 19:22
@sonleoracle (Contributor) left a comment:

LGTM!

```python
        return

    # Replace emit/aemit with no-ops to avoid background threads, rich output, and side effects
    monkeypatch.setattr(crewai_event_bus, "emit", lambda *args, **kwargs: None, raising=True)
```
Contributor:

I don't think we should do this, since we will support tracing for the CrewAI adapter, which relies on the CrewAI event bus. If you want to mute CrewAI console prints, you could try:

```python
from crewai.events.event_listener import event_listener as default_listener
from crewai.events.utils.console_formatter import ConsoleFormatter

default_listener.formatter = ConsoleFormatter(verbose=False)
```
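If this is primarily for the test suite, the suggestion above could be wrapped in a small helper instead of monkeypatching `emit`. A sketch only: `mute_crewai_console` is a hypothetical name, and the `crewai.events` import paths are taken from the snippet above and may differ across crewai versions.

```python
def mute_crewai_console() -> None:
    """Silence CrewAI's console listener while leaving the event bus
    intact, so adapter tracing (which relies on the bus) keeps working.

    Sketch: import paths follow the suggestion above and may vary
    between crewai versions.
    """
    from crewai.events.event_listener import event_listener as default_listener
    from crewai.events.utils.console_formatter import ConsoleFormatter

    default_listener.formatter = ConsoleFormatter(verbose=False)
```

A test could then call `mute_crewai_console()` from a fixture, keeping `emit`/`aemit` fully functional.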

```python
def generate_poem(self, sentence_count) -> str:
    try:
        llm = CrewAILlm(
            model="openai//storage/models/Llama-3.1-70B-Instruct",
```
Contributor:

Why does it need to be prepended with `openai//`?


```python
@pytest.fixture
def crewai_flow() -> "CrewAIFlow":
    from crewai import LLM as CrewAILlm
```
Contributor:

In addition to the CrewAI LLM, do you think it would make sense to have a flow node that calls litellm.completion directly? Like here: https://github.com/ag-ui-protocol/ag-ui/blob/main/integrations/crew-ai/python/ag_ui_crewai/examples/agentic_chat.py

(This allows native tool calls instead of CrewAI's ReAct-style prompting.)
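Such a flow step might look like the sketch below. Hypothetical throughout: the function name, model string, and prompt are placeholders, and `litellm` is imported lazily inside the function so the sketch stands alone; only `litellm.completion(model=..., messages=...)` and the `response.choices[0].message.content` access are assumed from LiteLLM's public API.

```python
def generate_poem_via_litellm(sentence_count: int) -> str:
    """Hypothetical flow step that calls litellm.completion directly,
    so the provider's native tool-calling can be used instead of
    CrewAI's ReAct-style prompting."""
    import litellm  # assumed available wherever the adapter runs

    response = litellm.completion(
        model="openai/gpt-4o",  # placeholder model string
        messages=[
            {
                "role": "user",
                "content": f"Write a poem with {sentence_count} sentences.",
            }
        ],
    )
    return response.choices[0].message.content
```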

```python
from pyagentspec.adapters.crewai import AgentSpecLoader

# Mock interactive input() used by InputMessageNodeExecutor
monkeypatch.setattr("builtins.input", lambda: "3")
```
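If a flow ever prompts for input more than once, the single-value lambda above would return the same answer every time. A hedged variant (the helper name is made up for illustration) feeds a scripted sequence of answers instead:

```python
def make_scripted_input(answers):
    """Return a stand-in for builtins.input that yields one canned
    answer per call, in order (raises StopIteration when exhausted)."""
    it = iter(answers)
    return lambda prompt="": next(it)

# e.g. monkeypatch.setattr("builtins.input", make_scripted_input(["3", "yes"]))
fake_input = make_scripted_input(["3", "yes"])
```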
Contributor:

Interesting idea, maybe we can test the client tool conversion with this too (since it just prompts for user input)?
