Add support for flows in CrewAI adapter #26
Conversation
force-pushed from 7ddb16b to 8bf8697
force-pushed from 8bf8697 to 36c6948
sonleoracle left a comment:
LGTM!
```python
return

# Replace emit/aemit with no-ops to avoid background threads, rich output, and side effects
monkeypatch.setattr(crewai_event_bus, "emit", lambda *args, **kwargs: None, raising=True)
```
I don't think we should do this, since we will support tracing for the CrewAI adapter, which relies on the CrewAI event bus. If you want to mute CrewAI console prints, you could try:
```python
from crewai.events.event_listener import event_listener as default_listener
from crewai.events.utils.console_formatter import ConsoleFormatter

default_listener.formatter = ConsoleFormatter(verbose=False)
```

```python
def generate_poem(self, sentence_count) -> str:
    try:
        llm = CrewAILlm(
            model="openai//storage/models/Llama-3.1-70B-Instruct",
```
Why does the model name need to be prefixed with `openai//`?
```python
@pytest.fixture
def crewai_flow() -> "CrewAIFlow":
    from crewai import LLM as CrewAILlm
```
In addition to the CrewAI LLM, do you think it makes sense to have a flow node which calls litellm.completion? Like here: https://github.com/ag-ui-protocol/ag-ui/blob/main/integrations/crew-ai/python/ag_ui_crewai/examples/agentic_chat.py
(This allows native tool calls instead of CrewAI's ReACT-style prompting.)
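As an illustrative sketch (not part of this PR): a flow step calling litellm.completion directly can pass OpenAI-style tool definitions, so the model can emit native tool calls instead of being driven through ReACT prompting. Everything below is a hypothetical assumption — the model id, the `get_weather` tool, and the `build_tool_call_request` helper are made up for illustration; litellm.completion accepts OpenAI-compatible `model`, `messages`, and `tools` keyword arguments.

```python
def build_tool_call_request(user_message: str) -> dict:
    """Assemble an OpenAI-compatible chat request with a native tool schema.

    A flow step could then call litellm.completion(**build_tool_call_request(...))
    instead of going through the CrewAI LLM wrapper (assumed usage).
    """
    return {
        "model": "gpt-4o",  # placeholder: any litellm-supported model id
        "messages": [{"role": "user", "content": user_message}],
        "tools": [
            {
                "type": "function",
                "function": {
                    "name": "get_weather",  # hypothetical example tool
                    "description": "Get the current weather for a city",
                    "parameters": {
                        "type": "object",
                        "properties": {"city": {"type": "string"}},
                        "required": ["city"],
                    },
                },
            }
        ],
    }
```

The tool schema is the standard OpenAI function-calling shape, so the model's response would carry structured `tool_calls` rather than tool invocations embedded in free text.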
```python
from pyagentspec.adapters.crewai import AgentSpecLoader

# Mock interactive input() used by InputMessageNodeExecutor
monkeypatch.setattr("builtins.input", lambda: "3")
```
Interesting idea, maybe we can test the client tool conversion with this too (since it just prompts for user input)?
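A minimal sketch of the pattern being discussed (illustrative only — `ask_sentence_count` is a hypothetical stand-in for a node that prompts via `input()`, not code from this PR):

```python
def ask_sentence_count() -> int:
    # Hypothetical stand-in for an interactive node that prompts the user
    return int(input("How many sentences? "))

def test_ask_sentence_count(monkeypatch) -> None:
    # pytest's monkeypatch fixture swaps builtins.input for the duration
    # of the test and restores it afterwards
    monkeypatch.setattr("builtins.input", lambda *args: "3")
    assert ask_sentence_count() == 3
```

The same substitution could feed canned answers to a client tool that collects user input, which is what would make it usable for testing the client tool conversion.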
AgentSpec -> CrewAI
CrewAI -> AgentSpec