What happened?
The summaries are not working: each chat_result.summary is supposed to store the LLM-generated summary of the chat, but it is simply not being stored. I also see another open issue about this that is three months old and still unresolved.
Please fix this now; if this can't work, then AutoGen is totally useless.
What did you expect to happen?
I expected each chat_result.summary to contain the reflection_with_llm summary of its chat. Please fix this bug.
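For example, given the summary_prompt of the first chat in the reproduction below, I would expect something along these lines (a sketch with illustrative values, not actual output):

    chat_results = initiate_chats(chats)
    # Expected (illustrative): a JSON-like string produced by the reflection LLM,
    # e.g. "{'name': 'Alice', 'location': 'Seattle'}"
    print(chat_results[0].summary)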
How can we reproduce it (as minimally and precisely as possible)?
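Note: the snippet below references llm_config and customer_proxy_agent, which are not defined in the report. A minimal sketch of definitions that make it runnable, assuming llama-3.2-3b-instruct is served behind a local OpenAI-compatible endpoint (the base_url, api_key, and the proxy agent's settings are assumptions, not taken from the report):

    from autogen import ConversableAgent

    # Assumed: local OpenAI-compatible server hosting llama-3.2-3b-instruct (placeholder URL/key).
    llm_config = {
        "config_list": [
            {
                "model": "llama-3.2-3b-instruct",
                "base_url": "http://localhost:1234/v1",  # placeholder
                "api_key": "not-needed",                 # placeholder
            }
        ]
    }

    # Assumed: a human-proxy agent that relays user input and uses no LLM itself.
    customer_proxy_agent = ConversableAgent(
        name="customer_proxy_agent",
        llm_config=False,
        code_execution_config=False,
        human_input_mode="ALWAYS",
        is_termination_msg=lambda msg: "terminate" in (msg.get("content") or "").lower(),
    )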
Here is the code snippet:

from autogen import ConversableAgent

onboarding_personal_information_agent = ConversableAgent(
    name="Onboarding_Personal_Information_Agent",
    system_message='''You are a helpful customer onboarding agent,
    you work for a phone provider called ACME.
    Your job is to gather the customer's name and location.
    Do not ask for any other information, only ask about the customer's name and location.
    After the customer gives you their name and location, repeat them
    and thank the user, and ask the user to answer with TERMINATE to move on to describing their issue.
    ''',
    llm_config=llm_config,
    human_input_mode="NEVER",
    is_termination_msg=lambda msg: "terminate" in msg.get("content").lower(),
)
onboarding_issue_agent = ConversableAgent(
    name="Onboarding_Issue_Agent",
    system_message='''You are a helpful customer onboarding agent,
    you work for a phone provider called ACME,
    you are here to help new customers get started with our product.
    Your job is to gather the product the customer uses and the issue they currently
    have with the product.
    Do not ask for other information.
    After the customer describes their issue, repeat it and add
    "Please answer with 'TERMINATE' if I have correctly understood your issue." ''',
    llm_config=llm_config,
    human_input_mode="NEVER",
    is_termination_msg=lambda msg: "terminate" in msg.get("content").lower(),
)
customer_engagement_agent = ConversableAgent(
    name="Customer_Engagement_Agent",
    system_message='''You are a helpful customer service agent.
    Your job is to gather the customer's preferences on news topics.
    You are here to provide fun and useful information to the customer based on the user's
    personal information and topic preferences.
    This could include fun facts, jokes, or interesting stories.
    Make sure to make it engaging and fun!
    Return 'TERMINATE' when you are done.''',
    llm_config=llm_config,
    human_input_mode="NEVER",
    is_termination_msg=lambda msg: "terminate" in msg.get("content").lower(),
)
chats = []  # This is going to be our list of chats

chats.append(
    {
        "sender": onboarding_personal_information_agent,
        "recipient": customer_proxy_agent,
        "message":
            "Hello, I'm here to help you solve any issue you have with our products. "
            "Could you tell me your name?",
        "summary_method": "reflection_with_llm",
        "summary_args": {
            "summary_prompt": "Return the customer information "
                              "into a JSON object only: "
                              "{'name': '', 'location': ''}",
        },
        "clear_history": True,
    }
)
chats.append(
    {
        "sender": onboarding_issue_agent,
        "recipient": customer_proxy_agent,
        "message":
            "Great! Could you tell me what issue you're "
            "currently having and with which product?",
        "summary_method": "reflection_with_llm",
        "clear_history": False,
    }
)
chats.append(
    {
        "sender": customer_proxy_agent,
        "recipient": customer_engagement_agent,
        "message": "While we're waiting for a human agent to take over and help you solve "
                   "your issue, can you tell me more about how you use our products or some "
                   "topics interesting for you?",
        "max_turns": 2,
        "summary_method": "reflection_with_llm",
    }
)

chats
from autogen import initiate_chats

chat_results = initiate_chats(chats)

import pprint

for chat_result in chat_results:
    # pprint.pprint(chat_result.chat_history)  # We could also get the whole chat history with this command
    pprint.pprint(chat_result.summary)
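One way to narrow this down (a sketch only; I have not run this comparison) would be to rerun a single chat with the default "last_msg" summary method next to "reflection_with_llm", since both are documented values for summary_method, and compare what ends up in summary:

    # Hypothetical diagnostic, assuming the agents and llm_config defined above.
    for method in ("last_msg", "reflection_with_llm"):
        result = customer_proxy_agent.initiate_chat(
            customer_engagement_agent,
            message="Tell me a fun fact about phones.",
            max_turns=2,
            summary_method=method,
        )
        print(method, "->", repr(result.summary))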
AutoGen version
0.6.1
Which package was this bug in
Core
Model used
llama-3.2-3b-instruct
Python version
No response
Operating system
mac
Any additional info you think would be helpful for fixing this bug
No response