Conversation Design (ICM, LLM, and Hybrid)
This topic discusses common practices in the AI industry related to Conversation Design.
Conversation design is the process of structuring interactions between users and chatbots to ensure a seamless and engaging user experience.
While the fundamental principles of conversation design apply to both Intent-Based Chatbots (ICM, Intent Classification Model) and Large Language Model (LLM) chatbots, their design approaches differ significantly.
ICM (Intent-Based Chatbots) Conversation Design
ICM chatbots rely on predefined intents, entity recognition, and dialog flows to process user queries. They are rule-based and work within structured conversation patterns.
Key Aspects of ICM Conversation Design:
1. Intent Recognition:
Define a set of intents (such as "Order Status" or "Reset Password").
Train a Natural Language Understanding (NLU) model with sample utterances.
2. Entity Extraction:
Identify key variables (such as date, location, or product name).
Use entity recognition to extract this information.
3. Dialogue Flow Design:
Create a flowchart of possible conversations using tools like Dialogflow, IBM Watson, or Microsoft Bot Framework.
Define prompts, user responses, and fallback messages.
4. Error Handling & Fallbacks:
Handle unrecognized inputs with clarifications (such as “I didn’t get that. Can you rephrase?”).
Provide alternative options or connect users to human agents.
5. Guided Conversations:
Use buttons, quick replies, and multiple-choice options to streamline interactions.
Keep the conversation structured and predictable.
6. State Management:
Track user progress within a session.
Maintain conversation history to improve context retention.
Challenges in ICM Conversation Design:
Limited flexibility in handling unexpected inputs.
Requires continuous training to improve accuracy.
Cannot handle open-ended queries effectively.
LLM (Large Language Model) Conversation Design
LLM chatbots (such as ChatGPT, Gemini, and Claude) leverage deep learning models trained on vast amounts of text data. They generate human-like responses and handle open-ended, dynamic conversations.
Key Aspects of LLM Conversation Design:
1. Prompt Engineering:
Craft precise system messages to guide responses.
Use instructions like "Act as a customer support agent and answer concisely."
2. Context Management:
Maintain conversation history to ensure coherent responses.
Use memory techniques or session tokens to retain relevant details.
3. Conversational Tone & Style:
Adapt responses based on the user’s tone, sentiment, and intent.
Implement brand voice and personality (e.g., formal vs. casual).
4. Handling Ambiguity & Clarification:
When user input is unclear, ask clarifying questions.
Avoid assumptions and provide multiple options where necessary.
5. Personalization & Adaptability:
Tailor responses based on past interactions.
Use retrieval-augmented generation (RAG) to fetch real-time data.
6. Fallback & Safety Measures:
Implement guardrails to avoid misinformation.
Use reinforcement learning and human feedback to fine-tune responses.
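As a rough illustration of prompt engineering and context management together, the sketch below assembles the role-tagged message list most chat-completion APIs accept. `SYSTEM_PROMPT`, `MAX_HISTORY`, and the message shape are assumptions, and no real model call is made.

```python
# Sketch: prompt engineering + context management for an LLM chatbot.
# The system message guides tone; the rolling history preserves context.

SYSTEM_PROMPT = "Act as a customer support agent and answer concisely."
MAX_HISTORY = 10  # keep only the last N turns to bound token usage

history: list[dict] = []

def build_messages(user_input: str) -> list[dict]:
    """Assemble the message list sent to the model on each turn."""
    history.append({"role": "user", "content": user_input})
    del history[:-MAX_HISTORY]  # trim old turns to fit the context window
    return [{"role": "system", "content": SYSTEM_PROMPT}, *history]

def record_reply(reply: str) -> None:
    """Store the model's answer so later turns see it as context."""
    history.append({"role": "assistant", "content": reply})

msgs = build_messages("Where is my order?")
print(msgs[0]["role"])  # system
print(len(msgs))        # 2: system prompt + one user turn
```

Trimming to the last few turns is the simplest memory technique; summarization or session tokens are common alternatives when longer context must survive.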
Challenges in LLM Conversation Design:
Requires significant computational resources.
Harder to control response variability.
May generate incorrect or unpredictable answers.
Comparing ICM vs. LLM in Conversation Design
The following table highlights some of the differences between ICM chatbots and LLM chatbots.
| Feature | ICM Chatbots | LLM Chatbots |
| --- | --- | --- |
| Approach | Rule-based (intent-driven) | AI-driven (generative model) |
| Flexibility | Limited, structured flows | Highly flexible, open-ended |
| Training | Requires labeled training data | Trained on massive text input |
| Context Handling | Limited, requires manual state tracking | Maintains memory within sessions |
| Error Handling | Needs pre-defined feedback flows | More adaptable, but may generate hallucinations |
| Personalization | Rule-based customization | Context-aware responses |
| Best Use Cases | FAQs, customer support, structured workflows | Conversational AI, knowledge assistants, creative generation |
When to Use ICM vs. LLM for Chatbots
Use ICM if:
You need a highly structured and predictable chatbot.
The use case involves form-filling, FAQs, or task automation.
Compliance and accuracy are top priorities.
Use LLM if:
The chatbot requires dynamic, natural conversations.
Users need open-ended assistance with complex queries.
You need personalization and context-awareness.
Example Conversation Designs for ICM and LLM Chatbots
The following examples show how both chatbot types handle the same scenario: a customer inquiring about an order status.
ICM (Intent-Based Chatbot) Conversation Flow
Since ICM chatbots follow structured dialogue flows, the conversation is designed with predefined intents, entities, and responses.
Intent: “Check Order Status”
Entities:
• Order Number (such as #123456)
• Customer Name (Optional)
Conversation Flow:
User: “Where is my order?”
Bot: “I can help with that! Can you provide your order number?”
User: “It’s 123456.”
Bot: “Thanks! Let me check… Your order #123456 was shipped on February 3rd and is expected to arrive by February 6th.”
User: “Can I get tracking details?”
Bot: “Sure! Here’s your tracking link: [tracking_link].”
User: “Thanks!”
Bot: “You’re welcome! Let me know if you need anything else.”
Fallback Handling:
User: “I don’t have my order number.”
Bot: “No worries! Can you provide the email or phone number you used to place the order?”
Key Features of ICM Design:
Structured flow (User follows a specific path).
Quick and efficient (Minimal open-ended responses).
Fallback prompts guide users when they don’t have required information.
Limited flexibility (Cannot understand vague or out-of-scope queries).
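The structured flow above can be encoded as a small state machine. The state names, prompts, and branch labels below are illustrative, not the schema of any particular platform:

```python
# Toy state machine for the "Check Order Status" flow.
# Each state has a prompt, a branch when the expected entity is found,
# and a fallback branch when it is missing.
FLOW = {
    "ask_order_number": {
        "prompt": "I can help with that! Can you provide your order number?",
        "on_entity": "give_status",   # order number supplied
        "on_missing": "ask_email",    # fallback branch
    },
    "ask_email": {
        "prompt": "No worries! Can you provide the email or phone number "
                  "you used to place the order?",
        "on_entity": "give_status",
        "on_missing": "ask_email",    # re-prompt until usable input arrives
    },
    "give_status": {
        "prompt": "Thanks! Your order was shipped on February 3rd.",
        "on_entity": "give_status",
        "on_missing": "give_status",
    },
}

def advance(state: str, entity_found: bool) -> str:
    """Move to the next state based on whether the entity was extracted."""
    node = FLOW[state]
    return node["on_entity"] if entity_found else node["on_missing"]

state = "ask_order_number"
state = advance(state, entity_found=False)  # user lacks an order number
print(FLOW[state]["prompt"])  # fallback prompt asking for email/phone
```

This is exactly what makes ICM predictable: every user turn lands on a predefined branch, and anything outside the graph must be handled by a fallback.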
LLM (Large Language Model Chatbot) Conversation Flow
An LLM chatbot processes queries dynamically, maintaining conversation history and handling open-ended dialogue.
Conversation:
User: “Where’s my order?”
Bot: “I can check that for you! Do you have an order number or should I look it up by email?”
User: “I don’t remember my order number.”
Bot: “No problem! What email did you use when placing the order?”
User: “[email protected]”
Bot: “Thanks, John! I found your order. It was shipped on February 3rd and should arrive by February 6th. Do you need the tracking details?”
User: “Yes, please.”
Bot: “Here’s your tracking link: [tracking_link]. Would you like updates via SMS?”
User: “That would be great.”
Bot: “Got it! I’ve set up SMS alerts for you. Let me know if there’s anything else I can do.”
Handling Unexpected Queries:
User: “Is this item available in blue?”
Bot: “Let me check… Yes! The blue version is in stock. Would you like to place an order?”
Key Features of LLM Design:
Flexible and conversational (User isn’t forced down a rigid path).
Handles multi-turn conversations naturally (Remembers context).
Personalized recommendations (Can check inventory dynamically).
Potentially variable responses (May generate unintended answers).
Less structured control (Might need additional prompt engineering to stay on topic).
Comparison of ICM vs. LLM in this Scenario
The following table shows how the two bots handled requests in this scenario.
Comparison Summary
• If the chatbot’s goal is structured customer service (FAQs, transactions, troubleshooting), an ICM chatbot is best.
• If the chatbot needs natural, engaging conversations with dynamic adaptability, an LLM chatbot is ideal.
• Many businesses use a hybrid approach, where ICM handles straightforward tasks, and LLM engages in open-ended discussions.
Generic Processing Cost Comparison
The following table is a general example to give you an idea of the difference in processing costs between ICM Solutions and LLM solutions.
| Cost Factor | ICM Solutions | LLM Solutions |
| --- | --- | --- |
| API Cost per Request | ~$0.0001–$0.005 | $0.03–$0.12 per 1,000 tokens |
| Compute Cost | Lower (optimized for specific tasks) | Higher (general-purpose models) |
| Fine-tuning Cost | Typically not required | Expensive (unless using retrieval techniques) |
| Real-time Efficiency | High | Lower (depends on model size) |
| Hardware Requirements | Lower | Higher (GPUs/TPUs required) |
Cost Comparison Summary
Use ICM if the request is structured, rules-based, and cost-sensitive.
Use LLM if the request requires reasoning, creativity, or multi-turn contextual awareness but at a higher cost.
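A quick back-of-the-envelope calculation with the illustrative figures from the table above shows how the per-request gap adds up; the 500-token request size is an assumption, and real pricing varies by provider.

```python
# Rough cost estimate using the illustrative figures from the table above.
# Token counts and prices are examples only, not real provider pricing.

def llm_cost(tokens: int, price_per_1k: float) -> float:
    """Per-request LLM cost for a given token count and per-1K-token price."""
    return tokens / 1000 * price_per_1k

request_tokens = 500  # a typical multi-turn exchange (assumption)

low = llm_cost(request_tokens, 0.03)    # 0.015
high = llm_cost(request_tokens, 0.12)   # 0.06
icm = 0.005                             # upper end of the ICM range

print(f"LLM: ${low:.3f}-${high:.3f} per request vs ICM: ~${icm:.3f}")
```

Even at the high end of the ICM range, a single LLM request can cost several times more, which is one of the main motivations for the hybrid designs discussed next.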
Hybrid Conversation Design
Many enterprises use a combination of **ICM for transactional queries** and **LLM for open-ended conversations** to balance control, costs, and flexibility.
Hybrid Chatbot: Combining ICM and LLM Approaches
A hybrid chatbot integrates ICM (Intent-Based) and LLM (Large Language Model) capabilities to offer structured yet flexible conversations.
This approach ensures accuracy and efficiency while allowing natural, open-ended interactions.
Scenario: Customer Checking Order Status
Approach:
• ICM handles structured queries (e.g., retrieving order status).
• LLM processes open-ended or unexpected questions (e.g., product recommendations, delivery issues).
Hybrid Chatbot Conversation Flow
Step 1: User Asks About Order Status
User: “Where’s my order?”
Bot (ICM Mode): “I can check that for you! Do you have your order number?”
Why ICM?
• Checking order status follows a structured data retrieval process.
• An intent-based approach ensures reliability.
Step 2: User Doesn’t Remember Order Number
User: “I don’t have it.”
Bot (LLM Mode): “No worries! I can look it up by your email or phone number. Which one would you like to use?”
Why LLM?
• The chatbot adapts dynamically instead of just saying “I need an order number.”
• Offers multiple ways to retrieve information.
Step 3: Retrieving and Providing Order Status
User: “My email is [email protected].”
Bot (ICM Mode): “Thanks, John! I found your order. It was shipped on February 3rd and should arrive by February 6th. Do you need tracking details?”
Why ICM?
• Order lookups are structured database queries.
Step 4: User Asks an Unexpected Question
User: “Can I change the delivery address?”
Bot (LLM Mode): “Let me check… Changing the address depends on your order status. If your package hasn’t been shipped, I can update it for you. Would you like me to try?”
Why LLM?
• This response requires reasoning and conditional logic.
• If ICM had handled this, it might have given a rigid “Yes/No” response without understanding the context.
Step 5: Order Address Change Request
User: “Yes, please change it to 123 New Street, NY.”
Bot (ICM Mode): “I’ve sent a request to update your address. You’ll receive a confirmation shortly. Is there anything else I can help with?”
Why ICM?
• Address changes must follow strict business rules, so a predefined workflow ensures accuracy.
Step 6: User Asks a Follow-up Question
User: “By the way, does this item come in blue?”
Bot (LLM Mode): “Yes! The blue version is in stock. Would you like me to place an order for you?”
Why LLM?
• This is an out-of-scope query for a basic order status bot, but an LLM can handle product inquiries dynamically.
Hybrid Chatbot Architecture
1. User Input → Intent Detection
• If the query matches an existing intent (e.g., “Order Status”), use ICM.
• If the query is open-ended or falls outside predefined intents, switch to LLM.
2. ICM Handles Structured Workflows
• Retrieves order information, processes address changes, handles FAQs.
3. LLM Handles Dynamic Conversations
• Responds to unexpected questions.
• Uses retrieval-augmented generation (RAG) to fetch real-time product data.
4. Fallback Mechanism
• If neither ICM nor LLM can provide a proper response, escalate to a human agent.
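The four architecture steps above can be sketched as a single dispatch function. The intent names, confidence threshold, and `llm_available` flag are all assumptions for illustration:

```python
# Sketch of hybrid routing: ICM for high-confidence known intents,
# LLM for everything open-ended, a human agent as the last resort.
# Intent names and the threshold are illustrative assumptions.

ICM_INTENTS = {"order_status", "address_change", "faq"}
CONFIDENCE_THRESHOLD = 0.8

def route(intent: str, confidence: float, llm_available: bool = True) -> str:
    """Decide which subsystem handles the current user turn."""
    if intent in ICM_INTENTS and confidence >= CONFIDENCE_THRESHOLD:
        return "icm"    # structured workflow (step 2)
    if llm_available:
        return "llm"    # open-ended / out-of-scope query (step 3)
    return "human"      # fallback escalation (step 4)

print(route("order_status", 0.95))     # icm
print(route("product_question", 0.4))  # llm
```

In practice the confidence score would come from the ICM platform's intent classifier, and the router would also log low-confidence turns so new intents can be trained over time.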
Use Cases for Hybrid Chatbots
• Customer Support (Order tracking + FAQs + Personalized Assistance).
• E-commerce Assistants (Product recommendations + Transactions).
• Banking & Finance (Account queries + Fraud detection).
• Healthcare (Appointment booking + Medical advice FAQs).
Hybrid Solution Summary
• ICM ensures reliability for structured tasks.
• LLM enhances flexibility by handling open-ended queries.
• A hybrid model balances efficiency and adaptability for the best customer experience.