Responses
The Responses page is the first entry under AI Performance Management. It lists every question the AI has answered across all channels and provides tools for quality review and analysis.
Key Capabilities
Search and Filtering
Search Bar
Search by question text, keywords, or specific phrases to find relevant responses quickly.
Reset & Settings
- Reset Button: Clears all active filters to show all responses
- Settings Icon: Opens additional filter options (see the sketch after this list), including:
  - Date range selection
  - Channel filtering
  - Response status filtering
  - Feedback filtering
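For orientation, here is a minimal sketch of how these filter options might look if modeled in code. The field names (`date_from`, `channel`, `status`, and so on) are illustrative assumptions, not documented parameters of the product.

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

# Hypothetical filter model; the field names are illustrative
# assumptions, not documented parameters of the product.
@dataclass
class ResponseFilters:
    search: str = ""                  # Search bar text
    date_from: Optional[date] = None  # Date range selection
    date_to: Optional[date] = None
    channel: Optional[str] = None     # e.g. "AI Chat", "Forum"
    status: Optional[str] = None      # Response status filtering
    feedback: Optional[str] = None    # e.g. "thumbs_up", "thumbs_down"

def reset(_: ResponseFilters) -> ResponseFilters:
    """Mirrors the Reset button: returns a filter set with nothing active."""
    return ResponseFilters()
```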
Response Table
The main table displays comprehensive response information:
| Column | Description |
| --- | --- |
| Question | Snippet of the original question asked |
| Channel | Where the question originated (AI Chat, Forum, etc.) |
| Date | When the response was generated |
| Feedback | Thumbs up/down counts from users |
| Tags | Applied tags for categorization |
| Assessments | Quality assessment indicators |
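As a reading aid, the sketch below models one table row as a record. The field names are assumptions made for illustration, not a documented schema.

```python
from dataclasses import dataclass, field
from datetime import datetime

# Illustrative shape of one row in the Response table (assumed names).
@dataclass
class ResponseRow:
    question: str                 # Snippet of the original question
    channel: str                  # e.g. "AI Chat", "Forum"
    date: datetime                # When the response was generated
    thumbs_up: int = 0            # Feedback: thumbs-up count
    thumbs_down: int = 0          # Feedback: thumbs-down count
    tags: list[str] = field(default_factory=list)         # Categorization tags
    assessments: list[str] = field(default_factory=list)  # Flagged rubric issues
```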
Assessment Indicators
The Assessments column shows icons indicating which rubric issues, if any, were flagged:
- Conflicting knowledge
- Knowledge gaps
- Hallucination
- Bad links
- Other quality issues
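If it helps to think of these flags as a fixed set, the enumeration below mirrors the icon list; modeling them as an enum is an illustrative assumption, not a documented type.

```python
from enum import Enum

# Assessment flags as shown in the Assessments column.
# Representing them as an enum is an illustrative assumption.
class AssessmentFlag(Enum):
    CONFLICTING_KNOWLEDGE = "conflicting_knowledge"
    KNOWLEDGE_GAP = "knowledge_gap"
    HALLUCINATION = "hallucination"
    BAD_LINK = "bad_link"
    OTHER = "other"
```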
Response Details Page
Clicking any response row opens a comprehensive Response Details Page that provides:
Primary Information:
- Original Question: The complete question the user asked
- AI Answer: The full AI-generated response
- Assessment Icons: Visual indicators showing quality assessment results
- Add Knowledge Button: Direct access to contribute missing information
Feedback Management:
- Feedback List: Complete history of user feedback and reactions
- Answer Feedback Button: Add new feedback or rate the response quality
- User Comments: Detailed feedback and comments from users
Source Analysis:
- Answer Sources Table: Detailed breakdown of knowledge sources used
- Source Titles: Names of the knowledge sources that contributed
- Source Types: Type of each source (website, document, forum, etc.)
- Number of Uses: How many times each source was referenced
- Source Attribution: Clear mapping of which parts of the answer came from which sources
Navigation Controls:
- Previous/Next Buttons: Navigate through responses without returning to the main list
- Quick Navigation: Move efficiently between related responses
- Return to List: Easy way to get back to the main responses table
Pagination
Navigation controls at the bottom of the table let you browse through large response sets.
How It Works
Response Monitoring
Use this page to audit AI answers across all channels. The system tracks:
- Response Generation: When and where each response occurred
- User Feedback: Likes and dislikes from users
- Quality Assessment: Issues flagged according to the 21-point rubric
- Context Information: Full conversation and knowledge sources used
Quality Review Process
1. Browse Responses: Use filters to find responses to review
2. Check Assessments: Look for flagged quality issues and assessment scores
3. Review Feedback: Monitor user satisfaction indicators and comments
4. Open Details Page: Click any response to view the comprehensive Response Details Page
5. Analyze Sources: Review the Answer Sources table to understand which knowledge sources contributed
6. Add Feedback: Use the Answer Feedback button to provide quality ratings and comments
7. Contribute Knowledge: Use the Add Knowledge button to fill information gaps
8. Navigate Efficiently: Use Previous/Next controls to review multiple responses systematically
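To make the sequence concrete, here is a minimal triage loop over the steps above. Every helper name (`review_batch`, `add_feedback`, `add_knowledge`) is a hypothetical placeholder standing in for an action you perform in the UI; none of them is a real API.

```python
# Hypothetical stand-ins for UI actions; not a real API.
def add_feedback(response: dict, rating: str) -> None:
    print(f"feedback on {response['question']!r}: {rating}")

def add_knowledge(response: dict) -> None:
    print(f"contributing knowledge for {response['question']!r}")

HIGH_PRIORITY = {"hallucination", "bad_link"}  # address immediately

def review_batch(responses: list[dict]) -> None:
    for response in responses:                  # step 1: Browse Responses
        flags = set(response["assessments"])    # step 2: Check Assessments
        if flags & HIGH_PRIORITY:               # step 6: Add Feedback on urgent issues
            add_feedback(response, rating="needs_fix")
        if "knowledge_gap" in flags:            # step 7: Contribute Knowledge
            add_knowledge(response)

review_batch([
    {"question": "How do I reset my password?", "assessments": ["knowledge_gap"]},
    {"question": "What is the refund policy?", "assessments": ["hallucination"]},
])
```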
Knowledge Contribution Features
Brain-Dump Interface
When reviewing responses with knowledge gaps or quality issues, you can directly contribute knowledge:
Text Contribution:
- Quick Entry: Type additional information directly into the contribution interface
- Structured Input: Provide information in categories (facts, procedures, troubleshooting, etc.)
- Context Tags: Tag contributions with relevant topics and keywords
- Source Attribution: Link contributions to authoritative sources when possible
Voice Contribution:
- Voice Recording: Record knowledge contributions using the voice interface
- Automatic Transcription: AI converts voice recordings to searchable text
- Voice-to-Knowledge: System processes voice input and creates structured knowledge entries
- Speaker Recognition: System attributes contributions to specific team members
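Conceptually, the voice path amounts to a transcribe-then-structure pipeline. The sketch below shows that shape only; `transcribe` is a placeholder for the platform's internal transcription step, and the entry fields are assumed names.

```python
# Conceptual pipeline only; the platform performs these steps internally.
def transcribe(audio_path: str) -> str:
    """Placeholder for the Automatic Transcription step."""
    raise NotImplementedError("performed by the platform")

def voice_to_knowledge(audio_path: str, contributor: str) -> dict:
    text = transcribe(audio_path)          # Automatic Transcription
    return {                               # Voice-to-Knowledge entry (assumed fields)
        "content": text,
        "contributor": contributor,        # Speaker Recognition output
        "source_type": "contributed_knowledge",
    }
```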
Contribution Processing
- Content Analysis: AI analyzes contributed knowledge for relevance and accuracy
- Automatic Source Creation: System creates new "contributed knowledge" sources automatically
- Knowledge Source Integration: New knowledge is integrated into appropriate Knowledge Source Configurations
- Quality Validation: Contributions are reviewed and validated before becoming active
- Version Control: All contributions are versioned and can be updated or rolled back
Impact Tracking
- Contribution Attribution: Track which team members contribute the most valuable knowledge
- Knowledge Utilization: Monitor how contributed knowledge improves future responses
- Quality Improvement: Measure response quality improvements from contributed knowledge
- Knowledge Coverage: Identify remaining knowledge gaps for future contribution priorities
Feedback Analysis
The feedback column displays counts that help identify:
- Popular Responses: High thumbs-up counts
- Problem Responses: High thumbs-down counts
- Trending Issues: Patterns in negative feedback
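As an illustration of how these counts can be triaged, the snippet below classifies a response by its feedback ratio. The thresholds are arbitrary examples, not product defaults.

```python
# Classify a response by its thumbs-up/down counts.
# Thresholds are arbitrary examples, not product defaults.
def classify_feedback(thumbs_up: int, thumbs_down: int,
                      min_votes: int = 5) -> str:
    total = thumbs_up + thumbs_down
    if total < min_votes:
        return "insufficient data"
    ratio = thumbs_up / total
    if ratio >= 0.8:
        return "popular"          # High thumbs-up counts
    if ratio <= 0.3:
        return "problem"          # High thumbs-down counts
    return "mixed"

print(classify_feedback(12, 1))   # -> "popular"
print(classify_feedback(2, 9))    # -> "problem"
```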
Understanding Assessments
The 21-point rubric evaluates responses across multiple dimensions with detailed scoring:
Knowledge Quality Issues
- Conflicting Knowledge: When sources contradict each other (indicates need for source review)
- Knowledge Gap: When relevant information is missing (opportunity for knowledge contribution)
- Outdated Information: When sources contain old data (triggers source update recommendations)
Response Accuracy Issues
- Hallucination: AI generating false information (highest priority for correction)
- Misinterpretation: Misunderstanding the question (may indicate need for better context)
- Incomplete Answers: Missing key information (opportunity for knowledge enhancement)
Technical Issues
- Bad Links: Broken or incorrect URLs (automated link checking recommendations)
- Formatting Problems: Display or structure issues (system configuration issues)
- Source Attribution: Missing or incorrect citations (knowledge source tagging improvements)
Quality Metrics
- Correctness Score: Overall accuracy rating (1-10 scale)
- Guardrails Compliance: Adherence to safety and policy guidelines
- User Satisfaction: Direct user feedback correlation
- Knowledge Source Effectiveness: How well sources supported the response
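The product does not document a combined formula, but to see how these metrics could interact, here is an assumed weighted composite; the weights and the guardrails gate are purely illustrative.

```python
# Illustrative composite of the quality metrics above. Weights and the
# normalization are assumptions for the sketch, not a documented formula.
def quality_score(correctness: int,        # Correctness Score, 1-10 scale
                  guardrails_ok: bool,     # Guardrails Compliance
                  satisfaction: float,     # User Satisfaction, 0.0-1.0
                  source_effectiveness: float) -> float:  # 0.0-1.0
    score = (0.5 * (correctness / 10)
             + 0.3 * satisfaction
             + 0.2 * source_effectiveness)
    return score if guardrails_ok else 0.0   # treat compliance as a hard gate

print(round(quality_score(8, True, 0.9, 0.7), 2))  # -> 0.81
```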
Assessment Actions
When assessment issues are identified, the system provides:
- Immediate Fixes: Quick corrections for obvious errors
- Knowledge Contribution Prompts: Suggestions to add missing knowledge
- Source Update Recommendations: Guidance on improving knowledge sources
- Training Opportunities: Learning moments for AI improvement
Troubleshooting
Missing Responses
If you don't see expected responses:
- Date Filters: Adjust the date range; responses may fall outside the currently selected range
- Channel Access: Ensure you have permission to view all channels
- Loading Time: The page loads results gradually; wait for the table to populate
- Search Terms: Try different search terms or clear search filters
Assessment Issues
If assessments seem incorrect:
- Review Criteria: Check the rubric definitions to understand flagging criteria
- Context Analysis: Click into the response to see full context
- Knowledge Sources: Review the knowledge sources that were used
- Feedback Correlation: Compare assessments with user feedback
Performance Issues
If the page loads slowly:
- Narrow Filters: Use more specific date ranges or channel filters
- Clear Cache: Refresh the browser or clear its cache
- Batch Processing: Review responses in smaller batches
Best Practices
For Support Teams
- Regular Reviews: Set up routine response quality reviews
- Feedback Analysis: Pay attention to patterns in user feedback
- Quick Actions: Address high-priority issues (hallucinations, bad links) immediately
- Trend Monitoring: Watch for recurring problems that indicate systemic issues
For Administrators
- Knowledge Curation: Use assessment data to identify knowledge source problems
- Channel Optimization: Adjust channel configurations based on response quality
- Team Training: Share insights from response analysis with team members
- Performance Tracking: Monitor overall response quality trends over time
For Quality Analysts
- Systematic Review: Use filters to systematically review different types of responses
- Deep Dive Analysis: Use the Response Details Page to thoroughly analyze individual responses
- Source Evaluation: Review the Answer Sources table to understand knowledge source effectiveness
- Pattern Recognition: Look for patterns in assessment flags and user feedback
- Root Cause Analysis: Investigate underlying causes of quality issues using detailed assessment data
- Navigation Efficiency: Use Previous/Next controls for systematic review of multiple responses
- Knowledge Gap Identification: Use assessment data and source analysis to identify specific knowledge gaps
- Contribution Prioritization: Prioritize knowledge contribution efforts based on response analysis
- Quality Trend Analysis: Track how knowledge contributions improve response quality over time
For Knowledge Contributors
- Response Context: Always review the full conversation context before contributing knowledge
- Accurate Information: Ensure contributed knowledge is accurate and current
- Clear Attribution: Provide source attribution for contributed information when possible
- Structured Input: Use clear, structured format for knowledge contributions
- Voice Quality: When using voice contribution, speak clearly and provide complete information
- Follow-Up: Monitor how your contributions affect future response quality
Best Practices for Add Knowledge Feature
When to Use Add Knowledge
Optimal Timing:
- Knowledge Gaps: When the AI response indicates missing or incomplete information
- Outdated Information: When you notice the response contains outdated or inaccurate data
- Industry Expertise: When you have specialized knowledge that would improve future responses
- Customer-Specific Context: When adding context that would help with similar customer situations
Quality Indicators:
- Assessment Flags: Use the Add Knowledge button when quality assessment indicators show knowledge gaps
- User Feedback: Add knowledge when negative feedback suggests missing information
- Source Analysis: Review the Answer Sources table to identify areas where additional sources would help
- Competitive Intelligence: Add knowledge about competitive positioning and market changes
Effective Knowledge Contribution Strategies
Content Structure:
- Clear Categories: Organize knowledge into logical categories (facts, procedures, troubleshooting, competitive analysis)
- Specific Examples: Include concrete examples and use cases that demonstrate the knowledge application
- Context Tags: Tag contributions with relevant topics, customer types, and use cases for better retrieval
- Update Frequency: Regularly review and update contributed knowledge to maintain accuracy
Integration with Responses:
- Response Details Review: Use the Response Details Page to understand the full context before contributing
- Source Attribution: Reference the Answer Sources table to complement existing knowledge sources
- Feedback Integration: Combine knowledge contribution with feedback to provide comprehensive improvement
- Navigation Efficiency: Use Previous/Next controls to systematically review and contribute across multiple responses
Measuring Knowledge Contribution Impact
Performance Tracking:
- Response Quality Improvement: Monitor how your contributions improve subsequent response quality scores
- Knowledge Utilization: Track usage statistics of contributed knowledge sources
- User Satisfaction: Watch for improvements in user feedback after knowledge contributions
- Coverage Analysis: Identify remaining knowledge gaps for future contribution priorities
Continuous Improvement:
- Regular Review: Periodically review your contributed knowledge for accuracy and relevance
- Usage Analytics: Use the knowledge source usage data to understand which contributions are most valuable
- Team Collaboration: Share insights about effective knowledge contribution patterns with team members
- Quality Validation: Monitor how contributed knowledge performs against the 21-point assessment rubric
Related Features
- Reporting - View aggregated analytics and trends from response data
- Knowledge Sources - Update content that's causing quality issues
- Channels - Adjust channel configurations based on performance
- AI Chat - Where many of these responses originate