Reporting
Reporting, found under AI Performance Management, provides dashboards and charts summarizing AI activity and quality metrics across all channels.
Workflow: Review individual responses in Responses → Analyze patterns and trends in Reporting → Identify improvement opportunities
Common Filters:
- Date: Last 7 days, Last 30 days, Custom range
- Channel: Specific channels or all channels
- Issue Type: Rubric flags (hallucination, knowledge gap, etc.)
- Segment: User groups, roles, or demographics
Key Capabilities
Filters and Controls
Filter Row
The top section provides multiple filtering options:
- Channels: Choose specific channels or leave blank to show all
- Date Range: Select predefined ranges (Last 30 Days, Last 7 Days, etc.) or custom dates
- Issue Types: Filter by specific assessment types or quality issues
- User Segments: Filter by user groups or roles (if applicable)
Date Picker
The calendar icon allows you to set custom date ranges for detailed time-period analysis.
Response Count Chart
A stacked bar chart displaying responses per date and channel with interactive elements:
Chart Features:
- Stacked Bars: Each bar shows the total responses for a date or period
- Channel Breakdown: Each bar segment represents responses from a different channel
- Color-Coded Legend: Channel identification with clickable legend items
- Date Range Filter: Interactive date picker to focus on specific time periods
- Hover Details: Mouse over bars to see exact counts and channel breakdowns
- Channel Toggle: Click legend items to show/hide specific channels
Interactive Controls:
- Date Range Selector: Choose predefined ranges or custom date periods
- Channel Filtering: Enable/disable specific channels in the visualization
- Export Options: Download chart data or images for reporting
Activities Table
A comprehensive summary table showing individual user engagement metrics:
| Column | Description |
| --- | --- |
| User ID | Identifier for the user who provided feedback |
| Response Likes | Number of positive feedback ratings given to AI responses |
| Response Dislikes | Number of negative feedback ratings given to AI responses |
| Response Text Count | Total AI responses the user interacted with |
| Retrieval Likes | Positive feedback ratings on information retrieval quality |
| Retrieval Dislikes | Negative feedback ratings on information retrieval quality |
| Retrieval Text Count | Total information retrievals the user interacted with |
User Feedback Charts
Interactive bar charts providing detailed feedback analysis with multiple viewing options:
Chart Options:
- Group by User ID: View feedback patterns by individual users
- Group by Feedback Type: Aggregate feedback by like/dislike categories
- Time Series View: Track feedback trends over time
- Channel Comparison: Compare feedback across different channels
Interactive Features:
- Download Feedback Report Button: Export detailed feedback data in multiple formats
- Filter Controls: Focus on specific users, feedback types, or time periods
- Drill-Down Capability: Click chart segments for detailed breakdowns
- Trend Analysis: Identify patterns in user satisfaction over time
Export and Analysis
Download Report
Export current view data in multiple formats:
- CSV: For spreadsheet analysis
- PDF: For presentation or archival
- Excel: For advanced data manipulation
Data Export Options
- Current filtered view
- Full dataset (subject to permissions)
- Summary statistics
- Detailed activity logs
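For offline analysis, a CSV export of the current filtered view can be aggregated with a few lines of standard-library Python. The column names (`date`, `channel`, `responses`) and the sample rows below are assumptions for illustration, not the product's actual export schema:

```python
import csv
import io
from collections import Counter

# Hypothetical shape of a CSV export of the current filtered view.
EXPORT = """\
date,channel,responses
2024-05-01,slack,12
2024-05-01,web,30
2024-05-02,slack,9
2024-05-02,web,25
"""

# Sum responses per channel across all exported dates.
per_channel = Counter()
for row in csv.DictReader(io.StringIO(EXPORT)):
    per_channel[row["channel"]] += int(row["responses"])

print(dict(per_channel))  # {'slack': 21, 'web': 55}
```

In practice you would replace the inline string with `open("report.csv")` pointing at your downloaded file.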
How It Works
Data Aggregation
Reporting aggregates data from all channels to provide:
- Usage Patterns: When and how the AI is being used
- Quality Trends: Changes in response quality over time
- User Engagement: How users interact with AI responses
- Channel Performance: Comparative performance across channels
Real-Time Updates
- Data refreshes automatically at regular intervals
- Refresh manually by reloading the page in your browser
- Filter changes update charts immediately
- Export data reflects current filter state
Interactive Analysis
- Filter Data: Use filters to focus on specific time periods or channels
- Visual Exploration: Hover over charts to see detailed breakdowns
- Drill Down: Click chart elements to explore specific data points
- Export Insights: Download data for offline analysis
Key Metrics
Response Volume Metrics
- Total Responses: Overall AI activity level
- Response Rate: Responses per time period
- Channel Distribution: Which channels are most active
- Peak Usage Times: When users most frequently interact with the AI
Quality Metrics
- Feedback Ratio: Positive vs negative feedback percentages
- Assessment Flags: Frequency of quality issues
- User Satisfaction: Overall satisfaction trends
- Issue Categories: Types of problems most commonly flagged
User Engagement Metrics
- Active Users: Number of unique users providing feedback
- Engagement Rate: Percentage of responses that receive feedback
- Power Users: Users with high interaction volumes
- Feedback Patterns: How feedback patterns change over time
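Two of the metrics above reduce to simple ratios: feedback ratio is the positive share of all explicit feedback, and engagement rate is the fraction of responses that received any feedback at all. A minimal sketch (function names are illustrative, not part of the product):

```python
def feedback_ratio(likes: int, dislikes: int) -> float:
    """Positive share of all explicit feedback, in [0.0, 1.0]."""
    total = likes + dislikes
    if total == 0:
        raise ValueError("no feedback recorded in this period")
    return likes / total

def engagement_rate(responses_with_feedback: int, total_responses: int) -> float:
    """Fraction of responses that received any feedback."""
    if total_responses == 0:
        raise ValueError("no responses in this period")
    return responses_with_feedback / total_responses

print(feedback_ratio(45, 5))     # 0.9
print(engagement_rate(50, 400))  # 0.125
```

Keeping the two numbers separate matters: a high feedback ratio computed over a tiny engaged minority can mask widespread silence from the rest of the user base.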
Troubleshooting
Empty Graphs
If charts show no data:
- Date Range: Ensure there are responses within the selected time period
- Channel Access: Verify you have permission to view the selected channels
- Data Availability: Expand the date range if the current period has no activity
- Filter Settings: Check that filters aren't excluding all data
Export Issues
If reports won't download:
- Pop-up Blockers: Check browser pop-up blocker settings
- File Size: Try a smaller date range for large datasets
- Browser Permissions: Ensure download permissions are enabled
- Network Issues: Check internet connection stability
Performance Problems
If the page loads slowly:
- Date Range: Use shorter time periods for faster loading
- Channel Selection: Limit to specific channels rather than "All"
- Browser Cache: Clear browser cache and reload
- Data Volume: Be aware that larger datasets take longer to process
Analysis Tips
Trend Analysis
- Compare Periods: Use date filters to compare performance across different time periods
- Seasonal Patterns: Look for recurring patterns in usage and feedback
- Channel Performance: Compare metrics across different channels to identify best practices
Quality Monitoring
- Feedback Trends: Monitor changes in user satisfaction over time
- Issue Identification: Use assessment data to identify systemic quality problems
- Improvement Tracking: Measure the impact of knowledge source updates
User Behavior
- Engagement Patterns: Identify which users are most engaged with AI responses
- Feedback Quality: Analyze feedback patterns to understand user satisfaction
- Usage Distribution: Understand how different user groups interact with the system
Best Practices
For Managers
- Regular Reviews: Schedule weekly or monthly reporting reviews
- Trend Monitoring: Watch for significant changes in key metrics
- Cross-Channel Analysis: Compare performance across different channels
- Action Items: Create action items based on reporting insights
For Analysts
- Interactive Chart Usage: Leverage date range filters and channel toggles for focused analysis
- Data Export: Use the Download Feedback Report button and other export options for deeper analysis
- Multi-View Analysis: Switch between User ID and Feedback Type groupings for different perspectives
- Baseline Establishment: Establish baseline metrics using historical date range comparisons
- Correlation Analysis: Look for correlations between user engagement and feedback patterns
- Drill-Down Investigation: Use chart click-through features to investigate specific data points
- Trend Identification: Use the User Feedback charts to identify satisfaction trends over time
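Baseline establishment can be as simple as comparing the current period's mean against the mean of several historical windows and flagging large relative deviations. The sketch below uses an arbitrary 20% threshold and invented daily dislike counts purely for illustration:

```python
from statistics import mean

def deviates_from_baseline(current: list[float], baseline: list[float],
                           threshold: float = 0.2) -> bool:
    """Flag when the current period's mean differs from the baseline mean
    by more than `threshold` as a relative fraction. The 20% default is
    an arbitrary illustration, not a recommended alert level."""
    base = mean(baseline)
    if base == 0:
        return mean(current) != 0
    return abs(mean(current) - base) / base > threshold

# Daily dislike counts: two historical weeks (baseline) vs. the current week.
baseline_weeks = [3, 4, 2, 3, 4, 3, 2, 3, 5, 3, 2, 4, 3, 3]
current_week = [6, 5, 7, 6, 5, 6, 7]
print(deviates_from_baseline(current_week, baseline_weeks))  # True
```

A longer baseline window smooths out day-to-day noise; the trade-off is that it reacts more slowly to genuine shifts.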
For Operations Teams
- Daily Monitoring: Check key metrics daily for operational issues
- Alert Thresholds: Establish thresholds for when action is needed
- Response Protocols: Have procedures for addressing quality drops
- Continuous Improvement: Use data to drive ongoing system improvements
Best Practices for Interpreting Charts and Reports
Response Count Chart Analysis
Reading the Chart:
- Bar Height Interpretation: Higher bars indicate periods of increased AI activity and user engagement
- Channel Segmentation: Use the color-coded legend to identify which channels drive the most activity
- Time Pattern Recognition: Look for daily, weekly, or seasonal patterns in usage that inform resource planning
- Trend Identification: Compare bar heights over time to identify growth trends or usage decline
Actionable Insights:
- Peak Usage Planning: Schedule maintenance and updates during low-activity periods identified in the chart
- Channel Performance: Focus improvement efforts on high-activity channels shown in the chart
- Resource Allocation: Use activity patterns to allocate support resources during peak times
- Growth Measurement: Track month-over-month or quarter-over-quarter growth using chart trends
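Month-over-month or quarter-over-quarter growth from chart totals is a single percentage calculation. A minimal sketch, using made-up monthly totals:

```python
def growth_pct(previous_total: int, current_total: int) -> float:
    """Period-over-period growth in percent (e.g. month-over-month)."""
    if previous_total == 0:
        raise ValueError("previous period has no responses")
    return (current_total - previous_total) / previous_total * 100

# Hypothetical monthly response totals read off the Response Count chart.
print(growth_pct(1200, 1500))  # 25.0
```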
User Feedback Chart Interpretation
Understanding Feedback Patterns:
- Like/Dislike Ratios: High dislike ratios indicate areas needing immediate attention
- User-Specific Patterns: Individual user feedback patterns can reveal training needs or system issues
- Channel Comparisons: Compare feedback quality across channels to identify best-performing configurations
- Trend Analysis: Track feedback improvement over time to measure the impact of system changes
Strategic Decision Making:
- Quality Interventions: Use negative feedback clusters to trigger knowledge source reviews
- Success Pattern Replication: Identify what makes high-satisfaction channels successful and replicate those patterns
- User Training Needs: Individual user feedback patterns can inform targeted training programs
- System Optimization: Use feedback trends to guide AI configuration and knowledge source improvements
Activities Table Insights
Performance Metrics:
- Engagement Levels: High interaction counts indicate active user engagement with the AI system
- Feedback Quality: Users with high like-to-dislike ratios are good candidates for knowledge contribution roles
- Usage Patterns: Compare response counts with feedback counts to identify user engagement levels
- Power User Identification: Users with high activity levels can become system champions and trainers
Operational Applications:
- Training Prioritization: Focus training efforts on users with low engagement or high dislike ratios
- Champion Development: Identify users with high engagement and positive feedback for leadership roles
- System Health Monitoring: Sudden changes in user activity patterns can indicate system issues
- Feature Adoption: Track how new features affect user engagement and feedback patterns
Export and Analysis Workflows
Data Download Strategy:
- Regular Exports: Schedule regular data exports for historical analysis and trend tracking
- Filtered Analysis: Use date range and channel filters before exporting to focus on specific areas
- Comparative Studies: Export data from different time periods to measure improvement initiatives
- Cross-Platform Analysis: Combine ept AI data with external analytics tools for comprehensive insights
Advanced Analysis Techniques:
- Correlation Analysis: Look for relationships between usage patterns and business outcomes
- Predictive Modeling: Use historical trends to forecast future usage and resource needs
- Segmentation Analysis: Break down data by user types, channels, or time periods for targeted insights
- ROI Measurement: Connect AI performance metrics to business value and return on investment
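For correlation analysis on exported data, the Pearson coefficient is the usual starting point. The sketch below implements it from scratch over invented per-user pairs (responses interacted with vs. like ratio); with real exports you would substitute your own series, and values near +1 suggest the two move together:

```python
from math import sqrt

def pearson(xs: list[float], ys: list[float]) -> float:
    """Pearson correlation coefficient for two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical per-user pairs: responses interacted with vs. like ratio.
interactions = [5, 12, 20, 35, 50]
like_ratios = [0.4, 0.55, 0.6, 0.7, 0.8]
r = pearson(interactions, like_ratios)
print(round(r, 2))
```

Note that correlation across user segments does not establish causation; treat a strong coefficient as a prompt for drill-down investigation, not a conclusion.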
Related Features
- Responses - Detailed view of individual AI responses and assessments
- Channels - Configure channels to improve performance
- Knowledge Sources - Update content based on quality insights
- Users - Understand user engagement and access patterns