Reporting

Reporting, found under AI Performance Management, provides dashboards and charts summarizing AI activity and quality metrics across all channels.

From Responses → Reporting

Workflow: Review individual responses in Responses → Analyze patterns and trends in Reporting → Identify improvement opportunities

Common Filters:

  • Date: Last 7 days, Last 30 days, Custom range
  • Channel: Specific channels or all channels
  • Issue Type: Rubric flags (hallucination, knowledge gap, etc.)
  • Segment: User groups, roles, or demographics

Key Capabilities

Filters and Controls

Filter Row

The top section provides multiple filtering options:

  • Channels: Choose specific channels or leave blank to show all
  • Date Range: Select predefined ranges (Last 30 Days, Last 7 Days, etc.) or custom dates
  • Issue Types: Filter by specific assessment types or quality issues
  • User Segments: Filter by user groups or roles (if applicable)

Date Picker

The calendar icon allows you to set custom date ranges for detailed time-period analysis.

Response Count Chart

A stacked bar chart displaying responses per date and channel with interactive elements:

Chart Features:

  • Stacked Bars: Visual representation showing total responses over time
  • Channel Breakdown: Each bar segment represents responses from different channels
  • Color-Coded Legend: Channel identification with clickable legend items
  • Date Range Filter: Interactive date picker to focus on specific time periods
  • Hover Details: Mouse over bars to see exact counts and channel breakdowns
  • Channel Toggle: Click legend items to show/hide specific channels

Interactive Controls:

  • Date Range Selector: Choose predefined ranges or custom date periods
  • Channel Filtering: Enable/disable specific channels in the visualization
  • Export Options: Download chart data or images for reporting

Activities Table

A comprehensive summary table showing individual user engagement metrics:

| Column | Description |
| --- | --- |
| User ID | Identifier for the user who provided feedback |
| Response Likes | Number of positive feedback ratings given to AI responses |
| Response Dislikes | Number of negative feedback ratings given to AI responses |
| Response Text Count | Total AI responses the user interacted with |
| Retrieval Likes | Positive feedback ratings on information retrieval quality |
| Retrieval Dislikes | Negative feedback ratings on information retrieval quality |
| Retrieval Text Count | Total information retrievals the user interacted with |
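Once downloaded, a row of the Activities Table can be reduced to the two figures most of the later analysis relies on: a like ratio and a total interaction count. This is a minimal sketch; the lower-case field names mirror the columns above but are an assumption about the export format.

```python
# Sketch: deriving per-user engagement figures from one exported
# Activities Table row. Field names are assumed, not documented.

def summarize_user(row):
    """Return the like ratio and total interactions for one row."""
    likes = row["response_likes"] + row["retrieval_likes"]
    dislikes = row["response_dislikes"] + row["retrieval_dislikes"]
    total_feedback = likes + dislikes
    # No feedback at all -> ratio is undefined, not zero.
    like_ratio = likes / total_feedback if total_feedback else None
    interactions = row["response_text_count"] + row["retrieval_text_count"]
    return {"user_id": row["user_id"],
            "like_ratio": like_ratio,
            "interactions": interactions}

row = {"user_id": "u42", "response_likes": 8, "response_dislikes": 2,
       "retrieval_likes": 3, "retrieval_dislikes": 1,
       "response_text_count": 40, "retrieval_text_count": 25}
print(summarize_user(row))
```

Keeping the undefined case as `None` rather than `0` avoids conflating silent users with dissatisfied ones.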

User Feedback Charts

Interactive bar charts providing detailed feedback analysis with multiple viewing options:

Chart Options:

  • Group by User ID: View feedback patterns by individual users
  • Group by Feedback Type: Aggregate feedback by like/dislike categories
  • Time Series View: Track feedback trends over time
  • Channel Comparison: Compare feedback across different channels

Interactive Features:

  • Download Feedback Report Button: Export detailed feedback data in multiple formats
  • Filter Controls: Focus on specific users, feedback types, or time periods
  • Drill-Down Capability: Click chart segments for detailed breakdowns
  • Trend Analysis: Identify patterns in user satisfaction over time

Export and Analysis

Download Report

Export current view data in multiple formats:

  • CSV: For spreadsheet analysis
  • PDF: For presentation or archival
  • Excel: For advanced data manipulation

Data Export Options

  • Current filtered view
  • Full dataset (subject to permissions)
  • Summary statistics
  • Detailed activity logs
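A CSV export is the easiest format to work with offline. The sketch below loads a report with Python's standard `csv` module and totals responses per channel; the column headers and figures are illustrative assumptions, so adjust them to match the headers in your actual download.

```python
# Sketch: loading a Reporting CSV export and aggregating per channel.
# Headers and values below are assumed examples, not the real schema.
import csv
import io

sample = """date,channel,responses,likes,dislikes
2024-05-01,slack,120,30,5
2024-05-01,web,80,12,9
2024-05-02,slack,95,22,3
"""

totals = {}
for record in csv.DictReader(io.StringIO(sample)):
    channel = record["channel"]
    totals[channel] = totals.get(channel, 0) + int(record["responses"])

print(totals)  # per-channel response totals across the export
```

In practice you would pass a file handle (`open("report.csv")`) instead of the inline `StringIO` sample.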

How It Works

Data Aggregation

Reporting aggregates data from all channels to provide:

  • Usage Patterns: When and how the AI is being used
  • Quality Trends: Changes in response quality over time
  • User Engagement: How users interact with AI responses
  • Channel Performance: Comparative performance across channels

Real-Time Updates

  • Data refreshes automatically at regular intervals
  • Manual refresh is available by reloading the page in the browser
  • Filter changes update charts immediately
  • Export data reflects current filter state

Interactive Analysis

  1. Filter Data: Use filters to focus on specific time periods or channels
  2. Visual Exploration: Hover over charts to see detailed breakdowns
  3. Drill Down: Click chart elements to explore specific data points
  4. Export Insights: Download data for offline analysis

Key Metrics

Response Volume Metrics

  • Total Responses: Overall AI activity level
  • Response Rate: Responses per time period
  • Channel Distribution: Which channels are most active
  • Peak Usage Times: When users most frequently interact with the AI
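Peak usage times can also be computed directly from exported response timestamps rather than read off the chart. A minimal sketch, assuming the export provides ISO 8601 timestamps (an assumption about the format):

```python
# Sketch: finding the busiest hour of day from response timestamps.
# The ISO 8601 timestamp format is an assumption about the export.
from collections import Counter
from datetime import datetime

timestamps = ["2024-05-01T09:15:00", "2024-05-01T09:40:00",
              "2024-05-01T14:05:00", "2024-05-02T09:02:00"]

by_hour = Counter(datetime.fromisoformat(t).hour for t in timestamps)
peak_hour, count = by_hour.most_common(1)[0]
print(peak_hour, count)  # busiest hour of day and its response count
```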

Quality Metrics

  • Feedback Ratio: Positive vs negative feedback percentages
  • Assessment Flags: Frequency of quality issues
  • User Satisfaction: Overall satisfaction trends
  • Issue Categories: Types of problems most commonly flagged
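The feedback ratio above is simply positive and negative feedback expressed as percentages of all feedback received. A small sketch (the function name and rounding are illustrative choices):

```python
# Sketch of the Feedback Ratio metric: positive vs. negative feedback
# as percentages of all feedback received.
def feedback_ratio(likes, dislikes):
    total = likes + dislikes
    if total == 0:
        return None  # no feedback yet -> ratio undefined
    return (round(100 * likes / total, 1),
            round(100 * dislikes / total, 1))

print(feedback_ratio(45, 5))  # → (90.0, 10.0)
```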

User Engagement Metrics

  • Active Users: Number of unique users providing feedback
  • Engagement Rate: Percentage of responses that receive feedback
  • Power Users: Users with high interaction volumes
  • Feedback Patterns: How feedback patterns change over time
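Engagement rate, as defined above, is the share of AI responses that received any feedback. A minimal sketch:

```python
# Sketch of the Engagement Rate metric: the percentage of AI responses
# that received any user feedback.
def engagement_rate(total_responses, responses_with_feedback):
    if total_responses == 0:
        return 0.0
    return 100 * responses_with_feedback / total_responses

print(engagement_rate(400, 60))  # → 15.0
```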

Troubleshooting

Empty Graphs

If charts show no data:

  1. Date Range: Ensure there are responses within the selected time period
  2. Channel Access: Verify you have permission to view the selected channels
  3. Data Availability: Expand the date range if the current period has no activity
  4. Filter Settings: Check that filters aren't excluding all data

Export Issues

If reports won't download:

  1. Pop-up Blockers: Check browser pop-up blocker settings
  2. File Size: Try a smaller date range for large datasets
  3. Browser Permissions: Ensure download permissions are enabled
  4. Network Issues: Check internet connection stability

Performance Problems

If the page loads slowly:

  1. Date Range: Use shorter time periods for faster loading
  2. Channel Selection: Limit to specific channels rather than "All"
  3. Browser Cache: Clear browser cache and reload
  4. Data Volume: Be aware that larger datasets take longer to process

Analysis Tips

Trend Analysis

  • Compare Periods: Use date filters to compare performance across different time periods
  • Seasonal Patterns: Look for recurring patterns in usage and feedback
  • Channel Performance: Compare metrics across different channels to identify best practices

Quality Monitoring

  • Feedback Trends: Monitor changes in user satisfaction over time
  • Issue Identification: Use assessment data to identify systemic quality problems
  • Improvement Tracking: Measure the impact of knowledge source updates

User Behavior

  • Engagement Patterns: Identify which users are most engaged with AI responses
  • Feedback Quality: Analyze feedback patterns to understand user satisfaction
  • Usage Distribution: Understand how different user groups interact with the system

Best Practices

For Managers

  1. Regular Reviews: Schedule weekly or monthly reporting reviews
  2. Trend Monitoring: Watch for significant changes in key metrics
  3. Cross-Channel Analysis: Compare performance across different channels
  4. Action Items: Create action items based on reporting insights

For Analysts

  1. Interactive Chart Usage: Leverage date range filters and channel toggles for focused analysis
  2. Data Export: Use the Download Feedback Report button and other export options for deeper analysis
  3. Multi-View Analysis: Switch between User ID and Feedback Type groupings for different perspectives
  4. Baseline Establishment: Establish baseline metrics using historical date range comparisons
  5. Correlation Analysis: Look for correlations between user engagement and feedback patterns
  6. Drill-Down Investigation: Use chart click-through features to investigate specific data points
  7. Trend Identification: Use the User Feedback charts to identify satisfaction trends over time

For Operations Teams

  1. Daily Monitoring: Check key metrics daily for operational issues
  2. Alert Thresholds: Establish thresholds for when action is needed
  3. Response Protocols: Have procedures for addressing quality drops
  4. Continuous Improvement: Use data to drive ongoing system improvements

Best Practices for Interpreting Charts and Reports

Response Count Chart Analysis

Reading the Chart:

  1. Bar Height Interpretation: Higher bars indicate periods of increased AI activity and user engagement
  2. Channel Segmentation: Use the color-coded legend to identify which channels drive the most activity
  3. Time Pattern Recognition: Look for daily, weekly, or seasonal patterns in usage that inform resource planning
  4. Trend Identification: Compare bar heights over time to identify growth trends or usage decline

Actionable Insights:

  1. Peak Usage Planning: Schedule maintenance and updates during low-activity periods identified in the chart
  2. Channel Performance: Focus improvement efforts on high-activity channels shown in the chart
  3. Resource Allocation: Use activity patterns to allocate support resources during peak times
  4. Growth Measurement: Track month-over-month or quarter-over-quarter growth using chart trends
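Month-over-month growth from the chart reduces to a simple percentage change between consecutive monthly totals. A sketch with illustrative figures:

```python
# Sketch: month-over-month growth from monthly response totals read
# off the Response Count chart. The figures are illustrative.
monthly = {"2024-03": 1800, "2024-04": 2100, "2024-05": 2520}

months = sorted(monthly)
growth = {cur: 100 * (monthly[cur] - monthly[prev]) / monthly[prev]
          for prev, cur in zip(months, months[1:])}

for month, pct in growth.items():
    print(f"{month}: {pct:+.1f}% vs previous month")
```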

User Feedback Chart Interpretation

Understanding Feedback Patterns:

  1. Like/Dislike Ratios: High dislike ratios indicate areas needing immediate attention
  2. User-Specific Patterns: Individual user feedback patterns can reveal training needs or system issues
  3. Channel Comparisons: Compare feedback quality across channels to identify best-performing configurations
  4. Trend Analysis: Track feedback improvement over time to measure the impact of system changes
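Turning the like/dislike ratio into an actionable trigger is straightforward once a threshold is agreed on. A sketch, where the 25% threshold and the per-channel figures are illustrative assumptions:

```python
# Sketch: flagging channels whose dislike ratio exceeds a threshold,
# as a trigger for knowledge source review. Threshold and figures
# below are illustrative assumptions.
DISLIKE_THRESHOLD = 0.25  # flag when >25% of feedback is negative

feedback = {"slack": (120, 20),   # (likes, dislikes)
            "web": (60, 40),
            "email": (30, 6)}

flagged = [channel for channel, (likes, dislikes) in feedback.items()
           if dislikes / (likes + dislikes) > DISLIKE_THRESHOLD]
print(flagged)  # channels due for a knowledge source review
```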

Strategic Decision Making:

  1. Quality Interventions: Use negative feedback clusters to trigger knowledge source reviews
  2. Success Pattern Replication: Identify what makes high-satisfaction channels successful and replicate those patterns
  3. User Training Needs: Individual user feedback patterns can inform targeted training programs
  4. System Optimization: Use feedback trends to guide AI configuration and knowledge source improvements

Activities Table Insights

Performance Metrics:

  1. Engagement Levels: High interaction counts indicate active user engagement with the AI system
  2. Feedback Quality: Users with high like-to-dislike ratios are good candidates for knowledge contribution roles
  3. Usage Patterns: Compare response counts with feedback counts to identify user engagement levels
  4. Power User Identification: Users with high activity levels can become system champions and trainers

Operational Applications:

  1. Training Prioritization: Focus training efforts on users with low engagement or high dislike ratios
  2. Champion Development: Identify users with high engagement and positive feedback for leadership roles
  3. System Health Monitoring: Sudden changes in user activity patterns can indicate system issues
  4. Feature Adoption: Track how new features affect user engagement and feedback patterns

Export and Analysis Workflows

Data Download Strategy:

  1. Regular Exports: Schedule regular data exports for historical analysis and trend tracking
  2. Filtered Analysis: Use date range and channel filters before exporting to focus on specific areas
  3. Comparative Studies: Export data from different time periods to measure improvement initiatives
  4. Cross-Platform Analysis: Combine ept AI data with external analytics tools for comprehensive insights

Advanced Analysis Techniques:

  1. Correlation Analysis: Look for relationships between usage patterns and business outcomes
  2. Predictive Modeling: Use historical trends to forecast future usage and resource needs
  3. Segmentation Analysis: Break down data by user types, channels, or time periods for targeted insights
  4. ROI Measurement: Connect AI performance metrics to business value and return on investment
Related Pages

  • Responses: Detailed view of individual AI responses and assessments
  • Channels: Configure channels to improve performance
  • Knowledge Sources: Update content based on quality insights
  • Users: Understand user engagement and access patterns