
Measuring Success

Learn how to track, analyze, and improve your AI avatar's performance with key metrics.

What gets measured gets managed. Define your success metrics before launch, then optimize relentlessly.

Key Metrics Framework

The CORE Framework

| Metric | What It Measures | Why It Matters |
|--------|------------------|----------------|
| Conversion | Goal completion | Business impact |
| Operation | System health | Reliability |
| Retention | User loyalty | Long-term value |
| Engagement | User interaction | Immediate value |

Engagement Metrics

Primary Metrics

| Metric | Calculation | Target |
|--------|-------------|--------|
| Session Length | Avg time per conversation | 3-5 minutes |
| Messages/Session | Total messages / sessions | 5-10 messages |
| Bounce Rate | Single-message sessions / total | <20% |
| Return Rate | Users with 2+ sessions / total | >25% |
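As a sketch, all four metrics can be computed from raw session records. The field names here (`userId`, `messageCount`, `durationSec`) are illustrative assumptions, not the Avatarium export schema:

```javascript
// Compute the primary engagement metrics from session records.
// Field names (userId, messageCount, durationSec) are assumed.
function engagementMetrics(sessions) {
  const total = sessions.length;
  if (total === 0) return null;

  // Average session length (seconds) and messages per session
  const avgLengthSec =
    sessions.reduce((sum, s) => sum + s.durationSec, 0) / total;
  const avgMessages =
    sessions.reduce((sum, s) => sum + s.messageCount, 0) / total;

  // Bounce rate: single-message sessions / total sessions
  const bounceRate =
    sessions.filter((s) => s.messageCount <= 1).length / total;

  // Return rate: users with 2+ sessions / total users
  const perUser = new Map();
  for (const s of sessions) {
    perUser.set(s.userId, (perUser.get(s.userId) || 0) + 1);
  }
  const returning = [...perUser.values()].filter((n) => n >= 2).length;
  const returnRate = returning / perUser.size;

  return { avgLengthSec, avgMessages, bounceRate, returnRate };
}
```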

Tracking in Dashboard

Navigate to Analytics > Engagement to see:

  • Daily/weekly/monthly trends
  • Session length distribution
  • Message count histogram
  • User journey patterns

Engagement Benchmarks

| Rating | Session Length | Messages | Return Rate |
|--------|----------------|----------|-------------|
| Poor | <30 seconds | 1-2 | <10% |
| Average | 1-2 minutes | 3-5 | 15-20% |
| Good | 3-5 minutes | 6-10 | 25-35% |
| Excellent | 5+ minutes | 10+ | 35%+ |

Operational Metrics

System Health

| Metric | Target | Alert Threshold |
|--------|--------|-----------------|
| Uptime | 99.9% | <99.5% |
| Response Time | <3 seconds | >5 seconds |
| Error Rate | <1% | >3% |
| Completion Rate | >95% | <85% |

Performance Monitoring

// Track performance in your app
const metrics = {
  firstResponseTime: [],
  totalResponseTime: [],
  errorCount: 0,
  sessionCount: 0,
};
 
function trackPerformance(event) {
  if (event.type === 'session_start') {
    metrics.sessionCount++;
  }
  if (event.type === 'response_start') {
    metrics.firstResponseTime.push(event.latency);
  }
  if (event.type === 'response_complete') {
    metrics.totalResponseTime.push(event.latency);
  }
  if (event.type === 'error') {
    metrics.errorCount++;
  }
}
 
// Average of a numeric array; 0 when empty
function average(values) {
  if (values.length === 0) return 0;
  return values.reduce((sum, v) => sum + v, 0) / values.length;
}
 
// Calculate aggregates (error rate is per session)
function getMetrics() {
  return {
    avgFirstResponse: average(metrics.firstResponseTime),
    avgTotalResponse: average(metrics.totalResponseTime),
    errorRate: metrics.sessionCount > 0
      ? metrics.errorCount / metrics.sessionCount
      : 0,
  };
}

Cost Efficiency

| Metric | Calculation | Target |
|--------|-------------|--------|
| Cost per Session | Total cost / sessions | Depends on tier |
| Cost per Resolution | Total cost / resolved issues | Minimize |
| Cost Efficiency | Issues resolved / cost | Maximize |
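A quick sketch of these three ratios using hypothetical monthly figures (the dollar amounts below are assumptions for illustration, not Avatarium pricing):

```javascript
// Hypothetical monthly figures (assumptions for illustration)
const monthlyCost = 1200;     // total avatar spend, USD
const sessionCount = 5000;
const resolvedIssues = 3000;

const costPerSession = monthlyCost / sessionCount;       // ≈ $0.24
const costPerResolution = monthlyCost / resolvedIssues;  // ≈ $0.40
const costEfficiency = resolvedIssues / monthlyCost;     // 2.5 issues per dollar
```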

Quality Metrics

User Satisfaction

| Method | Implementation | Target |
|--------|----------------|--------|
| Thumbs Up/Down | After each response | >80% positive |
| Star Rating | End of session | >4.0/5.0 |
| CSAT Survey | Post-conversation | >4.2/5.0 |
| NPS | Periodic survey | >50 |

Implementing Feedback

<Avatar
  feedback={{
    enabled: true,
    type: 'thumbs', // 'thumbs' | 'stars' | 'csat'
    position: 'after-response',
  }}
  onFeedback={(data) => {
    analytics.track('avatar_feedback', {
      sessionId: data.sessionId,
      messageId: data.messageId,
      rating: data.rating,
      comment: data.comment,
    });
  }}
/>

Quality Benchmarks

| Metric | Poor | Average | Good | Excellent |
|--------|------|---------|------|-----------|
| Positive feedback | <60% | 70% | 80% | 90%+ |
| CSAT score | <3.5 | 3.8 | 4.2 | 4.5+ |
| Resolution rate | <50% | 65% | 80% | 90%+ |
| Escalation rate | >30% | 20% | 10% | <5% |

Business Impact Metrics

Conversion Metrics

| Use Case | Primary Metric | Secondary Metrics |
|----------|----------------|-------------------|
| Sales | Conversion rate | Lead quality, ACV |
| Support | Resolution rate | Ticket deflection |
| Education | Completion rate | Test scores |
| Engagement | Time on site | Page views |

ROI Calculation

For Customer Support

Deflection Value = Tickets Deflected × Cost per Ticket
ROI = ((Deflection Value - Avatarium Cost) / Avatarium Cost) × 100

Example:

  • 5,000 support sessions/month
  • 60% resolution rate = 3,000 tickets deflected
  • Cost per human ticket = $15
  • Deflection Value = 3,000 × $15 = $45,000

See avatarium.ai/pricing for current plan costs.

For Sales

Lead Value = Leads Generated × Conversion Rate × Average Deal Size
ROI = ((Lead Value - Cost) / Cost) × 100
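Both formulas can be wrapped in small helpers. The $2,000/month plan cost in the comment is a hypothetical figure, not an actual Avatarium price:

```javascript
// ROI for support: value of deflected tickets vs. avatar cost
function supportRoi(ticketsDeflected, costPerTicket, avatarCost) {
  const deflectionValue = ticketsDeflected * costPerTicket;
  return ((deflectionValue - avatarCost) / avatarCost) * 100;
}

// ROI for sales: expected lead value vs. avatar cost
function salesRoi(leads, conversionRate, avgDealSize, avatarCost) {
  const leadValue = leads * conversionRate * avgDealSize;
  return ((leadValue - avatarCost) / avatarCost) * 100;
}

// Support example above with a hypothetical $2,000/month plan:
// supportRoi(3000, 15, 2000) → 2150 (% ROI)
```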

Funnel Metrics

Track users through your conversion funnel:

| Stage | Metric | Target |
|-------|--------|--------|
| Awareness | Avatar impressions | Baseline |
| Engagement | Sessions started | >5% of impressions |
| Interest | 3+ message sessions | >50% of sessions |
| Intent | Action requested | >20% of engaged |
| Conversion | Goal completed | >10% of intent |
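As a sketch, stage-over-stage conversion can be derived from raw stage counts (the counts below are made-up illustrations):

```javascript
// Hypothetical stage counts pulled from analytics
const funnel = [
  { stage: 'Awareness', count: 100000 },
  { stage: 'Engagement', count: 6000 },
  { stage: 'Interest', count: 3300 },
  { stage: 'Intent', count: 700 },
  { stage: 'Conversion', count: 90 },
];

// Conversion rate of each stage relative to the previous stage
function stageRates(stages) {
  return stages.slice(1).map((s, i) => ({
    stage: s.stage,
    rate: s.count / stages[i].count,
  }));
}
```

With these counts, each stage clears its target from the table: 6% of impressions, 55% of sessions, ~21% of engaged, ~13% of intent.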

Analytics Dashboard

Setting Up Tracking

Enable Analytics

In your Avatarium dashboard, go to Settings > Analytics and enable tracking.

Configure Events

Choose which events to track:

  • Session start/end
  • Messages sent/received
  • Errors and fallbacks
  • User feedback
  • Custom events

Set Up Exports

Export data to your analytics platform:

  • Google Analytics
  • Mixpanel
  • Amplitude
  • Custom webhook

Custom Event Tracking

import { trackEvent } from '@avatarium/analytics';
 
// Track custom business events
function onProductRecommended(product: Product) {
  trackEvent('product_recommended', {
    productId: product.id,
    category: product.category,
    price: product.price,
  });
}
 
function onCheckoutStarted() {
  trackEvent('checkout_started_from_avatar', {
    source: 'avatar_recommendation',
  });
}

Reporting

Weekly Report Template

| Section | Metrics | Analysis |
|---------|---------|----------|
| Overview | Sessions, messages, users | Week-over-week trends |
| Engagement | Avg session length, return rate | Engagement quality |
| Quality | Satisfaction scores, resolution rate | User happiness |
| Operations | Error rate, response time | System health |
| Business | Conversions, cost per conversion | ROI tracking |

Monthly Business Review

Prepare these analyses monthly:

  • Trend Analysis: How are key metrics trending?
  • Cohort Analysis: How do different user groups perform?
  • Content Gaps: What questions aren't being answered?
  • Optimization Opportunities: Where can we improve?
  • Cost Analysis: Are we efficient?

Optimization Strategies

Based on Engagement Metrics

| Issue | Indicator | Fix |
|-------|-----------|-----|
| Low engagement | Short sessions, few messages | Improve greeting, add proactive prompts |
| High bounce | Single-message sessions | Better initial response, clearer value prop |
| Low return | Few repeat users | Add value, improve satisfaction |

Based on Quality Metrics

| Issue | Indicator | Fix |
|-------|-----------|-----|
| Low satisfaction | Poor feedback scores | Review negative conversations, improve responses |
| High escalation | Many human handoffs | Expand knowledge base, improve prompts |
| Low resolution | Issues not solved | Better instructions, more training data |

Based on Business Metrics

| Issue | Indicator | Fix |
|-------|-----------|-----|
| Low conversion | Few goal completions | Clearer CTAs, better guidance |
| High cost/conversion | Expensive results | Optimize model choice, reduce unnecessary sessions |
| Poor ROI | Cost > value | Review use case fit, optimize funnel |

A/B Testing

What to Test

| Element | Variations | Success Metric |
|---------|------------|----------------|
| Greeting | Formal vs casual | Engagement rate |
| System Prompt | Detailed vs concise | Resolution rate |
| AI Model | GPT-4o vs Groq | Quality + cost |
| TTS Voice | Different voices | Satisfaction |
| Avatar | Different models | Engagement |

Running Tests

// A/B test framework: deterministic 50/50 split per user per test
function hashString(input: string): number {
  let hash = 0;
  for (let i = 0; i < input.length; i++) {
    hash = (hash * 31 + input.charCodeAt(i)) | 0;
  }
  return Math.abs(hash);
}
 
function getVariant(userId: string, testId: string): string {
  const hash = hashString(`${userId}-${testId}`);
  return hash % 2 === 0 ? 'control' : 'variant';
}
 
// Usage
const variant = getVariant(userId, 'greeting-test');
const greeting = variant === 'control'
  ? "Hi! How can I help you today?"
  : "Welcome back! What would you like to know?";

Analyzing Results

| Metric | Control | Variant | Lift | Significance |
|--------|---------|---------|------|--------------|
| Session length | 2.5 min | 3.1 min | +24% | p < 0.01 |
| Messages/session | 4.2 | 5.8 | +38% | p < 0.05 |
| Satisfaction | 4.1 | 4.3 | +5% | p = 0.08 |
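The lift column is just the relative change of the variant over the control; a minimal helper (the significance column itself would need a proper statistical test, not shown here):

```javascript
// Percentage lift of the variant metric over the control
function lift(control, variant) {
  return ((variant - control) / control) * 100;
}

// e.g. session length: lift(2.5, 3.1) ≈ +24%, matching the table
```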

Alert Configuration

Recommended Alerts

| Alert | Threshold | Priority |
|-------|-----------|----------|
| Error rate spike | >5% | P1 |
| Response time slow | >10s avg | P2 |
| Satisfaction drop | <3.5 | P2 |
| Budget at 80% | 80% of limit | P3 |
| Zero sessions | 0 for 1 hour | P1 |

Setting Up Alerts

// In dashboard settings or via API
const alerts = [
  {
    metric: 'error_rate',
    condition: 'greater_than',
    threshold: 0.05,
    window: '15m',
    priority: 'p1',
    channels: ['email', 'slack'],
  },
  {
    metric: 'avg_response_time',
    condition: 'greater_than',
    threshold: 10000, // ms
    window: '5m',
    priority: 'p2',
    channels: ['email'],
  },
];

Summary

Metrics Priority

| Priority | Metrics | Review Frequency |
|----------|---------|------------------|
| Critical | Error rate, uptime | Real-time |
| High | Satisfaction, resolution | Daily |
| Medium | Engagement, conversion | Weekly |
| Low | Cost efficiency, trends | Monthly |

Success Checklist

  • Analytics enabled and tracking
  • Key metrics defined and baselined
  • Dashboards configured
  • Alerts set up
  • Weekly review process in place
  • Monthly business review scheduled

Next Steps