
Production JavaScript Performance Monitoring: Real User Metrics and Core Web Vitals Implementation Guide

Learn JavaScript performance monitoring best practices with Real User Monitoring, Core Web Vitals tracking, and error correlation. Improve app speed and user experience today.

When building production JavaScript applications, I’ve learned that performance monitoring isn’t just about fixing problems after they occur. It’s about understanding how your application behaves in the real world and making informed decisions to enhance user experience.

Real User Monitoring Implementation

Real User Monitoring captures actual performance data from your users’ browsers. Unlike synthetic testing, RUM shows you how your application performs across different devices, network conditions, and geographic locations.

I implement RUM by collecting navigation timing data and user interaction metrics. This approach provides genuine insights into performance bottlenecks that might not appear in controlled testing environments.

class RealUserMonitor {
  constructor() {
    this.sessionId = this.generateSessionId();
    this.userId = this.getUserId();
    this.collectBasicMetrics();
  }

  collectBasicMetrics() {
    // Defer one tick so loadEventEnd is populated (it is still 0 inside the load handler)
    window.addEventListener('load', () => {
      setTimeout(() => {
        const navigation = performance.getEntriesByType('navigation')[0];
        const paint = performance.getEntriesByType('paint');

        const metrics = {
          sessionId: this.sessionId,
          userId: this.userId,
          url: window.location.href,
          userAgent: navigator.userAgent,
          connectionType: navigator.connection?.effectiveType,
          // Navigation timing values are relative to the start of the navigation
          domContentLoaded: navigation.domContentLoadedEventEnd,
          pageLoad: navigation.loadEventEnd,
          firstPaint: paint.find(entry => entry.name === 'first-paint')?.startTime,
          firstContentfulPaint: paint.find(entry => entry.name === 'first-contentful-paint')?.startTime,
          timestamp: Date.now()
        };

        this.sendMetrics(metrics);
      }, 0);
    });
  }

  // Approximates post-interaction main-thread busyness by timing when a
  // zero-delay timeout actually gets to run after the event
  trackUserInteraction(eventType, element) {
    const interactionStart = performance.now();
    
    setTimeout(() => {
      const duration = performance.now() - interactionStart;
      this.sendMetrics({
        type: 'interaction',
        eventType,
        element: element.tagName,
        duration,
        timestamp: Date.now()
      });
    }, 0);
  }

  sendMetrics(data) {
    if (navigator.sendBeacon) {
      navigator.sendBeacon('/api/rum', JSON.stringify(data));
    } else {
      fetch('/api/rum', {
        method: 'POST',
        body: JSON.stringify(data),
        headers: { 'Content-Type': 'application/json' },
        keepalive: true
      }).catch(() => {});
    }
  }

  generateSessionId() {
    return Math.random().toString(36).substring(2) + Date.now().toString(36);
  }

  getUserId() {
    return localStorage.getItem('userId') || 'anonymous';
  }
}

const rumMonitor = new RealUserMonitor();

// Track button clicks
document.addEventListener('click', (e) => {
  if (e.target.tagName === 'BUTTON') {
    rumMonitor.trackUserInteraction('click', e.target);
  }
});
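
Sending a separate request for every metric adds network overhead, and data captured just before the user leaves the page is easy to lose. A useful refinement is to batch metrics and flush them when the page is hidden. The sketch below is illustrative: the MetricsQueue class and flush interval are assumptions, and it reuses the same /api/rum endpoint as the monitor above.

class MetricsQueue {
  constructor(endpoint = '/api/rum', flushInterval = 10000) {
    this.endpoint = endpoint;
    this.queue = [];
    setInterval(() => this.flush(), flushInterval);

    // Flush when the tab is backgrounded; pagehide also covers navigations and tab closes
    document.addEventListener('visibilitychange', () => {
      if (document.visibilityState === 'hidden') this.flush();
    });
    window.addEventListener('pagehide', () => this.flush());
  }

  add(metric) {
    this.queue.push(metric);
  }

  flush() {
    if (this.queue.length === 0) return;
    const payload = JSON.stringify(this.queue);
    this.queue = [];

    // sendBeacon is designed to survive page unload; keepalive fetch is the fallback
    if (navigator.sendBeacon) {
      navigator.sendBeacon(this.endpoint, payload);
    } else {
      fetch(this.endpoint, { method: 'POST', body: payload, keepalive: true }).catch(() => {});
    }
  }
}

// Example: route metrics through the queue instead of sending them one by one
const metricsQueue = new MetricsQueue();
metricsQueue.add({ type: 'interaction', eventType: 'click', timestamp: Date.now() });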

Core Web Vitals Tracking

Google’s Core Web Vitals have become essential metrics for measuring user experience. I focus on three key measurements: Largest Contentful Paint, First Input Delay, and Cumulative Layout Shift. Note that Google has since replaced First Input Delay with Interaction to Next Paint as an official Core Web Vital, though the first-input measurement below remains useful.

These metrics directly correlate with user satisfaction and search engine rankings. I’ve found that monitoring them continuously helps identify performance regressions before they impact users significantly.

class CoreWebVitalsMonitor {
  constructor() {
    this.vitals = {};
    this.setupLCPObserver();
    this.setupFIDObserver();
    this.setupCLSObserver();
  }

  setupLCPObserver() {
    new PerformanceObserver((entryList) => {
      const entries = entryList.getEntries();
      const lastEntry = entries[entries.length - 1];
      
      this.vitals.lcp = {
        value: lastEntry.startTime,
        element: lastEntry.element?.tagName,
        url: lastEntry.url,
        timestamp: Date.now()
      };
      
      this.reportVital('LCP', this.vitals.lcp);
    // buffered: true replays entries recorded before the observer was created
    }).observe({ type: 'largest-contentful-paint', buffered: true });
  }

  setupFIDObserver() {
    new PerformanceObserver((entryList) => {
      entryList.getEntries().forEach((entry) => {
        this.vitals.fid = {
          value: entry.processingStart - entry.startTime,
          eventType: entry.name,
          timestamp: Date.now()
        };
        
        this.reportVital('FID', this.vitals.fid);
      });
    }).observe({ type: 'first-input', buffered: true });
  }

  setupCLSObserver() {
    let clsValue = 0;
    let sessionValue = 0;
    let sessionEntries = [];

    new PerformanceObserver((entryList) => {
      entryList.getEntries().forEach((entry) => {
        if (!entry.hadRecentInput) {
          const firstSessionEntry = sessionEntries[0];
          const lastSessionEntry = sessionEntries[sessionEntries.length - 1];
          
          if (sessionValue && entry.startTime - lastSessionEntry.startTime < 1000 && 
              entry.startTime - firstSessionEntry.startTime < 5000) {
            sessionValue += entry.value;
            sessionEntries.push(entry);
          } else {
            sessionValue = entry.value;
            sessionEntries = [entry];
          }
          
          if (sessionValue > clsValue) {
            clsValue = sessionValue;
            this.vitals.cls = {
              value: clsValue,
              entries: sessionEntries.map(e => ({
                startTime: e.startTime,
                value: e.value
              })),
              timestamp: Date.now()
            };
            
            this.reportVital('CLS', this.vitals.cls);
          }
        }
      });
    }).observe({ type: 'layout-shift', buffered: true });
  }

  reportVital(name, data) {
    const payload = {
      name,
      value: data.value,
      url: window.location.href,
      timestamp: data.timestamp,
      ...data
    };

    fetch('/api/vitals', {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify(payload)
    }).catch(console.error);
  }

  getVitals() {
    return this.vitals;
  }
}

const vitalsMonitor = new CoreWebVitalsMonitor();
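
Each vital has published thresholds that separate good, needs-improvement, and poor experiences: roughly 2.5 s and 4 s for LCP, 100 ms and 300 ms for FID, and 0.1 and 0.25 for CLS. A small helper can attach that rating before values are reported; the rateVital function below is an illustrative sketch, not part of the monitor above.

// Classify a vital against Google's published "good" / "poor" thresholds
const VITAL_THRESHOLDS = {
  LCP: { good: 2500, poor: 4000 }, // milliseconds
  FID: { good: 100, poor: 300 },   // milliseconds
  CLS: { good: 0.1, poor: 0.25 }   // unitless score
};

function rateVital(name, value) {
  const thresholds = VITAL_THRESHOLDS[name];
  if (!thresholds) return 'unknown';
  if (value <= thresholds.good) return 'good';
  if (value <= thresholds.poor) return 'needs-improvement';
  return 'poor';
}

// Example: rate the most recent LCP value collected by the monitor
const lcp = vitalsMonitor.getVitals().lcp;
if (lcp) {
  console.log(`LCP ${Math.round(lcp.value)}ms is ${rateVital('LCP', lcp.value)}`);
}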

Performance Observer API Usage

The Performance Observer API provides a powerful way to monitor various performance metrics without impacting application performance. I use it to track resource loading times, navigation events, and custom performance marks.

This API works asynchronously, which means it doesn’t block the main thread while collecting performance data. I’ve implemented observers for different entry types to get comprehensive performance insights.

class PerformanceTracker {
  constructor() {
    this.observers = new Map();
    this.setupResourceObserver();
    this.setupNavigationObserver();
    this.setupCustomMarksObserver();
  }

  setupResourceObserver() {
    const observer = new PerformanceObserver((list) => {
      list.getEntries().forEach((entry) => {
        if (entry.initiatorType === 'fetch' || entry.initiatorType === 'xmlhttprequest') {
          this.trackAPICall({
            url: entry.name,
            duration: entry.duration,
            transferSize: entry.transferSize,
            responseStart: entry.responseStart,
            responseEnd: entry.responseEnd
          });
        } else if (entry.initiatorType === 'script') {
          this.trackScriptLoad({
            url: entry.name,
            duration: entry.duration,
            size: entry.transferSize
          });
        }
      });
    });
    
    observer.observe({ entryTypes: ['resource'] });
    this.observers.set('resource', observer);
  }

  setupNavigationObserver() {
    const observer = new PerformanceObserver((list) => {
      list.getEntries().forEach((entry) => {
        this.trackPageLoad({
          url: entry.name,
          // Values are relative to the start of the navigation
          domContentLoaded: entry.domContentLoadedEventEnd,
          loadComplete: entry.loadEventEnd,
          domInteractive: entry.domInteractive,
          networkTime: entry.responseEnd - entry.requestStart
        });
      });
    });
    
    observer.observe({ entryTypes: ['navigation'] });
    this.observers.set('navigation', observer);
  }

  setupCustomMarksObserver() {
    const observer = new PerformanceObserver((list) => {
      list.getEntries().forEach((entry) => {
        if (entry.entryType === 'mark') {
          this.trackCustomMark(entry.name, entry.startTime);
        } else if (entry.entryType === 'measure') {
          this.trackCustomMeasure(entry.name, entry.duration);
        }
      });
    });
    
    observer.observe({ entryTypes: ['mark', 'measure'] });
    this.observers.set('marks', observer);
  }

  trackAPICall(data) {
    if (data.duration > 1000) { // Log slow API calls
      console.warn(`Slow API call detected: ${data.url} took ${data.duration}ms`);
    }
    
    this.sendMetric('api_call', data);
  }

  trackScriptLoad(data) {
    this.sendMetric('script_load', data);
  }

  trackPageLoad(data) {
    this.sendMetric('page_load', data);
  }

  trackCustomMark(name, startTime) {
    this.sendMetric('custom_mark', { name, startTime });
  }

  trackCustomMeasure(name, duration) {
    this.sendMetric('custom_measure', { name, duration });
  }

  sendMetric(type, data) {
    const payload = {
      type,
      data,
      timestamp: Date.now(),
      url: window.location.href
    };

    if (navigator.sendBeacon) {
      navigator.sendBeacon('/api/performance', JSON.stringify(payload));
    }
  }

  measureAsync(name, asyncFunction) {
    performance.mark(`${name}-start`);
    
    return asyncFunction().finally(() => {
      performance.mark(`${name}-end`);
      performance.measure(name, `${name}-start`, `${name}-end`);
    });
  }

  disconnect() {
    this.observers.forEach(observer => observer.disconnect());
    this.observers.clear();
  }
}

const performanceTracker = new PerformanceTracker();

// Example usage for async operations
performanceTracker.measureAsync('data-fetch', async () => {
  const response = await fetch('/api/data');
  return response.json();
}).then(data => {
  console.log('Data loaded with performance tracking');
});
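
For synchronous work, plain performance.mark and performance.measure calls feed the same custom-marks observer. The mark names and the renderDashboard function below are placeholders for whatever operation you want to time.

// Wrap a synchronous operation in marks; the observer above records the resulting measure
performance.mark('dashboard-render-start');
renderDashboard(); // hypothetical expensive synchronous render
performance.mark('dashboard-render-end');
performance.measure('dashboard-render', 'dashboard-render-start', 'dashboard-render-end');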

Long Task Detection

Long tasks are JavaScript operations that block the main thread for more than 50 milliseconds. These tasks create noticeable performance issues and make applications feel unresponsive to user interactions.

I monitor long tasks to identify code sections that need optimization. When I detect long tasks, I either optimize the code or move heavy operations to Web Workers to maintain responsive user interfaces.

class LongTaskMonitor {
  constructor() {
    this.longTasks = [];
    this.threshold = 50; // milliseconds
    this.setupLongTaskObserver();
    this.setupMainThreadMonitor();
  }

  setupLongTaskObserver() {
    if ('PerformanceObserver' in window && 'PerformanceLongTaskTiming' in window) {
      const observer = new PerformanceObserver((list) => {
        list.getEntries().forEach((entry) => {
          this.handleLongTask({
            name: entry.name,
            duration: entry.duration,
            startTime: entry.startTime,
            attribution: entry.attribution
          });
        });
      });
      
      observer.observe({ entryTypes: ['longtask'] });
    }
  }

  // Heuristic probe: a zero-delay timeout that fires late means the main thread
  // was busy; consecutive late checks are merged into a single reported block
  setupMainThreadMonitor() {
    let isBlocked = false;
    let blockStart = 0;

    const checkMainThread = () => {
      const start = performance.now();
      
      setTimeout(() => {
        const delay = performance.now() - start;
        
        if (delay > this.threshold && !isBlocked) {
          isBlocked = true;
          blockStart = start;
        } else if (delay <= this.threshold && isBlocked) {
          isBlocked = false;
          const blockDuration = performance.now() - blockStart;
          
          this.handleMainThreadBlock({
            duration: blockDuration,
            startTime: blockStart
          });
        }
        
        requestAnimationFrame(checkMainThread);
      }, 0);
    };
    
    checkMainThread();
  }

  handleLongTask(task) {
    this.longTasks.push(task);
    
    // Alert if task is critically long
    if (task.duration > 200) {
      console.warn(`Critical long task detected: ${task.duration}ms`);
      this.reportCriticalTask(task);
    }
    
    this.reportLongTask(task);
  }

  handleMainThreadBlock(block) {
    console.log(`Main thread blocked for ${block.duration}ms`);
    this.reportMainThreadBlock(block);
  }

  reportLongTask(task) {
    fetch('/api/longtask', {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({
        type: 'longtask',
        duration: task.duration,
        startTime: task.startTime,
        url: window.location.href,
        timestamp: Date.now()
      })
    }).catch(() => {});
  }

  reportCriticalTask(task) {
    // Send immediate alert for critical performance issues
    fetch('/api/alert', {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({
        severity: 'high',
        message: `Critical long task: ${task.duration}ms`,
        url: window.location.href,
        timestamp: Date.now()
      })
    }).catch(() => {});
  }

  reportMainThreadBlock(block) {
    fetch('/api/mainthread', {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({
        type: 'mainthread_block',
        duration: block.duration,
        startTime: block.startTime,
        url: window.location.href,
        timestamp: Date.now()
      })
    }).catch(() => {});
  }

  getAverageLongTaskDuration() {
    if (this.longTasks.length === 0) return 0;
    
    const total = this.longTasks.reduce((sum, task) => sum + task.duration, 0);
    return total / this.longTasks.length;
  }

  getLongTaskFrequency() {
    return this.longTasks.length;
  }

  // Helper method to break up long-running operations into time-budgeted batches
  breakUpLongTask(items, processor) {
    return new Promise((resolve) => {
      let index = 0;
      const results = [];
      
      const processBatch = () => {
        const start = performance.now();
        
        // Work for roughly one frame (~16 ms), then yield back to the browser
        while (index < items.length && performance.now() - start < 16) {
          results.push(processor(items[index]));
          index++;
        }
        
        if (index < items.length) {
          setTimeout(processBatch, 0);
        } else {
          resolve(results);
        }
      };
      
      processBatch();
    });
  }
}

const longTaskMonitor = new LongTaskMonitor();

// Example usage to prevent long tasks
async function processLargeDataset(data) {
  return longTaskMonitor.breakUpLongTask(data, (item) => {
    // Process individual item
    return performComplexCalculation(item);
  });
}
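
When chunking is not enough, the other option mentioned above is moving the heavy work off the main thread entirely. The sketch below shows one way to do that with a Web Worker; the worker file path and the performComplexCalculation function are assumptions carried over from the example.

// worker.js (hypothetical file): runs the per-item work off the main thread
// self.onmessage = (e) => {
//   const results = e.data.map(performComplexCalculation);
//   self.postMessage(results);
// };

// Main thread: delegate the loop to the worker so it never shows up as a long task
function processInWorker(items) {
  return new Promise((resolve, reject) => {
    const worker = new Worker('/worker.js'); // path is an assumption
    worker.onmessage = (e) => { resolve(e.data); worker.terminate(); };
    worker.onerror = (err) => { reject(err); worker.terminate(); };
    worker.postMessage(items);
  });
}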

Memory Usage Monitoring

Memory leaks in JavaScript applications can cause performance degradation over time. I monitor memory usage patterns to detect potential leaks and optimize garbage collection behavior in long-running applications.

Understanding memory consumption helps identify components that hold onto references longer than necessary. I track both heap usage and the number of DOM nodes to get a complete picture of memory health.

class MemoryMonitor {
  constructor() {
    this.memoryHistory = [];
    this.domNodeHistory = [];
    this.intervalId = null;
    this.startMonitoring();
  }

  startMonitoring(interval = 30000) { // Monitor every 30 seconds
    this.intervalId = setInterval(() => {
      this.collectMemoryMetrics();
      this.collectDOMMetrics();
      this.analyzeMemoryTrends();
    }, interval);
  }

  collectMemoryMetrics() {
    // performance.memory is a non-standard API available only in Chromium-based browsers
    if (performance.memory) {
      const memoryInfo = {
        usedJSHeapSize: performance.memory.usedJSHeapSize,
        totalJSHeapSize: performance.memory.totalJSHeapSize,
        jsHeapSizeLimit: performance.memory.jsHeapSizeLimit,
        timestamp: Date.now()
      };
      
      this.memoryHistory.push(memoryInfo);
      
      // Keep only last 100 measurements
      if (this.memoryHistory.length > 100) {
        this.memoryHistory.shift();
      }
      
      this.reportMemoryUsage(memoryInfo);
    }
  }

  collectDOMMetrics() {
    const domMetrics = {
      nodeCount: document.getElementsByTagName('*').length,
      listenerCount: this.getEventListenerCount(),
      timestamp: Date.now()
    };
    
    this.domNodeHistory.push(domMetrics);
    
    if (this.domNodeHistory.length > 100) {
      this.domNodeHistory.shift();
    }
  }

  getEventListenerCount() {
    // Approximate method to count event listeners
    let count = 0;
    const elements = document.getElementsByTagName('*');
    
    for (let element of elements) {
      // This is a simplified approach
      if (element.onclick || element.onload || element.onchange) {
        count++;
      }
    }
    
    return count;
  }

  analyzeMemoryTrends() {
    if (this.memoryHistory.length < 10) return;
    
    const recent = this.memoryHistory.slice(-10);
    const trend = this.calculateTrend(recent.map(m => m.usedJSHeapSize));
    
    if (trend > 0.1) { // Growing trend
      console.warn('Memory usage trending upward - possible memory leak');
      this.reportMemoryLeak(trend);
    }
    
    // Check for DOM node growth
    if (this.domNodeHistory.length >= 10) {
      const domTrend = this.calculateTrend(
        this.domNodeHistory.slice(-10).map(d => d.nodeCount)
      );
      
      if (domTrend > 0.1) {
        console.warn('DOM node count trending upward - possible DOM leak');
        this.reportDOMLeak(domTrend);
      }
    }
  }

  calculateTrend(values) {
    if (values.length < 2) return 0;
    
    const first = values[0];
    const last = values[values.length - 1];
    
    return (last - first) / first;
  }

  reportMemoryUsage(memoryInfo) {
    const payload = {
      type: 'memory_usage',
      usedHeap: memoryInfo.usedJSHeapSize,
      totalHeap: memoryInfo.totalJSHeapSize,
      heapLimit: memoryInfo.jsHeapSizeLimit,
      url: window.location.href,
      timestamp: memoryInfo.timestamp
    };
    
    if (navigator.sendBeacon) {
      navigator.sendBeacon('/api/memory', JSON.stringify(payload));
    }
  }

  reportMemoryLeak(trend) {
    fetch('/api/memory-leak', {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({
        trend,
        currentUsage: this.memoryHistory[this.memoryHistory.length - 1],
        url: window.location.href,
        timestamp: Date.now()
      })
    }).catch(() => {});
  }

  reportDOMLeak(trend) {
    fetch('/api/dom-leak', {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({
        trend,
        currentNodeCount: this.domNodeHistory[this.domNodeHistory.length - 1],
        url: window.location.href,
        timestamp: Date.now()
      })
    }).catch(() => {});
  }

  getCurrentMemoryUsage() {
    if (performance.memory) {
      return {
        used: performance.memory.usedJSHeapSize,
        total: performance.memory.totalJSHeapSize,
        limit: performance.memory.jsHeapSizeLimit,
        percentage: (performance.memory.usedJSHeapSize / performance.memory.jsHeapSizeLimit) * 100
      };
    }
    return null;
  }

  forceGarbageCollection() {
    // window.gc exists only when the browser is launched with the V8 flag --expose-gc
    if (window.gc) {
      window.gc();
    }
  }

  stopMonitoring() {
    if (this.intervalId) {
      clearInterval(this.intervalId);
      this.intervalId = null;
    }
  }

  // Helper method to detect potential memory leaks in components
  trackComponentMemory(componentName, callback) {
    const beforeMemory = this.getCurrentMemoryUsage();
    
    callback();
    
    setTimeout(() => {
      const afterMemory = this.getCurrentMemoryUsage();
      if (!beforeMemory || !afterMemory) return; // memory API unavailable in this browser
      
      const difference = afterMemory.used - beforeMemory.used;
      
      if (difference > 1000000) { // 1MB threshold
        console.warn(`Component ${componentName} may have memory leak: ${difference} bytes`);
      }
    }, 1000);
  }
}

const memoryMonitor = new MemoryMonitor();

// Example usage
memoryMonitor.trackComponentMemory('DataGrid', () => {
  // Initialize heavy component
  const dataGrid = new DataGrid(largeDataset);
  dataGrid.render();
});

Error Rate Correlation

Linking performance degradation with error rates helps identify when performance issues cause functional problems. I track both metrics simultaneously to understand their relationship and prioritize fixes based on user impact.

When performance degrades, error rates often increase due to timeouts, failed requests, or user frustration leading to rapid interactions. This correlation provides valuable insights for debugging production issues.

class ErrorPerformanceCorrelator {
  constructor() {
    this.errors = [];
    this.performanceMetrics = [];
    this.correlationThreshold = 0.7;
    this.setupErrorTracking();
    this.setupPerformanceTracking();
  }

  setupErrorTracking() {
    window.addEventListener('error', (event) => {
      this.recordError({
        type: 'javascript',
        message: event.message,
        filename: event.filename,
        lineno: event.lineno,
        colno: event.colno,
        stack: event.error?.stack,
        timestamp: Date.now()
      });
    });

    window.addEventListener('unhandledrejection', (event) => {
      this.recordError({
        type: 'promise',
        message: event.reason?.message || 'Unhandled Promise Rejection',
        stack: event.reason?.stack,
        timestamp: Date.now()
      });
    });

    // Track API errors
    this.interceptFetch();
  }

  interceptFetch() {
    const originalFetch = window.fetch;
    
    window.fetch = async (...args) => {
      const startTime = performance.now();
      
      try {
        const response = await originalFetch(...args);
        const duration = performance.now() - startTime;
        
        this.recordAPICall({
          url: args[0],
          method: args[1]?.method || 'GET',
          status: response.status,
          duration,
          success: response.ok,
          timestamp: Date.now()
        });
        
        if (!response.ok) {
          this.recordError({
            type: 'api',
            message: `HTTP ${response.status}: ${response.statusText}`,
            url: args[0],
            status: response.status,
            timestamp: Date.now()
          });
        }
        
        return response;
      } catch (error) {
        const duration = performance.now() - startTime;
        
        this.recordAPICall({
          url: args[0],
          method: args[1]?.method || 'GET',
          duration,
          success: false,
          timestamp: Date.now()
        });
        
        this.recordError({
          type: 'network',
          message: error.message,
          url: args[0],
          timestamp: Date.now()
        });
        
        throw error;
      }
    };
  }

  setupPerformanceTracking() {
    // Track Core Web Vitals
    new PerformanceObserver((list) => {
      list.getEntries().forEach((entry) => {
        this.recordPerformanceMetric({
          name: 'LCP',
          value: entry.startTime,
          timestamp: Date.now()
        });
      });
    }).observe({ entryTypes: ['largest-contentful-paint'] });

    // Track long tasks
    new PerformanceObserver((list) => {
      list.getEntries().forEach((entry) => {
        this.recordPerformanceMetric({
          name: 'longTask',
          value: entry.duration,
          timestamp: Date.now()
        });
      });
    }).observe({ entryTypes: ['longtask'] });
  }

  recordError(error) {
    this.errors.push(error);
    this.analyzeCorrelation();
    this.reportError(error);
  }

  recordPerformanceMetric(metric) {
    this.performanceMetrics.push(metric);
    this.analyzeCorrelation();
  }

  recordAPICall(apiCall) {
    this.recordPerformanceMetric({
      name: 'apiCall',
      value: apiCall.duration,
      success: apiCall.success,
      timestamp: apiCall.timestamp
    });
  }

  analyzeCorrelation() {
    // Only analyze if we have sufficient data
    if (this.errors.length < 5 || this.performanceMetrics.length < 10) return;
    
    const timeWindow = 300000; // 5 minutes
    const now = Date.now();
    
    // Get recent errors and performance metrics
    const recentErrors = this.errors.filter(e => now - e.timestamp < timeWindow);
    const recentMetrics = this.performanceMetrics.filter(m => now - m.timestamp < timeWindow);
    
    if (recentErrors.length === 0 || recentMetrics.length === 0) return;
    
    // Calculate error rate and average performance
    const errorRate = recentErrors.length / (timeWindow / 60000); // errors per minute
    const avgPerformance = recentMetrics.reduce((sum, m) => sum + m.value, 0) / recentMetrics.length;
    
    // Check for correlation
    const correlation = this.calculateCorrelation(recentErrors, recentMetrics);
    
    if (correlation > this.correlationThreshold) {
      this.reportCorrelation({
        errorRate,
        avgPerformance,
        correlation,
        timeWindow: timeWindow / 1000,
        timestamp: now
      });
    }
  }

  calculateCorrelation(errors, metrics) {
    // Simplified correlation calculation
    // Group by time buckets and calculate correlation coefficient
    const bucketSize = 60000; // 1 minute buckets
    const buckets = new Map();
    
    errors.forEach(error => {
      const bucket = Math.floor(error.timestamp / bucketSize);
      if (!buckets.has(bucket)) {
        buckets.set(bucket, { errors: 0, performance: [] });
      }
      buckets.get(bucket).errors++;
    });
    
    metrics.forEach(metric => {
      const bucket = Math.floor(metric.timestamp / bucketSize);
      if (buckets.has(bucket)) {
        buckets.get(bucket).performance.push(metric.value);
      }
    });
    
    // Calculate correlation coefficient (simplified)
    const pairs = Array.from(buckets.values())
      .filter(bucket => bucket.performance.length > 0)
      .map(bucket => ({
        errors: bucket.errors,
        avgPerformance: bucket.performance.reduce((a, b) => a + b, 0) / bucket.performance.length
      }));
    
    if (pairs.length < 2) return 0;
    
    // Simple correlation calculation
    const sumX = pairs.reduce((sum, p) => sum + p.errors, 0);
    const sumY = pairs.reduce((sum, p) => sum + p.avgPerformance, 0);
    const n = pairs.length;
    
    const correlation = pairs.reduce((sum, p) => {
      return sum + (p.errors - sumX/n) * (p.avgPerformance - sumY/n);
    }, 0) / Math.sqrt(
      pairs.reduce((sum, p) => sum + Math.pow(p.errors - sumX/n, 2), 0) *
      pairs.reduce((sum, p) => sum + Math.pow(p.avgPerformance - sumY/n, 2), 0)
    );
    
    // Guard against zero variance producing NaN
    return Number.isNaN(correlation) ? 0 : Math.abs(correlation);
  }

  reportError(error) {
    fetch('/api/errors', {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({
        ...error,
        url: window.location.href,
        userAgent: navigator.userAgent
      })
    }).catch(() => {});
  }

  reportCorrelation(correlation) {
    console.warn('Performance-Error correlation detected:', correlation);
    
    fetch('/api/correlation', {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({
        ...correlation,
        url: window.location.href
      })
    }).catch(() => {});
  }

  getErrorRate(timeWindow = 300000) {
    const now = Date.now();
    const recentErrors = this.errors.filter(e => now - e.timestamp < timeWindow);
    return recentErrors.length / (timeWindow / 60000); // errors per minute
  }

  getErrorsByType() {
    const errorTypes = {};
    
    this.errors.forEach((error) => {
      errorTypes[error.type] = (errorTypes[error.type] || 0) + 1;
    });
    
    return errorTypes;
  }
}

const errorCorrelator = new ErrorPerformanceCorrelator();
