The Complete Guide to Supabase Webhooks: Building a Reliable Payment & Auth Context with RBAC (Part 2)

  • Setting Up Webhook Database Triggers
  • Docker Deployment and Environment Setup
  • Advanced Error Handling and Retry Logic
  • Real-time Client Updates
  • Common Webhook Errors and Solutions
  • Performance Optimization and Monitoring

Setting Up Webhook Database Triggers

Database triggers are the foundation of Supabase webhooks. They define exactly when and how your webhooks fire, what data they include, and how they handle errors. Creating effective triggers requires understanding both SQL and your application's business logic. Well-designed triggers can significantly improve your application's performance by reducing the need for polling and ensuring data consistency.

Our trigger system will handle various scenarios, including cascading updates, conditional firing based on specific field changes, and complex business rules that span multiple tables. We'll also implement triggers that handle batch operations efficiently, which is essential for applications that process large amounts of data.

-- Database setup for webhook triggers
-- Create the webhook notification function
CREATE OR REPLACE FUNCTION notify_webhook()
RETURNS trigger
LANGUAGE plpgsql
SECURITY DEFINER
AS $$
DECLARE
  payload json;
  webhook_url text;
BEGIN
  -- Get the webhook URL from configuration
  SELECT value INTO webhook_url
  FROM app_config
  WHERE key = 'webhook_url';

  -- Build the payload
  payload := json_build_object(
    'type', TG_OP,
    'table', TG_TABLE_NAME,
    'schema', TG_TABLE_SCHEMA,
    'record', CASE WHEN TG_OP = 'DELETE' THEN NULL ELSE row_to_json(NEW) END,
    'old_record', CASE WHEN TG_OP IN ('UPDATE', 'DELETE') THEN row_to_json(OLD) ELSE NULL END,
    'created_at', now(),
    'user_id', COALESCE(
      CASE WHEN TG_OP = 'DELETE' THEN OLD.user_id ELSE NEW.user_id END,
      auth.uid()
    )
  );

  -- Send the webhook notification
  PERFORM pg_notify('webhook_event', payload::text);

  RETURN COALESCE(NEW, OLD);
END;
$$;

-- Create triggers for payment events
CREATE TRIGGER payments_webhook_trigger
  AFTER INSERT OR UPDATE OR DELETE ON payments
  FOR EACH ROW
  EXECUTE FUNCTION notify_webhook();

-- Create triggers for subscription events
CREATE TRIGGER subscriptions_webhook_trigger
  AFTER INSERT OR UPDATE OR DELETE ON subscriptions
  FOR EACH ROW
  EXECUTE FUNCTION notify_webhook();

-- Create triggers for user role changes
CREATE TRIGGER user_roles_webhook_trigger
  AFTER INSERT OR UPDATE OR DELETE ON user_roles
  FOR EACH ROW
  EXECUTE FUNCTION notify_webhook();

-- Create a more sophisticated trigger for user profile changes
CREATE OR REPLACE FUNCTION notify_profile_webhook()
RETURNS trigger
LANGUAGE plpgsql
SECURITY DEFINER
AS $$
DECLARE
  payload json;
  changed_fields text[];
BEGIN
  -- Only fire the webhook if specific fields changed
  IF TG_OP = 'UPDATE' THEN
    changed_fields := ARRAY[]::text[];

    -- IS DISTINCT FROM treats NULLs as ordinary values, unlike !=
    IF OLD.email IS DISTINCT FROM NEW.email THEN
      changed_fields := array_append(changed_fields, 'email');
    END IF;

    IF OLD.subscription_status IS DISTINCT FROM NEW.subscription_status THEN
      changed_fields := array_append(changed_fields, 'subscription_status');
    END IF;

    IF OLD.role_level IS DISTINCT FROM NEW.role_level THEN
      changed_fields := array_append(changed_fields, 'role_level');
    END IF;

    -- Only proceed if important fields changed
    IF array_length(changed_fields, 1) IS NULL THEN
      RETURN NEW;
    END IF;
  END IF;

  -- Build enhanced payload with change information
  payload := json_build_object(
    'type', TG_OP,
    'table', TG_TABLE_NAME,
    'schema', TG_TABLE_SCHEMA,
    'record', CASE WHEN TG_OP = 'DELETE' THEN NULL ELSE row_to_json(NEW) END,
    'old_record', CASE WHEN TG_OP IN ('UPDATE', 'DELETE') THEN row_to_json(OLD) ELSE NULL END,
    'changed_fields', changed_fields,
    'created_at', now(),
    'user_id', COALESCE(
      CASE WHEN TG_OP = 'DELETE' THEN OLD.id ELSE NEW.id END,
      auth.uid()
    )
  );

  PERFORM pg_notify('webhook_event', payload::text);

  RETURN COALESCE(NEW, OLD);
END;
$$;

-- Apply the enhanced trigger to user profiles
CREATE TRIGGER user_profiles_webhook_trigger
  AFTER INSERT OR UPDATE OR DELETE ON user_profiles
  FOR EACH ROW
  EXECUTE FUNCTION notify_profile_webhook();
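
The notify_webhook() function reads its target URL from an app_config table, and the retry logic later in this guide persists failures to a webhook_errors table. Neither table exists in a default Supabase project, so here is a minimal sketch of how they might be defined. The webhook_errors columns mirror the WebhookError interface used by the error handler; everything else (names, types, the seed URL) is an assumption you should adapt to your schema.

-- Minimal supporting tables assumed by the trigger function and the retry handler
CREATE TABLE IF NOT EXISTS app_config (
  key text PRIMARY KEY,
  value text NOT NULL
);

-- Seed the webhook endpoint the trigger function looks up
INSERT INTO app_config (key, value)
VALUES ('webhook_url', 'https://your-domain.example/api/webhooks/supabase')
ON CONFLICT (key) DO UPDATE SET value = EXCLUDED.value;

CREATE TABLE IF NOT EXISTS webhook_errors (
  id uuid PRIMARY KEY DEFAULT gen_random_uuid(),
  webhook_payload jsonb NOT NULL,
  error_message text NOT NULL,
  error_stack text,
  attempt_count int NOT NULL DEFAULT 1,
  max_attempts int NOT NULL DEFAULT 3,
  next_retry_at timestamptz,
  status text NOT NULL DEFAULT 'pending'
    CHECK (status IN ('pending', 'retrying', 'failed', 'resolved')),
  created_at timestamptz NOT NULL DEFAULT now(),
  updated_at timestamptz NOT NULL DEFAULT now()
);

-- The retry job queries by status and due time
CREATE INDEX IF NOT EXISTS idx_webhook_errors_pending
  ON webhook_errors (status, next_retry_at);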

Docker Deployment and Environment Setup

Deploying webhook endpoints requires careful consideration of scalability, reliability, and security. Docker containers provide an excellent foundation for webhook services because they ensure consistent environments across development, staging, and production. Our Docker setup will include proper environment variable management, health checks, logging configuration, and security best practices.

The containerized approach allows us to easily scale our webhook processing capacity by spinning up multiple instances behind a load balancer. We'll also implement proper graceful shutdown handling to ensure that webhook processing completes before containers are terminated during deployments. This prevents data loss and ensures that all webhook events are processed reliably.

# Dockerfile
FROM node:18-alpine AS base

# Install dependencies only when needed
FROM base AS deps
RUN apk add --no-cache libc6-compat
WORKDIR /app

# Copy package files and install all dependencies
# (devDependencies are needed for `npm run build` in the builder stage)
COPY package.json package-lock.json* ./
RUN npm ci

# Rebuild the source code only when needed
FROM base AS builder
WORKDIR /app
COPY --from=deps /app/node_modules ./node_modules
COPY . .

# Build the application
ENV NEXT_TELEMETRY_DISABLED=1
RUN npm run build

# Production image
FROM base AS runner
WORKDIR /app

ENV NODE_ENV=production
ENV NEXT_TELEMETRY_DISABLED=1

# Create nextjs user
RUN addgroup --system --gid 1001 nodejs
RUN adduser --system --uid 1001 nextjs

# Copy built application (Next.js standalone output)
COPY --from=builder /app/public ./public
COPY --from=builder --chown=nextjs:nodejs /app/.next/standalone ./
COPY --from=builder --chown=nextjs:nodejs /app/.next/static ./.next/static

# Health check endpoint
COPY --from=builder /app/healthcheck.js ./

USER nextjs

EXPOSE 3000

ENV PORT=3000
ENV HOSTNAME="0.0.0.0"

# Health check
HEALTHCHECK --interval=30s --timeout=3s --start-period=5s --retries=3 \
  CMD node healthcheck.js

CMD ["node", "server.js"]
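
Both the Dockerfile and the compose file below invoke a healthcheck.js script that isn't shown elsewhere in this guide. A minimal sketch might look like the following, assuming the app exposes a /api/health route; adjust the path and port to whatever your deployment actually serves.

// healthcheck.js
// Minimal container health check: exit 0 if the app answers, 1 otherwise.
const http = require('http');

const request = http.get(
  {
    host: '127.0.0.1',
    port: process.env.PORT || 3000,
    path: '/api/health', // assumed health route; replace with your own
    timeout: 2000,
  },
  (res) => {
    process.exit(res.statusCode === 200 ? 0 : 1);
  }
);

request.on('error', () => process.exit(1));
request.on('timeout', () => {
  request.destroy();
  process.exit(1);
});
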
# docker-compose.yml
version: '3.8'

services:
  webhook-app:
    build:
      context: .
      dockerfile: Dockerfile
    ports:
      - "3000:3000"
    environment:
      - NODE_ENV=production
      - NEXT_PUBLIC_SUPABASE_URL=${NEXT_PUBLIC_SUPABASE_URL}
      - SUPABASE_SERVICE_KEY=${SUPABASE_SERVICE_KEY}
      - SUPABASE_WEBHOOK_SECRET=${SUPABASE_WEBHOOK_SECRET}
      - WEBHOOK_RETRY_ATTEMPTS=3
      - WEBHOOK_TIMEOUT=30000
    restart: unless-stopped
    healthcheck:
      test: ["CMD", "node", "healthcheck.js"]
      interval: 30s
      timeout: 10s
      retries: 3
      start_period: 40s
    logging:
      driver: "json-file"
      options:
        max-size: "10m"
        max-file: "3"
    networks:
      - webhook-network

  redis:
    image: redis:7-alpine
    restart: unless-stopped
    command: redis-server --appendonly yes
    volumes:
      - redis-data:/data
    networks:
      - webhook-network
    healthcheck:
      test: ["CMD", "redis-cli", "ping"]
      interval: 30s
      timeout: 3s
      retries: 3

  nginx:
    image: nginx:alpine
    ports:
      - "80:80"
      - "443:443"
    volumes:
      - ./nginx.conf:/etc/nginx/nginx.conf
      - ./ssl:/etc/nginx/ssl
    depends_on:
      - webhook-app
    restart: unless-stopped
    networks:
      - webhook-network

volumes:
  redis-data:

networks:
  webhook-network:
    driver: bridge
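
The deployment notes above mention graceful shutdown so that in-flight webhook work finishes before a container stops. Next.js's standalone server already closes HTTP connections cleanly on SIGTERM, but any background work you run yourself (such as the retry queue later in this article) needs its own hook. Here is a minimal sketch; the lib/graceful-shutdown.ts path and the inFlightWebhooks counter are assumptions, with the counter meant to be driven by your own handlers via trackWebhook.

// lib/graceful-shutdown.ts
// Sketch: wait for in-flight webhook processing before exiting on SIGTERM/SIGINT.
// `inFlightWebhooks` is a hypothetical counter maintained through trackWebhook().
let inFlightWebhooks = 0;

export function trackWebhook<T>(work: () => Promise<T>): Promise<T> {
  inFlightWebhooks++;
  return work().finally(() => {
    inFlightWebhooks--;
  });
}

const SHUTDOWN_GRACE_MS = 25_000; // stay under Docker's default 30s stop timeout

async function shutdown(signal: string) {
  console.log(`Received ${signal}, draining ${inFlightWebhooks} in-flight webhook(s)`);
  const deadline = Date.now() + SHUTDOWN_GRACE_MS;

  // Poll until all tracked webhook work has finished or the grace period expires
  while (inFlightWebhooks > 0 && Date.now() < deadline) {
    await new Promise((resolve) => setTimeout(resolve, 250));
  }

  process.exit(inFlightWebhooks > 0 ? 1 : 0);
}

process.on('SIGTERM', () => void shutdown('SIGTERM'));
process.on('SIGINT', () => void shutdown('SIGINT'));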

Advanced Error Handling and Retry Logic

Reliable webhook systems must handle failures gracefully because network issues, temporary service outages, and processing errors are inevitable in distributed systems. Our error handling strategy implements exponential backoff, dead letter queues, circuit breakers, and comprehensive logging to ensure maximum reliability while preventing system overload during failure scenarios.

The retry mechanism intelligently categorizes errors into retriable and non-retriable types. Temporary network errors or service unavailability trigger retries with increasing delays, while authentication failures or malformed payloads are immediately sent to dead letter queues for manual investigation. This approach prevents wasted resources while ensuring legitimate webhook events are eventually processed.

// lib/error-handling.ts
import { createClient, SupabaseClient } from '@supabase/supabase-js';

export interface WebhookError {
  id: string;
  webhook_payload: any;
  error_message: string;
  error_stack?: string;
  attempt_count: number;
  max_attempts: number;
  next_retry_at?: string;
  status: 'pending' | 'retrying' | 'failed' | 'resolved';
  created_at: string;
  updated_at: string;
}

export class WebhookErrorHandler {
  private supabase: SupabaseClient;
  private maxRetries: number;
  private baseDelay: number;

  constructor(supabaseUrl: string, serviceKey: string) {
    // Service-role client for server-side use; no session persistence needed
    this.supabase = createClient(supabaseUrl, serviceKey, {
      auth: { persistSession: false },
    });
    this.maxRetries = parseInt(process.env.WEBHOOK_RETRY_ATTEMPTS || '3', 10);
    this.baseDelay = parseInt(process.env.WEBHOOK_BASE_DELAY || '1000', 10);
  }

  async handleError(payload: any, error: Error, attemptCount = 1): Promise<void> {
    const errorRecord: Partial<WebhookError> = {
      webhook_payload: payload,
      error_message: error.message,
      error_stack: error.stack,
      attempt_count: attemptCount,
      max_attempts: this.maxRetries,
      status: attemptCount >= this.maxRetries ? 'failed' : 'pending',
      created_at: new Date().toISOString(),
      updated_at: new Date().toISOString(),
    };

    // Calculate next retry time with exponential backoff
    if (attemptCount < this.maxRetries) {
      const delay = this.calculateBackoffDelay(attemptCount);
      errorRecord.next_retry_at = new Date(Date.now() + delay).toISOString();
    }

    // Store the error for tracking and potential retry
    const { data: storedError } = await this.supabase
      .from('webhook_errors')
      .insert(errorRecord)
      .select()
      .single();

    // Log structured error information
    console.error('Webhook processing failed:', {
      errorId: storedError?.id,
      payload: this.sanitizePayload(payload),
      error: error.message,
      attemptCount,
      nextRetry: errorRecord.next_retry_at,
    });

    // Send alerts for critical failures
    if (this.isCriticalError(error) || attemptCount >= this.maxRetries) {
      await this.sendErrorAlert(payload, error, attemptCount);
    }
  }

  private async sendErrorAlert(payload: any, error: Error, attemptCount: number): Promise<void> {
    // Placeholder: wire this up to your alerting channel (Slack, PagerDuty, email, ...)
    console.error('ALERT: critical webhook failure', {
      error: error.message,
      attemptCount,
      payload: this.sanitizePayload(payload),
    });
  }

  private calculateBackoffDelay(attemptCount: number): number {
    // Exponential backoff with jitter
    const exponentialDelay = this.baseDelay * Math.pow(2, attemptCount - 1);
    const jitter = Math.random() * 0.1 * exponentialDelay;
    return Math.min(exponentialDelay + jitter, 300000); // Max 5 minutes
  }

  private isCriticalError(error: Error): boolean {
    const criticalPatterns = [
      'authentication failed',
      'invalid signature',
      'database connection',
      'payment processing',
      'user data corruption',
    ];

    return criticalPatterns.some(pattern =>
      error.message.toLowerCase().includes(pattern)
    );
  }

  private sanitizePayload(payload: any): any {
    // Remove sensitive information from logs
    const sanitized = { ...payload };
    const sensitiveFields = ['password', 'token', 'secret', 'key', 'credit_card'];

    function recursiveSanitize(obj: any): any {
      if (typeof obj !== 'object' || obj === null) return obj;

      const result: any = Array.isArray(obj) ? [] : {};

      for (const [key, value] of Object.entries(obj)) {
        if (sensitiveFields.some(field => key.toLowerCase().includes(field))) {
          result[key] = '[REDACTED]';
        } else if (typeof value === 'object' && value !== null) {
          result[key] = recursiveSanitize(value);
        } else {
          result[key] = value;
        }
      }

      return result;
    }

    return recursiveSanitize(sanitized);
  }

  async retryFailedWebhooks(): Promise<void> {
    const { data: failedWebhooks } = await this.supabase
      .from('webhook_errors')
      .select('*')
      .eq('status', 'pending')
      .lte('next_retry_at', new Date().toISOString())
      .limit(10);

    if (!failedWebhooks?.length) return;

    for (const failedWebhook of failedWebhooks) {
      try {
        // Mark as retrying
        await this.supabase
          .from('webhook_errors')
          .update({
            status: 'retrying',
            updated_at: new Date().toISOString(),
          })
          .eq('id', failedWebhook.id);

        // Attempt to reprocess the webhook through the dispatcher
        const { handleWebhookEvent } = await import('./webhook-dispatcher');
        await handleWebhookEvent(failedWebhook.webhook_payload);

        // Mark as resolved
        await this.supabase
          .from('webhook_errors')
          .update({
            status: 'resolved',
            updated_at: new Date().toISOString(),
          })
          .eq('id', failedWebhook.id);

        console.log(`Webhook retry successful: ${failedWebhook.id}`);
      } catch (error) {
        // Handle retry failure
        const newAttemptCount = failedWebhook.attempt_count + 1;
        await this.handleError(
          failedWebhook.webhook_payload,
          error as Error,
          newAttemptCount
        );
      }
    }
  }
}

// Retry job that can be run periodically
export async function processRetryQueue() {
  const errorHandler = new WebhookErrorHandler(
    process.env.NEXT_PUBLIC_SUPABASE_URL!,
    process.env.SUPABASE_SERVICE_KEY!
  );

  await errorHandler.retryFailedWebhooks();
}
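
Something still has to invoke processRetryQueue on a schedule. One lightweight option is an API route that a cron service (Vercel Cron, a GitHub Actions workflow, or a plain crontab hitting the endpoint) calls every minute or so. The route path and the CRON_SECRET check below are assumptions, not something defined earlier in this guide.

// app/api/cron/retry-webhooks/route.ts
// Sketch of a cron-invoked endpoint that drains the webhook retry queue.
import { NextRequest, NextResponse } from 'next/server';
import { processRetryQueue } from '@/lib/error-handling';

export async function GET(request: NextRequest) {
  // Simple shared-secret check so only the cron service can trigger retries
  const authHeader = request.headers.get('authorization');
  if (authHeader !== `Bearer ${process.env.CRON_SECRET}`) {
    return NextResponse.json({ error: 'Unauthorized' }, { status: 401 });
  }

  try {
    await processRetryQueue();
    return NextResponse.json({ success: true });
  } catch (error) {
    console.error('Retry queue processing failed:', error);
    return NextResponse.json({ error: 'Retry processing failed' }, { status: 500 });
  }
}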

Real-time Client Updates

The true power of webhooks shines when combined with real-time client updates. While webhooks handle server-side processing, your frontend needs to reflect these changes immediately to provide users with responsive, live experiences. Our real-time system uses Supabase's built-in real-time subscriptions alongside webhook-triggered updates to create a seamless bidirectional communication system.

This approach ensures that when a webhook processes a payment, updates user permissions, or modifies subscription status, all connected clients receive instant updates. The system handles connection management, automatic reconnection, and efficient data synchronization to minimize bandwidth usage while maximizing responsiveness.

// hooks/useRealtimeWebhooks.ts
import { useEffect, useCallback, useMemo, useRef } from 'react';
import { createClient } from '@supabase/supabase-js';
import { useAuth } from '@/contexts/AuthContext';

interface RealtimeWebhookConfig {
  table: string;
  filter?: string;
  onInsert?: (payload: any) => void;
  onUpdate?: (payload: any) => void;
  onDelete?: (payload: any) => void;
}

export function useRealtimeWebhooks(configs: RealtimeWebhookConfig[]) {
  const { user } = useAuth();
  const subscriptionsRef = useRef<any[]>([]);

  // Memoize the client so a new instance isn't created on every render
  const supabase = useMemo(
    () =>
      createClient(
        process.env.NEXT_PUBLIC_SUPABASE_URL!,
        process.env.NEXT_PUBLIC_SUPABASE_ANON_KEY!
      ),
    []
  );

  const setupSubscriptions = useCallback(() => {
    if (!user) return;

    // Clean up existing subscriptions
    subscriptionsRef.current.forEach(sub => {
      supabase.removeChannel(sub);
    });
    subscriptionsRef.current = [];

    // Set up new subscriptions
    configs.forEach(config => {
      const channel = supabase
        .channel(`realtime:${config.table}:${user.id}`)
        .on(
          'postgres_changes',
          {
            event: '*',
            schema: 'public',
            table: config.table,
            filter: config.filter || `user_id=eq.${user.id}`,
          },
          (payload) => {
            console.log(`Realtime update for ${config.table}:`, payload);

            switch (payload.eventType) {
              case 'INSERT':
                config.onInsert?.(payload);
                break;
              case 'UPDATE':
                config.onUpdate?.(payload);
                break;
              case 'DELETE':
                config.onDelete?.(payload);
                break;
            }
          }
        )
        .subscribe((status) => {
          console.log(`Subscription status for ${config.table}:`, status);
        });

      subscriptionsRef.current.push(channel);
    });
  }, [user, configs, supabase]);
  // Note: callers should memoize `configs`, otherwise the effect below
  // tears down and recreates the channels on every render.

  useEffect(() => {
    setupSubscriptions();

    return () => {
      subscriptionsRef.current.forEach(sub => {
        supabase.removeChannel(sub);
      });
    };
  }, [setupSubscriptions, supabase]);
}

// contexts/WebhookContext.tsx
'use client';

import { createContext, useContext, useMemo, useReducer, ReactNode } from 'react';
import { useRealtimeWebhooks } from '@/hooks/useRealtimeWebhooks';
import { useAuth } from './AuthContext';

interface WebhookState {
  payments: any[];
  subscriptions: any[];
  userRoles: any[];
  notifications: any[];
  isLoading: boolean;
}

interface WebhookContextType extends WebhookState {
  updatePayment: (payment: any) => void;
  updateSubscription: (subscription: any) => void;
  updateUserRole: (role: any) => void;
  addNotification: (notification: any) => void;
  clearNotifications: () => void;
}

const WebhookContext = createContext<WebhookContextType | undefined>(undefined);

const webhookReducer = (state: WebhookState, action: any): WebhookState => {
  switch (action.type) {
    case 'SET_LOADING':
      return { ...state, isLoading: action.payload };

    case 'UPDATE_PAYMENT':
      return {
        ...state,
        payments: state.payments.map(p =>
          p.id === action.payload.id ? { ...p, ...action.payload } : p
        ),
      };

    case 'ADD_PAYMENT':
      return {
        ...state,
        payments: [...state.payments, action.payload],
      };

    case 'UPDATE_SUBSCRIPTION':
      return {
        ...state,
        subscriptions: state.subscriptions.map(s =>
          s.id === action.payload.id ? { ...s, ...action.payload } : s
        ),
      };

    case 'UPDATE_USER_ROLE':
      return {
        ...state,
        userRoles: state.userRoles.map(r =>
          r.id === action.payload.id ? { ...r, ...action.payload } : r
        ),
      };

    case 'ADD_NOTIFICATION':
      return {
        ...state,
        notifications: [action.payload, ...state.notifications.slice(0, 9)], // Keep last 10
      };

    case 'CLEAR_NOTIFICATIONS':
      return {
        ...state,
        notifications: [],
      };

    default:
      return state;
  }
};

const initialState: WebhookState = {
  payments: [],
  subscriptions: [],
  userRoles: [],
  notifications: [],
  isLoading: false,
};

export function WebhookProvider({ children }: { children: ReactNode }) {
  const [state, dispatch] = useReducer(webhookReducer, initialState);
  const { user } = useAuth();

  // Memoize the subscription configs so the realtime hook doesn't re-subscribe
  // on every render (dispatch is stable across renders)
  const realtimeConfigs = useMemo(
    () => [
      {
        table: 'payments',
        onInsert: (payload: any) => {
          dispatch({ type: 'ADD_PAYMENT', payload: payload.new });
          dispatch({
            type: 'ADD_NOTIFICATION',
            payload: {
              id: Date.now(),
              type: 'payment',
              message: `Payment ${payload.new.status}`,
              timestamp: new Date().toISOString(),
            }
          });
        },
        onUpdate: (payload: any) => {
          dispatch({ type: 'UPDATE_PAYMENT', payload: payload.new });
          if (payload.old.status !== payload.new.status) {
            dispatch({
              type: 'ADD_NOTIFICATION',
              payload: {
                id: Date.now(),
                type: 'payment',
                message: `Payment status changed to ${payload.new.status}`,
                timestamp: new Date().toISOString(),
              }
            });
          }
        },
      },
      {
        table: 'subscriptions',
        onUpdate: (payload: any) => {
          dispatch({ type: 'UPDATE_SUBSCRIPTION', payload: payload.new });
          if (payload.old.status !== payload.new.status) {
            dispatch({
              type: 'ADD_NOTIFICATION',
              payload: {
                id: Date.now(),
                type: 'subscription',
                message: `Subscription ${payload.new.status}`,
                timestamp: new Date().toISOString(),
              }
            });
          }
        },
      },
      {
        table: 'user_roles',
        onInsert: () => {
          dispatch({
            type: 'ADD_NOTIFICATION',
            payload: {
              id: Date.now(),
              type: 'role',
              message: 'New role assigned',
              timestamp: new Date().toISOString(),
            }
          });
        },
        onUpdate: (payload: any) => {
          dispatch({ type: 'UPDATE_USER_ROLE', payload: payload.new });
        },
      },
    ],
    []
  );

  // Setup realtime subscriptions
  useRealtimeWebhooks(realtimeConfigs);

  const contextValue: WebhookContextType = {
    ...state,
    updatePayment: (payment) => dispatch({ type: 'UPDATE_PAYMENT', payload: payment }),
    updateSubscription: (subscription) => dispatch({ type: 'UPDATE_SUBSCRIPTION', payload: subscription }),
    updateUserRole: (role) => dispatch({ type: 'UPDATE_USER_ROLE', payload: role }),
    addNotification: (notification) => dispatch({ type: 'ADD_NOTIFICATION', payload: notification }),
    clearNotifications: () => dispatch({ type: 'CLEAR_NOTIFICATIONS' }),
  };

  return (
    <WebhookContext.Provider value={contextValue}>
      {children}
    </WebhookContext.Provider>
  );
}

export function useWebhooks() {
  const context = useContext(WebhookContext);
  if (context === undefined) {
    throw new Error('useWebhooks must be used within a WebhookProvider');
  }
  return context;
}
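
To make the context concrete, here's a small hypothetical component that consumes useWebhooks to render the live notification feed. The class names and the assumption that it renders somewhere inside the WebhookProvider tree are illustrative, not part of the guide's codebase.

// components/WebhookNotifications.tsx
'use client';

import { useWebhooks } from '@/contexts/WebhookContext';

// Renders the most recent realtime notifications pushed by the WebhookProvider
export function WebhookNotifications() {
  const { notifications, clearNotifications } = useWebhooks();

  if (notifications.length === 0) return null;

  return (
    <div className="notifications-panel">
      <button onClick={clearNotifications}>Clear all</button>
      <ul>
        {notifications.map((notification) => (
          <li key={notification.id}>
            <span>{notification.type}</span>: {notification.message}
          </li>
        ))}
      </ul>
    </div>
  );
}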

Common Webhook Errors and Solutions

Understanding and preparing for common webhook errors can save hours of debugging and prevent system downtime. The most frequent issues include signature verification failures, timeout errors, payload parsing problems, database connection issues, and race conditions. Each category requires specific diagnostic approaches and targeted solutions.

Our comprehensive error handling system categorizes errors automatically and provides specific remediation strategies for each type. This systematic approach reduces mean time to resolution and helps prevent recurring issues through proactive monitoring and alerting.

// lib/webhook-diagnostics.ts
import { NextRequest, NextResponse } from 'next/server';
import { WebhookError } from './error-handling';

export class WebhookDiagnostics {
  private static commonErrors = {
    SIGNATURE_VERIFICATION_FAILED: {
      pattern: /signature.*verification.*failed/i,
      category: 'security',
      severity: 'high',
      solution: 'Check webhook secret configuration and signature calculation',
      autoResolve: false,
    },
    TIMEOUT_ERROR: {
      pattern: /timeout|timed out/i,
      category: 'performance',
      severity: 'medium',
      solution: 'Optimize webhook processing logic or increase timeout limits',
      autoResolve: true,
    },
    JSON_PARSE_ERROR: {
      pattern: /unexpected token|invalid json|json parse/i,
      category: 'payload',
      severity: 'medium',
      solution: 'Validate webhook payload format and encoding',
      autoResolve: false,
    },
    DATABASE_CONNECTION_ERROR: {
      pattern: /connection.*refused|database.*unavailable/i,
      category: 'infrastructure',
      severity: 'critical',
      solution: 'Check database connectivity and connection pool settings',
      autoResolve: true,
    },
    RATE_LIMIT_EXCEEDED: {
      pattern: /rate.*limit|too many requests/i,
      category: 'throttling',
      severity: 'low',
      solution: 'Implement exponential backoff and request queuing',
      autoResolve: true,
    },
    DUPLICATE_PROCESSING: {
      pattern: /duplicate.*key|already.*processed/i,
      category: 'idempotency',
      severity: 'low',
      solution: 'Implement proper idempotency checks',
      autoResolve: true,
    },
  };

  static diagnoseError(error: Error): {
    type: string;
    category: string;
    severity: string;
    solution: string;
    autoResolve: boolean;
  } {
    const errorMessage = error.message.toLowerCase();

    for (const [type, config] of Object.entries(this.commonErrors)) {
      if (config.pattern.test(errorMessage)) {
        return {
          type,
          category: config.category,
          severity: config.severity,
          solution: config.solution,
          autoResolve: config.autoResolve,
        };
      }
    }

    return {
      type: 'UNKNOWN_ERROR',
      category: 'general',
      severity: 'medium',
      solution: 'Review error details and webhook logs for specific cause',
      autoResolve: false,
    };
  }

  static generateDiagnosticReport(errors: WebhookError[]): {
    summary: any;
    recommendations: string[];
    criticalIssues: any[];
  } {
    const errorsByType = new Map<string, number>();
    const errorsByCategory = new Map<string, number>();
    const criticalIssues: any[] = [];

    errors.forEach(error => {
      const diagnosis = this.diagnoseError(new Error(error.error_message));

      errorsByType.set(diagnosis.type, (errorsByType.get(diagnosis.type) || 0) + 1);
      errorsByCategory.set(diagnosis.category, (errorsByCategory.get(diagnosis.category) || 0) + 1);

      if (diagnosis.severity === 'critical' || diagnosis.severity === 'high') {
        criticalIssues.push({
          ...error,
          diagnosis,
        });
      }
    });

    const recommendations = this.generateRecommendations(errorsByCategory);

    return {
      summary: {
        totalErrors: errors.length,
        errorsByType: Object.fromEntries(errorsByType),
        errorsByCategory: Object.fromEntries(errorsByCategory),
        criticalErrorsCount: criticalIssues.length,
      },
      recommendations,
      criticalIssues,
    };
  }

  private static generateRecommendations(errorsByCategory: Map<string, number>): string[] {
    const recommendations: string[] = [];

    if ((errorsByCategory.get('security') || 0) > 0) {
      recommendations.push('Review webhook security configuration and signature verification');
    }

    if ((errorsByCategory.get('performance') || 0) > 5) {
      recommendations.push('Consider optimizing webhook processing performance');
    }

    if ((errorsByCategory.get('infrastructure') || 0) > 0) {
      recommendations.push('Check system infrastructure and dependencies');
    }

    if ((errorsByCategory.get('throttling') || 0) > 3) {
      recommendations.push('Implement better rate limiting and request queuing');
    }

    return recommendations;
  }
}

// Enhanced error handling in the main webhook route.
// validateAndParseWebhook, processWebhook, and scheduleWebhookRetry are assumed
// to be defined alongside this route (your existing webhook handlers).
export async function POST(request: NextRequest) {
  const startTime = Date.now();
  let errorDiagnosis: any = null;

  try {
    const payload = await validateAndParseWebhook(request);
    await processWebhook(payload);

    console.log('Webhook processed successfully', {
      processingTime: Date.now() - startTime,
      payloadType: payload.type,
      table: payload.table,
    });

    return NextResponse.json({ success: true });
  } catch (error) {
    errorDiagnosis = WebhookDiagnostics.diagnoseError(error as Error);
    console.error('Webhook processing failed', {
      error: (error as Error).message,
      diagnosis: errorDiagnosis,
      processingTime: Date.now() - startTime,
      stack: (error as Error).stack,
    });

    if (errorDiagnosis.autoResolve && errorDiagnosis.severity !== 'critical') {
      await scheduleWebhookRetry(request, error as Error);
      return NextResponse.json(
        { error: 'Processing failed, scheduled for retry' },
        { status: 202 }
      );
    }

    const statusCode = errorDiagnosis.severity === 'critical' ? 500 : 400;
    return NextResponse.json(
      {
        error: 'Webhook processing failed',
        diagnosis: errorDiagnosis.solution,
      },
      { status: statusCode }
    );
  }
}
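
Signature verification failures top the error list above, and the route relies on a validateAndParseWebhook helper that isn't shown in this part. Here is a minimal sketch of what it might look like, assuming the sender signs the raw request body with an HMAC-SHA256 of SUPABASE_WEBHOOK_SECRET and puts the hex digest in an x-webhook-signature header. Both the header name and the signing scheme are assumptions, so match whatever your webhook sender actually does.

// lib/validate-webhook.ts
// Sketch: verify an HMAC-SHA256 signature over the raw request body, then parse it.
import { createHmac, timingSafeEqual } from 'crypto';
import { NextRequest } from 'next/server';

export async function validateAndParseWebhook(request: NextRequest): Promise<any> {
  const rawBody = await request.text();
  const receivedSignature = request.headers.get('x-webhook-signature') ?? '';

  const expectedSignature = createHmac('sha256', process.env.SUPABASE_WEBHOOK_SECRET!)
    .update(rawBody)
    .digest('hex');

  const received = Buffer.from(receivedSignature, 'hex');
  const expected = Buffer.from(expectedSignature, 'hex');

  // timingSafeEqual throws if lengths differ, so check the length first
  if (received.length !== expected.length || !timingSafeEqual(received, expected)) {
    throw new Error('Signature verification failed');
  }

  try {
    return JSON.parse(rawBody);
  } catch {
    throw new Error('Invalid JSON payload');
  }
}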

Performance Optimization and Monitoring

High-performance webhook systems require careful optimization at multiple levels, from database queries and network calls to memory usage and processing algorithms. Our optimization strategy focuses on reducing latency, increasing throughput, and maintaining system stability under high load conditions.

Monitoring is equally important for maintaining webhook reliability. Our comprehensive monitoring system tracks processing times, error rates, payload sizes, and system resource usage. This data enables proactive optimization and helps identify performance bottlenecks before they impact users.

// lib/webhook-performance.ts
export class WebhookPerformanceMonitor {
  private static metrics = new Map<string, any>();
  private static readonly METRIC_RETENTION_MS = 24 * 60 * 60 * 1000; // 24 hours

  static startTimer(webhookId: string): string {
    const timerId = `${webhookId}-${Date.now()}`;
    this.metrics.set(timerId, {
      startTime: Date.now(),
      webhookId,
      type: 'timer',
    });
    return timerId;
  }

  static endTimer(timerId: string, metadata: any = {}) {
    const timer = this.metrics.get(timerId);
    if (!timer) return;

    const processingTime = Date.now() - timer.startTime;

    // Update metrics
    this.updateProcessingTimeMetrics(timer.webhookId, processingTime);

    // Log performance data
    console.log('Webhook performance:', {
      webhookId: timer.webhookId,
      processingTime,
      ...metadata,
    });

    // Clean up timer
    this.metrics.delete(timerId);
  }

  private static updateProcessingTimeMetrics(webhookId: string, processingTime: number) {
    const key = `processing_times_${webhookId}`;
    const existing = this.metrics.get(key) || {
      count: 0,
      totalTime: 0,
      minTime: Infinity,
      maxTime: 0,
      recentTimes: [],
    };

    existing.count++;
    existing.totalTime += processingTime;
    existing.minTime = Math.min(existing.minTime, processingTime);
    existing.maxTime = Math.max(existing.maxTime, processingTime);
    existing.recentTimes.push(processingTime);
    existing.lastUpdated = Date.now(); // used by cleanupOldMetrics

    // Keep only recent times (last 100 calls)
    if (existing.recentTimes.length > 100) {
      existing.recentTimes = existing.recentTimes.slice(-100);
    }

    this.metrics.set(key, existing);
  }

  static getPerformanceReport(): any {
    const report: any = {
      summary: {
        totalWebhooks: 0,
        averageProcessingTime: 0,
        totalProcessingTime: 0,
        peakProcessingTime: 0,
        minProcessingTime: Infinity,
      },
      webhooks: {},
    };

    // Aggregate metrics across all webhooks
    for (const [key, value] of this.metrics.entries()) {
      if (key.startsWith('processing_times_')) {
        const webhookId = key.replace('processing_times_', '');
        const avgTime = value.totalTime / value.count;
        const p95Time = this.calculatePercentile(value.recentTimes, 0.95);

        report.webhooks[webhookId] = {
          count: value.count,
          averageTime: avgTime,
          minTime: value.minTime === Infinity ? 0 : value.minTime,
          maxTime: value.maxTime,
          p95Time,
          totalTime: value.totalTime,
        };

        report.summary.totalWebhooks++;
        report.summary.totalProcessingTime += value.totalTime;
        report.summary.peakProcessingTime = Math.max(report.summary.peakProcessingTime, value.maxTime);
        report.summary.minProcessingTime = Math.min(report.summary.minProcessingTime, value.minTime);
      }
    }

    if (report.summary.totalWebhooks > 0) {
      report.summary.averageProcessingTime =
        report.summary.totalProcessingTime /
        Object.values(report.webhooks).reduce((sum: number, w: any) => sum + w.count, 0);
    }

    return report;
  }

  private static calculatePercentile(times: number[], percentile: number): number {
    if (times.length === 0) return 0;

    const sorted = [...times].sort((a, b) => a - b);
    const index = Math.ceil(sorted.length * percentile) - 1;
    return sorted[Math.max(0, index)];
  }

  static recordError(webhookId: string, error: Error, context: any = {}) {
    const key = `errors_${webhookId}`;
    const existing = this.metrics.get(key) || {
      count: 0,
      recentErrors: [],
    };

    existing.count++;
    existing.recentErrors.push({
      timestamp: Date.now(),
      message: error.message,
      stack: error.stack,
      context,
    });
    existing.lastUpdated = Date.now(); // used by cleanupOldMetrics

    // Keep only recent errors (last 50)
    if (existing.recentErrors.length > 50) {
      existing.recentErrors = existing.recentErrors.slice(-50);
    }

    this.metrics.set(key, existing);
  }

  static getErrorReport(): any {
    const report: any = {
      summary: {
        totalErrors: 0,
        webhooksWithErrors: 0,
      },
      webhooks: {},
    };

    for (const [key, value] of this.metrics.entries()) {
      if (key.startsWith('errors_')) {
        const webhookId = key.replace('errors_', '');

        report.webhooks[webhookId] = {
          count: value.count,
          recentErrors: value.recentErrors.slice(-10), // Last 10 errors
        };

        report.summary.totalErrors += value.count;
        report.summary.webhooksWithErrors++;
      }
    }

    return report;
  }

  static cleanupOldMetrics() {
    const cutoff = Date.now() - this.METRIC_RETENTION_MS;

    for (const [key, value] of this.metrics.entries()) {
      // Timers track startTime; aggregated entries track lastUpdated
      const lastSeen = value.lastUpdated ?? value.startTime;
      if (lastSeen && lastSeen < cutoff) {
        this.metrics.delete(key);
      }
    }
  }

  static getMemoryUsage(): any {
    return {
      metricsCount: this.metrics.size,
      estimatedMemoryMB: (this.metrics.size * 1024) / (1024 * 1024), // Rough estimate
    };
  }
}

// Database query optimization utilities
export class WebhookQueryOptimizer {
  static async getWebhooksBatch(ids: string[], batchSize = 100): Promise<any[]> {
    const results: any[] = [];

    for (let i = 0; i < ids.length; i += batchSize) {
      const batch = ids.slice(i, i + batchSize);
      // Replace with your actual database query
      const batchResults = await this.queryWebhooksByIds(batch);
      results.push(...batchResults);
    }

    return results;
  }

  private static async queryWebhooksByIds(ids: string[]): Promise<any[]> {
    // Example database query - replace with your actual implementation
    // return await db.webhooks.findMany({
    //   where: { id: { in: ids } },
    //   select: { id: true, url: true, secret: true, events: true }
    // });
    return [];
  }

  static createIndexHints(): string[] {
    return [
      'CREATE INDEX CONCURRENTLY IF NOT EXISTS idx_webhooks_status ON webhooks(status)',
      'CREATE INDEX CONCURRENTLY IF NOT EXISTS idx_webhook_events_created_at ON webhook_events(created_at)',
      'CREATE INDEX CONCURRENTLY IF NOT EXISTS idx_webhook_events_webhook_id ON webhook_events(webhook_id)',
      'CREATE INDEX CONCURRENTLY IF NOT EXISTS idx_webhook_events_retry_count ON webhook_events(retry_count)',
    ];
  }
}

// Connection pooling and resource management
export class WebhookResourceManager {
  private static connectionPool: any = null;
  private static readonly MAX_CONNECTIONS = 20;
  private static readonly IDLE_TIMEOUT = 30000;

  static initializePool() {
    // Initialize connection pool based on your database/HTTP client
    // This is a conceptual example
    this.connectionPool = {
      maxConnections: this.MAX_CONNECTIONS,
      idleTimeout: this.IDLE_TIMEOUT,
      activeConnections: 0,
      waitingRequests: [],
    };
  }

  static async getConnection(): Promise<any> {
    // Return a connection from the pool
    // Implement actual connection pooling logic here
    return this.connectionPool;
  }

  static releaseConnection(connection: any) {
    // Return connection to pool
    // Implement cleanup logic here
  }

  static getPoolStats(): any {
    return {
      activeConnections: this.connectionPool?.activeConnections || 0,
      maxConnections: this.MAX_CONNECTIONS,
      waitingRequests: this.connectionPool?.waitingRequests?.length || 0,
    };
  }
}

// Rate limiting and throttling
export class WebhookRateLimiter {
  private static limits = new Map<string, any>();
  private static readonly DEFAULT_LIMIT = 100; // requests per minute
  private static readonly WINDOW_MS = 60 * 1000; // 1 minute

  static async checkLimit(webhookId: string, customLimit?: number): Promise<boolean> {
    const limit = customLimit || this.DEFAULT_LIMIT;
    const now = Date.now();
    const windowStart = now - this.WINDOW_MS;

    const key = `rate_limit_${webhookId}`;
    const existing = this.limits.get(key) || {
      requests: [],
    };

    // Remove old requests outside the window
    existing.requests = existing.requests.filter((timestamp: number) => timestamp > windowStart);

    if (existing.requests.length >= limit) {
      return false; // Rate limit exceeded
    }

    existing.requests.push(now);
    this.limits.set(key, existing);

    return true;
  }

  static getRateLimitStatus(webhookId: string): any {
    const key = `rate_limit_${webhookId}`;
    const existing = this.limits.get(key) || { requests: [] };
    const now = Date.now();
    const windowStart = now - this.WINDOW_MS;

    const recentRequests = existing.requests.filter((timestamp: number) => timestamp > windowStart);

    return {
      currentRequests: recentRequests.length,
      limit: this.DEFAULT_LIMIT,
      resetTime: windowStart + this.WINDOW_MS,
      remaining: Math.max(0, this.DEFAULT_LIMIT - recentRequests.length),
    };
  }

  static clearExpiredLimits() {
    const now = Date.now();
    const windowStart = now - this.WINDOW_MS;

    for (const [key, value] of this.limits.entries()) {
      value.requests = value.requests.filter((timestamp: number) => timestamp > windowStart);
      if (value.requests.length === 0) {
        this.limits.delete(key);
      }
    }
  }
}

// Usage example with integrated monitoring
export async function processWebhookWithMonitoring(webhookId: string, payload: any) {
  const timerId = WebhookPerformanceMonitor.startTimer(webhookId);
  const payloadSize = JSON.stringify(payload).length;

  try {
    // Check rate limit
    if (!(await WebhookRateLimiter.checkLimit(webhookId))) {
      throw new Error('Rate limit exceeded');
    }

    // Process webhook
    const result = await processWebhook(webhookId, payload);

    WebhookPerformanceMonitor.endTimer(timerId, {
      payloadSize,
      success: true,
    });

    return result;
  } catch (error) {
    WebhookPerformanceMonitor.recordError(webhookId, error as Error, {
      payloadSize,
    });

    WebhookPerformanceMonitor.endTimer(timerId, {
      payloadSize,
      success: false,
      error: (error as Error).message,
    });

    throw error;
  }
}

async function processWebhook(webhookId: string, payload: any): Promise<any> {
  // Your webhook processing logic here
  return { success: true };
}

// Cleanup service to run periodically
setInterval(() => {
  WebhookPerformanceMonitor.cleanupOldMetrics();
  WebhookRateLimiter.clearExpiredLimits();
}, 5 * 60 * 1000); // Every 5 minutes
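
To expose these metrics to a dashboard or an external monitor, a small read-only endpoint is usually enough. The route path and the METRICS_TOKEN bearer check below are assumptions; also remember that in a multi-instance deployment these in-memory metrics are per container, so aggregation belongs in your monitoring system rather than in this route.

// app/api/webhooks/metrics/route.ts
// Sketch: expose the in-memory performance and error reports for scraping.
import { NextRequest, NextResponse } from 'next/server';
import {
  WebhookPerformanceMonitor,
  WebhookResourceManager,
} from '@/lib/webhook-performance';

export async function GET(request: NextRequest) {
  // Keep internal metrics behind a shared secret
  const authHeader = request.headers.get('authorization');
  if (authHeader !== `Bearer ${process.env.METRICS_TOKEN}`) {
    return NextResponse.json({ error: 'Unauthorized' }, { status: 401 });
  }

  return NextResponse.json({
    performance: WebhookPerformanceMonitor.getPerformanceReport(),
    errors: WebhookPerformanceMonitor.getErrorReport(),
    memory: WebhookPerformanceMonitor.getMemoryUsage(),
    pool: WebhookResourceManager.getPoolStats(),
    generatedAt: new Date().toISOString(),
  });
}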