Storing Webhook Data
How to store webhook events for persistence and analysis.
Why Store Webhook Data?
1. Meet Compliance Requirements
Many industries have regulations around data retention and audit trails:
- GDPR - You may need to demonstrate what communications were sent to users and when
- SOC 2 - Audit requirements may include delivery verification
- Financial regulations - Transaction-related messages may need to be retained for years
Storing your own webhook data gives you full control over retention periods and access controls.
2. Enable Long-term Retention
Sendable retains webhook logs for a limited period. If you need access to historical data beyond that window, storing events in your own database ensures you never lose important information.
3. Power Automated Workflows
With webhook data in your database, you can build powerful automations:
- Automatically handle failed deliveries
- Trigger follow-up messages based on read receipts
- Alert your team when delivery issues spike
- Re-engage users who haven't read recent messages
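Most of these automations reduce to a query over recent events plus a threshold check. As a minimal sketch, here is the failure-spike alert: the counts would come from a query over your stored events, and the 5% threshold is purely illustrative.

```typescript
// Sketch of a failure-spike check. The counts would come from a query
// over recent stored webhook events; the threshold is an assumption.
type EventCounts = { delivered: number; failed: number }

function shouldAlert(counts: EventCounts, maxFailureRate = 0.05): boolean {
  const total = counts.delivered + counts.failed
  if (total === 0) return false
  return counts.failed / total > maxFailureRate
}

console.log(shouldAlert({ delivered: 950, failed: 50 }))  // 0.05, at threshold → false
console.log(shouldAlert({ delivered: 900, failed: 100 })) // 0.1 > 0.05 → true
```

Run a check like this on a schedule (or after each batch of events) and page your team when it returns true.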
Database Options
PostgreSQL
Reliable, feature-rich database. Best if you already use Postgres.
Supabase
PostgreSQL with real-time subscriptions. Generous free tier.
Neon
Serverless Postgres that scales to zero. Pay for what you use.
ClickHouse
High-performance analytics database for large volumes.
BigQuery
Google's data warehouse for analytics and reporting.
Snowflake
Cloud data platform for enterprise analytics.
How to Store Webhook Data
Build Your Own Handler
Create a webhook endpoint that stores events:
```typescript
import express from 'express'
import { Pool } from 'pg'

const app = express()
app.use(express.json())

const db = new Pool({ connectionString: process.env.DATABASE_URL })

app.post('/webhook', async (req, res) => {
  const { event, data, timestamp } = req.body

  // Store the event; ON CONFLICT makes retried deliveries idempotent
  await db.query(
    `INSERT INTO webhook_events (event_id, event_type, payload, created_at)
     VALUES ($1, $2, $3, $4)
     ON CONFLICT (event_id, event_type) DO NOTHING`,
    [data.messageId, event, JSON.stringify(data), timestamp]
  )

  res.sendStatus(200)
})
```
Database Schema
Example PostgreSQL schema:
```sql
CREATE TABLE webhook_events (
  id SERIAL PRIMARY KEY,
  event_id VARCHAR(255),
  event_type VARCHAR(100),
  payload JSONB,
  created_at TIMESTAMP,
  UNIQUE(event_id, event_type)
);

CREATE INDEX idx_event_type ON webhook_events(event_type);
CREATE INDEX idx_created_at ON webhook_events(created_at);
```
What Data Should You Store?
At minimum, store these fields:
- Event ID - Unique identifier for deduplication
- Event type - What happened (delivered, read, failed, etc.)
- Timestamp - When the event occurred
- Message ID - Links back to the original message
For deeper analytics:
- Recipient - For per-user engagement tracking
- Message content - To analyze performance
- Session ID - To track by WhatsApp session
- Error details - To understand delivery issues
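One way to keep these fields consistent is to normalize each payload into a fixed row shape before inserting it. A sketch, assuming hypothetical payload field names (`messageId`, `recipient`, `sessionId`, `error`); check the actual webhook payload reference for the real shape.

```typescript
// Sketch: map an incoming webhook payload to a storage row.
// The payload field names below are assumptions, not a documented shape.
interface WebhookBody {
  event: string
  timestamp: string
  data: {
    messageId: string
    recipient?: string
    sessionId?: string
    error?: { code: string; message: string }
  }
}

interface StoredEvent {
  eventId: string
  eventType: string
  createdAt: string
  recipient: string | null
  sessionId: string | null
  errorDetails: string | null
  payload: string
}

function toStoredEvent(body: WebhookBody): StoredEvent {
  return {
    eventId: body.data.messageId,
    eventType: body.event,
    createdAt: body.timestamp,
    recipient: body.data.recipient ?? null,
    sessionId: body.data.sessionId ?? null,
    errorDetails: body.data.error ? JSON.stringify(body.data.error) : null,
    payload: JSON.stringify(body.data),
  }
}
```

Normalizing up front means your queries never have to dig into the raw JSON for the fields you use most.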
Data Retention
How Long to Keep Data
- Operational use - 30-90 days for debugging and recent analytics
- Compliance requirements - Check your industry regulations (often 1-7 years)
- Historical analysis - Consider aggregating old data
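A retention job can be as simple as a scheduled delete past a cutoff date. A sketch against the example schema in this guide; the 90-day window is illustrative, not a recommendation for your industry.

```typescript
// Sketch of a retention job: compute a cutoff date, then delete older rows.
// The 90-day window is an illustrative default.
function retentionCutoff(retentionDays: number, now = new Date()): Date {
  return new Date(now.getTime() - retentionDays * 24 * 60 * 60 * 1000)
}

// With the pg Pool from the handler example, run on a daily schedule:
// await db.query('DELETE FROM webhook_events WHERE created_at < $1', [retentionCutoff(90)])
```

If you need aggregates beyond the retention window, roll old rows up into daily summary tables before deleting them.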
Privacy Considerations
Webhook data contains personal information (phone numbers). Ensure compliance:
- Implement appropriate access controls
- Consider data anonymization for long-term retention
- Have a process for handling data deletion requests
Storage Costs
High-volume senders can generate significant data:
- 10,000 messages/month = ~30,000-50,000 events
- Each event ~1-2 KB = ~50-100 MB/month raw data
Plan for:
- Database storage costs
- Query performance as data grows
- Archival strategies for old data
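The arithmetic behind these estimates is simple enough to script. A sketch using the assumptions above (3-5 events per message, 1-2 KB per event, with midpoint defaults):

```typescript
// Back-of-envelope storage estimate. The defaults are midpoints of the
// ranges quoted above (3-5 events/message, 1-2 KB/event).
function estimateMonthlyStorageMB(
  messagesPerMonth: number,
  eventsPerMessage = 4,
  kbPerEvent = 1.5
): number {
  return (messagesPerMonth * eventsPerMessage * kbPerEvent) / 1024
}

console.log(estimateMonthlyStorageMB(10_000)) // ≈ 58.6 MB/month
```

Note this is raw payload size only; indexes and the JSONB column's overhead will add to it.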
FAQ
How much data will I generate?
Each message can generate multiple events (sent, delivered, read, failed). Rough estimate: 10,000 messages/month with average engagement = 30,000-50,000 events/month, ~50-100 MB/month.
Will storing webhooks slow down my application?
Not if your handler is written carefully. Webhook delivery is separate from message sending, so sending is unaffected. Return a 200 response quickly and do any heavy processing in the background so the webhook request isn't blocked.
What if I miss a webhook?
Sendable automatically retries failed webhook deliveries for up to 24 hours. If your endpoint returns a 5xx error, we'll retry with exponential backoff.
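To illustrate what exponential backoff means for a temporarily down endpoint, here is the general shape of such a schedule. The base delay and attempt count are assumptions for illustration, not Sendable's published retry intervals.

```typescript
// Illustrative exponential backoff: the delay doubles after each attempt.
// Base delay and attempt count are assumptions, not documented values.
function backoffSchedule(baseMinutes: number, attempts: number): number[] {
  return Array.from({ length: attempts }, (_, i) => baseMinutes * 2 ** i)
}

console.log(backoffSchedule(1, 6)) // [ 1, 2, 4, 8, 16, 32 ]
```

The practical takeaway: return 5xx (not 2xx) when your database write fails, so the event is redelivered once your endpoint recovers.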