# Timer Functions
Timer functions execute on a schedule you define. Use them for background jobs, cleanup tasks, report generation, and periodic syncs.
## How It Works
- Set an interval (e.g., every 5 minutes, every 2 hours, daily)
- Configure a start time (UTC)
- Write JavaScript code or configure an external webhook (a minimal sketch follows this list)
- Function runs automatically at each interval
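For example, a minimal timer function body might look like the sketch below. The `tasks` resource and its `status` field are hypothetical; `get()` and `log()` are the built-in helpers covered later on this page.

```javascript
// Minimal timer function: runs at each scheduled interval, with no request context
log(`Timer fired at ${new Date().toISOString()}`);

// Typical pattern: query internal data with the built-in helpers, then act on it
const pendingRes = await get('tasks?filter=status eq "pending"'); // hypothetical resource
log(`Found ${pendingRes.data.length} pending tasks to process`);
```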
## Configuration
### Schedule Settings
| Setting | Description |
|---|---|
| Interval | Number of time units between executions |
| Frequency | Unit of time: Minutes, Hours, or Days |
| Start Time (UTC) | When the first execution should occur |
### Examples
| Interval | Frequency | Result |
|---|---|---|
| 5 | Minutes | Every 5 minutes |
| 1 | Hours | Every hour |
| 12 | Hours | Twice daily |
| 1 | Days | Once daily |
| 7 | Days | Weekly |
## Automatic Tracking
The system automatically tracks:
| Property | Description |
|---|---|
| Last Run | When the function last executed |
| Next Run | Calculated from last run + interval (see the example below) |
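The arithmetic is straightforward; the plain JavaScript sketch below illustrates the Next Run calculation for a 12-hour interval (the timestamps are invented for illustration):

```javascript
// Next Run = Last Run + (Interval x Frequency), here 12 hours
const lastRun = new Date('2025-01-15T06:00:00Z'); // hypothetical last execution
const intervalMs = 12 * 60 * 60 * 1000;           // 12 hours in milliseconds
const nextRun = new Date(lastRun.getTime() + intervalMs);
// nextRun.toISOString() === '2025-01-15T18:00:00Z'
```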
## Execution Context
Timer functions receive limited context since they're not triggered by a request:
| Object | Description |
|---|---|
| `api` | API information |
| `me` | Service account user (if configured) |
| `secrets` | Configured secrets |
> **Note:** Timer functions don't have `item` or `req` since there's no triggering request or record.
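As a quick illustration, a timer function can inspect what it does and doesn't receive. This is only a sketch; the `email` property on `me` is an assumption for illustration.

```javascript
// Sketch: what is (and isn't) available inside a timer function
log(`Running as: ${me ? me.email : 'no service account'}`); // `email` is an assumed property
log(`Secrets configured: ${Object.keys(secrets).length}`);

// No triggering request or record
log(`req defined: ${typeof req !== 'undefined'}`);   // false
log(`item defined: ${typeof item !== 'undefined'}`); // false
```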
## Token Options
| Option | Description |
|---|---|
| None | No authentication |
| Service Account | Execute with service account permissions |
Timer functions typically use a service account since there's no "current user" triggering the execution.
## Use Cases
### Cleanup Jobs
Remove old or expired data:
```javascript
// Delete sessions older than 24 hours
const cutoff = new Date();
cutoff.setHours(cutoff.getHours() - 24);

const sessionsRes = await get(`sessions?filter=createdAt lt "${cutoff.toISOString()}"`);

for (const session of sessionsRes.data) {
  await del(`sessions/${session.id}`);
}

log(`Cleaned up ${sessionsRes.data.length} expired sessions`);
```
### Report Generation
Generate periodic reports:
```javascript
// Daily sales summary
const today = new Date().toISOString().split('T')[0];
const ordersRes = await get(`orders?filter=orderDate eq "${today}"`);
const orders = ordersRes.data;

const totalSales = orders.reduce((sum, o) => sum + o.total, 0);
const orderCount = orders.length;

// Store the report
await post('daily-reports', {
  date: today,
  totalSales,
  orderCount,
  averageOrder: orderCount > 0 ? totalSales / orderCount : 0
});

log(`Generated report: ${orderCount} orders, $${totalSales} total`);
```
### External Sync
Sync data with external systems periodically (requires paid tier):
```javascript
// Sync inventory from warehouse system every hour
const response = await fetch('https://warehouse.example.com/api/inventory', {
  headers: { 'Authorization': `Bearer ${secrets.WAREHOUSE_API_KEY}` }
});
const inventory = await response.json();

for (const item of inventory) {
  await patch(`products/${item.sku}`, {
    stockLevel: item.quantity
  });
}

log(`Synced ${inventory.length} inventory items`);
```
### Status Checks
Monitor data conditions and send alerts:
```javascript
// Check for orders stuck in processing
const oneHourAgo = new Date();
oneHourAgo.setHours(oneHourAgo.getHours() - 1);

const stuckOrdersRes = await get(
  `orders?filter=status eq "processing" and updatedAt lt "${oneHourAgo.toISOString()}"`
);
const stuckOrders = stuckOrdersRes.data;

if (stuckOrders.length > 0) {
  // Alert operations team (requires paid tier)
  await fetch('https://hooks.slack.com/services/xxx', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      text: `Warning: ${stuckOrders.length} orders stuck in processing for over 1 hour`
    })
  });

  log(`Alerted about ${stuckOrders.length} stuck orders`);
}
```
### Data Aggregation
Pre-calculate aggregates for performance:
```javascript
// Update category statistics every 15 minutes
const categoriesRes = await get('categories');

for (const category of categoriesRes.data) {
  const productsRes = await get(
    `products?filter=categoryId eq "${category.id}"&count=true`
  );

  await patch(`categories/${category.id}`, {
    productCount: productsRes.meta?.count || 0,
    lastUpdated: new Date().toISOString()
  });
}

log(`Updated stats for ${categoriesRes.data.length} categories`);
```
### Batch Processing
Process items in batches:
```javascript
// Process pending email queue
const pendingRes = await get('email-queue?filter=status eq "pending"&pageSize=100');

for (const email of pendingRes.data) {
  try {
    // Send via external service (requires paid tier)
    const response = await fetch('https://api.sendgrid.com/v3/mail/send', {
      method: 'POST',
      headers: {
        'Authorization': `Bearer ${secrets.SENDGRID_API_KEY}`,
        'Content-Type': 'application/json'
      },
      body: JSON.stringify({
        personalizations: [{ to: [{ email: email.recipient }] }],
        from: { email: 'no-reply@example.com' }, // your verified sender address
        subject: email.subject,
        content: [{ type: 'text/plain', value: email.body }]
      })
    });
    if (!response.ok) {
      throw new Error(`SendGrid responded with ${response.status}`);
    }

    await patch(`email-queue/${email.id}`, { status: 'sent', sentAt: new Date().toISOString() });
  } catch (e) {
    await patch(`email-queue/${email.id}`, { status: 'failed', error: e.message });
    logError(`Failed to send email ${email.id}: ${e.message}`);
  }
}

log(`Processed ${pendingRes.data.length} emails`);
```
## Best Practices
- Use appropriate intervals — Don't run more frequently than needed
- Handle failures gracefully — Timer functions will retry on next interval
- Log execution details — Use `log()` to track what was processed
- Use service accounts — Ensure consistent permissions
- Consider time zones — Start times are in UTC
- Monitor execution — Check logs for failures or long-running jobs
- Use built-in methods — `get()`, `post()`, `patch()`, `del()` for internal API calls (see the skeleton after this list)
- Reserve `fetch` for external APIs — External calls require paid tier
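A skeleton that pulls several of these practices together (built-in methods for internal calls, explicit logging, graceful failure handling). The `reports` resource and its `status` values are illustrative only.

```javascript
// Illustrative skeleton applying several best practices
const startedAt = Date.now();
let processed = 0;

try {
  // Built-in method for the internal API call
  const staleRes = await get('reports?filter=status eq "stale"'); // hypothetical resource
  for (const report of staleRes.data) {
    await patch(`reports/${report.id}`, { status: 'refreshed' });
    processed++;
  }
  log(`Refreshed ${processed} reports in ${Date.now() - startedAt} ms`);
} catch (e) {
  // Log the failure and let the next scheduled run retry
  logError(`Timer run failed after ${processed} items: ${e.message}`);
}
```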