every - Scheduled Workflows
Complete guide to scheduling recurring tasks and workflows with cron-like syntax in MCP.do
The every primitive enables scheduling recurring tasks using cron-like syntax. It's essential for automating periodic operations like cleanup jobs, report generation, health checks, and scheduled notifications.
Overview
Scheduled workflows in MCP.do follow the familiar cron pattern but integrate deeply with the platform's event system, database operations, and AI capabilities. Unlike traditional cron jobs that run scripts on servers, every creates persistent scheduled tasks that execute within the secure MCP.do runtime.
Key Features
- Cron Syntax: Standard cron expression support with minute-level precision
- Timezone Support: Run tasks in specific timezones, not just UTC
- Immediate Execution: Optionally run tasks immediately upon creation
- Integration: Full access to all SDK primitives (db, ai, send, etc.)
- Lifecycle Management: Pause, resume, and unsubscribe from scheduled tasks
- Persistent: Tasks survive server restarts and continue running
- Error Handling: Built-in retry logic and error reporting
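To see several of these features in one place, here is a minimal sketch; the schedule and handler body are placeholders, and the options used are only those documented in the type signatures below:
// Minimal sketch: a placeholder nightly task scheduled in Eastern time,
// run once immediately, with exponential-backoff retries.
const nightly = every(
  '0 2 * * *', // daily at 2 AM in the configured timezone
  async () => {
    console.log('Nightly task running at', new Date())
  },
  {
    timezone: 'America/New_York', // interpret the schedule in Eastern time, not UTC
    runImmediately: true, // execute once on creation, then on schedule
    eventOptions: {
      retry: { maxAttempts: 3, backoff: 'exponential' },
    },
  }
)

// Lifecycle control via the returned Subscription
nightly.pause() // temporarily stop delivery
nightly.resume() // pick the schedule back up
console.log(nightly.isActive()) // true
// await nightly.unsubscribe() // remove the schedule permanently
The returned Subscription is what you keep if you need to manage the task later (see Lifecycle Management below).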
Type Signatures
function every(schedule: string, handler: EventHandler, options?: EveryOptions): Subscription
interface EventHandler<T = unknown> {
(data: T): Promise<void> | void
}
interface EveryOptions {
// Timezone for cron schedule (default: UTC)
timezone?: string
// Run immediately on creation (default: false)
runImmediately?: boolean
// Additional event options
eventOptions?: EventOptions
}
interface Subscription {
// Unsubscribe from events
unsubscribe: () => Promise<void>
// Pause event delivery
pause: () => void
// Resume event delivery
resume: () => void
// Check if subscription is active
isActive: () => boolean
// Get subscription metadata
getMetadata: () => {
subscriberId: string
pattern: string
subscribedAt: Date
active: boolean
}
}
interface EventOptions {
// Event priority (0-10, default: 5)
priority?: number
// Delivery delay in milliseconds
delay?: number
// Retry configuration
retry?: {
maxAttempts?: number
backoff?: 'linear' | 'exponential'
}
// Source identifier
source?: string
// Distributed tracing ID
traceId?: string
// Correlation ID for related events
correlationId?: string
}
Cron Syntax Guide
The schedule parameter follows standard cron syntax with five fields:
┌───────────── minute (0 - 59)
│ ┌───────────── hour (0 - 23)
│ │ ┌───────────── day of month (1 - 31)
│ │ │ ┌───────────── month (1 - 12)
│ │ │ │ ┌───────────── day of week (0 - 6) (Sunday to Saturday)
│ │ │ │ │
│ │ │ │ │
* * * * *
Special Characters
- *: Any value (wildcard)
- ,: Value list separator (e.g., 1,3,5)
- -: Range of values (e.g., 1-5)
- /: Step values (e.g., */5 = every 5 units)
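These characters can be combined within a single expression, for example:
// Step, range, wildcard, and list combined:
// every 10 minutes, during hours 9 through 17, on Monday, Wednesday, and Friday
every('*/10 9-17 * * 1,3,5', handler)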
Common Patterns
// Every minute
every('* * * * *', handler)
// Every 5 minutes
every('*/5 * * * *', handler)
// Every 15 minutes
every('*/15 * * * *', handler)
// Every 30 minutes
every('*/30 * * * *', handler)
// Every hour at minute 0
every('0 * * * *', handler)
// Every 6 hours
every('0 */6 * * *', handler)
// Daily at midnight UTC
every('0 0 * * *', handler)
// Daily at 9 AM UTC
every('0 9 * * *', handler)
// Daily at 5 PM UTC
every('0 17 * * *', handler)
// Weekly on Monday at 9 AM
every('0 9 * * 1', handler)
// Weekly on Sunday at midnight
every('0 0 * * 0', handler)
// Monthly on the 1st at midnight
every('0 0 1 * *', handler)
// Quarterly (every 3 months) on the 1st
every('0 0 1 */3 *', handler)
// Weekdays at 8 AM (Mon-Fri)
every('0 8 * * 1-5', handler)
// Weekends at 10 AM (Sat-Sun)
every('0 10 * * 0,6', handler)
// First day of every month at 6 AM
every('0 6 1 * *', handler)
// Every 4 hours during business hours (8 AM, noon, 4 PM)
every('0 8,12,16 * * *', handler)
Basic Usage
Simple Scheduled Task
// Run every hour
every('0 * * * *', async () => {
console.log('Running hourly task at', new Date())
})
Task with Database Operations
// Check for expired items every 15 minutes
every('*/15 * * * *', async () => {
const expired = await db.list('CacheEntry', {
where: {
expiresAt: { lt: new Date() },
},
})
console.log(`Found ${expired.length} expired cache entries`)
for (const entry of expired) {
await db.delete('CacheEntry', entry.id)
}
console.log('Cleanup complete')
})
Task with Event Publishing
// Send daily report every morning at 9 AM
every('0 9 * * *', async () => {
const orders = await db.list('Order', {
where: {
createdAt: {
gte: new Date(Date.now() - 24 * 60 * 60 * 1000), // Last 24 hours
},
},
})
await send($.Report.daily, {
date: new Date(),
orderCount: orders.length,
totalRevenue: orders.reduce((sum, o) => sum + o.total, 0),
})
})
Advanced Examples
1. Data Cleanup Job
Remove stale data from multiple collections:
// Run daily at 2 AM UTC
every('0 2 * * *', async () => {
const thirtyDaysAgo = new Date(Date.now() - 30 * 24 * 60 * 60 * 1000)
// Clean up old sessions
const oldSessions = await db.list('Session', {
where: { expiresAt: { lt: new Date() } },
})
for (const session of oldSessions) {
await db.delete('Session', session.id)
}
// Archive old orders
const oldOrders = await db.list('Order', {
where: {
status: 'completed',
completedAt: { lt: thirtyDaysAgo },
},
})
for (const order of oldOrders) {
await db.update('Order', order.id, {
archived: true,
archivedAt: new Date(),
})
}
// Delete old temporary files
const oldFiles = await db.list('TempFile', {
where: { createdAt: { lt: thirtyDaysAgo } },
})
for (const file of oldFiles) {
await db.delete('TempFile', file.id)
}
console.log(`Cleanup complete: ${oldSessions.length} sessions, ${oldOrders.length} orders, ${oldFiles.length} files`)
})
2. Daily Report Generation
Generate and send comprehensive daily reports:
// Run daily at 8 AM Eastern Time
every(
'0 8 * * *',
async () => {
// Gather metrics
const yesterday = new Date(Date.now() - 24 * 60 * 60 * 1000)
const [newCustomers, orders, revenue] = await Promise.all([
db.list('Customer', {
where: { createdAt: { gte: yesterday } },
}),
db.list('Order', {
where: { createdAt: { gte: yesterday } },
}),
db.list('Payment', {
where: {
status: 'succeeded',
createdAt: { gte: yesterday },
},
}),
])
const totalRevenue = revenue.reduce((sum, p) => sum + p.amount, 0)
// Generate report with AI
const report = await ai.generate({
prompt: `Write a daily business summary report with these metrics:
- New Customers: ${newCustomers.length}
- Orders: ${orders.length}
- Revenue: $${(totalRevenue / 100).toFixed(2)}
Include insights and trends.`,
model: 'gpt-5',
schema: $.Report,
})
// Send report via email
await send($.Email.send, {
to: '[email protected]',
subject: `Daily Report - ${new Date().toLocaleDateString()}`,
body: report.text,
attachments: [
{
filename: 'daily-metrics.json',
content: JSON.stringify(
{
newCustomers: newCustomers.length,
orders: orders.length,
revenue: totalRevenue,
},
null,
2
),
},
],
})
// Store report
await db.create('Report', {
type: 'daily',
date: new Date(),
metrics: {
newCustomers: newCustomers.length,
orders: orders.length,
revenue: totalRevenue,
},
content: report.text,
})
},
{
timezone: 'America/New_York',
}
)
3. Health Check Monitor
Monitor system health and alert on issues:
// Run every 5 minutes
every('*/5 * * * *', async () => {
const checks = []
// Check database connectivity
try {
await db.list('Business', { limit: 1 })
checks.push({ name: 'database', status: 'healthy' })
} catch (error) {
checks.push({ name: 'database', status: 'unhealthy', error: error.message })
}
// Check external API connectivity
try {
const response = await api.fetch('https://api.stripe.com/v1/charges', {
method: 'GET',
headers: { Authorization: 'Bearer sk_test_...' },
})
checks.push({ name: 'stripe', status: response.ok ? 'healthy' : 'degraded' })
} catch (error) {
checks.push({ name: 'stripe', status: 'unhealthy', error: error.message })
}
// Check queue depth
const queueDepth = await db.list('QueuedEvent', {
where: { status: 'pending' },
})
if (queueDepth.length > 1000) {
checks.push({ name: 'queue', status: 'degraded', depth: queueDepth.length })
} else {
checks.push({ name: 'queue', status: 'healthy', depth: queueDepth.length })
}
// Alert on unhealthy checks
const unhealthy = checks.filter((c) => c.status !== 'healthy')
if (unhealthy.length > 0) {
await send($.Alert.system, {
severity: 'high',
message: 'System health check failed',
checks: unhealthy,
})
}
// Store health check results
await db.create('HealthCheck', {
timestamp: new Date(),
checks,
status: unhealthy.length === 0 ? 'healthy' : 'degraded',
})
})
4. Periodic Data Sync
Sync data from external systems:
// Run every 30 minutes
every('*/30 * * * *', async () => {
// Get last sync timestamp
const lastSync = await db.list('SyncLog', {
sort: { createdAt: 'desc' },
limit: 1,
})
const since = lastSync[0]?.completedAt || new Date(Date.now() - 30 * 60 * 1000)
// Fetch updates from external system
// Pass the cursor as a query parameter (a GET request cannot carry a body)
const updates = await api.fetch(`https://erp.example.com/api/inventory/updates?since=${encodeURIComponent(new Date(since).toISOString())}`, {
method: 'GET',
headers: { Authorization: 'Bearer ...' },
})
const data = await updates.json()
// Process each update
let processed = 0
let errors = 0
for (const item of data.items) {
try {
const existing = await db.get('Product', item.sku)
if (existing) {
await db.update('Product', item.sku, {
quantity: item.quantity,
price: item.price,
lastSyncedAt: new Date(),
})
} else {
await db.create('Product', {
id: item.sku,
name: item.name,
quantity: item.quantity,
price: item.price,
lastSyncedAt: new Date(),
})
}
processed++
} catch (error) {
console.error(`Failed to sync ${item.sku}:`, error)
errors++
}
}
// Log sync completion
await db.create('SyncLog', {
startedAt: since,
completedAt: new Date(),
itemsProcessed: processed,
errors,
status: errors === 0 ? 'success' : 'partial',
})
console.log(`Sync complete: ${processed} items, ${errors} errors`)
})
5. Reminder System
Send periodic reminders based on database state:
// Check for reminders every hour
every('0 * * * *', async () => {
const now = new Date()
// Find orders awaiting payment for 24+ hours
const staleOrders = await db.list('Order', {
where: {
status: 'pending_payment',
createdAt: { lt: new Date(now.getTime() - 24 * 60 * 60 * 1000) },
},
})
for (const order of staleOrders) {
const customer = await db.get('Customer', order.customerId)
await send($.Email.send, {
to: customer.email,
subject: `Reminder: Complete Your Order #${order.id}`,
body: `Your order is still pending payment. Complete checkout to secure your items.`,
template: 'order-reminder',
data: { order, customer },
})
await db.update('Order', order.id, {
reminderSentAt: new Date(),
})
}
// Find trial subscriptions expiring in 3 days
const expiringSoon = await db.list('Subscription', {
where: {
status: 'trial',
trialEndsAt: {
gte: now,
lt: new Date(now.getTime() + 3 * 24 * 60 * 60 * 1000),
},
},
})
for (const sub of expiringSoon) {
const customer = await db.get('Customer', sub.customerId)
await send($.Email.send, {
to: customer.email,
subject: 'Your Trial is Ending Soon',
body: `Your trial ends in 3 days. Upgrade now to continue access.`,
template: 'trial-expiring',
data: { subscription: sub, customer },
})
}
console.log(`Sent ${staleOrders.length} order reminders and ${expiringSoon.length} trial reminders`)
})
6. Aggregation and Rollups
Compute periodic aggregations:
// Run hourly
every('0 * * * *', async () => {
const hourAgo = new Date(Date.now() - 60 * 60 * 1000)
// Aggregate metrics per product
const orders = await db.list('Order', {
where: {
createdAt: { gte: hourAgo },
status: 'completed',
},
})
const productMetrics = new Map()
for (const order of orders) {
for (const item of order.items) {
if (!productMetrics.has(item.productId)) {
productMetrics.set(item.productId, {
sales: 0,
revenue: 0,
quantity: 0,
})
}
const metrics = productMetrics.get(item.productId)
metrics.sales++
metrics.revenue += item.price * item.quantity
metrics.quantity += item.quantity
}
}
// Store aggregated metrics
for (const [productId, metrics] of productMetrics) {
await db.create('ProductMetrics', {
productId,
period: 'hourly',
timestamp: hourAgo,
...metrics,
})
}
console.log(`Computed metrics for ${productMetrics.size} products`)
})
7. Scheduled Content Publishing
Publish scheduled content:
// Run every minute
every('* * * * *', async () => {
const now = new Date()
// Find posts scheduled for publishing
const scheduledPosts = await db.list('Post', {
where: {
status: 'scheduled',
publishAt: { lte: now },
},
})
for (const post of scheduledPosts) {
try {
// Update post status
await db.update('Post', post.id, {
status: 'published',
publishedAt: now,
})
// Send notification
await send($.Post.published, {
postId: post.id,
title: post.title,
author: post.authorId,
})
// Notify subscribers
const subscribers = await db.list('Subscription', {
where: { authorId: post.authorId, active: true },
})
for (const sub of subscribers) {
await send($.Email.send, {
to: sub.email,
subject: `New Post: ${post.title}`,
template: 'new-post',
data: { post, subscriber: sub },
})
}
console.log(`Published post ${post.id}: ${post.title}`)
} catch (error) {
console.error(`Failed to publish post ${post.id}:`, error)
await db.update('Post', post.id, {
status: 'failed',
error: error.message,
})
}
}
})
8. Batch Processing with AI
Process records with AI in batches:
// Run every 6 hours
every('0 */6 * * *', async () => {
// Find unprocessed reviews
const reviews = await db.list('Review', {
where: { sentimentAnalyzed: false },
limit: 100,
})
if (reviews.length === 0) {
console.log('No reviews to process')
return
}
// Batch analyze sentiment
const results = await ai.batch(
reviews.map((review) => ({
type: 'generate',
prompt: `Analyze sentiment of this review: "${review.text}". Return JSON with sentiment (positive/negative/neutral) and score (0-1).`,
schema: {
sentiment: 'string',
score: 'number',
topics: 'array',
},
}))
)
// Update reviews with sentiment
for (let i = 0; i < reviews.length; i++) {
const review = reviews[i]
const result = results[i]
if (result.data) {
await db.update('Review', review.id, {
sentiment: result.data.sentiment,
sentimentScore: result.data.score,
topics: result.data.topics,
sentimentAnalyzed: true,
analyzedAt: new Date(),
})
}
}
console.log(`Analyzed sentiment for ${reviews.length} reviews`)
})
9. Maintenance Windows
Schedule maintenance during off-peak hours:
// Run daily at 3 AM UTC (during maintenance window)
every(
'0 3 * * *',
async () => {
console.log('Starting maintenance window')
// Reindex search database
const businesses = await db.list('Business')
for (const business of businesses) {
await send($.Search.reindex, {
type: 'Business',
id: business.id,
data: business,
})
}
// Optimize database
await send($.Database.optimize, { collection: 'Order' })
await send($.Database.optimize, { collection: 'Customer' })
// Clear caches
await db.list('Cache').then((items) => Promise.all(items.map((item) => db.delete('Cache', item.id))))
// Generate database statistics
const stats = await api.fetch('https://api.do/admin/database/stats')
await db.create('DatabaseStats', {
timestamp: new Date(),
...(await stats.json()),
})
console.log('Maintenance window complete')
},
{
eventOptions: {
priority: 10, // High priority during maintenance window
},
}
)
10. Multi-Timezone Reporting
Generate reports for different regional teams:
// Report for US East Coast team (9 AM Eastern Time)
every(
'0 9 * * *',
async () => {
const report = await generateRegionalReport('us-east')
await send($.Email.send, {
to: '[email protected]',
subject: 'Daily Regional Report - US East',
body: report,
})
},
{
timezone: 'America/New_York',
}
)
// Report for US West Coast team (9 AM Pacific Time)
every(
'0 9 * * *',
async () => {
const report = await generateRegionalReport('us-west')
await send($.Email.send, {
to: '[email protected]',
subject: 'Daily Regional Report - US West',
body: report,
})
},
{
timezone: 'America/Los_Angeles',
}
)
// Report for European team (9 AM CET)
every(
'0 9 * * *',
async () => {
const report = await generateRegionalReport('europe')
await send($.Email.send, {
to: '[email protected]',
subject: 'Daily Regional Report - Europe',
body: report,
})
},
{
timezone: 'Europe/Paris',
}
)
// Report for Asia-Pacific team (9 AM JST)
every(
'0 9 * * *',
async () => {
const report = await generateRegionalReport('apac')
await send($.Email.send, {
to: '[email protected]',
subject: 'Daily Regional Report - APAC',
body: report,
})
},
{
timezone: 'Asia/Tokyo',
}
)
Lifecycle Management
Managing Subscriptions
Store and manage subscription references:
// Create and store subscription
const subscription = every('0 * * * *', async () => {
console.log('Hourly task')
})
// Store subscription ID for later management
const metadata = subscription.getMetadata()
await db.create('ScheduledTask', {
subscriberId: metadata.subscriberId,
schedule: '0 * * * *',
name: 'hourly-task',
active: true,
})
// Later: retrieve and unsubscribe
const task = await db.get('ScheduledTask', 'task-id')
if (task.subscriberId === metadata.subscriberId) {
await subscription.unsubscribe()
await db.update('ScheduledTask', 'task-id', { active: false })
}
Pause and Resume
Temporarily pause scheduled tasks:
const subscription = every('*/5 * * * *', async () => {
console.log('Running every 5 minutes')
})
// Pause during maintenance
subscription.pause()
console.log('Task paused')
// Resume after maintenance
subscription.resume()
console.log('Task resumed')
// Check status
if (subscription.isActive()) {
console.log('Task is active')
}
Conditional Execution
Use immediate execution for testing:
// Run immediately, then on schedule
every(
'0 0 * * *',
async () => {
console.log('Daily task running')
// Task implementation
const report = await generateReport()
await send($.Email.send, { ...report })
},
{
runImmediately: true, // Runs once now, then daily at midnight
}
)
Error Handling
Retry Configuration
Configure retry behavior for failed tasks:
every(
'0 * * * *',
async () => {
try {
await performCriticalTask()
} catch (error) {
console.error('Task failed:', error)
throw error // Will trigger retry
}
},
{
eventOptions: {
retry: {
maxAttempts: 5,
backoff: 'exponential', // Exponential backoff between retries
},
},
}
)
Error Notification
Send alerts on task failures:
every('*/30 * * * *', async () => {
try {
await syncExternalData()
} catch (error) {
// Log error
await db.create('TaskError', {
taskName: 'data-sync',
error: error.message,
timestamp: new Date(),
})
// Send alert
await send($.Alert.taskFailed, {
taskName: 'data-sync',
error: error.message,
timestamp: new Date(),
})
// Re-throw to trigger retry
throw error
}
})
Performance Considerations
Task Duration
Keep scheduled tasks short and focused:
// Good: Quick task
every('* * * * *', async () => {
const pending = await db.list('Job', {
where: { status: 'pending' },
limit: 10,
})
for (const job of pending) {
await send($.Job.process, job)
}
})
// Bad: Long-running task (will timeout)
every('* * * * *', async () => {
const allJobs = await db.list('Job') // Could be thousands
// Process all jobs synchronously... (will timeout)
})
Batch Processing
Use batching for large datasets:
every('0 2 * * *', async () => {
let offset = 0
const batchSize = 100
while (true) {
const batch = await db.list('User', {
limit: batchSize,
offset,
where: { emailVerified: false },
})
if (batch.length === 0) break
// Process batch
for (const user of batch) {
await send($.Email.verify, { userId: user.id })
}
offset += batchSize
// Prevent overwhelming the system
if (offset > 10000) break
}
})
Resource Limits
Be mindful of rate limits and quotas:
every('0 * * * *', async () => {
const items = await db.list('Item', { limit: 100 })
// Process with rate limiting
for (const item of items) {
await processItem(item)
// Sleep to respect rate limits
await new Promise((resolve) => setTimeout(resolve, 100))
}
})
Common Pitfalls
1. Wrong Timezone
// Wrong: Assumes local timezone
every('0 9 * * *', handler) // Runs at 9 AM UTC, not local time
// Correct: Specify timezone explicitly
every('0 9 * * *', handler, {
timezone: 'America/New_York', // Runs at 9 AM Eastern
})
2. Overlapping Executions
// Wrong: Long task running every minute (will overlap)
every('* * * * *', async () => {
await longRunningTask() // Takes 5 minutes
})
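// Correct (option 1): take a best-effort lock so overlapping runs detect each
// other and skip. This sketch assumes a hypothetical 'TaskLock' collection
// (not a built-in) and is not fully atomic: a small race window remains
// between the read and the write. longRunningTask is the placeholder above.
every('* * * * *', async () => {
  const LOCK_ID = 'long-running-task'
  const lock = await db.get('TaskLock', LOCK_ID)

  // Treat locks older than 10 minutes as stale (e.g. left by a crashed run)
  const staleBefore = new Date(Date.now() - 10 * 60 * 1000)
  if (lock && new Date(lock.acquiredAt) > staleBefore) {
    console.log('Previous run still in progress, skipping')
    return
  }

  // Acquire (or refresh a stale) lock
  if (lock) {
    await db.update('TaskLock', LOCK_ID, { acquiredAt: new Date() })
  } else {
    await db.create('TaskLock', { id: LOCK_ID, acquiredAt: new Date() })
  }

  try {
    await longRunningTask()
  } finally {
    await db.delete('TaskLock', LOCK_ID)
  }
})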
// Correct (option 2): simply run the task less frequently
every('*/10 * * * *', async () => {
await longRunningTask()
})
3. Unbounded Queries
// Wrong: Query without limit
every('0 * * * *', async () => {
const all = await db.list('Order') // Could return millions
})
// Correct: Use limits and pagination
every('0 * * * *', async () => {
const recent = await db.list('Order', {
limit: 1000,
where: {
createdAt: { gte: new Date(Date.now() - 24 * 60 * 60 * 1000) },
},
})
})
4. Not Handling Errors
// Wrong: Errors silently fail
every('0 * * * *', async () => {
await riskyOperation() // No error handling
})
// Correct: Handle and log errors
every('0 * * * *', async () => {
try {
await riskyOperation()
} catch (error) {
console.error('Task failed:', error)
await send($.Alert.error, { error: error.message })
}
})
5. Memory Leaks
// Wrong: Creating subscriptions in loop
for (let i = 0; i < 100; i++) {
every('* * * * *', handler) // Creates 100 subscriptions!
}
// Correct: Create subscription once
every('* * * * *', async () => {
// Process multiple items in single handler
for (let i = 0; i < 100; i++) {
await processItem(i)
}
})
Integration Patterns
With Database Operations
every('0 * * * *', async () => {
// Read
const items = await db.list('Item', {
where: { processed: false },
limit: 100,
})
// Process
for (const item of items) {
await processItem(item)
// Update
await db.update('Item', item.id, {
processed: true,
processedAt: new Date(),
})
}
})
With Event System
every('0 9 * * *', async () => {
// Generate report data
const report = await generateReport()
// Publish event for multiple consumers
await send($.Report.generated, report)
})
// Multiple handlers can react to the event
on($.Report.generated, async (report) => {
await send($.Email.send, { ...report })
})
on($.Report.generated, async (report) => {
await db.create('ReportArchive', report)
})
With AI Operations
every('0 0 * * *', async () => {
const feedback = await db.list('Feedback', {
where: { analyzed: false },
limit: 50,
})
const analyses = await ai.batch(
feedback.map((f) => ({
type: 'generate',
prompt: `Analyze this feedback: ${f.text}`,
schema: $.FeedbackAnalysis,
}))
)
for (let i = 0; i < feedback.length; i++) {
await db.update('Feedback', feedback[i].id, {
analysis: analyses[i].data,
analyzed: true,
})
}
})
With External APIs
every('*/15 * * * *', async () => {
// Fetch from external API
const response = await api.fetch('https://api.example.com/status')
const status = await response.json()
// Store in database
await db.create('ServiceStatus', {
service: 'external-api',
status: status.status,
responseTime: status.responseTime,
checkedAt: new Date(),
})
// Alert on degradation
if (status.status !== 'operational') {
await send($.Alert.serviceDown, {
service: 'external-api',
status: status.status,
})
}
})
Readonly vs Authenticated Mode
The every primitive requires authentication as it creates persistent scheduled tasks with side effects.
Authenticated Mode (Required)
// Configure with API key
configure({
apiUrl: 'https://api.do',
apiKey: 'sk_...',
})
// Now can create scheduled tasks
every('0 * * * *', async () => {
// Task implementation
})
Anonymous Mode (Not Available)
// Without authentication
every('0 * * * *', handler) // Error: Authentication required
Best Practices
- Keep Tasks Focused: Each scheduled task should do one thing well
- Use Appropriate Frequency: Don't run tasks more often than necessary
- Handle Errors Gracefully: Always wrap critical code in try-catch
- Log Execution: Log start, completion, and errors for debugging
- Set Timeouts: Don't let tasks run forever
- Use Timezones: Always specify timezone for user-facing schedules
- Test Immediately: Use runImmediately to test task logic
- Monitor Performance: Track task execution time and resource usage
- Implement Idempotency: Tasks should be safe to run multiple times (see the sketch after this list)
- Document Schedules: Comment why specific schedules are chosen
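As a concrete illustration of the Log Execution, Monitor Performance, and Implement Idempotency practices, here is a rough sketch; the TaskRun collection and the daily-rollup name are hypothetical, and generateReport() is a stand-in for the real work, as elsewhere in this guide:
// A rough sketch of logging plus idempotency, using a hypothetical 'TaskRun'
// collection. The run is keyed by the day it covers, so a duplicate or
// retried execution of the same period becomes a no-op.
every('0 0 * * *', async () => {
  const periodKey = new Date().toISOString().slice(0, 10) // e.g. '2025-01-31'
  const runId = `daily-rollup-${periodKey}`

  // Idempotency: skip if this period has already been processed successfully
  const previous = await db.get('TaskRun', runId)
  if (previous && previous.status === 'completed') {
    console.log(`daily-rollup already completed for ${periodKey}, skipping`)
    return
  }

  const startedAt = Date.now()
  console.log(`daily-rollup starting for ${periodKey}`)
  try {
    await generateReport() // stand-in for the real work

    const result = {
      status: 'completed',
      durationMs: Date.now() - startedAt, // execution time, for monitoring
      finishedAt: new Date(),
    }
    // Record the run: update if a record for this period exists, otherwise create it
    if (previous) {
      await db.update('TaskRun', runId, result)
    } else {
      await db.create('TaskRun', { id: runId, ...result })
    }
    console.log(`daily-rollup finished in ${result.durationMs}ms`)
  } catch (error) {
    console.error('daily-rollup failed:', error)
    throw error // rethrow so the built-in retry logic can take over
  }
})
Keying the run record by the period it covers is what makes a retried or duplicated execution harmless.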
Related Primitives
- send - Publish events from scheduled tasks
- db - Database operations within scheduled tasks
- ai - AI operations for batch processing
- on - Combine scheduled tasks with event handlers
- user - Access user context in scheduled tasks
Next Steps
- send Primitive - Learn about event publishing
- db Primitive - Database operations for scheduled tasks
- Examples - More real-world scheduling patterns
- Authentication - Setting up API keys