A TypeScript library for creating and executing time-based or event-based automations with BullMQ workers.
- 🕒 Time-based Scheduling: Create automations that run on schedules using RRULE strings
- 🎯 Event-driven Flows: Trigger automations based on webhooks or in-app events
- 🔄 Composable Steps: Build multi-step flows with a fluent API
- 📦 Pluggable Storage: Swap storage backends with the repository pattern
- 🚀 Scalable Architecture: Stateless workers + Redis queues
- 📊 Observability: Built-in logging, metrics, and OpenTelemetry integration
- 💾 Multiple Adapters: PostgreSQL, In-Memory, and ORM adapters (TypeORM, Sequelize, Prisma)
- ✅ RRULE Validation: Comprehensive validation for schedule strings to prevent runtime errors
- 🛡️ Input Validation: Built-in sanitization and validation for all automation inputs
- 🚦 Rate Limiting: Configurable rate limits for automation execution and API polling
npm install auto-flow
# or
yarn add auto-flow
AutoFlow comes with ready-to-run examples to help you understand how to use the library.
Use the provided npm scripts to run the examples:
# Show available examples and options
npm run examples
# Run the in-memory example
npm run example:in-memory
# Run the TypeORM example
npm run example:typeorm
# Run the in-memory example and start required Docker containers
npm run example:docker
Alternatively, you can run examples directly with ts-node:
# Run with custom options
ts-node examples/run.ts in-memory --docker --debug
- In-Memory Example: Demonstrates time-based and event-based automations using the in-memory repository. Requires Redis for queuing.
- TypeORM Example: Shows how to use the TypeORM adapter with PostgreSQL for persistent storage. Requires PostgreSQL and Redis.
The examples use a shared configuration system:
- Default values are provided for all settings
- Environment variables can override defaults
- Use the `--debug` flag for more verbose logging
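As an illustration of that pattern (the exact option names used by the example scripts are assumptions; check the files under examples/ for the real ones), defaults merged with environment overrides look roughly like this:
const exampleConfig = {
  redis: {
    // REDIS_HOST / REDIS_PORT are the same variables used in the Redis configuration below
    host: process.env.REDIS_HOST || 'localhost',
    port: parseInt(process.env.REDIS_PORT || '6379', 10),
  },
  // Enabled by the --debug flag; the DEBUG variable is illustrative
  debug: process.argv.includes('--debug') || process.env.DEBUG === 'true',
};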
import { FlowBuilder } from 'auto-flow';
const flow = new FlowBuilder(
{ kind: 'schedule', rrule: 'RRULE:FREQ=HOURLY;INTERVAL=2' },
'user-123'
)
.name('Email Summary')
.description('Fetch and summarize emails every 2 hours')
.tags('email', 'summary')
.step('fetch-emails', 'email.fetch', { inbox: 'INBOX' })
.step('summarize', 'ai.summarize', { model: 'gpt-4' })
.step('send-slack', 'slack.notify', { channel: '#me' });
await flow.save(repository);
const flow = new FlowBuilder(
{ kind: 'event', eventName: 'new.signup' },
'user-123'
)
.name('Welcome Flow')
.step('send-email', 'email.send', {
template: 'welcome',
delay: '1h'
})
.step('create-task', 'crm.task', {
title: 'Follow up with new signup',
assignTo: 'sales-team'
});
await flow.save(repository);
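To trigger this flow at runtime, publish a matching event through the EventBus (introduced under the core components below). A minimal sketch, assuming an eventBus instance constructed as shown in that section; the payload fields are illustrative:
// The event name must match the trigger's eventName ('new.signup' above)
await eventBus.publish({
  name: 'new.signup',
  payload: { userId: 'user-456' } // illustrative payload
});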
Fluent API for creating automations:
const flow = new FlowBuilder(trigger, userId)
  .name(name)
  .description(desc)
  .tags(...tags)
  .step(stepName, jobName, input);

await flow.save(repository);
Handles time-based automations:
const scheduler = new ScheduleManager(repository, redisConfig);
await scheduler.start();
Manages event-driven automations:
const eventBus = new EventBus(repository, queue);
// Publish events
await eventBus.publish({
name: 'user.signup',
payload: { userId: '123' }
});
// Subscribe to events
eventBus.subscribe('user.signup').subscribe(event => {
console.log('New signup:', event);
});
Processes automation steps:
const executor = new FlowExecutor(repository, redisConfig);
// Register job handlers
executor.registerJobHandler('email.send', async (step, context) => {
// Handle email sending
});
executor.registerJobHandler('slack.notify', async (step, context) => {
// Handle Slack notifications
});
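As a sketch of a fleshed-out handler for the 'email.fetch' job from the Quick Start (the exact shape of step and context is an assumption here; consult the type definitions):
executor.registerJobHandler('email.fetch', async (step, context) => {
  // Assumption: step.input carries the object passed to FlowBuilder.step(),
  // e.g. { inbox: 'INBOX' } from the Email Summary flow above.
  const { inbox } = step.input ?? { inbox: 'INBOX' };

  // Call your own mail integration here (myMailClient is a placeholder).
  const messages = await myMailClient.fetch(inbox);

  // Throwing is expected to mark the step as failed; returning normally completes it.
  return { count: messages.length };
});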
AutoFlow provides several storage adapters for different environments and database systems.
Direct PostgreSQL adapter for production use:
import { PostgresAutomationRepository } from 'auto-flow';
import { Pool } from 'pg';
const pool = new Pool({
host: 'localhost',
port: 5432,
user: 'postgres',
password: 'postgres',
database: 'auto_flow'
});
const repository = new PostgresAutomationRepository(pool);
Perfect for testing and development:
import { InMemoryAutomationRepository } from 'auto-flow';
const repository = new InMemoryAutomationRepository();
// Use in tests
beforeEach(() => {
repository.clear(); // Reset the repository between tests
});
import {
TypeOrmAutomationRepository,
AutomationEntity,
AutomationRunEntity
} from 'auto-flow';
import { DataSource } from 'typeorm';
// Set up TypeORM data source
const dataSource = new DataSource({
type: 'postgres',
host: 'localhost',
port: 5432,
username: 'postgres',
password: 'postgres',
database: 'auto_flow',
entities: [AutomationEntity, AutomationRunEntity],
synchronize: true, // For development only
});
await dataSource.initialize();
// Create the repository
const repository = new TypeOrmAutomationRepository(dataSource);
Implement the `IAutomationRepository` interface:
import { IAutomationRepository } from 'auto-flow';
class MyCustomRepository implements IAutomationRepository {
// Implement all required methods
}
Or extend the ORM base class:
import { OrmAutomationRepositoryBase } from 'auto-flow';
class MyCustomOrmRepository extends OrmAutomationRepositoryBase {
// Implement abstract methods
}
const redisConfig = {
host: process.env.REDIS_HOST || 'localhost',
port: parseInt(process.env.REDIS_PORT || '6379')
};
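Putting the pieces together, a minimal worker process can be bootstrapped from the components shown above. This is a sketch only; error handling and graceful shutdown are omitted:
import {
  FlowExecutor,
  InMemoryAutomationRepository,
  ScheduleManager,
} from 'auto-flow';

const redisConfig = {
  host: process.env.REDIS_HOST || 'localhost',
  port: parseInt(process.env.REDIS_PORT || '6379', 10),
};

// Shared storage for automations; swap in the PostgreSQL or TypeORM adapter for production.
const repository = new InMemoryAutomationRepository();

// Polls schedules and enqueues due runs.
const scheduler = new ScheduleManager(repository, redisConfig);
await scheduler.start();

// Processes queued steps; register one handler per job name used in your flows.
const executor = new FlowExecutor(repository, redisConfig);
executor.registerJobHandler('email.send', async (step, context) => {
  // ...send the email...
});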
import { logger } from 'auto-flow';
// Set log level
logger.level = 'debug';
// Enable OpenTelemetry
process.env.ENABLE_TELEMETRY = 'true';
- Error Handling: Add retry configurations for critical steps

  .step('api-call', 'http.post', data, {
    retry: { attempts: 3, backoff: { type: 'exponential', delay: 1000 } }
  })

- RRULE Validation: Validate schedule strings before saving

  import { InputValidator } from 'auto-flow';

  // Validate RRULE strings
  const validation = InputValidator.validateRRule('RRULE:FREQ=DAILY');
  if (!validation.valid) {
    console.error(`Invalid RRULE: ${validation.error}`);
  }

  // FlowBuilder automatically validates RRULEs
  const flow = new FlowBuilder(
    { kind: 'schedule', rrule: 'RRULE:FREQ=HOURLY' }, // Will be validated
    'user-123'
  );

- Monitoring: Use the built-in metrics

  const stats = await repository.getAutomationStats(automationId);
  console.log('Success rate:', stats.successfulRuns / stats.totalRuns);

- Scaling: Run multiple workers for high availability

  new FlowExecutor(repository, redisConfig, { concurrency: 5 });

- Testing: Use the in-memory adapter for unit tests

  import { InMemoryAutomationRepository } from 'auto-flow';

  describe('My automation tests', () => {
    const repository = new InMemoryAutomationRepository();

    beforeEach(() => {
      repository.clear();
    });

    it('should execute my automation', async () => {
      // Test code here
    });
  });
- Fork the repository
- Create your feature branch (`git checkout -b feature/amazing-feature`)
- Commit your changes (`git commit -m 'Add amazing feature'`)
- Push to the branch (`git push origin feature/amazing-feature`)
- Open a Pull Request
ISC
- 📚 Documentation
- 🐛 Issue Tracker
- 💬 Discussions