@blureffect/auto-flow

0.1.1 • Public • Published

AutoFlow

A TypeScript library for creating and executing time-based or event-based automations with BullMQ workers.

Features

  • 🕒 Time-based Scheduling: Create automations that run on schedules using RRULE strings
  • 🎯 Event-driven Flows: Trigger automations based on webhooks or in-app events
  • 🔄 Composable Steps: Build multi-step flows with a fluent API
  • 📦 Pluggable Storage: Swap storage backends with the repository pattern
  • 🚀 Scalable Architecture: Stateless workers + Redis queues
  • 📊 Observability: Built-in logging, metrics, and OpenTelemetry integration
  • 💾 Multiple Adapters: PostgreSQL, In-Memory, and ORM adapters (TypeORM, Sequelize, Prisma)
  • ✅ RRULE Validation: Comprehensive validation for schedule strings to prevent runtime errors
  • 🛡️ Input Validation: Built-in sanitization and validation for all automation inputs
  • 🚦 Rate Limiting: Configurable rate limits for automation execution and API polling

Installation

npm install @blureffect/auto-flow
# or
yarn add @blureffect/auto-flow

Examples

AutoFlow comes with ready-to-run examples to help you understand how to use the library.

Running Examples

Use the provided npm scripts to run the examples:

# Show available examples and options
npm run examples

# Run the in-memory example
npm run example:in-memory

# Run the TypeORM example
npm run example:typeorm

# Run the in-memory example and start required Docker containers
npm run example:docker

Alternatively, you can run examples directly with ts-node:

# Run with custom options
ts-node examples/run.ts in-memory --docker --debug

Available Examples

  1. In-Memory Example: Demonstrates time-based and event-based automations using the in-memory repository. Requires Redis for queuing.

  2. TypeORM Example: Shows how to use the TypeORM adapter with PostgreSQL for persistent storage. Requires PostgreSQL and Redis.

Example Configuration

The examples use a shared configuration system:

  1. Default values are provided for all settings
  2. Environment variables can override defaults
  3. Use the --debug flag for more verbose logging
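The defaults-with-env-override pattern can be sketched as follows. The variable names below (REDIS_HOST, REDIS_PORT, DEBUG) are assumptions based on the Redis configuration shown later in this README, not necessarily the exact keys the examples read:

```typescript
// Minimal sketch of defaults overridable by environment variables.
interface ExampleConfig {
  redisHost: string;
  redisPort: number;
  debug: boolean;
}

function loadConfig(
  env: Record<string, string | undefined> = process.env
): ExampleConfig {
  return {
    redisHost: env.REDIS_HOST ?? 'localhost',          // default value
    redisPort: parseInt(env.REDIS_PORT ?? '6379', 10), // env override wins
    debug: env.DEBUG === 'true',                       // --debug maps here
  };
}
```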

Quick Start

1. Create a Time-based Automation

import { FlowBuilder } from '@blureffect/auto-flow';

const flow = new FlowBuilder(
  { kind: 'schedule', rrule: 'RRULE:FREQ=HOURLY;INTERVAL=2' },
  'user-123'
)
  .name('Email Summary')
  .description('Fetch and summarize emails every 2 hours')
  .tags('email', 'summary')
  .step('fetch-emails', 'email.fetch', { inbox: 'INBOX' })
  .step('summarize', 'ai.summarize', { model: 'gpt-4' })
  .step('send-slack', 'slack.notify', { channel: '#me' });

await flow.save(repository);
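The RRULE string above uses the RFC 5545 recurrence-rule syntax. As a rough illustration of how such a string breaks into parts (this is not the library's actual validator, which InputValidator provides):

```typescript
// Parse an RRULE string such as 'RRULE:FREQ=HOURLY;INTERVAL=2'
// into a key/value map. Illustrative only.
function parseRRule(rrule: string): Record<string, string> {
  const body = rrule.startsWith('RRULE:')
    ? rrule.slice('RRULE:'.length)
    : rrule;
  const parts: Record<string, string> = {};
  for (const pair of body.split(';')) {
    const [key, value] = pair.split('=');
    if (key && value) parts[key] = value;
  }
  return parts;
}

// 'RRULE:FREQ=HOURLY;INTERVAL=2' reads as: frequency hourly, every 2 hours
```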

2. Create an Event-driven Automation

const flow = new FlowBuilder(
  { kind: 'event', eventName: 'new.signup' },
  'user-123'
)
  .name('Welcome Flow')
  .step('send-email', 'email.send', {
    template: 'welcome',
    delay: '1h'
  })
  .step('create-task', 'crm.task', {
    title: 'Follow up with new signup',
    assignTo: 'sales-team'
  });

await flow.save(repository);

Core Components

FlowBuilder

Fluent API for creating automations:

const flow = new FlowBuilder(trigger, userId)
  .name(name)
  .description(desc)
  .tags(...tags)
  .step(name, jobName, input);

await flow.save(repository);
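The fluent style works because each builder method returns the builder itself. A stripped-down sketch of the pattern (field and type names are illustrative, not FlowBuilder's internals):

```typescript
// Minimal fluent-builder sketch; FlowBuilder's real fields differ.
interface StepDef { name: string; jobName: string; input: unknown; }

class MiniFlowBuilder {
  private _name = '';
  private _steps: StepDef[] = [];

  name(name: string): this {
    this._name = name;
    return this; // returning `this` is what enables chaining
  }

  step(name: string, jobName: string, input: unknown): this {
    this._steps.push({ name, jobName, input });
    return this;
  }

  build() {
    return { name: this._name, steps: this._steps };
  }
}
```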

ScheduleManager

Handles time-based automations:

const scheduler = new ScheduleManager(repository, redisConfig);
await scheduler.start();

EventBus

Manages event-driven automations:

const eventBus = new EventBus(repository, queue);

// Publish events
await eventBus.publish({
  name: 'user.signup',
  payload: { userId: '123' }
});

// Subscribe to events
eventBus.subscribe('user.signup').subscribe(event => {
  console.log('New signup:', event);
});
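Conceptually this is publish/subscribe dispatch keyed by event name. A minimal in-process sketch of that idea (AutoFlow's EventBus also enqueues matching automations via Redis, which this omits):

```typescript
// Minimal pub/sub sketch keyed by event name; types are illustrative.
type BusEvent = { name: string; payload: unknown };
type Handler = (event: BusEvent) => void;

class MiniEventBus {
  private handlers = new Map<string, Handler[]>();

  subscribe(name: string, handler: Handler): void {
    const list = this.handlers.get(name) ?? [];
    list.push(handler);
    this.handlers.set(name, list);
  }

  publish(event: BusEvent): void {
    // Deliver to every handler registered for this event name.
    for (const handler of this.handlers.get(event.name) ?? []) {
      handler(event);
    }
  }
}
```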

FlowExecutor

Processes automation steps:

const executor = new FlowExecutor(repository, redisConfig);

// Register job handlers
executor.registerJobHandler('email.send', async (step, context) => {
  // Handle email sending
});

executor.registerJobHandler('slack.notify', async (step, context) => {
  // Handle Slack notifications
});
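Handlers are looked up by job name when a step is processed. A minimal sketch of that dispatch (the real executor pulls steps from BullMQ queues and tracks run state; the types here are illustrative):

```typescript
// Minimal job-handler registry: map a job name to an async handler
// and dispatch a step to it.
type Step = { name: string; jobName: string; input: unknown };
type JobHandler = (step: Step) => Promise<string>;

class MiniExecutor {
  private jobs = new Map<string, JobHandler>();

  registerJobHandler(jobName: string, handler: JobHandler): void {
    this.jobs.set(jobName, handler);
  }

  async execute(step: Step): Promise<string> {
    const handler = this.jobs.get(step.jobName);
    if (!handler) throw new Error(`No handler for job: ${step.jobName}`);
    return handler(step);
  }
}
```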

Storage Adapters

AutoFlow provides several storage adapters for different environments and database systems.

PostgreSQL Adapter

Direct PostgreSQL adapter for production use:

import { PostgresAutomationRepository } from '@blureffect/auto-flow';
import { Pool } from 'pg';

const pool = new Pool({
  host: 'localhost',
  port: 5432,
  user: 'postgres',
  password: 'postgres',
  database: 'auto_flow'
});

const repository = new PostgresAutomationRepository(pool);

In-Memory Adapter

Perfect for testing and development:

import { InMemoryAutomationRepository } from '@blureffect/auto-flow';

const repository = new InMemoryAutomationRepository();

// Use in tests
beforeEach(() => {
  repository.clear(); // Reset the repository between tests
});

ORM Adapters

TypeORM

import { 
  TypeOrmAutomationRepository, 
  AutomationEntity, 
  AutomationRunEntity 
} from '@blureffect/auto-flow';
import { DataSource } from 'typeorm';

// Set up TypeORM data source
const dataSource = new DataSource({
  type: 'postgres',
  host: 'localhost',
  port: 5432,
  username: 'postgres',
  password: 'postgres',
  database: 'auto_flow',
  entities: [AutomationEntity, AutomationRunEntity],
  synchronize: true, // For development only
});

await dataSource.initialize();

// Create the repository
const repository = new TypeOrmAutomationRepository(dataSource);

Creating Custom Adapters

Implement the IAutomationRepository interface:

import { IAutomationRepository } from '@blureffect/auto-flow';

class MyCustomRepository implements IAutomationRepository {
  // Implement all required methods
}

Or extend the ORM base class:

import { OrmAutomationRepositoryBase } from '@blureffect/auto-flow';

class MyCustomOrmRepository extends OrmAutomationRepositoryBase {
  // Implement abstract methods
}

Configuration

Redis Connection

const redisConfig = {
  host: process.env.REDIS_HOST || 'localhost',
  port: parseInt(process.env.REDIS_PORT || '6379')
};

Logging

import { logger } from '@blureffect/auto-flow';

// Set log level
logger.level = 'debug';

// Enable OpenTelemetry
process.env.ENABLE_TELEMETRY = 'true';

Best Practices

  1. Error Handling: Add retry configurations for critical steps

    .step('api-call', 'http.post', data, {
      retry: {
        attempts: 3,
        backoff: { type: 'exponential', delay: 1000 }
      }
    })
  2. RRULE Validation: Validate schedule strings before saving

    import { InputValidator } from '@blureffect/auto-flow';
    
    // Validate RRULE strings
    const validation = InputValidator.validateRRule('RRULE:FREQ=DAILY');
    if (!validation.valid) {
      console.error(`Invalid RRULE: ${validation.error}`);
    }
    
    // FlowBuilder automatically validates RRULEs
    const flow = new FlowBuilder(
      { kind: 'schedule', rrule: 'RRULE:FREQ=HOURLY' }, // Will be validated
      'user-123'
    );
  3. Monitoring: Use the built-in metrics

    const stats = await repository.getAutomationStats(automationId);
    console.log('Success rate:', stats.successfulRuns / stats.totalRuns);
  4. Scaling: Run multiple workers for high availability

    new FlowExecutor(repository, redisConfig, {
      concurrency: 5
    });
  5. Testing: Use the in-memory adapter for unit tests

    import { InMemoryAutomationRepository } from '@blureffect/auto-flow';
    
    describe('My automation tests', () => {
      const repository = new InMemoryAutomationRepository();
      
      beforeEach(() => {
        repository.clear();
      });
      
      it('should execute my automation', async () => {
        // Test code here
      });
    });
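For the retry configuration in item 1, exponential backoff means the delay roughly doubles with each attempt. One common formulation is sketched below; check BullMQ's documentation for its exact built-in strategy:

```typescript
// Common exponential-backoff formulation: the base delay doubles
// on each successive retry attempt (1-indexed).
// With base 1000ms this yields 1000, 2000, 4000 for attempts 1..3.
function backoffDelay(baseMs: number, attempt: number): number {
  return baseMs * Math.pow(2, attempt - 1);
}
```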

Contributing

  1. Fork the repository
  2. Create your feature branch (git checkout -b feature/amazing-feature)
  3. Commit your changes (git commit -m 'Add amazing feature')
  4. Push to the branch (git push origin feature/amazing-feature)
  5. Open a Pull Request

License

ISC
