@acho-inc/acho-js

1.9.4 • Public • Published


Acho JavaScript SDK for Node.js


What is this repository for?

  • SDK for Acho Studio API

Features

  • Get data from Acho Resource by page
  • Run data sync for Acho Resource
  • Download data from Acho Resource
  • Query Acho Resource for data
  • Get Acho Resource Schema
  • Create Node.js Readable Stream from Acho Resource
  • Create Node.js Writable Stream from Acho Resource
  • Get data from Acho Project view
  • Get a list of configured OAuth Client IDs for the current organization
  • Use an OAuth Client ID to issue a Bearer Token
  • Connect to an Acho Published App instance
  • Join / leave the Acho Published App instance's socket room
  • Send events to the Acho Published App instance

Installing

Package Manager

$ npm install @acho-inc/acho-js

If you want to save it as a dependency in a project that will be built and deployed

$ npm install @acho-inc/acho-js --save

Once the package is installed, import it:

import { Acho } from '@acho-inc/acho-js';

Example

Initializing the Acho Client

const AchoInstance = new Acho();

The SDK reads the following environment variables from your system:

ACHO_TOKEN: The Acho developer API token

  • If you are a current subscriber, retrieve it from your profile page
  • If you want to try out the SDK without an active subscription, please contact us

ACHO_API_ENDPOINT: The service backend you are connecting to

  • Defaults to https://kube.acho.io
  • This setting is irrelevant unless you subscribe to an on-premise or dedicated server plan

For convenience during testing, you can also initialize the instance by passing the variables to the constructor:

const AchoInstance = new Acho({
  apiToken: 'eyEi3oldsi....',
  endpoint: 'https://kube.acho.io'
});

Note: It is not recommended to expose your API token in your code base, especially in production.
We highly recommend dotenv for managing environment variables during testing.
If you suspect your token has been leaked, invalidate it from your profile page or report it to contact@acho.io
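Since the SDK resolves its configuration from these variables, the lookup can be sketched as a small helper. `resolveAchoConfig` and the `AchoConfig` shape are illustrative, not SDK exports; only the default endpoint value comes from the documentation above:

```typescript
// Sketch of how the token and endpoint could be resolved from environment
// variables. Hypothetical helper, not part of @acho-inc/acho-js.
interface AchoConfig {
  apiToken: string | undefined;
  endpoint: string;
}

function resolveAchoConfig(env: Record<string, string | undefined>): AchoConfig {
  return {
    apiToken: env.ACHO_TOKEN,
    // Default to the public backend; only on-premise or dedicated-server
    // subscribers need to override this.
    endpoint: env.ACHO_API_ENDPOINT ?? 'https://kube.acho.io'
  };
}
```

In Node you would pass `process.env`, then hand the resolved values to `new Acho(...)` as in the constructor example above.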

Working with Resource Endpoints

Create a new resource

const resourceResp = await AchoInstance.ResourceEndpoints.create({ name: 'test' });

/* resourceResp: { 
    resId: number,
    assetId: number,
    resource: {
      id: number,
      res_name: string,
      res_display_name: string,
      res_type: 'integration',
      is_ready: 1,
      create_time: unix_timestamp (UTC),
      user_id: number,
      update_time: unix_timestamp (UTC),
      asset_id: number
    }
   }
*/

Create a table in a resource

const resourceTableResp = await AchoInstance.ResourceEndpoints.createTable({
  resId: testResId,
  tableName: 'test',
  schema: { col1: 'STRING', col2: 'INTEGER' }
});

/* resourceTableResp: {
    resource: {
      id: number,
      res_name: string,
      ...
    }
    tableName: string
   }
*/
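The `schema` argument maps column names to type strings such as 'STRING' and 'INTEGER'. When the rows already exist in memory, such a map can be derived from a sample row; `inferSchema` is a hypothetical convenience, and the 'FLOAT' branch is an assumption since only STRING and INTEGER appear in this example:

```typescript
// Derive a { column: type } schema map from a sample row. Hypothetical
// helper for illustration; only handles numbers and strings.
function inferSchema(row: Record<string, unknown>): Record<string, string> {
  const schema: Record<string, string> = {};
  for (const [column, value] of Object.entries(row)) {
    if (typeof value === 'number') {
      schema[column] = Number.isInteger(value) ? 'INTEGER' : 'FLOAT';
    } else {
      schema[column] = 'STRING';
    }
  }
  return schema;
}
```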

Stream data into a table

// JSON flavor
const writableStream = await AchoInstance.ResourceEndpoints.createWriteStream({
  resId: testResId,
  tableId: 'test',
  dataType: 'json'
});
const testArray = [
  { col1: 'JSON_1', col2: 1 },
  { col1: 'JSON_2', col2: 2 },
  { col1: 'JSON_3', col2: 3 },
  { col1: 'JSON_4', col2: 4 }
];
await new Promise((resolve) => {
  testArray.forEach((row) => {
    writableStream.write(JSON.stringify(row) + '\n');
  });
  writableStream.end();
  writableStream.on('response', (res) => {
    // expect(res.statusCode).toBe(200);
    resolve('done');
  });
});

// CSV flavor (separate variable name so both snippets can run together)
const csvWritableStream = await AchoInstance.ResourceEndpoints.createWriteStream({
  resId: testResId,
  tableId: 'test',
  dataType: 'csv'
});
const testCSV = 'CSV_1,1\nCSV_2,2\nCSV_3,3\nCSV_4,4\n';
await new Promise((resolve) => {
  csvWritableStream.write(testCSV);
  csvWritableStream.end();
  csvWritableStream.on('response', (res) => {
    // expect(res.statusCode).toBe(200);
    resolve('done');
  });
});

Note: You can also pipe a readable stream into the writable stream created from a resource table.
The supported formats are CSV and NDJSON.
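The per-row loop above builds an NDJSON body one write at a time; the same payload can be assembled up front. `toNDJSON` is an illustrative helper, not an SDK export:

```typescript
// Serialize an array of row objects into an NDJSON payload: one JSON
// document per line, each newline-terminated. Illustrative helper.
function toNDJSON(rows: Array<Record<string, unknown>>): string {
  return rows.map((row) => JSON.stringify(row) + '\n').join('');
}
```

A single `writableStream.write(toNDJSON(testArray))` before `writableStream.end()` is then equivalent to the loop above.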


Stream data out from the table

// create readable stream
const readableStream = await AchoInstance.ResourceEndpoints.createReadStream({
  resId: testResId,
  tableId: 'test'
});

readableStream
  .on('data', (data) => {
    // do something here with the data
    // data: object
  })
  .on('end', () => {
    readableStream.destroy();
  });

Use Cases

Create a Resource and a table in the Resource, then insert data into the table

Create a new resource and a resource table first

const createResourceTable = async () => {
  const resourceResp = await AchoInstance.ResourceEndpoints.create({
    name: 'Test Resource'
  });

  const { resId } = resourceResp;

  // please make sure you capture the resId here if you want to do the process in two steps
  console.log(resId);

  const resourceTableResp = await AchoInstance.ResourceEndpoints.createTable({
    resId: resId,
    tableName: 'Test Table',
    schema: { name: 'STRING', age: 'INTEGER' }
  });
};

Write data to the resource table (in this example we are using JSON)

const writeData = async () => {
  const writableStream = await AchoInstance.ResourceEndpoints.createWriteStream({
    // replace 1234 with the id you captured earlier
    resId: 1234,
    tableId: 'Test Table',
    dataType: 'json'
  });

  const testArray = [
    { name: 'Adam', age: 28 },
    { name: 'Dan', age: 33 },
    { name: 'Jason', age: 35 },
    { name: 'Zach', age: 40 }
  ];

  await new Promise((resolve) => {
    testArray.forEach((row) => {
      writableStream.write(JSON.stringify(row) + '\n');
    });
    writableStream.end();
    writableStream.on('response', (res) => {
      resolve('done');
    });
  });
};

After finishing the previous steps, if you add "Test Table" from the Resource "Test Resource" to a Project on Acho, the table will show the rows inserted above.


Working with Project Endpoints

Get view data from a project by viewId

const viewData = await AchoInstance.ProjectEndpoints.getViewData({
  viewId: 7869,
  pageSize: 10,
  page: 0
});

/* viewData: {
    data: Array<object>, // Array of row data objects
    schema: Array<{
      name: string,
      type: string
    }>,
    paging: {
      page: number,
      pageSize: number,
      pageTotal: number,
      totalRows: number
    }
   }
*/
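The `paging` fields are related by simple arithmetic: `pageTotal` is the number of pages needed to cover `totalRows` at the given `pageSize`. A sketch of that relationship (not an SDK function) is:

```typescript
// Compute how many pages cover totalRows at a given pageSize, matching
// the pageTotal field in the paging object. Illustrative helper.
function pageTotal(totalRows: number, pageSize: number): number {
  if (pageSize <= 0) throw new Error('pageSize must be positive');
  return Math.ceil(totalRows / pageSize);
}
```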

Get view data from a project by assetId

const viewData = await AchoInstance.ProjectEndpoints.getViewData({
  assetId: 9242,
  pageSize: 50
});

// Same response structure as above

Query project table data with custom SQL

const queryResult = await AchoInstance.ProjectEndpoints.queryTableData({
  actionQuery: {
    query: 'SELECT * FROM {{{P.9038}}} WHERE column_name = ?;',
    helperInfo: {
      resources: [],
      projects: [],
      views: [
        {
          view: {
            id: 9038,
            proj_id: 2937
          }
        }
      ]
    }
  },
  pageSize: 100,
  page: 0
});

// Returns same structure as getViewData

Using ViewDataProducer for Efficient Data Iteration

ViewDataProducer provides an efficient way to iterate through large datasets page by page:

// Create a ViewDataProducer instance
const producer = await AchoInstance.ProjectEndpoints.getViewDataProducer({
  viewId: 7869,
  pageSize: 100
});

// Preview the first page without advancing
const preview = await producer.preview();
console.log('First page preview:', preview.data);
console.log('Total pages:', preview.paging.pageTotal);

// Get data page by page
const firstPage = await producer.get();
console.log('First page data:', firstPage);

// Check if there are more pages
while (producer.hasNext()) {
  const nextPageData = await producer.next();
  console.log('Processing page data:', nextPageData);
  // Process your data here
}
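Code that consumes a producer can be unit-tested against an in-memory stand-in honoring the same `preview`/`get`/`hasNext`/`next` contract. The class below is a test double written for illustration; it mirrors the interface shown above, not the SDK's internal implementation:

```typescript
interface Paged<T> {
  data: T[];
  paging: { page: number; pageSize: number; pageTotal: number; totalRows: number };
}

// In-memory stand-in for ViewDataProducer: pages through a fixed array.
// Test double for illustration only.
class MockViewDataProducer<T> {
  private page = 0;

  constructor(private rows: T[], private pageSize: number) {}

  private slice(page: number): T[] {
    return this.rows.slice(page * this.pageSize, (page + 1) * this.pageSize);
  }

  // Look at the first page and paging info without advancing.
  async preview(): Promise<Paged<T>> {
    return {
      data: this.slice(0),
      paging: {
        page: 0,
        pageSize: this.pageSize,
        pageTotal: Math.ceil(this.rows.length / this.pageSize),
        totalRows: this.rows.length
      }
    };
  }

  // Current page's data; next() advances to the following page first.
  async get(): Promise<T[]> {
    return this.slice(this.page);
  }

  hasNext(): boolean {
    return (this.page + 1) * this.pageSize < this.rows.length;
  }

  async next(): Promise<T[]> {
    this.page += 1;
    return this.slice(this.page);
  }
}
```

Swapping the real producer for this double lets the while-loop pattern above run without network access.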

Advanced ViewDataProducer Usage

Process all data from a view efficiently:

const processAllViewData = async (viewId, pageSize = 1000) => {
  const producer = await AchoInstance.ProjectEndpoints.getViewDataProducer({
    viewId: viewId,
    pageSize: pageSize
  });

  let processedRows = 0;
  const allData = [];

  // Preview to get total information
  const preview = await producer.preview();
  console.log(`Processing ${preview.paging.totalRows} total rows in ${preview.paging.pageTotal} pages`);

  // Process first page
  let pageData = await producer.get();
  allData.push(...pageData);
  processedRows += pageData.length;

  // Process remaining pages
  while (producer.hasNext()) {
    pageData = await producer.next();
    allData.push(...pageData);
    processedRows += pageData.length;
    console.log(`Processed ${processedRows} rows so far...`);
  }

  console.log(`Completed processing ${processedRows} total rows`);
  return allData;
};

// Usage
const allViewData = await processAllViewData(7869, 500);

Streaming Large Datasets with ViewDataProducer

For memory-efficient processing of large datasets:

const streamProcessViewData = async (viewId, processRowBatch) => {
  const producer = await AchoInstance.ProjectEndpoints.getViewDataProducer({
    viewId: viewId,
    pageSize: 1000
  });

  // Get first batch
  let batch = await producer.get();
  await processRowBatch(batch);

  // Process remaining batches
  while (producer.hasNext()) {
    batch = await producer.next();
    await processRowBatch(batch);
  }
};

// Usage example - transform and save data in batches
await streamProcessViewData(7869, async (rowBatch) => {
  // Transform each row
  const transformedRows = rowBatch.map(row => ({
    ...row,
    processed_at: new Date().toISOString(),
    // Add your transformations here
  }));

  // Save to database, file, or external API
  await saveToDestination(transformedRows);
  console.log(`Processed batch of ${transformedRows.length} rows`);
});

Working with Business Objects

Business Objects provide a comprehensive interface for managing data objects within the Acho platform. This includes creating, reading, updating, and deleting data, as well as advanced features like ontology export for data relationship visualization.

Initialize a Business Object:

const businessObject = AchoInstance.businessObject({
  tableName: 'your_table_name'
});

Get object metadata:

const objectInfo = await businessObject.getObject();
console.log(objectInfo);

Retrieve data from the object:

const data = await businessObject.getData({
  pageOptions: { pageNumber: 1, pageSize: 100 },
  filterOptions: {
    type: 'comparison',
    operator: 'stringEqualTo',
    leftOperand: 'column_name',
    rightOperand: 'value'
  }
});
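`filterOptions` is a plain object, so repeated comparisons can be wrapped in a tiny builder. `stringEquals` is a hypothetical convenience; only the object shape it returns comes from the example above:

```typescript
interface ComparisonFilter {
  type: 'comparison';
  operator: string;
  leftOperand: string;
  rightOperand: string;
}

// Build the comparison filter object accepted by getData.
// Hypothetical helper, not an SDK export.
function stringEquals(column: string, value: string): ComparisonFilter {
  return {
    type: 'comparison',
    operator: 'stringEqualTo',
    leftOperand: column,
    rightOperand: value
  };
}
```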

Add new rows:

await businessObject.addRow({
  rows: [
    { column1: 'value1', column2: 'value2' },
    { column1: 'value3', column2: 'value4' }
  ]
});

Update existing rows:

await businessObject.updateRow({
  ctid: 'row_id',
  changes: {
    column1: 'new_value'
  }
});

Export Ontology and Data Relationships

The exportOntology method allows you to export the data relationship graph for analysis and visualization:

// Export ontology as JSON (default)
const ontology = await businessObject.exportOntology();
console.log(ontology);

// Export ontology as XML with custom depth
const xmlOntology = await businessObject.exportOntology({
  format: 'xml',
  depth: 2
});
console.log(xmlOntology);

// Export global ontology (all relationships)
const globalOntology = await businessObject.exportOntology({
  depth: 3
});

Parameters:

  • format: Export format - 'json' (default) or 'xml'
  • depth: Relationship traversal depth - number (default: 1)

The ontology export provides a comprehensive view of:

  • Object relationships and dependencies
  • Data structure and schema information
  • Connection mappings between different data objects
  • Hierarchical data organization

This feature is particularly useful for:

  • Data governance and lineage tracking
  • Creating data architecture diagrams
  • Understanding complex data relationships
  • Compliance and audit requirements
  • Data migration planning

License

MIT