Our API for the contracts, with TRPC support.
Useful references for TypeScript compile performance:

- https://github.com/microsoft/TypeScript/wiki/Performance-Tracing
- https://www.npmjs.com/package/@typescript/analyze-trace
- https://github.com/microsoft/TypeScript/wiki/Performance#writing-easy-to-compile-code
```shell
pnpm run codegen
```

- Note: this is also run when calling the `build` script
```shell
pnpm run build && pnpm run express
```

```shell
pnpm run express

# In a separate terminal
pnpm run ngrok
```

- Note: requires ngrok installed
- Useful to test webhook with ReadMe
From the workspace root:

```shell
docker build . -f Dockerfile -t vulcanlink/contracts-api:${VERSION:-test} --target contracts-api
```
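To smoke-test the built image locally, a minimal sketch. The exposed port 3000 is an assumption based on the local Express setup used elsewhere in this doc; adjust to your server config.

```shell
# VERSION falls back to "test", matching the build command above.
VERSION="${VERSION:-test}"
# Port 3000 is an assumption; change it if your Express server listens elsewhere.
docker run --rm -p 3000:3000 "vulcanlink/contracts-api:${VERSION}"
```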
Relevant management consoles for tech stack:
- Firebase: https://console.firebase.google.com/u/0/project/owl-protocol/overview
- Readme.io: https://dash.readme.com/project/owlprotocol/v0.0/overview
- DFNS: https://app.dfns.ninja/
- NGrok: https://dashboard.ngrok.com/
Clone the safe-contracts repo on your computer.

```shell
git clone git@github.com:safe-global/safe-contracts.git && cd safe-contracts
```

Set the `.env` file:

```shell
MNEMONIC="test test test test test test test test test test test junk"
NODE_URL="http://localhost:8545"
```

Compile and deploy locally.

```shell
yarn build && yarn deploy-all custom
```
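Before deploying, you can confirm the node at `NODE_URL` is reachable with a standard JSON-RPC ping (`eth_blockNumber` is part of the Ethereum JSON-RPC spec):

```shell
# Standard eth_blockNumber request against the local node configured above.
RPC_PAYLOAD='{"jsonrpc":"2.0","method":"eth_blockNumber","params":[],"id":1}'
curl -s -X POST http://localhost:8545 \
  -H "Content-Type: application/json" \
  -d "$RPC_PAYLOAD"
```

A hex block number in the response confirms the node is up.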
https://firebase.google.com/docs/emulator-suite

```shell
curl -sL https://firebase.tools | bash
```

```shell
# Either works
firebase emulators:start
pnpm run firebase-emulator
```

Go to http://127.0.0.1:4000/firestore/data to view data.
```shell
# Either works
npm run test
pnpm exec vitest src/path_to_file.ts # vitest breaks when run directly; use pnpm exec or install it globally
```
```shell
# Start Firebase Emulator
pnpm run firebase-emulator

# Start Anvil Node
pnpm run anvil

# Run dev server setup (upload Firebase data, deploy contracts) & start Express server
# Note: error logs are to be expected for invalid ABIs
pnpm run express:dev

# Run the indexer (from the packages/contracts-indexer folder)
cd ../contracts-indexer && pnpm run start
```
While you can test the endpoints locally via SwaggerHub, certain features such as the Readme webhook & Readme recipes can only be tested via the readme.io UI. To test these features without having to re-deploy the API, we can create a local proxy using ngrok.

Install and configure ngrok via https://dashboard.ngrok.com/get-started/setup. We first build & run the server, then expose it publicly using ngrok.
```shell
npm run build && npm run express # Run localhost server
ngrok http 3000 # Expose localhost server
```
Copy the "Forwarding" endpoint.
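To sanity-check the tunnel, you can request the OpenAPI spec (served at `/api/openapi.json`, see the routes overview below) through the forwarding URL. The hostname here is a placeholder; substitute the one ngrok printed:

```shell
# Placeholder forwarding URL; replace with your actual ngrok endpoint.
NGROK_URL="https://XXXX-XXX-XXX-XX-XXX.ngrok-free.app"
URL="${NGROK_URL}/api/openapi.json"
# A JSON response confirms the proxy reaches the local Express server.
curl -s "$URL" | head -c 200
```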
Now we configure a new OAS spec. This enables us to define the new routes we've developed and set the url endpoint to our ngrok endpoint.

- Generate OpenAPI spec with patched server endpoint
- Upload new OpenAPI spec using the `rdme` CLI
Generate a new OpenAPI spec, patched with the ngrok endpoint. We have a script for this that saves the new spec under `readme/v0.0/openapi.json`:

```shell
node lib/esm/openapiSave.js v0.0 <ngrokUrl>
```
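Before uploading, you can eyeball the patched server URL. This assumes `jq` is installed; the `servers` field is standard OpenAPI:

```shell
# Path matches where openapiSave.js writes the spec.
SPEC="readme/v0.0/openapi.json"
if [ -f "$SPEC" ]; then
  # Should print your ngrok endpoint.
  jq -r '.servers[0].url' "$SPEC"
else
  echo "spec not found at $SPEC"
fi
```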
Install the CLI if not already present, then log in with your readme.io credentials:

```shell
pnpm install -g rdme
rdme login --email <email> --password <password>
```
Sync the OpenAPI spec generated above. We use version `v0.0` as the private version for testing (the version flag is ignored here but useful to keep in mind for other operations), and the hardcoded spec id `64cfafdc8cc6da00656ad092` to update the existing spec.

```shell
rdme openapi readme/v0.0/openapi.json --version v0.0 --id "64cfafdc8cc6da00656ad092" --update
```
We create a user and associate it with one of our generated API keys.

- Go to [owlprotocol/v0.0/metrics/setup-api-keys](https://dash.readme.com/project/owlprotocol/v0.0/metrics/setup-api-keys)
- Click "Edit Configuration"
- Add the `https://XXXX-XXX-XXX-XX-XXX.ngrok-free.app/api/webhooks/readme` webhook URL and an email
- Click "Try it"; this should return an API key and user info. (Note: this sometimes fails, simply retry)
This should kick start the signup process:
- Generate user with api key on Firebase
- Issue DFNS wallet creation request
- Deploy Gnosis Smart Wallet for user and save tx hash
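You can also exercise the webhook endpoint directly with curl. The payload shape below is a guess (an email field, since the signup flow is keyed on the user's email); check the webhook handler in the codebase for the exact fields readme.io sends:

```shell
# Placeholder forwarding URL; replace with your actual ngrok endpoint.
NGROK_URL="https://XXXX-XXX-XXX-XX-XXX.ngrok-free.app"
# Hypothetical payload shape -- verify against the handler's expected fields.
PAYLOAD='{"email":"dev@example.com"}'
curl -s -X POST "${NGROK_URL}/api/webhooks/readme" \
  -H "Content-Type: application/json" \
  -d "$PAYLOAD"
```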
Readme metadata is stored under `./readme`, indexed by version. We store 3 core types of data:

- `openapi.json`: The generated OpenAPI spec. Pushed to readme.io using the CLI.
- `recipes`: Recipes. Stored here but edited manually at docs-api.owlprotocol.xyz/recipes
- `guides`: Guides. Edited here and pushed to readme.io using the CLI.
See above section "Readme.io OpenAPI config".
- Run the server

```shell
npm run build && npm run express
```

- Run the openapi script (this updates the local json)

```shell
node lib/esm/openapiSave.js <version> <url>
```

- Push to readme.io

```shell
rdme openapi readme/<version>/openapi.json --version <version> --update
```
- First edit local markdown & code examples
- Replicate these edits on docs-api.owlprotocol.xyz/recipes
- Edit local markdown files
- Push to readme
```shell
rdme docs readme/<version>/guides --version=<version> --dryRun
```
- readme: $HOST/api/webhooks/readme/auth
- clerk: $HOST/api/webhooks/clerk
- shopify: TODO
There are various scenarios for auth.
This auth flow is used for global `User` operations (eg. managing team, setting user info, viewing cross-project info).

The user MUST be logged in through the Owl Protocol web platform (any subdomain, or Vercel).

The user may NOT be logged in on a separate domain, as `User` info is private from third-party projects. Projects should fetch `ProjectUser` info.

We do NOT support `User` operations via API Key as it is superfluous and increases risk.
- Decode JWT from `Authorization` header
- Check jwt `azp` domain against owl domains
- Get `User` via jwt `sub` (or create)
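The decode step above can be reproduced from the shell: a JWT is three base64url segments, and the middle one carries the claims (`azp`, `sub`). A sketch with a hand-built sample token (real tokens use base64url encoding and may need padding fixes before decoding; the claim values here are illustrative):

```shell
# Build a fake JWT with a known payload, purely for illustration.
PAYLOAD_B64=$(printf '{"azp":"https://owlprotocol.xyz","sub":"user_123"}' | base64 | tr -d '\n')
JWT="eyJhbGciOiJSUzI1NiJ9.${PAYLOAD_B64}.sig"
# Extract and decode the middle (claims) segment, as the auth flow does server-side.
DECODED=$(echo "$JWT" | cut -d. -f2 | base64 -d)
echo "$DECODED"
```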
This auth flow is used for `ProjectUser` operations (eg. loading project scoped wallet, reading data related to project, sending transactions with project wallet).

This user MUST be logged in through the Owl Protocol web platform OR any `authorizedDomains`.

This user MUST send a `projectId` input parameter to determine what project to use.

This user may NOT perform any project admin functions (eg. deploy contract, load team admin wallets).
- Decode JWT from `Authorization` header
- Get `Project` from `projectId` input
- Check jwt `azp` domain against owl + project domains
- Get `ProjectUser` via jwt `sub` (or create + `createUser`)
This auth flow is for running admin write operations on `Project`. We use a "service account" model: once we validate access (user or apiKey), we populate the context with the current project. The user becomes superfluous (as any team member or service accessing the project is equivalent).

### Project API Auth

Simple API Key based auth. This is useful for any type of automation. Here we do not have any user data and simply fetch the project by its apiKey.
- Get `ProjectApiKey` from `x-api-key` header & group query
- Get `Project` from `projectId` of api key
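A hypothetical request sketch using API-key auth: the key goes in the `x-api-key` header named in the flow above, here against the Gnosis Safe info route from the routes overview. Host, port, networkId, and address are all placeholders:

```shell
# Illustrative values only; substitute a real networkId and Safe address.
NETWORK_ID=1
SAFE_ADDRESS=0x0000000000000000000000000000000000000000
URL="http://localhost:3000/api/${NETWORK_ID}/safe/${SAFE_ADDRESS}"
# OWL_API_KEY is assumed to hold your generated project API key.
curl -s "$URL" -H "x-api-key: ${OWL_API_KEY:-changeme}"
```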
### Team Member Auth

JWT based auth for frontend no-code management. Here we have a `projectId` parameter (sent from the frontend) and a logged-in user jwt. We use these first to get the `Project`, then check whether `sub` is a `TeamMember`.
This user MUST be logged in through the Owl Protocol web platform.
This user MUST send a `projectId` input parameter to determine what project to use.

This user MUST send an `asProjectAdmin: true` input parameter to enable admin functions.

This user MUST be a member of the team that owns the `Project`.
This user MAY perform any project admin functions (eg. deploy contract, load team admin wallets).
- Decode JWT from `Authorization` header
- Get `Project` from `projectId` input
- Check jwt `azp` domain against owl + project domains
- Get `TeamMember` from `project.teamId` & `sub`
Overview of general API routes.

```
# Webhook
/webhooks/readme                  # Readme.io webhook for signups

# OpenAPI Swagger UI
/                                 # OpenAPI UI

# OpenAPI Spec
/api/openapi.json                 # OpenAPI Spec

# TRPC
/api/trpc                         # TRPC endpoint

# TRPC w/OpenAPI
## Webhooks
/api/webhooks/readme              # Readme webhook

## User routes
/api/user/me                      # Get current user info
/api/user/me/contracts            # Get current user contracts
/api/user/<userId>                # Get by id user info (Disabled: Requires proper permissions)
/api/user/<userId>/contracts      # Get by id contracts (Disabled: Requires proper permissions)
/api/user/requestTemplates        # Get user request templates

## Eth routes
/api/<networkId>/broadcastTx      # Broadcast transaction
/api/<networkId>/signTx           # Sign transaction
/api/<networkId>/rpc              # RPC Proxy
/api/<networkId>/ws               # Websocket Proxy

## Contract routes
/api/<networkId>/interfaces/<contract>/read/<address>/<function>          # Read data
/api/<networkId>/interfaces/<contract>/write/<address>/<function>         # Write data, sending transaction
/api/<networkId>/interfaces/<contract>/writeUnsigned/<address>/<function> # Write data, get unsigned payload
/api/<networkId>/deploy/<contract>                                        # Deploy smart contract

## Business Abstraction routes
/api/<networkId>/collection       # Deploy NFT collection

## Topup Routes (abstraction for credits)
/api/<networkId>/topup/erc20/<address> # Topup ERC20 (eg. LINK)
/api/<networkId>/topup/native          # Topup native tokens (eg. Eth)
/api/<networkId>/

## Gnosis Safe Routes (abstraction over smart wallet)
/api/<networkId>/safe/<address>   # Gnosis Safe info
```
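For JWT-authenticated routes like `/api/user/me`, the token goes in the `Authorization` header. The Bearer scheme and the sample token below are assumptions; verify against the server's auth middleware:

```shell
# Sample token for illustration; in practice this is your session JWT
# from the Owl Protocol web platform.
JWT="${JWT:-sample.jwt.token}"
AUTH_HEADER="Authorization: Bearer ${JWT}"
# Port 3000 matches the local Express setup used earlier in this doc.
curl -s http://localhost:3000/api/user/me -H "$AUTH_HEADER"
```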